
Adapting the Global Settings

The Process Mining services need some global settings on the Scheer PAS BRIDGE to match your configuration. If you have deployed the services for the first time, the corresponding setting variables have been created. They are still empty, so you have to configure them.

Figure: Global Bridge Settings Used by the Process Mining Services

All these values appear in red (undefined), because they are not configured yet. You need to configure them now. If you need more information on global settings and their usage, refer to the BRIDGE Documentation > Using Global Setting Variables.

Change these settings to the following:

Setting | Description | Values

Database Settings
E2E_Dashboards_AnalyticDB_ConnectionString | Supply the name of the database you configured on your database server (as done in Setting up the Database for Process Mining). | <name of your database>
E2E_Dashboards_AnalyticDB_Password | Supply the password of the database user the Process Mining services should use. | <password of your database user>
E2E_Dashboards_AnalyticDB_Type | Supply the type of database the analytic data is stored in. | MySQL, Oracle, SQLServer
E2E_Dashboards_AnalyticDB_User | Supply the database user the Process Mining services should use. | <your database user>

WorkDirectory Settings
E2E_Dashboards_WorkDirectory_FileSystem_Path | Supply the URN path to the work directory. | Example: C:\DashboardWorkDirectory

If you want to change the work directory (to FTP or SFTP), refer to the page Changing the Work Directory.
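For orientation, the following is a minimal sketch of how the global settings could look for an SQL Server based setup. The database name, user, and password are placeholders that you replace with the values of your own environment:

E2E_Dashboards_AnalyticDB_Type = SQLServer
E2E_Dashboards_AnalyticDB_ConnectionString = <name of your database>
E2E_Dashboards_AnalyticDB_User = <your database user>
E2E_Dashboards_AnalyticDB_Password = <password of your database user>
E2E_Dashboards_WorkDirectory_FileSystem_Path = C:\DashboardWorkDirectory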

Once you click Apply, these values will be applied to the Process Mining services that are using the setting variables. The settings you updated will not be displayed as undefined anymore.

Configuring the Data Collection Services

Setting up the TrxLogsCollector

The TrxLogsCollector can be configured via the settings of the service and a user interface. For more information on how to change the settings of a service, refer to the BRIDGE Documentation > xUML Service Settings.

Setting | Description | Values | Default

BridgeDataPath | Supply the path to the BRIDGE Data directory. The TrxLogsCollector will collect logs from services deployed there. You can configure the services you want to inspect with Process Mining via the service's UI. | |
DB Directory Type | Not used. | |
HostName | Enrich the imported data with a host name identifying the origin of the imported transaction logs. | | ${server_hostname}
MaxBackCollectionInDays | Specify the number of days the collector should go back in time when a collection is done. This setting also defines the number of days that will be collected at first run. | 0 (collect today's logs only at first, then collect all past logs); any integer (collect only the log files of the past number of days) | 7

WorkDirectory Settings
Work Directory Type | Define how the TrxLogsCollector service should transfer the collected service logs to the work directory of the TrxLogsETL service: via file system, FTP, or SFTP. | FileSystem, FTP, SFTP | FileSystem
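As a rough illustration, a default TrxLogsCollector configuration could use the values below. The BRIDGE Data path is a placeholder for the data directory of your own BRIDGE installation; the other values are the defaults from the table above:

BridgeDataPath = <path to your BRIDGE Data directory>
HostName = ${server_hostname}
MaxBackCollectionInDays = 7
Work Directory Type = FileSystem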

Scheduler Settings

In the Scheduler Service settings of TrxLogsCollector, you can switch the scheduler that triggers the data collection on and off, and change its execution interval. The default execution interval is once per hour.

If you need more information on the scheduler date time patterns, refer to the BRIDGE Documentation > Date Time Patterns.

For more information on how to maintain the TrxLogsCollector Service UI, refer to page Adding new xUML Services to Process Mining.

Setting up the TrxLogsETL

The TrxLogsETL can be configured via the settings of the service. For more information on how to change the settings of a service, refer to the BRIDGE Documentation > xUML Service Settings.

Setting | Description | Values | Default

ETLChunk
Ignore after x retries | Specify how often TrxLogsETL tries to import a log file before ignoring it. | any integer | 5
Retry after x seconds | Specify the time in seconds until TrxLogsETL retries importing a log file after an error. | any integer | 60
useBulkUpload | Supply the upload method: Bulk Upload or Upload via Inserts. Bulk Upload is faster and recommended for large amounts of data. Please note the restrictions below and also refer to Setting up the Database for Process Mining. Never use a transaction log level deeper than Service on the TrxLogsETL service (default: none) if you use Upload via Inserts: TrxLogsETL then creates a log entry for each insert, and you may reach the maximum log file size that way. Bulk Upload does not work with MySQL. | true, false | true

Setting | Description | Values | Default

TrxLogsETL
Retention in days | Supply the number of days you want to keep the collected data. The TrxLogsETL service will then delete data that is older. Providing 0 days means collected data is never deleted. | 0 (never delete collected data); any integer (delete collected data older than the specified number of days) | 0

SQLServer
workDirectory | Supply the path to the work directory for SQL Server. This path is used if ETL by Bulk is configured with SQL Server. Note that this path is seen by the SQL Server, not by the TrxLogsETL service, so it should be a full URN path. | Full URN path to the SQL work directory. | {{E2E_Dashboards_WorkDirectory_FileSystem_Path}}

WorkDirectory Settings
Work Directory Type | Define how the TrxLogsETL service should read the collected service logs from the work directory: via file system, FTP, or SFTP. | FileSystem, FTP, SFTP | FileSystem
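As a hedged example, a TrxLogsETL configuration for a MySQL analytical database (where Bulk Upload is not available) could use the values below; the retention period of 30 days is an arbitrary illustration, the retry settings are the defaults from the tables above:

Ignore after x retries = 5
Retry after x seconds = 60
useBulkUpload = false          (Bulk Upload does not work with MySQL)
Retention in days = 30         (delete collected data older than 30 days)
Work Directory Type = FileSystem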

Scheduler Settings

In the Scheduler Service settings of TrxLogsETL, you can switch the scheduler that deletes old data on and off, and change its execution interval. The default execution interval is once per hour.

If you need more information on the scheduler date time patterns, refer to the BRIDGE Documentation > Date Time Patterns.

For more information on how to maintain the TrxLogsETL Service UI, refer to page Importing Process Data to Process Mining.

Setting up the Analytic API

The Analytic API can be configured via the settings of the service. For more information on how to change the settings of a service, refer to the BRIDGE Documentation > xUML Service Settings.

Service Settings

Setting | Description

SQLServer
WorkDirectory | Not used by the Analytic API.

Configuring the Node.js Services

You can use the default settings of the Node.js services. Only make sure that the rdbms setting of the analytics-api-service corresponds to the analytical database.

Further configuration of the services is not necessary, but possible. You can configure the Node.js services via the Settings tab in the detail view of each service. The configuration is provided as a JSON object.

Settings Applicable for all Services

Setting | Type | Description | Default

service-repository
hostname | String | The host providing the service-repository. | localhost
port | String | The port of the service-repository. | 3017
protocol | String | The protocol of the service-repository. | ws
reconnection | Boolean | Specify whether this service should try to reconnect to the service-repository in case of connectivity loss. | true
requiredServices | Object | Specify the services required by the service-repository. | {}

<service name>
hostname | String | The host providing this service. | localhost
port | String | The port of this service. |
protocol | String | The protocol of this service. | ws
requiredServices | String | Specify the services required by this service. | {}
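To illustrate the structure, a minimal configuration JSON based on the defaults above could look like the following sketch. Replace <service name> and <port of this service> with the values of the respective Node.js service:

{
  "service-repository": {
    "hostname": "localhost",
    "port": "3017",
    "protocol": "ws",
    "reconnection": true,
    "requiredServices": {}
  },
  "<service name>": {
    "hostname": "localhost",
    "port": "<port of this service>",
    "protocol": "ws",
    "requiredServices": {}
  }
}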

analytics-api-service

Setting | Type | Description | Default

analytics-api-service
requiredServices | Object | Specify the services required by the analytics-api-service. | bpaas-cockpit-service, user-service
port | String | The port of this service. | 3041
analytics-analytic-api | Object | Specify the connection to the Analytic API. For details see below. |
rdbms | Object | Specify the connection to the analytical database. For details see below. |

analytics-analytic-api
hostname | String | The host providing the Analytic API. | localhost
port | String | The port of the Analytic API. | 3040
protocol | String | The protocol of the Analytic API. | http

rdbms
vendor | String | The RDBMS of the analytical database to connect to. Must be one of mssql, mysql, or oracle. |
connection | Object | The connection details of the analytical database. For details see below (note that the available options depend on the RDBMS of the analytical database). |

connection (Microsoft SQL Server, MySQL)
host | String | The host providing the analytical database. |
port | Number | The port of the analytical database. | 1433 (Microsoft SQL Server), 3306 (MySQL)
database | String | The database/schema of the analytical database. |
user | String | The user giving Process Mining access to the analytical database. |
password | String | The password of the user accessing the analytical database. |
connectionLimit | Number | The maximum number of connections to the analytical database to be opened at the same time. | 10

connection (Oracle)
connectString | String | The Oracle connect string to the analytical database. |
user | String | The user giving Process Mining access to the analytical database. |
password | String | The password of the user accessing the analytical database. |
connectionLimit | Number | The maximum number of connections to the analytical database to be opened at the same time. | 10
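The sketch below shows how the rdbms section could be filled for a Microsoft SQL Server analytical database. Host, database, user, and password are placeholders you replace with your own values; the remaining values are the defaults listed above:

{
  "analytics-api-service": {
    "port": "3041",
    "analytics-analytic-api": {
      "hostname": "localhost",
      "port": "3040",
      "protocol": "http"
    },
    "rdbms": {
      "vendor": "mssql",
      "connection": {
        "host": "<host of the analytical database>",
        "port": 1433,
        "database": "<name of your database>",
        "user": "<your database user>",
        "password": "<password of your database user>",
        "connectionLimit": 10
      }
    }
  }
}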

bpaas-cockpit

Setting | Type | Description | Default

bpaas-cockpit
requiredServices | Object | Specify the services required by the bpaas-cockpit. | bpaas-cockpit-service, user-service, analytics-api-service
port | String | The port of this service. | 3005
app | Object | Define app-specific settings. For details see below. |

app
branding | String | Set the branding of the bpaas-cockpit. Must be bpaas or e2e. | e2e
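For example, a bpaas-cockpit configuration that keeps the default port and branding could look like this sketch, based on the keys above:

{
  "bpaas-cockpit": {
    "port": "3005",
    "app": {
      "branding": "e2e"
    }
  }
}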

bpaas-cockpit-service

Setting | Type | Description | Default

bpaas-cockpit-service
requiredServices | Object | Specify the services required by the bpaas-cockpit-service. | user-service, persistence-service
port | String | The port of this service. | 3021
roleMapping | Object | Define the mapping of profile names (object keys) to role names (values for the object keys). |
userExtensions | Object | Contains predefined settings for each user. |
defaultGroup | Object | Define the default group which is added to the cockpit of each user on first login. |
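As a purely illustrative sketch of the roleMapping structure (the profile and role names are placeholders, not predefined values), the configuration could look like this:

{
  "bpaas-cockpit-service": {
    "port": "3021",
    "roleMapping": {
      "<profile name>": "<role name>"
    }
  }
}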

Now that you have installed and set up all components of Scheer PAS Process Mining, you can start the services.
