Adapting the Global Settings
The Process Mining services need some settings on the Scheer PAS BRIDGE to match your configuration. When you deploy the services for the first time, the corresponding setting variables are created. These variables are initially empty, so you have to configure them.
Figure: Global Bridge Settings Used by the Process Mining Services
All these values appear in red (undefined) because they are not configured yet. Configure them now. If you need more information on global settings and their usage, refer to the BRIDGE Documentation > Using Global Setting Variables.
Change these settings to the following:
Setting | Description | Values |
---|---|---|
Database Settings | ||
E2E_Dashboards_AnalyticDB_ConnectionString | Supply the name of the database you configured on your database server (as done in Setting up the Database for Process Mining). | <name of your database> |
E2E_Dashboards_AnalyticDB_Password | Supply the password of the database user the Process Mining services should use. | |
E2E_Dashboards_AnalyticDB_Type | Supply the type of database the analytic data is stored in. | MySQL, Oracle, SQLServer |
E2E_Dashboards_AnalyticDB_User | Supply the database user the Process Mining services should use. | <your database user> |
WorkDirectory Settings | ||
E2E_Dashboards_WorkDirectory_FileSystem_Path | Supply the URN path to the work directory. | Example: C:\DashboardWorkDirectory |
Info |
---|
If you want to change the work directory (to FTP or SFTP), refer to the page Changing the Work Directory. |
Once you click Apply, these values will be applied to the Process Mining services that are using the setting variables. The settings you updated will not be displayed as undefined anymore.
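As an illustration, a completed set of global settings for a MySQL database might look like the following sketch. The database name, user, and work directory are hypothetical placeholders; only the setting variable names come from the table above:

```
E2E_Dashboards_AnalyticDB_ConnectionString   = processmining
E2E_Dashboards_AnalyticDB_Type               = MySQL
E2E_Dashboards_AnalyticDB_User               = pas_mining
E2E_Dashboards_AnalyticDB_Password           = ********
E2E_Dashboards_WorkDirectory_FileSystem_Path = C:\DashboardWorkDirectory
```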
Configuring the Data Collection Services
Setting up the TrxLogsCollector
The TrxLogsCollector can be configured via the settings of the service and a user interface. For more information on how to change the settings of a service, refer to the BRIDGE Documentation > xUML Service Settings.
Settings | Description | Values | Default |
---|---|---|---|
BridgeDataPath | Supply the path to the BRIDGE Data directory. The TrxLogsCollector will collect logs from services deployed there. You can configure the services you want to inspect with Process Mining via the service's user interface. | ||
DB Directory Type | Not used. | ||
HostName | Enrich the imported data by a host name being the origin of the imported transaction logs. | ${server_hostname} | |
MaxBackCollectionInDays | Specify the number of days the collector should go back in time when a collection is done. | Any integer ≥ 0. | 7 |
WorkDirectory Settings | |||
Work Directory Type | Define how the TrxLogsCollector service should transfer the collected service logs to the work directory of the TrxLogsETL service: via file system, FTP or SFTP. | FileSystem , FTP , SFTP | FileSystem |
Scheduler Settings
In the Scheduler Service settings of the TrxLogsCollector, you can switch the scheduler triggering the data collection on and off, and change the execution interval of the scheduler. The default execution interval is once per hour.
If you need more information on the scheduler date time patterns, refer to the BRIDGE Documentation > Date Time Patterns.
Info |
---|
For more information on how to maintain the TrxLogsCollector Service UI, refer to page Adding new xUML Services to Process Mining. |
Setting up the TrxLogsETL
The TrxLogsETL can be configured via the settings of the service. For more information on how to change the settings of a service, refer to the BRIDGE Documentation > xUML Service Settings.
Setting | Description | Values | Default |
---|---|---|---|
ETLChunk | | | |
Ignore after x retries | Specify how often TrxLogsETL tries to import a log file before ignoring it. | any integer | 5 |
Retry after x seconds | Specify the time in seconds until TrxLogsETL retries to import a log file after an error. | any integer | 60 |
useBulkUpload | Supply the upload method: Bulk Upload or Upload via Inserts. | true, false | true |
Setting | Description | Value | Default |
---|---|---|---|
SQLServer | |||
workDirectory | Supply the path to the work directory for SQL Server. This path is used if ETL by Bulk is configured with SQL Server. Note that this path is seen by the SQL Server, not by the TrxLogsETL service, so it should be a full URN path. | Full URN path to the SQL work directory. | {{E2E_Dashboards_WorkDirectory_FileSystem_Path}} |
WorkDirectory Settings | |||
Work Directory Type | Define how the TrxLogsETL service should read the collected service logs from the work directory: via file system, FTP or SFTP. | FileSystem , FTP , SFTP | FileSystem |
Setting up the Analytic API
The Analytic API can be configured via the settings of the service. For more information on how to change the settings of a service, refer to the BRIDGE Documentation > xUML Service Settings.
Service Settings
Configuring the Node.js Services
You can use the default settings of the Node.js services. Only make sure that the rdbms setting of the analytics-api-service corresponds to the analytical database.
Further configuration of the services is not necessary, but possible. You can configure the Node.js services via the Settings tab in the detail view of each service. The configuration is provided as a JSON object.
Settings Applicable for all Services
Setting | Type | Description | Default |
---|---|---|---|
service-repository | |||
hostname | String | The host providing the service-repository. | localhost |
port | String | The port of the service-repository. | 3017 |
protocol | String | The protocol of the service-repository. | ws |
reconnection | Boolean | Specify whether this service should try to reconnect to the service-repository in case of connectivity loss. | true |
requiredServices | Object | Specify the services required by the service-repository. | {} |
<service name> | |||
hostname | String | The host providing this service. | localhost |
port | String | The port of this service. | |
protocol | String | The protocol of this service. | ws |
requiredServices | String | Specify the services required by this service. | {} |
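Using the keys from the table above, a settings JSON object for a Node.js service could look like the following sketch. The service name my-example-service and its port are hypothetical; the service-repository values shown are the documented defaults:

```json
{
  "service-repository": {
    "hostname": "localhost",
    "port": "3017",
    "protocol": "ws",
    "reconnection": true,
    "requiredServices": {}
  },
  "my-example-service": {
    "hostname": "localhost",
    "port": "3050",
    "protocol": "ws",
    "requiredServices": {}
  }
}
```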
analytics-api-service
Setting | Type | Description | Default |
---|---|---|---|
analytics-api-service | | | |
requiredServices | Object | Specify the services required by the analytics-api-service. | bpaas-cockpit-service, user-service |
port | String | The port of this service. | 3041 |
analytics-analytic-api | Object | Specify the connection to the Analytic API. Details see below. | |
rdbms | Object | Specify the connection to the analytical database. Details see below. | |
deletionScheduler | Object | Enable the scheduler to delete data. Details see below. | |
intervalLoadProcesses | Number | Define the caching interval for the process list in the Scheer PAS Administration. | 60 |
Setting Group | Setting | Type | Description | Default |
---|---|---|---|---|
analytics-analytic-api | hostname | String | The host providing the Analytic API. | localhost |
 | port | String | The port of the Analytic API. | 3040 |
 | protocol | String | The protocol of the Analytic API. | http |
rdbms | vendor | String | The RDBMS of the analytical database to connect to. Must be one of mssql, mysql or oracle. | |
 | connection | Object | The connection details of the analytical database. Details see below (please note that the available options depend on the RDBMS of the analytical database). | |
connection (Microsoft SQL Server, MySQL) | host | String | The host providing the analytical database. | |
 | port | Number | The port of the analytical database. | 1433 (Microsoft SQL Server), 3306 (MySQL) |
 | database | String | The database/schema of the analytical database. | |
 | user | String | The user giving Process Mining access to the analytical database. | |
 | password | String | The password of the user accessing the analytical database. | |
 | connectionLimit | Number | The maximum number of connections to the analytical database to be opened at the same time. | 10 |
connection (Oracle) | connectString | String | The Oracle connect string to the analytical database. | |
 | user | String | The user giving Process Mining access to the analytical database. | |
 | password | String | The password of the user accessing the analytical database. | |
 | connectionLimit | Number | The maximum number of connections to the analytical database to be opened at the same time. | 10 |
deletionScheduler | enable | Boolean | Enable or disable the scheduler. | true |
 | cronPattern | String | Defines the execution interval of the scheduler. | 0 1 * * 6 |
 | api | Object | Contains the reference for the REST API of the TrxLogsETL. Details see below. | |
api | protocol | String | The protocol of the REST API. | http |
 | hostname | String | The host name of the REST API. | localhost |
 | port | String | The port of the REST API. | 3043 |
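Bringing the rdbms and deletionScheduler keys together, a hypothetical analytics-api-service configuration for a MySQL analytical database might look like the sketch below. The database name, user, and password are placeholders; ports, cronPattern, and connectionLimit use the documented defaults:

```json
{
  "analytics-api-service": {
    "port": "3041",
    "analytics-analytic-api": {
      "hostname": "localhost",
      "port": "3040",
      "protocol": "http"
    },
    "rdbms": {
      "vendor": "mysql",
      "connection": {
        "host": "localhost",
        "port": 3306,
        "database": "processmining",
        "user": "pas_mining",
        "password": "secret",
        "connectionLimit": 10
      }
    },
    "deletionScheduler": {
      "enable": true,
      "cronPattern": "0 1 * * 6",
      "api": {
        "protocol": "http",
        "hostname": "localhost",
        "port": "3043"
      }
    }
  }
}
```

The cron pattern `0 1 * * 6` runs the deletion scheduler once a week, at 01:00 on Saturdays.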
bpaas-cockpit
Setting Group | Setting | Type | Description | Default |
---|---|---|---|---|
bpaas-cockpit | requiredServices | Object | Specify the services required by the bpaas-cockpit. | bpaas-cockpit-service, user-service, analytics-api-service |
 | port | String | The port of this service. | 3005 |
 | app | Object | Define app-specific settings. Details see below. | |
app | branding | String | Set the branding of the bpaas-cockpit. Must be bpaas or e2e. | e2e |
bpaas-cockpit-service
Setting | Type | Description | Default |
---|---|---|---|
bpaas-cockpit-service | | | |
requiredServices | Object | Specify the services required by the bpaas-cockpit-service. | user-service, persistence-service |
port | String | The port of this service. | 3021 |
roleMapping | Object | Define the mapping of profile names (object keys) to role names (values for the object keys). | |
userExtensions | Object | Contains predefined settings for each user. | |
defaultGroup | Object | Define the default group which is added to the cockpit of each user on first login. | |
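As a sketch of the roleMapping setting, the object maps profile names (keys) to role names (values). The profile and role names below are hypothetical placeholders:

```json
{
  "bpaas-cockpit-service": {
    "roleMapping": {
      "analytics_viewer_profile": "viewer",
      "analytics_admin_profile": "admin"
    }
  }
}
```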
Now that you have installed and set up all components of Scheer PAS Process Mining, you can start the services.