The connector only supports provisioning.
Functionalities
Provisioning Integration
Data Format | Export | Create | Modify | Delete | Trigger
Tables | Yes | Yes | Yes | Yes | Yes
Api | No | Yes | Yes | Yes | No
Prerequisites
Ensure that these prerequisites are satisfied:
- IBM DB2 Version x, 8.x, or 9.x is installed, configured, and running.
- An administrator account that can be used to establish a connection and has authority to manage accounts on the connected system.
Creating the Connected System in the Admin UI
- Log in to Identity Administration and click the Systems tab.
- On the Connected System View page, click the Add button and select the IBM DB2 connected system from the Type drop-down list. The Connected System Details page displays the default values:
- Enter the desired information:
Definition
Supported Connectors - Displays whether the connected system is Identity only, Provisioning only, or both.
Password Policy - Displays the name of the password policy associated with the connected system. Note: Because this is a Provisioning only system, it does not support any password policy.
Connected System Group - Displays the name of the system group that includes this connected system.
Note: If a password policy is associated with a connected system and then the connected system is placed in a group, the group’s password policy will override the connected system’s password policy. The password policy will be removed from the connected system.
Type - Select the connected system type.
Locale - Select the preferred language (default: English). Locale-specific information such as Display Name and Description can be added only while modifying the connected system.
Name - The name for this connected system. Note: The name cannot be modified later.
Display Name - The display name of the new connected system.
Description - The description of the connected system.
Associated With - Select how the connector associated with this system will run:
- Server (default) - Runs locally on the Provisioning/Identity Server.
- Global Identity Gateway - Runs remotely on a Global Identity Gateway cluster member. Note: Only GIG clusters that have at least one registered and enabled member will display in this list.
- See Using the Global Identity Gateway with Connected Systems for additional information.
Password Reset By - Enables administrators to configure password management functions normally available to Users and OBO (On Behalf Of) Users:
- OBO User Only - Connected system and account association information is displayed only in Self-Service user management (for OBO Users). OBO Users can reset passwords for accounts on this connected system. Administrators can perform all user management functions for this connected system (e.g., enable/disable, validate, associate user, and password reset). End users will not see their accounts on this connected system in Self-Service and Kiosk; therefore, they cannot reset passwords for accounts on this connected system.
- Users and OBO User - Connected system and account association information is displayed in Self-Service password reset, Self-Service - Kiosk, and Self-Service user management. Self-Service users, Kiosk users, and OBO Users can reset passwords for accounts on this connected system. Administrators can perform all user management functions for this connected system (e.g., enable/disable, validate, associate user, and password reset).
- External - Connected system and account association information is not displayed in Self-Service password reset, Self-Service - Kiosk, and Self-Service user management. Self-Service users, Kiosk users, and OBO Users cannot reset passwords for accounts on this connected system.
Note: When user management configuration enables OBO Users to perform password resets, this definition must be set to OBO User Only or Users and OBO User. For connectors that support Provisioning only, there is no password reset capability.
Provisioning Option - Select the provisioning option:
- Automated (default) - The connected system functions as a normal connected system; there are no restrictions.
- Administrative - The connected system cannot be used as an object in a workflow.
Enable HPAM Support - Select to make the connected system HPAM enabled (default: cleared). Note: This can only be set for systems that support Identity.
Connection Information
Host - The IP address or host name of the server (e.g., 10.102.200.20 or localhost).
Port - The database port number.
Service Account Name - The name of the administrative user account used to connect to the server.
Service Account Password - The administrative user password.
Initial DB - The SID or Service Name to connect to.
Maximum Connection Pool Size - Select the maximum number of connections that the connector can create in the connection pool. As needed, the connection pool will grow only to this maximum limit.
System Owner - Add or Remove users assigned as the owners of the system. Displays the Connected System Owner Search page for selecting users. The HPAM column indicates whether the system owner is authorized to use the HPAM feature. The Approvers column indicates whether the system owner is an approver in the approval process.
- Click the Test Connection button to test the Connection Information:
- If successful, this message displays:
Message: Connection from Provisioning to the connected system was established successfully.
- If unsuccessful, this message displays:
Error: Failed to establish connection from Provisioning to the connected system.
Note: If the connection fails, additional messages may display providing more information regarding the failure, and additional information may be posted to the Provisioning and Identity logs.
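The Connection Information fields above map onto a standard DB2 connection string. The sketch below shows how such a string is typically assembled; the host, port, database, and credentials are placeholder values, and the `ibm_db` driver mentioned in the comment is only one common way to connect to DB2 from Python:

```python
# Sketch: assembling the Connection Information fields (Host, Port,
# Initial DB, Service Account Name, Service Account Password) into a
# DB2 CLI/ODBC-style connection string. All values are placeholders.

def build_db2_conn_str(host, port, database, user, password):
    return (
        f"DATABASE={database};HOSTNAME={host};PORT={port};"
        f"PROTOCOL=TCPIP;UID={user};PWD={password};"
    )

conn_str = build_db2_conn_str("10.102.200.20", 50000, "SAMPLE",
                              "db2admin", "secret")
# With the ibm_db driver, this string would be passed to
# ibm_db.connect(conn_str, "", "")
```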
- (Optional) To select owners of the system, click the System Owner Add button. The Connected System Owner Search page displays:
- Select the owners and then click the Select button. The system owner displays under the System Owner section:
Note: More than one user can be assigned as an owner.
- To add additional system owners, click the Add button.
- On the Connected System Details page, click the Add button to save the configured connected system. The Object Category Association page displays a list of categories that are already associated and/or can be selected to add additional associations to this connected system:
- Select one or more available object categories, or provide search criteria and click the Search button to find specific categories to select. If there are no available categories to select, skip ahead to clicking the Back button.
- Click the Add Association button to associate the selected object categories to the connected system.
- Click the Back button to return to the Connected System View page. The new connected system displays in the list.
See Copying, Modifying, and Deleting Connected Systems for additional information.
Creating the Connected System in the Studio
- Log in to the Workflow and Connectivity Studio and click Connectivity ► Add Systems on the menu bar. The Add Connected Systems window displays.
- Select the IBM DB2 connected system from the Type drop-down list. The default values display:
- Enter the desired information:
Definition
Type - Select the connected system type.
Name - The name for this connected system. Note: The name cannot be modified later.
Display Name - The display name of the new connected system.
Description - The description of the connected system.
Supported Connectors - Displays whether the connected system is Identity only, Provisioning only, or both. Only connectors that support Provisioning are available here.
Associated With - Select how the connector associated with this system will run:
- Server (default) - Runs locally on the Provisioning/Identity Server.
- Global Identity Gateway - Runs remotely on a Global Identity Gateway cluster member. Note: Only GIG clusters that have at least one registered and enabled member will display in this list.
Password Reset By - Enables administrators to configure password management functions normally available to Users and OBO (On Behalf Of) Users:
- OBO User Only - Connected system and account association information is displayed only in Self-Service user management (for OBO Users). OBO Users can reset passwords for accounts on this connected system. Administrators can perform all user management functions for this connected system (e.g., enable/disable, validate, associate user, and password reset). End users will not see their accounts on this connected system in Self-Service and Kiosk; therefore, they cannot reset passwords for accounts on this connected system.
- Users and OBO User - Connected system and account association information is displayed in Self-Service password reset, Self-Service - Kiosk, and Self-Service user management. Self-Service users, Kiosk users, and OBO Users can reset passwords for accounts on this connected system. Administrators can perform all user management functions for this connected system (e.g., enable/disable, validate, associate user, and password reset).
- External - Connected system and account association information is not displayed in Self-Service password reset, Self-Service - Kiosk, and Self-Service user management. Self-Service users, Kiosk users, and OBO Users cannot reset passwords for accounts on this connected system.
Note: When user management configuration enables OBO Users to perform password resets, this definition must be set to OBO User Only or Users and OBO User. For connectors that support Provisioning only, there is no password reset capability.
Provisioning Option - Select the provisioning option:
- Automated (default) - The connected system functions as a normal connected system; there are no restrictions.
- Administrative - The connected system cannot be used as an object in a workflow.
Enable HPAM Support - Select to make the connected system HPAM enabled (default: cleared). Note: This can only be set for systems that support Identity.
Connection Information
Host - The IP address or host name of the server (e.g., 10.102.200.20 or localhost).
Port - The port number.
Service Account Name - The name of the administrative user account used to connect to the server. The Select button displays the Select DN from LDAP Directory window to select the DN value.
Service Account Password - The administrative user password.
Initial DB - The SID or Service Name to connect to.
Maximum Connection Pool Size - Select the maximum number of connections that the connector can create in the connection pool. As needed, the connection pool will grow only to this maximum limit.
- Click the Connect button to test the Connection Information:
- If successful, this message displays:
Connection from Studio to the connected system was established successfully.
- If unsuccessful, this message displays:
Failed to establish connection from Studio to the connected system.
Note: If the connection fails, additional messages may display providing more information regarding the failure, and additional information may be posted to the Provisioning and Identity logs.
- Click the Apply button to apply changes. The Category Association window displays.
- Select one or more object categories from the Available Categories list, or enter a category name and click the Search button to find a specific category to select. If there are no available categories to select, skip ahead to clicking OK.
- Click the Add button to associate the selected object categories to the connected system.
- Click OK to accept the selected categories.
See Copying, Modifying, and Deleting Connected Systems for additional information.
Using the Connected System for Provisioning
Perform these procedures to configure the connector:
Note: If the number of records to be processed exceeds one thousand, we recommend configuring the workflow to use bulk mode, which lowers the memory consumption of the system by streaming data to files. Because data is streamed for every task, performance of the workflow execution will be decreased due to increased read-write operations. See the Workflow and Connectivity Studio document for details on how to configure bulk mode.
Configuring for Export
Perform these procedures to configure the connector for data export:
From the Workflow and Connectivity Studio, select the IBM DB2 UserExport workflow listed under the projects folder.
If a workflow does not already exist, create an export workflow. See the Workflow and Connectivity Studio documentation for details on creating export workflows.
Configuring the Export Connector
- In the Design pane, double-click the export object (the first workflow object after the Start object). The Configure Data Source window displays:
- From the Configure Plug-in tab, set these properties as required:
Associated Connected System - Select the connected system from the list. The export operation will be done from this connected system.
Data Formats - Select the type of data format to use: Account (default), Company, Contact, Event, EventLocation, Invoice, or Payment.
DeltaExportMode - Select the type of attribute to export if a change takes place (this works in conjunction with ExportMode when DeltaExport is selected):
- OnlyChangedAttributes - Performs a partial export of only the changed attributes from the last time the query was run.
- ChangedAndMandatoryAttributes (default) - Performs a partial export of both changed and mandatory attributes from the last time the query was run. Mandatory attributes are exported whether they have been changed or not.
- AllAttributes - Performs a full export of all attributes that contain a value.
DynamicConnectedSystem - Select the global variable to use as the dynamic connected system name. This works in conjunction with DynamicConnectedSystemOption when GlobalVariable is selected.
DynamicConnectedSystemOption - Select how to control Dynamic System Support (DSS):
- None - There will not be any Dynamic System Support.
- Transaction-SystemName - The value of the Transaction-SystemName attribute in data will be used as the dynamic connected system. The connected system name must be passed as the value of the attribute Transaction-SystemName; if it is missing in data, the operation will fail.
- GlobalVariable - Select a global variable to use as the dynamic connected system name from the property DynamicConnectedSystem.
ExcludeEmptyFields - Select how to process null fields:
- FALSE - Returns null fields with empty values.
- TRUE - Ignores null fields.
ExportMode - Select the type of data to export:
- FullExport - Exports all attributes.
- DeltaExport - Exports changed, mandatory, or all attributes, depending on the DeltaExportMode property setting.
MaximumRows - Select the maximum number of records to be exported (default: 0 [zero, for unlimited]). Note: Hover the pointer over a property to view its description.
- (Optional) Select the Appearance tab to change how the Connected System object displays in the Design pane.
- Click OK to save any changes and return to the Workflow and Connectivity Studio window.
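The interaction between the ExportMode and DeltaExportMode properties described above can be sketched as follows. This is a simplified model of the documented behavior, not the connector's actual implementation, and the attribute names are invented:

```python
# Sketch of how ExportMode/DeltaExportMode select which attributes
# are exported. Models the documented behavior only.

def select_attributes(all_attrs, changed, mandatory, export_mode,
                      delta_export_mode="ChangedAndMandatoryAttributes"):
    """Return the set of attribute names to export.

    all_attrs: attributes that currently hold a value
    changed:   attributes changed since the last query run
    mandatory: attributes checked in the link configuration
    """
    if export_mode == "FullExport":
        return set(all_attrs)                 # everything with a value
    # DeltaExport:
    if delta_export_mode == "OnlyChangedAttributes":
        return set(changed)
    if delta_export_mode == "ChangedAndMandatoryAttributes":
        return set(changed) | set(mandatory)  # mandatory always included
    return set(all_attrs)                     # AllAttributes

attrs = {"EMPLID", "OPRID", "EMAIL"}
out = select_attributes(attrs, changed={"EMAIL"}, mandatory={"EMPLID"},
                        export_mode="DeltaExport")
# out holds EMAIL (changed) plus EMPLID (mandatory)
```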
Configuring the Export Link
- In the Design pane, double-click the export link between the export object (the first workflow object after the Start object) and the Data Mapper object. The Configure Link window displays:
Database - Select the database name:
- Static - Select from the Database Name drop-down list.
- Dynamic - Select from the Global Variable drop-down list.
Schema - Select the database source for the selected fields: Tables or Views.
Tables - Lists the schema tables available for export.
Fields - Lists the fields available for export.
Selected Fields - Lists the selected schema table fields for export. Note: The check boxes (Change attributes for delta export) are used only for delta export operations; checked attributes will always be exported, whether they were changed or not.
Format - Specifies a desired date/time format to be applied to a selected date type field. Only date fields can have a date/time format applied to their value.
Advanced Settings - Displays the Configure Attributes window for selecting any attributes that need to be encrypted.
SQL Query - Displays the SQL query run against the database. The Edit button edits the query.
Preferred Key(s) - Select the field(s) in the Selected Fields list:
- Set Key - Sets the primary key.
- Get Key - Gets the Preferred Key(s) of the selected table/view.
- From the Datasource tab, perform these steps: select a database from the Schema drop-down list; select whether the source is Tables or Views; select the table or view from the Tables/Views list; and select the fields from the Fields list to export, adding them to the Selected Fields list.
- Select the field(s) from the Selected Fields list that require a date and/or time format and click the Format button. The Format Date window displays.
- Select the Include Time check box to add the timestamp with the date. Select the 24 Hour or 12 Hour option button, and then select the required date/time format. Click OK to save the selected format. The Configure Link window displays.
- Perform one of these steps:
- Select the field(s) in the Selected Fields list and click the Set Key button to set the primary key.
Or
- Click the Get Key button to get the Preferred Key(s) of the selected table/view.
- Edit the query in the SQL Query area or click the Edit button.
Notes:
- Query modifications can also be done manually after all fields are selected. This field also supports copy/paste from other sources.
- When formatting a SQL query with functions such as rtrim, ltrim, convert, and to_char, you must write the query as in this example before formatting:
Select PSOPRDEFN_DF.EMPLID, PSOPRDEFN_DF.OPRID from PSOPRDEFN_DF
After formatting:
Select rtrim(PSOPRDEFN_DF.EMPLID) as EMPLID, PSOPRDEFN_DF.OPRID from PSOPRDEFN_DF
- You can set up a dynamic database query when one workflow is initiated by another workflow or trigger. For example, when a data change occurs, you can set the query to dynamically substitute the trigger data in the query. This will return only specific records for the substituted value.
- To filter the search data, enter a WHERE or AND clause at the end of the query or in the Where Clause text area, with syntax such as TABLE.COLUMN='##CN##'. Note that single quotation marks ( ' ) must be used outside of the ## syntax for a database.
- Check the boxes in the Selected Fields list to set mandatory attributes.
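The ##attribute## placeholder substitution used in dynamic queries can be illustrated with a small sketch. The table, column names, and trigger data below are invented examples; the single quotes stay outside the ## markers, as the notes above require:

```python
import re

# Sketch: substituting trigger data into a dynamic query.
# Every ##name## placeholder is replaced with the corresponding
# value from the incoming data.

def substitute(query, data):
    """Replace each ##name## placeholder with data[name]."""
    return re.sub(r"##(\w+)##", lambda m: str(data[m.group(1)]), query)

query = "Select USERS.CN, USERS.MAIL from USERS where USERS.CN='##CN##'"
print(substitute(query, {"CN": "jsmith"}))
# → Select USERS.CN, USERS.MAIL from USERS where USERS.CN='jsmith'
```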
- Click OK to save any changes and return to the Workflow and Connectivity Studio window.
- Deploy the workflow by selecting Deploy ► New Deployment. See the Workflow and Connectivity Studio documentation for details of deployment options.
- Manage and run the deployed workflow from the Admin UI ► Server tab. See the Identity Suite Administration documentation for details.
Configuring for Import
There are two types of imports for the connector:
- Tables - Classified by add (insert), modify (update), or delete (delete) changetypes. These operations are performed dynamically, based on the changetype of the incoming document. Changetypes can be set automatically from the incoming process or dynamically using the Data Mapper. The connector runs the appropriate query against the tables and fields specified in the link configuration.
- Api - Stored procedure imports enable the connector to run stored procedures in the database based on selected API variables.
Perform these procedures to configure the connector for data import:
From the Workflow and Connectivity Studio, select the IBM DB2 UserAdd, UserModify, or UserDelete workflow listed under the projects folder.
If a workflow does not already exist, create an import workflow. See the Workflow and Connectivity Studio documentation for details on creating import workflows.
Configuring the Import Connector
Table imports are classified by add, modify, or delete (insert, update, delete) changetypes. These operations are performed dynamically based on the changetype of the incoming document. Changetypes can be set automatically from the incoming process, or dynamically using the Data Mapper.
The DB2 database agent runs the appropriate query against the tables and fields specified in the link configuration.
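The changetype-to-query mapping described above can be sketched roughly as follows. The table and column names are invented; the real connector generates its queries from the tables and fields chosen in the link configuration:

```python
# Sketch: how an incoming changetype selects the kind of SQL statement.
# Illustrative only; parameter markers (?) stand in for attribute values.

def build_statement(changetype, table, fields, key):
    if changetype == "add":
        cols = ", ".join(fields)
        marks = ", ".join("?" for _ in fields)
        return f"INSERT INTO {table} ({cols}) VALUES ({marks})"
    if changetype == "modify":
        sets = ", ".join(f"{f} = ?" for f in fields if f != key)
        return f"UPDATE {table} SET {sets} WHERE {key} = ?"
    if changetype == "delete":
        return f"DELETE FROM {table} WHERE {key} = ?"
    raise ValueError(f"unknown changetype: {changetype}")

print(build_statement("modify", "USERS", ["CN", "MAIL"], key="CN"))
# → UPDATE USERS SET MAIL = ? WHERE CN = ?
```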
- In the Design pane, double-click the import object (the last workflow object). The Configure Data Source window displays:
- From the Configure Plug-in tab, set these properties as required:
Associated Connected System - Select the connected system from the list. The import operation will be done to this connected system.
Data Formats - Select the type of data format to use:
- Tables - Fetches the data from a table.
- Api - Executes APIs.
DynamicConnectedSystem - Select the global variable to use as the dynamic connected system name. This works in conjunction with DynamicConnectedSystemOption when GlobalVariable is selected.
DynamicConnectedSystemOption - Select how to control Dynamic System Support (DSS):
- None - There will not be any Dynamic System Support.
- Transaction-SystemName - The value of the Transaction-SystemName attribute in data will be used as the dynamic connected system. The connected system name must be passed as the value of the attribute Transaction-SystemName; if it is missing in data, the operation will fail.
- GlobalVariable - Select a global variable to use as the dynamic connected system name from the property DynamicConnectedSystem.
See the Dynamic System Support appendix in the Workflow and Connectivity Studio document for additional information.
Id * - Enter the attribute that contains the value used to uniquely identify the user account user ID on the connected system.
loginId * - Enter the attribute that contains the value used to uniquely identify the user account login ID on the connected system.
MaxConcurrentEntryProcessing - Specify the maximum number of entries to be processed concurrently. For each concurrent process, the connector creates new resource threads and connections; therefore, set this property based on resource availability. When MaxConcurrentEntryProcessing is set, multiple entries are processed in parallel, reducing the time taken for bulk import tasks.
ModifyIfEntryExists - Select whether to perform a modify operation if an add operation fails (default: FALSE).
Notes:
* Id and loginId are used by the Provisioning Policy and IdentityHub features to populate the ACCOUNT_ID and ACCOUNT_USERNAME columns of the FISC_USER_ACCOUNT table of the Product database. See the ‘Provisioning Policy’ and ‘Provisioning Using the IdentityHub’ chapters of the Identity Suite Administration Guide for details.
Hover the pointer over a property to view its description.
- (Optional) Select the Appearance tab to change how the Connected System object displays in the Design pane.
- Click OK to save any changes and return to the Workflow and Connectivity Studio window.
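The effect of MaxConcurrentEntryProcessing can be pictured with a small concurrency sketch. This is purely illustrative; the connector's internal threading model is not public, and `process_entry` is a stand-in for the per-entry work (queries, connections, and so on):

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch: processing import entries in parallel, bounded by a
# MaxConcurrentEntryProcessing-style limit.

MAX_CONCURRENT_ENTRY_PROCESSING = 4

def process_entry(entry):
    # Stand-in for the connector's per-entry import work.
    return f"imported {entry}"

entries = [f"user{i}" for i in range(10)]
with ThreadPoolExecutor(max_workers=MAX_CONCURRENT_ENTRY_PROCESSING) as pool:
    results = list(pool.map(process_entry, entries))

print(results[0])
# → imported user0
```

A higher limit shortens bulk imports but consumes more threads and database connections, which is why the documentation ties the setting to resource availability.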
Configuring the Import Link
This procedure depends on whether Tables or Api is selected as the Data Format when configuring the import connector. Perform one of these procedures to configure the import link:
Table Data Format
- In the Design pane, double-click the import link between the Data Mapper object and the import object (the last workflow object). The Configure Link window displays:
Database - Select the database name:
- Static - Select from the Database Name drop-down list.
- Dynamic - Select from the Global Variable drop-down list.
Schema - Select the database source for the selected fields: Tables or Views.
Tables - Lists the schema tables available for import.
Fields - Lists the fields available for import. Check for attribute-level auditing: if auditing is enabled and these fields are checked, Provisioning will log all events for auditing purposes.
Selected Fields - Lists the selected schema table fields for import.
Set As Function - Specifies system functions as the insert/update value in the import query for the selected attribute (e.g., sysdate, now).
Format - Specifies a desired date/time format to be applied to a selected date type field. Only date fields can have a date/time format applied to their value.
SQL Query - Displays the SQL query run against the database. The Edit button edits the query.
Import using template format - Generates the import query using the given format. The actual query is generated by substituting the ##attribute name## placeholders with the values given to the import task. Template generates the template for the import query in the SQL Query text area.
Where Clause - Displays the where clause in the SQL Query. Note: This text area is editable only when the Update Manually option button is selected.
Update - Select one of these update option buttons:
- Automatically - The where clause is updated automatically.
- Manually - The where clause can be edited manually in the text area.
Advanced Settings - Displays the Configure Attributes window for selecting any attributes that need to be encrypted.
Preferred Key(s) - Select the field(s) in the Selected Fields list:
- Set Key - Sets the primary key.
- Get Key - Gets the Preferred Key(s) of the selected table/view.
Audit Key - Select the attribute to associate with the Audit Key.
- From the Datasource tab, perform these steps:
- Select a Schema from the drop-down list, select the table from the Tables list, and select the fields from the Fields list.
- Perform one of these steps:
- Select the field(s) in the Selected Fields list and click the Set Key button to set the primary key.
Or
- Click the Get Key button to get the Preferred Key(s) of the selected table/view.
- The SQL Query area displays the SQL query run against the database. You can edit the query here or click the Edit button.
- Check the boxes in the Selected Fields list to enable auditing of the attributes.
- Click OK to save any changes and return to the Workflow and Connectivity Studio window.
- Deploy the workflow by selecting Deploy ► New Deployment.
See the Workflow and Connectivity Studio document for details of deployment options.
- Manage and run the deployed workflow from the Admin UI ► Server tab. See the Identity Suite Administration documentation for details.
- (Optional) Select the Appearance tab to change how the link displays in the Design pane.
API Data Format
- In the Design pane, double-click the import link between the Data Mapper object and the import object (the last workflow object). If you selected the Api data format when configuring the import connector, this Configure Link window displays:
Note: Modifying APIs is optional. See the appendix Working with Database APIs for detailed information.
API - Lists the APIs available for import.
API Variables - Lists the variables available for import.
Selected Variables - Lists the selected API variables for import. If auditing is enabled and these variables are checked, Provisioning will log all events for auditing purposes.
Modify APIs - Adds or modifies APIs. Displays the Modify API window.
Advanced Settings - Displays the Configure Attributes window for selecting any attributes that need to be encrypted.
- Perform these steps:
- Select the API name in the API field, select the API variable in the API Variables field, and add the selected API variables to the Selected Variables field.
- Click the Modify APIs button to modify the API. The Modify API window displays:
API Names - Displays configured APIs. Select the API to modify or delete.
API Variables - Displays the API variables corresponding to the selected API. Select the API variable to modify or delete.
Use API Format - Select the method to build the API at runtime with this check box:
- If not selected (default), each variable will be listed in the API Variable Order list box. The variables will be used in the API Variable Order while generating the API. The Up and Down buttons can be used to change the order.
- If selected, the format can be specified in the API Format text area that becomes visible. Specify the variables inside ## markers (e.g., ##sp_adduser.loginame##). The API will be generated by substituting the values of the variables in the API format. The Edit button that becomes visible displays the Api Format dialog.
API Variable Order - Change the variable order with the Up and Down buttons.
Primary Key - Select the value to use to uniquely identify an object (e.g., user or group).
Failure Message Attribute - The output attribute that returns the failure reason. This is an optional setting; if specified, the return value is used to set the status description on API execution failure.
Role Attribute - Sets the Role attribute.
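The API Format mechanism can be sketched as follows. The stored procedure name sp_adduser and its variables come from the placeholder example above; everything else here is illustrative, not connector code:

```python
import re

# Sketch: generating a stored-procedure call from an API Format string.
# Variables appear inside ## markers (e.g., ##sp_adduser.loginame##)
# and are replaced with the values supplied to the import task.

def render_api(api_format, values):
    return re.sub(r"##([\w.]+)##", lambda m: str(values[m.group(1)]),
                  api_format)

api_format = ("call sp_adduser('##sp_adduser.loginame##', "
              "'##sp_adduser.name_in_db##')")
print(render_api(api_format, {
    "sp_adduser.loginame": "jsmith",
    "sp_adduser.name_in_db": "John Smith",
}))
# → call sp_adduser('jsmith', 'John Smith')
```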
- Click OK when done modifying APIs to return to the Configure Link window.
Note: The query is not updated. When importing into multiple tables, a manual query has to be written to join the tables and fields together.
See the appendix Working with Database APIs for additional information about modifying APIs.
- Click OK to save any changes and return to the Workflow and Connectivity Studio window.
- Deploy the workflow by selecting Deploy ► New Deployment.
See the Workflow and Connectivity Studio document for details of deployment options.
- Manage and run the deployed workflow from the Admin UI ► Server tab. See the Identity Suite Administration documentation for details.
Connector Details for Provisioning
Configuration import properties Id and loginId are used by the Provisioning Policy and IdentityHub features to populate the ACCOUNT_ID and ACCOUNT_USERNAME columns of the FISC_USER_ACCOUNT table of the Product database. See the ‘Provisioning Policy’ and ‘Provisioning Using the IdentityHub’ chapters of the Identity Suite Administration Guide for details.
Connector Attributes
The items in the MV (multi-valued), Export, Create, Modify, and Delete columns have these meanings:
- Y = Yes (attribute is supported for this operation)
- N = No (attribute is not supported for this operation)
- R = Required (attribute is mandatory for this operation)
- NA = Not applicable
Name | MV | Export | Create | Modify | Delete | Description
Access rights | NA | NA | NA | R | NA | Privilege of user.
Table Name | NA | NA | NA | R | NA | Table name.
User Name | NA | NA | NA | R | NA | Key attribute for all operations.
Note: These APIs/Tables are used for Provisioning operations:
Configuring Triggers
Perform these procedures to create a trigger:
Prerequisites
Ensure that these prerequisites are satisfied:
- Create an IBM DB2 provisioning connector before creating an IBM DB2 trigger (see the section Creating the Connected System in the Studio).
- Create and deploy workflows to be run by the IBM DB2 trigger. See the ‘Creating Workflows’ and ‘Deploying Workflows’ sections in the Workflow Development chapter in the Workflow and Connectivity Studio document for details.
Creating a Trigger
- From the Workflow and Connectivity Studio menu bar, click File ► New Trigger ► IBM DB2 Trigger. The Create a New Trigger window displays.
- Enter a trigger name in the Name field.
- Click the Browse button to select a directory other than the default displayed in the Directory field. The directory should be a child of the default location in order to have the trigger listed under the projects folder of the Workflow and Connectivity Studio.
- Select one of the available systems in the System field.
Note: Only connected systems of the trigger type selected in Step 2 will be available. If there are no connected systems to select, then an IBM DB2 provisioning connected system does not exist. This connected system must exist before creating a trigger.
- Enter descriptive text in the Description field and then click OK. A new trigger system object and link display in the Design pane.
Note: The trigger must be fully configured before it can be saved and deployed. Continue with the sections below to complete configuring the trigger.
Configuring a Trigger Agent
- Copy the trigger deployment zip file:
  - From: IdM Suite Software folder\Provisioning\Resource\Triggers\DB2UDB
  - To: A folder on the DB2 database server (e.g., C:\DB2Trigger)
- Unzip the trigger file to the trigger installation folder. These files and folders are created:
  - C:\DB2Trigger\trigger.properties (the trigger configuration file)
  - C:\DB2Trigger\db2triggercallback.jar (the jar file containing all trigger Java classes)
  - C:\DB2Trigger\data\ (the folder used to store trigger request data)
  - C:\DB2Trigger\log\ (the folder used to store log files)
- Modify the properties file according to the Provisioning Server settings such as TriggerWebServiceHost and TriggerWebServicePort.
  Note: Modify LogFilePath and DataFilePath to point to an existing folder, for example:
  LogFilePath=C:\DB2Trigger\log\Trigger.log
  DataFilePath=C:\DB2Trigger\data\Trigger.dat
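Taken together, a minimal trigger.properties for this step might look like the following sketch; the host name and port are placeholders for your own Provisioning Server, not values from the product:

```properties
# Hypothetical values -- substitute your Provisioning Server host and port
TriggerWebServiceHost=provserver.example.com
TriggerWebServicePort=8080
# Both paths must point to folders that already exist
LogFilePath=C:\DB2Trigger\log\Trigger.log
DataFilePath=C:\DB2Trigger\data\Trigger.dat
```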
- Connect to the DB2 database using the CLP (Command Line Processor) command:
  connect to <database> user <user> using <password>
- Register the jar file using the DB2 CLP command:
  db2 => call sqlj.install_jar('file:///C:/DB2Trigger/db2triggercallback.jar','db2trigger');
- Copy the DB2 license files (e.g., jar files) from your DB2 Server to the Provisioning installation directory and the appropriate Web server directory. For example:
- <DATAFORUM_HOME>\..\jars (e.g., C:\Fischer\Provisioning\jars)
- <DATAFORUM_HOME>\..\wars\dataforum\WEB-INF\lib (e.g., C:\Fischer\Provisioning\wars\dataforum\WEB-INF\lib)
Where <DATAFORUM_HOME> is the environment variable value referencing the base folder of the Provisioning installation (e.g., C:\Fischer\Provisioning\dataforum).
- (Optional) If you decide to use HTTPS for Simple Object Access Protocol (SOAP) communication, configure TriggerWebServicePort, EnableSSL, TruststorePath, SSLProvider, HttpsHandler, and TruststorePassword in the properties file. For example:
TriggerWebServicePort=8443 [Provisioning SSL port]
EnableSSL=True
SSLProvider=com.sun.net.ssl.internal.ssl.Provider
HttpsHandler=com.sun.net.ssl.internal.www.protocol
TruststorePath=C:\DB2Trigger\cacerts\cacerts
TruststorePassword=changeit
Note: After modifying the trigger.properties file, restart the DB2 database.
Redeploying the War Files
Notes:
- If multiple connectors are being configured, these steps only need to be run once, after all jar files have been copied for all connectors.
- If the connector is being run remotely on a Global Identity Gateway (GIG) platform, the steps will vary slightly as noted below.
- Run war.bat from <IDENTITY_HOME>\..\wars to create a new identity.war file.
- Run war.bat from <DATAFORUM_HOME>\..\wars to create a new dataforum.war file.
Note: If the connector is being run remotely on a GIG, run idmgig.bat and provgig.bat from <GIG_HOME>\wars to create new idmgig.war and provgig.war files.
- Complete the steps required to install the war files on the application server so they can be deployed. For example, for Apache Tomcat, delete the dataforum and identity folders and the dataforum.war and identity.war files if they exist. Copy the new identity.war and dataforum.war files to the apache-tomcat webapps subdirectory.
Note: If the connector is being run remotely on a GIG, delete the idmgig and provgig folders and the idmgig.war and provgig.war files. Copy the new idmgig.war and provgig.war files to the apache-tomcat webapps subdirectory.
- Restart the application server.
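The Tomcat portion of the steps above can be sketched as a short script. This is an illustrative sketch only, run here against a throwaway sandbox so it is safe to execute as-is; the apache-tomcat and wars paths are assumptions you would replace with your real folders:

```shell
# Sandbox stand-ins for the real apache-tomcat webapps and wars folders
sandbox=$(mktemp -d)
webapps="$sandbox/apache-tomcat/webapps"
wars="$sandbox/Provisioning/wars"
mkdir -p "$webapps/identity" "$webapps/dataforum" "$wars"
printf 'new build' > "$wars/identity.war"      # freshly built by war.bat
printf 'new build' > "$wars/dataforum.war"
printf 'stale' > "$webapps/identity.war"       # leftover from the old deploy

for app in identity dataforum; do
  # Delete the exploded folder and the old war so Tomcat redeploys cleanly
  rm -rf "$webapps/$app" "$webapps/$app.war"
  # Copy the freshly built war into the webapps directory
  cp "$wars/$app.war" "$webapps/"
done
```

After copying, restart the application server so Tomcat expands the new war files.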
Configuring a Trigger Link
- Double-click the link between the Start object and the Trigger system object. The Configure Link window displays.
Note: To modify an existing trigger, on the menu bar click View ► Triggers, and then select one of the IBM DB2 triggers listed under the projects folder.
- Schemas: Sets the schema table where the trigger watches for selected database operations to occur.
- Operations: Select which database operations cause the trigger to execute:
  - Insert - Insertions of new records into the selected schema table.
  - Update - Changes to existing records in the selected schema table.
  - Delete - Deletions of existing records in the selected schema table.
- Tables: Lists the schema tables available for trigger support.
- Fields: Lists the fields available for trigger support.
- Unique Key: Displays the attribute(s) that make the entry in the selected schema table unique.
- Get Key: Retrieves a unique key from the selected schema table.
- Set Key: Sets which attribute from the Selected Fields will make the entry unique.
- Clear Key: Removes the current unique key attribute selection. No unique key attribute is defined after selecting this option.
- Selected Fields: Lists the selected schema table fields for trigger support.
- Format: Specifies a desired date/time format to be applied to a selected date type field. Only selected date fields can have a date/time format applied to their value.
- Check Mandatory Attributes: Check boxes in this field to set mandatory attributes. Checked attributes will always be exported whether they were changed or not.
- Advanced Settings: Displays the Configure Attributes window for selecting any attributes that need to be encrypted.
- Callback Folder: Enter the physical directory where you copied the Provisioning agent files (e.g., C:\Dataforum).
- Effective Date: Select these effective date options:
  - Set - Sets an attribute from the selected attributes to apply an effective date offset to control when the triggered data is run. A condition can be provided that determines when or if an effective date offset should be applied. Set a condition and effective date offset from the Effective Date tab.
  - Clear - Removes the selected attribute from being defined for effective date processing.
  - Format - Specifies a desired date/time format to be applied to the selected effective date field. Any field type can be selected to apply a date/time format to the effective date value.
- From the Trigger Properties tab, perform these steps:
  - Select a schema from the Schemas field.
  - Select a table where the trigger is to be implemented from the Tables list.
  - Select required fields from the Fields list and add them to the Selected Fields list. For the trigger to work properly, at least one field must be selected.
  - Select the preferred key in the Unique Key field by clicking the Set Key button.
  - Check the boxes in the Selected Fields list if both the current data and the previous data before the change (called Original_field) are to be sent during the trigger.
- Under Operations, select the operations (Insert, Update, or Delete) for which the trigger is to be generated.
- Enter the Callback Folder where the trigger configuration file (trigger.properties) resides.
- Select an attribute from the Selected Fields and click Set in the Effective Date section to apply an effective date, if desired.
- Click the Format button to specify a particular date and time format for the selected Effective Date.
- Click the Effective Date tab to configure an effective date condition and/or offset value. If effective date processing is not required, proceed to Step 23.
- Click the Add button. The Set Trigger Data Condition window displays.
- Set an Effective Date Offset value and specify a condition when it will be used:
- For triggers - All conditions specified here will be evaluated for each incoming data record. The offset corresponding to the first condition that is satisfied will be applied to the date contained in the effective date attribute. An offset can be mapped to a condition that is specified as default. If none of the conditions in the list are satisfied, the offset corresponding to the default condition will be applied to the effective date.
- For Chained workflows - From the Chained workflow Configure Data Source window, specify the attribute that should have an effective date condition and offset value. From the preceding Data Mapper, provide conditions and offset values to calculate the target effective date value and save this value to the effective date attribute as the target attribute.
- Click OK when finished.
- From the Target Workflow Selection tab, select the deployed workflow(s) to run when the trigger occurs, and then click the Add > button.
To remove a selected workflow from being run, highlight it under Selected Workflows and click the < Remove button.
Notes:
- If more than one workflow is selected, they are run in the order listed.
- If workflows are deployed in Asynchronous mode, all workflows are run together.
- If serialized execution of workflows is required, consider chaining them.
- Highlight a workflow from the Selected Workflows list and click the Set Condition button to set a condition before running Target workflows. The Set Lookup Condition window displays.
- Set a check condition before running workflows. Build a complex condition with logical AND/OR.
- Click OK to return to the Configure Link window.
- From the Lookup Workflow Selection tab, select the deployed workflow(s) to run when the trigger occurs, and then click the Add > button.
To remove a selected workflow from being run, highlight it under Selected Workflows and click the < Remove button.
Notes:
- Lookup workflows may be required to get additional attributes needed to run Target workflows. Lookup workflows run prior to Target workflows.
- If more than one workflow is selected, they are run in the order listed.
- Lookup workflows must be deployed in Synchronous mode; otherwise, lookup data may not be available before running Target workflows.
- Highlight a workflow from the Selected Workflows list and click the Set Primary button to set the primary workflow to be run.
- Highlight a workflow from the Selected Workflows list and click the Set Condition button to set a condition before running Lookup workflows. The Set Lookup Condition window displays.
- Set a check condition before running workflows. Build a complex condition with logical AND/OR.
- Click OK to return to the Configure Link window.
- Click OK to save any changes and return to the Workflow and Connectivity Studio window.
- Save the trigger.
- Deploy the trigger by selecting Deploy ► New Deployment. The Deploy Trigger window displays:
- Select the Deploy Option:
- Execute (default) - Creates the trigger by the standard method.
- Generate Script - Creates database scripts necessary to create the trigger under the <installation folder>\share\workfiles\trigger-dbscripts\<trigger>. The CreateProcedure.sql and CreateTrigger.sql scripts can be used to create the procedure and trigger before enabling the trigger.
- Click OK to deploy the trigger.
- Select the Deploy Option:
- Enable the trigger from the Server tab of the Admin UI. See the Identity Suite Administration Guide for details on enabling triggers.