Monday, May 29, 2017

How to query deleted records from Salesforce with ODI12c

By default, the ODI12c Salesforce connector will not let you query deleted records in Salesforce. To give your connection visibility of the deleted records, set deletedRowsBehavior=include in the properties of your JDBC connection. Example below:


In your mapping, add a filter ISDELETED = true. Example below:
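Putting the two settings together, the query that such a mapping boils down to looks roughly like the sketch below. This is only an illustration: ACCOUNT is a hypothetical datastore, and the exact schema and column names depend on your ConfigOptions.

-- Returns only rows that have been deleted in Salesforce; ISDELETED can be
-- filtered on because deletedRowsBehavior=include is set on the connection.
SELECT ID, NAME, SYSTEMMODSTAMP
FROM   SFORCE.ACCOUNT
WHERE  ISDELETED = true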

Helpful Links:
Creating a Salesforce.com Data Server
Retrieve Deleted Records in Salesforce

Thursday, May 4, 2017

Creating a Mapping to perform an Update and Insert to an Oracle table with source data from Salesforce

As the post title says, in this post I have created a mapping to perform an update and insert to an Oracle table with source data from Salesforce. If you have not yet created a Salesforce data model, please refer to a previous post for guidance.

Now to begin, in Designer, expand your project folder and create a new mapping
Enter a name for your Mapping
In the Designer > Models accordion, locate the tables that will serve as your source and target objects, and drag them into the Logical tab of your mapping. The example screenshot below shows the source table, which is in Salesforce, and the target table, which is in an Oracle database. I have added a filter in this example.
Still in the Logical tab, in the Properties > Target box, select 'Incremental Update' for the Integration Type.
One field in my Salesforce table has a data type of LONGTEXTAREA. Please refer to a previous post on how you can set ODI12c to handle such data type. 
In the Physical tab, in the highlighted TARGET_GROUP, click on the target unit. In Properties > Loading Knowledge Module, select either <Default> or 'LKM SQL Multi-Connect' from the dropdown.
Still in the Physical tab, click the target table. In Properties > Integration Knowledge Module, select the appropriate IKM from the dropdown. You can select the out-of-the-box 'IKM SQL to SQL Incremental Update', but it will not work if one of the columns on your table has a CLOB data type: the IKM has a task that determines which records need an insert or an update by comparing all of the columns in a WHERE clause, and that comparison cannot be done in SQL when one of the columns is a CLOB. In this case, I have modified 'IKM SQL to SQL Incremental Update' so that only the SYSTEMMODSTAMP column is checked to determine whether a record has changed (see the sketch after these steps).
Save your work, execute and verify the data.
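To make the IKM modification above more concrete, the idea is to restrict the change-detection comparison to SYSTEMMODSTAMP instead of every mapped column. The real statement is generated from the KM template at run time, so the following is only a rough, hypothetical sketch of the detection logic (I$_MY_TARGET stands for the ODI flow table and MY_TARGET for the target table; both names are placeholders):

-- Hypothetical sketch: flag a row for update only when its SYSTEMMODSTAMP
-- differs from the value already in the target, so no CLOB column is compared.
UPDATE I$_MY_TARGET I
SET    IND_UPDATE = 'U'
WHERE  EXISTS (
         SELECT 1
         FROM   MY_TARGET T
         WHERE  T.ID = I.ID
         AND    T.SYSTEMMODSTAMP <> I.SYSTEMMODSTAMP
       )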





Creating a Mapping to do an Update & Insert to Salesforce with ODI12c

In this post, I have created a mapping that performs an update and insert on a Salesforce table.

Now to begin, in Designer, expand your project folder and create a new mapping
Enter a name for your Mapping
In the Designer > Models accordion, locate the tables that will serve as your source and target objects, and drag them into the Logical tab of your mapping. The example screenshot below shows the source table, which is in an Oracle database, and the target table, which is in Salesforce. I have added a filter in this example.
Still in the Logical tab, in the Properties > Target box, select 'Incremental Update' for the Integration Type.
Still in the Logical tab, click the target table. In the Properties > Attributes box, make sure to untick the checkboxes for the Salesforce audit columns (LastModifiedDate, CreatedById, IsDeleted, SystemModstamp, CreatedDate and LastModifiedById). In my example, I have also unticked OwnerId and Name because Salesforce assigns them values automatically during insert.
In the Physical tab, in the highlighted TARGET_GROUP, click on the target unit. In Properties > Loading Knowledge Module, select either <Default> or 'LKM SQL Multi-Connect' from the dropdown.
Still in the Physical tab, click the target table. In Properties > Integration Knowledge Module, select 'IKM SQL to SQL Incremental Update' from the dropdown. In the options, set FLOW_CONTROL to 'False' and FLOW_TABLE_LOCATION to 'STAGING'. As mentioned in my previous post, Salesforce cannot be used as a staging location; please refer to that post for more information on which IKMs you can use for Salesforce integration with ODI12c.
Save your work, execute and verify the data.


Friday, April 28, 2017

Creating a Salesforce Data Model with ODI12c

In the previous post, I showed how to create the Salesforce data server as well as the physical and logical schemas. In this post, we will proceed to create the data model.

In Designer > Models, create a New Model Folder. If you have more than one Salesforce instance available, for example a sandbox and a production database, you can put the two data models under one Salesforce model folder.

Under the Salesforce model folder, create a new model. In the Definition tab, enter the name of the data model. For Technology, select Salesforce.com. Finally, for the Logical Schema, select the logical schema you created earlier.


Now you can reverse-engineer the Salesforce tables available to you. I would, however, do a selective reverse-engineering so I can choose only the tables I need. Tick the 'Selective Reverse-Engineering' and 'New Datastores' checkboxes, tick the objects to reverse-engineer, and save. Then click 'Reverse Engineer'.


Your Data Model should now have the tables you reverse-engineered.


Now you can use the Salesforce tables as Source or Target in a Mapping.


Thursday, April 27, 2017

Creating a Salesforce.com Data Server and the Physical & Logical Schema with ODI12c

Creating the Data Server

1. In Topology > Physical Architecture > Salesforce.com, right click and select "New Data Server".

2. In the Definition tab:

  • Name: Name of the data server that will appear in Oracle Data Integrator
  • Instance/dblink (Data Server): Not required for Salesforce.com. Leave this field blank.
  • User/Password: User name and password for connecting to the data server (the password is your Salesforce password concatenated with your security token and must be at most 40 characters)


3. In the JDBC tab:

  • JDBC Driver: weblogic.jdbc.sforce.SForceDriver
  • JDBC URL: The URL used for connecting to the data server. For example, jdbc:weblogic:sforce://login.salesforce.com. 

In the Properties section:

  • ConfigOptions: The configuration options that you want to use. For example, (AuditColumns=all;MapSystemColumnNames=0;CustomSuffix=include;). I have set AuditColumns to 'all' so I can view the audit fields (see the quick check after step 4). I have set MapSystemColumnNames and CustomSuffix to 0 and 'include', respectively, so the column names appear as they are in Salesforce.
  • DatabaseName: The instance of the database. This needs to be changed according to the JDBC URL used. I suggest setting the DatabaseName so that you know which Salesforce instance you are connected to at any given moment. By default, ODI sets the database name based on the user name.

For more information on the connection properties supported by the Salesforce.com driver, see http://media.datadirect.com/download/docs/jdbc/alljdbc/help.html#page/jdbcconnect%2FConnection_Properties_11.html%23wwID0EZT5Y
4. Save & Test Connection.
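Once the connection tests successfully, a quick way to confirm that AuditColumns=all is doing what you expect is to run a query like the one below from any SQL client that uses the same driver and URL. ACCOUNT is a hypothetical object here; substitute a table you have access to.

-- Sketch: with AuditColumns=all, the Salesforce audit fields are exposed
-- as ordinary columns and can be selected directly.
SELECT ID, CREATEDDATE, CREATEDBYID, LASTMODIFIEDDATE,
       LASTMODIFIEDBYID, SYSTEMMODSTAMP, ISDELETED
FROM   SFORCE.ACCOUNT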

Creating the Physical Schema

1. Right click on the Data Server you created and select "New Physical Schema".

2. In the Definition tab, select SFORCE Schema.


Creating the Logical Schema

1. In Topology > Logical Architecture > Salesforce.com, right click and select "New Logical Schema".

2. In the Definition tab, enter the name of the logical schema. For the context, select from the dropdown the physical schema you created earlier.


Congratulations!
You are now ready to create your Salesforce Data Model.


Wednesday, April 26, 2017

Multiple Errors when executing ODI12c Mapping to update data in Salesforce

Error:
oracle.odi.runtime.agent.exception.ExecutionEngineException: java.sql.SQLSyntaxErrorException: [FMWGEN][SForce JDBC Driver][SForce]Table not found in statement [INSERT INTO SFORCE.C$_0AJ8IUSJ58MVU517U6TKC9FJHQK (
ID_,
OWNERID,
ISDELETED,
.....
This error occurs as a result of the ODI warning:
ODI-1228: Task Create staging (C$) table-LKM SQL to SQL (Built-In)- fails on the target connection Salesforce_Stg.
Caused By: java.sql.SQLSyntaxErrorException: [FMWGEN][SForce JDBC Driver][SForce]column size is required in statement [create table SFORCE.C$_0AJ8IUSJ58MVU517U6TKC9FJHQK
(
ID_ TEXT]
.....
It is important to note that Salesforce cannot be used as a Staging area (Refer to documentation: http://docs.oracle.com/middleware/122126/odi/develop-connectivity-km/GUID-8B7FF042-1B82-4443-82D6-74D6F8B3D361.htm#ODIKM-GUID-8B7FF042-1B82-4443-82D6-74D6F8B3D361).

Although the 'Preset Staging Location' in the Physical tab of the mapping is set to the source database, which is Oracle, the 'LKM SQL to SQL' still forces the staging (C$) table to be created in Salesforce.

From research and advice from Oracle Support, it is important to highlight the following:
  • Starting with ODI 12c, the staging location is set by default to Target and cannot be set to the Source.
  • From ODI 12.2.1.2.6, the C$ tables are going to be created in target even if a staging area is specified. This is the expected behavior of ODI.
  • For Salesforce integration, only the IKMs that use binding can be used. That is, the IKMs that:
    • Do not require a LKM, and
    • Allow the staging area to not be set on target
  • Only "IKM SQL to SQL Incremental Update" and "IKM SQL to SQL Control Append" allow staging area to be different than target, and use the binding mechanism. All the other IKMs require an LKM to be used additional to the IKM. So no binding. Not suitable for Salesforce.
  • Also, the doc should mention that for the IKM (IKM SQL to SQL Incremental Update) to work, the IKM FLOW_TABLE_LOCATION option should be set to "staging" and FLOW_CONTROL is set to False.
Taking into consideration the above facts, when creating a mapping to update a record in Salesforce, do the following:
  • In the target Salesforce table (Logical > Properties > Attributes), make sure to untick the Update checkbox for the audit columns LastModifiedDate, CreatedById, IsDeleted, SystemModstamp, CreatedDate and LastModifiedById, as well as for Id. 
Otherwise, you will be prompted with this error during execution: 
java.sql.BatchUpdateException: java.sql.BatchUpdateException: [FMWGEN][SForce JDBC Driver][SForce]Unable to create/update fields: LastModifiedDate, CreatedById, IsDeleted, SystemModstamp, CreatedDate, LastModifiedById. Please check the security settings of this field and verify that it is read/write for your profile or permission set.
  • In the Logical > Properties > Target, select "Incremental Update" on the dropdown.


  • In the Physical > [Your target table] > Properties > Integration Knowledge Module, select "IKM SQL to SQL Incremental Update". Change the option for FLOW_CONTROL to False and FLOW_TABLE_LOCATION to STAGING.
  • Save and Execute.

ODI Error ODI-1228: Task Flag useless rows-IKM SQL to SQL Incremental Update- fails on the target connection DWDEV. Caused By: java.sql.SQLSyntaxErrorException: ORA-00932: inconsistent datatypes: expected - got CLOB
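The ORA-00932 above happens because Oracle SQL does not allow a CLOB in a plain comparison, which is exactly what the 'Flag useless rows' step relies on when it compares every column. A quick, purely illustrative demonstration against any Oracle schema:

-- Comparing CLOBs with '=' is rejected and raises
-- ORA-00932: inconsistent datatypes: expected - got CLOB
SELECT 1 FROM dual WHERE TO_CLOB('abc') = TO_CLOB('abc');

-- DBMS_LOB.COMPARE works, as does restricting the comparison to a non-LOB
-- column such as SYSTEMMODSTAMP (the approach used in the modified IKM
-- mentioned earlier on this page).
SELECT 1 FROM dual WHERE DBMS_LOB.COMPARE(TO_CLOB('abc'), TO_CLOB('abc')) = 0;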

In Topology > Physical Architecture > Salesforce.com > Data Types, you can set the data type to convert to when loading data from Salesforce into your Oracle table. For example, for the Salesforce data type TEXTAREA, I set 'Converted To' for the Oracle technology to CLOB. This prevents a type-mismatch error during execution.
