
Salesforce

Salesforce.com is a customer relationship management (CRM) platform that helps marketing, sales, commerce, service, and IT teams collaborate on customer success. CRM is a technology for managing all of your company's relationships and interactions with customers and potential customers.

Connector Pre-Requisites

Before connecting to Salesforce, make sure that you have access to retrieve metadata from your Salesforce.com account.

Instructions (Basic Connection Setup)

Remember: DataLakeHouse.io securely connects to your Salesforce.com instance by redirecting you to the Salesforce.com portal, where you sign in via Salesforce.com's own login. DataLakeHouse.io does not capture or store your password/credentials.

Using the form, please complete the following basic steps:

1. Enter a name or alias for this connection in the 'Name/Alias' field that is unique among your other connectors.
2. In the 'Target Schema Prefix' field, enter a name that will be unique in your data cloud destination where you wish to land the data.
3. Click the 'Authorize Your Account' button, which will take you to the Salesforce.com login, where you will enter your Salesforce.com account credentials.
4. Once your credentials are accepted, you will be automatically redirected back to the DataLakeHouse.io portal, where you should see a successful connection.

How-To Instructions

Issue Handling

If any issues occur with the authorization, simply return to the Sources page in DataLakeHouse.io, edit the source details, and click the 'Save & Test', 'Authorize Your Account', or 'Re-Authorize Account' button to confirm connectivity. If any issues persist, please contact our support team via the DataLakeHouse.io support portal: https://datalakehouse.zendesk.com

FAQ

Compare record counts between Salesforce and target

On the Sync Bridge, under Actions, click 'Compare Source to Target Entity Counts' to view record counts in Salesforce compared to record counts in the target connection. If it has been a while since a record count was performed, or the 'Delta from Source' column shows a negative number, click the 'Re-Run Audit Counts Process on Target' button. Please wait 10-15 minutes for the process to complete; you may need to wait longer if you have hundreds of Salesforce entities to update.

Entities with many records

For entities that have many records, a baseline for the replication window is that, during a historical load, 500,000 records take approximately 30 minutes to pull for a single entity.

Salesforce trial accounts

Salesforce limits the ability to use its API on trial accounts, so a licensed Salesforce account is needed in order to leverage DataLakeHouse.io.

Sync Bridge runs once a day but errors

If a Sync Bridge is set to run once a day and, for whatever reason, a run does not successfully complete, the bridge will load two days' worth of data during the next run.

Deleted records

Salesforce will remove deleted records (hard-deleted, etc.) from its system within 15 days (see the Salesforce documentation). Therefore it is important to keep the Sync Bridge in DLH.io running consistently (not paused). If a Sync Bridge is paused for long periods of time, your target destination may contain orphan records that will never be marked as IsDeleted = true and/or `_dlh_is_deleted = 'Y'`, and it will appear that you are pulling potentially erroneous values/records. In this case, it is recommended to back up your target destination tables, drop the target schema and/or tables in question, and conduct a historical re-sync of the Sync Bridge.

If you are comparing DLH.io to another process or system with which you have been attempting to synchronize or retrieve Salesforce.com records, and you find that DLH.io has a differing number of records, it may be due to the deletion issue above, or to the fact that DLH.io is a more comprehensive synchronization platform than the alternatives.
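To illustrate the deleted-records behavior above, here is a minimal, hypothetical sketch (not DLH.io internals) of the SOQL involved in counting records and finding soft-deleted rows. The helper names and the optional simple-salesforce usage shown in comments are assumptions for illustration; only the SOQL forms and the IsDeleted field come from Salesforce itself.

```python
# Hypothetical sketch: SOQL for record counts and soft-deleted rows.
# Deleted rows are only visible via Salesforce's queryAll endpoint, and
# only until Salesforce purges them (within roughly 15 days).

def count_query(entity: str) -> str:
    """Build a SOQL COUNT() query for an entity.

    The SOQL text is the same whether or not deleted rows are included;
    the endpoint choice (query vs. queryAll) controls that.
    """
    return f"SELECT COUNT() FROM {entity}"

def deleted_since_query(entity: str, since_iso: str) -> str:
    """SOQL for rows deleted after an ISO-8601 timestamp (queryAll only)."""
    return (f"SELECT Id, IsDeleted, SystemModstamp FROM {entity} "
            f"WHERE IsDeleted = true AND SystemModstamp > {since_iso}")

if __name__ == "__main__":
    # Hypothetical usage with the simple-salesforce library (credentials elided):
    # from simple_salesforce import Salesforce
    # sf = Salesforce(username=..., password=..., security_token=...)
    # live_count = sf.query(count_query("Account"))["totalSize"]
    # with_deleted = sf.query_all(count_query("Account"), include_deleted=True)["totalSize"]
    print(deleted_since_query("Account", "2024-01-01T00:00:00Z"))
```

A persistent gap between the two counts for the same entity is one way orphaned deletes can surface when a Sync Bridge has been paused past the purge window.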
How does the CDC/incremental loading work with SFDC?

As a SaaS platform, Salesforce.com (SFDC) has native columns/fields that represent timestamps for when a record/event was created or updated. These columns vary from entity to entity. To ensure that only new or incremental records are synchronized during a standard incremental data sync process, DLH.io uses the following timestamp attributes/columns/fields in the SFDC entities, in order: LastModifiedDate, CreatedDate, SystemModstamp.

Custom fields and custom entities

DataLakeHouse.io treats all custom and non-custom fields and entities as equals, replicating these fields (as columns) and entities (as tables) to your desired target connection (see 'What is a Target Connection?'). In fact, most other data integration platforms struggle with custom entities and only replicate a subset of custom columns and custom entities.

How are field formulas handled?

DataLakeHouse.io replicates calculated formula values on any custom or non-custom field. Example of a custom formula in SFDC:

    "calculatedFormula": "IF(My_Credit_Limit_Gross_Revenue__c = 0,
    0,
    (My_Backlog_Gross_Revenue__c + My_Total_AR__c + My_Total_WIP__c) / (My_Credit_Limit_Gross_Revenue__c)
    )"
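The incremental (CDC) approach described above can be sketched as follows. This is a hypothetical illustration, not DLH.io's actual implementation: the function names and the watermark-passing convention are assumptions, while the field preference order follows the list given in the text.

```python
# Hypothetical sketch of timestamp-based incremental loading against SFDC.

# Preference order described above: LastModifiedDate, then CreatedDate,
# then SystemModstamp. Not every entity exposes every field.
TIMESTAMP_FIELDS = ("LastModifiedDate", "CreatedDate", "SystemModstamp")

def pick_cdc_field(entity_fields: set) -> str:
    """Return the first preferred timestamp field the entity exposes."""
    for field in TIMESTAMP_FIELDS:
        if field in entity_fields:
            return field
    raise ValueError("entity has no usable timestamp field for CDC")

def incremental_query(entity: str, entity_fields: set, watermark_iso: str) -> str:
    """Build SOQL fetching only rows changed since the last successful sync.

    watermark_iso is the high-water mark (ISO-8601) recorded at the end of
    the previous run; a missed daily run simply widens this window, which
    is why a failed run is followed by two days' worth of data.
    """
    cdc_field = pick_cdc_field(entity_fields)
    return (f"SELECT FIELDS(STANDARD) FROM {entity} "
            f"WHERE {cdc_field} > {watermark_iso} ORDER BY {cdc_field}")

if __name__ == "__main__":
    # Example: a custom entity with no LastModifiedDate falls back to CreatedDate.
    fields = {"Id", "Name", "CreatedDate", "SystemModstamp"}
    print(incremental_query("My_Object__c", fields, "2024-06-01T00:00:00Z"))
```

SOQL timestamp literals are unquoted ISO-8601 values, so the watermark is interpolated directly rather than wrapped in quotes.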