Pipedrive
Pipedrive lets you track your sales pipeline, manage leads, and automate your entire sales process in one place so you can focus on selling. DLH.io connects to Pipedrive to help consolidate your data from the platform with other data used in your organization, giving you a holistic view of all operations and the ability to make better data-driven decisions.

Connector Pre-Requisites

Before connecting to Pipedrive, make sure that you have administrative access to your Pipedrive account.

Instructions (Basic Connection Setup)

Remember: DataLakehouse.io securely connects to your Pipedrive instance by redirecting you to the Pipedrive portal, where you sign in via the Pipedrive login. DataLakehouse.io does not capture or store your password/credentials.

Using the form, please complete the following basic steps:

1. Enter a name or alias for this connection in the Name/Alias field that is unique from other connectors.
2. In the Target Schema Prefix field, enter a name that will be unique in your target destination where you wish to integrate the data.
3. Enter the API key from your Pipedrive account in the API Key field.
4. Enter the password credential for the user in the Password field.
5. Click the Authorize Your Account button, which will take you to the Pipedrive login, where you will enter your account credentials.
6. Once your credentials are accepted, you will be automatically redirected back to the DataLakehouse.io portal, where you should see a successful connection.

Instructions (Video)

Coming soon.

Source Replicated Entities

The following entities are replicated by default to your target connection. See also: What Is a Target Connection?
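For the API key entered during the setup steps above, a quick way to sanity-check it outside of DLH.io is to call Pipedrive's public v1 REST API directly. The sketch below is illustrative only: the `/users/me` endpoint and the `api_token` query parameter come from Pipedrive's documented API, while the helper name and the key value are placeholders.

```python
# Minimal sketch (not DLH.io's implementation): verify a Pipedrive API key
# with a lightweight authenticated call against Pipedrive's public v1 API.
from urllib.parse import urlencode

PIPEDRIVE_BASE = "https://api.pipedrive.com/v1"

def build_check_url(api_key: str) -> str:
    """Build the URL for an authenticated 'current user' request."""
    return f"{PIPEDRIVE_BASE}/users/me?" + urlencode({"api_token": api_key})

# With the `requests` package and a real key (placeholder shown here):
#   import requests
#   resp = requests.get(build_check_url("YOUR_API_KEY"), timeout=10)
#   resp.raise_for_status()  # a 401 here usually means a bad/expired key
print(build_check_url("abc123"))
# → https://api.pipedrive.com/v1/users/me?api_token=abc123
```

A successful (HTTP 200) response to this call is a reasonable indication that the same key will authorize the connector form.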
(docid 3wx 24ml25noxc1atdbs4)

These are the table names you will see in your target connection, or that are available to your target integration. If you believe any entities/tables are missing, please open a ticket with Customer Support (docid pbtuxndqrdogoroejbgsv). View the list of entities after creating a connection in DLH.io for the most up-to-date list.

Data Replication Details

This section provides information on how data is replicated, at what frequency, and which replication/synchronization details are worth noting from DLH.io.

Replication Frequency Configurations

Default replication frequency: 24 hours
Minimum frequency: 1 hour (lower on the Business Critical plan)
Maximum frequency: 24 hours
Custom frequency: 1-24 hours

Replication Definitions

The following types of replication happen between this source and your target destination (your cloud data warehouse of choice) when a sync bridge runs.

Historical Data Load (also First-Time Sync)

This occurs during the first/initial data sync of the sync bridge pipeline, and any time you need to (or are requested to) conduct a historical re-sync of your data. Here, DLH.io ingests all available data for the objects/tables you've selected, or that are available to you from the source based on your authentication access on that source. As the name suggests, this retrieves all data from the source and replicates it to the target, which can take a relatively long time; if the source contains large amounts of data, even with our parallel processing capability, a customer could expect more than an hour for a large data system. If there are any concerns that a historical load or first-time sync has not completed within a reasonable amount of time, please contact DLH.io support.

Incremental (aka Deltas) Data Load

After a first-time synchronization/replication or a historical data load, all subsequent replication runs for a sync bridge (source to target) are referred to as delta or incremental loads. Here, DLH.io captures the
latest/newest records and/or events, and any changes/updates to existing records and/or events, in the source connector, based on the frequency set in the sync bridge.

Issue Handling

If any issues occur with the authorization, simply return to the Sources page in DataLakehouse.io, edit the source details, and click the Save & Test, Authorize Your Account, or Re-Authorize Account button to confirm connectivity. If any issues persist, please contact our support team via the DataLakehouse.io support portal: https://datalakehouse.zendesk.com
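The two replication modes described under Replication Definitions can be sketched as follows. This is an illustrative assumption, not DLH.io's actual pipeline code: the record shape, the `update_time` field name, and the `replicate` helper are all hypothetical.

```python
# Illustrative sketch of historical vs. incremental (delta) replication.
from datetime import datetime

def replicate(records, last_sync=None):
    """Historical load when last_sync is None; otherwise a delta load
    that keeps only records changed since the last successful sync."""
    if last_sync is None:
        # First-time sync / historical re-sync: ingest all available rows.
        return list(records)
    # Incremental load: only rows created or updated after last_sync.
    return [r for r in records
            if datetime.fromisoformat(r["update_time"]) > last_sync]

deals = [
    {"id": 1, "update_time": "2024-01-01T09:00:00"},
    {"id": 2, "update_time": "2024-03-15T12:30:00"},
]
print(len(replicate(deals)))                        # historical load → 2
print(len(replicate(deals, datetime(2024, 2, 1))))  # delta load → 1
```

In practice the "last sync" marker would come from the sync bridge's schedule and state, which is why the replication frequency setting above determines how far apart consecutive delta loads are.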