Beans Route
Beans Route is one of the first delivery, dispatch, and routing management systems to feature timesheets and scheduling. The platform increases a driver's stops per hour with routing, optimization, and precise apartment directions, and enables users to manage a delivery team of any size.

Pre-requisites

Access to your Beans Route Dispatch Manager and API Manager (typically an owner or administrator of your Beans Route account).

Configuring Your Beans Route API

1. In your Beans Route account, access the Dispatch Manager.
2. Navigate to Settings > API Console from the dropdown menu in the page header of the web portal.
3. Click the Create New Key button.
4. In the Create New Key popup window, set the following values for the available options:
   - Check the Web checkbox for the Platform option.
   - Enter "DataLakeHouse.io" in the Key Name field.
   - Select the Everything Including Unverified Data option in the other dropdown menu.
5. Click Create Key to save and create the new API key.

After creating the new key, you will be able to retrieve the API key value. If it is not shown, refresh the API key page and the key should appear in the listing on the page. Click the eye icon to reveal the key value.

Note: the API key is the cryptic-looking string value. If a word such as "Basic " appears in front of it, ignore that prefix. Copy the API key and save it in a safe place so that you can provide it to DataLakeHouse.io support or your Customer Success team for onboarding, or use it in the self-service step when you create your Beans Route connection in DataLakeHouse.io.

Create a Beans Route Source Connection in DLH.io

DataLakeHouse.io securely connects to your account instance on your behalf using your API key, over TLS with AES-256 encryption, and accesses read-only functionality only. Using the Source Connector form, complete the following basic steps. First, enter a name or alias for this connection in the Name/Alias field; it must be unique among your other connectors.
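Because the key must be supplied without the "Basic " prefix, a pasted value can be normalized before it is stored or entered into the form. The helper below is purely illustrative; it is not part of Beans Route or DLH.io tooling:

```python
def normalize_api_key(raw_key: str) -> str:
    """Strip surrounding whitespace and an accidental 'Basic ' prefix
    so that only the actual API key string remains."""
    key = raw_key.strip()
    if key.lower().startswith("basic "):
        key = key[len("basic "):].strip()
    return key

print(normalize_api_key("Basic a1B2c3D4e5"))  # -> a1B2c3D4e5
```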
In the Target Schema Prefix field, enter a name that will be unique in your data cloud destination where you wish to land the data. In the API Key field, enter the value from the API Keys area of your account. Do not enter the "Basic " prefix; enter only the long, cryptic-looking string of characters, which is your actual API key.

After the successful connection, click on the Create a Sync Bridge option to point to your target destination and begin the synchronization/replication of data.

Instructions Video

Data Replication Details

This section describes how data is replicated, at what frequency, and which replication/synchronization details are worth noting for DLH.io.

Replication Frequency Configuration Details

Default replication frequency: 24 hours
Minimum frequency: 1 hour (lower on the Business Critical plan)
Maximum frequency: 24 hours
Custom frequency: 1-24 hours

Replication Definitions

The following types of replication happen between this source and your target destination cloud data warehouse of choice when a Sync Bridge runs.

Historical Data Load (also First-Time Sync)

This occurs during the first/initial data sync of the Sync Bridge pipeline, and any time you need to (or are requested to) conduct a historical re-sync of your data. Here, DLH.io ingests all available data for the objects/tables you have selected, or that are available to you from the source based on your authentication access. As the name suggests, all data is retrieved from the source and replicated to the target, so this can take a relatively long time; if the source contains large amounts of data, even with our parallel processing capability, a customer could expect more than an hour for a large data system. If there are any concerns that a historical load or first-time sync has not completed within a reasonable amount of time, please contact DLH.io support.

Incremental (aka Deltas) Data Load

After a first-time
synchronization/replication or a historical data load, all subsequent replication runs for a Sync Bridge (source to target) are referred to as delta, or incremental, loads. Here, DLH.io captures the latest/newest records and/or events, and any changes/updates to existing records and/or events in the source connector, based on the frequency set in the Sync Bridge.

Issue Handling

If any issues occur with the authorization, simply return to the Sources page in DataLakeHouse.io, edit the source details, and click the Save & Test, Authorize Your Account, or Re-Authorize Account button to confirm connectivity. If any issues persist, please contact our support team via the DataLakeHouse.io support portal: https://datalakehouse.zendesk.com
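The incremental model described under Replication Definitions can be sketched as a simple watermark filter: each scheduled run keeps only records created or updated since the previous sync. This is a generic illustration of the concept only; the record shape and field names are hypothetical, not DLH.io's internal implementation:

```python
from datetime import datetime, timedelta

def incremental_batch(records, last_sync):
    """Return only records created or updated since the previous sync run."""
    return [r for r in records if r["updated_at"] > last_sync]

def next_run(last_sync, frequency_hours=24):
    """Compute the next scheduled run; a custom frequency must be 1-24 hours."""
    if not 1 <= frequency_hours <= 24:
        raise ValueError("custom replication frequency must be between 1 and 24 hours")
    return last_sync + timedelta(hours=frequency_hours)

last_sync = datetime(2024, 1, 1, 0, 0)
records = [
    {"id": 1, "updated_at": datetime(2023, 12, 31, 23, 0)},  # already replicated
    {"id": 2, "updated_at": datetime(2024, 1, 1, 6, 30)},    # new since last sync
]
print([r["id"] for r in incremental_batch(records, last_sync)])  # -> [2]
print(next_run(last_sync, frequency_hours=6))                    # -> 2024-01-01 06:00:00
```

A historical load corresponds to running the same filter with the watermark set before all data, so every record qualifies; every later run only moves the watermark forward.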