FAQ : DataLakeHouse.io
Here are answers to some common questions users typically have about DataLakeHouse:
No! Just follow our simple instructions, and with a few clicks you can easily load data from any Source into Snowflake at the frequency of your choosing and watch your data flow.
DataLakeHouse.io currently offers Connectors (aka connections or integrations) for the following Data Sources:
- Aloha POS
- Ceridian
- DoorDash For Work
- Facebook / Meta Ads
- Harvest
- JIRA
- McLeod Transportation
- MongoDB
- MongoDB Sharded
- MySQL
- NetSuite (Oracle NetSuite)
- Optimum HRIS
- PostgreSQL
- QuickBooks
- Salesforce
- Shopify
- Snowflake
- Snowflake Data Marketplace
- SQL Server
- Square
- Square Marketplace
- Stripe
- Xero
+ New Sources are added frequently. You can request a connector anytime if the one you are looking for is not listed.
No. DLH.io focuses on a select set of cloud data warehouse and storage technology vendors to which you can synchronize and replicate your data. If needed, our Business Critical plan offers tailored Sources and Targets to meet your organization's requirements. We boast turnaround times of less than four weeks for most connector integrations.
How frequently can my source data be Synchronized / Replicated to a target destination like Snowflake?
This depends on the Data Source it's coming from, but we currently offer Sync Bridges that bring data in as frequently as every 15 minutes (faster on our Business Critical plan). Additional intervals include every 30 minutes, 1 hour, and 2, 3, 6, 8, 12, and 24 hours. Depending on your organization's specific needs, you might consider our Business Critical plan, which offers additional synchronization frequencies.
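The interval options above can be sketched as a simple schedule calculation. This is an illustrative sketch only, not DataLakeHouse.io's actual scheduler; the interval keys and the `next_sync` helper are hypothetical names chosen for the example:

```python
from datetime import datetime, timedelta

# Sync intervals described in the FAQ: every 15/30 minutes,
# and every 1, 2, 3, 6, 8, 12, or 24 hours.
SYNC_INTERVALS = {
    "15m": timedelta(minutes=15),
    "30m": timedelta(minutes=30),
    "1h":  timedelta(hours=1),
    "2h":  timedelta(hours=2),
    "3h":  timedelta(hours=3),
    "6h":  timedelta(hours=6),
    "8h":  timedelta(hours=8),
    "12h": timedelta(hours=12),
    "24h": timedelta(hours=24),
}

def next_sync(last_sync: datetime, interval: str) -> datetime:
    """Return the next scheduled sync time for a given interval key."""
    return last_sync + SYNC_INTERVALS[interval]

last = datetime(2024, 1, 1, 9, 0)
print(next_sync(last, "15m"))  # 2024-01-01 09:15:00
```

For example, a Sync Bridge configured at the 15-minute interval that last ran at 09:00 would run again at 09:15.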
Yes! You can set up Alerts & Notifications for email and Slack that let you know things like:
- if a connector fails to connect to a source
- if a sync bridge does not complete within a certain time threshold
- if an issue is encountered
For more information, review Alerts & Notifications.
Frequently Asked Questions about DataLakeHouse.io will always be available here. Also, join our Slack community for more insights and to collaborate with other DataLakeHouse users and developers.
DataLakeHouse.io adds a number of system-generated columns to each table in the Target Connection. Below is the definition of each of these columns:
- __DLH_IS_DELETED
- Indicates whether this record has been deleted in the source system.
- __DLH_SYNC_TS
- Indicates when this record was synced into Snowflake. Depending on the volume of records and any queuing in the source system, this value may be the same as the __DLH_START_TS and __DLH_FINISH_TS columns.
- __DLH_START_TS
- Indicates when this record started to be synced into Snowflake. Depending on the volume of records and any queuing in the source system, this value may be the same as the __DLH_SYNC_TS and __DLH_FINISH_TS columns.
- __DLH_FINISH_TS
- Indicates when this record finished being synced into Snowflake. Depending on the volume of records and any queuing in the source system, this value may be the same as the __DLH_SYNC_TS and __DLH_START_TS columns.
- __DLH_IS_ACTIVE
- Indicates whether this record is the active record. When TRUE, this is the most current record available from the source system, e.g., Ceridian Dayforce. When FALSE, a more recent record is available and the source system has Change Data Capture (CDC) enabled.
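Taken together, these flags let a consumer select only the current, non-deleted version of each record. The sketch below shows that filter over hypothetical sample rows; the row data and the `current_records` helper are illustrative assumptions, but the column names are the system-generated columns defined above:

```python
# Hypothetical rows as they might land in a target table, each carrying
# the DLH system columns alongside the source data.
rows = [
    {"id": 1, "name": "Alice",  "__DLH_IS_DELETED": False, "__DLH_IS_ACTIVE": True},
    {"id": 1, "name": "Alicia", "__DLH_IS_DELETED": False, "__DLH_IS_ACTIVE": False},  # superseded version
    {"id": 2, "name": "Bob",    "__DLH_IS_DELETED": True,  "__DLH_IS_ACTIVE": True},   # deleted in source
]

def current_records(rows):
    """Keep only rows that are the active version and not deleted in the source."""
    return [r for r in rows if r["__DLH_IS_ACTIVE"] and not r["__DLH_IS_DELETED"]]

print(current_records(rows))  # only Alice's current row survives the filter
```

The same filter in SQL against the Snowflake target would be a `WHERE __DLH_IS_ACTIVE = TRUE AND __DLH_IS_DELETED = FALSE` clause.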