Learn the Basic Concepts
FAQ : DataLakeHouse.io
Here are answers to some common questions users typically have about DataLakeHouse.io.

Do I need to know how to write code to use DataLakeHouse.io?
No! Follow our simple instructions to guide you and, with just a few clicks, you can easily load data from any source into Snowflake at the frequency of your choosing, and watch your data flow.

What data sources can I synchronize to a target destination?
DataLakeHouse.io currently offers Connectors (also called connections or integrations) for many popular source systems, for example:
- Aloha POS
- Ceridian
- DoorDash for Work
- Facebook / Meta Ads
- Harvest
- Jira
- McLeod Transportation
- MongoDB
- MongoDB Sharded
- MySQL
- NetSuite (Oracle NetSuite)
- Optimum HRIS
- PostgreSQL
- QuickBooks
- Salesforce
- Shopify
- Snowflake
- Snowflake Data Marketplace
- SQL Server
- Square
- Square Marketplace
- Stripe
- Xero

Always check our main Connectors page for the latest list of connectors; new sources are added frequently. You can request a connector anytime if the one you are looking for is not listed.

Is Snowflake or a cloud data warehouse the only target that I can land my data into?
No. DLH.io focuses on a number of cloud data warehouse and storage technology vendors for you to synchronize and replicate your data. If needed, our Business Critical plan offers tailored sources and targets to meet your organization's needs, and we boast turnaround times of less than four weeks for most connector integrations.

How frequently can my source data be synchronized / replicated to a target destination like Snowflake?
This depends on the data source it is coming from, but we currently offer Sync Bridges that bring data in as frequently as every 15 minutes (faster if on our Business Critical plan). Additional intervals offered include every 30 minutes, 1 hour, 2, 3, 6, 8, 12, and 24 hours. Depending on your organization's specific needs, you might want to consider our Business Critical plan, which offers additional synchronization frequencies.

Can I receive notifications?
Yes! You can set up Alerts & Notifications for email and Slack that let you know things like:
- if a connector fails to connect to a source
- if a Sync Bridge does not complete within a certain time threshold
- if an issue is encountered

For more information, review the Alerts & Notifications page. Frequently asked questions about DataLakeHouse.io will always be available here. Also, join our Slack community for more insights and to collaborate with other DataLakeHouse users and developers.

What types of changes or metadata changes from source to target are handled and tracked by DLH.io?
DLH.io handles the following types of changes from source systems:
- Change Data Capture (CDC)
- Change Tracking (CT)
- Metadata additions, e.g., columns/attributes added to source system objects/tables
- New tables/entities

What types of changes or metadata changes are not handled by DLH.io?
DLH.io does not handle deletions of structural source system metadata. For example, if a customer has a source such as a PostgreSQL database, successfully synchronizes a set of database tables to a target, and then drops a column at the source, DLH.io will not remove that column from the target in the next data load of that table. Doing so would be a direct violation of data integrity, because DLH.io cannot know whether the change was made by accident, is temporary, and so on. It might not be obvious to some customers or developers, but the potential downstream impact could be severe in a production setting: if DLH.io removed a target column that previously contained data used for production reporting or development, that column and its data would be removed from the data warehouse, impacting any systems or applications depending on that column or structure. Therefore, customers should take note that if they choose to delete a source system structure by dropping a column, they must themselves handle the target system impact. If a column is dropped in the source system, the target column will remain, but it will no longer be updated with data by DLH.io, since DLH.io understands that the column no longer exists in the source.
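For illustration only, below is a minimal sketch of how a customer might remove such a stale column from a Snowflake target themselves, using the snowflake-connector-python package. The connection parameters, table name (CUSTOMERS), and column name (LEGACY_STATUS) are hypothetical placeholders, not part of DLH.io; confirm that no downstream reports or applications still depend on the column before running anything like this.

```python
# Hypothetical example: manually dropping a stale target column in Snowflake
# after the corresponding column has been dropped in the source system.
# DLH.io intentionally does not do this for you.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder Snowflake account identifier
    user="my_user",            # placeholder credentials
    password="my_password",
    warehouse="MY_WH",
    database="MY_DB",
    schema="MY_SCHEMA",
)

try:
    cur = conn.cursor()

    # Inspect the current target table structure first.
    cur.execute("DESCRIBE TABLE CUSTOMERS")
    print([row[0] for row in cur.fetchall()])

    # Remove the column that no longer exists in the source.
    # This permanently deletes the column's historical data in the target.
    cur.execute("ALTER TABLE CUSTOMERS DROP COLUMN LEGACY_STATUS")
finally:
    conn.close()
```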
What do the DLH columns mean?
DataLakeHouse.io adds a number of system-generated columns to each table in the target connection. Below is the definition of each of these columns (a short example query using them is included at the end of this page):
- dlh_is_deleted: indicates whether this record has been deleted in the source system.
- dlh_sync_ts: indicates when this record was synced into the target database. Depending on the volume of records and any queuing in the source system, this value may be the same as the dlh_start_ts and dlh_finish_ts columns.
- dlh_start_ts: indicates when this record started to be synced into the target database. Depending on the volume of records and any queuing in the source system, this value may be the same as the dlh_sync_ts and dlh_finish_ts columns.
- dlh_finish_ts: indicates when this record finished being synced into the target database. Depending on the volume of records and any queuing in the source system, this value may be the same as the dlh_sync_ts and dlh_start_ts columns.
- dlh_is_active: indicates whether this record is the active record. When true, this is the most current record available from the source system (e.g., Ceridian Dayforce). When false, a more recent record is available and the source system has Change Data Capture (CDC) enabled.

What type of support is offered by DLH.io for troubleshooting issues?
You can communicate with our support team using our support portal or by sending us an email. More on support is found on our Customer Support page.
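As a quick illustration of how the DLH system columns described above are commonly used, the sketch below reads only the current, non-deleted rows of a synced table. The connection parameters and the table name (orders) are hypothetical placeholders, and the dlh_* column names are shown lowercased; check your own target for the exact casing and data types.

```python
# Hypothetical example: querying only the active, non-deleted rows of a
# table synced by DLH.io, using the system-generated dlh_* columns.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder Snowflake account identifier
    user="my_user",            # placeholder credentials
    password="my_password",
    warehouse="MY_WH",
    database="MY_DB",
    schema="MY_SCHEMA",
)

# Assumes dlh_is_active / dlh_is_deleted are boolean flags, as described above.
query = """
    SELECT *
    FROM   orders                    -- placeholder table name
    WHERE  dlh_is_active = TRUE      -- most current version of each record
    AND    dlh_is_deleted = FALSE    -- exclude rows deleted at the source
    ORDER BY dlh_sync_ts DESC        -- most recently synced rows first
"""

try:
    cur = conn.cursor()
    cur.execute(query)
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()
```

Filtering on dlh_is_active and dlh_is_deleted like this is a common pattern for reporting views, since CDC-enabled sources may keep multiple historical versions of the same record in the target table.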