Slack
Slack's instant messaging and collaboration platform provides millions of users with the communication capabilities that make modern work possible. DLH.io connects to your Slack workspace to consolidate data from your Slack account with other data used within your organization, creating a holistic view and supporting secondary upstream and downstream data integration requirements.

Slack can be both a source connection and a target connection in DLH.io. Please understand the differences, and see the target connection instructions when using Slack as a target for sending data to one of your Slack channels.

Connector Pre-Requisites

To connect to Slack, make sure you have administrative access to your Slack account/workspace. Because you connect to Slack with your individual user, you will see all of the public and private channels where you are a member. If your permissions change in Slack such that you (or the person who initially created the source/target connector in DLH.io) no longer have access to Slack, or to the Slack channel that was initially configured, reauthorize the source/target connector in DLH.io and choose the appropriate Slack channel.

Instructions (Basic Connection Setup)

Remember, DataLakeHouse.io securely connects to your Slack instance by redirecting you to the Slack portal so that you sign in with your Slack login; DataLakeHouse.io does not capture or store your password/credentials. Also remember that DLH.io will only list the public and private channels that your individual Slack user has access to in Slack (an optional way to verify this yourself, outside of DLH.io, is shown in the sketch after these steps).

Using the form, please complete the following basic steps:

1. Enter a name or alias for this connection in the Name/Alias field that is unique from other connectors.
2. In the Target Schema Prefix field, enter a name that will be unique in the target destination where you wish to integrate the data.
3. Click the Authorize Your Account button, which will take you to the Slack login, where you enter your Slack account credentials.
4. Once your credentials are accepted, you will be automatically redirected back to the DataLakeHouse.io portal, and you should see a successful connection.
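If you want to double-check which channels your individual Slack user can see before (or after) authorizing the connector, the short standalone script below uses the official slack_sdk Python library to list them. This is only an illustrative sketch and is not part of DLH.io; the token value is a hypothetical placeholder you would replace with a user token from your own Slack workspace.

```python
# Standalone check (not part of DLH.io): list the public and private channels
# visible to a given Slack user token, using the official slack_sdk library.
# Install with: pip install slack_sdk
from slack_sdk import WebClient

client = WebClient(token="xoxp-your-user-token")  # hypothetical placeholder token

channels = []
cursor = None
while True:
    # conversations_list only returns private channels the token's user is a
    # member of, which mirrors what DLH.io can list for your user.
    resp = client.conversations_list(
        types="public_channel,private_channel",
        cursor=cursor,
        limit=200,
    )
    channels.extend(resp["channels"])
    cursor = resp.get("response_metadata", {}).get("next_cursor")
    if not cursor:
        break

for ch in channels:
    print(ch["name"], "(private)" if ch["is_private"] else "(public)")
```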
Source Replicated Entities

The following entities are replicated by default to your target connection (see What is a Target Connection?). These are the table names you will see in your target connection, or that are available to your target integration. If you believe any entities/tables are missing, please open a ticket with Customer Support. View the list of entities after creating a connection in DLH.io for the most up-to-date list.

Data Replication Details

This section describes how data is replicated, at what frequency, and which replication/synchronization details are worth noting for DLH.io.

Replication Frequency Configuration Details

Default Replication Frequency: 24 hours
Minimum Frequency: 1 hour (lower on the Business Critical plan)
Maximum Frequency: 24 hours
Custom Frequency: 1-24 hours

Replication Definitions

The following types of replication happen between this source and your target destination (your cloud data warehouse of choice) when a sync bridge runs.

Historical Data Load (also First-Time Sync)
Occurs during the first/initial data sync of the sync bridge pipeline, and any time you need to (or are requested to) conduct a historical re-sync of your data. Here, DLH.io ingests all available data for the objects/tables you have selected, or that are available to you from the source based on your authentication access on that source. Retrieving all data from the source and replicating it to the target can take a relatively long time; as the name suggests, it retrieves all data, so if the source contains a large amount of data, even with parallel processing a customer could expect more than an hour for a large data system. If you have any concerns that a historical load or first-time sync has not completed within a reasonable amount of time, please contact DLH.io support.

Incremental (aka Deltas) Data Load
After a first-time synchronization/replication or a historical data load, all subsequent runs that replicate data for a sync bridge (source to target) are referred to as delta or incremental loads. Here, DLH.io captures the latest/newest records and/or events, plus any changes/updates to existing records and/or events in the source connector, based on the frequency set in the sync bridge. A simplified sketch of this pattern appears at the end of this page.

Issue Handling

If any issues occur with the authorization, simply return to the Sources page in DataLakeHouse.io, edit the source details, and click the Save & Test, Authorize Your Account, or Re-Authorize Account button to confirm connectivity. If any issues persist, please contact our support team via the DataLakeHouse.io support portal: https://datalakehouse.zendesk.com
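To make the distinction between a historical load and an incremental (delta) load concrete, here is a minimal, generic high-water-mark sketch. The function and object names (run_sync_bridge, source, target, state) are hypothetical illustrations of the general pattern, not DLH.io's actual implementation.

```python
# Illustrative sketch of the historical vs. incremental (delta) load pattern.
# All names here are hypothetical; this is not DLH.io's internal implementation.
from datetime import datetime, timezone

def run_sync_bridge(source, target, state):
    """Replicate one object/table from source to target.

    `state` is a small dict persisted between runs, e.g. {"last_synced_at": "..."}.
    """
    cursor = state.get("last_synced_at")

    if cursor is None:
        # Historical data load (first-time sync): pull every available record
        # the authenticated user can access.
        rows = source.fetch_all()
    else:
        # Incremental (delta) load: only records created or updated since the
        # last successful run, on the schedule set for the sync bridge.
        rows = source.fetch_updated_since(cursor)

    # Insert new rows and update changed ones in the target warehouse.
    target.upsert(rows)

    # Advance the high-water mark so the next run only picks up new deltas.
    state["last_synced_at"] = datetime.now(timezone.utc).isoformat()
    return state
```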