Wasabi
Wasabi has been changing the cloud storage landscape with hot cloud storage: a disruptively simple, one-size-fits-all cloud storage technology that is 1/5th the price of the competition and faster, with no egress fees, API call charges, or additional hidden fees. Wasabi is game-changing, leading-edge cloud technology that allows users to affordably store a nearly infinite amount of data. Wasabi is an entirely new approach to cloud storage: a bottomless cloud that grows with your business.

DataLakeHouse supports your integration from Wasabi. Our Wasabi DataLakeHouse integration replicates Wasabi data to your cloud data warehouse target, synchronizes to your target destination at a scheduled frequency, and provides a single platform to analyze your data and integrate it with other data, so that you can combine everything into a single source-of-truth repository for true analytics that will empower your business.

It allows you to replicate/synchronize your Wasabi data files, including capturing snapshots of data at any point in time, and keep them up to date with little to no configuration effort. You don't even need to prepare the target schema; DataLakeHouse.io will automatically handle all the heavy lifting for you. All you need to do is specify the connection to your Wasabi storage and point to your target system, or use a DataLakeHouse.io managed data warehouse, and DataLakeHouse.io does the rest. Our support team can even help you set it up during a short technical onboarding session.

Setup Instructions

DataLakeHouse.io securely connects to your Wasabi storage using the form in the DataLakeHouse.io portal. Please complete the following basic steps:

1. Enter a name or alias for this connection in the 'Name/Alias' field that is unique from other connectors.
2. Enter a 'Target Schema Prefix', which will be the prefix for the schema at the target you will sync to.
3. Enter a 'Bucket' name, where your files are stored. Bucket references typically start with s3:// or https://, so enter just the name without the prefix.
4. Select your 'Region'.
5. Enter your 'Access Key', the credential used to access the bucket.
6. Enter your 'Secret Key', the credential used to access the bucket.
7. Enter any other optional details in the available fields (see the setup video if you need help, or contact support):
   - Folder Path: a path on the root bucket from which the desired files will be retrieved.
   - File Pattern: a regular expression (regex) used to isolate only certain files to be retrieved.
   - File Type: allows a pre-determined type of file extension to be retrieved.
8. Click the Save & Test button.

Once your credentials are accepted, you should see a successful connection.

How To Instructions

If any issues occur with the authorization, simply return to the Sources page in DataLakeHouse.io, edit the source details, and click the 'Save & Test' button to confirm connectivity. If any issues persist, please contact our support team via the DataLakeHouse support portal at https://datalakehouse.zendesk.com. A credential verification sketch is also provided at the end of this page.

Control Each Column Data Type

SQL Transformations allow logic to be executed against a target connection based on a scheduled frequency or a triggered event of new data on tables updated via DataLakeHouse.io (DLH.io). This especially helps when you want to control the data types set in your target connection, since all columns are initially set as VARCHAR(16777216).
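As a hedged illustration, the Python sketch below creates a typed view over a raw synced table in a Snowflake-style target by casting the VARCHAR(16777216) columns to explicit types; similar statements can be scheduled as SQL Transformations. The table, column, and connection values here are hypothetical placeholders, not actual DLH.io objects.

```python
# Minimal sketch, assuming a Snowflake target and placeholder object names:
# cast raw VARCHAR(16777216) columns synced by DLH.io into typed columns.
import snowflake.connector  # pip install snowflake-connector-python

# A typed view over the raw synced table keeps the raw data untouched while
# downstream queries see proper types.
TYPED_VIEW_SQL = """
CREATE OR REPLACE VIEW analytics.orders_typed AS
SELECT
    order_id::NUMBER(38, 0)       AS order_id,        -- raw column is VARCHAR(16777216)
    order_date::DATE              AS order_date,
    order_total::NUMBER(18, 2)    AS order_total,
    customer_email::VARCHAR(320)  AS customer_email
FROM wasabi_raw.orders
"""

# Placeholder connection details for the target data warehouse.
conn = snowflake.connector.connect(
    account="your_account",
    user="your_user",
    password="your_password",
    warehouse="your_warehouse",
    database="your_database",
)
try:
    conn.cursor().execute(TYPED_VIEW_SQL)
finally:
    conn.close()
```

Using a view rather than rewriting the synced table means subsequent syncs are unaffected by the typing logic.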
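Separately, if the Save & Test step fails repeatedly, it can help to verify the bucket details outside of DLH.io. The sketch below uses Python and boto3 against Wasabi's S3-compatible API to list the files the connector would retrieve; the endpoint URL format, bucket name, folder path, and file pattern shown are assumptions for illustration only.

```python
# Hypothetical credential check using the same values entered on the DLH.io
# connector form (Bucket, Region, Access Key, Secret Key, Folder Path, File Pattern).
import re
import boto3  # pip install boto3

REGION = "us-east-1"                    # the 'Region' selected on the form
BUCKET = "my-wasabi-bucket"             # 'Bucket' name only, without any s3:// or https:// prefix
FOLDER_PATH = "exports/daily/"          # optional 'Folder Path' under the bucket root
FILE_PATTERN = re.compile(r".*\.csv$")  # optional 'File Pattern' regex

s3 = boto3.client(
    "s3",
    endpoint_url=f"https://s3.{REGION}.wasabisys.com",  # assumed Wasabi S3-compatible endpoint format
    region_name=REGION,
    aws_access_key_id="YOUR_ACCESS_KEY",
    aws_secret_access_key="YOUR_SECRET_KEY",
)

# List objects under the folder path and keep only those matching the pattern,
# mirroring what the connector retrieves on each sync.
response = s3.list_objects_v2(Bucket=BUCKET, Prefix=FOLDER_PATH)
for obj in response.get("Contents", []):
    if FILE_PATTERN.match(obj["Key"]):
        print(obj["Key"], obj["Size"])
```

If this listing succeeds with the same credentials, bucket, and region, the values on the connector form should also pass the Save & Test check.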