Google Cloud Storage
Google Cloud Storage (GCS), also known as GCP Storage, is the Google Cloud object storage (blob storage) service for storing files and other objects in the cloud. DLH.io provides this connector as a direct way to work with data and files, both as a source and as a target (i.e., backup) conduit. GCP Storage is mainly used for synchronizing data into BigQuery, but it can also be used for other general data synchronization flows and pipelines.

Prerequisites

- Name of your GCP project
- Name of your GCP Storage bucket
- Service account key (JSON)

Setup Instructions

DataLakeHouse.io securely connects to your Google Cloud Storage bucket using the form in the DataLakeHouse.io portal. Please complete the following basic steps:

1. Enter a name or alias for this connection in the Name/Alias field that is unique from your other connectors.
2. Enter a Target Schema Prefix, which will be the prefix for the schema at the target into which you will sync your data files.
3. Enter a Bucket Name where your files are stored; typically this is just the name of the bucket, with no http:// or gs:// prefix required.
4. Select your Region.
5. Enter any other optional details in the available fields (see the setup video if you need help, or contact support):
   - Folder Path: a path on the root bucket from which the desired files will be retrieved.
   - File Pattern: a regular expression (regex) used to isolate only certain files to be retrieved, for example `.*\.csv$` to match only CSV files.
   - File Type: allows a predetermined type of file extension to be retrieved.
6. Enter your Service Account Key, which should be a JSON string. Paste the entire service account key (JSON); a sketch of the typical key layout is shown after these steps.
7. If you are using this bucket as process storage for BigQuery, add the service account email dlh-global-bq-data-sync-svc@stg-datalakehouse.iam.gserviceaccount.com as a principal on your GCP project and assign it the Storage Admin role (see the example command after these steps).
8. Click the Save & Test button.

Once your credentials are accepted, you should see a successful connection.
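For reference, a service account key downloaded from the Google Cloud console is a JSON document that generally has the shape below. Every value here is a placeholder; your actual key will contain your own project and service account details.

```json
{
  "type": "service_account",
  "project_id": "your-gcp-project-id",
  "private_key_id": "0123456789abcdef0123456789abcdef01234567",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "your-service-account@your-gcp-project-id.iam.gserviceaccount.com",
  "client_id": "123456789012345678901",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/your-service-account%40your-gcp-project-id.iam.gserviceaccount.com"
}
```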
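If you prefer the command line over the Cloud Console for step 7, granting the Storage Admin role to the DLH.io service account can look roughly like the following sketch; your-gcp-project-id is a placeholder for your own project ID.

```bash
# Grant the DLH.io BigQuery sync service account the Storage Admin role
# on your project (replace your-gcp-project-id with your GCP project ID).
gcloud projects add-iam-policy-binding your-gcp-project-id \
  --member="serviceAccount:dlh-global-bq-data-sync-svc@stg-datalakehouse.iam.gserviceaccount.com" \
  --role="roles/storage.admin"
```

The same grant can be made in the Cloud Console under IAM & Admin > IAM by adding the email above as a principal and selecting the Storage Admin role.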
FAQs & Troubleshooting

Why am I getting a storage.buckets.get error?

This issue is due to how DLH.io needs to retrieve file information from your bucket. If you see an error message containing this warning or error, it means that you need to update your permissions so that the service account and/or the DLH.io user shown in the connection either has the Storage Admin role granted, or has a custom role in your GCP project that includes this permission (a sketch of creating such a custom role appears at the end of this page). See this ServerFault.com answer for some general direction, if needed. DLH.io runs a test when you create this GCP Storage source connector, which lists the files in the bucket and performs some other steps to ensure that DLH.io can act as the conduit to work with your bucket. If any portion of the test fails, a notification of the permission issue should appear in the logs, in alerts, or on the page where the test is conducted.

Can we use the GCP Storage option for non-BigQuery data warehouses?

No, not currently. As of 01/2023, it is only available for BigQuery processing.

Control Each Column Data Type

SQL Transformations (docid 5zjgrvbhtywqw8 olioh0) allow logic to be executed against a target connection based on a scheduled frequency or a triggered event of new data on tables updated via DataLakeHouse.io (DLH.io). This is especially helpful when you want to control the data types set in your target connection, since all columns are initially set as VARCHAR(16777216). A minimal example transformation is shown at the end of this page.

Issue Handling

If any issues occur with the authorization, simply return to the Sources page in DataLakeHouse.io, edit the source details, and click the Save & Test or Authorize Your Account / Re-Authorize Account button to confirm connectivity. If any issues persist, please contact our support team via the DataLakeHouse.io support portal: https://datalakehouse.zendesk.com
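As an alternative to granting the broad Storage Admin role discussed in the storage.buckets.get troubleshooting answer above, a narrower custom role can be created and bound to the service account. This is only a sketch: the role ID, project ID, and exact permission list are illustrative assumptions, not values mandated by DLH.io.

```bash
# Create a project-level custom role that includes storage.buckets.get
# plus read/list access to objects in the bucket.
gcloud iam roles create dlhBucketReader \
  --project=your-gcp-project-id \
  --title="DLH Bucket Reader" \
  --permissions=storage.buckets.get,storage.objects.get,storage.objects.list

# Bind the custom role to the service account used by the connector.
gcloud projects add-iam-policy-binding your-gcp-project-id \
  --member="serviceAccount:dlh-global-bq-data-sync-svc@stg-datalakehouse.iam.gserviceaccount.com" \
  --role="projects/your-gcp-project-id/roles/dlhBucketReader"
```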
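To illustrate the Control Each Column Data Type note above, here is a minimal sketch of a SQL transformation that casts columns synced as VARCHAR(16777216) to more specific types. The schema, table, and column names are hypothetical, and the exact casting syntax depends on your target warehouse.

```sql
-- Hypothetical example: expose a typed view over a table synced by DLH.io,
-- where every column initially lands as VARCHAR(16777216).
CREATE OR REPLACE VIEW analytics.orders_typed AS
SELECT
    CAST(order_id    AS INTEGER)       AS order_id,
    CAST(order_date  AS DATE)          AS order_date,
    CAST(order_total AS NUMERIC(18,2)) AS order_total,
    customer_email                     AS customer_email   -- left as text
FROM raw_gcs.orders;
```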