Google Cloud Storage
Google Cloud Storage, also known as GCP Storage, is Google Cloud's object (blob) storage service for storing files and other objects in the cloud.
DLH.io provides this connector as a direct way to work with data and files, both as a source and as a target (e.g., for backups).
GCP Storage is mainly used for synchronizing data into BigQuery, but it can also be used for other general data synchronization flows and pipelines.
GCP Storage Prerequisites:
- Name of your GCP Project
- Name of your GCP Storage Bucket
- Service Account Key JSON
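For reference, a service account key JSON downloaded from the GCP console has the following general shape (every value below is a placeholder, not a real credential):

```json
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "your-private-key-id",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "your-service-account@your-project-id.iam.gserviceaccount.com",
  "client_id": "your-client-id",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token"
}
```

The entire JSON document, not just the private key, is what gets pasted into the Service Account Key field.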
DataLakeHouse.io securely connects to your Google Cloud Storage bucket. Using the form in the DataLakeHouse.io portal, please complete the following basic steps.
- Enter a Name or Alias for this connection, in the Name/Alias field, that is unique among your connectors
- Enter a Target Schema Prefix, which will prefix the name of the target schema into which your data files are synced
- Enter a Bucket name, where your files are stored
- Typically just the bucket name; no http:// or gs:// prefix is required.
- Select your 'Region'
- Enter any other optional details in the available fields (See the setup video if you need help or contact support)
- Folder Path is a path under the bucket root from which the desired files will be retrieved
- File Pattern is a regular expression (RegEx) used to isolate only certain files to be retrieved
- File Type restricts retrieval to a pre-determined file extension
- Enter your Service Account Key, which should be a JSON string
- Paste the entire Service Account Key (JSON). If using this bucket as process storage for BigQuery, please add the Service Account email, firstname.lastname@example.org, as a principal user on your GCP project and assign it the Storage Admin role.
- Click the Save & Test button. Once your credentials are accepted, you should see a successful connection.
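To illustrate how the Folder Path, File Pattern, and File Type settings work together, here is a minimal sketch in Python using only the standard library. The object names and setting values are hypothetical examples, not values from your bucket:

```python
import re

# Hypothetical object names as they might be listed under the bucket root;
# in a real sync these would come from the GCS object listing.
object_names = [
    "exports/orders_2023_01.csv",
    "exports/orders_2023_02.csv",
    "exports/orders_backup.csv.bak",
    "exports/customers_2023_01.json",
]

folder_pattern = r"^exports/orders_.*"  # Folder Path + File Pattern (RegEx)
file_type = ".csv"                      # File Type (extension filter)

# Keep only objects that match the RegEx AND have the desired extension.
selected = [
    name
    for name in object_names
    if re.match(folder_pattern, name) and name.endswith(file_type)
]
print(selected)
```

With these example settings, only the two `orders_*.csv` files are selected; the `.bak` backup and the JSON file are filtered out.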
Why am I getting a storage.buckets.get error?
This error typically means the service account used by the connection does not have permission to read the bucket's metadata. Ensure the service account has been granted the Storage Admin role (or another role that includes the storage.buckets.get permission) on your GCP project, as described in the setup steps above.
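Assuming the placeholder project ID and service account email shown below (substitute your own values), granting the Storage Admin role can be sketched with the gcloud CLI:

```shell
# Placeholders: replace your-project-id and the service account email
# with the values from your own GCP project.
gcloud projects add-iam-policy-binding your-project-id \
  --member="serviceAccount:your-service-account@your-project-id.iam.gserviceaccount.com" \
  --role="roles/storage.admin"
```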
Can we use the GCP Storage Option for non-BigQuery DWs?
No. As of January 2023, it is only available for BigQuery processing.