
Redshift

Amazon Redshift is a reliable cloud data warehouse platform capable of handling analytical workloads. It is typically used for data warehouse models and other data structures for reporting and beyond.

Amazon Redshift Prerequisites

- Be sure to safelist the IP addresses from Our IP Grantlist / Whitelist (docid: ernsp9vcy4af88jk0uexc) for cross-network database server access.
- Identify the Redshift cluster and capture its details for later use. Consult with your Redshift administrator as needed on the setup.
- We suggest creating a new read-only user and a new role to delineate this DataLakeHouse.io service account from any other user access to your database; however, using an existing user and/or role is acceptable.
- Create a master or limited user: a master user already has the necessary permissions, or you can create a DataLakeHouse.io-specific limited user that has all the required privileges.

Create a Read-Only DataLakeHouse.io User in Your Instance

We suggest creating a new user specific to DataLakeHouse.io with read-only access to your database(s). You may already have an existing user, in which case you can skip this step.

    CREATE USER datalakehouse_user PASSWORD 'tmp!Password';
    GRANT CREATE, TEMPORARY ON DATABASE public TO datalakehouse_user;
    GRANT CREATE, TEMPORARY ON DATABASE datalakehouse_raw TO datalakehouse_user;

Identify the Redshift user name that will be used and save it for entry in the new connector form.

Instructions (Basic Connection Setup)

Scroll down to ensure you review all steps, as needed.

Create a DataLakeHouse.io user on AWS IAM / Redshift:

1. Log in with an account having an administrator role or similar access.
2. Connect to your Redshift instance and find the cluster information and instance endpoint details.
3. Determine the database name, or (recommended) create a new database called datalakehouse_raw.
4. Update the VPC security group inbound list with the DataLakeHouse.io addresses from Our IP Grantlist / Whitelist (docid: ernsp9vcy4af88jk0uexc), for all IP addresses.
5. In Configuration > Workload Management, we suggest turning on Automatic WLM. In the Workload Management window, click the parameter group, click Switch to WLM mode, choose Automatic WLM, then click Save. You could instead use the manual method by updating Workload queues > Concurrency on main (query concurrency, in the JSON) to a value greater than 5, but we recommend Automatic WLM unless you have an advanced use case for your data.

On the connection form, enter your credentials and other information:

- Name/Alias: the name you'll use within DataLakeHouse.io to differentiate this connection from others.
- Server/Host: the name of the endpoint server. Use the full endpoint.
- Port: the port where this database is accessible and where firewall restrictions are open. For Redshift we always assume port 5439, which is standard, but we keep the field here for future-proofing.
- Database: the name of the database to connect to. In most cases this is the datalakehouse_raw database.
- Username/Alias: the username of the user you created in the steps above to give DataLakeHouse.io access. In most cases this is datalakehouse_user.
- Auth Type: leave this field alone. It is set to Password because DataLakeHouse.io currently uses only SSL/TLS and requires username and password credentials to access the database.
- Password: the password for the user you created in the steps above.

Click Save & Test to save the connection and test that we can connect. If updating the form, click Save & Test or just Test; clicking Save & Test will again save any changes, such as a password change. You will not be able to change the prefix of the schema that will be the target in the destination.

Any test of the connection will attempt to connect to your database with the credentials and info provided, and a message of success or failure will be shown. On success, you'll be prompted with the schema objects of the database and will need to complete the final steps for configuration shown below. If the test connection fails, the connection is still saved, but you will need to correct the failure based on the reason provided in the message.

Issue Handling

If any issues occur with the authorization, simply return to the Sources page in DataLakeHouse.io, edit the source details, and click the Save & Test, Authorize Your Account, or Re-Authorize Account button to confirm connectivity. If any issues persist, please contact our support team via the DataLakeHouse.io support portal: https://datalakehouse.zendesk.com
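The Workload Management step above corresponds to Redshift's wlm_json_configuration parameter. As a rough sketch (field names should be verified against the AWS documentation for your cluster version), Automatic WLM looks like:

```json
[
  { "auto_wlm": true }
]
```

and a manual configuration with the main queue's concurrency raised above 5, as mentioned above, looks something like:

```json
[
  { "user_group": [], "query_group": [], "query_concurrency": 8 },
  { "short_query_queue": true }
]
```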
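Redshift speaks the PostgreSQL wire protocol, so the connection-form fields above map onto a standard driver connection. A minimal sketch using the psycopg2 driver; the endpoint, database, username, and password values are placeholders, not real credentials:

```python
# Sketch: DataLakeHouse.io connection-form fields expressed as
# psycopg2 connection parameters. All values are placeholders.
conn_params = {
    "host": "my-cluster.abc123xyz.us-east-1.redshift.amazonaws.com",  # Server/Host: full endpoint (hypothetical)
    "port": 5439,                    # Port: standard Redshift port
    "dbname": "datalakehouse_raw",   # Database
    "user": "datalakehouse_user",    # Username created in the steps above
    "password": "tmp!Password",      # Password (placeholder)
    "sslmode": "require",            # connections are made over SSL/TLS
}

def check_connection(params):
    """Attempt a connection and a trivial query, roughly what Save & Test does."""
    import psycopg2  # third-party driver; assumed installed
    with psycopg2.connect(**params) as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT 1;")
            return cur.fetchone()[0] == 1
```

Running `check_connection(conn_params)` from a machine with network access to the cluster gives a quick local sanity check before filling in the form.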