# Oracle
Oracle RDBMS is one of the original enterprise database technologies, available both on-premise and in the cloud as a robust, enterprise-focused platform. It is capable of transactional and analytical workloads and just about everything in between.

## Oracle Specifications

- Required versions: 10 or above are supported.
- Determine which way to connect: default or SSH tunneling.
- Firewalls on your database server should allow incoming connections through the public internet on your Oracle port (typically 1521, unless your network team has changed it).
- Be sure to grantlist/whitelist our IP addresses for cross-network database server access (see our IP Grantlist / Whitelist documentation).
- Consult with your Oracle administrator as needed on the setup.
- We suggest creating a new read-only user to delineate this service account for DLH.io from any other user access to your database; however, using an existing user is acceptable.

## Instructions (Basic Connection Setup)

Remember, DLH.io connects to your database instance with credentials supplied by you. We store your credentials securely with our bank-grade protocols.

1. Create a DataLakeHouse.io user on the Oracle database.
   - Connect to the Oracle database in question with your admin user.
   - Create a user for DLH.io using the following Oracle logic, replacing `<new_username>` with a user name that you choose (we recommend `datalakehouse_ro` to signify a read-only user), and choose a strong password that complies with your security policies. An optional query to verify the grants appears at the end of this section.

```sql
create user <new_username> identified by tmp!password;
grant resource, connect to <new_username>;
```

2. Enter your credentials and other information in the fields.
   - Name/Alias: the name you'll use within DataLakeHouse.io to differentiate this connection from others.
   - Target Schema Prefix: the prefix for the schema(s) that get created on your destination (target connection) database for each of the schemas you load in this connection. For example, if your database has a schema named `dbo`, the target database will, once synced, have a schema named the value of this field + `_dbo`. Alphanumeric characters only; it must start and end with a letter but can contain an underscore (`_`).
   - Server/Host: the public server name or the IP address (most customers use the IP for this field).
   - Port: the port where this database is accessible and the firewall restrictions are open.
   - Database: the name of the database to connect to.
   - Username/Alias: the username of the user you created in the steps above to give access to DataLakeHouse.io.
   - Auth Type: leave this field alone. It is set to Password because DataLakeHouse.io uses TLS and requires username and password credentials to access the database.
   - Password: the password for the user you created in the steps above.

3. Click Save & Test. This attempts to connect to your database with the credentials provided, and a message of success or failure will be shown.
   - On success, you'll be presented with the schema objects of the database and will need to complete the final steps for configuration shown below.
   - On failure, the connection is still saved, but you will need to correct the failure based on the reason provided in the message.
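As an optional sanity check after creating the user, a query along the following lines can confirm the roles were granted. This is a minimal sketch, not part of the official setup: it assumes the recommended `datalakehouse_ro` user name (Oracle stores unquoted user names in uppercase in the data dictionary) and a session with access to `DBA_ROLE_PRIVS`.

```sql
-- Optional verification (assumes the recommended DATALAKEHOUSE_RO user name):
-- confirm the new service account exists and holds the CONNECT and RESOURCE roles.
SELECT grantee, granted_role
  FROM dba_role_privs
 WHERE grantee = 'DATALAKEHOUSE_RO';
```

Two rows (CONNECT and RESOURCE) indicate the grants from the step above succeeded.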
## Instructions (Continued & Final Setup)

This section of steps ensures you have coverage of other important steps required on your database side and in DataLakeHouse.io once you have completed the above test connection successfully.

### Enable LogMiner for Change Tracking

To capture incremental-loading delta changes of the database records being synchronized, and to reduce your costs, we use the native LogMiner feature of Oracle.

1. Connect to your Oracle database server and access the file system.
2. Enable supplemental logging and start LogMiner as an admin user with appropriate privileges:

```sql
alter database add supplemental log data;
execute dbms_logmnr.start_logmnr(options => dbms_logmnr.dict_from_online_catalog + dbms_logmnr.committed_data_only);
```

3. For more instructions, follow the Oracle documentation or confer with your Oracle DBA.
4. Restart your Oracle server in order for the changes to propagate.

### More Specific Configuration

The Oracle database configuration can be time-consuming, but any Oracle DBA should understand the general concepts of what is required when a system such as DLH.io needs to access the data tables and retrieve the incremental data changes (i.e., deltas) to those tables. The following goes into more detail for configuration.

Review the Oracle database-level supplemental logging documentation and ensure this command is run:

```sql
alter database add supplemental log data;
```

The above should work to create logs for all tables having a primary key, but consider the types of tables that will be synchronized by DLH.io. In general, synchronizing only tables with a primary key is the easiest approach, but this is not always the case, and there are some tables you may wish to include that have no primary key. In the latter case, you will need to enable supplemental logging on all columns for those tables without a PK.

If you have tables without a PK, identify those tables (a helper query for finding them appears at the end of this section) and run the following for each of them:

```sql
alter table "<schema>"."<table>" add supplemental log data (all) columns;
```

If for some reason you don't want to, or can't, enable supplemental logging at the database level but can do so at the table level, then for any tables having a PK, use this statement for each table:

```sql
alter table "<schema>"."<table>" add supplemental log data (primary key) columns;
```

N.B.: errors will appear if the logging is not set up correctly, but the logs from DLH.io should be sufficient for you to resolve the issue on your database on your own.

Grant permissions for LogMiner by running the following statements:

```sql
grant select on sys.v_$archived_log to <username>;
grant select on sys.v_$archive_dest to <username>;
grant select on sys.v_$logmnr_contents to <username>;
grant execute on dbms_logmnr to <username>;
grant execute on dbms_logmnr_d to <username>;
grant select any transaction to <username>;
grant execute_catalog_role to <username>;
```

Grant the LOGMINING permission to the user that will be connecting from DLH.io:

```sql
grant logmining to <username>;
```

Grant the user permissions on a few additional system tables. These tables help DLH.io review more details about the environment, which helps optimize performance and consistency.

If using Oracle DB standalone, run the following:

```sql
grant select on dba_free_space to <username>;
grant select on sys.v_$log to <username>;
grant select on sys.v_$tempfile to <username>;
grant select on sys.v_$datafile to <username>;
```

If using Oracle container DB, run the following first, before running the same statements above that you would run for standalone:

```sql
alter session set container=cdb$root;
grant select on dba_free_space to <username>;
grant select on sys.v_$log to <username>;
grant select on sys.v_$tempfile to <username>;
grant select on sys.v_$datafile to <username>;
```
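Since tables without a primary key need `(all) columns` supplemental logging, a query along these lines can list them. This is a hedged sketch, not part of the official instructions: it assumes access to the `DBA_TABLES` and `DBA_CONSTRAINTS` dictionary views, and `MYSCHEMA` is a placeholder for the schema you plan to synchronize.

```sql
-- Hypothetical helper: list tables in a schema that lack a primary key,
-- i.e., candidates for ALTER TABLE ... ADD SUPPLEMENTAL LOG DATA (ALL) COLUMNS.
-- Replace MYSCHEMA with the schema you plan to synchronize.
SELECT t.owner, t.table_name
  FROM dba_tables t
 WHERE t.owner = 'MYSCHEMA'
   AND NOT EXISTS (SELECT 1
                     FROM dba_constraints c
                    WHERE c.owner = t.owner
                      AND c.table_name = t.table_name
                      AND c.constraint_type = 'P');
```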
Lastly, provide permissions for DLH.io to review the databases, as the process confirms what type of database it is referencing (container DB, etc.):

```sql
grant select on sys.v_$database to <username>;
```

That's it. We may update this section from time to time if we find that Oracle versions change the requirements for permissions, and as we take feedback from our customers.

## Other Information About This Connector

From time to time we will update the instruction set here to inform you about this connection or how specifically we must connect to optimally synchronize your data. If you require any other type of authorization to connect to your account instance, please reach out to our support team via our DataLakeHouse.io support portal.

## Issue Handling

If any issues occur with the authorization, simply return to the Sources page in DataLakeHouse.io, edit the source details, and click the Save & Test, Authorize Your Account, or Re-Authorize Account button to confirm connectivity.

If any issues persist, please contact our support team via the DataLakeHouse.io support portal: https://datalakehouse.zendesk.com
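If Save & Test succeeds but incremental syncs later report logging errors, a quick dictionary check can narrow things down. This is a hedged troubleshooting sketch, not part of the official instructions: it queries `V$DATABASE`, and the `CDB` column is only available on Oracle 12c and later.

```sql
-- Hypothetical troubleshooting check (run as a privileged user):
-- confirm database-level supplemental logging is enabled and whether
-- this instance is a container database (CDB).
SELECT supplemental_log_data_min,
       supplemental_log_data_pk,
       supplemental_log_data_all,
       cdb
  FROM v$database;
```

`SUPPLEMENTAL_LOG_DATA_MIN` should read `YES` after the `alter database add supplemental log data;` step above, and `CDB = YES` means the container-DB grant sequence applies.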