# Oracle Cloud Storage
Oracle Cloud Storage (or Oracle Storage) is a service offered by Oracle, a long-running technology company that, with its Cloud and Fusion offerings, has its own [object storage](https://en.wikipedia.org/wiki/Object_storage) for managing various types of data and blobs. Oracle Cloud Storage, based on Oracle Cloud Infrastructure (OCI), uses the same scalable storage infrastructure that Oracle already provides to tens of thousands of customers. Oracle Cloud Storage can store any type of object, which allows uses like storage for internet applications, backups, disaster recovery, data archives, [data lakes](https://en.wikipedia.org/wiki/Data_lake) for analytics, and [hybrid cloud storage](https://en.wikipedia.org/wiki/Cloud_computing#Hybrid_cloud).

Our Oracle Cloud Storage DLH.io integration provides:

- Replication of stored data to any cloud data warehouse target
- Synchronization of your target destination at a scheduled frequency
- A single platform to analyze your data and integrate it with other data, so that you can combine it all into a single source-of-truth repository for true analytics that will empower your business

For example, many Oracle customers also have Snowflake, Databricks, or another system they need to integrate with, and DLH.io provides the ability to do so. All you need to do is specify the connection to your Oracle Storage and point to your target system, and DLH.io does the rest. Our Customer Support team (docid\ pbtuxndqrdogoroejbgsv) can help you set it up during a short technical onboarding session, as needed.

## Setup Instructions

DLH.io securely connects to your Oracle Storage using the form in the DLH.io portal. Please complete the following basic steps:

1. Enter a name or alias for this connection, in the 'Name/Alias' field, that is unique from other connectors.
2. Enter a 'Target Schema Prefix', which will be the prefix for the schema at the target you will sync to.
3. Enter a 'Bucket' name, where your files are stored. A bucket URL typically starts with `https://`, so enter just the name without that prefix.
4. Select your 'Region'.
5. Enter your 'Access Key' (credentials to access the bucket).
6. Enter your 'Secret Key' (credentials to access the bucket).
7. Enter any other optional details in the available fields (see the setup video if you need help, or contact Support):
   - 'Folder Path' is a path on the root bucket from which the desired files will be retrieved.
   - 'File Pattern' is a regular expression (regex) used to isolate only certain files to be retrieved (see the sketch after these steps).
   - 'File Type' allows a predetermined type of file extension to be retrieved.
8. Click the 'Save & Test' button. Once your credentials are accepted, you should see a successful connection.
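To illustrate the 'File Pattern' field, the minimal sketch below shows how a regex can isolate only certain files; the object keys and the `.*\.csv$` pattern are hypothetical examples, not values from your bucket:

```python
import re

# Hypothetical object keys, as they might appear under a bucket's folder path.
object_keys = [
    "exports/orders_2024_01.csv",
    "exports/orders_2024_01.csv.bak",
    "exports/readme.txt",
]

# A 'File Pattern' of ".*\.csv$" keeps only files ending in .csv.
file_pattern = re.compile(r".*\.csv$")

matching = [key for key in object_keys if file_pattern.match(key)]
print(matching)  # ['exports/orders_2024_01.csv']
```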
## Creating Credentials (Access & Secret Key) in OCI Using the Console

Using the console, you can create the keys used to access Oracle Storage and Oracle Fusion BICC data files for synchronizing data to your target destination. The steps are as follows:

1. Access IAM for the users in the domain.
2. Create a new service account user specifically for the purpose of using a service account credential for DLH, accessing data for BICC, and controlling the keys as a service account. This requires an email address to be created, as this in effect creates a new user in your OCI account. The benefit here is that a service account will not be tied to a named administrator or named user, and a service account credential can be shared internally by the administrators of your account. We recommend creating a new service account with a name such as the following (you can change this suggestion to fit your company's policies and standards, of course):
   - First Name = DLH Storage
   - Last Name = Service Account
   - Email = dlh_oracle_storage_svc@\<your company domain>
3. Create a new group by navigating to Domains > User Management > Groups (scroll down on the User Management page).
   - Name the group something like `object_storage_access_group`.
   - Assign the newly created user to this group.
4. Create a new policy. For example, create a policy named `dlh_bucket_access_policy` in the Policy Builder (if using the UI).
   - Give this policy the permission Storage Management > 'Let users write objects to Object Storage buckets'.
   - Assign the newly created group to this policy and save.
5. Return to the Domains > User Management area and click on the newly created service account user.
6. Navigate to the 'Customer Secret Keys' tab.
7. Click the 'Generate Secret Key' button.
8. Copy the secret key; then copy the access key. Be sure to copy these immediately and paste them somewhere safe for reuse, as the secret key cannot be recreated: you would have to delete the key and repeat the process to get a new one (in essence recycling the key, destroying the previous key and rendering it useless).
9. Save the secret key and access key and use them as the inputs to the DLH.io Oracle Cloud Storage connection form setup.

## Testing or Troubleshooting the Access and Secret Keys

The best way to test that the keys are working is to create a DLH.io Oracle Cloud Storage connection and then use the 'Test Connection' button.

Another way is to use a quick Python script or Jupyter notebook (or equivalent) and run the following code on your local machine. You will be able to see any error messages returned by OCI quickly, and troubleshoot with Customer Support (docid\ pbtuxndqrdogoroejbgsv) much more easily if there are issues.

```python
import boto3

s3 = boto3.resource(
    's3',
    aws_access_key_id="bexxxxxxxxxxxxxxxxxxxxxxxxxxxxxxdb2",
    aws_secret_access_key="8xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx=",
    region_name="us-ashburn-1",  # region name here that matches the endpoint
    endpoint_url="https://xxxxxxxxxxx.compat.objectstorage.us-ashburn-1.oraclecloud.com"  # include your namespace in the URL
)

# Print out the bucket names
for bucket in s3.buckets.all():
    print(bucket.name)
```

## How-To Video

\[Coming soon]

## Issue Handling

If any issues occur with the authorization, simply return to the Sources page in DLH.io, edit the source details, and click the 'Save & Test' (or 'Authorize Your Account' / 'Re-Authorize Account') button to confirm connectivity. If any issues persist, please contact our support team via the [DataLakehouse.io support portal](https://datalakehouse.zendesk.com).
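Before opening a ticket, it can also help to narrow down whether a problem lies with the credentials themselves or with the optional 'Folder Path' and 'File Pattern' values. The sketch below builds on the troubleshooting script above; the bucket name, prefix, and pattern are hypothetical placeholders, not values from your account:

```python
import re
import boto3

# Same S3-compatible connection as the troubleshooting script above,
# with placeholder credentials, namespace, and region.
s3 = boto3.resource(
    's3',
    aws_access_key_id="<ACCESS_KEY>",
    aws_secret_access_key="<SECRET_KEY>",
    region_name="us-ashburn-1",
    endpoint_url="https://<namespace>.compat.objectstorage.us-ashburn-1.oraclecloud.com"
)

bucket = s3.Bucket("<bucket-name>")      # the connection's 'Bucket' value
folder_path = "exports/"                 # hypothetical 'Folder Path' value
file_pattern = re.compile(r".*\.csv$")   # hypothetical 'File Pattern' value

# Print only the objects a sync would consider: keys under the folder
# path whose names match the file pattern.
for obj in bucket.objects.filter(Prefix=folder_path):
    if file_pattern.match(obj.key):
        print(obj.key)
```

If this lists the files you expect, the credentials and bucket settings are working, and the same values can be entered into the connection form.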