Logs & Monitoring
Overview

DataLakeHouse.io generates structured log events from connectors and provides a descriptive view for monitoring and troubleshooting the operations of each of your connectors. DataLakeHouse.io supports monitoring in the following ways:

- Log events
- Audit tables in the target environment (a query sketch appears at the end of this section)
- The DataLakeHouse.io monitoring dashboard

You can view logs in your Logs & Monitoring dashboard; access the view from the menu bar. You can filter the logs using the options below (a programmatic analogue of these filters is sketched at the end of this section):

a. Target connection name
b. Bridge name
c. Filter logs (error logs only)
d. Filter duration (1 hour, 1 day, 1 week)

Log Cycle

Each log cycle records action-state process logging. The following action attributes are available:

| Action Name | Action Type | Description | Entity (example) |
| --- | --- | --- | --- |
| Start | Sync | Initiate the syncing process | DataLakeHouse.io source bridge |
| Connect | Checkpoint source | Establish connection checkpoint | PostgreSQL (source) |
| Filtering | Source | Filter the source records | PostgreSQL (source) |
| Change tracking | Change tracking sync | Set up logical replication slots | PostgreSQL (source) |
| Start | Retrieve | Retrieve records from the source | PostgreSQL (source) |
| Row count | Sync | Row count at the source | PostgreSQL (source) |
| Done | Compress | Compress files for upload | PostgreSQL (source) |
| Start | Upload S3 | Upload files to S3 | Amazon S3 |
| Completed files to S3 | Upload S3 | Upload files to S3 | Amazon S3 |
| Start | Snowflake process | Set up target connection | Target bridge |
| Start | Target helper | Set up target connection | Snowflake (target) |
| Tmp schema | Available/dropped/created | Drop and set up temporary schemas | Snowflake (target) |
| Loading | Data | Load data into temporary schemas | Snowflake (target) |
| Merge | Main table | Merge into final schemas | Snowflake (target) |
| Insert | Audit table | Insert into audit tables | Snowflake (target) |
| Done | End | Close the syncing process | DataLakeHouse.io source bridge |
| Completed | Bridge | Close the syncing process | DataLakeHouse.io source bridge |

The change tracking step sets up logical replication slots on the PostgreSQL source; a way to inspect those slots is sketched at the end of this section.

Log Events

The event logs are in a standardized JSON format.
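Since the exact event schema is not reproduced here, the following is only an illustrative sketch of what one structured log event might look like; every field name and value below is an assumption for illustration, not DataLakeHouse.io's documented schema.

```json
{
  "timestamp": "2024-05-01T12:00:00Z",
  "action_name": "start",
  "action_type": "sync",
  "state": "started",
  "description": "Initiate the syncing process",
  "entity": "datalakehouse.io source bridge",
  "bridge_name": "postgres_to_snowflake",
  "level": "INFO"
}
```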
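The dashboard filters listed above map naturally onto such JSON events. As a minimal sketch, assuming events are available as parsed dictionaries shaped like the hypothetical example above, a programmatic equivalent of the bridge-name, error-only, and duration filters might look like this:

```python
from datetime import datetime, timedelta, timezone

def filter_events(events, bridge_name=None, errors_only=False, window=timedelta(hours=1)):
    """Mimic the dashboard filters: bridge name, error-only, and duration.

    `events` is assumed to be a list of dicts shaped like the illustrative
    JSON event above; the field names are assumptions, not a documented schema.
    """
    cutoff = datetime.now(timezone.utc) - window
    selected = []
    for event in events:
        ts = datetime.fromisoformat(event["timestamp"].replace("Z", "+00:00"))
        if ts < cutoff:
            continue  # outside the requested duration (e.g., last 1 hour)
        if bridge_name and event.get("bridge_name") != bridge_name:
            continue  # filter on bridge name
        if errors_only and event.get("level") != "ERROR":
            continue  # "error logs only" filter
        selected.append(event)
    return selected
```

For example, `filter_events(events, bridge_name="postgres_to_snowflake", errors_only=True, window=timedelta(days=1))` would approximate selecting error logs for one bridge over the last day.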
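Because the audit tables live in the target environment, they can be inspected with ordinary SQL. Here is a minimal sketch using the snowflake-connector-python package; the connection parameters and the audit table name (`DLH_AUDIT_LOG`) are placeholders, not names documented by DataLakeHouse.io.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Connection parameters are placeholders for your own Snowflake account.
conn = snowflake.connector.connect(
    user="YOUR_USER",
    password="YOUR_PASSWORD",
    account="YOUR_ACCOUNT",
    warehouse="YOUR_WAREHOUSE",
    database="YOUR_DATABASE",
    schema="YOUR_SCHEMA",
)
try:
    cur = conn.cursor()
    # "DLH_AUDIT_LOG" is a hypothetical audit table name; substitute the
    # audit table your bridge actually writes to in the target schema.
    cur.execute("SELECT * FROM DLH_AUDIT_LOG ORDER BY 1 DESC LIMIT 10")
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()
```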
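Finally, the change tracking step in the table above sets up logical replication slots on the PostgreSQL source, and those slots can be inspected directly from the source database when troubleshooting a stalled sync. A minimal sketch with psycopg2, where the connection details are placeholders:

```python
import psycopg2  # pip install psycopg2-binary

# Connection string values are placeholders for your PostgreSQL source.
conn = psycopg2.connect(
    "host=SOURCE_HOST dbname=SOURCE_DB user=REPL_USER password=REPL_PASSWORD"
)
with conn, conn.cursor() as cur:
    # pg_replication_slots is a standard PostgreSQL catalog view listing the
    # logical replication slots that change-tracking sync depends on.
    cur.execute("SELECT slot_name, plugin, active FROM pg_replication_slots")
    for slot_name, plugin, active in cur.fetchall():
        print(f"{slot_name}: plugin={plugin}, active={active}")
conn.close()
```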