Logs & Monitoring
DataLakeHouse.io generates structured log events from your connectors and provides a descriptive view for monitoring and troubleshooting each connector's operations.
DataLakeHouse.io supports monitoring in the following ways:
- Log Events
- Audit Tables in the Target environment (see the query sketch below)
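If you prefer to check sync history directly in the warehouse, the audit tables can be queried with any Snowflake client. The following is a minimal sketch using the snowflake-connector-python package; the table and column names (`DLH_AUDIT_LOG`, `SYNC_START`, `SYNC_END`, `STATUS`) are hypothetical placeholders, so substitute the audit objects that DataLakeHouse.io creates in your target schema.

```python
# Minimal sketch: inspecting sync audit records in a Snowflake target.
# DLH_AUDIT_LOG, SYNC_START, SYNC_END, and STATUS are hypothetical
# placeholders; use the audit table and columns in your own target schema.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",
    user="your_user",
    password="your_password",
    warehouse="your_warehouse",
    database="your_database",
    schema="your_schema",
)

try:
    cur = conn.cursor()
    # Pull the most recent sync runs from the (hypothetical) audit table.
    cur.execute(
        "SELECT SYNC_START, SYNC_END, STATUS "
        "FROM DLH_AUDIT_LOG "
        "ORDER BY SYNC_START DESC LIMIT 10"
    )
    for sync_start, sync_end, status in cur:
        print(sync_start, sync_end, status)
finally:
    conn.close()
```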
You can view logs in your Logs & Monitoring Dashboard:
- Access the view from the Menu Bar.
- Filter the logs using the options below (see the filtering sketch after this list):
  - A - Target Connection Name
  - B - Bridge Name
  - C - Filter Logs (error logs only)
  - D - Filter Duration:
    - 1 Hour
    - 1 Week
    - 1 Day
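For ad-hoc analysis outside the dashboard, the same filters can be reproduced against log events exported as JSON. This is a minimal Python sketch assuming one event per line and illustrative field names (`timestamp`, `level`, `target_connection`, `bridge_name`); the actual keys in your export may differ.

```python
# Minimal sketch: applying the dashboard filters (error logs only, time window,
# target connection, bridge) to exported log events. The field names used here
# are illustrative assumptions, not the exact keys DataLakeHouse.io emits.
import json
from datetime import datetime, timedelta, timezone

def filter_events(path, hours=1, errors_only=True,
                  target_connection=None, bridge_name=None):
    cutoff = datetime.now(timezone.utc) - timedelta(hours=hours)
    matches = []
    with open(path) as f:
        for line in f:  # assumes one JSON event per line
            event = json.loads(line)
            # Assumes ISO-8601 timestamps; "Z" is normalized to an explicit offset.
            when = datetime.fromisoformat(event["timestamp"].replace("Z", "+00:00"))
            if when < cutoff:
                continue
            if errors_only and event.get("level") != "ERROR":
                continue
            if target_connection and event.get("target_connection") != target_connection:
                continue
            if bridge_name and event.get("bridge_name") != bridge_name:
                continue
            matches.append(event)
    return matches

# Example: error events from the last 24 hours for one bridge.
# print(filter_events("events.jsonl", hours=24, bridge_name="my_bridge"))
```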
| Action Name | Action Type | Description | Entity (Example) |
| --- | --- | --- | --- |
| START | SYNC | Initiate the syncing process | DataLakeHouse.io - Source Bridge |
| CONNECT | CHECKPOINT SOURCE | Establish connection checkpoint | PostgreSQL (Source) |
| FILTERING | SOURCE | Filter the source records | PostgreSQL (Source) |
| CHANGE_TRACKING | CHANGE_TRACKING SYNC | Set up logical replication slots | PostgreSQL (Source) |
| START | RETRIEVE | Retrieve records from the source | PostgreSQL (Source) |
| ROW_COUNT | SYNC | Row count at the source | PostgreSQL (Source) |
| DONE | COMPRESS | Compress files for upload | PostgreSQL (Source) |
| START | UPLOAD_S3 | Upload files to S3 | Amazon S3 |
| COMPLETED_FILES_TO_S3 | UPLOAD_S3 | Upload of files to S3 completed | Amazon S3 |
| START | SNOWFLAKE_PROCESS | Set up target connection | Target Bridge |
| START_TARGET_HELPER | TARGET | Set up target connection | Snowflake (Target) |
| TMP_SCHEMA | AVAILABLE/DROPPED/CREATED | Drop and set up temporary schemas | Snowflake (Target) |
| LOADING | DATA | Load data into temporary schemas | Snowflake (Target) |
| MERGE | MAIN_TABLE | Merge into final schemas | Snowflake (Target) |
| INSERT | AUDIT_TABLE | Insert into audit tables | Snowflake (Target) |
| DONE | END | Close the syncing process | DataLakeHouse.io - Source Bridge |
| COMPLETED | BRIDGE | Syncing process closed | DataLakeHouse.io - Source Bridge |
The event logs are emitted in a standardized JSON format.
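As a rough illustration of that shape, the event below is a hypothetical example only: the action name, action type, entity, and description mirror the table above, while the remaining fields and their values are assumptions and will differ in your environment.

```python
# Hypothetical example of a structured log event; the exact field names
# and values emitted by DataLakeHouse.io may differ from this sketch.
import json

sample_event = json.loads("""
{
  "timestamp": "2024-01-15T10:32:07Z",
  "action_name": "START",
  "action_type": "SYNC",
  "entity": "DataLakeHouse.io - Source Bridge",
  "description": "Initiate the syncing process",
  "level": "INFO"
}
""")

print(sample_event["action_name"], sample_event["action_type"])
```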