
Logs & Monitoring

Overview

DataLakeHouse.io generates structured log events from its connectors and provides a descriptive view for monitoring and troubleshooting the operations of each connector.

DataLakeHouse.io supports monitoring in the following ways:

  1. Log Events
  2. Audit Tables in the Target environment

DataLakeHouse.io Monitoring Dashboard

You can view logs in the Logs & Monitoring dashboard.

  • Access the view from the menu bar.
  • You can filter the logs using the following options:
    • A - Target Connection Name
    • B - Bridge Name
    • C - Filter logs (error logs only)
    • D - Filter duration:
      • 1 Hour
      • 1 Day
      • 1 Week
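Conceptually, the dashboard filters above act as predicates over the log stream. As a rough illustration only (the event fields and function below are assumptions for this sketch, not DataLakeHouse.io's actual schema or API), filtering for error-only events within a one-hour window might look like:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical log events; field names are illustrative, not the actual schema.
events = [
    {"bridge": "pg_to_snowflake", "level": "INFO",  "ts": datetime.now(timezone.utc)},
    {"bridge": "pg_to_snowflake", "level": "ERROR", "ts": datetime.now(timezone.utc) - timedelta(minutes=30)},
    {"bridge": "pg_to_snowflake", "level": "ERROR", "ts": datetime.now(timezone.utc) - timedelta(days=2)},
]

def filter_logs(events, errors_only=False, window=timedelta(hours=1)):
    """Mimic the dashboard's C (errors only) and D (duration) filters."""
    cutoff = datetime.now(timezone.utc) - window
    return [
        e for e in events
        if e["ts"] >= cutoff and (not errors_only or e["level"] == "ERROR")
    ]

# Only the 30-minute-old ERROR event passes both filters.
recent_errors = filter_logs(events, errors_only=True, window=timedelta(hours=1))
```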

Log Cycle

Action State Process

Connection Architecture

Logging Action Attributes Available

| Action Name | Action Type | Description | Entity (Example) |
| --- | --- | --- | --- |
| START | SYNC | Initiate the syncing process | DataLakeHouse.io - Source Bridge |
| CONNECT | CHECKPOINT SOURCE | Establish the connection checkpoint | Postgres (Source) |
| FILTERING | SOURCE | Filter the source records | Postgres (Source) |
| CHANGE_TRACKING | CHANGE_TRACKING SYNC | Set up logical replication slots | Postgres (Source) |
| START | RETRIEVE | Retrieve records from the source | Postgres (Source) |
| ROW_COUNT | SYNC | Row count at the source | Postgres (Source) |
| DONE | COMPRESS | Compress files for upload | Postgres (Source) |
| START | UPLOAD_S3 | Upload files to S3 | Amazon S3 |
| COMPLETED_FILES_TO_S3 | UPLOAD_S3 | Upload files to S3 | Amazon S3 |
| START | SNOWFLAKE_PROCESS | Set up the target connection | Target Bridge |
| START_TARGET_HELPER | TARGET | Set up the target connection | Snowflake (Target) |
| TMP_SCHEMA | AVAILABLE/DROPPED/CREATED | Drop and set up temporary schemas | Snowflake (Target) |
| LOADING | DATA | Load data into temporary schemas | Snowflake (Target) |
| MERGE | MAIN_TABLE | Merge into final schemas | Snowflake (Target) |
| INSERT | AUDIT_TABLE | Insert into audit tables | Snowflake (Target) |
| DONE | END | Close the syncing process | DataLakeHouse.io - Source Bridge |
| COMPLETED | BRIDGE | Close the syncing process | DataLakeHouse.io - Source Bridge |
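Taken together, these actions trace an ordered pipeline from source to target. The sketch below is illustrative only: the (action name, action type) ordering is taken from the table above, but the function and usage are assumptions, not part of DataLakeHouse.io.

```python
# Illustrative only: the ordered (action_name, action_type) pairs from the table,
# used to check how far a sync progressed from its emitted log actions.
SYNC_PIPELINE = [
    ("START", "SYNC"),
    ("CONNECT", "CHECKPOINT SOURCE"),
    ("FILTERING", "SOURCE"),
    ("CHANGE_TRACKING", "CHANGE_TRACKING SYNC"),
    ("START", "RETRIEVE"),
    ("ROW_COUNT", "SYNC"),
    ("DONE", "COMPRESS"),
    ("START", "UPLOAD_S3"),
    ("COMPLETED_FILES_TO_S3", "UPLOAD_S3"),
    ("START", "SNOWFLAKE_PROCESS"),
    ("START_TARGET_HELPER", "TARGET"),
    ("TMP_SCHEMA", "AVAILABLE/DROPPED/CREATED"),
    ("LOADING", "DATA"),
    ("MERGE", "MAIN_TABLE"),
    ("INSERT", "AUDIT_TABLE"),
    ("DONE", "END"),
    ("COMPLETED", "BRIDGE"),
]

def last_completed_stage(observed):
    """Return the furthest pipeline stage reached, given observed log actions."""
    reached = -1
    for pair in observed:
        if pair in SYNC_PIPELINE:
            reached = max(reached, SYNC_PIPELINE.index(pair))
    return SYNC_PIPELINE[reached] if reached >= 0 else None

# A sync whose last logged action was compressing files:
stage = last_completed_stage([("START", "SYNC"), ("DONE", "COMPRESS")])
```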

๏ปฟ

Log events

The event logs are emitted in a standardized JSON format.
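Because each event is a JSON document, it can be parsed with any standard JSON library. A minimal sketch, assuming hypothetical field names (the keys below are illustrative, not the documented schema):

```python
import json

# A hypothetical log event; keys are illustrative, not DataLakeHouse.io's actual schema.
raw = """
{
  "action_name": "ROW_COUNT",
  "action_type": "SYNC",
  "entity": "Postgres (Source)",
  "message": "Row count at the source",
  "rows": 1250
}
"""

event = json.loads(raw)
summary = f"{event['action_name']}/{event['action_type']}: {event['rows']} rows"
```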


Updated 03 Mar 2023