A Comprehensive Guide to Ingesting Data into Singdata Lakehouse
Overview
Environment Setup and Test Data Generation
Data Ingestion: Load Local Files via Singdata Lakehouse Studio
Data Ingestion: Load Data into Lake via Zettapark PUT File
Data Ingestion: Batch Load via Singdata Lakehouse Studio (Public Network)
Data Ingestion: Multi-table Real-time Sync via Singdata Lakehouse Studio (CDC, Public Network)
Data Ingestion: Import Data via SQL INSERT in Singdata Lakehouse Studio
Data Ingestion: Load Data via SQL INSERT in Zettapark
Data Ingestion: Load Data via SAVE_AS_TABLE in Zettapark
Data Ingestion: Batch and Real-time Load Data via Java SDK
Data Ingestion: Real-time Sync Kafka Data via Lakehouse Studio
Data Ingestion: Continuous Kafka Data Ingestion via Pipe
Data Ingestion: Continuous Object Storage Data Ingestion via Pipe
Data Ingestion: Load Files from Web into Lake via Built-in Python Node in Singdata Lakehouse Studio
Data Ingestion: Load Files via PUT in Database Clients (DBV/SQLWorkbench)
Data Ingestion: Load Data via Third-Party Tools