What you will learn
Experience a classroom-like environment via whiteboarding sessions
Challenges with Data Lakes
Key Big Data Architectures
Why Delta Lake?
Delta Architecture
Delta Lake Demo
Description
Data Lakes built on the Hadoop framework lacked a very basic capability: ACID compliance. Hive tried to overcome some of these limitations by providing update functionality, but the overall process was messy. Databricks (the company founded by the creators of Spark) came up with a unique solution: Delta Lake. Delta Lake enables ACID transactions over existing Data Lakes and integrates with many Big Data frameworks and engines such as Spark, Presto, Athena, Redshift, and Snowflake. Let's explore this interesting technology in more depth.
Delta Lake is an open-source storage layer that brings reliability to data lakes. It provides ACID (Atomicity, Consistency, Isolation, and Durability) guarantees, scalable metadata handling, and unified streaming and batch data processing. It runs on top of existing data lakes and is fully compatible with Spark APIs. Data in Delta Lake is stored in Parquet format, which lets Delta Lake leverage the efficient compression and encoding schemes native to Parquet.
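To make that concrete, here is a minimal sketch of writing and reading a Delta table through the standard Spark DataFrame API. It assumes PySpark with the delta-spark package installed; the path /tmp/delta/events is purely illustrative.

```python
from pyspark.sql import SparkSession

# Register Delta Lake's SQL extension and catalog with Spark
# (assumes the delta-spark package is available on the classpath).
spark = (
    SparkSession.builder.appName("delta-lake-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Write a small DataFrame as a Delta table: the data lands as Parquet files
# plus a _delta_log transaction log, which is what provides ACID guarantees.
df = spark.range(0, 5)
df.write.format("delta").mode("overwrite").save("/tmp/delta/events")

# Read it back with the same Spark DataFrame API.
spark.read.format("delta").load("/tmp/delta/events").show()
```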
Enroll now to explore Delta Lake. Here is the detailed agenda for the course:
- Challenges with Data Lakes
- Key Big Data Architectures
- Why Delta Lake?
- Delta Architecture
- Delta Lake Demo
We are an official training delivery partner of Confluent. We conduct corporate training on various topics, including Confluent Kafka Developer, Confluent Kafka Administration, Confluent Kafka Real-Time Streaming using KSQL & KStreams, and Confluent Kafka Advanced Optimization. Our instructors are well qualified and vetted by Confluent to deliver these courses.
Content
Delta Lake Overview
Appendix