Free Download: Generative AI for Data Engineering
Published 2/2024
MP4 | Video: h264, 1920x1080 | Audio: AAC, 44.1 KHz
Language: English | Size: 2.80 GB | Duration: 6h 8m
Hands-On Beginner's Guide to GenAI and LLMs for Data Engineering with Python and SQL
What you'll learn
Use large language models to create Python code to implement data pipelines
Use LLMs to solve data loading, data transformation, and data quality assessment challenges
Create databases and analytic data models using generative AI
Create Python, SQL, and Bash scripts to implement common data engineering tasks
Requirements
Familiarity with working with data, for example in spreadsheets
Ability to read and follow Python and SQL code is helpful
Description
Generative AI tools such as ChatGPT, Claude, and Bard are making data engineering more accessible and more efficient. If you work with spreadsheets or business intelligence tools but aren't too familiar with Python or SQL, generative AI can help you analyze data and build your own data pipelines and ETL/ELT processes. If you are a data engineer, GenAI can help you focus your efforts on the problem domain and on designing a data architecture while spending less time writing code that can be generated by a machine. Generative AI and LLMs will not replace data engineers or data analysts, but those who know how to use these AI tools will be able to build more capable and reliable data pipelines faster. They will also have a tool that helps them develop their Python, SQL, and data modeling skills by providing a variety of examples of functional code, along with help interpreting error messages and troubleshooting processes that do not work as expected.
Learn Data Engineering Techniques as Well as Data Engineering Tools
In this course, you will learn how to break down data engineering problems into a series of tasks that can be automated using Python, SQL, and command-line scripts generated by a large language model (LLM). Prompting an AI to "generate a data pipeline to do X, Y, and Z" will probably not get you the results you expect. LLMs are powerful tools, but they are not oracles. As with any tool, we need to understand what it is capable of and how to use those capabilities to meet our needs. This course shows you how to think through a data engineering problem, incrementally build components of a solution, and combine those components into functional data pipelines.
This course is organized into several topics that cover the fundamental skills needed to begin work in data engineering using GenAI, including:
Introduction to large language models, foundation models, and other AI topics related to data engineering. This course uses Claude AI from Anthropic, a large language model that is both well suited to data engineering code generation and free to use.
Working with CSV and JSON files
Data quality and data cleaning, including statistics and visualizations
Extract, transform, and load (ETL) / extract, load, and transform (ELT) processes
Relational and NoSQL databases
Data modeling using dimensional data model patterns
Working with JSON data in relational databases such as PostgreSQL
Understanding more advanced components of the modern data stack, including Apache Airflow, Apache Spark, Great Expectations, and dbt
The course begins with the most basic of data engineering tasks: working with files. You will learn how to quickly filter, transform, and find problems in data sets made up of comma-separated value (CSV) and JSON files. You'll also see how to create samples from large data sets so you can efficiently experiment with different solutions to your data engineering needs. You will learn how to generate code that uses command-line utilities like awk, a text processing and data extraction tool, and jq, a tool for parsing, filtering, and transforming JSON data. If you are not familiar with tools like awk and jq, that is no problem: this course teaches you how to describe what you want in a solution so the LLM can choose an appropriate tool for the job.
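For example, one of the first scripts you might ask an LLM to generate is a sampler that pulls a small, random subset out of a large CSV file for experimentation. The sketch below is only illustrative and is not taken from the course; the file names, columns, and sampling rate are assumptions.
Code:
import csv
import random

# Hypothetical file names and sampling rate; adjust for your own data set.
SOURCE_FILE = "sales_large.csv"
SAMPLE_FILE = "sales_sample.csv"
SAMPLE_RATE = 0.01  # keep roughly 1% of the rows

with open(SOURCE_FILE, newline="") as src, open(SAMPLE_FILE, "w", newline="") as dst:
    reader = csv.reader(src)
    writer = csv.writer(dst)
    writer.writerow(next(reader))  # always keep the header row
    for row in reader:
        if random.random() < SAMPLE_RATE:
            writer.writerow(row)
A single prompt describing the input file, the sample size, and the need to keep the header row is usually enough for an LLM to produce a script along these lines.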
Data quality is a primary concern in any data engineering project. Fortunately, with GenAI and a basic understanding of data quality checks, you can quickly generate scripts that check for common data quality problems and apply transformations to correct them (a sketch of this kind of check appears at the end of this description). Statistics and visualizations are important tools for ensuring data quality, and in this course you will learn how to use basic statistics and visualizations for data quality work and data exploration. Because generative AI produces the code, you can spend more time learning about statistics, visualizations, and how to apply them to your problem domain, and less time hunting for syntax errors or debugging logic errors in your code.
Databases are the foundation of many applications and data analysis platforms. You will learn about relational databases as well as NoSQL databases and when to use each. Databases are complicated systems that require us to describe how we want to structure our data; this process is known as data modeling. This course introduces data modeling with a focus on dimensional modeling, a commonly used data model pattern in data analytics. You will also learn how to generate SQL code to implement dimensional models, load data into your database, and query and analyze the data once it is loaded.
The course concludes with an overview of more advanced data engineering tools, including Apache Airflow for data pipeline orchestration, Apache Spark for large-scale analytics and machine learning, Great Expectations for data quality control, and dbt for transforming data. These tools are widely used in data engineering but have traditionally required some coding skill. With generative AI, they are now more accessible to anyone who understands how to work with LLMs like Claude, Bard, and ChatGPT.
Now is a great time to become a data engineer: demand for data engineering skills is high, and we now have tools that let us focus on the problems we are solving while accelerating how quickly we can build scalable, reliable data pipelines.
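To make the data quality checks mentioned above concrete, here is a minimal pandas sketch of the kind of script the course has you generate with an LLM. It is not taken from the course; the file name, column names, and valid ranges are assumptions.
Code:
import pandas as pd

# Hypothetical file and column names; adapt to your own data.
df = pd.read_csv("sales_sample.csv")

# Missing values per column
print(df.isna().sum())

# Range check: flag rows with non-positive or implausibly large amounts
bad_amounts = df[(df["amount"] <= 0) | (df["amount"] > 100_000)]
print(f"{len(bad_amounts)} rows fail the amount range check")

# Date format check: rows whose order_date does not parse as YYYY-MM-DD
parsed = pd.to_datetime(df["order_date"], format="%Y-%m-%d", errors="coerce")
print(f"{parsed.isna().sum()} rows have an invalid order_date")
Checks like these line up with the missing values, range checks, and date/time format lectures in Section 4 of the course outline below.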
Overview
Section 1: Introduction
Lecture 1 Introduction
Lecture 2 Course Overview
Lecture 3 Data Engineering, Data Analysis, and Data Science
Lecture 4 Generative AI and Large Language Models (LLMs)
Section 2: Working with CSV Files
Lecture 5 CSV and JSON Files
Lecture 6 Command Line Utilities for Working with Files
Lecture 7 Filtering Rows of a CSV File
Lecture 8 Combining Commands in a Shell Script
Lecture 9 Sampling with Python
Lecture 10 Modifying a Shell Script
Lecture 11 Scheduling Jobs with Cron
Lecture 12 Cron Job Scheduling Example
Section 3: Working with JSON Files
Lecture 13 Working with JSON Files
Lecture 14 Installing jq
Lecture 15 Filtering JSON files with jq
Lecture 16 Loading JSON into Python
Section 4: Data Quality in Data Engineering
Lecture 17 Overview of Data Quality
Lecture 18 Sales Data
Lecture 19 Missing Values in Data Files
Lecture 20 Adding Documentation to Scripts
Lecture 21 Range Checks
Lecture 22 Working with Dates and Times
Lecture 23 Checking Date and Time Format
Lecture 24 Visualizations for Data Quality
Section 5: Working with Pandas in Python
Lecture 25 Working with Pandas
Lecture 26 Statistics using Dataframes
Lecture 27 Generating Synthetic Data
Section 6: JSON Schemas
Lecture 28 JSON Schemas for Data Validation 1
Lecture 29 JSON Schemas for Data Validation 2
Section 7: Working with Databases
Lecture 30 Relational Databases
Lecture 31 NoSQL Databases
Lecture 32 PostgreSQL
Lecture 33 Installing PostgreSQL
Lecture 34 Creating PostgreSQL Schemas
Lecture 35 Creating Tables in PostgreSQL
Section 8: Dimensional Modeling for Data Analysis
Lecture 36 Dimensional Modeling
Lecture 37 Loading Sales Data into Staging Tables
Lecture 38 Loading Dimension Data into Staging Tables
Lecture 39 Creating Location Dimension
Lecture 40 Creating Products Dimension
Lecture 41 Create Date Dimension
Section 9: Populating a Fact Table in a Dimensional Model
Lecture 42 Creating a Sales Fact Table
Lecture 43 Preparing Data for Aggregation
Lecture 44 Aggregating Staging Data
Lecture 45 Loading Sales Fact Table
Lecture 46 Generating SQL Queries for a Dimensional Model
Section 10: JSON in PostgreSQL
Lecture 47 JSON in PostgreSQL
Lecture 48 Creating a Table with a JSON Column in PostgreSQL
Lecture 49 Loading JSON Data into PostgreSQL
Lecture 50 Querying JSON Data in PostgreSQL
Section 11: Next Steps Learning Data Engineering
Lecture 51 What to Learn Next?
Lecture 52 Apache Airflow for Orchestration
Lecture 53 Apache Spark for ETL/ELT and Analytics
Lecture 54 Great Expectations for Data Quality Control
Lecture 55 dbt for Data Transformation with SQL
Section 12: Course Wrap Up
Lecture 56 Conclusion
Who this course is for
People who work with data and want to build data manipulation scripts faster and develop more complex data pipelines
Homepage
Code:
https://www.udemy.com/course/generative-ai-for-data-engineering/
Recommended High Speed Download Links | Please Say Thanks to Keep the Topic Alive
Rapidgator
aqxwh.Generative.Ai.For.Data.Engineering.part3.rar.html
aqxwh.Generative.Ai.For.Data.Engineering.part1.rar.html
aqxwh.Generative.Ai.For.Data.Engineering.part2.rar.html
Uploadgig
aqxwh.Generative.Ai.For.Data.Engineering.part2.rar
aqxwh.Generative.Ai.For.Data.Engineering.part1.rar
aqxwh.Generative.Ai.For.Data.Engineering.part3.rar
Nitroflare
aqxwh.Generative.Ai.For.Data.Engineering.part3.rar
aqxwh.Generative.Ai.For.Data.Engineering.part1.rar
aqxwh.Generative.Ai.For.Data.Engineering.part2.rar
Fikper
aqxwh.Generative.Ai.For.Data.Engineering.part3.rar.html
aqxwh.Generative.Ai.For.Data.Engineering.part1.rar.html
aqxwh.Generative.Ai.For.Data.Engineering.part2.rar.html
No Password - Links are Interchangeable