Part 2 – A simple data pipeline using Python – Develop the required functionality to read the data

Let's understand how to build an end-to-end pipeline using Python. Watch these videos to learn more about PyCharm, Git, and setting up and validating the project to get data from a source MySQL database to a target Postgres database.

Here is the list of videos.

Video 1 – Setting up the project and GitHub repository, and developing code to read the list of tables from a file: https://www.youtube.com/watch?v=BxLTTuLlvH0&list=PLf0swTFhTI8qhMM6uka63ASG1RrRHws4o&index=25
Video 2 (this one) – Develop the required functionality to read the data from a table and generate an insert statement using metadata (a minimal sketch follows this list): https://www.youtube.com/watch?v=czJ0j-9FK08&list=PLf0swTFhTI8qhMM6uka63ASG1RrRHws4o&index=26
Video 3 – Develop the required functionality to write data to tables in the target database: https://www.youtube.com/watch?v=V1nbPEhjLow&list=PLf0swTFhTI8qhMM6uka63ASG1RrRHws4o&index=28
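To give a feel for what this video covers, here is a minimal sketch of reading a table from MySQL and generating an insert statement from the cursor metadata. It assumes mysql-connector-python is installed; the connection details and table name are placeholders, not the exact values used in the video.

# Minimal sketch only: assumes mysql-connector-python is installed and uses
# placeholder credentials and table name (not the exact values from the video).
import mysql.connector


def read_table(connection_details, table_name):
    # Establish the connection using keyword arguments.
    connection = mysql.connector.connect(**connection_details)
    cursor = connection.cursor()
    cursor.execute(f'SELECT * FROM {table_name}')
    data = cursor.fetchall()
    # cursor.description carries the column metadata for the query just run.
    column_names = [column[0] for column in cursor.description]
    connection.close()
    return column_names, data


def build_insert_query(table_name, column_names):
    # Generate the insert statement from the metadata instead of hard coding it.
    columns = ', '.join(column_names)
    placeholders = ', '.join(['%s'] * len(column_names))
    return f'INSERT INTO {table_name} ({columns}) VALUES ({placeholders})'


if __name__ == '__main__':
    source_db = {
        'host': 'localhost',      # placeholder host
        'port': 3306,
        'user': 'retail_user',    # placeholder user
        'password': 'itversity',  # placeholder password
        'database': 'retail_db'   # placeholder database
    }
    column_names, data = read_table(source_db, 'orders')
    print(build_insert_query('orders', column_names))

Because the insert statement is built from cursor.description, the same function works for any table listed in the tables file, without hard coding column names.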

If you're relatively new to Python, feel free to take our Mastering Python course at https://python.itversity.com

Link for full playlist – https://www.youtube.com/playlist?list=PLf0swTFhTI8pRV9DDzae2o1m-cqe5PtJ2

More top-rated #DataEngineering courses from #itversity
__________________________________________________________________
Click below to access the course with one month of lab access for "Data Engineering Essentials Hands-on – SQL, Python and Spark" –
https://www.udemy.com/course/data-engineering-essentials-sql-python-and-spark/?referralCode=EEF55B4668DA42F6154D

Data engineering using AWS Analytics Services (bestseller)
https://www.udemy.com/course/data-engineering-using-aws-analytics-services/?referralCode=99ADF846582E1D7DAEA7

Data Engineering using Databricks features on AWS and Azure (highly rated)
https://www.udemy.com/course/data-engineering-using-databricks-on-aws-and-azure/?referralCode=EEA8219E6538F56E3B5B

Data engineering with Kafka and Spark Structured Streaming (NEW)
https://www.udemy.com/course/data-engineering-using-kafka-and-spark-structured-streaming/?referralCode=30F204DEF4644FE9F112

TIME STAMPS
00:00 – 02:52 Introduction
02:52 – 10:40 Connecting to the database
10:40 – 16:25 Establishing a connection and using keyword arguments
16:25 – 18:05 Connect to the Docker container and establish the connection
18:05 – 20:22 Create the cursor
20:22 – 27:24 Run a query
27:24 – 38:46 Update the code so it can connect to the database and start reading the data
38:46 – 40:20 Print the columns
40:20 – 42:48 Loading the data, doubts about the MySQL connection, why two connections are used, and the differences between MySQL and PostgreSQL
42:48 – 50:00 Connect to the target database and establish the connection
50:00 – 52:45 CRUD operations: inserting data into the Postgres database
52:45 – 54:04 Create a function to get a Postgres connection and build the insert query (sketched below)
54:04 – 01:11:00 Load the data into the table
01:11:00 – 01:14:24 General introduction
01:14:24 – 01:16:45 Related questions and doubts
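As a companion to the last few timestamps, here is a minimal sketch of the target side: one function that gets a Postgres connection and one that builds the insert query and loads the rows. It assumes psycopg2 is installed; the connection details are placeholders, and the function names are illustrative rather than taken from the video.

# Minimal sketch only: assumes psycopg2 is installed and uses placeholder
# connection details; function names here are illustrative.
import psycopg2


def get_pg_connection(host, port, dbname, user, password):
    # Keyword arguments keep the connection call readable.
    return psycopg2.connect(
        host=host, port=port, dbname=dbname, user=user, password=password
    )


def load_data(connection, table_name, column_names, rows):
    # Build the insert query from the column metadata read on the source side.
    insert_query = (
        f'INSERT INTO {table_name} ({", ".join(column_names)}) '
        f'VALUES ({", ".join(["%s"] * len(column_names))})'
    )
    cursor = connection.cursor()
    # executemany inserts every row read from the source table.
    cursor.executemany(insert_query, rows)
    connection.commit()

Taken together with the earlier sketch, the pipeline reads the rows and column names from MySQL and passes them straight into load_data on the Postgres side.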

Connect with me or follow me
https://www.linkedin.com/in/durga0gadiraju
https://www.facebook.com/itversity
https://github.com/dgadiraju
https://www.youtube.com/itversityin
https://twitter.com/itversity

#Python #pipeline #PythonProgramming #Data #itversity

Join this channel to access benefits:
https://www.youtube.com/channel/UCakdSIPsJqiOLqylgoYmwQg/join

If you find this video helpful, please connect with me and share it with your friends and family.