Part 1 - A sample data pipeline using Python - Setup Project and Develop Program to read table list

Let us understand how to build an end-to-end pipeline using Python. Go through these videos to learn more about PyCharm and Git, as well as setting up and validating the project to get data from a source MySQL database to a target PostgreSQL database.

Here is the list of videos.

Video 1 (this one) - Setup Project, GitHub Repository and Develop Code to read list of tables from a file: https://www.youtube.com/watch?v=BxLTTuLlvH0&list=PLf0swTFhTI8qhMM6uka63ASG1RrRHws4o&index=25
Video 2 - Develop required functionality to read the data from table and generate insert statement using metadata: https://www.youtube.com/watch?v=czJ0j-9FK08&list=PLf0swTFhTI8qhMM6uka63ASG1RrRHws4o&index=26
Video 3 - Develop required functionality to write data to tables in target database: https://www.youtube.com/watch?v=V1nbPEhjLow&list=PLf0swTFhTI8qhMM6uka63ASG1RrRHws4o&index=28

▶️Link for complete playlist - https://www.youtube.com/playlist?list=PLf0swTFhTI8pRV9DDzae2o1m-cqe5PtJ2

=============================================
More top-rated #DataEngineering Courses from #itversity 👇🏻
_____________________________________________________
🔵Click below to get access to the course with one-month lab access for "Data Engineering Essentials Hands-on - SQL, Python and Spark" -
👉🏼🔗https://www.udemy.com/course/data-engineering-essentials-sql-python-and-spark/?referralCode=EEF55B4668DA42F6154D

🟢Data Engineering using AWS Analytics Services (Bestseller)
👉🏼🔗https://www.udemy.com/course/data-engineering-using-aws-analytics-services/?referralCode=99ADF846582E1D7DAEA7

🟢Data Engineering using Databricks features on AWS and Azure (Highly Rated)
👉🏼🔗https://www.udemy.com/course/data-engineering-using-databricks-on-aws-and-azure/?referralCode=EEA8219E6538F56E3B5B

🟢Data Engineering using Kafka and Spark Structured Streaming (NEW)
🔗https://www.udemy.com/course/data-engineering-using-kafka-and-spark-structured-streaming/?referralCode=30F204DEF4644FE9F112

If you are relatively new to Python, feel free to follow our Mastering Python course at https://python.itversity.com

* 00:00:00 Launching Session
* 00:00:33 Prerequisites - Source and Target Databases using Docker. You can use https://github.com/dgadiraju/retail_db.git
* 00:02:20 Setup Project using PyCharm
* 00:15:26 Externalize Database Connectivity Information
* 00:33:09 Validate Python Programs with arguments using the PyCharm Wizard
* 00:34:42 Add Environment Variables using the PyCharm Wizard
* 00:39:56 Versioning the code using GitHub - initialize, add and commit using the local repository
* 00:46:30 Adding .gitignore to ignore non-source-code files
* 00:51:22 Creating Repository using https://www.github.com and pushing from local repository to GitHub
* 00:53:53 Creating a file with list of tables to be loaded
* 00:59:34 Reading table list using Pandas
* 01:11:25 Committing and pushing changes to GitHub
* 01:20:55 Q&A
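
The "Externalize Database Connectivity Information" and "Add Environment Variables" steps above can be sketched as follows. This is a minimal illustration, not the exact code from the video: the variable names (SOURCE_DB_HOST and so on) and the defaults for the local Docker setup are assumptions.

```python
import os


def get_source_db_config():
    """Read source-database connection details from environment variables.

    Keeping credentials out of the source code lets the same program run
    against different databases (local Docker, staging, production) by
    changing only the environment. Variable names here are illustrative.
    """
    return {
        "host": os.environ.get("SOURCE_DB_HOST", "localhost"),
        "port": int(os.environ.get("SOURCE_DB_PORT", "3306")),  # MySQL default
        "user": os.environ.get("SOURCE_DB_USER", "retail_user"),
        "password": os.environ.get("SOURCE_DB_PASSWORD", ""),
        "database": os.environ.get("SOURCE_DB_NAME", "retail_db"),
    }


if __name__ == "__main__":
    print(get_source_db_config())
```

In PyCharm, these environment variables can be set per run configuration (Run > Edit Configurations > Environment variables), which is what the wizard steps in the timeline cover.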
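
The "Reading table list using Pandas" step can be sketched like this. The file name and its one-table-name-per-line layout are assumptions for illustration; the video may use a different file name or format.

```python
import pandas as pd


def get_tables(path):
    """Read the list of tables to be loaded from a plain-text file.

    Assumes one table name per line with no header row; pandas returns
    a single-column DataFrame we label "table_name".
    """
    return pd.read_csv(path, header=None, names=["table_name"])


if __name__ == "__main__":
    # Hypothetical file name; each subsequent pipeline step iterates
    # over these table names to copy data from MySQL to PostgreSQL.
    tables = get_tables("table_list")
    for table_name in tables["table_name"]:
        print(table_name)
```

Driving the pipeline from a file like this means tables can be added or removed without touching the code, which is the point of this first program in the series.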

For quick itversity updates, subscribe to our newsletter or follow us on social platforms.
* Newsletter: http://notifyme.itversity.com
* LinkedIn: https://www.linkedin.com/company/itversity/
* Facebook: https://www.facebook.com/itversity
* Twitter: https://twitter.com/itversity
* Instagram: https://www.instagram.com/itversity/
* YouTube: https://www.youtube.com/itversityin

#Python #pipeline #PythonProgramming #Data #itversity

Join this channel to get access to perks:
https://www.youtube.com/channel/UCakdSIPsJqiOLqylgoYmwQg/join

Video "Part 1 - A sample data pipeline using Python - Setup Project and Develop Program to read table list" from the itversity channel.
Video information
Published: June 15, 2020, 7:32:16
Duration: 01:33:21