Data Engineer || Data Bricks || IT Company || Python and Azure || MCA || BCA || Kolkata || West Bengal || India || World
We are now looking for candidates with an IT programming background.
JD: Data Engineer/Lead with Azure Databricks, Python, and Azure Data Factory.
Minimum experience: 8+ years in IT, including at least 3 years with Python and Databricks.
We need a Data Engineer/Lead to develop a reusable ETL framework on Azure Databricks using Spark 3.0.
This custom framework will be used to build a template-driven data load and transformation system on the data lake and large databases. The framework will use components from Azure and Databricks (Spark).
The developer is expected to write Python test cases and data quality checks for all code produced, as a precondition for CI/CD promotion to higher environments.
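For candidates unfamiliar with the term, the "template-driven" idea can be sketched roughly as follows. This is a minimal plain-Python illustration, not the actual framework: a real implementation would operate on PySpark DataFrames, and every name here (the template keys, the helper functions) is hypothetical.

```python
# Minimal sketch of a template-driven ETL step in plain Python.
# Plain dicts stand in for Spark DataFrames so the example is self-contained.

TEMPLATE = {
    "source": "orders",
    "transforms": [
        {"op": "rename", "from": "amt", "to": "amount"},
        {"op": "filter_not_null", "column": "amount"},
    ],
}

def apply_template(rows, template):
    """Apply each transform listed in the template, in order."""
    for t in template["transforms"]:
        if t["op"] == "rename":
            rows = [{(t["to"] if k == t["from"] else k): v for k, v in r.items()}
                    for r in rows]
        elif t["op"] == "filter_not_null":
            rows = [r for r in rows if r.get(t["column"]) is not None]
        else:
            raise ValueError(f"unknown op: {t['op']}")
    return rows

def check_no_nulls(rows, column):
    """A data quality check written as a plain assertion (pytest-style)."""
    assert all(r.get(column) is not None for r in rows), f"nulls in {column}"
```

The point of the pattern is that new data loads are described by editing a template (configuration), not by writing new pipeline code; the test cases and data quality checks mentioned above would run against each template's output.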
Must Have Skills:
• Must have Azure cloud experience and understand Azure components such as ADF, Azure SQL, and Azure Databricks.
• Very strong Databricks, Spark, PySpark, and Databricks SQL skills on Azure.
• Must have strong ETL and ELT experience.
• Must have strong Python and Databricks SQL skills beyond just calling the Spark API; must be fluent in the Python programming language.
• Must have relational database knowledge for optimal loading of data from on-premises systems and the data lake.
• Must have experience with data lakes and databases.
• Must have knowledge of OOP and functional programming to create a reusable ETL framework.
• Must understand the encryption and security required for PII, financial, and other sensitive data.
• Must understand Delta Lake and other big data file formats.
• Good to have exposure to DevOps and CI/CD skills in the big data space.
• Good to have Airflow or AppWorx experience.
• Good to have exposure to the manufacturing domain.
• Good to have experience with SQL as well as NoSQL databases.
• Good to have AD (Active Directory) security experience.
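The "OOP and functional programming for a reusable framework" requirement above can be sketched as a small pattern, again in plain Python with hypothetical names: transforms are classes with one shared interface, and a pipeline composes them functionally, feeding each step's output to the next.

```python
from abc import ABC, abstractmethod

class Transform(ABC):
    """Base class: each reusable transform implements one step."""
    @abstractmethod
    def apply(self, rows):
        ...

class Uppercase(Transform):
    """Example step: upper-case the values of one column."""
    def __init__(self, column):
        self.column = column

    def apply(self, rows):
        return [{**r, self.column: str(r[self.column]).upper()} for r in rows]

class Pipeline:
    """Composes transforms functionally: output of one feeds the next."""
    def __init__(self, *steps):
        self.steps = steps

    def run(self, rows):
        for step in self.steps:
            rows = step.apply(rows)
        return rows
```

In a real Databricks framework the `apply` methods would take and return Spark DataFrames, but the composition idea is the same.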
Job Description:
• Work closely with the architect and business units to understand technical requirements, and independently implement reusable code.
• Develop an ETL framework for template-driven ETL.
• Develop Databricks code that can call Python and other required libraries.
• Work with offshore and onshore teams, mentor team members on ETL, and conduct knowledge transfer (KT) on the framework and design.
• Implement transformations and aggregations as required.
• Work in an Agile manner, resolve ambiguous requirements, and communicate effectively with peers.
* Note:
You may call us between 9 am and 8 pm:
8777211016
9331205133
Or you can visit our office:
Ideal Career Zone
128/12A, Bidhan Sarani, Shyam Bazaar Metro Gate No. 1, Gandhi Market (behind the Sajjaa Dhaam bed sheet and bed cover showroom), Kolkata 700004
#DataEngineer, #DataBricks, #PythonandAzure, #FullStackDeveloper, #ITCompany, #MCA, #BCA, #Kolkata, #WestBengal, #India, #World
Video: Data Engineer || Data Bricks || IT Company || Python and Azure || MCA || BCA, from the Ideal Career Zone channel (the job hunt)
Published 7 June 2025; duration 00:03:01.