Data Engineer ETL (Hosur)
Vistec Partners
Posted on: February 27, 2026
Position: Data Engineer
Experience: 5–6 Years
Location: Work From Home (WFH)
Office Requirement: Once a week – Noida
Time Overlap: Mandatory overlap with US EST
Role Summary
We are seeking an experienced Data Engineer with 5–6 years of industry experience, including a minimum of 2 years of hands-on expertise in Databricks and Azure Data Factory (ADF). The role involves designing, building, and optimizing scalable data pipelines and analytics solutions on Azure. Collaboration with US-based stakeholders requires daily overlap with the EST time zone.
Key Responsibilities
- Design, develop, and maintain ETL/ELT pipelines using Azure Data Factory and Databricks.
- Build scalable batch and streaming data processing workflows.
- Develop Databricks notebooks, jobs, and Delta Lake tables.
- Perform performance tuning and cost optimization of data workloads.
- Implement robust data quality checks, validations, and monitoring.
- Develop Python-based data transformation and automation scripts.
- Write and optimize complex SQL queries for analytics and reporting.
- Collaborate with Analytics, BI, and Product teams.
- Document technical designs, workflows, and operational procedures.
Required Skills & Experience
- Databricks: Minimum 2 years of hands-on experience with Spark, notebooks, workflows, Delta Lake.
- Azure Data Factory (ADF): Minimum 2 years of experience building production-grade pipelines.
- Python: Strong scripting and transformation capabilities.
- SQL: Advanced querying, joins, window functions, and optimization.
- Data Modeling: Knowledge of data warehousing concepts (star/snowflake schemas).
- Azure Cloud: Familiarity with Azure data services and architecture.
- Version Control: Experience with Git / DevOps workflows.
- Debugging: Solid troubleshooting and problem-solving skills.
- Communication: Ability to collaborate effectively with US-based stakeholders.
Preferred / Good-to-Have
- Experience with streaming technologies (Kafka / Event Hub).
- CI/CD pipelines using Azure DevOps.
- Expertise in Data Lake / Delta Lake architecture.
- Exposure to BI tools such as Power BI.
- Experience in performance and cost optimization initiatives.
Work Conditions
- Primarily Work From Home (WFH).
- Mandatory once-a-week office visit in Noida.
- Mandatory daily overlap with US EST time zone.
- Flexibility to work evening hours as required.
#DataEngineer #Databricks #AzureDataFactory #Python #SQL #DataEngineering #Hiring #NoidaJobs #WFH #Azure #ETL #BigData
About Company
Vistec Partners
Tamil Nadu, IN
https://www.vistecpartners.com