Power BI & Data Engineer
Algoleap Technologies Pvt Ltd
Posted on: February 27, 2026
Job Title
Power BI & Data Engineer (8+ years)
Location
Hyderabad / Noida / Gurugram
Employment Type
Full-time
About the Role
We’re looking for a Power BI & Data Engineer with solid experience building enterprise-grade BI solutions end-to-end—from SQL data modelling and ETL orchestration to Power BI report development, publishing, and governance. The ideal candidate is hands-on with SQL (CRUD, joins, stored procedures), understands materialized view patterns (e.g., indexed views in SQL Server), can set up and manage scheduled jobs (cron or equivalent), and thrives in an Agile environment with strong communication and stakeholder engagement.
Key Responsibilities
Power BI (Development, Publishing & Governance)
· Design, develop, and maintain interactive Power BI reports and dashboards using Power Query (M), DAX, and best-practice data models (star schema, surrogate keys, conformed dimensions).
· Publish and manage content in Power BI Service (workspaces, apps, datasets, dataflows).
· Implement Row-Level Security (RLS), dataset certification, and workspace governance.
· Optimize performance via query folding, aggregations, measures vs calculated columns, composite models, and incremental refresh.
· Work with Azure Data Lake Storage and manage Microsoft Fabric capacity enablement and administration.
· Optimize database performance through indexing, query tuning, and regular health checks.
· Configure data gateways, scheduled refresh, deployment pipelines, and CI/CD (e.g., with Git/Azure DevOps).
SQL Engineering & Data Modeling
· Write efficient SQL for CRUD operations, complex joins, window functions, CTEs, stored procedures, views, and indexing strategies.
· Build and maintain materialized view equivalents—e.g., indexed views in SQL Server (or materialized views in Postgres/Oracle)—to optimize BI workloads.
· Design dimensional models and semantic layers supporting analytics and self-service BI.
· Conduct SQL performance tuning (execution plans, indexing, statistics, partitioning).
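As a minimal sketch of the CTE and window-function skills listed above, the following uses Python's built-in sqlite3 as a stand-in for SQL Server or Postgres (the `sales` table and its columns are illustrative, not from any real schema):

```python
import sqlite3

# In-memory database with an illustrative sales fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('North', '2024-01', 100), ('North', '2024-02', 150),
        ('South', '2024-01', 200), ('South', '2024-02', 120);
""")

# CTE + window function: running total of sales per region, ordered by month.
query = """
WITH monthly AS (
    SELECT region, month, SUM(amount) AS total
    FROM sales
    GROUP BY region, month
)
SELECT region, month, total,
       SUM(total) OVER (
           PARTITION BY region ORDER BY month
       ) AS running_total
FROM monthly
ORDER BY region, month;
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)  # e.g. ('North', '2024-02', 150.0, 250.0)
```

The same shape of query (a staging CTE feeding a windowed aggregate) is what an indexed or materialized view would typically precompute for BI workloads.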
Data Pipelines & Scheduling
· Develop and maintain ETL/ELT pipelines to ingest, transform, and stage data for reporting.
· Configure and monitor scheduled jobs (e.g., cron, SQL Agent, or orchestration tools) with robust logging, retries, and alerts.
· Ensure data quality, lineage, and reliability across environments (dev/test/prod).
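The scheduling bullets above (robust logging, retries, and alerts) can be sketched as a small job wrapper. This is an illustrative pattern, not a specific orchestration tool's API; `flaky_extract` is a hypothetical extract step used only to demonstrate the retry path:

```python
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("etl")

def run_with_retries(job, max_attempts=3, backoff_seconds=1.0):
    """Run a job callable, retrying with backoff and logging each attempt."""
    for attempt in range(1, max_attempts + 1):
        try:
            result = job()
            log.info("job succeeded on attempt %d", attempt)
            return result
        except Exception as exc:
            log.warning("attempt %d failed: %s", attempt, exc)
            if attempt == max_attempts:
                # In production this is where an alert (email, pager) fires.
                log.error("job failed after %d attempts", max_attempts)
                raise
            time.sleep(backoff_seconds * attempt)  # linear backoff between tries

# Hypothetical flaky extract step: fails twice, then succeeds.
attempts = {"n": 0}
def flaky_extract():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("source unavailable")
    return ["row1", "row2"]

data = run_with_retries(flaky_extract, backoff_seconds=0.01)
```

In practice the wrapper would be invoked from cron, SQL Server Agent, or an orchestrator task rather than inline, but the logging/retry/alert structure is the same.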
Agile Delivery & Stakeholder Management
· Work in Agile/Scrum: sprint planning, backlog grooming, story slicing, demos, and retros.
· Partner with product owners and business stakeholders to clarify requirements, define KPIs, and translate them into well-structured data models and reports.
· Document data dictionaries, lineage diagrams, design decisions, and operational runbooks.
· Communicate clearly with both technical and non-technical audiences; provide user training and support.
Must-Have Qualifications
· 5+ years of professional experience in Power BI and SQL within analytics/data engineering.
· Strong hands-on experience creating, publishing, and sharing Power BI reports and apps in Power BI Service.
· Advanced SQL skills: CRUD operations, joins (inner/left/right/full/self), window functions, CTEs, stored procedures, views, indexing, and performance tuning.
· Experience with materialized view patterns (e.g., SQL Server indexed views, or materialized views in other RDBMS).
· Experience with Azure Data Lake Storage and Microsoft Fabric capacity enablement and management.
· Experience managing scheduled jobs (e.g., cron, SQL Server Agent, or equivalent schedulers).
· Proven experience working in Agile teams with iterative delivery.
· Excellent communication skills—requirements elicitation, stakeholder management, and presentation.
Nice-to-Have / Preferred
· Azure: Azure SQL, Synapse Analytics, Data Factory, Azure Data Lake, Purview; DevOps for CI/CD.
· SSIS or any ETL tool (ADF, Informatica, dbt, Talend).
· Git-based workflows; branching strategies; automated deployment for BI artifacts.
· Knowledge of data warehousing (Kimball), data governance, privacy/compliance.
· Exposure to Python/PySpark for data prep; APIs/REST for data ingestion.
· Experience with DirectQuery, Composite Models, Large Models.
· Familiarity with unit testing in data pipelines and DAX testing patterns.
· Understanding of observability: logging, metrics, alerts for data jobs.
Technical Stack
· BI: Power BI Desktop & Service, DAX, Power Query (M), Dataflows, Deployment Pipelines
· Databases: SQL Server (or Postgres/Oracle/MySQL), Azure SQL
· Scheduling/Orchestration: cron, SQL Server Agent, Azure Data Factory pipelines, Airflow (optional)
· Version Control/CI-CD: Git (Azure DevOps/GitHub), YAML pipelines
· Data Engineering: SSIS/ADF/dbt (any), APIs/JSON/CSV/Parquet
· Security/Governance: RLS, sensitivity labels, workspace roles, dataset certification
About Company
Algoleap Technologies Pvt Ltd
Telangana, IN
https://algoleap.com