Job ID R-534386 Date posted 01/20/2026

Job Description

Lead Engineer for Data Operations

Are you ready to join a team known for innovation and ambition at BD? We are hiring a Lead DataOps Engineer to take on an essential role within our Digital Transformation Organization. This position lets you engage with advanced technology in a constantly evolving environment. You will contribute to developing outstanding solutions that boost business results and improve patient health.

Position Summary

As a Lead DataOps Engineer, you will assume a data operations leadership position within our Digital Transformation Organization. You will be highly motivated, self-starting, and dedicated to delivering results with a solid sense of responsibility and persistence until resolution. Your excellent organizational skills will allow you to handle various tasks and succeed in a fast-paced environment with minimal oversight. Furthermore, your strong analytical, problem-solving, and troubleshooting abilities will be essential in this position.

Educational Background

  • Bachelor’s degree or equivalent experience in Computer Science, Data Analytics, or related fields.

Professional Experience

  • More than 5 years of professional experience with Databricks, Azure Data Factory, and Power BI.

Job Responsibilities

Databricks and Azure Data Factory tools:

  • Proactive Monitoring & Support: Monitor ADF pipeline runs, Databricks scheduled job runs, and activity logs for all tracks. Handle failures, retries, and blocking issues.
  • Ensure SLA compliance for data movement and transformation.
  • Process files by manually initiating the necessary jobs according to business needs.
  • Upgrade clusters when performance issues arise.
  • Pause & resume jobs and identify objects impacted by source-side issues.
  • Provide month-end assistance according to business requirements.
  • Build artifacts and document issues & object-level inventory.
  • Collaborate with teams and lead knowledge-sharing sessions.
  • Connect daily with teams and clients to deliver timely updates.
  • Perform data corrections to maintain consistency between source and downstream systems.
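The SLA-compliance duty above can be sketched in code. Below is a minimal, illustrative Python check; the pipeline names and SLA thresholds are hypothetical, and a real setup would pull run metadata from ADF's monitoring API rather than a hard-coded list:

```python
from datetime import datetime, timedelta

# Hypothetical SLA thresholds per pipeline, in minutes. These names are
# illustrative placeholders, not actual BD pipelines.
SLA_MINUTES = {"ingest_sales": 60, "transform_finance": 90}

def check_sla(runs):
    """Flag pipeline runs whose duration exceeded their SLA.

    `runs` is a list of dicts with 'pipeline', 'start', and 'end' keys.
    Returns the names of pipelines that breached their SLA.
    """
    breaches = []
    for run in runs:
        limit = SLA_MINUTES.get(run["pipeline"])
        if limit is None:
            continue  # no SLA defined for this pipeline
        duration = (run["end"] - run["start"]) / timedelta(minutes=1)
        if duration > limit:
            breaches.append(run["pipeline"])
    return breaches

runs = [
    {"pipeline": "ingest_sales",
     "start": datetime(2026, 1, 20, 2, 0),
     "end": datetime(2026, 1, 20, 3, 30)},   # 90 min, over the 60 min SLA
    {"pipeline": "transform_finance",
     "start": datetime(2026, 1, 20, 3, 0),
     "end": datetime(2026, 1, 20, 4, 0)},    # 60 min, within the 90 min SLA
]
print(check_sla(runs))  # ['ingest_sales']
```

In practice a check like this would run on a schedule and feed the incident process described below.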

Incident Management:

  • Perform root cause analysis for failures and create incidents when necessary.
  • Communicate issues to engineering teams.
  • Interact with users and resolve issues.

Scheduling & Trigger Management:

  • Configure and maintain scheduled, tumbling window, and event-based triggers for ADF.
  • Modify schedules according to business requirements.
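The triggers described above are defined as JSON in the data factory. A minimal sketch of a tumbling window trigger, with a hypothetical pipeline name and placeholder start time:

```json
{
  "name": "TumblingWindowTrigger_Hourly",
  "properties": {
    "type": "TumblingWindowTrigger",
    "typeProperties": {
      "frequency": "Hour",
      "interval": 1,
      "startTime": "2026-01-20T00:00:00Z",
      "delay": "00:05:00",
      "maxConcurrency": 2,
      "retryPolicy": { "count": 2, "intervalInSeconds": 300 }
    },
    "pipeline": {
      "pipelineReference": {
        "referenceName": "ExamplePipeline",
        "type": "PipelineReference"
      },
      "parameters": {
        "windowStart": "@trigger().outputs.windowStartTime",
        "windowEnd": "@trigger().outputs.windowEndTime"
      }
    }
  }
}
```

Schedule changes requested by the business typically amount to editing `frequency`, `interval`, or `startTime` here and redeploying the trigger.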

Deployment & CI/CD:

  • Validate BD internal requests and coordinate deployments across Dev, QA, and Prod environments.
  • Maintain Git integration and release pipelines.

DevOps Activities:

  • Create folders in DEV, QAS, and PROD ADLS.
  • Manage files across ADLS environments.
  • Update DAB (Databricks Asset Bundle) files and upload them to ADLS.
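The DAB files mentioned above are Databricks Asset Bundle configurations. A minimal `databricks.yml` sketch, with placeholder bundle name and workspace hosts:

```yaml
# databricks.yml — minimal Databricks Asset Bundle sketch.
# Bundle name and workspace hosts are placeholders, not actual BD resources.
bundle:
  name: dataops_example

targets:
  dev:
    mode: development
    default: true
    workspace:
      host: https://adb-1111111111111111.1.azuredatabricks.net
  prod:
    mode: production
    workspace:
      host: https://adb-2222222222222222.2.azuredatabricks.net
```

Editing a DAB file usually means changing job, cluster, or target definitions here and redeploying the bundle to the relevant environment.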

Power BI:

  • Monitor scheduled dataset, dataflow, and pipeline refreshes across workspaces.
  • Track refresh failures, identify root causes, and take corrective actions.
  • Maintain daily refresh status reports and send updates to collaborators.
  • Ensure gateway health, connection stability, and capacity utilization.
  • Troubleshoot visualization issues, including broken filters, bookmarks, drilldowns, and navigation.
  • Validate and fix data mismatches, performance problems, and incorrect measures.
  • Support users with access requests, RLS issues, and permissions.
  • Coordinate deployments using deployment pipelines, PBIX file migration, and parameter updates.
  • Validate releases in QA and Production through Quality Control checks.
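The refresh-failure triage described above can be sketched as a small script. A minimal Python illustration, assuming refresh records shaped like the `status` field returned by the Power BI REST API's refresh-history endpoint; the dataset names are hypothetical:

```python
def triage_refreshes(history):
    """Group dataset refresh records by outcome so failures can be followed up.

    `history` is a list of dicts with 'dataset' and 'status' keys. Power BI
    refresh statuses include 'Completed' and 'Failed' (among others); anything
    that is not 'Completed' is treated here as needing follow-up.
    """
    report = {"ok": [], "failed": []}
    for entry in history:
        bucket = "ok" if entry["status"] == "Completed" else "failed"
        report[bucket].append(entry["dataset"])
    return report

history = [
    {"dataset": "SalesModel", "status": "Completed"},
    {"dataset": "FinanceModel", "status": "Failed"},
]
print(triage_refreshes(history))
# {'ok': ['SalesModel'], 'failed': ['FinanceModel']}
```

A production version would fetch the history over the REST API per workspace and feed the `failed` bucket into the daily status report mentioned above.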

Knowledge and Skills

  • Practical experience working directly with Databricks, Azure Data Factory, and Power BI.
  • Strong ETL and SQL knowledge is a must.
  • Advanced knowledge of Power BI (DAX, Power Query, data modeling).
  • Familiarity with Azure Data Services or similar cloud platforms.
  • Excellent problem-solving and interpersonal skills.
  • Experience with Azure DevOps / GitHub.
  • Strong critical thinking, troubleshooting, and leadership skills.

Desired / Additional Skills & Knowledge

  • Knowledge of ServiceNow and Microsoft Azure cloud platform.
  • Strong interpersonal, behavioral, and other soft skills.

If you are ready to make an impact and drive digital transformation at BD, we want to hear from you!

Required Skills

Batch Monitoring, Communication, Data Engineering, Data Monitoring, DataOps, Data Pipelines, End-to-End Orchestration, Microsoft Azure Databricks, Operations Orchestration, Problem Resolution, Root Cause Analysis (RCA), Security Monitoring, SLA Monitoring

Optional Skills

Collaborating, Customer Engagement, Emotional Intelligence, User Engagement


Primary Work Location

IND Bengaluru - Technology Campus




BD Fraud Notice

Please be aware of potentially fraudulent job postings on other websites, or suspicious recruiting emails or text messages that attempt to collect your confidential information. If you are concerned that an offer of employment with BD, CareFusion or C.R. Bard might be a scam, please verify it by searching for the posting on the careers page or contact us at ASC.Americas@bd.com.