Azure Databricks Engineer
Job ID: 112038
Location: Dallas, Texas [Remote]
Category: App/Dev
Employment Type: Contract
Date Added: 02/12/2026
This is a remote role for an experienced Azure/Databricks and Data Operations Engineer. The candidate will be responsible for designing, implementing, and managing enterprise data lake environments on Azure, with a focus on automation, data pipeline development, and security compliance. The role requires a deep understanding of cloud data architecture and a commitment to improving data operations efficiency.
Responsibilities
- Design, develop, and maintain scalable data pipelines for ETL/ELT processes using Azure Databricks, Data Factory, and Synapse.
- Build and optimize data ingestion solutions using Azure Event Hubs, Kafka, or similar streaming services.
- Manage enterprise data lake environments on Azure, ensuring data security, governance, and compliance using Unity Catalog, Collibra, and Azure IAM/RBAC.
- Implement Infrastructure-as-Code (IaC) using Terraform and Bicep to automate environment provisioning and configurations.
- Develop and deploy data workflows using PySpark, SQL, and Python, with version control through Git repositories.
- Automate deployment processes and maintain CI/CD pipelines leveraging Azure DevOps or GitHub Actions.
- Oversee data security processes, including network isolation with Private Link and VNet configurations.
- Collaborate on DataOps processes to improve automation, efficiency, and data quality across the organization.
- Enable AI/ML solutions by integrating MLflow and managing feature stores for model tracking and deployment.
- Ensure data environments meet regulatory standards and adhere to FinOps, audit, and compliance requirements.
Qualifications
- Proven experience designing and managing data engineering solutions on Azure, including Databricks, Data Factory, Synapse, and ADLS Gen2.
- Strong programming skills in PySpark, SQL, and Python for building data pipelines.
- Hands-on experience with Infrastructure-as-Code (IaC), particularly Terraform and Bicep.
- Familiarity with DevOps practices, CI/CD pipeline automation, and version control systems like Git.
- Knowledge of data governance, security practices, and cloud compliance standards, especially in regulated industries such as healthcare or finance.
- Experience with real-time data streaming and event ingestion platforms such as Kafka or Azure Event Hubs.
- Ability to work independently in a remote environment while collaborating effectively with cross-functional teams.
- Strong understanding of data modeling, security, and scalable data architecture principles.
- Excellent problem-solving skills and attention to detail, with an emphasis on automation and efficiency.
- Availability to work full-time on a flexible schedule aligned with project needs.
Pay Range: $70.00 – $75.00 per hour
This is a fully remote role and can be performed from an approved location.
