Lead Azure Data Engineer – ADF & Databricks

  • Date Posted Aug 6, 2025
  • Location Toronto, ON
  • Job Type Contract
  • Job ID 18093

Are you passionate about developing state-of-the-art technologies? Are you looking for an ambitious opportunity to test and grow your skills? Here is an opportunity…

Working with one of the top financial clients, this role calls for a Lead Azure Data Engineer – ADF & Databricks who will design, develop, and implement applications using in-demand languages and technologies (e.g., WebSphere and Informatica) to support business requirements.

Responsibilities:

  • Data Source Transformation: Migrate the current data source from a tactical infrastructure, which involves direct pulls from virtual machines (VMs), to a strategic infrastructure utilizing Azure Data Factory (ADF) and Databricks.
  • ETL Process Implementation: Develop an Extract, Transform, Load (ETL) process to efficiently transfer data from various sources, including SFTP, web services, and databases, into a Delta Lake. This will enable seamless data consumption for business analytics.
  • Connectivity Assistance: Support the business in establishing a robust connection between Plotly Dash and Databricks, ensuring that data visualizations are powered by the latest data insights.
  • Databricks Setup and Management: Establish the Unity Catalog, configure clusters, define schemas, and set up permissions within Databricks to ensure secure and organized data management. 

Desired Skill Set:

  • 7+ years of experience with ADF, Databricks, Unity Catalog, and Azure Functions (C#)
  • In-depth development experience with Python notebooks on Databricks (Spark, pandas)
  • Experience loading datasets from various sources into Databricks using ADF, following the Medallion architecture
  • Experience converting a Databricks workspace from Hive metastore to Unity Catalog and setting up the associated permissions
  • Experience creating Delta table schemas/tables with proper design, properties, and security permissions
  • Experience building DevOps and deployment pipelines using Azure DevOps/GitHub Enterprise (especially for Databricks cluster/secret management and code deployment)
  • In-depth experience with Microsoft Entra ID for SSO/RBAC setup
  • In-depth experience working with and deploying to Azure PaaS resources such as Azure SQL PaaS, ADF/SHIR, ADLS, Databricks, and Synapse dedicated SQL databases

Nice To Have:

  • Experience with Delta Live Tables
  • Comfortable working in a locked-down Windows environment for a highly regulated industry
  • Financial Services industry experience

BeachHead is an equal opportunity agency and employer. We advocate for you and welcome anyone regardless of race, color, religion, national origin, sex, physical or mental disability, or age.