25-026 DevOps Engineer
- Location: Toronto, Ontario, Canada
- Salary: $80 - 100 per hour
- Sector: Power, Nuclear and Utilities
- Contract type: Contract
- Consultant: Akshata Kamath
Number of Vacancies: 1
Level: MP4
Hourly Rate: $80 - 100/hour
Duration: 10 Months
Hours of work: 35
Location: 700 University Avenue, Toronto (Hybrid – 2 days remote)
Job Overview
We are seeking a skilled DevOps Engineer to support our data analytics developers in deploying, maintaining, and troubleshooting data pipelines within our Azure-based Data Lake environment. This role will be responsible for managing CI/CD pipelines, ensuring seamless code deployment from development to UAT and production, and establishing DevOps best practices for our data engineering and analytics functions. Specifically, we are standing up a Centre of Advanced Analytics and need dedicated expertise and support. The ideal candidate will bring expertise in cloud-based DevOps, data pipeline automation, and infrastructure management, enabling our team to focus on delivering high-quality data products efficiently.
Key Responsibilities:
- Deployment & Environment Management
- Design, implement, and maintain CI/CD pipelines for deploying data pipelines and analytics models into UAT and Production environments.
- Support the data development team by automating code deployments, reducing manual errors, and improving deployment efficiency.
- Troubleshoot and resolve pipeline failures, deployment issues, and infrastructure bottlenecks in collaboration with data developers.
- Manage and optimize data lake infrastructure, security, and access controls to ensure smooth operations.
- DevOps Best Practices & Standardization
- Establish and enforce DevOps standards, best practices, and documentation to improve efficiency and reliability in data product development.
- Develop automated testing strategies for data pipelines to validate transformations, integrity, and performance across environments.
- Work with cross-functional teams to implement observability and monitoring solutions for data workflows and deployments.
- Enhance version control practices and facilitate collaboration using Git, Azure DevOps, or similar tools.
- Infrastructure & Performance Optimization
- Maintain and optimize cloud-based data lake environments (Azure, Databricks, Synapse) for efficient data processing and analytics.
- Automate infrastructure provisioning and configuration using Infrastructure as Code (IaC) tools (Terraform, ARM Templates, etc.).
- Identify and resolve performance bottlenecks in data processing pipelines.
- Assist in defining and implementing security, access management, and governance policies for data and analytics environments.
- Collaboration & Stakeholder Engagement
- Act as a liaison between the Data Analytics team and the Data Lake Engineering team to ensure smooth deployments.
- Work closely with Data Developers, Data Engineers, and Analytics teams to troubleshoot and optimize workflows.
- Provide guidance and mentorship to team members on DevOps principles, automation, and best practices.
Qualifications
- 3+ years of experience in DevOps, Cloud Engineering, or Data Engineering with a strong focus on CI/CD and automation.
- Additional MLOps experience is a nice-to-have.
- Strong expertise in CI/CD tools such as Azure DevOps, GitHub Actions, Jenkins, or GitLab CI/CD.
- Experience working with Azure-based data platforms such as Azure Data Lake, Azure Synapse, and Azure Databricks.
- Proficiency in scripting and automation.
- Understanding of monitoring, logging, and alerting solutions for data pipelines (Azure Monitor).
- Knowledge of security, access management, and compliance standards for data environments.
- Strong problem-solving skills and the ability to debug complex deployment and pipeline issues.
- Ability to work collaboratively.
- Experience with Databricks workflow automation, Delta Lake, Azure