
Gore Mutual Insurance

Lead, Data Engineering

Cambridge, ON
Senior level
As a Lead Data Engineer at Gore Mutual, you will design and maintain the data platform, automate data management, and ensure data quality and security while collaborating with cross-functional teams. Your role includes optimizing data flows, conducting system tests, and implementing data management processes.

We’re now at the boldest phase of our Next Horizon journey

At Gore Mutual, we’ve completely transformed our business in under three years. By investing in top talent and leading technology, we’ve redefined what it means to be a modern mutual that does good.

Our path forward brings a sharper focus on our business’ performance that’s powered by innovation and an agile, high-performing culture – we’re built for success.

We’re well on our way to becoming a purpose-driven, digitally led national insurer. Come join us.

As a Lead Data Engineer at Gore Mutual Insurance, you will bring a strong technical background in software engineering or computer science and play a pivotal role in designing, building, and maintaining our data platform. Your work will facilitate data accessibility, accuracy, and accountability, enabling data-driven decision-making across our organization. You will collaborate closely with cross-functional teams to ensure our data processes are robust and scalable.

What will you be doing in this role?

Designing Systems for Collecting and Storing Data

  • Design and implement the data infrastructure and tooling that make up the data platform. This includes selecting the appropriate hardware and software components, configuring storage resources, and creating security policies.

Automating Data Management Processes and Handling Data Security

  • Automate data management processes and handle data security, ensuring that security protocols and best practices are in place to protect against potential security threats.

Integrating Data Platforms with Necessary Tools

  • Ensure that the necessary tools are integrated with the data platform.

Testing and Optimizing Data Platform

  • Optimize data pipelines to ensure efficient data flow.
  • Ensure that the data extracted from sources is accurate, complete, and usable. This might involve checking for missing values, inconsistent formats, or anomalies that could indicate errors (a minimal sketch of such a check follows this list).
  • Test the efficiency and speed of data pipelines and databases. This can help identify bottlenecks and optimize performance.
  • Verify that different components of the data infrastructure work together as expected. This includes checking that data flows correctly from sources to databases, and from databases to applications.
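
For illustration only (not part of the posting), a minimal PySpark sketch of the kind of data-quality check described above might count missing values per column before data moves further down the pipeline; the Delta path and schema here are hypothetical.

    # Minimal data-quality sketch (hypothetical path and schema): count missing
    # values per column in a source extract.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("dq-null-check").getOrCreate()

    # Hypothetical bronze-layer extract
    df = spark.read.format("delta").load("/mnt/bronze/policy_extract")

    # One count per column: number of rows where that column is null
    null_counts = df.select(
        [F.count(F.when(F.col(c).isNull(), c)).alias(c) for c in df.columns]
    )
    null_counts.show()

A check like this typically runs as an early pipeline step so that anomalies are caught before they reach downstream consumers.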

Maintenance of Data Platform

  • Regularly check the health and performance of the data platform. This can involve tracking metrics like query times, error rates, and resource usage.
  • Identify and resolve issues that arise in the data platform. This can involve debugging code, optimizing queries, or adjusting configurations.
  • Keep the data platform up-to-date with the latest technologies and security patches. This can involve updating database software, upgrading hardware, or migrating data to new systems.
  • Implement strategies to protect data from loss or corruption. This can involve regular backups, redundancy measures, and disaster recovery plans.

What will you need to succeed in this role?

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, Software Engineering or a related field.
  • A minimum of 5-6 years of relevant experience as a data engineer is required, including experience in data engineering, data system development, or related roles.
  • Strong understanding of data structures, data modeling, and software architecture.
  • Deep knowledge of Microsoft Azure services (DevOps, Databricks, SQL Server, Event Hub, Web Apps, Data Factory, Azure Storage, Key Vault, etc.).
  • Experience with software design patterns and test-driven development (TDD).
  • Proficiency in Python, including a strong grasp of object-oriented and functional programming paradigms.
  • A solid understanding of Spark concepts and distributed systems, including data transformations, RDDs, DataFrames, and Spark SQL.
  • Excellent SQL skills and expertise in database management and tuning.
  • Ability to devise and implement master data management processes.
  • Experience with data lakehouse and medallion architectures using Delta Lake within Azure Databricks (a minimal sketch of this pattern follows this list).
  • Proficiency in developing RESTful APIs.
  • Strong problem-solving and critical-thinking abilities.
  • Strong communication and collaboration skills.
  • Continuous learning and staying updated with the latest advancements in Azure Databricks and data engineering are essential for success in this role.
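
For illustration only (not part of the posting), the sketch below shows the medallion (bronze/silver/gold) pattern with Delta Lake referenced in the requirements, assuming a Databricks-style PySpark environment; all paths, tables, and column names are hypothetical.

    # Illustrative medallion-architecture flow with Delta Lake
    # (hypothetical paths and columns).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

    # Bronze: land raw claims data as-is
    raw = spark.read.json("/mnt/raw/claims")
    raw.write.format("delta").mode("append").save("/mnt/bronze/claims")

    # Silver: deduplicate and conform types
    bronze = spark.read.format("delta").load("/mnt/bronze/claims")
    silver = (
        bronze.dropDuplicates(["claim_id"])
              .withColumn("claim_date", F.to_date("claim_date"))
              .filter(F.col("claim_amount").isNotNull())
    )
    silver.write.format("delta").mode("overwrite").save("/mnt/silver/claims")

    # Gold: aggregate for reporting
    gold = silver.groupBy("policy_id").agg(F.sum("claim_amount").alias("total_claims"))
    gold.write.format("delta").mode("overwrite").save("/mnt/gold/claims_by_policy")

Each layer is written as a Delta table so that downstream consumers read from progressively cleaner, more aggregated data.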

Nice to Have

  • A graduate degree in a technical field with specialization in Analytics, Data Science, or a related subject. 
  • Azure certifications (Microsoft Certified: Azure Data Engineer Associate). 
  • Databricks certifications (Databricks Certified Data Engineer Professional).

#LI-Hybrid

#IndHP 


Gore Mutual Insurance is committed to providing accommodations for people with disabilities during all phases of the recruiting process, including the application process. If you require accommodation because of a disability, we will work with you to meet your needs. If you are selected for an interview and require accommodation, please advise the HR representative who will consult with you to determine an appropriate accommodation.

Top Skills

Python
Spark
SQL


