Keyera

Senior Data Platform Specialist

Posted 4 Days Ago
In-Office
Calgary, AB
Expert/Leader
Keyera Career Opportunity

Why Keyera?

It’s our purpose-driven culture, benefits, and people. From flexible, customizable benefits and an employer-funded pension plan to our Keyera Connects social investment program and paid employee volunteer days, we’re focused on empowering our people and communities.

When you work with us, you'll enjoy:

Flexible Benefits to meet your individual and family needs, including:

  • $3,500 plus 4.5% of base salary each year to customize your benefits and investments

  • Savings plan options, including an Employee Share Purchase Plan, RRSP, and TFSA

  • Defined Contribution Pension Plan funded by Keyera up to 10% of base salary

  • Wellness Personal Spending Account of $750 per year to cover wellness expenses

  • Paid vacation and eight flex days each year

  • Two paid volunteer days each year to support the causes that are most important to you

  • Employee Family Assistance Program with a variety of support resources from professional counselling to financial planning support and more.

Role/Location Specific

  • Keyera’s Northern Allowance is a tiered incentive program based on years of service, currently offered to employees working at the following locations: Grande Prairie Office, Wapiti, Simonette, Gold Creek, South Cheecham, and Fox Creek Terminal. This information reflects the current state of the program and may be subject to change over time.

Please note that compensation and benefits may be different based on the work location, position, and a candidate's experience and qualifications.

Job Type: Permanent

 

THE POSITION

The Data Platform Specialist is responsible for designing, developing, and maintaining the infrastructure and systems required for data storage, processing, integration, and analysis. They play a crucial role in building and managing scalable data pipelines and positioning the enterprise data platform as a secure, governed integration hub that enables reliable data exchange across internal systems, external applications, and analytical environments. 

The role also carries accountability for performance optimization and cost management of the data platform, ensuring efficient use of consumption-based services and alignment to business value. 

 

RESPONSIBILITIES

Data Management & Pipeline Development 

  • Designs and develops data pipelines that extract data from various sources, transform it into the desired format, and load it into appropriate data storage systems 

  • Transforms raw data into usable formats through cleansing, aggregation, filtering, enrichment, and modeling techniques 

  • Creates robust data models and architectures to support analytics initiatives 

  • Optimizes data pipelines and processing workflows for performance, scalability, reliability, and efficiency 

  • Designs, develops and maintains data pipelines to ensure efficient and reliable ETL processes 

  • Monitors and tunes data systems, identifies and resolves performance bottlenecks, and implements optimization strategies including partitioning, caching, and indexing 

  • Implements and maintains analytics systems, data warehouses, or data lakes to store and manage structured and unstructured data 

  • Collaborates with data scientists and AI developers to integrate predictive models or machine learning algorithms into analytics workflows 

  • Ensures data quality and accuracy by implementing data validation, monitoring, and error-handling processes 

 

Data Platform as Integration Hub 

  • Designs and implements the data platform as an enterprise integration hub, supporting both inbound and outbound data flows 

  • Develops and manages integrations with internal systems, SaaS platforms, operational applications, APIs, event streams, and external partners 

  • Enables bidirectional data exchange, including batch, streaming, API-based, and event-driven integration patterns 

  • Ensures data consistency, integrity, security, and compliance across integrated systems 

  • Works with application, architecture, and integration teams to define integration standards, reusable patterns, and scalable data exchange frameworks 

Data Quality, Governance & Modeling 

  • Implements data quality checks and validation controls within data pipelines to ensure accuracy, consistency, and completeness 

  • Collaborates with data scientists and analysts to optimize data models for quality, security, performance, and governance 

  • Contributes to the establishment and enforcement of governance practices for data assets and analytical models 

  • Supports enterprise data modeling practices, including logical, physical, and semantic modeling where applicable 

Platform Performance & Cost Management 

  • Models, monitors, and manages platform costs across cloud and data services based on consumption and pricing models (e.g., compute, storage, data movement, API usage) 

  • Designs cost-efficient architectures aligned to workload characteristics and service pricing structures 

  • Implements workload management, scaling policies, and resource optimization strategies to balance performance and cost 

  • Provides transparency and reporting on platform usage and cost drivers to support financial planning and accountability 

  • Partners with architecture and finance stakeholders to forecast platform growth and optimize total cost of ownership 

Enterprise Engagement 

  • Collaborates across AI, analytics, architecture, and business teams to enable trusted and accessible data products 

  • Supports executives and senior leaders in understanding the value, risk, and economics of enterprise data assets 

  • Translates between executive, business, IT, and quantitative stakeholders to align technical solutions with business outcomes 

 

QUALIFICATIONS

Education 

  • Bachelor’s degree in Computer Science, Data Science, Software Engineering, Information Systems, or a related field, or equivalent background and experience 

Experience 

  • Understanding of, and experience working within, a Databricks environment 

  • At least 15 years of work experience in data management or related disciplines, including data integration, modeling, optimization, and data quality 

  • Proven experience designing and maintaining modern data platforms in cloud-based big data environments (e.g., Databricks, Snowflake) 

  • Experience building API-based and event-driven integrations across enterprise systems 

  • Demonstrated experience building, scaling, and sustaining enterprise environments 

  • Experience operating within consumption-based cloud pricing models and managing platform cost optimization initiatives 

  • Experience working in or supporting asset-intensive industries such as: 

      • Energy, utilities, mining, manufacturing, transportation, or infrastructure 

      • Capital projects, operations, reliability, maintenance, or supply chain environments 

Skills 

  • Expertise, or directly comparable experience, in designing and building scalable ELT pipelines within the Databricks Lakehouse, leveraging distributed processing and native platform capabilities while integrating with complementary enterprise data and operational solutions 

  • Strong proficiency in programming languages such as Python and Java 
     

  • Advanced SQL skills and experience with modern data warehouse platforms (e.g., Snowflake, Databricks) 

  • Experience with cloud platforms (AWS, Azure) and modern data architecture patterns 

  • Experience with database technologies such as SQL and NoSQL systems 

  • Ability to design, build, and deploy data solutions that support AI, ML, BI, and operational use cases 

  • Strong understanding of integration architectures, API design, event streaming, and data exchange protocols 

  • Demonstrated ability to optimize platform performance and cost in consumption-based environments 

  • Strong problem-solving and debugging skills across distributed systems 

  • Excellent business acumen and interpersonal skills; able to influence and effect change across business lines 

  • Ability to clearly articulate business use cases, data management concepts, architectural tradeoffs, and financial implications of technical decisions 

 

Posting Expiry Date: Mar 27, 2026

At Keyera, we embrace collaboration, inclusion, and a workplace that is as diverse as the communities we serve. Our values foster an environment for every person to bring their whole self to work.

We offer a well-rounded total compensation package and a comprehensive benefits program designed with the well-being and empowerment of our employees and their families in mind.

If you are interested in an opportunity to join a winning, purpose-driven culture, you’ll enjoy a career with us.

We thank all applicants for their interest; however, only those considered for an interview will be contacted.

Top Skills

AWS
Azure
Databricks
Java
Python
Snowflake
SQL
HQ

Keyera Calgary, Alberta, CAN Office

144 4 Ave SW, Calgary, Alberta, Canada, T2P 3N4
