
Honeycomb.io

Senior Data Engineer

Posted Yesterday
Remote
2 Locations
Senior level
As the Senior Data Engineer, you will own Honeycomb’s data platform, designing and maintaining scalable solutions for data access. You will collaborate with various teams to drive innovation in data quality and reliability while leading critical projects that shape the company's data capabilities.

What We’re Building

Honeycomb is the observability platform for teams who manage software that matters. Send any data to our one-of-a-kind data store, solve problems with all the relevant context, and fix issues before your customers find them. Honeycomb is the unified, fast, and collaborative choice for engineering teams who care about customer experience and need answers quickly. We are passionate about consumer-quality developer tools and excited to build technology that raises our industry’s expectations of what our tools can do for us. We work with well-known companies like HelloFresh, Slack, LaunchDarkly, and Vanguard across a range of industries. This is an exciting time in our trajectory: we’ve closed Series D funding, scaled past the 200-person mark, and were named to Forbes’ America’s Best Startups of 2022 and 2023! If you want to see what we’ve been up to, please check out our blog posts and Honeycomb.io press releases.


Who We Are

We come for the impact, and stay for the culture! We’re a talented, opinionated, passionate, fiercely inclusive, and responsible group of bees. We have conviction and we strive to live our values every day. We want our people to do what they truly love amongst a team of highly talented (but humble) peers. 


How We Work

We are a remote-first company, which means we believe it is not where you sit, but how you deliver, that matters most. We invest in our people and care about how you orient to our culture and processes. At the same time, we extend a great deal of trust, autonomy, and accountability from Day 1. #LI-Remote


The Role

As our very first Senior Data Engineer, you’ll have a unique opportunity to lay the foundation for Honeycomb’s data-driven future. Partnering directly with the Head of Data, you will architect and build a modern, scalable data platform that not only powers our business-critical insights but also sets the standard for data quality and reliability across the organization.

What You’ll Do in the Role:

  • Own the Data Platform: Take full ownership of our Snowflake data warehouse, DBT models, and diverse ingestion platform. You’ll design and maintain end-to-end solutions that enable access to clean, accurate, and well-annotated data.
  • Build Scalable Systems: Leverage modern technologies to create robust, production-grade data pipelines and models. Your work will enable rapid iteration and empower teams from R&D to Sales, Marketing, Finance, and beyond to make informed, data-driven decisions and have ownership over their data.
  • Collaborate Across Functions: Work hand-in-hand with engineering, product, sales, marketing, and business stakeholders to translate complex needs into aligned data architectures and actionable insights. Your collaborative spirit will help bridge gaps and foster a culture of shared success.
  • Drive Innovation and Quality: Establish best practices for data quality and reliability by setting meaningful SLO metrics and continuously refining our systems. You’ll have the autonomy to experiment with new technologies and approaches, driving innovation in a fast-paced, evolving environment.
  • Lead with Impact: From planning and deployment to long-term maintenance, you’ll lead critical projects with a keen sense of ownership and strategic vision. Your ability to balance technical excellence with business value will be key to our next phase of growth.

If you are a seasoned data professional with a passion for creating scalable, robust data solutions and enjoy solving complex problems through innovative thinking, we’d love to have you help shape the future of Honeycomb. Join us, and be at the forefront of transforming our data capabilities while making a lasting impact across the entire organization.

What You'll Bring:

  • Extensive data development experience, including expert-level SQL and programming experience in a scripting language (preferably Python)
  • Demonstrated experience with modern data tooling, including MPP data warehouses (e.g. Redshift or Snowflake (preferred)), DBT, and workflow automation tools (e.g. Airflow, Dagster, Prefect)
  • Experience implementing structured data models, architectures, and marts (e.g. Inmon, Kimball)
  • Experience collaborating with data analysts, data scientists, and business users with varying levels of data savvy
  • Comfort working through ambiguous problems; as our first data engineering hire, you’ll do a fair amount of role shaping

Bonus / Preferred experience:

  • Experience with any of the following: Spark, Scala, Terraform, AWS/K8s, Debezium/Flink
  • Experience managing production-grade data pipelines powering customer-facing applications
  • Exposure to MLOps and supporting ML/AI team’s data requirements
  • Experience working with CRM, Martech and other GTM datasets and systems

What You Get When You Join the Hive

  • Base pay range of $170,000 - $200,000 USD (CAD $233,504 - $274,710)
  • A stake in our success - generous equity with employee-friendly stock program
  • It’s not about how strong a negotiator you are: our pay is based on transparent levels relative to experience
  • Time to recharge - Unlimited PTO and paid sabbatical
  • A remote-first mindset and culture (really!)
  • Home office, co-working, and internet stipend
  • 100% employee/75% for dependents coverage for all benefits
  • Up to 16 weeks of paid parental leave, regardless of path to parenthood
  • Annual development allowance
  • And much more...

Please note we cannot sponsor visas or do visa transfers at this time.



Diversity & Accommodations:

We're building a diverse and inclusive workplace where we learn from each other, and we welcome nontraditional candidates and people of all backgrounds, experiences, abilities, and perspectives. You don't need to be a millennial to join us; all gens are welcome! Further, we (of course) follow federal and state disability laws and are happy to provide reasonable accommodations during the application phase, interview process, and employment. Please email [email protected] to discuss accessible formats or accommodations. As an equal opportunity employer, our hiring process is designed to put you at ease and help you show your best work; if we can do better, we want to know!

Top Skills

Airflow
AWS
Dagster
DBT
Debezium
Flink
K8s
Prefect
Python
Scala
Snowflake
Spark
SQL
Terraform
