
PsiQuantum

Senior Data Software Engineer

Posted 4 days ago
Remote
3 Locations
Senior level

Quantum computing holds the promise of humanity’s mastery over the natural world, but only if we can build a real quantum computer. PsiQuantum is on a mission to build the first real, useful quantum computers, capable of delivering the world-changing applications that the technology has long promised. We know that means we will need to build a system with roughly 1 million qubits that supports fault tolerant error correction within a scalable architecture and a data center footprint.

By harnessing the laws of quantum physics, quantum computers can provide exponential performance increases over today’s most powerful supercomputers, offering the potential for extraordinary advances across a broad range of industries including climate, energy, healthcare, pharmaceuticals, finance, agriculture, transportation, materials design, and many more.

PsiQuantum has determined the fastest path to delivering a useful quantum computer, years earlier than the rest of the industry. Our architecture is based on silicon photonics, which gives us the ability to produce our components at Tier-1 semiconductor fabs such as GlobalFoundries, where we leverage high-volume semiconductor manufacturing processes, the same processes that are already producing billions of chips for telecom and consumer electronics applications. We also benefit from the quantum-mechanical fact that photons don’t feel heat or electromagnetic interference, allowing us to take advantage of existing cryogenic cooling systems and industry-standard fiber connectivity.

In 2024, PsiQuantum announced two government-funded projects to support the build-out of our first Quantum Data Centers and utility-scale quantum computers in Brisbane, Australia and Chicago, Illinois. Both projects are backed by nations that understand quantum computing’s potential impact and the need to scale this technology to unlock that potential. And we won’t just be building the hardware, but also the fault tolerant quantum applications that will provide industry-transforming results.

Quantum computing is not just an evolution of the decades-old advancement in compute power. It provides the key to mastering our future, not merely discovering it. The potential is enormous, and we have the plan to make it real. Come join us.

There’s much more work to be done and we are looking for exceptional talent to join us on this extraordinary journey!

Team Overview

PsiQuantum’s Quantum Applications Software Team is dedicated to creating a software environment where quantum application developers can discover, build, utilize, optimize, and visualize large-scale quantum algorithms for classically intractable problems. As we continue to scale, we’re building out a QA function that bridges classical software QA best practices with cutting-edge quantum software development.

Role Overview

We seek a Senior Data Software Engineer with deep expertise in data engineering to architect, build, and maintain scalable, flexible data pipelines and systems that support critical quantum software applications. These pipelines will enable classical pre-processing on high-performance computing (HPC) platforms and other data-intensive operations. In this hands-on role, you will bridge engineering and scientific research—gathering and analyzing requirements, collaborating with scientists to understand existing data operations, and using that knowledge to build foundational data infrastructure and frameworks. You will champion engineering excellence and design-first thinking across the department, while partnering with Product and other software engineering teams to establish robust data architecture and systems.

You’ll have the opportunity to:

  • Collaborate with key stakeholders to gather requirements, incorporate feedback, and produce engineering design documents that unify user and system needs.
  • Build and own critical data infrastructure using modern data technologies (e.g., Databricks, AWS Glue/Athena, AWS Redshift, Snowflake) to enable secure, large-scale operations (a minimal sketch follows this list).
  • Champion data engineering best practices by automating data workflows and implementing proactive monitoring for reliability and performance.
  • Partner with cross-functional teams (QA, Product, Engineering) to ensure compliance, scalability, and high availability of the data platform.
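
By way of illustration only, here is a minimal sketch of the kind of building block this infrastructure work involves: submitting a query to AWS Athena from Python with boto3. The database, bucket, region, and table names are hypothetical placeholders, not PsiQuantum's actual stack.

import time
import boto3  # AWS SDK for Python

# Hypothetical names, for illustration only.
DATABASE = "quantum_app_metrics"
OUTPUT_S3 = "s3://example-athena-results/"

athena = boto3.client("athena", region_name="us-west-2")

def run_athena_query(sql: str) -> list[dict]:
    """Submit a query to Athena and poll until it finishes."""
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": DATABASE},
        ResultConfiguration={"OutputLocation": OUTPUT_S3},
    )["QueryExecutionId"]

    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)  # back off between polls

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query {qid} ended in state {state}")
    return athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]

rows = run_athena_query("SELECT run_id, qubit_count FROM runs LIMIT 10")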

Responsibilities:

  • Develop and refine data processing pipelines to handle complex scientific or computational datasets.
  • Design and implement scalable database solutions to efficiently store, query, and manage large volumes of domain-specific data.
  • Refactor and optimize existing codebases to enhance performance, reliability, and maintainability across various data workflows.
  • Collaborate with cross-functional teams (e.g., research scientists, HPC engineers) to support end-to-end data solutions in a high-performance environment.
  • Integrate workflow automation tools, ensuring the smooth operation of data-intensive tasks at scale (see the sketch after this list).
  • Contribute to best practices for versioning, reproducibility, and metadata management of data assets.
  • Implement observability: deploy monitoring/logging tools (e.g., CloudWatch, Prometheus, Grafana) to preempt issues, optimize performance, and ensure SLA compliance (also covered in the sketch after this list).
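
To make the workflow-automation and observability bullets concrete, a minimal sketch using Prefect 2.x with metrics pushed to a Prometheus Pushgateway. All flow names, URIs, and the Pushgateway host are assumptions for illustration; the real pipelines and conventions would come from the team.

from prefect import flow, task, get_run_logger  # Prefect 2.x
from prometheus_client import CollectorRegistry, Gauge, push_to_gateway

@task(retries=3, retry_delay_seconds=60)
def extract(source_uri: str) -> list[dict]:
    """Pull raw records; retried automatically on transient failures."""
    ...  # e.g., read from object storage or an instrument database
    return [{"run_id": 1, "payload": "..."}]

@task
def transform(records: list[dict]) -> list[dict]:
    """Apply cleaning/normalization before loading."""
    return [r for r in records if r.get("payload")]

@flow(name="example-scientific-etl")
def etl(source_uri: str = "s3://example-bucket/raw/") -> None:
    logger = get_run_logger()
    rows = transform(extract(source_uri))
    logger.info("Loaded %d rows", len(rows))

    # Push a completion metric to a (hypothetical) Pushgateway so
    # Prometheus/Grafana can alert on stale or failed runs.
    registry = CollectorRegistry()
    g = Gauge("etl_rows_loaded", "Rows loaded by the last ETL run",
              registry=registry)
    g.set(len(rows))
    push_to_gateway("pushgateway.example:9091",
                    job="example-scientific-etl", registry=registry)

if __name__ == "__main__":
    etl()

Pushing metrics to a gateway, rather than exposing a scrape endpoint, is the usual pattern for short-lived batch jobs, since Prometheus would otherwise have nothing to scrape once the run exits.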

Experience/Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • 8+ years in Data Engineering with hands-on cloud and SaaS experience.
  • Proven experience designing data pipelines and workflows, preferably for high-performance or large-scale scientific computations.
  • Strong knowledge of database design principles (relational and/or NoSQL) for complex or high-volume datasets.
  • Proficiency in one or more programming languages commonly used for data engineering (e.g., Python, C++, Rust).
  • Hands-on experience with orchestration tools such as Prefect, Apache Airflow, or equivalent frameworks (an Airflow counterpart to the earlier Prefect sketch follows this list).
  • Hands-on experience with cloud data services, e.g., Databricks, AWS Glue/Athena, AWS Redshift, Snowflake, or similar.
  • Excellent teamwork and communication skills, especially in collaborative, R&D-focused settings.
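
As a rough counterpart to the Prefect sketch above, the same shape of pipeline in Apache Airflow's TaskFlow API (Airflow 2.x). Bucket paths and task bodies are placeholders.

import pendulum
from airflow.decorators import dag, task  # Airflow 2.x TaskFlow API

@dag(
    schedule="@daily",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
    tags=["example"],
)
def example_daily_ingest():
    @task
    def list_new_files() -> list[str]:
        # Hypothetical discovery step, e.g., scanning an S3 prefix.
        return ["s3://example-bucket/raw/2024-01-01.parquet"]

    @task
    def ingest(paths: list[str]) -> int:
        # Hypothetical load step; returns a row count for downstream checks.
        return len(paths)

    ingest(list_new_files())

example_daily_ingest()  # instantiating at module level registers the DAG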

Preferred Qualifications:

  • Knowledge and experience with containerization and orchestration tools such as Docker and Kubernetes, as well as event-driven architectures.
  • Knowledge of HPC job schedulers (e.g., Slurm, LSF, or PBS) and distributed computing best practices is a plus (see the sketch after this list).
  • Experience with Infrastructure as Code (IaC) tools like Terraform, AWS CDK, etc.
  • Experience deploying domain-specific containerization (Apptainer/Singularity) or managing GPU/ML clusters.
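
On the HPC-scheduler point, a small sketch of submitting a hypothetical classical pre-processing job to Slurm from Python. The #SBATCH directives, script name, and paths are illustrative assumptions.

import subprocess
from pathlib import Path

# A hypothetical batch script for a classical pre-processing step.
SCRIPT = """\
#!/bin/bash
#SBATCH --job-name=preprocess
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=8
#SBATCH --time=01:00:00
#SBATCH --mem=16G
python preprocess.py --input "$1"
"""

def submit(input_path: str) -> str:
    """Write the batch script and submit it; returns sbatch's output."""
    path = Path("preprocess.sbatch")
    path.write_text(SCRIPT)
    result = subprocess.run(
        ["sbatch", str(path), input_path],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()  # e.g. "Submitted batch job 12345"

print(submit("/data/raw/run_0001.h5"))

Arguments given after the script path are forwarded by sbatch to the batch script as $1, $2, and so on, which keeps the script reusable across input files.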

PsiQuantum provides equal employment opportunity for all applicants and employees. PsiQuantum does not unlawfully discriminate on the basis of race, color, religion, sex (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, ancestry, citizenship, age, physical or mental disability, military or veteran status, marital status, domestic partner status, sexual orientation, genetic information, or any other basis protected by applicable laws.

Note: PsiQuantum will only reach out to you using an official PsiQuantum email address and will never ask you for bank account information as part of the interview process. Please report any suspicious activity to recruiting@psiquantum.com.

We are not accepting unsolicited resumes from employment agencies.

Top Skills

Apache Airflow
AWS Glue
AWS Redshift
C++
Databricks
Docker
Kubernetes
Prefect
Python
Rust
Snowflake
Terraform
