Staff Data Engineer

Term: 12 Months, extension likely

Rate: $80/hr - $95/hr (dependent on experience)

Location: Remote (Boston, MA preferred)

Job Type: Contract

Utilization: 40 hrs/week

About the Company

A rapidly growing biotech organization is leveraging advanced computation, modern cloud infrastructure, and cross‑functional scientific collaboration to accelerate development of next‑gen therapeutic platforms. The team blends data engineering, biology, chemistry, and applied analytics to enable high‑impact scientific decision‑making. The environment is fast‑paced, interdisciplinary, and driven by curiosity, data integrity, and continuous improvement.


About the Role

The company is seeking a Staff Data Engineer who can partner closely with scientists and quantitative teams to design and operate data systems that power experimental research and analysis. You will translate research workflows into reliable ETL/ELT pipelines, create scalable cloud‑based data architectures, and ensure strong governance and validation practices. This role is essential in building a modern data lakehouse that supports diverse experimental data and accelerates scientific insights.


What You'll Do

  • Work with scientists across multiple research functions to gather requirements and understand data structures, experimental workflows, and analysis needs.
  • Architect scalable data schemas and optimize warehouse performance for analytical and cost efficiency.
  • Build, automate, and maintain ETL/ELT pipelines within a modern cloud ecosystem.
  • Deploy infrastructure and data pipelines using IaC tools such as Terraform, Helm, and Kubernetes.
  • Support analytics teams by ensuring high availability, reliability, and clarity of underlying data.
  • Identify bottlenecks, enhance performance, and improve end‑to‑end data quality.
  • Contribute to data governance, validation, monitoring, and compliance frameworks.
  • Stay current with best practices in cloud computing, DevOps, and modern data engineering patterns.

Required Qualifications

  • Bachelor's degree in Computer Science, Engineering, Data Science, or a related quantitative field (or equivalent experience).
  • Strong experience building and maintaining data pipelines in production environments.
  • Ability to translate scientific or analytical requirements into optimized data models.
  • Proficiency in Python and associated data libraries (e.g., Pandas, NumPy).
  • Understanding of data warehousing concepts across relational and NoSQL systems.
  • Experience with Terraform or similar Infrastructure‑as‑Code tools.
  • Familiarity with CI/CD pipelines and cloud‑native deployment workflows.
  • Strong communication skills with the ability to collaborate across technical and scientific teams.
  • Comfort operating in dynamic environments where priorities can change quickly.

Preferred Qualifications

  • Master's degree in a related computational or scientific field.
  • Hands‑on experience with major cloud platforms (GCP, AWS, or Azure), including large‑scale warehouse technologies.
  • Experience with Docker, Kubernetes, or container orchestration frameworks.
  • Familiarity with modern UI or data visualization frameworks (e.g., Streamlit, Angular).
  • Experience building APIs (REST/gRPC) and working with Protocol Buffers.
  • Understanding of regulated data environments (e.g., HIPAA, ELN/Compliance).
  • Exposure to MLOps workflows and model deployment practices.
  • Experience converting notebooks into production data pipelines.
  • Knowledge of scientific data types (e.g., genomics, proteomics, assay data).

