INFORMATION FOR REGISTERED CANDIDATES WITH HARNHAM


KID DOCUMENTS
On this page you will find your KID (Key Information Document) and information about the alternative options available to you via our preferred list of Umbrella companies.
UMBRELLA COMPANY 1
GIANT - For almost 30 years, giant have provided specialist, end-to-end workforce management software and support services to large and small recruitment agencies internationally.

UMBRELLA COMPANY 2
PAYSTREAM - A market leading business providing a range of payroll and accountancy services for recruitment agencies and contractors.

UMBRELLA COMPANY 3
SAPPHIRE - Working through Sapphire's umbrella company is a quick and easy solution for contractors, with no paperwork to file. By joining Sapphire, you'll have support every step of the way, with top-of-the-line employment support and benefits.

UMBRELLA COMPANY 4
WORKWELL (FORMERLY JSA) - Enjoy the freedom and flexibility of contracting while Workwell provides benefits such as holiday and sick pay. Unlike a limited company structure, there’s no admin to worry about – you simply work and get paid, and they take care of the rest.

JOBS
LATEST CONTRACT OPPORTUNITIES
Harnham are a specialist Data & Analytics recruitment business with a dedicated Contract and Freelance team, working across the UK, US and Europe.

Software Django Developer
London
£400 - £400
+ Data Engineering
Contract | London
Software Django Developer
£400 Outside IR35
1 day in London
We’re working with a fast-growing technology company focused on delivering scalable SaaS products to enterprise clients. The business combines product-led engineering with data-driven decision making, and is looking for an experienced backend engineer to help build and maintain their core platform.
The team values clean code, automated testing, and pragmatic architecture. You’ll join a small, experienced engineering group where ownership and collaboration are central to delivery.
The Role
You will take responsibility for backend services and APIs, helping to shape the platform architecture and deliver new features end-to-end. The role involves close collaboration with product, data, and frontend colleagues to deliver high-quality, reliable services.
Day-to-day responsibilities include:
- Designing, implementing and maintaining backend services and REST/GraphQL APIs using Python and Django (or Django REST Framework); a brief illustrative sketch follows this list.
- Building performant, secure data models and database schemas (Postgres).
- Writing automated tests (unit/integration) and participating in code review processes.
- Collaborating with frontend engineers to define interfaces and deliver product features.
- Working with DevOps/Platform teams on CI/CD, containerisation and deployment (Docker, Kubernetes or managed alternatives).
- Troubleshooting production issues and improving observability (logging, metrics, tracing).
- Contributing to technical design discussions and driving improvements to reliability and performance.
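By way of illustration only, the sketch below shows the shape of the Django REST Framework work described in the first bullet; the Project model, its fields, and the endpoint are hypothetical and assume a standard Django project with djangorestframework installed, not the client's actual codebase.

# Minimal Django REST Framework sketch (hypothetical model and endpoint).
from django.db import models
from rest_framework import routers, serializers, viewsets

class Project(models.Model):
    # Hypothetical domain object exposed via the API, stored in Postgres via the ORM.
    name = models.CharField(max_length=200)
    created_at = models.DateTimeField(auto_now_add=True)

class ProjectSerializer(serializers.ModelSerializer):
    class Meta:
        model = Project
        fields = ["id", "name", "created_at"]

class ProjectViewSet(viewsets.ModelViewSet):
    # Provides list/retrieve/create/update/delete endpoints under /projects/.
    queryset = Project.objects.all()
    serializer_class = ProjectSerializer

router = routers.DefaultRouter()
router.register(r"projects", ProjectViewSet)
# In urls.py: urlpatterns = [path("api/", include(router.urls))]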
Tech Stack & Skills
Core skills:
- Strong Python development experience (5+ years preferred) with production Django/Django REST Framework work.
- Solid relational database experience, ideally Postgres (schema design, query optimisation).
- Test-driven development practices and experience with pytest or equivalent.
- Experience working with RESTful APIs and/or GraphQL.
- Familiarity with containerisation and cloud deployment (Docker, Kubernetes/EKS/GKE or equivalent).
- Version control with Git and experience of CI/CD pipelines (GitHub Actions, GitLab CI, CircleCI, etc.).
Nice to have:
- Experience with async frameworks (FastAPI, Celery, or asyncio-based work).
- Exposure to event-driven architectures, message queues (Kafka, RabbitMQ) or pub/sub.
- Knowledge of observability tooling (Prometheus, Grafana, Sentry, ELK).
- Understanding of security best practices for web services (OWASP, authentication/authorization patterns).
- Experience working in product-led teams and mentoring junior engineers.
To Apply for this Job Click Here

Master Data Lead
Birmingham
£350 - £500
+ Data Management & Governance
Contract | Birmingham, West Midlands
MASTER DATA LEAD
2 DAYS PER WEEK IN THE OFFICE (BIRMINGHAM)
3 – 6 MONTH CONTRACT
THE COMPANY:
Join an international company and help establish a new Master Data function.
THE ROLE:
As a Master Data Lead, you’ll play a pivotal role in setting up the UK capability – cleansing existing data, defining new processes, and ensuring alignment between business and data teams. Your key responsibilities will include:
- Support the setup of a new Master Data team in the UK, in collaboration with the global team.
- Lead data cleansing and standardisation efforts across customer, supply chain, and operations data.
- Assess current data structures and identify opportunities for process improvement.
- Design and document master data management processes for ongoing use.
- Partner with Customer Service, Supply Chain, and Operations teams to align business needs with data accuracy.
YOUR SKILLS AND EXPERIENCE:
The successful Master Data Lead will have:
- Strong background in Master Data Management (MDM), ideally within large or international organisations.
- Proven experience in data cleansing, data quality improvement, and process creation.
- Excellent understanding of both business and data operations – more business-facing than purely technical.
- Familiarity with ERP systems, preferably IFS or similar.
HOW TO APPLY:
Please register your interest by sending your CV to Mojola Coker via the apply link on this page.
To Apply for this Job Click Here

Data Analyst
London
£300 - £400
+ Advanced Analytics & Marketing Insights
Contract | London
DATA ANALYST
£300 – £400 PER DAY OUTSIDE IR35
FULLY REMOTE (MUST BE BASED IN THE UK)
3 MONTH CONTRACT
THE ROLE:
As the Data Analyst, you’ll be the go-to expert for all things data and reporting within the marketing team.
Your key responsibilities will include:
- Building and automating marketing dashboards and reports to track performance and ROI in Funnel.io.
- Defining key marketing metrics and KPIs in collaboration with the Performance Marketing Manager.
- Streamlining data processes and improving visibility of performance across channels (TikTok, Facebook, Google, etc.).
- Acting as the sole analyst in the team, providing actionable insights that guide marketing strategy and budget allocation.
- Ensuring data quality, consistency, and accessibility for stakeholders across marketing and leadership teams.
YOUR SKILLS AND EXPERIENCE:
The successful Data Analyst will have:
- Proven experience as a Data Analyst, ideally within a marketing or performance marketing team (B2C environment preferred).
- Strong skills in Google Sheets (automation, complex formulas, data structuring).
- Experience using Funnel.io for dashboard creation.
- Comfortable collaborating with non-technical stakeholders to capture reporting requirements.
- Ability to translate marketing data into clear, actionable insights.
- Experience building dashboards in Tableau is a nice-to-have.
HOW TO APPLY:
Please register your interest by sending your CV to Mojola Coker via the apply link on this page.
To Apply for this Job Click Here

Data Analyst
London
£300 - £400
+ Advanced Analytics & Marketing Insights
Contract | London
Job Title: Data Analyst
Location: Remote
Contract: 3 months
IR35 Status: Outside IR35
Start Date: ASAP (max 1-week notice)
Rate: £300-£400 per day
Overview:
A fast-growing German company is expanding and building a new UK-based data team. They are seeking a Data Analyst to extract, manage, and visualise data to support performance marketing initiatives.
Key Responsibilities:
- Create, populate, and automate dashboards for reporting and analysis.
- Work closely with the Performance Marketing Lead to deliver actionable insights.
- Support data-driven decision-making by translating raw data into visual reports.
Required Skills & Experience:
- Hands-on experience with Funnel.io and Google Sheets.
- Strong communication skills to present and explain dashboards to stakeholders.
- Familiarity with Tableau (desirable).
- Experience in performance marketing, preferably B2C.
To Apply for this Job Click Here

Software & AI Engineer
London
£750 - £800
+ Data Science & AI
Contract | London
Software & AI Engineer
£750 – £800 per day
2 days onsite
We’re working with a global healthcare and AI research organisation that’s pioneering the use of advanced Machine Learning to accelerate the discovery and delivery of life-changing treatments. Their vision is to enable faster, personalised therapies that improve patient outcomes worldwide – and they’re seeking a Backend Software Engineer to help bring that mission to life.
The Role
You’ll be joining a multidisciplinary AI/ML team responsible for developing the infrastructure and software that powers cutting-edge research and intelligent applications. The role focuses on backend development in Python, integrating AI components with data, compute, and frontend systems to create scalable, high-performance solutions.
Day-to-day responsibilities include:
- Designing and implementing backend services for Python-based web applications (e.g., FastAPI); a brief illustrative sketch follows this list.
- Integrating AI and ML components into production systems and APIs.
- Writing high-quality, well-tested, and well-documented code following best practices.
- Developing and monitoring metrics to improve system reliability and performance.
- Collaborating closely with frontend engineers, data engineers, and ML specialists to build end-to-end pipelines.
- Participating in agile ceremonies and code reviews to uphold quality and delivery standards.
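As a rough, hypothetical sketch of the FastAPI-style service work described above (the endpoint and model wrapper are illustrative, not the client's actual API):

# Minimal FastAPI sketch wrapping an ML component (names are hypothetical).
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="inference-service")

class PredictionRequest(BaseModel):
    features: list[float]

class PredictionResponse(BaseModel):
    score: float

def run_model(features: list[float]) -> float:
    # Placeholder for the real AI/ML component being integrated behind the API.
    return sum(features) / max(len(features), 1)

@app.post("/predict", response_model=PredictionResponse)
def predict(request: PredictionRequest) -> PredictionResponse:
    return PredictionResponse(score=run_model(request.features))

# Run locally with: uvicorn main:app --reload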
Tech Stack & Skills
Core skills:
- Strong Python backend development experience (FastAPI or similar frameworks)
- Cloud-native deployment experience (preferably Google Cloud and Cloud Run)
- Unit testing experience (pytest or similar frameworks)
- Familiarity with agile development and CI/CD processes
- Strong understanding of Git/GitHub workflows and DevOps tooling
Nice to have:
- Experience with Docker or multi-container application architecture
- Familiarity with AI/ML technologies such as LLMs, NLP, LangGraph, PydanticAI, or AutoGen
- Experience with biological or scientific datasets (genomics, proteomics, etc.)
- Exposure to frontend development (React preferred)
- Experience benchmarking and improving AI/ML models or agent-based systems
What You’ll Bring
- A track record of delivering clean, production-grade backend systems
- A collaborative and proactive approach to working in cross-functional teams
- Passion for innovation and applying technology to advance scientific discovery
- A growth mindset with a focus on continuous learning and improvement
Why Join?
This is a unique opportunity to work at the intersection of AI, software engineering, and healthcare, contributing directly to products that accelerate the development of next-generation therapies. You’ll join a forward-thinking team that values ownership, accountability, and continuous improvement, in an environment built for long-term collaboration and innovation.
To Apply for this Job Click Here

GCP Data Engineer
London
£750 - £800
+ Data Engineering
Contract | London
Data Engineer – GCP
£750 – £800 per day
2 days in London
We’re working with a global healthcare and life sciences leader that’s pioneering the use of AI and Machine Learning to develop advanced therapies for both existing and emerging diseases. Their mission is to make personalised treatments faster, more effective, and more accessible – and this role is key to that vision.
The team is building cutting-edge data infrastructure to power scientific and ML applications, and they’re seeking a Data Engineer with strong experience developing scalable, high-quality data pipelines in the cloud.
The Role
You’ll join a team of data scientists, bioinformaticians, and engineers working at the intersection of healthcare and AI. Your focus will be on designing and maintaining the data pipelines that feed large-scale ML and research workflows.
Day-to-day responsibilities include:
- Building and maintaining data pipelines using Python, SQL, Spark, and Google Cloud technologies (BigQuery, Cloud Storage); a brief illustrative sketch follows this list.
- Ensuring pipelines are robust, reliable, and optimised for AI/ML use cases.
- Developing automated tests, documentation, and monitoring for production-grade data systems.
- Collaborating with scientists and ML engineers to meet evolving data needs.
- Participating in code reviews, introducing best practices, and continuously improving performance and quality.
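For illustration only, a minimal BigQuery step of the kind described in the first bullet might look like the sketch below; the project, dataset, and table names are hypothetical and not taken from the client's environment.

# Illustrative BigQuery transformation step (project/dataset/table names are hypothetical).
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials on GCP

QUERY = """
SELECT sample_id, COUNT(*) AS measurement_count
FROM `my-project.research.raw_measurements`
GROUP BY sample_id
"""

job_config = bigquery.QueryJobConfig(
    destination="my-project.research.sample_summary",
    write_disposition="WRITE_TRUNCATE",
)

# Run the query and materialise the result into a summary table for downstream ML use.
client.query(QUERY, job_config=job_config).result()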
Tech Stack & Skills
Core Skills:
- Strong experience with Python and SQL in production environments
- Proven track record developing data pipelines using Spark, BigQuery, and cloud tools (preferably Google Cloud)
- Familiarity with CI/CD and version control (git, GitHub, DevOps workflows)
- Experience with unit testing (e.g., pytest) and automated quality checks
- Understanding of agile software delivery and collaborative development
Nice to Have:
- Experience with bioinformatics or large-scale biological data (e.g., genomics, proteomics)
- Familiarity with orchestration tools such as Airflow or Google Workflows
- Experience with containerisation (Docker)
- Exposure to NLP, unstructured data processing, or vector databases
- Knowledge of ML and AI-powered data products
What You’ll Bring
- Strong problem-solving skills and curiosity about scientific or AI-driven challenges
- A focus on quality, scalability, and collaboration
- The ability to work across cross-functional teams and translate complex requirements into robust data workflows
To Apply for this Job Click Here

GCP Data Engineer (Contract)
London
£750 - £800
+ Data Engineering
Contract | London
Contract Data Engineer – Google Cloud | Scientific Data | Research-Focused Organisation
London (2-3 days a week onsite)
£750 per day (Inside IR35) | 6-month contract
The Company
Harnham is partnering with a leading research and technology organisation that’s leveraging data and AI to accelerate scientific innovation. They’re looking for an experienced Data Engineer to join on a 6-month contract and help build and optimise data pipelines supporting large-scale scientific and R&D initiatives.
The Role
You’ll be responsible for designing, developing, and maintaining robust data pipelines on Google Cloud. Working closely with other engineers and scientists, you’ll ensure the delivery of clean, scalable data solutions that power high-impact research.
Key Responsibilities
- Build and maintain data pipelines using modern tools on Google Cloud, including Python, Spark, SQL, BigQuery, and Cloud Storage.
- Ensure data pipelines meet the analytical and scientific needs of key applications.
- Deliver high-quality, production-grade code with testing and documentation.
- Develop, measure, and monitor key performance metrics across tools and services.
- Collaborate with technical peers and participate in code reviews to maintain engineering excellence.
- Work cross-functionally with allied teams to deliver end-to-end, production-ready data solutions.
Your Skills and Experience
- 2+ years of experience as a Data Engineer (or equivalent) with a relevant degree in a computational, numerate, or life sciences field.
- Strong experience with Google Cloud Platform (GCP).
- Excellent programming skills in Python and SQL.
- Experience with Spark and DevOps tools (Terraform, GitHub Actions, etc.).
- Strong understanding of modern development practices (git/GitHub, CI/CD, agile).
- Experience with automated testing frameworks such as pytest (a brief illustrative sketch follows below).
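As a small, hypothetical example of the pytest-style testing mentioned above (the function under test is invented purely for illustration):

# Minimal pytest sketch for a pipeline transformation (function name is hypothetical).
import pytest

def normalise_gene_symbol(symbol: str) -> str:
    # Toy transformation step of the kind a pipeline module might expose.
    return symbol.strip().upper()

@pytest.mark.parametrize("raw, expected", [(" brca1 ", "BRCA1"), ("tp53", "TP53")])
def test_normalise_gene_symbol(raw, expected):
    assert normalise_gene_symbol(raw) == expected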
Preferred Qualifications
- Experience working with biological or scientific datasets (e.g. genomics, proteomics, or pharmaceutical data).
- Knowledge of bioinformatics or large-scale research data.
- Familiarity with Nextflow, Airflow, or Google Workflows.
- Understanding of NLP techniques and processing unstructured data.
- Experience with AI/ML-powered applications and containerised development (Docker).
Contract Details
- Day Rate: £750 (Inside IR35)
- Location: London – 3 days per week onsite
- Duration: 6 months (potential to extend)
How to Apply
If you’re a skilled Data Engineer with experience in cloud and scientific data environments, and you’re looking for a hands-on contract where your work has real-world impact, apply now or get in touch with Harnham for more details.
To Apply for this Job Click Here

Backend Software Engineer – AI/ML
London
£700 - £800
+ Data Science & AI
Contract | London
Backend Software Engineer – AI/ML
6-month contract | Inside IR35 | Up to £800 per day | 2-3 days per week in London
We’re supporting a global R&D organisation in the build-out of its AI and ML engineering platforms. The team is responsible for productionising and scaling ML models that power drug discovery and predictive research.
They’re seeking a Backend Software Engineer to lead development of APIs and data infrastructure supporting these ML applications.
What you’ll be doing:
- Develop backend systems for ML-driven applications using Python (FastAPI)
- Build data and compute integration pipelines with cross-functional AI teams
- Design testing and deployment frameworks for model reliability
- Work collaboratively in agile sprints alongside data and DevOps teams
Key Skills:
Python, FastAPI, Git/GitHub, CI/CD, cloud infrastructure (GCP), automated testing (pytest)
Nice to Have:
Experience with ML model deployment or scientific data systems
Ideal for:
Engineers who thrive building robust, production-ready backends for AI/ML platforms.
To Apply for this Job Click Here

Full Stack Developer – GenAI
London
£700 - £800
+ Data Science & AI
Contract | London
Full Stack Developer – GenAI
6-month contract | Inside IR35 | Up to £800 per day | 2-3 days per week in London
A global pharmaceutical client is growing their Responsible AI function and looking for a Full Stack Developer to help build and visualise GenAI applications that evaluate LLM safety, performance, and output explainability.
This role suits someone who enjoys working across both frontend and backend systems, connecting AI models to real-world user interfaces and data pipelines.
What you’ll be doing:
- Develop and deploy full stack features (Python backend, React frontend)
- Integrate LLM and GenAI components into the product ecosystem
- Conduct user research to understand risk and usability in AI systems
- Contribute to agile development cycles and code reviews
Key Skills:
Python, React, TypeScript, Git/GitHub, cloud infrastructure (GCP), agile development, automated testing
Nice to Have:
Experience with AI safety evaluations, LangGraph, PydanticAI, or AutoGen frameworks
Ideal for:
Full stack engineers interested in building user-facing AI tools and safety-focused GenAI systems.
To Apply for this Job Click Here

Backend Software Engineer – GenAI
London
£700 - £800
+ Data Science & AI
Contract | London
6-month contract | Inside IR35 | Up to £800 per day | 2-3 days per week in London
We’re working with a global pharmaceutical company expanding its Responsible AI division. The team are building a next-generation Generative AI platform to accelerate drug discovery and safety evaluation.
They’re looking for a Backend Software Engineer to help design and develop backend systems supporting large-scale LLM and GenAI workloads. The role will focus on backend architecture, API integration, and automation across data and compute pipelines.
What you’ll be doing:
- Build and maintain Python backend systems for GenAI evaluation
- Integrate LLM components with data and compute infrastructure
- Deliver clean, well-tested, production-grade code (FastAPI preferred)
- Collaborate with ML engineers, data scientists, and DevOps teams
- Participate in agile delivery and CI/CD improvement initiatives
Key Skills:
Python, FastAPI, Cloud (GCP/AWS), CI/CD, Git/GitHub, automated testing, DevOps fundamentals
Ideal for:
Engineers with a strong backend foundation and hands-on GenAI or LLM integration experience looking to contribute to applied AI in healthcare.
To Apply for this Job Click Here

Technical Business Analyst
City of London
£400 - £401
+ Advanced Analytics & Marketing Insights
Contract | City of London, London
TECHNICAL BUSINESS ANALYST
£400 PER DAY OUTSIDE IR35
REMOTE (WITH OCCASIONAL OFFICE REQUIREMENTS – LONDON)
3 MONTH CONTRACT
THE ROLE:
As a Business Analyst, you’ll join a core commercial platform team, sitting at the intersection of Product, Engineering, and Commercial Operations. Your responsibilities will include:
- Analysing business problems and identifying opportunities for improvement and impact
- Translating product and commercial goals into clear requirements, user stories, and acceptance criteria
- Supporting Product Managers through discovery, backlog refinement, and sprint planning
- Facilitating workshops and stakeholder sessions across Product, Engineering, and Commercial functions
- Creating and maintaining documentation such as process maps, data flows, and business rules
- Validating requirements through stakeholder input, data, and business impact analysis
- Supporting backlog management, prioritisation, and sprint readiness
- Acting as the bridge between business and technical teams to ensure shared understanding
YOUR SKILLS AND EXPERIENCE:
The successful Business Analyst will have:
- Strong analytical, facilitation, and documentation skills
- Solid understanding of Agile and Scrum delivery practices
- Experience writing user stories and defining acceptance criteria
- Ability to navigate ambiguity and bring structure to fast-moving projects
- Experience in commercial platform or integration-heavy environments
- Familiarity with APIs, system interfaces, or data flow documentation
HOW TO APPLY:
Please register your interest by sending your CV to Mojola Coker via the apply link on this page.
To Apply for this Job Click Here

Senior Data Engineer
London
£500 - £560
+ Data Engineering
Contract | London
To Apply for this Job Click Here
Data Engineer
£500 – £560 per day
London – 1 day per week in office
We’re working with a leading global healthcare technology company who are building out their next-generation data platform, with a strong emphasis on automation, testing, and cloud-native engineering. They’re looking for an experienced Data Engineer to join the team.
The Role
You’ll be part of a modern data engineering function that’s implementing best-in-class data practices across ingestion, transformation, and orchestration layers. The environment is highly technical, collaborative, and fast-paced, giving you the opportunity to work across a wide variety of data sources and tools.
Day-to-day responsibilities include:
- Designing and developing DBT models and Airflow pipelines within a modern data stack; a brief illustrative sketch follows this list.
- Building robust data ingestion pipelines across multiple sources – including external partners, internal platforms, and APIs.
- Implementing automated testing and CI/CD pipelines for data workflows.
- Performing data extraction and enrichment, including web scraping and parsing of unstructured text (e.g., scanned forms and documents).
- Collaborating on forecasting and predictive analytics initiatives.
- Bringing modern engineering practices, testing frameworks, and design patterns to the wider data function.
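As an illustration of the orchestration side of the first bullet, a minimal Airflow DAG might look like the sketch below; it assumes Airflow 2.4+, and the DAG, task, and function names are hypothetical rather than taken from the client's platform.

# Minimal Airflow DAG sketch (DAG and task names are hypothetical; assumes Airflow 2.4+).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest_partner_feed():
    # Placeholder for an ingestion step pulling data from an external partner or API.
    pass

def run_dbt_models():
    # Placeholder; in practice this step might shell out to `dbt run` against Snowflake.
    pass

with DAG(
    dag_id="partner_feed_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_partner_feed", python_callable=ingest_partner_feed)
    transform = PythonOperator(task_id="run_dbt_models", python_callable=run_dbt_models)
    ingest >> transform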
Tech Stack & Skills
Core skills:
- Strong experience with DBT, Airflow, Snowflake, and Python
- Proven background in automated testing, CI/CD, and test-driven development
- Experience building and maintaining data pipelines and APIs in production environments
Nice to have:
- Knowledge of Snowflake infrastructure and data architecture design
- Experience using LLMs or MLOps frameworks for data extraction or model training
- Familiarity with cloud-agnostic deployments and version control best practices
What You’ll Bring
- A proactive, hands-on approach to engineering challenges
- A passion for data quality, scalability, and performance
- The ability to influence best practices and introduce modern standards across a data estate
- Strong problem-solving skills and the confidence to work across multiple complex data sources
Why Join?
This is an opportunity to help shape the data foundations of a high-impact healthcare technology business – one that’s actively exploring the intersection of data engineering, MLOps, and AI.
You’ll have ownership of end-to-end data workflows, work with a world-class tech stack, and join a forward-thinking team that values automation, collaboration, and innovation.