New jobs page

Latest Jobs

Salary

€50000 - €60000 per annum

Location

Paris, Île-de-France

Description

This fast-growing marketing consultancy is looking for a Data Engineer to join its team.

Salary

US$150000 - US$160000 per annum

Location

New York

Description

I'm seeking a Senior Full-Stack Engineer who is strong on the back end, likes to lead, and wants to make a big impact.

Salary

£35000 - £70000 per annum

Location

London

Description

This leading InsurTech puts customer experience at the centre of its priorities. They are looking to deliver data as efficiently as possible to their customers.

Salary

£100000 - £120000 per annum

Location

London

Description

A unique Head of DevOps role at a data-driven consultancy, where you will shape their environment.

Salary

£71000 - £700000 per annum + Additional Benefits

Location

London

Description

Join a global brand looking for a hands-on Java Developer for a growing team, working on applications that reach over 7 million customers across the bank.

Salary

€640 - €720 per day

Location

Rotterdam, South Holland

Description

A Data and Analytics consultancy is seeking a BI Consultant to work on exciting projects for its high-profile clients.

Salary

€60000 - €75000 per annum

Location

Paris, Île-de-France

Description

This company, which develops a high-volume e-commerce platform, is looking for a DevOps Engineer to expand its tech expertise.

Salary

Up to £400 per day

Location

London

Description

A chance to work for a leading e-commerce company in a fast-paced environment on ML models.

Salary

£70000 - £75000 per annum

Location

London

Description

A new role for a Senior Data Engineer within an international media company based in London.

Salary

£80000 - £85000 per annum

Location

London

Description

Join a leading fintech as a Cloud Engineer, where you will be responsible for designing, building and managing automated software deployment solutions.

Salary

£45000 - £65000 per annum + Additional Benefits

Location

Leeds, West Yorkshire

Description

This is your chance to join a growing challenger bank that is looking for an enthusiastic Report Writer for its development team!

Harnham blog & news

With over 10 years' experience working solely in the Data & Analytics sector, our consultants are able to offer detailed insights into the industry.

Visit our Blogs & News portal or check out our recent posts below.

Defragmenting Data Analytics

This week's guest blog is written by Moray Barclay.

Around 20 years ago I was showing some draft business plans with cashflow projections to my new boss. His name was Marc Destrée and I concluded by saying I'd like to get the finance department involved. "No", Marc replied. He paused for several seconds, looked up from his desk, and explained: "Do the internal rate of return. Then we discuss. Then we give it to finance."

He was right of course, for three reasons which together represent best practice. Firstly, it cemented the separate accountabilities between the different job functions responsible for the business case and financial governance. Secondly, there were no technical barriers to separating the "cashflow creation process" and the "P&L creation process", as everyone in the organisation used the same product: Excel. Thirdly, it assigned the right skills to activities.

Today, organisations have no equivalent best practice upon which to build their data analytics capability. The lack of best practice is caused by fragmentation: fragmentation of job functions, fragmentation of products, and fragmentation of skills. This is not necessarily a bad thing: fragmentation drives innovation, and those organisations who get it right will gain huge competitive advantage. But the application of best practice guards against unnecessary fragmentation and hence unnecessary inefficiencies. So how could best practice be applied to an organisation's data analytics capability? In other words, how do we defragment data job functions, data products and data skills?

Defragmenting data job functions

A good starting point for understanding best practice for data job functions is the informative and well-written publication "The scientist, the engineer and the warehouse", authored by the highly respected Donald Farmer of TreeHive Strategy. He includes references to four job functions: (i) the data scientist, (ii) the data engineer, (iii) the business intelligence analyst and (iv) the departmental end user.

(i) The data scientist: The accountability of the data scientist is to build data science models, using their skills in maths and coding to solve business problems. In addition to using open source technologies such as Python and R, data scientists can and do use data science platforms such as Knime, which enable them to spend more time on maths and less time on coding - more on data science platforms later.

(ii) The data engineer: The accountability of the data engineer is to build robust and scalable data pipelines which automate the movement and transformation of data across the organisation's infrastructure, using their skills in database engineering, database integration, and a technical process called extract/transform/load (ETL) and its variants - more on ETL production platforms later, and a minimal sketch of the ETL pattern itself follows the next paragraph.

(iii) The business intelligence (BI) analyst: Donald Farmer's publication does not address the accountabilities of the BI analyst in any detail, because that is not its focus. Unlike the clearly defined roles of data scientists and data engineers, there are no best practice descriptions for the role of BI analyst. Typical accountabilities often include designing data visualisations from existing datasets, building these visualisations into reports or online dashboards and automating their production, and configuring end users to ensure they only have access to data that they are approved to see.
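
As an aside, the extract/transform/load process mentioned under the data engineer's role can be made concrete with a small example. The following is a minimal sketch in Python, not a production pipeline: the source file, the column names and the warehouse table are all invented for illustration, and a real pipeline would typically run on one of the ETL production platforms discussed later.

    import csv
    import sqlite3

    def extract(path):
        # Extract: read raw rows from a hypothetical source CSV file
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def transform(rows):
        # Transform: drop incomplete rows and normalise the revenue figure
        cleaned = []
        for row in rows:
            if not row.get("customer_id") or not row.get("revenue"):
                continue  # skip rows that cannot be loaded
            cleaned.append({"customer_id": row["customer_id"],
                            "revenue_eur": round(float(row["revenue"]), 2)})
        return cleaned

    def load(rows, db_path="warehouse.db"):
        # Load: write the cleaned rows into a warehouse table
        con = sqlite3.connect(db_path)
        con.execute("CREATE TABLE IF NOT EXISTS revenue "
                    "(customer_id TEXT, revenue_eur REAL)")
        con.executemany("INSERT INTO revenue VALUES (:customer_id, :revenue_eur)",
                        rows)
        con.commit()
        con.close()

    if __name__ == "__main__":
        load(transform(extract("raw_sales.csv")))  # hypothetical file name
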
Returning to the BI analyst: beyond these core accountabilities, BI analysts sometimes create entirely new datasets by building complex analytic models to add value to existing datasets, using either a suitable open source technology (such as Python, but used in a different way to data scientists) or a data analytic platform such as Alteryx, which enables the creation of code-free analytic models. One final point: a BI analyst might also build data science models, albeit typically more basic ones than those built by data scientists. BI analysts will inevitably become more like data scientists in the future, driven by their natural curiosity and ambitions, by vendors creating combined data science platforms and data analytic platforms, and by organisations wanting to benefit from the integration of similar functions.

(iv) The departmental end-user: A departmental end-user is generally the most data-centric person within a department: it might be a sales operations professional within a sales department, for example. I am told that when Excel was first introduced into organisations in the 1980s, there would be a "go-to Excel expert"; self-evidently, over time everyone learned how to use it. I was there when CRM systems like salesforce.com and Netsuite appeared 20 years later, and the same thing happened: initially there would be one or two pioneers, but eventually everyone learned to use it. The same democratisation is happening, and will continue to happen, with business intelligence. In the same way that CRM and Excel are used by everyone who needs to, soon anyone will be able to build their own data visualisations and reports to help identify and solve their own problems. In some organisations, such as BP, this is already well-established. And why stop there? If a departmental end-user can model different internal rates of return and create visualisations, then why should they not apply their own data science techniques to their own datasets? But this can only happen if the role of the BI analyst has an accountability for democratisation, in addition to those mentioned earlier.

In summary, the following is a list of best practice accountabilities for the BI analyst:

(1) Build and automate the initial set of business intelligence reports and visualisations.
(2) Create the data governance framework to enable self-service by departmental end-users.
(3) Act as the initial go-to business intelligence expert.
(4) Evangelise a data-driven culture and mentor those who want to become proficient in self-service.
(5) Deploy resources which over time make redundant the role of a go-to business intelligence expert.
(6) Over time, increase time devoted to creating innovative datasets by building complex analytic models which add value to existing datasets, using open source technologies and/or a data analytic platform.
(7) Work with the data science function in such a way that over time the data science function and the BI function can be merged.

The above best practice eventually results in the role of the BI analyst, or the BI analyst team, becoming redundant, much in the way that the role of a dedicated go-to Excel expert died out. As mentioned earlier, as BI analysts will move into data science, this should not result in people losing their jobs.

Defragmenting data products

Unlike open source technologies, there is a highly fragmented data product landscape. Products include data science platforms, data analytic platforms, platforms which are more visualisation-centric, and platforms which are more focused on data governance.
There are also ETL production platforms, which are in the domain of the data engineer but which include functionality to build some types of analytic models. Fragmented markets eventually consolidate. Even the three broadest cloud vendors, Amazon, Google and Microsoft, do not cover the entire landscape. For visualisation there is Quicksight, Data Studio and Power BI respectively, as well as competitive products, most obviously Tableau; for ETL production platforms there is Athena, Cloud Dataflow and Azure Data Factory, as well as competitive products such as Talend. But smaller vendors have the lead in data science platforms and data analytic platforms. The hiring by Microsoft of the Python inventor Guido van Rossum two months ago points to their ambitions in data science platforms and data analytic platforms. Market consolidation in 2021 seems inevitable, but the details of actual acquisitions are not obvious. After all, it was salesforce.com which bought Tableau in 2019: not Amazon, Google or Microsoft. Best practice for organisations is to consider possible vendor consolidation as part of their procurement process, because product fragmentation means there is a corresponding fragmentation of skills.

Defragmenting data skills

Fragmentation of data skills means that the market for jobs, particularly contract jobs, is less elastic than it could be. The fragmentation of skills is partly caused by the fragmentation of products and their associated education resources and certification. Vendors' product pricing typically falls into three categories:

(i) more expensive commercial products (c. £500 - £5000 per user per month) which include free online education resources and certification;
(ii) inexpensive commercial products (c. £5 - £50 per user per month) which usually require a corporate email address but have free online education resources and reasonably priced certification exam fees (c. £100 - £200); and
(iii) products which are normally expensive but have an inexpensive licensed version that cannot be used for commercial purposes, again including free online education resources and certification.

The latter approach is best practice for solving the fragmentation of skills, because the barriers to learning (i.e. high product cost or the need for a corporate email address) are removed. Best practice includes the Microstrategy Analyst Pass, which is available to anyone and costs $350 per year, including a non-commercial product licence, online education resources and access to certification exams.

University students (as well as self-educated hackers) learn open source technologies, and one would expect those skills to be sufficient for them to enter the workplace in any data analytics environment. Yet several vendors who provide the more expensive commercial products and do not have discounted licences for non-commercial purposes make one exception: universities. At face value this seems benign, or even generous. But it contributes to the inelasticity of the job market at graduate level, because an unintended consequence is that some graduate data analytics jobs require the graduate to be competent in a product before they have started work. Best practice is for organisations to employ graduates based on their skills in maths, statistics and open source technologies, not products.

In seeking corporate acquisitions, vendors might find that their customers value "education bundling" as much as "product bundling".
Customers who are happy to pick, for example, the best visualisation product and the best data storage product from different vendors might be more attracted to their people using a single education portal with the same certification process across all products. And if an organisation can allocate 100% of its education budget to a single vendor, then it will surely do so. Best practice is for vendors to consider the value of consolidating and standardising education resources, and not just products, when looking at corporate acquisitions.

Defragmenting data analytics

Implementing a best practice data analytics capability based on the principles of defragmentation has profound consequences for an organisation. It enables a much richer set of conversations than the one which took place 20 years ago. A young business development manager is showing some draft business plans to their new boss. They conclude by saying they'd like to get a data scientist involved. "No", the boss replies. He pauses for several seconds, looks up from his desk and explains: "Segment our customer base in different ways using different clustering techniques. Then run the cashflow scenarios. Then we discuss. Then we give it to data science."

You can view Moray's original article here. Moray Barclay is an experienced Data Analyst working in hands-on coding, Big Data analytics, cloud computing and consulting.
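
As a footnote to the closing anecdote, here is a minimal, hypothetical sketch in Python of "segment the customer base, then run the cashflow scenarios": it clusters customers with k-means and computes an internal rate of return per segment. The customer features, the choice of three segments and the toy cashflow model are all invented for illustration; a real analysis would be far more careful about feature engineering and scenario design.

    import numpy as np
    from sklearn.cluster import KMeans

    def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-6):
        # Internal rate of return: the discount rate at which the net
        # present value of the cashflows is zero (found by bisection,
        # assuming NPV is decreasing in the rate over [lo, hi])
        npv = lambda r: sum(cf / (1 + r) ** t for t, cf in enumerate(cashflows))
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if npv(mid) > 0:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    rng = np.random.default_rng(0)
    # Invented customer features: annual spend and tenure in years
    customers = rng.normal(loc=[1000, 3], scale=[300, 1], size=(200, 2))

    # Segment the customer base with k-means (three segments, also invented)
    segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(customers)

    for s in range(3):
        spend = customers[segments == s, 0].sum()
        # Toy cashflow scenario: invest half the segment's annual spend up
        # front, then receive 20% of that spend each year for five years
        cashflows = [-0.5 * spend] + [0.2 * spend] * 5
        print(f"segment {s}: {np.sum(segments == s)} customers, "
              f"IRR = {irr(cashflows):.1%}")
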

The Dialogue: Keeping Data Secure While Working Flexibly

Last week, Peter Schroeter and Ryan Collins, head of DevSecOps at Upvest and co-founder of RapidGroup, discussed how to keep data secure while working flexibly. Ryan brings to the table 12 years of experience, from Server Administrator to CTO roles, as well as experience as a Contractor. And as a business owner himself, he sees where the shortage falls and perhaps a way to fill the gap.

Security is in the Spotlight

Security is a priority for many businesses today. Avoiding negative PR has caused an internal shift in which companies take more care with their data. There is also a Catch-22: in order to provide higher levels of security, businesses are slowing down their developers and software engineers and taking more engineering time, which in turn costs more money. Security before cost is becoming the new reality. The tension between deployment, project run time and budgets is only part of the bigger picture. Compromise is key; follow these three tenets:

- Don't leave anything open to vulnerability.
- Focus on auditability.
- Offer more training for software engineers.

The Security-Focused Skills Gap

How can you push security forward in a meaningful way when you can't find the people you need with relevant experience? There's a big gap in the market right now for security-oriented DevOps engineers. The skillsets many businesses are looking for include:

- Google Cloud Platform, AWS and possibly Azure, with a modern suite of tools such as Terraform.
- The ability to float to the development side and work in SRE to ensure things are stable and scalable.

DevOps engineers with experience in the Go stack are especially hard to find. Companies want to go with what they know. However, there is a shift toward a more remote-friendly and contractor culture. When you can't find the talent you need, sometimes it's best to bite the bullet and consider a contractor.

The Case to Hire a Contractor

If this year has taught us anything, it's that we don't have to be confined to an office, or even to one location. And yes, while there is an element of risk to being a contractor, there are benefits to both sides. Contractors are compensated more highly because businesses have lower HR costs and less tax regulation, payroll and reporting to do. Though there is some risk, there is always work, because so many companies want to secure their data. There's always something to be optimised, always something which needs attention. You won't be without work for long if you have the skills.

Disconnect Between Capability and Desire in the Market

Some candidates have mentioned they have 60% of the skills required, but not enough project-based experience. How do you reconcile the disconnect? It's hard to specialise in DevOps, DevSecOps and similar roles because it's about automation: you have to be that "Swiss Army Knife"; you have to live in the middle. You have to know how to get into the code, and you have to know how to get in and do the CI workflows, etc. It's almost developer plus value-added skills, or security engineer plus value-added skills.

Remote Working Habits You Need Now

- Have a separate space set up for work.
- If you have a mix of people in the office and people working remotely, make sure they're communicating with each other.
- Hold rolling coffee breaks on Google Meet or something similar.
- Have a task management and time tracking system used by employees both on and offline.
- Build culture by picking two people at random and putting them together for half an hour to have a conversation that isn't about work (a minimal sketch of this follows the list).
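
The random pairing habit in the final bullet above is simple to automate. A minimal sketch in Python, with invented names:

    import random

    # Hypothetical roster; in practice this might come from an HR system
    team = ["Alice", "Bob", "Chen", "Dmitri", "Esther", "Farah"]

    def coffee_pairs(people):
        # Shuffle a copy of the roster and pair adjacent names;
        # with an odd headcount the last person sits this round out
        shuffled = random.sample(people, len(people))
        return list(zip(shuffled[::2], shuffled[1::2]))

    for a, b in coffee_pairs(team):
        print(f"{a} meets {b} for a half-hour non-work conversation")
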
Startup vs Legacy Hiring

Don't box yourself in, but understand that startups have a unique set of skills they need, most often without the budget to train someone. Legacy businesses, by contrast, more often than not have the budget to train someone in the skillset they need. However, it's important to note that the tech stack itself doesn't really change.

Best Practices for Async-Comm Teams

Remember that not everyone works in the same time zone. Don't expect an immediate answer, and if you need an immediate answer, pick up the phone. Make sure your team has remote tools such as Slack or Google Meet and is doing most of its communication this way, even if people are in an office. If you don't, your remote workers could be missing key information.

The Future of DevOps and Data Security

Many businesses may see a shift toward a more open working environment, which strikes a good balance between what works best for talent and what is good for productivity. Ultimately, we're all solving the same challenges. What is most important when it comes to keeping data secure? Getting the people, systems and education in place to do something in the first place. In other words, build from concept rather than moving too fast and breaking too many things.

How Can Prospective Candidates Prepare?

Focus on continuous learning and building out your skillset, for example in automation tools. If you really want to get involved with the security side and SRE, you have to get involved with the development side too, using modern tech stacks like Golang, Rust and, on the mobile side, Swift; there are always new pieces to the puzzle. There's room for all types of relationships: whether you're looking for a long-term role, a short-term role, or something in between, DevOps is a never-ending project, so continuous learning is key for both candidate and company.

You can watch the full conversation below:
