Computer Vision jobs

What We Do

We help the best talent in the Computer Vision market find rewarding careers.

Computer Vision jobs help companies make sense of their visual data. Acquiring, analysing and understanding images and video has become a rich source of insight for both established and emerging industries in the Data & Analytics space.

Our minds filter images every day, and with advances in IoT it was only a matter of time before industries from healthcare and fraud prevention to sports, media and even driverless cars entered the Computer Vision space.

For the latest in the Computer Vision space please see the jobs below!

Latest Jobs

Salary

£40000 - £50000 per annum + benefits + bonus

Location

London

Description

Brand-new opportunity, focusing on predicting customer behaviour by implementing advanced statistical modelling & state-of-the-art machine learning techniques

Salary

£35000 - £45000 per annum + benefits + bonus

Location

London

Description

This company is passionate about a future where its customer experience is better than any competitor's, creating data products to retain its 10 million customers.

Salary

£75000 - £85000 per annum + Benefits

Location

London

Description

Research position within a hedge fund in London - please apply here and send through your CV for more information!

Salary

US$150000 - US$180000 per year + Bonus, Stock, Benefits

Location

Woburn, Massachusetts

Description

Leading robotics business, building out a new R&D center in Boston, is looking for computer vision engineers to work with the R&D lead.

Salary

US$160000 - US$200000 per year + Equity, Bonus, Benefits

Location

Boston, Massachusetts

Description

A leader in robotics is building a new R&D lab in Boston and is looking for computer vision engineers specialising in SLAM to drive research.

Salary

US$120000 - US$140000 per year + Bonus, Stock, Benefits

Location

Akron, Ohio

Description

A rapidly growing Series C start-up with more than 70 employees and over $40 million in funding is growing its core machine vision team.

Harnham blog & news

With over 10 years' experience working solely in the Data & Analytics sector, our consultants are able to offer detailed insights into the industry.

Visit our Blogs & News portal or check out our recent posts below.

From Idea to Impact: How Charities Use Data

It's that time of year again. As the festive season draws near and we pull together wish lists, many of us also begin to think about how we can give back. Given that the UK spent over £7 billion this Black Friday and Cyber Monday weekend, it's not surprising that the idea of Giving Tuesday is becoming more and more popular. But with 160,000 registered charities in the UK alone, institutions are turning to data to find new ways to stand out and make a greater impact. Far from just running quarterly reports, charities are now utilising the insights they gain from data to inform their strategies, improve their services and plan for the future.

IDEAS

Given that not every charity is lucky enough to go viral with an Ice Bucket Challenge-style video, there is a need to find other ways to stand out in such a crowded market. As such, many are looking to the data they have collected to help create a strategy. Macmillan Cancer Support, one of the UK's biggest charities, wanted to see more success from one of their main fundraisers, 'The World's Biggest Coffee Morning'. The event, which sees volunteers hold coffee and cake-fuelled gatherings across the country, was revolutionised by data. By engaging with their database and researching what motivated fundraisers, they refocused their marketing around how the occasion could create an opportunity for people to meet up and chat, such as swapping 'send for your free fundraising pack' for 'order your free coffee morning kit'. Whilst these changes may seem superficial, they had a major impact, increasing funds raised from £15m to £20m.

Some brands have taken this idea even further, using Data & Analytics tools to engage with potential donors. Homelessness charity Cyrenians' data told them that there were a number of misconceptions about rough sleepers, including 15% of people believing that they were homeless by choice. To counter this, they created an AI chatbot, named Alex, that allowed users to ask questions they may not have been comfortable asking a real person.

Another charity using data tools to counter common misconceptions is the Dyslexia Association. Their Moment of Dyslexia campaign saw them utilise facial recognition technology; the longer a person looked at their digital poster, the more jumbled up the words and letters became. By harnessing both insights and the technology made possible by data, they were able to give people who previously didn't understand dyslexia an insight into what it is like.

INDIVIDUALS

A big issue facing a number of charities is trust. Following a series of recent scandals, the public are more sceptical than ever of how charities are run, and their use of data is no exception. This 'trust deficit' has resulted in a vast number of potential donors staying away, with recent research highlighting that only 11% of people are willing to share their data with a charity, even if it means a better service.

Whilst charities with effective Data Governance are able to use their vast amounts of data to enhance their work, those who mismanage it are likely to suffer. Following a cyber-attack that exposed the data of over 400,000 donors, the British and Foreign Bible Society were fined £100,000. As hackers were able to enter the network by exploiting a weak password, this serves as a timely reminder that our data needs not only to be clean, but secure.

Financial implications aside, improper data usage can also do irreversible damage to a charity's reputation. St Mungo's has faced criticism for passing information about migrant homeless people to the Home Office, putting them at risk of deportation. Whilst they were cleared of any wrongdoing by the ICO, this controversial use of data has had a negative impact on the charity's image. With a decline in the number of people donating to charity overall, anything that can put people off further is bad news.

IMPACT

Whilst there is more demand than ever for charities to share their impact data, there is also more opportunity. With Lord Gus O'Donnell urging charities to make data an 'organisation-wide priority', many are going beyond publishing annual reports and fully embracing a culture shift.

Youth charity Keyfund have been able to justify how they spend their funds based on their impact data. Having heard concerns from fundraisers regarding whether their leisure projects were effective, they looked at the data they had gathered from the 6,000 young people they were helping. What they found was that not only were their leisure projects effective, they had an even more positive impact than the alternatives, particularly for those from the most deprived areas. This allowed them to continue to support these programmes and even increase funding where necessary.

Going one step further is Street League, a charity that uses sports programmes to tackle youth unemployment. Rather than share their impact data in quarterly, or even annual, reports, they moved to real-time reporting. Interested parties can visit an 'Online Impact Dashboard' and see up-to-the-minute data about how the charity's work is impacting the lives of the people it is trying to help. This not only allows the most relevant data to be used strategically, but also supports the organisation holistically, gaining both donor attention and trust.

To stand out in the charity sector, institutions need to take advantage of data. Not only can this be used to generate campaigns and streamline services but, when used securely and transparently, it can help rebuild trust and offer a competitive edge.

If you want to make the world a better place by harnessing and analysing data, we may have a role for you. Take a look at our latest opportunities or get in touch with one of our expert consultants to see how we can help you.

Fighting Crime with Data: An Ethical Dilemma

Can you be guilty of a crime you've yet to commit? That's the premise of Steven Spielberg's 2002 sci-fi thriller 'Minority Report'. But could it actually be closer to reality than you think?

As technology has advanced, law enforcement has had to adapt. With criminals utilising increasingly sophisticated methods to achieve their goals, our police forces have had to continuously evolve their approach in order to keep up. New digital advances have refined crime-solving techniques to the point where they can even predict the likelihood of a specific crime occurring. But with our personal data at stake, where do we draw the line between privacy and public safety?

Caught on Camera

The digital transformation has led to many breakthroughs over the past few decades, from fingerprint analysis through to the advanced Machine Learning models now used to tackle Fraud and analyse Credit Risk. With an estimated one camera for every 14 individuals in the UK, CCTV coverage is particularly dense. And, with the introduction of AI technologies, their use in solving crimes is likely to increase even further.

IC Realtime's Ella uses Computer Vision to analyse what is happening within a video. With the ability to recognise thousands of natural language queries, Ella lets users search footage for exactly what they're after, from specific vehicles to clothes of a certain colour. With only the quality of CCTV cameras holding it back, we're likely to see technology like this become mainstream in the near future.

Some more widespread technologies, however, are already playing their part in solving crimes. Detectives are currently seeking audio recordings from an Amazon Echo thought to be active during an alleged murder. However, as with previous requests for encrypted phone data, debate continues around what duty tech companies have to their customers' privacy.

Hotspots and Hunches

Whilst Big Data has been used to help solve crime for a while, we've only seen it begin to play a preventive role over the past few years. By using Predictive Analytics tools such as HunchLab to counter crime, law enforcement services can:

- Direct resources to crime hotspots where they are most needed.
- Produce statistical evidence that can be shared with local and national-level politicians to help inform and shape policy.
- Make informed requests for additional funding where necessary.

Research has shown that, in the UK, these tools have been able to predict crime around ten times more accurately than the police. However, above and beyond the geographical and socioeconomic trends that define these predictions, advances in AI have progressed things even further.

Often, after a mass shooting, it is found that the perpetrators had spoken about their planned attack on social media. The social media landscape is far too big for authorities to monitor everyone, and just scanning for keywords can be misleading. However, IBM's Watson can understand the sentiment of a post. This huge leap forward could be the answer to the sincere, and fair, policing of social media that we've yet to see.

Man vs Machine

Whilst our social media posts may be in the public domain, the question remains of how much of our data we are willing to share in the name of public safety. There is no doubt that advances in technology have left us vulnerable to new types of crime, from major data breaches to new ways of cheating the taxman. So, there is an argument to be had that we need to surrender some privacy in order to protect ourselves as well as others. But who do we trust with that data?

Humans are all susceptible to bias, and AI inherits the biases of its creators. Take a program like Boulder, a Santa-esque prototype that analyses the behaviour of people in banks, determining who is 'good' and who is 'bad'. Whilst it can learn signs of what to look for, it is also making decisions based on how it has been taught 'bad' people might look or act. As such, is it any more trustworthy than an experienced security guard?

If we ignore human bias, do we trust emotionless machines to make truly informed decisions? A study that applied Machine Learning to bail decisions found that the technology's recommendations would have resulted in 50% fewer reoffenders than the original judges' decisions. However, whilst the evidence suggests that this may be the way forward, it is unlikely that society will accept such an important, life-changing decision being made by a machine alone.

There is no black and white when it comes to how we use data to prevent and solve crime. As a society, we are continuously pushing the boundaries and determining how much technology should impact the way we govern ourselves. If you can balance ethics with the evolution of technology, we may have a role for you. Take a look at our latest roles or contact one of our expert consultants to find out how we can help you.
