With over 10 years' experience working solely in the Data & Analytics sector, our consultants are able to offer detailed insights into the industry.
Visit our Blogs & News portal or check out our recent posts below.
Ross joined Harnham as a graduate in 2013 as part of the Data and Technology team. He helped specialise and grow the team, and is now a Manager leading Big Data Engineering, Business Intelligence, Data Governance, and Software Engineering recruitment. He combines a passion for data technology with a high level of service, aiming to make your recruitment process as easy, fast, and efficient as possible.
The New Year, and the new decade, have arrived. The past ten years saw Data move to the forefront of public conversation following a number of big leaks and controversies. But, realistically, the impact of easy access to a surplus of Big Data has only just begun to be felt. Whilst many are predicting what the world will look like by the end of the 2020s, discussing how far AI will have come and the consequences of automation on the job market, we've decided to look a little closer to home. With that in mind, here are a few trends we expect to see over the next year.

ACCESS TO DATA SCIENCE WILL BECOME EASIER

Data Scientists have traditionally been limited in number: a key group of individuals with PhDs, honed skills, and a vast understanding of Data & Analytics. With the advent of a number of new tools, however, more and more users will be able to perform Data Science tasks. Many of the more sophisticated processes are still far from being replicated, so those currently working in this area shouldn't be concerned. In fact, the more standard tasks that can be automated, the more time Data Scientists will have to experiment and innovate.

THE 5G EXPLOSION

Whilst there may have been a soft launch last year, the introduction of 5G will have a much more significant impact over the next year. With a flurry of compatible mobile devices already available, and many more expected to come, we're likely to see 5G networks hit the mainstream. In the world of Data, this is likely to have a huge impact on how businesses use the Cloud. Indeed, with mobile upload and download speeds set to be so fast, there is a chance that an online middle-system may no longer be as necessary as it once was.

THE RISE OF THE EDGE

On the subject of the Cloud, it's worth talking about Edge Computing. No, this has nothing to do with the pizza or the guitarist. Edge Computing has been a trend for a few years now, but, following an announcement from AWS, it looks set to become much more prevalent in 2020.
Concerned with moving processing away from the Cloud and closer to the end-user, Edge Computing is already beginning to have an impact across a number of industries.

A NEED FOR AUGMENTED ANALYTICS

It's no surprise that the use of AI, Machine Learning, and NLP is set to increase over the next year, so it shouldn't come as a shock that Augmented Analytics is set to become more popular too. The opportunities, and extra time, offered by the automated decision-making of Augmented Analytics are the perfect fit for the increasing number of organisations who find themselves with more Data than processing capability.

DATA WILL HELP FIGHT THE CLIMATE CRISIS

Whilst there is a fair argument that the amount of processing required by the world of Data & Analytics is detrimental to the climate, the benefits its insights can offer are likely to outweigh any negative impact. Indeed, the UK government is already using satellite Data to help reduce the impact of flooding, whilst Google's EIE is being used to map carbon emissions with a view to better planning future cities. Given the recent, and tragic, bushfires in Australia, this is going to become an even more pressing issue over the next 12 months.

If you want to be at the forefront of the latest innovations in Data & Analytics, we may have a role for you. Take a look at our latest opportunities, or get in touch with one of our expert consultants to find out how we can help you.
09. January 2020
Increasingly, I speak to managers who are adopting Big Data tools and developing PoCs to prove how they can make use of them. Just last week I spoke to a data architect who mentioned that if he didn't get exposure to Big Data tech sooner rather than later, his current RDBMS skills might become redundant within the next few years. While that is likely an exaggeration, it is certainly an interesting point. Companies that would never previously have had the capability to interpret 'Big Data' are now exploring a variety of NoSQL platforms. In particular, the massive performance benefits gained from Spark and real-time/streaming tools have opened up a whole new world beyond just MapReduce. I don't claim to be a data engineer, but as a recruiter for this sector, I spend all day, every day interacting with Big Data developers, architects, and managers (as well as keeping a close eye on the latest Apache incubator projects). As a result, I have seen some recurring themes come to the fore when companies look to create and build their Big Data teams.

Candidate demand

The demand for Big Data professionals is very much a present-day issue: the data that companies have grand plans for is waiting for the right developer to use the best tech to extract valuable insights from it. The best candidates receive massive interest and often gain multiple offers from a range of companies. Your business is no longer just competing with large corporations such as Facebook, Twitter, or Yahoo; startups and SMEs are also vying for the best candidates. Candidates are seeing pay rises twice the normal rate, as illustrated in our salary guide.

Candidate shortage

The number of candidates with hands-on, production-level Big Data experience is incredibly limited. We go to great lengths to find the candidates who can add real value to companies.
The growth and exciting future of the Big Data industry has led to increased interest in Big Data jobs, particularly from those with RDBMS or software engineering backgrounds. This leaves the industry in a difficult predicament: high demand + low supply = massive competition. There are countless examples of companies that have failed to recruit a Big Data team after a year of looking.

Competition to get ahead and stand out

Planning – Companies need a data road map detailing their future plans. Candidates want to know clearly what they are getting into and what to expect from a job.

Innovation – Why get stuck on batch processing? The most exciting positions, the ones candidates love, are in data innovation teams, playing with real-time/streaming tech and new languages.

Personal development, growth and training – With the Data Science market experiencing similar growth, many Big Data engineers are looking for a job that not only offers the chance to work with Machine Learning and similar fields, but also training and mentoring towards clear career progression as standard.

Speed – The length of the interview process is often seen as a reflection of the amount of red tape developers have to go through to get a job. The longer and more convoluted the process, the more put off some people may be.

Complacency – Don't rest on your laurels. It's unlikely that you'll get tens of CVs when you are looking to fill a data role, so when you find a candidate you like, move swiftly to show your interest; quality candidates don't come around often.

Implementing these small but effective improvements to your recruitment process, and to how you develop data talent, will see you create a team that succeeds in this ever more digital analytics landscape. Companies who don't create and nurture strong, dynamic teams will fall by the wayside. It's Harnham's job to help you achieve this goal. Get in touch and we'll tell you how. T: (020) 8408 6070 E: email@example.com
06. June 2017
With all the talk of Big Data and Data Science being able to predict what colour shirt I will buy in four years' time (probably white or blue, for those who don't know me!), effective Business Intelligence is sometimes passed by or considered old news. The reality is that companies are realising they can get much more from their Business Intelligence and are changing their strategies to deliver interactive, insight-driven, and visualised reports. Not every data-driven decision needs Machine Learning algorithms behind it, and quality Business Intelligence enables all managers to be effective decision-makers. These strategies are creating some obvious trends in the market, resulting in a change in expectations when hiring a BI Manager.

Key BI Trends

Data Visualisation – Companies of all sizes are implementing QlikView and Tableau (amongst many other tools) to create attractive, interactive visualisations that harness intelligence in a way that will capture attention in a presentation.

Insight Driven – A BI professional can't simply develop automated reports anymore. Analysts are often required to offer suggestions for business change and present insight to decision-makers.

Hands-on Management – BI Managers, and even Heads of Business Intelligence, are expected to keep coding well into their management years, on the logic that problems are spotted quicker from the trenches, coupled with strategic and line-management work.

Data Ambassadors – BI professionals are becoming door-to-door data sellers, coaching teams across a business on the benefits of using data to optimise their decisions and save or bring in more money.

Heads are in the Cloud – Companies are using cloud-based data warehouses such as Redshift to save on storage costs, whilst creating a centralised data warehouse for BI.

Alternative Data Sources – Companies are looking to use web and social media data, alongside numerous other sources, to generate deep insights for managers.
The BI Manager Effect

I am completely sold that all of these features represent the future of Business Intelligence. The few companies doing all of the above well enough are doing advanced work in the area, and they will be leveraging big commercial gains from their Business Intelligence teams. The problem is that only a few businesses are doing all of the above, so only a handful of professionals have the relevant experience, and as a result they expect top dollar for bringing all of those skills. It is therefore prudent to be flexible with your hiring requirements. Look for a bright, passionate candidate who can readily grasp the shift in Business Intelligence trends and is keen to plug skills gaps. An enthusiastic Business Intelligence professional will get up to speed with whatever they were missing. Don't be too quick to dismiss those who are not ready-made BI Managers on paper.

Message to Candidates

For all aspirational or existing Business Intelligence managers and leaders, I would advise you to stay hands-on for as long as possible. I know some of you dream of never seeing a line of SQL again; however, the trend towards hands-on Business Intelligence management positions means that keeping your tech skills sharp will really keep your options open moving forward. It would be great to hear your experiences, so please feel free to comment below on the trends you see in your business. Have you needed to remain hands-on as you progress within your career? Or are you looking for a multi-skilled BI Manager, and is it proving hard?
06. July 2016
Introducing New Practices

The introduction of a new methodology into a business structure should, in theory at least, be about the simple integration of a new practice into an existing framework. The truth is that this is rarely simple. Over the weeks, months, and sometimes years it takes to develop something such as an information and data distribution system, a business will naturally accumulate a set of practice-based functionalities. Commonly (despite how it can feel at the time) these structures are in fact perfectly sound and probably just require minor adjustments to fit new circumstances. At other times, due to internal or external pressures such as changes in the market, technological advances, or organic development and growth, the systems can need more radical change. In either case, issues can arise around the distribution of responsibility for the specifics of the infrastructure.

The introduction of the position of Chief Data Officer (or CDO) in a large business environment is very likely to be a catalyst for change. As discussed in a previous article, the role of the CDO varies, but it does have a very specific set of common elements. One certain commonality is the need for the CDO to oversee the collation of systems and data flow processes into the larger 'whole' of the business. This will very likely require a redistribution of responsibility. One potential area of contention, for example, could be the methods and aims of the data engineers and associated colleagues compared to those of the technologists and hardware-related areas. Clearly these are intimately linked, in that, to state the obvious, they have a symbiotic relationship with the flow of data from storage to user; but they are often different departments with disparate operational procedures and methodologies. To the CDO, however, they will need to be seen as operational components of the wider system.
That means specific lines will need to be drawn to ensure efficient use of resources and the effective utilisation of the data. In short, the poor CDO may find themselves in the unenviable position of trying to disentangle a complex weave of different threads of operations. Once this is done, they can then set about the equally arduous task of re-weaving them into a new structure.

Reweaving The Threads

Clearly a moment's thought tells us that there is not going to be a quick fix for this, and no one-size-fits-all plan is available. For the new CDO, one of the first tasks will be to clearly understand the roles and responsibilities of the team; with long-term and deeply embedded working practices, this is not likely to be a matter of reading the job descriptions. Once the system is understood, the redistribution of workflow and responsibility can commence. Of course, the CDO will be responsible for more than the mechanics of the situation. All business has people at the heart of its operations, and a good CDO will understand this. Team-player and leadership qualities may well be just as important to the new CDO as technical and managerial skills when it comes to forging their position in the structure.
16. January 2015
Gathering and analyzing millions of data points can be difficult, but big data can tell business where to focus its efforts. Advances in data gathering, computing power and connectivity mean that we have more information than ever before at our fingertips. IBM estimates that by 2020 there will be 300 times more information in the world than there was in 2005 – a total of 43tn gigabytes. And this data is being put to good use. Increasingly we hear how properly understanding data leads to positive results, whether this is Moneyball in sport or Nate Silver's predictions of the US elections. We are only just starting to scratch the surface of how businesses can process, analyze and otherwise make use of all this extra information to help them make money, save money and become more sustainable.

But when it comes to sustainability the great thing about big data is that it is unlocking the ability of businesses to understand and act on what are typically their biggest environmental impacts – the ones outside their control. For pharmaceutical giant GSK only 20% of its carbon footprint is within its own boundaries: 80% comes from indirect emissions, with 40% of that coming from the use of its products such as propellant inhalers.

Big data's potential big impact on sustainability hinges on three simple facts:

• taking meaningful action on corporate sustainability requires an understanding of all the impacts that the business world and the natural world have on each other;

• the business world is a very complicated place, with lots of interactions between consumers and companies and suppliers and markets;

• the natural world is even more complicated, with lots of interactions between people and resources and ecosystems and climate.

Until relatively recently businesses struggled to get a full picture of the impact of their own operations.
The information required to get an accurate understanding of even something relatively simple such as energy consumption was kept in separate documents, in varying formats, and across multiple sites. But now leading businesses such as Nike and Ikea are trying to understand the entire end-to-end impact of their businesses, throughout the value chain. This includes looking at what's happening outside the boundaries of the business, including raw materials, suppliers, employees traveling, customers using products, how waste is dealt with, and investments that have been made. Businesses know that measurement is one of the keys to management. Collecting and understanding data about how an organization operates leads to knowledge that can improve decision making, refine goals and focus efforts. When the Carbon Trust worked with BT, we found that emissions outside its direct control accounted for 92% of the total. To add to the complexity, two thirds of those emissions were from BT's supply chain, which involves 17,000 suppliers around the world providing products and services worth £9.4bn. Big data has the power to transform how large businesses – the ones with biggest environmental impacts, but also access to large volumes of information – can take action on sustainability. A drive for data collection can also incentivize smaller suppliers to be more responsible in their own operations, creating a domino effect. Companies such as Hitachi are already providing an online platform for suppliers to submit how they meet sustainability criteria. Providing quality data in the right format is becoming an increasingly important factor in whether a supplier is chosen. The worlds of data collection and analysis, sophisticated business software applications, and accepted measurement standards are coalescing to help drive transparent and improved sustainability performance for companies and their supply chains. 
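Shares like BT's are easy to sanity-check with a few lines of arithmetic. The sketch below uses only the percentages quoted above (92% of emissions outside direct control, two thirds of those from the supply chain); normalising the total footprint to 100 units is an illustrative assumption, not Carbon Trust methodology:

```python
# Normalise BT's total footprint to 100 units so shares read as percentages.
total_emissions = 100.0

# 92% of emissions sat outside BT's direct control.
indirect = 0.92 * total_emissions

# Two thirds of those indirect emissions came from the supply chain.
supply_chain = indirect * 2 / 3

direct = total_emissions - indirect

print(f"direct: {direct:.1f}, supply chain: {supply_chain:.1f}")
```

So roughly 61 of every 100 units of emissions trace back to the 17,000-supplier chain, which is why supplier data collection matters so much.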
Measuring and understanding how doing business really does affect the natural world will open up new opportunities for bringing sustainability inside an organization: creating change, cutting costs and boosting long-term profitability in a resource-constrained world. It isn't easy. There are challenges around gathering external data, as well as in analyzing and interpreting hundreds of thousands, or millions, of data points. But we are already seeing the pioneers in sustainability leading the way, bringing suppliers and customers along for the journey.

Author John Hsu is an expert in sustainability data at the Carbon Trust.
18. February 2014
The BBC are set to splash out £18 MILLION of licence fee cash... on Big Data.

The Beeb are planning to throw up to £18m worth of licence fee payers' cash at data analytics suppliers to work their dark arts for the corporation. A tender for a three-year Next Generation Digital Analytics Services agreement appeared in the Official Journal of the European Union at the weekend.

"The BBC is looking for a framework of suppliers to provide web and data analytics tools and services, and associated activities," the tender stated.

The framework will be split into two lots. The first is a single-supplier lot for a core analytics platform designed to provide "insight" into web reporting, advanced predictive analytics and regulatory reporting requirements. This is worth between £6.3m and £9.9m for the supplier that wins the only seat on the framework. The second lot is a multi-supplier framework which covers enhanced reporting and analytics tools, worth between £5.5m and £7.92m. Up to 24 suppliers can make it onto this agreement. Lot Two includes multivariate testing, multi-attribute segmentation, models to drive algorithmic content recommendations, visualization and social media analytics.

Anyone wanting to chance their luck should note the closing date for submissions is noon on 18 November.

Venerable analyst house TechMarketView described the deal as "eye-catching" because at up to £18m, the data analytics gig was a "medium-sized project" in a sector where "advanced analytics projects tend to be small or exploratory". "We expect all the usual suspects will be interested in the analytics tender but will be insightful to see who they [the BBC] pull in from the specialist analytics vendor community. This is a valuable opportunity for the 'little guys'," said TMV's Angela Eager.
08. November 2013
Google is calling the next version of its mobile operating system Android KitKat. The news comes as a surprise as the firm had previously indicated version 4.4 of the OS would be Key Lime Pie. The decision to brand the software with the name of Nestle's chocolate bar is likely to be seen as a marketing coup for the Swiss food and beverage maker. However, Google told the BBC that it had come up with the idea and that neither side was paying the other.

"This is not a money-changing-hands kind of deal," John Lagerling, director of Android global partnerships, told the BBC. Instead, he said, the idea was to do something "fun and unexpected".

However, one branding expert warned there were potential pitfalls to such a deal. "If your brand is hooked up with another, you inevitably become associated with that other brand, for good or ill," said Simon Myers, a partner at the consultancy Prophet. "If that brand or business has some reputation issues that emerge, it would be naive to think as a brand owner that your good name, your brand equity, would not be affected."

Nestle has faced criticism in the past for the way it promoted powdered baby milk in the developing world. It has also had to recall numerous products, most recently bags of dog food following a salmonella scare in the US. Google has also attracted controversy of its own, including a recent report from the US government suggesting that Android attracts more malware attacks than any other mobile OS. Google also announced that it has now recorded the system being activated on a smartphone or other device more than one billion times.

Cold call

Since 2009, Google and its partners in the Open Handset Alliance have codenamed each Android release after a type of treat, with major updates progressing a letter along the alphabet. Previous versions have been called Cupcake, Donut, Eclair, Froyo (short for frozen yoghurt), Gingerbread, Honeycomb, Ice Cream Sandwich and Jelly Bean.
Although the developers had referred to the forthcoming version as KLP in internal documents, Mr Lagerling said the team decided late last year to opt instead for the chocolate bar. "We realized that very few people actually know the taste of a key lime pie," he explained. "One of the snacks that we keep in our kitchen for late-night coding are KitKats. And someone said: 'Hey, why don't we call the release KitKat?' We didn't even know which company controlled the name, and we thought that [the choice] would be difficult. But then we thought well why not, and we decided to reach out to the Nestle folks."

Mr Lagerling said he had made a "cold call" to the switchboard of Nestle's UK advertising agency at the end of November to propose the tie-up. The next day, the Swiss firm invited him to take part in a conference call. Nestle confirmed the deal just 24 hours later. "Very frankly, we decided within an hour to say let's do it," Patrice Bula, Nestle's marketing chief, told the BBC.

Mr Bula acknowledges there were risks involved - for example, if the new OS proved to be crash-prone or particularly vulnerable to malware it could cause collateral damage to KitKat's brand. "Maybe I'll be fired," he joked. "When you try to lead a new way of communicating and profiling a brand you always have a higher risk than doing something much more traditional. You can go round the swimming pool 10 times wondering if the water is cold or hot or you say: 'Let's jump.'"

Secret story

Executives from the two firms met face to face at a secret event held at Mobile World Congress in Barcelona in February to finalize the details. To promote the alliance, Nestle now plans to deliver more than 50 million chocolate bars featuring the Android mascot to shops in 19 markets, including the UK, US, Brazil, India, Japan and Russia. The packaging had to be produced in advance over the past two months.
But despite the scale of the operation, the two firms managed to keep the story a secret. "Keeping it confidential was paramount to Google's strategy," acknowledges Mr Bula. "Absolutely nothing leaked." The Android team also took steps to preserve the element of surprise, notifying only a "tight team" about the decision. "We kept calling the name Key Lime Pie internally and even when we referred to it with partners," revealed Mr Lagerling. "If we had said, 'The K release is, by the way, secret', then people would have racked their minds trying to work out what it was going to be." Most Google employees will have learned of the news only when a statue of the Android mascot made out of KitKats was unveiled at the firm's Mountain View, California, campus. "A lot of things, especially in tech nowadays, become public before they are officially supposed to be," said Mr Lagerling. "I think it's going to be a big surprise for a lot of people, including Googlers."
05. September 2013
U.S. companies will need 1.9 million more techies by 2015, says one expert. Here are the top 10 tech skills employers are seeking.

With data analytics now one of the fastest-growing fields in IT, it stands to reason that data scientists are in demand. That's great for people with the requisite skills. The problem, according to Peter Sondergaard, a senior vice president at IT research firm Gartner, is that there aren't enough of them. Of the almost 2 million openings he expects over the next three years in the U.S. alone (4 million worldwide), Sondergaard predicts that only about one-third will be filled, making analytics software whizzes "a scarce, valuable commodity" that employers will have to fight to hire and retain.

Not all analytics talent is on the tech side. People who can translate mathematical models into English are needed, too. Among IT mavens, though, it's clear which skills are shaping up to be the hottest in Big Data, says a new report from job site Dice.com. Far and away the leader on the list is Hadoop. Originally developed in 2005, Hadoop is a Java-based open-source platform that was named after the stuffed toy elephant of one inventor's young son. Hadoop powers Yahoo (YHOO) web searches, and Amazon (AMZN), eBay (EBAY), Google (GOOG), LinkedIn (LNKD), Twitter, and lots of other companies use it too.

Dice.com's ranking of the top 10 tech skills Big Data needs now:

1. Hadoop plus Java — "the Number One combination by a large margin," notes the report, adding that's "not surprising given that [Hadoop] is a Java-based framework."
2. Developer
3. NoSQL
4. Map Reduce
5. Big Data
6. Pig
7. Linux
8. Python
9. Hive
10. Scala

The shortage of professionals with experience in Hadoop and NoSQL has already given rise to higher pay for qualified hires, topping $100,000 on average, the report says. But the real winner could be the U.S. economy as a whole.
Anticipating a multiplier effect like that of the pre-recession auto industry, Peter Sondergaard predicts that "every Big Data-related role in the U.S. will create employment for three people outside of IT. So over the next three years, a total of 6 million jobs will be generated by the information economy." Here's hoping he's right.
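For readers wondering what the MapReduce skill near the top of that list actually involves: the pattern underneath Hadoop can be sketched in a few lines of plain Python. This is a hedged illustration of its three steps (map, shuffle/group, reduce) using the classic word count, not Hadoop's actual Java API:

```python
from collections import defaultdict

def map_phase(documents):
    # Map: emit one ("word", 1) pair per word occurrence.
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle: group all emitted values by key, as Hadoop does
    # between the map and reduce stages.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: collapse each key's values to a single result.
    return {key: sum(values) for key, values in grouped.items()}

docs = ["big data needs big teams", "data teams need data skills"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts)
```

In a real Hadoop job the map and reduce functions run on different machines and the shuffle moves data across a cluster; the shape of the program, however, is essentially the same.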
31. May 2013