With over 10 years' experience working solely in the Data & Analytics sector, our consultants are able to offer detailed insights into the industry.
Visit our Blogs & News portal or check out our recent posts below.
Sandra is a Senior Consultant who began her career in recruitment in 2016. She has a unique background: having graduated from RADA, she spent several years as a professional actress before making the move to recruitment. She has established an acute and focused understanding of the market and has subsequently built strong relationships with hiring managers across a variety of sectors. She exclusively covers junior to mid-level roles within BI Analysis and Data Warehousing.
As the Data & Analytics marketplace continues to grow, what is it that makes a candidate stand out? More and more, employers are on the lookout for people with both hard and soft skills: those who can not only interpret data, but also translate and relay that data to key stakeholders. To convey data in a cohesive, informative and memorable way, we need to think beyond making something aesthetically pleasing. People connect with stories, be they fictional, personal, historical or otherwise. By utilising universal storytelling techniques, we can share data in a way that people intuitively connect with. Here are our Top Five Tips for telling stories with data:

Start With The Structure

Structures are the essential foundations that sit under any good story. Without a solid structure, the story we are telling can become confusing, distracting and unfocused. When presenting data, it is essential that we work to a clear structure to ensure that we can be understood. All stories feature three things: a beginning, a middle and an end. A story told through data is no different:

The Beginning: What is the question that has been asked? What are we trying to learn from this information?
The Middle: The data itself. What the numbers say.
The End: What insights can we gain from the data? What is the data really telling us?

By sticking to this structure, we can ensure that each piece of information gathered is explained with the context required to convey as much meaning as possible. When looking at several pieces of data, it makes sense to think of them as chapters. Each may tell its own smaller story, but in the wider context of an overall narrative they need to be in the correct order to make sense and not leave anyone confused.

Speak To Your Audience

When presenting data, it is crucial to remember who your audience is.
Whether they're a novice, an expert, or the chairman of your company, each individual has their own vested interest in what you are showing them. As a Data and Analytics professional, your job is to serve as curator, creating a story that feels tailored to each unique person. To help understand how your audience might be best served by your story, it's helpful to ask yourself the following questions:

What information are they most interested in?
What information do they need to know the most?
What is their daily routine? Is this their big meeting of the day, or one of several back-to-back?
What actions will they take off the back of your insights?

By asking these questions, you should be able to curate your data in a way that is meaningful for your audience.

Find Your Characters

The majority of data is based upon an initial human interaction. From a video viewed to a product purchased, it's easy to forget that at the end of the line is a real human being. By bringing this to the forefront of your insights, you create a compelling new way to connect with your audience. Consider what this data actually meant when it was first gathered: who was that person, and what does this information say about them? If you are able to create 'personas' or 'characters' from this data, you can present something tangible that people can connect with and, potentially, even empathise with. Even if you use existing data to reference a personal experience, you're adding a sense of palpability that gives your insights depth.

Paint The Right Picture

As Data Visualisers will tell you, the most elaborate visual is not always the most appropriate way to convey your insights. The key is always to consider what tells the story best. A heat map may be perfect for telling a story of geographical differences, but is likely to make no sense when conveying a customer journey.
The beauty of utilising different visual techniques is that they allow you to create an emotional impact with data, fully emphasising the meaning of your insights. David McCandless showcases how data can be visualised in various dynamic ways to create as much meaning as possible.

Start Big, Get Smaller

Data presentations face the difficult challenge of needing to be both accessible and detailed. By ensuring that you have the big picture covered with enough context, you can make sure that everyone gets the headline takeaway. Following this, you can highlight further insights that reveal more information for those who need to do a deeper dive. Much like in a good story, while you may understand the overall narrative the first time round, looking closer and revisiting certain parts should reveal more insights and nuances.

If you have the skills to turn Data & Analytics insights into compelling stories, then we may have a role for you. Register with us or search the hundreds of jobs available on our site.
06. September 2018
Information management is a seemingly overlooked area of data management, but arguably one of the most important that I recruit for. It isn't a trending job title like data science, but it is key to data governance and to the accuracy that all the companies I work with crave when they are hiring their next analytics champion.

What does the market say?

The market has sharply woken up to information management because of the benefits it presents to organisations, such as:

Accuracy and integrity of data: It goes without saying that any business will want to ensure the insights derived from its data are accurate. This means analysing how raw data is handled by the business from the moment it is first collected, to ensure its integrity.

Cost: With more accurate data, a business is able to be more targeted and consequently more effective in outbound activities such as direct marketing and strategic risk prevention. Properly handled data is also pivotal to financial reporting: if a business can interpret its data correctly, it decreases the likelihood of being slapped with a fine from the FCA for non-compliance, which is not only costly for businesses and individuals but also damaging to reputation.

As a recruiter, I know that professionals in this area are in high demand because it is a relatively new area of 'data'. There aren't masses of experienced information management professionals available, and by the nature of supply and demand, skilled professionals are able to charge a contract rate of anywhere between £500 and £1,200 per day, and £60-100k in the permanent market.

In addition to this, the marketplace is becoming populated with Data Quality Analysts working as Information Management specialists.
Not all of these will have the experience or knowledge to effectively review business processes and ensure they are compliant with regulations, potentially exposing a business to the same risks as having no information management department at all!

Some organisations have taken a new approach to this problem by training up junior candidates with information management and data governance backgrounds. This approach does beg the question: if an organisation brings in professionals to ensure information is managed correctly, what knowledge can those juniors be expected to teach, and what must they still be taught?

I'd be interested to hear your thoughts on the topic, and about your experience of being part of an information management team or of creating one. What challenges have you faced?
19. August 2016
Big data was seen as one of the biggest buzzwords of 2013, when companies often used the term inappropriately and in the wrong context. This year, people will finally understand what it means.

One could look back at 2013 and consider it the breakthrough year for big data, not in terms of innovation but rather in awareness. The increasing interest in big data meant it received more mainstream attention than ever before. Indeed, the likes of Google, IBM, Facebook and Twitter all acquired companies in the big data space. Documents leaked by Edward Snowden also revealed that intelligence agencies have been collecting big data in the form of metadata and, amongst other things, information from social media profiles for a decade.

And beyond all of that, big data became everyone's most hated buzzword in 2013 after it was inappropriately used everywhere, from boardrooms to conferences. This has led to countless analysts, journalists and readers calling for people to stop talking about big data. A good example could be seen in the Wall Street Journal last week, where a reader wrote in complaining: "A lot of companies talk about it but not many know what it is." While that's a problem, it leads to my first prediction:

1. In 2014, people will finally start to understand the term big data

Because, as it stands, many do not. The truth is that we've only really just started to talk about big data, and companies aren't going to stop screaming about their latest big data endeavors. In fact, it's only January and the social bookmarking network Pinterest has already acquired the image recognition platform VisualGraph. (Why? Pinterest wants to understand what users are "pinning" and to create better algorithms that help users connect with their interests.)
So let's get 2014 off on the right foot with a definition of big data, from researchers at St Andrews, that's fairly easy to understand: the storage and analysis of large and/or complex data sets using a series of techniques including, but not limited to, NoSQL, MapReduce and machine learning. The main elements revolve around volume, velocity and variety. And the word 'big'? If your personal laptop can handle the data in an Excel spreadsheet, it's not big.

Matt Asay, a journalist with ReadWriteWeb, also does a good job of explaining what makes a big data problem (as opposed to a more traditional business intelligence one). If you know what questions to ask of your transactional cash register data, which fits nicely into a relational database, you probably don't have a big data problem. If you're storing that same data alongside an array of weather, social and other data to try to find trends that might impact sales, you probably do.

2. Consumers will begin to (voluntarily) give up certain elements of privacy for personalization

We've all heard of cookies, and we know that our actions around the internet affect the adverts we see on websites and the suggested items we receive on Amazon. This is a concept we've not only become accustomed to but also accept. After all, if we're going to have information put in front of us, we'd rather it was information we could relate to. But there have been problems in the past. Some websites have taken advantage of customers, for example by increasing the price of a flight they've previously expressed interest in (consumers might worry that the price will go up even further and therefore decide to buy a ticket). But as more companies adopt big data techniques, customers will cooperate, on the premise that they will benefit. This is likely to follow Tesco's methodology, whereby customers are sent vouchers for goods they are likely to buy anyway, creating a win-win situation for both parties.
Customers, generally, are happy to receive a discount, and retailers are pleased customers are coming back (especially if the vouchers have an expiry date).

3. Big data-as-a-service will become a big deal

Despite claims from analysts that all businesses will look to hire data scientists, this just isn't going to happen. Firstly, there's a shortfall of data scientists (which goes some way towards explaining why companies are retraining existing staff to work with big data) and secondly, not all companies are ready to, nor do they need to, invest in full-time data scientists to analyze and explain their data. Instead, just as in other areas, I expect a wave of companies hustling to enter the big data-as-a-service space, an idea that began to creep in during the latter parts of 2013. This could see small and medium businesses signing up to anything from entire packages that store, analyze, explain and visualize data, to more compact services that focus on transferring data to cloud-based servers so it can be queried easily in the future.

4. And finally... remember how Hadoop is open-source software? Expect a lot more of that

Hadoop, famously named after a toy elephant, is well known to anyone curious about data science, and it provides the backbone for many big data systems, allowing businesses to store and analyze masses of data. Most importantly, it's open source, which means its implementation was inexpensive, allowing many organizations to understand, rather than ignore, the data they were collecting. Quentin Gallivan, the chief executive of the business analytics software firm Pentaho, explained last month that the rise of new open-source software will bring about more innovation and more ways of understanding the data. He said: New open source projects like Hadoop 2.0 and YARN, as the next generation Hadoop resource manager, will make the Hadoop infrastructure more interactive...
projects like STORM, a streaming communications protocol, will enable more real-time, on-demand blending of information in the big data ecosystem.
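For readers who haven't met MapReduce, one of the techniques named in the St Andrews definition above, the pattern can be sketched in miniature. The following is a hedged, single-machine illustration of the map/shuffle/reduce idea using the canonical word-count example; the function names are ours for illustration, not any framework's API, and a real Hadoop job would distribute each phase across a cluster of machines.

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.lower().split():
            yield word, 1

def shuffle(pairs):
    """Shuffle: group emitted values by key, as the framework would
    between the map and reduce phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: combine the grouped values for each key (here, sum counts)."""
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["big data big ideas", "data beats opinion"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts["big"])   # 2
print(counts["data"])  # 2
```

The appeal of the pattern is that map and reduce are independent per key, so each phase can be split across many cheap machines, which is exactly what makes commodity-hardware systems like Hadoop economical.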
27. February 2014