Data Science Interview Questions: What The Experts Say

Author: Guest Blog
Posting date: 8/22/2019 9:13 AM
Our friends at Data Science Dojo have compiled a list of 101 actual Data Science interview questions asked between 2016 and 2019 at some of the largest employers in the Data Science industry: Amazon, Microsoft, Facebook, Google, Netflix, Expedia, and others.

Data Science is an interdisciplinary field that sits at the intersection of computer science, statistics/mathematics, and domain knowledge. To perform well, one needs a good foundation in not one but several fields, and that is reflected in the interview. They've divided the questions into six categories:

  • Machine Learning
  • Data Analysis
  • Statistics, Probability, and Mathematics
  • Programming
  • SQL
  • Situational/Behavioural Questions

Once you've gone through all the questions, you should have a good understanding of how well you're prepared for your next Data Science interview.

Machine Learning

As one would expect, Data Science interviews focus heavily on questions that test your concepts, applications, and experience in machine learning. Each question in this category has recently been asked in one or more actual Data Science interviews at companies such as Amazon, Google, and Microsoft. These questions will give you a good sense of which sub-topics appear more often than others. You should also pay close attention to the way these questions are phrased in an interview.

  • Explain Logistic Regression and its assumptions.
  • Explain Linear Regression and its assumptions.
  • How do you split your data between training and validation?
  • Describe Binary Classification.
  • Explain the working of decision trees.
  • What are different metrics to classify a dataset?
  • What's the role of a cost function?
  • What's the difference between convex and non-convex cost function?
  • Why is it important to know bias-variance trade off while modeling?
  • Why is regularisation used in machine learning models? What are the differences between L1 and L2 regularisation?
  • What's the problem of exploding gradients in machine learning?
  • Is it necessary to use activation functions in neural networks?
  • In what aspects is a box plot different from a histogram?
  • What is cross validation? Why is it used?
  • Can you explain the concept of false positive and false negative?
  • Explain how SVM works.
  • While working at Facebook, you're asked to implement some new features. What type of experiment would you run to implement these features?
  • What techniques can be used to evaluate a Machine Learning model?
  • Why is overfitting a problem in machine learning models? What steps can you take to avoid it?
  • Describe a way to detect anomalies in a given dataset.
  • What are the Naive Bayes fundamentals?
  • What is AUC - ROC Curve?
  • What is K-means?
  • How does the Gradient Boosting algorithm work?
  • Explain advantages and drawbacks of Support Vector Machines (SVM).
  • What is the difference between bagging and boosting?
  • Before building any model, why do we need the feature selection/engineering step?
  • How to deal with unbalanced binary classification?
  • What is the ROC curve and the meaning of sensitivity, specificity, confusion matrix?
  • Why is dimensionality reduction important?
  • What are hyperparameters, how to tune them, how to test and know if they worked for the particular problem?
  • How will you decide whether a customer will buy a product today or not given the income of the customer, location where the customer lives, profession, and gender? Define a machine learning algorithm for this.
  • How will you inspect missing data and when are they important for your analysis?
  • How will you design the heatmap for Uber drivers to provide recommendation on where to wait for passengers? How would you approach this?
  • What are time series forecasting techniques?
  • How does a logistic regression model know what the coefficients are?
  • Explain Principal Component Analysis (PCA) and its assumptions.
  • Formulate Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) techniques.
  • What are neural networks used for?
  • Why is gradient checking important?
  • Is random weight assignment better than assigning same weights to the units in the hidden layer?
  • How to find the F1 score after a model is trained?
  • How many topic modeling techniques do you know of? Explain them briefly.
  • How does a neural network with one layer and one input and output compare to a logistic regression?
  • Why Rectified Linear Unit/ReLU is a good activation function?
  • When using the Gaussian mixture model, how do you know it's applicable?
  • If a Product Manager says that they want to double the number of ads in Facebook's Newsfeed, how would you figure out if this is a good idea or not?
  • What do you know about LSTM?
  • Explain the difference between generative and discriminative algorithms.
  • Can you explain what MapReduce is and how it works?
  • If the model isn't perfect, how would you like to select the threshold so that the model outputs 1 or 0 for label?
  • Are boosting algorithms better than decision trees? If yes, why?
  • What do you think are the important factors in the algorithm Uber uses to assign rides to drivers?
  • How does speech synthesis work?
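
Several of the questions above (F1 score, sensitivity/specificity, false positives/negatives, confusion matrix) reduce to simple arithmetic on the four cells of a binary confusion matrix. A minimal sketch in plain Python; the counts are made-up example values, not from any real model:

```python
# Metrics derived from a binary confusion matrix.
# tp/fp/fn/tn are invented counts for illustration only.
tp, fp, fn, tn = 40, 10, 5, 45

precision = tp / (tp + fp)            # of predicted positives, how many were right
recall = tp / (tp + fn)               # a.k.a. sensitivity / true positive rate
specificity = tn / (tn + fp)          # true negative rate
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of the two

print(round(precision, 3), round(recall, 3),
      round(specificity, 3), round(f1, 3))  # → 0.8 0.889 0.818 0.842
```

Being able to derive F1 from precision and recall on a whiteboard, rather than quoting a library call, is usually what interviewers are probing for.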

Data Analysis

Machine Learning concepts are not the only area in which you'll be tested in the interview. Data pre-processing and data exploration are other areas where you can always expect a few questions. We're grouping all such questions under this category. Data Analysis is the process of evaluating data using analytical and statistical tools to discover useful insights. Once again, all these questions have been recently asked in one or more actual Data Science interviews at the companies listed above.  

  • What are the core steps of the data analysis process?
  • How do you detect if a new observation is an outlier?
  • Facebook wants to analyse why the "likes per user and minutes spent on a platform are increasing, but total number of users are decreasing". How can they do that?
  • If you have a chance to add something to Facebook then how would you measure its success?
  • If you are working at Facebook and want to detect bogus/fake accounts, how will you go about that?
  • What are anomaly detection methods?
  • How do you solve for multicollinearity?
  • How to optimise marketing spend between various marketing channels?
  • What metrics would you use to track whether Uber's strategy of using paid advertising to acquire customers works?
  • What are the core steps for data preprocessing before applying machine learning algorithms?
  • How do you inspect missing data?
  • How does caching work and how do you use it in Data Science?
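
For the outlier and anomaly questions above, one common baseline answer is the 1.5 × IQR rule. A rough sketch (the crude quartile indexing is a simplification; z-scores, isolation forests, and other methods are equally valid answers):

```python
# Flag outliers with the 1.5 * IQR rule -- one common approach among many.
def iqr_outliers(xs):
    s = sorted(xs)
    n = len(s)
    # Crude quartile positions; real libraries interpolate more carefully.
    q1, q3 = s[n // 4], s[(3 * n) // 4]
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [x for x in xs if x < lo or x > hi]

print(iqr_outliers([10, 12, 11, 13, 12, 95]))  # → [95]
```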

Statistics, Probability and Mathematics

As we've already mentioned, Data Science builds its foundation on statistics and probability. A strong grounding in these topics is a requirement, and they are always brought up in Data Science interviews. Here is a list of statistics and probability questions that have been asked in actual Data Science interviews.

  • How would you select a representative sample of search queries from 5 million queries?
  • Discuss how to randomly select a sample from a product user population.
  • What is the importance of Markov Chains in Data Science?
  • How do you prove that males are on average taller than females by knowing just gender or height?
  • What is the difference between Maximum Likelihood Estimation (MLE) and Maximum A Posteriori (MAP)?
  • What does P-Value mean?
  • Define the Central Limit Theorem (CLT) and its applications.
  • There are six marbles in a bag, one is white. You reach in the bag 100 times. After drawing a marble, it is placed back in the bag. What is the probability of drawing the white marble at least once?
  • Explain Euclidean distance.
  • Define variance.
  • How will you cut a circular cake into eight equal pieces?
  • What is the law of large numbers?
  • How do you weigh nine marbles three times on a balance scale to select the heaviest one?
  • You call three random friends who live in Seattle and ask each independently if it's raining. Each of your friends has a 2/3 chance of telling you the truth and a 1/3 chance of lying. All three say "yes". What's the probability it's actually raining?
  • Explain a probability distribution that is not normal and how to apply it.
  • You have two dice. What is the probability of getting at least one four? Also find out the probability of getting at least one four if you have n dice.
  • Draw the curve log(x + 10).
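
The marble, dice, and Seattle-rain questions above come down to the complement rule and Bayes' theorem. A sketch of the arithmetic; note that the rain puzzle needs a prior probability of rain, which the question leaves open (a 0.5 prior is a common but contestable assumption):

```python
# Complement rule: P(at least one success) = 1 - P(no success in any trial).

# Marble question: one white marble in six, drawn with replacement 100 times.
p_white_at_least_once = 1 - (5 / 6) ** 100

# Dice question: probability of at least one four with n dice.
def p_four(n):
    return 1 - (5 / 6) ** n

# Rain question (Bayes): each of three independent friends tells the truth
# with probability 2/3; all three say "yes, it's raining".
# P(rain | 3 yes) = p * (2/3)^3 / (p * (2/3)^3 + (1 - p) * (1/3)^3)
def p_rain_given_yes(prior):
    num = prior * (2 / 3) ** 3
    return num / (num + (1 - prior) * (1 / 3) ** 3)

print(round(p_four(2), 3))            # → 0.306  (i.e. 11/36 for two dice)
print(round(p_rain_given_yes(0.5), 3))  # → 0.889 (i.e. 8/9 with a 50% prior)
```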


Programming

When you appear for a data science interview, your interviewers are not expecting you to come up with highly efficient code that uses minimal hardware resources and executes in record time. They do, however, expect you to be able to use R, Python, or SQL well enough to access data sources and at least build prototypes of solutions.

You should expect a few programming/coding questions in your data science interviews. Your interviewer might want you to write a short piece of code on a whiteboard to assess how comfortable you are with coding, as well as to get a feel for how much code you typically write in a given week.

Here are some programming and coding questions that companies like Amazon, Google, and Microsoft have asked in their Data Science interviews. 

  • Write a function to check whether a particular word is a palindrome or not.
  • Write a program to generate Fibonacci sequence.
  • Explain string parsing in the R language.
  • Write a sorting algorithm for a numerical dataset in Python.
  • Coding test: compute a moving average. Input: 10, 20, 30, 10, ... Output: 10, 15, 20, 17.5, ...
  • Write Python code to return the count of words in a string.
  • How do you find a percentile? Write the code for it.
  • What is the difference between - (i) Stack and Queue and (ii) Linked list and Array?
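
To give a flavour of what a whiteboard answer might look like, here is one reasonable solution to three of the questions above (interviewers usually accept several variants):

```python
# Sketch answers to the palindrome, Fibonacci, and moving-average questions.

def is_palindrome(word):
    """A word is a palindrome if it reads the same forwards and backwards."""
    w = word.lower()
    return w == w[::-1]

def fibonacci(n):
    """Return the first n Fibonacci numbers, starting from 0."""
    seq, a, b = [], 0, 1
    for _ in range(n):
        seq.append(a)
        a, b = b, a + b
    return seq

def moving_average(xs):
    """Running mean after each element: 10, 20, 30, 10 -> 10, 15, 20, 17.5."""
    out, total = [], 0
    for i, x in enumerate(xs, start=1):
        total += x
        out.append(total / i)
    return out

print(is_palindrome("Level"))             # → True
print(fibonacci(6))                       # → [0, 1, 1, 2, 3, 5]
print(moving_average([10, 20, 30, 10]))   # → [10.0, 15.0, 20.0, 17.5]
```

In an interview, talking through edge cases (empty input, case sensitivity, whether the sequence starts at 0 or 1) matters as much as the code itself.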

Structured Query Language (SQL)

Real-world data is stored in databases and it ‘travels’ via queries. If there's one language a Data Science professional must know, it's SQL - or “Structured Query Language”. SQL is widely used across all job roles in Data Science and is often a ‘deal-breaker’. SQL questions are placed early on in the hiring process and used for screening. Here are some SQL questions that top companies have asked in their Data Science interviews. 

  • How would you handle NULLs when querying a data set?
  • How will you explain JOIN function in SQL in the simplest possible way?
  • Select all customers who purchased at least two items on two separate days from Amazon.
  • What is the difference between DDL, DML, and DCL?
  • Why is database normalisation important?
  • What is the difference between clustered and non-clustered index?
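
Two of the questions above (NULL handling and JOINs) can be demonstrated in a few lines using Python's built-in sqlite3 module. The tables, names, and figures here are invented purely for illustration:

```python
# Tiny in-memory demo of NULL handling and a LEFT JOIN.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Ana', 'Leeds'), (2, 'Bo', NULL);
    INSERT INTO orders VALUES (10, 1, 25.0), (11, 1, 40.0);
""")

# NULLs: test with IS NULL or replace with COALESCE; "= NULL" is never true.
rows = conn.execute(
    "SELECT name, COALESCE(city, 'unknown') FROM customers ORDER BY id"
).fetchall()
print(rows)  # → [('Ana', 'Leeds'), ('Bo', 'unknown')]

# A LEFT JOIN keeps customers with no orders; their SUM comes back as NULL.
joined = conn.execute("""
    SELECT c.name, SUM(o.amount)
    FROM customers c LEFT JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(joined)  # → [('Ana', 65.0), ('Bo', None)]
```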

Situational/Behavioural Questions

Capabilities don't necessarily guarantee performance. It's for this reason that employers ask situational or behavioural questions, in order to assess how you would perform in a given situation. In some cases, a situational or behavioural question asks you to reflect on how you behaved and performed in a past situation. A situational question can help interviewers assess your role in a project you've included in your resume, reveal whether or not you're a team player, and show how you deal with pressure and failure. Situational questions are no less important than the technical questions, and it always helps to do some homework beforehand. Recall your experiences and be prepared!

Here are some situational/behavioural questions that large tech companies typically ask:   

  • What was the most challenging project you have worked on so far? Can you explain your learning outcomes?
  • According to your judgement, does Data Science differ from Machine Learning?
  • If you're faced with Selection Bias, how will you avoid it?
  • How would you describe Data Science to a Business Executive?

If you're looking for a new Data Science role, you can find our latest opportunities here.

This article was written by Tooba Mukhtar and Rahim Rasool for Data Science Dojo. It has been republished with permission. You can view the original article, which includes answers to the above questions, here.

Related blog & news

With over 10 years' experience working solely in the Data & Analytics sector, our consultants are able to offer detailed insights into the industry.

Visit our Blogs & News portal or check out the related posts below.

3 Ways Machine Learning Is Benefiting Your Healthcare

With Data-led roles leading the list in the World Economic Forum's 'Jobs of the Future' report, it is no surprise that Data Science continues to be the main driving force behind a number of technological advancements. From the Natural Language Processing (NLP) that powers your Google Assistant, to Computer Vision scanning pictures for specific objects, and the Deep Learning techniques exploring the capability of computers to become "human", innovation is everywhere.

It's unsurprising, then, that the world of healthcare is fascinated by the possibilities Data Science can offer: possibilities which could not only make your life and mine better, but also save several thousands of lives around the world. To just scrape the surface, here are three examples of how Machine Learning (ML) techniques are being used to benefit our healthcare.

COMPUTER VISION FOR IMAGING DIAGNOSTICS

Have you ever had a broken leg or arm and seen an x-ray scan of your fracture? Can you remember how the doctor described the kind of fracture to you and explained where exactly you could see it in the picture? What your doctor did a few years ago can now be done by an algorithm that will identify the type of fracture and provide insights into how you should treat it. And it's not just fractures; Google's AI DeepMind can spot breast cancer as well as your radiologist. By feeding a Machine Learning model the mammograms of 76,000 British women, Google's engineers taught the system to spot breast cancer in a screening scan. The result? A system as accurate as any radiologist.

We've already reached the point where Machine Learning and AI can do more than just outsmart us at a board game: they can benefit our everyday lives, including in use-cases as sensitive as healthcare.

NLP AS YOUR PERSONAL HEALTH ASSISTANT

When we go to our GP, we go to see someone with a medical education and clinical understanding who can evaluate our health problems. We go there because we trust in the education of this person and their ability to give us the best information possible. However, thanks to the rise of the internet, we've turned to search engines and WebMD to self-diagnose online, often reading blogs and forums that will convince us we have cancer instead of a common cold.

Fortunately, technology has advanced to the point where it can assist with an on-the-spot (and much more accurate) evaluation of your medical condition. By conversing with an AI, like the one from Babylon Health, we can gain insights into possible health problems, define the next steps we need to take, and know whether or not we need to see a doctor in person. There's no need to wait for opening times or to sit bored in a waiting room. Easy access from your phone democratises the process, and advice can be received by anyone, at any time.

DEEP LEARNING DRAWS CONCLUSIONS BETWEEN MEDICAL STUDIES

Despite their extensive qualifications, even medical researchers can feel overwhelmed by the sheer amount of data gathered around the world in hospitals, labs, and across various studies. No wonder it's not uncommon for important insights to get lost in the mix. Once again, Machine Learning can help. Instead of getting lost in a sea of medical data, ML algorithms can dig deep and find the information medical researchers really need. By efficiently sifting through vast amounts of medical data, combining datasets, and surfacing insights, ML suggests ways for treatments to be improved and medicines to be altered, and, as a result, can save lives.

And this is only the beginning. As Machine Learning continues to improve, we can expect huge advances in the coming years, from robotic surgery to automated hospitals and beyond. If you're an expert in Machine Learning, we may have a job for you. Take a look at our latest opportunities or get in touch with one of our expert consultants to find out more.

How NLP Is Redefining The Future Of Tech


During the last half of the past decade, the importance of Data reached a level at which it was coined "the new oil". This was indicative of a shift in the practices of individuals and businesses, highlighting how they now rely on something which isn't measurable in gallons but in bytes. However, because we can't physically see the Data we generate, gather, and store, it's easy to lose our connection to it.

This is where NLP comes into play. With the purpose of helping computers understand our languages, NLP (Natural Language Processing) has gained increased importance over the last couple of years. But, more than teaching a computer how to speak, NLP can make sense of patterns within a text, from finding the stylistic devices of a piece of literature to understanding the sentiment behind it. So, with NLP set to become even more prevalent over the next decade, here are some of the ways in which it's already being put to use:

EXTRACTION

Like an advanced version of using Ctrl + F to search a document, NLP can instantly skim through texts and extract the most important information. Not only that, but NLP algorithms are able to find connections between text passages and can generate statistics related to them. Which leads to the next example:

TEXT CLASSIFICATION

This is fairly self-explanatory: NLP algorithms can use parameters to sort texts into certain categories. You'll find this used frequently in the insurance industry, where businesses use NLP to organise their contracts and categorise them the same way newspapers categorise their articles into different subcategories. And, closer to home, it's similar algorithms that keep your inbox free from spam, automatically detecting patterns which are heavily used by spammers. But NLP does more than just look for keywords; it can understand the meaning behind them:

SENTIMENT ANALYSIS

Sentiment Analysis takes the above understanding and classification and applies a knowledge of subtext, particularly when it comes to getting an indication of customer satisfaction. For example, Deutsche Bahn is using Sentiment Analysis to find out why people are unhappy with their experience, whilst Amazon is using it to keep tabs on the customer service levels of its sellers. Indeed, Facebook has taken this one step further: rather than just tracking satisfaction levels, it is examining how users organise hate groups and using the data collected to try and prevent them mobilising.

With the advancement of Machine Learning and technological developments like quantum computing, this decade could see NLP's understanding reach a whole new level, becoming omnipresent and even more immersed in our daily lives:

PERSONAL AI ASSISTANTS

The popularity of personal AI-based assistants is growing thanks to Alexa and Google Assistant (Siri and Cortana not so much, sorry). People are getting used to talking to their phones and smart devices to set alarms, create reminders, or even book haircuts. And, as we continue to use these personal assistants more and more, we'll need them to understand us better and more accurately. After decades of using generic text or click inputs to make a computer execute our commands, this decade our interactions with computers need to evolve into a more "natural" way of communicating. But these advances are not just limited to voice technologies. Talking and texting with machines, the way we would with friends, is increasingly realistic thanks to advances in NLP:

CHATBOTS

Since companies have realised that they can answer most generic inquiries using an algorithm, the use of chatbots has increased tenfold. Not only do these save on the need to employ customer service staff, but many are now so realistic and conversational that customers often don't realise they are engaging with an algorithm. Plus, the ability to understand what is meant, even when it is not said in as many words, means that NLP can offer a service akin to what any individual can.

If you're interested in using NLP to fuel the next generation of technical advancements, we may have a role for you. Take a look at our latest opportunities or get in touch with one of our expert consultants to find out more.

