
Data Analyst Interview: SQL, Case Studies, and What They're Really Testing

The real data analyst interview isn't about syntax. Learn what hiring managers look for beyond technical skills.

interview prep, data analyst, analytics, sql interview

Most data analyst candidates prepare for an interview that does not exist. They practice SQL syntax until they can write a window function in their sleep, build dashboards in Tableau, and memorize the difference between a fact table and a dimension table. Then they sit down in the interview and spend most of it answering questions about business problems they have never thought about: what is causing a drop in conversion, how would you measure the impact of a product change, what would you do if the data told you something the team did not want to hear?

The SQL is necessary but not sufficient. What separates data analyst candidates who get offers from those who do not is not technical ability — it is business thinking. This guide focuses on what hiring managers are actually evaluating, and how to demonstrate it.

What Interviewers Are Really Assessing

Can you turn data into a decision?

The most important thing a data analyst does is not query databases. It is translating data into decisions that stakeholders can act on. Interviewers want to know whether you have done that in practice — whether you have ever changed someone's mind or changed a course of action because of an analysis you ran. If every answer to behavioral questions ends at "I built a dashboard" or "I ran the analysis," you are stopping short of what matters.

In interviews, this shows up in how you frame your work. "I identified that cart abandonment was 18% higher for mobile users than desktop users, which led the product team to reprioritize the mobile checkout redesign" is a completely different signal than "I built a funnel analysis."

Do you know what good data looks like?

Analysts who have worked with real data — as opposed to clean teaching datasets — have scar tissue around data quality. Experienced hiring managers will probe for this by asking what you do when the data does not look right, how you validated your analysis, or whether you have caught a metric that was being reported incorrectly. Candidates who say they just trust the data they receive look naive. Candidates who describe their validation process look like someone who has been burned before and learned.

Can you explain your work to a non-technical audience?

This is a genuine differentiator that many technically strong candidates fail. Being able to explain a cohort analysis to a CFO, or describe what a p-value means in plain language to a marketing director, is a core part of the job. In interviews, this often comes up as a question about how you communicated findings or how you handled a stakeholder who disagreed with your analysis.

Are you curious, or just diligent?

The best data analysts are curious in a way that manifests as an instinct to question the numbers rather than just report them. Interviewers pick this up in whether you explore unexpected results rather than smooth them over, whether you ask why something is happening rather than just what is happening, and whether you proactively look for stories in data versus waiting to be asked for them.

The Questions You'll Actually Get Asked

"Write a query to find the second-highest salary in an employee table."

Classic SQL question. There are multiple correct approaches: using a subquery with MAX, using DENSE_RANK() as a window function, or using LIMIT/OFFSET. What interviewers are really looking at: do you know multiple ways to solve this, can you explain the performance trade-offs, and do you handle edge cases (what if there is no second salary, what if there are ties)?

Think out loud. "I would use DENSE_RANK here because it handles ties correctly, and I would filter WHERE rank = 2 in a CTE. The subquery approach also works but gets messier if you need to generalize to the nth salary." That kind of thinking out loud signals real experience, not memorized syntax.
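To make both approaches concrete, here is an illustrative sketch in Python with SQLite; the employees table, names, and salaries are invented for this example, and the same SQL would work in most databases that support window functions:

```python
import sqlite3

# Toy employees table with a tie at the top salary, to show why
# DENSE_RANK is the safer choice.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, salary INTEGER)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?)",
    [("Ana", 90000), ("Bo", 120000), ("Cy", 120000), ("Dee", 105000)],
)

# DENSE_RANK handles ties: both 120000 rows share rank 1, so rank 2 is 105000.
query = """
WITH ranked AS (
    SELECT salary, DENSE_RANK() OVER (ORDER BY salary DESC) AS rnk
    FROM employees
)
SELECT DISTINCT salary FROM ranked WHERE rnk = 2
"""
second = conn.execute(query).fetchone()
print(second)  # (105000,)

# Subquery-with-MAX alternative: also correct here, but messier to
# generalize to the nth-highest salary.
alt = conn.execute(
    "SELECT MAX(salary) FROM employees "
    "WHERE salary < (SELECT MAX(salary) FROM employees)"
).fetchone()
print(alt)  # (105000,)
```

Note that both queries return no meaningful row if the table has fewer than two distinct salaries — exactly the edge case worth mentioning out loud.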

"Our sign-up conversion rate dropped 15% last month. How would you investigate?"

This is the most important question type in a data analyst interview: the structured investigation question. Do not jump to a hypothesis. Walk through a systematic decomposition out loud.

Framework: Validate first (is the measurement correct, did the tracking code change?), then segment (is the drop in all users or a specific cohort — mobile vs desktop, organic vs paid, new vs returning?), then timeline (when exactly did it drop — does it correlate with a deployment, a campaign, an external event?), then cross-reference (is this drop isolated to conversion or are upstream metrics also affected?). Only after all of that should you form a hypothesis.

Strong answers end with: "Based on this, my leading hypothesis is X, and here is the query I would write to test it."
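The segmentation step of that framework is easy to demonstrate with a query. A minimal sketch, again in Python with SQLite — the signups table, the week labels, and the numbers are all made up to show the shape of the decomposition:

```python
import sqlite3

# Hypothetical sign-up funnel data, aggregated by week and device.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE signups (week TEXT, device TEXT, visits INTEGER, conversions INTEGER)"
)
conn.executemany("INSERT INTO signups VALUES (?, ?, ?, ?)", [
    ("2024-02", "desktop", 1000, 100),
    ("2024-02", "mobile",  1000, 100),
    ("2024-03", "desktop", 1000,  98),
    ("2024-03", "mobile",  1000,  60),  # the drop is concentrated here
])

# Segment step: conversion rate by week x device.
rows = conn.execute("""
    SELECT week, device,
           ROUND(1.0 * SUM(conversions) / SUM(visits), 3) AS cvr
    FROM signups
    GROUP BY week, device
    ORDER BY week, device
""").fetchall()
for row in rows:
    print(row)
# Mobile fell from 0.100 to 0.060 while desktop barely moved,
# which points the investigation at mobile-specific changes.
```

The same query pattern repeats for each segmentation axis (channel, new vs returning, geography) until the drop is localized.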

"How would you measure the impact of a new feature we just launched?"

This is a measurement design question. Start by clarifying: was there an A/B test? If yes, walk through how you would analyze the experiment results — sample size validation, statistical significance, guardrail metrics, segmentation. If no, discuss quasi-experimental methods: difference-in-differences, pre/post analysis with caveats about confounders, or using a holdout group if one was preserved.

The important thing is to flag what you can and cannot conclude from each approach. An analyst who says "we can't establish causality from this data" is more trustworthy than one who presents a correlation as a causal result.
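Difference-in-differences is the quickest of those quasi-experimental methods to sketch on a whiteboard. A back-of-envelope version with made-up conversion numbers, assuming a comparison group that did not get the feature but shares the same background trend:

```python
# Illustrative numbers only: conversion rates before and after launch.
treated_pre, treated_post = 0.100, 0.130   # group that got the feature
control_pre, control_post = 0.100, 0.110   # comparison group

treated_change = treated_post - treated_pre   # includes seasonality, campaigns, etc.
control_change = control_post - control_pre   # estimates that background trend
did_estimate = treated_change - control_change

print(round(did_estimate, 3))
# 0.02: the lift attributable to the feature, but only if the
# parallel-trends assumption holds -- a caveat worth stating out loud.
```

Walking through this arithmetic in an interview, with the caveat attached, is exactly the "what can and cannot be concluded" framing the section above describes.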

NextCV generates interview cheat sheets with STAR examples

"Tell me about a time your analysis led to a decision you disagreed with."

This is a behavioral question about judgment and professional dynamics. The answer the interviewer is looking for: you raised your concern with evidence, you presented an alternative interpretation, and you ultimately supported the decision even if you still had reservations — while maybe flagging that you would want to revisit it after the next data cycle.

The red flag answer: either you went along silently (no backbone) or you refused to move forward until they agreed with you (no collaboration). The thing interviewers are listening for is whether you understand that your job is to inform decisions, not make them unilaterally.

How to Prepare the Night Before

Run through five SQL problems from scratch. Not the hard ones — the classics. Aggregations, joins, window functions, date arithmetic, self-joins. The goal is to make sure your hands remember the syntax under light pressure. Mistakes on simple SQL in a screenshare interview look worse than they are, so drill the basics.

Prepare one investigation story. Think of a time you found something unexpected in data — a metric that did not look right, a trend that contradicted expectations, a result that changed a plan. Structure it: what were you looking at, what seemed off, how did you investigate, what did you find, and what happened as a result. This story will serve you well in multiple question types.

Review the company's public metrics and business model. Understanding how the company makes money and what their key performance drivers are lets you anchor your answers in their context. A candidate who frames their analysis of a conversion drop in terms of the company's actual revenue model looks like someone who has been paying attention.

NextCV's interview cheat sheet feature generates a focused prep guide from the specific job description — including likely question themes, the technical areas to brush up on, and behavioral story prompts aligned to the data role in question. Useful for structuring your final prep the night before.

See how NextCV tailors your preparation to match the job posting

Write down three things that could go wrong with the company's key metric. Before the interview, take whatever their primary KPI likely is (DAU, GMV, conversion rate, retention) and brainstorm what a drop in that metric could mean — measurement error, product issue, external factor, attribution change. This primes you for investigation questions and makes your answers feel less like you are reading from a script.

Common Interview Mistakes for Data Analyst Candidates

Reporting findings without interpreting them

"The data shows that conversion dropped 15% in March" is a description. "The data shows that conversion dropped 15% in March, concentrated in mobile users, beginning the day after we deployed the new checkout flow — which suggests a regression in the mobile experience" is an analysis. Candidates who stop at description make hiring managers nervous because translating data into insight is the job. Show that you do not just summarize — you interpret.

Skipping data validation

Jumping straight to analysis without discussing how you would validate the data quality is a red flag. Real data is messy: tracking breaks, definitions change, data pipelines have bugs. Experienced analysts have frameworks for checking their work — they look for unexpected nulls, implausible values, and totals that do not add up. Mentioning this instinct in your answers, even briefly, signals real-world experience.
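Those checks do not need to be elaborate to be convincing. A minimal sketch of the instinct, with invented records and thresholds, in plain Python:

```python
# A few sanity checks an analyst might run before trusting a dataset.
# The records here are illustrative, not from a real pipeline.
rows = [
    {"user_id": 1, "revenue": 19.99, "country": "US"},
    {"user_id": 2, "revenue": None,  "country": "US"},   # unexpected null
    {"user_id": 3, "revenue": -5.00, "country": "DE"},   # implausible value
]

null_revenue = [r for r in rows if r["revenue"] is None]
negative_revenue = [r for r in rows if r["revenue"] is not None and r["revenue"] < 0]
total = sum(r["revenue"] or 0 for r in rows)

issues = []
if null_revenue:
    issues.append(f"{len(null_revenue)} rows with null revenue")
if negative_revenue:
    issues.append(f"{len(negative_revenue)} rows with negative revenue")

print(issues)           # flags both problems before any analysis starts
print(round(total, 2))  # does this total reconcile with the billing system?
```

Mentioning even a checklist this small — nulls, implausible values, totals reconciled against a source of truth — is usually enough to signal the scar tissue interviewers are probing for.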

Being overconfident about causality

Saying "this feature caused the improvement" when all you have is correlational data from a non-controlled rollout is a credibility problem. Strong analysts are precise about what they can and cannot claim. Using language like "this is consistent with the hypothesis that..." or "we can observe a correlation, though we would need a controlled experiment to establish causality" signals analytical maturity. It also makes you more trustworthy when you do claim something strongly.

Over-engineering the technical answer

When an interviewer asks a SQL question, they are often looking for the simplest readable solution, not the most optimized one. Candidates who immediately reach for complex window functions when a GROUP BY would suffice sometimes signal that they optimize for cleverness over clarity. Write clean, readable SQL first. If performance is a concern, mention it and discuss trade-offs.
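For contrast, here is the kind of simple, readable answer that usually wins: a question like "orders per customer" answered with a plain GROUP BY rather than a window function. Sketched in Python with SQLite, with a toy table invented for illustration:

```python
import sqlite3

# Toy orders table: the question "orders and spend per customer"
# needs nothing fancier than GROUP BY.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("ana", 10.0), ("ana", 15.0), ("bo", 7.5)])

rows = conn.execute("""
    SELECT customer, COUNT(*) AS n_orders, SUM(amount) AS total
    FROM orders
    GROUP BY customer
    ORDER BY customer
""").fetchall()
print(rows)  # [('ana', 2, 25.0), ('bo', 1, 7.5)]
```

Reaching for OVER (PARTITION BY ...) here would produce the same numbers with more ceremony; saving window functions for questions that actually need ranking or running totals is itself a signal of judgment.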


Data analyst interviews are fundamentally about judgment: the judgment to investigate systematically before concluding, the judgment to communicate findings honestly even when they are inconvenient, and the judgment to know what the data actually supports versus what you want it to say. Technical skills get you to the interview. These habits get you the offer.

Ready to build your tailored CV?

Paste any job posting and get a CV optimized for that specific role — in seconds.

Try NextCV free