The research mix

November 05, 2019

This post is part of my notes on three years of design leadership at Pipedrive. Also read the other posts in the series.

Research is a crucial part of product design, regardless of the formal reporting structures in the company. I started the product research discipline at Pipedrive and hired a team of researchers and data analysts. Originally they all sat on a single research team, which was later split up, but that mattered little because everything stayed under the same Product umbrella. That's my first point, if you want a quick takeaway: research, data, design and product management should all build on and inform each other's work.

Regardless of team composition and the exact roles, here’s how I think about research methods and activities.

Research methods

The basic tradeoff

The basic gist is this: you can know a little about many people, or a lot about a few people, but you can never know everything about everyone. The chart shows this tradeoff and positions the various methods along it.

The mix of methods is eclectic. “(Customer) Support” is not really a research method, but it is definitely a source of research data. At Pipedrive we mostly did research with existing customers, and those who contacted support were a great group to talk to. Sure, there is a selection bias there, so we never relied on that alone. But if you are working on a new product direction, a good group to talk to is the people who have explicitly requested that thing to exist, which, in Pipedrive’s case, they often do via support.

Generative and evaluative research

The two common types of product research are generative and evaluative, sometimes also called formative and summative, or something else equally fancy. Simply put, you do the first to understand what to build (“doing the right thing”), and the second to understand whether what you built really fits into the world of the humans you think will be using it (“doing the thing right”).

The methods chart above doesn’t really distinguish between the two, since interviews often blend them, and the researchers and customers doing the talking are the same. You’ll go into a study or interview with a particular focus and script, but there’s always room to discuss “the other side”, and it’s valuable to capture those insights.

Thick data

I got the “thick data” label from Tricia Wang’s work; in particular, I happened to see her keynote at IxDA 16 in Helsinki. She explains how it’s useful to have a label as catchy as “Big Data” to signal that depth matters.

You’ll see how the dimensions start to compound here: you can have thick data in both the generative and evaluative research stages.

Problem and solution space

Yet another dimension comes from Indi Young’s wonderful work on problem versus solution space.

Indi explains how and why to explore the problem space, and what people find hard to understand about this approach. In brief, problem-space thinking is less about your organization, product or service, and more about people in their own current world, which, sure, may include you and your product, but always includes many other things as well.

Armed with all these tools, a good design leader and researcher can work on both immediate product problems and longer-term frontiers, and can always suggest an appropriate mix of methods to fit the research goals of any piece of work, large or small.