Report Design and Data Quality: How To Gather Requirements, According to Justin Norris

Marketing leader Justin Norris shares recommendations for producing valuable reporting for stakeholders.

Social Stories by Product Coalition

By Tremis Skeete, for Product Coalition

How do you know when a digital product is “done”?

Product development teams ask this question not because we want to end the work and move on to something else. We ask because we want to define what “done” can look like for a given project timeline, knowing we will continue the work, add or change capabilities, and improve the product in the future.

That’s why it’s good to find the answer and understand what the “end state” version of a product should look like. It’s also one of many important reasons why gathering product requirements upfront is critical.

Now let’s say in this case — the digital product is a report.

In a LinkedIn post, Justin Norris, Director of Marketing Operations at 360Learning, makes several suggestions for how to approach designing and building reports for executive stakeholders.

As someone with a background in computer science and UI/UX design, I thought his post was fascinating. After all, it’s not every day I read about a practical requirements and design process from someone with a non-technical academic background. His LinkedIn post also serves as a helpful example of bridging the gap between the technical and business contexts in product development. To Justin: kudos to you.


When a stakeholder requests a report, whether they tell you or not, they make the request so they can get something done. To respond to this request, as a product person, the best question to ask is:

“By using this report, what are you striving to accomplish?”

With this question, you’re working to establish stakeholder objectives within the requirements. You ask because it’s a pathway to understanding what, how, and why the report needs to perform in a certain way in order to help your stakeholder(s) achieve their objective(s).

After reviewing his LinkedIn post, I feel that Justin highlights the following activities in order to produce valuable report designs:

  1. Help stakeholders communicate their ideas.
  2. Understand the business problem(s).
  3. Slow down and find out what problem we’re trying to solve.
  4. Collect requirements, get clarity upfront, and establish a definition of “done”.
  5. Perform a report and/or data feasibility analysis.
  6. Identify the complex entities.
  7. Identify the nuances, exceptions, and edge cases.
  8. Identify the metrics i.e. the quantitative measurements of data.
  9. Ensure you know how to produce that metric from your data sources.
  10. If a data point or query is difficult to produce, negotiate to remove it from scope, and plan ahead for future implementation.
  11. If your requirements analysis reveals broken business processes, negotiate to improve processes and/or mitigate risks.
  12. If your requirements analysis reveals data that’s not up to par, perform data cleanup and/or optimization.
  13. Make a dictionary of the metrics needed.
  14. Make a dictionary of the business and systems entities and terminology.
  15. Ensure the dictionary definitions are agreed upon by stakeholders.
  16. Identify the data dimensions you’ll need.
  17. Find out from stakeholders the potential business insights to be gained.
  18. Find out the potential decisions to be taken with the information.
  19. Wrap yourself in the business context.
  20. Build a mockup [prototype].
  21. Design the report’s [mockup’s] look and feel to ensure the data presented is accurate and useful.
  22. Review the mockup with team and stakeholders.
  23. Identify unclear elements [ambiguities] and clarify.
  24. Identify unnecessary elements and remove from mockup.
  25. Iterate, refine and finalize the report design [mockup].
  26. Test mockup with stakeholders, and iterate based on feedback.
  27. Based on mockup and feedback, complete the final report version.
  28. Test final version with stakeholders, and iterate based on feedback.
  29. Make final adjustments based on feedback.
  30. Finalize report and launch.

Items 1 to 19 are all about talking to stakeholder(s): understanding the business challenges and the objectives to be achieved, and establishing a clear understanding of the report’s purpose.

Items 20 to 26 are about building and testing mockup designs with stakeholders, learning from them, and identifying opportunities to optimize data quality.

Lastly, items 27 to 30 are about finalizing the report in accordance with the definition of “done” established in the requirements.
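To make the dictionary steps [items 13 to 16] concrete, here is a minimal Python sketch of what a metrics dictionary might look like, using the “calls connected” to “calls dialed” example from Justin’s post. Every field name and value below is a hypothetical illustration, not part of his actual process or any specific tool:

```python
# A hypothetical metrics dictionary: each metric carries a business
# definition, a system definition, and a feasibility flag (items 9, 13-15).
metrics_dictionary = {
    "connected_calls": {
        "business_definition": "Any call in which we speak to a live person "
                               "for any length of time.",
        "system_definition": "An activity in Salesloft where type = call "
                             "and disposition = connected.",
        "feasible": True,  # confirmed we can produce this from our data sources
    },
    "dialed_calls": {
        "business_definition": "Any outbound call attempt, regardless of outcome.",
        "system_definition": "An activity in Salesloft where type = call.",
        "feasible": True,
    },
}

def connect_rate(connected: int, dialed: int) -> float:
    """The metric itself (item 8): ratio of calls connected to calls dialed."""
    return connected / dialed if dialed else 0.0

print(connect_rate(25, 100))  # 25 of 100 dials connected -> 0.25
```

Writing the definitions down in a shared, reviewable artifact like this is what lets stakeholders agree on them before any report is built.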

When Justin talks about building a mockup [item 20], I appreciate the moment where he suggests Google Slides or Sheets. Why not Figma, InVision, or Adobe XD? Because prototyping is not exclusive to design software. With all due respect to Justin, he’s a marketing expert, so it makes sense that he would choose software that marketers use.

Justin could have suggested pen and paper, or dry erase markers and a whiteboard, but it doesn’t matter which software or materials you use. What matters is that you use the materials at your disposal to build a prototype [mockup] as fast as humanly possible.

Please keep in mind that the list above is by no means a set of strict instructions for how to approach a report design. As a product person, you’re expected to make adjustments and adaptations with respect to your business culture and operations.

To create a valuable report, know that what matters most to executive stakeholders is driving innovation. If we as product people want a report design to succeed, we need to embrace the data and operational capabilities of the business itself, and the people and culture that create them. No effective requirements gathering and analysis strategy could conclude otherwise.

Read a copy of Justin’s LinkedIn post below to find out more:

An exec says, “I want a report that shows x, y, and z!”

“Sure!” says the ops team / data team / reporting minion.

Report is built. Everyone gathers round.

“No, no, no!” says the exec. “This isn’t what x should mean. And by ‘y’ I actually was envisioning something more like ‘q’. And that definitely isn’t z.”

Repeat. 😩

The cycle of failed reporting is painful. We’ve all been there. What’s the root cause?

Some people are hard to please, but I actually DON’T think that’s the issue most of the time.

Most people have a VERY good idea of what they want to see. But *communicating* those ideas with the necessary level of precision is hard. It’s not something a typical business stakeholder will do.

If you want to prevent re-work, it’s up to the person collecting the requirements to get clarity up front. Here’s a few tips.

UNDERSTAND THE BUSINESS PROBLEM

It’s easy to be reactive when you get a reporting request — to produce what’s being asked for, which may not be what’s actually needed.

So, SLOW DOWN. Find out: what is the problem we are trying to solve here? What insights will be gained, what decisions taken, with this information?

Wrap yourself in the business context.

MOCK IT UP

It’s a lot easier to build a dashboard in Slides or Sheets than in your BI tool. So mock it up and review it together. Make sure it flows.

This is a great time to highlight potential ambiguities or something that’s not actually needed.

BUILD A DICTIONARY

Before you build the report, create a dictionary of the metrics and dimensions you’ll need.

For example, if someone wants to see the ratio of “calls connected” to “calls dialed,” both those metrics need to be defined both in business terms and then in system terms.

A *business definition* for a “connected call” might be, “any call in which we speak to a live person for any length of time.”

A *system definition* might be, “an activity in Salesloft where type = call and disposition = connected.”

Through building this dictionary you can:

- Ensure that EVERYONE is aligned on the business definition. It’s at this nitty-gritty level that most of the complexity, nuance, exceptions, and edge cases emerge. Hash it out up front.

- Ensure you know HOW to produce that metric and perform a feasibility analysis. If a particular data point is extra hard to produce, you can negotiate and decide to remove it from scope.

FIX BROKEN PROCESS

Sometimes the process of creating definitions exposes much deeper process issues.

For example, in the “connected calls” scenario, you might discover that there is no standardized process for logging calls and so your data is garbage.

This is an excellent opportunity to ensure you fix data quality at the root and improve process at the same time.

I’ll be honest — it’s a lot of work to do before you even start building the report.

But hopefully you’ll only need to build it once. 😊
