Ann-Louise Gaynor

Product Designer

Hiya! I'm Ann-Louise, a product designer placing lived experience at the heart of digital experiences through empathy, curiosity, and one too many sticky notes.

Previously shaped experiences at Brandwatch and meldCX

Based in ever-rainy London

Brandwatch, 2023

Get actionable insights about your post

How providing context and meaningful insights turned plain metrics into actionable data stories in a feature with 75% adoption.

Tracks Covered

UI/UX Design, Prototyping, Data Visualisation

Team Involved

Project Manager, FE and BE Engineers, UX Researcher

Tools Used

Figma, Miro, Confluence, JIRA, Dovetail, Mixpanel

Timeline Spanned

Nov 2022 - Feb 2023 (4 months), Full-time, Remote

This case study demonstrates how iterative design, informed by user interviews and peer feedback, resulted in contextual social media performance insights that were digestible, organised, and effective in supporting data-driven decision-making.

75% adoption

And paired with increasing retention rates

37% conversion

As users take action on the insights

20% completion

As only a small percentage of users view all insights

⟢ Overview

The TL;DR

About Measure from Brandwatch

Measure, a B2B SaaS social analytics tool from Brandwatch, helps businesses track their social media performance through customisable dashboards, showing them where efforts are succeeding and where to adjust their content strategies.

Totally Candid Picture of the Measure Team ft. Me

Problem

Our users lacked the ability to meaningfully connect their social media posts with their performance. Performance insights were either presented in isolation from their relevant post or limited in the story they told, leading to missed opportunities for strategic adjustments and growth for users, and to reduced user satisfaction for Measure.

Limitations of Current Content Performance Widget

Method

To address these user pains, I reviewed existing research into the problem space, conducted my own research to fill any gaps, ideated on ways to show post performance alongside its context, tested my ideas and designs, and lastly, handed the designs over to development.

Complete Project Methodology

Solution

Post Insights in Measure

Different Available Insight Sections

Impact

75% adoption

as users turn to the feature to understand the context behind their post's performance

37% of users take action

after viewing the insights, like boosting the post, labelling it, or viewing it in another Brandwatch product

20% of users view all

six out of six sections of insights, with the majority of users dropping off after the 4th section

Read the full case study below to discover the whole journey to getting here.

⟢ Problem

Forcing Users to Connect the Dots Themselves

What problem were our users faced with?

Our users lacked the ability to meaningfully connect post content with its performance. While they could view metrics like reach or likes through tables or bar charts, these insights were presented either in isolation from the relevant post or limited in the story they told.

Ok... but the question still stands: how exactly is that a problem?

This inability manifested itself in user pain points that were identified by grouping similar user feedback found on Productboard:

Being in a contextual blind spot

Users couldn’t see the visual or textual details of posts alongside their metrics, making it difficult to pinpoint factors driving good or poor performance.

Having limited visibility

Users were restricted to viewing only a few metrics at a time, which hindered their ability to analyse broader patterns or compare data holistically.

Tolerating an inefficient workflow

Users had to rely on manual cross-referencing or external tools to get the complete story, adding unnecessary friction and time to their workflow.

User Pain Points

Third time’s the charm, let’s give it one more go... what’s the actual problem here?

Our users were left with incomplete insights, leading to missed opportunities for strategic adjustments and growth.

For Measure, it meant reduced user satisfaction and possible churn as clients turned to more comprehensive solutions offered by our competitors. After all, how can a business distinguish itself from competitors who outdo it in more ways than one?

Defining what we know

01

Our Users

Our users range from social media managers at brands and agencies to data analysts and advertisers. But no matter the role, their tasks span from analysing overall page performance to deep analysis of individual post performance.

Users of Measure

02

Our Key Question

How might we help general users understand post content in relation to its performance, while enabling power users to uncover deeper insights and make data-driven decisions about their content strategy?

03

Our Scope

We narrowed our scope to three areas, each addressing a user pain point and bringing us closer to answering our key question:

Being in a contextual blind spot

Provide context

Give users a way to view both post content (e.g., thumbnail, text, comments) and performance metrics together, making it easier to identify the factors driving engagement.

Having limited visibility

Expand visibility

Enable users to access a broader range of metrics at once, allowing for more comprehensive analysis and the ability to uncover patterns and insights across multiple data points.

Tolerating an inefficient workflow

Streamline workflows

Link post content with performance, erasing the need for manual cross-referencing. Provide easy access to relevant post actions, allowing users to take immediate, informed actions.

Defined Project Scope with User Pains as Guideline

04

Our Indicators for Success

To succeed would mean having correctly answered our How Might We, so it became the basis for selecting our KPIs:

At least 70% feature adoption within the first 3 months.

Reduced usage in the ‘export’ action: this would suggest that users access the insights they need directly within the platform.

At least 50% of users take any relevant actions after viewing the insights.

⟢ Research

Building on Top of Existing Work

Studying existing groundwork

I took over the project from the previous Measure PD, who had already conducted a thorough investigation into the problem. I studied their existing research and added to it with my own analysis of how our competitors and the social networks themselves implement post performance analytics.

Snapshot of Social Network References and Competitor Analysis on Miro

Checking the specifics

Knowing the average desktop screen size of our users would ensure that the most important information was visible without scrolling. Through data gathered via Mixpanel, we learned that we had around 700px before users needed to scroll.

Another consideration was the size of the post's media, with each social network having different dimensions for each of its post types. We optimised for a 9:16 aspect ratio, which could accommodate video media and most image media.

Screen Heights of Measure User's Devices on Mixpanel (left) and

Dimensions of Different Social Media Posts (right)

Ideation

Eighth Time's the Charm! That's How the Saying Goes, Right?

Starting small

I had to carefully consider how users would navigate the different categories of metrics and data visualisations. So, my initial exploration was focused on:

  • How to navigate through, organise, and contain various metrics within different sections, all while having digestibility as the guiding principle.

  • How to display the post preview alongside the insights, while taking into consideration user screen heights and social media post aspect ratios.

  • And how users would access actionable next steps, with visibility and easy access as its key aspects.

Early Wireframes of Navigation Methods and Layouts with Figma

Gaining some momentum

With a better idea of the layout, I started to explore designing the insights section by section:

Before (Left) and After (Right) of Early Mockups of the Distribution Section

  • Benchmarking insights: moved to the top to catch users' attention and deliver a high-value insight first.

  • Data breakdown: added sources of traffic for additional insight and context.

Before (Left) and After (Right) of Early Mockups of the Engagement Section

  • KPI supplemental information: seemed like noise rather than additional insight, taking the spotlight away from the KPI rather than supporting it.

  • Breakdown: switched from a tree map to a bar chart, as it takes less time to process the difference in values across the engagement types.

Getting feedback, feedback, and guess what... more feedback

The mockups went through 7 rounds of internal feedback. At the time, I was still building my visual design skills so these sessions allowed me to learn from more experienced team members and address areas where I lacked expertise.

Using a Miro board to present my mockups along with my assumptions and questions, I scheduled 1-on-1s with design seniors and internal stakeholders with the goal to gather feedback, test my assumptions, and get answers to my questions.

Feedback Notes from 2 of the 7 Different Critique Sessions with Miro

After the last design iteration, I finally had a version ready for user testing ✨

⟢ Testing

The Moment of Truth

Setting everything up for user testing

With a working prototype ready, we set out to find users to test it with. Together with the product manager and our embedded UX researcher, we defined our method, the participants we wanted, and the brief for the test.

Given the size of the feature, we decided it would be best to conduct 1-on-1 interviews with the participants. Each interview lasted around 45 minutes.

Research Plan

During each interview, I'd join in and take live notes on each piece of feedback our participants had, linking each one to its relevant section on a Miro board. This made it incredibly easy for me later on to review and remember the context behind users' feedback:

Part of the Interview Notes with Miro

Translating test results into next steps

The test showed that:

  • 8/8 participants were happy with the metrics offered in the prototype

  • 8/8 participants easily navigated the prototype and understood the sections

  • 3/8 participants struggled with the font sizes

  • 6/8 participants struggled with finding the definitions of the metrics

We also got a lot of section specific feedback that we evaluated and prioritised as a team through a prioritisation matrix.

Results of the Prioritisation Workshop Published in Dovetail

After incorporating the feedback and a final presentation to the Measure team and its stakeholders, I handed the prototype over to our engineers for development.

⟢ Solution

Now, That's a Lot of Numbers!

Post Insights in Measure

Different Available Insight Sections

The solution in full

Post Insights in Measure

Some design rationale:

  • Following the existing patterns in the product suite and the design system, the feature lives in a full-page modal where content is centred, surrounded by large margins, with a footer for the CTA and other actions.

  • Visualisations are built with Highcharts, the official library used across the products, and their colours come from the product suite's design system.

Measured outcomes

75% adoption within the first month

And a 50% retention rate in its 14th month.

10% reduction in exports

We expected a larger reduction; our hypothesis was that in-platform insights would replace the need to export. The small change suggests the two events may only be weakly correlated, and that other factors could have contributed to the shift as well.

37% of users took action after viewing the insights

Users boosted the post, checked it natively on the social network, and viewed it through other products in the product suite. This is lower than the 50% goal we set, indicating that we'd need to further explore which actions users would find helpful here, and possibly how visible those actions are.

20% viewed all 6 sections of insights

More than half of users dropped off by the fourth section, indicating that the last three sections were not as meaningful to them. Those sections contained the metrics test participants had flagged as least important, so users were likely to drop off after the 'most important' ones: distribution, engagement, and conversion metrics.

Users are clearly using the feature to gain more insights about their posts. However, further research and design is needed to improve the actions users can take and the value of the last few sections.

⟢ Reflection

Looking back

What I'd do differently…

Improve visual design

This was the first large project I led at Measure. I had yet to learn how to correctly space out elements, and that is evident in this project. If I had the time to redo it, I'd rework the layout to group elements better and space them out more clearly. I'd also rework the visual hierarchy, placing emphasis on the most relevant items and making them more immediate to the user.

What I learned…

Visualising dense data

By iterating on different layouts and chart types, I learned how to present metrics in ways that were both clear and helpful for users. Choosing the right visualisation for the data and minimising cognitive load guided my approach.

Using more than 2 braincells (aka critical thinking)

The intensive feedback process during this project transformed the way I evaluate my designs. With seven rounds of feedback and a critique workshop, I honed my ability to view each element within the UI critically, questioning its contribution to the overall goal and ensuring that it added value. This also taught me the importance of being able to defend your design decisions. Documenting insights, linking them back to user research, and incorporating feedback gave me the confidence I needed at the time as a Junior PD to make decisions on my own.

- END -

To the Measure team, thank you for creating a safe environment where I could try, fail, learn, and get right back up… And to Elena, for your guidance and top-tier company... Благодаря 🌷

© 2024 Ann-Louise Gaynor
