B2B marketplace desktop app report

Automate report creation

Reports in PowerPoint presentations

Problem statement

The insights team spends 2-3 days manually extracting data from the existing app and other databases to create a single PowerPoint report on expert leaders' social performance metrics for brand managers. Expert leaders are subject matter experts who have deep knowledge of a certain topic or industry.
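The manual extract-and-paste workflow described above is, in principle, automatable. As an illustrative sketch only (the data model, field names, and metrics here are hypothetical, not the app's actual schema), the following shows how engagement metrics pulled from a database could be assembled into slide-ready content, which a rendering layer such as python-pptx could then turn into an actual deck:

```python
# Hypothetical sketch: assemble expert-leader engagement metrics into
# slide-ready content. Field names and structure are illustrative only,
# not the app's real schema.

def build_report_slides(leaders, industry):
    """Turn a list of expert-leader metric records into slide definitions.

    Each slide definition is a (title, bullet_lines) pair that a rendering
    layer (e.g. python-pptx) could turn into a real PowerPoint slide.
    """
    # Cover slide summarising the report's scope.
    slides = [(f"Expert leader social performance: {industry}",
               [f"{len(leaders)} expert leaders covered"])]
    # One slide per leader, highest engagement first.
    for leader in sorted(leaders, key=lambda l: l["engagement"], reverse=True):
        bullets = [
            f"Followers: {leader['followers']:,}",
            f"Engagement rate: {leader['engagement']:.1%}",
        ]
        slides.append((leader["name"], bullets))
    return slides

# Illustrative sample data.
leaders = [
    {"name": "Ada", "followers": 12500, "engagement": 0.042},
    {"name": "Grace", "followers": 9800, "engagement": 0.057},
]
slides = build_report_slides(leaders, "AI")
```

Because the slide content is produced as plain data, the same structure can be previewed in the app or exported to PowerPoint without re-extracting anything by hand.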

The main objective

To reduce the manual hours spent by our primary users, the insights team, in producing reports, aiming to increase report delivery by 20% while significantly improving turnaround time.

User research


Method 1: Qualitative data from interviews

I conducted individual interview sessions with members of the insights team. The primary aim was to ascertain the essential information required for creating the PowerPoint presentations. I requested that participants provide examples of reports they had previously made.

Using the reports they have brought in the interview, my objective was to identify key information that needed emphasis, discern any recurring trends, and determine if similar patterns of information were being delivered to different brand managers. Additionally, I sought to understand the time investment required for creating these reports.


Method 2: Affinity mapping

After the interview phase, I categorised the data collected during the interviews. This approach facilitated the identification of the information elements required for report generation. The process enabled us to effectively replicate the informational content present in existing PowerPoint presentations while simultaneously enhancing the discoverability of these reports within the application interface.


Method 3: Card sorting

I organised and facilitated a card sorting session with the insights team. During this exercise, I requested participants to categorise the information previously gathered through interviews. The primary aim was to gain valuable input on effective report structuring approaches.

The goal was to ensure that users who received the PowerPoint presentations would encounter familiar information patterns. This approach was designed to instil confidence in the accuracy and reliability of the generated reports.

Key findings

* 80% of our insights team members were manually extracting data from the platform and incorporating it into PowerPoint presentations.

* 75% of our insights team members produce essentially identical reports, with variations only in the specific industry and brand manager for whom the report is intended.

* 80% of our insights team members find the repetitive nature of their current reporting tasks to be unchallenging and demotivating.

Define: "How might we" questions

"How might we" questions were developed to help support the insights team.

* How might we reimagine the report creation process to reduce time investment while maintaining or improving quality?

* How might we streamline our insights reporting process to enhance efficiency while preserving the ability to tailor content for specific industries and brand managers?

* How might we transform our reporting processes to enhance job satisfaction and empower our insights team to engage in more stimulating and impactful work?

Design process

Initial concepts

After conducting team interviews and analysing report examples, I translated the findings into initial designs using Figma. The design approach focused on three key areas:

Initial concepts 1

* Creating a new layout to improve visual hierarchy and overall user experience.

* Integrating data already present in user profiles to reduce manual input and increase efficiency.

* Developing an intuitive process to simplify report generation and reduce the manual effort of report creation.

Initial concepts 2

Iterative stages

I made several amendments to the initial design to improve overall usability:

* Incorporating familiar metrics from user profiles to enhance recognition and minimise cognitive effort for users.

* Introducing a benchmarking visual aid to help users quickly assess whether an expert leader's engagement metric has improved or declined over time.

* Integrating a feature that displays which expert leaders are included in a report, giving users a clear overview of their selections.
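The benchmarking aid above boils down to comparing the current period's metric against a baseline. A minimal sketch of that logic, assuming a relative-change threshold for filtering out noise (the threshold value and function name are illustrative, not the shipped implementation):

```python
def engagement_trend(previous, current, threshold=0.05):
    """Classify an engagement metric's movement between two periods.

    Returns "improved", "declined", or "stable". Relative changes smaller
    than `threshold` are treated as noise; the 5% default is illustrative.
    """
    if previous == 0:
        # No baseline: any positive value counts as an improvement.
        return "improved" if current > 0 else "stable"
    change = (current - previous) / previous
    if change > threshold:
        return "improved"
    if change < -threshold:
        return "declined"
    return "stable"
```

In the UI, each classification maps to a visual cue (for example, an up or down indicator next to the metric), so users can scan a report without computing deltas themselves.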

Prototyping and testing

Prototyping method

Figma was used to create high-fidelity prototypes of the reports. With these prototypes, I was able to add interactivity and simulate user flows. The designs needed to appear finalised so that the insights team could see exactly what they would see in the finished product.

Remote usability testing

I organised individual testing sessions with members of the insights team. I presented each participant with a specific scenario: to create a report featuring 10 expert leaders in the AI space.

Participants were encouraged to think aloud during the test, vocalising their thoughts and decision-making processes as they navigated through the report creation task. This method allowed us to gain deeper insights into their cognitive processes and identify any pain points or areas of confusion.

I conducted and recorded all test sessions using Microsoft Teams, ensuring we could review and analyse the interactions in detail later. This approach provided us with valuable qualitative data on user behaviour and preferences, which we could then use to refine our design further and improve the overall user experience.

Usability testing feedback

Results and iterations

Our testing sessions with the insights team produced promising results while also revealing areas of uncertainty. We addressed these uncertainties by removing problematic elements and adding the related insights to our backlog for future consideration and development.

The feedback from the insights team was largely positive. They particularly appreciated the speed and ease of report creation, highlighting the efficiency of our new design. The consistency in displayed metrics was another aspect that received praise. This standardisation of input not only streamlined the reporting process but also opened up potential business opportunities.

Card vs tabular view A/B testing

Minimum Viable Product (MVP)

Key features and benefits

* A primary action for our users when creating a report

* Standardisation of reports

* Ability to view more than one expert leader's engagement performance at a time

Hotjar and ChurnZero 1 year post evaluation

Impact and results - 1 year post evaluation

Results: 

* 100% of our insights team can confidently create a report quickly

* 20% of our insights team's workload is dedicated to creating customised reports for upsell

* Brand managers from Enterprises have started creating and viewing reports on the app

Task success: 

* Before: 2-3 days to create a report covering 10 expert leaders' engagement performance

* After: ~10 minutes to create and view a report on the app

User satisfaction metrics: 

* 100% of our insights team rated us 8 or above, showing positive sentiment.

* CSAT 90%

User feedback

The insights team, our primary users, responded positively to the new reporting feature, emphasising its ease of use and significant time-saving benefits. Some feedback from team members is presented below:

“This is amazing! This will save us a lot of time creating reports and I know the brand managers will love this because they see multiple expert leaders at once on screen so they can compare them!” - Patrick

“Siemens love this! They have purchased an annual subscription of it!” - Will

Business outcomes

* Increase in annual revenue of £32,500 through reports purchased by Siemens

* 20% boost in capacity to deliver tailored reports that facilitate upselling opportunities.

* Improved product positioning against competing social listening and influencer tools

Rapid iteration feedback

Lessons learned

Challenges

A significant challenge I encountered during this project was working within tight time constraints. The insights team's busy schedule, filled with urgent report creation tasks, often led to rescheduling of our interview sessions. 

To adapt to these limitations, I changed our interview structure. We focused on diving directly into the most crucial aspects of the reporting process, prioritising discussions about valuable information and de-emphasising less critical elements. This approach allowed us to gather essential insights efficiently, despite the limited time available with each team member.

Some challenges also arose during implementation, so it was important to keep regular communication with the tech team in place. Issues surfaced during alpha testing were quickly flagged within the team, allowing us to resolve them through rapid iterations.

What worked well

Throughout the project, maintaining consistent communication with the insights team proved crucial in achieving a viable outcome. Given their heavy workload, I adapted our approach to accommodate their limited availability. This meant conducting shorter, more focused interview sessions and scheduling follow-up meetings as needed.

It was also important to have regular and consistent communication with the technical team to ensure the progress of the project and discuss any technical concerns.

By involving the insights team and the technical team from the project's inception - through user interviews and usability testing - we fostered a collaborative environment. This early and ongoing engagement was key to our success.

Next steps

Future enhancements

The initial release of the new reporting feature has been well-received by the insights team, marking a significant improvement in their workflow. One key opportunity for improvement is the integration of data from additional social channels. This expansion would provide a more comprehensive view of expert leaders' engagement performance, enabling users to make more informed decisions.

Another potential feature we're considering is the ability to export reports. This functionality would allow the insights team to easily share reports with brand managers and other stakeholders, who could then distribute them more widely within their organisations.

These potential enhancements demonstrate our commitment to continuous improvement and responsiveness to user needs. As we gather more user feedback and usage data, we'll prioritise these and other features to ensure our reporting tool remains effective and relevant.
