
G2 - Business Software & Services Reviews 

Needs Assessment & Usability Evaluation Project

UX Researcher 

1/1/17 - 4/30/17


Overview 

For my Needs Assessment and Usability Evaluation master's course, I worked within a team to evaluate users' needs and the usability of the G2 website. G2 is a website that provides validated and unbiased peer-to-peer reviews of business software and service solutions. To evaluate the website, we used several techniques: mapping the primary user paths with an interaction map, conducting user interviews, performing a comparative evaluation of competitors, gathering quantitative data through surveys, and conducting a heuristic evaluation and usability testing.

Interaction Map

To begin evaluating the website, we created an interaction map to better understand its scope and the variety of paths users can take. Because the website is complex, we did not map out every interaction. Instead, we gathered data from our client, G2, who told us that users commonly take three different paths through the site, so we decided to focus on those: Search Products (from the website's search bar), Browse Products, and Compare Products.

Interaction Map Results

Strong Support for Error Management

Information Overload

Contact Form Requests Too Much Personal Info

Presentation of Interaction Map Results

 

To help our team present our findings clearly, I wrote a concise script for our team member to use during the presentation. As she was not used to public speaking, she found the script useful, and it helped allay some of the anxiety many people feel when presenting to a large group.

Research Questions

The primary research focus was to evaluate the user experience of the G2 website and identify the different needs of key user groups. The overarching research question was: Is the experience of using G2 different for users from businesses of different sizes? If so, should G2 personalize the experience for these users, and how?

Interviews

After completing the interaction map, we conducted a series of interviews: one with the G2 Product Manager and five with a variety of users. User recruitment was completed with the help of G2. We created an interview protocol, and the team conducted the interviews in pairs, with one person moderating and both taking notes.

6 Participants
Roles: G2 Crowd Product Manager, Business Analysts, Manager, Executive
Industries: Software Development, Food Services, Real Estate, Financial Services
Company Size: Small, Mid-market, Enterprise
Frequency of Use: Weekly, Quarterly, Yearly

Data Analysis

The interviews were audio-recorded and transcribed. To analyze the data, we held interpretation sessions and collected affinity notes. Then we organized and analyzed the affinity notes to identify common themes and problems. 

Key Findings

1

Users Said:

"I need help understanding and interpreting the graphs."

Finding: 

Users find it difficult to understand some of the graphs, especially the spider chart. 


2

Users Said:

"Sometimes I can't tell when the screen has changed. You need to make it more obvious and user-friendly."


Finding: 

After using the sort function, users can't tell that they are seeing new data due to a lack of visual feedback.

3

Users Said:

"Sometimes, I have to use the search function several times before I can find what I'm looking for.​"

​

Finding: 

Users find it difficult to find what they are looking for, because they don't know which keywords to use. 


4

Users Said:

"When reviewing the grid, I want to know at glance how many people have reviewed the software.​"

​

Finding: 

Users want to see at a glance how many reviews each company has received when they view G2 Crowd's 4-quadrant grid.

5

Users Said:

"The grid needs to show me how the software has been rated over time, not just recently."


Finding: 

Users are unable to see whether or how software ratings have changed over time on the G2 Crowd 4-quadrant grid. They want to be able to access historical ratings.

Personas

After completing the interviews, we developed personas based on the information we discovered. The personas helped the team remain connected and empathetic toward our users, always keeping their goals, motivations, frustrations, and the features most important to them in mind. 

Recommendations

Based on the findings from the user interviews, we recommended several ways to better serve users' needs: making data easier to understand visually, improving the accessibility and features of the data provided, helping users find the information they are looking for, and providing clear indications when data are updated after users perform a sort.

Remove the spider charts and provide better guidance to help users interpret the remaining graphs and other data visualizations.

Provide users with stronger visual feedback when sorting and filtering data to indicate a change in results.

Improve keyword-based search by tagging the product type and the product name.

Provide users with immediate feedback on a product's average rating on the G2 Crowd 4-quadrant graph.

Add an option for users to view the historical ratings of software to help them make more informed buying decisions.

Comparative Analysis 

To better understand the competitive landscape, we conducted a comparative analysis of companies that were direct, partial, and parallel competitors of G2. We also included non-competitors that did not offer exactly the same type of product but were similar enough for us to learn from.

Survey

To gain a broader understanding of G2's current and potential users and of their needs, attitudes, characteristics, and current practices, we designed and conducted a survey. Our target population consisted of users who had recently visited G2's website. Survey participants were recruited by G2 via email and were chosen because they had recently submitted questions to software vendors through G2's website. In total, 38 participants' responses were recorded and analyzed.

Findings & Recommendations

Based on the findings from the survey, we recommended several ways to better serve users' needs:

Heuristic Evaluation

Each member of the team independently conducted a heuristic evaluation of the G2 website using Nielsen's heuristics. We then held a team debrief so that all members' observations were aggregated, discussed, and prioritized. Next, in conjunction with the personas we created earlier, we assigned numerical severity scores to the usability issues we identified. Assigning values to these issues helped us prioritize the list, so we could focus on the issues causing users the most difficulty.

Findings & Recommendations

Usability Testing

After the heuristic evaluation, we conducted usability testing. We decided to administer a survey before and after the testing. When deciding which tasks to test, we used our previous research findings to focus on the most critical areas: the Software & Review Search functions, the Software Comparison functions, and the Software Category functions. Since most of the software on the G2 website is for businesses, we created fictional scenarios in which participants were asked to imagine themselves working for a business interested in purchasing new software.

Final Findings & Recommendations

With the interaction map, interviews, personas, comparative analysis, survey, heuristic evaluation, and usability testing complete, we determined our final findings and made the following recommendations to our client, G2:

Discussion

Our initial research questions concerned evaluating the user experience of the G2 website and identifying the needs of key user groups. The results of our research do not indicate that different user groups have different needs. For example, most users visit the site to compare software. Because this is a heavily used function, users from different industries and in different positions reported that it is important to them and that it is currently difficult for most of them to use. As part of the software comparison process, users also relied heavily on the data visualizations to compare software; this was another area that suffered from significant usability issues.

 

To address the remaining two findings, we recommended using prominent breadcrumbs to aid navigation, avoiding an overly minimalist design to improve the readability of text, providing stronger visual feedback when filtering information, and making the "follow" function easily discoverable. For the last finding, we did not anticipate that users would try to click the icons on the landing page to access the website's functions. However, we found during usability testing that many users did try, and they were frustrated when they were unable to use the icons to accomplish their goals.


