User Experience Case Study
THE UX BEHIND ASSOCIATES
Process
How we tackled the challenge
I used the following process to sequence the activities that gave the team key insights as we worked to understand the problem, define a direction, craft a solution, and evaluate our results.
Understand
Focused on understanding the users and the content. Led user interviews and a content audit and analysis.
Methods
- Stakeholder Interviews
- User Interviews
- Surveys
- Personas
- Content Audit
Define
Determined the information architecture through a card sort and tree tests, and designed user flows.
Methods
- Card Sort
- Tree Test
- User Flow
- Job (User) Stories
Craft
Created the experience. Iterated on wireframes, developed a prototype, and defined the visual design.
Methods
- Wireframe
- Prototype
- Visual Design
Evaluate
Tested the experience with real users doing real tasks in order to uncover areas to improve.
Methods
- User Test
- Usability Test
- Retrospectives
Understand
The foundation for subsequent phases
Conduct Interviews
Understand Stakeholders & users
For stakeholders, I wanted to understand their goals for the project, their perception of any challenges, and their definition of success.
For users, I wanted to understand their attitudes and approach towards key activities, conversations with customers, and how they thought through challenges. I interviewed users where they worked so that I could observe and ask questions about their environment.
We did not discuss idealized process flows or application-specific topics related to the future solution. These interviews were not requirements-gathering sessions, and we did not focus on feature sets. When the topic came up, I pivoted the conversation to explore the pain points that the proposed solution or feature would address.

Avatars made with Pablo Stanley’s library
Conduct follow-up surveys
Quantify qualitative findings
To continue to understand the users, I led the team in creating a set of survey questions that dove into users' behaviors, thoughts, and pain points so we could gauge their relative frequency and significance.
Build Personas
360° View of our Users
Personas were created from field research, survey results, anecdotal evidence supplied and observed during interviews, and abstracted evidence from market and additional user research. The focus was not to create a beautiful deliverable, but for our team to build a shared understanding of our users.

Conduct Content Audit
Understand the content
To begin to understand the information space, I ran a content audit. Product owners identified the content that would be featured within the platform, including current content and materials to be released within the year. In a sea of training sheets, product one-pagers, seasonal updates, and special offers, the organization's goal was a system that could administer all of the collateral and serve it to associates to use with their clients. The audit included a series of fields to help identify, categorize, rank, and expire the content. The majority of the materials were public facing and thus had both a legal and a marketing lifespan.
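As a rough sketch, the record below shows the kind of fields such an inventory row might capture. The field names are hypothetical, not the actual audit template:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ContentItem:
    """One row of a content inventory (hypothetical fields, for illustration only)."""
    title: str
    content_type: str            # e.g. "one-pager", "training sheet", "special offer"
    owner: str                   # business or marketing owner responsible for updates
    audience: str                # "public" vs. "internal"
    rank: int                    # relative priority for the initial release
    legal_expiration: date       # date after which the piece may no longer be used
    marketing_expiration: date   # date after which the piece is stale or off-brand

    def is_expired(self, today: date) -> bool:
        # Public-facing collateral retires at the earlier of its two lifespans.
        return today > min(self.legal_expiration, self.marketing_expiration)
```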
With the content inventory complete and expiration dates accounted for, the business prioritized the backlog of documents for the initial release. Seasonal and one-time offers were prioritized for subsequent releases of the application. The tiered release of content provided the business a timeline to review and update materials for go-live, while our project team kept focus on the foundational activities to bring the system to life.
John McCrory’s Content Audit Template segments content in a simple way. Without getting into maturity models, content quadrants, content life cycles, and the like, his diagram provides a basic plan for content strategy.

Key Findings
Please mind the gaps
Through user research, we uncovered key insights about both associates and the Marketing department. I consolidated and discussed my findings with the team and product owners. We referenced the user research as we defined the backlog of features, and throughout design and development.






Define
Determine the direction of the solution

Photo by Bryan Minear on Unsplash
Information Architecture
Structure information
After gaining a shared understanding of various aspects of our project, I shifted my focus to analyzing, organizing, and defining a strong content hierarchy. My goal was to provide a logical organization of content and help users navigate the information space.

Card Sort
Grouping content
Card sorting is a technique used to uncover the structure and grouping of information and concepts. The benefit is that it allows participants the opportunity to group concepts in a way that makes sense to them.
I opted to use OptimalSort, a web-based product by Optimal Workshop, to run an unmoderated exercise with 26 associates, all located in different cities. I also ran this activity for the admin side of the application with all of the Marketing admins, who were located in two cities.
After working to index the information for participants to group, I logged in to OptimalWorkshop and entered the information onto digital cards. After configuring the study, I sent a link to the participants. After about two weeks, all of the participants had completed the sort, and I logged in to view the results. I analyzed the Similarity Matrix and Dendrogram, then reviewed the results and recommended information hierarchy with the team. The project team found it enlightening to see how participants clustered information and grouped ‘like’ concepts.
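To make that analysis concrete, here is a minimal sketch of how a similarity matrix can be derived from card sort data: for each pair of cards, count the share of participants who placed them in the same group. The card names and sorts below are illustrative, and this is not OptimalSort's implementation:

```python
from itertools import combinations

# Each participant's sort: group label -> cards placed in that group (illustrative data).
sorts = [
    {"Offers": ["Spring promo", "Rate sheet"], "Training": ["Product 101"]},
    {"Promotions": ["Spring promo"], "Reference": ["Rate sheet", "Product 101"]},
]

cards = ["Spring promo", "Rate sheet", "Product 101"]
pair_counts = {pair: 0 for pair in combinations(sorted(cards), 2)}

# Tally how many participants grouped each pair of cards together.
for sort in sorts:
    for group in sort.values():
        for pair in combinations(sorted(group), 2):
            pair_counts[pair] += 1

# Similarity = share of participants who grouped the two cards together.
for (a, b), count in pair_counts.items():
    print(f"{a} + {b}: {count / len(sorts):.0%}")
```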

Tree Test
Test our navigation
Also known as a reverse card sort, a tree test is a great way to evaluate the findability of information within an application using realistic user tasks.
I used a tree test to check how easily users found information. If this had been an existing app, I’d have suggested running the tree test first in order to evaluate the effectiveness of the current information architecture (IA) and to create a baseline if it tested poorly. With this being a new application, I preferred to run the tree test second, to help evaluate the suggested IA.
To write user tasks, I started with the information gathered in prior stages. After additional input and refinements from the team, I jumped into TreeJack, OptimalWorkshop’s online tree testing product.
I entered our content hierarchy, being sure to correctly capture all of the parent-child relationships. I then added the user tasks. Once I had the study configured, I sent links to the participants. After a week and a half, I logged in and reviewed the results. Parts of the hierarchy performed fantastically, and a few areas needed improvement. The team and I reviewed the results and were able to make decisions that were backed by data.
We removed some of the esoteric marketing labels that had been included in the first round, ran the tree test again, and saw an improved task success rate.
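As an illustration of how those success rates roll up, the sketch below tallies per-task outcomes the way a tree testing tool typically reports them. The task names and counts are invented, not our actual results:

```python
# Per-task outcomes from a tree test round (illustrative data, not actual results).
# "direct" = reached the correct node without backtracking; "indirect" = with backtracking.
results = {
    "Find the spring rate sheet": {"direct": 18, "indirect": 4, "fail": 4},
    "Locate client onboarding training": {"direct": 12, "indirect": 6, "fail": 8},
}

for task, counts in results.items():
    total = sum(counts.values())
    success = (counts["direct"] + counts["indirect"]) / total
    print(f"{task}: {success:.0%} success ({counts['direct'] / total:.0%} direct)")
```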

User Flow
The Path Forward
From the research gathered, we worked to identify, prioritize, and generate a series of user flows. Each flow began with a decision tree that the team reviewed. I then sketched a series of representative interfaces on my iPad and put them in the order of the flow. I made the decision tree and user flows available to all members of the team so we could always refer back to them as we continued to move forward.


You’re the real MVP
Building a user-focused backlog
While I focused on defining the strategy and information architecture, the business began meeting to create a backlog based on the key learnings and the additional features they wanted to include. We referenced our user research as we defined the backlog of features.
Use existing user mental models
Tag content for search behaviors
Integrate calendar & contacts
Recommend strategies to aid sales
Build an intuitive CMS*
Expose marketing’s backlog
Integrate content usage reporting
* Content Management System
** Plus: authentication, security, etc.

* Representative personas featured later in the case study
Craft
Explore and create the experience

Photo by William Iven on Unsplash
Wireframe
Blueprint the layouts
I used a combination of my iPad and Sketch to create wireframes of varying fidelity. By keeping things in a lower fidelity longer, we could quickly build up and tear down without a large investment in time and effort. I put wireframes into a prototype in order to test layouts and the journey with our users, admins, and stakeholders. As key decisions were made, I increased the fidelity of the wireframes.

Prototype
Model the experience
During this project, I used InVision to create a clickable prototype that could be updated and accessed online. Our team and stakeholders could feel the interactions and provide feedback through InVision’s comment system. This type of collaboration gave us a common reference during conversations and, in the end, let us evolve the design of the system more efficiently.
Visual Design
Extend the brand
The challenge was to expand the brand’s visual design language. As our project was being developed, the brand was in the midst of a pivot. The digital ecosystem was the focus, and therefore all style and brand guidelines were in flux. For example, certain elements mentioned in the style guides were redlined "TBD".
We could not stop our project, so I used the print and digital guides as a base. I reviewed elements and screens with both brand and our compliance partners. There were challenges: feedback was a mix of opinions, politics, and suggestions that would not have been ADA compliant. Fortunately, testing prevailed, and many suggestions were overturned using research, data, best practices, and trust built over time.
Visual Design Samples
Elements of our application’s design
Search
We placed search at the top and aligned it with search results to ensure quick access. Within the list of suggestions and instant results, we highlighted the additional terms to make it easier to focus on what distinguishes each result.

Discover Materials
List View focuses on information, making it easier to review documents one by one; it is the traditional view that users preferred when browsing through new materials.

Photo by rawpixel.com on Unsplash. Edits by me.
Document Card in List View

Find “That” Document
Grid View provides larger document images that support quick scanning and recognition of documents. This is the preferred view when locating known materials.

Photo by rawpixel.com on Unsplash. Edits by me.
Document Card in Grid View

Evaluate
Uncover areas to improve

Photo by David Travis on Unsplash
Usability Testing
MODERATED & IN-PERSON
An article by the Nielsen Norman Group suggests that 5 participants per round maximize the benefit-cost ratio, while additional users in the same round return a decreasing marginal benefit.
Based on a future user population of just under 20,000 associates, I analyzed the segments of the population, and used various heuristics to derive the makeup of a sample group to test. I recommended more rounds of testing instead of more users per round. We agreed to a total of 20 participants across 4 rounds.
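The reasoning behind small, repeated rounds follows from Nielsen and Landauer's problem-discovery model, in which the share of usability problems found by n users is roughly 1 - (1 - L)^n, with L (the average probability that one user uncovers a given problem) commonly cited as about 31%. The sketch below applies that rule of thumb; the exact value of L varies by product and task set, so treat it as a heuristic rather than a guarantee:

```python
# Nielsen & Landauer's problem-discovery model: the share of problems found by n users
# is roughly 1 - (1 - L)**n, with L (per-user discovery rate) commonly cited as ~31%.
# L varies by product and task set, so this is a rule of thumb, not a guarantee.
L = 0.31

def share_found(n_users: int) -> float:
    return 1 - (1 - L) ** n_users

print(f"5 users in one round:  {share_found(5):.0%} of problems")   # ~84%
print(f"20 users in one round: {share_found(20):.0%} of problems")  # ~100%, but all on the same design
# Four rounds of 5 let the team fix issues between rounds and re-test the improved design,
# surfacing new problems that a single 20-user round on the original design would miss.
```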
I pushed for moderated, in-person tests conducted within the associates’ place of work. Conducting usability tests in a lab setting is a wonderful way to control variables; however, for our application, we decided to test within the associates’ environment, since it was to be used both as a preparation tool and as a tool used at the speed of conversation with clients.
While working on sending out requests and obtaining confirmation from 20 participants, I conducted a small pilot study with 3 associates. The pilot study provided an opportunity to improve the wording of both instructions and tasks. The tasks themselves were set up to capture time on task, success rate, and error rate. Each task had an additional follow-up question to capture subjective impressions of the task or feature being tested.
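As a simple sketch of how those measures might be summarized per task once sessions are logged (the records below are illustrative, not our actual data):

```python
from statistics import mean

# One record per participant attempt at a single task (illustrative data).
attempts = [
    {"participant": "P1", "seconds": 42,  "success": True,  "errors": 0},
    {"participant": "P2", "seconds": 75,  "success": True,  "errors": 2},
    {"participant": "P3", "seconds": 120, "success": False, "errors": 3},
]

success_rate = mean(1 if a["success"] else 0 for a in attempts)
avg_time = mean(a["seconds"] for a in attempts)
error_rate = mean(a["errors"] for a in attempts)

print(f"Success rate: {success_rate:.0%}")
print(f"Mean time on task: {avg_time:.0f}s")
print(f"Errors per attempt: {error_rate:.1f}")
```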
I invited the team and stakeholders to observe via screen share with audio. I put those who attended on mute and recorded the sessions as well (don’t forget to get consent to record and release if you decide to do this too!). After the sessions were complete, I sent out a survey to gather subjective measures of satisfaction with the application.
Overall, usability testing was a benefit to our team, as we gathered metrics and insights that we used to improve the application. We prioritized fixes by balancing them against the remaining timeline and budget. Aspects deemed a lower priority were not lost; instead, they were added to the backlog.
Retrospective
KEEP & START
We built an application that had a defined content hierarchy, was easy to navigate, and had a strong information scent that guided users throughout their journey. Analytics gathered after launch helped us continue to improve the application.
Below is my own personal retrospective on the project:
I’m going to keep:
- Advocating for UX to be invoked early in projects instead of viewed as a ‘bolt-on’
- Balancing user and business goals
- Iterating on my process and ensuring it flexes to the product, team, and environment
I’m going to start:
- Improving the speed and quality of the design feedback loop
- Working with my teams to hasten that feedback loop
- Making design feedback discussions more productive
