Validating an Old Feature
Project Team: Chris Hill, Hudson Diaz
Project Summary:
With the “Search to Create” workflow wrapped up, it was time to address a user need that had been uncovered during previous iterations of the product: Validate. We knew users needed an easy way to gather feedback about created profiles within the application, so the team designed and shipped a feature to replace the costly manual options users relied on to learn how peers were thinking about skill profiles.
Key Outcomes:
Shipped feedback feature
Shipped mobile-optimized survey feature
Important Context:
The Company:
SkillsEngine is a startup and skunkworks project of TSTC, a regional technical college here in Texas. The goal of SkillsEngine is to advance prosperity through a shared understanding of skills. Their application (Builder) was designed to give users access to a curated library of skills so that they could create, validate, and manage skill profiles. These profiles could then be used for a number of things, including curriculum creation, job description generation or validation, and job function analysis.
My Role:
As a senior designer at SkillsEngine, I was tasked with defining, designing, and releasing features and functionality that would improve client acquisition. It was also my goal to improve the usability of the application as well as the processes and tools of the design team.
Project Kick Off:
With other core user journeys finished, focus shifted to another feature on the roadmap: Validate. Validate was a tool from a previous iteration of the platform (Calibrate), and leadership believed it was time to start incorporating this feedback device into the newest iteration of the platform.
Initial Discovery:
Because we had experience with this feature before, the team felt we could move forward using most of the knowledge gained from the previous iteration of the Validate tool. That said, discovery for this project involved meeting with engineers and other product team members involved in the original iteration to gain insight into user needs, pain points, and opportunities for improvement. The following were the key takeaways from those conversations:
Ultimately, users wanted to gain a sense of confidence from the feedback sessions. They needed to ensure that the information they were using on a profile was the right information for their use case.
Regionality, reviewer experience, rating, and the number of reviewers were some of the most important data points for users of the feature. The feature needed a quantitative element that users could examine.
Users would occasionally gather this feedback via in-person workshops, which allowed for back-and-forth when the information wasn’t accurate. We wanted to maintain a qualitative touch with the feature.
Design Approach:
The validation results page would include several visual components that helped users determine what changes to make to a profile. These components included a skills table, a data panel, and a comment section. Together, these tools would paint an actionable picture for our users with both qualitative and quantitative data.
Data Panel:
This component was a new addition to the application; nothing else in the product looked quite like it. We wanted a way to display quantitative survey data quickly and prominently so that users could determine the status of a review session, the high-level sentiment of the content within the profile, and a breakdown of the survey respondents. We felt the panel’s primary job was to help users understand what kinds of information they were getting from a review session, so we placed it in the top left and used bold colors and larger font sizes to increase its visual importance.
Skills Table:
Our skills table could be found on many pages within the application, so we had a general understanding of what would be included from both a visual and interaction standpoint. Notable changes, however, were the addition of two data points indicating the average review given and the number of responses for each skill. This table was the main focus of the page, so naturally we decided it should occupy the most space. With that in mind, we gave it the top portion of the page, directly to the right of the high-level data panel. We felt this created a natural flow for users moving from high-level data to more specific skill data within the table.
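Those two data points amount to a simple aggregation over survey responses. As a minimal sketch only, here is what that rollup could look like in Python; the field names (`skill`, `rating`) and the flat response structure are hypothetical, since the actual data model isn’t described here:

```python
from collections import defaultdict

def summarize_responses(responses):
    """Aggregate survey responses into per-skill stats.

    `responses` is a list of dicts with hypothetical keys
    'skill' (str) and 'rating' (numeric). Returns a dict mapping
    each skill to its average rating and response count -- the two
    data points shown in the skills table.
    """
    totals = defaultdict(lambda: {"sum": 0.0, "count": 0})
    for r in responses:
        entry = totals[r["skill"]]
        entry["sum"] += r["rating"]
        entry["count"] += 1
    return {
        skill: {
            "average_rating": round(t["sum"] / t["count"], 2),
            "responses": t["count"],
        }
        for skill, t in totals.items()
    }

# Example: reviewers rating two skills on an assumed 1-5 scale
sample = [
    {"skill": "Welding", "rating": 5},
    {"skill": "Welding", "rating": 4},
    {"skill": "Blueprint Reading", "rating": 3},
]
print(summarize_responses(sample))
```

The same pass over the responses could also feed the data panel’s respondent breakdown by grouping on a reviewer attribute instead of the skill.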
Comment Section:
The comment section was also a new addition to the platform. We chose to include it so that users could have a more well-rounded understanding of the content within a profile. While the data panel would provide a quantitative look at the skills, the comment section would allow respondents to leave qualitative feedback. Because of the importance of the other two panels, and because leaving a comment was optional, we placed this section below the more important skills table.
Testing:
Once we felt the designs were in a good place, we reviewed the mockups with a small group of beta users to evaluate our decisions. Again, we knew from the previous iteration that this feature was valuable, so we were really only looking to understand whether our visual layout was intuitive. Feedback from the group was generally positive, with only a few questions about control functionality. After a few iterations on labeling and controls, we felt we were ready to meet with leadership for final sign-off on the designs. They agreed with our direction and felt the feature was ready for a first-version handoff.
Implementation:
Because this was such a large feature, the team decided to break the implementation into multiple parts. The first unit of work focused on backend tasks that would create the mechanisms for the data storage and retrieval the page needed. The survey (this project’s data-collection component) would be handled by another team. Once the backend architecture was in place, the team implemented the front end that connected to it.
Outcomes:
This project resulted in a new feature that met a key user need and allowed us as a company to pursue new clientele that required the functionality. As the feature went live and usage grew, the team eventually revisited the project and added functionality to support a high-variance workflow highlighted by users.