Company
Becker Professional Education

Product
CPA
Industry
Education - eLearning for Accounting Professionals
Deliverable
New Feature

Timeline
8 months
Main Tasks
User Research, Data Analysis, User Experience & User Interface

1. Project Overview

Project Scope
In the past few years, Becker has lost market share. To turn this around, Becker's leadership decided to change its strategy from a fit-to-all learning path towards hyper-personalized, adaptive learning for each student.

To accomplish that, Artificial Intelligence (AI) and Machine Learning (ML) emerged as the most natural solution: adapting to real-time data and students' needs.

We didn't build the AI algorithm in house, so we had to work with an external stakeholder, Sana Labs. Sana Labs' AI technology within the course tracks every student interaction to generate a unique review experience for each student according to their individual weaknesses and learning needs.
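To make the integration concrete: below is a hypothetical sketch of the kind of per-interaction event the course would need to forward so the model can learn each student's weaknesses. Sana Labs' real API is not documented in this case study, so every name, field, and endpoint here is an assumption.

```typescript
// Hypothetical sketch only: Sana Labs' real API and field names are not documented
// in this case study, so everything below is an illustrative assumption.
interface StudentInteractionEvent {
  studentId: string;                    // anonymized learner identifier
  unitId: string;                       // content Unit the question belongs to
  questionId: string;
  questionType: "MCQ" | "Simulation";   // the two exercise types used in the course
  correct: boolean;                     // whether the submitted answer was right
  secondsSpent: number;                 // time spent on the question
  timestamp: string;                    // ISO 8601
}

// Forward each interaction to a (hypothetical) personalization endpoint so the
// model can assemble the next review session around the student's weaknesses.
async function trackInteraction(event: StudentInteractionEvent): Promise<void> {
  await fetch("https://personalization.example.com/v1/interactions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(event),
  });
}
```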

With these new technological possibilities, we were able to take a step ahead of our competitors and start using our users' behavioral and real-time data to create highly contextual communication that is relevant to each user's study proficiency.
Stakeholders & My Role
I was the sole Designer on an Agile team composed of 3 Developers, 2 Product Managers, 1 Software Development Manager, and 4 Quality Assurance Engineers.

I played three roles:
Management: I was responsible for determining the overall design direction; I defined the design sprints; I wrote and managed Jira design stories and tasks; and I took part in the design review sessions.

User Research: This project raised a lot of new technological and design challenges. User research had to be constant because many decisions had to be made based on user research insights.

Product Designer: I collaborated with the team on the ideation process, analysed user insights, both qualitative and quantitative (factor analysis), and made my design decisions based on them. I developed the needed prototypes and then tested them with users.
Project Goal
Supported by AI & ML technology, switch the product from a fit-to-all experience and learning path towards a personalized, individual experience: one that reads students' assessment tests and tailors a program of study to each of them, maximizing their study time and targeting their pain points, and that allows students to plan their study schedule and make the most of their limited time.
KEY PROBLEM
How do we shift from tracking users' progress to tracking users' proficiency?
Constraints
- Outsourcing the AI algorithm development to a third party.
- A design team of one.
- Only remote access to users.
- The feature had to run in an external application provided by AICPA, with a user interface we could not control or change.

2. My Approach

Process: Contextual Inquiry (empirical process: gathering raw work activity data from users and business goals) → Contextual Analysis (inductive analytic process: data interpretation and consolidation into organized, structured work activity data) → extraction of Needs & Requirements (deductive analytic process) → extraction of Design Informing Models (e.g., flow models, usage scenarios) → Design (integrative process).

Hartson, Rex, and Pardha Pyla. 2012. The UX Book. San Francisco, United States: Elsevier Science & Technology.

3. The Results

Contextual Inquiry / Data Gathering
Part 1
Taking into consideration our key problem (how do users differentiate progress from proficiency?), I conducted a survey with a sample of 300 students to understand how their perception varies across certain types of graphic representations.
Survey Process
1st: I gathered 15 different graphic representations normally attributed to the concepts of progress and/or proficiency.
2nd: I asked users how they perceived each graphic individually.
3rd: I gathered all the graphic representations on a grid and asked which of them they thought best represented the notion of progress and the notion of proficiency.
Goal: Test for Distinction Bias.
Interesting Fact: Our users are all accountants. I found out with this study that they have an intrinsic need for every graphic representation to be accompanied by numbers, results, or a percentage description. They just love numbers :D
Contextual Analysis / Data Interpretation & Consolidation
The most voted graphic representation for Progress
A significant share of users (20.69%) voted for this representation as the most recognizable for tracking progress.
The most voted graphic representation for Proficiency
A significant share of users (30.43%) voted for this representation as the most recognizable for tracking proficiency.
For a more detailed understanding, feel free to download the full survey report here.
Needs & Requirements / Extraction
The two concepts are already hard to distinguish, so upfront I excluded every graphic representation that didn't have a significant result; I didn't want to fall into the trap of creating an ambiguous representation. Accordingly, I immediately excluded the following representations:
Design Informing Models / Flow Models & Usage Scenarios
Becker's Content
The CPA course has the following content structure:
As mentioned in our constraints, we needed to create a feature that could run in an external application with the following user interface.
Next, it was time to think about the feature itself and where to place it.
We decided to call the feature Personalized Review Session (PRS).
Following this structure, we defined the following user flow for the feature.
Important Note: The long-term goal is to apply the AI-personalized experience to all levels of study content, but for the first phase the feature was developed to run at the content Unit level.
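Since the content structure is shown as an image, here is a minimal sketch of the hierarchy as described in this case study (Course > Section > Unit > Module, with MCQs and Simulations under Modules); the type names and fields are my assumptions.

```typescript
// Minimal sketch of the content hierarchy referenced in this case study.
// Names and fields are assumptions; PRS (phase 1) runs at the Unit level.
interface Question { id: string; kind: "MCQ" | "Simulation" }
interface Module   { id: string; title: string; questions: Question[] }
interface Unit     { id: string; title: string; modules: Module[] }  // PRS attaches here
interface Section  { id: string; title: string; units: Unit[] }
interface Course   { id: string; title: string; sections: Section[] }
```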
Design / Wireframes & Prototyping - Iterative Process
Until we migrate the whole education system to be oriented towards proficiency rather than progress, we need a hybrid experience between both. At the Course and Section levels we still need to track progress, and we do it through the following graphic representations:
To avoid conflicting with the current percentage-based progress tracking, and with the support of the survey results, a star system emerged as the most logical approach to represent proficiency.

Problem: Exploring this representation, I realized that a 5-star system made the UI too crowded.

Refine: I simplified it to a 3-star system with half-star achievements.

Badge System
Now we needed a more visually intuitive way to communicate the student's level of proficiency. I explored several options and figured out that a badge approach was the most appropriate for the context, and we developed the following system.
Problem
I tested this concept and quickly learned that for a student to earn a half-star, they need to work through several Multiple-Choice Questions (MCQs) or Simulations. Students replied that they needed more immediate feedback on whether they were progressing or not. So, I refined the design to the following approach.
Refine
I included a kind of experience bar. Every time a student answers a question, the bar increases if the answer is right and decreases if it is wrong. Every time a student fills up the bar, they earn a half-star, until they reach the 3-star proficiency. Nonetheless, this podium stage is not guaranteed: if the student answers a question wrongly after already reaching 3 stars, they can lose proficiency on it, because we are evaluating knowledge, not progress.
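As a minimal sketch of this bar-and-star logic, assuming an illustrative fixed step per answer and six half-stars for full proficiency (the real increments come from the algorithm and are not shown in this case study):

```typescript
// Sketch of the bar-and-star proficiency update described above.
// The step size and thresholds are illustrative assumptions, not production values.
interface Proficiency {
  bar: number;       // experience-bar fill, 0..1
  halfStars: number; // 0..6 (six half-stars = 3 full stars)
}

const STEP = 0.25;   // assumed bar change per answered question

function updateProficiency(p: Proficiency, correct: boolean): Proficiency {
  let bar = p.bar + (correct ? STEP : -STEP);
  let halfStars = p.halfStars;

  if (bar >= 1) {
    if (halfStars < 6) {
      halfStars += 1;  // a full bar converts into half a star
      bar = 0;
    } else {
      bar = 1;         // already at 3 stars; cap the bar
    }
  } else if (bar < 0) {
    bar = 0;
    if (halfStars > 0) halfStars -= 1;  // wrong answers can remove proficiency, even at 3 stars
  }
  return { bar, halfStars };
}
```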
Important Note: To build an efficient AI-personalized experience, the algorithm first needs to collect data and learn from the student's knowledge. It would be reckless to encourage the student to jump directly into the feature, so we decided to place it at the end of the Unit learning path.
With PRS
Students can now review their weaknesses.
How does it work?
Until the student reaches the end of the Unit learning path, the algorithm tracks their performance and knowledge.

On reaching the end of the path, the student can launch a study session with MCQs and Simulations (the two types of exercises that appear in the AICPA exam) and work on their weaknesses.

The PRS comes with a Proficiency Dashboard where students can see an overall picture of their proficiency at the Module knowledge level and at the overall Unit knowledge level. If the student hasn't studied any content for a Module, its badge appears greyed out, since the algorithm hasn't collected any information yet.

Important Note: We prepared onboarding resources for students: a PDF guide and a video tutorial on how the feature works.
Review Session
Once the student finishes a Unit, the course allows them to work through a PRS that accommodates their learning needs with targeted practice questions, so that they can deepen their understanding of the Unit's key concepts.

Students can decide how many questions they would like to answer in each review session. Although students often asked for fewer than 20 questions during the testing sessions, we tested and concluded that 20 questions is the minimum needed to concretely see any proficiency evolution. So, the default is a session of 20 questions, but the student can increase it up to 100. On launching the PRS session, the student sees the AICPA external tool.
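A small sketch of that session-size rule (default 20, minimum 20, maximum 100); the names are mine:

```typescript
// Sketch of the session-size rule described above: 20 questions by default
// (the minimum needed to see proficiency movement), adjustable up to 100.
const DEFAULT_QUESTIONS = 20;
const MIN_QUESTIONS = 20;
const MAX_QUESTIONS = 100;

function resolveSessionSize(requested?: number): number {
  if (requested === undefined) return DEFAULT_QUESTIONS;
  return Math.min(MAX_QUESTIONS, Math.max(MIN_QUESTIONS, Math.round(requested)));
}
```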

The feature serves them a mixture of MCQs and TBSs (Task-Based Simulations) that the AI technology has selected for them in light of their performance within the course.
AICPA GUI
We adapted the tool's UI to the following design.
As they answer each question, the course tracks whether their proficiency has increased or decreased. And with this information, the course continues to assemble the optimal mix of practice questions for the review session.

Each question within the review session indicates the CPA Exam skill level at which it is testing them. The question also comes with an explanation as to why the course provided that question to them.

For each question in the review session, we decided to give students the option of watching the lectures that discuss the topics the question tests before they answer, so that they can make sure they are mastering the information.

Then, once they have answered all of the questions in the review session, they can go through a detailed list of the questions that appeared within that module. From this dashboard, they can choose to return to particular questions so that they can read the in-depth explanations about why the correct answer was right and the incorrect answers were wrong. If they want, they can also work through additional review sessions to become even more proficient with the exam topics.

Proficiency Tracking
The review session uses badges containing progress bars and proficiency stars to track students' proficiency within a Unit. The badges report the Module or Unit number and are color-coded based on the level of proficiency. Additionally, the progress bars within the badges fill as students answer questions correctly, and empty and turn red as they answer questions incorrectly. The proficiency stars also fill by halves as the progress bars fill.
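As a sketch, the badge display state described above could be derived from the tracked proficiency roughly like this; the colour names, thresholds, and fields are illustrative assumptions:

```typescript
// Sketch of deriving a badge's display state from tracked proficiency.
// Colour names and fields are illustrative assumptions.
interface BadgeState {
  label: string;                          // module or unit number shown on the badge
  stars: number;                          // 0..3, in half-star increments
  barFill: number;                        // 0..1 progress-bar fill
  barColor: "neutral" | "green" | "red";
  greyedOut: boolean;                     // no data collected for this module yet
}

function badgeState(
  label: string,
  proficiency?: { bar: number; halfStars: number },  // undefined = module not studied yet
  lastAnswerCorrect?: boolean,
): BadgeState {
  if (!proficiency) {
    return { label, stars: 0, barFill: 0, barColor: "neutral", greyedOut: true };
  }
  return {
    label,
    stars: proficiency.halfStars / 2,                          // stars fill by halves
    barFill: proficiency.bar,
    barColor: lastAnswerCorrect === false ? "red" : "green",   // bar turns red on wrong answers
    greyedOut: false,
  };
}
```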

4. The Impact of My Work

After finishing this first MVP of the new feature, we conducted a Beta A/B test with ~3,000 students. The AI algorithm needed to learn from individual users but also from collective usage, so to give it time, we ran the A/B test for three months and surveyed the students at the end. With the survey results, I did a qualitative analysis.

Qualitative Analysis: Since we are a remote team, we used the Miro platform to run remote workshops. With the Product Owners, we explored the results from 4 standpoints: Behavior, Algorithm, User Interface, and Feature. We analysed the positive and negative feedback and came up with solutions and strategies to improve the feature.
Important Note: For privacy and competitive reasons, I am not allowed to share this dashboard's real content.
We learned several things:
1. Problem: Students felt that they weren't in control of the feature; the feature was controlling them, with poor communication about why it was taking certain actions on their behalf.

Solution: We decided to show in the question header what kind/type of question it was and why the AI decided to present it, with a tooltip explaining what the category means according to the AICPA parameters. E.g.:
Another aspect was: "The PRS didn't seem to account for how many tries it took me to get MCQs right." So, we decided to include additional information.
2. Problem: Keeping the original AICPA timer turned out to be problematic. A lot of students said they were struggling with the exam clock: they didn't see its value and it was distracting. We had this clock only because we were mimicking the real exam, but it was harming the students. Students said they wanted to pause the time while they looked for the solution or explanation. The exam clock was also conflicting with the question timer.

Solution: We decided to remove the AICPA exam timer and focus only on the question timer. The question timer is important because, for example, in the real exam a student has an average of less than 2 minutes to answer a multiple-choice question. So, I designed the following solution, incorporating a tooltip to explain the feature and a pause action.
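A minimal sketch of such a per-question timer with a pause action; the 2-minute budget reflects the exam average mentioned above, and the rest (names, structure) is an assumption:

```typescript
// Minimal sketch of a per-question timer with pause/resume.
// The 2-minute budget reflects the exam average mentioned above; the rest is assumed.
class QuestionTimer {
  private elapsedMs = 0;
  private startedAt: number | null = null; // null while paused

  constructor(private budgetMs: number = 2 * 60 * 1000) {}

  start(): void {
    if (this.startedAt === null) this.startedAt = Date.now();
  }

  pause(): void { // e.g. while the student reads the solution or explanation
    if (this.startedAt !== null) {
      this.elapsedMs += Date.now() - this.startedAt;
      this.startedAt = null;
    }
  }

  remainingMs(): number {
    const running = this.startedAt !== null ? Date.now() - this.startedAt : 0;
    return Math.max(0, this.budgetMs - this.elapsedMs - running);
  }
}
```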
3. Problem: Several students said things like: "If the AI behind the program could more accurately figure out what I needed to study, this would be a great tool."

Solution: I redesigned the Performance Dashboard at the end of the session to provide more detailed information to guide the student's study. I wanted to give them the flexibility to see either a quick summary or a detailed report.
4. Problem: Some questions take students a lot of time to answer because they involve calculations. So, a lot of students requested a Skip action for when they had limited time for a session but still wanted to do some studying without getting blocked by a long question. Many students also requested the ability to mark questions so they could review or study them later.

Solution: We included both suggestions: a Skip Question button and a Mark Question action.
Becker's PRS UI
Final Design