Project overview
My approach
The results
The impact of my work
Becker Professional Education. A global professional education company that has been the market leader in accounting exam preparation for the past 60 years.
CPA Exam Review - An online eLearning tool that prepares accounting professionals for the CPA exam so they can earn their certification.
New Feature

8 months
UX generalist who covers the entire design process

Main Tasks
User Research, Data Analysis, User Experience &
User Interface
Becker Professional Education is moving from a one-size-fits-all learning path towards a personalized, adaptive experience that tailors learning to the individual needs of each student.

To keep the company at the forefront and leverage technology to increase course completion and improve student engagement, Artificial Intelligence (AI) and Machine Learning (ML) emerged as the most natural solution: adapting to real-time data and students' needs.

The main challenge of this feature was moving from tracking progress towards tracking proficiency. How do we measure proficiency?
Becker Professional Education
Is a subsidiary of Adtalem Global Education (formerly DeVry Inc.) that offers educational resources for professionals in the areas of accounting, finance and project management. Becker is best known as the largest provider of training for candidates who are preparing to sit for the United States CPA Exam in order to become Certified Public Accountants.
In the past few years, Becker has lost market share. To turn this around and leverage technology to increase course completion and improve student engagement, we changed the CPA Exam Review strategy from a one-size-fits-all learning path towards a personalized, adaptive learning experience tailored to each individual's needs.
Provide a more efficient and personalized course experience.
We built a new feature called Personalized Review Sessions (PRS).

It delivers the optimal next step of remediation for each student, in real-time.
Supported by an AI engine, it:

  1. Identifies the knowledge gaps specific to each learner
  2. Traces these gaps back to their root cause
  3. Reads students' assessment tests and tailors a program of study personalized to them
  4. Maximizes their study time by targeting their pain points
It also allows students to plan their study schedule and make the most of their limited time.
Project Scope
We partnered with SanaLabs, an AI company specializing in personalized education, to build the next state-of-the-art learning experience for professional education.

With SanaLabs technology inside the course, Becker's CPA is now able to track every student interaction and generate a unique review experience for each student according to their individual weaknesses and learning needs.

SanaLabs achieves this with algorithms proven to be highly accurate at predicting future mistakes. Leveraging spaced-repetition techniques, Sana's algorithms also ensure long-term knowledge retention in preparation for the final exam.

With these new technological possibilities, we were able to move one step ahead of our competitors and start using our users' behavior and real-time data to create highly contextual communication, relevant to each user's study proficiency.
TEAM: Me and an Agile team comprising 3 Developers, 2 Product Managers, 1 Software Development Manager, and 4 Quality Assurance Engineers.
I played three roles
Management: I was responsible for determining the overall design direction; I defined the design sprints; I wrote and managed Jira design stories and tasks; and I took part in the design review sessions.

User Research: This project raised many new technological and design challenges. User research had to be constant because many decisions had to be made based on research insights.

Product Designer: I collaborated with the team on the ideation process, analysed user insights both qualitative and quantitative (factor analysis), and made my design decisions based on them. I developed the needed prototypes and then tested them with users.

Key Problem

How do we shift from tracking user progress to tracking user proficiency?


  • Outsourcing the AI algorithm development to a third party.
  • A design team of one.
  • Only remote access to users.
  • The feature had to run in an external application provided by the AICPA, with a user interface we could not control or change.


  • 88% of the student Beta group felt that Personalized Review Sessions improved their efficiency in studying for the exam.
  • 92% of students felt that they studied more effectively with Personalized Review Sessions. Students also felt empowered by a newfound awareness of their knowledge gaps at a granular level.

My Approach

[Diagram: Contextual Inquiry (an empirical process) gathers raw work activity data from users and business goals; Contextual Analysis (an inductive analytic process) interprets and consolidates it into organized and structured work activity data; Needs & Requirements follow (a deductive analytic process); these feed Design-Informing Models, e.g. flow models and usage scenarios (an integrative process).]
Hartson, Rex, and Pardha Pyla. 2012. The UX Book. San Francisco, United States: Elsevier Science & Technology.


01 Learn. Gain knowledge of users, context, and technologies. Gather user data, run surveys, research competitive products, conduct stakeholder interviews.

02 Explore. Build user profiles from the gathered data; produce materials that aid the outlining of the project: site maps, content inventories, screen flows, navigation models, task flows, user journeys, scenarios.

03 Select. Evaluate, test, and select wireframe concepts for prototype development.

04 Develop. Create design specifications and evolve concepts/wireframes into a full design solution.

05 Refine. Evaluate the design with stakeholders to obtain feedback; conduct usability testing and surveys.

06 Deliver. Complete the design and produce deliverables.

Qualitative Analysis Study

To gain more insight into the survey process and the qualitative analysis, please read the article Qualitative Analysis.

01 Learn

Taking into consideration our key problem (how do users differentiate progress from proficiency?), I conducted a survey and a thematic analysis with a sample of 300 students to understand how their perception of progress and proficiency varies with certain types of graphic representations.
The most voted graphic representation for Progress
20.69% of users voted for this representation, a significant share, as the most recognizable for tracking progress.
The most voted graphic representation for Proficiency
30.43% of users voted for this representation, a significant share, as the most recognizable for tracking proficiency.
Interesting fact: our users are all accountants. I found out with this study that they have an intrinsic need for every graphic representation to be accompanied by numbers, results, or a percentage description. They just love numbers :D

02 Explore

The survey results were crucial in supporting the next design decisions.

I was able to build user profiles from the gathered data and define the user flow for the feature, as well as navigation models, screen flows, task flows, user journeys, and scenarios.
Design Informing Models / Flow Models & Usage Scenarios
Becker's Content
CPA course has the following content structure:
As mentioned in our constraints, we needed to create a feature that could run in an external application with the following user interface.
I defined the following user flow for the feature.
Important Note: The long-term goal is to apply the AI-personalized experience to all levels of study content. For the first phase, however, the feature was developed to run at the content Unit level.

03 Select

Evaluate, test, and select wireframe concepts for prototype development.
Design / Wireframes & Prototyping - Iterative Process
Until we migrate the whole education system towards proficiency rather than progress, we need a hybrid experience between the two. At the Course and Section levels we still need to track progress, and we do it through the following graphic representations:
To avoid conflicting with the current percentage-based progress tracking, and supported by the survey results, a star system emerged as the most logical approach to represent proficiency.

Problem: By exploring this representation, I realized that a 5-star system made the UI too crowded.

Refine: I simplified it to a 3-star system with half-star achievements.

Badge System
Now we needed a more visually intuitive way to communicate a student's level of proficiency. I explored several options and found that a badge approach was the most appropriate for the context, and we developed the following system.
I tested this concept and quickly learned that to earn a half-star, a student needs to work through several Multiple-Choice Questions (MCQs) or Simulations. Students replied that they needed more immediate feedback on whether they were progressing. So, I refined the design to the following approach.
I included a kind of experience bar. Every time a student answers a question, the bar increases if the answer is right and decreases if it is wrong. Every time a student fills up a bar, they earn a half-star, until they reach the full 3-star proficiency. This top level is not guaranteed, though: if a student answers incorrectly on a topic where they have already earned 3 stars, they can lose proficiency on it. Because we are evaluating knowledge, not progress, proficiency can be lost as well as gained.
Proficiency Tracking
The review session uses badges containing progress bars and proficiency stars to track students' proficiency within a unit. The badges show the module or unit number and are color-coded by level of proficiency. The progress bars within the badges fill as students answer questions correctly, and empty and turn red as they answer incorrectly. The proficiency stars fill by halves as the progress bars fill.
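To make the bar-and-star mechanic concrete, here is a minimal sketch of it in Python. This is an illustrative model only: the class name, the points per answer, and the bar capacity are my assumptions, not Becker's actual implementation. It captures the rules described above: correct answers fill the bar, wrong answers drain it, a full bar converts into a half-star up to 3 stars, and a wrong answer can cost an earned half-star, because we evaluate knowledge, not progress.

```python
class ProficiencyTracker:
    """Illustrative model of the PRS bar-and-star mechanic.

    Assumed parameters (not Becker's actual values): the bar holds
    100 points, each answer is worth 25 points, and proficiency is
    capped at 3 stars, earned in halves.
    """

    MAX_HALF_STARS = 6   # 3 stars, earned half a star at a time
    BAR_CAPACITY = 100   # points needed to fill the bar
    STEP = 25            # assumed points gained/lost per answer

    def __init__(self) -> None:
        self.bar = 0
        self.half_stars = 0

    def answer(self, correct: bool) -> None:
        if correct:
            if self.half_stars == self.MAX_HALF_STARS:
                return  # already at full 3-star proficiency
            self.bar += self.STEP
            if self.bar >= self.BAR_CAPACITY:
                # A full bar converts into a half-star.
                self.half_stars += 1
                self.bar = 0
        else:
            self.bar -= self.STEP
            if self.bar < 0:
                if self.half_stars > 0:
                    # Proficiency is not guaranteed: draining an empty
                    # bar costs a previously earned half-star.
                    self.half_stars -= 1
                    self.bar = self.BAR_CAPACITY - self.STEP
                else:
                    self.bar = 0

    @property
    def stars(self) -> float:
        return self.half_stars / 2
```

For example, four correct answers in a row would fill the bar once and earn the first half-star, while a later wrong answer on an already mastered topic would take proficiency back down.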
Important Note: To build an efficient AI-personalized experience, the algorithm first needs to collect data and learn from each student's knowledge. It would be reckless to encourage students to jump directly into the feature, so we decided to place it at the end of the Unit learning path.

04 Develop

I created design specifications and evolved the concepts/wireframes into a full design solution.
With PRS
Students can now review their weaknesses.
How does it work?
As the student works through a Unit's learning path, the algorithm tracks their performance and knowledge.

Reaching the end of the path, the student can launch a study session with Multiple-Choice Questions and Simulations (the two types of exercises that appear in the AICPA exam) and work on their weaknesses.

The PRS comes with a Proficiency Dashboard where students can see an overall picture of their proficiency, at the Module knowledge level and at the overall Unit knowledge level. If a student hasn't studied any content for a Module, its badge appears greyed out, since the algorithm hasn't collected any information yet.

Important Note: We prepared onboarding resources for students: a PDF guide and a video tutorial on how the feature works.
Review Session
The PRS accommodates their learning needs with targeted practice questions so that they can deepen their understanding of the unit's key concepts.

Students can decide how many questions they would like to answer in each review session. Although students often asked for fewer than 20 questions during the testing sessions, we tested and concluded that 20 questions was the minimum a student could do to see any concrete evolution in proficiency. So the default is a session of 20 questions, but the student can increase it up to 100. On launching the PRS session, the student then sees the AICPA external tool.

The feature serves them a mixture of MCQs and TBSs that the AI technology has selected for them in light of their performance within the course.
We adapted the UI tool to the following design.
As they answer each question, the course tracks whether their proficiency has increased or decreased. And with this information, the course continues to assemble the optimal mix of practice questions for the review session.

Each question within the review session indicates the CPA Exam skill level at which it is testing them. The question also comes with an explanation as to why the course provided that question to them.

Before answering each question in the review session, we decided to give students the possibility of watching the lectures that discuss the topics tested by the practice questions, so they can ensure they are mastering the information.

Then, once they have answered all of the questions in the review session, they can go through a detailed list of the questions that appeared within that module. From this dashboard, they can choose to return to particular questions so that they can read the in-depth explanations about why the correct answer was right and the incorrect answers were wrong. If they want, they can also work through additional review sessions to become even more proficient with the exam topics.

05 Refine

I evaluated the design with stakeholders to obtain final feedback. Then, we conducted usability tests and an A/B test, complemented with surveys.

The Impact of My Work

After finishing this first MVP, we conducted a Beta A/B test with ~3,000 students. The AI algorithm needed to learn from individual users but also from collective usage, so to give it time to learn, we ran the A/B test for three months and surveyed the students at the end.

Qualitative Analysis Study

To gain more insight into the survey process and the qualitative analysis, please read the article Qualitative Analysis.
I learned several things:
1. Problem: Students felt that they weren't in control of the feature; rather, the feature was controlling them, with poor communication about why it was taking certain actions on their behalf.

Solution: I decided to show in the question header what type of question it was and why the AI decided to present it, with a tooltip explaining what the category means according to the AICPA parameters. E.g.:
Another aspect was: "The PRS didn't seem to account for how many tries it took me to get MCQs right." So, we decided to include that additional information.
2. Problem: Keeping the original AICPA timer proved problematic. Many students said they struggled with the exam clock: they didn't see its value and found it distracting. We had kept the clock only because we were mimicking the real exam, but it was harming students. They wanted to pause the time while they looked at the solution or explanation, and the clock also conflicted with the question timer.

Solution: We decided to remove the AICPA exam timer and focus only on the question timer. The question timer matters because, for example, in the real exam a student has an average of less than 2 minutes to answer a multiple-choice question. So, I designed the following solution, incorporating a tooltip to explain the feature and a pause-timer action.
3. Problem: Several students expressed things like: "If the AI behind the program could more accurately figure out what I needed to study, this would be a great tool."

Solution: I redesigned the Performance Dashboard shown at the end of the session to provide more detailed information to guide the student's study. I wanted to give them the flexibility to see either a quick summary or a detailed report.
4. Problem: Some questions take students a long time to answer because they involve calculations. Many students therefore requested a Skip action for when they had limited time for a session but still wanted to study without getting blocked by a long question. Many also requested the ability to mark questions so they could review or study them later.

Solution: We included both suggestions: a Skip Question button and a Mark Question action.

06 Deliver

Time to complete the design and produce the deliverables.
Becker's PRS UI
Final Design