Test Creation Platform

Client

Mercer Mettl

Task

UX Design, Project Management, UI Design

Time

2022-Present

Product Overview

Provide a brief overview of the product, including its purpose, target audience, and current state.

Mercer's core product centres on a testing platform tailored to deliver extensive online examinations for various demographics, including university exams, recruitment assessments, surveys, and learning and development programs.


A versatile tool empowering users to effortlessly craft customised tests tailored to their specific needs. With our intuitive interface, users can seamlessly create tests that focus on essential skills and knowledge.


Key features -

  1. Skill-Based Test Creation - Our platform allows users to design tests centred around specific skills, ensuring targeted evaluation of competencies essential for success in various domains.

  2. Question Library - Access a comprehensive repository of questions covering a wide range of topics and difficulty levels. Easily search, select, and incorporate questions into tests to streamline the creation process.

  3. Customisable Tests - Tailor assessments to meet unique requirements by adjusting parameters such as question types, scoring mechanisms, time limits, and more.

  4. User-Friendly Interface - Our platform boasts an intuitive interface designed for ease of use, allowing users to create tests efficiently without extensive training or technical expertise.

  5. Analytics and Reporting - Gain valuable insights into test performance with comprehensive analytics and reporting features. Track participant progress, identify areas for improvement, and make data-driven decisions to enhance test effectiveness.

Whether creating tests for educational purposes, employee training, or talent acquisition, our Test Creation Platform provides the flexibility and functionality needed to design tests that accurately measure skills and knowledge.

Project Goals

State the specific goals and objectives of the redesign. These should be measurable and achievable.

These goals were developed after more than 45 days of extensive research and need-finding exercises. Each goal contributes to a specific aspect of the product and plays a vital role in providing a better user experience.

Updated Design Language

Use Mercer design system for a consistent, modern look and feel.

Functionality Overhaul

Update test platform stack: modern codebase, privacy enhancements, design system front-end, expanded API controls.

Usability Enhancements

Redesign the user flow for better navigation, touchpoints, alert management, and overall intuitiveness, making the platform easier to use.

Platform Compatibility

Cross-device responsive platform for seamless testing.

Learnability Curve

The platform should be user-friendly and intuitive for users to log in and take tests without guidance.

Accessibility

The platform should comply with W3C AA accessibility standards and include enhanced user features.
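For context, the AA standard includes concrete, checkable criteria; for instance, normal text must have a contrast ratio of at least 4.5:1 against its background. Below is a minimal, illustrative sketch of that check in Python, following the WCAG 2.x relative-luminance formula (not part of the project deliverables):

```python
def _linear(channel: int) -> float:
    """Linearise one 8-bit sRGB channel, per the WCAG 2.x definition."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb) -> float:
    """Relative luminance of an (r, g, b) colour, each channel 0-255."""
    r, g, b = (_linear(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg) -> float:
    """WCAG contrast ratio: (lighter + 0.05) / (darker + 0.05)."""
    l1, l2 = relative_luminance(fg), relative_luminance(bg)
    return (max(l1, l2) + 0.05) / (min(l1, l2) + 0.05)

# AA requires at least 4.5:1 for normal text (3:1 for large text).
print(contrast_ratio((0, 0, 0), (255, 255, 255)))  # black on white -> 21.0
```

Automating checks like this against the design system's colour tokens is one way such a compliance goal can be verified continuously.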

Intrinsic Requirements

Project managers outlined essential objectives for the platform's new interface.

Problems

Identify the problem that the redesign was intended to solve. This could be a usability issue, a business goal, or a user need.

  • The platform's non-compliance with modern web standards caused it to malfunction on newer browsers and raised security concerns.

  • The platform had usability, functionality, and UX issues.

  • The lack of accessibility features limited platform usage for some demographics.

  • The platform was challenging to learn and had a complex pre-test process.

  • Features to extend the platform’s support were non-existent.

  • The platform lacked a defined brand language, leading to an inconsistent experience.

Given these factors, rethinking and innovating the test platform became essential.

Research and discovery

User research

Describe the user research conducted to understand the problem and the target audience. This could include interviews, surveys, or usability testing.

Planning UX Research

The goal of the research plan was to understand two broad segments:

  • The test platform

  • The users of the test platform

A thorough research plan was created using a mixed-methods approach, with each method aiming to provide us with a comprehensive understanding of the product, users, problem, and user needs.

Naturalistic & Participant Observations

Goal: To observe how users experience and use the platform in an examination setting.

Naturalistic Observation
With the assistance of the Operations team, we observed the clients doesnt create the test in their account. The test were created by our own operation team. This helped us comprehend the user's perspective, the testing environment, and how the user experience unfolds during the process.

We only took consolidated notes and did not record or take pictures of the entire naturalistic observation activity.



Participant Observation
We simulated an exam environment to fully grasp the platform's context and nuances during an exam. We participated in the test activity to observe and make notes on interactions & behaviour.

Hacks & Workarounds

Goal: To determine any unconventional platform usage to achieve users' goals.

The observation exercises revealed the workarounds users relied on while creating tests. This helped us identify critical points to focus on as we analysed the users further.

User Interviews

Goal: To confirm the insights we have already gained and to discover any additional insights that users can provide.

We continued with user interviews to analyse the user’s needs further. We partnered with users who had never used the Mercer test platform to gain fresh insights.

We segregated the user base into three categories:

  1. Novice: Users with no experience with online assessments

  2. Intermediate: Users who have created tests in online test platforms once or twice within one year

  3. Expert: Users who engage with online tests, hackathons, and other online assessments.

We interviewed at least two users per group for 1.5 hours each. We set up tasks for them to complete on the platform and asked them to think aloud. 

We collected feedback on the functionalities, interface, and overall experience at the end.

Platform / Heuristic Evaluation

Goal: To get an expert panel of users to assess the platform's usability issues based on defined characteristics.

We created a review process in which a group of experienced designers, product managers, and tech leads were asked to evaluate the system using Jakob Nielsen's heuristics.

To quantify the results of this analysis, we also identified a few key heuristics more relevant to our product type than the others.
The list of heuristics we used for analysis:

  • Visibility of system status // Passed
    The system should always inform users about what is happening through appropriate feedback within a reasonable time.


  • [Imp] Match between the system and the real world // Failed
    The system should speak the users' language, with words, phrases and concepts familiar to the user rather than system-oriented terms.


  • User control and freedom // Passed
    Users should be able to control what they are doing and where they are going. They should not be surprised by unexpected system changes.


  • [Imp] Consistency and standards // Passed
    Users should not wonder whether different system parts mean the same thing. Follow platform and industry conventions.


  • [Imp] Error prevention // Passed
    It is better to prevent errors than to try to recover from them.


  • [Imp] Recognition rather than recall // Passed
    Minimise the amount of information the user must remember. Help users remember what they need to know by making finding and remembering information easy.


  • [Imp] Flexibility and efficiency of use // Failed
    The system should be easy to use for both novice and experienced users.


  • Aesthetic and minimalist design // Passed
    The design should be easy on the eyes and pleasant to use.


  • [Imp] Help users recognise, diagnose, and recover from errors // Failed
    Errors should be easy to recognise and understand. The system should provide clear and helpful error messages.


  • Help and documentation // Failed
    The system should provide help and documentation that is easy to find and use.

Platform / Data Logs, Feedback & Web Analytics

Goal: To collect objective data that can be used to turn qualitative insights into actionable and measurable goals.

The platform had accumulated a large amount of feedback data and client requests since its inception. The data we acquired from product managers was enormous.
Hence, we needed a way to surface recurring themes from this data, which could be visualised using a word cloud.

A word cloud is a common technique for visualising the most frequent terms in a body of text.
Unfortunately, the platform lacked web analytics tools, making it challenging to collect objective data.
We had to rely on feedback data and qualitative methods to identify the issues.
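As a rough illustration of the word-cloud step, recurring terms can be surfaced with a simple frequency count before any visualisation. The sketch below uses Python's standard library with hypothetical feedback strings (the real data set was confidential):

```python
from collections import Counter
import re

# Hypothetical feedback entries -- stand-ins for the confidential data.
feedback = [
    "Navigation is confusing and the test settings are hard to find",
    "Please add accessibility support for screen readers",
    "Test settings page is slow and navigation is unclear",
]

# Common words that carry no signal for a word cloud.
STOPWORDS = {"is", "and", "the", "are", "to", "for", "a", "please", "add"}

def term_frequencies(entries):
    """Tokenise the feedback, drop stopwords, and count recurring terms."""
    words = re.findall(r"[a-z]+", " ".join(entries).lower())
    return Counter(w for w in words if w not in STOPWORDS)

# The highest-frequency terms would drive the word-cloud sizing.
print(term_frequencies(feedback).most_common(3))
```

In practice, the resulting frequencies would feed a rendering library; the counting step above is where the recurring themes actually emerge.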

Data analysis: 

Summarise the key findings from the user research. This should include insights into user needs, pain points, and preferences.

A large body of data was compiled from all the activities mentioned above, highlighting various issues, missing features, and other UX problems. Since the research data is confidential, I will provide a general overview of the insights we gained from each research method.

Naturalistic / Participant Observations

We gained a clear understanding of how the test platform is used in the given context.

  • We saw users taking the time to learn about the platform.

  • We found users were unable to find the skills or questions relevant to their roles.

  • We understood the significance of content hierarchy on the test platform.

  • We found that users did not use the different test settings to finalise the test.

User Interviews

  • 100% of the users highlighted some degree of navigation issues with the platform.

  • 100% of the users commented on the overall design of the platform.

  • 80% of the users highlighted issues with question or skill discovery on the platform.

  • 70% of the users commented on the responsiveness of the platform.

  • 100% of expert users highlighted problems with advanced questions.

Heuristic Evaluation

  • The platform scored 2/6 on the critical heuristics.

  • The platform scored 6/10 across all the given heuristics.

Data Logs, Feedback & Analytics

  • Client requests frequently mentioned scalability and platform support expansion. 

  • Better customer support and alternative methods for users were also emphasised. 

  • The lack of accessibility features was also mentioned.

  • Clients requested a self-serve, easy-to-use platform.

We compiled all the data we collected and created UX requirements for the project, expanding on the requirements defined by the project managers. These requirements were categorised into the following sections, each addressing a critical aspect of the product.

  • Functionality

  • Usability

  • Learnability

  • Accessibility

  • Compatibility

  • Defined project requirements

Design Process

After gathering user data and solidifying project requirements, we started designing solutions to the highlighted issues.
The design process needed to cater to four significant aspects of the flow.

  • Select Content

  • Structure your test

  • Test Setting

  • Finalise or Test Overview

Ideation

Describe the brainstorming process used to generate design ideas. This could include sketching, wireframing, or prototyping.

Flows Tackled

The test platform was divided into four flows catering to essential user journeys.

  1. Select Content

  • Landing Page with manual inputs

  • Content Explorer

  • Question Discovery

  • Skill Cards

  2. Structure your test

  • Section Details

  • Section Summary

  • Correct/Incorrect Marks

  • New Section

  3. Test Setting

  • Test Taker Instructions & Actions

  • Prevent Unfair Practices

  • Test Taker Tools

  • Report Setting

  • Test Taker Registration Fields

  • Coding Question Settings

  • Database Question Settings

  4. Finalise or Overview

  • Overall Summary

  • Section Details

  • Setting Enabled

Each flow we tackled provided new opportunities to rethink the existing approach and formulate a better solution. Multiple product teams were involved in each aspect of the flow, so the major challenge was identifying all the available micro-flows.

Ideation Process

We used a variety of ideation methods for each flow we addressed. The primary objective of each approach was to generate as many ideas as possible for each task, process, and flow, whether realistic or not.

The methods used for ideation were:

  • SCAMPER

  • Storyboarding

  • Brainstorming

  • Sketching

  • Card Sorting.

We weighed the pros and cons of each solution to understand perspectives, strengths, and drawbacks. This helped us pick the best aspects and formulate alternatives.

Creatives Produced

At the end of the iteration cycles, we ended up with multiple potentially viable solutions that incorporated the best parts of our ideations. These solutions took the form of paper sketches and low-fidelity wireframes.

Prioritisation:

Explain how the design ideas were prioritised. This could be based on user needs, business goals, or technical feasibility.

The finalised solutions from each flow were presented to a panel of product managers and tech leads to show each solution's potential advantages and disadvantages. 

  • Project goals were reiterated to evaluate how each idea aligns with the said goals. 

  • Product managers played a crucial role in analysing the solutions based on the impact-effort scale.

  • Including the technology team was vital to understanding the proposed solutions' technical feasibility and preparing the technology stack for the revamp.

After several rounds of analysis and discussions, product managers solidified the flows that best met the given criteria. The flows were also modified to include feedback from the panel and accommodate recurring client requests presented by product managers.

Prototyping:

Describe the process of creating prototypes to test the design ideas. This could include low-fidelity or high-fidelity prototypes.

Figma was used to create high-fidelity prototypes with multiple interactions based on paper and low-fidelity wireframes.
The Mercer Design System (MDS) was customised to meet the project's requirements. MDS was also implemented in Figma so it could be reused for other projects.

Multiple variations of the prototype were created to experiment with different visual styles. These variations were then tested with users to determine which style was the most user-friendly and visually appealing.
The results of these tests were used to finalise the design of the prototype.


Testing and Feedback

User testing

Describe the process of user testing the prototypes. This could involve testing with real users or with a panel of experts.

We employed usability testing to evaluate the design flows and the new UI, recruiting participants from the operations team and our interns.
We created a series of tasks for each participant to complete and encouraged think-aloud protocols to capture their motivations and expectations during each interaction.
We also conducted structured interviews after the testing to gather further insights and feedback on the overall design of the platform.

Feedback

Summarise the feedback that was received from user testing. This should include both positive and negative feedback.

We conducted five sessions, gathering objective reviews and insights from each.
A few insights from the sessions:

  • The new platform was widely praised for its design and appearance. 

  • The user flow was logically divided and demonstrated the user's journey. 

  • The new navigation was easier to understand and more defined. The touchpoints were clear and indicated the next action. 

  • The microcopies were clear and concise, and were placed in the correct locations. 

  • The test platform highlighted essential elements without cluttering the interface.

However, there were a few issues:

  • Users struggled to find some skills by name in our content repository.

  • Users suggested allowing new skills to be added after proceeding to the second step, once a new section is created.

These issues should be addressed in the next iteration of designs to improve the user experience.


Say Hello

Have a project in mind? Let’s get to work.👋📫