Evaluative benchmarking study on wearable hardware device performance
Role
UX Researcher
Timeline
5 months
Client
Leading consumer social media and tech company
Method
In-lab Usability Testing, Benchmarking, Quant Analysis
🔎 Hey, heads up! This is an NDA-protected project.
Some details of this project cannot be shared due to the NDA, but I would like to discuss my role and takeaways along the journey!
Background
👉 Project Overview
In May 2024, I joined Blink UX, a design research consulting firm, as a UX Researcher. There, I collaborated with our client to run in-lab benchmarking usability tests for 4 wearable hardware devices, including a virtual reality headset and AI-infused smart glasses, assessing user experience and establishing a baseline for potential improvements.
👉 Goal
Primary Goal: Collect quantitative data to benchmark product efficiency and usability, informing the product roadmap and measuring product improvements/declines over time.
Secondary Goal: Use qualitative data to provide supplementary explanations and context for the quant metrics.
👉 My Role
Ran pilot tests to refine and finalize the study protocol and facilitator's guide
Moderated 100+ usability sessions, each lasting 90 minutes, across 4 products, collecting both qual and quant data in Qualtrics
Cleaned and visualized data in RStudio and analyzed qualitative task failures for reporting
Identified product improvement opportunities across several areas: bugs, user flows, UI, and accessibility
👉 Impacts
Helped establish the benchmarking program as an ongoing practice, from study preparation through execution
Research reports were delivered to inform design and decision making at the C-suite level of the client's company, creating a future roadmap to prioritize feature improvements
My Journey Map
Below is my journey map as a researcher within one study, showing how the project unfolds and my involvement at each stage.
Why?
Let me explain why we chose this research method, plus the inside scoop on our moderation magic.
(typical lab set up)
🖥️ Moderated UT
We chose moderated usability testing to collect the behavioral and attitudinal data our client wanted to benchmark. With clear success criteria for each task in mind, I measured participants' behavioral and perceived time-on-task along with their self-reported attitudinal metrics, including confidence, satisfaction, frustration, ease of use, and ergonomic fit, while moderating sessions.
⏱️ Highly-structured session
Aiming to collect a complete dataset from each participant, each session was highly structured and meticulously prepared. As the moderator, I needed to stay as neutral as possible while matching each participant's energy. This approach differed significantly from my previous experience, which was more qualitative and emphasized building rapport with participants.
🔨 Counterbalancing technique
In some studies, we used counterbalancing to vary the order of tasks so participants wouldn't be influenced by previous tasks, reducing potential order bias. In these cases, I couldn't anticipate which task would come next, so I had to be highly familiar with each task, its success criteria, and the study protocol to manage the timing.
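For readers curious how counterbalanced task orders are typically generated, here is a minimal sketch of the standard balanced Latin square construction (each task appears once per position, and each task immediately follows every other task equally often, for an even number of tasks). The task names are hypothetical placeholders, not the actual study tasks, and this is an illustration of the general technique rather than the project's exact procedure.

```python
def balanced_latin_square_row(n, participant):
    """Task-index order for one participant using the balanced Latin
    square construction: base sequence 0, 1, n-1, 2, n-2, ... shifted
    cyclically per participant. For odd n, a fully balanced design
    would also run each order reversed."""
    row, low, high = [], 0, 0
    for i in range(n):
        if i < 2 or i % 2:          # positions 0, 1, and all odd positions
            val = low               # take the next low task index
            low += 1
        else:                       # even positions from 2 onward
            val = n - 1 - high      # take the next high task index
            high += 1
        row.append((val + participant) % n)
    return row


if __name__ == "__main__":
    tasks = ["Pair device", "Adjust fit", "Capture photo", "Share clip"]  # hypothetical
    for p in range(len(tasks)):
        order = balanced_latin_square_row(len(tasks), p)
        print(f"Participant {p + 1}: {[tasks[i] for i in order]}")
```

With 4 tasks this yields 4 distinct orders that repeat cyclically across participants, so every task is seen in every position, and no task systematically benefits from practice or fatigue effects.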
What?
From hiccups to high-fives, the challenges I encountered along the journey!
(goal is goaling everyday)
🏃🏻♀️ Quick turnaround
Running a large benchmarking program for our client, I handled consecutive studies across different products. I maintained an agile mindset, staying flexible and adaptable to keep the project on track, and quickly took ownership of sessions and products to ensure smooth, steady progress throughout.
📊 Large dataset
Engaging with 80 participants per study, I handled ~40,000 data points for each study. This was my first experience working with such a large dataset and using R to create graphs for reporting. While there was a steep learning curve as I adapted, it was a fantastic opportunity to gain more experience in quantitative analysis!
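To give a flavor of the kind of per-task benchmark summary described above, here is a minimal sketch. The actual analysis was done in R on Qualtrics exports; this illustration uses Python's standard library, and the record fields and task names are hypothetical.

```python
from statistics import mean

# Hypothetical session records; the real study's data points came from
# Qualtrics exports with many more metrics per task.
records = [
    {"task": "Pair device",   "time_s": 42.0, "success": True,  "satisfaction": 6},
    {"task": "Pair device",   "time_s": 95.5, "success": False, "satisfaction": 3},
    {"task": "Capture photo", "time_s": 18.2, "success": True,  "satisfaction": 7},
    {"task": "Capture photo", "time_s": 22.9, "success": True,  "satisfaction": 5},
]

def summarize(rows):
    """Per-task benchmark summary: mean time-on-task, success rate,
    and mean self-reported satisfaction."""
    summary = {}
    for task in {r["task"] for r in rows}:
        grp = [r for r in rows if r["task"] == task]
        summary[task] = {
            "mean_time_s": mean(r["time_s"] for r in grp),
            "success_rate": sum(r["success"] for r in grp) / len(grp),
            "mean_satisfaction": mean(r["satisfaction"] for r in grp),
        }
    return summary

if __name__ == "__main__":
    for task, stats in summarize(records).items():
        print(task, stats)
```

Summaries like this, computed per task and per product, are what make benchmarking comparisons across product versions possible.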
⏳ Meeting client timelines
While the client had a clear timeline for each study, I was always ready to take on different roles to support my team. For example, I assisted a Senior UXR in planning their study protocol by participating in the early pilot and providing feedback on task verbiage and flow. I also analyzed qual data, drawing on my background in qualitative research. My willingness to help and agile approach contributed to each study's success.
How?
Beyond the research report being crucial for the client, the project was also a valuable learning experience for the team as we collaborated and developed a more standardized study procedure.
(PRIDE activities with team :p)
👉 Internal standards
As the studies continued, our research process became more streamlined, with well-defined roles, standardized procedures, and consistent timelines. For example, we built a more transparent data correction and sharing workflow that provided visibility and prevented unauthorized alterations. Everyone gradually got into the flow and took greater ownership of the project.
👉 External reporting
This benchmarking study shaped the roadmap for these products, comparing them against previous versions and competitor data, with implications for around 7 million monthly active users. Our client was pleased with our deliverable, which included both high-level actionable insights and a detailed per-task analysis of each product. While preparing the report, I learned that the client prefers to see the most urgent, action-needed issues and bugs at a glance, so the final report was organized in a top-down structure to highlight these critical issues first.
My Reflection
This is my first full-time job since graduating, and it's been quite an adventure. I've learned so much from my team and also gained hands-on research execution experience.
(My last day :p)
Pilot testing is key!
This is a necessary step for assessing the study protocol and evaluating the preparation. Piloting helps uncover unexpected issues in areas like technical setup, question phrasing, lab configuration, and time management. By practicing the roles of both participant and moderator, I had a clear picture of how to troubleshoot and finalize the protocol.
Deal with uncertainty
Working with hardware devices showed me how unpredictable research can be. Each twist and turn grew my adaptability and real-time problem-solving, making this journey thrilling and rewarding.
Initiatives on ResearchOps
Running hardware projects requires precise, strict device management. I created a device tracking sheet for sessions that recorded device pairings between moderators and research assistants, ensuring clarity and visibility for the team. In addition, after reviewing tasks during training, I prepared printouts with all the information participants needed to complete tasks and distributed them to each lab.