Role: Digital product designer
Team: Cross-functional product team, Researcher
Goal: Optimise MOO's design tool for new customers on mobile
I was part of a cross-functional team working on MOO’s create and upload application, enabling customers to create bespoke printed designs from a range of templates and tools. I was tasked with improving the mobile experience to increase customer conversion. Using iterative user interviews and prototypes, I explored both large and small design solutions to enhance the user experience.
The application was previously desktop only, but had recently been made responsive to fit small screens and reach the growing mobile market. However, no user testing was carried out when the application was reduced to a small screen. With no previous insights to draw on, my initial goal was to understand how users created a business card and what problems they were having. This would help me identify quick wins to improve the experience, as well as uncover blockers that needed to be redesigned.
Some of the key user issues we discovered in user testing:
I set up weekly user interviews to understand how users created a business card on moo.com, with the rest of the team observing and helping to collect notes. Running the sessions weekly meant I could adjust the tasks to whatever I was investigating that week, quickly test prototypes of new designs, and run competitor comparison interviews, letting me trial different design solutions for the same tools and refine my designs.
To keep the process lean, user problems were collected on post-it notes and prioritised with the team using effort/value mapping, so we could have a positive impact on the user experience from the start.
Error notifications and warnings caused more usability issues than they solved, and stopped users from proceeding in the tests. We needed a solution that gave users the control and freedom to correct or dismiss errors.
Remove the error banner. Errors and warnings consisted of a banner at the top of the page and a tooltip over the canvas. In testing, users did not associate the two messages, especially on mobile where they were not visible at the same time, so we removed the top banner notification to focus on the tooltip.
Keep the tooltip in the viewport. The new tooltip notification floats at the bottom of the screen so that it is always visible, allowing users to access their design behind it or close the notification in order to correct the problem.
Provide more information at the review stage. Errors at the review stage gave users no information about what was wrong. I designed a new solution that summarises each error and warning at the top of the page, with buttons explaining the next steps to fix them.
How to run user interviews. Prior to this project I had only conducted unmoderated user testing; this was my first experience running moderated tests and speaking directly with users. It was a great learning experience and will be a valuable tool in the future.
Increase empathy with user testing. Throughout the process I involved other team members, engineers and project managers, inviting them to observe interviews and debriefs and drawing on their deep knowledge of the site to help prioritise issues, creating a living effort/value map. I regularly received positive feedback from the team and noticed a shift in the way they talked about the product, with more consideration of the user's perspective.
Keeping track of insights. Because this was such a lean research project, with new insights arriving every week and fast turnarounds, I relied heavily on post-it notes. To keep a record of the work that had been done and what was coming up next, I created a spreadsheet listing all the issues, linked to JIRA tickets and video clips where possible.