Redesigning a highly interactive, commercially central design tool
Reducing the learning curve of a powerful design tool: making it easier for customers to use, more maintainable for developers, and more commercially effective. This work drove a 9% uplift in conversion, generating an additional £2.8 million in revenue. (2014 - 2016)
Arguably MOO's core digital offering is a highly interactive build tool at the centre of the ecommerce experience. The tool allows customers to customise their designs extensively before adding them to their cart and checking out. My team supported major design overhauls as we migrated the tool from Flash to HTML5 and React. I oversaw UX for significant improvements in customer experience and revenue along the way.
Customer & Business Challenge
When I joined MOO the old build tool, referred to as 'Canvas', was under-resourced and relatively unloved. The UX and Tech teams were short-staffed and overstretched, and couldn't devote more than cursory attention to the tool. Furthermore, being built in Flash, the tool stood in the way of MOO going responsive on mobile devices.
Here are some screens of the old tool, Canvas, from 2013. The controls along the top of the screen take up a lot of real estate, it's not very clear where you are at any given point in the process, nor is it clear how you find your way around. The main focus of the screen should be the card you're designing, but it's pushed down the page, and the many prompts and messages needed to provide user feedback don't fit anywhere clearly.
The Approach & My Role
After several attempts, we eventually hired a strong team of interaction and visual designers into the agile team responsible for redesigning and overhauling Canvas. My role over two years was to lead staffing efforts, then advise and mentor my designers working on its replacement, called 'Pixel'. I led work on addressing the key interaction and information design challenges raised by the potentially complex controls. I also helped the technical team understand why some issues mattered from a UX point of view, particularly the role of motion in helping customers grasp what impact their design actions had. My input on the design involved regular design critique and direction. I also supported my senior UX researcher in running and facilitating major user testing sessions every month to iterate and improve the performance of the tool.
The new tool, Pixel, initially grew out of design work to help customers better understand what cards they had in their pack before ordering. Feedback showed this was very hard for users of the old tool to grasp. The shot immediately below is one of the early incarnations of the redesigned flow. However, this was still problematic: although customers more easily understood what cards were in their packs, they had a difficult time wayfinding in the interface. To address this, with regular user testing over 2014 and 2015, we tackled five key parts of the experience:
- Onboarding, and the 'Establishing Shot', helping people understand what they could do, from where, in what order.
- The design controls layout and grouping in the key design screens UI, ensuring more real estate for the main task at hand.
- On advanced products, such as cards printed with multiple layers (such as spot gloss or raised ink), ensuring that this extra level was clear in the conceptual model of the flow, and that users could understand how the cards were layered.
- Paper picking options.
- Previewing the final set of designs clearly.
1. Evolution of Onboarding
This shows how the team iterated the design via user feedback over time, to make it easier for customers to understand what they could do, and from where. We wanted to avoid a 'tour' experience, preferring to let customers get going immediately, with subtle hints along the way to help them learn how the UI operated.
2. Grouping design tools within the main flow, and wayfinding pointers
We added a clear progress bar along the top of the screen to show, step by step, what was happening. We experimented over time with progressively disclosing controls, as well as with grouping features where customers expected to find them. All of this was supported and driven by insights from on-site user testing, remote evaluation of prototype screens, and A/B testing.
3. Making layered products like spot gloss and raised ink easier for customers to understand
We introduced a new layers model, based on customer research, which helped customers understand exactly how their designs would work in what could otherwise be confusing printing processes.
4. Testing different interaction patterns for the Paper Picker
After designing the card, we needed to confirm the card stock the customer wanted. Each stock has different options and pricing, so we had to fit quite a lot of information into one space. Here are some of the versions that were tested.
5. Previewing final card designs more clearly
Customers had previously struggled to understand exactly what their cards would be like when they received them. This was due to several issues, including how we rendered designs on screen, and the complexities of displaying colours on screen versus reproducing them in ink at the factory. We tackled this partly with clearer messaging about the number of designs in each order (customers could have more than one design on the reverse of their cards in the same order), and partly by experimenting with more realistic preview imagery. I still think we have a way to go here.
Implementing Pixel allowed the team to steadily and measurably improve the customer experience. Over almost three years, we experimented with a number of wayfinding approaches, with varying success. We went from a linear model on the original tool (step 1, 2, 3…) to a more fluid model and then back again to something in between, as we experimented with what worked for customers and the business.
Overall, the 2015 iterative improvements to navigation and wayfinding in Pixel led to a 9% conversion uplift (equivalent to revenue gains of £2.8M).
What I've Learned
The make-up of this team rotated extensively over my time at MOO. The team was challenging to manage, with some very strong personalities on the project, and others who displayed initial nerves and even distrust around 'design', largely a legacy of the environment before I joined. The experience really sharpened my ability to navigate organisational politics and egos, while maintaining a focus on customers' actual problems and solving them through good design teamwork.
Key Team Credits
Researcher: Hannah Capstick
Interaction Designers: Lyzbelle Strahan, Anh Tran, Alexia Tagliaferro, Nabeel Mousawi & Sam Charman
Front End: Laura Fernandez-Villar, Stephen Brandwood
Year: 2015 / 2016