Product Discovery & User Research
Figuring out what to build, for whom, and most importantly, why. Focusing on understanding ongoing customer needs and behaviours across two continents - and keeping the customer front and centre for the business. (2013 - 2016)
My second proudest achievement during my time at MOO was setting up a new and robust User Research function for the business (my first was the fantastic, talented team I assembled overall). User Research helped our team anticipate business needs, and very often answered questions months before they were asked. It’s unusual to be able to secure funding for this type of work within a small company, so it was a pleasure to work with two extremely talented researchers on my team doing lean, commercially supportive customer work.
We delivered regular user research across the US and UK. This included flagship, higher-end studies as well as quick-and-dirty guerrilla sessions and usability evaluations of prototypes, depending on need. We evaluated competitor experiences to benchmark MOO's offering, extensively evaluated our build tools in fortnightly sessions, and supported strategic product discovery, such as understanding the opportunity around potential new offerings like customised packaging.
Among our meatiest projects: a new Mental Model, new Customer Personas, and experimental, leading-edge work such as incorporating affective and emotional measures into our design processes and exploring our ability to do ethnographic research and co-analysis with our stakeholders.
Beyond the regular UX research evaluations that fed ongoing product development at MOO, my team also delivered a quarterly end-to-end analysis of customers’ experiences with MOO. Its focus was to understand how each agile team’s product changes affected the customer journey at other points. If one team had launched a new feature, how did that affect customers elsewhere in the journey? Did it make things easier and more satisfying? Or had it inadvertently caused issues?
Customer & Business Challenge
A real challenge when working with separate agile product teams is that each owns a separate part of the customer journey. It can be difficult for teams to see when and how their work may have inadvertent side effects on another team’s metrics. My team provided independent evidence from customers of what was positive and what was challenging about using MOO. We highlighted the need for the different agile teams to keep the experience consistent across their features on the site: consistency makes the site easier for customers to use, and more profitable for the business.
The Approach & My Role
My team played an independent, watchdog role in the business with these customer experience studies. We highlighted what was working, and what had improved (or become worse) from quarter to quarter. We ran extensive lab-based usability sessions with a wide cross-section of MOO’s target customers. We measured ease of use with the System Usability Scale (SUS), and examined brand language, tone, and customers’ emotional engagement throughout the journey. We shared these baseline scores with the business and used them to feed product work into each agile team’s backlog, solving the issues with our designers each quarter.
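For readers unfamiliar with the SUS measure mentioned above: it converts ten 1-5 Likert responses into a single 0-100 ease-of-use score, which is what makes quarter-to-quarter baseline comparisons possible. A minimal sketch of the standard published scoring method is below; the example responses are hypothetical, not data from the MOO studies.

```python
def sus_score(responses):
    """Convert ten 1-5 Likert responses into a 0-100 SUS score.

    Odd-numbered items are positively worded, so they score (response - 1);
    even-numbered items are negatively worded, so they score (5 - response).
    The adjusted sum (0-40) is multiplied by 2.5 to give a 0-100 scale.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for item, response in enumerate(responses, start=1):
        total += (response - 1) if item % 2 == 1 else (5 - response)
    return total * 2.5

# Hypothetical participant with a fairly positive response pattern:
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 2]))  # 82.5
```

Because the score is a single comparable number per participant, averaging it across a quarterly cohort gives exactly the kind of trend line a scorecard needs.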
This approach provides a scorecard that helps teams understand how their work affects customers’ experience and purchase outcomes - even in areas handled by another product team. It helps teams think holistically rather than about a narrow subset of their own metrics, and it focuses the wider business on assigning resources to improving the customer experience. The process takes a couple of quarters to bed in, but it connects UX work to commercial outcomes and helps Exec Leadership make more informed decisions and trade-offs, which are a pragmatic reality of business. The UX team must focus not on delivering static reports, but on working directly within the product teams to get real customer problems into the active backlog - and actually get the issues addressed.
Researchers: Hannah Capstick, Terri Herbert
Year: 2013 - 2016