ANALYTICS

Benchmarking the User Experience

Introducing UX metrics into a data-first org to raise its UX maturity score from 3 to 4

The Problem

citizenM had an internal mandate to raise its UX maturity level, but as a data-driven company with minimal data on UX, it had left the user experience essentially invisible. Colleagues could remind each other that UX was important, but when push came to shove, UX would lose out because the only evidence for it was qualitative. Initiatives supported by hard metrics would win the prioritization battles. That would have to change for any meaningful advancement toward UX maturity.

The Solution

As a first step, introducing UX metrics would give executives data on the user experience; once they could see how the experience was performing relative to other business KPIs, they could factor it into their thinking. The task at hand was to devise a UX metrics approach that would capture the state of the experience.

Project at a glance

  • I worked in collaboration with the Head of UX

  • I wrote or adapted the survey questions and then tested them, experimenting with word choice, tone, topics, and so forth

  • Frequent consultations with the Head of UX included brainstorming, planning, and filtering and prioritizing discussions, until we reached an acceptable solution

Deep Dive

Metrics that measure user experience are important in any organization that relies heavily on data for decision making. When UX is left unmeasured, as it often is, it never gets as much focus or priority as other aspects of a business that are easier to measure, like sales volume or customer service response time, even though UX may be a critical component of business success.

It doesn't help that UX metrics are difficult to do well, because user experience is intangible in many ways. The challenge in implementing them, then, lies in a bit of creative problem solving: finding the right proxy data. I believe that more often than not, given a little effort, it's possible to identify a few metrics that accurately measure user experience.

For citizenM, whose user experience data collection had been limited to quick-hit surveys on the website ("Please rate your experience with this booking"), the task was to develop more robust measurements of the experience customers had across its digital properties.

We started by testing common industry measurement tools, like the System Usability Scale (SUS) and a few more recent alternatives. It turned out that in our context those surveys were too general, and in some cases the questions didn't make sense to the participants in our testing. So we started experimenting with our own questions.
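
For context, the SUS yields a single 0–100 score from ten alternating positively and negatively worded statements, which is what makes it easy to benchmark over time. The sketch below shows that standard scoring convention; it is a generic illustration, not code from this project.

```typescript
// Standard SUS scoring: ten statements rated 1–5, alternating positive and
// negative wording. Odd-numbered items contribute (rating - 1); even-numbered
// items contribute (5 - rating). The total is multiplied by 2.5 for a 0–100 score.
function susScore(ratings: number[]): number {
  if (ratings.length !== 10) {
    throw new Error("SUS requires exactly 10 item ratings");
  }
  const sum = ratings.reduce(
    (total, rating, i) => total + (i % 2 === 0 ? rating - 1 : 5 - rating),
    0
  );
  return sum * 2.5;
}

// Example: strong agreement with the positive items and strong disagreement
// with the negative items yields the maximum score.
console.log(susScore([5, 1, 5, 1, 5, 1, 5, 1, 5, 1])); // 100
```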


We started at the most basic level, with user satisfaction, ease of use, and goal success. We then added further topics, like aesthetic response, emotional affinity, and textual clarity. We experimented with different survey lengths to calibrate the right number of questions, and with different question wording to produce more accurate data. Finally, we wanted to minimize typical survey biases, like response bias, so we factored that into the question design.

Testing a custom survey like this is critical, so we ran it with a variety of customer types across a range of locales, refining it until the results were reliable. The next step was to roll it out as an intercept survey on the website.
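
To give a sense of what an on-site intercept can look like, here is a minimal sketch of a trigger that invites a small sample of visitors after they have spent some time on the site, and at most once per visitor. The sample rate, delay, and showSurvey hook are illustrative assumptions, not citizenM's actual implementation.

```typescript
// Minimal intercept-survey trigger: invite a small sample of visitors after
// they have spent some time on the site, and never more than once per visitor.
// The sample rate, delay, and showSurvey hook are illustrative assumptions.
const SAMPLE_RATE = 0.05;             // invite roughly 5% of eligible visitors
const MIN_TIME_ON_SITE_MS = 30_000;   // wait 30 seconds so there is an experience to rate
const STORAGE_KEY = "ux-survey-invited";

function maybeShowSurvey(showSurvey: () => void): void {
  if (localStorage.getItem(STORAGE_KEY) !== null) return; // already invited on a previous visit
  if (Math.random() > SAMPLE_RATE) return;                // not selected for this sample

  window.setTimeout(() => {
    localStorage.setItem(STORAGE_KEY, "true");
    showSurvey();
  }, MIN_TIME_ON_SITE_MS);
}
```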

The Outcome

The survey created a new category of data for citizenM, which helped raise the profile of user experience in the company's operations. With data on UX, product owners became aware of user experience issues on their own and had numbers to help prioritize the work. The company's UX maturity score increased from 3 to 4.

Contact me if you are looking for more detail. I can provide it privately. 
