Hearing the voice of the customer in the enterprise product development lifecycle
As a large organisation, Telstra recognised it needed to develop an effective and efficient customer experience evaluation framework to support its full product portfolio.
However, the products Telstra develops generally comprise multiple services, rather than single products or interfaces. As a result, Telstra needed a robust evaluation methodology that would generate specific, measurable results for those services, as well as provide visibility of the user experience.
A consistent but flexible evaluation framework for Telstra products
In this presentation, we will describe the methodology we have developed – an approach that is consistent but flexible enough to address products that are diverse and frequently complex. The key aim of our methodology is to introduce the voice of the user into the enterprise’s mature product development process, while managing varied stakeholder expectations.
Details and concrete examples
We will explain how Telstra’s UX testing program has met the challenges of this environment, illustrated with plenty of examples from our recent research. In particular, we will draw on our pre-launch service walkthroughs of two service products (Telstra Platinum and Telstra Cloud services) that recently entered the market.
We will provide rich descriptions, using a variety of formats, of the core aspects of our framework that we believe are central to the success of our program:
- Our ‘three pillar’ framework, which defines product evaluation needs across (i) product features, (ii) the technical and service environment, and (iii) usability. This framework maps the design and evaluation needs of individual products and services, and gives a common point of reference for deciding which aspects of evaluation belong in a service walkthrough rather than in other evaluation activities.
- Stakeholder engagement processes, which build common ground, providing a basis for informed and appropriate research design, while enabling rapid ramp-up and delivery in line with tight product development timelines.
- Examples from our lab-based service walkthroughs showing how we have adapted this technique to evaluate key touchpoints from across the end-to-end service experience (including customer service, pre-sales activities, phone-based interactions and collateral) using scenarios, Wizard of Oz techniques, role-play and prototypes.
- Integration of advocacy metrics – now fundamental to Telstra’s way of working and performance tracking – into our evaluation process.
- Approaches to capturing and representing user experience insights that align with Telstra’s processes and help ensure the experience meets business expectations before launch.
Why listen to us?
Telstra and U1 Group have been working together for over a year, drawing on U1’s 12 years of research expertise to develop and implement an evaluation framework which is:
- Consistent and repeatable, but applicable to diverse product types
- Compatible with the enterprise product development process and decision checkpoints
- Responsive to stakeholders’ diverse needs and varied expectations of what ‘user testing’ should entail
- Effective in giving visibility to user experience issues
Our approaches are proving highly successful and garnering considerable interest within Telstra; having delivered excellent outcomes for several high-profile products, the UX testing program is expanding to additional product portfolios.
We believe that our presentation will be relevant and helpful to testers, designers, UX managers, CX professionals and market researchers seeking to establish evaluation approaches that go beyond user interfaces to incorporate service components – in particular where these approaches need to be applied in a large or mature enterprise environment.