British Army
The brief
To Inspire, Inform, and Engage
Our brief was to evaluate the existing structure and content of the British Army MVP site, focusing on identifying user needs related to the Careers, Corps, Regiments and Units, Ranks, and Personnel and Welfare pages, and their respective user journeys.
This involved capturing user behaviours, attitudes, and general feedback on the new MVP site. The objective was to enhance its content and functionality, ensuring that it effectively inspired, informed, and guided users.
What we did
In-depth Moderated User Testing Sessions
The user research sessions involved testing the site on both desktop and mobile devices, and each comprised a short semi-structured interview, a series of tasks, and a debrief discussion. Audience segments included Serving Personnel, Partners of Serving Personnel, Potential Recruits (Officers and Soldiers), and Veterans. The sessions were conducted over three separate days, with findings from each session informing our approach to those that followed.
To capture the mobile experience, we used 'Mr. Tappy,' a kit designed for recording mobile devices from the user's perspective. Although good in theory, this setup prevented users from actually holding their devices, hindering the capture of their natural browsing experience on the mobile site. Furthermore, it limited our ability to record users' gestures, facial expressions, and comments.
After observing these limitations firsthand during the first few sessions, I developed a new approach using Vysor (https://vysor.io), a suggestion credited to my colleague Guillermo.
What we did
User Testing as a 'Team Sport'
Treating user testing as a 'team sport' makes it a collaborative exercise, requiring team members from all disciplines to observe the live testing sessions.
I used GameShow, a game-streaming app, to capture the website and screen activity of participants. GameShow also enabled us to stream the sessions to YouTube so that team members could drop in and out of sessions; this was important, as flexibility and incentives are essential to encourage participation in user testing.
Collective Team Analysis Sessions
I ran an analysis 'wash-up' session with the project team, which allowed each team member to provide feedback on the sessions they observed. Following this, we discussed the issues and pain points identified, exploring potential solutions. To streamline the discussion, we adopted the following structure: 'What we saw and heard', 'What we thought it meant', and 'Why we thought it mattered'.
The outcome
User Testing and Findings Report
I compiled a research report detailing key observations and insights related to engaging users, creating awareness, and guiding them into, and through, the career funnel. To provide context for each observation and insight, I included relevant quotes and recommendations. Each recommendation was also evaluated in terms of 'benefit to the user' and 'effort required'.
The core finding was that the user journey was sub-optimal on mobile devices: users struggled to find the content they needed and were discouraged by the outdated look and feel of the site – a significant issue for the Army's digitally native, mobile-first target audience of 16–24-year-olds.
The redesigned website needed to:
- Inform the audience about opportunities available in the Army, no matter their ability.
- Educate on what to expect from a life in the Army.
- Convert knowledge into action, turning curiosity and excitement into registration.
- Prepare the user for the realities of application and assessment.
We mapped these findings to user journeys, recommending a structure that split content by audience. This allowed for unique, targeted experiences that enabled the site to meet the needs of three distinct user groups – regular, reserve, or specialist.