Hollie

Week 7: Prototyping and Usability Testing (Part 1)

Updated: Apr 23, 2023

1. Setting the scene


1.1. Breadth of analysis: exploring key skills and domains – creating wireframes and interactive prototypes


This week, we started the process of transforming our project wireframes into prototypes. I'd been looking forward to getting stuck into this phase of the design process, as prototyping was one of the elements I enjoyed most during the rapid ideation sessions I carried out in the Development Practice module (my other favourite element being visual design, which I'm looking forward to diving into next week).


While I was glad that I'd already started getting to grips with some of Figma's prototyping capabilities during Reading Week, I was also conscious that I hadn't yet created a full wireframe flow for my project. I set myself the goal of mapping out a full set of basic screens by the middle of the week, including some basic prototyping interactions between each state. While time-consuming, this was an extremely helpful exercise for getting my initial ideas out of my head and into something more tangible (see Fig. 1).



Fig. 1: Full set of wireframes including interactions


1.2. SMART goal: explore Google's Material Design system to get inspiration for potential design patterns


Before starting on this week's challenge activities, I booked some time with my tutor to get some feedback on my initial flow. During our discussion, it became clear that, while I'd been practising with some basic interactions like interactive checkboxes, I hadn't yet given much thought as to how I might harness existing UI design patterns to create a more familiar, intuitive experience for my users.


Following feedback from my tutor, I plan on spending some time in Weeks 7 and 8 exploring Google's Material Design system, to help generate ideas for common design patterns that I might be able to effectively apply in my project (for example, modals, progress indicators, buttons, text input fields, and alternatives to checkboxes, like clickable filter chips).


2. Exploring usability testing


As I learned more about usability testing this week, I started to think about how I might be able to apply it more consciously to my practice as a UX Writer. All of the usability tests I've come across in my UX work so far have focused primarily on the interactions a user must take to complete a certain action. These tests have often been measured using quantitative metrics, such as task completion within a certain timeframe or a certain number of clicks. In contrast, I haven't yet seen a usability test carried out that focuses primarily on the language used in an interface.


Of course, language is often bound up with interactions themselves – if the instructions in an interface aren't clear, users will struggle to achieve their goals. But I wanted to explore how I might be able to run usability tests that focus less on how functional the words used in an experience are, and more on how they make users feel. What associations do the words conjure up for them? If the interface had a personality, how would users describe it, based on the way it speaks? How do the words make users feel about the experience as a whole?


These kinds of questions perhaps fall more under the umbrella of 'desirability' – a key tenet of UX alongside usability and usefulness (e.g. Kreitzberg & Little 2009). But I also wonder if it's really helpful (or fully possible) to separate these concepts while testing. If we believe that "usability is the base level of the user experience", we also have to acknowledge that "without desirability, it’s unlikely that the user experience will be memorable or recommendable to others" (Interaction Design Foundation 2021).


Viewed in this way, then, testing the desirability of the language used in a product alongside its usability can only enhance the overall user experience. I was especially encouraged to find that Marsh (2022: 119) describes "content testing [as] a specific type of usability testing that focuses on how suitable and understandable your content is for the intended audience. It can be done as part of usability testing or separately."


With the above questions and Marsh's insights in mind, I decided to explore further – firstly, by conducting a mini language-focused usability test myself (see 2.1), and secondly, by setting myself the goal of getting more familiar with usability testing tools to be able to conduct language-focused research in my professional work (see 2.2).


2.1. Breadth of analysis: exploring key skills and domains – conducting usability testing


To conduct my mini language-focused usability test, I used the framework laid out in this week's challenge activities. I began by creating two tasks, focusing on making them realistic, actionable, and free of leading phrasing.


In task 1, I used both quantitative and qualitative metrics to assess whether the copy was clear enough to guide users to a particular screen. I focused on messaging more explicitly in task 2, using qualitative metrics to assess the copy's effectiveness:


Task 1


Start point: ‘Start the quiz’ screen

End point: ‘Skills to share’ screen

Goal: Users should be able to select specific skills they’d like to share while completing the quiz.

Hypothesis: If users are able to reach the ‘skills to share’ screen without getting confused by the ‘skills to gain’ screen, the copy is sufficiently clear.

Task: Show me how you’d select skills that you’d like to share while volunteering.

Metrics: Quantitative: Users should be able to reach the relevant screen in 4 clicks. Qualitative: User satisfaction/frustration/confusion.


Task 2


Start point: Onboarding screen 1

End point: Onboarding screen 3

Goal: Understand how users feel about the tone of the messaging used in the onboarding screens.

Task: Look at the information on each screen. What do you think you can do using this app? How does the messaging make you feel? Is anything unclear?

Metric: Qualitative: Users feel encouraged or excited by and/or positive about the messaging.
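To keep the tasks consistent across sessions, it can help to record them as structured data rather than loose notes. The sketch below is purely illustrative – the field names, the `UsabilityTask` class, and the `within_click_budget` helper are my own inventions mirroring the two tasks above, not part of any testing tool.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class UsabilityTask:
    """One usability-test task, with its quantitative and qualitative metrics."""
    name: str
    start_point: str
    end_point: str
    goal: str
    prompt: str
    max_clicks: Optional[int] = None  # quantitative threshold, if the task has one
    qualitative_metrics: list = field(default_factory=list)

task1 = UsabilityTask(
    name="Task 1",
    start_point="'Start the quiz' screen",
    end_point="'Skills to share' screen",
    goal="Select specific skills to share while completing the quiz.",
    prompt="Show me how you'd select skills that you'd like to share while volunteering.",
    max_clicks=4,
    qualitative_metrics=["satisfaction", "frustration", "confusion"],
)

def within_click_budget(task: UsabilityTask, observed_clicks: int) -> bool:
    """True if the participant met the task's click threshold (or none was set)."""
    return task.max_clicks is None or observed_clicks <= task.max_clicks
```

Keeping the hypothesis and metrics in one place like this makes it easier to score each participant's session the same way, rather than judging success ad hoc.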


I decided to specify start and end points for the parts of the flow I wanted to test, so that I could focus on refining specific areas of my wireframes to present to participants. At this point, however, I feel I got a little carried away with the visual presentation of my wireframes. I wanted them to be more visually pleasing before I showed them to potential users, rather than embracing the 'quick and dirty' approach of getting rough designs into users' hands as soon as possible (see Fig. 2 for my slightly over-refined prototypes). I'd like to be mindful of this as I move through the rest of the module and onto other projects, testing earlier and more often so that I can make incremental changes as I go.



Fig. 2: Slightly over-refined wireframes...


I decided to try out the research tool Lookback to carry out my usability tests with three participants. This was a useful learning experience in itself: while I was careful to provide some basic instructions to participants, including asking them to talk out loud as they moved through the screens (as recommended by e.g. De Voil 2020: 132; see Fig. 3), I could have been more explicit in how they needed to interact with the tool. On one participant's first go, for example, they became stuck on the very first screen, and 'completed' the test before seeing any of the other content. They were able to complete the task successfully on a second attempt, but I could have prevented this by providing clearer guidance.


Next time I carry out usability testing with a tool like Lookback, UserZoom, or UserTesting, I plan on carrying out a "dummy run session" (Marsh 2022: 21) to iron out any sources of potential confusion for participants before beginning the research properly.



Fig. 3: Lookback usability test set-up


Despite these setbacks, I was able to gain some valuable feedback from the usability tests. All three participants were able to complete Task 1 successfully, suggesting that the language used to guide them to the target screen was clear. For Task 2, I used Lookback's built-in transcription tool to analyse how participants described the messaging in the app and gathered the keywords into a word cloud (see Fig. 4). I plan on using these keywords to help me craft a voice chart for my experience as part of the goal I set for myself in Week 5.
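The keyword-gathering step can be sketched in a few lines of code. This is a minimal, hypothetical version of what a transcription-to-word-cloud pipeline does: tally the descriptive words participants used, minus common filler words, before feeding the counts into a word-cloud generator. The transcripts and stopword list here are made up for illustration, not my real session data.

```python
from collections import Counter
import re

# Illustrative stopword list; a real analysis would use a fuller one.
STOPWORDS = {"the", "a", "it", "is", "and", "i", "this", "to", "of", "feels", "quite"}

def keyword_counts(transcripts):
    """Tally lowercase words across transcripts, excluding stopwords."""
    counts = Counter()
    for text in transcripts:
        words = re.findall(r"[a-z']+", text.lower())
        counts.update(w for w in words if w not in STOPWORDS)
    return counts

# Hypothetical participant quotes about the onboarding messaging:
transcripts = [
    "This feels friendly and encouraging.",
    "The tone is warm, quite friendly.",
    "Encouraging, and the copy feels clear.",
]

print(keyword_counts(transcripts).most_common(3))
```

The resulting counts are exactly what word-cloud tools consume: the more often a word appears across participants, the larger it renders in the cloud.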



Fig. 4: Word cloud generated using usability testing transcripts


2.2. SMART goal: become more familiar with usability testing tools to be able to conduct language-focused research in my professional work


I'd like to further develop my skills in applying industry-standard user research tools so that I can gain deeper insights into the language used and preferred by the customers I write for in my professional role. Over the next two months, I plan on completing training for the UserTesting platform and finding a mentor at work to help me learn how to set up an unmoderated usability test, create a research script, and evaluate, analyse, and present the results to my peers.


I'd also like to explore the possibility of completing the UX Content Collective's UX Content Research and Testing course (2023), to give me a better understanding of content testing best practices as applied by other UX writing professionals in their day-to-day work.


3. Conclusions


Now that I have my full user journey mapped out in wireframe form, as well as some initial insights on the tone of my experience so far from my usability testing, I'm excited to start bringing my designs to life next week by applying visual design elements like colour and typography. I need to try and keep the 'test early, test often' mantra in mind as I go, regularly seeking feedback so that I can make my designs as useful, usable, and desirable as possible.


The experience of gaining feedback at this stage will be an interesting exercise in practising openness and humility. As Hamm (2014: 77) states, "we do ourselves no favour [sic] by staunchly defending a solution that is not comprehensive. Sometimes we have to back up and try again" – which, it seems, is the name of the UX game.


3.1. How satisfied do I feel with my work this week?


1 = Very satisfied, 2 = Quite satisfied, 3 = Neutral, 4 = Quite frustrated, 5 = Very frustrated


Start of week (pre-activities): 2 – excited to start prototyping; encouraged by the work I'd already done to familiarise myself with Figma's prototyping capabilities; encouraged and inspired by tutor feedback to explore more sophisticated prototyping and design techniques


Usability testing: 2 – enjoyed exploring 'language-focused' usability testing; learned about the importance of clear participant instructions when setting up test scenarios


4. References


DE VOIL, Nick. 2020. User Experience Foundations. BCS, The Chartered Institute for IT.


HAMM, Matthew. 2014. Wireframing essentials: an introduction to user experience design. Birmingham, England: Packt Publishing.


INTERACTION DESIGN FOUNDATION. 2021. ‘Usability vs Desirability in Mobile UX’. Interaction Design Foundation [online]. Available at: https://www.interaction-design.org/literature/article/key-question-in-user-experience-design-usability-vs-desirability [accessed 13 March 2023].


KREITZBERG, Charles and Ambrose LITTLE. 2009. ‘Usability in Practice - Useful, Usable and Desirable: Usability as a Core Development Competence’ [online]. MSDN Magazine, vol. 24, no. 5. Available at: https://learn.microsoft.com/en-us/archive/msdn-magazine/2009/may/usability-in-practice-useful-usable-and-desirable-usability-as-a-core-development-competence [accessed 13 March 2023].


MARSH, Stephanie. 2022. User Research. 2nd edition. London/New York/New Delhi: Kogan Page Limited.


UX CONTENT COLLECTIVE. 2023. 'UX Content Research & Testing'. UX Content Collective [online]. Available at: https://uxcontent.com/content-research-testing/ [accessed 22 April 2023].




