Hala 🤩 !
Here is Part 5 of this series, in which I’ll talk about Design Sprint Foundations – the Udacity Nanodegree. You can view the previous parts:
- Design Sprint Foundation P1 – Intro
- Design Sprint Foundation P2 – Day one
- Design Sprint Foundation P3 – Day two
- Design Sprint Foundation P4 – Day three
In this article we’ll talk about Day 4, the last day of the sprint: its activities and deliverables in detail. So let’s begin.
This is the day we get real user feedback on the ideas and the solutions to our sprint questions that we set on Day 1 of the sprint. Now the 3 Modules on Day 4 are going to determine the success of your entire sprint.
- Module 7: We’re going to look at how to recruit user testers.
- Module 8: We’re going to look at how to set up and run the actual user interview itself.
- Module 9: We’re going to look at how to write a sprint summary that answers all of the sprint questions and gives the team clear next action steps.
How to recruit for interviews
This is actually an ongoing process that can begin on Day 2 (recruiting users takes time, so start early!), once the sprint team has decided on the ideas they’d like to test. The sprint team will agree on a target demographic of users who will benefit from and are most likely to use your product.
Here are four ways that you can recruit for your interviews:
- Social Media Ads: Write an Ad or post that targets your specific demographic.
- Online Forums: Go to online forums like Quora or Reddit and reach out to individual users who are already discussing products similar to what you’re working on.
- Existing Customers: Ask existing customers. If you’re working on a product or service that has already been released into the market, there might be some open lines of communication with users who are already giving feedback to the current product or service.
- Friends & Family: Ask friends or family. Often you’re working on challenges, products, or services that you know would benefit your friends and family. It’s not a cop-out to send them a message and see if they have time. You’ll be surprised how happy these people will be to help you.
When you need to reach out to user testers in a very short time frame, keeping it simple is incredibly important.
Use a Scheduling Tool
Using an automatic scheduler is a lifesaver when it comes to last-minute test scheduling. Tools like Calendly or Need to Meet will help remove the logistical difficulties around meeting coordination. Between timezones and availability, you’ll need all the help you can get. Check out this link for Best Meeting Scheduler Apps and Tools.
Leverage Your Network
Using Networking sites like LinkedIn, Quora, Medium, and Reddit to target particular groups of people can be incredibly effective. Do you know people who might be interested in your product? Just ask your community of professionals. You’ll be surprised who will express interest and show up!
It’s a great idea to have one or two people take on the responsibility of logistics for your user testers. As you are taking up their time, you need to make sure all communication is clear, effective, and considerate.
Your goal is to secure 5 user testers for testing on Day 4. This is an efficient number of tests that will still allow clear patterns to emerge from the data. We’re aiming for qualitative rather than quantitative data in these tests (if you don’t know the difference between the two types of data, please google it). If you’re curious about how the magic number 5 came about, read this article by the Nielsen Norman Group.
Typically you want to start the process near the end of Day 2, once there is a clear idea from the Storyboard and the team has identified the target demographic.
Here’s a Checklist for tasks to complete on Day 2:
- Define the target demographic
- Envision and define how the test will be conducted (in-person interview? remote test?)
- Try out a social media ad that targets the selected group
- Decide on a gift and budget to thank each user for their time. Company swag or a gift card work well!
And Here’s a Checklist for tasks to complete before testing, so on Days 2 & 3:
- Schedule times for the 5 user testers: block an hour for each test (40 minutes for the test, 20 minutes to reset and prep)
- Use a Calendar scheduling tool like Calendly to set up each test session
- Create a custom video call link if the tests are remote.
- Create a templated response for the user testers who reply to your ad, to keep each interaction uniform.
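As a tiny illustration of that last point, a uniform reply can be generated from a template with a few lines of Python. The wording, placeholder names, and link below are all invented for the example:

```python
from string import Template

# Hypothetical reply template -- the wording and placeholders are
# illustrative assumptions, not part of the sprint method itself.
REPLY = Template(
    "Hi $name,\n"
    "Thanks for responding to our ad! We'd love to have you test an early "
    "prototype. Sessions run about 40 minutes over video call.\n"
    "You can pick a time slot here: $scheduling_link\n"
    "As a thank-you, we'll send you a $gift.\n"
)

def build_reply(name: str, scheduling_link: str, gift: str) -> str:
    """Fill in the template so every tester receives a uniform message."""
    return REPLY.substitute(name=name, scheduling_link=scheduling_link, gift=gift)

print(build_reply("Sam", "https://example.com/schedule", "gift card"))
```

A mail-merge in a spreadsheet tool does the same job; the point is simply that every tester gets the same framing and the same scheduling link.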
User Feedback Wall
Once you have secured your 5 testers, you can prepare your Testing Feedback Wall. Generally, you want a column for each tester, including a note on their time slot. Then, as rows, include each section of the Prototype that you want to focus on for feedback collection. This wall will fill up with notes taken by the team members who are listening in and observing each of the tests.
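To make the layout concrete, here is a minimal sketch of that grid as a data structure: testers as columns, prototype sections as rows, with each cell collecting the observers’ notes. The tester names, time slots, and sections are made up for illustration:

```python
# Sketch of a User Feedback Wall: rows = prototype sections, columns = testers.
# All names and sections here are illustrative assumptions.
sections = ["Landing ad", "Rewards program", "Point system"]
testers = {"Tester 1": "10:00", "Tester 2": "11:00"}  # tester -> time slot

# Each cell holds the sticky notes observers write during that tester's session.
wall = {section: {tester: [] for tester in testers} for section in sections}

# During a session, observers drop notes into the right cell, tagged
# "+" (positive) or "-" (negative), like the two post-it colors.
wall["Rewards program"]["Tester 1"].append(("+", "Loved the points idea"))
wall["Point system"]["Tester 1"].append(("-", "Confused by point values"))

for section, columns in wall.items():
    total = sum(len(notes) for notes in columns.values())
    print(f"{section}: {total} note(s)")
```

On a physical wall, the same grid is just tape and sticky notes; the structure matters more than the medium.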
Module 8 – The Interview
The sprint team will have decided on one person to conduct the interviews. They will do the interviews away from the rest of the team, whose job is to capture all of the feedback that the user gives to the interviewer.
You’ll be collecting both positive and negative feedback from the user. So using two colors of post-its is a great way to represent this on your feedback wall.
The sprint team, and in particular the interviewer, can easily derive the interview questions by checking in on the user test flow, the Storyboard, as well as the sprint questions created on Day 1.
Setting up Both Rooms
Typically for these user interviews, you need one interview room and one feedback room. Usually, the sprint team will be using the same room from the Design Sprint as the feedback room.
The interview room should be secluded so that the user tester is not distracted or intimidated by events or people in the office. If it’s a remote interview, the interviewer should still set up in a secluded room to reduce noise interference and maintain a sense of professionalism within the testing space.
The feedback room should have the User Feedback Wall set up and ready for feedback collection. This can either be on a physical wall or on a moveable whiteboard; both options are great. The most important aspect is that everyone on the sprint team can see the feedback coming in and can add their own feedback after each user test session.
Ideally, the user won’t see the feedback room or meet the observers taking notes. Of course, the user tester should be notified that people are watching and listening in on the session, but generally you don’t want the user to meet the people writing the feedback, as their actions and thoughts might be influenced by knowing who is watching them.
Device Set up
For a digital test, typically the user tester will use a device provided by the sprint team to test the Prototype. If possible, you should share or cast the device’s screen to a display in the feedback room. This way, the observers will be able to see how the user is interacting with the Prototype screens. It’s also important to set up audio, either with the device’s built-in microphone or with an external microphone in the room. Being able to hear what’s happening in the interview room is crucial for collecting feedback.
Observers should not talk or comment on the test during the sessions. Instead, they should silently write single pieces of reaction or feedback on sticky notes. Using two colors for positive/negative feedback is a great, simple way to visually identify the type of feedback coming in for each section.
You’ll need to come up with a script or set of interview questions that you’ll ask each of your user testers. The more uniform you can make the interactions, the less likely the environment and setting will change or affect the user testing outcome.
Tips on Writing Questions:
- Review the Sprint Questions
- Review the Long-Term Goal of the company
- Check out the Storyboard, specifically the Key Screens selected by the team
- Try to keep each of your questions open-ended so that the user won’t just give a yes or no answer
Example Interview Questions:
- How did the ad make you feel?
- What do you think about the Rewards program?
- What are your thoughts on the point system for the tasks?
- What do you think about the types of tasks available in the program?
- How did the special offer make you feel?
User interviews can happen on location in person, or done remotely online. If you’re running the user interview, your first job is to make sure your user is comfortable. You want to create a safe space where they can give you open and honest feedback on the ideas on the Prototype that you’re going through with them.
To build a rapport with your user, it’s a good idea to warm up by asking: “Did you find your way to the location easily?” or “How’s your day going?”
You can remind them that you’re not testing them; you’re testing the Prototype. Not everything in your Prototype will work, so ask your user to say their thoughts out loud. There are no right or wrong answers here. By capturing the user’s thoughts, you’ll be able to follow exactly what they would like to do next, and your team will be one step closer to building a product that will actually be useful.
If you find that your user is not opening up and talking openly about what they’re seeing in front of them, then try asking open-ended questions. You’ll be surprised how quickly people jump in and try to finish off your sentence if you just let it trail… off.
Suggestions on User Interview Conduct
Start your contact with the user tester by thanking them for taking the time to come in for the test. Let them know how user testing is a critical part of the team’s process. Frame the user test as a test for the product, not a test for the user.
Acknowledge their feedback
You can make audible affirmations as often as you think necessary. These sounds signal that you’re taking in what the participant is saying and that you’d like them to continue along the same lines. Note that these affirmations indicate you understand what the participant is saying, not that you necessarily agree with it.
Clarifying User Phrases for Observers
If the user mentions something on the screen but doesn’t give any identifying information, you may want to do a little bit of clarification to make it easier for the observers to follow the action. For instance, when the user says “That was weird!,” you can say, “Which part was weird? Can you be more specific?” Here are some more examples of clarifying questions:
- What do you think about this section?
- Is that what you expected to happen?
- Was there something in particular that made you click that?
Sometimes it helps to give a short summary of what the participant just said to make sure that you’ve heard and understood correctly. And it confirms to the user that you are listening to what they are saying. Here’s a great resource for Active Listening, a great skill to employ during activities like User Interviews.
Module 9 – The Sprint Summary
This is the last part of the sprint. We’re going to look at how the team comes together to summarize the feedback, and also at how to write a detailed summary of the sprint.
Once the user interviews are all complete, the sprint team should huddle again, look at the user feedback wall, and derive the trends from the feedback across all of the interviews. Having these trends formalized before you write the summary will help you keep the summary succinct and on point.
Writing a Succinct Summary Report
Writing a succinct sprint summary can be broken down into five parts:
- Top Trends: Open your summary with one or two sentences that generally capture the overall feedback of the user testers. This could be in a positive light, or in a negative light, but be honest.
- Long-Term Goal Reflection: Reflect back on the 2-year goal set on Day 1. Does this goal seem realistic given all of the user feedback collected from the testing?
- Sprint Question Answers: Answer the Sprint Questions. Use the trends that you’ve highlighted across all of the feedback to clearly answer the questions in a yes or no fashion.
- Next Action Steps: Give clear next action steps. To help your team make progress on this project, look for 3-5 ways the team can use the feedback and the outcomes of the sprint to move closer to releasing a real product. For example, did one or two features resonate strongly with your users? Or did they give suggestions on how they’d like a feature to work? Now you have a basis to iterate on this week’s outcome.
- Detailed Prototype Feedback: Give a detailed breakdown of the individual feedback for each feature or idea. As the user has been moving through the Prototype, there’s going to be bullet points of things that worked well, as well as things that didn’t work so well. You want to capture all of these points here and it could get quite long. So make sure that you’re lifting the important points to the top of the report.
How to Define the Next Steps
- Start by clustering the notes from each test. Look for similarities in the feedback from all the users. With two colors for positive and negative feedback, you can easily spot clusters of positives; those generally mean that part of the Prototype worked well.
- Aggregate insights from the notes on the grid and the synthesized notes grouped on the wall: start writing down the main repeated feedback as bullet points under each aspect that was tested. It might be useful to use a tool like Google Docs or a spreadsheet to organize all of the feedback and review it as a team.
- Start answering your sprint questions; this will lead you towards a clearer picture of possible next steps. Answer all the Sprint Questions in depth and come to a Yes or No conclusion for each.
- Finally, collect ideas on how to improve the prototype, or decide whether to pivot to a different solution to be tested. You may want to pivot towards a different solution if it’s clear that none of the features worked as you intended.
- Your Next Step Options: Improve, Pivot, or Launch.
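If you move the sticky notes into a spreadsheet or a small script, the clustering step above reduces to a tally of positive vs. negative notes per tested aspect. The notes below are invented purely for illustration:

```python
from collections import Counter

# Invented example notes: (aspect tested, "+" or "-", feedback text).
notes = [
    ("Rewards program", "+", "Clear value"),
    ("Rewards program", "+", "Would sign up"),
    ("Rewards program", "-", "Unsure about expiry"),
    ("Point system", "-", "Values felt arbitrary"),
    ("Point system", "-", "Hard to compare tasks"),
]

# Tally positive/negative notes per aspect to surface the trends.
tally = Counter((aspect, sign) for aspect, sign, _ in notes)

for (aspect, sign), count in sorted(tally.items()):
    print(f"{aspect} [{sign}]: {count}")

# Aspects dominated by "+" worked well (improve or launch);
# aspects dominated by "-" are candidates for iteration or a pivot.
```

The counts themselves aren’t the point in a qualitative study, but they make the dominant trends obvious before you write the summary.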
And that was the last post covering Design Sprint Foundations – the Nanodegree program offered by Udacity. I highly recommend this program for any designer; you’ll find out much more there!
Again, remember that the Design Sprint Kit site by Google is also here to help. And keep in mind that this isn’t the original way of implementing a Design Sprint; it’s the adapted process used by AJ&Smart, and that adaptation is exactly why it’s so useful and productive!