This project will be completed in mid-December 2022. The high-fidelity user interface design and usability test are in progress. While we finish and publish the complete, reader-friendly case study, you can read about the design process of this project in "Social media, More Connectedness or Less Connectedness?"
Also, check out the initial user flow to better understand the system.
Throughout the Interaction Design Methods course, we explored, studied, and practiced numerous user experience research methods used in the industry. It is next to impossible to cover all the UX research methods that exist in four months. For instance, the book Universal Methods of Design by Bella Martin and Bruce Hanington discusses 100 UX research methods.
My professor, Aqueasha Martin-Hammond, came up with an amazing idea for getting to know other UX research methods, called the "UX Methods Madness Presentation." It was an opportunity to explore popular interaction design methods and analysis techniques that were not covered in the course.
The main idea of this presentation is to form a group of 4-5 people and choose one of the UX research methods from the list. After choosing a method, each team member has to find an academic paper or case study from the ACM Digital Library or IEEE Xplore that uses the method. Then they have to summarize and explain it.
The method we have chosen is Concept Maps.
Find my summary of Concept Mapping here.
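At its core, a concept map is a labeled graph: concepts are nodes, and linking phrases are directed edges between them, forming readable propositions. The sketch below shows one minimal way to represent and read such a map in code; the propositions are hypothetical examples made up for illustration, not taken from our actual study.

```python
from collections import defaultdict

# A concept map as a list of propositions: (concept, linking phrase, concept).
# These propositions are hypothetical examples for illustration only.
propositions = [
    ("Social media", "can increase", "Connectedness"),
    ("Social media", "can cause", "Isolation"),
    ("Connectedness", "improves", "Well-being"),
]

# Group outgoing links by source concept so the map can be read node by node.
outgoing = defaultdict(list)
for src, link, dst in propositions:
    outgoing[src].append((link, dst))

# Print each proposition as "concept --linking phrase--> concept".
for concept, links in outgoing.items():
    for link, dst in links:
        print(f"{concept} --{link}--> {dst}")
```

Storing the map as plain triples keeps it easy to extend during a session and to analyze afterward (for example, counting how many links each concept has).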
These presentations occurred every Thursday over the last month. The key to a good presentation is to be clear but succinct. Our goal was to introduce the method with enough detail that the audience gets a summary of what it is and how it can be used.
We had to cover:
The method and its history
How and when it is used in user research
Summaries of 2-3 of the papers/case studies that individual teammates researched (you do not have to summarize all of them due to time constraints)
Conclusion: My team members and I learned the nitty-gritty of Concept Mapping; we are glad we chose it and now have it in our UX arsenal, and we are excited to use it in our UX design process. Over these three weeks, we learned about several previously unexplored UX research methods, some of which were genuinely interesting and exciting. Almost all of the teams did a great job explaining their selected methods.
Gmail's settings are highly overwhelming; a quick online search showed I was not alone in thinking so, which is why I chose them for a usability test. The test consists of a Blink Test, an Expectancy Test, and several tasks.
The participant is Sydney Anuyah, a student at SOIC, IUPUI, pursuing a master's degree, who uses Gmail almost daily. I decided to conduct the study at home, in a studying environment, as a Contextual Inquiry.
The core issue with Gmail's settings is that everything is all over the place. The information is poorly organized and therefore difficult to navigate, which makes it unintuitive. It confuses even tech-savvy users, let alone older adults and people with various impairments.
While trying to configure the settings mentioned above, I spent a considerable amount of time looking up online tutorials, blogs, and the like. Gmail has over a billion daily active users, a very diverse user base; such a service should not be this hard to navigate.
As mentioned previously, this is a Contextual Inquiry. As the facilitator, I will ask the participant questions and ask him to perform predefined tasks. At the same time, I will observe his reactions, emotions, and mouse cursor movement, and take notes.
Gmail is widely used, so finding a real user is not difficult. I deliberately chose an educated, tech-savvy participant: if even he struggles, users who are less knowledgeable and less tech-savvy will likely struggle more. Contextual Inquiry seemed the way to go because the issues concern navigation and information architecture, so observation combined with moderated tasks is vital. Also, since I am testing with only one participant, I had the time and resources for a moderated session.
This usability test is measured using four UX metrics.
(Task) Kindly find the Settings in Gmail, starting from the home page.
Response: The participant found the gear icon within a few seconds. Note that later in the test, he confused the Settings and Support icons.
Another point: the participant did not discover "See all settings" at first. It took him some time to realize that it offered more settings.
Time taken: 5-6 seconds.
Blink Test: The participant was given 5-6 seconds to look at the setting panel.
Expectancy Test: The participant was given 15 more seconds to look at and interact with the settings.
(Task) Can you kindly change the language?
Response: It took the participant 6-8 seconds to find the language settings, then an extra 4-5 seconds to find the "Save Changes" button to save the changed settings.
The task was pretty straightforward for him; he was not confused or hesitant about anything else.
Completion rate: 100%
(Task) Kindly turn on/off the grammar and spelling checking.
Response: After searching for a few seconds, he could not find the setting; interestingly, he then searched in Support > Help and found guidance for changing it.
Time taken: About a minute.
(Task) Suppose you are on vacation, or it is a weekend, and you want to set an automatic email reply for whoever tries to reach you.
Response: The participant gave up browsing and went straight to "Help" to search for the option. He assumed he would not find it on his own, so he searched for the setting directly.
Interestingly, he found an option to set auto-replies for vacations. However, it did not let him set auto-replies for weekends or other recurring occasions.
There is a way to do that using templates, but there is no official documentation for it, so the participant could not find it.
Here, a weekend/out-of-office auto-reply could be implemented with an "on repeat" option, like recurring events in the Outlook calendar.
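The session notes above can be tallied into simple quantitative usability metrics such as completion rate and time on task. The sketch below is purely illustrative: the task names and timings are rough placeholders based on my notes, not exact measurements.

```python
# Illustrative tally of the session's observations into usability metrics.
# Task names and timings are hypothetical placeholders, not exact data.
tasks = [
    {"name": "Find Settings",               "time_s": 5.5,   "completed": True},
    {"name": "Change language",             "time_s": 11.5,  "completed": True},
    {"name": "Toggle grammar/spell check",  "time_s": 60.0,  "completed": True},
    {"name": "Set weekend auto-reply",      "time_s": 120.0, "completed": False},
]

def completion_rate(tasks):
    """Fraction of tasks the participant finished successfully."""
    return sum(t["completed"] for t in tasks) / len(tasks)

def mean_time_on_task(tasks):
    """Average time in seconds, counting completed tasks only."""
    done = [t["time_s"] for t in tasks if t["completed"]]
    return sum(done) / len(done)

print(f"Completion rate: {completion_rate(tasks):.0%}")
print(f"Mean time on completed tasks: {mean_time_on_task(tasks):.1f} s")
```

Keeping the raw per-task records separate from the metric functions makes it easy to add more participants later and aggregate across sessions.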