Inclusive Design, Product Design
In a participatory design project, guided by our user's insights, the Converse device was designed to enable deaf users to have meaningful communication in their desired social environments.
Client: Student Project
My Role: Inclusive Designer, User Researcher, Prototyper
Tools: Sketch, InVision, Photoshop, Pen & Paper
Converse is an application that helps deaf people communicate with others at work or in their personal lives in a way that creates deep, meaningful connections.
The design dimensions, technologies, and ideas were introduced during the five phases of the participatory design activity. In each phase, Lori picked the aspects of the design she wanted to pursue.
Lori Leal was born deaf
Learned sign language at age 14
Strong comprehension and vocabulary
Learn About Lori
Identifying unmet needs or problems without a solution.
Defining design ideas and opportunities.
Hobbies & activities
Lori has been experiencing challenges in conversing and connecting with others, which has caused feelings of isolation and loneliness.
The major problems and needs found in the research
Construct a point of view based on Lori's needs
During the data analysis process, four key tasks were defined that Lori may want to complete with the design. The tasks are independent of any specific technology so that we could still explore many designs.
1. Converse independently
Talk to someone without an interpreter.
Respond to a conversation in daily life or at work, or make a quick comment in a fast-paced group conversation.
2. Invite people to converse
Act on opportunities to converse with new people.
Respond to a situation (e.g. ordering food).
3. Have a deep one-on-one or group conversation
Have a nuanced conversation (adopting a position on a topic, expressing an opinion, etc.)
Discuss a topic with a co-worker or family member.
4. General conversation skills
Like social confidence, comprehension, etc.
BRAINSTORM OF IDEAS AND SKETCHES
Through sketching, we brainstormed a range of design ideas for supporting the above key tasks and considered which platforms might be helpful, including physical interactive objects, smartphones, tablets, and laptops.
Avatar or hologram that translates sign language to speech, and vice versa
360-degree camera with live captioning and text to speech system
Smart contacts with live captioning feature
ASL to speech translation system
Real-time text communication where both people can type at the same time.
IDEATE DESIGN SOLUTION
PRE CO-DESIGN PROTOCOL PILOT TEST
We tested the protocol with pilot participants to make sure the specific prompts, questions, and materials worked as intended, and corrected any errors.
Create one design/solution using Lori's lived experience
We distributed design materials (paper, markers, etc.) to Lori and the team and emphasized that no one was bound by any of the previous ideas: they could sample the best parts to create a new idea, or work from an existing idea if they wished.
Started with storyboarding to represent a nuanced need of Lori's
Based on the storyboard, everyone created initial sketches and prototypes.
Encouraged divergent thinking and made note of all the idea threads Lori generated
Discussion and New Design Direction:
Considering Lori’s insights from the co-design session, we designed a device including a 360° camera and an accompanying app called Converse, which would provide meaningful communication within desired social environments.
BUILDING INTERACTIVE PROTOTYPE
Defining the Functionalities
Interaction Design & Visual Design Iterations
Evaluation & Walkthroughs
The formal testing was done with the InVision prototype: we observed Lori interacting with the prototype to complete the test tasks. We also asked questions to understand her satisfaction with the design concepts within the app and how it would function in a real-life situation.
Key Findings from the Testing
Overall, Lori was very positive toward the prototype. She completed the tasks without issues and appreciated the simplicity of the design.
One of the most valued features was the app's ability to detect and display background noise. Lori stressed that, without the ability to hear, Deaf people lack access to the sounds occurring around them. Background noise detection embedded right into the chat would provide an awareness that is otherwise missing for a Deaf person in a hearing environment. Her only critique was to change the color of the background-noise banner so it would stand out from the conversation.
The camera captures the live video of all people in the conversation and the application shows a waist-up view of all people on the top half of the mobile screen, with a larger image of the person talking.
The live translation of the conversation overlays on the bottom half of the screen with the name of people.
The clickable yellow notification banner within the conversation shows the background noise.
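The chat view described above interleaves two kinds of entries: live captions attributed to speakers, and background-noise notifications rendered as the yellow banner. A minimal sketch of that idea, in Python, might merge the two streams into one chronological feed; all names here (`Caption`, `NoiseEvent`, `build_feed`) are hypothetical and not part of the actual Converse implementation.

```python
from dataclasses import dataclass

@dataclass
class Caption:
    timestamp: float   # seconds into the conversation
    speaker: str       # name shown next to the caption
    text: str          # live-transcribed utterance

@dataclass
class NoiseEvent:
    timestamp: float
    description: str   # e.g. "phone ringing nearby"

def build_feed(captions, noise_events):
    """Merge captions and background-noise notifications into one
    chronological feed, tagging each entry so the UI can render
    noise events differently (e.g. as a yellow banner)."""
    feed = [("caption", c.timestamp, f"{c.speaker}: {c.text}") for c in captions]
    feed += [("noise", n.timestamp, n.description) for n in noise_events]
    return sorted(feed, key=lambda entry: entry[1])
```

Keeping the noise events in the same timeline as the captions, rather than in a separate panel, reflects Lori's feedback that the awareness belongs inside the conversation itself.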
From this project, I have learned that the creation of a new product should be people- and problem-focused, not feature-focused. Learning from people with a range of perspectives through co-design activities was a valuable experience that brought me closer to the final user of the product. Regarding the iterative process, I realized that the solution and design turn out very different when done directly with the final user through co-design sessions. In addition, by working on inclusive design projects, we can shift our design thinking toward universal solutions.
The biggest limitation of our design was the lack of translation from sign language and deaf speech into text and audio. Given that typing is a last resort for Deaf people when responding, once deaf-speech-to-text technology becomes commercially viable, Lori will be able to speak back to the device instead of having to type.