Hearing student needing feedback on an augmented reality project

Cody Jackson

Hi, my name is Cody Jackson and I live in the San Francisco Bay Area in California. I go to school at the Academy of Art University in SF and am working on a concept augmented reality project focused on helping the deaf community interact more easily with people, like myself, who don't know sign language.

A little bit about me and my project. For the last two years, I worked at Starbucks as a Barista. We had a few deaf customers who frequented my location and I got to know them (as well as their drinks). I do not know sign language so it was kind of difficult for me to communicate with them. Two semesters ago, I had a class project that involved creating an augmented reality project. I remembered that lack of easy communication, so I decided to see if I could create something that might help.

Now, I have a small presentation of what the product interface might look like as well as a lot of questions. I am here hoping to get some feedback on the project, but more importantly, I would like to get to know you better and in the process create a better project.

To start off, a few questions to help me understand your daily life a little better:
  • Do you find communicating with non-signing people to be difficult?
  • Is it hard to get to know people who don't sign?
  • How would you normally communicate with a non-signing person?
  • How do you prefer that people get your attention?
  • Have you found that communication influences whether or not you get a job?

The project is called Caption. It is a two-part system, one part for each side of a conversation. When a non-signing person speaks to you, the system would use speech recognition software to translate their words into ASL, which would hover in front of the person speaking. When you respond, you would make a gesture and think what you intended to say; your thoughts would be translated into audible "speech" that the system's speakers project as sound. There are more detailed explanations in the presentation.
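
To make the hearing-to-deaf half a little more concrete, here is a rough sketch of what the speech-to-caption step could look like in software. This is just an illustration, not the actual prototype: it leans on the open-source SpeechRecognition package for Python and Google's free web speech recognizer, and show_caption() is a hypothetical stand-in for whatever the AR display would render.

# Rough sketch of the speech-to-caption half of Caption (illustration only).
# Assumes: pip install SpeechRecognition pyaudio; show_caption() is a
# hypothetical placeholder for the AR display floating text near the speaker.
import speech_recognition as sr


def show_caption(text):
    # Placeholder: a real headset would render this in the deaf user's view.
    print("[caption] " + text)


def listen_and_caption():
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)  # helps in a noisy cafe
        audio = recognizer.listen(source)            # record one utterance
    try:
        # Google's free web speech API; other engines could be swapped in.
        show_caption(recognizer.recognize_google(audio))
    except sr.UnknownValueError:
        show_caption("(could not understand the speaker)")
    except sr.RequestError as err:
        show_caption("(speech service unavailable: %s)" % err)


if __name__ == "__main__":
    listen_and_caption()

The same loop would work whether the recognized text is shown as plain captions or handed off to an ASL/SEE rendering step.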

Please bear in mind that the presentation was created to not only present the idea, but also to allow hearing people to get to know the deaf community a little better.

Now a few questions about the project:
  • I created the project assuming that it would be best for the non-signing person's words to be translated into ASL. Is that true or would it be better for them to be translated into plain text?
  • Would you rather use your thoughts to speak or would it be better to use motion-capture to capture what you were saying as you signed it?
  • Overall, do you think the project is something that you might use if it were to be created?
I really appreciate any insight you can give me. One of the main reasons I got into design is so that I can help people have easier lives. The only way to figure out how I can do that is to get to know them. Thank you for your time!
 
Interesting, but I didn't understand how it works. Who should be wearing this device? Hearing or Deaf?
Also, how well does the speech recognition software work, especially in the Bay Area, where everyone has an accent?
 
The deaf person would wear the system. According to the research I have done on a few of the speech recognition programs out there, accents are accounted for. Of course, part of this project is based on technology that doesn't quite exist yet, but it at least has the scientific backing to hypothetically work in the future. Right now, the system is designed for a simple two-person conversation.
 
Is this device available for beta tests?

No, this is still in the design phase. I am hoping to get feedback from the deaf community on its design and on whether the functionality I have described would be useful in a conversation.

Do you think that captions in ASL or captions in text would be more useful for you in a conversation?
 
Deaf people are very different. Some of us were raised orally, and some of us were raised signing ASL.
ASL and English are completely different languages, so for some Deaf people ASL signs could be a better choice, while Deaf (not-yet-signing) people like me would choose captions.

Where would these captions/ASL signs be displayed? How would a Deaf person view them? I checked the presentation, and at some points it is confusing to me.
Or maybe I am just sick and tired of reading stuff :)
 

Gotcha, so maybe the best bet would be an option for both? The individual user could choose how they wanted their information to be displayed.

The captions would be displayed floating in front of the hearing person. To the deaf user, it would seem like the hearing person had captions for whatever they said.
 
Sounds interesting! Yes, it should be possible to switch how the information is displayed.
Also, I think it won't be in ASL; it is more likely to be SEE (Signing Exact English).
 

OK, thanks! That is super helpful. Do you mind if I pick your brain on a few things? I don't want to be a pain, but I could really use the help!

If you are OK with that: would it be useful for the device to be able to read emotions that are usually conveyed by speech and tone (e.g., sarcasm), or is that something most deaf people wouldn't need?
 
I usually don't mind. I'm excited about all kinds of ideas and technologies. But people are different.

By the way, here is an idea: you could make this device translate not only between the Deaf and hearing worlds, but also between different spoken languages. The Google Translate API is quite good and readily available. That should be easier than translating into Sign.

I don't know how the words would be translated into SEE, PSE, or ASL, because sign language is not just hands; it involves a lot of facial expression, hand position relative to the face and body, direction, and so on.
 

Thank you very much!

I like that idea. It would broaden who this product could help.
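
Just to think that idea through a little: the spoken-language translation could sit as one extra step between the speech recognition and the caption display. Here is a very rough, untested sketch, assuming the google-cloud-translate Python package with credentials already set up, and reusing the hypothetical show_caption() placeholder for the AR display.

# Rough sketch of adding spoken-language translation to the caption step.
# Assumes: pip install google-cloud-translate, with Google Cloud credentials
# configured; show_caption() is still a hypothetical stand-in for the display.
from google.cloud import translate_v2 as translate


def show_caption(text):
    print("[caption] " + text)


def caption_translated(recognized_text, target_language="en"):
    client = translate.Client()
    result = client.translate(recognized_text, target_language=target_language)
    show_caption(result["translatedText"])


if __name__ == "__main__":
    # e.g. a Spanish-speaking customer ordering at the counter
    caption_translated("Un cafe con leche, por favor", target_language="en")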

Are there any features which you think would be particularly helpful in a product like this?
 