<span>Students program computers to interpret sign language</span> <span><span lang="" about="/user/236" typeof="schema:Person" property="schema:name" datatype="" xml:lang="">Melanie Balog</span></span> <span>Mon, 08/12/2019 - 16:11</span> <div class="layout layout--gmu layout--twocol-section layout--twocol-section--30-70"> <div> </div> <div class="layout__region region-second"> <div data-block-plugin-id="field_block:node:news_release:body" class="block block-layout-builder block-field-blocknodenews-releasebody"> <div class="field field--name-body field--type-text-with-summary field--label-visually_hidden"> <div class="field__label visually-hidden">Body</div> <div class="field__item"><p><span class="intro-text">With a twist or shake of your wrist, your smartphone can interpret motion to take a picture, turn on a light, and more. Last year, Mason computer science professors <a href="https://cs.gmu.edu/directory/detail/62/">Parth Pathak</a> and <a href="https://cs.gmu.edu/~hrangwal/">Huzefa Rangwala</a> were brainstorming how similar technology could help society in even greater ways. Their idea? To automatically translate sign language into text or speech.</span></p> <p>“There are some products that can do gesture recognition, but they’re very preliminary.
And it’s very different from ASL [American Sign Language], which is not just a few gestures—it’s thousands of words,” said Pathak, principal investigator on the Summer Team Impact Project funded by Mason’s <a href="https://oscar.gmu.edu/">Office of Student Scholarship, Creative Activities, and Research</a> (OSCAR).</p> <p>This summer, nine Mason undergraduates joined in the research that could help make the technology a reality.</p> <figure role="group"><div> <div class="field field--name-image field--type-image field--label-hidden field__item"> <img src="/sites/g/files/yyqcgq291/files/2023-03/Sign_Language_08_main_crop.jpg" width="725" height="371" alt="A student does sign language in front of a computer camera while two other students on the other side of the table look at the data on their laptops." loading="lazy" typeof="foaf:Image" /></div> </div> <figcaption>Frederick Olson (from left), Sai Gurrapu and Dom Huh are part of a summer research project on automatic multimodal sign language recognition. Photo by Lathan Goumas/Office of Communications and Marketing.</figcaption></figure><p>“The goal would be to deliver a readable message to a device so that it’s bridging the gap between ASL users and non-users,” said <a href="https://rht.gmu.edu/recreation-management/therapeutic-recreation-concentration">therapeutic recreation</a> senior Riley Wilkerson, “an easier, more effective, and more personal way of communicating.”</p> <figure role="group" class="align-right"><div> <div class="field field--name-image field--type-image field--label-hidden field__item"> <img src="/sites/g/files/yyqcgq291/files/styles/small_content_image/public/2023-03/Sign_Language_03_sensor_342.jpg?itok=TNrrnvrI" width="342" height="228" alt="Close up of Riley Wilkerson's hands as she signs in front of a radar sensor." 
loading="lazy" typeof="foaf:Image" /></div> </div> <figcaption>Riley Wilkerson signs for a radar sensor as part of a summer research project on automatic multimodal sign language recognition. Photo by Lathan Goumas/Office of Communications and Marketing.</figcaption></figure><p>Three teams of students are experimenting with different sensors: a wireless radar, a camera, and an inertial measurement unit (a wearable motion sensor used in smartphones and Fitbits). Each sensor offers certain opportunities but also challenges, including privacy and ease of use, said Pathak, who is guiding the students on the project along with Mason computer science professor <a href="https://volgenau.gmu.edu/profile/view/8677">Jana Kosecka</a>, Mason’s Helen A. Kellar Institute for Human disAbilities director <a href="https://cehd.gmu.edu/people/faculty/lmason20">Linda Mason</a>, and graduate student Panneer Selvam Santhalingham.</p> <p>On each team, a student familiar with ASL signs in front of a sensor that collects data about the motion or the environment. <a href="https://cs.gmu.edu/">Computer science</a> and <a href="https://volgenau.gmu.edu/">engineering</a> students refine the data to find patterns and write machine learning algorithms—code that allows the computer to recognize the signs.</p> <p>So far, the undergraduates have “taught” their machines to recognize about 20 signs with accuracy rates ranging from 70 to 97 percent. The fluctuations in accuracy are due to the machine learning process, said senior computer science major Yuanqi Du.</p> <p>Diverse data helps the computer recognize the signs with increased accuracy, Du said. In initial trials with one student, accuracy rates were higher. When a new ASL user was introduced, the accuracy diminished, Du said. Once the new ASL user’s data was added to the training set, accuracy rates rose again.</p> <figure role="group" class="align-left"><div> <div class="field field--name-image field--type-image field--label-hidden field__item"> <img src="/sites/g/files/yyqcgq291/files/styles/small_content_image/public/2023-03/Sign_Language_05_phones_closeup.jpg?itok=uEDQAEn6" width="342" height="228" alt="Cell phones are strapped on a student's wrists as she signs." loading="lazy" typeof="foaf:Image" /></div> </div> <figcaption>Ariana Havens wears cell phones as she signs as part of a summer research project on automatic multimodal sign language recognition. Photo by Lathan Goumas.</figcaption></figure><p>As the multi-year project continues, Pathak said the team plans to increase the number of signs the computer can recognize using data from many diverse users. They will also scale it to interpret full sentences and pick up other gestures used in ASL, such as body tilts and micro expressions like raising an eyebrow, he said.</p> <p><span><span><span><span><span>“Being able to communicate instantly would hopefully remove issues [the ASL community experiences],” said Frederick Olson, a senior IT major who said both his parents are deaf. That includes being able to ask a question at a store, socialize, communicate with doctors easily during appointments, or land better job opportunities.
The technology could be life-changing, he said.</span></span></span></span></span></p> <p>It could also be applied beyond the deaf community, helping people with autism or developmental and learning disabilities for whom communicating with spoken words is challenging, Wilkerson said.</p> <p>“It could be applicable to other industries and disciplines in the future [that will work with similar technology], too,” said junior computer science major Sai Gurrapu.</p> <p>And the project pushes student learning to the next level, Pathak said.</p> <p>“They’re not given a fixed task here—they’re given a problem and they have to find a solution,” Pathak said.</p> <p>“This project is one of a variety of opportunities [Mason] has presented to me that goes beyond just taking 15 credits each semester,” Wilkerson said. “You can only learn so much in a classroom—you have to apply it.”</p> <figure role="group" class="align-left"><div> <div class="field field--name-image field--type-image field--label-hidden field__item"> <img src="/sites/g/files/yyqcgq291/files/2023-03/Sign_Language_04_main_top_crop_0.jpg" width="725" height="483" alt="One student does sign language in front of a radar sensor and two other students on the other side of the table view the data on their computers." loading="lazy" typeof="foaf:Image" /></div> </div> <figcaption>Seniors Yuanqi Du (from left), Nguyen Dang, and Riley Wilkerson are part of a summer research project on automatic multimodal sign language recognition. Photo by Lathan Goumas/Office of Communications and Marketing.</figcaption></figure></div> </div> </div> <div data-block-plugin-id="field_block:node:news_release:field_content_topics" class="block block-layout-builder block-field-blocknodenews-releasefield-content-topics"> <div class="field field--name-field-content-topics field--type-entity-reference field--label-visually_hidden"> <div class="field__label visually-hidden">Topics</div> <div class="field__items"> <div class="field__item"><a href="/taxonomy/term/2186" hreflang="en">computer science</a></div> <div class="field__item"><a href="/taxonomy/term/1546" hreflang="en">Office of Student Scholarship Creative Activities and Research (OSCAR)</a></div> <div class="field__item"><a href="/taxonomy/term/4376" hreflang="en">Helen A. Kellar Institute for Human disAbilities</a></div> <div class="field__item"><a href="/taxonomy/term/4381" hreflang="en">therapeutic recreation</a></div> <div class="field__item"><a href="/taxonomy/term/3036" hreflang="en">engineering</a></div> <div class="field__item"><a href="/taxonomy/term/271" hreflang="en">Research</a></div> </div> </div> </div> </div> </div>