While volunteering at AbilityPath, a non-profit organization dedicated to providing services to people with disabilities, I realized that a major problem some people faced was not access to technology but the ability to use it. Specifically, the high level of fine motor control required by standard input devices (mice, trackpads, keyboards, and game controllers) bars many people from using them. This problem led me toward a project to enhance computer accessibility without increasing cost.
I aimed to make computer interfaces simpler, more natural, and more accessible by using a much more common and direct form of communication: gestures. I will show how signals captured by a webcam are processed and then fed through two major AI models: the first extracts hand-landmark positions, and the second recognizes gestures from those landmarks.
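To illustrate the second stage of such a pipeline, here is a minimal sketch of a rule-based gesture classifier operating on hand landmarks. It assumes 21 (x, y) landmarks per hand in the index convention used by Google's MediaPipe Hands (wrist = 0, fingertips at 4, 8, 12, 16, 20); the function names and the simple extended-finger heuristic are illustrative assumptions, not the speaker's actual model.

```python
# Hypothetical second-stage classifier: maps 21 (x, y) hand landmarks
# (MediaPipe Hands index convention) to a coarse gesture label.
# This is an illustrative sketch, not the project's actual model.

FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertips
FINGER_PIPS = [6, 10, 14, 18]   # the corresponding PIP joints

def count_extended(landmarks):
    """Count non-thumb fingers whose tip lies above its PIP joint.

    `landmarks` is a list of 21 (x, y) tuples in image coordinates,
    where y grows downward, so "above" means a smaller y value.
    """
    return sum(
        1
        for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
        if landmarks[tip][1] < landmarks[pip][1]
    )

def classify(landmarks):
    """Map the extended-finger count to a coarse gesture label."""
    n = count_extended(landmarks)
    if n == 0:
        return "fist"
    if n == 4:
        return "open_palm"
    return "other"
```

In a real system, the landmark list would come from the first model's per-frame output, and the resulting labels would be translated into mouse or keyboard events.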
Although commercial systems that use camera-captured facial expressions to control games do exist, many people with limited fine motor control gain little from such an unintuitive and unnatural game interface. To further universalize access, I minimized the required technology by building on open-source software and free tools, and released the entire project as open source, making it free to use.
Speaker: Henri Sayag, Kehillah High School
Register at weblink to receive connection information, or watch here.