Deep-learning company linedanceAI is hosting Project KinTrans Hands Can Talk. The project uses AI to learn and process the body movements of sign language.
In 2020 we are building the FIRST Deaf crowd-sourced 3D database for American Sign Language, designed to complement the future development of new sign-accessible technology solutions.
If you are in the US and would like to participate in the digital codification of ASL, email us at info@linedanceAI.com.
Origins of the Project
KinTrans Inc., dba linedanceAI, was founded by Mohamed Elwazer in Dubai in 2013. He was inspired to build the first automatic machine-learning technology that could learn the movements of sign language.
What has evolved is a machine-learning platform for full body human movement.
Today the company is headquartered in Dallas, Texas, USA.
The technology is patented in the US, with patents pending in Europe, Canada, and Israel.
Technology development was inspired by Deaf users' requirements for a future sign language translator:
Natural use of sign language - no wearables, no gloves
A flexible application - so it can fit in different settings, including mobile devices
Recognition of signs in continuous signing
Accommodation of different body types and signing styles
KinTrans won a 2018 grant to support projects like the one pictured above at Texas School for the Deaf. The grant enabled us to hire interns to build part of our ASL 3D dictionaries and conduct a Deaf user-experience survey.
Today, the KinTrans Hands Can Talk 3D database project is being modeled for greater scalability, so that the movements of global sign languages can be digitized.
Mohamed Elwazer, Founder/CTO
Mohamed Elwazer, from Cairo, Egypt, is the system architect for linedanceAI's human movement analysis platform. He is a computer systems engineer focused on machine learning and image processing, with over 10 years of technical and entrepreneurial leadership experience.
We believe human movement is the next data frontier. Like the cognitive services that came before it, such as voice, image, and object recognition, human movement's time is now.
We look forward to software being a life partner, connected to us by our movements and gestures, not clicks and commands.
Catherine Bentley, Co-founder/Biz Dev
Catherine Bentley, from Dallas, Texas, manages partnerships, new business, and operations for linedanceAI. She is a seasoned business consultant in human resources, strategy, and innovation for businesses and governmental entities in the US and the Middle East.
Human-centered design meets machine learning:
The KinTrans Hands Can Talk 3D database project is built upon the linedanceAI technology platform.
This machine-learning platform hosts human-movement recording and analysis software, along with various APIs.
The database project is Deaf-led and will host 3,000 signs with variations. It is designed to complement other 3D emotion and hand libraries.
Developers may customize sign language dictionaries on our platform.
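The platform's APIs are not described here, so as a purely generic illustration of what matching a recorded 3D movement sequence against a customized sign dictionary could look like, here is a minimal dynamic-time-warping sketch. All names, data, and the matching technique itself are hypothetical assumptions, not linedanceAI's actual method:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two pose sequences.

    Each sequence is an array of shape (frames, features), e.g. flattened
    3D joint coordinates per frame. DTW tolerates differences in signing
    speed by allowing frames to stretch or compress in time.
    """
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])  # frame-to-frame distance
            cost[i, j] = d + min(cost[i - 1, j],      # skip a query frame
                                 cost[i, j - 1],      # skip a reference frame
                                 cost[i - 1, j - 1])  # match frames
    return cost[n, m]

def classify(query, dictionary):
    """Return the label of the reference sequence closest to the query."""
    return min(dictionary, key=lambda label: dtw_distance(query, dictionary[label]))

# Hypothetical toy dictionary: two one-feature "signs" with opposite motion.
dictionary = {
    "HELLO": np.linspace(0.0, 1.0, 10).reshape(-1, 1),
    "THANKS": np.linspace(1.0, 0.0, 10).reshape(-1, 1),
}
# A slightly longer, slightly offset recording of the rising motion.
query = np.linspace(0.05, 1.05, 12).reshape(-1, 1)
print(classify(query, dictionary))
```

Because each entry in the dictionary is just a labeled reference sequence, adding or customizing signs amounts to recording new sequences under new labels; this is one simple baseline, whereas a production system would typically use a trained model rather than nearest-neighbor matching.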