
Sign Language and Gesture Recognition

EasyChair Preprint no. 3535

17 pages
Date: June 2, 2020

Abstract

Every ordinary person can see, hear, and respond to their surroundings. Some people lack this ability, chiefly those who are deaf and unable to speak, and they rely on sign language to communicate with others. Communicating with ordinary people remains a significant barrier for them, because not everyone understands sign language, which makes it difficult for deaf and mute people to express what they want to say. This paper develops a system that translates sign language into text that anyone can read, called the Sign Language Translator and Gesture Recognition system. We developed software that captures hand gestures and interprets them into readable text, which can then be sent to a display or screen. The current version of the system can translate all 26 letters, the digits 0-9, and several other signs with a recognition accuracy of 95%.
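As a rough illustration of the pipeline the abstract describes, the sketch below captures a hand region from a webcam with OpenCV, preprocesses it, and classifies it with a Keras model. It is not the authors' code: the model file sign_model.h5, the 64x64 input size, the label ordering, and the fixed region of interest are assumptions made for illustration.

# Illustrative sketch, not the paper's implementation: classify a hand region
# from the webcam with a pretrained Keras model (A-Z plus 0-9).
import cv2
import numpy as np
from tensorflow.keras.models import load_model

# Hypothetical artifacts: a trained model file and its label set.
MODEL_PATH = "sign_model.h5"
LABELS = [chr(c) for c in range(ord("A"), ord("Z") + 1)] + [str(d) for d in range(10)]

model = load_model(MODEL_PATH)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Fixed region of interest where the user is assumed to hold the hand.
    roi = frame[100:300, 100:300]
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    resized = cv2.resize(gray, (64, 64))  # assumed model input size
    x = resized.astype("float32") / 255.0
    x = x.reshape(1, 64, 64, 1)

    # Predict a label and overlay it on the live frame.
    probs = model.predict(x, verbose=0)[0]
    label = LABELS[int(np.argmax(probs))]
    cv2.rectangle(frame, (100, 100), (300, 300), (0, 255, 0), 2)
    cv2.putText(frame, label, (100, 90), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("Sign Language Translator", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()

Run with a webcam attached and press q to quit; the predicted letter or digit is drawn above the capture box, mirroring the translate-to-text behavior the abstract describes.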

Keyphrases: gesture recognition, Keras, machine learning, OpenCV, sign language translator, training model

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@Booklet{EasyChair:3535,
  author = {Ashish Sah and A. Arul Prakash},
  title = {Sign Language and Gesture Recognition},
  howpublished = {EasyChair Preprint no. 3535},
  year = {EasyChair, 2020}}