
ABSTRACT

Sign language is an overlooked means of communication, even though a large social group could benefit from its wider understanding. Not everyone can interpret sign language when having a conversation with a deaf and dumb person, so such conversations normally require a translator, and communication without one is difficult. To solve this, we need a common interface that ordinary people can understand and that lets both sides communicate without barriers. Image classification and machine learning can be used to help computers recognize sign language gestures, which can then be presented to other people in a readable form. Pre-processing is first performed on the images to obtain clean input; a convolutional neural network (CNN) is then used to recognize the sign language gestures. The main aim of this project is to eliminate the communication barrier between the deaf and dumb and everyone else.
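As a rough illustration of the pre-processing step mentioned above, the sketch below converts a camera frame to a normalized grayscale input in Python/NumPy. The target size, the luminance weights, and the nearest-neighbour resize are illustrative assumptions, not values taken from the project.

```python
import numpy as np

def preprocess(frame, size=64):
    """Turn an RGB frame into a clean, normalized grayscale input.

    `size` and the luminance weights are illustrative choices,
    not parameters specified by the project.
    """
    # Luminance-weighted grayscale conversion
    gray = (frame[..., 0] * 0.299
            + frame[..., 1] * 0.587
            + frame[..., 2] * 0.114)
    # Crude nearest-neighbour resize to size x size
    rows = np.linspace(0, gray.shape[0] - 1, size).astype(int)
    cols = np.linspace(0, gray.shape[1] - 1, size).astype(int)
    resized = gray[np.ix_(rows, cols)]
    # Scale pixel values to [0, 1] so the CNN sees a consistent range
    return resized / 255.0

frame = np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8)
x = preprocess(frame)
print(x.shape)  # (64, 64)
```

In a real system the resize and grayscale conversion would typically come from an image library such as OpenCV; the point here is only the shape of the pipeline: crop/resize, grayscale, normalize.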

INTRODUCTION
1.1 OUTLINE OF THE PROJECT
Indian Sign Language is a predominant sign language. Since the only disability deaf and dumb (D&M) people have is communication-related and they cannot use spoken languages, the only way for them to communicate is through sign language. Communication is the process of exchanging thoughts and messages in various ways, such as speech, signals, behaviour and visuals. D&M people use hand gestures to express their ideas to other people. Gestures are nonverbally exchanged messages that are understood through vision, and this nonverbal communication of deaf and dumb people is called sign language.

A language barrier arises in interaction between the general population and D&M people because the structure of sign language differs from that of written text, so D&M people depend on vision-based communication. If there were a common interface that converted sign language to text, the gestures could be easily understood by other people. Research has therefore been carried out on vision-based interface systems through which D&M people can communicate without either party knowing the other's language. The aim is to develop a user-friendly human-computer interface (HCI) in which the computer understands human sign language. There are various sign languages all over the world, such as American Sign Language (ASL), French Sign Language, British Sign Language (BSL), Indian Sign Language and Japanese Sign Language, and work has been done on many other languages as well.

The importance of this method grows day by day, as it gives people with disabilities an opportunity to work in fields that require communication. With the help of this project, they can communicate with the majority in any industry, giving them a level playing field. Here, we use object identification followed by recognition to differentiate between the many symbols used in the sign language.
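The recognition stage rests on the CNN building blocks named in the abstract: convolution, a nonlinearity, pooling, and a final classification layer. The sketch below runs one such forward pass in plain NumPy. The 26-class output (one class per fingerspelling letter), the 3x3 kernel, and the random weights are hypothetical, chosen only to show the mechanics; a trained network would learn these values.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(img, kernel):
    """Valid 2-D convolution (cross-correlation, as in most CNN libraries)."""
    kh, kw = kernel.shape
    oh, ow = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    # Nonlinearity: keep positive responses, zero out the rest
    return np.maximum(x, 0.0)

def max_pool(x, k=2):
    # Downsample by taking the maximum over each k x k window
    h, w = x.shape[0] // k, x.shape[1] // k
    return x[:h * k, :w * k].reshape(h, k, w, k).max(axis=(1, 3))

def softmax(z):
    # Turn raw scores into class probabilities
    e = np.exp(z - z.max())
    return e / e.sum()

n_classes = 26                        # hypothetical: one class per letter
img = rng.random((64, 64))            # a pre-processed grayscale input
kernel = rng.standard_normal((3, 3))  # one learned filter, here random
features = max_pool(relu(conv2d(img, kernel)))
weights = rng.standard_normal((features.size, n_classes)) * 0.01
probs = softmax(features.ravel() @ weights)
print(probs.argmax())  # index of the predicted gesture class
```

In practice a framework such as TensorFlow/Keras or PyTorch would stack many such filters and layers and learn the kernel and weight values from labelled gesture images; this sketch only makes the data flow of one layer concrete.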
