What are the things most difficult for
people with vision disabilities?
Walking without bumping into things, reaching their desired location without help, locating objects of everyday use, and reading. While there are devices that help with one or another of these tasks, a single unit that can handle all of them easily would be far more useful.
Third-year students of Visvesvaraya National Institute of Technology (VNIT) Akshay Khatri, Harshal Wankhede and Kartikey Totewar set out to solve all these problems with a single device. That is how they came up with their blind navigation system, which consists of three Android applications and an embedded device. The handheld device, about the size of a small notebook, carries ultrasonic sensors and vibration motors that help the user avoid obstacles. One of the apps also uses colored bands to identify specific objects.
“It started when our
seniors, Rohan Thakkar, Sachin Bharambe and Harsharanga Patil, suggested
that we work on a project that would be socially relevant. They gave us
the idea of an integrated system. We noticed that despite the high
number of eye donations, several people remain without sight,” said
Wankhede. The team also came across statistics from the World Health Organisation (WHO), according to which 285 million people worldwide have vision disabilities; 39 million of them are blind and 246 million have low vision.
“Our aim was to make
something to help people with vision disabilities cope with the
difficulties they face. So, we observed their lives and listed their
most common problems,” said Khatri. Over a period of four months, the three mechanical engineering students built the entire system. The prototype cost them only Rs 3,000, a figure that would come down further if the device were mass produced.
Khatri was working on a summer project at the Indian Institute of Technology-Bombay, where he worked on the machine. Most of the basic work was done in the institute's SINE laboratory, which is equipped with a 3D printer and a laser-cutting machine that helped with fabrication. The programming and circuit design were then done by the three at the VNIT campus.
“We only wish more
engineering students would try doing projects that are more socially
relevant. Also, it would be good if there could be more original
projects. After all, why keep redesigning and remaking things already in existence?” asked Totewar.
While building the device, they faced challenges in keeping it cost-effective while maintaining its quality and usability. The three are now trying to improve the prototype through pilot studies. If these studies succeed, the device will be refined and redesigned to make it more user-friendly.
How it works:
- The system consists of an embedded handheld device and three Android applications.
- The device is portable and easy to hold, much like a walking stick.
Two ultrasonic sensors on the front face of the device help detect
obstacles.
- The motors vibrate with an amplitude inversely proportional to the distance of the obstacle. Depending on which region the obstacle lies in (region 1, 2, or 3), the corresponding motor vibrates (a sketch of this feedback logic follows the list).
- The third ultrasonic sensor stays horizontal: if there is a footpath or a staircase, the first and fourth vibration motors vibrate simultaneously, whereas if there is a pothole, the fourth motor vibrates.
- For the reading application, the camera on the Android phone is pointed at the text to be read. The app extracts text from the captured image and speaks it out: it takes five consecutive frames, builds a single best string from them, and reads that string aloud (see the OCR sketch after this list).
- To search for required objects, the camera can be pointed anywhere in a room. Colored bands are placed on the objects, and the object assigned to each marker is entered in the Android app; when the camera detects a marker, the app speaks the name of the corresponding object (see the marker-lookup sketch after this list).
- For navigation, the desired destination is entered using the "speak destination" button. The app calculates the path from the user's current location, which it has already obtained via GPS, to the destination. Once the "speak direction" button is pressed, it speaks the first direction; after that, it announces further directions automatically whenever necessary by keeping track of the user's current location (see the navigation sketch after this list).
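The bullets above describe the obstacle feedback only qualitatively. Below is a minimal Python sketch of that mapping, assuming hypothetical motor indices, a sensing range, step/pothole thresholds and a `set_motor` callback; none of this is the team's actual firmware.

```python
# Minimal sketch of the obstacle-feedback logic described above.
# MAX_RANGE_CM, K, MAX_INTENSITY, the motor indices and the step/pothole
# thresholds are all assumptions made for illustration.

MAX_RANGE_CM = 300      # assumed maximum useful range of the ultrasonic sensors
K = 5000                # assumed scaling constant: intensity = K / distance
MAX_INTENSITY = 255     # assumed full-scale drive value for a vibration motor


def vibration_intensity(distance_cm):
    """Amplitude inversely proportional to obstacle distance, clamped to full scale."""
    if distance_cm <= 0 or distance_cm > MAX_RANGE_CM:
        return 0
    return min(MAX_INTENSITY, int(K / distance_cm))


def drive_motors(front_readings, horizontal_cm, set_motor):
    """front_readings: {region: distance_cm} for regions 1-3 from the front sensors.
    horizontal_cm: reading from the third (horizontal) sensor.
    set_motor(index, intensity): hypothetical callback that drives one motor."""
    # Regions 1-3 map to motors 1-3; closer obstacles buzz harder.
    for region, distance in front_readings.items():
        set_motor(region, vibration_intensity(distance))

    # The article does not say how steps and potholes are told apart, so the
    # thresholds below are guesses: a short reading is treated as a step or
    # kerb (motors 1 and 4 together), a long one as a pothole (motor 4 only).
    if 0 < horizontal_cm < 40:
        set_motor(1, MAX_INTENSITY)
        set_motor(4, MAX_INTENSITY)
    elif horizontal_cm > 120:
        set_motor(4, MAX_INTENSITY)
```

With the assumed K = 5000, an obstacle at 50 cm gives an intensity of 100, while one at 20 cm gives 250, close to full scale.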
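The reading app's "best of five frames" step could be as simple as a majority vote over the OCR output of each frame. The sketch below assumes externally supplied `ocr(frame)` and `speak(text)` functions (on the phone these would be an OCR engine and the platform's text-to-speech); it is an illustration, not the team's code.

```python
from collections import Counter


def best_string(frames, ocr):
    """Run OCR on a handful of consecutive frames and keep the most frequent result."""
    results = [ocr(frame).strip() for frame in frames]
    results = [text for text in results if text]      # drop empty recognitions
    if not results:
        return ""
    text, _count = Counter(results).most_common(1)[0]
    return text


def read_aloud(frames, ocr, speak):
    """Speak the consensus string extracted from (typically five) camera frames."""
    text = best_string(frames, ocr)
    if text:
        speak(text)
```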
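For the object-search app, the mapping from a colored band to an object name is essentially a lookup table that the user fills in. A tiny sketch, with made-up colors and object names:

```python
# The table contents are placeholders; in the app the user assigns each
# colored band to an object of their choice.
MARKER_TABLE = {
    "red": "medicine box",
    "blue": "water bottle",
    "green": "house keys",
}


def announce_object(detected_color, speak):
    """Speak the object assigned to the color band the camera has just detected."""
    obj = MARKER_TABLE.get(detected_color)
    if obj is not None:
        speak(f"{obj} detected")
```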
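The navigation behavior (speak the next direction as the user approaches each turn) can be sketched as a loop over GPS fixes and route waypoints. The waypoint format, the 20 m threshold and the `speak` callback below are assumptions for illustration, not the actual app.

```python
import math


def distance_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in metres (haversine formula)."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def announce_next(route, fix, speak, threshold_m=20):
    """route: list of (lat, lon, instruction) waypoints; fix: current (lat, lon).
    Speaks the next instruction once the user is within threshold_m of its
    waypoint and returns the remaining route for the next GPS fix."""
    if route:
        lat, lon, instruction = route[0]
        if distance_m(fix[0], fix[1], lat, lon) <= threshold_m:
            speak(instruction)
            return route[1:]
    return route
```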
Rs 3,000: production cost of the prototype, which will come down if mass produced. The selling price will depend on the scale of production and subsequent improvements.
Source: Global Accessibility News via Times of India, 30 October 2013