An Innovative Artificial Replacement to Facilitate Communication between Visually and Hearing-Impaired People

Authors (2): Sangeetha R, Elakkiya S

In this work, the problem of communication between visually impaired and hearing-impaired people is treated as a special case of particular interest: conventional assistive aids cannot be applied here because the two users share no common communication channel. The proposed scheme includes algorithms for speech recognition and synthesis to support communication between visually and hearing-impaired people. Our framework consists of a situated communication environment designed to foster an immersive experience for both users. To achieve this, the situated modality-replacement framework combines a set of modules (sign language analysis and synthesis, and speech analysis and synthesis) into an innovative multimodal interface for disabled users.
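The modality-replacement idea above can be illustrated with a minimal sketch: each message is re-expressed in the modality the receiving user can perceive (speech is converted to sign animation for the hearing-impaired user, and sign gestures are converted to synthesized speech for the visually impaired user). All class and method names below are hypothetical stand-ins, not the authors' implementation; a real system would back each stub with actual speech recognition, sign-language analysis, and avatar-based synthesis.

```python
class SpeechAnalysis:
    """Speech -> text (stub standing in for a real ASR module)."""
    def recognize(self, audio):
        # Placeholder: a real module would decode the audio signal.
        return audio["transcript"]

class SignSynthesis:
    """Text -> sign-language animation commands for the hearing-impaired user."""
    def render(self, text):
        return [f"SIGN({word})" for word in text.split()]

class SignAnalysis:
    """Sign gestures -> text (stub standing in for gesture recognition)."""
    def recognize(self, gestures):
        return " ".join(g.lower() for g in gestures)

class SpeechSynthesis:
    """Text -> spoken output for the visually impaired user (stub TTS)."""
    def speak(self, text):
        return f"TTS[{text}]"

class ModalityReplacementInterface:
    """Bridges the two users by replacing the modality of each message."""
    def __init__(self):
        self.speech_in, self.sign_out = SpeechAnalysis(), SignSynthesis()
        self.sign_in, self.speech_out = SignAnalysis(), SpeechSynthesis()

    def blind_to_deaf(self, audio):
        # Spoken message -> text -> signed animation commands.
        return self.sign_out.render(self.speech_in.recognize(audio))

    def deaf_to_blind(self, gestures):
        # Signed message -> text -> synthesized speech.
        return self.speech_out.speak(self.sign_in.recognize(gestures))

iface = ModalityReplacementInterface()
print(iface.blind_to_deaf({"transcript": "hello friend"}))  # ['SIGN(hello)', 'SIGN(friend)']
print(iface.deaf_to_blind(["HELLO", "FRIEND"]))             # TTS[hello friend]
```

The key design point, as in the paper's framework, is that the two directions are independent pipelines joined by a shared textual intermediate representation, so any individual module can be swapped out without affecting the others.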

Authors and Affiliations

Sangeetha R
Research Scholar, VLSI, Vandayar Engineering College, Thanjavur, Tamilnadu, India
Elakkiya S
Assistant Professor, Department of ECE, Vandayar Engineering College, Thanjavur, Tamilnadu, India

Keywords

Sign language synthesis, ANN, speech synthesis, feature selection, feature extraction


Publication Details

Published in : Volume 2 | Issue 5 | September-October 2017
Date of Publication : 2017-10-31
License : This work is licensed under a Creative Commons Attribution 4.0 International License.
Page(s) : 677-682
Manuscript Number : CSEIT1725153
Publisher : Technoscience Academy

ISSN : 2456-3307

Cite This Article :

Sangeetha R, Elakkiya S, "An Innovative Artificial Replacement to Facilitate Communication between Visually and Hearing-Impaired People", International Journal of Scientific Research in Computer Science, Engineering and Information Technology (IJSRCSEIT), ISSN : 2456-3307, Volume 2, Issue 5, pp. 677-682, September-October 2017.
