Language Accessibility for the Deaf and Hard-of-Hearing

Studies in Europe and the USA show that around 10-15% of the population are either deaf or hard of hearing. Although both deaf and hard-of-hearing people face barriers to accessing spoken language, they form separate groups with different perspectives and priorities. Those who are hard of hearing use a spoken language, such as English, Greek, or French, as their preferred, or first, language. Those who identify as deaf use a signed language, such as British Sign Language, Greek Sign Language, or French Sign Language, as their preferred language. Both groups require access to the information contained in spoken messages; however, those who identify as deaf also require enhanced access to written forms of spoken language. Information and communication technology (ICT) can support visual modalities with pictures or written forms of speech on screen, allowing individuals to extend both their general knowledge and use of language without listening. ICT also has the potential to support translation from audible and written forms of spoken language into the preferred sign language of a deaf user. Additionally, using ICT for collaborative activities can encourage a group of people to improve their use of language and their understanding of concepts as they plan and carry out their work. Further, avatar-based translation systems, used on 2D screens or in Extended Reality environments, have the potential to enhance communication and collaboration for those whose preferred language is signed, and for whom a spoken language, albeit in written form, is a second language.

The Special Thematic Session (STS) invites contributions on all aspects of the accessibility, usability, and intelligence of ICT-based systems and applications that make spoken language and translation more accessible, including:

  • User-controlled systems for language accessibility
  • Multimodal integration of information with language accessibility
  • Teaching tools for second-language learners, both hearing and deaf
  • Machine and deep learning for ICT-based language transcription and language translation systems
  • Using Extended Reality (Mixed Reality, Augmented Reality, Virtual Reality) for language accessibility
  • Universal and graphical design of user interfaces for language accessibility
  • Evaluation methodologies for the quality of language accessibility

Chairs


Contributions to an STS have to be submitted using the standard submission procedures of ICCHP24.
When submitting your contribution, please make sure to select the right STS from the drop-down list "Special Thematic Session". Contributions to an STS are evaluated by the Programme Committee of ICCHP-AAATE and by the chair(s) of the STS. Please get in contact with the STS chair(s) to discuss your contribution and potential involvement in the session.