[Screenshot of the interactive piece]

Sign Language

Sign Language is an interactive audiovisual piece that explores the perceptual experience of hearing loss. A two-handed gesture activates one of four simulated hearing conditions: the left hand forms a fist while the right hand shapes a sign-language-inspired number from one to four. Each gesture triggers a distinct transformation of the sound, reflecting common symptoms of sensorineural loss, conductive loss, hyperacusis, and unilateral deafness.
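How the audio processing is implemented isn't documented on this page. As a rough illustration only, a browser-based build might route the source audio through Web Audio API nodes and swap in a different effect chain per condition; the node choices and parameter values below (low-pass filtering for sensorineural loss, attenuation for conductive loss, a gain boost for hyperacusis, hard panning for unilateral loss) are assumptions for the sketch, not the piece's actual signal chain.

```typescript
// Hypothetical sketch: approximating four hearing conditions with Web Audio API nodes.
// All cutoffs and gain values are illustrative assumptions.

type HearingCondition =
  | "normal"
  | "sensorineural"
  | "conductive"
  | "hyperacusis"
  | "unilateral";

function applyCondition(
  ctx: AudioContext,
  source: AudioNode,
  condition: HearingCondition,
): AudioNode {
  switch (condition) {
    case "sensorineural": {
      // High frequencies fade first: steep low-pass filter.
      const lowpass = ctx.createBiquadFilter();
      lowpass.type = "lowpass";
      lowpass.frequency.value = 1000; // Hz
      source.connect(lowpass);
      return lowpass;
    }
    case "conductive": {
      // Sound is blocked before the inner ear: muffled and quieter overall.
      const muffle = ctx.createBiquadFilter();
      muffle.type = "lowpass";
      muffle.frequency.value = 2000; // Hz
      const quiet = ctx.createGain();
      quiet.gain.value = 0.25; // roughly -12 dB
      source.connect(muffle).connect(quiet);
      return quiet;
    }
    case "hyperacusis": {
      // Everyday sounds feel painfully loud: boost gain, then limit peaks.
      const loud = ctx.createGain();
      loud.gain.value = 2.5;
      const limiter = ctx.createDynamicsCompressor();
      source.connect(loud).connect(limiter);
      return limiter;
    }
    case "unilateral": {
      // One-sided deafness: pan the mix entirely to a single ear.
      const panner = ctx.createStereoPanner();
      panner.pan.value = -1; // left ear only
      source.connect(panner);
      return panner;
    }
    default:
      return source; // normal hearing: leave the signal unprocessed
  }
}
```

Whatever node applyCondition returns would then be connected onward to ctx.destination (and to any analyser used for the visuals).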

 

The visuals respond to audio in real time, with the waveform grid subtly shifting in color and texture to reflect each hearing condition. By reimagining sign language as a tool for transforming perception rather than communication, the piece invites participants to engage with the invisible realities of hearing impairment through embodied interaction.
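One plausible (again, assumed) way to drive such a grid in a browser build is to tap the signal once per animation frame with an AnalyserNode and pass the time-domain samples to the renderer, which maps them to color and texture for the current condition. The drawWaveformGrid callback below is a hypothetical placeholder for the piece's actual drawing code.

```typescript
// Hypothetical sketch: sampling the live waveform each frame to drive a visual grid.

function startVisuals(
  ctx: AudioContext,
  source: AudioNode,
  getCondition: () => string,
  drawWaveformGrid: (samples: Float32Array, condition: string) => void,
): void {
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 1024;
  source.connect(analyser); // taps the signal without altering the audible output

  const samples = new Float32Array(analyser.fftSize);

  const frame = () => {
    analyser.getFloatTimeDomainData(samples); // current waveform, values in -1..1
    drawWaveformGrid(samples, getCondition()); // shift color/texture per condition
    requestAnimationFrame(frame);
  };
  requestAnimationFrame(frame);
}
```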

Instructions:

Click the image above to launch the interactive audiovisual system.

An audio file will begin playing when the website launches. Start with the volume turned down, then gradually raise it to a comfortable level.

Headphones are recommended for full spatial and perceptual immersion.

Use both hands together in front of the camera to explore different hearing experiences:

  • Left hand closed in a fist

  • Right hand forms a gesture (sign language numbers)

Each right-hand gesture triggers a different hearing condition (a code sketch of this mapping follows the steps below):

  • Index finger extended → Sensorineural Hearing Loss

  • Index + middle fingers → Conductive Hearing Loss

  • Index + middle + thumb → Hyperacusis

  • Index + middle + ring + pinky → Unilateral Hearing Loss

To return to normal hearing:

  • Spread both hands open fully.
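For those curious about how the gestures could drive the system, the logic reduces to a small lookup once a hand-tracking library (MediaPipe Hands is a common choice, assumed here rather than confirmed) reports which fingers are extended on each hand. The sketch below is hypothetical and reuses the HearingCondition type from the audio sketch above.

```typescript
// Hypothetical sketch of the gesture-to-condition mapping listed above.
// Finger states are assumed to come from a hand-tracking library such as MediaPipe Hands.

interface FingerState {
  thumb: boolean;
  index: boolean;
  middle: boolean;
  ring: boolean;
  pinky: boolean;
}

const allOpen = (h: FingerState) =>
  h.thumb && h.index && h.middle && h.ring && h.pinky;
const isFist = (h: FingerState) =>
  !h.thumb && !h.index && !h.middle && !h.ring && !h.pinky;

// Returns the condition to switch to, or null to keep the current one.
function mapGesture(left: FingerState, right: FingerState): HearingCondition | null {
  // Both hands spread open fully: return to normal hearing.
  if (allOpen(left) && allOpen(right)) return "normal";

  // Every other condition requires the left hand closed in a fist.
  if (!isFist(left)) return null;

  const { thumb, index, middle, ring, pinky } = right;
  if (index && !thumb && !middle && !ring && !pinky) return "sensorineural"; // "one"
  if (index && middle && !thumb && !ring && !pinky) return "conductive"; // "two"
  if (index && middle && thumb && !ring && !pinky) return "hyperacusis"; // "three"
  if (index && middle && ring && pinky && !thumb) return "unilateral"; // "four"
  return null; // unrecognized gesture: no change
}
```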

Press the “F” key to enter fullscreen mode for the best experience.

Adjust your hand-to-camera distance for the most reliable gesture detection.

(Desktop only)


©2022 by Ningxin Zhang
