Type You Can Hear:
Making Bungee Accessible through Cross-Sensory Design

Reimagining the Bungee Font Tester as a multi-sensory, accessible experience. The project features a redesigned interface for screen readers and an innovative audio system that translates Bungee’s visual features, such as color, layering, and orientation, into sound, ensuring the font’s dynamic personality can be experienced by all users.

Team

Myself + 4

Duration

3 Weeks

Tools

Figma, FigJam, AI Voice Generation


Skills

UX Design, Accessibility Design, User Research


Client


Cooper Hewitt, Smithsonian Design Museum


The Challenge

Bungee is a bold, colorful sans-serif typeface designed by David Jonathan Ross, inspired by vintage urban signage. The Bungee Font Tester is a web-based, interactive playground that showcases this expressive typeface through dynamic color layers, varied orientations, and playful styling.

However, typography testers like the Bungee Font Tester are highly visual experiences. For users who rely on screen readers or non-visual ways of accessing digital art, these dynamic details are completely lost.

Without a way to translate visual design into other sensory experiences, users struggle to:

  • Understand Bungee’s unique layering and color interactions

  • Experience the font’s playful, expressive personality

  • Navigate font tester interfaces built with non-semantic, visual-only UI elements

How can we ensure that Bungee’s dynamic typographic personality (its weight, orientation, color, and layering) remains meaningful and accessible to users who rely on description rather than direct visual information?

The Solution

Reimagine the Bungee Font Tester as a multi-sensory experience by:

  • Redesigning the interface to be accessible for screen reader users

  • Translating visual qualities, like color, layering, weight, and orientation, into sound

  • Ensuring the font’s personality remains intact while meeting WCAG 2.2 accessibility standards

The Approach

We followed an end-to-end process that combined research, accessibility insights, and creative prototyping:

  • Accessibility Research: Studied visual impairment needs and multi-sensory design precedents to inform how visual attributes could translate into sound.

  • Audio Mapping: Developed a system to convert visual features into distinct audio characteristics, such as pitch, timbre, and spatial sound (a rough sketch follows this list).

  • UI Redesign: Created a guided, screen reader-friendly interface that makes the font tester navigable and understandable without relying on sight.

  • Prototyping & Validation: Built an interactive prototype and audio demo to test how effectively Bungee’s personality could be conveyed through sound.
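
To make the Audio Mapping step concrete, here is a minimal sketch of how visual attributes might be paired with audio parameters. The attribute names, value ranges, and specific pairings are illustrative assumptions, not the team’s final system.

```typescript
// Illustrative sketch of an audio-mapping system: each visual attribute of a
// Bungee style is paired with an audio parameter. Names, ranges, and the
// specific pairings are assumptions for demonstration.

type BungeeStyle = {
  weight: number;                          // 0 (light) to 1 (heavy)
  orientation: "horizontal" | "vertical";  // text direction
  layerCount: number;                      // stacked color layers, e.g. 1-3
  hue: number;                             // dominant hue in degrees, 0-360
};

type AudioCue = {
  pitchHz: number;           // hue mapped to pitch
  waveform: OscillatorType;  // weight mapped to timbre
  pan: number;               // orientation mapped to stereo position (-1 to 1)
  voices: number;            // layer count mapped to layered tones
};

function mapStyleToCue(style: BungeeStyle): AudioCue {
  return {
    // Warmer hues sit lower, cooler hues higher (220-880 Hz range).
    pitchHz: 220 + (style.hue / 360) * 660,
    // Heavier weights get a richer, buzzier timbre.
    waveform: style.weight > 0.5 ? "sawtooth" : "sine",
    // Vertical text is panned to one side; horizontal stays centered.
    pan: style.orientation === "vertical" ? 0.6 : 0,
    // One tone per color layer.
    voices: style.layerCount,
  };
}
```

Under this kind of mapping, a heavy, vertical, multi-layer style would come across as a low, buzzy, panned chord, while a light single-layer style would sound like a single soft tone.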

Research Process

Before redesigning the Bungee Font Tester, we wanted to answer a crucial question:

How do people experience art, typography, and digital spaces when they can’t rely on sight?

Understanding the Landscape

This question led us to research visual impairments and the broader landscape of multi-sensory design.

The numbers were impossible to ignore:


  • 51.9 million U.S. adults report some level of vision loss

  • 307,000 U.S. adults are completely blind

  • 2.2 billion people globally live with visual impairments

These statistics weren’t just data points. They represented a massive audience often left out of digital experiences, especially those as visual and expressive as typography.

Types of Visual Impairment

Vision loss is a spectrum. Understanding its different forms was essential for shaping the design approach:

  • Low Vision → Limited sight not correctable with glasses or contact lenses.


  • Blindness → Total or near-total vision loss.


  • Cortical Visual Impairment (CVI) → Difficulty processing visual information in the brain, despite healthy eyes.

Each of these affects how users interact with screens, and each demands different solutions.

Learning from Multi-Sensory Art

From there, we looked for inspiration in the art world—places where designers and artists were already translating visuals into other senses.

We discovered:

  • Clarke Reynolds, a blind artist whose tactile artwork allows people to “read” images through touch, challenging traditional notions of seeing.


  • Bobby Goulder's “Sound of a Masterpiece,” built with Dolby, which transforms famous paintings into immersive audio experiences, letting sound evoke color and movement.


  • Shannon Lin’s “Visualizing Sound,” which reverses the usual sensory flow by turning music into textures and motion.

These examples showed that interpreting visuals through sound is not only possible but can be poetic, expressive, and deeply human.

Design Process and Implementation


Redesigning the Interface to Be More Screen-Reader-Friendly by Simplifying Controls

  • A fully visible interface instead of an accordion menu

  • A color picker with preset colors for easier use, plus the option to choose custom colors when needed

  • A "Play" button so users can control when and how often they hear the audio output


  • Keyboard-focusable interactive elements (a minimal sketch of these controls follows this list)
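
The sketch below shows how a couple of these controls could be wired up, assuming a plain DOM setup; the element labels and preset colors are hypothetical, not the shipped interface.

```typescript
// Minimal sketch of screen reader-friendly controls (illustrative only).

// A "Play" button users press whenever they want to hear the audio output.
// Native <button> elements are keyboard-focusable and expose their role to
// screen readers without extra ARIA.
const playButton = document.createElement("button");
playButton.textContent = "Play";
playButton.setAttribute("aria-label", "Play an audio preview of the current style");

// Preset colors exposed as a labeled radio group instead of a free-form picker.
const colorGroup = document.createElement("fieldset");
const legend = document.createElement("legend");
legend.textContent = "Layer color";
colorGroup.append(legend);

for (const color of ["Red", "Blue", "Yellow"]) { // hypothetical presets
  const option = document.createElement("label");
  const radio = document.createElement("input");
  radio.type = "radio";
  radio.name = "layer-color";
  radio.value = color.toLowerCase();
  option.append(radio, ` ${color}`);
  colorGroup.append(option);
}

document.body.append(colorGroup, playButton);
```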


Ensuring UI Changes Support Real Accessibility





A Guided Tutorial

Incorporating a guided tutorial was essential to help users, especially those using screen readers and keyboard navigation, understand a nontraditional interaction model and navigate with confidence.
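
One common way to make such a tutorial work with assistive technology is to announce each step through an ARIA live region, so a screen reader reads the instruction aloud as the user advances. The sketch below uses hypothetical step text and a simple "Next" control to illustrate the pattern; it is not the prototype's actual code.

```typescript
// Illustrative sketch: announce guided-tutorial steps via a polite live region.
const steps = [
  "Welcome to the Bungee audio font tester. Press Tab to move between controls.",
  "Choose a layer color from the preset list.",
  "Press Play to hear the current style as sound.",
]; // hypothetical copy

const announcer = document.createElement("div");
announcer.setAttribute("role", "status"); // implies aria-live="polite"
document.body.append(announcer);

let stepIndex = 0;
const nextButton = document.createElement("button");
nextButton.textContent = "Next step";
nextButton.addEventListener("click", () => {
  // Updating the live region's text causes screen readers to announce it.
  announcer.textContent = steps[stepIndex];
  stepIndex = Math.min(stepIndex + 1, steps.length - 1);
});
document.body.append(nextButton);
```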



Prototyping

The prototype demonstrates how Bungee’s visual style can be translated into a multi-sensory experience that’s screen-reader friendly. Simplified controls and clear labels make navigation straightforward, while a guided tutorial introduces users to the interaction model. Visual attributes of the font, like weight, orientation, and layering, are mapped to distinct audio cues, allowing users to “hear” Bungee’s personality as they test the typeface.
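
As a rough illustration of how “hearing” a style could work in a browser prototype, the sketch below plays a short cue with the Web Audio API, layering one oscillator per color layer and panning it according to the assumed mapping sketched earlier. The specific pitches, waveforms, and durations are placeholders rather than the demo’s real parameters.

```typescript
// Rough sketch: play a one-second audio cue for a Bungee style with the
// Web Audio API (browsers require a user gesture, such as pressing the
// Play button, before audio can start).
function playCue(cue: { pitchHz: number; waveform: OscillatorType; pan: number; voices: number }): void {
  const ctx = new AudioContext();
  const panner = new StereoPannerNode(ctx, { pan: cue.pan });
  panner.connect(ctx.destination);

  for (let i = 0; i < cue.voices; i++) {
    const osc = new OscillatorNode(ctx, {
      type: cue.waveform,
      // Stack layers as a simple chord: each extra layer a fifth higher.
      frequency: cue.pitchHz * Math.pow(1.5, i),
    });
    const gain = new GainNode(ctx, { gain: 0.2 }); // keep layered tones gentle
    osc.connect(gain).connect(panner);
    osc.start();
    osc.stop(ctx.currentTime + 1);
  }
}

// Example: a heavy, vertical, two-layer style might sound like this.
playCue({ pitchHz: 440, waveform: "sawtooth", pan: 0.6, voices: 2 });
```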


Shaping my Approach

Designing for Bungee reminded me that accessibility is ultimately about creating equal access to expression. The challenge was not only to simplify navigation but also to capture the font’s playful personality through sound. This balance showed me how multi-sensory design can open new ways of experiencing art and design, making it meaningful to a broader audience.

The project also highlighted how small design choices, like a clearer interface, a guided tutorial, or thoughtful feedback, can remove barriers for users who often get overlooked. Moving forward, I want to continue exploring how modalities like sound, touch, and motion can become integral to interaction design, not just substitutes for vision. Accessibility, to me, has shifted from a requirement to a design opportunity: a way to expand creativity by imagining experiences that everyone can take part in.