See the concept poster here: actuated pixels
Many a sci-fi universe has solved a shortcoming of our modern-day touch devices: the ability to reach out to grasp and feel digital elements with our hands. Take the smartphone, for example. These devices can be drastically limiting for vision-impaired users, many of whom depend on software such as text-to-speech to use the phone at all.
Actuated Pixels (AP) creates physical elevations behind each digital element, using hydraulics and electro-tactile feedback to provide a sense of depth and different touch sensations depending on the type of element the user interacts with. Use cases range from something as simple as braille text that lets the visually impaired browse the internet to, even more interestingly, navigation.
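As a rough sketch of the idea, the type-to-sensation mapping described above could be modeled as a lookup from element type to a tactile profile. Everything here is an illustrative assumption: the element names, the numeric values, and the `TactileProfile` structure are invented for this sketch and are not part of the actual AP concept.

```python
from dataclasses import dataclass

@dataclass
class TactileProfile:
    elevation_mm: float  # how far the pixel region is raised (hydraulics)
    intensity: float     # electro-tactile stimulus strength, 0..1

# Hypothetical mapping; the real element set and values are unknown.
PROFILES = {
    "braille":      TactileProfile(elevation_mm=0.7, intensity=0.0),
    "button":       TactileProfile(elevation_mm=1.5, intensity=0.3),
    "map_building": TactileProfile(elevation_mm=2.0, intensity=0.1),
    "map_water":    TactileProfile(elevation_mm=0.2, intensity=0.6),
}

def profile_for(element_type: str) -> TactileProfile:
    """Fall back to a flat, inert surface for unknown element types."""
    return PROFILES.get(element_type, TactileProfile(0.0, 0.0))
```

A renderer could then raise each on-screen region according to its profile, so braille reads as raised dots while water on a map feels low and buzzing.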
Ed is vision impaired and relies on his white cane to navigate. Ed was recently introduced to the new AP Phone and is excited to give it a go. Ed finds Google Maps by reading the braille text on his screen and immediately feels a map forming under his fingertips. He can sense the buildings on his left, the park and river on his right, and the pavement he stands on. Ed notices an elevated wave moving towards him on the screen and realizes it is a walking person that the camera has translated into actuated feedback.
Using his phone to feel what others can see has opened up valuable new opportunities for Ed. He can now find bus departures by reading braille text and navigate to the stop using elevated real-time maps, making his everyday activities far more accessible.
week3 actuatedpixels actuated update