DESIGN PROCESS
PHASE 1: CONTEXT
Overview
Despite advances in smart infrastructure, our modern cities fail to meet a sufficient standard of accessibility for over 575,000 blind or vision-impaired Australians. This not only diminishes practical mobility but is disempowering, placing the vision impaired at life-threatening risk and significantly affecting their overall health and sense of personal autonomy. The project explores solutions to existing navigational hindrances in an effort to grant visually impaired individuals greater freedom when travelling from one location to another. Specifically, we wanted to understand and analyse the efficiency, convenience and reliability of current systems.
Background
Wayfinding
People with vision impairment spend extensive amounts of time preparing for a journey. As a consequence, journeys are planned in greater detail, leading to more wayfinding decisions overall. Those with impaired sight should therefore have more wayfinding information made available to them throughout their commute.
Mobility Related Accidents
People living with visual impairment typically suffer at least one mobility-related accident per month. Existing mobility aids such as long canes and guide dogs were not found to be effective in avoiding, or even reducing the frequency of, overhead accidents. This has resulted in a loss of confidence and changes in walking habits.
Transport Accessibility
The experience of getting to and from public transport is not treated as equally important or vital; it is the beginning and end of the journey that are least well served for this audience. Issues such as a lack of defined footpaths, shared footpaths and the absence of audio-tactile traffic signals at major intersections can prevent the visually impaired from accessing public transport at all. For public transport and built environments to be utilised, barrier-free and safe pedestrian access should be developed in and around public spaces.
Existing Solutions for the Blind
There is an overall lack of innovation in developments that attempt to address the problems faced by the visually impaired. The failure of infrastructure to be contextually aware, whether by responding to changes in the environment or by providing directional information, shows that there are significant opportunities in this market.
Problem Statement
The vision-impaired require assistance to navigate modern cities independently. This includes assistance in orienting themselves, identifying hazards and using existing public transport and public infrastructure.
PHASE 2: CONCEPT IDEATION
Concept Exploration
We then ideated multiple concepts before narrowing down to the four strongest ideas. For each idea, we described the specific problem it aims to address, its solution to user needs, and the hardware and software requirements needed to realise the product. Although some of the chosen solutions did not directly address problems within the public infrastructure realm, we felt the contrast between the four designs would later help inform a fully realised chosen concept. The concepts are:
Beacon
A smartwatch designed to help the vision-impaired navigate to destinations by guiding them to incremental checkpoints (such as a bus stop, train gate or platform). Users are guided through a series of haptic vibrations that increase in rate and intensity as the user approaches a checkpoint. In addition, braille text conveys the next step or checkpoint that the user is travelling to. Pre-set destinations are configured either by the user or a carer; alternatively, a new destination can be beamed directly from a phone. Users interact with the product via a physical selection dial to choose pre-set destinations.
Braille Map
A dynamic, updatable braille map that makes location-based wayfinding applications accessible to the blind. The map is produced by mechanically pushing out pins on a 2D plane to represent the user’s surroundings. Users touch the three-dimensional surface to read the map and locate where they are and what their surroundings look like. The braille pins move in or out to represent buildings or roads.
Optic Locket
The concept is a necklace with an attached ultrasonic sensor that detects obstructions between torso and head height. Vibration motors embedded on each side of the necklace vibrate depending on which side the obstruction is on and how far away it is. Through this haptic feedback, the user is told which way to move around the object.
Tack-Tiles
Tack-Tiles is a dynamic pavement solution that conveys vibration-based messages to inform individuals with visual disabilities of the dangers they may face at crossings. Messages are updated by prediction-based AI software that provides constant, accurate feedback to users. When users stand on the Tack-Tile surface, the pads vibrate to convey messages about the state of their surroundings.
User Testing: Round 1
The product prototypes assist blind people in navigating cities in distinctly different ways. By conducting user testing, we would gain a high-level understanding of which product solved this problem most effectively, and which aspects of that product were most intuitive. We would then evaluate, select and develop iterations of a single concept to ensure it effectively assists blind people in navigating cities.
In the initial round of think-aloud testing, we observed participants as they completed an activity, followed by an interview to understand pain points and what they wanted in future iterations. Given time limitations, we resorted to mimicry usability testing. To closely imitate the conditions faced by our target audience, we asked participants to wear blacked-out goggles to simulate blindness.
Optic Locket
The test environment was a hallway with strings tied at torso height on different sides. Participants had to navigate through the hallway, avoiding the strings, while two different vibration sounds were played behind them to indicate whether they should move left or right. The test was designed to see how each participant would react to audio indicators guiding them without any other aids.
Beacon
We performed a Wizard of Oz test to validate the key interaction of navigating vision-impaired users through haptic feedback from a watch. We put a Moto 360 watch on our subjects and tasked them with finding an X written on paper. The vibrations were controlled remotely by an app on a smartphone: the tester sent an increasing number of vibrations as the participant approached the X.
Tack-Tiles
Participants were tasked with stepping on a cardboard prototype of the concept. They were told they were waiting at a zebra crossing and had to determine what the different sounds played indicated about the traffic conditions. Their understanding of each sound, the duration of each test and any comments were noted.
Due to technological limitations, the team decided to eliminate Braille Map from the testing process.
Findings Summary
The use of in-test observations and post-test interviews supported our qualitative analysis of the prototypes. Investigating and comparing the use of haptic feedback across the differing prototypes gave the team a succinct understanding of users’ mental models.
In addition, participants were prompted to relay their experiences and emotions in a private post-test survey. We used a SUS-style questionnaire to collect quantitative results that compared and rated each concept, further narrowing down our decision. Using an affinity diagram, we summarised our findings into three distinct points.
Analysis
Distinct Feedback
Users wanted the feedback from each product to be more distinct. The vibrations tested were too similar to one another, which led to confusion about what each vibration conveyed. Simplicity is favoured over complexity: too many variations of vibration make it difficult to map a vibrational message to an easily interpreted response.
Intuitive Feedback
There is a desire for more intuitive and more informative feedback. Participants wanted the prototypes to make better use of instinctive responses, or to provide extra information to help interpret the vibrations. Changes such as a longer pause to emphasise a change of state, negative feedback, and additional reassurances such as audio would support better comprehension of haptic feedback.
Reassurance
Directional feedback from all concepts needs to be more reassuring for users to feel safe. The challenge of using haptic feedback to convey directional information is that it is hard for users to interpret which direction they should move in.
Further supporting Tack-Tiles, the prototype received the highest average rating for all three metrics tested in the survey, with an aggregate score of 7.8. The survey asked users to rate the prototypes on feasibility, helpfulness and overall ease of use.
Two individuals with visual impairment, who were not able to participate in the tests, were interviewed as final indicators for the direction of the project. Their advice and insight into commuting around cities helped us understand what aids they use, how they get help and the problems that may arise. As we were unable to test with any visually impaired individuals, their feedback proved pivotal in deciding which concept would be most helpful to them and most feasible to implement.
Reflection
The most prominent challenge we faced moving into testing was our inability to assess each concept with participants who were visually impaired. It also proved difficult to test the low-fidelity version of each prototype: with little control over haptic feedback hardware, we resorted to mimicking vibrations with sound. Overall, participants found it hard to interpret the sounds as they would a physical vibration.
Chosen Concept: Tack-Tiles
It was clear from our amalgamated results that Tack-Tiles had the most intuitive interaction and the greatest potential. Our process for choosing Tack-Tiles involved rating all concepts against one another in a decision matrix. We then iterated on Tack-Tiles with our user needs at the forefront.
We updated the vibrations to be more intelligible and easily interpreted by users. Testing the effectiveness of various vibration patterns showed that first-time users often misinterpreted them. For example, a fast vibration sound that was meant to prompt users to cross the road was often misinterpreted as a “don’t cross” signal.
Additional audio cues help further reduce interpretation error and ensure safety through product confidence. Further testing will explore a range of audio cues to discern the most beneficial sounds to be played along with the vibrations.
Testing: Round 2
After we had selected Tack-Tiles as our chosen concept, we conducted further testing to validate aspects of the design and continue to iterate. For Tack-Tiles, we wanted to test which kinds of sounds and vibrations worked best to communicate various traffic conditions to blind subjects.
We developed three different sets of sounds and played them to our subjects in a “blind test” where they listened to each sound and had to choose its meaning from a list without any other assistance. Thus, we could determine which sounds represented each traffic scenario most accurately based on the percentage of participants who guessed the meaning correctly. We also gauged their emotional response to each sound.
Results showed that vibrations paired with spoken instructions were best at communicating traffic changes at intersections. Moreover, the overall emotional response to vibrations with spoken instructions scored highest, ahead of vibrations with sound effects and vibrations alone.
Further iterations will explore how to best communicate audio to users. For example, we must consider whether to use speakers or headphones and whether to play sounds automatically or only trigger sounds when a user interacts with the product. We will also experiment with embedding lights to guide partially sighted users to help further distinguish traffic states.
PHASE 3: DELIVERY
Product Development
With Tack-Tiles fully realised, we began the physical construction of the product. First, we developed a game plan that divided development up in a structured and cohesive manner. Team responsibilities were distributed fairly, with a set of expectations to follow over four weeks.
Blueprints were drawn up to plan the overall structure and dimensions of Tack-Tiles, including the required materials and hardware and the spatial planning of the various internal components.
In the meantime, our coding lead took on implementing the vibration behaviour. This included the necessary hardware-related coding and the integration of the various internal components, such as the wiring and the vibration motor. Finally, tasks extended to curating the audio clips and writing the code that plays them.
Hardware and Software Requirements
Actuators
An array of six green LEDs is connected to a shift register and powered by the Arduino, alongside a single red LED. A small DC motor with an offset weight acts as a vibration motor, and a portable speaker is connected to the computer over Bluetooth. Each of these actuators is turned on and off at a different rate, triggered by a change in signal from the ultrasonic sensor.
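As a rough illustration of how these actuators sit behind the Arduino, the sketch below sets up the pins and drives the green LED strip through the shift register. The pin numbers and the 74HC595-style shift register are assumptions for illustration, not a record of the exact build.

```cpp
// Illustrative actuator setup; pin numbers and the 74HC595-style shift
// register are assumptions, not the exact hardware used in the prototype.
const int DATA_PIN  = 2;   // serial data into the shift register
const int CLOCK_PIN = 3;   // shift register clock
const int LATCH_PIN = 4;   // latch: copies shifted bits to the outputs
const int RED_LED   = 5;   // single red "don't walk" LED
const int MOTOR_PIN = 6;   // DC vibration motor (driven via a transistor)

void setup() {
  pinMode(DATA_PIN, OUTPUT);
  pinMode(CLOCK_PIN, OUTPUT);
  pinMode(LATCH_PIN, OUTPUT);
  pinMode(RED_LED, OUTPUT);
  pinMode(MOTOR_PIN, OUTPUT);
}

// Write a 6-bit pattern to the green LED strip, one bit per LED.
void setGreenLeds(byte pattern) {
  digitalWrite(LATCH_PIN, LOW);
  shiftOut(DATA_PIN, CLOCK_PIN, MSBFIRST, pattern);
  digitalWrite(LATCH_PIN, HIGH);
}

void loop() {}  // patterns are driven by the control logic described below
```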
Sensors
An ultrasonic sensor is used to detect approaching traffic. The sensor emits a sound wave in the ultrasonic frequency range; if an object such as a car is in the way, the wave is reflected back to the receiver, which triggers a signal to the Arduino.
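A minimal sketch of this detection step, assuming an HC-SR04-style sensor and an illustrative distance threshold; the actual pins and threshold would depend on the installation:

```cpp
// Illustrative ultrasonic detection; the HC-SR04-style sensor, pin numbers
// and threshold are assumptions.
const int TRIG_PIN = 8;
const int ECHO_PIN = 9;
const float CAR_THRESHOLD_CM = 150.0;   // treat anything closer as a vehicle

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);
}

bool carDetected() {
  // Send a 10 microsecond ultrasonic pulse.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  // Time the echo and convert to centimetres (speed of sound ~343 m/s).
  long durationUs = pulseIn(ECHO_PIN, HIGH, 30000);   // 0 means no echo returned
  if (durationUs == 0) return false;
  float distanceCm = durationUs * 0.0343 / 2.0;
  return distanceCm < CAR_THRESHOLD_CM;
}

void loop() {
  Serial.println(carDetected() ? "car" : "clear");
  delay(200);
}
```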
Software
The Arduino IDE is used to program the microprocessor. The code mainly consists of an if statement that checks whether the ultrasonic sensor has detected a car and triggers a “wait” or “walk” pattern in the LEDs and motor. An audio library and the Processing IDE are also used to take the signal from the ultrasonic sensor and trigger audio files to play on a laptop connected to a Bluetooth speaker.
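Putting the sensing and actuation together, the core logic could look something like the sketch below: a single if statement selects a “wait” or “walk” pattern, and a one-character flag is written to serial so the Processing sketch on the laptop can play the matching audio clip. The pin numbers, timings and serial protocol are illustrative assumptions rather than the exact code used.

```cpp
// Illustrative control loop; pins, timings and the one-character serial
// protocol are assumptions rather than the exact prototype code.
const int TRIG_PIN = 8, ECHO_PIN = 9;                  // ultrasonic sensor
const int DATA_PIN = 2, CLOCK_PIN = 3, LATCH_PIN = 4;  // shift register
const int RED_LED = 5, MOTOR_PIN = 6;

void setup() {
  pinMode(TRIG_PIN, OUTPUT);  pinMode(ECHO_PIN, INPUT);
  pinMode(DATA_PIN, OUTPUT);  pinMode(CLOCK_PIN, OUTPUT);
  pinMode(LATCH_PIN, OUTPUT); pinMode(RED_LED, OUTPUT);
  pinMode(MOTOR_PIN, OUTPUT);
  Serial.begin(9600);          // read by the Processing sketch on the laptop
}

bool carDetected() {           // as in the sensor sketch above
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long us = pulseIn(ECHO_PIN, HIGH, 30000);
  return us > 0 && (us * 0.0343 / 2.0) < 150.0;        // closer than ~1.5 m
}

void setGreenLeds(byte pattern) {                      // as in the actuator sketch
  digitalWrite(LATCH_PIN, LOW);
  shiftOut(DATA_PIN, CLOCK_PIN, MSBFIRST, pattern);
  digitalWrite(LATCH_PIN, HIGH);
}

void loop() {
  if (carDetected()) {
    // "Wait" pattern: red LED on, greens off, slow vibration pulse.
    Serial.write('W');                                 // cue the warning audio
    setGreenLeds(0b000000);
    digitalWrite(RED_LED, HIGH);
    digitalWrite(MOTOR_PIN, HIGH); delay(200);
    digitalWrite(MOTOR_PIN, LOW);  delay(800);
  } else {
    // "Walk" pattern: green chase toward the far kerb, quicker vibration pulses.
    Serial.write('G');                                 // cue the "cross now" audio
    digitalWrite(RED_LED, LOW);
    for (int i = 0; i < 6; i++) {
      setGreenLeds(1 << i);
      digitalWrite(MOTOR_PIN, HIGH); delay(100);
      digitalWrite(MOTOR_PIN, LOW);  delay(100);
    }
  }
}
```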
The Final Design
Tack-Tiles is a piece of smart pavement infrastructure that alerts pedestrians to roadside traffic conditions. Our concept aims to improve the ground surface indicators already present in most developed cities. While existing designs have proven effective in marking hazards, they are far less effective at conveying the type of obstacle present and alerting users to changes in traffic conditions.
The physical form of Tack-Tiles is similar to a traditional tactile ground surface indicator of the kind typically found at traffic light crossings or train platforms. The product conveys vibration- and audio-based messages to alert visually impaired individuals to traffic conditions or other potential hazards. For example, Tack-Tiles installed at zebra crossings produce vibrational messages and audio instructions depending on whether the crossing is clear or a car is approaching.
Haptics
Users can step on Tack-Tiles or touch the surface with a white cane to begin receiving messages. Ultrasonic sensors placed at either end of the crosswalk pick up traffic information; Tack-Tiles receives this data and converts it into haptic feedback, telling the user when to stop or walk. If oncoming traffic is present, the tactile pads begin to vibrate at slow, regular intervals. If no traffic is present, the tiles vibrate in quicker succession.
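The two cadences could also be generated without blocking the rest of the sketch by timing pulses with millis(); a minimal sketch, with the pin and interval values below as placeholders to be tuned through further testing:

```cpp
// Illustrative non-blocking vibration cadences; pin and interval values
// are placeholders to be tuned through further testing.
const int MOTOR_PIN = 6;
const unsigned long WAIT_INTERVAL_MS = 1000;  // slow, regular pulses: traffic present
const unsigned long WALK_INTERVAL_MS = 250;   // quicker succession: crossing clear
const unsigned long PULSE_MS = 150;           // length of each vibration pulse

bool trafficPresent = true;        // would be updated from the ultrasonic sensor
unsigned long lastPulseStart = 0;

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  unsigned long interval = trafficPresent ? WAIT_INTERVAL_MS : WALK_INTERVAL_MS;
  unsigned long now = millis();

  // Start a new pulse once per interval, then switch the motor off again.
  if (now - lastPulseStart >= interval) lastPulseStart = now;
  digitalWrite(MOTOR_PIN, (now - lastPulseStart < PULSE_MS) ? HIGH : LOW);
}
```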
Visual
Embedded LEDs provide further indication at crossings. When cars are present, red LEDs form an X and blink slowly to show users that it is not safe to cross. Once no hazards are present, a strip of green LEDs turns on and off in sequence, moving in the direction of travel. Applying LEDs in this context assists those with partial visual disabilities who rely on high-contrast colours when wayfinding through public spaces.
Audio
Speakers embedded within Tack-Tiles warn users when vehicles are inbound; if no traffic is present, spoken instructions direct users to cross. With sufficient use of Tack-Tiles, the objective is for users to interpret messages without relying on sound in circumstances where traffic, crowds, construction or other background noise would make audio inaudible.
Issues
Placement of the ‘Do not walk’ indicator
In future iterations, we would improve the placement of the red “don’t walk” indicator. Currently positioned in the back-left corner, it is easily obscured by the user’s foot; a better placement would be the middle of the tile. The problem could also be solved by using multiple tiles side by side, so that if one is covered the others can still be seen.
Sensor type
We used an ultrasonic sensor to demonstrate the product. However, at real crossings an inductive loop would be a better choice, as used in current traffic-light car detection systems. It consists of a long wire embedded in the road that generates a magnetic field and detects large metal objects, such as cars, by the change they cause in that field.
Production
Our device contains an Arduino, laptop and Bluetooth speaker. If mass produced, the Arduino and laptop would need to be replaced by a cheaper, more reliable PCB and microprocessor. The Bluetooth speaker could be replaced by a cheaper standard speaker.
Future Work
Improving Tack-Tiles
The next steps would see an iteration on haptic feedback that improves the methods of communication. More concise vibrational feedback would work towards the eventual goal of users instinctively understanding instructions at crosswalks without having to rely on audio. In circumstances where audio instructions cannot be heard, users could still interpret the messages communicated by Tack-Tiles.
Moreover, additions to the haptic feedback could be explored. Users might use this feedback to position themselves along a crosswalk where pedestrian flow is lightest, or to gauge the size of a vehicle as they move onto the crosswalk.
Expanding Tack-Tiles
The Tack-Tiles solution could be adapted to a range of other public areas with their own unique hazards. For example, Tack-Tiles could be used to convey travel times along a platform, how many minutes until a train arrives or departs, or where users should position themselves on a platform to line up with the train doors.
Further assistance to improve wayfinding among visually impaired individuals could also be applied to other forms of tactile paving. Haptic feedback on elongated-style tiles (designed to mark pathways) could provide additional information on the go.