

This project was funded primarily by the Secrétariat du Conseil du trésor of Québec through its "Appui au passage à la société de l'information" (support for the transition to the information society) program, with extensions supported by a Google Faculty Research Award. An E. Ben and Mary Hochhausen Award from the Canadian National Institute for the Blind (CNIB) is currently funding application improvements aimed at putting the application in the hands of a wider audience.

You will shortly be able to download Autour from the Apple App Store, and to read about how to install and use it on our support page.


Autour (French for "Around", pronounced "oh tour") is an eyes-free mobile system designed to give blind users a better sense of their surroundings. Although other systems (e.g., Humanware's Trekker and standard GPS tools) emphasize navigation from one specific location to another, typically accomplished by explicit turn-by-turn instructions, our goal is to use ambient audio to reveal the kind of information that visual cues such as neon signs provide to sighted users. Once users notice a point of interest, additional details are available on demand.

Our current design supports several modes of operation.

The Autour user experience is tightly tied to spatialized audio, delivered preferably through bone-conduction or open-air headphones so as not to interfere with the natural sounds of the environment. Sounds appear to come from locations surrounding the user, conveying both direction and distance. This allows for parsimony of representation and less intrusive sound cues: imagine the difference between a mechanical voice stating, "Restaurant, 50 meters, 60 degrees to your left" versus a very short "Restaurant" spatialized in the correct direction.
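The core idea can be sketched in a few lines: compute the point of interest's bearing relative to the user's heading, then derive audio gains from it. The Python below is an illustrative sketch only, with function names of our own choosing; the actual app renders sound through PureData patches, and `stereo_pan` is a crude constant-power stand-in for real spatialization.

```python
import math

def relative_bearing(user_heading_deg, user_lat, user_lon, poi_lat, poi_lon):
    """Bearing of the POI relative to the user's facing direction, in (-180, 180]."""
    # Initial great-circle bearing from user to POI (degrees clockwise from north).
    phi1, phi2 = math.radians(user_lat), math.radians(poi_lat)
    dlon = math.radians(poi_lon - user_lon)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    bearing = math.degrees(math.atan2(y, x)) % 360
    return (bearing - user_heading_deg + 180) % 360 - 180

def stereo_pan(rel_bearing_deg):
    """Constant-power (left, right) gains from a relative bearing.

    Bearings behind the user are folded to the nearest side; a real renderer
    would use HRTFs to disambiguate front from back and encode distance.
    """
    clamped = max(-90.0, min(90.0, rel_bearing_deg))
    theta = math.radians((clamped + 90.0) / 2.0)   # 0 (hard left) .. pi/2 (hard right)
    return math.cos(theta), math.sin(theta)
```

A POI straight ahead yields equal gains in both ears; one at 90 degrees left yields sound only in the left ear.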

Prototype iPhone application and bone-conduction headphones.

Autour currently runs on either an iPhone 4 (or later) or an Android Galaxy Nexus, and uses the smartphone's built-in compass, accelerometer, gyroscope and GPS hardware to determine the user's location and orientation. Nearby points of interest are retrieved automatically via the Autour server. The appropriate sound scene is then rendered using the libpd library to run PureData (PD) patches.
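The distance from the user's GPS fix to a point of interest follows from the textbook haversine formula. This is not Autour's code, just a self-contained illustration of the calculation any such app relies on:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = phi2 - phi1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
```

One degree of latitude works out to roughly 111 km, a handy sanity check when logging sensor data.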

Our server contains a copy of certain portions of OpenStreetMap, presently limited to Québec. This is used to find the nearest point on the closest street, to name intersections, and to obtain the contours of parks and of certain large buildings such as hospitals. OpenStreetMap data is copyrighted by its owners, who provide it under a free license (ODbL). The server also contains information from the Société de Transport de Montréal (STM) related to bus stops and subway entrances (see the STM website for information about their free license), but does not contain such information from the public transportation companies of Laval, Longueuil, or the rest of Québec.
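Finding the nearest point on the closest street reduces to projecting the user's position onto each street segment and keeping the closest result. A minimal planar sketch (real code would first convert GPS coordinates into a local metric frame; the function names here are ours, not the server's):

```python
def nearest_point_on_segment(p, a, b):
    """Closest point to p on segment a-b, all in a local planar (x, y) frame in metres."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:
        return a  # degenerate segment
    # Parameter of the orthogonal projection, clamped to stay on the segment.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    return (ax + t * dx, ay + t * dy)

def snap_to_streets(p, segments):
    """Snap p to the closest point across all street segments (pairs of endpoints)."""
    def dist_sq(q):
        return (q[0] - p[0]) ** 2 + (q[1] - p[1]) ** 2
    return min((nearest_point_on_segment(p, a, b) for a, b in segments), key=dist_sq)
```

The same projection underlies the "snapping" correction described under the sensor evaluation below.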

Our server also acts as an intermediary for real-time requests to Google Places™ and Foursquare™ to locate companies, public buildings, monuments, etc. The Autour app can then render those locations in different ways, e.g., by relative direction ("front right") or by cardinal direction ("northeast"). Category names (e.g., "bar", "fast food") are spatialized to sound as though they are coming from the actual location of the POI, giving a direct cue to its direction and distance.
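Turning a bearing into a spoken label like "front right" or "northeast" is a simple sector lookup. A sketch with illustrative 45-degree sectors (the app's actual granularity and wording may differ):

```python
def relative_label(rel_deg):
    """Coarse relative-direction label from a bearing relative to facing, in degrees."""
    sectors = ["front", "front right", "right", "back right",
               "back", "back left", "left", "front left"]
    # Shift by half a sector so each label is centred on its direction.
    return sectors[int(((rel_deg % 360) + 22.5) // 45) % 8]

def cardinal_label(bearing_deg):
    """Eight-point compass label from an absolute bearing (degrees from north)."""
    points = ["north", "northeast", "east", "southeast",
              "south", "southwest", "west", "northwest"]
    return points[int(((bearing_deg % 360) + 22.5) // 45) % 8]
```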

Video demonstration

(Note that Autour was formerly named In-Situ Audio Services, or "ISAS" for short)

An earlier demonstration video of Autour shows some functionality not covered in the latest version, such as audio icons.

User testing

We have worked with both French and English organizations for the blind in Montreal, including the Institut Nazareth & Louis Braille (INLB) and the Montreal Association for the Blind (MAB) to test Autour with a number of blind participants. These user tests have taken the form of informal walkabouts while soliciting feedback, more formal tests with specific tasks to complete using Autour while on the streets of Montreal, and also longer-term deployments where blind individuals were loaned iPhone devices to use in their daily routines. Feedback has been generally positive for the system as a whole, but has also pointed out numerous usability and other issues that have been factored into the design. A paper (link below) summarizing the results across several of these tests was presented at the 2013 ACM Conference on Human Factors in Computing Systems (CHI).

Walking straight

In addition to core environmental awareness functionality, we have also experimented with an additional mode designed to help blind users walk in a straight line. Using the iPhone's gyroscope, we detect deviation from a straight path and play sounds through the headphones to guide the user back on course. Our user testing indicates that blind users walk straighter with this mode than without it, and that it is generally adequate to keep the user within a pedestrian walkway over a distance of 15 m (see the paper for details).
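The mode's logic can be sketched as: integrate the gyroscope's yaw rate to estimate drift from the initial heading, then emit a corrective cue whose side and intensity reflect that drift. The dead zone and scaling below are illustrative values of our own, not Autour's tuned parameters:

```python
def heading_drift(yaw_rates_dps, dt_s):
    """Integrate gyroscope yaw-rate samples (deg/s at interval dt_s) into heading drift (deg)."""
    return sum(rate * dt_s for rate in yaw_rates_dps)

def correction_cue(drift_deg, dead_zone_deg=5.0, max_deg=35.0):
    """Map heading drift to a corrective audio cue.

    Convention: positive drift means the user has veered clockwise (to the right),
    so the cue sounds from the left to pull them back on course. Returns None
    inside the dead zone, else (side, intensity in [0, 1]).
    """
    if abs(drift_deg) <= dead_zone_deg:
        return None
    side = "left" if drift_deg > 0 else "right"
    intensity = min(1.0, (abs(drift_deg) - dead_zone_deg) / (max_deg - dead_zone_deg))
    return side, intensity
```

The dead zone keeps the system quiet during normal gait sway, so cues only sound when the walker is genuinely veering.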

Smartphone sensor evaluation

Sensor reliability on smartphone devices remains a concern for practical deployment, especially in downtown areas with large buildings, where the sensors become erratic. Thus, we have focused considerable attention on evaluating the quality of the location and orientation sensors in current smartphones. Since Autour depends heavily on these sensors, knowing when and how they fail is crucial to handling these issues in the application. We undertook an experiment with three smartphones: iPhone 4, iPhone 4s, and Android Samsung Galaxy Nexus. By repeatedly walking a prescribed path in two areas of Montreal with all three phones in various orientations and positions on the body, while continually logging location and compass/device orientation data, we can see how well the sensors perform versus reality. In addition, we can see whether the devices' estimates of their own error are accurate. The results have been written up in a paper, Smartphone sensor reliability for augmented reality applications, presented at the Mobiquitous 2012 conference in Beijing, China.
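Once ground truth is known, checking whether a device's self-reported accuracy is honest is straightforward: count how often the true error actually falls inside the reported radius. A sketch with helper names of our own (not the study's code):

```python
def accuracy_honesty(fixes):
    """Fraction of GPS fixes whose true error is within the reported accuracy radius.

    `fixes` is a list of (actual_error_m, reported_accuracy_m) pairs. A
    well-calibrated device should contain the true position within its
    reported radius for the large majority of fixes.
    """
    within = sum(1 for actual_err_m, reported_acc_m in fixes if actual_err_m <= reported_acc_m)
    return within / len(fixes)

def mean_error_m(fixes):
    """Mean actual positioning error over the logged fixes, in metres."""
    return sum(err for err, _ in fixes) / len(fixes)
```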

These results led us to remove our original sensor fusion algorithm, to implement solutions such as "snapping" the user to the nearest street, and to dynamically change how POIs are rendered depending on sensor accuracy. We expect the results of this study to be useful to others who are encountering the same issues while implementing augmented reality solutions on smartphones.
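Accuracy-dependent rendering can be as simple as thresholding the device's reported accuracy; the thresholds and mode names below are purely illustrative, not Autour's actual values:

```python
def rendering_mode(reported_accuracy_m):
    """Degrade POI rendering as reported GPS accuracy worsens (illustrative thresholds)."""
    if reported_accuracy_m <= 10.0:
        return "spatialized"        # precise fix: render at the POI's true direction/distance
    if reported_accuracy_m <= 30.0:
        return "coarse_direction"   # noisy fix: announce only "front right", "left", ...
    return "list_only"              # unusable fix: fall back to a non-spatial list of POIs
```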


Autour received the Best Paper Award at Mobiquitous 2011 in Copenhagen, Denmark for the paper, What's around me? Spatialized audio augmented reality for blind users with a smartphone.

Autour won the 2012 CIRA Impact Award in the Application Category. We were interviewed after the awards ceremony at the Mesh 2012 conference in Toronto.

As part of the Impact Award program, CIRA also ran a "People's Choice Award" competition, which Autour won by receiving the most votes from mesh12 attendees, placing first out of the four Impact Award winners.

Publications & Presentations

  1. The Walking Straight Mobile Application: Helping the Visually Impaired Avoid Veering. Panëels, S.; Varenne, D.; Blum, J.; and Cooperstock, J. R. In International Conference on Auditory Display, Lodz, Poland, July 2013.
  2. Rendering the world to blind people via spatialized audio. J. Blum, D. El-Shimy, M. Bouchard, and J. R. Cooperstock. Graphics, Animation and New Media (GRAND) Conference, Toronto, Canada, May 2013. (Research Note) [Presentation Slides]
  3. Listen to it yourself! Evaluating Usability of "What's Around Me?" for the Blind. S. Panëels, A. Olmos, J. R. Blum, and J. R. Cooperstock. ACM Conference on Human Factors in Computing Systems (CHI), Paris, France, April 2013.
  4. Spatialized Audio Environmental Awareness for Blind Users with a Smartphone. Blum, Jeffrey R.; Bouchard, Mathieu; and Cooperstock, Jeremy R. Mobile Networks and Applications, pp. 1–15, Springer US, December 2012.
  5. Assisting the blind and treating amblyopia: Two more things you can do with your smartphone. Invited talk. Co-presented by J. R. Cooperstock and J. R. Blum. Le 15e Symposium scientifique sur l'incapacité visuelle et la réadaptation. University of Montreal, February, 2013.
  6. Smartphone sensor reliability for augmented reality applications. Blum, J.R., Greencorn, D., and Cooperstock, J.R. 2012, 9th International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services (Mobiquitous 2012), Beijing, China, December. Accept rate 31%.
  7. Mobile is not just fun and games: improving people's lives with smartphones. Presentation, Mobiz, Montréal Digital Festival, Montréal, Canada, November 15, 2012.
  8. Two ways smartphones can change the lives of blind and visually impaired people. Invited talk, Premier Atelier sur les Technologies Assistées, Centre de recherche informatique de Montréal (CRIM), Montréal, Canada, June 2012.
  9. Sound, Noise, Silence: What's around me? Spatialized audio augmented reality for blind users with a smartphone. Invited talk, ConnexCité, Montréal, Quebec, March 2012.
  10. Eyes-Free Environmental Awareness for Navigation. El-Shimy, D.; Grond, F.; Olmos, A.; and Cooperstock, J. R. 2012. Springer Journal on Multimodal User Interfaces, Special Issue on Interactive Sonification.
  11. What's around me? Spatialized audio augmented reality for blind users with a smartphone. Blum, J.R., Bouchard, M., and Cooperstock, J.R., Mobiquitous 2011, Copenhagen, Denmark, December. Main track accept rate 28%. (BEST PAPER AWARD)
  12. Hearing Neon Signs: Spatialized Audio Augmented Reality for Blind Users. Invited talk, Interacting with Sound Workshop, Mobile HCI 2011, Stockholm, Sweden, September.

Future work