Seeing Philadelphia’s Past in Augmented Reality

GIS Contributor


Guest article by Deborah Boyer and Andrew Thompson.

For over 140 years, the City of Philadelphia has been taking pictures of itself.  Though local pride is never in short supply in Philly, far more practical reasons like risk management and documentation of public works projects have motivated decades of city-employed photographers to capture images of streets, buildings, and events in all corners of the city.  The Philadelphia City Archives, under the administration of the City of Philadelphia’s Department of Records (DOR), is now tasked with maintaining the collection and serves as the resting place for an estimated 2 million photographs, an invaluable source of insight into the City’s rich history.

Accessing the images, however, was a difficult process.  Individuals had to travel to the City Archives during open hours, page through an index, and submit requests for certain photographs – and even then much of the photo collection was not fully catalogued.  Only a few hundred people completed the process each year.

Those who did found a unique collection: a century of snapshots, many with a geographic element. Most of the images are of a specific location identified by the original photographer, information that can be important both to a scholar researching the history of a developing neighborhood and a history enthusiast looking at her old elementary school.  The potential for the DOR’s collection of images to inspire local historical research, education, and old-fashioned awe was incredible.

In the early 2000s, the DOR began searching for a method to provide easier access to the photos by digitizing them and placing them online.  The DOR partnered with Azavea, a local software company specializing in Geographic Information Systems (GIS), and in 2005, the team launched PhillyHistory.org, a web-based digital asset management system that provides online access to the digitized images.  Users can search the collection by geographic location, keyword, date, and other criteria.  In addition, each image in the system is accompanied by a map of the location where the photo was taken and a link to view that location as it looks today via Google Street View.  Administrators also use the system to manage and update the photo collections.

The map-based search page emphasizes the geographic information available for many of the images.

Over the past six years, what started as a simple website with only 90 images from the City Archives has developed into an online database with over 95,000 historic photographs and maps from five Philadelphia organizations.  Currently, PhillyHistory.org has over 7,300 registered users and regularly receives over 13,000 unique visitors per month.  Map- and address-based searches have consistently been the most frequently used on the site, with over 682,000 address searches performed in the last year compared to only 160,000 keyword searches.  Users also often provide positive feedback on the site’s unique geographic features:

“Thank You for all this amazing work you are doing to keep the history of Philadelphia alive so future generations will see what a truly great city this is. I was born there and feel it’s still my city even though I live 3,000 miles away. I am so proud of what I find here and enjoy it so much! Thank you again for all your hard work and dedication to this project. You are making a lot of folks happy!” – PhillyHistory.org user

Initial Mobile Development

As both PhillyHistory.org’s collections and user base grew, the DOR and Azavea sought other ways to utilize the geographic element of the site.  In summer 2007, the DOR launched PhillyHistory.org Mobile, making the entire collection accessible via Internet-capable cell phones.  Individuals could now stand in front of a modern building and directly compare it to the image of its historical predecessor.  A PhillyHistory smartphone web app created in 2009 for iPhone and Android devices improved the mobile version’s mapping and search capabilities and took advantage of the internal Global Positioning System (GPS) receiver in many smartphones to quickly find and load the photos closest to a user’s location.

The image on the left shows relative positioning, in which the photo always directly faces the user. The image on the right shows absolute positioning, in which the photo is angled based on additional coordinates.

But the DOR and Azavea realized there was still more potential for applying mobile technologies to such a location-based collection.  Could the team create a full illusion and merge the worlds of the past and present?  Some of the most striking possibilities to do just that appeared to be in the field of augmented reality.

Investigating Augmented Reality

Once found only in science fiction, augmented reality (AR) technology has exploded in recent years thanks to advances in mobile devices like smartphones.  Modern smartphones do not just make calls and run applications; they are packed with complex sensors that application developers can access and utilize, including GPS, WiFi, a cellular radio, camera, compass, accelerometer, and (in some phones) gyroscopes.  Using these tools, a smartphone can determine its approximate geographic location from GPS and refine that estimate with WiFi and cell tower data.  A compass pinpoints the direction the phone is facing, and an accelerometer detects smaller movements through space while the gyroscope smoothes over the jitters.  An AR application can then take all this information and “augment the reality” shown through the smartphone’s camera by adding digital images and data onto the screen.

In February 2010, the Philadelphia Department of Records received a Digital Humanities Start-Up Grant from the National Endowment for the Humanities to research and develop a prototype mobile AR application that would enable users, via their smartphones, to view historic photographs of Philadelphia as overlays on the current urban landscape.  By using the geographic coordinates tracked as part of the digitization process, the DOR hoped the prototype could accurately place the images in 3D space and create an immersive user experience.  The idea was that users could walk down the street and watch as historical photos of the immediate area appeared on their smartphone screens, as if they were fixed-position “billboards” superimposed on the phone camera’s view of the current locations.

“Through the development of an augmented reality application, we hoped to create another way for people to interact with the photo collections at the Philadelphia City Archives. AR can serve as a powerful tool for helping people see the connections between the past and the present all over the city.” – Deb Boyer, Project Manager, Azavea/PhillyHistory.org

The original grant application proposed that a selection of 500 images from the collection would be geographically “pinned” in 3D space.  When the images were viewed in the AR application, the photos would line up as accurately as possible with features that still exist in the current landscape, such as roof lines and street corners.  A group of local historians would write additional descriptions for twenty of the images.  A core goal of the project was to select images from a broad range of neighborhoods.  In addition to providing extensive coverage of the city, geographically dispersed images also provided options for researching location accuracy and image placement issues related to tree cover, building heights, and other factors that could dispel the application’s illusion.

Once again, the DOR partnered with Azavea to tackle the technical aspects of the project.  Other museums and cultural institutions had created AR applications with some success, but many challenges remained for the team to achieve their version of augmented reality.  For a true “point and view” illusion, the app would require accurate GPS readings from smartphones in a crowded urban environment where the satellite signals might bounce off tall buildings.  The developers needed to find a suitable AR technology that would let them place the images in the correct 3D space, provide fast processing times to send the AR images and data to the smartphones, and include an intuitive user interface for viewing the images and historical text.

By calculating the viewing angle, we adjusted the images as necessary to make them more easily visible.

The team proposed three different options for the prototype:

  1. Create an application using an existing proprietary augmented reality framework.
  2. Create a custom application using open source technology for use on the Apple iOS.
  3. Create a custom application using open source technology for use on the Google Android platform.

Ideally, PhillyHistory.org’s AR application would run on both iOS and Android in order to reach the greatest number of users.  The team knew, however, that there might be significant hurdles in developing a custom application for two separate smartphone operating systems.

Selecting Images

Software development aside, there was still the question of which 500 images to select from the pool of over 95,000 photos and maps. Additionally, to model each image and “pin” it accurately in 3D space, information on each shot’s angle, position, and scale needed to be gathered.  The team eventually settled on a list of criteria for selecting photos, including geocoding data, historical interest, educational value, aesthetics, diverse locations, strong comparisons and contrasts between the past and present environment, and how well the photo matched the corresponding Google Street View scene.  PhillyHistory.org enables administrators to set the pitch, yaw, and zoom of the Google Street View image to match the view shown in the historic image.  This additional coordinate data would be used to accurately place the images in 3D space.  A dedicated group of interns set the Street View coordinates for each of the 500 selected images.

However, even with 500 photos, the team found areas of the city with large gaps between photos.  To ensure individuals across the city could use the app, DOR and Azavea decided to incorporate all 90,000 geocoded images from the collection in the AR application.  While the majority of the images would be positioned using only latitude and longitude coordinates, an AR application with so many digital assets would present unique challenges.

Selecting AR Technology

While the developers had hoped to utilize open source AR technology to create a custom application, they found that open source frameworks for creating sensor-based AR apps lacked stability and were often incomplete or difficult to implement on both the iOS and Android platforms.  Developing a new open source framework was beyond the scope of the grant, so the developers turned to existing proprietary platforms.  After reviewing many options, they chose to develop the PhillyHistory AR app using Layar, a sensor-based AR framework with support for both Android and iOS.

Sensor-based AR technologies find the smartphone’s location using GPS and then use the other sensors to determine where the user is pointing the phone and how it is oriented.  With this information, an AR app can superimpose digital images on a view of the physical world seen through the phone’s camera.
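The core geometry behind that superimposition can be sketched in a few lines. The Python below is an illustration only (the function names and the 60-degree field of view are assumptions, not details of the actual app): given the user's GPS fix and a photo's coordinates, compute the compass bearing to the photo and check whether it falls inside the camera's current view.

```python
import math

def bearing_to_photo(user_lat, user_lon, photo_lat, photo_lon):
    """Compass bearing (degrees clockwise from north) from the user to a photo."""
    lat1, lat2 = math.radians(user_lat), math.radians(photo_lat)
    dlon = math.radians(photo_lon - user_lon)
    x = math.sin(dlon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360) % 360

def is_on_screen(bearing, compass_heading, fov=60):
    """True if the photo's bearing falls within the camera's horizontal field of view."""
    # Signed difference in the range [-180, 180)
    diff = (bearing - compass_heading + 180) % 360 - 180
    return abs(diff) <= fov / 2
```

An app repeats this test for every nearby photo each time the compass reading changes, drawing only the photos that pass.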

Building an Augmented Reality App in Layar

Initially, the development team was uncertain that Layar would meet the requirements for the AR application.  However, reflecting the rapidly developing field of augmented reality, the Layar company had recently added several features to their framework that were crucial to the project.  Using new enhancements to Layar, the team could accurately place the 500 selected images as 2D images in 3D space and build the application to match PhillyHistory.org’s branding, colors, and design elements.

Supplying the images to the Layar service and thus to the eventual users of the app was the first challenge.  The entire collection of 90,000 images would be impossible to store directly on most smartphones.  The AR application would have to download images on the fly from the Internet using data services developed by Azavea.  While robust in many ways, this method has the disadvantage of a noticeable delay as the images are downloading – and if the user has no cellular coverage, the AR application may be inoperable.
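The trade-off described above can be sketched with a tiny client-side cache: each image is fetched over the network at most once, and a download failure degrades gracefully rather than crashing the view. This class and its interface are illustrative, not Azavea's actual code; the download function is injected so the offline behavior is explicit.

```python
class ImageCache:
    """On-device cache so each remote image is downloaded at most once.

    `fetch` is any callable taking a URL and returning bytes (e.g. an HTTP
    client); injecting it keeps this sketch testable without a network.
    """
    def __init__(self, fetch):
        self.fetch = fetch
        self.store = {}

    def get(self, url):
        if url not in self.store:
            try:
                self.store[url] = self.fetch(url)
            except OSError:
                return None  # no connectivity: caller shows a placeholder instead
        return self.store[url]
```

A real app would also bound the cache size and prefetch images near the user's location to hide the download delay.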

Additional issues quickly arose.  The Layar framework and Google Street View (which had been used to calculate the supplemental coordinates for the 500 images) measured their angles differently – Layar on a counter-clockwise basis and Street View on a clockwise basis – requiring some additional calculations.  Since the 500 “pinned” images were displayed like billboards located at a particular angle in space, users standing behind the spot where the photo was taken (i.e., on the back side of the billboard) would not see the image at all.  Other images were angled incorrectly or appeared at different sizes depending on the smartphone.  With over 90,000 images in the application, several photos could be located at a single spot, such as City Hall, which led to long image loading times, images blocking each other, and some images shearing and splicing together.
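Reconciling the two angle conventions is a one-line calculation. The sketch below assumes both systems measure from the same zero direction and differ only in rotation sense (the function name is a hypothetical illustration, not code from the project):

```python
def flip_angle(streetview_heading):
    """Convert a clockwise Street View heading (degrees) to a
    counter-clockwise convention such as Layar's.  Assumes both
    conventions share the same zero direction."""
    return (360 - streetview_heading) % 360
```

Because the conversion is its own inverse, applying it twice returns the original heading, which makes it easy to verify in either direction.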

The team addressed these challenges using some inventive geometry and prioritization.  The developers modified their data services to automatically flip the image angle data and check which type of smartphone was requesting the images in order to deliver the correct size.  The team also limited the number of images that would show up in a given area and made the views less chaotic by forcing 1 meter of separation between images and giving each displayed image its own 20-degree viewing “slice.”  By creating a prioritization system that displayed the twenty images with historical text first, the 500 “pinned” photos second, and the rest of the 90,000 only if the others weren’t available for any given location, the team made sure that the most detailed results would always appear.
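The slicing and prioritization scheme described above can be sketched as follows. The tier encoding and function are hypothetical illustrations of the idea, not the team's actual code: photos with historical text outrank the 500 pinned photos, which outrank the rest, and each 20-degree slice around the user shows at most one photo.

```python
def select_for_display(photos, max_photos=18):
    """Pick which photos to show: one per 20-degree viewing slice.

    `photos` is a list of dicts with 'bearing' (degrees from the user) and
    'tier' (0 = has historical text, 1 = pinned, 2 = everything else).
    Lower tiers win when several photos fall in the same slice.
    """
    slices = {}
    # Sort so higher-priority (lower-tier) photos claim their slice first.
    for photo in sorted(photos, key=lambda p: p['tier']):
        s = int(photo['bearing'] % 360) // 20  # 18 slices of 20 degrees each
        if s not in slices:
            slices[s] = photo
    return list(slices.values())[:max_photos]
```

With 18 slices of 20 degrees, a full turn can show at most 18 photos, which keeps the on-screen view uncluttered no matter how dense the collection is at a given spot.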

Finally, after months of selecting photos, writing additional text, working with the Layar AR framework, and developing code for displaying 90,000 images as augmented reality points, the Augmented Reality by PhillyHistory.org application was launched in late April 2011. Though the original grant’s mission was a prototype, the DOR and Azavea felt confident enough in the application to release it as a free download from the iTunes App Store and the Android Market.  As a requirement of the original NEH grant, the team also published a white paper on the project detailing their research and conclusions about the use of this form of augmented reality in cultural institutions.  Azavea and the DOR found that although AR technology has immense potential, it is still very limited, and further research, particularly in terms of open source tools, is necessary to create additional AR projects and a smoother user experience.

The development of the prototype served as an opportunity for the DOR to experiment with a new method for introducing the historic photographs to new audiences and connecting individuals with Philadelphia’s past.

“When we started PhillyHistory.org 5 years ago, we only had 90 photos.  We could hardly imagine that today we would have almost 100,000 historic photos and maps available to the public. Now with the augmented reality capabilities added to the system, PhillyHistory.org makes creative use of the latest technology to bridge the past and the present, as well as give residents and visitors a new means of learning about Philadelphia.” – Joan Decker, Commissioner, Philadelphia Department of Records

The prototype Augmented Reality by PhillyHistory.org application will remain available for download until July 2011.


The development of the Augmented Reality by PhillyHistory.org app was funded through a Digital Humanities Start-Up Grant from the National Endowment for the Humanities Office of Digital Humanities. Any views, findings, conclusions, or recommendations expressed in this project and article do not necessarily reflect those of the National Endowment for the Humanities.

This article is a summary of the “Implementing Mobile Augmented Reality Technology for Viewing Historic Images” white paper, which is available for download.
