We are very excited to announce a brand-new Virtual Tours feature for our TripAdvisor iPad app, code-named “OwlCam”. When the iPad launched last April, one noticeable shortcoming was its lack of a camera, which prevented augmented reality apps from really taking off on the platform. Augmented reality is such a great tool for finding nearby hotels, restaurants, and attractions that we really wanted to find a way to bring it to our app without waiting for the launch of the iPad 2.
At the same time, TripAdvisor is most useful for travelers planning their trips ahead of time, so our users generally care less about restaurants where they currently are than about restaurants near their hotel on an upcoming trip. So, we realized that our ultimate goal was to find a way to implement “remote” augmented reality.
In the end, the lack of a camera on the iPad was not a barrier to the development of our feature. The iPad is equipped with all of the other sensors typically used for augmented reality apps, including a compass, accelerometer, and GPS. We were able to leverage Google’s Street View API to provide 3D panoramas of vistas all around the world. The breadth of locations covered by Street View is truly amazing, including the United States, Europe, Australia, New Zealand, parts of Asia, and even parts of Antarctica.
Once we have a Street View panorama loaded on the iPad, we are able to display information and ratings about the hotels, restaurants, and attractions right within the Street View display. The end result is that you can literally drop yourself into a vacation destination and see all of the nearby things to do & places to eat. With one extra tap on a specific location, you immediately warp yourself to the 3D panorama near that location. So, it is easy to take a virtual tour of the best places to see in Paris, Hong Kong, or Sydney while sitting on your couch at home. This "warping" behavior is not even possible in standard augmented reality apps, where you are bound to the view from your current location.
Suppose you will be staying at the Hollywood Celebrity Hotel and want to find things to do while you stay there. You see that Grauman’s Chinese Theatre is less than half a mile south, well within walking distance. By tapping the orange button, you can immediately warp there to see the view.
Once you’re there, you can virtually walk down the street, or find a nearby restaurant where you can eat lunch.
Adding location data in 3D space
It is simple to render Google’s Street View imagery using their Maps API, but we also added a layer in our iPad app to display the rectangular pins showcasing our hotel, restaurant, and attraction ratings. Once we have the GPS coordinates of the pins, we need to calculate where onscreen to render the rectangles. Let's start with the GPS coordinate where the user is currently standing in the virtual viewport. In navigation, a common term is the azimuth: the angle on the ground plane, relative to true north, between the current location and some other location in the distance.
The azimuth angle can be calculated for each pin based on the arc-tangent of its difference in latitude with the Street View location divided by its difference in longitude with the Street View location, adjusting for quadrant:
LatDiff = PinLatitude - StreetViewLatitude
LngDiff = PinLongitude - StreetViewLongitude
if (LngDiff == 0)
    if (LatDiff < 0)
        Azimuth = π
    else
        Azimuth = 0
else
    AzimuthIntermediate = π/2 - arctan(LatDiff / LngDiff)
    if (LngDiff > 0)
        Azimuth = AzimuthIntermediate
    else
        Azimuth = AzimuthIntermediate + π
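The quadrant bookkeeping above can be expressed compactly with a two-argument arc-tangent. Here is a minimal Python sketch (the function name is ours, and treating latitude/longitude differences as a flat plane is an approximation that ignores how a degree of longitude shrinks with latitude — adequate for the short distances involved here):

```python
import math

def pin_azimuth(pin_lat, pin_lng, sv_lat, sv_lng):
    """Clockwise angle from true north, in radians [0, 2*pi).

    Treats latitude/longitude differences as a flat plane, which is
    a reasonable approximation at neighborhood scale.
    """
    lat_diff = pin_lat - sv_lat   # "north" component
    lng_diff = pin_lng - sv_lng   # "east" component
    # atan2 handles all quadrants, including lng_diff == 0
    return math.atan2(lng_diff, lat_diff) % (2 * math.pi)
```

A pin due east of the viewer (LatDiff = 0, LngDiff > 0) yields π/2, due south yields π, and due west yields 3π/2, matching the quadrant cases above.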
Once we have the azimuth angle for each pin, the next step is to find its horizontal location onscreen. Whenever the user rotates the Street View panorama left or right, we update a special "straight-ahead" azimuth representing an imaginary point directly ahead in the field of vision. The horizontal onscreen location is computed by comparing this straight-ahead azimuth with the pin azimuth. On the iPad in landscape orientation, we can assume that our field of vision spans roughly 70 degrees out of a possible 360. If the difference between the straight-ahead azimuth and the pin azimuth is greater than 70 degrees, we do not show the pin at all. Otherwise, we calculate the horizontal coordinate of the pin such that pins whose azimuth matches the straight-ahead azimuth appear in the exact center of the screen, and pins exactly 35 degrees to either side of the straight-ahead angle appear just on the edge of the screen.
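In code, the mapping from azimuth difference to a horizontal screen coordinate might look like the sketch below. The function name, the 1024-point landscape width, and the choice to cull exactly at the 35-degree half-field (rather than a looser threshold for partially visible pins) are illustrative assumptions:

```python
def pin_screen_x(pin_az_deg, ahead_az_deg, screen_width=1024, fov_deg=70.0):
    """Map a pin's azimuth to a horizontal screen coordinate.

    Returns None when the pin falls outside the field of vision.
    """
    # Signed difference wrapped to [-180, 180), so that 359 vs. 1 degree
    # counts as a 2-degree difference, not 358.
    delta = (pin_az_deg - ahead_az_deg + 180.0) % 360.0 - 180.0
    half_fov = fov_deg / 2.0  # 35 degrees on either side
    if abs(delta) > half_fov:
        return None  # outside the viewport; culled here for simplicity
    # delta == 0 lands at the exact center; delta == +/-35 at the edges.
    return screen_width / 2.0 + (delta / half_fov) * (screen_width / 2.0)
```

The wrap-around step matters: without it, a pin at azimuth 1 degree and a heading of 359 degrees would be treated as 358 degrees apart and never shown.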
The final step is to calculate the vertical location of each pin onscreen. Here we have a few options. We do not know the height off the ground of each hotel, restaurant, and attraction, so we need to simulate a plausible height value. One option is to keep a pin centered vertically when it is very far away and move it toward the top or bottom of the screen when it is very close to the Street View location. This has the effect of placing faraway locations right on the horizon. In practice, however, we never want two pins to overlap, and putting many pins on the horizon makes them overlap almost constantly. So the second option is to ignore pin distance entirely and stack pins vertically so that they never overlap and are clustered toward the center of the screen. With either option, if the user is allowed to tilt the Street View panorama up and down, then the vertical pin locations must be offset by the degree of tilt.
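The second option — stacking pins in alternating slots around the vertical center — can be sketched as follows. The slot ordering and the 40-point pin height are assumptions for illustration:

```python
def stacked_y_offsets(pin_count, pin_height=40):
    """Vertical offsets from screen center: 0, +h, -h, +2h, -2h, ...

    Pins never overlap, and they cluster toward the middle of the screen.
    """
    offsets = []
    for i in range(pin_count):
        slot = (i + 1) // 2          # 0, 1, 1, 2, 2, ...
        sign = 1 if i % 2 == 1 else -1
        offsets.append(sign * slot * pin_height)
    return offsets
```

Any panorama tilt is then applied as one additional uniform offset to every pin.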
Making use of device sensors
In the typical augmented reality experience, the user moves their phone or tablet around in real-time, looking through the camera viewport at their surroundings. The device compass, accelerometer, and gyroscope are used to calculate the direction and height in which the user is facing.
The device compass readings can tell us the straight-ahead azimuth angle as described in the previous section. That is, it tells us the degree by which the current forward direction differs from north. Keep in mind that the compass reading does not change at all based on how the user's phone or tablet is currently being held. Compass readings can be used to calculate the horizontal location of pins onscreen.
The device accelerometer readings tell us the orientation at which the user's phone or tablet is currently being held. If the user is holding their device perpendicular to the ground, then pins should appear in the center of the screen because the user is looking forward at the horizon. However, if the user is holding their device at a 45 degree angle facing downwards toward the ground, then pins might appear further towards the top of the screen. The accelerometer can also be used to determine whether pins should be rotated onscreen.
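A simple way to turn that pitch into a vertical position for pins is a linear mapping. In this sketch, the function name, the 45-degree vertical field of vision, and the 768-point landscape height are all assumptions, not the app's actual tuning:

```python
def pins_center_y(pitch_deg, screen_height=768, vfov_deg=45.0):
    """Vertical center for pins, given the device pitch.

    pitch_deg is 0 when the device is held perpendicular to the ground
    (the user is looking at the horizon) and positive when tilted down
    toward the ground; tilting down pushes pins toward the top of the
    screen, as if the horizon were rising in the viewport.
    """
    half_vfov = vfov_deg / 2.0
    offset = (pitch_deg / half_vfov) * (screen_height / 2.0)
    return screen_height / 2.0 - offset
```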
The device gyroscope can be used to determine how quickly a user is currently rotating their phone or tablet. A common complaint with augmented reality apps is that the pins onscreen move around very frequently and are overly sensitive to device movement. By observing gyroscope events, the effect of the accelerometer on pin movement can be dampened whenever device movement is rapid. A gyroscope is available on iOS starting with the iPhone 4.
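One simple way to implement that dampening is a low-pass filter whose smoothing factor shrinks as the rotation rate grows. The constants below are illustrative, not our actual tuning:

```python
def damped_position(previous, target, rotation_rate, base_alpha=0.5):
    """Low-pass filter whose responsiveness drops as rotation speeds up.

    rotation_rate is the magnitude of the gyroscope reading (rad/s);
    a fast rotation yields a small alpha, so pins glide instead of
    jittering, while a still device converges quickly on the target.
    """
    alpha = base_alpha / (1.0 + abs(rotation_rate))
    return previous + alpha * (target - previous)
```

Called once per sensor update, this moves a pin halfway to its target when the device is still, but only a twentieth of the way during a fast 9 rad/s rotation.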
For project "OwlCam", we use Google Street View in place of a device camera, and so the default user expectation is to be able to control the viewport by touching the screen. However, a special option in the settings enables “Compass Mode”, which uses the iPad’s internal compass to track movement within Street View.
Tip: To enable Compass Mode, go to Settings -> TripAdvisor -> Street View -> Compass Mode -> ON, and then return to the TripAdvisor app. Within Street View, rotate your iPad around to control the camera.
In the future, we expect that augmented reality will become even more prevalent on mobile and tablet devices, especially for online travel planning. We are very excited at TripAdvisor to be on the leading edge of this technology.