Frontier Nerds: An ITP Blog

Sequential Story

Poram and I illustrated an October 8th exchange on the ITP student email list involving the theft and retrieval of a student’s mug.

Update: Get the backstory and ongoing drama surrounding Clay Shirky’s mug thievery at Sarah Dahnke’s official blog on the topic.

Sequential Story Part 1 Sequential Story Part 2 Sequential Story Part 3 Sequential Story Part 4 Sequential Story Part 5 Sequential Story Part 6 Sequential Story Part 7 Sequential Story Part 8

Here’s the PSD File (~5 MB).

Many of the images used in the even frames were under a Creative Commons license.

Clay Shirky’s head: Pop!Tech T-Shirt
Body: Lee-Sean
Here Comes Everybody book cover: Serdal
Masking tape: Bixentro
Beer taps: Digiart2001
Arturo photo: Unknown

Street Level

I’m intrigued by the possibilities for new interfaces that things like Google Street View open up. I was expecting full walkable VR setups and that sort of thing, but the projects that have actually been implemented are a bit more humble.


Here’s Tokyo Jogging, a clever (if low-frame-rate) pairing of Street View and the WiiMote.


Stringing together Street View frames to create animation:


Stweet pairs Tweets with Street View.

Screenshot of stweet showing 6th Avenue in Manhattan


Pseudo-augmented reality panning interface. (This one’s pretty old, and while it makes a good demo, actually using it might be too much work compared to the iPhone’s more practical implementation.)


New York Then and Now

Screenshot of NYPL project New York Then and Now


Finally, I wanted to explore retracing a GPS-logged route using Street View for my response to Red’s M5 bus assignment. The results were a bit strange, but I think they bring up some of the tension between actually being somewhere and settling for Google’s approximation.

If anyone’s interested in exploring Street View in code, here’s a bit of Processing code I wrote which will attempt to download and composite the panoramic image for any given address or lat / lon pair.
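The compositing step boils down to pasting fixed-size square tiles into a grid. Here’s a minimal sketch of that placement math (the 512-pixel tile size and the grid dimensions are assumptions for illustration, not values from my code or from Google):

```cpp
#include <cassert>

// Assumed tile size; Street View serves panoramas as square tiles.
const int TILE_SIZE = 512;

struct Offset { int x; int y; };

// Pixel offset of a tile within the composited panorama,
// given its (column, row) position in the grid.
Offset tileOffset(int col, int row) {
    return Offset{ col * TILE_SIZE, row * TILE_SIZE };
}

// Total pixel size of the composite for a cols-by-rows grid of tiles.
Offset panoramaSize(int cols, int rows) {
    return Offset{ cols * TILE_SIZE, rows * TILE_SIZE };
}
```

From there it’s just a matter of drawing each downloaded tile into a blank image at its computed offset.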

Lab: Servo Motor Control

Next came the servo lab.

I ended up using an LDR as an analog input to set the position of the servo.

An Arduino connected to a breadboard connected to a servo

Then I switched to a pot for finer control over the servo position. Worked great as a makeshift camera panning platform:


Update: The old long-form pulse method ended up being really useful, since my Digital Combination Lock needed to use a servo and the tone library at the same time. It turns out that both the servo library and the tone library use up the Arduino’s available timers… so the only way to use both simultaneously was to fall back to the pulse method outlined in 4.1.
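The long-form pulse method just means generating the servo’s control pulse by hand: hold the pin high for roughly 1–2 ms depending on the target angle, then keep it low for the rest of a ~20 ms frame. A sketch of the timing math, with Arduino’s map()-style scaling reimplemented so it stands alone (the 1000–2000 µs range is the typical hobby-servo spec; check your servo’s datasheet):

```cpp
#include <cassert>

// Arduino-style integer range mapping.
long mapRange(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// Pulse width in microseconds for a servo angle of 0-180 degrees.
// Typical hobby servos expect ~1000 us (0 deg) to ~2000 us (180 deg).
long angleToPulseMicros(long angle) {
    return mapRange(angle, 0, 180, 1000, 2000);
}

// In the Arduino sketch itself, each ~20 ms servo frame would look like:
//   digitalWrite(servoPin, HIGH);
//   delayMicroseconds(angleToPulseMicros(angle));
//   digitalWrite(servoPin, LOW);
//   delay(20);  // rest of the frame
```

Since this uses no hardware timers, it coexists happily with the tone library.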

Lab: Tone Output

Time for the tone lab.

An Arduino connected to a breadboard with photoresistors and a speaker

  1. The photoresistors gave me an input range from 33 to 668.

     I could only scrounge up a 16 ohm speaker. It was really quiet, so I substituted a 220 ohm resistor in place of the 1K ohm resistor specified in the schematic. This gave the volume a bump.

  2. Basic tone production using a photocell-based voltage divider

  3. Playing a series of tones
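With the sensor’s observed floor and ceiling in hand, turning a reading into a pitch is one map()-style call. A sketch using the 33–668 range above; the 120–1500 Hz output range is an assumption of mine, not a value from the lab:

```cpp
#include <cassert>

// Arduino-style integer range mapping.
long mapRange(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// Scale a photocell reading (observed range 33-668) into a tone
// frequency. The 120-1500 Hz bounds are arbitrary audible limits.
long readingToFrequency(long reading) {
    return mapRange(reading, 33, 668, 120, 1500);
}

// On the Arduino this would drive the speaker with something like:
//   tone(speakerPin, readingToFrequency(analogRead(sensorPin)));
```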

M5

Video

M5 bus route as seen by Google Street View, with each panorama synced to the GPS log from my trip and time-scaled to 1/100 its original duration.

Concept

Beyond the changing neighborhoods, demographics, architecture, gentrification, etc., the thing that struck me most about the M5 was the pace and aesthetics of the trip. The bus is a truly strange thing to take in New York — particularly the M5, which one could easily substitute for a 1 train. Buses are slower than the subway if you’re heading uptown or downtown, and often slower than walking if you need to go crosstown. The contrast between the lazy pace of the bus (we averaged 7 mph) and the rush of the city outside the window was disorienting.

Yet taking the bus connects the rider with the city. Conversations are overheard, ridership changes from neighborhood to neighborhood, stops are missed, etc. I wanted to use Google’s images to explore and subvert this connection between person and place. Living in New York represents a commitment to place and physicality — it’s overpriced, it’s cramped, it’s a lot of work — but we deal with it for the sake of “being there” and sharing the experience. Google’s Street View suggests that we don’t have to leave our chairs to see a place, and flattening Google’s 360° panoramas to fit in a rectangular frame (as done for the video) takes this further, suggesting that we don’t even have to swivel our heads.

Essentially, I used Google backwards. Instead of using Street View for reconnaissance or as an alternative to making an actual trip, I used their service to outsource the documentation of the bus ride I had already taken. The result is a complete departure from the actual experience: An hour and 40 minutes spent on a bus in the physical world is compressed into a 60-second approximation; a digital copy of my analog trip on the M5.

Path of GPS log overlaid on a map of Manhattan

Technique

I ran a geo-tracking application on my iPhone for the length of the bus ride. After the trip, I downloaded all of the lat / lon / time coordinates and fed them into a Processing program I built to place the points on a map and interpolate points in between those captured by the phone. (In this manner, the 447 original waypoints from the trip were interpolated into 1490 film frames.) I was careful to keep the time coding intact, so that the pace of the video — while accelerated — would be true to the pace of the bus ride. From there, I sent the data to another bit of Processing code written to query Google and download, composite, and save the imagery from their Street View feature for every lat / lon point logged in the previous step. Next, one more Processing program glued the stills together in sequence to create a video.
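The interpolation step can be sketched as plain linear interpolation between consecutive timestamped fixes: for each film frame’s timestamp, find the two surrounding waypoints and blend their coordinates. A minimal version (the struct and field names are my own, not from the original Processing code):

```cpp
#include <cassert>
#include <cmath>

struct Waypoint {
    double lat, lon;
    double t;  // seconds since the start of the ride
};

// Linearly interpolate a position at time t between two logged fixes.
// Blending by timestamp, rather than by point index, is what keeps
// the video's pacing true to the pacing of the actual ride.
Waypoint interpolate(const Waypoint& a, const Waypoint& b, double t) {
    double f = (t - a.t) / (b.t - a.t);  // 0 at a, 1 at b
    return Waypoint{
        a.lat + f * (b.lat - a.lat),
        a.lon + f * (b.lon - a.lon),
        t
    };
}
```

Linear blending over such short hops is a fine approximation; at city-block scale the error from treating lat / lon as a flat plane is negligible.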

I’ll post my code shortly in case it’s of use to other ITP students or others. The Street View-scraping program, in particular, demonstrates how trivial (and handy) it is to eavesdrop on HTTP headers and reverse-engineer a simple API like the one Google uses to load Street View source images.

Oh, and the source images, I imagine, are © Google.

Outcome

The resulting video wasn’t quite what I expected — my attempt to capture the halting pace of the trip by adjusting the frame rate to reflect the speed of the bus at different points along the trip ended up looking more like a bandwidth-related glitch than an intentional effect. Also, the GPS signal I used to map the points was pretty noisy, and tall buildings cut the signal strength down considerably. In midtown, the position jumps around, shaking the resulting video and cutting from block to block, destroying the sense of forward motion — in one sense it’s an honest reflection of midtown’s packed streets, but in another sense there’s something satisfying about such a security-camera-filled and densely networked part of town falling off the GPS radar.