
Real-time Sky Hiatus

The output from my real-time sky cam remains inconsistent. The color of the sky was supposed to set the background color of this blog... but I'm having second thoughts.

I was expecting shades of blue, but ended up with yellows and browns... assaults on the eyes that are too much to bear. Until I can muster a better color-sensing solution, I'm returning the blog background to its original light gray. In the meantime, the sky camera will continue to send its uninspired color values up to Pachube every minute.

And so, failure: The whole idea was to lighten up and relinquish some aesthetic control to an unknowing third party. It was more than I could handle. For shame.

December 28 2009 at 11 PM

Bargaining Chip

December 27 2009 at 2 AM

It's Video Art

André Callot's pithy survey of video art. Via Ben Kolak.

December 23 2009 at 5 PM

Real-time Sky

For the com lab final, I decided to revisit the blog-improvement assignment — it might have been low-hanging fruit given my prior experience on these fronts, but I wanted to try something new: tapping real-time sensor data to determine the aesthetic properties of my website.

I spend such an inordinate amount of time in front of screens that bringing a bit of the outside world onto said screens seemed like a nice thing to do (if just a bit sad).

So I decided to set up a camera pointed at the sky, 24/7, and use the average sky color to set the background on this website.

First, I needed a camera. I scrounged up an old webcam, and put it in a small tupperware container since it was going to have to brave the elements on my windowsill.

Camera in tupperware enclosure

Camera in tupperware enclosure with lid

Next up was how to get data from the camera to the web. I had just found out about a website called Pachube (confoundingly pronounced "patch bay"), which provides a platform for the hosting and exchange of small bits of sensor data — ideal for this application. So I set up an input feed on Pachube to receive and store the sky color values.

I whipped up a Processing application to read from the webcam, average the pixel values, and send the result up to Pachube every minute. (The EEML Library for Processing made interfacing with Pachube completely painless.)

Getting the data out of Pachube and into my Drupal blog needed another piece of glue code: the Pachube PHP Library. I created a Drupal module to wrap up the library, and another small module to query their servers, grab the current color value, and pass it on to a bit of JavaScript, which sets the background color via jQuery.
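For reference, the fetch-and-extract step amounts to pulling a single value element out of the feed's EEML document. Here's a rough sketch in Python rather than PHP (the element names and the sample document below are my assumptions about EEML's general shape, not the actual output of the library or of feed 3929):

```python
import xml.etree.ElementTree as ET

def extract_sky_color(eeml_xml):
    """Pull the current datastream value out of an EEML-style feed document.
    Namespace prefixes, if present, are stripped before matching tag names."""
    root = ET.fromstring(eeml_xml)
    for element in root.iter():
        if element.tag.split('}')[-1] == 'value':
            return element.text
    return None

# A simplified stand-in for the feed document; the real EEML carries
# more metadata and an XML namespace declaration.
sample = """<eeml>
  <environment>
    <data id="0">
      <tag>sky</tag>
      <value>4A6B8C</value>
    </data>
  </environment>
</eeml>"""

print(extract_sky_color(sample))  # the hex color string to hand off to the JavaScript side
```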

Right now the background only updates on refresh (since the data changes quite slowly) but it wouldn't be much more work to set up a timer and some AJAX to fetch the latest value at a regular interval.

With the software working...

Sky camera monitoring software

And the camera situated...

Sky camera in windowsill

I was ready for the data to start coming in. My webcam had other ideas, though. It turns out that even with auto-exposure and auto-gain turned on, the sensor doesn't have enough dynamic range to capture the sky with any kind of accuracy. The sky is just too bright during the day, and too dark at night. There are a few interesting color values at dawn and dusk, but otherwise it's either solid black or solid white.
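To put a number on that failure mode, one could measure how much of a frame sits pinned at the extremes of the 8-bit range. A quick sketch in Python (this is an illustration of the check, not part of the logger; the thresholds are my own guesses):

```python
def clipped_fraction(pixels, low=5, high=250):
    """Fraction of 8-bit samples pinned near black or white.
    A frame that's mostly clipped carries no usable color information."""
    clipped = sum(1 for p in pixels if p <= low or p >= high)
    return clipped / len(pixels)

daytime = [255] * 90 + [128] * 10   # blown-out sky with a few mid tones
print(clipped_fraction(daytime))    # 0.9 -> mostly useless
```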

Here's a look at tonight's sunset (the top of the gradient is about 6:00 PM ET, the bottom is 2:00 PM ET; the sun officially set at 4:30). There's about an hour of meaningful color data between the blown-out day and solid-black night. I'm investigating more sensitive camera options, in hopes of getting meaningful values for more of the day.

4 Hours of sky cam data

Here's the Processing code. It's a bit verbose, since I entertained a few what-if scenarios and drew out some portions for legibility's sake. The low frame rate helps keep CPU usage (and therefore CPU temperature and fan noise) within the realm of rationality.

// Sky color logger.

// Takes a photograph of the sky every second, then finds and stores the average color value.
// Each minute, average the one-second averages to find the average sky color over the last minute.
// Upload the result to Pachube (http://www.pachube.com/) as a hexadecimal color value.
// Built with the EEML Library for Processing, available at http://www.eeml.org/library/

import processing.video.*;
import eeml.*;

DataOut dataOut;
Capture cam;

int pixelCount;
color averageColor;
color minuteAverageColor;
int lastMinute;

int minuteAverageRed = 0;
int minuteAverageGreen = 0;
int minuteAverageBlue = 0;
int averageRed = 0;
int averageGreen = 0;
int averageBlue = 0;
int sampleCount = 0;

int[] averages; // Stores recent averages.

void setup() {
  size(640, 480);
  noStroke();
  frameRate(2);

  // Set up communication with Pachube.
  String feedURL = "http://www.pachube.com/api/feeds/3929.xml";
  String apiKey = "YOUR PACHUBE API KEY HERE";
  dataOut = new DataOut(this, feedURL, apiKey);
  dataOut.addData(0, "sky,color");

  // Set up the camera.
  String[] devices = Capture.list();
  println(devices);
  cam = new Capture(this, 320, 240, devices[1]);
  cam.frameRate(1); // Read 1 frame per second.
  pixelCount = cam.width * cam.height;

  // Take note of the time. We will upload once per minute.
  lastMinute = minute();

  // Store recent averages in an array.
  averages = new int[240];

  // Fill the averages array with white to start.
  for (int i = 0; i < averages.length; i++) averages[i] = 255;
}

void draw() {
  background(0);

  // Show the live camera feed in the top left.
  image(cam, 0, 0);

  // Show the latest average color in the top right.
  fill(averageColor);
  rect(320, 0, 320, 240);

  // Show the last minute average in the bottom left.
  // This is what was actually sent to Pachube.
  fill(minuteAverageColor);
  rect(0, 240, 320, 240);

  // Draw the recent averages as horizontal lines in the bottom right.
  // The latest values are pushed onto the top of the stack.
  for (int i = 0; i < averages.length; i++) {
    stroke(averages[i]);
    strokeWeight(1);
    line(320, 240 + i, 640, 240 + i);
  }
  noStroke();

  // Aggregate the averages and send to Pachube every minute.
  if ((minute() != lastMinute) && (sampleCount > 0)) {

    // Find the aggregated average for each minute.
    minuteAverageRed = minuteAverageRed / sampleCount;
    minuteAverageGreen = minuteAverageGreen / sampleCount;
    minuteAverageBlue = minuteAverageBlue / sampleCount;

    minuteAverageColor = color(minuteAverageRed, minuteAverageGreen, minuteAverageBlue);

    // Convert to hex and send to Pachube.
    println("Sending sky value " + hex(minuteAverageColor, 6) + " to Pachube.");
    dataOut.update(0, hex(minuteAverageColor, 6));
    int response = dataOut.updatePachube();
    println("Pachube response: " + response); // Should be 200 if successful; 401 if unauthorized; 404 if feed doesn't exist.

    // Shift values in the averages array over one, and then add the latest to the beginning.
    for (int i = averages.length - 1; i > 0; i--) averages[i] = averages[i - 1];
    averages[0] = minuteAverageColor;

    // Reset the minute averages.
    minuteAverageRed = 0;
    minuteAverageGreen = 0;
    minuteAverageBlue = 0;

    // Reset the sample count.
    sampleCount = 0;

    // Reset the minute mark.
    lastMinute = minute();
  }
}

void captureEvent(Capture cam) {
  // Get the latest camera data.
  cam.read();
  cam.loadPixels();

  // Reset the averages for this frame.
  averageRed = 0;
  averageGreen = 0;
  averageBlue = 0;

  // Average the pixels.
  int pixelColor;

  for (int i = 0; i < pixelCount; i++) {
    pixelColor = cam.pixels[i];

    // Extract the RGB values.
    averageRed += (pixelColor >> 16) & 0xff;
    averageGreen += (pixelColor >> 8) & 0xff;
    averageBlue += pixelColor & 0xff;
  }

  // Take the average.
  averageRed = averageRed / pixelCount;
  averageGreen = averageGreen / pixelCount;
  averageBlue = averageBlue / pixelCount;

  // Note this average for the aggregated average taken at the minute mark.
  // It would be simpler not to store every single frame's average, and instead to average
  // a minute's worth of pixel data (e.g. average = averageChannel / (pixelCount * sampleCount))
  // but for the sake of the display it's useful to have average data from the latest frame as well.
  minuteAverageRed += averageRed;
  minuteAverageGreen += averageGreen;
  minuteAverageBlue += averageBlue;

  averageColor = color(averageRed, averageGreen, averageBlue);

  // Increment the sample count... this is what we use to take the average
  // at the minute mark. With the camera at 1 FPS, it should always be 60, but
  // in case of timing issues it's worth keeping track.
  sampleCount++;
}

December 14 2009 at 2 AM

SimCity 2000 Sprite Extraction

In preparation for an AfterEffects project, I set out to free the SimCity 2000 building tiles from their binary confines. I wanted every building in the game on a transparent background. Here's the final working set I ended up with:

Getting all of the assets in shape for AfterEffects turned out to be a bit of an ordeal. My initial plan was to do something clever, like writing a parser for the mysterious ".mif" file format in which Maxis embedded their tile data. A thorough Googling turned up no documentation or existing hacks for the file type. Poking through the bytes with my trusty hex editor also failed to bring enlightenment. Since MIF is ostensibly a graphics format, in a final act of desperation I attempted to interpret the bytes as pixels via Photoshop, again with no success. It's not even an autostereogram:

Failed processing of SimCity 2000 MIF tileset file

(This dead-end detour did bring to mind the need for a tool that can take the friction out of rapidly reinterpreting large binary files as pixel data... being able to "tune" the dimensions / header length / byte order of the image parser in real-time whilst looking for the emergence of some kind of pattern or representational content in the pixels could be useful, or at least of conceptual interest. I'll add it to the project queue.)
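The core of such a tool is just a reshape: skip an adjustable header, then cut the remaining bytes into rows of a tunable width. A minimal Python sketch of that idea (function and parameter names are mine, and the real-time tuning and display are left out):

```python
def bytes_to_rows(data, width, header=0):
    """Reinterpret a blob of bytes as rows of 8-bit grayscale pixels.
    Tuning `width` and `header` is what makes hidden structure snap into alignment."""
    body = data[header:]
    rows = len(body) // width  # discard any trailing partial row
    return [list(body[r * width:(r + 1) * width]) for r in range(rows)]

# 10 bytes with a 2-byte header, viewed 4 pixels wide -> 2 complete rows.
blob = bytes([0xDE, 0xAD, 1, 2, 3, 4, 5, 6, 7, 8])
print(bytes_to_rows(blob, width=4, header=2))  # [[1, 2, 3, 4], [5, 6, 7, 8]]
```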

In the end I resorted to a mandraulic approach: selecting each building in the Urban Renewal Kit, exporting a BMP for each, and sending the results through a number of Photoshop batches to place them on transparency, resize, etc.

Not exactly clever, but it worked.

December 3 2009 at 10 PM

Real Time Omniscience

Our survey of live video services last week proved that there's plenty of content out there, but the bottleneck seemed to be a lack of discoverability in the interface. None of the sites we looked at really gave a sense of how much was happening at once, or provided a way to sift through the piles of live video to find something of relevance or interest. The major live webcam sites seem stuck in the YouTube / catalog approach. Luckily for us, JustinTV hosts an API that makes building experimental interfaces relatively painless. (Ustream offers a similar but slightly less robust API.)

I wanted to see whether bandwidth / API terms-of-service / human perception would allow for a massive grid of live feeds. The answer turns out to be "sort of." I wrote some code to take the most active JustinTV channels in the "lifecasting" category, and dump them into a grid.

The results are available here.

In case the system breaks, or I run through my API allocation, a video of the system in action is available below:

A few other re-workings of the catalog approach are of interest.

5000 Webcams places live feeds in geographical context on a Google Map. (Also available as a grid view.)

November 19 2009 at 12 PM

Toast

225 degree CPU

November 13 2009 at 10 PM

Storyboard

Arturo, Julio, Tamar, and I put together a storyboard for our Communications Lab narrative video. Click above for a semi-decipherable full-res version.

November 9 2009 at 12 AM

Danger Mouse

OO Mouse

If it's not a prank, this mouse has to be the most chilling example thus far in support of dictatorship over direct democracy in matters of design.

The open source community has delivered millions of lines of code to the world, and I love it dearly for this. But the movement seems completely devoid of aesthetic sensibility or restraint. Even the Ubuntu project, which tries so hard to get usability right (and does, to some extent), still serves up a watery approximation of Windows and an excess of configuration options.

Maybe things will improve as more non-coders learn to code. (Or maybe this will just do to code what the open-source developers did to design.)

(Via Daring Fireball.)

November 8 2009 at 9 PM

Lab: DC Motor Control

This week's lab covered DC motor control with an H-bridge, an IC that makes it possible to control a motor's direction. Applying power to C1 sends current through the motor in one direction, while applying power to C2 sends it through in the opposite direction. Setting both C1 and C2 high has a braking effect on the motor.

H Bridge Schematic
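That control scheme boils down to a small truth table. Sketched here in Python for clarity (the state labels are my own shorthand; the lab itself drives the real pins from the microcontroller):

```python
def h_bridge_state(c1, c2):
    """Map the two H-bridge control inputs to the motor's behavior."""
    if c1 and not c2:
        return "forward"
    if c2 and not c1:
        return "reverse"
    if c1 and c2:
        return "brake"   # both sides high: the motor terminals are shorted together
    return "coast"       # both low: the motor spins freely

print(h_bridge_state(True, False))  # forward
```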

I added a potentiometer to the lab's circuit to allow speed control. The pot controlled a PWM signal going to the H-bridge's enable pin. Power to the motor came courtesy of a 9V wall wart.

Motor Lab

Everything worked fine:

November 8 2009 at 7 PM