<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
    <id>https://frontiernerds.com</id>
    <title>Frontier Nerds</title>
    <updated>2012-04-09T16:51:39.000Z</updated>
    <generator>https://github.com/jpmonette/feed</generator>
    <author>
        <name>Eric Mika</name>
        <email>eric@ericmika.com</email>
        <uri>https://ericmika.com</uri>
    </author>
    <link rel="alternate" href="https://frontiernerds.com"/>
    <link rel="self" href="https://frontiernerds.com/feed.xml"/>
    <subtitle>Frontier Nerds is a blog documenting Eric Mika's work at NYU's ITP masters program between the Fall of 2009 and the Spring of 2011.</subtitle>
    <rights>CC BY-NC-SA 3.0 US</rights>
    <entry>
        <title type="html"><![CDATA[Strata 2012: Roll Your Own Front End]]></title>
        <id>https://frontiernerds.com/strata-2012</id>
        <link href="https://frontiernerds.com/strata-2012"/>
        <updated>2012-04-09T16:51:39.000Z</updated>
        <content type="html"><![CDATA[<p><img src="https://frontiernerds.com/assets/strata-presentation.Dshcrdk0_Z2e9zna.webp" alt="Grid of framework logos"></p>
<p><a href="https://web.archive.org/web/20120110180344/http://www.noisederived.com/portfolio/">Michael Edgcumbe</a> and I gave a <a href="https://web.archive.org/web/20120627080023/http://strataconf.com/strata2012/public/schedule/detail/22358">presentation at the O’Reilly Strata Conference</a> last month that takes stock of the myriad (and amazing) framework options available to creative coders and data visualization developers.</p>
<p>The <a href="https://frontiernerds.com/strata">slide deck</a> and the <a href="https://youtu.be/B7qwUhFhxlg">example zoo</a> are online. Thanks to everyone who attended and provided feedback.</p>
<p>Also, the source code for Michael’s sample app in Processing is <a href="https://frontiernerds.com/files/stratatalk_processing_v4.zip">available for download</a>.</p>]]></content>
        <published>2012-04-09T16:51:39.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[Winter Show 2011]]></title>
        <id>https://frontiernerds.com/winter-show-2011</id>
        <link href="https://frontiernerds.com/winter-show-2011"/>
        <updated>2011-12-20T05:56:37.000Z</updated>
        <content type="html"><![CDATA[<p>Lots of nice work at the <a href="https://web.archive.org/web/20240603102628/https://itp.nyu.edu/shows/winter2011/category/projects/">ITP winter show</a> this year.</p>
<p>Here are some personal favorites:</p>
<h3>Written World</h3>
<p><em>Zach Schwartz, Martín Bravo, Nara Kasbergen, Rachel Slaff, and Ruth Spencer</em></p>
<p><img src="https://frontiernerds.com/assets/writtenworld.CvEM7G_m_k87DI.webp" alt="Screenshot of text layered over a map"></p>
<p>Brilliant mashup of Google Maps and the not-as-famous-as-it-should-be <a href="https://www.yourworldoftext.com/home/">Your World of Text</a>. Written World lets you leave persistent messages on a map in real time. I really think it needs an address search bar to make it easier to explore other parts of the world — and being able to zoom out and see the density of conversation around the globe would be amazing. Regardless, it’s a great idea and the execution is polished. <a href="https://zachschwartz.com/work/writtenworld/">Go try it!</a></p>
<hr>
<h3>Fade Away</h3>
<p><em>Matt Richardson</em></p>
<p><img src="https://frontiernerds.com/assets/laser-draw-thing.BB9xFi6G_Z29C11l.webp" alt="A laser leaving trails of letters on a phosphorescent screen"> Photo: <a href="https://www.flickr.com/photos/tigoe/6539507285">Tom Igoe</a></p>
<p>A servo-mounted laser methodically draws text (tweets) on a phosphorescent screen. The glow-in-the-dark screen holds a record of the laser’s path. So nice to get away from flickering screens (even if we can’t get away from tweets).</p>
<hr>
<h3>Cavendish Trebuchet</h3>
<p><em>Lia Martinez, William Jennings</em></p>
<p>A tiny catapult as an interface for cueing projected clips of bananas hitting a wall. Simple, but instantly gratifying.</p>
<hr>
<h3>Contextual Interface for Remote Gestural Input</h3>
<p><em>Gavin Hackeling, Zena Koo</em></p>
<p><img src="https://frontiernerds.com/assets/cifrgi.B7l8sedx_ZGxA1X.webp" alt="Drawing on a screen remotely via an iPhone"></p>
<p>Wonderful side-stepping of QR-driven AR by using real images as tags in lieu of <a href="https://justinsomnia.org/2011/03/why-does-that-qr-code-take-me-to-justinsomnia-org/">robot barf</a>. Watch the <a href="https://vimeo.com/33092926">video</a>.</p>
<hr>
<h3>SoZen</h3>
<p><em>Luis Palacios, Kaitlin Till-Landry, and Peter Darche</em></p>
<p><img src="https://frontiernerds.com/assets/sozen.DdWjtRw3_2mgkXN.webp" alt="A marble tracing a path through sand"></p>
<p>An XY table that draws in a pseudo-zen garden sand pit. Instead of putting a pen or pencil on the business end of an XY table, SoZen’s creators use a strong magnet to drag a ferrous ball through the sand to build up a drawing. Making an XY table is a PComp rite of passage, but this is a great take on it and a nice use of magnetic indirection to keep the mechanism out of sight and mind.</p>
<hr>
<h3>Solid</h3>
<p><em>Ivana Basic</em></p>
<p>A plaster sculpture slowly destroys itself from the inside out in the absence of attention. More art should behave like a suburban adolescent.</p>
<hr>
<h3>Archived Visualizations</h3>
<p><em>Kate Watson</em></p>
<p><img src="https://frontiernerds.com/assets/archives.LpGrDhH2_u6D8V.webp" alt="Sketches from the ITP archives"></p>
<p>Finally, someone braves the ITP thesis archives. It’s the only real hope for deconstructing what, exactly, ITP is / was, while removing all excuses for creating the same projects over and over. Looking forward to seeing the evolution of Archived Visualizations next semester.</p>]]></content>
        <published>2011-12-20T05:56:37.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[Kim Jong Il Looked At Things]]></title>
        <id>https://frontiernerds.com/kim-jong-il-looked-at-things</id>
        <link href="https://frontiernerds.com/kim-jong-il-looked-at-things"/>
        <updated>2011-12-19T13:20:10.000Z</updated>
        <content type="html"><![CDATA[<p><img src="https://frontiernerds.com/assets/kim_jong_il_looked_at_things.PPAny2QW_Z2wVPYl.webp" alt="A rapid sequence of photographs of Kim Jong Il looking at things, centered on his face"></p>
<p>It’s an animation built from a <a href="http://kimjongillookingatthings.tumblr.com/">tumblr blog</a> documenting Kim Jong Il looking at things.</p>]]></content>
        <category label="dear leader" term="dear-leader"/>
        <category label="animated gif" term="animated-gif"/>
        <published>2011-12-19T13:20:10.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[Animated Gif (Single Frame)]]></title>
        <id>https://frontiernerds.com/single-frame</id>
        <link href="https://frontiernerds.com/single-frame"/>
        <updated>2011-10-28T22:43:43.000Z</updated>
        <content type="html"><![CDATA[<p><img src="https://frontiernerds.com/assets/animated_gif_single_frame.B3Gvk6op_Z1wPci1.webp" alt="A single-frame gif animation that appears as a solid black rectangle"></p>
<p><em>Animated Gif (Single Frame) – Eric Mika, 2011</em></p>
<p>It’s a single-frame animated gif.</p>
<p>The file was hand-written, per <a href="https://www.w3.org/Graphics/GIF/spec-gif89a.txt">the GIF file specification</a>. Although the original specification accommodates basic animation, this particular file uses an extension of the original format created and first supported by Netscape 2.0. Netscape’s addition remains out of the official spec — but it seems to be the approach favored by most animated gifs around the web. While <em>Animated Gif (Single Frame)</em> may not follow CompuServe’s canonical format to the byte, it’s aligned instead with the vernacular of the real web. It renders, after all. The image above is simply solid black and won’t appear to animate, but the file is structured in a way that identifies itself as a looping animation. The parser rendering it in your browser is therefore cycling between Frame 1 and Frame 1 every second, indefinitely.</p>
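<p>As a quick aside (not part of the original piece), the Netscape looping block is easy to pick out of the bytes programmatically. A minimal Python sketch, assuming nothing beyond the block layout described above:</p>
<pre><code>def netscape_loop_count(data):
    """Return the loop count declared by a GIF's Netscape application
    extension, or None if the extension is absent.

    Layout: 0x21 0xFF (extension introducer), block size 0x0B,
    "NETSCAPE2.0", then a sub-block: size 0x03, id 0x01,
    a little-endian uint16 loop count, and a 0x00 terminator.
    A count of 0 means loop forever.
    """
    marker = b"\x21\xff\x0bNETSCAPE2.0"
    i = data.find(marker)
    if i == -1:
        return None  # plain GIF, no animation/looping extension
    sub = i + len(marker)
    if data[sub] != 0x03 or data[sub + 1] != 0x01:
        raise ValueError("malformed Netscape sub-block")
    return int.from_bytes(data[sub + 2:sub + 4], "little")
</code></pre>
<p>Run against the bytes below, it reports a loop count of 0xFFFF (65,535 iterations).</p>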
<p>It is intended as a “last work” for an exhausted medium.</p>
<p>Source code:</p>
<pre><code>47 49 46 38 39 61 00 02 56 01 80 00 00 00 00 00 FF FF FF 21 FF 0B 4E 45 54 53 43 41 50 45 32 2E 30 03 01 FF FF 00 21 F9 04 04 64 00 00 00 2C 00 00 00 00 00 02 56 01 00 02 FF 84 8F A9 CB ED 0F A3 9C B4 DA 8B B3 DE BC FB 0F 86 E2 48 96 E6 89 A6 EA CA B6 EE 0B C7 F2 4C D7 F6 8D E7 FA CE F7 FE 0F 0C 0A 87 C4 A2 F1 88 4C 2A 97 CC A6 F3 09 8D 4A A7 D4 AA F5 8A CD 6A B7 DC AE F7 0B 0E 8B C7 E4 B2 F9 8C 4E AB D7 EC B6 FB 0D 8F CB E7 F4 BA FD 8E CF EB F7 FC BE FF 0F 18 28 38 48 58 68 78 88 98 A8 B8 C8 D8 E8 F8 08 19 29 39 49 59 69 79 89 99 A9 B9 C9 D9 E9 F9 09 1A 2A 3A 4A 5A 6A 7A 8A 9A AA BA CA DA EA FA 0A 1B 2B 3B 4B 5B 6B 7B 8B 9B AB BB CB DB EB FB 0B 1C 2C 3C 4C 5C 6C 7C 8C 9C AC BC CC DC EC FC 0C 1D 2D 3D 4D 5D 6D 7D 8D 9D AD BD CD DD ED FD 0D 1E 2E 3E 4E 5E 6E 7E 8E 9E AE BE CE DE EE FE 0E 1F 2F 3F 4F 5F 6F 7F 8F 9F AF BF CF DF EF FF 0F 30 A0 C0 81 04 0B 1A 3C 88 30 A1 C2 85 0C 1B 3A 7C 08 31 A2 C4 89 14 2B 5A BC 88 31 A3 C6 8D FF 1C 3B 7A FC 08 32 A4 C8 91 24 4B 9A 3C 89 32 A5 CA 95 2C 5B BA 7C 09 33 A6 CC 99 34 6B DA BC 89 33 A7 CE 9D 3C 7B FA FC 09 34 A8 D0 A1 44 8B 1A 3D 8A 34 A9 D2 A5 4C 9B 3A 7D 0A 35 AA D4 A9 54 AB 5A BD 8A 35 AB D6 AD 5C BB 7A FD 0A 36 AC D8 B1 64 CB 9A 3D 8B 36 AD DA B5 6C DB BA 7D 0B 37 AE DC B9 74 EB DA BD 8B 37 AF DE BD 7C FB FA FD 0B 38 B0 E0 C1 84 0B 1B 3E 8C 38 B1 E2 C5 8C 1B 3B 7E 0C 39 B2 E4 C9 94 2B 5B BE 8C 39 B3 E6 CD 9C 3B 7B FE 0C 3A B4 E8 D1 A4 4B 9B 3E 8D 3A B5 EA D5 AC 5B BB 7E 0D 3B B6 EC D9 B4 6B DB BE 8D 3B B7 EE DD BC 7B FB FE 0D 3C B8 F0 E1 C4 8B 1B 3F 8E 3C B9 F2 E5 CC 9B 3B 7F 0E 3D BA F4 E9 D4 AB 5B BF 8E 3D BB F6 ED DC BB 7B FF 0E 3E BC F8 F1 E4 CB 9B 3F 8F 3E BD FA F5 EC DB BB 7F 0F 3F BE FC F9 F4 EB DB BF 8F 3F BF FE FD FC FB FB 6D FF 0F 60 80 02 0E 48 60 81 06 1E 88 60 82 0A 2E C8 60 83 0E 3E 08 61 84 12 4E 48 61 85 16 5E 88 61 86 1A 6E C8 61 87 1E 7E 08 62 88 22 8E 48 62 89 26 9E 88 62 8A 2A AE C8 62 8B 2E BE 08 63 8C 32 CE 48 63 8D 36 DE 88 63 8E 3A EE C8 63 8F 3E FE 08 64 90 42 0E 49 64 91 46 1E 89 64 
92 4A 2E C9 64 93 4E 3E 09 65 94 52 4E C9 4A 01 00 21 FE 2E 41 4E 49 4D 41 54 45 44 20 47 49 46 20 28 53 49 4E 47 4C 45 20 46 52 41 4D 45 29 20 2D 20 45 52 49 43 20 4D 49 4B 41 20 32 30 31 31 00 3B
</code></pre>
<p>(Bytes responsible for animation in red.)</p>]]></content>
        <category label="rock bottom" term="rock-bottom"/>
        <category label="last work" term="last-work"/>
        <category label="animated gif" term="animated-gif"/>
        <published>2011-10-28T22:43:43.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[Flapping Toasters]]></title>
        <id>https://frontiernerds.com/flapping-toasters</id>
        <link href="https://frontiernerds.com/flapping-toasters"/>
        <updated>2011-04-05T06:02:53.000Z</updated>
        <content type="html"><![CDATA[<p><img src="https://frontiernerds.com/assets/after-dark-disk.CzLe439f_1w1zp3.webp" alt="After Dark 3.5&#x22; diskette"></p>
<h3>Concept</h3>
<p>Flapping Toasters is an absurdist reconsideration of the classic screensaver <em>Flying Toasters</em>. You embody the toaster, flapping your arms to stay aloft, clapping to release toast, and tilting to roll away from danger. Created in collaboration with <a href="https://web.archive.org/web/20100716013636/http://neocyde.net/derive/">Michael Edgcumbe</a>.</p>
<h3>Video</h3>
<h3>Screenshots</h3>
<p><a href="https://frontiernerds.com/assets/flapping-toasters-basic-flapping-screenshot.NG6ECsQO_Z4J6d7.webp"><img src="https://frontiernerds.com/assets/flapping-toasters-basic-flapping-screenshot.NG6ECsQO_Z4J6d7.webp" alt="Screenshot of basic toaster flapping gesture"></a> <a href="https://frontiernerds.com/assets/flapping-toasters-thumb.C6-_nodh_Yakvl.webp"><img src="https://frontiernerds.com/assets/flapping-toasters-thumb.C6-_nodh_Yakvl.webp" alt="Screenshot of multi-player toaster flapping"></a> <a href="https://frontiernerds.com/assets/flapping-toasters-tilt-roll-screenshot.CuoylySW_ZPT3Kk.webp"><img src="https://frontiernerds.com/assets/flapping-toasters-tilt-roll-screenshot.CuoylySW_ZPT3Kk.webp" alt="Screenshot of tilting to roll the toaster"></a> <a href="https://frontiernerds.com/assets/flapping-toasters-clapping-for-toast-screenshot.AhbcerRW_1jgP22.webp"><img src="https://frontiernerds.com/assets/flapping-toasters-clapping-for-toast-screenshot.AhbcerRW_1jgP22.webp" alt="Screenshot of clapping gesture to release toast"></a></p>
<p>Clockwise from top left: Basic flapping / Multi-player / Tilting to roll / Clapping to release toast.</p>
<h3>Background</h3>
<p>The screensaver is forbidden territory for those seeking to elevate the status of new media art. A screensaver is lowbrow, unserious, and fetishizes aesthetics over content. I’ve explored this hostility a bit further in <a href="https://frontiernerds.com/scarlet-s">another post</a>. At the same time, early screensavers represent a raw, unpretentious play with the screen’s possibilities combined with a novel set of constraints. (E.g. a screensaver must have no static content, run for an unlimited duration, etc.)</p>
<p>Michael and I decided to embrace the kitsch, and compound the absurdity of the ever-criticized screensaver with the hype and widespread obsession surrounding the Kinect. Beyond trying to show some love for the screensaver, we were conscious of how many camera + presence driven “interactive artworks” present an interaction model that reduces the participant to maniacal arm waving and overwrought gestures designed to satisfy a finicky computer vision algorithm. We wanted to poke some fun at this trope by creating an interaction model that explicitly demands the participant be reduced to crazed arm-flapping.</p>
<h3>Process / Implementation</h3>
<p>We built the software in openFrameworks. Since we wanted skeleton tracking, we used the ofxOpenNI plugin instead of ofxKinect, which only provides RGB-D data.</p>
<p>Detecting flaps was a bit trickier than expected: there are issues of ranging, direction detection, correlating the animation to the arm position, detecting direction reversals, and so on. We ended up solving these with brute force, tracking the velocity of the hand points, filtering heavily, and setting up a dynamic bounding system that allows people with various degrees of enthusiasm / gesticulation to fly successfully.</p>
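<p>For illustration, the gist of that brute-force approach (a heavily filtered velocity plus direction-reversal detection) might look like the following Python sketch. It’s not our actual openFrameworks code; names and thresholds are invented:</p>
<pre><code>class FlapDetector:
    """Counts arm flaps from a stream of hand heights.

    Screen coordinates: y grows downward, so a down-stroke has
    positive velocity. A flap is counted when the filtered velocity
    reverses from a down-stroke to an up-stroke.
    """

    def __init__(self, alpha=0.3, threshold=2.0):
        self.alpha = alpha          # low-pass filter factor (1.0 = no filtering)
        self.threshold = threshold  # minimum filtered speed to count a stroke
        self.prev_y = None
        self.velocity = 0.0
        self.in_downstroke = False

    def update(self, hand_y):
        """Feed one frame; returns True on the frame a flap completes."""
        if self.prev_y is None:
            self.prev_y = hand_y
            return False
        raw = hand_y - self.prev_y
        self.prev_y = hand_y
        # heavy filtering suppresses skeleton-tracking jitter
        self.velocity = self.alpha * raw + (1 - self.alpha) * self.velocity
        if self.velocity > self.threshold:
            self.in_downstroke = True
        elif self.in_downstroke and self.velocity < -self.threshold:
            self.in_downstroke = False  # reversal: the flap is complete
            return True
        return False
</code></pre>
<p>A dynamic bounding system would additionally adapt the threshold to each player’s observed range of motion.</p>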
<p>We also added support for rolling in both directions. The player initiates a roll by tilting their arms in either direction — we used an angle threshold on an imaginary line drawn between the hand points to trigger it.</p>
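<p>The angle-threshold trigger is simple enough to sketch in a few lines of Python (illustrative constants, not the shipped code):</p>
<pre><code>import math

def roll_direction(left_hand, right_hand, threshold_deg=25.0):
    """Tilt of the imaginary line between the two hand points.

    Returns 1 (roll right), -1 (roll left), or 0 (level).
    Screen coordinates: y grows downward, so a positive angle
    means the right hand is lower than the left.
    """
    dx = right_hand[0] - left_hand[0]
    dy = right_hand[1] - left_hand[1]
    angle = math.degrees(math.atan2(dy, dx))
    if angle > threshold_deg:
        return 1
    if angle < -threshold_deg:
        return -1
    return 0
</code></pre>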
<p>Life as a toaster would be unfulfilling without the opportunity to toast bread, so Michael implemented hand clap detection to trigger the release of golden-brown toast.</p>
<p>Ideally there would be some kind of system for recording sample motions, saving them, and then automatically firing an event when a similar motion is detected along with a confidence value. I’m sure this kind of thing exists on industrial-strength Kinect development platforms (and must have existed somewhere in the Wii’s development stack for ages). Would be great to have a similar library for OF or Processing… something for the to-do list.</p>
<p>OpenNI makes tracking multiple players relatively trivial, so we implemented support for up to 15 simultaneous flappers (OpenNI’s internal limit).</p>
<p>The most recent version of the <a href="https://github.com/gameoverhack/ofxOpenNI">ofxOpenNI</a> plugin includes support for OpenNI’s native hand tracking algorithms. They weren’t available during our initial development, so we used the raw skeleton data instead. These algorithms seem a bit more reliable for locating hands than the skeleton points alone, particularly when hands cross the torso, go out of frame, and so on — essentially, OpenNI uses a more generous set of inferences to give you the best guess about hand position.</p>
<p>Since Flapping Toasters is all about hand tracking, this new feature sounded like a great fit. So, I tried a quick reimplementation of Flapping Toasters using these points instead of the end of the lower arm joint. It turned out not to work with more than one player, since I couldn’t find a way to associate two hands with a single person. For example, if three hands are detected, I had no way of knowing which two belonged to one person, and to whom the third might belong. There might be a way, but it wasn’t immediately obvious from the API.</p>
<p>The graphics are scraped sprites from After Dark’s original Flying Toasters. We created PNGs from each frame of animation and re-implemented the basic design of the screensaver.</p>
<p>The final element, which was surprisingly the hardest to get right, was the physics simulation. It’s your basic gravity / acceleration / velocity simulation, but getting the power of a flap just right was tricky. It still doesn’t feel quite like I imagined. Flapping is a bit too much work, but it’s also too easy to flap all the way off the top of the screen. The solution probably involves spending a bit more time with the sliders to get the numbers just right, as well as taking into account the vigor of the flap. (E.g. use some combination of average arm velocity during the down-stroke and the relative size of the flap as a coefficient to the flap power value.)</p>
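<p>For the curious, the flap-vigor idea reduces to something like this Python sketch (all constants are invented for illustration, not pulled from the actual app):</p>
<pre><code>GRAVITY = 980.0  # px/s^2, pulls the toaster down (y grows downward)

def flap_impulse(avg_downstroke_speed, flap_extent, base_power=420.0):
    """Scale flap strength by the vigor of the gesture: the average
    arm speed during the down-stroke and the relative size of the
    flap act as a coefficient on a base power value."""
    vigor = min(1.0, (avg_downstroke_speed / 600.0) * flap_extent)
    return base_power * vigor

def step(y, vy, dt, impulse=0.0):
    """One Euler integration step of the toaster's vertical motion."""
    vy += GRAVITY * dt - impulse
    y += vy * dt
    return y, vy
</code></pre>
<p>Tuning then becomes a matter of adjusting the base power and the vigor scaling until a lazy flap barely keeps you aloft and a vigorous one doesn’t launch you off-screen.</p>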
<p>After Dark’s version of flying toasters included two possible soundtracks — Wagner’s <em>Ride of the Valkyries</em>, and a slightly less compelling theme song with <a href="https://www.flickr.com/photos/afroswede/496454/">karaoke-style lyrics</a>. Michael and I opted for Wagner, and rendered out a midi version of the song as a WAV for use in openFrameworks. I’d have preferred to make the recording using the midi synths from a Macintosh of the appropriate vintage, but we just couldn’t get access to one in time for the class deadline.</p>
<h3>Improvements</h3>
<p>After flapping for a few minutes, most people ask what they’re trying to do. We haven’t implemented any kind of game logic — it’s just you, the toaster, and the joy of flap-induced flight. In one sense it’s nice to keep things minimal. But if we really wanted a game dynamic, there are a bunch of possibilities.</p>
<p>Some kind of collision detection between players could make things more interesting. You’re also given an unlimited supply of toast. Perhaps you could try to “catch” un-toasted pieces of bread, carry them for an amount of time (toasting them in the process) before clapping to release. Carry them for too long, and you’ll burn the toast. (The horror!)</p>
<h3>Download / Source Code</h3>
<p>The application was written in <a href="http://openframeworks.cc/">openFrameworks</a>. <a href="https://github.com/kitschpatrol/flapping-toasters">The source code is available on GitHub</a>. See the repository’s readme file for additional notes on development / compilation. A Mac OS binary is also included in the GitHub repo.</p>]]></content>
        <category label="kinect" term="kinect"/>
        <category label="screensaver" term="screensaver"/>
        <category label="kitsch" term="kitsch"/>
        <published>2011-04-05T06:02:53.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[The Two Cultures]]></title>
        <id>https://frontiernerds.com/two-cultures</id>
        <link href="https://frontiernerds.com/two-cultures"/>
        <updated>2011-04-02T00:18:10.000Z</updated>
        <content type="html"><![CDATA[<p><img src="https://frontiernerds.com/assets/spectrum.WL05L9CO_ZQe4XV.webp" alt="Electromagnetic spectrum poster"> <img src="https://frontiernerds.com/assets/guggenheim.CmNR9Fqa_Z1c9qSz.webp" alt="Interior of Guggenheim museum"></p>
<p><a href="https://frontiernerds.com/courses/analog-circuits">9 AM class</a> on the left. <a href="https://frontiernerds.com/courses/conceptual-art">3 PM class</a> on the right.</p>]]></content>
        <published>2011-04-02T00:18:10.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[The Scarlet S]]></title>
        <id>https://frontiernerds.com/scarlet-s</id>
        <link href="https://frontiernerds.com/scarlet-s"/>
        <updated>2011-03-31T06:29:39.000Z</updated>
        <content type="html"><![CDATA[<p><img src="https://frontiernerds.com/assets/cloud-toast-2.D3Sh3V34_Z1SKKpW.webp" alt="Toasters flying amidst Mario Brothers clouds"></p>
<p>The remark I remember most from my <a href="https://vimeo.com/2668463">undergraduate thesis</a> critique came from a fellow student. A film major, I think.</p>
<blockquote>
<p>“I’d like to have this as a screensaver!”</p>
</blockquote>
<p>She meant it as a compliment, but at the time I took offense. I thought the screensaver was the epitome of digital banality.</p>
<p>But in the last few years, I’ve come around a bit.</p>
<p>I think the late 80s / early 90s era screensavers now read as a refreshing double-negation of the recent 8 bit / animated gif fixation. (Although I’m not sure if there’s a way to revisit / reconsider the After Dark heyday without collapsing in a pile of insincerity.)</p>
<p>I can’t think of a course or context at ITP where calling something out as screensaver-esque would go over particularly well (although opportunities abound). The more general new media community seems divided on the aesthetic / contextual parallels between screen-based art and screensavers.</p>
<p>Some forsake the association, others mine the medium for nostalgic kitsch, and a few embrace the idea and use the screensaver as a means of distributing their work. I expected most to fall into the first category, but some Googling around on the topic suggests that NMSAS (New Media Screensaver Anxiety Syndrome) doesn’t run as deep as I initially expected.</p>
<p>Nevertheless, some choice quotes emerged from the sift. Sources are linked, and if anyone wants to comment / update their opinion — some of the quotes are years old — I’m game for a conversation.</p>
<p>Let’s open the floor with three quotes from <a href="https://web.archive.org/web/20090304173625/https://www.unlekker.net/">Marius Watz</a>:</p>
<blockquote>
<p>“The curse of generative art: ‘So you make screensavers?’” – <a href="https://web.archive.org/web/20130914102057/http://www.scribd.com/doc/21918141/20091030-Grafill-Edit-09-Marius-Watz">Marius Watz</a></p>
</blockquote>
<blockquote>
<p>“To the frustration of many digital artists, screensavers have much in common with generative art. They often rely on some kind of ruleset to allow infinite animation and to avoid burning a single image into the screen because of repetition. Many classic screensavers use mathematical formulas like Bezier or Lissajous curves. But most screensavers are created by programmers, not designers, hence the bad reputation they have as cultural artifacts.” – <a href="https://web.archive.org/web/20071016191514/http://workshop.evolutionzone.com/2007/09/19/aho-mission-2-make-a-screensaver/">Marius Watz</a></p>
</blockquote>
<blockquote>
<p>“As for the complaint that generative art is simply decorative, fit only for screensavers or wallpaper patterns, it is hardly worth answering. Such a position would invalidate most of art history.” – <a href="http://digicult.it/design/marius-watz-code-art-aesthetic/">Marius Watz</a></p>
</blockquote>
<blockquote>
<p>“The other hurdle is the ‘screen saver’ comparison. Society has chosen to consider screen savers with very little regard — they are temporary visuals. Another challenge for the legitimacy of this type of art.” – <a href="https://frontiernerds.com/files/fe_2003_sacks_en.pdf">Steve Sacks</a></p>
</blockquote>
<blockquote>
<p>“Interactive art usually presents more abstract and complex concepts but has terrible interface / interaction. Ok so ‘that’s not the point of the work’ you say. But what irks me is that there is very little work that addresses this. As a result many people will dismiss much interactive art as just a screensaver or digital toy.” – <a href="https://frontiernerds.com/files/lab3dwed3dreader.pdf">Tom Betts</a></p>
</blockquote>
<blockquote>
<p>“We don’t want it to look like an iTunes screensaver.” – <a href="https://operaofthefuture.com/tag/media-lab/">Matt Checkowski</a> on the Opera of the Future</p>
</blockquote>
<blockquote>
<p>“This generative ‘art’ seemed better suited to screensavers or abstract desktops than canvases — a fact confirmed by Davis’s own Reflect app for the iPhone.” – Daniel West on Joshua Davis (<a href="https://www.qbn.com/topics/599544/">flame war</a> ensues)</p>
</blockquote>
<blockquote>
<p>“But for me it is just a screen saver, since there is no story.” – <a href="https://dspace.mit.edu/handle/1721.1/60338">Mauro Martino</a></p>
</blockquote>
<blockquote>
<p>“Neither [Casey Reas nor Sol LeWitt] creates interactive works per se, but they are touchstones for anyone interested in the algorithmic art as something other than a screen saver.” – <a href="https://www.oreilly.com/library/view/programming-interactivity/9780596800598/">Joshua Noble</a></p>
</blockquote>
<blockquote>
<p>“The idea behind a work can sometimes be more compelling than what actually appears on the screen. And for viewers without a thorough grounding in technology — or advanced math — the most innovative visual programs can seem like little more than high-end screensavers.” – <a href="https://www.ekac.org/forbesarticle.html">Susan Delson</a></p>
</blockquote>
<blockquote>
<p>“<em>Clouds</em> also became a popular desktop and screen saver at some point. I read it on the Internet, like, “Here’s instructions on how to take this and make a screen saver.” I just surfed on it a while back. I was like, Wow, it probably would look nice on the desktop or whatever.” – <a href="https://www.eai.org/resourceguide/exhibition/installation/interview_arcangel.html">Cory Arcangel</a></p>
</blockquote>]]></content>
        <category label="kitsch" term="kitsch"/>
        <category label="screensaver" term="screensaver"/>
        <category label="found art" term="found-art"/>
        <category label="theory" term="theory"/>
        <published>2011-03-31T06:29:39.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[Research on Relative Motion Tracking]]></title>
        <id>https://frontiernerds.com/research-on-relative-motion-tracking</id>
        <link href="https://frontiernerds.com/research-on-relative-motion-tracking"/>
        <updated>2011-02-09T17:19:52.000Z</updated>
        <content type="html"><![CDATA[<h3>Project Context</h3>
<p>My thesis project has undergone a major shift in the last week. I’m moving away from the post-apocalyptic pirate internet, and towards something completely different: A means of projecting content onto surfaces that makes the projection appear intrinsic to the surface.</p>
<p>Imagine a hand-held projector that you can sweep across a room, kind of like a flashlight. As it moves, the projected content appears stuck to the wall, the floor, etc. For example, you could add something to the scene in a particular location — a bit of text, perhaps.</p>
<p>After adding the text, you could sweep the projector to a different part of the wall. The text would appear to go out of view once it left the throw-area of the projector, but if you were to move the projector back towards the spot where you initially added the text, you would see the words come back into view. The words are stuck to their environment — the projection is just an incidental way of exploring the space and revealing its content.</p>
<p>Two recent technologies make this a particularly ripe time for the project. The Kinect gives cheap 3D scene information, which can improve the quality of motion tracking and automate the projection mapping process. And new pico-projectors can run on battery power and weigh significantly less than their conference-table counterparts, so carrying a projector around and using it to explore a space is no longer an entirely ridiculous proposition.</p>
<p>This whole idea, which I’m currently calling Thesis II (for personal reasons), will be written up in more detail soon.</p>
<h3>Fronts of Inquiry</h3>
<p>The creative challenge for the next twelve weeks is to conceive of and build an application that demonstrates the usefulness and creative possibilities of this tool.</p>
<p>The technical challenges are twofold. First, I need a way to track the relative motion between the projector and the projection surface (generally a wall) — I’ll refer to this as relative motion tracking. Second, I need a way to dynamically distort the projected image to match the geometry of the projection surface. This is similar in concept to projection mapping, except the projection surface isn’t static. I’ll call this dynamic projection mapping. The calculations for both of these steps need to happen in less than 20 milliseconds if the effect is going to work and feel fluid.</p>
<p>Other people are <a href="https://web.archive.org/web/20180310193202/https://forum.libcinder.org/topic/projection-mapping-with-kinect">already working on dynamic projection mapping</a>, and from a technical standpoint it’s both more familiar ground and less essential to the final project than relative motion tracking. Where projection mapping is “nice to have” and will contribute significantly to the quality of the project, the technology the project depends on to work at all is relative motion tracking. So, this paper will focus on research into means of relative motion tracking, and which (if any) existing open-source projects could be adapted for this application.</p>
<h3>Similar Projects</h3>
<p>At the most basic level, I need to find a way to take a camera feed and determine how content in the scene is moving. Traditionally, this is called camera tracking — a form of deriving <a href="https://3dsav.blogspot.com/2011/02/week-1-intro-to-3d.html">structure from motion</a>. The process goes something like this: First, the software identifies feature points within each frame — these are generally areas of high contrast, which are relatively easy to pick out algorithmically. On the next frame, the software finds another batch of feature points, and then does correspondence analysis between the feature points in the most recent frame and those in the previous frame. From this information, the <a href="https://web.archive.org/web/20120708174221/http://www.digilab.uni-hannover.de/docs/manual.html">movement of the camera can be inferred</a>. (E.g. if a feature point is at pixel [5, 100] in frame one, and then moves to pixel [10, 80] in frame two, we can guess that the camera shifted about [5, -20] between frames.) It’s a bit more complicated than that because of the parallax effect — points closer to the camera will appear to move more than points further away. The software can take this into account, and build a rough point cloud of the scene.</p>
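<p>Glossing over the correspondence analysis itself, the shift-inference step can be sketched in a few lines of Python (a hypothetical, simplified illustration; it ignores parallax entirely):</p>
<pre><code>def estimate_shift(prev_points, curr_points):
    """Estimate apparent camera shift from matched feature points.

    Takes the median displacement across all matches, which is
    robust to a handful of bad correspondences. Ignores parallax:
    in reality, nearer points appear to move more than distant ones.
    """
    dxs = sorted(c[0] - p[0] for p, c in zip(prev_points, curr_points))
    dys = sorted(c[1] - p[1] for p, c in zip(prev_points, curr_points))
    mid = len(dxs) // 2
    return dxs[mid], dys[mid]
</code></pre>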
<p>This process has applications in special effects and film / post-production. If you have a shot with a lot of camera movement, and you need to add an explosion to the scene, camera tracking gives exactly the information you need to position the explosion in a believable way from frame to frame. Because of this demand, there are a few über-expensive closed-source software packages designed to perform camera tracking reliably. <a href="https://web.archive.org/web/20120624031713/http://www.vicon.com/boujou/">Boujou</a>, for example, sets you back about $10,000. There is, however, a free and open-source option called <a href="https://www.robots.ox.ac.uk/~gk/PTAM/">PTAM</a> (Parallel Tracking and Mapping for Small AR Workspaces), which can perform similar tracking.</p>
<h3>Caveats</h3>
<p>The PTAM code seems like the right starting point for my own adaptation of this concept, but there are a few caveats that make me nervous about just how much of a head start the code will give me. First, PTAM and similar camera tracking software is designed for use on high-contrast two-dimensional RGB bitmaps — basic still film frames. In contrast, the grayscale depth map coming from the Kinect is relatively low contrast, and areas of high contrast are probably best avoided in the feature detection process, since they represent noisy edges between depths. I probably will not be able to use the Kinect’s RGB data, because it’s going to be filled with artifacts from the projection. Also, since the Kinect already gives us a point cloud, I don’t need any of the depth-calculation features from PTAM. Because of these issues, I will probably start work by skimming through the PTAM source code to get an idea of their approach to the implementation, and then seeing how PTAM behaves when fed the grayscale depth map from a Kinect. From there, I will probably start experimenting with simpler feature extraction and tracking algorithms in Processing that make the most of the Kinect’s depth data. (This code would be destined for an eventual port to C++.)</p>]]></content>
        <published>2011-02-09T17:19:52.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[Rough Thesis Production Schedule]]></title>
        <id>https://frontiernerds.com/rough-thesis-production-schedule</id>
        <link href="https://frontiernerds.com/rough-thesis-production-schedule"/>
        <updated>2011-02-09T14:49:53.000Z</updated>
        <content type="html"><![CDATA[<p>A bit of optimism…</p>
<p><a href="https://frontiernerds.com/assets/thesis-feb.-qpsaeup_Bp8Qq.webp"><img src="https://frontiernerds.com/assets/thesis-feb.-qpsaeup_Bp8Qq.webp" alt="Thesis February"></a> <a href="https://frontiernerds.com/assets/thesis-march.BZoGYnyc_ZVDf6g.webp"><img src="https://frontiernerds.com/assets/thesis-march.BZoGYnyc_ZVDf6g.webp" alt="Thesis March"></a> <a href="https://frontiernerds.com/assets/thesis-april.CfmyLKZf_jmC39.webp"><img src="https://frontiernerds.com/assets/thesis-april.CfmyLKZf_jmC39.webp" alt="Thesis April"></a> <a href="https://frontiernerds.com/assets/thesis-may.CocDyG5D_Z5Won0.webp"><img src="https://frontiernerds.com/assets/thesis-may.CocDyG5D_Z5Won0.webp" alt="Thesis May"></a></p>]]></content>
        <published>2011-02-09T14:49:53.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[Loose bedside inventory]]></title>
        <id>https://frontiernerds.com/loose-bedside-inventory</id>
        <link href="https://frontiernerds.com/loose-bedside-inventory"/>
        <updated>2011-02-09T14:06:09.000Z</updated>
        <content type="html"><![CDATA[<p><em>This is an experimental essay composed over winter break</em></p>
<p>One white cap, striated, counter-clockwise, childproof. One inch, three tenths, twenty hundredths and seven thousandths in diameter, seven tenths, eighty hundredths and one thousandth in height. Twenty-four extra-strength Excedrin, caplets. Seven hundred ninety-four thousandths in length. Three hundred eighty five thousandths in width. Two hundred forty-one thousandths in height. Engraved with an “E”, does not stand for ecstasy. Super-dome like in cross section.</p>
<p>One white label, three thousandths in thickness affixed to soviet-green translucent bottle. Promises for headaches, colds, arthritis, muscle aches, sinusitis, toothache, premenstrual and menstrual cramps delineated with bullets. You can call them at 800 468 7746. Possibly made in New Jersey.</p>
<p>Clear plastic blister on printed cardboard backing, nineteen thousandths thick. Contents formerly four, now two. Cylindrical, five-hundred fifty-six thousandths in diameter. Length measures one thousand nine hundred and ninety-one thousandths, and one point five volts. The positive nip, two hundred and twelve thousandths in diameter. Forty-four thousandths in height. Do not connect improperly. Made in U.S.A. Ne pas installer de maniére inapproprie. Fabriqué aux É-U.</p>
<p>Sixteen thousandths of cardboard, folded over. One thousand five hundred twenty-one thousandths in width. One thousand eight hundred ninety-two thousandths in height (closed). Tapered, one hundred twenty three thousandths at one end, two hundred sixty nine at the other. Profile like an arrow loop. Contents formerly twenty, now sixteen. Tip contains red phosphorus, potassium chlorate, sulfur and starch, a neutralizer, siliceous filler, diatomite and glue. Certain family members consider this a delicacy. Made in New Haven.</p>]]></content>
        <published>2011-02-09T14:06:09.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[Godspeed, Comp Cameras]]></title>
        <id>https://frontiernerds.com/godspeed-comp-cameras</id>
        <link href="https://frontiernerds.com/godspeed-comp-cameras"/>
        <updated>2011-02-09T13:29:54.000Z</updated>
        <content type="html"><![CDATA[<p>Computational Cameras and I have parted ways. I’m sure I’ll end up doing my share of pixel munging as I start work on Thesis II.</p>]]></content>
        <published>2011-02-09T13:29:54.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[Ten Face-Related Ideas and One Implementation]]></title>
        <id>https://frontiernerds.com/ten-face-related-ideas-and-one-implementation</id>
        <link href="https://frontiernerds.com/ten-face-related-ideas-and-one-implementation"/>
        <updated>2011-02-01T16:52:52.000Z</updated>
        <content type="html"><![CDATA[<h3>Film Faceprints</h3>
<p>(implemented and shown below) Run face detection on film frames, grabbing full-size frames in which non-face areas are masked off. Average these frames together to generate a single-frame representation of presence and characters over the duration of the film. This leaves you with a kind of thumbprint of the film and it’s characters. The results are kind of anticlimactic, there are only vague shadows of faces. A failed experiment, but brings to mind some more interesting directions of approach to the content.</p>
<p>(Animating the accumulation of the average, scaling all of the faces to the same dimension before averaging — or maybe ditching the averaging idea and trying a grid arrangement that would reduce a film’s narrative to a series of faces.)</p>
<p><img src="https://frontiernerds.com/assets/titicut-follies.BV6U_oAn_Z23JqiV.webp" alt="Blurry accumulated face image from Titicut Follies"></p>
<p><a href="https://frontiernerds.com/assets/abe-lincoln.BKv3gI2__ZonIj9.webp"><img src="https://frontiernerds.com/assets/abe-lincoln.BKv3gI2__ZonIj9.webp" alt="Abe Lincoln"></a> <a href="https://frontiernerds.com/assets/yellow-submarine.qk-E8DOG_Znpdbx.webp"><img src="https://frontiernerds.com/assets/yellow-submarine.qk-E8DOG_Znpdbx.webp" alt="Yellow Submarine"></a></p>
<p><a href="https://www.imdb.com/title/tt0062374/">Titicut Follies</a> is at top. Bottom left is an excerpt from a film about Abraham Lincoln, bottom right is <a href="https://www.imdb.com/title/tt0063823/">Yellow Submarine</a>.</p>
<p>Perhaps more interesting are the algorithm’s leftovers. As it runs, the latest faces are dumped into a buffer and drawn to the screen. A couple of averages in progress are shown below:</p>
<p><a href="https://frontiernerds.com/assets/titicut-in-progress.BY6nWRTb_Z1m31zC.webp"><img src="https://frontiernerds.com/assets/titicut-in-progress.BY6nWRTb_Z1m31zC.webp" alt="Titicut Follies in progress"></a> <a href="https://frontiernerds.com/assets/yellow-submarine-in-progress.B1nClnoH_Z1QPIF9.webp"><img src="https://frontiernerds.com/assets/yellow-submarine-in-progress.B1nClnoH_Z1QPIF9.webp" alt="Yellow Submarine in progress"></a></p>
<p>Here is the <a href="https://frontiernerds.com/files/video_face.pde">rather messy source code</a>.</p>
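<p>For the curious, the accumulation at the heart of the sketch boils down to a running per-pixel mean. Here is a minimal, stripped-down illustration of just that step in plain Java — the names are my own, and the face detection and masking from the full sketch are omitted:</p>
<pre><code class="language-java">// Running per-pixel average of masked frames.
// Illustrative only; see the full sketch for the face-detection half.
public class RunningAverage {
  float[] avg;
  int count = 0;

  RunningAverage(int size) {
    avg = new float[size];
  }

  // Fold one frame's pixel values into the running mean.
  void accumulate(float[] frame) {
    count++;
    for (int i = frame.length - 1; i >= 0; i--) {
      avg[i] += (frame[i] - avg[i]) / count;
    }
  }
}
</code></pre>
<p>Every frame ends up with an equal 1 / n weight in the final image, which is why hundreds of frames wash out into the vague shadows shown above.</p>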
<h3>Quantify Contact</h3>
<p>Run face detection on the contents of your computer screen. Log how often faces are encountered in web browsing / photo editing / whatever. In this way the relative loneliness of extended sessions in front of a machine could be quantified.</p>
<h3>Curb Paranoia</h3>
<p>Implement a face-detecting and obfuscating filter at a very low level (somewhere in the camera driver, probably). Pseudo-privacy protection</p>
<h3>Quantifibate</h3>
<p>Run face detection on your laptop’s camera all the time. Since computers tend to be left on, “uptime” doesn’t say much about the hours per day sunk into these machines. Face detection could give more accurate statistics about presence / attention.</p>
<h3>Tenso</h3>
<p>Automate the <a href="https://knowyourmeme.com/memes/face-swap">face swapping / tenso meme</a>.</p>
<h3>Almost Face</h3>
<p>Go through large sets of face-tagged images (an iPhoto library, for example) and hand-pick all of the false positives to build a collection of almost-faces.</p>
<h3>Street View</h3>
<p>Process Google Street View panoramas for faces. Hit rate might be a bit low since Google blurs faces, but it would be interesting to build a map of geolocated faces.</p>]]></content>
        <category label="face recognition" term="face-recognition"/>
        <published>2011-02-01T16:52:52.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[Street View Automatic]]></title>
        <id>https://frontiernerds.com/street-view-automatic</id>
        <link href="https://frontiernerds.com/street-view-automatic"/>
        <updated>2011-01-30T07:33:39.000Z</updated>
        <content type="html"><![CDATA[<p>Why is it always daytime in Google Street View?</p>

<p>The disagreement between Street View’s 100% ratio of light to dark and my window’s less optimistic 50% ratio has been particularly jarring lately. What a tax on our brittle circadian rhythms!</p>
<p>I have created a <a href="https://en.wikipedia.org/wiki/Bookmarklet">bookmarklet</a> to solve the simpler (street view) half of this disparity. Now, you can push a button to instantly cast any Street View scene into a weak approximation of darkness. The degree of night is based on what time it actually is in the corner of the world you’re viewing, combined with information on when the sun will rise or set.</p>
<p>An open Street View window, left unattended, will now cycle from day, to night, and back again, indefinitely. No longer will you leave the house under the false promise of daylight at your destination.</p>
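<p>The core day / night mapping can be sketched roughly like this. This is plain Java rather than the bookmarklet’s actual JavaScript, and the names and the 60-minute twilight ramp are illustrative assumptions, not the real constants:</p>
<pre><code class="language-java">public class DarknessDemo {
  // Times are minutes since local midnight (sunrise / sunset as reported
  // by a service like Earthtools). Returns 0.0 for full day, 1.0 for full night.
  static double darkness(int now, int sunrise, int sunset) {
    int twilight = 60; // length of the dawn / dusk ramp, in minutes (assumed)

    if (sunrise - twilight >= now) return 1.0; // deep night, before dawn
    if (sunrise > now) return (sunrise - now) / (double) twilight; // dawn ramp
    if (sunset >= now) return 0.0; // daytime
    if (sunset + twilight > now) return (now - sunset) / (double) twilight; // dusk ramp
    return 1.0; // deep night, after dusk
  }
}
</code></pre>
<p>The resulting factor could then drive the opacity of a dark overlay on the Street View tiles.</p>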
<p>The algorithm is operating on the four sample scenes above. If you’d like to give it a try, drag and drop the link below into your bookmarks bar (for quick access) or right click and add it to your bookmarks (for less obtrusive access).</p>
<hr>
<p><em>drag and drop the below link to your bookmarks bar</em></p>
<p>Street View Automatic</p>
<p><em>drag and drop the above link to your bookmarks bar</em></p>
<hr>
<p>Next, navigate to <a href="https://www.google.com/maps">Google Maps</a>, and get into a Street View as you would otherwise. Once the view has loaded, give the new <em>Street View Automatic</em> link in your bookmarks bar a click to show the scene in its true (and current) light. Of course, if it’s actually daytime, you won’t see much change at all. The code also won’t work on embedded maps or portable devices.</p>
<p>My thanks to Jonathan Stott of <a href="https://web.archive.org/web/20120722111516/http://www.earthtools.org/webservices.htm">Earthtools</a> for making his excellent lat / lon to local time and sunrise / sunset API services available free of charge to the public.</p>]]></content>
        <category label="street view" term="street-view"/>
        <category label="maps" term="maps"/>
        <category label="time" term="time"/>
        <category label="sky" term="sky"/>
        <category label="night" term="night"/>
        <published>2011-01-30T07:33:39.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[Your World of Text]]></title>
        <id>https://frontiernerds.com/world-of-text</id>
        <link href="https://frontiernerds.com/world-of-text"/>
        <updated>2011-01-27T02:51:13.000Z</updated>
        <content type="html"><![CDATA[<p>I spent twenty minutes trying to remember the name of this brilliant, unmoderated, real-time, infinitely-large canvas of collaborative and anti-collaborative text. It’s <a href="https://www.yourworldoftext.com/home/">Your World of Text</a> by <a href="https://andrewbadr.com/">Andrew Badr</a>.</p>
<p><a href="https://frontiernerds.com/assets/your-world-of-text-screenshot-2025-02-27.hRdRvc1c_Z21q2j7.webp"><img src="https://frontiernerds.com/assets/your-world-of-text-screenshot-2025-02-27.hRdRvc1c_Z21q2j7.webp" alt="Your World of Text Screenshot"></a></p>
<p>If you run out of room, you can scroll to a fresh plot of page à la Google Maps.</p>
<p>Even more brilliant, Andrew <a href="https://github.com/reverie/yourworldoftext/">released the source</a> a while back. Interesting to see that it’s built on Django, and that clients stay in sync by polling the server rather than receiving pushed data via Comet or a hidden socket.</p>]]></content>
        <category label="text" term="text"/>
        <published>2011-01-27T02:51:13.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[Upload to Flickr from Processing]]></title>
        <id>https://frontiernerds.com/upload-to-flickr-from-processing</id>
        <link href="https://frontiernerds.com/upload-to-flickr-from-processing"/>
        <updated>2010-12-17T20:54:38.000Z</updated>
        <content type="html"><![CDATA[<p><img src="https://frontiernerds.com/assets/processing-to-flickr-header.DvjK-PIu_Z1mfpz9.webp" alt="The Processing logo connected to the Flickr logo by an arrow"></p>
<h3>About The PImage Uploader</h3>
<p>I’ve attached a quick Processing sketch that uploads PImages from a camera directly to Flickr each time you click the mouse.</p>
<p>The actual upload process is pretty simple — it just involves posting a bunch of bytes over HTTP to a specific URL. The hard part is getting Flickr to believe that you are who you say you are so that it will accept the images you upload.</p>
<p>That’s where this code is meant to help.</p>
<p>In order to upload images to a Flickr account, your app will need write permission. In order to get write permission, you’ll need to go through the authentication process.</p>
<p>Basically, the first time your app wants to upload, it will open up a URL on the Flickr website prompting you to log in and “allow” the app to do what it wants to do. You may be familiar with this procedure if you’ve had to authenticate third party apps that tie into Flickr (such as iPhoto or a desktop Flickr uploader). In the case of the attached code, Processing opens the authentication link for you, and then gives you 15 seconds to approve the app on Flickr’s website before continuing on its way.</p>
<p>After this, it stores the authentication data in a text file (called <code>token.txt</code>) local to the Processing sketch, so that you won’t have to go through the online authentication process each time you run the app. I’ve encapsulated this process into a single function called <code>authenticate()</code> to make things as simple as possible. If the token is lost or becomes corrupted, the app will automatically try to fetch a new one the next time it runs. (Note that you should not distribute any sketches with your own generated token file!)</p>
<p>The code makes use of a Flickr library for Java called flickrj. Since flickrj is a generic Java library and isn’t designed specifically for Processing, its use is not quite as intuitive as you’re accustomed to. For one, the steps to use the library with your sketch are a bit different. Instead of putting files in your <code>~/Documents/Processing/libraries</code> folder, you’ll need to download the <code>.jar</code> file from the <a href="http://flickrj.sourceforge.net/">flickrj website</a> and drag and drop it onto your sketch window. This creates a folder called “code” inside your sketch folder with a copy of the <code>.jar</code> file inside for your sketch to reference as needed.</p>
<p>If you prefer, you can create the folder and copy the <code>.jar</code> file manually. You’ll end up with the same setup as if you dragged and dropped the file. Also note that you’ll never see anything appear in the “import” menu list since flickrj wasn’t built with Processing in mind. The flickrj jar is included in the zipped uploader code below to make your life easier.</p>
<hr>
<h3>The API / Library Conundrum</h3>
<p>The amount of code and the number of steps involved in getting the necessary authorization is kind of ridiculous. It’s easy to imagine a range of places to improve upon the library.</p>
<p>Flickrj is a pretty direct mirror to the official <a href="https://www.flickr.com/services/api/">Flickr API</a>, and that’s how most API libraries are designed. It seems to be designed for experienced Java programmers working on large-scale projects instead of the quick and dirty sketches typical to Processing work. It’s tough to find exactly the right balance between a library that makes sense relative to the official API, and one that adds new features or code and leverages the paradigms of a particular programming language or framework.</p>
<p>For example, a Processing-specific library might incorporate a threaded image downloader that could return arrays of PImages from a given query. It could also wrap up the authorizations into a few lines of code as outlined in this post. These Processing-esque abstractions on top of Flickr’s own API abstractions add a lot of code and maintenance liabilities to our hypothetical library — but they would certainly open things up for beginner coders.</p>
<p>My Processing to-do list is pretty long, but I’ll add a new Flickr library filed under “maybe someday”.</p>
<hr>
<h3>The Code</h3>
<p>The core of the sketch is shown below, but note that it will be easiest to download <a href="https://frontiernerds.com/files/flickr_uploader.zip">flickr_uploader.zip</a> for testing since it includes the flickrj library. The code looks a bit lengthy and convoluted, but it mostly consists of helper functions that take care of the authentication process and image compression to make the upload process as simple as possible. The helper functions should be reusable without modification, so all you really need to worry about is creating the Flickr object, calling the authentication function, and then uploading to your heart’s desire.</p>
<pre><code class="language-java">// Simple sketch to demonstrate uploading directly from a Processing sketch to Flickr.
// Uses a camera as a data source, uploads a frame every time you click the mouse.

import processing.video.*;
import javax.imageio.*;
import java.awt.image.*;
import com.aetrion.flickr.*;

// Fill in your own apiKey and secretKey values.
String apiKey = "********************************";
String secretKey = "****************";

Flickr flickr;
Uploader uploader;
Auth auth;
String frob = "";
String token = "";

Capture cam;

void setup() {
  size(320, 240);

  // Set up the camera.
  cam = new Capture(this, 320, 240);

  // Set up Flickr.
  flickr = new Flickr(apiKey, secretKey, (new Flickr(apiKey)).getTransport());

  // Authentication is the hard part.
  // If you're authenticating for the first time, this will open up
  // a web browser with Flickr's authentication web page and ask you to
  // give the app permission. You'll have 15 seconds to do this before the Processing app
  // gives up waiting for you.

  // After the initial authentication, your info will be saved locally in a text file,
  // so you shouldn't have to go through the authentication song and dance more than once.
  authenticate();

  // Create an uploader
  uploader = flickr.getUploader();
}

void draw() {
  if (cam.available()) {
    cam.read();
    image(cam, 0, 0);
    text("Click to upload to Flickr", 10, height - 13);
  }
}

void mousePressed() {
  // Upload the current camera frame.
  println("Uploading");

  // First compress it as a jpeg.
  byte[] compressedImage = compressImage(cam);

  // Set some meta data.
  UploadMetaData uploadMetaData = new UploadMetaData();
  uploadMetaData.setTitle("Frame " + frameCount + " Uploaded from Processing");
  uploadMetaData.setDescription("To find out how, go to https://frontiernerds.com/upload-to-flickr-from-processing");
  uploadMetaData.setPublicFlag(true);

  // Finally, upload.
  try {
    uploader.upload(compressedImage, uploadMetaData);
  }
  catch (Exception e) {
    println("Upload failed");
  }

  println("Finished uploading");
}

// Attempts to authenticate. Note this approach is bad form;
// it uses side effects, etc.
void authenticate() {
  // Do we already have a token?
  if (fileExists("token.txt")) {
    token = loadToken();
    println("Using saved token " + token);
    authenticateWithToken(token);
  }
  else {
    println("No saved token. Opening browser for authentication");
    getAuthentication();
  }
}

// FLICKR AUTHENTICATION HELPER FUNCTIONS
// Attempts to authenticate with a given token
void authenticateWithToken(String _token) {
  AuthInterface authInterface = flickr.getAuthInterface();

  // make sure the token is legit
  try {
    authInterface.checkToken(_token);
  }
  catch (Exception e) {
    println("Token is bad, getting a new one");
    getAuthentication();
    return;
  }

  auth = new Auth();

  RequestContext requestContext = RequestContext.getRequestContext();
  requestContext.setSharedSecret(secretKey);
  requestContext.setAuth(auth);

  auth.setToken(_token);
  auth.setPermission(Permission.WRITE);
  flickr.setAuth(auth);
  println("Authentication success");
}

// Goes online to get user authentication from Flickr.
void getAuthentication() {
  AuthInterface authInterface = flickr.getAuthInterface();

  try {
    frob = authInterface.getFrob();
  }
  catch (Exception e) {
    e.printStackTrace();
  }

  try {
    URL authURL = authInterface.buildAuthenticationUrl(Permission.WRITE, frob);

    // open the authentication URL in a browser
    open(authURL.toExternalForm());
  }
  catch (Exception e) {
    e.printStackTrace();
  }

  println("You have 15 seconds to approve the app!");
  int startedWaiting = millis();
  int waitDuration = 15 * 1000; // wait 15 seconds
  while ((millis() - startedWaiting) &#x3C; waitDuration) {
    // just wait
  }
  println("Done waiting");

  try {
    auth = authInterface.getToken(frob);
    println("Authentication success");
    // This token can be used until the user revokes it.
    token = auth.getToken();
    // save it for future use
    saveToken(token);
  }
  catch (Exception e) {
    e.printStackTrace();
  }

  // complete authentication
  authenticateWithToken(token);
}

// Writes the token to a file so we don't have
// to re-authenticate every time we run the app
void saveToken(String _token) {
  String[] toWrite = { _token };
  saveStrings("token.txt", toWrite);
}

boolean fileExists(String filename) {
  File file = new File(sketchPath(filename));
  return file.exists();
}

// Load the token string from a file
String loadToken() {
  String[] toRead = loadStrings("token.txt");
  return toRead[0];
}

// IMAGE COMPRESSION HELPER FUNCTION

// Takes a PImage and compresses it into a JPEG byte stream
// Adapted from Dan Shiffman's UDP Sender code
byte[] compressImage(PImage img) {
  // We need a buffered image to do the JPG encoding
  BufferedImage bufferedImage = new BufferedImage(img.width, img.height, BufferedImage.TYPE_INT_RGB);

  img.loadPixels();
  bufferedImage.setRGB(0, 0, img.width, img.height, img.pixels, 0, img.width);

  // Need these output streams to get the image as a byte array
  ByteArrayOutputStream baStream = new ByteArrayOutputStream();
  BufferedOutputStream bos = new BufferedOutputStream(baStream);

  // Turn the BufferedImage into a JPG and put it in the BufferedOutputStream
  // Requires try/catch
  try {
    ImageIO.write(bufferedImage, "jpg", bos);
  }
  catch (IOException e) {
    e.printStackTrace();
  }

  // Get the byte array, ready to upload to Flickr
  return baStream.toByteArray();
}
</code></pre>]]></content>
        <category label="processing" term="processing"/>
        <category label="flickr" term="flickr"/>
        <published>2010-12-17T20:54:38.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[Spring Thesis Plans]]></title>
        <id>https://frontiernerds.com/spring-thesis-plans</id>
        <link href="https://frontiernerds.com/spring-thesis-plans"/>
        <updated>2010-12-08T20:50:28.000Z</updated>
        <content type="html"><![CDATA[<h3>THE POST-APOCALYPTIC PIRATE INTERNET</h3>
<p><em>For background on the basic idea of the post-apocalyptic pirate internet, please read an <a href="https://frontiernerds.com/rough-thesis-proposal">earlier post on the subject</a></em></p>
<h3>Problem: The centrally-distributed internet is fragile and politically fickle</h3>
<p>The web’s current implementation is built from millions of geographically dispersed clients communicating with a handful of extremely high-density data centers. Despite the <em>many ⇔ many</em> ideals of the web, the infrastructure looks more like <em>many ⇒ one ⇒ many</em>.</p>
<p>This topology means that there are points in the network of significant vulnerability: Backbone fiber, ISP central offices, data centers, etc. all represent potential choke points in the web. The destruction of physical infrastructure or installation of firewalls to screen and censor data at one of these points could snuff access to the web. That would be a shame, since the web is arguably the most significant aggregation of knowledge and culture humanity has ever assembled.</p>
<p>How could this knowledge be protected, and how could the current freedom of expression and exchange enjoyed on the centralized web reemerge under a distributed model that is technically immune to data loss and censorship?</p>
<h3>Solution: Distributed, mesh-networked backups of the entire web</h3>
<p>I propose a distributed backup system for the web to ensure the survival of data and continuation of the platform’s ideals in the face of a political or infrastructural apocalypse.</p>
<p>The basic unit of the post-apocalyptic pirate internet is the “backup node”. These are relatively small, suitcase-sized computers with lots of storage space. Servers, basically. They’re designed for use by consumers of average technical aptitude. Backup nodes would sit in the corner of a room and sip data from the internet to build a backup of some portion of the web. If and when the centralized web infrastructure falls apart, the backup nodes would be poised to respond by automatically transforming from data aggregators to data distributors. Requests for web data in the absence of centralized infrastructure (post-apocalypse) would instead be fulfilled by the backup nodes — at least to the extent that backups are available.</p>
<p>The technical infrastructure of the post-apocalyptic pirate internet has two basic components. The first is physical: local storage nodes — hard disks, flash memory, etc. — on which fragments of the web will be backed up and paired with a supporting computer and interface (most likely a browser). The second is ethereal: wireless communication, which will enable the formation of a mesh network between physically proximate nodes. This would give apocalypse survivors access to more than just the data stored on their local node. In this sense, a new internet would take shape as the backup nodes enmeshed — an internet that was not vulnerable to centralized oversight or obstruction.</p>
<h3>Execution: Research demand and feasibility, then build a backup node</h3>
<p>First I’ll have to figure out how / why, exactly, such a system could / should be built. How would the content of the backups be curated? By some distributed democratic means? By the usage patterns of the backup node’s owner? There’s a judgment to be made in deciding between saving the data people actually interact with on a daily basis (say, Twitter), and the data that actually carries forward knowledge essential to civilization (OpenCourseWare comes to mind).</p>
<p>What role will the backup nodes play before the apocalypse? Will they be seemingly dormant black boxes going about their work without human intervention, or will they become distribution points for content censored from the centralized web (Wikileaks would be the example of the day)?</p>
<p>Marina has encouraged me to focus on the conceptual justifications for the system instead of technical implementation. However I’m personally interested in creating at least one actual node to demonstrate the concept. I understand the futility of the gesture, since the pirate internet would require thousands of backup nodes to be built, sold, and operated if it was going to actually protect (and eventually distribute) an appreciable amount of data. A single node is not particularly useful. Nevertheless, I’d like to end the semester with more than an exhaustive string of justifications / marketing material for something that doesn’t actually exist.</p>]]></content>
        <published>2010-12-08T20:50:28.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[NIME is Coming]]></title>
        <id>https://frontiernerds.com/nime-is-coming</id>
        <link href="https://frontiernerds.com/nime-is-coming"/>
        <updated>2010-12-08T20:11:49.000Z</updated>
        <content type="html"><![CDATA[<p><img src="https://frontiernerds.com/assets/nime-poster.BojTo978_Z1yJ4rs.webp" alt="NIME 2010 Poster"></p>]]></content>
        <published>2010-12-08T20:11:49.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[Signs of the Apocalypse]]></title>
        <id>https://frontiernerds.com/signs-of-the-apocalypse</id>
        <link href="https://frontiernerds.com/signs-of-the-apocalypse"/>
        <updated>2010-12-08T18:28:54.000Z</updated>
        <content type="html"><![CDATA[<p>A glut of headlines relevant to the <a href="https://frontiernerds.com/rough-thesis-proposal">post-apocalyptic pirate internet</a> have popped up over the last few weeks. Here’s a quick review with commentary.</p>
<p>This first batch is regarding the temporary loss of major online repositories for “user generated content” (to invoke the cliché).</p>
<p>Another post discussing the Wikileaks saga in the context of the post-apocalyptic pirate internet is forthcoming.</p>
<hr>
<p><a href="https://www.tumblr.com/">Tumblr</a>, the celebrated blogging platform, was <a href="https://techcrunch.com/2010/12/06/tumblr-downtime/">down for about 24 hours</a> on December 5th. This was their longest outage to date.</p>
<p><img src="https://frontiernerds.com/assets/screen-shot-2010-12-06-at-9.32.09-am-e1291645709199.MpSeW3iQ_eLk1j.webp" alt="Tumblr outage screenshot"></p>
<p>Users’ trust is shaken by this sort of thing, and a day after the outage they <a href="https://staff.tumblr.com/post/286303145/tumblr-backup-mac-beta">released a backup application</a> that lets users save all of their Tumblr posts to their hard disks.</p>
<p>Here’s the official line:</p>
<blockquote>
<p>Unlike other publishing sites’ approach to backups, our goal was to create a useful copy of your blog’s content that can be viewed on any computer, burned to a CD, or hosted as an archive of static HTML files.</p>
<p>Wherever possible, we use simple file formats. Our backup structure is optimized for Mac OS X’s Spotlight for searching and Quick Look for browsing, and we’ll try to use the same structure and achieve the same benefits on other platforms.</p>
</blockquote>
<p>To me this reads more like, “Keep uploading! If we implode, we won’t take your data with us.”</p>
<p>The backup app strikes me as a Hail Mary decision executed in the interest of damage control (with the side effect of actually being good news for the survivability of the 2+ billion posts Tumblr hosts on their servers). There’s a tension on social media websites between giving users access to their own data (in the form of database dumps) and maximizing “lock in” — since giving users downloadable access to their data can provide an easy means of egress from one service and migration to a competitor. (cf. <a href="https://techcrunch.com/2010/10/06/facebook-now-allows-you-to-download-your-information/">Facebook’s recent decision to let users dump their data in one step</a>.)</p>
<p>Of course, like most prophylactics, the download tool would only be useful in the context of the post-apocalyptic pirate internet if 100% of Tumblr publishers used it 100% of the time. Nevertheless, the fact that this piece of preservationist infrastructure was officially released suggests that some portion of the Tumblr staff / users are paranoid enough to prepare for a data or infrastructure related disaster. The app also implicitly migrates the worst-case backup burden from the host to the client. (e.g. “Oops, we lost everything… what, you didn’t back up your posts?”) This represents a significant shift in one of the basic contracts of Web 2.0, which is the idea that “files” as we know them on our PCs don’t exist, you don’t have to worry about which directory things go in, you don’t plan for a day when you’ll need to open Word 3.0 files, and you certainly don’t have to back up. The understanding between consumer and provider is that once something’s uploaded, it’s safe from loss due to technical failure: every bit is tucked away in multi-million-dollar data centers and placed under the careful watch of bespectacled geeks pacing up and down miles of server racks.</p>
<p>Of course, that’s not how things work out, but the cloud = safe truism is one that will need to be proven catastrophically false before the basic tenet of the post-apocalyptic pirate internet — that local bits are safe bits — can take hold.</p>
<hr>
<p>Another outage of reasonably high profile (although certainly not on the scale of Tumblr) struck <a href="http://github.com/">GitHub</a> on November 14th. A botched command by a systems administrator wiped out a database and destroyed some data along the way. The site was unusable for about three hours.</p>
<p>GitHub is much more esoteric than Tumblr, but for the uninitiated it’s basically a website that layers social-networking tools on top of <a href="http://en.wikipedia.org/wiki/Git_(software)">Git</a>. Git, in turn, is a piece of software that runs locally on your computer to track revisions to (and collaboration around) source code written in the course of developing software.</p>
<p>Anyway, here’s what bad news looked like, as delivered by GitHub’s mascot, the Octocat:</p>
<p><img src="https://frontiernerds.com/assets/github-problems.CaNfJ8ke_1fpfHR.webp" alt="GitHub Octocat looking sad"></p>
<p>The nature of Git (the version-control system) means that even a total loss of GitHub (the community built on Git) would be inconvenient, but not catastrophic. When you’re working with a Git repository, you have a local copy on your hard disk that is periodically updated and synced to the GitHub server.</p>
<p>If 50 people are working on a particular project, then 50 copies of that project exist on local hard disks in one corner of the world or another. Thus the degree to which a project is insured against disaster rises in proportion to its popularity / number of collaborators.</p>
<p>So there are two particularly great things about the Git + GitHub combination that should be kept in mind as plans for the post-apocalyptic pirate internet are drawn up:</p>
<ol>
<li>The same basic software (Git) is running on both your own computer and GitHub’s servers. In this sense, GitHub makes the most of the web when it’s available (by adding a social layer to Git), but Git itself doesn’t completely melt down in the absence of GitHub. In short, Git’s use of the centralized web is value added, not mission critical.</li>
<li>Local backups are generated automatically in the course of using GitHub — unlike Tumblr’s proposed solution, which calls on users to make a conscious decision to back up at regular intervals if they want the safety of their data.</li>
</ol>]]></content>
        <category label="tumblr" term="tumblr"/>
        <category label="git" term="git"/>
        <category label="wikileaks" term="wikileaks"/>
        <published>2010-12-08T18:28:54.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[It Talks: Text to Speech in Processing]]></title>
        <id>https://frontiernerds.com/text-to-speech-in-processing</id>
        <link href="https://frontiernerds.com/text-to-speech-in-processing"/>
        <updated>2010-12-06T17:47:26.000Z</updated>
<content type="html"><![CDATA[<p>The Mac has a really great text-to-speech (TTS) engine built right in, but at first glance it’s only available at Apple’s whim in specific contexts — e.g. via a menu command in TextEdit, or system-wide through the accessibility settings. Seems grim, but we’re in luck — Apple, in their infinite generosity, have given us a command line program called “say”, which lets us invoke the TTS engine through the terminal. It’s super simple to use: just type the command followed by the text you want, e.g.</p>
<pre><code class="language-sh">say cosmic manifold
</code></pre>
<p>So that’s great, now what if we wanted to make a Processing sketch talk to us? In Java, as in most languages, there are ways to send commands to the terminal programmatically. By calling <code>Runtime.getRuntime().exec("some command");</code> we can run any code we want on the terminal from within Processing. So to invoke the TTS engine from a Processing sketch, we can just build the <code>say ...</code> command line instruction in a string and pass it to the runtime exec call, which in turn handles the TTS conversion.</p>
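<p>To make the pattern concrete, here’s a minimal sketch of just that step. (The <code>SayDemo</code> class and <code>buildSayCommand</code> helper are illustrative names of my own, not anything official; the actual exec call is commented out so the example is harmless off a Mac.)</p>

```java
// Minimal illustrative sketch (class and method names are hypothetical).
// Builds the argument array for the macOS `say` program.
class SayDemo {
  // Assemble the argument array for the `say` command line program.
  static String[] buildSayCommand(String voice, String text) {
    return new String[] {"say", "-v", voice, text};
  }

  public static void main(String[] args) throws Exception {
    String[] cmd = buildSayCommand("Alex", "cosmic manifold");

    // Print the command for inspection...
    StringBuilder joined = new StringBuilder();
    for (String part : cmd) {
      if (joined.length() > 0) joined.append(' ');
      joined.append(part);
    }
    System.out.println(joined); // say -v Alex cosmic manifold

    // ...and on a Mac, this line would actually speak it:
    // Runtime.getRuntime().exec(cmd);
  }
}
```

<p>Passing the arguments as an array (rather than one big string) sidesteps any quoting headaches when the text contains spaces.</p>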
<p>I’ve put together a small Processing class that makes it easy to add speech to your Processing sketches. It only works on Mac OS, won’t work in a web applet, and has only been tested in Mac OS 10.6. (I think the list of voices has changed since 10.5.)</p>
<p>Note that since the class is quite simple and really just wraps up a few functions, I’ve set it up for static access. This means you should never need to instantiate the class by calling something like <code>TextToSpeech tts = new TextToSpeech()</code> — and in fact that would be a Bad Idea. Instead, you can access the methods at any time, without any prior instantiation, using static-style syntax, e.g. <code>TextToSpeech.say("cosmic manifold");</code>.</p>
<p>Here’s the class and a sample sketch:</p>
<pre><code class="language-java">// Processing Text to Speech
// Eric Mika, Winter 2010
// Tested on Mac OS 10.6 only, possibly compatible with 10.5 (with modification)
// Adapted from code by Denis Meyer (CallToPower)
// Thanks to Mark Triant for the inspiring sample text

String script = "cosmic manifold";
int voiceIndex;
int voiceSpeed;

void setup() {
  size(500, 500);
}

void draw() {
  background(0);

  // set the voice based on mouse y
  voiceIndex = round(map(mouseY, 0, height, 0, TextToSpeech.voices.length - 1));

  //set the voice speed based on mouse X
  voiceSpeed = mouseX;

  // help text
  fill(255);
  text("Click to hear " + TextToSpeech.voices[voiceIndex] + "\nsay \"" + script + "\"\nat speed " + mouseX, 10, 20);

  fill(128);
  text("Mouse X sets voice speed.\nMouse Y sets voice.", 10, 65);
}

void mousePressed() {
  // say something
  TextToSpeech.say(script, TextToSpeech.voices[voiceIndex], voiceSpeed);
}

// the text to speech class
import java.io.IOException;

static class TextToSpeech extends Object {

  // Store the voices, makes for nice auto-complete in Eclipse

  // male voices
  static final String ALEX = "Alex";
  static final String BRUCE = "Bruce";
  static final String FRED = "Fred";
  static final String JUNIOR = "Junior";
  static final String RALPH = "Ralph";

  // female voices
  static final String AGNES = "Agnes";
  static final String KATHY = "Kathy";
  static final String PRINCESS = "Princess";
  static final String VICKI = "Vicki";
  static final String VICTORIA = "Victoria";

  // novelty voices
  static final String ALBERT = "Albert";
  static final String BAD_NEWS = "Bad News";
  static final String BAHH = "Bahh";
  static final String BELLS = "Bells";
  static final String BOING = "Boing";
  static final String BUBBLES = "Bubbles";
  static final String CELLOS = "Cellos";
  static final String DERANGED = "Deranged";
  static final String GOOD_NEWS = "Good News";
  static final String HYSTERICAL = "Hysterical";
  static final String PIPE_ORGAN = "Pipe Organ";
  static final String TRINOIDS = "Trinoids";
  static final String WHISPER = "Whisper";
  static final String ZARVOX = "Zarvox";

  // throw them in an array so we can iterate over them / pick at random
  static String[] voices = {
    ALEX, BRUCE, FRED, JUNIOR, RALPH, AGNES, KATHY,
    PRINCESS, VICKI, VICTORIA, ALBERT, BAD_NEWS, BAHH,
    BELLS, BOING, BUBBLES, CELLOS, DERANGED, GOOD_NEWS,
    HYSTERICAL, PIPE_ORGAN, TRINOIDS, WHISPER, ZARVOX
  };

  // this sends the "say" command to the terminal with the appropriate args
  static void say(String script, String voice, int speed) {
    try {
      Runtime.getRuntime().exec(new String[] {"say", "-v", voice, "[[rate " + speed + "]] " + script});
    }
    catch (IOException e) {
      System.err.println("IOException while invoking say: " + e.getMessage());
    }
  }

  // Overload the say method so we can call it with fewer arguments and basic defaults
  static void say(String script) {
    // 200 seems like a reasonable default speed
    say(script, ALEX, 200);
  }

}
</code></pre>]]></content>
        <category label="processing" term="processing"/>
        <category label="speech" term="speech"/>
        <published>2010-12-06T17:47:26.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[Rough Thesis Proposal]]></title>
        <id>https://frontiernerds.com/rough-thesis-proposal</id>
        <link href="https://frontiernerds.com/rough-thesis-proposal"/>
        <updated>2010-11-30T18:04:41.000Z</updated>
        <content type="html"><![CDATA[<p>Let’s suppose the internet stops working tomorrow. Not the hard disks, just the wires. Government firewall, tiered service, cut fiber, ISP meltdown — pick a scenario.</p>
<p>How much would be lost? How much could be recovered? What would you need to rebuild?</p>
<p>We have the infrastructure — computing power, abundant storage, networking — but not the know-how, organization, will, or sense of impending disaster required to start building a decentralized web.</p>
<p>We need a way to easily rearrange our consumer electronics (which are optimized for consumption) into a network that can’t be centrally controlled or destroyed (and is therefore optimized for creation and distribution). Most importantly, the ubiquity and overlap of consumer-created wireless networks in urban areas means that mesh-based networks with thousands of nodes should be feasible without any reliance on centralized network infrastructure.</p>
<p>Hence the post-apocalyptic pirate web kit: everything you need to bootstrap a decentralized web in a single package. This could take several forms, but my initial thinking suggests a suitcase full of hard disks, wireless connections, and processing power designed to restore fragments of the web to nearby users and act as a node in a broader network of data survivalists.</p>
<p>The hard disks would be pre-loaded with an archival version of the web. The whole of Wikipedia’s English text content, for example, is readily available for download and amounts to about 5 terabytes. This could fit on three hard disks, which cost about $100 each, and together displace about as much physical space as a loaf of bread.</p>
<p>In its dormant form, the post-apocalyptic pirate web kit is something you might leave plugged in at the corner of the room — it could sit there indefinitely, like a fire extinguisher. The kit could automatically crawl the web and keep its archival mirror as fresh as possible.</p>
<p>When and if disaster strikes, the kit would be ready to switch into server mode and thus preserve our way of internet life. (So that we might continue with a spirit of bold curiosity for the adventure ahead!)</p>]]></content>
        <category label="data" term="data"/>
        <category label="apocalypse" term="apocalypse"/>
        <category label="culture" term="culture"/>
        <published>2010-11-30T18:04:41.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[Dead Drop at 319]]></title>
        <id>https://frontiernerds.com/dead-drop-at-319</id>
        <link href="https://frontiernerds.com/dead-drop-at-319"/>
        <updated>2010-11-29T14:41:18.000Z</updated>
        <content type="html"><![CDATA[<p>This morning I installed a <a href="https://deaddrops.com/">dead drop</a> at <a href="https://web.archive.org/web/20131004172923/http://319scholes.org/">319 Scholes</a> street in Brooklyn.</p>
<p>Four gigabytes worth of zeroed bits are now available for web-free semi-anonymous file exchange.</p>
<p><a href="https://frontiernerds.com/assets/319drop-wide.V8lkycmc_ZRUI55.webp"><img src="https://frontiernerds.com/assets/319drop-wide.V8lkycmc_ZRUI55.webp" alt="319 Scholes Street"></a> <a href="https://frontiernerds.com/assets/319drop-mid.CqNsLMGH_wI46.webp"><img src="https://frontiernerds.com/assets/319drop-mid.CqNsLMGH_wI46.webp" alt="The door to 319 Scholes Street"></a></p>
<p><a href="https://frontiernerds.com/assets/319drop-close.Mae1Qb5m_Z15VEex.webp"><img src="https://frontiernerds.com/assets/319drop-close.Mae1Qb5m_Z15VEex.webp" alt="Dead Drop installed at 319 Scholes Street"></a></p>
<h3>Background</h3>
<p>Dead Drops was initiated by Aram Bartholl during his residency at Eyebeam — the project involves embedding digital storage in buildings or other large, immovable objects in public space. Aram’s “Dead Drops Manifesto” follows:</p>
<blockquote>
<p><em>Dead Drops is an anonymous, offline, peer to peer file-sharing network in public space. Anyone can access a Dead Drop and everyone may install a Dead Drop in their neighborhood/city. A Dead Drop must be public accessible. A Dead Drop inside closed buildings or private places with limited or temporary access is not a Dead Drop. A real Dead Drop mounts as read and writeable mass storage drive without any custom software. Dead Drops don’t need to be synced or connected to each other. Each Dead Drop is singular in its existence. A very beautiful Dead Drop shows only the metal sheath enclosed type-A USB plug and is cemented into walls.You would hardly notice it. Dead Drops don’t need any cables or wireless technology. Your knees on the ground or a dirty jacket on the wall is what it takes share files offline. A Dead Drop is a naked piece of passively powered Universal Serial Bus technology embedded into the city, the only true public space. In an era of growing clouds and fancy new devices without access to local files we need to rethink the freedom and distribution of data. The Dead Drops movement is on its way for change!</em></p>
</blockquote>
<blockquote>
<p><em>Free your data to the public domain in cement! Make your own Dead Drop now! Un-cloud your files today!!!</em></p>
</blockquote>
<p>I really like this project — it sits at the intersection of the practical and the conceptual, and forces us to rethink data exchange and anonymity. In the course of researching dead drops, I read hundreds of comments from visitors to Aram’s blog posts outlining the project. A <em>huge</em> percentage of the comments are Henny Penny hysterics about the security implications involved in plugging your laptop into a dead drop. (Viruses! Destructive circuitry! Illegal content!) Such is the nature of creative tear-down on the web.</p>
<p>Nevertheless there’s something about drawing data exchange into real life that amplifies paranoia and brings out a sense of peril. If my thesis project about building a decentralized mesh-based alternative internet ends up solidifying, I’ll definitely revisit this issue of how trust works so differently in real life.</p>
<h3>Process</h3>
<p>Installing the drop was relatively simple. I followed the <a href="https://deaddrops.com/participate/">how-to guide</a>. As fate would have it, there was already a hole in the building’s brick facade that was just the right size for a USB key. With that already sorted, I started by cracking open the key’s plastic case and mummifying it in PTFE tape for water protection:</p>
<p><a href="https://frontiernerds.com/assets/319drop-installation-materials.ItHAqRru_Z2j0wrk.webp"><img src="https://frontiernerds.com/assets/319drop-installation-materials.ItHAqRru_Z2j0wrk.webp" alt="A collection of tools"></a> <a href="https://frontiernerds.com/assets/319drop-installation-drive.CU1Rf9L__ZCVpaD.webp"><img src="https://frontiernerds.com/assets/319drop-installation-drive.CU1Rf9L__ZCVpaD.webp" alt="A USB key"></a> <a href="https://frontiernerds.com/assets/319drop-installation-dismantled.CukfAla8_vriDh.webp"><img src="https://frontiernerds.com/assets/319drop-installation-dismantled.CukfAla8_vriDh.webp" alt="A dismantles USB key"></a> <a href="https://frontiernerds.com/assets/319drop-installation-mummified.Bsga_2j2_RAkoq.webp"><img src="https://frontiernerds.com/assets/319drop-installation-mummified.Bsga_2j2_RAkoq.webp" alt="A dismantled USB key wrapped in protective tape"></a></p>
<p>Next I mixed up some cement, and embedded the key in the wall:</p>
<p><a href="https://frontiernerds.com/assets/319drop-installation-hole.C_Qexvlp_ZXXzdA.webp"><img src="https://frontiernerds.com/assets/319drop-installation-hole.C_Qexvlp_ZXXzdA.webp" alt="Close up of the door at 319 Scholes Street"></a> <a href="https://frontiernerds.com/assets/319drop-installation-test-fit.DVhLEHoq_pDsBW.webp"><img src="https://frontiernerds.com/assets/319drop-installation-test-fit.DVhLEHoq_pDsBW.webp" alt="USB key placed in the wall at 319 Scholes Street"></a> <a href="https://frontiernerds.com/assets/319drop-installation-cement.DOBYfkbI_ZsTvyC.webp"><img src="https://frontiernerds.com/assets/319drop-installation-cement.DOBYfkbI_ZsTvyC.webp" alt="A small container with freshly mixed cement"></a> <a href="https://frontiernerds.com/assets/319drop-close.Mae1Qb5m_Z15VEex.webp"><img src="https://frontiernerds.com/assets/319drop-close.Mae1Qb5m_Z15VEex.webp" alt="USB key cemented into the wall at  Scholes Street"></a></p>
<h3>Outlook</h3>
<p>I seeded the drive with a few files of interest, and <a href="https://deaddrops.com/db/?page=view&#x26;id=102">posted it to the dead drops database</a> — this is the 66th drop to be installed worldwide. I’ll be checking the drive every few days, and if anything interesting shows up I’ll write a post.</p>
<p>I’ll be interested to see how well / long it holds up to the elements, since the Eyebeam drop met an early demise after a rainstorm, and the Union Square drop was lost to vandalism.</p>]]></content>
        <category label="eyebeam" term="eyebeam"/>
        <category label="apocalypse" term="apocalypse"/>
        <published>2010-11-29T14:41:18.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[Building a Real-Time Web Library for Processing]]></title>
        <id>https://frontiernerds.com/real-time-web-plans</id>
        <link href="https://frontiernerds.com/real-time-web-plans"/>
        <updated>2010-11-18T08:48:49.000Z</updated>
        <content type="html"><![CDATA[<p><img src="https://frontiernerds.com/assets/web-tap-logo.Car7OLjP_27y3LF.webp" alt="Four beer tap handles labeled with web services"></p>
<p>In the course of working on my big screens project this semester — <a href="https://frontiernerds.com/big-screens-ideas">a real-time web clock</a> — I’ve realized that it would be awfully nice to have a Processing library to abstract away all of the footwork involved in skimming the latest data from the usual Web 2.0 suspects like Twitter, Flickr, Facebook, Foursquare, etc.</p>
<p>There’s already some nice work on this front.</p>
<p>Mark McBride’s excellent <a href="https://github.com/mccv/processing-tweet-stream">Tweet Stream library</a> for Processing works exactly as advertised. There are also Java wrappers for popular APIs (<a href="https://sourceforge.net/projects/flickrj/">flickrj</a> comes to mind) that should work in Processing and are likely to make data extraction easier.</p>
<p>But, many of these approaches are lacking, and getting from a raw API to a real-time stream can be a lot of work. So I’ve started work on a Processing library whose sole function is to provide a real-time stream of web data from as many sources as possible, with as little delay as possible.</p>
<p>A few considerations and concerns for the design and implementation of the library are outlined below.</p>
<h3>One Library vs. Many</h3>
<p>Traditionally, API wrappers / libraries come à la carte. You want Flickr? Download the Flickr library. You want Twitter? Download the Twitter library. Etc.</p>
<p>That’s nice, but all of these libraries work in slightly different ways and putting two or more services together requires plenty of fuss and glue code. I think a lot of the interesting things you can do with real-time data involve placing it in the context of other events, which favors a one-library approach.</p>
<p>Also, consolidating the services into a single library means that we can use polymorphism to make dealing with generic “event” objects from different sources relatively painless. E.g., if you wanted to work with tweets and Flickr updates in similar contexts, you should be able to manage everything from a single array list since their event objects inherit from some kind of generic event class.</p>
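<p>A quick sketch of what that inheritance might look like — all class names here are hypothetical placeholders, not the library’s final API:</p>

```java
import java.util.ArrayList;

// Hypothetical sketch of the generic event hierarchy; none of these
// class names are the library's final API.
abstract class Update {
  String text;
  String serviceName;

  Update(String text, String serviceName) {
    this.text = text;
    this.serviceName = serviceName;
  }
}

class TweetUpdate extends Update {
  TweetUpdate(String text) {
    super(text, "Twitter");
  }
}

class FlickrUpdate extends Update {
  FlickrUpdate(String text) {
    super(text, "Flickr");
  }
}

class StreamDemo {
  public static void main(String[] args) {
    // Events from different services mingle in a single list...
    ArrayList<Update> events = new ArrayList<Update>();
    events.add(new TweetUpdate("140 characters of wisdom"));
    events.add(new FlickrUpdate("a new photo"));

    // ...and can be handled generically, regardless of origin.
    for (Update u : events) {
      System.out.println(u.serviceName + ": " + u.text);
    }
  }
}
```

<p>The subclasses could carry service-specific fields (geotags, photo URLs, etc.) while the shared base class keeps mixed-source code simple.</p>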
<h3>Maintainability</h3>
<p>Not all services with high rates of incoming data have clean, public-facing APIs for grabbing said data.</p>
<p>In the case of Foursquare, for example, there’s no way to access the most recent public check-ins through their official API. The API allows certain kinds of interactions with the service, but it doesn’t do exactly what we want.</p>
<p>Likewise, Facebook’s status API doesn’t let you do an empty search for all of the latest updates — instead you’re limited to a specific search term. So, in this case, there’s an API that almost does what we want, but we’ll have to get clever if we want something resembling the whole stream of public status updates.</p>
<p>Therefore getting real-time data from these services will involve some hairy HTTP polling and HTML scraping. These kinds of things are liable to break if / when the structure of the source page changes. There are also potential request rate limits to deal with. Keeping the library up to date and fixing breakage when it happens is going to be a challenge — but I can’t think of a way around this until more sites support officially-sanctioned access to structured real-time data through an API. (And good on Flickr and Twitter for providing real-time data in their APIs already.)</p>
<h3>Actual Rate vs. Available Rate</h3>
<p>Ideally, every single event from a service would be captured by the real-time web library. However, for extremely high-volume services (Twitter, for example), even the official APIs only give a fraction of the actual event streams. Again, there’s not really a way around this, but keeping tabs on what percentage of the full stream we’re currently reading might be useful — some way to compare the current message rate through the library to the actual message rate on the service’s end. (For example, it would be useful to know that the library’s current Twitter message rate is 30% of actual traffic.)</p>
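<p>One possible (purely illustrative) way to track that coverage: count messages over a window and divide the observed rate by the service’s reported total rate. The class and numbers below are hypothetical.</p>

```java
// Illustrative only: track what fraction of a service's actual traffic
// the library is seeing. All names and rates here are hypothetical.
class RateMonitor {
  long windowStartMillis; // when the measurement window opened
  int observedCount;      // messages seen since then

  RateMonitor(long nowMillis) {
    windowStartMillis = nowMillis;
    observedCount = 0;
  }

  void recordMessage() {
    observedCount++;
  }

  // Fraction of the service's real traffic we are receiving (0.0 to 1.0).
  float coverage(long nowMillis, float actualMessagesPerSecond) {
    float elapsedSeconds = (nowMillis - windowStartMillis) / 1000.0f;
    if (elapsedSeconds <= 0 || actualMessagesPerSecond <= 0) return 0;
    float observedRate = observedCount / elapsedSeconds;
    return observedRate / actualMessagesPerSecond;
  }

  public static void main(String[] args) {
    RateMonitor monitor = new RateMonitor(0);
    for (int i = 0; i < 300; i++) monitor.recordMessage();
    // 300 messages in 10 seconds is 30/sec; if the service really emits
    // 100/sec, we are seeing 30% of the stream.
    System.out.println(monitor.coverage(10000, 100)); // 0.3
  }
}
```

<p>The hard part, of course, is getting a trustworthy figure for the actual rate on the service’s end.</p>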
<p>Conversely, being able to dynamically specify a rate limit for each service might be useful in certain contexts where bandwidth needs to be conserved or services need to be synchronized. (At the very least, rate limiting will be useful for my big screens project, where a full-rate sample of the services would result in graphic slowdowns.)</p>
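<p>A minimal take on that client-side limit (again, just a sketch — the real library interface is undecided): accept at most a configured number of events per second by enforcing a minimum interval between them, dropping anything that arrives too soon.</p>

```java
// Hypothetical sketch of a simple interval-based rate limiter.
class RateLimiter {
  long minIntervalMillis; // minimum time between accepted events
  long lastAcceptedMillis;

  RateLimiter(int maxPerSecond) {
    minIntervalMillis = 1000L / maxPerSecond;
    lastAcceptedMillis = -minIntervalMillis; // so the very first event passes
  }

  // Returns true if an event arriving at nowMillis should be passed along.
  boolean allow(long nowMillis) {
    if (nowMillis - lastAcceptedMillis >= minIntervalMillis) {
      lastAcceptedMillis = nowMillis;
      return true;
    }
    return false; // too soon since the last accepted event; drop it
  }

  public static void main(String[] args) {
    RateLimiter limiter = new RateLimiter(10); // cap at 10 events/sec
    int accepted = 0;
    // Simulate an event every 5 ms for one second (200 events);
    // with a 100 ms minimum interval, only 10 make it through.
    for (long t = 0; t < 1000; t += 5) {
      if (limiter.allow(t)) accepted++;
    }
    System.out.println(accepted); // 10
  }
}
```

<p>Dropping excess events suits a screen-bound visualization; a queueing variant would suit cases where every event must eventually be processed.</p>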
<hr>
<p>So how should it work from the library programmer’s perspective? Maybe something like this:</p>
<pre><code class="language-java">WebStream webStream;

void setup() {
  // instantiate the web stream object
  webStream = new WebStream(this);

  // add as many (or as few) real-time services as you like
  webStream.activateFlickr("api key here");
  webStream.activateFacebook();
  webStream.activateTwitter("api key here");
}

void draw() {
  // Nothing to see here, yet
}

void streamEvent(Update u) {
  // the web stream returns updates via a callback event
  // this would print a quick summary of the most recent event
  println("content: " + u.getText() +
  " via: " + u.getAuthor() +
  " at: " + u.getTime() +
  " from: " + u.getServiceName());
}
</code></pre>
<p>That’s the plan, at least. I’ll work on this over the next week and eventually have some code for public consumption in the next month or two. If anyone has feature requests or a different take on how something like this should work, let me know.</p>]]></content>
        <category label="live web" term="live-web"/>
        <published>2010-11-18T08:48:49.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[Concept Dump]]></title>
        <id>https://frontiernerds.com/concept-dump</id>
        <link href="https://frontiernerds.com/concept-dump"/>
        <updated>2010-11-16T15:31:46.000Z</updated>
        <content type="html"><![CDATA[<p>An ever-expanding list of assorted things of direct or tangential relevance to my continued inquiry regarding a thesis on the subject of a post-apocalyptic pirate internet.</p>
<h3>Books</h3>
<ul>
<li><em>Inventing the Internet</em> — Janet Abbate</li>
<li><em>Internet Architecture and Innovation</em> — Barbara van Schewick</li>
<li><em>The Internet Imaginaire</em> — Patrice Flichy</li>
<li><em>Access Controlled: The Shaping of Power, Rights, and Rule in Cyberspace</em></li>
<li><em>Wirelessness: Radical Empiricism in Network Cultures</em> — Adrian Mackenzie</li>
<li><em>Crypto Anarchy, Cyberstates, and Pirate Utopias</em> — Edited by Peter Ludlow</li>
<li><em>Security in Wireless Mesh Networks</em> — Editors Zhang, Zheng, and Hu</li>
</ul>
<h3>Articles</h3>
<ul>
<li><em><a href="https://www.sacred-texts.com/eso/taz.htm">The Temporary Autonomous Zone</a></em></li>
<li><em><a href="https://torrentfreak.com/scary-bittorrent-monitoring-as-a-work-of-art-100503/">BitTorrent Monitoring As a Work of Art</a></em></li>
<li><em><a href="https://www.wired.com/2010/07/intercepting-cell-phone-calls/">Hacker Spoofs Cell Phone Tower to Intercept Calls</a></em></li>
<li><em><a href="https://www.theatlantic.com/magazine/archive/1945/07/as-we-may-think/303881/">As We May Think</a></em></li>
</ul>
<h3>Groups</h3>
<ul>
<li><a href="https://web.archive.org/web/20110128205639/http://meningrey.net/">Men in Grey</a></li>
<li><a href="http://fffff.at/">F.A.T.</a></li>
<li>Eyebeam</li>
<li><a href="https://churchofwifi.org/">The Church of Wi-fi</a></li>
<li><a href="https://wiki.archiveteam.org/">Archive Team</a></li>
</ul>
<h3>Software</h3>
<ul>
<li><a href="https://github.com/codebutler/firesheep">Firesheep</a></li>
<li><a href="https://www.wireshark.org/">Wireshark</a></li>
<li><a href="http://en.wikipedia.org/wiki/Carnivore_(software)">Carnivore</a></li>
<li><a href="https://web.archive.org/web/20160508134714/http://www.wardrive.net/wardriving/tools/">Assorted wardriving tools</a></li>
</ul>
<h3>People</h3>
<ul>
<li><a href="https://arambartholl.com/">Aram Bartholl</a></li>
<li><a href="http://www.thomson-craighead.net/">Thomson &#x26; Craighead</a></li>
<li><a href="https://www.v2.nl/archive/people/gordan-savicic">Gordan Savicic</a></li>
<li><a href="https://k0a1a.net/">Danja Vasiliev</a></li>
</ul>
<h3>Works</h3>
<ul>
<li><a href="http://deaddrops.com/">Dead Drops</a> — Aram Bartholl</li>
<li><a href="https://web.archive.org/web/20170107141714/http://www.vvork.com/?p=21180">Six Square Meters of RAM</a> — James Irwin</li>
</ul>
<h3>Words</h3>
<ul>
<li>Prepared</li>
<li>Paranoid</li>
<li>Ad-Hoc</li>
<li>Mesh</li>
<li>Self-healing</li>
<li>Archive</li>
<li>Survival</li>
<li>Shelter</li>
<li>Network</li>
<li>Wireless</li>
<li>Web</li>
<li>Indestructible</li>
<li>Bootstrapped</li>
<li>Autonomous</li>
<li>Displaced</li>
<li>Intercept</li>
<li>Splice</li>
<li>Reroute</li>
</ul>]]></content>
        <published>2010-11-16T15:31:46.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[Dead Dead Drops]]></title>
        <id>https://frontiernerds.com/dead-dead-drops</id>
        <link href="https://frontiernerds.com/dead-dead-drops"/>
        <updated>2010-11-16T12:04:07.000Z</updated>
        <content type="html"><![CDATA[<p><a href="https://deaddrops.com/">Dead Drops</a> is a project fresh out of Eyebeam that proposes a new and unsanitary means of network-free data exchange. The project seems highly relevant to my nascent thesis idea, <a href="https://frontiernerds.com/the-post-apocalyptic-pirate-web-kit">The Post-Apocalyptic Pirate Web Kit</a>.</p>
<p>Artist <a href="https://arambartholl.com/">Aram Bartholl</a> describes the project:</p>
<blockquote>
<p>‘Dead Drops’ is an anonymous, offline, peer to peer file-sharing network in public space. USB flash drives are embedded into walls, buildings and curbs accessible to anybody in public space.</p>
</blockquote>
<blockquote>
<p>Everyone is invited to drop or find files on a dead drop. Plug your laptop to a wall, house or pole to share your favorite files and data.</p>
</blockquote>
<p>It’s the digital equivalent of a glory hole.</p>
<p>I was curious to see what kind of content the drops have accumulated in their first few days of existence, so I printed a handful of maps and set out to visit each one. The site’s <a href="https://deaddrops.com/db/">location database</a> is a little clunky, but it turned up five drop locations in Brooklyn and Manhattan:</p>
<p><a href="https://frontiernerds.com/assets/dead-drops3.C2sowugv_VzyM6.webp"><img src="https://frontiernerds.com/assets/dead-drops3.C2sowugv_VzyM6.webp" alt="Map printouts of Dead Drop locations in New York"></a></p>
<p>Eyebeam was the first stop. Here’s the drop:</p>
<p><a href="https://frontiernerds.com/assets/dead-drops1.BN2pWwlx_PeIpa.webp"><img src="https://frontiernerds.com/assets/dead-drops1.BN2pWwlx_PeIpa.webp" alt="A USB key cemented into a wall at Eyebeam"></a></p>
<p>Plugging in was a bit of an anticlimax — nothing happened:</p>
<p><a href="https://frontiernerds.com/assets/dead-drops2.BeIDu_PW_Z9LTE1.webp"><img src="https://frontiernerds.com/assets/dead-drops2.BeIDu_PW_Z9LTE1.webp" alt="Testing the dead drop at eyebeam"></a></p>
<p>The drive refused to show up on my desktop, and poking around with Disk Utility suggested that the issue went beyond a botched format or a corrupt partition. I emailed Aram about it, and he confirmed that it’s gone down, attributing the loss to rain.</p>
<p>The Union Square subway drop also <a href="https://deaddrops.com/r-i-p-union-drop/">went down earlier today</a>, possibly the result of vandalism. So it goes.</p>
<p>That leaves three more drops to explore; I’ll update this post as I make the rounds.</p>
<p>Even though my attempt to use the drop was a failure (thus far, at least), the concept stands on its own and presents an interesting counter-scenario to the ethereal mesh-networks I’m proposing for the post-apocalyptic pirate internet.</p>
<p>Let’s consider some dead drop novelties:</p>
<ul>
<li>There is no electrical network — feet and subway trains do the work of shuffling packets around the city.</li>
<li>There are no loops — a drop is a dead end.</li>
<li>There is no censorship — there is no CAPTCHA or filter or firewall. Maintaining a standard of content is perpetually up to the next person to visit the drop.</li>
<li>There is no permission — anyone can delete or create any file on the drop indiscriminately.</li>
<li>There is no privacy — though the files on a drop may be of anonymous origin, you must be physically present at the drop’s point of installation to exchange data.</li>
<li>There is no protocol — where most networks impose a structure for communication (TCP / IP, UDP, OSC, MIDI, AT), the drop’s only structure is derived from the file system.</li>
</ul>]]></content>
        <category label="eyebeam" term="eyebeam"/>
        <category label="data" term="data"/>
        <category label="apocalypse" term="apocalypse"/>
        <published>2010-11-16T12:04:07.000Z</published>
    </entry>
    <entry>
        <title type="html"><![CDATA[Big Screens is Coming]]></title>
        <id>https://frontiernerds.com/big-screens-is-coming</id>
        <link href="https://frontiernerds.com/big-screens-is-coming"/>
        <updated>2010-11-16T02:40:59.000Z</updated>
        <content type="html"><![CDATA[<p><a href="https://frontiernerds.com/assets/bigscreens_invite.D04CdmUz_16mGmj.webp"><img src="https://frontiernerds.com/assets/bigscreens_invite.D04CdmUz_16mGmj.webp" alt="ITP Big Screens event promo image"></a></p>
<p>It’s coming up… December 3rd at the IAC building. Project descriptions and additional info <a href="https://web.archive.org/web/20120428033801/http://itp.nyu.edu/bigscreens2010/">here</a>.</p>]]></content>
        <published>2010-11-16T02:40:59.000Z</published>
    </entry>
</feed>