on NTS Radio

A few weeks ago, NTS Radio asked me to put together a two-hour set of music and audio from the site. It’s something I’ve been meaning to do for years anyway, so I decided to go for it! As I expected, it was kind of nerve-wracking editing things down into a “finished” mix, but it was also a lot of fun to build a massive multi-track project out of so many layers of sound and music. I’ll embed the Mixcloud archive of the complete set once it’s posted, but for now I just wanted to share the tracklist and a few shout-outs.


  1. Tone Color – Drone [00:00]
  2. Chris Zabriskie – I Am a Man Who Will Fight for Your Honor [02:11]
  3. Christopher Sisk – Seeing Things Anew [13:16]
  4. EVAC – The Nothing [Buried in Time] [18:44]
  5. Håkan Olsson – Ambient1 [22:02]
  6. Lee Rosevere – Gilding the Lily [24:33]
  7. Limited – Dance of the Weedy Sea Dragon [26:25]
  8. Andrew Johnson – Choral Bloom [29:25]
  9. Christopher Sisk – Watching Clouds Form [39:06]
  10. Erik Tokle – The Good Life Most Will Never Find [44:15]
  11. Diamond Sutra – Singing Bowls & Rain [51:54]
  12. Conet Project [57:47]
  13. James Barber – On These Illimitable Rivers You Will Go Far [58:59]
  14. Ricky Graham – Axon (Reprise) [01:01:16]
  15. Norman Fairbanks – Counterfeit 2 [01:05:28]
  16. Bash Nova – Following You Home [01:18:49]
  17. j. williams friedman – An Ice Cream Truck Decorated To Look Like It’s From Tron [01:26:00]
  18. Cordovan Music – Anemone [01:41:00]
  19. Jani R – Spring Flowing In [01:44:35]
  20. Ann Deveria – No tan lejos [01:52:17]

And the two spoken word bits towards the end are from Krishnamurti and Aldous Huxley.

  • SomaFM – As I mentioned when I first posted about the project on MetaFilter, I came up with the idea while listening to the police scanner, and obviously I love their Drone Zone & Mission Control channels as well!
  • Broadcastify – Similarly, these guys have done an amazing job of building a community of scanner enthusiasts and organizing thousands of streaming audio feeds in one place.
  • SoundCloud – Two of the biggest challenges when running a web radio station are music licensing & streaming bandwidth, but thanks to SoundCloud and Creative Commons I didn’t have to deal with either one!
  • The 100+ artists whose music I’ve used on the site. I wish I could have included more in this mix, but it was only two hours long! Next time? =)
  • Hugh Mandeville – Hugh volunteered a ton of time and effort in building the most recent version of the site & also set up the BYO section where you can make your own mixes.
  • Scanner (aka Robin Rimbaud) has been doing music + phone/radio performances since the 90s!
  • Soul Coughing – I named the site after the “It is 5AM, and you are listening to Los Angeles” bit in their song, “Screenwriter’s Blues”.

If you want to get in touch with me about the site, contributing music, or anything else, the easiest way to do it is on Twitter. Or you can email me, but it may take me longer to reply that way.

Thanks for listening!

Best of 2k12

Well 2013 is almost over, so I guess I might as well post my annual “Best New Music” playlist… for last year… ¬¬

Here’s the direct download link for the individual MP3 files. It’s about 114 MB in total. I’ll try to get this year’s list posted in a much more timely fashion. Stay tuned!

Eternal September

Everyone who’s spent more than a few months “on the internet” knows (hopefully) that there are certain behavioral norms & expectations that ought to be observed in order to keep things civil. For example… DON’T TYPE IN ALL CAPS BECAUSE IT WILL LOOK LIKE YOU’RE YELLING or “It’s impossible to prevent this indefinitely, but let’s do our best to keep online discussions from getting to the point where both sides are comparing each other to Hitler for as long as possible.”

These aren’t “laws” per se, because (excuse my capitalization, but I really do want to yell this) NOBODY OWNS THE INTERNET, but they’re almost as universal as any law could ever hope to be. However, if you’ll refer back to my first sentence, please notice that I’ve qualified things by excluding those with “less than a few months” of online experience from being expected to know all these ins and outs.

Seems kind of preposterous today, since kids are now being exposed to the internet as soon as their parents realize that iPhones are more effective pacifiers than anything else yet devised, but try (if you can) to recall a time when this wasn’t the case. It was only 10 or 15 years ago. When did you first encounter email? Instant messaging? Chat rooms? Forums? Was it in high school? College? Jr. High? Until 1993 most people weren’t exposed to any of these until they got to college.

Every September, a new crop of freshmen were turned loose on the internet (or, as it was known in those days, “The Information Superhighway”) and left to fend for themselves amongst a puzzling variety of obscure protocols and acronyms that were only increasing in number. FTP, WWW, WAIS, Gopher, USENET, TCP/IP, etc. But at least they were all in it together. Once a few people clicked “Reply All” and alerted the entire Psych 101 class to their embarrassing weekend exploits, the word got around, and by Thanksgiving most everyone had come to understand the basic “DOs and DON’Ts” of online etiquette. Things had returned to a sort of equilibrium.

So what happened in 1993? Well, as those of you who were too young to be in college at that point (myself included) will recall, it was then that our friends at America On-Line realized that their thriving floppy-disk distribution racket might be aided by including access to this “internet thing” along with your paid, dial-up subscription. And so they began allowing their members to browse not just the walled garden of AOL content, but also the internet itself: web, newsgroups & all.

Every day, more people downloaded the latest AOL system update (do you remember how long it took via 28.8 modem?) and decided to try clicking on the “Newsgroups” link that appeared on their next login. Every day AOL sent out more floppy disks & CD-ROMs. Every day more people joined the service. High-school students, 10-year-olds, cat ladies, construction workers, aging musicians and tax collectors, disgruntled postal workers, you name it. Do you see where I’m going with this?

Post-1993, there could be no equilibrium. Probably there never really was, but in any case… things changed. While some traditions were preserved (I.E. NOT USING CAPS LOCK) many others were forgotten or permanently altered by the influx of new users. Books & CDs became available for purchase. Magazines & newspapers could be read online. High-school & college kids started publishing their own websites & using the internet in ways that its previous inhabitants could never have predicted. Many long-time internet users recognized this sea change & called it “Eternal September”.

But, and this is the important part, this is *exactly* what was supposed to happen. The internet in general, and the web in particular, were designed from the ground up so that it was as easy to publish and share content as it was to consume it. The very first web browser, created by Tim Berners-Lee at CERN in Switzerland, was *also* a web editor. The web is a LEGO set and while it’s fun to follow the instructions and apply all the LucasFilm-licensed decals, it’s even *more* fun to stick Chewbacca’s head on Darth Vader’s body and put him inside a rainbow-colored jail surrounded by strange mutant trees.

A person who fell into a coma in 1965 and awoke in 1985 would probably have few problems understanding how the television in their hospital room worked. You turn it on, change the channels and wait for the commercials to end so that you can get back to the good stuff. But a person who missed out on even 4 or 5 years of the internet’s development would be significantly more confused. YouTube was founded in 2005. Facebook opened to non-academic users in 2006.

How did you first find out about Napster? BitTorrent? Yelp? MetaFilter? Wikipedia? Amazon? Google!? Chances are it wasn’t in a magazine or on a TV news report. Probably a friend or family member said “You gotta check this out.” And it turned out they were right. Things kept changing. You listened to music you’d never heard about before, read newspapers from other countries, bought a digital camera and started posting your photos online. People in Israel thought they were cool. Things kept changing.

Sometimes you weren’t sure if you *liked* the way things were changing. Your boss started reading your blog. All the photos you took with your ex kept showing up on Facebook. People started downloading the movies that your studio produced without paying for them. Was there anything that could be done to stop this? Who was in charge around here?!

Here’s the thing. You are in charge. I am in charge. We are all in charge of it together, but it is owned by none of us. In the absence of any (real) governing body, we have all managed to find a way to co-exist and use the internet together in (relative) peace and harmony. How? Why? Because on some individual level, each of us has made a decision that goes something like this:

The usefulness of being an accepted part of this community outweighs the risks or negative consequences of choosing to participate.

Of course… it’s not entirely that simple. Some people are forced to participate against their will (“Hey dude, don’t tag me in that Facebook photo”) or make decisions that they will later regret (“No, actually. In spite of those pictures of me smoking weed and wearing a Megadeth Rules t-shirt taken 12 years ago, I think I would make a great addition to the teaching staff here at Pleasantville High School!”) But, just as we were on those pre-1993 Septembers of old, all of us are in this together, and by the time “Thanksgiving” rolls around I think it will all start making a lot more sense. After all, as painful as it may be in the moment, humans are unparalleled in their ability to adapt to change.

However, there are certain institutions and entities that are not so adaptable. For instance, corporations, banks and governments. These have all been organized (at least recently) on the basis of predictable inputs and outcomes. Entire books have been written on the subject of their inflexibility, and whether anything can be done about it. It seems to me that we are starting to find out.

About 7 or 8 years ago I read an article in this weird Canadian technology magazine that ended up going out of business several weeks later. The article was about a bunch of the same things that I’ve talked about here and there was one bit in it that always stuck with me. I lost track of the magazine for years, and despite a number of attempts to track it down online, it wasn’t until recently that I found it in an old box of books and finally got a chance to re-read it. I’ll just paste it in here so you can see for yourself, copyright be damned…

And what the heck, I’m in a charitable mood so let me fire up the old scanner and share the rest of the article with you as well.

Last Thoughts on Steve Jobs

Technology is a morally neutral or ambiguous force, and this is nowhere as obvious as in the history of modern, digital computers. The earliest such computers were used to calculate artillery flight paths, develop the hydrogen bomb & create actuarial tables that contributed to the massive enrichment of insurance companies & other corporate entities in the second half of the twentieth century. Prior to this, the mechanical punch-card computers from International Business Machines had served largely the same function & throughout the 1950s it seemed that the advent of digital computers would only multiply the existing competitive advantage held by their wealthy owners.

This arrangement was not entirely without benefit to the average citizens of the United States & other industrialized nations. Standards of living rose as the efficiency gains created by computers were allowed to “trickle down” from producers to consumers in the form of lower prices for manufactured goods, cheaper credit & a variety of other ways I won’t get into here.

However, the new “freedoms” afforded by this process were of a particularly limited & insidious kind. Those who wished to share in them were required to become willing participants in their own systemization. The ways in which this systemization was made palatable to people who had spent the better part of the previous 200 years fighting against monarchistic control are rather astonishing, and have been thoroughly documented in works such as 1984 & Brave New World, as well as the documentary films of Adam Curtis and Errol Morris. While there was opposition to this increasing centralization of knowledge & control, the massive cost of computer hardware made it seem that these “means of (electronic) production” were no more likely to be owned by their true end users than were the mines & factories of 19th-century Europe.

However, throughout the 1960s computers were getting smaller & cheaper at an almost incredible rate. It was becoming increasingly obvious that in a matter of years, anyone would be able to own a computer thousands of times more powerful than those currently being used by the largest corporations. But why would they want to?

Early minicomputers were purchased by universities & eventually microcomputers found their way into the hands of individual hobbyists, but throughout the 1970s, they never really escaped from the ghetto of massive nerdiness. Despite their ever-decreasing cost, there was still little reason for most people to own a computer, and those corporations & governments who had already benefited greatly from their use had little (if any) desire for this situation to change.

But Steve Jobs was among the shrewdest of the nerds. He emerged, serpent-like, from the tree of technology, offering us an Apple. We saw that it was good and ate of it, planting the seeds so that more would grow. We shared the seeds with our friends, parents, children, siblings, colleagues and others until the trees grew in every yard and the fruits were carried in every pocket.

But even as Steve’s success has proven the validity of his ideas, there remains much work to be done. As the slumbering giants of mass media, centralized government and academic institutions awaken to their own declining influence, the reactions have been all too predictable.

But the die is cast; the tools are now widely available. It’s time to build something new. And though “it’s only my opinion, I may be right or wrong”, I think it’s going to be something wonderful.


Some guys took apart a new Apple TV & found out that it’s got 8 GB of Flash memory (plus space for more) inside. Since the device doesn’t actually allow users to store any audio or video content locally, this would seem to suggest that Apple is planning to open the device up to iPhone-style apps at some point. The implications of this should already be scaring the crap out of video game console manufacturers like Sony, Nintendo & Microsoft, but I suspect they don’t see the Apple TV’s “iPad guts” as a real competitor for their current or next-gen systems.

However, I wonder if they’ve also considered the possibility that Apple could develop (read: acquire) some server technology (i.e. OnLive) that would allow users to stream games of unlimited size and complexity in full HD resolution from a central location, enabling AAA game developers to completely bypass issues of piracy, physical distribution & hardware limitations via a $99 box that never needs upgrading & could in all likelihood be given away free to end users with an online-service subscription. Most of the moving parts are already in place here (user accounts & billing through iTunes, stored value cards at nationwide retail, relationships with game publishers & developers, and now the set-top hardware) so Apple really just needs to wait for broadband internet speeds to reach an appropriate level, buy some big servers & flip the switch.

Of course all of this assumes that they can get their head out of their ass about building a real online service with matchmaking etc., but maybe they’ll learn a thing or two from this whole Ping business and surprise us yet. And if not, it would be pretty damn hilarious to see Microsoft come chugging along & knock down the whole iTunes house of cards by building the exact same thing on top of the existing Xbox platform.

tl;dr Consider buying $AAPL

Anarchy in the USA

Commenter “AL” in a Gapers Block thread on Critical Mass in Chicago writes:

It is an outright statement of differentiating values – expressed not through free speech, not through permitted demonstration – but in guerilla fashion with absolutely no accountability. The very organization of CM is setup in distributed cellular fashion so as to obviate the possibility of anyone being held to account for the misdeeds of those participating.

BINGO! To me it’s the “distributed cellular” setup of CM that makes it so powerful, but also so controversial. As Americans, and members of democratic society at large, we’re all brought up believing in the idea that there is a “system” in place and whether we’re for it or against it, its existence cannot be questioned. The “system” allows us to transfer accountability, both good and bad, to symbolic entities and figureheads. Hate the Tea Party? Blame Glenn Beck. Love your iPhone? Thank god for Steve Jobs!

But what’s lost in this process is the understanding that all organizations are composed of individuals acting out of their own free will. Probably the reason we choose to ignore this is that it’s really fucking complicated (and often scary) to deal with thousands of distinct individuals, as opposed to a centralized organization. Consider the record industry trying to prevent piracy by suing individual downloaders or the DEA trying to fight drugs by jailing end users. If all these folks belonged to some kind of organization, “United Drug Users” or “Local Downloaders 451”, then the solution would be much easier!

Similarly, if the CPD could simply call up the “President of Critical Mass” and ask him to tell CM riders to obey stop lights, then I’m sure the antagonism between motorists & cyclists would be greatly reduced. But because this isn’t possible, the two “groups” are forced to confront each other as individuals. It’s not something that we’re terribly good at (hence the screaming) but it’s an absolutely essential part of being human.

tl;dr CM is divisive because it represents anarchy and disorder, which can be either incredibly empowering or frightening, depending on your personality & perspective.

(FYI, I could go on for days about the *other* socio-political implications of Critical Mass, but I think this is the crux of it…)

The World’s Greatest

Johnny 5 needs more input

Kevin Kelly knows a thing or two about good journalism, so when I saw on his blog that he had posted a list of “The Best Magazine Articles Ever” (as nominated by the readers of said blog) I knew that I was in for a treat! Several of the articles I had read before, but quite a few were ones that I had been meaning to get to for a while & so I decided to take advantage of the rare summer-y weather in San Francisco this weekend to get outside & read them on my nook (via instapaper!)

Two articles that I found particularly awesome were this one:

“As We May Think” by Vannevar Bush (1945)

And this one:

“Space War: Fanatic Life and Symbolic Death Among Computer Bums” by Stewart Brand (1972)

Because despite being written 65 (!!!) and 38 years ago, they have both not only proven to be completely prophetic about the development of the internet and the videogame industry to date, but they also serve as a reminder of how much important work remains to be done in both fields!

But enough of my jibber jabber… Go read some stuff!

ps. OK, a little more jibber jabber… If you’re looking for suggestions on where to go after the previously mentioned articles I’d also highly recommend David Foster Wallace’s “Consider the Lobster”, Michael Lewis’s “The End”, Bill Joy’s “Why the Future Doesn’t Need Us” and Atul Gawande’s “Letting Go”. And if you find any other awesome ones, let me know in the comments!

(“Johnny 5″ Photo by liquidnight on Flickr)

An Open Letter to Roger Ebert on the subject of Games as Art

Mr. Ebert, for years now, you have been making the case that video games, the most popular entertainment medium of the 21st century, are not “art”, while film, the most popular entertainment medium of the 20th century, is. Subjectively, we all can (and will) argue this point until the world runs out of oil and both these media are rendered obsolete, but in the meantime let’s approach the issue quantitatively & see where it leads.
First of all, let me give you a bit of background about myself. I was a college film student who ended up working in the game industry for several years before making the jump into tech & marketing and so I hope I can speak on these matters from a position of some experience. I’m also a lover of drama, music, comedy, visual art and literature & while I don’t usually enjoy perpetuating stereotypes, one that I feel contains a grain of truth is that of the “tortured” artist. This is not to say that all artists are Van Gogh-esque nut-jobs, driven to self-mutilation by failed love affairs & the chemicals in their paints, but I *do* believe that we all make choices in life & I think no one would argue that a career in “the arts” is the most profitable or stable one that a talented and intelligent young person could choose.
In other words, people who choose to make a living from their artistic endeavors, whether filmic, dramatic, poetic or etc., are aware of the more lucrative alternatives (lawyer / doctor / social-media expert / etc.) and have instead decided to “suffer” for their art, because they value it more highly than material wealth or social status. Are we agreed upon this point, Mr. Ebert? I hope so, or this next bit is going to be a bit tedious & unconvincing.
I have before me 2 documents. One is a “2009 Salary Guide” from a freelance staffing agency that works with clients in the advertising & technology fields. The other is the “9th Annual Salary Survey” as compiled by Game Developer Magazine for their April 2010 issue. While neither of these documents can be considered definitive, I think comparing the two can offer some useful insights on the “game business”, as distinct from the broader “technology” industry, and can also support some of the hearsay and observations that I’ve collected during my time in both.
Since the nomenclature used by the gaming industry borrows (interestingly enough) more heavily from film than from technology, it will first be necessary to establish some valid correlations between job titles in the two groups. Let’s start with Project Managers (or as they’re known in games and film, “Producers”). According to the “2009 Salary Guide” provided by the staffing agency, an “Interactive Project Manager” can expect to earn a starting salary between $61,250 and $88,250 (for an average of $74,750) at the national level. Regional variances can affect these figures, with Interactive PMs in San Francisco earning between $82,687 and $119,137 to start, while those in Columbus, Ohio will fall between $58,187 and $83,837.
Moving on to the 2009 “Salary Survey” from Game Developer magazine, it’s reported that a “Producer” with up to 3 years experience will earn an average salary of only $42,000 for work that is (in my personal experience) largely indistinguishable from that of his or her “Interactive Project Manager” counterpart in tech/marketing. The same regional variances exist in the GDM survey, but comparing the national average salaries across the 2 industries, we can see that the games producer earns a full $32,750 less per year!
Now that this methodology has been established, let’s look at some other positions. On the more technical side of things, we can compare a “Programmer/Engineer” from the games industry with a “Flash Developer” on the tech/marketing side. As an aside, the game “Programmer” would probably consider this a rather insulting comparison, but “Flash Developer” is the best match I’ve got in the “Salary Guide”. In any case, the video game “Programmer” (with up to 3 years experience) earns an average of $54,975, while the tech/marketing “Flash Developer” starts at $75,625, a difference of $20,650.
Next we come to the “artists” themselves, who work mainly with 3D modeling & animation tools these days. In the games industry the average salary for a “3D Artist” starts at $45,200, while in tech & marketing this figure is $64,375, for a difference of $19,175.
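The arithmetic behind these comparisons is easy to double-check; here’s a quick sketch in Python that tallies the three games-versus-tech gaps using only the average figures already quoted above:

```python
# Average starting salaries quoted above, per role: (games, tech/marketing)
salaries = {
    "Producer / Project Manager": (42_000, 74_750),
    "Programmer / Flash Developer": (54_975, 75_625),
    "3D Artist": (45_200, 64_375),
}

for role, (games, tech) in salaries.items():
    gap = tech - games
    print(f"{role}: games ${games:,} vs. tech ${tech:,} -> ${gap:,} less per year")
```

The printed gaps match the $32,750, $20,650 and $19,175 differences cited in the paragraphs above.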
And let’s not even talk about QA (Quality Assurance) or Strategy & Business Development.
So what’s my point in rattling off all these statistics? It’s this… My friends in the gaming industry are some of the smartest, most talented, industrious and sensible people I know & it is safe to assume they are all fully aware that more lucrative career options exist in other fields. Despite this, they have *chosen* to work in game development for some pretty damn good reasons. Chief among these, and the one which makes their blood boil at your remarks, Mr. Ebert, is that they, along with millions of people around the world, all share a deep-seated belief that the work they are doing has value beyond the purely commercial, that the experiences created over late nights of pizza and Dr. Pepper are reaching, and having an emotional impact on, a massive audience, and that they are all doing their part to help us understand the meaning & possibilities of life on this planet, and beyond, over the next 100 years.
Surely this effort is worth a few thousand dollars a year, and *surely* it is deserving of some artistic recognition, wouldn’t you agree?
Thank you for your time,
Eric Eberhardt
P.S. On the outside chance that you actually read this *and* get to the end, I’d like to take the opportunity to second the recommendation of everyone on the internet who has commented on your previous blogs to suggest that you take 5 minutes and play Jason Rohrer’s “Passage” for Mac/PC/Linux and I will also add that the 2008 Wii title “de Blob” (originally conceived by students from the Utrecht School of the Arts) is also worth a look. Finally, you should, if you are not already, absolutely be aware of Keita Takahashi’s 2004 classic “Katamari Damacy” for PS2 & Steve Meretzky’s “A Mind Forever Voyaging” from 1985.