December 2010 Archives

Confident and Terrified

By Michael Mulvey on December 29, 2010 12:20 PM

I don't know much about Richard Saul Wurman (the founder of TED), but I was nosing through Warren Berger's site and found a post with some great quotes.

On confident and terrified:

I am both confident and terrified all the time. These are two emotions you're not supposed to have. If you're terrified, you're called a scaredy cat. And if you're confident, you're called arrogant. But both of them working at the same time in parallel allows you to get at ideas and puts the edginess on your solutions. The terror of not knowing is where you begin, and you move backwards toward zero to find how to begin. And confidence allows you to begin. If those two emotions are out of balance, you're not such a good designer.

On learning:

My definition of learning is as follows: Learning is remembering what you're interested in. Think about that. If you don't remember it, then you haven't learned it. You may take a course during your schooling, and might do well in it, but you don't remember what was taught. On the other hand, you may have also taken courses you didn't do as well in--but you were interested in the subject, and you remember everything about it. I think interest, absolutely, goes hand in hand with learning.

On why he started TED:

The goal in starting TED was not to bring people together--who cares about that. I wanted other people to pay so that I could listen to interesting people talk. I sat on the stage the whole time and they talked to me. The goal was to take myself from not knowing to knowing, again and again--so that I could have that experience for as much time as I can throughout my life.

design = emotion

By Michael Mulvey on December 28, 2010 2:24 PM

John Gruber weighs in on the iPhone-versus-Android debate that flared up over the holiday. He touches on key points including the 'race to the bottom' on the cost of Android phones, how this isn't '1995 all over again' and the role emotion plays in Apple products, to name a few.

I love this piece on features versus emotion in mobile computing:

Ignore for the moment whether it's true that "Android can go feature by feature against iPhone now". I'd dispute it, but just concede it for now. Ignore also that the best Android phones, like the Nexus S, cost over $500 unsubsidized. What Gray is missing is that emotion counts. Mobile computing is not an entirely rational market. Emotion is a huge factor when people choose what to buy -- I'd say maybe even the biggest one. Apple understands this. All iOS devices -- all Apple devices, for that matter -- are designed with the emotional experience in mind. Why does almost everything in iOS animate? Why did Apple create CoreAnimation, and base UIKit app development so heavily upon it? Because animation, even in small unobtrusive doses, has an emotional affect. It results in a feeling.

A must-read if you're a designer or anyone involved in mobile computing and software.

poetry in destruction

By Michael Mulvey on December 28, 2010 1:23 PM


Hans Herrmann crashing beautifully at the 1959 German Grand Prix (via Good Old Valves).

Photo Editing

By Michael Mulvey on December 27, 2010 11:15 AM


Before and after using Instagram

Photo editing, photo retouching, photo enhancement, photo manipulation - all of these terms are correct, depending on the objectives of the individual or group publishing the photos. The truth is, photo editing has existed ever since Louis Jacques Daguerre developed his photographic process in the 1830s. There are proponents and opponents of it, just as there are in writing or music.

Some photographers believe all settings for focus, contrast, exposure and cropping should be done at the moment of capture. How that photo comes out is how it was meant to be.

Then there are others who believe, as in writing or music, that it's all about the editing. No, editing isn't alchemy. You can't turn a shit photo into something award-winning, but great editing can't hurt it, either. Editing presets, or 'pre-baked' editing settings, might not address every nuance in a photo, but if you pick the right one, they can certainly help.

Which leads me to Instagram. I think it's a great tool. My brother thinks it's the equivalent of Auto-Tune in music: a cheap parlor trick. I say we're both right - it just depends on who's using it. The more educated and experienced you are in a particular art, the more sensitive you are to it. When you're educated you can tell Bad from Decent, and Decent from Great. No amount of editing can make an amateur photographer into a Diane Arbus or an amateur author into a Kurt Vonnegut.

For the inexperienced, Instagram can resolve Photography 101 issues like contrast and color. What it can't do is tell someone whether their chosen crop is good or not. It also can't tell someone they've picked a filter that doesn't work with their photo.

Instagram, for instance, can never provide the level of detail in editing that Richard Avedon aimed for on his photos (via):


For me, Instagram is a way to quickly edit the photos on my iPhone with an effect I feel is appropriate for the shot and a crop that looks right. If the accuracy of Avedon's editing is 100%, then perhaps Instagram's can range anywhere from a 5-50% improvement over the original.

It's a fun tool and I use it as such.

Galaxie 500

By Michael Mulvey on December 27, 2010 11:07 AM


Galaxie 500

a block of wood under the brakes

By Michael Mulvey on December 23, 2010 1:45 PM

Given how many double-shot lattes I drink, I should know more about how caffeine works. Lifehacker discusses the book, Buzz: The Science and Lore of Alcohol and Caffeine:

More important than just fitting in, though, caffeine actually binds to those receptors in efficient fashion, but doesn't activate them--they're plugged up by caffeine's unique shape and chemical makeup. With those receptors blocked, the brain's own stimulants, dopamine and glutamate, can do their work more freely--"Like taking the chaperones out of a high school dance," Braun writes in an email. In the book, he ultimately likens caffeine's powers to "putting a block of wood under one of the brain's primary brake pedals."

I love how nicely car metaphors lend themselves to various subjects.

It looks like caffeine isn't necessarily helping me as someone in the creative field:

The general consensus on caffeine studies shows that it can enhance work output, but mainly in certain types of work. For tired people who are doing work that's relatively straightforward, that doesn't require lots of subtle or abstract thinking, coffee has been shown to help increase output and quality. Caffeine has also been seen to improve memory creation and retention when it comes to "declarative memory," the kind students use to remember lists or answers to exam questions.

Found via PSFK


By Michael Mulvey on December 23, 2010 1:37 PM


Via Good Old Valves

The Home Button Is Fine

By Michael Mulvey on December 22, 2010 11:55 AM

Aza Raskin recently wrote up an observation about iPhone users:

If you sit and watch people use an iPhone there's a mistake made often and reliably: They hit the home button when they mean to just go back to the app's main screen. Going home has heavy consequences--to recover you've got to find that app again, sit through its splash screen, and fiddle the app to where it was before. The home button is the grunt-and-touch control of physical affordances. While iconically simple, the one bit of information it lets you indicate is too little.

He suggests a solution:

Camera shutter buttons have a two-stop action. Half-press them to lock focus and aperture settings, fully press them to take the picture. There's a delightful tactile indent at the half-way mark so that your fingers know what's going on. Let's borrow this two-stop action for the home button. Press half-way to go to the app's main screen, all the way to go to the phone's main screen. If you need to fully escape mash the button. If you just want to head back to the main-screen of the app, tap lightly. You can easily convert a light-press into a heavy-press mid-action. It's as naturally a mapping as you are going to get.

A two-stop Home button is an interesting idea, but it's not the solution. An OS-level *crutch* for an application-level usability problem could certainly be implemented, but it wouldn't address the problem at its origin - the application.

While I'm an interactive designer and not an average user, I'd still like to go through a few anecdotal examples to show that a two-stop Home button is not only unnecessary, but doesn't map to every application UI flow.

First off, here are my top 5 most used 3rd party iPhone applications and games:

Instapaper
Angry Birds
Twitter
Facebook
Instagram

Few of my Top 5 applications have what I consider a true Home Screen. I define a Home Screen as a screen that provides the main access point to all core functions. A list view of content or a feed is not a Home Screen. Neither is a Splash Screen; a Splash Screen is less useful than a Home Screen. In general, the levels of hierarchy within iPhone applications are so few that we rarely need a Home Screen or a Main Menu.

Below are screenshots from application entry point down to subsequent sub/detail views:

Instapaper: 1) Category List 2) Read Later List 3) Article Body

Angry Birds: 1) Splash Screen 2) Worlds 3) Levels within a World 4) Playing a Level

Twitter: 1) Default Feed View 2) Single Tweet View 3) User Profile View

Facebook: 1) Home Screen 2) News Feed View 3) Single Feed Item

Instagram: 1) Feed View 2) Share - Step 1 3) Share - Step 2 4) Share - Final Step

Of all these applications, Facebook is the only application with a true Home Screen. It provides global access to all the core areas of Facebook. A two-stop Home button would work, but there's a Home button staring you right in the face in the top left corner of the screen.

Instapaper and Twitter both have Main Views, which aren't quite the same thing as Home Screens. In Instapaper, your Main View is the Read Later list; rarely do you need to go back to the main Category screen. So if I were implementing two-stop Home button functionality, what would the Home Screen be? I would argue that on a day-to-day basis the Read Later screen is the most widely used.

The only screen in Instagram that could act as a Home Screen is the first button in the bottom Mode menu - Feed - but one could argue Popular and News could just as easily function as such. And as far as getting lost within the levels of navigation in the photo selection process: if the back buttons aren't clear enough for you, then neither is a two-stop Home button.

As for Angry Birds, you're either playing a level, selecting a level within a group, or choosing a level group. Outside of those 3 screens you have the Splash Screen, which is useless once you've set your gameplay preferences.

A two-stop Home button is an interesting idea, but not a practical one. For it to work optimally across all applications and games, it would require developers to designate a view to function as a Home Screen, and that designation might differ for each developer. If users are getting lost down the rabbit hole that is your application, the solution is fixing the navigation in your application.

UPDATE: Looks like John Gruber agrees with me. It seems neither of us sees people hitting the Home button when they only mean to back out within an app:

I don't see people doing this. The half-press on a camera shutter serves an essential purpose. Creating a "half-press to go back to the current app's root level" iOS home button would serve a purpose, but I don't think it'd be worth the cost in additional complexity. Plus, it'd create a small exception to one of the key design tenets of iOS: when you're in an app, everything you can do in that app is done on-screen.

Christmas Tree

By Michael Mulvey on December 17, 2010 3:21 PM


Still addicted to Instagram.

Design is Systems, Not Things

By Michael Mulvey on December 17, 2010 11:15 AM

When you buy a house, you're not just buying a thing with a roof - you're moving into a community. And if you're smart, you take everything into account: schools, environmental aesthetics, people, distance from your job, price, safety, activities. The same can be said for people who use iTunes.

They're not just buying media files, they're investing in an ecosystem that's more than the sum of its parts. It's not just the MP3 files they bought. Whether consciously or subconsciously, everything is factored in: 'distance' to my iPod/iPhone, user interface aesthetics, accessibility, features, prices, educational content (iTunes U podcasts).

The lack of an ecosystem is why Amazon isn't denting iTunes' dominance.

From the Wall Street Journal:

Despite its cut-throat pricing, Amazon has made little headway against Apple, which closely ties its iTunes software to its iPods and other gadgets. Amazon heavily markets its Kindle e-reader with TV commercials, but its MP3 store has a lower profile--the company markets it, largely, through emails to customers and a Twitter account where it highlights deals.

Deep discounts aren't worthless, but they mean significantly less when your products are scattered like buckshot in a forest. Sure, there's a slight premium for media in the iTunes Store, but it's all there, well organized - and look, in the lefthand column, there's my iPhone, connected and ready to accept not just music but all media, including movie purchases and rentals.

Like anything, it's about a million little things that add up to a lot, just compare the interfaces:



Once you purchase your music from Amazon, you need to use the Amazon MP3 Downloader to download your tracks. Once they're on your hard drive, you then have to import them into iTunes or whatever media manager you use.

Not rocket science, but convoluted and confusing to the non-geeks.

Amazon MP3 is not a great community, so people don't want to move in.

Keep It Local

By Michael Mulvey on December 16, 2010 8:12 AM

Proponents of cloud computing tell us we'll never have to worry about the integrity and location of our files, cause, like, they're in The Cloud. The problem is, once we surrender all responsibility for our data to remote servers, we open the door to relinquishing any ownership of, and privacy over, that data.

Sure, I hate the phrase, but I appreciate the service cloud computing provides - in the right context. I love having my email and the Internet accessible from anywhere. But my personal documents? Photos? Music? What if I'm on the subway or anywhere else I can't get online?

I was inspired to write this post by Richard Stallman's thoughts on Google's Chrome OS and its reliance on cloud computing and web apps (via Daring Fireball):

But Stallman is unimpressed. "I think that marketers like "cloud computing" because it is devoid of substantive meaning. The term's meaning is not substance, it's an attitude: 'Let any Tom, Dick and Harry hold your data, let any Tom, Dick and Harry do your computing for you (and control it).' Perhaps the term 'careless computing' would suit it better."

He sees a creeping problem: "I suppose many people will continue moving towards careless computing, because there's a sucker born every minute. The US government may try to encourage people to place their data where the US government can seize it without showing them a search warrant, rather than in their own property. However, as long as enough of us continue keeping our data under our own control, we can still do so. And we had better do so, or the option may disappear."

Sure, blindly uploading your data without knowing the integrity of the servers (and the company) might be careless, but there are just as many careless people who fill up hard drives to the point of disk failure.

I'd actually say this potential catastrophe is carelessness on the part of the developers who built your OS, less so the end user. If we're going to allow people to fill up their hard drives, we need to emphasize the danger they're in, just as a fuel gauge does in a car.

Every month of every year our media devices acquire more megapixels, requiring more disk space on our memory cards and hard drives. Many of us (both professionals and amateurs) have been shooting digital photos and videos for over 10 years, and even on laptops with 100 GB of disk space we don't have enough room.

So with great amounts of digital media comes great responsibility. Redundancy is key, but keeping it all in the cloud? I'm not so sure.

The point isn't to avoid cloud computing. I couldn't live without services like Gmail and Dropbox, but always ask yourself if you truly own your content - and by own I mean have complete control over it.

No Portfolio? No Excuses.

By Michael Mulvey on December 14, 2010 8:51 AM

So you're an unemployed designer or maybe you're lucky enough to have just graduated into this wonderful economy we're in. Whatever your situation, there's no excuse for not having a portfolio with fresh work.

No clients? Make them up. Always wanted to make a poster for TRON? Do it. Think you have a better UI scheme than the iPhone's? Make a prototype.

If you want to see what I'm talking about, have a look at the work of Justin Van Genderen:




I know designing new-old posters for classic films is in fashion right now, but it's great creative exercise and a great way to flaunt your skills. You might even make some cash off it like Van Genderen does on Etsy.


By Michael Mulvey on December 14, 2010 7:56 AM

As much as I claim to love cars, there's still a lot I don't know about them.

I've just discovered Gruppo Bertone - the name behind some of the most beautifully styled cars in history.

The car that stands out the most in Bertone's portfolio is the Lamborghini Miura, designed by Marcello Gandini. The Miura is the granddaddy of supercars:


Other notables I love are (in order below) the BMW 3200 CS, Aston Martin DB4 and the curious Bertone Mustang:





By Michael Mulvey on December 13, 2010 5:26 PM

Modes Kill, Therefore I Don't Mind Killing CAPS LOCK

By Michael Mulvey on December 13, 2010 2:59 PM

Aza Raskin had a good post recently on the failure of visual feedback in interfaces and how quasi-modes are usually better than modes:

Caps Lock is a prime example of a mode that gives barely-worthwhile feedback: a small light on the keyboard glows when Caps Lock is engaged. It's the analog of the hypothetical visual feedback for the radio in Airplane. Of course, the keyboard is exactly where no touch-typist ever looks and, to make matters worse, the Caps Lock light can be unhelpfully unlabeled:


The Caps Lock feedback is so easy to ignore that it just doesn't work. As often happens in computing, we get band-aid fixes instead of true fixes. For instance, Microsoft added a nicely non-modal message to the Windows login screen that reminds the user of Caps Lock's state. Even that doesn't work to prevent mode errors, as I always type my password before noticing the warning.

This folds in nicely with the nerdy uproar over Google removing the Caps Lock key on its new Chrome notebooks. The official reason from a rep was that Google wants to 'improve the quality of comments across the Web.' I'm not sure why there was backlash over this key's removal, but I welcome it. I never use it, and after reading Aza's post, I want that key gone!

We Are All Designers

By Michael Mulvey on December 13, 2010 2:25 PM

All men are designers. All that we do, almost all the time, is design, for design is basic to all human activity. The planning and patterning of any act towards a desired, foreseeable end constitutes the design process. Any attempt to separate design, to make it a thing-by-itself, works counter to the inherent value of design as the primary underlying matrix of life. Design is composing an epic poem, executing a mural, painting a masterpiece, writing a concerto. But design is also cleaning and reorganizing a desk drawer, pulling an impacted tooth, baking an apple pie, choosing sides for a back-lot baseball game, and educating a child.

-Victor Papanek, Design for the Real World: Human Ecology and Social Change

Thanks Dalematic

Del Toro - Disruptive Filmmaker

By Michael Mulvey on December 13, 2010 12:55 PM

From Deadline:

Guillermo del Toro has teamed with director Mathew Cullen, cinematographer Guillermo Navarro and executive producer Javier Jimenez to launch Mirada, a 25,000 square foot studio in Marina Del Rey. Del Toro, a prolific producer of projects beyond his own directing vehicles, said that he expects to use the venue as a home base and a way for him to embrace a transmedia future. Del Toro will run a lot of his projects through a facility that has the ability to handle everything from pre-visualization to animatics, story boarding and visual effects work. The studio opens for business today.

PSFK explains transmedia:

Transmedia is a way of developing a story or "expanded universe" among multiple platforms while also sometimes encouraging audience participation and interactivity. In this sense, a film only represents a single segment, along with graphic novels or video games, of a much larger world that could spawn several unrelated stories or characters. It has gained steam among independent filmmakers as a new way of innovating the medium to compete with the increasing dominance of digital and interactive art forms and media. It has recently begun expanding into mainstream film.

Asymmetric Competition

By Michael Mulvey on December 13, 2010 8:26 AM

The New York Times has an article on how media companies like Time Warner are getting angry as Netflix continues to drink their milkshakes.

Like any disruptive technology, the incumbent's first reaction is to kill the new innovative kid on the block rather than take notes and adapt.

Netflix is disrupting media distribution because they're taking advantage of asymmetric competition, a term I picked up from Horace Dediu's wonderful tech blog of the same (abbreviated) name, Asymco. They're able to distribute the content of big media companies over the web, sans hardware, for one monthly price. This from a company that until recently made most of its money mailing out little red envelopes.

The language these old media guys use is telling, like Jeffrey L. Bewkes, CEO of Time Warner, viewing Netflix as an army from a tiny country:

It's a little bit like, is the Albanian army going to take over the world? I don't think so.

It reminds me of Palm's former CEO Ed Colligan reacting in 2006 to the iPhone (2 months before it was introduced):

We've learned and struggled for a few years here figuring out how to make a decent phone, PC guys are not going to just figure this out. They're not going to just walk in.

As we know, Apple did just 'walk in' and ended up drinking almost 40% of all industry profits from mobile phones. It doesn't mean Netflix is guaranteed success, but it does mean we should continue to keep an eye on them.

This reaction to Netflix is why I think we're seeing business deals happen like Comcast's merger with NBC. In short, the distributors want to control the content. It's bullshit if you ask me. The Comcast deal potentially opens the door for more like it, further pulling the buying power from individuals.

I am offline

By Michael Mulvey on December 9, 2010 11:46 AM

Danah Boyd on email sabbaticals:

I am offline, taking a deeply needed break while traveling. During the duration of my break, no email will be received by my computer. All email sent to me during this period will be redirected to /dev/null (aka "the trash"). If you send me a message during this period, I will never receive it and never respond to it. If you need to contact me, please send your email after January 12. If it is urgent and you know how to reach my mother, I will be in touch with her every few days. But I am intentionally unreachable during this period. Please respect that a girl needs a break and this is mine.

Words to live by. (via Minimal Mac)
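(For the nerds: on a traditional Unix mail setup, the redirect boyd describes really is a one-liner. A hypothetical sketch - assuming a sendmail-style server that honors per-user `.forward` files, not necessarily her actual setup - would be a `~/.forward` file containing nothing but the path of the bit bucket:

```
/dev/null
```

Delete the file when the sabbatical ends and delivery resumes. Every message received in between is gone for good, which is exactly the point.)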

why developers do it

By Michael Mulvey on December 9, 2010 10:29 AM

Great post last month by Marco Arment (creator of my favorite app, Instapaper) on why developers don't rush to new platforms.

We're making iPhone software primarily for three reasons:

Dogfooding: We use iPhones ourselves.
Installed base: A ton of other people already have iPhones.
Profitability: There's potentially a lot of money in iPhone apps.

I agree 100% with everything he says in the post, but can we use another phrase besides *dogfooding*?

RIM PlayBook

By Michael Mulvey on December 9, 2010 8:52 AM


Boy Genius Report posted a hands-on video demo of the RIM PlayBook, their new tablet prototype. I say 'prototype' since this thing isn't on the market yet and is still very much half-baked.

The PlayBook looks well thought out, with clear influence from both Apple's iOS (in the icon arrangement and homescreen navigation) and HP's (Palm's) webOS (in the tile/card multitasking - you even swipe up to quit a program). Navigation is clear, transitions are smooth and production quality is tight.

What is yet to be shown is the ecosystem in which the PlayBook lives. The applications to make it more powerful. The software to transfer media easily. The integration of services - how email connects with photos, how maps connect with contacts.

And what of other devices? It would be a shame to put all this effort into the PlayBook without seeing the operating system extend to smartphones.

Steve Jobs confessed to Walt Mossberg at the D8 Conference this past June (at about the 36:30 mark) that they had originally started developing iOS for the iPad, but felt creating a multi-touch mobile phone was more important than a tablet. So they shelved the iPad and focused on the iPhone.

One would think this line of thinking would be equally, if not more, important to RIM, given the jeopardy their smartphone division is in right now with competition from Apple's iPhone and Google's Android.

It's like RIM is in a car race with Apple and Google and instead of fixing their car, they're in a hangar, creating a new airplane to compete with other airplanes.

our tubes are full, help us buy more tubes

By Michael Mulvey on December 8, 2010 12:21 PM

According to Bloomberg News:

Google Inc., Apple Inc., and Facebook Inc. need to pitch in to help pay for the billions of dollars of network investments needed for their bandwidth-hogging services, European phone operators say.

Do governments expect car manufacturers to help pay for roads, highways and bridges?

oh joy of joys

By Michael Mulvey on December 3, 2010 2:53 PM

The Joy of Stats, starting on BBC:


I already get Top Gear UK on BBC America; I hope this show airs as well.

cars are people too.

By Michael Mulvey on December 3, 2010 2:31 PM

We first experience the human side of cars with their faces. Some are smiling, some are cute, some are angry and some are dumb.

And some are sexy like Gisele Bündchen:


I'm not sure there's a sexier face on a car out there. Look at her.

via Motoriginal

emotional comprehension

By Michael Mulvey on December 3, 2010 1:06 PM

From an interview with Stanley Kubrick from 1969:

I think that 2001, like music, succeeds in short-circuiting the rigid surface cultural blocks that shackle our consciousness to narrowly limited areas of experience and is able to cut directly through to areas of emotional comprehension. In two hours and forty minutes of film there are only forty minutes of dialogue.

Play, boy.

By Michael Mulvey on December 3, 2010 10:23 AM


via bluntsboozebitches

Distracted By Design

By Michael Mulvey on December 1, 2010 10:18 AM

Last week the NYTimes published an article, Growing Up Digital, Wired for Distraction. In it Matt Richtel argues that today's youth is susceptible to far more distractions in the digital age than previous generations. The focal point of Richtel's article is Vishal Singh, a high school kid who loves to make films on his computer.

There is some truth to Richtel's argument. Today's world encourages distraction through the myriad digital devices and technologies we use. Omnipresent phone calls. Text alerts. Email alerts. IM alerts. Push notifications. But Richtel chose the wrong subject for his article. Vishal is absolutely a distracted kid, but technology isn't the reason he's distracted - his creativity is.

I know this, because in a galaxy far, far away, I was very similar to Vishal. I didn't break 1000 on the SATs (I got 970). I graduated with a 2.7 average from high school.

But something else was going on. While my averages were poor, my specific grades on projects would be a steady stream of B's, C's and D's interrupted by sporadic A's. My mother used to defend me to my father. She told him, "Michael's right-brained. He's an artist, it's not his fault." (reality: I drew with my right hand. I still love you mom, you were trying.)

My father, on the other hand, called bullshit. He saw those random A's. He knew what I was capable of. He was watching me in his basement laboratory, testing voltage levels on double-A batteries, and soldering wires together on broken gadgets. If I wasn't fixing things in the basement, I was picking up parts at the junkyard for my 1984 Celebrity Station Wagon (be jealous).

When I wasn't fixing and tinkering with stuff, I was drawing and painting and sculpting and shooting photos and making films with my friends.

Jonah Lehrer over at Wired unearthed interesting research on this creativity-distraction connection:

Those students who were classified as "eminent creative achievers" - the rankings were based on their performance on various tests, as well as their real world accomplishments - were seven times more likely to "suffer" from low latent inhibition. This makes some sense: The association between creativity and open-mindedness has long been recognized, and what's more open-minded than distractability? People with low latent inhibition are literally unable to close their mind, to keep the spotlight of attention from drifting off to the far corners of the stage. The end result is that they can't help but consider the unexpected.

It's easy to blame our condition on external forces. It's harder to look inward and analyze our kids and ourselves. Don't be so quick to blame technology for your kid's inability to stay focused. First determine whether it's their creative endeavors, and not technology, causing the lack of focus.

UPDATE: Looks like Steven Johnson shares a similar point of view to mine:

That said, I do find something puzzling about the whole choice of Vishal as a central study, because the piece assumes that his lessening interest in books and (some) of his coursework is due to the siren song of the digital screen. But what's clearly obsessing Vishal is his love affair with video editing. There's no reason to think the 1985 version of Vishal wouldn't have been equally distracted from his schoolwork by the very same hobby.

Daily Exhaust is hosted by DreamHost, powered by Movable Type with Minted statistics.