August 2010 Archives

Vision In Motion

By Michael Mulvey on August 31, 2010 8:14 AM

Design has many connotations. It is the organization of materials and processes in the most productive, economic way, in a harmonious balance of all elements necessary for a certain function. It is not a matter of façade, of mere external appearance; rather it is the essence of products and institutions, penetrating and comprehensive. Designing is a complex and intricate task. It is integration of technological, social and economic requirements, biological necessities, and the psychophysical effects of materials, shape, color, volume, and space: thinking in relationships.

–László Moholy-Nagy, Vision In Motion, 1947 (via Daily Icon)

helping out those feeble electric currents

By Michael Mulvey on August 31, 2010 8:00 AM

audion_triode_tube_1906.png

Nick Carr has posted a chunk of his new book, The Shallows: What the Internet Is Doing to Our Brains:

De Forest couldn't have known it at the time, but he had inaugurated the age of electronics. Electric currents are, simply put, streams of electrons, and the Audion was the first device that allowed the intensity of those streams to be controlled with precision. As the twentieth century progressed, triode tubes came to form the technological heart of the modern communications, entertainment, and media industries. They could be found in radio transmitters and receivers, in hi-fi sets, in public address systems, in guitar amps. Arrays of tubes also served as the processing units and data storage systems in many early digital computers. The first mainframes often had tens of thousands of them. When, around 1950, vacuum tubes began to be replaced by smaller, cheaper, and more reliable solid-state transistors, the popularity of electronic appliances exploded. In the miniaturized form of the triode transistor, Lee de Forest's invention became the workhorse of our information age.

data visualizations - appearing to be useful

By Michael Mulvey on August 26, 2010 9:01 AM

Sometimes that's all they're doing.

Case in point: Stephan Thiel's B.A. thesis at the University of Applied Sciences Potsdam. He attempts to understand Shakespeare through visualizations:

understanding_shakespeare_romeo_and_juliet.gif

I ran this by my brother Mark, because he's long been a huge fan of Shakespeare. This was his take (via email):

Shakespeare's not very difficult. Probably the hardest part of reading Shakespeare is the learning curve: there are a lot of words in his plays that have fallen [into] disuse...so you have to learn a bunch of new vocab. Once you have those down it becomes much more manageable. Also, some of the plots can get convoluted, though that's where a lot of the comedy (and deceit in the tragedies) comes from. i.e. "She's disguised as a guy, but this other guy doesn't realize it and he's talking to her like she's a he, etc..."

I can't imagine how these visualizations are of any use.

I agree with my brother. I feel that Thiel's thesis is very high-level and doesn't help me understand Shakespeare any better.

It brings to mind one of the many great data visualizations from the New York Times, The State of the Union in Words: A Look at the 34,000 State of the Union Words Delivered by George W. Bush.

nytimes_state_of_the_union_address.gif

What we get with the NYTimes graphic is context. We can see what the focus was in different years, spot repeated themes, and click on a specific word to find exactly where it was used in the speech.

I should make it clear that I find the quality of Thiel's thesis top notch - from data mining to final printed output. It's beautiful. All he needs now is more context, more questions answered. So he did all this work - what is his conclusion? His insight?

Other than pattern recognition, I walk away from his thesis not knowing any more about Shakespeare than I did coming in.

The preface to The Picture of Dorian Gray comes to mind:

"We can forgive a man for making a useful thing as long as he does not admire it. The only excuse for making a useless thing is that one admires it intensely."

disruption in publishing

By Michael Mulvey on August 23, 2010 8:06 AM

Seth Godin decides he's no longer going to publish books the traditional route:

The thing is--now I know who my readers are. Adding layers or faux scarcity doesn't help me or you. As the medium changes, publishers are on the defensive.... I honestly can't think of a single traditional book publisher who has led the development of a successful marketplace/marketing innovation in the last decade. The question asked by the corporate suits always seems to be, "how is this change in the marketplace going to hurt our core business?" To be succinct: I'm not sure that I serve my audience (you) by worrying about how a new approach is going to help or hurt Barnes & Noble.

asymmetries, disruption and innovation

By Michael Mulvey on August 22, 2010 8:31 PM

Wow, how much do I love this article, The Innovator's Battle Plan:

Asymmetries allow disruptive attackers to enter a market, grow without incumbent interference, and mitigate the incumbent's response when it is finally motivated to counterattack. The result of asymmetric battles often is the seemingly sudden end of a great firm. From the incumbent's perspective, every action it takes is rational. But the outcome is devastating. Disruption is the strategy that creates and capitalizes on asymmetries of motivation and skills.

Found via one of my new favorite websites, Asymco (I can see where that name came from now).

Design in the Age of Multi-Touch Reproduction

By Michael Mulvey on August 22, 2010 5:45 PM

PSFK asks, Is The Touchscreen Killing Or Reinventing Design?

Good friggin' question. Actually, it's a decent question, but a bit general. A better question would be, Is The Touchscreen Killing or Reinventing Industrial Design?

Right now and for the foreseeable future, the answer is yes, the touchscreen is killing industrial design. We're replacing real clocks, calculators, compasses and keyboards with virtual ones on screens. Not only are we killing industrial design, but we're also killing the enjoyment of touching physical objects. Pressing real buttons, typing on real keys. We're replacing physical feedback from physical products with haptic, aural and visual feedback from virtual devices. Real buttons that make click-sounds and spring back from real springs are now WAV files of click-sounds and animations of up, hover, press and release states.

We're living in the age of skeuomorphs:

A skeuomorph is a derivative object which retains ornamental design cues to a structure that was necessary in the original. Skeuomorphs may be deliberately employed to make the new look comfortably old and familiar, such as copper cladding on zinc pennies or computer printed postage with circular town name and cancellation lines.

I grabbed this word from Adam Greenfield's blast on Apple (via Daring Fireball) and their overindulgence in this area:

The iPhone and iPad, as I argued on the launch of the original in 2007, are history's first full-fledged everyware devices -- post-PC interface devices of enormous power and grace -- and here somebody in Apple's UX shop has saddled them with the most awful and mawkish and flat-out tacky visual cues. You can credibly accuse Cupertino of any number of sins over the course of the last thirty years, but tackiness has not ordinarily numbered among them.

While the loss of analogue devices can be sad and their digital replacements can be lacking in responsiveness, I don't think the retro TV shell by frog creative director Jonas Damon is anything more than cute decoration. While Apple creates digital skeuomorphs, Damon makes analogue skeuomorphs.

tv_3.jpg

The Wii steering wheel is an example of transcending a skeuomorph and providing actual, functional value to a non-analogue device. In the absence of the wheel, the Wii controller doesn't lend itself well to driving:

wii_steering_wheel.jpg

We'll reach a point in the future where digital devices will be able to give us physical feedback - be it tactile, aural or olfactory - but we're not there yet. We'll continue to see skeuomorphic crutches for our digital devices, and these crutches aren't always bad.

iOS_vs_Windows_Phone_7.jpg

While Greenfield can argue Apple has the skeuomorphic dial up to 11, I much prefer this to the complete lack of substance in the Windows Phone 7 user interface. With that said, I do agree Apple can go heavy on the GUI sauce (see my post on The Goddamn Page-Turn).

What's the point of Twitter?

By Michael Mulvey on August 22, 2010 3:54 PM

I hear this a lot, from people both inside and outside the web industry.

Over at frog design's design mind blog, they recently hit the 100,000 follower mark and explain what this milestone means. Here are a few points that stood out to me:

If there's one truth to this milestone, it's that a social network doesn't exist on its own. Unless you're a pop star with an audience already in place like LeBron James, who opened a Twitter account and got 650,000 followers in seven weeks, I believe it's impossible to attract such a following without also having an ecosystem of complimentary initiatives in place, namely a rich and always-fresh supply of content to share, a community of actual people that you actually talk to (not just Twitter accounts), and a dedicated person or team to care for and feed the social media conversation

Why people follow them, and the responsibility that goes with 100,000 followers:

There is a reason people continue to follow us on Twitter, just as there is a reason conferences want our magazine at their events or indeed, why people want to continue to do business with us. They trust us. If anything, reaching the 100,000 follower mark on Twitter is a reminder of the responsibility we have to be thoughtful curators of relevant news, trends, and debates, even when those debates involve our competitors.

Regarding ROI (return on investment):

Now that frog has reached 100,000 followers, we are officially considered an "influencer" by analyst firm Forrester, which means we can augment conversations on the Social Web, instantly broadcast content to a wide, relevant audience, and use that reach as an asset in our relationships with clients and conferences.

The most negative comments I hear people make about Twitter always focus on the mundane, trivial and adolescent tweets. That's easy, and yes, there's plenty of dumb shit out in the Twitosphere.

But there are also the people and companies I follow: Twitter streams I find thought-provoking, hilarious and insightful, like frogdesign, kanyewest and mullerbrockmann.

Randy Mora

By Michael Mulvey on August 22, 2010 3:54 PM

israelblog.jpg

The work of Randy Mora (via Forgotten Hopes)

It rewards skill and care with immediate feedback

By Michael Mulvey on August 22, 2010 2:04 PM

A beautiful short film by Charles and Ray Eames on the Polaroid SX-70 (via Cool Hunting)

I couldn't help but keep my finger on the trigger for screengrabs:

sx-70_01.jpg

a conspiracy against the mind

By Michael Mulvey on August 18, 2010 4:33 PM

Found a great quote in the comments in Fast Company's article about Alex Bogusky leaving his agency:

"They do not want to own your fortune, they want you to lose it; they do not want to succeed, they want you to fail; they do not want to live, they want you to die; they desire nothing, they hate existence, and they keep running, each trying not to learn that the object of his hatred is himself ... They are the essence of evil, they, those anti-living objects who seek, by devouring the world, to fill the selfless zero of their soul. It is not your wealth that they're after. Theirs is a conspiracy against the mind, which means: against life and man."

—Atlas Shrugged

makes me think of Google

By Michael Mulvey on August 18, 2010 10:42 AM

You guys think I'm just some untouchable peasant? Peon? Huh? Maybe so, but following a broom around after shitheads like you for the past eight years I've learned a couple of things...I look through your letters, I look through your lockers...I listen to your conversations, you don't know that but I do...I am the eyes and ears of this institution my friends.

—Carl, The Breakfast Club

disruption in the computer world

By Michael Mulvey on August 18, 2010 8:54 AM

I'm the first one to call bullshit on all the hyperbole the press engages in on a regular basis, but even taking that into consideration, there's definitely real upheaval going on in the computer world.

Can we give some credit to Apple and the disruption it brought into the mobile computing world in 2007? Absolutely.

Are there other factors involved beyond Apple? Absolutely.

But make no mistake - these are not isolated incidents.

.NET Journal: Is Microsoft's Ballmer Out?

Electronista: Nokia shareholders call for CEO to resign; Palm deal ignored

SEC: over a quarter of shareholders want Dell CEO out

Whether we're talking about mobile phone makers failing to turn their units into profitable computing platforms rather than cheap throwaways, or PC makers failing to move from desktop computers to multi-touch mobile/tablet computers, we're seeing denial/stubbornness/ignorance in the face of a changing computer world.

Steve Jobs talked about this at the D8 Conference this year (about 44 minutes into the interview):

When we were an agrarian nation, all cars were trucks, because that's what you needed on the farm. But as vehicles started to be used in the urban centers, cars got more popular. Innovations like automatic transmission and power steering and things that you didn't care about in a truck as much started to become paramount in cars. ... PCs are going to be like trucks. They're still going to be around, they're still going to have a lot of value, but they're going to be used by one out of X people.
And this transformation is going to make some people uneasy. People from the PC world, like you and me [looking at Walt Mossberg] ... it's going to make us uneasy because PC's have taken us a long way. It's brilliant. And we like to talk about the post-PC era, but when it really starts to happen I think it's uncomfortable for a lot of people, because it's change. Vested interests are going to change, it's going to be different.

The way I see it, the PC landscape would have continued on in its backwards-looking, non-innovative direction ad infinitum. Change is not something these entrenched giants (Dell, Microsoft, Nokia) want or are structured to create on their own. The change (erosion?) to this landscape would have happened one way or another; Apple simply sped up the process with the iPhone.

profit from disruption

By Michael Mulvey on August 17, 2010 12:21 PM

Man, Asymco is killing it these days with their mobile analyses:

Finally, looking at the pure smartphone vendors RIM and Apple, the picture is nothing short of astonishing. This before-and-after share-of-available-profit chart shows that the two entrants went from about 7% profit share to 65% in three years.
Disruption is the diagnosis here. The incumbents were caught in the headlights. Disruptive innovation leads to asymmetric competition and this is what we just witnessed. History has shown that the shift of profits is usually the last stage of disruption and is usually irreversible because the change in business models cannot happen at the rate of change of profit transfer.

the desire to achieve

By Michael Mulvey on August 12, 2010 8:45 AM

a_creative_man_9000.gif

More graphic awesomeness from 9 0 0 0

The Hold Up Problem

By Michael Mulvey on August 8, 2010 9:20 PM

Open Source and Economics: How the Hold Up Problem Explains the Flash Wars

The surprising aspect of open source is not its existence, but its success. People do things for free all the time. Among other things, there is no shortage of people willing to share their videos on the web. However, despite the availability of free videos, viewers are often willing to pay money to watch films made by professionals. Professional producers of films in turn usually make full use of copyright laws. In contrast, in many software domains, open source solutions are preferred. The important question about open source is therefore not "Why do people contribute to a project like Apache?" but rather, "Why can't companies create proprietary products that can beat Apache on the market?"

Rubicon, opening credits

By Michael Mulvey on August 8, 2010 9:03 PM

I have no interest in seeing Rubicon (the show on AMC that's on right before Mad Men), but the opening credits are absolutely gorgeous (created by Imaginary Forces).

rubicon_opening_credits_01.jpg rubicon_opening_credits_02.jpg rubicon_opening_credits_03.jpg rubicon_opening_credits_04.jpg rubicon_opening_credits_05.jpg

Water?

By Michael Mulvey on August 6, 2010 10:56 AM

get_me_a_water.jpg

via Comically Vintage

Why do they always have to cheat my feelings?

By Michael Mulvey on August 5, 2010 10:56 AM

hitler_rants_about_iPhone_4.jpg

All websites are not created equal

By Michael Mulvey on August 5, 2010 9:58 AM

Google and Verizon Near Deal on Pay Tiers for Web

Google and Verizon, two leading players in Internet service and content, are nearing an agreement that could allow Verizon to speed some online content to Internet users more quickly if the content's creators are willing to pay for the privilege.

Such an agreement could overthrow a once-sacred tenet of Internet policy known as net neutrality, in which no form of content is favored over another. In its place, consumers could soon see a new, tiered system, which, like cable television, imposes higher costs for premium levels of service.

Remember when everyone used to talk about how the Internet was a level playing field? Where a website by John Q. Public was as easily accessible as a website by Corporation X?

Well, it looks like those days are over.

Don't be evil, right Google? Fuck you. And fuck you too, Verizon.

the combustion chamber hangs tough

By Michael Mulvey on August 3, 2010 8:31 AM

NPR: Light, Fuel-Driven Car Goes For 100 Mpg X Prize

Warehouse space was cheap, so the retired race car driver hired a team of winners from the world of racing and set up shop to build a car that gets the equivalent of 100 miles per gallon. The team, called Edison2, entered its vehicle, dubbed the Very Light Car, in the X Prize competition, and it's the last remaining four-seat sedan in the competition.

I have to agree with my father. I'm always amazed and excited, not by new technologies that emerge, but by the refining and perfecting of existing ones. Do we have to move beyond fossil fuels? Absolutely, but that doesn't mean stories like this aren't exciting.

The Goddamn Page-turn.

By Michael Mulvey on August 2, 2010 8:14 AM

de_soto_record_player.jpg

I understand that when transitioning from any one technology to a new one, there's bound to be ideas from the former that get absorbed into the latter. Sometimes these ideas are valid and logical and sometimes they're intended to be temporary, a bandage, to be used until something better is thought of.

Sometimes we get comfortable with our bandages and never take them off. I won't go through all the ones we're familiar with in the computer world (folder, page, desktop, below-the-fold). In defense of these bandages, they do a pretty decent job most of the time.

But some ideas just feel olde tyme-y.

Case in point: the page-turn effect.

In 2009, Microsoft filed a patent for the page-turn gesture in digital interfaces. It's the same gesture Apple currently uses in their e-books on the iPad.

Then last week, Gizmodo posted a video from a UI firm demoing a page-turning Windows 7 interface for tablets.

page_turn_UI.jpg

People, we're better than this. If we're going to move beyond the printed page, we need to move beyond the printed page. It's important we preserve certain aspects of the analogue world in computer technology. This is especially true for the multi-touch world we're jumping into right now. The inertial scrolling on the iPhone and iPad isn't just there for show; it makes the interface.

Thirty years ago, we went from analogue to full digital abstraction on the desktop computer. Now we're at a point in computer evolution where we're bringing the analogue back into the interface. We're physically interacting with our machines beyond mouse clicks and keyboard taps. The danger in this is taking too many of the inefficiencies of the physical with us into digital.

Buttons that depress, lists that rubber-band when you reach the end of them, screens that smoothly transition between zoom levels - these are all welcome effects in digital. But with page-turning, there's no value added when you shoehorn it into digital. It's like giving cars a clip-clop horse-trotting sound effect when you drive.

I actually hope Microsoft enforces their patent and makes Apple remove it from iBooks so that Apple can create a better gesture.

Explosive Bolts

By Michael Mulvey on August 1, 2010 3:07 PM

explosive_bolts.jpg

Daily Exhaust is hosted by DreamHost, powered by Movable Type with Minted statistics.