Awesome headline: BGR: Samsung is 'hiring like crazy' trying to come up with original ideas
Samsung is spending somewhere in the neighborhood of $300 million on building its Bay Area R&D center, which will be an enormous facility of 1.1 million square feet. Business Insider notes that Samsung actually spent more on R&D than any other company in the world last year, although the Korean smartphone maker still isn't seen as an innovation powerhouse like Apple and Google are.
I'm confused. Does throwing money at technology not make it more innovative?
Over at the New Yorker, Jill Lepore calls out Clayton Christensen on his innovation and disruption theories:
In his original research, Christensen established the cutoff for measuring a company's success or failure as 1989 and explained that " 'successful firms' were arbitrarily defined as those which achieved more than fifty million dollars in revenues in constant 1987 dollars in any single year between 1977 and 1989--even if they subsequently withdrew from the market." Much of the theory of disruptive innovation rests on this arbitrary definition of success.
I love Christensen's work, but it's always interesting to read opposing views.
One of the things that I am most passionate about is showing respect for the ingenuity of others. Working in an ecosystem where I am often competing very closely, it is inevitable that I will be confronted with situations where the easy thing is to match/copy/remix someone else's ideas into my own app.
What I have found very frustrating is that I haven't been able to define what is acceptable in a manner that comes anywhere close to the importance I think this topic demands. Too often I am left with just an "I'll know it when I see it" definition.
—David Smith provides a great example of the art of remixing
I call it not being a lazy ass and using your brain to make something that resonates with you.
Horace Dediu shares my frustration in how most people have no idea what innovation means:
But there is another form of ignorance which seems to be universal: the inability to understand the concept and role of innovation. The way this is exhibited is in the misuse of the term and the inability to discern the difference between novelty, creation, invention and innovation. The result is a failure to understand the causes of success and failure in business and hence the conditions that lead to economic growth.
My contribution to solving this problem is to coin a word: I define innoveracy as the inability to understand creativity and the role it plays in society. Hopefully identifying individual innoveracy will draw attention to the problem enough to help solve it.
I addressed this issue back in 2009.
True examples of innovation are incredible, but this word has been abused and misused so often that I've grown to hate it.
Over at Slate, Jessica Olien explains how people don't actually like creativity:
In the United States we are raised to appreciate the accomplishments of inventors and thinkers--creative people whose ideas have transformed our world. We celebrate the famously imaginative, the greatest artists and innovators from Van Gogh to Steve Jobs. Viewing the world creatively is supposed to be an asset, even a virtue. Online job boards burst with ads recruiting "idea people" and "out of the box" thinkers. We are taught that our own creativity will be celebrated as well, and that if we have good ideas, we will succeed.
It's all a lie. This is the thing about creativity that is rarely acknowledged: Most people don't actually like it. Studies confirm what many creative people have suspected all along: People are biased against creative thinking, despite all of their insistence otherwise.
As Olien says in her post, part of creativity is uncertainty, and people don't like uncertainty.
I'd like to think that, as a web & mobile designer, my industry is the exception to this creativity bias, but it's not. Designers might have the balls to try new, dangerous ideas, but clients don't.
Clients want creative, but not too creative.
Steven Sinofsky responds to the fall of Blackberry on his blog:
Disruption happens when a new product comes along and changes the underlying assumptions of the incumbent, as we all know.
Incumbent products and businesses respond by often downplaying the impact of a particular feature or offering. And more often than folks might notice, disruption doesn't happen so easily. In practice, established businesses and products can withstand a few perturbations to their offering. Products can be rearchitected. Prices can be changed. Features can be added.
What happens though when nearly every assumption is challenged? What you see is a complete redefinition of your entire company. And seeing this happen in real time is both hard to see and even harder to acknowledge. Even in the case of Blackberry there was a time window of perhaps 2 years to respond-is that really enough time to re-engineer everything about your product, company, and business?
...says the man who left Microsoft.
His post is decent, but hindsight is 20/20. It's like an alcoholic with a revoked driver's license telling you not to drink and drive.
Sinofsky mentions "Christensen" once, but it would have been good to mention the actual book that obviates his blog post—The Innovator's Dilemma by Clayton Christensen.
From Daniel Eran Dilger at Apple Insider:
It turns out that while the tech media spent most of 2013 complaining that Apple "wasn't innovating," Apple was secretly developing its new Mac Pro supercomputer, perfecting its AuthenTec-based Touch ID technology that the industry has been flummoxed to copy, completing iOS 7 (while Google took a Kit Kat break with Android Key Lime Pie) and OS X Mavericks (while Microsoft fiddled as Windows 8 burned), while also bringing an entirely new 64-bit mobile architecture into production ahead of the world's leading chip designers and foundries (which didn't see a pressing need to move to 64-bit and lacked Apple's experience in doing so), and, as nearly a side project, spending billions to build out a series of new iCloud data centers...
via Jason Putorti
BGR on Microsoft still not getting the whole tablet thang:
Windows-based tablets haven't been big successes so far, whether they use the desktop-centric Windows 8 or the tablet-centric Windows RT. iMore's Rene Ritchie does some sharp analysis of Microsoft's latest marketing campaign and concludes that the company simply does not understand why people are buying tablets in the first place. Essentially, Microsoft doesn't get that its central criticism of the iPad -- that is, that it's more of a toy that can't be used for doing serious work -- is precisely why consumers are drawn to it in the first place. Simply put, consumers have PCs at their offices if they want to do work. When they're at home, they want to play around with their tablets instead; they like having toys.
As Clayton Christensen famously points out in The Innovator's Dilemma, most innovations aren't taken seriously when they debut, and by the time they gain momentum, it's usually too late for the competition to respond. My favorite example is when people in the media, and even some in the government, thought the Internet was a fad back in the 1990s.
The iPad might look like a toy because kids love it, but when I'm on business trips I see iPads being used in first-class seats, and then I see them again in client meetings at Fortune 500 companies.
From the Verge:
CBS CEO Les Moonves is the latest TV exec to publicly entertain the idea of halting free broadcast TV if streaming provider Aereo is allowed to continue its service. Responding to a question about News Corp. COO Chase Carey's threats to make Fox cable-only, Moonves told The New York Times that he "wholeheartedly supported what Chase said." He explained that CBS was in preliminary talks with cable operators in the New York - Connecticut area (currently the only area in which Aereo operates) about what the switch would take, emphasizing his reluctance to take such a drastic approach. "Frankly, we don't think it will get to that point," he explained.
The definitions of television are changing.
The definitions of a computer continue to change.
The definitions of everything you once knew and know now are changing.
I don't care if I'm sounding like a broken record, but keep the words of Darwin in your head at all times. Adapt or die.
Over at SFGate, James Temple calls out Silicon Valley on its innovation bullshit:
It's fairly obvious the region's business culture and investing philosophy - that is to say, the free market - often doesn't reward the kind of "deep innovation" Levchin and Thiel trumpet.
The media attention doesn't go to pasty scientists working tirelessly on something that maybe, might, someday change the world, it goes to the fresh-faced entrepreneur with the game trending in the App Store.
Venture capitalists are eager to cash out on their investments in as few years as possible, which requires products ready to ship and stable financial track records. And the public markets don't reward big risks; they applaud predictable growth.
It's not just companies in Silicon Valley that have beaten the word to death. Go to most agency, start-up and consultancy websites and you'll find "innovation" peppered through all the copy.
Everyone loves to think they're innovative. Very few are.
link via Matt Mullenweg
Over at ABC News, details on the new HTC One (via The Loop):
"We think it's time to shake things up in the smartphone space," Mike Woodward, President of HTC America, told ABC News in an interview. "We have decided to come out and reinvent the smartphone."
Careful with the 'R' word.
I got news for ya, Mister Woodward. Your HTC One smartphone is not a reinvention, it's an evolution of the smartphone paradigm Apple introduced in 2007. Instead of making a phone with a few portable computing features, Apple made a mobile computer with the ability to make phone calls. The HTC One follows this paradigm precisely.
That said, the phone looks really sharp. Nice work.
Everybody's having fun speculating about Apple's supposed iWatch. Bloomberg is telling us there's already a team of 100 people working on it at Apple.
Over at The Atlantic's new site Quartz, they went a step further and are telling us some 18-year-old could beat Apple to market with his own iWatch. Ha! Take that!
It seems even Samsung is trying to preemptively copy Apple and release a smart watch of their own, which they'll inevitably change to look like whatever Apple ends up shipping because, well, they told us they like to copy their competitors.
I was talking with Bryan about this over instant message earlier today and I agree with his view, "If I think it's an ugly watch, I'm not wearing it." I'm not wearing it either (OK, OK, maybe I'll wear it).
This is why Apple made the iPhone as much of an uncarved block as possible. Jony Ive's philosophy is to distill a product down to its essence. Like a naked body, let individuals decide how to "clothe" their devices. Me? My view on cases for iPhones is the same as my view on bras for beautiful cars.
But as Bryan pointed out in our conversation, there's a big difference between a phone and a watch. A phone can go in your pocket. A watch goes on your wrist for everyone to see, all the time. A phone only has to be fashionable some of the time, but a watch is on the catwalk the whole day.
Which is why I think everyone is thinking about it wrong. Here's the thing. The iWatch, if it is in the works, is as much a watch as the iPhone is a phone. It brings to mind the (unverified) quote attributed to Henry Ford: "If I had asked my customers what they wanted they would have said a faster horse."
Everyone is thinking of a faster horse right now, while Apple (if it is building something) is working on an automobile.