[...] Although many people continue to equate intelligence with genius, a crucial conclusion from Terman's study is that having a high IQ is not equivalent to being highly creative. Subsequent studies by other researchers have reinforced Terman's conclusions, leading to what's known as the threshold theory, which holds that above a certain level, intelligence doesn't have much effect on creativity: most creative people are pretty smart, but they don't have to be that smart, at least as measured by conventional intelligence tests. An IQ of 120, indicating that someone is very smart but not exceptionally so, is generally considered sufficient for creative genius.
—Nancy Andreasen, Secrets of the Creative Brain
Umm, no. The score was 2-0. Huffpost showing, once again, that being first with a story does not always mean being the best. Slow down. Chew your food.
Citing its usual anonymous supply chain sources, Digitimes on Monday reported that Microsoft called off its plan to mass-produce and launch the Surface Mini tablet back in May. According to the report, the decision to cancel the device was made because the tablet lacked differentiation compared to other small tablets, and also because the company received "negative responses" from its various brand vendor partners.
—Zach Epstein, BGR
The whole sales pitch for the Surface (Pro) is the fact that it's a laptop replacement. Microsoft has even gone so far as to offer people $650 to trade in their MacBook Air.
Considering the uphill battle they're facing trying to convince people their Surface is superior to a MacBook Air experience, imagine them trying to sell a Surface Mini. There's no way you're going to convince anyone a tablet with an 8-inch screen is going to replace a MacBook Air.
The only thing a Surface Mini could possibly replace is a Zune.
Don't even try to tell me you've forgotten about the Zune already.
While it's impressive how small today's computers can get, Google and its partners have still failed to demonstrate truly compelling use cases--let alone "rich user experiences"--that will create a mass market for $200+ smartwatches. In almost every example during Singleton's presentation, simply accessing a smartphone--an activity Google says its one billion Android users already do an average of 125 times a day--seems like it would be a more capable and comfortable solution. (And there's no either/or option here--today's smartwatches must be paired to a phone in the vicinity to access the internet.)
— Dan Frommer, Quartz
Seeing all these companies scramble to come up with compelling smartwatches makes me think how much the people at Apple are enjoying watching it all go down. Apple wasn't first to market with their MP3 player (iPod), smartphone (iPhone) or tablet (iPad).
In the past there was never as much desire to preempt an Apple product launch as there is now with the rumored 'iWatch'. Samsung, Motorola, LG and Google have all raced to get wearables to market. I wouldn't be surprised if Apple looked forward to such preempting.
Ironically, this preempting gives Apple a head start in getting things right with wearable computers where everyone else is getting things wrong.
You might suggest I stop reading tech news sites if I have so many problems with them. The truth is, that's where most of the tech scoops happen, so I keep reading them. This doesn't mean I'll stop calling bullshit on them when they post stupid headlines, like this one from BGR.com:
After watching the YouTube video of Google's demonstration of the first "working" Project Ara device, I couldn't help but laugh.
First off, the audience of Android nerds OOHs and AHHHs when the demonstrator gets a very, very rough prototype device to merely boot up and show the Android logo. Really, guys?
Secondly, while the idea of a modular mobile computer sounds awesome, tell me exactly what person would customize a phone like you're proposing:
This is the kind of phone the 15-year-old version of myself would have designed. A speaker! A clock! Yeah! Totally!
I think there's a hobbyist market for such a device, but it's nothing that would ever have mass appeal. It also reflects a difference in philosophy from a company like Apple. Apple's product design is very opinionated.
They build devices and user experiences based on what they feel is best. Your ability to customize such experiences is severely limited (although this is changing a bit with iOS 7 customizations).
This week Michael and Bryan discuss Vanna White, Yul Brynner as a robot, how HDR imaging is abused, the misconception that mobile apps are easy to make, Bryan's fear of flying, the grit of Philadelphia and growing up in the suburbs.
This episode opens with the exhaust from a 1968 Camaro Super Sport.
Weekly Exhaust, Episode 6
If you're interested in sponsoring the podcast, contact Michael.
Curt Aldredge has concluded there's no proof hollow icons are harder to interpret/recognize than solid icons:
Johnson's warning against using hollow icons in user interfaces just isn't supported by evidence from real users. For one thing, an icon's style doesn't exist in isolation, but interacts with other attributes like color to create compounding effects on usability. Furthermore, less than half of the icons in my set of 20 performed better in a solid style than a hollow style. A different set of icons would likely result in a different overall result.
Me? I don't care if hollow icons are harder or easier to decode than solid icons.
I'll repeat myself 50 million more times if I have to: hollow icons are not icons, they're wireframes of icons.
Different strokes for different folks, but I think a more appropriate approach to icon design is thinning them out, not hollowing them out. Let your icons stay solid, just streamline them so there's less area to fill in.
Over at Slate, Reihan Salam sees the city I live in, San Francisco, as a selfish, selfish place:
Or consider San Francisco, one of the least-affordable major cities in the United States. San Francisco's population is about 825,000. If it had the same population density as my hometown, New York City, it would instead have a population of 1.2 million. Note that I'm referring to the population density of all five boroughs of New York City, including suburban Staten Island and the low-rise outer reaches of Brooklyn, Queens, and the Bronx. A San Francisco of 1.2 million would not be a Blade Runner-style dystopia in which mole people were forced to live cheek-by-jowl in blighted tenements. San Francisco at 1.2 million people would still be only half as dense as Paris, a city that is hardly a Dickensian nightmare.
I'm not an urban planner and I don't have a silver bullet solution to the housing problem here in San Francisco, but something has to give. What San Franciscans have been experiencing in recent years is what cities like New York have been dealing with for decades and decades and decades.
My wife was born and raised in San Francisco proper, and agrees with me that it's more of a suburban metropolis than a city. San Francisco needs to put on its big boy and big girl pants and start acting like a top-tier city: a better transit system than the shitty BART we currently have, more development, and more infrastructure.
As beneficial as expanded development could be for San Francisco, I can't help but think about when—not if—the next earthquake hits the Bay Area.
Perhaps all these smart, young brains in Silicon Valley should use their collective brainpower to focus on earthquakes and not how to charge people for fucking parking spaces.
I had no idea about this:
You might be surprised to know that Facebook has an active and ongoing Artist in Residence Program. Not only that, but it's also among the most innovative corporate art and artist programs anywhere. Now in its second year, artists have become a regular fixture around the Facebook campus. According to the program's founder and curator, Drew Bennett, artists are active within the Facebook community during the periods of their residencies and besides making art for exhibition, display and dissemination around the campus, they also have ongoing opportunities to observe, mingle and interact with the people who work there.
Making art is a much better way to spend your time than to fuck around on Facebook.com.
David Cain noticed his spending habits changed when he returned from many months backpacking around the world:
One of the most surprising discoveries I made during my trip was that I spent much less per month traveling foreign countries (including countries more expensive than Canada) than I did as a regular working joe back home. I had much more free time, I was visiting some of the most beautiful places in the world, I was meeting new people left and right, I was calm and peaceful and otherwise having an unforgettable time, and somehow it cost me much less than my humble 9-5 lifestyle here in one of Canada's least expensive cities.
It seems I got much more for my dollar when I was traveling. Why?
Later on he gets to the antiquated 40-hour work week bullshit we still have in the West:
The eight-hour workday developed during the industrial revolution in Britain in the 19th century, as a respite for factory workers who were being exploited with 14- or 16-hour workdays.
As technologies and methods advanced, workers in all industries became able to produce much more value in a shorter amount of time. You'd think this would lead to shorter workdays.
But the 8-hour workday is too profitable for big business, not because of the amount of work people get done in eight hours (the average office worker gets less than three hours of actual work done in 8 hours) but because it makes for such a purchase-happy public. Keeping free time scarce means people pay a lot more for convenience, gratification, and any other relief they can buy. It keeps them watching television, and its commercials. It keeps them unambitious outside of work.
It's amazing how such obvious observations can be so eye-opening (and depressing).
Children not only learn to read more quickly when they first learn to write by hand, but they also remain better able to generate ideas and retain information. In other words, it's not just what we write that matters -- but how.
"When we write, a unique neural circuit is automatically activated," said Stanislas Dehaene, a psychologist at the Collège de France in Paris. "There is a core recognition of the gesture in the written word, a sort of recognition by mental simulation in your brain.
"And it seems that this circuit is contributing in unique ways we didn't realize," he continued. "Learning is made easier."
—Maria Konnikova, NYTimes: What's Lost as Handwriting Fades
Thinking and creativity are intimately connected to our ability to sketch things out—to take our thoughts out of the ether and put them into the physical world. And this is not just important to "creative" types, but everyone.
I'm reminded of the slogan for Field Notes: "I'm not writing it down to remember it later, I'm writing it down to remember it now."
First Round Capital has a great profile on the brand development of Harry's (the guys behind Warby Parker).
I particularly like where they zagged to after Movember's zig:
Last year, Harry's launched National Shave Day on December 1 to much fanfare -- riding on the coattails of another cultural facial hair phenomenon: Movember. In doing so, they appealed to their target market, and not only appeared timely, but prescient. After not shaving all month, men everywhere were in desperate need of a good razor.
"The holiday created more story around the brand. We had an event at our barber shop, put it on the web, promoted it on social media, and we watched the conversions explode," says Morin. "When we tell stories that connect with people, it's obvious. On National Shave Day we saw a 360% lift in traffic to the website.
The new Android logo looks like something I've seen before, I just can't put my finger on it.