Human Experience for Google and Microsoft

Do Google and Microsoft understand what Human Experience is?
Sometimes they do and sometimes they don’t. They’re both companies run by engineers, so that’s bound to happen.
Google has launched Fast Flip and Microsoft has launched Visual Search. Both are search-related tools, and both are confusing.

Google Fast Flip

[Image: google_fastflip.jpg]
Google explains Fast Flip on their blog:

Fast Flip also personalizes the experience for you, by taking cues from selections you make to show you more content from sources, topics and journalists that you seem to like. In short, you get fast browsing, natural magazine-style navigation, recommendations from friends and other members of the community and a selection of content that is serendipitous and personalized.

The problem is, Fast Flip doesn’t make scanning headlines any easier or more enjoyable for me. Just because something is visually rich doesn’t guarantee it’s easier to understand. When I want to scan news headlines, I, uh, scan news headlines. I don’t need screengrabs of websites to act as training wheels. Google News is more than sufficient for me.
I concur with Richard Ziade’s thoughts over at Basement.org:

What’s interesting about this tool is that it’s the anti-Readability. Instead of helping us get rid of the junk around what we’re trying to read, Google fossilized the layout – junk and all – in images.

Microsoft Visual Search

[Image: microsoft_visualsearch.jpg]
Then we have Microsoft’s attempt to make search results engaging by turning them into pictures. My co-worker Rob calls them ‘glorified image galleries’. The novelty of Visual Search wears off quickly and leaves me pissed that I bothered to install Silverlight in the first place.
If Visual Search were integrated into some of Microsoft’s other properties, it might add some value and become more than a one-trick pony.