Sunday, March 14, 2010

The Other Side Of The Coin

There's an idea held by some who work in technology that all you need to turn out great results is to get a bunch of smart people together and let them go at it. Surely, if you just give them enough time and resources, they'll turn out something amazing, right? In some ways it's true: you'll get some very creative ideas, and some new ways of doing things that can probably be turned into a successful business. But despite the good ideas and the shiny new tech, a group assembled with no supervision, and no income tied to results, is almost always going to turn out systems that are nearly unusable.

That's because the neat ideas, as fun as they are, don't make up the whole system. In any usable item there are a couple of neat ideas, and a whole lot of incredibly boring work that went into finishing the rest of the product. The reason everything around you works half as well as it does is that there are millions of hours of human attention devoted to the items within 20 feet of where you sit, right now. Not millions of hours of creative bliss, either -- millions of hours of mind-numbing repetition and monotony, all to create something that's not clumsy enough to irritate you when you use it.

It's not just automation that distinguishes our modern comforts from the pre-Industrial age. Obviously that helps, because it adds geometrically to the amount of detail that can be created without a corresponding increase in human attention. It even frees up some of those human hours to be devoted to cool ideas. But make no mistake, more than ever our lives are founded on monotony.

With physical goods it's pretty obvious, although we tend not to think about it that often. Every mass produced item had to be assembled somehow, which still involves human oversight at some level. But before that, every piece, measurement, and color had to be designed, reviewed, re-designed, approved, and sourced from a specialized manufacturing facility. Some building blocks are so common that the designs are usually made to fit them, rather than the other way around -- screws, bolts, transistors, wiring -- although all of those also underwent countless hours of design before becoming a standard.

With technology and the Internet it's a little different. The building blocks aren't as simple as screws and transistors, and the result is that when you release a product you are, to some extent, on the hook for verifying all your building blocks as well. This is especially true in the open source world, although I once found a particularly nasty bug in Microsoft's implementation of the STL that illustrates the point: even well-funded companies can't spend enough time on drudge work to completely ensure quality. So not only do you have to find and fix all the rough edges in your own product, you have to worry about finding at least the worst ones in everything your code is built on.

So what's the point? Well, computer science has often been seen as an industry that's especially suited to a small number of bright people creating something that leaps past the competition and snags the market, and millions of dollars, in the process. At one time that was true. The fundamental difference is that the amount of chrome people expected around a product back then was a whole lot less than what they expect today. For a while the market demanded so little trim that the drudgery could mostly be avoided. That's really not true today, because there are enough things out there that have already paid their cost in monotony. People expect new things to measure up.

Where this gets interesting is in watching companies who seem intent on willfully ignoring the reality of their situation. Google is probably the most notable at this point, simply because they are so large and still regarded as quite successful. Their initial success was absolutely a work of genius: creative insight and hard work on some really cool problems, packaged behind chrome so simple it couldn't have taken more than a couple of days to throw together. And later efforts worked well when they could get away with simplicity -- GMail was revolutionary for its simplicity when everyone else was loading up their interfaces with features. They went full steam after this model of hiring a lot of really smart people, putting them in a room, and seeing what happened.

Only in the last couple of years, something happened. They were still hiring smart people, and from what I understand still giving them free rein to work on whatever struck their fancy. But the new products have been decidedly lackluster. Wave was hyped like crazy before its release, but as far as I can tell it took about a week for people to stop caring once they got to play with it. Buzz has managed to hold the public's attention longer, but mainly because of the giant backlash over its default privacy settings. Both have relatively interesting ideas behind them, but both suffer from the same problem, namely a notable lack of simplicity. Trying to read through a page of information in either one is like trying to carry on a conversation in the middle of a rock show.

In the end, simple, clean interfaces are one of those things that take a lot of monotony and grunt work, as well as creativity. Wave and Buzz are exactly what I'd expect to come out of a company that hires too many smart people. If you bring a ton of creative people in and tell them to work on whatever they think is interesting, who's going to go spend three months testing different mockups of conversation flow for usability? It's a crappy job, and nobody's going to do it if they have the choice. The places that will succeed in the long run are the ones with enough money to pay people not to have a choice.

1 comment:

  1. The problem with intellectual labor is the line between "good idea" and "monotonous repetition." At least in academic research, it's extremely hard to draw that line.