Sunday, April 4, 2010

The Bedrock Of Engineering

Call me conceited, but I'm going to write this entry based on the idea that I am a stereotypical engineer. I know right off the bat that it's not true -- I'm not nearly interested enough in math to fully embody the role, nor am I enough of a disciplined analytical thinker to really carry the title. My core personality is more along the lines of engineering by gut feeling. I'm technical enough to pull it off, but I don't have the chops in science to really own it in the same way as someone who still remembers everything they learned in calculus ten years after college. But I do think there's a common aspect between my personality and that of someone who's more methodical and, well, a classical engineer, and that is the desire to fix things.

I'm not just talking about a vague desire to make things better. Pretty much everyone has that, engineer or no. I'm talking about a deep-seated frustration with things that are slower than they could be. I'm talking about an exasperation that rises from deep within at the hint that you might be doing something repetitive -- and not just repetitive, but anything you haven't personally been convinced is worth your time. If your reaction to a boring task is not, "Let's get this over with," but, "Let's automate this so I can do it in one percent of the time," that's the core of the engineering mindset.
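
To make that concrete, here's a trivial, entirely hypothetical example of the instinct I'm describing -- a few lines of Python that turn an afternoon of hand-renaming files into a one-second command (the naming rule is invented for illustration):

```python
import os

# Hypothetical chore: a folder full of files named like "Report 1.TXT"
# that need to become "report_1.txt". Encode the rule once, let the
# machine do the repetition.
def normalize_names(directory):
    for name in os.listdir(directory):
        fixed = name.lower().replace(" ", "_")
        if fixed != name:
            os.rename(os.path.join(directory, name),
                      os.path.join(directory, fixed))
```

Ten minutes to write, and the boring part now takes one percent of the time -- which is exactly the trade the engineering mindset can't resist.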

It comes so naturally to me that it's something of a shock when I step back and look at all the elements that have to be in place to give me the luxury of even thinking about this. Combined millennia of engineering effort have gone into creating flexible, multipurpose tools that I can play with on a whim. The fact that the Internet will shuttle any piece of information from anywhere, to anywhere, without any setup cost by the person injecting the information into the network is a minor miracle. These things give me the kind of freedom to create that people one hundred years ago could only dream of, and yet I still find everything too slow, too repetitive, and badly in need of fixing.

Sunday, March 28, 2010

Jack Of All Friends

I have to admit that as much as I love shiny new tech, I tend to be a skeptic when it comes to technology fads. I straight-up don't understand the appeal of Facebook, to this day. Maybe it's the way it emphasizes the quantity of your connections over any of the activities it enables once you have those connections in the system. The mad land rush to grab as many friends as possible back around 2005, when Facebook was still exclusive to those with email accounts from selected colleges, left my list cluttered with way too many contacts with whom I'm not really interested in sharing the kinds of things Facebook wants me to share. I'm left with a choice of either telling a bunch of people that I'm not really their friend -- honest, but potentially quite rude -- or not using the service at all.

The problem I have at its core is with indiscriminately broadcasting information about myself all over the Internet. Certain things obviously are more okay than others. Blogging, apparently, is one of those things I don't have a problem with, mostly because it's filtered and limited to stuff that I don't mind saying to anyone. Obviously this results in a different type of output than something unfiltered would, so while I enjoy putting this work out there it's not going to be an outlet for everything I want to say. Despite the fact that almost nobody will see it, the fact that literally anybody potentially could is enough to limit the utility of the medium for me.

Facebook provides filters, though. I currently have everything locked down to "friends only" mode, so it's not a question of blocking information from strangers. I think it's more a question of partitions and groups. I want an output where the purpose is more limited, that gives the communication a more definite context. Long form communication like blogging makes sense because it takes some more thought to write a full post. Facebook communication is shorter, more off the cuff, and as such I want some kind of filter that I can consider once and rely on the next time I'm saying random things.

Sunday, March 21, 2010

A Failure To Communicate

Today was the health care reform bill vote in the House. I'm still avoiding political topics directly, but there's a technical topic here as well which continues to mystify me, and that is the complete lack of any kind of unified information stream available on the Internet. There's a raw stream of Congress available on a number of websites, sure -- unfiltered, and without context. There are news articles updated every couple of hours, but with no real status updates. This is obviously the big story of the day, and both items sit at the top of the page, which shows the editors know it.

The difference between the website content and that on television is night and day. The television broadcast has extensive, rich commentary including highlights of the event, and nearly real-time analysis not only of the proceedings, but of historic context around them. It's wonderful, and provides a lot of important information to help make sense of what's happening (note that here I'm referring to the MSNBC coverage -- I don't care much for CNN). But since this is on television it's inherently a one-way experience, with no opportunity to direct the conversation from my end.

The television broadcast indicates that they do understand how to create a compelling context around a real-time event. So why can't they do it in a bi-directional medium? I hope it's self-evident how much more valuable this could be, since the asides could be selected when the viewer is interested rather than when the producer thinks everyone will be interested. I would even be satisfied with largely the same experience as the television broadcast, with an additional "procedural comments" ticker provided by another set of commentators working in the background.

Sunday, March 14, 2010

The Other Side Of The Coin

There's an idea held by some who work in technology that all you need to turn out great results is to get a bunch of smart people together and let them go at it. Surely if you just give them enough time and resources they'll turn out something amazing, right? And in some ways it's true, because you'll get some very creative ideas, and some new ways of doing things that can probably be turned into a successful business. But despite some good ideas and some shiny new tech, you're almost always going to find that a group assembled with no supervision, and no income tied to results, is going to turn out systems that are nearly unusable.

That's because the neat ideas, as fun as they are, don't make up the whole system. In any usable item there are a couple of neat ideas, and a whole lot of incredibly boring work that went into finishing the rest of the product. The reason everything around you works as well as it does is that there are millions of hours of human attention devoted to the items within 20 feet of where you sit, right now. Not millions of hours of creative bliss, either -- millions of hours of mind-numbing repetition and monotony, all to create something that's not clumsy enough to irritate you when you use it.

It's not just automation that distinguishes our modern comforts from the pre-Industrial age. Obviously that helps, because it adds geometrically to the amount of detail that can be created without a corresponding increase in human attention. It even frees up some of those human hours to be devoted to cool ideas. But make no mistake, more than ever our lives are founded on monotony.

Sunday, March 7, 2010

Doing More With Less

There's an economic principle I ran across a while ago which I find absolutely fascinating -- in fact it's a decent part of why I wanted to start writing about this type of thing in the first place. Unfortunately I can't remember its name (the Internet has failed me), but I can describe it.

Part of it is based on an idea I've discussed in more depth elsewhere: jobs shift their use of automation over time, relative to each other. With that as a given, it's obvious that, taken in isolation, certain jobs would tend to command higher salaries over time, while those left without the benefit of automation over a long period would tend to drop. But human workers don't act in isolation. Instead, there's another driving principle in the economy: jobs that take a roughly equal amount of skill to perform, even in different fields, must provide roughly equal compensation across all those fields.

The unnamed principle deals with the result of the obvious conflict between those two forces. On the one hand, fields that don't benefit from automation will tend to drop in purchasing power. On the other hand those jobs aren't getting any easier to perform, so you have to keep compensation in those fields roughly in step with that in automation-friendly fields or face the consequences of deteriorating workforce quality. Not a pretty choice, and the further implications are even worse.
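
A toy arithmetic sketch of that squeeze, with every number invented purely for illustration: one field doubles its productivity through automation, the other doesn't, and wage parity between fields of similar skill does the rest.

```python
# Invented numbers: a factory worker and a barber earn the same wage
# in year 0. Over twenty years, automation doubles factory output;
# a haircut still takes the same half hour.
wage = 20.0                      # dollars/hour, both fields
factory_output_per_hour = 10     # widgets, year 0
haircuts_per_hour = 2            # unchanged -- no automation

factory_output_per_hour *= 2     # year 20: productivity doubles

# Wage parity holds, so labor cost per unit diverges:
widget_cost = wage / factory_output_per_hour   # 1.0 -- half what it was
haircut_cost = wage / haircuts_per_hour        # 10.0 -- same as ever
```

Widgets get cheaper while haircuts don't, so in real terms the un-automated service grows relatively more expensive every year -- or the barber's wage has to fall behind. That's the ugly choice.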

Sunday, February 28, 2010

The Beauty Of Abstract Thinking

I've been spending a lot of time in the past couple of weeks refactoring large sections of code. To some people this might be boring, or tedious, but to me it's possibly the best part of being a programmer, or at least tied with the other best part, which is writing new code. I love poking around in musty conceptual back rooms, throwing up my hands in disgust, cursing a little, and then tearing it all apart and putting it back together again.

In case anyone reading this isn't familiar with the term, it's basically the same process as what any other discipline that produces written text would call editing -- going back over what's been written and adjusting it for accuracy and style. Except unlike other written output, program code is functional. It's a set of instructions being interpreted by a machine that is completely stupid, and completely unwilling to treat anything you write any way other than literally. This has some advantages. A naive system will, at least, always come to the same conclusion every time it reads the instructions, and every machine will interpret your words in the same way. But it also means that you have to be explicit about everything. Every detail has to be included, every time you need it.
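
A minimal, made-up example of what this kind of editing looks like in practice. The behavior is identical before and after; all that changes is that a repeated detail now lives in exactly one place:

```python
# Before: the tax rate is spelled out, in full, everywhere it's needed.
def price_with_tax_before(price):
    return price + price * 0.08

def total_before(prices):
    return sum(p + p * 0.08 for p in prices)

# After: the detail is stated once and referenced by name. The machine
# sees the same literal instructions; the human sees the intent.
TAX_RATE = 0.08

def price_with_tax(price):
    return price + price * TAX_RATE

def total(prices):
    return sum(price_with_tax(p) for p in prices)
```

If the rate ever changes, the "before" version requires hunting down every copy of 0.08 by hand -- exactly the sort of repetition that gets missed, because the machine won't infer that the copies were supposed to agree.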

I'm willing to bet that the majority of people who write for human consumption aren't really aware of how lucky they are to have an audience that isn't completely literal. A human reading a story or a set of instructions will bring an amazing amount of conceptual understanding to their side of the discussion, and interpret confusing or contradictory things properly using their own judgment. If you take a story, change a character from male to female, and miss one of the pronouns, a human reader might be a little confused, but will quickly realize what's going on and continue reading -- maybe a little irritated, but without a severely damaged understanding of what happened. A computer presented with the same situation will crash, and refuse to read past the confusing word.
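
Here's that pronoun mistake translated into code -- an invented snippet where a "character" was renamed and one reference to the old name was missed. A human reader recovers instantly; the interpreter stops dead:

```python
# The variable used to be called 'sam' and was renamed to 'sarah',
# but one reference to the old name survived the edit.
def greet():
    sarah = "Sarah"
    return "Hello, " + sam  # NameError: 'sam' is not defined

try:
    greet()
except NameError as err:
    print("The machine gave up:", err)
```

No judgment, no context, no guessing that sam and sarah were probably the same person -- just a refusal to continue.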

Sunday, February 21, 2010

When You Know You're Right

I had a Sociology professor in college who described the breakdown between liberals and conservatives like this.

Liberals think things are broken, and want to fix it by finding new ways of doing things. Conservatives think things are broken, and want to fix it by going back to the way we used to do things.

Since I'm paraphrasing, poorly, there's an assumption that goes along with this statement which my clumsy version has failed to capture. The best way I can describe it is, conservatives tend to find an innate value in continuity -- something that's remained unchanged for a long time is thought to be better precisely because it has remained unchanged for a long time. Liberals, obviously, would fall on the other side of this distinction, finding little or no innate value in old habits just because they are old.

Sunday, February 14, 2010

Resisting The iPad

Remember how, when you were a kid, there was some toy that you just had to have? It was shiny and perfect, and you could only imagine the hours of finely sculpted joy you would have with it, if only you could buy it (or convince your parents to buy it for you). For me it was those little hand-held video games, the kind that could only play one game because all the LCD segments were in pre-defined shapes rather than pixels. Then in high school it was a bulky, underpowered surplus laptop that was cheap enough for me to afford, but not powerful enough to run much of anything. Then in college it was a Palm Pilot, one of the old black-and-white ones that could sync to a computer and not much else.

The problem with all of them was that the reality never lived up to the expectation. The imagined joy projected onto whichever product was the object of my techno-lust was never founded in specifics, but rather in a haze of half-realized possibility fueled by colorful packaging and television commercials. I suspect everyone has something like this in their past.

Which brings us to the latest in a long line of Over-Hyped Shiny Objects: the iPad. I will say one thing for Steve Jobs, he's damned good at making a sales pitch that instills this exact kind of unreasoning technological envy while making it look like he's doing the exact opposite. The announcement, streamed live on the front page of CNN (how they pulled that one off I have no idea), presented such a dizzying series of features and applications that it's hard to imagine there being any purpose the iPad couldn't fulfill. Don't worry about needing a reason to buy one, Steve just showed you fifty reasons; surely a couple will apply to you.

Sunday, February 7, 2010

So, You Want To Work In Programming?

Programmers and other engineers are going to put everyone else out of business. It's just a fact. At some point -- not next year, probably not next decade -- but at some point, a whole bunch of stuff is going to be written down in a repeatable form. Computers and machines will follow those instructions, and do work that could employ millions of people. Okay, that's already happened today, but so far we've found other productive things to replace those jobs. The someday scenario is the point at which we have few enough non-automation jobs and a large enough supply of non-automation people that society starts to notice.

That's still vague, because it rests on a theory I haven't stated yet, which goes something like this: As automation replaces an increasing number of jobs, the proportion of jobs actively involved in creating more automation will go up. Remember the graph from a couple posts back, with the pretty colors? In those terms, this means that the bell curve is going to squeeze in and become much narrower, and the green portion is going to expand to fill most of it (if that makes no sense, check out the archives). In pop culture terms, this is the future we've been dreaming of for centuries, when machines will do all the menial junk that we don't want to do, and we can all become artists and scientists and dance in the meadows.

While this is usually painted as some kind of paradise, I'm not so convinced that this is going to turn out happily for everyone. It's a paradise for the people who want to be artists and scientists, sure, but there's another theory I have which I haven't really heard repeated much, and it's this: Some people just aren't cut out to work in automation. A world where being creative is the only thing a machine can't do for pennies on the dollar might be paradise for someone who likes being creative, but for someone who doesn't like it -- or worse, can't do it -- it sounds a lot more like hell.

Sunday, January 31, 2010

Living In The Bell Curve, Part 3

In the last two posts I've been talking about a conceptual model to describe the effect of advancing technological progress on the workforce. If you haven't read them, you might want to do so, because this post is going to be more of the same. The basic idea, though, is that some amount of effort is spent doing things that could be automated but aren't for cost reasons, most effort is spent doing things that can't be automated, and a bit less effort is spent on automating new things. Further, I made the claim that those areas of effort are roughly distributed along a scale representing the complexity of the task, and that the distribution of effort is roughly in the shape of a bell curve. There's a handy graphic that illustrates all this in the first post.
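
For anyone who'd rather see the model as numbers than as a picture, here's a sketch under stated assumptions: effort is normally distributed along the complexity scale, and the zone boundaries (in standard deviations) are values I've picked purely for illustration, not measurements.

```python
import math

def normal_cdf(x):
    # CDF of the standard normal, built from the error function.
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

# Invented boundaries on the complexity scale (in standard deviations):
# below -1.0: tasks already cheap to automate (the red zone)
# above +1.5: effort spent creating new automation (the green zone)
red = normal_cdf(-1.0)
green = 1 - normal_cdf(1.5)
human = 1 - red - green

print(f"red {red:.0%}, human {human:.0%}, green {green:.0%}")
```

With those made-up boundaries, roughly a sixth of the effort sits in the red zone and most of the rest in the middle -- the exact proportions don't matter, only the shape.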

I also claimed, fairly I think, that the effect of technological advancement and automation of processes tends to push existing jobs into the red zone, where they're in danger of being replaced by automation, and that this acts as a forcing function to drive people toward more complex jobs, both as a way to find job safety (further up the curve) and just out of necessity when trying to find work. So what does this really look like, for someone in this situation? It should be self-evident that the things that get automated first tend to be the simplest tasks. The stuff that's left over is more creative, more complex, and generally harder to do, and on average it also ends up creating and touching more in the same period of time, because every human worker at this point uses any number of different automation technologies in the course of their work.

For those of us who fall somewhere on that scale, and I'm definitely one of them, this means that we're constantly in danger of losing our jobs. Even worse, this isn't at the level of a single company, or a section of an industry. It's not from a downturn in the economy which will end at some point. We're in danger of losing our jobs to an automated process that encodes a large part of what we know how to do. We're in danger of having our skill set made entirely obsolete, across the entire culture.

Sunday, January 24, 2010

Living In The Bell Curve, Part 2

In the last post, I claimed that someone was actively trying to steal your job, and mine as well. I also proposed an abstraction for the concept, which involved human effort being distributed along a bell curve of complexity, ranging from those doing jobs that have already been automated, up to being the ones working to automate new things.

One of the most important points of the Tech Bell is that it's self modifying. Specifically, the work represented by the green area (effort put toward automation) will likely change the shape of the curve itself, as well as altering the distributions of the zones within it. This is especially important because any alteration in the distribution of the graph will affect the green zone itself, either magnifying or dampening the effect of the change over time.

The other key point is that the entire graph shifts horizontally over time, being pulled along to the right (toward higher complexity), again due to the effort represented by the green zone of new automation. So although the change in the shape of the graph may be more gradual -- and I suspect it is -- the specifics of the work represented by each zone from one decade to the next may still be drastically different despite the overall shape being largely unchanged.

Sunday, January 17, 2010

Living In The Bell Curve, Part 1

Right now, somebody is plotting to steal your job. Somebody is plotting to steal mine too, and if you happen to be a programmer it may even be the same person gunning for both of us. It doesn't matter where you live, or what you do for a living. Even worse, they're not just trying to take your job for themselves, they're trying to make what you do obsolete.

That's not to say that this is a new phenomenon. The entire point of the industrial revolution was that machinery and automation could make large swaths of production fast and predictable, which they could never be when they were in the hands of individual craftsmen. Automation gives you fast, cheap, and interchangeable parts without which most of the modern world couldn't exist. Still, it's a bit unnerving to think that there are people out there who will, if they do their jobs right, make your knowledge and skill set pointless.

Here's my version of what this idea looks like on paper.

Sunday, January 10, 2010

On Unintended Consequences

So far I've mostly used this space to write about technology, and in pretty abstract and wishy-washy terms. That's not all I want to talk about, but I felt like I needed to get some core stuff ironed out for myself and expressed in concrete language. I may change my mind on a lot of it a couple months from now, but having it recorded in some solid form helps me to direct my thoughts.

This week, though, I've been thinking more about economics. Without question I blame the media for this; Congress is back in action, and the usual public circus continues, with Libertarians and the Right continuing to scream the glories of the free market at the top of their lungs to anyone who will point a camera in their direction. In a better world their energy could be directed toward something more useful -- possibly shoveling literal rather than metaphorical bullshit -- but unfortunately they do get attention, and enough of it that they've already succeeded in derailing some of the most meaningful portions of the health care bill that existed a few months ago.

I have to admit I'm more socialist than not. I can accept the idea that the free market as a concept is useful for explaining the interactions of parties in an unrestricted system. I can even go along with the idea that strict market control from an external policy organization such as the government would be a bad idea when implemented on a very fine-grained level. Trying to plan the production of goods down to a single bar of soap or loaf of bread is likely not going to turn out all that well (although probably better with modern data analysis methods than it did in the 50s without them).

So why am I a socialist rather than a capitalist? Because agreeing that a scientific theory does a good job of explaining a certain system has absolutely nothing whatsoever to do with liking the end result of that system. What comfort is there, when everything around you is going to hell, in being able to explain exactly how you got to that point? Some, maybe, but not enough that I'm willing to just sit back and let it happen.

Sunday, January 3, 2010

We Are Legion

I firmly believe that the human race are cyborgs. It happened years ago, and because we were looking for a specific vision of our sci-fi future we didn't notice that it had already arrived, only in a different form. I think maybe we were waiting for literal biological implants, because that assumption seems to underlie a lot of the fiction that deals with this concept. But why would you undergo invasive surgery if you didn't have to? All it does is make it more difficult and dangerous if you want to repair or upgrade something. So instead we have a softer, gentler cyborg. Instead of implants for data storage and access, we have pervasive access to the network; instead of implants for telepathic communication, we have cell phones.

The two necessary features of this state of existence, in my mind, are the capacity of machines to augment human prowess, and to add new senses to the human experience. As I've mentioned before, technologically augmented prowess is obvious, in the form of functionally expanded brain capacity through the cloud -- or for a low(er) tech version, super-human speed through mechanical transportation. Somewhat less obvious is the addition of new senses, but without loosening the definition too much I believe that requirement is met as well.

One we've had for millennia -- a compass, one of the simplest tools around, lets us sense magnetic fields. More recently, the construction of the GPS system essentially fabricated a new sense for the entire species, letting us sense our physical location with amazing precision on a global scale. Infra-red goggles allow us to see more parts of the electromagnetic spectrum; cellular phones allow us to selectively hear anyone in the world from anywhere; rapid access to a network containing public commentary lets us almost instinctively know whether an experience (movie, restaurant, etc.) we are contemplating will be enjoyable.