I can't remember if I've already posted on this or not. If so, I'm sure this take will benefit from the increased wisdom and experience I bring to bear. Or not.
This month marks a personal anniversary. As of this month, it's been 30 years since I started working full-time in some form of software development or other. (I know ... I've previously said that anniversaries, and especially round-numbered ones, are artificial and meaningless, but let's go with it here.) So, reflecting on three decades of software development, I see much that has changed, and much that has not.
When I started, we worried a lot about the memory and processor speed constraints of minicomputers. As soon as those resources became virtually unlimited, we started developing software for workstations, and again had to worry about memory and processor speed. By the time workstations became virtually unlimited, we were developing software for PCs. Guess what. And now it's cell phones and hand-held devices. What's next? Implants?
After a year at my first full-time job, at a big Wall Street brokerage company, I got a job at DEC, then the coolest computer company around. Working at DEC was like being in college, but making money. I joined DEC in the New York office, where about 20 of us software specialists all shared a single VAX 11/780 as our main computer. We connected via VT-100 terminals and did everything with command-line programs. Now I have many times that amount of computing power and memory in each of my pockets, not to mention my Mac, Windows, and Linux desktop machines, my camera, etc., and it's still not enough!
When I started, we wrote programs to solve problems. I would estimate we spent about 80% of our efforts on finding the best solution to the problem, and getting it to work reliably and efficiently. The other 20% was spent on integration ... getting the program to be compatible with other software, to provide compliant interfaces, and generally to play nicely with the rest of the system.
Now, it's the opposite. We spend about 20% of our time actually solving the problem at hand, and the other 80% on making sure everything is translatable into every human language, compliant with the latest Microsoft interfaces, Web compatible, accessible, interoperable, scalable, scriptable, and just about every other kind of -ible or -able.
Gordon Moore observed back in 1965 that the number of transistors we can fit on a chip keeps doubling; the now-standard statement of his law puts it at about every two years. By extension, Moore's law is widely taken to mean that the capability of any given technology doubles about every two years, or at some similarly astonishing rate. I don't think anyone has yet quantified how quickly our expectations of technology grow, but it's got to be at least 4 or 8 times the Moore's law rate. (OK, we geeks are stuck on powers of 2.) It's a major news story when it takes people a few hours to download and install over 200 megabytes of new iPhone software. We complain if an email to the opposite side of the earth takes almost half a minute to send. We gripe about spending a few dollars for a first-person shooter game for our cell phones.
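For a sense of scale, here's a quick back-of-the-envelope sketch (in Python, purely illustrative) of what doubling every two years compounds to over a 30-year span like the one above:

# Illustrative only: compound Moore's-law-style doubling over a career.
YEARS = 30              # the span of the career described above
DOUBLING_PERIOD = 2     # years per doubling, per the usual statement of the law

doublings = YEARS / DOUBLING_PERIOD
growth_factor = 2 ** doublings

print(f"{doublings:.0f} doublings -> roughly {growth_factor:,.0f}x growth")
# Output: 15 doublings -> roughly 32,768x growth

Fifteen doublings works out to a factor of about 32,000, which makes the jump from a shared VAX 11/780 to a pocketful of computers feel about right.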
One thing I've learned in 30 years: Computers will never do what we want!
1 comment:
Peter, isn't this the story of Sisyphus? You'll never get that boulder to the top! That's part of being human.
In your last entry, you mentioned the mute button being subverted by TV programmers. I've noticed that software people tend to want a LOT of control over their environments and gadgetry, if not TOTAL control. And they're unhappy, and they struggle, when they don't have this control.
Of course, this compulsion is what makes them software people to begin with.
So this is your boulder.