Wednesday, December 24, 2008

Overly Credulous



Really? It's in a friggin' plastic spray bottle, for chrissake! Isn't that a clue?

Saturday, December 20, 2008

Rich Man, Poor Man

On the front page of today's Boston Globe, there are two stories with interesting parallels. One is the story of a 13-year-old boy who, along with another 13-year-old and a 20-year-old, murdered his drug-dealing half-brother, stole $10,000, and set fire to the body.

The other story deals with Bernie Madoff, the Wall St. investment artist who bilked some of the largest investors in the world out of $50 billion, bringing major charitable foundations to ruin.

In this holiday season, it's good to remember the basic human qualities we all share.

Friday, December 19, 2008

Keeping Stuff

I have a lot of books. I mean a lot of books. Thousands. I think if I got rid of them all, the house would rise several inches from the ground. I've always been a book hoarder. I somehow delude myself that merely possessing them gives me more immediate access to their contents, whether I've read them or not. I have the further delusion that there's some permanence to this ... that I'll always be able to enjoy these books.

But even for the ones I have read, it's often difficult to go back and find specific information ... favorite passages, obscure facts, etc. This is supposed to be one of the great benefits of electronically stored information. But electronic information fails on the permanence quality. Information gets lost or corrupted, or simply becomes unreadable through hardware and software obsolescence. It may even become unavailable through business failures.

Electronic information even comes up short on the accessibility criterion. Sure, I can search my email with GMail, and I can search my PDFs with Acrobat. I can search my archived Web sites with ... well, nothing that I know of, as Web sites are notoriously ephemeral. As the sources of information ... email, blogs, on-line articles, etc. ... grow, the possibilities for saving and retrieving all that juicy info become more and more feeble in comparison.

I'm going to experiment with Evernote, which claims to be the answer to all these concerns. I'll report here on what I find. Meanwhile, I welcome any comments or recommendations for storing everything I ever knew or wished I knew.

Monday, December 15, 2008

A snowball's chance ...

You want to know what's wrong with the world? I'll tell you what's wrong with the world. A circular for a sporting goods store in this weekend's paper showed not one, but two different plastic snowball-making/throwing gadgets. Snowball making?

For thousands of years, kids were able to make snowballs and throw them with their bare hands. Now they need a plastic utensil for this?

That's what's wrong with the world.

Thursday, December 11, 2008

Newspaper Bail-out

Clay Shirky's post about the decline of the newspaper industry, and of the information industry in general, highlights a concern I've had for a while now. It's not just the dead-tree technology that's becoming increasingly devalued. It's the content itself. Look at what's happened in the music industry, and what's now happening in Hollywood. It's a slow, lingering death, but the abundance of acceptable-quality on-line journalism, music, video, etc. is surely fatal to the traditional creators of this entertainment.

At one time, we thought there might be a two-track system, allowing users to get sponsored content for free, or pay extra for ad-free content. But even that appears to be eroding, because advertising is an invisible cost to the viewer ... you don't know you're paying with your eyes and ears. And advertising, in the form of product placements, is getting more invisible and more difficult to separate from the entertainment/information part of the content.

Of course, the big shift in this is the disappearance of editing, or at least of reliable editing. Newspaper editors take much of the responsibility for the space, priority, and treatment of news stories, just as years ago, publishers determined what books would reach the shelves. The decline of these gatekeeping functions may sound like a rallying cry for non-mainstream artists, creators, etc., but may also turn the information world into an unstructured, formless blob of competing criers, clamoring for attention.

Friday, December 5, 2008

Support Call

About half the world will never fully understand this, but for those who will, so much of life depends on the proper containment of the scrotum. Striking the right balance between support and freedom is essential to a sense of physical well being. And physical well being is, of course, vital even to engagement in intellectual pursuits.

So those grandparents who insist on giving underwear as gifts know what they're doing.

Thursday, December 4, 2008

Rating Technology

In the New York Times Gadgetwise blog, Roy Furchgott comments on the way Consumer Reports conducted ratings of smartphones. In particular, Furchgott says:
The point is not that Consumer Reports has done a bad job. The point is Consumer Reports has tackled an impossible job. Picking a phone is dependent on so many factors – which service provider you want, the quality of reception in your area, and what features you value. There is no way to create a rating system that has all of the answers.
The smartphone is such a versatile device that there's no way to create a single rating or score that will be true for all potential users. For example, Consumer Reports claims to have weighed voice quality more heavily than other attributes in their ratings. I use a smartphone almost exclusively for Internet access, email, etc. I occasionally make phone calls, but really, the only reason I got this instead of a separate dumb phone and PDA was to free up one pocket. (See my earlier post, "No Such Thing As Too Many Pockets").

So if rating smartphones poses this challenge, isn't the same true for other technology devices? Consumer Reports has rated computers, digital cameras, TVs and home entertainment equipment. All of this stuff is becoming incredibly complicated, versatile and ... interlinked. I want Internet access from my TV, so I can download Netflix movies and read my email in my living room. (Yup, moving the laptop is too much trouble.)

Obviously buzz (the current term for word of mouth) has a lot to do with which devices we choose. But is it possible, as it once was with toasters and washing machines, to come up with objective comparisons and ratings of the current gadgets?

Wednesday, November 19, 2008

Restart will be required.

I've ranted about this before, but seriously, this is what my Mac's Software Update is saying:

[Screenshot: the Mac OS X Software Update window]

Note the message in the bottom left corner: Restart will be required.

Really? Why? What is so friggin' essential about Safari or iPhoto or AirPort Extreme (I'm using a wired connection here) that requires a full reboot of the operating system?

This is just developer laziness, or excessive caution.

I'm working here. Get it? I'M WORKING HERE!

I have multiple windows open, with just the right PDF documents displayed, and all the windows configured correctly. I have the right files open in my developer tools. Even with shortcuts like "Open recent ..." and Firefox's automatic tab re-opening (which Safari makes you do by hand, for some reason), having to restart, and then re-open and re-arrange everything, is a big interruption.

Now I understand the concept of the therapeutic reboot ... the reboot just to get the system back to a stable state after it's gotten all weird. But this isn't about that. This is just some stupid little software update.

So, Software Update, get over yourself, you annoying little twit.

Same to you, Windows Update.

Tuesday, November 11, 2008

Rush to Judgment

A lot of the talk surrounding the recent presidential campaign was about judgment. Did Obama's connection with Bill Ayers show bad judgment? Was McCain's selection of Sarah Palin as his running mate poor judgment?

Actually, everything is about judgment. We use judgment constantly, all day every day. Everything we do voluntarily, from getting out of bed to making business decisions, involves judgment. In fact, the new field of behavioral economics has much to say about how our conscious and unconscious brains collaborate in this decision-making process. The long and short of it is that our judgment is not always rational, and there are a host of biological and environmental factors that contribute to how we choose.

So, given that our judgment is so subjective and fragile, doesn't it make sense to use objective data and criteria when possible? Heck, even baseball now uses instant replay to augment the umpire's decision-making.

So shouldn't DNA evidence, where available, be a right of accused and convicted people in criminal cases?

Friday, November 7, 2008

Acts of Contrition

I'm sorry. I know I've been remiss. I have to confess that I'm finding it very hard to be curmudgeonly this week. I'm trying. Normally I thrive on cynicism and negativity. But I just can't seem to muster much ex-thusiasm right now.

So I apologize for the tenor of these posts lately. But don't worry. I'm sure Obama will screw something up, and we'll be back to normal. (There, you see? There's hope! ... Oops.)

Wednesday, November 5, 2008

The Real Change

What amazes me most is that my son's generation is not amazed at all. They don't see anything remarkable in our having elected an African American president.

That's the real change!

Tuesday, November 4, 2008

Whew!

Amazing!

Why Be a Superpower?

Much of the talk surrounding this presidential election has to do with income inequality and the distribution (or re-distribution) of wealth. Both major party campaigns have at least paid lip service to the plight of the middle class.

What is the connection, if any, between wealth distribution and a nation's status as a world power? The U.S. came of age as a world power during the so-called Gilded Age, when new industries and a laissez-faire regulatory policy created huge discrepancies in wealth. This lasted until after World War II, when the more egalitarian welfare state took hold. So is inequality a prerequisite for being a world power?

And what's wrong with being a former world power anyway? France hasn't been a threat to anyone since the days of Napoleon, and that seems like a pretty nice place. Britannia no longer rules the waves, but there's great theater, dining and shopping in London. So maybe it's time to take it easy.

Sunday, November 2, 2008

Faith, But No Love?

John McCain said:

[Senator Obama] said the other day that his primary victory 'vindicated his faith in America.' My country has never had anything to prove to me, my friends. I have always had faith in it...

I guess John McCain has always been a white male.

In any case, John McCain told Sean Hannity earlier this year, "I really didn't love America until I was deprived of her company..."

Hmmm.

Friday, October 31, 2008

Cause for Celebration

Well, the presidential election seems to be at the stage where people are betting on the point spread rather than on who's going to be victorious. Undoubtedly, whoever becomes the next president will fall far short of his supporters' expectations, and yet be vastly better than his detractors' worst fears. (I can use the masculine possessive with reasonable confidence here.)

However, Barack Obama's election would have an automatic symbolic significance that can't be ignored. No matter how much we convince ourselves we're not voting for racial reasons, the election of the country's first African American president will be a major milestone in our history. And though this certainly would represent an enormous accomplishment by Obama, it says more about the country than it does about the candidate. (Ok, maybe it says something about George W. Bush and Sarah Palin also.)

Should this come to pass, as now appears likely, it's a great cause for celebration, regardless of your politics or ideology. We can all celebrate the fact that our country has the maturity to take a giant step closer to the dream articulated by Martin Luther King 45 years ago:

I have a dream that my four little children will one day live in a nation where they will not be judged by the color of their skin but by the content of their character.

Of course, this is certainly not the end of racism in this country or elsewhere. Obama's election would not mean the end of racism any more than Bush's presidency marked the end of discrimination against the mentally challenged. Still, it's a good time to celebrate.

Thursday, October 30, 2008

White Knuckle Waiting

Anyone who's ever been a Red Sox fan knows the feeling ... your team's ahead, there are only a few outs or an inning left, and you just sit with clenched fists, hoping they don't blow it, as they have so many times before. Prior to 2004, at least, the Red Sox were certainly the champions at snatching defeat from the jaws of victory. Season after season, they were the "also rans" ... the ones who just missed ... the worst best team in baseball. Even this year, they came roaring back against the Rays in the ALCS, only to lose it in game 7.

So the fans got conditioned to this ... waiting and hoping.

That's what this election is starting to feel like.

Monday, October 27, 2008

Mad Men

If you looked around the offices of Sterling Cooper, you'd find one man named Donald and another named Duck.

Coincidence?

Friday, October 24, 2008

We Know So Little

Scotch tape, which has been around since 1930, emits x-rays when unrolled in a vacuum.

Transistors, which have been around since the late 1940's, produce a new state of matter, distinct from solid, liquid, gas and plasma.

Alan Greenspan, who has been around forever, says "oops." (Of course, in Greenspan-ese, it takes 3247 syllables.)

We know so little, even about the most commonplace, mundane things in our lives. And yet we strut around proclaiming this and declaring that as if we were all knowing.

A little humility would not be out of place.

Tuesday, October 21, 2008

Porting the Constitution

In traditional software development, as in other forms of engineering, the process frequently starts with a loose set of requirements. These are then reviewed and refined into a more precise statement of requirements. This is followed by a functional specification, describing the outward appearance of a product to meet those requirements. Next comes a design specification, detailing how the product will be built, and possibly even an implementation spec, with more low-level detail. Finally, there is code. In a sense, there's a continuum from requirements to actual code, with high-level understanding or philosophy at one end of the spectrum, and very specific implementation at the other.

Newer methodologies, such as agile development, appear to bypass some of these steps. However, these are still implicitly in place, although they may not take the form of written documents. Developers must still understand what problem they are solving, and then devise a possible solution, and refine that solution in more and more detail.

It frequently happens that software originally developed for one environment must be ported to another, perhaps with newer technology. In this process, the sequence described above may be reversed. Developers may initially try to port the code directly, possibly using porting tools to automate this. In places where this doesn't work, developers step back to a higher-level statement of intent, so they can carry out that intent in the newer environment.
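
Here's a toy example of what I mean, in Python (my own contrived illustration, not anything from a real porting project):

    from pathlib import Path

    # The "original text", written for a Unix-only world:
    #     home = os.environ["HOME"]   # a literal port of this breaks where HOME isn't set
    # Stepping back to the intent ("find the user's home directory")
    # lets us carry out that intent with the new platform's own idiom:
    def user_home() -> Path:
        return Path.home()

    print(user_home())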

Our constitution can be viewed in this way also. The Constitution is a document that includes both high level requirements:

...in order to form a more perfect union, establish justice, insure domestic tranquility, provide for the common defense, promote the general welfare, and secure the blessings of liberty to ourselves and our posterity...
and implementation details:

Every bill which shall have passed the House of Representatives and the Senate, shall, before it become a law, be presented to the President of the United States; if he approve he shall sign it, but if not he shall return it, with his objections to that House in which it shall have originated, who shall enter the objections at large on their journal, and proceed to reconsider it. If after such reconsideration two thirds of that House shall agree to pass the bill, it shall be sent, together with the objections, to the other House, by which it shall likewise be reconsidered, and if approved by two thirds of that House, it shall become a law. But in all such cases the votes of both Houses shall be determined by yeas and nays, and the names of the persons voting for and against the bill shall be entered on the journal of each House respectively.
Unfortunately, it's a bit light on the functional description that might lie between the requirements and the implementation.

This very elaborate engineering document was ratified in 1788, over 200 years ago. During the course of that time, the platform on which the implementation resides has changed. Apart from the obvious technology changes of mass media, cars and assault weapons, the geography and demography have changed drastically. Instead of a skinny stack of 13 states, the U.S. now spans the continent east to west, plus Alaska and Hawaii. The population includes a wide mix of urban and rural, European and non-European, religious and non-religious cultures.

The Constitution does propose a means for its own amendment, and this has been exercised twenty-seven times. However, amendment is in itself an elaborate process, with many political ramifications.

All this is by way of saying that in interpreting the Constitution, perhaps it's best not to take a magnifying glass to the original text, or to try to divine what the authors meant by this or that turn of phrase. Perhaps the most useful way to interpret this document is by inferring the functionality that was intended, lying somewhere between the goals in the preamble and the implementation in the following seven articles. Then we can try to achieve that functionality in the current operating environment.

Friday, October 17, 2008

Spread the Wealth Around

In the Oct. 15 presidential debate, John McCain mocked Barack Obama's desire to "spread the wealth around," equating it to redistribution, a conservative curse word. Republicans frequently refer to redistribution, or even class warfare, to suggest that their political opponents are fundamentally socialists.

Nothing could be further from the truth. In fact, the most dramatic redistribution of wealth has been going on for a while, mostly at the hands of Republican governments. George W. Bush, in particular, has accelerated that trend, as described on the eve of Bush's re-election by BusinessWeek here, and on the PBS program NOW here.

But this redistribution has been of the Reverse Robin Hood type ... the wealth flows toward the already wealthy, and away from everyone else.

So maybe instead of Democratic or liberal redistribution, we should speak of correction.

Thursday, October 16, 2008

It's Greed

Greed is to blame for the whole Wall St. fiasco. At least, that's how John McCain portrays it. It's all those greedy, latte-scarfing, BMW-driving merciless Wall St. types who have mortgaged our country. They should be pilloried ... driven from town ... sent to a special place in hell for greedy financiers.

But don't raise their taxes!

Tuesday, October 14, 2008

Why Products Succeed

I don't believe in manufactured demand. I don't believe marketers are ever so good that they make a success of a product nobody wants. People have to want it at some level, or at least to think they do, to make a product a success.

I was recently arguing with someone on an email list about the likelihood of success of a new 3D camera. He pointed to several examples in the past of 3D cameras that were introduced, but never caught on. All I could point to was a period in the late 1940's and the 1950's when certain 3D cameras were extremely successful, selling hundreds of thousands of units. To my mind, selling hundreds of thousands of 3D cameras, even if only for a brief period lasting less than a decade, indicates that there are people who want to create 3D pictures.

When a product fails, there can be many reasons for that failure. It could be poor pricing. It could be poor manufacturing. Bad marketing, bad customer service, inappropriate sales channels, and dozens of other factors could account for a product's demise.

When a product succeeds, however, there's only one reason: it's right. If the product does meet a (real or perceived) demand, and the production/sales/delivery process doesn't get in the way, the product will succeed.

Tuesday, October 7, 2008

McCain vs. Pork

Just to be clear, the "overhead projector" that McCain keeps referring to as an example of Obama's excessive spending is, in fact, a replacement for the planetarium projector at the Adler Planetarium in Chicago. This was the first planetarium in the Western hemisphere, and has been a very popular instructional tool for students and other museum visitors.

So does McCain consider science education to be pork-barrel spending?

Monday, October 6, 2008

Most Qualified vs. Most Entertaining?

As a corollary to my previous post, Capitalism Depends on Greed, I think we'd have to conclude that the entire cast and the writers of Saturday Night Live will be voting for the McCain/Palin ticket. This may also apply to Jon Stewart, Bill Maher and others.

Thursday, October 2, 2008

Capitalism Depends on Greed

Ok, let's get this straight. Free market capitalism is based on greed. That's the engine that makes it work. The entire basis of capitalism is that everyone acts in his or her own self-interest, to try to get as much as he or she possibly can. That fact is what's supposed to keep the players in check.

Don't get me wrong. Free market capitalism has many failings. The most obvious is the tendency of wealth to concentrate more and more into fewer and fewer hands over time. Another is that naivete and lack of education allow some people to be victimized by others. A third is that people often forsake their long-term interests for short-term gratification. ("I want to drive now. Who cares if the environment is a few degrees warmer in a hundred years?") Also, there's the unspoken collusion that allows, for example, oil companies to jack up prices together, ensuring higher profits for all.

But greed is not a failing of capitalism. It is the basis of it.

So for John McCain or Sarah Palin to say they're against greed is just to say they're against capitalism.

Wednesday, October 1, 2008

My Unfair Lady

My Unfair Lady
(or Pygmalion)
Scene 1: A crowded street, evening.

McCain: You see that downtrodden governor? In six months, I could pass her off as Vice President of the United States!
Bush: Really? She looks nothing like Dick Cheney.
McCain: No, no. The next Vice President.
(sings)
Why can't some parents teach their children to abstain?
For when they face temptation, they simply must refrain.
If they learned abstention, sir, at least till they are wed
... More states would be voting red!
(They leave.)
Sarah: (sings)
All I want is to be V.P.
Learning loads at my boss's knee.
And maybe someday be
Just P!
That would be loverly.

Lots of oil flowing through my pipes
Pays for diapers and baby wipes,
But soon these D.C. types
Serve me!
That will be loverly.

Scene 2: A room at the McCain mansion ... one of them.

McCain: Now try it again!
Sarah: The flack for Iraq? Just blame it on Barack.
McCain: Again?
Sarah: The flack for Iraq? Just blame it on Barack!
McCain: I think she's got it. I think she's got it!
Sarah: (singing) The flack for Iraq? Just blame it on Barack!
McCain: (singing) By George, she's got it! Hey, George, she's got it!

(To be continued ... maybe.)

You Didn't Phrase Your Response As a Question

The debate between the two vice presidential candidates on Thursday is really going to be a test of how much information the McCain team has been able to cram into Sarah Palin's head over the past week, and how well Joe Biden can keep his feet clear of his oral cavity.

Shouldn't this really be moderated by Alex Trebek?

Tuesday, September 30, 2008

Self-Fulfilling Prophecies

The stock market is the ultimate game of chicken. You try to hold onto stocks as long as they're rising, and jump out just before they crash.

Because of this, stock market forecasts tend to be self-fulfilling prophecies. If enough people believe the market is going up, they start buying more stock and guess what! Likewise, if people fear a downward trend, they jump ship, which, of course, pushes things down. That's why the classic advice "buy low, sell high" is something of a joke. You try to buy at rock bottom (though of course no one knows when this is until things start going up), and sell at the peak (again, best guess). Most people, however, buy high ("Gee, everything looks great!") and sell low ("Ouch. Things really tanked.")

So basically, if pretty much all political leaders say the market will crash if X doesn't happen, and X doesn't happen ... guess what! Investors assume the threatened crisis will occur, and try to get out while the getting's good.

So is the real problem the fact that X (the bailout bill) didn't happen, or the fact that everyone said X had to happen to avert disaster?

The other factor is the yo-yo effect. After a crash, some people who actually follow the "buy low, sell high" advice go bargain shopping. This pushes things up until the next round of fear sets in, so the market bounces up and down daily, like a yo-yo. You have to look longer term to see the trends.
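
If you want to see the feedback loop in miniature, here's a toy simulation (every number is invented ... this is a cartoon, not a market model). Each day the price follows the prevailing sentiment, and sentiment, in turn, chases the latest price move:

    import random

    price = 100.0
    sentiment = 0.0   # -1.0 means everyone fleeing, +1.0 means everyone buying

    for day in range(10):
        move = 2.0 * sentiment + random.uniform(-1.0, 1.0)
        price += move
        # The move feeds back into the mood, clamped to [-1, 1].
        sentiment = max(-1.0, min(1.0, sentiment + move / 5.0))
        print("day %d: price %7.2f, sentiment %+.2f" % (day, price, sentiment))

Run it a few times and you get booms, busts, and the occasional yo-yo, all from belief chasing itself.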

What does all this have to do with being a tech curmudgeon? Nothing. I just felt like ranting.

Monday, September 29, 2008

Judgment vs. Consensus

I'm going to have to write a lot more about this at some point, but briefly, I think the Web is moving us from a Judgment-oriented culture to a Consensus-oriented one. By that I mean that we are substituting the opinions of many for the selectivity of a few authorities.

This is obvious in the broadest sense of the term publishing. It used to be that you had to have editors and publishers find enough value in your work to deem it worthy of publication. Now anyone with a Web site can publish, and it can look as authoritative as the big money sites.

I've also seen Web contests in which competing writers (or illustrators or whatever) try to get all their friends, relatives, acquaintances, co-workers, etc. to vote on their work, so they can beat out all the other contestants whose friends, relatives, etc. are less cooperative. There have always been contests in various creative pursuits, but they seem to have proliferated in this "click to vote" world.

Some people choose their iTunes music and other downloads based on the ratings these have received, rather than based on music reviews or other supposedly authoritative opinions.

To some extent, the media have always been consensus-driven. Producers look at sales figures, Nielsen ratings, etc. to steer their resources. But it's become much more direct. Not only are the media providers looking at these figures, but buyers are relying on public opinion much more directly and more heavily than in the pre-Web days. This is how viral marketing and memes work.

Friday, September 26, 2008

In English, por favor

So I drive up to my bank's outdoor ATM (or walk to the indoor one). I swipe my ATM card, and enter my PIN. Then the machine asks if I want to continue in English or Spanish.

Hello?

I've been banking here since this bank took over my previous one, where I banked for years before that. I've used my card and my PIN, so this machine now knows who I am. It knows who I am enough to let me get at my money. That implies a pretty high level of recognition.

But it doesn't know what language I speak? Or perhaps it thinks I've suddenly decided to learn Spanish, and I want to practice by moving money around.

Never mind that I always, ALWAYS, ALWAYS ask for a receipt ... it still asks me if I want one.
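
For the record, fixing this is not exactly rocket science. Here's a hypothetical sketch (entirely my invention ... no real bank's software is implicated) of remembering preferences keyed to the same card that already unlocks my money:

    # Hypothetical per-customer preferences, keyed by card ID.
    preferences = {}

    def get_prefs(card_id):
        # First visit gets defaults; every visit after that is remembered.
        return preferences.setdefault(
            card_id, {"language": "en", "wants_receipt": True}
        )

    prefs = get_prefs("my-card-1234")
    if prefs["language"] == "en":
        print("Welcome back.")            # no English-or-Spanish screen
    if prefs["wants_receipt"]:
        print("Printing your receipt.")   # "always" finally means always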

Thursday, September 25, 2008

Flinchers

According to a new study, reported in Science magazine, there's a correlation between a person's reaction to sudden loud noises and startling images and that person's political inclinations. Specifically, people who have strong reactions to shocking images or sounds tend to be more conservative, while people who are less sensitive to these stimuli tend to be more liberal.

So, I guess the proper response to this news would be to start referring to conservatives as flinchers.

Tuesday, September 23, 2008

And the wolf shall dwell with the lamb ...

On September 22, 2008, Paul Krugman and William Kristol wrote op-ed pieces in the New York Times that essentially agreed with each other.

The world must be coming to an end.

Saturday, September 20, 2008

Hold On To Your Large Hadrons

So it appears that after a whole week and a half in quasi-operation, the Large Hadron Collider (LHC to its friends) has sustained severe damage, and will be out of commission for months. Well, I guess after so many billions of years, the secret of the universe can wait a few months to reveal itself.

This doesn't paint a good picture of the scientists who designed this equipment, however. It must be especially unnerving to those folks who are still convinced that an earth-swallowing black hole will result from its activation.

Sounds like they need a good ad campaign. I hear Jerry Seinfeld's available.

Friday, September 19, 2008

Clone Wars

With all that's going on this week, dueling presidential candidates (plain and vice), investment bank meltdowns, and the climax of the regular baseball season, the one thing that seems to have lit up the blogosphere is ads. In particular, it's the ad war between Apple and Microsoft.

For about two years, Apple has been running a series of ads on TV and in print, featuring an ever-so-cool looking personification of a Mac inevitably triumphing over a somewhat stuffy and officious PC character, more or less a caricature of Bill Gates.

After a brief foray into ads about nothing, featuring former Microsoft head Bill Gates and former TV star Jerry Seinfeld, intended to salvage Vista's tarnished image, Microsoft changed horses and decided to go with a new campaign spoofing Apple's, featuring a look-alike of Apple's Bill Gates look-alike.

One step ahead, though, Apple launched a new ad spoofing Microsoft's spoof of Apple's ads. This features the original Gates simulacrum pleading with people to stop switching to Mac and, of course, in the process, demonstrating his own ineptitude.

There's one more good thing about open source software like Linux ... no ads!

Stir With A Fork

Now that the weather's getting cool again in New England, I can return to one of my instant gratifications. I dump a packet of instant hot chocolate mix into a cup, and then brew a cup of coffee into it. Ok, I know it's nowhere near as good as a real mocha can be, but it's cheap and quick, and it's not half bad.

Now hot chocolate mix, being naturally gregarious, tends to clump up when the coffee is added, and stirring just makes the clumps swirl around. What's really needed is a good amount of turbulence to break up the clumps so they'll dissolve more quickly (e.g., before I finish drinking the hot chocolate). At first I tried switching directions. I'd stir counter-clockwise for a while, until the drink really got going, and then switch. I figured the sudden change in direction would confuse the little hot chocolate clumps, causing them to scatter in a panic.

Nope. They happily switched direction with me.

Now using one of these little plastic stir sticks, or stirrers as they're euphoniously called, is completely useless. They have no drag. They simply slice through the liquid, producing no turbulence or wake. I tried switching to a spoon. Once I got the hang of not splattering the coffee/hot chocolate mix all over my shirt, this became pretty straightforward. However, it still didn't break up the hot chocolate clumps.

Finally it hit me. To really work up a good, clump-breaking turbulence, use a fork! By forcing the liquid through those spaces between the tines, you really shake things up, and also put pressure on those clumps that directly impact the fork tines.

So if you suffer from clumpy hot chocolate mix, stir with a fork!

Can you tell I had too much free time today?

Thursday, September 18, 2008

The Big Bucks

TechCrunch has culled the 29 technology moguls from the Forbes list of the 400 wealthiest Americans. These 29 have a combined net worth of $239 billion, about enough to give $35 to every person in the world! Bill Gates alone could give everybody about eight bucks! He could say "Hey, earth! Let's go to Starbucks. It's on me." And he'd still have enough left to go shoe shopping with Jerry Seinfeld.

Of the top 7 spots, three are Microsoft dudes, and two are Googles. These 7 are each worth $15 billion or more. Then we drop to Jeff Bezos of Amazon.com, with a paltry $8.7 billion. He'd better start selling more Kindles.

Steve Jobs has only about one tenth the wealth of Bill Gates. On the other hand, he's got Mickey Mouse.

So what's the point of all this? No idea. I just thought it was amusing.

But in closing, let me say that I invite anyone from the TechCrunch 29, or indeed the whole Forbes 400, to give me a billion dollars. Bill Gates wouldn't even miss that much. Or you could all chip in. If the whole Forbes 400 ponies up, it's only a measly 2.5 million each. That's chump change to you!

In return, I promise to dedicate my writing to the benefit of all humankind!

Think about it. I take PayPal.
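
P.S. For the skeptics, here's the back-of-the-envelope arithmetic. (The world population figure, roughly 6.7 billion in 2008, and Gates' net worth are my rough numbers, not TechCrunch's.)

    world_pop = 6.7e9        # rough 2008 world population (my estimate)
    techcrunch_29 = 239e9    # combined net worth of the TechCrunch 29
    gates = 57e9             # Bill Gates alone, give or take

    print(techcrunch_29 / world_pop)   # about 35.7 ... call it $35 a head
    print(gates / world_pop)           # about 8.5 ... Starbucks money
    print(1e9 / 400)                   # 2.5 million each from the Forbes 400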

Wednesday, September 17, 2008

Wall Street in Black Hole

Nobody else seems to have figured this out, but the current kerfuffle on Wall Street was caused by black holes produced by the Large Hadron Collider at CERN. Now I know most of the free world has been clamoring for a way to have its hadrons collided, especially the large ones. "Hey, these things don't collide themselves, you know!" is the all too frequent cry of the hadron owner.

But despite the overwhelming demand, building the collider was a rash and irresponsible move. As many feared, the force of all those hadrons banging into each other brought about the creation of black holes, like the one first observed in Calcutta in 1756. As everyone knows, black holes are places in space where a star collapsed, forming such an intense gravitational field that not even light can get out. Of course, anything that's near a black hole tends to get sucked in, as we saw in the Disney movie, The Black Hole. (Like most people, I always turn to Disney movies when I need to understand some esoteric concept of modern cosmology.)

So, the black hole can swallow anything ... planets, stars, etc. What else in the universe could swallow up four trillion dollars? And hasn't anyone else noticed that the collider is in Switzerland, where they have the world's most secure bank accounts? Moreover, the collider was entirely designed by scientists, the very people the current administration has worked so hard to squelch!

This is just too much coincidence to go unnoticed. I think we should demand an immediate investigation!

Monday, September 15, 2008

Bail Outs

It's difficult not to draw a comparison between the thousands of residents of Galveston, Texas, who needed rescuing after Hurricane Ike hit this past weekend, and the Wall St. investment banks that needed rescuing the same weekend. Both are victims of catastrophes that were seen and anticipated way in advance. Hurricane Ike was talked about for over a week as it moved its inexorable way across the Atlantic and into the Gulf. Its likely landfall in Texas was predicted days in advance.

Likewise, the housing and credit crises that took down Lehman Bros. and Merrill Lynch, and threatened AIG, had already offed Bear Stearns, the Federal National Mortgage Association (FNMA, or "Fannie Mae") and the Federal Home Loan Mortgage Corporation (FHLMC, or, for some strange reason, "Freddie Mac"). No surprises here.

I think there can only be three reasons for ignoring the very marked warnings that have accompanied both of these catastrophes:
  1. You're not listening.
  2. You don't believe what you're hearing/seeing.
  3. You think you can beat the odds.
If you're just plain not listening ... well, what can I say. If you're an investment banker, your whole job is supposed to be about knowing what's going on.

If you don't believe what you're hearing and seeing, you'd better start looking for more credible sources. Stop reading this blog (just temporarily) and pick up a newspaper ... yes, an actual printed newspaper (I somehow think print still carries a credibility cachet that's missing from Web media) ... preferably one that's not owned by an entertainment conglomerate. Sure, there's usually a range of opinions about the severity of the credit crisis, but with the possible exception of John McCain, I don't think anyone has been denying that there are some serious economic issues in the U.S. right now.

And if you just think you can beat the odds, thank you. You're helping keep our state lotteries afloat.

Friday, September 12, 2008

Think Small

In his blog on Adobe, John Nack voices concern that many Photoshop users will find the addition of 3D features off-putting, and will believe the product is evolving in directions they don't care about. Unfortunately, this is an intrinsic problem with monolithic ("single stone") applications. They're big, expensive, and wasteful. Not everyone is going to use all the features, but everybody pays for them, in disk and memory space and in performance as well as in price.

I thought we had learned this lesson years ago. Unix had things beautifully worked out: hundreds of simple command-line programs that you could use as building blocks, creating your own workflow by piping the output of one program into the input of the next.
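
You can get the same effect in miniature with a few small functions chained together, Unix-filter style. A throwaway sketch in Python (the moral equivalent of cat log | grep ERROR | head -2):

    # Each function is a little filter: lines in, lines out.
    def grep(pattern, lines):
        return (line for line in lines if pattern in line)

    def head(n, lines):
        for i, line in enumerate(lines):
            if i >= n:
                break
            yield line

    log = ["ok: started", "ERROR: disk full", "ok: retry", "ERROR: on fire"]
    for line in head(2, grep("ERROR", log)):
        print(line)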

In the early days of object-oriented programming, this kind of world was envisioned. The computing environment would be populated with objects which had methods, and could respond to messages. Your new application would send messages to the appropriate other objects, already in place, to use their methods to manipulate their data. CORBA (Common Object Request Broker Architecture) was one statement of this vision, though it never caught on for various reasons.

But the general idea, that software should be built from small, reusable components, has been around for about as long as software has. In fact, software is just a way of re-purposing hardware!

The plug-in approach is a step in this direction. Applications like Photoshop, Illustrator and InDesign allow plug-ins to add new functionality, or change the behavior of what's already there. But there's still a tendency for the applications to become more and more bloated with new built-in features and functions in each release. The most obvious downside of this is that it drives the price up to the point at which there's a whole sub-industry of Photoshop imitators. Photoshop CS3 is listed at $650 on the Adobe site right now! The "Extended" version is $999! And this is software for creatives?

I think developers need to challenge themselves constantly to think in terms of software building blocks that can be assembled to meet various needs. Same-day startup time wouldn't be a bad thing either.

Thursday, September 11, 2008

Esquire They Doing That?

In keeping with my tradition of ignoring anniversaries, I will make no mention of the fact that today is the 7th anniversary of a major anti-U.S. terrorist attack involving 4 hijacked airliners. (Well, except for that.)

Instead, I'll talk about the 75th anniversary of Esquire magazine and, in particular, the much-ballyhooed cover. The cover, at least on 100,000 copies, features an electronic display with technology from E-Ink, a Cambridge, MA spin-off from the MIT Media Lab. E-Ink's flagship product, a flexible electronic display, lets you have paper whose contents can be replaced, so, for example, a sheet could display the front page of The New York Times, always with the latest news.

Unfortunately, the small sample included on the October 2008 Esquire cover (with another inside for an ad) is spectacularly unimpressive. Basically, it just flashes some text and some boxes on and off. The display doesn't really change at all. There are some photos and some text printed on a plastic sheet which overlays the E-Ink product, and by changing the background of the product from dark gray to white, these appear to flash on and off. But the photos don't change. Nothing moves. Even the text which is actually displayed on the e-paper is completely static. It just blinks on and off. The same effect could be achieved with an LCD pretty easily, I think.

The text proudly declares "The 21st Century Begins Now," but you'd never know it from this demonstration.

Wednesday, September 10, 2008

Google Chrome - Not Just Another Browser

Ok, I admit it. I was wrong in my earlier post about Google's Chrome: YAB - Yet Another Browser.

Chrome is not just another browser. Sure, it looks and feels like a browser today. Partly that's a cognitive issue. People understand what a browser is, so that's the best way to explain Chrome. It's also true that Chrome's functionality overlaps that of a browser by a very large amount.

But Chrome is not a browser!

It's a virtual operating system. In other words, it's a distributed platform for running application software. This point is made in Google's press on Chrome, and in the wonderful on-line comic they produced to accompany it. But it's glossed over. They spend more time talking about security and reliability, and of course, they keep referring to it as a browser.

But no one said software is what the developers say it is. Regardless of what anyone says, Chrome is a virtual OS. It's not the first virtual OS. I'm sure there were earlier ones, but for me, emacs comes to mind. Emacs began life as a text editor, but as it grew, it not only added more and more functions to its base code, it also provided a runtime environment for a flavor of the Lisp programming language, and this, in turn, meant more and more applications were written in emacs lisp and run within the environment of the editor. People have used emacs for reading and writing email, keeping journals and calendars, composing Web sites, manipulating spreadsheets, and many other things. Some folks still use it for editing text files.

But Chrome is the VOS for the 21st century! Or at least, for the end of the first decade of the 21st century, and probably at least the early part of the second decade as well. It's currently only available in a beta for Windows, but one can easily imagine where it will go in the next year or so. It will provide all the richness that Web applications currently use, and it will fit more seamlessly (less seamfully?) with other applications on your computer, phone, iPod, watch, TV, shoe, etc.

VOS populi.

Monday, September 8, 2008

Life = TV?

Sharp is now advertising the Aquos (how the heck do you pronounce that anyway? Akwos or Accu-os?) with the slogan:

CHANGE YOUR TV / CHANGE YOUR LIFE

Think about that.

Saturday, September 6, 2008

Journalism: Too Much or Too Little Detail

In the usually meticulous New York Times, in an opinion piece called The Two Weeks You Missed, William Falk writes:

Last summer, warming temperatures melted more of the Arctic ice cap than at any time since measurements have been taken.

There is no qualification as to when people started taking measurements of ice cap melting. As far as I know, it was last Tuesday. So this statement, presumably intended to alert me to impending doom at the North Pole, really means nothing.

I don't doubt that there is impending doom from global warming. I'd just like more careful journalism to disclose it.

I've also noticed the opposite situation. I frequently see and hear journalists making statements like:

Wall Street suffered its biggest one day drop since last month.

Since last month? Big whoop! Wake me when something actually happens.

Friday, September 5, 2008

Mental Appliances

A week or so ago, John Tierney posted How to Get Smarter on his TierneyLab blog. He asked how human beings could remain competitive in a world of super-intelligent machines, such as that envisioned by computer scientist and science-fiction writer Vernor Vinge in his novel, Rainbows End.

All of this, of course, goes to the well-worn questions of what intelligence is anyway, and whether machines can become intelligent. The best model I've seen is that what we normally think of as intelligence is a range of skills including pattern recognition, abstract thinking, imagination, etc. Were Shakespeare and Einstein both extremely intelligent? How about SPA? (Socrates, Plato and Aristotle, whose contributions can be difficult to tease apart, especially since Socrates' work is only known through the writings of Plato, and Plato was Aristotle's teacher.)

In this light, I think machines will get very good at certain tasks, but not others. The simple reason is that we don't need them to do these other tasks. Computers are already very good at storing and retrieving massive amounts of information, and communicating across long distances. That's good, because without computers, humans are pretty bad at those things. But aside from some undergraduate prank or demonstration of cleverness, why bother making a computer to listen to music? (Not analyze ... just to listen.) Or to enjoy nature? These are things that we do fairly well, and there's no particular benefit to having a computer do it for you.

So the interesting problem is how to use computers to do things we're not good at. This is obvious. Any industry is based on meeting some need ... i.e., some real or perceived lack. Grocery stores exist because most of us are not good at producing our own food. Duh.

In a recent Scientific American article, Why Our Brains Do Not Intuitively Grasp Probabilities, Michael Shermer reveals the disconnect between our perceptions about numbers and probabilities and the reality. Of course, one has only to listen to the presidential campaign speeches to see that disconnect in action. Humans are astoundingly bad at objectively evaluating numerical evidence, and are easily swayed by anecdotal arguments and broad generalizations.
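
The classic demonstration is the base-rate problem. Suppose a disease affects 1 person in 1,000, and the test for it is 99 percent accurate. Most people assume a positive result means you almost certainly have the disease. Run the numbers (the textbook illustration, not Shermer's):

    prevalence  = 0.001   # 1 in 1,000 people actually has the condition
    sensitivity = 0.99    # chance the test catches a real case
    false_pos   = 0.01    # chance it flags a healthy person anyway

    p_positive = sensitivity * prevalence + false_pos * (1 - prevalence)
    p_sick_given_positive = sensitivity * prevalence / p_positive

    print(p_sick_given_positive)   # about 0.09 ... a 9% chance, not 99%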

This is where we could use a mental appliance!

Thursday, September 4, 2008

YAB - Yet Another Browser

For years I've been trying to negotiate a settlement in the Mac vs. PC war. In my day job as a software developer, I was working largely on Windows platforms because ... well, you can sell more software 'cause there's more of 'em out there. Outside the office, though, in my own illustration and design and, more importantly, in hosting various discussion forums for illustrators, cartoonists, etc., the Mac was king!

And during much of this time, the holy grail of desktop software was a toolkit that would allow the same application to run on both Macs and Windows, and to look like it belonged there! It wasn't enough that the software could run lamely in some misbegotten compatibility mode. It had to look and feel like a native application on both platforms.

But after many years in the desert, we finally saw the promised land of Web applications! Now it didn't matter whether you were running on a Mac or Windows or even some flavor of Linux, because all your applications would run in your browser. The added benefit was that your data lived on a server, so you could get at it from anywhere. (In my experience, we cycle between putting all the smarts on a server and putting all the smarts in your desktop/laptop/pocket about every 5 years.)

But this promised land was an illusion. Even for those applications that could be Web-hosted, the stunning variety of browsers, each with version and platform incompatibilities, turned the Mac/Windows problem into a million smaller problems requiring special case coding.

And now Google, the most notable pioneer of this Web-as-computer approach, has introduced yet another browser, Chrome. The name is intended to suggest something bright and shiny, but let's get real here. The fact that Chrome is initially only available for Windows should set off warning bells. It also doesn't currently support Java, and seems to have problems with a fair number of Flash applications.

Back to the desert.

Thursday, August 28, 2008

Free Wheelin' Ain't Free

It doesn't take much awareness to realize that bicycle commuting, and bike use in general, is on the rise. There are articles in local papers, growing numbers of blogs, and increasing accommodation for bikes in street layout. Heck, just go outside and look around. The three main reasons seem to be, in no particular order, 1) the high price of gas, 2) health concerns, and 3) environmental concerns.

Now I know most people think "technology" means the ability to communicate wirelessly while watching high-definition video of the latest world event and listening to downloaded music. But believe it or not, there's still technology that's purely mechanical, and there's a lot of interesting innovation going on there.

Bicycles are among the notable beneficiaries of this mechanical innovation. There are two technologies in particular that are very promising for commuters and heavy-duty bicyclists: internal gear hubs and shaft drives.

The shaft drive replaces the greasy bicycle chain, with its cogs, cassettes and derailleurs, with a neat little tube housing that extends from the pedal crank to the rear hub. Inside, protected from the elements and road conditions, are the drive shaft and bevel gears. The principle is simple: you work the pedals, which turns the shaft, which turns a gear at the rear hub. That would be enough to make a very robust single-speed bike.

For multiple speeds, though, you can't use a derailleur that pushes the chain back and forth among smaller and larger gears. Instead, you can use one of the internal gear hubs like those from Shimano, SRAM and Rohloff. Again, all the mechanisms are fully enclosed in housings that retain the grease, and keep out road dirt and other crud.

Dedicated recreational bicyclists are used to doing a certain amount of maintenance to keep their bikes in prime performing condition, and fortunately, bikes are pretty simple, easy-to-understand machines. But for commuters, who just want to hop on and pedal to where they're going, the low-maintenance aspects of these enclosed mechanisms, the shaft drive and the gear hub, are a huge boon.

Tuesday, August 26, 2008

Beer Here

At a recent family get-together, my cousin showed off his latest iPhone applications. One of the more amusing was iBeer, an application that creates the appearance of filling your iPhone (or iPod Touch) with foaming beer, and then drinking or pouring it by tipping the phone. One look, and I was sold.

iBeer is classified as "entertainment" software, but as it turns out, it's only entertaining for the people you show it to. Seeing someone else demonstrate it may make you want to get it, but once you do, its only use is to demonstrate it to others. After all, it's not very satisfying to drink by yourself, especially when it's only pretend beer. And when you're "drinking" it, you can't even see the screen to watch it drain.

This is the ultimate product! Its only purpose is to sell itself. In effect, this is the essence of viral marketing. Anyone who buys it instantly becomes a sales person by showing it to others.

And yet, this seems much more innocent than the I Am Rich iPhone application that caused such a buzz. I guess the fun of showing it to others, and watching their amusement, is the return on investment.

Monday, August 25, 2008

The Future In Your Pocket

Ok, let me try a different take on the future.

In the future, no one will buy a computer. The computer, per se, will be a relic. Instead, almost everything you buy, with the possible exception of food and toilet paper, will contain a computer. Every durable good will contain some amount of intelligence. Pens will remember what they've written, and maybe improve your handwriting. Shoes will keep track of mileage and wear, and self-adjust to terrain. Coffee cups will know if your beverage is mixed correctly, too sweet or too creamy, or about to run out. Oh, and they'll maintain whatever temperature you prefer. GPS-guided cars will take you wherever you want to go automatically, avoiding collisions and maintaining optimal traffic flow, and will respond to voice commands when you need a rest stop. Your refrigerator will thaw the roast just in time to hand it off to the waiting microwave/convection oven, which has timed your whole meal to the second. Even your clothes will know when they've got ring around the collar.

One of the reasons general-purpose computers will go away, of course, is that using a keyboard, mouse and screen is a really stupid way to do most things. Sure, they have their uses, and some appliances (e.g., TV entertainment systems) will still resemble the computer, but for most activities, voice and/or gesture are much more expressive and flexible. These are the ways we've communicated with each other for thousands of years. As the computer evolves into a set of intelligent tools and companions, we would want them to be just as receptive and intuitive as another human being.

There are two products on the market today which are bold steps in this direction, and both have been resounding successes. Both have had long lines of anxious would-be purchasers, and both have spurred great volumes of discussion, demonstration and debate.

I'm referring, of course, to the Apple iPhone and the Nintendo Wii.

Welcome to the future.

Friday, August 22, 2008

The Future ... Out of Order

I was in Montreal this past week, and noticed an exciting store display. The full street level window of this business was taken up by a flat panel display. The video featured a pouting model strolling the runway and giving the camera a look that was both enticing and distancing.

As the bus I was on rounded the corner, we saw another window-sized display on that side. This one, however, just showed a perfect green field of gently rolling hills. Above it, a perfect blue sky with just the right number of fluffy white clouds looked on. It was, in short, the Windows Vista desktop background. Hovering in front of all this pastoral beauty was a rectangular box displaying a program failure message, complete with "Ok" button.

This reminded me of when I visited the Washington, D.C. area shortly after the opening of the Silver Spring, Md. Metro station. There were nine of the gleaming new fare card machines at the entrance. Of these, seven were out of order, and another failed while I stood on line. Eventually, the operators let everyone onto the Metro for free.

This is what the future holds. There will be technology to address every human need and desire, in the most sophisticated manner imaginable. But it will all be out of order.

Friday, August 15, 2008

In the Land of the Blind ...

Everyone seems to be talking about Nicholas Carr's article in The Atlantic, Is Google Making Us Stupid? The gist is that Google is really the scapegoat or synecdoche for the Internet as a whole, and that the style of reading it encourages, one of short blurbs and rapid context changes, is robbing us of our ability to read deeply on any subject. Carr cites some anecdotal evidence from his own and others' experience, and waltzes around his many caveats about the benefits of information appliances, but on the whole, bemoans his own loss of ability to concentrate on longer writing. (He neglects to consider the possible effects of aging.)

Many other bloggers have taken up the challenge either to defend or to attack Carr's position. There's been plenty of lively discussion around this, and many interesting points made on both sides.

But I take a totally different tack. I say "Hooray!" I'm finally in vogue. I've always had trouble reading long articles and books. My attention has always been a fleeting thing at best. Most non-fiction books seem to me to be attempts to commercialize a single idea or, at best, a very few ideas. Once you grasp the idea, the rest of the book is just filler to make it appear to be worth $16.95.

Moreover, I find reading someone else's writing is like having to step in their footprints in the snow. Language is a vehicle for thought, so following someone else's writing means riding in the back seat while the writer drives. If I want to try going off on a side road, I don't have that option. I can only experience the trip exactly as the tour guide wants it. That's ok for a short trip, but for a really lengthy exploration, I want to have some freedom to linger here and there, or take side trips on my own.

Now, of course, reading can spur the imagination, and inspire new and creative thinking. But for me at least, that's always a digression. I have to catch myself and forcibly return to the text once in a while, or I'll never get through it. The best books take me the longest to read, because they trigger so many interesting diversions.

Well, that's about all I can bother to write on this, so I'll just say "Vive l'Internet!"

Wednesday, August 13, 2008

Personal Technology

I want to talk about how the technology for experiencing media is becoming more personal, and less communal, but I'm not sure where to put this. I have another blog, Art/Tech Fusion, that's focused on the intersection of art and technology, and how they affect each other. That seems like a good place to talk about the impact of personal technology on the experience of various art forms (music, movies, etc.)

However, this blog is a great place to just gripe about stuff, so this is a good place to complain about drivers too busy talking on their cell phones to look where they're going, and pedestrians reading email while walking clumsily down the sidewalk.

So I guess I'll talk about it in both places.

So at the risk of looking like a total fan, I'll mention another David Brooks column from yesterday's New York Times. Brooks contrasts individualist cultures, such as our European heritage, with collectivist ones, such as China, Japan and other Asian cultures. Our technology seems to be furthering individualism by making everything personal. We watch TV, listen to iPods, and work on personal computers and laptops. Everything's so intimate.

The collectivist cultures, on the other hand, put more emphasis on obligation and cooperation. One would expect technology in these cultures to promote public experiences like movies, concerts and other performances with large audiences.

And yet, where does all this personal technology come from? Remember the Sony Walkman?

Monday, August 11, 2008

Brooks No Interference

Last Thursday, David Brooks' op-ed piece, Lord of the Memes, in The New York Times echoed some views I've expressed here before. Specifically, Brooks says:

Now the global thought-leader is defined less by what culture he enjoys than by the smartphone, social bookmarking site, social network and e-mail provider he uses to store and transmit it.

[...]

Today, Kindle can change the world, but nobody expects much from a mere novel. The brain overshadows the mind. Design overshadows art.

In my post from Feb. 5, 2007, The Geeks vs the Heads, I lamented the triumph of Apple, the computer company, over Apple, the Beatles' music company, in their long-standing trademark litigation. I said:
This is the triumph of medium over message. The company that controls the technology dominates over the company that creates the actual content.
I don't always agree with Brooks, but I almost always find his columns interesting and thought-provoking.

Thursday, August 7, 2008

Media Critical Judgement -- YAFS

As I've railed previously on this blog, most recently here, advertising is the commodity on the Web. Or, more precisely, attention is the commodity. That's what all content providers and advertisers are after ... a bit of your attention.

That's fine. If that's what supports the creation of excellent content like this blog (I wish!), so be it. But as readers and viewers, we have a certain obligation to develop critical faculties and judgment. We need to acquire the ability to discriminate between the well-argued and well-documented, and the merely loud.

This is no different from TV commercials. They're all louder than the programs they infest, so you can hear them even when you go to the fridge for a snack. And they all try to be cute and memorable, as opposed to persuasive. Think about the commercials that stick in your mind. Do you think Geico Insurance is better than any other because they have cavemen or a cute animated gecko? Does that horde of extras stalking the Verizon customer really prove their service is better?

Given that more research is done on the effectiveness of ads than probably any other aspect of human behavior, these techniques must work, despite their irrationality. That's why we all need to learn to evaluate advertising claims critically. And because of product placements, viral marketing, and other means of infiltrating seemingly innocuous content, we need to apply this critical thinking to all content. Otherwise, we're just supporting and encouraging more intrusive and misleading promotion, as Seth Godin describes on his blog.

My father used to sit in front of the TV and say "YAFS!" to the commercials. (This was in the days before mute buttons.) YAFS is an acronym for "You Are Full of ... well, something." No exaggeration. I grew up watching TV with this constant reminder to be skeptical of commercials. You can imagine the cynicism this engendered.

Now you know why I'm such a curmudgeon.

Wednesday, August 6, 2008

The Perfect Servant

In my ranting about blogs last week, I realized that one of the reasons they overwhelm you is that we're just not there yet with filtering. The ideal syndication tool would be one that automatically harvests everything you're interested in, and nothing you're not, and shows you only the interesting stuff. You don't have to worry about which posts or blogs to read, or which Web sites to check out. You just get what you want to see automatically.

Of course, this means that a complete and ever-changing profile of you has to exist somewhere. I'm sure Google would be only too happy to collect that information.

In a sense, this is the idea behind groups such as those run by Yahoo! and Google. You subscribe to join the groups you're interested in, and get to see all the content, without having to scan through a lot of other stuff. Of course, the problem with that is that the only real filter is what people posting to the group think is appropriate. We all know where that gets us.

Think about the movies in which some fantastically rich person has servants who supply only the relevant mail, or only the interesting parts of the newspaper. That's what computers should be ... perfect servants. They should know exactly what we do and don't need or want to see, and present all of that, and nothing else. They should be omnipresent, and available all the time.

Wake me when we get there.

Monday, August 4, 2008

Look With Your Hands

When I was a kid shopping with my parents, they always said "Look with your eyes, not your hands."

Part of the beauty of the iPhone and iPod Touch interface is that you can look with your hands. Seriously, the appeal of this interface lies in neither the 3.5" screen alone nor the Multi-Touch interface alone. It's both!

The screen is larger than most PDA and cell phone screens, but it's still small compared with how most of us are used to browsing the Web and reading email. But the fact that it's so easy to scroll around with the flick of a finger means that your data can be displayed in big, clear graphics, even if it doesn't all fit on the screen.

Maybe I'm slow, but it just dawned on me today that the "Day" view in the calendar only shows a small portion of the day, but it's very readable. Because I can scroll up and down so easily, it doesn't bother me that the view is limited. It's even easier than scrolling on a desktop machine.

So the direct manipulation via Multi-Touch is an attractive feature itself, but it also enables greater magnification of displayed information. That combination is especially appealing.

Thursday, July 31, 2008

Blogging

It seems pretty obvious that someone who calls himself The Tech Curmudgeon should be blogging about ... well, blogging, among other things.

The fact is, I still consider myself a n00b in the medium, though I'm trying to catch on as quickly as I can. My first reaction, as I'm sure is true for a lot of people, was "Why would I want to read about a lot of other people's favorite TV shows, or restaurants, or laundry detergent or whatever?"

After some serious introspection, I realized that I don't. But there's a lot of other good stuff out there that is interesting to read. In fact, I'm interested in stuff I didn't even know I was interested in ... didn't even know existed!

The other problem is that it's an infinite time pit. If you start reading blogs, you'll inevitably want to follow links to see what the posters are talking about, and that will get you started on other blogs, and so on. When you read a book or a newspaper or magazine article, you eventually finish it (unless you're like me, and they just pile up, half-read, on the coffee table.) On the Internet, nothing ever ends. I don't have a workaround for that problem, I'm afraid. I suppose having a day job helps, but it's still too easy to get caught in a never-ending train of link chasing.

And there's the quality thing. I heard about a few sites that offer blog recommendations, but they offer zillions of them! Technorati.com lists thousands of blogs, covering every topic in the spectrum of human thought. Squidoo.com is supposed to help you find all the blogs relevant to your interests, but what if you're interested in everything? Everyone's so paranoid about all the personal information that Google is hoarding, but I say "Good for them. Figure out what I'm interested in (which has eluded me), and show me that!"

Finally, there's the well-known conundrum of authority on the Internet. How do you know any of this is true? Someone publishes some inaccurate information, and it gets linked to and copied all over the known universe within hours. Now, there appear to be multiple sources corroborating the bogus news, giving it the appearance of truth. (Nothing to worry about here. Readers of this blog get pure, unvarnished opinion, not biased by facts.)

So, I guess the logical conclusion is that blogs suck. I'm glad not everyone feels that way.

Wednesday, July 30, 2008

Oil Drilling

Lifting the ban on offshore oil drilling is like searching for spare change under the couch cushions. If you're lucky, you may get some temporary relief, but it doesn't solve the basic problem.

Actually, I guess it's more like buying a bunch of used couches in order to look under the cushions.

Tuesday, July 29, 2008

Mad Men And Even Madder Men

I grew up in the era depicted in the popular TV show Mad Men. Back then, advertising was the number one game in town. It was almost a cliche ... the corporate, suit-and-tied junior exec was in advertising. The number of books and movies portraying the advertising business is staggering. Movies like Lover Come Back, Good Neighbor Sam, and Mr. Blandings Builds His Dream House, along with TV shows like Bewitched, all revolve around the ad business. So it's no surprise that shows like Mad Men fill us with nostalgia for those cut-throat, back-stabbing days when advertising was the prestige business in America.

Well, guess what, it still is. In the high-tech world, hardware companies like DEC and IBM were eclipsed by software giants like Microsoft and Adobe. Now those, too, are overshadowed by companies like Google, a company whose business is ... selling ads.

Monday, July 28, 2008

Packaging

As someone who's earned a substantial portion of his living in the print industry, I've been concerned about the long-touted, but scarcely evident, demise of print. As I've been hearing for decades, paper is going to disappear. Everything's going to be digital.

One has only to glance in my office, or that of most of my co-workers, to see that paper is not going to disappear anytime soon. But even if it were, as I've told myself, there's always packaging! They still need to print packaging!

But packaging, or at least printed packaging, is also a byproduct of brick-and-mortar shopping. There's no need for slick boxes in the on-line shopping world. Snazzy graphics, and even animation, can be used on a Web site to hawk products that could be shipped in plain cardboard cartons.

But it's not happening anytime soon. Packaging design is still as important as, if not more important than, product design. Take the iPod Touch. It comes in a cute little box with John Lennon's picture on it. (This is, in a way, Apple's gloat at having achieved the ultimate triumph of form over content by winning the Apple trademark away from the Beatles' company, Apple Corps.)

And then there's the welded plastic clamshell packaging that covers so many consumer items these days. These packages, designed to lacerate your hands when you try to open them, minimize the printed surface area to a small card inserted between the halves of the shell. Luckily, there's hope for their extinction.

When CDs first became popular in the early 1980's, there was actually a campaign of popular opinion to get rid of the long box, a 6" x 12", shrink-wrapped cardboard box that housed the plastic jewel box containing the CD. Theoretically, the purpose of the long box was to allow retailers to display CDs in bins designed for vinyl records. But in a more innocent and idealistic age, people actually had the audacity to resist this blatant waste. Now, we buy DVDs in cases with almost as much wasted space ... and they're plastic!*

But this, too, shall pass. Already, people are saying "CDs? DVDs? Don't you just download everything?" Even books, perhaps the most perfect example of packaging and content combined, are being replaced by downloadable digits.

When I can download all the food and clothing and furniture and recreational equipment I've bought recently, I'll be happy to give up packaging.

* In a classic example of ineptitude, the plastic shell cases for DVDs have notches which should enable you to get your fingers on the edges of the DVD. However, most manufacturers of these cases evidently failed to understand the purpose of these notches, and so blocked them with a little ridge that makes grabbing the disc by its edges impossible.

Friday, July 25, 2008

Feature Priorities

One of the most interesting interview questions I've heard is: What should a snack vending machine be like in 5 or 10 years?

There are loads of possibilities. If something's out of stock, the machine should direct you to the nearest machine that has that item. It should recognize your RFID and automatically highlight choices fitting your preferences (or dietary restrictions!) It should offer different items based on time of day or season of the year. It should be accessible from the desktop, so you order what you feel like, and the now robotic machine delivers it to you.

I've spent some time mulling this question. I considered outlandish possibilities, like a machine that prepares your snacks to order, or one that automatically supplies nutritionally correct items at the appropriate time.

But I've decided there's one overriding feature I'd like to see in the vending machine of 2013.

It should work! It should give you the damn item you paid for.

Thursday, July 24, 2008

Where Am I?

I ordered a GPS system on-line, but it got lost in the mail.

It eventually did show up, though, and immediately won my heart. To begin with, I hate driving. I hate traffic. I hate searching for parking spaces. I hate idiots who pull out in front of me, and even bigger idiots who stop to let the lesser idiots pull out. But most of all, I hate not knowing where I'm going. So, after a particularly frustrating attempt to find my way back to my hotel after a baseball game, I decided to get a GPS.

The infatuation was immediate. That calm, enticing voice is so soothing, it's better than transcendental meditation. The almost unflappable way it warns me of upcoming turns, and then repeats the instructions just when I need them, is so seductive I've started purposely making wrong turns just to prolong the conversation. I'll change routes and shoot down one way streets just for another word from my guide. I keep trying more and more outlandish routes to try to evoke some surprise ... some emotion ... anything!

But I think it's on to me. When I've ignored some instruction, I now detect a definite peevishness about the way it says "Recalculating..." If I've been particularly heedless, I get long periods of silence as I sail past obvious shortcuts. And if I commit some really egregious fault, like making a wrong turn at the very end of my trip, I get veiled threats in the form of hospital locations.

But disconnecting is not an option. I can no longer live without it. I guess I'll just have to give in and do what I'm told.

I've gotta go now. It's time for a drive.

Monday, July 21, 2008

hello, world

Brian W. Kernighan and Dennis M. Ritchie introduced their C programming language to the world in a 1978 book obliquely titled The C Programming Language. They opened with a 3-line program that simply prints "hello, world" on the screen. (Remember, those were the days of terminals and command-line programs, and the screen was what was endearingly called stdout.)
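
For the record (quoted from memory, so the whitespace may differ from the book's), the whole program was this. Pre-ANSI C, so no #include and an implicit int return:

    main() {
        printf("hello, world\n");
    }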

In an earlier post, I waxed nostalgic for the software development world of 30 years ago, so I thought it would be entertaining (for me, at least) to consider what it would have been like to be the maintainer of this program over the past three decades.

1980's:

First, the greeting itself would have to be translatable into different languages, so the program would probably be modified to read the greeting string from a file on disk. That way, translators could localize the string for different places without having to modify the code.

Since the location of the string file would probably vary from one installation to another, we'd need some way to identify the file with its site-specific path. This could be done with an environment variable or a resource file on a *n*x system, or a .ini file on Windows. (The registry was still years away.) On the Mac, perhaps it would be stored in the resource fork. A more general approach would be to put the string in a database, so its value could be retrieved via a query. This could be implemented in the then-new SEQUEL (later SQL) query language.
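
A bare-bones sketch of the environment variable approach might have looked something like this. (HELLO_GREETING_FILE is my own invention, not any real convention.)

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        /* HELLO_GREETING_FILE is hypothetical; set it to the local path */
        const char *path = getenv("HELLO_GREETING_FILE");
        char greeting[256];
        FILE *fp;

        if (path == NULL || (fp = fopen(path, "r")) == NULL) {
            printf("hello, world\n");   /* fall back to the built-in string */
            return 0;
        }
        if (fgets(greeting, sizeof greeting, fp) != NULL)
            fputs(greeting, stdout);
        fclose(fp);
        return 0;
    }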

Now, of course, a string with 8-bit characters can only represent some Western European languages, so we'd have to enable the program to support 16-bit characters, and make the corresponding changes to the resource files and/or database. Since it's inefficient to use 16-bit characters all the time, the program would have to test the language to determine whether to use 8-bit or 16-bit representations. We also need fonts which can display the necessary characters in all the languages we'll be using.
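In later-standard C terms (wide characters didn't officially arrive until well after this imagined timeline, so consider this a sketch with the benefit of hindsight), the upgrade might look like:

    #include <locale.h>
    #include <stdio.h>
    #include <wchar.h>

    int main(void)
    {
        setlocale(LC_ALL, "");     /* honor the user's locale settings */
        /* a wide string can hold characters the 8-bit char type can't */
        wprintf(L"%ls\n", L"hello, world");
        return 0;
    }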

Also, since we now have workstations and personal computers with graphical user interfaces, we'd want to display the greeting in a window, instead of just on a terminal screen. In fact, Charles Petzold did include a sample "Hello, Windows" program in his book, Programming Windows. It was about 60 lines of code.
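
Petzold's 60 lines buy you a real window class, a message loop, and a WM_PAINT handler. The absolute minimum windowed greeting, cheating with a message box instead (a sketch, not his program), is more like this:

    #include <windows.h>

    int WINAPI WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance,
                       LPSTR lpCmdLine, int nCmdShow)
    {
        /* A message box instead of a real window -- no window class,
           no message loop, no painting code */
        MessageBox(NULL, "hello, world", "hello", MB_OK);
        return 0;
    }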

1990's:

So by now, it's time to convert the code to C++. This will, of course, make it easier to maintain, and it's just generally cooler. For the C++ implementation, we'll want to have classes for the display window and for the string, to hide all the ugly implementation details of multiple string formats, and of differing window systems.

We should probably also make it client/server based, so we can ... uh, ... well, just 'cause that's the way to go. We'll have a "hello, world" server, hwserv, which will keep track of who needs to be greeted, and will display the appropriate greeting. The server can keep the database of greeting strings, and query each client for the appropriate language to use.
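
There is no real hwserv, of course, but a first cut at the greeting server, sketched with BSD sockets (error handling omitted, port number pulled out of a hat), might start like this:

    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <unistd.h>

    int main(void)
    {
        int srv = socket(AF_INET, SOCK_STREAM, 0);
        struct sockaddr_in addr = {0};

        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port = htons(9000);        /* arbitrary port for the sketch */

        bind(srv, (struct sockaddr *)&addr, sizeof addr);
        listen(srv, 5);

        for (;;) {
            int client = accept(srv, NULL, NULL);
            if (client < 0)
                continue;
            /* A real hwserv would ask the client for its language and
               pull the right greeting from the database; this one can't. */
            write(client, "hello, world\n", 13);
            close(client);
        }
    }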

But now, of course, there's that World Wide Web thing to consider. Some clients will just be using various Web browsers, and we'll have to use Perl CGI scripts to create dynamic HTML pages containing the suitable greeting.
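
Those Perl scripts are mercifully lost, but the CGI trick itself is language-agnostic: the Web server runs your program and ships its standard output back to the browser. Here's a sketch in C, to keep one language going. (HTTP_ACCEPT_LANGUAGE is a standard CGI environment variable; the French fallback is purely for show.)

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        /* CGI passes the request headers in the environment */
        const char *lang = getenv("HTTP_ACCEPT_LANGUAGE");
        const char *greeting = "hello, world";

        /* Crude language pick; a real version parses the whole header */
        if (lang && strncmp(lang, "fr", 2) == 0)
            greeting = "bonjour, monde";

        /* the Web server relays stdout to the browser */
        printf("Content-Type: text/html\r\n\r\n");
        printf("<html><body><h1>%s</h1></body></html>\n", greeting);
        return 0;
    }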

Of course, we also want to use relevant meta-tags, so that search engines can find our page. We'll probably also consider commercializing the page by including ads. (Surprisingly, displaying "hello, world" doesn't bring in a lot of money!) And we'll need counters and statistics to know how many people are actually seeing our greeting.

Now that we think about it, the Perl CGI scripts are pretty slow. We'd be better off using downloaded Java applets, or possibly JavaScript, to determine the local language and select the greeting string. Unfortunately, just downloading Java applets is slow, and JavaScript doesn't work the same way from one Web browser to another.

We also need to implement security so only those who are entitled to be greeted can receive our greeting page, and so that the site itself is not subject to hacker and denial-of-service attacks.

2000's:

Downloading all the possible greeting strings in all languages is really unnecessary. It would be more efficient to determine the language and then fetch a short XML representation of the appropriate string, using AJAX.

Since our greeting must be state-of-the-art, we're going to create an animated sequence to display the "hello, world" graphic via Flash. We can use ActionScript to animate the text, which will be generated on-the-fly in the appropriate language.

But wait! This needs to run on a cell phone! Of course, it will have to display correctly, regardless of the size and orientation of the screen.

And it needs to be position-sensitive, via the built-in GPS chip, so that you automatically get the appropriate greeting string for the current location of the phone.

And there must be an audio version, so it's accessible to the vision impaired ...

And it should be touch-aware ...

And it should work over WiFi or 3G networking ...

And ...

And ...

And ...

hello, world

Monday, July 14, 2008

Deja-Vu

I can't remember if I've already posted on this or not. If so, I'm sure this take will benefit from the increased wisdom and experience I bring to bear. Or not.

This month marks a personal anniversary. As of this month, it's been 30 years since I started working full-time in some form of software development or other. (I know ... I've previously said that anniversaries, and especially round-numbered ones, are artificial and meaningless, but let's go with it here.) So, reflecting on three decades of software development, I see much that has changed, and much that has not.

When I started, we worried a lot about the memory and processor speed constraints of minicomputers. As soon as those resources became virtually unlimited, we started developing software for workstations, and again had to worry about memory and processor speed. By the time workstations became virtually unlimited, we were developing software for PCs. Guess what. And now it's cell phones and hand-held devices. What's next? Implants?

After a year at my first full-time job, at a big Wall St. brokerage company, I got a job at DEC, then the coolest computer company around. DEC was like being in college, but making money. I joined DEC in the New York office. There were about 20 of us software specialists, all using a single VAX 11/780 as our main computer. We connected via VT-100 terminals, and did everything with command line programs. Now I have many times that amount of computing power and memory in each of my pockets, not to mention my Mac, Windows and Linux desktop machines, my camera, etc., and it's not enough!!

When I started, we wrote programs to solve problems. I would estimate we spent about 80% of our efforts on finding the best solution to the problem, and getting it to work reliably and efficiently. The other 20% was spent on integration ... getting the program to be compatible with other software, to provide compliant interfaces, and generally to play nicely with other software.

Now, it's the opposite. We spend about 20% of our time actually solving the problem at hand, and the other 80% on making sure everything is translatable into every human language, compliant with the latest Microsoft interfaces, Web compatible, accessible, interoperable, scalable, scriptable, and just about every other kind of -ible or -able.

Gordon Moore observed in 1965 that the number of transistors we can fit on a chip doubles at a steady clip (every year, in his original paper; he later revised that to every two years). By extension, this Moore's law is widely taken to mean that the capability of any given technology doubles about every two years, or at some similarly astonishing rate. I don't think anyone has yet quantified how quickly our expectations about technology increase, but it's got to be at least 4 or 8 times the Moore's law rate. (Ok, we geeks are stuck on powers of 2.) It's a major news story when it takes people a few hours to download and install over 200 megabytes of new iPhone software. We complain if an email to the opposite side of the earth takes almost half a minute to send. We gripe about spending a few dollars for a first person shooter game for our cell phones.
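
The compounding is the whole point: doubling every two years for 30 years is fifteen doublings, or about a 32,768-fold increase. In C, naturally:

    #include <stdio.h>

    int main(void)
    {
        /* Moore's law: one doubling every two years */
        int years = 30;
        long factor = 1L << (years / 2);   /* 15 doublings = 2^15 */
        printf("about %ldx growth in %d years\n", factor, years);
        return 0;
    }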

One thing I've learned in 30 years: Computers will never do what we want!

Tuesday, July 8, 2008

Mute Buttons

On another blog, Ideas Great and Dumb, I celebrate with humorous rhymes and short essays some of the most important ideas in history. However, I'll take an opportunity here to discuss one of the lesser, but still important ideas ... the mute button.

TV has basically become a resource. It flows around us, through fiber and cables, microwave signals and broadcast airwaves, like an enormous river. And, like a resource, this vast flow of stuff has to be harnessed and controlled to be put to the service of humankind.

One of the most useful of these controls is the mute button. This is what makes commercial TV bearable. Not only does it allow you to suppress the multi-decibel volume jump when commercials come on, but you can play fun games by filling in your own dialog for the commercials. The stupid little animated gecko can now talk about how his British accent makes him a hit with the ladies. And that herd of Verizon stalkers with their Geeky-looking leader can now be recognized for what they are.

But the power of the mute button depends on the ability to un-mute the sound when the show comes back on. Normally, this is pretty simple. Just watch for the reappearance of your favorite characters. But lately, the networks have resorted to a dastardly trick ... they advertise the very show you're watching DURING THE SHOW. Imagine! You're sitting watching Family Guy, and suddenly, there's a commercial for ... Family Guy! What the deuce? How are we supposed to deal with that? All over the country, millions of remotes are suffering prematurely worn-out mute buttons from these false alarms. Millions of Americans are startled out of their stupor, causing elevated heart rates and other stress-induced medical conditions.

To heck with wardrobe malfunctions. The FCC should be all over this.

Monday, June 30, 2008

GREED-E

Ok, I admit it. WALL-E is a great movie. It operates as a simple love story, as a hero vs. villain melodrama, and as post-apocalyptic science fiction, and it succeeds at all three.

However, if only to maintain my curmudgeonly reputation, I have to find a few things to pick on. For now, I'll limit myself to two.

Behind the closing credits, there's a wonderful sequence of graphics essentially mimicking the history of art in the course of a few minutes. There are prehistoric-looking drawings, graphics that resemble the work of ancient scribes and Medieval illuminations. There are also references to specific artists such as J.M.W. Turner, Georges Seurat and Vincent van Gogh. I'll have to see it again to put my finger on it, but there's something about these stylistic allusions that suggests the Pixar artists are not simply paying homage to these great artists. They are smugly boasting, as if to say "Ha! With our digital tools, we can do anything any other artist has ever done."

The more egregious fault, of course, is that although the entire movie is a heavy-handed screed against consumer culture, it's preceded by an ad for the WALL-E video game, due out next month. The discreet BnL ad hidden near the end is tongue-in-cheek, but the WALL-E video game ad is certainly not. Moreover, a quick Web search reveals that the Disney/Pixar folks are zealously pursuing every possible licensing opportunity for WALL-E toys, games, bedclothes, etc., just as with every other Disney property. It's as if the message is: "Humankind is doomed if we don't change our acquisitive ways, but meanwhile, buy some more junk from us!"

Thursday, June 26, 2008

ConFusion

Ever consider the rash of new job titles being created for positions involved with some aspect of developing and maintaining Web sites? It seems that every task or activity in traditional publishing has a corresponding position in Web site creation, but with a completely misleading title. In addition, some jobs correspond more with traditional broadcasting than publishing, and they now have the prepended "Web" designation.

Web Master, Web Designer, Web Developer, Web Producer, and dozens more with "Web" in the title. Salary.com lists 18 different job titles starting with "Web...", and that's not even counting "Web Press Operator," a traditional printing job. Then there are the Web jobs that don't even have "Web" in the title: Information Architect, Experience Designer, Content Coordinator.

And the salary ranges are all over the place. Though you'd never guess it from the titles, some of these jobs correspond to traditional graphic design jobs, some to writing and editing jobs, some to software engineering/computer scientist jobs, and others to no traditional job at all.

Can job titles be copyrighted or trademarked? Maybe there's a revenue opportunity here.