Tuesday, September 30, 2008

Self-Fulfilling Prophecies

The stock market is the ultimate game of chicken. You try to hold onto stocks as long as they're going up, and jump out just before they crash.

Because of this, stock market forecasts tend to be self-fulfilling prophecies. If enough people believe the market is going up, they start buying more stock and guess what! Likewise, if people fear a downward trend, they jump ship, which, of course, pushes things down. That's why the classic advice "buy low, sell high" is something of a joke. You try to buy at rock bottom (though of course no one knows when this is until things start going up), and sell at the peak (again, a best guess). Most people, however, buy high ("Gee, everything looks great!") and sell low ("Ouch. Things really tanked.")

So basically, if pretty much all political leaders say the market will crash if X doesn't happen, and X doesn't happen ... guess what! Investors assume the threatened crisis will occur, and try to get out while the getting's good.

So is the real problem the fact that X (the bailout bill) didn't happen, or the fact that everyone said X had to happen to avert disaster?

The other factor is the yo-yo effect. After a crash, some people who actually follow the "buy low, sell high" advice go bargain shopping. This pushes things up until the next round of fear sets in, so the market bounces up and down almost daily. You have to look longer term to see the trends.

What does all this have to do with being a tech curmudgeon? Nothing. I just felt like ranting.

Monday, September 29, 2008

Judgment vs. Consensus

I'm going to have to write a lot more about this at some point, but briefly, I think the Web is moving us from a Judgment-oriented culture to a Consensus-oriented one. By that I mean that we are substituting the opinions of many for the selectivity of a few authorities.

This is obvious in the broadest sense of the term publishing. It used to be that you had to have editors and publishers find enough value in your work to deem it worthy of publication. Now anyone with a Web site can publish, and it can look as authoritative as the big money sites.

I've also seen Web contests in which competing writers (or illustrators or whatever) try to get all their friends, relatives, acquaintances, co-workers, etc. to vote on their work, so they can beat out all the other contestants whose friends, relatives, etc. are less cooperative. There have always been contests in various creative pursuits, but they seem to have proliferated in this "click to vote" world.

Some people choose their iTunes music and other downloads based on the ratings these have received, rather than based on music reviews or other supposedly authoritative opinions.

To some extent, the media have always been consensus-driven. Producers look at sales figures, Nielsen ratings, etc. to steer their resources. But it's become much more direct. Not only are the media providers looking at these figures, but buyers are relying on public opinion much more directly and more heavily than in the pre-Web days. This is how viral marketing and memes work.

Friday, September 26, 2008

In English, por favor

So I drive up to my bank's outdoor ATM (or walk to the indoor one). I swipe my ATM card, and enter my PIN. Then the machine asks if I want to continue in English or Spanish.


I've been banking here since this bank took over my previous one, where I banked for years before that. I've used my card and my PIN, so this machine now knows who I am. It knows who I am enough to let me get at my money. That implies a pretty high level of recognition.

But it doesn't know what language I speak? Or perhaps it thinks I've suddenly decided to learn Spanish, and I want to practice by moving money around.

Never mind that I always, ALWAYS, ALWAYS ask for a receipt; it still asks me if I want one.

Thursday, September 25, 2008


According to a new study, reported in Science magazine, there's a correlation between a person's reaction to sudden loud noises and startling images and that person's political inclinations. Specifically, people who have strong reactions to shocking images or sounds tend to be more conservative, while people who are less sensitive to these stimuli tend to be more liberal.

So, I guess the proper response to this news would be to start referring to conservatives as flinchers.

Tuesday, September 23, 2008

And the wolf shall dwell with the lamb ...

On September 22, 2008, Paul Krugman and William Kristol wrote op-ed pieces in the New York Times that essentially agreed with each other.

The world must be coming to an end.

Saturday, September 20, 2008

Hold On To Your Large Hadrons

So it appears that after a whole week and a half in quasi-operation, the Large Hadron Collider (LHC to its friends) has sustained severe damage, and will be out of commission for months. Well, I guess after so many billions of years, the secret of the universe can wait a few months to reveal itself.

This doesn't paint a good picture of the scientists who designed this equipment, however. It must be especially unnerving to those folks who are still convinced that an earth-swallowing black hole will result from its activation.

Sounds like they need a good ad campaign. I hear Jerry Seinfeld's available.

Friday, September 19, 2008

Clone Wars

With all that's going on this week (dueling presidential candidates, plain and vice; investment bank meltdowns; and the climax of the regular baseball season), the one thing that seems to have lit up the blogosphere is ads. In particular, it's the ad war between Apple and Microsoft.

For about two years, Apple has been running a series of ads on TV and in print, featuring an ever-so-cool looking personification of a Mac inevitably triumphing over a somewhat stuffy and officious PC character, more or less a caricature of Bill Gates.

After a brief foray into ads about nothing, featuring former Microsoft head Bill Gates and former TV star Jerry Seinfeld, intended to salvage Vista's tarnished image, Microsoft changed horses and decided to go with a new campaign spoofing Apple's, featuring a look-alike of Apple's Bill Gates look-alike.

Staying one step ahead, though, Apple launched a new ad spoofing Microsoft's spoof of Apple's ads. This features the original Gates simulacrum pleading with people to stop switching to the Mac and, of course, in the process, demonstrating his own ineptitude.

There's one more good thing about open source software like Linux ... no ads!

Stir With A Fork

Now that the weather's getting cool again in New England, I can return to one of my instant gratifications. I dump a packet of instant hot chocolate mix into a cup, and then brew a cup of coffee into it. Ok, I know it's nowhere near as good as a real mocha can be, but it's cheap and quick, and it's not half bad.

Now hot chocolate mix, being naturally gregarious, tends to clump up when the coffee is added, and stirring just makes the clumps swirl around. What's really needed is a good amount of turbulence to break up the clumps so they'll dissolve more quickly (e.g., before I finish drinking the hot chocolate). At first I tried switching directions. I'd stir counter-clockwise for a while, until the drink really got going, and then switch. I figured the sudden change in direction would confuse the little hot chocolate clumps, causing them to scatter in a panic.

Nope. They happily switched direction with me.

Now using one of these little plastic stir sticks, or stirrers as they're euphoniously called, is completely useless. They have no drag. They simply slice through the liquid, producing no turbulence or wake. I tried switching to a spoon. Once I got the hang of not splattering the coffee/hot chocolate mix all over my shirt, this became pretty straightforward. However, it still didn't break up the hot chocolate clumps.

Finally it hit me. To really work up a good, clump-breaking turbulence, use a fork! By forcing the liquid through the spaces between the tines, you really shake things up, and you also put pressure on the clumps that directly impact the fork tines.

So if you suffer from clumpy hot chocolate mix, stir with a fork!

Can you tell I had too much free time today?

Thursday, September 18, 2008

The Big Bucks

TechCrunch has culled the 29 technology moguls from the Forbes list of the 400 wealthiest Americans. These 29 have a combined net worth of $239 billion, about enough to give $35 to every person in the world! Bill Gates alone could give everybody about eight bucks! He could say "Hey, earth! Let's go to Starbucks. It's on me." And he'd still have enough left to go shoe shopping with Jerry Seinfeld.
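The arithmetic checks out, by the way. Here's a back-of-the-envelope sketch, assuming a 2008 world population of roughly 6.7 billion and a Gates net worth of about $57 billion (both my approximations, not figures from the TechCrunch piece):

```python
world_pop = 6.7e9      # rough 2008 world population (assumption)
techcrunch_29 = 239e9  # combined net worth of the TechCrunch 29
gates = 57e9           # approximate 2008 Forbes net worth for Gates (assumption)

per_person_all = techcrunch_29 / world_pop   # what the 29 could hand out
per_person_gates = gates / world_pop         # what Gates alone could hand out

print(round(per_person_all, 2))    # about $35 each
print(round(per_person_gates, 2))  # about eight bucks each
```

Enough for a tall latte, anyway.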

Of the top 7 spots, three are Microsoft dudes, and two are Googlers. These 7 are each worth $15 billion or more. Then we drop to Jeff Bezos of Amazon.com, with a paltry $8.7 billion. He'd better start selling more Kindles.

Steve Jobs has only about one tenth the wealth of Bill Gates. On the other hand, he's got Mickey Mouse.

So what's the point of all this? No idea. I just thought it was amusing.

But in closing, let me say that I invite anyone from the TechCrunch 29, or indeed the whole Forbes 400, to give me a billion dollars. Bill Gates wouldn't even miss that much. Or you could all chip in. If the whole Forbes 400 ponies up, it's only a measly 2.5 million each. That's chump change to you!

In return, I promise to dedicate my writing to the benefit of all humankind!

Think about it. I take Paypal.

Wednesday, September 17, 2008

Wall Street in Black Hole

Nobody else seems to have figured this out, but the current kerfuffle on Wall Street was caused by black holes produced by the Large Hadron Collider at CERN. Now I know most of the free world has been clamoring for a way to have its hadrons collided, especially the large ones. "Hey, these things don't collide themselves, you know!" is the all too frequent cry of the hadron owner.

But despite the overwhelming demand, building the collider was a rash and irresponsible move. As many feared, the force of all those hadrons banging into each other brought about the creation of black holes, like the one first observed in Calcutta in 1756. As everyone knows, black holes are places in space where a star collapsed, forming such an intense gravitational field that not even light can get out. Of course, anything that's near a black hole tends to get sucked in, as we saw in the Disney movie, The Black Hole. (Like most people, I always turn to Disney movies when I need to understand some esoteric concept of modern cosmology.)

So, the black hole can swallow anything ... planets, stars, etc. What else in the universe could swallow up four trillion dollars? And hasn't anyone else noticed that the collider is in Switzerland, where they have the world's most secure bank accounts? Moreover, the collider was entirely designed by scientists, the very people the current administration has worked so hard to squelch!

This is just too much coincidence to go unnoticed. I think we should demand an immediate investigation!

Monday, September 15, 2008

Bail Outs

It's difficult not to draw a comparison between the thousands of residents of Galveston, Texas, who needed rescuing after Hurricane Ike hit this past weekend, and the Wall St. investment banks that needed rescuing this weekend. Both are victims of catastrophes that were seen and anticipated way in advance. Hurricane Ike was talked about for over a week as it moved its inexorable way across the Atlantic and into the Gulf. Its likely landfall in Texas was predicted days in advance.

Likewise, the housing and credit crises that took down Lehman Bros. and Merrill Lynch, and threatened AIG, had already offed Bear Stearns, the Federal National Mortgage Association (FNMA, or "Fannie Mae"), and the Federal Home Loan Mortgage Corporation (FHLMC, or, for some strange reason, "Freddie Mac"). No surprises here.

I think there can only be three reasons for ignoring the very marked warnings that have accompanied both of these catastrophes:
  1. You're not listening.
  2. You don't believe what you're hearing/seeing.
  3. You think you can beat the odds.
If you're just plain not listening ... well, what can I say? If you're an investment banker, your whole job is supposed to be about knowing what's going on.

If you don't believe what you're hearing and seeing, you'd better start looking for more credible sources. Stop reading this blog (just temporarily) and pick up a newspaper ... yes, an actual printed newspaper (I somehow think print still carries a credibility cachet that's missing from Web media) ... preferably one that's not owned by an entertainment conglomerate. Sure, there's usually a range of opinions about the severity of the credit crisis, but with the possible exception of John McCain, I don't think anyone has been denying that there are some serious economic issues in the U.S. right now.

And if you just think you can beat the odds, thank you. You're helping keep our state lotteries afloat.

Friday, September 12, 2008

Think Small

In his blog on Adobe, John Nack voices concern that many Photoshop users will find the addition of 3D features off-putting, and will believe the product is evolving in directions they don't care about. Unfortunately, this is an intrinsic problem with monolithic ("single stone") applications. They're big, expensive, and wasteful. Not everyone is going to use all the features, but everybody pays for them, in disk and memory space and in performance as well as in price.

I thought we had learned this lesson years ago. Unix had things beautifully worked out. There were hundreds of simple command-line programs you could use as building blocks. You could create your own workflow by piping the output of one program to become the input of the next.
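That building-block style still works today. Here's a minimal sketch of one such pipeline, driven from Python for illustration, chaining the standard Unix `sort` and `uniq` tools through pipes (this assumes those tools are on your PATH, as they are on any Unix-like system):

```python
import subprocess

# Equivalent to the shell pipeline:  printf '...' | sort | uniq -c
# Each small program does one job; the pipe wires them together.
words = "apple\nbanana\napple\ncherry\nbanana\napple\n"

p1 = subprocess.Popen(["sort"], stdin=subprocess.PIPE,
                      stdout=subprocess.PIPE, text=True)
p2 = subprocess.Popen(["uniq", "-c"], stdin=p1.stdout,
                      stdout=subprocess.PIPE, text=True)

p1.stdin.write(words)   # feed the first program
p1.stdin.close()        # signal end-of-input so sort can finish
p1.stdout.close()       # hand the read end entirely to p2
out, _ = p2.communicate()
print(out)              # counts of each word, one per line
```

Neither `sort` nor `uniq` knows anything about the other; the pipe is the whole integration layer. That's the design Unix got right and monolithic applications gave up.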

In the early days of object-oriented programming, this kind of world was envisioned. The computing environment would be populated with objects that had methods and could respond to messages. Your new application would send messages to the appropriate other objects, already in place, to use their methods to manipulate their data. CORBA (Common Object Request Broker Architecture) was one statement of this vision, though it never caught on, for various reasons.

But the general idea, that software should be built from small, reusable components, has been around for about as long as software has. In fact, software is just a way of re-purposing hardware!

The plug-in approach is a step in this direction. Applications like Photoshop, Illustrator and InDesign allow plug-ins to add new functionality, or change the behavior of what's already there. But there's still a tendency for the applications to become more and more bloated with new built-in features and functions in each release. The most obvious downside of this is that it drives the price up to the point at which there's a whole sub-industry of Photoshop imitators. Photoshop CS3 is listed at $650 on the Adobe site right now! The "Extended" version is $999! And this is software for creatives?

I think developers need to challenge themselves constantly to think in terms of software building blocks that can be assembled to meet various needs. Same-day startup time wouldn't be a bad thing either.

Thursday, September 11, 2008

Esquire They Doing That?

In keeping with my tradition of ignoring anniversaries, I will make no mention of the fact that today is the 7th anniversary of a major anti-U.S. terrorist attack involving 4 hijacked airliners. (Well, except for that.)

Instead, I'll talk about the 75th anniversary of Esquire magazine and, in particular, the much-ballyhooed cover. The cover, at least on 100,000 copies, features an electronic display with technology from E-Ink, a Cambridge, MA spin-off from the MIT Media Lab. E-Ink's flagship product, a flexible electronic display, lets you have paper whose contents can be replaced, so, for example, a sheet could display the front page of The New York Times, always with the latest news.

Unfortunately, the small sample included on the October 2008 Esquire cover (with another inside for an ad) is spectacularly unimpressive. Basically, it just flashes some text and some boxes on and off. The display doesn't really change at all. There are some photos and some text printed on a plastic sheet that overlays the E-Ink product, and by changing the background of the product from dark gray to white, these appear to flash on and off. But the photos don't change. Nothing moves. Even the text that is actually displayed on the e-paper is completely static. It just blinks on and off. The same effect could be achieved with an LCD pretty easily, I think.

The text proudly declares "The 21st Century Begins Now," but you'd never know it from this demonstration.

Wednesday, September 10, 2008

Google Chrome - Not Just Another Browser

Ok, I admit it. I was wrong in my earlier post about Google's Chrome: YAB - Yet Another Browser.

Chrome is not just another browser. Sure, it looks and feels like a browser today. Partly that's a cognitive issue. People understand what a browser is, so that's the best way to explain Chrome. It's also true that Chrome's functionality overlaps that of a browser by a very large amount.

But Chrome is not a browser!

It's a virtual operating system. In other words, it's a distributed platform for running application software. This point is made in Google's press on Chrome, and in the wonderful on-line comic they produced to accompany it. But it's glossed over. They spend more time talking about security and reliability, and of course, they keep referring to it as a browser.

But no one said software is what the developers say it is. Regardless of what anyone says, Chrome is a virtual OS. It's not the first virtual OS. I'm sure there were earlier ones, but for me, emacs comes to mind. Emacs began life as a text editor, but as it grew, it not only added more and more functions to its base code; it also provided a runtime environment for a flavor of the Lisp programming language, and this, in turn, meant more and more applications were written in Emacs Lisp and run within the environment of the editor. People have used emacs for reading and writing email, keeping journals and calendars, composing Web sites, manipulating spreadsheets, and many other things. Some folks still use it for editing text files.

But Chrome is the VOS for the 21st century! Or at least, for the end of the first decade of the 21st century, and probably at least the early part of the second decade as well. It's currently only available in a beta for Windows, but one can easily imagine where it will go in the next year or so. It will provide all the richness that Web applications currently use, and it will fit more seamlessly (less seamfully?) with other applications on your computer, phone, iPod, watch, TV, shoe, etc.

VOS populi.

Monday, September 8, 2008

Life = TV?

The Sharp Aquos (How the heck do you pronounce that anyway? Akwos or Accu-os?) is now advertising with the slogan:


Think about that.

Saturday, September 6, 2008

Journalism: Too Much or Too Little Detail

In the usually meticulous New York Times, in an opinion piece called The Two Weeks You Missed, William Falk writes:

Last summer, warming temperatures melted more of the Arctic ice cap than at any time since measurements have been taken.

There is no qualification as to when people started taking measurements of ice cap melting. As far as I know, it was last Tuesday. So this statement, presumably intended to alert me to impending doom at the North Pole, really means nothing.

I don't doubt that there is impending doom from global warming. I'd just like more careful journalism to disclose it.

I've also noticed the opposite situation. I frequently see and hear journalists making statements like:

Wall Street suffered its biggest one day drop since last month.

Since last month? Big whoop! Wake me when something actually happens.

Friday, September 5, 2008

Mental Appliances

A week or so ago, John Tierney posted How to Get Smarter on his TierneyLab blog. He asked how human beings could remain competitive in a world of super-intelligent machines, such as that envisioned by computer scientist and science-fiction writer Vernor Vinge in his novel, Rainbows End.

All of this, of course, goes to the well-worn questions of what intelligence is anyway, and whether machines can become intelligent. The best model I've seen is that what we normally think of as intelligence is a range of skills including pattern recognition, abstract thinking, imagination, etc. Were Shakespeare and Einstein both extremely intelligent? How about SPA? (Socrates, Plato and Aristotle, whose contributions can be difficult to tease apart, especially since Socrates' work is only known through the writing of Plato, and Plato was Aristotle's teacher.)

In this light, I think machines will get very good at certain tasks, but not others. The simple reason is that we don't need them to do these other tasks. Computers are already very good at storing and retrieving massive amounts of information, and communicating across long distances. That's good, because without computers, humans are pretty bad at those things. But aside from some undergraduate prank or demonstration of cleverness, why bother making a computer listen to music? (Not analyze it ... just listen.) Or enjoy nature? These are things that we do fairly well, and there's no particular benefit to having a computer do them for you.

So the interesting problem is how to use computers to do things we're not good at. This is obvious. Any industry is based on meeting some need ... i.e., some real or perceived lack. Grocery stores exist because most of us are not good at producing our own food. Duh.

In a recent Scientific American article, Why Our Brains Do Not Intuitively Grasp Probabilities, Michael Shermer reveals the disconnect between our perceptions about numbers and probabilities and the reality. Of course, one has only to listen to the presidential campaign speeches to see that disconnect in action. Humans are astoundingly bad at objectively evaluating numerical evidence, and are easily swayed by anecdotal arguments and broad generalizations.
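A classic illustration of the kind of disconnect Shermer describes (the numbers here are my own textbook-style example, not from his article): suppose a disease affects 1% of people, and a test catches 90% of cases with a 10% false-positive rate. Most people guess a positive result means you're about 90% likely to be sick. Bayes' rule says otherwise:

```python
prior = 0.01        # P(sick): 1% of the population has the disease
sens = 0.90         # P(positive | sick): the test's sensitivity
false_pos = 0.10    # P(positive | healthy): the false-positive rate

# Total probability of testing positive, sick or not.
p_positive = sens * prior + false_pos * (1 - prior)

# Bayes' rule: P(sick | positive).
posterior = sens * prior / p_positive
print(round(posterior, 3))  # about 0.083, i.e. roughly 8%, not 90%
```

A positive test leaves you with only about an 8% chance of actually being sick, because the false positives from the healthy 99% swamp the true positives from the sick 1%. Our brains just don't weigh those base rates intuitively.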

This is where we could use a mental appliance!

Thursday, September 4, 2008

YAB - Yet Another Browser

For years I've been trying to negotiate a settlement in the Mac vs. PC war. In my day job as a software developer, I was working largely on Windows platforms because ... well, you can sell more software 'cause there's more of 'em out there. Outside the office, though, in my own illustration and design and, more importantly, in hosting various discussion forums for illustrators, cartoonists, etc., the Mac was king!

And during much of this time, the holy grail of desktop software was a toolkit that would allow the same application to run on both Macs and Windows, and to look like it belonged there! It wasn't enough that the software could run lamely in some misbegotten compatibility mode. It had to look and feel like a native application on both platforms.

But after many years in the desert, we finally saw the promised land of Web applications! Now it didn't matter whether you were running on a Mac or Windows or even some flavor of Linux, because all your applications would run in your browser. The added benefit was that your data lived on a server, so you could get at it from anywhere. (In my experience, we cycle between putting all the smarts on a server and putting all the smarts in your desktop/laptop/pocket about every 5 years.)

But this promised land was an illusion. Even for those applications that could be Web-hosted, the stunning variety of browsers, each with version and platform incompatibilities, turned the Mac/Windows problem into a million smaller problems requiring special case coding.

And now Google, the most notable pioneer of this Web-as-computer approach, has introduced yet another browser, Chrome. The name is intended to suggest something bright and shiny, but let's get real here. The fact that Chrome is initially available only for Windows should set off warning bells. It also doesn't currently support Java, and seems to have problems with a fair number of Flash applications.

Back to the desert.