The big question, of course, is "What IS intelligence?" What exactly will these super brains do better than we do? There's no dodging that question. Machines do what we design them to do. So what will that be? Design them to create even better machines? Better in what way?
Will they make better TV shows? And if so, who will enjoy them ... us or them? Are we really going to build machines that can write new episodes of Hot in Cleveland? And what will the commercials advertise?
How about governing? Surely that's one of the most important problems humankind has to solve ... how to govern. Can we create robot world leaders? Would they show a shred of warmth and humanity, or just be like Al Gore?
They could probably cure cancer, AIDS, and flatulence, but would they? These highly intelligent robots wouldn't suffer from any of these conditions. Will they see any benefit to keeping a bunch of flesh-and-blood beings around?
Wouldn't it be simpler for them to just take over the world? Science fiction books and movies have warned us about this for decades. But really, just the way Microsoft dropped support for Windows XP, sooner or later the robots are going to drop support for Human 1.0. We'll be too obsolete and expensive to maintain. You'll try to get help with some human problem, and get connected to a robot with an Indian accent halfway around the world.
But it's inevitable. Futurists are looking at the trends in technology and the rate at which things change. The signs are already there. My computers spend way more time upgrading themselves than doing any actual work. If that's not superior intelligence, what is?