[A few more posts about last weekend’s H+ Summit at Harvard.]
The last and keynote speaker of the 2010 H+ Summit was, of course, the big daddy of transhumanism, Ray Kurzweil (bio, on-the-fly transcript).
In introducing him, the organizers noted that he flew into town that morning from Colorado, where he was filming his movie, and that he would be zipping out from the conference right after his talk to catch a flight to Los Angeles. This little detail is pretty emblematic of the conference in general: whereas Kurzweil hovered around last year’s Singularity Summit and descended intermittently to comment upon it like the head priest issuing edicts to his votaries, here he attended none of the conference and just stopped by to deliver his stump speech and head back out.
Given that this is the main event, I should probably try to outline it in detail, but just like his talk at SingSum, there was neither any core message to this talk nor anything remotely new about it. He hits all of his standard talking points. And I don’t just mean the same themes, but the very same details he lays out in The Singularity Is Near: the same graphs about Moore’s Law and about the exponential progress of technology in general and of various technologies in particular. The main reason for his being here, it seems, is his celebrity. Though he does have the shiniest slides of anyone here; his presentation is polished, if not new or focused.
In keeping with Kurzweil’s own unfocused approach, here are a few random notes about the talk and follow-up Q&A:
- — I wasn’t the only one underwhelmed by Ray the K’s presentation. George Dvorsky, a conference presenter, tweeted about how it was all boilerplate. Tweeter Samuel H. Kenyon complained about the warm reception: “Seriously people, why does Kurzweil deserve a standing ovation but the other presenters don’t? Idol worshiping is not my bag.” The best tweet was a tweak, joking about Kurzweil’s obsession with exponential curves: “Why is this talk now not 5 minutes long and 1000 times as interesting as it was 5 years ago?”
- — Here’s Kurzweil on human DNA: “We’re walking around with software — and this is not a metaphor, it’s very literal — that we’ve had for thousands or millions of years.” My jaw was on the floor. Literally, not metaphorically.
- — Kurzweil is working on a book about reverse-engineering the brain, called How the Mind Works and How to Build One (see the Singularity Hub’s recent article on this). Someone alert Steven Pinker that he’s been one-upped. Also, this literal/metaphorical biological software business doesn’t bode well for the metaphysical clarity of this book.
- — He makes an important admission, which is that there is no scientific test we can conceive of to determine whether an entity is conscious — and this means in particular that the Turing Test does not definitively demonstrate consciousness. His conclusion is that consciousness may continue to elude our philosophical understanding, and we should just set those questions aside and focus on what we can practically do.
- — He claims that we are “not going to be transcending our humanity, we’re going to be transcending our biology.” Uh oh, they’re going to need to add a few items to the agenda for the next staff meeting:
- (1) Time to change the name to “transbiologism”? And “H+” to “B+”?
- (2) Figure out how in the world humanity is separate from its biology.
- (3) Come up with a plan to deal with some very put-out materialists.
- — As part of the great transhumanist benevolence outreach, Kurzweil makes the bold claim that “old people are people too.” Of course, what this really means — aside from “if you at all question the wisdom of extreme longevity, then you hate old people” — is “we should turn our revulsion for getting old into pity for the elderly.” Somehow I don’t think respecting the dignity of the elderly as we do the young and able-bodied is really what he’s getting at here.
And that’s it for the last presentation of the 2010 H+ Summit. Stay tuned for a couple of wrap-up posts.
Kurzweil is very repetitive. He has been saying the same thing for the past five or ten years. Also, his fetishizing of Moore's law has no basis in fact. Moore's law has become a Rorschach inkblot test: people project their own beliefs onto the term instead of using Gordon Moore's original formulation. Most futurists use the term to connote that computing power is growing exponentially. This trend and its supposed implications are basically the deus ex machina of transhumanists.
CPU clock speed stopped increasing about five years ago, at around 3 GHz. Companies now add more cores to the die (instead of ramping up clock speed), but not all programs can take advantage of more cores. Two cores do not equal twice the computing power even in programs that can take advantage of them. There are also bottlenecks due to RAM and other factors.
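The commenter's point that two cores don't double throughput is, in essence, Amdahl's law: the serial portion of a program caps the total speedup no matter how many cores you add. A minimal sketch (the 80% parallel fraction is purely illustrative):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: overall speedup when only part of a program parallelizes."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

# A program that is 80% parallelizable gains well under 2x on 2 cores,
# and even 100 cores can't push it past 5x: the serial 20% dominates.
print(round(amdahl_speedup(0.8, 2), 2))    # ~1.67
print(round(amdahl_speedup(0.8, 100), 2))  # ~4.81
```

This is why "twice the cores" on a Moore's-law graph does not translate into twice the useful computing power for most software.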
It's easy for Kurzweil to find exponential lines when he selectively interprets the data. Many of his trends are somewhat fabricated.
That said, he does have a point that certain exponential curves can cause dramatic societal changes.
As for our humanity… With many new technologies there are often worries that they could take something away from our humanity. If you brought someone from the 1600s into the modern world, I think they would have a hard time dealing with how much people have changed. Women (and men) would seem to them to be dressed and acting inappropriately. If they saw someone listening to music with earphones, they might wonder why that device was attached to their head. Modern songs would likely sound like wretched noise to them. They wouldn't initially comprehend how people could spend hours in front of a monitor, constantly tapping on a keyboard. Etc., etc. Society might seem extremely weird to them, and people could seem almost inhuman. However, most of these changes happen incrementally for those who experience them. So people get desensitized and don't realize how "unnatural" their lives already are.
Where do you draw the line between human and inhuman? Are people who have prosthetic arms or legs or eyes not human? Are people who have a deep brain stimulation implant for Parkinson's or depression not human?
If I may take up just part of Mike's thoughtful comment: I think you are quite right about the adjustment problems a man from the 1600s would face today. A good deal of modern song sounds like wretched noise to me, and I am not nearly that old. Still, it is not clear to me what conclusions we should draw from this thought experiment. None of the examples you provide represent anything in principle new: we are as in the 1600s still in a world of reading, writing, listening, seeing, a world of men and women and manners. A reasonably intelligent man of that time could "learn our ways" just as men of the same time have adapted to radically different human cultures. In contrast, I would have thought that a central point of TRANShumanism and POSThumanism, a point reinforced by the concept of Singularity, is that no such adjustment or adaptation would be possible — and that is something we ought to look forward to. As far as line-drawing goes, for transhumanism the loss of our humanity is not some potential or arguable downside of technological change; getting to the other side of the line is the advertised product.
Never underestimate human adaptability. The example of Homo 1600 is not cut and dried. It depends upon which seventeenth-century human you choose… oh, and Frescobaldi's music was often considered "noise" in his own time…
Plenty of people now live pretty much the way people did in 1600, and I don't mean just hunter-gatherers or peasant farmers in less developed countries. Even in developed countries, you can find people who engage in traditional livelihoods like wine making which haven't changed much in centuries.
Trying to position yourself as in favor of the elderly while simultaneously opposing rejuvenation technology strikes me as a difficult balancing act. Good luck.
Kurzweil seems to me amazingly prolific, and if his basic message hasn't changed in 5 years, that is fairly normal, isn't it?
CPU clock speed has plateaued, but the number of cores is now increasing exponentially (already up to 100 cores per chip). Before going to multicore, the complexity of cores increased, meaning more computation per clock cycle. That's less efficient than multicore architectures (and to be fair the cores in many-core chips are stripped down), but as you point out the leap to multicore is a big one for software.
N cores does not always mean an N-fold speedup, but in many cases it can. RAM arrays can be divided up for multiple access, and usually in multicore processors each core has its own cache. The exponential growth of RAM density, of course, continues unabated (still a long way to go to one bit per atom).
In any case, Moore's law continues, even though CMOS seems to have hit a speed limit – which only means, in the longer term, that there is increasing motivation to develop the technology that will supersede CMOS.
As Kurzweil and others have noted, the exponential growth of computing did not begin with integrated circuits but goes back 100 years or more, with one technology superseding another, typically characterized by an S-curve of early development and exponential growth, followed by linear growth and then maturation and slowing of growth. I think Kurzweil makes a strong case that one sees this not only in computers but in many areas of technology.
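The S-curve pattern the comment describes, near-exponential early growth that flattens into maturation, is commonly modeled with a logistic function. A sketch (the capacity, rate, and midpoint here are arbitrary illustrative parameters, not fitted to any real technology):

```python
import math

def logistic(t: float, capacity: float = 1.0, rate: float = 1.0,
             midpoint: float = 0.0) -> float:
    """Logistic S-curve: near-exponential early, roughly linear in the
    middle, then saturating toward `capacity` as the technology matures."""
    return capacity / (1.0 + math.exp(-rate * (t - midpoint)))

# Early on the curve, growth looks exponential; late, it stalls near capacity.
print(round(logistic(-4.0), 3))  # early phase: tiny, fast-growing
print(round(logistic(0.0), 3))   # midpoint: half of capacity
print(round(logistic(4.0), 3))   # maturation: approaching capacity
```

On this picture, the sustained exponential that Kurzweil plots would be the envelope of many successive S-curves, each new technology taking over as its predecessor saturates.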
On another topic, the distinction between human and non-human is clear by reference to the physical description of a thing. Humans are, for example, made of human tissue, composed of human cells, etc.
People who have prosthetic devices implanted in their bodies are people who have prosthetic devices implanted in their bodies. The prosthetic devices are not human, but the people are, and not more nor less than anybody else. Is there really any confusion about this?
I think there are deep reasons (apart from problems associated with foreign body rejection) why it is preferable that prosthetic as well as utilitarian devices be kept external to the body whenever possible. The same reasons would also make the use of cloned tissue to replace damaged tissue preferable to the use of non-human implanted devices or prosthetics.
Transhumanists often exaggerate the imagined advantages of implanted "enhancements" over external tools. If I want to see in the IR, for example, I can put on night-vision goggles, which could be made quite light and convenient long before anyone will figure out how to implant them in my eyes.
Very well put, Charlie. The quick rebuttal to Mike's point is that he is describing only cultural changes, whereas transhumanists want to change (or "transcend") our very biology.
But moreover, I'm struck again by the transhumanist trope of trying to blur all change into an undifferentiated haze, so that there's no reason to oppose their project because it's just the same as all the change we've seen before. But I thought the whole point was that it is going to be so radically different, that we're going to be transcending current limitations in ways we've never seen before. It seems counterintuitive for them to deny the very boundaries they aspire to overcome.
I think Kurzweil could use a major revamp of his speeches, especially considering he is such a vocal proponent of the Singularity and transhumanism. I like Kurzweil, but he needs to be more open to criticism of his theories.
I remember Ray saying in 2005 that CPU clock speeds would be in the 10 GHz range by 2010 and 20 GHz by 2012/2014. Earlier in the decade Intel was predicting 15 GHz chips with 10 billion transistors by 2010. Most chips now run at 2 to 3 GHz and have 1 or 2 billion transistors. The forecast for the introduction of specific transistor nodes (45 nm, 32 nm, 22 nm, etc.) has been pushed back by a year or two compared with the forecasts Intel made five to ten years ago. That's one of the problems with forecasting exponential trends: if the doubling period is off by even a small amount, your numbers will eventually look ridiculous.
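The sensitivity of exponential forecasts to the assumed doubling period is easy to make concrete. The numbers below are purely illustrative arithmetic, not real clock-speed projections:

```python
def projected(base: float, years: float, doubling_period: float) -> float:
    """Value after `years` of exponential growth with the given doubling period."""
    return base * 2 ** (years / doubling_period)

# Starting from a 3 GHz baseline, compare a 2-year vs. a 3-year doubling period.
# The forecasts diverge slowly at first, then spectacularly: after 20 years
# the optimistic curve is roughly 10x the conservative one.
for years in (5, 10, 20):
    fast = projected(3.0, years, doubling_period=2.0)
    slow = projected(3.0, years, doubling_period=3.0)
    print(years, round(fast, 1), round(slow, 1))
```

A modest error in the assumed doubling period is invisible over a product cycle or two, which is exactly why such forecasts look credible right up until they look ridiculous.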
I'm not sure what you are referring to when you say 100 cores; maybe a specialized chip? "Core" is an inexact term that can mean a variety of different hardware configurations. Most CPUs have 2 to 4 cores, and they seem to have been stuck at that level for five years. There are diminishing returns in going from a dual core to a quad core (especially if the clock speed is lower in the quad core). Many programs show no speedup, and it's still rare for software to utilize quad cores fully. I could go on and on about computing limitations (like the huge number of megawatts that may be required to power an exaflop supercomputer).

My main point, though, is that many transhumanists use a facile interpretation of Moore's law. Showing a graph that plots the supposed doubling of CPU core counts misses a lot of the intricacies of changing computing (or of any other supposed exponential curve). I don't really buy the claim that CPU core counts are increasing exponentially anyway. This doesn't mean I deny that some really interesting accelerating trends are occurring (GPU supercomputing offers tremendous speedups for certain types of scientific research, for example), and I'm sure there will be more breakthroughs in the future as well (50 GHz photonic chips?).

The "exponential curves will save the day and transform society" meme is very appealing in many ways, and it seems like a lot of people just take the bait hook, line, and sinker. The meme sort of disables rational thinking on the topic and turns people into Moore's-law zombies who won't tolerate any criticism. I initially fell for it too (along with all the Singularity hype), but I've become more skeptical of late. As for the other points in the comments, I will try to say a few more things later.
It seems that, after all, you are only quibbling that progress has been a little slower than some predictions, but you don't deny that the historical exponential trend continues.
Most computers are, of course, PCs, and most PCs aren't used for crunch-hungry apps, and furthermore are wedded to legacy software that does, indeed, have difficulty exploiting multiple cores. Just wait for vision, natural-language processing, motion planning and control, and general intelligence apps, e.g. in robotics. These are all highly parallelizable and computationally intensive, and will drive the next growth spurt of the underlying hardware.
I'm not sure what you mean by "Moore's law zombies" etc. You say that you have become skeptical of "singularity hype". Sounds like denial to me.
Yes, small differences in exponential growth rates compound over time, but even if progress slows by 50%, we are still facing the prospect of machines with a credible claim to human equivalence within my children's lifetimes – and that leaves me very uncomfortable.
Well, clearly this has brought the Kurzweil/transhumanist "fanboys" out… how is pointing out the fallacies a form of denial? (Good on you, Mike!)
And honestly, the same message of the past five years? I'm sorry, but that implies a kind of intellectual ossification that is just simply sad. A broken record, to use an analog analogy…