Wednesday, April 28, 2010

A Short Rant on Religious Freedom and the Courts

Two recent events make it all the more clear that religious liberty for all is in jeopardy. Even in civilized, Western countries, basic religious freedom, whether for believers, agnostics, or atheists, hangs by a series of thin threads. Great Britain and the United States provide the two most recent examples.

The United States Supreme Court ruled 5-4 that game playing with transfers of small plots of land allows the federal government to endorse specific religions. Readers are likely familiar with the ongoing case of the war memorial cross in the Mojave Desert. The federal government attempted to transfer the land immediately surrounding the cross to a private veterans group to avoid any problems with the Establishment Clause. The court decided not just that this game playing was acceptable but that it probably wasn't even necessary. Justice Kennedy wrote:

A Latin cross is not merely a reaffirmation of Christian beliefs. Here, a Latin cross in the desert evokes far more than religion. It evokes thousands of small crosses in foreign fields marking the graves of Americans who fell in battles, battles whose tragedies are compounded if the fallen are forgotten.
Because of course, the fact that fallen soldiers of other religions are buried under other symbols is beside the point. The fact that the prevalence of the cross stems solely from the US being a majority Christian nation is beside the point. And the fact that some (small) Christian groups are actually uncomfortable with the cross as a religious symbol is also beside the point.

But not to worry: while the US Supreme Court is busy whittling away at the basic separation of church and state, the British are busy destroying the right to say things that offend religion. Apparently, leaving anti-religious tracts around in the wrong places in England can get you convicted. After leaving anti-religious tracts in an airport prayer room, Harry Taylor received a six-month suspended sentence and is now not allowed to carry anti-religious leaflets in public. As far as I can tell, the tracts left by Taylor were deeply unfunny cartoons that wouldn't have convinced anyone of anything. Taylor probably needs a few lessons in how to be funny rather than just annoying (Jennifer McCreight could likely teach him a thing or two). But being unfunny shouldn't be a criminal offense either. At least Taylor's situation would still be unambiguously unacceptable in the United States.

These events highlight how important it is that Obama's next Supreme Court nominee be a strong supporter of free speech. Unfortunately, his previous nominee, Sotomayor, has a mixed record on such issues. Let's hope the next one is better.

Monday, April 19, 2010

On the Coming Singularity

Much has been said in the last few years about an approaching technological Singularity, beyond which humans or humans' descendants will be so far beyond anything we understand today that comparisons would be meaningless. I do not believe that the Singularity is imminent.

What do people mean when they speak of the Singularity? There are a variety of such notions, but most versions of the Singularity focus on self-improving artificial intelligences. The central idea is that humans will not only construct functioning artificial intelligences, but that such AIs will be smarter than humans. Given such entities, technological progress will accelerate rapidly as the AIs make discoveries and inventions that humans would not. This effect will be self-reinforcing, as each successive improvement makes the AIs smarter. There are variations on this idea: other Singularity proponents, generally described as Transhumanists, emphasize genetic engineering of humans or direct interfaces between the human brain and computers. I am skeptical of a Singularity occurring in the near future.

Certainly Singularitarism is seductive. Variations of it make for great science fiction (Charlie Stross' Eschaton is an excellent example), and some versions of the Singularity, especially those that involve human minds being uploaded into immortal computers or the like, are appealing. Singularitarism may sometimes border on a religion, but it has the virtue of a minimally plausible eschatology, one that doesn't require the intervention of tribal deities, just optimistic estimates of technological and scientific progress. And to be sure, there are some very smart people, such as Eliezer Yudkowsky, who take the Singularity very seriously.

The most common criticism of Singularitarism is that we will never develop effective AIs. This argument is unpersuasive. There's no intrinsic physical law against developing AIs; we are making slow but steady progress; and we know that intelligences are already possible under the laws of the universe. We're an example.

While I reject most of the common criticisms of a coming Singularity, I am nevertheless skeptical of the idea for two reasons. First, while human understanding of science and technology has improved over the last few hundred years, the resources required to produce the same increase in understanding have grown dramatically. For example, in the mid 19th century a few scientists could work out major theories about nature, such as the basics of evolution and electromagnetism. Now, however, most major scientific fields have thousands of people working in them, and yet progress is slow and incremental. There seems to be a meta-pattern: as we learn more, we need ever more resources to make the same amount of progress. Thus, even if we develop smart AIs, they may not lead to sudden technological progress.

Second, we may simply be close to optimizing our understanding of the laws of physics for technological purposes. Many of the technologies we hope to develop may be intrinsically impractical or outright impossible. There may be no room-temperature superconductors. There may be no way to make a practical fusion reactor. As Matt Springer suggested (here and here), we might activate our supersmart AI only to have it say, "You guys seem to have thought things through pretty well. I don't have much to add." This points to a recurring problem among Singularity proponents: the argument that essentially all challenges can be solved by sufficient intelligence. I've personally seen this argument made multiple times by Singularitarians discussing faster-than-light travel. But if something isn't allowed by the laws of physics, then there's nothing we can do about it. If in a chess game white can force checkmate in three moves, it doesn't matter how smart black is; black will still lose. No matter how smart we are, if the laws of physics don't allow something then we won't be able to do that thing, any more than black can prevent white's checkmate.

There's a third problem with Singularitarism beyond issues of plausibility: it doesn't tell us what to do today. Even if no one had ever come up with the idea of the Singularity, we'd still be investigating AI, brain-computer interfaces, and genetic engineering. They are all interesting technologies with potentially major applications, and they help us answer fundamental questions about human nature. So in that regard, the Singularity as a concept is unhelpful: it might happen; it might not. But it tells us very little about what we should do now.