Thursday, June 14, 2012

I got some stuff published

Well, after two and a half years (!), I figure it's time to update this blog.

After I put up the article on machine consciousness mentioned in my last post on this blog, I had some interesting email exchanges about it with several people, including Mark Bishop and David Chalmers. Both have written on this topic, and both encouraged me to write something up for publication, which I have finally done. Part of the delay was due to my other writing project (Do Not Go Gentle, more on that below), and part was due to the fact that my sabbatical ended in November 2010 and I went back to full-time work.

The article I wrote is titled "Counterfactuals, Computation, and Consciousness", and it's published in Cognitive Computation. It is available online from Springer, or if you don't have institutional access to that journal, you can read a less-nicely-formatted version of it on my website:

muhlestein.com/consciousness/ccc.html

For the sake of conciseness, I decided to write in detail about a single thought experiment that illustrates one of the difficulties of computationalism: the philosophical claim that a purely computational theory of mind is possible. Computationalism has been the leading theory of mind for some time, though it does have its detractors. I had more or less assumed that something like uploading to a purely computational substrate would work, in the sense that an uploaded mind would have conscious experience, but I never thought seriously about it until I was plotting my book and realized I'd have to come down one way or the other on the question. Unfortunately, I've come to believe there are serious reasons to doubt that it would work as some hope.

Very clearly, there is computation going on in the brain, as has been demonstrated beautifully, for example, in studies of visual processing. But what about consciousness itself? The more I thought about it, the less convinced I became that a purely computational account would do the job. The problem for me is the abstract nature of computation, and the fact that it is possible to blur the distinction between a recording of a computation, which almost everybody (with the exception of patternists like Ben Goertzel) agrees could not be conscious, and a bona fide computation, which computationalists assert could be conscious, provided it is the right sort of computation.
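To give a rough sense of the blur I mean (this is just a toy sketch of my own, not the argument in the paper), consider that a replay of a recorded run can be externally indistinguishable from the computation that produced it, even though nothing is being computed during the replay:

def live_fibonacci(n):
    """Actually computes each value; counterfactually sensitive to its input."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Record the outputs of one run...
recording = [live_fibonacci(i) for i in range(10)]

def replayed_fibonacci(n):
    """Merely looks up a stored result: no computation, just playback."""
    return recording[n]

# From the outside, the two are indistinguishable over the recorded range.
assert all(live_fibonacci(i) == replayed_fibonacci(i) for i in range(10))

The paper's thought experiment is much more careful than this, of course, but the toy version captures why the abstractness of computation makes me uneasy.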

Beyond needing to know how to think about this for use in my novel, I have a personal interest in it as well: I have many friends in the transhumanist community who are looking forward to uploading their consciousness into a computational substrate via a destructive scan of their brains. I'd be the last person to try to prevent someone from doing that, but before I could recommend it, I'd want to be double damn sure that it works. The failure mode if it doesn't is quite horrifying: your loved one uploads and announces that they are conscious, everything's fine, life is good, and so on. But if computationalism is false, they would in fact be dead, and their behavior would merely be the output of a program, no more conscious than a one-liner that prints "Hello, World." So we definitely want to get this right!

In other news, as most readers of this blog will already know, I did manage to get the first two volumes of my novel, Do Not Go Gentle, put up on Amazon as ebooks:

Do Not Go Gentle Book One: Discovery
Do Not Go Gentle Book Two: Kinaadman

People seem to be enjoying the story, and though I have made no attempt whatsoever to promote it, I've had a gratifying response so far. At some point I guess I'll spend some effort getting the word out about it, but for now, I'm enjoying just working on it.

I'm about two-thirds done with the third installment of the story, and now that the consciousness paper is finished, I'm finally getting back to work on it. I'm hoping to finish book 3 before the end of the year. I have a blog for the story at millstorm.com, "Millstorm" being the nom de plume I'm using.