Civil Energetics


[Sep. 14th, 2009|10:44 am]

[anansi133]
Michael Moore has a new movie about to be released. I'm inching through Life Inc., after having read No Logo and The Corporation. It's not a particularly good time to be a capitalist right now.

But if there's a mass revolt in the works against the Market, what about all the other mass revolts in history, against things that *weren't* capitalist?

I'm a lazy engineer: if I'm going to all the trouble of revolting against something, I want to be absolutely certain I'm rebelling against the thing that's actually bothering me.

I don't think it's capitalism; capitalism is just the current flavor. For me, the thing worth opposing is an entire way of thinking.

Um, that didn't come out right. I don't mean a school of thought, or a set of wrong ideas; I mean a mechanism for framing and making choices.

When computer geeks talk about artificial intelligence, we usually mean a kind of mental process that's as flexible and robust as what happens in our own heads, taken outside the brain and put in a format that can be saved, loaded, backed up, and turned off, without it being murder.

And my first knee-jerk reaction has always been the labor-union stance. We have people wasting countless clock cycles on dull, meaningless drudgery; are we really that invested in taking more of the fun stuff away from people and giving it to the machines?

It strikes me that every cautionary tale about robot rebellion is just a thinly disguised parable about the rise of the multinational corporation, and its asymmetric tug of war with the State. (I'm not a big fan of the State either, come to that.)

In _The Difference Engine_, Bruce Sterling and William Gibson wrote an alternative history of the technological singularity, where artificial intelligence becomes more important than natural-born intelligence. It's made from punchcards and cogs and steam power instead of electricity and magnets and light... but it's recognizably the same thing, the same stuff.

What I've come to believe is that artificial intelligence is not something waiting to happen; it's been happening for a very long time, longer than our records show. What we're beginning to notice is not the information-processing power, but the integration of all those peripherals.

I don't like calling it "corporate thought" or "the institution," because that puts it safely out of reach for the viewer at home. My working title for it is Turing Thought: the kind of cognition that can be mapped onto any garden-variety Turing machine.

And the register of a Turing machine doesn't have to look anything like a machine at all. If you want to play tic-tac-toe with a computer, you can use beads and a bunch of matchboxes. Clay tablets will work, or knotted cord. As soon as humans become literate, this kind of artificial intelligence begins to emerge; it can't help itself.
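
To make the beads-and-matchboxes line concrete: I'm assuming the reference is to Donald Michie's MENACE, the classic scheme where every board position gets its own matchbox of beads, one kind of bead per legal move, and the "computer" plays by drawing a bead at random and learns by adding or removing beads after each game. The sketch below is a rough, simplified rendering of that idea in Python; the names (MatchboxPlayer, play, learn) and the bead bookkeeping are mine, for illustration only, and a dict of counters stands in for the physical boxes.

import random
from collections import defaultdict

LINES = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]

def winner(board):
    for a, b, c in LINES:
        if board[a] != " " and board[a] == board[b] == board[c]:
            return board[a]
    return None

class MatchboxPlayer:
    def __init__(self):
        # one "matchbox" per board position seen so far:
        # box label (the board as a string) -> {move: bead count}, 3 beads to start
        self.boxes = defaultdict(lambda: defaultdict(lambda: 3))
        self.history = []  # (box label, move) pairs drawn during this game

    def move(self, board):
        label = "".join(board)
        legal = [i for i, cell in enumerate(board) if cell == " "]
        # draw a bead at random: each move appears once per bead in its box
        beads = [m for m in legal for _ in range(self.boxes[label][m])]
        choice = random.choice(beads)
        self.history.append((label, choice))
        return choice

    def learn(self, reward):
        # simplified bookkeeping: +1 bead per move played on a win, -1 on a loss,
        # never dropping below one bead so a box is never emptied
        for label, m in self.history:
            self.boxes[label][m] = max(1, self.boxes[label][m] + reward)
        self.history = []

def play(menace):
    # one game: the matchboxes play X, a random opponent plays O
    board = [" "] * 9
    for turn in range(9):
        if turn % 2 == 0:
            board[menace.move(board)] = "X"
        else:
            empties = [i for i, c in enumerate(board) if c == " "]
            board[random.choice(empties)] = "O"
        w = winner(board)
        if w:
            return w
    return "draw"

menace = MatchboxPlayer()
for _ in range(5000):
    result = play(menace)
    menace.learn({"X": 1, "O": -1, "draw": 0}[result])

Nothing in there needs electricity; swap the dict for literal boxes and beads and the same "thought" happens on the kitchen table.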

OK, so what? It sounds like I'm speaking out against literacy. Thing is, I don't see any way to live outside this kind of AI, and I don't want to. But there are certain things a natural human mind can do that a Turing intelligence cannot. Obvious ones like "wear out," "suffer pain," and "die." (OK, a Turing mind can obviously die, but it would die more like an ant colony's death, less like a human's.)

But the one I was really thinking about has to do with sanity. A natural human intelligence can question its own sanity in a way that no organization can. It has to do with cycles of metabolism and attention span. In the time it takes one of us to question our own sanity, we can still draw breath, eat and drink, maybe even fire off a reproductive urge. But for a church, a monarchy, or a corporation to question its own sanity is tantamount to questioning its own existence. It has no body of its own, so it borrows human minds moment by moment. As soon as the component minds start questioning the need to be part of the larger organ, the organ itself shrinks even before any sort of choice is made.

That's all well and good, but the crops still need to be watered, the rain still needs to be deflected, and sometimes you still gotta have a doctor to call. We don't know how to replace these Turing minds with anything better, and for each of us to be materially self-sufficient would be a huge waste of everyone's time.

If I want to do something interesting with this idea, I have to ask myself whether there might be a more effective way of organizing ourselves to meet human needs. Yet every time someone comes along promising to make things better, it seems some supposedly undesirable group of humans needs to be displaced, murdered, or blamed. We can't have an inside without having an outside, and the outside doesn't have meaning unless there are people to be ostracized.

Right now the only convergent factor that springs to mind has to do with our agreement to limit our own power: ideas like the separation of church and state, the separation of legislative power from judicial and executive, and fair-trade practices. What bugs me is how, each time we set those agreements into place, some clever bloke down the line figures out a loophole and exploits it, and our grandkids have a whole new kind of tyranny to contend with.

I don't know how to do it, but I know what I want to do: any time humans decide we're going to organize to scratch an itch or solve a problem, it would be great if natural human intelligence could be at the nucleus of that cell. You could still have Turing intelligence flitting about most every which way, but it would never develop an identity outside a human body. Kind of like the dumb but decent machinery of Zion, compared to the clever but evil machinery of the AI agents in The Matrix.

Right now, that seems to tell me that you can't form an institution to correct the excesses of every other institution. There's no place to fit the homunculus.

I think this is in some way related to the moral obligations of the officer class compared to the corresponding obligations of the enlisted class in any military. But I'm outta steam.