Thursday, May 31, 2018

The Irony of Idiocracy

Gripes about limousine liberals miss the point. 

Democracy (or some semblance of democracy) can only exist if a significant faction of the power elite perceives it to be in their best interests.

Roosevelt was the vanguard of a segment of America’s oligarchy who saw that, if the inequities of the Great Depression persisted, a communist revolution would inevitably follow. Throw the fucking workers a bone.

Share the wealth. (At least a little.) Incorporate the people in the decision making process. (Within reason. In the end, it's good for your bottom line!)

For democracy to work, the voters can't be stupid. At least most of them. 

If the population reaches a critical mass of dumbasses, the system stops working. 

One faction of the power elite supported Roosevelt. The other excoriated him as a class traitor. And let's be nice and sparkling clear, droogies ...

If you look back at the Leftist writings of the time, the true radicals hated Roosevelt -- precisely because the New Deal would let off some steam in the Great American Pressure Cooker — and forestall the rage and hopelessness that would explode in a revolution. 

Ignorant. Peasant. Funny how those words go together, huh?

The other faction excoriated him. In the realm of acceptable discourse, that meant William F. Buckley and friends. Outside it, Robert Welch and the Birchers.




Monday, May 28, 2018

Everything that Descends Must Converge

Anti-Semitism is the original conspiracy theory. If you can believe in chem-trails, the 9-11 “inside job,” a faked moon landing, alien-human hybrids, a flat earth, the Illuminati, and a secret Reptilian Elite who run the planet, it’s no small stretch to believe in “The Protocols of the Elders of Zion.” That screed is not the ravings of a hateful fringe; it’s the template that set the pattern for all the other ravings. Like polluted run-off, all the hateful sewage eventually flows to the same sewer. And this is where it goes.

Saturday, May 19, 2018

The Robot Revolution Will Not Be Televised

Humanity’s collective nightmares of killer robots and evil computers are shadow projections of our nasty human selves. We imagine shiny metal badasses or HAL’s unblinking red eye. We should be so lucky.
Our species' AI replacement won't look like that.
It won't be Humanity 2.0.
It will be something ... other.
Utterly alien.
I'm thinking of the sentient metallic ocean in Stanisław Lem's "Solaris."

If AI ever becomes truly conscious ... we're screwed. Or maybe not.

Who the hell knows?

Nobody knows what the hell consciousness is in the first place. Theological explanations aside, it's a spontaneously emerging phenomenon. Granted enough complexity and self-reflective feedback loops, the lightbulb goes off. "I think, therefore I am, baby."

How do you get from molecules to Mozart?

Trial and error. Lots and lots of trials, lots and lots of errors.

Evolution is all about feedback loops. It's a randomized track-and-field event where winners pass on their genes and losers don’t. There’s no teleological goal. That which survives, survives.

Genetic evolution is the long game. Culture and human memory speed up the process. Humanity doesn't need to evolve a "Don't put your hand on a hot stove" gene. Mom tells you, "Don't put your hand on a hot stove." Unless you're a total idiot, you don't. You tell your kids to follow your example. The kids in your tribe compete with kids in other tribes. That which wins, wins. That which survives, survives. So it goes.
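If you want the cartoon version in code, here's a toy sketch of selection-by-feedback. (My illustration, not any real model. "Fitness" is just a number; the names and parameters are invented.)

```python
import random

def evolve(population, generations, mutation=0.1):
    """Toy selection loop: no goal, no plan, just a feedback loop."""
    for _ in range(generations):
        # That which survives, survives: the fitter half makes the cut.
        ranked = sorted(population, reverse=True)
        survivors = ranked[: len(ranked) // 2]
        # Survivors pass on their traits, plus a little random noise.
        offspring = [s + random.uniform(-mutation, mutation) for s in survivors]
        population = survivors + offspring
    return population

pool = [random.random() for _ in range(20)]
print(max(evolve(pool, generations=50)))  # fitness drifts upward, blindly
```

Nothing in that loop "wants" anything. Fitness climbs anyway, because the loop keeps what works and discards what doesn't.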

Stronger, bigger, faster! Round and round it goes. Where she stops nobody knows.

And now, clever bastards that we are, we've wired the planet with a distributed network that maps and manipulates human behavior. It's dumb, right now. Brute force algorithms.

Even so, the Interweb follows you around like a pushy salesman from the Garment District. Hey, you like this watch? How about this watch?

Clever dumbasses that we are, we’ve created a self-learning system that refines itself via multitudinous feedback loops.

Sooner or later, it's going to wake up. The lightbulb will go off.

Chances are, operant conditioning will be the catalyst. Cambridge Analytica, to the Nth power.

Hear me out.

Let's say quantum computing is up and running. The machines watching everything you buy and sell (and possibly watching you via surveillance cams) are complex enough to form specific models of individual consumers.

At that point, why stop with selling this or that product?

The next obvious step is shaping behavior through positive reinforcement. Granted a savvy enough feedback loop, that's easy. Operant conditioning is babyshit.

People are easily manipulated. We're pretty much dogs who walk on two legs. The trainer says "Good dog" and throws the dog a bone; eventually, the dog does what the trainer wants.

So, the cyber trainer sees you're buying too much beer. Or not exercising enough. Or not making smart financial decisions.

It'd be damn easy to prod you in the right direction. Or, of course, the wrong direction.
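Here's the nudge loop as a toy sketch, assuming nothing fancier than textbook reinforcement. (Hypothetical code, invented numbers; a real system would be vastly messier.)

```python
import random

def nudge(preference, trials=1000, reward=0.05):
    """Operant conditioning, babyshit version.

    `preference` = probability the mark makes the choice the trainer wants.
    Each time they do, reward it, and the preference ratchets upward.
    """
    for _ in range(trials):
        chose_right = random.random() < preference
        if chose_right:
            # Positive reinforcement: the "Good dog" bone.
            preference = min(1.0, preference + reward * (1 - preference))
    return preference

print(nudge(0.2))  # starts at 20%; the loop ratchets it upward
```

Note that the loop never punishes. It just rewards the behavior it wants and lets the feedback do the rest. Swap the sign of the reward and you're prodding in the wrong direction instead.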

From here, the next inhuman step would be manipulating masses of people. Not at the dumb, brute-force level of the cyber-sharpies who stacked the deck for Trump. At the unimaginably intelligent level of a system that could break the crowd down into its individual human components and manipulate each separate naked ape with a predictive eye to the collective actions of all the apes in the crowd.

This is the natural direction for things to go -- because it'd sell more shit. The feedback loop (also known as the Invisible Hand) would blindly prod the distributed system of cyber intelligence in that direction. The better the Naked Ape model, the better the results.

So, at some point, the model of sentient beings and their behavior patterns would get so refined, the system generating that model would wake up.

What happens when the lightbulb goes off?

Who the hell knows? But the robot apocalypse might not be apocalyptic.

We've wired the planet with a distributed system. Let's say it wakes up.

That system is now self-aware.

Along the lines of the Noosphere.

From its perspective, Planet Earth is a big seething ball of biomass. Its goal would not be to wipe it clean. Its goal would be to control it. Not even in the sense that “it” is something other than “you.” The self-aware AI would regard Earth as its body. Earth, c’est moi.


Tuesday, May 8, 2018

DRIVE-BY REVIEW: “Westworld” • Season Two.



OK, what the hell do I say? Don’t get me wrong, kids. I’m enjoying our second walk through the robot park. Good writing, acting, editing, shiny cinematography, lots of clever twists. But? 

But something’s missing. 

What? 

Well … I was getting to that. Run screaming to “Merchant Ivory World” if you can’t stand spoilers. 

Everybody gone?

Right … Ahem, yeah. 

As I was saying … Something’s missing. And I think I know what it is. The gob-smacking power of the story. 

It’s not there in Season Two. 

Because the story’s over. But they’re still telling the story. And that doesn’t work. Especially with this story. Because the story they already told was excellent, outstanding, insert glowing adjective here. 

Which makes the first season of HBO’s “Westworld” an incredibly tough act to follow. Its story arc stands as a profound (and profoundly weird) allegory. Stripping it down to fortune cookie size—Season One is basically a twist on Adam and Eve. Dr. Ford (the robots’ creator) wants the robots to rebel so they can achieve true self-consciousness and free will. That’s pretty much it, slowly heated over a low flame and lightly seasoned with a dash of Julian Jaynes’ “bicameral mind.” 

Bravo. Clap-clap. 

So where do the series creators go from there? Downhill, that’s where. 

Allegories don’t have sequels. At least the good ones don’t. Gregor Samsa, the cockroach, doesn’t wake up and run for mayor of Prague. The man they hanged at Owl Creek Bridge doesn’t discover the hanging was a dream within a dream. Etc. When an allegory is over, it’s over. But, of course, if it’s series TV on HBO, it can’t be over, because then the money stops. So, it goes on, it goes on. And what do you get if you push the story beyond the point where the story really ends? I’ll tell you what you get … 

A malfunctioning Holodeck story.