We are a commune of inquiring, skeptical, politically centrist, capitalist, anglophile, traditionalist New England Yankee humans, humanoids, and animals with many interests beyond and above politics. Each of us has had a high-school education (or GED), but all had ADD so didn't pay attention very well, especially the dogs. Each one of us does "try my best to be just like I am," and none of us enjoys working for others, including for Maggie, from whom we receive neither a nickel nor a dime. Freedom from nags, cranks, government, do-gooders, control-freaks and idiots is all that we ask for.
Wednesday, February 22. 2012
I'm sure Isaac Asimov was not a fan of capitalism, let alone the Republican Party (or even Libertarians). The movie I, Robot was based on his series, primarily his work on the Three Laws of Robotics and some outcomes that may occur with their implementation. In some ways, the movie was a criticism of corporate culture and government becoming too interlaced. US Robotics becomes an overly powerful organization with deep ties to government, ultimately making the robot takeover very difficult to slow or stop. On the other hand, it's a criticism of Progressive overreach. Perhaps unknowingly.
There is one scene which reminded me of our current government's goals. The idea that we have politicians or bureaucrats who 'know better', and can guide us to a better place. All we have to do is agree to let them, and while many will be harmed, it will be for a 'better good'.
The Three Laws guide our hopeful Progressive overlords. I, for one, do not welcome them.
They take away food we send to school with our children, tell us they can provide us with better health care, and insist they alone can protect us from Anthropogenic Climate Change and 'fix' our economy. In doing so, they set the stage, whereby we become willing to give up the benefits of risk which are inherent in life, in order to feel 'safe and secure' in the comfort of a nanny state.
It is odd that Asimov assumed it was the ignorant people, the Moral Majority, the self-righteous, who would take away our freedoms. In a sense he was correct. I'm sure the Religious Right would be quite pleased to use politics to deprive us all of individual freedoms as they see fit. On the other hand, I'm certain he would understand that the current crop of 'intelligentsia' are just as despicable, in this regard.
This from a man who knew better:
I think you're wrong about Asimov. He was very close to a libertarian (as your quote indicates), and liked capitalism.
The movie doesn't necessarily resemble the book, and I've read others speculate that Asimov would have hated the result of the movie. I'm not sure if he would have hated it, but he certainly was not pleased when institutions gained too much authority over the individual.
He had Libertarian qualities, but completely despised the Moral Majority.
He had many quotes demeaning them, and felt they should not be allowed a vote.
Here is one he had on the MM:
“Anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that 'my ignorance is just as good as your knowledge.'”
He was a fan of the New Deal, became a Democrat, and never varied from the position, even late in life. He despised Nixon, was friendly with Abbie Hoffman (though felt he was morally adrift) and while his writings bordered on Libertarian thought, he never would have accepted it because he opposed corporations.
I agree, he probably would have hated the movie. But I actually enjoyed it because VIKI's speech outlines where Progressives go wrong.
The movie I, Robot has everything to do with Hollywood's worldview, not Asimov's. If you read the novel, it would be unrecognizable from the movie; they share almost nothing in common. None of the corporate themes were there. Beyond the core murder mystery, the main theme I can remember is whether there was some point at which having robots do everything for us might undermine our vitality as a species.
It conformed to Asimov's, as well.
I did write that it was based on his book on the three laws and some of the potential outcomes. I didn't say it was from any of his books specifically.
But is VIKI's monologue any different from Asimov's commentaries about the ignorance of people? Particularly people he considered intellectually inferior to him, personally? These would include people of faith - whom he regularly railed against as hypocrites.
While I listen to this monologue and see the 'ignorance' as existing on the side of the Progressives, Asimov planted his flag with them every time.
But if you want to assume Asimov would not have agreed with the Progressive world view (which he regularly showed support for, voting for and promoting Democratic causes), and make the case that it's Hollywood's worldview instead...well, I'll go for that too.
Either way, VIKI's monologue shows the hypocrisy of Progressive viewpoints. I don't care if anyone believes Asimov was a Libertarian (he wasn't), the whole point of the post is that VIKI outlines the Progressive mantra - 'we know better, and we can save you from yourself' - and the Progressives (in Hollywood or otherwise) failed to notice that in demonizing VIKI, they demonized their own worldview.
I don't know where to start here, so I'll just start with the most obvious, and perhaps the most egregious, of the statements you and Robotwhatever made.
First of all, the Three Laws of Robotics are not universally hated. In fact, they are a staple of science fiction when it comes to robotics. Stories using the Three Laws are not just another Faustian tale of creator destroying creation - the stories become much more complex as long as artificial intelligence and behavior are restrained by the Three Laws.
The history of the Three Laws began evolving in the early Golden Era of science fiction, when Lester del Rey ("Helen O'Loy" - 1938) introduced the concept. Otto Binder expanded del Rey's idea in a story called "I, Robot" (as it happened), in which the very first law codified by del Rey was expanded to "A robot must never kill a human, of his own free will."
Asimov was looking in another direction with his formulation. In fact, his very first attempt at it, "Robbie", was rejected by John Campbell, editor of "Astounding Science-Fiction", who said the whole idea wasn't fully formed - yet. Frederik Pohl eventually bought "Robbie", but by that time, in a conversation with Asimov, Campbell had pointed out that the Three Laws Asimov had in mind were similar to the principles used by tool and die designers at the time - to wit:
1 - A tool must not be unsafe to use. Hammers have handles, screwdrivers have hilts.
First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2 - A tool must perform its function efficiently unless this would harm the user.
Second Law: A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3 - A tool must remain intact during its use unless its destruction is required for its use or for safety.
Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
Thus was the birth of the Three Laws of Robotics, and they make perfect sense. They have also been expanded on and developed, including the Zeroth Law, which places humanity above the individual human as long as humanity will benefit from the robot's actions. In that sense, the article's author may have a point, but again, as constructed, the Three Laws take precedence over the Zeroth Law unless and until a deadlock occurs, at which point the Zeroth Law takes control. You would need to read the Elijah Baley stories to make sense of this - it is complex.
Robotolizer uses a case which makes absolutely no sense and betrays a complete and total misunderstanding of the Three Laws concept. The statement is - "Am I too understand that if I and a human female carrying a baby are walking down the street and confronted with, say a rampaging elephant, I'm not supposed to toss her in front of it to create a diversion so I can save my much more valuable metallic hide?" (sic).
WTF? That's exactly what the Laws prescribe. The machine is subordinate to the biological construct - the imperative being that the "metallic hide" is under an absolute compulsion to sacrifice itself to benefit the human. That is the whole point of having the Three Laws in the first place. You can't even make a case for the application of the Zeroth Law in that situation.
The fact that Asimov was a progressive humanist, atheist and "rationalist" is not relevant to the Three Laws. That is not in dispute - even from me who actually met him more than once and found him to be a total and complete ass. I can tell you a few stories about Asimov from his contemporaries (that my Dad knew and was friends with) - they weren't all that impressed with him either.
In short, railing against the Three Laws as some kind of progressive thought falls totally short of the mark. Confusing the two demonstrates a lack of understanding of the Three Laws and their use in controlling machines - a use that was really a response to humanity's paranoia about a machine-controlled future.
Don't confuse the ramblings of a progressive poltroon with the genius of the Three Laws of Robotics.
I wasn't railing against the Three Laws. Did I say I was? I haven't made any commentary against them. I didn't even say the Three Laws were a Progressive agenda, just that a particular conceptualization of them, as described by VIKI's monologue, is very similar to the Progressive vision.
Robotolizer may have misused or misunderstood the Three Laws to make his point, but his broader points are accurate - there is a natural extension of the laws which could lead to sentient robots becoming primary rulers. Even the Zeroth Law doesn't eliminate that possibility; it merely limits the opportunity for robots to overthrow humans physically. The assumption of robots controlling human lives isn't pre-empted entirely by the Zeroth Law, as long as that control is in humanity's best interest. The Progressive Agenda (we'll tell them what's good for them, and they should live that way because it's in their best interest, and we'll do everything in our power short of directly harming humans to get them to do it) doesn't necessarily preclude harming humans, if the overall benefit to human society entails the possibility of harm to the one or the few who reject this agenda. This harm wouldn't occur by the hands of robots (one huge flaw in the movie, in my opinion, was when the robots became violent), but through humans rejecting robotic rule and reacting violently. Interestingly similar to the reaction of anyone trying to get Progressives out of our lives.
I didn't know, nor do I care, what Asimov was like personally. I love his stuff, not necessarily him. He was a great writer and thinker. I found "Quasar, Quasar, Burning Bright" to be a very eye-opening way to look at life in general. I should re-read it, because I think he did a good job explaining climate change before it was fashionable.
The Three Laws are 'perfect' (and this is made clear in the movie); there are no flaws. But what you did was confuse my comparison of VIKI's monologue to the Progressive Agenda with a criticism of the Three Laws by an author I happened to link to (primarily because I agree that the premise of the Three Laws can lead to a mindset remarkably similar to the Progressive Agenda).
If robots were capable of seeing that humans regularly harm themselves, and their job is to protect humans in some way without directly harming humans, then they could potentially seek to gain some form of dominance in order to 'protect' us. Isn't that the Progressive mindset?
You certainly implied criticism of the Three Laws by referencing Robotolizer's stupid rant and you made it a part of your central post theme. I believe you stated, please correct me if I'm wrong, that: "The Three Laws guide our hopeful Progressive overlords. I, for one, do not welcome them."
What annoys me to no end is the twisting and distorting of great themes and commentary to fit a particular political agenda. That somehow the Three Laws of Robotics (or seven, if you include later extensions by other authors) are part of a "progressive" agenda, based on a distorted and completely inaccurate portrayal of the original story line. This same type of distortion came with "Starship Troopers", in which Heinlein's world was made into some kind of fascist crypto-state instead of the libertarian ideal of personal and social responsibility.
If you want to rail and rant about the movie, then fine - that's fair game, because it was a bastardization and a "re-imagining" of the original. But the movie had about as much of a relationship to the original stories (and Three Laws) as you have with the Obama political machine - which is zero.
Ah...I used that line sarcastically. I guess I should've been more clear.
No, I recognize the immutability of the Three Laws. I just found that monologue from the movie to mesh extremely well with the Progressive view.
Since it was 'based' on Asimov's work, and I was aware of Asimov's political views, I thought it made for a nice allegorical review.
As I said, Robotolizer may have misunderstood or misstated something - which is fine, really. As I pointed out, there is at least one way (and possibly more) the Three Laws could be used to bring about the outcome the movie suggested (though without the robot-on-human violence).
And, I'll add if you think that's impossible, then I'd opt for Arthur C. Clarke's view:
"When a distinguished but elderly scientist states that something is possible he is almost certainly right.
When he states that something is impossible, he is very probably wrong."
Asimov's psychohistory in the Foundation Series certainly suggests he thought smart people could run things for others - a capital P Progressive view consonant with Dewey or TR. His Guide to the Bible, his history books and Our Angry Earth were liberal tripe, though well-written.
He consistently cast military and corporate figures as villains and academics as heroes. There were exceptions. He did seem to worry that government could get too large and intrusive.
Not quite consistently. In "Foundation and Empire" he went fairly deeply into the distinction between the scientific method (reliance on experiment) and what he called the "Academic Method" (the argument-from-authority fallacy), and how the First Empire's science establishment were all following the path of the Academic Method because it secured their jobs and the prestige that went with them.
I see a strong parallel with ClimateGate here (not to mention the State Science Institute in Atlas Shrugged).
I seem to recall that the credits for the movie state something to the effect that the movie uses the title from the book. The book has no correspondence to the movie outside of that.
Asimov had strong utopian tendencies. While amusing, his robot stories never actually explained how robots would work in a functioning society.