Notes from the peanut gallery
Jun. 17th, 2008 12:31 pm
Happy Birthday, Marge, Dan, and Diana!
I was reading this article, which is interesting, but I have several objections:
1) You can't use Nietzsche's brain as an example because of what we like to call a "confounding variable," namely that he was CRAZY FROM SYPHILIS.
2) 2001, YOU'RE DOING IT WRONG:
I’m haunted by that scene in 2001. What makes it so poignant, and so weird, is the computer’s emotional response to the disassembly of its mind: its despair as one circuit after another goes dark, its childlike pleading with the astronaut—“I can feel it. I can feel it. I’m afraid”—and its final reversion to what can only be called a state of innocence. HAL’s outpouring of feeling contrasts with the emotionlessness that characterizes the human figures in the film, who go about their business with an almost robotic efficiency. Their thoughts and actions feel scripted, as if they’re following the steps of an algorithm. In the world of 2001, people have become so machinelike that the most human character turns out to be a machine. That’s the essence of Kubrick’s dark prophecy: as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.
First of all, STFU, because Dave is totally awesome, and it's his desire for new knowledge and understanding that allows him to carry on against logic -- because he's totally screwed -- and enables him to achieve enlightenment at the end of the movie (that was the acid trip part, BTW).
Second of all, HAL's tragedy occurs because he's NOT human -- he can't cope with the simultaneous demands of being a fact-based machine while being expected to lie to Dave about the mission's objective. He may be super-intelligent, but he doesn't have the mental flexibility that allows humans to lie and still be optimistic and creative. So really, the point of the movie is not that AI will be more human than us, it's that we can't EXPECT it to be as human as us, or else we'll get ejected into space. TRY WATCHING THE MOVIE MORE CLOSELY, FOOL.
3) So really, the problem boils down to "our brains are becoming more attuned to the internet at the expense of our ability to absorb longer sources of information, like books, because we only read the internet"? I think the obvious solution is READ A FUCKING BOOK NOW AND THEN SO YOU CAN DO BOTH. See? You can achieve maximum efficiency IF YOU AREN'T BEING A LAZY DUMBASS.