04-29-2014, 12:45 AM
I saw it last night and very much enjoyed it. I thought it was the best possible transhumanist film they could have made: it had very good visual effects and most of the plot was quite good. Aside from the plethora of minor bad points that any film will have (like the fact that he's a scientist who doesn't like public speaking but is so famous people want his autograph), there were three major plot points that I feel would have been so much better to overcome rather than relying on cliché:
1) The source code virus. Ultimately he is killed because his friend, who modified the brain emulation software, knows of a bug in the source code he can create a virus for. Yet minutes after the emulation software starts working and the uploaded Will can control the computer with his thoughts, he begins to expand, learn and rewrite portions of his own code (the classic hard take-off scenario). It seems unlikely that he wouldn't have found and corrected any flaws at that point, or at least done so towards the end of the film when he deduces the plan to create the virus.
2) No social intelligence. Probably my biggest worry about the film before seeing it, and my biggest gripe. It's established that Will has become a super-intelligence capable of absorbing knowledge as fast as his bandwidth will allow, spotting patterns no one else can, and developing technologies far beyond human science. Yet for all this raw intelligence he is somewhat socially stupid. He even seems to regress from how he was as a human, relying on crude biological cues like heart rate to figure out his wife's emotions and not anticipating that she will be freaked out when he remote-controls a human body and tries to reach out to her. This feeds into the central point of the movie: people are scared of him and want to destroy him, but he underestimates this and never manages to figure out the best way to calm their fears and let him help. A super-intelligence should be able to look at a group of individuals (even a whole society) and in a short time build a model of them better than humans can make of their closest friends. Rather than running, hiding and relying on them to come around after they see his technology, he should have been planning public engagements, awareness campaigns, meetings with government figures, demonstrations of his abilities, etc., all to make people accept him.
3) The death of the internet. Talk about a pyrrhic victory! The humans wipe out every machine on Earth by destroying Will. I'd be surprised if fewer than 80% of us were dead after a few months if that really happened. Think of all the runways that can't guide planes down, all the hospitals no longer with power, all the emergency services that can't reach people, and perhaps biggest of all, the collapse of food delivery networks. Coming from someone who lives on an island that has had to import food to sustain itself for decades, the idea of global communications and transport disappearing overnight and not coming back strikes me as an apocalypse. It really bugs me how films and other media make the erroneous assumption that because we lived without computers and electricity before, we'd be fine without them. Err, no. Our society is totally dependent on those things now; we don't have systems set up to run without them. We'd have to invent them from scratch, and in the meantime the failure of just-in-time business (specifically food) would kill most people.
So those were my major problems, but as I said, I very much enjoyed the film regardless. I thought the nanotechnology effects were some of the best I've seen in science fiction (it looked how I imagine hyperfog) and didn't go overboard in terms of technobabble. I liked how scanning the brain wasn't enough and that it took a breakthrough to make software that could emulate a brain without destroying it and process its inputs and outputs. I liked that they had to run a lot of phenotypic tests to get the software to work, rather than ignoring the fact that the brain and mind are intimately linked to the body. And I liked that when he was first uploaded, Will's wife and friend had no idea what they were dealing with, because all the software was outputting was noise that took time to stabilise (showing that Will didn't have an instant intuitive understanding of his new environment but required time for the software running him to integrate his senses and motor control).