Fincher
Coming Undone
Wasn't it because it was too perfect? They didn't buy it?
Okay, to start with, I should point out that he never said the first Matrix contained an anomaly; it was actually the third Matrix. My earlier post mixed the first and third versions up.
The first Matrix I designed was quite naturally perfect, it was a work of art, flawless, sublime. A triumph equaled only by its monumental failure. The inevitability of its doom is apparent to me now as a consequence of the imperfection inherent in every human being. Thus, I redesigned it based on your history to more accurately reflect the varying grotesqueries of your nature. However, I was again frustrated by failure. I have since come to understand that the answer eluded me because it required a lesser mind, or perhaps a mind less bound by the parameters of perfection. Thus, the answer was stumbled upon by another, an intuitive program, initially created to investigate certain aspects of the human psyche. If I am the father of the Matrix, she would undoubtedly be its mother.
Perfection didn't work. Imperfection still didn't work. Being given a choice worked at first, because most chose the Matrix. However, even if those who chose otherwise were a statistical minority, something with a very low chance of destabilizing the system would, given enough time, become a near certainty ("an escalating probability of disaster"). The odds of an average high school basketball player making a single full-court shot are very low; the odds of him making at least one out of a million full-court attempts are astronomically high. Eventually, due to people making their own choices, the Matrix would fail, and the Architect couldn't accept that imperfection.

As I was saying, she stumbled upon a solution whereby nearly ninety-nine percent of the test subjects accepted the program provided they were given a choice - even if they were only aware of it at a near-unconscious level. While this solution worked, it was fundamentally flawed, creating the otherwise contradictory systemic anomaly, that, if left unchecked, might threaten the system itself. Ergo, those who refused the program, while a minority, would constitute an escalating probability of disaster.
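The basketball analogy is just the standard "at least one success in many trials" calculation. A quick sketch, using a made-up per-shot probability (the movie never gives numbers), shows how a tiny chance per attempt becomes a near certainty over enough attempts:

```python
# Hypothetical numbers for illustration: if a single full-court shot
# succeeds with probability p, then the chance of at least one success
# in n independent attempts is 1 - (1 - p)**n.

def prob_at_least_one(p: float, n: int) -> float:
    """Probability of at least one success in n independent trials."""
    return 1.0 - (1.0 - p) ** n

one_shot = prob_at_least_one(0.01, 1)            # a single attempt: still ~1%
million_shots = prob_at_least_one(0.01, 1_000_000)  # a million attempts
print(f"{one_shot:.2f} {million_shots:.6f}")
```

Even at a 1% chance per shot, a million attempts make at least one success effectively certain, which is the Architect's "escalating probability of disaster" in miniature.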
Instead, he turned it into a choice he believed that they would always make in the machines' favor. He didn't understand choices, he was wrong, and it only took six iterations for this new approach to fail.