Superwoman Prime
Here's a script excerpt from a Through the Wormhole episode titled Will We Become God?
This is a video clip from a Star Trek: The Next Generation episode titled Measure of a Man, where Data's sentience is put in serious question.
What would it take for you to view a machine as self-aware? When do they evolve from object to person?
Should they enjoy the same rights as humans, or have their own unique set?
Freeman: We have already crafted new life-forms, but could our creations ever have conscious souls? Most scientists consider conscious experience an elusive property that may never be fully understood, much less artificially created.
But neuroscientist Melanie Bolling, from the University of Wisconsin-Madison, doesn't think it's so mysterious.
In fact, she thinks it can be boiled down to a single number.
Bolling: What we try to do in our work is to quantify consciousness.
By quantifying, we ask, "How much understanding is there? How much consciousness is there in a system?"
Freeman: One way to understand consciousness is to observe what happens when it fails.
Different brain injuries have dramatically different consequences.
We learn from neurology that there are some brain lesions that make you unconscious and some that don't.
Freeman: When damage occurs in the cerebral cortex, body organs like the heart and lungs may continue to function.
However, a patient won't show any awareness of his or her environment.
[...]
Freeman: Melanie believes that consciousness arises in the cortex because its neurons are not isolated bulbs.
They form an interconnected network that communicates.
[...]
Bolling: So, this interconnectedness is thought to be important for consciousness to arise in the brain.
Freeman: This idea led Melanie to develop a formula that will allow us to measure consciousness.
It calculates the degree of interconnectedness of neurons in any system.
The answer is a number represented by the Greek letter Phi.
The more conscious something is, the greater its value of Phi.
The human brain, with its trillions of neural connections, has a large value of Phi.
An earthworm's Phi is many orders of magnitude smaller, but it's still not zero.
Bolling: So one of the implications of the theory is that consciousness is not necessarily found only in humans.
We could use Phi to measure the level of consciousness in many different systems, whether living beings or computers.
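To give a concrete feel for what "measuring interconnectedness" can mean, here's a toy sketch. This is NOT the actual Phi calculation from the show (which is far more involved), just a crude stand-in I made up for illustration: run a tiny boolean network from every possible starting state, then compute the mutual information between the two halves of the final state. A genuinely interconnected network scores above zero; a network made of two disconnected halves scores exactly zero.

```python
from itertools import product
from math import log2

def step(state, adj):
    """One update of a toy boolean network: node i turns on if any
    node j with an edge j -> i (adj[j][i] == 1) is currently on."""
    n = len(state)
    return tuple(
        int(any(state[j] for j in range(n) if adj[j][i]))
        for i in range(n)
    )

def half_mutual_info(adj, n_steps=8):
    """Crude 'integration' proxy (not real Phi): mutual information
    between the left and right halves of the network's state,
    over the states reached after n_steps from every initial state."""
    n = len(adj)
    counts = {}
    for init in product([0, 1], repeat=n):
        s = init
        for _ in range(n_steps):
            s = step(s, adj)
        counts[s] = counts.get(s, 0) + 1
    total = sum(counts.values())

    half = n // 2
    p_joint, p_left, p_right = {}, {}, {}
    for s, c in counts.items():
        l, r = s[:half], s[half:]
        p_joint[(l, r)] = p_joint.get((l, r), 0) + c / total
        p_left[l] = p_left.get(l, 0) + c / total
        p_right[r] = p_right.get(r, 0) + c / total

    return sum(p * log2(p / (p_left[l] * p_right[r]))
               for (l, r), p in p_joint.items())

# A 4-node ring (every node tied to both halves) versus
# two disconnected pairs (halves that never interact).
ring  = [[0,1,0,1],[1,0,1,0],[0,1,0,1],[1,0,1,0]]
split = [[0,1,0,0],[1,0,0,0],[0,0,0,1],[0,0,1,0]]

print(half_mutual_info(ring))   # positive: the halves carry shared information
print(half_mutual_info(split))  # zero: the halves are statistically independent
```

The point of the toy is only that "how connected is this system, informationally?" can be turned into a single number, which is the spirit of what Phi is doing at much greater sophistication.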