What defines us is our level of consciousness, or rather, our lack of it. That seems to be the premise, since it precedes even the definition of purpose. Which weighs more: what we have or what we lack?

For AIs, does this mean that an AI's level of consciousness will determine its free will? Is free will an emergent behavior, arising once a system reaches a certain level of consciousness?
