As part of "ColLaboratoire - the CogNovo Summer School", Susan Blackmore gave a talk entitled "Consciousness in treme machines?": collaboratoire.cognovo.eu/speakers#sp2
Universal Darwinism allows that one replicator (information copied with variation and selection) can build on the products of another. The first replicator, genes, constructed phenotypes (gene machines), and one of these (our human ancestors) began copying a new sort of information by imitating sounds, gestures, and technologies (memes). This transformed these animals into meme machines (us).
A similar shift may be happening again because we humans have created products that can copy, vary and select another new kind of information: digital information copied with high fidelity in computers, phones, servers and so on. I have called these temes or tremes (sorry, but there is no perfect name).
At each level, intelligence emerged through increasing cooperation and copying between originally distinct units, e.g. in multicellular organisms and brains. Copying memes between individuals in culture increased intelligence again. The increasing copying of digital information between treme machines is the same process happening once more – a bottom-up Darwinian process leading inevitably to intelligence that is widely distributed and out of human control. Human input is still important now, but as the system grows it will become less so.
Could this intelligent system be conscious? That depends on what you mean by being conscious, but my own view is that consciousness is an illusion created in systems that model themselves and their own capabilities, creating an inside and an outside: an observer and an observed world, a controller and a controlled world. Our human brains do precisely that in modelling selves as embodied agents and owners with a first-person perspective. I suggest that any system that does this will believe it is conscious. We can now ask what is needed for the illusion of consciousness to emerge in treme machines or large networks of such machines, and what the consequences might be.