Looks like an attempt to turn Transhumanism into religious gibberish.
Awareness of the probability of the singularity is a good idea, but this isn’t something mystical or spiritual. The singularity is what many see as the inevitable result of our rapidly accelerating technological development, especially in computing: the expected birth of true machine intelligence and, soon after, the highly probable creation of intelligence greater than human.
With such developments at our disposal, or more probably beyond our control, it is very difficult to predict what might happen next. Given such rapid technological and genetic progress, it is also assumed that man will readily take advantage of these developments and use them to enhance his own capabilities far beyond their current limits.
Curing the disease of aging and transferring neural patterns to more resilient media could well lead to unlimited lifespans and the dream of actual immortality, although that is a stretch for the moment.
The prediction of 2012 seems too early. The year 2040 has often been quoted as more realistic, but the work IBM is currently undertaking on Blue Gene, a computer that should have little problem matching and then surpassing the raw processing power of the human brain by 2007, suggests that 2040 is too far out. However, the current plans for Blue Gene do not yet appear to be directed at brain emulation.
Even so, it is very difficult to predict what will happen as technology continues its apparently exponential trend towards super-intelligence and technologically enhanced humans.
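The exponential argument can be made concrete with some rough back-of-the-envelope arithmetic. As a sketch only: the figures below (a brain estimated at ~10^16 operations per second, top machine speed of ~10^12 ops/sec around 2000, and an 18-month doubling time) are common assumptions of the period, not claims from this post, and changing any of them shifts the projected date considerably.

```python
# Rough projection of when raw machine compute passes brain-level
# processing, under assumed (not established) starting figures.

BRAIN_OPS = 1e16        # assumed ops/sec of the human brain
START_YEAR = 2000
START_OPS = 1e12        # assumed fastest machine circa 2000 (~1 Tflop/s)
DOUBLING_YEARS = 1.5    # assumed Moore's-law-style doubling time

def crossover_year():
    """Double compute every DOUBLING_YEARS until it exceeds BRAIN_OPS."""
    year, ops = START_YEAR, START_OPS
    while ops < BRAIN_OPS:
        year += DOUBLING_YEARS
        ops *= 2
    return year

print(crossover_year())
```

Under these particular assumptions the crossover lands in the early 2020s, between the 2012 and 2040 figures quoted above, which mostly shows how sensitive any such date is to the starting numbers.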
What seems certain is that the next two decades are going to see massive changes far in excess of anything we have seen in the past 100 years. Most people, and certainly those who are very wary of change, will most likely be left behind or will not benefit. Those who welcome such technologies and are prepared to use them will indeed appear to move to another level of human evolution, one of our own design and manufacture.
There are many concerns, certainly in the arena of uncontrolled nanotechnology and in the area of self-aware, super-intelligent machines. Some suspect we may not survive, and this seems like a justifiable fear. Those of us who welcome these changes are hoping that we can enhance ourselves rapidly enough to compete with the SI machines on an equal basis. At the moment it looks like SI machines will be here before we can upload ourselves to achieve that hoped-for equality. If SI becomes malignant, then that is the end of the human race.
The greatest problem is that these changes are likely to occur so rapidly that we will not have time to react, unless we foresee the problems and plan for them now.
Does that help?
Of course I could be entirely wrong and IXL777 is talking about something entirely different.