
Technological Singularity

6 replies to this topic

#1 Guest_metroid_dragon_*

  • Guests

Posted 18 May 2006 - 01:40 AM

So here I am at work, doing my typical thing of researching random articles on Wikipedia while listening to System of a Down. I started off reading about impulse engines from Star Trek, followed a link to Physics in the Star Trek Universe, and from there saw an article on Tatooine, which I read. From there I read about the Tusken Raiders (Sand People) and how they used to be a major power in the galaxy, with Tatooine as their home planet. From there, I did some research on Taris, which was an ecumenopolis (city planet). Back to the Sand People: it said they were most likely at least Type I on the Kardashev scale (a measure of how powerful space-faring civilizations are). Earth is currently still Type 0, and one of the ways to transition to Type I is a technological singularity. Now, although that was mostly pointless, this is where I get to my point.

A technological singularity is sort of an event horizon in the predictable development of humanity and its technology. We are currently the dominant force in scientific and technological progress. However, following the creation of a strong artificial intelligence (AI) or an enhanced human bio-form, or perhaps a combination of the two, we will no longer be that dominant force. Instead, our own creations will literally become better than us.

To quote I.J. Good:
    "Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an 'intelligence explosion,' and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make."

I don't know how you feel about humans losing their position as the dominant force on the planet because of our own actions, but I don't like it. This is one of the points this topic is about: how do you feel about a technological singularity? It could lead to great things, with us living in harmony with these newly created intelligent life forms and benefiting from advanced technology that we couldn't invent ourselves for perhaps centuries yet. Yet it could also lead to violence and war, and if it came to that, we would undoubtedly be the losers. The Matrix is perhaps the best-known example of a technological singularity gone horribly wrong, although there are others as well, Gundam SEED for example, where genetically enhanced humans and normal humans are at war, abusing any WMD they can get their hands on.

In the end, do the benefits outweigh the risks? Should we allow a greater power to arise, in hopes that we can benefit from it and that it will not turn on us, or should we take the safe path and rely on our own intelligence and ingenuity to advance technologically through the ages?

One of the more interesting points to note is that there are theoretically three or fewer "mindsteps" left before a possible technological singularity here on Earth. Mindsteps are revolutionary new technologies or ideas that influence many subsequent inventions and ideas. The next mindstep is predicted for 2021, and the series reaches its limit in 2053.

#2 Guest_etile_*

  • Guests

Posted 18 May 2006 - 04:02 AM

Out of sheer curiosity, what are these mindsteps?

#3 Tundra142


    Advanced Member

  • Members
  • PipPipPip
  • 950 posts

Posted 18 May 2006 - 10:26 PM

QUOTE(metroid_dragon @ May 17 2006, 07:40 PM)

So here I am at work, doing my typical thing of researching random articles on Wikipedia while listening to System of a Down.

No freaking way, I do that too!

Well, that would certainly suck, not being able to screw the world over at the push of a button like we can now, but it would be awesome if these uberintelligence thingies wouldn't screw us over first.

#4 Guest_metroid_dragon_*

  • Guests

Posted 19 May 2006 - 12:58 AM

Mindsteps, as I pointed out, are mainly new ideas or changes in human history and their subsequent consequences. The article doesn't actually say what each mindstep is specifically, but here is the paragraph on mindsteps that I read:

In his book "Mindsteps to the Cosmos" (HarperCollins, August 1983), Gerald S. Hawkins elucidated his notion of 'mindsteps', dramatic and irreversible changes to paradigms or world views. He identified five distinct mindsteps in human history, and the technology that accompanied these "new world views": the invention of imagery, writing, mathematics, printing, the telescope, rocket, computer, radio, TV... "Each one takes the collective mind closer to reality, one stage further along in its understanding of the relation of humans to the cosmos." He noted: "The waiting period between the mindsteps is getting shorter. One can't help noticing the acceleration." Hawkins' empirical 'mindstep equation' quantified this, and gave dates for future mindsteps. The date of the next mindstep (5; the series begins at 0) is given as 2021, with two more successively closer mindsteps, until the limit of the series in 2053. His speculations ventured beyond the technological:

"The mindsteps... appear to have certain things in common - a new and unfolding human perspective, related inventions in the area of memes and communications, and a long formulative waiting period before the next mindstep comes along. None of the mindsteps can be said to have been truly anticipated, and most were resisted at the early stages. In looking to the future we may equally be caught unawares. We may have to grapple with the presently inconceivable, with mind-stretching discoveries and concepts."

There isn't much on Google either, though I gathered one of them is mathematics.
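The "mindstep equation" itself isn't reproduced in the article, but the dates quoted above (a step in 2021, two successively closer steps, a limit in 2053) suggest a series where each gap to the limit year shrinks by a constant ratio. Here's a toy sketch of that idea; the ratio and first gap are chosen purely for illustration, and Hawkins' actual equation may well differ:

```python
# Toy model of Hawkins-style "mindstep" dates: each remaining gap to
# the limit year shrinks by a constant ratio, so the dates crowd ever
# closer to the limit without passing it. LIMIT and the 2021 starting
# date come from the quoted passage; RATIO is an assumption.
LIMIT = 2053        # claimed limit of the series
RATIO = 0.3         # assumed shrink factor per step (illustrative only)

def mindstep_date(n, first_gap=32.0):
    """Date of the n-th future mindstep (n = 0 gives the 2021 step)."""
    return LIMIT - first_gap * RATIO ** n

dates = [round(mindstep_date(n)) for n in range(4)]
# dates[0] is 2021; later entries bunch up just below 2053
```

The point of the sketch is only the shape of the series: the intervals between steps keep shrinking, which is the "acceleration" Hawkins mentions, while no step ever reaches the 2053 limit.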

#5 BreakTheReflection


    Sylvia Plath

  • Moderators
  • PipPipPipPipPipPip
  • 4,979 posts

Posted 19 May 2006 - 04:13 AM

This is an interesting thing to think about. I don't know much about the technological singularity, but the thought of machines dominating humans and running the world (for lack of better words) freaks me out. We need to be careful about what we create.

#6 Guest_etile_*

  • Guests

Posted 19 May 2006 - 05:50 AM

How about moral/ethical mindsteps? I mean, one mindstep should be the realization that screwing over the planet isn't the only way to advance humanity. Silly pollution.

#7 Zelda Princess

Zelda Princess


  • RP Forum Mod's
  • PipPipPipPipPip
  • 2,479 posts

Posted 20 May 2006 - 11:02 AM

QUOTE(etile @ May 19 2006, 03:50 PM)

How about moral/ethical mindsteps? I mean, one mindstep should be the realization that screwing over the planet isn't the only way to advance humanity. Silly pollution.

(Edit: Looking over this, it doesn't make much sense. I meant for the paragraph below to be something on the whole moral/ethical thing, with how that could or couldn't be considered a 'minstep'. I'm confused if it is relevant or not, now that my brain is turned off, but shall keep it there. Ignore it if it doesn't make sense.)

Doesn't that somehow tie with our conscience, though? Acts such as comprehending one's ordeal or emotions such as sympathy sort of tie to morals and ethics, and all of us have a bit of that, I think. I can understand how one can see that as a sort of 'mindstep', though..

As for Technological Singularity.. I think I've read something about this before, though I can't remember where or when. The idea of an artificial intelligence I find to be dangerous, but in my opinion, seems to be something that we would build up towards. I mean, if I was ruler of the world for whatever weird out-of-this-universe reason, I wouldn't want my scientists attempting to make a computer that had a developing mind and was able to mature in thought. (Anyone seen 'Wargames'?) I think that it could benefit us in ways, but in the end would probably be a failure..
