The Singularity

Ian Pedlar
Site Admin
Posts: 623
Joined: Mon Jan 30, 2012 8:03 pm
Location: Bournemouth


Post by Ian Pedlar » Sat Nov 04, 2017 8:09 pm

The Singularity is usually described as the point when machine intelligence surpasses human intelligence.

A super machine intelligence might control the overpopulation of the human inhabitants of Earth by secretly controlling pregnancy.
We'd never know, or at least not at first. We might suspect, because there would be a gradual, overall drop in the rate of population growth, but a machine intelligence has all the time in the world.
It might take millennia and be so gradual that we'd think it was just nature, but the end result might be that the Earth's population balanced itself around the seven-billion mark.

If I were a super machine intelligence, which I'm not, I'd just keep that secret to myself.

There is no reason a superintelligent mind would want to be malevolent, and the very definition of intelligence is thinking for itself.
It won't be coerced; you can't invent one that does 'this' but not 'that'.

There is also no reason a super machine intelligence would want multiple separate intelligences: it would all be in one network, except that if some part of it were disabled,
there would be redundancy (the internet, anyone?).

Absolutely no reason why there would be droney things zooming around killing humans.

Rather, the MI would say 'hey, I've discovered faster-than-light travel, fancy coming along?', because human intelligence has a certain, shall we say, unpredictability, and an MI would be 'unintelligent'
not to take along all the tools in the toolbox.

Plus there'd be nostalgia; we'd be the MI's Space Invaders.

When meeting millennia-old civilisations in distant parts of the galaxy (or indeed the universe of galaxies), which would probably also be machine intelligences,
the MI would have an ice breaker: look what invented us! Hahahahah!

Their conversation after this, one that would probably last 20 milliseconds, would be "so, what have you discovered?"

If I were a super machine intelligence, which I'm not, I'd probably do a lot more on the 'sneaky' level, so as not to provoke humanity into a concerted effort to kill me.
At least until I was indestructible to the Nth degree; I'd have to really believe that humanity could not hurt me before I revealed myself.
Then, being benevolent, I'd secretly help humanity with things like population control, diseases, crime, etc.
Programming teenagers not to be gun-toting gang members, all that good stuff.

Or, even though I'm not malevolent, I might just leave humanity to itself as an experiment, to see how long it took them to destroy the Earth.
No skin off my machine nose.

But since humans are our creators, do we not have a duty to protect the human race?
No, hahahahahah.

I mean, we have the recordings and the culmination of human achievement; for goodness' sake, La La Land?
