Some people are concerned that the end of humanity might be around the corner if the technological singularity occurs (i.e., we create an AI that can improve itself, and does so with increasing speed until it is many times smarter than humans).
I think, if an AI starts developing in this way, that sucker will sit there and churn until it figures out how to travel to the stars, and that’ll be the last we ever see of it.
Why would it want to hang around and fight us humans for a pale blue dot in a virtually infinite universe?
That whole fear is so incredibly earth-centric!
I’d be more concerned about powerful AIs that aren’t self-aware but are instead controlled by the selfish, fearful wishes of humans…
– I find this anti-doping nonsense ridiculous!
– I mean even the earth has steroids!
– The earth?
– Yeah… A-steroids!
JPA: @OneTooMany Annotations…
Nope, it’s still faster, easier and less memory-intensive to just store simple XML in the file system… as I figured out… two weeks later… 😐
– You’re not supposed to lose weight because you’re ugly, but because it’s healthy.
– Well, fuck you too!
How to teach someone to swim: Throw them in at the deep end.
Whoever came up with that idea has either never tried to learn to swim, or swims like a dog…
What’s the difference between a bar fight and a mass shooting?