Screw Global Warming and to Hell with the Economy.

Discussion in 'Politics' started by Hoofhearted, Aug 17, 2014.

  1. While this may be true, stopping GW may not necessarily stop, or even slow down, ME; in fact, quite the opposite can occur.

    For instance, if the solution to GW is to change from gasoline cars and coal-burning plants to hydrogen-powered cars and solar energy, then how does this do anything to stop the destruction of species habitat?

    With ever-expanding populations, wherever man treads, apex species are the first to go; then the pests, etc., etc.

    Cutting down on emissions won't save the humpbacks from the Japanese whaling fleet!

    Combating GW in and of itself will not be enough.

    On the other hand, you GW alarmists would do just fine riding the coattails of us ME alarmists, reaping all the benefits of our work.
     
    #11     Aug 18, 2014
  2. dbphoenix

    How paper clips could bring about the end of the world

    ANDREW LEONARD

    Nick Bostrom is explaining to me how superintelligent AIs could destroy the human race by producing too many paper clips.

    It’s not a joke. Bostrom, the director of the Future of Humanity Institute at Oxford University, is the author of “Superintelligence: Paths, Dangers, Strategies,” an exploration of the potentially dire challenges humans could face should AIs ever make the leap from Siri to Skynet. Published in July, the book was compelling enough to spur Elon Musk, the founder and CEO of Tesla, into tweeting out a somber warning:

    Worth reading Superintelligence by Bostrom. We need to be super careful with AI. Potentially more dangerous than nukes.

    — Elon Musk (@elonmusk) August 3, 2014

    Via Skype call from his office in Oxford, Bostrom lays out a thought experiment that demonstrates how all our affairs could go awry.

    It doesn’t have to be paper clips. It could be anything. But if you give an artificial intelligence an explicit goal — like maximizing the number of paper clips in the world — and that artificial intelligence has gotten smart enough to the point where it is capable of inventing its own super-technologies and building its own manufacturing plants, then, well, be careful what you wish for.

    “How could an AI make sure that there would be as many paper clips as possible?” asks Bostrom. “One thing it would do is make sure that humans didn’t switch it off, because then there would be fewer paper clips. So it might get rid of humans right away, because they could pose a threat. Also, you would want as many resources as possible, because they could be used to make paper clips. Like, for example, the atoms in human bodies.” . . .
     
    #12     Aug 18, 2014
  3. My apologies for mistakenly referring to us as alarmists, when the more proper term would likely be activists.

    It was likely the alarming rate at which GW and ME are having an effect on our planet that prompted me to ignore the popular Republican definition of the term.

    It would seem that an alarmist should be viewed as just that: someone who rings the alarm when smoke is detected, and naysayers should perhaps instead refer to those who exaggerate situations as dramatists or histrionics.

    You keep up the good fight, FC; people who wish to ignore and rebuke man-induced climate/environmental change will soon die out (probably along with the rest of mankind, yeesh).
     
    #13     Aug 18, 2014
  4. piezoe

    Hoofy, I like your assessment of what is driving mass extinction, albeit very slowly from a human perspective. There is one thing I can think of offhand that you might consider adding to your list, and that is the problem of religion. But I don't know what its priority would be.
     
    #14     Aug 18, 2014
  5. dbphoenix

    Probably high, since they're all so looking forward to the Rapture. What I've wondered is, if they're so looking forward to it, why not accelerate the process and leave us now rather than later?
     
    #15     Aug 18, 2014

  6. I'll make sure not to program any AIs to "make as many paper clips as possible," because now I fully realize that these AIs will stop at nothing to destroy any and all lifeforms that could possibly someday inhibit their paper-clip production.

    Thank you, DBP, for such invaluable insight; it may have just saved all of our asses.
     
    #16     Aug 18, 2014
  7. Sure, Pie,

    I understand there are many problems that come with religion, just as there are in any sector of society.

    Perhaps if you were to articulate the problem(s) in more detail, I could help figure out where their solution should fall in terms of priority.
     
    Last edited: Aug 18, 2014
    #17     Aug 18, 2014
  8. dbphoenix

    Actually, the invaluable insight comes from the author of the book. I serve only to copy and paste.
     
    #18     Aug 18, 2014
  9. Wouldn't this be like saying "all gay people are looking forward to painful butt sex"?

    It's tempting, but I'm afraid such a conversation would be petty and nonsensical at best.
     
    #19     Aug 18, 2014
  10. dbphoenix

    Well, I'm not into the whole Rapture thing, but I understand that this is why evangelicals and tea-partiers and so forth don't want to spend any money on trying to rectify what we've done to the planet: after all, we have been given dominion over the earth, it's somehow not our responsibility to fix this since it's God's will, and he's coming to get us anyway, so why bother?

    But if there is a God, and if there is a Rapture, and if there is a Heaven to which God is supposedly going to take these people, I can't think of a single reason why they would expect him to do so, given the kind of people they are.

    But then I'm not a theologian.
     
    #20     Aug 18, 2014