Friday, August 11, 2006

Famed Futurist Visits Astroroach Blog

I'm amazed and delighted that futurist Michael Anissimov has commented on the blog entry August 22 - Start of the Apocalypse. Michael writes and speaks on futurist issues, especially the relationships between accelerating change, nanotechnology, existential risk, transhumanism, and the Singularity. His popular blog Accelerating Future discusses these issues regularly, and in much more depth than you'll typically find here. Fans of the Astroroach blog and podcasts (especially the Singularity 101 and Existential Risks episodes) will really enjoy Michael's site. Highly recommended.

Oh, and what was his comment? You can read it yourself, but essentially Michael felt I was being alarmist in highlighting the jihadist threat. I admit the headline is a bit over the top, but I posted the article not to scare anyone but to get people thinking. Lewis argues that MAD is not a deterrent to those with an apocalyptic bent. I think that's a valid observation. But a few nuclear weapons here and there are not the biggest threat we will face, and I think we are in agreement on that.

1 comment:

Unknown said...

I did mention amplified human or biologically-derived greater-than-human intelligence in some of my audio podcasts, although you'd no doubt have problems with those as well, since they are completely off-the-cuff. All the podcasts are completely unscripted - many times I'm not even sure what I'll talk about when I sit down at the microphone. The end result is sometimes a rather cursory and superficial discussion, but so what - they're just for fun.

I'm looking forward to continuing the dialog. Thanks for subscribing.

Concerning your "Nukes" article:

Immediately after 9/11 the airwaves were full of discussions about our vulnerabilities, and many of them received the same kind of criticism I'm sure you get, namely "let's not give the enemy any new ideas." Although your article is unlikely to provide inspiration to a terrorist, it's not completely inconceivable that it could, and therein lies the problem.

This is similar to the arguments I hear when a computer vulnerability is found. Some groups think the vulnerability must be published immediately so users can start taking the necessary steps; others, like Microsoft, prefer "security by obscurity" and don't think a vulnerability should be made public until there's a fix for it.

As a "good guy" I want to be aware of where I'm vulnerable so I can start thinking of ways to mitigate the risk, but I don't want to give a "heads up" to the bad guys either. I don't know where I stand on this issue, but I'm sure we'll be facing this dilemma more often as we get closer to the Singularity or Apocalypse (or both), so I'd better make up my mind!

By the way, I was thinking about your "Nuke Tehran" scenario. As you say, it doesn't really matter who actually does the bombing. Should the Iranians obtain a single nuclear weapon, say from North Korea, they could hardly do better than to use it on themselves - if the goal is to start Armageddon. That would bring much of the world down on Israel and do far more damage than nuking Israel itself.

What an odd world this is...