To elaborate a bit on this: the article quoted in the video I was referring to says that we're in a nation-state race for AI, as evidenced by Trump's "Stargate" project. OK, that may be their intention, but it sounds silly to me. You can have your superintelligent AI with its "billions of scientists" (hopefully contained in a nuclear silo ten miles underground with no access to the outside world) producing all these massive theoretical breakthroughs in secret. Fine. You will still need to go out into the physical world and put in some actual WORK. And the other nation-states will absolutely be WATCHING, and their spies will eventually figure out what you're trying to do, or at least notice that you're suddenly mining for some ore in buttfuck nowhere, and will immediately do the same somewhere else. This hope/fear that whoever unleashes it will overnight become the Absolute Overlord of the world, able to maintain total secrecy and information asymmetry for long enough, just sounds very silly to me.
I'm watching a video in which the narrator goes through an article arguing that Western states "must" be the ones to develop general AI, as opposed to China, for example. The author says that broad AI will supply whichever nation develops it with "billions of scientists", so scientific and technological development, and military technology in particular, would burn through decades of human work in years or less. The culmination of this would be AGI giving birth to hyperintelligence. I remain skeptical. It's not only the infrastructure that such levels of output require -- Stargate project or not. It's the infrastructure that *materializing* those hypothetical scientific breakthroughs requires. The AGI may very well come up with the whole tech tree needed to produce a warp drive (as I said earlier, I'm stuck watching Star Trek these days). But then we have to actually build everything required to produce it: new materials, new mining rigs to extract them, new plants to process and assemble them, and so on. And what about testing? Will we just get rid of it? Will we accept AI models and simulations ("projections", they'll call them) instead? I just don't see it.