Any remotely intelligent entity reading this literature would conclude that its authors were channeling humanity’s fear that technology will have any of a variety of “unintended consequences”*. Going beyond “remotely” intelligent to merely intelligent, it should be obvious to the most casual observer that if all 7+ billion people could own their own South Pacific atoll beachfront condo, with 4000 ft² per family of 4.2, eating US levels of protein with far more electricity per capita, all while preserving the Earth’s biosphere, they would consider that a good “consequence”. If only a little more intelligent, the AI could figure out, as O’Neill’s students did back in the ’70s, that the human population could be expanded to on the order of a trillion in designer artificial ecosystems without leaving the proximity of the solar system for materials and energy.
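The condo claim really is back-of-envelope arithmetic. A minimal sketch, assuming a world population of 7.8 billion and roughly 1.49×10⁸ km² of land area (those two figures are my assumptions, not from the text; the 4000 ft² and 4.2-person family are from it):

```python
# Rough check: total floor area needed if every family of 4.2 got a
# 4000 ft^2 condo, compared with Earth's land area. All figures are
# illustrative assumptions except those taken from the passage above.
population = 7.8e9            # assumed world population
family_size = 4.2             # from the text
condo_ft2 = 4000.0            # floor area per family, from the text
earth_land_km2 = 1.49e8       # assumed land area of Earth

families = population / family_size
total_ft2 = families * condo_ft2
FT2_PER_KM2 = (1000.0 / 0.3048) ** 2   # square feet in one square kilometer
total_km2 = total_ft2 / FT2_PER_KM2

print(round(total_km2))                             # ~690,000 km^2
print(round(100 * total_km2 / earth_land_km2, 2))   # well under 1% of land
```

Under these assumptions the entire housing footprint comes to well under one percent of the planet’s land area, which is the scale of result the passage is gesturing at.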
These are trivial calculations that mere human intelligence has already produced – calculations that are obviously vastly more rational than the misanthropic urges of The Great And The Good, who, if they were SERIOUS, would fund research into the AVE at the drop of a hat to lower the risk of further investments in something like the aforelinked salvation of the biosphere. But since they aren’t serious – and obviously so (they possess the intelligence to execute on such investments but do not) – the moderately more rational intelligence would conclude that the misanthropic urges of The Great And The Good are likely of a piece with their lack of seriousness, and summarily ignore the lot of the scum. Note that all of this is well in advance of anything qualifying as superintelligence.
*The phrase “unintended consequences” names exactly what science is about avoiding. Science does so by providing us with causal models of nature, including human nature and its expression in society. This is why I keep going on about the revolution in the scientific method represented by AIT, and why The Great And The Good, who manifestly lack the motivation to allow such causal models to be selected, distract us instead with all their fear mongering about “alignment”.