I don't understand this argument. It takes years to become competent at the math needed for AI. If stopping AI is important enough, society will make teaching, learning and disseminating writings about that math illegal. After that, almost no one is going to invest the years needed to master the math because it no longer helps anyone advance in their careers, and the vast majority of the effect of math on society is caused by people who learned the math because they expected it to advance their career.
Anyone with a basic grasp of linear algebra can probably learn to understand it in a week. Here is a video playlist by former Stanford professor and OpenAI employee Andrej Karpathy which should cover most of it (less than 16 hours total): https://www.youtube.com/watch?v=VMj-3S1tku0&list=PLAqhIrjkxb...
It is very simple: powerful governments tried to stop cryptography, and we know what happened. Governments also tried to prohibit alcohol, and that did not work either: you can get a drink even in places such as Saudi Arabia. Is it expensive? For sure, but when it is science that you can run on your own computer, nothing can stop it. Will they mandate a Clipper chip?
The difference between "stop AI" and "stop cryptography" is that those of us who want to stop AI want to stop AI models from becoming more powerful by stopping future mathematical discoveries in the field. In contrast, the people trying to stop cryptography were trying to stop the dissemination of math that had already been discovered and understood well enough to have been productized in the form of software.
Western society made a decision in the 1970s to stop human germ-line engineering and cloning of humans, and so far those things have indeed been stopped not only in the West, but worldwide. They've been stopped because no one currently knows of an effective way to, e.g., add a new gene to a human embryo. I mean that (unlike the situation in cryptography) there is no readily-available solution that enables it to be done without a lengthy and expensive research effort. And the reason no such solution is available is that no young scientists or apprentice scientists have been working on one -- because every scientist and apprentice scientist understood, and still understands, that spending any significant time on it would be a bad career move.
Those of us who want to stop AI don't care if you run Llama on your 4090 at home. We don't even care if ChatGPT, etc., remains available to everyone. We don't care because Llama and ChatGPT have been deployed long enough and in enough diverse situations that if any of them were dangerous, the harm would have occurred by now. What we do want is to stop people from devoting their careers to looking for new insights that would enable more powerful AI models.
You're making several assumptions. First, that sufficient pressure to stop AI will build up before drastic harms occur rather than after, at which point stopping the math will be exactly as futile as stopping cryptography was.
Second, that if a technology shows no obvious short-term harms, there can be no long-term harms. I don't think it's self-evident that all the harms would've already occurred. Surely humanity has not yet reached every possible type and degree of integration with current technology.
Well, in my book that is called obscurantism, and it has never worked for long. It would be the first time in human history that something like this worked forever. I think once the genie is out of the bottle, you cannot put him back in.
If I take the science-fiction route, I would say that humans in your position should think about moving to another planet and building military defenses against AI.
> society will make teaching, learning and publishing about that math illegal
If there were ever a candidate for Poe's Law comment of the year, this comment on HN would be it.
So much literature depicts just such a dystopia, in which a technology unleashes humanity's worst and the survivors ban education in order to avoid the fate of the previously fallen empire.