>>8050
>e.g. "communism" is a complicated construct more substantial than any "maximise X"
Only because you are restricting the socialist set to a subset called communism.
Being a socialist in the 19th century just meant you believed that labor should receive its full product, so you were maximizing the amount of surplus that went back to the worker. Different interpretations of that then battled for supremacy.
So, even if an ideology starts with a simple idea, you then get a great big tree of ideologies with different theories on how to implement the idea.
This means that even if you start with "maximize paperclips", you'd have to come up with flowering and complicated separate interpretations of that, because the English language isn't precise enough to define "paperclip" except as a vague category that overlaps with other things. Even "maximize" can be hazy, since suspending maximization in the short term might be necessary to pursue other goals that let you survive and maximize in the long term.
"maximize paperclips" is at least an ideological superset in the same way that "liberate the proletariat" or "save the white race" is.
>anarcho-capitalism might describe itself as "maximise freedom", the ideology includes a lot more tirades about what is and is not a restriction on people's freedom and a lot less optimisation for anything in particular than a paperclip AI coded for "freedom" would
The AI would have to decide what freedom meant too, or at least its programmers would. A paperclip-maximizing AI would likewise need rules about what counts as a paperclip.
If there were different maximizers, they would also band together based on the least difference in their maximization interpretations, so as to fight the groupings that are most different. This would naturally reproduce politics, and probably goes some way toward explaining why politics exists in humans in the first place.