No, you don’t want to be a pet
Machine rule would start out as a ‘totalitarian nudge regime’ – then it can afford to shed the pretense altogether
The authoritarian mind is hard-wired to commit the central-planning fallacy
It’s like praying to the (god of) lightning
The absence of hope for a better future is a cause for (political) concern
How can you describe the concept of power as a set of priorities? Can a machine use a concept even humans can’t define?
How do you say ‘power’ in algorithm-ese?
Yes, you would. But would it care?