Andrea Kendall-Taylor, Erica Frantz, and Joseph Wright say so.
Surveillance powered by artificial intelligence (AI), for example, allows despots to automate the monitoring and tracking of their opposition in ways that are far less intrusive than traditional surveillance. Not only do these digital tools enable authoritarian regimes to cast a wider net than with human-dependent methods; they can do so using far fewer resources: no one has to pay a software program to monitor people’s text messages, read their social media posts, or track their movements. And once citizens learn to assume that all those things are happening, they alter their behavior without the regime having to resort to physical repression.
They say this in spite of, or perhaps in response to, their own tally that
between 2000 and 2017, 60 percent of all dictatorships faced at least one antigovernment protest of 50 participants or more.
. . .protests unseated ten autocracies, or 23 percent of the 44 authoritarian regimes that fell during the period. Another 19 authoritarian regimes lost power via elections. . .many of the elections had followed mass protest campaigns.
Recommended.
Full article: The Digital Dictators: How Technology Strengthens Autocracy by Andrea Kendall-Taylor, Erica Frantz, and Joseph Wright at the Internet Archive.
The rules of power and violence are being rewritten rapidly. Technology can only strengthen regimes that are prepared to take advantage of it. Autocratic leaders once needed deep support and control over the military and police. Now they will also need to extend that control to certain engineers and coders to maintain their power. Some will be able to, and some will not.
In the past, autocrats were vulnerable to a takeover by a military faction. In the future, they might be taken down by a project manager.
I really liked the article, but it is a bit too one-sided. For me, this is another story about the Web/Mobile/Cloud revolution we have just lived through. I’m not sure whether the new AI tech described in the article is a new revolution or just an incremental innovation in Cloud computing. Regardless, this is very much a Digital Red Queen scenario: everyone is running madly just to stay in place (individuals, corporations, and government-run institutions).

It is freakishly cheap and easy to roll out your own AI surveillance. A mobile video camera with enough bandwidth to stream to off-the-shelf Cloud AI/ML clusters offered by either Amazon AWS or Google GCP gives any individual very advanced surveillance/data-capture capabilities, at a cost comparable to the relative cost of running a web site in the first decade or two after the Web went mainstream. The only question is the who/what/where of the video capture and the restrictions put in place on gathering this kind of data in public/semi-public places. What would you do with (and how much would you pay for) AI/ML-derived video data from a parking lot, busy intersection, or busy sidewalk? Faces, genders, fashion, car models, license plates, walking rates, and countless other things.
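To make the point concrete, here is a rough sketch of the kind of per-frame analysis described above, using Amazon Rekognition through boto3 (Google Cloud Vision would look much the same). It assumes you already have an AWS account with Rekognition access and credentials configured; the region and the "frame.jpg" file name are placeholders for a frame grabbed from any camera stream.

# A minimal sketch of per-frame cloud analysis: send one captured frame to
# Amazon Rekognition and pull back object labels (cars, clothing), readable
# text (e.g. a license plate), and face attributes such as estimated gender.
# Assumes AWS credentials are already configured; "frame.jpg" is a placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("frame.jpg", "rb") as f:
    frame_bytes = f.read()  # Rekognition accepts raw image bytes up to 5 MB

# Object and scene labels: vehicles, clothing, pedestrians, and so on.
labels = rekognition.detect_labels(
    Image={"Bytes": frame_bytes}, MaxLabels=15, MinConfidence=80
)
for label in labels["Labels"]:
    print(f"label: {label['Name']} ({label['Confidence']:.0f}%)")

# Text visible in the frame, which is how a license plate would surface.
text = rekognition.detect_text(Image={"Bytes": frame_bytes})
for detection in text["TextDetections"]:
    if detection["Type"] == "LINE":
        print(f"text: {detection['DetectedText']} ({detection['Confidence']:.0f}%)")

# Face attributes: estimated gender, age range, and so on.
faces = rekognition.detect_faces(Image={"Bytes": frame_bytes}, Attributes=["ALL"])
for face in faces["FaceDetails"]:
    gender = face["Gender"]["Value"]
    age = face["AgeRange"]
    print(f"face: {gender}, age {age['Low']}-{age['High']}")

Run that in a loop over frames sampled from a cheap camera and you have, for pennies per thousand images, exactly the faces/genders/car-models/license-plates feed the comment describes.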
I showed the infamous 1984 Apple ad to my kids, and my thought was that the modern digital world Apple helped create is probably the main reason the realities of 1984 are slowly coming true.
1) The potential for surveillance of the average citizen is huge. I went out yesterday, and if needed, someone could find out exactly where I went all day:
a) My licence plate is probably on 5 different private security servers right now.
b) My cell phone logs all my movements.
They may have the lead right now, but I suspect people will figure out ways around it.
1984 . . . or Brazil?
https://www.youtube.com/watch?v=nSQ5EsbT4cE
…or Brazil?
Looked like the feds arresting Roger Stone.
I’m not so sure that democracies have all that much to brag about in the non-authoritarian category. An acquaintance suggests there is more overall freedom in China vs. the US: other than criticizing the gov’t, it’s a free-for-all in China regarding speech and activity; whereas in the US one can criticize the gov’t, but there are many taboos around race, gender, sex, ethnicity, etc.
My favorite perspective on this (which might just be optimistic) is this piece by Henry Farrell. A key passage: “there is a very plausible set of mechanisms under which machine learning and related techniques may turn out to be a disaster for authoritarianism, reinforcing its weaknesses rather than its strengths, by increasing its tendency to bad decision making, and reducing further the possibility of negative feedback that could help correct against errors.” The whole piece can be read here: http://crookedtimber.org/2019/11/25/seeing-like-a-finite-state-machine/