Recent IT History: Rise of The Tech Dystopia
This is one page of a multi-part series on “ethical computing.”
(Note: Simply listing the highlights of recent events has ballooned this page to an inordinate length, so several sections have been collapsed in self-defense. Our research turned up many important examples we had forgotten, which only demonstrates their overwhelming numbers. Early in the writing, the term “dystopia” in the title above felt like exaggeration; upon later reflection, it reads more like understatement.)
Previously we discussed what we called “The Good Old Days,” a period from the early 1980s until the year 2000 or so, and left off just as the “slumbering giants” of BigTech and government had awakened to the power and riches offered by the Internet. The developments since then have been detrimental to the autonomy of technology users in a number of ways, which we’ll expand on here.
Accordingly, below is a tour of recent information technology (IT) history over the last three decades (~1995–2025), focused on the escalating restrictions on computing freedom and the attendant destruction of privacy. Along the way, let’s not overlook a number of smaller but growing pests that have been festering perniciously this century as well.
1990s+, Digital Rights Management (DRM)
As author Cory Doctorow has pointed out, the first major front in the war on computing freedom was fought over copyright and DRM, i.e., control of copyrighted media on your computer.
To wrap your mind around this technology, it’s helpful to note that DRM, sometimes called “copy protection,” is about their rights, not yours. As such, it’s been dubbed “Digital Restrictions Management” and defective-by-design by critics, as it often punishes legitimate customers more than it slows criminals. A digital computer is fundamentally a perfect copy machine after all, so the “problem” is ultimately unsolvable unless so-called “treacherous means” [1] are resorted to.
Ok, why was it developed? Facing significant threats to the entertainment-industry business model from emerging technology, the multinational media industries (music, then later TV, video, and film) pushed hard on tech companies to secure PCs against their owners.
While DRM was quite unpopular in the computing industry early on, enthusiasm picked up speed once the largest technology platforms introduced their own profitable media rental stores. Among the more egregious aspects of these stores: the rights to view typically expire. Whether due to explicit time limits or a lack of server maintenance, media you thought you had “purchased” (via a “Buy” button) is in fact only a temporary rental license. You’ve been warned.
While the perspective of the entertainment industry on DRM is well understood, and perhaps even sympathized with in principle, we submit that losing control and privacy over mainstream computing devices (now involved in every facet of life!) was simply too high a price to pay.
1998, Digital Millennium Copyright Act (DMCA)
This act implemented legal enforcement of DRM in the US, in concert with the World Intellectual Property Organization (WIPO). It has led to a number of abuses and criticisms, most importantly:
- Abuse of takedown notices (requests to delete/unlist allegedly infringing content):
  - Favors large creators over the small.
  - Lack of responsibility for false claims encourages censorship.
- Abuse of the anti-circumvention provision, as detailed in the EFF report, Unintended Consequences of the DMCA:
The “anti-circumvention” provisions of the DMCA have not been used as Congress envisioned.
The law was ostensibly intended to stop copyright infringers from defeating anti-piracy protections added to copyrighted works. In practice, the anti-circumvention provisions have been used to stifle a wide array of legitimate activities:
- Chills free expression and scientific research
- Jeopardizes fair use
- Impedes innovation and competition
- Interferes with computer intrusion laws
1990s+, Online Advertising
“If you are not paying for it, you are not the customer;
you’re the product being sold.”
—Serra, Schoolman, Various, 1973
While one might convincingly argue that advertising is thoroughly overused in the modern world, it at least traditionally had a reasonable purpose: many services would simply not be available without it. Picking up a car magazine and seeing an ad for a new model, or watching a fashion TV show and seeing ads for clothing or perfume, worked well enough and was not really a problem.
There were less defensible tactics of course, billboards for example, but at least those just sit there, for the most part anyway.
Finally, we arrive at online advertising, the purveyors of which have decided that they don’t need permission to track and profile the public to any and every extent possible. That such tracking enables backdoors for government and criminals is merely an unfortunate consequence. (See Adtech and Government Surveillance are Often the Same, for an extended discussion.)
2000s+, Malvertising and Spyware
Short for “malicious advertising,” this form of malware is embedded within the online advertisements that appear on websites, including legitimate ones. Devices can be infected without interaction, by exploiting security vulnerabilities in browsers.
You’ve probably seen ads at one time or another stating, “Your PC is infected! Update antivirus now FREE!”, etc. Those ads are the attacks and scams themselves.
Spyware
“Spyware is any software that employs a user’s internet connection in the background without their knowledge or consent.”—Steve Gibson
The above was an early definition, but in subsequent years the scope has narrowed significantly due to the (now regrettable) ubiquity of such activity. Today the term most often refers to the passing of sensitive data without permission, though some diehards continue to argue that any non-consensual data transmission is spyware.
Why does it exist? Profit and advantage are the two main reasons. Hidden and difficult to detect by design, spyware is mostly used for obtaining credentials or financial information, profiling users, tracking movement across the web, serving ads, government espionage, stalking, and more. The applications are endless!
Types include adware, keyloggers, rootkits, and web beacons, and more recently the plain-old thoughtless telemetry of the likes of Google and Microsoft (see below). Notable products and events appear throughout the sections that follow.
2000s+, Data Brokers
“The unchecked middlemen of surveillance capitalism.”—Wired
While credit scoring and reporting companies were established by the mid-twentieth century in many countries, the adjacent industry of the data broker has exploded more recently.
These are the folks that buy and sell (and buy and sell) information about you. And they don’t stop at credit history, they’ll gather anything they can find. Without your consent, without your involvement, without any payment to you. That’s right—you don’t even get a cut!
Yes, it’s a cliché at this point: we are undeniably their product. A number of sketchy sites estimate the size of the market at approximately $300 billion USD in 2023 and growing.
- 2021, Data Brokers Are a Threat to Democracy
- 2022, Data Brokers: Last Week Tonight with John Oliver (YouTube)
“You are pregnant, want to know who I paid to find out?”
“To recap here, we’ve got shady data brokers, with virtually no oversight, collecting your data and building profiles that can track who you are, where you are, and what you are most likely to do or buy. You cannot edit this dossier, and others, from cops to reporters to your own abusers, can find and use this information.
It’s *not* a great situation. Your privacy should be the default setting here.”
Despite several of the jokes not landing, this is a well-done, important video primer on the topic.
- 2023, Data brokers now selling even more sensitive info; a national security risk, says report. Occupation data now sold:
An investigation by the Irish Council for Civil Liberties (ICCL) reveals widespread trade in data about sensitive European personnel and leaders that puts them at risk of blackmail, hacking and compromise, and undermines the security of their organisations and institutions.
The Myth of Anonymization
“There’s really no such thing as anonymized data.”
- 2009, “Anonymized” data really isn’t—and here’s why not
- 2022, Privacy Specialist Responds to John Oliver’s “Data Brokers” Segment (YouTube)
- 2023, Data brokers selling more sensitive info; national security risk
The anonymization fiction: “anonymized” records, often mentioned repeatedly to mollify concerns, offer far less protection than they seem. To focus on only one example, it turns out approximately one person in the entire world travels from your home to your job every day. They know the route you take and how fast you drive. Your name may be hidden, but if it can be filled in from other data, does that matter?
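To make the linkage problem concrete, here is a minimal Python sketch (all records and names are invented for illustration) of how an “anonymized” commute log can be re-identified simply by joining it against an ordinary public directory:

```python
# Hypothetical illustration: re-identifying an "anonymized" commute record
# by joining it with a public directory. All data here is invented.

# "Anonymized" location data: names stripped, but home/work block retained.
anonymized_trips = [
    {"user_id": "a91f", "home_block": "1200 Oak St", "work_block": "500 Main St"},
    {"user_id": "b22c", "home_block": "340 Pine Ave", "work_block": "500 Main St"},
]

# Public/commercial records that were never secret to begin with.
public_directory = [
    {"name": "Alice Example", "address": "1200 Oak St", "employer_address": "500 Main St"},
    {"name": "Bob Sample", "address": "340 Pine Ave", "employer_address": "500 Main St"},
]

def reidentify(trips, directory):
    """Match each 'anonymous' trip record to the unique person whose
    home and workplace coincide with it."""
    matches = {}
    for trip in trips:
        candidates = [
            person["name"]
            for person in directory
            if person["address"] == trip["home_block"]
            and person["employer_address"] == trip["work_block"]
        ]
        if len(candidates) == 1:  # the commute pair is effectively a fingerprint
            matches[trip["user_id"]] = candidates[0]
    return matches

print(reidentify(anonymized_trips, public_directory))
# {'a91f': 'Alice Example', 'b22c': 'Bob Sample'}
```

The toy data is beside the point; what matters is the join. As long as the quasi-identifier (here, the home/work pair) is unique, stripping the name accomplishes very little.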
2000s+, Data Breaches
What is a “data breach,” you might ask? It means that an organization in a position of authority collected data on many individuals, most likely including you, and then, through incompetence, neglect, or both, failed to secure that data from attackers. Their defenses were “breached,” a term that makes it sound like a “Helm’s Deep”-level battle occurred. In reality little effort is typically necessary: a doors-open, keys-in-the-ignition, motor-running level of difficulty. And no wonder, because who gives a flying f*** about other people’s data?
Sadly this section could easily list hundreds of incidents. The main thrust here is that, once the data is collected, large organizations have demonstrated they couldn’t care less about securing your information. It is simply not their problem. (Though they might put up a military-grade icon, checkmarks, and gold stars if their marketing funnel is not converting well enough.)
This is in spite of these same organizations demanding your personal data (PII) in order to do business with them. Refuse and you may be subject to various denials, restrictions, fees, or other consequences, especially from the government.
Let’s ponder a few large, semi-recent data breaches. Any more than that is simply exhausting to think about.
- 2015, Office of Personnel Management Breach
The one where Uncle Sam (US govt.) proved it doesn’t give one shit about you or the data it collects. It didn’t even care to protect itself!
Who was affected? All US government employees and contractors with a security clearance, including spies and their extended families, fingerprints and more, were included in this insecure database. The higher one’s clearance level, the more information the database contained! They had been warned about the vulnerability of the database for years, of course.
- 2010s, Equifax Security Breaches, with details on the 2017 breach of approximately 150 million innocent people.
- 2022, Shanghai police database—wow:
The leaked data, totaling over 23 terabytes, includes details of more than one billion Chinese residents, encompassing names, addresses, birthplaces, resident ID card numbers, phone numbers, photos, mobile phone numbers, and information on criminal cases.
- 2023, 23andMe Data Breach
Not all of the data out there is metadata; some is genetic data! Gattaca, anyone?
- 2024, National Public Data breach
A shamefully inept background check company also acting as a data broker. Unable to secure its database of hoarded PII identifying billions, it recently filed for bankruptcy and left victims holding the bag.
Sadly, these make the Sony breaches of 2011–2014 look positively quaint. Here’s a substantial list of large data breaches to ruin your day.
You may be thinking to yourself, “Ok, surely these organizations have learned their lesson.” Nope. Most still don’t care and the frequency of incidents has not been dropping. We don’t deserve this, y’all.
2010s+, Dystopia Arrives
“Romulans… so predictably treacherous.”—Weyoun, DS9 S7E1
If all that wasn’t enough, it gets worse. The BigTech oligopoly is now “all in” on anything that improves the bottom line. Freedom, privacy, and informed consent are now things of the past, absent a gargantuan amount of effort. There’s nowhere left to hide for regular folks, and vanishingly few places even for technology experts. The screws of profitability turn and pick up speed.
(We previously mentioned Cisco’s Great Firewall of China, due to it being early and having geo-political ramifications.)
Microsoft
Notably, the big company that was first in line to jump in bed with the NSA and China, an enthusiasm which hasn’t wavered since. In the recent past, they and numerous others have had success at rebranding their spyware products as “telemetry”:
- 2023, Is Windows 11 spying on you? New report details eye-opening levels of telemetry
- 2023, Has Windows become Spyware? (YouTube)
- 2023, Teams has been collecting voice and face data
When Microsoft wants you to do something with your computer, it fights to win. Where does MS want you to go today?
- 2015, MS Forces upgrades to Windows when answered “No.” [2]
- 2023, To close OneDrive on Windows, one must explain why first
- 2024, Uses malware tactics to switch browser, search [2] [3] [4]
- 2025, MS wants you to use an online account (i.e. forces you)
- 2025, Windows 11 is a minefield of micro-aggressions
It is their computer after all. /s Chef’s kiss:
- 2019, eBook store: when this closes, your books disappear too. MS bricks all DRM‘ed ebooks it has ever “sold.”
- 2024, Ads in the Start Menu: Here’s How to Turn Them Off
Harnessing a “unique synergy,” Microsoft combines a complete disregard for your privacy with its horrendous security record:
- 2024, MS hasn’t been able to shake Russian state hackers
- 2024, Windows Recall demands an extraordinary level of trust that Microsoft hasn’t earned.
- 2025, Recall—what has and hasn’t been fixed.
- 2025, Nearly 1 million Windows devices targeted in advanced “malvertising” spree [2]
- 2025, MS to stop using China-based teams to support DoD
Keep in mind this is just a smattering of quite recent Microsoft history, in case their opinion of the customer and commitment to security isn’t yet clear.
Apple
In contrast, Apple was the big company that was last in line to jump in bed with the NSA. A major reason for that was said to be the stubbornness of Steve Jobs himself. Sadly, by 2012 he was no longer around to stick up for us.
The new reality, despite Apple’s reputation, marketing, and occasional achievements, is that one should not have confidence in any significant amount of privacy or security on their devices:
- 2020, Your Computer isn’t Yours
This piece details the news that a log of every program you run is transmitted, unencrypted, to Apple.
The whole process of having Apple mix these “protections against malware” into a system that’s also a “protection of our business model” remains deeply problematic.
- 2021, MacOS Network Privacy
In this few minutes, the [freshly imaged] system generated 38 megabytes of network traffic. All of these 73 hostname lookups happened without launching any apps: no App Store, analytics off, no iTunes, nothing. No Apple ID has been used on the device.
Apple provides no choice but to spray the internet with details of your presence the first time and every time you connect. Further, the system volume containing the configuration for such things is now read-only! (For your security, of course.)
- 2021, Apple announces plans to install THE POLICE on everyone’s phone. (No, not the English rock band.) Then it doesn’t exactly back down during the ensuing backlash; a partner calls critics the “screeching voices of the minority.”
- Memo: Apple doubles down on photo scanning features
- Again: What happens when a country like China uses this feature to find people with images critical of the government? Why wouldn’t the [entertainment] industry want to start searching for pirated content on iPhones in a few years?
The mind-boggling continues. As always, protecting children is a laudable goal, and accordingly the leading tactic when selling the violation of billions of innocent people.
- 2022, Pluralistic: Apple’s Cement Overshoes
“Apple makes a lot of money from the absence of repair.”
- 2022, Apple Is Tracking You Even When Privacy Settings Say It’s Not
The data iPhones gather is extraordinarily fine-grained: “what you tapped on, which apps you search for, what ads you saw, and how long you looked at a given app and how you found it.”
- 2023, Apple is Still Tracking You Without Consent
- 2023, Apple Has Begun Scanning Your Local Image Files Without Consent
Stock macOS now invades your privacy via the Internet when browsing local files, taking actions that no reasonable person would expect to touch the network, with iCloud and all analytics turned off, no Apple apps launched (this happened in the Finder, via spacebar preview), and no Apple ID input.
macOS now contains network-based spyware even with all Apple services disabled. It cannot be disabled via controls within the OS: you must use third-party network filtering software (or external devices) to prevent it.
Note that local filtering is not enough either, since modern telemetry services simply log and retry later. Meaning: the first time you join a Wi-Fi network outside your home, all the logs are uploaded, old and new. Further, DNS, local firewall, and hosts-file blocks are often intentionally routed around as well. It’s almost as if they really, really, really want this data, and won’t be denied it.
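As a rough illustration of why blocking at home merely delays the upload, here is a minimal Python sketch of the store-and-forward pattern commonly used by telemetry clients (a generic sketch under our own assumptions, not Apple’s actual code; the endpoint and payload are made up):

```python
# Generic store-and-forward telemetry sketch (illustrative only; the
# endpoint and payloads are invented). Events that cannot be sent are
# queued on disk and retried on the next network that lets them through.
import json
import time
import urllib.request
from pathlib import Path

QUEUE_FILE = Path("telemetry_queue.jsonl")          # persists across reboots
ENDPOINT = "https://telemetry.example.com/ingest"   # hypothetical collector

def record_event(event: dict) -> None:
    """Append the event to a local queue; nothing is lost if upload fails."""
    with QUEUE_FILE.open("a") as f:
        f.write(json.dumps({**event, "ts": time.time()}) + "\n")

def flush_queue() -> None:
    """Try to upload every queued event; keep whatever still fails."""
    if not QUEUE_FILE.exists():
        return
    remaining = []
    for line in QUEUE_FILE.read_text().splitlines():
        try:
            req = urllib.request.Request(
                ENDPOINT, data=line.encode(),
                headers={"Content-Type": "application/json"},
            )
            urllib.request.urlopen(req, timeout=5)  # succeeds as soon as a network allows it
        except OSError:
            remaining.append(line)                  # blocked? keep it and retry later
    QUEUE_FILE.write_text("\n".join(remaining) + ("\n" if remaining else ""))

# A firewall or hosts-file block at home only fills the queue; the backlog
# uploads in full the first time the device joins an unfiltered network.
record_event({"app": "Finder", "action": "preview"})
flush_queue()
```

Under that pattern, a block is just a delay; nothing short of clearing the queue, or never reaching an unfiltered network, prevents the upload.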
Google
Remember “Don’t be evil”? It’s hard to think of a tech company that has taken a harder turn for the worse than Google: criticism, censorship, and privacy concerns, oh my! Let’s look at some specifics.
Spying, including on children:
- 2017, Chromebooks for schools are collecting far more data on students than is necessary, stored indefinitely
- 2019, Google Assistant records conversations when disabled
- 2019, Google Chrome has become surveillance software
- 2024, Can’t say no to Google surveillance
- 2025, Google installed A.I.—getting rid of it, creepy
Android:
- 2017, Prevents app install if user is in control of device
- 2018, Android is tracking location even in airplane mode, sends later.
- 2019, Thousands of apps track phone—even if denied permission
- 2020, An app for creditors to lock you out of your financed phone if you don’t make payments. Who’s the Boss? :-D
- 2021, Google silently pushed COVID-tracking app to users’ phones
YouTube:
- 2019, Google Will Pay Record $170 Million for Violations of Children’s Privacy Law
- 2020, Kids’ Viewing Dominated by Consumerism, Inappropriate Content
- 2023, YouTube may face billions in fines if FTC confirms further child privacy violations
Trusting Google
Still, most folks tend to trust Google. Here are a few of the results of that:
- 2013, Google Play intentionally sends app developers the personal details of users who install an app.
- 2016, $300 Revolv Hub bricked after only 18 months
- 2019, Employees say they were retaliated against for reporting harassment (Not the kind of disagreements you’d expect to hear from a moderately ethical company.)
- 2021, Google handed over personal data of Indian protesters
- 2025, Faces trial for collecting data on users who opted out
- 2025, Settles shareholder lawsuit, $500M on being less evil
Amazon
Despite its smaller reach, Amazon still manages to hit a dystopian home-run with this one—it makes history rhyme!
- 2009, DRM enables the “memory hole” of 1984!
In George Orwell’s “1984”, government censors erase all traces of news articles embarrassing to Big Brother by sending them down an incineration chute called the “memory hole.”
On Friday, it was “1984” and another Orwell book, “Animal Farm,” that were dropped down the memory hole by Amazon.com.
Can’t make this stuff up, folks.
- 2014, Amazon’s Echo and Smart TVs that are listening to and watching everything you do
- 2025, Everything you say to your Echo will be sent to Amazon starting on March 28, even when disabled.
Amazon does other shitty things that affect both you and its employees. These doorbells, for instance, are all over our neighborhood:
- 2020, Ring Doorbell App Packed with Third-Party Trackers (Not to mention the cloud camera itself.)
- 2021, Dystopia Prime: Amazon Subjects Its Drivers to Biometric Surveillance
- 2021, Want to borrow that e-book from the library? Sorry, Amazon won’t let you.
- 2022, New Amazon Worker Chat App to Ban Words Like “Union,” “Restrooms,” “Pay Raise” 👀
Meta
Zuck: Yeah so if you ever need info about anyone at Harvard
Zuck: Just ask.
Zuck: I have over 4,000 emails, pictures, addresses, SMS
Friend: What? How'd you manage that one?
Zuck: People just submitted it.
Zuck: I don't know why.
Zuck: They "trust me"
Zuck: Dumb fucks
It takes a certain kind of organization to warrant this very lengthy page on Wikipedia listing common criticisms. The article is approximately 24,000 words (including references)!
That should be enough to get started! Ok, some of the examples are rather old; maybe they’ve finally got their act together. What have they allegedly done lately?
- 2025, Meta violated privacy law in menstrual data fight (yes, you read that right).
- 2025, Meta Downplayed Risks to Children and Misled the Public [2]
- 2025, Meta Researchers Privately Compared Instagram to Addictive Drug, Bombshell Court Filing Shows. “We’re basically pushers…”
- 2025, Meta recording Android users’ web browsing (OS security bypass)
Sounds geeky, but what does it mean? The current understanding is that if you have their app installed on your phone, it opens a local web service that listens for connections. When you later visit web pages in the browser, Meta’s ubiquitous tracking scripts report your browsing back to that local service. This technique has the benefit of bypassing all existing OS privacy/security measures! A conceptual sketch of the general idea follows.
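Here is a conceptual Python sketch of localhost-based tracking in general. The port, path, and payload are hypothetical, and this is an illustration of the technique, not the vendor’s actual protocol:

```python
# Illustrative sketch of "localhost tracking": a native app listens on the
# loopback interface, and a script embedded in web pages reports the pages
# you visit to it, linking your browsing to the app's logged-in identity.
# The port, paths, and payloads are hypothetical, not any vendor's protocol.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

PORT = 12387  # made-up port on the loopback interface

class AppListener(BaseHTTPRequestHandler):
    """Stands in for the native app's background service."""
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        visit = json.loads(body)
        # The app already knows who you are; now it also knows what you browse.
        print(f"[app] logged-in user visited: {visit['url']}")
        self.send_response(204)
        self.end_headers()

    def log_message(self, *args):  # silence default request logging
        pass

server = HTTPServer(("127.0.0.1", PORT), AppListener)
threading.Thread(target=server.serve_forever, daemon=True).start()

# What the tracking script embedded in a web page effectively does: an
# ordinary HTTP request to 127.0.0.1, which app sandboxing and browser
# permission boundaries do not treat as suspicious.
report = json.dumps({"url": "https://example.com/article"}).encode()
urllib.request.urlopen(
    urllib.request.Request(
        f"http://127.0.0.1:{PORT}/track", data=report,
        headers={"Content-Type": "application/json"},
    ),
    timeout=5,
)
server.shutdown()
```

Because the request never leaves the device, no permission prompt fires and no obvious cross-app data sharing occurs, yet the app ends up holding a browsing history tied to a real identity.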
2010, The Social Network
Keep in mind that while this film is a dramatization, it has a number of ties to real events. The larger story is notable in that Zuck finds a way to screw everyone who does business with him. “Dumb fucks” indeed.
Vehicle Surveillance
The automotive industry had eyed tech-industry profits for quite a while by this point. So when the market developed enough to put these devices in cars, auto companies dove in head first, to the extent that purchasing a new car without a spying package is no longer an option. The Mozilla Foundation subsequently investigated new cars; what did they find?
It’s Official: Cars Are the Worst Product Category We Have Ever Reviewed for Privacy
Ah, the wind in your hair, the open road ahead, and not a care in the world… except all the trackers, cameras, microphones, and sensors capturing your every move. Ugh. Modern cars are a privacy nightmare.
All 25 car brands we researched earned our *Privacy Not Included warning label—making cars the official worst category of products for privacy that we have ever reviewed.
The car brands we researched are terrible at privacy and security:
- They collect too much personal data (all of them)
- Most (84%) share or sell your data
- Most (92%) give drivers little to no control over their personal data
- We couldn’t confirm whether any of them meet our Minimum Security Standards
Only two manufacturers (Renault and Dacia) received a rating of merely “Bad”, in contrast to the “Terrible” ratings of the rest.
While privacy is an explicit non-goal, to say security is an afterthought to the automotive industry is also an understatement. They simply haven’t had the chops to implement connected technology securely. Remember these?
- 2015, Fiat Chrysler “connected car” bug lets hackers take over Jeep remotely, including the brakes and steering while moving!
- 2023, Toyota: Car location data of 2 million customers exposed for ten years
- 2024, Flaw in Kia’s web portal let researchers track, hack cars
In January 2023, they published the initial results of their work, an enormous collection of web vulnerabilities affecting Kia, Honda, Infiniti, Nissan, Acura, Mercedes-Benz, Hyundai, Genesis, BMW, Rolls Royce, and Ferrari—all of which they had reported to the automakers.
Even when the systems are working properly, they too often can’t bring themselves to do the right thing:
- 2014, Ford Exec Admits: ‘We Know Everyone Who Breaks The Law’, where and when, through GPS.
- 2023, Tesla workers shared sensitive images recorded by customer cars:
- “We could see inside people’s garages and their private properties.”
- “Private scenes we were privy to because the car was charging.”
- “I’m bothered by it because the people who buy the car, I don’t think they know that their privacy is not respected … We could see them doing laundry and really intimate things. We could see their kids.”
- “I saw some scandalous stuff sometimes, you know, scenes of intimacy though not nudity.”
- “I always joked that I would never buy a Tesla after seeing how they treated some of these people,” said one former employee.
- One former labeler described sharing images as a way to “break the monotony.” Another described how the sharing won admiration from peers.
- Circulated clips included one of a child being hit by a car.
- Cases have tended to focus not on the rights of Tesla owners but of passers-by unaware that they might be recorded by parked Tesla vehicles.
- 2024, Man Sues G.M. and LexisNexis Over Sale of His Cadillac Data
- 2024, Ford Patents In-Car System That Eavesdrops To Target Ads
- 2025, Stellantis Is Spamming Owners’ Screens With Pop-Up Ads
Not much left to add. :-/
Surveillance of Vehicles
Here are stories describing the external recording of vehicles passing through locations such as public motorways or parking garages equipped with automatic license-plate recognition (ALPR) devices.
Over the last decade, thousands of ALPR cameras have appeared in towns and cities across the US. The cameras, which are manufactured by companies such as Motorola and Flock Safety, automatically take pictures when they detect a car passing by. The cameras and databases of collected data are frequently used by police to search for suspects. ALPR cameras can be placed along roads, on the dashboards of cop cars, and even in trucks.
- 2012, Your car, tracked: the rapid rise of license plate readers
- 2014, Cops Must Swear Silence to Access Vehicle Tracking System. They’d much prefer secrecy, of course.
- 2015, Cops don’t have to give man his own license plate reader data, court finds
- 2019, A Private Surveillance Network. We Tracked Someone With It
What DRN has built is a nationwide, persistent surveillance database that can potentially track the movements of car owners over long periods of time. In doing so, highly sensitive information about car owners can be made available to anyone who has [paid for] the tool.
- 2024, Kansas police chief used Flock license plate cameras 164 times to track ex-girlfriend
- 2024, Lawsuit Argues Warrantless Use of Flock Surveillance Cameras Is Unconstitutional, Norfolk, Virginia
- 2025, Misconfigured license plate readers are leaking data and video in real time
- 2025, Flock haters cross political divides to remove error-prone cameras
- 2025, How Cops Are Using Flock’s ALPR Network to Surveil Protesters and Activists
- Map of Publicly Known Cameras (deflock.me): 53,540 ALPRs mapped worldwide
The more modern of these devices also recognize the make, model, and color of vehicles, along with unique properties like a spare tire, bumper stickers, dents, a luggage rack, etc.
These devices are almost always installed without sufficient (any) community input, and often without a data retention policy.
But it’s just your car, right? Nope, state governments—not to be outdone:
- 2019, DMVs Are Selling Your Data:
The data sold varies from state to state, but it typically includes a citizen’s name and address. In others, it can also include their nine-digit ZIP code, date of birth, phone number, and email address.
“Watch out for that [last] step, it’s a doozy!”—Ned Ryerson
Elsewhere
Government
Bi-partisan fear and loathing of

On March 20, [2025] President Trump signed an executive order, “Eliminating Information Silos.” The executive order did not attract much attention until it was more recently revealed that the administration was working with tech company Palantir to create a database containing all information collected by all federal agencies, on all US citizens.
Palantir, founded in 2003, has worked on helping government become more efficient at collecting and storing information about US citizens. The company, which was named after the seeing stones from J.R.R. Tolkien’s Lord of the Rings, is one of the first companies to see the potential in the surveillance-industrial complex that developed following 9-11 and the PATRIOT Act. Palantir is literally the creation of the surveillance state since one of its early investors was In-Q-Tel, a venture capital firm controlled by the CIA.
—Ron Paul, Great Big Ugly Surveillance State
- 2025, Trump Taps Palantir to Compile…
- 2025, A Master Database on Every American
- 2025, Reactions to National Citizen Database
- 2024, Claude AI to process secret government data via Palantir deal. #AI
Meanwhile, Sweden, the Netherlands, and other European countries are making themselves increasingly dependent on systems like BankID, which requires an Apple- or Google-provisioned mobile device, while phasing out cash. Making government services and daily life dependent on two foreign companies is a bit short-sighted, to say the least.
Media, Life…
The Internet is a surveillance state. Whether we admit it to ourselves or not, and whether we like it or not, we’re being tracked all the time.
One reporter used a tool called Collusion to track who was tracking him; 105 companies tracked his Internet use during one 36-hour period.
If the director of the CIA can’t maintain his privacy on the Internet, we’ve got no hope. —Bruce Schneier
- 2013, The Internet is a surveillance state, by Bruce Schneier, a well-known security technologist.
- 2022, The Rise of the Worker Productivity Score
- 2024, The Global Surveillance Free-for-All in Mobile Ad Data, Krebs on Security
- 2024, Why Walmart spent $2.3 billion to buy Vizio, the Smart TV maker.
- 2025, Software update shoves ads onto Samsung’s pricey fridges
- 2025, Sick of smart TVs? Here are your best options
Payroll Data, Anyone?
You probably thought your financial details were private. Not anymore!
- 2021, Intuit to Share Payroll Data from 1.4M Small Businesses With Equifax, ADP already
- 2022, Workers are upset their companies are sharing payroll data with Equifax
Public Facial Recognition
I read Disney’s privacy policy from top to bottom, including all the fine print. All I learned was that it protected them, but it didn’t protect me.
—Janet Vertesi, Data Free Disney
It’s hard to think of a technology more dystopian than facial recognition, gah:
- 2019, How China Uses High-Tech Surveillance to Subdue Minorities, [Archive]. One guess as to which western tech company was first to jump in bed with this system.
- 2023, Iran Is Using Facial Recognition to Enforce Modesty Laws
- 2023, These 9 Stadiums Are Already Using Facial Recognition at Games
- 2023, Data Free Disney, an academic exercise on the limits of resistance.
- 2024, Cop busted for unauthorized use of Clearview AI facial recognition resigns
- 2025, New Orleans called out for sketchiest use of facial recognition yet in the US
- 2025, Inside ICE’s Facial Recognition App of 200 Million Images
- 2025, ‘Britain’s most tattooed man’ unable to pass age check system, mistakes ink for mask
2010s+, Addiction & Suicidal Teenagers
Meanwhile, another scourge raised its head in the previous decade. Lots of virtual ink has been spilled on this subject, so we won’t delve too deeply here into the addictiveness of consumer apps and “social media.” While not directly in scope for what we’d like to accomplish, these developments certainly worry us. Fortunately, the Center for Humane Technology (CHT) is on the job.
Unsurprisingly, corporate platforms give us few tools to protect ourselves or our children from the many dangers described previously, or from the addictiveness of their products. The most we can hope for seems to be “Screen Time”-style limits, buggy as they are, but not a permanent break from addictive tactics, surveillance, and targeted and/or inappropriate advertising.
To their credit, Apple did put significant limits on external advertising on iOS a few years back, partly in order to hinder rivals such as Meta (Facebook/Instagram). We’ll take it, but as mentioned, we continue to need a complete solution.
2020s, Complete Corporate Lock-in
Monopolistic and oligopolistic practices abound. Increasingly, one can’t participate in modern life unless blessed by the app-store of Apple|Google and their terms of service—and they like it that way. It would be a cryin’ shame if your account were to get banned, wouldn’t it? :wink:
- Apple “permanently” disabled my Apple ID
- An Expert’s Guide to Dealing with Google Account Suspensions
- Mobile-Only Ticketing Push Leaving Many Without Smartphones Behind [1] [2] [3]
Likewise, it’s not an accident that corporate tools are given to schools for free. Anecdote time: recently one of our students was forced to have accounts at Apple, Microsoft, and Google, in addition to the school’s LMS, in order to do assignments. The bulk of everyday work is done in Google Docs, and as people have become used to it, this sounds perfectly normal. No longer are students learning how to use programs on their own computers, but rather how to be dependent on cloud services.
Meanwhile, school officials gave presentations and paid lip service to online safety. They also deactivated Screen Time and provided unrestricted YouTube over LTE, completely oblivious to the effects on sanity and grades. This setup also made it impossible to rein in with time limits at the home router. Imagine nightly tugs-of-war with your student, what joy! :-D
To recap, yes a schoolwork platform provided for free… by the biggest advertising company on the planet, bless their hearts. Imagine it written into a 1990s dystopian sci-fi flick—that would be quite a stretch, wouldn’t it? Would the audience believe it? No way, but now it’s reality!
At this point, we’re definitely in the “Dark Biff Timeline” from Back to the Future II.
Dark Patterns
This one has been around the block a few times, in that sleazy, dishonest companies have been a scourge since perhaps the beginning of business. But restrictive, coercive technology sure helps! Dark patterns in modern user interfaces include:
- Deceptive design techniques:
- Misleading ads disguised as content.
- Privacy manipulation: options that use vague language to steer users toward sharing more personal data.
- Bait-and-switch, advertising of an unavailable product at a low price, then promoting higher-priced items.
- Fake Urgency
- Promotion of unintended purchases
- Roach motel: difficult cancellation of subscriptions, trapping users in unwanted payments.
- Hidden fees, additional charges that are not clearly disclosed until after purchase.
Finally, a list of tricks to make you spend more online (nsw.gov.au).
2020s, AI Ethics
I have a Pixel watch, and my main use for it is setting reminders, like “reminder 3pm put the laundry in the dryer.” It’s worked fine since the day I bought it.
Last week, they pushed an update that broke all of the features on the watch unless I agreed to allow Google to train their AI on my content.
—thomascgalvin on HN
The ethics of artificial intelligence page at Wikipedia tackles the academic concerns, but how do business implementations affect us?
Rewriting terms of service [1] [2], loosening policies and defaults, eliminating safeguards, destroying copies of books: yes, this new field is moving fast! All this to get access to our hot data, as much of it as possible, to plug it into models and make money. Once again you are the product, not only on the frontend but on the backend too. At least that’s how the current thinking goes.
There are additional challenges materializing such as elimination of jobs, the full ramifications of which remain to be seen, but we digress. This is another area of research for CHT.
Machine Enforced Bias
During a recent Machine Learning class, there was a lecture (#10) on potential pitfalls of its application to real-world problems.
[We’ll] look at cases from medical care and law enforcement that reveal hidden biases in the way data is interpreted.
Basically, if we are not proactive and careful to correct them, biases in the data will produce similar biases in the results. In other words, without intervention, subjects in an unlucky group will be treated more harshly by such a system.
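To see the mechanism in miniature, here is a small, self-contained Python simulation (all numbers invented): the underlying behavior is identical across two groups, but the historical records flag one group more often, and a model fit to those records faithfully reproduces the disparity.

```python
# Synthetic illustration of bias propagation: identical behavior, but the
# historical data flags group B more often, so the "trained" model does too.
# All numbers are invented for demonstration.
import random

random.seed(0)

def make_history(n=10_000):
    """Generate past records where true behavior is identical across groups,
    but group B was flagged twice as often by the old process."""
    records = []
    for _ in range(n):
        group = random.choice(["A", "B"])
        risky = random.random() < 0.10            # same true rate for both groups
        flag_rate = 0.5 if group == "A" else 1.0  # biased historical enforcement
        flagged = risky and random.random() < flag_rate
        records.append((group, flagged))
    return records

def fit_group_rates(records):
    """A deliberately simple 'model': score each group by its observed flag rate."""
    totals, flags = {}, {}
    for group, flagged in records:
        totals[group] = totals.get(group, 0) + 1
        flags[group] = flags.get(group, 0) + flagged
    return {g: flags[g] / totals[g] for g in totals}

model = fit_group_rates(make_history())
print(model)
# e.g. {'A': ~0.05, 'B': ~0.10}: the model rates group B as twice as "risky"
# even though the underlying behavior was identical. Bias in, bias out.
```

Real systems are far more elaborate, but the failure mode is the same: the model has no way to know that it was the historical flag rate, not the behavior, that differed between groups.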
Predictive Law Enforcement
A system for crime prediction has been implemented at the Tel Aviv Airport, though it may have since been shut down.
Despite the system’s enormous potential for human rights abuses, it has been put into police use in secrecy, without public debate and without explicit legislative authorization.
- 2024, Using AI in Airport Policing Decisions, ACRI.il
- 2024, Revolutionizing Airport Security: Predicting Crime Before It Happens with AI (warning: LinkedIn)
Elsewhere:
- 2016, Machine Bias, software used across the country to predict future criminals.
- 2025, Flock Now Using AI to Report to Police If Our Movement Patterns Are “Suspicious”
- 2025, U.S. companies bring “predictive policing” to China
Judiciary
“Can you foresee a day, when smart machines, driven with artificial intelligences, will assist with courtroom fact-finding or, more controversially even, judicial decision-making?”
The Chief Justice’s answer was more surprising than the question. “It’s a day that’s here,” he said, “and it’s putting a significant strain on how the judiciary goes about doing things.”
- 2017, Sent to Prison by a Software Program’s Secret Algorithms
- 2025, Louisiana prison board uses algorithms to determine eligibility for parole
Wrap Up
Whew! This was a “short” summary of the deleterious accomplishments of the last three decades of the IT industry. We dug into a number of acute issues and supplied “multitudinous” links for additional study. Dismally, the story does not slow from here, nor are we moving in a more humane direction.
Next: Fiction & Vocabulary