☀︎

Ethical Computing Initiative

Recent IT History: Rise of The Tech Dystopia

This is one page of a multi-part series on “ethical computing.”

(Note: Simply listing the highlights of recent events ballooned this page to an inordinate length, so several sections have been collapsed in self-defense. Our research turned up many important examples we’d forgotten, demonstrating their overwhelming numbers. Remarkably, early in the writing, the term “dystopia” in the title above felt like exaggeration; upon later reflection, it reads as understatement.)

Previously we discussed what we called “The Good Old Days,” a period from the early 1980s until the year 2000 or so, and left off just as the “slumbering giants” of BigTech and government had awakened to the power and riches afforded by the Internet. The developments since then have been detrimental to the autonomy of technology users in a number of ways, which we’ll expand on here.

Accordingly, below is a tour of recent information technology (IT) history over the last three decades (~1995–2025), focused on the escalating restrictions on computing freedom and the attendant destruction of privacy. And let’s not overlook a number of smaller but growing pests that have been festering perniciously this century as well.

1990s+, Digital Rights Management (DRM)

As author Cory Doctorow has pointed out, the first major front in the war on computing freedom was fought over copyright and “D.R.M.,” i.e. control of copyrighted media on your computer.

To wrap your mind around this technology, it’s helpful to note that DRM, sometimes called “copy protection,” is about their rights, not yours. As such, it’s been dubbed “Digital Restrictions Management” and defective-by-design by critics, as it often punishes legitimate customers more than it slows criminals. A digital computer is fundamentally a perfect copy machine after all, so the “problem” is ultimately unsolvable unless so-called “treacherous means” [1] are resorted to.

Ok, why was it developed? Facing significant losses to the entertainment-industry business model through emerging technology, the multinational media industries (music, then later TV, video, and film), pushed hard on tech companies to secure PCs from their owners.

While quite unpopular in the computing industry early on, enthusiasm for DRM picked up once the largest technology platforms introduced profit-driven media rental stores. Among the egregious aspects of these stores is that rights to view typically expire. Due to specific time limits or lack of server maintenance, media you thought you had “purchased” (via a “Buy” button) is only a temporary rental license. You’ve been warned.

While the entertainment industry’s perspective on DRM is well understood, and perhaps even sympathized with in principle, we submit that losing control and privacy over mainstream computing devices (now involved in every facet of life!) was simply too high a price to pay.

1998, Digital Millennium Copyright Act (DMCA)

This act implemented legal enforcement of DRM in the US, in concert with the World Intellectual Property Organization (WIPO). It has led to a number of abuses and criticisms, most importantly the:

1990s+, Online Advertising

“If you are not paying for it, you are not the customer;
you’re the product being sold.”
Serra, Schoolman, Various, 1973

While one might convincingly argue advertising is thoroughly overused in the modern world, it at least traditionally served a reasonable purpose; many services would simply not be available without it. Picking up a car magazine and seeing an ad for a new model, or watching a fashion TV show with ads for clothing or perfume, worked well enough and was not really a problem.

There were less defensible tactics of course, such as billboards for example, but at least those just sit there, for the most part anyway.

Finally, we arrive at online advertising, the purveyors of which have decided that they don’t need permission to track and profile the public to any and every extent possible. That such tracking enables backdoors for government and criminals is merely an unfortunate consequence. (See Adtech and Government Surveillance are Often the Same, for an extended discussion.)

2000s+, Malvertising and Spyware

Short for “malicious advertising,” this form of malware is embedded within the online advertisements that appear on websites, including legitimate ones. Devices can be infected without interaction, by exploiting security vulnerabilities in browsers.

You’ve probably seen ads at one time stating, “Your PC is infected! Update antivirus now FREE!”, etc. These are the attacks and scams themselves.

Spyware

“Spyware is any software that employs a user’s internet connection in the background without their knowledge or consent.”—Steve Gibson

The above was an early definition, but in subsequent years the scope has narrowed significantly due to the (now) regrettable ubiquity of such activity. Today, it most often refers to the passing of sensitive data without permission, though some diehards continue to disagree, arguing that any non-consensual data transmission is spyware.

Why does it exist? Profit and advantage are the two main reasons. Hidden and difficult to detect by design, spyware is mostly used for obtaining credentials or financial information, user profiling, tracking movement across the web, providing ads, government espionage, stalking, and more. The applications are endless!

Types include adware, keyloggers, rootkits, and web beacons, and later just plain-old thoughtless modern telemetry by the likes of Google and Microsoft (see below). Notable products and events:

2000s+, Data Brokers

“The unchecked middlemen of surveillance capitalism.”—Wired

While credit scores and reporting companies were established by the mid twentieth-century in many countries, the adjacent industry of the Data Broker has exploded more recently.

These are the folks who buy and sell (and buy and sell) information about you. And they don’t stop at credit history; they’ll gather anything they can find. Without your consent, without your involvement, without any payment to you. That’s right, you don’t even get a cut!

Yes, it’s a cliché at this point: we are undeniably their product. A number of sketchy sites estimate the size of the market at approximately $300 billion USD in 2023 and growing.

The Myth of Anonymization

“There’s really no such thing as anonymized data.”

“Anonymized” records, often mentioned to mollify concerns, offer less protection than they seem. To take just one example, approximately one person in the entire world travels from your home to your job every day. They know the route you take and how fast you drive. Your name may be hidden, but if it can be filled in from other data, does that matter?
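To make this concrete, here is a minimal, invented sketch of how re-identification works: an “anonymized” dataset is joined against a second dataset that still carries names, using quasi-identifiers (here, a home/work zip-code pair) as the key. All names, IDs, and zip codes below are made up for illustration.

```python
# Hypothetical illustration of re-identification: all names, IDs, and
# zip codes below are invented. The "anonymized" dataset hides names but
# keeps quasi-identifiers (home and work zip codes).

anonymized_trips = [
    {"user_id": "a91f", "home_zip": "94110", "work_zip": "94043"},
    {"user_id": "b02c", "home_zip": "60601", "work_zip": "60614"},
]

# A second dataset that still carries names (e.g., a voter roll or a
# leaked HR file) with the same quasi-identifiers.
public_records = [
    {"name": "Alice Example", "home_zip": "94110", "work_zip": "94043"},
    {"name": "Bob Example", "home_zip": "60601", "work_zip": "60614"},
]

def reidentify(trips, records):
    """Match each 'anonymous' record to a name via its (home, work) pair."""
    index = {(r["home_zip"], r["work_zip"]): r["name"] for r in records}
    return {t["user_id"]: index.get((t["home_zip"], t["work_zip"]))
            for t in trips}

print(reidentify(anonymized_trips, public_records))
# {'a91f': 'Alice Example', 'b02c': 'Bob Example'}
```

When the quasi-identifier pair is unique to one person, as a home-to-work commute usually is, the “anonymization” dissolves on contact with outside data.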

2000s+, Data Breaches

What is a “data breach,” you might ask? It means an organization in a position of authority collected data on many individuals, most likely including you, then through incompetence, neglect, or both, failed to secure that data from attackers. Their defenses were “breached,” a term that makes it sound like a “Helm’s Deep”-level battle occurred. In reality, little effort is typically necessary: a doors-open, keys-in-the-ignition, motor-running level of difficulty. And no wonder, because who gives a flying f*** about other people’s data?

Sadly, this section could easily list hundreds of incidents. The main thrust here is that, once collected, large organizations have demonstrated they couldn’t care less about securing your information. It is simply not their problem. (Though they might put up a military-grade icon, checkmarks, and gold stars if their marketing funnel is not converting well enough.)

This is in spite of associated organizations demanding your personal data (PII) to do business with them. Refuse and you may be subject to various denials, restrictions, fees or consequences, especially from the government.

Let’s ponder a few large semi-recent data breaches; any more would simply be exhausting to think about.

Sadly, these make the Sony breaches of 2011–2014 look positively quaint. Here’s a substantial list of large data breaches to ruin your day.

You may be thinking to yourself, “Ok, surely these organizations have learned their lesson.” Nope. Most still don’t care and the frequency of incidents has not been dropping. We don’t deserve this, y’all.

2010s+, Dystopia Arrives

“Romulans… so predictably treacherous.”—Weyoun, DS9 S7E1

If all that wasn’t enough, it gets worse. The BigTech oligopoly is now “all in” on anything that improves the bottom line. Freedom, privacy, and informed consent are now things of the past, absent a gargantuan amount of effort. There’s nowhere left to hide for regular folks, and vanishingly few places even for technology experts. The screws of profitability turn and pick up speed.

(We previously mentioned Cisco’s Great Firewall of China, due to its being early and having geopolitical ramifications.)

Microsoft

Notably, the big company that was first in line to jump in bed with the NSA and China, an enthusiasm that hasn’t wavered. In recent years it and numerous others have had success rebranding their spyware products as “telemetry”:

When Microsoft wants you to do something with your computer, it fights to win. Where does MS want you to go today?

It is their computer after all. /s  Chef’s kiss:

Harnessing a “unique synergy,” Microsoft combines a complete disregard of your privacy with its horrendous security record:

Keep in mind this is just a smattering of quite recent Microsoft history, in case their opinion of the customer and commitment to security isn’t yet clear.


Apple

In contrast, the big company that was last in line to jump in bed with the NSA. A major reason for that was said to be the stubbornness of Steve Jobs himself. Sadly, by 2012 he was no longer around to stick up for us.

The new reality, despite Apple’s reputation, marketing, and occasional achievements, is that one should not have confidence in any significant amount of privacy or security of their devices:


Google

Remember “Don’t be evil”? It’s hard to think of a tech company that has taken a harder turn for the worse than Google: criticism, censorship, and privacy concerns, oh my! Let’s look at some specifics.

Spying, including on children:

Android:

YouTube:

Trusting Google

Still, most folks tend to trust Google. Here are a few of the results of that:


Amazon

Despite its smaller reach, Amazon still manages to hit a dystopian home-run with this one—it makes history rhyme!

Other shitty things Amazon does that affect you and its employees. These doorbells are all over our neighborhood:


Meta
Zuck: Yeah so if you ever need info about anyone at Harvard
Zuck: Just ask.
Zuck: I have over 4,000 emails, pictures, addresses, SMS

Friend: What? How'd you manage that one?

Zuck: People just submitted it.
Zuck: I don't know why.
Zuck: They "trust me"
Zuck: Dumb fucks

It takes a certain kind of organization to result in this very lengthy page on Wikipedia to list common criticisms. The article is approximately 24,000 words! (including references)

Should be enough to get started! Ok, some of the examples are rather old, maybe they’ve finally got their act together. What have they allegedly done, lately?

Sounds geeky, but what does it mean? The current understanding is that if you have their app installed on your phone, it opens a local web service to listen for connections. When you later visit web pages in the browser, Meta’s ubiquitous tracking scripts report your browsing back to that local web service. This technique has the “benefit” of bypassing all existing OS privacy/security measures!
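Under that understanding, the general “localhost relay” pattern can be sketched roughly as follows. This is a simplified, hypothetical model, not Meta’s actual code; the port handling and payload format are invented. An app-side listener receives a report that, in the real scenario, a page’s JavaScript tracking script would send:

```python
# Simplified sketch of the "localhost relay" technique described above.
# A native app listens on a local port; a script embedded in web pages
# reports the visited URL to it, linking browsing activity to the app's
# identity. Port handling and payload format are invented for illustration.

import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

received = []

class RelayHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        received.append(self.path)   # app side: record the reported URL
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):    # silence default request logging
        pass

# The "app" opens an ephemeral local port and serves in the background.
server = HTTPServer(("127.0.0.1", 0), RelayHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Browser side: a tracking script would make the equivalent of this
# request from JavaScript, smuggling the page URL to the local listener.
urlopen(f"http://127.0.0.1:{port}/report?url=example.com/article").read()

server.shutdown()
print(received)  # ['/report?url=example.com/article']
```

Because the request never leaves the device, OS-level network privacy controls and browser sandboxing between sites offer little defense against it.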

2010, The Social Network

Keep in mind that while this film is a dramatization, it has a number of ties to real events. The larger story is notable in that Zuck finds a way to screw everyone that does business with him. “Dumb fucks” indeed.


Vehicle Surveillance

The automotive industry had eyed tech-industry profits for quite a while by this point. So when the market developed enough to implement these devices in cars, auto companies dove in head first, to the extent that buying a new car without a spying package is no longer an option. Subsequently, the Mozilla Foundation investigated new cars. What did they find?

It’s Official: Cars Are the Worst Product Category We Have Ever Reviewed for Privacy

Ah, the wind in your hair, the open road ahead, and not a care in the world… except all the trackers, cameras, microphones, and sensors capturing your every move. Ugh. Modern cars are a privacy nightmare.

All 25 car brands we researched earned our *Privacy Not Included warning label—making cars the official worst category of products for privacy that we have ever reviewed.

The car brands we researched are terrible at privacy and security:

  1. They collect too much personal data (all of them)
  2. Most (84%) share or sell your data
  3. Most (92%) give drivers little to no control over their personal data
  4. We couldn’t confirm whether any of them meet our Minimum Security Standards

Only two manufacturers (Renault and Dacia) received a rating of merely “Bad”, in contrast to the “Terrible” ratings of the rest.

Privacy is an explicit non-goal, and to say security is an afterthought for the automotive industry is an understatement. They simply haven’t had the chops to implement connected technology securely. Remember these?

In January 2023, they published the initial results of their work, an enormous collection of web vulnerabilities affecting Kia, Honda, Infiniti, Nissan, Acura, Mercedes-Benz, Hyundai, Genesis, BMW, Rolls Royce, and Ferrari—all of which they had reported to the automakers.

Even when the system is working properly, they too often can’t bring themselves to do the right thing:

Not much left to add. :-/

Surveillance of Vehicles

Here are stories describing the external recording of vehicles passing through locations such as public motorways or parking garages equipped with automatic license-plate recognition (ALPR) devices.

Over the last decade, thousands of ALPR cameras have appeared in towns and cities across the US. The cameras, which are manufactured by companies such as Motorola and Flock Safety, automatically take pictures when they detect a car passing by. The cameras and databases of collected data are frequently used by police to search for suspects. ALPR cameras can be placed along roads, on the dashboards of cop cars, and even in trucks.

The more modern of these devices also recognize the make, model, and color of vehicles, including unique properties like a spare tire, bumper stickers, dents, a luggage rack, etc.

These devices are almost always installed without sufficient (any) community input, and often without a data retention policy.

But it’s just your car, right? Nope, state governments—not to be outdone:


“Watch out for that [last] step, it’s a doozy!”—Ned Ryerson

Elsewhere

Government

Bi-partisan fear and loathing of Uncle Sam:

On March 20, [2025] President Trump signed an executive order, “Eliminating Information Silos.” The executive order did not attract much attention until it was more recently revealed that the administration was working with tech company Palantir to create a database containing all information collected by all federal agencies, on all US citizens.

Palantir, founded in 2003, has worked on helping government become more efficient at collecting and storing information about US citizens. The company, which was named after the seeing stones from J.R.R. Tolkien’s Lord of the Rings, is one of the first companies to see the potential in the surveillance-industrial complex that developed following 9-11 and the PATRIOT Act. Palantir is literally the creation of the surveillance state since one of its early investors was In-Q-Tel, a venture capital firm controlled by the CIA.
—Ron Paul, “Great Big Ugly Surveillance State”

Meanwhile, countries like Sweden and the Netherlands are making themselves increasingly dependent on systems like BankID, which require an Apple/Google-provisioned mobile device, while phasing out cash. Making government services and daily life dependent on two foreign companies is a bit short-sighted, to say the least.

Media, Life…

The Internet is a surveillance state. Whether we admit it to ourselves or not, and whether we like it or not, we’re being tracked all the time.

One reporter used a tool called Collusion to track who was tracking him; 105 companies tracked his Internet use during one 36-hour period.

If the director of the CIA can’t maintain his privacy on the Internet, we’ve got no hope. —Bruce Schneier

Payroll Data, Anyone?

You probably thought your financial details were private. Not anymore!

Public Facial Recognition

I read Disney’s privacy policy from top to bottom, including all the fine print. All I learned was that it protected them, but it didn’t protect me.
—Janet Vertesi, Data Free Disney

It’s hard to think of a technology more dystopian than facial recognition, gah:


A Final Note on "Bugs"

To wrap up this section on dystopia (feelin’ it yet?), it’s important to note that quite a few of the items in the lists above can be chalked up to unfortunate bugs. That’s certainly understandable to a degree, at least early on. However, when countless privacy violation bugs happen constantly over more than two decades… well, one might be inclined to notice a pattern.

These mainstream systems are not designed well in the first place. Freedom, security, privacy, and consent certainly appear to be on the back burner of concerns for these companies, if they get any attention at all. As some of us are system developers ourselves, we believe many of these bugs should not have even been possible. That is, if ethics (or simply user needs) were taken into account earlier (or at all) in the design process.

2010s+, Addiction & Suicidal Teenagers

Meanwhile, another scourge raised its head in the previous decade. Lots of virtual ink has been spilled on this subject, so we won’t delve too deeply here on the addictiveness of consumer apps and “social media.” While not directly in scope for what we’d like to accomplish, these developments certainly worry us. Fortunately, the Center for Humane Technology (CHT) is on the job.

Unsurprisingly, corporate platforms give us few tools to protect ourselves or our children from the many dangers described previously, or from the addictiveness of their products. The most we can hope for seems to be “Screen Time”-style limits, buggy as they are, but not a permanent break from addictive tactics, surveillance, and targeted and/or inappropriate advertising.

To their credit, Apple did put significant limits on external advertising on iOS a few years back, partly to hinder rivals such as Meta (Facebook/Instagram). We’ll take it, but as mentioned, a complete solution is still needed.

2020s, Complete Corporate Lock-in

Monopolistic and oligopolistic practices abound. Increasingly, one can’t participate in modern life unless blessed by the app-store of Apple|Google and their terms of service—and they like it that way. It would be a cryin’ shame if your account were to get banned, wouldn’t it? :wink:

Likewise, it’s no accident that corporate tools are given freely to schools. Anecdote time: recently one of our students was forced to have accounts at Apple, Microsoft, and Google, in addition to the school’s LMS, in order to do assignments. The bulk of everyday work is done in Google Docs, and as people have become used to it, this sounds perfectly normal. Students are no longer learning how to use programs on their own computers, but rather to be dependent on cloud services.

Meanwhile, school officials gave presentations and lip service to online safety. They also deactivated Screen Time and provided unrestricted YouTube over LTE, completely oblivious to the effects on sanity and grades. It was a setup that also made it impossible to rein in with time limits at the home router. Imagine nightly tugs of war with your student. Joy! :-D

To recap, yes a schoolwork platform provided for free… by the biggest advertising company on the planet, bless their hearts. Imagine it written into a 1990s dystopian sci-fi flick—that would be quite a stretch, wouldn’t it? Would the audience believe it? No way, but now it’s reality!

At this point, we’re definitely in the “Dark Biff Timeline” from Back to the Future II.

Dark Patterns

This one has been around the block a few times, in that sleazy, dishonest companies have been a scourge since perhaps the beginning of business. But restrictive, coercive technology sure helps! Modern user interfaces include dark patterns such as:

Finally, a list of tricks to make you spend more online (nsw.gov.au).

2020s, AI Ethics

I have a Pixel watch, and my main use for it is setting reminders, like “reminder 3pm put the laundry in the dryer.” It’s worked fine since the day I bought it.

Last week, they pushed an update that broke all of the features on the watch unless I agreed to allow Google to train their AI on my content.
thomascgalvin on HN

The ethics of artificial intelligence page at Wikipedia tackles the academic concerns, but how do business implementations affect us?

Rewriting terms of service [1] [2], loosening policies and defaults, eliminating safeguards, destroying copies of books—yes this new field is moving fast! All this to get access to our hot data, and as much as possible, to plug it into models and make money. Once again you are the product, not only on the frontend but the backend too. At least that’s how the current thinking goes.

There are additional challenges materializing such as elimination of jobs, the full ramifications of which remain to be seen, but we digress. This is another area of research for CHT.

Machine Enforced Bias

During a recent Machine Learning class, there was a lecture (#10) on potential pitfalls of its application to real-world problems.

[We’ll] look at cases from medical care and law enforcement that reveal hidden biases in the way data is interpreted.

Basically, if we are not proactive and careful to correct them, biases in the data will result in similar biases in the results. In other words, without intervention, subjects in an unlucky group will be treated more harshly by such a system.
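A toy sketch of this feedback effect, with all numbers invented: a “model” that simply learns each group’s historical flag rate will faithfully reproduce whatever bias produced those labels.

```python
# Toy illustration of bias propagation: a model fit to historically biased
# labels reproduces that bias in its predictions. All numbers are invented.

historical_data = [
    # (group, flagged_by_past_policing)
    ("A", True), ("A", True), ("A", False), ("A", True),
    ("B", False), ("B", False), ("B", True), ("B", False),
]

def fit_rates(data):
    """'Learn' each group's flag rate from the historical labels."""
    totals, flags = {}, {}
    for group, flagged in data:
        totals[group] = totals.get(group, 0) + 1
        flags[group] = flags.get(group, 0) + flagged
    return {g: flags[g] / totals[g] for g in totals}

def predict(rates, group, threshold=0.5):
    """Flag anyone from a group whose historical rate exceeds the threshold."""
    return rates[group] > threshold

rates = fit_rates(historical_data)
print(rates)                # {'A': 0.75, 'B': 0.25}
print(predict(rates, "A"))  # True  -- everyone in group A gets flagged
print(predict(rates, "B"))  # False -- purely because of the past labels
```

No malicious code is required; the bias rides in on the training data and out through the predictions.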

Predictive Law Enforcement

A system for crime prediction has been implemented at the Tel Aviv Airport, though it may have since been shut down.

Despite the system’s enormous potential for human rights abuses, it has been put into police use in secrecy, without public debate and without explicit legislative authorization.

Elsewhere:

Judiciary

“Can you foresee a day, when smart machines, driven with artificial intelligences, will assist with courtroom fact-finding or, more controversially even, judicial decision-making?”

The Chief Justice’s answer was more surprising than the question. “It’s a day that’s here,” he said, “and it’s putting a significant strain on how the judiciary goes about doing things.”

Wrap Up

Whew! This was a “short” summary of the deleterious accomplishments of the last three decades of the IT industry. We were able to dig into a number of acute issues and supplied “multitudinous” links for additional study. Dismally, the story does not slow from here nor are we moving in a more humane direction.

Next: Fiction & Vocabulary