#data brokers
Your car spies on you and rats you out to insurance companies
I'm on tour with my new, nationally bestselling novel The Bezzle! Catch me TOMORROW (Mar 13) in SAN FRANCISCO with ROBIN SLOAN, then Toronto, NYC, Anaheim, and more!
Another characteristically brilliant Kashmir Hill story for The New York Times reveals another characteristically terrible fact about modern life: your car secretly records fine-grained telemetry about your driving and sells it to data-brokers, who sell it to insurers, who use it as a pretext to gouge you on premiums:
https://www.nytimes.com/2024/03/11/technology/carmakers-driver-tracking-insurance.html
Almost every car manufacturer does this: Hyundai, Nissan, Ford, Chrysler, etc etc:
https://www.repairerdrivennews.com/2020/09/09/ford-state-farm-ford-metromile-honda-verisk-among-insurer-oem-telematics-connections/
This is true whether you own or lease the car, and it's separate from the "black box" your insurer might have offered to you in exchange for a discount on your premiums. In other words, even if you say no to the insurer's carrot – a surveillance-based discount – they've got a stick in reserve: buying your nonconsensually harvested data on the open market.
I've always hated that saying, "If you're not paying for the product, you're the product," the reason being that it posits decent treatment as a customer reward program, like the little ramekin of warm nuts first-class passengers get before takeoff. Companies don't treat you well when you pay them. Companies treat you well when they fear the consequences of treating you badly.
Take Apple. The company offers iOS users a one-tap opt-out from commercial surveillance, and more than 96% of users opted out. Presumably, the other 4% were either confused or on Facebook's payroll. Apple – and its army of cultists – insist that this proves that our world's woes can be traced to cheapskate "consumers" who expected to get something for nothing by using advertising-supported products.
But here's the kicker: right after Apple blocked all its rivals from spying on its customers, it began secretly spying on those customers! Apple has a rival surveillance ad network, and even if you opt out of commercial surveillance on your iPhone, Apple still secretly spies on you and uses the data to target you for ads:
https://pluralistic.net/2022/11/14/luxury-surveillance/#liar-liar
Even if you're paying for the product, you're still the product – provided the company can get away with treating you as the product. Apple can absolutely get away with treating you as the product, because it lacks the historical constraints that prevented Apple – and other companies – from treating you as the product.
As I described in my McLuhan lecture on enshittification, tech firms can be constrained by four forces:
I. Competition
II. Regulation
III. Self-help
IV. Labor
https://pluralistic.net/2024/01/30/go-nuts-meine-kerle/#ich-bin-ein-bratapfel
When companies have real competitors – when a sector is composed of dozens or hundreds of roughly evenly matched firms – they have to worry that a maltreated customer might move to a rival. 40 years of antitrust neglect means that corporations were able to buy their way to dominance with predatory mergers and pricing, producing today's inbred, Habsburg capitalism. Apple and Google are a mobile duopoly, Google is a search monopoly, etc. It's not just tech! Every sector looks like this:
https://www.openmarketsinstitute.org/learn/monopoly-by-the-numbers
Eliminating competition doesn't just deprive customers of alternatives, it also empowers corporations. Liberated from "wasteful competition," companies in concentrated industries can extract massive profits. Think of how both Apple and Google have "competitively" arrived at the same 30% app tax on app sales and transactions, a rate that's more than 1,000% higher than the transaction fees extracted by the (bloated, price-gouging) credit-card sector:
https://pluralistic.net/2023/06/07/curatorial-vig/#app-tax
But cartels' power goes beyond the size of their war chests. The real source of a cartel's power is the ease with which a small number of companies can arrive at – and stick to – a common lobbying position. That's where "regulatory capture" comes in: the mobile duopoly has an easier time of capturing its regulators because two companies have an easy time agreeing on how to spend their app-tax billions:
https://pluralistic.net/2022/06/05/regulatory-capture/
Apple – and Google, and Facebook, and your car company – can violate your privacy because they aren't constrained by regulation, just as Uber can violate its drivers' labor rights and Amazon can violate your consumer rights. The tech cartels have captured their regulators and convinced them that the law doesn't apply if it's being broken via an app:
https://pluralistic.net/2023/04/18/cursed-are-the-sausagemakers/#how-the-parties-get-to-yes
In other words, Apple can spy on you because it's allowed to spy on you. America's last consumer privacy law was passed in 1988, and it bans video-store clerks from leaking your VHS rental history. Congress has taken no action on consumer privacy since the Reagan years:
https://www.eff.org/tags/video-privacy-protection-act
But tech has some special enshittification-resistant characteristics. The most important of these is interoperability: the fact that computers are universal digital machines that can run any program. HP can design a printer that rejects third-party ink and charge $10,000/gallon for its own colored water, but someone else can write a program that lets you jailbreak your printer so that it accepts any ink cartridge:
https://www.eff.org/deeplinks/2020/11/ink-stained-wretches-battle-soul-digital-freedom-taking-place-inside-your-printer
Tech companies that contemplated enshittifying their products always had to watch over their shoulders for a rival that might offer a disenshittification tool and use that as a wedge between the company and its customers. If you make your website's ads 20% more obnoxious in anticipation of a 2% increase in gross margins, you have to consider the possibility that 40% of your users will google "how do I block ads?" Because the revenue from a user who blocks ads doesn't stay at 100% of the current levels – it drops to zero, forever (no user ever googles "how do I stop blocking ads?").
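The arithmetic of that gamble is worth making explicit. A minimal sketch, using only the hypothetical figures above (a 2% margin gain from more obnoxious ads, 40% of users responding by blocking ads entirely – illustrative numbers, not measured data):

```python
# Illustrative model of the "more obnoxious ads" gamble described above.
# The 2% gain and 40% blocker figures are the text's hypotheticals.
def revenue_after_enshittification(baseline, margin_gain, blocker_share):
    """Revenue once ads get more obnoxious: users who install an
    ad-blocker contribute zero, forever; the rest yield slightly more."""
    return baseline * (1 - blocker_share) * (1 + margin_gain)

baseline = 100.0  # index: current ad revenue = 100
after = revenue_after_enshittification(baseline, 0.02, 0.40)
print(round(after, 1))  # 61.2 -- a ~39% net revenue loss
```

A 2% upside against a 40% permanent downside is a catastrophic trade – which is exactly why the threat of self-help used to keep this impulse in check.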
The majority of web users are running an ad-blocker:
https://doc.searls.com/2023/11/11/how-is-the-worlds-biggest-boycott-doing/
Web operators made them an offer ("free website in exchange for unlimited surveillance and unfettered intrusions") and they made a counteroffer ("how about 'nah'?"):
https://www.eff.org/deeplinks/2019/07/adblocking-how-about-nah
Here's the thing: reverse-engineering an app – or any other IP-encumbered technology – is a legal minefield. Just decompiling an app exposes you to felony prosecution: a five-year sentence and a $500k fine for violating Section 1201 of the DMCA. But it's not just the DMCA – modern products are surrounded with high-tech tripwires that allow companies to invoke IP law to prevent competitors from augmenting, reconfiguring or adapting their products. When a business says it has "IP," it means that it has arranged its legal affairs to allow it to invoke the power of the state to control its customers, critics and competitors:
https://locusmag.com/2020/09/cory-doctorow-ip/
An "app" is just a web-page skinned in enough IP to make it a crime to add an ad-blocker to it. This is what Jay Freeman calls "felony contempt of business model" and it's everywhere. When companies don't have to worry about users deploying self-help measures to disenshittify their products, they are freed from the constraint that prevents them indulging the impulse to shift value from their customers to themselves.
Apple owes its existence to interoperability – its ability to clone Microsoft Office's file formats for Pages, Numbers and Keynote, which saved the company in the early 2000s – and ever since, it has devoted its existence to making sure no one ever does to Apple what Apple did to Microsoft:
https://www.eff.org/deeplinks/2019/06/adversarial-interoperability-reviving-elegant-weapon-more-civilized-age-slay
Regulatory capture cuts both ways: it's not just about powerful corporations being free to flout the law, it's also about their ability to enlist the law to punish competitors that might constrain their plans for exploiting their workers, customers, suppliers or other stakeholders.
The final historical constraint on tech companies was their own workers. Tech has very low union-density, but that's in part because individual tech workers enjoyed so much bargaining power due to their scarcity. This is why their bosses pampered them with whimsical campuses filled with gourmet cafeterias, fancy gyms and free massages: it allowed tech companies to convince tech workers to work like government mules by flattering them that they were partners on a mission to bring the world to its digital future:
https://pluralistic.net/2023/09/10/the-proletarianization-of-tech-workers/
For tech bosses, this gambit worked well, but failed badly. On the one hand, they were able to get otherwise powerful workers to consent to being "extremely hardcore" by invoking Fobazi Ettarh's spirit of "vocational awe":
https://www.inthelibrarywiththeleadpipe.org/2018/vocational-awe/
On the other hand, when you motivate your workers by appealing to their sense of mission, the downside is that they feel a sense of mission. That means that when you demand that a tech worker enshittify something they missed their mother's funeral to deliver, they will experience a profound sense of moral injury and refuse, and that worker's bargaining power means that they can make it stick.
Or at least, it did. In this era of mass tech layoffs, when Google can fire 12,000 workers after an $80b stock buyback that would have paid their wages for the next 27 years, tech workers are learning that the answer to "I won't do this and you can't make me" is "don't let the door hit you in the ass on the way out" (AKA "sharpen your blades boys"):
https://techcrunch.com/2022/09/29/elon-musk-texts-discovery-twitter/
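The buyback-versus-wages claim above is easy to sanity-check. A rough back-of-envelope calculation, assuming only the figures quoted in the text (an $80b buyback, 12,000 layoffs, 27 years of wages):

```python
# Rough sanity check on the quoted claim: an $80b buyback spread over
# 12,000 laid-off workers for 27 years implies a plausible tech salary.
buyback = 80_000_000_000   # dollars (figure quoted in the text)
workers = 12_000           # laid-off workers (figure quoted in the text)
years = 27                 # years of wages (figure quoted in the text)

implied_annual_wage = buyback / (workers * years)
print(round(implied_annual_wage))  # 246914 -- about $247k/year per worker
```

That is, the buyback implies roughly $247k per worker per year – squarely in the range of a senior tech salary, so the 27-year figure holds up.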
With competition, regulation, self-help and labor cleared away, tech firms – and firms that have wrapped their products around the pluripotently malleable core of digital tech, including automotive makers – are no longer constrained from enshittifying their products.
And that's why your car manufacturer has chosen to spy on you and sell your private information to data-brokers and anyone else who wants it. Not because you didn't pay for the product, so you're the product. It's because they can get away with it.
Cars are enshittified. The dozens of chips that auto makers have shoveled into their car design are only incidentally related to delivering a better product. The primary use for those chips is autoenshittification – access to legal strictures ("IP") that allows them to block modifications and repairs that would interfere with the unfettered abuse of their own customers:
https://pluralistic.net/2023/07/24/rent-to-pwn/#kitt-is-a-demon
The fact that it's a felony to reverse-engineer and modify a car's software opens the floodgates to all kinds of shitty scams. Remember when Bay Staters were voting on a ballot measure to impose right-to-repair obligations on automakers in Massachusetts? The only reason they needed to have the law intervene to make right-to-repair viable is that Big Car has figured out that if it encrypts its diagnostic messages, it can felonize third-party diagnosis of a car, because decrypting the messages violates the DMCA:
https://www.eff.org/deeplinks/2013/11/drm-cars-will-drive-consumers-crazy
Big Car figured out that VIN locking – DRM for engine components and subassemblies – can felonize the production and the installation of third-party spare parts:
https://pluralistic.net/2022/05/08/about-those-kill-switched-ukrainian-tractors/
The fact that you can't legally modify your car means that automakers can go back to their pre-2008 ways, when they transformed themselves into unregulated banks that incidentally manufactured the cars they sold subprime loans for. Subprime auto loans – over $1t worth! – absolutely rely on the fact that borrowers' cars can be remotely controlled by lenders. Miss a payment and your car's stereo turns itself on and blares threatening messages at top volume, which you can't turn off. Break the lease agreement that says you won't drive your car over the county line and it will immobilize itself. Try to change any of this software and you'll commit a felony under Section 1201 of the DMCA:
https://pluralistic.net/2021/04/02/innovation-unlocks-markets/#digital-arm-breakers
Tesla, naturally, has the most advanced anti-features. Long before BMW tried to rent you your seat-heater and Mercedes tried to sell you a monthly subscription to your accelerator pedal, Teslas were demon-haunted nightmare cars. Miss a Tesla payment and the car will immobilize itself and lock you out until the repo man arrives, then it will blare its horn and back itself out of its parking spot. If you "buy" the right to fully charge your car's battery or use the features it came with, you don't own them – they're repossessed when your car changes hands, meaning you get less money on the used market because your car's next owner has to buy these features all over again:
https://pluralistic.net/2023/07/28/edison-not-tesla/#demon-haunted-world
And all this DRM allows your car maker to install spyware that you're not allowed to remove. They really tipped their hand on this when the R2R ballot measure was steaming towards an 80% victory, with wall-to-wall scare ads that revealed that your car collects so much information about you that allowing third parties to access it could lead to your murder (no, really!):
https://pluralistic.net/2020/09/03/rip-david-graeber/#rolling-surveillance-platforms
That's why your car spies on you. Because it can. Because the company that made it lacks constraint, be it market-based, legal, technological or its own workforce's ethics.
One common critique of my enshittification hypothesis is that this is "kind of sensible and normal" because "there’s something off in the consumer mindset that we’ve come to believe that the internet should provide us with amazing products, which bring us joy and happiness and we spend hours of the day on, and should ask nothing back in return":
https://freakonomics.com/podcast/how-to-have-great-conversations/
What this criticism misses is that this isn't the companies bargaining to shift some value from us to them. Enshittification happens when a company can seize all that value, without having to bargain, exploiting law and technology and market power over buyers and sellers to unilaterally alter the way the products and services we rely on work.
A company that doesn't have to fear competitors, regulators, jailbreaking or workers' refusal to enshittify its products doesn't have to bargain, it can take. It's the first lesson they teach you in the Darth Vader MBA: "I am altering the deal. Pray I don't alter it any further":
https://pluralistic.net/2023/10/26/hit-with-a-brick/#graceful-failure
Your car spying on you isn't down to your belief that your carmaker "should provide you with amazing products, which bring you joy and happiness and that you spend hours of the day on, and should ask nothing back in return." It's not because you didn't pay for the product, so now you're the product. It's because they can get away with it.
The consequences of this spying go much further than mere insurance premium hikes, too. Car telemetry sits at the top of the funnel that the unbelievably sleazy data broker industry uses to collect and sell our data. These are the same companies that sell the fact that you visited an abortion clinic to marketers, bounty hunters, advertisers, or vengeful family members pretending to be one of those:
https://pluralistic.net/2022/05/07/safegraph-spies-and-lies/#theres-no-i-in-uterus
Decades of pro-monopoly policy led to widespread regulatory capture. Corporate cartels use the monopoly profits they extract from us to pay for regulatory inaction, allowing them to extract more profits.
But when it comes to privacy, that period of unchecked corporate power might be coming to an end. The lack of privacy regulation is at the root of so many problems that a pro-privacy movement has an unstoppable constituency working in its favor.
At EFF, we call this "privacy first." Whether you're worried about grifters targeting vulnerable people with conspiracy theories, or teens being targeted with media that harms their mental health, or Americans being spied on by foreign governments, or cops using commercial surveillance data to round up protesters, or your car selling your data to insurance companies, passing that long-overdue privacy legislation would turn off the taps for the data powering all these harms:
https://www.eff.org/wp/privacy-first-better-way-address-online-harms
Traditional economics fails because it thinks about markets without thinking about power. Monopolies lead to more than market power: they produce regulatory capture, power over workers, and state capture, which felonizes competition through IP law. The story that our problems stem from the fact that we just don't spend enough money, or buy the wrong products, only makes sense if you willfully ignore the power that corporations exert over our lives. It's nice to think that you can shop your way out of a monopoly, because that's a lot easier than voting your way out of a monopoly, but no matter how many times you vote with your wallet, the cartels that control the market will always win:
https://pluralistic.net/2024/03/05/the-map-is-not-the-territory/#apor-locksmith
Name your price for 18 of my DRM-free ebooks and support the Electronic Frontier Foundation with the Humble Cory Doctorow Bundle.
If you'd like an essay-formatted version of this post to read or share, here's a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:
https://pluralistic.net/2024/03/12/market-failure/#car-wars
Image: Cryteria (modified) https://commons.wikimedia.org/wiki/File:HAL9000.svg
CC BY 3.0 https://creativecommons.org/licenses/by/3.0/deed.en
#pluralistic#if you're not paying for the product you're the product#if you're paying for the product you're the product#cars#automotive#enshittification#technofeudalism#autoenshittification#antifeatures#felony contempt of business model#twiddling#right to repair#privacywashing#apple#lexisnexis#insuretech#surveillance#commercial surveillance#privacy first#data brokers#subprime#kash hill#kashmir hill
it's so funny to think abt how the dystopian levels of surveillance and data collection we are subjected to every day, without consent and sometimes without awareness, are done primarily for the purpose of advertising goods and services to people. targeted ads that so often get blocked and ignored because everyone hates ads.
just... the hilarity of a vast network of machines dedicated to spying on everyone in the world, straight out of the mind of a deranged conspiracy theorist, which exists to let you know that shoes are 10% off at wal-mart, and which doesn't actually make you want to shop at wal-mart
#comedy#humor#data brokers#surveillance capitalism#anti-capitalist#anti consumerism#politics#social commentary#scout's brain
I am NOT going to fucking pay YOU to delete data about me that you took without my persimmons- kill your actual self
#tech speaks#data brokers#privacy#capitalism#incogni#YouTube ads#dystopia#youtube#ai#apple#google#bitches really think they have squatters rights or something ffs#idk
Today, Mozilla Monitor (previously called Firefox Monitor), a free service that notifies you when your email has been part of a breach, announced its new paid subscription service offering: automatic data removal and continuous monitoring of your exposed personal information.
On your behalf, Mozilla Monitor will start with data removal requests, then scan every month to make sure your personal information stays off data broker sites. Monitor Plus will let you know once your personal information has been removed from more than 190 data broker sites.
(embedded YouTube video)
#Aperture#video essay#algorithm#algorithms#Eric Loomis#COMPAS#thought piece#computer#computer program#data#data brokers#targeted ads#data breach#terminal#the silver machine#AI#machine learning#healthcare#tech#technology#profit#Youtube
The FBI, CBP, and other agencies can track your location using WiFi and GPS data, but they rarely know how to do all of this and piece together enough of your location data to get a conviction without a confession. Most of this data is useless without other evidence or a confession – and there's also the straightforward countermeasure of making all of your digital behavior random and unpredictable, so that their machines can't make predictions about you and any agents get a headache trying to understand what you're doing. You can also run multiple phones logged into the same account in different locations, use faraday bags, and run custom encrypted operating systems.
#social engineering#hacking#location data#GPS#WiFi#data spoofing#faraday bags#computers#programming#data#data brokers
Victory! California’s new data broker law will hold data brokers accountable and give us needed control over our data by making it easier to exercise our privacy rights.
Read more about what the new law does here:
#privacy #databrokers #CA
#privacy#data brokers#california#usa#america#law#humanrights#invasion of privacy#privacy rights#ausgov#politas#auspol#tasgov#taspol#australia#fuck neoliberals#neoliberal capitalism#anthony albanese#albanese government#native american#amerikkka#amerika#united states#unitedstateofamerica#class war#eat the rich#eat the fucking rich#fuck the gop#fuck the police#fuck the patriarchy
our anonymity, our right to privacy and therefore our right to PEACE is constantly being stripped away from us
"oh i dont care if google tracks and makes money off my data all they use it for is ads"
if thats you ^ you are FOOLISH to think this
maybe you have nothing to lose? youve never done anything wrong so you have nothing to hide?
but its never been about right and wrong doings. if the right kinds of oppressed people are considered wrong enough, then they will be prosecuted and their identity is more than enough to criminalize them
if you are not concerned its not about you.
the people who are concerned are not paranoid. they are likely one bad law or political movement away from becoming a victim of this terrible system.
and then our only saving grace is anonymity. your voice can and will be used against you if they know who it belongs to
the fingerprint data brokers have on us should be horrifying. the loophole that police dont need a warrant if the information is obtained via a data broker should be horrifying.
by normalizing it you do nothing but help big brother. we are living in 1984 and that is not a meme, its true.
whatever you can do within your own power to help prevent this is better than nothing. just switching browsers from chrome to Firefox is a big step.
learn to protect your privacy. not just for you but for all the rest of us too.
#we need to start caring about this collectively l#i can see how this is playing out and it give horrible precedent to policing to come#please protect yourself#please help protect others#data#online fingerprinting#police#data brokers#right to privacy#privacy#1984
Ah, data brokers, gathering info for the US government.
Data Brokers and the Sale of Americans’ Mental Health Data
The Exchange of Our Most Sensitive Data and What It Means for Personal Privacy
Authored by Joanne Kim,
Sanford School of Public Policy - Duke University
(Full report made available online)
Overview:
This report includes findings from a two-month-long study of data brokers and data on U.S. individuals’ mental health conditions. The report aims to make more transparent the data broker industry and its processes for selling and exchanging mental health data about depressed and anxious individuals. The research is critical as more depressed and anxious individuals utilize personal devices and software-based health-tracking applications (many of which are not protected by the Health Insurance Portability and Accountability Act), often unknowingly putting their sensitive mental health data at risk. This report finds that the industry appears to lack a set of best practices for handling individuals’ mental health data, particularly in the areas of privacy and buyer vetting. It finds that there are data brokers which advertise and are willing and able to sell data concerning Americans’ highly sensitive mental health information. It concludes by arguing that the largely unregulated and black-box nature of the data broker industry, its buying and selling of sensitive mental health data, and the lack of clear consumer privacy protections in the U.S. necessitate a comprehensive federal privacy law or, at the very least, an expansion of HIPAA’s privacy protections alongside bans on the sale of mental health data on the open market.
Key Findings:
Some data brokers are marketing highly sensitive data on individuals’ mental health conditions on the open market, with seemingly minimal vetting of customers and seemingly few controls on the use of purchased data.
26 of the 37 contacted data brokers responded to inquiries about mental health data, and 11 firms were ultimately willing and able to sell the requested mental health data.
Whether this data will be deidentified or aggregated is also often unclear, and many of the studied data brokers at least seem to imply that they have the capabilities to provide identifiable data.
The 10 most engaged data brokers asked about the purpose of the purchase and the intended use cases for the data; however, after receiving that information (verbally or in writing) from the author, those companies did not appear to have additional controls for client management, and there was no indication in emails and phone calls that they had conducted separate background checks to corroborate the author’s (non-deceptive) statements.
The 10 most engaged brokers advertised highly sensitive mental health data on Americans including data on those with depression, attention disorder, insomnia, anxiety, ADHD, and bipolar disorder as well as data on ethnicity, age, gender, zip code, religion, children in the home, marital status, net worth, credit score, date of birth, and single parent status.
Pricing for mental health information varied: one data broker charged $275 for 5,000 aggregated counts of Americans’ mental health records, while other firms charged upwards of $75,000 or $100,000 a year for subscription/licensing access to data that included information on individuals’ mental health conditions.
One company that the author was in contact with depicted itself as an advertising tech firm. The sales representative offered to ask their manager about coordinating, on the author's behalf, a data deal for information from the organizations they advertise for.
Data broker 1 emphasized that the requested data on individuals’ mental health conditions was “extremely restricted” and that their team would need more information on intended use cases—yet continued to send a sample of aggregated, deidentified data counts.
After data broker 1 confirmed that the author was not part of a marketing entity, the sales representative said that as long as the author did not contact the individuals in the dataset, the author could use the data freely.
Data broker 2 implied they may have fully identified patient data, but said they were unable to share this individual-level data due to HIPAA compliance concerns. Instead, the sales representative offered to aggregate the data of interest in a deidentified form.
Data broker 4 was the most willing to sell data on depressed and anxious individuals at the author’s budget price of $2,500 and stated no apparent, restrictive data-use limitations post-purchase.
Data broker 4 advertised highly sensitive mental health data to the author, including names and postal addresses of individuals with depression, bipolar disorder, anxiety issues, panic disorder, cancer, PTSD, OCD, and personality disorder, as well as individuals who have had strokes and data on those people’s races and ethnicities.
Two data brokers, data broker 6 and data broker 9, mentioned nondisclosure agreements (NDAs) in their communications, and data broker 9 indicated that signing an NDA was a prerequisite for obtaining access to information on the data it sells.
Data broker 8 often made unsolicited calls to the author’s personal cell. If the author was delayed in responding to an email from data broker 8, the frequency of calls seemed to increase.
Some brokers imposed data use limitations on the possible sale of people’s mental health information, ranging from “single-use” (which usually pertains to mailing purposes) to “multi-use” (which means the dataset is available for one year after purchase) based on the firm and the product purchased.
Based on an evaluation of privacy policies, data brokers seem collectively less willing to provide access and disclosure to their customers and users about the collection or correction of personal data.
#privacy#privacy rights#data brokers#mental health#personal data#consumer privacy#data resellers#information broker#digital privacy#online safety
The context of the state of internet privacy laws (or lack thereof) make this “TikTok ban” even wilder.
Learn stuff the US govt doesn’t want you to know👇🏾^…^👇🏾
Congress is moving urgently to pass a TikTok ban that nobody asked for while 23 million homes are about to be priced out of affording the internet. 🔍 Affordable Connectivity Program
As a lil cherry on top, Congress has been dragging their feet to stop government surveillance. Our government is buying and selling our personal data from/to foreign adversaries. And yes, it is unconstitutional AF 🔍 Section-702 Foreign Intelligence Surveillance Act
This TikTok ban is the result of a wombo-combo of Sinophobia at the highest rungs of our government, bloated military spending, and excessive lobbying of right wing lawmakers by Facebook (Meta) to eliminate competition. 🔍Targeted Victory “Slap a Teacher” trend
So yeah fuck all this. We need to start demanding real data privacy from our government. Some helpful terms to know when you call your goons.
Data minimization: limit the kind of data collected and for how long
Net Neutrality: protection from Internet Providers selling our browsing information + blocking access to certain sites
Close the Digital Divide: establish affordable, readily available internet in public + private locations to stop internet/data monopolies + digital discrimination.
Regulate Data Brokers: Monitor data vendors and punish irresponsible/ illegal data purchases to protect privacy.
Trust, they don’t want you to know this stuff. Your call to your rep will be 10x spookier if you say any of this👆🏾
so the house of representatives just passed a bill that will now move to the senate to BAN tik tok completely in the united states and they are expected to argue that “national security risks” outweigh the freedom of speech and first amendment rights. biden has already said that if it gets to him, he will sign it. whether or not you use the app…….this is something to be worried about
#ray writes#tik tok#tik tok ban#us politics#ref#data privacy#tiktok#cyberpunk#hope punk#government surveillance#sec 702 fisa#702#ACP#affordable connectivity program#internet access#data brokers#data poisoning#call your reps folks#call your reps#call your senators#Congress#us senate
55K notes
·
View notes
Text
Tech monopolists use their market power to invade your privacy
On SEPTEMBER 24th, I'll be speaking IN PERSON at the BOSTON PUBLIC LIBRARY!
It's easy to greet the FTC's new report on social media privacy, which concludes that tech giants have terrible privacy practices, with a resounding "duh," but that would be a grave mistake.
Much to the disappointment of autocrats and would-be autocrats, administrative agencies like the FTC can't just make rules up. In order to enact policies, regulators have to do their homework: for example, they can do "market studies," which go beyond anything you'd get out of an MBA or Master of Public Policy program, thanks to the agency's legal authority to force companies to reveal their confidential business information.
Market studies are fabulous in their own right. The UK Competition and Markets Authority has a fantastic research group called the Digital Markets Unit that has published some of the most fascinating deep dives into how parts of the tech industry actually function, 400+ page bangers that pierce the Shield of Boringness that tech firms use to hide their operations. I recommend their ad-tech study:
https://www.gov.uk/cma-cases/online-platforms-and-digital-advertising-market-study
In and of themselves, good market studies are powerful things. They expose workings. They inform debate. When they're undertaken by wealthy, powerful countries, they provide enforcement roadmaps for smaller, poorer nations who are being tormented in the same way, by the same companies, that the regulator studied.
But market studies are really just curtain-raisers. After a regulator establishes the facts about a market, they can intervene. They can propose new regulations, and they can impose "conduct remedies" (punishments that restrict corporate behavior) on companies that are cheating.
Now, the stolen, corrupt, illegitimate, extremist, bullshit Supreme Court just made regulation a lot harder. In a case called Loper Bright, SCOTUS killed the longstanding principle of "Chevron deference," which basically meant that when an agency said it had built a factual case to support a regulation, courts should assume they're not lying:
https://jacobin.com/2024/07/scotus-decisions-chevron-immunity-loper
The death of Chevron Deference means that many important regulations – past, present and future – are going to get dragged in front of a judge, most likely one of those Texas MAGA mouth-breathers in the Fifth Circuit, to be neutered or killed. But even so, regulators still have options – they can still impose conduct remedies, which are unaffected by the sabotage of Chevron Deference.
Pre-Loper, post-Loper, and today, the careful, thorough investigation of the facts of how markets operate is the prelude to doing things about how those markets operate. Facts matter. They matter even if there's a change in government, because once the facts are in the public domain, other governments can use them as the basis for action.
Which is why, when the FTC uses its powers to compel disclosures from the largest tech companies in the world, and then assesses those disclosures and concludes that these companies engage in "vast surveillance," in ways that the users don't realize and that these companies "fail to adequately protect users," that matters.
What's more, the Commission concludes that "data abuses can fuel market dominance, and market dominance can, in turn, further enable data abuses and practices that harm consumers." In other words: tech monopolists spy on us in order to achieve and maintain their monopolies, and then they spy on us some more, and that hurts us.
So if you're wondering what kind of action this report is teeing up, I think we can safely say that the FTC believes that there's evidence that the unregulated, rampant practices of the commercial surveillance industry are illegal. First, because commercial surveillance harms us as "consumers." "Consumer welfare" is the one rubric for enforcement that the right-wing economists who hijacked antitrust law in the Reagan era left intact, and here we have the Commission giving us evidence that surveillance hurts us, and that it comes about as a result of monopoly, and that the more companies spy, the stronger their monopolies become.
But the Commission also tees up another kind of enforcement: Section 5, the long (long!) neglected power of the agency to punish companies for "unfair and deceptive methods of competition," a very broad power indeed:
https://pluralistic.net/2023/01/10/the-courage-to-govern/#whos-in-charge
In the study, the Commission shows – pretty convincingly! – that the commercial surveillance sector routinely tricks people who have no idea how their data is being used. Most people don't understand, for example, that the platforms use all kinds of inducements to get web publishers to embed tracking pixels, fonts, analytics beacons, etc. that send user-data back to the Big Tech databases, where it's merged with data from your direct interactions with the company. Likewise, most people don't understand the shadowy data-broker industry, which sells Big Tech gigantic amounts of data harvested by your credit card company, by Bluetooth and wifi monitoring devices on streets and in stores, and by your car. Data-brokers buy this data from anyone who claims to have it, including people who are probably lying, like Nissan, which claims to have records of the smells inside drivers' cars, as well as those drivers' sex-lives:
https://nypost.com/2023/09/06/nissan-kia-collect-data-about-drivers-sexual-activity/
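The tracking-pixel mechanism mentioned above is simple enough to sketch. This is a hypothetical illustration (the endpoint and parameter names are invented, not any platform's real API): a third-party "pixel" is just the URL of a 1x1 image, and the query string smuggles out which page you're reading along with an identifying cookie the moment your browser fetches it.

```python
from urllib.parse import urlencode, urlparse, parse_qs

def tracking_pixel_url(page_url: str, user_id: str) -> str:
    """Build the src for a 1x1 'pixel' image. When the browser fetches it,
    the tracker's server learns which page this user is reading.
    Hypothetical endpoint, for illustration only."""
    params = urlencode({"page": page_url, "uid": user_id})
    return f"https://tracker.example.com/px.gif?{params}"

src = tracking_pixel_url("https://some-news-site.example/article-123", "uid-8675309")
# The publisher embeds: <img src="..." width="1" height="1">
# The image request itself does the reporting; no script is required.
```

The same trick works with hosted fonts and analytics scripts: anything a publisher embeds from a third-party domain generates a request that carries the reader's identity back to that third party.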
Or Cox Communications, which claims that it is secretly recording and transcribing the conversations we have in range of the mics on our speakers, phones, and other IoT devices:
https://www.404media.co/heres-the-pitch-deck-for-active-listening-ad-targeting/
(If there's a kernel of truth to Cox's bullshit, my guess is that they've convinced some of the sleazier "smart TV" companies to secretly turn on their mics, then inflated this into a marketdroid's wet-dream of "we have logged every word uttered by Americans and can use it to target ads.")
Notwithstanding the rampant fraud inside the data brokerage industry, there's no question that some of the data they offer for sale is real, that it's intimate and sensitive, and that the people it's harvested from never consented to its collection. How do you opt out of public facial recognition cameras? "Just don't have a face" isn't a realistic opt-out policy.
And if the public is being deceived about the collection of this data, they're even more in the dark about the way it's used – merged with on-platform usage data and data from apps and the web, then analyzed for the purposes of drawing "inferences" about you and your traits.
What's more, the companies have chaotic, bullshit internal processes for handling your data, which also rise to the level of "deceptive and unfair" conduct. For example, if you send these companies a deletion request for your data, they'll tell you they deleted the data, but actually, they keep it, after "de-identifying" it.
De-identification is a highly theoretical way of sanitizing data by removing the "personal identifiers" from it. In practice, most de-identified data can be quickly re-identified, and nearly all de-identified data can eventually be re-identified:
https://pluralistic.net/2024/03/08/the-fire-of-orodruin/#are-we-the-baddies
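To see why "de-identified" is such a weak promise, consider the classic linkage attack (the technique Latanya Sweeney famously used to re-identify "anonymized" medical records): a dataset with names removed but quasi-identifiers intact – ZIP code, birth date, sex – can be joined against a public dataset, like a voter roll, that has the same fields plus names. The records below are invented for illustration:

```python
# A "de-identified" dataset: names stripped, but quasi-identifiers kept.
deidentified = [
    {"zip": "02138", "birth": "1945-07-21", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "90210", "birth": "1980-01-02", "sex": "M", "diagnosis": "asthma"},
]

# A public dataset (e.g. a voter roll) with the same quasi-identifiers plus names.
public = [
    {"name": "J. Doe", "zip": "02138", "birth": "1945-07-21", "sex": "F"},
    {"name": "R. Roe", "zip": "90210", "birth": "1980-01-02", "sex": "M"},
]

def reidentify(deid_rows, public_rows, keys=("zip", "birth", "sex")):
    """Join the two datasets on the quasi-identifiers. Any row whose
    key combination is unique in the public data is re-identified."""
    index = {tuple(p[k] for k in keys): p["name"] for p in public_rows}
    return [
        {**row, "name": index[tuple(row[k] for k in keys)]}
        for row in deid_rows
        if tuple(row[k] for k in keys) in index
    ]

matches = reidentify(deidentified, public)
```

The attack needs no hacking at all: just two datasets and a join. The more columns a "de-identified" dataset retains, the more likely each row's combination is unique to one person.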
Breaches, re-identification, and weaponization are extraordinarily hard to prevent. In general, we should operate on the assumption that any data that's collected will probably leak, and any data that's retained will almost certainly leak someday. To have even a hope of preventing this, companies have to treat data with enormous care, maintaining detailed logs and conducting regular audits. But the Commission found that the biggest tech companies are extraordinarily sloppy, to the point where "they often could not even identify all the data points they collected or all of the third parties they shared that data with."
This has serious implications for consumer privacy, obviously, but there's also a big national security dimension. Given the recent panic at the prospect that the Chinese government is using Tiktok to spy on Americans, it's pretty amazing that American commercial surveillance has escaped serious Congressional scrutiny.
After all, it would be a simple matter to use the tech platforms' targeting systems to identify and push ads (including ads linking to malicious sites) to Congressional staffers ("under-40s with Political Science college degrees within one mile of Congress") or, say, NORAD personnel ("Air Force enlistees within one mile of Cheyenne Mountain").
Those targeting parameters should be enough to worry Congress, but there's a whole universe of potential characteristics that can be selected, hence the Commission's conclusion that "profound threats to users can occur when targeting occurs based on sensitive categories."
The FTC's findings about the dangers of all this data are timely, given the current wrangle over another antitrust case. In August, a federal court found that Google is a monopolist in search, and that the company used its data lakes to secure and maintain its monopoly.
This kicked off widespread demands for the court to order Google to share its data with competitors in order to erase that competitive advantage. Holy moly is this a bad idea – as the FTC study shows, the data that Google stole from us all is incredibly toxic. Arguing that we can fix the Google problem by sharing that data far and wide is like proposing that we can "solve" the fact that only some countries have nuclear warheads by "democratizing" access to planet-busting bombs:
https://pluralistic.net/2024/08/07/revealed-preferences/#extinguish-v-improve
To address the competitive advantage Google achieved by engaging in the reckless, harmful conduct detailed in this FTC report, we should delete all that data. Sure, that may seem inconceivable, but come on, surely the right amount of toxic, nonconsensually harvested data on the public that should be retained by corporations is zero:
https://pluralistic.net/2024/09/19/just-stop-putting-that-up-your-ass/#harm-reduction
Some people argue that we don't need to share out the data that Google never should have been allowed to collect – it's enough to share out the "inferences" that Google drew from that data, and from other data its other tentacles (Youtube, Android, etc) shoved into its gaping maw, as well as the oceans of data-broker slurry it stirred into the mix.
But as the report finds, the most unethical, least consensual data was "personal information that these systems infer, that was purchased from third parties, or that was derived from users’ and non-users’ activities off of the platform." We gotta delete that, too. Especially that.
A major focus of the report is the way that the platforms handled children's data. Platforms have special obligations when it comes to kids' data, because while Congress has failed to act on consumer privacy, they did bestir themselves to enact a children's privacy law. In 1998, Congress passed the Children's Online Privacy Protection Act (COPPA), which puts strict limits on the collection, retention and processing of data on kids under 13.
Now, there are two ways to think about COPPA. One view is, "if you're not certain that everyone in your data-set is over 13, you shouldn't be collecting or processing their data at all." Another is, "In order to ensure that everyone whose data you're collecting and processing is over 13, you should collect a gigantic amount of data on all of them, including the under-13s, in order to be sure that you're not collecting under-13s' data." That second approach would be ironically self-defeating, obviously, though it's one that's gaining traction around the world and in state legislatures, as "age verification" laws find legislative support.
The platforms, meanwhile, found a third, even stupider approach: rather than collecting nothing because they can't verify ages, or collecting everything to verify ages, they collect everything, but make you click a box that says, "I'm over 13":
https://pluralistic.net/2023/04/09/how-to-make-a-child-safe-tiktok/
It will not surprise you to learn that many children under 13 have figured out that they can click the "I'm over 13" box and go on their merry way. It won't surprise you, but apparently, it will surprise the hell out of the platforms, who claimed that they had zero underage users on the basis that everyone has to click the "I'm over 13" box to get an account on the service.
By failing to pass comprehensive privacy legislation for 36 years (and counting), Congress delegated privacy protection to self-regulation by the companies themselves. They've been marking their own homework, and now, thanks to the FTC's power to compel disclosures, we can say for certain that the platforms cheat.
No surprise that the FTC's top recommendation is for Congress to pass a new privacy law. But they've got other, eminently sensible recommendations, like requiring the companies to do a better job of protecting their users' data: collect less, store less, delete it after use, stop combining data from their various lines of business, and stop sharing data with third parties.
Remember, the FTC has broad powers to order "conduct remedies" like this, and these are largely unaffected by the Supreme Court's "Chevron deference" decision in Loper Bright.
The FTC says that privacy policies should be "clear, simple, and easily understood," and says that ad-targeting should be severely restricted. They want clearer consent for data inferences (including AI), and they want companies to monitor their own processes with regular, stringent audits.
They also have recommendations for competition regulators – remember, the Biden administration has a "whole of government" antitrust approach that asks every agency to use its power to break up corporate concentration:
https://www.eff.org/deeplinks/2021/08/party-its-1979-og-antitrust-back-baby
They say that competition enforcers should factor in the privacy implications of proposed mergers, and should think about how promoting privacy could also promote competition (in other words, if Google's stolen data helped it secure a monopoly, then making the company delete that data will weaken its market power).
I understand the reflex to greet a report like this with cheap cynicism, but that's a mistake. There's a difference between "everybody knows" that tech is screwing us on privacy, and "a federal agency has concluded" that this is true. These market studies make a difference – if you doubt it, consider for a moment that Cigna is suing the FTC for releasing a landmark market study showing how its Express Scripts division has used its monopoly power to jack up the price of prescription drugs:
https://www.fiercehealthcare.com/payers/express-scripts-files-suit-against-ftc-demands-retraction-report-pbm-industry
Big business is shit-scared of this kind of research by federal agencies – if they think this threatens their power, why shouldn't we take them at their word?
This report is a milestone, and – as with the UK Competition and Markets Authority reports – it's a banger. Even after Loper Bright, this report can form the factual foundation for muscular conduct remedies that will limit what the largest tech companies can do.
But without privacy law, the data brokerages that feed the tech giants will be largely unaffected. True, the Consumer Finance Protection Bureau is doing some good work at the margins here:
https://pluralistic.net/2023/08/16/the-second-best-time-is-now/#the-point-of-a-system-is-what-it-does
But we need to do more than curb the worst excesses of the largest data-brokers. We need to kill this sector, and to do that, Congress has to act:
https://pluralistic.net/2023/12/06/privacy-first/#but-not-just-privacy
The paperback edition of The Lost Cause, my nationally bestselling, hopeful solarpunk novel is out this month!
If you'd like an essay-formatted version of this post to read or share, here's a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:
https://pluralistic.net/2024/09/20/water-also-wet/#marking-their-own-homework
Image: Cryteria (modified) https://commons.wikimedia.org/wiki/File:HAL9000.svg
CC BY 3.0 https://creativecommons.org/licenses/by/3.0/deed.en
#pluralistic#coppa#privacy first#ftc#section 5 of the ftc act#privacy#consumer privacy#big tech#antitrust#monopolies#data brokers#radium suppositories#commercial surveillance#surveillance#google#a look behind the screens
231 notes
Text
“If you’re not the one paying, you’re not the customer.”
Twitter post: Andrew Lewis @andlewis “If you are not paying for it, you're not the customer; you're the product being sold.” 9:01 AM · Sep 13, 2010, 18 replies, 428 retweets, 293 hearts, 21 bookmarks.
We’ve known this for a long time. And I still notice people recommending “free” resources, as if it being free, or open source, makes it necessarily trustworthy. There’s absolutely no guarantee of that. Whether a product or service is inherently good or bad or safe or unsafe is entirely independent from how it’s funded or who pays for it or doesn’t. Obviously how something is funded is an extra layer to consider regarding ethical concerns. But you can have an open source recipe using humanely sourced poison that’s deadly.
It’s really problematic that so often good information is behind paywalls, and disinformation is of course made free, for maximum spread for deceit. But often some “free” stuff is actually a lure to retrieve information from people, and people’s personal information is extremely valuable, especially that identifies someone as part of a particular target market. Social engineering attacks also use the watering hole strategy - seeking people out where they are likely to be with a lure they’re likely to take.
#social media#online marketing#target marketing#free#you're the product#personal information#data brokers#paywalls#information#infosec#information security#cybersecurity#deceit#algorithms#andrew lewis#ethics#humanely sourced poison#open source
0 notes
Text
CFPB Takes Aim at Data Brokers in Proposed Rule Amending FCRA
On December 3, the CFPB announced a proposed rule to enhance oversight of data brokers that handle consumers’ sensitive personal and financial information. The proposed rule would amend Regulation V, which implements the Fair Credit Reporting Act (FCRA), to require data brokers to comply with credit bureau-style regulations under FCRA if they sell income data or certain other financial…
#AI#Artificial Intelligence#CFPB#consent#Consumer Financial Protection Bureau#CRA#credit history#credit score#data brokers#debt payments#Disclosure#Fair Credit Reporting Act#FCRA#financial information#personal information#privacy protection#Regulation V
1 note
Text
SOCIAL SECURITY NUMBER RULES COULD BE CHANGED
In response to widespread data breaches exposing millions of Social Security numbers, the Consumer Financial Protection Bureau (CFPB) is proposing new regulations to strengthen consumer protections. These changes aim to classify certain data brokers as consumer reporting agencies, subjecting them to the same rules as credit bureaus under the Fair Credit Reporting Act (FCRA). This move seeks to…
0 notes
Text
Privacy Risks for Women Seeking Out-of-State Care
In this episode of Scam DamNation, host Lillian Cauldwell introduces an old scam still operating in the United States that targets women: personal information bought with a credit card is used to track women who visit abortion clinics back across state lines to their places of residence, and nothing is being done about it. Senator Ron Wyden wrote an article in which he states…
#Abortion Clinics#AI Scams#Cell Phone#Credit Card Tracking's#Data Breach#Data Brokers#Lillian Cauldwell#Privacy Risks#Scam DamNation#Scams#Senator Ron Wyden#women health#Women Rights
0 notes