mercifullymad · 1 year ago
Interesting update to my previous posts urging people to call their senators and oppose the Kids Online Safety Act (KOSA): KOSA is "essentially copied" directly from legislation created by a British Baroness and film director who directed, among other things, the second Bridget Jones movie.
Baroness Beeban Kidron, who has been successful at pushing various online restriction bills through British Parliament, is partnering with U.S. politicians championing similar causes. I very much agree with her stance that it is the responsibility of platforms, rather than parents or kids, to ensure their product is safe for users of all ages. However, the Electronic Frontier Foundation opposes the too-vague language used in one of the U.S. bills copied off of Kidron's bills, the California Age-Appropriate Design Code Act (CAADCA), which has already been signed into law and will go into effect in 2024. CAADCA, "the first [law] of its kind anywhere in the U.S.," says it will "protect children from seeing posts about self-harm or from predatory digital advertisers."
The same problem occurs over and over again with these children's digital "protection" bills — the language is simply too vague, and vague enough to be implemented dangerously by either social media companies or Republicans using these bills to block children from seeing queer or race-conscious content. Aside from those more obvious problems, we also must think seriously and deeply about what good it does to stop everyone under 18 (as CAADCA defines a "child," with no differing restrictions for different ages under this cut-off) from seeing any content about self harm, suicide, eating disorders, and other forms of "self-destructive" conduct that are imagined to be highly transmissible by the sane public. Why do these bills prevent children from seeing content about "self-destructive" behavior but not "other-destructive" behavior? Why do they focus more on preventing access to content about suicide rather than content glorifying gun culture, or content about eating disorders rather than content about white supremacy, sexism, transphobia, and fatphobia (some of the biggest forces causing/encouraging eating disorders)?
These bills want children to self-manage their reactions to living in a dangerous and unjust world in "proper" ways, without actually moving the conditions of the world closer toward safety and justice. And as usual, the target audience of these bills is children who are imagined to be not-mad — children who are not already contemplating suicide or hurting themselves, but who could be "corrupted" into doing so by nefarious forces on the internet. Meanwhile, already-mad children get psych-warded or locked out of the care they need.
There's a fine line to walk here, because I am extremely pro-regulating Big Tech and pro-digital privacy/rights, and I don't want to come off as if I'm not. But much of the motivation for these bills comes not so much from genuine concern for digital privacy as from overextended paternalism toward (mad) children and, even more generally, an unexamined discomfort with changing norms for social interaction in a digital world. Kidron recounts that she initially got into this sphere of activism because she once walked into a room to find a bunch of teens "all just looking at their screens," prompting her to make a documentary about teens' digital behaviors. Rather than only looking at how much time children are spending on screens and what they're doing on them, we must first look at why they're using this technology in these ways. The why of it surely holds more answers about what we can do to make the world more livable for them.