#it was a misunderstanding about certain data they will be deleting with the google account. probably referring to things like watch history
risingsunresistance · 2 years ago
Note
hi! i wanted to ask if you know anyone/anywhere that archived techno’s twitter account? my own inactive account was deleted a week or so ago, per the new rules, and thinking about the day his gets removed is really nerve-wracking.
i don't know of anyone specifically that has done that, but there are plenty of archives on the wayback machine. here's one from 2022 which includes the last tweet made on his account, from technodad
that being said, i wouldn't worry too much about his account since his dad HAS logged into it. it's likely that someone will tell him about this new policy and he'll continue to log into it. if he doesn't though, it exists on the wayback at the very least. maybe you can save your own backup from there
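(a minimal sketch, not from the original post: assuming python's standard library, this is one way to save your own local copy of an archived page from the wayback machine. the handle and year in the url are placeholders i'm guessing at, not a confirmed archive link.)

import urllib.request

# placeholder archive address: "2022" asks the wayback machine for the capture
# closest to that year; the twitter handle is assumed, not confirmed
ARCHIVE_URL = "https://web.archive.org/web/2022/https://twitter.com/Technothepig"

# download the archived page and write it to a local html file as a backup
with urllib.request.urlopen(ARCHIVE_URL) as response:
    html = response.read().decode("utf-8", errors="replace")

with open("techno_twitter_backup.html", "w", encoding="utf-8") as backup_file:
    backup_file.write(html)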
tie--up--loose--ends · 2 years ago
Text
And before I go, I need to clear some things up.
(the shorter version)
I spent the whole of 2022 fighting the consequences of my involvement with the writing side of the fandom.
Then, because of a popular writer who felt entitled to my work, and a POLITE ask I sent them not to use my DELETED content (gifs), some additional drama took place, and some unpleasant stuff happened.
Anyways.
I used to have a blog under the name agentalpha. I can't use that name anymore, because it reminds me of that fiasco, and all my old stuff is marked with it, so....
I decided to speak because:
I developed depression and PTSD as a side effect....and needed counselling to get better.
I had never been diagnosed with depression before that, so it was a side effect, as they told me.
And I think people need to be more mindful towards others, because you never know who is behind the screen. Be kind to all, not only your friends.
I wish people were able to understand that common courtesy rules and mutual respect exist for a reason.
-----
And before I go, I need to clear some things up.
Because I always adhere to common courtesy rules and online netiquette, and I find it very...unpleasant when others do not, and then don't even consider it.
HERE ARE SOME THINGS I WANT TO MAKE VERY CLEAR ABOUT ME, after NO ONE had the common courtesy to behave like an adult and .... ask what's what.
1. I DON'T do HATE / OFFENSE
AND I WILL TELL YOU WHY I DON'T .
I have been writing fiction since 2007. I HAVE DEALT WITH HATERS.
I WON'T DO THAT TO ANOTHER HUMAN BEING.
I'm also too old for petty dramas and childish stuff like that.
THE ASK: here is a screenshot of the ask I sent.
The one that led to everything else.
See anything offensive?
NO. Because there is NOTHING OFFENSIVE.
[Image: screenshot of the ask]
* There was NO OFFENSE in any of my other posts either.
And that's because I do NOT DO OFFENSE/HATE.
Simple as that.
I prefer common courtesy. Thank you.
2. I NEVER interact with fiction that is not to my taste. Not in any way.
I keep to myself and I only interact with fics I like.
That's about it.
Here is some more:
PERSONAL BOUNDARIES STATED POLITELY AREN'T OFFENSE
I COULD SLIP INTO AN ANXIETY ATTACK if I interact with certain content. And I definitely love my ability to breathe.
I can read almost anything, except things that contain certain forms of abusive behaviour (including consent stuff).
I DON'T HAVE PROBLEMS WITH PEOPLE'S PERSONAL LIKES - kinks, and personal preferences included.
No matter how exotic they might be.
I have my own likes that might not be to someone else's taste.
And I'm NO double-faced a$$. Would you kindly not make me one.
What people like is none of my business.
It's not my business what they write on their blogs either.
Unless they think they are better than everyone else and have more rights than others.
--
So would you kindly, before listening to gossip or whatever about someone else, FIRST go CHECK FOR YOURSELF WHAT IT IS, ASK THE PERSON YOURSELF WHAT'S GOING ON, AND MAKE YOUR CONCLUSIONS BASED ON THEIR ANSWER.
That's the right thing to do.
And do NOT DUMP AND ABANDON PEOPLE WHO SUPPORT YOU AND LIKE YOU AND YOUR CONTENT, BASED ON A MISUNDERSTANDING, INCOMPLETE DATA, AND (MAYBE) BAD-MOUTHING BY SOMEONE ELSE.
(Tip: One person's popularity should not be your criterion for taking action against someone else who is not popular.)
------
And before you read any of the posts here, LET ME TELL YOU SOMETHING MORE ABOUT ME.
Things I don't usually mention, but now I need to, because no one cares about common courtesy, mutual respect, and other people's limitations:
I have social anxiety and rejection sensitive dysphoria (google it)
I get sensory overload from too much online interaction, especially in online groups
I have a hard time reaching out to others, even for friendships - online and otherwise
And after all that, with no one willing to ask or care....well, you can guess for yourself
----
That's me being done. After I say what I have to say, I will be on my way.
I won't make other posts but I need to close accounts and tie up loose ends.
Those are NOT HATE posts.
Excuse the cynical undertones.
#fandoms#common courtesy IS A THING#people are different but mutual respect goes a long way#toxic behaviours#personal story of why I left a fandom#I'm not trying that again#pedro pascals fandom#pedro pascal fans#There are good people there too thank you for the few kind words#pedro pascal#fanfiction… 
cyberwitcher-blog · 7 years ago
Text
Privacy hygiene
Rapid technological development has supplemented our conventional way of life with a 24/7 online life and moved us from reading information to constant “screening” from displays. Unfortunately, these “advances” also come with many risks. For this reason, people need basic knowledge of the digital environment and should observe straightforward, simple rules that can ensure the safe use of digital tools and mitigate certain risks.
To put it simply, these rules constitute a kind of “privacy hygiene”. Basically, to plan and understand how to apply this “treatment”, the recommended first step is to revisit and perform due diligence on all websites and applications that have been used or that hold the user’s personal data. Then, depending on the findings, a user can exercise the rights provided by the applicable data protection legislation, or clean up and delete already-disclosed information and registered accounts. Many interesting nuances and frightening facts may be revealed when the whole picture of a user’s digital life is examined.
However, by adjusting a few digital habits you can avoid future stress, uncertainty, and misunderstanding about your digital footprint.
1. Check your privacy settings regularly. It may seem obvious, but (i) these settings occasionally change; (ii) new opt-out options may appear; and (iii) a user’s privacy attitudes may shift over time, so revisiting the available settings is a prompt to experiment with different options.
2. Run searches for your name on different search engines (DuckDuckGo, Bing, Google, etc.). Reflect on what you find and how it represents you.
3. Don’t post anything stupid and/or overly revealing under your real name on social media.
4. Check your phone settings. Many applications get access to the camera and microphone, as well as location data.
5. Avoid corresponding with internet trolls. Sometimes even users with a “cold head” can be provoked into disclosing information they will later regret.
6. Remove low-quality and out-of-date content, and detag yourself. Regularly revisit and evaluate the information you have previously disclosed.
7. Think about how your data may be used. The premortem method, developed by American research psychologist Gary Klein, can help identify risks at the outset. For instance, while on vacation, think about how a social network post with a location check-in could be used by burglars in your home city.
8. Respect the privacy of other people. Remember that in addition to legal rights, you have a set of obligations to comply with.
9. Be aware of technical solutions and advice (a short sketch follows this list): a) install an anti-virus scanner; b) keep your software updated (for security purposes); c) use a strong password and two-factor authentication for important things; d) use a password manager for less important things; e) use software to block trackers (there are already plenty of solutions on the market).
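A minimal sketch (not part of the original post), assuming Python and only its standard library, of the “strong password” advice in point 9c; the function name and password length are illustrative choices:

import secrets
import string

def generate_password(length: int = 20) -> str:
    """Return a random password built from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

if __name__ == "__main__":
    print(generate_password())

A password generated this way is best kept in the password manager mentioned in point 9d rather than memorised.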
CyberWitcher (c)
neptunecreek · 5 years ago
Text
The Key to Safety Online Is User Empowerment, Not Censorship
The Senate Judiciary Committee recently held a hearing on “Protecting Digital Innocence.” The hearing covered a range of problems facing young people on the Internet today, with a focus on harmful content and privacy-invasive data practices by tech companies. While children do face problems online, some committee members seemed bent on using those problems as an excuse to censor the Internet and undermine the legal protections for free expression that we all rely on, including kids.
Don’t Censor Users; Empower Them to Choose
Though tech companies weren’t represented in the hearing, senators offered plenty of suggestions about how those companies ought to make their services safer for children. Sen. John Kennedy suggested that online platforms should protect children by scanning for “any pictures of human genitalia.”
It’s foolish to think that one set of standards would be appropriate for all children, let alone all Internet users.
Sen. Kennedy’s idea is a good example of how lawmakers sometimes misunderstand the complexity of modern-day platform moderation, and the extreme difficulty of getting it right at scale. Many online platforms do voluntarily use automated filters, human reviewers, or both to snoop out nudity, pornography, or other speech that companies deem inappropriate. But those measures often bring unintended consequences that reach much further than whatever problems the rules were intended to address. Instagram deleted one painter’s profile until the company realized the absurdity of this aggressive application of its ban on nudity. When Tumblr employed automated filters to censor nudity, it accidentally removed hundreds of completely “safe for work” images.
The problem gets worse when lawmakers attempt to legislate what they consider good content moderation. In the wake of last year’s Internet censorship law SESTA-FOSTA, online platforms were faced with an awful choice: err on the side of extreme prudishness in their moderation policies or face the risk of overwhelming liability for their users’ speech. Facebook broadened its sexual solicitation policy to the point that it could feasibly justify removing discussion of sex altogether. Craigslist removed its dating section entirely.
Legislation to “protect” children from harmful material on the Internet will likely bring similar collateral damage for free speech: when lawmakers give online platforms the impossible task of ensuring that every post meets a certain standard, those companies have little choice but to over-censor.
During the hearing, Stephen Balkam of the Family Online Safety Institute provided an astute counterpoint to the calls for a more highly filtered Internet, calling to move the discussion “from protection to empowerment.” In other words, tech companies ought to give users more control over their online experience rather than forcing all of their users into an increasingly sanitized web. We agree.
It’s foolish to think that one set of standards would be appropriate for all children, let alone all Internet users. But today, social media companies frequently make censorship decisions that affect everyone. Instead, companies should empower users to make their own decisions about what they see online by letting them calibrate and customize the content filtering methods those companies use. Furthermore, tech and media companies shouldn’t abuse copyright and other laws to prevent third parties from offering customization options to people who want them.
Congress and Government Must Do More to Fight Unfair Data-Collection Practices
Like all Internet users, kids are often at the mercy of companies’ privacy-invasive data practices, and often have no reasonable opportunity to opt out of collection, use, and sharing of their data. Congress should closely examine companies whose business models rely on collecting, using, and selling children’s personal information.
Some of the proposals floated during the hearing for protecting young Internet users’ privacy were well-intentioned but difficult to implement. Georgetown Law professor Angela Campbell suggested that platforms move all “child-directed” material to a separate website without behavioral data collection and related targeted advertising. Platforms must take measures to put all users in charge of how their data is collected, used, and shared—including children—but cleanly separating material directed at adults and children isn’t easy. It would be awful if a measure designed to protect young Internet users’ privacy made it harder for them to access materials on sensitive issues like sexual health and abuse. A two-tiered Internet undermines the very types of speech for which young Internet users most need privacy.
We do agree with Campbell that enforcement of existing children’s privacy laws must be a priority. As we’ve argued in the student privacy context, the Federal Trade Commission (FTC) should better enforce the Children’s Online Privacy Protection Act (COPPA), the law that requires websites and online services that are directed to children under 13 or have actual knowledge that a user is under 13 to obtain parental consent before collecting personal information from children for commercial purposes. The Department of Education should better enforce the Family Educational Rights and Privacy Act (FERPA), which generally prohibits schools that receive federal funding from sharing student information without parental consent.
EFF’s student privacy project catalogues the frustrations that students, parents, and other stakeholders have when it comes to student privacy. In particular, we’ve highlighted numerous examples of students effectively being forced to share data with Google through the free or low-cost cloud services and Chromebooks it provides to cash-strapped schools. We filed a complaint with the FTC in 2015 asking it to investigate Google’s student data practices, but the agency never responded. Sen. Marsha Blackburn cited our FTC complaint against Google as an example of the FTC’s failure to protect children’s privacy: “They go in, they scoop the data, they track, they follow, and they’ve got that virtual you of that child.”
While Google has made some progress since 2015, Congress should still investigate whether the relevant regulatory agencies are falling down on the job when it comes to protecting student privacy. Congress should also explore ways to ensure that users can make informed decisions about how their data is collected, used, and shared. Most importantly, Congress should pass comprehensive consumer privacy legislation that empowers users and families to bring their own lawsuits against the companies that violate their privacy rights.
Undermining Section 230 Won’t Improve Companies’ Practices
At the end of the hearing, Sen. Lindsey Graham (R-SC) turned the discussion to Section 230, the law that shields online platforms, services, and users from liability for most speech created by others. Sen. Graham called Section 230 the “elephant in the room,” suggesting that Congress use the law as leverage to force tech companies to change their practices: “We come up with best business practices, and if you meet those business practices you have a safe haven from liability, and if you don’t, you’re going to get sued.” He followed his comments with a Twitter thread claiming that kneecapping liability protections is “the best way to get social media companies to do better in this area.”
Don’t be surprised if the big tech companies fail to put up a fight against these proposals.
Sen. Graham didn’t go into detail about what “business practices” Congress should mandate, but regardless, he ought to rethink the approach of threatening to weaken Section 230. Google and Facebook are more willing to bargain away the protections of Section 230 than their smaller competitors. Nearly every major Internet company endorsed SESTA-FOSTA, a bill that made it far more difficult for small Internet startups to unseat the big players. Sen. Josh Hawley’s bill to address supposed political bias in content moderation makes the same mistake, giving more power to the large social media companies it’s intended to punish. Don’t be surprised if the big tech companies fail to put up a fight against these proposals: the day after the hearing, IBM announced support for further weakening Section 230, just like it did last time around.
More erosion of Section 230 won’t necessarily hurt big Internet companies, but it will hurt users. Under a compromised Section 230, online platforms would be incentivized to over-censor users’ speech. When platforms choose to err on the side of censorship, marginalized voices are the first to disappear.
Congress Must Consider Unintended Consequences
The problems facing young people online are complicated, and it’s essential that lawmakers carefully consider the unintended consequences of any legislation in this area.
Companies ought to help users and families customize online services for their own needs. But congressional attempts to legislate solutions to harmful Internet content by forcing companies to patrol users’ speech are fraught with the potential for collateral damage (and would likely be unconstitutional). We understand Congress’ desire to hold large Internet companies accountable, but it shouldn’t pass laws to make the Internet a more restrictive place.
At the same time, Congress does have an historic opportunity to help protect children and adults from invasive, unfair data-collection and advertising practices, both by passing strong consumer privacy legislation and by demanding that the government do more to enforce existing privacy laws.
from Deeplinks https://ift.tt/2Zb2VJK