#quiz auction fun politics game
[The Keihancarl Diaries: June 2-3, 2018]
All right, this is going to be an awesome weekend! And best of all, it's right near my place.
Of course, there are some things I wanted to do first before proceeding to the main part of a really awesome weekend, the Pinoy Otaku Festival: Ai 2018.
To be honest, I'm still nervous about going to an otaku event all by myself, just like the last time. Being the shy, introverted person that I am, I somehow managed to catch most of the event happenings, though I still missed a few of them.
June 2nd
I decided on my favorite black fleece coat, my brown polo shirt, black fingerless gloves, and a pair of red socks. As much as I wanted to wear an anime shirt to an otaku event, I resisted the urge since I'm not really keen on wearing all black like the last time (all of my anime shirts are black).
I left home early, but I decided to come to the otaku event later. For the next few hours I dealt with light to moderate rainfall, but luckily I didn't get wet.
Anyway, let's start with the usual mall-hopping fare and a sip of caramel frappe at my favorite milk tea shop in Lagro. I'll definitely need to calm down first since I tend to get overexcited at the thought of going to an otaku event for the second time.
I arrived at Fairview Terraces at exactly 2:00 PM. By the time I got there, a panel discussion, Anime vs. Manga vs. Visual Novel vs. Light Novel, was about to start. The panelists were asked about the first anime series they watched, as well as the first manga series, visual novel, or light novel they read. They also discussed the pros and cons of manga series adapted into anime, debating whether an anime adaptation can be better than the original manga and vice versa (if I recall correctly).
There was a cute intermission number with OO-kun and Chu Chu, plus highlights of some upcoming anime shows on Animax (POF: Ai 2018's official media partner), some performances and other intermission numbers, another panel discussion, May Forever Ba Sa Cosplay (English: Is There Love In Cosplay?), and a game show, Anime Game Challenge. As for the latter, groups of contestants answered questions from three categories: guessing the anime character, naming the anime theme song being played, and identifying the seiyū (Japanese voice actor) of the characters featured in a series of different anime clips.
There was also a singing game show The Otaku Singing Bee and another panel discussion about the issues within the cosplay community (Issues In Coscom).
The Otaku Singing Bee featured theme songs from older anime shows like D.N.Angel, Yu Yu Hakusho (Ghost Fighter), Fushigi Yuugi, Inuyasha, and Fullmetal Alchemist.
For the first part of the game show, the contestants had to rearrange the words that appeared on the screen, which were actually part of the lyrics that would follow soon afterward. In the second part, they had to guess the missing part of the song's lyrics. One guy really nailed the last part of Ghost Fighter's theme song; too bad I never got to record it on video. It was really fun, and some of the audience couldn't resist singing along to the opening theme songs, even when that meant nearly giving away the correct answer.
As for the latter, the panel discussion tackled the issues cosplayers face in the industry, like bullying/cyberbullying and crab mentality within the cosplay community.
There was the announcement of those who would advance to the final round, and with that the first day of POF: Ai 2018 was over. I managed to upload some of the pics and videos, including selfies with some of the cosplayers.
June 3rd
Same outfit, but with a yellow shirt and striped white/black/yellow socks. I arrived at around 2:00 PM, around the same time as yesterday.
The second day of the event had singing competitions, cosplay competitions, and the battle of the bands, along with some performances, games, stand-up comedy, auctions, and panel discussions. I took some additional selfies with the cosplayers too, and some of the pics were uploaded to the event's FB page as comments on the cosplayers' posts.
I did catch Who Wants To Be An Otaku, an anime-oriented quiz game, and the panel discussion on seiyūs, or Japanese voice actors. There was also the Cosplay Idol Finals, but I didn't get to see much of it since I decided to take some time off from the event for a nice stroll in the adjacent malls and to check out some of the shops. I headed back to the activity center an hour and a half later.
It was already 6:00 PM and I was starting to feel hungry, so I immediately headed to Yoshinoya on the mall's second level. I decided on a beef and mushroom bowl, which actually tasted like corned beef with mushrooms. It was okay. Cosplay Xtra had already started by the time I headed back to the activity center.
I then decided to have a sip of fruit tea. Come to think of it, CoCo has had branches in SM Fairview and Fairview Terraces for a long time now, but I had yet to try their drinks. And so I did, ordering a Passion Fruit Tea Burst. I had to wait about 20-30 minutes before my drink was served, since the line was pretty long and there were a lot of drinks ahead of mine.
It was worth the wait, though I was wondering whether the crunchy, tart seeds from the passion fruit were edible. I eventually looked it up, and they are indeed edible. I finished the drink inside the mall.
And then there was the final part of the event, the Battle Of The Bands. Eleven bands participated, each performing a cover of their chosen anime theme song plus either a cover of an OPM song or an original composition. In between performances, there was an auction of stuffed plushies and other items while the next band set up. By the time the eighth or ninth band was about to perform, it was already near the mall's closing time (about fifteen minutes to 10:00 PM).
Despite this, the competition continued until about a quarter past 10:00, and the event finished at exactly 10:30 PM. By then, most of the mall's lights were dimmed and only a few restaurants and cafés were still open. I got home at around 11:00 PM.
During the two-day event, I only bought a few souvenirs (a couple of button pins on the first day and a handmade keychain on the second) since I didn't feel like spending too much, considering I had a limited budget for the entire weekend.
Regarding my outfit... well, there's a possibility that I might get mistaken for a cosplayer (I wish I were one of them, but...) since my outfits often include black fingerless gloves and the black fleece coat. Some cosplayers wear gauntlets and gloves as part of a character's costume, as in the case of Noctis Lucis Caelum of Final Fantasy XV, so it's no surprise that wearing my black fingerless gloves alone would make people assume I was actually one of them. A couple of guys asked if they could have a selfie with me, but I politely told them that I'm not a cosplayer... with an awkward smile, of course. I might reconsider wearing them at future otaku events.
There was a photo printing booth where you could have your photos from the event printed for free (I think you had to sign up first), though I never had any of mine printed, which is a total shame on my part since that would have been a wonderful remembrance of the event. Aside from that, there were stalls selling various merchandise, mostly anime-related stuff, and some food stalls as well.
There were also workshops, one of them being the Baybayin Writing Workshop, where one can learn the ancient Filipino writing system. If you notice the unusual writing below the word Ai on the poster, that's what the Baybayin script looks like.
The event also helped out charities, mainly Hands Of GOD Charity Works.
That concludes the two-day otaku event, and just like last time (POF: Danketsu 2017), it was really fun. I had a lot of selfies with the cosplayers and it was really awesome! I'm definitely looking forward to the next POF event, hopefully in a larger venue. I'm hoping for it to be held at either Vertis Tent or The Elements at Eton Centris, but I wouldn't mind going as far as the MOA Complex just to experience the otaku vibe.
It sure was a really awesome weekend to remember. Until next time!
All pics were taken by yours truly, Keihancarl. Most of them are posted to my private Instagram account, @kcox105.
Facebook and the endless string of worst-case scenarios
Facebook has naively put its faith in humanity and repeatedly been abused, exploited, and proven either negligent or complicit. The company routinely ignores or downplays the worst-case scenarios, idealistically building products without the necessary safeguards, and then drags its feet to admit the extent of the problems.
This approach, willful or not, has led to its latest scandal, where a previously available API for app developers was harnessed by Trump and Brexit Leave campaign technology provider Cambridge Analytica to pull not just the profile data of 270,000 app users who gave express permission, but of 50 million of those people’s unwitting friends.
Facebook famously changed its motto in 2014 from “Move fast and break things” to “Move fast with stable infra” aka ‘infrastructure’. But all that’s meant is that Facebook’s products function as coded even at enormous scale, not that they’re built any slower or with more caution for how they could be weaponized. Facebook’s platform iconography above captures how it only sees the wrench, then gets shocked by the lightning on the other end.
Sometimes the abuse is natural and emergent, as when people grow envious and insecure from following the highlights of their peers’ lives through the News Feed that was meant to bring people together. Sometimes the abuse is malicious and opportunistic, as it was when Cambridge Analytica used an API designed to help people recommend relevant job openings to friends to purposefully harvest data that populated psychographic profiles of voters so they could be swayed with targeted messaging.
NEW YORK, NY – SEPTEMBER 19: CEO of Cambridge Analytica Alexander Nix speaks at the 2016 Concordia Summit – Day 1 at Grand Hyatt New York on September 19, 2016 in New York City. (Photo by Bryan Bedder/Getty Images for Concordia Summit)
Whether it doesn’t see the disasters coming, makes a calculated gamble that the growth or mission benefits of something will far outweigh the risks, or purposefully makes a dangerous decision while obscuring the consequences, Facebook is responsible for its significant shortcomings. The company has historically cut corners in pursuit of ubiquity that left it, potentially knowingly, vulnerable to exploitation.
And increasingly, Facebook is going to lengths to fight the news cycle surrounding its controversies instead of owning up early and getting to work. Facebook had known about Cambridge Analytica’s data policy violations since at least August 2016, but did nothing beyond sending a legal notice to delete the information. It only suspended the Facebook accounts of Cambridge Analytica and other guilty parties, and announced the move this week, in hopes of muting forthcoming New York Times and Guardian articles about the issue (articles which it also tried to prevent from running via legal threats). And since then, representatives of the company have quibbled with reporters over Twitter about whether the data misuse constituted a “breach” instead of explaining why it didn’t inform the public about it for years.
“I have more fear in my life that we aren’t going to maximize the opportunity that we have than that we mess something up,” Zuckerberg said at Facebook’s Social Good Forum event in November. Perhaps it’s time for that fear to shift toward ‘what could go wrong’, not just for Zuck, but for the leaders of all of today’s tech titans.
Facebook CEO Mark Zuckerberg
An Abridged List Of Facebook’s Unforeseen Consequences
Here’s an incomplete list of the massive negative consequences and specific abuses that stem from Facebook’s idealistic product development process:
Engagement Ranked Feed = Sensationalized Fake News – Facebook built the News Feed to show the most relevant content first so we’d see the most interesting things going on with our closest friends, but measured that relevance largely based on what people commented on, liked, clicked, shared, and watched. All of those activities are stoked by sensationalist fake news stories, allowing slews of them to go viral while their authors earned ad revenue and financed their operations with ad views delivered by Facebook referral traffic. Facebook downplayed the problem until it finally fessed up and is now scrambling to fight fake news.
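To make that dynamic concrete, here is a minimal sketch in Python of purely engagement-based ranking. The signal names and weights are illustrative assumptions, not Facebook’s actual ranker, but they show why content optimized for clicks and shares floats to the top regardless of accuracy.

```python
# Toy illustration (assumed signals and weights, not Facebook's real code):
# posts are ordered solely by predicted engagement, with no quality check.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_clicks: float    # estimated probability of a click
    predicted_shares: float    # estimated probability of a share
    predicted_comments: float  # estimated probability of a comment

def engagement_score(post: Post) -> float:
    # Reward the behaviors that count as "relevance"; outrage and
    # sensationalism happen to score well on exactly these signals.
    return (1.0 * post.predicted_clicks
            + 3.0 * post.predicted_shares
            + 2.0 * post.predicted_comments)

def rank_feed(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Local council passes annual budget", 0.02, 0.001, 0.005),
    Post("SHOCKING claim goes viral (fabricated)", 0.15, 0.060, 0.090),
])
print([p.title for p in feed])  # the fabricated, sensational post ranks first
```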
Engagement Priced Ad Auctions = Polarizing Ads – Facebook gives a discount to ads that are engaging so as to incentivize businesses to produce marketing materials that don’t bore or annoy users such that they close the social network. But the Trump campaign designed purposefully divisive and polarizing ads that would engage a niche base of his supporters to try to score cheaper ad clicks and more free viral sharing of those ads.
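A simplified auction sketch (a toy model with made-up numbers, not Facebook’s real pricing mechanism) shows how the engagement discount plays out: an ad’s effective bid is its monetary bid scaled by predicted engagement, so a divisive ad that provokes reactions can win impressions while bidding less money.

```python
# Toy engagement-priced auction: the winner is whoever has the highest
# bid multiplied by predicted engagement. Numbers are invented for illustration.
def effective_bid(bid_usd: float, predicted_engagement: float) -> float:
    return bid_usd * predicted_engagement

ads = [
    {"name": "bland product ad",    "bid": 2.00, "engagement": 0.01},
    {"name": "polarizing niche ad", "bid": 1.00, "engagement": 0.05},
]
winner = max(ads, key=lambda ad: effective_bid(ad["bid"], ad["engagement"]))
print(winner["name"])  # the polarizing ad wins despite bidding half as much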
Academic Research = Emotion Tampering – Facebook allows teams of internal and external researchers to conduct studies on its users in hopes of producing academic breakthroughs in sociology. But in some cases these studies have moved from observation into quietly interfering with the mental conditions of Facebookers. In 2012, Facebook data science team members manipulated the number of emotionally positive or negative posts in the feeds of 689,000 users and then studied their subsequent status updates to see if emotion was contagious. Facebook published the research, failing to foresee the huge uproar that ensued when the public learned that some users, including emotionally vulnerable teenagers who could have been suffering from depression, were deliberately shown sadder posts.
Ethnic Affinity Ad Targeting = Racist Exclusion – Facebook’s ad system previously let businesses target users in “ethnic affinity” groups such as “African-American” or “Hispanic” based on their in-app behavior as a stand-in for racial targeting. The idea was likely to help businesses find customers interested in their products, but the tool was shown to allow exclusion of certain ethnic affinity groups in ways that could shut them out of legally protected opportunities such as housing, employment, and loans. Facebook has since disabled this kind of targeting while it investigates the situation.
Exclusionary ethnic affinity ad targeting, as spotted by ProPublica
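A hypothetical targeting spec makes the failure mode easier to see. The field names below are invented for illustration and are not Facebook’s actual ad API, but the include/exclude structure mirrors how self-serve tools express audience rules.

```python
# Hypothetical (made-up) targeting spec for a housing ad. The same
# include/exclude structure that narrows an audience can silently shut
# protected groups out of housing, employment, or credit ads.
housing_ad_targeting = {
    "include": {
        "interests": ["apartment hunting"],
        "locations": ["Chicago, IL"],
    },
    "exclude": {
        # the kind of exclusion ProPublica flagged
        "ethnic_affinity": ["African-American", "Hispanic"],
    },
}
print(housing_ad_targeting["exclude"])
```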
App Platform = Game Spam – One of Facebook’s earliest encounters with unforeseen consequences came in 2009 and 2010 after it launched its app platform. The company expected developers to build helpful utilities that could go viral thanks to special, sometimes automatic posts to the News Feed. But game developers seized on the platform and its viral growth channels, spawning companies like Zynga that turned optimizing News Feed game spam into a science. The constant invites to join games in order to help a friend win overwhelmed the feed, threatening to drown out legitimate communication and ruin the experience for non-gamers until Facebook shut down the viral growth channels, cratering many of the game developers.
Real Name Policy = Enabling Stalkers – For years, Facebook strictly required users to use their real names in order to reduce the incivility and bullying facilitated by hiding behind anonymity. But victims of stalking, domestic violence, and hate crimes argued that their abusers could use Facebook to track them down and harass them. Only after mounting criticism from the transgender community and others did Facebook slightly relax the policy in 2015, though some still find it onerous to set up a pseudonym on Facebook and dangerous to network without one.
Self-Serve Ads = Objectionable Ads – To earn money efficiently, Facebook lets people buy ads through its apps without ever talking to a sales representative. But the self-serve ads interface has been repeatedly shown to be used nefariously. ProPublica found businesses could target those who followed objectionable user-generated Pages and interests such as “jew haters” and other disturbing keywords on Facebook. And Russian political operatives famously used Facebook ads to spread divisive memes in the United States, pit people against each other, and promote distrust between citizens. Facebook is only now shutting down long-tail user-generated ad targeting parameters, hiring more ad moderators, and requiring more thorough political ad buyer documentation.
Developer Data Access = Data Abuse – Most recently, Facebook has found its trust in app developers misplaced. For years it offered an API that allowed app makers to pull robust profile data on their users and somewhat limited info about their friends to make personalized products. For example, one could show which bands your friends Like so you’d know who to invite to a concert. But Facebook lacked strong enforcement mechanisms for its policy that prevented developers from sharing or selling that data to others. Now the public is learning that Cambridge Analytica’s trick of turning 270,000 users of Dr. Aleksandr Kogan’s personality quiz app into info about 50 million people illicitly powered psychographic profiles that helped Trump and Brexit pinpoint their campaign messages. It’s quite likely that other developers have violated Facebook’s flimsy policies against storing, selling, or sharing user data they’ve collected, and more reports of misuse will emerge.
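A conceptual sketch of that fan-out, using made-up data structures rather than the real Graph API: one consenting quiz taker exposes limited profile fields for every friend, which is how roughly 270,000 installs could balloon into data on some 50 million people.

```python
# Made-up data structures illustrating the friends-data multiplier;
# this is not the Graph API, just the shape of the problem.
consenting_users = {
    "alice": {"likes": ["indie bands"], "friends": ["bob", "carol", "dave"]},
}
friend_profiles = {
    "bob":   {"likes": ["politics"],  "location": "Leeds"},
    "carol": {"likes": ["gardening"], "location": "Austin"},
    "dave":  {"likes": ["football"],  "location": "Cardiff"},
}

harvested = {}
for user, profile in consenting_users.items():
    harvested[user] = profile  # data the quiz taker agreed to share
    for friend in profile["friends"]:
        # data the friends never agreed to share with this app
        harvested.setdefault(friend, friend_profiles[friend])

print(len(harvested))  # 4 profiles collected from a single consenting user
```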
Each time, Facebook built tools with rosy expectations, only to negligently leave the safety off and see worst-case scenarios arise. In October, Zuckerberg already asked for forgiveness, but the public wants change.
Trading Kool-Aid For Contrarians
The desire to avoid censorship or partisanship or inefficiency is no excuse. Perhaps people are so addicted to Facebook that no backlash will pry them from their feeds. But Facebook can’t treat this as merely a PR problem, a distraction from the fun work of building new social features, unless its employees are ready to shoulder the blame for the erosion of society. Each scandal further proves it can’t police itself, inviting government regulation that could gum up its business. Members of Congress are already calling on Zuckerberg to testify.
Yet even with all of the public backlash and calls for regulation, Facebook still seems to lack or ignore the cynics and diverse voices who might foresee how its products could be perverted or were conceptualized foolishly in the first place. Having more minorities and contrarians on the teams that conceive its products could nip troubles in the bud before they blossom.
“The saying goes that optimists tend to be successful and pessimists tend to be right,” Zuckerberg explained at the November forum. “If you think something is going to be terrible and it is going to fail, then you are going to look for the data points that prove you right and you will find them. That is what pessimists do. But if you think that something is possible, then you are going to try to find a way to make it work. And even when you make mistakes along the way and even when people doubt you, you are going to keep pushing until you find a way to make it happen.”
Zuckerberg speaks at Facebook’s Social Good Forum
That quote takes on new light given Facebook’s history. The company must promote a culture where pessimists can speak up without reprisal. Where seeking a raise, reaching milestones, avoiding culpability, or a desire to avoid rocking the Kool-Aid boat doesn’t stifle discussion of a product’s potential hazards. Facebook’s can-do hacker culture, which codes with caution thrown to the wind and asks for forgiveness instead of permission, is failing to scale to the responsibility of being a two-billion-user communications institution.
And our species is failing to scale to that level of digital congregation too, stymied by our insecurity and greed. Whether someone is demeaning themselves for not having as glamorous of a vacation as their acquaintances, or seizing the world’s megaphone to spew lies in hopes of impeding democracy, we’ve proven incapable of safe social networking.
That’s why we’re relying on Facebook and the other social networks to change, and why it’s so catastrophic when they miss the festering problems, ignore the calls for reform, or try to hide their complicity. To connect the world, Facebook must foresee its ugliness and proactively rise against it.
For more on Facebook’s non-stop scandals, check out these TechCrunch feature pieces:
Zuckerberg asks forgiveness, but Facebook needs change
The difference between good and bad Facebooking
From Social – TechCrunch: http://ift.tt/2G24Nhp (original content from https://techcrunch.com)
The Tories held an auction, but can you guess which of these were sold as a 'Lot' and which surely were 'Not'?