#Google Cloud AI Platform
In today’s competitive mobile app market, it’s crucial to find ways to stand out from the crowd. One powerful tool that can help you achieve this is artificial intelligence (AI). AI is rapidly transforming industries, and mobile app development is no exception. This article explores the benefits of integrating AI into your mobile app. Let’s dive deeper and see how AI can revolutionize your app and boost your business.
Google Document AI is an exciting new technology that is changing the way we interact with information. It is helping organizations and individuals process large volumes of documents quickly and accurately.
It does this by combining natural language processing, computer vision, and machine learning to extract valuable insights from the data. As the technology continues to develop, we can expect to see even more exciting applications in the years to come.
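As a rough illustration of what “processing a document” can look like in practice, here is a minimal sketch that sends a PDF to a Document AI processor using the official Python client. The project ID, location, processor ID, and file name are placeholders, and it assumes you have already created a processor (for example an OCR or form parser) in your own Google Cloud project; it is not a description of any particular production setup.

```python
# Minimal sketch: send a local PDF to a Document AI processor and read back its text.
from google.cloud import documentai_v1 as documentai  # pip install google-cloud-documentai


def extract_text(project_id: str, location: str, processor_id: str, pdf_path: str) -> str:
    client = documentai.DocumentProcessorServiceClient()
    # Placeholder IDs: substitute your own project, region ("us"/"eu"), and processor.
    name = client.processor_path(project_id, location, processor_id)

    with open(pdf_path, "rb") as f:
        raw_document = documentai.RawDocument(content=f.read(), mime_type="application/pdf")

    result = client.process_document(
        request=documentai.ProcessRequest(name=name, raw_document=raw_document)
    )
    # Besides the plain text, result.document also carries entities, tables,
    # and form fields, which is where the structured "insights" come from.
    return result.document.text


# Example call with hypothetical values:
# print(extract_text("my-project", "us", "abc123processor", "invoice.pdf"))
```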
#Google Document AI#AI/ML tools and services#Google Vertex AI#google cloud ai platform#AI/ML solutions#Onix
Unlocking Potential Through Immersive VR-Based Training Solutions
Atcuality is dedicated to unlocking human potential through VR-based training solutions that push the boundaries of traditional learning methods. In industries where hands-on experience is crucial, VR allows trainees to engage in realistic scenarios without leaving the training room. Our VR-based training solutions are designed to provide the kind of experiential learning that helps individuals not just understand but deeply internalize crucial skills. From medical procedures to machinery operation, these immersive training environments offer flexibility and safety, removing the limitations of traditional learning setups. By simulating high-risk or complex tasks, learners can gain valuable experience while mitigating actual risks. With Atcuality’s VR training, companies can ensure their workforce is prepared for real-world challenges, resulting in enhanced safety, improved skill levels, and reduced onboarding and training costs. Our commitment to advancing training standards is evident in our ability to offer highly customizable, impactful VR experiences across various industries.
#digital marketing#seo optimization#seo company#seo changbin#seo services#seo agency#seo marketing#emailmarketing#search engine optimization#google ads#socialmediamarketing#ai services#iot#iotsolutions#iot applications#iot development services#iot platform#digitaltransformation#techinnovation#cloud hosting in saudi arabia#cloud server in saudi arabia#cloud computing#amazon web services#amazon services#mobile app development company#mobile app developers#mobile application development#app developers#app development company#azure cloud services
WBSV
WorldBridge Sport Village is a remarkable mixed-use development located in the rapidly growing area of Chroy Changvar, just 20 minutes away from Phnom Penh's Central Business District. It is a pioneering Sport Village, the first of its kind, offering a one-of-a-kind opportunity to blend work and play in the distinctively healthy atmosphere of an international-level sports village, akin to an Olympic athlete village. WorldBridge Sport Village also offers a range of landed-home living: villas, townhouses, row houses, and shophouses that more than accommodate any family size looking to live in the next big neighborhood in the fastest-growing area of Chroy Changvar.
• Project: WORLD BRIDGE SPORT VILLAGE
• Developer: OXLEY-WORLDBRIDGE (CAMBODIA) CO., LTD.
• Subsidiary: WB SPORT VILLAGE CO., LTD.
• Architectural Manager: Sonetra KETH
• Location: Phnom Penh, Cambodia
The condo units offer up to 3-bedroom selections across 12 high-rise blocks with spacious interiors and breathtaking views.
#Sonetra Keth#Architectural Manager#Architectural Design Manager#BIM Director#BIM Manager#BIM Coordinator#Project Manager#RMIT University Vietnam#Institute of Technology of Cambodia#Real Estate Development#Construction Industry#Building Information Modelling#BIM#AI#Artificial Intelligence#Technology#VDC#Virtual Design#IoT#Machine Learning#C4R#Collaboration for Revit#Cloud Computing and Collaboration Platforms#NETRA#netra#នេត្រា#កេត សុនេត្រា
A Comprehensive Guide about Google Cloud Generative AI Studio https://medium.com/google-cloud/a-comprehensive-guide-about-google-cloud-generative-ai-studio-1b2bafc4108a?source=rss----e52cf94d98af---4
#google-cloud-platform#generative-ai#generative-ai-tools#generative-ai-solution#large-language-models#Rubens Zimbres#Google Cloud - Community - Medium
“Disenshittify or Die”
youtube
I'm coming to BURNING MAN! On TUESDAY (Aug 27) at 1PM, I'm giving a talk called "DISENSHITTIFY OR DIE!" at PALENQUE NORTE (7&E). On WEDNESDAY (Aug 28) at NOON, I'm doing a "Talking Caterpillar" Q&A at LIMINAL LABS (830&C).
Last weekend, I traveled to Las Vegas for Defcon 32, where I had the immense privilege of giving a solo talk on Track 1, entitled "Disenshittify or die! How hackers can seize the means of computation and build a new, good internet that is hardened against our asshole bosses' insatiable horniness for enshittification":
https://info.defcon.org/event/?id=54861
This was a followup to last year's talk, "An Audacious Plan to Halt the Internet's Enshittification," a talk that kicked off a lot of international interest in my analysis of platform decay ("enshittification"):
https://www.youtube.com/watch?v=rimtaSgGz_4
The Defcon organizers have earned a restful week or two, and that means that the video of my talk hasn't yet been posted to Defcon's Youtube channel, so in the meantime, I thought I'd post a lightly edited version of my speech crib. If you're headed to Burning Man, you can hear me reprise this talk at Palenque Norte (7&E); I'm kicking off their lecture series on Tuesday, Aug 27 at 1PM.
==
What the fuck happened to the old, good internet?
I mean, sure, our bosses were a little surveillance-happy, and they were usually up for sharing their data with the NSA, and whenever there was a tossup between user security and growth, it was always YOLO time.
But Google Search used to work. Facebook used to show you posts from people you followed. Uber used to be cheaper than a taxi and pay the driver more than a cabbie made. Amazon used to sell products, not Shein-grade self-destructing dropshipped garbage from all-consonant brands. Apple used to defend your privacy, rather than spying on you with your no-modifications-allowed Iphone.
There was a time when searching for an album on Spotify would get you that album – not a playlist of insipid AI-generated covers with the same name and art.
Microsoft used to sell you software – sure, it was buggy – but now they just let you access apps in the cloud, so they can watch how you use those apps and strip the features you use the most out of the basic tier and turn them into an upcharge.
What – and I cannot stress this enough – the fuck happened?!
I’m talking about enshittification.
Here’s what enshittification looks like from the outside: First, you see a company that’s being good to its end users. Google puts the best search results at the top; Facebook shows you a feed of posts from people and groups you follow; Uber charges small dollars for a cab; Amazon subsidizes goods and returns and shipping and puts the best match for your product search at the top of the page.
That’s stage one, being good to end users. But there’s another part of this stage, call it stage 1a). That’s figuring out how to lock in those users.
There’s so many ways to lock in users.
If you’re Facebook, the users do it for you. You joined Facebook because there were people there you wanted to hang out with, and other people joined Facebook to hang out with you.
That’s the old “network effects” in action, and with network effects come “the collective action problem." Because you love your friends, but goddamn are they a pain in the ass! You all agree that FB sucks, sure, but can you all agree on when it’s time to leave?
No way.
Can you agree on where to go next?
Hell no.
You’re there because that’s where the support group for your rare disease hangs out, and your bestie is there because that’s where they talk with the people in the country they moved away from, then there’s that friend who coordinates their kid’s little league car pools on FB, and the best dungeon master you know isn’t gonna leave FB because that’s where her customers are.
So you’re stuck, because even though FB use comes at a high cost – your privacy, your dignity and your sanity – that’s still less than the switching cost you’d have to bear if you left: namely, all those friends who have taken you hostage, and whom you are holding hostage.
Now, sometimes companies lock you in with money, like Amazon getting you to prepay for a year’s shipping with Prime, or to buy your Audible books on a monthly subscription, which virtually guarantees that every shopping search will start on Amazon – after all, you’ve already paid for it.
Sometimes, they lock you in with DRM, like HP selling you a printer with four ink cartridges filled with fluid that retails for more than $10,000/gallon, and using DRM to stop you from refilling any of those ink carts or using a third-party cartridge. So when one cart runs dry, you have to refill it or throw away your investment in the remaining three cartridges and the printer itself.
Sometimes, it’s a grab bag:
You can’t run your Ios apps without Apple hardware;
you can’t run your Apple music, books and movies on anything except an Ios app;
your iPhone uses parts pairing – DRM handshakes between replacement parts and the main system – so you can’t use third-party parts to fix it; and
every OEM iPhone part has a microscopic Apple logo engraved on it, so Apple can demand that the US Customs and Border Service seize any shipment of refurb Iphone parts as trademark violations.
Think Different, amirite?
Getting you locked in completes phase one of the enshittification cycle and signals the start of phase two: making things worse for you to make things better for business customers.
For example, a platform might poison its search results, like Google selling more and more of its results pages to ads that are identified with lighter and lighter, tinier and tinier type.
Or Amazon selling off search results and calling it an “ad” business. They make $38b/year on this scam. The first result for your search is, on average, 29% more expensive than the best match for your search. The first row is 25% more expensive than the best match. On average, the best match for your search is likely to be found seventeen places down on the results page.
Other platforms sell off your feed, like Facebook, which started off showing you the things you asked to see, but now the quantum of content from the people you follow has dwindled to a homeopathic residue, leaving a void that Facebook fills with things that people pay to show you: boosted posts from publishers you haven’t subscribed to, and, of course, ads.
Now at this point you might be thinking ‘sure, if you’re not paying for the product, you’re the product.'
Bullshit!
Bull.
Shit.
The people who buy those Google ads? They pay more every year for worse ad-targeting and more ad-fraud.
Those publishers paying to nonconsensually cram their content into your Facebook feed? They have to do that because FB suppresses their ability to reach the people who actually subscribed to them.
The Amazon sellers with the best match for your query have to outbid everyone else just to show up on the first page of results. It costs so much to sell on Amazon that between 45-51% of every dollar an independent seller brings in has to be kicked up to Don Bezos and the Amazon crime family. Those sellers don’t have the kind of margins that let them pay 51%. They have to raise prices in order to avoid losing money on every sale.
"But wait!" I hear you say!
[Come on, say it!]
"But wait! Things on Amazon aren’t more expensive that things at Target, or Walmart, or at a mom and pop store, or direct from the manufacturer.
"How can sellers be raising prices on Amazon if the price at Amazon is the same as at is everywhere else?"
[Any guesses?!]
That’s right, they charge more everywhere. They have to. Amazon binds its sellers to a policy called “most favored nation status,” which says they can’t charge more on Amazon than they charge elsewhere, including direct from their own factory store.
So every seller that wants to sell on Amazon has to raise their prices everywhere else.
Now, these sellers are Amazon’s best customers. They’re paying for the product, and they’re still getting screwed.
Paying for the product doesn’t fill your vapid boss’s shriveled heart with so much joy that he decides to stop trying to think of ways to fuck you over.
Look at Apple. Remember when Apple offered every Ios user a one-click opt out for app-based surveillance? And 96% of users clicked that box?
(The other four percent were either drunk or Facebook employees or drunk Facebook employees.)
That cost Facebook at least ten billion dollars per year in lost surveillance revenue?
I mean, you love to see it.
But did you know that, at the same time, Apple started spying on Ios users in the same way that Facebook had been, harvesting surveillance data to target users for its own competing advertising product?
Your Iphone isn’t an ad-supported gimme. You paid a thousand fucking dollars for that distraction rectangle in your pocket, and you’re still the product. What’s more, Apple has rigged Ios so that you can’t mod the OS to block its spying.
If you’re not paying for the product, you’re the product, and if you are paying for the product, you’re still the product.
Just ask the farmers who are expected to swap parts into their own busted half-million dollar, mission-critical tractors, but can’t actually use those parts until a technician charges them $200 to drive out to the farm and type a parts pairing unlock code into their console.
John Deere’s not giving away tractors. Give John Deere a half mil for a tractor and you will be the product.
Please, my brothers and sisters in Christ. Please! Stop saying ‘if you’re not paying for the product, you’re the product.’
OK, OK, so that’s phase two of enshittification.
Phase one: be good to users while locking them in.
Phase two: screw the users a little so you can be good to business customers while locking them in.
Phase three: screw everybody and take all the value for yourself. Leave behind the absolute bare minimum of utility so that everyone stays locked into your pile of shit.
Enshittification: a tragedy in three acts.
That’s what enshittification looks like from the outside, but what’s going on inside the company? What is the pathological mechanism? What sci-fi entropy ray converts the excellent and useful service into a pile of shit?
That mechanism is called twiddling. Twiddling is when someone alters the back end of a service to change how its business operates, changing prices, costs, search ranking, recommendation criteria and other foundational aspects of the system.
Digital platforms are a twiddler’s utopia. A grocer would need an army of teenagers with pricing guns on rollerblades to reprice everything in the building when someone arrives who’s extra hungry.
Whereas the McDonald’s Investments portfolio company Plexure advertises that it can use surveillance data to predict when an app user has just gotten paid so the seller can tack an extra couple bucks onto the price of their breakfast sandwich.
And of course, as the prophet William Gibson warned us, ‘cyberspace is everting.' With digital shelf tags, grocers can change prices whenever they feel like, like the grocers in Norway, whose e-ink shelf tags change the prices 2,000 times per day.
Every Uber driver is offered a different wage for every job. If a driver has been picky lately, the job pays more. But if the driver has been desperate enough to grab every ride the app offers, the pay goes down, and down, and down.
The law professor Veena Dubal calls this ‘algorithmic wage discrimination.' It’s a prime example of twiddling.
Every youtuber knows what it’s like to be twiddled. You work for weeks or months, spend thousands of dollars to make a video, then the algorithm decides that no one – not your own subscribers, not searchers who type in the exact name of your video – will see it.
Why? Who knows? The algorithm’s rules are not public.
Because content moderation is the last redoubt of security through obscurity: they can’t tell you what the content moderation algorithm is downranking, because then you’d cheat.
Youtube is the kind of shitty boss who docks every paycheck for all the rules you’ve broken, but won’t tell you what those rules were, lest you figure out how to break those rules next time without your boss catching you.
Twiddling can also work in some users’ favor, of course. Sometimes platforms twiddle to make things better for end users or business customers.
For example, Emily Baker-White from Forbes revealed the existence of a back-end feature that Tiktok’s management can access, which they call the “heating tool.”
When a manager applies the heating tool to a performer’s account, that performer’s videos are thrust into the feeds of millions of users, without regard to whether the recommendation algorithm predicts they will enjoy that video.
Why would they do this? Well, here’s an analogy from my boyhood. I used to go to this traveling fair that would come to Toronto at the end of every summer, the Canadian National Exhibition. If you’ve been to a fair like the Ex, you know that you can always spot some guy lugging around a comedically huge teddy bear.
Nominally, you win that teddy bear by throwing five balls in a peach-basket, but to a first approximation, no one has ever gotten five balls to stay in that peach-basket.
That guy “won” the teddy bear when a carny on the midway singled him out and said, "fella, I like your face. Tell you what I’m gonna do: You get just one ball in the basket and I’ll give you this keychain, and if you amass two keychains, I’ll let you trade them in for one of these galactic-scale teddy-bears."
That’s how the guy got his teddy bear, which he now has to drag up and down the midway for the rest of the day.
Why the hell did that carny give away the teddy bear? Because it turns the guy into a walking billboard for the midway games. If that dopey-looking Judas Goat can get five balls into a peach basket, then so can you.
Except you can’t.
Tiktok’s heating tool is a way to give away tactical giant teddy bears. When someone in the TikTok brain trust decides they need more sports bros on the platform, they pick one bro out at random and make him king for the day, heating the shit out of his account.
That guy gets a bazillion views and he starts running around on all the sports bro forums trumpeting his success: "I am the Louis Pasteur of sports bro influencers!"
The other sports bros pile in and start retooling to make content that conforms to the idiosyncratic Tiktok format. When they fail to get giant teddy bears of their own, they assume that it’s because they’re doing Tiktok wrong, because they don’t know about the heating tool.
But then comes the day when the TikTok Star Chamber decides they need to lure in more astrologers, so they take the heat off that one lucky sports bro, and start heating up some lucky astrologer.
Giant teddy bears are all over the place: those Uber drivers who were boasting to the NYT ten years ago about earning $50/hour? The Substackers who were rolling in dough? Joe Rogan and his hundred million dollar Spotify payout? Those people are all the proud owners of giant teddy bears, and they’re a steal.
Because every dollar they get from the platform turns into five dollars worth of free labor from suckers who think they’re just internetting wrong.
Giant teddy bears are just one way of twiddling. Platforms can play games with every part of their business logic, in highly automated ways that allow them to quickly and efficiently siphon value from end users to business customers and back again, hiding the pea in a shell game conducted at machine speeds, until they’ve got everyone so turned around that they take all the value for themselves.
That’s the how: How the platforms do the trick where they are good to users, then lock users in, then maltreat users to be good to business customers, then lock in those business customers, then take all the value for themselves.
So now we know what is happening, and how it is happening, all that’s left is why it’s happening.
Now, on the one hand, the why is pretty obvious. The less value that end-users and business customers capture, the more value there is left to divide up among the shareholders and the executives.
That’s why, but it doesn’t tell you why now. Companies could have done this shit at any time in the past 20 years, but they didn’t. Or at least, the successful ones didn’t. The ones that turned themselves into piles of shit got treated like piles of shit. We avoided them and they died.
Remember Myspace? Yahoo Search? Livejournal? Sure, they’re still serving some kind of AI slop or programmatic ad junk if you hit those domains, but they’re gone.
And there’s the clue: It used to be that if you enshittified your product, bad things happened to your company. Now, there are no consequences for enshittification, so everyone’s doing it.
Let’s break that down: What stops a company from enshittifying?
There are four forces that discipline tech companies. The first one is, obviously, competition.
If your customers find it easy to leave, then you have to worry about them leaving.
Many factors can contribute to how hard or easy it is to depart a platform, like the network effects that Facebook has going for it. But the most important factor is whether there is anywhere to go.
Back in 2012, Facebook bought Insta for a billion dollars. That may seem like chump-change in these days of eleven-digit Big Tech acquisitions, but that was a big sum in those innocent days, and it was an especially big sum to pay for Insta. The company only had 13 employees, and a mere 25 million registered users.
But what mattered to Zuckerberg wasn’t how many users Insta had, it was where those users came from.
[Does anyone know where those Insta users came from?]
That’s right, they left Facebook and joined Insta. They were sick of FB, even though they liked the people there, they hated creepy Zuck, they hated the platform, so they left and they didn’t come back.
So Zuck spent a cool billion to recapture them, a fact he put in writing in a midnight email to CFO David Ebersman, explaining that he was paying over the odds for Insta because his users hated him, and loved Insta. So even if they quit Facebook (the platform), they would still be captured by Facebook (the company).
Now, on paper, Zuck’s Instagram acquisition is illegal, but normally, that would be hard to stop, because you’d have to prove that he bought Insta with the intention of curtailing competition.
But in this case, Zuck tripped over his own dick: he put it in writing.
But Obama’s DoJ and FTC just let that one slide, following the pro-monopoly policies of Reagan, Bush I, Clinton and Bush II, and setting an example that Trump would follow, greenlighting gigamergers like the catastrophic, incestuous Warner-Discovery marriage.
Indeed, for 40 years, starting with Carter, and accelerating through Reagan, the US has encouraged monopoly formation, as an official policy, on the grounds that monopolies are “efficient.”
If everyone is using Google Search, that’s something we should celebrate. It means they’ve got the very best search and wouldn’t it be perverse to spend public funds to punish them for making the best product?
But as we all know, Google didn’t maintain search dominance by being best. They did it by paying bribes. More than 20 billion per year to Apple alone to be the default Ios search, plus billions more to Samsung, Mozilla, and anyone else making a product or service with a search-box on it, ensuring that you never stumble on a search engine that’s better than theirs.
Which, in turn, ensured that no one smart invested big in rival search engines, even if they were visibly, obviously superior. Why bother making something better if Google’s buying up all the market oxygen before it can kindle your product to life?
Facebook, Google, Microsoft, Amazon – they’re not “making things” companies, they’re “buying things” companies, taking advantage of official tolerance for anticompetitive acquisitions, predatory pricing, market distorting exclusivity deals and other acts specifically prohibited by existing antitrust law.
Their goal is to become too big to fail, because that makes them too big to jail, and that means they can be too big to care.
Which is why Google Search is a pile of shit and everything on Amazon is dropshipped garbage that instantly disintegrates in a cloud of offgassed volatile organic compounds when you open the box.
Once companies no longer fear losing your business to a competitor, it’s much easier for them to treat you badly, because what’re you gonna do?
Remember Lily Tomlin as Ernestine the AT&T operator in those old SNL sketches? “We don’t care. We don’t have to. We’re the phone company.”
Competition is the first force that serves to discipline companies and the enshittificatory impulses of their leadership, and we just stopped enforcing competition law.
It takes a special kind of smooth-brained asshole – that is, an establishment economist – to insist that the collapse of every industry from eyeglasses to vitamin C into a cartel of five or fewer companies has nothing to do with policies that officially encouraged monopolization.
It’s like we used to put down rat poison and we didn’t have a rat problem. Then these dickheads convinced us that rats were good for us and we stopped putting down rat poison, and now rats are gnawing our faces off and they’re all running around saying, "Who’s to say where all these rats came from? Maybe it was that we stopped putting down poison, but maybe it’s just the Time of the Rats. The Great Forces of History bearing down on this moment to multiply rats beyond all measure!"
Antitrust didn’t slip down that staircase and fall spine-first on that stiletto: they stabbed it in the back and then they pushed it.
And when they killed antitrust, they also killed regulation, the second force that disciplines companies. Regulation is possible, but only when the regulator is more powerful than the regulated entities. When a company is bigger than the government, it gets damned hard to credibly threaten to punish that company, no matter what its sins.
That’s what protected IBM for all those years when it had its boot on the throat of the American tech sector. Do you know, the DOJ fought to break up IBM in the courts from 1970-1982, and that every year, for 12 consecutive years, IBM spent more on lawyers to fight the USG than the DOJ Antitrust Division spent on all the lawyers fighting every antitrust case in the entire USA?
IBM outspent Uncle Sam for 12 years. People called it “Antitrust’s Vietnam.” All that money paid off, because by 1982, the president was Ronald Reagan, a man whose official policy was that monopolies were “efficient." So he dropped the case, and Big Blue wriggled off the hook.
It’s hard to regulate a monopolist, and it’s hard to regulate a cartel. When a sector is composed of hundreds of competing companies, they compete. They genuinely fight with one another, trying to poach each others’ customers and workers. They are at each others’ throats.
It’s hard enough for a couple hundred executives to agree on anything. But when they’re legitimately competing with one another, really obsessing about how to eat each others’ lunches, they can’t agree on anything.
The instant one of them goes to their regulator with some bullshit story, about how it’s impossible to have a decent search engine without fine-grained commercial surveillance; or how it’s impossible to have a secure and easy to use mobile device without a total veto over which software can run on it; or how it’s impossible to administer an ISP’s network unless you can slow down connections to servers whose owners aren’t paying bribes for “premium carriage"; there’s some other company saying, “That’s bullshit.”
“We’ve managed it! Here’s our server logs, our quarterly financials and our customer testimonials to prove it.”
100 companies are a rabble, they're a mob. They can’t agree on a lobbying position. They’re too busy eating each others’ lunch to agree on how to cater a meeting to discuss it.
But let those hundred companies merge to monopoly, absorb one another in an incestuous orgy, turn into five giant companies, so inbred they’ve got a corporate Habsburg jaw, and they become a cartel.
It’s easy for a cartel to agree on what bullshit they’re all going to feed their regulator, and to mobilize some of the excess billions they’ve reaped through consolidation, which freed them from “wasteful competition," so they can capture their regulators completely.
You know, Congress used to pass federal consumer privacy laws? Not anymore.
The last time Congress managed to pass a federal consumer privacy law was in 1988: The Video Privacy Protection Act. That’s a law that bans video-store clerks from telling newspapers what VHS cassettes you take home. In other words, it regulates three things that have effectively ceased to exist.
The threat of having your video rental history out there in the public eye was not the last or most urgent threat the American public faced, and yet, Congress is deadlocked on passing a privacy law.
Tech companies’ regulatory capture involves a risible and transparent gambit, that is so stupid, it’s an insult to all the good hardworking risible transparent ruses out there.
Namely, they claim that when they violate your consumer, privacy or labor rights, it’s not a crime, because they do it with an app.
Algorithmic wage discrimination isn’t illegal wage theft: we do it with an app.
Spying on you from asshole to appetite isn’t a privacy violation: we do it with an app.
And Amazon’s scam search tool that tricks you into paying 29% more than the best match for your query? Not a ripoff. We do it with an app.
Once we killed competition – stopped putting down rat poison – we got cartels – the rats ate our faces. And the cartels captured their regulators – the rats bought out the poison factory and shut it down.
So companies aren’t constrained by competition or regulation.
But you know what? This is tech, and tech is different. It’s different because it’s flexible. Because our computers are Turing-complete universal von Neumann machines. That means that any enshittificatory alteration to a program can be disenshittified with another program.
Every time HP jacks up the price of ink, they invite a competitor to market a refill kit or a compatible cartridge.
When Tesla installs code that says you have to pay an extra monthly fee to use your whole battery, they invite a modder to start selling a kit to jailbreak that battery and charge it all the way up.
Lemme take you through a little example of how that works: Imagine this is a product design meeting for our company’s website, and the guy leading the meeting says “Dudes, you know how our KPI is topline ad-revenue? Well, I’ve calculated that if we make the ads just 20% more invasive and obnoxious, we’ll boost ad rev by 2%”
This is a good pitch. Hit that KPI and everyone gets a fat bonus. We can all take our families on a luxury ski vacation in Switzerland.
But here’s the thing: someone’s gonna stick their arm up – someone who doesn’t give a shit about user well-being, and that person is gonna say, “I love how you think, Elon. But has it occurred to you that if we make the ads 20% more obnoxious, then 40% of our users will go to a search engine and type 'How do I block ads?'"
I mean, what a nightmare! Because once a user does that, the revenue from that user doesn’t rise to 102%. It doesn’t stay at 100%. It falls to zero, forever.
[Any guesses why?]
Because no user ever went back to the search engine and typed, 'How do I start seeing ads again?'
Once the user jailbreaks their phone or discovers third party ink, or develops a relationship with an independent Tesla mechanic who’ll unlock all the DLC in their car, that user is gone, forever.
Interoperability – that latent property bequeathed to us courtesy of Herrs Turing and Von Neumann and their infinitely flexible, universal machines – that is a serious check on enshittification.
The fact that Congress hasn’t passed a privacy law since 1988 is countered, at least in part, by the fact that the majority of web users are now running ad-blockers, which are also tracker-blockers.
But no one’s ever installed a tracker-blocker for an app. Because reverse engineering an app puts you in jeopardy of criminal and civil prosecution under Section 1201 of the Digital Millennium Copyright Act, with penalties of a 5-year prison sentence and a $500k fine for a first offense.
And violating its terms of service puts you in jeopardy under the Computer Fraud and Abuse Act of 1986, which is the law that Ronald Reagan signed in a panic after watching Wargames (seriously!).
Helping other users violate the terms of service can get you hit with a lawsuit for tortious interference with contract. And then there’s trademark, copyright and patent.
All that nonsense we call “IP,” but which Jay Freeman of Cydia calls “Felony Contempt of Business Model."
So if we’re still at that product planning meeting and now it’s time to talk about our app, the guy leading the meeting says, “OK, so we’ll make the ads in the app 20% more obnoxious to pull a 2% increase in topline ad rev?”
And that person who objected to making the website 20% worse? Their hand goes back up. Only this time they say “Why don’t we make the ads 100% more invasive and get a 10% increase in ad rev?"
Because it doesn't matter if a user goes to a search engine and types, “How do I block ads in an app." The answer is: you can't. So YOLO, enshittify away.
“IP” is just a euphemism for “any law that lets me reach outside my company’s walls to exert coercive control over my critics, competitors and customers,” and “app” is just a euphemism for “A web page skinned with the right IP so that protecting your privacy while you use it is a felony.”
Interop used to keep companies from enshittifying. If a company made its client suck, someone would roll out an alternative client, if they ripped a feature out and wanted to sell it back to you as a monthly subscription, someone would make a compatible plugin that restored it for a one-time fee, or for free.
To help people flee Myspace, FB gave them bots that you’d load with your login credentials. It would scrape your waiting Myspace messages and put ‘em in your FB inbox, and login to Myspace and paste your replies into your Myspace outbox. So you didn’t have to choose between the people you loved on Myspace, and Facebook, which launched with a promise never to spy on you. Remember that?!
Thanks to the metastasis of IP, all that is off the table today. Apple owes its very existence to iWork Suite, whose Pages, Numbers and Keynote are file-compatible with Microsoft’s Word, Excel and Powerpoint. But make an IOS runtime that’ll play back the files you bought from Apple’s stores on other platforms, and they’ll nuke you til you glow.
FB wouldn’t have had a hope of breaking Myspace’s grip on social media without that scrape, but scrape FB today in support of an alternative client and their lawyers will bomb you til the rubble bounces.
Google scraped every website in the world to create its search index. Try and scrape Google and they’ll have your head on a pike.
When they did it, it was progress. When you do it to them, that’s piracy. Every pirate wants to be an admiral.
Because this handful of companies has so thoroughly captured their regulators, they can wield the power of the state against you when you try to break their grip on power, even as their own flagrant violations of our rights go unpunished. Because they do them with an app.
Tech lost its fear of competition, neutralized the threat from regulators, and then put them in harness to attack new startups that might do unto them as they did unto the companies that came before them.
But even so, there was a force that kept our bosses in check. That force was us. Tech workers.
Tech workers have historically been in short supply, which gave us power, and our bosses knew it.
To get us to work crazy hours, they came up with a trick. They appealed to our love of technology, and told us that we were heroes of a digital revolution, who would “organize the world’s information and make it useful,” who would “bring the world closer together.”
They brought in expert set-dressers to turn our workplaces into whimsical campuses with free laundry, gourmet cafeterias, massages, and kombucha, and a surgeon on hand to freeze our eggs so that we could work through our fertile years.
They convinced us that we were being pampered, rather than being worked like government mules.
This trick has a name. Fobazi Ettarh, the librarian-theorist, calls it “vocational awe,” and Elon Musk calls it being “extremely hardcore.”
This worked very well. Boy did we put in some long-ass hours!
But for our bosses, this trick failed badly. Because if you miss your mother’s funeral to hit a deadline, and then your boss orders you to enshittify that product, you are gonna experience a profound moral injury, which you are absolutely gonna make your boss share.
Because what are they gonna do? Fire you? They can’t hire someone else to do your job, and you can get a job that’s even better at the shop across the street.
So workers held the line when competition, regulation and interop failed.
But eventually, supply caught up with demand. Tech laid off 260,000 of us last year, and another 100,000 in the first half of this year.
You can’t tell your bosses to go fuck themselves, because they’ll fire your ass and give your job to someone who’ll be only too happy to enshittify that product you built.
That’s why this is all happening right now. Our bosses aren’t different. They didn’t catch a mind-virus that turned them into greedy assholes who don’t care about our users’ wellbeing or the quality of our products.
As far as our bosses have always been concerned, the point of the business was to charge the most, and deliver the least, while sharing as little as possible with suppliers, workers, users and customers. They’re not running charities.
Since day one, our bosses have shown up for work and yanked as hard as they can on the big ENSHITTIFICATION lever behind their desks, only that lever didn’t move much. It was all gummed up by competition, regulation, interop and workers.
As those sources of friction melted away, the enshittification lever started moving very freely.
Which sucks, I know. But think about this for a sec: our bosses, despite being wildly imperfect vessels capable of rationalizing endless greed and cheating, nevertheless oversaw a series of actually great products and services.
Not because they used to be better people, but because they used to be subjected to discipline.
So it follows that if we want to end the enshittocene, dismantle the enshitternet, and build a new, good internet that our bosses can’t wreck, we need to make sure that these constraints are durably installed on that internet, wound around its very roots and nerves. And we have to stand guard over it so that it can’t be dismantled again.
A new, good internet is one that has the positive aspects of the old, good internet: an ethic of technological self-determination, where users of technology (and hackers, tinkerers, startups and others serving as their proxies) can reconfigure and mod the technology they use, so that it does what they need it to do, and so that it can’t be used against them.
But the new, good internet will fix the defects of the old, good internet, the part that made it hard to use for anyone who wasn’t us. And hell yeah we can do that. Tech bosses swear that it’s impossible, that you can’t have a conversation with a friend without sharing it with Zuck; or search the web without letting Google scrape you down to the viscera; or have a phone that works reliably without giving Apple a veto over the software you install.
They claim that it’s a nonsense to even ponder this kind of thing. It’s like making water that’s not wet. But that’s bullshit. We can have nice things. We can build for the people we love, and give them a place that’s worthy of their time and attention.
To do that, we have to install constraints.
The first constraint, remember, is competition. We’re living through an epochal shift in competition policy. After 40 years with antitrust enforcement in an induced coma, a wave of antitrust vigor has swept through governments all over the world. Regulators are stepping in to ban monopolistic practices, open up walled gardens, block anticompetitive mergers, and even unwind corrupt mergers that were undertaken on false pretenses.
Normally this is the place in the speech where I’d list out all the amazing things that have happened over the past four years. The enforcement actions that blocked companies from becoming too big to care, and that scared companies away from even trying.
Like Wiz, which just noped out of the largest acquisition offer in history, turning down Google’s $23b cashout, and deciding to, you know, just be a fucking business that makes money by producing a product that people want and selling it at a competitive price.
Normally, I’d be listing out FTC rulemakings that banned noncompetes nationwide. Or the new merger guidelines the FTC and DOJ cooked up, which – among other things – establish that the agencies should be considering whether a merger will negatively impact privacy.
I had a whole section of this stuff in my notes, a real victory lap, but I deleted it all this week.
[Can anyone guess why?]
That’s right! This week, Judge Amit Mehta, ruling for the DC Circuit of these United States of America, in docket 20-3010, a case known as United States v. Google LLC, found that “Google is a monopolist, and it has acted as one to maintain its monopoly," and ordered Google and the DOJ to propose a schedule for a remedy, like breaking the company up.
So yeah, that was pretty fucking epic.
Now, this antitrust stuff is pretty esoteric, and I won’t gatekeep you or shame you if you wanna keep a little distance on this subject. Nearly everyone is an antitrust normie, and that's OK. But if you’re a normie, you’re probably only catching little bits and pieces of the narrative, and let me tell you, the monopolists know it and they are flooding the zone.
The Wall Street Journal has published over 100 editorials condemning FTC Chair Lina Khan, saying she’s an ineffectual do-nothing, wasting public funds chasing doomed, quixotic adventures against poor, innocent businesses, and accomplishing nothing.
[Does anyone out there know who owns the Wall Street Journal?]
That’s right, it’s Rupert Murdoch. Do you really think Rupert Murdoch pays his editorial board to write one hundred editorials about someone who’s not getting anything done?
The reality is that in the USA, in the UK, in the EU, in Australia, in Canada, in Japan, in South Korea, even in China, we are seeing more antitrust action over the past four years than over the preceding forty years.
Remember, competition law is actually pretty robust. The problem isn’t the law, it’s the enforcement priorities. Reagan put antitrust in mothballs 40 years ago, but that elegant weapon from a more civilized age is now back in the hands of people who know how to use it, and they’re swinging for the fences.
Next up: regulation.
As the seemingly inescapable power of the tech giants is revealed for the sham it always was, governments and regulators are finally gonna kill the “one weird trick” of violating the law, and saying “It doesn’t count, we did it with an app.”
Like in the EU, they’re rolling out the Digital Markets Act this year. That’s a law requiring dominant platforms to stand up APIs so that third parties can offer interoperable services.
So a co-op, a nonprofit, a hobbyist, a startup, or a local government agency will eventually be able to offer, say, a social media server that can interconnect with one of the dominant social media silos, and users who switch to that new platform will be able to continue to exchange messages with the users they follow and groups they belong to, so the switching costs will fall to damned near zero.
That’s a very cool rule, but what’s even cooler is how it’s gonna be enforced. Previous EU tech rules were “regulations” as in the GDPR – the General Data Protection Regulation. EU regs need to be “transposed” into laws in each of the 27 EU member states, so they become national laws that get enforced by national courts.
For Big Tech, that means all previous tech regulations are enforced in Ireland, because Ireland is a tax haven, and all the tech companies fly Irish flags of convenience.
Here’s the thing: every tax haven is also a crime haven. After all, if Google can pretend it’s Irish this week, it can pretend to be Cypriot, or Maltese, or Luxembourgeois next week. So Ireland has to keep these footloose criminal enterprises happy, or they’ll up sticks and go somewhere else.
This is why the GDPR is such a goddamned joke in practice. Big tech wipes its ass with the GDPR, and the only way to punish them starts with Ireland’s privacy commissioner, who barely bothers to get out of bed. This is an agency that spends most of its time watching cartoons on TV in its pajamas and eating breakfast cereal. So all of the big GDPR cases go to Ireland and they die there.
This is hardly a secret. The European Commission knows it’s going on. So with the DMA, the Commission has changed things up: The DMA is an “Act,” not a “Regulation.” Meaning it gets enforced in the EU’s federal courts, bypassing the national courts in crime-havens like Ireland.
In other words, the “we violate privacy law, but we do it with an app” gambit that worked on Ireland’s toothless privacy watchdog is now a dead letter, because EU federal judges have no reason to swallow that obvious bullshit.
Here in the US, the dam is breaking on federal consumer privacy law – at last!
Remember, our last privacy law was passed in 1988 to protect the sanctity of VHS rental history. It's been a minute.
And the thing is, there's a lot of people who are angry about stuff that has some nexus with America's piss-poor privacy landscape. Worried that Facebook turned grampy into a Qanon? That Insta made your teen anorexic? That TikTok is brainwashing millennials into quoting Osama Bin Laden? Or that cops are rolling up the identities of everyone at a Black Lives Matter protest or the Jan 6 riots by getting location data from Google? Or that Red State Attorneys General are tracking teen girls to out-of-state abortion clinics? Or that Black people are being discriminated against by online lending or hiring platforms? Or that someone is making AI deepfake porn of you?
A federal privacy law with a private right of action – which means that individuals can sue companies that violate their privacy – would go a long way to rectifying all of these problems.
There's a pretty big coalition for that kind of privacy law! Which is why we have seen a procession of imperfect (but steadily improving) privacy laws working their way through Congress.
If you sign up for EFF’s mailing list at eff.org we’ll send you an email when these come up, so you can call your Congressjerk or Senator and talk to them about it. Or better yet, make an appointment to drop by their offices when they’re in their districts, and explain to them that you’re not just a registered voter from their district, you’re the kind of elite tech person who goes to Defcon, and then explain the bill to them. That stuff makes a difference.
What about self-help? How are we doing on making interoperability legal again, so hackers can just fix shit without waiting for Congress or a federal agency to act?
All the action here these days is in the state Right to Repair fight. We’re getting state R2R bills, like the one that passed this year in Oregon that bans parts pairing, where DRM is used to keep a device from using a new part until it gets an authorized technician’s unlock code.
These bills are pushed by a fantastic group of organizations called the Repair Coalition, at Repair.org, and they’ll email you when one of these laws is going through your statehouse, so you can meet with your state reps and explain to the JV squad the same thing you told your federal reps.
Repair.org’s prime mover is Ifixit, who are genuine heroes of the repair revolution, and Ifixit’s founder, Kyle Wiens, is here at the con. When you see him, you can shake his hand and tell him thanks, and that’ll be even better if you tell him that you’ve signed up to get alerts at repair.org!
Now, on to the final way that we reverse enshittification and build that new, good internet: you, the tech labor force.
For years, your bosses tricked you into thinking you were founders in waiting, temporarily embarrassed entrepreneurs who were only momentarily drawing a salary.
You certainly weren’t workers. Your power came from your intrinsic virtue, not like those lazy slobs in unions who have to get their power through that kumbaya solidarity nonsense.
It was a trick. You were scammed. The power you had came from scarcity, and so when the scarcity ended, when the industry started ringing up six-figure annual layoffs, your power went away with it.
The only durable source of power for tech workers is as workers, in a union.
Think about Amazon. Warehouse workers have to piss in bottles and have the highest rate of on-the-job maimings of any competing business. Whereas Amazon coders get to show up for work with facial piercings, green mohawks, and black t-shirts that say things their bosses don’t understand. They can piss whenever they want!
That’s not because Jeff Bezos or Andy Jassy loves you guys. It’s because they’re scared you’ll quit and they don’t know how to replace you.
Time for the second obligatory William Gibson quote: “The future is here, it’s just not evenly distributed.” You know who’s living in the future? Those Amazon blue-collar workers. They are the bleeding edge.
Drivers whose eyeballs are monitored by AI cameras that do digital phrenology on their faces to figure out whether to dock their pay, warehouse workers whose bodies are ruined in just months.
As tech bosses beef up that reserve army of unemployed, skilled tech workers, then those tech workers – you all – will arrive at the same future as them.
Look, I know that you’ve spent your careers explaining in words so small your boss could understand them that you refuse to enshittify the company’s products, and I thank you for your service.
But if you want to go on fighting for the user, you need power that’s more durable than scarcity. You need a union. Wanna learn how? Check out the Tech Workers Coalition and Tech Solidarity, and get organized.
Enshittification didn’t arise because our bosses changed. They were always that guy.
They were always yankin’ on that enshittification lever in the C-suite.
What changed was the environment, everything that kept that switch from moving.
And that’s good news, in a bankshot way, because it means we can make good services out of imperfect people. As a wildly imperfect person myself, I find this heartening.
The new good internet is in our grasp: an internet that has the technological self-determination of the old, good internet, and the greased-skids simplicity of Web 2.0 that let all our normie friends get in on the fun.
Tech bosses want you to think that good UX and enshittification can’t ever be separated. That’s such a self-serving proposition you can spot it from orbit. We know it, 'cause we built the old good internet, and we’ve been fighting a rear-guard action to preserve it for the past two decades.
It’s time to stop playing defense. It's time to go on the offensive. To restore competition, regulation, interop and tech worker power so that we can create the new, good internet we’ll need to fight fascism, the climate emergency, and genocide.
To build a digital nervous system for a 21st century in which our children can thrive and prosper.
Community voting for SXSW is live! If you wanna hear RIDA QADRI and me talk about how GIG WORKERS can DISENSHITTIFY their jobs with INTEROPERABILITY, VOTE FOR THIS ONE!
If you'd like an essay-formatted version of this post to read or share, here's a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:
https://pluralistic.net/2024/08/17/hack-the-planet/#how-about-a-nice-game-of-chess
Image: https://twitter.com/igama/status/1822347578094043435/ (cropped)
@[email protected] (cropped)
https://mamot.fr/@[email protected]/112963252835869648
CC BY 4.0 https://creativecommons.org/licenses/by/4.0/deed.pt
#pluralistic#defcon#defcon 32#hackers#enshittification#speeches#transcripts#disenshittify or die#Youtube
Can you please just tell us what is wrong with ai and why, I can't find anything from actual industry artists etc. online through Google, just tech bro type articles. All the tech articles are saying it's a good thing, and every pro I follow refuses to explain how or why it's bad. How am I supposed to know something if nobody will teach me and I can't find it myself?
I'll start by saying that the reason pro artists are refusing to answer questions about this is because they are tired. Like, I don't know if anyone actually understands just how exhausting it is to have to justify over and over again why the tech companies that are stealing your work and actively seeking to destroy your craft are 'bad, actually'.
I originally wrote a very longform reply to this ask, but in classic tumblr style the whole thing got eaten, so. I do not have the spoons to rewrite all that shit. Here are some of the sources I linked, I particularly recommend stable diffusion litigation for a thorough breakdown of exactly how generative tools work and why that is theft.
youtube
or this video if you are feeling lazy and only want the art-side opening statements:
Every time you feed someone's work – their art, their writing, their likeness – into Midjourney or Dall-E or ChatGPT, you are feeding this monster.
Go forth and educate yourself.
#ai art#asks answered#qwillechatter#for real though guys please dont interact with me about AI#I intended for this to be the last post I'll ever make about it#but tumblr ate it so you miss out on the nice essay#all the sources tell you everything you need to know#Im so fucking tired
been waiting for Matt Levine to write more about AI, and he doesn't disappoint
"Wells Fargo is using large language models to help determine what information clients must report to regulators and how they can improve their business processes. “It takes away some of the repetitive grunt work and at the same time we are faster on compliance,” said Chintan Mehta, the firm’s chief information officer and head of digital technology and innovation. The bank has also built a chatbot-based customer assistant using Google Cloud’s conversational AI platform, Dialogflow."
Do you think that Wells Fargo’s customer chatbot pushes customers to open more accounts to meet its quotas? Do you think that its regulatory-reporting chatbot then reports it to regulators? Soon Wells Fargo may be able to generate and negotiate billion-dollar regulatory settlements without any human involvement at all.

...

Isn’t this sort of exciting? The widespread use of relatively early-stage AI will introduce new ways of making mistakes into finance. Right now there are some classic ways of making mistakes in finance, and they periodically lead to consequences ranging from funny embarrassment through multimillion-dollar trading loss up to systemic financial crises. Many of the most classic mistakes have the broad shape of “overly confident generalizing from limited historical data,” though some are, like, hitting the wrong button. But there are only so many ways to go wrong, and they are all sort of intuitive.

...

Now some banker is going to type into a chat bot “our client wants to hedge the risk of the Turkish election,” and the chat bot will be like “she should sell some Dogecoin call options and use the proceeds to buy a lot of nickel futures,” and the banker will be like “weird okay whatever.” And that trade will go wrong in surprising ways, the client will sue, the client and the banker and the chat bot will all come to court, the judge will ask the chat bot “well why would this trade hedge anything,” and the chat bot will shrug its little imaginary shoulders and be like “bro why are you asking me I’m a chat bot.” Or it will say “actually the Dogecoin/nickel spread was ex ante an excellent proxy for Turkish political risk because” and then emit a series of ones and zeros and emojis and high-pitched noises that you and I and the judge can’t understand but that make perfect sense to the chat bot. New ways to be wrong! It will make life more exciting for financial columnists, for a bit, before we are all replaced by the chat bots.
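For context, the client side of a Dialogflow integration like the one the excerpt mentions is a small amount of code. Here is a hedged sketch of a text query against a Dialogflow ES agent using the official Python client; the project ID, session ID, and sample question are placeholders, and this says nothing about how Wells Fargo actually built its assistant.

```python
# Minimal sketch: ask a Dialogflow ES agent a question and return its reply.
from google.cloud import dialogflow  # pip install google-cloud-dialogflow


def ask_agent(project_id: str, session_id: str, text: str, language_code: str = "en") -> str:
    session_client = dialogflow.SessionsClient()
    session = session_client.session_path(project_id, session_id)

    query_input = dialogflow.QueryInput(
        text=dialogflow.TextInput(text=text, language_code=language_code)
    )
    response = session_client.detect_intent(
        request={"session": session, "query_input": query_input}
    )
    # The matched intent and its configured (or fulfilled) answer come back on query_result.
    return response.query_result.fulfillment_text


# Example call with hypothetical values:
# print(ask_agent("my-bank-project", "customer-42", "What is my checking balance?"))
```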
🔥
111 notes
·
View notes
Text
Integrating artificial intelligence (AI) into your mobile app can enhance the user experience, streamline operations, and support data-driven decision-making, all of which help improve efficiency and drive business growth.
0 notes
Text
The rising cost of healthcare is a global concern. From expensive treatments to skyrocketing insurance premiums, making quality healthcare accessible remains a challenge. But there's a silver lining emerging: artificial intelligence (AI) in healthcare holds immense potential to transform the healthcare landscape, making it more affordable for everyone.
How Can AI Reduce Medical Costs?
The use of AI in healthcare goes beyond futuristic robots performing surgery.
Here are some ways AI can contribute to cost reduction:
Improved Diagnostics and Early Detection: AI algorithms can analyze medical scans and patient data with remarkable accuracy, leading to earlier diagnoses of diseases. Early detection often translates to less expensive and less invasive treatments, saving money in the long run.
Streamlined Treatment Plans: AI can analyze vast amounts of medical data to identify optimal treatment strategies based on a patient's specific condition and medical history. This personalized approach can prevent unnecessary tests and procedures, reducing overall costs.
Enhanced Operational Efficiency: AI can automate administrative tasks like appointment scheduling, claims processing, and patient record management. This frees up valuable time for medical professionals, allowing them to focus on patient care and improving overall operational efficiency, potentially lowering operational costs.
Reduced Hospital Readmissions: AI can analyze patient data to predict potential complications and recommend preventative measures. This can significantly reduce hospital readmissions, a major contributor to healthcare costs (a small code sketch of this idea follows this list).
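As a concrete illustration of that last point, here is a toy sketch of a readmission-risk model built with scikit-learn on entirely synthetic data. The features, threshold, and labels are invented for the example; a real clinical model would need far richer data and rigorous validation.

```python
# Toy readmission-risk sketch: synthetic data only, illustrative workflow only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1000

# Hypothetical features: age, length of stay (days), number of prior admissions.
X = np.column_stack([
    rng.integers(20, 90, n),
    rng.integers(1, 15, n),
    rng.integers(0, 6, n),
])

# Synthetic label loosely correlated with those features.
risk_score = 0.02 * X[:, 0] + 0.10 * X[:, 1] + 0.40 * X[:, 2]
y = (risk_score + rng.normal(0, 1, n) > 3.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Rank patients by predicted risk so follow-up care can be prioritised.
probs = model.predict_proba(X_test)[:, 1]
print("AUC:", round(roc_auc_score(y_test, probs), 3))
```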
The Cost of AI in Healthcare: An Investment, Not an Expense
While implementing AI solutions in healthcare requires an initial investment, it's crucial to view it as an investment with long-term returns. As AI technology matures and becomes more widely adopted, cost-efficiencies will likely outweigh the initial investment. Additionally, the potential savings from early disease detection, personalized treatment plans, and reduced hospital readmissions can be substantial.
Exploring the Future of Affordable Healthcare with AI
AI is still in its early stages of development in healthcare, but the potential for cost reduction is undeniable. As research and innovation continue, we can expect even more advanced AI applications to emerge, transforming healthcare into a more accessible and affordable system for all.
By embracing AI, healthcare providers and institutions can pave the way for a future where quality healthcare is not a luxury, but a basic right available to everyone.
0 notes
Text
RECENT SEO & MARKETING NEWS FOR ECOMMERCE, JULY 2024
If you are new to my Tumblr, I usually do these summaries of SEO and marketing news once a month, picking out the pieces that are most likely to be useful to small and micro-businesses.
You can get notified of these updates plus my website blog posts via email: http://bit.ly/CindyLouWho2Blog or get all of the most timely updates plus exclusive content by supporting my Patreon: patreon.com/CindyLouWho2
TOP NEWS & ARTICLES
There is a relatively new way to file copyright claims against US residents, called The Copyright Claims Board (CCB). I wrote more here [post by me on Patreon]
After a few years of handwringing and false starts, Google is abandoning plans to block third-party cookies in Chrome. Both Safari and Firefox already block them.
When composing titles and other keyword-bearing text, it can be useful to have a short checklist of the types of keywords you need, as this screenshot demonstrates (a short code sketch after the list below illustrates the same checklist). While that title is too long for most platforms and search engines, it covers really critical points that should get mentioned in the product description and keyword fields/tags as well:
The core keywords that describe the item
What the customer is looking to do - solve a problem? Find a gift? Feel better?
What makes the product stand out in its field - why buy this instead of something else? Differentiating your items is something that should come before you get to the listing stage, so the keywords should already be in your head.
Relevant keywords that will be used in long tail searches are always great add-ons.
What, if anything, about your item is trendy now? E.g., sustainability? Particular colours, styles or materials/ingredients are always important.
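For readers who think in code, here is a quick illustrative sketch of that checklist as a data structure, with a length check against a platform title limit. All of the product data and the 140-character limit are hypothetical examples of mine, not recommendations from the newsletter.

```python
# Hypothetical listing data arranged by the keyword categories above.
checklist = {
    "core": "linen tea towel",                 # what the item is
    "customer_goal": "hostess gift",           # problem / gift / feeling
    "differentiator": "hand block printed",    # why this one and not another
    "long_tail": "botanical kitchen decor",    # long-tail add-on
    "trend": "sustainable",                    # current trend, if any
}

def build_title(parts: dict, max_length: int = 140) -> str:
    """Join the checklist entries into one title and warn if it is too long."""
    title = " ".join([
        parts["differentiator"], parts["trend"], parts["core"],
        parts["customer_goal"], parts["long_tail"],
    ])
    if len(title) > max_length:
        print(f"Warning: {len(title)} characters exceeds the {max_length}-character limit")
    return title

print(build_title(checklist))
```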
SEO: GOOGLE & OTHER SEARCH ENGINES
Google’s June spam update has finished rolling out. And here is the full list of Google news from June.
Expect a new Google core update “in the coming weeks” (as if we needed more Google excitement).
Google’s AI overviews continue to dwindle at the top of search results, now only appearing in 7% of searches.
Despite Google trying to target AI spam, many poorly-copied articles still outrank the originals in Google search results.
Internal links are important for Google SEO. While this article covers blogging in particular, most of the tips apply to any standalone website. Google also recently did a video [YouTube] on the same topic.
Google had a really excellent second quarter, mostly due to the cloud and AI.
Not Google
OpenAI is testing SearchGPT with a small number of subscribers. Alphabet shares dropped 3% after the announcement.
SOCIAL MEDIA - All Aspects, By Site
General
New social media alert: noplace is a new app billed as MySpace for Gen Z that also has some similarities with Twitter (e.g., text-based chats, with no photos or videos at this time). iOS only at the moment; no Android app or web page.
Thinking of trying out Bluesky? Here are some tips to get the most out of it.
Facebook (includes relevant general news from Meta)
Meta’s attempt at circumventing EU privacy regulations through paid subscriptions is illegal under the Digital Markets Act, according to the European Commission. “if the company cannot reach an agreement with regulators before March 2025, the Commission has the power to levy fines of up to 10 percent of the company’s global turnover.”
If you post Reels from a business page, you may be able to let Meta use AI to do A/B testing on the captions and other portions shown. I personally would not do this unless I could see what options they were choosing, since AI is often not as good as it thinks it is.
Apple’s 30% fee on in-app ad purchases for Facebook and Instagram has kicked in worldwide as of July 1.
Facebook is testing ads in the Notifications list on the app.
Meta is encouraging advertisers to connect their Google Analytics accounts to Meta Ads, claiming “integration could improve campaign performance, citing a 22% conversion increase.”
Instagram
The head of Instagram is still emphasizing that the number of DM shares per post is a huge ranking factor.
LinkedIn
Another article on the basics of setting up LinkedIn and getting found through it.
You can now advertise your LinkedIn newsletters on the platform.
Pinterest
Pinterest is slowly testing an AI program that edits the background of product photography without changing the product.
Is Pinterest dying? An investment research firm thinks so.
Reddit
If you want to see results from Reddit in your search engine results, Google is the only place that can happen now.
More than ever, Reddit is being touted as a way to be found (especially in Google search), but you do have to understand how the site works to be successful at it.
Snapchat
Snapchat+ now has 9 million paying users, and they are getting quite a few new personalization updates, and Snaps that last 50 seconds or less.
Threads
Threads has hit 175 million active users each month, up from 130 million in February.
TikTok
TikTok has made it easier to reuse your videos outside of the site without a watermark.
TikTok users can now select a custom thumbnail image for videos, either a frame from the clip itself, or a still image from elsewhere.
Twitter
You can opt out of Twitter using your posts as data for its AI, Grok.
YouTube
YouTube has new tools for Shorts, including one that makes your longer videos into Shorts.
Community Spaces are the latest YouTube test to try to get more fan involvement, while moving users away from video comments.
(CONTENT) MARKETING (includes blogging, emails, and strategies)
Start your content marketing plans for August now, including back-to-school themes and Alfred Hitchcock’s birthday on August 13.
ONLINE ADVERTISING (EXCEPT INDIVIDUAL SOCIAL MEDIA AND ECOMMERCE SITES)
Google Ads now have several new updates, including blocking misspellings.
Google’s new Merchant Center Next will soon be available for all users, if they haven’t already been invited. Supplemental feeds are now (or soon will be) allowed there.
STATS, DATA, TRACKING
Google Search Console users can now add their shipping and return info to Google search through the Console itself. This is useful for sites that do not pay for Google Ads or use Google’s free shopping ads.
BUSINESS & CONSUMER TRENDS, STATS & REPORTS; SOCIOLOGY & PSYCHOLOGY, CUSTOMER SERVICE
The second part of this Whiteboard Friday [video with transcript] discusses how consumer behaviour is changing during tight economic times. “People are still spending. They just want the most for their money. Also, the consideration phase is much more complex and longer.” The remainder of the piece discusses how to approach your target market during these times.
Prime Day was supposedly the best ever for Amazon, but they didn’t release any numbers. Adobe Analytics tracked US ecommerce sales on those days and provides some insight. “Buy-now, pay-later accounted for 7.6% of all orders, a 16.4% year-over-year increase.”
MISCELLANEOUS
You know how I always tell small business owners to have multiple revenue streams? Tech needs to have multiple providers and backups as well, as the recent CrowdStrike and Microsoft issues demonstrate.
If you used Google’s old URL shortener anywhere, those links will no longer redirect as of August 25 2024.
14 notes
·
View notes
Text
[embedded Instagram post]
It all started here. I woke up and looked at 'my' Discord server for the bloated tic of a fanfic (hey that rhymes, neurospicy two seconds!) I write with my partner SonaBeanSidhe, Aran Thranduil's Dining Hall. There hadn't been enough tea yet, so the natural response was 'wth is this?' Well, THIS was going to Hoover my next two days and counting.

Just visit the post, dear friends of the writing persuasion, and you would see (I'll save you time because I'm nice this way) that there are many screenshots supporting just what the meme says: this Instagram OP knows a writer who found herself in a steaming pile of mess with the Google Docs TOS (which is really also the Google Drive TOS, making it somewhat difficult to locate said TOS). I'll save you more time. Here:
This information comes from https://support.google.com/docs/answer/148505#zippy=%2Csexually-explicit-material

So, it was the act of sharing the files, not having created or stored them, that triggered what had happened. Okay, still not reassuring. In the meantime, I was concerned enough to post what had happened to r/FanFiction. Within minutes, my post had been frozen by the mods there. I can see their concern. I believe they had only seen the meme-like first screen of the Instagram post, and it does come off like a bad Snopes Fail. They asked me for more proof.

The first thing I did was a search for previous instances of Google having frozen Drive accounts. They are limited, but they exist. The concern here had to do with the current lack of information as to why the action happened, and whether we who collectively authored and had shared Docs with explicit content were about to have our Drive accounts fall before the scythe of an overzealous AI rollout on the part of Google. Which led to the second part of this. My first response to the mods was as follows:
A little while after I sent this, an update appeared on the Instagram OP's account (easy to find) so I added this:
The salient point of the update was that the author had received restored access to her Drive, but NOT the Doc that had started the entire difficulty. I waited for a response. And waited, and waited. In the meantime, I had no way to add to or respond to my original post, or to defend myself against some fairly sharp initial criticisms leveled against me by individuals who clearly hadn't read anything beyond the initial meme. Special. While I understand a sub this large likely has a lot of messages to field, I also felt that given they managed to freeze my post in a matter of minutes, they were really taking their time responding. So I weighed in on Instagram with a message of support.

If nothing else is taken away, please take this: Back up your files. It's easy to create an alt Google account; this other account will have its own Drive storage. Share your folders and files with yourself. If you don't already realize, in Docs under the File dropdown menu the third option is Make A Copy. If you have shared the original document 75 times, you won't have shared that copy at all. Consider backing up your works to Indie platforms like Get Hermit, Ellipsus and The Quill. There may be others. Your own external hard drive is also a very good idea.

But wait, you ask. What about just using MS Word, or having copies as a PDF file in the Adobe cloud? Not so fast, my friend. Look at Microsoft and Adobe's TOS regarding sexually explicit content...you'll get an unpleasant surprise. I did.

In the meantime, a third update came from the OP on Instagram shedding far more light as to what had likely gone wrong. She had shared the Doc with, she guessed, 18-20 people as beta readers. The belief is now that one of these persons actually turned her in to Google via their abuse form. To quote one of the commenters on that update: "I heard from an agent that Google will only do this to your account if someone reports it. Google isn’t scanning docs for explicit content (except maybe images), but yeah, for this to happen, someone had to report her document to Google."

This felt like a relief...and yet it is still packaged inside a cautionary tale. How many of us have shared by 'whoever has this link'? I have, or rather, I had. I revoked all such permissions to all large stories and folders in favor of my few trusted long-time friends and beta readers. The works are on AO3. I don't need those Docs links to be free floating all over the place, not after seeing what can happen.

I was on vacation in England some years prior when I realized I had lost the ability to send out all outgoing Gmail. It took me several days to figure out what had gone wrong, and it nearly lost me my ability to receive Covid test results at a time when that was a travel necessity...almost disastrous. Fortunately I had multiple email accounts and could work around the problem. Which brings me to the last of this.
While including the subreddit mod's responses here may seem retaliatory/petulant on my part...I'm at best miffed at their shortsightedness, maybe mildly insulted, mostly shaking my head at the fact that there is something to have been learned here that could have affected any one of us. No one was spreading misinformation. Something had indeed happened. The source of it had been unknown at the time of posting. Had they left my post alone I would have gladly redacted as it unfolded; the reason for placing it on that sub was not to create a giant stir but to let others know of the risk of this happening at all, and to encourage backing up that in which most writers invest a considerable amount of their emotional well-being: the safety of their written works.

Yes, I'm neurodivergent. No, I don't always see the world as others do. No, I still don't think this difference allows the response to be interpreted as much other than a slight, and one working against the interests of authors (I'm on their side, and last I checked, readers need the authors to have written the things in order to have the things to read. Just saying).

Whatever. What really matters is: if you feel this is worth sharing, share, take away the important bits, and if someone out there just wants to write it up in 3 sentences so it isn't a tl;dr, do that too.
Have a Googly day, to all near and far, in these times in which we live...>.>
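For anyone who would rather script the backup advice above than copy Docs by hand, here is a rough sketch using the Google Drive API to export every Doc you own as a .docx file to a local folder. It assumes you have already set up OAuth credentials (the `creds` object) with Drive access; it also only fetches the first page of results, so a real script would add pagination and filename sanitising. This is my own sketch, not something from the original post.

```python
# Rough Drive-backup sketch: export all Google Docs you own as .docx files.
import io
import os

from googleapiclient.discovery import build
from googleapiclient.http import MediaIoBaseDownload

DOCX_MIME = "application/vnd.openxmlformats-officedocument.wordprocessingml.document"

def backup_my_docs(creds, out_dir: str = "docs_backup") -> None:
    os.makedirs(out_dir, exist_ok=True)
    service = build("drive", "v3", credentials=creds)

    # List Google Docs that you own (first page only in this sketch).
    results = service.files().list(
        q="mimeType='application/vnd.google-apps.document' and 'me' in owners",
        fields="files(id, name)",
    ).execute()

    for f in results.get("files", []):
        # Export the Doc in Word format and stream it into memory.
        request = service.files().export_media(fileId=f["id"], mimeType=DOCX_MIME)
        buf = io.BytesIO()
        downloader = MediaIoBaseDownload(buf, request)
        done = False
        while not done:
            _, done = downloader.next_chunk()

        with open(os.path.join(out_dir, f"{f['name']}.docx"), "wb") as out:
            out.write(buf.getvalue())
        print(f"Backed up: {f['name']}")
```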
#google docs#google drive#terms of service#writing#fanfic#slash fanfiction#smut fanfiction#service provider#beta reader#Instagram
12 notes
·
View notes
Text
WBSV Sales Gallery and Headquarters
WBSV Sales Gallery + Headquarters is a new concept applied to the previous design, which was originally planned as a hotel and mall. The tower has a 3-level podium and 11 storeys of hotel. The podium includes the WBSV headquarters, sales gallery, retail spaces, amenities, parking, and other facilities.
•Project: WORLD BRIDGE SPORT VILLAGE
•Facility: WBSV SALES GALLERY and HEADQUARTERS
•Architectural Manager: Sonetra KETH
•Developer: OXLEY-WORLDBRIDGE (CAMBODIA) CO., LTD.
•Subsidiary: WB SPORT VILLAGE CO., LTD.
•Location: PHNOM PENH, CAMBODIA
#Sonetra Keth#Architectural Manager#Architectural Design Manager#BIM Director#BIM Manager#BIM Coordinator#Project Manager#RMIT University Vietnam#Institute of Technology of Cambodia#Real Estate Development#Construction Industry#Building Information Modelling#BIM#AI#Artificial Intelligence#Technology#VDC#Virtual Design#IoT#Machine Learning#Drones and UAVs#C4R#Cloud Computing and Collaboration Platforms#NETRA#netra#នេត្រា#កេត សុនេត្រា
0 notes
Text
The US Department of Justice had long been expected to file an antitrust lawsuit against Apple. But when the suit arrived Thursday, it came with surprising ferocity.
In a press conference, attorney general Merrick Garland noted that Apple controlled more than 70 percent of the country’s smartphone market, saying the company used that outsize power to control developers and consumers and squeeze more revenue out of them.
The suit and messaging from the DOJ and 15 states and the District of Columbia joining it take aim at Apple’s most prized asset—the iPhone—and position the case as a fight for the future of technology. The suit argues that Apple rose to its current power thanks in part to the 1998 antitrust case against Microsoft, and that another milestone antitrust correction is needed to allow future innovation to continue.
Like the Microsoft case, the suit against Apple is “really dynamic and forward looking,” says John Newman, a law professor at the University of Miami. “It's not necessarily about Apple seeing direct competitors,” he says. “It's more about them trying to grab the territory you would need if you were going to even try to compete against Apple.”
Antitrust action in the tech industry has been a focus of the Biden administration’s agenda, which has seen suits brought against both Amazon and Google by the DOJ and the Federal Trade Commission. “This case demonstrates why we must reinvigorate competition policy and establish clear rules of the road for Big Tech platforms,” Democratic senator Amy Klobuchar told WIRED in a statement.
Rebecca Hall Allensworth, a law professor at Vanderbilt University, says that though the government almost always faces an uphill battle in antitrust cases, the Apple case appears relatively solid. “It's a lot stronger than the FTC Amazon monopolization lawsuit from last year,” she says. “And yet, it's very hard to win antitrust cases.”
In a statement, Apple spokesperson Fred Sainz said that the lawsuit “threatens who we are and the principles that set Apple products apart in fiercely competitive markets,” including the way its products work “seamlessly” together and “protect people’s privacy and security.”
Apple has long argued that keeping its mobile operating system, app store, and other services closed offers greater security and safety for customers. But Newman says the DOJ complaint indicates that Apple doesn't enforce these policies as consistently as it would if the goal were really to protect users.
“Instead [Apple] heavily targets the types of app developers that pose the biggest competitive threat to Apple,” Newman says. The DOJ alleges that restrictions Apple places on iMessage, Apple Wallet, and other products and features create barriers that deter or even penalize people who may switch to cheaper options.
History Repeating
The antitrust case against Microsoft in the late 1990s accused the company of illegally forcing PC manufacturers and others to favor its web browser Internet Explorer. It is widely credited with causing the company to be slow to embrace the web, falling behind a wave of startups including Google and Amazon that grew into giants by making web services useful and lucrative.
When asked about the threat the new antitrust lawsuit might pose to Apple’s business, a DOJ official noted that “there are actually examples where companies, after having been charged and had to change business practices because they violated the antitrust laws in the long run, end up being more valuable than they were before.” Microsoft, thanks to its success in cloud services and more recently AI, is now the most valuable company in the world.
The Department of Justice said Thursday that any potential remedy was on the table for Apple—implying that even breaking up the company is a possibility. But Allensworth says it is unlikely the government would pursue that outcome. The proposed remedies could more likely force Apple to change its "technological and contractual restrictions on app development, and on interoperability with other phones,” she says. “That is something that could be very meaningful, if that remedy were fully realized and overseen in a good way. But it still leaves Apple basically in control of the ecosystem,” Allensworth says.
Paul Swanson, antitrust partner at the law firm Holland & Hart, sees potential difficulties ahead for the suit. “They're alleging that Apple is excluding competition in the smartphone market by making their products stickier, by making it very attractive to stay within their ecosystem. And the way that Apple does that, according to the DOJ, is that it doesn't cooperate nicely with other companies,” he says. But Swanson says antitrust laws don’t generally require companies to work with others. “A business doesn't violate antitrust laws by terminating or refusing to work with another business.”
This is not the first antitrust case against Apple. In 2020, Epic Games filed a lawsuit against the company, accusing it of anticompetitive behavior, after being kicked off the App Store for offering a version of the Fortnite game that circumvented Apple's steep 30 percent fees for in-app purchases. Epic lost the case in the lower courts, and in January the Supreme Court declined to hear the appeal—and Apple announced it would levy a new app store fee on developers.
Newman notes that the government seems to have kept a close eye on that case in constructing the suit launched Thursday. The case was filed in federal court in New Jersey, which sits in the Third Circuit, rather than in the Ninth Circuit, which includes California. He predicts it will ultimately end up before the Supreme Court. “I think this one's probably going all the way,” Newman says.
12 notes
·
View notes
Text
Unpersoned
Support me this summer on the Clarion Write-A-Thon and help raise money for the Clarion Science Fiction and Fantasy Writers' Workshop!
My latest Locus Magazine column is "Unpersoned." It's about the implications of putting critical infrastructure into the private, unaccountable hands of tech giants:
https://locusmag.com/2024/07/cory-doctorow-unpersoned/
The column opens with the story of romance writer K Renee, as reported by Madeline Ashby for Wired:
https://www.wired.com/story/what-happens-when-a-romance-author-gets-locked-out-of-google-docs/
Renee is a prolific writer who used Google Docs to compose her books, and share them among early readers for feedback and revisions. Last March, Renee's Google account was locked, and she was no longer able to access ten manuscripts for her unfinished books, totaling over 220,000 words. Google's famously opaque customer service – a mix of indifferently monitored forums, AI chatbots, and buck-passing subcontractors – would not explain to her what rule she had violated, merely that her work had been deemed "inappropriate."
Renee discovered that she wasn't being singled out. Many of her peers had also seen their accounts frozen and their documents locked, and none of them were able to get an explanation out of Google. Renee and her similarly situated victims of Google lockouts were reduced to developing folk-theories of what they had done to be expelled from Google's walled garden; Renee came to believe that she had tripped an anti-spam system by inviting her community of early readers to access the books she was working on.
There's a normal way that these stories resolve themselves: a reporter like Ashby, writing for a widely read publication like Wired, contacts the company and triggers a review by one of the vanishingly small number of people with the authority to undo the determinations of the Kafka-as-a-service systems that underpin the big platforms. The system's victim gets their data back and the company mouths a few empty phrases about how they take something-or-other "very seriously" and so forth.
But in this case, Google broke the script. When Ashby contacted Google about Renee's situation, Google spokesperson Jenny Thomson insisted that the policies for Google accounts were "clear": "we may review and take action on any content that violates our policies." If Renee believed that she'd been wrongly flagged, she could "request an appeal."
But Renee didn't even know what policy she was meant to have broken, and the "appeals" went nowhere.
This is an underappreciated aspect of "software as a service" and "the cloud." As companies from Microsoft to Adobe to Google withdraw the option to use software that runs on your own computer to create files that live on that computer, control over our own lives is quietly slipping away. Sure, it's great to have all your legal documents scanned, encrypted and hosted on GDrive, where they can't be burned up in a house-fire. But if a Google subcontractor decides you've broken some unwritten rule, you can lose access to those docs forever, without appeal or recourse.
That's what happened to "Mark," a San Francisco tech worker whose toddler developed a UTI during the early covid lockdowns. The pediatrician's office told Mark to take a picture of his son's infected penis and transmit it to the practice using a secure medical app. However, Mark's phone was also set up to synch all his pictures to Google Photos (this is a default setting), and when the picture of Mark's son's penis hit Google's cloud, it was automatically scanned and flagged as Child Sex Abuse Material (CSAM, better known as "child porn"):
https://pluralistic.net/2022/08/22/allopathic-risk/#snitches-get-stitches
Without contacting Mark, Google sent a copy of all of his data – searches, emails, photos, cloud files, location history and more – to the SFPD, and then terminated his account. Mark lost his phone number (he was a Google Fi customer), his email archives, all the household and professional files he kept on GDrive, his stored passwords, his two-factor authentication via Google Authenticator, and every photo he'd ever taken of his young son.
The SFPD concluded that Mark hadn't done anything wrong, but it was too late. Google had permanently deleted all of Mark's data. The SFPD had to mail a physical letter to Mark telling him he wasn't in trouble, because he had no email and no phone.
Mark's not the only person this happened to. Writing about Mark for the New York Times, Kashmir Hill described other parents, like a Houston father identified as "Cassio," who also lost their accounts and found themselves blocked from fundamental participation in modern life:
https://www.nytimes.com/2022/08/21/technology/google-surveillance-toddler-photo.html
Note that in none of these cases did the problem arise because Google's services are advertising-supported, i.e., because these people weren't paying for the product and so were the product. Buying an $800 Pixel phone or paying more than $100/year for a Google Drive account means that you're definitely paying for the product, and you're still the product.
What do we do about this? One answer would be to force the platforms to provide service to users who, in their judgment, might be engaged in fraud, or trafficking in CSAM, or arranging terrorist attacks. This is not my preferred solution, for reasons that I hope are obvious!
We can try to improve the decision-making processes at these giant platforms so that they catch fewer dolphins in their tuna-nets. The "first wave" of content moderation appeals focused on the establishment of oversight and review boards that wronged users could appeal their cases to. The idea was to establish these "paradigm cases" that would clarify the tricky aspects of content moderation decisions, like whether uploading a Nazi atrocity video in order to criticize it violated a rule against showing gore, Nazi paraphernalia, etc.
This hasn't worked very well. A proposal for "second wave" moderation oversight based on arms-length semi-employees at the platforms who gather and report statistics on moderation calls and complaints hasn't gelled either:
https://pluralistic.net/2022/03/12/move-slow-and-fix-things/#second-wave
Both the EU and California have privacy rules that allow users to demand their data back from platforms, but neither has proven very useful (yet) in situations where users have their accounts terminated because they are accused of committing gross violations of platform policy. You can see why this would be: if someone is accused of trafficking in child porn or running a pig-butchering scam, it would be perverse to shut down their account but give them all the data they need to go on committing these crimes elsewhere.
But even where you can invoke the EU's GDPR or California's CCPA to get your data, the platforms deliver that data in the most useless, complex blobs imaginable. For example, I recently used the CCPA to force Mailchimp to give me all the data they held on me. Mailchimp – a division of the monopolist and serial fraudster Intuit – is a favored platform for spammers, and I have been added to thousands of Mailchimp lists that bombard me with unsolicited press pitches and come-ons for scam products.
Mailchimp has spent a decade ignoring calls to allow users to see what mailing lists they've been added to, as a prelude to mass unsubscribing from those lists (for Mailchimp, the fact that spammers can pay it to send spam that users can't easily opt out of is a feature, not a bug). I thought that the CCPA might finally let me see the lists I'm on, but instead, Mailchimp sent me more than 5900 files, scattered through which were the internal serial numbers of the lists my name had been added to – but without the names of those lists or any contact information for their owners. I can see that I'm on more than 1,000 mailing lists, but I can't do anything about it.
Mailchimp shows how a rule requiring platforms to furnish data-dumps can be easily subverted, and its conduct goes a long way to explaining why a decade of EU policy requiring these dumps has failed to make a dent in the market power of the Big Tech platforms.
The EU has a new solution to this problem. With its 2024 Digital Markets Act, the EU is requiring platforms to furnish APIs – programmatic ways for rivals to connect to their services. With the DMA, we might finally get something parallel to the cellular industry's "number portability" for other kinds of platforms.
If you've ever changed cellular platforms, you know how smooth this can be. When you get sick of your carrier, you set up an account with a new one and get a one-time code. Then you call your old carrier, endure their pathetic begging not to switch, give them that number and within a short time (sometimes only minutes), your phone is now on the new carrier's network, with your old phone-number intact.
This is a much better answer than forcing platforms to provide service to users whom they judge to be criminals or otherwise undesirable, but the platforms hate it. They say they hate it because it makes them complicit in crimes ("if we have to let an accused fraudster transfer their address book to a rival service, we abet the fraud"), but it's obvious that their objection is really about being forced to reduce the pain of switching to a rival.
There's a superficial reasonableness to the platforms' position, but only until you think about Mark, or K Renee, or the other people who've been "unpersonned" by the platforms with no explanation or appeal.
The platforms have rigged things so that you must have an account with them in order to function, but they also want to have the unilateral right to kick people off their systems. The combination of these demands represents more power than any company should have, and Big Tech has repeatedly demonstrated its unfitness to wield this kind of power.
This week, I lost an argument with my accountants about this. They provide me with my tax forms as links to a Microsoft Cloud file, and I need to have a Microsoft login in order to retrieve these files. This policy – and a prohibition on sending customer files as email attachments – came from their IT team, and it was in response to a requirement imposed by their insurer.
The problem here isn't merely that I must now enter into a contractual arrangement with Microsoft in order to do my taxes. It isn't just that Microsoft's terms of service are ghastly. It's not even that they could change those terms at any time, for example, to ingest my sensitive tax documents in order to train a large language model.
It's that Microsoft – like Google, Apple, Facebook and the other giants – routinely disconnects users for reasons it refuses to explain, and offers no meaningful appeal. Microsoft tells its business customers, "force your clients to get a Microsoft account in order to maintain communications security" but also reserves the right to unilaterally ban those clients from having a Microsoft account.
There are examples of this all over. Google recently flipped a switch so that you can't complete a Google Form without being logged into a Google account. Now, my ability to pursue all kinds of matters both consequential and trivial turns on Google's good graces, which can change suddenly and arbitrarily. If I were like Mark, permanently banned from Google, I wouldn't have been able to complete Google Forms this week telling a conference organizer what sized t-shirt I wear, but also telling a friend that I could attend their wedding.
Now, perhaps some people really should be locked out of digital life. Maybe people who traffic in CSAM should be locked out of the cloud. But the entity that should make that determination is a court, not a Big Tech content moderator. It's fine for a platform to decide it doesn't want your business – but it shouldn't be up to the platform to decide that no one should be able to provide you with service.
This is especially salient in light of the chaos caused by Crowdstrike's catastrophic software update last week. Crowdstrike demonstrated what happens to users when a cloud provider accidentally terminates their account, but while we're thinking about reducing the likelihood of such accidents, we should really be thinking about what happens when you get Crowdstruck on purpose.
The wholesale chaos that Windows users and their clients, employees, users and stakeholders underwent last week could have been pieced out retail. It could have come as a court order (either by a US court or a foreign court) to disconnect a user and/or brick their computer. It could have come as an insider attack, undertaken by a vengeful employee, or one who was on the take from criminals or a foreign government. The ability to give anyone in the world a Blue Screen of Death could be a feature and not a bug.
It's not that companies are sadistic. When they mistreat us, it's nothing personal. They've just calculated that it would cost them more to run a good process than our business is worth to them. If they know we can't leave for a competitor, if they know we can't sue them, if they know that a tech rival can't give us a tool to get our data out of their silos, then the expected cost of mistreating us goes down. That makes it economically rational to seek out ever-more trivial sources of income that impose ever-more miserable conditions on us. When we can't leave without paying a very steep price, there's practically a fiduciary duty to find ways to upcharge, downgrade, scam, screw and enshittify us, right up to the point where we're so pissed that we quit.
Google could pay competent decision-makers to review every complaint about an account disconnection, but the cost of employing that large, skilled workforce vastly exceeds their expected lifetime revenue from a user like Mark. The fact that this results in the ruination of Mark's life isn't Google's problem – it's Mark's problem.
The cloud is many things, but most of all, it's a trap. When software is delivered as a service, when your data and the programs you use to read and write it live on computers that you don't control, your switching costs skyrocket. Think of Adobe, which no longer lets you buy programs at all, but instead insists that you run its software via the cloud. Adobe used the fact that you no longer own the tools you rely upon to cancel its Pantone color-matching license. One day, every Adobe customer in the world woke up to discover that the colors in their career-spanning file collections had all turned black, and would remain black until they paid an upcharge:
https://pluralistic.net/2022/10/28/fade-to-black/#trust-the-process
The cloud allows the companies whose products you rely on to alter the functioning and cost of those products unilaterally. Like mobile apps – which can't be reverse-engineered and modified without risking legal liability – cloud apps are built for enshittification. They are designed to shift power away from users to software companies. An app is just a web-page wrapped in enough IP to make it a felony to add an ad-blocker to it. A cloud app is some Javascript wrapped in enough terms of service clickthroughs to make it a felony to restore old features that the company now wants to upcharge you for.
Google's defenestration of K Renee, Mark and Cassio may have been accidental, but Google's capacity to defenestrate all of us, and the enormous cost we all bear if Google does so, has been carefully engineered into the system. Same goes for Apple, Microsoft, Adobe and anyone else who traps us in their silos. The lesson of the Crowdstrike catastrophe isn't merely that our IT systems are brittle and riddled with single points of failure: it's that these failure-points can be tripped deliberately, and that doing so could be in a company's best interests, no matter how devastating it would be to you or me.
If you'd like an essay-formatted version of this post to read or share, here's a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:
https://pluralistic.net/2024/07/22/degoogled/#kafka-as-a-service
Image: Cryteria (modified) https://commons.wikimedia.org/wiki/File:HAL9000.svg
CC BY 3.0 https://creativecommons.org/licenses/by/3.0/deed.en
521 notes
·
View notes