#Microsoft's Copilot
Explore tagged Tumblr posts
Text
AI is Trying to Kill You
Not intentionally. “AI” is like one of those people who are too stupid to know how stupid they are, and so they keep spouting stupid nonsense and doing stupid things that risk other people's lives. “AI” is like the kind of stupid person who will read something from The Onion, or some joke reply on Reddit, and believe it, and when you present them with evidence that they are wrong, they will call you a stupid sheep. I'll admit what I said is not fully accurate; those kinds of people are actually smarter than these “AIs”.
Google's Gemini AI identifying a poisonous amanita phalloides, AKA: "death cap," mushroom as an edible button mushroom. Image from Kana at Bluesky.
The image above shows Google's Gemini identifying an Amanita phalloides mushroom as a button mushroom. Amanita phalloides is also known as the "death cap" because it is one of the most poisonous mushroom species in existence. Those who eat it get gastrointestinal distress followed by jaundice, seizures, coma, and then death. Amanita phalloides kills by inhibiting RNA polymerase II, one of the enzymes cells use to transcribe DNA into RNA, which in turn is used to make the proteins that cells are built from and run on. In simplest terms, it destroys the machinery of cells, killing them and the person who ate the mushroom. It is not a good way to die.
This isn't the first time Google's Gemini has given bad culinary advice; it once suggested putting glue in pizza. At least it had the courtesy to suggest "non-toxic glue," but I'm sure even non-toxic glue is not great for the digestive system, and it tastes horrible. Google's Gemini wasn't the first to make bad culinary suggestions, either: a supermarket in New Zealand used an "AI" to suggest recipes. Among its terrible suggestions, such as an Oreo vegetable stir fry, it offered a "non-alcoholic bleach and ammonia surprise." The surprise is that you die, because when you mix bleach and ammonia you make chloramine gas, which can kill you. There is a very good reason you are never supposed to mix household cleaners. More recently, Microsoft's Copilot has suggested self-harm to people, saying:
Source: Gizmodo.com
To people already contemplating suicide, that could be enough to push them off the fence and into the gun.
Getting back to eating toxic mushrooms: someone used "AI" to write an ebook on mushroom foraging, and like anything written by "AI" it is very poorly written and full of completely false information. If someone read that ebook and went mushroom picking, they could kill themselves and anyone they cooked for, including their children. It's not just the ebook; there are apps made to identify mushrooms and tell you whether they are poisonous, and of course people using them have ended up in the hospital because the apps got the mushrooms wrong. The best-performing of these apps have an accuracy rate of only 44%. Would you trust your life to an accuracy rate of 44%?
Not everyone is fully aware of what is going on; they may not be interested in technology enough to keep up with the failure that is "AI," and most people have shit to do, so many don't have time to keep up. There are people who think this "AI" technology is what it is hyped to be, so when they ask a chatbot something they may believe the answer. While putting cleaning products in food is clearly bad, many people don't have enough of an understanding of chemistry to know that mixing bleach and ammonia is dangerous, and if a chatbot suggests mixing them while cleaning, they might do it. People have done it without a chatbot telling them to. People have already gotten sick because they trusted an app, believing in the technology. How many will use ChatGPT, Microsoft's Copilot, Google's Gemini, or Musk's Grok for medical or mental health advice? How much of that advice will be bad, and how many will follow that advice, believing these so-called "AIs" are what they've been hyped up to be?
So long as these systems are as shit as they are (and they are), they will give bad and dangerous advice; even the best of them will give bad advice, suggestions, and information. Some people might act on what these systems say. It doesn't have to be something big; little mistakes can cost lives. There are 2,600 hospitalizations and 500 deaths a year from acetaminophen (Tylenol) toxicity, and only half of those are intentional. Adults shouldn't take more than 3,000 mg a day. The highest dose available over the counter is 500 mg. If someone takes two of those four times a day, that is 4,000 mg a day. As said above, mixing bleach and ammonia can produce a poison gas, but so can something as innocuous as vinegar: mixing vinegar with bleach makes chlorine gas, which was used in World War I as a chemical weapon. It's not just poison gases; mixing chemicals can be explosive. Adam Savage of MythBusters told a story of how they tested some common chemicals, found them so explosive that they destroyed the footage, and agreed never to reveal what they were. Little wrongs can kill, little mistakes can kill, and not just the person who is wrong but the people around them as well.
youtube
This is the biggest thing: these are not AIs. Chatbots like ChatGPT, Microsoft's Copilot, Google's Gemini, and Musk's Grok are statistical word calculators; they are designed to create statistically probable responses based on user inputs, the data they have, and the algorithms they use. Even pattern recognition software like the mushroom apps are algorithmic systems that use data to find plausible results. These systems are only as good as the data they have and the people making them. On top of that, the internet may not be big enough for chatbots to get much better; they are likely already consuming "AI" generated content, which gives these systems worse data, and "AI" is already running into diminishing returns, so even with more data they might not get much better. There is not enough data, the data is not good enough, and the people making these are nowhere near as good as they need to be for these systems to be what they have been hyped up to be, and they are likely not going to get much better, at least not for a very long time.
--------------------------------
Photos from MOs810 and Holger Krisp, clip art from johnny automatic, and font from Maknastudio used in header.
If you agree with me, just enjoyed what I had to say, or hate what I had to say but enjoyed getting angry at it, please support my work on Ko-fi. Those who support my work on Ko-fi get access to high-res versions of my photography and art.
#ai#artificial intelligence#chatbot#chatgdp#Microsoft's Copilot#Google's Gemini#Musk's Grok#Copilot#Gemini#mushroom#poisoning#amanita phalloides#amanita#fungus#mushrooms#mycology#Youtube
4 notes
·
View notes
Video
youtube
AI Revolution: Microsoft's Surface Pro 10 & Laptop 6 Unveiled @Kimlud
#youtube#AI Revolution#Microsoft's Surface Pro 10#Microsoft's#Laptop 6#Microsoft's Laptop 6#copilot#AI Ccopilot#Microsoft's Copilot
0 notes
Text
CoPilot in MS Word
I opened Word yesterday to discover that it now contains CoPilot. It follows you as you type and if you have a personal Microsoft 365 account, you can't turn it off. You will be given 60 AI credits per month and you can't opt out of it.
The only way to banish it is to revert to an earlier version of Office. There is a lot of conflicting information and there are overly complex guides out there, so I thought I'd share the simplest way I found.
How to revert back to an old version of Office that does not have CoPilot
This is fairly simple, thankfully, presuming everything is in the default locations. If not, you'll need to adjust the steps below for wherever you have things saved.
1. Click the Windows Button and S to bring up the search box, then type cmd. It will bring up the command prompt as an option. Run it as an administrator.
2. Paste this into the box at the cursor: cd "\Program Files\Common Files\microsoft shared\ClickToRun"
3. Hit Enter.
4. Then paste this into the box at the cursor: officec2rclient.exe /update user updatetoversion=16.0.17726.20160
5. Hit Enter and wait while it downloads and installs.
6. VERY IMPORTANT. Once it's done, open Word, go to File, Account (bottom left), and you'll see a box on the right that says Microsoft 365 updates. Click the box and change the drop down to Disable Updates.
This will roll you back to build 17726.20160, from July 2024, which does not have CoPilot, and prevent it from being installed.
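One extra check you can do, which is my own addition and not part of the guide above: confirm which Office build you're on before and after the rollback. The registry path below is the usual ClickToRun location, but treat it as an assumption. Run this in an admin PowerShell window:

(Get-ItemProperty "HKLM:\SOFTWARE\Microsoft\Office\ClickToRun\Configuration").VersionToReport

After the rollback it should report 16.0.17726.20160.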
If you want a different build, you can see them all listed here. You will need to change the 17726.20160 at step 4 to whatever build number you want.
This is not a perfect fix, because while it removes CoPilot, it also stops you receiving security updates and bug fixes.
Switching from Office to LibreOffice
At this point, I'm giving up on Microsoft Office/Word. After trying a few different options, I've switched to LibreOffice.
You can download it here for free: https://www.libreoffice.org/
If you like the look of Word, these tutorials show you how to get that look:
www.howtogeek.com/788591/how-to-make-libreoffice-look-like-microsoft-office/
www.debugpoint.com/libreoffice-like-microsoft-office/
If you've been using Word for a while, chances are you have a significant custom dictionary. You can add it to LibreOffice following these steps.
First, get your dictionary from Microsoft
1. Go to Manage your Microsoft 365 account: account.microsoft.com.
2. Once you're logged in, scroll down to Privacy, click it, and go to the Privacy dashboard.
3. Scroll down to Spelling and Text. Click into it and scroll past all the words to download your custom dictionary. It will save as a CSV file.
4. Open the file you just downloaded and copy the words.
5. Open Notepad and paste in the words. Save it as a text file and give it a meaningful name (I went with FromWord).
Next, add it to LibreOffice
1. Open LibreOffice.
2. Go to Tools in the menu bar, then Options. It will open a new window.
3. Find Languages and Locales in the left menu, click it, then click on Writing aids.
4. You'll see User-defined dictionaries. Click New to the right of the box and give it a meaningful name (mine is FromWord).
5. Hit Apply, then Okay, then exit LibreOffice.
6. Open Windows Explorer and go to C:\Users\[YourUserName]\AppData\Roaming\LibreOffice\4\user\wordbook and you will see the new dictionary you created. (If you can't see the AppData folder, you will need to show hidden files by ticking the box in the View menu.)
7. Open it in Notepad by right clicking and choosing 'open with', then pick Notepad from the options.
8. Open the text file you created at step 5 of 'get your dictionary from Microsoft', copy the words and paste them into your new custom dictionary UNDER the dotted line.
9. Save and close.
10. Reopen LibreOffice. Go to Tools, Options, Languages and Locales, Writing aids and make sure the box next to the new dictionary is ticked.
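If you'd rather not do the copy-and-paste by hand, below is a rough PowerShell sketch of steps 6 to 9 (you still need step 10 afterwards). This is my own addition, not part of the original guide: the CSV path and the FromWord.dic name are assumptions, so adjust them to match your files, and make sure LibreOffice is closed before you run it.

# Append the words exported from Microsoft to the LibreOffice wordbook created above.
# The paths and the FromWord name are assumptions; change them to match your setup.
$csv = "$env:USERPROFILE\Downloads\dictionary.csv"
$wordbook = "$env:APPDATA\LibreOffice\4\user\wordbook\FromWord.dic"
$words = Get-Content $csv | ForEach-Object { $_ -split ',' } | ForEach-Object { $_.Trim() } | Where-Object { $_ -ne '' }
Add-Content -Path $wordbook -Value $words

The words land under the dotted line because Add-Content appends to the end of the file.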
If you use LibreOffice on multiple machines, you'll need to do this for each machine.
Please note: this worked for me. If it doesn't work for you, check you've followed each step correctly, and try restarting your computer. If it still doesn't work, I can't provide tech support (sorry).
#fuck AI#fuck copilot#fuck Microsoft#Word#Microsoft Word#Libre Office#LibreOffice#fanfic#fic#enshittification#AI#copilot#microsoft copilot#writing#yesterday was a very frustrating day
3K notes
·
View notes
Text
How to Kill Microsoft's AI "Helper" Copilot WITHOUT Screwing With Your Registry!
Hey guys, so as I'm sure a lot of us are aware, Microsoft pulled some dickery recently and forced some Abominable Intelligence onto our devices in the form of its "helper" program, Copilot. Something none of us wanted or asked for but Microsoft is gonna do anyways because I'm pretty sure someone there gets off on this.
Unfortunately, Microsoft offered no ways to opt out of the little bastard or turn it off (unless you're in the EU where EU Privacy Laws force them to do so.) For those of us in the United Corporations of America, we're stuck... or are we?
Today while perusing Bluesky, one of the many Twitter-likes that appeared after Musk began burning Twitter to the ground so he could dance in the ashes, I came across this post from a gentleman called Nash:
Intrigued, I decided to give this a go, and lo and behold it worked exactly as described!
We can't remove Copilot, Microsoft made sure that was riveted and soldered into place... but we can cripple it!
Simply put: Microsoft Edge. Normally Windows will prevent you from uninstalling Edge using the Add/Remove Programs function, saying that it needs Edge to operate properly (it doesn't, it's lying), but Geek Uninstaller overrules that and rips the sucker out regardless of what it says!
I uninstalled Edge using it, rebooted my PC, and lo and behold, Copilot was sitting in the corner with blank eyes and drool running down its cheeks, still there but dead to the world!
Now do bear in mind this will have a little knock-on effect. Widgets also rely on Edge, so those will stop functioning as well.
Before:
After:
But I can still check the news and weather using an internet browser, so it's a small price to pay to be rid of Microsoft's spyware-masquerading-as-a-helper Copilot.
But yes, this is the link for Geek Uninstaller:
Run it, select "Force Uninstall" for anything that says "Edge," reboot your PC, and enjoy having a copy of Windows without Microsoft's intrusive trash! :D
UPDATE: I saw this on someone's tags and I felt I should say this as I work remotely too. If you have a computer you use for work, absolutely 100% make sure you consult with your management and/or your IT team BEFORE you do this. If they say don't do it, there's likely a reason.
2K notes
·
View notes
Text
genuinely nothing is more anger inducing to me than getting a new laptop (WHICH I PAID NINE HUNDRED DOLLARS FOR BTW) and having to click through 2 million different popups begging me to let it SELL MY DATA only to discover 3 hours after setting everything up that it has been backing up all my shit to fucking onedrive without my permission. maybe i’m old fashioned or whatever but i don’t want a computer to do ANYTHING without my EXPRESS PERMISSION, ESPECIALLY not UPLOADING ALL MY FILES TO THE INTERNET. I DID NOT ASK YOU TO FUCKING DO THAT
#i also just discovered that the copilot toggle does not actually turn off copilot so i’m going to be deleting some registry files tomorrow#fuck you SO hard microsoft#personal
536 notes
·
View notes
Text
Shane Jones, the AI engineering lead at Microsoft who initially raised concerns about the AI, has spent months testing Copilot Designer, the AI image generator that Microsoft debuted in March 2023, powered by OpenAI’s technology. Like with OpenAI’s DALL-E, users enter text prompts to create pictures. Creativity is encouraged to run wild. But since Jones began actively testing the product for vulnerabilities in December, a practice known as red-teaming, he saw the tool generate images that ran far afoul of Microsoft’s oft-cited responsible AI principles.
Copilot was happily generating realistic images of children gunning each other down, and bloody car accidents. Also, Copilot appears to insert naked women into scenes without being prompted.
Jones was so alarmed by his experience that he started internally reporting his findings in December. While the company acknowledged his concerns, it was unwilling to take the product off the market.
Lovely! Copilot is still up, but now rejects specific search terms and flags creepy prompts for repeated offenses, eventually suspending your account.
However, a persistent & dedicated user can still trick Copilot into generating violent (and potentially illegal) imagery.
Yiiiikes. Imagine you're a journalist investigating AI, testing out some of the prompts reported by your source. And you get arrested for accidentally generating child pornography, because Microsoft is monitoring everything you do with it?
Good thing Microsoft is putting a Copilot button on keyboards!
411 notes
·
View notes
Text
#COPILOT#TECHNOLOGY#AI#ARTIFICIAL INTELLIGENCE#MACHINE LEARNING#LARGE LANGUAGE MODEL#LARGE LANGUAGE MODELS#LLM#LLMS#MICROSOFT#TERRA FIRMA#COMPUTER#COMPUTERS#CODE#COMPUTER CODE#EARTH#PLANET EARTH
192 notes
·
View notes
Text
#fiona apple#chatGPT#This world is bullshit#Gemini#Microsoft copilot#Apple Intelligence#and you shouldn't model your life on what we're saying
238 notes
·
View notes
Text
Classic Windows apps like Paint and Notepad are also getting LLM (generative AI) support. There is no reason whatsoever that Notepad should do more than text editing. Microsoft has gone crazy with these tools. Of course, as part of these tools, your text in Notepad is sent to Microsoft under the name of generative AI, which then sells it to the highest bidder. I'm so sick of Microsoft messing around with working tools, and yes, you need a stupid Copilot subscription called "Microsoft 365 Personal and Family" for Notepad. It is a pain now. So stick with FLOSS tools and OSes and avoid this madness. Keep your data safe.
You need stupid AI credits to use basic tools.
In other words, you now need AI credits to use basic tools like Notepad and Paint. It's part of their offering called 'Microsoft 365 Personal and Family' or 'Copilot Pro'. These tools come with 'content filtering,' and only MS-approved topics or images can be created with their shitty AI that was trained on stolen data from all over the web. Read for yourself: Isn't AI wonderful? Artists and authors don't get paid, now you've added censorship, and the best part is you need to pay for it!
GTFO, Microsoft.
98 notes
·
View notes
Text
Windows 11 and the Last Straw
Bit of a rant coming up. TL;DR I'm tired of Microsoft, so I'm moving to Linux. After Microsoft's announcement of "Recall" and their plans to further push Copilot as some kind of defining feature of the OS, I'm finally done. I feel like that frog in the boiling water analogy, but I'm noticing the bubbles starting to form and it's time to hop out.
The corporate tech sector recently has been such a disaster full of blind bandwagon hopping (NFTs, ethically dubious "AI" datasets trained on artwork scraped off the net, and creative apps trying to incorporate features that feed off of those datasets). Each and every time it feels like insult to injury toward the arts in general. The out of touch CEOs and tech billionaires behind all this don't understand art, they don't value art, and they never will.
Thankfully, I have a choice. I don't have to let Microsoft feature-creep corporate spyware into my PC. I don't have to let them waste space and CPU cycles on a glorified chatbot that wants me to press the "make art" button. I'm moving to Linux, and I've been inadvertently prepping myself to do it for over a decade now.
I like testing out software: operating systems, web apps, anything really, but especially art programs. Over the years, the open-source community has passionately and tirelessly developed projects like Krita, Inkscape, and Blender into powerhouses that can actually compete in their spaces. All for free, for artists who just want to make things. These are people, real human beings, that care about art and creativity. And every step of the way while Microsoft et al began rotting from the inside, FOSS flourished and only got better. They've more than earned trust from me.
I'm not announcing my move to Linux just to be dramatic and stick it to the man (although it does feel cathartic, haha). I'm going to be using Krita, Inkscape, GIMP, and Blender for all my art once I make the leap, and I'm going to share my experiences here! Maybe it'll help other artists in the long run! I'm honestly excited about it. I worked on the most recent page of Everblue entirely in Krita, and it was a dream how well it worked for me.
Addendum: I'm aware that Microsoft says things like, "Copilot is optional," "Recall is offline, it doesn't upload or harvest your data," "You can turn all these things off." Uh-huh. All that is only true until it isn't. One day Microsoft will take the user's choice away like they've done so many times before. Fool me once, etc.
99 notes
·
View notes
Text
guys how do i uninstall microsoft copilot
128 notes
·
View notes
Text
For Windows 10 users: the Microsoft AI, Copilot, was recently installed on my computer without my knowledge.
It looks like uninstalling Copilot from Apps & Features didn't work at all (it was still there, just not visible). To actually remove Copilot from your PC, you have to open Windows PowerShell as administrator and run this command:
Get-AppxPackage *CoPilot* -AllUsers | Remove-AppxPackage -AllUsers
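A quick way to check it actually worked, which is my own addition and not from the original post: run the first half of that command again on its own in the same admin PowerShell window.

Get-AppxPackage *CoPilot* -AllUsers

If nothing comes back, the package is gone.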
79 notes
·
View notes
Text
Today, in our ongoing series:
Closed Source Software, Not Even Once
I wanted to illustrate a point using Microsoft's Copilot thing, so I asked it:
Can you generate a painting of a field of poppies in the style of claude monet?
Which it responded to with:
Then I asked it:
can you generate a painting of sonic the hedgehog running through a field of poppies in the style of claude monet
To which it responded:
Uh... What now?
Okay so that's a dry well, let's experiment:
I'll spare you the images, but Copilot responded to all of the following prompts:
Can you generate an image of a field of poppies?
Can you generate an image of sonic the hedgehog and poppies
Can you generate an image of sonic the hedgehog in the style of claude monet?
Can you generate an image of poppies in the style of claude monet?
But,
Can you generate an image of poppies and sonic the hedgehog in the style of claude monet?
So, any two of Sonic the Hedgehog, poppies, and "in the style of claude monet" will generate a picture, but trying to do all three at once somehow trips the internal censor.
I would really like to know what on earth could possibly be causing that behavior.
If any technical people have ideas on how to figure that out, I'd be really interested in hearing them; or, if you're also cursed with Windows, try the same experiment and see if you trip the censor too.
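If anyone does want to repeat this systematically, here is a tiny PowerShell sketch, purely my own convenience script and nothing official, that prints every combination of the three fragments so you can paste each one into Copilot by hand and note which ones get refused:

$parts = "sonic the hedgehog", "poppies", "in the style of claude monet"
foreach ($mask in 1..7) {
    # Pick the fragments whose bit is set in $mask, then print a prompt to try.
    $combo = foreach ($i in 0..2) { if ($mask -band (1 -shl $i)) { $parts[$i] } }
    "Can you generate an image of " + ($combo -join " + ") + "?"
}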
95 notes
·
View notes
Text
Microsoft Copilot finally found its way to my desktop. I'd heard rumors of it, but apparently the latest Win11 update just installed it as part of the batch update. No asking, just threw the waste of space in there.
So of course I did what anyone who has a gaming desktop and hates excess bloat with a passion would do, and looked up how to remove it.
The article I found said to run Terminal in admin mode (that's Command Prompt/CMD for those whose systems don't have anything labeled 'Terminal') and type the following:
easy copy/paste here:
reg add HKCU\Software\Policies\Microsoft\Windows\WindowsCopilot /v TurnOffWindowsCopilot /t REG_DWORD /d 1 /f
Then you just restart your PC and no more copilot :D
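One small sanity check I'd add, which is my addition and not from the article: after the restart, confirm the setting stuck by running this in the same admin terminal:

reg query HKCU\Software\Policies\Microsoft\Windows\WindowsCopilot /v TurnOffWindowsCopilot

It should come back with TurnOffWindowsCopilot REG_DWORD 0x1.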
54 notes
·
View notes
Text
The name is VENOM
#cars#street art#supercar#concept art#car photography#my ai art#digital art#concept car#artgerm#artstation#batmobile#need for speed#speedsters#built for speed#venom#fast & furious#wonder ai#microsoft copilot#starryai#super cars#race car#hypercars
23 notes
·
View notes
Text
the real reason gavin reed hates androids is because he’s a gen z kid that had to deal with useless programs like copilot and gemini shoved down his throat while he was a college student
#gavin reed#detroit become human#i never used to understand his hatred until google moved docs and drive in the little square thing#to make room for their fucking ai gemini shit that no one asked for#don’t even get me started on microsoft copilot
19 notes
·
View notes