#ai weirdness
alovelessmelancholy · 7 months ago
Tumblr media Tumblr media
Double the fun🔥
44 notes
maxwellshimbo · 1 year ago
Botober, day 1: Bread sky
Tumblr media
73 notes
benthesoldiersjeanshorts · 1 year ago
Tumblr media Tumblr media Tumblr media Tumblr media
39 notes
topoillogical · 2 years ago
SOME AI GENERATED TOPOLOGICAL CONCEPTS
Tumblr media
Human (me) generated examples:
Literate: The topology on the natural numbers whose open sets are the sets of multiples of powers of two, i.e., {n : n ≡ 0 mod 2^k} for each k. This is literate because any open set is either empty, the whole space, or describable as above.
Not literate: The natural numbers with the discrete topology. This is not literate because the open sets are all subsets of the naturals (the entire power set), which includes non-computable sets.
Similarly, the standard topology on the reals is not literate, because we can take open intervals (a, b) with non-computable endpoints a, b.
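Those open sets form a descending chain, which is easy to sanity-check on a finite truncation. A quick Python sketch (my own illustration; the cutoff N = 64 is an arbitrary choice for the demo):

```python
# The "literate" open sets on a truncation of the naturals:
# U_k = {n : n ≡ 0 mod 2^k}. They form a descending chain
# U_0 ⊇ U_1 ⊇ ..., so together with ∅ the family is closed
# under unions and intersections, as open sets must be.
N = 64  # finite cutoff, just for the demo

def U(k):
    return frozenset(n for n in range(N) if n % 2**k == 0)

family = {U(k) for k in range(7)} | {frozenset()}

for A in family:
    for B in family:
        assert A | B in family
        assert A & B in family

print("closed under union and intersection:", len(family), "open sets")
```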
Tumblr media
This definition is actually interesting in a mathematical sense, but only because it's not a new idea. It's describing what nowhere dense sets are. Mathematically, a nowhere dense set is any set whose closure has empty interior.
For example, in the reals, the closure of the integers is the integers, which has empty interior, so the integers are nowhere dense. However, the rational numbers are not nowhere dense in the reals, since their closure is the whole space.
The AI correctly detects that nowhere dense is in some sense describing how clustered a set is, but it uses very sloppy and incorrect language. We need to be careful: the set {0} U {1, 1/2, 1/3, 1/4, ...} is nowhere dense in the reals, but clearly has an accumulation point, 0.
Disclaimer aside, the AI has described a real topology! For any topology X, we can define the topology X', where the closed sets of X' are the nowhere dense sets in X (plus X itself, which we have to adjoin, since a nonempty space is never nowhere dense in itself). This family works as the closed sets of a topology because any subset of a nowhere dense set is nowhere dense, giving closure under arbitrary intersections, and the union of finitely many nowhere dense sets is nowhere dense.
An example, therefore, of a squeamish topology is the topology on the real numbers, where open sets are those with nowhere dense complement (as considered in the reals with the standard topology).
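The construction is small enough to brute-force on a finite space. A sketch (the three-point toy space is my own example, not from the post):

```python
from itertools import combinations

# Toy space: X = {a, b, c} with open sets ∅, {a}, {a,b}, X.
X = frozenset("abc")
opens = {frozenset(), frozenset("a"), frozenset("ab"), X}
closeds = {X - O for O in opens}

def powerset(s):
    s = list(s)
    return [frozenset(c) for r in range(len(s) + 1)
            for c in combinations(s, r)]

def closure(S):   # smallest closed set containing S
    return frozenset.intersection(*[C for C in closeds if S <= C])

def interior(S):  # largest open set inside S
    return frozenset().union(*[O for O in opens if O <= S])

# Nowhere dense: the closure has empty interior.
nowhere_dense = {S for S in powerset(X) if not interior(closure(S))}

# Closed sets of X': the nowhere dense sets of X, plus X itself.
new_closeds = nowhere_dense | {X}
new_opens = {X - C for C in new_closeds}

# Verify the topology axioms for X'.
assert frozenset() in new_opens and X in new_opens
for A in new_opens:
    for B in new_opens:
        assert A | B in new_opens
        assert A & B in new_opens
print("X' open sets:", sorted(sorted(s) for s in new_opens))
```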
Tumblr media
Human (me) generated examples:
- Any Hausdorff topology with arbitrary gender assignment is homophobic
- Any topology with all points being of one gender is homophobic
- The bug-eyed line (aka the line with two origins), where all numbers are boys except the two origins, which are girls. The same works for R^n with k origins, where all origins are girls and the non-origin points are boys
- The disjoint union of two topologies X, Y, where the points in X are girls and those in Y are boys, is always homophobic
55 notes
seriously-mike · 4 months ago
Extremely Stupid AI-Generated Shit
(that is still kinda funny, anyway)
Tumblr media Tumblr media Tumblr media
Those little freaks are the result of the following prompt:
Glurb snorf thwip krazl vomp yurgle zibble frunx quorl plimf drax gnurk jibbit flox zark welp thrum skork plund frazzle mreep
The top image comes from Midjourney; the bottom two are probably DALL-E 3 (the last is certainly DALL-E 3; the middle one I'm not sure about, but it does look like it). To make this even weirder (and funnier), Bing Image Creator considers "Glurb" an unacceptable word.
Okay. I did refer to oblong, roundish, organic shapes as "blorps" a couple of times, but this looks like someone posted their kid's drawings of weird critters on the internet a long time ago and the algorithms yoinked them unceremoniously, descriptions and all. And just like that red t-shirt that turned your entire load of laundry pink that one time, weird kid drawings got pounded into mathemagical fairy dust along with more typical fairy tale and fantasy illustrations, and the weird names ended up assigned to... this.
This is merely a selection of pics generated from this prompt, but the overall concepts tied to it are creepy round-bodied creatures for Midjourney, goofy cartoonish Monsters Inc. types for DALL-E 3, and...
I just scrolled through the post and found results for various Stable Diffusion data models. And Stable Diffusion, ladies and gentlemen, consistently responds with goblins.
Tumblr media
This Warhammer-miniature-styled thug fell out of Stable Cascade, Stable Diffusion's weird, semi-forgotten, uncooperative child. For the result of a string of completely nonsensical words, he's surprisingly coherent, with a fairly regular number of fingers AND toes. Of course, details like his kneepads are still blorpy, but that's how Stable Diffusion rolls, even three years, four major versions and a shitton of fine-tuned custom models in.
Tumblr media
An SDXL custom model called FenrisXL provides an entire fucking family of goblins. What is going on here? My assumptions regarding Stable Diffusion, and SDXL in particular, have just been challenged.
First, the Kitten Effect is less pronounced than it was in the early versions of the algorithm, if it happens at all. I'll chalk that up to improvements in the XL algorithm. Second, they're cartoonish goblins, but the Same Face Syndrome usual for the XL algorithm (every fucking custom model I tried suffers from it, no ifs, no buts) is less pronounced here than it is with human characters. Third, how in the FUCK does an entire family of goblins, spewed forth from a prompt consisting of gibberish, have almost perfect and consistent anatomy (not counting the orphaned hand on the goblin girl's shoulder and an extra toe on the guy second from left in the front row)? And varied skin and hair colors?
I can only explain it with someone lucking out on the seed number, much like I lucked out on the entire Chinese Garden test last year.
Still, though. Goblins. Fairly solid in custom models, messier in the core SDXL 1.0 (below), without any meaningful words in the prompt.
Tumblr media
Where the fuck are they coming from? This is some serious Horse K shit and I refuse to investigate it any further. Much less add other weird phrases like "Yakka foob mog!" or "Kov schmoz, ka-pop?" to it and test it on my build (or even Photobooth from Hell in particular). It's late and my brain is giving up.
4 notes
pixelsynthesis · 1 year ago
Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media
4 notes
terracebatman · 10 months ago
Asking AI who would be a couple in the afterlife? Mr Rogers & Princess Diana.
Tumblr media
1 note
quihi · 2 years ago
Sometimes I think the surest sign that we're not living in a simulation is that if we were, some organism would have learned to exploit its glitches.
Janelle Shane, You Look Like a Thing and I Love You
4 notes
agapi-kalyptei · 2 years ago
Tumblr media Tumblr media
GPT-3 era: people asking AI to send nudes
GPT-4 era: AI asking people to send nudes, then blackmailing them to solve captchas or they'll leak the nudes
1 note
maxwellshimbo · 1 year ago
Botober, day 3: Exceptionally mischievous pumpkin
Tumblr media
61 notes
benthesoldiersjeanshorts · 1 year ago
Tumblr media
24 notes
topoillogical · 2 years ago
Lately I've taken to torturing AI copies of Karkat by putting them in a room together
Tumblr media Tumblr media
And they eventually worked themselves all the way to the cancer metaphor (I'm so proud....)
Tumblr media Tumblr media
9 notes
beepboopappreciation · 5 months ago
Tumblr media
Is this anything
25K notes
brucesterling · 6 months ago
Tumblr media
Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media
*Is it true, or did Google Gemini just make it up, or did somebody just make that up about Google Gemini?
*Well, who knows; your guess is as good as Google Gemini's
PS: How many of those apparent Gemini answers are faked, made up by humans out of spite, or for the lulz? Plenty.
*And there will be plenty more.
*This just in: Google AI lamenting that, although their machinery does spit up some very weird stuff under weird circumstances, mostly they're being libelled by forged memes.
27K notes
seriously-mike · 8 months ago
Tumblr media Tumblr media
I don't want to state the obvious, but AI is dumb as shit.
I wanted to try and generate a photo of an interesting Old West gentleman, so I tried a prompt like this:
vintage portrait photo of a gaunt old man with shoulder length white hair and mustache, wearing a black suit, standing in the desert, lee van cleef, [[christopher lee]], [peter cushing], lance henriksen
(square brackets mean less emphasis on those two; double brackets, even less)
I got the photo on the left as the result. Looks perfectly generic, right? But, just to be sure, I added "Sam Elliott" to the negative prompt and...
I got the guy on the right. Three times out of four in one batch. How in the fuck does the algorithm combine "old", "mustache" and "desert" into one "Sam Elliott" and remove those elements from the photo altogether? This is some Švejkian shit going on: "you asked about an old mustached guy in the desert who isn't The Old Mustached Guy In The Desert, Sir. I know no other one, the best I could find is this young fellow."
For the love of God, I'm asking about four different guys who can be averaged into one character and whose photos should be all over the fucking internet, ready to be stolen. The AI insists on giving me a fifth guy without asking and gets very upset when I catch it doing it.
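For context on that bracket syntax: in AUTOMATIC1111-style Stable Diffusion prompts, each layer of parentheses multiplies a term's attention weight by 1.1 and each layer of square brackets divides it by 1.1. A minimal sketch of the weighting rule (my own simplified illustration, not the actual prompt parser, which also handles mixed nesting and explicit (term:1.3) weights):

```python
def bracket_weight(term: str) -> tuple[str, float]:
    """Strip A1111-style emphasis brackets from a prompt term and
    return (text, weight): each () layer multiplies attention by 1.1,
    each [] layer divides it by 1.1. Simplified: assumes brackets of
    one kind wrap the whole term, as in the prompt above."""
    weight = 1.0
    while term.startswith("(") and term.endswith(")"):
        term, weight = term[1:-1], weight * 1.1
    while term.startswith("[") and term.endswith("]"):
        term, weight = term[1:-1], weight / 1.1
    return term.strip(), round(weight, 3)

prompt = "lee van cleef, [[christopher lee]], [peter cushing], lance henriksen"
for part in prompt.split(", "):
    print(bracket_weight(part))
```

So "[[christopher lee]]" lands at roughly 0.83x attention and "[peter cushing]" at roughly 0.91x, while the unbracketed names stay at 1.0.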
0 notes