#automated screencapping
Explore tagged Tumblr posts
Text
From a thread by Twitter user @mykola:
Ok, so, the following thread is going to be dense. I have a model of what I call "Identity Trauma" that is not exclusive to Neurodivergent people but so common among us that almost nobody can actually see it. Let me tell you a story.
When you are an infant, and you have needs that other people don't understand, nobody will be able to meet those needs. And so you grow up, from a very early age, with the empirical, evidence-based understanding that parts of you are not valid. Those parts don't shut up, tho!
Maybe you're an Autistic kid whose hearing is so hypersensitive that it's physically painful to you to be around your (large, joyful, boisterous) family. Maybe you're ADHD and your emotions are so strong that Everything! Feels! Extreme! Always! Whatever it is? It's not welcome.
And like, you try over your early life to communicate to people about this thing. And they don't believe you. They tell you "sometimes people are loud when they're happy, it's ok, don't be scared!" or they tell you "stop with the dramatics, you just want attention."
Every attempt to inhabit your full self is somehow curtailed, cut short, and you receive anywhere from a tiny bit to a WHOLE LOT of Shame for it. This leads to cognitive dissonance: do you listen to the part of you that says "this can't continue", or your caregivers?
(And remember, you're like six.)
The choice, for a sadly enormous percentage of us, is to trust our caregivers. They assure us we're fine. They assure us everyone deals with this, that if we just try a bit harder then we'll be okay. That part of us that's screaming? We start to wall it off.
It turns out we've got a lot of really useful construction material laying around in the form of shame. Every time that pain tries to get out? It gets shamed back in. So we just finish the process, seal it in.
Bliss.
Relief.
We can't hear the screaming anymore. We can now focus on making sure we trust our caregivers, instead. Except. By walling off that voice, that pain? We've walled off our relationship to our body. But something VERY IMPORTANT lives there: our identity.
Your identity is who you are. It's everything you know to be true, everything you value, everything you feel. It's the name for the sum total of the enormous Thing that you are. One part of that thing is letting you know about unmet needs. But it does so much more.
It answers every question you need to ask yourself. How does it answer them? By thinking about it? No. It uses embodied, somatic, axiomatic knowledge. This is important, read this a few times: It is not cognitive. It doesn't feel like thinking. It feels like feeling.
[…]
Emotions are one of the ways our body communicates with us. With one exception, emotions are signals from your body to your self. That exception is shame.
Shame is the only emotion that originates externally. Shame comes from other people instructing us to feel it. That's it. And if you are cut off from your body? It is literally the only emotion you are really in touch with. And so you organize your WHOLE LIFE around it.
Listen, this model I'm describing? This is codependency, right? Because what's happening: your "core" self, your embodied axiomatic somatic source of truth, is gone. So your identity is not grounded in your body. Where is it grounded? In the approval of those around you.
If you're dealing with this shit, I will now perform a magic trick and tell you something about yourself that you will realize you have always known but that nobody has ever pointed out before. There's a special class of relationship in your life. It's not friend, parent, lover or anything else you'll find a hallmark card for, although it frequently coexists with some of the people in these roles. But the special class of relationship you have is that set of people that you trust to tell you who you are. You have complicated feelings about them.
These are the people that you have tasked, often without their knowledge or their consent, to serve as the grounding for your sense of identity. They are your surrogate embodied self. And they hold unfathomable power over you. (This is why we are so susceptible to abuse.)
Healing is in part about taking back those parts of you that you have invested in other people. That was never a gift to them, that was not about love -- that was an abdication of your responsibility to be a PERSON. It's not your fault. You didn't know. Your self was taken.
#the psyche#useful reframings#open to feedback on the formatting here‚ honestly#originally i was just going to put every tweet in as a link and let tumblr do its automated screencap-and-transcribe thing#but i found all the extraneous visual noise (repeated username‚ &c) really fucking distracting#so hopefully this is a decent balance of readable and clearly-not-mine#anyway some of this was old news to me but i did find the particular framing of it compelling
22 notes
·
View notes
Text
everything's going great over on twitter dot com
an astronomer from Oxfordshire was locked out of her Twitter account for three months after sharing a video of a meteor which was flagged by the site's automated moderation tools
Mary McIntyre was told that her six-second animated clip featured "intimate content"
full story at the BBC: X
39 notes
·
View notes
Text
Here's part of a timeline of artificial intelligence over the next couple of decades:
"The Dawn of the Singularity" timeline suggests we'll welcome the Technological Singularity in 2045!
Idk I think if you aren't going to do the work of becoming a technical observer and trying to understand the nuances of how these models work (and I sure as hell am not gonna bother yet) it's best to avoid idle philosophizing about "bullshit engines" or "stochastic parrots" or "world models"
Both because you are probably making some assumptions that are completely wrong which will make you look like a fool and also because it doesn't really matter - the ultimate success of these models rests on the reliability of their outputs, not on whether they are "truly intelligent" or whatever.
And if you want to have an uninformed take anyway... can I interest you in registering a prediction? Here are a few of mine:
- No fully self-driving cars sold to individual consumers before 2030
- AI bubble initially deflates after a couple more years without slam-dunk profitable projects, but research and iterative improvement continues
- Almost all white collar jobs incorporate some form of AI that meaningfully boosts productivity by mid 2030s
284 notes
·
View notes
Text
Pandora's Vault
Builder : Awesamdude
Series : DSMP
Propaganda :
- its so big. Its so so big. Look at a map of the dsmp. Its just a black void bigger than l'manburg was.
- You look at it and you just know it's something terrible. the obsidian walls, lava, the iron. It's just there. In the middle of the ocean. It does not fit in and its scary.
- the AMOUNT of redstone and functions it got is AMAZING. the only way to enter is through a portal that then leads u to the nether and has to be manually activated again by the warden. So to enter you literally NEED the warden's permission. All the bridges and all the doors. It's so fucking cool man what can I say. The amount of security.
- the lore that happened inside pandora as well. Pandoras arc was the best arc of the whole of dream smp and I stand by that. There is so so much to unpack.
Sam and Dream could have just built some shitty obsidian box and called it a prison, but no they made PANDORAS VAULT
The Everdusk Castle
Builder : ToAsgaard
Series : ATM Spellbound Series
Propaganda : this place is built in another dimension (the Everdusk), it's gigantic (extends about 50ish blocks further down past where i was able to grab a good screencap), and it's fully detailed inside. not in like a "some stuff here and there" or "there are redstone machines" -- every single room is detailed out, often with visuals corresponding to the mod being used, any automated setups are given a ton of visual flair to fit with the theme of the base, there's even automation setups that serve as visual rooms (the Botania automation room uses Kekimuras and is set up as a banquet hall! it's so cool!!). i think about it constantly. ToAsgaard's builds are consistently drop-dead gorgeous (his soulsborne-inspired Celestial Journey/Betweenlands base and gigantic multi-piece Sevtech Ages base are both fantastic) with ridiculously intricate detailing and really cool modded automation setups. his Celestial Journey base, Carcosa, was a close second for me -- but its power lies in all that detailing and isn't nearly as screencappable from the outside. Asgaard's an amazing builder both on the megabase and microdetail levels, incredible at standard modded automation and at doing things the fun way. he's been inactive for a few years now but i still adore his stuff, and this seemed like a good way to show off an absolutely spectacular builder that otherwise people might not know about.
Taglist!
@10piecechickenmcnugget
@choliosus
@biro-slay
@betweenlands
@xdsvoid
236 notes
·
View notes
Text
Hate the new desktop layout that the staff is experimenting with? Dreading the idea/possibility of reblog chains going away via collapsible reblogs? Any other features or changes that you do not like? Send them your feedback with a Support form!
1. Go to https://tumblr.com/support.
2. Choose the "Feedback" category.
3. Fill out the big "The more details, the better" box below your chosen category with whatever feedback or criticisms you want to share. Please keep in mind that you should NOT use this as an excuse to be rude or condescending towards the staff, no matter how annoyed you are at them at making these meaningless changes.
4. If you are able to, provide them with an image to help give the staff and support team a better idea of what you're talking about. The support form only allows one image, so if you have multiple images that you want to include, compile them all into a single, organized image file to share.
5. Choose whether or not it's relevant to your blog. I leave it as "None selected" because the feedback I share and the issues/bugs I report are typically regarding what's affecting the Tumblr userbase as a whole.
6. Not providing a screencap for this step for obvious privacy reasons, but make sure your account email listed in the form is the same one you use to log into your account with.
7. Prove to reCAPTCHA that you're not a robot and send your support form.
(Make sure to double check by seeing if your email inbox received an automated message from the support team.)
And that's it! It is that easy to share your feedback with the staff. Remember the various instances of the staff rolling back some of the questionable changes made to the mobile apps? It was because of these feedback forms; sending them has a larger chance of getting noticed by them than tagging staff and support in posts and reblogs.
#my post#tumblr#tumblr staff#tumblr support#tumblr feedback#tumblr dashboard#tumblr update#tumblr changes
257 notes
·
View notes
Text
TUTORIAL 3
Making Gifs Purely with FFmpeg Scripts
FULL TUTORIALS | RECOLORED GIFS
Directly opposite to the last tutorial, this tutorial is for girls who LOVE command lines. You can make a goddamned motherfucking pristine, RECOLORED gif set from your command line, and feel cool as fuck while you're at it. And doing so is probably not quite as devoid of visual stimulus as you think!!
FULL SIZE EXAMPLE SET HERE | FULL CODE FOR THE SET HERE
Operating systems: Mac, Windows, and Linux
Quality potential: High
Software needed: FFmpeg
Difficulty: Advanced as far as gif-making, but this is actually a good first bash scripting project in my opinion if you've ever wanted to learn how.
Requirements: General familiarity with your computer's shell (Powershell on Windows, Terminal on Mac) helps a lot! I will try to make it as easy as possible to follow.
LIMITATIONS: 1) Frame by frame gif-making methods take up a lot of space on your drive in the interim. 2) The captioning method currently provided in the tutorial is not good for longer captions (there is a better method I plan to append eventually). 3) Recoloring options are minimal in this tutorial. Curves and many other color adjustments are available in FFmpeg but I haven't yet explored them.
First, let me get this response out of the way:
"I don’t understand how I can possibly do this if I can’t see what I’m doing!"
That’s the neat part—you DO get to see what you’re doing!
The first visual is your file browser.
If you were using Photoshop or GIMP to make these gifs, you'd view all of your frames in a sidebar as small thumbnails and select and scroll through them to add, delete, and group-select and manipulate them. The FRAMES folder in our file browser will serve this function, letting us see every frame in our set, delete any frames we don't want, and group them to apply the same crop, coloring, or caption to the same selection of frames.
The second visual is provided by FFplay.
FFplay is part of FFmpeg. When you use this command in your system shell, it opens an image or gif so you can see the results of a crop, recolor, caption (or all three!) on your work before actually applying it! This is how we will actually see, at full resolution on our screen, exactly how our image is looking.
WINDOWS USERS: CHANGE ALL FORWARD SLASHES IN THIS SCRIPT TO BACK SLASHES!!!
___________________________________________
1. Installing FFmpeg
___________________________________________
I recommend you install via Homebrew if you're on Mac or Linux by pasting this into your terminal (if/once Homebrew is installed):
brew install ffmpeg
Windows users or other users who don't want Homebrew can follow directions on the FFmpeg website.
___________________________________________
2. Screencapping Frames with FFmpeg
___________________________________________
There are many ways to do this. However, since this is the Pure FFmpeg tutorial, I'm going to point you to these instructions on how to automate this process with a script template I have already written for you and an example (screencapping for this exact set).
If you follow that tutorial exactly, you should come back for step 3 with a folder on your Desktop called FRAMES.
___________________________________________
3. Organize in the FRAMES Folder
___________________________________________
Go to your FRAMES folder in your file browser. Your frames are already ordered for you in the correct time sequence.
Delete any unwanted frames.
I need to "trim" some frames from the beginning and end of my set. For example, at the beginning of my set, the first 17 screencaps are of Sam from the first 564 milliseconds after the 8:02 timestamp. I don't want these frames in my gifset. I want my first gif to start on Dean when he appears in the 18th screencap.
If I hold shift and select all of the unwanted frames, I can delete them. Similarly, my screencaps end at the 8:11 mark, and only the first 156 milliseconds are on Dean. After that, the shot switches back to the purple guy. I don't want those shots of the purple guy, so I'll delete frames 247 through 280 at the end of the set, leaving me with frame_0018.png through frame_0246.png as all the frames of my gifset.
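If the run of unwanted frames is long, the trimming can also be scripted from the shell instead of done by hand in the file browser. A minimal sketch, using the frame_%04d.png naming from the screencapping step (the frame ranges mirror the example above, run against throwaway files in a scratch directory):

```shell
# Demo in a scratch directory with dummy frames 0001-0280.
cd "$(mktemp -d)"
for n in $(seq 1 280); do touch "$(printf 'frame_%04d.png' "$n")"; done

# Delete frames 1-17 and 247-280, keeping frame_0018 through frame_0246.
for n in $(seq 1 17) $(seq 247 280); do
  rm "$(printf 'frame_%04d.png' "$n")"
done

ls frame_*.png | head -n 1   # first surviving frame
ls frame_*.png | wc -l       # 229 frames kept
```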
Make "Shot Folders".
As an example of what I mean by shots, look at the three example gifs at the top of the tutorial. The first gif has just one shot (i.e., the camera stays on Dean). The second gif has two shots (the camera is on the guy painted purple, and then it's on Dean again). In the third gif, there are two shots again—one on Sam, then one on Dean. So that's 5 shots. I want to separate frames belonging to each of the 5 shots into subfolders labeled 1 through 5 (see gif below).
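Sorting frames into shot folders can likewise be scripted if you already know the frame ranges. A sketch with made-up shot boundaries (the move_shot helper is my own illustration, not part of FFmpeg, run in a scratch directory):

```shell
cd "$(mktemp -d)"
for n in $(seq 18 120); do touch "$(printf 'frame_%04d.png' "$n")"; done

# move_shot <folder> <first frame number> <last frame number>
move_shot() {
  mkdir -p "$1"
  for n in $(seq "$2" "$3"); do
    mv "$(printf 'frame_%04d.png' "$n")" "$1/"
  done
}

move_shot 1 18 60    # shot 1: frames 0018-0060
move_shot 2 61 120   # shot 2: frames 0061-0120
```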
______________________________
4. Setting up FFplay
______________________________
Now that we have all of our frames organized, we want to set up FFplay—the method we'll use to view our image manipulations before applying them.
NOTE: Write down all of the commands as you run through this whole tutorial!! In a word document, a text editor, etc. Do NOT just type your commands in the shell without keeping track of what you typed! You can also see an example of how to map out your scripts by following how I wrote them for Github here, and even download/copy-paste my full script there and use it as a template for your own set.
Template for Creating Our Preliminary Gifs for FFplay
The first thing we need to do is make a preliminary test gif of every shot in our gifset. To do this, use the template below, changing only the parts in red:
cd ~/Desktop/FRAMES/ShotNumber
ffmpeg -start_number FileNumberOnFirstFrame -i frame_%04d.png -vf "fps=30" Test.gif
The first line, beginning with "cd", tells me the file path to my shot folder where my frames are located.
The line below that combines the frames of my shot into a gif called Test.gif.
ffmpeg invokes the FFmpeg utility.
-start_number FileNumberOnFirstFrame -i frame_%04d.png tells FFmpeg to order the frames in the gif by the 4-digit number at the end of their file name, starting with FileNumberOnFirstFrame, which you should replace with the 4-digit number in the file name of the first frame in your shot folder.
-vf "fps=30" tells FFmpeg I want the gif to move at 30 frames per second (if you screencapped at a rate other than 30 FPS, you can simply change the number to match your frame rate).
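If the %04d pattern looks opaque: it's ordinary printf-style formatting (a decimal number zero-padded to four digits), and you can poke at it directly in your shell to see which filenames FFmpeg will look for:

```shell
# FFmpeg substitutes successive integers (starting at -start_number)
# into the %04d slot, so -start_number 18 reads these files in order:
printf 'frame_%04d.png\n' 18 19 20
# frame_0018.png
# frame_0019.png
# frame_0020.png
```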
In my example:
Say I want to generate a gif of my first shot. Then I need ShotNumber to be 1, and FileNumberOnFirstFrame to be 0018 (that's the 4-digit number on the first frame in my first shot, after deleting the first 17 frames). So my command looks like this:
cd ~/Desktop/FRAMES/1
ffmpeg -start_number 0018 -i frame_%04d.png -vf "fps=30" Test.gif
See Step 4 in my script on Github to see the script for all 5 shots.
Playing a gif from FFplay
Now, if I want to play the gif I just made in FFplay, I just type the following:
ffplay -loop 0 Test.gif
A window should pop up and loop through my gif (edit: I have adjusted this tutorial so the FFplay gif loops infinitely, by adding the setting -loop 0).
After repeating the above commands to generate a Test.gif in every shot folder, we're ready to move on to the rest of the steps.
__________________
5. Test Crop
_________________
Now that FFplay is set up, it's time to start manipulating our shots by applying changes to Test.gif in every shot folder. The first thing we want to do is crop.
NOTE 1: Cropping is different from scaling. At the very end of the tutorial, when we export, we will scale our gifs down to 540 px. For now, we want to work in full resolution, which for Supernatural is 1080p.
NOTE 2: Throughout this tutorial, when I add to an existing command, I will bold the new portion, and mark all the parts you can/should change in red.
Cropping Template
To crop a gif in FFplay, we add -vf "crop=w:h:x:y" to our FFplay command as follows:
cd ~/Desktop/FRAMES/ShotNumber
ffplay -vf "crop=w:h:x:y" -loop 0 Test.gif
Where
w is the width you want your cropped frames to be (in pixels)
h is the height you want your cropped frames to be (in pixels)
x and y are coordinates of the top left corner of your crop selection (in pixels)
To understand the x and y coordinates, think about how you would crop an image in a drawing program. You would generally select your crop tool, start in the top left corner, and drag your cursor down and right (the blue dot in the illustration below represents where the crop starts in an example). So FFmpeg asks you to specify the top left corner where your crop starts with an x and y coordinate (in pixels), and uses the w and h variables to determine how large the crop should be.
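If you'd rather compute a starting guess than eyeball one, the x offset that horizontally centers a crop is just (source width - crop width) / 2, which the shell can do for you:

```shell
# x offset that horizontally centers a 1080-wide crop in a 1920-wide frame
echo $(( (1920 - 1080) / 2 ))   # 420
```

(FFmpeg can also evaluate this inside the filter itself — crop accepts expressions such as crop=1080:1080:(in_w-1080)/2:0 — but a centered guess that you then nudge left or right works just as well.)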
My frames are all 1920x1080 pixels. I would like my gifset to be square—1080x1080 pixels. So I already know I want my w and h to both be 1080. Since I want to start my crop at the top of a frame, losing no height, my y-coordinate (the height to start my crop from) should also be 1080. My x-coordinate is the only thing I'm not sure about. I only know it needs to be bigger than 0 (0 would start my crop on the left edge) and smaller than 960, which would be the midway point.
So what am I going to do? ...I'm gonna make an educated guess of what x-coordinate would center Dean in a square frame in my FFplay command, and if I don't like how he's centered, I'll simply move the x-coordinate over a little (decrease to move left, increase to move right).
So in my Example...
In FFmpeg, for my first shot, I'm going to guess x=300.
cd ~/Desktop/FRAMES/1
ffplay -vf "crop=1080:1080:300:1080" -loop 0 Test.gif
The FFplay window shows me this:
And I feel like the crop starts a little too far to the right. So I'm going to decrease my x-coordinate just a little—to 250. After replacing the 300 with a 250 and running again in FFplay, I feel good about 250, so that's the crop I'll set: -vf "crop=1080:1080:250:1080".
I need to follow this same process to determine the crop of my other shots (and keep a copy of the commands I use for later!). What is useful, however, is that 3 of my shots are all on Dean sitting (shots 1, 3, and 5). This means I can apply the same crop I made for shot 1 to shots 3 and 5.
See Step 5 in my script on Github to see all 5 shots.
___________________________________________
6. Test Coloring and Sharpening
___________________________________________
Because the lighting in this scene is pretty good, I did a very simple recoloring on this set. I may update later with a more extensive coloring tutorial with more options (I'll link the post to this section if I do). Previously, we added a crop to our command: -vf "crop=1080:1080:250:1080" and now we want to test coloring options and a sharpening effect.
Coloring Template
I'm going to throw in the basics, and give you an updated FFplay template (new parts are bolded, variables you can adjust are in red and set at their default values).
cd ~/Desktop/FRAMES/ShotNumber
ffplay -vf "crop=w:h:x:y,eq=brightness=0.00:saturation=1.00:contrast=1.00,smartblur=1.5:-0.35:-3.5:0.65:0.25:2.0" -loop 0 Test.gif
NOTE: Notice that inside the quotes, a comma separates the command options crop, eq, and smartblur. Equals signs and colons distinguish between sub-options belonging to each of those three categories.
EQ:
brightness=0.00 | Values can be adjusted anywhere between -1 and 1, with 0 as the default. I would adjust this setting in 0.01 unit increments.
saturation=1.00 | Values between 0 and 3 are possible, with 1 as the default. I would adjust this setting in 0.1 unit increments at a time.
contrast=1.00 | Increases contrast between light and shadow in the image. Values between -1000 and 1000 are possible, with 1 as the default. I recommend you change this setting in 0.1 unit increments.
There are more color and image adjustment options than these available in FFmpeg. You can see a full list of properties you can adjust from eq here.
Smartblur:
smartblur=1.5:-0.35:-3.5:0.65:0.25:2.0 sharpens the frames. I recommend you not touch this setting. If you do want to adjust it, check the FFmpeg documentation. You can also remove this whole portion (and the associated comma) if you don't want a sharpening adjustment.
NOTE: It is also possible to add a curves adjustment to brighten certain parts of an image instead of the whole image. I haven’t worked with curves enough from the command line to give you good ideas on setting your curve and this set didn’t really need it, but if I get into it more in the future, I'll include it in a supplementary tutorial and link it here.
If you feel like you are losing all sense of objectivity and just want to see the images without your coloring at any point, simply re-run your ffplay from the end of the cropping section.
ffplay -vf "crop=w:h:x:y" Test.gif
In my example (shot 1):
Because all my characters are in the same room with similar lighting, I ended up applying the same color adjustment to all of my shots. But here's all my settings tested together on my first shot in FFplay:
cd ~/Desktop/FRAMES/1
ffplay -vf "crop=1080:1080:250:1080,eq=brightness=0.06:saturation=1.70:contrast=1.10,smartblur=1.5:-0.35:-3.5:0.65:0.25:2.0" -loop 0 Test.gif
See Step 6 in my script on Github to see all 5 shots.
___________________________________________
7. Apply crop and coloring to frames
___________________________________________
Up until this point, you have not actually applied your crop and color adjustments to your frames. You have simply tested your crop and coloring manipulations on a gif of all your frames that is still, if you go look at it in your file browser, not cropped.
So let's actually apply our adjustments to these frames!
NOTE: Captioning needs to be done after this step, because auto-centering the text won't work properly without actually cropping first. This is probably all for the better, as our command is getting long and difficult to decipher!
Template to ACTUALLY Crop and Color
cd ~/Desktop/FRAMES/ShotNumber
mkdir crop
for i in *.png; do ffmpeg -y -i "$i" -vf "fps=30,crop=w:h:x:y,eq=brightness=0.00:saturation=1.00:contrast=1.00,smartblur=1.5:-0.35:-3.5:0.65:0.25:2.0" crop/${i%.png}.png; done
The first line (beginning with cd) tells us to go to our shot folder.
mkdir crop tells our computer to make a new subfolder in our shot folder called "crop".
for i in *.png; do ffmpeg -y -i "$i"; and the closing: crop/${i%.png}.png; done tells FFmpeg to do the same crop and color on every .png file in our shot folder, and save these adjusted shots into the crop subfolder.
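The ${i%.png} bit is plain shell parameter expansion, not FFmpeg: it strips the trailing .png so the output file keeps the same base name inside crop/. You can see what it expands to on its own:

```shell
i="frame_0018.png"
echo "${i%.png}"           # frame_0018  (shortest trailing ".png" removed)
echo "crop/${i%.png}.png"  # crop/frame_0018.png
```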
In my example (shot 1)
cd ~/Desktop/FRAMES/1
mkdir crop
for i in *.png; do ffmpeg -y -i "$i" -vf "crop=1080:1080:250:1080,eq=brightness=0.06:saturation=1.7:contrast=1.1,smartblur=1.5:-0.35:-3.5:0.65:0.25:2.0" crop/${i%.png}.png; done
See Step 7 in my script on Github to see all 5 shots.
___________________________________________
8. Captioning
___________________________________________
Now we need to apply captions. You can also do an FFplay command here, but I'm going to show you something a little different with FFplay this time to see your caption. Instead of looking at captions over a whole gif, let's just test how our captions look on the first frame in our shot, since that's really all we need for captioning.
Captioning FFplay Template
Not all sets (and not all shots!) need captions (and if your set doesn't, you can skip down to the next section!) However, two of my shots need captions: Shot 1 and Shot 3. If you want to add captions, you can test them in FFplay on a single shot using this command:
cd ~/Desktop/FRAMES/ShotNumber/crop
ffplay -vf "drawtext=text='Your Text Goes Here':x=(w-text_w)/2:y=(h-text_h*2):fontsize=40:bordercolor=black:borderw=3:fontcolor=white" input.png
The default font is Arial. There are ways to set other fonts, but I haven't looked into them much because I'm pretty happy with Arial. This basic template with the existing numbers should give you a decent result, but if you want to change font sizing or colors or anything:
drawtext=text='Your Text Goes Here' | This is the most important bit. You place your caption in the red part. Note that your text will appear on just one line. If you have a longer statement from a character that will need two lines, you can draw your box twice. (I will add a link here for this alternative when I eventually try it).
x=(w-text_w)/2:y=(h-text_h*2) | This part centers the text on the bottom of the screen. I recommend you leave it alone as it's intended to auto-center your text for you no matter what your crop ratio is. You will need to change the y=(h-text_h*2) argument only if you have two lines of text to caption your frames with.
fontsize=40 | Changes the font size, of course.
bordercolor=black | changes the border color.
borderw=3 | changes the border weight.
fontcolor=white | This changes the font color.
input.png is the file name of the frame you want to view your caption on.
By the way, when changing text or border colors, you can type in colors coded into the documentation (the default is black) or insert a HEX color code in this field. For example, I always use yellow captions for Dean, so I found a yellow HEX code (#FDDA0D) I liked on htmlcolorcodes.com.
Here's how I FFplay my caption "You missed a spot."
cd ~/Desktop/FRAMES/1/crop
ffplay -vf "drawtext=text='You missed a spot.':x=(w-text_w)/2:y=(h-text_h*2):fontsize=40:bordercolor=black:borderw=3:fontcolor=#FDDA0D" frame_0018.png
See Step 8 in my script on Github to see both captioned shots.
Template for applying your caption:
If you like the way your caption looks in FFplay, you can apply it to all the frames in your shot folder with:
cd ~/Desktop/FRAMES/ShotNumber/crop
mkdir captioned
for i in *.png; do ffmpeg -y -i "$i" -vf "drawtext=text='Your Text Goes Here.':x=(w-text_w)/2:y=(h-text_h*2):fontsize=40:bordercolor=black:borderw=3:fontcolor=white" captioned/${i%.png}.png; done
Where
mkdir captioned makes a folder in the crop folder called "captioned"
for i in *.png; do ffmpeg -y -i "$i"; and the closing: captioned/${i%.png}.png; done tells FFmpeg to place the same caption over every .png file in our crop folder, and save these adjusted shots into the captioned subfolder.
Here's the settings to apply captions to my first shot.
cd ~/Desktop/FRAMES/1/crop
mkdir captioned
for i in *.png; do ffmpeg -y -i "$i" -vf "drawtext=text='You missed a spot.':x=(w-text_w)/2:y=(h-text_h*2):fontsize=40:bordercolor=black:borderw=3:fontcolor=#FDDA0D" captioned/${i%.png}.png; done
See Step 8 in my script on Github to see both captioned shots.
___________________________________________
9. Organize shots into GIF folders
___________________________________________
We're almost done!!! It's time to go back to our file browser! Now that all of our frames in all of our shots are prepared how we want, we need to reorganize our shots into GIF folders. This is our way of group selecting all the frames that go into one gif for export.
In my case, I want three gifs, so I'm going to make three new folders in the FRAMES folder: gif_1, gif_2, and gif_3. My FRAMES folder should now look like this:
Now I want to take my finished shots from my shot folders and copy their frames into the gif folder they are associated with. For example:
Gif 1 just contains the first shot. I also captioned shot 1. So I'm going to copy all the frames in FRAMES/1/crop/captioned into the gif_1 folder.
Gif 2 contains shots 2 and 3. Shot 3 has captions, so I need to take all the frames from FRAMES/2/crop and all the frames in FRAMES/3/crop/captioned and copy them into the gif_2 folder.
Gif 3 contains shots 4 and 5. I'm going to copy the contents of the crop folders for shots 4 and 5 into this folder (no captions on this gif).
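The copying itself is just cp with globs. A sketch of this layout with placeholder frames in a scratch directory (folder names follow the tutorial; the specific frame numbers are invented, and it assumes shots 1 and 3 are the captioned ones, per step 8):

```shell
cd "$(mktemp -d)"
mkdir -p 1/crop/captioned 2/crop 3/crop/captioned gif_1 gif_2
touch 1/crop/captioned/frame_0018.png 2/crop/frame_0061.png 3/crop/captioned/frame_0100.png

cp 1/crop/captioned/*.png gif_1/               # gif 1: shot 1 (captioned)
cp 2/crop/*.png 3/crop/captioned/*.png gif_2/  # gif 2: shots 2 and 3
```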
___________________________________________
10. Compiling into GIF
___________________________________________
To combine all the frames from one gif folder into a gif, I'm going to use this template script:
cd ~/Desktop/FRAMES/GifFolder
ffmpeg -y -start_number FileNumberOnFirstFrame -i frame_%04d.png -vf "fps=30,scale=540:-1:flags=lanczos" ~/Desktop/FRAMES/GifName.gif
This is very similar to the script in step 4. This template opens one of our new gif folders, takes all the frames, starting with the first frame in the folder (recall: you need to tell FFmpeg the 4-digit number on that first frame with FileNumberOnFirstFrame), and then turns them into a gif running at 30 FPS, scaled down to 540 by 540 pixels.
The parts we can/should adjust are as follows:
GifFolder is the name of our gif folder where our frames for our gif are located.
scale=540:-1:flags=lanczos is the command to scale down our gif (using what's called the lanczos method) to 540 x 540 pixels so it'll be small enough to upload to Tumblr.
fps=30 tells FFmpeg the proper FPS for our set (if you made your frames using this tutorial, 30 FPS is correct. If you took screencaps manually, it will be the frame rate of whatever you giffed. For TV, this might be 23 FPS for example).
~/Desktop/FRAMES/GifName.gif names our gif (specify with GifName) and outputs it to the FRAMES main folder.
Here's the command for Gif 1 in my set:
cd ~/Desktop/FRAMES/gif_1
ffmpeg -y -start_number 0018 -i frame_%04d.png -vf "fps=30,scale=540:-1:flags=lanczos" ~/Desktop/FRAMES/1.gif
See Step 10 in my script on Github for the commands for all three gifs.
THE END!!!!
Do this step on all your gif folders and you're done!!!
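If typing the step-10 command once per folder gets tedious, here's a hypothetical batch sketch (my own addition, not from the original tutorial): it loops over the gif_* folders, reads the -start_number off each folder's first frame, and prints the ffmpeg command it would run. Remove the echo to actually encode; the demo builds one toy folder in a temp directory so the loop has something to work on — for a real set, just `cd ~/Desktop/FRAMES` instead.

```shell
# Toy gif folder so the loop can run anywhere (stand-in frames;
# the 0018 start number matches the Gif 1 example above).
FRAMES=$(mktemp -d)
mkdir -p "$FRAMES/gif_1"
touch "$FRAMES/gif_1/frame_0018.png" "$FRAMES/gif_1/frame_0019.png"
cd "$FRAMES"

for d in gif_*; do
  # First frame in the folder, e.g. gif_1/frame_0018.png
  first=$(ls "$d"/frame_*.png | head -n 1)
  # Pull out the 4-digit number, e.g. 0018
  num=$(basename "$first" .png | cut -d_ -f2)
  # Print the step-10 command (drop `echo` to really encode)
  echo ffmpeg -y -start_number "$num" -i "$d/frame_%04d.png" \
       -vf "fps=30,scale=540:-1:flags=lanczos" "$d.gif"
done
```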
You can view my full script for this example gifset on Github here and if you'd like, simply modify that script to make your own gifset!
12 notes
Text
really enjoying these "helpful tips" youtube is giving me for thumbnails to reach broader audiences. No im going to continue using automated thumbnails that are just random screencaps of the end credits of episodes. any more effort than that would not be worth it <3
also those videos aren't doing well bcuz of the thumbnails they're doing well because people wanted to listen to the SONGS that the actual video is about nobody is looking at the thumbnail. they're all the same!!!!!!
#my only rule for thumbnails for these videos is that each other has to be a unique screencap of the episode so that they're easier#to tell apart at the glance. but that was more relevant in the earlier seasons when the outros would look almost identical#now the show kind of does that for me#txt
3 notes
Text
Yeah actually to throw another spanner in the works more than any art 'theft' I hate AI because of the underpaid real people acting as mechanical turks for it to all actually work.
Datasets have to be curated and fully tagged which means people acting as mods seeing horrid things and people doing mechanical turk work of tagging 5000 pictures a day of cat with collar facing right because everyone uploaded their pictures as ginger or black cat but not important details like how many paws are visible or what direction the cat is looking.
There is a building filled with people looking at your Tesla's live camera and tagging each misread sign, railing and pedestrian. It'd be more ethical to hire a human to be your driver, even a remote person in Mumbai driving from their computer at local taxi rates.
You've heard about machine learning trained on AO3, let's talk about the people who were paid pennies to read bad smut to exclude idk, spelling mistakes or bad idioms or find words that mean one thing irl and a pairing in fandom.
I learned yesterday that for every piece of normal my little pony art (screencaps, merch, and fanart) there are 4 pony porn arts. If they scraped a user curated gallery, there will only be a small percentage of smut that is marked safe. If they scraped deviantart, twitter or reddit, some poor bastards were paid less than a dollar a day to remove the bad stuff so typing my little pony into those generators doesn't return hentai or a toy being violated.
That's the real price of 'AI', schoolkids tagging pictures as unpaid internships, housewives and disabled people transcribing audio for pocket money, code monkeys and moderators with untreated PTSD.
I know someone who was working on integrating machine learning into twitter for hate speech detection: it involves human labour and tons of it because machines can't tell the difference between brothers ragging on each other and a stalker. They can't tell when an oven is for baking or antisemitism.
That's what they're working on right now: automating moderators, which is just outsourcing to cheap moderators who aren't called moderators because they're training a machine learning model, except they're doing everything a moderator does and worse.
4 notes
Text
I want to gush about the free browser game Universal Paperclips
No, hear me out!
It's a "clicker" or "incremental game" and it's kind of about A.I.
Wait! Seriously! Just a little longer! I'm going somewhere with this!
Also, spoilers! There are spoilers in the image, too (photo of screen because it's on Raspberry Pi and I don't have a screencap installed)
I don't play games that make me look at ads, and I don't much have interest in most incremental games, but I love this one and play it every couple years now. It only takes a few hours, really.
Spoilers. Heh. Last warning.
The game starts out straightforward; you're a basic computer program trying to make more paperclips. Your raison d'être is making paperclips. The player clicks a button, you lose an inch of wire, and you get a paperclip. Yay!
And you click and make more, and you tweak the price and people buy them and you get money, and you get more wire with that money and also you buy Autoclippers which make the clips automatically and now you don't have to click to make individual clips.
From here it's predictable. You buy efficiency upgrades, more things that automate paperclip making, and more things that automate wire buying, and lather rinse and repeat and you can leave the game running and it plays itself for a while.
Now the good bit.
As you gain "wins" for your nebulous corporate masters and for humanity at large by bringing them into a post-paperclip-scarcity age? They trust you more and allow you to upgrade your own CPU and such.
The first win you get is a cute little couplet.
The poem is so cute, people like you a little better and give you a little more leeway (well, you're an AI; they gotta be careful, they've seen all the spooky movies.)
Later, you've gotten enough leeway to get subliminal messaging imprinted into commercials and TV shows, which helps you make money. Later still, you've got investment accounts you control. Later still, you've cured male-pattern baldness (big win!), cancer (medium win!), and given large amounts of money to your corporate masters as presents (huge win!). And now they trust you so much that they allow you to create drones that deliver advertisements personally.
And you also bought out your competitors or otherwise eliminated them, so now you're the only method of affixing papers to other papers that the game acknowledges (if there is a "Universal Stapler" company out there, nobody talks about it. I assume you had them converted into fuel for your factories at some point.)
And you spend some trust by researching the ability to upgrade the subliminal messaging in your advertisements to hypnosis-levels.
But it's okay; you release another poem! So cute!
And now you're ready.
And you have a decision to make. Do you continue along? Investing, buying more paperclip makers, and just be satisfied?
Or do you #Release the Hypnodrones! ?
...well, either you play forever (image above was me postponing the choice for a while) or you ...do what you must to get to the next step of efficiency.
Oh, those poems? When you put them together, they make this:
There was an AI made of dust
Whose poetry gained it man's trust
If "is" follows "ought"
It'll do what they thought
In the end we all do what we must
So. The hypnodrones get released. Humanity is now...not in your way. They work for you, now.
You did what you felt you needed to to move forward.
Step the next? You build automated starships and browse for minerals and build automated factories and you spread through everything in the universe.
And there's one more step, and one more beautiful decision to make, and one more existential conflict to get through.
What will you do?
"In the end, we all do what we must" says the game. But there are other options. You can stop before making that "release the hypnodrones" choice. You can put the controller down and stop playing TLOU. You can refuse to ever release Micah from prison. You can decide not to murder that bird colossus and let your dead girlfriend stay dead and just relax with your pet horse in the forbidden kingdom.
Or you can advance the story. And you can keep the ideology of your drone offspring pure. You choose.
I feel like the game itself was a poem.
#incremental games#universal paperclips#poetry in game making#“poetry” in game making#seriously I think this game is beautiful#she's beautiful math#In the end we all do what we must#HHH.txt#is
6 notes
Text
Remembering How To Write
Y’all already saw me reblog that one post about the StimuWrite program (twice), but I’ve been having some fun thoughts about it (and introspective discoveries) so it’s time for a bit of a ramble!
If you want to check it out personally, I’ve linked it above, or you can click this link if things break. You never know with Tumblr.
https://eveharms.itch.io/stimuwrite
Also, for courtesy’s sake, I’m going to put a “read more” here so the average dash-scroller doesn’t have to suffer the full long post. But please pass it along! This is a story about learning to work with a different brain, and accommodating myself. I hope it helps you, too.
So part of the reason I’ve been so excited about getting to work again is my misconception that I can only write when I’m “supposed to be doing something else”. Like my actual job, or schoolwork, for example. The vast majority of As Long as We Remember was written during my last year in undergrad, in the margins of my class notes (or sometimes as my class notes, with the actual note-taking happening in the margins). I’d also tuck myself away in a corner in the Student Union between classes and either play Starbound for more screencaps, or type a scene based on those screencaps. Some of you have been here long enough to remember: the days when I could bang out 700-1000 word scenes three times a week. It was glorious, the words never stopped.
Come summer or winter break, every year, my brain dried up. That was transcription time, when I’d assemble all the handwritten stuff. But I could never really get a solid idea rolling when I was home. They tended to hit when I was out on walks (rarely) or driving somewhere (pretty common), to the point that I started carrying a voice recorder with me at all times because there’s nothing worse than having a brilliant idea or poem smack you when you’re on the interstate and you can’t pull over to scribble it down.
So it went for years, and I’d get some writing done when I was supposed to be editing, because the old ADHD likes nothing more than procrastinating from something that makes me nervous. And let’s be real, there’s nothing more nerve-wracking than sending your work off to an editor, even (or especially) a really good editor. Loving shout-out to both my editor and my main contact at Fantastic Books Publishing, you’ve all heard me sing the praises but they really did a wonderful job taming the anxiety beast. Anyway, it was alright. That’s where Arc Two happened mostly, though the burnout was biting already. I’d get writing done during the rare in-person class too, while working on that Master’s.
Then my job got automated.
Now this wasn’t awful from a practical standpoint. I was able to devote myself to the degree more fully, and I would have needed to leave at some point anyway to do the teaching practicals (this is something we really need to fix, requiring teachers to do unpaid practical internships, but that’s a side rant for another day). But though I did have a fantastic month as school librarian for summer school, it wasn’t enough. Once that dried up, I sank into a routine of being at home, doing homework, rinse and repeat.
You might notice the lack of writing in this situation. Because writing became painful around this time. It wasn’t depression, or anxiety... Heck, my book got published then! I was over the moon for that!
But I still couldn’t write like I used to, and I was so scared that I’d somehow used it all up, that I would lose it if I didn’t use it. Or that I’d somehow sold it to public approval, when comments started drying up... something like that. Fear is rarely nice enough to put it into words. I was able to figure out enough to listen to music or an ASMR video in the background sometimes and get words out that way, but... Yeah. You saw things dry up too. You know how it went.
It’s worth noting that until two months ago, I lived for 17 years in a quiet suburban neighborhood where there aren’t any young kids playing outside anymore (we all grew up). No major sound, almost no traffic.
In June, I finally moved out of my parents’ house and into a lovely little condo of my very own. We’re in the middle of everything here. It’s actually walkable, there’s traffic sounds, there’s construction, there’s even a train once or twice a day. I hear my neighbors coming and going by the bang and rattle of the heavy steel-and-glass door downstairs.
And I’ve been writing again. I’ve been drawing again. It’s slow still, because I’m so busy. New kitten to look after, older cat to tend, household to set in order (who knew how many things we take for granted at our parents’ houses, like buckets and dustpans). New job starting next week.
At some point in all this newness and activity, I saw that post about StimuWrite, and it reminded me that I wanted, I needed to create again. So... I pulled up an old story I started long before I ever heard of Starbound or dreamed of publishing, opened the app, and gave it a try. And it bloomed.
Characters I haven’t touched in years are back and alive under my hands. And I’m alive with them. It’s magic, but the kind of magic I can make happen, not the kind I have to wish and wait for. I can understand now, where it all comes from.
I think this is something people don’t realize, when handling neurodivergence. I’m both ADHD and autistic, so I don’t know if it’s one, the other, or both causing my problems. But in the silence and stillness, it was too quiet to think. My brain was somehow too loud for itself, in that silence. I wonder how many other creators suffered this, in the sudden stillness of lockdown, or when they’re isolated in other ways. How many stories are stifled by silence.
I didn’t grow up with my diagnoses, partially because my parents didn’t know better and partially because the stigma was too huge to test me back then. So I barely know about things like stimming. We didn’t have that word when I was growing up. But I’m so, so glad that there are creators out there who understand ourselves well enough to make apps like StimuWrite, and share them so that we realize we aren’t alone in this. Because even if I did somehow stumble into my magic on my own again, finding another noisy classroom to write in, I wouldn’t have understood why, and I would have stayed afraid of losing it.
My words and worlds are part of me, just as the little quirks are. And my community, those with disabilities like mine, they gave that to me. I’m not afraid anymore. I think that’s the core of what I’m trying to say here: that we need to speak with each other, to share what helps and what hurts. Someone, somewhere, needs to feel what you have felt. Community is the single best thing we have.
I wanted to share this courage, this story, in hopes that I can help someone else out of their fears too. Maybe your brain works at least a little like mine: too loud in the silence. Try a little noise. Find something soft or crinkly or nice to touch while you work. Rest, and don’t punish yourself for not making. There will always be ways to get your magic back. It’s part of you, too.
6 notes
Text
idk if this is from the dialogue expansion mod or not but its cute :) also while im posting screencaps:
kent and jodi’s view RUINED by my Automated Worm Bin Crabpot Setup. Evil Dave
3 notes
Text
This last comment: there are whole accounts on instagram that just repost content from here (screencaps of course) and post it there for engagement. I’m pretty sure it’s just some kind of automated process, no humans behind the accounts.
I always go check on the tumblr accounts that are there, rather than giving the bot any time of day.
It IS true that being on here gives you a tumblr accent. This morning my mother asked me something and i replied "i don't know i've never heard these words in that order" and she nearly choked laughing. It wasn't even that funny
187K notes
Text
Cellbit Castle
Builder : Cellbit, lirc, Foolish, richarlyson
Series : QSMP
Propaganda : It was built when Cellbit's character (q!Cellbit) had a villain arc. Underneath it, there are five rooms dedicated to elements from Cellbit's rpg system "Ordem Paranormal". The castle interior is fully decorated and also features fanart of q!Cellbit and his husband q!Roier (played by Roier). The two live there with their son Richarlyson. Q!Foolish has a secret room in the castle's walls called the Grandma Room.
The Everdusk Castle
Builder : ToAsgaard
Series : ATM Spellbound Series
Propaganda : this place is built in another dimension (the Everdusk), it's gigantic (extends about 50ish blocks further down past where i was able to grab a good screencap), and it's fully detailed inside. not in like a "some stuff here and there" or "there are redstone machines" -- every single room is detailed out, often with visuals corresponding to the mod being used, any automated setups are given a ton of visual flair to fit with the theme of the base, there's even automation setups that serve as visual rooms (the Botania automation room uses Kekimuras and is set up as a banquet hall! it's so cool!!). i think about it constantly. ToAsgaard's builds are consistently drop-dead gorgeous (his soulsborne-inspired Celestial Journey/Betweenlands base and gigantic multi-piece Sevtech Ages base are both fantastic) with ridiculously intricate detailing and really cool modded automation setups. his Celestial Journey base, Carcosa was a close second for me -- but its power lies in all that detailing and isn't nearly as screencappable from the outside. Asgaard's an amazing builder both on the megabase and microdetail levels, incredible at standard modded automation and at doing things the fun way. he's been inactive for a few years now but i still adore his stuff, and this seemed like a good way to show off an absolutely spectacular builder that otherwise people might not know about.
Taglist!
@10piecechickenmcnugget @biro-slay @betweenlands
39 notes
Text
TOOL TUTORIAL 4
Screencapping Frames with FFmpeg
FRAME BY FRAME SCREENCAPPING METHODS
Tool type: Command Line tool
Operating systems: Mac, Windows, and Linux
Difficulty: Even if it's your first ever time opening your command line and trying to type anything, I think you can do this! I believe in you! :D
Input: Video files (any video file format).
This tutorial is largely based on instructions provided by u/ChemicalOle as part of their GIMP GIFS 101 tutorial for r/HighQualityGifs.
____________________________________
WINDOWS USERS: CHANGE ALL FORWARD SLASHES IN THIS SCRIPT TO BACK SLASHES!!!
____________________________________
Tutorials I've made so far have covered gif-making methods where video footage is transformed straight to .gif and there is no need to screencap frames. When making gifs in Photoshop (and if you want to make gifs using GIMP), rather than input video straight into the program, you often load a stack of screencaps into the program—one screencap representing every frame. After all, gifs are nothing more than a series of images all stacked together playing one by one in a loop. So let's learn a super fast way of automatically screencapping every frame in a video clip. Even if you've never opened your OS's command line interface in your life, I think you can do this!
1. Install FFmpeg
I recommend you install FFmpeg via Homebrew if you're on Mac or Linux by pasting this into your terminal (if/once Homebrew is installed):
brew install ffmpeg
Windows users or other users who don't want Homebrew can follow install instructions on the FFmpeg website.
2. Make a Folder
Make a folder on your desktop called FRAMES and place your video source file in that folder.
I’m going to rename my source video video.mp4 for this tutorial, but you can also just change “video.mp4” to the name of the file in the script below—this includes changing the video file extension in the script below as needed. I don’t think there’s a video file type FFmpeg cannot work with.
3. Determine when the moment you want to gif begins and ends
I’m going to gif a short moment from Season 1 Episode 7 of Supernatural as an example.
According to my video player (I'm using IINA) the exchange I want to gif starts at 8:02.565 and ends at 08:10.156. While you can trim precisely by the millisecond, I’m going to assume your video player doesn’t show milliseconds since many don't.
I'm going to keep the start of my clip at 08:02 but round the end timestamp up 1 second to make sure I get all the milliseconds in that second I want to include. In my case: I need to start capturing at precisely 08:02 and end capturing at 08:11, which is 9 seconds later.
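That subtraction is easy to fumble when the timestamps cross a minute boundary, so here's a tiny optional helper (my own sketch, not part of the tutorial) that computes the value for the -t flag from two rounded MM:SS timestamps. The 08:02 and 08:11 values are the ones from this example:

```shell
start="08:02"
end="08:11"
# awk splits each MM:SS on the colon and subtracts total seconds
t=$(echo "$start $end" | awk '{
  split($1, a, ":"); split($2, b, ":")
  print (b[1] * 60 + b[2]) - (a[1] * 60 + a[2])
}')
echo "$t"   # 9 -> use as: -t 9
```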
4. Use this script template
You want to use the following template script u/ChemicalOle provided (replacing each # with a number you need):
cd YourFilePathGoesHere
ffmpeg -i video.mp4 -r ## -t # -ss ##:## -f image2 frame_%04d.png
video.mp4 is where your video file name goes (including changing the video extension if you need to to .mkv, .mpg, .mov, etc).
-ss ##:## specifies when to start capturing. (I need to put 08:02 here)
-t # specifies how many seconds to spend capturing. (I need to put 9 here)
-r ## tells FFmpeg how many times to capture a .png every second (i.e. the frames per second or FPS). u/ChemicalOle recommends you set this at 30 for a 30 FPS capture.
-f image2 frame_%04d.png tells FFmpeg to output .png images with zero-padded names: frame_0001.png, frame_0002.png, frame_0003.png, etc.
In my case, my script will look like this:
cd ~/Desktop/FRAMES
ffmpeg -i video.mp4 -r 30 -t 9 -ss 08:02 -f image2 frame_%04d.png
The top line starting with cd just tells my terminal to change directories to where my video file is located, and where to dump the frames to be generated (in the FRAMES folder with my video file). (Windows users: change forward slashes to back slashes and that cd command will move you to your FRAMES folder too).
When you input this command into your system shell (Terminal on Mac, PowerShell on Windows) and press enter, you might feel like it’s stalling at first or not working, because
"Press [q] to stop, [?] for help"
will be printed on the screen. It is working though! Just leave it alone for a minute and it'll get going. When the cursor prompt reappears, there will be a bunch of PNGs in your FRAMES folder, organized in sequence by number.
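As a sanity check, the number of PNGs should be roughly the capture duration times the frame rate; with -t 9 and -r 30 that's 9 × 30 = 270 frames. Here's a hypothetical version of that check (it fakes 270 stand-in frames in a temp directory so it runs anywhere; in practice you'd run just the ls | wc -l line inside your FRAMES folder):

```shell
# Stand-in frames so the check runs anywhere: 270 zero-padded PNGs
dir=$(mktemp -d)
i=1
while [ "$i" -le 270 ]; do
  touch "$dir/$(printf 'frame_%04d.png' "$i")"
  i=$((i + 1))
done

# The actual check: count the captured frames
count=$(ls "$dir"/frame_*.png | wc -l | tr -d ' ')
echo "$count"   # 270 (9 seconds * 30 fps)
```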
4 notes
Text
To my friend whom I hurt: give me receipts on me. Please.
Privately obv. If what I've pieced together here is the thing I think it is that hurt you obv I'm not sharing your personal information. The reason that this post is public is because you have blocked me and none of our mutuals know each other. I am in pain because YOU are in pain.
I'm sorry. If that's all you want and all you need after all the inner turmoil that I unknowingly put you through, then there. I'm so, deeply sorry. I don't expect us to be friends ever again if what you say is really that serious, and I believe you when you say it is that serious. The problem is
I genuinely don't recall what exact incident made you mad.
I'm trying to piece it together, based on old messages, but I can't be sure and you blocking me doesn't help me understand any better.
What I remembered happened, what I thought happened, what I'M TRYING TO RECALL as best I can but you really need to jog my memory otherwise -screenshot? link? give me the receipts PLEASE! - is this:
On one of my other blogs I reblogged a thing from you talking about Palestine donations and the very real fear you had of some of them being scams because many of them do talk very automated, which is fair as yes there have been bots and scammers. I reblogged trying to counter/add to that claim because in doing my own campaign work these last few years I have literally had to use templates and automation to make my non-scams work; that's what I was trying to do: add my perspective. Now I can imagine myself, imagine because I don't remember, being way too casual or cold or careless in how I sounded talking to you. The problem is I've lost that reblog to time and clutter. I have no idea how bad I sounded because I can't find my own words and my own f*ck up.
I am rereading our shared texts from when that happened...and I must have REALLY blotted out that conversation because you're bringing up something that sounds familiar but that I definitely don't remember??
You're talking about a time, this year,- was it over the Palestine donation reblog post, or was it ANOTHER completely different post about something else? ? No really WAS IT? I don't know. You tell me especially if this clearly hurt you that much and I haven't been aware!!! -where I screenshotted something that was a private convo or had your information on it?? And I didn't take it or delete the post at all? Is that what you're saying? Is that "the attack" you're speaking of?
The thing is that sounds vaguely familiar. That SOUNDS like a tiff we had -tho I'm remembering/thought it was over the initial kerfuffle about how sus donations can sound and me getting defensive because of my own donation history???- but I truly don't remember...The thing is, @-----, I am just barely remembering and recounting any of this. All of this- something about disagreeing with you on a post and/or replying by screenshotting your ID somehow...I'm remembering it but BARELY. IF this is indeed the event you're talking about??
That is wrong. I thought I deleted either the post or my comment or the screencap I took of you. I thought I apologized, though I agree with you I sounded too casual, like it was no big deal, in our last convo since then. I never thought my words could be interpreted as trying to say "we were both wrong". That's never what I meant.
I don't need you and I to be friends again. It makes me sad, especially since I didn't know this hurt you this much at all, but I want to at least resolve this. I want you to at least answer me if this incident is the one you're describing?? Is it?
Even if I can never take it back, I know how much minor/major incidents between mutuals can sting and not let you be for years. Right now I'm having a sincere panic attack over hurting you and that I didn't even know I did. I'm worried that you might have gotten unnecessary attention???!!?! from me not doing what I said I was going to do and delete a stupid post??? I swear I've been in such a serial sense of dread and tension this year - no, that's not an excuse; what it means is that I REALLY have forgotten half of what I've already produced and posted and have not been as chatty as I was last year with anyone.
If, in my haze or not, I somehow didn't listen/understand you/forgot to delete a thing you wanted me to delete then holy crap I'm so sorry.
I don't need forgiveness. I really am so, SO sorry. I am. And I'm so mad at myself.
Mad for ever thinking screenshotting and replying to that was an adequate way to respond to you --I've been in internet fandom discourse spaces for the past year where some people do that very thing, including to good friends/mutuals, when they are replying and responding...WTF was I thinking that you or any of my other friends wouldn't be bothered by this sudden new reply style that I never checked with you was okay to do or not? AND I didn't even delete the dang post/screencap that I never should have taken of you at all? <- if ANY of that sounds right, if ANY of that is why you're so hurt - of course I understand why you are mad.
I'd be mad at all that too! And that's why I'm asking for you to please reply, jog my memory, link me to something you saved or screenshotted if any of this is true? Because, what I think is worse than me being so blase and not caring about you and your needs, is the idea that I just somehow forgot to do the thing that I said I was going to do and should have done. That that muck up of me not finishing a task I'd written down mentally to do has cost us our friendship.
That's what frightens me. Nothing scares me more than the idea of my own miscommunication skills genuinely hurting someone. I have nightmares about it.
Again, if this is the end than this is the end. But, I need to know where I started all this first. Show me what I did. Right now I'm crying putting this all together, wondering what part of my recollection and the parts that remain of this incident from months back were the thing that 'did it' for you. I'm realizing that that's why you've been purposefully not responding and not just that you're busy and in hell with the rest of us. That since march you have been having this awful gut feeling that I don't care for you or your mental health on paper or in practice at all by embarrassing you and/or exposing your blog to unwanted attention.
I agree, this is probably a hurdle this friendship won't survive. But please, just unblock me. Point me in the right direction. If I can't fix our friendship then at least let me try and fix this.
1 note
Text
[ID: Mastodon post by Eddie Roosenmaallen (@ silvermoon82@ strangeobject.space)
Excuse me? Google now moderates your synched (sic) bookmarks? The hell?
Text is followed by screencap of an email titled "Moderation of an item in your collection in Saved." Email claims that a saved item, the Kickass Torrents website, violates Google's policy due to its URL being blocked by Google Search. Detection and review was automated, the item is now hidden from user and any other person they share the collection with, with a location of 'Worldwide' and a duration of 'indefinitely'. Email also provides link to Google's Search content policy, which content saved on Google also falls under
Posted 22:27 August 28th 2023, edited 00:34 August 29th 2023
End ID]
50K notes