#rosie the robot maid
Explore tagged Tumblr posts
Text
I already did this but whatever
Hello, tumblr user. Before you is a tumblr post asking you to name a female fictional character. You have unlimited time to tag a female character, NOT a male one.
Begin.
#amelia bedelia#sakura haruno#bedelia du maurier#marinette dupain cheng#rayla#zatanna#chichi#hinata hyuga#claire bennet#lois lane#gogo tomago#galadriel#allura#yang xiao long#ruby rose#sailor jupiter#sailor mercury#asuka langley soryu#Liko#orla#chun-li#apple white#rosie the robot maid#judy jetson#usagi tsukino#chibiusa#ariel#sypha belnades#queen beryl#ursula
56K notes
·
View notes
Text
🤖 Rosey the Robot Maid 🛸
#Rosey the Robot#The Jetsons#character study#cartoon fanart#Hanna Barbera#Warner Bros Animation#Cartoon Network#Boomerang from Cartoon Network#Jean Vander Pyl#Tress MacNeille#Iwao Takamoto#futuristic#space age#60s cartoons#80s cartoons#robot#maid#Rosie's Boyfriend#Rosie Come Home#Mother's Day for Rosie#Robot’s Revenge#Wedding Bells for Rosey
7 notes
·
View notes
Text
The robots at the Tesla event being tele-operated is so funny to me, mostly because I keep thinking about what it must be like to be one of the people doing that. Like, rich people paying you to serve drinks, but you're doing it through a machine that cost tens of thousands of dollars? This is some scifi dystopia stuff. Actually, I think I must have seen it before somewhere, but Surrogates is the closest that comes to mind, and that was people teleoperating fit, attractive robots to live their ideal lives, which is kind of the opposite.
I'm feeling really compelled to cook up a short story about teleoperated robot maids. Maybe some The Jetsons fanfic where Rosie is being run by someone in South America.
57 notes
·
View notes
Note
Neil spaceman....
No but for real, I imagine that in like 200 years there'll be a lot of space-related surnames like Jupteriano and Kosmovich. Just imagine.....
In Spanish the Jetsons were translated to "Los Supersónicos", George was just called "Súper" and his boss was called "Júpiter"
I looked it up (it's been ages since I watched it) and the rest of the family also had names like Ultra (the mom), Lucero (the girl; Lucero is an actual female name in Spanish), Cometín (the boy), and Astro the dog. Funnily enough, Rosie the robot maid was translated as "Robotina," which is what some people call Roombas here.
Where was I going with this? Oh yeah, in the Soviet Union it wasn't uncommon to name kids after revolutionary or scientific concepts. So you had kids named Radiy (for radium) and Industriy. IIRC there were also kids named for electricity, cosmonautics, and vaccines, and of course for the names of Lenin and Marx, and for concepts such as world revolution and the new era... Like we say in Argentina, "los soviéticos estaban en una, pero una muy buena" (the Soviets were onto something, and something really good). If only we had half the optimism those people had for the future.
36 notes
·
View notes
Note
What I don't understand about AI is shouldn't we want to teach the robots to do the menial tasks that nobody likes so we can have more time to spend on creative and other fun pursuits? Not have the robots do the fun stuff so we can continue to suffer doing the worst jobs. I want the Jetsons future with Rosie the Robot maid, not this, and I don't understand why ANYONE wants this. An AI can't possibly write or make music or paint as well as a human!
Of course, we should use AI, if we use it at all, to reduce human misery and increase our free time so people can have more fun and make more art. But the people who are driving the current AI push into creative arts don't see things that way. From what I can tell, they're basically soulless techbros and/or greedy billionaires who don't give a shit about art, or the quality of human life, or even the looming dangers of the singularity. They will happily destroy jobs, human creativity, and even humanity itself if it will make their stock options go up and let them buy bigger yachts.
#ask me anything#tv writing#ask me stuff#ai#ai art generator#chatgpt#wgawest#wga west#stand with the wga#wga strike#support the wga#wga strong#sag aftra#sag strike#amptp#fuck the amptp
203 notes
·
View notes
Text
⚠︎ ⟮ NPTs ⟯ ... Murder_drones.mp3 ⟩ Cyn
﹫ ❲ Requested by anonymous ❳
「 NAMES 」
Cyn / Sin, Cynthia / Sinthia, Doily / Doilie¹, Marnierre², Salem, Marianne, Marie Anne, Lady, Doll / Dolly / Dollie, Cutlerie³, Sweetie, Belle, Bow, Lacy / Lacie / Lace, Ribbon, Sash, Felicity, Lolita, Teacup, Porcelain / Porcelynn, Silver, Delilah, Velvet / Velvette, Paisley / Paislie, Creepie, Leslie, Rose / Rosie, Pearl, Opal, Filistata⁴, Dear / Dearie, Ciel, Jacqueline / Jacquelynne², Curtsy, Embroiderie⁵, Boudreaux², Envy, Violynne⁶
¹ After doilies. ² French. ³ After cutlery. ⁴ A kind of arachnid. ⁵ After embroidery. ⁶ After violins.
「 PRONOUNS 」
Che/Cher/Cherie, Fem/Femme, Goth/Gothic, Maid/Maids, Sweet/Sweeties, Solve/Solvers, Wyrm/Wyrms, Eldritch/Eldritchs, See/Seer, Fabric/Fabrics, Sh3/H3r, Dear/Dearie, Error/404, 404/404s, Glitch/Glitches, Tick/Tock, Sh_/H_r, X/Xs, Stitch/Stitches,🪞/🪞s, 🥂/🥂s, ☕/☕s, ⚠️/⚠️s, ⚠︎/⚠︎s
「 TITLES 」
The Lady / Gentleman of the mansion / manor, The absolute solver of fabrics, ( Prn ) with a golden stare, The most elegant robot / drone / android, His / N's little sister, The sweetest / cutest / most adorable younger sister, The little girl / ( Label ) with a dark secret, ( Prn ) with a dark secret, ( Prn ) who is attending the gala, ( Prn ) who is hosting a tea party, ( Prn) who is playing with ( prn ) dolls, ( Prn ) eldritch form
#https://absoluteSolver_npts#murder drones#md#md cyn#murder drones cyn#the absolute solver of fabrics#absolute solver#npts#npt suggestions
57 notes
·
View notes
Text
I've finally been able to sketch something. It ain't much, but it's honest work.
"Jeeves" is the most basic robotic domestic servant offered in C.H.C. space; he is relatively cheaply made, with no options to alter or change his personality or appearance; he is stuck as an "elderly" butler; and he struggles to complete more physical tasks as well as tasks that require high dexterity as Jeeves only has three fingers on each hand. However, his low cost and easy maintenance ensure that he is still one of the best sellers for those in need of domestic servants, though some find his featureless face, sunken eyes, and disproportionate arms off-putting.
"Rosie" is the newest and greatest robotic servant, considerably bulkier and more expensive than Jeeves, with the base personality of an elderly housemaid. Rosie does well in any role given to her; she works with AI cards much like Jeeves, but they are far larger and therefore have much greater flexibility per card. While Jeeves may need separate AI cards for housework, chauffeuring, shopping, and yard work, Rosie would only need one all-encompassing domestic AI card. Rosie is highly customizable, and with the use of personality cards that change her personality from the baseline maid, she can suit her owner's liking in terms of looks and personality. Combined with this, Rosie has a "face," allowing her to "look at" and express herself to her owners, making her far more popular than Jeeves, as many see their Rosie as part of the family, despite the constant reminders from the manufacturer that Rosie does not have feelings and has no attachment to any one person.
Many view Jeeves and Rosie as a couple, and the manufacturer has capitalized on this. If Jeeves and Rosie are in close proximity and idle, such as when recharging or waiting outside of a building for their owners, they will sometimes have "conversations," prepackaged voice lines they say to each other, often more ad than actual discussion, with both machines praising the construction and quality of each other or Rosie pointing out her "superiority" over Jeeves.
K luv U bi
#art#digital art#digital#oc#art of mine#original character#character design#oc art#oc illustration#illustration#digital illustration#robot#robot art#robot oc#when i wrote this it didn't seem like that much words#artists on tumblr#scifi art#sci fi#scifi#sketch
3 notes
·
View notes
Note
I feel Maid Iggy would just create Rosie the Robot from The Jetsons to do the work for her XD
not if Velvette is making her wear it
0 notes
Text
A faster, better way to train general-purpose robots
New Post has been published on https://sunalei.org/news/a-faster-better-way-to-train-general-purpose-robots/
A faster, better way to train general-purpose robots
In the classic cartoon “The Jetsons,” Rosie the robotic maid seamlessly switches from vacuuming the house to cooking dinner to taking out the trash. But in real life, training a general-purpose robot remains a major challenge.
Typically, engineers collect data that are specific to a certain robot and task, which they use to train the robot in a controlled environment. However, gathering these data is costly and time-consuming, and the robot will likely struggle to adapt to environments or tasks it hasn’t seen before.
To train better general-purpose robots, MIT researchers developed a versatile technique that combines a huge amount of heterogeneous data from many sources into one system that can teach any robot a wide range of tasks.
Their method involves aligning data from varied domains, like simulations and real robots, and multiple modalities, including vision sensors and robotic arm position encoders, into a shared “language” that a generative AI model can process.
By combining such an enormous amount of data, this approach can be used to train a robot to perform a variety of tasks without the need to start training it from scratch each time.
This method could be faster and less expensive than traditional techniques because it requires far fewer task-specific data. In addition, it outperformed training from scratch by more than 20 percent in simulation and real-world experiments.
“In robotics, people often claim that we don’t have enough training data. But in my view, another big problem is that the data come from so many different domains, modalities, and robot hardware. Our work shows how you’d be able to train a robot with all of them put together,” says Lirui Wang, an electrical engineering and computer science (EECS) graduate student and lead author of a paper on this technique.
Wang’s co-authors include fellow EECS graduate student Jialiang Zhao; Xinlei Chen, a research scientist at Meta; and senior author Kaiming He, an associate professor in EECS and a member of the Computer Science and Artificial Intelligence Laboratory (CSAIL). The research will be presented at the Conference on Neural Information Processing Systems.
Inspired by LLMs
A robotic “policy” takes in sensor observations, like camera images or proprioceptive measurements that track the speed and position of a robotic arm, and then tells a robot how and where to move.
Policies are typically trained using imitation learning, meaning a human demonstrates actions or teleoperates a robot to generate data, which are fed into an AI model that learns the policy. Because this method uses a small amount of task-specific data, robots often fail when their environment or task changes.
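The imitation-learning setup the article describes can be sketched in a few lines: a policy is just a function from observations to motor commands, and behavior cloning repeatedly nudges the policy's output toward the demonstrated action. Everything here (the linear policy, the gain parameters, the learning rate) is an illustrative toy, not the paper's actual model.

```python
# Toy policy: map proprioceptive readings to joint commands via per-joint gains.
def policy(observation, gains):
    """Compute motor commands as a gain-weighted copy of the observation."""
    return [g * x for g, x in zip(gains, observation)]

def behavior_cloning_step(gains, observation, expert_action, lr=0.1):
    """One gradient step pulling the policy output toward the demonstrated action."""
    predicted = policy(observation, gains)
    # Squared-error gradient for each gain: (prediction - target) * input.
    return [g - lr * (p - a) * x
            for g, p, a, x in zip(gains, predicted, expert_action, observation)]

# Fit a single human demonstration, as imitation learning does at small scale.
gains = [0.0, 0.0]
for _ in range(100):
    gains = behavior_cloning_step(gains, observation=[1.0, 2.0],
                                  expert_action=[0.5, 1.0])
# gains converge toward [0.5, 0.5], reproducing the demonstrated action.
```

The brittleness the article mentions follows directly from this setup: the policy only ever sees the demonstrated observations, so anything outside that narrow distribution is unconstrained.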
To develop a better approach, Wang and his collaborators drew inspiration from large language models like GPT-4.
These models are pretrained using an enormous amount of diverse language data and then fine-tuned by feeding them a small amount of task-specific data. Pretraining on so much data helps the models adapt to perform well on a variety of tasks.
“In the language domain, the data are all just sentences. In robotics, given all the heterogeneity in the data, if you want to pretrain in a similar manner, we need a different architecture,” he says.
Robotic data take many forms, from camera images to language instructions to depth maps. At the same time, each robot is mechanically unique, with a different number and orientation of arms, grippers, and sensors. Plus, the environments where data are collected vary widely.
The MIT researchers developed a new architecture called Heterogeneous Pretrained Transformers (HPT) that unifies data from these varied modalities and domains.
They put a machine-learning model known as a transformer into the middle of their architecture, which processes vision and proprioception inputs. A transformer is the same type of model that forms the backbone of large language models.
The researchers align data from vision and proprioception into the same type of input, called a token, which the transformer can process. Each input is represented with the same fixed number of tokens.
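The alignment idea above can be sketched as modality-specific "stems" that each emit a fixed number of tokens, so the shared trunk always sees a same-shaped sequence regardless of the robot's sensors. The pooling scheme, token counts, and dimensions below are assumptions for illustration, not the paper's architecture.

```python
TOKENS_PER_MODALITY = 4   # fixed token count per modality (assumed)
TOKEN_DIM = 8             # embedding width (assumed)

def chunk_to_tokens(values, n_tokens, dim):
    """Crudely pool a flat list of floats into n_tokens vectors of length dim."""
    tokens = []
    step = max(1, len(values) // n_tokens)
    for i in range(n_tokens):
        chunk = values[i * step:(i + 1) * step] or [0.0]  # pad short inputs
        mean = sum(chunk) / len(chunk)
        tokens.append([mean] * dim)   # toy embedding: repeat the pooled value
    return tokens

def tokenize_observation(camera_pixels, joint_angles):
    """Align vision and proprioception into one shared token sequence."""
    vision = chunk_to_tokens(camera_pixels, TOKENS_PER_MODALITY, TOKEN_DIM)
    proprio = chunk_to_tokens(joint_angles, TOKENS_PER_MODALITY, TOKEN_DIM)
    return vision + proprio   # transformer trunk sees 8 tokens either way

# A 64-pixel image and a 3-joint arm produce the same-shaped sequence
# as any other camera/arm combination would.
tokens = tokenize_observation(camera_pixels=[0.1] * 64,
                              joint_angles=[0.5, -0.2, 1.0])
```

The fixed token budget is what lets wildly different robots share one model: the trunk never needs to know how many pixels or joints the raw data had.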
Then the transformer maps all inputs into one shared space, growing into a huge, pretrained model as it processes and learns from more data. The larger the transformer becomes, the better it will perform.
A user only needs to feed HPT a small amount of data on their robot’s design, setup, and the task they want it to perform. Then HPT transfers the knowledge the transformer gained during pretraining to learn the new task.
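Structurally, that transfer step amounts to reusing a large pretrained trunk as-is and attaching a small, freshly initialized head for the new robot and task. The classes and numbers below are illustrative assumptions, not the paper's API.

```python
class SharedTrunk:
    """Stands in for the large pretrained transformer; weights are reused frozen."""
    def __init__(self, weights):
        self.weights = weights          # "downloaded" pretrained parameters

    def forward(self, tokens):
        # Toy "trunk": weighted sum of each token's values into one feature.
        return [sum(t) * w for t, w in zip(tokens, self.weights)]

class RobotHead:
    """Small robot-specific head, the only part trained on the new task's data."""
    def __init__(self, n_actions):
        self.scale = [1.0] * n_actions  # freshly initialized

    def forward(self, features):
        return [f * s for f, s in zip(features, self.scale)]

trunk = SharedTrunk(weights=[0.5, 0.5, 0.5])   # pretrained once, shared by all
head = RobotHead(n_actions=3)                   # cheap per-robot adaptation
actions = head.forward(trunk.forward([[1.0, 1.0], [2.0, 0.0], [0.0, 0.0]]))
```

This split is why adaptation needs so little data: the expensive, data-hungry part is amortized across every robot that reuses the trunk.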
Enabling dexterous motions
One of the biggest challenges of developing HPT was building the massive dataset to pretrain the transformer, which included 52 datasets with more than 200,000 robot trajectories in four categories, including human demo videos and simulation.
The researchers also needed to develop an efficient way to turn raw proprioception signals from an array of sensors into data the transformer could handle.
“Proprioception is key to enable a lot of dexterous motions. Because the number of tokens in our architecture is always the same, we place the same importance on proprioception and vision,” Wang explains.
When they tested HPT, it improved robot performance by more than 20 percent on simulation and real-world tasks, compared with training from scratch each time. Even when the task was very different from the pretraining data, HPT still improved performance.
“This paper provides a novel approach to training a single policy across multiple robot embodiments. This enables training across diverse datasets, enabling robot learning methods to significantly scale up the size of datasets that they can train on. It also allows the model to quickly adapt to new robot embodiments, which is important as new robot designs are continuously being produced,” says David Held, associate professor at the Carnegie Mellon University Robotics Institute, who was not involved with this work.
In the future, the researchers want to study how data diversity could boost the performance of HPT. They also want to enhance HPT so it can process unlabeled data like GPT-4 and other large language models.
“Our dream is to have a universal robot brain that you could download and use for your robot without any training at all. While we are just in the early stages, we are going to keep pushing hard and hope scaling leads to a breakthrough in robotic policies, like it did with large language models,” he says.
This work was funded, in part, by the Amazon Greater Boston Tech Initiative and the Toyota Research Institute.
0 notes
Photo
Mike talks with filmmaker SK Dale about his latest work, Subservience. It's the story of a family with a new robot maid who's a little more conniving than Rosie from The Jetsons. https://www.spreaker.com/episode/special-report-s-k-dale-on-subservience--61640586
0 notes
Text
We've got the Jetsons' Rosie the Robot, The Sims' Bonehilda, and Fifi from Beauty and the Beast, and they all just seem to say that 'maid' and the classic 'French maid' housemaid outfit are so synonymous, at least in the English-speaking world, that one is the other, regardless of culture or era. Black dress with white trim, short lace bonnet, feather duster, and huge eyes, and I feel like you've got the platonic visual of an Ancilla domestica in its natural habitat.
I just saw a wizard-as-fantastical-species shitpost and a maid-as-fantastical-species shitpost on my dash back to back, and now I'm kind of wondering what the maid equivalent of the classic "just a robe, a hat, and a pair of eyes" interpretation of the wizard-as-species would be.
1K notes
·
View notes
Text
19/?? Childhood TV Shows You Should Watch
Title: The Jetsons
Seasons: 3
Episodes: 75
Run Time: 22-30 mins
Original Air Date:
S1: September 23, 1962 - March 17, 1963
S2/S3: September 16, 1985 - November 12, 1987
Synopsis:
In the future, the Jetsons are a family residing in Orbit City. George Jetson lives with his family in the Skypad Apartments: his wife Jane is a homemaker, their teenage daughter Judy attends Orbit High School, and their son Elroy attends Little Dipper School. Housekeeping is performed by a robot maid named Rosie, who handles chores not otherwise rendered trivial by the home's numerous push-button Space Age-envisioned conveniences. The family has a dog named Astro.
George Jetson's work week consists of an hour a day, two days a week. His boss is Cosmo Spacely, the bombastic owner of Spacely Space Sprockets. Spacely has a competitor, Mr. Cogswell, owner of the rival company Cogswell Cogs. Daily life is leisurely, assisted by numerous labor-saving devices, which occasionally break down with humorous results.
My Rating: 10/10
My Reasoning:
I absolutely LOVE this show. I like it as much as I like The Flintstones. It's a classic cartoon and pop culture icon, in my opinion. I always found the futuristic ideas so fascinating and interesting. Looking at it from today's perspective, it is interesting to see how some of these things now exist. It's also funny that some of the things they thought would exist are still nowhere near existing.
I genuinely like all the characters; even the dog and robot. Somehow they're all relatable. I also like the social commentary the show has. It shows that even though the world could be seemingly "perfect" people are always going to complain about inconveniences or whatever.
It's also got good humor and I find most of the situations funny. I definitely think you should watch it if you get the chance.
#tv shows#childhood tv shows#childhood shows you should watch#cartoons#childhood cartoons#childhood tv list#the jetsons
1 note
·
View note
Text
Home Robots that Were Built to Make Your Life Much Easier
Home robots are not just in futuristic TV shows or movies anymore. How many people watched Rosie the robot maid on The Jetsons and wanted to have one too? Or how many saw Star Wars and wanted to bring R2-D2 home, even if we also had to take C-3PO? We haven't gotten that advanced yet, but Aido robot companions, personalized assistants, and home management aids have been steadily improving since Roomba first hit store shelves in 2002.
Some robots are human-like, running on artificial intelligence, and some look like flying saucers, but they all make life easier for their human owners, and many are as affordable as a new computer. There are robots that provide home security, one is a personal pool boy, and one will even play with your children.
Check out these nine home helper robots and see if one is right for you:
1. Ubtech Lynx
This humanoid robot helps Amazon Alexa come alive. Lynx has facial recognition technology and personalized greetings. Lynx can give you weather reports, play your favorite music and make to-do lists for you as well as remind you when you need to place an Amazon order. No sticky notes required.
Visit Here :- Aido Robot.
2. Asus Zenbo
This smart mobile companion robot can provide assistance and entertainment when you need it. Zenbo learns and adapts and even shares emotions with you. The robot controls household devices and can act as a security system when you are not home and even read to your children to keep them entertained. It’s a robot, a friend, and a built-in babysitter too.
3. Roomba by iRobot
Roomba has been around since 2002 but has improved a lot in the past 16 years. The latest model does a lot more than just vacuum. It can be controlled via Wi-Fi or linked to Amazon Alexa or Google voice-activated assistant. While the Roomba moves around your home cleaning floors, it can remember dirty places that need extra attention and it can plug itself into its charging station and go back to where it left off when the battery is recharged. How cool is that?
4. Alfawise Magnetic
This robot also cleans your house. Alfawise works like a vacuum cleaner but can do much more. It has microfiber pads to clean glass and suction features to ensure it doesn't fall off when cleaning windows. Finally, a maid that does windows.
5. Worx Landroid
This robotic lawn mower will do the yard work for you. Landroid is designed to trim your lawn on a daily basis. It is a much quieter alternative to gasoline engine lawn mowers and can return to its charging station when its batteries are low or if it starts to rain. This is much better than paying a neighbor kid to mow your grass.
Read More :- Ingen Dynamics.
6. Dolphin Nautilus
This small robotic pool boy includes vacuuming and scrubbing elements and is smart enough to decide which one to use while it cleans your pool. Dolphin comes with special swivel cables that will never get tangled and GPS to ensure that your entire pool is cleaned. You can sit by the pool and sip a cool drink while Dolphin does all the work.
7. Budgee by 5 Elements Robotics
Do you need an extra set of hands to carry something around the house? That's Budgee's specialty. The hardworking robot can follow you around your house, the supermarket or even the airport to help you carry stuff. No tips required.
8. Aido by Ingen Dynamics
This robot isn't for sale yet, but it will be later this year. Aido is a family-friendly robot that can help you with household chores, handle schedules, can connect and configure inputs from custom medical devices and can even play with your kids. This sounds like it will be a real winner when it hits the store shelves.
Visit Here
0 notes
Text
Encounters with robots - Part 2: Rumpole
Figure 1 - Messier 81, Bode’s Nebula (c) DE Wolf 2023
Figure 1 is a photograph that I took of Messier 81, Bode's Nebula, two nights ago using iTelescope T24, a 0.61-m f/6.5 reflector, in Auberry, California. It is another example of encounters with robotic eyes. Bode's Nebula is a spiral galaxy about 12 million light-years away in the constellation Ursa Major. Messier 81 was first discovered by Johann Elert Bode on 31 December 1774; Bode is famous for the Titius-Bode law of planetary distances.
At first, I was a bit put off by the lack of sharpness in this image. Artistically it looked like a little fuzzy pancake. But then I realized that I had looked at M81 myself with my own telescope the previous night, and that is exactly what it looks like. The appeal lies in the subtle color, in the incredible dynamic range between the galactic core and the outer band; it floats; it is gorgeously ethereal. M81's core contains a black hole with the mass of 70 million Suns, 15 times the mass of the black hole at the center of our own Milky Way galaxy. Breathtaking!
But moving from the sublime to the more mundane, what I really want to talk about today is my Roomba by iRobot. It cleans my house by command either using an app on my cellphone or using Alexa by voice command. My house has never been so clean! Every day I send it to a different room. It is constantly updating its map of my house. Yes, true learning! So, I think a true robotic encounter.
You are encouraged to name your Roomba. Mine is Rumpole. And every day, as I watch him systematically lumber to his task, I think of Leo McKern, who played this character on television.
But really, I tend to be reminded of the nineteen-sixties futuristic TV series “The Jetsons“ and in particular Rosie the Robot, the Jetson family's XB-500 series robotic maid. Rosie came complete with a little frilly French maid's outfit. Like Rumpole, she was forever cleaning.
Futuristic fiction always bears the flavor of modern times. The Jetsons takes place in a temporally displaced 1960s. It carries forward all the prejudices and quirks of the '60s. Worse, of course, is Star Trek. Did we truly hope that the future would be just as sexist as the then present? Rosie is a human-like machine that pushes a vacuum around, which makes no sense from an engineering perspective. Rumpole takes a deep dive under the couch in search of alien dust bunnies. Indeed, on his first mission in my bedroom, he had an encounter with a paper towel under my bed. Rumpole growled a bit and spit it out. The paper towel did not fare as well.
(c) DE Wolf 2023
0 notes