#Space and Drone Remote Sensing Lab
(CNN) — Egypt’s Great Pyramid and other ancient monuments at Giza exist on an isolated strip of land at the edge of the Sahara Desert.
The inhospitable location has long puzzled archaeologists, some of whom had found evidence that the Nile River once flowed near these pyramids in some capacity, facilitating the landmarks’ construction starting 4,700 years ago.
Using satellite imaging and analysis of cores of sediment, a new study published Thursday in the journal Communications Earth & Environment has mapped a 64-kilometer (40-mile) long, dried-up branch of the Nile, long buried beneath farmland and desert.
“Even though many efforts to reconstruct the early Nile waterways have been conducted, they have largely been confined to soil sample collections from small sites, which has led to the mapping of only fragmented sections of the ancient Nile channel systems,” said lead study author Eman Ghoneim, a professor and director of the Space and Drone Remote Sensing Lab at the University of North Carolina Wilmington’s Department of Earth and Ocean Sciences.
“This is the first study to provide the first map of the long-lost ancient branch of the Nile River.”
Ghoneim and her colleagues refer to this extinct branch of the Nile River as Ahramat, which is Arabic for "pyramids."
The ancient waterway would have been about 0.5 kilometers wide (about one-third of a mile) with a depth of at least 25 meters (82 feet) — similar to the contemporary Nile, Ghoneim said.
“The large size and extended length of the Ahramat Branch and its proximity to the 31 pyramids in the study area strongly suggests a functional waterway of great importance,” Ghoneim said.
She said the river would have played a key role in ancient Egyptians’ transportation of the enormous amount of building materials and laborers needed for the pyramids’ construction.
“Also, our research shows that many of the pyramids in the study area have (a) causeway, a ceremonial raised walkway, that runs perpendicular to the course of the Ahramat Branch and terminates directly on its riverbank.”
Hidden traces of a lost waterway
Traces of the river aren’t visible in aerial photos or in imagery from optical satellites, Ghoneim said.
In fact, she only spotted something unexpected while studying radar satellite data of the wider area for ancient rivers and lakes that might reveal a new source of groundwater.
“I am a geomorphologist, a paleohydrologist looking into landforms. I have this kind of trained eye,” she said.
“While working with this data, I noticed this really obvious branch or a kind of riverbank, and it didn’t make any sense because it is really far from the Nile,” she added.
Born and raised in Egypt, Ghoneim was familiar with the cluster of pyramids in this area and had always wondered why they were built there.
She applied to the National Science Foundation to investigate further.
Geophysical data taken at ground level with the use of ground-penetrating radar and electromagnetic tomography confirmed it was an ancient arm of the Nile.
Two long cores of earth the team extracted using drilling equipment revealed sandy sediment consistent with a river channel at a depth of about 25 meters (82 feet).
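For a sense of how a radar sounding maps onto depth figures like the roughly 25 meters above, the basic conversion is simply wave speed times two-way travel time, halved. A generic sketch (not from the study; the velocity here is an assumed, typical order of magnitude for sediments, not a measured value):

```python
def reflector_depth(two_way_time_s: float, wave_velocity_m_s: float) -> float:
    """Depth = v * t / 2: the radar pulse travels down and back, so halve the path."""
    return wave_velocity_m_s * two_way_time_s / 2

# Assumed wave velocity of ~1.0e8 m/s (about c/3), a common order of magnitude in sediments
v = 1.0e8
t = 500e-9  # a hypothetical 500-nanosecond two-way travel time
print(reflector_depth(t, v))  # → 25.0 meters
```

In practice the velocity varies with moisture and material, which is one reason the team drilled cores to confirm what the geophysical surveys suggested.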
It’s possible that “countless” temples might still be buried beneath the agricultural fields and desert sands along the riverbank of the Ahramat Branch, according to the study.
Why this branch of the river dried up or disappeared is still unclear. Most likely, a period of drought and desertification swept sand into the region, silting up the river, Ghoneim said.
"The study demonstrated that when the pyramids were built, the geography and riverscapes of the Nile differed significantly from those of today," said Nick Marriner, a geographer at the French National Centre for Scientific Research in Paris.
He was not involved in the study but has conducted research on the fluvial history of Giza.
“The study completes an important part of the past landscape puzzle,” Marriner said.
“By putting together these pieces, we can gain a clearer picture of what the Nile floodplain looked like at the time of the pyramid builders and how the ancient Egyptians harnessed their environments to transport building materials for their monumental construction endeavors.”
nova--spark · 8 months
OMG Radix looks so good! Seeing them has inspired me to ruin your life lol. I have so many questions to ask hehe. 1. They have a spider-lily alt mode and I'm curious as to what this means. Like do they shrink into a smaller spider-lily or something else? If they do shrink into a smaller spider-lily, does this mean that they'd be a good spy, recon or surveyor? Or are they more like an inquisitive person who uses this alt mode to research and understand other plants and animals? 2. Do they photosynthesize, or do they still consume energon and only take on the appearance of a spider-lily and not its characteristics? 3. Here to ruin your life lol. Seeing Radix inspired me to think about, like, plantformers, where they have plant-related alt modes: there could be a sunflower who has a bold and cheery disposition, or a forget-me-not that has an ethereal appearance and is a hopeless romantic. To make this a question: what inspired you to make them have a non-object, non-vehicle, non-animal alt mode? I wish I could draw cause Radix has totally inspired me on this idea lol.
I am delighted to answer this!! Keep in mind, I will post a link to his Bio soon!!
1] He does not shrink down to a real flower! Similar to Botanica from Beast Machines, I believe, he took a plant form, yes, as that's what was available to him to scan, but it was merely a method by which he adapted his Cybertronian/Maximal biology to resemble the spider lily! As for how he uses his alt mode, it is indeed both of these! He is no spy; his mode made him immobile in the end, as he becomes entirely rooted to the base/lab, etc. he works in and needs an aide if he wishes to move around.
Instead, he uses it to quite literally integrate with the comms and more, his own network [like plant roots = a living network] of data. In that sense, he is both remote recon and a researcher, able to pore through data easily and control tech on a far smoother scale.
2] Nope! Energon is indeed his primary and only source of sustenance! Sure, as a techno-organic he theoretically could photosynthesize, but he'd rather take Energon over the possible risk.
As for characteristics: Radix, being a spider lily, is toxic to most animals, so he has a minor venom sting that will just cause minor illness [nausea, fatigue, etc.] to organic and cybernetic creatures. It is a method of defense he is grateful for.
He also has both bee and hummingbird drones, which he uses to work remotely, collecting data behind enemy lines and carrying out surveillance and research of all kinds!
3] As is the norm, it was actually @piltover-sharpshooter who initially got the idea for Radix! Inspired by Botanica, he thought about a plant-bot who would be entirely rooted into the base, or his work space, which led to Radix after consulting our other friend @vanidrabbles, who adores flowers! She suggested spider lilies, and I adored the idea of a flamboyant spider lily whose petals and stamens doubled as data cables like Soundwave's, and who had bee drones befitting his remote work.
spacenutspod · 9 months
SpaceX has yet another busy week of launches, with Rocket Lab, Russia, and China also launching rockets. On Thursday, SpaceX will begin its busy week with the launch of a Falcon 9 carrying another batch of Starlink satellites from Space Launch Complex 40 (SLC-40) in Florida. The company's Thursday launch will then be followed by another Falcon 9 launch carrying yet another batch of Starlink satellites, this time from Vandenberg Space Force Base in California, on Friday. Towards the end of the week, Rocket Lab will return to flight on Friday with "The Moon God Awakens" mission. This comes after a previous mission, dubbed "We Will Never Desert You," failed to reach orbit on Sept. 19, 2023; the new flight will set a yearly launch record for the company. Russia is expected to launch a Soyuz 2.1b with the Arktika-M n°2 remote sensing and emergency communications satellite on Saturday, and SpaceX will finish the week with the launch of a Falcon 9 carrying the Ovzon-3 satellite from SLC-40 on Sunday. Throughout the week, China is also expected to conduct three launches, one of which may carry China's own reusable spaceplane, similar to the United States' X-37B.

Falcon 9 Block 5 | Starlink Group 6-34

On Thursday night, SpaceX will launch another batch of 23 Starlink V2 mini satellites at 11:00 PM EST (04:00 UTC on Dec. 13) on a Falcon 9 from SLC-40. This will be the first of two Starlink missions SpaceX plans to launch this week. B1081-3 is the first stage booster assigned to this mission, marking a 32-day turnaround after launching the CRS-29 resupply mission to the International Space Station in November. After launch, the booster will land on the drone ship A Shortfall of Gravitas in the Atlantic Ocean.
Launch Complex 40 at Cape Canaveral pic.twitter.com/8GnI5aXpiW

— Elon Musk (@elonmusk) December 12, 2023

Chang Zheng 2F/T | CSSHQ

On Thursday at 14:10 UTC, a Chang Zheng 2F/T rocket is expected to launch China's reusable spaceplane into orbit from the Jiuquan Satellite Launch Center in China. Very little is publicly known about this spaceplane, even less than about the X-37B, on which this vehicle is believed to be based. If this launch does carry the spaceplane into orbit, it will mark the third flight of this spacecraft and the second time it has been in space in 2023. The last launch of this vehicle occurred on Aug. 4, 2022, and it returned to Earth earlier this year on May 8.

Electron/Curie | The Moon God Awakens

On Friday, Rocket Lab will launch the "The Moon God Awakens" mission aboard its Electron rocket, officially returning the vehicle to flight after its last mission ended in failure, when an electrical arc within the power supply system on the second stage caused the vehicle to lose power and shut down its engine shortly after separation from the first stage. Liftoff is scheduled to occur during a two-hour window, opening at 17:00 NZDT and closing at 19:00 NZDT (0400-0600 UTC), from Pad B at Launch Complex 1 on the Mahia Peninsula in New Zealand.

Mission patch for "The Moon God Awakens" mission. (Credit: Rocket Lab)

This mission will carry the QPS-SAR-5 satellite for the company iQPS. This small synthetic aperture radar satellite weighs only ~100 kilograms and will be used to collect high-resolution imagery of Earth from orbit.

QPS-SAR-5 will join the other satellites in the iQPS constellation; once completed, the constellation will consist of 36 satellites capable of monitoring specific points on Earth as often as every 10 minutes. This mission will also set a new yearly launch record for Rocket Lab, closing out 2023 with ten launches and beating the company's previous record of nine in 2022.
Falcon 9 Block 5 | Starlink Group 7-9

Towards the end of the day on Friday, SpaceX will launch a Falcon 9 carrying another batch of 22 Starlink V2 mini satellites from Space Launch Complex 4-East at Vandenberg Space Force Base in California. Liftoff is set to occur at 9:14 PM PST (04:59 UTC on Dec. 14). The booster being used on this mission is currently unknown; however, after launch, it is expected to touch down on the Of Course I Still Love You drone ship in the Pacific Ocean.

Chang Zheng 5 | Unknown Payload

On Friday, a Chang Zheng 5 will launch a currently unknown payload from the Wenchang Space Launch Site in China. An exact liftoff time is unknown; however, NOTAMs indicate a launch window of 13:32-14:26 UTC. If the launch occurs as scheduled, it will be the sixth launch of the regular Chang Zheng 5 configuration and the tenth for the Chang Zheng 5 vehicle family.

Chang Zheng 5 rolling out to the pad at Wenchang. (Credit: CASC)

Hyperbola-1 | Unknown Payload

On Saturday, the private Chinese aerospace company i-Space is expected to launch its Hyperbola-1 rocket with a currently unknown payload. Liftoff from the Jiuquan Satellite Launch Center is expected to occur at 06:00 UTC.

Soyuz 2.1b | Arktika-M n°2

A few hours later, Russia is expected to launch a Soyuz 2.1b rocket from Site 31 at the Baikonur Cosmodrome in Kazakhstan. Liftoff is expected to occur at 12:17 Moscow Time (09:17 UTC). Arktika-M is a remote sensing and emergency communications satellite designed to monitor high-latitude areas of Earth, weighing ~2,100 kilograms. Arktika-M will be launched into a Molniya orbit. This highly elliptical orbit takes 12 hours to complete and allows a satellite to pass over the same spot every 24 hours, making it very useful for communications satellites serving high-latitude areas.

The Arktika-M n°2 satellite ahead of launch.
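The 12-hour Molniya period mentioned above fixes the orbit's size through Kepler's third law. A quick back-of-the-envelope check (a sketch, not from the article; the gravitational parameter is Earth's standard published value):

```python
import math

MU_EARTH = 3.986004418e14  # m^3/s^2, Earth's standard gravitational parameter

def semi_major_axis(period_s: float) -> float:
    """Kepler's third law solved for the semi-major axis: a = (mu * (T / 2pi)^2)^(1/3)."""
    return (MU_EARTH * (period_s / (2 * math.pi)) ** 2) ** (1 / 3)

T = 12 * 3600  # ~12-hour Molniya period, in seconds
a = semi_major_axis(T)
print(f"semi-major axis ~ {a / 1000:,.0f} km")  # roughly 26,600 km
```

With a low perigee and an apogee swung far out over the high latitudes, the satellite spends most of each revolution hanging above one hemisphere, which is why this orbit suits high-latitude communications.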
(Credit: Roscosmos)

Falcon 9 Block 5 | Ovzon-3

Finishing off the week on Sunday, another Falcon 9 will launch from SLC-40, carrying the Ovzon-3 satellite into a geostationary transfer orbit (GTO). Liftoff is set to occur at 3:46 PM EST (20:46 UTC). Ovzon-3 is a Swedish geostationary satellite and the first privately funded geostationary satellite ever built by the country. The satellite weighs ~1,800 kilograms and, once fully deployed, has a length of 27 meters. Once deployed from Falcon 9's second stage, the satellite will use its onboard electric propulsion to maneuver into its final operating orbit over the next several months.

The booster for this mission is currently unknown; however, after launch, it will touch down back at Landing Zone 1 at the Cape Canaveral Space Force Station. This is unusual, as many missions launching to GTO require more performance out of Falcon 9, usually necessitating a drone ship landing. This may indicate that SpaceX has squeezed even more margin out of Falcon 9's performance.

(Lead image: Electron on the pad in New Zealand ahead of its launch on Friday. Credit: Rocket Lab)

The post Launch Roundup: Rocket Lab to return to flight; SpaceX set to launch Falcon 9 three times appeared first on NASASpaceFlight.com.
scifigeneration · 4 years
Robots are playing many roles in the coronavirus crisis – and offering lessons for future disasters
by Robin R. Murphy, Justin Adams, and Vignesh Babu Manjunath Gandudi
A nurse (left) operates a robot used to interact remotely with coronavirus patients while a physician looks on. MIGUEL MEDINA/AFP via Getty Images
A cylindrical robot rolls into a treatment room to allow health care workers to remotely take temperatures and measure blood pressure and oxygen saturation from patients hooked up to a ventilator. Another robot that looks like a pair of large fluorescent lights rotated vertically travels throughout a hospital disinfecting with ultraviolet light. Meanwhile a cart-like robot brings food to people quarantined in a 16-story hotel. Outside, quadcopter drones ferry test samples to laboratories and watch for violations of stay-at-home restrictions.
These are just a few of the two dozen ways robots have been used during the COVID-19 pandemic, from health care in and out of hospitals, automation of testing, supporting public safety and public works, to continuing daily work and life.
The lessons they’re teaching for the future are the same lessons learned at previous disasters but quickly forgotten as interest and funding faded. The best robots for a disaster are the robots, like those in these examples, that already exist in the health care and public safety sectors.
Research laboratories and startups are creating new robots, including one designed to allow health care workers to remotely take blood samples and perform mouth swabs. These prototypes are unlikely to make a difference now. However, the robots under development could make a difference in future disasters if momentum for robotics research continues.
Robots around the world
As roboticists at Texas A&M University and the Center for Robot-Assisted Search and Rescue, we examined over 120 press and social media reports from China, the U.S. and 19 other countries about how robots are being used during the COVID-19 pandemic. We found that ground and aerial robots are playing a notable role in almost every aspect of managing the crisis.
R. Murphy, V. Gandudi, Texas A&M; J. Adams, Center for Robot-Assisted Search and Rescue, CC BY-ND
In hospitals, doctors and nurses, family members and even receptionists are using robots to interact in real time with patients from a safe distance. Specialized robots are disinfecting rooms and delivering meals or prescriptions, handling the hidden extra work associated with a surge in patients. Delivery robots are transporting infectious samples to laboratories for testing.
Outside of hospitals, public works and public safety departments are using robots to spray disinfectant throughout public spaces. Drones are providing thermal imagery to help identify infected citizens and enforce quarantines and social distancing restrictions. Robots are even rolling through crowds, broadcasting public service messages about the virus and social distancing.
At work and home, robots are assisting in surprising ways. Realtors are teleoperating robots to show properties from the safety of their own homes. Workers building a new hospital in China were able to work through the night because drones carried lighting. In Japan, students used robots to walk the stage for graduation, and in Cyprus, a person used a drone to walk his dog without violating stay-at-home restrictions.
Helping workers, not replacing them
Every disaster is different, but the experience of using robots for the COVID-19 pandemic presents an opportunity to finally learn three lessons documented over the past 20 years. One important lesson is that during a disaster robots do not replace people. They either perform tasks that a person could not do or do safely, or take on tasks that free up responders to handle the increased workload.
The majority of robots being used in hospitals treating COVID-19 patients have not replaced health care professionals. These robots are teleoperated, enabling the health care workers to apply their expertise and compassion to sick and isolated patients remotely.
A robot uses pulses of ultraviolet light to disinfect a hospital room in Johannesburg, South Africa. MICHELE SPATARI/AFP via Getty Images
A small number of robots are autonomous, such as the popular UVD decontamination robots and meal and prescription carts. But the reports indicate that the robots are not displacing workers. Instead, the robots are helping the existing hospital staff cope with the surge in infectious patients. The decontamination robots disinfect better and faster than human cleaners, while the carts reduce the amount of time and personal protective equipment nurses and aides must spend on ancillary tasks.
Off-the-shelf over prototypes
The second lesson is the robots used during an emergency are usually already in common use before the disaster. Technologists often rush out well-intentioned prototypes, but during an emergency, responders – health care workers and search-and-rescue teams – are too busy and stressed to learn to use something new and unfamiliar. They typically can’t absorb the unanticipated tasks and procedures, like having to frequently reboot or change batteries, that usually accompany new technology.
Fortunately, responders adopt technologies that their peers have used extensively and shown to work. For example, decontamination robots were already in daily use at many locations for preventing hospital-acquired infections. Sometimes responders also adapt existing robots. For example, agricultural drones designed for spraying pesticides in open fields are being adapted for spraying disinfectants in crowded urban cityscapes in China and India.
Workers in Kunming City, Yunnan Province, China refill a drone with disinfectant. The city is using drones to spray disinfectant in some public areas. Xinhua News Agency/Yang Zongyou via Getty Images
A third lesson follows from the second. Repurposing existing robots is generally more effective than building specialized prototypes. Building a new, specialized robot for a task takes years. Imagine trying to build a new kind of automobile from scratch. Even if such a car could be quickly designed and manufactured, only a few cars would be produced at first and they would likely lack the reliability, ease of use and safety that comes from months or years of feedback from continuous use.
Alternatively, a faster and more scalable approach is to modify existing cars or trucks. This is how robots are being configured for COVID-19 applications. For example, responders began using the thermal cameras already on bomb squad robots and drones – common in most large cities – to detect infected citizens running a high fever. While the jury is still out on whether thermal imaging is effective, the point is that existing public safety robots were rapidly repurposed for public health.
Don’t stockpile robots
The broad use of robots for COVID-19 is a strong indication that the health care system needed more robots, just like it needed more of everyday items such as personal protective equipment and ventilators. But while storing caches of hospital supplies makes sense, storing a cache of specialized robots for use in a future emergency does not.
This was the strategy of the nuclear power industry, and it failed during the Fukushima Daiichi nuclear accident. The robots stored by the Japan Atomic Energy Agency for an emergency were outdated, and the operators were rusty or no longer employed. Instead, the Tokyo Electric Power Company lost valuable time acquiring and deploying commercial off-the-shelf bomb squad robots, which were in routine use throughout the world. While the commercial robots were not perfect for dealing with a radiological emergency, they were good enough and cheap enough for dozens of robots to be used throughout the facility.
Robots in future pandemics
Hopefully, COVID-19 will accelerate the adoption of existing robots and their adaptation to new niches, but it might also lead to new robots. Laboratory and supply chain automation is emerging as an overlooked opportunity. Automating the slow COVID-19 test processing that relies on a small set of labs and specially trained workers would eliminate some of the delays currently being experienced in many parts of the U.S.
Automation is not particularly exciting, but just like the unglamorous disinfecting robots in use now, it is a valuable application. If government and industry have finally learned the lessons from previous disasters, more mundane robots will be ready to work side by side with the health care workers on the front lines when the next pandemic arrives.
About The Authors:
Robin R. Murphy is the Raytheon Professor of Computer Science and Engineering and Vice-President, Center for Robot-Assisted Search and Rescue (nfp) at Texas A&M University, Justin Adams is President of the Center for Robot-Assisted Search and Rescue/Research Fellow - The Center for Disaster Risk Policy at Florida State University, and Vignesh Babu Manjunath Gandudi is a Graduate Teaching Assistant at Texas A&M University
This article is republished from The Conversation under a Creative Commons license.
All Together, Prologue and Part 1
Making A Plan
Word Count: 2257
Based on this AHWM AU
Warnings: None
Author’s Notes at the end
Some say that the night is dead, that it is silent and empty, but that is never quite the case. As the moon rises and the world is lulled to sleep there is always someone, somewhere who resists the darkness’s lullaby. An owl, willingly or not, left to their own devices as the stars above make their journey across the skies.
Tonight, however, seemed to be filled with a whole flock, ruffling their feathers as wide eyes search for something far from their grasp, something that cannot be hunted by one alone.
With their skills, their experience, and their hints of jumbled memories, success appears to be in reach, despite the secrets, the conflicts, and the haunting truth.
However,
It is also important to note the average owl’s brain only takes up about ⅓ of its skull.
Which can equate to roughly the size of a thimble in some species.
Do with that information as you will.
________________________________________________________________
Blue and red hues colored the moonlight gently streaming into the lab, as the steady hum of machines filled the void with quiet noise. At the hour of 4 am, the halls should have been emptied hours ago, but of course science never sleeps, so neither did the scientist. It wasn’t healthy, she knew that more than anybody, but she considered it a small price to pay for what was at stake.
At least that’s what she told herself. It was difficult to label what exactly was at stake when nothing simply made sense anymore. Nothing was adding up in the way they should, and the scientific method she held on to appeared to be failing her at every turn.
The first indicator that something was amiss was the time. Yes, staying up till 4 in the morning was horrendous for a person’s circadian rhythm, but that wasn’t the major issue. The major issue was that the sun had been shining bright and the clock had read 2:37 pm up until just moments ago, when she turned from her desk to be met with darkened windows. Yes, perhaps Einstein’s theory of relativity could be to blame, but she wasn’t that absorbed in her work… okay, maybe she was, but even she had to get up once in a while in the span of the roughly 13 hours that had somehow passed in an instant.
The second indicator was a feeling that was gnawing at her from the inside out. A sense of déjà vu that would never leave, a constant feeling of a word stuck on the tip of her tongue, and bits and pieces of memories in her brain that seemed logically impossible, even in her dreams.
What did it all mean?
Despite the piles of handwritten notes strewn across her desk, she felt completely at a loss. At least she had the newly built Time Anomaly Tracker… that she had no recollection of building, to show for.
Maybe she just needed a break from it. Maybe things would make more sense in the morning after what little sleep she could get.
But first she needed to slow down the wheels turning in her mind. It was a good thing there was an old TV in the break room, that should do the trick. It didn’t take her long to plop down onto a dusty couch and grab the remote. Hopefully it would be enough to distract her from all her thoughts. 
-click-
“Order your bubbles today-”
-click-
“Welcome to Warfs-”
-click-
“You think she cares? Bad Dog!”
-click-
So picky, she couldn’t help but drone through the different channels until a shaky camera and a stuttering voice caught her attention.
“Hello everybody this is Jim, and this is my associate Jim. Welcome to this Jim News Exclusive -stay low, stay low- Tonight, we bring to you-” the reporter paused for a moment to dramatically point to the camera, as if this was a message directly to Rose Beauregard herself, “live footage from the scene of the crime. The crime of robbery. A robbery so mysterious, so mystifying that no one could even fathom how the robberors could have broken into this heavily guarded museum in the first place!”
From the way they were sneaking around, it appeared that the reporters had broken into the museum. It was actually quite impressive considering the lines of caution tape that wrapped some exhibits like Christmas presents, the addition of a laser based alarm system, and the obscene number of patrolling guards and policemen that could be easily seen in the background.
"We must be careful Jim we don’t know what dangers may be lurking abo- oh hand me the steak." The steak was thrown off camera, quickly followed by a distant voice cheering in delight about the free snack. “The Old Steak Trick, works most of the time.”
Soon, maybe a little bit too soon, the Jims approached a very much unlocked and strangely unprotected vault. 
“Here it is, the grisly scene. Not one, not seven, not four, but two insidious individuals committed the reprehensible act of theft in this very vault. Yes, the item that was once here is no longer here. It has disappeared, off with the perpetrators. We have no confirmation about exactly what it is they actually stole, but we have our theories. It could have been a treasure map or an ancient salt shaker, it may be from another world or the source of a time anomaly, it could be a fairy (like the ones we learned about in history class) or the world’s oldest picnic basket, it could be all of these, it could be nothing at all, the possibilities are endless.”
“Now I’m sure many of you watching at home are shaking in utter fear, I am too, but fear not. Thankfully for you innocent, or perhaps not so innocent civilians, justice hit them hard, even harder than how Cousin Jim was hit by that bus, and a great many times quicker. Our inside resource has informed us Jims about the fates of Mark Iplier and his assistant Y/N. They are already locked up, far, far away at Happy Trails Penitentiary to never see the light of day again. We are safe, for now.”
“However, there is still a mystery to be solved. For unknown reasons, the object of question has not been returned to its rightful place. It’s tragic on every degree, that poor stolen object, it must be so scared and alone now that its captors are behind bars. But that is why we have taken it upon ourselves to get answers. And this time I swear, on every Jim ever to Jim, that we will find an answer to whatever, wherever, whenever, and whyever this thing is-”
The Jim’s voice was cut off by another’s, which resulted in the reporters and their sole viewer being thrown off guard. In their hasty escape, the camera tumbled to the ground, making it even more difficult to decipher who the new person was. From the small glimpse, it didn’t appear like they worked at the museum, nor were they dressed as any law enforcer. Nevertheless, they didn’t seem so glad to see the trespassers.
“You two again!? Why won’t you quit?!?” was the last thing to be heard before the screen was claimed by static.
The scientist simply sat on, dumbfounded by whatever the hell she just watched. Her head was filled with so many questions she wasn’t even sure where to begin. There was a heist at a public museum, yet no one knows what was stolen? How were the perpetrators already in jail? Didn’t the crime just happen? When was the trial? And why did everything seem like it was…
Out of order.
“Mark Iplier… Y/N… A time anomaly”
It all finally clicked. 
This had all happened before. Well, sort of. The events were different as far as she remembered. Thankfully it appeared she was no longer in a timeline riddled with the undead and raiders, and undead raiders. However… if she had already destroyed the anomaly before, all of the time-space issues should have been fixed, right? Unless, of course, her original hypothesis about all of this was wrong. Perhaps it wasn’t the box causing all the trouble; maybe it was Y/N and this Mark causing the trouble. The strangeness always seemed to be triggered by them, after all.
Too many of her questions were still left unanswered, which only served to fuel her curiosity and need to fix this once and for all. However, this time Rose was no longer at a complete loss, she now had a lead, which only meant one thing:
Off to Happy Trails Penitentiary.
________________________________________________________________
Blue and red hues colored the moonlight blanketing the courtyard, as the blaring call of ambulance sirens added to the usual chaos of the night. 4 am was too late for any of this, but crime never slept, and apparently neither did any of the criminals. Which in turn led to a very sleep-deprived and grouchy warden who now had to deal with one prisoner being punched through the wall and another pulling off a magic disappearing act.
Not to mention the holes. There was a giant hole in the floor of his office, several even larger holes in the cell walls, and another that was vaguely human-shaped and a little too disturbing to deal with at the moment. It felt like the place was built out of goddamn graham crackers.
At least the injured prisoner had been properly dealt with. Mark Iplier, or Asshole Mark as the other prisoners called him, had broken too many bones for the staff to handle, so he was sent off to a nearby hospital to recover. The warden didn’t mind; he had been causing too much trouble anyway, constantly asking for his personal belongings.
And speaking of the two’s personal belongings, the box they had arrived with just so happened to disappear along with Y/N, who was otherwise known as *Insert Ridiculous Prison Nickname Here*. He hadn’t even gotten a chance to open it yet, and now it was gone. That wasn’t the worst of it, however. Y/N was gone, or perhaps very good at hide and seek. He was hoping for the latter, but after several hours of guards and prisoners counting and searching, it didn’t seem very likely.
The warden was offended on all accounts by the newest prisoner’s conduct. There hadn’t been a breakout in years, and then suddenly they had waltzed in thinking they could just waltz back out like it was no big deal. Then again, he doubted the charade would last long. Most of the criminals of Happy Trails wouldn’t last a day out in the real world. They even sang a song about how they never wanted to leave. It would only be a matter of time until they came crawling back, and when they did, they were going to face all the wrath of Warden Dave Murderslaughter. They were going to get rehabilitated harder than they had ever been rehabilitated before, whether they liked it or not.
But for now he had to play the waiting game. Somehow, some way or another, he was going to take the reins once again. This was his penitentiary after all, and what kind of warden would he be if his jail wasn’t in proper order?
________________________________________________________________
At this hour the world seemed almost monochrome. It was an hour in which one should be snuggled up in bed, or in some cases, a jail cell, safe from the dangers of the world. It was certainly not an hour in which someone should be braving the summer night’s heat wandering through tall grass, with no companions other than the insects eating them alive.
Y/N had escaped, that much was a given, but that didn’t mean they felt free. Once everyone knew they had escaped, the hunt would be on. All they could think to do at the moment was carry on forward, but they knew they’d have to come up with a plan eventually. If only Mark were there with them… he was always the one to point out their options.
But now they were all alone, truly alone… Wow, when was the last time that happened? Of course they couldn’t remember; during all these adventures, memory never seemed like a necessity. With every bizarre scenario that came along, it was difficult enough to process the present as it was. Trying to analyze the past was a whole other feat in itself. As Yancy said, “The past ain’t the kind of thing to be trifled with.” It was the future they really needed to worry about right now.
They had to forge their way out of this mess somehow, but they couldn’t do it all by themselves. They needed to find Mark, or at least someone they could trust; they needed to make sure no one else would come looking for them; and they needed to learn the truth about the box they had gone through so much trouble to steal. As far as they knew it held a key, but deep down they knew there had to be more going on. Something that perhaps Mark wasn’t telling them about.
So now they had… something that resembled a plan. They were still unsure of the road ahead, but perhaps if they followed that plan… and didn’t deviate from it… everything might just turn out alright in the end after all.
________________________________________________________________
Thank you for reading, it’s much appreciated :) Future parts should have less POV switching, this is just mainly to set up where each character is at starting out. (Also please don’t quote me on the owl facts, I was just trying my best to make a dramatic metaphor) 
Tagging: @thatforgottenbasilisk @thecatchat @statictay @gay-spaghetti @captainsaltypear @chelseareferenced
Link
In a move inspired by natural engineering, robotics researchers have demonstrated how tiny palm-size drones can forcefully tug objects 40 times their own mass by anchoring themselves to the ground or to walls. It’s a glimpse into how small drones could more actively manipulate their environment in a way similar to that of humans or larger robots.
“Teams of these drones could work cooperatively to perform more complex manipulation tasks,” says Matt Estrada, a Ph.D. student in mechanical engineering at Stanford University. “We demonstrated opening a door, but this approach could be extended to turning a ball valve, moving a piece of debris, or retrieving an object of interest from a disaster zone.”
Winged creatures such as birds, bats, and insects can lift only objects that are about five times their own weight when flying. But Estrada and his colleagues from Stanford University and the École Polytechnique Fédérale de Lausanne, in Switzerland, looked instead to the practical approach taken by predatory wasps, which land on the ground to drag larger prey back to their nests. The group’s bioinspired approach to robotic experimentation is detailed in the latest issue of Science Robotics.
The “FlyCroTug” drones also represent an evolution for ground-based robots originally developed by David Christensen, a coauthor on the paper who is currently employed at Disney Research. By turning to a custom-built quadrotor drone design, the team created micro air vehicles that combine aerial mobility with greater pulling or pushing strength based on ground anchoring.
Each FlyCroTug drone has a specialized attachment at the end of a long cable that can be paid out and then pulled back in through a winch. That means the drones can attach one end of their cable to an object, fly off, land, and anchor themselves before hauling the heavy load toward them. What might normally be one small step at a time for wasps becomes one giant flying leap at a time for the drones, Estrada explained.
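To put the article’s numbers in perspective, here is a rough back-of-the-envelope sketch of the force gain from anchoring. The ~5× in-flight and ~40× anchored ratios come from the article; the 100 g drone mass is an assumed example value, not a figure from the paper.

```python
# Rough comparison of in-flight vs. anchored pulling capability for a
# palm-size drone, using the ratios quoted in the article (~5x body
# weight while hovering, ~40x body mass when anchored). The 100 g
# drone mass is an assumed illustration value.

G = 9.81          # gravitational acceleration, m/s^2
drone_mass = 0.1  # kg (assumed palm-size drone)

# While flying, winged animals and small drones can apply very roughly
# five times their own weight as an external force.
inflight_force = 5 * drone_mass * G

# Anchored to a surface, a FlyCroTug reportedly hauls ~40x its mass:
# the anchor (microspines or gecko adhesive) reacts the load, so the
# winch and anchor strength, not the rotors, set the limit.
anchored_force = 40 * drone_mass * G

print(f"In-flight pulling force: {inflight_force:.1f} N")
print(f"Anchored pulling force:  {anchored_force:.1f} N")
print(f"Gain from anchoring:     {anchored_force / inflight_force:.0f}x")
```

Under these assumed numbers, anchoring turns a roughly 5 N pull into a roughly 39 N pull, which is the whole point of the land-then-haul strategy.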
The anchoring mechanisms based on technologies from Stanford’s Biomimetics and Dexterous Manipulation Lab also took inspiration from natural design: microspines capable of attaching to rough stucco or concrete surfaces, and sticky, gecko-inspired adhesives for attaching to smooth glass.
Having tiny drones that can explore cramped spaces and still exert large forces upon their surroundings opens many new possibilities for search-and-rescue applications in either civilian or military scenarios. For example, Estrada suggested that such drones could be a portable tool for first responders or military personnel to deploy sensors or transport medical supplies to a person stuck in a remote location.
In one of the team’s experiments, a FlyCroTug drone clung to an overhang as it pulled up a battery-powered camera to perform inspections of a collapsed building site at a military training facility outside of Geneva, Switzerland.
A second door-opening scenario required teamwork between two FlyCroTug drones. The first drone grabbed the door handle with a special grappling attachment and then anchored itself to the smooth glass door. The second drone slipped a hook under the door and then latched onto the nearby carpet to pull the door open, once the handle had been turned.
As impressive as this all sounds, the FlyCroTug drones still face serious limitations. Their current battery life is sufficient for just five minutes of flight time, which severely limits what they can do. Complex and unknown environments could also require many versions of the drones with different attachments and anchoring mechanisms for various surfaces. But the latter may not be a problem if such flying robots could be made cheaply and deployed as swarms of disposable drones.
Researchers have not yet developed either sensing capabilities or artificial intelligence capabilities for such drones to operate even semi-independently, let alone in fully autonomous mode without human control. But Estrada believes that a teleoperation approach makes the most sense for near-future deployments of such technology.
“Humans can intuitively read a room and predict what surfaces might be suitable to attach onto and [find] feasible paths towards these locations,” Estrada says. “This could certainly be combined with some low-level autonomy for maneuvers such as holding a position or grappling a handle.”
lex-munro · 3 years
Text
[Suicide Squad Scrap] Princess pt 7
self-indulgent batjokes-flavored SS/BvS/JL, installment 7.  the rats who are close personal friends with this version of Ratcatcher are stronger, smarter, and hardier than the average rat, because reasons.  Mr. Freeze is very frustrating from a technical standpoint, because the only thing canon says is ‘sub-zero,’ which is a very large temperature range…but most cryogenic preservation takes place at -100 to -120 celsius, so we can at least make a fairly accurate guess about Nora’s temps.
the piece as a whole is rated Mature for pervasive language, varying degrees of violence, use of controlled substances, sexual references, questionable ethics, and themes of mental illness.  set from Flag’s POV, with references to Birds of Prey, but not compliant with The Suicide Squad.
***
The techs are still setting up shop when Flag and his team arrive in the early evening.
“We’ve already got the blueprints of the building,” the tech lead says as he passes over a tablet.  “Our stealth drone is on the way to do a flyby while we finish getting logistics unpacked.”
Flag nods as he does a quick once-over of Freeze’s current hideout.  “All right, kids, we’re just gonna relax right here until we’ve got a full grasp of the current situation, then we pass it to our lovely strategist.  Arcee, how cold can your little buddies get?”
“Colder than most rats,” she says with a shrug.  “Uh.  They can poke around in minus twenty or thirty Fahrenheit, but not for long.  Surfaces colder than forty below will give ‘em frostbite real fast.”
“Victor’s hired help will need it at least that warm,” Joker puts in.  “The main lab might be colder, but the building as a whole won’t.  Hard to keep that much airmass chilled, actually.”
“Makes sense,” the tech agrees.
Joker leans into Flag’s personal space to poke at the tablet.  “Main entrance, two alarmed fire exits.  Visual security’s gonna be tightest at the front door, almost non-existent closer to Mr Cold Miser, because he has a tendency to frost the lenses accidentally.  That, and he thinks he’s hot shit—well, cold shit.  Whatever.”
“Kicked the Bat’s ass more than once,” says Croc.
Slowly, Joker turns.
Croc may be three hundred pounds of muscle and fangs, but he damn near cowers from whatever look Joker gives him.  Lawton smacks Croc’s shoulder.
“Jay,” Flag calls.  “You were saying?  No visual near the lab?”
Joker tips his head back, takes a long breath, exhales in a low growl.  Then, in a snap, he’s smiling jovially at Flag.  “Right!  Hubris, you might say.  Laziness and lack of healthy paranoia, if ya ask me.  He’ll also have few if any guards around him.  I’ve told him before—it’d be soooo easy to just walk in, poison all his little cryo canisters, unplug the wife, and be gone before he realized he was choking on his own blood.”
“We want him alive.”
“You just have to crush my dreams, don’t you?”
“Major Flag?” the tech lead calls.  “Thermal’s up.”
Joker practically pounces, pale hands tight on the tech’s shoulders (the guy goes stiff as a board, but doesn’t say anything).  “Yes.  Yesyesyes,” he says, and starts to laugh.
“Fuck, I hate when he does that,” Digger mutters.
“What’s on your mind, pretty?” Lawton asks over the noise.
Joker finally calms himself and clears his throat.  “The coldest point in the building is minus one-twenty celsius, and that’ll be the wifey, but the entire room around it is negative thirty.  Our ice cube is in plainclothes.”
“Shit, son—let’s tranq his ass and shove him in a sack.”
“I count six guards,” Flag comments, eyes on the person-shaped hot spots.  “Two at each fire exit, two at the front security desk.  According to the security layout on file, the desk has remote access to the front door locks.  Known lobby cameras are here, here, and here.”
Joker claps his hands together with a little giggle.  “I need a jacket, a baseball cap, and at least a dozen helium balloons.”
“Classic,” Lawton says with a smirk.
“Old tricks are the best tricks, Gun-Bunny.  Be ready for the follow-up.”
Flag heaves a sigh but gestures to one of the recon team.  “You heard him.  A jacket, a baseball cap, and a dozen helium balloons.”
The guy looks at Flag with deep resentment.  “I’m a certified electrician with degrees in robotics and signal capture, and you’re sending me to Party City?”
“I’m a soldier for hire with over forty confirmed kills,” Flag retorts.  “You want me to make use of that?”
Mr. Party City slinks off.
“I’m such a bad influence on you, Boy Scout,” Joker trills happily, head tipped back to survey Flag over his shoulder (mostly-upside-down and entirely disconcerting).
“Y’all ain’t askin’ me to go in there, right?” Croc says dubiously.  “I don’t do cold.  I get sniffles anywhere under fifty degrees.”
Ratcatcher scoffs and smacks his thick chest.  “You’re not actually cold-blooded, and we all know it.  You’ll be fine, you big baby.  At least watch the door, or somethin’.  It’d take forever to quietly hit the main security feed on a building this size, but once Mr J gets the front door open, my buddies can take out the lobby cams.  Just lounge up front and try not to eat the guards after Deadbutt whacks ‘em.”
Half an hour later, Joker is wearing a hot pink letterman’s jacket and cap (Mr. Party City probably thought he was being vindictive, but the clown seems to like the color) and holding a fistful of mylar balloons with a mile-wide metal grin.
“This shit’s like givin’ a chainsaw to a six-year-old,” Ratcatcher decides.  “Like, you know it’s gonna be fucked up, but you’re not sure exactly how, and you kinda wanna duck ‘n cover, but you can’t look away…”
“Take out the external cam on the southeast corner,” Lawton tells her.  “We’ll stack there; your pals move in when the door opens, I take the guards out as soon as the balloons are in place, the rats kill the lobby cams before the balloons clear.”
“Butter,” she replies, elbow-bumping him in deference to the rats in her hands.
“What the—is that some kinda millennial thing?”
“What?  My hands’re full, and I’m too short to chest-bump you.”
“No, I meant—never mind.”
Flag closes his eyes and prays for strength.
.End.
report-ocean · 3 years
Text
Stratospheric UAV Payloads Technology Market Research Report with Opportunities and Strategies to Boost Growth: COVID-19 Impact and Recovery
Unmanned Aerial Vehicles (UAVs) are remotely piloted aerial vehicles that have significant roles in the defense as well as commercial sectors. UAVs are commonly termed “drones” and are increasingly used for border surveillance. They are also used in various commercial applications, including monitoring, surveying and mapping, precision agriculture, aerial remote sensing, and product delivery.
Based on the Stratospheric UAV Payloads Technology market’s development status, competitive landscape, and development models in different regions of the world, this report provides analysis of niche markets, potential risks, and comprehensive competitive strategy in different fields. The competitive advantages of the different types of products and services, along with the development opportunities, consumption characteristics, and structure of the downstream application fields, are all analyzed in detail. To support growth during the epidemic era, the report analyzes in detail the potential risks and opportunities that can be focused on.
Request A Free Sample - https://www.reportocean.com/industry-verticals/sample-request?report_id=mai197146

We share our perspectives on the impact of COVID-19 over the long and short term, cover the influence of the crisis on the industry chain, especially for marketing channels, and provide timely updates on each country’s industry economic revitalization plan.

Key players in the global Stratospheric UAV Payloads Technology market:
Boeing
Parrot
Global Near Space Services
Aeryon Labs
3D Robotics
Textron
Lockheed Martin Corporation
DJI
QinetiQ
Openstratosphere S.A.
General Atomics
Northrop Grumman
Arca Space Corporation

On the basis of types, the Stratospheric UAV Payloads Technology market from 2015 to 2025 is primarily split into:
Altitude Matters
Stratospheric UAVs: Aircraft vs. Airships
Jet Stream UAVs

On the basis of applications, the Stratospheric UAV Payloads Technology market from 2015 to 2025 covers:

Military
Scientific
Geographically, the report provides a detailed analysis of consumption, revenue, market share, and growth rate, both historic and forecast (2015-2025), for the following regions:
North America: United States, Canada, Mexico
Europe: Germany, UK, France, Italy, Spain, Russia, Others
Asia-Pacific: China, Japan, South Korea, Australia, India
South America: Brazil, Argentina, Colombia
Middle East and Africa: UAE, Egypt, South Africa

Years considered for this report:
Historical Years: 2015-2019
Base Year: 2019
Estimated Year: 2020
Forecast Period: 2020-2025
pink-ink-goblin · 7 years
Note
Darkmark or darkstache: "um... What did two lines mean again?"
((Double sorry mysterious being. One for taking so long to get to this. And two, for making you wait forever for me to essentially tell you no. I don’t do male pregnancy fics. At least, not with the ending you’re probably thinking of. No judgement from me on your proclivities, it’s just not something I like writing. That said, how about a disorganized, light series of probably completely unfunny events instead?))  
It had been a quiet day for the most part. No one was fighting, no one had died, and Dark hadn’t found a single squirrel darting around. It was a strange but very welcome sort of peace, so the demon took advantage of it by making himself scarce so he could not only enjoy it, but concentrate on the various more corporate aspects of maintaining a building full of unpredictable, and infinitely frustrating, beings as well. It was more than just watching over them after all. They did not reside there for free and silence was not a cheap item to buy, regardless of how much smooth talking there had been.
But that was honestly the easy part. The rest of the neatly stacked papers, however, were written requests from the more active egos submitted via form because Dark was done dealing with their whining face to face. The one in front of him currently was from their resident game show host, and Bim was requesting permission to expand the studio. He must be at odds with Wilford again if he was beseeching Dark about it.
However, despite enjoying the silence, he couldn’t ignore the strange fact that his main interruption had been absent all day, which made the quiet take on a more suspicious air. Still, Dark wasn’t concerned enough to go look for him and ruin his potentially quiet afternoon. Nothing was broken, nothing was flickering in and out of existence, and no one was screaming, so if it didn’t warrant world-ending intervention, he was happy to step back and let it be.
True to form, however, his blissful solitude wasn’t meant to last long, and, with the sound of a bubblegum pop, Wilford was in front of his desk, fingers already reaching out to fiddle with his pen stand as he often did when he needed to ask something. It was less a nervous habit and more a plain annoying one, but one that Dark had grown used to so long ago.
“Yes, Wilford?” Dark droned, not even bothering to look up. What were the legal repercussions of letting Host run his own Podcast? As long as it couldn’t be traced, then he could have at it. Approved.
“Um…” Wilford hesitated, seeming to be trying to find the proper words for his question oddly enough, before settling, as he usually did, upon being blunt. “What did two lines mean again?”
Dark’s pen paused in his writing as he considered the confusing nature of the words presented to him. He was more than certain Wilford was looking at him expectantly, the question of course making perfect sense to the being, who didn’t quite register that it might be puzzlingly vague to someone else. In the small stretch of silence, the pastel-themed being’s deft fingers had left the pen stand and were already reaching for his magnetic container of paperclips, but Dark reached over and snatched it away, still without looking up.
“That’s a very broad question,” The demon finally replied patiently, flipping a paper over and placing it neatly into another pile. “Why not ask Google? He’d be happy to list every single instance of significance that two lines can have in this dimension.”
“Because,” Wilford retorted somewhat petulantly, mostly at being denied optimum stimming material, before tossing something skinny and cream colored onto Dark’s desk that bounced to a stop right on top of his paperwork. “I’m asking you.”
It took longer than the demon would care to admit to recognize not just the stick, but the minimal information Wilford had provided with it, and when it clicked, it made him finally sit up in confusion.
“I can’t remember what the box said,” Wilford admitted, oblivious to Dark’s reaction. The pink ego had a habit of doing the same thing when he cooked, but instead of fishing the box out of the garbage with an air of defeat like a sane being, he would continue on stubbornly and then grumpily whine to Dark when everything went wrong. “Something about one line or two meaning something or other.”
“Wilford,” Dark said slowly, refusing to touch the offending thing with an air of disgust. “This is a pregnancy test.”
“So?” Wilford cocked an eyebrow at him, but Dark could see the man didn’t understand what Dark was implying. He couldn’t possibly actually be this oblivious.
“So you’re a male. Males don’t get pregnant. And, considering you are not a sea horse, I doubt you have anything to worry about. Once again, I implore you to ask Google for clarification. And also get this off my desk.”
“But what do the two lines mean?”
“Two lines usually means-” Positive… Wait, what? “Wilford, did you use this?”
“Yeah.”
“When?”
“A week ago. I forgot about it.” It took an incredible amount of willpower to keep Dark seated after that statement. Had Wilford just been sitting on that information for a week, or had he only just checked it now and thought to ask? Dark supposed it didn’t matter at this point, but it didn’t necessarily stop him from being not only irate, but also deeply concerned.
“And there’s no chance anyone else could have gotten a hold of it?”
“No, it was in my pen cup,” And with that, Dark made note to never touch anything on Wilford’s desk ever again. “Dark, what does it mean?”
Dark sighed heavily, fingers pressing into his temples as he prayed for patience. “… It means we need to have a chat with our good doctor.”
——
It was only natural that their resident doctor’s immediate reaction was to laugh. It was a short bark because the man valued his life, but it was still enough to have Dark only just resisting the urge to throttle him. The demon supposed that if their roles had been reversed, maybe he might have found humor in it as well, but as it stood, he was much too irate to consider it from any side other than his own, and he didn’t even want that perspective either.
They stood now near the door, Dark with his hands behind his back, trying to pretend nothing was wrong with anything he had just said, while the doctor stood across from him, hiding his smile rather poorly as he leaned a hip against the nearest hospital bed with his arms crossed. Wilford, naturally, had become quickly disinterested and wandered off in the moderate space allowed because he was no longer being directly referred to.
“Okay, disregarding Wilford,” The doctor started quietly, the last of the humor finally working its way out of his system. At least for now. “Surely at least you know how this is all physically impossible?”
Dark gave him a flat look. “Why do you think I came to you?”
“Wait, so you don’t know?” Dr. Iplier’s face fell at the prospect of having to give ‘the talk’ to the last two beings he would ever have expected to give it to.  
“Of course I know how it all works,” Dark hissed dangerously, something bleeding out into his voice to distort it in his sudden offence, before he took a calming breath and composed himself once more. “That’s the problem. It’s a logical fallacy with a single point of truth.”
Despite the outburst, Dr. Iplier took a rather relieved breath. Thank God. “Well, yeah, it is, but there are too many issues with the theory of ectopic male pregnancy for me to even begin to take that single truth with any modicum of seriousness. It’s just not possible.”
“I understand that,” Dark humored. “Believe me I do, but why then was the test positive?”
The doctor shrugged. “Faulty maybe? They aren’t really an exact science, especially in a commercial setting. Or, you know, there have been cases where males have jokingly used them only to receive a true positive due to having testicular cancer. But I can almost guarantee you that Wilford doesn’t fall under the standard definition of human male even remotely enough for that to be a possibility.” Dr. Iplier paused to sigh before relenting, “Honestly, maybe he is actually pregnant. Who knows what the hell Wilford actually could be.”
“I’ve known him long enough that I can assure you that Wilford is more or less designed like a male human, proclivities included,” Dark vouched, turning to watch distastefully as Wilford raided the doctor’s lolli cup. Dr. Iplier made a subtle face through his own side glance but otherwise let him go at it. This had come to be expected every time the being came in anyway. “That should mean he has no organs to accompany such a thing.”
Dr. Iplier wisely chose to ignore the idea of how Dark could even begin to know that. “And I would be inclined to absolutely agree with you, but with you extra-dimensionals, I’ve seen a lot of weird crap that throws normal right out the window. Have you tried making him take one again?”
“No,” Dark admitted, mood growing more sour by the second. “Because I know for a fact that he’s incapable… Maybe.” Dark rubbed at his face wearily. “Don’t you have a test of your own you could use? Perhaps take some blood?”
“I’ve plenty of cups he can pee in, but not a single machine or any chemical strips to test it with. That’s not my field.”
“You have lab equipment in the back room,” Dark stated, gesturing to the lone door next to the doctor’s corner desk. He even remembered helping Dr. Iplier acquire most of what was in there even if he wasn’t sure what half of it did.
“Yes, for trauma. I treat anything from superficial injuries to life threatening wounds, not deliver babies and happy news.” The doctor replied with equal flatness. “With maybe a minor degree in pathology. Go find an OB-GYN if you’re that insistent.”
Dark was tempted to remind the doctor of his revoked license purely out of spite. “Very well. Could you at least look at the brand and tell me if it’s trustworthy?”
Dr. Iplier shrugged again, looking like he wanted to reiterate what he had just said, but instead settled on a simple, “I can do my best.”
“Wilford, come here,” Dark commanded. Wilford looked up from the mess he had made on Dr. Iplier’s desk - some kind of paper fort built of pens and paperclips that had no business being able to maintain structural integrity given the current physical plane they were on - and wandered over obediently, two suckers in his mouth, three in his shirt pocket, and, when he got close enough, one held out to Dark jovially. Dark plucked it from his fingers and placed it in his own breast pocket to later add to his collection of stolen lollipops in his desk drawer. “Give the Doctor the stick.”
Wilford fished it out of God knew where and handed it over, mouth too preoccupied with the sugary treats to speak. Dr. Iplier took it without the air of disgust Dark had given and, after a good moment of scrutinizing, an inappropriately humorous smile began to spread across his face.
“What?” Dark asked suspiciously.
“This brand’s pretty trustworthy.”
Dark’s eyes went wide with sudden concern, voice almost cracking from the sudden tightness in his throat. “Jesus Christ, you’re joking.”
“Not a bit,” The doctor responded cheerily as he was wont to do when delivering bad news. “But, see this?”
“Yes, that’s the second line.” Dark confirmed, unsure what he was getting at. The whole thing was a little faded, given Wilford had left it alone, but… Wait. “Why isn’t it the same color as the first one?”
“Exactly. The color’s off because… it was originally negative. This is what happens when you let them sit out too long after using them. They give a false positive. Also why you should probably follow the directions on the box.” Dr. Iplier quipped in quick tones, turning to toss the stick into a nearby trashcan. “Tough luck. Looks like you’re both doomed to a childless future.”
Dark could feel it on his tongue, the expletive that wanted to explode out of him and eviscerate Wilford where he merrily stood, but he reigned it in with a slow deep breath, swallowing a good portion of his irritation in the process. He should honestly feel relieved, so that’s what he decided to cling to. After all, this was probably the most harmless thing Wilford’s carelessness had ever done, emotional wear aside, and considering past exploits, Dark should be counting his lucky stars that Wilford hadn’t had to have come into the clinic with anyone else.
Maybe the man was sterile. Dark could really only hope. A quiet cough brought Dark back to earth and face to face with the rather mischievous smile of the doctor with something else on his mind.
“What?” Dark humored tonelessly.
“At the risk of being eviscerated,” Dr. Iplier said slowly, taking a few steps back to ensure he was outside of Dark’s immediate reach. “You two make a horrifying and cute couple.”
“… Run. Now,” Dark watched the doctor flee from his clinic, coat flapping behind him while the threat did nothing to remove that smug grin from his face. He’d be back later when he was sure both of them were gone. Dark also knew he wouldn’t have to worry about the doctor sharing this, for if there was one thing the man was not, it was a gossip. All the same, it still wore on him greatly that someone else knew of this draining experience. What an afternoon.
A hand fell on his shoulder, warm and heavy despite his aura and he looked over his shoulder to see the source of many of his daily irritations smiling at him, having finished the two suckers, but not yet spitting out the sticks. Dark sighed, about ready to ask why Wilford had even thought to buy one of those damn pregnancy tests in the first place, when, mid-turn, his elbow bumped something that made him freeze. Something very round and yellow.
And distinctly attached to Wilford’s abdomen.
Dark jumped back like a scared cat, thrusting himself out of Wilford’s grip and stumbling back in absolute shock and horror. He was about ready to freeze up or bolt when Wilford started laughing. The sudden flip to confusion was enough to ground the demon and make him pause to take a closer look, now realizing he could see something white and cloth-like poking out from between Wilford’s shirt buttons.
“Gotcha,” Wilford chuckled, patting the top of his faux-stomach hard enough to elicit dull, rustling cloth sounding thumps.
“Get that out of your shirt,” Dark demanded sourly, giving Wilford the harshest of looks while the being pulled the bed sheet out and unceremoniously threw the rumpled ball onto the nearest bed. He turned away and started walking out, Wilford trotting to catch up unprompted as Dark always expected him to. “What possessed you to buy one of those damn things anyway? Was this some kind of test?”
“I dunno, did I pass?” Wilford answered cryptically, and when Dark went to give him another beseeching look, he was met with Wilford grinning at him, lolli sticks stuck in his upper lip to look like tusks. Whether the effect was intentional or not, Dark suddenly found his mouth unwittingly pulling at the corners despite it all. A laugh, small and quiet as it was, even managed to sneak its way past his lips.
It was official. The ridiculousness of everything had finally hit him. He couldn’t even be mad anymore, so he just accepted that he would probably never know. Wilford was an enigma, even to himself, so it was always better to just let it go.
Dark reached out and looped his arm into Wilford’s as they made their way to the elevator, the pink ego’s grin turning smug with victory as they locked elbows.
“You know what?” Dark said, pressing the button for the top floor. He looked at the being, tilting his head as his own smile turned amused. “Why not?”
magzoso-tech · 5 years
New Post has been published on https://magzoso.com/tech/leading-robotics-vcs-talk-about-where-theyre-investing/
Leading robotics VCs talk about where they’re investing
The Valley’s affinity for robotics shows no signs of cooling. Technical enhancements through innovations like AI/ML, compute power and big data utilization continue to drive new performance milestones, efficiencies and use cases.
Despite the old saying, “hardware is hard,” investment in the robotics space continues to expand. Money is pouring in across robotics’ billion-dollar sub verticals, including industrial and labor automation, drone delivery, machine vision and a wide range of others.
According to data from Pitchbook and Crunchbase, 2018 saw new highs for the number of venture deals and total invested capital in the space, with roughly $5 billion in investment coming from nearly 400 deals. With robotics well on its way to setting new investment records again in 2019, we asked 13 leading VCs who work at firms spanning early to growth stages to share what’s exciting them most and where they see opportunity in the sector:
Shahin Farshchi, Lux Capital
Kelly Chen, DCVC
Rob Coneybeer, Shasta Ventures
Aaron Jacobson, NEA
Eric Migicovsky, Y Combinator
Helen Liang, FoundersX Ventures
Andrew Byrnes, Micron Ventures
Ludovic Copéré, Sony Innovation Fund
Costantino Mariella, Sony Innovation Fund
Cyril Ebersweiler, SOSV & HAX
Peter Barrett, Playground Global
Bruce Leak, Playground Global
Jim Adler, Toyota AI Ventures
Participants discuss the compelling business models for robotics startups (such as “Robots as a Service”), current valuations, growth tactics and key robotics KPIs, while also diving into key trends in industrial automation, human replacement, transportation, climate change, and the evolving regulatory environment.
Shahin Farshchi, Lux Capital
Which trends in robotics are you most excited about from an investing perspective?
The opportunity to unlock human superpowers:
Increase productivity to enhance creativity leading to new products and businesses.
Automating dangerous tasks and eliminating undesirable, dangerous jobs in mining, manufacturing, and shipping/logistics.
Making the deadliest mode of transport, driving, 100% safe.
How much time are you spending on robotics right now? Is the market under-heated, overheated, or just right?
Three-quarters of the new opportunities I look at involve some sort of automation.
The market for robot startups attempting direct human labor replacement, floor-sweeping, and dumb-waiter robots, and robotic lawnmowers and vacuums is OVER heated (too many startups).
The market for robot startups that assist human workers, increase human productivity, and automate undesirable human tasks is UNDER heated (not enough startups).
Are there startups that you wish you would see in the industry but don’t? Plus any other thoughts you want to share with TechCrunch readers.
I want to see more founders that are building robotics startups that:
Solve LATENT pain points in specific, well-understood industries (vs. building a cool robot that can do cool things).
Focus on increasing HUMAN productivity (vs. trying to replace humans).
Are solving for building interesting BUSINESSES (vs. emphasizing cool robots).
Kelly Chen, DCVC
Three years ago, the most compelling companies to us in the industrial space were in software. We now spend significantly more time in verticalized AI and hardware. Robotic companies we find most exciting today are addressing key driver areas of (1) high labor turnover and shortage and (2) new research around generalization on the software side. For many years, we have seen some pretty impressive science projects out of labs, but once you take these into the real world, they fail. In these changing environmental conditions, it’s crucial that robots work effectively in-the-wild at speeds and economics that make sense. This is an extremely difficult combination of problems, and we’re now finally seeing it happen. A few verticals we believe will experience a significant overhaul in the next 5 years include logistics, waste, micro-fulfillment, and construction.
With this shift in robotic capability, we’re also seeing a shift in customer sentiment. Companies that are used to buying machines outright are now more willing to explore RaaS (Robot as a Service) models for compelling robotic solutions – and that repeat-revenue model has opened the door for some formerly enterprise-software-only investors. On the other hand, companies exploring robotics in place of tasks with high labor shortages, such as trucking or agriculture, are more willing to explore per-hour or per-unit-pick models.
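The buy-versus-subscribe trade-off behind these RaaS models comes down to a simple breakeven calculation. The sketch below is purely illustrative; the function name and figures are invented for the example and are not drawn from any company mentioned here:

```python
def raas_breakeven_months(purchase_price: float, monthly_fee: float) -> float:
    """Months of subscription fees before buying the robot outright
    would have been cheaper (ignoring maintenance and financing)."""
    if monthly_fee <= 0:
        raise ValueError("monthly_fee must be positive")
    return purchase_price / monthly_fee

# A hypothetical $120k machine offered at $5k/month breaks even at 24 months;
# shorter expected deployments favor RaaS, longer ones favor purchase.
months = raas_breakeven_months(120_000, 5_000)
```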
Adoption won’t be overnight, but in the medium term, we are very enthusiastic about the ways robotics will transform industries. We do believe investing in this space requires the right technical know-how and network to evaluate and support companies, so momentum investors looking to dip their hand into a hot space may be disappointed.
Rob Coneybeer, Shasta Ventures
We’re entering the early stages of the golden age of robotics. Robotics is already a huge, multibillion-dollar market – but today that market is dominated by industrial robotics, such as welding and assembly robots found on automotive assembly lines around the world. These robots repeat basic tasks, over and over, and are usually separated by caged walls from humans for safety. However, this is rapidly changing. Advances in perception, driven by deep learning, machine vision and inexpensive, high-performance cameras allow robots to safely navigate the real world, escape the manufacturing cages, and closely interact with humans.
I think the biggest opportunities in robotics are those which attack enormous markets where it’s difficult to hire and retain labor. One great example is long-haul trucking. Highway driving represents one of the easiest problems for autonomous vehicles, since the lanes tend to be well-marked, the roads have gentle curves, and all traffic runs in the same direction. In the United States alone, long haul trucking is a multi-hundred billion dollar market every year. The customer set is remarkably scalable with standard trailer sizes and requirements for shipping freight. Yet at the same time, trucking companies have trouble hiring and retaining drivers. It’s the perfect recipe for robotic opportunity.
I’m intrigued by agricultural robots. I’ve seen dozens of companies attacking every part of the farming equation – from field clearing and preparation, to seeding, to weeding, applying fertilizer, and eventually harvesting. I think there’s a lot of value to be “harvested” here by robots, especially since seasonal field labor is becoming harder to find and increasingly expensive. One enormous challenge in this market, however, is that growing seasons mean that the robotic machinery has a lot of downtime and the cost of equipment isn’t as easily amortized in other markets with higher utilization. The other big challenge is that fields are very, very tough on hardware and electronics due to environmental conditions like rain, dust and mud.
There are a ton of important problems to be solved in robotics. The biggest open challenges in my mind are locomotion and grasping. Specifically, I think that for in-building applications, robots need to be able to do all the things humans can do – opening and closing doors, climbing stairs, and picking items off of shelves and putting them down gently. Plenty of startups have tackled subsets of these problems, but to date no one has built a generalized solution. To be fair, to get to parity with humans on generalized locomotion and grasping, it’s probably going to take another several decades.
Overall, I feel like the funding environment for robotics is about right, with a handful of overfunded areas (like autonomous passenger vehicles). I think that the most overlooked near-term opportunity in robotics is teleoperation. Specifically, pairing fully automated robotic operations with occasional human remote operation of individual robots. Starship Technologies is a perfect example of this. Starship is actively deploying local delivery robots around the world today. Their first major deployment is at George Mason University in Virginia. They have nearly 50 active robots delivering food around the campus. They’re autonomous most of the time, but when they encounter a problem or obstacle they can’t solve, a human operator in a teleoperation center manually controls the robot remotely. At the same time, Starship tracks and prioritizes these problems for engineers to solve, and incrementally reduces the number of problems the robots can’t solve on their own. I think people view robotics as a “zero or one” solution when in fact there’s a world where humans and robots work together for a long time.
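The autonomy-with-human-fallback pattern described above can be sketched in a few lines of Python. This is an illustrative skeleton, not Starship’s actual control stack; every name in it is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Observation:
    blocked: bool       # True when the robot cannot resolve the situation itself
    description: str

# Problems queued for engineers, so human takeovers shrink over time.
unresolved_log: list[str] = []

def autonomous_policy(obs: Observation) -> str:
    # Normal case: the robot plans and executes on its own.
    return "proceed"

def teleoperate(obs: Observation) -> str:
    # Escalate to a human in the teleoperation center and record the
    # failure case so it can eventually be automated away.
    unresolved_log.append(obs.description)
    return "human_takeover"

def step(obs: Observation) -> str:
    """Autonomous by default; a human takes over only on failure."""
    return teleoperate(obs) if obs.blocked else autonomous_policy(obs)
```

The key design point is the log: each takeover is also a labeled training signal, which is how the fleet stops being a “zero or one” proposition.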
bluemagic-girl · 5 years
Iran official tweets at Trump after apparent rocket failure
TEHRAN, Iran (AP) — An Iranian official released a picture Saturday of a satellite after an apparent rocket explosion at the space center intended to launch it, tweeting at President Donald Trump after the American leader shared online what appeared to be a surveillance photo of the aftermath.

The tweet from Iran’s Information and Communications Technology Minister Mohammad Javad Azari Jahromi, together with a selfie of him in front of the Nahid-1, comes as Tehran has yet to acknowledge Thursday’s explosion at the Imam Khomeini Space Center.

While specifics about the incident remain unclear, it marked the third failure involving a launch at the center, which has raised suspicions of sabotage in Iran’s space program. The U.S. has criticized the initiative as a way for Tehran to advance its ballistic missiles.

Trump directly acknowledged those suspicions in his tweet Friday and denied any U.S. involvement.

“The United States of America was not involved in the catastrophic accident during final launch preparations for the Safir SLV Launch at Semnan Launch Site One in Iran,” Trump wrote, identifying the rocket used. “I wish Iran best wishes and good luck in determining what happened at Site One.”

Jahromi, a rising politician in Iran’s Shiite theocracy, replied to Trump in his tweet early Saturday.
“Me & Nahid I right now, Good Morning Donald Trump!” he wrote in English.
Later, Jahromi accompanied local reporters to a Tehran-based space research center, showing them the satellite. His efforts appeared aimed at showing both that his ministry had done its work in preparing the satellite and that it wasn’t responsible for anything related to the rockets or the launch after the apparent explosion.

“I have no idea about the Semnan space center and the defense minister, who is in charge, should make a comment on this,” Jahromi said, according to the state-run IRNA news agency.

He went on to challenge the U.S. to show pictures of where the American military surveillance drone Iran shot down in June crashed, seeking to suggest Washington was being dishonest in its actions. Iran says the drone was in its airspace, something the U.S. military denies.

Commercially available satellite images from Planet Labs Inc. and Maxar Technologies showed a black plume of smoke rising above a launch pad Thursday, with what appeared to be the charred remains of a rocket and its launch stand. In previous days, satellite images had shown that officials there had repainted the launch pad blue.
The photograph released Friday by Trump appeared to be a once-classified surveillance image from American intelligence agencies. Analysts said the black rectangle in the photo’s upper-left-hand corner likely covered its classification markings. Trump, as president, can declassify material.

The image showed damaged vehicles around the launch pad, as well as damage done to the rocket’s launcher. It also clearly showed a large phrase written in Farsi on the pad: “National Product, National Power.”

Jahromi told The Associated Press in July that Tehran planned three satellite launches this year, two for satellites that do remote-sensing work and another that handles communications.

The Nahid-1 is reportedly the telecommunications satellite. Nahid in Farsi means “Venus.” The satellite, which had Iran’s first foldable solar panels, was meant to stay in a low orbit around the Earth for some two and a half months.

The semi-official Mehr news agency quoted Jahromi on Aug. 13 as saying that the Nahid-1 was ready to be delivered to Iran’s Defense Ministry, signaling that a launch date likely loomed. Iran’s National Week of Government, during which Tehran often inaugurates new projects, began Aug. 24.

The apparent failed rocket launch comes after two failed satellite launches, of the Payam and Doosti, in January and February. A separate fire at the Imam Khomeini Space Center in February also killed three researchers, authorities said at the time.

Over the past decade, Iran has sent several short-lived satellites into orbit and in 2013 launched a monkey into space.

The U.S. alleges such satellite launches defy a U.N. Security Council resolution calling on Iran to undertake no activity related to ballistic missiles capable of delivering nuclear weapons.

Iran, which has long said it does not seek nuclear weapons, maintains its satellite launches and rocket tests do not have a military component. Tehran also says it hasn’t violated the U.N. resolution because it only “called upon” Tehran not to conduct such tests.

The tests have taken on new significance to the U.S. amid the maximalist approach to Iran taken by President Donald Trump’s administration. Tensions have been high between the countries since Trump unilaterally withdrew the U.S. from Iran’s nuclear deal over a year ago and imposed sanctions, including on Iran’s oil industry. Iran recently has begun to breach the accord itself while trying to push Europe to help it sell oil abroad.
___
Gambrell reported from Dubai, United Arab Emirates.
forkadelphia · 5 years
In the Penn Engineering Research and Collaboration Hub, there is a wide-open space with high ceilings and a padded floor. All around it are aisles of soldering equipment, propped-up prototypes, and metal parts of many shapes and sizes. Nestled on the third floor of the looming Pennovation Center building in the Grays Ferry neighborhood, it’s the perfect venue for robotics research.
This is where Saldaña, a member of the General Robotics, Automation, Sensing & Perception (GRASP) Laboratory, and his collaborators in the School of Engineering and Applied Science perform test flights with some of his robots. Although the designs vary in size, the newest square prototype is about the size of a shoebox. Each can be remote controlled like most drones, but when he activates several of them at once, they autonomously come together in the air.
raystart · 4 years
Technology, Innovation and Modern War – Class 9 – Autonomy – Maynard Holliday
We just held our ninth session of our new national security class Technology, Innovation and Modern War. Joe Felter, Raj Shah and I designed a class to examine the new military systems, operational concepts and doctrines that will emerge from 21st century technologies – Space, Cyber, AI & Machine Learning and Autonomy.
Today’s topic was Autonomy and Modern War.
Catch up with the class by reading our summaries of the previous eight classes here.
Some of the readings for this class session included Directive 3000.09: Autonomy in Weapons Systems, U.S. Policy on Lethal Autonomous Weapon Systems, International Discussions Concerning Lethal Autonomous Weapon Systems, Joint All Domain Command and Control (JADC2), A New Joint Doctrine for an Era of Multi-Domain Operations,  Six Ways the U.S. Isn’t Ready for Wars of the Future.
Autonomy and The Department of Defense

Our last two class sessions focused on AI and the Joint Artificial Intelligence Center (the JAIC), the DoD’s organization chartered to insert AI across the entire Department of Defense. In this class session Maynard Holliday of RAND describes the potential of autonomy in the DoD.
Maynard was the Senior Technical Advisor to the Undersecretary of Defense for Acquisition, Technology and Logistics during the previous Administration. There he provided the Secretary technical and programmatic analysis and advice on R&D, acquisition, and sustainment. He led analyses of commercial Independent Research and Development (IRAD) programs and helped establish the Department’s Defense Innovation Unit. And relevant to today’s class, he was the senior government advisor to the Defense Science Board’s 2015 Summer Study on Autonomy.
Today’s class session was helpful in differentiating between AI, robotics, autonomy and remotely operated systems. (Today, while drones are unmanned systems, they are not autonomous. They are remotely piloted/operated.)
I’ve extracted and paraphrased a few of Maynard’s key insights from his work on the Defense Science Board Autonomy study, and I urge you to read the entire transcript here and watch the video.
Autonomy Defined

There are a lot of definitions of autonomy. However, the best definition came from the Defense Science Board: to be autonomous, a system must have the capability to independently compose and select among different courses of action to accomplish goals based on its knowledge and understanding of the world, itself, and the situation. They offered that there were two types of autonomy:
Autonomy at Rest – systems that operate virtually, in software, and include planning and expert advisory systems. For example, in Cyber, where you have to react at machine speed
Autonomy in Motion – systems that have a presence in the physical world. These include robotics and autonomous vehicles, missiles and other kinetic effects
A few definitions:
AI are computer systems that can perform tasks that normally require human intelligence – sense, plan, adapt, and act, including the ability to automate vision, speech, decision-making, swarming, etc. – Provides the intelligence for Autonomy.
Robotics provides kinetic movement with sensors, actuators, etc., for Autonomy in Motion.
Intelligent systems combine both Autonomy at Rest and Motion with the application of AI to a particular problem or domain.
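As a rough illustration of the Defense Science Board definition (a system that independently composes and selects among courses of action), here is a toy Python sketch. The world-state keys, goals, and scoring are invented purely for the example; a real system would use planning or learned models:

```python
def compose_actions(world_state: dict) -> list[str]:
    # Compose candidate courses of action from the system's picture of the world.
    actions = ["hold"]
    if world_state.get("path_clear"):
        actions.append("advance")
    if world_state.get("threat_detected"):
        actions.append("evade")
    return actions

def score(action: str, goals: list[str], world_state: dict) -> float:
    # Toy utility: survival dominates when a threat is present,
    # otherwise making progress is preferred over holding.
    if world_state.get("threat_detected") and "survive" in goals:
        return 1.0 if action == "evade" else 0.0
    return 1.0 if action == "advance" else 0.1

def select_course_of_action(world_state: dict, goals: list[str]) -> str:
    """Compose candidates, then select among them based on goals and
    the system's understanding of the situation."""
    candidates = compose_actions(world_state)
    return max(candidates, key=lambda a: score(a, goals, world_state))
```

The point of the sketch is the structure, not the scoring: the "compose, then select" loop is what separates autonomy from a pre-programmed or remotely operated system.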
Why Does DoD Need Autonomy? Autonomy on the Battlefield

Over the last decade, the DoD has adopted robotics and unmanned vehicle systems, but almost all are “dumb” – pre-programmed or remotely operated – rather than autonomous. Autonomous weapons and weapons platforms – aircraft, missiles, unmanned aerial systems (UAS), unmanned ground systems (UGS) and unmanned underwater systems (UUS) – are the obvious applications.
Below is an illustration of a concept of operations of a battle space. You can think of this as the Taiwan Straits, or near the Korean Peninsula.
On the left you have a joint force: a carrier battle group, AWACS aircraft, satellite communications. On the right, aggressor forces in the orange bubbles are employing cyber threats, dynamic threats, denied GPS and comms (things we already see in the battlespace today).
Another example: Adversaries have developed sophisticated anti-access/area denial (A2/AD) capabilities. In some of these environments human reaction time may be insufficient for survival.
Autonomy can increase the speed and accuracy of decision-making. Using Autonomy at Rest (cyber, electronic warfare,) as well as Autonomy in Motion, (drones, kinetic effects,) you can move faster than your adversaries can respond.
Autonomy Creates New Tactics in the Physical and Cyber Domains

The combatant commanders asked the Science Board to assess how autonomy could improve their operations. The diagram below illustrates where autonomy is most valuable. For example, in row one, on the left, you don’t need autonomy when the required decision speed is low. But as the required decision speed, complexity, volume of data and danger increase, the value of autonomy goes up. In the right column you see examples of where autonomy provides value.
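The diagram’s intuition, that autonomy’s value rises with required decision speed, complexity, data volume, and danger, can be captured in a toy scoring function. The equal weighting below is an assumption made purely for illustration, not anything prescribed by the Defense Science Board:

```python
def autonomy_value(decision_speed: float, complexity: float,
                   data_volume: float, danger: float) -> float:
    """Rough 0-to-1 score of how much value autonomy adds to a mission,
    where each input rates a factor from 0 (low) to 1 (high)."""
    for v in (decision_speed, complexity, data_volume, danger):
        if not 0.0 <= v <= 1.0:
            raise ValueError("factors must be rated between 0 and 1")
    return (decision_speed + complexity + data_volume + danger) / 4.0
```

A slow, simple, low-risk task scores near 0 (autonomy adds little), while a cyber exchange at machine speed with massive data volume scores near 1.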
The Defense Science Board studied several example scenarios.
Some of these recommendations were invested in immediately. One was the DARPA OFFSET (Offensive Swarm Enabled Tactics) program run by Tim Chung. He holds the record for holding a hundred swarms. And he took his expertise to DARPA to run a swarm challenge. Another DARPA investment was the Cyber Grand Challenge, to seed-fund systems able to search big data for indicators of Weapons of Mass Destruction (WMD) proliferation.
Can You Trust an Autonomous System?

A question that gets asked by commanders and non-combatants alike is, “Can you trust an autonomous system?” The autonomy study specifically identified the issue of trust as core to the department’s success in broader adoption of autonomy. Trust is established through the design and testing of an autonomous system and is essential to its effective operation. If troops in the field can’t trust that a system will operate as intended, they will not employ it. Operators must know that if a variation in operations occurs or the system fails in any way, it will respond appropriately or can be placed under human control.
DOD order 3000.09 says that a human has to be at the end of the kill chain for any autonomous system now.
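A minimal sketch of what such a human-at-the-end-of-the-kill-chain gate might look like in software follows. The function and target names are hypothetical and the real policy machinery is vastly more involved; the point is only that the machine may recommend, but never fires without an affirmative human decision:

```python
def authorize_engagement(target: str, human_approved: bool) -> str:
    """Gate lethal action on an explicit human decision, per the intent
    of Directive 3000.09: no engagement without human authorization."""
    if not human_approved:
        return f"HOLD: {target} flagged, awaiting human authorization"
    return f"ENGAGE: {target}"
```

Designing the gate as a hard precondition, rather than an override a human must race to apply, is what keeps the human "at the end" of the chain rather than merely "near" it.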
Postscript – Autonomy on the Move

A lot has happened since the 2015 Defense Science Board autonomy study. In 2018 the DoD stood up a dedicated group – the JAIC, the Joint Artificial Intelligence Center (which we talked about in the last two classes here and here) – to insert AI across the DoD.
After the wave of inflated expectations, it has become clear that deploying completely autonomous systems to handle complex, unbounded problems is much harder than originally thought. (A proxy for this enthusiasm versus reality can be seen in the hype versus delivery of fully autonomous cars.)
That said, all U.S. military services are working to incorporate AI into semiautonomous and autonomous vehicles into what the Defense Science Board called Autonomy in Motion. This means adding autonomy to fighters, drones, ground vehicles, and ships. The goal is to use AI to sense the environment, recognize obstacles, fuse sensor data, plan navigation, and communicate with other vehicles. All the services have built prototype systems in their R&D organizations though none have been deployed operationally.
A few examples: the Air Force Research Lab has its Loyal Wingman and Skyborg programs, and DARPA built swarm drones and ground systems in its OFFensive Swarm-Enabled Tactics (OFFSET) program.
The Navy is building Large and Medium Unmanned Surface Vessels based on development work done by the Strategic Capabilities Office (SCO). It’s called Ghost Fleet, and its Large Unmanned Surface Vessels development effort is called Overlord.
DARPA completed testing of the Anti-Submarine Warfare Continuous Trail Unmanned Vessel prototype, or “Sea Hunter,” in early 2018. The Navy is testing Unmanned Ships in the NOMARS (No Manning Required Ship) Program.
Future conflicts will require decisions to be made within minutes or seconds – in some cases autonomously – compared with the current multiday process to analyze the operating environment and issue commands. An example of Autonomy at Rest is tying all the sensors from all the military services together into a single network, which will be JADC2 (Joint All-Domain Command and Control). (The Air Force version is called ABMS, the Advanced Battle Management System.)
The history of warfare has shown that as new technologies become available as weapons, they are first used like their predecessors. But ultimately the winners on the battlefield are the ones who develop new doctrine and new concepts of operations. The question is, which nation will be first to develop the Autonomous winning concepts of operation? Our challenge will be to rapidly test these in war games, simulations, and in experiments. Then take that feedback and learnings to iterate and refine the systems and concepts.
Finally, in the back of everyone’s mind is that while DOD order 3000.09 prescribes what machines will be allowed to do on their own, what happens when we encounter adversaries who employ autonomous weapons that don’t have our rules of engagement?
Read the entire transcript of Maynard Holliday’s talk here and watch the video below.
Lessons Learned
Autonomy at Rest – systems that operate virtually, in software, and include planning and expert advisory systems
For example, Cyber, battle networks, anywhere you must react at machine speed
Autonomy in Motion – systems that have a presence in the physical world
Includes robotics and autonomous vehicles, missiles, drones
AI provides the intelligence for autonomy
Sense, plan, adapt, and act
Robotics provides the kinetic movement for autonomy
Sensors, actuators, UAV, USVs, etc.
Deploying completely autonomous systems to handle complex, unbounded problems is much harder than originally thought
All U.S. military services are working to incorporate AI into semiautonomous and autonomous vehicles and networks
Ultimately the winners on the battlefield will be those who develop new doctrines and new concepts of operations
We’re seeing this emerge on battlefields today
hudsonespie · 4 years
Breakthrough Laser Sensor Technology for Maritime & Offshore Industry
Research scientist and entrepreneur Sverre Dokken believes laser-based remote sensing has big potential in the maritime domain. 
“Our mission is to be the world leader in innovative maritime sensor products that enhance the safety and security of shipping and all manner of offshore activity,” says Ladar Ltd (LDR) principal Sverre Dokken.

In the sensor vanguard
Essentially a laser-based navigational aid, LADAR (Laser Detection and Ranging) combines long distance object detection with high-accuracy measurement, giving users a full 2D/3D/4D (3D plus time) perspective for optimal maritime awareness. The laser pulse scans a specific area or target with over 100 readings per second. Its water-penetrating capabilities enable very high-resolution detection of objects in the surface layer up to approximately one nautical mile (1.85km) distant and up to 10m deep in ideal conditions. “Objects” can be anything from a person, floating container, icebergs, whales, or small craft to environmental factors such as waves or pollution.
“The system’s proven capability to detect, characterize, classify and track various surface-layer objects in real-time make it suitable for a wide variety of applications,” says Dokken.
The modular design incorporates unique technologies such as laser diodes together with optical camera, gyros, optional AIS, and/or radar and sonar feeds to produce a comprehensive analysis of the ocean surface layer ahead of a vessel.
“The system overlaps many existing ship radar functions with added benefits, high-speed operation and no latency,” says Dokken. It can be configured to different light bandwidths as required.
LADAR outperforms
LADAR outperforms both radar and sonar through its ability to detect both smaller and larger objects in the surface layer. Sub-metre resolution at close and long range, and 1,000-times better resolution in azimuth and elevation than either radar or sonar, also enable detection of very small objects. The system is independent of speed, meaning it can be used on high-speed vessels, and it can also be mounted on any kind of stationary platform.
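The ranging arithmetic behind a pulsed laser system like this is simple time-of-flight geometry. The sketch below is illustrative only and is not LDR code; the seawater refractive index is taken as a rough constant, and real systems must also account for beam attenuation and surface refraction:

```python
C_VACUUM = 299_792_458.0   # speed of light in vacuum, m/s
N_SEAWATER = 1.33          # approximate refractive index of seawater

def range_from_tof(round_trip_seconds: float) -> float:
    """One-way range to a surface object from a pulse's round-trip time:
    the pulse travels out and back, so divide the path by two."""
    return C_VACUUM * round_trip_seconds / 2.0

def depth_from_tof(in_water_round_trip: float) -> float:
    """Depth of a submerged object from the in-water leg of the round trip;
    light travels slower in water by the refractive index."""
    return (C_VACUUM / N_SEAWATER) * in_water_round_trip / 2.0
```

At the quoted one-nautical-mile range the round trip takes about 12 microseconds, which is why the system can take over 100 readings per second with time to spare.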
Data is visualized on an intuitive, customizable graphical user interface (GUI) enabling seamless transition from above-surface, through-surface and below-surface observations. Machine learning helps to continuously improve detection and classification capabilities. Users can also experience the “live” environment using Virtual and Augmented Reality (VR/AR).
Data feedback can optionally activate functions such as safe re-routing around navigational hazards, wave spectrum and ocean current observations, charting of marine plastic pollution, uncharted reefs, shoals, moving sand-banks, and so on.
LADAR uses narrow laser beam scans, providing a full 3D perspective
Highly versatile
“All this makes LADAR the only cost-efficient sub-surface tool that can be tailored to many maritime and offshore applications,” Dokken says.
He believes the system has many immediate benefits. “Our research indicates it can increase safety with a potential 50% reduction in navigational risk. That means fewer dry-dock visits due to accidents and collisions, reduced costs and extended vessel lifetime,” he says.
LADAR can also plug the sensor gap with large amounts of situational data as the industry moves to autonomous operations and e-navigation. LDR is also working on matching the system with drone technology.
“We’re convinced the generalization of this technology will see new applications still unknown to us,” Dokken says. “But right now, there are no other laser sensor solutions in the market with the kind of performance our system offers. Whatever alternatives there are have less range, are bigger and less adaptable. They’re also three or more times the price.”
Focus on performance testing
The LDR lab continues iterative electronic and mechanical assembly to further reduce the size, weight and production cost of the system. “We also focus on performance testing both in the lab and in live settings,” says Dokken. “Last summer, for example, we were in the Mediterranean doing tests on plastic detection with very positive results.”
LDR has conducted trials on board the cruise ferry Color Magic along its route between Oslo and Kiel, as well as proving its use in fish inventory assessment, bathymetric/sea floor mapping by aircraft and floating mine detection for a navy.
The company currently has LOIs in place with the likes of Team Tankers Management, Hurtigbåtforbundet HRF, The Fjords, GOTA Ship Management, Hargun Havfiske, Barents Nord, the Port of Rotterdam and Grand Large Yachting.
LADAR's underwater range extends to depths of up to 10 metres at short distances
  Scientific team
Dokken’s team have been perfecting their advanced LADAR sensor technology for some years now. The company was spun off from an EU-funded project that produced an early prototype.
“Our LADAR team have 100 years’ combined expertise in sensor systems, software and electrical engineering,” says Dokken. “We also cooperate with several experienced sea captains to ensure the system meets end-user needs and to keep tabs on market trends.”
Dokken himself has over 20 years’ experience in marine and remote sensing R&D programmes. He has held various academic and managerial positions in the European Space Agency (ESA), Chalmers University of Technology, and the Norwegian Defence Research Establishment. He has co-ordinated several EU Framework Programme projects as well as investor- and venture-capital backed technology enterprises.
With a double MSc from the Agricultural University of Norway and both a PhD/Dr.Tech and LicEng from Chalmers University of Technology, he says the impetus for his move from pure research scientist to entrepreneur came in 2000 when doing a Financial MBA at the International University of Monaco. “I haven’t looked back,” he jokes.
Other key team members include LDR director Carlos Pinto, product manager Siegfried Schmuck and collaborator Dr Jens Hjelmstad.
With more than 10 years’ experience in remote sensing and geographic information systems, Pinto has worked in management, funding strategy and product research, including a stint at the Food and Agriculture Organization developing Earth observation (EO) data processing.
Schmuck has more than 12 years’ experience in remote sensing applications and R&D project management. He has worked in more than six countries linking technical and business teams in maritime and EO.
Hjelmstad has over 30 years’ experience in coordinating and leading advanced sensor system development programmes. For the past 15 years he has managed microwave and optical sensor research programmes as adjunct professor at the Norwegian University of Science and Technology (NTNU). He also lectures at both graduate and PhD levels. He earlier worked for 15 years in the corporate world at Ericsson.
Meanwhile, LDR retains close links to both Chalmers and NTNU, the Cyprus Institute and University of Cyprus, to name a few.
LADAR system
Source: https://maritime-executive.com/article/breakthrough-laser-sensor-technology-for-maritime-offshore-industry-1
What HealthyThinker Is Thinking About Health at CES 2020
Next week, the Consumer Technology Association (CTA) will convene CES, the Consumer Electronics Show, where over 180,000 tech-minded people from around the world will gather to kick the tires on new TVs, games, smart home devices, 5G connections, 3-D printing, drones and, to be sure, digital health innovations.
At #CES2020, exhibitors in the health/care ecosystem will go well beyond wearable devices for tracking steps and heart rate. I’ll be meeting with wearable tech innovators along with consumer electronics companies and retailers. I’ve also scheduled get-togethers with pharma and life science folks, health plan people, and execs from consumer health companies.
And with organizations you might not yet connect to health, well-being, and medical care.
Mainstream media have been covering every angle on CES for the past few weeks. Entrepreneur identified five innovations that will dominate CES 2020, and I see health/care in all of them: wearable AR/VR, autonomous farming, IoT in the kitchen, personal translators, and remote health monitoring. Forbes ran a column on CES 2020 discussing AI in hearing and vision. Hearables have been emerging at CES for a few years, and now vision will be a newer category to watch. (As a sidebar, note that the American Girl Doll of the Year for 2020 is Joss, who surfs and wears a hearing aid).
And in FierceHealthcare, CTA President Gary Shapiro wrote this column last week on how technology is a key to enabling value-based care.
To give you a sense of how health and wellness at CES have grown since 2013, consider Asthmapolis, a pioneer in digital respiratory health. At CES 2013, David Van Sickle, CEO and founder of Asthmapolis, spoke in a CES keynote panel about the usefulness of health data in the cloud. Later in 2013, Asthmapolis changed its name to Propeller Health. The company was acquired by ResMed exactly one year ago during CES 2019. Subsequently, ResMed joined with Dr. Oz to launch SleepScore Labs – an exhibitor at CES 2020 in the fast-growing sleep category at the Show.
Healthcare is getting serious at #CES2020, blurring into the medical, FDA-regulated turf that Omron pioneered last year with the launch of its 80+-patent blood pressure watch, the HeartGuide. This year, Omron garnered a CES 2020 Innovation Award nomination for its new “Complete” device, which embeds an EKG in the blood pressure monitor.
Heart monitoring is now table stakes for wearable tech in health, with many wrist-worn wearables tracking heart rate. The signal that CTA “hearts” health this year is that CTA is partnering with the American College of Cardiology to let physicians attending CES earn Continuing Medical Education credits as part of a new “Disruptive Innovations in Health Care” conference. This is something that we forecasters would have put in the “wild card” category nine years ago. The agenda for that session looks like a blur between HIMSS, Health 2.0, Connected Health and the ATA Conference – covering digital health and value-based care, reimbursement, home care, and clinician/technology partnerships. I have also heard that several hundred physicians are signed up to attend CES – again, showing this meeting has become an important forum for healthcare. The Digital Health Summit celebrates a decade at #CES2020, and I’ll be participating in this all-health meeting within CES, as well — especially looking forward to brainstorming “Smart Health Just Got Smarter” with Roy Jakobs, who leads Philips’ consumer health business.
Year on year for the past decade, digital health has grown at CES: this year the category will expand by 25%, the kind of growth it has shown since I began attending the conference nine years (and about 30,000 fewer attendees) ago.
I called out this growth and importance of CES for health/care in my book, HealthConsuming: From Health Consumer to Health Citizen, published in May 2019. I am gratified that the book was chosen for Gary’s Book Club at CES 2020, where I’ll be interviewed by CTA’s Kinsey Fabrizio (who has driven health and fitness at the Association for many years) and do a book signing. Aside from feeling excited and humbled by this on a personal level, it’s important to see CTA’s choice of this book theme as recognition that health/care, spanning both DIY self-care and clinical medical applications, makes up an important component of the consumer electronics industry.
Here is my pre-look into what I expect to explore at #CES2020 through my health/care-is-everywhere lens…
The next era for health-wearables isn’t about the wearable device — it’s about the data. A recent blog from Valencell, a long-time digital health exhibitor at CES, pointed out “why this time is different for wearables.” The essay cited four factors identified by Andreessen Horowitz’s Vijay Pande: machine learning, biometric sensor data, at scale, in context. Vijay observed that it’s the aggregation of data emanating from wearable tech and remote health monitoring devices, along with other observed behavior from, say, voice tech or driving a car, that’s driving the next phase of wearable tech growth.
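A minimal sketch of what “biometric sensor data in context” can mean in practice — the field names, limits and activity categories below are hypothetical, invented for illustration rather than drawn from Valencell's or Pande's actual models:

```python
# Hypothetical sketch: the same heart rate that is unremarkable during
# a run can be flagged as anomalous when the wearer is at rest.
# Field names and bpm limits are invented for illustration.
def flag_readings(samples):
    """Return samples whose bpm exceeds the limit for their context."""
    limits = {"resting": 100, "walking": 130, "running": 180}
    return [s for s in samples if s["bpm"] > limits[s["activity"]]]

readings = [
    {"bpm": 112, "activity": "resting"},  # anomalous at rest
    {"bpm": 112, "activity": "running"},  # unremarkable in context
]
print(flag_readings(readings))  # -> [{'bpm': 112, 'activity': 'resting'}]
```

The point of the example: identical raw sensor values carry different meaning once paired with context, which is exactly the aggregation argument above.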
Digital Swiss Army knives for health: devices do more. The launch of Omron’s Complete is an FDA-cleared innovation, so healthcare providers can trust its clinical accuracy in their workflow. The product combines blood pressure monitoring and EKG in one device. This is significant for medical care, addressing the public health challenge of atrial fibrillation (AFib), which raises the risk of stroke and heart failure. Complete is an early example of a device that does “more than one thing.” Just as we see with polypills in pharmacy – therapies that address more than one condition, making it easier for patients to adhere to prescription drug regimens – digital health tools that serve more than one function add value to the wearable or monitoring device, serving up greater convenience and value to the patient and the provider.
Your car as a third space for health. True to my Detroit birth-roots, I’ve been following connected cars for health and well-being for several years. In 2017, I wrote about your car as a mobile platform for health, a new definition for the phrase “mHealth.” This year we’ll see more concept cars embedding health, wellness and well-being (HWW), using AI to feed data and coaching back to the driver. Last year, Kia worked with Affectiva to demonstrate a car that could sense a passenger’s mood and emotions, triggering aromatherapy and lighting to bolster the person’s well-being. My 2017 post in Health Populi explained Mercedes-Benz’s prototype for health, which used scents and music to boost the passenger’s energy or calm a stressed person down. This year, connected cars will incorporate more data analytics, safety objectives, and voice and camera devices as cars evolve toward business models for health. A key part of such a business model could be a consumer’s willingness to trade off personal information about their time in the car for, say, a discount on car insurance or another financial inducement.
The mouth as gateway to health. Oral care is a huge consumer packaged goods category for self-care, and the electronic side of toothbrushing has heated up in the past few years. At last year’s CES, Philips’ Sonicare line introduced a tele-dentistry program connecting consumers to dentists in the community. At CES 2020, Colgate, the favorite toothpaste and oral care brand, is rumored to be introducing a new smart toothbrush. Colgate has been collaborating with Apple Health and ResearchKit for over a year, so we can expect collaborations like this to expand into oral care – where evidence has been building that connects physical health (such as diabetes, heart conditions and stroke) to oral health.
Caregiving is the new black – watch for voice and robots to help
The Longevity Economy is an important through-line at CES 2020 with the likes of Philips Lifeline, Samsung’s piBo the robot, LiveFreely’s Buddy, Addison the Virtual Caregiver (who first appeared last year), and PECOLA, the “Personal Companion for Older People Living Alone” which is an honoree in the Smart Home category this year. I expect to see more such developments that will enable people to age-in-place longer. No question, too, that Alexa and other voice assistants will continue to have skills developed aimed at enhancing older peoples’ lives and ability to stay in their homes safely and securely. Laurie Orlov, author of the Aging in Place Technology Watch, writes in her new report on Voice, Health and Wellbeing 2020 (launching today at CES) that voice technology is particularly suited for older adults and those with disabilities. Laurie’s pioneering research into the voice market for health and wellness found that, “speaking to a device was going to be one of the most significant technology enablers for seniors, their caregivers and families.” In addition to voice, we’ll find more robots featured at CES across applications; I look forward to meeting and petting TomBot, a robotic puppy that was designed with folks from Jim Henson’s Creature Shop. While it looks like a toy, TomBot was designed for people dealing with dementia.
Food-tech for health. With growing attention to the role of food in health, and local food gaining traction in many parts of the world, 2019 ushered in a new era of grocery chains bringing agriculture inside their brick-and-mortar stores; Walmart and Kroger have both ventured into this area. Increasingly, as consumers take on more DIY lifestyles, gardening – and especially home-growing healthy food – is an expanding market. At CES 2020, LG will announce an indoor gardening appliance as part of this growing movement among people keen to know the provenance and quality of their food and to live healthier, greener lifestyles, LG observed in its press release.
The home as health hub. At CES 2019, I happened upon Stanley Black & Decker. To me, the company has been long associated with DIY home-making, supplying my family with quality tools that help us make our house our safe and comfortable nest. Last year, the company’s innovation team in Boston launched SB&D’s entry into health through the prototype of Pria, a medication adherence concept, which I discussed here in Health Populi.
From Stanley Black & Decker, I point you to Whirlpool, with whom I met several years ago at CES 2015, when I looked up to the ceiling to see the company mantra, “Every day, care.” I thought to myself, “wait – these are home appliances” in this lovely homey booth. Fast-forward to today, and Whirlpool is bolstering social determinants of health via the #CareCounts program, granting washing machines to schools so that students can stay in school wearing clean clothes. This bolsters education, which further reinforces a young person’s odds of finishing school, getting a good job, and increasing income opportunities and financial health.
These factors, my friends, underpin economic justice, health equity and longer quality life-years.
Big Tech and Health at CES: privacy and products. Of course, Big Tech companies will be at CES – Google, Facebook, Amazon and, this year for the first time in a long time, Apple will all feature developments. Bloomberg’s story noted that the last time someone from Apple spoke at CES was in 1992, when John Sculley was on the dais. Apple’s decision to attend this year has been widely covered in tech media, with the overall headline that Jane Horvath, Apple’s Senior Director of Global Privacy, will speak at CES during a roundtable discussion about digital privacy. She’ll talk alongside top execs from Facebook, the US FTC, and Procter & Gamble. Apple is not expected to make a big product announcement at CES 2020, but those of us in health/care should be mindful that this privacy discussion is a critical pillar in health data – especially as California’s CCPA takes effect in this new year, and the GDPR is fully in force throughout Europe.
Beyond my own checklist, you can check out what’s next and new at CES 2020 in this CTA press release: there’s health embedded everywhere as I see it, which is why I spend an entire week at this conference meeting up with folks innovating in autos, TVs, fast connections, media, smart cities, laundry, sexual health and, yes, medicine and wellbeing, too.
I conclude HealthConsuming in a few paragraphs under the section heading, “Home is not just where the heart is: it’s our health hub.” At #CES2020, we’ll see more connected “things” in the IoT for health and wellness at home and in our vehicles. There is a global trend toward self-care, bolstered by over-the-counter products, exercise equipment and subscriptions, food-as-medicine, direct-to-consumer genetic and ancestry testing and, to be sure, digital health tools taking the form of wearable tech, remote health monitors, and data mash-ups.
There will be plenty of technologies and “things” to explore that can help our health. Tech and things won’t be the barriers to improving health. Public policies, lagging regulations, economic opportunity, political will, and our personal commitments to owning rather than “renting” our health, will be the limitations to tech realizing its full potential for health and wellness in 2020 and beyond.
 The post What HealthyThinker Is Thinking About Health at CES 2020 appeared first on HealthPopuli.com.
Hummingbird robot flies and hovers like the real thing
Researchers have engineered flying robots that behave like hummingbirds, training them with machine learning algorithms based on various techniques the bird uses naturally every day.
This means that after learning from a simulation, the robot “knows” how to move around on its own as a hummingbird would.
The combination of flying like a bird and hovering like an insect could one day offer a better way to maneuver through collapsed buildings and other cluttered spaces to find trapped victims.
Artificial intelligence, combined with flexible flapping wings, also allows the robot to teach itself new tricks. Even though the robot can’t see yet, for example, it can touch surfaces to sense them. Each touch alters an electrical current, which the researchers realized they could track.
This robotic hummingbird flies on its own while tethered to an energy source. Soon, batteries will power the robot. (Credit: Bio-Robotics Lab/Purdue University)
“The robot can essentially create a map without seeing its surroundings. This could be helpful in a situation when the robot might be searching for victims in a dark place—and it means one less sensor to add when we do give the robot the ability to see,” says Xinyan Deng, an associate professor of mechanical engineering at Purdue University.
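Deng's description suggests a simple mapping loop: register an obstacle wherever a wing contact perturbs the measured current. The sketch below is a hedged illustration of that idea; the baseline current, threshold and grid coordinates are invented for the example, not taken from the Purdue system.

```python
# Hedged illustration: a wing contact perturbs the measured current,
# so a deviation beyond a threshold marks the robot's position as an
# obstacle. Baseline, threshold and coordinates are invented values.
BASELINE_A = 0.50   # nominal current, amps (assumed)
SPIKE_A = 0.08      # deviation that counts as a touch (assumed)

def update_map(obstacles, position, current_a):
    """Add position (x, y) to the obstacle set if the current deviates."""
    if abs(current_a - BASELINE_A) > SPIKE_A:
        obstacles.add(position)
    return obstacles

grid = set()
update_map(grid, (2, 3), 0.61)  # touch: current jumps
update_map(grid, (2, 4), 0.51)  # free flight: near baseline
print(sorted(grid))  # -> [(2, 3)]
```

Accumulated over many contacts, a set like this becomes the "map without seeing" that Deng describes.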
Different physics
Engineers can’t make drones infinitely smaller, because of the way conventional aerodynamics work: below a certain size, a drone can’t generate enough lift to support its weight. But hummingbirds don’t rely on conventional aerodynamics – and their wings are resilient.
“The physics is simply different; the aerodynamics is inherently unsteady, with high angles of attack and high lift. This makes it possible for smaller, flying animals to exist, and also possible for us to scale down flapping wing robots,” Deng says.
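One way to see why "the physics is simply different" at hummingbird scale is the Reynolds number, which drops sharply as a flyer shrinks, pushing small wings out of the regime where steady, conventional airfoil theory works well. The sketch below uses illustrative speeds and chord lengths, not measurements from Deng's lab.

```python
# Reynolds number Re = rho * v * L / mu: below roughly 1e4, steady
# airfoil aerodynamics grow inefficient -- the regime where flapping
# wings with unsteady, high-angle-of-attack lift take over.
# Speeds and chord lengths below are illustrative, not measurements.
RHO_AIR = 1.225    # air density at sea level, kg/m^3
MU_AIR = 1.81e-5   # dynamic viscosity of air, Pa*s

def reynolds(speed_m_s: float, chord_m: float) -> float:
    return RHO_AIR * speed_m_s * chord_m / MU_AIR

print(f"{reynolds(10.0, 0.20):.0f}")  # small fixed-wing drone: ~1.4e5
print(f"{reynolds(3.0, 0.01):.0f}")   # hummingbird-scale wing: ~2e3
```

The two orders of magnitude between the examples are why scaling down a conventional design fails while a flapping-wing design does not.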
Researchers have been trying for years to decode hummingbird flight so that robots can fly where larger aircraft can’t.
In 2011, DARPA, an agency within the US Department of Defense, commissioned the company AeroVironment to build a robotic hummingbird that was heavier than a real one but not as fast, with helicopter-like flight controls and limited maneuverability. It required a human to be behind a remote control at all times.
12-gram robot
Deng’s group and her collaborators studied hummingbirds for multiple summers in Montana. They documented key hummingbird maneuvers, such as making a rapid 180-degree turn, and translated them to computer algorithms that the robot could learn from when hooked up to a simulation.
Further study on the physics of insects and hummingbirds allowed researchers to build robots smaller than hummingbirds—and even as small as insects—without compromising the way they fly. The smaller the size, the greater the wing flapping frequency, and the more efficiently they fly, Deng says.
The robots have 3D-printed bodies, wings made of carbon fiber, and laser-cut membranes. The researchers built one hummingbird robot weighing 12 grams—the weight of the average adult magnificent hummingbird—and another insect-sized robot weighing 1 gram. The hummingbird robot can lift more than its own weight, up to 27 grams.
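A quick sanity check on those figures, using only the numbers reported in the article:

```python
# Using only the figures reported in the article: a 12 g robot that
# can lift 27 g carries a 15 g margin for batteries and sensors.
robot_mass_g = 12.0   # mass of the hummingbird robot
max_lift_g = 27.0     # maximum lift it can generate
payload_g = max_lift_g - robot_mass_g
lift_to_weight = max_lift_g / robot_mass_g
print(payload_g, round(lift_to_weight, 2))  # -> 15.0 2.25
```

That 2.25 lift-to-weight ratio is the "wiggle room" the next paragraph refers to.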
Designing their robots with higher lift gives the researchers more wiggle room to eventually add a battery and sensing technology, such as a camera or GPS. Currently, the robot needs a tether to an energy source while it flies, but the researchers say that won’t be the case for much longer.
The robots could fly silently just as a real hummingbird does, making them more ideal for covert operations. And they stay steady through turbulence, which the researchers demonstrated by testing the dynamically scaled wings in an oil tank.
The robot requires only two motors and can control each wing independently of the other, which is how flying animals perform highly agile maneuvers in nature.
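The two-motor, independently driven wing design can be pictured as a simple differential command: steer by driving one wing's stroke amplitude up and the other's down. The sketch below is a hypothetical stand-in; the base amplitude, gain and linear model are assumptions, not the robot's actual controller.

```python
# Hypothetical differential wing control: asymmetric stroke amplitudes
# produce a turning torque. Base amplitude, gain and the linear model
# are assumptions for illustration, not the robot's real controller.
def wing_command(turn_rate: float):
    """Map a desired turn rate in [-1, 1] to (left, right) amplitudes."""
    base, gain = 0.8, 0.2
    return base + gain * turn_rate, base - gain * turn_rate

print(wing_command(0.0))                              # symmetric hover
print(tuple(round(a, 2) for a in wing_command(1.0)))  # -> (1.0, 0.6)
```

Driving each wing independently, rather than through shared linkages, is what lets two motors reproduce maneuvers that real birds perform with multiple muscle groups.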
“An actual hummingbird has multiple groups of muscles to do power and steering strokes, but a robot should be as light as possible, so that you have maximum performance on minimal weight,” Deng says.
Robotic hummingbirds wouldn’t only help with search-and-rescue missions, but also allow biologists to more reliably study hummingbirds in their natural environment through the senses of a realistic robot.
“We learned from biology to build the robot, and now biological discoveries can happen with extra help from robots,” Deng says.
Simulations of the technology are available open-source. Researchers, including collaborators from the University of Montana, will present the work this month at the 2019 IEEE International Conference on Robotics and Automation in Montreal.
Source: Purdue University
The post Hummingbird robot flies and hovers like the real thing appeared first on Futurity.