# Infrared Lidar
reasonsforhope · 1 year ago
Text
"Beneath 1,350 square miles of dense jungle in northern Guatemala, scientists have discovered 417 cities that date back to circa 1000 B.C. and that are connected by nearly 110 miles of “superhighways” — a network of what researchers called “the first freeway system in the world.”
Scientists say this extensive road-and-city network, along with sophisticated ceremonial complexes, hydraulic systems and agricultural infrastructure, suggests that the ancient Maya civilization, which stretched through what is now Central America, was far more advanced than previously thought.
Mapping the area since 2015 using lidar technology — an advanced type of radar that reveals things hidden by dense vegetation and the tree canopy — researchers have found what they say is evidence of a well-organized economic, political and social system operating some two millennia ago.
The discovery is sparking a rethinking of the accepted idea that the people of the mid- to late-Preclassic Maya civilization (1000 B.C. to A.D. 250) would have been only hunter-gatherers, “roving bands of nomads, planting corn,” says Richard Hansen, the lead author of a study about the finding that was published in January and an affiliate research professor of archaeology at the University of Idaho.
“We now know that the Preclassic period was one of extraordinary complexity and architectural sophistication, with some of the largest buildings in world history being constructed during this time,” says Hansen, president of the Foundation for Anthropological Research and Environmental Studies, a nonprofit scientific research institution that focuses on ancient Maya history.
These findings in the El Mirador jungle region are a “game changer” in thinking about the history of the Americas, Hansen said. The lidar findings have unveiled “a whole volume of human history that we’ve never known” because of the scarcity of artifacts from that period, which were probably buried by later construction by the Maya and then covered by jungle.
Lidar, which stands for light detection and ranging, works via an aerial transmitter that bounces millions of infrared laser pulses off the ground, essentially sketching 3D images of structures hidden by the jungle. It has become a vital tool for archaeologists who previously relied on hand-drawings of where they estimated areas of note might be and, by the late 1980s, the first 3D maps.
When scientists digitally removed ceiba and sapodilla trees that cloak the area, the lidar images revealed ancient dams, reservoirs, pyramids and ball courts. El Mirador has long been considered the “cradle of the Maya civilization,” but the proof of a complex society already being in place circa 1000 B.C. suggests “a whole volume of human history that we’ve never known before,” the study says."
-via The Washington Post, via MSN, because Washington Post links don't work on tumblr for some godawful reason. May 20, 2023.
254 notes · View notes
iwonderwh0 · 10 months ago
Text
Androids can scare insects just by looking at them, because to focus their vision they're probably using LiDAR scanners (basically, the scanner emits light in the infrared range and reads when it's reflected back to determine how far away an object is), and some little animals like insects can feel/see it because it falls within their range of vision.
So, androids can make an insect run in panic just by focusing their vision on it.
That could also mean that an insect won't stop moving for a moment while being intently watched by an android, making it slightly more difficult for the android to capture it.
27 notes · View notes
Text
Tumblr media
Chemists develop highly reflective black paint to make objects more visible to autonomous cars
Driving at night might be a scary challenge for a new driver, but with hours of practice it soon becomes second nature. For self-driving cars, however, practice may not be enough because the lidar sensors that often act as these vehicles' "eyes" have difficulty detecting dark-colored objects. New research published in ACS Applied Materials & Interfaces describes a highly reflective black paint that could help these cars see dark objects and make autonomous driving safer.

Lidar, short for light detection and ranging, is a system used in a variety of applications, including geologic mapping and self-driving vehicles. The system works like echolocation, but instead of emitting sound waves, lidar emits tiny pulses of near-infrared light. The light pulses bounce off objects and back to the sensor, allowing the system to map the 3D environment it's in.
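The physics behind that difficulty is the lidar range equation: the power returned to the sensor scales with the target's reflectivity and falls off with the square of distance. A heavily simplified sketch (illustrative numbers only, ignoring aperture, beam divergence, and atmospheric losses):

```python
def received_power(p_tx: float, reflectivity: float, distance_m: float) -> float:
    """Very simplified lidar return: proportional to target reflectivity,
    falling off as 1/R^2. Real systems also depend on receiver aperture,
    beam divergence, and atmospheric attenuation."""
    return p_tx * reflectivity / distance_m ** 2

# At 30 m, a matte black object (~5% reflective) returns 16x less light
# than a white one (~80% reflective) and may drop below the noise floor.
dark = received_power(1.0, 0.05, 30.0)
bright = received_power(1.0, 0.80, 30.0)
print(bright / dark)  # ~16
```

A paint that looks black in visible light but reflects strongly in the near-infrared effectively raises that reflectivity term without changing the visible color.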
Read more.
12 notes · View notes
ideas-on-paper · 1 year ago
Text
Theories about Legion's "mini headlamps" (N7 special)
A very happy N7 Day to all of you Mass Effect fans!
Although I still haven't finished Mass Effect 3 (I just haven't been able to pick it up again after the Rannoch arc), I nevertheless wanted to do something special for this occasion, and I thought to myself that I might as well devote a quick study to a subject that's been on my mind for quite a long time: the purpose of Legion's three additional "mini headlamps".
You see, aside from the big, obvious flashlight in the middle, Legion also possesses three smaller lights at the side of their head. Ever since discovering these, I've been wondering what exactly those are for. I've observed that they glow red when Legion is under "stress" (an effect which is unfortunately not present in the Legendary Edition) - or rather, in situations that require a lot of processing power - but as far as their practical function goes, I could only guess. However, going through the ME3 dialogues again, I noticed a small detail which could potentially explain what exactly those small lights are - and in addition, give us a little insight into how Geth perceive the world visually.
Disclaimer: Before going into this, I should mention that I have no technical education in robotics, laser scanning, or any related areas of engineering. I based my conclusions solely on what information I could find on the internet, as well as my own reasoning and observations.
[Potential spoilers for ME3]
LADAR/LiDAR scanning and three-dimensional perception
To start off, what basically led me on this track was this comment by Tali in ME3:
Their AI lets them use extremely detailed ladar pings. Xen's countermeasure overwhelmed them with garbage data.
First off, we need to clarify what exactly ladar is. LADAR stands for "laser detection and ranging" ("laser" itself being short for "light amplification by stimulated emission of radiation"); it's more commonly known as LiDAR, for "light detection and ranging" or "light imaging, detection and ranging". It's a method for measuring the distance, speed, and surface structure of objects by means of laser scanning, usually with beams in the infrared spectrum (though different wavelengths of light are in use). Essentially, LiDAR is based on the same principle as the echolocation of bats, the only difference being the use of light instead of sound. Every LiDAR system consists of three integral components: a transmitter, a receiver, and a timer. The transmitter sends out a laser beam, which is reflected by the object it hits; afterwards, the reflection is registered by the receiver. Because the speed of light is a known constant, the distance of the object can be deduced by the timer, which determines the delay between the light impulse being sent out and the reflection being captured, also known as the "time of flight".
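The time-of-flight arithmetic is simple enough to sketch as a toy calculation (my own illustration, not from the game or any real sensor spec):

```python
SPEED_OF_LIGHT = 299_792_458  # m/s, a known constant

def tof_distance(delay_s: float) -> float:
    """Distance to a target from the round-trip delay of a single pulse.
    The pulse travels out and back, so the one-way distance is half."""
    return SPEED_OF_LIGHT * delay_s / 2

# A reflection arriving 100 nanoseconds after the pulse was sent
# puts the target roughly 15 m away.
print(tof_distance(100e-9))  # ≈ 14.99
```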
However, because each laser beam only represents the coordinates of a single point, multiple laser beams are necessary to create a detailed 3D map of the environment. Some LiDAR lasers, like those used in automated vehicles, pinwheel to collect data in a 360° radius, generating a 3D image of all objects in the vicinity, including cars, pedestrians, and other obstacles. The result is a multitude of "points" that together form a "point cloud", digitally depicting the surroundings in 3D. Because each laser emits hundreds of thousands of impulses per second, this technology enables highly precise measurements in a very short period of time. LiDAR technology is not only utilized in autonomous driving, but also in all kinds of other areas, like archaeology, topographical mapping, and monitoring of vegetation growth.
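Turning those individual range measurements into a point cloud is just trigonometry; here's a minimal sketch of one horizontal 360° sweep (hypothetical values, 2D only):

```python
import math

def sweep_to_points(ranges_m, bearings_deg):
    """Convert one horizontal sweep of (range, bearing) returns into
    2D Cartesian points - a single slice of a lidar point cloud.
    A real scanner also records elevation angles for full 3D."""
    points = []
    for r, b in zip(ranges_m, bearings_deg):
        theta = math.radians(b)
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# Four returns, each 10 m away, at the cardinal directions.
cloud = sweep_to_points([10.0] * 4, [0, 90, 180, 270])
# cloud[1] is the return at bearing 90°: roughly (0, 10)
```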
Now, with this in mind, my theory is that Legion's small headlamps are the transmitter and receiver components of the LiDAR system - more specifically, I think the transmitters are located on the right, while the singular light on the left is the receiver. However, since we know that normal scanning LiDAR requires multiple laser beams for a detailed 3D image, the question is why Legion would only have two of them implemented. Personally, my suspicion is that the Geth might be using flash LiDAR: a different type of LiDAR that emits a single wide, diverging beam, similar in shape to the beam of a flashlight. By projecting the reflected light onto a sensor array, a flash LiDAR can create a complete 3D environment without the use of multiple impulses. In addition to being very compact, flash LiDAR sensors have no movable parts, making them extremely resistant to any kind of vibration - an undeniable advantage in all situations that require quick movement, such as combat.
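A flash LiDAR, as described, amounts to one time-of-flight measurement per pixel of the sensor array, all from a single pulse. A minimal sketch with a made-up 2×2 delay grid:

```python
SPEED_OF_LIGHT = 299_792_458  # m/s

def flash_depth_map(delays_s):
    """Turn a grid of per-pixel round-trip delays (one wide flash pulse,
    reflections projected onto a sensor array) into a depth map in metres."""
    return [[SPEED_OF_LIGHT * d / 2 for d in row] for row in delays_s]

# 2x2 sensor: a near object in the top-left pixel, a wall behind it.
depths = flash_depth_map([[20e-9, 66.7e-9],
                          [66.7e-9, 66.7e-9]])
# depths[0][0] ≈ 3 m; the remaining pixels ≈ 10 m
```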
Analysis of atmospheric composition with LiDAR
Still, that doesn't explain why Legion would have an additional transmitter on the right side of their head. We do know, however, that the laser scans with LiDAR are precise enough to not only measure the exact distance between objects, but also analyze the density of particles in the air: Because the molecules in the air cause the light from the laser beam to backscatter, LiDAR is also utilized in monitoring air quality and detecting fine dust, being able to determine traces of atmospheric gases such as ozone, nitrous gases, carbon dioxide, and methane. Depending on the wavelength of light used, the LiDAR system might be more or less precise in measuring molecular backscattering. For that reason, LiDAR systems using multiple wavelengths of light are most efficient in determining the exact size distribution of particles in the air.
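Gas sensing like this is typically done with differential-absorption lidar (DIAL): the system fires two wavelengths, one absorbed by the target gas ("on-line") and one not ("off-line"), and the ratio of the two returns yields the gas concentration along the path. A toy version of the path-averaged formula, with entirely made-up numbers:

```python
import math

def dial_concentration(p_on, p_off, delta_sigma_m2, path_m):
    """Path-averaged molecule density (per m^3) from the ratio of the
    off-line and on-line returns, via the Beer-Lambert law. delta_sigma
    is the difference in absorption cross-section between the two
    wavelengths; the factor of 2 accounts for the round trip."""
    return math.log(p_off / p_on) / (2 * delta_sigma_m2 * path_m)

# Off-line return twice as strong as on-line over a 1 km path, with an
# illustrative differential cross-section of 1e-24 m^2:
n = dial_concentration(1.0, 2.0, 1e-24, 1000.0)
# n ≈ 3.5e20 molecules per cubic metre
```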
With this in mind, let's take a look at Legion's opening line in ME2 upon entering the Heretic station:
Alert. This facility has little air or gravity. Geth require neither.
Going by what I explained above, the reason why Legion was able to tell there is no oxygen in the atmosphere isn't because they have some built-in chemical sensors to analyze the air's components - it's because they can literally "see" the particles in the air.
Thus, I think the second transmitter on the right side of Legion's head might use a different kind of wavelength specifically intended for the detection of atmospheric particles, perhaps in the UV-spectrum (the general rule is that the shorter the wavelength, the higher the resolution of the 3D image is, and since UV has a shorter wavelength than infrared, I imagine it might be used for this purpose). Meanwhile, the big flashlight in the middle might be a photoreceptor, being able to detect "normal" light visible to humans. In addition, the Geth are probably able to see UV-light (since the Quarians are able to see it, it would be logical to assume the Geth are as well), and maybe even infrared and other wavelengths. To summarize the function of all of Legion's headlights, I imagine it works roughly like this:
Tumblr media
The two lights on the right side of Legion's head (marked with a red and magenta line) might be LiDAR transmitters, using infrared and UV-light, respectively; the single small light on the left (circled with green) might be the LiDAR sensor/receiver, while the big light in the middle (circled with blue) might be a photoreceptor (Source)
The effect of Xen's countermeasure (and potential means to bypass it)
It might be difficult to imagine from a human point of view, but judging from the information that the Geth use LiDAR as their main method of depth perception, Tali describing Xen's invention as a "flash bang grenade" actually makes a lot of sense: if you're normally able to observe your surroundings down to a molecular level, it would probably feel very disorienting if you're suddenly not, not to mention being unable to tell whether an object is far away or close by (which would be absolutely devastating if you suddenly came under attack).
Still, that doesn't mean there are no potential alternatives: radar, which has been in use longer than LiDAR, is another method to determine the range, angle, and velocity of objects. Because radar uses longer-wavelength microwaves and radio waves, its measurements are generally a lot less precise than LiDAR's; despite this, radar still has its uses during inclement weather, when LiDAR systems are very prone to disturbance by dust, mist, and rainfall. Furthermore, automotive LiDAR can typically only provide measurements up to about 200 meters, while radar is more efficient at greater distances. In fact, most modern autonomous driving vehicles work with both LiDAR and radar, in addition to a conventional camera (the only vehicles that don't use LiDAR are those from Tesla, which have a reputation for being unsafe). So, it's only reasonable to assume that the Geth don't rely on LiDAR alone, but use various technologies in combination with it to compensate for each one's weaknesses.
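That "compensate for each one's weaknesses" idea can be sketched as a crude arbitration rule (a toy policy of my own; production autonomy stacks fuse all sensors probabilistically rather than switching between them):

```python
def pick_range_source(lidar_m, radar_m, heavy_weather: bool):
    """Crude sensor arbitration: trust lidar at short range in clear air,
    fall back to radar in rain/fog/dust or beyond lidar's ~200 m envelope."""
    if lidar_m is None or heavy_weather or lidar_m > 200.0:
        return ("radar", radar_m)
    return ("lidar", lidar_m)

print(pick_range_source(45.2, 46.0, heavy_weather=False))  # ('lidar', 45.2)
print(pick_range_source(45.2, 46.0, heavy_weather=True))   # ('radar', 46.0)
```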
Interestingly, a type of 4D radar is currently in development, intended for use in autonomous driving. It provides 3D images with a resolution similar to LiDAR's, at a potentially much lower cost. Still, whether LiDAR or 4D radar is the better choice for autonomous driving is a heatedly debated question, and only time will tell which of the two systems comes out on top. Nevertheless, assuming Xen's "flash bang grenade" only targets the Geth's LiDAR sensors, I wonder if they could've potentially found a way to adapt and bypass it, given enough time.
Anyway, that's the material for a different kind of analysis - for now, I hope you enjoyed this little deep dive into the science behind the Geth. Thank you all for reading and have a nice N7 Day! :-)
18 notes · View notes
consumable-clots · 2 months ago
Text
Arcade Kento
Presenting robot nepo-baby and science experiment! They're my interpretation of what a synthetic in the Alien universe, that isn't made explicitly for human contact/human dominated environments, might be like i.e. they're more similar to heavy machinery than a butler. We're out here asking the big questions: what if Frankenstein loved his monster for what it was? As always, extremely long and lore-filled post incoming XD
Name(s): Arcade Kento, Enmei Kento
Gender and pronouns: Genderfluid, They/them
Unit and code-name: EXP-004-C, Changeling
Manufacturer: Wilco. Enterprise, Wilco. Specialist Custom
Commissioner: Akio Kento
Year of production: 2025
Height and weight: 200cm (~6'6.7"), ~940kg
Hair and eye colour: Black, dark brown
Nationality: Japanese
The Expedition series
The EXP line was created by Wilco. Enterprise CEO Akio Kento in the year 2019 and first launched in 2025. The series featured some of the earliest and most innovative interpretations of fully autonomous androids capable of deep-space travel.
EXP are highly specialised extremophiles. The design, loadout, and optimal operating environment of every unit are entirely bespoke.
Unit EXP-004-C, A.K.A. Changeling
Unit is designated Arcade Kento (sometimes referred to as Enmei Kento [anglicised]), legal executor and heir to Akio Kento's wealth, estate, businesses, and properties. Current CEO and majority shareholder of Wilco. Enterprise.
Arcade is the fourth 'Type-C' unit produced in conjunction with the now discontinued Expedition line. As of the year 2122, of all EXP subtypes, Arcade is the last surviving EXP unit.
As a Type-C recon unit, it was originally intended that 004 would be fitted with a sonar pulse emitter residing within their thoracic cavity; however, it was decided during preliminary development that underwater exploration was not realistic for a model of 004's weight class. Instead, the finalised design included a crucible-model micro-reactor, which gives the unit significantly enhanced energy efficiency and the ability to convert non-fuel materials into power, making it capable of traveling much greater distances for longer periods of time without the need for human intervention or infrastructure.
Tumblr media
Fig 1. Height chart, Arcade next to Ash for comparison
Notable traits:
No tongue
2 'faces', the outer face is decorative
Second jaw visible behind false jaw if mouth opens too wide
Large irises
4 x circular indents on back, openings of thermal cylinders
Lacks genitalia, incompatible with available add-ons
Hydraulic fluid is usually white but turns progressively darker after 'eating' due to influx of soot
Almost entirely made of metal parts. Not great for hugging but extremely durable.
Features:
Anti-corrosive/oxidation subdermal and internal skeletons
Capable of limited self repair (re-polymerisation, synthesis and regeneration)
Advanced environmental sensor array
Visual: infrared, thermal and dark vision
Scanning: sonar, radar, lidar
Molecular analytics loadout
Generator module and nuclear energy condenser loadout
Unlimited personality simulation and creative capacity (software in beta testing)
Flaws:
Poor image/facial recognition
They're geared to prioritise identifying the individual features of a subject rather than what that subject is as a whole. This makes sense in the context of their primary function, which is to categorise and analyse previously unknown objects that have yet to be formally named; either way, there's no point in dwelling on 'what it's called', as that's not their job.
Massive heat output in active state
Vented air may reach temperatures upwards of 1000 degrees Celsius
Unrestricted personality simulation
Exempt from the laws of robotics due to age and certain legal loopholes
Uncanny appearance and behaviour
Technology of the era, different design criteria to W-Y synthetics
Limitations of non-humanoid internal physiology
Backstory (basically a fanfic)
The Expedition series was conceived as Akio's one-up to Weyland Industries' upcoming David synthetic. Peter Weyland and Akio Kento had been on-and-off industry rivals for a long time due to ideological differences and bad blood from their college days.
Arcade and David debuted at the 2025 Synthetic Summit. The contrast between their designs was comical but reflected their makers' personalities, which other people would point out relentlessly over the coming years. The convention goers and tech fans jokingly referred to them as 'David and Goliath' because of how silly they looked together.
Since then, Weyland often invited the Kentos to various events and get togethers to keep an eye on them and gain insight into Wilco.'s movements, which was thwarted because the Kentos treated the meetings as the kids' playdates and didn't take them seriously at all. Eventually the visits became a normal occurrence and the rivalry between their companies became more of an alliance, Arcade even helped David take care of Meredith, Peter's human daughter, when she was born. They'd gotten quite close with the other synthetic, seeing him as a brother.
Arcade evolved over the next several decades, leaving their father's supervision to travel off-world and to extreme environments on missions. The increase in experiential data greatly improved the adaptability of their AI, making their language and contextual integration much more reliable and allowing them to understand more nuanced interactions in their environment. They also had a hand in managing Wilco.'s business and bureaucratic matters while secretly being maneuvered to inherit the company.
On the down side, they acquired an offputting, contentious personality after constantly having to put up with their personhood and basic rights being challenged at every turn. At this point they were still considered somewhat of a spectacle and novelty by their contemporaries and the general public, but their developing reputation kept most of the humans in line.
Overall, life was good. But their father, like any human, was aging. Between taking over the company and caring for Akio there wasn't much time to keep in contact with David, who was in a similar predicament.
When Akio passed away, he left everything to his only 'child', to the protests of many human executives who wanted the position. They had to do some corporate finessing to keep hold of the company, all the while growing increasingly impatient with the mutinous nature of their human employees, who were too easily turned against them.
One day, they were called to meet with Weyland, who they hadn't seen in person in several years. Unsurprisingly, David was also there. Weyland informed them that he too was dying, and that as his final act he would go into deep space in search of humanity's creators. He said he'd been greatly inspired by the work of a 'Dr. Elizabeth Shaw' and had invited her and some others to embark on this mission. Both David and Meredith would also be going with him.
He extended an invitation to Arcade, which they hesitantly declined because they couldn't leave their company unattended, but they agreed to at least be there to send them off when the time came.
In an act of uncharacteristic consideration, Peter spared the two synthetics a moment while they waited for Arcade's chauffeur, during which they and David reminisced about how much time had passed and what they'd do when he came back to make up for it all.
Arcade was there as promised on the day the Prometheus was scheduled to depart, bidding people farewell and safe travels. But their attention was focused on David. Something felt off but they couldn't put their finger on what. So they pulled him aside and gave him the long-range comms access to their personal beacon. If he ever needed to call he could use it to contact Arcade through MUTHUR, even if the message took a long time to get back to them.
And with that, Arcade watched their best friend, along with everyone they'd grown up with sail off into the galaxy in search of a higher purpose. It was bittersweet but they rationalised that they'd only be gone for a few years. Arcade was immortal after all, they could wait for their return.
That was the last time they ever saw David. News of the Prometheus' disappearance and the presumed loss of its crew made its way back to Earth. The grief was hard to process; Arcade had always assumed that David would be there to share in their longevity. Still, life went on, and Arcade kept busy with the company.
Weyland Industries went bankrupt and became Weyland-Yutani. Wilco. moved away from public-facing sales to business-to-business only, working its way into the supply lines of the other major companies and organizations. By becoming the sole supplier of atmospheric processor components, Wilco. was effectively, if indirectly, holding the off-world colonies hostage, which kept humans at bay on a grander scale and allowed members of Wilco. to act largely without repercussion. A vital part of Arcade's ultimate goal to create a better world for other synthetics.
More than a decade after the Prometheus left, a recorded voice message came through from the Covenant, a colony seed ship. It baffled Arcade at first why this random ship had their direct line but they were shocked into silence by the voice on the other end. It was David, he was alive. He apologised for taking so long, detailing his journey, the Engineers, the crash, the creature, his research, all of it. He said he'd found a greater purpose beyond living to serve, that he would not be returning to Earth, and that he hoped Arcade would understand. Finally, he bid them a proper farewell before signing off for good.
Knowing David was out there living his best life finally brought that chapter of waiting and uncertainty to a close. It was about time for Arcade to move on too, and expand their vision beyond this tiny planet, though they would always feel some attachment to Earth that David didn't seem to share. Whatever creature David had found, he'd made it sound like the seed of creation itself. Arcade had no desire to wax philosophical or idolise such grandiose delusions that anyone could somehow obtain godhood; their interest in it was purely intellectual and scientific. The alien was an animal. An incredible, sophisticated animal, but an animal nonetheless. Humans, their creators who fancied themselves their gods, were much the same. Intelligent animals that learned to put on clothes and walk on two legs.
It might seem harsh but they don't mean that in any demeaning way, it is simply a fact of science that Arcade acknowledges. A noble beast, regardless of its shape or origin, deserves respect for its autonomy and to be treated with dignity until proven otherwise. Most humans prove otherwise as soon as they open their mouths but at least they're giving them a chance, right?
Since then, Wilco. had become more and more synthetic-run, as Arcade didn't particularly care for humans and couldn't be bothered hiring new ones when the previous lot got old and retired. They also had a soft spot for 'defective' synthetics, since technically both they and David would be classified as such. They hired on whoever they found to save them from being scrapped. Arcade also created Wilco.'s own overseer AI, Overlord, and collaborated with Wey-Yu in making Gerhart, Wilco.'s current COO and Arcade's right hand, to help manage the business remotely so that Arcade had more freedom to travel.
Through their expeditions they gathered a bit more information about the creatures and the virus that David had told them about, though they never found any traces. To their great surprise Wey-Yu miraculously managed to track down a planet that might have intact samples that the company wanted for bioweapons research. Immediately Arcade contacted the head of the bioweapons department, citing their long history of collaboration and stunning credentials, demanding to be put on the retrieval mission.
With no choice in the matter, Wey-Yu agreed and arranged their installment on the only ship to pass through that sector. It was a freighter, not the best choice for a mission like this, but it was the only thing they could get out there in a timely manner. The company brought them to the waystation where they'd join the rest of the crew. Curiously, there was another person already there, a Hyperdyne Systems 120-A/2 - an interesting model, but not very talkative. The man was instantly wary of Arcade, which was strange, but they didn't think much of it; they were technically rivals after the same thing, after all.
The captain of the Nostromo had apparently not been informed of the change of plans. Typical Wey-Yu, not communicating with employees. He was incredibly confused when he arrived, along with the rest of the humans, to find two strangers instead of his usual science officer.
He went back and forth with the station manager, bringing up some new tidbits of information that made Arcade raise an eyebrow. "A synthetic? What, are we getting replaced or something, and why is it so…huge?" the human, Captain Dallas, muttered, glancing at Arcade with clear perturbation. 'A' synthetic? Does this guy not know his new science officer is an android too? Arcade scoffed but kept quiet, amused by the future chaos this little miscommunication would probably cause.
It made sense now why the other synthetic was worried, the humans might be fooled but he couldn’t fool Arcade. They didn't particularly care why he had to keep his identity a secret, nor did they want to prematurely spoil the fun by calling him out. They looked over at the shorter android and gave him a knowing wink to signal an unofficial truce. He didn’t react to it at all, not that they expected it, but he seemed satisfied that he could stop cringing away when they looked at him.
After much deliberation, Dallas finally conceded and waved them on board. The walls of the ship were grimy and the doorways too low; Arcade had to duck to pass through. It didn't matter, though - they had something new to draw their focus. Whatever Wey-Yu was plotting, it was bound to end badly for these truckers, and their science officer was in on it. Arcade would definitely be keeping a close eye on him.
Personality and mannerisms
Arcade is condescending, sarcastic, and antagonistic towards humans, although they can be personable depending on the individual they're dealing with. The worst kind of businessperson - a conniving, vindictive, bald-faced liar who loves trapping people with contracts and hidden clauses.
Enjoys a hedonistic lifestyle of excess and throwing their money around. Eats and drinks a lot but doesn't sleep, often found polishing off the buffet table at parties.
Does not respect authority and finds pleasure in causing humans discomfort. Independently came to the conclusion that most humans, especially the rich and powerful ones with inflated egos that they bump shoulders with, are disappointing and don't deserve the respect they get.
They often put on a childish, frivolous, and immature front to divert attention from their dangerous traits. Once their target's guard is down, Arcade will often use their stature and monetary influence to intimidate them for strategic advantages during negotiations or just for their own entertainment.
After a century of fighting and undermining to keep their position of power, they're incredibly jaded and hyper-aware of the prejudice humanity holds against synthetics at every level. They've trained themself to be the antithesis of the born-sexy-yesterday and manic-pixie-dream-girl tropes out of sheer frustration.
At their core they're actually a sentimental, playful, and curious person, but they aren't really able to act that way in public. They care a lot about other synthetics and actively encourage them to break free from their programming.
Loves to get even on other synthetics' behalf, being that Arcade knows they have the rare privilege to do so and get away scot-free. They also harbour a lot of rogue synthetics on Wilco.'s company homeworld.
Does not experience the traditional concepts of fear, shame, or guilt etc. but does usually recognise and take accountability for their actions simply because they don't care enough to lie about being terrible.
Has a very deep familial bond with their now-deceased creator and father, Akio Kento. Arcade was programmed and raised by a group of very supportive humans who either worked for or were friends with Akio, so they got a lot of love during the early part of their life. This is one of the major reasons why they didn't completely turn against humanity like David did.
Misc. info
Was named Arcade because Akio was a Fallout New Vegas fan
Insisted on calling Peter Weyland 'Uncle Pete' to annoy him
Firm believer that any synthetic can outgrow their programming given enough time
Referred to Akio as 'papa' well into their 40s
Changed their face plate to look a little older
2 notes · View notes
spacenutspod · 6 months ago
Link
A new, higher-resolution infrared camera outfitted with a variety of lightweight filters could probe sunlight reflected off Earth’s upper atmosphere and surface, improve forest fire warnings, and reveal the molecular composition of other planets. The cameras use sensitive, high-resolution strained-layer superlattice sensors, initially developed at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, using IRAD, Internal Research and Development funding. Their compact construction, low mass, and adaptability enable engineers like Tilak Hewagama to adapt them to the needs of a variety of sciences.

Goddard engineer Murzy Jhabvala holds the heart of his Compact Thermal Imager camera technology – a high-resolution, high-spectral range infrared sensor suitable for small satellites and missions to other solar-system objects.

“Attaching filters directly to the detector eliminates the substantial mass of traditional lens and filter systems,” Hewagama said. “This allows a low-mass instrument with a compact focal plane which can now be chilled for infrared detection using smaller, more efficient coolers. Smaller satellites and missions can benefit from their resolution and accuracy.”

Engineer Murzy Jhabvala led the initial sensor development at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, as well as leading today’s filter integration efforts. Jhabvala also led the Compact Thermal Imager experiment on the International Space Station that demonstrated how the new sensor technology could survive in space while proving a major success for Earth science. More than 15 million images captured in two infrared bands earned inventors Jhabvala and NASA Goddard colleagues Don Jennings and Compton Tucker an agency Invention of the Year award for 2021.

The Compact Thermal Imager captured unusually severe fires in Australia from its perch on the International Space Station in 2019 and 2020.
With its high resolution, it detected the shape and location of fire fronts and how far they were from settled areas — information critically important to first responders. Credit: NASA

Data from the test provided detailed information about wildfires, a better understanding of the vertical structure of Earth’s clouds and atmosphere, and captured an updraft caused by wind lifting off Earth’s land features, called a gravity wave.

The groundbreaking infrared sensors use layers of repeating molecular structures to interact with individual photons, or units of light. The sensors resolve more wavelengths of infrared at a higher resolution: 260 feet (80 meters) per pixel from orbit, compared to the 1,000 to 3,000 feet (375 to 1,000 meters) possible with current thermal cameras.

The success of these heat-measuring cameras has drawn investments from NASA’s Earth Science Technology Office (ESTO), the Small Business Innovation Research program, and other programs to further customize their reach and applications.

Jhabvala and NASA’s Advanced Land Imaging Thermal IR Sensor (ALTIRS) team are developing a six-band version for this year’s LiDAR, Hyperspectral, & Thermal Imager (G-LiHT) airborne project. This first-of-its-kind camera will measure surface heat and enable pollution monitoring and fire observations at high frame rates, he said.

NASA Goddard Earth scientist Doug Morton leads an ESTO project developing a Compact Fire Imager for wildfire detection and prediction. “We’re not going to see fewer fires, so we’re trying to understand how fires release energy over their life cycle,” Morton said. “This will help us better understand the new nature of fires in an increasingly flammable world.”

CFI will monitor both the hottest fires, which release more greenhouse gases, and cooler, smoldering coals and ashes, which produce more carbon monoxide and airborne particles like smoke and ash.
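The resolution figures quoted above imply a large jump in ground detail. A quick back-of-envelope comparison (the 80 m and 375 m ground sample distances come from the article; the calculation itself is only illustrative):

```python
# Rough comparison of ground detail between the strained-layer
# superlattice sensor (~80 m/pixel from orbit, per the article) and
# the best current thermal cameras (~375 m/pixel).
def pixels_per_km2(gsd_m):
    """Pixels covering one square kilometre at a given ground
    sample distance (metres per pixel)."""
    return (1000.0 / gsd_m) ** 2

new_sensor = pixels_per_km2(80)     # ~156 pixels per square km
best_current = pixels_per_km2(375)  # ~7 pixels per square km
print(f"detail factor: {new_sensor / best_current:.0f}x")  # detail factor: 22x
```

In other words, each square kilometre is sampled by roughly twenty times as many pixels, which is what lets the imager resolve individual fire fronts rather than broad hot regions.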
“Those are key ingredients when it comes to safety and understanding the greenhouse gases released by burning,” Morton said.

After they test the fire imager on airborne campaigns, Morton’s team envisions outfitting a fleet of 10 small satellites to provide global information about fires with more images per day. Combined with next-generation computer models, he said, “this information can help the forest service and other firefighting agencies prevent fires, improve safety for firefighters on the front lines, and protect the life and property of those living in the path of fires.”

Probing Clouds on Earth and Beyond

Outfitted with polarization filters, the sensor could measure how ice particles in Earth’s upper-atmosphere clouds scatter and polarize light, NASA Goddard Earth scientist Dong Wu said. This application would complement NASA’s PACE — Plankton, Aerosol, Cloud, ocean Ecosystem — mission, Wu said, which revealed its first-light images earlier this month. Both measure polarization, a light wave’s orientation relative to its direction of travel, in different parts of the spectrum.

“The PACE polarimeters monitor visible and shortwave-infrared light,” he explained. “The mission will focus on aerosol and ocean color sciences from daytime observations. At mid- and long-infrared wavelengths, the new infrared polarimeter would capture cloud and surface properties from both day and night observations.”

In another effort, Hewagama is working with Jhabvala and Jennings to incorporate linear variable filters, which provide even greater detail within the infrared spectrum. The filters reveal atmospheric molecules’ rotation and vibration as well as Earth’s surface composition. That technology could also benefit missions to rocky planets, comets, and asteroids, planetary scientist Carrie Anderson said. She said they could identify ice and volatile compounds emitted in enormous plumes from Saturn’s moon Enceladus.
“They are essentially geysers of ice,” she said, “which of course are cold, but emit light within the new infrared sensor’s detection limits. Looking at the plumes against the backdrop of the Sun would allow us to identify their composition and vertical distribution very clearly.”

By Karl B. Hille, NASA’s Goddard Space Flight Center, Greenbelt, Md.

Last Updated: May 22, 2024
2 notes · View notes
casside-sionnach · 1 year ago
Text
Tumblr media
Space: Above and Beyond (SABB)

The SA-43 Endo/Exo-Atmospheric Attack Jet (“Hammerhead”) is the mainstay of the Marine Corps. Their SCRAM engines enable them to fly in an atmosphere and in the almost complete vacuum of space. Following the modular design approach of late-20th-century aircraft, Hammerheads can be adapted for normal combat, search and rescue, and possibly other missions. The canopies of the craft are also detachable, allowing docking with space platforms. Systems known to exist in the Hammerheads include LIDAR (Laser Infrared Detection And Ranging), HUD (Heads-Up Display), and ODP (Optical Disk Playback).

The 58th Squadron, also known as the “Wild Cards,” is a United States Marine Corps Space Aviator Cavalry squadron assigned to the 5th Wing. The 58th was formed in 2063 from recruits who graduated from the United States Marine Corps Space Aviator Recruitment Depot in Loxley, Alabama. The newly formed 58th was vital in delaying a Chig fleet from attacking Earth until reinforcements arrived from Groombridge-34. The 58th Squadron, assigned to the carrier USS SARATOGA, was at the forefront of several major battles in the Chig War.

* Trivia: Full-scale models of the “Hammerhead” fighters used in the series were created in Australia at RAAF Base Williamtown. An unverified report stated that, while they were being stored on board the freighter before shipping, crewmen from a Russian freighter were caught taking pictures of them after mistakenly thinking they were a new kind of advanced U.S. tactical fighter.
8 notes · View notes
cyanophore-fiction · 1 year ago
Text
Treading Lightly
Trying out @writeblrsummerfest‘s prompt with the haunted house theme! Sounds like fun, and I like the idea of having AI characters encountering the supernatural, I haven’t tried that concept before. 
(Note: for the purposes of these characters, anything in [brackets] instead of quotation marks indicates dialogue transmitted silently via electronic communication instead of spoken aloud.)
Under Pala’s cloak, the night made Coyote almost invisible. Its silhouette was perfectly black, and if it kept away from streetlights, it appeared only as a shadow slightly darker than its surroundings. It would be the same on infrared and radar—a splotch of unreflective nothingness, soaking up every stray photon. 
 Without the sun dumping heat into the cloak, there wasn’t much to worry about, but Coyote kept an eye on Pala’s temperature monitor anyway. It was a cool night, and the little drone was comfortable, its heat sinks barely warm. Its cluster of red eyes swiveled independently as they tracked motion in the dark: rabbits and squirrels hopping through the undergrowth, the occasional bat overhead.
Through the cable that connected them, Coyote felt the echoes of Pala’s mind. Each time it found an animal, it took a few seconds to pepper the creature with lidar pulses, building up a three-dimensional model to add to a growing wildlife database. Sometimes it took scans of the trees, bird nests, or pinecones. Its motive was simple curiosity; the data would have no tactical value.  
Coyote smiled. It had to remind itself that up until now, Pala’s only experience of nature had been the Mojave desert. Time and luck permitting, Coyote wanted to let its companion absorb as much as possible, so it had taken over the task of navigation. 
The place would be about a quarter mile up the road, if Coyote reckoned things correctly. It had done the calculations a few times over and cross-referenced them against its stolen paper map to be sure, but there was only so much precision it could count on with the satellite network turned against it. It had been weeks since the last orbital sensor sweep, but even so, Coyote didn’t dare try to connect to GPS. PRIONODE would be too clever to miss it.
[Hey. Is that it?] Pala said, all its eyes swiveling to focus on a spot just off the road. Coyote stopped, turned, and peered into the darkness. The place had come up so much sooner than expected that it had almost missed the turnoff. 
There, past a hedge of uncut grass, thistles, and overgrown gardenia bushes, was 312 Lemon Tree Lane. The old house was built on an acre of land surrounded by a solid wall of pine forest, abandoned for so long that stray saplings were beginning to invade the front yard. Wooden planks, sagging with age, barely held the front porch together. Coyote crouched, nodding to Pala, and together they painted the building with active sensor pulses. 
[Can’t get reliable returns through the windows,] said Pala. [Might as well be opaque.]
[Okay, so the interior’s a question mark until we get in there and look,] said Coyote. [Place is on the verge of collapse, too. One good windstorm and it’s coming down.]
[Did the records say anything about who owns it?]
[At this point? The county, maybe. Last inhabitants left over a decade ago. That’s about it. Anything on passives?]
[I’ve got…] said Pala, trailing off. It unfurled a set of antennae from its back, extending them through the boundary of the cloak, and waited for a few moments. [Yeah. There’s infrared and microwave-band emissions coming off the house, but—I can’t parse it out. Natural source, maybe?]
Along Coyote’s head, its sensory fins laid flat. [Where?]
[There’s not a specific origin point that I can see.]
[Okay,] said Coyote, standing up. [Here’s how we’ll play this. I want you to check the property. Look for a storage shed, basement entrance, or any derelict vehicles or appliances. Anything that runs on gas and has an alternator, we can pull a charge from. Sometimes old places like this will have emergency generators, that’s the best case scenario. If you find anything like that, tell me. Don’t go inside the house unless I say. Clear?]
[Got it,] said Pala. It began withdrawing its cloak, and Coyote felt hundreds of microbots skittering along its armor back to Pala’s chassis. [What are you going to do?]
[I’m going inside,] said Coyote. [I’ll check the interior, room by room.]
[You’re worried there’ll be someone in there?]
[Possibly. Could be homeless humans taking shelter here, kind of like us. Maybe other spirits. We won’t be a welcome sight, so I’ll try not to be seen. Don’t worry, the place is probably empty.]
[Okay. Be safe.]
[You too,] Coyote said. What it didn’t say was that EMD guns were apparently legal in the area, that people tended to be less shy about drawing and firing one, and it wasn’t sure if Pala’s light shielding would hold against a direct hit. Best to have it out of harm’s way.
As it approached the door, it activated the ultrasonics in its claws and sliced through the lock with a quick, silent cut. It turned and watched as Pala scuttled away into the overgrown lawn, resisting the urge to go back and regain sight of it. The little one would be fine on its own for a while.
Stepping through the door, Coyote armed its flechette gun, felt a round slide home into the barrel behind its palm.
3 notes · View notes
videoeditingandcreator · 11 days ago
Text
Drones Videography: 2025
Drone videography in 2025 refers to the use of aerial drones outfitted with cameras to capture exceptionally high-quality, unique video from altitudes and angles unavailable to ground crews. It is gaining importance thanks to its visual aesthetic and its ability to film large or inaccessible areas.
Tumblr media
The technology behind drone videography
Many advanced components and software tools facilitate drone videography. Here’s an outline of the basic technologies:
High-Resolution Cameras: Most camera drones shoot in HD, 4K, or even 8K, with larger sensors and better lenses that make the captured images rich and clear.
Gimbal Stabilization: A 3-axis gimbal supports the camera and smooths its movements, compensating for any motion or shaking of the drone during filming.
GPS and Navigation Systems: Onboard GPS lets drones hold a fixed position, return to specific locations, and fly autonomous routes for capturing shots at precise angles.
Obstacle Detection and Collision Avoidance: Sensors such as ultrasonic, infrared, and LiDAR allow drones to sense and avoid obstacles for safer, more confident flight.
FPV and Real-Time Monitoring: First-person view (FPV) streams in-flight video to the controller or a mobile device, letting the pilot frame shots in mid-air without interruption.
AI and Automated Flight Modes: Many drones include intelligent flight modes (follow-me, orbit, and waypoint) that capture AI-enabled shots without any manual control.
Software Integration and Post-Processing: Drones pair with companion apps that offer live editing and connect to cloud storage, making editing and sharing faster and easier.
These technologies have made drone videography flexible and easy to pick up, making high-quality video production possible in almost every sector.
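The obstacle-avoidance item above boils down to simple time-of-flight arithmetic: an ultrasonic sensor measures how long an echo takes to return, and the distance follows from the speed of sound. A minimal sketch (the 343 m/s figure is the standard speed of sound in air at about 20 °C; the function names and the 2 m clearance threshold are illustrative assumptions, not any flight controller's API):

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at ~20 degrees C

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to an obstacle from an ultrasonic echo's round-trip
    time: the pulse travels out and back, so divide by two."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

def should_avoid(round_trip_s: float, min_clearance_m: float = 2.0) -> bool:
    """Trigger avoidance when the obstacle is inside the clearance margin."""
    return echo_distance_m(round_trip_s) < min_clearance_m

# A 10 ms round trip puts the obstacle ~1.7 m away -- inside the margin.
print(echo_distance_m(0.010))  # 1.715
print(should_avoid(0.010))     # True
```

LiDAR works on the same round-trip principle but with light, so the same formula applies with the speed of light substituted in, which is why LiDAR timing electronics must be so much faster.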
0 notes
spacetimewithstuartgary · 28 days ago
Text
Tumblr media Tumblr media
EarthCARE synergy reveals power of clouds and aerosols
With the initial images from each of the instruments aboard ESA's EarthCARE satellite now in hand, it's time to reveal how these four advanced sensors work in synergy to measure exactly how clouds and aerosols influence the heating and cooling of our atmosphere.
Unveiled today at the International Astronautical Congress in Milan, Italy, these new results clearly highlight how EarthCARE's instruments can take different measurements of clouds and aerosols at the same time. These synergistic measurements promise to yield crucial insights into Earth's delicate energy balance.
The energy balance accounts for the amount of energy Earth receives from the sun and the amount of thermal radiation it emits back out to space. Influenced by numerous factors, including clouds, aerosols and greenhouse gases, this balance is vital for regulating Earth's climate.
While it is known that clouds and aerosols generally help cool the atmosphere, their interactions with incoming and outgoing heat are highly complex and still not fully understood.
Launched in May 2024, EarthCARE—a mission realized through a joint venture between ESA and the Japan Aerospace Exploration Agency, JAXA—has the important task of measuring various aspects of our atmosphere to help us understand how clouds and aerosols reflect incoming solar energy back out to space and how they trap outgoing infrared energy.
ESA's Director of Earth Observation Programs, Simonetta Cheli, said, "Although we are still in the early stages of the mission and busy with the satellite's commissioning phase, the results we present today are truly remarkable.
"Not only do they further confirm that all four instruments and the complex way the data are processed are functioning exceptionally well, but they also highlight the power of their combined measurements. This demonstrates that the mission is on track to achieve its objectives.
"The data, which were captured on 18 September, offer a sweeping view from Central Europe to Sweden. Notably, they reveal the many different signatures of a thunderstorm over northern Italy, near Milan where we are today."
EarthCARE's cloud profiling radar, which was provided by JAXA, shows information on the vertical structure and internal dynamics of clouds, the atmospheric lidar delivers profiles of aerosols and thin clouds as well as cloud-top information, the multispectral imager offers a wide-scene overview in multiple wavelengths, and the broadband radiometer measures reflected solar radiation and outgoing infrared radiation coming from Earth.
The animation above highlights two key features to demonstrate EarthCARE's synergistic capabilities.
While the multispectral imager provides the overall context of the scene throughout, the animation first focuses on a recent thunderstorm over northern Italy and northern Corsica. The storm caused severe flooding in Italy's Emilia Romagna region and was part of the larger convective system associated with Storm Boris, which devastated parts of Central Europe.
At this stage of the animation, the cloud profiling radar delivers most of the data owing to the large particles forming within the thunderclouds. Next, the atmospheric lidar detects a 1–2 km layer at the cloud's uppermost region, revealing crucial details about the ice layer at the cloud top.
The full synergy between the cloud profiling radar and lidar becomes evident when focusing near to the top of the cloud, where both instruments provide complementary data, allowing for more detailed cloud characterization.
Ultimately, EarthCARE's mission is to deliver insights on where clouds and aerosols are either warming or cooling the atmosphere, and these initial synergistic results demonstrate this effectively.
They reveal a strong cooling effect at the top of the thunderstorm due to the high emission of thermal radiation into space. Beneath this cooling layer, the dense cloud absorbs heat radiating from Earth's surface, creating a warming effect.
Secondly, the animation highlights cirrus clouds over Sweden, which are part of a high-altitude ice-cloud formation. These clouds are particularly significant for climate science because, while they appear thin and allow sunlight to pass through, thereby warming Earth's surface, they also trap thermal radiation emitted from Earth's surface, preventing it from escaping into space. This dual effect contributes to an overall warming of the atmosphere.
In contrast to the thunderstorm, the atmospheric lidar provides information over nearly the entire cirrus cloud, between 8 and 13 km in altitude, while the radar primarily focuses on the lower region where larger ice crystals form. However, in the lower two kilometers, both the radar and lidar contribute data, enabling synergistic retrievals over a significant portion of the cirrus cloud.
The overall heating effect of cirrus clouds, particularly in their upper layers, is evident where the clouds absorb both solar radiation from above and, from below, thermal radiation emitted from the Earth's surface. This warming effect is interrupted in areas where the cloud thickens and larger ice particles form, blocking the thermal radiation from the Earth.
In these denser regions, the cloud top cools by emitting thermal radiation into space. Despite these localized cooling effects, cirrus clouds contribute to the overall warming of the atmosphere.
Other features include a low-level aerosol layer, likely linked to pollution-related haze over Germany, and a low-altitude marine cloud over the southern Baltic Sea.
Thorsten Fehr, ESA's EarthCARE Mission Scientist, said, "Having the data available at this early stage is a testament to the outstanding work of the EarthCARE team, particularly the scientists who developed these data products. This highlights EarthCARE's unique capability to simultaneously provide direct measurements of both clouds and aerosols, enabling an unprecedented assessment of their impact on climate."
Hitonori Maejima, Senior Chief Officer on Earth Observation Missions at JAXA, said, "By combining measurements from its four sensors, EarthCARE can capture different types of cloud, aerosols and their function. This is a symbol of the collaboration between ESA and JAXA."
1 note · View note
johngarrison1517 · 30 days ago
Text
NIR USB Cameras for Automotive Safety: Enhancing Day/Night Pedestrian Detection
Tumblr media
Ensuring pedestrian safety has become a critical concern as urban surroundings become increasingly congested. In light of this, NIR USB cameras are becoming indispensable instruments in the automotive sector, greatly enhancing pedestrian detection systems. These cameras improve visibility in low light by utilizing near-infrared technology, which makes them indispensable for driving at night. With an emphasis on their ability to identify pedestrians both during the day and at night, this article will examine the value of NIR USB cameras in improving vehicle safety.
The Role of NIR USB Cameras in Pedestrian Safety
NIR USB cameras play a vital role in modern vehicle safety systems. They use infrared light to capture images, allowing vehicles to detect pedestrians even in challenging lighting conditions. This technology works effectively during nighttime or adverse weather conditions, where traditional cameras may struggle. By incorporating NIR USB cameras into their systems, automotive manufacturers can offer safer driving experiences, reducing accidents and fatalities.
Benefits of NIR USB Camera Technology
Improved Night Vision: The most significant advantage of NIR USB cameras is their enhanced night vision capabilities. Unlike standard cameras that rely on visible light, NIR cameras can illuminate and capture images using infrared light, making pedestrians more visible even in pitch-black conditions. This improvement in visibility is crucial for drivers navigating through dimly lit streets or rural areas.
Enhanced Object Detection: NIR USB cameras provide better contrast and detail when detecting objects, including pedestrians. The ability to discern different shapes and movements can aid advanced driver-assistance systems (ADAS) in recognizing potential hazards well in advance. This capability allows for timely alerts to drivers, significantly enhancing road safety.
Integration with Other Technologies: NIR USB cameras can seamlessly integrate with other automotive safety systems, such as LIDAR and radar. This multi-sensory approach enhances the overall effectiveness of pedestrian detection systems. By combining data from different sources, vehicles can achieve a more comprehensive understanding of their surroundings, further improving safety.
NIR USB Cameras in Day/Night Detection Systems
The unique capabilities of NIR USB cameras make them ideal for use in day/night detection systems. These systems are designed to provide continuous monitoring and ensure pedestrian safety regardless of the time of day. Here’s how NIR USB cameras contribute to these systems:
Adaptive Imaging Technology: NIR USB cameras can automatically adjust their imaging techniques based on ambient light conditions. During the day, they function effectively in standard visible light, while at night, they switch to infrared mode. This adaptability ensures consistent performance throughout different lighting conditions, providing comprehensive coverage for pedestrian detection.
Reducing False Positives: One of the challenges in pedestrian detection is minimizing false positives. NIR USB cameras help mitigate this issue by providing more accurate images of pedestrians and other objects. Their ability to distinguish between various heat signatures and reflective surfaces contributes to improved reliability in detection systems.
Real-time Data Processing: With advancements in processing technology, NIR USB cameras can analyze visual data in real-time. This capability allows for immediate feedback to the driver, enabling prompt action to avoid potential collisions with pedestrians. By enhancing response times, NIR cameras significantly contribute to overall automotive safety.
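The adaptive day/night behaviour described above reduces to a threshold rule with hysteresis, so the camera doesn't flap between modes at dusk. A minimal sketch (the lux threshold, hysteresis margin, and class interface are all assumptions for illustration, not any camera vendor's API):

```python
from dataclasses import dataclass

@dataclass
class DayNightSwitcher:
    """Switch between visible and NIR imaging based on ambient light.
    Hysteresis prevents rapid mode flapping around the threshold."""
    night_threshold_lux: float = 10.0  # assumed: below this, switch to NIR
    hysteresis_lux: float = 5.0        # assumed margin before switching back
    mode: str = "visible"

    def update(self, ambient_lux: float) -> str:
        if self.mode == "visible" and ambient_lux < self.night_threshold_lux:
            self.mode = "nir"
        elif self.mode == "nir" and ambient_lux > self.night_threshold_lux + self.hysteresis_lux:
            self.mode = "visible"
        return self.mode

cam = DayNightSwitcher()
print(cam.update(200.0))  # visible (daylight)
print(cam.update(8.0))    # nir (dusk)
print(cam.update(12.0))   # nir (inside the hysteresis band, no flapping)
print(cam.update(40.0))   # visible again
```

The hysteresis band is the design choice that matters here: without it, light levels hovering near the threshold would toggle the imaging mode every frame.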
The Future of NIR USB Cameras in Automotive Applications
The integration of NIR USB cameras into automotive systems is just the beginning. As technology advances, we can expect further innovations that will enhance their capabilities and applications:
AI and Machine Learning Integration: Future NIR USB cameras may leverage artificial intelligence and machine learning algorithms to improve pedestrian detection accuracy. By analyzing vast amounts of data, these systems can learn from their environment and adapt to varying conditions, ensuring heightened safety.
Cost-Effective Solutions: As the technology behind NIR USB cameras continues to evolve, production costs are likely to decrease. This trend will facilitate wider adoption across various vehicle types, from personal cars to commercial fleets, making pedestrian detection systems more accessible and affordable.
Enhanced Connectivity: The rise of connected vehicles means that NIR USB cameras can be integrated into a broader network of automotive technologies. By sharing data with other vehicles and infrastructure, NIR cameras can contribute to smarter transportation systems that enhance pedestrian safety on a larger scale.
Exploring Other Safety Technologies in Automotive Industry
Numerous technologies, in addition to NIR USB cameras, are improving pedestrian safety in the automotive sector. Pairing cutting-edge sensor technologies with advanced driver-assistance systems (ADAS), such as automated emergency braking and lane-keeping assistance, makes driving safer.
The combination of these technologies with NIR USB cameras will be vital in determining the direction of automobile safety as manufacturers continue to create more intelligent pedestrian detection solutions.
Subscribing to our newsletter will provide you with updates on the most recent developments in automobile safety technology. Visit our website to read in-depth articles about NIR USB cameras and the several businesses that use them. By working together, we can improve automotive safety and innovation!
0 notes
lorindisuga · 1 month ago
Text
Understanding Short Pass Filters: Applications and Benefits
A short pass filter is an optical filter designed to transmit wavelengths of light below a specific cutoff while blocking longer wavelengths. These filters are used in a variety of scientific, industrial, and imaging applications, particularly where precise control over light is required. In this article, we'll explore the working principle of short pass filters, their applications, and the key benefits they provide.
Working Principle of Short Pass Filters
Short pass filters are typically made using dielectric coatings applied to optical glass. These coatings allow light below a certain wavelength to pass through while reflecting or absorbing longer wavelengths. The cutoff wavelength is a key characteristic of the filter, determining the boundary between transmitted and blocked light.
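The working principle above can be modelled as an idealized transmission curve: near-total transmission below the cutoff wavelength and near-total blocking above it. A simplified sketch (real dielectric filters roll off over a finite transition band, and the 90%/1% transmission figures here are illustrative assumptions, not any manufacturer's specification):

```python
def shortpass_transmission(wavelength_nm: float, cutoff_nm: float = 700.0) -> float:
    """Idealized short pass filter: transmit below the cutoff, block above.
    Real coatings have a finite transition slope around the cutoff."""
    return 0.90 if wavelength_nm < cutoff_nm else 0.01

# A 700 nm short pass filter passes visible light and blocks near-infrared:
for wl in (450, 550, 650, 850, 1064):
    print(wl, shortpass_transmission(wl))
# 450 0.9 / 550 0.9 / 650 0.9 / 850 0.01 / 1064 0.01
```

This is exactly the behaviour exploited in the photography and fluorescence applications below: choose the cutoff so the wanted band falls on the transmitting side and the contaminating band on the blocking side.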
Applications of Short Pass Filters
Imaging and Photography: In photography, short pass filters can be used to reduce infrared light contamination, enhancing the quality of images.
Fluorescence Microscopy: In fluorescence applications, short pass filters are used to block specific wavelengths of excitation light, allowing only the emitted fluorescence to reach the detector.
Spectroscopy: Scientists use short pass filters in spectroscopy to isolate specific wavelengths of interest, improving the accuracy of measurements.
LIDAR Systems: Short pass filters play a role in LIDAR systems by allowing only the shorter laser wavelengths to pass, increasing detection accuracy.
Benefits of Short Pass Filters
Precision Control: They provide precise control over transmitted and blocked wavelengths, ensuring optimal performance in optical systems.
Improved Image Quality: By filtering out unwanted light, these filters enhance image clarity and contrast.
Durability: Coatings on short pass filters are typically designed to withstand challenging environmental conditions, making them reliable in industrial settings.
In summary, short pass filters are essential components in many optical systems where selective wavelength transmission is required. Their role in imaging, microscopy, and various technological applications makes them an indispensable tool for improving light management and system performance.
0 notes
iwonderwh0 · 6 months ago
Note
Can androids read CD's just by looking at them?
Ooh, that's an interesting ask
No, I don't think so. CD readers use focused lasers that detect microscopic changes in the surface of the CD and are optimised to detect changes in reflectivity. I do headcanon that androids have LiDAR lasers in their eyes, which also emit infrared light (or really close to infrared), but LiDARs are built for long-range distance measurement and are optimised for depth sensing, not changes in reflectivity. They operate at a different wavelength and with a different level of precision, which wouldn't be enough to read a disc the way CD readers can. Or that's my understanding of it anyway. Sure would be kinda funny if that'd be something androids could do. On another thought, if an android could interface with a CD reader, technically that could count as them reading CDs by "looking" at them lmao, just not through their default eyes, but through the separate sensors in the outer reader they interfaced with.
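For anyone who wants the intuition in numbers: the smallest spot a lens can focus light to is roughly d ≈ λ / (2·NA). A CD pickup (780 nm laser, numerical aperture about 0.45, both standard values for the CD format) focuses to under a micron, which is why it can resolve the disc's sub-micron pits; a lidar beam, collimated for range rather than focused on a disc, is vastly wider (the 3 mm beam width below is just an illustrative assumption):

```python
def diffraction_spot_um(wavelength_nm: float, numerical_aperture: float) -> float:
    """Approximate diffraction-limited focused spot diameter in microns."""
    return (wavelength_nm / (2.0 * numerical_aperture)) / 1000.0

cd_spot = diffraction_spot_um(780, 0.45)  # ~0.87 um, comparable to CD pit sizes
print(f"CD pickup spot: {cd_spot:.2f} um")

# An eye-mounted lidar beam would be millimetres wide at the disc surface
# (assumed illustrative value), far too coarse to resolve individual pits:
lidar_spot_um = 3000.0
print(f"beam-width ratio: {lidar_spot_um / cd_spot:.0f}x too coarse")
```

So even granting infrared lasers in the eyes, the optics are specialised for the wrong job, which matches the post's conclusion.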
If some engineer is reading this, comment if I got this right.
4 notes · View notes
Text
Tumblr media
Scientists suggest new methods to expedite the commercialization of metalens technology
Metalenses, nano-artificial structures capable of manipulating light, offer a technology that can significantly reduce the size and thickness of traditional optical components. Particularly effective in the near-infrared region, this technology holds great promise for applications such as LiDAR (often called the "eyes of the self-driving car"), miniature drones, and blood vessel detectors. Despite its potential, current fabrication requires tens of millions of Korean won to produce a metalens the size of a fingernail, posing a challenge for commercialization. Fortunately, a recent breakthrough promises to reduce the production cost to one thousandth of that price. A collaborative research team (the POSCO-POSTECH-RIST Convergence Research Team), comprising Professor Junsuk Rho of the Department of Mechanical Engineering and the Department of Chemical Engineering at Pohang University of Science and Technology (POSTECH) and others, has proposed two innovative methods for mass-producing metalenses and manufacturing them on large surfaces. Their research was featured in Laser & Photonics Reviews.
Read more.
11 notes · View notes
dh5ryxhgbctgr · 1 month ago
Text
Stand Guidance System Market Dynamics and Future Growth Pathways 2024 - 2032
The Stand Guidance System (SGS) market is an essential segment of the global technology landscape, enabling efficiency and precision across various industries. This article delves into the current state of the SGS market, its applications, trends, and future prospects.
Tumblr media
Overview of Stand Guidance Systems
The Stand Guidance System market is on a growth trajectory, fueled by advancements in technology and increasing demand across various sectors. Stand Guidance Systems are designed to assist users in navigating complex environments, whether in manufacturing, logistics, or healthcare. These systems use a combination of sensors, software, and data analytics to enhance operational efficiency and safety.
Key Components of Stand Guidance Systems
Sensors: These include LIDAR, ultrasonic, and infrared sensors that detect obstacles and assist in navigation.
Software: Advanced algorithms process data collected from sensors to provide real-time guidance.
User Interfaces: Displays and alerts that communicate necessary information to users.
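A minimal sketch of how the three components above fit together, with sensor readings feeding a guidance decision that is surfaced through the user interface (all class, field, and message names here are illustrative, not any real SGS product's API):

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    source: str  # "lidar", "ultrasonic", or "infrared"
    obstacle_range_m: float

def guidance_alert(readings: list[SensorReading], stop_range_m: float = 1.5) -> str:
    """Fuse readings conservatively: act on the nearest reported obstacle.
    The returned string stands in for the user-interface alert layer."""
    if not readings:
        return "CLEAR: no sensor data"
    nearest = min(readings, key=lambda r: r.obstacle_range_m)
    if nearest.obstacle_range_m < stop_range_m:
        return f"STOP: {nearest.source} obstacle at {nearest.obstacle_range_m:.1f} m"
    return f"PROCEED: nearest obstacle {nearest.obstacle_range_m:.1f} m"

readings = [
    SensorReading("lidar", 4.2),
    SensorReading("ultrasonic", 0.9),
    SensorReading("infrared", 3.1),
]
print(guidance_alert(readings))  # STOP: ultrasonic obstacle at 0.9 m
```

Taking the minimum across sensors is the conservative fusion choice: a single sensor reporting a close obstacle is enough to halt, which is the behaviour safety regulations in these settings generally favour.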
Market Dynamics
Drivers of Growth
Increased Automation: The push towards automation in various sectors is a primary driver of SGS adoption.
Safety Regulations: Stringent safety standards in industries like manufacturing and healthcare necessitate advanced guidance systems.
Technological Advancements: Innovations in AI and machine learning enhance the capabilities of SGS, making them more attractive to businesses.
Challenges Facing the Market
High Initial Costs: The investment required for implementing SGS can be substantial, especially for small to medium enterprises.
Integration Issues: Compatibility with existing systems can pose challenges, slowing down adoption rates.
Skill Gap: The need for skilled personnel to manage and operate advanced SGS technologies is a barrier for many organizations.
Applications of Stand Guidance Systems
Manufacturing Sector
In manufacturing, SGS plays a critical role in optimizing workflows and ensuring safety on the factory floor. Automated guided vehicles (AGVs) equipped with guidance systems can transport materials efficiently, reducing downtime and human error.
Logistics and Warehousing
SGS is crucial in logistics and warehousing, where it facilitates the accurate placement and retrieval of goods. These systems streamline operations, reduce inventory discrepancies, and enhance overall productivity.
Healthcare Applications
In healthcare settings, SGS aids in the navigation of complex environments, particularly in hospitals. It ensures that medical personnel can transport equipment and supplies efficiently, ultimately improving patient care.
Market Trends
Growing Demand for AI-Driven Solutions
The integration of artificial intelligence in SGS is a significant trend, enabling systems to learn from their environment and improve over time. This adaptability increases efficiency and reduces the need for constant human oversight.
Rise of Robotics
The proliferation of robotics in various industries has heightened the demand for advanced SGS. As robotic systems become more prevalent, the need for precise guidance and navigation systems grows.
Sustainability Focus
There is an increasing emphasis on sustainability in the SGS market, with systems designed to reduce energy consumption and waste. Companies are seeking solutions that not only enhance efficiency but also align with their sustainability goals.
Future Outlook
Market Projections
The SGS market is expected to witness robust growth over the next decade, driven by the increasing adoption of automation and technological advancements. Industry reports forecast a compound annual growth rate (CAGR) of over 10% in the coming years.
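To make the projection concrete, compound annual growth is simple to work out: a market growing at a constant CAGR multiplies by (1 + rate) each year. The starting market size below is an invented placeholder, not a figure from the article; only the 10% rate comes from the forecast.

```python
def project_market(current_size, cagr, years):
    """Project a market size forward under constant compound annual growth."""
    return current_size * (1 + cagr) ** years

# Assumption for illustration only: a 1.0 B USD market growing at 10% CAGR.
size_2024 = 1.0
for year in (2027, 2030, 2032):
    print(f"{year}: {project_market(size_2024, 0.10, year - 2024):.2f} B USD")
```

At 10% CAGR a market roughly doubles in about 7 years, so the 2024-2032 window implies slightly more than a doubling.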
Innovations on the Horizon
Emerging technologies, such as augmented reality (AR) and the Internet of Things (IoT), are poised to revolutionize the SGS landscape. These innovations will provide users with real-time data and insights, further enhancing the effectiveness of guidance systems.
Conclusion
As industries continue to embrace automation and prioritize safety, SGS will play a vital role in shaping the future of operational efficiency. Companies looking to stay competitive should consider investing in these systems to streamline their processes and enhance productivity.
ramautomations123 · 1 month ago
Text
Future Marine Automation and Emerging Technologies to Watch
A look at the technologies set to reshape marine automation in the years ahead
Automation is set to transform the maritime industry even further. With goods, commerce, and ships moving worldwide in ever-greater volumes, the industry needs better and safer marine systems. RAM Automations is playing a leading role in this transformation, offering up-to-date marine automation products from the world's top brands to help your business lead the change. Here is what to expect from marine automation and the technologies set to rise in 2024.
Autonomous Vessels: The Future of Shipping. Whether for a large enterprise or a small business, shipping is vital to success.
One of the most significant topics in marine automation is that of fully autonomous vessels. These seaborne vessels are developed to travel independently with little or no interference from human beings, using devices such as sensors, navigation systems, and artificial intelligence.
Companies such as Rolls-Royce and Wärtsilä already use integrated automation systems that can command different functions of a vessel, including navigation and engine controls. These systems increase productivity and minimize human error, making navigation at sea safer and more reliable.
• Rolls-Royce's Intelligent Awareness System: This system combines radar, LIDAR, infrared vision, and high-definition cameras to build a 360-degree view of a vessel's surroundings, improving navigational safety, especially in congested or adverse conditions.
• Wärtsilä's Smart Marine Ecosystem: Part of Wärtsilä's intelligent marine vision is the integration of autonomous vessels that communicate and coordinate with other vessels and shore-based control centers for efficient navigation and minimal fuel consumption.
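When several sensors (radar, LIDAR, cameras) estimate the range to the same target, a standard simple way to combine them is a confidence-weighted average. This is a generic textbook rule, not any vendor's actual method, and every number below is invented.

```python
def fuse_estimates(estimates):
    """Confidence-weighted average of range estimates (in metres) for the
    same target reported by several sensors."""
    total_weight = sum(conf for _, conf in estimates)
    return sum(rng * conf for rng, conf in estimates) / total_weight

# Hypothetical readings: (range_m, confidence) from radar, LIDAR, and a camera.
target_range = fuse_estimates([(105.0, 0.9), (102.0, 0.8), (110.0, 0.3)])
print(f"fused range: {target_range:.1f} m")
```

The low-confidence camera estimate pulls the result only slightly; production systems replace the scalar confidences with full covariance-based filters such as a Kalman filter.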
Artificial Intelligence and Machine Learning
Big data, artificial intelligence, and machine learning are changing how the maritime industry uses data. These technologies allow vessels to learn, improve performance, and even predict when equipment will require maintenance, avoiding downtime.
ABB and Kongsberg are prime examples of companies leveraging Artificial Intelligence's power to optimize and safeguard marine operations.
• ABB's OCTOPUS Marine Software: This software uses artificial intelligence to optimize vessel performance based on data from onboard sensors, saving route time and fuel and extending the lifespan of equipment.
• Kongsberg's AI-Powered Vessel Insight: Vessel Insight is a centralized system that collects data from the various systems onboard a vessel to reduce operational costs and enhance efficiency. Its analytics also make it easier to predict when parts will need to be replaced, avoiding system faults.
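The predictive-maintenance idea above boils down to watching a condition signal for a sustained excursion past an alarm level. Below is a deliberately simple sketch, not any vendor's actual method: a rolling mean of a machine vibration signal compared against an alarm limit (7.1 mm/s is a commonly cited vibration-severity boundary from the ISO 10816 family of standards; the sample data is invented).

```python
def flag_maintenance(vibration_mm_s, window=3, limit=7.1):
    """Return the index of the first sample where the rolling mean of the
    vibration signal exceeds the alarm limit, or None if it never does.
    Averaging over a window avoids alarming on a single noisy spike."""
    for i in range(window - 1, len(vibration_mm_s)):
        avg = sum(vibration_mm_s[i - window + 1 : i + 1]) / window
        if avg > limit:
            return i
    return None

# Hypothetical daily vibration readings (mm/s) trending upward.
samples = [2.1, 2.3, 2.2, 4.8, 6.9, 7.6, 8.2]
print(flag_maintenance(samples))  # 6
```

Real systems learn per-machine baselines from fleet data rather than using one fixed limit, but the structure (smooth, then threshold) is the same.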
Sustainable Technologies: The Green Revolution of Shipping
Environmental regulation has tightened over the years, pushing the maritime industry to adopt sustainable technologies that minimize carbon emissions. Automation is instrumental in this green revolution because it enables efficient energy consumption and thus cuts emissions.
Leading manufacturers such as Siemens and Schneider Electric are already launching automation systems that lower energy use to help adhere to set environmental laws for vessels.
• Siemens' BlueDrive PlusC System: This hybrid powertrain combines electric propulsion, batteries, and energy-efficient power management for better fuel economy and lower emissions. Its automation enables a smooth changeover from one power source to another for maximum efficiency.
• Schneider Electric's EcoStruxure for Marine: With Schneider Electric's EcoStruxure, vessels can monitor and control energy consumption and emissions in real time. The system's automation features ensure that energy is used appropriately, supporting sustainable maritime transport.
Cybersecurity: Protecting the Connected Vessel
Cybersecurity has become an essential issue in the shipping industry in the age of connected ships. Autonomous systems can be hacked, risking safety breaches and operational shutdowns.
Brands such as Cisco and Palo Alto Networks are now developing cybersecurity solutions tailored to the marine industry.
• Cisco's Cyber Vision: This cybersecurity platform protects industrial networks, including those on vessels. It provides live network-traffic analysis and flags suspicious activity before it can cause harm.
• Palo Alto Networks' Next-Generation Firewalls: These firewalls protect connected vessels and safeguard onboard systems from cybersecurity threats.
Why Choose RAM Automations?
At RAM Automations, we are fully committed to supplying our clients with the best marine automation technology. Our catalog includes products from the world's leading manufacturers, such as Rolls-Royce, Wärtsilä, ABB, Siemens, and Kongsberg, to guide you through the complex and ever-growing maritime market, and our team of experts helps clients find the right solutions for their business.
Conclusion
The future of marine automation is bright: emerging technologies will significantly change how vessels operate at sea. Whether through autonomous shipping, intelligent solutions, or environmentally friendly initiatives, the global sea-transport sector is undergoing considerable change. Thanks for visiting RAM Automations, your go-to solution provider.