#ai at the edge
jcmarchi · 2 days ago
Text
NTT Unveils Breakthrough AI Inference Chip for Real-Time 4K Video Processing at the Edge
New Post has been published on https://thedigitalinsider.com/ntt-unveils-breakthrough-ai-inference-chip-for-real-time-4k-video-processing-at-the-edge/
In a major leap for edge AI processing, NTT Corporation has announced a groundbreaking AI inference chip that can process real-time 4K video at 30 frames per second—using less than 20 watts of power. This new large-scale integration (LSI) chip is the first in the world to achieve such high-performance AI video inferencing in power-constrained environments, making it a breakthrough for edge computing applications.
Revealed during NTT’s Upgrade 2025 summit in San Francisco, the chip is designed specifically for deployment in edge devices—hardware located physically close to the source of data, like drones, smart cameras, and sensors. Unlike traditional AI systems that rely on cloud computing for inferencing, this chip brings powerful AI capabilities directly to the edge, drastically reducing latency and eliminating the need to transmit ultra-high-definition video to centralized cloud servers for analysis.
Edge Computing vs. Cloud Computing: Why It Matters
In traditional cloud computing, data from devices like drones or cameras is sent to remote data centers—often located hundreds or thousands of miles away—where it’s processed and analyzed. While this approach offers virtually unlimited compute power, it introduces delays due to data transmission, which is problematic for real-time applications like autonomous navigation, security monitoring, and live decision-making.
By contrast, edge computing processes data locally, on or near the device itself. This reduces latency, preserves bandwidth, and enables real-time insights even in environments with limited or intermittent internet connectivity. It also enhances privacy and data security by minimizing the need to transmit sensitive data over public networks.
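For a rough sense of the bandwidth at stake, consider a back-of-envelope sketch in Python; the compression and uplink figures are illustrative assumptions, not NTT's published numbers:

```python
# Back-of-envelope numbers behind the edge argument (assumed figures inline).
width, height, bytes_per_px, fps = 3840, 2160, 3, 30

raw_bps = width * height * bytes_per_px * fps * 8   # ~5.97 Gbit/s uncompressed 4K30
h264_bps = 25e6                                     # assume ~25 Mbit/s compressed 4K30
uplink_bps = 10e6                                   # assume a 10 Mbit/s rural uplink

print(f"raw: {raw_bps / 1e9:.2f} Gbit/s, compressed: {h264_bps / 1e6:.0f} Mbit/s")
# Even compressed, the stream can exceed a rural uplink (25 > 10 Mbit/s),
# so cloud-side inferencing would mean degrading the video before analysis,
# on top of the round-trip latency described above.
```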
NTT’s new AI chip fully embraces this edge-first philosophy—delivering real-time 4K video analysis directly within the device, without relying on the cloud.
A New Era for Real-Time AI on Drones and Devices
With this chip installed, a drone can detect people or objects from up to 150 meters (492 feet)—the legal altitude limit for drones in Japan. That’s a dramatic improvement over traditional real-time AI systems, which are generally limited to a 30-meter range due to lower resolution or processing speed.
This advancement enables a host of new use cases, including:
Infrastructure inspections in hard-to-reach places
Disaster response in areas with limited connectivity
Agricultural monitoring across wide fields
Security and surveillance without constant cloud uplinks
All of this is achieved with a chip that consumes less than 20 watts—dramatically lower than the hundreds of watts required by GPU-powered AI servers, which are impractical for mobile or battery-powered systems.
Inside the Chip: NTT’s Proprietary AI Inference Engine
The LSI’s performance hinges on NTT’s custom-built AI inference engine, which ensures high-speed, accurate results while minimizing power use. Key innovations include:
Interframe correlation: By comparing sequential video frames, the chip reduces redundant calculations, improving efficiency.
Dynamic bit-precision control: This technique adjusts the numerical precision required on the fly, using fewer bits for simpler tasks, conserving energy without compromising accuracy.
Native YOLOv3 execution: The chip supports direct execution of You Only Look Once v3, one of the fastest real-time object detection algorithms in machine learning.
These combined features allow the chip to deliver robust AI performance in environments previously considered too power- or bandwidth-limited for advanced inferencing.
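NTT has not published the engine's internals beyond the description above, but the interframe-correlation idea is easy to sketch: re-run the expensive detector only on regions that changed since the previous frame, and reuse cached results elsewhere. In the following sketch, the tile size, threshold, and function names are all illustrative assumptions:

```python
import numpy as np

# Illustrative threshold: mean absolute pixel difference that marks a tile "changed".
DIFF_THRESHOLD = 4.0

def changed_tiles(prev_frame: np.ndarray, frame: np.ndarray, tile: int = 64):
    """Yield (y, x) corners of tiles whose content moved since the last frame."""
    h, w = frame.shape[:2]
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            a = prev_frame[y:y + tile, x:x + tile].astype(np.int16)
            b = frame[y:y + tile, x:x + tile].astype(np.int16)
            if np.abs(a - b).mean() > DIFF_THRESHOLD:
                yield y, x

def process_frame(prev_frame, frame, detect, cached_results, tile: int = 64):
    # Re-run the detector only where the scene changed; static tiles keep their
    # previous detections, which is where the compute (and power) savings come from.
    for y, x in changed_tiles(prev_frame, frame, tile):
        cached_results[(y, x)] = detect(frame[y:y + tile, x:x + tile])
    return cached_results
```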
Path to Commercialization and the IOWN Vision
NTT plans to commercialize the chip within fiscal year 2025 through its operating company, NTT Innovative Devices Corporation.
Researchers are already exploring its integration into the Innovative Optical and Wireless Network (IOWN)—NTT’s next-generation infrastructure vision aimed at overhauling the digital backbone of modern society. Within IOWN’s Data-Centric Infrastructure (DCI), the chip would take advantage of the All-Photonics Network for ultra-low latency, high-speed communication, complementing the local processing power it brings to edge devices.
NTT is also collaborating with NTT DATA, Inc. to combine the chip’s capabilities with its Attribute-Based Encryption (ABE) technology, which enables secure, fine-grained access control over sensitive data. Together, these technologies will support AI applications that require both speed and security—such as in healthcare, smart cities, and autonomous systems.
A Legacy of Innovation and a Vision for the Future
This AI inference chip is the latest demonstration of NTT’s mission to empower a sustainable, intelligent society through deep technological innovation. As a global leader with over $92 billion in revenue, 330,000 employees, and $3.6 billion in annual R&D, NTT serves more than 75% of Fortune Global 100 companies and millions of consumers across 190 countries.
Whether it’s drones flying beyond the visual line of sight, cameras detecting events in real-time without cloud dependency, or securing data flows with attribute-based encryption, NTT’s new chip sets the stage for the next frontier in AI at the edge—where intelligence meets immediacy.
luwha · 1 month ago
Text
Not telling y'all that you should be able to identify AI slop (but it is a valuable skill, you totes should), but if you're going to accuse artists of using AI left and right, at least go and do your homework, or at the very least do the bare minimum and use AI identification tools like Hive Moderation, so you 1- don't ruin someone's livelihood and 2- don't make a clown out of yourself, maybe.
Like, I get it, AI slop and "AI artists" pretending to be genuine are getting harder and harder to identify, but just accusing someone out of the blue and calling it a day doesn't make it any better.
[two images]
The AI clowns have shifted to styles with fewer "tells," and the AI art is becoming better. Yeah, it sucks ass.
They're also integrating it into memes (those knights with pink backgrounds, some cool frog with a funny one-liner) so you chuckle, share, and get used to their aesthetic.
[image]
This is art from the upcoming Final Fantasy set for MtG, and this is someone on Reddit accusing the artist of using AI. From what I can tell, and I fucking hate AI, there is NO AI used in this image.
As far as I can tell, and as far as any tool I've used shows, the artist didn't use AI. Which leads to the next one:
[image]
They accused the artist of this one of using AI. The artist's name is Nestor Ossandon.
[image]
He has already been FALSELY ACCUSED of using AI before, because he drew a HAND THAT LOOKED A LITTLE WEIRD, which prompted a statement from D&D Beyond confirming that no AI had been used.
Not to repeat myself, but they're accusing the art above, which is by Nestor, of using AI.
REAL artists are not machines. And just like the AI slop, we are not perfect and we make mistakes. The hands we draw have wonky fingers sometimes. The folds we draw are weird. But we are REAL. We are real people. And hey, some of our "mistakes" are sometimes CHOICES. Artistic choices are a thing, yo.
I know it's getting hard to identify AI, but if you're going to accuse someone of using it, come on. At least do your due diligence.
bi-gwen-stacy · 7 months ago
Text
[two images]
Haven't been the same since reading this comment
pain-tool-sai · 2 months ago
Text
[image]
🌸️ (watercolor on cotton rag paper)
nox-in-a-box · 29 days ago
Text
I was gonna do some serious sketches but got silly with them as usual...
[four images]
local-dragon-haunt · 9 months ago
Note
hey! i’m an artist and i was wondering what about the httyd crossover art made it obviously AI. i’m trying to get better at recognizing AI versus real art and i totally would have just not clocked that.
Hey! It's TOTALLY okay to not have recognized it, because I DIDN'T AT FIRST, EITHER. Unfortunately there's no real foolproof way to distinguish real art from the fake stuff. However, I have noticed some general rules of thumb while browsing these last few months.
[image]
So this is the AI generated image I used as inspiration. I will not be tagging the account that posted it because I do not condone bullying of any type, but it’s important to mention that this was part of a set of images:
[two images]
This is important because one of the BIGGEST things you can use to your advantage is context clues. This is what clued me in: right off the bat, we can see that there is NO consistency between these three images. The art style and outfits change with every generated image. They're vaguely related (i.e., characters that resemble the Big Four are on some sort of adventure?) and that's about it. Going to the account in question proved that all they posted were AI generated images, all of which have many red flags, but for clarity's sake we'll stick with the one I used.
[image]
The first thing that caught my eye was this???? Amorphous Blob in the background. Which is obviously supposed to be knights or a dragon or something.
Again, context clues come into play here. Artists will draw everything With A Purpose. And if what they're drawing is fanart, you are going to recognize most of what you see in the image. Even if there are mistakes.
In the context of this image, it looks like the Four are supposed to be running from these people. The thing that drew my attention to it was the fact that I Didn't Recognize The Villains, and this is because there is nothing to recognize. These shapes aren't Drago, or Grimmel, or Pitch, or any other villain we usually associate with ROTBTD. They're just Amorphous Blobs that are vaguely villain shaped.
Which brings me to my second point:
[image]
Do you see the way they're standing? There is no purpose to this. It throws the entire image off. Your eye is drawn to the Amorphous Villain Blobs in the background, and these characters are not reacting to them one bit.
Now I'm not saying that all images have to have a story behind them, but if this were created by a person, it clearly would have had one. Our group here is not telling a story, they are posing.
This is because the AI does not see the image as a whole, but as two separate components: the setting, and the description of the characters that the prompter dictates, i.e. "Merida from Brave, Jack Frost from ROTG, Rapunzel from Tangled, and Hiccup from HTTYD standing next to each other"
Now obviously the most pressing part of this prompt are the characters themselves. So the AI prioritizes that and tries to spit out something that WE recognize as "Merida from Brave, Jack Frost from ROTG, Rapunzel from Tangled, and Hiccup from HTTYD standing next to each other".
This, more often than not, is going to end up with this stagnant posing. Because AI cannot create, it can only emulate. And even then, it still can't do it right. Case in point:
[three images]
This is not Hiccup. The AI totally thinks this is Eugene Fitzherbert. Look at the pose. The facial structure. The goatee. The smirk. The outfits. He's always next to Raps. Why does he have a quiver? Where's Toothless? His braids? His scar??
[image]
HE HAS BOTH OF HIS LEGS.
The AI. Cannot even get the most important part of its prompt correct.
And that's just the beginning. Here:
[image]
More amorphous shapes.
So these are obviously supposed to be utility belts, but I mean. Look at them. The perspective is all off. There are useless straps. I don't even know what that cluster behind Jack's left arm is supposed to be.
This is a prime example of AI emulating without understanding structure.
[two images]
You can see this particularly in Jack, between his hands, the "tassels" of his tunic, and the odd wrinkles of his boots. There's just not any structure here whatsoever.
Lastly, AI CANNOT CREATE PATTERNS.
[three images]
Here are the side-by-sides of the shit I had to deal with when redesigning their outfits. Please someone acknowledge this. This killed me inside. THIS is most recognizable to me, and usually what I look for first if I'm wary about an art piece. These clusterfuck bunches of color. I hate them. I hate them so. much.
Anyways here's some other miscellaneous things I've noticed:
[image]
Danny Phantom Eyes
[image]
???? Thumb? (and random sword sheath)
[image]
Collarbone Necklace (corset from hell)
[image]
No Staff :( No Bow :(
[image]
What is that.
So yeah. Truly the best thing to do is to just. study it. A lot of times you aren't gonna notice anything just looking at the big picture, you need to zoom in and focus on the little details. Obviously I'm not like an expert in AI or anything, but I do have a degree in animation practices and I'm. You know. A human being. So.
In conclusion:
[image]
(Y'all should totally reblog my redesign of this btw)
drip-p1ss · 7 months ago
Text
Notes Game - Bladder Torture
notes game to stretch my bladder to its limits. no cum, just edge and hold like the stupid dumb slut I am (MDNI) wanna help? reblog, like and comment (any time you want), make me suffer like the useless whore I am
every 1 note is 1 minute to add to my holding
every 5 notes are 100 ml I have to drink
every 10 notes I press on my bladder for 5 seconds
every 50 notes I press my bladder on a counter for 10 sec and release for 5, 3 times
every 70 notes 10 squats
100 notes: after 1 hour I can't hold with my hands or cross my legs anymore
~ after 100 notes:
every 10 notes is a slap on my full bladder
every 20 notes a slap on my open spread pussy
every 40 notes lie on my belly with something under it for 5 min
200 notes: do a workout with full bladder, leaking is not an excuse to stop
220 notes: melt an ice cube in my cunt with panties on, can't remove them (fake pee)
250 notes: body write with humiliating words while sitting on the toilet
+++ I accept tasks, challenges, punishments in the comments/asks
~ punishment
leaking
drink a glass of water + add 10 min + fake pee
wetting / accident
drink 4 glasses of water + add 30 min + lie on belly with a small ball on bladder for 10 min
will close on September 11th
princessameliajade · 23 days ago
Text
Worship my ass, loser
[image]
Kiss my perfect ass and thank me
juicytemple · 15 days ago
Text
[image]
Juicy tribe leader
jcmarchi · 7 months ago
Text
Jay Schroeder, CTO at CNH – Interview Series
New Post has been published on https://thedigitalinsider.com/jay-shroeder-cto-at-cnh-interview-series/
Jay Schroeder serves as the Chief Technology Officer (CTO) at CNH, overseeing the company’s global research and development operations. His responsibilities include managing areas such as technology, innovation, vehicles and implements, precision technology, user experience, and powertrain. Schroeder focuses on enhancing the company’s product portfolio and precision technology capabilities, with the aim of integrating precision solutions across the entire equipment range. Additionally, he is involved in expanding CNH’s alternative propulsion offerings and providing governance over product development processes to ensure that the company’s product portfolio meets high standards of quality and performance.
Through its various businesses, CNH Industrial produces and sells agricultural machinery and construction equipment. AI and advanced technologies, such as computer vision, machine learning (ML), and camera sensors, are transforming how this equipment operates, enabling innovations like AI-powered self-driving tractors that help farmers address complex challenges in their work.
CNH’s self-driving tractors are powered by models trained on deep neural networks and real-time inference. Can you explain how this technology helps farmers perform tasks like planting with extreme precision, and how it compares to autonomous driving in other industries like transportation?
While self-driving cars capture headlines, the agriculture industry has quietly led the autonomous revolution for more than two decades. Companies like CNH pioneered autonomous steering and speed control long before Tesla. Today, CNH's machines go beyond simply driving themselves: they conduct highly automated and autonomous work while doing so, from precisely planting seeds exactly where they need to be to efficiently and optimally harvesting crops and treating the soil as they drive through the field. Autonomous farming isn't just keeping pace with self-driving cars; it's leaving them in the dust. The future of transportation may be autonomous, but in farming, the future is already here.
Further, CNH’s future-proofed tech stack empowers autonomous farming far beyond what self-driving cars can achieve. Our software-defined architecture seamlessly integrates a wide range of technologies, enabling automation for complex farming tasks that are much more challenging than simple point-A-to-B navigation. Interoperability in the architecture empowers farmers with unprecedented control and flexibility to layer on heightened technology through CNH’s open APIs. Unlike closed systems, CNH’s open API allows farmers to customize their machinery. Imagine camera sensors that distinguish crops from weeds, activated only when needed—all while the vehicle operates autonomously. This adaptability, combined with the ability to handle rugged terrain and diverse tasks, sets CNH’s technology apart. While Tesla and Waymo make strides, the true frontier of autonomous innovation lies in the fields, not on the roads.
The concept of an “MRI machine for plants” is fascinating. How does CNH’s use of synthetic imagery and machine learning enable its machines to identify crop type, growth stages, and apply targeted crop nutrition?
Using AI, computer vision cameras, and massive data sets, CNH is training models to distinguish crops from weeds, identify plant growth stages, and recognize the health of the crop across the fields to determine the exact amount of nutrients and protection needed to optimize a crop’s yield. For example, with the Augmenta Field Analyzer, a computer vision application scans the ground in front of the machine as it’s quickly moving through the field (at up to 20 mph) to assess crop conditions on the field and which areas need to be treated, and at what rate, to make those areas healthier.
With this technology, farmers know exactly where in the field a problem is building, so instead of blanketing a whole field with a treatment to kill weeds, control pests, or add nutrients to boost the health of the crops, AI and data-informed spraying machines automatically spray only the plants that need it. The system applies the exact amount of chemical needed, in exactly the right spot, to precisely address the plants' needs and stop any threat to the crop. Identifying and spraying only (and exactly) the weeds growing among crops will eventually reduce chemical use on fields by up to 90%: only a small amount of chemical is needed to treat each individual threat, rather than treating the whole field to reach those same few threats.
To generate photorealistic synthetic images and improve datasets quickly, CNH uses biophysical procedural models. This enables the team to quickly and efficiently create and classify millions of images without having to take the time to capture real imagery at the scale needed. The synthetic data augments authentic images, improving model training and inference performance. For example, by using synthetic data, different situations can be created to train the models – such as various lighting conditions and shadows that move throughout the day. Procedural models can produce specific images based on parameters to create a dataset that represents different conditions.
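CNH's biophysical procedural models are proprietary, but the augmentation idea (one base image, many labeled lighting conditions) can be sketched in a few lines. Everything below, from the brightness range to the shadow gradient, is an invented stand-in for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def lighting_variants(image: np.ndarray, n: int = 8):
    """Emit n copies of an HxWx3 crop image under different simulated lighting.

    Each variant gets a global brightness shift plus a directional gradient,
    loosely mimicking the sun moving through the day.
    """
    h, w = image.shape[:2]
    variants = []
    for _ in range(n):
        brightness = rng.uniform(0.6, 1.4)   # overcast .. harsh midday sun
        direction = rng.uniform(-1.0, 1.0)   # shadow falls left or right
        gradient = np.linspace(1.0 - 0.3 * direction, 1.0 + 0.3 * direction, w)
        out = image.astype(np.float32) * brightness * gradient[None, :, None]
        variants.append(np.clip(out, 0, 255).astype(np.uint8))
    return variants
```

Because every variant is derived from a labeled source image, the labels carry over for free, which is what makes this kind of generation so much cheaper than capturing real imagery at scale.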
How accurate is this technology compared to traditional farming methods?
Farmers make hundreds of significant choices throughout the year but only see the results of all those cumulative decisions once: at harvest time. The average age of a farmer is increasing and most work for more than 30 years. There is no margin for error. From the moment the seed is planted, farmers need to do everything they can to make sure the crop thrives – their livelihood is on the line.
Our technology takes a lot of the guesswork out of farmers’ tasks, such as determining the best ways to care for growing crops, while giving farmers extra time back to focus on solving strategic business challenges. At the end of the day, farmers are running massive businesses and rely on technology to help them do so most efficiently, productively and profitably.
Not only does the data generated by machines allow farmers to make better, more informed decisions to get better results, but the high levels of automation and autonomy in the machines themselves perform the work better and at a higher scale than humans are able to do. Spraying machines are able to “see” trouble spots in thousands of acres of crops better than human eyes and can precisely treat threats; while technology like autonomous tillage is able to relieve the burden of doing an arduous, time-consuming task and perform it with more accuracy and efficiency at scale than a human could. In autonomous tillage, a fully autonomous system tills the soil by using sensors combined with deep neural networks to create ideal conditions with centimeter-level precision. This prepares the soil to allow for highly consistent row spacing, precise seed depth, and optimized seed placement despite often drastic soil changes across even one field. Traditional methods, often reliant on human-operated machinery, typically result in more variability in results due to operator fatigue, less consistent navigation, and less accurate positioning.
During harvest season, CNH’s combine machines use edge computing and camera sensors to assess crop quality in real-time. How does this rapid decision-making process work, and what role does AI play in optimizing the harvest to reduce waste and improve efficiency?
A combine is an incredibly complex machine that does multiple processes — reaping, threshing, and gathering — in a single, continuous operation. It’s called a combine for that very reason: it combines what used to be multiple devices into a single factory-on-wheels. There is a lot happening at once and little room for error. CNH’s combine automatically makes millions of rapid decisions every twenty seconds, processing them on the edge, right on the machine. The camera sensors capture and process detailed images of the harvested crops to determine the quality of each kernel of the crop being harvested — analyzing moisture levels, grain quality, and debris content. The machine will automatically make adjustments based on the imagery data to deploy the best machine settings to get optimal results. We can do this today for barley, rice, wheat, corn, soybeans, and canola and will soon add capabilities for sorghum, oats, field peas, sunflowers, and edible beans.
AI at the edge is crucial in optimizing this process by using deep learning models trained to recognize patterns in crop conditions. These models can quickly identify areas of the harvest that require adjustments, such as altering the combine’s speed or modifying threshing settings to ensure better separation of grain from the rest of the plant (for instance, keeping only each and every corn kernel and removing all pieces of the cob and stalk). This real-time optimization helps reduce waste by minimizing crop damage and collecting only high-quality crops. It also improves efficiency, allowing machines to make data-driven decisions on the go to maximize farmers’ crop yield, all while reducing operational stress and costs.
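As a rough sketch of the sense-infer-adjust loop described above (the field names, thresholds, and machine settings are invented for illustration; CNH's actual combine logic is proprietary):

```python
from dataclasses import dataclass

@dataclass
class GrainAssessment:
    broken_fraction: float  # share of kernels the model flags as damaged
    debris_fraction: float  # cob, stalk, and chaff mixed into the sample
    moisture_pct: float     # estimated grain moisture content

def adjust_settings(settings: dict, a: GrainAssessment) -> dict:
    if a.broken_fraction > 0.02:
        settings["rotor_speed_rpm"] -= 20    # threshing too aggressively
    if a.debris_fraction > 0.05:
        settings["fan_speed_rpm"] += 30      # blow more debris out of the sample
    if a.moisture_pct > 20.0:
        settings["ground_speed_kmh"] -= 0.5  # wet crop: slow down to thresh cleanly
    return settings

def harvest_loop(camera, model, settings):
    while True:
        frame = camera.read()       # on-machine camera sample
        a = model.infer(frame)      # edge inference, no cloud round trip
        settings = adjust_settings(settings, a)
        # ...apply `settings` to the combine's actuators here...
```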
Precision agriculture driven by AI and ML promises to reduce input waste and maximize yield. Could you elaborate on how CNH’s technology is helping farmers cut costs, improve sustainability, and overcome labor shortages in an increasingly challenging agricultural landscape?
Farmers face tremendous hurdles in finding skilled labor. This is especially true for tillage – a critical step most farms require to prepare the soil for winter to make for better planting conditions in the spring. Precision is vital in tillage with accuracy measured to the tenth of an inch to create optimal crop growth conditions. CNH’s autonomous tillage technology eliminates the need for highly skilled operators to manually adjust tillage implements. With the push of a button, the system autonomizes the whole process, allowing farmers to focus on other essential tasks. This boosts productivity and the precision conserves fuel, making operations more efficient.
When it comes to crop maintenance, CNH's sprayer technology is outfitted with more than 125 microprocessors that communicate in real time to enhance the cost-efficiency and sustainability of water, nutrient, herbicide, and pesticide use. These processors collaborate to analyze field conditions and determine precisely when and where to apply these inputs, cutting chemical overapplication by up to 30% today and up to 90% in the near future, drastically lowering input costs and the amount of chemicals that go into the soil. The nozzle control valves let the machine apply product accurately by adjusting automatically to the sprayer's speed, ensuring a consistent rate and pressure for precise droplet delivery, so each drop lands exactly where it needs to be for the health of the crop. This level of precision reduces the need for frequent refills, with farmers only needing to fill the sprayer once per day, leading to significant water and chemical conservation.
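The constant-rate behavior described here follows from a standard sprayer calibration identity: the boom covers speed × width of ground per unit time, so flow must scale linearly with ground speed. A small worked example:

```python
def required_flow_lpm(rate_l_per_ha: float, speed_kmh: float, boom_width_m: float) -> float:
    """Total boom flow (L/min) needed to hold a target application rate.

    Area covered per minute = speed_kmh * 1000 / 60 * boom_width_m   (m^2/min)
                            = speed_kmh * boom_width_m / 600          (ha/min),
    so flow = rate * speed * width / 600.
    """
    return rate_l_per_ha * speed_kmh * boom_width_m / 600.0

# e.g. 100 L/ha at 20 km/h with a 36 m boom needs 120 L/min across the boom;
# slow to 10 km/h and the valves must halve the flow to hold the same rate.
print(required_flow_lpm(100, 20, 36))  # 120.0
```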
Similarly, CNH’s Cart Automation simplifies the complex and high-stress task of operating a combine during harvest. Precision is crucial to avoid collisions between the combine header and the grain cart driving within inches of each other for hours at a time. It also helps lessen crop loss. Cart Automation enables a seamless load-on-the-go process, reducing the need for manual coordination and facilitating the combine to continue performing its job without having to stop. CNH has done physiological testing that shows this assistive technology lowers stress for combine operators by approximately 12% and for tractor operators by 18%, which adds up when these operators are in these machines for up to 16 hours a day during harvest season.
CNH brand, New Holland, recently partnered with Bluewhite for autonomous tractor kits. How does this collaboration fit into CNH’s broader strategy for expanding autonomy in agriculture?
Autonomy is the future of CNH, and we are taking a purposeful and strategic approach to developing this technology, driven by the most pressing needs of our customers. Our internal engineers are focused on developing autonomy for our large-agriculture customer segment: farmers of crops that grow in large, open fields, like corn and soybeans. Another important customer base for CNH is farmers of what we call "permanent crops," which grow in orchards and vineyards. Partnering with Bluewhite, a proven leader in implementing autonomy in orchards and vineyards, gives us the scale and speed to market to serve both the large-ag and permanent-crop customer segments with critically needed autonomy. With Bluewhite, we are delivering a fully autonomous tractor for permanent crops, making us the first original equipment manufacturer (OEM) with an autonomous solution in orchards and vineyards.
Our approach to autonomy is to solve the most critical challenges customers have in the jobs and tasks where they are eager for the machine to complete the work and remove the burden on labor. Autonomous tillage leads our internal job-autonomy development because it's an arduous task that takes a long time during a tightly time-constrained period of the year, when a number of other things also need to happen. A machine in this instance can perform the work better than a human operator. Permanent-crop farmers also have an urgent need for autonomy, as they face extreme labor shortages and need machines to fill the gaps. These jobs require the tractors to drive 20-30 passes through each orchard or vineyard row per season, performing important jobs like applying nutrients to the trees and keeping the grass between vines mowed and free of weeds.
Many of CNH’s solutions are being adopted by orchard and vineyard operators. What unique challenges do these environments present for autonomous and AI-driven machinery, and how is CNH adapting its technologies for such specialized applications? 
The windows for harvesting are changing, and finding skilled labor is harder to come by. Climate change is making seasons more unpredictable; it’s mission-critical for farmers to have technology ready to go that drives precision and efficiency for when crops are optimal for harvesting. Farming always requires precision, but it’s particularly necessary when harvesting something as small and delicate as a grape or nut.
Most automated driving technologies rely on GPS to guide machines on their paths, but in orchards and vineyards those GPS signals can be blocked by tree and vine branches. Vision cameras and radar are used in conjunction with GPS to keep machines on their optimal path. And with orchards and vineyards, harvesting is not about acres of uniform rows but rather individual, varied plants and trees, often in hilly terrain. CNH's automated systems adjust to each plant's height, the ground level, and the required picking speed to ensure a quality yield without damaging the crop. They also adjust around unproductive or dead trees to save unnecessary inputs. These robotic machines automatically move along the plants, safely straddling the crop while delicately removing the produce from the tree or vine. The operator sets the desired picking-head height, and the machines automatically adjust to maintain those settings per plant, regardless of the terrain. Further, for some fruits, the best time to harvest is overnight, when their sugar content peaks. Cameras equipped with infrared technology work in even the darkest conditions to harvest the fruit at its optimal condition.
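A hypothetical sketch of the GPS-plus-vision fallback described here; the interfaces, satellite count, and dilution-of-precision threshold are assumptions, not CNH's actual stack:

```python
def fused_pose(gps, vision, radar):
    fix = gps.read()
    # Under open sky, trust the GNSS fix; under dense canopy the fix degrades
    # (few satellites, high horizontal dilution of precision), so fall back to
    # row-relative localization from cameras and radar.
    if fix.num_satellites >= 6 and fix.hdop < 2.0:
        return fix.pose
    row = vision.detect_row()        # tree/vine rows act as visual guide rails
    clearance = radar.side_ranges()  # distance to trunks on either side
    return row.pose_from(clearance)
```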
As more autonomous farming equipment is deployed, what steps is CNH taking to ensure the safety and regulatory compliance of these AI-powered systems, particularly in diverse global farming environments?
Safety and regulatory compliance are central to CNH's AI-powered systems, so CNH collaborates with local authorities in different regions to adapt its autonomous systems to regional requirements, including safety standards, environmental regulations, and data privacy laws. CNH is also active in standards organizations to ensure we meet all recognized and emerging standards and requirements.
For example, autonomous safety systems include sensors like cameras, LiDAR, radar and GPS for real-time monitoring. These technologies enable the equipment to detect obstacles and automatically stop when it detects something ahead. The machines can also navigate complex terrain and respond to environmental changes, minimizing the risk of accidents.
What do you see as the biggest barriers to widespread adoption of AI-driven technologies in agriculture? How is CNH helping farmers transition to these new systems and demonstrating their value?
Currently, the most significant barriers are cost, connectivity, and farmer training.
But better yields, lowered expenses, lowered physical stress, and better time management through heightened automation can offset the total cost of ownership. Smaller farms can benefit from more limited autonomous solutions, like feed systems or aftermarket upgrade kits.
Inadequate connectivity, particularly in rural areas, poses challenges: AI-driven technologies require consistent, always-on connectivity. CNH is helping to address that through its partnership with Intelsat and through universal modems that connect to whatever network is nearby (Wi-Fi, cellular, or satellite), providing field-ready connectivity for customers in hard-to-reach locations. While many customers fulfill this need for internet connectivity with CNH's market-leading global mobile virtual network, existing cellular towers do not enable pervasive connection.
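The universal-modem behavior reduces to a priority failover. A minimal sketch, with link names and ordering assumed rather than taken from CNH's implementation:

```python
# Try the cheapest / lowest-latency link first, fall through to satellite.
PREFERRED_ORDER = ["wifi", "cellular", "satellite"]

def pick_uplink(link_status: dict) -> str:
    """link_status maps link name -> True when usable (signal + authenticated)."""
    for name in PREFERRED_ORDER:
        if link_status.get(name, False):
            return name
    raise ConnectionError("no uplink available; queue telemetry locally")

print(pick_uplink({"wifi": False, "cellular": False, "satellite": True}))  # satellite
```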
Lastly, the perceived learning curve associated with AI technology can feel daunting. This shift from traditional practices requires training and a change in mindset, which is why CNH works hand-in-hand with customers to make sure they are comfortable with the technology and are getting the full benefit of systems.
Looking ahead, how do you envision CNH’s AI and autonomous solutions evolving over the next decade?
CNH is tackling critical, global challenges by developing cutting-edge technology to produce more food sustainably by using fewer resources, for a growing population. Our focus is empowering farmers to improve their livelihoods and businesses through innovative solutions, with AI and autonomy playing a central role. Advancements in data collection, affordability of sensors, connectivity, and computing power will accelerate the development of AI and autonomous systems. These technologies will drive progress in precision farming, autonomous operation, predictive maintenance, and data-driven decision-making, ultimately benefiting our customers and the world.
Thank you for the great interview. Readers who wish to learn more should visit CNH.
luna-the-cretar · 2 months ago
Text
I imagine Virgil—while appearing like a regular crow—is just ever so slightly uncanny enough to tell you “that’s not a normal crow” (kinda like how Nikkie described the demon weasels in the Crooked House. They look like normal weasels, for the most part, but something about them seems…Wrong.)
Like, he’s just slightly too big. Sure, maybe he’s just a larger crow, they can get pretty big. But he’s just a bit larger than that. Almost the size of an average raven. But…he’s not a raven. He’s a crow. So why is he the size of a raven?
His eyes are too big. Sure, his eyes are also fucked up already, but ignoring that part. Normal crow eyes are small and beady. Virgil’s eyes are almost too big for his head. They’re almost the size and shape of a human’s.
His feathers are Too Black for a natural crow. His beak is Too Sharp. His body is too amorphous. Like, yeah, it's vaguely crow shaped when he's perched or flying, just enough to not question it when you see him out of the corner of your eye, but if you look at him—really look at him—something about the shape is…odd. And the worst part is that you can't even tell what is off about him. Is it his feathers? His wings? His talons? You're not quite sure.
I mean, Jericho calls Virgil his “Weird Gross Crow” for a reason.
fortunaestalta · 2 months ago
Text
[image]
toxintouch · 5 months ago
Text
Headcanon that Ais never feels fully rested. He doesn't feel tired, but he doesn't feel right either. Not since joining the groupmind.
He sleeps, but his mind is never really at peace. When and if he manages to fall into a deep enough sleep, he's in a constant state of something akin to lucid dreaming.
He gets flashes of the other members of the groupmind in place of any real rest: their current actions; errant memories; whispers in long-dead languages he's learned to understand.
violent138 · 7 months ago
Text
If any of the Batsiblings ever end up in serious medical trouble, to the point they've been forced onto bedrest or put into a medically-induced coma, their siblings will rotate in shifts (and sometimes require physical force to be removed) or hover at windowsills, talking at the patient and meddling so much that Leslie keeps having to smack their hands away from IV lines or wrestle back her stethoscope. But the second said sibling wakes up, it converts spontaneously to around-the-clock taunting, shoulder punches that nearly send them and their crutches into the floor, silent help during PT, and someone coming by to coerce pain meds onto you with bedside manner so bad that you forget all about almost dying and start planning homicide.
casualavocados · 9 months ago
Text
[gif set]
Learn from who? Learn from you? You are still a brat. What do you know? You're only three years older. Like you are any better than me. You're 21, and still a virgin. What are you proud of? I think you can't do it.
KISEKI: DEAR TO ME Ep. 06
#kiseki: dear to me#kisekiedit#kdtm#kiseki dear to me#ai di x chen yi#chen yi x ai di#nat chen#chen bowen#louis chiang#chiang tien#jiang dian#userspring#uservid#userspicy#userrain#pdribs#userjjessi#*cajedit#*gif#*gestures at the caption* this is honestly the funniest argument they could possibly have idfk what to tell you. it's very ai di#meanwhile whatever's going through chen yi's head rn has recently been doused with 'the boss doesnt care abt me like that'#after watching cdy and zml at dinner. like chen yi already knows *before* ep9 & ai dis confession that cdy will never look at him#(the diff. between this scene & ep9's. is him failing in regards to the gang as well in cdy's eyes. he goes from feelings of disappointment#& irritability to complete despair and both times he drinks to cope. bc hes not enough in cdy's eyes in ANY of the ways he wants/hoped)#so honestly the crisis chen yi goes thru right here isnt unfounded at all hes literally dealing w an inadvertent rejection of his feelings#its chaos in his head and ai di is picking at him again and the wine is tilting in his blood and then- 'learn from who? learn from you?'#like what do YOU know about love ai di (WHILE CHEN YI'S PULLING HIM LIKE THAT-) so OF COURSE ai di goes for the deepest dig he can.#'i bet you cant get hard that explains how much of a coward you are'. its ridiculous the ways in which they push each other over the edge#but im ngl im kind of obsessed the way chen yi's tipsy line of thinking 'learn from you?' turned into the action 'fuck it learn from ME'#ANYWAY EVERYONE GO LISTEN TO 'LOSE CONTROL' BY TEDDY SWIMS RIGHT THE FUCK NOW. THe most chen yi song pre-ep9