#ai at the edge
Text
Jay Schroeder, CTO at CNH – Interview Series
New Post has been published on https://thedigitalinsider.com/jay-shroeder-cto-at-cnh-interview-series/
Jay Schroeder serves as the Chief Technology Officer (CTO) at CNH, overseeing the company’s global research and development operations. His responsibilities include managing areas such as technology, innovation, vehicles and implements, precision technology, user experience, and powertrain. Schroeder focuses on enhancing the company’s product portfolio and precision technology capabilities, with the aim of integrating precision solutions across the entire equipment range. Additionally, he is involved in expanding CNH’s alternative propulsion offerings and providing governance over product development processes to ensure that the company’s product portfolio meets high standards of quality and performance.
Through its various businesses, CNH Industrial produces and sells agricultural machinery and construction equipment. AI and advanced technologies, such as computer vision, machine learning (ML), and camera sensors, are transforming how this equipment operates, enabling innovations like AI-powered self-driving tractors that help farmers address complex challenges in their work.
CNH’s self-driving tractors are powered by models trained on deep neural networks and real-time inference. Can you explain how this technology helps farmers perform tasks like planting with extreme precision, and how it compares to autonomous driving in other industries like transportation?
While self-driving cars capture headlines, the agriculture industry has quietly led the autonomous revolution for more than two decades. Companies like CNH pioneered autonomous steering and speed control long before Tesla. Today, CNH’s machines go beyond simply driving themselves to conducting highly automated and autonomous work while they drive: precisely planting seeds exactly where they need to be, efficiently and optimally harvesting crops, and treating the soil as they move through the field. Autonomous farming isn’t just keeping pace with self-driving cars – it’s leaving them in the dust. The future of transportation may be autonomous, but in farming, the future is already here.
Further, CNH’s future-proofed tech stack empowers autonomous farming far beyond what self-driving cars can achieve. Our software-defined architecture seamlessly integrates a wide range of technologies, enabling automation for complex farming tasks that are much more challenging than simple point-A-to-B navigation. Interoperability in the architecture empowers farmers with unprecedented control and flexibility to layer on heightened technology through CNH’s open APIs. Unlike closed systems, CNH’s open API allows farmers to customize their machinery. Imagine camera sensors that distinguish crops from weeds, activated only when needed—all while the vehicle operates autonomously. This adaptability, combined with the ability to handle rugged terrain and diverse tasks, sets CNH’s technology apart. While Tesla and Waymo make strides, the true frontier of autonomous innovation lies in the fields, not on the roads.
The concept of an “MRI machine for plants” is fascinating. How does CNH’s use of synthetic imagery and machine learning enable its machines to identify crop type, growth stages, and apply targeted crop nutrition?
Using AI, computer vision cameras, and massive data sets, CNH is training models to distinguish crops from weeds, identify plant growth stages, and recognize the health of the crop across the fields to determine the exact amount of nutrients and protection needed to optimize a crop’s yield. For example, with the Augmenta Field Analyzer, a computer vision application scans the ground in front of the machine as it’s quickly moving through the field (at up to 20 mph) to assess crop conditions on the field and which areas need to be treated, and at what rate, to make those areas healthier.
With this technology, farmers are able to know and treat exactly where in the field a problem is building so that instead of blanketing a whole field with a treatment to kill weeds, control pests, or add necessary nutrients to boost the health of the crops, AI and data-informed spraying machines automatically spray only the plants that need it. The technology enables the exact amount of chemical needed, applied in exactly the right spot to precisely address the plants’ needs and stop any threat to the crop. Identifying and spraying only (and exactly) weeds as they grow among crops will eventually reduce the use of chemicals on fields by up to 90%. Only a small amount of chemical is needed to treat each individual threat rather than treating the whole field in order to reach those same few threats.
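The spot-spraying idea described above can be sketched in a few lines. This is an illustration only — a hypothetical grid and confidence threshold, not CNH's actual system: given a per-cell weed-probability map from a vision model, the machine treats only the cells above the threshold instead of blanketing the field.

```python
# Hypothetical sketch of spot spraying: spray only grid cells where the
# vision model's weed probability clears a confidence threshold. The grid,
# probabilities, and threshold below are illustrative assumptions.
def spray_map(weed_probs, threshold=0.7):
    return [[cell >= threshold for cell in row] for row in weed_probs]

field = [[0.1, 0.9, 0.2],
         [0.0, 0.3, 0.85]]
plan = spray_map(field)
treated = sum(cell for row in plan for cell in row)
# Only 2 of 6 cells are treated here, illustrating how targeted
# application cuts chemical use relative to blanket spraying.
```

In this toy example only a third of the field receives chemical, which is the mechanism behind the projected reductions described above.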
To generate photorealistic synthetic images and improve datasets quickly, CNH uses biophysical procedural models. This enables the team to quickly and efficiently create and classify millions of images without having to take the time to capture real imagery at the scale needed. The synthetic data augments authentic images, improving model training and inference performance. For example, by using synthetic data, different situations can be created to train the models – such as various lighting conditions and shadows that move throughout the day. Procedural models can produce specific images based on parameters to create a dataset that represents different conditions.
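A minimal sketch of the procedural idea, assuming a toy parameter model (sun angle driving shadow length, plus a crop/weed label) in place of a real biophysical renderer: sweeping parameters yields a large labeled dataset covering many lighting conditions without capturing real imagery.

```python
import random

# Hypothetical sketch only: a "procedural model" here is just a parameter
# sweep. A real biophysical model renders photorealistic imagery; this only
# illustrates how varying parameters produces labeled samples spanning
# different conditions (e.g. shadows that lengthen near sunrise and sunset).
def generate_synthetic_samples(n, seed=0):
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        sun_angle = rng.uniform(10, 170)  # degrees above the horizon
        shadow_len = round(1.0 / (abs(sun_angle - 90) / 90 + 0.1), 2)
        samples.append({
            "sun_angle": sun_angle,
            "shadow_len": shadow_len,
            "label": rng.choice(["crop", "weed"]),
        })
    return samples

dataset = generate_synthetic_samples(1000)
```

Each parameter combination is a "free" labeled training example; the same sweep run with more parameters (soil color, crop stage, camera angle) is what lets synthetic data augment authentic imagery at scale.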
How accurate is this technology compared to traditional farming methods?
Farmers make hundreds of significant choices throughout the year but only see the results of all those cumulative decisions once: at harvest time. The average age of a farmer is increasing and most work for more than 30 years. There is no margin for error. From the moment the seed is planted, farmers need to do everything they can to make sure the crop thrives – their livelihood is on the line.
Our technology takes a lot of the guesswork out of farmers’ tasks, such as determining the best ways to care for growing crops, while giving farmers extra time back to focus on solving strategic business challenges. At the end of the day, farmers are running massive businesses and rely on technology to help them do so most efficiently, productively and profitably.
Not only does the data generated by machines allow farmers to make better, more informed decisions to get better results, but the high levels of automation and autonomy in the machines themselves perform the work better and at a higher scale than humans are able to do. Spraying machines are able to “see” trouble spots in thousands of acres of crops better than human eyes and can precisely treat threats; while technology like autonomous tillage is able to relieve the burden of doing an arduous, time-consuming task and perform it with more accuracy and efficiency at scale than a human could. In autonomous tillage, a fully autonomous system tills the soil by using sensors combined with deep neural networks to create ideal conditions with centimeter-level precision. This prepares the soil to allow for highly consistent row spacing, precise seed depth, and optimized seed placement despite often drastic soil changes across even one field. Traditional methods, often reliant on human-operated machinery, typically result in more variability in results due to operator fatigue, less consistent navigation, and less accurate positioning.
During harvest season, CNH’s combine machines use edge computing and camera sensors to assess crop quality in real-time. How does this rapid decision-making process work, and what role does AI play in optimizing the harvest to reduce waste and improve efficiency?
A combine is an incredibly complex machine that does multiple processes — reaping, threshing, and gathering — in a single, continuous operation. It’s called a combine for that very reason: it combines what used to be multiple devices into a single factory-on-wheels. There is a lot happening at once and little room for error. CNH’s combine automatically makes millions of rapid decisions every twenty seconds, processing them on the edge, right on the machine. The camera sensors capture and process detailed images of the harvested crops to determine the quality of each kernel of the crop being harvested — analyzing moisture levels, grain quality, and debris content. The machine will automatically make adjustments based on the imagery data to deploy the best machine settings to get optimal results. We can do this today for barley, rice, wheat, corn, soybeans, and canola and will soon add capabilities for sorghum, oats, field peas, sunflowers, and edible beans.
AI at the edge is crucial in optimizing this process by using deep learning models trained to recognize patterns in crop conditions. These models can quickly identify areas of the harvest that require adjustments, such as altering the combine’s speed or modifying threshing settings to ensure better separation of grain from the rest of the plant (for instance, keeping only each and every corn kernel and removing all pieces of the cob and stalk). This real-time optimization helps reduce waste by minimizing crop damage and collecting only high-quality crops. It also improves efficiency, allowing machines to make data-driven decisions on the go to maximize farmers’ crop yield, all while reducing operational stress and costs.
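The closed-loop adjustment described here can be sketched as a simple rule-based controller. The setting names, thresholds, and step sizes below are illustrative assumptions, not CNH's actual parameters: per-frame quality metrics from the vision model nudge machine settings toward better separation.

```python
# Hypothetical on-machine control loop: vision-derived quality metrics
# drive small, bounded adjustments to combine settings each cycle.
def adjust_settings(settings, frame_metrics):
    new = dict(settings)
    if frame_metrics["debris_ratio"] > 0.05:       # too much cob/stalk retained
        new["fan_speed"] = min(new["fan_speed"] + 50, 1400)
    if frame_metrics["cracked_ratio"] > 0.02:      # kernels being damaged
        new["rotor_speed"] = max(new["rotor_speed"] - 20, 300)
    return new

settings = {"fan_speed": 1100, "rotor_speed": 900}
settings = adjust_settings(settings, {"debris_ratio": 0.08, "cracked_ratio": 0.01})
# Fan speed rises to clear debris; rotor speed stays put since kernel
# damage is within tolerance.
```

Running a loop like this on the edge, against every camera frame, is what lets the machine converge on good settings continuously instead of waiting on a cloud round trip.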
Precision agriculture driven by AI and ML promises to reduce input waste and maximize yield. Could you elaborate on how CNH’s technology is helping farmers cut costs, improve sustainability, and overcome labor shortages in an increasingly challenging agricultural landscape?
Farmers face tremendous hurdles in finding skilled labor. This is especially true for tillage – a critical step most farms require to prepare the soil for winter and create better planting conditions in the spring. Precision is vital in tillage, with accuracy measured to the tenth of an inch to create optimal crop growth conditions. CNH’s autonomous tillage technology eliminates the need for highly skilled operators to manually adjust tillage implements. With the push of a button, the system automates the whole process, allowing farmers to focus on other essential tasks. This boosts productivity, and the precision conserves fuel, making operations more efficient.
When it comes to crop maintenance, CNH’s sprayer technology is outfitted with more than 125 microprocessors that communicate in real-time to enhance cost-efficiency and sustainability of water, nutrient, herbicide, and pesticide use. These processors collaborate to analyze field conditions and precisely determine when and where to apply these nutrients, eliminating an overabundance of chemicals by up to 30% today and up to 90% in the near future, drastically cutting input costs and the amount of chemicals that go into the soil. The nozzle control valves allow the machine to accurately apply the product by automatically adjusting based on the sprayer’s speed, ensuring a consistent rate and pressure for precise droplet delivery to the crop so each drop lands exactly where it needs to be for the health of the crop. This level of precision reduces the need for frequent refills, with farmers only needing to fill the sprayer once per day, leading to significant water/chemical conservation.
Similarly, CNH’s Cart Automation simplifies the complex and high-stress task of operating a combine during harvest. Precision is crucial to avoid collisions between the combine header and the grain cart driving within inches of each other for hours at a time. It also helps lessen crop loss. Cart Automation enables a seamless load-on-the-go process, reducing the need for manual coordination and facilitating the combine to continue performing its job without having to stop. CNH has done physiological testing that shows this assistive technology lowers stress for combine operators by approximately 12% and for tractor operators by 18%, which adds up when these operators are in these machines for up to 16 hours a day during harvest season.
CNH brand, New Holland, recently partnered with Bluewhite for autonomous tractor kits. How does this collaboration fit into CNH’s broader strategy for expanding autonomy in agriculture?
Autonomy is the future of CNH, and we are taking a purposeful and strategic approach to developing this technology, driven by the most pressing needs of our customers. Our internal engineers are focused on developing autonomy for our large agriculture customer segment – farmers of crops that grow in large, open fields, like corn and soybeans. Another important customer base for CNH is farmers of what we call “permanent crops” that grow in orchards and vineyards. Partnering with Bluewhite, a proven leader in implementing autonomy in orchards and vineyards, allows us the scale and speed to market to be able to serve both the large ag and permanent crop customer segments with critically needed autonomy. With Bluewhite, we are delivering a fully autonomous tractor in permanent crops, making us the first original equipment manufacturer (OEM) with an autonomous solution in orchards and vineyards.
Our approach to autonomy is to solve the most critical challenges customers have in the jobs and tasks where they are eager for the machine to complete the work and remove the burden on labor. Autonomous tillage leads our internal job autonomy development because it’s an arduous task that takes a long time during a tightly time-constrained period of the year when a number of other things also need to happen. A machine in this instance can perform the work better than a human operator. Permanent crop farmers also have an urgent need for autonomy, as they face extreme labor shortages and need machines to fill the gaps. These jobs require the tractors to drive 20-30 passes through each orchard or vineyard row per season, performing important jobs like applying nutrients to the trees and keeping the grass between vines mowed and free of weeds.
Many of CNH’s solutions are being adopted by orchard and vineyard operators. What unique challenges do these environments present for autonomous and AI-driven machinery, and how is CNH adapting its technologies for such specialized applications?
The windows for harvesting are changing, and skilled labor is harder to come by. Climate change is making seasons more unpredictable; it’s mission-critical for farmers to have technology ready to go that drives precision and efficiency for when crops are optimal for harvesting. Farming always requires precision, but it’s particularly necessary when harvesting something as small and delicate as a grape or nut.
Most automated driving technologies rely on GPS to guide machines on their paths, but in orchards and vineyards those GPS signals can be blocked by tree and vine branches. Vision cameras and radar are used in conjunction with GPS to keep machines on their optimal path. And, with orchards and vineyards, harvesting is not about acres of uniform rows but rather individual, varied plants and trees, often in hilly terrain. CNH’s automated systems adjust to each plant’s height, the ground level, and required picking speed to ensure a quality yield without damaging the crop. They also adjust around unproductive or dead trees to save unnecessary inputs. These robotic machines automatically move along the plants, safely straddling the crop while delicately removing the produce from the tree or vine. The operator sets the desired picking head height, and the machines automatically adjust to maintain those settings per plant, regardless of the terrain. Further, for some fruits, the best time to harvest is when its sugar content peaks overnight. Cameras equipped with infrared technology work in even the darkest conditions to harvest the fruit at its optimal condition.
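One way such a GPS fallback could work — a hedged sketch with assumed quality scores and 2-D positions, not CNH's implementation — is to trust GPS only when its reported fix quality is good, and otherwise average whatever vision and radar odometry estimates remain.

```python
# Hypothetical localization fallback: prefer GPS when its fix quality is
# high; under canopy, blend the surviving vision/radar estimates instead.
# Quality threshold and position format are illustrative assumptions.
def fused_position(gps_fix, vision_fix, radar_fix):
    if gps_fix and gps_fix["quality"] >= 0.8:
        return gps_fix["pos"]
    backups = [f["pos"] for f in (vision_fix, radar_fix) if f]
    if not backups:
        raise RuntimeError("no localization source available")
    n = len(backups)
    return tuple(sum(p[i] for p in backups) / n for i in range(2))

# Open field: GPS quality is high, so it is used directly.
open_field = fused_position({"quality": 0.95, "pos": (10.0, 20.0)}, None, None)
# Under canopy: GPS is degraded, so vision and radar estimates are averaged.
canopy = fused_position({"quality": 0.3, "pos": (10.0, 20.0)},
                        {"pos": (10.2, 20.1)}, {"pos": (10.0, 19.9)})
```

A production system would use a proper filter (e.g. a Kalman filter) rather than a simple average, but the branching logic — degrade gracefully when one sensor is blocked — is the essential idea.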
As more autonomous farming equipment is deployed, what steps is CNH taking to ensure the safety and regulatory compliance of these AI-powered systems, particularly in diverse global farming environments?
Safety and regulatory compliance are central to CNH’s AI-powered systems, so CNH collaborates with local authorities in different regions, allowing the company to adapt its autonomous systems to meet regional requirements, including safety standards, environmental regulations, and data privacy laws. CNH is also active in standards organizations to ensure we meet all recognized and emerging standards and requirements.
For example, autonomous safety systems include sensors like cameras, LiDAR, radar and GPS for real-time monitoring. These technologies enable the equipment to detect obstacles ahead and stop automatically. The machines can also navigate complex terrain and respond to environmental changes, minimizing the risk of accidents.
What do you see as the biggest barriers to widespread adoption of AI-driven technologies in agriculture? How is CNH helping farmers transition to these new systems and demonstrating their value?
Currently, the most significant barriers are cost, connectivity, and farmer training.
But better yields, lower expenses, reduced physical stress, and better time management through heightened automation can offset the total cost of ownership. Smaller farms can benefit from more limited autonomous solutions, like feed systems or aftermarket upgrade kits.
Inadequate connectivity, particularly in rural areas, poses challenges, since AI-driven technologies require consistent, always-on connectivity. While many customers fulfill this need with CNH’s market-leading global mobile virtual network, existing cellular towers do not enable pervasive coverage. CNH is helping to close that gap through its partnership with Intelsat and through universal modems that connect to whatever network is nearby – wifi, cellular, or satellite – providing field-ready connectivity for customers in hard-to-reach locations.
Lastly, the perceived learning curve associated with AI technology can feel daunting. This shift from traditional practices requires training and a change in mindset, which is why CNH works hand-in-hand with customers to make sure they are comfortable with the technology and are getting the full benefit of systems.
Looking ahead, how do you envision CNH’s AI and autonomous solutions evolving over the next decade?
CNH is tackling critical, global challenges by developing cutting-edge technology to produce more food sustainably by using fewer resources, for a growing population. Our focus is empowering farmers to improve their livelihoods and businesses through innovative solutions, with AI and autonomy playing a central role. Advancements in data collection, affordability of sensors, connectivity, and computing power will accelerate the development of AI and autonomous systems. These technologies will drive progress in precision farming, autonomous operation, predictive maintenance, and data-driven decision-making, ultimately benefiting our customers and the world.
Thank you for the great interview. Readers who wish to learn more should visit CNH.
Text
cant tell you how bad it feels to constantly tell other artists to come to tumblr, because its the last good website that isn't fucked up by spoonfeeding algorithms and AI bullshit and isn't based around meaningless likes
just to watch that all fall apart in the last year or so and especially the last two weeks
there's nowhere good to go anymore for artists.
edit - a lot of people are saying the tags are important so actually, you'll look at my tags.
#please dont delete your accounts because of the AI crap. your art deserves more than being lost like that #if you have a good PC please glaze or nightshade it. if you dont or it doesnt work with your style (like mine) please start watermarking #use a plain-ish font. make it your username. if people can't google what your watermark says and find ur account its not a good watermark #it needs to be central in the image - NOT on the canvas edges - and put it in multiple places if you are compelled #please dont stop posting your art because of this shit. we just have to hope regulations will come slamming down on these shitheads#in the next year or two and you want to have accounts to come back to. the world Needs real art #if we all leave that just makes more room for these scam artists to fill in with their soulless recycled garbage #improvise adapt overcome. it sucks but it is what it is for the moment. safeguard yourself as best you can without making #years of art from thousands of artists lost media. the digital world and art is too temporary to hastily click a Delete button out of spite
#not art#but important
Text
Haven't been the same before reading this comment
#spiderverse#miles morales#gwen stacy#spidergwen#ghostflower#ai even hung upside down on the edge of my couch to try this like an idiot#can confirm you have to make an effort#atsv#raich rambles
Note
hey! i’m an artist and i was wondering what about the httyd crossover art made it obviously AI. i’m trying to get better at recognizing AI versus real art and i totally would have just not clocked that.
Hey! This is TOTALLY okay to not have recognized it, because I DIDN'T AT FIRST, EITHER. Unfortunately there’s no real foolproof way to distinguish real art from the fake stuff. However I have noticed a general rule of thumb while browsing these last few months.
So this is the AI generated image I used as inspiration. I will not be tagging the account that posted it because I do not condone bullying of any type, but it’s important to mention that this was part of a set of images:
This is important because one of the BIGGEST things you can use to your advantage is context clues. This is the thing that clued me in: right off the bat we can see that there is NO consistency between these three images. The art style and outfits change with every generated image. They're vaguely related (I.E. characters that resemble the Big Four are on some sort of adventure?) and that's about it. Going to the account in question proved that all they posted were AI generated images. All of which have many red flags, but for clarity's sake we'll stick with the one that I used.
The first thing that caught my eye was this???? Amorphous Blob in the background. Which is obviously supposed to be knights or a dragon or something.
Again, context clues come into play here. Artists will draw everything With A Purpose. And if what they're drawing is fanart, you are going to recognize most of what you see in the image. Even if there are mistakes.
In the context of this image, it looks like the Four are supposed to be running from these people. The thing that drew my attention to it was the fact that I Didn't Recognize The Villains, and this is because there is nothing to recognize. These shapes aren't Drago, or Grimmel, or Pitch, or any other villain we usually associate with ROTBTD. They're just Amorphous Blobs that are vaguely villain shaped.
Which brings me to my second point:
Do you see the way they're standing? There is no purpose to this. It throws the entire image off. Your eye is drawn to the Amorphous Villain Blobs in the background, and these characters are not reacting to them one bit.
Now I'm not saying that all images have to have a story behind them, but if this were created by a person, it clearly would have had one. Our group here is not telling a story, they are posing.
This is because the AI does not see the image as a whole, but as two separate components: the setting, and the description of the characters that the prompter dictates. I.E. "Merida from Brave, Jack Frost from ROTG, Rapunzel from Tangled, and Hiccup from HTTYD standing next to each other"
Now obviously the most pressing part of this prompt are the characters themselves. So the AI prioritizes that and tries to spit out something that WE recognize as "Merida from Brave, Jack Frost from ROTG, Rapunzel from Tangled, and Hiccup from HTTYD standing next to each other".
This, more times than not, is going to end up with this stagnant posing. Because AI cannot create, it can only emulate. And even then, it still can't do it right. Case in point:
This is not Hiccup. The AI totally thinks this is Eugene Fitzherbert. Look at the pose. The facial structure. The goatee. The smirk. The outfits. He's always next to Raps. Why does he have a quiver? Where's Toothless? His braids? His scar??
HE HAS BOTH OF HIS LEGS.
The AI. Cannot even get the most important part of its prompt correct.
And that's just the beginning. Here:
More amorphous shapes.
So these are obviously supposed to be utility belts, but I mean. Look at them. The perspective is all off. There are useless straps. I don't even know what that cluster behind Jack's left arm is supposed to be.
This is a prime example of AI emulating without understanding structure.
You can see this particularly in Jack, between his hands, the "tassels" of his tunic, and the odd wrinkles of his boots. There's just not any structure here whatsoever.
Lastly, AI CANNOT CREATE PATTERNS.
Here are the side-by-sides of the shit I had to deal with when redesigning their outfits. Please someone acknowledge this. This killed me inside. THIS is most recognizable to me, and usually what I look for first if I'm wary about an art piece. These clusterfuck bunches of color. I hate them. I hate them so. much.
Anyways here's some other miscellaneous things I've noticed:
Danny Phantom Eyes
???? Thumb? (and random sword sheath)
Collarbone Necklace (corset from hell)
No Staff :( No Bow :(
What is that.
So yeah. Truly the best thing to do is to just. study it. A lot of times you aren't gonna notice anything just looking at the big picture, you need to zoom in and focus on the little details. Obviously I'm not like an expert in AI or anything, but I do have a degree in animation practices and I'm. You know. A human being. So.
In conclusion:
(Y'all should totally reblog my redesign of this btw)
#rotbtd#the big four#anti ai#ai discourse#fanart#ask#inbox#rise of the brave tangled dragons#httyd#how to train your dragon#hiccup horrendous haddock iii#brave#tangled#rapunzel#merida#jack frost#rotg#rise of the guardians#dreamworks#disney#hijack#frostcup#jackunzel#jarida#mericcup#hicunzel#crossover#hicless#rtte#race to the edge
Text
Notes Game - Bladder Torture
notes game to make my bladder stretched to its limits. no cum, just edge and hold like the stupid dumb slut I am (MDNI) wanna help? reblogs, like and comments (any time you want), make me suffer like the useless whore I am
every 1 note is 1 minute to add to my holding
every 5 notes are 100 ml I have to drink
every 10 notes I press on my bladder for 5 seconds
every 50 notes I press my bladder on a counter for 10 sec and release for 5, 3 times
every 70 notes 10 squats
100 notes: after 1 hour I can't hold with my hands or cross my leg anymore
~ after 100 notes:
every 10 notes is a slap on my full bladder
every 20 notes a slap on my open spread pussy
every 40 notes lie on my belly with something under it for 5 min
200 notes: do a workout with full bladder, leaking is not an excuse to stop
220 notes: melt an ice cube in my cunt with panties on, can't remove them (fake pee)
250 notes: body write with humiliating words while sitting on the toilet
+++ I accept tasks, challenges, punishments in the comments/asks
~ punishment
leaking
drink a glass of water + add 10 min + fake pee
wetting / accident
drink 4 glass of water + add 30 min + lay on belly with a small ball on bladder for 10 min
will close on september 11th
#bladder challenge#bladder control#bladder desperation#bladder holding#bladder torture#cl!t torture#humiliation kink#omo hold#pee humiliation#piss holding#degrade and humiliate me#piss humiliation#ruined 0rgasm#0rgasm denial#0rgasm control#bd/sm daddy#pain slvt#free use slvt#dumb slvt#omo challenge#c0cksleeve#c0ckslut#c0ckwh0re#stupid slvt#daddy's good girl#edging kink#desperate wh0re#attention wh0r3#cnc free use#ai pee desperation
Text
#goon and edge#permanent sissy#bnwo propaganda#sissy for bbc#ai babe#ai sexy#ai waifu#ai goon#goon trigger
Text
#bambi sleep#bimbofied#bimbo doll#bimbo hypnosis#bimbo girl#bimboization#bimbo training#bambification#bambi#bambisleep#denial kink#ai bimbo#female denial#tease & denial#deny#0rgasm denial#locked and denied#edging and denial
Text
If any of the Batsiblings ever end up in serious medical trouble, to the point they've been forced onto bedrest or put into a medically-induced coma, their siblings will rotate in shifts (and require physical force to be removed sometimes) or hover at windowsills and talk at the patient and meddle so much that Leslie keeps having to smack their hands away from IV lines or wrestle back her stethoscope. But the second said sibling wakes up it converts spontaneously to around the clock taunting, shoulder punches that nearly send them and their crutches into the floor, silent help during PT, and someone coming by to coerce pain meds onto you with bedside manner so bad that you forget all about almost dying and start planning homicide.
#You guys... I have the scariest amount of deja vu about this one#ISTG i have made this post before.#But I myself cannot find it#In any case I assume renditions of it have been made over and over this is not a novel concept#which gets us thinking about what a novel concept even is and the fact that those poor baby AIs (I hate em) are just doing their best to#mimic the way our fleshy brain soup pours everything together and blends it into something new#but they're shitty at it so you can see the hard edges of the words they ripped out of a newspaper to make their collage#devastating truly#batman#dc comics#batfamily#personal#shitpost#batpost
Text
Learn from who? Learn from you? You are still a brat. What do you know? You're only three years older. Like you are any better than me. You're 21, and still a virgin. What are you proud of? I think you can't do it.
KISEKI: DEAR TO ME Ep. 06
#kiseki: dear to me#kisekiedit#kdtm#kiseki dear to me#ai di x chen yi#chen yi x ai di#nat chen#chen bowen#louis chiang#chiang tien#jiang dian#userspring#uservid#userspicy#userrain#pdribs#userjjessi#*cajedit#*gif#*gestures at the caption* this is honestly the funniest argument they could possibly have idfk what to tell you. it's very ai di#meanwhile whatever's going through chen yi's head rn has recently been doused with 'the boss doesnt care abt me like that'#after watching cdy and zml at dinner. like chen yi already knows *before* ep9 & ai dis confession that cdy will never look at him#(the diff. between this scene & ep9's. is him failing in regards to the gang as well in cdy's eyes. he goes from feelings of disappointment#& irritability to complete despair and both times he drinks to cope. bc hes not enough in cdy's eyes in ANY of the ways he wants/hoped)#so honestly the crisis chen yi goes thru right here isnt unfounded at all hes literally dealing w an inadvertent rejection of his feelings#its chaos in his head and ai di is picking at him again and the wine is tilting in his blood and then- 'learn from who? learn from you?'#like what do YOU know about love ai di (WHILE CHEN YI'S PULLING HIM LIKE THAT-) so OF COURSE ai di goes for the deepest dig he can.#'i bet you cant get hard that explains how much of a coward you are'. its ridiculous the ways in which they push each other over the edge#but im ngl im kind of obsessed the way chen yi's tipsy line of thinking 'learn from you?' turned into the action 'fuck it learn from ME'#ANYWAY EVERYONE GO LISTEN TO 'LOSE CONTROL' BY TEDDY SWIMS RIGHT THE FUCK NOW. THe most chen yi song pre-ep9
82 notes
·
View notes
Text
AI inference in edge computing: Benefits and use cases
New Post has been published on https://thedigitalinsider.com/ai-inference-in-edge-computing-benefits-and-use-cases/
As artificial intelligence (AI) continues to evolve, its deployment has expanded beyond cloud computing into edge devices, bringing transformative advantages to various industries.
AI inference at the edge refers to the process of running trained AI models directly on local hardware, such as smartphones, sensors, and IoT devices, rather than relying on remote cloud servers for data processing.
This convergence of AI and edge computing marks a transformative shift in how data is processed and used. By bringing AI capabilities closer to the source of data generation, it unlocks real-time decision-making and delivers substantial gains in speed, privacy, security, and efficiency.
This article delves into the benefits of AI inference in edge computing and explores various use cases across different industries.
Fig 1. Benefits of AI Inference in edge computing
Real-time processing
One of the most significant advantages of AI inference at the edge is the ability to process data in real-time. Traditional cloud computing often involves sending data to centralized servers for analysis, which can introduce latency due to the distance and network congestion.
Edge computing mitigates this by processing data locally on edge devices or near the data source. This low-latency processing is crucial for applications requiring immediate responses, such as autonomous vehicles, industrial automation, and healthcare monitoring.
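The latency gap can be sketched with a toy comparison. Everything here is an assumption for illustration: the 150 ms round trip is a simulated network delay, and `local_inference` is a stand-in threshold rule rather than a real model.

```python
import time

def local_inference(reading):
    # Stand-in for an on-device model: a simple threshold rule.
    return "alert" if reading > 0.8 else "ok"

def cloud_inference(reading, round_trip_s=0.15):
    # Same logic, but pay a simulated network round trip first.
    time.sleep(round_trip_s)
    return "alert" if reading > 0.8 else "ok"

start = time.perf_counter()
local_result = local_inference(0.9)
local_ms = (time.perf_counter() - start) * 1000

start = time.perf_counter()
cloud_result = cloud_inference(0.9)
cloud_ms = (time.perf_counter() - start) * 1000

print(f"edge: {local_result} in {local_ms:.2f} ms, "
      f"cloud: {cloud_result} in {cloud_ms:.1f} ms")
```

Both paths reach the same decision; only the edge path reaches it quickly enough for a vehicle or a patient monitor.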
Privacy and security
Transmitting sensitive data to cloud servers for processing poses potential security risks. Edge computing addresses this concern by keeping data close to its source, reducing the need for extensive data transmission over potentially vulnerable networks.
This localized processing enhances data privacy and security, making edge AI particularly valuable in sectors handling sensitive information, such as finance, healthcare, and defense.
Bandwidth efficiency
By processing data locally, edge computing significantly reduces the volume of data that must be transmitted to remote cloud servers. This reduction has several important implications. First, it eases network congestion, since local processing at the edge minimizes the burden on network infrastructure.
Second, the diminished need for extensive data transmission lowers bandwidth costs for organizations and end users, as sending less data over the Internet or cellular networks can translate into substantial savings.
This benefit is particularly relevant in environments with limited or expensive connectivity, such as remote locations. In essence, edge computing optimizes the utilization of available bandwidth, enhancing the overall efficiency and performance of the system.
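A minimal sketch of this window-and-aggregate pattern follows; the per-second readings and 60-second window are assumed values for illustration.

```python
# One hour of per-second vibration readings from a sensor (simulated).
readings = [0.5 + 0.001 * (i % 10) for i in range(3600)]

# Naive approach: ship every raw reading to the cloud.
raw_messages = len(readings)

# Edge approach: summarize each 60-second window locally and
# transmit only the aggregate (min/mean/max) per window.
window = 60
aggregates = []
for start in range(0, len(readings), window):
    chunk = readings[start:start + window]
    aggregates.append({
        "min": min(chunk),
        "mean": sum(chunk) / len(chunk),
        "max": max(chunk),
    })

edge_messages = len(aggregates)
print(f"raw: {raw_messages} messages, edge: {edge_messages} messages "
      f"({raw_messages // edge_messages}x reduction)")
```

Here local summarization cuts 3,600 uplink messages to 60, a 60x reduction, while preserving the statistics the cloud actually needs.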
Scalability
AI systems at the edge can be scaled efficiently by deploying additional edge devices as needed, without overburdening central infrastructure. This decentralized approach also enhances system resilience. In the event of network disruptions or server outages, edge devices can continue to operate and make decisions independently, ensuring uninterrupted service.
Energy efficiency
Edge devices are often designed to be energy-efficient, making them suitable for environments where power consumption is a critical concern. By performing AI inference locally, these devices minimize the need for energy-intensive data transmission to distant servers, contributing to overall energy savings.
Hardware accelerators
AI accelerators, such as NPUs, GPUs, TPUs, and custom ASICs, play a critical role in enabling efficient AI inference at the edge. These specialized processors are designed to handle the intensive computational tasks required by AI models, delivering high performance while optimizing power consumption.
By integrating accelerators into edge devices, it becomes possible to run complex deep learning models in real time with minimal latency, even on resource-constrained hardware. These accelerators are a key enabler of edge AI, allowing larger and more powerful models to be deployed at the edge.
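How a runtime discovers accelerators varies by SDK, but the fallback pattern itself is simple. The sketch below assumes a hypothetical `pick_accelerator` helper and generic backend names, not any specific vendor API.

```python
def pick_accelerator(available, preference=("npu", "gpu", "tpu", "cpu")):
    """Return the first preferred accelerator present on the device.

    Falls back to CPU so inference still runs on minimal hardware.
    """
    for backend in preference:
        if backend in available:
            return backend
    return "cpu"

# A phone with an NPU uses it; a bare microcontroller falls back to CPU.
print(pick_accelerator({"cpu", "npu"}))   # npu
print(pick_accelerator({"cpu"}))          # cpu
```

Real frameworks follow the same idea: probe what the device offers, prefer the fastest backend, and degrade gracefully rather than failing.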
Offline operation
Offline operation through Edge AI in IoT is a critical asset, particularly in scenarios where constant internet connectivity is uncertain. In remote or inaccessible environments where network access is unreliable, Edge AI systems ensure uninterrupted functionality.
This resilience extends to mission-critical applications such as autonomous vehicles and security systems, where it improves response times and reduces latency. Edge AI devices can locally store and log data when connectivity is lost, safeguarding data integrity.
Furthermore, they serve as an integral part of redundancy and fail-safe strategies, providing continuity and decision-making capabilities, even when primary systems are compromised. This capability augments the adaptability and dependability of IoT applications across a wide spectrum of operational settings.
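The store-and-forward buffering described above can be sketched as follows. The `StoreAndForward` class and its in-memory "upload" list are illustrative assumptions; a real device would persist the buffer to flash and bound its size.

```python
from collections import deque

class StoreAndForward:
    """Buffer readings locally while offline; flush when the link returns."""

    def __init__(self):
        self.buffer = deque()
        self.sent = []

    def record(self, reading, online):
        self.buffer.append(reading)
        if online:
            self.flush()

    def flush(self):
        # Drain the backlog in arrival order; the list stands in
        # for an actual upload to the cloud.
        while self.buffer:
            self.sent.append(self.buffer.popleft())

node = StoreAndForward()
node.record({"t": 1, "temp": 21.5}, online=True)
node.record({"t": 2, "temp": 21.7}, online=False)  # link down: buffered
node.record({"t": 3, "temp": 21.9}, online=False)  # still buffered
node.record({"t": 4, "temp": 22.0}, online=True)   # link back: backlog flushed
print(len(node.sent), len(node.buffer))  # 4 0
```

No reading is lost during the outage, and ordering is preserved when connectivity returns.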
Customization and personalization
AI inference at the edge enables a high degree of customization and personalization by processing data locally, allowing systems to deploy customized models for individual user needs and specific environmental contexts in real-time.
AI systems can quickly respond to changes in user behavior, preferences, or surroundings, offering highly tailored services. The ability to customize AI inference services at the edge without relying on continuous cloud communication ensures faster, more relevant responses, enhancing user satisfaction and overall system efficiency.
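As one hedged illustration of on-device personalization, the sketch below adapts a single alert threshold from local user feedback; the class name, learning rate, and update rule are assumptions for illustration, not a production technique.

```python
class PersonalizedThreshold:
    """Adapt an alert threshold to one user, entirely on device.

    Each piece of local feedback nudges the threshold toward the
    score that was misjudged; nothing leaves the device.
    """

    def __init__(self, threshold=0.5, rate=0.2):
        self.threshold = threshold
        self.rate = rate

    def feedback(self, score, was_correct):
        # A false positive pulls the threshold up toward the offending
        # score; a missed detection pulls it down, via the same rule.
        if not was_correct:
            self.threshold += self.rate * (score - self.threshold)

model = PersonalizedThreshold()
model.feedback(0.9, was_correct=False)  # user dismissed an alert
print(round(model.threshold, 2))  # 0.58
```

Because the update happens locally, the system personalizes in real time without any cloud round trip.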
The traditional paradigm of centralized computation, wherein AI models reside and operate exclusively within data centers, has its limitations, particularly in scenarios where real-time processing, low latency, privacy preservation, and network bandwidth conservation are critical.
This demand for AI models to process data in real time while ensuring privacy and efficiency has given rise to a paradigm shift for AI inference at the edge. AI researchers have developed various optimization techniques to improve the efficiency of AI models, enabling AI model deployment and efficient inference at the edge.
In the next section we will explore some of the use cases of AI inference using edge computing across various industries.
The rapid advancements in artificial intelligence (AI) have transformed numerous sectors, including healthcare, finance, and manufacturing. AI models, especially deep learning models, have proven highly effective in tasks such as image classification, natural language understanding, and reinforcement learning.
Performing data analysis directly on edge devices is becoming increasingly crucial in scenarios like augmented reality, video conferencing, streaming, gaming, Content Delivery Networks (CDNs), autonomous driving, the Industrial Internet of Things (IoT), intelligent power grids, remote surgery, and security-focused applications, where localized processing is essential.
In this section, we will discuss use cases across different fields for AI inference at the edge, as shown in Fig 2.
Fig 2. Applications of AI Inference at the Edge across different fields
Internet of Things (IoT)
The expansion of the Internet of Things (IoT) is significantly driven by the capabilities of smart sensors. These sensors act as the primary data collectors for IoT, producing large volumes of information.
However, centralizing this data for processing can result in delays and privacy issues. This is where edge AI inference becomes crucial. By integrating intelligence directly into the smart sensors, AI models facilitate immediate analysis and decision-making right at the source.
This localized processing reduces latency and the necessity to send large data quantities to central servers. As a result, smart sensors evolve from mere data collectors to real-time analysts, becoming essential in the progress of IoT.
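One way to see a sensor acting as a real-time analyst is a smoothing-plus-event sketch: an exponential moving average suppresses noise, and only state changes are emitted over the uplink. The threshold, smoothing factor, and sample values below are illustrative assumptions.

```python
def sensor_events(samples, threshold=30.0, alpha=0.3):
    """Turn raw temperature samples into discrete events at the sensor.

    An event is emitted only when the smoothed value crosses the
    threshold, so the uplink carries events, not every sample.
    """
    ema, state, events = samples[0], samples[0] > threshold, []
    for t, s in enumerate(samples):
        ema = alpha * s + (1 - alpha) * ema  # exponential moving average
        hot = ema > threshold
        if hot != state:
            events.append((t, "overheat" if hot else "normal"))
            state = hot
    return events

samples = [25, 26, 25, 31, 33, 34, 35, 29, 27, 24, 23]
print(sensor_events(samples))  # [(5, 'overheat'), (8, 'normal')]
```

Eleven raw samples collapse to two meaningful events, decided entirely at the source.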
Industrial applications
In industrial sectors, especially manufacturing, predictive maintenance plays a crucial role in identifying potential faults and anomalies in processes before they occur. Traditionally, heartbeat signals, which reflect the health of sensors and machinery, are collected and sent to centralized cloud systems for AI analysis to predict faults.
However, the current trend is shifting. By leveraging AI models for data processing at the edge, we can enhance the system’s performance and efficiency, delivering timely insights at a significantly reduced cost.
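A rolling z-score over recent heartbeat intervals is one simple way to flag such anomalies at the edge; the window size, z-limit, and signal values below are illustrative, not tuned for real machinery.

```python
from statistics import mean, stdev

def heartbeat_anomalies(signal, window=5, z_limit=3.0):
    """Flag heartbeat intervals that deviate from the recent baseline.

    Computes a rolling z-score against the previous `window` readings;
    values beyond `z_limit` standard deviations are flagged.
    """
    flagged = []
    for i in range(window, len(signal)):
        baseline = signal[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(signal[i] - mu) / sigma > z_limit:
            flagged.append(i)
    return flagged

# Steady ~1.0 s heartbeats, then a machine starts lagging at index 8.
signal = [1.0, 1.02, 0.98, 1.01, 0.99, 1.0, 1.01, 0.99, 2.5, 1.0]
print(heartbeat_anomalies(signal))  # [8]
```

Running this check on the gateway next to the machine means a fault is flagged within one heartbeat, rather than after a round trip to the cloud.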
Mobile / Augmented reality (AR)
In the field of mobile and augmented reality, the processing requirements are significant due to the need to handle large volumes of data from various sources such as cameras, Lidar, and multiple video and audio inputs.
To deliver a seamless augmented reality experience, this data must be processed within a stringent latency budget of roughly 15 to 20 milliseconds. Meeting that budget requires running AI models on specialized on-device processors, supported by low-latency communication technologies.
The integration of edge AI with mobile and augmented reality results in a practical combination that enhances real-time analysis and operational autonomy at the edge. This integration not only reduces latency but also aids in energy efficiency, which is crucial for these rapidly evolving technologies.
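A latency budget of this kind is often enforced by degrading effect quality when recent frames run long; the quality tiers and 18 ms budget in the sketch below are assumptions for illustration.

```python
def choose_quality(frame_times_ms, budget_ms=18.0):
    """Pick an AR rendering tier from recent frame times.

    AR pipelines target roughly 15-20 ms end to end; when the recent
    average creeps toward the budget, drop to a cheaper tier rather
    than miss frames.
    """
    avg = sum(frame_times_ms) / len(frame_times_ms)
    if avg <= budget_ms * 0.75:
        return "high"
    if avg <= budget_ms:
        return "medium"
    return "low"

print(choose_quality([9, 10, 11]))   # high
print(choose_quality([16, 17, 18]))  # medium
print(choose_quality([22, 25, 30]))  # low
```

The decision itself must run on the device: shipping frame timings to a server to decide would cost more than the budget it is protecting.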
Security systems
In security systems, the combination of video cameras with edge AI-powered video analytics is transforming threat detection. Traditionally, video data from multiple cameras is transmitted to cloud servers for AI analysis, which can introduce delays.
With AI processing at the edge, video analytics can be conducted directly within the cameras. This allows for immediate threat detection, and depending on the analysis’s urgency, the camera can quickly notify authorities, reducing the chance of threats going unnoticed. This move to AI-integrated security cameras improves response efficiency and strengthens security at crucial locations such as airports.
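In-camera analytics can be as simple as frame differencing; the sketch below is a toy grayscale version with assumed thresholds, far simpler than the detection models a real edge camera would run.

```python
def motion_detected(prev_frame, frame, pixel_delta=25, min_changed=4):
    """On-camera motion check by frame differencing.

    Frames are flattened lists of grayscale values (0-255). A pixel
    counts as changed when it moves by more than `pixel_delta`;
    motion is declared when at least `min_changed` pixels change.
    """
    changed = sum(
        1 for p, q in zip(prev_frame, frame) if abs(p - q) > pixel_delta
    )
    return changed >= min_changed

quiet = [10] * 16                  # 4x4 frame, flattened
intruder = [10] * 10 + [200] * 6   # bright object enters the scene
print(motion_detected(quiet, quiet))     # False
print(motion_detected(quiet, intruder))  # True
```

Only the "motion" verdict, or a short clip around it, needs to leave the camera, which is what makes immediate alerting feasible.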
Robotic surgery
In critical medical situations, remote robotic surgery involves conducting surgical procedures with the guidance of a surgeon from a remote location. AI-driven models enhance these robotic systems, allowing them to perform precise surgical tasks while maintaining continuous communication and direction from a distant medical professional.
This capability is crucial in the healthcare sector, where real-time processing and responsiveness are essential for smooth operations under high-stress conditions. For such applications, it is vital to deploy AI inference at the edge to ensure safety, reliability, and fail-safe operation in critical scenarios.
Autonomous driving
Autonomous driving is a pinnacle of technological progress, with AI inference at the edge taking a central role. AI accelerators in the car equip vehicles with onboard models for rapid real-time decision-making.
This immediate analysis enables autonomous vehicles to navigate complex scenarios with minimal latency, bolstering safety and operational efficiency. By integrating AI at the edge, self-driving cars adapt to dynamic environments, ensuring safer roads and reduced reliance on external networks.
This fusion represents a transformative shift, where vehicles become intelligent entities capable of swift, localized decision-making, ushering in a new era of transportation innovation.
The integration of AI inference in edge computing is revolutionizing various industries by facilitating real-time decision-making, enhancing security, and optimizing bandwidth usage, scalability, and energy efficiency.
As AI technology progresses, its applications will broaden, fostering innovation and increasing efficiency across diverse sectors. The advantages of edge AI are evident in fields such as the Internet of Things (IoT), healthcare, autonomous vehicles, and mobile/augmented reality devices.
These technologies benefit from the localized processing that edge AI enables, promising a future where intelligent, on-the-spot analytics become the standard. Despite the promising advancements, there are ongoing challenges related to the accuracy and performance of AI models deployed at the edge.
Ensuring that these systems operate reliably and effectively remains a critical area of research and development. The widespread adoption of edge AI across different fields highlights the urgent need to address these challenges, making robust and efficient edge AI deployment the new norm.
As research continues and technology evolves, the potential for edge AI to drive significant improvements in various domains will only grow, shaping the future of intelligent, decentralized computing.
#2024#accelerators#adoption#ai#ai at the edge#ai inference#ai model#AI models#AI systems#AI-powered#Analysis#Analytics#anomalies#applications#approach#ar#Article#artificial#Artificial Intelligence#audio#augmented reality#automation#autonomous#autonomous driving#autonomous vehicles#Behavior#Cameras#Cars#Cloud#cloud computing
0 notes
Text
After seeing this beautiful (infamous?) Vrg Grl crochet dress with strong 70's vibes that everyone seemed to want to crochet for themselves after Taylor Swift wore it recently, I wanted to turn the fabric into a pattern. And voilà! It's perfect as a beach cover-up dress, right? The outfit is from the Ambitions EP if memory serves me right. Doesn't look too shabby, does it?
#ts3#sims 3#ts3wip:patterns#simlicious wip#the sims 3#ai-upscaled image for smoother edges#I accidentally replaced my graphics rules with the wrong one so the pic was not as nice as usual
65 notes
·
View notes
Text
280 notes
·
View notes
Text
youtube
Ashton Instagram Live - 6 June 2024
Including acoustic performances of:
"Landslide" by Fleetwood Mac
"Drive" from SUPERBLOOM
A teaser of "Wild Things" from BLOOD ON THE DRUMS
"Straight To Your Heart" from BLOOD ON THE DRUMS
"Red Desert" by 5SOS
"Wicked Game" by Chris Isaak
#I 🤸🏻♀️ FEEL 🤸🏻♀️ INSANE 🤸🏻♀️#5sos#5 seconds of summer#ashton irwin#ashton#ai ig live#straight to your heart#blood on the drums#instagram#ai ig#video#kh4f post#listen i had such a stressful night & was still reeling this morning so this is both just what i needed & also has sent me over the edge 👹#this was so brutal oh he had no mercy#i thought the peak was going to be the Springsteen needle drop at the beginning lmaoo (which almost got me a copyright claim thanks sir 😌)#then the man immediately breaks out with Fleetwood Mac#DEVESRSTING#he 100000% needs to record this version of Drive btw ohmygod#like listen i feel like bc he has a tendency towards falsetto harmonies his low end tends to get overlooked#and that shit was WILD in that drive rendition#ohmygod?#i cannot believe any of this was real#i would screenshot the shit out of it but i have to go get ready for Luke now wtf 😭😭😭😭#INSANITY#👰🏻♀️#why i no can kiss#Youtube
55 notes
·
View notes
Text
if anything, this work has really made me think a lot about lies
it can really be love, it convinced me on that
#oshi no ko#oshi no ko spoilers#hikaai#ai hoshino#hikaru kamiki#that's why I think aqua's totally on the wrong track in 160#he shouldn't talk down about lies like that...#spoilers#doodle#come on aqua please#but he does have all the right to be on edge..his sister almost got stabbed
29 notes
·
View notes
Text
since I'm just a dumb slut, I need yall to help me dress up for my holding
#ai pee desperation#bladder torture#piss holding#bladder challenge#bladder control#bladder desperation#bladder holding#humiliation kink#omo hold#pee humiliation#free use slvt#dumb slvt#edging kink#bd/sm kink
15 notes
·
View notes
Text
Ermmmm idk who would use these but uhhhh
Transparent versions of my favorite images!!
If you’re wondering where the first three came from…
Her Pose Sheet!!!
#mario rabbids sparks of hope#rabbid edge#yes I actually MANUALLY erase these!!#I can’t trust AI to erase backgrounds#you can credit me but it’s optional#I actually have more but these are my top favorites!
23 notes
·
View notes