#Nvidia hardware
Explore tagged Tumblr posts
diagnozabam · 11 days ago
Text
Nintendo Switch 2: Possible official reveal on January 16, 2025
Nintendo appears to be preparing to unveil its long-awaited console, unofficially dubbed the Nintendo Switch 2, according to information from leaker Nate the Hate and the site VGC. The presentation is expected to be split into two events: one focused on hardware (January 16, 2025) and another dedicated to games (late February or early March). What we know so far…
0 notes
andmaybegayer · 9 days ago
Text
bill gates is a rabid dog and he must be beaten to death with a stick
97 notes · View notes
mvfm-25 · 11 months ago
Text
" Because it's worth knowing just how disc + Xbox = best games on Earth! "
Official Xbox Magazine (UK) no. 1, November 2001, pp. 38–45
126 notes · View notes
retrocgads · 5 months ago
Text
UK 1998
9 notes · View notes
thetechempire · 3 months ago
Text
Kai-Fu Lee has declared war on Nvidia and the entire US AI ecosystem.
🔹 Lee emphasizes the need to focus on reducing the cost of inference, which is crucial for making AI applications more accessible to businesses. He highlights that the current pricing model for services like GPT-4 — $4.40 per million tokens — is prohibitively expensive compared to traditional search queries. This high cost hampers the widespread adoption of AI applications in business, necessitating a shift in how AI models are developed and priced. By lowering inference costs, companies can enhance the practicality and demand for AI solutions.
🔹 Another critical direction Lee advocates is the transition from universal models to “expert models,” which are tailored to specific industries using targeted data. He argues that businesses do not benefit from generic models trained on vast amounts of unlabeled data, as these often lack the precision needed for specific applications. Instead, creating specialized neural networks that cater to particular sectors can deliver comparable intelligence with reduced computational demands. This expert model approach aligns with Lee’s vision of a more efficient and cost-effective AI ecosystem.
🔹 Lee’s startup, 01.AI, is already implementing these concepts successfully. Its Yi-Lightning model has achieved impressive performance, ranking sixth globally while being extremely cost-effective at just $0.14 per million tokens. This model was trained with far fewer resources than competitors, illustrating that high costs and extensive data are not always necessary for effective AI training. Additionally, Lee points out that China’s engineering expertise and lower costs can enhance data collection and processing, positioning the country not just to catch up to the U.S. in AI but potentially to surpass it in the near future. He envisions a future where AI becomes integral to business operations, fundamentally changing how industries function and reducing reliance on traditional devices like smartphones.
2 notes · View notes
kreativ-im-studio · 1 year ago
Video
youtube
Do you REALLY need a good GPU for Blender?
17 notes · View notes
notihardware21 · 9 months ago
Text
Driving the Future: NVIDIA's Success and Its AI Chips
NVIDIA has posted an impressive 30% jump in quarterly earnings, thanks to growing demand for its artificial intelligence (AI) chips. These chips are revolutionizing sectors such as autonomous vehicles, medical research, and supercomputing. With more than 10 million vehicles equipped with its technology and 70% penetration among leading research institutions, NVIDIA is leading the AI revolution. Its end-to-end focus on hardware and software is accelerating innovation and promises an exciting future driven by artificial intelligence.
https://www.eleconomista.com.mx/mercados/Ganancias-trimestrales-de-Nvidia-superan-los-pronosticos-20240221-0078.html
4 notes · View notes
blazehedgehog · 2 years ago
Note
What do you think of Nvidia's new DLSS Frame Generation technology that creates frames in between ones created by the GPU with AI to produce a smoother frame rate and also give you more FPS?
As a chronic screenshotist, I hate it. I can't find the post now, but I talked about how smeary recent games are thanks to upscaling. Games with dynamic resolutions and heavy upscaling always leave smudgy, smeary, ugly artifacts all over the screen, where the renderer is filling in missing information.
I see it in a lot of Unreal Engine games, but I'm pretty sure I saw it in the PS4 version of Street Fighter 6, too. It makes games look worse than they should.
I've also played Fortnite through things like GeForce Now, where you can (or used to be able to) crank the settings up to max, and even with everything on Epic Quality, surfaces can still be smeary, grainy, and covered in artifact trails.
So when you tell me that not only is the rendering engine going to be making a blurry mess out of my game because of resolution scaling, but it's also going to be motion smoothing whole entire new frames on top of that?
Gross. No thank you.
I feel like photo mode is a way to "solve" this, because photo mode deliberately takes you out of the action to render screenshots at much, much higher settings than you normally get during gameplay. But that creates a different problem, in that it's not how photography always works.
Photography is about taking 300 shots and only using the five best ones.
Photo mode suggests you are setting up for a specific screenshot on purpose when that's just not how it always goes.
If interframe generation makes this worse, or harder? I don't want it. I don't care what the benefits claim to be. We're in a constant war to tell our parents to turn motion smoothing off on new TVs, so why would we want to bring that into games?
17 notes · View notes
shamblz · 1 year ago
Text
Clean installing Nvidia drivers in the desperate hope that I'll be able to play Alan Wake at a reasonable fps on lowest settings
3 notes · View notes
0x5742 · 2 years ago
Text
U P G R A D E
8 notes · View notes
jalprig · 2 years ago
Text
3 notes · View notes
Text
GPU Industry Rant
I'm angry, angry about graphics cards.
Why?
Because what used to be a fun exercise in trying to find the best value or trying to find a good deal at a shoestring budget has turned into "how long do I need to wait to find something that isn't awful value".
It used to be that you could get a reasonably decent new gaming GPU for about $100-$150 and every new generation there'd be new cards in that price range that were a decent bit better. You used to be able to get a genuinely good gaming GPU at $150-250 with significant improvements every generation.
What used to be
It will soon be 4 years since the release of the GTX 1650, and about 3.5 years since the 1660 Super and 1650 Super. These three cards represent the last time there was a step forward at these two price points.
In 2016 we had the GTX 1050 Ti at $150, the RX 470 at $180, and the RX 480 at $200 ($250 for the 8GB model). The 1050 Ti was pretty awful value compared to the 25% faster RX 470, but it still beat previous-generation $200 cards by a few percent. The GTX 1650 at $150 then just about matched the RX 470 in 2019, still not a great value improvement, especially since AMD had released the slightly faster RX 570 at $170 in 2017, but at least you paid slightly less for roughly matching performance. The GTX 1650 Super half a year later was similar, matching or slightly beating the RX 580 (which in turn was a bit faster than the 480 and slightly cheaper) at $160, making for a small step up in performance over the RX 570. The 1660 Super, at around the same time, set you back $230 while providing about 25% more performance than a 1650 Super or RX 580, putting it on par with 2016's $450 GTX 1070: quite an improvement in value.
As for cards below $150, we've had nothing since the GT 1030 ($70, 2017), RX 550 ($80, 2017), RX 560 ($100, 2017), and GTX 1050 ($110, 2016).
Since then we've had
The GTX 1630: a card that costs $150 while performing somewhere between a 1050 and a 1050 Ti, making it uncompetitive even against the bad-value 1050 Ti from 2016. The only way to make the 1630 look good is to compare it to the 950 from 2015.
The RX 6400: $160 for a card that gets beaten by the 1650 by a slight margin, while also having issues in older PCs due to its limited x4 PCIe bandwidth.
The RX 6500 XT: a $200 card that gets handily beaten by the 1650 Super, with the same PCIe issue as the 6400.
The RTX 3050: a $250 (in theory, at least) card that very slightly beats the 1660 Super. You're pretty much paying at least $20 more for unusable ray tracing and the privilege of being able to use DLSS.
Cope
Some YouTubers a while back went on about how "the age of the APU" is coming, or something like that, arguing that anything up to about $150 will be made obsolete by integrated graphics. They were technically correct, but only if you compare the latest and greatest iGPU in laptop CPUs, the Radeon 680M, to the GTX 1630, which as I mentioned earlier is worse than a 1050 Ti, a $150 GPU coming up on its 7th birthday in a few months. Presumably the same 680M, and possibly a 12-CU RDNA 3 GPU, will make it into some Ryzen 7000G APUs later this year. But even then, the top iGPU (which will come in a CPU that's more expensive than a cheap CPU + GPU combo would have been back in the day) might only match RX 6400 performance, or maybe 1650 performance; certainly not 1650 Super performance, and absolutely not what ought to have been $150 performance this generation (which is to say, something closer to the RTX 3050).
Hope
At least the used market is back to relatively normal. If you want RX 6500 XT performance but don't feel like paying $200 for it, you can just buy a used RX 580 for around $90, or if you want better, a 1660 Super for about $130 (both "buy it now" prices on eBay). The prices of these used cards scale quite appropriately against what new-card pricing for the same performance levels ought to be.
The downside of buying older cards is that they don't always age that well. The GTX 900 and 10-series have aged like fine milk in the latest games (which is to say, their performance relative to the 20- and 16-series has dropped by a lot), and AMD dropped support for its 2012–2015 lineup in 2021.
Additional notes
It is worth mentioning that the RX 6600 is currently available at $250 in the US when on a small discount, and provides a good 25% performance uplift over the RTX 3050/GTX 1660 Super. However, this pricing is not universal: the same card on German Amazon is €280 (about $300), and on Canadian Amazon the best I found was about 270 USD. Here in Sweden, some part of the increased prices is definitely due to inflation. 1660 Supers used to be around 2,700 SEK; now an RX 6600 is at best 3,200 SEK, which is a pretty big change. Sure, that currently translates to just below $250 before our 25% sales tax, but it doesn't make it feel any better.
6 notes · View notes
coinbuzzfeed · 10 hours ago
Text
NVIDIA Enhances AI Inference with Full-Stack Solutions
Luisa Crawford, Jan 25, 2025, 16:32. NVIDIA introduces full-stack solutions to optimize AI inference, enhancing performance, scalability, and efficiency with innovations like the Triton Inference Server and TensorRT-LLM. The rapid growth of AI-driven applications has significantly increased the demands on developers, who must deliver high-performance results while managing operational complexity…
0 notes
sentivium · 13 days ago
Text
Project Digits: How NVIDIA's $3,000 AI Supercomputer Could Democratize Local AI Development | Caveman Press
0 notes
geekazoidfreak · 14 days ago
Text
U guys are gonna be eating so good in terms of screenshots, im building a new PC and im hoping to get a 7800xt :3 was gonna go for the 7600xt but that memory bus difference is so 👎👎 it's good for most people but not for me, i NEED crisp 4k quality screenshots to feed my little birdie friends
0 notes
peterbordes · 21 days ago
Text
Congrats to our TrajectoryVentures.vc portfolio companies Groq & LightMatter on making TechRadar's 10 hottest AI hardware companies to follow in 2025.
Watch out Nvidia, these startups are looking to dent your dominance.
1 note · View note