#GPU Manufacturers
Text
#GPU Market#Graphics Processing Unit#GPU Industry Trends#Market Research Report#GPU Market Growth#Semiconductor Industry#Gaming GPUs#AI and Machine Learning GPUs#Data Center GPUs#High-Performance Computing#GPU Market Analysis#Market Size and Forecast#GPU Manufacturers#Cloud Computing GPUs#GPU Demand Drivers#Technological Advancements in GPUs#GPU Applications#Competitive Landscape#Consumer Electronics GPUs#Emerging Markets for GPUs
0 notes
Text
I've decided I'm going to slowly buy my computer piece by piece, rather than all at once.
And of course, the GPU I want keeps coming back in stock and then disappearing by the time I click the stock notification link.
Are a lot of people building midrange gaming PCs right now, or is there another stupid crypto boom I have to worry about?
#tech yearning#I'm gonna buy the GPU first since it's the most expensive component#The slightly pricier one I placed as a second choice is still in stock via the manufacturer's newegg page#Whichever one is easily available when I get paid next is what I'm gonna go with#this is annoying af
5 notes
Text
yeah im shipping the pc cross country, the glass breaking is the main thing i really got to worry about
#my heatsink is only two screws to take off so easy shipment there#and obviously the gpu is easy af to take out of there#and then i just got a m.2 ssd so that is very hard to break really and if it does its only like $50 to replace 🤷♀️#so just get instapak in there? and some anti static bubble wrap im golden#and i kept the cases manufacturers box this time i learned a lesson last time so ultra easy shipping
0 notes
Text
The supply chain capitalism of AI. This image partially captures the supply chain of AI as a global and complex phenomenon. Natural resources, components and materials to build AI infrastructure are extracted, shipped, manufactured and produced across the globe. For instance, NVIDIA obtains tungsten from Brazil, gold from Colombia and tantalum from Kazakhstan. Minerals are assembled into GPUs manufactured by TSMC. NVIDIA sells GPUs to data centres across the world. Given the refresh cycles of this hardware, data centres send their components to recycling plants or dumps. The human labour wrapped up in this chain includes data labellers, logistics drivers, data scientists, miners, data centre operators and electronic waste dismantlers, who are also scattered across different geographies. Source: NVIDIA (2022) and fieldwork.
The supply chain capitalism of AI: a call to (re)think algorithmic harms and resistance through environmental lens
433 notes
Text
nobody wants to hear this and im straight up not letting people reblog this because i'm smarter than invoking that thundergod on my head but;
penny for penny, pound for pound, making an ai art image in anything less than like a hundred queries/revisions is almost certainly greener in terms of energy usage than even just painting that same picture in an art program (from having photoshop/csp/blender open - gpu intensive programs you have to run for hours!) or even drawing it on paper (paper + pencil manufacture). if you're gonna bitch about externalities you better consider the externalities on your side of the equation too.
130 notes
Text
GPU delivered, and amazon slapped a shipping label on the manufacturer box and left it on my front step like that. yeah, let everyone outside see i have computer parts laying on the ground. thanks.
46 notes
Text
Silicon Valley let out a sigh of relief on Wednesday when it learned that President Donald Trump’s tariff bonanza included an exemption for semiconductors, which, at least for now, won’t be subject to higher import duties. But just three days later, some US tech companies may be finding that the loophole actually creates more problems than it solves. After the tariffs were announced, the White House published a list of the products that it says are unaffected, and it doesn’t include many kinds of chip-related goods.
That means only a small number of American manufacturers will be able to continue sourcing chips without needing to factor in higher import costs. The vast majority of semiconductors that come into the US currently are already packaged into products that are not exempt, such as the graphics processing units (GPUs) and servers for training artificial intelligence models. And manufacturing equipment that domestic companies use to produce chips in the US wasn’t spared, either.
“If you are a major chip producer who is making a sizable investment in the US, a hundred billion dollars will buy you a lot less in the next few years than the last few years,” says Martin Chorzempa, a senior fellow at the Peterson Institute for International Economics.
The US Department of Commerce did not respond to a request for comment.
Stacy Rasgon, a senior analyst covering semiconductors at Bernstein Research, says the narrow exception for chips will do little to blunt wider negative impacts on the industry. Given that most semiconductors arrive at US borders packaged into servers, smartphones, and other products, the tariffs amount to “something in the ballpark of a 40 percent blended tariff on that stuff,” Rasgon says, referring to the overall import duty rate applied.
Rasgon notes that the semiconductor industry is deeply dependent on other imports and on the overall health of the US economy, because the components it makes are in so many kinds of consumer products, from cars to refrigerators. “They are macro-exposed,” he says.
To determine what goods the tariffs apply to, the Trump administration relied on a complex existing system called the Harmonized Tariff Schedule (HTS), which organizes millions of different products sold in the US market into numerical categories that correspond to different import duty rates. The White House document lists only a narrow group of HTS codes in the semiconductor field that it says are exempted from the new tariffs.
GPUs, for example, are typically coded as either 8473.30 or 8542.31 in the HTS system, says Nancy Wei, a supply chain analyst at the consulting firm Eurasia Group. But Trump's waiver only applies to more advanced GPUs in the latter 8542.31 category. It also doesn't cover other codes for related types of computing hardware. Nvidia's DGX systems, pre-configured servers with built-in GPUs designed for AI computing tasks, are coded as 8471.50, according to the company's website, which means they're likely not exempt from the tariffs.
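To make the mechanics concrete, here is a minimal sketch of how classification drives the outcome. The codes and exemption statuses come from the examples above, the country rates are the ones cited later in this piece, and the 10 percent fallback is an assumption rather than anything in the official schedule.

```python
# Sketch: an HTS classification decides whether the semiconductor carve-out applies.
# Codes and statuses follow the examples in this article; the rates are illustrative only.
EXEMPT_CODES = {"8542.31"}  # advanced GPUs imported as bare chips (exempt per the White House list)

COUNTRY_RATE = {"TW": 0.32, "NL": 0.20, "JP": 0.24}  # country-specific rates cited in this piece

def duty_rate(hts_code: str, origin: str) -> float:
    """Return the ad-valorem duty applied to a product, or 0.0 if it falls under the carve-out."""
    if any(hts_code.startswith(code) for code in EXEMPT_CODES):
        return 0.0
    return COUNTRY_RATE.get(origin, 0.10)  # assumed 10% baseline for unlisted origins

print(duty_rate("8542.31", "TW"))  # 0.0  -> exempt GPU chip category
print(duty_rate("8473.30", "TW"))  # 0.32 -> GPU classified as a computer part, not exempt
print(duty_rate("8471.50", "TW"))  # 0.32 -> e.g. a pre-built DGX server from Taiwan
```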
The line between these distinctions can sometimes be blurry. In 2020, for example, an importer of two Nvidia GPU models asked US authorities to clarify what category it considered them falling under. After looking into the matter, US Customs and Border Protection determined that the two GPUs belong to the 8473.30 category, which also isn’t exempt from the tariffs.
Nvidia's own disclosures about the customs classifications of its products paint a similar picture. Of the over 1,300 items the company lists on its website, less than one-fifth appear to be exempt from Trump's new tariffs, according to their corresponding HTS codes. Nvidia declined to comment to WIRED on which of its products it believes the new import duties apply to.
Bad News for US AI Firms
If a wide range of GPUs and other electronic components are subject to the highest country-specific tariffs, which are scheduled to kick in next week, US chipmakers and AI firms could be facing a significant increase in costs. That could potentially hamper efforts to build more data centers and train the world’s most cutting-edge artificial intelligence models in the US.
That's why Nvidia’s stock price is currently “getting killed,” Rasgon says, having shed roughly one-third of its value since the start of 2025.
“AI hardware, particularly high-end GPUs from Nvidia, will see rising costs, potentially stalling AI infrastructure development in the US,” says Wei from Eurasia Group. “Cloud computing, quantum computing, and military-grade semiconductor applications could also be impacted due to higher costs and supply uncertainties.”
Mark Wu, a professor at Harvard Law School who specializes in international trade, says the looming possibility that other countries embedded in the semiconductor supply chain could impose retaliatory tariffs on the US is creating a very unpredictable environment for businesses. Trump may also soon announce more tariffs specifically targeting chips, something he alluded to at a press briefing on Thursday. “There's so many different scenarios,” Wu says. “It’s almost futile to sort of speculate without knowing what's under consideration.”
More Challenges to Reshoring
Trump has said that his trade policies are intended to bring more manufacturing to the US, but they threaten to reverse what had been a bumper period for US chipmaking. The Semiconductor Industry Association recently released figures showing that sales grew 48.4 percent in the Americas between February 2023 and 2024, far above rates in China, where sales only increased 5.6 percent, and Europe, which saw sales decrease 8.1 percent.
The US has a relatively small share of the global chipmaking market as a whole, however, due to decades of offshoring. Fabrication plants located in the country account for just 12 percent of worldwide capacity, down from 37 percent in 1990. The CHIPS Act, introduced under the Biden administration, sought to reverse the trend by appropriating $52 billion for investment in chip manufacturing, training, and research. Trump called the law a “horrible thing” and recently set up a new office to manage its investments.
A glaring omission from the list of HTS codes exempt from Trump's tariffs is the set of codes that correspond to lithography machines, a highly sophisticated category of equipment central to chipmaking. Most of the world's advanced lithography machines are made today in countries like the Netherlands (subject to a 20 percent tariff) and Japan (a 24 percent tariff). If these devices become significantly more costly to import, it could get in the way of bringing semiconductor manufacturing back to the US.
Also hit by Trump’s tariffs are a litany of less fancy but still essential ingredients for chipmaking: steel, aluminum, electrical components, lighting, and water treatment technology. All of those goods could become more expensive thanks to tariffs. “This is the classic tariff conundrum: If you put tariffs on something, it protects one kind of business, but everything upstream and downstream can lose out,” says Chorzempa.
US Allies Feel the Heat
While some countries that are already subject to US sanctions, like Russia and North Korea, were not included in the tariffs, many American allies are. Taiwan, for example, plays an outsize role in the global semiconductor supply chain relative to its size, because it's home to companies like Taiwan Semiconductor Manufacturing Company (TSMC), which produces the lion's share of the world's most advanced chips.
Taiwan will still feel the impact of the tariffs, despite the semiconductor carve-out, because most of what it actually exports to the US is not exempt, says Jason Hsu, a former Taiwan legislator and senior fellow at the Hudson Institute, a DC-based think tank.
Only about 10 percent of Taiwan’s exports to the US last year were semiconductor products that would be exempt from the new tariffs, according to trade data released by the Department of Commerce. The vast majority of Taiwan’s exports are things like data servers and will be taxed an additional 32 percent.
Unlike TSMC, Taiwanese companies that make servers often operate on thin margins, so they may have no choice but to raise prices for their American clients. “We might be looking at AI server prices going completely out of the roof after that,” Hsu says.
Hsu notes that the new tariffs will particularly hurt Southeast Asian countries, which could undermine a long-standing US strategic objective to decouple from supply chains in China. Countries in the region are being hit with some of the highest tariff rates of all—like Vietnam at 46 percent and Thailand at 36 percent—figures that could deter chipmaking companies like Intel and Micron from moving their factories out of China and into these places.
“I see no soft landing to this,” Hsu says. “I see this as becoming an explosion of global supply chain disorder and chaos. The ramifications are going to be very long and painful.”
10 notes
Text
Cybernetics with Chinese Characteristics & why we suck at the real Grand Strategy Game
Part 2 - The Quickening
Back in 2023, I wrote this more blog-like post about the mid 20th century McCarthyite purges of the Jet Propulsion Laboratory and the knock on effects that had - Namely the inception of the Chinese nuclear program, one-child policy and Chinese computing scene.
Since nothing is new under the sun, we have recently witnessed yet another example of America shooting itself in the foot due to its McCarthyite-style purge of Chinese technology.
The release of the Chinese-created AI system DeepSeek R1 last week led to the largest single-day loss of market value in US stock market history, with NVIDIA's stock decimated.
A record $465 billion was wiped off its valuation in a single day. For scale, that is roughly what the government of Turkey spent on its responsibilities over the whole of 2024.
Why did this happen?
As always, a lot can be put down to US foreign policy and the unintended implications of seemingly positive actions.
Do you want to start a trade war?
Back in the relatively uncontroversial days of the first Trump presidency (yes, it does feel odd saying that), there were scandals over hardware provided by the Chinese company Huawei. This led to the National Defense Authorization Act for Fiscal Year 2019, which explicitly banned Huawei and ZTE hardware from use in US government institutions. It also meant that purchases from US component manufacturers by these companies required US authorisation.
Crucially, this had a 27-month window, which allowed both companies to switch suppliers and move production to domestic alternatives. This actually spurred Chinese chip advances. Following on from this came the 2022 move by the US Department of Commerce: "Commerce Implements New Export Controls on Advanced Computing and Semiconductor Manufacturing Items to the People's Republic of China (PRC)". This further limited the supply of semiconductor, supercomputer, and similar hardware to the PRC and associated countries.
Ok, well so far this is fairly dry stuff. You might think it would hamper Chinese development and, to some extent, it did.
It also proved to be the main catalyst for one financial quant.
Meet the Quant
Meet Liang Wenfeng (梁文锋). Educated to master's level, Liang was keen to apply machine learning methods to various fields, but couldn't get a break. Finally, in the mid-2000s, he settled on a career in quantitative trading using machine learning techniques.
He became successful, founding several trading firms based around using machine learning methods, but his interest in base AI never seemed to cease. It was in 2021 that he started purchasing multiple NVIDIA GPUs to create a side project, leading to the creation of DeepSeek in 2023.
Now, due to import restrictions, there were hard limits on the computation available to them. This, however, did not stop DeepSeek's programming team.
Instead they used it as their strength.
Constraints Breed Innovation
For many years, the Western model of AI releases has focussed on making ever larger models.
Why?
Let's break this down from an evolutionary point of view. Modern Western technology companies are largely monopolistic and monolithic. Many of these companies have previously hired staff at higher salaries not to fill roles, but to deny their competitors, and middle market firms, high-flying staff.
They also closely guard trade secrets. What's the training data? What algorithms were used in construction? Guess you'd better chat up some Silicon Valley bros at parties to find out.
For these kinds of firms, having control over large models, housed in data centres makes perfect sense. Controlling model deployment on their own computing systems, and not using local machines, means that they can not only control their systems more carefully, it also means that they can gatekeep access.
If your business model is to allow people to access your models on your servers, and your employees are focussed on making the biggest, best, models, there is no impetus to innovate more efficient, smaller models.
Companies such as OpenAI therefore have the following traits:
Research/Model focus on size over efficiency
Profit driven culture, with emphasis on closed source code
OpenAI's initial focus was as a not-for-profit developing Artificial General Intelligence. It became a for-profit-driven company over time. - "I personally chose the price and thought we would make some money." - Sam Altman
Staff working within paradigm they set in the early 2020's with established code libraries and direct contact with hardware companies creating chips
Significant capital investment - Upwards of several $ billions
DeepSeek, in comparison, is slightly different
For DeepSeek, necessity bred innovation. In order to create models similar to, or better than, those of their counterparts, they needed to significantly optimise their code. This meant significantly more work writing their own libraries than OpenAI ever had to do.
DeepSeek was started by financial quants, with backgrounds in mainly mathematics and AI. With a focus on mathematics and research, the main drive of many in the company has been exploration of the research space over concerns about profitability.
DeepSeek has also done what OpenAI stopped years ago: actually releasing the code and data for their models. Not only can these models therefore be run via their own gated servers, anyone can replicate their work and make their own system.
For DeepSeek, their traits were:
Research/Model focus on both efficiency and accuracy
Research driven culture, with open nature - “Basic science research rarely offers high returns on investment” - Liang Wenfeng
Strong mathematical background of staff, with ability to work around software, and hardware, constraints
Low capital investment, with a reported training cost of around $5.5 million
From an evolutionary point of view, DeepSeek's traits have outcompeted those of OpenAI.
More efficient models cost less to run. They are also more portable to local machines (see the back-of-envelope sketch below).
The strong ability of DeepSeek's research focussed staff allowed them to innovate around hardware constraints
Opening up the code to everyone allows anyone (still with the right hardware) to make their own version.
To top it off, the cost to make and run DeepSeek R1 is a fraction of the cost of OpenAI's models.
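To put rough numbers on the efficiency and portability point, here is a back-of-envelope sketch. The parameter counts and precisions are illustrative assumptions, not the published specifications of either company's models; the only formula involved is weights ≈ parameters × bytes per parameter.

```python
# Back-of-envelope: memory needed just to hold a model's weights in GPU memory.
# A typical consumer GPU has 16-24 GB of VRAM; data-centre accelerators have 80 GB or more.
def weight_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    # (params_billions * 1e9 params * bytes) / 1e9 bytes-per-GB, the 1e9 factors cancel
    return params_billions * bytes_per_param

# Illustrative numbers only (assumptions, not real model specs):
hosted_dense = weight_memory_gb(175, 2.0)  # 175B parameters at 16-bit -> 350 GB, data centre only
local_open   = weight_memory_gb(7, 0.5)    # 7B parameters quantised to 4-bit -> 3.5 GB, consumer GPU

print(f"large hosted model: ~{hosted_dense:.0f} GB of weights")
print(f"small open model:   ~{local_open:.1f} GB of weights")
```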
House of Cards
Now we can return to today. NVIDIA has lost significant market value. The damage is not limited to NVIDIA but extends to the entire US technology sector, with the most AI-adjacent companies losing from 10% to 30% of their valuation in a single day.
The culture, and business model, of OpenAI isn't just limited to OpenAI, but to the entire US technology ecosystem. The US model has been to create rentier-style financial instruments at sky-high valuations.
US tech stocks have been one of the only success stories for America over the past few decades, ever since the offshoring of many manufacturing industries. Like a lost, long-unemployed Detroit auto-worker, the US has been mainlining technology like fentanyl, ignoring the anti-trust doctor's advice and injecting pure deregulated substances into its veins.
The new AI boom? A new stronger hit, ready for Wall Street, and Private Equity to tie the tourniquet around its arm and pump it right into the arteries.
Like Prometheus, DeepSeek has delved deep, retrieved fire from the algorithmic gods, and shown its creation to the world. The stock market is on fire as the traders come off their high, realising they still live among the ruins of barren, decrepit warehouses and manufactories, with corporate heads and company leaders reigning over the wreckage like feudal lords, collecting tithes from the serfs working their domain.
A Tale of Two Cities
The rise of DeepSeek isn't just a one-off story of derring-do in the AI world: It's a symbolic representation of the changing world order. DeepSeek is but one company among many who are outcompeting the US, and the world, in innovation.
Where once US free-markets led the world in manufacturing, technology and military capability, now the US is a country devoid of coherent state regulated free-market principles - its place as the singular world power decimated by destroying the very systems which made it great.
"Our merchants and master-manufacturers complain much of the bad effects of high wages in raising the price, and thereby lessening the sale of their goods both at home and abroad. They say nothing concerning the bad effects of high profits. They are silent with regard to the pernicious effects of their own gains. They complain only of those of other people." - Adam Smith, The Wealth of Nations
By selling the jobs of working class communities to overseas businesses, destroying unions and creating rentier based business models without significant anti-trust measures, US business and political elites have sealed the present fate of the country.
The CCP led, but strongly anti-trust enforcing, China has been able to innovate, ironically, using the free-market principles of Adam Smith to rise up and create some of the world's best innovations. The factories, opened by Western business leaders to avoid union/worker labour costs in their own countries, have led Shenzhen, and similar cities, to become hubs of technological innovation - compounding their ability to determine the future of technologies across the world.
Will America be able to regain its position on top? It's too early to say, but the innovative, talented, people who made America in the 20th century can certainly do it again.
As Franklin D. Roosevelt once said: "The liberty of a democracy is not safe if the people tolerate the growth of private power to a point where it becomes stronger than the democratic state itself...
We know now that Government by organized money is just as dangerous as Government by organized mob.
Never before in all our history have these forces been so united against one candidate as they stand today. They are unanimous in their hate for me—and I welcome their hatred.”
Until then, here's a farewell to the American Century 在那之前, 再见美国世纪
#cybernetics#cybernetic#ai#artificial intelligence#DeepSeek#OpenAI#ai technology#long reads#politics#us politics
14 notes
Note
Hi! I'm not sure if you've answered this question or not or if I missed it in your pinned post. I've been dying to mod Cyberpunk for forever, and have finally decided to give it a try. I am very intimidated by the whole ordeal because I know cyberpunk has so many spec requirements (I play on console) and was wondering if you had recommendations? I'm looking to buy a PC to start my modding and visual photography journey but don't know where to start. I've scoured reddit for recommendations but keep getting mixed signals.
I've watched you slowly create your digital portfolio for Valerie over the last couple years and have just been in utter awe of your work. I've looked up to you for a while and want to follow in your footsteps.
Thank you for your time! ☺️💖
Hey there! Thank you so much for the sweet words!
You didn't miss anything, so no worries! I don't think I've ever shared my PC specs in one place. Currently I have:
Motherboard: MSI MAG Z790 Tomahawk MAX
Processor: i7-14700K (with a Cooler Master liquid cooler, I forget the exact model)
RAM: Corsair Vengeance DDR5 64GB
GPU: GeForce RTX 3070 Ti
SSD: Samsung 860 EVO 2TB
NZXT H710i ATX tower case (I think this exact model is discontinued, but I'm a fan of NZXT cases in general--They're very roomy, have good airflow, and have good cable management features)
I've built and maintained my own PCs for about a decade now, and I remember when I first made the switch from console to PC, a lot of the conventional advice I got from more seasoned PC gamers was "Build your own rig, it's cheaper, and it's not that hard." I wasn't fully convinced, though, and I did just get a pre-built gaming PC from some random company on Amazon. If you have the money and you're really intimidated at the idea of building your own, there's nothing wrong with going this route.
Once I had my pre-built, I started with upgrading individual components one at a time. Installing a new GPU, for instance, is pretty easy and fool-proof. Installing a new CPU is a little trickier, especially with all the conflicting advice on how much thermal paste to use (I've always done the grain of rice/pea-sized method and my temperatures on multiple CPUs have always been fine). Installing a new power supply unit can be overwhelming when it comes to making sure you've plugged everything in correctly. Installing a new motherboard is not too far off from building a whole new thing.
And building/maintaining a PC is pretty easy once you get past the initial intimidation. There are so many video tutorials on YouTube to explain the basics--I think I referred to Linus Tech Tips videos back in the day (which might be cringe to suggest now, idk), but if you search "how to build a gaming PC" you'll get a ton of good results back. Also, PCPartPicker is a very helpful website for crosschecking all your desired components to make sure they'll play nicely with each other.
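As an aside, the crosschecking a site like PCPartPicker automates is conceptually simple. Here's a toy sketch of the idea; every part spec below is made up for illustration.

```python
# Toy version of a parts-compatibility crosscheck (all specs here are invented examples).
cpu   = {"name": "example i7",       "socket": "LGA1700", "tdp_w": 125}
board = {"name": "example Z790 ATX", "socket": "LGA1700", "ram_type": "DDR5"}
ram   = {"name": "example 64GB kit", "type": "DDR5"}
gpu   = {"name": "example RTX card", "power_w": 290}
psu   = {"name": "example 750W PSU", "watts": 750}

issues = []
if cpu["socket"] != board["socket"]:
    issues.append("CPU socket does not match the motherboard socket")
if ram["type"] != board["ram_type"]:
    issues.append("RAM type is not supported by the motherboard")
if cpu["tdp_w"] + gpu["power_w"] > psu["watts"] * 0.6:  # rough rule: leave ~40% PSU headroom
    issues.append("PSU may be undersized for the CPU + GPU load")

print(issues if issues else "No obvious conflicts")
```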
The other big piece of advice I'd offer on building a PC is not to drive yourself crazy reading too many reviews on components. Don't go in totally blind--Still look at Reddit, Amazon reviews, NewEgg, etc. to get an idea of the product and potential issues, but be discerning. Like if you check Amazon reviews and see a common issue mentioned in multiple reviews, take note of that, but if you see one or two complaints about something random, it's probably a fluke. Either a one-off manufacturing error or (more likely, honestly) user error.
You'll probably also see a lot of debates about Intel/NVIDIA vs AMD when it comes to processors and graphics cards--I started with Intel and NVIDIA so I've really just stuck with them out of familiarity, but I think the conventional wisdom these days is that AMD processors will give you more bang for your buck when it comes to gaming.
If you do go the NVIDIA route, I've personally always found it worth the extra money to go with a Ti model of their cards--I feel like it gives me at least another year or two without starting to really feel the GPU bottleneck. I was able to play Mass Effect Andromeda on mostly high settings with my 780 Ti in 2017, and I actually started playing Cyberpunk on my 1080 Ti in 2021--I think most of my settings were on high without any notable performance issues.
Now you probably couldn't get away with that post-Phantom Liberty/update 2.0 since the game did get a lot more demanding with those updates. However, my biggest piece of advice to anyone who wants to get into PC gaming with a heavy emphasis on virtual photography is that you do not need the absolute top-of-the-line hardware to take good shots. For Cyberpunk, I think shooting for a build that lands somewhere along the lines of the minimum-to-recommended ray-tracing requirements will do you just fine.
I don't remember all my current game settings off the top of my head, but I can tell you that I have never bothered with path-tracing, my ray-tracing settings range from medium to high, and I don't natively run the game at 4K. I do hotsample to 4K when I do VP, and I do notice a difference between a 1080 and a 4K shot, but I personally don't feel like being able to constantly run it at 4K is necessary for me right now since I still only have a 1080p monitor. If I'm going to be shooting in Dogtown, which is very demanding, I'll also cap my FPS to 30 for a little extra stability.
(Also, and hopefully this doesn't muddy the waters too much, but I feel like it's worth pointing out that you could have the absolute best of the best hardware and still run into crashes and glitches for random shit that might require advanced troubleshooting--My husband had a better build than I did when he started playing CP77, but he kept running into crashes because of some weird audio driver issue that had to do with his sound system. I just recently upgraded my CPU, RAM, and motherboard, and I was going nuts over the winter because my game somehow became less stable. It turned out the main culprit was that Windows 11 has shitty Bluetooth settings.)
But in my opinion, I think getting good shots is less about hardware and more about 1) learning to use the tools available to you (e.g. in-game lighting tools, Reshade, and post-editing in programs like Lightroom or even free apps like Snapseed) and 2) learning the basics of real-life photography (or visual art in general), particularly when it comes to lighting, color, and composition.
I don't rely on Reshade too much because I try to minimize the amount of menus I have to futz with in-game, but I do think DOF and/or long exposure shaders are excellent for getting cleaner shots. I also like ambient fog shaders to help create more cohesive color in a shot. However, I put most of my focus on lighting and post-editing. I did talk a little bit about my methods for both in this post--It is from 2023 and my style has evolved some since then (like I mention desaturating greens in Lightroom, but I've actually been loving bold green lately and I've been cranking that shit up), but I think it still has some useful advice for anyone starting out.
For a more recent comparison of how much my Lightroom and Photoshop work affects the final product, here is a recent shot I took of Goro.
The left image is the raw shot out of the game--It has some Reshade effects (most notably the IGCS DOF), and I manually set the lighting for this scene. To do this, I set the time in-game to give me a golden hour effect (usually early morning or early evening depending on your location) so the base was very warm and orange, then I dropped the exposure and essentially "rebuilt" the lighting with AMM and CharLi lights to make Goro pop and add some more color, notably green and blue, into the scene.
And the right image is that same shot but after I did some color correcting/enhancement, sharpening, etc. in Lightroom and clip-editing and texture work in Photoshop.
Okay, this was long as hell so I'm gonna end it here, haha. If you have any more questions about anything specific here, feel free to ask! I know it can be really overwhelming and I threw a lot at ya. <333
7 notes
Text
so it turns out some GPU and motherboard manufacturers will just natively overclock them so that they're just slightly overclocked automatically instead of at default stability and you will keep trying to troubleshoot why you're getting random blue screens and game crashes for months until you figure out you can just search 'BOOST' on your fucking BIOS and turn off a bunch of shit and suddenly everything works. this is purely a hypothetical ha ha scenario how's everyone doing
27 notes
Text
The US restricts exports of advanced AI chips to China
The US Department of Commerce has ordered Taiwan Semiconductor Manufacturing Company (TSMC), the world's largest chipmaker, to suspend shipments of advanced artificial intelligence (AI) chips to Chinese customers. The restriction applies to chips made with 7-nanometre and more advanced process technology that can be used in AI accelerators and GPUs.
Behind the case is the trade war that has been going on for years between China and the United States. As part of it, Huawei, then the leading smartphone and network equipment manufacturer, was barred in 2019 from using any US-registered licences, and companies placed on the blacklist then or since can only maintain commercial relationships involving US-registered licences with special authorisation. In line with this, TSMC recently notified the American authorities that one of its chips had been found in Huawei's Ascend 910B AI processor, and that a TSMC customer, the Chinese chip designer Sophgo, had violated the export restrictions.
The US is acting ever more strictly against technology exports to China. Earlier this month, chipmaker GlobalFoundries was fined $500,000 for shipping to China's SMIC without a licence. American politicians are demanding further tightening and have expressed concern about the weakness of the export restrictions. The Biden administration is planning additional restrictions, but amending the rules has been delayed.
China, meanwhile, is preparing for further American sanctions. It is buying up chipmaking equipment in bulk, forging new alliances, and recruiting specialists. It intends to cooperate with countries that disagree with US policy and is striving to become self-sufficient in chip manufacturing. Even so, restricting exports of advanced AI chips could be a serious blow to China's technological ambitions and could further escalate tensions between the two countries.
Source: Reuters
11 notes
Note
Computer q. For otherwise identical monitors, is a 4000:1 contrast ratio noticeably better than 1000:1? I don't mean for fancy art, but like if I'm watching a movie, could I see the difference in a dark scene? I looked into OLEDs, but those are expensive and I think the way I use my stuff would cause burn-in.
I hope you don't mind, but I got carried away and answered pretty much every computer monitor question anyone has ever had. And since this turned into a whole thing, I thought I'd share it for everyone to benefit.
For a computer monitor I would say the most important aspect is actually the viewing angle. This is how far off-axis you can look at the monitor before the image degrades.
We sit very close to our displays and at that distance, even a change in height in your chair can affect the image. Move a little bit left or right and a cheap display could completely wash out and look terrible. And if you get a display that is 27" or above, even if you sit dead center, the edges of the screen will appear dark and washed out with a bad viewing angle.
The two best display technologies to get a good viewing angle are IPS (in-plane switching) and OLED. If you are interested in a display without these technologies, be sure it has a decent viewing angle. You can read more about viewing angles here and here.
IPS has very little concern for burn-in, but it is still a concern with OLED. In recent years OLED has greatly improved and image retention and burn-in can be avoided with regular maintenance. Displays will have pixel shift features and noise modes that work out all the pixels evenly. You can run these features every once in a while to prevent burn-in. You can also play special anti-burn-in videos on YouTube (full screen) to exercise the pixels to uniformity.
So if you don't mind the hassle, you can manage an OLED with low risk.
That said, OLED was almost exclusively for TVs and has only recently been introduced for computer displays. The current options are quite large and fairly expensive, as you alluded to. So if you are trying to stay within a budget, it might be best to seek out an IPS display.
Another consideration is resolution. Everyone is obsessed with everything being 4K now. But I think increasing the resolution brings diminishing returns with regard to increased detail you can actually notice. So if you don't mind going with a 1440p monitor (about 2.5K), you can save some money on resolution and get higher quality in more noticeable areas. Personally, I feel 1440p gives you a nice, noticeable bump in detail over 1080p. Whereas going from 1440p to 4K (2160p) is less noticeable unless you have very good vision.
Another benefit to 1440p is that video games are much easier to run on high quality settings with a reasonable GPU. And you can use technologies like super sampling (Nvidia calls this DLSS) to increase the detail you may lose from not going 4K.
The only concern I'd have with not going 4K is if you edit 4K video. It will be difficult to do a pixel level analysis of your footage otherwise. But other than that, you can still watch 4K content on a 1440p monitor and because it is being downsampled, you will still notice a nice bump in detail.
So if you don't have a reason to get a 4K display, I think 1440p is worth considering.
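If you want to see why I say the returns diminish, here's a quick sketch using basic viewing geometry. The screen size, viewing distance, and the ~60 pixels-per-degree rule of thumb (roughly 20/20 vision) are assumptions you can swap out.

```python
import math

# Angular pixel density: how many pixels fit in one degree of vision at a given distance.
# Rule of thumb: past ~60 pixels per degree (about 20/20 vision), extra resolution is hard to see.
def pixels_per_degree(horizontal_px: int, screen_width_in: float, distance_in: float) -> float:
    px_per_inch = horizontal_px / screen_width_in
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return px_per_inch * inches_per_degree

WIDTH_27_INCH_16x9 = 23.5  # approximate panel width of a 27" 16:9 monitor, in inches
for name, px in [("1080p", 1920), ("1440p", 2560), ("4K", 3840)]:
    ppd = pixels_per_degree(px, WIDTH_27_INCH_16x9, distance_in=28)  # ~28" viewing distance
    print(f"{name}: ~{ppd:.0f} pixels per degree")
```

At a normal desk distance the jump from 1080p to 1440p lands you near that threshold, while 4K overshoots it, which is why the last step is harder to notice.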
The next concern would be color. Or color gamut. This is how many colors the display can accurately reproduce. If you don't do any art or video color grading, you'll at least want something that does 95 to 100% of sRGB. That is the color space the entire internet uses. And if you are going to be watching HDR movies, you might want a display with a decent percentage of the P3 color space as well. Doesn't need to be 100%, but the higher the better. And for those who do art, a good percentage of Adobe RGB is recommended.
Also, many manufacturers offer displays that come pre-calibrated from the factory. If color accuracy is important, I would seek out one of these displays with a Delta E rating of 3 or less (lower is better).
A newer factor in displays is peak brightness. This is measured in "nits." In standard dynamic range (SDR), video only needed to reach 100 nits. Most HDR content is mastered to reach 1000 nits. In the future, that number will be 4000. And if micro LED technology ever becomes affordable, we may go up to 10,000 nits. But almost everything is around 1000 at the moment, so that is a good number to shoot for.
HOWEVER, because HDR is tone mapped (the brightness of your display is factored in and the content is adjusted accordingly), you can still get some benefits of HDR, even if you cannot do the full 1000 nits.
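Here's a rough illustration of what that tone mapping does. This is just a generic soft roll-off curve, not any display vendor's actual algorithm, and the knee point is an assumption.

```python
# Simplified HDR tone mapping: content mastered for 1000 nits shown on a dimmer display.
# Below the "knee" the image passes through untouched; highlights above it are rolled off
# smoothly into the remaining headroom instead of clipping. Not a real vendor curve.
def tone_map(scene_nits: float, display_peak: float, knee: float = 0.75) -> float:
    knee_nits = display_peak * knee
    if scene_nits <= knee_nits:
        return scene_nits  # shadows and midtones are untouched
    headroom = display_peak - knee_nits
    excess = scene_nits - knee_nits
    return knee_nits + headroom * (1 - 1 / (1 + excess / headroom))  # soft roll-off

for nits in (100, 450, 1000):
    print(f"{nits:>4}-nit highlight -> {tone_map(nits, display_peak=600):.0f} nits on an HDR600 panel")
```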
All monitors can do 100 nits for SDR content. But with more things being displayed in HDR, having more nits will give you a better experience. This does not mean your display will blind you. Usually bright stuff only takes up a small portion of the screen. But having more nits allows highlights to really pop and feel immersive. A lightsaber might actually feel hot and dangerous on a bright enough screen.
Computer displays are often rated as HDR400 or HDR600 or HDR1000 based on their nits. The HDR400 isn't great for HDR content. If you can do 600 or above within your budget, you'll get a better experience. If you are going to watch movies, this may be a feature you prioritize.
I know you mentioned contrast ratio, but I'm afraid that is a little complicated to answer. It can depend on other aspects of the monitor and the viewing environment. So I'll try to give you the info you need to figure out if the display you select will suit your needs.
Manufacturers can use tricks to fudge their contrast ratio in product descriptions, so it is best to go to an independent review website like RTINGS to see what they measured. (They do good TV and monitor reviews too.) You'll see that OLED displays are said to have "infinite" contrast ratio, due to being able to turn off pixels completely. Which means it is probably time to move to a new metric because that gives very little info on the dynamic range of the display (the difference between the darkest and brightest thing it can show).
You definitely want a decent contrast ratio for your display, but this can be subjective. If you have a nice bright screen, your brain may feel the contrast is fantastic, even if the actual darkest black point of the monitor isn't great. If something is really bright, then dark things will *seem* darker by comparison. And if you are viewing in a dark environment, the contrast will look even better. So this is where seeking out a professional reviewer's experience of the monitor can be helpful. One monitor's 4000:1 ratio might be a different experience than another with the same measurement.
Because TVs are generally larger and can have more backlighting zones, they can get decent black levels without OLED. But smaller computer displays have more difficulty in reasonable price ranges. So manage your black level expectations if you go with an affordable IPS display. They can get bright, but they aren't great at blacks like OLED. I'm afraid that is just a limitation of the tech. In fact, getting a brighter display might be preferable to a better contrast ratio. And it will be easier to see if you are in a bright environment.
Most IPS displays are going to be between 1000:1 and 5000:1 and while it does make a difference, if you sit it next to an old plasma or an OLED, you're going to be disappointed. So I would not make contrast ratio a super high priority with IPS, because non-OLED computer displays just aren't going to give you inky blacks. I would say 2000:1 or better is going to give you a decent experience. But, again, I would seek out reviews rather than trust the official product specs when it comes to the quality of the blacks.
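One way to make those ratios concrete: the black level a panel can reach is just its brightness divided by its contrast ratio. A quick sketch, with assumed brightness values:

```python
# Black level implied by a contrast ratio: black_nits = white_nits / contrast_ratio.
# The brightness figures are assumptions for illustration; check measured reviews for real numbers.
panels = [
    ("IPS, 1000:1", 350, 1000),
    ("IPS, 4000:1", 350, 4000),
    ("OLED",        600, float("inf")),  # pixels can switch fully off
]
for name, white_nits, ratio in panels:
    black_nits = white_nits / ratio
    print(f"{name:<12} black level ≈ {black_nits:.3f} nits")
```

So a 4000:1 panel does sit meaningfully lower than a 1000:1 one in a dark room, but neither is anywhere near a pixel that simply turns off.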
And one final consideration you may want to factor in is the refresh rate. This is mostly for gaming. Most displays will give you at least 60 Hz or 60 "refreshes" per second. Gamers tend to like 120 Hz or higher. This won't affect movie watching very much as nearly everything except Gemini Man is 24 fps.
TLDR overview...
Get an IPS or OLED display for a good viewing angle. I personally feel this is the most important feature.
Choose a resolution. 1440p can allow you to increase quality in other areas to maximize your budget. Only get 4K if you have a legit reason or you have fighter pilot vision.
Color gamut or number of colors. Try to get 100% of sRGB for web content, 90% or above of Adobe RGB for art/photography, and 90% or above of P3 for HDR movies and video editing.
If color accuracy is important, look for pre-calibrated displays that have a Delta E of 3 or less. (Lower is better)
HDR brightness. If you want to experience good HDR, you'll want the brightest screen possible (measured in nits). HDR600 or HDR1000 are great. If you don't care about HDR, then don't worry about the rating.
Contrast ratio and black levels. It's going to be meh on pretty much anything but OLED. 2000:1 or better is a good goal to shoot for, but be sure to check independent reviews for the subjective experience of the black levels. Dark viewing environments help too.
Refresh rate. 60 Hz is fine for most things. Gamers prefer 120 Hz or faster. And if you are a competitive gamer, you may want to seek out more info on "variable refresh rate" and "pixel response time."
Pick the variables above that seem most important to you and then seek out a display that does those things decently within your budget.
78 notes
Text
GPU manufacturer HIS is still stuck in 2005
http://www.hisdigital.com/hisxp/member/id/passport/login
5 notes
Text
funny thing with retro PC hardware is how the further back in history you go, the less you can really expect the mainboard to do for you.
you take a modern mainboard and it'll likely have most functions and features you're likely to need already integrated by default, be it sound, network, WiFi... there's usually even going to be video out from whatever barebones GPU is very likely integrated into the CPU by default, as well as a plethora of USB ports for whatever peripherals or other devices you might possibly want. It's basically almost a complete system in and of itself - just add a CPU, RAM, and some kind of storage medium and off you go. Plenty of boards of today will even have built-in support for plugging in fancy chassis RGB lighting straight into the mainboard itself.
Not so with older mainboards - the one I'm looking at using for my retro build project supports basically the typical two channels of IDE/Parallel ATA for a total of four main drives of whatever combination of hard- and optical, a single floppy drive, two PS/2 ports, one keyboard one mouse, a parallel LPT port, a few serial COM ports, an old AT DIN-5 keyboard port, and - shockingly - two USB ports that I'm guessing are ancient 1.0 standard. And that's it. There's no sound, no graphics, no networking - that's all stuff you have to add via expansion cards. You basically cannot use this computer at all without adding at least a graphics card - the Power On Self Test (or POST) will fail and straight up refuse to boot the system if no graphics card is detected. You go back far enough in history to the original IBM PC and it won't even have integrated hard drive support, necessitating an expansion card just to add fixed storage space.
And this is basically why the PC is such an inherently flexible platform - it was and is built pretty much grounds up to be extensible, providing the option to add just about whatever functions and features you might require via expansion slots built on open standards, allowing pretty much anyone with the prerequisite know-how and manufacturing capabilities to build their own. With the relative ease and low cost of circuit board manufacture of today combined with the ready access to powerful microcontrollers like the Raspberry Pi Pico, there's a good number of hobbyists making expansion cards that can more or less be programmed to do pretty much whatever.
Though this is technically still possible to do on modern PCs, the relative speed and complexity involved with modern PCI Express interfaces makes it far less accessible than making your own ISA expansion cards.
8 notes
Text
For a group of folks that bitch about American hegemony, I don't see the Euros making their own tech shit much. Like sure they have a vibrant electronics industry and native foundries, but like you can't run current year on ST micros and you aren't gonna emulate rayman on CPLDs.
Euros do have ARM at least in some fashion (despite relying on Chinese foundries for that iirc), but y'all should get your own GPU manufacturer. East Germans and Russians were excellent at doing the tech-dance, did y'all squander their abilities? Give some potato techbros a supply of cheap vodka and draniki and you'll have a 4090 clone in six months.
4 notes