# Global Artificial Intelligence Chip Market
The Global Artificial Intelligence Chip Market was valued at USD 15.23 billion in 2022 and is estimated to grow at a CAGR of around 37.89% during the forecast period (2023-28). Demand for AI chips has grown significantly, driven by rising demand for AI-enabled devices across industries, advances in machine learning, data-intensive applications, and the rise of edge computing. In addition, the shift toward Industry 4.0 is accelerating the adoption of AI and the proliferation of IoT in verticals such as healthcare, finance, automotive, manufacturing, telecommunications, and aerospace, which is playing a major role in driving market growth.
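As a rough sanity check, the implied end-of-forecast market size follows from compounding the 2022 base at the stated CAGR (a sketch only; the report's own 2028 figure may differ depending on its base year and rounding):

```python
def project_market_size(base_value_bn: float, cagr: float, years: int) -> float:
    """Compound a base-year market size forward at a constant CAGR."""
    return base_value_bn * (1 + cagr) ** years

# USD 15.23B in 2022 compounded at 37.89% per year for six years (2022 -> 2028)
projected = project_market_size(15.23, 0.3789, 6)
print(f"Implied 2028 market size: ~USD {projected:.1f} billion")
```

At that growth rate the market roughly septuples, to on the order of USD 105 billion by 2028.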
There’s little doubt that the American government has decided to slow China’s economic rise, most notably in the fields of technological development. To be sure, the Biden administration denies that these are its goals. Janet Yellen said on April 20, “China’s economic growth need not be incompatible with U.S. economic leadership. The United States remains the most dynamic and prosperous economy in the world. We have no reason to fear healthy economic competition with any country.” And Jake Sullivan said on April 27, “Our export controls will remain narrowly focused on technology that could tilt the military balance. We are simply ensuring that U.S. and allied technology is not used against us.”
Yet, in its deeds, the Biden administration has shown that its vision extends beyond those modest goals. It has not reversed the trade tariffs Donald Trump imposed in 2018 on China, even though presidential candidate Joe Biden criticized them in July 2019, saying: “President Trump may think he’s being tough on China. All that he’s delivered as a consequence of that is American farmers, manufacturers and consumers losing and paying more.” Instead, the Biden administration has tried to increase the pressure on China by banning the export of chips, semiconductor equipment, and selected software.
It has also persuaded its allies, like the Netherlands and Japan, to follow suit. More recently, on Aug. 9, the Biden administration issued an executive order prohibiting American investments in China involving “sensitive technologies and products in the semiconductors and microelectronics, quantum information technologies, and artificial intelligence sectors” which “pose a particularly acute national security threat because of their potential to significantly advance the military, intelligence, surveillance, or cyber-enabled capabilities” of China.
All these actions confirm that the American government is trying to stop China’s growth. Yet, the big question is whether America can succeed in this campaign—and the answer is probably not. Fortunately, it is not too late for the United States to reorient its China policy toward an approach that would better serve Americans—and the rest of the world.[...]
Since the creation of the People’s Republic of China in 1949, several efforts have been made to limit China’s access to or stop its development in various critical technologies, including nuclear weapons, space, satellite communication, GPS, semiconductors, supercomputers, and artificial intelligence. The United States has also tried to curb China’s market dominance in 5G, commercial drones, and electric vehicles (EVs). Throughout history, unilateral or extraterritorial enforcement efforts to curtail China’s technological rise have failed and, in the current context, are creating irreparable damage to long-standing U.S. geopolitical partnerships. In 1993 the Clinton administration tried to restrict China’s access to satellite technology. Today, China has some 540 satellites in space and is launching a competitor to Starlink.
When America restricted China’s access to its geospatial data system in 1999, China simply built its own parallel BeiDou Global Navigation Satellite System (GNSS) in one of the first waves of major technological decoupling. By some measures, BeiDou is today better than GPS. It is the largest GNSS in the world, with 45 satellites to GPS’s 31, and is thus able to provide more signals in most global capitals. It is supported by 120 ground stations, resulting in greater accuracy, and has more advanced signal features, such as two-way messaging[...]
American measures to deprive China of access to the most advanced chips could even damage America’s large chip-making companies more than they hurt China. China is the largest consumer of semiconductors in the world. Over the past ten years, China has been importing massive amounts of chips from American companies. According to the US Chamber of Commerce, China-based firms imported $70.5 billion worth of semiconductors from American firms in 2019, representing approximately 37 percent of these companies’ global sales. Some American companies, like Qorvo, Texas Instruments, and Broadcom, derive about half of their revenues from China. Sixty percent of Qualcomm’s revenues, a quarter of Intel’s revenues, and a fifth of Nvidia’s sales come from the Chinese market. It’s no wonder that the CEOs of these three companies recently went to Washington to warn that U.S. industry leadership could be harmed by the export controls. American firms will also be hurt by retaliatory actions from China, such as China’s May ban on chips from US-based Micron Technology. China accounts for over 25 percent of Micron’s sales.[...]
The U.S. Semiconductor Industry Association released a statement on July 17, saying that Washington’s repeated steps “to impose overly broad, ambiguous, and at times unilateral restrictions risk diminishing the U.S. semiconductor industry’s competitiveness, disrupting supply chains, causing significant market uncertainty, and prompting continued escalatory retaliation by China,” and called on the Biden administration not to implement further restrictions without more extensive engagement with semiconductor industry representatives and experts.
The Chips Act cannot subsidize the American semiconductor industry indefinitely, and there is no other global demand base to replace China. Other chip-producing nations will inevitably break ranks and sell to China (as they have historically), and the American actions will be for naught. And, in banning the export of chips and other core inputs to China, America handed China its war plan years ahead of the battle. China is being goaded into building self-sufficiency far earlier than it would have otherwise. Prior to the ZTE and Huawei component bans, China was content to continue purchasing American chips and focusing on front-end hardware. Peter Wennink, the CEO of ASML, stated that China is already leading in key applications and demand for semiconductors. Wennink wrote, “The roll-out of the telecommunication infrastructure, battery technology, that’s the sweet spot of mid-critical and mature semiconductors, and that’s where China without any exception is leading.”[...]
Former State Department official Susan Thornton, who oversaw the study as director of the Forum on Asia-Pacific Security at NCAFP, said: “This audit of U.S.-China diplomacy shows that we can make progress through negotiations and that China follows through on its commitments. The notion that engagement with China did not benefit the U.S. is just not accurate.”[...]
One fundamental problem is that domestic politics in America are forcing American policymakers to take strident stands against China instead of pragmatic positions. For instance, sanctions preventing the Chinese Defense Minister, Li Shangfu, from traveling to the United States are standing in the way of U.S.-China defense dialogues to prevent military accidents.
19 Sep 23
Semiconductors: The Driving Force Behind Technological Advancements
The semiconductor industry is a crucial part of our modern society, powering everything from smartphones to supercomputers. The industry is a complex web of global interests, with multiple players vying for dominance.
Taiwan has long been the dominant player in the semiconductor industry, with Taiwan Semiconductor Manufacturing Company (TSMC) accounting for 54% of the market in 2020. TSMC's dominance is due in part to the company's expertise in semiconductor manufacturing, as well as its strategic location in Taiwan. Taiwan's proximity to China and its well-developed infrastructure make it an ideal location for semiconductor manufacturing.
However, Taiwan's dominance also brings challenges. The company faces strong competition from other semiconductor manufacturers, including those from China and South Korea. In addition, Taiwan's semiconductor industry is heavily dependent on imports, which can make it vulnerable to supply chain disruptions.
China is rapidly expanding its presence in the semiconductor industry, with the government investing heavily in research and development (R&D) and manufacturing. China's semiconductor industry is led by companies such as SMIC and Tsinghua Unigroup, which are rapidly expanding their capacity. However, China's industry still lags behind Taiwan's in terms of expertise and capacity.
South Korea is another major player in the semiconductor industry, with companies like Samsung and SK Hynix owning a significant market share. South Korea's semiconductor industry is known for its expertise in memory chips such as DRAM and NAND flash. However, the industry is heavily dependent on imports, which can make it vulnerable to supply chain disruptions.
The semiconductor industry is experiencing significant trends, including the growth of the Internet of Things (IoT), the rise of artificial intelligence (AI), and the increasing demand for 5G technology. These trends are driving semiconductor demand, which is expected to continue to grow in the coming years.
However, the industry also faces major challenges, including a shortage of skilled workers, the increasing complexity of semiconductor manufacturing and the need for more sustainable and environmentally friendly manufacturing processes.
To overcome the challenges facing the industry, it is essential to invest in research and development, increase the availability of skilled workers and develop more sustainable and environmentally friendly manufacturing processes. By working together, governments, companies and individuals can ensure that the semiconductor industry remains competitive and sustainable, and continues to drive innovation and economic growth in the years to come.
Chip War, the Race for Semiconductor Supremacy (2023) (TaiwanPlus Docs, October 2024)
Dr. Keyu Jin, a tenured professor of economics at the London School of Economics and Political Science, argues that many in the West misunderstand China’s economic and political models. She maintains that China became the most successful economic story of our time by shifting from primarily state-owned enterprises to an economy more focused on entrepreneurship and participation in the global economy.
Dr. Keyu Jin: Understanding a Global Superpower - Another Look at the Chinese Economy (Wheeler Institute for Economy, October 2024)
Dr. Keyu Jin: China's Economic Prospects and Global Impact (Global Institute For Tomorrow, July 2024)
The following conversation highlights the complexity and nuance of Xi Jinping's ideology and its relationship to traditional Chinese thought, and emphasizes the importance of understanding the internal dynamics of the Chinese Communist Party and the ongoing debates within the Chinese system.
Dr. Kevin Rudd: On Xi Jinping - How Xi's Marxist Nationalism Is Shaping China and the World (Asia Society, October 2024)
Tuesday, October 29, 2024
Some 50 miles southwest of Taipei, Taiwan’s capital, and strategically located close to a cluster of the island’s top universities, the 3,500-acre Hsinchu Science Park is globally celebrated as the incubator of Taiwan’s most successful technology companies. It opened in 1980, the government having acquired the land and cleared the rice fields, with the aim of creating a technology hub that would combine advanced research and industrial production.
Today Taiwan’s science parks house more than 1,100 companies, employ 321,000 people, and generate $127 billion in annual revenue. Along the way, Hsinchu Science Park’s Industrial Technology Research Institute has given birth to startups that have grown into world leaders. One of them, the Taiwan Semiconductor Manufacturing Company (TSMC), produces at least 90 percent of the world’s most advanced computer chips. Collectively, Taiwan’s companies hold a 68 percent market share of all global chip production.
It is a spectacular success. But it has also created a problem that could threaten the future prosperity of both the sector and the island. As the age of energy-hungry artificial intelligence dawns, Taiwan is facing a multifaceted energy crisis: It depends heavily on imported fossil fuels, it has ambitious clean energy targets that it is failing to meet, and it can barely keep up with current demand. Addressing this problem, government critics say, is growing increasingly urgent.
Taiwan’s more than 23 million people consume nearly as much energy per capita as US consumers, but the lion’s share of that consumption—56 percent—goes to Taiwan’s industrial sector for companies like TSMC. In fact, TSMC alone uses around 9 percent of Taiwan’s electricity. One estimate by Greenpeace has suggested that by 2030 Taiwan’s semiconductor manufacturing industry will consume twice as much electricity as did the whole of New Zealand in 2021. The bulk of that enormous energy demand, about 82 percent, the report suggests, will come from TSMC.
Taiwan’s government is banking on the continuing success of its technology sector and wants the island to be a leader in AI. But just one small data center, the Vantage 16-megawatt data center in Taipei, is expected to require as much energy as some 13,000 households. Nicholas Chen, a lawyer who analyzes Taiwan’s climate and energy policies, warns that the collision of Taiwan’s commitments to the clean energy transition and its position in global supply chains as a key partner of multinational companies that have made commitments to net-zero deadlines—along with the explosive growth in demand—has all the makings of a crisis.
“In order to plan and operate AI data centers, an adequate supply of stable, zero-carbon energy is a precondition,” he said. “AI data centers cannot exist without sufficient green energy. Taiwan is the only government talking about AI data center rollout without regard to the lack of green energy.”
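The data-center household comparison above is easy to reproduce: a 16-megawatt facility running around the clock implies an annual energy budget that, divided across 13,000 households, comes out near a typical household's yearly consumption. A back-of-envelope sketch (real data centers run below full load, so the `load_factor` parameter is an assumption):

```python
HOURS_PER_YEAR = 8760

def datacenter_annual_mwh(capacity_mw: float, load_factor: float = 1.0) -> float:
    """Annual energy use of a data center at the given average utilization."""
    return capacity_mw * HOURS_PER_YEAR * load_factor

annual = datacenter_annual_mwh(16)   # ~140,160 MWh/year at full load
per_household = annual / 13_000      # implied per-household annual consumption
print(f"{annual:,.0f} MWh/year, ~{per_household:.1f} MWh per household")
```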
It is not just a case of building more capacity. Taiwan’s energy dilemma is a combination of national security, climate, and political challenges. The island depends on imported fossil fuel for around 90 percent of its energy and lives under the growing threat of blockade, quarantine, or invasion from China. In addition, for political reasons, the government has pledged to close its nuclear sector by 2025.
Taiwan regularly attends UN climate meetings, though never as a participant. Excluded at China’s insistence from membership in the United Nations, Taiwan asserts its presence on the margins, convening side events and adopting the Paris Agreement targets of peak emissions before 2030 and achieving net zero by 2050. Its major companies, TSMC included, have signed up to RE100, a corporate renewable-energy initiative, and pledged to achieve net-zero production. But right now, there is a wide gap between aspiration and performance.
Angelica Oung, a journalist and founder of the Clean Energy Transition Alliance, a nonprofit that advocates for a rapid energy transition, has studied Taiwan’s energy sector for years. When we met in a restaurant in Taipei, she cheerfully ordered an implausibly large number of dishes that crowded onto the small table as we talked. Oung described two major blackouts—one in 2021 that affected TSMC and 6.2 million households for five hours, and one in 2022 that affected 5.5 million households. It is a sign, she says, of an energy system running perilously close to the edge.
Nicholas Chen argues that the government is failing to keep up even with existing demand. “In the past eight years there have been four major power outages,” he said, and “brownouts are commonplace.”
The operating margin on the grid—the buffer between supply and demand—ought to be 25 percent in a secure system. In Taiwan, Oung explained, there have been several occasions this year when the margin was down to 5 percent. “It shows that the system is fragile,” she said.
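The margin Oung describes is simply spare capacity relative to demand; a minimal sketch of the arithmetic (function and variable names are illustrative):

```python
def operating_margin(available_capacity_mw: float, peak_demand_mw: float) -> float:
    """Operating reserve margin: spare generating capacity as a fraction of demand."""
    return (available_capacity_mw - peak_demand_mw) / peak_demand_mw

# A secure system keeps ~25% headroom; Taiwan has at times dipped to ~5%
print(f"{operating_margin(125.0, 100.0):.0%}")  # healthy system
print(f"{operating_margin(105.0, 100.0):.0%}")  # running close to the edge
```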
Taiwan’s current energy mix illustrates the scale of the challenge: Last year, Taiwan’s power sector was 83 percent dependent on fossil fuel: Coal accounted for around 42 percent of generation, natural gas 40 percent, and oil 1 percent. Nuclear supplied 6 percent, and solar, wind, hydro, and biomass together nearly 10 percent, according to the Ministry of Economic Affairs.
Taiwan’s fossil fuels are imported by sea, which leaves the island at the mercy both of international price fluctuations and potential blockade by China. The government has sought to shield consumers from rising global prices, but that has resulted in growing debt for the Taiwan Electric Power Company (Taipower), the national provider. In the event of a naval blockade by China, Taiwan could count on about six weeks’ reserves of coal but not much more than a week of liquefied natural gas (LNG). Given that LNG supplies more than a third of electricity generation, the impact would be severe.
The government has announced ambitious energy targets. The 2050 net-zero road map released by Taiwan’s National Development Council in 2022 promised to shut down its nuclear sector by 2025. By the same year, the share of coal would have to come down to 30 percent, gas would have to rise to 50 percent, and renewables would have to leap to 20 percent. None of those targets is on track.
Progress on renewables has been slow for a number of reasons, according to Oung. “The problem with solar in Taiwan is that we don’t have a big area. We have the same population as Australia and use the same amount of electricity, but we are only half the size of Tasmania, and 79 percent of Taiwan is mountainous, so land acquisition is difficult.” Rooftop solar is expensive, and roof space is sometimes needed for other things, such as helicopter pads, public utilities, or water tanks.
According to Peter Kurz, a consultant to the technology sector and a long-term resident of Taiwan, there is one renewable resource that the nation has in abundance. “The Taiwan Strait has a huge wind resource,” he said. “It is the most wind power anywhere in the world available close to a population.”
Offshore wind is under development, but the government is criticized for imposing burdensome requirements to use Taiwanese products and workers that the country is not well equipped to meet. They reflect the government’s ambition to build a native industry at the same time as addressing its energy problem. But critics point out that Taiwan lacks the specialist industrial skills that producing turbines demands, and the requirements lead to higher costs and delays.
Despite the attraction of Taiwan’s west coast with its relatively shallow waters, there are other constraints, such as limited harbor space. There is also another concern that is unique to Taiwan’s geography: The west side of the island faces China, and there are continuing incursions into Taiwan’s territorial waters from China’s coast guard and navy vessels. Offshore wind turbines are within easy rocket and missile range from China, and undersea energy cables are highly vulnerable.
Government critics regard one current policy as needless self-harm: the pledge to shut down Taiwan’s remaining nuclear reactor by next year and achieve a “nuclear free homeland.” It is a pledge made by the current ruling party, the Democratic Progressive Party (DPP), and as the deadline approaches, it is a policy increasingly being questioned. Taiwan’s civil nuclear program was started under the military dictatorship of Chiang Kai-shek’s KMT party with half an eye on developing a nuclear weapons program. Taiwan built its first experimental facility in the 1950s and opened its first power plant in 1978. The DPP came into existence in 1986, the year of the Chernobyl disaster, and its decision to adopt a no-nuclear policy was reinforced by the Fukushima disaster in neighboring Japan in 2011.
“I think the DPP see nuclear energy as a symbol of authoritarianism,” said Oung, “so they oppose it.”
Of Taiwan’s six nuclear reactors, three are now shut down, two have not been brought online, and the one functioning unit is due to close next year. The shuttered reactors have not yet been decommissioned, possibly because, in addition to its other difficulties, Taiwan has run out of waste storage capacity: The fuel rods remain in place because there is nowhere else to put them. As some observers see it, politics have got in the way of common sense: In 2018, a majority opposed the nuclear shutdown in a referendum, but the government continues to insist that its policy will not change. Voters added to the confusion in 2021 when they opposed the completion of the two uncommissioned plants.
On the 13th floor of the Ministry of Economic Affairs in Taipei, the deputy director general of Taiwan’s energy administration, Stephen Wu, chose his words carefully. “There is a debate going on in our parliament,” he said, “because the public has demanded a reduction of nuclear power and also a reduction in carbon emissions. So there is some discussion about whether the [shuttered] nuclear plants will somehow function again when conditions are ready.”
Wu acknowledged that Taiwan was nudging against the limits of its current supply and that new entrants to Taiwan’s science and technology parks have to be carefully screened for their energy needs. But he took an optimistic view of Taiwan’s capacity to sustain AI development. “We assess energy consumption of companies to ensure the development of these companies complies with environmental protection,” he said. “In Singapore, data centers are highly efficient. We will learn from Singapore.”
Critics of the government’s energy policy are not reassured. Chen has an alarming message: If Taiwan does not radically accelerate its clean energy development, he warns, companies will be obliged to leave the island. They will seek zero-carbon operating environments to comply with the net-zero requirements of partners such as Amazon, Meta, and Google, and to avoid carbon-based trade barriers such as the European Union’s Carbon Border Adjustment Mechanism.
“Wind and solar are not scalable sources of zero-carbon energy,” he said. “Nuclear energy is the only scalable, zero-carbon source of energy. But the current laws state that foreign investment in nuclear energy must be capped at 50 percent, with the remaining 50 percent owned by Taipower. Given that Taipower is broke, how could a private investor want to partner with them and invest in Taiwan?”
Chen argues that Taiwan should encourage private nuclear development and avoid the burdensome regulation that, he says, is hampering wind development.
For Kurz, Taiwan’s energy security dilemma requires an imaginative leap. “Cables [carrying offshore wind energy] are vulnerable but replaceable,” he says. “Centralized nuclear is vulnerable to other risks, such as earthquakes.” One solution, he believes, lies in small modular nuclear reactors that could even be moored offshore and linked with undersea cables. It is a solution that he believes Taiwan’s ruling party might come around to.
There is a further security question to add to Taiwan’s complex challenges. The island’s circumstances are unique: It is a functioning democracy, a technological powerhouse, and a de facto independent country that China regards as a breakaway province to be recovered—if necessary, by force. The fact that its technology industry is essential for global production of everything from electric vehicles to ballistic missiles has counted as a security plus for Taiwan in its increasingly tense standoff with China. It is not in the interest of China or the United States to see semiconductor manufacturers damaged or destroyed. Such companies, in security jargon, are collectively labelled Taiwan’s “silicon shield,” a shield the government is keen to maintain. That the sector depends inescapably on Taiwan’s energy security renders the search for a solution all the more urgent.
MediaTek Dimensity 9300+: Experience Next-Level Performance
MediaTek Dimensity 9300+
All-Big-Core Processor: Superior Performance. With the MediaTek Dimensity 9300+, Arm Cortex-X4 speeds are pushed to unprecedented heights, setting a new standard in smartphone performance for enthusiasts and gamers.
- 1x Cortex-X4 operating at up to 3.4GHz
- 3x Cortex-X4 up to 2.85GHz
- 4x Cortex-A720 up to 2.0GHz
- 18MB L3 + SLC cache
- LPDDR5T supported at up to 9600Mbps
- UFS 4.0 with MCQ
- Manufactured on TSMC's third-generation 4nm process
- MediaTek's second-generation thermally optimised packaging design

Premier Generative AI System in MediaTek Dimensity 9300+
The MediaTek APU 790 generative AI engine enables faster and safer edge computing. The MediaTek Dimensity 9300+'s first-to-market features and extensive toolchain help developers create multimodal generative AI applications at the edge quickly and effectively, offering consumers cutting-edge generative AI experiences for text, photos, music, and more.
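Among the features listed below is LoRA Fusion. MediaTek has not published its internals; as general background, LoRA adapts a model by adding a low-rank update to frozen base weights, and "fusing" folds that update into the weight matrix so inference pays no adapter overhead. A minimal sketch (the dimensions and scale are illustrative, not MediaTek's):

```python
import numpy as np

def fuse_lora(W: np.ndarray, A: np.ndarray, B: np.ndarray, scale: float = 1.0) -> np.ndarray:
    """Fold a low-rank LoRA update (B @ A) into the base weight matrix W."""
    return W + scale * (B @ A)

d_out, d_in, rank = 64, 128, 4          # illustrative dimensions
rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))  # frozen base weights
A = rng.standard_normal((rank, d_in))   # low-rank factors learned by LoRA
B = rng.standard_normal((d_out, rank))
W_fused = fuse_lora(W, A, B, scale=0.5)
assert W_fused.shape == W.shape         # fused weights are a drop-in replacement
```

Because the fused matrix has the same shape as the original, the adapted model runs at exactly the base model's cost, which is why fusing matters on a power-constrained phone SoC.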
- On-device NeuroPilot LoRA Fusion 2.0 and LoRA Fusion support
- NeuroPilot Speculative Decode acceleration and ExecuTorch delegation support, for performance gains of up to 10%
- Generative AI cloud partnerships: Alibaba Qwen LLM, Baichuan AI, ERNIE-3.5-SE, Google Gemini Nano, Meta Llama 2 and Llama 3

Epic Play
12-Core Flagship GPU: Experience the most popular online games in HDR at 90 frames per second while using up to 20% less power than other flagship smartphone platforms.
MediaTek Adaptive Gaming Technology (MAGT): Activate MAGT to increase power efficiency in well-known gaming titles, allowing top titles to run smoothly for up to an hour.
Hardware-Accelerated Mobile Ray Tracing: Thanks to its second-generation hardware ray-tracing engine, the Immortalis-G720 offers gamers fast, immersive ray-tracing experiences at a fluid 60 frames per second, along with console-quality global illumination effects.
HyperEngine from MediaTek: Network Observation System (NOS)
Working with top game companies, MediaTek HyperEngine NOS offloads real-time assessment of network connection quality, enabling smoother and more power-efficient Wi-Fi/cellular dual-network concurrency during gameplay.
- Accurate network forecasting
- Power savings of 10% or more
- Up to 25% less cellular data use, guaranteeing a steady and fluid connection for online gaming
- Collaboration with Tencent GCloud

Amazing Media Capture in All Situations: The Imagiq 990 boasts zero-latency video preview, AI photography, and an 18-bit RAW ISP. Utilise its 16 categories of scene-segmentation adjustment and its AI Semantic Analysis Video Engine for more visually stunning cinematic video capture.
With three microphones capturing high dynamic range audio and filtering out background noise and wind, you can be heard clearly. This makes it perfect for impromptu vlogging.
MediaTek MiraVision 990 AI Display: Expect faster, sharper screens, with the newest HDR standards and AI enhancements for next-generation cinematic experiences.
Amazing displays:
- 4K 120Hz or WQHD 180Hz
- AI depth detection
- Support for foldables with two active screens
- The best anti-burn-in technology available for AMOLED screens

Maximum Interconnectedness
- Wi-Fi 7 Extended Range: connections can extend up to 4.5 metres indoors thanks to MediaTek Xtra Range 2.0 technology (5GHz band), while coexistence and anti-interference technologies provide up to 200% throughput improvement for smoother visuals when streaming wirelessly to 4K smart TVs.
- Wi-Fi 7 UltraSave: MediaTek Wi-Fi 7 with Multi-Link Operation (MLO) and 320MHz bandwidth, up to 6.5Gbps.
- Top Bluetooth features: MediaTek Wi-Fi/BT Hybrid Coexistence 3.0, Bluetooth UltraSave, MediaTek LightningConnect, and extremely low Bluetooth audio latency (<35ms).

Smooth Sub-6GHz with a 5G AI Modem
- Sub-6GHz-capable 4CC-CA 5G R16 modem
- Sub-6GHz downlink speeds of up to 7Gbps
- Modern AI with situation awareness
- Dual SIM Dual Active, MediaTek 5G UltraSave 3.0 multimode

Outstanding Security for a Flagship Android SoC
Introducing a user-privacy-focused security design that safeguards critical processes during both boot-up and secure computing, preventing physical attacks on data access.
During startup and operation, standalone security hardware (Secure Processor, hardware root of trust) works together with the new Arm Memory Tagging Extension (MTE) technology.
Generative AI is the next big thing in innovation. MediaTek, an industry leader in creating high-performing and power-efficient systems-on-chip, is already integrating its potent, internally developed AI processors into its wide range of product offerings.
MediaTek is the fifth-largest fabless semiconductor maker, and its chips power over 2 billion devices annually; you undoubtedly have one! MediaTek designs technology with people in mind, to improve and enrich daily life.
MediaTek Dimensity: 5G Smartphones. MediaTek Dimensity 5G smartphone platforms offer the cutting edge: amazing nonstop gaming, sophisticated AI, and professional-grade photography and multi-camera videography. Together, they make your experience more intelligent, potent, and efficient.
MediaTek Kompanio: The Ubiquitous Computing Companion for Chromebooks. MediaTek Kompanio is the dependable, creative, versatile, go-anywhere, do-anything partner for amazing Chromebook experiences. It's the perfect companion for learning, daily work, streaming media, video conferences, or simply experimenting with one's creativity.
MediaTek provides you with all you need in terms of computing. MediaTek processors are made to meet the needs of the modern user, whether they be for gaming, streaming, work, or education.
MediaTek Genio: Brilliance on the Edge, IoT with Edge-AI. MediaTek Genio propels IoT innovation with easy-to-use software platforms and strong artificial intelligence. MediaTek helps everyone from start-ups to multinational corporations create new IoT devices with Edge-AI capabilities, accelerating time to market and opening new opportunities.
MediaTek Pentonic: 8K/4K Smart Televisions. MediaTek Pentonic offers five key technology pillars in flagship and premium 8K/4K smart TVs: display, audio, AI, broadcasting, and connectivity. With a 60% global TV market share, MediaTek is the leading provider of smart TV platforms, supporting the largest smart TV brands in the world.
MediaTek Filogic: Always-Connected Wi-Fi Experiences. With extreme speeds, improved coverage, built-in security, exceptional power efficiency, and crucial EasyMesh certification, MediaTek Filogic is bringing in a new era of smarter, more powerful Wi-Fi 7, 6E, and 6 solutions that enable users to enjoy seamless, always-connected experiences.
MediaTek Dimensity 9300+ Specs
CPU: Octa-core (8) – 1x Arm Cortex-X4 up to 3.4GHz, 3x Arm Cortex-X4 up to 2.85GHz, 4x Arm Cortex-A720 up to 2.0GHz
Memory: LPDDR5X / LPDDR5T, max frequency 9600Mbps
Storage: UFS 4 + MCQ
Cellular: Sub-6GHz (FR1), mmWave (FR2), 2G–5G multi-mode, 5G-CA, 4G-CA, 5G FDD/TDD, 4G FDD/TDD, TD-SCDMA, WCDMA, EDGE, GSM
Specific functions: 5G/4G Dual SIM Dual Active; SA & NSA modes (SA Option 2, NSA Option 3/3a/3x); NR FR1 TDD+FDD; DSS; FR1 DL 4CC up to 300MHz, 4x4 MIMO; FR2 DL 4CC up to 400MHz, 256QAM; FR1 UL 2CC, 2x2 MIMO, 256QAM; NR UL 2CC, R16 UL enhancement, 256QAM; VoNR / EPS fallback
GNSS: GPS (L1CA + L5 + L1C), BeiDou (B1I + B1C + B2a + B2b), GLONASS (L1OF), Galileo (E1 + E5a + E5b), QZSS (L1CA + L5), NavIC (L5)
Wi-Fi: Wi-Fi 7 (a/b/g/n/ac/ax/be) ready, 2T2R antenna
Bluetooth: 5.4
Camera: up to 320MP sensor; video capture up to 8K30 (7680 x 4320) and 4K60 (3840 x 2160)
GPU: Arm Immortalis-G720 MC12
Video: H.264 and HEVC encoding; H.264, HEVC, VP9, and AV1 playback
Display: 4K up to 120Hz, WQHD up to 180Hz
AI: MediaTek APU 790 (Generative AI)
Security: Secure Processor, hardware root of trust (HWRoT), Arm Memory Tagging Extension (MTE), CC EAL4+ capable, FIPS 140-3, China DRM
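For a sense of what the 9600Mbps memory figure means in practice, peak bandwidth can be estimated with simple arithmetic. The sketch below assumes a 64-bit LPDDR5X bus (typical for flagship mobile SoCs, but not stated in the spec list above — treat the bus width as an illustrative assumption, not a confirmed figure).

```python
# Back-of-the-envelope peak memory bandwidth from the spec sheet figures.
# ASSUMPTION: 64-bit total bus width, common on flagship phone SoCs but
# not listed in the specs above.

def peak_bandwidth_gb_s(rate_mbps_per_pin: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: per-pin data rate (Mbps) x bus width / 8 bits."""
    return rate_mbps_per_pin * bus_width_bits / 8 / 1000  # Mbps -> GB/s

print(peak_bandwidth_gb_s(9600, 64))  # 76.8 GB/s at the listed 9600Mbps
```

Under that assumption, the listed 9600Mbps per-pin rate works out to roughly 76.8 GB/s of theoretical peak bandwidth.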
The Fragmented Future of AI Regulation: A World Divided
The Battle for Global AI Governance
In November 2023, China, the United States, and the European Union surprised the world by signing a joint communiqué, pledging strong international cooperation in addressing the challenges posed by artificial intelligence (AI). The document highlighted the risks of "frontier" AI, exemplified by advanced generative models like ChatGPT, including the potential for disinformation and serious cybersecurity and biotechnology risks. This signaled a growing consensus among major powers on the need for regulation.
However, despite the rhetoric, the reality on the ground suggests a future of fragmentation and competition rather than cooperation.
As multinational communiqués and bilateral talks take place, an international framework for regulating AI seems to be taking shape. But a closer look at recent executive orders, legislation, and regulations in the United States, China, and the EU reveals divergent approaches and conflicting interests. This divergence in legal regimes will hinder cooperation on critical aspects such as access to semiconductors, technical standards, and the regulation of data and algorithms.
The result is a fragmented landscape of warring regulatory blocs, undermining the lofty goal of harnessing AI for the common good.
Cold Reality vs. Ambitious Plans
While optimists propose closer international management of AI through the creation of an international panel similar to the UN's Intergovernmental Panel on Climate Change, the reality is far from ideal. The great powers may publicly express their desire for cooperation, but their actions tell a different story. The emergence of divergent legal regimes and conflicting interests points to a future of fragmentation and competition rather than unified global governance.
The Chip War: A High-Stakes Battle
The ongoing duel between China and the United States over global semiconductor markets is a prime example of conflict in the AI landscape. Export controls on advanced chips and chip-making technology have become a battleground, with both countries imposing restrictions. This competition erodes free trade, sets destabilizing precedents in international trade law, and fuels geopolitical tensions.
The chip war is just one aspect of the broader contest over AI's necessary components, which extends to technical standards and data regulation.
Technical Standards: A Divided Landscape
Technical standards play a crucial role in enabling the use and interoperability of major technologies. The proliferation of AI has heightened the importance of standards to ensure compatibility and market access. Currently, bodies such as the International Telecommunication Union and the International Organization for Standardization negotiate these standards.
However, China's growing influence in these bodies, coupled with its efforts to promote its own standards through initiatives like the Belt and Road Initiative, is challenging the dominance of the United States and Europe. This divergence in standards will impede the diffusion of new AI tools and hinder global solutions to shared challenges.
Data: The Currency of AI
Data is the lifeblood of AI, and access to different types of data has become a competitive battleground. Conflict over data flows and data localization is shaping how data moves across national borders. The United States, once a proponent of free data flows, is now moving in the opposite direction, while China and India have enacted domestic legislation mandating data localization.
This divergence in data regulation will impede the development of global solutions and exacerbate geopolitical tensions.
Algorithmic Transparency: A Contested Terrain
The disclosure of algorithms that underlie AI systems is another area of contention. Different countries have varying approaches to regulating algorithmic transparency, with the EU's proposed AI Act requiring firms to provide government agencies access to certain models, while the United States has a more complex and inconsistent approach. As countries seek to regulate algorithms, they are likely to prohibit firms from sharing this information with other governments, further fragmenting the regulatory landscape.
The vision of a unified global governance regime for AI is being undermined by geopolitical realities. The emerging legal order is characterized by fragmentation, competition, and suspicion among major powers. This fragmentation poses risks, allowing dangerous AI models to be developed and disseminated as instruments of geopolitical conflict.
It also hampers the ability to gather information, assess risks, and develop global solutions. Without a collective effort to regulate AI, the world risks losing the potential benefits of this transformative technology and succumbing to the pitfalls of a divided landscape.
Daily Semiconductor Industry Information By Lansheng Technology
1. At the VLSI Symposium 2023, which will be held next month, Intel will demonstrate the PowerVia technology verification chip.
2. On May 5, it was reported that Samsung Electronics expects to surpass its main competitor TSMC in the chip foundry business within five years.
3. According to media reports, Meta recruited a team from the British artificial intelligence chip company Graphcore. The team previously worked in Oslo, Norway, and was developing AI networking technology at Graphcore until late last year.
4. On May 5, 2023, shares of semiconductor company Alpha and Omega Semiconductor fell as much as 11.64% in intraday trading, touching $20.64 — a new low since November 18, 2020.
5. Following in the footsteps of Samsung, SK Hynix, and Micron, the US chip giant Qualcomm reported quarterly revenue down 16.9% year-on-year to US$9.275 billion, with net profit falling sharply by 41.9%. All three major business segments — handsets, automotive, and IoT — declined to varying degrees, and its guidance for the third fiscal quarter also came in below market expectations.
Lansheng Technology Limited (https://www.lanshengic.com/) is a global distributor of electronic components, established more than 10 years ago and headquartered in Shenzhen, China, focusing mainly on spot-stock electronic components.
[...]
Apocalypse is familiar, even beloved territory for Silicon Valley. A few years ago, it seemed every tech executive had a fully stocked apocalypse bunker somewhere remote but reachable. In 2016, Mr. Altman said he was amassing “guns, gold, potassium iodide, antibiotics, batteries, water, gas masks from the Israeli Defense Force and a big patch of land in Big Sur I can fly to.” The coronavirus pandemic made tech preppers feel vindicated, for a while.
Now, they are prepping for the Singularity.
“They like to think they’re sensible people making sage comments, but they sound more like monks in the year 1000 talking about the Rapture,” said Baldur Bjarnason, author of “The Intelligence Illusion,” a critical examination of A.I. “It’s a bit frightening,” he said.
[...]
For some critics of the Singularity, it is an intellectually dubious attempt to replicate the belief system of organized religion in the kingdom of software.
“They all want eternal life without the inconvenience of having to believe in God,” said Rodney Brooks, the former director of the Computer Science and Artificial Intelligence Laboratory at the Massachusetts Institute of Technology.
[...]
Critics counter that even the impressive results of L.L.M.s are a far cry from the enormous, global intelligence long promised by the Singularity. Part of the problem in accurately separating hype from reality is that the engines driving this technology are becoming hidden. OpenAI, which began as a nonprofit using open source code, is now a for-profit venture that critics say is effectively a black box. Google and Microsoft also offer limited visibility.
Much of the A.I. research is being done by the companies with much to gain from the results. Researchers at Microsoft, which invested $13 billion in OpenAI, published a paper in April concluding that a preliminary version of the latest OpenAI model “exhibits many traits of intelligence” including “abstraction, comprehension, vision, coding” and “understanding of human motives and emotions.”
Rylan Schaeffer, a doctoral student in computer science at Stanford, said some A.I. researchers had painted an inaccurate picture of how these large language models exhibit “emergent abilities” — unexplained capabilities that were not evident in smaller versions.
Along with two Stanford colleagues, Brando Miranda and Sanmi Koyejo, Mr. Schaeffer examined the question in a research paper published last month and concluded that emergent properties were “a mirage” caused by errors in measurement. In effect, researchers are seeing what they want to see.
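The measurement argument can be made concrete with a toy model (our construction, not taken from the Stanford paper): score the same smoothly improving models with an all-or-nothing exact-match metric, and a sharp "emergent" jump appears that the underlying per-token capability never had.

```python
# Toy illustration: a nonlinear, all-or-nothing metric can make smooth
# improvement look like a sudden "emergent" ability.

seq_len = 10  # an answer only counts if all 10 tokens are correct

# Per-token accuracy improving smoothly across hypothetical model scales:
per_token = [i / 10 for i in range(1, 11)]

# Exact-match accuracy under the all-or-nothing metric:
exact_match = [p ** seq_len for p in per_token]

for p, em in zip(per_token, exact_match):
    print(f"per-token {p:.1f} -> exact-match {em:.4f}")
# Exact-match stays near zero until per-token accuracy is already high,
# then shoots up -- an apparent discontinuity created purely by the metric.
```

At a per-token accuracy of 0.5, exact-match is below 0.1%; only near-perfect per-token accuracy registers at all, so plotting exact-match against scale produces the "mirage" of a sudden capability jump.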
[...]
A.I., just like the Singularity, is already being described as irreversible. “Stopping it would require something like a global surveillance regime, and even that isn’t guaranteed to work,” Mr. Altman and some of his colleagues wrote last month. If Silicon Valley doesn’t make it, they added, others will.
Less discussed are the vast profits to be made from uploading the world. Despite all the talk of A.I. being an unlimited wealth-generating machine, the people getting rich are pretty much the ones who are already rich.
Microsoft has seen its market capitalization soar by half a trillion dollars this year. Nvidia, a maker of chips that run A.I. systems, recently became one of the most valuable public U.S. companies when it said demand for those chips had skyrocketed.
“A.I. is the tech the world has always wanted,” Mr. Altman tweeted.
It certainly is the tech that the tech world has always wanted, arriving at the absolute best possible time. Last year, Silicon Valley was reeling from layoffs and rising interest rates. Crypto, the previous boom, was enmeshed in fraud and disappointment.
Semiconductor Market - Forecast (2022 - 2027)
The semiconductor market was valued at $427.6 billion in 2020 and is expected to reach $698.2 billion by 2026, growing at a CAGR of 5.9% during the forecast period 2021-2026. Increased investment in memory devices and integrated-circuit components is driving technological improvements in the semiconductor sector. The emergence of artificial intelligence, the Internet of Things, and machine learning is expected to create additional demand, as these technologies help memory chips process large volumes of data in less time. Moreover, demand for faster, more advanced memory chips in industrial applications is expected to boost the market. As semiconductor technology continues to shrink, a single chip can hold more and more devices, meaning more capability per chip; as a result, functions that previously required several chips are now being combined into one, yielding highly integrated solutions. Owing to such advances, the gallium arsenide segment is expected to expand its share of the market over the forecast period.
Report Coverage
The report “Semiconductor Market – Forecast (2021-2026)” by IndustryARC covers an in-depth analysis of the following segments of the semiconductor market:
By Component – Analog IC, Sensors, MPU, MCU, Memory Devices, Lighting Devices, Discrete Power Devices, Others
By Application – Networking & Communication, Healthcare, Automotive, Consumer Electronics, Data Processing, Industrial, Smart Grid, Gaming, Others
By Type – Intrinsic Semiconductor, Extrinsic Semiconductor
By Process – Wafer Production, Wafer Fabrication, Doping, Masking, Etching, Thermal Oxidation
By Geography – North America (U.S., Canada, Mexico), Europe (Germany, UK, France, Italy, Spain, Belgium, Russia and others), APAC (China, Japan, India, South Korea, Australia and others), South America (Brazil, Argentina and others), and RoW (Middle East and Africa)
Key Takeaways
In the component segment, memory devices are expected to drive overall market growth, owing to ongoing technological advances such as virtual reality and cloud computing.
Networking and communication is expected to hold the largest share, owing to rising demand for smartphones and smart devices around the world.
The APAC region is estimated to account for the largest share of the global market during the forecast period, due to rising electronic equipment production and the presence of large local component manufacturers.
Semiconductor Market Segment Analysis- By Component
Memory devices are expected to drive overall market growth at a CAGR of 6.1%, owing to ongoing technological advances such as virtual reality and cloud computing. High average selling prices of NAND flash chips and DRAM will contribute significantly to revenue generation. As the technology continues to evolve, logic devices used in special-purpose applications — particularly signal processors and application-specific integrated circuits — are expected to grow at the fastest rate.
Semiconductor Market Segment Analysis - By Application
With increasing demand for smartphones and smart devices around the world, the networking and communication segment is expected to hold the largest share of the market, at 16.5% in 2020. Moreover, due to the impact of COVID-19, the shift to working from home has increased the use of devices such as laptops and routers, which is expected to boost the semiconductor market. Wafer-level packaging (WLP), in which an IC is packaged into a component nearly the same size as the die, has increased the use of semiconductor ICs across consumer electronics, owing to developments in silicon wafer materials.
Semiconductor Market Segment Analysis – By Geography
The APAC region is estimated to account for the largest semiconductor market share, at 44.8%, during the forecast period, owing to rising electronic equipment production. Due to the extensive ongoing migration of electrical-equipment manufacturing and the presence of local component makers, China is recognised as the region's leading country. The market in North America is expected to grow at a rapid pace, owing to rising R&D spending.
Semiconductor Market Drivers
Increase in Utilization of Consumer Electronics
Rising technological advancement in consumer electronic devices has created massive demand for integrated-circuit chips, as these ICs provide the advanced or smart functionality in most devices, including smartphones, TVs, and refrigerators. Moreover, investment in semiconductor capacity by leading consumer-electronics companies such as Apple and Samsung is expected to boost the semiconductor market share by country. The adoption of cloud computing has pushed growth in server CPUs and storage, which in turn is expected to drive the semiconductor market. Wireless internet is being adopted on a global scale and requires semiconductor equipment; as a result, the market is fuelled by the demand and income its production creates.
AI Application in Automotive
The semiconductor industry is expected to be driven by huge and growing demand for powerful AI applications in automotive markets. Automakers are pushing forward with driverless vehicles, advanced driver assistance systems (ADAS), and graphics processing units (GPUs), which is estimated to boost the semiconductor market. Furthermore, varied automotive products — navigation control, entertainment systems, and collision detection systems — use automotive semiconductor ICs with various capabilities. At present, automotive represents approximately 10–12 percent of the chip market.
Semiconductor Market- Challenges
Changing Functionality of Chipsets
The semiconductor market is being held back by the constantly changing functionality of semiconductor chips and the unique demands of end users across industries. Factors such as power-efficiency requirements, unrealistic schedules, and cost-down pressures also hinder the market.
Semiconductor Market Landscape
Technology launches, acquisitions, and R&D activities are key strategies adopted by players in the semiconductor market. The market has been consolidated by the major players: Qualcomm, Samsung Electronics, Toshiba Corporation, Micron Technology, Intel Corporation, Texas Instruments, Kyocera Corporation, Taiwan Semiconductor Manufacturing, NXP Semiconductors, and Fujitsu Semiconductor Ltd.
Acquisitions/Technology Launches
In July 2020, Qualcomm introduced the QCS410 and QCS610 systems-on-chip, designed for premium camera technology, including powerful artificial intelligence and machine learning features.
In November 2019, Samsung announced production of its 12GB and 24GB LPDDR4X uMCP chips, offering high-quality memory and data-transfer rates of up to 4266 Mbps in smartphones.
In September 2019, Kyocera Corporation launched its new 5655 Series board-to-board connectors, optimised for high-speed data transfer, with a 0.5mm pitch and a stacking height under 4mm, making them among the world's smallest in this class of connector.
Immersion Cooling Market 2030 Regional Outlook, Share, Type and Application, Trends
The global immersion cooling market was valued at USD 197.0 million in 2022 and is projected to grow at a robust compound annual growth rate (CAGR) of 22.6% from 2023 to 2030. This growth is largely fueled by the increasing demand for cost-effective and energy-efficient cooling solutions for data centers. Immersion cooling, a process where components are submerged in a thermally conductive but electrically insulating liquid, offers substantial advantages over traditional air-cooling methods, making it a compelling choice for large-scale data operations.
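As a quick arithmetic check, the stated figures can be compounded forward. The sketch below assumes the 2022 base value grows at the stated CAGR over the eight years 2023-2030; base-year conventions vary between reports, so this is an order-of-magnitude check, not the report's official 2030 projection.

```python
# Compound the stated figures forward: USD 197.0M (2022 base) at a 22.6%
# CAGR over eight years (2023-2030). ASSUMPTION: compounding starts from
# the 2022 value; base-year conventions differ between market reports.

def compound(value: float, cagr: float, years: int) -> float:
    """Future value after compounding at the given annual growth rate."""
    return value * (1 + cagr) ** years

implied_2030 = compound(197.0, 0.226, 8)
print(round(implied_2030, 1))  # ~1005.5, i.e. roughly USD 1.0 billion
```

In other words, the stated growth rate implies a market on the order of USD 1 billion by 2030.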
During the COVID-19 pandemic in 2020, global lockdowns disrupted industry expansion, delaying data center consolidation efforts as the movement of servers, closure of facilities, and construction of new sites became challenging. Despite these obstacles, demand for data centers surged, primarily due to the global shift toward remote work and a significant increase in e-commerce activities.
In the United States, the immersion cooling market is experiencing transformation as companies in this sector expand capacity to meet the growing needs of data centers. The COVID-19 pandemic accelerated the shift to digital platforms, with more businesses and consumers embracing e-commerce and online services. The need for reliable Internet of Things (IoT) capabilities and cloud computing infrastructure in the U.S. is expected to sustain high demand for data centers, especially hyper-scale data centers, which are designed to accommodate large-scale data storage and management efficiently. Unlike traditional data centers, hyper-scale facilities are equipped to handle high data traffic and intensive computing workloads, making them ideal for emerging technologies and digital applications.
The immersion cooling market comprises various global and regional players offering proprietary solutions, with some companies modifying existing Information Technology Equipment (ITE) to be compatible with immersion cooling technology. Customization is a key strategy in this market, as manufacturers often tailor solutions to meet the specific needs of their clients.
The demand for IoT and cloud infrastructure continues to grow, leading to an increased need for hyper-scale data centers. These large-scale facilities allow digital platforms to manage data storage and transfer more efficiently, catering to the expanding requirements of high-volume data traffic and intensive computing workloads.
Application Segmentation Insights:
In terms of applications, high-performance computing (HPC) was the leading segment in 2022, accounting for 34.6% of the global revenue share. Immersion cooling offers significant advantages for HPC systems, including reduced latency, improved energy efficiency, and the potential for heat reuse in industrial or urban settings. Furthermore, immersion cooling allows for rapid deployment with edge-ready solutions, making it suitable for locations where conventional cooling systems are not feasible. The technology also supports cooling of high chip densities without water waste, aligning with sustainable cooling goals.
The demand for cryptocurrency mining has grown substantially, driven by cryptocurrency’s benefits such as faster international transfers, decentralized operation, fraud protection, and enhanced transactional security. Cryptocurrency mining operations require high-performance systems that are often overclocked to maximize hash rates, and immersion cooling helps maintain the temperatures of these high-power systems efficiently. This cooling method reduces both operational and capital expenses in cryptocurrency mining, making it more cost-effective for miners to achieve their desired processing performance without overheating.
Artificial Intelligence (AI) is another segment expected to experience rapid growth, with a projected CAGR of 26.3% over the forecast period. Several factors contribute to this growth, including a resurgence in AI research in the U.S., the widespread adoption of deep learning technologies by major companies like Facebook, Google, Microsoft, and Amazon, and a rising demand for AI-driven applications. Additionally, Israel’s robust AI ecosystem is contributing to the growing demand for high-performance servers capable of managing the increased load. This upsurge in AI applications is anticipated to drive the immersion cooling market as AI servers require efficient cooling solutions to manage their significant computational workloads.
Immersion cooling systems are especially valuable for supporting large computing workloads in small or constrained spaces where traditional cooling options may be unavailable or unsuitable. This technology reduces energy consumption in edge computing environments, where cooling solutions are critical, but high-capacity power sources may not always be accessible. Immersion liquid cooling enables efficient deployment at edge locations, offering a solution that conserves energy and is adaptable to diverse environments where space or power constraints might otherwise limit the effectiveness of traditional cooling methods.
Order a free sample PDF of the Immersion Cooling Market Intelligence Study, published by Grand View Research.
Is Intel Too Big to Fail? Why the U.S. is Considering Government Intervention
Intel has long been a mainstay of the global IT sector, powering everything from data centers to laptops and fostering innovation that has maintained American competitiveness globally. Recent indications, however, point to serious difficulties facing the business. The question of whether Intel is too large to fail arises as the company attempts to reclaim its competitive advantage against an increasing wave of rivals like AMD, Nvidia, and TSMC. And if so, ought the United States government to intervene?
We’ll dissect Intel’s current situation in this blog, examine why the government might be considering getting involved, and consider the implications for consumers, the tech sector, and national security.
Intel’s Place in the Technology Industry
One of the biggest semiconductor companies in the world, Intel has an impressive past. It developed the x86 architecture that drives most PCs, and its processors are used extensively across numerous industries, such as consumer electronics and high-performance computing. Intel has consistently been at the forefront of manufacturing, especially through its integrated device manufacturing (IDM) approach, in which the company both designs and produces its own chips. However, Intel has recently faced a number of challenges:
Manufacturing Delays: Intel's manufacturing delays have allowed rivals like TSMC and Samsung to produce smaller, more efficient processors, particularly in the transition to advanced nodes like 10nm and 7nm.
Competitive Pressure: AMD has significantly reduced Intel's market share in CPUs for desktops, laptops, and data centers, thanks to its Zen architecture and its alliance with TSMC. Intel is also attempting to break into the AI and graphics markets, where Nvidia's GPUs are the industry leaders.
Demand Shift: Intel is attempting to catch up in the industries of artificial intelligence, machine learning, and cloud computing, where the semiconductor industry has witnessed a spike in demand for specialist chips.
Despite its continued profitability and size, Intel is under considerable strain as a result of these setbacks, compounded by the decline of its dominance in the worldwide semiconductor market. The smallest and most sophisticated chips are currently made by Taiwanese companies like TSMC, creating a reliance on foreign suppliers for cutting-edge technology.
Why Would the American Government Think About Intervening?
Intel’s status as “too big to fail” is tied to both economic stability and national security. The U.S. government has grown increasingly concerned about reliance on foreign vendors for critical technologies. Officials are considering intervening for the following reasons:
National Security Issues: Semiconductors are essential to practically every piece of technology, from military hardware to consumer electronics. Reliance on overseas chip manufacturers, especially those in Taiwan, is viewed as a strategic risk. Intel is one of the few companies that could potentially close this gap domestically, if it can catch up technologically.
Global Competition with China: The significance of self-sufficiency in technology has been brought to light by the U.S.-China trade war. The U.S. government views supporting Intel as a means of maintaining competitiveness in light of China’s aspirations to become a semiconductor leader.
Economic Impact: Intel contributes significantly to job creation and innovation, and the semiconductor sector is a vital component of the American economy. There could be significant economic repercussions if Intel falters.
How Would the Government Get Involved?
The U.S. government might help Intel in a number of ways, including direct financial support and regulatory support, if it chooses to step in. Let’s examine a few options:
Tax incentives and subsidies: The government could provide funding to help defray the costs of expanding Intel’s domestic manufacturing capacity, for example through R&D subsidies, grants, or tax benefits.
Partnerships and Contracts: Direct government contracts are an additional avenue that might be used to incentivize Intel to manufacture chips for the military and other government agencies.
Support for Research and Development: To help Intel catch up to or even outperform rivals in the production of advanced nodes, the United States might contribute to its R&D.
Cooperation on Semiconductor Manufacturing: To improve the infrastructure for domestic manufacturing, the government may promote or require alliances with other businesses, maybe including TSMC.
Potential Effects of Government Involvement
Government action might help Intel catch up to rivals and regain its position as the semiconductor industry leader, but the approach carries both potential advantages and drawbacks.
Advantages
Improved National Security: Strengthening Intel’s capabilities could make the United States less dependent on foreign producers, particularly for sensitive technologies.
Support for Domestic Manufacturing: More funding for semiconductor production in the United States may result in the creation of jobs and the expansion of the tech sector.
More Innovation: A more competitive semiconductor market may result from a stronger Intel, which could spur further innovation.
Drawbacks
Market Distortion: Direct intervention might stifle smaller, innovative chipmakers in the United States by upsetting the competitive environment.
Cost to Taxpayers: Government assistance would likely be expensive, and it would be essential to ensure that these funds are used efficiently.
Possible International Tensions: Supporting or subsidizing one company may cause opposition from other countries, particularly if it is thought to give that company an unfair edge in the global IT market.
In conclusion
Whether Intel is “too big to fail” depends on your point of view, but the company’s performance is clearly tied to the national security and economic interests of the United States. As the U.S. government weighs intervention, the semiconductor business and the global IT landscape may be in for major changes. It remains to be seen whether intervention would give Intel the lift it needs to recover its edge, or whether it would only complicate matters further.
The choices made now will likely shape the future of American technological independence and influence in the global semiconductor sector as Intel navigates its difficulties.
1 note
·
View note
Text
[TIME is US Media]
U.S. and European officials are growing increasingly concerned about China’s accelerated push into the production of older-generation semiconductors and are debating new strategies to contain the country’s expansion. President Joe Biden implemented broad controls over China’s ability to secure the kind of advanced chips that power artificial-intelligence models and military applications. But Beijing responded by pouring billions into factories for the so-called legacy chips that haven’t been banned. Such chips are still essential throughout the global economy, critical components for everything from smartphones and electric vehicles to military hardware.
That’s sparked fresh fears about China’s potential influence and triggered talks of further reining in the Asian nation, according to people familiar with the matter, who asked not to be identified because the deliberations are private. The U.S. is determined to prevent chips from becoming a point of leverage for China, the people said.
Commerce Secretary Gina Raimondo alluded to the problem during a panel discussion last week at the American Enterprise Institute. “The amount of money that China is pouring into subsidizing what will be an excess capacity of mature chips and legacy chips—that’s a problem that we need to be thinking about and working with our allies to get ahead of,” she said.[...]
Legacy chips are typically considered those made with 28-nm equipment or above, technology introduced more than a decade ago. Senior E.U. and U.S. officials are concerned about Beijing’s drive to dominate this market for both economic and security reasons, the people said. They worry Chinese companies could dump their legacy chips on global markets in the future, driving foreign rivals out of business like in the solar industry, they said.[...]
domestic producers may be reluctant to invest in facilities that will have to compete with heavily subsidized Chinese plants. [...]
“The United States and its partners should be on guard to mitigate nonmarket behavior by China’s emerging semiconductor firms,”
While the U.S. rules introduced last October slowed down China’s development of advanced chipmaking capabilities, they left largely untouched [sic] the country’s ability to use techniques older than 14 nanometers. That has led Chinese firms to construct new plants faster than anywhere else in the world. They are forecast to build 26 fabs through 2026 that use 200-mm and 300-mm wafers, according to the trade group SEMI. That compares with 16 fabs for the Americas.
So what's the problem? Is it that you suck at manufacturing & want more neoliberalism? That's what it seems like to me [31 Jul 23]
137 notes
·
View notes
Text
It’s billed as a summit for democracy. Under U.S. leadership, countries from six continents will gather from March 29 to March 30 to highlight “how democracies deliver for their citizens and are best equipped to address the world’s most pressing challenges,” according to the U.S. State Department.
Although advancing technology for democracy is a key pillar of the summit’s agenda, the United States has been missing in action when it comes to laying out and leading on a vision for democratic tech leadership. And by staying on the sidelines and letting others—most notably the European Union—lead on tech regulation, the United States has the most to lose economically and politically.
One in five private-sector jobs in the United States is linked to the tech sector, making tech a cornerstone of the U.S. economy. When U.S. tech companies are negatively impacted by global economic headwinds, overzealous regulators, or other factors, the consequences are felt across the economy, as the recent tech layoffs impacting tens of thousands of workers have shown.
And “tech” isn’t just about so-called Big Tech companies such as Alphabet (Google’s parent company) or social media platforms such as Meta’s Facebook and Instagram. Almost every company is now a tech company—automakers, for example, can track users’ movements from GPS data, require large numbers of computer chips, and use the cloud for data storage. Rapid developments in artificial intelligence, especially in the field of natural language processing (the ability behind OpenAI’s ChatGPT), have widespread applications across an even larger swath of sectors including media and communications.
This means that tech policy is not just about content moderation or antitrust legislation—two of the main areas of focus for U.S. policymakers. Rather, tech policy is economic policy, trade policy, and—when it comes to U.S. tech spreading across the globe—foreign policy.
As the global leader in technology innovation, the United States has a real competitive edge as well as a political opportunity to advance a vision for technology in the service of democracy. But the window to act is rapidly narrowing as others, including like-minded democracies in Europe but also authoritarian China, are stepping in to fill the leadership void.
The European Union has embarked on an ambitious regulatory agenda, laying out a growing number of laws to govern areas including digital services taxes, data sharing, online advertising, and cloud services. Although the regulatory efforts may be based in democratic values, in practice, they have an economic agenda: France, for example, expects to make 670 million euros in 2023 from digital services taxes, with much of that coming from large U.S. tech companies.
What’s worse is that while other key EU regulations, such as the Digital Markets Act (DMA), target the largest U.S. firms, they leave Chinese-controlled companies such as Alibaba and Tencent less regulated. That’s because the DMA sets out very narrow criteria to define “gatekeepers,” such as company size and market position, to only cover large U.S. firms, thus benefiting both European companies and subsidized Chinese competitors and creating potential security vulnerabilities when it comes to data collection and access.
While Europe rushes to regulate, China has developed an effective model of digital authoritarianism: strangling the internet with censorship, deploying AI technologies such as facial recognition for surveillance, and advocating for cyber “sovereignty,” which is doublespeak for state control of data and information. Beijing has been actively exporting these tools to other countries, primarily in the global south, where the United States is fighting an uphill battle to convince countries to join its global democracy agenda.
And the battle for hearts and minds has implications far beyond tech—it goes to the heart of U.S. global leadership. In last month’s vote at the United Nations to condemn Russia’s brutal invasion of Ukraine, endorsed by the United States, the majority of the countries that voted against or abstained were from Africa, South America, and Asia.
Without a U.S.-led concerted effort to push back against authoritarian states’ desire to define the rules around technology, large democracies such as Turkey and India are also wavering, imposing increasingly authoritarian limits on free speech online. The result is growing digital fragmentation—fragmentation that benefits authoritarian adversaries.
The Biden administration says it wants to see technology harnessed to support democratic freedoms, strengthen our democratic alliances, and beat back the authoritarian vision of a government-run internet.
Here’s how it could help achieve these goals.
First, the administration should map out an affirmative technology strategy, making sure that U.S. workers and consumers benefit from U.S. tech leadership. This means investing in competitiveness and a smarter public-private approach to research and development, an area the United States has underfunded for over a decade.
Tech touches on almost every sector of the U.S. economy as well as international trade, defense, and security, and involves almost every government agency from the State Department’s Bureau of Cyberspace and Digital Policy to the Federal Trade Commission and the Cybersecurity and Infrastructure Security Agency. And while most European countries now have full ministries for digital affairs, the U.S. doesn’t have similarly politically empowered counterparts tasked with coordinating a whole-of-government effort across all government agencies to produce a national strategy for technology. This needs to change.
Second, the administration should take advantage of the bipartisan consensus in the U.S. Congress on the need to push back against China’s growing domination in tech by putting forward a balanced regulatory agenda that establishes clear rules for responsible innovation. In an op-ed earlier this year, U.S. President Joe Biden called for Republicans and Democrats to hold social media platforms accountable for how they use and collect data, moderate online content, and treat their competition. To be sure, a national privacy law is long overdue, as several states have already passed their own laws, creating a confusing regulatory environment.
But this agenda is too backward-looking: Policymakers today are debating how to regulate technology from 20 years ago, when social media companies first emerged. As ChatGPT has shown, tech advancements far outpace regulatory efforts. A balanced agenda would set out key principles and ethical guardrails, rather than seek to regulate specific companies or apps. Banning TikTok, for example, won’t prevent another Chinese company from taking its place.
Third, the U.S. should reenergize its engagement in multilateral institutions. The United States is taking the right steps in endorsing Japan’s initiative at the next G-7 meeting to establish international standards for trust in data flows, known as the Data Free Flow with Trust. The administration has also appointed an ambassador at large for cyberspace and digital policy to work more closely with allies on tech cooperation.
The U.N.’s International Telecommunication Union, which helps develop standards in telecoms, is now directed by American Doreen Bogdan-Martin, which also presents an opportunity to beat back Russian and Chinese attempts to impose government control over the internet and instead reinforce the present private sector- and civil society-led internet governance model.
Washington has led important defensive efforts to challenge Beijing’s system of sovereignty and surveillance and has brought key allies along in these efforts. But it has not done enough to drive an affirmative agenda on technology innovation and tech-driven economic opportunity. The Biden administration has an opportunity now to prioritize tech. There is no time to waste.
4 notes
·
View notes
Text
Google’s CEO reveals that over 25% of the code they generate is through Artificial Intelligence
Sundar Pichai, CEO of Google, has revealed that more than 25% of new code “written” at the company is now generated by artificial intelligence (AI). This shouldn’t come as a surprise to anyone, as programming is one of the primary fields where AI is set to dominate the market — and yes, to some extent, replace human labor.
This information arrives at a time when Google has announced its financial results. Google Cloud generated $11.4 billion during the third quarter of the year (Q3 2024). For context, this is a 35% increase compared to last year. Google states that its AI product portfolio “is helping us attract new clients, secure larger deals, and boost current clients’ product adoption by 30%.”
As with all companies, AI is also Google’s golden goose. Google’s financial results make clear that its “robust AI infrastructure, including data centers, chips, and a global fiber network,” is its primary focus. Meanwhile, YouTube stands out in Google’s financial impact: the company revealed that during this quarter, YouTube surpassed $50 billion in revenue for the first time in its history, counting YouTube ads, subscriptions, YouTube Music Premium, YouTube TV, and the NFL. On devices, Google provided no details beyond mentioning its recently launched Pixel 9.
Returning to the topic at hand, Google is not only facilitating access to AI for other companies, but it has also begun replacing human workers internally to boost productivity and reduce costs. AI is advancing more rapidly in software development and coding processes. Although for now, Google states that human engineers review all work generated by AI, it’s only a matter of time before the percentage of AI use continues to rise.
“We are also using AI internally to improve our coding processes, which is driving productivity and efficiency. Currently, more than a quarter of all new code at Google is AI-generated, subsequently reviewed and approved by engineers. This helps our engineers complete work faster and more efficiently.”
“I am energized by our progress and the opportunities ahead. We remain focused on creating great products.”
NVIDIA already warned of the future awaiting programmers
Back in February of this year, Jensen Huang, CEO of NVIDIA, said that no one should need to learn to program anymore; AI, he argued, would program for us. In fact, someone like me, who doesn’t know how to program, could now do so thanks to AI. NVIDIA’s CEO claimed that with AI, anyone could become a programmer without prior knowledge of the field. Thus, it’s no surprise that Google is now turning to AI to generate new code, just as the world’s leading companies are.
“In the last 10 to 15 years, nearly everyone on a stage like this would tell you it’s vital for your kids to learn computer science — that everyone should learn to program. In fact, it’s almost exactly the opposite. Our job is to create computing technology so that no one has to program, and the programming language becomes human. Now, everyone is a programmer. This is the miracle of artificial intelligence.”
0 notes
Text
Global 3D Atom Probe Market – Key Insight, Trend, And Industry Growth:
MARKET OVERVIEW:
The global 3D atom probe market is experiencing robust growth, driven by its essential role in providing atomic-level material analysis. This technology is vital for industries such as semiconductors, metallurgy, and advanced manufacturing, where precise material characterization is crucial for innovation. The ability to visualize a material's 3D atomic structure enables the development of high-performance products, particularly in the electronics and nanotechnology sectors.
The market is projected to grow at a CAGR of 8.5% from 2023 to 2030, with the total market size expected to reach USD 230 million by 2030. This growth is fueled by increasing demand for high-resolution microscopy in the semiconductor industry, where 3D atom probes help improve microchip design and production. Additionally, growing investments in nanotechnology and materials research further accelerate market expansion as industries seek more advanced tools for precise atomic analysis.
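As a quick sanity check on these figures, one can back out the market's implied starting size from the cited endpoint and growth rate. The sketch below assumes a seven-year compounding horizon (2023 to 2030) and annual compounding, neither of which the report states explicitly:

```python
# Hedged sketch: infer the 2023 base consistent with the cited figures
# (USD 230 million by 2030, 8.5% CAGR). The 7-year horizon and annual
# compounding convention are assumptions, not stated in the report.

def implied_base(future_value: float, cagr: float, years: int) -> float:
    """Starting value consistent with a future value under compound growth."""
    return future_value / (1 + cagr) ** years

base_2023 = implied_base(230.0, 0.085, 7)  # USD millions
print(round(base_2023, 1))  # prints 129.9, i.e. a ~USD 130M implied 2023 base
```

The same one-liner can check any CAGR claim in the report; a projected endpoint far from `base * (1 + cagr) ** years` usually signals an inconsistent forecast.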
3D atom probe tomography (APT) plays a critical role in meeting the demand for ultra-high-resolution material insights, especially in fields requiring exact composition data for complex materials. Unlike traditional microscopy methods, APT offers three-dimensional imaging and detailed chemical profiling, making it invaluable for studying materials at the atomic level. This capability is pivotal for industries that depend on atomic accuracy to optimize performance, durability, and efficiency.
North America and Europe currently lead the market, owing to established infrastructures and substantial R&D investments. In recent years, however, the Asia-Pacific region has emerged as a fast-growing player, driven by significant investments in semiconductor and advanced manufacturing sectors. Key companies and research institutions are continually advancing APT technology, introducing new equipment and software solutions to facilitate faster and more accurate analyses.
Key Trends Shaping the Global 3D Atom Probe Market
1. Expanding Applications in Semiconductor and Electronics Industries
As semiconductor devices become increasingly complex and miniaturized, the need for precise material analysis has never been greater. The 3D atom probe’s atomic-level precision allows semiconductor manufacturers to evaluate structural integrity, identify atomic defects, and optimize material properties. This capacity to inspect and understand materials at an unprecedented scale has made APT essential for chip designers and semiconductor firms striving for higher yields and more efficient components.
The rising demand for high-performance electronics—driven by trends in artificial intelligence (AI), 5G, and the Internet of Things (IoT)—has intensified R&D efforts within the semiconductor sector. Companies are investing in atom probe technology to stay competitive, as APT provides them with a deeper understanding of material characteristics essential for developing advanced microchips. This demand is expected to keep rising as electronic devices evolve and require more intricate and efficient designs.
2. Growing Role in Nanotechnology and Advanced Material Science
Nanotechnology focuses on materials at the atomic and molecular scale, and atom probe tomography has proven invaluable in this domain. By analyzing and visualizing atomic interactions within nanomaterials, APT allows researchers to create materials with highly controlled properties, essential for applications in biomedical engineering, energy, and aerospace. In nanotechnology, even minor atomic irregularities can drastically impact material performance, making the precision of APT indispensable.
Applications of APT in nanotechnology research are rapidly expanding. For instance, the technology enables detailed study of carbon-based nanostructures, quantum dots, and biomaterials, allowing researchers to optimize these materials for various applications. This trend is expected to continue as nanotechnology moves into broader industrial and consumer applications, thus driving demand for atom probe technology across both public and private sectors.
3. Critical Contributions to Battery and Renewable Energy Research
The renewable energy sector, particularly battery research, benefits significantly from the insights provided by 3D atom probe technology. The atomic-level data generated by APT allows researchers to monitor ion diffusion, electrode degradation, and other atomic-scale phenomena critical to battery performance and longevity. These insights help in the development of more stable and efficient energy storage materials, supporting growth in electric vehicle (EV) markets, grid storage solutions, and other clean energy applications.
With the global transition toward sustainable energy solutions, battery technology has become a focal point of research, especially in the context of lithium-ion and solid-state batteries. APT helps researchers identify atomic-level changes within these materials, informing new designs that maximize energy density and battery life. This demand is projected to expand, especially as clean energy initiatives and electric vehicle production accelerate worldwide.
4. Advancements in Metallurgy and High-Performance Alloys
In sectors like aerospace, automotive, and defense, high-performance alloys are essential for creating durable and lightweight components that withstand extreme conditions. APT’s ability to provide a detailed atomic view of alloys enables metallurgists to understand material composition, grain boundaries, and microstructural defects. This analysis helps optimize alloys for improved strength, corrosion resistance, and thermal stability, which are critical properties for industries relying on advanced metal components.
The growing focus on developing innovative alloy compositions is further fueling demand for 3D atom probe technology. Aerospace and automotive industries, in particular, are leveraging APT to innovate lighter, stronger materials that contribute to fuel efficiency and safety. As materials science advances, atom probe tomography will likely continue to play a crucial role in alloy development, supporting a wide range of industrial applications.
Challenges and Emerging Opportunities
Despite its numerous advantages, the high cost associated with 3D atom probe technology remains a barrier to broader adoption. Atom probe systems are expensive to acquire and maintain, and they require highly skilled operators. However, efforts are underway to reduce costs through miniaturization and automation, potentially making APT more accessible across sectors. This cost-reduction trend presents an opportunity for further market expansion as it brings atom probe technology within reach for smaller laboratories and research institutions.
Another challenge lies in data processing. The vast data generated by APT requires robust data management and analysis solutions, which can be time-consuming and costly. Software developers have an opportunity here to create advanced data processing tools that streamline APT workflows, making it easier for users to analyze and interpret their findings. Improved data management could significantly enhance the efficiency of APT technology, encouraging wider use in industry and academia.
Future Growth Potential in the Global 3D Atom Probe Market
The global 3D atom probe market shows substantial growth potential, especially as industries increasingly demand precise material analysis for product development and innovation. As APT technology advances, with enhancements in user-friendliness and automation, its appeal across sectors like electronics, energy, and materials science will likely continue to expand. Additionally, ongoing R&D investments from both public and private sectors in developing economies signal further opportunities for market growth.
Regions such as Asia-Pacific are set to become prominent players in the global atom probe market due to rapid industrialization, particularly in semiconductor manufacturing. As countries like China, Japan, and South Korea intensify their investments in nanotechnology and advanced manufacturing, the demand for APT is likely to rise in these regions. Partnerships between research institutions and commercial enterprises will play a crucial role in this expansion, as collaborative efforts accelerate the development and accessibility of atom probe technology.
Conclusion: A Cornerstone of Material Science and Industrial Innovation
The global 3D atom probe market stands at the forefront of scientific and industrial innovation, offering solutions that support advancements in sectors ranging from semiconductor manufacturing to renewable energy. As the need for precision in material analysis intensifies, demand for atom probe technology is set to grow, shaping the future of material science and supporting the development of next-generation products and technologies.
With its capacity to provide atomic-level insights, 3D atom probe technology is expected to remain essential for high-tech industries focused on improving product quality, sustainability, and performance. As costs decrease and software improvements streamline data handling, APT will become even more integral to scientific research and industrial applications, ensuring its place as a fundamental tool in modern material analysis.
More about report: https://www.xinrenresearch.com
0 notes