#Data centers
exeton · 4 months
Data Centers in High Demand: The AI Industry’s Unending Quest for More Capacity
The demand for data centers to support the booming AI industry is at an all-time high. Companies are scrambling to build the necessary infrastructure, but they’re running into significant hurdles. From parts shortages to power constraints, the AI industry’s rapid growth is stretching resources thin and driving innovation in data center construction.
The Parts Shortage Crisis
Data center executives report that the lead time to obtain custom cooling systems has quintupled compared to a few years ago. Additionally, backup generators, which used to be delivered in a month, now take up to two years. This delay is a major bottleneck in the expansion of data centers.
The Hunt for Suitable Real Estate
Finding affordable real estate with adequate power and connectivity is a growing challenge. Builders are scouring the globe and employing creative solutions. For instance, new data centers are planned next to a volcano in El Salvador to harness geothermal energy and inside shipping containers in West Texas and Africa for portability and access to remote power sources.
Case Study: Hydra Host’s Struggle
Earlier this year, data-center operator Hydra Host faced a significant hurdle. They needed 15 megawatts of power for a planned facility with 10,000 AI chips. The search for the right location took them from Phoenix to Houston, Kansas City, New York, and North Carolina. Each potential site had its drawbacks — some had power but lacked adequate cooling systems, while others had cooling but no transformers for additional power. New cooling systems would take six to eight months to arrive, while transformers would take up to a year.
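For a rough sense of how those figures relate, here is a minimal sizing sketch. Only the 10,000-chip count and the 15-megawatt target come from the article; the per-chip power draw and the PUE (cooling and power overhead factor) are illustrative assumptions.

```python
# A minimal facility-sizing sketch. Only the 10,000-chip count and the 15 MW
# target come from the article; the per-chip draw and PUE are assumptions.

def facility_power_mw(num_chips: int, watts_per_chip: float, pue: float) -> float:
    """Total facility power in MW: IT load scaled by the PUE overhead factor."""
    it_load_watts = num_chips * watts_per_chip
    return it_load_watts * pue / 1e6

# ~1,000 W per accelerator (server share included) and a PUE of 1.5 are assumed.
print(f"{facility_power_mw(10_000, 1_000, 1.5):.1f} MW")  # -> 15.0 MW
```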
Surge in Demand for Computational Power
The demand for computational power has skyrocketed since late 2022, following the success of OpenAI’s ChatGPT. The surge has overwhelmed existing data centers, particularly those equipped with the latest AI chips, like Nvidia’s GPUs. The need for vast numbers of these chips to create complex AI systems has put enormous strain on data center infrastructure.
Rapid Expansion and Rising Costs
The amount of data center space in the U.S. grew by 26% last year, with a record number of facilities under construction. However, this rapid expansion is not enough to keep up with demand. Prices for available space are rising, and vacancy rates are negligible.
Building Data Centers: A Lengthy Process
Jon Lin, the general manager of data-center services at Equinix, explains that constructing a large data facility typically takes one and a half to two years. The planning and supply-chain management involved make it challenging to quickly scale up capacity in response to sudden demand spikes.
Major Investments by Tech Giants
(Image: "Why the AI Industry's Thirst for New Data Centers Can't Be Satisfied," The Wall Street Journal)
Supply Chain and Labor Challenges
The rush to build data centers has extended the time required to acquire essential components. Transceivers and cables now take months longer to arrive, and there’s a shortage of construction workers skilled in building these specialized facilities. AI chips, particularly Nvidia GPUs, are also in short supply, with lead times extending to several months at the height of demand.
Innovative Solutions to Power Needs
Portable Data Centers and Geothermal Energy
Startups like Armada are building data centers inside shipping containers, which can be deployed near cheap power sources like gas wells in remote Texas or Africa. In El Salvador, AI data centers may soon be powered by geothermal energy from volcanoes, thanks to the country’s efforts to create a more business-friendly environment.
Conclusion: Meeting the Unending Demand
The AI industry’s insatiable demand for data centers shows no signs of slowing down. While the challenges are significant — ranging from parts shortages to power constraints — companies are responding with creativity and innovation. As the industry continues to grow, the quest to build the necessary infrastructure will likely become even more intense and resourceful.
FAQs
1. Why is there such a high demand for data centers in the AI industry?
The rapid growth of AI technologies, which require significant computational power, has driven the demand for data centers.
2. What are the main challenges in building new data centers?
The primary challenges include shortages of critical components, suitable real estate, and sufficient power supply.
3. How long does it take to build a new data center?
It typically takes one and a half to two years to construct a large data facility due to the extensive planning and supply-chain management required.
4. What innovative solutions are companies using to meet power needs for data centers?
Companies are exploring options like modular nuclear reactors, geothermal energy, and portable data centers inside shipping containers.
5. How are tech giants like Amazon, Microsoft, and Google responding to the demand for data centers?
They are investing billions of dollars in new data centers to expand their capacity and meet the growing demand for AI computational power.
Muhammad Hussnain Facebook | Instagram | Twitter | Linkedin | Youtube
jcmarchi · 4 months
Elaine Liu: Charging ahead
New Post has been published on https://thedigitalinsider.com/elaine-liu-charging-ahead/
MIT senior Elaine Siyu Liu doesn’t own an electric car, or any car. But she sees the impact of electric vehicles (EVs) and renewables on the grid as two pieces of an energy puzzle she wants to solve.
The U.S. Department of Energy reports that the number of public and private EV charging ports nearly doubled in the past three years, and many more are in the works. Users expect to plug in at their convenience, charge up, and drive away. But what if the grid can’t handle it?
Electricity demand, long stagnant in the United States, has spiked due to EVs, data centers that drive artificial intelligence, and industry. Grid planners forecast an increase of 2.6 percent to 4.7 percent in electricity demand over the next five years, according to data reported to federal regulators. Everyone from EV charging-station operators to utility-system operators needs help navigating a system in flux.
That’s where Liu’s work comes in.
Liu, who is studying mathematics and electrical engineering and computer science (EECS), is interested in distribution — how to get electricity from a centralized location to consumers. “I see power systems as a good venue for theoretical research as an application tool,” she says. “I’m interested in it because I’m familiar with the optimization and probability techniques used to map this level of problem.”
Liu grew up in Beijing, then after middle school moved with her parents to Canada and enrolled in a prep school in Oakville, Ontario, 30 miles outside Toronto.
Liu stumbled upon an opportunity to take part in a regional math competition and eventually started a math club, but at the time, the school’s culture surrounding math surprised her. Being exposed to what seemed to be some students’ aversion to math, she says, “I don’t think my feelings about math changed. I think my feelings about how people feel about math changed.”
Liu brought her passion for math to MIT. The summer after her sophomore year, she took on the first of the two Undergraduate Research Opportunity Program projects she completed with electric power system expert Marija Ilić, a joint adjunct professor in EECS and a senior research scientist at the MIT Laboratory for Information and Decision Systems.
Predicting the grid
Since 2022, with the help of funding from the MIT Energy Initiative (MITEI), Liu has been working with Ilić on identifying ways in which the grid is challenged.
One factor is the addition of renewables to the energy pipeline. A gap in wind or sun might cause a lag in power generation. If this lag occurs during peak demand, it could mean trouble for a grid already taxed by extreme weather and other unforeseen events.
If you think of the grid as a network of dozens of interconnected parts, once an element in the network fails — say, a tree downs a transmission line — the electricity that used to go through that line needs to be rerouted. This may overload other lines, creating what’s known as a cascade failure.
“This all happens really quickly and has very large downstream effects,” Liu says. “Millions of people will have instant blackouts.”
Even if the system can handle a single downed line, Liu notes that “the nuance is that there are now a lot of renewables, and renewables are less predictable. You can’t predict a gap in wind or sun. When such things happen, there’s suddenly not enough generation and too much demand. So the same kind of failure would happen, but on a larger and more uncontrollable scale.”
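To make the cascade mechanism concrete, here is a toy sketch of a failure spreading across a four-line network: when one line trips, its flow is pushed onto neighboring lines, and any line driven past its capacity trips in turn. The line names, flows, and capacities are invented for illustration; this is not Liu's model or real grid data.

```python
# Toy cascade sketch: when a line trips, its flow is pushed onto neighboring
# lines; any line pushed past capacity trips in turn. All numbers are invented.

def cascade(flows, capacity, neighbors, first_failure):
    """Return the order in which lines fail after an initial trip."""
    failed = [first_failure]
    queue = [first_failure]
    while queue:
        line = queue.pop(0)
        spill = flows[line]          # power that must be rerouted
        flows[line] = 0.0
        alive = [n for n in neighbors[line] if n not in failed]
        for n in alive:              # split the lost flow evenly across survivors
            flows[n] += spill / len(alive)
            if flows[n] > capacity[n]:
                failed.append(n)
                queue.append(n)
    return failed

flows = {"A": 80.0, "B": 60.0, "C": 50.0, "D": 40.0}
capacity = {"A": 100.0, "B": 90.0, "C": 70.0, "D": 65.0}
neighbors = {"A": ["B", "C"], "B": ["A", "C", "D"],
             "C": ["A", "B", "D"], "D": ["B", "C"]}
print(cascade(flows, capacity, neighbors, "A"))  # -> ['A', 'B', 'C', 'D']
```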
Renewables’ varying output has the added complication of causing voltage fluctuations. “We plug in our devices expecting a voltage of 110, but because of oscillations, you will never get exactly 110,” Liu says. “So even when you can deliver enough electricity, if you can’t deliver it at the specific voltage level that is required, that’s a problem.”
Liu and Ilić are building a model to predict how and when the grid might fail. Lacking access to privatized data, Liu runs her models with European industry data and test cases made available to universities. “I have a fake power grid that I run my experiments on,” she says. “You can take the same tool and run it on the real power grid.”
Liu’s model predicts cascade failures as they evolve. Supply from a wind generator, for example, might drop precipitously over the course of an hour. The model analyzes which substations and which households will be affected. “After we know we need to do something, this prediction tool can enable system operators to strategically intervene ahead of time,” Liu says.
Dictating price and power
Last year, Liu turned her attention to EVs, which provide a different kind of challenge than renewables.
In 2022, S&P Global reported that lawmakers argued that the U.S. Federal Energy Regulatory Commission’s (FERC) wholesale power rate structure was unfair for EV charging station operators.
In addition to operators paying by the kilowatt-hour, some also pay more for electricity during peak demand hours. Only a few EVs charging up during those hours could result in higher costs for the operator even if their overall energy use is low.
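A back-of-the-envelope illustration of that effect: under a typical demand-charge tariff, part of the bill is based on the single highest-kilowatt interval of the month, so a brief burst of simultaneous fast charging can dominate costs even when total energy is modest. The rates below are made-up placeholders, not FERC or utility figures.

```python
# Why a brief peak matters: the demand charge is billed on the highest kW
# interval of the month. Rates are illustrative assumptions only.

ENERGY_RATE = 0.12      # $/kWh, assumed
DEMAND_RATE = 18.00     # $/kW of monthly peak, assumed

def monthly_bill(total_kwh: float, peak_kw: float) -> float:
    return total_kwh * ENERGY_RATE + peak_kw * DEMAND_RATE

# Same monthly energy, very different peaks:
steady = monthly_bill(total_kwh=3_000, peak_kw=20)    # trickle charging all month
spiky = monthly_bill(total_kwh=3_000, peak_kw=350)    # a few EVs fast-charging at once
print(f"steady: ${steady:,.2f}  spiky: ${spiky:,.2f}")
# -> steady: $720.00  spiky: $6,660.00
```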
Anticipating how much power EVs will need is more complex than predicting energy needed for, say, heating and cooling. Unlike buildings, EVs move around, making it difficult to predict energy consumption at any given time. “If users don’t like the price at one charging station or how long the line is, they’ll go somewhere else,” Liu says. “Where to allocate EV chargers is a problem that a lot of people are dealing with right now.”
One approach would be for FERC to dictate to EV users when and where to charge and what price they’ll pay. To Liu, this isn’t an attractive option. “No one likes to be told what to do,” she says.
Liu is looking at optimizing a market-based solution that would be acceptable to top-level energy producers — wind and solar farms and nuclear plants — all the way down to the municipal aggregators that secure electricity at competitive rates and oversee distribution to the consumer.
Analyzing the location, movement, and behavior patterns of all the EVs driven daily in Boston and other major energy hubs, she notes, could help demand aggregators determine where to place EV chargers and how much to charge consumers, akin to Walmart deciding how much to mark up wholesale eggs in different markets.
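In that spirit, here is a minimal sketch of the siting side of the problem: given assumed zone-level EV demand and a handful of candidate sites, pick locations greedily so that the most demand is within reach. The zones, demand weights, and coverage sets are hypothetical, and this is far simpler than the market-optimization model Liu and Ilić are submitting.

```python
# A toy siting sketch: pick charger locations greedily so the most assumed
# daily EV demand is covered. Zones, weights, and coverage are invented.

zone_demand = {"Back Bay": 900, "Seaport": 700, "Allston": 500, "Dorchester": 400}

site_covers = {                      # which zones each candidate site serves (assumed)
    "Site1": {"Back Bay", "Allston"},
    "Site2": {"Seaport", "Back Bay"},
    "Site3": {"Dorchester", "Allston"},
}

def greedy_place(budget):
    chosen, covered = [], set()
    for _ in range(budget):
        # pick the unchosen site that adds the most not-yet-covered demand
        best = max((s for s in site_covers if s not in chosen),
                   key=lambda s: sum(zone_demand[z] for z in site_covers[s] - covered))
        chosen.append(best)
        covered |= site_covers[best]
    return chosen, sorted(covered)

print(greedy_place(2))
# -> (['Site2', 'Site3'], ['Allston', 'Back Bay', 'Dorchester', 'Seaport'])
```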
Last year, Liu presented the work at MITEI’s annual research conference. This spring, Liu and Ilić are submitting a paper on the market optimization analysis to a journal of the Institute of Electrical and Electronics Engineers.
Liu has come to terms with her early introduction to attitudes toward STEM that struck her as markedly different from those in China. She says, “I think the (prep) school had a very strong ‘math is for nerds’ vibe, especially for girls. There was a ‘why are you giving yourself more work?’ kind of mentality. But over time, I just learned to disregard that.”
After graduation, Liu, the only undergraduate researcher in Ilić’s MIT Electric Energy Systems Group, plans to apply to fellowships and graduate programs in EECS, applied math, and operations research.
Based on her analysis, Liu says that the market could effectively determine the price and availability of charging stations. Offering incentives for EV owners to charge during the day instead of at night when demand is high could help avoid grid overload and prevent extra costs to operators. “People would still retain the ability to go to a different charging station if they chose to,” she says. “I’m arguing that this works.”
indianahal · 17 days
With interest rates still high and massive amounts of office and retail space sitting empty, the commercial real estate industry faces serious financial trouble in 2024/2025. Over $1 trillion in commercial real estate loans are coming due. With property values dropping and many small and midsize commercial banks struggling to stay above water, a financial crash rivaling or exceeding the 2008 crisis is a strong possibility. My new program is entitled "U.S. Commercial Real Estate Market Crashing 2024."
rjzimmerman · 26 days
Hungry for Clean Energy, Facebook Looks to a New Type of Geothermal. (New York Times)
Excerpt from this New York Times story:
Big tech companies across the United States are struggling to find enough clean energy to power all the data centers they plan to build.
Now, some firms are betting on a novel solution: harvesting the heat deep beneath the Earth’s surface to create emissions-free electricity, using drilling techniques from the oil and gas fracking boom.
On Monday, Meta, the company that owns Facebook, announced an agreement with a start-up called Sage Geosystems to develop up to 150 megawatts of an advanced type of geothermal energy that would help power the tech giant’s expanding array of data centers. That is roughly enough electricity to power 70,000 homes.
Sage will use fracking techniques similar to those that have helped extract vast amounts of oil and gas from shale rock. But rather than drill for fossil fuels, Sage plans to create fractures thousands of feet beneath the surface and pump water into them. The heat and pressure underground should heat the water to the point where it can be used to generate electricity in a turbine, all without the greenhouse gases that are causing global warming.
“It’s basically the same fracking technology,” said Cindy Taff, an oil industry veteran who worked at Shell for 36 years before becoming Sage’s chief executive. “The difference is that we’re going after clean heat instead of hydrocarbons” such as oil and gas.
Sage has already drilled a test well in South Texas to demonstrate its approach. The startup now aims to build its first large-scale power plant at a yet-to-be-determined location east of the Rocky Mountains, with the first phase coming online by 2027.
The deal is the latest sign of growing excitement for new types of geothermal power that could provide enormous amounts of emissions-free electricity around the clock and complement more variable sources like wind and solar power.
Google has partnered with Fervo Energy, a prominent geothermal start-up, to build a 5-megawatt pilot plant in Nevada that has already begun supplying power to the grid. The two companies recently reached a deal to supply much more geothermal power in the years ahead to Google’s data centers.
Fervo is also building a 400-megawatt plant in Utah that will sell electricity to utilities in Southern California and is expected to come online starting in 2026.
Tech firms are facing an urgent need for more electricity, as growing interest in artificial intelligence has triggered a data center boom. By one estimate, data centers could consume 9 percent of U.S. electricity by 2030, up from 4 percent today.
Data centers typically need power 24 hours a day, which wind turbines and solar panels alone can’t provide. At the same time, many technology companies have promised to reduce their planet-warming emissions and face pressure not to rely on fossil fuels like coal or gas. So they are exploring technologies that can run around the clock, like nuclear power or enhanced geothermal.
jgkoomey · 1 month
Our new article in Joule, titled "To better understand AI's growing energy use, analysts need a data revolution," was published online today.
The article covers the data needed to understand AI electricity use (the link will be good until October 8, 2024). Here's the summary section:
As the famous quote from George Box goes, “All models are wrong, but some are useful.” Bottom-up AI data center models will never be a perfect crystal ball, but energy analysts can soon make them much more useful for decisionmakers if our identified critical data needs are met. Without better data, energy analysts may be forced to take several shortcuts that are more uncertain, less explanatory, less defensible, and less useful to policymakers, investors, the media, and the public. Meanwhile, all of these stakeholders deserve greater clarity on the scales and drivers of the electricity use of one of the most disruptive technologies in recent memory. One need only look to the history of cryptocurrency mining as a cautionary tale: after a long initial period of moderate growth, mining electricity demand rose rapidly. Meanwhile, energy analysts struggled to fill data and modeling gaps to quantify and explain that growth to policymakers—and to identify ways of mitigating it—especially at local levels where grids were at risk of stress. The electricity demand growth potential of AI data centers is much larger, so energy analysts must be better prepared. With the right support and partnerships, the energy analysis community is ready to take on the challenges of modeling a fast moving and uncertain sector, to continuously improve, and to bring much-needed scientific evidence to the table. Given the rapid growth of AI data center operations and investments, the time to act is now."
I worked with my longtime colleagues Eric Masanet and Nuoa Lei on this article.
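For readers wanting a concrete sense of what a bottom-up estimate looks like, here is a minimal sketch: installed accelerators times average power, utilization, annual hours, and PUE. Every input below is an assumed placeholder, not a figure from the Joule article.

```python
# A minimal bottom-up sketch of AI data center electricity use:
# installed accelerators x average power x utilization x hours x PUE.
# All inputs are assumed placeholders, not figures from the article.

def annual_twh(accelerators: float, avg_power_w: float,
               utilization: float, pue: float) -> float:
    """Annual electricity use in terawatt-hours."""
    watts = accelerators * avg_power_w * utilization * pue
    return watts * 8760 / 1e12   # 8,760 hours per year; Wh -> TWh

# e.g. 3 million deployed accelerators at 700 W, 60% utilized, PUE 1.3 (assumed)
print(f"{annual_twh(3e6, 700, 0.6, 1.3):.1f} TWh/yr")  # -> 14.3 TWh/yr
```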
hostdimeindia · 1 month
Importance of Data Centers in India
Data centers are the backbone of modern businesses, providing the infrastructure needed to store, manage, and process large amounts of data. In India, top data center companies like HostDime are essential for ensuring reliable and secure IT operations. These facilities support everything from cloud computing to data storage, helping businesses stay competitive in today’s digital age. With advanced technology and round-the-clock support, data centers in India are crucial for business growth and continuity.
demiumresearch · 2 months
Stocks set to gain from the increasing number of data centers in India; analysts anticipate a capital expenditure of ₹50,000 crore.
Data Centers Stocks Overview
Aurionpro Solutions Stock Price Analysis and Quick Research Report
• Analyzing financial data is crucial for determining a company's true net worth.
• Financial ratios help make sense of the overwhelming information in a company's financial statements.
• Aurionpro Solutions' current share price is Rs 1,653.35.
• The return on assets (ROA) is 8.27%, which the report reads as a sign of weak future performance.
• The current ratio is 1.47, reflecting the company's ability to pay short-term liabilities with short-term assets.
• The return on equity (ROE) is 15.11%, the profit generated per rupee of common stockholders’ equity.
• The debt-to-equity ratio is 0.22, indicating a low proportion of debt in the company's capital.
• Aurionpro Solutions reported revenue growth of 44.92%, indicating fair growth and performance.
• The operating margin for the current financial year is 13.88%.
• The current-year dividend is Rs 2.50, a yield of 0.14%.
• The latest EPS is Rs 7.05, indicating a higher profit allocation for each outstanding share.
Why data centers matter:
* They house mission-critical computing and networking equipment.
* They are the backbone of the digital world.
* They enable vast data storage, processing, and distribution.
Aurionpro's current market price is ₹1,607.70, and its 1-year return is 192%.
Cummins India Stock Price Analysis and Quick Research Report
• Stock investing involves careful analysis of financial data, including the profit and loss account, balance sheet, and cash flow statement.
• Financial ratios help determine a company's performance and make sense of the overwhelming information in its financial statements.
• Key ratios to consider include:
- PE ratio: 65.98, high and comparatively overvalued.
- Share price: Rs 3,952.65.
- Return on assets (ROA): 15.78%, indicating good future performance.
- Current ratio: 2.70, indicating stability against unexpected business and economic changes.
- Return on equity (ROE): 22.11%, indicating profit generation from shareholders' investments.
- Debt-to-equity ratio: 0.07, indicating low debt in the capital.
- Sales growth: 26.12%, fair in relation to growth and performance.
- Operating margin: 16.03%, indicating operational efficiency.
- Dividend: Rs 25, a yield of 0.96%.
- Earnings per share: Rs 59.91, the profit allocated to each outstanding share.
Cummins India Ltd.'s current market price is ₹4,099.30; its 1-year return is 108%.
Anant Raj Limited's current market price is ₹507.00; its 1-year return is 160%.
Schneider Electric Infrastructure Ltd.'s current market price is ₹921.65; its 1-year return is 249%.
ABB India Ltd.'s current market price is ₹8,682.75; its 1-year return is 93%.
Siemens' current market price is ₹7,866.90; its 1-year return is 107%.
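For readers unfamiliar with the ratios quoted above, here is a small sketch of how they are derived from a company's financial statements. The inputs are made-up illustrative numbers, not the actual financials of Aurionpro, Cummins, or any other company named in this post.

```python
# How a few of the quoted ratios are derived, with made-up inputs
# (not the actual financials of any company named above).

def ratios(net_income, total_assets, shareholder_equity, total_debt,
           current_assets, current_liabilities, shares_outstanding, price):
    eps = net_income / shares_outstanding
    return {
        "ROA %": 100 * net_income / total_assets,
        "ROE %": 100 * net_income / shareholder_equity,
        "Debt/Equity": total_debt / shareholder_equity,
        "Current ratio": current_assets / current_liabilities,
        "EPS": eps,
        "P/E": price / eps,
    }

print(ratios(net_income=120, total_assets=1_000, shareholder_equity=600,
             total_debt=150, current_assets=400, current_liabilities=250,
             shares_outstanding=50, price=60))
# -> ROA 12%, ROE 20%, D/E 0.25, current ratio 1.6, EPS 2.4, P/E 25
```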
Have you invested in any of these stocks? LIKE❤️ SHARE / COMMENT
Please Turn On Post notifications
Mission 🎯
At demiumresearch.com, our vision is to provide knowledge of the stock market, investment strategies, and much more. #demiumresearchindia
nuadox · 2 months
China has launched the world’s largest sodium-ion battery unit
- By Nuadox Crew -
China has launched the world's largest sodium-ion battery energy storage system (BESS) in Qianjiang, Hubei province.
The first phase, a 50MW/100MWh project, is now operational and will eventually double in capacity to 100MW/200MWh. The system includes 42 BESS containers with 185Ah sodium-ion batteries, 21 power conversion system units, and a 110kV booster station.
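Two quick figures implied by those numbers follow as a simple sketch; note the per-container value is just an average, not the project's actual electrical design.

```python
# Quick arithmetic on the quoted figures (a sketch; the per-container split
# is a simple average, not the project's actual electrical design).

power_mw, energy_mwh, containers = 50, 100, 42

duration_h = energy_mwh / power_mw            # a 2-hour-duration system
mwh_per_container = energy_mwh / containers   # ~2.4 MWh per BESS container

print(f"{duration_h:.0f} h duration, ~{mwh_per_container:.2f} MWh per container")
```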
Developed by Datang Hubei Energy Development, a state-owned enterprise, this project is part of China's effort to diversify energy storage technologies away from lithium. Sodium-ion batteries are seen as a promising alternative due to their potential to ease supply chain issues, despite their lower energy density and higher initial costs compared to lithium-ion.
Sodium-ion batteries offer advantages such as better efficiency and durability under extreme conditions. China is heavily investing in this technology due to its limited lithium reserves but abundant sodium resources. HiNa Battery predicts a significant growth in the sodium-ion battery industry, potentially reaching terawatt-hour scale by 2030.
Read more at Energy-Storage.news
--
Other recent news
AI and Emissions: Google’s AI operations have significantly increased emissions by 48% due to the high energy consumption of their data centers.
jcmarchi · 5 months
Two MIT teams selected for NSF sustainable materials grants
New Post has been published on https://thedigitalinsider.com/two-mit-teams-selected-for-nsf-sustainable-materials-grants/
Two teams led by MIT researchers were selected in December 2023 by the U.S. National Science Foundation (NSF) Convergence Accelerator, a part of the TIP Directorate, to receive awards of $5 million each over three years, to pursue research aimed at helping to bring cutting-edge new sustainable materials and processes from the lab into practical, full-scale industrial production. The selection was made after 16 teams from around the country were chosen last year for one-year grants to develop detailed plans for further research aimed at solving problems of sustainability and scalability for advanced electronic products.
Of the two MIT-led teams chosen for this current round of funding, one team, Topological Electric, is led by Mingda Li, an associate professor in the Department of Nuclear Science and Engineering. This team will be finding pathways to scale up sustainable topological materials, which have the potential to revolutionize next-generation microelectronics by showing superior electronic performance, such as dissipationless states or high-frequency response. The other team, led by Anuradha Agarwal, a principal research scientist at MIT’s Materials Research Laboratory, will be focusing on developing new materials, devices, and manufacturing processes for microchips that minimize energy consumption using electronic-photonic integration, and that detect and avoid the toxic or scarce materials used in today’s production methods.
Scaling the use of topological materials
Li explains that some materials based on quantum effects have made the transition from lab curiosities to mass production, such as blue-light LEDs and giant magnetoresistance (GMR) devices used for magnetic data storage. But he says a variety of equally promising materials have yet to make it into real-world applications.
“What we really wanted to achieve is to bring newer-generation quantum materials into technology and mass production, for the benefit of broader society,” he says. In particular, he says, “topological materials are really promising to do many different things.”
Topological materials are ones whose electronic properties are fundamentally protected against disturbance. For example, Li points to the fact that just in the last two years, it has been shown that some topological materials are even better electrical conductors than copper, which are typically used for the wires interconnecting electronic components. But unlike the blue-light LEDs or the GMR devices, which have been widely produced and deployed, when it comes to topological materials, “there’s no company, no startup, there’s really no business out there,” adds Tomas Palacios, the Clarence J. Lebel Professor in Electrical Engineering at MIT and co-principal investigator on Li’s team. Part of the reason is that many versions of such materials are studied “with a focus on fundamental exotic physical properties with little or no consideration on the sustainability aspects,” says Liang Fu, an MIT professor of physics and also a co-PI. Their team will be looking for alternative formulations that are more amenable to mass production.
One possible application of these topological materials is for detecting terahertz radiation, explains Keith Nelson, an MIT professor of chemistry and co-PI. This extremely high-frequency electronics can carry far more information than conventional radio or microwaves, but at present there are no mature electronic devices available that are scalable at this frequency range. “There’s a whole range of possibilities for topological materials” that could work at these frequencies, he says. In addition, he says, “we hope to demonstrate an entire prototype system like this in a single, very compact solid-state platform.”
Li says that among the many possible applications of topological devices for microelectronics devices of various kinds, “we don’t know which, exactly, will end up as a product, or will reach real industrial scaleup. That’s why this opportunity from NSF is like a bridge, which is precious, to allow us to dig deeper to unleash the true potential.”
In addition to Li, Palacios, Fu, and Nelson, the Topological Electric team includes Qiong Ma, assistant professor of physics in Boston College; Farnaz Niroui, assistant professor of electrical engineering and computer science at MIT; Susanne Stemmer, professor of materials at the University of California at Santa Barbara; Judy Cha, professor of materials science and engineering at Cornell University; industrial partners including IBM, Analog Devices, and Raytheon; and professional consultants. “We are taking this opportunity seriously,” Li says. “We really want to see if the topological materials are as good as we show in the lab when being scaled up, and how far we can push to broadly industrialize them.”
Toward sustainable microchip production and use
The microchips behind everything from smartphones to medical imaging are associated with a significant percentage of greenhouse gas emissions today, and every year the world produces more than 50 million metric tons of electronic waste, the equivalent of about 5,000 Eiffel Towers. Further, the data centers necessary for complex computations and huge amount of data transfer — think AI and on-demand video — are growing and will require 10 percent of the world’s electricity by 2030.
“The current microchip manufacturing supply chain, which includes production, distribution, and use, is neither scalable nor sustainable, and cannot continue. We must innovate our way out of this crisis,” says Agarwal.
The name of Agarwal’s team, FUTUR-IC, is a reference to the future of the integrated circuits, or chips, through a global alliance for sustainable microchip manufacturing. Says Agarwal, “We bring together stakeholders from industry, academia, and government to co-optimize across three dimensions: technology, ecology, and workforce. These were identified as key interrelated areas by some 140 stakeholders. With FUTUR-IC we aim to cut waste and CO2-equivalent emissions associated with electronics by 50 percent every 10 years.”
The market for microelectronics in the next decade is predicted to be on the order of a trillion dollars, but most of the manufacturing for the industry occurs only in limited geographical pockets around the world. FUTUR-IC aims to diversify and strengthen the supply chain for manufacturing and packaging of electronics. The alliance has 26 collaborators and is growing. Current external collaborators include the International Electronics Manufacturing Initiative (iNEMI), Tyndall National Institute, SEMI, Hewlett Packard Enterprise, Intel, and the Rochester Institute of Technology.
Agarwal leads FUTUR-IC in close collaboration with others, including, from MIT, Lionel Kimerling, the Thomas Lord Professor of Materials Science and Engineering; Elsa Olivetti, the Jerry McAfee Professor in Engineering; Randolph Kirchain, principal research scientist in the Materials Research Laboratory; and Greg Norris, director of MIT’s Sustainability and Health Initiative for NetPositive Enterprise (SHINE). All are affiliated with the Materials Research Laboratory. They are joined by Samuel Serna, an MIT visiting professor and assistant professor of physics at Bridgewater State University. Other key personnel include Sajan Saini, education director for the Initiative for Knowledge and Innovation in Manufacturing in MIT’s Department of Materials Science and Engineering; Peter O’Brien, a professor from Tyndall National Institute; and Shekhar Chandrashekhar, CEO of iNEMI.
“We expect the integration of electronics and photonics to revolutionize microchip manufacturing, enhancing efficiency, reducing energy consumption, and paving the way for unprecedented advancements in computing speed and data-processing capabilities,” says Serna, who is the co-lead on the project’s technology “vector.”
Common metrics for these efforts are needed, says Norris, co-lead for the ecology vector, adding, “The microchip industry must have transparent and open Life Cycle Assessment (LCA) models and data, which are being developed by FUTUR-IC.” This is especially important given that microelectronics production transcends industries. “Given the scale and scope of microelectronics, it is critical for the industry to lead in the transition to sustainable manufacture and use,” says Kirchain, another co-lead and the co-director of the Concrete Sustainability Hub at MIT. To bring about this cross-fertilization, co-lead Olivetti, also co-director of the MIT Climate and Sustainability Consortium (MCSC), will collaborate with FUTUR-IC to enhance the benefits from microchip recycling, leveraging the learning across industries.
Saini, the co-lead for the workforce vector, stresses the need for agility. “With a workforce that adapts to a practice of continuous upskilling, we can help increase the robustness of the chip-manufacturing supply chain, and validate a new design for a sustainability curriculum,” he says.
“We have become accustomed to the benefits forged by the exponential growth of microelectronic technology performance and market size,” says Kimerling, who is also director of MIT’s Materials Research Laboratory and co-director of the MIT Microphotonics Center. “The ecological impact of this growth in terms of materials use, energy consumption and end-of-life disposal has begun to push back against this progress. We believe that concurrently engineered solutions for these three dimensions will build a common learning curve to power the next 40 years of progress in the semiconductor industry.”
The MIT teams are two of six that received awards addressing sustainable materials for global challenges through phase two of the NSF Convergence Accelerator program. Launched in 2019, the program targets solutions to especially compelling challenges at an accelerated pace by incorporating a multidisciplinary research approach.
wawt-tech · 4 months
Use Of Rectifiers In Datacenters
In the realm of datacenters, where immense amounts of data are processed and stored, the need for reliable power management systems is paramount. One key component that plays a vital role in this process is the rectifier. In this article, we will delve into the importance of rectifiers in datacenters and how they contribute to ensuring smooth operations and efficiency. To read more about trends in the use of rectifiers in datacenters, visit: https://wawt.tech/2024/05/14/trends-in-the-use-of-rectifiers-in-datacenters/
instaviewpoint · 5 months
Creating Biased Businesses Through Taxes
April 29, 2024, by Kimberly Mann
How wonderful! Jobs will be created and disposable income will enter the region! That was the reaction of many when they heard Amazon Data Services is building a center in their area. What they didn't hear were the reasons why it may not be a great idea, reasons like the cost to citizens.
Citizen Funding
"Indiana Economic Development Corporation (IEDC) committed…
hostdimeindia · 3 months
HostDime's Data Centers are Out of This World
HostDime operates purpose-built public data center facilities in Mexico (codename Andromeda), Brazil (Horizon), Colombia (Nebula), and Orlando (Supernova). HostDime’s near term roadmap includes a continued focus on Latin America with hyper edge facilities coming to Peru, Ecuador, Argentina, Bolivia, and India.