#energy efficient data centers
vivekguptahal · 2 years
Decarbonizing Data Centers for a Data-driven Sustainable Society
The need for data center sustainability
Digital services like streaming, email, and online shopping can reach a virtually unlimited number of people without occupying physical space. However, there is an urgent need to decarbonize data storage, management, access, and distribution: data center electricity use is projected to grow to 8 percent of total global electricity demand by 2030, roughly 15 times today's share.^ Energy-efficient data centers are therefore a key climate priority.
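As a quick arithmetic check on the projection above (the 8 percent and 15-fold figures come from the cited source; the implied baseline is simple division, not a separately reported number):

```python
# Figures from the cited source: data centers projected at 8% of global
# electricity demand by 2030, a 15-fold increase over today.
projected_share_2030 = 0.08
growth_factor = 15

# The implied current share follows by simple division.
implied_current_share = projected_share_2030 / growth_factor
print(f"Implied current share: {implied_current_share:.2%}")  # Implied current share: 0.53%
```

That is, the projection implies data centers use about half a percent of global electricity today.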
Hitachi helps reduce the data center carbon footprint
Hitachi is focused on a sustainable future and contributing to society through the development of superior, original technology and products. Hence, it aims to reduce the environmental impact of data centers through technology and data-driven approaches.
According to research by Stanford Energy, with the right innovative solutions, data center carbon emissions could decrease by 88 percent.^^ Understanding the carbon footprint of a data center is not enough: leveraging tools that help teams collaborate across infrastructure, building, energy, and corporate management to gain visibility into consumption and emissions can make a real difference.
Data centers are also increasing their use of renewable power sources, but many operators purchase carbon credits instead of generating energy on-site or nearby. Data centers also need to reduce their energy consumption. They can use carbon footprint analytics to optimize their equipment: analyzing how much space the equipment requires, how much heat it generates, and how that heat can be controlled more efficiently. Meanwhile, software developers can create sustainable applications to minimize environmental impact as well.
If done right, all these measures can enhance the environmental performance of both existing data centers and future facilities.  
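As a minimal sketch of the kind of visibility such tooling provides, the sketch below uses standard PUE (power usage effectiveness) accounting; all input numbers are illustrative assumptions, not Hitachi figures:

```python
HOURS_PER_YEAR = 8760

def annual_emissions_tonnes(it_load_kw, pue, grid_kg_co2_per_kwh, renewable_fraction=0.0):
    """Rough annual CO2 estimate for one facility.

    it_load_kw           -- average IT equipment power draw
    pue                  -- power usage effectiveness (total power / IT power)
    grid_kg_co2_per_kwh  -- carbon intensity of grid electricity
    renewable_fraction   -- share of consumption met by on-site/nearby renewables
    """
    total_kwh = it_load_kw * pue * HOURS_PER_YEAR
    grid_kwh = total_kwh * (1.0 - renewable_fraction)
    return grid_kwh * grid_kg_co2_per_kwh / 1000.0  # kg -> tonnes

# Illustrative: a 1 MW IT load at PUE 1.6 on a 0.4 kg CO2/kWh grid,
# with and without half of consumption covered by renewables.
print(round(annual_emissions_tonnes(1000, 1.6, 0.4), 1))                           # 5606.4
print(round(annual_emissions_tonnes(1000, 1.6, 0.4, renewable_fraction=0.5), 1))   # 2803.2
```

Even this toy model shows why visibility across facility management (the PUE term) and energy procurement (the renewable term) both matter.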
Working together to incorporate data center sustainability
Data center sustainability and energy optimization strategies involve multifaceted approaches that include:
Investing in conversations to analyze an organization’s current position and where it needs to progress.
Creating a pilot project and replicating it later, if successful.
Defining milestones to build momentum for the overall sustainability journey. 
At Hitachi, we develop green data center solutions that span technological, organizational, training, and regulatory challenges, helping organizations prepare for future green data center certifications and emerging audit standards.
Hitachi has set a goal of carbon neutrality at its business sites by 2030 and across the entire value chain by 2050. The data center emissions challenge is an opportunity to lead by example through green and digital innovation and contribute to a sustainable society.
The dawn of a new, sustainable era has begun.
Discover how Hitachi is unlocking value for society with Social Innovation in Energy:
Sources:
^ https://corporate.enelx.com/en/stories/2021/12/data-center-industry-sustainability
^^ https://energy.stanford.edu/news/data-centers-can-slash-co2-emissions-88-or-more
The Evolution of Data Center Architecture: A Look at India's Growing Market
Radiant Info Solutions leads the evolution of data center architecture in India, offering modular, secure, and energy-efficient design solutions.
jgkoomey · 1 month
Our new article in Joule, titled "To better understand AI’s growing energy use, analysts need a data revolution," was published online today.
Our new article in Joule on data needs for understanding AI electricity use came out online today (link will be good until October 8, 2024). Here's the summary section:
"As the famous quote from George Box goes, “All models are wrong, but some are useful.” Bottom-up AI data center models will never be a perfect crystal ball, but energy analysts can soon make them much more useful for decisionmakers if our identified critical data needs are met. Without better data, energy analysts may be forced to take several shortcuts that are more uncertain, less explanatory, less defensible, and less useful to policymakers, investors, the media, and the public. Meanwhile, all of these stakeholders deserve greater clarity on the scales and drivers of the electricity use of one of the most disruptive technologies in recent memory. One need only look to the history of cryptocurrency mining as a cautionary tale: after a long initial period of moderate growth, mining electricity demand rose rapidly. Meanwhile, energy analysts struggled to fill data and modeling gaps to quantify and explain that growth to policymakers—and to identify ways of mitigating it—especially at local levels where grids were at risk of stress. The electricity demand growth potential of AI data centers is much larger, so energy analysts must be better prepared. With the right support and partnerships, the energy analysis community is ready to take on the challenges of modeling a fast-moving and uncertain sector, to continuously improve, and to bring much-needed scientific evidence to the table. Given the rapid growth of AI data center operations and investments, the time to act is now."
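For readers unfamiliar with the modeling style the article advocates, here is a deliberately simplified bottom-up sketch. The structure (fleet size × power × utilization × hours × PUE) is the standard bottom-up approach; every number below is a placeholder, not an estimate from the paper:

```python
HOURS_PER_YEAR = 8760

def fleet_twh(accelerators, avg_power_kw, utilization, pue):
    """Bottom-up annual electricity estimate for an AI accelerator fleet.

    accelerators  -- installed accelerator (e.g. GPU) count
    avg_power_kw  -- average per-accelerator power, including host overhead
    utilization   -- average fraction of time drawing that power
    pue           -- facility power usage effectiveness
    """
    kwh = accelerators * avg_power_kw * utilization * HOURS_PER_YEAR * pue
    return kwh / 1e9  # kWh -> TWh

# Placeholder fleet: 2 million accelerators at 0.7 kW, 60% utilization, PUE 1.3.
print(round(fleet_twh(2_000_000, 0.7, 0.6, 1.3), 1))  # 9.6
```

The article's point is that every input here (shipments, per-unit power, utilization, PUE) is currently poorly reported, which is exactly the data gap the authors want closed.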
I worked with my longtime colleagues Eric Masanet and Nuoa Lei on this article.
jcmarchi · 2 months
Adam Khan, Founder of Diamond Quanta – Interview Series
New Post has been published on https://thedigitalinsider.com/adam-khan-founder-of-diamond-quanta-interview-series/
Adam Khan is a vanguard in diamond semiconductor technology, celebrated for his foresight and expertise in the industry. As the founder of AKHAN Semiconductor, he was instrumental in innovating lab-grown diamond thin-films for a myriad of applications, from enhancing the durability of smartphone screens and lenses with Miraj Diamond Glass® to bolstering the survivability of aircraft with Miraj Diamond Optics®.
Following his impactful tenure at AKHAN, Adam founded Diamond Quanta to push the boundaries of diamond semiconductor technology further. Diamond Quanta specializes in the defect engineering and manufacturing-minded development of diamond systems to achieve advanced doping techniques, pioneering the development of both n-type and p-type synthetic diamond materials. This innovation enables exceptional semiconductor performance, surpassing traditional materials and unlocking new possibilities in high-power and high-temperature applications. Diamond Quanta’s mission is to lead the next evolution in semiconductor technology, driving progress in fields ranging from AI computing to automotive electronics.
What are diamond-based semiconductors, and how do they differ from traditional silicon-based semiconductors?
Diamond-based semiconductors excel in environments where traditional silicon chips falter, notably in high-power and high-temperature applications:
Thermal Management: Unlike silicon chips that require extensive cooling and operate safely below 140°C, diamond semiconductors thrive at temperatures exceeding 400°C, maintaining performance without the need for complex cooling solutions.
Power Density: Diamond can handle significantly greater power loads than silicon, enhancing performance in high-power applications without degradation.
Future Scalability: Silicon faces scalability challenges due to its thermal and power constraints, while diamond offers sustainable scalability with superior performance metrics.
What recent breakthroughs in lab-grown diamond technology have enabled the use of diamond semiconductors?
Recent advances at Diamond Quanta have pushed diamond semiconductors to the forefront, particularly with our Unified Diamond Framework. This novel technology enhances the structural integrity and thermal management of lab-grown diamonds, making them ideal for demanding applications such as data centers.
How does the thermal conductivity of diamond semiconductors improve data center efficiency?
Diamond’s superior thermal conductivity significantly reduces the need for traditional cooling systems in data centers, allowing for tighter component packing and higher operational temperatures, which translates into reduced energy consumption and enhanced overall efficiency.
How do diamond-based semiconductors manage heat dissipation more effectively than other materials?
Diamond semiconductors dissipate heat more efficiently due to their high thermal conductivity and wide bandgap, ensuring optimal performance even under high thermal loads, which is critical for maintaining system stability and longevity.
What are the benefits of greater power density in diamond-based semiconductors for data centers?
The high-power density of diamond semiconductors allows for more compact and powerful computing setups, supporting higher computation loads in smaller spaces, which is essential for scaling modern data center operations.
How can diamond-based semiconductors contribute to reducing the carbon footprint of data centers?
By eliminating the need for extensive cooling infrastructures and allowing for higher operational efficiencies, diamond-based semiconductors substantially lower the energy consumption and carbon output of data centers, significantly mitigating their environmental impact.
How can diamond semiconductors improve the performance of AI and large language models (LLMs) in data centers?
Diamond semiconductors address critical challenges like heat management and energy efficiency, enabling AI and LLMs to operate more effectively and reliably, thus enhancing computational speed and accuracy in data centers.
In what ways can diamond-based semiconductors extend the longevity of electronic devices?
The robust nature of diamond reduces wear and tear on electronic components, significantly extending the lifespan of devices by minimizing the frequency of maintenance and replacement.
What role do diamond semiconductors play in the development of quantum photonic devices?
Diamond semiconductors are pivotal in advancing quantum photonic devices due to their compatibility with existing photonic technologies and their exceptional optical and electronic properties, facilitating breakthroughs in quantum computing applications.
What future advancements in AI data centers could be enabled by diamond semiconductor technology?
Diamond-based semiconductors are poised to transform AI data centers by enabling more efficient handling of the IT load—including servers, network devices, and data storage—through advanced thermal and electrical properties. These semiconductors can significantly enhance the energy efficiency of data center power systems, such as server power supply units and uninterruptible power supplies. By achieving superior thermal management and power density, diamond semiconductors operate effectively at temperatures exceeding 400°C, far above the typical 80°C limits of current materials, which allows them to function without extensive cooling systems. This capacity not only simplifies infrastructure but also boosts operational efficiency, reducing the energy consumption by up to 18% annually and dramatically lowering CO2 emissions. The integration of diamond semiconductors in power conversion equipment and IT loads is expected to deliver critical enhancements in energy management and cost efficiency, setting a new standard for the industry’s move towards more sustainable and powerful computing environments.
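As a back-of-the-envelope reading of the efficiency claim above (the 18 percent figure is from the interview; the facility size and PUE are illustrative assumptions):

```python
HOURS_PER_YEAR = 8760

def annual_savings_mwh(it_load_kw, pue, reduction):
    """Energy saved per year if total facility consumption drops by `reduction`."""
    baseline_mwh = it_load_kw * pue * HOURS_PER_YEAR / 1000.0
    return baseline_mwh * reduction

# Illustrative: a 5 MW IT load at PUE 1.5, with the 18% annual reduction
# cited in the interview.
print(round(annual_savings_mwh(5000, 1.5, 0.18)))  # 11826
```

On those assumed numbers, an 18 percent reduction is on the order of 12 GWh per year for a single mid-sized facility, which is why the claim, if borne out, would matter at fleet scale.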
Thank you for the interview, readers who wish to learn more should visit Diamond Quanta.
radiantindia · 2 months
Green Data Center Solutions by Radiant Info Solutions
Discover how Radiant Info Solutions leads the way in green data center solutions. Sustainable, energy-efficient, and cost-effective data center designs.
pebblegalaxy · 5 months
Reimagining the Energy Landscape: AI's Growing Hunger for Computing Power #BlogchatterA2Z
Navigating the Energy Conundrum: AI’s Growing Hunger for Computing Power In the ever-expanding realm of artificial intelligence (AI), the voracious appetite for computing power threatens to outpace our energy sources, sparking urgent calls for a shift in approach. According to Rene Haas, Chief Executive Officer of Arm Holdings Plc, by the year 2030, data centers worldwide are projected to…
danieldavidreitberg · 7 months
Who Holds the Keys? Ethical Considerations for AI Training and Data Storage
As artificial intelligence (AI) evolves, so do the complex questions surrounding its development and use. One particularly critical issue: who owns the data used to train and store AI models? This seemingly simple question sparks a web of ethical and legal considerations that demand careful analysis.
Fueling the AI Engine
AI thrives on data. Vast amounts of data are used to train and refine models, from text and images to personal information and even medical records. But who, ultimately, owns this data?
Individuals: The data often originates from individuals, raising questions about their consent and control over its use. Should they have a say in how their data is used for AI development?
Companies: Companies collecting and utilizing data for AI development claim ownership, arguing it's part of their intellectual property. But does this outweigh individual rights and interests?
Governments: In some cases, government agencies collect and hold vast amounts of data, further complicating the ownership picture. What role should they play in regulating and overseeing AI data use?
Beyond Ownership: Ethical Implications
The question of ownership is just one part of the equation. Ethical considerations abound:
Bias and Discrimination: AI models trained on biased data can perpetuate discrimination, harming individuals and communities. How can we ensure data used for AI is fair and inclusive?
Privacy Concerns: When personal data is used for AI development, privacy concerns are paramount. How can we balance innovation with individual privacy rights?
Security and Transparency: Data breaches and misuse pose significant risks. How can we ensure secure storage and transparent use of AI training data?
Navigating the Maze
So, how do we move forward? There's no one-size-fits-all solution, but here are some steps:
Clear and informed consent: Individuals should have clear and easily understood ways to consent to their data being used for AI development.
Robust data protection laws: Strong legislation is crucial to ensure responsible data collection, storage, and use, protecting individual rights and privacy.
Ethical AI development: Developers and companies must adopt ethical frameworks and principles to guide data collection and model training.
Independent oversight bodies: Establishing independent bodies to monitor and advise on AI data practices can offer much-needed transparency and accountability.
The path forward demands collaboration: individuals, companies, governments, and researchers must work together to establish ethical guidelines and frameworks for AI training and data storage. Only then can we ensure that AI truly benefits humanity, respecting individual rights and building a fairer, more equitable future.
reasonsforhope · 4 months
Green energy is in its heyday. 
Renewable energy sources now account for 22% of the nation’s electricity, and solar has skyrocketed eight times over in the last decade. This spring in California, wind, water, and solar energy sources exceeded expectations, accounting for an average of 61.5 percent of the state's electricity demand across 52 days.
But green energy has a lithium problem. Lithium batteries control more than 90% of the global grid battery storage market. 
That’s not just cell phones, laptops, electric toothbrushes, and tools. Scooters, e-bikes, hybrids, and electric vehicles all rely on rechargeable lithium batteries to get going. 
Fortunately, this past week, Natron Energy launched its first-ever commercial-scale production of sodium-ion batteries in the U.S. 
“Sodium-ion batteries offer a unique alternative to lithium-ion, with higher power, faster recharge, longer lifecycle and a completely safe and stable chemistry,” said Colin Wessells — Natron Founder and Co-CEO — at the kick-off event in Michigan. 
The new sodium-ion batteries charge and discharge at rates 10 times faster than lithium-ion, with an estimated lifespan of 50,000 cycles.
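One way to put the cycle-life figure in perspective (the 50,000-cycle number is from the article; the unit capacity and the lithium-ion comparison point are illustrative assumptions):

```python
def lifetime_throughput_mwh(capacity_kwh, cycles):
    """Total energy delivered over a battery's cycle life, ignoring degradation
    and depth-of-discharge effects."""
    return capacity_kwh * cycles / 1000.0  # kWh -> MWh

# 100 kWh unit; 50,000 cycles from the article vs. an assumed ~5,000 cycles
# for a typical lithium-ion grid battery.
sodium_mwh = lifetime_throughput_mwh(100, 50_000)
lithium_mwh = lifetime_throughput_mwh(100, 5_000)
print(sodium_mwh, lithium_mwh)  # 5000.0 500.0
```

On those assumptions, a sodium-ion unit of the same size could move roughly ten times as much energy over its life before replacement.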
Wessells said that using sodium as a primary mineral alternative eliminates industry-wide issues of worker negligence, geopolitical disruption, and the “questionable environmental impacts” inextricably linked to lithium mining. 
“The electrification of our economy is dependent on the development and production of new, innovative energy storage solutions,” Wessells said. 
Why are sodium batteries a better alternative to lithium?
The birth and death cycle of lithium is shadowed in environmental destruction. The process of extracting lithium pollutes the water, air, and soil, and when it’s eventually discarded, the flammable batteries are prone to bursting into flames and burning out in landfills. 
There’s also a human cost. Lithium-ion materials like cobalt and nickel are not only harder to source and procure, but their supply chains are also overwhelmingly attributed to hazardous working conditions and child labor law violations. 
Sodium, on the other hand, is estimated to be 1,000 times more abundant in the earth’s crust than lithium. 
“Unlike lithium, sodium can be produced from an abundant material: salt,” engineer Casey Crownhart wrote ​​in the MIT Technology Review. “Because the raw ingredients are cheap and widely available, there’s potential for sodium-ion batteries to be significantly less expensive than their lithium-ion counterparts if more companies start making more of them.”
What will these batteries be used for?
Right now, Natron has its focus set on AI models and data storage centers, which consume hefty amounts of energy. In 2023, the MIT Technology Review reported that one AI model can emit more than 626,000 pounds of carbon dioxide equivalent.
“We expect our battery solutions will be used to power the explosive growth in data centers used for Artificial Intelligence,” said Wendell Brooks, co-CEO of Natron. 
“With the start of commercial-scale production here in Michigan, we are well-positioned to capitalize on the growing demand for efficient, safe, and reliable battery energy storage.”
The fast-charging energy alternative also has limitless potential on a consumer level, and Natron is eyeing telecommunications and EV fast-charging once it begins servicing AI data storage centers in June.
On a larger scale, sodium-ion batteries could radically change the manufacturing and production sectors — from housing energy to lower electricity costs in warehouses, to charging backup stations and powering electric vehicles, trucks, forklifts, and so on. 
“I founded Natron because we saw climate change as the defining problem of our time,” Wessells said. “We believe batteries have a role to play.”
-via GoodGoodGood, May 3, 2024
--
Note: I wanted to make sure this was legit (scientifically and in general), and I'm happy to report that it really is! x, x, x, x
electronalytics · 1 year
Parallel Generator Set Controllers Market Outlook, Demand, Overview Analysis, Trends, Key Growth, Opportunity by 2032
Market Overview:
The parallel generator set controllers market refers to the segment of the power generation industry that deals with the control and synchronization of multiple generator sets operating in parallel. These controllers play a crucial role in ensuring efficient and reliable power generation by monitoring and managing the synchronization, load sharing, and other operational parameters of parallel generator sets.
Key Factors Driving the Parallel Generator Set Controllers Market:
The demand for parallel generator set controllers is rising in response to the expanding requirement for dependable power supply in a variety of commercial and industrial settings and essential infrastructure. These controllers make it possible for several generators to synchronise and share loads seamlessly, resulting in a steady supply of electricity.
An increase in the use of renewable energy sources: Parallel generator set controllers are necessary to combine renewable energy sources with traditional generators to build hybrid power systems as solar and wind energy sources gain popularity. With this integration, renewable energy may be used more efficiently while still ensuring grid stability.
Expansion of Data Centres and Telecom Infrastructure: Robust backup power solutions are required due to the swift expansion of data centres and the telecommunications sector. Using parallel generator set controllers, generator sets can be synchronised and managed effectively for continuous operation.
Growth of the manufacturing sector and industrial process automation: As the manufacturing industry expands, there is a greater need for dependable and efficient power generation. The best power distribution is achieved with the use of parallel generator set controllers, which synchronise the generators to meet changing load needs.
Governments all across the world are enforcing strict laws to reduce emissions from the production of electricity. In order to comply with environmental requirements, parallel generator set controllers can help generator sets operate more efficiently, resulting in less fuel usage and lower emissions.
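Load sharing among paralleled sets is commonly proportional to each unit's rated capacity; a minimal sketch of that logic (illustrative only, not any vendor's control algorithm):

```python
def share_load(total_kw, capacities_kw):
    """Split a total load across paralleled generator sets in proportion
    to each set's rated capacity."""
    total_capacity = sum(capacities_kw)
    if total_kw > total_capacity:
        raise ValueError("load exceeds combined generator capacity")
    return [total_kw * c / total_capacity for c in capacities_kw]

# Three sets rated 500, 750, and 1000 kW carrying a 1350 kW load:
print(share_load(1350, [500, 750, 1000]))  # [300.0, 450.0, 600.0]
```

Each set ends up at the same fraction of its rating (here 60 percent), which is the usual goal of proportional load sharing; real controllers implement this continuously via droop or isochronous schemes.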
Trends:
Smart and Digital Solutions: The industry was moving towards more intelligent and digitally connected solutions. Advanced control systems with remote monitoring, predictive maintenance, and data analytics capabilities were gaining traction.
Renewable Energy Integration: With the increasing focus on sustainability, there was a trend towards integrating parallel generator sets with renewable energy sources such as solar and wind, creating hybrid power solutions.
Energy Storage Integration: Battery storage systems were being integrated with generator sets to provide seamless power during transient events and enhance load management.
Microgrid Development: Parallel generator set controllers were being used in microgrid applications, allowing localized power generation and distribution in remote or critical areas.
Load Flexibility: Controllers were becoming more adept at managing variable and dynamic loads, optimizing fuel consumption and power distribution based on real-time demand.
Remote Monitoring and Control: The ability to monitor and control parallel generator sets remotely through IoT technology was becoming increasingly important, enabling efficient operation and maintenance.
Here are some of the key benefits:
Enhanced Power Management
Increased Power Availability
Flexibility and Scalability
Load Sharing and Efficiency
Redundancy and Reliability
Remote Monitoring and Control
Compliance and Environmental Benefits
Safety and Protection Features
Cost-Effective Solution
Integration with Smart Grids
We recommend referring to our firm, Stringent Datalytics, as well as industry publications and websites that specialize in providing market reports. These sources often offer comprehensive analysis, market trends, growth forecasts, competitive landscape, and other valuable insights into this market.
By visiting our website or contacting us directly, you can explore the availability of specific reports related to this market. These reports often require a purchase or subscription, but we provide comprehensive and in-depth information that can be valuable for businesses, investors, and individuals interested in this market.
“Remember to look for recent reports to ensure you have the most current and relevant information.”
Click Here, To Get Free Sample Report: https://stringentdatalytics.com/sample-request/parallel-generator-set-controllers-market/10331/
Market Segmentations:
Global Parallel Generator Set Controllers Market: By Company
• CRE TECHNOLOGY
• SmartGen
• KOHLER
• S.I.C.E.S.
• Bruno generators
• VISA GROUP
• Alfred Kuhse GmbH
• DEIF Group
• Mechtric Group
Global Parallel Generator Set Controllers Market: By Type
• Automatic
• Manual
Global Parallel Generator Set Controllers Market: By Application
• Industrial
• Chemical
• Power Industry
Global Parallel Generator Set Controllers Market: Regional Analysis
North America: The North America region includes the U.S., Canada, and Mexico. The U.S. is the largest market for parallel generator set controllers in this region, followed by Canada and Mexico. The market growth in this region is primarily driven by the presence of key market players and the increasing demand for the product.
Europe: The Europe region includes Germany, France, U.K., Russia, Italy, Spain, Turkey, Netherlands, Switzerland, Belgium, and Rest of Europe. Germany is the largest market for parallel generator set controllers in this region, followed by the U.K. and France. The market growth in this region is driven by the increasing demand for the product in the automotive and aerospace sectors.
Asia-Pacific: The Asia-Pacific region includes Singapore, Malaysia, Australia, Thailand, Indonesia, Philippines, China, Japan, India, South Korea, and Rest of Asia-Pacific. China is the largest market for parallel generator set controllers in this region, followed by Japan and India. The market growth in this region is driven by the increasing adoption of the product in various end-use industries, such as automotive, aerospace, and construction.
Middle East and Africa: The Middle East and Africa region includes Saudi Arabia, U.A.E, South Africa, Egypt, Israel, and Rest of Middle East and Africa. The market growth in this region is driven by the increasing demand for the product in the aerospace and defense sectors.
South America: The South America region includes Argentina, Brazil, and Rest of South America. Brazil is the largest market for parallel generator set controllers in this region, followed by Argentina. The market growth in this region is primarily driven by the increasing demand for the product in the automotive sector.
Visit Report Page for More Details: https://stringentdatalytics.com/reports/parallel-generator-set-controllers-market/10331/  
Reasons to Purchase Parallel Generator Set Controllers Market Report:
• To gain insights into market trends and dynamics: These reports provide valuable insights into industry trends and dynamics, including market size, growth rates, and key drivers and challenges.
• To identify key players and competitors: These reports can help businesses identify key players and competitors in their industry, including their market share, strategies, and strengths and weaknesses.
• To understand consumer behavior: These reports can provide valuable insights into consumer behavior, including preferences, purchasing habits, and demographics.
• To evaluate market opportunities: These reports can help businesses evaluate market opportunities, including potential new products or services, new markets, and emerging trends.
About Us:
Stringent Datalytics offers both custom and syndicated market research reports. Custom market research reports are tailored to a specific client's needs and requirements. These reports provide unique insights into a particular industry or market segment and can help businesses make informed decisions about their strategies and operations.
Syndicated market research reports, on the other hand, are pre-existing reports that are available for purchase by multiple clients. These reports are often produced on a regular basis, such as annually or quarterly, and cover a broad range of industries and market segments. Syndicated reports provide clients with insights into industry trends, market sizes, and competitive landscapes. By offering both custom and syndicated reports, Stringent Datalytics can provide clients with a range of market research solutions that can be customized to their specific needs.
Contact Us:
Stringent Datalytics
Contact No -  +1 346 666 6655
Email Id -  [email protected]  
Web - https://stringentdatalytics.com/
marciodpaulla-blog · 1 year
A Greener Byte: The Dawn of Superconducting Computing
Once upon a time, in our not-so-distant future, our world became increasingly connected, increasingly digital. The machines we’ve built to store our data, solve complex equations, and connect us to each other are like the city that never sleeps—data centers, supercomputers, always humming, always working. Yet, every city has its cost. And for our digital city, the price is high, paid in the…
nasa · 1 year
Let's Explore a Metal-Rich Asteroid 🤘
Between Mars and Jupiter, there lies a unique, metal-rich asteroid named Psyche. Psyche’s special because it looks like it is part or all of the metallic interior of a planetesimal—an early planetary building block of our solar system. For the first time, we have the chance to visit a planetary core and possibly learn more about the turbulent history that created terrestrial planets.
Here are six things to know about the mission that’s a journey into the past: Psyche.
1. Psyche could help us learn more about the origins of our solar system.
After studying data from Earth-based radar and optical telescopes, scientists believe that Psyche collided with other large bodies in space and lost its outer rocky shell. This leads scientists to think that Psyche could have a metal-rich interior, which is a building block of a rocky planet. Since we can’t pierce the core of rocky planets like Mercury, Venus, Mars, and our home planet, Earth, Psyche offers us a window into how other planets are formed.
2. Psyche might be different than other objects in the solar system.
Rocks on Mars, Mercury, Venus, and Earth contain iron oxides. From afar, Psyche doesn’t seem to feature these chemical compounds, so it might have a different history of formation than other planets.
If the Psyche asteroid is leftover material from a planetary formation, scientists are excited to learn about the similarities and differences from other rocky planets. The asteroid might instead prove to be a never-before-seen solar system object. Either way, we’re prepared for the possibility of the unexpected!
3. Three science instruments and a gravity science investigation will be aboard the spacecraft.
The three instruments aboard will be a magnetometer, a gamma-ray and neutron spectrometer, and a multispectral imager. Here’s what each of them will do:
Magnetometer: Detect evidence of a magnetic field, which will tell us whether the asteroid formed from a planetary body
Gamma-ray and neutron spectrometer: Help us figure out what chemical elements Psyche is made of, and how it was formed
Multispectral imager: Gather and share information about the topography and mineral composition of Psyche
The gravity science investigation will allow scientists to determine the asteroid’s rotation, mass, and gravity field, and to gain insight into its interior, by analyzing the radio waves the spacecraft uses to communicate. Scientists can measure how Psyche affects the spacecraft’s orbit.
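The underlying relation for the mass measurement is Kepler's third law: track the spacecraft's orbit around the asteroid, and the central body's mass falls out. A hedged sketch (the orbit numbers below are made up for illustration, not mission values):

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def central_mass_kg(semi_major_axis_m, period_s):
    """Central-body mass from an orbiter's semi-major axis and period
    (Kepler's third law; the spacecraft's own mass is negligible)."""
    return 4 * math.pi**2 * semi_major_axis_m**3 / (G * period_s**2)

# Made-up orbit: 190 km semi-major axis, ~3.7-hour period. These inputs are
# chosen so the result lands near published estimates of Psyche's mass
# (~2.3e19 kg), but they are illustrative, not mission data.
print(f"{central_mass_kg(190e3, 13_300):.2e} kg")  # 2.29e+19 kg
```

In practice the gravity team works the other way around: precise Doppler tracking of the radio signal reveals tiny orbit perturbations, from which mass and the higher-order gravity field are fit.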
4. The Psyche spacecraft will use a super-efficient propulsion system.
Psyche’s solar electric propulsion system harnesses energy from large solar arrays that convert sunlight into electricity, creating thrust. For the first time ever, we will be using Hall-effect thrusters in deep space.
5. This mission runs on collaboration.
To make this mission happen, we work together with universities, industry, and NASA centers to draw on resources and expertise.
NASA’s Jet Propulsion Laboratory manages the mission and is responsible for system engineering, integration, and mission operations, while NASA’s Kennedy Space Center’s Launch Services Program manages launch operations and procured the SpaceX Falcon Heavy rocket.
Working with Arizona State University (ASU) offers opportunities for students to train as future instrument or mission leads. Mission leader and Principal Investigator Lindy Elkins-Tanton is also based at ASU.
Finally, Maxar Technologies is a key commercial participant and delivered the main body of the spacecraft, as well as most of its engineering hardware systems.
6. You can be a part of the journey.
Everyone can find activities to get involved on the mission’s webpage. There's an annual internship to interpret the mission, capstone courses for undergraduate projects, and age-appropriate lessons, craft projects, and videos.
You can join us for a virtual launch experience, and, of course, you can watch the launch with us on Oct. 12, 2023, at 10:16 a.m. EDT!
For official news on the mission, follow us on social media and check out NASA’s and ASU’s Psyche websites.
Make sure to follow us on Tumblr for your regular dose of space!
Consider the ways oil and gas are already entwined with big tech. The foundation of the partnership between Big Tech and Big Oil is the cloud, explains Zero Cool, a software expert who went to Kazakhstan to do work for Chevron and chronicled this in Logic magazine. “For Amazon, Google, and Microsoft, as well as a few smaller cloud competitors like Oracle and IBM, winning the IT spend of the Fortune 500 is where most of the money in the public cloud market will be made”—and out of the biggest ten companies in the world by revenue, six are in the business of oil production. What are oil companies going to do with the cloud? Apparently, Chevron—which signed a seven-year cloud contract with Microsoft—generates a terabyte of data per day per sensor and has thousands of wells with these sensors. They can’t even use all that data because of the scale of computation required. “Big Tech doesn’t just supply the infrastructures that enable oil companies to crunch their data,” explains Zero Cool; they also provide analytic tools, and machine learning can help discover patterns to run their operations more efficiently. This is another reason why Big Oils need Big Tech; they have the edge when it comes to artificial intelligence/machine learning. “Why go through the effort of using clean energy to power your data centers when those same data centers are being used by companies like Chevron to produce more oil?” Zero Cool asks, also noting that one of the main reasons oil companies are interested in technology is to surveil workers.
Holly Jean Buck, Ending Fossil Fuels: Why Net Zero is Not Enough
radiantinfosolutions · 2 months
Optimizing Data Center Design for Enhanced Efficiency in India
Discover how optimizing data center design can enhance operational efficiency and reduce energy consumption in India. Learn about key design considerations and benefits.
jcmarchi · 3 months
Intel Unveils Groundbreaking Optical Compute Interconnect Chiplet, Revolutionizing AI Data Transmission
New Post has been published on https://thedigitalinsider.com/intel-unveils-groundbreaking-optical-compute-interconnect-chiplet-revolutionizing-ai-data-transmission/
Intel Corporation has reached a revolutionary milestone in integrated photonics technology: the integration of photonic devices, such as lasers, modulators, and detectors, onto a single microchip using semiconductor fabrication techniques similar to those used for electronic integrated circuits. This technology allows light signals to be manipulated and transmitted on a micro scale, offering significant advantages in speed, bandwidth, and energy efficiency over traditional electronic circuits.
Today, Intel introduced the first fully integrated optical compute interconnect (OCI) chiplet co-packaged with an Intel CPU at the Optical Fiber Communication Conference (OFC) 2024. Designed for high-speed data transmission, the OCI chiplet marks a significant advance in high-bandwidth interconnects, aimed at enhancing AI infrastructure in data centers and high-performance computing (HPC) applications.
Key Features and Capabilities:
High Bandwidth and Low Power Consumption:
Supports 64 channels of 32 Gbps data transmission in each direction.
Achieves up to 4 terabits per second (Tbps) bidirectional data transfer.
Energy-efficient, consuming only 5 pico-Joules (pJ) per bit compared to pluggable optical transceiver modules at 15 pJ/bit.
Extended Reach and Scalability:
Capable of transmitting data up to 100 meters using fiber optics.
Supports future scalability for CPU/GPU cluster connectivity and new compute architectures, including coherent memory expansion and resource disaggregation.
Enhanced AI Infrastructure:
Addresses the growing demands of AI infrastructure for higher bandwidth, lower power consumption, and longer reach.
Facilitates the scalability of AI platforms, supporting larger processing unit clusters and more efficient resource utilization.
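A back-of-the-envelope check of the bandwidth and power figures above (my own arithmetic, not Intel’s published methodology):

```python
GBPS = 1e9  # bits per second

channels = 64
rate_per_channel = 32 * GBPS
per_direction = channels * rate_per_channel  # 2.048 Tbps each way
bidirectional = 2 * per_direction            # ~4 Tbps, the quoted total

def link_power_watts(pj_per_bit, bits_per_second):
    """Power = energy per bit times bit rate (1 pJ = 1e-12 J)."""
    return pj_per_bit * 1e-12 * bits_per_second

oci_watts = link_power_watts(5, bidirectional)         # ~20.5 W at 5 pJ/bit
pluggable_watts = link_power_watts(15, bidirectional)  # ~61.4 W at 15 pJ/bit
```

At the same traffic level, the quoted 5 pJ/bit works out to roughly a third of the power that 15 pJ/bit pluggable transceivers would draw.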
Technical Advancements:
Integrated Silicon Photonics Technology: Combines a silicon photonics integrated circuit (PIC) with an electrical IC, featuring on-chip lasers and optical amplifiers.
High Data Transmission Quality: Demonstrated with a transmitter (Tx) and receiver (Rx) connection over a single-mode fiber (SMF) patch cord, showcasing a 32 Gbps Tx eye diagram with strong signal quality.
Dense Wavelength Division Multiplexing (DWDM): Utilizes eight fiber pairs, each carrying eight DWDM wavelengths, for efficient data transfer.
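The DWDM layout is consistent with the headline channel count: eight fiber pairs times eight wavelengths yields the same 64 channels of 32 Gbps. A quick check, assuming one 32 Gbps channel per wavelength:

```python
fiber_pairs = 8
wavelengths_per_fiber = 8  # DWDM wavelengths carried on each fiber
gbps_per_wavelength = 32

total_channels = fiber_pairs * wavelengths_per_fiber       # 64 channels
per_direction_gbps = total_channels * gbps_per_wavelength  # 2048 Gbps per direction
```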
Impact on AI and Data Centers:
Boosts ML Workload Acceleration: Enables significant performance improvements and energy savings in AI/ML infrastructure.
Addresses Electrical I/O Limitations: Provides a superior alternative to electrical I/O, which is limited in reach and bandwidth density.
Supports Emerging AI Workloads: Essential for the deployment of larger and more efficient machine learning models.
Future Prospects:
Prototype Stage: Intel is currently working with select customers to co-package OCI with their system-on-chips (SoCs) as an optical I/O solution.
Continued Innovation: Intel is developing next-generation 200G/lane PICs for emerging 800 Gbps and 1.6 Tbps applications, along with advancements in on-chip laser and SOA performance.
Intel’s Leadership in Silicon Photonics:
Proven Reliability and Volume Production: Over 8 million PICs shipped, with over 32 million integrated on-chip lasers, showcasing industry-leading reliability.
Advanced Integration Techniques: Hybrid laser-on-wafer technology and direct integration provide superior performance and efficiency.
Intel’s OCI chiplet represents a significant leap forward in high-speed data transmission, poised to revolutionize AI infrastructure and connectivity.
radiantindia · 3 months
Cisco Catalyst 3000 Series: High-Performance Switching Solutions
Learn about Cisco Catalyst 3000 series switches, their advanced features, pricing insights, deployment scenarios, and where to buy in India through Radiant Info Solutions.
wachinyeya · 4 months
Indian Engineers Tackle Water Shortages with Star Wars Tech in Kerala https://www.goodnewsnetwork.org/indian-engineers-tackle-water-shortages-with-star-wars-tech-in-kerala/
When a severe water shortage hit the Indian city of Kozhikode in the state of Kerala, a group of engineers turned to science fiction to keep the taps running.
Like everyone else in the city, engineering student Swapnil Shrivastav received a ration of two buckets of water a day collected from India’s arsenal of small water towers.
It was a ‘watershed’ moment for Shrivastav who, according to the BBC, had won a student competition on tackling water scarcity four years earlier. Armed with a hypothetical template from the original Star Wars films, he and two partners set to work harvesting water from the humid air.
“One element of inspiration was from Star Wars where there’s an air-to-water device. I thought why don’t we give it a try? It was more of a curiosity project,” he told the BBC.
According to ‘Wookieepedia,’ a ‘moisture vaporator’ is a device used on moisture farms to capture water from a dry planet’s atmosphere, like Tatooine, where protagonist Luke Skywalker grew up.
This fictional device functions according to Star Wars lore by coaxing moisture from the air by means of refrigerated condensers, which generate low-energy ionization fields. Captured water is then pumped or gravity-directed into a storage cistern that adjusts its pH levels. Vaporators are capable of collecting 1.5 liters of water per day.
If science fiction authors could come up with the particulars of such a device, Shrivastav must have felt his had a good chance of succeeding. He and colleagues Govinda Balaji and Venkatesh Raja founded Uravu Labs, a Bangalore-based startup in 2019.
Their initial offering is a machine that converts air to water using a liquid desiccant. The desiccant absorbs moisture from the air; sunlight or renewable energy then heats it to around 100°F, releasing the captured moisture into a chamber where it is condensed into drinking water.
The whole process takes 12 hours but can produce a staggering 2,000 liters, or about 500 gallons of drinking-quality water per day. Uravu has since had to adjust course due to the cost of manufacturing and running the machines—it’s just too high for civic use with current materials technology.
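A quick sanity check of the quoted throughput, assuming US gallons:

```python
LITERS_PER_US_GALLON = 3.785411784

daily_output_liters = 2000.0
daily_output_gallons = daily_output_liters / LITERS_PER_US_GALLON  # ~528 gallons
average_rate_lph = daily_output_liters / 24.0                      # ~83 liters/hour
```

So 2,000 liters is actually a bit more than the article’s rounded “about 500 gallons.”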
“We had to shift to commercial consumption applications as they were ready to pay us and it’s a sustainability driver for them,” Shrivastav explained. This pivot has so far been enough to keep the start-up afloat, and they produce water for 40 different hospitality clients.
Looking ahead, Shrivastav, Raja, and Balaji are planning to investigate whether the desiccant can be made more efficient; can it work at a lower temperature to reduce running costs, or is there another material altogether that might prove more cost-effective?
They’re also looking at running their device attached to data centers in a pilot project that would see them utilize the waste heat coming off the centers to heat the desiccant.