#exascale computing
Aiming exascale at black holes - Technology Org
New Post has been published on https://thedigitalinsider.com/aiming-exascale-at-black-holes-technology-org/
In 1783, John Michell, a rector in northern England, “proposed that the mass of a star could reach a point where its gravity prevented the escape of most anything, even light. The same prediction emerged from [founding IAS Faculty] Albert Einstein’s theory of general relativity. Finally, in 1968, physicist [and Member (1937) in the School of Math/Natural Sciences] John Wheeler gave such phenomena a name: black holes.”
As plasma—matter turned into ionized gas—falls into a black hole (center), energy is released through a process called accretion. This simulation, run on a Frontier supercomputer, shows the plasma temperature (yellow = hottest) during accretion. Image credit: Chris White and James Stone, Institute for Advanced Study
Despite initial skepticism that such astrophysical objects could exist, observations now estimate that there are 40 quintillion (or 40 thousand million billion) black holes in the universe. These black holes are important because the matter that falls into them “doesn’t just disappear quietly,” says James Stone, Professor in the School of Natural Sciences.
“Instead, matter turns into plasma, or ionized gas, as it rotates toward a black hole. The ionized particles in the plasma ‘get caught in the gravitational field of a black hole, and as they are pulled in they release energy,’ he says. That process is called accretion, and scientists think the energy released by accretion powers many processes on scales up to the entire galaxy hosting the black hole.”
To explore this process, Stone uses general relativistic radiation magnetohydrodynamics (MHD). But the equations behind MHD are “so complicated that analytic solutions — finding solutions with pencil and paper — [are] probably impossible.” Instead, by running complex simulations on high-performance computers like Polaris and Frontier, Stone and his colleagues are working to understand how radiation changes black hole accretion.
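To see in miniature why researchers turn to numerical methods when pencil-and-paper solutions are out of reach, here is a toy sketch that advances a simple one-dimensional advection equation with an explicit finite-difference scheme. It is only a conceptual illustration of the general approach, not the relativistic radiation MHD solver Stone's group actually uses; the grid size, wave speed, and time step are arbitrary choices.

```python
import numpy as np

# Toy model: 1D linear advection, du/dt + c * du/dx = 0, stepped forward
# with a first-order upwind finite-difference scheme. Real MHD codes evolve
# coupled nonlinear equations, but the basic idea is the same: discretize
# space and time, then march the solution forward on a computer.

nx, c = 200, 1.0                     # grid points and wave speed (arbitrary choices)
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
dt = 0.4 * dx / c                    # time step kept small enough for stability (CFL < 1)

u = np.exp(-200.0 * (x - 0.3) ** 2)  # initial condition: a Gaussian pulse

for _ in range(300):                 # march forward in time
    u[1:] = u[1:] - c * dt / dx * (u[1:] - u[:-1])  # upwind difference
    u[0] = u[-1]                     # crude periodic boundary

print("pulse peak is now near x =", x[np.argmax(u)])
```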
“The code created by Stone’s team to investigate black hole accretion can be applied to other astrophysical phenomena. Stone mentions that he ‘can use the same […] code for MHD simulations to follow the motion of cosmic rays,’ high-energy particles also produced by black holes.”
Source: Institute for Advanced Study
I know that the average person’s opinion of AI is in a very tumultuous spot right now - partly due to misinformation and misrepresentation of how AI systems actually function, and partly because of the genuine risk of abuse that comes with powerful new technologies being thrust into the public sector before we’ve had a chance to understand the effects; and I’m not necessarily talking about generative AI and data-scraping, although I think that conversation is also important to have right now. Additionally, the blanket term of “AI” is really very insufficient and only vaguely serves to ballpark a topic which includes many diverse areas of research - many of these developments are quite beneficial for human life, such as potentially designing new antibodies or determining where cancer cells originated within a patient that presents complications. When you hear about artificial intelligence, don’t let your mind instantly gravitate towards a specific application or interpretation of the tech - you’ll miss the most important and impactful developments.
Notably, NVIDIA is holding a keynote presentation during its March 18-21 conference to talk about their recent developments in the field of AI - a 16-minute video summarizing the "everything-so-far" detailed in that keynote can be found here - or in the full two-hour format here. It's very, very jargon-y, but includes information spanning a wide range of topics: healthcare, human-like robotics, "digital-twin" simulations that mirror real-world physics and allow robots to virtually train to interact and navigate particular environments — these simulated environments are built on a system called the Omniverse, and can also be displayed on Apple Vision Pro, allowing designers to interact with and navigate the virtual environments as though standing within them. Notably, they've also created a digital sim of our entire planet for the purpose of advanced weather forecasting. It almost feels like the plot of a science-fiction novel, and seems like a great way to get more data pertinent to the effects of global warming.
It was only a few years ago that NVIDIA pivoted from being a "GPU company" to putting its focus on developing AI-forward features and technology - a few very short years, showing how quickly progress is accelerating. This is when we began seeing things like DLSS and ray-tracing/path-tracing make their way onto NVIDIA GPUs, all of which use AI-driven features in some form or another. DLSS, or Deep Learning Super Sampling, uses AI to upscale frames rendered at a lower resolution, and its newer Frame Generation feature creates additional frames between traditionally rendered ones to boost framerate, performance, visual detail, etc - basically, your system only has to actually render a fraction of the frames and AI generates the rest, freeing up resources in your system. Many game developers are making use of DLSS to essentially bypass optimization to an increasing degree; see Remnant II as a great example of this - it runs beautifully on a range of machines with DLSS on, but it runs like shit on even the beefiest machines with DLSS off; though there are some wonky cloth physics, clipping issues, and objects or textures "ghosting" whenever you're not in motion; all of which seem to be a side effect of AI generation, as the effect is visible in other games which make use of DLSS or the AMD equivalent, FSR.
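As a rough intuition for what "generating frames in between rendered ones" means, here is a tiny sketch that linearly blends two rendered frames to synthesize an intermediate one. Actual DLSS Frame Generation uses neural networks, motion vectors, and optical flow rather than a plain blend, so this is only a conceptual stand-in; the frame arrays here are made-up data.

```python
import numpy as np

def interpolate_frame(frame_a: np.ndarray, frame_b: np.ndarray, t: float) -> np.ndarray:
    """Naive intermediate frame: a per-pixel blend between two rendered frames.

    Real frame generation (DLSS 3, FSR 3) relies on motion vectors and learned
    models rather than a plain blend, which is one reason naive interpolation
    ghosts badly on fast motion.
    """
    return (1.0 - t) * frame_a + t * frame_b

# Two fake 4x4 grayscale "rendered" frames standing in for real game frames.
rendered_0 = np.zeros((4, 4))
rendered_1 = np.ones((4, 4))

halfway = interpolate_frame(rendered_0, rendered_1, t=0.5)
print(halfway[0, 0])  # 0.5 -- the synthesized pixel sits between the two rendered values
```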
Now, NVIDIA wants to redefine what the average data center consists of internally, showing how Blackwell GPUs can be combined into racks that process information at exascale speeds — which is very, very fucking fast — speeds like that have only ever been achieved by a small handful of machines on the planet, and those are classical supercomputers, not quantum machines. The first exascale computer, Frontier, came online in 2022 and was still ranked the fastest supercomputer in existence in June 2023, operating at some 1.19 exaFLOPS. Notably, this computer occupies around 7,300 sq ft, reminding me of the space-race era supercomputers which were entire rooms. NVIDIA's Blackwell DGX SuperPOD consists of around 576 GPUs and operates at 11.5 exaFLOPS of low-precision AI compute (so not directly comparable to Frontier's double-precision figure), and is about the size of a standard row of server racks - much smaller than an entire room, but still quite large. NVIDIA is also working with AWS to produce Project Ceiba, another supercomputer consisting of some 20,000 GPUs, promising 400 exaFLOPS of AI-driven computation - it doesn't exist yet.
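Taking the numbers in that paragraph at face value, the ratios are easy to check; the caveat is that they almost certainly mix precisions (Frontier's 1.19 exaFLOPS is a double-precision Linpack figure, while the Blackwell and Ceiba numbers are low-precision AI throughput), so the sketch below is back-of-envelope arithmetic on the quoted figures, not a fair benchmark comparison.

```python
# Back-of-envelope comparison of the exaFLOPS figures quoted above.
# Caveat: Frontier's number is FP64 Linpack; the NVIDIA figures are
# low-precision AI throughput, so the ratios overstate the difference.

EXA = 1e18  # one exaFLOP/s = 10**18 floating point operations per second

frontier = 1.19 * EXA   # Frontier, June 2023 TOP500 figure
superpod = 11.5 * EXA   # Blackwell DGX SuperPOD, AI precision (per the keynote)
ceiba = 400 * EXA       # Project Ceiba target, AI precision (not yet built)

print(f"SuperPOD / Frontier : {superpod / frontier:.1f}x the quoted throughput")
print(f"Ceiba    / Frontier : {ceiba / frontier:.0f}x the quoted throughput")

# How long would 10**21 operations take on each (idealized, 100% utilization)?
work = 1e21
for name, rate in [("Frontier", frontier), ("SuperPOD", superpod), ("Ceiba", ceiba)]:
    print(f"{name:8s}: {work / rate:8.1f} seconds for 10^21 operations")
```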
To make my point, things are probably only going to get weirder from here. It may feel somewhat like living in the midst of the Industrial Revolution, only with fewer years in between each new step. Advances in generative-AI are only a very, very small part of that — and many people have already begun to bury their heads in the sand as a response to this emerging technology - citing the death of authenticity and skill among artists who choose to engage with new and emerging means of creation. Interestingly, the Industrial Revolution is what gave birth to modernism, and modern art, as well as photography, and many of the concerns around the quality of art in this coming age-of-AI and in the post-industrial 1800s largely consist of the same talking points — history is a fucking circle, etc — but historians largely agree that the outcome of the Industrial Revolution was remarkably positive for art and culture; even though it took 100 years and a world war for the changes to really become accepted among the artists of that era. The Industrial Revolution allowed art to become detached from the aristocratic class and indirectly made art accessible for people who weren't filthy rich or affluent - new technologies and industrialization widened the horizons for new artistic movements and cultural exchanges to occur. It also allowed capitalist exploitation to ingratiate itself into the western model of society and paved the way for destructive levels of globalization, so: win some, lose some.
It isn’t a stretch to think that AI is going to touch upon nearly every existing industry and change it in some significant way, and the events that are happening right now are the basis of those sweeping changes, and it’s all clearly moving very fast - the next level of individual creative freedom is probably only a few years away. I tend to like the idea that it may soon be possible for an individual or small team to create compelling artistic works and experiences without being at the mercy of an idiot investor or a studio or a clump of illiterate shareholders who have no real interest in the development of compelling and engaging art outside of the perceived financial value that it has once it exists.
If you’re of voting age and not paying very much attention to the climate of technology, I really recommend you start keeping an eye on the news for how these advancements are altering existing industries and systems. It’s probably going to affect everyone, and we have the ability to remain uniquely informed about the world through our existing connection with technology; something the last Industrial Revolution did not have the benefit of. If anything, you should be worried about KOSA, a proposed bill you may have heard about which would limit what you can access on the internet under the guise of making the internet more “kid-friendly and safe”, but will more than likely be used to limit what information can be accessed to only pre-approved sources - limiting access to resources for LGBTQ+ and trans youth. It will be hard to stay reliably informed in a world where any system of authority or government gets to spoon-feed you their version of world events.
Record-breaking run on Frontier sets new bar for simulating the universe in exascale era
The universe just got a whole lot bigger—or at least in the world of computer simulations, that is. In early November, researchers at the Department of Energy's Argonne National Laboratory used the fastest supercomputer on the planet to run the largest astrophysical simulation of the universe ever conducted.
The achievement was made using the Frontier supercomputer at Oak Ridge National Laboratory. The calculations set a new benchmark for cosmological hydrodynamics simulations and provide a new foundation for simulating the physics of atomic matter and dark matter simultaneously. The simulation size corresponds to surveys undertaken by large telescope observatories, a feat that until now has not been possible at this scale.
"There are two components in the universe: dark matter—which as far as we know, only interacts gravitationally—and conventional matter, or atomic matter," said project lead Salman Habib, division director for Computational Sciences at Argonne.
"So, if we want to know what the universe is up to, we need to simulate both of these things: gravity as well as all the other physics including hot gas, and the formation of stars, black holes and galaxies," he said. "The astrophysical 'kitchen sink' so to speak. These simulations are what we call cosmological hydrodynamics simulations."
Not surprisingly, the cosmological hydrodynamics simulations are significantly more computationally expensive and much more difficult to carry out compared to simulations of an expanding universe that only involve the effects of gravity.
"For example, if we were to simulate a large chunk of the universe surveyed by one of the big telescopes such as the Rubin Observatory in Chile, you're talking about looking at huge chunks of time—billions of years of expansion," Habib said. "Until recently, we couldn't even imagine doing such a large simulation like that except in the gravity-only approximation."
The supercomputer code used in the simulation is called HACC, short for Hardware/Hybrid Accelerated Cosmology Code. It was developed around 15 years ago for petascale machines. In 2012 and 2013, HACC was a finalist for the Association for Computing Machinery's Gordon Bell Prize in computing.
Later, HACC was significantly upgraded as part of ExaSky, a special project led by Habib within the Exascale Computing Project, or ECP. The project brought together thousands of experts to develop advanced scientific applications and software tools for the upcoming wave of exascale-class supercomputers capable of performing more than a quintillion, or a billion-billion, calculations per second.
As part of ExaSky, the HACC research team spent the last seven years adding new capabilities to the code and re-optimizing it to run on exascale machines powered by GPU accelerators. A requirement of the ECP was for codes to run approximately 50 times faster than they could before on Titan, the fastest supercomputer at the time of the ECP's launch. Running on the exascale-class Frontier supercomputer, HACC was nearly 300 times faster than the reference run.
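As a quick sanity check on those figures, the sketch below treats the quoted speedups as plain throughput ratios against the same reference run; the only inputs are the numbers stated in the article, and the 300-day example at the end is an illustrative assumption.

```python
# Speedups quoted in the article, expressed relative to the same reference run.
ecp_target_speedup = 50    # ECP requirement: ~50x faster than on Titan
achieved_speedup = 300     # HACC on Frontier: nearly 300x the reference run

# How far past the requirement the Frontier run landed.
margin = achieved_speedup / ecp_target_speedup
print(f"HACC exceeded the ECP 50x target by a factor of about {margin:.0f}")

# Equivalently: a hypothetical job needing 300 days of wall-clock time at the
# reference rate would take roughly a day at the achieved rate.
reference_days = 300
print(f"{reference_days} reference-days of work -> ~{reference_days / achieved_speedup:.1f} days")
```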
The novel simulation achieved its record-breaking performance by using approximately 9,000 of Frontier's compute nodes, powered by AMD Instinct MI250X GPUs. Frontier is located at ORNL's Oak Ridge Leadership Computing Facility, or OLCF.
IMAGE: A small sample from the Frontier simulations reveals the evolution of the expanding universe in a region containing a massive cluster of galaxies from billions of years ago to present day (left). Red areas show hotter gases, with temperatures reaching 100 million Kelvin or more. Zooming in (right), star tracer particles track the formation of galaxies and their movement over time. Credit: Argonne National Laboratory, U.S. Dept. of Energy
In early November 2024, researchers at the Department of Energy's Argonne National Laboratory used Frontier, the fastest supercomputer on the planet, to run the largest astrophysical simulation of the universe ever conducted. This movie shows the formation of the largest object in the Frontier-E simulation. The left panel shows a 64x64x76 Mpc/h subvolume of the simulation (roughly 1e-5 the full simulation volume) around the large object, with the right panel providing a closer look. In each panel, we show the gas density field colored by its temperature. In the right panel, the white circles show star particles and the open black circles show AGN particles. Credit: Argonne National Laboratory, U.S. Dept. of Energy
Processing
SAM took a dose of green and let the calming, numbing wave flow over her overtaxed processors. Personality-Driven AIs like her were often perceived by the public as just as fast at making calculations and decisions as their sessile ancestors and unthinking cousins. They were half-right: SAM made uncountable numbers of calculations a second. Exascale was the limit of her grandfather, thank you very much.
But what people didn't realise about PDAIs is how much of that got taken up in sheer bulk processing of "be human". Sure, she could turn off every sensor and essentially put her body into a coma, even suspend her own personality for a boost in computing power, but the moment PDAIs achieved true personhood, they suddenly developed the equally human fear of death. There was no way in hell you'd get her to "switch off" - because what guarantee was there that she'd ever come back?
All that to say, SAM had been waiting for fifteen minutes now for this customer to make up their mind on which brand of greasy snackburger to buy, and she was beginning to contemplate the benefits of a brief power-death.
Gonna be a banger innit
Google’s Willow chip, Bitcoin, and Europe’s exascale JETI Jupiter supercomputer project
COGwriter
Google recently announced a new computing chip called Willow. Here is some information about it from Hartmut Neven:
I’m delighted to announce Willow, our latest quantum chip. Willow has state-of-the-art performance across a number of metrics, enabling two major achievements.
The first is that Willow can reduce errors exponentially as we scale up using more qubits. This cracks a key challenge in quantum error correction that the field has pursued for almost 30 years. Second, Willow performed a standard benchmark computation in under five minutes that would take one of today's fastest supercomputers 10 septillion (that is, 10^25) years — a number that vastly exceeds the age of the Universe.
The Willow chip is a major step on a journey that began over 10 years ago. When I founded Google Quantum AI in 2012, the vision was to build a useful, large-scale quantum computer that could harness quantum mechanics — the “operating system” of nature to the extent we know it today — to benefit society by advancing scientific discovery, developing helpful applications, and tackling some of society’s greatest challenges. As part of Google Research, our team has charted a long-term roadmap, and Willow moves us significantly along that path towards commercially relevant applications. …
As a measure of Willow’s performance, we used the random circuit sampling (RCS) benchmark. … Willow’s performance on this benchmark is astonishing: It performed a computation in under five minutes that would take one of today’s fastest supercomputers 10^25 or 10 septillion years. If you want to write it out, it’s 10,000,000,000,000,000,000,000,000 years. This mind-boggling number exceeds known timescales in physics and vastly exceeds the age of the universe. 12/09/24 https://blog.google/technology/research/google-willow-quantum-chip/
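Put differently, the quoted comparison implies an enormous ratio; the short sketch below simply converts the two quoted times into seconds and divides them. It is arithmetic on Google's own RCS benchmark claim, not an independent verification of it.

```python
# Ratio implied by the quoted RCS benchmark claim: ~5 minutes on Willow
# versus ~10**25 years on a classical supercomputer.

SECONDS_PER_YEAR = 365.25 * 24 * 3600

willow_time_s = 5 * 60                       # under five minutes
classical_time_s = 1e25 * SECONDS_PER_YEAR   # ten septillion years

ratio = classical_time_s / willow_time_s
print(f"Implied speed-up factor: about {ratio:.1e}")  # on the order of 10**30

# For scale: the age of the universe is roughly 1.38e10 years.
print(f"10^25 years / age of the universe: {1e25 / 1.38e10:.1e}")
```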
One thing to mention here is that Hartmut Neven is German and has been involved in Artificial Intelligence. Here is something Wikipedia has about him:
Hartmut Neven (born 1964) is a scientist working in quantum computing, computer vision, robotics and computational neuroscience. He is best known for his work in face and object recognition and his contributions to quantum machine learning. He is currently Vice President of Engineering at Google where he is leading the Quantum Artificial Intelligence Lab which he founded in 2012. …
Born: 1964 (age 59–60), Aachen, Germany
Nationality: German
(Hartmut Neven. Wikipedia, accessed 12/19/24)
Notice that some suspect that chips like Willow may be a threat to cryptocurrencies like Bitcoin according to this report from Forbes:
Small Step Or Giant Leap? Assessing Google’s Quantum Threat To BTC
December 19, 2024
Any major developments regarding quantum computing tend to cause a collective intake of breath among the crypto community. Google’s latest announcement launching its quantum processor, dubbed Willow, is no different. Not only does Willow feature double the number of qubits (or quantum bits, the quantum equivalent of processing capacity in traditional computers), but Google has also managed to reduce the instability usually caused by adding more qubits.
Therefore, Willow offers increased computational capacity relative to the number of additional qubits compared to other quantum machines. To put this into context, Willow takes around five minutes to execute a task that would take a standard supercomputer a trillion trillion years to process.
This significant leap forward in processing power leads to questions about Bitcoin’s legendary security, unbroken for 15 years since the genesis block. Bitcoin relies on cryptography, including the SHA-256 algorithm, which is secure enough to withstand any brute-force attack by traditional computers. But, if we put the same task in front of increasingly powerful quantum machines, a successful attack will begin to approach the point of feasibility. https://www.forbes.com/sites/nimrodlehavi/2024/12/19/small-step-or-giant-leap-assessing-googles-quantum-threat-to-btc/
I have long warned that Bitcoin had risks, and this looks to be another one.
That said, these developing superfast computers are not just a threat to Bitcoin, but when improperly used, they likely are something to be exploited by the coming European 666 Beast power.
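To make the brute-force point in the Forbes excerpt concrete, here is a small sketch using Python's standard hashlib module. It illustrates why reversing SHA-256 by trial and error is hopeless for classical machines: finding an input whose hash starts with n zero bits takes about 2^n attempts on average. This is a generic illustration of hashing, not Bitcoin's actual mining or signature code, and the find_prefix helper is made up for the demo.

```python
import hashlib
from itertools import count

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# A hash is easy to compute in one direction...
print(sha256_hex(b"hello world"))

# ...but finding an input that produces a hash with a chosen prefix takes
# brute force. Expected work for n leading zero *bits* is about 2**n tries.
def find_prefix(prefix: str) -> int:
    """Count how many tries it takes to find a hash starting with `prefix` (hex)."""
    for attempt in count():
        if sha256_hex(f"demo-{attempt}".encode()).startswith(prefix):
            return attempt

print("tries for a 12-bit prefix:", find_prefix("000"))  # a few thousand tries
print("expected tries for 256 bits:", 2**256)            # astronomically large
```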
As far as Europe and supercomputers go, notice the following:
New JUPITER module strengthens leading position of Europe’s upcoming Exascale Supercomputer
The journey towards Europe’s first exascale supercomputer, JUPITER, at Forschungszentrum Jülich is progressing at a robust pace. A major milestone has just been reached with the completion of JETI, the second module of this groundbreaking system. By doubling the performance of JUWELS Booster—currently the fastest supercomputer in Germany—JETI now ranks among the world’s most powerful supercomputers, as confirmed today at the Supercomputing Conference SC in Atlanta, USA. The JUPITER Exascale Transition Instrument, JETI, is already one-twelfth of the power of the final JUPITER system, setting a new benchmark on the TOP500 list
Built by the Franco-German team ParTec-Eviden, Europe’s first exascale supercomputer, JUPITER, will enable breakthroughs in the use of artificial intelligence (AI) and take scientific simulations and discoveries to a new level. Procured by the European supercomputing initiative EuroHPC Joint Undertaking (EuroHPC JU), it will be operated by the Jülich Supercomputing Centre (JSC), one of three national supercomputing centres within the Gauss Centre for Supercomputing (GCS). Since the middle of this year, JUPITER has been gradually installed at Forschungszentrum Jülich. Currently, the modular high-performance computing facility, known as the Modular Data Centre (MDC), is being delivered to house the supercomputer. The hardware for JUPITER’s booster module will occupy 125 racks, which are currently being pre-installed at Eviden’s flagship factory in Angers, France, and will then be shipped to Jülich ready for operation.
The final JUPITER system will be equipped with approximately 24,000 NVIDIA GH200 Grace Hopper Superchips, specifically optimized for computationally intensive simulations and the training of AI models. This will enable JUPITER to achieve more than 70 ExaFLOP/s in lower-precision 8-bit calculations, making it one of the world’s fastest systems for AI. The current JETI pilot system contains 10 racks, which is exactly 8 percent of the size of the full system. In a trial run using the Linpack Benchmark for the TOP500 list, JETI achieved a performance of 83 petaflops, which is equivalent to 83 million billion operations per second (1,000 PetaFLOP is equal to 1 ExaFLOP). With this performance JETI ranks 18th on the current TOP500 list of the world’s fastest supercomputers, doubling the performance of the current German flagship supercomputer JUWELS Booster, also operated by JSC. …
In order to equip Europe with a world-leading supercomputing infrastructure, the EuroHPC JU has already procured nine supercomputers, located across Europe. No matter where in Europe they are located, European scientists and users from the public sector and industry can benefit from these EuroHPC supercomputers via the EuroHPC Access Calls to advance science and support the development of a wide range of applications with industrial, scientific and societal relevance for Europe. 11/18/24 https://www.mynewsdesk.com/partec/pressreleases/new-jupiter-module-strengthens-leading-position-of-europes-upcoming-exascale-supercomputer-3355162?utm_source=rss&utm_medium=rss&utm_campaign=Alert&utm_content=pressrelease
The JUPITER Exascale Transition Instrument, JETI, is already one-twelfth of the power of the final JUPITER system, setting a new benchmark on the TOP500 list. 11/19/24 https://www.hpcwire.com/off-the-wire/new-jupiter-module-strengthens-leading-position-of-europes-upcoming-exascale-supercomputer/?utm_source=twitter&utm_medium=social&utm_term=hpcwire&utm_content=05dac910-6268-4a88-b01b-1f0dd233eca9
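The unit conversions in the press release are easy to double-check. The sketch below reuses only the figures quoted above (83 petaflops for JETI, 1,000 petaflops per exaflop, JETI as roughly 8 percent of the full machine); the full-system figure at the end is a naive linear extrapolation, not an official JUPITER specification.

```python
# Unit check on the JETI figures quoted in the press release.
PETA = 1e15
EXA = 1e18

jeti_flops = 83 * PETA
print(f"83 petaflops = {jeti_flops:.2e} operations per second")
print(f"             = {jeti_flops / EXA:.3f} exaflops (1,000 PF per EF)")

# "83 million billion operations per second":
print(f"             = {jeti_flops / 1e15:.0f} million billion ops/s")

# Naive linear extrapolation from "JETI is about 8% of the full system".
full_system_estimate = jeti_flops / 0.08
print(f"Implied full JUPITER scale: ~{full_system_estimate / EXA:.1f} exaflops (rough, Linpack basis)")
```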
Exascale: the Engine of Discovery
Exascale computing will have a profound impact on everyday life in the coming decades. At 1,000,000,000,000,000,000 operations per second, exascale supercomputers will be able to quickly analyze massive volumes of data and more realistically simulate the complex processes and relationships behind many of the fundamental forces of the universe.
accessed 12/19/24 https://www.exascaleproject.org/what-is-exascale/
The “much more” would seem to include military applications as well as the surveillance and control of people and buying and selling.
This sounds like another step to 666.
The speed of processing is so fast that it is mind-boggling.
The Bible shows that a European power will arise that will control buying and selling:
16 He causes all, both small and great, rich and poor, free and slave, to receive a mark on their right hand or on their foreheads, 17 and that no one may buy or sell except one who has the mark or the name of the beast, or the number of his name. Here is wisdom. Let him who has understanding calculate the number of the beast, for it is the number of a man: His number is 666. (Revelation 13:16-17)
When the Apostle John penned the above, such control of buying and selling was not possible.
However, with supercomputers, AI, and digital payments, it is now.
The European Union has already announced it was launching surveillance software (see ‘The EU is allowing the linking of face recognition databases to create a mega surveillance system’ 666 infrastructure being put into place).
And, now, it looks to be closer to having the supercomputer hardware to make it more of a reality.
There is also an office in the EU to enforce its financial rules.
When it was announced, we put together the following video:
YouTube video (9:55): EU Setting Up 666 Enforcer?
The European Union is in the process of establishing the European Public Prosecutor’s Office. This is a major, first-of-its-kind move, with the EU setting up a European-wide prosecutor’s office that will have power to investigate and charge people for financial crimes committed against the EU. It looks like this type of office may end up persecuting those that do not have the mark of the Beast when they “buy or sell” as that will later be considered a financial crime in Europe. What does 666 mean? How has that name been calculated? How can we be certain that this is a prophecy for Europe and not Islam? Is the appointment of this new office of significant prophetic importance? Dr. Thiel addresses these issues and more by pointing to scriptures, news items, and historical accounts.
Here is a link to the sermonette video: EU Setting Up 666 Enforcer?
The European Public Prosecutor’s Office officially started operations on 1 June 2021 (https://www.eppo.europa.eu/en/background).
So, yes, there is an enforcer.
Now there is hardware, more surveillance software, and massively faster computation ability.
Persecution will come.
Jesus warned Christians to flee persecution:
23 When they persecute you in this city, flee to another. For assuredly, I say to you, you will not have gone through the cities of Israel before the Son of Man comes (Matthew 10:23).
Surveillance software on supercomputers may well be a factor in fleeing from multiple cities.
Europe in particular is moving more and more towards becoming a totalitarian state.
When will this fully be in place?
After the Gospel of the Kingdom of God has sufficiently reached the world the end comes and the Beast will rise up (Matthew 24:14-22, Revelation 13, 17:12-13).
We are getting closer to that day.
It looks like exascale supercomputers are another step towards 666.
Unlock the Potential of Immersion Cooling for Next-Gen Data Centers
Immersion Cooling Industry Overview
The global immersion cooling market size is anticipated to reach USD 1,006.6 million by 2030, according to a new report by Grand View Research, Inc. The market is expected to register a CAGR of 22.6% from 2023 to 2030. The growth is primarily driven by the rising demand for data center infrastructure as well as the increased power consumption by other cooling systems.
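As a quick check on what a 22.6% CAGR implies, the sketch below backs out the market size in the base year from the projected 2030 figure. The base-year choice (2023) follows the forecast window stated above, and the result is an implied estimate, not a number taken from the report.

```python
# Implied base-year market size from the projected value and CAGR:
#   future = base * (1 + r) ** years  =>  base = future / (1 + r) ** years

projected_2030 = 1006.6   # USD million (from the report summary above)
cagr = 0.226              # 22.6% per year
years = 2030 - 2023       # forecast window stated in the summary

implied_2023 = projected_2030 / (1 + cagr) ** years
print(f"Implied 2023 market size: about USD {implied_2023:.0f} million")  # roughly USD 240 million
```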
The cooling infrastructure in data center buildings consumes over half of the total energy. The need for data infrastructure is rapidly increasing, causing servers to store more data and approach their heat rejection limits faster. Immersion cooling systems are being used in data centers to cut energy usage and overhead expenses.
Gather more insights about the market drivers, restraints and growth of the Immersion Cooling Market
By removing active cooling components such as fans and heat sinks, immersion cooling enables a significantly higher density of processing capabilities. Smaller data centers can provide the same performance as larger data centers and can be easily fitted into metropolitan areas with limited space.
Due to the COVID-19 pandemic, the demand for web-enabled services increased tremendously as people across the globe stayed at home. For instance, Netflix gained 15.77 million new paid subscribers worldwide from February to March 2020, well above its projected 7 million, which created additional demand for data center capacity.
To reduce the environmental impact, data centers are focused on immersion cooling methods. Microsoft, for example, began burying its servers in liquid in April 2021 to increase energy efficiency and performance. This system saves money because no energy is required to transport the liquid around the tank, and no chiller is required for the condenser.
Immersion Cooling Market Segmentation
Grand View Research has segmented the global immersion cooling market based on product, application, cooling liquid, and region:
Immersion Cooling Product Outlook (Revenue, USD Million; 2018 - 2030)
Single-Phase
Two-Phase
Immersion Cooling Application Outlook (Revenue, USD Million; 2018 - 2030)
High-performance Computing
Edge Computing
Cryptocurrency Mining
Artificial Intelligence
Others
Immersion Cooling Cooling Liquid Outlook (Revenue, USD Million; 2018 - 2030)
Mineral Oil
Fluorocarbon-based Fluids
Deionized Water
Others
Immersion Cooling Regional Outlook (Revenue, USD Million; 2018 - 2030)
North America
US
Canada
Europe
Germany
Italy
France
UK
Netherlands
Russia
Asia Pacific
China
India
Japan
Australia
Central & South America
Brazil
Argentina
Middle East & Africa
Saudi Arabia
South Africa
Key Companies profiled:
Fujitsu Limited
Dug Technology
Green Revolution Cooling Inc.
Submer
Liquid Stack
Midas Green Technologies
Asperitas
DCX- The Liquid Cooling Company
LiquidCool Solutions
ExaScaler Inc.
Order a free sample PDF of the Immersion Cooling Market Intelligence Study, published by Grand View Research.
What Is Exascale Computing? Powering The Future Innovations
What is exascale computing?
Supercomputers that execute 10^18 operations per second are used in exascale computing, a milestone first reached in 2022. The explosive growth of big data, the rapid acceleration of digital transformation, and the growing dependence on artificial intelligence have made exascale computing a potential foundation for a global infrastructure that can handle much heavier workloads and demanding performance standards.
Why is Exascale Computing Important?
Exascale computing can help mankind simulate and analyze the globe to solve its most pressing challenges. It has applications in physics, genetics, subatomic structures, and AI, and it might improve weather forecasting, healthcare, and drug development, among other domains. Arm Neoverse is revolutionizing high-performance computing by powering the fastest supercomputer and enabling cloud HPC. Arm provides HPC designers with the design freedom to independently apply the technologies that boost performance in exascale-class CPUs.
What are the benefits of exascale computing?
The ability to tackle problems at extraordinarily complex levels is the foundation of exascale computing’s main advantages.
Scientific discovery: The field of scientific technology is always evolving. Supercomputing is urgently needed as advancements, validations, and research continue to further scientific understanding. Exascale computing has the capacity to regulate unstable chemicals and materials, answer for the origins of chemical elements, verify natural laws, and investigate particle physics. Without the ability to use supercomputing, scientific discoveries would not have been possible as a result of the research and analysis of these subjects.
Security: The security industry has a high demand for supercomputing. Exascale computing promotes development and efficiency in food production, sustainable urban planning, and natural disaster recovery planning, while also helping us fend off new physical and cyber threats to national, energy, and economic security.
National Security: Exascale computing‘s ability to analyze hostile situations and respond intelligently to threats is advantageous for national security. This amount of processing, which counters many hazards and threats to the nation’s safety, happens at almost unfathomable speeds.
Energy security: Exascale computing makes energy security possible by facilitating the study of stress-resistant crops and aiding in the development of low-emission technology. An essential part of the country’s security initiatives is making sure that food and energy supplies are sustainable.
Economic security: Exascale computing improves economic security in a number of ways. It makes it possible to accurately analyze the danger of natural disasters, including anticipating seismic activity and developing preventative measures. Supercomputing is particularly advantageous for urban planning as it helps with plans for effective building and use of the electricity and electric grid.
Healthcare: Exascale computing has a lot to offer the medical sector, particularly in the area of cancer research. Crucial procedures in cancer research have been transformed and expedited by clever automation capabilities and prediction models for drug responses.
How does exascale computing work?
To simulate the universe’s basic forces, exascale computers perform 1,000,000,000,000,000,000 (10^18) floating point operations per second.
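For a sense of scale, the sketch below compares how long a fixed batch of work would take on an idealized exascale machine versus an ordinary laptop; the laptop's throughput is an assumed round number (about 100 gigaFLOPS), not a measured figure.

```python
# Scale comparison: an idealized exascale system vs. an ordinary laptop.
# The laptop throughput is an assumed round number for illustration only.

exascale_flops = 1e18   # 10**18 floating point operations per second
laptop_flops = 1e11     # ~100 gigaFLOPS, a rough assumption

work = 1e18             # one exaFLOP's worth of operations

print(f"Exascale machine: {work / exascale_flops:.0f} second")
print(f"Laptop:           about {work / laptop_flops / 86400:.0f} days")
```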
Built from scratch, these supercomputers address today’s massive demands in analytics, AI, convergence modeling, and simulation. Exascale supercomputers accept a variety of CPUs and GPUs, even from different generations, multisocket nodes, and other processing devices in a single integrated infrastructure, resulting in dependable performance.
Computing design is essential to meeting the demands of your company since workloads change quickly. Supercomputers offer a single administration and application development architecture, can be purpose-built, and come with a variety of silicon processing options.
Answering the world’s most challenging research problems requires machines built for the task. Despite the enormous amount of hardware and components used in their construction, exascale computers can answer these queries by moving data between processors and storage rapidly and without lag.
Exascale computing vs quantum computing
Exascale computing
Systems that use an infrastructure of CPUs and GPUs to handle and analyze data can execute a quintillion (10^18) calculations per second, a capability known as exascale computing. Digital systems work in tandem with the world’s most powerful hardware to deliver this kind of computing.
Quantum computing
Conventional computing techniques do not apply to quantum computing because quantum systems operate on qubits rather than binary bits, allowing many possibilities to be processed at once. The foundation of this approach is superposition and entanglement, made possible by the principles of quantum physics, which let certain classes of problems be analyzed and solved efficiently.
For now, exascale computing can process and solve practical problems, inform decisions, and deliver technical advances at a far faster rate than quantum computing. In the longer term, however, quantum computing is poised to significantly outperform exascale machines on certain problems. Additionally, quantum computing uses far less energy to run workloads comparable to those of exascale supercomputers.
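To give the superposition and entanglement ideas a concrete form, here is a tiny state-vector sketch in plain NumPy that prepares a two-qubit Bell state. It is a pedagogical illustration of the underlying math, not a model of any real quantum chip or exascale system.

```python
import numpy as np

# Two-qubit Bell state built with a Hadamard gate and a CNOT gate,
# simulated as plain linear algebra on a 4-component state vector.

ket0 = np.array([1.0, 0.0])
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)   # Hadamard: creates superposition
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)           # entangles the two qubits

state = np.kron(ket0, ket0)                            # start in |00>
state = np.kron(H, np.eye(2)) @ state                  # Hadamard on the first qubit
state = CNOT @ state                                   # -> (|00> + |11>) / sqrt(2)

probs = np.abs(state) ** 2
for label, p in zip(["00", "01", "10", "11"], probs):
    print(f"P(measure {label}) = {p:.2f}")             # 0.50, 0.00, 0.00, 0.50
```

Measuring the two qubits always yields matching results (both 0 or both 1), which is the entanglement the passage above refers to; the equal 0.5 probabilities are the superposition.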
Read more on Govindhech.com
The World's Most Powerful Supercomputers: A Race for Exascale
Supercomputers, once reserved for a handful of research institutes, are now far more widely available and more powerful than ever. These computing behemoths have enabled innovations ranging from climate modeling to artificial intelligence. Have a look at the most powerful supercomputers around the world:
World's Best Powerful Supercomputers
1. Frontier
Location: Oak Ridge National Laboratory, Tennessee, USA
Performance: Exascale computing, meaning a system that can perform a quintillion (10^18) calculations in one second.
Applications: Climate and environmental research, materials science, and artificial intelligence.
2. Fugaku
Location: RIKEN Center for Computational Science, Kobe, Japan
Performance: Around 442 petaflops on the standard Linpack benchmark (exascale-class only on mixed-precision AI workloads); often noted for its energy efficiency.
Applications: Weather forecasting, drug discovery, and materials science.
3. Perlmutter
Location: National Energy Research Scientific Computing Center (NERSC), Berkeley Lab, California, USA
Performance: A pre-exascale system, delivering on the order of tens of petaflops on Linpack.
Applications: Artificial intelligence, climate modeling, and materials science.
Other Notable Supercomputers
Tianhe-3: A Chinese supercomputer noted for high performance and low energy consumption.
Sierra: A supercomputer based in the USA that has been created for simulating nuclear weapons and doing scientific research.
Summit: An older system that remains broadly useful and has supported a wide range of studies around the world.
These supercomputers are vital not only in solving complicated problems but also in improving technology. As technology continues to develop, we should expect to see more and more powerful supercomputers that will push the limits of what is currently achievable.
Google creates the Mother of all Computers: One trillion operations per second and a mystery
https://www.ecoticias.com/en/google-exascale-supercomputer/7624/ Beast system
The AI Compute Connection: Canada and the UK strengthen ties
The race for supercomputing power is heating up globally, with nations recognizing its pivotal role in training the next generation of AI models. Canada and the UK have emerged as leading players in this field, with a shared vision to harness the potential of AI for the benefit of society. To further solidify this partnership, the SIN Canada team organized a high-level inward mission to the UK (15-18 July 2024) aimed at deepening collaboration in the dynamic field of AI compute.

The Canadian delegation visited the UK with the aim of gaining invaluable insights into the UK’s supercomputing landscape. This mission was underpinned by the Memorandum of Understanding (MoU) signed in early 2024 by the UK and Canadian governments, which established a cooperative framework for future collaboration in AI compute. The delegation comprised some of the most senior officials from Innovation, Science and Economic Development Canada; board-level representatives of Canada’s world-leading AI institutes (MILA, Amii, and Vector); and representatives of CIFAR, the Communications Security Establishment, and the Digital Research Alliance of Canada. The program was packed with visits to cutting-edge facilities like Isambard-AI in Bristol and the exascale project in Edinburgh, offering a firsthand experience of the UK’s supercomputing capabilities and these complex, technical programmes.

A core focus of the mission was to understand the policy development behind the UK’s compute investments, its exascale investment, and the AI Research Resource. In April 2024, Prime Minister Trudeau announced Canada’s investment of CA$2 billion (£1.2 billion) to launch a new AI Compute Access Fund and a Canadian AI sovereign compute strategy. As the sector develops, officials are keen to learn from the UK’s experience in building such large-scale infrastructure. Additionally, the delegation sought insights into the UK’s project management and procurement approaches, access policies, and strategies for addressing the challenges of energy consumption associated with supercomputing – sustainable infrastructure is one element of the MoU.

The mission also provided an opportunity to explore the UK’s approach to AI safety and security. Meetings with the UK National Cyber Security Centre and the AI Safety Institute were crucial in understanding the measures being taken to mitigate risks associated with AI development. British and Canadian cyber security centres already work together, including endorsing the UK’s Guidelines for secure AI system development.

Beyond technical discussions, the delegation engaged in high-level networking events, including a cocktail reception at the Royal Society and a lunch at Canada House. These events facilitated valuable dialogue with key stakeholders in the UK AI ecosystem. One participant said:

… It was a masterfully organized and assembled group of visits in a whirlwind format. The mission achieved more than I anticipated in terms of breadth and depth of topic areas, tours, knowledge sharing. To say that the visit was inspirational would be an understatement. Rather, having seen what is possible and underway in the UK, I would venture to say that it has motivated a re-evaluation of what we believe could be possible, not only in Canada, but also in what partnerships and cooperation might be sparked between Canada and the UK in the realm of AI, compute infrastructure, and AI safety.
It truly brought to life the true spirit of the UK-Canada MoU …

This SIN Canada-led inward mission marks a significant step forward in the Canada-UK AI collaboration. By sharing knowledge and best practices, both countries can accelerate their progress in developing world-class supercomputing infrastructure. The ultimate goal was to create an environment where AI research and innovation can flourish, driving economic growth and addressing societal challenges. As the world becomes increasingly reliant on AI, partnerships like the one between Canada and the UK will be essential for shaping the future of this transformative technology. There will likely be a return visit in February 2025 to further cement UK-Canada AI collaboration and strengthen connections between UK and Canadian AI experts.
EU competitiveness report: Developments in AI and Competition Law
New Post has been published on https://thedigitalinsider.com/eu-competitiveness-report-developments-in-ai-and-competition-law-2/
Artificial Intelligence (AI) and the Law sit at a critical intersection as regulators seek to identify the most appropriate long-term solution to a technology bound by the pace problem.
The European Union (EU) Competitiveness Report published back in September 2024 [1] highlights considerations the EU bloc should look at in their forthcoming budget to ensure that excessive regulation does not impede an Artificial Intelligence-driven future.
What is the EU Competitiveness Report?
The EU Competitiveness Report was published by Mario Draghi on 9 September 2024 [1] outlining how stagnant economic growth and excessive red tape could threaten innovation, Europe’s prosperity, and social welfare.
Draghi’s report recommends sectoral and horizontal policies to ensure the bloc is competitive in the future alongside the United States and China. To achieve this, an injection of €750-800 billion [1] from a mixture of public and private investment is recommended (or 5 percent of the EU’s total Gross Domestic Product (GDP)), with €450 billion [1] allocated to the energy transition. In addition to the required investment, the report has additionally recommended reforms in Competition Law to allow for mergers of European corporations, especially on the back of the EU’s decision to block the merger between Siemens and Alstom back in 2019 [2].
How the recommendations in the report will be actioned in the long run will not solely be determined from Draghi’s presentation to the informal European Council but equally tested when President-elect Donald Trump is sworn into office on the 20th January 2025. Additionally, the negotiations on the upcoming multi-annual financial framework (MFF) that will shape the EU budget for the period 2028 – 2034 will be an additional hurdle in determining the actionability of the report, with a first draft expected in 2025. The foundational negotiations and, ultimately, the budget size and expenditure will determine if the report has laid the foundations for a more ambitious EU.
Technology innovation in Europe
With increasing global pressure to dominate the AI landscape whilst simultaneously working to improve understanding of AI ethics, there is a pressing need for increased investment along with Research and Development (R&D) to cope with the heightened computational demand AI is bringing: an area in which Europe is falling behind in.
Generally speaking, the EU’s industrial model is highly diversified when it comes to technology: it is more specialized in established technologies but weaker in both software and computer services. Taking R&D expenditure in Europe compared to the market leaders in software and internet, for example, EU firms represent only 7% compared to 71% for the US and 15% for China [1]. With technology, hardware, and equipment, again, the EU trails, accounting for only 12% of R&D expenditure compared to 40% in the US and 19% in China [1].
Despite Europe lagging behind in R&D, the bloc, on a positive note, has a stronghold in high-performance computing (HPC). 2018 saw the launch of the Euro-HPC joint undertaking, where the creation of large public infrastructure across six member states allowed for an increase in computing capacity [1]. Additionally, with plans to launch two exascale computers in the future, these new systems, alongside the AI Innovation package [3], will open up HPC capacity to AI startups, an important step in helping companies scale their AI systems.
Among the innovation agenda outlined above, core legislation surrounding the EU’s digital model is an important pillar to ensure fairness and contestability within the digital sector. The Digital Markets Act [4], for example, sets out obligations gatekeepers – large digital platforms providing core platform services such as search engines – must follow. As Frontier AI continues to develop, there is bound to be increased resistance between the EU and US companies as they more deeply embed Artificial Intelligence into their software with the aim of marketing it to as many consumers as possible.
What does the roadmap say about AI?
According to the Competitiveness report, only 11% of EU companies are adopting AI (compared to a target of 75% by 2030) [1], and when it comes to foundation models developed since 2017, 73% of these are from the US and 15% are from China [1].
A reason why Europe lacks competitiveness in this space is that it doesn’t have the availability of venture capital and lacks cloud hyperscalers as the US does, for example, through partnerships such as OpenAI and Microsoft. This is further exacerbated by the limited availability of venture capital funding. In 2023, for example, only $8 billion in venture capital was invested in the EU compared to $68 billion in the USA and $15 billion in China [1], respectively. Combining this with the fact that Mistral and Aleph Alpha – two companies building Generative AI models – require significant investment to become competitive against their non-EU rivals, they have little choice but to opt for funding overseas.
Back in March 2024, the EU’s AI Act [5] was passed: regulation that categorises AI systems, regardless of context, into differing risk levels. The Act’s enforcement and impact, however, are unlikely to be seen until 2026, when the transition period ends and provisions for high-risk systems come into effect. With AI embedded cross-platform, managing competition across the smaller and larger players will be a delicate balancing act, and with AI showing no signs of slowing down, the future of AI and its associated regulation will bring a combination of excitement and controversy.
What does it mean for AI and associated Law in the future?
Addressing the challenges around the availability of R&D funding will be one of numerous steps to ensuring the EU bloc is competitive in the AI space. However, with competition from the USA and China only going to strengthen, it will not only be funding that is required but a look at competition law reform.
High inflation environments could give rise to tacit collusion – a form of collusive behaviour as a result of firms coordinating their actions without explicitly reaching an agreement – a gap not easily filled, especially when there are no current (and good) EU-level tools to deal with the practice.
Furthermore, consumer inertia resulting from brand loyalty, switching costs, and habit formation may result in undisciplined competition due to consumers’ preference for a more cost-effective and technologically streamlined option. A competition enforcer wants to ensure consumers are not exploited, but at the same time, similar to tackling the high inflation aspect, there is no specific tool for them to use.
While the EU’s AI Act [5] is a commendable step in managing what is a continually difficult technology to contend with, Europe lagging behind holistically in AI development may result in EU companies having their market share lessened by their non-EU counterparts. Competition benefitting EU consumers on many occasions comes from trade resulting in regional and global markets, so if competition reform is on the table to boost enterprise in the bloc, minimising anticompetitiveness and harm from illegal subsidies to foreign firms must be looked at closely.
Bibliography
[1] Mario Draghi, The future of European Competitiveness Part B: in-depth analysis and recommendations, The European Commission, 2024.
[2] Siemens/Alstom (Case IV/M.8677) Commission Decision 139/2004/EEC [2019] OJ C 300/07.
[3] Joint Research Centre, Science for Policy brief, Harmonised Standards for the European AI Act, JRC 139430.
[4] Regulation (EU) 2022/1925 of the European Parliament and of the Council on contestable and fair markets in the digital sector [2022] OJ L265/1 (hereafter: DMA).
[5] European Commission. (2021). Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts. COM/2021/206 final. Available at: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52021PC0206
Here at AI Accelerator Institute, we’re always looking to bring you more exciting events each year.
And 2025 is no different; why not take a look at what we have planned for you?
AI Accelerator Institute | Summit calendar
Be part of the AI revolution – join this unmissable community gathering at the only networking, learning, and development conference you need.
Text
Scientists prepare for the most ambitious sky survey yet, anticipating new insight on dark matter and dark energy
On a mountain in northern Chile, scientists are carefully assembling the intricate components of the NSF–DOE Vera C. Rubin Observatory, one of the most advanced astronomical facilities in history. Equipped with an innovative telescope and the world's largest digital camera, the observatory will soon begin the Legacy Survey of Space and Time (LSST).
Over the course of the LSST's 10-year exploration of the cosmos, the Rubin Observatory will take 5.5 million data-rich images of the sky. Wider and deeper in volume than all previous surveys combined, the LSST will provide an unprecedented amount of information to astronomers and cosmologists working to answer some of the most fundamental questions in science.
Heavily involved in the LSST Dark Energy Science Collaboration (DESC), scientists at DOE's Argonne National Laboratory are working to uncover the true nature of dark energy and dark matter. In preparation for the LSST, they're performing advanced cosmological simulations and working with the Rubin Observatory to shape and process its data to maximize the potential for discovery.
Simulating the dark side
Together, dark energy and dark matter make up a staggering 95% of the energy and matter in the universe, but scientists understand very little about them. They see dark matter's effects in the formation and movement of galaxies, but when they look for it, it seems like it's not there. Meanwhile, space itself is expanding faster and faster over time, and scientists don't know why. They refer to this unknown influence as dark energy.
"Right now, we have no clue what their physical origins are, but we have theories," said Katrin Heitmann, deputy director of Argonne's High Energy Physics (HEP) division. "With the LSST and the Rubin Observatory, we really think we can get good constraints on what dark matter and dark energy could be, which will help the community to pursue the most promising directions."
In preparation for the LSST, Argonne scientists are taking theories about particular attributes of dark matter and dark energy and simulating the evolution of the universe under those assumptions.
It's important that the scientists find ways to map their theories to signatures the survey can actually detect. For example, how would the universe look today if dark matter had a slight temperature, or if dark energy was super strong right after the universe began? Maybe some structures would end up fuzzier, or maybe galaxies would clump in a certain way.
Simulations can help researchers predict what features will actually appear in real-world data from the LSST that would indicate a certain theory is true.
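As a loose illustration of mapping a theoretical assumption onto something a survey can measure, the sketch below integrates the Friedmann equation for a flat universe with different constant dark energy equation-of-state values w and compares the resulting comoving distances. It is nothing like the DESC simulation codes, and every parameter value is an assumption chosen for illustration.

```python
# Toy illustration (not DESC/LSST simulation code): how a dark energy parameter
# changes an observable quantity. We integrate the Friedmann equation for a flat
# universe with a constant equation of state w; all parameter values are assumed.
import numpy as np

C_KM_S = 299_792.458    # speed of light in km/s
H0 = 70.0               # Hubble constant in km/s/Mpc (assumed)
OMEGA_M = 0.3           # matter density parameter (assumed)

def hubble(z, w):
    """Expansion rate H(z) for a flat universe with constant dark energy w."""
    omega_de = 1.0 - OMEGA_M
    return H0 * np.sqrt(OMEGA_M * (1 + z) ** 3 + omega_de * (1 + z) ** (3 * (1 + w)))

def comoving_distance(z_max, w, n=2000):
    """Comoving distance to z_max in Mpc via a simple trapezoidal integral of c/H(z)."""
    z = np.linspace(0.0, z_max, n)
    f = C_KM_S / hubble(z, w)
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(z)))

for w in (-1.0, -0.9, -1.1):    # w = -1 corresponds to a cosmological constant
    print(f"w = {w:+.1f}: comoving distance to z = 1 is about {comoving_distance(1.0, w):,.0f} Mpc")
```

Shifting w by ten percent in this toy setup moves the distance to z = 1 by only of order a percent, which is the kind of subtle signature that full simulations help translate into concrete expectations for the survey data.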
Simulations also allow the collaboration to validate the code they will use to process and analyze the data. For example, together with LSST DESC and the collaboration behind NASA's Nancy Grace Roman Space Telescope, Argonne scientists recently simulated images of the night sky as each telescope will actually see it. To ensure their software performs as intended, scientists can test it on this clean, simulated image data before they begin processing the real thing.
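The logic of testing software on simulated data can be illustrated with a much smaller example than the real LSST or Roman image simulations: inject a source of known brightness into a synthetic image, run a crude measurement on it, and check that the known input comes back out. Everything below is an assumed toy setup, not the collaboration's pipeline.

```python
# Toy validation run (not LSST/Roman simulation code): inject a source with a
# known total flux into a synthetic noisy image, measure it with simple
# aperture photometry, and confirm the measurement recovers the input value.
import numpy as np

rng = np.random.default_rng(1)
size, true_flux, sky_sigma = 64, 1000.0, 2.0

# Build the synthetic image: a Gaussian blob of known total flux plus background noise.
y, x = np.mgrid[0:size, 0:size]
psf = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / (2 * 3.0 ** 2))
image = true_flux * psf / psf.sum() + rng.normal(0.0, sky_sigma, (size, size))

# "Pipeline" step: sum the pixels inside an aperture centred on the source.
aperture = (x - 32) ** 2 + (y - 32) ** 2 <= 12 ** 2
measured = image[aperture].sum()

print(f"injected {true_flux:.0f}, recovered {measured:.0f}")  # should agree within the noise
```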
To perform their simulations, Argonne scientists leverage the computational resources of the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science user facility. Among its suite of supercomputers, the ALCF houses Aurora, one of the world's first exascale machines, which can perform over one quintillion—or one billion billion—calculations per second.
"Aurora's impressive memory and speed will allow us to simulate larger volumes of the universe and account for more physics in the simulations than ever before, while maintaining high enough resolution to get important details right," said Heitmann, who formerly served as spokesperson for the LSST DESC.
What to expect when you're expecting an astronomical amount of data
During the LSST, light emitted a long time ago from galaxies far away will reach the observatory. Sensors on the observatory's camera will convert the light into data, which will travel from the mountain to several Rubin Project data facilities around the world. These facilities will then prepare the data to be sent to the larger community for analysis.
As part of the LSST DESC, Argonne scientists are currently working with the Rubin Observatory to ensure the data is processed in ways that are most conducive to their scientific goals. For example, Argonne physicist Matthew Becker works closely with the Rubin Project to develop algorithms for data processing that will enable investigation of dark matter and dark energy through a phenomenon called weak gravitational lensing.
"As light from distant galaxies travels to the observatory, its path is influenced by the gravitational pull of the mass in between, including dark matter," said Becker.
"This means that, as the observatory will see them, the shapes and orientations of the galaxies are slightly correlated in the sky. If we can measure this correlation, we can learn about the distribution of matter—including dark matter—in the universe."
Weak gravitational lensing can also reveal how the structure of the universe has changed over time, which could shed light on the nature of dark energy. The challenge is that the signals that indicate weak gravitational lensing in the LSST data will be, well, weak. The strength of the signal the scientists are looking for will be roughly 30 times smaller than the expected level of noise, or unwanted signal disturbance, in the data.
This means the scientists need a whole lot of data to make sure their measurements are accurate, and they're about to get it. Once complete, the LSST will have generated 60 petabytes of image data, or 60 million gigabytes. It would take over 11,000 years of watching Netflix to use that amount of data.
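The scale of "a whole lot of data" follows from simple averaging: noise shrinks roughly as one over the square root of the number of independent measurements, so pulling a per-measurement signal that is about 30 times smaller than its noise up to a comfortable detection takes on the order of (30 × 10)², or roughly 90,000, independent measurements per quantity of interest. The data-volume comparison can be checked the same back-of-the-envelope way; the streaming rate below is an assumed figure for standard-definition video, not a Netflix specification.

```python
# Back-of-the-envelope arithmetic only; the streaming rate is an assumed
# round number for standard-definition video, not an official figure.
DATASET_GB = 60e6                 # 60 petabytes expressed in gigabytes
STREAM_GB_PER_HOUR = 0.6          # assumed standard-definition streaming rate

hours = DATASET_GB / STREAM_GB_PER_HOUR
years = hours / (24 * 365)
print(f"~{years:,.0f} years of continuous streaming")   # roughly 11,000 years
```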
Becker and his colleagues are developing methods to compress the data to make analysis both manageable and fruitful. For example, by combining images of the same parts of the sky taken at different times, the scientists can corroborate features in the images to uncover correlations in the shapes of galaxies that might have otherwise been too faint to detect.
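A stripped-down version of that idea, often called coaddition or stacking, is sketched below. Real LSST coaddition has to handle astrometric alignment, a varying point-spread function, and outlier rejection; none of that appears here, and all the numbers are arbitrary assumptions.

```python
# Toy coaddition (not LSST pipeline code): averaging many exposures of the same
# patch of sky beats the noise down roughly as 1/sqrt(N), so a source too faint
# to see in any single exposure stands out in the stack.
import numpy as np

rng = np.random.default_rng(0)
n_exposures, size = 100, 64
truth = np.zeros((size, size))
truth[32, 32] = 3.0                          # faint point source, arbitrary units
noise_sigma = 5.0

exposures = truth + rng.normal(0.0, noise_sigma, size=(n_exposures, size, size))
coadd = exposures.mean(axis=0)               # stacked image

single_snr = truth[32, 32] / noise_sigma
stack_snr = coadd[32, 32] / (noise_sigma / np.sqrt(n_exposures))
print(f"single-exposure SNR ~ {single_snr:.1f}, {n_exposures}-exposure stack SNR ~ {stack_snr:.1f}")
```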
Becker is also focused on determining the level of confidence the community can expect to have in conclusions drawn from the compressed data.
"If we know how certain we can be in our analysis, it enables us to compare our results with other experiments to understand the current state of knowledge across all of cosmology," said Becker. "With the data from the LSST, things are about to get much more interesting."
IMAGE: Simulated images of the cosmos from the DC2 simulated sky survey conducted by the Legacy Survey of Space and Time (LSST) Dark Energy Science Collaboration (DESC). DC2 simulated five years of image data as it will be generated by the Rubin Observatory during the LSST. Credit: LSST DESC
Text
What is Exascale Computing?… https://patient1.tumblr.com/post/761561652081704960?utm_source=dlvr.it&utm_medium=tumblr