# Development pipeline analysis
gdesignsme · 4 days ago
How to Evaluate Potential Investment Properties
Understanding Your Investment Goals
Before diving into the evaluation of potential investment properties, it is essential to clarify your investment goals. Are you seeking long-term capital appreciation, immediate cash flow, or perhaps a combination of both? Knowing your objectives will guide your evaluation process and help you identify properties that align with your financial aspirations…
0 notes
aster-daydream404 · 8 months ago
Tumblr media
NO BC FR THO!! ALSO!! Taking into account how he had to survive off the land at what I'm assuming is a very young age as well-!!!!
So I've been reading the Moomin Books lately and I think that some people think that Snufkin is way less of a mess than he is
Bro is NOT a cool, stoic guy
He's a silly man who doesn't care whether or not visitors came by to see him specifically, he only cares that they're there now
He brought a can of beans to a theater to pay for 25 children and little My's tickets, children who he was literally afraid of in case he didn't like them
At the theater, when the whole play goes to chaos, he hoists said children on stage and asks Moominmama to take care of them
He cried over the sea being gone because of a comet coming towards Moominvalley, not because he was going to die, but because he liked the ocean and the steam rusted his harmonica
He also got so attached to the children that he said that he'd smoke the terrible tobacco they made every summer
He is not stoic, he shows emotion, and he's a loser
196 notes
projectchampionz · 10 months ago
ANALYZING THE EFFECTIVENESS OF SUCCESSION PLANNING AND TALENT MANAGEMENT IN HEALTHCARE
1.1 Introduction
Succession planning and talent management are critical strategies in ensuring the long-term success of healthcare organizations. In an industry where the demand for leadership and skilled professionals is growing due to demographic shifts, technological advancements, and evolving patient…
0 notes
solar-sunnyside-up · 1 year ago
I'm really curious about what you think schooling would look like in a Solarpunk world/future!
Because the current public school system is broken af and the homeschool system isn't much better. I personally have looked into things like Sudbury schools and found good things and also issues. I've always been a proponent of the IDEA of Unschooling (which I understand to be, letting the child learn naturally through the world around them, learn reading through reading to them or teaching math and even basic chemistry through teaching them to cook, etc) but it seems like most parents use it as an excuse to not educate their kids...
I really think kids should learn practical things alongside the Academic stuff (three Rs, science, etc) but no system seems right...
Oooh boy! Have I thought about this one endlessly!
So background info that I have to frame where I'm coming from-
A) the current system is built for a school >> factory worker pipeline
B) it also evolved from ppl working at factories and needing to put their kiddos somewhere while they worked their 9-5! Thus Sunday school evolved from something that taught basic literacy into a full-time job for children (it's legit nearly 40-hour weeks for CHILDREN), so there's a lot of padded time to ensure they meet that quota
C) it's used at a massive scale it was NOT designed to be used at
Soooo!! Let's imagine a better one!
Personally, based on children's development I think schooling should be broken up into focused chunks and then obvi each kiddo should be able to work at their own pace within these chunks of time.
0-6 Motor and sensory skills- introduced to music/shapes/building, "helping" with community chores (laundry/windows/dishes/sweeping), basics of plants/gardens, learning about transportation and basic navigation.
7-10 Written- literacy (reading/printing/telling time/storytelling/etc), health (emotional+physical), basic cooking + tool usage, basics of history/geography, basics of all sciences, gardening more independently
11-13 social + advanced work -- advanced history/science/literacy/home ec/etc., start working within the community in a volunteer capacity, starting to specialize in interests, focuses in philosophy/analysis/debate,
14-20 community and citizenship -- greater focus on philosophy/debate/analysis, in addition to apprenticeships for testing out what they'd like to do with their lives
20+ whatever they wanna do! Personally I think our adulthood should start over from 0 here. Bc after you hit 20 you're a baby adult, and a 35yr old is nearly a teenager and should be treated as such! Finding themselves, building community, getting the swing of all that jazz.
Then the WAY this is taught would be with ppl close to the kiddos; neighbors and parents and community leaders would be in charge of these chunks. Much more like a tutor or professor style where each teacher specializes in both the thing they're teaching and the kiddos they're raising.
229 notes
covid-safer-hotties · 9 months ago
Also preserved in our archive
A new study by researchers at Zhejiang University has highlighted the disproportionate health challenges faced by sexual and gender-diverse (SGD) individuals during the COVID-19 pandemic. By analyzing over 471 million tweets using advanced natural language processing (NLP) techniques, the study reveals that SGD individuals were more likely to discuss concerns related to social connections and mask-wearing, and experienced higher rates of COVID-19 symptoms and mental health issues than non-SGD individuals. The study has been published in the journal Health Data Science.
The COVID-19 pandemic has exposed and intensified health disparities, particularly for vulnerable populations like the sexual and gender-diverse (SGD) community. Unlike traditional health data sources, social media provides a more dynamic and real-time reflection of public concerns and experiences. Zhiyun Zhang, a Ph.D. student at Zhejiang University, and Jie Yang, Assistant Professor at the same institution, led a study that analyzed large-scale Twitter data to understand the unique challenges faced by SGD individuals during the pandemic.
To address this, the research team used NLP methods such as Latent Dirichlet Allocation (LDA) models for topic modeling and advanced sentiment analysis to evaluate the discussions and concerns of SGD Twitter users compared to non-SGD users. This approach allowed the researchers to explore three primary questions: the predominant topics discussed by SGD users, their concerns about COVID-19 precautions, and the severity of their symptoms and mental health challenges.
The findings reveal significant differences between the two groups. SGD users were more frequently involved in discussions about "friends and family" (20.5% vs. 13.1%) and "wearing masks" (10.1% vs. 8.3%). They also expressed higher levels of positive sentiment toward vaccines such as Pfizer, Moderna, AstraZeneca, and Johnson & Johnson. The study found that SGD individuals reported significantly higher frequencies of both physical and mental health symptoms compared to non-SGD users, underscoring their heightened vulnerability during the pandemic.
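To make this kind of group comparison concrete, here is a hedged sketch of a two-proportion z-test on the reported 20.5% vs. 13.1% "friends and family" shares; the sample sizes are invented for illustration, since the groups' actual sizes are not given in this summary.

```python
# Hedged sketch: a two-proportion z-test for the kind of group comparison
# reported above. The counts below are made up to match the reported shares;
# the study's real group sizes are not stated in this summary.
import math

def two_proportion_z(x1, n1, x2, n2):
    """Return (z statistic, two-sided p-value) for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value via the normal CDF (math.erf is in the stdlib).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts matching the reported 20.5% vs 13.1% topic shares.
z, p = two_proportion_z(x1=2050, n1=10000, x2=1310, n2=10000)
print(f"z = {z:.2f}, p = {p:.2g}")  # at these sample sizes, p is tiny
```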
"Our large-scale social media analysis highlights the concerns and health challenges of SGD users. The topic analysis showed that SGD users were more frequently involved in discussions about 'friends and family' and 'wearing masks' than non-SGD users. SGD users also expressed a higher level of positive sentiment in tweets about vaccines," said Zhiyun Zhang, the lead researcher. "These insights emphasize the importance of targeted public health interventions for SGD communities."
The study demonstrates the potential of using social media data to monitor and understand public health concerns, especially for marginalized communities like SGD individuals. The results suggest the need for more tailored public health strategies to address the unique challenges faced by SGD communities during pandemics.
Moving forward, the research team aims to develop an automated pipeline to continuously monitor the health of targeted populations, offering data-driven insights to support more comprehensive public health services.
More information: Zhiyun Zhang et al, Sexual and Gender-Diverse Individuals Face More Health Challenges during COVID-19: A Large-Scale Social Media Analysis with Natural Language Processing, Health Data Science (2024). DOI: 10.34133/hds.0127 spj.science.org/doi/10.34133/hds.0127
60 notes
grison-in-space · 11 months ago
I did a bit of de novo genome assembly way, way back in the day which I have never been able to use professionally because my PI refused to spend $2000 more on getting new read depth. He had ordered the reads before actually learning anything about the pipeline and only about half of the libraries he had ordered were usable in any given pipeline, see. (Some had been for older assembly methods and others had been for newer ones, basically.)
Rather than find the money to fucking get me the reads to do it right, he heard about an open source project called RACA that was some dude's dissertation arguing that you COULD use some of the worthless libraries to fill in the gaps of the assembly and get a functional genome out of it. I spent two years trying to move massive quantities of data through that fuckhead's pipeline on the campus supercomputer to get the assembled genome out, and then I got to the end and found there was no output as fastq files or any other format recognizable to me.
(Give me a break, I was 23 and had also been frantically learning acoustic analysis, basic electrical engineering, and technical equipment maintenance in the two years since I had started learning to code. Plus I was figuring out what I wanted my dissertation to be. I'd never grappled with anything more complicated than our home-written library of matlab acoustic analysis before, and it simply hadn't occurred to me that anyone would publish a non-functional pipeline just to claim a goal quickly, without anyone verifying that anyone else had actually gotten it to work yet.)
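As an illustrative aside, not from the original pipeline: a minimal FASTQ sanity check, the kind of thing that tells you quickly whether a pipeline's output is in a recognizable read format, could look like this. The sample records here are hypothetical.

```python
# Hedged sketch: a minimal FASTQ record reader and sanity check.
# The records below are made up for illustration.
import io

def iter_fastq(handle):
    """Yield (read_id, sequence, quality) from a FASTQ stream."""
    while True:
        header = handle.readline().strip()
        if not header:
            return
        seq = handle.readline().strip()
        plus = handle.readline().strip()
        qual = handle.readline().strip()
        if not header.startswith("@") or not plus.startswith("+"):
            raise ValueError(f"not FASTQ: bad record at {header!r}")
        if len(seq) != len(qual):
            raise ValueError(f"length mismatch in {header!r}")
        yield header[1:], seq, qual

sample = io.StringIO(
    "@read1\nACGTACGT\n+\nIIIIIIII\n"
    "@read2\nGGCCTTAA\n+\nFFFFFFFF\n"
)
records = list(iter_fastq(sample))
print(len(records), "records parsed")
```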
Anyway, eventually he collaborated with someone else who ponied up $2000 and a postdoc to get new reads. My name was not on the paper, so that's two years of my life developing a particular and fairly unique skill set that I will almost certainly never use.
In retrospect it's less surprising than you might think that the PhD took eight years and absolutely shattered my confidence.
And the best part is that it was just about impossible to predict at the time that shit would go quite this bad, except that some people handle power well when they're stressed and some people maintain a strong layer of cognitive dissonance over their knowledge of power such that it's never real enough to be responsible about but always real enough to win a dispute.
Anyway I think every student should have two advisors so that everyone in the department should have to immediately know about it when a PI is floundering and have a strong direct incentive to do something about it. A LOT of my problems could have been fixed with one look with a gimlet eye from a senior, more experienced researcher being not impressed at a student under their supervision running on an endless treadmill to nothing. Frankly a lot of my problems could have been solved if my mentor had formal training or literally any supervision that could deliver metrics faster than "how close am I to my previous mentees?"
I know a lot of dual advised students wind up in a tug of war between two advisors, but like: that's the point. If one of them turns out to be insane and malicious then a) the students all have clear lines to bail, b) the other ones all realize quickly that bailing out the chaos and career damage of someone who is fucking it up is way more work than resolving the problem, and c) the one with more tethers to reality has a way bigger likelihood of formally retaining the student when and if a third party has to examine the contract.
Just. It was such a fucking waste. And not because anyone necessarily wanted it to be wasteful, either, or any malice, but because I was... mm, I think the fifth PhD student in that lab and that's actually not that many to be learning on. Systems that set you up to play with decades of people's lives should have more fail-safes and places for people to learn before they get to be the sole director of someone else's career for five fucking years, not less. And yet!
31 notes
formulahs · 10 months ago
get started i need the orca analysis
Tumblr media Tumblr media
the people yearn for the f1 driver-cetacean pipeline. well orcas are highly efficient predators with basically no other predators of their own… their senses are extremely developed, their killings are surgical yet they do show the habit of playing with food (which um.. norstappen god bless). so yeah… a sophisticated killer. thats my brother max emilian through and through
25 notes
eponymous-rose · 6 months ago
It's another busy week, so I'm gonna do one of these again because it genuinely helps me keep track. Today in a nutshell!
Worked on some e-mails over breakfast - mostly coordinating for dinner tonight (I 100% did not forget to make the reservation, I promise, I just uhhhhhhhhhhh definitely didn't forget, that's for sure, and thank goodness for no particular reason that they happened to have one table left at 6PM), happily agreeing to write some reference letters for my PhD student's postdoc applications, rescheduling some meetings, setting new meetings, meetings meetings meetings. Oh, and booking tables for a couple of card shows this month! Off to work!
I get in a little later than I'd like and rush downstairs to the lounge to make my mug of tea pre-class, where I run into a student who just defended his PhD last week. I'm on his reading committee, so we agree to set up a time to go over my (honestly quite minor) comments on his dissertation. I also run into our incredible facilities guy, who follows up on some technical issues my students ran into over the weekend, hopefully resolved - I have five groups of three undergraduate students running their own weather stations all across the metro area of our city!
No time to enjoy the tea, so I leave it to steep a hilariously long time and rush back downstairs to teach my class! This year's students are truly exceptional - apparently over the weekend they all discovered that the Mac version of the data collection software for their weather stations is no longer supported, and they all independently coordinated to get PCs into the hands of all 5 groups. Let me tell you, when you're expecting to have to spend the first 20 minutes of the class troubleshooting and are instead greeted by a quiet, expectant two rows of faces, it's a great feeling.
Today's lecture is a topic I'm really passionate about - teaching students the "why" behind a lot of the statistical methods they've learned in the past (these are college seniors) and working on building a pipeline for exploratory data analysis. This isn't explicitly part of the syllabus, but my gosh, the quality of the final reports has improved sharply once I introduced these lectures. The students participated a bunch and happily launched into think-pair-share groups without my having to coordinate them. This is my sixth time teaching this class, and these students are far and away the best I've encountered. I am also very, very bad with names (and have a lot of anxiety about calling someone by the wrong name) but managed to successfully use an example in class in which I rattled off four students' names in a row, no effort needed. Phew.
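The post doesn't show the course's actual EDA pipeline, but a minimal, standard-library-only sketch of the usual first steps (missingness, summary statistics, outlier flags) on hypothetical weather-station readings might look like this:

```python
# Hedged sketch of a first-pass exploratory data analysis, on made-up
# weather-station readings; the course's real pipeline and data are not
# shown in the post. Standard library only.
import statistics

# Hypothetical hourly temperature readings; None marks a missing value.
readings = [12.1, 13.4, None, 15.0, 14.2, 13.8, None, 12.9, 11.5, 13.1]

# Step 1: quantify missingness before touching the numbers.
n_missing = sum(r is None for r in readings)
values = [r for r in readings if r is not None]

# Step 2: basic summary statistics.
summary = {
    "n": len(values),
    "missing": n_missing,
    "mean": statistics.mean(values),
    "stdev": statistics.stdev(values),
    "min": min(values),
    "max": max(values),
}

# Step 3: flag outliers more than 2 standard deviations from the mean.
outliers = [v for v in values
            if abs(v - summary["mean"]) > 2 * summary["stdev"]]

for key, val in summary.items():
    print(f"{key:>8}: {val}")
print("outliers:", outliers)
```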
As a side note, this has always been far and away my least-favorite class to teach, and this was the year I was gonna change that - I brought it to a curriculum development workshop last year and even presented on it at an education conference last week. But... dang, having strong students truly makes it effortless to enjoy teaching this class.
Back to my office, which smells like the double-spiced chai that has been steeping so long it's probably quadruple-spiced by now. Delicious. I have an hour until my next commitment, so I try to get ahead on grading the homework assignment my students handed in on Friday (all 15 of them handed it in on time!!!!). I also realize that this is my last block of free time until dinner, so I run downstairs to heat up my soup for lunch.
After getting through four of the assignments, it's time for a weather briefing (we have a team for a national forecasting competition), which means it's mostly just time for technical difficulties, but we make it through in the end and wrap up a bit early - back to grading! Students are doing great on this assignment overall, which is gratifying, but I make a note of a topic some of them are struggling on so I can mention it during Wednesday's class.
Weekly hour-long meeting with one of my Master's students! He talked about how he's taking a course on pedagogy to help with his work as a teaching assistant this quarter (!!) and he's been working through my first round of revisions of his very first first-author scientific journal article and had a few clarifying questions. I recommended some off-the-wall papers in the communications literature that I think would dovetail well with some of the discussion in his paper, and he was really jazzed to get to explore those. We also decided to get him set up with a million core-hours on a supercomputer so he can start on the next phase of his research - he promised to have the paper ready for the next set of revisions by the end of the week, so while I'm working on that, he can get familiar with the new system. I am also reminded that I really need to come up with some more substantial funding for him - currently he's working on a fellowship, but that runs out after three years.
After he heads out (a few minutes early, more grading time!) I get an e-mail from a scientist in Switzerland - she and I are working on getting her out here for a two-year postdoc job studying lightning with me. She's made revisions on her application for funding, so that's another thing for me to read over this week. I'm also reminded that I have to get back to an Italian grad student who wants to come visit my group for a year. Still figuring out the logistics on that one...
I also need to get back to a forestry service colleague of mine about getting the university my share of the funds for our fast-approaching field work using brand-new radar tech to study wildfire smoke plumes. I really, really need to get back to him this week - I think we're planning on flying out in April to start.
ALSO also this week, I have some pretty intense revisions of my own to deal with - I've been given this opportunity to write a huge review article, and I finally got it done back in December... only to learn that they want it to be about half that length. I'm going to take a swing at carving 5,000 words out of that behemoth.
AND a colleague and I are working on a resubmission of a grant to study thunderstorms in really unusual places, and I promised her I'd have a complete draft for her to read by the 7th. Phew. Good thing my week is only front-loaded with meetings.
Whoops, no more time to grade/read e-mails and schedule in my head. We have someone here today interviewing for a job on our faculty, and I'm one of the search committee members! Better dash downstairs to catch the candidate's talk. We have five two-day interviews planned for the next four weeks. Ouch.
Awesome talk by the candidate (we're very lucky to be spoiled for choice even in our very specialized field - we've whittled 86 qualified candidates down to five), and I launch straight from that into a student's PhD entrance exam. At this stage I should mention how much I genuinely loathe our PhD entrance exam, which is a pedagogical and logistical nightmare all around. This was a very popular opinion, which is why we as the faculty voted unanimously to completely change the process last year. Why are there students still taking this horrible exam???? Fuck if I know, man. At this point, it's voluntary to opt into it, and I am baffled and deeply frustrated at how many faculty members apparently encouraged their students to take it. Anyway, the student does a great job and we muddle through somehow, and now it's back up to my office to do some cramming on small-talk topics before a colleague and I host the faculty candidate for dinner!
A delightful dinner all around - my colleague is someone I was initially intimidated by (she's a giant in the field) but with whom I have since bonded, so we had some fun banter in the car and I think it helped the job candidate relax a little. We had some fun big-picture talk (and some less-fun big picture talk about news that dropped as we were eating) but mostly just talked about how much we love this part of the world. Good food, drink, and conversation. On the car ride home, I managed to troubleshoot a problem my undergrad research assistant was having with getting access to the supercomputer he needs for his project. Phew.
That's a long day, but good stuff all around!
19 notes
allthecanadianpolitics · 2 years ago
Chalk up a win for the provinces and a loss for the federal government's environmental ambitions.
In a 5-2 decision released on Friday, the Supreme Court of Canada ruled against Ottawa and in favour of arguments from provincial governments about how major projects are approved in the country.
The ruling focused on the federal government's Impact Assessment Act (IAA), which gives federal regulators the power to assess potential environmental and social impacts of various major projects, such as pipelines, power plants and airports. 
Experts say it's a setback, but not a critical blow to the federal government's environmental agenda, although it could have broader implications for other climate policies Ottawa is developing.
Meanwhile, it's a triumph for provincial autonomy. [...]
As CBC reporter Erin Collins more colloquially put it on CBC Radio, a few minutes after the decision was released, "this was really Alberta telling the feds to stay off their lawn and the local bylaw officer kind of coming by and agreeing with them." [...]
Continue Reading.
Note from the poster @el-shab-hussein: Yay. Now Alberta gets free rein over their horrifically backwards tar sands environmental policies.
Tagging: @politicsofcanada, @abpoli
48 notes
mariacallous · 1 year ago
In April 2018, I was invited by the American ambassador to a meeting at the embassy in Tbilisi, Georgia. The ambassador had assembled a group of nongovernmental organization (NGO) leaders in the field of disinformation to meet with a senior Trump administration official from the State Department. He asked us to describe the main narratives of Kremlin disinformation. As the director of a large international democracy organization, I highlighted Russia’s manipulation of gender and LGBTQ issues to sway Georgians away from the perceived “cultural decadence” of the European Union. The official’s frustration was palpable. His response, tinged with irritation, was telling: “Is that all you people can talk about? The gays?”
A year before, several international organizations partnered with Georgian parliamentarians on a gender equality assessment, supported by several government donors. This collaboration led to an internal conflict. The United States Agency for International Development (USAID) wanted to scrub the original report, as it covered abortion, notably legal in Georgia, while the Swedish government and other stakeholders wanted the complete assessment. As a result, at the time of its release, two distinct reports had to be printed, one with references to abortion and one without.
Former U.S. President Donald Trump emerged victorious from last week’s New Hampshire primary and is likely to be the Republican Party’s presidential nominee. His closing statement in New Hampshire praised Hungarian leader Viktor Orban, who embraces the oxymoronic term “illiberal democracy” while suppressing independent media, civil society, and courts. He has repeatedly emphasized the glory of strongmen like Orban. His foreign policy has been clear: stopping support for Ukraine, NATO, and our European allies.
But while there has been plenty of analysis of Trump’s America First impact on foreign policy and security, less covered is how it will also completely redefine foreign aid as well as the liberal democracy agenda. My experience with the first Trump administration as a senior leader in democracy organizations receiving funding from USAID provides some insight into the foreign-aid agenda of a second, but likely only scratches the surface of what is to come.
The Heritage Foundation’s Project 2025, established in 2022, offers a detailed roadmap for revamping USAID under Trump—one that will undermine, eliminate, and censor the critical work of thousands of people and organizations committed to building more just societies. The Heritage Foundation has been staffing and providing a pipeline of ideas to Republican administrations since President Ronald Reagan. Project 2025 is a plan to shape the next Republican administration, and its funders have close ties to Trump. The project’s objective is to replace “deep state” employees with conservative thought leaders to carry out an executive-driven agenda.
In the overview, the project articulates its goal to end what it calls USAID’s “divisive political and cultural agenda that promotes abortion, climate extremism, gender radicalism, and interventions against perceived systemic racism.” A key component of the illiberal playbook is to attack gender and marginalized communities, an early warning sign of democratic backsliding. Illiberal strongmen, such as Turkey’s Recep Tayyip Erdogan and Russia’s Vladimir Putin, exploit traditional hierarchies to divide society and create pecking orders of power. Russia refused to sign, and Turkey withdrew from, the Istanbul Convention, a commitment to protect women from domestic violence. The Narendra Modi administration in India filed an affidavit in the Supreme Court against criminalizing marital rape, arguing it would destabilize marriage. Hungary and Poland lobbied to ban the term “gender equality” in international agreements and implemented anti-LGBTQ policies, including local municipalities adopting “LGBT-free” zones as part of a government-supported “Family Charter” in Poland.
As a first step, Trump’s USAID will “dismantle” all diversity, equity, and inclusion (DEI) initiatives, which Project 2025 calls “discriminatory.” This mandate includes firing the chief diversity officer and all advisors and committees. In 2016, the Obama administration issued a DEI presidential memorandum to ensure USAID, among other agencies, had a diverse and representative workforce. Trump scaled back these efforts. On Jan. 20, 2021, Biden’s first day in office, he signed an executive order that demanded that government agencies devise strategies to tackle DEI issues. Pursuant to this, USAID Administrator Samantha Power signed USAID’s DEI strategy on her first day in May 2021. Project 2025 would reverse this strategy, requiring USAID to “cease promotion of the DEI agenda, including the bullying LGBTQ+ agenda,” which entails support for organizations overseas that work on these issues.
According to Project 2025, Trump’s new USAID will also eliminate the word “gender” full stop, arguing that “Democrat Administrations have nearly erased what females are.” This is bizarre, as I have decades of experience receiving USAID funding for numerous programs to advance women in political life and support women’s organizations. Working for democracy organizations across Asia and the former Soviet Union, I saw USAID provide critical support to expand women’s wings of political parties; recruit women election officials, observers, and administrators; train women’s advocacy and rights organizations; and build women’s committees in parliaments.
The Heritage Foundation report also accuses USAID of “outright bias against men,” an equally strange claim; in fact, gender realignment was needed and implemented. A Trump USAID will fire more than 180 gender advisors and points of contact, who work alongside USAID colleagues “to integrate gender and advance gender equality objectives in USAID’s work worldwide,” and scrub the words “gender,” “gender equality,” and “gender equity” from all documents. This would require a massive purge of decades of USAID materials and websites.
USAID has spent years incorporating gender into all aspects of its programming to ensure the agency addresses the needs of women, including unique development obstacles they face. Removing a gender lens would take us back in time to programming that often harmed women, inadvertently, by failing to analyze the varying effects of programming based on gender and power dynamics in different environments. To erase all of USAID’s tools, learning, and research on how to ensure best practice would have dangerous consequences. For example, when I worked for USAID in Cambodia in the 1990s, the agency supported micro-lending for small community projects, in which most of the loans went to women. This resulted in increased domestic violence, as men were angry about the financial imbalance in the home. Today, USAID has gender analysis and research on risk factors to mitigate against such outcomes.
Relatedly, a Trump USAID will make anti-choice “core” to its mission, removing all “references to ‘abortion,’ ‘reproductive health,’ and ‘sexual and reproductive rights.’” Project 2025’s blueprint singles out specific organizations and U.N. agencies to target and defund. Further, the president himself would have the ability to oversee programming directly: “Current law in the Foreign Assistance Act gives the President broad authority to set ‘such terms and conditions as he may determine’ on foreign assistance, which legally empowers the next conservative President to expand this pro-life policy.” Previous administrations have restricted funding to organizations that provide abortions (the “Mexico City Policy”), which resulted in an increase in maternal and child mortality and unsafe abortions—exactly what the policy claimed to want to prevent. In sub-Saharan Africa, data shows the policy increased abortions by defunding clinics that provided family-planning services. The first Trump administration expanded restrictions further, impacting speech and service delivery around the world.
A Trump USAID would not only stop funding local partner organizations that support gender, LGBTQ, and rights agendas but redirect that money to religious organizations. In fact, it would mandate training and indoctrination for all USAID staff on the link between religion and development. USAID would also ensure conservative oversight of all grantmaking to ensure against “progressive policies” and a “radical agenda.” USAID already engages with faith-based partnerships, alongside secular NGOs, but Project 2025 would like to shift the balance, creating a “New Partnership Initiative” that would help prioritize religious groups.
A stated “key outcome of the transformation of USAID” under Trump will be a complete revamp of the Bureau for Democracy, Development, and Innovation, shifting its focus to trade, the private sector, and religious communities, and purging staff. Importantly, all directors of each center—not just the assistant administrator—will have political leadership, not career experts. In addition, Trump’s USAID will rewrite all policy “as soon as possible” to ensure a conservative agenda.
During the first Trump administration, I felt the impact in my work overseas. I worked closely with the LGBTQ community in Georgia, which faced horrific obstacles—ostracization, violence, homelessness—and which was targeted relentlessly by Kremlin information operations. USAID has long been a defender of human rights and funded projects on these issues. There was a shift under Trump, though I applaud individual USAID employees for creatively trying to find workarounds and continue support—like slight renaming of initiatives or cleverly filing them under more favorable, broader categories like “human rights.” They no doubt prevented damaging cuts to our important work.
I am far more worried about the impact of a second administration. Back then, there was no concrete, detailed roadmap like Project 2025 and no massive replacement of foreign aid professionals with conservative political operatives. In a second administration, Trump plans a sweeping political takeover of our civil service under Schedule F: stripping civil servants of protections, forcing them to implement his political agenda, and giving the president unilateral power to fire employees at will.
The organization I now work for, the German Marshall Fund, supports hundreds of civil society organizations across the Balkans, Black Sea region, Ukraine, and Central Europe—thanks to more than a decade of USAID support. USAID has encouraged our goals of promoting democracy; bolstering the rights of women, LGBTQ, and other marginalized communities; and deterring illiberalism through independent media, watchdog organizations, and information integrity efforts. We do this through grantmaking, capacity-building and technical assistance, leadership programs, and policy dialogues.
With democracy in global decline and illiberal strongmen on the rise, we need these efforts more than ever. Backsliding elsewhere affects democracy everywhere. America benefits from strong, free, liberal societies—it is in our national interest and key to our global security and order. While few voters go to the polls with foreign aid on their minds, the consequences for millions of people worldwide are on the ballot this November.
beardedmrbean · 5 months ago
State-backed North Korean hackers have stolen $1.5bn (£1.2bn) of cryptocurrency in the largest heist in history.
Agents from Pyongyang were able to breach the systems of Dubai-based exchange Bybit to steal the digital coin Ether, according to security analysts.
The hackers stole more cryptocurrency in one attack than all the funds stolen by North Korean cyber criminals in 2024, when the rogue state’s cyber attackers made off with around $1.3bn in digital coins, according to cryptocurrency analysts Chainalysis.
The $1.5bn total eclipses the largest known bank theft of all time, when Saddam Hussein stole $1bn from the Iraqi central bank ahead of the Iraq War in 2003.
The record haul comes as Kim Jong-un, North Korea’s supreme leader, turns to elite units of computer hackers to prop up the Communist dictatorship’s failing economy.
Chainalysis said the attack served as a “stark reminder” of the advanced tactics employed by the country’s hackers. As well as technical skills, North Korean hackers are adept at what is known as “social engineering”: manipulating people to do what they want in order to pave the way for a heist.
This can involve developing relationships with targets over email and digital chats, sometimes over a period of months.
Cyber security experts believe North Korea’s notorious Lazarus Group are the masterminds behind the latest attack. The group has terrorised Western businesses for more than a decade with a series of cyber breaches that have caused billions of dollars in losses.
Elliptic, a cryptocurrency analysis business, said the hacking group was the “most sophisticated and well-resourced launderer of cryptoassets in existence”.
The group is believed to be part of North Korea’s intelligence agency, the Reconnaissance General Bureau. It has been linked to past attacks including the hack of Sony in 2014, when the group leaked private emails from executives in an attempt to block the release of the comedy film The Interview, which lampooned North Korea’s supreme leader.
Lazarus Group has also been blamed for a near-$1bn heist from a Bangladeshi bank in 2016 and the global Wannacry cyber attack, which knocked hundreds of thousands of computers offline with damaging ransomware, including NHS systems.
While Pyongyang once relied on its elite hacking cadres to conduct espionage or steal trade secrets, increasingly they have been employed as a weapon of economic warfare to bolster the coffers of the heavily sanctioned regime.
“North Korea started using cyber attacks for espionage, stealing R&D and intellectual property,” said Rafe Pilling, of the cyber security company Secureworks. “Subsequently, they have really capitalised on it as a source of revenue.”
A Soviet-style focus on science and technology has created a “whole education pipeline” for future cyber experts, said Mr Pilling. North Korean science prodigies are identified from a young age, before being pushed to compete in international maths and programming competitions.
The country’s hackers are prolific. In 2024, they made off with approximately 61pc of the $2.2bn of cryptocurrency stolen globally, according to Chainalysis. Including last week’s attack, North Korean hackers have stolen upwards of $6bn in cryptocurrency over the last decade.
The thefts offer a substantial boost to the nation’s beleaguered economy and help support its military spending, including its ballistic missile programme. North Korea’s GDP is estimated at just $28bn and it is heavily reliant on agriculture and trade with its main ally, China.
While most members of Lazarus Group are unknown, the US has issued indictments against several North Korean military figures it believes are linked to the group.
North Korea relies on a range of hacking techniques, from exploiting so-called “zero day” vulnerabilities, previously unknown flaws that allow attackers to break into IT systems, to planting fake remote-working contractors inside US companies.
Cryptocurrency analysis companies including Arkham Intelligence and Elliptic identified Lazarus Group as the likely Bybit hackers. Researchers were able to trace the digital wallets that were used by the hackers to quickly launder their funds, which are recorded on the “blockchain” technology used by the cryptocurrency industry.
Some of the funds moved through wallets believed to be associated with past North Korean hacking attacks. TRM, a cyber security company, said there were “substantial overlaps observed between addresses controlled by the Bybit hackers and those linked to prior North Korean thefts”.
The North Korean hackers were able to steal the huge crypto haul through a multi-layered and long-planned attack, according to Chainalysis. Hackers gained access to Bybit’s internal systems using a so-called “phishing” email, which prompted an employee to enter their login details into a seemingly legitimate website that was actually compromised.
The hackers were then able to gain access to a so-called “cold wallet” – a supposedly secure cryptocurrency storage device that holds coins offline and away from the internet. When Bybit came to transfer funds from the offline wallet to its online systems, the hackers sabotaged the transfer and stole the funds.
Within minutes the hackers had fed them through a series of other wallets and digital currency exchanges, attempting to obscure their origin by trading them for other coins or passing them through trading houses with no customer checks.
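The tracing work described here is essentially a graph-traversal problem: starting from a known hacker wallet, analysts flag every address reachable through outgoing transfers. A minimal sketch of that idea in Python — all addresses and transfers below are invented for illustration; real chain-analysis firms work from actual on-chain transaction data:

```python
from collections import deque

# Hypothetical transaction graph: sender -> list of recipients.
# Every name here is made up for the example.
transfers = {
    "hacker_wallet": ["mixer_1", "exchange_A"],
    "mixer_1": ["wallet_x", "wallet_y"],
    "wallet_x": ["exchange_B"],
}

def tainted_addresses(source, graph):
    """Breadth-first search: flag every address reachable from `source`."""
    seen, queue = {source}, deque([source])
    while queue:
        addr = queue.popleft()
        for nxt in graph.get(addr, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(sorted(tainted_addresses("hacker_wallet", transfers)))
# → ['exchange_A', 'exchange_B', 'hacker_wallet', 'mixer_1', 'wallet_x', 'wallet_y']
```

In practice the hard part is not the traversal but the attribution — linking wallet addresses to known actors, which is where the “substantial overlaps” with prior North Korean thefts come in.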
The nature of the cryptocurrency industry, which is virtually unregulated, has made it a haven for cyber attackers to launder funds. Chainalysis said it had worked with exchanges to freeze $40m in funds stolen from Bybit, but far more remained unaccounted for.
North Korea’s hackers are showing no signs of slowing down. According to Chainalysis, its attackers are getting “better and faster at massive exploits”.
North Korea’s cyber prowess allows it to be a “major player even if in the real world they are highly isolated,” Mr Pilling said.
Bybit has said it has “more than enough” assets to cover its losses and insisted the hack was an “isolated incident”.
intensifyre · 3 months ago
Anil Ambani group stock Reliance Power gains 4% on ₹2,000 crore solar power project from Bhutan.
Reliance Power share price hogged the limelight in trade on Monday, May 19, after the company announced entering into an agreement with Green Digital Private Limited, owned by the government of Bhutan, to develop the country’s largest 500 MW solar power project through a 50:50 joint venture.
Following the project win, Reliance Power share price jumped nearly 4% in intraday trade today.
Reliance Power signs PPA with Bhutan govt
Reliance Power, in an exchange filing earlier today, said it has signed a commercial term sheet for a long-term power purchase agreement (PPA) with Green Digital Private Limited (GDL), owned by Druk Holding and Investments Limited (DHI), the investment arm of the Royal Government of Bhutan.
Under the agreement, Reliance Power and DHI will jointly develop Bhutan’s largest solar power project through a 50:50 venture, with an installed capacity of 500 MW. The project entails a capital outlay of up to ₹2,000 crore under Build-Own-Operate (BOO) model, representing the largest private sector foreign direct investment (FDI) in Bhutan’s solar energy sector to date, the Anil Ambani-owned company said in a filing.
Reliance Power’s total clean energy pipeline stands at 2.5 GWp of solar and over 2.5 GWh of BESS, making it India’s largest player in the integrated Solar + BESS segment, the company claimed.
The project, which will be implemented in phased tranches over the next 24 months, is expected to redefine Bhutan’s solar generation capacity, surpassing all current solar installations.
“The landmark solar investment in Bhutan underscores Reliance Group’s strategic focus on expanding its renewable energy portfolio, while reinforcing its long-term commitment to strengthening India-Bhutan economic cooperation,” Reliance Power said in a filing today.
Reliance Power stock: Should you buy?
Reliance Power share price rose as much as 3.6% to the day’s high of ₹46.72 on the BSE today. With today’s gains, Reliance Power stock is up almost 16% in May. Meanwhile, in the last one year, the Anil Ambani group stock has gained as much as 77%.
According to Anshul Jain, Head of Research at Lakshmishree Investments, RPower is forming a 34-week-long flat base on the weekly charts, showing signs of bullish consolidation. Volume behaviour supports accumulation, with higher volumes on up weeks and lower volumes on down weeks, he said.
“The daily, weekly, and monthly moving averages are aligned positively, reinforcing the bullish structure. A breakout above the resistance level of 48.5 will confirm the pattern and is likely to propel the stock towards the 65 zone in the short term,” Jain added.
Reliance Power recently reported a strong set of results for the January-March quarter of the financial year 2024–25 (Q4 FY25).
Reliance Power posted a consolidated net profit of ₹126 crore in the January-March quarter of FY25 due to lower expenses. The company had reported a loss of ₹397.56 crore in the quarter ended on March 31, 2024, a regulatory filing showed.
Total income dipped to ₹2,066 crore in the fourth quarter from ₹2,193.85 crore in the same period a year ago.
“Investments in the securities market are subject to market risks.”
Intensify Research ensures that you stay ahead of upcoming market conditions, economic reforms, and the latest market trends. Our dedicated team works tirelessly with innovative ideas and expert insights, focusing on identifying profit-making stocks. With a strong emphasis on data-driven analysis and market foresight, we empower investors to make informed decisions and maximize their returns.
govindhtech · 4 months ago
Google Cloud’s BigQuery Autonomous Data To AI Platform
BigQuery automates data analysis, transformation, and insight generation using AI. AI and natural language interaction simplify difficult operations.
Today’s fast-paced world needs ready data access and a real-time “data activation flywheel”. Artificial intelligence that integrates directly into the data environment and works with intelligent agents is emerging; these catalysts open doors and enable self-directed, rapid action, which is vital for success. This flywheel uses Google’s Data & AI Cloud to activate data in real time. Thanks to this emphasis, BigQuery serves five times more organisations than the two leading cloud providers that offer only data science and data warehousing solutions.
Examples of top companies:
With BigQuery, Radisson Hotel Group enhanced campaign productivity by 50% and revenue by over 20% by fine-tuning the Gemini model.
By connecting over 170 data sources with BigQuery, Gordon Food Service established a scalable, modern, AI-ready data architecture. This improved real-time response to critical business demands, enabled complete analytics, boosted client usage of their ordering systems, and offered staff rapid insights while cutting costs and boosting market share.
J.B. Hunt is revolutionising logistics for shippers and carriers by integrating Databricks into BigQuery.
General Mills saves over $100 million using BigQuery and Vertex AI to give workers secure access to LLMs for structured and unstructured data searches.
Google Cloud is unveiling many new features with its autonomous data to AI platform powered by BigQuery and Looker, a unified, trustworthy, and conversational BI platform:
New assistive and agentic experiences based on your trusted data and available through BigQuery and Looker will make data scientists, data engineers, analysts, and business users' jobs simpler and faster.
Advanced analytics and data science acceleration: Along with seamless integration with real-time and open-source technologies, BigQuery AI-assisted notebooks improve data science workflows and BigQuery AI Query Engine provides fresh insights.
Autonomous data foundation: BigQuery can collect, manage, and orchestrate any data with its new autonomous features, which include native support for unstructured data processing and open data formats like Iceberg.
Look at each change in detail.
User-specific agents
Google believes everyone should have access to AI. BigQuery and Looker already made AI-powered assistive experiences generally available, and Google Cloud now offers specialised agents for all data chores, such as:
Data engineering agents integrated with BigQuery pipelines help create data pipelines, convert and enhance data, discover anomalies, and automate metadata development. These agents provide trustworthy data and replace time-consuming and repetitive tasks, enhancing data team productivity. Data engineers traditionally spend hours cleaning, processing, and confirming data.
The data science agent in Google's Colab notebook enables model development at every step. Scalable training, intelligent model selection, automated feature engineering, and faster iteration are possible. This agent lets data science teams focus on complex methods rather than data and infrastructure.
Looker conversational analytics lets everyone utilise natural language with data. Expanded capabilities provided with DeepMind let all users understand the agent's actions and easily resolve misconceptions by undertaking advanced analysis and explaining its logic. Looker's semantic layer boosts accuracy by two-thirds. The agent understands business language like “revenue” and “segments” and can compute metrics in real time, ensuring trustworthy, accurate, and relevant results. An API for conversational analytics is also being introduced to help developers integrate it into processes and apps.
In the BigQuery autonomous data to AI platform, Google Cloud introduced the BigQuery knowledge engine to power assistive and agentic experiences. It models data associations, suggests business vocabulary words, and creates metadata instantaneously using Gemini's table descriptions, query histories, and schema connections. This knowledge engine grounds AI and agents in business context, enabling semantic search across BigQuery and AI-powered data insights.
All customers may access Gemini-powered agentic and assistive experiences in BigQuery and Looker without add-ons in the existing price model tiers!
Accelerating data science and advanced analytics
BigQuery autonomous data to AI platform is revolutionising data science and analytics by enabling new AI-driven data science experiences and engines to manage complex data and provide real-time analytics.
First, AI improves BigQuery notebooks. It adds intelligent SQL cells to your notebook that can merge data sources, comprehend data context, and make code-writing suggestions. It also uses native exploratory analysis and visualisation capabilities for data exploration and peer collaboration. Data scientists can also schedule analyses and update insights. Google Cloud also lets you construct notebook-driven, dynamic, user-friendly, interactive data apps to share insights across the organisation.
This enhanced notebook experience is complemented by the BigQuery AI query engine for AI-driven analytics. This engine lets data scientists easily manage structured and unstructured data and add real-world context—not simply retrieve it. BigQuery AI co-processes SQL and Gemini, adding runtime verbal comprehension, reasoning skills, and real-world knowledge. The new engine can, for example, process unstructured photographs and match them to your product catalogue. It supports several use cases, including model enhancement, sophisticated segmentation, and new insights.
Additionally, it provides users with the most cloud-optimized open-source environment. Google Cloud’s managed service for Apache Kafka enables real-time data pipelines for event sourcing, model scoring, communications, and analytics, while BigQuery offers serverless Apache Spark execution. Customers have almost doubled their serverless Spark use in the last year, and Google Cloud has upgraded this engine to process data 2.7 times faster.
BigQuery lets data scientists utilise SQL, Spark, or foundation models on Google's serverless and scalable architecture to innovate faster without the challenges of traditional infrastructure.
An independent data foundation throughout data lifetime
An independent data foundation created for modern data complexity supports its advanced analytics engines and specialised agents. BigQuery is transforming the environment by making unstructured data first-class citizens. New platform features, such as orchestration for a variety of data workloads, autonomous and invisible governance, and open formats for flexibility, ensure that your data is always ready for data science or artificial intelligence issues. It does this while giving the best cost and decreasing operational overhead.
For many companies, unstructured data is their biggest untapped potential. While structured data offers clear analytical avenues, valuable insights in text, audio, video, and photographs often sit underutilised in siloed systems. BigQuery directly tackles this issue by making unstructured data a first-class citizen through multimodal tables (preview), which integrate structured data with rich, complex data types for unified querying and storage.
Google Cloud’s expanded BigQuery governance gives data stewards and professionals a single view to manage discovery, classification, curation, quality, usage, and sharing, including automatic cataloguing and metadata generation, helping them manage this large data estate efficiently. BigQuery continuous queries use SQL to analyse and act on streaming data regardless of format, ensuring timely insights from all your data streams.
Customers use Google’s AI models in BigQuery for multimodal analysis 16 times more than last year, driven by advanced support for structured and unstructured multimodal data. BigQuery with Vertex AI is 8–16 times cheaper than independent data warehouse and AI solutions.
Google Cloud maintains an open ecosystem. BigQuery tables for Apache Iceberg combine BigQuery’s performance and integrated capabilities with the flexibility of an open data lakehouse, connecting Iceberg data to SQL, Spark, AI, and third-party engines in an open and interoperable fashion. This service provides adaptive and autonomous table management, high-performance streaming, auto-generated AI insights, practically infinite serverless scalability, and improved governance. In this managed solution, Cloud Storage enables fail-safe features and centralised, fine-grained access control.
Finally, the autonomous data to AI platform optimises itself, scaling resources, managing workloads, and ensuring cost-effectiveness. The new BigQuery spend commit unifies spending across the BigQuery platform and allows flexibility to shift spend across streaming, governance, data processing engines, and more, making purchasing simpler.
Start your data and AI adventure with BigQuery data migration. Google Cloud wants to know how you innovate with data.
lethimfertilise · 7 months ago
About A Blue Stream
Ask me what dominates conversations as we step into 2025, and I’d answer without hesitation: gas. The topic has been dissected in detail by industry experts and news outlets, underscoring its critical importance. But beyond the headlines and analyses, a few lesser-discussed developments stand out, and they’re worth exploring.
The cracks in Iranian urea production started showing in December, as Iran’s gas shortages disrupted urea production. What followed was a ripple effect: Turkey ramped up purchases, creating unexpected openings for Russian suppliers to bypass Indian tenders and secure better deals elsewhere. Egyptian producers weren’t far behind, leveraging the same dynamics to their advantage. 
The end of Russian gas transit through Ukraine. Despite nearly three years of war, Russian gas continued to flow to the EU via Ukraine’s pipelines, a lifeline for many. But that era is now over, leaving all sides grappling with the fallout: 
Russia: Stuck with an oversupply of gas and limited capacity to develop new markets.  
Ukraine: Losing a critical source of transit revenue and facing rising energy costs in neighbouring Moldova and Romania. To complicate matters, the pipeline system could now become a target for Russian missiles. What’s next? A potential sale of Ukraine’s gas infrastructure, including its storage facilities, to an external player. 
The EU: Confronting intensified energy insecurity. European TTF prices have already soared to $15/MMBtu—almost double the price from a year ago.
Amid this upheaval, China emerges as the clear winner. With few alternative buyers for Russian gas, China is positioned to capitalise, potentially securing supplies on highly favourable terms. 
Meanwhile, whispers of a Qatari-Turkish pipeline through Syria were met with a firm denial from Qatar. But I can’t shake the feeling that this story is far from over. Keep an eye on this development—it may resurface in unexpected ways.
#imstory #fertilisers #fertilizers #gas #urea #market #china #iran #russia #ukraine #europe #qatar #syria #turkey #analysis 
canmom · 1 year ago
(Me again! Previously I had bothered you in DMs about an article, but figured it might be better to send an ask in this case.) On the topic of environmental concerns, I did have a question about James Hansen's 'Global Warming in the Pipeline' which was published last year. A previous (and rather bleak) Medium article you analyzed had cited this particular paper as proof that we're on track to exceed 3C in our lifetimes, even if emissions were to suddenly halt today. https://pubs.giss.nasa.gov/abs/ha09020b.html Since this paper has now passed peer review, what exactly does this mean in simplistic terms? I understand this means that the climate scientists that have analyzed the paper agree with what it states (and see no issues with its logic), but does it actually mean we'll reach 4C by 2100? Or have I misunderstood what this is stating? The only way I see this not being the case is if somehow Hansen's paper later turns out to be incorrect (which seems unlikely).
I also understand that the paper heavily advocates for a level of geoengineering, which I think is a better alternative to letting a large majority of people suffer, but I'm not sure if you have any opinions on when you think that'd be best to do.
oooh, i've put off answering this because it's perhaps a bit above my pay grade, but let's see
so as far as passing peer review - it's hard to say how robust that is in terms of whether you should believe its conclusions. it depends a lot on the field, the reviewers, and so on - papers are retracted frequently, even if the initial round of reviewers advised to publish.
in climate science we are engaged in a spectacularly difficult modelling task. this paper also speaks on a pretty broad range of subjects. let me quote the full abstract, adding some paragraph breaks:
Improved knowledge of glacial-to-interglacial global temperature change yields Charney (fast-feedback) equilibrium climate sensitivity 1.2±0.3°C (2σ) per W/m2, which is 4.8°C±1.2°C for doubled CO2. Consistent analysis of temperature over the full Cenozoic era — including 'slow' feedbacks by ice sheets and trace gases — supports this sensitivity and implies that CO2 was 300-350 ppm in the Pliocene and about 450 ppm at transition to a nearly ice-free planet, exposing unrealistic lethargy of ice sheet models. Equilibrium global warming for today's GHG amount is 10°C, which is reduced to 8°C by today's human-made aerosols. Equilibrium warming is not 'committed' warming; rapid phaseout of GHG emissions would prevent most equilibrium warming from occurring. However, decline of aerosol emissions since 2010 should increase the 1970-2010 global warming rate of 0.18°C per decade to a post-2010 rate of at least 0.27°C per decade. Thus, under the present geopolitical approach to GHG emissions, global warming will exceed 1.5°C in the 2020s and 2°C before 2050. Impacts on people and nature will accelerate as global warming increases hydrologic (weather) extremes. The enormity of consequences demands a return to Holocene-level global temperature. Required actions include: (1) a global increasing price on GHG emissions accompanied by development of abundant, affordable, dispatchable clean energy, (2) East-West cooperation in a way that accommodates developing world needs, and (3) intervention with Earth's radiation imbalance to phase down today's massive human-made 'geo-transformation' of Earth's climate. Current political crises present an opportunity for reset, especially if young people can grasp their situation.
As I've split it, the first paragraph is a quantitative statement about equilibrium warming, which is the paper's scientific contribution. The second paragraph adds some qualifiers about the expected trajectory "under the present geopolitical approach". The third para is a political argument - a 'what is to be done' type statement.
That's a lot to cover in one paper! It also invites different kinds of approaches to peer review. A scientist reviewing the first half of this paper would be making a technical analysis: do Hansen et al look at the right data, analyse it rigorously, etc. etc.
Why is this all so complicated? Well, lots of things change on Earth when it gets hotter and colder. The amount of cloud coverage, the amount of ice, the way the oceans mix hot and cold water, etc. etc., the amount of dust and soot in the air from forest fires - all of this affects how much energy comes into the atmosphere, how much gets reflected into space, etc etc.
The main things that the paper talks about are...
the equilibrium climate sensitivity: basically, if you add a bunch of extra energy to the system (what climate scientists call 'forcing'), once everything settles down, what temperature do you end up at, per unit of forcing?
the speed of various feedbacks - how quickly the clouds, ice, etc. etc. change in response to the forcing, which determines how quickly you approach this final equilibrium temperature. Knowing which feedbacks are fast and slow is important since it tells us what we can expect to happen when we cut CO2 emissions.
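Since equilibrium warming is linear in the forcing, the abstract's headline numbers can be checked with a couple of lines of arithmetic. A quick sketch (the ~4 W/m² doubled-CO2 forcing is implied by the abstract's own figures rather than stated in this post):

```python
# Equilibrium warming is linear in the forcing: dT = sensitivity * F.
sensitivity = 1.2         # °C per W/m^2, Hansen et al.'s central estimate
sensitivity_2sigma = 0.3  # °C per W/m^2, their quoted 2-sigma uncertainty
f_2xco2 = 4.0             # W/m^2: doubled-CO2 forcing implied by the abstract

dt_2xco2 = sensitivity * f_2xco2             # 4.8 °C
dt_2xco2_err = sensitivity_2sigma * f_2xco2  # 1.2 °C
print(f"{dt_2xco2:.1f} ± {dt_2xco2_err:.1f} °C for doubled CO2")
# matches the abstract's "4.8°C±1.2°C for doubled CO2"
```

The same linearity is why the debate centres on the sensitivity itself: every revision to that one number rescales all the projected equilibrium temperatures with it.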
It's naturally a pretty involved discussion and I don't pretend to have the background to follow all the ins and outs of it, but Hansen et al. use various lines of evidence to try to assess these parameters, see how they affect climate models, and the like. They perform an analysis of how temperature and estimated CO2 varied during the Cenozoic era, and there's a section on estimating the effects of aerosols, both natural and human-made.
On the subject of aerosols, Hansen et al. suggest that previous climate models may have made two mistakes that cancelled each other out:
Recent global warming does not yield a unique ECS [Equilibrium Climate Sensitivity] because warming depends on three major unknowns with only two basic constraints. Unknowns are ECS, net climate forcing (aerosol forcing is unmeasured), and ocean mixing (many ocean models are too diffusive). Constraints are observed global temperature change and Earth’s energy imbalance (EEI) [80]. Knutti [150] and Hansen [75] suggest that many climate models compensate for excessive ocean mixing (which reduces surface warming) by using aerosol forcing less negative than the real world, thus achieving realistic surface warming.
What they're saying here is, though we have a pretty good idea of how much CO2 we put in the atmosphere, since we don't have a good measure of aerosols we don't actually know for sure how much energy humans were adding to the atmosphere. Like, CO2 adds energy, but sulfur dioxide reflects it away.
There's three unknown parameters here, and two constraints (things we can calculate for definite). We use a model to tell us one of those unknowns (the ocean stuff), and that allows us to tune the effect of aerosols until our model Earth matches our measurements of the real Earth. But, if our ocean model is wrong, then we end up wrongly estimating the effect of aerosols.
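A toy transient energy balance makes the compensation concrete. In the standard linear framework, transient warming is roughly dT = F / (lambda + kappa), where F is the net forcing (GHG plus aerosol), lambda the climate feedback parameter, and kappa the ocean heat-uptake efficiency. Two models with very different ocean mixing and aerosol forcing can then reproduce the same observed surface warming. All numbers below are invented for illustration, not taken from the paper:

```python
# Toy transient energy balance: dT = F_net / (lambda_ + kappa).
#   lambda_ : climate feedback parameter (W/m^2 per °C)
#   kappa   : ocean heat-uptake efficiency (W/m^2 per °C)
def transient_warming(f_ghg, f_aerosol, lambda_, kappa):
    return (f_ghg + f_aerosol) / (lambda_ + kappa)

f_ghg = 3.0  # W/m^2: greenhouse-gas forcing, comparatively well measured

# Model A: excessive ocean mixing, aerosol forcing only weakly negative
dt_a = transient_warming(f_ghg, f_aerosol=-0.5, lambda_=1.0, kappa=1.0)
# Model B: sluggish ocean mixing, aerosol forcing strongly negative
dt_b = transient_warming(f_ghg, f_aerosol=-1.0, lambda_=1.0, kappa=0.6)

print(dt_a, dt_b)  # both 1.25 °C: surface warming alone can't tell them apart
```

This is the compensation Hansen describes: model A's extra ocean mixing hides warming, and its less-negative aerosol forcing puts it back, so both models match the surface temperature record while disagreeing about the real aerosol effect.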
The upshot is that aerosols have been a bigger deal than we thought, and as the world cleans up the atmosphere and reduces the amount of aerosols, the rate of warming will increase. It's definitely plausible - but it's such a complicated system that there could easily be some other nuance here.
I won't try to summarise every point in the paper but it's that kind of thing that they're arguing about here. This isn't a mathematical proof, though! Since it's touching on a huge range of different parameters, trying to draw together lots of different lines of evidence, there is still a fair bit of room for nuance. It's not so simple as 'Hansen et al. are right' or 'Hansen et al. are wrong' - they could be wrong about one thing and right about another.
To say they've passed peer review is to say that they've done as reasonable a job as anyone can expect to try and figure out this kind of messy problem. However, other scientists may still take issue with one or another claim. It's not as definitive as a maths paper.
That said, Hansen's arguments all seem pretty plausible to me. The tools he uses to assess this situation are sensible and he talks about cases where things weren't as expected (he thought that improved climate models would change in a different way, and they didn't). But while I know enough about the subject to be able to largely follow what he's saying, I'm not confident saying whether he's right.
The second half takes on a different tone...
This section is the first author’s perspective based on more than 20 years of experience on policy issues that began with a paper [179] and two workshops [180] that he organized at the East-West Center in Hawaii, followed by meetings and workshops with utility experts and trips to more than a dozen nations for discussions with government officials, energy experts, and environmentalists. The aim was to find a realistic scenario with a bright energy and climate future, with emphasis on cooperation between the West and nations with emerging or underdeveloped economies.
So this is more of a historical, political analysis section, addressing why we are on this trajectory and why scientists may be institutionally underestimating the threat ('scientific reticence', 'gradualism' and so on). Well, more precisely, it's a polemic - a scientifically informed polemic, but this is basically an editorial stapled to the science part of the paper.
This includes an account of how a previous paper ('Ice Melt') led by Hansen was reviewed and sidelined by other scientists, for what Hansen considers unsound reasons. It leads into something of an impassioned plea by Hansen addressed at his fellow scientists, complete with rhetorical questions:
Climate science reveals the threat of being too late. ‘Being too late’ refers not only to warning of the climate threat, but also to technical advice on policy implications. Are we scientists not complicit if we allow reticence and comfort to obfuscate our description of the climate situation? Does our training, years of graduate study and decades of experience, not make us well-equipped to advise the public on the climate situation and its policy implications? As professionals with deep understanding of planetary change and as guardians of young people and their future, do we not have an obligation, analogous to the code of ethics of medical professionals, to render to the public our full and unencumbered diagnosis? That is our objective.
This leads into Hansen's proposal for how to get out of this mess: a price on carbon dioxide, nuclear power, and rushing to research geoengineering such as spraying salt water in the air. And then e.g. specific political proposals, like 'a political party that takes no money from special interests', ranked choice voting and so on.
Naturally this is a lot harder to take technical issue with. It's more like an editorial. As a reviewer you'd probably say it's worth publishing because it's well argued, etc. etc., without necessarily agreeing with every one of Hansen's proposals. You can say 'that obviously wouldn't work' and so on, but it's a different kind of argument.
So re your questions:
does it actually mean we'll reach 4C by 2100?
If Hansen et al. are right, the IPCC reports are underestimating the equilibrium we approach for the current amount of CO2 in the atmosphere - which would lead to 2°C well before 2050, so 4°C by 2100 seems plausible (I didn't spot a timeline that goes that far in the paper when I skimmed through but I could have missed it).
This isn't the amount of warming that will happen, because the Earth has many systems which gradually scrub CO2 from the atmosphere. If we stopped pumping out CO2 suddenly, the amount of CO2, and the amount of extra energy it adds, would gradually decline. So we wouldn't necessarily approach that equilibrium. On the other hand, the amount of CO2 forcing is only going up as things currently stand - and if the amount of forcing stayed the same, Hansen says it would eventually deglaciate Antarctica, leading to over 10°C of warming.
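To see why the higher sensitivity matters, here's some back-of-envelope arithmetic - my own illustration with round numbers, not a calculation from the paper. The CO2-forcing formula (5.35 times the log of the concentration ratio) is a standard approximation, and the two ECS values are stand-ins for the IPCC central estimate and Hansen et al.'s higher one.

```python
import math

F_2x = 3.7                          # W/m^2 forcing per CO2 doubling (round value)
F_now = 5.35 * math.log(420 / 280)  # forcing from preindustrial 280 ppm to ~420 ppm

def equilibrium_warming(ecs):
    """Eventual warming if forcing were held fixed at F_now."""
    return ecs * F_now / F_2x

print(f"CO2 forcing today ~ {F_now:.2f} W/m^2")
print(f"ECS 3.0 degC/doubling -> {equilibrium_warming(3.0):.1f} degC at equilibrium")
print(f"ECS 4.8 degC/doubling -> {equilibrium_warming(4.8):.1f} degC at equilibrium")
```

With the lower sensitivity, the CO2 already in the air corresponds to an equilibrium under 2°C; with the higher one, it's already past 2°C - and that's from CO2 alone, before other greenhouse gases.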
But working out what will actually happen by 2100 depends on a lot of modelling assumptions - how long do you assume we keep pumping out CO2? Hansen addresses this when talking about the subject of 'committed warming':
‘Committed warming’ is less precisely defined; even in the current IPCC report [12] (p. 2222) it has multiple definitions. One concept is the warming that occurs if human-made GHG emissions cease today, but that definition is ill-posed as well as unrealistic. Do aerosol emissions also cease? That would cause a sudden leap in Earth’s energy imbalance, a ‘termination shock,’ as the cooling effect of human-made aerosols disappears. A more useful definition is the warming that will occur with plausibly rapid phasedown of GHG emissions, including comparison with ongoing reality. However, the required ‘integrated assessment models,’ while useful, are complex and contain questionable assumptions that can mislead policy (see Perspective on policy implications section).
So, will we reach 4C by 2100? We can only phrase this question in a conditional way: if we continue to add this much energy, then...
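That conditionality is easy to make concrete with a toy one-box energy-balance model. Again, this is entirely my own sketch with made-up round parameters - the real integrated assessment models Hansen mentions are vastly more complicated - but it shows how the same model gives very different 2100 answers depending on the forcing scenario you feed it.

```python
def warming_in_2100(forcing, years=80, heat_capacity=8.0, lam=1.0, t_now=1.2):
    """Euler-step the one-box energy balance C*dT/dt = F(t) - lam*T,
    starting from today's warming. Units: W·yr/m^2/degC and W/m^2/degC."""
    T = t_now
    for y in range(years):
        T += (forcing(y) - lam * T) / heat_capacity
    return T

held = warming_in_2100(lambda y: 2.6)                # forcing held at today's level
phased = warming_in_2100(lambda y: 2.6 * 0.99 ** y)  # forcing phased down ~1%/yr

print(f"forcing held constant: {held:.1f} degC in 2100")
print(f"forcing phased down:   {phased:.1f} degC in 2100")
```

Same physics, same sensitivity - the 2100 number is mostly a statement about the emissions assumption, not about the climate.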
In practice we will probably end up reducing our emissions one way or another - which is to say, if our present complex societies collapse, they ain't gonna be emitting much carbon anymore...
I also understand that the paper heavily advocates for a level of geoengineering, which I think is a better alternative to letting a large majority of people suffer, but I'm not sure if you have any opinions on when you think that'd be best to do.
The way things are going, I think it's likely that people will try geoengineering when the climate-related disasters really start to ramp up, so whether or not they should ends up kind of beside the point.
Hansen doesn't really advocate a specific programme to pursue - only one paragraph in the whole paper talks about geoengineering:
Highest priority is to phase down emissions, but it is no longer feasible to rapidly restore energy balance via only GHG emission reductions. Additional action is almost surely needed to prevent grievous escalation of climate impacts including lock-in of sea level rise that could destroy coastal cities world-wide. At least several years will be needed to define and gain acceptance of an approach for climate restoration. This effort should not deter action on mitigation of emissions; on the contrary, the concept of human intervention in climate is distasteful to many people, so support for GHG emission reductions will likely increase. Temporary solar radiation management (SRM) will probably be needed, e.g. via purposeful injection of atmospheric aerosols. Risks of such intervention must be defined, as well as risks of no intervention; thus, the U.S. National Academy of Sciences recommends research on SRM [212]. The Mt. Pinatubo eruption of 1991 is a natural experiment [213, 214] with a forcing that reached [30] –3 W/m2. Pinatubo deserves a coordinated study with current models. The most innocuous aerosols may be fine salty droplets extracted from the ocean and sprayed into the air by autonomous sailboats [215]. This approach has been discussed for potential use on a global scale [216], but it needs research into potential unintended effects [217]. This decade may be our last chance to develop the knowledge, technical capability, and political will for actions needed to save global coastal regions from long-term inundation.
He says 'we need to research this more to figure out the risks, since we'll probably have to do it' basically. Climate researchers have historically been reluctant to advocate geoengineering for fear it will be mistaken as a way to solve the climate problem without reducing GHG emissions, so honestly seeing them suggest it now maybe brings to light the atmosphere of desperation in the field.
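A quick scale check on the Pinatubo number quoted above (using my rough round value for today's total GHG forcing, not the paper's): a −3 W/m² aerosol forcing is comparable to the entire greenhouse forcing accumulated since preindustrial times.

```python
F_ghg_total = 3.5   # W/m^2: rough total GHG forcing today (illustrative round value)
F_pinatubo = -3.0   # W/m^2: peak forcing the paper cites for the 1991 eruption

net = F_ghg_total + F_pinatubo
print(f"net forcing under a sustained Pinatubo-scale layer: {net:+.1f} W/m^2")
```

That's why a single eruption measurably cooled the planet - and also why abruptly losing such a layer would be a 'termination shock': the full greenhouse forcing reappears at once.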
Unfortunately, when talking about politics and economics, Hansen is on much less firm ground than when he's picking apart the intricacies of climate feedbacks. He clearly wants to try to discourage doomerism, and he's rightly critical of cap-and-trade and similar schemes, but he has his specific political fixations and what he suggests is all a bit unconvincing as a programme. I don't say this because I've got a better idea, though.
The problem is that the future is really hard to predict. It's bad enough when it's climate systems, but humans are even more complicated little nonlinear freaks. This isn't a problem unique to Hansen's paper. I am pessimistic enough by nature that I don't really trust my ability to predict what we will do when climate change gets more severe. Hopefully by the time we finally decide to stop kicking the can down the road, there will still be something to be done.