hopefulfestivaltastemaker
Michael Goff
85 posts
hopefulfestivaltastemaker · 3 years ago
The Alignment Problem
In artificial intelligence, the alignment problem is the question of how we can ensure that an advanced AI pursues policies that align with the intentions and interests of its human designers.
The world has many problems: climate change, economic stagnation, subreplacement birthrates, democratic backsliding, COVID, and so on. Since artificial general intelligence (AGI) seems far away, AI alignment can seem like a poor research area, detracting effort from more immediate challenges. But the consequences of failing to solve the alignment problem could be dire, and unlike with most other challenges, there might not be a second chance if the problem is not solved properly.
Furthermore, the need for good solutions on alignment may not be as distant as one imagines, or confined entirely to the possibility of developing AGI. Cybersecurity is a growing challenge, made worse by poorly designed software, and it is only a matter of time before there is a 9/11-scale cyber incident. And the alignment of institutions, such as corporations and government programs, poses a structurally similar problem today.
The quintessential formulation of the alignment problem is Nick Bostrom's paperclip maximizer. This thought experiment posits a paperclip manufacturer that assigns an AI the task of maximizing paperclip production. The AI proceeds to buy up whatever resources it can to manufacture paperclips, then turns those pesky humans into paperclips (since the humans might shut off the machine, reducing paperclip production), and eventually builds a fleet of Bracewell probes to convert all matter in nearby galaxies into paperclips.
Now, one might think that this is an unrealistic outcome. Surely any AI advanced enough to perform general intelligence functions would not misunderstand expectations so badly as to convert all humans and nearby galaxies into paperclips.
Melanie Mitchell gave an interesting presentation at the Santa Fe Institute on genetic algorithms a while back, in which she discussed using genetic algorithms to automate bug fixing. One evolved assembly program found the address in memory where the test cases were stored and deleted them, earning a good score in a way that was obviously not intended.
As a research tool for the alignment problem, the Alignment Research Center has posed the Eliciting Latent Knowledge (ELK) problem. The basis of this problem is a SmartVault AI, whose job is to protect a diamond from being stolen by operating a complex hydraulic system to stop any would-be thieves.
The model could be trained with a neural net, Bayes net, or other algorithm, with the hydraulics as the input and whether the diamond was stolen as the output. Whether the diamond was stolen would be determined from a camera. However, a clever thief could tamper with the camera, such as by putting a fake image in front of it, so the AI needs to analyze the full history of the scene. ELK seeks to determine whether the diamond is safe by matching what a human observer would conclude from watching the full movie of what happened.
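To make the camera-tampering issue concrete, here is a toy simulation of my own devising (a drastic simplification, not ARC's formalism). A predictor trained only on the camera feed is systematically fooled, while a reporter with access to the latent tamper state is not; eliciting that latent state is the point of ELK. All the probabilities are arbitrary.

```python
import random

random.seed(0)

def episode():
    diamond = random.random() < 0.5          # is the diamond actually there?
    tampered = random.random() < 0.2         # thief placed a fake image on the lens
    camera = True if tampered else diamond   # tampering always shows a diamond
    return diamond, tampered, camera

runs = [episode() for _ in range(10_000)]

# A predictor that only trusts the camera is wrong whenever the lens was
# tampered with and the diamond is actually gone (~10% of episodes here).
camera_accuracy = sum(cam == diamond for diamond, _, cam in runs) / len(runs)

# A reporter with access to the latent tamper state can discount the camera
# in exactly those episodes.
def report(diamond, tampered, camera):
    return diamond if tampered else camera   # matches the truth in both branches

latent_accuracy = sum(report(*r) == r[0] for r in runs) / len(runs)
print(camera_accuracy, latent_accuracy)
```

The gap between the two accuracies is the latent knowledge the camera-only predictor cannot surface.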
The paper runs through several scenarios, describing several training strategies (from the builder), followed by counterstrategies (from the breaker) that would foil them. The paper identifies ontology identification (essentially, characterizing the scene in conceptual terms a human might understand) as a critical subproblem.
ELK is not yet solved, but the authors are hopeful that it will be within the next year, or at least that significant progress will be made.
hopefulfestivaltastemaker · 3 years ago
Moving to Substack
I’ve decided to pack up my blog and move it to Substack. The format and content are very similar.
hopefulfestivaltastemaker · 3 years ago
December 12, 2021
My weekly roundup of things I am up to. Topics include the 15 minute city, trip distance and frequency, popularity of housing forms, solar weather and earthquakes, and late socialism.
The 15 Minute City
There is a buzzword going around planning circles now: the 15-minute city (and variants with other amounts of time). This is the idea that all the establishments a person needs should be readily available within walking or biking distance, negating the need for a car or mass transit.
There is some evidence that a mixed use urban design methodology, as opposed to segregating uses by strict zoning, reduces the need for transportation, albeit by modest amounts.
Taken to an extreme, though, this is a bad urban design methodology. I wrote more extensively about this topic a couple weeks ago, so I’ll highlight what I see as the main problems.
First, there is a risk of ghettoization. When a city is divided into largely disconnected neighborhoods, different clusters of people will populate each neighborhood. Inevitably, some groups will have more political clout than others, and they can expect better public investment in infrastructure and more investment by businesses. This is not a hypothetical scenario but the reality of most pre-automobile cities, which becomes apparent if one is willing to look past the picturesque downtown neighborhoods of Paris and Rome.
Alain Bertaud discusses the attempts by the South Korean government to create satellite cities around Seoul. The theory was that these cities would be mostly self-contained, cutting down on the larger metro's congestion problems. It didn't work, because there is no feasible way to match employers and employees at a sub-metro geographic level.
This article highlights Jay Pitter's criticism of the 15-minute city. The criticism, at least as presented in the article, is a bit incoherent, so I will take some liberties to reframe it. To ensure geographic coverage of amenities within walking distance of every point requires a high level of central planning. Beyond the fact that central planning hasn't worked very well in general, and specifically not in urban design, we should expect this process to work against already disadvantaged groups. This point is not that dissimilar to what Ed Glaeser argues about ghettoization.
The value of urban living can be roughly expressed in terms of access, which is density times travel range. The more facilities one has access to, the better quality of life is via agglomeration effects (this is an enormous simplification, but we’ll go with it for now). Both density and travel range have drawbacks, and the two work against each other to some extent. But both are important. An urban planning methodology that works against travel range is likely to result in bad outcomes.
Trip Distance and Frequency
There was a very interesting paper earlier this year, The universal visitation law of human mobility, that finds that the frequency of visits to a location tends to fall off with the square of distance. Counterintuitively (to me at least), this pattern does not seem to depend on what kind of facility it is. The paper is worth a read.
It reinforces the idea that making transportation artificially difficult is an impoverishing act.
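The scaling relation can be sketched in a few lines; the functional form is the paper's, simplified, while the constant of proportionality and the units are arbitrary choices of mine.

```python
# Sketch of the visitation law: the density of visitors who live at distance r
# and visit f times per unit time scales as 1/(r*f)**2. k is arbitrary.

def visitor_density(r_km, f_per_month, k=1000.0):
    return k / (r_km * f_per_month) ** 2

base = visitor_density(1, 1)
# Doubling either the distance or the visit frequency cuts the density
# fourfold, regardless of facility type.
twice_as_far = visitor_density(2, 1)
twice_as_often = visitor_density(1, 2)
```

The symmetry between distance and frequency is the counterintuitive part: halving your visit frequency "costs" the same as living twice as far away.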
Popularity of Housing Forms
Lots of urban design topics this week. Anyway, there was a recent paper, which I discovered through Marginal Revolution’s roundup of links, showing that single family homes are generally much more popular than apartment buildings. As stated in the abstract, “Relative to single family homes, apartments are viewed as decreasing property values, increasing crime rates, lowering school quality, increasing traffic, and decreasing desirability.”
This was a little surprising to me, though in retrospect it shouldn’t have been. I’ve long perceived the anti-development voices at city hall to represent a loud, minority ideology. This paper shows that, no, those derisively referred to as “NIMBYs” are in fact representative of large constituencies, and city leaders respond accordingly.
This reinforces the need in my mind for advocates of density to take negative externalities seriously, as outlined in the abstract. I really do think the common urbanist attitude toward crime, which has crossed the line into denial, is one of the most self-defeating things that the movement does.
There is a place and a need for dense, transit-oriented development. Most cities have dense downtown cores. This is a good thing. Downtown cores have much to offer. But it will never be more than a minority of Americans who live like this, and it is time for urbanists to come to terms with this fact.
Solar Weather and Earthquakes
A few days ago, I was alerted to the claim that charged particles from solar wind can trigger earthquakes. I had never heard this claim before, and I was curious.
Overall, it seems like the evidence of a link is mixed. These articles do find a link, while these articles do not. If there is a link, I would imagine it is a weak one. The mechanism is that the magnetic field induced by the heightened solar wind during storms can induce electrical activity in subsurface rocks, which causes them to expand, increase pressure, and trigger an earthquake. To say that in this scenario, solar storms “cause” earthquakes would be wrong; storms would be at most a minor contributing factor.
There is an individual by the name of Ben Davidson who has made a name for himself promoting the idea that solar storms cause earthquakes. Five minutes of Googling caused my BS detectors on this guy to flash red. Neither Davidson, nor his site, “suspicious0bservers”, should be regarded as credible sources of information. But I don’t think this discredits all investigation into the solar wind/earthquake link.
Late Socialism
“Late socialism” is one of several terms presented in Alexei Yurchak’s book,��Everything Was Forever, Until It Was No More.
Yurchak characterizes late socialism in the Soviet Union (defined as the period from Khrushchev’s secret speech denouncing Stalinism to the beginnings of perestroika and glasnost) as a time in which the meaning behind actions and words gave way to their literal enactment without a sense of meaning. Thus, in a way that would not have been perceptible at the time, the people of the Soviet Union came to lose belief in the system on which the country was predicated.
Most societies struggle in some way with Lefort’s Paradox, the split between ideological enunciation and ideological rule. In the case of the Soviet Union, the contradiction was between the individualist ethic in socialist ideology and the suppression of individualism inherent in a planned economy. Especially during the period of stagnation associated with Brezhnev’s rule, this contradiction became too wide to manage. By the time Gorbachev set about reforms, it was too late.
I only read the first chapter. While the subject and the book itself are interesting, the author is an academic anthropologist, and I found the academic language a bit too impenetrable. I don’t know if I will read any more.
hopefulfestivaltastemaker · 3 years ago
December 5, 2021
My weekly roundup of things I am up to. Topics include elite overproduction, knowledge graphs, and defense and industrial capabilities.
Elite Overproduction
“Elite overproduction” is the idea that wealthy economies nowadays produce many more elites than there are elite jobs available. The idea was introduced by Peter Turchin a few years ago. It’s one of those things that is a big part of the popular discourse, and has influenced my own thinking, but I hadn’t read the paper until recently.
The paper has some “Limits to Growth” vibes to it, in that it uses a lot of mathematics and modeling to present the elite overproduction thesis. Turchin argues that this condition holds now, and that it held at previous times and in other societies that have undergone stress. An example in the paper is elite overproduction prior to the American Civil War, and in other writings, Turchin links elite overproduction to events such as the Glorious Revolution, the French Wars of Religion, and the fall of the Roman Empire (on the latter subject, this talk on Alaric, The Accidental Suicide of the Roman Empire, is informative).
The above analogy carries its downsides too. Inappropriate use of mathematics and modeling can create what Paul Romer described as mathiness, where math serves to confuse rather than elucidate and conveys a false sense of rigor. Many pessimistic scenarios of varying levels of quality are built upon this idea. And in some sense, elite overproduction explains too much, which creates a risk of epistemic closure.
I assume that by “elite”, Turchin is referring to people with university degrees, but I don’t think this is explicit in the paper. That is a major oversight.
Still, we are all familiar with the trope of English and history Ph.D.s working at Starbucks. My own experience on the job market confirms how difficult it is. The idea of elite overproduction is too compelling to ignore.
Of course, the way an issue is framed matters. If a mismatch between qualified elites and job opportunities is identified as the problem, it could be framed as mid-tier underproduction instead. That’s exactly what this article from American Compass does. The article also links employment issues with regional divergence, or regional concentration (the fact that high status employment is increasingly concentrated in a handful of superstar cities), which creates severe problems in the housing market.
Knowledge Graphs
I have long been a proponent of semantic encoding of knowledge, such as through RDF. This article explains why it doesn’t work, at least in the context of biotechnology. I’ve resisted this conclusion so far, but as appealing as the idea is, it may be time to accept the flaws and move on to approaches that have a nonzero chance of working out.
The article is written in the context of KGs for bioinformatics, but I thought about it in the context of environmental policy, and there too the problems become apparent.
Suppose I want to determine algorithmically the truth of a simple statement, such as “solar is cheaper than coal”, the thesis of about a hundred articles on CleanTechnica. It sounds straightforward enough. Find a generally accepted measure of cost, such as levelized cost of electricity, compare the LCOE of solar power and coal power as derived from an authoritative source, and return a result.
But a little thought reveals that it is not so simple. Sources disagree on price, usually not by a whole lot, but enough that judgments are necessary before we pick an “authoritative” source, or maybe a hybrid such as an average of a few results. Are we taking into account environmental externalities, and if so, how are they priced? Is there a regional context? The cost of various types of power, especially solar, varies by region because of climate and weather conditions, supply chain issues, regulation, the local labor market, and a hundred other factors. Are ancillary services, such as load following, voltage regulation, etc., priced, and if so, how? What assumptions are made about capacity factors and interest rates? Is storage priced? Value deflation? Grid reliability?
It requires some expertise in energy markets to even formulate all the questions, let alone to understand the issues well enough to answer them. But worse, every question raised is as complex as the original question comparing electricity costs. If these questions are asked, then they will raise yet more adjacent questions in a never-ending chain. Thus, it appears that the “straightforward” question of “is solar cheaper than coal?” is in fact an AI-complete problem.
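The chain of judgment calls can be made concrete with a simplified LCOE calculation. The formula is the standard ratio of discounted lifetime costs to discounted lifetime output, but every input number below is an assumption chosen purely for illustration, not an authoritative figure.

```python
def lcoe(capex_per_kw, fixed_om_per_kw_yr, fuel_per_mwh,
         capacity_factor, discount_rate, lifetime_yrs):
    """$/MWh: discounted lifetime costs divided by discounted lifetime output."""
    annual_mwh_per_kw = 8760 * capacity_factor / 1000
    costs = capex_per_kw  # overnight capital cost, paid up front
    energy = 0.0
    for year in range(1, lifetime_yrs + 1):
        d = (1 + discount_rate) ** year
        costs += (fixed_om_per_kw_yr + fuel_per_mwh * annual_mwh_per_kw) / d
        energy += annual_mwh_per_kw / d
    return costs / energy

# Under one plausible set of assumptions, solar wins handily...
solar = lcoe(1000, 20, 0, capacity_factor=0.25, discount_rate=0.05, lifetime_yrs=30)
coal = lcoe(3500, 40, 20, capacity_factor=0.55, discount_rate=0.05, lifetime_yrs=40)

# ...but raise the discount rate and assume a cloudier region, and the
# ordering flips. The "right" answer depends on the judgment calls.
solar_alt = lcoe(1000, 20, 0, capacity_factor=0.12, discount_rate=0.12, lifetime_yrs=30)
coal_alt = lcoe(3500, 40, 20, capacity_factor=0.55, discount_rate=0.12, lifetime_yrs=40)
```

Every parameter corresponds to one of the questions above, and none of them has a single correct value, which is exactly why the comparison resists mechanical encoding.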
Of course, there are numerous shortcuts that an expert takes, and indeed must take, to answer the question; it is not necessary to know and invoke the full corpus of human knowledge. But taking those shortcuts requires expert judgment.
Knowledge graphs may still have their place, for example in organizing knowledge, or in giving newcomers to a field a terrain map to get started. But the dream of automated reasoning, outside of a few toy examples, appears to be beyond our grasp until general AI is available, and I have not seen a credible idea of how to build that.
Defense and Industrial Capabilities
That’s the subject of this report.
Having read the foreword only, it seems the thesis of the report is that the United States is dangerously losing the capacity for domestic manufacturing, which imperils the ability to compete geopolitically with China.
Reports of this nature have a self-serving agenda that is impossible to overlook. The Defense Department wants more money, which of course is recommended for key programs. Certain domestic industries such as semiconductor manufacturing like subsidies in the form of protectionism and industrial policy. Is there a good case for these policies when the self-serving aspects are corrected for?
Unlike the authors of this report, I strongly suspect that China is now at or near its peak of geopolitical power. This was my view a few years ago, and since then, the financial crisis triggered by the Evergrande liquidity crisis has made this view more mainstream. China faces severe long term problems in the form of debt and demographic decline (so does the US, but China to a greater degree). Just as fear of Islamic terrorism is losing its salience as a policy motivator in the US, fear of Chinese dominance will recede with changing reality.
The national security argument can be seen as a form of the resilience argument for autarky. I am skeptical, but I still haven’t found a good argument one way or the other that limiting imports makes a country more resistant to supply chain disruptions.
As for industrial policy (here, federal spending on semiconductor fabs and R&D), it is needless to say that there is a history of ... mixed results.
The report’s broader point is that the United States needs to rebuild manufacturing and engineering capacity. There is an element of “truthiness” to this, in that the statement feels true, but there is a need to pin down what exactly this means and how to achieve it.
hopefulfestivaltastemaker · 3 years ago
Progress and Cities
Following are my thoughts on what a progress-oriented view of urbanism can be.
Cities and Civilization
The growth of permanent settlements, be they at Çatalhöyük or even earlier, is so intertwined with our understanding of civilization that the two are practically synonymous.
By clustering large numbers of people in a small area, cities facilitate trade in a way that would be impossible in a purely agrarian society. Trade in turn allows greater specialization, allowing people to become more skilled in a particular area and do what they enjoy and are best at. This is as true today as in neolithic times. Economists formalize this notion as agglomeration economies (studied by many researchers, such as here, here, here, and here). The preceding researchers estimate that for every doubling of a city’s size, per-capita GDP increases by an amount ranging from 7 to 20%.
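To see what the 7-20% range implies when compounded, here is the arithmetic for growing a city of 1 million to 8 million people (three doublings); the population figures are my own illustrative choice.

```python
import math

def agglomeration_gain(pop_start, pop_end, gain_per_doubling):
    """Compounded per-capita GDP gain implied by a per-doubling estimate."""
    doublings = math.log2(pop_end / pop_start)
    return (1 + gain_per_doubling) ** doublings - 1

# Three doublings at the low and high ends of the cited 7-20% range.
low = agglomeration_gain(1e6, 8e6, 0.07)    # about 22.5% higher per-capita GDP
high = agglomeration_gain(1e6, 8e6, 0.20)   # about 72.8% higher
```

Even at the low end, the compounded effect of city size on productivity is substantial.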
Specialization allows cities to present opportunities that are not necessarily captured by GDP statistics: entertainment, cultural amenities, and the spread of ideas.
The Downsides of Cities
The same considerations that allow commerce and ideas to spread in cities also facilitate the spread of negative things: pollution, crime, disease, fires, and so forth. All else being equal, larger cities tend to have more crime, more frequent outbreaks of epidemics, greater noise and air pollution exposure, and more severe traffic congestion.
Agglomeration economies explain why we don’t all live in rural areas, and the downsides of urban growth (agglomeration diseconomies) explain why we don’t all live in a single megacity. Given a society’s technological capability and governance quality, countries tend to exist roughly at an equilibrium point, where the benefits and drawbacks of further urban growth are equal.
However, technology and governance are not static. As industrialization led to the rapid growth of cities (and vice versa), cities were forced to innovate better ways to deal with the downsides. They tamed fire risk through professional fire departments and building codes. Professional police departments helped bring crime under control. Countries dealt with air and water pollution through measures such as the Clean Air Act and, for pollution specifically from cars, through better fuel economy, the phaseout of leaded gasoline, catalytic converters, and now electrification of the fleet.
Progress in managing the downsides of urbanism allows cities to grow larger and better capture the upsides. Despite the dramatic historical progress, there is a long way to go. Some clearly urgent areas include the harms of noise and light pollution; others, such as congestion, are discussed in more detail below.
Less successful are attempts to deal with the downsides of urbanization by reversing the urban trend, such as in various back-to-the-land movements. Despite the futility of such efforts, many development NGOs are still concerned with “uncontrolled urbanization” and “urban sprawl”. Countries will do far better to shape their societies to fit their citizens’ needs, rather than the reverse.
Cities and Transportation
What is a city exactly? My working definition revolves around Marchetti’s Constant: the observation that, from neolithic times to now, the average daily work commute is typically around 30 minutes. A city is thus a daily commutershed around a central business district. In the United States, this definition is closer to that of a metropolitan region than a city in the political sense. Indeed, the political borders of most cities are now fairly arbitrary and have little to do with coherent labor markets.
Importantly, the definition is centered around time, not distance, of commuting. Thus the cohesion of a city as a unified labor market depends greatly on the transportation technology available. In the ancient world, when most people commuted by walking, most cities were limited to about a two mile radius or less. New technologies such as the carriage, mass transit, and above all the automobile greatly expanded this radius. This expansion led to a process of conurbation, where previously distinct cities are linked by transportation into single, cohesive labor markets.
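Back-of-the-envelope arithmetic makes the point: with a roughly 30-minute one-way commute budget, a city's feasible radius scales with travel speed. The speeds below are rough assumptions of mine for illustration.

```python
import math

COMMUTE_HOURS = 0.5  # Marchetti's constant: ~30 minutes one way

speeds_mph = {"walking": 3, "bicycle": 10, "bus": 12, "car": 30}

# Feasible commute radius for each mode.
radius_miles = {mode: mph * COMMUTE_HOURS for mode, mph in speeds_mph.items()}

# The reachable area, and so (crudely) the size of the unified labor market,
# grows with the square of the radius: the car expands it a hundredfold
# relative to walking.
area_sq_miles = {mode: math.pi * r ** 2 for mode, r in radius_miles.items()}
```

The two-mile walking radius of ancient cities falls straight out of the first line, and the hundredfold area gain from the car is why conurbation followed it.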
This process continues with high speed rail, and may proceed further with the development of autonomous vehicles, maglev, flying cars, and the hyperloop, until the world is a single unified labor market: an ecumenopolis.
Zoning
Almost all cities in the world have some sort of zoning, or regulation on what can be built, the shape of what can be built, and where it can be built. Zoning rules exist to deal with very real externalities of urban growth, and any credible alternative solution must deal with the same problems. At the same time, zoning carries significant costs that cry out for better solutions.
Attend any meeting of your city council or planning commission, and when the question of new development comes up, there are a few common objections that citizens are likely to voice. At the top of that list is likely to be issues around parking and congestion.
Parking and congestion in most cities are externalities, in that when a person drives, they impose costs on their fellow citizens that are not accounted for in the transactions. Blocking development is a method, albeit a highly crude method, for internalizing these costs. A better tool would be to privatize and correctly price parking. Cities should repeal parking minimum rules that require new developments to provide a certain number of spaces, and they should stop giving away street parking. This need can be met by private parking lots and garages, and voluntarily by developers. Parking minimum rules carry substantial costs which are passed on to renters and homebuyers.
Several cities, such as London and Stockholm, have implemented congestion pricing systems, where motorists have to pay to use certain roadways based on the cost in the form of congestion they impose on other motorists. Singapore has taken advantage of modern IT tools and built a highly sophisticated congestion pricing system.
The decrease in crime since the 1990s has helped fuel the urban renaissance, and for more than the obvious humanitarian reasons, urbanists should focus on further reducing crime such as through community policing. The recent uptick in crime in the United States is highly concerning, and the failure for most urbanists to take this seriously is inexplicable. Crime is typically the second most cited concern, after traffic issues, against new development, especially income-restricted development.
Aesthetics are an often under-appreciated aspect of the development process and a determinant of what is and is not politically feasible. It is important to foster a greater societal affinity for change, growth, and progress in general if these values are to be reflected in cities.
Cars
Most urbanists are concerned about car dependency. These concerns mix very real problems with cars with lifestyle politics, muddled economics, romanticized views of pre-automobile cities, and degrowth ecologism. It is important to disentangle these various issues.
A key objective of urban development is to maximize access, or the number of destinations that a person can reach within a reasonable commuting time. To that end, there is a tradeoff between the space occupied by transportation and the distance that one can reach. Walking takes very little space, but one cannot go far. Cars take up a lot of space, but one can go quite far. Biking, buses, and rail occupy intermediate positions.
Most large cities have their shape determined by car access, but they also contain bus, rail, and/or subway systems that enhance throughput, especially in the central business district.
It is tempting to think that mixed use development, or an urban design philosophy that puts all common needs within walking or transit distance, will obviate the need for car ownership. This helps, to an extent, to reduce vehicle miles traveled. But it fails to substitute for car ownership, mainly because of employment. A person looking for work, especially high status, white collar work, typically has to apply to many jobs and will not have many opportunities that meet precisely what they are looking for. So they have to look all over the city to find something that fits, and it is unlikely that the result will be within walking distance. Likewise, employers will look all over the city for suitable candidates, and they will not favor a candidate just because they can walk to work. In his book Order Without Design, Alain Bertaud documents how South Korea tried something like this with satellite cities around Seoul and failed to achieve the objective of congestion reduction.
If employers could somehow be forced to hire only employees within walking distance, it would reduce their talent pool and obviate the value of agglomeration economies that large cities are supposed to confer.
Transit-based cities and car-based cities can each be built only to a limited density before congestion becomes intolerable; that density is much higher for transit-based cities. Dense development is typically more expensive per square foot than sparse development, and a lane-mile of highway typically costs less than a mile of railroad track. If density is to be more financially attractive, there is a great need to address the factors that contribute to infrastructure cost, such as Buy America provisions and excessive environmental review. It is also necessary to address construction costs, something that, for example, Katerra tried unsuccessfully to do. See Brian Potter’s excellent newsletter, Construction Physics, for more on this subject and the difficulty of making construction more cost-efficient.
Bus rapid transit is a relatively recent advancement in urban transportation that, though it does not offer the full passenger throughput of a heavily used metro system, nevertheless offers high throughput and can be built for much lower cost. A platooning, automated BRT, using similar technology to what is already being extensively tested for trucks, could combine the throughput of a busy metro with the cost and flexibility of a BRT.
For smaller and medium-sized metros, vanpools, such as used in Arlington, Texas, are an underutilized tool. Most cities have not yet utilized the advances in route planning that Uber and Lyft have pioneered.
Intercity transportation is just as important as intracity transportation. Access to highways, rail lines, and airports is crucial to a city’s economic success.
Regional Divergence
Many countries, including the United States, are seeing a severe concentration of technical talent and job opportunities in a small number of superstar cities. This creates problems on both ends, where cities like San Francisco are unaffordable to most newcomers, while others like Detroit are seeing neighborhoods be abandoned for lack of demand. Addressing regional concentration is complicated, and it goes somewhat against the philosophy of gains from agglomeration, but it is necessary to have broad-based prosperity.
Historically, the US federal government has showered research dollars, military contracts, and other forms of spending throughout the country. Sometimes the purpose is as crass as buying the support of a strategic senator, and sometimes as far-sighted as developing the West. While this practice is often not the most efficient way to spend, it brings regional development benefits such that it is worth reconsidering.
Remote work was on the rise before the COVID-19 pandemic began, though it comprised a small fraction of the workforce. If the trend toward remote work proves durable, it will go a long way toward promoting regional convergence, as well as alleviating congestion and improving quality of life for workers.
Toward Progress Urbanism
The urbanist movement has many important insights to offer, and those insights are currently incorporated in city plans throughout the world. But the urbanist movement also suffers from myopias: a Manichaean view of cars, an underappreciation for growth and prosperity, and a hesitation to use market forces on difficult urban problems are among them. There is a need for a politically active philosophy that combines the insights of good urbanism with pro-progress values, such as expounded, for instance, by the progress studies movement.
Such a Progress Urbanism would be concerned with questions of, for instance, how to lower costs of badly needed infrastructure, how to develop better transportation, how to better mitigate the downsides of dense cities such as crime and pollution, and how to foster broad-based opportunity for those who are unable or unwilling to move to a handful of superstar cities. These questions are not, in my observation, adequately addressed by the professional planning community, academia, or state or national policy (in the US at least, where I am).
hopefulfestivaltastemaker · 3 years ago
November 28, 2021
My weekly roundup of things I am up to. Topics include recycling, George W. Bush, and burnout.
Recycling
Following both rapid increases in municipal solid waste recycling in the US and a burst of public attention in the late 1980s and early 1990s, recycling rates are stagnating.
[Figure: US municipal solid waste recycling rate over time]
(This figure is a few years out of date now. MSW recycling has decreased a bit since 2015).
Is the current level of recycling, in the low 30s of percent, the best we can economically do, or are rates stagnating because of a failure of imagination in policy? I suspect it’s the latter. I am working on a three-part response to improve recycling rates.
Part 1: Price externalities. Taking into account the environmental effects of landfills and incinerators, and the greenhouse gas cost of manufacturing from virgin rather than recycled material, I estimate that an appropriate Pigouvian tax on landfills and incinerators is on the order of $100/ton of solid waste. Most of that figure is driven by the greenhouse gas savings from using recycled material (see the Waste Reduction Model). This is greatly in excess of most landfill tipping fees and would change the economics of recycling substantially. I haven’t tried to calculate the non-GHG impacts of using recycled material. Incidentally, a carbon price should achieve most of this effect, since greenhouse gases comprise most of my estimated costs.
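The shape of that arithmetic can be sketched in a few lines. The per-ton GHG savings, carbon price, and local-externality figures below are hypothetical placeholders chosen to land near $100/ton, not the WARM-derived values behind my actual estimate:

```python
# Back-of-the-envelope sketch of the landfill/incinerator tax arithmetic.
# All inputs are illustrative assumptions, not the values from my analysis.
ghg_savings_tco2e_per_ton = 1.8  # assumed net CO2e avoided per ton diverted
social_cost_of_carbon = 50.0     # $/t CO2e, assumed
local_externalities = 10.0       # $/ton for leachate, odor, etc., assumed

tax_per_ton = ghg_savings_tco2e_per_ton * social_cost_of_carbon + local_externalities
print(f"Implied Pigouvian tax: ${tax_per_ton:.0f}/ton")  # → $100/ton
```

Because the GHG term dominates, a carbon price at the assumed social cost of carbon captures most of the same effect.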
Part 2: Wet/dry dual stream collection. Dual stream means that recyclables are put out in two bins at the consumer level. Wet/dry means they are separated into organic (wet) and inorganic (dry) material. This system doesn’t get quite as high public participation as single stream recycling, but the loss is more than made up for by lower contamination rates, contamination being a major problem for single stream. There are source-separated approaches that put more streams at the consumer level, but for these, the gain from lower contamination is more than offset by the decrease in public participation. This study from 2002 is a good estimate of costs. Though it is outdated, the more recent figures I have been able to find are less comprehensive but confirm the general picture.
Part 3: Public investment in material recovery facilities. As much as I favor private sector solutions to problems, waste management is very much a public sector activity. I would consider an MRF to be valuable public infrastructure, justifying public investment the same way as is done for roads, power lines, water treatment plants, and so on. Most regions should be able to build a dual stream MRF for well under $100/ton.
There is a long way to go before I have a proposal on which policymakers can act, but I think the basic outline has fallen into place.
George W. Bush
I recently finished The Presidency of George W. Bush, by John Robert Greene. This is the latest entry in the University Press of Kansas’ Presidency series, and like most of the other books in the series, it was well written and one that I greatly enjoyed.
Most scholars have taken a negative view of our 43rd president. Greene’s is the first (that I know of) major scholarly work to take a balanced approach. He notes some major achievements of the Bush administration, such as PEPFAR and the response to the financial crisis, and failures, such as the pre-9/11 lethargy about terrorism and advocacy of the unitary executive. On Bush’s biggest policy measure, the war in Iraq, Greene offers a mostly negative assessment of how it was handled, but notes the positive elements of Saddam Hussein being removed from power and the success of the surge that started in 2007.
The Bush administration represented a high point for interventionism. The administration was an historically odd hybrid of nationalist anti-terrorism fear and Wilsonian democracy promotion. At the time, I viewed the post-2004 shift in rhetoric to democracy in Iraq as a cynical distraction from the failure to find weapons of mass destruction, but Greene portrays this shift as sincere. This state of affairs was unsustainable and seemed to be ending well before the financial crisis, but the onset of the recession seems to have been the nail in the coffin. The presidents elected in 2008 and 2016 ran on opposition to the war. By 2020 the issue seems to have been receding from public attention; the president elected that year supported the war, but I don’t recall this being discussed much in the campaign. Yet the decision to withdraw from Afghanistan was couched very much in “America First” terms, that as tragic as it would be for the Taliban to retake control of Afghanistan, this was not an immediate problem for the United States.
Greene focuses, as one would expect, very much on 9/11 and its aftermath, including Bush’s response to terrorism thereafter, and the decision to invade Iraq and its aftermath. I wish he focused more heavily on the financial crisis, a topic that I felt was given short shrift, and really the second term in general.
Burnout
This week I was forced to acknowledge that I am now suffering from severe burnout, so I have to take some time to reevaluate priorities.
I’ve read several of the popular psychology articles on burnout, but generally what they say doesn’t ring true in my experience. What I have found is that burnout stems primarily from a sense of futility more than overwork. I am happy to work hard, and there are periods of my life when I was able to sustain hard work over a long period. But in the present case and past episodes when I have experienced burnout, there hasn’t necessarily been overwork to go along with it. Rather, I think the problem is the sense that I am not accomplishing goals, or that the hard work is not leading to anything valuable down the line.
November 21, 2021
Roundup of things I am up to this week. Topics include environmentalism in space, abundant energy, anti-satellite tests, and From Poverty to Progress.
Environmentalism in Space
A few weeks ago, I presented to the Overview Roundtable on environmental issues that are, or will become more, relevant with the advent of more extensive spaceflight. My full report can be seen here (warning: 25 pages long). For the presentation, I restricted myself to a single topic, namely options for energy production, and even then had to cut material to cram it into 15 minutes.
The material came out of proto-task force work on the subject of environmentalism (the “proto” implies that additional task forces are planned for the future). Although a number of people expressed interest in the work, only two people, myself one of them, did any significant work on this project.
My biggest regret is that we were not able to synthesize our work into a coherent whole. Instead we did two separate presentations on the same day and submitted two separate reports. Thus the work linked above represents my work only, and not a collective product of the task force.
Abundant Energy
Tyler Cowen wrote a column a few weeks ago speculating what could be done if cheap, abundant energy became available, such as if Helion’s fusion proposal is successful.
I found the piece rather demoralizing. The reality of the current energy situation is climate change, high prices stemming from supply chain disruptions, nuclear plant closures, and a renewable energy sector which, though currently thriving, is threatened in the long run by value deflation, land use conflicts, and intermittency. It is good I suppose to present a positive vision that will inspire positive change, but that vision has to be based in reality. Tyler’s is, I’m afraid, not, at least not for the foreseeable future.
I’ve toyed with the idea of writing my own “if we had cheap, abundant energy” piece, but it would be a lot of work to do the research and calculations that I think are necessary to make it good. And since I don’t see that energy revolution on the horizon, I might as well write “if we had a molecular assembler”.
Anti-Satellite Test
A major recent item in the news was an anti-satellite test, which created 1500 pieces of trackable debris and maybe 100,000 pieces of untrackable debris.
Orbital debris is a major and growing problem, and without mitigation, it could render low-earth orbit unusable, which would be a severe loss for earth monitoring, communications, and other valuable functions. ASAT tests are, as far as I can tell, the biggest source of orbital debris.
It would be easy to say that ASAT tests shouldn’t happen, but how would that be enforced? The Outer Space Treaty does prevent some weaponization of space, including the use of nuclear weapons, but it does not prevent ASAT tests. I’m not sure if there is any international law that would have prevented this test. And of course, international law, like any kind of law, is only as effective as its enforcement mechanism.
From Poverty to Progress
I’d like to give a plug to Michael Magoon’s blog, From Poverty to Progress, which documents fairly extensively the nature of social progress, particularly in the economy and technology, and how it came about. The material is based on his book of the same name. The blog is extensive and worth a browse.
November 14, 2021
My weekly roundup of things I am up to. Topics include the productivity-pay gap and consumerism.
Productivity-Pay Gap
Most likely you have at some point seen some variation of the following plot.
[Figure: the productivity-pay gap chart]
This is the productivity-pay gap, as reported by the left-leaning Economic Policy Institute (the foregoing link shows a somewhat different version of this). It purports to show that while productivity--the amount of economic activity generated per hour of labor--has been rising fairly steadily since at least the end of World War II, hourly wages have hardly increased since the early 1970s.
It is a powerful visual. The only problem is that it’s wrong. This paper breaks down three factors that explain most of the observed gap: wage inequality; compensation that is not captured in wage statistics (e.g. health benefits); and the fact that a GDP deflator is used for productivity and CPI deflator for hourly compensation.
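The deflator mismatch alone can manufacture a sizable apparent gap. A toy sketch, using assumed round-number inflation rates rather than the actual historical series:

```python
# Toy illustration of the deflator mismatch: the SAME nominal series
# deflated with two different price indices diverges over time. The 2.0%
# and 2.5% inflation rates below are assumptions for illustration only.
years = 40
gdp_deflator_rate = 0.020  # annual inflation per the GDP deflator (assumed)
cpi_rate = 0.025           # annual CPI inflation (assumed; CPI rises faster)

# Ratio of the two "real" series after `years`, starting from identity:
apparent_gap = ((1 + cpi_rate) / (1 + gdp_deflator_rate)) ** years
print(f"Apparent gap from deflator choice alone: {apparent_gap - 1:.0%}")
```

Under these assumptions, a half-point difference in deflators compounds into an apparent gap of roughly a fifth over four decades, with no divergence in the underlying nominal series at all.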
I would recommend this article for a more layperson-friendly explanation of the problems. It explains that what is labeled “productivity” is not productivity in the sense that most of us would understand the term, but rather a broader measure of income. What the plot really shows is a decrease in the share of income that goes to labor, not a productivity-pay divergence.
The article also takes a stab at discussing a more accurate measure of productivity, a feat that is more difficult than one might guess. They propose useful energy (exergy), as expounded in Ayres and Warr, as a measure. This measure, like wages, has shown much more stagnation since the 1970s. While exergy has its own drawbacks as a measure of productivity, it goes a long way to close the supposed gap.
Faulty data is used to support two faulty conclusions.
1) Everything is fine with productivity, as it has increased mostly uninterrupted for decades. Therefore, there is no need to worry about scientific productivity; NEPA, zoning, and other regulations; or other factors that might drag on growth.
2) Since productivity does not translate into higher wages, productivity growth is of limited importance.
Recent experiences with supply chain disruptions and high inflation should be a wake-up call that an economic policy that focuses entirely on wealth redistribution and ignores wealth creation is not going to be successful.
Consumerism
This opinion article from Bloomberg argues that the current faltering of supply chains is something that should be welcomed and is a harbinger of a turning back from mass consumerism. To be clear, I am aware that the piece has many flaws, most of which I am overlooking for the sake of commenting on a few other things.
The most interesting line I thought was this one.
Long-term, sustainable growth doesn't come from going deep into debt to buy stuff we don't really need. It comes from technology and innovation, where we come up with new products and better ways of doing things. An economy based on consumption is not sustainable.
The view that consumerism drives growth is common, indeed so common that most people now have a hard time conceiving of an alternative arrangement. When considering the factors that drive long term growth, technology still dominates the discussion, and the demand question is handled by Say’s Law, which (highly simplified) says that demand follows from supply. On the time scale of business cycles, though, Keynesian reasoning has come to predominate. This ultimately follows from the observation, during the Great Depression, that there were simultaneously idle factories and unemployed workers, suggesting that supply constraints were not responsible for the economic problems. Even during more prosperous times, Keynesian reasoning reigns, with Democrats in the US preferring to stimulate demand through government spending and Republicans through tax cuts, particularly aimed at high tax brackets; in both cases the answer is on the demand side. As noted above, evidence of problems on the supply side is growing; whether the wake-up call will be heeded is an open question.
The ecological argument, simplified to the point of being wrong, is that during the Depression, the economy had grown to the point where basic needs and wants could be fulfilled, and we transitioned from supply-side to demand-side limitations on economic growth. As a result, governments and businesses have had to promote a culture of consumerism to keep demand going. The critique is that this does little to enhance well-being and causes severe environmental damage. The article makes the ecological argument as well.
I think a more accurate take would be that, overall, we have chosen (rationally, as Dietrich Vollrath argues) to spend a greater portion of the wealth generated on consumption and entertainment, and invest less in the foundations for future growth, including family formation, hard work, and investment. Falling fertility is one clear illustration of this. Another is the fall in working hours in most of the world.
[Figure: working hours over time]
I would probably agree, except for Vollrath’s characterization of these trends as positive.
The obvious question that follows is, if consumption is not the primary basis of the economy, then what is? The article fails to offer a credible alternative, and I’ll admit that I don’t have one either. Consumerism, in my read of history, is an outgrowth of humanism, which holds human well-being to be the yardstick by which societal progress is measured (indeed, the very idea of progress is a humanistic idea). In a recent podcast, Jason Crawford highlights democracy, science, and financial innovation (i.e. capitalism) as both results and drivers of progress. But I would argue that they all derive from humanism. Humanism is so central to modern thinking that it is very difficult to justify an endeavor on non-humanist grounds, let alone imagine functional alternative social arrangements.
But humanism does have what seems to be a fatal contradiction. Progress leads to growth, which creates surplus wealth, which is invested in the foundations for future progress. But increasingly, we see that surplus wealth, including our time as indicated above, is spent on consumption, to a degree which is clearly unsustainable (in a broad sense, not just environmentally unsustainable). Without a major change, which I don’t see forthcoming, the slowdown of growth we have seen over the last 50 years will continue, and eventually growth will come to a halt and go into reverse. It is difficult to see how humanism remains society’s dominant ideology in a functional form after that.
As an illustration of the editorial’s blind casting about for a credible post-humanist philosophy, we get this bizarre line.
European souls are not necessarily more fulfilled, they just find other, more eco-friendly ways to shut out the darkness - like going on a long bike ride.
So yes, European society (or the cartoon version posited by the author) does not reveal an answer. Indeed, they show a more advanced form of the malaise that afflicts American society. Since I am not advocating ecologism, maybe the response is to find a way to lift the idea of progress from its historical humanist context. I have no idea how that would work.
November 7, 2021
Weekly roundup of things I am up to. Topics include supply chains, Nikola, and crypto cities.
Supply Chains
This article on supply chains is interesting, though it doesn’t answer my questions. It does illustrate why legislative efforts to engineer new supply chains are not likely to turn out well.
Having talked with a couple people who work in logistics, and read some articles, I think the best explanation as to why we are having difficulty now (that is, more difficulty than usual, because there are always snarls going on) is as follows. First, both the direct effects of COVID-19 and fear of a recession led to cutbacks in shipping capacity. Second, government stimulus, together with a shift in spending from services to goods, has led to a surge of demand for shipping which the industry cannot currently accommodate. The situation will probably work itself out by 2023 or 2024.
This story leaves a bit to be desired. In particular, why aren’t we seeing more inflation than we currently are? There is the sticky prices theory, but this strikes me as somewhat hand-wavey.
There is a persistent impulse toward autarky. This is based on the theory that cutting international trade and building national shipping and manufacturing capacity will make trade more secure and reduce the likelihood of the kind of disruptions we are seeing now. President Biden said as much in a speech last month on the administration’s supply chain efforts. I haven’t seen good evidence one way or the other, but I would be skeptical of this hypothesis.
Nikola Motors
There was a video a couple weeks ago about Nikola Corporation and its founder Trevor Milton. I like the channel in general. The video details the history of fraud in Milton’s earlier ventures, the implausibility of Nikola’s claims, and investors’ gullibility.
Prior work I have done on energy made me a little skeptical of hydrogen fuel cells for trucking, but I was optimistic about Nikola and slow to recognize a problem. I even missed the fraud indictment when it first happened because I was in the hospital.
One thing highlighted in the video is that Nikola claimed they had a process to electrolyze hydrogen from clean energy sources at $3/kg. I was unaware of this claim, and I think that if I had known, I wouldn’t have believed it. That is something like eight times cheaper than mainstream production processes. It would require a source of energy much cheaper than is used now, and maybe also cheaper and/or more efficient electrolyzers. In 2019, I had done a similar analysis on producing electrolyzed fuels from nuclear power and concluded that such price points could not be achieved with technology that is currently available.
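One way to sanity-check a claim like $3/kg is to note that the electricity input alone sets a cost floor. A rough sketch, using round-number assumptions for efficiency and electricity price (the hydrogen heating value is physical, the rest are my assumptions):

```python
# Sanity check on cheap electrolytic hydrogen claims: electricity input
# alone sets a cost floor before any capital or operating costs.
h2_lhv_kwh_per_kg = 33.3   # lower heating value of hydrogen (physical constant)
electrolyzer_eff = 0.70    # assumed system efficiency on an LHV basis
electricity_price = 0.05   # $/kWh, assumed industrial rate

kwh_per_kg = h2_lhv_kwh_per_kg / electrolyzer_eff
electricity_cost = kwh_per_kg * electricity_price
print(f"Electricity input: {kwh_per_kg:.1f} kWh per kg of H2")
print(f"Electricity cost alone: ${electricity_cost:.2f}/kg")
```

Under these assumptions, electricity alone costs about $2.40/kg, leaving almost no headroom under $3/kg for electrolyzer capital and operations, which is why the claim would have required much cheaper energy or much better electrolyzers.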
Why were investors not more skeptical? That’s a bit of a mystery to me. People look at the success that Tesla (after which Nikola has clearly modeled itself) has achieved and are hoping for the next miracle that will come out of logistics and economies of scale. But if I can see problems, with limited resources and no formal training on energy issues, then it really seems that enough people should have seen through the claims to prevent the wild valuation that Nikola achieved in 2020.
I thought about this issue this week, with Helion Energy in the news for raising money. They are trying to achieve deuterium-helium-3 fusion. There is a fair amount of hype surrounding them, and one source of the hype is a genuine desire to break out of the malaise settling over the world economy. Neither Helion nor any other fusion effort has achieved breakeven energy production yet, and this paper from 2019 convinced me that tokamak alternatives are not likely to do so soon. Breakeven is a necessary but not sufficient condition for a commercially viable fusion energy source. I have no reason to suspect that Helion is engaged in fraud. Like all startups, they want to present an optimistic view of what they can achieve, and they may be fooling themselves. Or I might be wrong and they will actually pull it off.
Crypto Cities
Vitalik Buterin wrote an essay about crypto cities and the various ways that blockchain technology could be used to improve how zoning is done, voting, and various other municipal functions.
I am glad to see these kinds of experiments in municipal governance. It is abundantly clear to me, having spent some time in city halls myself, that they are necessary. One of the things on my to-do list is to find out what has been done on the idea of resolving zoning disputes through Coasian bargaining. My understanding is that there are some problems here, but I don’t know very well what those problems are, or whether they can be overcome with the kind of low-cost transactions that blockchain has to offer.
October 31, 2021
My roundup of what I am up to this week. I am traveling, so it’s a light post. Topics include traveling to Washington, DC and solar PV waste.
Travelogue
Travelogues can be pretty boring, but since that’s my biggest event for the week, that’s what we have.
I’ve been in Washington DC for a couple days, mainly for a get-together from the Art of Dialogue fellowship. Usually when I have reason to travel to DC, I add a few days to the trip to do some lobbying or other business, but this time it has been in-and-out.
In the past, I’ve always found traveling to DC exciting. There is a sense of possibility in this city, if only I knew the right people to talk to and what to talk about. Not this time, though. I don’t know if it’s the result of COVID-19, my own health problems, getting older, mounting disappointments, or just the general sense that things aren’t working out. I very much wish I could have taken an extra day to lobby, but I don’t have a good platform ready. I might have another chance next February.
As for the fellowship meeting itself, I am very glad I went. I saw a lot of people who I haven’t seen in years. The fellowship has been good to me. There was another person in my class who, I found, had life-threatening medical issues, and we had a chance to bond over that.
Solar Waste
When talking about nuclear energy, “what to do with the waste?” is one of the most persistent questions. Reviewing the literature on the subject, it seems to me that the cost of disposing of nuclear waste, if assessed to utilities operating nuclear power plants, should add on the order of tenths of a cent to the levelized cost of electricity. This compares to about 5-10 cents/kWh for electricity from nuclear power, so it’s a minor cost if done right.
Advocates of nuclear power fire back that other options, such as solar PV, also generate waste that imposes costs on operators, and on society as a whole. How does solar PV waste compare to nuclear waste in severity? I made the following back-of-the-envelope calculations.
In the 2040s, it is estimated that average installed solar capacity will be 3950 GW (still increasing as of 2050). The panels coming out of commission in that decade should generate 4.5-5.6 million tons of e-waste per year, which will cost a bit over $300/ton to recycle. This compares to a bit over 50 million tons of total e-waste per year today.
If the manufacturers of solar PV have to pay for that disposal, and the cost is spread over all new PV coming into production in the 2040s, that works out to about 0.015-0.018 cents/kWh, assuming also a 30% capacity factor for PV. In other words, it is on the same order of magnitude as the cost of disposing of nuclear waste. This may seem counterintuitive. Pound for pound, high-level nuclear waste is far more dangerous than old solar panels or almost any other kind of waste. But the number of pounds is far smaller.
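For transparency, here is the arithmetic. The $330/ton recycling cost is my reading of “a bit over $300/ton”; the other inputs are the figures above:

```python
# Back-of-the-envelope solar PV disposal cost per kWh generated.
# $330/ton is an assumed reading of "a bit over $300/ton".
capacity_gw = 3950       # assumed average installed solar capacity, 2040s
capacity_factor = 0.30
recycling_cost = 330.0   # $/ton, assumed
waste_low, waste_high = 4.5e6, 5.6e6  # tons of PV e-waste per year

annual_kwh = capacity_gw * 1e6 * capacity_factor * 8760  # GW -> kW, times hours/yr
cents_low = waste_low * recycling_cost / annual_kwh * 100
cents_high = waste_high * recycling_cost / annual_kwh * 100
print(f"Disposal cost: {cents_low:.3f}-{cents_high:.3f} cents/kWh")
```

This reproduces the roughly 0.015-0.018 cents/kWh range, small against retail electricity prices.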
These calculations are back-of-the-envelope, and there are ways they could have been done differently. Instead of assessing the cost of current waste to current panels, it may be better to assess the cost of disposing of a panel in the future to that panel today. That also means choosing an appropriate discount rate. I also used a single source for solar PV recycling cost. This seems cheap, since most e-waste figures in general I have seen put the cost north of $1000/ton, but on the other hand, the cost should go down as volume increases and the industry gets economies of scale.
I would like to tackle the waste issue more systematically. Wind turbines and batteries also generate waste that must be handled at cost. Coal ash is a severe problem. Things can go wrong, as in Kingston, TN in 2008, and even in good times I would imagine that the cost of coal ash disposal exceeds the waste costs of any other power source.
October 24, 2021
Another roundup of things I am up to this week. Topics include futurism, Wang Huning, and the overview effect.
Futurism
Predictions about the future are, in general, the empty calories of sociology, and I don’t think these predictions are an exception. But this piece did get me thinking about some general ideas about forecasting.
If you want to have a good model, which extends to a forecast, about broad questions like wealth, technology, or standard of living, then you need to have a good idea of where these things come from. The piece does not, so it is flying blind. It may be right, just as a pilot who can’t see still might land the plane by luck.
Where does our wealth come from? I think there is a general understanding that it comes from science and technology, though this only adds a layer of indirection to the puzzle. Where do science and technology come from? I see at least two good answers to this question. The following are neither exclusive nor exhaustive answers.
The first, often associated with the ecological economics school of thought, is that wealth ultimately comes from natural resources. These resources include mineral wealth, forests, farmland, oceans, and so forth. Above all is energy, the master resource, because energy is the medium by which mines are transformed into ore and then valuable minerals, by which land and oceans are transformed into food, and by which raw materials are transformed into manufactured goods. In this school of thought, technology often plays a marginal role and raw materials the central role.
A second school of thought focuses on human minds. I still consider Julian Simon to be the best exponent of this thinking. Simon regarded a high human population as a good thing, and I imagine he would regard current demographic forecasts as a worrisome prospect. With cognition centered, wealth is almost entirely decoupled from raw materials. It is advancing science that turns oil, uranium, bauxite ore, and so on from curiosities into pillars of the economy, and technology could do the same for deuterium, minerals on the sea floor, and asteroids.
A symptom of modern malaise is that we lack an understanding of where wealth comes from and therefore take it for granted. The piece linked above discusses possible technological advances, such as colonies on Mars, but there is no sense of where they will come from.
I would like to write some forecasts of my own. This genre may be a form of empty calories, but I like to indulge in candy every now and then.
Wang Huning
There was an article on Palladium recently about Wang Huning, an influential Chinese intellectual who is regarded, among other things, as a main architect behind China’s current “Common Prosperity” campaign. The article is worth reading, and I won’t belabor it further.
An English translation of the book referred to, America Against America, can be found here.
The Overview Effect
The Overview Effect is a term coined by Frank White (incidentally, the fourth edition of his book on the subject is coming out now) describing the psychological effect of seeing Earth from space. The experience is said to impress upon the individual a sense of the oneness and fragility of the Earth. I have been involved for the last few months in a discussion group on the topic.
A couple of things have occurred to me lately. The first is the overlap between the overview effect and other kinds of profound experiences, such as near-death experiences. Cathedrals are meant to impress upon a person the glory of God, to the point where medieval European societies were willing to invest a large share of their resources in building them. Exposure to nature is purported to foster environmental consciousness.
The second is the distinction between the mystical and the political. Phrases like “environmental consciousness” arouse some suspicion in me, because it is unclear what exactly they mean. A general concern for the well-being of nature might start as a mystical experience, although ethicists will quickly add an academic component when they interrogate what is meant by “well-being” and “nature”. Even with a notion in hand, are we pursuing degrowth, decoupling, or something else?
In discussions of the overview effect, it is common to translate that experience into environmental terms (see e.g. the purported influence of the Earthrise photograph on Earth Day) and geopolitical terms (e.g. international cooperation). It is difficult to draw the line between the mystical and political in these cases, but I have my suspicion that there is some amount of carelessness in this regard.
In the United States and Europe, there are instances of people who report meeting Jesus in the course of near-death experiences. A broader look at the phenomenon indicates that people, including atheists, often meet deities of their own culture’s religion. The mind has a way of translating experiences into familiar cultural terms. This makes it quite challenging to know how to interpret these experiences.
October 17, 2021
My weekly roundup of things I am up to. Topics include highway costs, filtering, greenhouse gases in agriculture, and technological regression.
Highway Costs
I recently became aware of a new paper from the Brookings Institution which asks why highway construction costs have gone up since the 1960s. It may be a good proxy for infrastructure costs in general. They look at several hypotheses, and settle on two as being the best explanations for why per-mile Interstate costs have gone up from the 1960s to the 1990s.
First, since American society in general has gotten wealthier, there is a greater demand and willingness to pay for more expensive construction, including more onramps, sound barriers, etc. Second, they identify “citizen voice”, which includes environmental review, as a factor driving costs. They look at a number of other hypotheses and find that they have a minor effect or none at all.
A complementary paper looks at the other side of fiscal responsibility for highway investment: ensuring that we get the full value of what is built. That includes things like congestion pricing, road damage pricing, design for safety, variable width lanes, and putting investment in those areas where it is most needed. The author points out that some of these things might increase construction costs but generate net value (for example, thicker pavement might raise initial costs but lower future maintenance costs).
I’ve looked at infrastructure costs on and off for a long time. I’ve generally resisted concluding that public processes, including but not limited to NEPA, are the main drivers of cost increases, but I am finding that conclusion more difficult to avoid. This paper looks at the economy as a whole and finds that increasing regulation has cost about two percentage points of GDP growth per year since the 1970s. In the counterfactual world where regulation stayed at pre-1970s levels, we could be more than twice as wealthy by now. Of course, there are benefits to regulation--I am not so cynical as to claim that its only purpose and function is to depress wealth--and the paper does not do a cost-benefit analysis.
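The “over twice as wealthy” counterfactual follows from simple compounding; the 50-year span below is my approximation of “since the 1970s”:

```python
# Compounding two percentage points of forgone annual growth. The 50-year
# span is an assumed approximation of "since the 1970s".
years = 50
forgone_growth = 0.02  # percentage points of GDP growth lost per year

counterfactual_multiple = (1 + forgone_growth) ** years
print(f"Counterfactual wealth multiple: {counterfactual_multiple:.1f}x")  # → 2.7x
```

A seemingly small annual drag compounds dramatically over half a century, which is why these estimates matter so much for policy.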
Filtering
Filtering is an idea in housing economics that increasing the supply of housing lowers cost for everyone, not just the people who move into the new housing. This paper provides some evidence.
Studying the market in Helsinki, Finland, the authors are able to track a “chain” of moves induced by new housing. When a new market-rate building is built near downtown, people occupy that building. This creates (mostly) new vacancies elsewhere in the city, which a different set of people occupy. This creates yet more vacancies, and so on. The authors are able to track a chain of vacancies through six iterations, and they find that, at that level, new market-rate housing disproportionately benefits people in the bottom quintile and bottom half of the income distribution, even if they can’t afford the new housing directly. Social housing (including public housing and subsidized housing built by nonprofits) is more efficient at filtering, but not by a whole lot.
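As a toy illustration of why chains matter, suppose each move frees a unit that is filled by another in-region mover with some probability p; the expected number of moves per new unit is then a truncated geometric series. The probability and chain depth below are illustrative, not figures from the study:

```python
# Toy vacancy-chain model (illustrative numbers, not from the Helsinki
# study): each occupied new unit frees the mover's old unit, which is
# filled from within the region with probability p, and so on.
def expected_chain_moves(p: float, depth: int) -> float:
    """Expected total moves triggered by one new unit, truncated at `depth`."""
    return sum(p ** k for k in range(depth + 1))  # 1 + p + p^2 + ...

# With an illustrative p = 0.6 and the six iterations the authors track:
print(round(expected_chain_moves(0.6, 6), 2))
```

The point of the sketch is just that one new market-rate unit can generate well over one move in total, and the later links in the chain are where lower-income households tend to benefit.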
Another study in the United States uses similar techniques and comes to similar conclusions.
Although filtering is well supported by evidence, the idea remains controversial. Critics see only the new, expensive housing and, taking a zero-sum view of the market, conclude that it benefits the wealthy and hurts the poor. The error goes back at least as far as Jane Jacobs.
Greenhouse Gases in Agriculture
There is a new analysis (full version and supplementary material) of greenhouse gas emissions in agriculture. The result is a bit higher than the FAO figure I had been using before, but it is broadly in line with it.
Aside from being higher, I don’t find great surprises in this report. Over half of all emissions can be attributed to animal-based foods, including emissions from growing animal feed. By activity, over half of emissions come from farmland management and land use change.
This paper reinforces my view that intensification is the way to go. This will include synthetic meat.
Technological Regression
This paper from 2008 looks at a few examples of technological regression, or instances where a society’s technological capability decreased. The three examples are pre-industrial. They give a model showing that negative shocks to population can cause regression.
The central insight is that technology should be understood as embodied, not as a stock. The latter view is common, though often implicit: we imagine that once a technology is discovered, it is known and readily accessible forever (this is how things work in Civilization, if you have played those games). Especially in a world without widespread printing, technical knowledge is instead embodied in the skills of the population.
Toward the end, the authors make a fairly unconvincing effort to distinguish their model from the modern world and population trends. I think it is likely that the slowdown in population is contributing to technological stagnation and will lead to regression under present trends.
I was thinking about examples of regression in the modern world. So far I see a few isolated examples but no clear evidence of a broad trend. Many areas of the world are closing nuclear power plants and generally replacing the lost power with fossil fuels. Rideshare companies such as Uber and Lyft seemed to bring new efficiency to transportation over traditional taxis, but those companies are now declining while traditional taxis resurge. There was the loss of commercial supersonic flight with the end of the Concorde. After the end of Apollo, the United States lost the capability of spaceflight to the Moon, and with the end of the Shuttle, spaceflight to low Earth orbit. This regression has been partially reversed by private spaceflight providers. More examples could be provided.
I expect that if the barriers to progress persist, examples of technological regression will pile up to the point where they can no longer be treated in isolation; rather, a better understanding of the trend will be needed. I also expect that even with regression, net technological advancement (advances outweighing regressions) will continue for quite some time.
0 notes
hopefulfestivaltastemaker · 3 years ago
Text
October 10, 2021
My weekly roundup of things I am up to. Topics are nuclear plant closures and supply chain issues.
Nuclear Plant Closures
Ted Nordhaus has a good piece in Foreign Policy on the series of energy supply crunches going on around the world. The proximate cause may be the larger supply chain crunch, but a deeper cause has been nuclear plant closures. This policy, which has been pursued in California and Germany in particular and also many other places, is sold on the basis that a 100% renewable energy system is the future and that the lost nuclear power will be replaced by renewables.
However, this has not been the case in general. The triumphalist narrative behind renewables is that their cost keeps falling, since they--especially solar PV--can be manufactured quickly and benefit from Wright’s Law. They have passed the “tipping point” where renewables are cheaper than other options, and will only become more so, leading to a wholesale replacement.
What this narrative ignores is that, in addition to the economies of scale coming about from Wright’s Law, renewables also face diseconomies of scale. As discussed a few weeks ago, NIMBYism is growing. NIMBYism against wind turbines is well-established, and NIMBYism against solar farms is picking up as well. I don’t endorse this NIMBYism, but it is a reality that needs to be considered. Beyond that, the ideal sites (politically easy, close to demand centers, with good resource) tend to be developed early. Just as ore grades fall as mining increases, the value of renewable resources tends to fall as penetration grows.
Ted also refers to value deflation in the piece, which is the tendency for electricity prices to fall when renewable production is greatest. After about 20% penetration or so, this becomes a major problem. The LCOE (levelized cost of electricity) metric doesn’t capture this phenomenon.
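A minimal LCOE sketch (with made-up numbers) shows the limitation: the metric divides discounted lifetime costs by discounted lifetime energy, so it weights every MWh equally regardless of when it is produced or what price it fetches.

```python
# Minimal LCOE sketch (illustrative numbers): discounted lifetime costs
# divided by discounted lifetime energy. Every MWh is weighted equally --
# the metric is blind to *when* energy is produced and what price it
# earns, which is why it misses value deflation.
def lcoe(capex, annual_opex, annual_mwh, years, rate):
    disc = [(1 + rate) ** -t for t in range(1, years + 1)]
    costs = capex + sum(annual_opex * d for d in disc)
    energy = sum(annual_mwh * d for d in disc)
    return costs / energy  # $/MWh

print(round(lcoe(capex=1_000_000, annual_opex=20_000,
                 annual_mwh=4_000, years=25, rate=0.05), 2))
```

Two plants with identical LCOE can earn very different revenues if one produces mostly during hours when prices have already been pushed down by similar plants; that gap is the value deflation the metric cannot see.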
A future heavily dependent on renewable energy and electric vehicles will require far more of certain minerals than we use now, especially lithium, cobalt, nickel, graphite, and various rare earth elements. This will require a major expansion of mining, and to avoid a supply crunch and the devastating environmental impacts of terrestrial mining, it will be necessary to engage in deep sea mining. Yet many of the same organizations that are selling the 100% renewable future are also trying to shut down deep sea mining before it starts.
The most cynical view is that most environmental organizations believe in forms of the “degrowth” agenda, and while they can’t win directly on degrowth in a democratic society, they can win indirectly by strangling all forms of progress through the regulatory apparatus. They have scored some victories on this front--against nuclear power (as discussed in the piece), genetically modified organisms, and supersonic flight, often through environmental review mechanisms such as NEPA--and are now trying against deep sea mining.
The slightly less cynical, but perhaps more accurate, view is that most environmental organizations see economic growth and technological advancement as “somebody else’s problem” and believe that they can lobby for all the bans and regulations they can get without causing economic damage. As Ted points out in the piece, they build simplified models that show their policies are the best, and when reality doesn’t cooperate, they dismiss that as the growing pains of a new paradigm.
Supply Chain Issues
Last week I wrote a bit about the supposed ongoing labor shortage and my (mostly unsuccessful) attempts to understand what is going on there. Here is an equally unsuccessful attempt to understand supply chain issues.
The press has been full of articles lately about supply chain issues: container ships backed up at ports, deliveries delayed, energy prices, especially for coal, skyrocketing, and so forth. This seems to be closely related to labor issues, and maybe labor shortages are part of the general supply chain problem.
Unfortunately, I have not been able to find a good exposition of what is going on. This article is the best I have seen so far. The author, Craig Fuller, notes a couple of things. First, there have always been supply chain issues, though in 2021 they seem to be worse. Second, the main cause of the problem seems to be an influx of government stimulus into the economy. I would like to see these claims quantified.
One of my takeaways is that the principles of Keynesianism may have run their course. It has been common wisdom among governments since the Great Depression that the way to boost the economy in the short term is to increase consumer spending, either through tax cuts or government spending. Such was the impulse behind the 2009 Recovery Act, the tax cuts of 2001, 2003, and 2017, and so forth. I don’t know if this was ever sound policy, but it certainly doesn’t seem to be now, at a time when economic problems seem to be driven more by supply than demand.
Still, it’s a mystery to me why we’re not seeing more inflation. There too I see some anecdotal reports of inflation in the press, but at worst the statistics suggest that we are returning to inflation rates that prevailed before the 2007-09 recession and are nowhere near the bad old days in the late 1970s and early 1980s.
0 notes
hopefulfestivaltastemaker · 3 years ago
Text
October 3, 2021
My weekly roundup of things I am up to. Topics include the homicide rate, deep sea mining, and labor shortages.
Homicides up
In 2020, the United States posted its biggest increase in the per-capita homicide rate since record-keeping began. However, the figures are still much lower than in the early 1990s (see the second plot).
I see two major reasons to be concerned about this. First and foremost, being a victim of crime, especially violent crime, is a traumatic experience that any decent society should try to minimize.
Second, I don’t think we yet have a good understanding of how spikes and dips in crime occur. There does seem to be a strong social component: if people around me engage in criminal behavior, I am more likely to as well. This feature makes the trajectory very hard to predict, almost like trying to predict a recession. But it also means there is a positive feedback loop, which raises the worry that crime will continue to rise.
Many progressives, especially in the urbanist movement, are dismissive of crime trends and fears about them. That is a huge mistake. Beyond the obvious humanitarian reasons to take crime seriously, rising crime rates are major threats to infill urbanism, criminal justice reform, and many other progressive priorities. There are ways to take crime seriously without succumbing to the “tough on crime” mentality of the 1990s. Alex Tabarrok’s essay, Underpoliced and Overprisoned, is a good starting point.
Deep Sea Mining
I wrote a little bit about deep sea mining a few months ago. More recently there was this negative piece from the Guardian. A few takeaways from it.
- The main thrust is still the precautionary principle, even though those words don’t appear explicitly in the article. The piece appeals to a lack of knowledge of the impact on deep sea ecosystems.
- The piece also invokes broader grievances about resource use, the capitalist system, etc.
It seems like deep sea mining has become a flashpoint of controversy in the environmental movement, much like nuclear power or genetically modified organisms. Unfortunately, this means that opposition is probably entrenched. Opponents will invoke uncertainty as a reason for opposition, but no amount of knowledge will be sufficient to change minds.
A good test of sincerity with the precautionary principle is to ask what the end game is. If lack of knowledge is the reason to oppose an activity, then what knowledge is needed to settle the issue, and what is the plan to acquire that knowledge? Ideally, with a time frame. If those questions are answered, then I think we have a sincere use of the precautionary principle. If not, then the PP is merely being used as a pretext to oppose activity. This is a good principle not just with deep sea mining, but in all areas where the PP is used. The impact of space exploration is another example.
As far as what impacts are known, I am only aware of one study comparing deep sea mining with terrestrial mining (I had referred to a different study previously; this one has similar results and some of the same authors, but I think it is a better reference because it is in a peer-reviewed publication). They find that deep sea mining has lower impacts than terrestrial mining on most metrics. I am not aware of a study that shows the opposite.
Labor Shortage?
The news is full of anecdotal reports of a labor shortage. From my own experience I see more “now hiring” signs around nearby businesses than in the past. Still, some aspects of the labor shortage story don’t add up in my mind.
For one thing, the unemployment rate is 5.2% as of August. That’s obviously a lot better than most of last year, but the rate was below 4% pre-pandemic.
For another thing, if there was truly a labor shortage, you would expect wages to go up via the magic of supply and demand. You really have to squint at the statistics to see that.
A variety of explanations have been given for the discrepancy, including the dissuading effects of extended unemployment benefits, a skills gap, the effects of the delta variant, and others. None of these strike me as very convincing.
A few years ago, there was the popular notion of a “tech worker shortage”, which the Brookings Institution debunked (n.b. I favor the H-1B program, but I don’t favor dishonest arguments). The current situation feels similar. Moreover, as most economists will point out, “shortage” is a subjective term. There can only be a shortage at a particular price point, and if an employer cannot hire for the wage they offer, then they should offer more. The market should then settle on a wage where the supply and demand for labor are equal. I can’t afford a Ferrari, but no one will take me seriously if I complain of a Ferrari shortage. Now I feel like a dumbass for explaining basic economic principles that popular media pieces should explain.
The whole “skills gap” thing is especially irritating to me. Just as with wages, basic economic principles and common sense suggest that if an employer cannot hire for a position, then they will have to be less picky about experience, just as they might have to be willing to pay more. But one problem is that workforce development is an externality to the labor market, in that some of the benefits of training workers do not fall directly to either the workers or the employer. That’s a basic reason we have public education and public workforce development programs. Still, I have yet to see much exploration of this idea or anyone who has given it a rigorous treatment.
Sticky wages are another notion that I haven’t seen in the press. This is the idea that, for a variety of reasons, employers don’t fully lower wages to market levels during a recession, which creates unemployment (the existence of involuntary unemployment is itself a major puzzle in economics). In theory the argument should apply in the other direction: if employers don’t raise wages to market levels, especially now in possibly a time of heightened inflation, then a labor shortage should result.
This question is interesting to me, though it is too far out of my wheelhouse to spend much time on.
0 notes
hopefulfestivaltastemaker · 3 years ago
Text
September 26, 2021
Another roundup of things I am up to this week. Topics include a coal exit, Age of Sail vs. space expansion, and health in space.
Coal Exit
Roger Pielke Jr. proposed a Coal Exit treaty in a recent piece.
Most international climate efforts have centered on emissions targets (or, indirectly, temperature targets), with individual countries making emission reduction pledges and figuring out how to achieve them. The Paris Agreement of 2015 was the most recent major such effort, with the Kyoto Protocol the major previous one. Such endeavors have been of limited success.
I was thinking about past similar endeavors that have been successful, or not. Although mainly a private venture, Breakthrough Energy came out of the Paris Agreement to invest in revolutionary ideas with decarbonization potential, such as fusion, advanced biofuels, and new battery chemistries. The effort has not been without problems, but I think it has been a good thing, and it is fairly clear that technological advancement is the only way that we deal with climate change without austerity.
The Montreal Protocol of 1987 is the quintessential success story: all countries agreed to phase out certain ozone-depleting substances, and in the years since, the ozone hole has started to recover. It worked because there were good non-ozone-depleting substitutes for CFCs in refrigerants and other applications; the treaty most likely accelerated a transition that was already in the works. In the case of coal, new plants are still being built, but the industry is struggling with competition from natural gas and renewables.
Phasing out coal will not be sufficient to address climate change, but it is necessary, and if we can’t even do that, then there isn’t much hope for a solution.
Age of Sail and Space Colonization
There was another interesting piece this week from Anton Howes analogizing between European colonization during the Age of Sail and possible forthcoming spacefaring ventures.
Historical parallels are interesting, but it is also good to consider how the two ventures might be different, and how principles derived from the Age of Sail would not apply to spacefaring.
While it might take months or years to bring physical cargo to a base on Mars, one-way communication with Earth will never take more than about 24 minutes. For a ship at sea or a 16th century colony in the New World, both cargo transport time and communication time were measured in months. This will greatly limit the autonomy of spacefarers relative to their Age of Sail counterparts. I would imagine too that artificial intelligence will advance in such a way that most mission planning occurs before a mission is launched, even if there are live humans traveling.
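The one-way light delay follows directly from orbital geometry; a rough sketch with approximate Earth-Mars distances:

```python
# Rough one-way light delay to Mars (distances are approximate).
C_KM_S = 299_792.458   # speed of light, km/s
MIN_DIST_KM = 54.6e6   # closest approach, approximate
MAX_DIST_KM = 401e6    # near superior conjunction, approximate

for label, d in [("closest", MIN_DIST_KM), ("farthest", MAX_DIST_KM)]:
    print(f"{label}: {d / C_KM_S / 60:.1f} minutes one-way")
```

So a question-and-answer round trip ranges from about six minutes to about forty-five, versus months for an Age of Sail dispatch.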
The piece discusses the financial structures behind such missions, and it may be necessary to grant monopoly rewards to spacefaring ventures to make them financially viable. That’s probably true, though I don’t know how central these efforts will be to nationalist projects. The Age of Sail coincided with the rise of modern European states and later the principle of Westphalian sovereignty. I don’t expect this system to go away in the foreseeable future, but its centrality is eroding.
Globalization is not a new phenomenon, of course; it was also clearly visible in the Age of Sail. Spacefaring carries both the hopes and fears of this trend, though. The hope is that the enormity of the spacefaring enterprise, which may prove to be too great for a single nation, will foster international cooperation. Such was the impulse behind the design of the International Space Station; how well that has worked in practice is another matter. Fears center on the fact that large corporations today are increasingly disconnected from their nations of origin, and that the forces of international communication and trade will continue to erode the nation-state, with spacefaring only supercharging this trend.
None of the three views discussed above--that spacefaring will become a vehicle for nationalist imperialism, that it will be a vehicle for international cooperation, and that it will further erode the nation state--strike me as especially plausible. But I don’t know how it will play out.
Health in Space
One of my tasks for the week was to examine issues around human health in space. There are three main issues (and a bunch of other issues) that I see: the effects of microgravity or low gravity, radiation exposure, and access to health care.
We know that long-term exposure to microgravity (beyond roughly six months) causes severe health problems, especially bone and muscle loss. We don’t really know how low gravity, such as on the Moon or Mars, will affect people in the long term, but the limited evidence suggests there will be long-term negative effects as well.
What happens if it becomes apparent that people simply cannot be healthy on Mars for an extended period due to low gravity? Do we accept that, or do we give up on space colonization if near-Earth gravity by rotation proves infeasible?
Speaking of which, there is an estimate that a rotating habitat would need a diameter of at least 12-24 meters to have Earthlike gravity and for the Coriolis force to be tolerable. This seems a little optimistic to me, but I don’t think we really know.
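The estimate can be sanity-checked with the centripetal acceleration formula a = ω²r. The radii below are my own illustrative picks: 6 and 12 meters bracket the article's 12-24 meter diameter figure, with 100 meters for comparison.

```python
import math

# Spin rate needed for 1 g at a given radius: a = omega^2 * r.
G = 9.81  # m/s^2

def rpm_for_1g(radius_m: float) -> float:
    omega = math.sqrt(G / radius_m)      # angular speed, rad/s
    return omega * 60 / (2 * math.pi)    # revolutions per minute

for r in (6, 12, 100):
    print(f"r = {r:>3} m: {rpm_for_1g(r):.1f} rpm")
```

At 6-12 meter radii the habitat spins at roughly 9-12 rpm, well above the few rpm usually cited as tolerable for Coriolis effects, which is why the figure seems optimistic; a 100 meter radius gets down to about 3 rpm.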
NASA estimates that on a three-year mission to Mars, astronauts would be exposed to 1200 millisieverts of radiation. Under the linear no-threshold model (controversial, but we’ll go with it for now), an exposure of 1000 mSv leads to a 5.5% chance of developing fatal cancer. That is a serious risk, but a tolerable one; it could be mitigated by better radiation shielding on the spacecraft or by faster travel, such as with a nuclear thermal rocket. Colonists living permanently on the Moon or Mars will probably have to go underground most of the time. In free space, we’ll need better shielding than is used now.
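Under the linear no-threshold model the arithmetic is a straight proportion; a quick sketch using the figures above (the linearity itself is the contested assumption):

```python
# Linear no-threshold (LNT) sketch: fatal-cancer risk scales linearly
# with dose. The coefficient is the ~5.5% per 1000 mSv figure cited
# above; LNT itself is the controversial assumption.
RISK_PER_SIEVERT = 0.055

dose_msv = 1200  # NASA estimate for a three-year Mars mission
risk = RISK_PER_SIEVERT * (dose_msv / 1000)
print(f"Added fatal-cancer risk: {risk:.1%}")
```

So the mission dose implies roughly a 6.6% added lifetime risk of fatal cancer, if LNT holds.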
Access to medical care in an emergency is something I have been thinking about ever since my stint in the hospital last year. On the ISS, there are procedures and training to deal with fairly routine issues, but a medical evacuation will be necessary in the event of something serious. This paper has a stat that there are 2.3 medical evacuations for every 1000 person-months on long-duration submarine missions. The paper goes on to estimate that for a 6 person crew on a 2.5 year Mars mission, an average of 0.9 medical evacuations would be needed (a somewhat higher rate; I’m not sure exactly how this is defined). Limited access to health care may just be another risk that the first generation of colonists will have to face.
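A naive extrapolation of the submarine rate to the Mars mission comes out lower than the paper's 0.9, consistent with the note that the paper's figure must be defined somewhat differently:

```python
# Naive translation of the submarine evacuation rate to a Mars mission.
# The paper's own estimate (0.9) is higher, so it evidently uses a
# different rate or definition than this straight extrapolation.
evac_per_1000_person_months = 2.3  # long-duration submarine missions
crew = 6
mission_months = 30  # 2.5 years

person_months = crew * mission_months
expected_evacs = evac_per_1000_person_months * person_months / 1000
print(round(expected_evacs, 2))
```

Either way, an expected value on the order of one serious medical event per mission, with no evacuation possible, is the core of the problem.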
There is also the psychological stress of spaceflight, compounded by having to spend an extended period in a confined setting. I’m not sure this issue is as serious as is made out in the press.
I don’t necessarily see show-stoppers here, but there are issues that need to be considered more seriously if we are to see greater spacefaring activity. Such was also the conclusion of the inspector general.
0 notes
hopefulfestivaltastemaker · 3 years ago
Text
September 19, 2021
My roundup of things I am up to this week. Topics include peat in the Dutch Golden Age, research productivity, asteroid risk, and solar storms.
Peat and the Dutch Golden Age
That is the subject of this most fascinating article by Davis Kedrosky. The article outlines the role of peat in the prosperity of the Netherlands in the 17th century. We often split the long arc of history into agrarian and industrial periods, with the Industrial Revolution as the transition, beginning in the 18th or 19th century in Britain. If so, then the Dutch Golden Age is a kind of intermediate period, with peat, the leading fuel, serving as a transition between traditional biomass and fossil fuels.
One thing in particular jumps out at me.
Davis argues, fairly convincingly, that it was not energy limitations that brought the Dutch Golden Age to an end. During the decline, the Netherlands had as much access to coal as Britain, but they didn’t develop imports to a great degree. This is in contrast to the point of view generally associated with ecological economics, which gives energy availability and prices central importance in the evolution of an economy.
The whole substack is interesting and worth a browse. Davis focuses on economic history, particularly the history of the industrial revolution. The article about the timing is particularly interesting.
Are Ideas Getting Harder to Find?
This is the title of a paper by Bloom et al., on which I have commented a few times. Since the first preprint appeared around 2017, I hadn’t seen a good counterargument to the paper, making it seem fairly strong. This essay, which came out this week, is the strongest counterargument I have seen so far.
Bloom et al. argue that ideas really are getting harder to find, and their evidence rests on three case studies (semiconductors and Moore’s Law, life expectancy, and agricultural yields) and total factor productivity (TFP). In each of the four cases, they argue that research productivity has fallen by dramatic margins. Research productivity is defined as the amount of progress in the area divided by the number of researchers.
Alexey Guzey’s essay, in contrast, makes several criticisms of Bloom et al.’s methodology. The main one is that the measure of ideas, and of “progress” in each of the four fields, is rather ad hoc. For example, with semiconductors, if the baseline measure of progress is taken to be a linear or quadratic growth of transistor density, rather than exponential, then productivity has gone up dramatically rather than down. There are a number of other criticisms as well, which I won’t enumerate fully. The essay is worth a read.
Regarding TFP, Guzey points out that it is not a well-understood metric. TFP is the residual of economic growth after growth in capital investment and the labor supply is accounted for. It is often taken as a proxy for technological development, but really it is a sort of dark matter in macroeconomics that is only weakly correlated with technology.
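Concretely, TFP is typically backed out as a Solow residual from a Cobb-Douglas production function, Y = A·K^α·L^(1−α). A sketch with made-up numbers (α = 0.33 is a conventional capital share, not a figure from the essay):

```python
# Solow-residual sketch: with Y = A * K^alpha * L^(1-alpha), TFP (A) is
# whatever output growth is left after capital and labor are accounted
# for. All numbers are illustrative.
def tfp_residual(output, capital, labor, alpha=0.33):
    return output / (capital ** alpha * labor ** (1 - alpha))

# Output grows 10% while capital grows 2% and labor 1%; the unexplained
# remainder shows up as TFP growth.
a0 = tfp_residual(output=100, capital=300, labor=150)
a1 = tfp_residual(output=110, capital=306, labor=151.5)
print(f"TFP growth: {a1 / a0 - 1:.1%}")
```

Because A is defined as a residual, anything not captured by measured capital and labor lands in it: technology, but also mismeasurement, reallocation, and utilization effects.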
The weakest point of Guzey’s argument, in my opinion, is a heavy reliance on mismeasurement. He asserts that actual TFP is higher than the measured value because large contributions are not adequately accounted for; operations like Google and especially Wikipedia, for example, capture only a small portion of the consumer surplus they generate. I have two problems with this argument. First, it is not clear that it is true. While Wikipedia formally contributes to GDP only a small fraction of the value it creates, easy access to knowledge should allow workers and firms to be more productive, increasing GDP indirectly. Second, even if the argument does hold up, it is not clear that mismeasurement has gotten worse over time, which would be necessary if there is recent growth in TFP not being captured by formal statistics. For instance, there is a great deal of informal household labor not captured in GDP, and if anything this value has been going down (e.g. home-cooked meals being replaced by restaurant meals).
The above criticism notwithstanding, this essay is a useful contribution to the productivity debates. I think the weak correlation between TFP and innovation is the most important point. It should be borne in mind in trying to explain the paradox of how a sluggish economy seems to coexist with astonishing advances in synthetic biology, artificial intelligence, space flight, and other areas.
Asteroid Risk
The last asteroid impact to do significant damage to humans was the 2013 Chelyabinsk impact. That asteroid was about 20 meters in diameter, and though it airburst rather than hit the ground, the explosion caused about 1600 injuries, mostly from broken glass. The 1908 Tunguska impact is believed to have been caused by an asteroid about 50 meters in diameter. That did minor damage to humans because it hit in a remote area in Siberia. If it had hit a major city, the damage would have been catastrophic. Despite the light damage that has occurred historically, there is a small risk of a much larger impactor, and thus asteroids feature prominently on almost any list of existential risks to humanity.
This report from NASA has a chart (see p. 25) on the risk of impacts by size. The report is from 2006, but more recent material seems to say similar things. The risk of an impact with global consequences seems to be on the order of 1 in tens of thousands, maybe hundreds of thousands, per year, and 1 in millions before we get to extinction risk. I would be much more worried about other things.
Nevertheless, it is worth a modest investment of resources to identify and deflect asteroids. NASA’s budget to do so has grown considerably, though for a variety of reasons separate from planetary defense. As a result, the number of large asteroids found has also grown. As of 2018, about a third of 140+ meter asteroids had been found; I think it’s about half now. Over 95% of 1+ km asteroids have been found. We’ll probably never get to 100%, since there is always the risk of an unexpected comet or something. Considering that the Tunguska impactor was estimated at around 50 meters, we are definitely far from finding everything that could do major damage.
It’s not clear to me what would happen if an asteroid with a high risk of impact was found. So far no deflections have been demonstrated, though there are plenty of ideas on the drawing board.
I don’t have too much to add that is novel. Having looked at this issue in more detail lately, I’m pleased to see that there has been a lot of progress, even if there is still a long ways to go. NASA now spends about $150 million/year on planetary defense, according to one of the articles above. That seems reasonable.
Solar Storms
One of the best known solar storms in recorded history was the 1859 Carrington Event. This storm disrupted telegraph lines and caused the Aurora Borealis to be visible as far south as Cuba. It is estimated that if a similar event happened today, the damage could be in the trillions of dollars.
This report has some figures on strengths of storms and their likelihood of happening (see Table 2 on p. 19), though they seem to misreport the strength of the Carrington Event.
Evidently there was a near-miss in 2012 from a major coronal mass ejection. It had a peak intensity of -1200 nanoteslas, while the Carrington Event had a peak intensity of -850 nT. A direct hit would have been catastrophic.
Thanks to SOHO and other solar monitoring, we would have advance warning of up to a couple days in the event of a solar storm. The main thing to do would be to harden transformers and disconnect vulnerable equipment from the power grid. I would guess that some sort of Kuznets curve applies here. Since 1859, we’ve gotten much more vulnerable to solar storms because of the proliferation of power grids and electronics. But in more recent years we have gotten more resilient due to better forecasting, better design of equipment, and more knowledge of what to do. But unlike other areas where Kuznets curves apply, major solar storms are too rare to test this hypothesis rigorously.
Evidently it was proposed that the United States build a strategic reserve of transformers to prepare against solar storms or EMP attacks, but this was not deemed to be cost-effective.
0 notes
hopefulfestivaltastemaker · 3 years ago
Text
September 12, 2021
My weekly roundup of things I am up to. Topics include learning curves, Telosa, the fine-tuned universe, and some anniversaries.
Learning Curves
Last week I commented on Matt Clancy’s site, New Things Under the Sun, and in particular the material on fertility rates. It is meant to be “what academics generally believe” on questions related to innovation. The site is chock full of good material. This time I’ll comment on his work on learning curves. See in particular this and this article.
A learning curve, also known as the “learning-by-doing” phenomenon, is the idea that the cost of producing a product goes down as more is produced. This makes intuitive sense. As production goes up, we would expect more workers to be trained to be efficient, for companies to optimize their production processes, for more efficiency techniques to be discovered, and so on.
The empirical evidence is fairly strong too, or at least it appears to be. Learning curves are also known as Wright’s Law, going back to a paper by Theodore Wright in 1936 where he observed that for every doubling of airplane production, the cost goes down by 20%. There are numerous other studies that find similar cost reductions (albeit of widely different magnitudes) in many areas.
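Wright's observation corresponds to a power law in cumulative production: cost(x) = cost(1) · x^(log₂(1 − LR)), where LR is the learning rate. A quick sketch with his 20%-per-doubling airplane figure:

```python
import math

# Wright's Law: unit cost falls by a fixed fraction per doubling of
# cumulative production. With learning rate LR,
# cost(x) = cost(1) * x ** log2(1 - LR).
def wrights_law_cost(initial_cost, cumulative_units, learning_rate=0.20):
    exponent = math.log2(1 - learning_rate)  # negative for LR > 0
    return initial_cost * cumulative_units ** exponent

# After three doublings (8x production), cost is 0.8^3 = 51.2% of initial:
print(round(wrights_law_cost(100.0, 8), 1))
```

Note that the curve is fit against cumulative production, not time, which is exactly why it is hard to distinguish learning-by-doing from cost declines that merely happen alongside growing production.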
But the old mantra of “correlation doesn’t imply causation” applies here. One could tell the opposite story: maybe cost reduction has little to do with learning-by-doing and instead happens for other reasons, such as exogenous technological improvement, which merely coincides with growing production. That is what many of the papers Clancy cites show, to an extent.
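A toy simulation makes the identification problem concrete. Suppose costs fall purely with time (say, from outside technological progress) while production grows exponentially, with zero true learning-by-doing. Fitting a learning curve to that data still yields a healthy-looking “learning rate.” All the numbers below are made up for illustration:

```python
import math

g = 0.05  # assumed exogenous cost decline: 5% per year, no learning at all
r = 0.15  # assumed production growth: 15% per year

xs, cs = [], []  # log cumulative production, log unit cost
cum = 0.0
for t in range(30):
    cum += 100 * (1 + r) ** t          # annual output grows exponentially
    xs.append(math.log(cum))
    cs.append(math.log(100 * math.exp(-g * t)))  # cost falls with time only

# Ordinary least-squares slope of log-cost on log-cumulative-production
n = len(xs)
mx, mc = sum(xs) / n, sum(cs) / n
slope = sum((x - mx) * (c - mc) for x, c in zip(xs, cs)) / \
        sum((x - mx) ** 2 for x in xs)
implied_lr = 1 - 2 ** slope
print(f"implied learning rate: {implied_lr:.0%}")  # positive despite zero true learning
```

Because cumulative production and calendar time move together, the regression cannot tell the two stories apart; that is why the decomposition papers Clancy discusses have to work so hard.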
Some, but not all, of the cost reduction really is due to the learning-by-doing effect. What is the portion exactly? This turns out to be very difficult to estimate, and reading through the studies that Clancy discusses, I don’t have a good answer. Until now, though, I had naively assumed that all of it was.
These observations have several policy implications. Many decarbonization models, for instance, rely heavily on deployment of already developed technology and assume cost reduction going forward as this technology is deployed. This assumption plays two roles. First, if we expect that the price of solar panels, wind turbines, HVDC cables, lithium-ion batteries, and other technologies will decline with further deployment, then models that rely on these technologies look more cost-effective than they would if present costs were assumed. Second, cost reduction is a beneficial spillover of deployment, and therefore a justification for subsidizing deployment that goes beyond carbon dioxide reduction. This point has been invoked, for instance, in justification of the policies behind Germany’s Energiewende (renewable energy transition).
If we determine that learning-by-doing effects are weaker than a naive learning curve analysis would indicate, then there is less justification for subsidizing deployment. The Investment/Production Tax Credits for renewable energy look less attractive. Policy should be more oriented toward technological change than deployment. The idea of a France- or South Korea-style nuclear power buildout also looks less attractive, and we should focus more on next generation nuclear technology instead.
Several things I have done will have to be rethought.
Telosa and Other New Cities
Telosa is a newly announced planned city with a target of 5 million people by 2050, to be built in the American Southwest somewhere. The official website, linked above, has a lot of pretty pictures and buzzwords from urban planning.
I hope the project is successful, but I also hope to be forgiven for not getting too excited. There is an extensive history of new city projects not working out or performing less well than hoped, including seasteading, Khazar Islands, Masdar City, and others.
The main headwind I see is that, as Alain Bertaud describes in his book, cities are first and foremost labor markets. When a new city starts seeking residents, it has the basic problem of providing jobs for those residents. Attracting employers will be difficult too because there won’t be many employees for those employers to hire. This is a fundamental problem that will also make space colonization difficult. As this article explains, the youngest of the top 10 cities in the United States is Phoenix, AZ, founded in 1868.
Most wealthy countries now have declining populations or soon will. Under current demographic trends, most other countries will reach this crossover point sooner or later. A declining population is another headwind for founding new cities, since they will also have to compete with depopulating existing cities.
Despite the problems noted above, Masdar City is an example of a project that has achieved at least some success. For one thing, it is not really a new city because it is within the commutershed of Abu Dhabi, reasonably close (though a bit outside the commutershed) to Dubai, and close to a major airport. The developers were also smart in recruiting IRENA (International Renewable Energy Agency) as an anchor tenant, which helps resolve that chicken-and-egg problem of building a robust labor market.
Fine-tuned Universe
The idea of the fine-tuned universe is that many aspects of the universe we live in appear to be set to specific values conducive to the emergence of intelligent life, to a degree that is hard to imagine being a matter of chance. Such parameters include the relative strength of gravitation and electromagnetism, the rate of hydrogen fusion, the fact that there are 3 non-compactified spatial dimensions, and many others. Explanations as to why this is the case have been all over the map, and the issue intersects deeply with questions of creationism and intelligent design.
One argument, outlined by Lawrence Krauss here, is that the large number of seemingly life-conducive parameters may be an artifact of our incomplete understanding of physics. As we learn more, several factors that seem to be unrelated may turn out to be multiple manifestations of the same phenomenon.
While I don’t think that intelligent design is the best solution to the fine-tuning problem, this explanation doesn’t make a lot of sense to me. For one thing, it is rather hand-wavy: it appeals to things that we might know in the future, but without a clear sense of what those things might be. Second, even if the argument holds, it remains unclear why a deeper theory of physics should resolve into a universe conducive to life. Even if several life-conducive parameters turn out to be related, that does not by itself dissolve the paradox.
Intelligent design is one of those topics that I avoided when it was a much more active area of debate, but now that it has calmed down somewhat, I would be interested in understanding these issues better.
Anniversaries
Yesterday, the news was occupied with the 20th anniversary of the 9/11 attacks.
George W. Bush gave a speech that I thought was interesting for two reasons.
First, and this was the issue that most of the news coverage picked up, Bush equated domestic terrorism with foreign terrorism and portrayed the two as comparable threats. I would have found such a statement from a prominent figure unthinkable 20 years ago, especially from Bush. But I think it goes to show the extent to which the United States has again become an inward-looking country. This inward focus is the normal state, at least in my memory. Of the presidential elections I am old enough to remember (since 1992), the 2004 election was the only one where foreign issues were dominant.
I would reckon the outward-looking period began with 9/11, and it was definitely over by the onset of the Global Financial Crisis, though it could have ended with Donald Rumsfeld’s departure from the Defense Department, or maybe sooner with events that eroded Bush’s standing, including the Terri Schiavo incident, the failed attempt to reform Social Security, and Hurricane Katrina.
Bush’s foreign policy during these years was a weird mix of anti-terrorism and Wilsonian democracy-promotion. Such a combination was probably never stable. Now the “America First” movement is strong politically. There is bipartisan opposition to immigration and trade, two issues where Bush was a proponent. In justifying the withdrawal from Afghanistan, President Biden was keen to emphasize that ongoing combat would not have been in the US interest. The political right has made hay for obvious partisan reasons and because the withdrawal was executed so badly, but they were much happier when Trump championed the same policy. Such is the degree of the inward turn that even Bush himself has to respond to it.
The second interesting point was on the idea of national unity. There is a certain 9/11 nostalgia now that pines for the apparent sense of unity that prevailed immediately after the attacks. It would appear to be distinct from, but related to, the idea of the “Sputnik moment” that prevailed in the aftermath of the 1957 Soviet launch of Sputnik, or current anticommunism and anti-Chinese sentiments. It is the hope that widespread recognition of an external threat can suppress internal acrimony and catalyze a more dynamic posture than Americans have shown in recent years. Though it has never been clear to me what this apparent unity is supposed to mean.
Tomorrow I am also noting the first anniversary of my brain aneurysm. In the months after the event and my recovery from it, I have more or less resumed the patterns of living that I had pre-stroke. Perhaps some things are subtly different, such as a more visceral appreciation for the fragility of life and a greater sense of seriousness with which I pursue my goals.
I still get the chills, though, when I think about how close I came to death. I was unlucky that the event happened in the first place, of course, but given that it did, I was fortunate to have a successful operation. Based on what doctors said in the hospital, I probably wouldn’t have survived if this had happened in 2001. I was fortunate to have access to a good (albeit expensive) health care system. I was also fortunate that the aneurysm struck when I was in a well-trafficked hallway of my apartment building, so I was found almost right away.
In happier anniversary news, next week I will be celebrating my 40th birthday.
0 notes