
Tuesday, December 16, 2014

Global warming, peak oil, collapse and the art of forecasting

Predicting the future usually means providing an accurate analysis of the present and extrapolating some trends. Forecasts that try to look decades into the future therefore need to be well informed about the current situation. However, we have various options; we may make different decisions that will influence the outcome. For climate change, the amount of greenhouse gases emitted will be a major factor determining the amount of warming, and this is obviously the topic for negotiations.

The Intergovernmental Panel on Climate Change (IPCC)

The IPCC thus operates with four "representative concentration pathways" (RCPs) of CO2 equivalents in the atmosphere and the effects each will have on the climate. The pathways stand for different strategies of coping with the challenge, from turning off greenhouse gas emissions immediately (not likely to happen) to sticking one's head in the tar sands and increasing emission rates as much as possible. A part of the debate now revolves around whether or not the most restrictive pathway can be combined with continued economic growth, but more about that below.

The IPCC's fifth report from November 2014 comes with its usual summary for policymakers. However, having a policy is not restricted to the assumed readers of the report; we are all policymakers. The IPCC's report sketches the facts (melting ice, ocean acidification, etc.) and suggests mitigation strategies as well as adaptation to the inevitably worsening climate in many parts of the world. The physics behind global warming is perhaps assumed to be known, but possible positive feedback mechanisms are worth mentioning. This is how the summary alludes to them:
It is virtually certain that near-surface permafrost extent at high northern latitudes will be reduced as global mean surface temperature increases, with the area of permafrost near the surface (upper 3.5 m) projected to decrease by 37% (RCP2.6) to 81% (RCP8.5) for the multi-model average (medium confidence).
Melting ice means that the albedo will decrease, so that a darker sea surface absorbs more energy. Methane may also be released, either gradually or in a sudden burp, as permafrost thaws. For the record, methane has a global warming potential many times that of CO2 (the factor depends on the time horizon, which is why different figures may be used).

As usual, the so-called policymakers are reluctant to take concerted action. From the debate it may appear as though the goal of limiting global warming to 2°C would be compatible with business as usual. Oil companies and their allied business partners either invest in campaigns casting doubt on climate science, or argue that their production is so much cleaner than the rest of the world's, and that if they didn't drill for oil, someone else would. Yet we know that most of the known hydrocarbon reserves will have to remain in the ground to minimize the impact of climate change. Even under the most optimistic scenarios, we would have to prepare for extreme weather: droughts, hurricanes, forest fires, flooding, and mass extinction of species. Perhaps it is an awareness of this unwillingness to take prompt measures to reduce greenhouse gas emissions that has tipped the IPCC's focus somewhat towards adaptation to shifting climate conditions, rather than narrowly towards mitigation strategies.

Global Strategic Trends: a broader perspective

With this gloomy outlook in mind, one should not forget the fast-paced development in science and technology, in areas from solar panels to medicine and artificial intelligence. The British Ministry of Defence's think tank, the Development, Concepts and Doctrine Centre, has compiled a report that summarizes trends in several areas over a thirty-year perspective up to 2045. Their Global Strategic Trends (GST) report does not make predictions with asserted likelihoods of various outcomes, but rather illustrates where some current trends may lead, as well as providing alternative scenarios. The strength of the GST report is its multifaceted view. Politics, climate, population dynamics, scientific and technological development, security issues, religion and economics are all at some level interrelated, so that significant events in one domain have implications in the others. Such cross-disciplinary interactions may be difficult to speculate about, but the GST report at least brings all these perspectives into a single pdf, with some emphasis on military strategy and defence. A weakness of the report is that it is perhaps too fragmented, so that the most important challenges are almost lost in all the detail.

Peak oil, the debt economy and climate change

A better and more succinct summary of these great challenges might be one of Richard Heinberg's lectures. As Heinberg neatly explains, the exceptional economic growth witnessed over the last 150 years is the result of the oil industry. In the early years, oil was relatively easy to find and produce. Gradually, the best oil fields have been depleted and only resources of lesser quality remain, such as the recently discovered shale oil across the UK. The production of unconventional oil is much less efficient than that of standard crude oil. The question is how much energy goes into producing energy, or what is the Energy Return On Energy Invested (EROEI)? As this ratio approaches 1, the energy produced merely equals the energy invested in producing it. Although exact figures are hard to estimate, it is clear that the EROEI of unconventional oil is significantly lower than that of conventional oil; its EROEI is in fact so low that even in strictly economic terms it is scarcely worth producing, leaving aside its potentially devastating effects on the environment.
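
To make the arithmetic concrete, here is a minimal sketch in Python (with purely hypothetical EROEI figures, not Heinberg's data) of how the ratio translates into net energy available to society: the net fraction is 1 - 1/EROEI, which collapses as the ratio approaches 1.

    def net_energy_fraction(eroei):
        """Fraction of the gross energy output that remains after
        subtracting the energy invested to produce it."""
        return 1.0 - 1.0 / eroei

    # Purely illustrative EROEI values; real-world estimates vary widely.
    for label, eroei in [("easy conventional oil", 50.0),
                         ("harder conventional oil", 15.0),
                         ("unconventional oil", 4.0),
                         ("break-even", 1.0)]:
        print(f"{label:24s} EROEI {eroei:5.1f} -> "
              f"net fraction {net_energy_fraction(eroei):.2f}")

Going from an EROEI of 50 down to 4 means that the share of production that must be reinvested in producing energy grows from 2% to 25%, which is the sense in which low-EROEI sources are scarcely worth producing.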

Heinberg discusses three issues which, combined, seem to have very dramatic consequences for the civilised world as we know it. First, there is the energy issue and peak oil. It appears unlikely that renewable energy sources and nuclear power will be able to replace fossil fuel at the pace that would be needed for continued economic growth. Second, there is debt as the driver of economic growth. As an illustration, Heinberg mentions the early automobile industry with its efficient, oil-powered assembly-line production, so efficient in fact that it led to over-production. The next problem became how to sell the cars, because they were more expensive than people could generally afford. Hence came the invention of advertising, of "talking people into wanting things they didn't need", and subsidiary strategies such as planned obsolescence: making products that broke down and had to be replaced, or constantly redesigning them and changing fashion so that consumers would want the latest model. The solution to the cars being too expensive was to offer consumers credit. Money is created every time someone gets a loan from a bank on the trust that it will be paid back in the future, which again necessitates economic growth. This system has sometimes been likened to a pyramid scheme, and it is not a very radical idea to believe that it could collapse at some point. The third issue that Heinberg brings up is climate change, which will lead to global disruption as large parts of the world become uninhabitable.

Two of Heinberg's recent essays deal with the energy situation and the coming transition to a society with a very different pattern of energy consumption. He criticizes the excessive optimism about renewable energy on the one hand and the "collapsitarians" on the other, some of whom even think we are bound for extinction. The conclusion is that energy use and consumption in general must be reduced, either voluntarily or as a matter of necessity when there are no longer any alternatives.

An even more vivid account of more or less the same story is given by Michael C. Ruppert in the documentary Collapse. And after the realisation that this is probably where we are heading, it may be best to take the advice of Dmitry Orlov, who has some experience of living in a collapsed Soviet Union: just stay calm, be prepared to fix anything that breaks down yourself, and don't expect any more holiday trips to the other side of the planet!

Friday, August 9, 2013

Synergetics, the book


Hermann Haken: Synergetics. Introduction and Advanced Topics. 

[Disclaimer: There are many things in this book that I do not understand, although hopefully I have grasped the big picture.]

Under the term Synergetics, Haken collects a number of approaches that can be useful in a variety of scientific disciplines ranging from physics, chemistry and biology to economics and even sociology. Synergetics is presented as a discipline in its own right, with its characteristic concepts and methods. Yet it draws on related fields such as thermodynamics, statistical physics, information theory, dynamic systems, control theory, bifurcations and catastrophe theory. Synergetics proposes to shed light on self-organized phenomena in various areas and to treat them within a unified apparatus. In particular, the slaving principle is the one trick that is used again and again. The slaving principle can be thought of in terms of a dynamic system where some variables change fast and others slowly, with a corresponding separation into stable and unstable modes. The fast, stable modes can be eliminated by expressing them as functions of the slow, unstable ones (the order parameters), resulting in great simplifications.
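
As a toy illustration of the slaving principle (a minimal numerical sketch of my own, not an example from the book), consider a system with one slow, unstable mode x and one fast, stable mode y. Because y relaxes quickly, setting its derivative to zero "slaves" it to x and eliminates it from the description:

    # Toy fast-slow system:
    #   dx/dt = eps*x - x*y         slow, unstable "order parameter"
    #   dy/dt = -gamma*y + x**2     fast, stable slaved mode
    # Adiabatic elimination: setting dy/dt ~ 0 gives y ~ x**2 / gamma,
    # and hence the reduced equation dx/dt = eps*x - x**3 / gamma.

    eps, gamma = 0.1, 10.0     # gamma >> eps: clear time-scale separation
    dt, steps = 1e-3, 100_000

    x, y = 0.01, 0.0           # full two-variable system
    xr = 0.01                  # reduced one-variable system

    for _ in range(steps):
        x, y = x + dt * (eps * x - x * y), y + dt * (-gamma * y + x * x)
        xr += dt * (eps * xr - xr**3 / gamma)

    print(f"full system:    x = {x:.4f}")
    print(f"reduced system: x = {xr:.4f}")
    # Both settle near the stable amplitude sqrt(eps * gamma) = 1.0

The reduced equation reproduces the full dynamics because the stable mode tracks the order parameter almost instantaneously; this, in spirit, is the simplification that Haken applies across domains.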

This tome contains two classic volumes in one. Volume one (Introduction) begins gently with tutorial chapters on basic probability theory, ordinary differential equations, and their combination in stochastic differential equations. After the theoretical background has been presented, there is a chapter on self-organization, followed by several chapters devoted to applications in various domains. The chapter on physics deals mainly with lasers. As the chapters then turn to chemistry, biology and economics, the treatment becomes more and more accessible to the non-specialist; at the same time, however, the models seem to become increasingly simplistic. The examples from biology and population dynamics are already sketchy, and the discussion of applications to economics and sociology does not introduce many useful ideas. Nonetheless, one should remember that Haken was among the pioneers who brought a physicist's tool kit to these fields. In particular,
[...] synergetics has established links between dynamic systems theory and statistical physics. Undoubtedly, the marriage between these two disciplines has started. (p. 364 of the double volume) 
Further, regarding the connections of physics, chemistry, biology and even softer sciences:
It thus appears that we are presently from two different sides digging a tunnel under a big mountain which has so far separated different disciplines, in particular the “soft” from the “hard” sciences. (p. 364-5) 
We see the results of this excavation in numerous papers today, where physicists have begun to address such problems as the motion of crowds at concerts or opinion formation before elections. However, there are obvious dangers involved in attacking problems that lie far beyond one's sphere of specialization. In the words of Buckminster Fuller (who also wrote a two-volume book called Synergetics, otherwise bearing little resemblance to Haken's):
The word generalization in literature usually means covering too much territory too thinly to be persuasive, let alone convincing. In science, however, a generalization means a principle that has been found to hold true in every special case.
Apparently both kinds of generalization are involved in Haken's work; the applicability seems to decrease the further away from physics one gets, until it begins to look suspicious when applied to the social sciences. Meanwhile, the single finding that unites all chapters, the slaving principle, exemplifies the kind of generalization that holds true in several special cases, if not in all conceivable scenarios. It is the method of finding solutions that survives generalization; the same cannot necessarily be said of the modelling of systems in different fields.

Volume two (Advanced Topics) starts over with a long expository chapter on the application domains followed by the introduction of the theory. There are short sections on deterministic chaos, but Haken is not the best source on this. Quasi-periodicity is treated extensively. Although the exposition is clear to begin with, soon enough matters get complicated. If you ever wondered what makes a system of differential equations with quasi-periodic coefficients stable or unstable, this is the text to read.

Matters of style

The first chapters of each volume are tutorial in character and cover material that most readers probably already know. The manner of exposition changes as Haken begins to introduce his own findings—one can sense a shifting of gears when his enthusiasm sets in. Unfortunately, these parts involve solutions that stretch over sections or entire chapters, sometimes using idiosyncratic notation. It is often hard to tell whether a variable is supposed to be real, complex, or a vector, even though one may be able to figure it out from the context.

The writing has the appearance of a stream of consciousness laid out at the blackboard rather than elaborated at the typewriter. Throughout the book, variable substitutions are employed so profusely that one almost inevitably loses track of the variables' meaning. The derivations are decidedly informal, with almost no theorems and proofs. (There are a handful of theorems that rely on long lists of assumptions and have long, unwieldy proofs.) Instead there are long chains of "simplifications" or "abbreviations", often resulting in expressions that are longer than the ones they replace, truncations of higher-order terms in series expansions, and other sorts of approximations. All these tricks are of course what physicists are usually good at, but to readers without the proper background they may appear as incomprehensible as pulling rabbits out of a hat.

If synergetics has to do with the self-organization of complex systems, it must be said that Haken is quite terse on the topic of self-organization as such. This is where some conceptual analysis is lacking. On the other hand, the cyberneticians have already produced plenty of hand-waving philosophizing about self-organization, without necessarily contributing much to its understanding. Here, at least, one has a class of problems and an approach to their solution, but there is more to self-organization than what is covered in this book.



Sunday, July 28, 2013

On smoothness under parameter changes

Is your synthesizer a mathematical function?

At least it can be considered in such terms. Each setting of all its parameters represents a point in parameter space, and the output signal depends on the parameter settings. Assuming the parameters remain fixed over time, the generated audio signal may also be considered as a point in another space. In order to relate these output sequences to perceptually more relevant terms, signal descriptors (e.g. fundamental frequency, amplitude, spectral centroid, spectral flux) are applied to the output signal.

Now, in order to assess how smoothly the sound changes as one turns a knob that controls some synthesis parameter, the first step is to relate the amount of change in the signal descriptors to the distance in parameter space. The distance in parameter space corresponds to the angle the knob is turned; let us call this distance Δc. It is trickier to define suitable distance metrics in the space of audio signals, but one option is to use a signal descriptor φ that itself varies over time and to take its time average ⟨φ⟩. The difference Δφ between two such time averages, as the synthesizer is run at two different points in parameter space, may then be taken as the distance metric.

A smooth function has derivatives of all orders. The smoothness of a synthesis parameter may therefore be described in terms of a derivative of the function that maps points in parameter space to points in the space of signal descriptors. This derivative may be defined as the limit of Δφ/Δc as Δc approaches 0. It makes a significant difference whether the pitch control of an oscillator has been designed with a linear or an exponential response, but abrupt changes, corresponding to a discontinuous derivative, are even more conspicuous when they occur.
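
A minimal numerical sketch of this idea in Python (my own illustration, not the implementation from the paper mentioned below): take a toy synthesizer whose single parameter c sets the frequency of a sine oscillator, use the spectral centroid as the descriptor φ, and estimate the derivative by a finite difference Δφ/Δc.

    import numpy as np

    SR = 44100  # sample rate (Hz)

    def synth(c, dur=0.5):
        """Toy one-parameter synthesizer: c in [0, 1] sets the frequency
        of a sine oscillator through an exponential pitch mapping."""
        f = 100.0 * 10.0 ** c                 # 100 Hz at c=0, 1000 Hz at c=1
        t = np.arange(int(SR * dur)) / SR
        return np.sin(2 * np.pi * f * t)

    def centroid(sig):
        """Time-averaged descriptor: spectral centroid of the whole signal."""
        spec = np.abs(np.fft.rfft(sig))
        freqs = np.fft.rfftfreq(len(sig), 1.0 / SR)
        return np.sum(freqs * spec) / np.sum(spec)

    def smoothness(c, dc=1e-3):
        """Finite-difference estimate of the derivative d<phi>/dc at c."""
        return (centroid(synth(c + dc)) - centroid(synth(c))) / dc

    for c in (0.1, 0.5, 0.9):
        print(f"c = {c}: d<centroid>/dc ~ {smoothness(c):7.1f} Hz per unit of c")

Swapping in a linear mapping (f = 100 + 900*c) gives a constant derivative, making the difference between the two responses directly visible.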

Whereas the derivative captures the smoothness locally at each point in parameter space, another way to look at parameter smoothness is to measure the total variation of a signal descriptor as the synthesis parameter goes from one setting to another. As a compromise, the interval over which the total variation is measured may be made very small, so that a local variation is measured over a short parameter interval instead.
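
Continuing the sketch above (and reusing its synth and centroid functions), the total variation can be approximated by sampling the descriptor on a grid of parameter values and summing the absolute differences:

    import numpy as np

    def total_variation(c_start, c_end, n=100):
        """Total variation of the descriptor over [c_start, c_end],
        approximated by summing |phi(c_i+1) - phi(c_i)| over a grid."""
        cs = np.linspace(c_start, c_end, n + 1)
        phis = [centroid(synth(c)) for c in cs]
        return sum(abs(b - a) for a, b in zip(phis, phis[1:]))

    print(total_variation(0.0, 1.0))         # variation over the whole knob range
    print(total_variation(0.49, 0.51, 10))   # local variation around c = 0.5

A descriptor that varies monotonically gives a total variation equal to its net change; any overshoot or oscillation along the way increases it, which is exactly what a smoothness diagnostic should pick up.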

Is this really useful for anything?

Short answer: Don't expect too much. But seriously, whether we like it or not, science progresses in part by taking vague concepts and making them crisper, by making them quantifiable. "Smoothness" under parameter changes is precisely such a vague concept that can be defined in ways that make it measurable. Such a smoothness diagnostic may be useful in the design of synthesis models and their parameter mappings, as well as perhaps for introducing and testing hypotheses about the perceptual discrimination of similar synthesized sounds.

The paper was presented as a poster at the joint SMAC/SMC conference.