The following article originally appeared on The Archdruid Report.
One of the central projects I've been pursuing over the
last year or so, as readers of The Archdruid Report already know, is the reinvention of a "green wizardry"
based on methods and technologies tried and proven during the energy crises of
the Seventies. While this is an educational project in the broad
sense of the term, it's one that has no real
chance of being embraced by the established educational institutions of
American society.
Mind you, it's quite possible that a university here or a community college
there might make itself useful in one way or another. Once it becomes
impossible to ignore the mismatch between the energy resources available to
Americans and the habits of extravagant energy use that have grown up in this
country over the last three decades, new college programs to train alternative
energy professionals will no doubt pop up like mushrooms after a hard spring
rain, competing with the handful of such programs that already exist to
attract an expected torrent of students. Some of these future programs may even
be worth the time, though with the current trajectory of college expenses, it
would amaze me if any turn out to be worth the cost.
They'll probably get the students, though. It's hardwired into the American
psyche these days that a problem for one person is a business opportunity for
someone else, and preferably someone else with the right degree. Books with
titles such as Profit from the Peak
are already festooning bookstore and library shelves, and there will doubtless
be much more of the same thinking on display as peak oil continues its journey
from a fringe concern to an inescapable reality. That's the American mindset as
the 21st century moves deeper into its first great period of crisis; if
scientists were to announce tomorrow that America was about to sink beneath the
waves like Atlantis, I don't doubt for a moment that tens of thousands of
Americans would rush out and try to launch new careers manufacturing and selling
water wings.
The green wizardry I've been writing about doesn't lend itself to that sort
of thinking, because it's not intended for specialists. Now of course it will
help a few careers along — unless you're a dab hand with plumbing, for example,
you're better off getting a professional to install your solar water heating
system — and it may get some started; I've spoken several times already about the
range of small businesses that will be needed as the global economy winds down
and maintaining, rebuilding, and repurposing old technologies becomes a
significant economic sector. Still, most of the techniques and strategies I've
been discussing aren't well suited to make money for some new wave of
specialists; their value is in making life more livable for ordinary people who
hope to get by in the difficult times that are gathering around us right now.
It's worth noting, in fact, that the twilight of the contemporary cult of
specialization is one of the implications of peak oil. Some decades ago,
the chemist Ilya Prigogine showed by way of dizzyingly complex equations
that the flow of energy through a system tends to increase the complexity of
the system over time. It's a principle that's seen plenty of application in
biology, among other fields, but I don't think it's been applied to history as
often as it should have. There does seem to be a broad positive correlation
between the energy per capita available, on average, to the members of a human
society, and the number of different occupational roles available to members of
that society.
As energy per capita soared to its peak in the industrial world of the late
twentieth century, hyperspecialization was the order of the day; as energy per
capita declines — and it's been declining for some time now — the range of
specializations that can be supported by the economy will also decline, and
individuals and families will have to take up the slack, taking over tasks that
for some decades now have been done by professionals. During the transitional
period, at least, this will doubtless generate a great deal of commotion, as
professional specialists whose jobs are going away try to defend their turf by
making life as difficult as possible for those people who, trying to get by in
difficult times, choose the do-it-yourself route. That process is already well under way in a variety of professions, but now I want
to use this lens to examine the future of one of the industrial world's
distinctive creations, the grab bag of investigative methods, ideas about the
universe, and social institutions we call "science."
It's rarely remembered these days that until quite recently, scientific
research was mostly carried on by amateurs. The word "scientist" wasn't even
coined until 1833; before then, and for some time after, the research programs
that set modern science on its way were carried out by university professors in
other disciplines, middle class individuals with spare time on their hands, and
wealthy dilettantes for whom science was a more interesting hobby than horse
racing or politics. Isaac Newton, for example, taught mathematics at Cambridge;
Gilbert White founded the science of ecology with his Natural History of Selborne in his spare time as a clergyman;
Charles Darwin came from a family with a share of the Wedgwood pottery fortune,
had a clergyman's education, and paid his own way around the world on the
H.M.S. Beagle.
It took a long time for science as a profession to catch on, because — pace a myth very widespread these
days — science contributed next to nothing to the technological revolutions that
swept the western world in the eighteenth and nineteenth centuries. Until late
in the nineteenth century, in fact, things generally worked the other way
around: engineers and basement tinkerers discovered some exotic new effect, and
then scientists scrambled to figure out what made it happen. James Clerk
Maxwell, whose 1873 Treatise on Electricity and Magnetism finally got out ahead of the engineers to postulate the effects that
would become the basis for radio, began the process by which science took the
lead in technological innovation, but it wasn't until the Second World War that
science had matured enough to become the engine of discovery it has been ever since. It
was then that government and business investment in basic research took off,
creating the institutionalized science of the present day.
Throughout the twentieth century, investment in scientific research proved to
be a winning bet on the grand scale; it won wars, made fortunes, and laid the
groundwork for today's high-tech world. It's a common belief these days that
more of the same will yield more of the same — that more scientific research will
make it possible to fix the world's energy problems and, just maybe, its other
problems as well. Popular as that view is, there's good reason to doubt it.
The core problem is that scientific research was necessary, but not sufficient,
to create today's industrial societies. Cheap abundant energy was also
necessary, and was arguably the key factor. In a very real sense, the role of
science from the middle years of the nineteenth century on was basically
figuring out new ways to use the torrents of energy that came surging out of
wells and mines to power history's most extravagant boom. Lacking all that
energy, the technological revolutions of the last few centuries very likely
wouldn't have happened at all; the steam turbine, remember, was known to the
Romans, who did nothing with it because all the fuel they knew about was committed
to other uses. Since the sources of fuel we'll have after fossil fuels finish
depleting are pretty much the same as the ones the Romans had, and we can also
expect plenty of pressing needs for the energy sources that remain, it takes an
essentially religious faith in the inevitability of progress to believe that
another wave of technological innovation is right around the corner.
The end of the age of cheap abundant energy is thus also likely to be the end
of the age in which science functions as a force for economic expansion. There
are at least two other factors pointing in the same direction, though, and they
need to be grasped to make sense of the predicament we're in.
First, science itself is well into the territory of diminishing returns, and most
of the way through the normal life cycle of a human method of investigation.
What I've described elsewhere as
abstraction, the form of intellectual activity that seeks to reduce the
complexity of experience to a set of precisely formulated generalizations,
always depends on such a method. Classical logic is another example, and it's
particularly useful here because it completed its life cycle long ago and so
can be studied along its whole trajectory through time.
Logic, like the scientific method, was originally the creation of a movement of
urban intellectuals in a society emerging from a long and troubled medieval
period. Around the eighth century BCE, ancient Greece had finally worked out a
stable human ecology that enabled it to finish recovering from the collapse of
Mycenaean society some four centuries before; olive and grapevine cultivation
stabilized what was left of the fragile Greek soil and produced cash crops eagerly
sought by markets around the eastern Mediterranean, bringing in a flood of
wealth; the parallel with rapidly expanding European economies during the years
when modern science first took shape is probably not coincidental. Initial
ventures in the direction of what would become Greek logic explored various
options, some more successful than others; by the fifth century BCE, what we
may as well call the logical revolution was under way, and the supreme triumphs
of logical method occupied the century that followed. Arithmetic, geometry,
music theory, and astronomy underwent revolutionary developments.
That's roughly where the logical revolution ground to a halt, too, and the next
dozen centuries or so saw little further progress. There were social factors at
work, to be sure, but the most important factor was inherent in the method:
using the principles of logic as the Greeks understood them, there's only so
far you can go. Logical methods that had proved overwhelmingly successful
against longstanding problems in mathematics worked far less well on questions
about the natural world, and efforts to solve the problems of human life as
though they were logical syllogisms tended to flop messily. Once the belief in
the omnipotence of logic was punctured, on the other hand, it became possible
to sort out what it could and couldn't do and, not coincidentally, to assign it
a core place in the educational curriculum, a place it kept right up until the
dawn of the modern world.
I know it's utter heresy even to hint at this, but I'd like to suggest that
science, like logic before it, has gotten pretty close to its natural limits as
a method of knowledge. In Darwin's time, a century and a half ago, it was still
possible to make worldshaking scientific discoveries with equipment that would
be considered hopelessly inadequate for a middle school classroom nowadays;
there was still a lot of low-hanging fruit to be picked off the tree of
knowledge. At this point, by contrast, the next round of experimental advances
in particle physics depends on the Large Hadron Collider, a European project
with an estimated total price tag around $5.5 billion. Many other branches of
science have reached the point at which very small advances in knowledge are
being made with very large investments of money, labor, and computing power.
Doubtless there will still be surprises in store, but revolutionary discoveries
are very few and far between these days.
Yet there's another factor pressing against the potential advancement of
science, and it's one that very few scientists like to talk about. When science
was drawn up into the heady realms of politics and business, it became
vulnerable to the standard vices of those realms, and one of the consequences
has been a great deal of overt scientific fraud.
A study published last year in the Journal of
Medical Ethics surveyed papers formally retracted between 2000 and 2010 in
the health sciences. About a quarter of them were retracted for scientific
fraud, and half of these had a first author who had had another paper
previously retracted for scientific fraud. Coauthors of these repeat offenders
had, on average, three other papers each
that had been retracted. Americans, it may be worth noting, were far more
likely than their overseas colleagues to have papers retracted for fraud, and
to be repeat offenders.
I don't know how many of my readers were taught, as I was, that science is
inherently self-policing and that any researcher who stooped to faking data
would inevitably doom his career. Claims like these are difficult to defend in
the face of numbers of the sort just cited. Logic went through the same sort of
moral collapse in its time; the English word "sophistry" commemorates
the expert debaters of fourth-century Greece who could and did argue with
sparkling logic for anyone who would pay them.
To be fair, scientists as a class would have needed superhuman virtue to
overcome the temptations of wealth, status, and influence proffered them in the
post-Second World War environment, and it's also arguably true that the average
morality of scientists well exceeds that of businesspeople or politicians. That
still leaves room for a good deal of duplicity, and it's worth noting that this
has not escaped the attention of the general public. It's an item of common
knowledge these days that the court testimony or the political endorsement of a
qualified scientist, supporting any view you care to name, can be had for the
cost of a research grant or two. I'm convinced that this is the hidden subtext
in the spreading popular distrust of science that is such a significant feature
in our public life: a great many Americans, in particular, have come to see
scientific claims as simply one more rhetorical weapon brandished by competing
factions in the social and political struggles of our day.
This is unfortunate, because — like logic — the scientific method is a powerful
resource; like logic, again, there are things it can do better than any other
creation of the human mind, and some of those things will be needed badly in
the years ahead of us. Between the dumping of excess specializations in a
contracting economy, the diminishing returns of scientific research itself, and
the spreading popular distrust of science as currently practiced, the
likelihood that any significant fraction of today's institutional science will
squeeze through the hard times ahead is minimal at best. What that leaves, it
seems to me, is a return to the original roots of science as an amateur
pursuit.
There are still some corners of the sciences — typically those where there isn't
much money in play — that are open to participation by amateurs. There are also
quite a few branches of scientific work that are scarcely being done at all these
days (again, because there isn't much money in play) and their number is likely
to increase as funding cuts continue. To my mind, one of the places where these
trends intersect with the needs of the future is in local natural history and
ecology, the kind of close study of nature's patterns that launched the
environmental sciences, back in the day. To cite an example very nearly at
random, it would take little more than a microscope, a notebook, and a camera
to do some very precise studies of the effect of organic gardening methods on
soil microorganisms, beneficial and harmful insects, and crop yields, or to
settle once and for all the much-debated question of whether adding biochar to
garden soil has any benefits in temperate climates.
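By way of illustration only, here is one minimal sketch, in Python, of how the numbers from such a study might be handled once the measurements are in the notebook. The yield figures are placeholders invented for the example, and the simple permutation test stands in for whatever analysis a particular study might actually call for.

    import random

    # Hypothetical crop yields, in kilograms per plot; substitute real measurements.
    biochar_yields = [3.4, 2.9, 3.6, 3.1, 3.2]
    control_yields = [3.0, 2.8, 3.3, 2.7, 2.9]

    def mean(values):
        return sum(values) / len(values)

    observed_diff = mean(biochar_yields) - mean(control_yields)

    # Permutation test: pool every yield, reshuffle many times, and count how
    # often a difference at least as large as the observed one shows up by
    # chance alone.
    pooled = biochar_yields + control_yields
    n = len(biochar_yields)
    rng = random.Random(42)
    trials = 10000
    as_large = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        diff = mean(pooled[:n]) - mean(pooled[n:])
        if diff >= observed_diff:
            as_large += 1

    print("Observed yield difference: %.2f kg per plot" % observed_diff)
    print("Probability of a difference this large by chance: %.3f" % (as_large / trials))

Nothing about this depends on expensive software; the same comparison could be worked out by hand with a pencil and a stack of index cards, which is rather the point.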
These are things the green wizards of the future are going to need to be able
to figure out. With much scientific research in America moving in what looks
uncomfortably like a death spiral, the only way those skills are likely to make
it across the crisis ahead of us is if individuals and local groups pick them
up and pass them on to others. Now is probably not too soon to get started,
either.
Image by sebilden, used under a Creative Commons license.