Tuesday, December 17, 2002

(In the first of a sure-to-be-long series on the mitigation of academic achievement: students in Massachusetts can get a 'certificate of attainment' instead of a diploma, apparently to recognize their attempt, albeit a failed one, to graduate from high school. Wouldn't it be a better idea to figure out why the MA public school system can't get kids to pass its graduation exams after three tries instead of just giving them a meaningless certificate for 'trying'? - Mike)

MA school board okays diploma alternative

Boston Globe
Students who repeatedly fail the English and math portions
of a Massachusetts test will get a "certificate of attainment"
instead of a diploma. The state board chair said in defending
the controversial plan, "It's to recognize and honor the effort
and persistence of students who have stuck it out ... who have
given it their best." (11/27/02)

Pressure on earth's carrying capacity rises, says green group
A new report measuring sustainability in terms of
"ecological footprints" has been released by Redefining
Progress. It states that human activity is exceeding the
planet's "biological capacity" by 20%. (11/28/02)

Rated "R" for smoking
by Charles Paul Freund
An amusing look at the Smoke Free Movies organization. (11/26/02)


Editors' Links

Rated "R" for Smoking

By Charles Paul Freund

Bond lit a cigar. The ace secret agent was, after all, in Havana, and it was almost as natural to smoke a Cuban cigar in that city as it was in the capital, Miami. Pausing to exhale appreciatively, Bond prepared to resume his conversation with the gentleman next to him, a mobster whom he would surely have to kill eventually. There would be time enough for that later, however. For now, it was time to talk tobacco.

Wednesday, December 11, 2002

(Ray Kurzweil's 'Law of Accelerating Returns' Continued...- Mike)

The Singularity Is Near
To appreciate the nature and significance of the coming "singularity," it is
important to ponder the nature of exponential growth. Toward this end, I am
fond of telling the tale of the inventor of chess and his patron, the
emperor of China. In response to the emperor's offer of a reward for his new
beloved game, the inventor asked for a single grain of rice on the first
square, two on the second square, four on the third, and so on. The emperor
quickly granted this seemingly benign and humble request. One version of the
story has the emperor going bankrupt as the 63 doublings ultimately totaled
18 million trillion grains of rice. At ten grains of rice per square inch,
this requires rice fields covering twice the surface area of the Earth,
oceans included. Another version of the story has the inventor losing his head.

It should be pointed out that as the emperor and the inventor went through
the first half of the chess board, things were fairly uneventful. The
inventor was given spoonfuls of rice, then bowls of rice, then barrels. By
the end of the first half of the chess board, the inventor had accumulated
one large field's worth (4 billion grains), and the emperor did start to
take notice. It was as they progressed through the second half of the
chessboard that the situation quickly deteriorated. Incidentally, with
regard to the doublings of computation, that's about where we stand
now--there have been slightly more than 32 doublings of performance since
the first programmable computers were invented during World War II.
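The arithmetic of the chess story is easy to verify. Here is a quick sketch (mine, not Kurzweil's) in Python:

```python
def grains_on_square(n: int) -> int:
    """Grains on square n (1-indexed): 1, 2, 4, 8, ..."""
    return 2 ** (n - 1)

def total_grains(squares: int) -> int:
    """Total grains on the first `squares` squares (geometric series)."""
    return 2 ** squares - 1

print(f"{total_grains(32):,}")  # 4,294,967,295 -- the ~4 billion at half-board
print(f"{total_grains(64):,}")  # 18,446,744,073,709,551,615 -- ~18 million trillion
```

The numbers match the story: roughly 4 billion grains by the end of the first half of the board, and about 18 million trillion for the full 63 doublings.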

This is the nature of exponential growth. Although technology grows in the
exponential domain, we humans live in a linear world. So technological
trends are not noticed as small levels of technological power are doubled.
Then seemingly out of nowhere, a technology explodes into view. For example,
when the Internet went from 20,000 to 80,000 nodes over a two year period
during the 1980s, this progress remained hidden from the general public. A
decade later, when it went from 20 million to 80 million nodes in the same
amount of time, the impact was rather conspicuous.
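The Internet example above is worth making concrete: both episodes are the same two doublings, yet only the second was visible. A small sketch (my numbers taken from the paragraph above):

```python
# Two doublings in both episodes -- the growth *rate* is identical --
# but the absolute change is a thousand times larger the second time.

early_start, early_end = 20_000, 80_000         # 1980s episode
late_start, late_end = 20_000_000, 80_000_000   # 1990s episode

assert early_end / early_start == late_end / late_start == 4.0

early_added = early_end - early_start  # 60,000 new nodes: hidden from view
late_added = late_end - late_start     # 60,000,000 new nodes: conspicuous

print(late_added // early_added)  # 1000
```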

As exponential growth continues to accelerate into the first half of the
twenty-first century, it will appear to explode into infinity, at least from
the limited and linear perspective of contemporary humans. The progress will
ultimately become so fast that it will rupture our ability to follow it. It
will literally get out of our control. The illusion that we have our hand
"on the plug," will be dispelled.

[This is the source of the fear and dread that many people feel when
thinking about future technology. Ted Kaczynski, the Unabomber, stated this
in his manifesto as one of the primary reasons that an absolute moratorium
must be placed on technology. Kaczynski (and Bill Joy) argued that the more
complex things get, the more humans will rely on computers to control them.
With each increase in complexity, humans are moved further and further away
from the controls. Eventually, the argument goes, they will be unable to
control them. This is true, if one were to insist that humans themselves
never change. But as Kurzweil and many others point out, humans will likely
increase their abilities as well, and just as quickly as their AI
counterparts, if not faster.]

Can the pace of technological progress continue to speed up indefinitely? Is
there not a point where humans are unable to think fast enough to keep up
with it? With regard to unenhanced humans, clearly so. But what would a
thousand scientists, each a thousand times more intelligent than human
scientists today, and each operating a thousand times faster than
contemporary humans (because the information processing in their primarily
nonbiological brains is faster) accomplish? One year would be like a
millennium. What would they come up with?

Well, for one thing, they would come up with technology to become even more
intelligent (because their intelligence is no longer of fixed capacity).
They would change their own thought processes to think even faster. When the
scientists evolve to be a million times more intelligent and operate a
million times faster, then an hour would result in a century of progress (in
today's terms).
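The hour-to-century claim checks out as simple arithmetic; a quick sketch (mine, not from the essay):

```python
# Checking "an hour would result in a century of progress": at a
# million-fold speedup, one wall-clock hour covers a million
# subjective hours of work.

speedup = 1_000_000
subjective_hours = 1 * speedup
hours_per_year = 24 * 365.25           # ~8766 hours in a year
subjective_years = subjective_hours / hours_per_year

print(round(subjective_years))  # 114 -- on the order of a century
```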

This, then, is the Singularity. The Singularity is technological change so
rapid and so profound that it represents a rupture in the fabric of human
history. Some would say that we cannot comprehend the Singularity, at least
with our current level of understanding, and that it is impossible,
therefore, to look past its "event horizon" and make sense of what lies beyond.

My view is that despite our profound limitations of thought, constrained as
we are today to a mere hundred trillion interneuronal connections in our
biological brains, we nonetheless have sufficient powers of abstraction to
make meaningful statements about the nature of life after the Singularity.
Most importantly, it is my view that the intelligence that will emerge will
continue to represent the human civilization, which is already a
human-machine civilization. This will be the next step in evolution, the
next high level paradigm shift.

To put the concept of Singularity into perspective, let's explore the
history of the word itself. Singularity is a familiar word meaning a unique
event with profound implications. In mathematics, the term implies infinity,
the explosion of value that occurs when dividing a constant by a number that
gets closer and closer to zero. In physics, similarly, a singularity denotes
an event or location of infinite power. At the center of a black hole,
matter is so dense that its gravity is infinite. As nearby matter and energy
are drawn into the black hole, an event horizon separates the region from
the rest of the Universe. It constitutes a rupture in the fabric of space
and time. The Universe itself is said to have begun with just such a Singularity.

In the 1950s, John von Neumann was quoted as saying that "the ever
accelerating progress of technology...gives the appearance of approaching
some essential singularity in the history of the race beyond which human
affairs, as we know them, could not continue." In the 1960s, I. J. Good
wrote of an "intelligence explosion," resulting from intelligent machines
designing their next generation without human intervention. In 1986, Vernor
Vinge, a mathematician and computer scientist at San Diego State University,
wrote about a rapidly approaching technological "singularity" in his science
fiction novel, Marooned in Realtime. Then in 1993, Vinge presented a paper
to a NASA-organized symposium which described the Singularity as an
impending event resulting primarily from the advent of "entities with
greater than human intelligence," which Vinge saw as the harbinger of a
run-away phenomenon.

From my perspective, the Singularity has many faces. It represents the
nearly vertical phase of exponential growth where the rate of growth is so
extreme that technology appears to be growing at infinite speed. Of course,
from a mathematical perspective, there is no discontinuity, no rupture, and
the growth rates remain finite, albeit extraordinarily large. But from our
currently limited perspective, this imminent event appears to be an acute
and abrupt break in the continuity of progress. However, I emphasize the
word "currently," because one of the salient implications of the Singularity
will be a change in the nature of our ability to understand. In other words,
we will become vastly smarter as we merge with our technology.

[A question that may arise from this line of reasoning is exactly how far
technology will progress. Kurzweil in this essay focuses mainly on
exponential growth trends, and some technologies hint at
exponential-upon-exponential growth. But will this growth continue
indefinitely? It is likely that it will not. Our applications of technology
will be limited by the laws of physics, but this still leaves a vast range
of possibilities. Another familiar mathematical growth curve is the "S"
curve. An S curve, like an exponential curve, starts with slow, steady
increases, looking almost linear. Its rate of increase then grows, and soon
it bends upward, gently at first, then sharply, until the line is nearly
vertical. Here is where an S curve deviates from a simple exponential
curve: an exponential curve may continue upward indefinitely, but the S
curve turns into a mirror image of its own first half. The previously
near-vertical line starts to bend toward the right, gently at first, then
more steeply, until it forms a nearly horizontal line that continues to the
right of the graph. The overall shape of the curve resembles an S slightly
shifted to the right. If this graph represents a technological growth
curve, the smoothing out occurs when we start to hit the limits that the
laws of physics allow. We will get better and better at doing things at
larger and larger scales, but can never pass that limit. It is likely that
technology will follow such a curve; indeed, almost every individual
technology has so far. So what might the universe be like at the end of
this growth curve? We could all be immortal, part-machine hybrids, visiting
planets across the universe.]
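The S curve described in the comment above is the logistic curve from mathematics. A minimal sketch (my own illustration, with an arbitrary ceiling of 1000 standing in for a physical limit) comparing it against a pure exponential:

```python
import math

def exponential(t, r=1.0):
    """Pure exponential growth: climbs without bound."""
    return math.exp(r * t)

def logistic(t, r=1.0, limit=1000.0):
    """Logistic ("S") curve: tracks the exponential early on, then
    flattens toward a hard ceiling, the way a technology saturates
    against physical limits."""
    return limit / (1 + (limit - 1) * math.exp(-r * t))

# Early on the two are nearly indistinguishable...
print(exponential(1), logistic(1))    # ~2.718 vs ~2.713
# ...but the S curve can never exceed its ceiling:
print(exponential(20), logistic(20))  # ~4.9e8 vs ~999.998
```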

When I wrote my first book, The Age of Intelligent Machines, in the 1980s, I
ended the book with the specter of the emergence of machine intelligence
greater than human intelligence, but found it difficult to look beyond this
event horizon. Now having thought about its implications for the past 20
years, I feel that we are indeed capable of understanding the many facets of
this threshold, one that will transform all spheres of human life.

Consider a few examples of the implications. The bulk of our experiences
will shift from real reality to virtual reality. Most of the intelligence of
our civilization will ultimately be nonbiological, which by the end of this
century will be trillions of trillions of times more powerful than human
intelligence. However, to address often expressed concerns, this does not
imply the end of biological intelligence, even if thrown from its perch of
evolutionary superiority. Moreover, it is important to note that the
nonbiological forms will be derivative of biological design. In other words,
our civilization will remain human, indeed in many ways more exemplary of
what we regard as human than it is today, although our understanding of the
term will move beyond its strictly biological origins.

Many observers have nonetheless expressed alarm at the emergence of forms of
nonbiological intelligence superior to human intelligence. The potential to
augment our own intelligence through intimate connection with other thinking
mediums does not necessarily alleviate the concern, as some people have
expressed the wish to remain "unenhanced" while at the same time keeping
their place at the top of the intellectual food chain. My view is that the
likely outcome is that on the one hand, from the perspective of biological
humanity, these superhuman intelligences will appear to be their
transcendent servants, satisfying their needs and desires. On the other
hand, fulfilling the wishes of a revered biological legacy will occupy only
a trivial portion of the intellectual power that the Singularity will bring.

Needless to say, the Singularity will transform all aspects of our lives,
social, sexual, and economic, which I explore herewith.

www.matus1976.com - Article archives

Monday, December 09, 2002

(This article counters some of the common claims and arguments made against
the Pharmaceutical Industry. - Mike)

Bad medicine for America
Heartland Institute
by Joseph L. Bast and Merrill Matthews
"Prescription drugs and their costs are high on political
agendas this campaign season. ... The not-so-secret
agenda ... is to create a new prescription drug
entitlement for seniors (and perhaps for everyone) that
won't bankrupt the country's taxpayers. By demonizing
the drug industry, they hope to build public and
political support for draconian price controls on
prescription drugs." (09/02)

Choice contradictions, double standards
by Sally C. Pipes
"The contradictions of feminism have been amply displayed
of late, but they have not drawn the response one would
expect. Consider the strange case of Marianne Stanley,
coach of the Washington Mystics basketball team ....
Marianne Stanley leaves little doubt, for those who
still have any, that women can be as ignorant,
tyrannical, and discriminatory as men." (10/08/02)


(Hi all, this is the first part in a series I will send out on perhaps one
of the most influential essays I have read in my life. This essay, entitled
'The Law of Accelerating Returns', is authored by Ray Kurzweil. Some
musicians may recognize his name: he invented the first digital music
synthesizer, and a brand of high-end keyboards bears his name. He later
designed the first Optical Character Recognition technology, which is used
by nearly every computer scanner today. He specializes in developing
artificial intelligence applications, and authored the book 'The Age of
Spiritual Machines', in which he expands on the general theme of this
particular essay. In this essay Kurzweil very clearly and logically lays
out the most likely progression of technology we will see in the future, and
shares some deep insights into the history of intelligent life, humans,
technology, and indeed all life and the evolutionary process. This essay is
split into multiple parts because of its length and its associated graphics.
I will space my comments throughout the essay as well. - Mike)

The Law of Accelerating Returns


by Ray Kurzweil

Raymond Kurzweil's essay on the confluence of exponential trends known as
the Law of Accelerating Returns.

Originally published March 7, 2001. Published on KurzweilAI.net March 7,
2001. Originally written November 2000.

You will get $40 trillion just by reading this essay and understanding what
it says. For complete details, see below. (It's true that authors will do
just about anything to keep your attention, but I'm serious about this
statement. Until I return to a further explanation, however, do read the
first sentence of this paragraph carefully.)

Now back to the future: it's widely misunderstood. Our forebears expected
the future to be pretty much like their present, which had been pretty much
like their past. Although exponential trends did exist a thousand years ago,
they were at that very early stage where an exponential trend is so flat
that it looks like no trend at all. So their lack of expectations was
largely fulfilled. Today, in accordance with the common wisdom, everyone
expects continuous technological progress and the social repercussions that
follow. But the future will be far more surprising than most observers
realize: few have truly internalized the implications of the fact that the
rate of change itself is accelerating.

[Re: his comment that people did pretty much what people did in the past,
nowhere is this more evident than in the technological history of human
civilizations. Archaeological evidence clearly displays abysmally slow rates
of technological progress: humans were around for 90,000 years before
developing agriculture, and simple stone tools prevailed for nearly 50,000
years, finally eclipsed by slightly less simple stone tools. There is no
doubt that early humans did, culturally, exactly what their grandparents,
great-grandparents, great-great-grandparents, and so on did; only rarely did
any small change creep into the society.]

The Intuitive Linear View versus the Historical Exponential View

Most long range forecasts of technical feasibility in future time periods
dramatically underestimate the power of future technology because they are
based on what I call the "intuitive linear" view of technological progress
rather than the "historical exponential view." To express this another way,
it is not the case that we will experience a hundred years of progress in
the twenty-first century; rather we will witness on the order of twenty
thousand years of progress (at today's rate of progress, that is).

[This may sound amazing at first, but consider that it took 90,000 years for
humans to develop agriculture, then about 10,000 more to develop written
language, then about 5,000 for simple metalworking. The notion of science
arrived about 2,500 years ago, and shortly thereafter the industrial
revolution, and then came nuclear energy, electricity, radio, television,
medicine, airplanes, automobiles, the internet, space travel.... It's
impossible not to notice the increasing rate of technological progress.]

This disparity in outlook comes up frequently in a variety of contexts, for
example, the discussion of the ethical issues that Bill Joy raised in his
controversial WIRED cover story, Why The Future Doesn't Need Us. Bill and I
have been frequently paired in a variety of venues as pessimist and optimist
respectively. Although I'm expected to criticize Bill's position, and indeed
I do take issue with his prescription of relinquishment, I nonetheless
usually end up defending Joy on the key issue of feasibility. Recently a
Nobel Prize-winning panelist dismissed Bill's concerns, exclaiming that,
"we're not going to see self-replicating nanoengineered entities for a
hundred years." I pointed out that 100 years was indeed a reasonable
estimate of the amount of technical progress required to achieve this
particular milestone at today's rate of progress. But because we're doubling
the rate of progress every decade, we'll see a century of progress--at
today's rate--in only 25 calendar years.

[Bill Joy, co-founder and chief scientist of Sun Microsystems, upon meeting
and discussing the future with Kurzweil, was reportedly scared out of his
wits and wrote a cover article for Wired calling for complete government
control of technology and a forced technological moratorium.]

When people think of a future period, they intuitively assume that the
current rate of progress will continue for future periods. However, careful
consideration of the pace of technology shows that the rate of progress is
not constant, but it is human nature to adapt to the changing pace, so the
intuitive view is that the pace will continue at the current rate. Even for
those of us who have been around long enough to experience how the pace
increases over time, our unexamined intuition nonetheless provides the
impression that progress changes at the rate that we have experienced
recently. From the mathematician's perspective, a primary reason for this is
that an exponential curve approximates a straight line when viewed for a
brief duration. So even though the rate of progress in the very recent past
(e.g., this past year) is far greater than it was ten years ago (let alone a
hundred or a thousand years ago), our memories are nonetheless dominated by
our very recent experience. It is typical, therefore, that even
sophisticated commentators, when considering the future, extrapolate the
current pace of change over the next 10 years or 100 years to determine
their expectations. This is why I call this way of looking at the future the
"intuitive linear" view.

But a serious assessment of the history of technology shows that
technological change is exponential. In exponential growth, we find that a
key measurement such as computational power is multiplied by a constant
factor for each unit of time (e.g., doubling every year) rather than just
being added to incrementally. Exponential growth is a feature of any
evolutionary process, of which technology is a primary example. One can
examine the data in different ways, on different time scales, and for a wide variety of
technologies ranging from electronic to biological, and the acceleration of
progress and growth applies. Indeed, we find not just simple exponential
growth, but "double" exponential growth, meaning that the rate of
exponential growth is itself growing exponentially. These observations do
not rely merely on an assumption of the continuation of Moore's law (i.e.,
the exponential shrinking of transistor sizes on an integrated circuit), but
are based on a rich model of diverse technological processes. What it clearly
shows is that technology, particularly the pace of technological change,
advances (at least) exponentially, not linearly, and has been doing so since
the advent of technology, indeed since the advent of evolution on Earth.
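The "double" exponential idea can be made concrete with a toy comparison (my own illustration, not from the essay): for a simple exponential, the effective exponential rate stays constant, while for a double exponential the rate itself keeps rising.

```python
import math

# A simple exponential has a constant exponential rate; a "double"
# exponential is one whose exponential rate is itself growing.

def simple_exp(t):
    return math.exp(t)            # log(value) / t stays constant

def double_exp(t):
    return math.exp(math.exp(t))  # log(value) / t itself grows

for t in (1, 2, 3):
    # effective exponential rate so far: log(value) / elapsed time
    print(t, math.log(simple_exp(t)) / t, math.log(double_exp(t)) / t)
```

For the simple exponential the second column stays at 1.0; for the double exponential the third column climbs at every step.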

I emphasize this point because it is the most important failure that
would-be prognosticators make in considering future trends. Most technology
forecasts ignore altogether this "historical exponential view" of
technological progress. That is why people tend to overestimate what can be
achieved in the short term (because we tend to leave out necessary details),
but underestimate what can be achieved in the long term (because the
exponential growth is ignored).

The Law of Accelerating Returns

We can organize these observations into what I call the law of accelerating
returns as follows:

- Evolution applies positive feedback in that the more capable methods
resulting from one stage of evolutionary progress are used to create the
next stage. As a result, the rate of progress of an evolutionary process
increases exponentially over time. Over time, the "order" of the information
embedded in the evolutionary process (i.e., the measure of how well the
information fits a purpose, which in evolution is survival) increases.

- A correlate of the above observation is that the "returns" of an
evolutionary process (e.g., the speed, cost-effectiveness, or overall
"power" of a process) increase exponentially over time.

- In another positive feedback loop, as a particular evolutionary process
(e.g., computation) becomes more effective (e.g., cost effective), greater
resources are deployed toward the further progress of that process. This
results in a second level of exponential growth (i.e., the rate of
exponential growth itself grows exponentially).

- Biological evolution is one such evolutionary process.

- Technological evolution is another such evolutionary process. Indeed, the
emergence of the first technology creating species resulted in the new
evolutionary process of technology. Therefore, technological evolution is an
outgrowth of--and a continuation of--biological evolution.

- A specific paradigm (a method or approach to solving a problem, e.g.,
shrinking transistors on an integrated circuit as an approach to making more
powerful computers) provides exponential growth until the method exhausts
its potential. When this happens, a paradigm shift (i.e., a fundamental
change in the approach) occurs, which enables exponential growth to continue.

[This is an important fact to note. Kurzweil spells this out in more detail
later, specifically regarding computational technology, which, starting with
Babbage's mechanical computers and moving through vacuum tubes to modern
transistors and integrated circuits, has undergone about five paradigm
shifts yet still follows Moore's law accurately. The same could be said of
any technology field, say transportation: through various shifts in
technology we are able to extract more energy from a smaller device and get
farther in less time than we ever could before, and this trend continues,
from internal combustion engines moving toward hydrogen fuel cell
technology, to airplanes moving from internal-combustion-powered propellers
to modern jet turbine engines. The same trends can be seen and analyzed at
all levels of technology and in all fields.]
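The paradigm-shift mechanism described above can be sketched as a cascade of S curves, each taking over as the previous one saturates, so the overall envelope keeps climbing. A toy model (my own illustration; the three paradigms, their timings, and 10x ceilings are invented for the sketch):

```python
import math

def s_curve(t, center, ceiling, floor=1.0, steepness=1.0):
    """One paradigm: logistic growth from `floor` toward `ceiling`,
    centered at time `center`."""
    return floor + (ceiling - floor) / (1 + math.exp(-steepness * (t - center)))

def capability(t):
    """Overall capability: whichever of three successive paradigms
    (each with a 10x higher ceiling) delivers the most at time t."""
    return max(
        s_curve(t, center=5, ceiling=10),
        s_curve(t, center=15, ceiling=100),
        s_curve(t, center=25, ceiling=1000),
    )

# Each paradigm flattens as it exhausts its potential, but the next one
# takes over, so the envelope keeps climbing:
for t in (5, 15, 25):
    print(t, capability(t))  # 5.5, then 50.5, then 500.5
```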

If we apply these principles at the highest level of evolution on Earth, the
first step, the creation of cells, introduced the paradigm of biology. The
subsequent emergence of DNA provided a digital method to record the results
of evolutionary experiments. Then, the evolution of a species who combined
rational thought with an opposable appendage (i.e., the thumb) caused a
fundamental paradigm shift from biology to technology. The upcoming primary
paradigm shift will be from biological thinking to a hybrid combining
biological and nonbiological thinking. This hybrid will include
"biologically inspired" processes resulting from the reverse engineering of
biological brains.

If we examine the timing of these steps, we see that the process has
continuously accelerated. The evolution of life forms required billions of
years for the first steps (e.g., primitive cells); later on progress
accelerated. During the Cambrian explosion, major paradigm shifts took only
tens of millions of years. Later on, Humanoids developed over a period of
millions of years, and Homo sapiens over a period of only hundreds of
thousands of years.

[As a testament to the complexity of single-celled organisms, multicelled
organisms, and humans, and to the growing rate of change: it took 1.5
billion years after the advent of the first self-replicating molecule to
evolve multicellular organisms, and another 1.5 billion years for sentient
human beings to evolve from the first multicellular organisms. Imagine what
will come from humans in 1.5 billion years.]

With the advent of a technology-creating species, the exponential pace
became too fast for evolution through DNA-guided protein synthesis and moved
on to human-created technology. Technology goes beyond mere tool making; it
is a process of creating ever more powerful technology using the tools from
the previous round of innovation. In this way, human technology is
distinguished from the tool making of other species. There is a record of
each stage of technology, and each new stage of technology builds on the
order of the previous stage.

The first technological steps--sharp edges, fire, the wheel--took tens of
thousands of years. For people living in this era, there was little
noticeable technological change in even a thousand years. By 1000 A.D.,
progress was much faster and a paradigm shift required only a century or
two. In the nineteenth century, we saw more technological change than in the
nine centuries preceding it. Then in the first twenty years of the twentieth
century, we saw more advancement than in all of the nineteenth century. Now,
paradigm shifts occur in only a few years time. The World Wide Web did not
exist in anything like its present form just a few years ago; it didn't
exist at all a decade ago.

[see chart01.jpg]
[This graph clearly shows the exponential increase in what Kurzweil labeled
'The mass use of Inventions'.]

The paradigm shift rate (i.e., the overall rate of technical progress) is
currently doubling (approximately) every decade; that is, paradigm shift
times are halving every decade (and the rate of acceleration is itself
growing exponentially). So, the technological progress in the twenty-first
century will be equivalent to what would require (in the linear view) on the
order of 200 centuries. In contrast, the twentieth century saw only about 25
years of progress (again at today's rate of progress) since we have been
speeding up to current rates. So the twenty-first century will see almost a
thousand times greater technological change than its predecessor.
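The "200 centuries" figure can be roughly reproduced from the doubling-per-decade assumption. A crude discrete sum (my own back-of-the-envelope, crediting each decade at its end-of-decade rate):

```python
# Rough check of "on the order of 200 centuries" of progress in the
# twenty-first century, assuming the rate of progress doubles every
# decade, starting from a rate of 1 "today-year" per calendar year.

total = 0
rate = 1            # progress per year, in "today's rate" years
for decade in range(10):
    rate *= 2       # the rate has doubled by the end of the decade
    total += 10 * rate

print(total)  # 20460 -- roughly 20,000 years, i.e. ~200 centuries
```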

[End of part 1 - Mike]

(Evolutionary psychology is a favorite research topic of mine. It's very
interesting to see which aspects of human behavior can be attributed to, or
shown to be influenced by, evolutionary pressures formed in our distant
past. This particular article suggests that jealousy, previously thought to
be readily explainable under simplified evolutionary psychology, may be a
bit more complicated. - Mike)

Jealous? Maybe It's Genetic. Maybe Not.



Jealousy, according to evolutionary psychologists, evolved a million or so
years ago on the African plain, where life was no picnic.

Out there on the savanna, a man had to constantly guard against cuckoldry,
lest he squander his resources, unwittingly feeding that hard-earned leg of
mastodon to some other guy's progeny.

Women had other things to worry about, like keeping the meat coming in.
Sure, it bothered them if their men indulged in a little hanky-panky by the
watering hole. But the real threat was if a man became emotionally attached
to another woman: who would bring home the mastodon then?

At least, that's the theory advanced by evolutionary psychologists, who in
the last decade have ushered Darwinian theory into new and provocative
areas, including the relationship between the sexes. As a result of such
differing survival pressures long ago, they maintain, the brains of modern
men and women are programmed to respond differently to the infidelity of a
romantic partner. Men become more jealous over sexual infidelity, a strategy
that worked pretty well in the Stone Age, promoting reproductive success.
Women are more distressed by emotional betrayal, which could leave them
without resources.

It is an appealing argument in a society where men are considered to be from
Mars and women from Venus, and one that has gained substantial purchase
among evolutionary scientists and in popular literature. It is also
supported by a variety of studies finding evidence for such a sex
difference, many of them carried out by Dr. David M. Buss, an evolutionary
psychologist at the University of Texas, and his colleagues.

"Men and women may be equally jealous, but the events that trigger jealousy
differ," Dr. Buss wrote in "The Dangerous Passion: Why Jealousy Is as
Necessary as Love and Hate."

Other scholars have not been so convinced. They have argued that it is more
likely that differences between men and women that evolutionary
psychologists attribute to natural selection - like the tendency of men to
be polygamous and women, monogamous - are the product of cultures, not
evolution. Jealousy is probably no exception.

So the nature-nurture debate has continued over the years.

But two new research papers take a different tack. They do not dispute that
evolution plays a role in shaping human behavior. But they question the
evidence assembled by Dr. Buss and others for the notion that jealousy
evolved differently in men and in women.

In one paper, to appear in the November issue of The Journal of Personality
and Social Psychology, researchers led by Dr. David DeSteno, a psychologist
at Northeastern University, assert that the sex difference revealed in many
studies of jealousy by evolutionary psychologists is spurious, an artifact
of the particular method used in those studies.

They suggest that, rather than representing a hard-wired psychological
mechanism for promoting reproduction, jealousy could have evolved in each
sex for some more general purpose - for example, protecting social bonds in
a very social species.

"I'm very sympathetic to the evolutionary view," Dr. DeSteno said. "I think
it's ridiculous to assume that the human mind was not subject to the
evolutionary chisel. But I think there can be numerous evolutionary
arguments for how specific social behaviors develop."

Dr. DeSteno and his colleagues - Monica Y. Bartlett and Julia Braverman of
Northeastern and Dr. Peter Salovey of Yale - say the problem with many of
the studies conducted by Dr. Buss and other investigators is that they all
use the same technique: the subjects are asked to call to mind a serious
committed relationship that they had, that they now have or that they would
like to have.

They are then presented with two forms of infidelity - one sexual, one
emotional - and asked which they would find most distressing. (Dr. Buss
calls this method "Sophie's Choice," referring to the book and movie in
which the title character must choose which of her children will be killed.
Other psychologists call it "forced choice.")

Using this method, virtually every study has found a difference between the
sexes, with women being more likely to pick emotional infidelity as the most
upsetting choice.

But Dr. DeSteno and his colleagues conducted their own studies, adding other
ways of measuring jealousy, for instance, asking the 111 subjects,
undergraduates at Northeastern, to rate on a seven-point scale how upset
they would be about each form of infidelity in turn, rather than having them
choose between the two forms presented together.

When such other methods were used, the researchers found, the gap between
men and women disappeared; both sexes said they were more disturbed by
sexual infidelity.

They then investigated further, to determine the reason for the discrepancy
between the techniques.

"It's very strange from an evolutionary perspective why the sex difference
would only occur" in the forced-choice situation and not in others, Dr.
DeSteno said.

One possibility, the researchers reasoned, was that instead of eliciting an
automatic, preprogrammed response to infidelity - the kind one would expect
from a mechanism designed by evolution - the forced-choice method sent the
subjects into a more complex intellectual decision-making process, in which
they weighed the trade-offs between the two unpleasant alternatives.

To test this hypothesis, the researchers conducted another study, in which
half the subjects filled out a questionnaire asking, among other things,
whether they would be more upset if a romantic partner "had passionate sex
with someone else" or "formed a deep emotional bond to someone else." The
other subjects were given the same task, but they were asked to
simultaneously remember a string of numbers while answering the questions -
a twist the researchers hoped would eliminate the possibility of complicated
reasoning, forcing an automatic response.

The researchers found that among the subjects who completed the
questionnaire free from distraction, the usual sex difference appeared, with
more women choosing emotional infidelity. But among the subjects who had to
remember the numbers, there was no sex difference; women, as well as men,
identified sexual infidelity as the most upsetting.

"The fact that women's responses on the forced-choice measure mirrored those
of men argues forcefully against the existence of innate sex differences,"
the researchers wrote.

Dr. Buss, however, said he failed to find the new research convincing. Dr.
DeSteno and his colleagues, Dr. Buss said, had distorted the claims of
evolutionary psychology.


"These authors take a kind of rigid, robotic, stereotypic and false
depiction of the evolutionary hypothesis and then show that those robotic
depictions are wrong," Dr. Buss said. "I could develop any number of
contexts in which you could make the sex differences in jealousy disappear;
the fact that you could create a laboratory experiment in which you do so
is, in my view, a meaningless and trivial demonstration."

Besides, he added, a smaller study, published this year, found sex
differences even when methods other than forced-choice were used to
determine preferences. Dr. Todd Shackelford, an associate professor of
psychology at Florida Atlantic University and a former student of Dr. Buss,
also had objections.

"I guess, to state it plainly, I think the paper is in large part
ludicrous," he said. "It's clear to me that they have an agenda they're
pushing."

Yet in an extensive critique, to be published next year in the journal
Personality and Social Psychology Review, Dr. Christine R. Harris, a
psychologist and research scientist at the University of California at San
Diego, says Dr. DeSteno and his colleagues have identified only one of many
serious flaws in the case for evolved sex differences in jealousy.

"The evidence supporting this theory is far less conclusive than is often
maintained," Dr. Harris said.

For example, she pointed out that the forced-choice studies of jealousy have
found differences between American and European men as large as those
between American men and women. And in some Asian cultures, the disparity is
even larger: only 25 percent of Chinese men, for example, chose sexual
infidelity as more distressing in one study; 75 percent picked emotional
infidelity.

Such findings, Dr. Harris wrote, seem "quite problematic" to a theory that
posits an evolutionarily evolved mechanism operative in most, if not all,
humans, while the results are compatible with the idea that culture
influences the jealous responses of men and women.

Another difficulty, she continued, is that some studies examining real
instances of unfaithfulness - as opposed to the imagined infidelity of
college students and other laboratory subjects - found very different
patterns of results.

In one study, involving adults living in sexually open marriages, for
example, more women than men reported being bothered by the thought of their
mate's engaging in sexual intercourse with another person, Dr. Harris said.
Another study found that both men and women dwelled more on the sexual side
of a mate's infidelity than the emotional aspects.

Dr. Harris also takes on the finding, reported in the 1980's by evolutionary
psychologists like Dr. Martin Daly and Dr. Margo Wilson at McMaster
University in Ontario, that men are far more likely than women to kill their
spouses out of sexual jealousy. Men, Dr. Harris pointed out, are more likely
to be the perpetrators in all forms of violent crime. When the proportion of
homicides involving jealousy is considered, rather than the absolute number
of such acts, women are just as likely to kill out of jealousy as men are.

Perhaps predictably, such arguments are unlikely to put an end to the
continuing debate over evolution's role in shaping jealous passion.

Dr. Shackelford waved away Dr. Harris's critique and the criticisms made by
other researchers as misguided forays intended "to cater to the muddled
masses of mainstream psychology."

Dr. Buss, for his part, offered the verbal equivalent of a shrug.

"People have always been resistant to evolution," he said. "We're in the
midst of a scientific revolution in the field of psychology."

"It took 400 years for the Catholic church to forgive Galileo," he added.
"Will it take longer for this? I don't know, but it's going to happen."

(Recent developments in neuroscience have re-sparked much debate about
free will vs. determinism. Very interesting article - Mike)

A Question of Will

Boston Globe, October 15, 2002

Neuroscientists have detected brain signals directing a muscle to move
before the person reports having made a conscious ("free will") decision to
move the muscle. They've also found that magnetic fields influence a human's
decision (to choose left or right), yet people still "feel" they made the
decision freely.

The research has renewed the age-old controversy over free will vs.
determinism.

from -

A question of will
The issue of free will has perplexed theologians and philosophers for
centuries - now neuroscience enters the age-old debate

By Carey Goldberg, Globe Staff, 10/15/2002

Try this: At a moment of your choosing, flick your right wrist. A bit later,
whenever you feel like it, flick that wrist again.

Most likely, you'd swear that you, the conscious you, chose to initiate that
action, that the flickings of your wrist were manifestations of your will.
But there is powerful evidence from brain research that you would be wrong.
That, in fact, the signal that launched your wrist motion went out before
you consciously decided to flick.

''But, but, but,'' you'd probably like to argue, ''but it doesn't feel that
way.''

With that protest, you would be joining a great debate among
neuroscientists, philosophers and psychologists that is a modern-day version
of the age-old wrangling over free will.

The traditional conundrum went: ''How can God be all-knowing and
all-powerful and yet humans still have free will?'' And later: ''How can
everything be governed by the determinist forces of physics and biology and
society, and yet humans still have free will?''

Those questions still concern many, but the new neuro-flavored debate over
free will goes more like this: Is the feeling of will an illusion, a wily
trick of the brain, an after-the-fact construct? Is much of our volition
based on automatic, unconscious processes rather than conscious ones?

When Daniel M. Wegner, a Harvard psychology professor and author of a new
book, ''The Illusion of Conscious Will,'' gives talks about his work,
audience members sometimes tell him that if people are not seen as the
authors of their actions, it means anarchy, the end of civilization. And
worse. Some theologies, they tell him, hold that if there is no free will,
believers cannot earn a ticket to heaven for their virtue.

In reality, neuroscience is not generally tackling the sweeping
philosophical issue of free will, but something much narrower, said Chris
Frith, a neuroscientist at University College London.

''There has been much recent work addressing the question of how it is that
we experience having free will, i.e., why and when we feel that we are in
control of our actions,'' he wrote in an e-mail.

That is not to say that neuroscience will never enter the philosophical
debate.

It could even be that, once the physiological basis of will becomes better
understood, ''You'll get a more mature, larger view of what's going on and
the question of free will might vanish,'' speculated V. S. Ramachandran,
director of the Center for Brain and Cognition at the University of
California at San Diego. No one argues about ''vital spirits'' now that we
know about DNA, he noted.

Meanwhile, the debate is still on, and near its center is an 86-year-old
University of California professor emeritus of physiology, Benjamin Libet.
His seminal experiments on brain timing and will came out back in the
mid-1980s, and the results are still reverberating loudly today.

Just this summer, the journal Consciousness and Cognition put out a special
issue on ''Timing relations between brain and world'' that prominently
featured Libet's work. And, at a conference, titled ''The Self: from Soul to
Brain,'' held by the New York Academy of Sciences last month, ''Libet''
rolled off more tongues than Descartes or Kant or Hume or the other
philosophers whose names usually come up when the subject is will.

What Libet did was to measure electrical changes in people's brains as they
flicked their wrists. And what he found was that a subject's ''readiness
potential'' - the brain signal that precedes voluntary actions - showed up
about one-third of a second before the subject felt the conscious urge to
flick.

The result was so surprising that it still had the power to elicit an
exclamation point from him in a 1999 paper: ''The initiation of the freely
voluntary act appears to begin in the brain unconsciously, well before the
person consciously knows he wants to act!''

Libet's experiments continue to be criticized from every which angle. At the
New York conference, for example, Tufts philosopher Daniel C. Dennett argued
that it could be that the experience of will simply enters our consciousness
with a delay, and thus only seems to follow the initiation of the action.
But, though controversial, the Libet experiments still stand and have been
replicated. And they have been joined by a growing body of research that
indicates, at the very least, that the feeling of will is fallible.

Among that research is the following experiment by Dr. Alvaro Pascual-Leone,
director of the Laboratory for Magnetic Brain Stimulation at the Beth Israel
Deaconess Medical Center.

A subject, he said, would be repeatedly prompted to choose to move either
his right or his left hand. Normally, right-handed people would move their
right hands about 60 percent of the time.

Then the experimenters would use magnetic stimulation in certain parts of
the brain just at the moment when the subject was prompted to make the
choice. They found that the magnets, which influence electrical activity in
the brain, had an enormous effect: On average, subjects whose brains were
stimulated on their right-hand side started choosing their left hands 80
percent of the time.

And, in the spookiest aspect of the experiment, the subjects still felt as
if they were choosing freely.

''What is clear is that our brain has the interpretive capacity to call free
will things that weren't,'' he said.

Wegner's book discusses a variety of other mistakes of will. Among them is
the ''alien-hand'' syndrome, in which brain damage leaves people with the
sense that their hand no longer belongs to them, and that it is acting -
say, unbuttoning their shirt - out of their control.

Another recent book, ''The Volitional Brain: Toward a Neuroscience of Free
Will,'' includes a psychiatrist's description of a German patient who felt
compelled to stand at the window all day, willing the sun across the sky.
Wegner argues that ''the feeling of will is our mind's way of estimating
what it thinks it did.'' And that, he said, ''is not necessarily a perfect
estimate.'' It is ''a kind of accounting system rather than a direct
read-out of how the causal process is working.''

In Libet's interpretation, free will could still exist as a kind of veto
power, in the fractions of a second between the time you unconsciously
initiate an action and the time you actually carry it out.

For example, he said in a telephone interview, ''The guy who killed the
mayor of San Francisco, he was obviously deliberating in advance, but then
when he gets to the mayor, there's still the process of, does he now pull
the trigger? That's the final act now. That is initiated unconsciously, but
he's still aware a couple of hundred milliseconds before he does it and he
could control it, but he doesn't.''

''That is where the free will is,'' Libet said.

Such veto power is not enough for many people, however. ''I want more free
will than that,'' Dennett complained at the conference.

He may not get it, but he will almost surely get more data about it. Some
neuroscientists are using new brain imaging technology to try to pinpoint
what happens in the brain when a person wills something. With its help, and
further work being done on patients with abnormal volition, more progress
appears likely.

''I think,'' Frith wrote, that ''in the next few years we will have quite a
good understanding of the brain mechanisms that underlie our feeling of
being in control of our actions.'' But that, he hastened to add, ''does not
in any way eliminate free will.''

Further comfort comes from Michael S. Gazzaniga, director of the Center for
Cognitive Neuroscience at Dartmouth College.

There is no need, he said, ''for depressing nihilistic views that we're all
robots walking around on someone else's agenda. It's the agenda we build
through experience, and the system is making choices.''

And just because some processes in the brain are automatic does not mean
they all are, he said. ''My take,'' Gazzaniga said, ''is that brains are
automatic and people are free.''

Carey Goldberg may be reached at Goldberg@globe.com.

This story ran on page C1 of the Boston Globe on 10/15/2002.

Forever Young

Washington Post, October 13, 2002

Eminent technologists believe science will evolve so fast in their lifetimes
that they will energetically live a very long time, if not be effectively
immortal.

Ray Kurzweil sees biotechnologies within the decade that will allow us to
regrow our tissues and organs, prevent hardening of the arteries and cure
diabetes. Beyond 10 years he sees technologies that will allow us to
supplement our red and white blood cells with little robotic devices that
are hundreds of times faster.

He also sees us replacing our gastrointestinal system with an engineered one
that would allow us to eat as much of anything as we want, for sociability
and pleasure, while our new gut "intelligently extracts nutrients from food"
and trashes the rest. Ultimately he envisions us expanding our brains
through "intimate interaction with nonbiological intelligence," i.e.,
computers.

William Haseltine, CEO of Human Genome Sciences, almost 58, thinks "people
my age will live into their 100s -- and healthy for most of that time." He
bases that just on existing technologies that lower cholesterol levels,
strengthen bones, control high blood pressure and offer surgeons terrific
images of what's going on inside the body; tissue engineering in which you
create bladders and blood vessels and cartilage outside the body for
eventual implant; and mechanical helpers.

from - http://www.washingtonpost.com/wp-dyn/articles/A12377-2002Oct11.html

Forever Young

Suppose You Soon Can Live to Well Over 100, As Vibrant and Energetic as You
Are Now. What Will You Do With Your Life?
William Haseltine, chairman of Human Genome Sciences in Rockville: "People
my age will live into their 100s--and be healthy for most of that time."
(Susan Biddle - The Washington Post/File Photo)

By Joel Garreau
Washington Post Staff Writer
Sunday, October 13, 2002; Page F01

Just one generation ago, Jack Benny got laughs of recognition for
perpetually claiming to be 39. At the time, 40 was over-the-hill. The idea
of sexy 50-, 60-, 70- and 80-year-olds seemed a contradiction in terms.

How aging has changed. This is no longer the case, as has been demonstrated
by Tina Turner, Susan Sarandon, Cheryl Tiegs, Isabella Rossellini, Glenn
Close, Goldie Hawn, Diane Keaton, Farrah Fawcett, Cher, Charo, Barbra
Streisand, Candice Bergen, Lauren Hutton, Cybill Shepherd, Catherine
Deneuve, Blythe Danner, Faye Dunaway, Dolly Parton, Lynn Russell, Sophia
Loren, Joan Collins, Jane Fonda, Raquel Welch, Stockard Channing, Kathleen
Turner, Diane Sawyer, Tipper Gore, Shirley MacLaine and Lena Horne. Not to
mention Sting, Peter Jennings, Bill Clinton, Vicente Fox, Junichiro Koizumi,
Clint Eastwood, Robert Redford, Harry Belafonte, Chuck Yeager, Sonny
Jurgensen, O.J. Simpson, Sean Connery, Colin Powell, Kevin
Costner, Oscar de la Renta, Ricardo Montalban, Tom Stoppard, Vernon Jordan,
Warren Beatty, Harrison Ford and Paul Newman.

When feminist and one-time Playboy bunny Gloria Steinem turned 50 at a gala
looking "younger, thinner and blonder than ever," as one partygoer put it,
she famously insisted, "This is what 50 looks like." That was 18 years ago.
Now we have grandparents in their eighties casually jet-setting off to the
Great Wall of China, and dancing, prancing rock-and-roll stars in their
sixties. Rock-and-roll stars in their sixties!

Remarkably, such pioneers of agelessness have accomplished all this using
what some would call primitive means -- exercise and diet, for example,
antibiotics and vaccines, makeup and plastic surgery.

Enough of that. Today, a whole new industry is booming that vows to slow,
halt or actually reverse aging. The lure is not just achieving advanced
years. It is doing so vigorously and even, dare we say it, youthfully.
Americans are spending an estimated $6 billion this year on substances from
ginkgo biloba to human growth hormone that claim to offer new powers. Some
scientific skeptics think all this money literally is being peed away. They
believe all those potions are passing through people's metabolisms producing
nothing but expensive urine.

At the same time:

* Respected demographers calculate that half the American girls born today
will live to be 100.

* The number of people older than 100 in America has been increasing by more
than 7 percent per year since the '50s. The fastest-growing group of drivers
in Florida is over 85.

* Dozens of start-up companies have been created in the last five years that
are in the business of dramatically slowing aging. Some are staffed by
distinguished scientists, including former members of the National Institute
on Aging, part of the National Institutes of Health in Bethesda.

* Two anti-aging researchers have bet each other what will amount to
millions in payoff that at least one person alive today will live to 150.

* Eminent technologists who believe science will evolve so fast in their
lifetimes that they will energetically live a very long time, if not be
effectively immortal, include William Haseltine, CEO of Human Genome
Sciences in Rockville, who soon may be biotech's first billionaire; Ray
Kurzweil, a member of the National Inventors Hall of Fame and winner of the
National Medal of Technology; and Eric Drexler, a leading apostle of
atomic-level manufacturing and author of "Engines of Creation."

The question is whether this all reflects the naive hopes of creaky baby
boomers -- the first generation that will die with most of their own teeth
-- or something like reality, in which case the baby boomers may be the last
generation to die traditional old-age deaths. If the latter, how does such
an enormous shift affect human nature itself?

Boom! They're Getting Old!

The growth curve in anti-aging companies looks like a hockey stick, rising
dramatically in just the last few years. You ask -- what was the turning
point? The growth in computer power? The access to information on the
Internet? The sequencing of the human genome?

Anti-aging advocates look at you like you're from the planet Zircon.

"It's the aging of the baby boom," explains S. Mitchell Harman, president
and director of the Kronos Longevity Research Institute and former chief of
endocrinology at the National Institute on Aging. "They are not going gentle
into that good night."

"Makes sense," says one now-menopausal '60s activist. "First we made the
world safe for blacks and women. Now we're going to do it for all those
people with their left-turn signals on for miles, who wear those funny

The anti-aging industry continuum has, at its extremes, two camps. One
consists of scientists who publish in prominent peer-reviewed journals who
say there is absolutely nothing right now available for humans that will
stop or reverse the aging process for you, period, full stop. Although of
course they are working like crazy to change that. More about that in a
minute.

The far larger group at the other end is the one at which throngs of
Americans are throwing money. It includes people with fewer credentials who
are only too happy to sell you tonics for which they make enticing claims.
Their products include everything from Vitamin E to shark cartilage, and
from water about which they make startling assertions, to sand about which
they make startling assertions, to light rays about which they make
startling assertions.

The establishment scientists view the claims of the large group as at best
unproven, and at worst, the work of "quacks, snake-oil salesmen and
charlatans" in the finest traditions of goat gland and monkey testicle
providers at the turn of the last century, as S. Jay Olshansky of the
University of Illinois at Chicago puts it. He leads what has become known as
the Gang of 51, a group of scientists who study aging that, in May, put out
a report designed to be "an authoritative statement of what we know and do
not know about intervening in human aging." In it they state flatly, "At
present, there is no such thing as an anti-aging intervention."

Needless to say, those who believe they can offer such products -- and in
whom many Americans are investing their faith -- beg to differ.

"Flat-Earthers" is how Ronald Klatz, 47, describes his detractors. Klatz is
president of the American Academy of Anti-Aging Medicine, or A4M, an
organization that boasts 11,500 practitioners in 65 countries whose official
slogan is: "Aging is not inevitable! The war on aging has begun!"

"Remember 'Animal Story' by Orson Welles?" asks Klatz.

You mean "Animal Farm" by George Orwell?

"Maybe," he replies. "But it's four legs good, two legs bad."

He sees the science and medical establishments as out to get him.

"The guys in the bow ties and suspenders are right and anybody who says
otherwise is wrong," he says sarcastically. He lists Science, Scientific
American and the Journal of the American Medical Association as publications
that "sandbagged anti-aging medicine without justification and without
science. They rubber-stamped all those supposed scientists" from such noted
institutions as the University of Chicago and the University of California
San Francisco.

Klatz believes that within 10 years, we will begin to achieve "the
technology necessary to accomplish mankind's oldest wish: practical
immortality -- life-spans of 200 years and beyond," as he wrote in a recent
article in the magazine the Futurist. "Humankind will evolve toward an
Ageless Society, in which we all experience boundless physical and mental

Some scoff. "A life expectancy at birth of 100 years requires that almost
every cause of death that exists today would have to be reduced dramatically
or eliminated altogether," Olshansky and his co-author Bruce A. Carnes write
in their book "The Quest for Immortality."

"How likely is that?" they ask.

Good question.

Die Old, Stay Pretty

The life of man in nature may or may not be nasty and brutish, but it is
indeed short.

Over most of the course of human existence, average life expectancy hovered
between 20 and 30 years. In part this is because so many infants died, but
that does not obscure a bleak evolutionary fact: For hundreds of thousands
of years, not long after we reproduced, we died. Even in Western Europe,
life expectancy did not reach 40 until 1800 and 50 until 1900, note
demographers James W. Vaupel and Bernard Jeune in "Exceptional Longevity:
From Prehistory to the Present." In industrialized countries, female life
expectancy is now above 80, slightly less for men. This represents close to
a fourfold increase. Over the same period, your chance of living to 100 has
increased from roughly 1 in 20 million to 1 in 50. The number of
centenarians in the developed world has been increasing by more than 7
percent a year every year since the '50s, Vaupel says.
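The compounding figures above are easy to sanity-check. A back-of-the-envelope sketch (my own illustration; only the 7 percent growth rate comes from the article):

```python
# Rough arithmetic on the article's claim that the number of centenarians
# has been growing by more than 7 percent per year since the '50s.
import math

rate = 0.07

# Years for the count to double at 7% annual growth.
doubling_time = math.log(2) / math.log(1 + rate)
print(round(doubling_time, 1))  # -> 10.2 years

# Compounded over the roughly five decades since the '50s:
factor = (1 + rate) ** 50
print(round(factor))  # -> about a 29-fold increase
```

Steady 7 percent growth doubles the population of centenarians about every decade, which is consistent with the dramatic rise the demographers describe.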

In the journal Science, Vaupel and his co-author, Jim Oeppen, noted "an
astonishing fact." Since 1840 -- for 160 years -- life expectancy has been
growing by a quarter of a year every year. "In 1840, the record for longest
life expectancy was held by Swedish women, who lived on average a little
more than 45 years," they noted. "Among nations today, the longest
expectation of life -- almost 85 years -- is enjoyed by Japanese women."
This steady march of increased life span has been so punctual, they note,
that little else humans have done collectively for so long has ever been more
predictable.

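The quarter-year-per-year trend can also be checked with simple arithmetic (my own check, not Vaupel and Oeppen's): extrapolating linearly from the 1840 Swedish record lands almost exactly on the Japanese figure cited above.

```python
# Linear trend: record life expectancy rising 0.25 years per calendar year.
start_year, start_le = 1840, 45       # Swedish women, per the article
years_elapsed = 2000 - start_year     # the "160 years" in the article
projected = start_le + 0.25 * years_elapsed
print(projected)  # -> 85.0, matching the ~85 years cited for Japanese women
```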
This stream of progress shows no sign of slowing down, much less stopping,
they say. In the first half of the 20th century, we knocked back death among
the young. Clean water, antibiotics and vaccines played enormous roles. In
the second half, we improved survival after age 65. Incremental progress in
fighting four big killers of the aged -- heart disease, some cancers,
diabetes and stroke -- continues briskly. People now live long enough for
Alzheimer's to be a big problem, so we're working on that, too.

The proverbial march of science has if anything accelerated. Ask yourself:
Do you think the sequencing of the human genome, stem cells and cloning will
have any effect on medicine? If so, you might find credible Vaupel's
controversial projection that the average American girl born today will live
to see 100.

This, of course, does little good if all we end up with is a vast cohort of
geezers "drooling on their shoes," as Klatz puts it.

That's why the anti-aging industry is not particularly interested in
gerontology -- patching up the old, hobbled and doddering. How much more
efficient would it be, they reason, to interrupt the aging process in the
first place?

For them, the object of the game is to die young.

As late as possible.

Faith and the Future

Just about every assertion about the future of aging is based on one
faith-based system or another. People believe what they want to believe,
with or without empirically established facts.

Bruce Ames of the University of California at Berkeley is a great man in
bioscience. His scholarly articles are among the most cited of the 20th
century. If you want to discover whether a substance will cause genetic
mutation, what you want is the "Ames Test."

Nonetheless, he's got something he wants to sell you. It is an anti-aging
"nutraceutical" that is for sale over the Internet. It's called Juvenon and
consists of two antioxidants. He says he doesn't make any money on it; the
proceeds all go to a foundation. Nonetheless, the claims he and others make
for it are arresting. Memory and energy levels in lab animals increase
significantly, he reported in the Proceedings of the National Academy of
Sciences.

In his Berkeley living room with its marvelous view of the Golden Gate
Bridge, over a glass of sea-dark wine, he loves to say Juvenon makes his
aging lab rats "dance the macarena."

"This is great stuff. I'm beginning to remember the '60s," reports Stewart
Brand, the onetime counterculture icon who created the Whole Earth Catalog.

The trouble is nobody knows if it really works on humans. Ames says he is
selling the stuff to raise the money that only now will allow him to begin
the double-blind clinical human trials that are the scientific gold
standard.

There are no biomarkers that reliably predict remaining years of life, says
Huber Warner, director of the Biology of Aging program of the National
Institute on Aging. There is as yet no way to look at your cells and
quantify whether they are biologically older or younger than anybody else's.
So the only way to determine for sure whether any intervention works is to
try it and wait for the control group to drop dead.

In humans, this can take 20 years or more, inconveniently enough. That's why
there are so many people who are taking leaps of faith, playing the odds.
Their need is more urgent than that.

One very prominent anti-aging researcher says off the record that he takes
saw palmetto for his prostate symptoms, even though he acknowledges there is
no conclusive evidence that it works. Recent studies of ephedra,
Saint-John's-wort, ginkgo biloba and kava have called into question those
substances' effectiveness or safety. Scientists are consumed by memories of
fen-phen and female hormonal treatment, which turned out to have unexpected
consequences. Nonetheless, we are conducting this vast "uncontrolled
experiment," as Warner puts it, gobbling down potions and hoping for the
best. "Which would you rather be?" one researcher asks. "In the experimental
group or the control group?"

The closest thing to classic scientific rationalism you'll find is in the
work of people like George S. Roth. He wants to add 30-plus healthy years to
your life by convincing your cells they're starving half to death. Roth is a
senior guest scientist in the nutritional and molecular physiology section
of the National Institute on Aging. He has a company in Baltimore called
GeroTech full of retired NIA scientists. They and others are working on
"caloric restriction mimetics."

If you semi-starve a healthy organism, it turns out, its life span will
increase by 40 percent. This is the only proven method of altering the rate
of aging. Works on nematodes, fruit flies, mice, dogs, rats and spiders.
Critters react by channeling their energy from reproduction to maintenance.

There is this slight problem. Semi-starved lab rats are mean. "Oh, God, do
they bite," notes one researcher. That's why it's hard to get humans into
test trials. "Do you live longer or does it just feel that way?" another
researcher jokes.

Roth and other researchers have a more cheerful thought. What if you could
safely fool the cells into switching into starvation mode while allowing the
humans to eat normally? Roth hopes he's only a few years away from bringing
such a nutraceutical, available without prescription, to a health food store
near you. If so, it could be a big deal.

Geriatrics researchers are up to their lab rats in work on memory,
impotence, menopause, baldness, wrinkles, obesity, deafness, eyesight loss,
muscle loss, bone loss, joint deterioration, cholesterol buildup and general
aches and pains -- not to mention breast cancer, prostate cancer, colon
cancer and Alzheimer's. Some of their results could produce the next Viagra
-- especially the memory drugs.

But these relatively conventional research directions, while promising, are
not the sort of thing that fires up visions of godlike immortality.

For that you want the revolution described by the National Science
Foundation and the Department of Commerce in a July report. It points to the
four rapidly evolving and intertwining "GRIN" technologies -- genomics,
robotics, information and nano-engineering. Together they hold the potential
of "a tremendous improvement in human abilities, societal outcomes and
quality of life," the report says.

"The human body will be more durable, healthy, energetic, easier to repair,
and resistant to many kinds of stress, biological threat, and [the] aging
process," the report states.

That's why the inventor and author Ray Kurzweil, 54, is personally eating
very few carbohydrates and fats, taking more than a hundred supplements and
trying not to be too big of a nag to others his age. But he almost can't
help himself.

"If I look at my kids -- kids in their teens, twenties or even thirties --
unless they have unusual problems, a decade or two from now they will be
young and the revolutions will be in full force. They don't have to do a lot
to benefit from really radical life extensions," Kurzweil says. "The
oblivious generation is my own. The vast majority are going to get sick and
die in the old-fashioned way. They don't have to do that. They're right on
the cusp."

Like many others he sees biotechnologies within the decade that will, for
example, allow us to regrow our tissues and organs, prevent hardening of the
arteries and cure diabetes. Beyond 10 years he sees technologies that will
allow us to supplement our red and white blood cells with little robotic
devices that are hundreds of times faster. "Our biological systems are
really very inefficient, not optimally engineered," he says. A well-designed
blood system, he says, will allow you to "run an Olympic sprint for 16
minutes without taking a breath."

He also sees us replacing our gastrointestinal system with an engineered one
that would allow us to eat as much of anything as we want, for sociability
and pleasure, while our new gut "intelligently extracts nutrients from food"
and trashes the rest. "Our whole GI system is pretty stupid. It stores too
much fat," he says.

This long view has "definitely had a profound perspective on my life," says
Kurzweil. "There's always risks, but I really envision living through this
century and beyond, and it does give me a sense of the possibilities. I am
not looking to slow down 10 years from now and be happy if I make it to 80.
It's liberating. I envision doing things and being different kinds of people
that the normal model of human wouldn't allow." Ultimately he envisions us
expanding our brains through "intimate interaction with nonbiological
intelligence," i.e., computers.

But to get there, you've got to take care of yourself now, he insists.

William Haseltine, who is almost 58, agrees. The founder of Human Genome
Sciences Inc. thinks it perfectly plausible that "people my age will live
into their 100s -- and healthy for most of that time." He bases that just on
existing technologies that lower cholesterol levels, strengthen bones,
control high blood pressure and offer surgeons terrific images of what's
going on inside the body.

Haseltine is resolutely cautious about what bioscience will be able to do.
He talks a great deal about what we don't know about stem cells.
Breakthroughs "happen much slower than people think, even when they are
extremely well funded," he says.

Nonetheless, given current technology, he expects people now in their
fifties to live a decade or two longer than they expect -- perhaps "to 110
or 120 in reasonably good health." He points to tissue engineering in which
you create bladders and blood vessels and cartilage outside the body for
eventual implant. "People are trying to grow pieces of new lung, new kidney.
The textbook 'Tissue Engineering' is now in its second edition," he notes.
He also points to mechanical helpers. "Look at what Cheney's got. He's
basically got a defibrillator implanted. They didn't do that 10 years ago."

Asked how all this is affecting his life personally, Haseltine says he's
taking very good care of his body. And oh yes: "I like compound interest."
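Haseltine's quip is easy to make concrete: the longer the horizon, the more dramatically compounding dominates. A minimal sketch (the $10,000 principal and 5 percent annual return are illustrative assumptions, not his figures):

```python
# Compound growth over a conventional vs. a radically extended horizon.
# Principal and rate below are illustrative assumptions only.
def compound(principal, rate, years):
    """Value of principal after compounding annually at rate for years."""
    return principal * (1 + rate) ** years

normal_span = compound(10_000, 0.05, 40)    # saving from age 25 to 65
extended_span = compound(10_000, 0.05, 90)  # saving from age 25 to 115

print(f"40 years: ${normal_span:,.0f}")
print(f"90 years: ${extended_span:,.0f}")
```

Fifty extra years of compounding turns roughly $70,000 into more than $800,000, which is the whole joke.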

More seriously, he says the prospect of very long life "allows you to embark
on longer-range projects." He has taken on a 10-to-20-year program to learn
more history. Haseltine is going back to translations of some of the
original Roman and Greek texts -- Horace, Virgil, Ovid. He also is
scheduling trips to see art by Giotto and Donatello. "I've always been
interested in art and history. But with more time to see things, you might
as well learn," he says.

Eric Drexler, 47, the Silicon Valley nanotech pioneer, is more optimistic
than either Kurzweil or Haseltine. He wears a medallion around his neck that
asks the finder, in case of Drexler's death, to "Call now for instructions/
Push 50,000 U heparin by IV and do CPR while cooling with ice to 10C/ Keep
PH 7.5/ No embalming/ No autopsy." For Drexler plans to come back.

He and others believe that robots smaller than a human cell will soon work
like Pac-Man. Inject a few million of them into your bloodstream, and
they'll gobble up fat cells, cancer cells, what have you.

That's why he wants to make it through the next decade or two until the new
technologies kick in. If for some reason he happens to croak prematurely, he
wants to get frozen right next to Ted Williams so that when the right
technology arrives, he can be thawed and have a nanotech workover.

Does he think this will make him immortal?

"Depends on what you mean by immortal," he says, sitting at Silicon Valley's
Original House of Pancakes in Los Altos, Calif., letting his ham and eggs
get cold. "There is such a thing as proton decay."


He's talking about the eventual collapse of subatomic particles in untold trillions of years.

Okay, what about merely geological time? Hundreds of thousands of years?

"Oh yeah." He smiles. "That. For sure."

Forever Is a Long Time

As technology evolves ever faster, the distance between science and science
fiction shrinks. This makes estimating the impact on culture and values a
challenge. What happens in a world that can be increasingly young and vital
and robust and busy at the same time that it is increasingly very, very old?
What happens to Social Security? How many careers do you have? How many
marriages do you have? How many children do you have?

While formidable, these calculations are relatively straightforward. To
really imagine richly the complexity of such a world, one perhaps needs the
sensibilities of a novelist.

Will the new young people who are only in their twenties ever be able to
compete with the old young? Especially if the old young have seen their
compound-interest money grow startlingly?

In ancient lore, Gilgamesh built the walls around the city of Uruk as a
monument that would make him immortal. If we did not fear death, would we
lose our will to achieve? Would you put all of life forever before you?
"What's the rush? I'll get to that when I'm 100." If you did not have to
seek your immortality in children, would you have them?

If life stretches out for a very long time, do you avoid risks? Or do you
court them? Is there a growth market in recreational life-risking? Will more
people emulate George Bush, the elder, by parachuting out of airplanes at
the age of 72?

If immortality is at hand, do we need religion?

If death is never imminent, is love as intense? Do Romeo and Juliet inhabit
the world only of the very biologically young?

What happens if you seek youth and your partner does not?

Do you risk growing young alone?

(All, this is a very interesting article about the sanctions against IRAQ. I
originally sent this out in Feb of this year, but thought it would be
prudent to send again for those who missed it, and since it is such a well
researched and written article. Opponents of the sanctions cite an oft-heard
statement that these sanctions have led to the death of over 1,000,000
children. Osama Bin Laden in his 10/7 videotaped message stated that every
day US sanctions cause the death of 5,000 Iraqi children under the age of 5.
Proponents of the sanctions assert that these numbers are greatly
exaggerated, if not outright lies. The truth seems to lie somewhere in the
middle. The original figure came from a five-day, 693-household study in
Baghdad performed by Iraqi officials. Even Iraq's own official web site shows
the number to be 3,000 per month. Interestingly, the sanctions apply to all
of Iraq yet the northern part of Iraq is not under Saddam's control, only
the southern part is. Yet the under 5 death rates in the south are double
that of the north, and the northern rates have been decreasing. It seems
obvious that how the government operates may also have a significant impact
on these figures. The author also asks "How much should we blame Saddam
Hussein for rejecting the U.N.'s "oil-for-food" humanitarian offer for six
years?" A later study by Columbia's Richard Garfield put a more reliable
figure, in both north and south, at about 100,000 since 1990. It is important not to
trivialize these deaths, and it is obvious that a significant number of
deaths have come from the sanctions. But it is important to know the facts
and the real numbers, and acknowledge that this country's own corrupt
despotic government shares some of the blame for these deaths. Indeed, as
the author notes "UNICEF found child mortality actually decreased in the
autonomous north ...This is Exhibit A for those who argue that Saddam alone
is responsible for Iraq's humanitarian crisis." The author continues "Yet
the basic argument against all economic sanctions remains: namely, that
they tend to punish civilians more than governments and to provide dictators
with a gift-wrapped propaganda tool." and "It seems awfully hard not to
conclude that the embargo on Iraq has been ineffective (especially since
1998) and that it has, at the least, contributed to more than 100,000 deaths
since 1990. With Bush set to go to war over Saddam's noncompliance with the
military goals of the sanctions, there has never been a more urgent time to
confront the issue with clarity." - Mike)

The politics of dead children
by Matt Welch
"It seems awfully hard not to conclude that the embargo on Iraq
has been ineffective (especially since 1998) and that it has, at
the least, contributed to more than 100,000 deaths since 1990."


The Politics of Dead Children

Have sanctions against Iraq murdered millions?

By Matt Welch

Are "a million innocent children...dying at this time...in Iraq" because of
U.S. sanctions, as Osama bin Laden claimed in his October 7 videotaped
message to the world? Has the United Nations Children's Fund
(UNICEF) discovered that "at least 200 children are dying every day...as a
direct result of sanctions," as advocacy journalist John Pilger maintains on
his Web site? Is it official U.N. belief that 5,000 Iraqi children under the
age of 5 are dying each month due to its own policy, as writers of letters
to virtually every U.S. newspaper have stated repeatedly during the past
three years?

The short answer to all of these questions is no. The sanctions, first
imposed in 1990 after Iraq's invasion of Kuwait, are administered by the
U.N., not the U.S. They were first imposed on all exports from Iraq and
occupied Kuwait, and all non-humanitarian imports, in an effort to persuade
Saddam Hussein to retreat within his own borders. After the Gulf War, they
were broadened to include a dismantling of Iraq's biological, chemical,
nuclear, and missile-based weapons systems, out of fear that Hussein would
otherwise lash out again. Estimates of sanctions-era "excess" child deaths
-- the number above the normal mortality rate -- vary widely due to politics
and inadequate data, especially concerning children older than 5. The
dictatorial Iraqi government, which has blamed nearly every civilian funeral
since 1991 on sanctions, claims there have been more than 600,000 deaths of
under-5-year-olds these past 11 years (4,500 per month) and 1.5 million
deaths overall.

While firefighters were still pulling out warm body parts from Ground Zero,
foreign policy critic Noam Chomsky and his followers on college campuses and
alternative-weekly staffs nationwide were insisting that it was vital to
understand the "context" of the September 11 massacre: that U.S.-led
sanctions were killing "5,000 children a month" in Iraq. Meanwhile, on the
Iraqi government's own Web site, the number of under-5 deaths from all
causes for the month of September was listed as 2,932.

Arriving at a reliable raw number of dead people is hard enough; assigning
responsibility for the ongoing tragedy borders on the purely speculative.
Competing factors include sanctions, drought, hospital policy,
breast-feeding education, Saddam Hussein's government, depressed oil prices,
the Iraqi economy's almost total dependence on oil exports and food imports,
destruction from the Iran-Iraq and Persian Gulf wars, differences in
conditions between the autonomous north and the Saddam-controlled south, and
a dozen other variables difficult to measure without direct independent
access to the country.

Confusing the issue still further are basic questions about the sanctions
themselves. Should the U.N. impose multilateral economic sanctions to keep a
proven tyrant from developing weapons to launch more wars against his
neighbors? If sanctions are inherently immoral, what other tools short of
war can the international community use? Is this particular sanctions regime
more unreasonable than others that haven't triggered humanitarian crises?
How much should we blame Saddam Hussein for rejecting the U.N.'s
"oil-for-food" humanitarian offer for six years, and expelling weapons
inspectors in 1998? Most important, has Iraq made headway since then in
pursuing nuclear and biological weapons?

Yet all this murkiness has not deterred advocates of sanctions from claiming
absolute certainty on the issue. The warmongering New Republic, for example,
announced in October that the notion that "sanctions have caused widespread
suffering" was simply "false." Writing in National Review in December,
former army intelligence analyst Robert Stewart asserted that "resources are
available in Iraq. Even under the sanctions, Iraq's people need not starve."

The chasm between claims made by sanction supporters and opponents is enough
to make inquisitive people throw their hands up in the air. Such despair is
not exactly conducive to healthy debate, which is especially important at a
time when President Bush has made it clear that Iraq must cooperate with
weapons inspection or become the next target of the War on Terrorism. A
closer look at the controversy over dead Iraqi babies shows that opponents
of sanctions have a compelling case to make. Although they often undermine
their own position with outrageous exaggerations, their critics show a
similar disregard for the facts when they blithely dismiss concerns about
the impact of sanctions on innocent people.

Origins of a Whopper

The idea that sanctions in Iraq have killed half a million children (or 1
million, or 1.5 million, depending on the hysteria of the source) took root
in 1995 and 1996, on the basis of two transparently flawed studies, one
inexplicable doubling of the studies' statistics, and a non-denial on 60 Minutes.

In August 1995, the U.N. Food and Agriculture Organization (FAO) gave
officials from the Iraqi Ministry of Health a questionnaire on child
mortality and asked them to conduct a survey in the capital city of Baghdad.

On the basis of this five-day, 693-household, Iraq-controlled study, the FAO
announced in November that "child mortality had increased nearly five fold"
since the pre-sanctions era. As embargo critic Richard Garfield, a public
health specialist at Columbia University, wrote in his own comprehensive
1999 survey of under-5 deaths in Iraq, "The 1995 study's conclusions were
subsequently withdrawn by the authors....Notwithstanding the retraction of
the original data, their estimate of more than 500,000 excess child deaths
due to the embargo is still often repeated by sanctions critics."

In March 1996, the World Health Organization (WHO) published its own report
on the humanitarian crisis. It reprinted figures -- provided solely by the
Iraqi Ministry of Health -- showing that a total of 186,000 children under
the age of 5 died between 1990 and 1994 in the 15 Saddam-governed provinces.
According to these government figures, the number of deaths jumped nearly
500 percent, from 8,903 in 1990 to 52,905 in 1994.

Somehow, based largely on these two reports -- a five-day study in Baghdad
showing a "five fold" increase in child deaths and a Ministry of Health
claim that a total of 186,000 children under 5 had died from all causes
between 1990 and 1994 -- a New York-based advocacy group called the Center
for Economic and Social Rights (CESR) concluded in a May 1996 survey that
"these mortality rates translate into a figure of over half a million excess
child deaths as a result of sanctions."

In addition to doubling the Iraqi government's highest number and
attributing all deaths to the embargo, CESR suggested a comparison that
proved popular among the growing legions of sanctions critics: "In simple
terms, more Iraqi children have died as a result of sanctions than the
combined toll of two atomic bombs on Japan." The word genocide started
making its way into the discussion.

Still, the report might well have ended up in the dustbin of bad mathematics
had a CESR fact-finding tour of Iraq not been filmed by Lesley Stahl of 60
Minutes. In a May 12, 1996, report that later won her an Emmy and an Alfred
I. DuPont-Columbia University Journalism Award, Stahl used CESR's faulty
numbers and atomic-bomb imagery to confront Madeleine Albright, then the
U.S. ambassador to the United Nations. "We have heard that a half million
children have died," Stahl said. "I mean, that's more children than died in
Hiroshima. And -- and you know, is the price worth it?" Albright replied, "I
think this is a very hard choice, but the price -- we think the price is
worth it."

It was the non-denial heard 'round the world. In the hands of sanctions
opponents and foreign policy critics, it was portrayed as a confession of
fact, even though neither Albright nor the U.S. government has ever admitted
to such a ghastly number (nor had anybody aside from CESR and Lesley Stahl
ever suggested such a thing until May 1996). The 60 Minutes exchange is very
familiar to readers of Arab newspapers, college dailies, and liberal
journals of opinion. Ralph Nader and Pat Buchanan mentioned it several times
during their respective presidential campaigns.

After September 11, the anecdote received new life, as in this typically
imaginative interpretation by Harper's Editor Lewis Lapham in the magazine's
November issue: "When Madeleine Albright, then the American secretary of
state [sic], was asked in an interview on 60 Minutes whether she had
considered the resulting death of 500,000 Iraqi children (of malnutrition
and disease), she said, 'We think the price is worth it.'"

Albright has been dogged by protesters at nearly all her campus appearances
the past several years, and rightly so: It was a beastly thing to say, and
she should have refuted the figures. Quietly, a month after the World Trade
Center attack, she finally apologized for her infamous performance. "I
shouldn't have said it," she said during a speech at the University of
Southern California. "You can believe this or not, but my comments were
taken out of context."

The other, far more credible source of the 500,000 number is a pair of 1999
UNICEF studies that estimated the under-5 mortality rates of both Iraqi
regions based on interviews with a total of 40,000 households. "If the
substantial reduction in the under-five mortality rate during the 1980s had
continued through the 1990s," the report concluded, "there would have been
half a million fewer deaths of children under-five in the country as a whole
during the eight year period 1991 to 1998." If the expected mortality rate
had stayed level rather than continuing its downward slope, the excess death
number would be more like 420,000.
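The arithmetic behind an "excess deaths" figure of this kind is simple in principle: compare the observed under-5 mortality rate with a counterfactual rate, and multiply the gap by the number of live births. A minimal sketch with made-up, purely illustrative numbers (the real UNICEF calculation worked year by year from survey data):

```python
# Excess-death arithmetic: (observed rate - counterfactual rate) x births.
# All numbers below are illustrative assumptions, not UNICEF's actual inputs.
def excess_deaths(observed_rate, counterfactual_rate, births):
    """Rates are under-5 deaths per 1,000 live births."""
    return (observed_rate - counterfactual_rate) * births / 1000

# e.g. observed 120/1,000 vs. a counterfactual of 50/1,000 over 600,000 births
per_year = excess_deaths(120, 50, 600_000)
print(per_year)  # 42000.0 excess under-5 deaths in that illustrative year
```

Note that the lower the counterfactual, the larger the excess: assuming the 1980s decline continued yields a bigger total than assuming the rate merely held level, which is exactly the 500,000-versus-420,000 gap the report describes.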

Significantly, UNICEF found child mortality actually decreased in the
autonomous north (from 80.2 per 1,000 live births in 1984-89 to 70.8 in
1994-98) while more than doubling in the south (from 56 to 130.6). This is
Exhibit A for those who, like The New Republic, argue that Saddam alone is
responsible for Iraq's humanitarian crisis. When the report was released,
UNICEF Executive Director Carol Bellamy attributed the difference in
mortality trends to "the large amount of international aid pumped into
northern Iraq at the end of the [Persian Gulf] war."

The UNICEF report took pains to spread the blame for increased mortality in
the south, mentioning factors such as a dramatic increase in the bottle-only
feeding of infants in place of more nutritious (and less likely to be
tainted) breast milk. "It's very important not to just say that everything
rests on sanctions," Bellamy said in a subsequent interview. "It is also the
result of wars and the reduction in investment in resources for primary
health care."

But in the hands of sanctions opponents and some news organizations, these
findings were translated into a U.N. admission that sanctions were "directly
responsible" for killing half a million children (or even "infants"). In
September 2001 alone, the UNICEF report was mischaracterized in The Boston
Globe, The Buffalo News, The Akron Beacon Journal, The San Diego
Union-Tribune, The Charleston Gazette, the Wilmington Sunday Star-News, and
The Chicago Tribune (by a Northwestern University journalism professor, no less).

By November, UNICEF was annoyed enough with the frequent misinterpretations
to send out regular corrective press releases, saying things like: "The
surveys were never intended to provide an absolute figure of how many
children have died in Iraq as a result of sanctions." Rather, they "show
that if the substantial reductions in child mortality in Iraq during the
1980s had continued through the 1990s -- in other words if there hadn't been
two wars, if sanctions hadn't been introduced and if investment in social
services had been maintained -- there would have been 500,000 fewer deaths
of children under five."

Sanctions critics almost always leave out one other salient fact: The vast
majority of the horror stats they quote apply to the period before March
1997, when the oil-for-food program delivered its first boatload of supplies
(nearly six years after the U.N. first proposed the idea to a reluctant
Iraqi government). In the past four years of oil-for-food, Iraq has exported
around 3 billion barrels of oil, generating $40 billion in revenue, which
has resulted in the delivery of $18 billion of humanitarian and
oil-equipment supplies, with another $16 billion in the pipeline. (The rest
is used to cover administrative costs and reparations to Kuwait.)
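The oil-for-food figures in this paragraph can be sanity-checked directly: of roughly $40 billion in revenue, $18 billion delivered plus $16 billion in the pipeline leaves about $6 billion for administration and reparations. A quick check using only the numbers quoted above:

```python
# Sanity check of the oil-for-food figures quoted in the text ($ billions).
revenue = 40    # oil exported under the program
delivered = 18  # humanitarian and oil-equipment supplies delivered
pipeline = 16   # further supplies in the pipeline

# Remainder covering U.N. administrative costs and reparations to Kuwait
remainder = revenue - delivered - pipeline
print(remainder)  # 6
```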

As the U.N. Office for the Iraqi Program stated in a September 28, 2001
report, "With the improved funding level for the program, the Government of
Iraq is indeed in a position to address the nutritional and health concerns
of the Iraqi people, particularly the nutritional status of the children."
Even two years earlier, Richard Garfield noted in his survey that "the most
severe embargo-related damages [have] already ended."

Anyone who tells you more children will perish in Iraq this month than
Americans died on September 11 is cutting and pasting inflated mid-1990s
statistics onto a country that has changed significantly since then.
Knowingly or not, these critics are mangling the facts to prove a debatable
point and in the process damaging their own cause.

The Truth Is Bad Enough

Two weeks after the hijacked planes crashed into the World Trade Center and
the Pentagon, I began looking in earnest for trustworthy sources of
information about the effects of sanctions on Iraq. I was joined in my
search by a half-dozen or so e-mail acquaintances who approached the
question from a broadly similar viewpoint: If sanctions are killing Iraqi
babies, then Osama Bin Laden has a legitimate propaganda tool, and the U.S.
has blood on its hands that demands immediate attention. So let's find the
facts, weigh them against Saddam's weapons capabilities, and proceed from there.

It immediately became obvious that sanctions opponents, especially in the
U.S., would be a hindrance, not a help. "I'm a little disgusted at the way
the Naderite left has used the issue, especially when they have no backup
for the claims they are making," one of my co-conspirators, Eric Mauro,
wrote me. "They may be right, but they are so inept at arguing that it's
dangerous to take their word for it."

The man who launched the American anti-sanctions movement as we know it is a
University of Texas journalism professor named Robert Jensen. His Web site's
"factsheet" on Iraq contains two lies right off the bat. Citing WHO, he
claims that "each month 5,000 to 6,000 children die as a result of the
sanctions." And citing UNICEF, he asserts that "approximately 250 people die
every day in Iraq due to the sanctions."

Jensen, who teaches "critical thinking," drifted onto the national radar
screen days after the terrorist attacks, when he wrote a column published in
ZNet, CommonDreams.org, and The Houston Chronicle titled "U.S. Just As
Guilty of Committing Own Violent Acts." He has opposed the war against
Afghanistan (not to mention Serbia), teaches the journalism of Mumia
Abu-Jamal, and once wrote a column about how the "U.S. middle class,
particularly the white middle class, is probably the single biggest
impediment to justice the world has ever known."

Jensen's cohorts in kick-starting the anti-sanctions movement were
intifada-supporting professor Edward Said, "people's historian" Howard Zinn,
and Noam Chomsky, a man who has rarely met a foreign policy he couldn't
describe as "genocide." The four issued a joint statement in January 1999
condemning the situation in Iraq as "sanctioned mass-murder that is nearing
holocaust proportions."

These four men have authored reams of hyperbolic nonsense since September
11. Isn't it reasonable to conclude that anything they and Saddam Hussein
agree upon must be false?

No, actually, it's not, and therein lies the problem. Any sustained inquiry
into the sanctions issue runs up against waves of propaganda and reckless
disregard for the truth, and it would be all too easy to declare the issue
settled after a quick dismissal of the most glaring lies. But that would be
an abdication of responsibility. Many of those who support continued
pressure on Saddam Hussein tend to focus on a few key counterpoints while
ignoring piles of haunting in-country surveys and the damning testimony of
former U.N. officials who have quit to campaign full-time against U.S.
policy in Iraq. Sanctions proponents, if they are not careful, run the risk
of aping the foolish debate tactics of the critics they condemn.

Take, for example, the lowered mortality rates in the northern provinces of
Dahuk, Sulaymaniyah, and Erbil -- the smoking gun of the
sanctions-don't-kill crowd. The New Republic claims the autonomous Kurdish
area "is subject to exactly the same sanctions as the rest of the country."
This is false: Under the oil-for-food regime, the north, which contains 13
percent of the Iraqi population, receives 13 percent of all oil proceeds, a
portion of that in cash. Saddam's regions, with 87 percent of the
population, receive 59 percent of the money (recently increased by the U.N.
Security Council from 53 percent), none of it in cash. (Of the rest, 25
percent goes to a Kuwaiti compensation fund, and the rest covers U.N. administrative costs.)

It just isn't true that the sanctions are "exactly the same" in both parts
of Iraq. And there are other factors affecting the north-south disparity:
International aid agencies have been active in the areas protected by no-fly
zones since 1991, and the Turkish border is said to be suitably porous for
smuggling (although Saddam has been caught smuggling several times in the
past decade).
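The allocation percentages quoted above imply a sizable per-capita gap, which is the substance of Welch's rebuttal. A small sketch using only the shares given in the text:

```python
# Per-capita oil-for-food shares implied by the percentages in the text.
# Shares are fractions of Iraq's total population and oil proceeds.
def per_capita_index(revenue_share, population_share):
    """Revenue per person, normalized so an even split equals 1.0."""
    return revenue_share / population_share

north = per_capita_index(0.13, 0.13)  # autonomous Kurdish north
south = per_capita_index(0.59, 0.87)  # Saddam-controlled center and south

print(round(north, 2))          # 1.0
print(round(south, 2))          # 0.68
print(round(north / south, 2))  # 1.47: north gets ~47% more per person
```

On these figures the north receives roughly half again as much oil revenue per resident as the south, partly in cash, so "exactly the same sanctions" does not hold.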

The get-Saddam camp also likes to point out that sanctions haven't seemed to
inflict similar grief in countries such as Libya and Yugoslavia. To which
Richard Garfield, who compared the various penalized countries, has an
effective rebuttal: "Embargoes with the greatest impact on the health of the
general population are usually those which are multilateral and
comprehensive, occur in countries with heavy import dependence, are
implemented rapidly, and are accompanied by other economic and social blows
to a country. Iraq shared each of these characteristics."

Those who get past the initial frustrations of researching the topic usually
end up on Richard Garfield's doorstep. His 1999 report -- which included a
logistic regression analysis that re-examined four previously published
child mortality surveys and added bits from 75 or so other relevant studies
-- picked apart the faulty methodologies of his predecessors, criticized the
bogus claims of the anti-sanctions left, admitted when the data were shaky,
and generally used conservative numbers. Among his many interesting findings
was that every sanctions regime except the one imposed on apartheid South
Africa led to limitations of food and medicine imports, even though such
goods were almost always officially exempt from the embargo. "In many
countries," he wrote, "the embargo-related lack of capital was more
important than direct restrictions on importing medicine or food."

Garfield concluded that between August 1991 and March 1998 there were at
least 106,000 excess deaths of children under 5, with a "more likely"
worst-case sum of 227,000. (He recently updated the latter figure to 350,000
through this year.) Of those deaths, he estimated one-quarter were "mainly
associated with the Gulf war." The chief causes, in his view, were
"contaminated water, lack of high quality foods, inadequate breast feeding,
poor weaning practices, and inadequate supplies in the curative health care
system. This was the product of both a lack of some essential goods, and
inadequate or inefficient use of existing essential goods."

Ultimately, Garfield argued, sanctions played an undeniably important role.
"Even a small number of documentable excess deaths is an expression of a
humanitarian disaster, and this number is not small," he concluded. "[And]
excess deaths should...be seen as the tip of the iceberg among damages to
occur among under five-year-olds in Iraq in the 1990s....The humanitarian
disaster which has occurred in Iraq far exceeds what may be any reasonable
level of acceptable damages according to the principles of discrimination
and proportionality used in warfare....To the degree that economic sanctions
complicate access to and utilization of essential goods, sanctions
regulations should be modified immediately."

Garfield's conclusion echoes that of literally every international agency
that has performed extensive studies in Iraq. In 1999 a U.N. Humanitarian
Panel found that "the gravity of the humanitarian situation of the Iraqi
people is indisputable and cannot be overstated." UNICEF's Carol Bellamy, at
the time her landmark report was released, said, "Even if not all suffering
in Iraq can be imputed to external factors, especially sanctions, the Iraqi
people would not be undergoing such deprivations in the absence of the
prolonged measures imposed by the Security Council and the effects of war."
The former U.N. humanitarian coordinator for Iraq, Denis Halliday, travels
around the world calling the policy he once enforced "genocide." His
replacement, Hans von Sponeck, also resigned in protest of the U.N.'s
"criminal policy."

Losing the Loonies

There have been no weapons inspectors in Iraq since 1998. As a result, it is
exceptionally difficult to know with precision what nuclear and biological
weapons Saddam actually has on hand or in development. From the beginning,
economic sanctions have been tied to what foreign policy analyst Mark
Phythian described in World Affairs as "the first attempt to disarm a
country against its will." After September 11, the issue of an
America-hating tyrant arming himself to the teeth has seemed more pressing
than easing an embargo that blocks his access to money.

Yet the basic argument against all economic sanctions remains: namely, that
they tend to punish civilians more than governments and to provide dictators
with a gift-wrapped propaganda tool. Any visitor to Cuba can see within 24
hours the futility of slapping an embargo on a sheltered population that is
otherwise inclined to detest its government and embrace its yanqui
neighbors. Sanctions give anti-American enclaves, whether in Cairo or
Berkeley or Peshawar, one of their few half-convincing arguments about evil
U.S. policy since the end of the Cold War.

It seems awfully hard not to conclude that the embargo on Iraq has been
ineffective (especially since 1998) and that it has, at the least,
contributed to more than 100,000 deaths since 1990. With Bush set to go to
war over Saddam's noncompliance with the military goals of the sanctions,
there has never been a more urgent time to confront the issue with clarity.

That means losing the loonies on the left. Already there are signs of
mounting liberal impatience with the routine smokescreens emanating from the
usual anti-sanctions rabble. Slate, The Guardian, and even The Nation all
published sober correctives of dead-Iraqi-baby inflation toward the end of
2001, decisively backing Richard Garfield over Robert Jensen. And if Noam
Chomsky no longer leads this particular coalition of critics, maybe they're
not so wrong after all.

Matt Welch, a columnist for the Online Journalism Review, is a writer in Los Angeles.

www.matus1976.com - Article archives