Monday, December 14, 2015

China's Growing Navy: The Balance of Power in the South China Sea

The South China Sea might be misnamed. If you look at a map, the majority of the coastline surrounding this body of water belongs to nations other than China, including the Philippines, Vietnam, Thailand, Malaysia, Taiwan, and Cambodia.

So while the South China Sea isn’t really very “Chinese” now, China would like to change that in the future.

The nations of the South China Sea have welcomed a United States military presence in the area, because they see it as a safeguard against Chinese aggression. But the U.S. may not be able to maintain its presence there much longer.

As of July 2015, the U.S. Navy had 273 ships. That number will shrink as old ships are decommissioned and taken out of service.

New ships, however, take a long time to build. They need to be planned several years in advance. But Congress hasn’t authorized enough new construction to keep the total size of the U.S. Navy from shrinking.

The Chinese, meanwhile, are continuing to build ships at a pace which more than offsets the rate at which they are retiring their old ships. The Chinese navy is growing. Robert Kaplan writes:

The U.S. Navy presently dominates the South China Sea. But that situation will change. The size of the U.S. Navy has come down from almost six hundred warships in the Reagan era, to the mid-three hundreds during the Clinton era, to under three hundred now. It might go lower still by the 2020s, because of the retirement of current classes of submarines and surface warships, cost overruns, and future budget cuts, the result in turn of massive fiscal deficits. Meanwhile, the Chinese navy, the world’s second most powerful naval service, is growing rather dramatically. Rather than purchase warships across the board, China is developing niche capacities in subsurface warfare and ballistic missile technology (the DF-21 missile) designed to hit moving targets at sea, such as a U.S. aircraft carrier. If China expands its submarine fleet to 78 by 2020 as planned, it will be on par with the U.S. Navy’s undersea fleet in quantity. While the U.S. Navy’s submarine fleet is completely nuclear, it requires that feature to sail halfway around the world, in order to get to East Asia in the first place, even as China’s diesel-electric submarines are supremely quiet and can hide better, therefore, in the congested littorals of East Asia. At some point, China is likely to, in effect, be able to deny the U.S. Navy unimpeded access to parts of the South China Sea.

The Chinese have demonstrated that they have little or no interest in protecting the sea-lanes in and around the South China Sea. These shipping routes are among the busiest in the world, and any piracy or other obstacles to trade through that region would economically impact most or all of the world.

Local differences between the nations of southeast Asia are growing, even as engagement from other parts of the world is generally receding - America’s “pivot toward Asia” might constitute an exception to this trend, but that remains to be seen. As Robert Kaplan writes:

Thus, as China’s navy gets stronger — its economy permitting — and China’s claim on the South China Sea — as demonstrated by its maps — contradict the claims of other littoral states, these other states will be forced to further develop their own naval capacities and to balance against China by relying increasingly on the U.S. Navy: a navy whose strength has probably peaked in relative terms, even as it must divert considerable resources to the Middle East. Worldwide multipolarity is already a feature of diplomacy and economics, but the South China Sea is poised to show us what multipolarity in a military sense actually looks like. Just as German soil constituted the military front line of the Cold War, the waters of the South China Sea may constitute the military front line of the coming decades.

Observers from other parts of the world often underestimate the power of racism among nations of east Asia. Racism as a cultural phenomenon takes very different forms in different parts of the world.

Among the Chinese, Japanese, and Koreans, racism is unabashed and officially celebrated. Each of these groups considers itself to be simply better than the others. When these attitudes are augmented with battleships, submarines, and missiles, the danger becomes obvious:

There is nothing romantic about this new front line. Whereas World War II was a moral struggle against fascism, the Cold War a moral struggle against communism, the post-Cold War a moral struggle against genocide in the Balkans, Africa, and the Levant, as well as a moral struggle against terrorism and in support of democracy, the South China Sea shows us a twenty-first-century world void of moral struggles, with all of their attendant fascination for humanists and intellectuals. Beyond the communist tyranny of North Korea, a Cold War relic, the whole of East Asia simply offers little for humanists. For there is no philosophical enemy to confront. The fact is that East Asia is all about trade and business. Even China, its suffering dissidents notwithstanding, simply does not measure up as an object of moral fury.

These factors - China’s territorial ambitions, its lack of concern for commerce in the South China Sea, mutual nationalistic and racist disdain between the nations of east Asia, increasing tensions between southeast Asian nations, and declining engagement from other parts of the world - send a clear signal that stability in that part of the world is endangered.

Thursday, December 3, 2015

Textual Sources of Islam

Introductory remarks about Islam usually include the observation that the Qur’an - also spelled ‘Koran’ - is the foundational and definitive text for serious Muslims. While this statement is correct, it is perhaps incomplete.

Other sources for Islam include the Hadith and the Sunnah. These are somewhat amorphous collections, records of what Muhammad said, did, and taught. They also include what his followers, with his permission and approval, said and did.

Many of Islam’s central practices and beliefs are not found in the Qur’an, but are found in these other documents.

For example, the command to pray five times daily, the command to avoid paintings and sculptures of living things, and the requirement of capital punishment for adulterers are not found in the Qur’an.

Also not in the Qur’an are the details of the ‘hajj’ pilgrimage: to walk seven times in a counterclockwise circle around the ‘kaaba’ building, and then to acknowledge the ‘black stone’ at the site. Nowhere in the Qur’an is the elevation of the ‘shahada’ to a credal status.

Some other core Islamic concepts, however, are found in the Qur’an, such as the right of men to have sex with any women whom they have obtained in combat. Men are told that they may have sex with their wives, and with those women “whom their right hands possess.”

To “possess with one’s right hand” is an idiom which includes valuables taken in battle. The Qur’an is divided into chapters called “Surah” and this expression is found in Surah 4, 8, 23, and 70.

A thorough understanding of Islam, then, includes the notion that alongside the Qur’an, there are other texts which are understood to be authoritative.

Wednesday, November 18, 2015

From Paris to Africa: Liberty Struggles to Survive

In January 2015, armed Muslims entered the offices of Charlie Hebdo, a French magazine, and began shooting. They killed eleven people, because they objected to a cartoon which the magazine had published.

As part of this same action, Islam had organized violence thousands of miles away, in Africa. News media carried reports like this:

More than 70 churches were destroyed and 10 people killed after a cartoon agitated Niger’s two largest cities. One week after Islamic terrorists killed 11 staff members of the French publication Charlie Hebdo, the magazine published a “survivor’s issue.”

While the magazine is not a particularly religious one - it is, in fact, probably somewhat anti-religious - it nonetheless stands for religious freedom, and thus makes, to a limited extent, common cause with Jesus followers around the globe.

Its cover featured a cartoon of Muhammad holding a Je suis Charlie (“I am Charlie”) sign. While the cartoon sparked protests across Africa and the Middle East, the deadliest were in Niger, which ranked worst in a Pew Forum poll of sub-Saharan African Muslims’ support for religious freedom for non-Muslims.

Only a few months later, Paris was again the scene of Islamic hostility, as Muslims used bombs and rifles to kill over 100 people. These attacks, on Friday, November 13, 2015, revealed that Islam has no intention of relaxing its efforts to execute those whom it considers to be ‘infidels.’

As Islamic aggression continues to expand globally, societies based on freedom will be tested, to see if they are willing and able to resist such attacks.

Tuesday, November 17, 2015

Religious Freedom: a Global Concern

The sincerity of any claim of support for religious liberty is seen in one’s willingness to defend the freedom to practice a religion which is not one’s own.

In 1998, the United States Congress created an “Office of International Religious Freedom” and a “Commission on International Religious Freedom” not to address the concerns of comfortable majority religions, but to speak on behalf of persecuted minority faiths.

In keeping with the principle of protecting minority interests, an important post has been filled by a representative of Judaism. Estimates suggest that, at most, 1.4% of the U.S. population is Jewish, making Jews one of the smaller religious groups in the nation. In March 2015, news media carried reports like this:

For the first time, a non-Christian became America’s ambassador-at-large for international religious freedom. Rabbi David Saperstein, confirmed in December, fills a position left vacant since October 2013.

The ambassador’s minority status carries both a practical and a symbolic value. Practically, he has experience as a member of a possibly marginalized non-majority group. Symbolically, he represents American habits of protecting minority groups.

Members of small and possibly endangered groups have, over many years, seen the United States as a place of safety for them.

His nomination was widely lauded by Christian advocates, including Russell Moore of the Ethics and Religious Liberty Commission, Chris Seiple of the Institute for Global Engagement, and US Rep. Frank Wolf. Wolf authored the International Religious Freedom Act that created the post.

The question of safety for religious minorities has gained attention in recent decades, as Islam has organized the persecution of Jesus followers in countries from Niger to Pakistan.

Because of this trend, the religious liberty movement has gained prominence in the last few years. Worldwide, more people have been killed in recent years for speaking about Jesus or for owning a copy of the New Testament than in previous decades.

After 34 years as one of Congress’s leading advocates for international religious freedom and human rights, Wolf retired in January. But he is hardly finished advocating, becoming the first to fill a newly endowed chair in religious freedom at Baylor University.

Whoever occupies the position “ambassador-at-large for international religious freedom,” she or he will have much work to do, as Islamic aggression continues to spread.

Thursday, November 5, 2015

A Higher Level of Economic Science

The principles of economics were articulated primarily by Adam Smith, David Hume, and David Ricardo. They formulated the familiar statements about supply, demand, and price levels.

According to this type of ‘classical’ economics, prices rise when demand increases and supply remains steady, or when demand holds steady and supply decreases. Prices drop when supply increases against a steady demand, or when demand decreases against a constant supply.

These familiar axioms are intuitive, and they correspond to actual experience and data. This classical understanding is now the core of economic thought.
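To make these axioms concrete, here is a minimal sketch in Python - the linear supply and demand curves and all of the numbers are made up for illustration, not drawn from any source cited here:

# Toy linear market: demand falls as price rises, supply rises as price rises.
# Equilibrium: demand_intercept - demand_slope*p = supply_intercept + supply_slope*p
def equilibrium_price(demand_intercept, demand_slope, supply_intercept, supply_slope):
    return (demand_intercept - supply_intercept) / (demand_slope + supply_slope)

base        = equilibrium_price(100, 2.0, 10, 1.0)  # baseline market
more_demand = equilibrium_price(130, 2.0, 10, 1.0)  # demand rises, supply steady
more_supply = equilibrium_price(100, 2.0, 40, 1.0)  # supply rises, demand steady

print(f"baseline price: {base:.2f}")         # 30.00
print(f"higher demand:  {more_demand:.2f}")  # 40.00 - price rises
print(f"higher supply:  {more_supply:.2f}")  # 20.00 - price falls

Raising the demand curve while holding supply fixed raises the computed price, and raising the supply curve while holding demand fixed lowers it, exactly as the classical axioms predict.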

But there are some questions for which classical economic reasoning has no good answers.

One question is about the concept of value: food, which is necessary for life, and which everyone desires, often costs little; sapphires and pearls, which serve no practical purpose and meet no need, often cost much. Why?

The demand for food is large, universal, and steady; the demand for jewels is small, limited to a segment of the population, and can soften when competing demands expand. Yet the gems find a higher price than food. Why?

Classical economic understandings were correct as far as they went, but they did not go far enough to answer such questions. The science of economics would have to advance to a new level.

Carl Menger would discover the principles which allowed for new understandings of economic phenomena.

Born in 1840, Menger would do most of his work at the Universität Wien. He died in 1921.

Menger noticed that classical economic thought treated supply and demand as monolithic and irreducible forces, like magnetism or gravity. The original founders of classical economics - Smith, Hume, Ricardo - may have attempted to follow the paradigm of Newtonian physics or Cartesian philosophy, and tried to find a systematic solution based on a few axiomatic concepts.

Instead, Menger analyzed economic demand as being composed of many individual decisions. Different individuals will be purchasing the same product at the same time - but their choices are different.

Consumers may buy the same product, but for different reasons. The value which they place on the product will vary. ‘Value’ is a measure of the strength or intensity of demand - in common language, “how much” and “how badly” the customer wants the product.

Classical economics could not answer these questions: How does a person assign a value to an object? A person decides to pay a certain amount for the object, presumably because the object will fill, or help to fill, certain desires; how does a person make that decision? Why does a person decide to pay some specific amount? Joseph Salerno writes:

Menger brilliantly answered the question by restating it: “Which satisfaction would not be attained if the economizing individual did not have the given unit at his disposal - that is, if he were to have command of a total amount smaller by that one unit?” In light of Menger’s discussion of economizing, the obviously correct answer to this question is “only the least of all the satisfactions assured by the whole available quantity.” In other words, regardless of which particular physical unit of his supply was subtracted, the actor would economize by choosing to reallocate the remaining units so as to continue to satisfy his most important wants and to forego the satisfaction of only the least important want of those previously satisfied by the larger supply. It is, thus, always the least important satisfaction that is dependent on a unit of the actor's supply of a good and, that, therefore, determines the value of each and every unit of the supply. This value-determining satisfaction soon came to be known as the “marginal utility.”

Menger’s discovery was this: goods have values because they serve various purposes, and the importance of these various purposes differs. This is a step toward answering the question about why gems are more expensive than food.
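To see how marginal utility resolves the paradox of food and gems, here is a minimal sketch in Python; the ranked wants and the numbers are purely hypothetical illustrations, not figures from Menger or Salerno:

# The value of any single unit of a good equals the least important want
# that the whole available supply still satisfies (the "marginal" want).
def marginal_value(ranked_wants, units_available):
    # ranked_wants lists satisfaction values, most important first;
    # with n units on hand, losing one unit forfeits only the least
    # important of the n wants currently being satisfied.
    satisfied = ranked_wants[:units_available]
    return min(satisfied) if satisfied else 0

water_wants = [10, 8, 4, 1]   # drinking, cooking, washing, watering plants
gem_wants   = [9, 7]          # adornment, a gift

print(marginal_value(water_wants, 4))  # 1  - water is plentiful, so one unit is cheap
print(marginal_value(gem_wants, 1))    # 9  - gems are scarce, so one unit is dear
print(marginal_value(water_wants, 1))  # 10 - in a desert, one unit of water is precious

Because the supply of food or water is large, the marginal unit serves only a trivial want and commands a low price; because gems are scarce, the marginal gem serves an important want and commands a high price.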

Menger argued that it is not the case that the value of goods derives from the value of the labor used to produce them. On the contrary: the value of labor derives from the value of the goods it produces.

Against the idea that financial transactions are always an exchange of equal values - like an algebra equation or a chemical equation - Menger pointed out that people will give up what they value less in order to gain what they value more. It is this asymmetry which produces wealth: both sides gain from the transaction.

One sees here again the fascination which the founders of classical economics had with the advancements in natural science made by people like Newton and Boyle.

Carl Menger launched a new area of economic investigation, which kept many of the discoveries of ‘classical economics,’ but which also created possibilities for more advanced analyses.

Thursday, October 29, 2015

Sentenced to Death for Thinking

In Pakistan, a woman named Asia Bibi was sentenced to death after allegations were made that she was “denigrating the Prophet Muhammad,” according to news media.

The allegations were reported to authorities five days after the ostensible incident, evidence was scant, and the narratives of the purported witnesses were weak and inconsistent.

Such scenarios are not uncommon where Islam prevails. Aged fifty, Asia Bibi is the mother of five, and was sent to

death row for allegedly denigrating the Prophet Muhammad during an argument with a Muslim colleague over a cup of water. Her lawyer argued that lower courts wrongly overlooked the five-day delay between the incident and the police report filed by a local imam, who wasn’t present at the argument.

Pakistan is representative of many Islamic republics, which routinely sentence people to death for ‘blasphemy’ - but the definition of that word is stretched very thin. Mere accusations often suffice for conviction. The possession of a New Testament book, or attendance at any non-Muslim religious gathering, can easily be interpreted as ‘blasphemous.’

Jews, Christians, Hindus, Sikhs, Buddhists, and others are executed, imprisoned, or beaten for the simple fact that they are not Muslims.

These people are, in effect, punished for thinking.

In the case of Asia Bibi, diplomatic pressure from Europe and North America may yet have an effect. The Pakistani supreme court may hear an appeal of her conviction. Perhaps she will be allowed to live.

Sunday, October 4, 2015

Collaborating in the Ukrainian Genocide: Holodomor

In one of the more horrific aspects of Stalin’s mass murders, he discovered that the easiest way to kill millions of people was to simply starve them. Soviet soldiers set fire to fields and barns, slaughtered animals, and then left the area.

The farmers died by painful starvation.

One of the areas in which Stalin applied this technique was the Ukraine. Some regions of that country expressed hesitation about the Soviet-Communist plan for collectivization.

Ievgen Vorobiov writes about Grigoriy Petrovskiy, a Ukrainian who actually aided the Soviets in their mass murder. Far from protesting Stalin’s deadly communism, he carried out Stalin’s killing schemes:

Petrovskiy is notorious for having abetted the Soviet policy of Holodomor: forced requisition of food from Ukrainian peasants that resulted in mass starvation and the deaths of an estimated 3 to 5 million Ukrainians in 1932-1933. Even as a quarter of the Dnipropetrovsk region’s population was wiped out by the artificial famine, Petrovskiy proclaimed triumphantly in September 1933 that “the collective farm system has finally defeated all its enemies.”

Nearly a century after the fact, historians have discovered the full extent of the atrocities, details of which were kept partly secret until the fall of the Soviet Union in 1991.

Most Ukrainians (as well as at least 17 countries) recognize the Holodomor as an act of genocide against the Ukrainian people.

The USSR had help, not only from the traitor Petrovskiy who turned against his own people, but also from an American: Walter Duranty.

A reporter for the New York Times, Duranty aided Stalin by suppressing details of the millions of deaths. Instead, Duranty reported that Stalin’s communist collectivist program was a success, and that the ordinary people were benefitting from it.

Why did Duranty lie? Why did he willingly aid the century’s most prolific murderer? Either Duranty received payment from Stalin, or he was ideologically sympathetic to Stalin’s cause.

Thursday, September 3, 2015

Historic Weather Patterns

The earth’s climate has an effect on social structures: in some ways obviously, but in other cases with a hidden causation.

Historian Geoffrey Parker notes that a collection of political and economic upheavals around the globe in the 1600s correlates with swings in the planet’s climate. Reviewing Parker’s book, J.R. McNeill writes:

The various economic and political crises of the 17th century were not isolated events but connected.

The earth’s climate experienced a several-century-long cold spell. This represents a statistical outlier in measured average temperatures during a period of several millennia.

Climatologists are able to accurately reconstruct past temperatures, even for periods of time prior to the invention of the mercury thermometer. A wide spectrum of evidence, from tree-ring sizes to glacial expansions and contractions, documents hot and cold periods over the centuries.

Human records, while not recording numerically the temperature in degrees, yield observations about snowfall, rainfall, crop yields, and the freezing of rivers and lakes.

McNeill continues:

They all had a component of bad weather behind them. The Little Ice Age, which extended from about 1250 to about 1850, reached its nadir in the 17th century. That is explained partly by a spate of volcanic eruptions, the dust veils of which reduced the amount of sunshine reaching the Earth’s surface, and partly by a slump in the sun’s energy output called the Maunder Minimum. Colder weather, often dryer weather, and more frequent extreme weather became common in many if not all parts of the world. No one disputes this much, although there are debates about just how much colder it was, and how global it was.

This ‘Little Ice Age’ came on the heels of its antipode, the ‘Medieval Warm Period’ during the tenth through thirteenth centuries. This ‘warm period’ was also a statistical outlier.

Noteworthy is that these two extremes were non-anthropogenic. They occurred before the mass extraction and consumption of fossil fuels like coal, oil, and gas. They occurred before the large-scale use of chlorofluorocarbons.

Temperatures during the first few decades of the twenty-first century are not as warm as the Medieval Warm Period, nor as cold as the Little Ice Age. While some excursions in temperature have occurred in the late 1900s and early 2000s, they have not endured as long as these historic eras.

Because global temperatures have not long sustained hot or cold extremes comparable to those of the historic outlier periods, it is dubious to claim that current climatic instability is the result of human activity.

To be convincing or persuasive, any assertion that climate in the early twenty-first century is significantly or measurably impacted by manmade factors would need to rely on evidence which shows both that we are now in the midst of a climate swing which features temperatures at or beyond the historic outlier levels, and that such a swing will last as long or longer than the historic outlier periods.

We lack evidence to make plausible the claim that current climate patterns have been observably impacted by human activity.

Wednesday, August 12, 2015

War Socialism: the Irresistible Temptation

While the damage which combat inflicts on people is obvious - wounds, deaths, and the destruction of property - combat is only one part of war. War creates injuries beyond, aside from, and outside of, combat.

For those who may be hundreds, even thousands, of miles away from the physical fighting, war inflicts harm in different ways. Physically, there are shortages of materials, including important nutrients and medicines. Diseases can flourish among civilian populations during wartime.

War can also devastate political liberty. The crisis created by war provides the opportunity for political leaders to argue that urgent and exceptional circumstances necessitate and justify their suspension of the usual civil rights.

The government may contend that during peacetime, citizens have the right to freely discuss and analyze a government - pointing out its flaws - but during wartime, the situation is too important or too critical to allow unrestrained free speech. This is precisely what Woodrow Wilson’s administration did, in 1917 and 1918, when it enforced the Sedition Act and the Espionage Act.

A wartime government can inflict even more detriment on its citizens when it restricts not only their freedom to speak, but their freedom to act: when it regulates various aspects of ordinary life.

Again using war’s urgency as an excuse, governments tell farmers which crops they may grow, and at which prices they may sell them. Governments may tell bakers which type of bread to bake, and at which prices it may be sold.

This violation of an individual’s economic freedom is carried out in the name of the war effort. But such government intervention may destroy the very liberty which the war is allegedly defending.

Historians often call wartime regulations “war socialism.” Many political leaders simply can’t pass up the opportunity to gain more control over the lives of their subjects. After WWI, as Woodrow Wilson’s presidency came to an end in 1921, the voters expressed a strong desire for “normalcy.”

But Wilson’s thirst for power extended beyond the borders of the United States, and past the end of the war: not content with intervening in the lives of citizens during the war, Wilson wanted to control global relations after the war.

Redrawing the map of Europe, Wilson inflicted his revenge, largely against the Austrian Empire, for which he harbored a hatred: a hatred which many historians find difficult to explain. Historian Hans-Hermann Hoppe writes:

As an increasingly ideologically motivated conflict, the war quickly degenerated into a total war. Everywhere, the entire national economy was militarized (war socialism), and the time-honored distinction between combatants and non-combatants and military and civilian life fell by the wayside. For this reason, World War I resulted in many more civilian casualties — victims of starvation and disease — than of soldiers killed on the battlefields. Moreover, due to the ideological character of the war, at its end no compromise peace but only total surrender, humiliation, and punishment was possible. Germany had to give up her monarchy, and Alsace-Lorraine was returned to France as before the Franco-Prussian war of 1870-71. The new German republic was burdened with heavy long-term reparations. Germany was demilitarized, the German Saarland was occupied by the French, and in the East large territories had to be ceded to Poland (West Prussia and Silesia). However, Germany was not dismembered and destroyed. Wilson had reserved this fate for Austria. With the deposition of the Habsburgs the entire Austrian-Hungarian Empire was dismembered. As the crowning achievement of Wilson's foreign policy, two new and artificial states, Czechoslovakia and Yugoslavia, were carved out of the former Empire. Austria herself, for centuries one of Europe's great powers, was reduced in size to its small German-speaking heartland; and, as another of Wilson's legacies, tiny Austria was forced to surrender its entirely German province of Southern Tyrolia — extending to the Brenner Pass — to Italy.

Although the logic of Wilson’s foreign policy is obscure, he felt that the Habsburg dynasty and the Austro-Hungarian Empire posed a greater danger than the new Soviet Union.

Wilson’s policies seem to stem both from a desire to control and from a mysterious animus. He wanted to control economies, the expression of ideas, and the general shape of society. For unclear reasons, he harbored a deep antipathy against the Austrian monarchy.

Friday, July 17, 2015

Marriage: a Woman’s Free Choice

Understanding the complexities and nuances of societies which existed long ago requires both patience and the ability to set aside one’s own cultural concepts. We must reflect on our own society and recognize our era’s assumptions; we must then set those assumptions aside in order to enter into the society of a different civilization.

Historians have done this by examining personal letters and diaries of individuals from previous centuries. We are happy to have papers detailing the personal lives of a married couple, Emm Potter and Roger Lowe, who met in the 1660s in England.

Roger was learning a trade; he was learning to be a mercer. A ‘mercer’ is someone who deals in fabrics and textiles. He was progressing through the system in which one begins as an apprentice, advances through a series of stages including ‘journeyman,’ and then finishes as a ‘master.’

Some English towns had a regional holiday called the ‘wakes.’ In the town of Ashton, the Ashton Wakes was celebrated on the third Sunday in September. Concerning Roger Lowe, historian Paul Griffiths notes that

It was in an alehouse during Ashton wakes that he first plucked up the courage to talk with Emm Potter his future wife.

Roger Lowe was not a major historical person. That’s what makes this event important: we see how ordinary people met, fell in love, and married. They were not the rich and famous, not the kings and queens, but the ordinary working people. Roger and Emm represent the vast majority of ordinary English folk of their era.

Significant in the marriage of Roger and Emm is that the society of the time gave her the power to accept or reject Roger’s offer of marriage. Historian Olwen Hufton writes:

Most young people, wherever they lived in western Europe, sought to bring their marriage plans together in their mid-twenties. Roger Lowe, an apprentice mercer, felt he was approaching the time for marriage in 1663 when he saw reasonable prospects of the mastership and a livelihood. However, he waited five years to wed Emm Potter, whom he met in the company of friends and relatives in an alehouse during Ashton Wakes in 1664. His courtship of Emm consisted of walks and drinks in the company of friends as well as escorting her to weddings and funerals. At the latter he also met her parents. Emm waited, however, before making up her mind.

By contrast, American society of the early 21st century has reduced the woman’s power in social relationships. Many couples live together before marriage, which reduces the social leverage of the woman.

Note that an ‘alehouse’ was considered a reasonable place for people of different genders to meet. The consumption of alcohol was done in moderation, in well-lit institutions with windows, during daylight hours.

The culture of alcohol in 17th-century England, compared to that of 21st-century America, was much more likely to avoid both addiction and excess.

Comparing America of the 21st century with England of the 17th century, we see that Roger Lowe was taught by society to respect the will of the woman, and to respect her honor.

Despite narratives which focus on the concept of marriages arranged for money, power, land and politics, the reality for the vast majority of English people was marriage based on mutual consent and love.

The only people who did endure negotiated weddings for land, money, power, and politics were the thin layer at the top of society - the “one percent” - but, happily, the “99%” were able to marry voluntarily for love.

Of course, the popular phrases “one percent” and “99%” are not exact mathematical measurements, but merely hyperbole reflecting the small number of aristocrats and royals at the upper levels of society.

As it turns out, being one of the “upper crust” was a pain: you didn’t get to marry the person you loved.

The people of that time and place understood marriage as a free choice to give your life away to another person, to surrender your own desires for the well-being of another: it was not a right, but rather a giving up of your rights. Marriage is freely letting go of any claims you might make, and offering your service to help another person.

Marriage, then, at the time of Roger Lowe and Emm Potter, respected the will and humanity of the woman in a way which has sadly become less common since then.

Monday, July 6, 2015

Poland Suffers under Stalin While Allies Vegetate

One of the events which triggered the start of WWII was the Nazi invasion of Poland in 1939. The western allies declared war on Germany; the Soviet Union, which at that time was allied with Germany, also invaded Poland, although the western allies never declared war on the USSR.

By 1941, the USSR would join the western allies and fight against Germany. But the Soviet change in alliances did not change Soviet behavior toward the Poles.

In 1940, a unit of the NKVD, part of the Soviet secret police, executed approximately 22,000 Poles at or near a place called Katyn. The victims - army officers, police, and civilians - were unarmed and not part of any combat operation: the invasion of Poland had already ended in late 1939.

In addition to terrorizing the Poles, Stalin also annexed Polish territory. In 1945, the USSR seized 77,000 square kilometers of Poland and made it part of the Soviet Union.

The question arises: why did the western allies allow, or even enable, the USSR to perpetrate atrocities against Poland? Historians Herbert Romerstein and Stan Evans write:

Among the most striking features of wartime diplomatic history was the oft-repeated belief of the Western leaders that they had to make concessions to Moscow, while asking little or nothing in return from Stalin. The rationale for this would vary from one case to the next: to keep the Soviets from making a separate peace with Hitler, to build up their confidence in our intentions, to reward them for “killing the most Germans,” to placate them because of their great military power. Whatever the stated purpose, the result in nearly all such instances was the same: to give Stalin things that he demanded.

The postwar world was shaped by agreements made among the Allies at a series of major conferences. These meetings were held at Teheran in 1943, and at Yalta and Potsdam in 1945.

Stalin claimed that he needed to keep Poland under Soviet domination because it would act as a defensive shield: a “buffer” zone. Anyone trying to invade the USSR by going through Poland, Stalin argued, would be stopped before reaching Russian soil.

In reality, however, no such potential aggressors existed by the time Stalin used this excuse: Germany and Italy had been soundly defeated, and their industrial and military strength dismantled.

The scenes at Teheran and Yalta where these matters were discussed would read like a comedy of errors if they hadn’t been so tragic. Stalin’s posturing on the danger of invasion via Poland — a country he had himself invaded — has been noted. Equally bizarre were his objections to having outside observers monitor Polish elections, on the grounds that this would be offensive to the independent-minded Poles. This was said by Stalin with a presumably straight face, even as his agents were imposing a brutal dictatorship in Poland that would crush all hope of independence. All this was known by the Western allies to be bogus, but in the end they would swallow the whole concoction.

One reason why the western allies stood by while the USSR ravaged Poland was because the NKVD had placed secret agents (“moles”) inside their governments. These Soviet operatives fed only selected bits of information to the policymakers in the allied governments.

Winston Churchill was one of the few western leaders who was alert to Stalin’s deceptions and plans to create a communist hegemony in eastern Europe. Churchill was, however, unable to alert or persuade other western leaders about this.

Allied foreign policy was therefore shaped on the basis of reports generated by the international communist conspiracy. Acting on the Stalinist propaganda which had been presented to them as military intelligence or foreign policy analysis, the Allies let the USSR occupy and oppress Poland under a ruthless dictatorship for several decades after 1945.

Sunday, July 5, 2015

Poland after WWII

In April and May of 1940, the NKVD, a branch of the Soviet secret police, executed approximately 22,000 unarmed Polish citizens at or near the Katyn forest in western Russia. Many of the victims were civilians and had nothing to do with the war effort.

In 1940, the USSR was still one of Hitler’s allies, and Poland was caught between the Soviet army invading from the east and the Nazis invading from the west. Poland was quickly vanquished in 1939; the 1940 Katyn massacre was not a part of any combat operations.

By the end of 1941, the USSR would be fighting against Hitler. But that didn’t change Soviet aggressiveness toward the Poles.

Hitler’s attack on Poland in 1939 was the trigger which caused the western allies to enter the war; one of their goals was to fight for Polish freedom. Although the Allies ostensibly won the war, they did not, at war’s end, liberate Poland.

In the two decades after the war, Poland was mercilessly subjugated; in the two decades prior to the war, Poland enjoyed more political liberty than at any previous time in its history.

The Allies began the war in order to free Poland. The war, however, damaged Polish freedom, and the peace settlement at the end of the war - negotiated in Teheran and Yalta - made that damage permanent. Historians Stan Evans and Herbert Romerstein write:

The political outcomes of World War II were disastrous not only for the defeated nations, but also for many who sided with the victors. Nowhere was this more obviously so than in the case of Poland. The war in Europe was ostensibly fought for Polish independence, but would end in the country’s total subjugation. Poland thus embodied the tragedy of the conflict as described by Churchill: the democracies at terrible cost had won the war, then lost the peace that followed.

When the Soviets invaded Poland, one of their goals was the permanent annexation of Polish territory. At the war’s end in 1945, with Poland totally controlled by Stalin’s army, the USSR took 77,000 square kilometers of area from Poland.

Several other border adjustments in the 1940s, 1950s, and 1970s, brought about further territorial losses for Poland. The worst losses for Poland, both in terms of human life and in terms of land mass, came not from the 1939 invasion by the Nazis, but rather from the communists. Stan Evans and Herbert Romerstein write:

Poland became the proximate cause of fighting by the Western powers in September 1939, when it was invaded first by Hitler, then by Stalin, and the dictators divided up the country between them. Hitler’s invasion triggered a guarantee from England and her ally France to come to the aid of Poland in the event of such aggression. However, neither the British nor the French had the means of enforcing these brave pledges, so that Poland quickly fell to the invaders (as France herself would fall some nine months later).

The focus and goal of the Soviet communists, in the case of Poland, was to seize territory from the Poles, and tyrannize the land mass which remained as Poland.

The wartime understanding, that the Allies were working to liberate Poland, was instantly discarded at war’s end. The western allies stood back and allowed the USSR to occupy Poland.

Subsequently, when the United States entered the war as well, the Anglo-Americans would make further vows to Poland, as to other nations conquered by the Nazis. Such pledges were implicit in the Atlantic Charter of 1941 and related statements by the Allies about self-government and freedom, reinforced in comments to Polish exile leaders about the reign of liberty that would follow when the war was over. But these promises too would not be honored. Rather, when the fighting ended, Poland would again be conquered and dismembered — this time with the explicit sanction of the Western powers.

The question stands: why did the western allies allow the brutal imposition of communism on Poland? The answer is simple but shocking: Soviet agents had been planted as “moles” inside various western governments.

These agents controlled the flow of information to the policy makers in those governments. Advisors who were actually Soviet operatives, while apprising leaders about foreign relations, reported some items prominently, concealed other items, and fabricated narratives when events did not play into their hands.

Thus it was that some of the western leaders unwittingly enabled the Soviet plot. Others, like England’s Winston Churchill, saw that Stalin was a threat, but could not do anything about it.

Saturday, June 27, 2015

East Asia: the World's Stage

Although Vice President Joe Biden still refers to eastern Asia as “the Orient” - he did so in a September 2014 speech - more astute observers of the region are keeping their analyses current. Understanding China’s ambition is one key to understanding the Pacific Rim.

Mainland China does lots of saber-rattling and posing. How much of it should diplomats from other nations take seriously? Although eager to expand its hegemony, China does not want a major war with a major global power: it does not want a war, at least at the present moment, with the United States.

It does, however, want to intimidate its smaller neighbors in the region. In May 2014, Michael Auslin wrote:

Just over six months ago, China set up an air defense identification zone (ADIZ) over a large part of the East China Sea, including in airspace over the Japanese-administered Senkaku Islands. Tokyo had already established its own ADIZ decades earlier that included the Senkakus. Beijing’s move was provocative, destabilizing, and an indicator of its relentless attempts to redefine Asia’s international order for its own interests. The Obama administration’s response was either praised as rightfully downplaying an insignificant action that didn’t really change much in Asia or was derided as a weak attempt to pretend that nothing serious had happened. Washington flew two B-52s through some part of the zone and then let the whole matter slip from public view, even though civilian airliners changed their operating practices to comply with Beijing’s unique demands to identify themselves even when not approaching Chinese airspace.

Developments in the region are shaped largely by maritime considerations. Robert Kaplan concludes from this that the likelihood of a major land war is small. He writes:

East Asia is a vast, yawning expanse, stretching from Arctic to Antarctic reaches — from the Kuril Islands southward to New Zealand — and characterized by a shattered array of coastlines and archipelagos, themselves separated by great seas and distances. Even accounting for the fact of how technology has compressed distance, with missiles and fighter jets — the latter easily refueled in the air — rendering any geography closed and claustrophobic, the sea acts as a barrier to aggression, at least to the degree that dry land does not. The sea, unlike land, creates clearly defined borders, and thus has the potential to reduce conflict. Then there is speed to consider. Even the fastest warships travel comparatively slowly, 35 knots, say, reducing the chance of miscalculations and thus giving diplomats more hours — and days even — to reconsider decisions. Moreover, navies and air forces simply do not occupy territory the way armies do. It is because of the seas around East Asia that the twenty-first century has a better chance than the twentieth of avoiding great military conflagrations.

Despite Kaplan’s optimism, there is cause for concern. Chinese and Japanese military pilots are playing a complex form of chess - or a supersonic form of “chicken” - with each other. If a misstep occurs, a handful of pilots and millions or billions of dollars of aircraft could wind up at the bottom of the Pacific.

If that happens, the two belligerents might begin trading missiles with each other - first at each other’s navies, then possibly at military installations on land.

Although the United States would probably want to play the role of peacemaking diplomatic intermediary in such a scenario, increasing escalation could force the U.S. to choose a side.

Wednesday, June 24, 2015

Islam’s War on Archeology and History

Officers in the military sometimes work on unexpected tasks. Major Corine Wegener (U.S. Army) is tasked with rescuing and preserving ancient manuscripts and artifacts.

Major Wegener leads the U.S. Committee of the Blue Shield, which she coordinates with the Association of National Committees of the Blue Shield and the International Committee of the Blue Shield. Her task is to save artworks and archaeological finds from the “Islamic State” (ISIS) which is currently terrorizing the region which many historians call the “cradle of civilization.”

Sculptures and clay tablets are the main evidence of the great civilizations which once inhabited that region: Babylonians, Akkadians, Sumerians, and others. The ‘Islamic State’ terrorists are bent on not only tyrannizing the present, but also on destroying much of humanity’s past.

The United States Army, and Major Wegener in particular, is hoping to protect these artifacts so that future generations can study them.

In case this sounds familiar, the 2014 movie The Monuments Men tells a similar story. George Clooney, Matt Damon, Bill Murray, and John Goodman star in this film, which shows how the United States Army preserved paintings and sculptures during WWII in the 1940s.

Seventy years later, the same logic is at work. Authors Sarah Eekhoff Zylstra and Gordon Govier report on Major Wegener’s efforts:

“We teach various emergency methods for protection and evacuation,” she said. Last summer, the committee trained 14 Syrian archaeologists and museum professionals who risked their lives both to attend the training and to hide museum artifacts.

In addition to Major Wegener’s activities with the army, other groups are trying to safeguard these ancient finds. Columba Stewart is the executive director of the Hill Museum and Manuscript Library in Minnesota.

Blue Shield isn’t the only organization working on the problem. Stewart’s library has digitized 2,500 manuscripts from Syria since 2005, and about 5,000 more from Iraq since 2009. Manuscripts are small enough to move and hide, so they are relatively easy to protect from looters. But they are also relatively easy to sell on the black market, he said.

Terrorists from the “Islamic State” are not only destroying artifacts, papyri, parchments, and other documents, but are sometimes selling texts as a way to fund their attacks. These “blood antiquities” are sold to private collectors, not to museums or universities, and become unavailable for further research and scholarship. Such manuscripts often degrade and decay, as private collectors do not usually have access to the best preservation methods.

ISIS is violating its own propaganda: it destroys artworks because it claims that orthodox Islam demands such destruction; but it preserves other artworks in order to sell them.

Small items can be smuggled out of Islamic countries to safety. Buildings cannot. For archaeological sites,

taking photographs is the only way to preserve them, said Stewart. “A statue or a carving you might be able to hide, but if somebody is intent on destroying [a building] for ideological reasons, there’s not a lot you can do.”

Sarah Eekhoff Zylstra and Gordon Govier note that “ISIS bulldozed the ancient city of Nimrod,” just as the Taliban had destroyed the Buddha statues of Bamiyan. International cultural treasures are at risk: many of the items destroyed by the Islamic State and by the Taliban were UNESCO World Heritage Sites.

Tuesday, June 23, 2015

Blood Antiquities: Islam’s War on Art History

The Hill Museum and Manuscript Library in Minnesota is not only a warehouse of priceless historical documents, but also sends out teams of researchers and preservationists to rescue parchments, papyri, and other ancient texts. These manuscripts are endangered because of the activity of the terrorist group known as the Islamic State in Iraq and Syria (ISIS). Authors Sarah Eekhoff Zylstra and Gordon Govier write:

“ISIS is very savvy, very alert to economics,” said Columba Stewart, executive director.

Not only texts, but artworks in the forms of paintings and sculptures are being destroyed wholesale by the terrorists. Stewart seeks to save as many historic items as he can. “His team has been taking digital photographs of” those which cannot be brought to safety. The team has been working to preserve “artifacts in the Middle East for 12 years.”

It’s a tale of victory and tragedy. Each piece preserved, snatched from the hands of the terrorists, is a triumph. Every item left behind is a loss.

Also known as IS (“Islamic State”) or ISIL (“Islamic State of Iraq and the Levant”), this group is a manifestation of the Muslim antipathy to representational art. While some Muslim scholars permit paintings, drawings, and other images, mainstream Islam condemns any artistic image. For centuries, Muslim artists have specialized in abstract or nonrepresentational art: architecture and calligraphic patterns.

Most of the contents in Syria’s 34 national museums were transported to safe havens, United Nations officials reported last February. Still, the remaining museum pieces — or worse, uncatalogued items in archaeological sites — are at risk.

A few Islamic artists have produced images and representational art over the centuries. They, and their works, find safe havens outside Islamic nations.

In addition to the team from the Hill Museum, there are other scholars seeking to preserve artworks and ancient documents.

Enter the US Committee of the Blue Shield, the subject of George Clooney’s WWII movie The Monuments Men. Along with training the US military on how to protect cultural heritage during armed conflict, the committee also trains and teaches foreign museum staff who are trying to protect endangered artifacts, said member Corine Wegener.

The US Committee of the Blue Shield operates in concert with the Association of National Committees of the Blue Shield and the International Committee of the Blue Shield.

Major Corine Wegener (U.S. Army) was a founder and leader of this effort. It is clear that many nations around the world are eager to see cultural artifacts saved from Islamic terror.

Saturday, June 20, 2015

Woodrow Wilson’s War

The more one studies the causes of WWI, and the motivations of the various belligerents, the less one understands them.

Woodrow Wilson’s explanation that America had to enter the war to “make the world safe for democracy” must be understood as mere sloganeering. He was not interested in expanding individual political liberty, domestically or abroad.

Given Wilson’s heavy-handed penchant for social engineering, one might suppose he harbored kind sentiments toward the Austro-Hungarian monarchy. This, however, was not the case.

While Wilson may have admired and emulated a central European doctrine in which the state controls the individual, he found the Habsburg empire repugnant because it was a multi-ethnic amalgamation. Wilson was a strong believer in the nation-state, and to fuel Wilsonian nationalism, an ethnic state was needed.

This explains his delight with reassembling a sovereign Poland.

The Austro-Hungarian empire, by contrast, was a potpourri of Czechs, Hungarians, Austrians, Slovaks, and a few other groups. Wilson felt that such an impure mixture could not generate authentic nationalism.

When WWI started, the U.S. Civil War was still a living memory (Wilson was born in 1856). Wilson, trying to reconcile his racist southern roots with the Northern Republican victory of 1865, interpreted the war’s end as a lesson about centralized control. In order to be on the winning side, he felt, one needed to limit the rights and freedoms of individuals.

His father had owned slaves, and those slaves served young Wilson as he grew up. Looking back, he asserted that slavery was not a moral crime, but had merely been an economic system which had outlasted its usefulness.

The dynasties of Europe displeased Wilson, not because they limited the political liberties of their subjects, but because they lacked his systematic and scientific approach to managing the lives of citizens. Wilson was an early representative of what Alexei Marcoux and others have come to call “technocratic authoritarian paternalism.”

The ethnic nation-state was, Wilson felt, better than a multi-ethnic empire for reasons of scientific management. But Wilson was the president of a multi-ethnic country. He hoped to resolve this tension by instituting various racial policies. Historian Eric Foner writes:

Woodrow Wilson, a native of Virginia, could speak without irony of the South’s “genuine representative government” and its exalted “standards of liberty.” His administration imposed full racial segregation in Washington and hounded from office considerable numbers of black federal employees.

Wilson’s racism, however, is only a small part of his broader political vision. This becomes visible in his efforts to redraw the map of the world at Versailles:

Meanwhile, the Paris Peace Conference of 1919 sacrificed the principle of self-determination - ostensibly the Allies’ major war aim - on the altar of imperialism, so far as the world’s nonwhite peoples were concerned. Nation-states were created for Eastern Europe, but not for what Wilson's advisor Colonel Edward House called the “backward countries” of Asia and Africa.

The result was a feeling of deep betrayal that affected everyone from W. E. B. Du Bois, who had traveled to Paris to plead the cause of colonial independence, to ordinary black Americans. Du Bois was forced to conclude that Wilson had “never at any single moment meant to include in his Democracy” black Americans or the nonwhite peoples of the world.

Wilson’s engagement in WWI was not a competition between liberty and tyranny, but rather a competition between hereditary dynastic authoritarianism and the above-mentioned technocratic authoritarian paternalism.

Wilson rejected royalty’s claim to rule while asserting his own right to rule. In his estimation, he simply knew better, and knew more. He did not oppose authoritarianism. He simply wanted a different type of authoritarianism. As scholar Hans-Hermann Hoppe writes:

World War I began as an old-fashioned territorial dispute. However, with the early involvement and the ultimate official entry into the war by the United States in April 1917, the war took on a new ideological dimension. The United States had been founded as a republic, and the democratic principle, inherent in the idea of a republic, had only recently been carried to victory as the result of the violent defeat and devastation of the secessionist Confederacy by the centralist Union government. At the time of World War I, this triumphant ideology of an expansionist democratic republicanism had found its very personification in then U.S. President Wilson. Under Wilson’s administration, the European war became an ideological mission - to make the world safe for democracy and free of dynastic rulers. When in March 1917 the U.S.-allied Czar Nicholas II was forced to abdicate and a new democratic-republican government was established in Russia under Kerenski, Wilson was elated. With the Czar gone, the war had finally become a purely ideological conflict: of good against evil. Wilson and his closest foreign policy advisors, George D. Herron and Colonel House, disliked the Germany of the Kaiser, the aristocracy, and the military elite. But they hated Austria. As Erik von Kuehnelt-Leddihn has characterized the views of Wilson and the American left, “Austria was far more wicked than Germany. It existed in contradiction of the Mazzinian principle of the national state, it had inherited many traditions as well as symbols from the Holy Roman Empire (double-headed eagle, black-gold colors, etc.); its dynasty had once ruled over Spain (another bete noire); it had led the Counter-Reformation, headed the Holy Alliance, fought against the Risorgimento, suppressed the Magyar rebellion under Kossuth (who had a monument in New York City), and morally supported the monarchical experiment in Mexico. Habsburg - the very name evoked memories of Roman Catholicism, of the Armada, the Inquisition, Metternich, Lafayette jailed at Olmuetz and Silvio Pellico in Brünn’s Spielberg fortress. Such a state had to be shattered, such a dynasty had to disappear.”

Wilson didn’t care about making the world safe for democracy. He was a racist, but racism is only one dimension of his vision for ruling.

By means of WWI, and especially by means of postwar treaties and arrangements, Wilson hoped to reshape the world along the lines of scientific management. ‘Taylorism’ or ‘scientific management’ was an attempt to rationalize industrial production. Wilson hoped to expand that principle beyond the factory, and to impose it upon the world.

Friday, June 19, 2015

Islam’s War on Art

Islam is generally understood to prohibit the creation of images: representational art in the forms of drawings, paintings, sketches, or sculpture. For this reason, much Muslim art is nonrepresentational: the generation of abstract patterns in calligraphy and architectural ornamentation are examples.

Outside the mainstream of Islamic thought, there are some groups of Muslims who permit the making and displaying of pictures. These marginal factions, however, are few in number and consistently persecuted by majority groups within Islam.

For this reason, art historians consistently report that the vast majority of Islamic artworks are nonrepresentational, especially in the Muslim cultures of northeastern Africa and southwestern Asia. Of the representational pieces created over the centuries, few survive. Those which do survive often do so in hiding, or in non-Muslim nations.

The main source for the prohibition of images is the text of the Hadith, the collected sayings of the prophet Muhammad. Historians Sarah Eekhoff Zylstra and Gordon Govier report:

In 2001 in Kabul, Afghanistan, the Taliban destroyed thousands of artifacts that resembled humans or animals. But museum employees hid statues in obscure storerooms and kept the fragments of pieces smashed by the Taliban. In recent years, 300 have been restored.

The proscription of images centers on God, extending to Muhammad, then to Muhammad’s family and officers, then to humans generally, next to animals, and finally to plants and inanimate objects. While the Sunni are more stringent than the Shia in their opposition to representational art, in practice any imagery or portraiture is extremely rare in both groups.

It is, in fact, relatively easy to find examples of Muslim clerics who have approved of representational art, even up to and including portraits of human beings. It is also easy to find examples of individual Muslim artists who made such artwork, and of patrons who funded or bought it. But numerically, these are exceptions. Historically, they are outliers. Geographically, they tend to be found outside historically Islamic nations.

Several early Islamic rulers approved of coinage bearing human images. Some later rulers had their palaces decorated with sculpture and paintings. In the twentieth century, two films about the prophet Muhammad were made by Muslim filmmakers. But the coins were melted down, the paintings and sculptures removed, and the films banned.

Although these exceptions exist, the overriding trend within Islam is against art. As the researchers at the David Collection, an art museum in Copenhagen, write:

The non-figurative character of religious decoration has remained a fundamental principle throughout the history of Islam. At no point have images found their way into the interiors of mosques; as far as we know, no Muslim artist has endeavored to depict God; the Koran has never been illustrated; and depictions of the Prophet Muhammad are rare. With the reform of coinage carried out by the caliph Abd al-Malik in 696, even the portraits of rulers were removed from Islamic coins and replaced by calligraphic decoration.

What do art historians and museum curators do when faced with Islamic attacks on art? The terrorist group known as the “Islamic State” has destroyed artworks, both in museums and elsewhere.

Brave preservationists have accomplished heroic feats, preserving some pieces of art from Muslim attacks. Sarah Eekhoff Zylstra and Gordon Govier write:

“The unsung heroes in these situations are people like librarians or museum directors who do their best to hide things in advance of trouble or as trouble arrives,” Stewart said. One of his colleagues smuggled thousands of manuscripts out of Qaraqosh, Iraq’s “Christian capital,” in advance of ISIS last summer.

The “Islamic State” group seeks to destroy the artistic, historical, anthropological dimensions of culture and civilization. Their efforts are designed to impoverish the human race in every sense possible - intellectually and economically.

Historians and curators around the world are eager to lend assistance in any way possible to those who are trying to preserve bits of art and history inside the Islamic world.

Still, the loss is enormous. “Our understanding of the past is made up of little specks we find and put together,” said Hershel Shanks.

Shanks, who edits an archeological periodical, assesses the losses, the pieces which have been totally destroyed: “Much of what we don’t know is gone. It’s heartbreaking.”

Tuesday, June 9, 2015

Climate and Politics

History is rarely simple, and any easy narrative is suspect. Upon closer examination, an effortless account of some event or process usually contains significant, if hidden, complexities.

Intricacies are not hidden, however, in the task of examining the multidimensional and interdisciplinary phenomena surrounding the Maunder Minimum and the Little Ice Age. The former is a documented span of years, approximately 1645 to 1715, during which sunspot activity was significantly lower than average. The latter, which roughly correlates with the former, is a global climate swing, the dating of which is more ambiguous.

Starting points for the Little Ice Age have been proposed around 1300 or 1350, but some scientists argue for a beginning around 1550. The age terminates around 1850 or 1870. In any case, the age was non-anthropogenic. Climatologists are unsure as to the significance of its correlation with the Maunder Minimum.

It is clear that the Little Ice Age and the Maunder Minimum together correlate with a series of economic, social, political, and military instabilities. Kenneth Pomeranz, in his review of Geoffrey Parker’s Global Crisis, writes:

We can now be sufficiently precise about at least some past weather events that we can tie them to the sorts of political and social events that unfold in months and years, not just the more diffuse patterns of decades or centuries. This allows us to write histories in which both relatively large patterns of climate and specific human decisions matter.

The climatic instability of the years of the Little Ice Age and the Maunder Minimum correlates with events like the Thirty Years’ War, the overthrow of Charles I in England, and political turmoil in Asia and Africa.

Historians hypothesize that wild non-anthropogenic swings in climate threatened the food supply: both crops and wild game would have been affected. Beyond the agriculture needed for human sustenance, goods produced for trade would have been in short supply, if they relied on weather-dependent components or processes (e.g., tobacco, cotton, silk, spices, leather, etc.).

Competition for the dwindling supply of resources could fuel or trigger political unrest. Yet the ability of humans to adapt and adjust allowed for some level of normalization even during these years of statistically outlying weather. Pomeranz notes that

By the 1680s most major polities had regained some reasonable level of internal stability, though bad climate would persist for another thirty years.

As adaptation, often in terms of technique or technology, allowed humans to acclimate, the political upheaval seemed to subside.

Internal unrest became much less of a problem, even before 18th-century warming took hold, as agricultural improvements spread, market integration advanced, plague and smallpox were much better contained.

Given that “political turmoil ceased well before the climate improved,” it may well be that, even though dramatic climatic instability is non-anthropogenic, the tendency to successfully adapt to climate change is truly human.

Monday, June 1, 2015

Art History under Fire

Islam’s prohibition of images is well-known: drawing, painting, and sculpture are considered idolatrous. The faithful are instructed to purge their environment of such things.

Not only does this belief eliminate the production of any representative art, but it also triggers the destruction of such artworks as Muslims may encounter. Non-representational art is allowed, but only if it is in the service of Islam and only if it cannot be interpreted as arising from an “infidel” worldview.

The fact that artworks can be sold, however, creates an internal tension within Islam: destroy the artworks, or use them to fund the mission? Sarah Eekhoff Zylstra and Gordon Govier write:

ISIS has become one of the world’s best-funded terrorist groups, earning most of its profits by selling seized oil. But details keep emerging of the estimated No. 2 source of its billion-dollar revenue stream: looting.

This question repeatedly confronts Islam: Destroy the cultural artifacts or sell them? If the artifacts are destroyed, a chance to fund the mission is lost. If the artifacts are sold, the seller is open to the charge of hypocrisy on his own terms: he has allowed the “idolatrous” images to continue existing, which for a Muslim is a sin.

The term “ISIS” has become common, along with its variants, “ISIL” and “IS” - referring respectively to the “Islamic State of Iraq and Syria,” the “Islamic State of Iraq and the Levant,” and the “Islamic State.”

Under any name, Islam has destroyed archeological treasures wholesale. Scientists and historians from all parts of the world lamented the use of construction equipment and explosives to eliminate all traces of the ancient city of Nimrud, an Assyrian city founded around 1200 BC.

The city’s name is sometimes transliterated as ‘Nimrod.’ It is difficult for non-Muslims to grasp the worldview which demands the destruction of archeological findings from ancient civilizations. Zylstra and Govier continue:

The group often destroys statues and other objects it deems idolatrous. Last month, ISIS bulldozed the ancient city of Nimrod, prompting the senior editor of the New York Review of Books to call for military protection of archeological sites. Italy’s culture minister and Iraq’s tourism and antiquities minister have advanced similar proposals.

In early 2015, ISIS destroyed an ancient architectural feature known as the ‘Nineveh Wall.’ In 2014, ISIS had bulldozed a colossal ancient Assyrian gateway lion sculpture dating to the 8th century BC.

While, on the one hand, Islam is committing these acts of wanton destruction based on intolerance and xenophobia, ISIS paradoxically preserves other artifacts, equally offensive to Muslim sensibilities, in order to sell them.

ISIS cannot avoid the charge of hypocrisy. If it were consistently to apply Islamic values, it would not allow any of these archeological finds to exist. But it violates the worldview it proclaims for the sake of money. ISIS compromises its allegedly Islamic values for the sake of opportunism.

But thousands of potentially lucrative archaeological sites are now under ISIS control. The resulting looting has given rise to the term “blood antiquities.” In a parallel to Africa’s “blood diamonds,” in which mines in a war zone are looted to finance military operations, ancient artifacts are helping to fund ISIS’s reign of terror.

In a bizarre and nightmarish circumstance, historians and scholars are thankful that ISIS needs cash to continue its terrorist activities: the need for funding is enabling the survival of at least a few of the world’s oldest cultural treasures.

While some of the destroyed items had been catalogued, photographed, and described, other objects had yet to be identified or analyzed. Indiscriminate ruination annihilated both known finds and those which lay undiscovered waiting in vain to be unearthed.

In the blanket destruction caused by bulldozing entire cities, many artifacts have been lost forever.

Thursday, May 28, 2015

McCloy and Adenauer in Postwar Germany

After 1945, Germany needed capable political leaders to restore a free and open society. For twelve long years, Germany had suffered under Nazi oppression.

It would require considerable political skill to reinstate the civil democracy, the republic with freely-elected representatives, which had been taken away during Hitler’s dozen years of tyranny. Happily, there were still individuals left who had been part of the free political life of the pre-Nazi years, and who had not succumbed to the temptation to collaborate with Hitler’s National Socialist government.

Konrad Adenauer was such a man. He had been the mayor of Cologne - der Bürgermeister Kölns - during the last years of the Weimar Republic. Because of his courageous opposition to the National Socialists, he was removed from power, and lived out those twelve years under persecution, being questioned and jailed by the Gestapo.

At the war’s end, the Germans, finally free from their Nazi tormentors, began to rebuild their cities physically, and their liberty politically. Historian Terrence Prittie describes Konrad Adenauer at that point in time:

One should recall that the Adenauer of 1945 still looked, superficially, little qualified for the role which he was to play. Not only was he in his 70th year, but had next to no experience of party politics, he was utterly unknown outside his own country, and he had travelled little beyond its borders - although the picture of Adenauer, the stay-at-home hick-town boy, has been somewhat overdrawn. Brought back to Cologne as Mayor by the victorious Americans, he might well have stayed there for the rest of his working days - had the British, after taking over from the Americans, not dismissed him from office. Fortuitously and quite unintentionally, he was thrust into the party-political field. There, he made his way because of four characteristics. A previously iron will had been tempered by adversity into something more pliant, yet more subtly formidable. That Roman clarity and logic which had always been his enabled him to set himself attainable objectives, almost always the right ones. Suffering had given him, too, a more rounded character, and an elegance of manner which was highly persuasive. And his self-discipline enabled him to display a truly remarkable patience, and purposefulness.

The challenge to Germany, and to Adenauer, was to bring back the personal freedom which the country had enjoyed prior to the National Socialist dictatorship. Compounding that challenge was the fact that Germany was not a completely independent nation-state during the first postwar years.

Western Germany was occupied by the French, British, and American troops who’d liberated it from the Nazis at war’s end. Eastern Germany was brutalized by the Soviet Army which had invaded from the east.

Adenauer, as Chancellor of West Germany, was constrained by the Allied occupational authorities. He was frustrated by a lack of communication with other nations, and by communications from the Allies which were unclear or indirect.

These ambiguities left him in a position of trying to meet expectations of which he was only vaguely aware, or expectations that he respond to situations about which he had been only partially informed.

John McCloy was appointed U.S. High Commissioner for Germany in September 1949. He would work closely with Adenauer; the two were sometimes in harmony and sometimes in conflict. McCloy held the office until August 1952.

McCloy coordinated the treatment of some war criminals. McCloy reasoned that individuals who’d been involved in procurement of industrial supplies like steel and coal were different than the Nazis who’d been directly involved in the atrocities committed against civilians.

Sorting the truly evil from those who’d merely been exploited as part of the German economy, McCloy left the brutal war criminals to face their sentences; but among those who’d faced tribunals merely because they were part of industry, he pardoned some and commuted the sentences of others. He restored confiscated property to a few industries.

Adenauer, and public opinion in West Germany generally, encouraged McCloy to consider that the war crimes trials had gotten out of hand. The Nazis who’d committed atrocities and “crimes against humanity” had been tried, convicted, and sentenced. The trials, however, continued. Ordinary Germans were being hauled into court as defendants and charged with “war crimes” when in fact the definition of this phrase was being stretched well beyond its previously established usage.

Thus McCloy, as U.S. High Commissioner, pardoned, or commuted the sentences of, Germans who’d been wrongly convicted as war criminals. Adenauer had worked to inform McCloy about this situation.

The working relationship between Adenauer and McCloy also came into play when Germany joined the Council of Europe.

The Council of Europe was founded in 1949. It is separate from, and not to be confused with, the European Council, a branch of the European Union, although the two are similar in nature; the Council of Europe also predates the EU. Hans-Peter Schwarz writes:

Adenauer was constantly troubled by his ignorance of the hidden intentions of the Western Allies with regard to Germany. In April 1950 he complained bitterly about the situation during a confidential conversation with McCloy. He was expected to make a decision that was vital for his country - he was referring to entry to the Council of Europe - even though, without representation abroad, he could not know what was going on in the world.

Continuing his personal mission to help rebuild Germany, McCloy became a founding member of the American Council on Germany in 1952. His goal was to give West Germany true sovereignty and give Adenauer real governing authority.

Rebuilding and rearming Germany were central to a successful Cold War strategy. The Germans needed to know that the Allies would not randomly round up ordinary German civilians and convict them of imagined war crimes. The men whom McCloy had pardoned, or whose sentences he'd commuted, brought expertise to the rebuilding of Germany's industrial base. This base was needed as part of the defensive effort against the USSR's Cold War threat.

McCloy had reported to Truman that the Germans needed to feel they could help defend themselves against a possible Soviet invasion, and that the U.S. needed Germany as part of the defense of western Europe. The feared Soviet invasion never came, in part because McCloy’s actions convinced the Germans of Allied good will and helped bind West Germany into the Western defense effort.

Monday, May 25, 2015

The Southeast Asian Seascape

While some historians have embraced the slogan ‘demography is destiny’ to explain great historical trends, one might justifiably assert a competing slogan, ‘geography is destiny.’

Describing the setting for what will be decisive moments in the twenty-first century, Robert Kaplan describes the situation of the nations which border the South China Sea. Indonesia, Singapore, Malaysia, Vietnam, the Philippines, Taiwan, and China ring this body of water which is the main navigational route between the Indian Ocean and the Pacific.

Geographically, the center and heart of this part of the world is water. Kaplan writes:

Europe is a landscape; East Asia is a seascape. Therein lies a crucial difference between the twentieth and twenty-first centuries. The most contested areas of the globe in the last century lay on dry land in Europe, particularly in the flat expanse that rendered the eastern and western borders of Germany artificial, and thus exposed to intensive to-ing and fro-ing of armies. But starting in the last phase of the Cold War the demographic, economic, and military axis of the earth has measurably shifted to the opposite end of Eurasia, where the spaces between the principal nodes of population are overwhelmingly maritime. By maritime, I mean sea, air, and outer space: for ever since the emergence of aircraft carriers in the early decades of the twentieth century, sea and air battle formations have become increasingly inextricable, with outer space now added to the mix because of navigational and other assistance to ships and planes from satellites. Hence naval has become shorthand for several dimensions of military activity. And make no mistake, naval is the operative word. Because of the way that geography illuminates and sets priorities, the physical contours of East Asia argue for a naval century, with the remote possibility of land warfare on the Korean Peninsula being the striking exception.

The South China Sea is a highway for billions of dollars of shipping, from agricultural products to finished high-tech machinery. More cargo moves through it than along any paved highway on land.

This body of water is also important in terms of military strategy, both because of the freight which moves through, and because any nation which would control it would also control the nations encircling it.

China is, of course, the single biggest power among these countries, but an alliance of several of the others would be something which China cannot afford to overlook. These smaller nations also look to the United States to support them diplomatically, and militarily if necessary, as they face China.

The United States must then decide if, for the sake of its own interests, or for the sake of Vietnam, Malaysia, Singapore, Indonesia, Taiwan, and the Philippines, it is willing to apply its power in this location. For this goal, an expansion and repurposing of the U.S. Navy might be necessary.

In May 2015, the Philippine GMA Network wrote about China’s slow but steady aggressive expansion into the South China Sea:

A Chinese state-owned newspaper said on Monday that "war is inevitable" between China and the United States over the South China Sea unless Washington stops demanding Beijing halt the building of artificial islands in the disputed waterway.

Who owns which parts of the South China Sea? Which parts are international waters, owned by no one nation? These are some of the questions which cause friction, and could potentially cause war. The GMA Network gives details:

China claims most of the South China Sea, through which $5 trillion in ship-borne trade passes every year. The Philippines, Vietnam, Malaysia, Taiwan and Brunei also have overlapping claims.
The United States has routinely called on all claimants to halt reclamation in the Spratlys, but accuses China of carrying out work on a scale that far outstrips any other country.
Washington has also vowed to keep up air and sea patrols in the South China Sea amid concerns among security experts that China might impose air and sea restrictions in the Spratlys once it completes work on its seven artificial islands.
China has said it had every right to set up an Air Defense Identification Zone in the South China Sea but that current conditions did not warrant one.

Even without war, however, China can increasingly manipulate the dynamics of the region. The Chinese government probably would prefer it that way - if it can succeed in establishing its tyranny over smaller countries and draining away their liberty without the expense and chaos of warfare, all the better for China.

Kaplan argues that China’s chances at aggressive imperialism are the result of a “gradual American decline, in a geopolitical sense.” The United States Navy has been slowly shrinking, both in size and in technological competitiveness.

The U.S. government has not demonstrated the will to protect American interests or lives. The historical axiom that ‘weakness is provocative’ takes effect in this situation: China could not possibly restrain itself from aggression, given a weak target. David Feith, writing in the Wall Street Journal, reports that

But even without war, Chinese military and economic power could erode the sovereignty of neighboring states, establishing Chinese hegemony over the world's most populous, economically powerful and strategically significant region — an outcome that grows more likely as the U.S. retrenches.

Without the United States, will the nations around the South China Sea be able to resist aggression from mainland China? One possible answer to that question is India. Historically, India has cultural connections to regions in Vietnam and Malaysia.

Those cultural connections are, by themselves, probably not enough to cause India to enter the fray. The diplomatic machinations around the South China Sea operate purely on calculations of power.

But India is also a rising economic power, one which seeks to compete with China. Bismarckian Realpolitik might well convince India that a Machiavellian excursion into the region could be to its advantage. One looming question for the twenty-first century is, then, whether, and to which extent, India might become a player in the South China Sea.

Tuesday, May 12, 2015

California's Drought in Historical Context

Over time, the earth’s climate has demonstrated persistent instability: take any span of decades or centuries, find the average temperature or precipitation for that segment of years, and then note that there are long periods of above average or below average data.

Because this can be done for any arbitrary segment of years, it becomes difficult to give meaning to an ‘average’ value. The arithmetic mean of temperature during one decade might, by comparison, show that decade to be above average relative to the two centuries before and after it, but below average relative to the four centuries on either side of it.
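
To make the arithmetic concrete, here is a minimal sketch in Python, using made-up annual values rather than real measurements, showing how the verdict of ‘above average’ or ‘below average’ for a given decade depends entirely on which baseline window is chosen for comparison:

import random

random.seed(42)

# Hypothetical annual temperatures (arbitrary units); purely illustrative, not real data.
temps = {year: 10.0 + random.uniform(-1.0, 1.0) for year in range(1000, 2000)}

def mean_over(start, end):
    # Arithmetic mean of the annual values for the years start..end, inclusive.
    values = [temps[y] for y in range(start, end + 1)]
    return sum(values) / len(values)

decade = mean_over(1500, 1509)   # the decade being judged
narrow = mean_over(1300, 1709)   # roughly two centuries on either side
wide   = mean_over(1100, 1909)   # roughly four centuries on either side

print(f"decade mean: {decade:.3f}")
print(f"vs. narrow baseline: {'above' if decade > narrow else 'below'} average ({narrow:.3f})")
print(f"vs. wide baseline: {'above' if decade > wide else 'below'} average ({wide:.3f})")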

Thus it is interesting to note that a New York Times article, published in May 2015, says of California’s weather that

According to climate scientists, it may be the worst arid spell in 1,200 years.

The article is quite correct: around the year 800 AD, the globe was in the midst of a centuries-long hot and dry spell known to historians as the ‘Medieval Warm Period,’ a time much hotter than anything within living memory.

The records which reveal this phenomenon are both human and natural. Written records of rainfall, snowfall, temperature, the crests of rivers, and the advance or retreat of glaciers go back to Roman times and even earlier.

In terms of non-human records, tree rings, core samples, and the courses of rivers offer evidence about climate over the centuries and millennia.

As the Times notes, the Medieval Warm Period was a statistical outlier that stretched over several centuries, significantly, observably, and quantifiably hotter than our era. The Medieval Warm Period was also not anthropogenic in cause.

Interestingly, the Times asserts that the current climate, while not uniformly warming, is also not anthropogenic:

The California drought of today is mostly nature’s hand, diminishing an Eden created by man.

The statistical counterpart to the Medieval Warm Period is known as the ‘Little Ice Age,’ which began sometime between 1300 and 1350. Lasting until around 1850 or 1870, it was several centuries of global cooling, with average temperatures measured to be lower than those of preceding or succeeding centuries.

Like the Medieval Warm Period, the Little Ice Age was not anthropogenic. The current time, the early twenty-first century, is a time neither hotter nor colder than what has been verified over millennia of recorded history. The climate remains as it persistently has been: erratic and unstable.

It would be a worrying violation of pattern if the climate were stable. That hasn’t ever been observed in more than 6,000 years of human experience.

Thursday, May 7, 2015

Valerian Kills Followers of Jesus

Valerian was born Publius Licinius Valerianus; his year of birth is unknown. He was active both in the Senate and as a military leader prior to becoming emperor in 253 AD.

As emperor, he was part of a growing pattern of dividing the empire, for administrative purposes, into a western and an eastern half. At the end of his career, he was occupied largely with war and diplomacy vis-à-vis the Persians.

During his reign, he escalated the persecution of Jesus followers. They were arrested, beaten, tortured, jailed, and killed in large numbers.

Despite, or because of, the aggressive oppression of those who followed Jesus, their numbers grew continually. Historian Ernest Gottlieb Sihler writes:

In 258 A.D. the Emperor Valerian issued a rescript to the Roman Senate, not only naming bishops, presbyters, and deacons as objects of judicial prosecution, but decreeing also that senators and “honourables” (egregii) and Roman knights, who were Christians, should lose their ranks and fortunes, and if they persisted in their religion, their lives also. We observe at once the spread of the Christian religion and worship among the higher classes. The words of the official utterance of the Proconsul at Carthage, in passing sentence of death on Bishop Cyprian, in September, 258, are a suggestive document of those times.

In 257 AD, the proconsul Aspasius Paternus placed Cyprian under arrest. Cyprian was sent to live under house arrest in a town approximately 90 km from Carthage.

The location of Cyprian’s house arrest is variously spelled Curubis, Kurba, or Korba. Proconsul Aspasius Paternus used the following words to condemn Cyprian as a follower of Jesus:

For a long time you have lived in a sacrilegious frame of mind and have gathered very many men into your wicked conspiracy and have made yourself an enemy of the Roman gods and of the statutes of religion.

The proconsul continues, recounting imperial efforts to convert Cyprian to the Roman civil religion:

Their most august majesties Valerian and Gallienus and the Caesar Valerian could not recall you to the sect of their own ceremonies.

As Professor Sihler notes, the growth of the Jesus movement, especially among the leadership class, was perceived as a significant threat to the aristocracy’s ability to control Roman society.

After a year of arrest, the proconsul sentenced Cyprian to death with these words:

Let sound tradition [disciplina] be enforced by your blood.

Cyprian is one example of the thousands, even tens of thousands, of Jesus followers who were executed, not only during the reign of Valerian, but also during the reigns of Decius and other emperors.

During the career of Cyprian, the Novatianist controversy arose. The question was whether those Jesus followers who denied Jesus under pressure from the Roman government could be welcomed again into the fellowship of Jesus followers.

The majority of Jesus followers answered that those who “lapsed” either by denying Jesus under Roman threats, or by offering sacrifices to the Roman gods, were indeed welcome back among the community of Jesus followers. Cyprian, as bishop, exercised leadership and moderation in dealing with this matter.

Monday, April 27, 2015

A Different Kind of Civil Rights Struggle

President Kennedy was in Berlin when he said that “Freedom is indivisible, and when one man is enslaved, all are not free,” but the principle holds in China as well.

An unlikely hero, Chen Guangcheng became blind as a small child. He overcame this disability, and assembled an improbable coalition of causes to attempt to pry some humanitarian concessions from the government: the physically handicapped, women, and landowners.

While the Chinese government issued regulations which, on paper, entitled the physically disabled to certain benefits, it failed in many cases to actually give those benefits.

While the Chinese government made statements which seemed to recognize human rights, it continued to force women to undergo abortions against their wills.

While the Chinese government claimed to respect liberty, it confiscated land, and did so with no reimbursement to the land’s owners.

Chen Guangcheng saw these three violations of freedom as related. For him, they were not three separate questions, but rather three different applications of one principle.

Melanie Kirkpatrick writes that, seeing the “desperate conditions endured by China’s rural poor,” he was and is known among them as ‘the barefoot lawyer,’ because of the agricultural regions in which he carried out his work.

The Chinese government, attempting to present itself to the world as a legitimate state, issued legislation which ostensibly gave certain rights to its citizens. In reality, the government is an authoritarian autocracy. Scholars debate about which type of socialism or communism best describes the current regime in China, but whichever version of Marxist doctrine is ascribed to it, it remains a totalitarian dictatorship.

Thus it was that “Chen advised his countrymen about their legal rights.” By feigning humanity, the government had made statements - statements it never intended to uphold - but Chen would hold the government to those statements.

Chen’s activism began with a seemingly trivial incident: A ticket collector on a bus refused to let him ride free, as mandated under China’s law regarding those with disabilities. His outrage at this mistreatment propelled him into advocacy for people with disabilities, first at his school in Shandong Province and then on a national level. He educated himself on disability law, petitioned the government in Beijing for better enforcement, and used the media to call attention to violations.

The Chinese had grown quite accustomed to hearing and reading noble-sounding statements from the government, and knowing that the statements were meaningless - that the government had no plan to act according to them.

Chen, however, set a new course. He saw the government’s words as a potential trap for the government itself. The regime could not long tolerate someone who so clearly presented its hypocrisies - someone who presented them in a way which could not be ignored. The government was relieved when Chen finally left the country.

But his work may have continuing effects for many years to come.

Wednesday, April 22, 2015

The Soviet Threat to England

In 1955, Harold Macmillan, a member of the Tory party, and at that time holding the title of “Secretary of State for Foreign Affairs,” offered a rousing defense of Kim Philby, who had worked in the legendary MI6, the British foreign intelligence service. Philby had been accused of secretly working for the Soviets.

Macmillan’s defense of Philby was successful. It was also the worst mistake of his political career: Philby was in fact a Soviet agent.

Bringing evidence against Philby was Col. Marcus Lipton, a member of the Labour Party and an MP representing Brixton. As a Member of Parliament, Colonel Lipton used the legal concept of “parliamentary privilege” to carry out his investigation of Philby.

Macmillan was successful in his defense of Philby, not because of a preponderance of evidence, nor because he established a reasonable doubt about Philby’s guilt. The defense of Philby was based on an appeal to Philby’s social status and Lipton’s lack of it, not on logical argumentation. Historian Stan Evans recounts the scene:

The smooth-talking diplomat in chief, unflappable as ever, was blandly reassuring: Charges of pro-Red chicanery made against a former high official had been carefully looked into, and there was nothing to them. The accused had been unfairly named and had now been cleared by the security screeners. Just another case, it seemed, of wild allegations by reckless people who didn’t know the facts of record.

Lipton’s case against Philby was dismissed, not because it was irrational, but because it was unpopular, at least among the leadership clique.

Those in power felt that Philby was like them, and therefore couldn’t possibly be guilty. Their social affinity for Philby was so great that it caused them to overlook the evidence which indicated that he was a ‘Red’ - that he was working for the Soviets.

The combative lawmaker who brought the charges wasn’t buying. He had further evidence on the matter, he said, the nature of which he couldn’t reveal but would give to the appropriate committee. This prompted cries of “smear” and demands that the accuser make his outrageous statements off the floor, without legislative privilege, so that he could be sued for slander.

This legal scene in England in the 1950s had its roots two decades earlier. By the early 1930s, Philby, whose full name was Harold Adrian Russell Philby, was already a Soviet agent. His father was a British diplomat, and Philby studied at Westminster and Cambridge. He seemed to be the right sort of person, and was acquainted from an early age with powerful people of his generation.

The socialists in England and some European countries hoped that their version of socialism would be seen and accepted as the antidote to Hitler’s national socialism. Thus presenting themselves as people of goodwill, they hoped that nobody would suspect them of secretly collaborating with Stalin’s intelligence agencies.

As the 1930s intellectual ferment fed the Communist malaise, it had other adverse effects as well. An alternative answer to the cultural breakdown was the Nazi version of the godless faith, which had just come to power in Adolf Hitler’s Germany. As the Brown and Red despotisms fought for supremacy in Europe, each posed as the remedy for the other.

The more moderate socialists of the middle class embraced the volatile Leninist-Marxists as fellow anti-fascists, either not knowing, or ignoring, that the stated goal of these communists was violent revolution in the western democracies.

“For many in England,” Stan Evans writes,

the Communists and the USSR would thus gain added luster as alleged antidotes to Hitler. (A conflict capsuled in the Spanish Civil War of the latter 1930s, as Western leftists flocked to the Loyalist government in Madrid, supported in its fashion by the USSR, in battle against Gen. Francisco Franco, backed by the Italian Fascists and the Nazis.)

Sadly, while resisting the horrific evils of Hitler, many dupes in western democracies played into the hands of Stalin, who was orchestrating a different set of mass murders as he killed millions of Ukrainians in planned famines.

“From this maelstrom came the” generation of willing accomplices, idealistic and naive, who would either knowingly enlist in Stalin’s service as Philby did, or unwittingly cooperate with Soviet agents, “and many others like them, who would be the traitors of our histories.”

From the 1930s through the 1950s to the 1980s, Soviet agents were active in Britain, exporting confidential information and classified national security documents to Moscow. They also steered domestic discussions of policy, manipulating British policy so that it favored the Soviet Union rather than the interests of Englishmen.

Tuesday, April 21, 2015

China's Barefoot Lawyer

Rarely does a lawyer become a folk hero. But Chen Guangcheng managed to become exactly that among the peasants who suffer under China’s current government.

In addition to hard work and poverty, the farmers in China are subject to arbitrary land seizures in which the government takes their land and then sells it. Local officials rely on this corrupt practice for revenue.

Defending this simplest and most basic of rights, the right to property, Chen Guangcheng became known as the ‘Barefoot Lawyer,’ given both his rural origins and the rural focus of his work. He also amazingly overcame a significant physical handicap to carry out this work, as the New York Times reports:

Blinded by a childhood illness, he helped the disabled win public benefits and aided farmers fighting illegal land seizures. But in 2005, Shandong officials turned against him when he tried to defend thousands of victims of a coercive family-planning campaign. A year later, in a trial that many legal experts described as a sham, a local court convicted him of destroying property and organizing a crowd to block traffic while he was under house arrest.

Chen Guangcheng identified the underlying connection between economic freedoms, like property rights, and personal freedoms. Thus his defense of property rights is of a piece with his defense of rights for the physically handicapped and for women.

The Chinese government did not long tolerate Chen’s work. Scholars debate whether the government is more accurately labeled ‘communist’ or ‘socialist,’ but in any case, it accepts no limits on its scope or power. Melanie Kirkpatrick writes:

Chen Guangcheng seemed an unlikely hero. Born in 1971 to a poor family in rural China, blind since infancy, and illiterate until his late teens, Chen became his country’s most prominent human-rights activist. His story made international headlines in 2012 when, under house arrest, he made a dramatic escape and sought refuge in the US embassy in Beijing. The Chinese government eventually allowed him to go to the United States.

Chen now lives with his wife and two children in the United States. His mother and brother have faced continued harassment from the Chinese authorities.

His work has been based on the assumption that personal liberty, political liberty, and economic liberty are inseparable.

Wednesday, April 15, 2015

Putin and the Western Press

As a former KGB lieutenant colonel and a leader with instincts for both political manipulation and ruthless control, Vladimir Putin showed a stunning gap in his otherwise savvy machinations when he displayed a shocking lack of familiarity with the dynamics of the Western press.

During the 2005 summit meeting with President Bush at Bratislava in Slovakia, Putin made a comment about Bush having “fired” a reporter. The gaffe revealed that Putin did not understand that the Western press was independent of the federal government, and capable of directing egregious insults at elected or appointed officials with no fear of retaliation.

As President of Russia, Putin clearly had no idea of a truly free press, and could not imagine a president who was not empowered to punish or discipline the news media. President Bush recalls the moment:

It dawned on me what he was referring to. “Vladimir, are you talking about Dan Rather?” I asked. He said he was. I said, “I strongly suggest you not say that in public. The American people will think you don’t understand our system.”

At the time, CBS News had fired, or demanded a resignation from, Dan Rather, a news anchor who had broadcast a story based on what later turned out to be forged documents. CBS News accused Rather of failing to use due diligence in researching his sources.

Putin had nearly total control over the Russian press, and could not imagine the attacks which presidents regularly endure from the media in the United States.