Sunday, October 28, 2018

The Textual Basis of Islam: Historic Consensus

When Muhammad died in 632 A.D., he left behind a powerful organization, but no authoritative written texts. The Qur’an (Koran) would not be compiled in written form until decades later.

In the absence of an official text, various traditions emerged about what Muhammad might have said on different topics: vacuums attract filling.

Collections, both written and oral, appeared, presenting themselves as the words of Muhammad. These collections were not always mutually compatible. When the contradictions became evident, crisis and conflict arose.

If Islam were to continue as a viable social and political order, some harmony would be needed.

In legal matters, for example, some foundational core of unchanging doctrines was necessary for establishing sharia. There would be some flexibility for interpretation and competing regional precedents, but there would also be a fixed set of axiomatic laws.

A non-negotiable axiom within sharia is called a hadd (the plural form is hudud). The collected hudud constitute the basis of sharia and of the broader Islamic social worldview.

To formulate hudud and other invariable Islamic doctrines, early Muslim scholars began to sift through the competing collections of sayings. These collections were called hadith. In the meantime, the Qur’an was beginning to emerge in written form. The need to harmonize the hadith with each other, and with the Qur’an, was growing.

These scholars worked to form a consensus about which hadith would be regarded as authoritative, as Timothy Furnish writes:

Two aspects of individual hadiths became the focus of scholarly criticism within the early Islamic world: the matn (plural mutun or mitan), or “text,” and the isnad (plural sanad), or “chain of transmission.” A matn might well be rejected on the grounds that it seemed to contradict the Qur’an. But the focus of hadith criticism was channeled into investigating the isnads rather than the matns. The number, credibility, and seamlessness of the transmitters became more important than what the tradition actually said. And so as long as a hadith text did not actually contradict the Qur’an, it had a shot at being accepted by at least some segment of the early Islamic community, especially if what it said proved useful in some manner, usually political. Hadiths were ranked into three categories based on the trustworthiness of their chains of transmission going back to the Prophet: sahih, “sound”; hasan, “good”; and da'if, or “weak.”

So it was that the Qur’an began to accumulate out of various fragments which were circulating at the time. Simultaneously, some hadith were gaining recognition as authoritative while others were being relegated to a questionable status.

Two processes were happening, and they overlapped somewhat in time: the process of composing the final edition of the Qur’an, and the process of sorting out various hadith. These two processes most probably influenced each other.

Together, the two processes formed the finished social and political order which is Islam.

This categorization was largely worked out by Muhammad b. Idris al-Shafi'i (d. 820 CE), who had been disturbed by the proliferation of questionable, even downright false, traditions in his time and developed the gauge of isnad legitimacy as a means of differentiating spurious hadith from acceptable ones. If a consensus of scholars agreed a particular hadith was acceptable, then it was deemed so for the entire Islamic world.

This scholarly consensus shaped the history of Islam permanently. Not only were the hudud fixed, but more significantly, the nature of Islam as a political and social movement was determined.

The fact that modern Islam is less about the agency and personhood of the deity, and more about public and communal paradigms which humans should institute and maintain, can be traced to the formative stage of development in which the texts of Islam were accreted and given their relative degrees of authority.

Thursday, October 25, 2018

African Physical Geography: Unnavigable Rivers and Absent Ports

The landscape of Africa is a significant obstacle to its inhabitants.

Africa has thousands of miles of coastline, but sub-Saharan Africa has very few usable deep-water ports. This has historically limited imports and exports and the accompanying exchanges of ideas with other cultures.

Likewise, the Sahara Desert is an effective obstacle to land transportation between the southern part of the continent and the northern part.

It is the land itself which has kept the cultures of sub-Saharan Africa isolated from contact with the outside world.

This geography has also kept these cultures isolated from each other. The rivers of Africa are, on average, measurably less navigable than rivers on other continents. This has led, in turn, to less contact between the various nations of sub-Saharan Africa, as Tim Marshall writes:

Africa, being a huge continent, has always consisted of different regions, climates and cultures, but what they all had in common was their isolation from each other and the outside world. That is less the case now, but the legacy remains.

A glance at a map of Africa can be misleading. One might think that it would be a straightforward overland journey from, e.g., Libya to Angola. But it is not. Likewise, one might imagine that Namibia (formerly German South-West Africa) or Mozambique would be well situated to engage in large-scale international trade by means of cargo ships. This, however, is not the case.

The map fails to reveal the impediments to travel within the continent, and the barriers to shipping from and to other continents, as Tim Marshall notes:

The world’s idea of African geography is flawed.

For cultural, social, political, and economic purposes, Africa could be considered as two continents, with the impassable Sahara between the two.

The geography of this immense continent can be explained in several ways, but the most basic is to think of Africa in terms of the top third and bottom two-thirds.

Understanding the physical geography of Africa will help the reader to understand why the continent is filled with ‘developing nations’ that don’t develop. All manner of aid from other parts of the world, and all types of schemes to encourage development - formulated both by native Africans and by outsiders - cannot change the shape and structure of the land itself.

Wednesday, October 24, 2018

Hinduism’s Treatment of Women: The Practice of Sati

One of the most gripping concepts in history is the Hindu practice of suttee, or sati - the practice of expecting, or even requiring, that a widow commit suicide upon the death of her husband. This was often carried out in the form of self-immolation: the widow threw herself onto her husband’s funeral pyre.

This is the most jarring example of Hinduism’s view of women, but there are many other more mundane examples.

Hinduism lays a conceptual foundation for cultural and social practices which ascribe a secondary status to women, as scholar Richard Cavendish notes:

Orthodox Hindus believe that women cannot attain salvation as women, but only through being reborn as men. Women are evil and unclean, and the virtuous Hindu woman, who must treat her husband as if he was a god, is considered inferior to the worst of men.

Cavendish goes on to note that “the tirade in the Indian epic, the Mahabharata” is a “condemnation of woman” which says that women are “a curse. In her body the evil cycle of life begins afresh.” Male children are born “fouled with the impurities of woman. A wise man will avoid the contaminating society of women.”

In comparison, Cavendish says that the Judeo-Christian tradition within European culture is “by no means as hostile to women as orthodox Hinduism.” Western civilization is not “appalled by sex.”

British authorities began legislating to end suttee in the early 1800s. After India achieved political independence in the late 1940s, the Indian government continued to combat suttee, but instances of the practice have been recorded as late as 2008.

Sunday, October 7, 2018

Improving Standards of Living: Industrialization Makes Life Better for Lower and Middle Classes

Historians use the term ‘Industrial Revolution’ to refer to an era which began in the early 1700s in England. Like most constructs, this era does not have specific and clear starting and ending points in time, but is a general concept rather than a specific and concrete one. Although the concept is not precise or definite, there are certainly specific and concrete events which form the historical data underlying the construct.

Much later, a wave of industrialization spread across the United States. Historians sometimes refer to this as the ‘Second Industrial Revolution’ or the ‘Technological Revolution.’ Prior to this era, more than 90% of the population was engaged in agriculture and lived in rural settings.

Industrialization changed various aspects of society. Increasing percentages of the population lived in cities: urbanization. Improved transportation and food-preservation techniques reduced the ability of a poor harvest to cause starvation. Standards of living, especially for the lower and middle classes, improved, as mass-produced goods fell in price.

Modern sewage and drinking-water systems improved public health. Specialization of the labor force made both skilled and unskilled labor more productive, which in turn improved wages.

Although the original Industrial Revolution included instances of child labor, the Second Industrial Revolution decreased child labor and replaced it with more years of childhood education.

The middle class expanded as upward mobility created opportunities for the lower class to increase their wages and join the middle class. As historian Tanya Lee Stone writes,

In the late 1800s, millions of Americans left small towns and farming areas to move to cities, where workers were needed more than ever before. Of course, they had to have places to live.

People were drawn to factory towns because they could earn better wages than they had earned on farms. Industrialization created opportunities.

The technological revolution meant that even the lower classes would obtain benefits like telephones and electric lights. Even the working classes could travel by train instead of making long, uncomfortable trips on horseback or in a horse-drawn wagon.

A small number of wealthy people began to buy as much land as they could and build houses and apartment buildings. They charged people fees, or rent, to live there.

The real benefit of industrialization was the increase of wealth. The landowners received more money in the form of rent paid to them. The workers received increasing wages and could buy consumer goods at decreasing prices. The remaining farmers earned more because, as some people left the farms to move to factory towns, the other farmers could farm larger areas of land and get more money when they sold their harvests.

As machinery for agriculture developed, one individual farmer could farm more land.

All the people in a society can benefit when wealth is increased. If the system is a free-market system, then not only the rich but also middle-class and lower-class people have chances to earn better wages.

Not everyone was happy about this prosperity. Some people hoped to create unrest and conflict, so they began calling the rich people of this era ‘robber barons.’ They said that these factory owners were inflicting misery on the poor.

But the working class didn’t believe that the factory owners were so bad. The workers were experiencing rising wages and increasing standards of living, and they were paying lower prices for the products they wanted. So the factory workers didn’t want to organize a rebellion against the factory owners.

In the years between approximately 1870 and 1920, all social classes in the United States experienced rising standards of living. The economic foundation created during these decades helped the United States face the challenges of the 20th century.