Science, Faith and the Glory of Ignorance

June 24, 2017

Why is it assumed (please forgive the passive voice) that if one believes in a Creator then one must not believe in science? Who decided that these two sources of information are mutually exclusive?

I am a Christian. I have believed and been taught all my life that I am a child of God, that Jesus Christ is His Only Begotten Son in the flesh and my Savior and Redeemer. I have studied the Bible as well as other sacred writings. They have strengthened my faith and deepened my conviction. I have also been taught that scientific method is the best way mankind has come up with to date for understanding our world. Some might call this a dilemma.

To those without it, faith is a cop-out, a way to explain anything we can’t explain. But stop for a moment. Faith is described by Paul as “the substance of things hoped for, the evidence of things not seen.” (Heb 11:1) Substance and evidence are real. Not always tangible, but real nonetheless. While non-believers will scoff, there are many things in our existence that are real but difficult or even impossible to adequately describe to someone who has never experienced them. For example, try to describe the taste of salt to one who has never tasted it. Using the word “salt” is ineffective because it requires familiarity. Referring to sea water or other comparisons still requires the experience of having first tasted it. So, does that mean it’s not real? Only in the minds of those who have never tasted. But in no way does that negate the reality of salt nor its value. That’s how it is with faith. To those of us who have experienced it and come to value it, it is as real as the chair I’m sitting in. But trying to help someone understand it who has no common base to start from is virtually impossible until they have an experience of their own.

On the other hand, to many Christians, science is the work of the faithless or even of the devil himself, blinding people to the works of God.  Is that not as limiting as the other?

Sure, science doesn’t always get it right, yet we still hold on to it. How long did we “bleed” disease out of people, killing more with the cure than with the disease itself? We are historically an ignorant species that grasps at whatever information we think we have and clings to it like a life raft in the north Atlantic. But that does not devalue the real knowledge we are acquiring. Brilliant men and women who spend their lives dissecting substances, experimenting with chemical reactions and basically playing with fire have learned incredible things about our world. Sure, a great deal of “knowledge” has been thrown out as our understanding grows. Much more will continue to be thrown out as new information comes to light. But that does not discount the very real store of knowledge we have. How many of us would discount germ theory or throw away our cell phones because that knowledge was acquired through scientific research? Personally, I enjoy being able to fly when I need to, keep informed on world and local events, and talk to my sister halfway across the country any time I want. These discoveries weren’t found in the Bible or the Book of Mormon. They were found through scientific research and brilliant engineering.

Don’t get me wrong – I absolutely credit our Father in Heaven for enlightening the minds of those who brought this knowledge to light. That’s the thing – these two disciplines were never supposed to be mutually exclusive. Faith leads one to action and reminds us to be humble enough to give honor to the One who gives us minds to think and air to breathe. Science is one vital way we use those minds. To discount knowledge gained by scientific study because we believe in God or to discount faith because it can’t be proven is ridiculous. “Truth is truth no matter where it is found.” To ignore enlightenment or research and only hold to a single source is blindly ignorant.

The question then comes: what about when the two conflict? For some, if something seems to contradict scientific theory, then it must be false. For others, if it seems to contradict the Bible, then it must be false. How arrogant we are when we think so! I first attended college at Brigham Young University in Provo, UT. There, I studied biology under a professor who was not of my faith. I don’t know whether he believed in God at all. But when we talked about evolution, the discussion was fast, furious (literally) and very one-sided. The students, most around 18-20 years old, were certain that there was no such thing as evolution. The teacher, employed by a church rather than the state, was treading on very delicate ground.

In this case, I hold to the scriptures which tell us that God spoke and Creation obeyed. So, does that mean that I discount all the science? No. What that professor said has stuck with me ever since: “I believe that when all things are made known, there will be a lot of very surprised people on both sides of this argument.”

I couldn’t agree more. And no one yet has been able to sufficiently tell me how the dinosaurs fit into all this. They don’t disprove religion. They only show how much we don’t yet understand. The hazard in using a single example of conflict is that it narrows the focus too much on this topic, evolution. Please don’t miss the point. There is room to acknowledge on all sides that we are sorely ignorant.

What I recognize, and what I would love others to recognize, is that we mortal humans do not understand a fraction of what we are certain about. Not about science and not about God. We have research, we have scientific method, and we are often wrong. We are often right, too. We have scriptures, we have prophets and we have historical records. We are often wrong. And we are often right as well. I may believe in God, but that doesn’t mean I understand all things about Him or His ways. As an eternal being I am in embryo.

This I do know: that God lives. That He created us in His image. That He gave us minds to think and reason and search. That He wants us to learn of our world as much as we can. “The glory of God is intelligence.” – that’s a sign on the campus of BYU.

I also know that mankind is a proud, self-centered race that glories in its own achievements. That is, in my belief, one of our most blinding weaknesses. We can’t seem to form the words, “I could be wrong.” Or “Maybe you’re right, too.”

My plea to both the scientific community and the religious community is to take a breath! Relax about who’s right and who’s wrong. Quit calling names and ridiculing each other, and allow that perhaps there is room for all of us to grow a little.

 

Food Rights of the Ainu People of Japan

April 14, 2015

The Ainu are an ancient people who populated the islands of the Sea of Okhotsk, including Sakhalin, Ezo (later known in Japanese as Hokkaido), the Kuril Islands and the southern end of Kamchatka. For centuries they lived as hunter-fisher-gatherers who also practiced small-scale agriculture. When the earliest Japanese, descendants of Chinese and Koreans, immigrated to the islands now known as Japan, their feudal society stayed for the most part in the southern islands, avoiding Ezo. This avoidance allowed the Ainu to live essentially in peace, continuing their customs and culture uninterrupted. Relations deteriorated under the Tokugawa Shogunate as the Japanese moved onto Ezo. The Ainu were forced to pay exorbitant tributes and were made second-class citizens. In 1799, after the Ainu rebellions in the Kuril Islands, the shogun instituted an assimilation policy for the Ainu, which was drastically accelerated during the Meiji era in the 19th century. Under such conditions Ainu hunting and fishing were severely curtailed, as were their language and religion, and the people were driven to the brink of extinction. Today, thanks to intervention efforts, the Ainu people and culture are experiencing a renaissance.

The earliest written record of the Ainu dates to the 14th century A.D. Mitochondrial DNA evidence supports the theory that the Ainu descended from the Okhotsk and Satsumon cultures as early as the 13th century. (Sato) Some Ainu believe they originally came from the region of the Saru River on Ezo. (Kayano)

As hunter-fisher-gatherers, the Ainu subsisted on a diet primarily of venison, bear and salmon, with wild fruits and berries, millet, wild onions, wild potatoes and other plants. These items were found in abundance in the islands. “Whenever the Ainu needed meat they would enter the woods with bows and arrows and hunted as many deer as they desired.” (Kayano) The traditional practices of Ainu hunting were honed by centuries of working with the land and observing the results. The Ainu called salmon “shipe” from the original phrase “shi-e-pe” which translates into “the real thing we eat”, meaning “our staple food”. Harvesting only what was needed at the time, “they ate only the ‘interest’ on the returning fish, so there was never a worry about the ‘capital’ or main stock of fish disappearing.” (Kayano, 23)

They had a spiritual connection to the land and the harvest. All animals and plants were considered sacred, although some more than others. In order to appease the spirit gods, hunters offered prayers before the hunt and after the kill. Religious and cultural ceremonies centered on the meat and fish. In one such ceremony, the first salmon of the season was placed upon a wooden cutting board, set at the seat of the father of the house with the fish’s head facing the fireplace in a very particular manner. The father would bow to the salmon and speak a prayer in his native tongue thanking the fish for the honor of its presence in his house. Shigeru Kayano, an Ainu and the author of Our Land Was a Forest, wrote about such an offering made each year by his father. “Then he faced the flames in the fireplace and prayed to the goddess of the fire: ‘Today for the first time this year I have brought home a salmon. Please rejoice. This salmon is not merely for us humans to eat by ourselves, but for us to eat with the gods and with my children, as tiny as insects. Please watch over me, that I may catch many salmon hereafter.’” (Kayano) Offerings such as this were performed over a great variety of foods and situations, always honoring the animal that was taken and asking for a blessing of some sort. The religious connection to their food was a significant part of the Ainu culture. These practices continue today among some of the few remaining Ainu who still identify with their heritage.

The problems of declining or destroyed food rights and sovereignty began in earnest with the Meiji restoration and the focus that government placed upon strengthening Japan’s military and central government. Land rights were stripped from the Ainu who had held some of the choicest territories in the country. Those tracts were handed out to Japanese settlers and nobility, leaving the Ainu without their traditional hunting, fishing and gathering grounds. “This trend caused not only poverty but also the destruction of Ainu traditional cultures.” (Godfrey) Hunting, fishing and gathering were integral parts of the Ainu identity. To remove these from them was akin to a prison sentence.

The Meiji government handed out seed and implements and ordered the Ainu to begin farming and raising cattle and pigs. Agriculture was traditionally women’s work for the Ainu people. Removing hunting and fishing from the men and forcing them into farming was devastating. “Probably one of the largest stumbling blocks for the Ainu was the change in the traditional division of labor between men and women which farming seemed to demand.” (Peng, 732) Famine, disease, poverty and starvation ensued. The Ainu were in decline. From a population estimated at around 80,000, the Ainu dropped to about 15,000 during the Meiji era. And what remained was a conquered people.

With practices of assimilation not too different from the way the United States government treated Native Americans in the 19th century, Ainu children were removed from their homes, sent to assimilation schools, taught the Japanese language and forbidden to speak their own. Their religious practices were also forbidden, including those honoring the spirit gods. (Godfrey) The Ainu culture was removed forcibly and quickly.

Laws were passed enforcing the new way of life. Prior to the Japanese incursion, the Ainu hunted approximately 600 to 700 deer each year on Ezo. During the Meiji era, the Japanese hunted on average 15,000 deer annually. Then deer hunting was forbidden entirely. Salmon were on the list as well. Initially, when the Japanese took over the islands, they fished the ocean salmon, leaving the freshwater salmon for the Ainu. As more commercial fishermen fought for limited opportunities in ocean fishing, the government opened up the rivers to them. Under Meiji law, private parties were no longer allowed to fish in the rivers. The Ainu were only subsistence fishermen, not commercial. Now they would not be fishermen at all. The sacred rites of the First Fish offerings would no longer be possible.

Some Ainu complied, making great efforts to abide by the changing laws. But there were many who continued to fish. Living in fear of arrest, parents counselled children that if any person asked if they ate fish, they were to deny it.

Kayano remembers how his family was torn apart over this prohibition:
“One night [in 1932] a policeman stepped inside [my home], looked at my father and said ‘Shall we go, Seitaro?’ My father prostrated himself on the floor and said, ‘Yes, I’m coming.’ Without raising his head, he let large tears fall onto the wooden floorboards … . My father was being taken away by the police for catching salmon… . As my father was led away, I ran after him, sobbing.” (Kayano)

Fortunately for the Ainu, as their traditional foods were put out of reach and they were forced to change their diets in order to survive, the change was roughly equivalent nutritionally. Where other aboriginal cultures suffered malnutrition and starvation due to forced dietary changes, the Ainu continued to eat foods that were at least similar to their old ways, even if they were unable to continue their old practices. Pork became the meat of choice – not by choice but by default. If that was all that was available, that was what was eaten.

With the change of cuisine came a change of attitude. The assimilation policy of the Meiji period set up conditions that promoted Japanese food as superior and Ainu food as inferior and undesirable. Under the new cuisine promotions, rice was considered “more tasty” than millet. The wild onion/garlic carried a strong smell and taste. Called by its Ainu name pukusa, it became the subject of ridicule. Many avoided it altogether. Others simply renamed it “Ainu negi” (Ainu onion), now associating the smell of the onion with the Ainu people. Over time, other foods were added to the derogatory list simply because of their association with the Ainu people. The Ainu adapted the best they could, accepting the new standards in order to avoid the stigma that was now attached to their old habits. Those who did continue their old practices did so secretly. (Iwasaki-Goodman)

For those foods the Japanese liked and wanted to keep using, renaming provided the option. Those foods were gradually added into the Japanese diet, but under Japanese names. In so doing, the Japanese removed any association with the Ainu and thereby any consideration of the foods as “undesirable”. (Iwasaki-Goodman) Traditional Ainu dishes and names now brought shame. Even the schools served only Japanese fare.

In addition to losing the Ainu names of the foods they ate, the children also were not being taught the religious rituals of their ancestors. “The important prayers to and attitudes towards the spiritual beings involved in harvesting and processing certain foods” were being lost. (Iwasaki-Goodman) Children in this period had lost their most basic ties to their heritage.

 

“The normative influences during the adolescence of these people prevented Ainu food habits from being reinforced through secondary socialization. Instead, negative social conditions influence attitudes to Ainu food habits, making people more willing to shift towards the Japanese way of life.” (Iwasaki-Goodman)

According to Masami Iwasaki-Goodman, food culture is learned from birth and developed as children grow up eating what their families eat and preparing it, treating it, and valuing it as their families do. “Along with food habits, children also learn the attitudes and values associated with food items and their preparation through interactions with family members and friends.” (Iwasaki-Goodman) Lasting ties are formed in kitchens and around the table. Tastes and traditions are passed from generation to generation, all through food. When those traditions are taken away, so are the ties that they bring.

By the end of the assimilation efforts, questions were raised about their effectiveness. Dr. Noémi Godfrey considered them a failure, saying the Ainu had not been assimilated but rather acculturated.

“The Ainu can no longer live according to their traditional way of life. They went from being hunter-gatherers and traders to farmers or factory workers. They are no longer allowed to practice their religious customs, and the use of their native tongue is restricted. A cultural gap widens between generations, parents and children no longer speaking the same language or practicing the same customs. No longer considered Ainu, but still not considered Japanese, the former aborigines cannot find their place in Japanese society, and find themselves completely acculturated.”(Godfrey)

By 1921, Japanese immigration into Hokkaido/Ezo had reached its peak. The Meiji era had ended, and with it the great push to assimilate the Ainu. But the people were left without any sense of identity, history or heritage. Almost 100 years of denial had effectively wiped out a culture. The language was considered dead, the laws prohibiting fishing and hunting remained, and the stigma of identifying with the Ainu was pervasive.

In more recent years, efforts have been made both from within the Ainu community and from outside to restore some of that which has been lost. Members of the Ainu community have teamed with a research group from Hokkai-Gakuen University in Sapporo, Japan, to provide information and the experience of eating traditional Ainu food. In doing so, they have begun reintroducing Ainu foods into their food culture. The plan has four parts as described by Masami Iwasaki-Goodman:

1. A community newsletter providing information about traditional food items
2. A series of cooking lessons
3. Preparation of Ainu dishes for ceremonial occasions
4. Other activities conducted outside the community.

One of the first things the Ainu students learned was how many traditional Ainu foods they were already eating without realizing it, because the foods were now known only by their Japanese names and had become part of Japanese cuisine. They found there were real differences between their diets and those of non-Ainu. Probably most telling was the heavier-than-normal use of wild vegetables in their cooking.

Along with reintroducing traditional foods, the intervention group has also brought back several of the sacred rituals that had been lost. One of those centers on a fermented drink called “tonoto”, a mixture of rice and egg millet cooked into a porridge. When it is placed into a keg, a piece of hot coal is placed on top of the mixture while the person performing the ritual offers a prayer to the fire god “Apefuchi Kamuy” to protect the tonoto as it ferments. (Iwasaki-Goodman)

Today, the laws prohibiting the Ainu from fishing have not been lifted. But fishing laws are being examined at both the local and international level. Kayano went to court against Hokkaido Syuyhouiinkai to fight a proposed dam that would, in his words, “constitute [a] threat to Ainu culture…” (CJIELP)

Food sovereignty, or the “right of peoples to healthy and culturally appropriate food…and their right to define their own food and agriculture systems” (Rosenberger, 18) may never be completely within the grasp of the Ainu of Hokkaido. But where they had once lost everything to a conqueror, today they have far more of their rights restored than they have seen in centuries. Where the children are learning and holding on to the sacred traditions of their ancestors, perhaps there is hope one day to see them flourish once more.

Works Cited:

Colorado Journal of International Environmental Law and Policy. 245. Summer, 2001: LexisNexis Academic. Web. Accessed 3/4/2015.

Godfrey, Noémi. “The Ainu Assimilation Policies During the Meiji Period and the Acculturation of Hokkaido’s Indigenous People.” Paris. National Institute of Oriental Language and Culture Studies. Web. Accessed 3/4/2015.

Iwasaki-Goodman, Masami. “Tasty Tonoto and not-so-tasty tonoto: fostering traditional food culture among the Ainu people in the Saru River region, Japan.” Indigenous Peoples’ Food Systems and Well-being. Sapporo. Hokkai-Gakuen University. 2009. Web. Accessed 2/16/2015.

Kayano, Shigeru. Our Land Was a Forest: An Ainu Memoir. Japan: Westview Press, 1994. Kindle AZW file.

Kayano, Shigeru. “Traditional Ainu Life: Living Off the Interest.” First Fish, First People: Salmon Tales of the North Pacific Rim. Ed. Judith Roche and Meg McHutchison. Seattle: University of Washington Press, 1998. 22-30. Print.

Peng, Fred C. C., Ricketts, Robert and Imamura, Nario. “The Socioeconomic Status of the Ainu: The Past in the Present”. American Ethnologist, Vol. 1, No. 4. Wiley. 1974. Web. Accessed 3/4/2015.

Rosenberger, Nancy. Seeking Food Rights: Nation, Inequality and Repression in Uzbekistan. Belmont: Wadsworth, Cengage Learning. 2012.

Sato, Takehiro, et al. Origins and Genetic Features of the Okhotsk People, Revealed By Ancient Mitochondrial DNA Analysis. The Japan Society of Human Genetics and Springer. 2007. Web. Accessed 3/4/2015.

Turner, Nancy J., Plotkin, Mark, and Kuhnlein, Harriet V. Indigenous Peoples’ Food Systems and Well-being. Sapporo. Hokkai-Gakuen University. 2009. Web. Accessed 2/17/2015.

Trail of Tears – a Brief History

February 11, 2013

America needed land, and the Indians were in the way, especially regarding some 5 million acres in the new state of Georgia.  No matter how “civilized” the Native Americans became, state and federal governments used economic, cultural and political pressure to force them out.


Economically, the Indians were having problems. In the late 1700s and early 1800s, the deerskin trade that had previously been so lucrative began to fade. One answer was to concede to some degree to becoming “civilized” and learn farming, in hopes of being able to feed their people. (28-29)

The people of Georgia had problems of their own. Soil tapped out by cotton farming and a growing population brought demands for more land, and the Cherokee had it – about 5 million acres of it. One American tactic was to put in a trading post called the US Factory. The Factory extended credit to Indians, allowing them to develop a taste for consumer goods. The Indians went into debt, and some had to sell their land to pay it off. (29-30) Instead of being ruled by materialism as Americans had expected, however, the Indians developed their own businesses, then invested the money they earned. The process didn’t make the Cherokee more willing to sell their land. Instead, it gave them the funds to better protect it. (36)

Culturally, Indians and Caucasians couldn’t have been more different. Where European Americans bought and sold land as a commodity, the Cherokee saw themselves as spiritually attached to the land. (6, 19) They owned it in common as a tribe. White Americans may have acknowledged the Cherokee right to the land, but since the Cherokee weren’t Christian, Anglos considered their claim weak. (12)

The only way to bridge the gap, according to Secretary of War Henry Knox, was to “civilize” the Indians – to teach them to read, write and speak English, wear white man’s clothes, give up hunting and become farmers, and above all, become Christians. This was the only way Knox could see the war ending. Once “civilized”, the Indians wouldn’t need so much land for hunting; they could sell their hunting grounds and have investment capital for their farms and businesses. (24-25) This goal was seen as the great answer to preventing all-out war. Congress funded missionaries to teach the Indians how to become civilized. The Cherokee, for their part, weren’t too excited about the blatant attempts at religious conversion. Many liked the idea of having their children learn to read and write, but they generally rejected Christianity. (32-33) Since conversion was key to being accepted by white society, once again the great plan to convince the Indians to willingly give up their land had failed.

By far, the greatest pressure the government used was political. Various treaties had been signed between the British or American governments (whichever happened to be in power at the time) and the Indians, and the Indians were expected to honor them regardless of whether the whites did. While the British promised no settlers would cross the Appalachians, many did anyway. The crown paid for the land, and the settlers got a piece of Indian territory. (17)

Several US presidents saw the Indians as impediments to American progress and prosperity.  Thomas Jefferson believed that the future of the republic depended on speedy land acquisition. The importance of obtaining land outweighed the goal of civilizing the Indians. (31)  James Monroe felt that the Indians were sovereign and had the right to refuse to sell their land; but he also thought the Indians would be better off if they moved away.

Since Georgia had been the chief “thorn in their side”, so to speak, in 1824 the Cherokee turned the tables and used Georgia’s own argument against it, saying that they (the Cherokee) couldn’t recognize the sovereignty of a state within their territorial boundaries. (54) Then, in a grand step towards the very civilization the whites claimed to want, the Cherokee nation drew up its own constitution in 1827. (57)

This was the final blow; the gloves came off. Georgia’s legislature was incensed, calling the Cherokee constitution outrageous and their claim to sovereignty unconstitutional. Legislators blustered and threatened, then revealed their true colors, saying, “The lands in question belong to Georgia. She must and she will have them!” Georgia then passed legislation subjecting the Indians to white laws while denying them the protection of those same laws. It also declared the Cherokee government and all its actions null and void. The road to removal was set. With Andrew Jackson elected president in 1828, the fate of the Indians was effectively sealed. Jackson’s loathing of Indians was well known, and he made it his mission to remove them as quickly as he could, without regard to any so-called rights or sovereignty. He gave them only two options: “emigrate beyond Mississippi” or “submit to the laws of those States.” (58-61)

There were those who fought against Jackson’s removal policy. In fact, most of the National Republicans in Congress opposed it simply on party lines. Jeremiah Evarts, chief administrative officer of the American Board of Commissioners for Foreign Missions, was a key opponent of removal. He published 24 essays defending Cherokee rights and condemning removal. Others responded in both camps, and the debate was on. In April 1830, a bill for removal was passed. The Cherokee had lost. (61-63)

While the economic and cultural pressures certainly contributed, the removal became a matter of politics, both in the Cherokees’ attempts to remain sovereign and the US government’s determination to have the land. In the end, only one could win.

 

Source:

Perdue, Theda, and Michael D. Green. The Cherokee Nation and the Trail of Tears. Penguin Library of American Indian History, 2008.

The Emergency Banking Relief Act of 1933

January 30, 2013

At the height of the Great Depression, FDR took extreme measures to halt massive bank closures with the Emergency Banking Act of 1933.

By Elizabeth Linehan


From the opening years of the Great Depression, Herbert Hoover had hoped for individual and private solutions to the economic difficulties faced by Americans. Many – but by no means all – who have studied his choices have labeled them “laissez-faire” (literally “leave to do”, or more commonly “hands off”). Whatever the label, there is little doubt that Franklin Roosevelt was elected for exactly the opposite approach – direct, decisive and drastic intervention. He delivered.

Again, history remembers Roosevelt’s New Deal measures in many and varied ways. Though they have been judged in equal measure heroic and disastrous, there is little room to argue that many of his measures made vast and immediate differences.

Thousands of Bank Closures

In the first four years following the collapse of Wall Street on Black Tuesday, October 29, 1929, banks closed by the thousands. In 1931 alone, 2,300 banks shut their doors. In 1933, that number nearly doubled to more than 4,000. Panic was universal, and there was no end in sight.

As soon as FDR took office in 1933, he took sweeping action to try to turn around the plummeting economy. One of his first actions in March of that year was to halt the massive bank closures by declaring a banking holiday. From Monday, March 6 to Thursday, March 9, 1933, all banks in the US were closed for business. Then, on March 9, in what some would see as retroactive CYA, Roosevelt quickly wrote and pushed through Congress an amendment to the “Trading with the Enemy Act” (TEA) passed during World War I, legalizing the closures he had just enacted. This was the Emergency Banking Act of 1933.


Emergency Banking Act of 1933

Title 1, Section 1 of the Emergency Banking Act confirmed the actions, rules and regulations the President had taken since March 4, 1933 under the TEA, also called the “Act of October 6, 1917”. In other words, it legalized things the President had already done but without proper legal consent. It also extended the President’s powers under the TEA to cover persons within the US or any place under its jurisdiction, rather than just foreign countries.

Sections 2 and 3 prohibited the hoarding and melting of gold by private citizens and gave the Treasury the right to confiscate all privately held gold, paying for it with cash. That cash was no longer backed by gold, as it had been before.

Section 4 made doing business with banks during a declared emergency illegal, except by permission from the President of the US.

Title 2, called the “Bank Conservation Act”, empowered the Comptroller of the Currency and essentially put the national banking system into receivership. Of course, the official title for the “receiver” was “conservator”. The Comptroller could take control of banks and set the rules for running them, limiting withdrawals and debt payments under the direction of the President in an emergency. Title 2 also set the rules for reorganizing banks. This effectively gave the President absolute control of national finances during a declared emergency.

Title 3 governed the handling of shares of bank stock, common and preferred. It outlined the notification and treatment of shareholders, protecting the interests of preferred stockholders first and foremost over those of common stockholders. Banks could then absolve themselves of their debts as long as the Comptroller of the Currency and a majority of their stockholders agreed.

Title 4 allowed banks to convert their debts into cash, and any checks or drafts into cash, but at only 90% of their value.

Finally, Title 5 set aside $2,000,000 for expenditures incurred by the Treasury in executing this act.

Lasting Effects

It could reasonably be argued that using the “banking holiday” to halt the race to bankruptcy would have been sufficient, and that confiscating gold and outlawing private ownership of it was unnecessary and unconstitutional. The gold standard, which had backed US currency since the founding of this nation, was gone, never to return. Fortunately for Americans, the right to privately own gold was restored on January 1, 1975.

One other banking act passed in 1933 lives on today, better appreciated by private citizens. The Glass-Steagall Act of 1933 (not to be confused with the first Glass-Steagall Act, passed in February 1932) provided for the Federal Deposit Insurance Corporation. At its outset, the maximum amount a single depositor could have insured in a single bank was $2500. Today that amount has grown to $250,000, protecting checking and savings deposits and certificates of deposit, but not mutual funds, annuities, stocks, bonds, treasury securities or other investment products. In time, even those standards may evolve.

Sources:

Documents of American History, Emergency Banking Act of 1933, Web

United States Treasury, Trading with the Enemy Act, Web

Internet Archive, Glass-Steagall Act (1933), Web

Federal Deposit Insurance Corporation, Insured or Not Insured? Web

 

Copyright Elizabeth Linehan

 

Homophobia (or “Warning – Label May Be Misleading”)

October 3, 2012

A couple years ago I took a class called “Lifetime and Human Development”. Fascinating class that reaffirmed that not only were my kids normal and right on track, but my husband and I weren’t so bad off, either – or so I thought. Well, he’s okay. I’m the one who got into trouble.

As part of the course, we talked for a time about sexuality. Of course, sexual preferences were part of the discussion. From the beginning of the class, our instructor made it clear that all opinions and points of view were to be accepted and that no one was to be ostracized for being different. In a class such as this, that reminder is hugely important. That being said, there was a particular PowerPoint presentation that discussed statistics regarding sexuality. According to this presentation, around 10% of the people we encounter are likely to be gay, lesbian, bisexual or transgender. What caught my attention was one particular slide that talked about points of view. In describing people who disagree with the gay perspective, this slide used the word “homophobic”.

Let me say one thing before we continue. Everyone, and I mean everyone, has the right to live without fear, without harassment and without feeling less valuable because of their beliefs, lifestyle, faith, or other intensely personal character traits. There is NO excuse for bullying, discrimination, degradation or any other disrespect of another person, no matter the age, no matter the setting. This piece is not about homosexuality. It is about language, communication, and censorship. There is a difference. So, on with the story…

Being the quiet, demure student that I am (uh, well…), I raised my hand during the discussion following the presentation. I said that I had a problem with that particular word (“homophobia”) because it equated disagreement with a given ideology with being “phobic”, or unreasonably fearful, of gays or of homosexuality. I said, in what I had thought at the time were very carefully selected words, that while I may disagree with a certain ideology, that doesn’t make me “afraid” of it.

With only minimal comment from our instructor, we moved on. A couple of days later, I received an email from my instructor stating her concern that my words had offended some of the other students, two of whom had approached her after class and complained. One was a gay student who had suffered years of abuse and derision from others and was quite understandably concerned that I was going to cause him/her more problems. The other complaint came from another student – I assume heterosexual – who was simply offended by my viewpoint.

I, of course, responded as soon as I could after taking some time to think about it all. First, I had no intention of causing anyone undue alarm. I didn’t realize that I had given this first student cause for fear for his/her own safety, and I was very sorry for doing so. There was no opportunity to make amends directly. I hoped that the teacher had been able to assure this student that I had never meant to cause him/her distress. From what the teacher said, I think she did. On the other hand, I thought the second student was being as intolerant as she/he viewed me to be. I had voiced an opinion in a class where all opinions were supposed to be considered important. I had not spoken against homosexuality, only the misuse of a specific word.

That’s where that episode ended. But I haven’t been able to completely stop thinking about it, and here’s why:

First – “Homophobia” and “homophobic” are words that, thanks to a very liberal mainstream media, are sprayed around the United States like Weed-Be-Gone. Anyone who doesn’t openly support homosexuality and all related variations is immediately labeled as hostile, closed-minded, bigoted, and otherwise intolerant.

I can accept that the media are going to use specific words to push their agendas. But to find that same bias in a class that is all about tolerance and acceptance is very disturbing and disappointing to me. My instructor, of all people, should have been aware of the actual definition of such wording and its implicit connotations. The label “homophobic” should never have been allowed in that classroom, especially when we were discussing the effects of some teachers’ seemingly harmless words, actions and jokes on their more vulnerable students.

Second – by very definition, “homophobia” and “homophobic” are demeaning and prejudicial, even inflammatory. A phobia is, according to another teacher, an “unreasonable fear of something.” From medicalexicon.com,
phobia
Type: Term
Pronunciation: fō′bē-ă
Definitions:
1. Any objectively unfounded morbid dread or fear that arouses a state of panic.

We’re all familiar with arachnophobia – the “unfounded morbid dread” of spiders – and agoraphobia – the “unreasonable fear” of public or open places. Since when does simple disagreement with an ideology equate to unreasonable fear? Because I disagree about sports teams or the way my son eats spaghetti, does that make me fearful of them? I may have serious disagreements with policies of the current presidential administration, but I’m not at all afraid of them, and I’m certainly not unreasonable in my disagreement.

That applies equally here. I – or anyone else – may have reasons for our own thoughts and opinions about the ideology and/or practice of homosexuality without being fearful of it, and certainly without being unreasonable or without foundation. To label someone who disagrees as “unreasonable” or “unfounded” is inherently biased, intolerant and even censoring. If I can’t voice my opinion – taking into account, better than I did at that time, the likely effect on those who have a very painful history of dealing with intolerance – then I am being censored in the very manner our class was supposed to prevent. In this country, where our core values are based on our right to speak our minds without fear of retribution or censorship – as long as we respect the rights of others – how can such labeling be a good thing?

We have seen this homophobia label so often we actually are starting to believe it. But it’s a lie, plain and simple. Repeating a lie to the point of acceptance doesn’t change it to the truth. It is still a lie. To allow such censoring labels to be perpetuated is damaging to all of us. Such crimes against our basic rights of speech and print are even crimes against our thoughts. They will spread from this area to others as surely as a cancer metastasizes until the whole being is consumed.

To my friends and acquaintances whose views differ from my own – Keep them! Speak them! And be prepared to defend them. As Evelyn Beatrice Hall wrote in summing up Voltaire’s attitude, “I disapprove of what you say, but I will defend to the death your right to say it!”

James McPherson Lecture Tonight!

April 21, 2010

There are certain advantages to being a university student. One is having an inside track into local events that otherwise I’d probably never know about, nor would I realize what an opportunity it was.

Lecture – Lincoln and His Generals

A few years ago Jane Goodall came to Bozeman to promote her new book. Admission was free, even. I really wanted to go and bring Rachel, our younger daughter who is something of an animal hugger. For whatever reason, other things got in the way and we didn’t go. Probably a once-in-a-lifetime opportunity lost.

This time, that’s not going to happen! Tonight, James McPherson – a well-known, Pulitzer Prize-winning Civil War historian – is lecturing here at MSU. Dr. Rydell, my American History prof, requested that we all attend. Absolutely! And I’m bringing my family. And my neighbor. And her family… 🙂 You can see where this is going!

I have one of McPherson’s books, For Cause and Comrades. We used it as a text for our coverage of the Civil War in first-semester American History. It was very poignant in its personal view of day-to-day Civil War life.

So, tonight my four children and I will be sitting in the SUB ballroom, listening to McPherson educate us on “Tried by War: Lincoln as Commander in Chief.”  I’m totally stoked! (Hope my nine-year-old can keep his feet still.) Rich, my husband, won’t be able to be there. He has to drive to Idaho on his regular run. Guess I’ll just have to fill him in on every last detail. 😉

Roosevelt’s New Deal – A Question of Success

April 2, 2010

New Deal programs put 1/3 of America’s destitute to work.

In the decades since the end of the Great Depression, debate has raged in the hearts of taxpayers and politicians about the success or failure of the New Deal: President Franklin Delano Roosevelt’s collection of legislative actions aimed at providing relief from, and a way out of, the economic chaos that was the Depression. When addressing the question of success or failure, one cannot simply look at the program as a whole, because it was not simply one program. Many acts were proposed and passed – some to great benefit and some to disappointment. This paper will examine five programs in particular that mark the New Deal as a success for one specific reason: these acts did exactly what they were intended to do – stop the collapse of the banks and bring relief to millions of suffering poor. These five acts are the Emergency Banking Act, the National Recovery Act, the Federal Emergency Relief Administration, the Civil Works Administration, and the Works Progress Administration.

The need for relief from the dire situation of the Great Depression is not in question. The years from 1928 to 1933 saw unprecedented collapse of markets and economies across the nation. Businesses failed by the hundreds of thousands, big-ticket sales plummeted to fractions of their earlier levels, millions of workers were unemployed, homes and farms were in foreclosure, and people were hungry with nowhere to turn (Henretta 727-728). The inaction of the Hoover administration overtaxed charities to the breaking point. Newly elected President Roosevelt and the Democratic-majority Congress set to work immediately to shore up a failing economy with a whirlwind of legislative action (Henretta 739). The sheer speed with which they addressed the economic crisis turned the mood of the nation around. Finally, the people of the U.S. had hope that relief would be forthcoming.

The first act passed was the Emergency Banking Act (EBA) of 1933. To stem the flow of bank closures (nearly 2300 in 1931 alone), Roosevelt “declared a national ‘bank holiday’” (Henretta 739), bringing an immediate, if temporary, halt to any more closures. For several days, all banks closed their doors. In a special session of Congress, the EBA mandated a Treasury Department inspection of any bank wanting to reopen, to ensure it had enough cash reserves. Using the radio to reach the American public, Roosevelt convinced the people that their money was better off in the banks than out of them. With confidence returning, the follow-up to the EBA was the Glass-Steagall Act, which gave birth to the Federal Deposit Insurance Corporation, protecting deposits of up to $2500 from losses. These two actions brought enough confidence back to banking that bank closures, which topped 4000 in 1933, virtually ceased in 1934, when only 61 went under.

The National Recovery Act changed the way businesses governed themselves and how they set prices and production quotas. Of those changes, the most beneficial to the American people were the codes that outlawed child labor, set minimum wages and limited working hours for adults (Henretta 742). This group of actions prevented employees from being forced to work 16-hour days, thereby opening up man-hours for more employees. It moved children out of the workplace, opening up more jobs while improving quality of life for children. And finally, wages were raised to a legal minimum across the nation, paying workers more for the hours they put in.

Bringing relief to the unemployed required a different action – one that would encourage people to continue looking for work while allowing them to feed their families and pay for their homes. The Federal Emergency Relief Administration was created to do just that. Its first director, Harry Hopkins, sent federal monies to state relief programs. The money was desperately needed and quickly distributed – $5 million in just two hours after Hopkins started. In the first two years, over $1 billion was spent (Henretta 742). It wasn’t a long-term answer, but it was fast. The masses who continued to go without work were at least able to put food on their families’ tables.

There was a gnawing fear of “what would happen” if the Depression continued.

Congress set up a construction program called the Public Works Administration (PWA). Hopkins was then named head of a new Civil Works Administration (CWA), funded with $400 million drawn from the PWA. In 30 days, 2.6 million people had jobs. The following year, the CWA would hit its high point, paying for the jobs of 4 million Americans. These jobs involved “repairing bridges, building highways, constructing public buildings and setting up community projects.” (Henretta 743-744) Valuable work was being done, and millions of the formerly unemployed finally had respectable occupations.

The first New Deal didn’t continue long enough to see any stability in the recovering economy. When a new wave of recession began to rear its ugly head, FDR brought out phase two – the Second New Deal. One of the many programs in Act Two was the Works Progress Administration, another back-to-work program that “became the main federal relief agency.”  (Henretta 747) Unlike the FERA, the WPA put people straight onto the federal payrolls instead of paying the states to employ them.

The WPA was massive. Running from 1935 to 1943, the WPA employed over 8 million people and spent $10.5 billion. Assignments included the construction or repair of over 650,000 miles of roads, 124,000 bridges, 125,000 public buildings, 8200 parks and 850 airports. No one was getting rich working on these projects. Pay averaged $55 per month. But for those working, it was a far step above a handout. There were jobs for only one-third of the unemployed in the US (Henretta 747), but for that third, this was heaven-sent.

What was, perhaps, one of the best “products” to come from the New Deal was not as tangible as a paycheck or a pot of soup. Perhaps the greatest benefit was hope. With the inaction of the Hoover administration, hope was in short supply. As an Arizona man was quoted in the text as saying, “You can’t sleep, you know. You wake up at 2 a.m. and you lie and think.” (Henretta 737) In 1931, Mary Hamilton, a writer from Great Britain, observed, “…long queues of dreary-looking men and women standing in ‘breadlines’ outside the relief offices and the various church and other charitable institutions. Times Square…is packed with shabby, utterly dumb and apathetic-looking men, who stand there, waiting…there is an obscure alarm as to what they may do ‘if this goes on’…” (Henretta 732) Through these programs, that hope and optimism returned. Americans could finally believe there would be an end to the hunger and destitution.

Three of the aforementioned programs were active here in Montana. The Civil Works Administration “put 20,000 Montanans to work within three months.” Due to mismanagement, it was replaced by the Federal Emergency Relief Administration, which first gave direct relief and then employment. But the most prominent was again the Works Progress Administration, which employed up to 21,000 people in the state. “The [dormitory] project allowed rural children to stay in school, eased the financial burden of their families and gave (cook) MacLean pride in her own labor.” (Murphy 52) Other projects included “schools, roads, bridges, dams, stadiums, parks, swimming pools, tennis courts, fairgrounds, golf courses, water and sewage systems, airports, fish hatcheries, fences, sidewalks and curbs.” (Murphy 53) The work was valuable, honest and a worthy trade for the pay. Finally, people who were unable to grow crops on drought-stricken land had a chance to feed their families from the fruit of their own labor.

These programs that made up the “New Deal” weren’t panaceas. Some programs didn’t provide much good at all. But these five brought desperately needed relief to millions, kept banks open, and even encouraged investment. It would take the Second World War to finally bring comprehensive relief to the nation, but in the meantime, the New Deal gave millions of people the ability to hold on.

Searching for Eleanor Roosevelt

April 1, 2010

Again, my favorite class intrigued me with a tidbit to pursue. In American History, Dr. Robert Rydell closed class by bringing the McCarthy era home to us. He told our class about a gentleman who used to be part of the MSU staff, Robert Dunbar, and his invitation to First Lady Eleanor Roosevelt to come speak here at the (then) Montana State College.

Statue of Eleanor Roosevelt

Even Eleanor Roosevelt was accused of communism for her support and activity with the UN.

For those who didn’t know, Mrs. Roosevelt had been a signer in the formation of the United Nations. She worked feverishly to alleviate hunger and suffering across the world in the aftermath of World War II. But since, in the eyes of Senator Joseph McCarthy, the UN weakened America, anyone supporting it – especially those who were instrumental in creating it – was seeking the destruction of America and must therefore be a Communist.

At the time, then-MSC President Roland Renne had political ambitions and was seeking the office of Governor of Montana. He had grave concerns about Mrs. Roosevelt’s pending visit and how it would reflect on him. Fearing association with a suspected Communist sympathizer, Mr. Renne had the audacity to deny Mrs. Roosevelt a place on campus to speak. She instead took her plans into downtown Bozeman and spoke at another venue (Dr. Rydell wasn’t sure which; if I’m able to find out, I’ll update this). Dr. Dunbar was flabbergasted, of course, but powerless to do anything about it. Mrs. Roosevelt stepped up to speak on a stage completely draped in red – the carpet, the podium cover, the curtains, all of it. The implication was obvious. Still, she went on to deliver her message to a packed house.

Dr. Dunbar, in the meantime, was accused of communism by the Bozeman community. He received numerous death threats, kidnapping threats aimed at his children and other persecution. Like Mrs. Roosevelt, Dunbar wasn’t deterred. He went on to form the school’s first Peace Corps chapter – a group that in 2008 received recognition from the parent organization for high volunteerism and service.

What I found most perplexing was the near-complete lack of information available about this episode with Mrs. Roosevelt. There is a very brief mention on the University’s website (historical page) and, so far as I have found, nothing else. Why? Perhaps it wasn’t (isn’t?) considered noteworthy. That may be, but looking at the utter nonsense that otherwise finds its way into historical documents, this seems at least as memorable or significant. Perhaps it’s a splotch of mud on our shining coat. No one today likes to be remembered as reactionary or, worse, duped.

Most likely, I’m just not looking in the right place. That’s what I’m hoping. If so, then once more, I’ll update this when more facts are known. In the meantime, here’s looking forward to more of Dr. Robert Rydell’s classes. May they all be as thought-provoking as this series has been!

History of the Pledge of Allegiance

February 13, 2010

We had a rather interesting class in American History this week. My professor, Dr. Robert Rydell, gave us a brief history of the Pledge of Allegiance. I’ve always known it had been changed a time or two over the years, like adding the words “under God” in the 1950s. But I had no idea the evolution the Pledge has gone through!

First, a little background. The Pledge was written by Francis Bellamy in August 1892. According to Dr. Rydell, the Pledge was written for children in celebration of the 400th anniversary of Columbus’ landing in the New World. It was published in the Boston magazine “The Youth’s Companion”, and the words were sent to schoolchildren all over the country. The original Pledge bears only limited resemblance to the words we recite today:

From 1892: I pledge allegiance to my flag and the republic for which it stands; one nation, indivisible, with liberty and justice for all.

Dr. Rydell showed us the differences in how people saluted the flag in that day. First, the salute – known as Bellamy’s Salute – began the same as a military salute, at the eyebrow. That would be held as people said the words, “I pledge allegiance to”. As they then said, “my flag…” the right hand was extended from the salute to a reach toward the flag, hand still open with fingers together. It was held there until the pledge was finished.

Changes were not long in coming. First of all, the word “to” was added before “the republic” almost immediately. Larger changes took a bit longer.

This was a time in the US when immigration was becoming vastly unpopular: the economy was in turmoil due to over-production, a series of labor strikes and subsequent economic “panics”, and there was a large influx of European and Asian immigrants. Concern grew among some groups, including some national leaders, that those immigrants might point to the US flag while privately intending their “pledge of allegiance” for their own flag back home. Thus, in 1923, at a National Flag Conference in Washington, DC, “my flag” was changed to “the flag of the United States of America”.

In the 1940s when the US was at war with Nazi Germany, the dreaded “Heil, Hitler” salute was all too close to the part of our salute raised to the flag. The American salute to the flag was changed to the “hand over heart” that we do today.

In 1942, the Pledge was made an official part of displaying the American flag. As part of an effort “to codify and emphasize the existing rules and customs pertaining to the display and use of the flag of the United States of America,” Congress enacted a Pledge of Allegiance to the flag. [H.R. Rep. No. 2047, 77th Cong., 2d Sess. 1 (1942)]

The final big change came in 1954 when President Dwight D. Eisenhower approved adding the words, “under God”. He said, “From this day forward, the millions of our school children will daily proclaim in every city and town, every village and rural school house, the dedication of our nation and our people to the Almighty.”

In 2002, Michael Newdow, an atheist and attorney, filed suit in federal district court seeking to ban the Pledge because it purported to teach monotheism to his daughter. After various decisions, reversals, and refilings for different plaintiffs, the Eastern District Court of California declared mandatory teacher-led recitation of the Pledge to be unconstitutional. Since then, states have taken their own paths, some allowing voluntary recitation, others dropping it altogether. Some, including New York, require it to be recited each day. The United States Congress, the Supreme Court and other organizations recite the Pledge at the opening of their sessions.

Resources:

“Francis Bellamy”, http://en.wikipedia.org/wiki/Francis_Bellamy
Lectures: Dr. Robert Rydell, American History 102, Montana State University, Bozeman
Newdow v The Congress of the United States, et al (NO. CIV. S-05-17 LKK/DAD)
“The Pledge of Allegiance”, John W. Baer, http://oldtimeislands.org/pledge/pledge.htm

Lessons from El Mozote

January 12, 2010

note:  This was a “short paper” I wrote for a class, Latin American History, at Montana State University. It was a commentary on Mark Danner’s book, The Massacre at El Mozote, which proved to be a rude awakening to a devout American. I still believe in this great nation. I just can’t be as naive as I once was.

The term “dirty war” was originally used to describe the government-sponsored combat in Argentina against leftist guerrillas in the 1970s, which killed subversives by the tens of thousands. Mark Danner uses the same term in his book, The Massacre at El Mozote, to describe a similar operation in El Salvador in the early 1980s (Danner 25). In this book, Danner gives a glimpse into the methods used by the Salvadoran government, and by extension its army, to contain revolutionary change during the Cold War period. Those measures, made acceptable to them by the history of civilian support for the rebels, were military tactics such as the so-called “Hammer and Anvil”, a scorched-earth policy, and wholesale brutality (including the subsequent lies to cover it up). Although there is no evidence that US forces directly took part in the atrocities, the Atlacatl (Salvadoran army) leaders were largely trained, and their army funded, by the United States government with full knowledge of the actions they supported.

The Salvadoran army had faced guerrillas in the past. Rebel forces frequently lived among, did business with and even recruited from small towns in the mountains. In the late 1970s, radical priests had brought their congregations to believe in the leftist causes. Many of the youth would, at the guerrillas’ bidding, join the national army “in order to receive military training and gain firsthand knowledge of the enemy…” (Danner 30). They provided intelligence to the guerrillas and later left the army to join the leftist forces. “By 1980, small groups of young guerrillas were operating throughout northern Morazán, drawing food and support from sympathetic peasants, and launching raids from time to time against the National Guard posts in the towns.” (Danner 31) Of course, the fact that the people of El Mozote had stepped out of that mold, refusing to actively support either the military or the rebels, didn’t seem to matter to the government. The ERP (guerrilla forces) understood that the people of El Mozote only cooperated “at the lowest level” so as not to bring any harm on their town. A rebel called Licho said, “Sometimes they sold us things, yes, but they didn’t want anything to do with us.” (Danner 18-19)

“Hammer and Anvil” was a general term for any method of counter-guerrilla fighting that would “expel the guerrillas from the zone”. The intended effect was to get rid of the rebel-imposed Marxist-Leninist system by breaking “the support of the people they [the guerrillas] had indoctrinated.” This method was largely ineffective for several reasons. First, it required a large military force to hold the territory taken, and the army didn’t have enough troops or equipment. Second, there were disagreements among army personnel as to what to do with a town or its people. Civilians were often accused of being subversives and were killed (Danner 41). After a short time, the army would move out, the guerrillas would move back in, and any progress would be lost. The end result was only a few rebels killed, and the civilian support was not broken. (Danner 42)

Those whom the army had captured didn’t merit the words “rebel” or “guerrilla”. Instead, the army called them “delinquent terrorists” and labeled all civilians in the zone “masas”, or guerrilla supporters. That turned the civilians into legitimate targets for the army. The Salvadoran government adopted the stance: “If you’re not with me you’re against me. And if you’re against me, I have to destroy you.” (Danner 42)

Although El Mozote was known not to support the rebels, its people gave little more support to the army. The minimal courtesy they showed each force was by and large uniform, favoring neither one nor the other.

The “Hammer and Anvil” had already been tried out at El Rosario in 1980. Assuming that most of the townspeople were guerrilla supporters, the army pushed them down into the city center, where the plan was to annihilate them with artillery. The push (the Hammer) came by way of armed combat lasting two weeks. The Anvil portion was never fulfilled as intended. While about 40 civilians were killed directly by the soldiers, a far greater slaughter was averted because the officers couldn’t agree on who was enemy and who wasn’t. When the killing was over and survivors had escaped, the “scorched earth” phase began. With only a few people remaining in El Rosario, “the soldiers burned all the crops they could find.” (Danner 43-44) With nothing to sustain them, peasants from Morazán headed north to Honduras.

A year later, El Mozote would not be so fortunate.

The people of El Mozote trusted the army. In the past there had been no reason not to. Everyone knew they weren’t rebel supporters, so the Atlacatl (LTC Monterrosa’s elite, American-trained army) would have no argument with them. So, when army personnel told a well-respected member of the community, Marco Díaz, that people would be safer in their homes than fleeing into the mountains, he believed it. So did the rest of the small town. (Danner 17) After all, why would they lie?

Site of 1981 Massacre

El Mozote

The hammer worked perfectly. People from surrounding areas all gathered in the homes of family or friends in El Mozote, resting on the reassurances of the army and of Marco Díaz. The wholesale slaughter and depraved degradation that followed defies comprehension. (Danner 68-84) If the rapes, beheadings, hangings, impalings and other atrocities were policy, the only evidence in Danner’s book was the words of one captain who told his men, “What we did yesterday, what we’ve been doing on this operation, this is what war is, gentlemen.” (Danner 82)

Some may call it rogue, some may call it policy. This was the “dirty war”. The rest of El Salvador was not exempt. As shown towards the beginning of this book, many cities shared the “mutilated corpses” and “headless or faceless” bodies showing evidence of many of the same atrocities and some even worse. (Danner 25) The use of “death squads” was “organized by the Salvadoran Army officers…and the American Embassy was well aware of it.” (Danner 27)

Monterrosa’s claim that the guerrillas needed their masas (the wives, children and community support that followed the camp), and that in the fighting some of these women and children would be killed, smacked of excuse and alibi more than policy. (Danner 170) El Mozote was not a town of masas but of civilians without rebel connections. Once the massacre was discovered, these ruthless actions were followed by a steady flow of lies from the Salvadoran ambassador to the United States, Ernesto Gallont, who rejected “emphatically that the Army of El Salvador [killed] women and children.” He claimed, “It is not within the armed institution’s philosophy to act like that.” (Danner 183) Later, when faced with having to answer questions regarding the massacre, Defense Minister Garcia said, “I’ll deny it and prove it fabricated.” (Danner 202)

US involvement began long before El Mozote became an issue. The School of the Americas, located in Panama, was started by the US in the 1940s to train Latin American military officers "in psychological warfare, counterinsurgency, interrogation techniques, and infantry and commando tactics." According to a website sponsored by an organization that considers the SOA more of a "School of Assassins," ten of the twelve Salvadoran officers implicated in the El Mozote massacre were graduates of the SOA (Bourgeois).

The United States was in a difficult position at the time. With one staunchly communist nation sitting ninety miles off the Florida coast, President Reagan was loath to allow another any sort of foothold in Central or South America if he could possibly prevent it. Funding from the US would assure a democratic influence in El Salvador, or so Reagan thought. But given El Salvador's poor record on human rights, that funding would stop unless the country could show "a concerted and significant effort to comply with internationally recognized human rights." Reagan had just signed off on the certification of that effort, and the reports published in four newspapers could have derailed all of it.

Pressure from Amnesty International, the ACLU, the National Council of Churches, and other civil and human rights groups pushed Congress into opening an investigation into the events in Morazán. Despite the best efforts of Ambassador Hinton, LTC Monterrosa, and Assistant Secretary of State Enders, the horrific events were disclosed and the participants exposed. Enders defended the decision to continue funding for all he was worth, twisting the words of the panel, quoting "this Foreign Affairs Committee" without ever addressing the question posed by Mr. Solarz, and otherwise avoiding divulging incriminating information that could overturn the certification (Danner 201-221).

The strong-arm tactics used in El Salvador, and the continued funding and training by the United States, showed a willingness to place the value of human life far below political ambition and perceived political and ideological threats. The use of death squads and "scorched earth" tactics, followed by deception, denial, and fraud to cover them up, speaks well of neither nation.

Works Cited:

Danner, Mark. The Massacre at El Mozote. New York: Vintage-Random, 1994. Print.

Bourgeois, Roy. "The School of the Americas." Third World Traveler. SOA Watch, 8 May 2002. Web. 28 Nov. 2009.