
Tuesday, June 30, 2015

The Best and The Brightest?

Texas senator Ted Cruz graduated with the highest honors from Princeton and Harvard Law. Doctor and presidential candidate Ben Carson was a leading neurological surgeon at Johns Hopkins after receiving his medical degree from U. of Michigan. Louisiana governor Bobby Jindal graduated from Brown and was a Rhodes scholar at Oxford.

These three brilliant men are completely ignorant about almost every important social issue: gay marriage and other LGBTQ rights; health care; evolution; a woman’s right to choose. None of them acknowledge the scientific evidence of man-made climate change. They cling steadfastly to the wrong side of almost every significant issue of our time.  

While these facts may show the defects of our educational system, they can certainly be explained by something else these men have in common. They all claim to be guided in their beliefs by their strong Christian faith. (You can add other presidential candidates to this category: former Pennsylvania senator Rick Santorum, former Arkansas governor Mike Huckabee and former Texas governor Rick Perry, although their college careers didn’t match the first group’s for documented brilliance.)

In our country, religion has had a spotty history vis-à-vis progress. Our founding fathers were “deists”; that is, they shared the Enlightenment Era’s attitude toward religious belief. They were skeptical of the power of churches, but asserted a general belief in God. Before accepting the new Constitution they insisted on an amendment to protect the right to pray freely, but also included the assurance that no particular faith could be established as the preferred religion of the government.

Ever since then, many people have looked to their faith for guidance on the most critical and long-lasting divisive issue we have ever faced: race. During the slavery debate, the Bible was cited by both sides for support and provided emotional succor to the most adamant positions for and against slavery. Abolitionists preached from northern pulpits, and many churches provided sanctuary on the Underground Railroad. But in the South and elsewhere, preachers in other pulpits were equally passionate about the righteousness of their cause and equally certain that God and His Bible approved of slavery. Lincoln observed that both sides could not be right, and argued that a God who disfavored freedom was not worthy of belief.

The Catholic Church has historically been among the most vocally conservative on all of these issues. This church calls itself “catholic,” but that is contrary to the word with a “small c,” which is defined as “including a wide variety of things; all-embracing.” Synonyms include: universal, diverse, broad-based, eclectic, liberal, comprehensive, all-embracing, all-inclusive. This is the opposite of the policies of the Roman Catholic Church.

The same is true of most “organized churches,” not just Christian ones. Any religion can be used to justify intolerance and the antonyms of “catholic” values.

The Constitution also created three branches of government, and although the branches were “co-equal,” the framers intended a division of power: the Legislature passes the laws, the Judiciary ensures that the laws conform to the Constitution, and the Executive enforces the laws that pass muster.

None of those brilliant students who are now running for president seems to understand the fundamental rules that are at the foundation of our freedom. This is odd, considering that they claim to be adherents of “fundamentalist” religions. The First Amendment referred to above also requires toleration of non-religious beliefs, i.e., atheism. This resulted in the Supreme Court’s ruling prohibiting organized prayer in public schools, another issue these brilliant candidates complain of in their ignorance.

Harvard Law Prof. Alan Dershowitz called Ted Cruz one of his most brilliant students. Wow! Cruz objects to a majority of SCOTUS justices interpreting the Affordable Care Act, and to the finding of "Equal Protection" for same-sex marriage. And he wants to revoke lifetime appointments. Amazingly, the scholar Cruz didn't seem to find fault when the Court ruled in Citizens United or the Second Amendment cases.

These geniuses also mislabel themselves as “conservative.” The dictionary definition of this word is “a person who is averse to change and holds to traditional values and attitudes.” This apparently doesn’t apply to our founders’ wary attitude toward religion or their enlightened reliance on science over faith. 

Those other brilliant men, the founders, were also flawed in their inability to live up to their asserted value that “all men are created equal,” much less to conceive that equality must apply to women and people of other races. Still, they were far more enlightened than the best and the brightest that the Republican Party can produce in 2015.

Friday, June 26, 2015

SYMBOLS



In response to the current effort to remove the Confederate battle flag (often called the Stars and Bars, though that name properly belongs to the Confederacy’s first national flag) from public spaces, an elderly man complained that to him it symbolized his ancestors’ heroism in the Civil War. “’You’re asking me to agree that my great-grandparent and great-great-grandparents were monsters,’ said Greg Stewart, a member of the Sons of Confederate Veterans and the executive director of Beauvoir, the last home of Jefferson Davis.” (NY Times, Dig. Ed. 6/24/2015.)

At first glance, one might sympathize with Mr. Stewart. I would not want to be deprived of the right to honor my ancestors who fought for this country in the world wars. But first glances aren’t enough. There is a difference. Mr. Stewart’s forebears fought AGAINST this country. They fought for their state’s right to deprive human beings of their freedom.

This was and is one of the worst causes ever fought for. It deserved to be lost, and those who supported it, fighting and killing other Americans on its behalf, should be viewed as traitors rather than heroes. They were wrong to fight for it. Bravery on behalf of an evil cause is not something that society should honor.

Mr. Stewart and others may, if they choose, continue to revere their ancestors who died as soldiers, just as the Japanese and Germans may honor theirs. But that is different from honoring the causes they fought for. Germany bans the swastika and other Nazi symbols for good reasons, not all of them merely politically correct policy to prevent embarrassing reminders of a national nightmare. There are ethical and moral reasons for the decision: a collective sense of guilt and shame. The ban reflects a raised level of sensitivity that bespeaks progress. Germans should take more pride in this cultural shift than in all of Wagner’s compositions.

The Confederate battle flag attained public exposure as a symbol in the 1950s, when the Civil Rights movement gained traction in the South. It was used by the KKK, the White Citizens’ Councils, and then by every redneck thug as a battle flag to show violent opposition to any advancement of equality. It thus became a vivid symbol of hatred and violence.

Some argue that this is trivial: it is only a symbol that can mean different things to people. Civil War memorabilia, carried into battle by brave young men who fought and died with courage. Certainly, a flag can represent such things. It is only a symbol and as someone said, the “deranged young man” was the killer, not the flag. Yes, and guns don’t kill without people shooting them. I get it.    

Symbols are a source of contention because they are important. A lawn jockey. The Frito Bandito. The swastika. How do good Christians feel about the pentagram and 666? These and many other symbols are offensive, not only because they hurt particular groups, but also because there is a social consensus that they are, at the least, in such bad taste that they should be forbidden.

But what about the First Amendment?

The issue of free speech as defined by the Supreme Court is a separate issue. Flags and other symbols certainly are a form of expression (as is flag burning). No law prohibits private individuals from the poor taste of possessing or exhibiting these things that have become symbols of hate to others. Removing the symbols from places that are supported by public funds or on public property is a different issue.

Democracy is rule by the majority BUT with respect and tolerance for the minority and sensitivity toward the legitimate concerns of all. This distinction is the essential ingredient that separates our society from those in which the majority (or dominant plurality) tramples the rights of minorities. ISIS, for instance, crushes all opposing religions and destroys all symbols of others, past or present. We perceive these acts as barbaric, but they are consistent with the fundamentalist notion of heresy – and fanatics aren’t the only ones who destroy symbols of enemies as soon as they get the chance. The Romans destroyed the Temple in Jerusalem; Goths and then Christians toppled Roman statues; Muslims converted Christian churches into mosques and destroyed icons; Hindus in India destroyed mosques . . .

Islam prohibits its adherents from worshiping idols and forbids images of the Prophet. The evil is not their faith; it is the imposition of their belief on others and their intolerance of the beliefs of others. They destroy a Buddhist statue to prove their power, not their righteousness. But it is very bad judgment for non-believers to intentionally provoke violence by offending believers of any faith. We protect the right of individuals to do so because of our faith that freedom of speech is an essential part of democracy. BUT we do not permit our government or its officers to intentionally insult religions or religious people. (The First Amendment also promises that no laws shall either impinge on the free exercise of religion or create an official state religion.)

To many white southerners, the Confederate battle flag is a relic of “The Lost Cause.” Southern white culture ever since 1865 has mythologized the famous heroes of the Confederacy like Robert E. Lee, Stonewall Jackson, and Nathan Bedford Forrest, and the brave soldiers in butternut and grey who fought for these leaders and died under that flag. Lincoln (who was and still is hated by many owners of that flag) urged mercy for defeated rebels and refused calls for their trial and punishment as traitors. He ordered Grant to provide lenient surrender terms to Lee’s tattered army. As soon as they swore allegiance to the American flag, they regained their citizenship, which included the right to vote.

For many years after the war, Southern writers, artists and politicians perpetuated the romantic myth of the unjust defeat: powerful dramas such as Birth Of A Nation and Gone With The Wind told the romantic (and dishonest) tale, glorifying the supposed gentility of the antebellum South and the nobility of the war to preserve a comfortable and civilized way of life, and perpetuating the belief that corrupt and greedy northern carpetbaggers conspired with ignorant and greedy freed Negroes to force Reconstruction on the downtrodden white population.

It is safe to say that most people no longer argue the righteousness of the rebel cause on the issue of slavery. But the South still clings to its sectional prerogative, the residue of “states’ rights.” As one southerner said, though, the believers in that faith are a dwindling breed. The New South is integrated, multi-cultural, and far more cosmopolitan, at least in urban areas. African Americans and Hispanics, as well as other non-“nativist” whites, permeate the power structure, and this trend is inexorable. Holdouts in rural areas and small towns, where poor whites still subsist and resent “foreigners,” are dying out; their children are moving to the cities for education at integrated colleges or for jobs at integrated Walmarts, hospitals, and Toyota plants.

My generation was torn in half by the major post-World War II issues: civil rights for African Americans and other ethnic groups, for women and children, for gays, and for the disabled; and the Cold War, including Vietnam and other wars. The opponents of progress lost the arguments in all of these societal debates. Now our generation is dwindling in size and power. In the 2008 presidential election, the most vivid images were of Obama, the son of a liberated white mother and an African father, opposed by a man who was a hero of the Vietnam War. McCain’s rallies were attended by elderly white males and their wives, all bemoaning the fact that they were the losers in the social debate of the last forty years.


Despite all the sins committed by my generation, the heritage that is most important for the immediate future may be the suppression (if not defeat) of racism.

Monday, June 01, 2015

WHEN A MOVIE IS NOT JUST A MOVIE

A recent FRONTLINE special titled "Secrets, Politics and Torture" (broadcast on PBS, May 15, 2015) observed that the CIA had used the movie "Zero Dark Thirty" to create the impression that its program of "enhanced interrogation techniques" had succeeded in providing the crucial information that led to the location of Khalid Sheikh Mohammed (KSM) and eventually to Osama bin Laden.

The Frontline documentary shows, through analysis of the secret papers Congress forced from the files of the CIA, that this was a distortion of the truth.

Senator Feinstein and others quoted in the Frontline program expressed disgust with a movie that purported to be based on the truth but in fact was a willing tool for disseminating self-serving propaganda by the CIA and its supporters within the Bush administration, who were desperate to persuade the public that torture was needed for national security.

History is replete with examples of movies that have been used as tools to sell a particular point of view. Movies are just movies, we know that. We should not expect a form that seeks to entertain to hew to the "literal truth," especially when it would interfere with the mission - to amuse, engross, and make profits. But when moviemakers choose a subject and take a stance on one side of a controversial issue - whether it relates to politics, institutions, or history - there is a greater responsibility.

Some argue that it doesn't matter because no one takes movies seriously as "truth." By this view, people don't rely on movies to teach them anything. 

I disagree with this. The evidence of movie history, going as far back as "Birth Of A Nation," proves the point.

Now, there is support in scholarly studies.    
This New York Times article is worth reading.

NEW YORK TIMES . . . Why Movie ‘Facts’ Prevail

FEB. 13, 2015

By Jeffrey M. Zacks

THIS year’s Oscar nominees for best picture include four films based on true stories: “American Sniper” (about the sharpshooter Chris Kyle), “The Imitation Game” (about the British mathematician Alan Turing), “Selma” (about the passage of the Voting Rights Act in 1965) and “The Theory of Everything” (about the physicist Stephen Hawking).

Each film has been criticized for factual inaccuracy. Doesn’t “Selma” ignore Lyndon B. Johnson’s dedication to black voting rights? Doesn’t “The Imitation Game” misrepresent the nature of Turing’s work, just as “The Theory of Everything” does Mr. Hawking’s? Doesn’t “American Sniper” sanitize the military conflicts it purports to depict?

You might think: Does it really matter? Can’t we keep the film world separate from the real world?

Unfortunately, the answer is no. Studies show that if you watch a film — even one concerning historical events about which you are informed — your beliefs may be reshaped by “facts” that are not factual.

In one study, published in the journal Psychological Science in 2009, a team of researchers had college students read historical essays and then watch clips from historical movies containing information that was inaccurate and inconsistent with the essays. Despite being warned that the movies might contain factual distortions, the students produced about a third of the fake facts from the movies on a subsequent test.

In another study, published in Applied Cognitive Psychology in 2012, another team of researchers, repeating the experiment, tried to eliminate the “misinformation effect” by explicitly asking the students to monitor the clips for inaccuracies. It didn’t work. If anything, the students were more prone to accept the untruths. The more engaged the students were by the clips, the more their memories were contaminated.

Why do we have such a hard time sorting film “facts” from real facts? One suggestion is that our minds are well equipped to remember things that we see or hear — but not to remember the source of those memories.

Consider the following evolutionary story. As our distant ancestors became increasingly able to communicate facts to one another via language, and to store them in their memories, this helped them survive. If a hunter on the savanna approached a watering hole, being able to remember that there had been a lion attack at that hole could be a lifesaver. But retrieving the source of the memory (did my cousin tell me about it? or was it my brother?) was less critical. As a result, our brain’s systems for source memory are not robust and are prone to failure.

This story is speculative, but it is consistent with what we know about source memory. Cognitively, source memory develops relatively late in children; and neurologically, it depends selectively on the prefrontal cortex, a region of the brain that is also late to mature.

Source memory is also fragile, highly sensitive to the vicissitudes of aging, injury and disease. Patients with damage to the prefrontal cortex exhibit failures of source memory that exaggerate the everyday errors we all make. One research subject who had suffered a lesion to his prefrontal cortex believed that a particular building in his vicinity was being used for sinister purposes; only later did it emerge that his paranoid interpretation of the building was a result of a spy film that he had seen some 40 years before.

In a 1997 study, patients with similar lesions were presented with lists of words or sentences read by two different experimenters, one male and one female. The patients were pretty good at recognizing whether they had heard a sentence or not, but they were significantly impaired at identifying which speaker had read it. It’s important to note, however, that this task was fairly difficult for healthy control subjects as well. None of us are immune from the perils of source memory.

The weakness of source memory leaves us, to some degree, at the mercy of inaccurate films. Is there anything we can do?

The research described above did reveal one technique that helps: Having the misinformation explicitly pointed out and corrected at the time it was encountered substantially reduced its influence. But actually implementing this strategy — creating fact-checking commentary tracks that play during movies? always bringing a historian to the theater with you? — could be a challenge.

Jeffrey M. Zacks, a professor of psychology and radiology at Washington University in St. Louis, is the author of “Flicker: Your Brain on Movies.”