"No religious test shall ever be required as a qualification to any office or public trust under the United States of America." -- U.S. Constitution, Article VI, Paragraph 3.
The concept of religious freedom was so important to the Founders of the American Republic that it was written into the main body of the Constitution itself, and later strengthened by the First Amendment, which reads, "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof."
The reason: under British rule, anyone who wanted to sit in Parliament, or to attend the major universities in England, had to be a member of the Anglican Communion, also known as the Church of England -- the Established Church, the official denomination recognized by the government.
Therefore, since there were -- and are -- so many religious groups in America that wanted to follow their spiritual preferences as they wished, without pressure from government, this principle was formalized and written into the Constitution to guarantee a separation of church and state.
Unfortunately, many have forgotten this, especially politicians and those True Believers who are convinced that they and they alone have the Right Way and all others are wrong. (Note: Orthodox comes from Greek roots meaning "right belief.")
As for the phrase "under God" in the Pledge of Allegiance, it's important to remember that the Pledge dates back only to 1892, and that the phrase was added in 1954, after a long campaign led by the Knights of Columbus, a Catholic fraternal organization often compared to the Masons.
Nowhere in the Constitution is there any reference to God or Christianity, and the few religious references in the Declaration of Independence -- "Nature's God," a "Creator" -- are deist phrases, not Christian ones. Even in the oath (or affirmation) to be taken by a new President, the phrase "so help me God" is not specified in the Constitution; tradition says it was improvised by George Washington when he took office, and it has been customary ever since.
(Note: Even that story may be wishful thinking. There is evidence that it didn't happen, and that the anecdote first appeared in a biography of Washington published in 1854.)
So as America becomes more diverse, it's increasingly important that everyone remember the principle of keeping church and state separate.
Your right to walk whatever spiritual path you choose remains guaranteed. But you do not have the right to impose your views on others and force them to walk the same path.
Tuesday, March 24, 2015
Religion and Politics
"The Government of the United States is not, in any sense, founded on the Christian religion." -- Treaty of Tripoli, 1797, signed by President John Adams and approved unanimously by Congress.
"America is a Christian nation," goes the refrain, taken up again by the Radical Righteous as they work to impose their views on everyone else.
Except for the 9 million Jews, 2.7 million Muslims, 1.4 million Buddhists, 586,000 Hindus, 582,000 Native Americans, 186,000 Sikhs, 340,000 Wiccans, 342,000 Pagans, 35 million Atheists and Agnostics, 31 million who say they are Non-Religious, and the 12 million who refused to reply to the question posed by the U.S. Census Bureau. Not to mention the millions of others who are Zoroastrian, Confucian, Shinto, Tao, Baha'i, or the odd Druid here and there.
The "Christian nation" chant is heard most often from ultra-conservative, evangelistic groups. But they are a distinct minority among the more than 30 major Christian denominations listed in the Census report. Of the total adult population of 228 million, self-described Christians total 173.4 million, or some 54 percent -- just above half. And the largest denomination is Roman Catholic, with 57.2 million members, followed by Baptists, with 36.1 million. Moreover, this lists only those who say they have a religious affiliation, not whether they actively participate.
The survey dealt with adults. The total population of the U.S. is now about 316 million.
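For the number-minded, here's a quick back-of-the-envelope check of those shares -- a minimal sketch in Python, using the figures cited above rather than any fresh data:

```python
# Sanity-check the religious-affiliation shares using the post's figures.
adults_millions = 228.0        # adult population covered by the survey
total_pop_millions = 316.0     # approximate total U.S. population
christians_millions = 173.4    # self-described Christian adults

print(f"Share of adults: {christians_millions / adults_millions:.0%}")               # ~76%
print(f"Share of total population: {christians_millions / total_pop_millions:.0%}")  # ~55%
```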
One recent survey showed that more than half of Republicans polled said Christianity should be made the official religion of the U.S., and another found that "55 percent of Americans believed it already was," according to a column in the March 14 issue of the New York Times.
The column, by Kevin Kruse, a professor of history at Princeton, points out that this belief is relatively recent in America, going back only to Depression-era efforts by corporations to fight off federal regulations.
It's a continuation of the earlier attitude of corporate executives who relied on their religious beliefs to justify their efforts to defeat labor union activists. To quote George Frederick Baer, president of the Philadelphia and Reading Railroad, in 1902: "The rights and interests of the laboring man will be protected and cared for -- not by the labor agitators, but by the Christian men to whom God in His infinite wisdom has given control of the property interests of the country."
It's important to remember that Christianity is mentioned nowhere in the Constitution or in the Declaration of Independence. Nor does the Constitution mention God; the Declaration's "Nature's God" and "Creator" are deist phrases, not Christian ones. And any rights listed in the founding documents are not granted by government. Rather, the documents guarantee rights we already have, and Congress shall make no law abridging them.
The separation of church and state was so important to the founders because they had been victims of the imperial tradition that closed many positions to anyone who was not a member of the state-sponsored, established church.
There is no established church in America, and to claim that there is, or should be, is to show an ignorance of what America is all about: freedom of religion, as specified in the First Amendment, and freedom from religion, as specified in Article VI of the Constitution itself.
Tuesday, March 17, 2015
Reaganomics Redux
New label + new voice = new concept. Not.
If you make it, they will buy.
If it sounds too good to be true, it probably is.
Supply-side economics, also known as trickle-down economics or Reaganomics, is back, this time being touted by a new set of conservatives under a new name -- dynamic scoring.
According to the "new" concept justifying tax cuts for the wealthy, supposedly with the result that the more cash investors and firms have to play with, the more stuff will be made and sold, and the increased tax revenue from increased sales will more than offset the tax reductions. Now, with a moving dynamic of scoring the range of possible results from changes in government fiscal policy, backers of the "new" strategy can pick and choose the numbers that best fit their preconceived notions of what they want to achieve.
In its Reagan-era incarnation, the theory claimed that the benefits of tax cuts for the wealthy would encourage them to invest the extra cash in producing more stuff, thus resulting in more jobs and more sales and increased economic activity and more tax revenue for the government.
Sounds great, if in fact that happens. Except that it didn't then, and it won't now. Reducing taxes on the wealthy only gives them more cash to stash in private accounts.
A supporting pillar of Reaganomics was the Laffer Curve, named after economist Arthur Laffer. His theory was that high tax rates interfere with production, so cutting taxes would boost production enough to more than make up for the revenue lost to the cuts.
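To make the claim concrete, here is a toy version of the curve -- a textbook simplification, not Laffer's own formula. Revenue is zero at a 0 percent rate (no tax collected) and at a 100 percent rate (nothing left worth producing), with a peak somewhere in between; where that peak actually sits is the empirical question the theory glosses over:

```python
# Toy Laffer curve: revenue = rate * tax_base, where the taxable base
# shrinks as the rate rises. R(t) = t * (1 - t) is the simplest shape
# with zero revenue at both t = 0 and t = 1.

def toy_revenue(rate: float) -> float:
    """Stylized tax revenue as a fraction of potential output (0 <= rate <= 1)."""
    return rate * (1.0 - rate)

for rate in (0.0, 0.25, 0.50, 0.75, 1.0):
    print(f"rate {rate:>4.0%}  ->  revenue {toy_revenue(rate):.4f}")
# In this toy the peak sits at 50 percent; cutting a rate that is already
# below the peak simply reduces revenue.
```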
Except that during the Reagan years, the benefits did not trickle down to the rest of the population, and the national debt tripled.
It is to laff.
Even George H.W. Bush, campaigning against Reagan in 1980, called it "voodoo economics," and the supply-side or trickle-down theory has little backing among mainstream economists. It does, however, have strong support from the Radical Righteous and their corporate election campaign donors.
With "dynamic scoring" of the potential benefits of changes in fiscal policy, backers develop a range of possibilities that any changes would bring about, and then select the option that best suits their goals. It would be better to use the option most likely to reach reality, and adjust goals accordingly.
But no. The historical reality that supply-side economics doesn't work has no effect on their thinking and planning. They remain afflicted by the Riley Syndrome: "My head's made up. You can't confuse me with the facts."
Or maybe, the resident cynic said, they do know, but don't care. Their self-aggrandizing strategy dictates that they develop a new name for the same concept, and market it through new voices, so the public is led to believe it's a new concept.
Sunday, March 15, 2015
Rule vs Regulation
Linguistically, all dialects are equal. The reason some dialects have more prestige than others is a social judgment, not a linguistic one.
"When I use a word, it means just what I choose it to mean -- neither more nor less." -- Humpty Dumpty
A grammatical rule is a description of what users of a language or dialect actually do, not a prescription of what they must do. Unless, of course, they want to be perceived as part of a group with more social prestige. Then they should follow the guidelines of what others in the target group do.
But if they don't care, it doesn't matter. Even so, people -- being the judgmental sort that they are -- will conclude that the writer or speaker belongs to a less prestigious group, and will treat him or her accordingly. That, however, is a social judgment, not a linguistic one.
Conversely, if a member of a group adopts the speech patterns of a group that happens to have more prestige, he or she runs the risk of being called a snob.
In some contexts, rules and regulations can be mandates, such as in law and in military codes of conduct. But in others, such as in language usage and style, a grammatical rule is merely a guideline to what many experienced writers and speakers regularly do. These guidelines are offered as a way of facilitating communication among users of the same dialect.
In a way, it's another code of conduct -- if you want to be accepted by others in the group, you follow the patterns of behavior and speech adopted by the majority of the group. That's part of the reason teenagers have their own jargon, a set of terms that only they use. And once their parents pick up and start using those terms, teens abandon them and invent new jargon. If nothing else, it's a way of showing their independence.
All that being said, standards in grammar and spelling are important because they ensure accurate communication.
Saturday, March 14, 2015
Language Logic
Logic uses language, but language is not logical
Copy editors are the standard-keepers of grammar, spelling, punctuation and usage, and style books are the bibles for writers.
However, just as biblical interpretations change among those who study sacred scriptures, so also do the rules set down in writers' style manuals. The latest example is a report that the Associated Press has changed its stance on the difference between "over" and "more than." After decades of cautioning editors and writers that the two differ -- as in "The plane flew over the field of more than ten acres" -- the AP has apparently surrendered to common usage.
Is this a terrible thing? It is to some. To others, however, it is simply another sign of language development and change. What is unacceptable for one generation becomes commonplace in the next; to insist that there can be only one set of rules, permanent and unalterable, is unrealistic at best, and trying to resist the change is folly.
Change is inevitable. Meanwhile, there are standards, even as those standards change over time.
Time was, a preposition was something you should never end a sentence with. And collective nouns always took a singular verb. Now, a collective noun such as team or crew can take a singular or plural verb depending on whether the members are acting as an organized group or as a bunch of rowdies. This has long been the practice across the pond, but only recently has become routine in America.
As for style books and the sanctity that some would attribute to them, remember that there are as many style books as there are editors. The Associated Press Stylebook is perhaps the most widely used among journalists, but there are many other manuals advocating various standards for writers in dozens -- nay, hundreds -- of formats, both print and electronic.
For example, which form should be used when dealing with percentages? Should a writer use percent as one word or two: per cent? Or should the standard be the abbreviation pct (with or without a period)? Or would the symbol % be best?
All are correct, and the only standard in deciding which to use is the chief editor's preference. Moreover, the guiding principle for all manuals is to establish consistency of usage.
Nonetheless, there are some practices writers should follow, since their goal is to communicate an idea or fact. Mixing a fraction and a percentage in the same sentence -- "one-third agreed, while 45 percent disagreed" -- spoils communication by forcing the reader to stop and figure out how the two relate. Granted, readers certainly are able to do so, but good writers don't force them to.
Another hint: When you can't decide whether to use a masculine or a feminine pronoun, the better choice is neither. Recast to the plural, as in the previous sentence.
Saturday, March 7, 2015
Hillarygate
Editors too often demand sizzle even when there is no steak.
The latest rant making the media rounds is the revelation that Hillary Clinton used a personal email account when she was Secretary of State, rather than keeping her official correspondence on a government system. The New York Times broke the story, which immediately set off a firestorm of criticism of the former government official and likely presidential candidate.
It makes for a sizzling story and a lively debate among political types about rule-breaking. However, it seems that no such rule was in place until a year and a half after she left the State Department. So what rule did she break?
That has not stopped political opponents from jumping up and down for headlines, slamming Clinton for the error of her ways, even though they do similar things themselves.
From this corner comes a question: Is the story being overplayed, emphasizing the sizzle when there is very little steak?
And our resident cynic, Dinty Ramble, wonders who tipped off the New York Times to the story. Was it a political opponent, eager to sully her reputation and damage her credibility as a candidate? The revelation is certainly a worthwhile news story, but ignoring the point that there was no rule to break emphasizes one part of the story -- the sizzle -- and ignores what little steak there was.
It's also possible that the NYT wanted to show some balance in its coverage by beating up on a Democrat in addition to printing negative stories about Republicans. And the TV pundits lapped up the story as something new to yammer about until the next sizzler comes along. But that's the cynic speaking again.
In any case, it reminded us of the worldwide flap over the Millennium Bug, the story that warned incessantly that every computer in the world would shut down at the stroke of midnight when the year 1999 ended and the year 2000 began. The warning was based on the fact that early computer systems stored years as just two digits rather than four. Therefore, when the calendar year ending in 99 rolled over to 00, the entire system would supposedly collapse in confusion.
It didn't happen. Reason: Computer programmers, especially those in the finance industry dealing with bond issues and mortgages with a 30-year term, noted decades ahead of time that a two-digit year code would be a problem. So they fixed it.
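The bug itself is easy to demonstrate. Here is a minimal sketch of the failure mode (illustrative only, not any particular system's code):

```python
# The Y2K failure in miniature: with years stored as two digits,
# any span that crosses the 1999 -> 2000 boundary goes negative.

def years_elapsed_2digit(start_yy: int, end_yy: int) -> int:
    return end_yy - start_yy

# A 30-year mortgage written in 1975 ('75), evaluated in 2005 ('05):
print(years_elapsed_2digit(75, 5))   # -70 -- nonsense

# The fix the programmers applied, in effect: carry the full four-digit year.
def years_elapsed(start_year: int, end_year: int) -> int:
    return end_year - start_year

print(years_elapsed(1975, 2005))     # 30, as intended
```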
That didn't stop some folks, however, from marketing "special computer programs" to fix a problem that didn't really exist, and fanning the flames of fearmongering as part of their marketing plan.
And the media mavens saw the light of a blazing story with lots of sizzle, warning that the sky would fall and computers worldwide would crash at the stroke of midnight.
Nobody bothered to ask whether that would be Eastern Standard Time, Greenwich Mean Time, or any of the 22 other time zones around the planet.