
W.I.L. Offshore News Digest :: May 2009, Part 4

This Week’s Entries:

JAMAICA: FROM COLUMBUS TO INDEPENDENCE

The Caribbean’s post-European-encounter history was drenched in blood for several hundred years. The advent of the era of U.S. predominance in North America effectively ended the European powers’ killing and plundering of each other in the Caribbean theater, but misery in the form of slavery was only eradicated later. In Haiti, notably, misery persists -- due to a government with steroid-enhanced criminal tendencies.

The history of Jamaica is emblematic of that of the typical Caribbean island through the European colonial period and thereafter. This brief introduction serves as a good background piece.

Jamaica has a rich and vibrant history, one that has inspired it to move forward as a nation. That history speaks to experiences of hardship and prosperity, and to the growth and determination of a people. Understanding a little more about the history of Jamaica can help others appreciate its beauty, culture, and unique spirit. This island’s history has, however, been complicated, with many troubles since Columbus’s arrival.

Original Inhabitants

The original inhabitants of Jamaica are believed to be the Arawaks, also called Tainos. They came from South America 2,500 years ago and named the island Xaymaca, which meant “land of wood and water.” The Arawaks were a mild and simple people by nature. Physically, they were light brown in color, short and well-shaped with coarse, black hair. Their faces were broad and their noses flat. The Arawaks led quiet and peaceful lives until they were destroyed by the Spaniards some years after Christopher Columbus discovered the island in 1494.

The Discovery of Jamaica

On May 5, 1494, Christopher Columbus, the European explorer who sailed west seeking the East Indies and instead came upon the region now called the West Indies, landed in Jamaica. This occurred on his second voyage to the West Indies. Columbus had heard about Jamaica, then called Xaymaca, from the Cubans, who described it as “the land of blessed gold.” Columbus was soon to find out that there was no gold in Jamaica.

On arrival at St. Ann's Bay, Columbus found the Arawak Indians inhabiting the island. Initially, Columbus thought these Indians were hostile, as they attacked his men when they tried to land on the island. As he was determined to annex the island in the name of the king and queen of Spain, he was not deterred. Columbus also needed wood and water and a chance to repair his vessels.

He sailed down the coast and docked at Discovery Bay. The Arawaks there were also hostile to the Spaniards. Their attitudes changed however, when they were attacked by a dog from one of the Spanish ships and Columbus's cross-bow men. Some of the Arawaks were killed and wounded in this attack. Columbus was then able to land and claim the island.

The Spaniards, when they came, tortured and killed the Arawaks to get their land. The Arawaks were so overworked and ill-treated that within a short time they had all died. The process was aided by the introduction of European diseases to which the Arawaks had little or no resistance. The island remained poor under Spanish rule as few Spaniards settled here. Jamaica served mainly as a supply base: food, men, arms and horses were shipped here to help in conquering the American mainland.

In 1509, fifteen years after Columbus’s first visit to the island, the first Spanish colonists came here under the Spanish governor Juan de Esquivel. They first settled in the St. Ann's Bay area. The first town was called New Seville or Sevilla la Nueva.

Towns were little more than settlements. The only town that was developed was Spanish Town, the old capital of Jamaica, then called St. Jago de la Vega. It was the center of government and trade and had many churches and convents.

The little attention the colony received from Spain soon became a major source of internal strife. This contributed to the weakening of the colony in the last years of Spanish occupation. The governors were not getting proper support from home, and quarrels with church authorities undermined their control. Frequent attacks by pirates also contributed to the colony's woes.

The English Attack

On May 10, 1655, Admiral William Penn and General Robert Venables led a successful attack on Jamaica. The Spaniards surrendered to the English, freed their slaves and then fled to Cuba. It was this set of freed slaves and their descendants who became known as the Maroons.

The early period of English settlement in Jamaica drew much attention to the buccaneers based at Port Royal. Buccaneering had begun on the islands of Tortuga and Hispaniola. The buccaneers were a wild, rough and ruthless set of sea rovers who took their loot of gold, silver and jewels to Port Royal.

Port Royal prior to this time was an insignificant town in Jamaica. Under the buccaneers' leadership the town, within a decade and a half, grew to become known as the “wealthiest and wickedest city in the world.”

The greatest buccaneer captain of all was Henry Morgan. He started out as a pirate and later became a privateer. Morgan mercilessly raided Spanish fleets and colonies. He kept the Spaniards so busy defending their coasts that they had little time to attack Jamaica. Morgan was knighted by King Charles II of England and was appointed lieutenant governor of Jamaica in 1673. Morgan died in 1688. A violent earthquake destroyed Port Royal on June 7, 1692. The survivors of the earthquake abandoned the port and re-settled in Kingston. Port Royal became an important naval base in the 18th century.

The Slave Trade

The English settlers concerned themselves with growing crops that could easily be sold in England. Tobacco, indigo and cocoa soon gave way to sugar which became the main crop for the island. The sugar industry grew so rapidly that the 57 sugar estates in the island in 1673 grew to nearly 430 by 1739.

Enslaved Africans filled the large labor force required for the industry. The colonists were impressed with the performance and endurance of the Africans, as well as the fact that African labor was cheaper and more promising. They continued to ship Africans to the West Indies to be sold to planters who forced them to work on sugar plantations.

The slave trade became a popular and profitable venture for the colonists. In fact the transportation of slaves became such a regular affair that the journey from Africa to the West Indies became known as the “Middle Passage.” The voyage was so named because the journey of a British slaver had three legs, starting from England with trade goods that were exchanged in Africa for slaves.

Afterwards, the journey continued to the West Indies where the slaves were landed and sugar, rum and molasses taken aboard for the final leg of the journey back to England. The slaves, however, were unhappy with their status, so they rebelled whenever they could. Many of them were successful in running away from the plantations and joining the Maroons in the almost inaccessible mountains.

Several slave rebellions stand out in Jamaica's history, for example, the Easter Rebellion of 1760 led by Tacky; and the Christmas Rebellion of 1831 which began on the Kensington Estate in St. James, led by Sam Sharpe. He has since been named a National Hero.

The Maroons also had several wars against the English. In 1739 and 1740 after two major Maroon Wars, treaties were signed with the British. In the treaty of 1740, they were given land and rights as free men. In return they were to stop fighting and help to recapture run-away slaves. This treaty resulted in a rift among the Maroons as they did not all agree that they should return run-away slaves to the plantations.

The frequent slave rebellions in the Caribbean were one factor that led to the abolition of the slave trade and slavery. Other factors included the work of humanitarians who were concerned about the slaves' well-being. Humanitarian groups such as the Quakers publicly protested against slavery and the slave trade. They formed an anti-slavery committee which was joined by supporters such as Granville Sharp, James Ramsay, Thomas Clarkson and later on, William Wilberforce.

On January 1, 1808 the Abolition Bill was passed. Trading in African slaves was declared to be “utterly abolished, prohibited and declared to be unlawful.” Emancipation and apprenticeship came into effect in 1834 and full freedom was granted in 1838.

The immediate post slavery days were very difficult for the poorer classes. Though most of the English planters had left the islands and new owners were running the plantations, the old oligarchic system still remained. The will of the masses was not deemed important and hence ignored.

To add fuel to the already burning flame, the American Civil War resulted in supplies being cut off from the island. A severe drought was also in progress and most crops were ruined. The succeeding years saw the island's recovery and development -- social, constitutional and economic, and its evolution into a sovereign state.

Education, health, and social services were greatly improved. A proper island-wide savings bank system was organized. Roads, bridges and railways (railways became government owned in 1845) were built and cable communication with Europe established (1859). The island's capital was moved from Spanish Town to Kingston (1872).

The 1930s saw Jamaica heading towards another crisis. The contributing factors were discontent at the slow pace of political advance, the distress caused by a world-wide economic depression, the ruin of the banana industry by Panama disease, falling sugar prices, growing unemployment aggravated by the curtailment of migration opportunities, and a steeply rising population growth rate. In 1938 things came to a head with widespread violence and rioting.

Out of these disturbances came the first labor unions and the two major political parties. Sir Alexander Bustamante founded the Bustamante Industrial Trade Union (BITU), which bears his name, and was also the founder and leader of the Jamaica Labor Party (JLP), the political party affiliated with the BITU. Norman Manley founded the National Workers' Union and the People's National Party (PNP).

Both Sir Alexander Bustamante and Norman Manley were instrumental in Jamaica's move towards self-government. The first general elections under Universal Adult Suffrage were held in December 1944.

In 1958, Jamaica and 10 other Caribbean countries formed the Federation of the West Indies. The concept of Caribbean unity was abandoned in 1961, when Jamaicans voted against remaining in the Federation. On August 6, 1962, Jamaica was granted its independence from England. Jamaica now had its own constitution, which sets out the laws by which the people are governed and provides for freedom, equality and justice for all who dwell in the country.

After almost 500 years of living under the tyranny of invading forces, Jamaica was at last a free and independent nation.

OBAMA’S TAX CRACKDOWN MAY PROMPT BRITISH BANKS TO DUMP U.S. CLIENTS

The U.S. would like every bank in the world to act as if it were a U.S. bank and report all income, transactions and balances regarding U.S. citizens to the IRS. UK institutions, for their part, are saying this would be sufficiently costly that they might just stop servicing U.S. clients altogether. Not a problem for the IRS, we are sure.

The UK’s Association of Private Client Investment Managers and Stockbrokers has called on the IRS to ensure that any laws are “proportionate and cost-effective.” Good luck. It is a characteristic of government regulation in general that the costs they impose on those they regulate are of little concern until the regulated have sufficient political muscle to fight back. In this case the regulated have very little political recourse. And the U.S. lawmakers seem intent on making a statement no matter what the cost.

Financial institutions in the UK are threatening to withdraw their services to American clients if the United States Congress approves proposed changes to reporting and withholding rules for tax purposes.

Banks and wealth managers are warning that it will no longer be cost effective for them to service U.S. clients if tough new reporting rules, part of the Obama administration’s intended crackdown on international tax avoidance, are incorporated into the U.S. Qualified Intermediary (QI) program.

Under the proposals, announced by the U.S. Treasury earlier this month, foreign financial institutions that have dealings with the United States will be required to sign an agreement with the U.S. Internal Revenue Service (IRS) to become a Qualified Intermediary and share the same information about their U.S. customers as is currently required of U.S. financial institutions, “or else face the presumption that they may be facilitating tax evasion and have taxes withheld on payments to their customers.”

Proposed changes to the QI program were put out to consultation by the IRS last October, but have yet to be incorporated into U.S. tax law. Under the reforms, financial institutions that are QIs would have to provide early notification of material failure of internal controls, improve evaluation of risk of circumvention of U.S. taxation by U.S. persons, and include audit oversight by a U.S. auditor.

“This is an important program, and we cannot tolerate anyone abusing or skirting the requirements,” said IRS Commissioner Doug Shulman. “This proposal lays out a strong set of actions in our ongoing effort to strengthen the Qualified Intermediary program.”

The international tax crackdown has been included in Obama’s maiden budget for fiscal year 2010. Although non-binding, Congress is expected to take up many aspects of the proposals in the near future.

However, institutions in the UK fear that the new rules will be difficult to comply with and could land firms in hot water with the IRS if not followed correctly. In response to the suggested amendments to the QI tax rules, the Association of Private Client Investment Managers and Stockbrokers in the UK has called on the IRS to ensure that changes are “proportionate and cost-effective.”

“The QI regime continues to be an administratively burdensome and costly regime to APCIMS member firms, particularly as they have very few, if any, U.S. clients,” said Andy Thompson, Director of Operations at APCIMS.

APCIMS cited one firm with QI compliance costs of £215,000 ($343,000), of which the audit represents just under 25%, against reportable income of $175,000 ($5,000 of tax deducted) for calendar year 2007.
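To put those figures side by side, here is a minimal Python sketch using only the numbers quoted above; the two comparison ratios are our own framing, not APCIMS's.

compliance_cost_usd = 343_000   # total 2007 QI compliance cost (the audit is just under 25% of this)
reportable_income = 175_000     # U.S.-reportable income the regime captured for the year
tax_deducted = 5_000            # tax actually withheld

print(f"Cost per dollar of tax withheld: ${compliance_cost_usd / tax_deducted:,.0f}")
print(f"Cost as a multiple of reportable income: {compliance_cost_usd / reportable_income:.1f}x")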

The association believes that amendments to the process which will require the involvement of American auditors will be “costly and unnecessary” and may constitute a breach of the Data Protection Act in the UK.

“There is no need for UK firms to endure the increased costs through having to use US external auditors. IRS already have appropriate powers as all of the big UK audit firms are registered with the Public Company Accounting Oversight Board (PCAOB),” Thompson continued. “PCAOB has broad investigative and disciplinary authority over registered public accounting firms and utilizing this provision will meet the IRS’s aims of accuracy and accountability in the QI audit process.”

“Ultimately we believe the burden should fall on the audit firms rather than QIs themselves,” he argued.

According to the UK’s Daily Telegraph, APCIMS has been joined by the British Bankers Association in its opposition to the U.S. proposals. Both organizations discussed the matter with European counterparts earlier this month, and a delegation will be sent to Washington in June to urge the U.S. Treasury Department to reconsider the plans.

HOW TO GIVE A LOAN TO A FAMILY MEMBER

The right way for parents to loan money to their adult children.

Inter-family money transfers can be tricky enough without inadvertently running afoul of the U.S. tax code. When making a loan which has the true substance of a loan, i.e., you genuinely expect to get repaid and are charging an interest rate which has some connection to market rates, be aware of the form that needs to back the substance. For some basics, read on.

Back when they were feeling flush, many well-off folks made gifts of cash or stock to their adult children without much hesitation. Now some are reluctant to give, worried that they will not have enough left for their own retirement or that their children may become too dependent on handouts. “It is one thing to go out to dinner or go on vacation together and always pick up the tab, but it is another to give them money to live on” is how one client explained it recently to New York City CPA Stuart Kessler.

Fortunately, there is a fine alternative to cutting the kids off cold turkey: Lend them the cash. Kessler’s clients have made loans to children this year to start a business, buy a new car and pay income taxes. “I tell them, ‘I want you to get the money back,’” he says.

Even parents who might not have made gifts before are opening up a loan window to children affected by layoffs or the credit crunch. Los Angeles CPA Michael Eisenberg recently helped a father structure a $2,000-a-month loan to his out-of-work son. The expectation is that the economy will turn around and the son will get a new job; the documents require him to start paying the loan back in 2011. “Make sure everyone is clear on the terms – is it a loan or a gift?” Eisenberg says. “You have got to be clear which way you are going, or it can ruin the family relationship,” he adds.

Put things in writing, and charge at least the minimum IRS-set rate of interest.

An ambiguous loan can also ruin your relationship with the Internal Revenue Service, which might argue the loan is really a gift, possibly subject to gift tax. The key to avoiding either family or tax trouble is to put things in writing and, in most cases, to charge your kids a minimum IRS-set rate of interest. The interest they pay is taxable income to you.

There are exceptions: If you have lent a child less than $10,000 in total, you do not have to charge interest. Plus, there are some convoluted rules that might allow you to avoid charging interest on loans between $10,000 and $100,000. But those rules do not seem worth fooling with now, given how little interest you have to charge. In May the IRS minimum interest rate ranged from 0.8% a year for loans of three years or less to 3.6% for loans longer than 9 years.
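To make the interest arithmetic concrete, here is a minimal Python sketch using only the May 2009 rates quoted above; the mid-term rate and the $50,000 example loan are placeholders for illustration, not IRS figures.

def minimum_annual_interest(principal, term_years):
    """Rough minimum rate for a family loan and the annual interest it implies."""
    if principal < 10_000:
        rate = 0.0      # under $10,000 lent in total, no interest need be charged
    elif term_years <= 3:
        rate = 0.008    # 0.8%: the May 2009 minimum cited for loans of three years or less
    elif term_years > 9:
        rate = 0.036    # 3.6%: the May 2009 minimum cited for loans longer than 9 years
    else:
        rate = 0.02     # placeholder for mid-term loans; check the IRS rate for the month
    return rate, principal * rate

# Hypothetical example: a $50,000 three-year loan to an adult child
rate, interest = minimum_annual_interest(50_000, 3)
print(f"Charge at least {rate:.1%}, about ${interest:,.0f} of taxable interest income per year")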

Setting this up need not be expensive. You can buy standard loan forms through Nolo.com. Or one of billionaire Sir Richard Branson’s latest ventures – Virgin Money – will generate the paperwork for $99. But if you are lending a substantial amount, or for a long period, you should consult with your lawyer or accountant about the gift and estate tax angles and how any outstanding loan will be treated should you die.

The tax-free gift limit is up to $13,000 per year.

If you are so inclined, you can convert the loan into a gift over time, using the annual gift tax exclusion to forgive some principal each year. Each individual can give $13,000 a year to anyone else without gift tax consequences – meaning a couple could forgive up to $26,000 a year in principal lent to a child. Alternatively, you can leave the borrower assets in your will with which to pay off the loan. Just do not put into your will that the loan is forgiven at your death, warns Boston lawyer Louis Katz. That could turn a nontaxable inheritance left to your child into taxable “debt forgiveness” income.
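As a rough sketch of that forgiveness math in Python, assuming a hypothetical $100,000 interest-free balance and the $13,000-per-donor exclusion quoted above:

import math

loan_balance = 100_000          # hypothetical outstanding principal, ignoring interest
exclusion_per_donor = 13_000    # annual gift tax exclusion per giver, per recipient

for donors in (1, 2):           # a single parent, or a couple forgiving jointly
    per_year = donors * exclusion_per_donor
    years = math.ceil(loan_balance / per_year)
    print(f"{donors} donor(s): forgive ${per_year:,} of principal a year; done in about {years} years")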

If the IRS ever does scrutinize the loan, you should be ready to show that the child had a realistic chance of repaying the money and that you were ready to collect in the case of a default.

DUMP YOUR INVESTMENT LOSSES ON THE IRS

Use existing tax law in a clever way to recoup what is rightfully yours.

In the “heads we win, tails you lose” U.S. tax code, capital gains are taxed when they are realized while capital losses can only be used to offset $3,000 worth of non-investment income per year. With the market tank of the last year, you may well be sitting on some large capital losses, realized or not. Herewith are a few creative ideas for putting a capital loss carryforward to work.

The treatment of investors in this country is cruel and inhumane.

I am referring to the rule that says capital gains are taxable but losses are only somewhat deductible. If you realized $200,000 in gains back when the bull market was on, you paid federal and state income taxes on them. Now you have $200,000 in losses and are back to where you started. Can you carry the loss back and get a refund? Absolutely not. What you can do is deduct net trading losses against other income at a rate of $3,000 a year. It will take you 67 years to even the score.
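The 67-year figure is just the carryforward divided by the annual cap, as this minimal Python sketch shows:

import math

loss_carryforward = 200_000   # net capital losses, with no gains left to offset
annual_deduction = 3_000      # the per-year limit against other income
print(math.ceil(loss_carryforward / annual_deduction), "years to use it all up")   # 67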

The crummy $3,000 deduction was last raised in 1978. There is a proposal in Congress to raise the number to $20,000, but the probability of passage is about the same as the probability that Citigroup shares will hit a new high.

Take the law into your own hands. That means using existing tax law in a clever way to recoup what is rightfully yours.

I posed this question to Robert Gordon of Twenty-First Securities in New York City: What do you recommend to someone who is sitting on a lifetime of capital loss carryforwards? Most of Gordon’s business revolves around complex derivatives for people with megaportfolios. But he reached into his bag of tricks for a few things that would avail a middling investor. Let’s define that as someone with $1 million in the market, not counting 401(k)s and the like. Three of his answers:

Own S&P futures. If you own an S&P 500 index fund, your 2.5% yield is destined to be hit with a stiff tax when the Bush tax cut expires. Instead of owning one, own index futures, which trade at a discount whenever stock yields are higher than short-term interest rates. If one-year Treasurys yield 0.5% and stocks yield 2.5%, the futures should be priced at an annualized 2% discount to the spot price. If stocks go nowhere, you make 2% a year owning futures, taxed as a capital gain (a blend of short- and long-term gains). But for someone in your sorry state, sitting on unused loss carryforwards, capital gains are effectively tax free.
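Here is a minimal Python sketch of the cost-of-carry arithmetic behind that 2% figure, using the yields quoted above; the index level is arbitrary and the one-period approximation is ours.

spot = 100.0    # index level, arbitrary scale
r = 0.005       # one-year Treasury yield cited above: 0.5%
d = 0.025       # S&P 500 dividend yield cited above: 2.5%

# Approximate one-year fair value of the future: earn r on the cash, forgo the dividend yield d.
fair_future = spot * (1 + r - d)
discount = (spot - fair_future) / spot
print(f"Fair value about {fair_future:.1f}, an annualized discount of {discount:.1%}")   # roughly 2%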

Short Treasury bonds. The 5 3/4s of August 2010 are trading at 106 1/2. Short $1 million of them, generating $1,065,000 of cash to be invested at short-term rates. You will owe $72,000 to the lender to replace the missing coupon. That is fully deductible against investment income (like interest and dividends). The $65,000 of guaranteed short-selling profit is treated as a capital gain – which, for someone like you, is effectively tax exempt.
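A minimal Python sketch of that arithmetic, taking the quoted price and coupon and assuming roughly 1.25 years from May 2009 to the August 2010 maturity:

face = 1_000_000
price = 106.5 / 100             # the 5 3/4s of August 2010 quoted at 106 1/2
coupon_rate = 0.0575
years_to_maturity = 1.25        # assumption: May 2009 to August 2010

proceeds = face * price                                  # cash raised by the short sale
coupons_owed = face * coupon_rate * years_to_maturity    # paid to the bond's lender; deductible
convergence_gain = proceeds - face                       # price must fall back to par at maturity

print(f"Proceeds:     ${proceeds:,.0f}")          # $1,065,000 to invest at short-term rates
print(f"Coupons owed: ${coupons_owed:,.0f}")      # roughly $72,000, deductible against investment income
print(f"Capital gain: ${convergence_gain:,.0f}")  # $65,000, absorbed tax-free by the loss carryforward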

Exit bond funds before the dividend. American Century Target 2025 will pay an annual dividend to holders of record as of December 10. Redeem on December 9 and you have effectively converted a chunk of interest income into a (for you, tax-free) capital gain. Alas, bond funds with lump-sum distributions like this are scarce.

MILLIONAIRES ARE FLEEING MARYLAND. THE STATE’S ESTATE TAX IS WHY

The income tax surcharge is another more minor reason.

Taxation is the gentle art of picking the goose in such a way as to secure the greatest amount of feathers with the least amount of squawking, goes one waggish aphorism. Maryland will soon discover, and is perhaps already feeling the effects of, trading too much squawk for not enough feathers. As other states reduce their estate tax schedules, Maryland is beginning to stand out, and the exodus of millionaires fleeing to avoid the hit may have begun.

Observing the results from among the states within the United States, we see that while we cannot expect vigorous competition to drive taxes towards zero, constraints on just how high rates can get definitely do arise. Maryland millionaires have the relatively easy remedy of moving out of the geographically small state. U.S. citizens are not nearly so lucky. Not only is moving out of the country a far more formidable undertaking, but – unlike (we think, anyway) Maryland – the tax authorities will pursue you to the ends of the earth, and of your life, to make sure you are paying their calculation of your “fair share.”

The millionaires are fleeing Maryland, all right. But not because of the measly tax surcharge on income over $1 million.

They are bugging out because of Maryland’s estate tax, which applies to a bigger portion of a dead person’s hoard than the federal estate tax or those in other states.

Strange to tell, rich refugees did not want to speak with me. But their lawyers did. They suggest the high inheritance tax costs the state a lot more than it brings in because absconding aristocrats do not pay any Maryland tax, let alone the one when they die.

“For years and years, I have had clients who complained about Maryland taxes and never took any action,” says Lowell G. Herman, head of the trust and estate practices at Gordon Feinblatt in Baltimore.

But recently, nearly a dozen customers with big stashes set up residence elsewhere, largely because of Maryland’s failure to match other states in reducing or eliminating its estate tax, he says. “But that is just sort of the beginning. There are many others who are thinking about it.”

Wailing rose from Annapolis last week after this newspaper reported that income tax returns from those making more than $1 million plunged by a third.

Suspicion focused on the millionaire tax, which takes 6.25% of every dollar pocketed over that amount. According to one theory, that slightly higher (and temporary) rate, which became effective last year, pushed plutocrats across the border.

The much more likely explanation is that the worst financial crisis in decades culled the ranks of million-dollar filers. But that does not mean numerous wealthy Marylanders are not becoming wealthy Floridians or Virginians.

They are. But their problem is the estate tax, which involves much bigger dollars.

“Nobody is going to leave the state because income tax rates go up a point over a million dollars – or whatever it is. It is just not going to happen,” says Stuart Levine, a Baltimore tax lawyer and adjunct professor at the University of Baltimore law school. “But they do have to leave to some degree because of the estate tax.”

Maryland has a complicated relationship with rich people. As one of the wealthiest states in the country, it cultivates more than its share. But they do not like to stick around once they have amassed a pile. At least they do not want to be taxed here.

Along with a cameo appearance in baseball records for his 50-home-run 1996 season, former Baltimore Oriole Brady Anderson shows up in Maryland’s tax annals.

The state tried to declare him a resident and claimed thousands in back taxes, even though he owned a house in Nevada. A comptroller’s hearing officer rejected the claim.

Maybe the most famous Maryland “domicile” case involved a former IRS manager and tax consultant for the state. The guy bought a condo in Miami Beach, kept his house in Pikesville and claimed residency in low-tax Florida.

Maryland tax authorities, his former colleagues, came after him with guns blazing. They investigated his voting records, car registration and phone book listings, declared him a Marylander and billed him $2,246 in back taxes.

“Former tax collectors do not like to pay income taxes any more than other taxpayers,” wrote the judge who ruled on his appeal.

They probably like to pay estate taxes even less. Virginia eliminated its estate tax in 2007; Florida, in 2005. Many other states have done the same.

The federal estate tax does not kick in on accumulations smaller than $3.5 million. But Maryland taxes as much as 16 percent of estates exceeding $1 million, which when you think about it is not huge for a lifetime of earning and saving. The value of a home alone in Maryland could get you halfway there. It got worse in 2005 after Congress no longer allowed a federal credit for state-paid estate taxes.

For somebody worth $3 million, leaving Maryland would save his or her heirs $182,000. That is well worth the effort and often easy to do. People worried about the estate tax are mainly retirees. Many already own second homes in low-tax states, and it is hardly a sacrifice spending a few extra months a year in Sarasota to avoid filing in Maryland.
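The $182,000 figure is consistent with the old federal state death tax credit table that Maryland’s decoupled estate tax still tracks. Here is a minimal Python sketch under that assumption, showing only the bracket a $3 million estate falls into; it is an illustration, not tax advice.

taxable_estate = 3_000_000
adjusted = taxable_estate - 60_000   # the old credit table works off the "adjusted taxable estate"

# Bracket covering $2,540,000 to $3,040,000 of adjusted taxable estate in that table:
maryland_tax = 146_800 + 0.088 * (adjusted - 2_540_000)
print(f"Maryland estate tax on a $3 million estate: about ${maryland_tax:,.0f}")   # about $182,000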

OK, you do not feel sorry for the retiree with $3 million. It does not matter. She can choose where to live, and driving her from Maryland means she is not buying in local stores, attending the symphony, or paying sales and income tax.

“I have never said this about any other law,” said Herman, who agrees it is the estate tax, not the income tax, that is compelling the wealthy to leave. “This is one of the dumbest laws I have ever seen. It is very shortsighted from an economic and sociological standpoint for the state of Maryland.”

One problem with that view is that last year, Maryland’s estate-tax haul hit a record $195 million. But that could have been an aberration, swelled by the deaths of a few very rich folks, says David Roose, director of the comptroller’s Bureau of Revenue Estimates.

We will know much more in a year or two, after the comptroller’s office replaces its neolithic computers. It will be able to track how many people in which tax brackets are moving in, moving out, living and dying.

If Maryland has not cut the estate tax by then, do not be surprised if it shows the millionaire exodus has increased.

COMING SOON TO A CITY NEAR YOU

Another false-flag operation.

In Thus Spoke Zarathustra Nietzsche writes: “A state? What is that? Well! open now your ears to me, for now I will speak to you about the death of peoples. State is the name of the coldest of all cold monsters. Coldly it lies; and this lie slips from its mouth: ‘I, the state, am the people.’ ... Everything in it is false; it bites with stolen teeth, and bites often. It is false down to its bowels.”

The state must continually lie to justify its existence. A favorite form of lie was birddogged by H.L. Mencken: “The whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, all of them imaginary.”

And as stated below: “A minority can control and enslave a majority only as long as the majority fears the minority. The state will do whatever it takes to maintain that control. The state lied about the circumstances of the attack on Pearl Harbor; they lied about the Gulf of Tonkin incident; they lied about the Oklahoma City bombing; they lied about TWA 800; they lied about Ruby Ridge; they lied about Waco and they lied about 9/11. ... There will be another event such as those mentioned above when the state feels it is beginning to lose control of Boobus, it needs the premise for a new war, or reason to expand the ones it now has.”

Be prepared.

In November of 2008, millions of American voters went to the polls and voted for “change.” Many are beginning to realize that very little has actually changed. When those who run the state apparatus begin to realize a lack of confidence in their abilities to control, drastic measures are often employed.

For two consecutive election cycles, 2006 and 2008, the American electorate voted against the illegal, immoral wars of the Bush administration. Now that we have a Bush clone in a darker skin tone, no one seems to care about the killing, maiming and torturing. To the hypocritical voters in America, the crime is irrelevant; all that matters is whose criminal is pulling the trigger.

The “change” since the election, as I see it, is: A much larger national debt, ever expanding wars in the Middle East, a greater threat to personal liberty, a huge increase in gun sales, a national shortage of ammo, and rapidly increasing fuel prices. Is that the “change” Americans voted for?

The control the state is able to maintain over its citizens is directly proportionate to its ability to create fear in those citizens. Therefore, to defeat the war on terror, all we have to do is stop being afraid of the state’s villain du jour. To increase our personal freedoms and liberty, we must stop fearing the state. When one ceases to look through eyes clouded with fear, the state and its weaknesses are exposed.

The government spends billions of taxpayer dollars creating and sustaining a perpetual fear level, with the mainstream media (MSM) more than a willing partner.

Recently, we were inundated through the government and MSM with the possible threat of a pandemic associated with “Swine Flu.” This threat gained very little traction with the populace.

The FBI and NYPD have informed us we have been saved from another terrorist attack by their great work. The shills in the MSM will never examine the dynamics of this case. First, we have a group of people angry because the fedgov is killing people they care about with their illegal and immoral wars. Agents of the fedgov then infiltrate this group. Obviously, to become trusted by this group, these agents must profess to share the group’s feelings, emotions, and dedication of purpose. These agents then offer to provide the explosives and weapons necessary to carry out a terrorist attack. Did the fedgov agents choose the targets and tactics as well? Just how much of the entire plan was that of the agents involved as opposed to those they duped? Members of the group are provided (inert) explosives and weapons by these agents and then promptly arrested for having them and attempting to carry out the act the agents participated in creating. A classic case of entrapment.

The entire purpose of this case was to create fear in the population and the (mistaken) belief we are being protected by the state, a classic case of creating a problem and then appearing to be the solution.

Three incidents that created enormous fear of the boogeyman du jour were the Oklahoma City Bombing, TWA 800, and of course, 9/11. I will not discuss the issue of whether the government was involved in any of the above, but there is credible evidence that the government’s version of these events is highly suspect and certainly would not hold up under the scrutiny of a totally independent (non-governmental) investigation.

Many believe the government incapable of conducting, or being involved in, any false-flag operation involving the possible deaths of American citizens. I offer as Exhibit A Operation Northwoods, and as Exhibit B evidence from the 1993 WTC bombing.

The state has one powerful tool at its disposal to control the masses: fear. A minority can control and enslave a majority only as long as the majority fears the minority. The state will do whatever it takes to maintain that control. The state lied about the circumstances of the attack on Pearl Harbor; they lied about the Gulf of Tonkin incident; they lied about the Oklahoma City bombing; they lied about TWA 800; they lied about Ruby Ridge; they lied about Waco and they lied about 9/11. The very essence of government is fear and the lie.

There will be another event such as those mentioned above when the state feels it is beginning to lose control of Boobus, it needs the premise for a new war, or reason to expand the ones it now has. When it happens, will you run their lies up the flagpole, or demand the truth, refute the lies, and expose the liars?

I’LL NEVER RETIRE

Another government conspiracy.

The concept of retirement is actually fairly new. The idea of hanging up one’s career spikes at 65 and living out the rest of one’s life in leisure was essentially a creation of the U.S. government. Once people give up a life of productivity and financial independence for a life on the dole, in the form of Social Security and Medicare for the “retired,” they become dependent on, and thus dependable supporters of, big government.

Accidental? We think not.

Before the mid 1950s, there was no “retirement” as we use the term today. A 1950 poll showed most workers aspired to work for as long as possible. Quitting was for the disabled. Life did not offer “twilight years,” two decades of uninterrupted leisure courtesy of the U.S. taxpayer.

Just since 1960, the percentage of men over 65 still working has dropped by half. And the average retirement age keeps falling. It is down to 62, which gives the average man 18 years of retirement in its current meaning. It is not unusual to see people ending their careers in their mid-50s.

This is one of the monumental changes in the fabric of society wrought by the government, one that has so altered the integrity of the people.

As someone on a payroll until the age of 79, and now employed on a non-compensated basis, I came to see that I was regarded as something of a freak. Was I trying to set some sort of record? Had I failed to accumulate a large enough estate?

There seemed to be some feelings too that I was somehow un-American, and a poor reflection on a generation that is supposed to be enjoying the good life.

Observing my generation opt for leisure, I see all sorts of adaptations. One described his life in Florida as meeting the same three golfers on the first tee at the same time each day for nine holes, then lunch in the club house, nine holes after lunch, shower, gin and tonic, and then back to the condo to dress for dinner. When asked if this was the routine for every day, he said, “No, I help my wife clean on Tuesday.”

This is what I am supposed to aspire to?

Another friend, in answer, said “I sleep as late as I can because I don’t know what to do when I get up.”

The remark heard most frequently is “I have been so busy since I retired, I don’t know how I ever had time for my job” or “Retirement is so wonderful, I should have retired sooner.”

At this point it might be in order to ask – “Busy doing what?”

Many of those who retire at 55, 60, 65, or 70 are some of the most experienced, knowledgeable, and capable people in the workforce. Rather than occupying positions that might be available to younger people, they could be creating and expanding job opportunities for others.

There is a sense of self-worth that comes from working to a purpose that is essential to well-being, whether the task involves major responsibility or physical exertion, as both require diligence and daily attendance.

How did we come to this slough of despondency? Like so many of our present disorders, it was the siren call of the great white father in Washington: “Come unto me all ye who labor and are heavy laden and I will give you rest.”

With Social Security, Medicare, and public pensions, the government has created a large new class of dependents who see no necessity to save or to accept responsibility for themselves, their offspring, or their parents.

As this fatally flawed scheme proceeds toward disaster, the beneficiaries are so insistent that their benefits be maintained, and are such a strong political force, that few congressmen have the temerity to say publicly what everyone knows: Payments cannot be sustained. Those who are working are paying for benefits that will not be available to themselves.

Buddha on his deathbed admonished his followers to, above all, observe strenuousness. How strange that sounds in today’s world. Our culture denies this essential virtue to our seniors, who have become dilettantes.

As we observe able-bodied citizens hiking the malls or sampling the midnight buffets on the cruise ships, we are struck by their purposelessness, and the overwhelming boredom they manifest. There is no need to arise in the morning, or any necessity to go to bed on time. Their reason for existence has ceased. They have lost the respect of those who support them, and lost their self-respect in the process.

A story is told of one who had led a long and eventful life. When the time came to cross the deep lake, he was pleased with the skiff and the oarsman as well as his welcome and the accommodations furnished him. The surroundings were beautiful, the weather pleasant, and the food more than adequate. After a few weeks, he wanted to try his hand at gardening again, but that could not be arranged. After repeated requests to work in the dining hall or on the grounds, he cried in exasperation, “This is no better than Hell.” The reply came from above, “Where did you think you were?”

Irving Babbitt reflected on the nature of work, how it was seen in the past as a God-given calling, and indeed served to define a person. With the loss of vocation has come a loss of identification.

To remedy this loss does not require legislation or public awareness. The solution is within the grasp of everyone who has decided to continue to be productive. It often means a change in occupation. It may mean giving up benefits and accepting a lower wage, or no wage at all. But a reason for living, and a retention of identity, are surely sufficient remuneration.

Traditional Company Pensions Are Going Away Fast

Staying productive by not retiring may be a virtue. It is also a decision one would rather make voluntarily. Declining retirement plan assets may instead leave little choice in the matter.

Personal retirement plans are usually equated with IRAs and 401(k)s; however, until fairly recently company pension plans were part of the average employee’s retirement plan mix. Similar to an annuity, a company defined benefit plan would promise $X per year – with inflation indexing if the employee was lucky – from retirement until death. In effect the company assumed the risk of declining values of the assets backing the promise, rather than the employees, who directly assume that risk with IRAs and the like. In theory companies could spread risks and practice asset allocation in a more enlightened manner than small-fry individual employees.

In theory.

In practice, as far as we can tell, companies were more prone to idiotic “alternative investments” such as private equity and arcane mortgage-backed securities tranches. And even if this is off-base, they were just as exposed as individuals to the crash in garden variety equities and corporate bonds. Following the declines in asset values of the last years, companies find themselves with less than adequate backing for their pension fund promises.

Of course they are looking to minimize the associated costs in whatever ways they can. In general this has not involved unethical contract breaches, but it is safe to say the whole concept of company-backed post-retirement payout guarantees – perhaps this should have been left to the life insurance/annuity issuers all along? – is endangered. The bottom line is that even if your company’s plan has not been terminated, expect to get less out of it than originally planned.

Life is risky and the future is always uncertain. The risk can be transferred but not eliminated. Individuals are best situated to assess their own futures, and specialists who will assume certain risks for a price are out there. With the pop in the financial bubble perhaps there will ultimately be a concomitant adjustment in thinking about future risks. An optimistic view of things is that a move towards reality-based thinking in one important domain may even have salubrious effects elsewhere in the individual and collective discourse.

Nine years ago, Devon Group, a small public relations and marketing group based in Middletown, New Jersey, began offering a traditional pension plan to its employees.

Business was booming, and the costs of offering the benefit “seemed very reasonable,” says Jeanne Achille, Devon’s chief executive officer. A pension plan also provided some valuable tax benefits for the firm, she says.

But after the economy deteriorated last year, “We realized that this was going to be too rich a benefit for us to continue,” Achille says. “You are required to fund the plan every year, regardless of whether your profits are where you would like them to be.” Rather than continue funding the plan, Devon Group voluntarily terminated its pension and sent each employee a check for the amount accrued.

The number of companies offering traditional defined benefit pension plans was shrinking even before the recession, but the downturn has accelerated the decline. Since the beginning of the year, at least 20 companies have frozen their defined benefit plans, exceeding the number of plan freezes for all of 2008. A recent survey by Watson Wyatt found that, for the first time, the majority of Fortune 100 companies are offering new salaried employees only one type of retirement plan: a 401(k) or similar “defined contribution” plan.

The rapid disappearance of traditional pensions comes at a time when many workers have seen their retirement savings eviscerated by the bear market. The average 401(k) balance plummeted 27% last year, according to Fidelity Investments. While younger workers have time to make up the difference, workers in their 50s and 60s will have a hard time recovering their losses before retirement.

“The market collapse has just proven how fundamentally flawed 401(k) plans are as a vehicle to provide retirement income,” says Karen Friedman, policy director for the Pension Rights Center.

But increasingly, employees cannot rely on traditional pensions, either. Reasons they are endangered:

Declining profits. In late April, Lockheed Martin said its 1st-quarter earnings fell 8.7% because rising pension costs outweighed an increase in sales. The defense contractor plans to continue its pension plan for existing participants, a company spokesman said.

But some plan sponsors fear they will have to close plants or take other drastic actions unless they lower their pension costs, says Lynn Dudley, senior vice president, policy, for the American Benefits Council, a trade group for companies that offer employee benefits. Freezing a plan, she says, “is better than laying people off.”

New funding requirements. Investment losses in 2008 shrank the assets of the nation’s largest pension plans to 79% of projected liabilities, down from 109% at the end of 2007, according to Watson Wyatt. At the same time, pension plan sponsors are facing stricter funding requirements to strengthen the long-term health of pension plans.

Those requirements, combined with the investment losses, are forcing companies to shovel more money into their pension plans at a time when they can least afford it, says Dena Battle, director of tax policy for the National Association of Manufacturers.

“The cost of that is jobs, a reduction in capital expenditures, a reduction in benefits, and unfortunately, that includes plan freezes,” she says.

NAM and other groups representing plan sponsors are pressing lawmakers for temporary relief from the funding rules. So far, Congress has not acted. Some Democratic lawmakers and pension-rights advocates have proposed tying funding relief to a guarantee that a company will not freeze its pension for at least five years.

“If we do give employers more time to fund their plans, there should be something employers promise in return,” Friedman says.

Battle says companies would not accept such conditions because they would limit their ability to manage their businesses. In addition, she says, forcing companies to continue offering a pension would set a dangerous precedent, because this nation’s employer-sponsored retirement system has always been voluntary.

Concerns that taxpayers could be on the hook for underfunded pension plans could also complicate efforts to ease the funding requirements.

On Wednesday [May 20], the Pension Benefit Guaranty Corp., which insures pensions for 44 million retirees, reported a $33.5 billion deficit for the first half of fiscal 2009, up from $11 billion in fiscal 2008. That shortfall, the largest in the agency’s 35-year history, could increase dramatically if the agency is forced to take over pension obligations for General Motors and Chrysler. The PBGC says it has enough money to cover current liabilities.

Competitive pressures. Even if Congress approves funding relief, some companies may go ahead and freeze their plans “because their competitors are not offering pensions,” says Scott Jarboe, senior retirement consultant for Mercer, a human resources consulting firm.

Health insurance giant Cigna, which announced this month that it will freeze its pension July 1, said in a statement its retirement package was “significantly higher” in value than plans provided by its competitors. “While the company continues to be financially stable, making this change will improve our competitive cost position,” Cigna said.

Lack of interest. When Devon Group adopted a traditional pension, the company thought it was offering a valuable benefit for its employees, chief executive Achille says. But she soon learned that young job candidates were more interested in a 401(k) plan, because they assumed they would change jobs several times during their careers. The company plans to offer a 401(k) plan later this year.

Company 401(k) plans offer “visibility and portability,” says Alan Glickstein, senior retirement consultant at Watson Wyatt. “Everyone understands what an account is worth. With a traditional defined benefit plan, it is hard for employees to really understand their value.”

However, huge losses in 401(k) plans – readily apparent to anyone who looks at an account statement – could change employees’ attitudes toward traditional pensions, says Norman Stein, professor at the University of Alabama School of Law and a pension expert. In this environment, he says, “It should not be a tough sell to get employees to say these are actually pretty valuable plans.”

Older workers hardest hit

When a company freezes its pension, employees get to keep the benefits they have already accrued, but they usually will not earn any more.

That makes pension freezes particularly hard on older employees, who have less time to make up the difference by saving more. In addition, traditional pensions “are worth a lot more at the end of your career than at the beginning of your career,” Friedman says. “If the freeze comes in your 40s and 50s, you end up with a much smaller benefit.”

John Gaz, 46, a flight simulator technician for Delta Air Lines, saw his pension frozen in 2005. While the company upped matching contributions to his 401(k) plan, Gaz says he will never be able to contribute enough to make up for the loss of his benefits. Gaz plans to leave his job at Delta at age 52, the first year he will be eligible to draw money from his pension. “I will walk away from the airline industry, because there is nothing to keep me here anymore,” he says.

In the past, most pension freezes were accompanied by improvements to the company’s 401(k) plan. But in these tight-fisted times, that is no longer the case. Retail chain Talbots froze its traditional pension this year and also suspended matching contributions to its 401(k) plan. Similarly, Boise Cascade froze its pension plan for salaried employees and suspended matching contributions during the first quarter.

While 401(k) plans are considered less costly than traditional pensions, they are not immune from cutbacks during tough times. Since the beginning of the year, more than 200 employers have reduced or suspended contributions to their 401(k) plans, according to the Pension Rights Center.

Hybrid pensions could be coming

The cutbacks in 401(k) matches and pension freezes reflect companies’ struggles to survive during extraordinarily difficult economic times, Glickstein says. And with unemployment approaching 9%, he says, “People are not going to be haggling over benefits if they can keep their job.”

But Glickstein says he is not ready to write the obituary for pension plans.

When the economy recovers, he predicts, more companies will consider adopting a cash-balance pension plan, a hybrid pension that combines features of a 401(k) plan and a traditional pension. These plans can play a valuable role in encouraging older workers to retire, an important part of workforce management, Glickstein says.

Millions of workers are postponing retirement because they are afraid they cannot afford to stop working.

“If people cannot leave when they are ready to leave and you are ready to have them leave,” Glickstein says, “that’s an issue.”

AMERICA’S ANTI-MILITARIST HERITAGE

“Liberals” are anti-war only when someone they do not like is waging it.

We last posted a review of Bill Kauffman’s Ain’t My America: The Long, Noble History of Antiwar Conservatism and Middle-American Anti-Imperialism eight long months ago, here. That one in turn was preceded by another review, which we posted here. Why post yet another review? Well, there has been a little change in presidents in the U.S., and excuse us if we are sustaining a bout of “Meet the new president, same as the old president” déjà vu all over again when it comes to warmongering.

Presidents may change, but the elite’s attraction to imposing its ideas and maintaining its prerogatives by using force, at home and abroad, does not.

This review differs from previous ones in that it specifically summarizes from the book instances of opposition to the many wars America has engaged in from the beginning.

Americans do not have much historical memory anymore. That is not just because of the dumbing down of the educational system and the fact that most young people read very little on their own. It is because most of what little they do hear about our history is colored by statist theology.

But if you talk to some older Americans – people in their 70s and 80s – you will encounter a few who know some important things. First, they know that there was widespread opposition to the wars the United States fought in the 20th century; and second, they know that most of the opposition to war came from the “Right.” That is, “liberals” were the ones champing at the bit to send American forces into combat and “conservatives” were the ones saying, “Let’s just mind our own business.”

Bill Kauffman’s book Ain’t My America is intended to drive that point home. His subtitle lets the reader know where he is going – the long, noble history of anti-war conservatism and middle-American anti-imperialism. This is not just a dry and pedantic bit of historiography, though. Kauffman writes with an angry edge because he is sick and tired of the politicians – left, right, and center – who just cannot resist the calls for sending American troops into combat all around the globe. He wants to kindle the embers of an old fire – the deep conviction among Americans on the political Right that keeping America’s national nose out of foreign wars is morally and politically the intelligent policy. Americans should not start wars. They should not participate in those already begun. They should just mind their own business! That should be the stance of the “Right” even more than of the “Left.”

When Americans read about their history, they learn the results of the numerous wars they have been in, but almost never is any space devoted to the decisions to get into them. Wars do not just break out spontaneously. Government officials have to act, but what of those, in and out of government, who did not want to get involved? Only if you look deeply will you find anything about the people who opposed America’s wars. Kauffman has done exactly that. In Ain’t My America, he shows that there was opposition to every one of America’s foreign wars, mostly from small-town, freedom-loving folks whose chief demand of the government was that it respect their rights.

The War of 1812

Although I daresay that I know a good deal more about American history than most people, I was surprised by many of the facts Kauffman presents. I had not known that Daniel Webster was an opponent of the War of 1812. The great orator said at the time,
Who will show me any Constitutional injunction which makes it the duty of the American people to surrender everything valuable in life, and even life itself, not when the safety of their country and its liberties may demand the sacrifice, but whenever the purposes of an ambitious and mischievous government may require it?
Ah – an early understanding of the truth that politicians usually seek war for their own advantage.

The Mexican War

The Mexican War of 1846–48 was sought by President James K. Polk, who fabricated a border incident to serve as the justification of hostilities – just as Hitler did with the Poles in 1939. Many Americans, however, saw right through his deception and bellicose rhetoric. A little-known member of Congress named Abraham Lincoln was one. Another was Rep. Alexander Stephens of Georgia (later the vice president of the Confederacy), who said, “Fields of blood and carnage may make men brave and heroic, but seldom tend to make nations either good, virtuous, or great.” Lincoln, Stephens, and many others saw the Mexican War as simple aggression by the United States and wanted no part of it.

After the bloodbath of the Civil War, the United States stayed out of foreign conflicts until late in the 19th century. Hawaii was annexed in 1898. While the takeover was bloodless, former president Grover Cleveland said that he was “ashamed of the whole affair.”

The Spanish-American War

Far worse was the Spanish-American War. Whatever might have caused the sinking of the battleship Maine in Havana’s harbor, the McKinley administration instantly seized on it as a casus belli and the country was at war before any opposition could form. After the end of the hostilities, a group of capitalists who wanted peace rather than an empire formed the Anti-Imperialist League. One of them, George Boutwell, criticized U.S. involvement in the Philippines, where American troops were fighting nationalist guerrillas:
Is it wise and just for us, as a nation, to make war for the seizure and governance of distant lands, occupied by millions of inhabitants who are alien to us in every aspect of life except that we are together members of the same human family?
A great amount of death and suffering would have been avoided if the United States had stayed out of the Philippines, but the expansionists were firmly in charge in Washington. The Anti-Imperialist League was drowned out with jingoistic slogans.

At this point, we meet one of Kauffman’s heroes, Sen. George F. Hoar of Massachusetts, a crusty Republican who wanted to keep out of foreign military adventures. Writing in 1902 about America’s Philippine involvement, Hoar said bitterly,
We crushed the only republic in Asia. We made war on the only Christian people in the East. We vulgarized the American flag. We inflicted torture on unarmed men to extort confessions. We put children to death. We established reconcentration camps. We baffled the aspirations of a people for liberty.
World War I

World War I was a replay of the Spanish-American War, but on a gigantic scale. It was the big-thinking nationalists who insisted on preparing for and eventually entering the war by sending American troops to France. While it is often said that the business class – usually vilified as “merchants of death” – was instrumental in pushing the nation into a war that had no bearing on Americans at all, Kauffman shows that many businessmen were against President Wilson’s determination to participate in the carnage in Europe. They foresaw that war would bring not only death and destruction, but also regimentation and high taxes.

Henry Ford was one voice for peace and sanity. Prior to Wilson’s victory over the pacifists with the April 1917 declaration of war, he wrote,
For months, the people of the United States have had fear pounded into their brains by magazines, newspapers and motion pictures. No enemy has been pointed out. All the wild cry for the spending of billions, the piling up of armaments and the saddling of the country with a military caste has been based on nothing but fiction.
America’s foremost capitalist was not alone in wanting peace. Millions of people who liked their government small and saw no glory in war wanted to stay out of “Wilson’s War.” (See my review of Rich Man’s War, Poor Man’s Fight, by Jeanette Keith, in the June 2005 Freedom Daily. The book details the opposition to the war in the South. [Review posted below.]) Of the 50 House members who voted against war, 33 were Republicans. Only 16 Democrats went against their messianic president.

Wilson got his war. Americans who spoke out against it were imprisoned. Kauffman quotes one South Dakota farmer who got a 5-year prison sentence for saying, “It was all foolishness to send our boys over there to get killed by the thousands, all for the sake of Wall Street.” Not all Wall Streeters wanted the war, but most of small town and rural America was opposed. The war was entirely the doing of the nation’s political elite, which looked down its collective nose at the rubes who could not see that America had to fight to save the world.

World War II

In the late 1930s, with the storm clouds of war again building up over Europe and Asia, the same drama was replayed. Conservative, small-town America could see that there would be another war and tried to keep the United States out of it. Kauffman concentrates especially on the America First Committee. “It was not in any way pro-fascist or pro-Nazi, though of course anyone who opposes a war in modern America gets tagged as an enemy symp,” he writes. The America Firsters believed in the libertarian position that the country should be sufficiently armed to repel any attack on it, but stay out of the war unless attacked. Public polling in 1940 showed that about 80% of the people agreed. Kauffman does not go into Roosevelt’s machinations to goad the Japanese into attacking, but once the bombs fell on Pearl Harbor, war was inevitable. Once again, the “just leave us alone” instincts of most Americans were trampled upon.

The Cold War

When World War II was finally over, the big-government internationalists could not allow the power they had worked to amass to wither away, so they conjured up the Cold War. By that time, much of the American Right had been lured into the camp of the bellicose, but a few remained to argue against the Truman/Eisenhower policies of confrontation. One was old Herbert Hoover, who opposed committing U.S. troops to NATO and declared that Truman had violated the Constitution by involving the country in the Korean War without a declaration of war by Congress.

Another was Sen. Robert Taft (R-Ohio), who said in a Senate speech in January 1951, “The principal purpose of the foreign policy of the U.S. is to maintain the liberty of our people.” Unfortunately, liberty was far from the minds of most of his colleagues.

Less well known than Hoover and Taft is another Kauffman hero, Howard Buffett, father of the billionaire investor. Howard Buffett was a member of the House from Nebraska in the 1940s and 1950s. He was fervently opposed to militarism, foreign aid of all kinds, and anything that went beyond his vision of a government that just protected life, liberty, and property. Buffett was adamantly opposed to the military draft, which to him was no different from slavery.

With the passing decades, the Right has largely become the pro-war side of the political spectrum and the Left now contains most of the anti-war crowd. There are some exceptions, of course. Congressmen Ron Paul (R-Texas) and John Duncan (R-Tennessee) opposed the Iraq War from the beginning, but most Republicans have fallen into the neocon orbit and believe that the solution to just about anything the United States does not like around the world is to send in American troops. Opposition to military escapades comes mostly from “liberals,” but not with much effect. (I wish that Kauffman had pointed out that the problem with leftist opposition to war is that it is unprincipled. People who favor massive government taxation and control of nearly every other aspect of life are not on firm ground when they say, “Let’s not use military force for anything but self-defense.”)

What Kauffman hopes to see is a revival of anti-war sentiment among those who should be its strongest natural proponents – Americans who want their government small, their taxes low, and no soldiers in body bags. Despite all the propaganda that wanting to avoid war is cowardly, he is optimistic:
It may not be too late for the American Right – for Main Street America in all its conservative neighborliness, its homely yet life-giving blend of the communal and the libertarian – to rediscover the wisdom of its ancestors, who understood that empire is the enemy of the small and war is the enemy of the home.
Bill Kauffman has hit the nail right on the head. It should not be just the far Left that says “No” to war. There is a strong history of anti-militarism on the Right and it is time to bring it back to life.

A Rich Man’s War and a Poor Man’s Fight

Some of the most determined resistance to the World War I draft took place in the rural South: Jeanette Keith’s Rich Man’s War, Poor Man’s Fight: Race, Class, and Power in the Rural South during the First World War reviewed.

The decision by the U.S. government to intervene in what came to be known as World War I was possibly the most catastrophic decision in Western history. It led directly to the Nazis, fascists, and Bolsheviks coming to power, and then to the Cold War. The liberal Western values that rose to prominence during the Age of Reason have been fighting a rearguard action against the forces of collectivism and militarism ever since. It is unclear whether they will survive, or will soon be wiped off the world map for generations to come.

Did any of those opposed to America’s entry into European hostilities foresee this? Probably not. The Europeans who started the whole thing thought it would be over in a matter of weeks. Woodrow Wilson thought he was making the world “safe for democracy,” and died a broken man when his hallowed vision failed. Wilson’s handlers, spearheaded by the mysterious “Colonel” Edward Mandell House, wanted to make sure the right side was victorious. We tend to think even they would have chosen a European stalemate had they accurately predicted the outcome of America’s intervention.

Those opposed to the intervention were driven by principle, as elaborated upon in the Bill Kauffman book reviewed above, or simply by the very understandable desire not to be killed getting involved in someone else’s business.

History professor Jeanette Keith’s book Rich Man’s War, Poor Man’s Fight examines opposition to entering World War I through the lens of “race, class, and power in the rural South.” The review below, by itself, reveals ironic elements in the war’s execution, e.g., that blacks were often exempted from the draft because their white employers used connections to avoid losing their labor force. Those who were drafted were usually not posted on the front lines because they were not considered reliable. Most interesting is that among the strongest war resisters were those favorite whipping boys of the pro-state – and inevitably pro-war – elite: rural Southerners. Having had their own homeland devastated a generation earlier, perhaps they were not in the mood for a repeat performance.

What little most Americans have heard about U.S. involvement in World War I is that U.S. troops swaggered into France, defeated the mighty armies of Imperial Germany, and thereby made the world safe for democracy (as President Wilson put it). That there was deep opposition to the war across a wide swath of the American public is scarcely known at all. At the time, however, the Wilson administration was so concerned about opposition to U.S. entry into the raging European conflict that it pushed through Congress the Espionage and Sedition Acts, which were vigorously used against people who spoke out against the war. Neither the effusive pro-war rhetoric of Wilson and his allies nor the crackdown on civil liberties was, however, able to extinguish the sentiment among many Americans that the war was a horrible blunder.

In her new book, Rich Man’s War, Poor Man’s Fight, history professor Jeanette Keith examines the opposition to American participation in World War I by focusing on, as the book’s subtitle says, “race, class, and power in the rural South.” Keith has dug deep into historical data – small-town newspapers, Selective Service records, court documents, and more – to give us a picture that many people will find difficult to believe, namely that “some of the most determined resistance to the World War I draft took place in the rural South.” She tells a fascinating story.

The groundwork for U.S. military intervention abroad was begun years before the onset of war in Europe. Keith explains that the “Preparedness Movement” was the brainchild of ex-President Theodore Roosevelt and other nationalists who “made a military buildup part of their agenda, along with Anglophilia, immigration restrictions, Americanization, eugenics, and strident glorification of manhood and ‘patriotic Motherhood.’” That movement started in the Republican Party, following its split in 1912 and the consequent victory of Woodrow Wilson. Once the war began in Europe, the cries for “preparedness” spread rapidly throughout much of both the middle and upper classes. Newspapers editorialized in favor of conscription and the expansion of the military. Writers such as Hudson Maxim (of the famed armaments family) harangued the populace with tales of how American women would become the prey of invading German armies unless the nation turned itself into a New World version of Prussia. By 1915, much of America was bristling and ready for action.

Southern opposition to World War I

But much of it wasn’t. Many Americans, from all walks of life and places in the political spectrum, abhorred the militaristic talk and tried to dampen the nation’s surging bellicosity. Although the South is generally regarded as an especially militaristic section of the country, Keith shows that there was strong opposition to the Preparedness Movement there. “Southern antimilitarists,” she writes,
argued that when the nation needed defending, American men would volunteer for the military, as they had in all previous wars, and they opposed building up a conscripted military force large enough to allow the U.S. government to go adventuring overseas.
Opponents also raised another objection – that a big army would mean tax increases. It is interesting to note that when government was relatively small, people were attentive to the prospect of even a small increase in taxes, while today, with our vast government, people hardly seem troubled at all when further huge expansions are announced.

When war was finally declared in April 1917, some of the most vocal opponents were southern Democrats. Rep. Claude Kitchin of North Carolina, for example, spoke against the declaration of war, saying, “Let me once remind the House that it takes neither moral nor physical courage to declare a war for others to fight.” Many Southerners felt the same way. Keith quotes from a letter written to a Mississippi senator:
You may go ahead and declare war in order to satisfy a very few, but I hear the men behind the plow say they are not going for they have nothing in Wilson’s war.
Sedition and conscription

Soon the phrase “rich man’s war, poor man’s fight” became a popular expression of the disdain many ordinary Americans had for U.S. entry into the war. The Wilson administration, frantic to shut down criticism, quickly pushed the Sedition Act through Congress, making it illegal for people to denounce the war. Within days of its enactment, a barber in Roanoke, Virginia, was arrested by federal agents for having distributed a flyer entitled “A Rich Man’s War and a Poor Man’s Fight.” Freedom of speech was unimportant to Wilson and his backers. Maximizing the war effort trumped every other consideration, including the Constitution.

Wilson’s campaign against open dissent was quite successful. Keith writes,
By late summer, when “the boys” shipped out for camp, southern rural dissenters had been thoroughly intimidated: denied access to the mails, spied upon by agents of the federal government, denounced by their local political enemies, and in some cases, accused of sedition and incarcerated.
Opposing the war and opposing the suppression of free speech were equally dangerous to one’s liberty, yet a few Southerners did both anyway. Some, Keith notes, were Populists, some agrarians, a few were Socialists, and many had no particular political philosophy. The common thread was an inability to see why young American men should be forced to risk death in the trenches of Europe.

Keith’s chapter “Race, Class, Gender, and Draft Dodging” is especially enlightening. She notes first that while the government initially sought to fill up the army’s ranks with volunteers, so few men volunteered that conscription was quickly adopted. Local draft boards held almost unchallengeable power to either induct or defer men. It is perhaps surprising that the prevailing racism and economic structure here worked against whites. Blacks were largely exempted from the draft because they were mostly employed by white businessmen who did not want to lose their labor force. Those businessmen had “connections” and used them to keep their workers home.

Also at work was the widespread belief that blacks just would not be dependable soldiers. In that, there may have been a fair measure of truth, since many southern blacks in 1917–1918 tended to view the war as irrelevant to their concerns. Some blacks were drafted, but the white view was that they would not be reliable in battle and they were mostly consigned to rear-echelon duties.

It was rural white men who were drafted in the largest numbers and were sent to the front lines. Keith quotes numerous letters written by women to draft boards and elected officials begging that their husbands and sons be exempted from the military because their work was needed at home. Such pleas fell mainly on deaf ears.

Draft evasion was surprisingly common. One key reason that it was possible for many Southerners to escape conscription was the “primitive” state of governmental record keeping in the South. In those days, prior to Social Security and its near-universal tracking of people, government officials often lacked accurate information about citizens’ residences and ages. Referring to James Scott’s important book Seeing Like a State, Keith contends that Southern states had not yet perfected the techniques used by modern governments to “see” their populations and thereby subject them to control. No doubt, some young men in the South survived owing to the fact that “their” officials did not know as much about them as officials in the Northern states knew about young men there.

Among those who could not evade the draft, there was a surprisingly high degree of resistance and desertion. Federal officials had a difficult time tracking down draft resisters and deserters from military camps, men who were often sheltered by sympathetic citizens. Desertion rates, Keith’s research indicates, ranged from 7% in North Carolina to more than 20% in Florida. Desertion was not just a southern phenomenon, however; in New York, the desertion rate was higher than 13%. Blood was shed in more than a few of the forays where officials went to apprehend men who were supposed to be in the army, but preferred their freedom instead.

Rich Man’s War, Poor Man’s Fight gives the reader a unique view of the United States in its first modern war, the extraordinary lengths to which the government was willing to go to choke off dissent, and the reaction to the war in a region of the country that most people would assume reflexively supported a Democratic president who had done his utmost to generate war fever in the nation. It is a truly original piece of historical analysis.

My sole complaint is that Keith makes it sound as though the only opposition to the war came from the American Left. (Some of those leftists, it should be noted, were war enthusiasts later when Stalin attacked nations such as Poland and Finland. Their opposition to American involvement in World War I was opportunistic rather than based on a principled rejection of militarism.)

While it was not her aim to give a thorough catalogue of the Americans who did not buy the war hysteria, Keith might at some point have noted that there were libertarian opponents such as Albert Jay Nock and H.L. Mencken. The phrase “rich man’s war” was a calumny on the many wealthy who wanted the United States to stay out of the war. World War I was mostly a poor man’s fight, but it would have been more accurate to call it an interventionist politicians’ war. But for them, America would have stayed at peace.

That imbalance aside, this is an excellent work that reveals much about militarism and its enemies in early 20th-century America.

YOU WANT ENGAGEMENT? THEN START BEING CLEAR!

How to keep the wheels turning even when you are not looking ...

Here is some advice to leaders and managers on how to motivate everyone else involved to “go the extra mile” – to align with the overall goal or intent of the project and to act on its behalf without being micromanaged or otherwise ordered around. The author frames the issue as one of encouraging engagement. And when that is not happening, he posits that one or more of the following is missing: clarity, hope, or commitment.

The Problem: You want your staff to go the extra mile. You want your team to take some risks. You want your employees to “get the big picture” and do what it takes to make it happen. You want the wheels to stay on the bus even when you are not there.

What you want is engagement. But no one is buying. If you want something done you have to spell it out in detail, or just give up and get to that ugly “I will just do it myself” place of the defeated manager. You feel like every time you turn your back, the wheels come off the bus again. You have got zero engagement.

Engagement was defined by John Gibbons (writing for the Conference Board) as “a heightened emotional connection that an employee feels for his or her organization, that influences him or her to exert greater discretionary effort to his or her work.” Notice the line from emotional connection to greater discretionary effort.

Discretionary effort is the phrase that describes what every employer wants: For the employee to figure out what is needed for success in the bigger picture, and to do whatever is needed to get there – without anyone standing over their shoulder ... Stuff just gets done.

So how do you get your team to this place? What do we require to become fully engaged? In my experience, three things are needed: clarity, hope, and commitment.

Clarity

In his book The One Thing You Need to Know ... Marcus Buckingham writes that the one thing you need to know about great leadership is “Discover what is universal, and capitalize on it.” Buckingham tells us that what is universally required of leadership is clarity. Specifically, an optimistic clarity about the future.

We will work our hearts out for you (that is discretionary effort) if you can make us see with crystal clarity the great future we are all headed for.

Leadership is the work of leaders. That means get out front and lead. You must see what others cannot yet see. You must see the future with an optimistic clarity that inspires others to follow. Leadership does not just require clarity; leadership is clarity.

If you cannot see the future more clearly and more optimistically than the rest of us, what makes you a leader?

Hope

If there is clarity about the future, then the next link in the chain is possible: hope.

Hope, as I have defined it, has two components: an optimistic vision of the future, and the belief that we have what it takes to get there. As a leader your clarity of vision creates the precondition for that kind of hope. We must see where we are going, and we must believe it is a place worth getting to, before we decide to invest our blood, sweat, and tears to get there! The success of every great religious leader, every reformer, every leader of any expedition across any ocean or continent has depended on their clarity about just how much greener the grass over there is.

When we can see that where we are headed is better than where we are now, clarity becomes hope.

Great managers play a critical role in inspiring hopefulness in teams. With their defining work in understanding the strengths of every employee (Buckingham again), great managers help us understand exactly what our role is, and leverage our strengths in achieving the goals of the organization. Great managers support our contribution by constantly encouraging further growth where they know we are strong, and by giving us opportunities to use those strengths for the greater good.

Great managers act as match-makers between our strengths and the jobs that need to get done. The result, when all is right, is that powerful feeling of a team that is firing on all cylinders, and every member is clear about their role in the success of the overall project. And like so much in life, success builds more success: feeling like we are successful contributors to the greater good, and being part of a successful initiative, builds the confidence and hopefulness that leads to more success.

When clarity and true hopefulness exist in an organization, the stage is set for the third component of total engagement: commitment.

Commitment

When our leaders give us clarity and hope about our futures and the future of the organization, the stage is set for us to make a commitment. Starting on a journey of change and growth requires clarity and hope. But there is no journey at all without commitment. Commitment is the action piece. It is time to start walking. Commitment is, to paraphrase Nike Corp., “just doing it.”

If organizational change sometimes feels like going over a cliff, then clarity is envisioning just how we will make the tricky descent, and hope is the confidence we will make it to the bottom in one piece. Commitment is taking the first step over the edge. Commitment is the point at which there is no turning back.

If clarity is the domain of leadership, and engendering hopefulness the domain of great managers, commitment is the responsibility of the whole team. Literally, if clarity and hopefulness are the call, commitment is the response. We are all going over the edge together, and my commitment as a team member is that I will take that first step with everyone else, and every required step after it, until we reach our goal.

Now we have engagement.

So you want complete engagement? Give us complete clarity. Do not complain that you cannot get anyone involved, engaged, or committed to your project or your vision if you cannot help us see it. Do your job as a leader, or we will wander off somewhere else. Require that our managers provide the kind of intelligent feedback and empowerment that strengthens our confidence in ourselves and in the organization we work for, or we will falter and lose our commitment.

Do you want your employees to tap into that mysterious “discretionary effort” that means the wheels stay on the bus even when you are out of the building? Then make sure that you have done your part to be clear, and to connect our strengths with the task at hand. If you have really done your part, then you will get passionate engagement and everyone will take that first step towards extraordinary growth and change together, and then keep on walking!

SHORT TAKES

Murray Rothbard’s Anatomy of the State

Anatomy of the State is now available as a stand-alone 60-page essay. Among expositions of the “state as an organized crime racket” thesis, it may be the best combination of concise and complete around.

Murray Rothbard was known as the state’s greatest living enemy, and this is his most succinct and powerful statement on the topic, an exhibit A in how he came to wear that designation proudly. He explains what a state is and what it is not, according to his own ideological vision. He shows how it is one institution that purports to hold the right to violate all that we otherwise hold as honest and moral, and how it operates under a false cover now and always. He shows how the state wrecks freedom, destroys civilization, and threatens all lives and property and social well-being.

The essay is seminal in another respect. Here Rothbard bound together the cause of private-property capitalism with anarchist politics – and he was truly the first thinker in the history of the world to fully forge the perspective that later came to be known as anarcho-capitalism. He took all that he had learned from the Misesian tradition, the liberal tradition, and the anarchist tradition to put together what is really a new and highly systematic way of thinking about the entire subject of political economy and social thought.

Understanding his point of view has an interesting effect on any reader. It has the effect of putting things together in a way that changes the way we see the world.

And he explains all of this in a very short space, and in this very beautiful book. This is the first time that this essay has been published separately, and it has been done so that the book can be ordered in large quantities and distributed to all interested people.

Bookies Win Battle Over Irish Betting Duty

Governments show the capacity to be rational when the stakes are low enough. The Irish government called off a plan to increase the betting duty from 1% to 2%, noting it would further burden betting companies while providing only a small increase in tax receipts.

Call off a tax because it hurts the collector and does not increase revenues? What is next?

The government's plan to double the betting tax on Irish bookmakers in May was scrapped at the eleventh hour. The Irish Finance Bill 2009, as amended by the Select Committee on Finance and the Public Service, included wording rescinding the proposed hike.

The duty, which the hike would have increased from 1% to 2% under Section 67(1) of the Finance Act 2002, will remain at 1%; the increase has been "deferred indefinitely," having never been enacted nor entered into effect.

Upon review of the sector, the government noted that doubling the tax would only add to the burden on betting companies while providing a nominal increase in tax receipts. Instead, the Department of Finance is to discuss with the sector how to broaden the tax base in a "fair and workable" manner. One possibility it noted in particular was taxing online and telephone betting, which currently provides little revenue.