
Frances Fox Piven’s "Challenging Authority"

Posted on

Thursday, May 15, 2008

Frances Fox Piven’s “Challenging Authority” – by Stephen Lendman

Frances Fox Piven is a Canadian-born Professor of Political Science and Sociology at The Graduate Center, City University of New York (CUNY). Her career is long and distinguished. She’s the recipient of numerous awards, has combined scholarship with activism, and is the author of many important books. Most notable is her 1971 classic “Regulating the Poor: The Functions of Public Welfare,” co-authored with Richard Cloward. It’s a landmark historical and theoretical analysis of how welfare policy is used to control the poor and working class.

A more recent book, her 2006 “Challenging Authority,” is the subject of this review. It’s about how social movements can be pivotal forces for change because ordinary people in large enough numbers have enormous political clout. Abolitionists, labor movements and civil rights activists proved it. Piven examines their collective actions plus one other – the American Revolution – in the four examples she chose.

Piven’s book is succinct and masterful. Howard Zinn calls it a “brilliant analysis of the interplay between popular protest and electoral politics.” Canadian Professor Leo Panitch says the book is “theoretically profound, yet immensely readable,” and sociologist and social movements expert Susan Eckstein describes the book as “quintessentially Piven-esque.” It “eloquently (shows) how ordinary people….have taken it upon themselves to correct injustices.”

Piven’s theme is powerfully relevant at a perilous time in our history. The nation is at war on two fronts, a third one looms, constitutional protections have eroded, social services erased, the country is militarized, dissent repressed, and the government is empowered to crush freedom and defend privilege at the expense of beneficial social change it won’t tolerate.

Introduction

In light of the current situation, Piven’s introductory Thomas Jefferson quote is relevant. It was his response to the repressive 1798 Alien and Sedition Acts. He wrote: “A little patience, and we shall see the reign of witches pass over, their spells dissolve, and the people, recovering their true sight, restore their government to its true principles.” Disruptive social actions have done it in the past, and Piven puts it this way: “ordinary people (have) power….when they rise up in anger and hope, defy the rules….disrupt (state) institutions….propel new issues to the center of political debate….(and force) political leaders (to) stem voter defections by proffering reforms. These are the conditions that produce (America’s) democratic moments.”

Electoral participation alone won’t do it. “In the real American political world, numerous obstacles” remain – structural, legal and practical. Despite liberalization of the process through the years, “large numbers of ostensibly eligible voters” are effectively disenfranchised. Former restrictive laws are gone, but new schemes replaced them – intimidation, misinformation, electoral fraud, and the corrupting power of money in a nation beholden to capital at the expense of the greater good.

Piven cites more as well:

— the power of incumbency,

— the two-party system that shuts out independent and minority interests,

— the construct of the law that empowers the powerful,

— the revolving door between business and government,

— the corrupted dominant media,

— the lack of accountability to voters,

— arbitrary redistricting for political advantage,

— believing markets work best so let them,

— disdaining the harm they cause,

— feeling interfering with market excess is “moral trespass,”

— sacrificing democracy in the pursuit of profit,

— and it all turning the public away from a process they no longer trust.

It shows in declining voter turnout with half or less of the electorate showing up at the polls and many without conviction.

Post-WW II, “most political scientists viewed American democracy with a self-satisfied complacency.” It wasn’t perfect, but it was the best possible at the time. Two decades later, system imperfections were more apparent, and more recently political science professor Robert Dahl said our system is “among the most opaque, complex, confusing, and difficult to understand” to show how badly we fare compared to other democracies.

Inequalities are extreme and growing, and Piven calls it “pernicious.” It breeds “patterns of domination and subservience (and) undermines democratic capabilities.” She quotes political analyst Kevin Phillips saying Washington is “the leading interest-group bazaar of the Western World,” and economist Paul Krugman calling our political system “utterly and perhaps irrevocably corrupted.”

Bad as it now is, Piven says democracy “never worked well in the United States.” Citing the 19th century, she notes how it “was stamped and molded by intense religious and ethnic allegiances (that in turn created a culture of) political parties (at all levels) steeped in patronage.” It was at a time corporate power grew and began to gain advantages that are now commonplace and harmful to the public interest.

Nonetheless, egalitarian reform is possible, and Piven recounts four crucial times when it showed up. Each time, protest movements achieved it by influencing American politics, “if only temporarily.” It’s no surprise that power “flows to those who have more of the things and attributes valued in social life.” But times emerge when “workers or peasants or rioters exercise power,” it’s “distinctive….disruptive or interdependent,” and it happens when conditions are right for it to be actualized.

Piven states the “central question” of her book: “given the power inequalities (in America)” and how it corrupts the political process, “how does egalitarian reform ever occur” at all? It’s only been at times of “disruptive protest movements” with their “distinctive kind of power” Piven calls “disruptive power.”

The Nature of Disruptive Power

First a definition of power in the abstract. Piven notes the “widely held thesis that (it’s) based on control of wealth and force” – big landowners over peasants, rich over poor, armies over civilians, and so forth. However, it’s not always the case, and “history is dotted” with examples of “people without wealth or coercive resources….exercis(ing) power, at least for a time.”

She notes how societies organize through cooperation and interdependence, but disparate interests at times conflict. While workers depend on management for jobs, managers, in turn, need a work force to produce. If labor is withheld, production halts. Both sides have leverage. Either one can activate it. Piven calls the “activation of interdependent power ‘disruption.’ ” It’s a power strategy based on “withdrawing cooperation in social relations.” Protest movements “mobilize disruptive power.” They achieve leverage by breaking down “institutionally regulated cooperation” as in strikes, boycotts or riots.

At these times, ordinary people (potentially) have enormous power – “their ability to disrupt institutionalized cooperation that depends on their continuing contributions.” Key is that great reforms in history have been “responses to the threatened (or use of) disruptive power.” In the US, it achieved representative government, the end of slavery, the right to organize, social welfare and civil rights. Grassroots bottom-up “disruptive power” produced them.

But it takes more than marches, rallies, slogans, shouting or even violence. It’s also too simplistic to think power from below is there for the taking. Actualizing power depends on the ability to withhold cooperation. But it’s not “actionable” until certain problems are solved:

— recognizing interdependence and the potential power from below such as workers withholding their labor or wives their domestic services;

— the necessity of people breaking rules; rules are power strategies; they allow some people to dominate others, establish property rights, become law, and so forth;

— individuals must coordinate their disruptive power for strategic advantage;

— they must overcome constraints of an entire matrix of social relations; examples are the influence of family ties or the threat of religious excommunication;

— disruptive power must be sustained, cooperation withheld, and participants able to withstand whatever reprisals occur; and

— the determination to stay the course in the wake of threats and uncertainty – employers who may hire scabs or relocate their plants and facilities.

New strategies aren’t invented for each challenge. They’re “embedded in memory or culture, in a language of resistance (and) become a ‘repertoire’ (of a) specific constellation of strategies to actualize interdependent power.” New repertoires from below are developed in response to social and economic change. They become “forged in a political process of action and reaction.” Popular struggles change over time, so, for example, food riots became rare and strike actions typical. However, they’re now threatened by weakened labor protections, the growth of temporary workers, and the ability of employers to operate anywhere in the world under WTO rules.

Slowly over time, new repertoires emerge to respond to the conditions of the times. Lessons are learned from defeat, anger and defiance build, and creative imagination invents new solutions to old problems.

The Mob and the State – Disruptive Power and the Construction of American Electoral-Representative Arrangements

Disorderly and defiant crowds or mobs figure prominently in the history of disruptive movements. They played an important role in the Revolutionary War period and years leading up to it. American elites allied with mobs because they grew uneasy about British rule and developed radical ideas about the right of the colonies to self-government. Without mob support, the war with England couldn’t have been won. They provided the troops who fought it.

Most colonists were from England, and by the mid-1700s numbered around 1.6 million. Most had egalitarian ideas and were ordinary people – artisans, apprentices, sailors, laborers, urban poor, farmers, bonded servants, and so forth. They also relied on mob action for results.

In the pre-revolutionary period, “riots and tumults” were commonplace. Bacon’s 1676 Rebellion of discontented frontiersmen and slaves was the first one of note. In the next 100 years, another 18 uprisings erupted (according to Howard Zinn) against colonial governments along with six black rebellions and 40 riots.

Tensions grew as the years passed. They challenged Britain and colonial elites. Inequalities also increased, and they spawned protests against them. One study cited 150 riots in cities and rural areas between 1765 and 1769. In addition, merchants and landowners grew angry with the Crown. In 1763, it sent a standing army to the colonies, introduced new taxes, made demands to billet British troops and to curb colonial assemblies’ power. It introduced the Sugar Act, Tea Act and a new Stamp Act. Colonists resisted and mob action was crucial.

They made Stamp Act enforcement impossible and dumped tea into more than one harbor to prove it, besides the notable December 16, 1773 Boston action. Historian Edward Countryman called it the “final rupture” leading up to war. Those who took up arms wanted popular democracy, and it affected the post-revolutionary drafting of state constitutions. They reflected “egalitarian and libertarian ideas that were spreading up and down the eastern seaboard.” They wanted popular liberty and drafted laws that limited executive powers, established unicameral legislatures or at least powerful lower houses, set short terms of office to force elected officials to face voters more often, and essentially made government accountable to the people.

It alarmed the nation’s elites who, in turn, precipitated efforts to reform the new state constitutions and rein in their democratic excesses. Defeating England unleashed demands from the electorate, and they showed up in popular rebellions. They were fueled by postwar depression, debt, and legislative imposition of poll and property taxes on farmers. They petitioned for relief, got none, so armed mobs closed the courts to stop debtor suits and stave off foreclosure on their farms. Rebellions spread across New England with Daniel Shays leading the most famous one in 1786 and 1787. The rebels were dispersed, but they got amnesty, tax relief, and most imprisoned debtors were released.

Elites were alarmed, excess democracy had to be curbed, and the 1787 Constitutional Convention became the way to do it. There were other problems as well. The Articles of Confederation were unwieldy, had to be replaced, and a new document was needed that would last into “remote futurity” to serve the interests of “the (only) people” who mattered. They were established white male property owning delegates and members of state conventions who rammed the ratification process through in the face of a largely indifferent and uncomprehending populace left out entirely.

The challenge was to offer democratic concessions, create an appearance of democracy, but frame a document for rich property owners in charge of the process for their own self-interest. Only the privileged could vote. Women, blacks, Indians and children couldn’t, and most who qualified didn’t bother. The process, and what it produced, showed that in practice democracy is little more than fantasy, but it wasn’t designed to appear that way.

The “people” got to elect lower house members, while state legislatures chose the senators of the upper chamber. The system stayed that way until the 17th Amendment (ratified in 1913) allowed voters in each state to elect senators directly.

Also proposed was a chief executive, a national judiciary with a Supreme Court, and provisions for admitting new states with republican governments. In addition, the Constitution had procedures for amendments and much more, including terms of office and staggered elections to prevent too many officials being unseated at the same time. In the end, the final product was a bundle of compromises, yielded little of substance to “the people,” and assured power was left to the powerful.

The Constitution’s opening words were “We the people,” but, in fact, they were nowhere in sight. The framers “engineered a conservative counter-revolution….whose purpose….was to thwart the will of the people” in whose name they acted. Government under the new document was created to fill the vacuum left by the defeat of Great Britain. It restored the essential British commercial and financial system and put it under new management. Monarchal wrappings were removed, everything changed, and yet everything, in fact, stayed the same. Rarely, if ever, was there so much rebellion with so little cause, and with so little to show for it.

Consider the Constitution’s crowning achievement, at least so we’re told – the Bill of Rights. Adopting them made the difference in getting the 13 states to ratify the document and make it law. Their protections weren’t for “the people.” They were for the privileged who wanted:

— prohibitions against quartering troops in their property;

— unreasonable searches and seizures there as well;

— the right to have state militias protect them;

— the right to bear arms, but not the way the Second Amendment is today interpreted;

— the rights of free speech, the press, religion, assembly and petition – largely for the monied and propertied interests;

— due process of law with speedy public trials; and

— various other provisions worked out through compromise; two additional amendments were proposed but rejected; Jefferson and Madison wanted them; Adams and Hamilton were opposed; they would have banned monopolies and standing armies; in the end, the first 10 alone were adopted; we never saw what difference the other two might have made.

Piven’s main point isn’t that “constitution-making” limited “popular power.” It’s that “disruptive power challenges (of the time) could not be (entirely) ignored….” The founders established a republican government, popular liberties (to a degree) were conceded, and the idea (if not the reality) of the “consent of the governed” became a fundamental principle of political thought.

Further, in subsequent decades, suffrage expanded, taxpaying requirements replaced property ones, and these, too, were gradually eliminated. By the 1830s, most white men had the right to vote. It’s unlikely these changes would have happened under British rule. So while there was no disagreement on how government was to be run (in John Adams’ words, by “the rich, the well born, and the able”), the mob, according to Piven, “played a large if convoluted role in the construction of a new state with at least some of the elemental features of democracy.”

Dissensus Politics, or the Interaction of Disruptive Challenges with Electoral Politics – The Case of the Abolitionist Movement

Piven defines “dissensus” as a tug of war between the need for political leaders to “mobilize majorities” and “disruptive challengers work(ing) to fragment them.” She also calls this “the key to understanding” disruptive protest power over public policy decisions. Political coalitions are at times fragile and vulnerable. When opposition to consensus surfaces and builds, it can be fractious, disruptive, and an “opening (to get) policy concessions on the (breakaway) movement’s issues.”

Case in point – “Abolitionism.” By one estimate, free blacks numbered around 59,000 in 1790. By the start of the Civil War, the total had increased eightfold to about 488,000. In the run-up to the Revolutionary War, slavery issues were contentious, with hints early on about what later might develop.

In spite of owning slaves himself, Jefferson’s first Declaration of Independence draft included grievances against the Crown’s involvement in trafficking. Southern representatives took issue, the clause was dropped, and to build postwar consensus the South had to be reassured that their slave system would remain intact.

It led to Article 1, Section 2, Clause 3 of the Constitution saying that slaves would be counted as three-fifths of a person for purposes of allocating congressional representation. According to historian Garry Wills: For southern states, this issue was “a nonnegotiable condition for their joining the Union” and with it they got “a large and domineering representation in Congress.”

Consider some other relevant facts:

— large slave owners had disproportionate power; they controlled state legislatures and selected senators;

— most American presidents until the Civil War were southerners and slaveholders (including Washington, Jefferson, Madison, Monroe and Jackson);

— the first US 1790 census reported 757,000 blacks or nearly one-fifth of the total four million population;

— in 1807, Congress outlawed the importation of African slaves after 1808, yet trafficking illegally brought in another 250,000 until 1860;

— enacted slavery provisions were for the North as well as the South; only Pennsylvania and the New England states outlawed the practice; in 1787, most states were slave states, and the new Constitution protected their holdings;

— intersectional planter, commercial, banking and manufacturing interests tied the North and South together; slavery and cotton enriched the South, production boomed, and northern manufacturing also benefitted;

— the human bondage system affected radical abolitionists; they knew that ending slavery meant “overturning” the Constitution;

— to accommodate consensus politics, compromise was preferable to conflict; to protect the South from the majority nonslave North, “balanced” admission of new slave and free states was agreed on as well as a similar arrangement for presidential and vice-presidential tickets;

— nonetheless, compromises were fragile and sectional conflicts arose; one instance was over the Mexican War, annexation of Texas, and disposition of 650,000 square miles of new territory; neither side was satisfied even though compromise was achievable on matters of tariffs, centralized banking, internal improvements, and free western land.

Given the enormous costs of dissolution, why weren’t both sides committed to preventing it? Piven cites “the strident and disruptive abolitionist campaign with its demands for immediate emancipation. Abolitionism fractured….the sectional accord” that held disparate elements together – until 1860.

Who were the abolitionists? According to Howard Zinn, they were “editors, orators, run-away slaves, free Negro militants, and gun-toting preachers.” Together they “shaped….the movement and contributed to its disruptive power.” Its effects fractured intersectional parties, divided the nation, and led to the Civil War and legal emancipation.

“Evangelical revivalists” were committed to reform. They believed slavery was sinful, and would accept nothing less than ending it. In 1831, William Lloyd Garrison founded The Liberator. It became the voice of militant abolitionism. “Garrison was no gradualist.” He refused compromise and demanded “immediate and unconditional emancipation.”

Others were equally committed. They formed antislavery associations, edited papers, spoke publicly, and by 1841 claimed 200,000 members. Religious passion and enlightenment fervor spread throughout the North. In the South, it was opposed by “Southern rights” societies that used the Bible to claim “slavery fulfilled God’s purposes.” It produced schisms and strife, got Garrison paraded through Boston with a rope around his neck, and vigilante welcoming committees awaited northern abolitionists coming south.

Nonetheless, abolitionism grew, congressional antislavery petitions mounted, Congress claimed no authority to act, and thousands of slaves took matters into their own hands. They resisted by “evasion, sabotage, suicide, or running away.” There were also slave revolts – in 1800 in a march on Richmond; in 1811 on a plantation near New Orleans; in 1817 and 1818 in Florida; and in 1831, when Nat Turner and 70 other slaves in Virginia set out to “kill all whites,” sparing no one.

Most disruptive was the Underground Railroad with whites and free blacks involved. It defied federal fugitive slave laws and freed tens of thousands of southern slaves. Abolitionist disruptions “inevitably penetrated electoral politics.” They fragmented both parties, made compromise impossible, and led to the emergence of the Republican Party. It opposed expanding slavery as new states entered the union, and in 1860 got Abraham Lincoln elected president. His platform – containing slavery and condemning threats of disunion as treason.

The South responded. Seven states seceded, Fort Sumter was attacked, the Civil War began, four more slave states joined the others, and Lincoln committed to war to restore the union. As the conflict wore on, its horrific toll drove him toward emancipation. Piven notes that the “insurrectionary role of the slaves….was probably critical to his decision.” During the war, hundreds of thousands of them refused to work, deserted plantations, and crippled the Confederacy’s ability to feed itself. In addition, around 200,000 slaves fought with the North, and their numbers were significant in achieving victory.

Abolitionism grew, southern secession spurred it, and in January 1865 Congress passed the Thirteenth Amendment banning slavery. Nominally, former slaves got more rights from the Fourteenth (due process and equal protection) and Fifteenth (forbidding racial discrimination in voting) Amendments as well as the Civil Rights Act of 1866.

“Abolitionists had triumphed,” and they did it through electoral politics by splitting the parties, yet their victory was limited. Post-emancipation, the movement “melted into the Republican Party,” southern and northern leaders became accommodative, and elites in the South “moved rapidly to restore their control over blacks.” Nonetheless, an impressive victory was won even if only a marginal one, and it would take another century before blacks got any of their constitutional rights.

Movements and Reform in the American Twentieth Century

Throughout American history, disruptive protests were common, yet rarely did any have a “big bang” effect. Decades elapsed between successful abolitionism and New Deal reforms. In the 20th century, Piven notes that almost all important labor, civil rights and social welfare legislation got passed in just two six-year periods – 1933 – 1938 and 1963 – 1968. There was one exception – the 1972 Supplemental Security Income (SSI) for the elderly poor and people with disabilities.

Great Depression hard times spurred important reforms to provide emergency relief:

— the Civil Works Administration (CWA) for work relief; it reached 28 million people (22.2% of the population);

— overall social spending rose from 1.34% of GDP in 1932 to 5% by 1934 and showed that government works for the people when it wants to;

— the 1935 Social Security Act established the framework for all future income support programs – retirement benefits, unemployment, supplemental income, subsidized housing, and all categories of “welfare;”

— most entitlements expanded in the 1960s – old age pensions; unemployment insurance; quadrupling the numbers of women and children receiving Aid to Dependent Children; Medicare; Medicaid; new nutritional programs, including food stamps and school lunches; federal aid to education; and inner-city development through the Model Cities Act of 1966.

Overall, social spending rose from $37 billion to $140 billion in the decade after 1965. By the mid-1970s, poverty levels were down from 20% in 1965 to 11%.

Each period also saw political rights expand. Mass strikes of the early 1930s produced the landmark 1935 National Labor Relations Act (NLRA). For the first time, it gave labor the right to bargain collectively on equal terms with management and provided legal protections to strike actions. The 1938 Fair Labor Standards Act established national minimum wages and maximum hours. These laws advanced worker rights over the next three decades.

In 1964, civil rights actions got the Twenty-Fourth Amendment passed. It prohibited poll taxes in federal elections, and along with the 1964 Civil Rights Act and 1965 Voting Rights Act overrode state and local franchise restrictions that had been in place in the South since Reconstruction. As Piven put it: The 1960s civil rights movement “finally won, a century later, the reforms first announced (but never gotten) in the Fourteenth and Fifteenth Amendments.” In addition, the 1964 Economic Opportunity Act (antipoverty program) provided federal funds for poor communities.

Why these “big bangs” then and not at other times? It’s because they were gotten during periods of “mass disruption” that mobilized “interdependent power from below….” Veterans marched on Washington, rent strikes spread, people commandeered food, labor walkouts occurred, demonstrations demanded relief, so Roosevelt had to act. It wasn’t out of benevolence, and his 1932 platform showed it. It contained the same old 1920s planks that kept Republicans in power throughout the decade. Conditions now changed, disruptive protests demanded help, echoes of the 1917 Russian Revolution were still audible, so Roosevelt acted to save capitalism. He gave a little to save a lot for the privileged who understood the fragility of their position.

The 1960s saw other disruptive protests – this time by a massive black insurgency on one side against white southern “resistance” on the other. It came to a head in the mid-1960s in the form of civil disobedience. It began in the South, spread across the country, resulted in harsh police crackdowns and greater disruptive riots, and they forced the federal government to intervene. Turbulence, social unrest, and a climate of general crisis produced reforms to defuse the disorder of the times.

Electoral forces also played a role, as Piven explains. She calls the “interplay between electoral shifts and political leaders….the most influential explanation of twentieth-century policy change.” Big bangs were “big electoral” ones. Two credible hypotheses explain how they occur:

— the “mobilization” thesis (during hard times) raising the level of voter turnout; new voters are key; they provide impetus for realignment under this theory; and

— the “conversion” thesis (also during hard times) detaching voters from their traditional Republican Party affiliation; here shifting loyalties explain it.

Either way, political leaders respond, striving to win and/or hold voter support, and that is why they enacted social relief measures in the 1930s and 1960s.

More is in play as well, since voters by themselves have little influence over policy. In addition, politicians need broad majorities, and building them takes avoiding conflict, building consensus and making familiar appeals to prosperity, God, country and family. As a result, electoral shifts alone don’t automatically produce bold new initiatives. In fact, they rarely do unless special times produce extraordinary responses. In the 1930s and 1960s, disruptive protests and potential institutional disorder got Roosevelt and Lyndon Johnson to act quite differently than they would have had conditions been normal.

Under the right circumstances, protest movements are powerful and provide the impetus for social reform. “The urgency, solidarity, and militancy that conflict generates lends movements distinctive capacities as political communicators.” At least for a brief time, “marches, rallies, strikes and shutdowns can break the monopoly on political discourse otherwise held by politicians and the mass media.” They can bring vital issues to the fore and get politicians (out of fear) to address them. Potential or actual “voter dissensus is the main source of movement influence on public policy.” It was true in the 1930s, again in the 1960s, and the latter victories inspired other movements for women’s rights, the disabled, gays, lesbians, and so forth.

The Times-In-Between

Unfortunately, disruptive movements are short-lived. After a few years they pass as politicians mount rollback initiatives when the pressure is off and they’re able to do it. New state constitutions stripped away hard-won abolitionist reforms. Labor rights underwent a gradual erosion after peaking in the 1930s. Union membership declined from a postwar high of 34.7%. It was 16.8% after the Reagan era and is around 12% overall today, but only 7.4% in the private sector.

Social gains have also eroded, with Democrats now as much against them as Republicans. Why? It’s because protest movements lose their energy when the reasons causing them subside. Further, it’s because internal movement dynamics are hard to sustain. They wane from exhaustion. Exhilaration can’t last forever. In addition, defiance entails costs and sacrifice. Strikers lose wages. Workers get fired. Plants relocate, and governments support business, sometimes with force.

Protests also fade when gains are won. The gains always fall short, yet they fail to embolden more action. Movement leaders also get co-opted, become more conciliatory to management, get more enmeshed in party politics, and sometimes run for office at federal, state or local levels. Dissensus has its limits. Inevitably, gains come at the expense of concessions, the movement runs out of energy, disruption ebbs, and hard-won reforms get rolled back. Nonetheless, these are glorious times in our history when momentous advances get achieved, and the lesson is that at other times, for other reasons, it can happen again.

People in large numbers and with enough will have enormous power provided they use it. Nonetheless, it’s disconcerting that the Constitution was designed as a conservative document to protect what Michael Parenti calls “a rising bourgeoisie(‘s)” freedom to “invest, speculate, trade, and accumulate,” and to assure that (as John Jay believed) “The people who own the country (ought) to run it.”

After Reconstruction, Abolitionists lost out as well. Southern states regrouped, enacted new laws, and curbed the rights of newly freed blacks. The old planter class was gone but not its mentality. A new capitalist planter class replaced it, many from the North, and it proved easy for them to devise new ways to exploit cheap, vulnerable black labor.

The Supreme Court went along much the way it does today. In a number of decisions, it rolled back civil rights gains, including enough of the Fourteenth Amendment to restore near-total white supremacy in the South. Its 1896 “separate but equal” Plessy ruling added insult to its 1857 Dred Scott support for slavery.

Post-war, blacks were nominally free but light years from equality, and southern states intended to keep it that way. Property tests, poll taxes and literacy qualifications were imposed to enforce disenfranchisement. Jim Crow laws multiplied and lynchings became a way of life. Washington was dismissive.

Labor also lost out in the post-New Deal years. What the NLRA gave, Taft-Hartley and other regressive laws took back. Labor got progressively weaker, its leadership became part of the problem, while business ascended to omnipotence with plenty of friendly governments on its side. Early on, workers hoped the Democratic Party would represent them. How could it in the conservative (anti-labor) South and in the North, where big city bosses ran things? Over time, business took over and effectively created a one-party state with “two right wings,” as Gore Vidal explains.

Post-WW II, Piven notes that America’s economic dominance was unchallenged for 25 years, so business opposition to New Deal gains was largely muted. But once Europe and Japan recovered, they became formidable competitors, profit margins got squeezed, and a conservative counterassault gained momentum to roll back earlier social gains. Piven cites four ways:

— a “war of ideas” beginning in the early 1970s with the formation of a right wing “message machine” – corporate-funded think tanks like Cato, Hoover, Heritage and AEI; they preached cutting social programs, weakening unions, ending costly regulations, military spending, tough law enforcement, privatizing everything, and using the dominant media for propaganda;

— building up a business lobbying capacity; “K Street” became a household term, as did the “revolving door” arrangement between business and government;

— the growth of right wing populism, “rooted in fundamentalist churches” as part of the powerful Christian Right; also pro-life, defense-of-marriage and gun groups, along with others opposed to progressive ideas, racial and sexual liberalism, and the notion that public welfare is a good thing and government ought to provide it; in their best of all possible worlds, markets work best so let them, and democracy is only for the privileged; and

— the effective merging of Republicans and Democrats into one pro-business party with each pretty much vying to outdo or outfox the other; it took Democrat Bill Clinton to “end welfare as we know it,” continue shifting more of the tax burden from the rich to workers, enact tough law enforcement measures, offer big giveaways to business, cut social benefits as much as Republicans, and pretty much make the 1990s a new golden age for Wall Street and the privileged. James Petras calls the decade “the golden age of pillage.”

George Bush then took over and went Clinton one better by whole new measures – declaring open warfare on workers, waging real wars on the world, enacting repressive police state laws, surrendering unconditionally to business, smashing every social service in sight, desecrating the environment, pretty much acting as despotic and vicious as the worst third world dictators, and getting away with it.

Since the early 1970s, and especially since Ronald Reagan, most notable in Piven’s mind is “the striking rise in wealth and income inequality” that economist Paul Krugman calls “unprecedented.” Moreover, “as wealth concentration grows, so does the arrogance and power that it yields to the wealth-holders to continue to bend government policies to their own interests.”

With business so omnipotent, government its handmaiden, the scale of corruption extreme, and the electoral process so flawed, the task of redressing lost social gains is formidable but not impossible.

Epilogue

Given the state of things, Piven poses the essential question – is another “popular upheaval” possible? She calls this “the big question for our time.” Nothing is certain or simple, but historically “hardship propels people to collective defiance,” especially in times of extreme inequalities of wealth. The current American era is the most extreme ever, so how long will people tolerate the decline in their standard of living as the rich grow richer and multi-billions go to wars without end?

How does the Bush administration respond? With a dominant media “message machine” touting an “ownership society,” scaring people into accepting the outlandish and fraudulent “war on terror,” blaming victims for their own misfortune, letting (Christian) faith-based groups take over welfare, preaching that God and markets solve everything, and calling a lack of patriotism the equivalent of treason.

Piven, nonetheless, is hopeful. Independent polls show Bush’s approval at record lows as well as a large majority opposing the Iraq war. In addition, she sees “an intimate connection between what people think is possible in politics and what they think is right.” Popular aspirations tend to rise for what people believe is “evident” and “reach(able).”

So she asks: “What, then, are the prospects for the emergence of new social movements that mobilize disruptive power?” Global justice demonstrations in Seattle and around the world aren’t enough. Much more is needed. Labor must become resurgent, but doing that is no simple matter, and without committed leadership it’s impossible.

Yet it happened in the 1930s at a time of great need, and Piven suggests that “Maybe workers need to see the possibility of worker power again.” Activists and organizers must concentrate on “developing and demonstrating power strategies” for a “new economy” that’s increasingly service-based, high-tech and global.

Millions still live here, their standard of living is declining, business pretty much has it all, and it’s high time that changed. People have power but only if they use it. New times need “new forms of political action, new ‘repertoires’ that extend across borders and tap the chokepoints of new systems of production (and governance)” where they’re most vulnerable to mass disruption.

Piven closes by saying that history shows that “collective defiance” and its subsequent “disruption” have “always been essential to the preservation of democracy.” It’s no different today than it’s ever been, and that’s an idea to build on.

Stephen Lendman lives in Chicago and can be reached at lendmanstephen@sbcglobal.net.

Also visit his blog site at sjlendman.blogspot.com and listen to The Global Research News Hour on RepublicBroadcasting.org Mondays from 11AM to 1PM US Central time for cutting-edge discussions with distinguished guests. Programs are also archived for easy listening.

http://www.globalresearch.ca/index.php?context=va&aid=8924

Monday, May 12, 2008

Disturbing Stirrings – Ratcheting Up For War On Iran

Disturbing Stirrings – Ratcheting Up For War on Iran – by Stephen Lendman

Led by Dick Cheney, Bush administration neocons want war on Iran. So does the Israeli Lobby, but it doesn’t mean they’ll get it. Powerful forces in Washington and the Pentagon are opposed and so far have prevailed. Nonetheless, worrisome recent events increase the possibility and must be closely watched.

Recall George Bush’s January 10, 2007 address to the nation. He announced the 20,000 troop “surge” and more. “Succeeding in Iraq,” he said, “also requires defending its territorial integrity and stabilizing the region in the face of extremist challenges. This begins with addressing Iran and Syria. These two regimes are allowing ‘terrorists’ and ‘insurgents’ to use their territory to move in and out of Iraq. Iran is providing material support for attacks on American troops. We will disrupt (those) attacks….we will seek out and destroy the networks providing advanced weaponry and training to our enemies in Iraq.”

That was then; this is now. On May 3, Andrew Cockburn wrote on CounterPunch: “Six weeks ago, President Bush signed a secret ‘finding’ authorizing a covert offensive against the Iranian regime that, according to those familiar with its contents, (is) ‘unprecedented in its scope.’ ” The directive permits a range of actions across a broad area costing hundreds of millions with an initial $300 million for starters. Elements of the scheme include:

— targeted assassinations;

— funding Iranian opposition groups; among them – Mujahedin-e-Khalq, which the State Department designates a Foreign Terrorist Organization (FTO); Jundullah, the “army of god” militant Sunni group in Iranian Baluchistan; Iranian Kurdish nationalists; and Ahwazi Arabs in southwest Iran;

— destabilizing Syria and Hezbollah; the current Lebanon turbulence raises the stakes;

— putting a hawkish commander in charge; more on that below; and

— kicking off things at the earliest possible time.

Efforts of this type and others were initiated before and likely never stopped. So it remains to be seen what differences emerge this time and how much more intense they become.

More concerns were cited in a Michael Smith May 4 Times Online report headlined “United States is drawing up plans to strike on Iranian insurgency camp.” It refers to a “surgical strike” against an “insurgent training camp.” In spite of hostile signals, however, “the administration has put plans for an attack on Iran’s nuclear facilities on the back burner” after Gates replaced Rumsfeld. The article makes several other key points:

— “American defense chiefs (meaning top generals and admirals) are firmly opposed to (attacking) Iranian nuclear facilities;”

— on the other hand, they very much support hitting one or more “training camps (to) deliver a powerful message to Tehran;”

— in contrast, UK officials downplay Iranian involvement in Iraq even though Tehran’s Revolutionary Guard has close ties to al-Sadr and his Mahdi Army; and

— Bush and Cheney are determined not to hand over “the Iran problem” to a successor.

Earlier on April 7, Haaretz reported still more stirrings. It was about Israel’s “largest-ever emergency drill start(ed) to test the authorities’ preparedness for threats (of) a missile attack on central Israel.” Prime Minister Olmert announced that the “drill (was) no front for Israeli bellicose intentions toward Syria” and by implication Iran. Both countries and Hezbollah see it otherwise and with good reason. Further, Israeli officials indicated that this exercise might be repeated annually because they say Iran may have a nuclear capability by early 2009, so Israel will prepare accordingly.

No one can predict US and Israeli plans, but certain things are known and future possibilities can be assessed. Consider recent events. In mid-March, Dick Cheney toured the Middle East with stops in Israel, the West Bank, Saudi Arabia, Turkey, Oman, Afghanistan and Iraq. It came after Centcom commander Admiral William Fallon “resigned” March 10 (a year after his appointment) following reports that he sharply disagreed with administration policy in the region.

Public comments played it down, but speculation was twofold – Fallon’s criticism of current Iraq policy and his opposition to attacking Iran. Before the March 10 announcement, smart money said he’d be sacked by summer and replaced by someone more hawkish. It came sooner than expected, and, even more worrisome, the replacement is a super-hawk. One with big ambitions, and that’s a bad combination. More on that below.

First, recall another Pentagon sacking last June, officially announced as a “retirement.” George Bush was said to have “reluctantly agreed” to replacing Joint Chiefs Chairman Peter Pace because of his “highest regard” for the general. At issue, of course, was disagreement again over Middle East policy with indications Pace was far from on board. He signaled it on February 17, 2006 at a National Press Club luncheon. Responding to a question, he said: “It is the absolute responsibility of everybody in uniform to disobey an order that is either illegal or immoral.” He later added that commanders should “not obey illegal and immoral orders to use weapons of mass destruction….They cannot commit crimes against humanity.”

These comments and likely private discussions led to Pace’s dismissal. This administration won’t tolerate dissent even by Joint Chiefs Chairmen. It’s clear that officials from any branch of government will be removed or marginalized if they oppose key administration policy. Some go quietly while more notable ones make headlines that omit what’s most important. For one thing, that the Pentagon is rife with dissent over the administration’s Middle East policy.

For another, the law of the land, and there’s nothing more fundamental than that. The administration disdains it so it’s no fit topic for the media. Law Professor Francis Boyle champions it in his classroom, speeches, various writings and books like his newest – Protesting Power: War, Resistance, and Law.

Boyle is an expert. He knows the law and has plenty to cite – the UN Charter; Nuremberg Charter, Judgment and Principles; Convention on the Prevention and Punishment of the Crime of Genocide; Universal Declaration of Human Rights; Hague Regulations; Geneva Conventions; Supreme and lower Court decisions; US Army Field Manual 27-10; the Law of Land Warfare (1956); and US Constitution.

He unequivocally states that every US citizen, including members of the military and all government officials, is duty bound to obey the law and to refuse to carry out orders that violate it. Carrying them out makes them culpable. Included are all international laws and treaties. The Constitution’s supremacy clause (“the supreme law of the land” under Article VI) makes them domestic law. General Pace, Admiral Fallon and others on down aren’t exempt. Neither are the president, the vice-president, administration members or anyone in Congress.

Before Fallon’s sacking, things were heating up. Three US warships (including the USS Cole guided-missile destroyer) were deployed to the Lebanese coast – officially “to show support for regional stability (and over) concern about the situation in Lebanon.” It’s been in political crisis for months, and it’s got Washington and Israel disturbed – because of Hezbollah’s widespread popularity and ability to defend itself.

Any regional US show of force causes concern, especially when more is happening there simultaneously. Russia’s UN Ambassador Vitaly Churkin criticized it, and Hezbollah said it “threat(ened)” regional stability – with good reason. It believes conflict will erupt in northern Occupied Palestine close to the Lebanese border. It’s also preparing to counter Israel’s latest threat – an Israeli Channel 10 News report that the IDF is on high alert “inside and outside Israel” and is prepared to launch a massive attack if Hezbollah retaliates for the assassination of one of its senior leaders, Imad Fayez Mughniyah, by a February 12 Damascus car-bombing.

Then came Cheney’s Middle East tour with likely indications of its purpose – oil, Israeli interests and, of course, isolating Iran, Syria, Hezbollah, Hamas further, and rallying support for more war in a region where Arab states want to end the current ones. What worries them most, or should, is the possibility that Washington will use nuclear weapons. If so, consider the consequences – subsequent radioactive fallout that will contaminate vast regional swaths permanently.

After Cheney left Saudi Arabia, the state-friendly Okaz newspaper reported that the Saudi Shura Council (the kingdom’s elite decision-making body) began formulating “national plans to deal with any sudden nuclear and radioactive hazards that may affect the kingdom” should the Pentagon use nuclear weapons against Iran. It’s a sign Saudi leaders are worried and a clear indication of what they discussed with Cheney.

Saudi, Iranian and other world leaders know the stakes. They’re also familiar with Bush administration strategy and tactics post-9/11.

Exhibit A: the December 2001 Nuclear Posture Review; it states that America has a unilateral right to use first strike nuclear weapons preemptively; it can be for any national security reason, even against non-nuclear states posing no discernible threat;

Exhibit B: the 2002 and hardened 2006 National Security Strategies reaffirm this policy; the latter edition mentions Iran 16 times, stating: “We may face no greater challenge from a single country than Iran;” unstated is that Iran never attacked another nation in its history – after Persia became Iran in 1935; it did defend itself vigorously when attacked by Iraq in 1980;

Exhibit C: post-9/11, the Bush administration scrapped the “nuclear deterrence” option; in his 2005 book “America’s War on Terrorism,” Michel Chossudovsky revealed a secret leaked report to the Los Angeles Times; it stated henceforth nuclear weapons could be used under three conditions:

— “against targets able to withstand non-nuclear attack;

— in retaliation for attack with nuclear, biological or chemical weapons; or

— in the event of surprising military developments;” that can mean anything the administration wants it to or any threats it wishes to invent.

WMD echoes still resonate. Now it’s a nuclearized Iran. Preemptive deterrence is the strategy, and Dick Cheney places the Islamic Republic “right at the top of the list” of world trouble spots. He calls Tehran a “darkening cloud” in the region; claims “obviously, they’re heavily involved in trying to develop nuclear weapons enrichment….to weapons grade levels;” cites fake evidence that Iran’s state policy is “the destruction of Israel;” and official post-9/11 policy identifies Iran and Syria (after Iraq and Afghanistan) as the next phase of “the road map to war.” Removing Hezbollah and Hamas are close behind plus whatever other “rogue elements” are identified;

Exhibit D: former Defense Undersecretary Douglas Feith’s new book, “War and Decision;” in it, he recounts the administration’s aggressive Middle East agenda – to remake the region militarily; plans took shape a few weeks post-9/11 when Donald Rumsfeld made removing Saddam Hussein official policy; the same scheme targeted Afghanistan and proposed regime change in Iran and elsewhere – unnamed but likely Syria, Somalia, Sudan, at the time Libya, removing Syria from Lebanon, and Hezbollah as well.

On the Campaign Trail – Iran in the Crosshairs

John McCain is so hawkish he even scares some in the Pentagon. Here’s what he said about Iran at a May 5 campaign event. He called the Tehran government the gravest danger to US Middle East interests and added: a “league of nations” must counter the “Iranian threat. Iran ‘obviously’ is on the path toward acquiring nuclear weapons. At the end of the day, we cannot allow Iran to have a nuclear weapon. They are not only doing that, they are exporting very lethal devices and explosives into Iraq (and) training people (there as) Jihadists.”

It’s no surprise most Democrats have similar views, especially the leadership and leading presidential contenders. Obama calls Iran “a threat to us all.” For him, a “radical (nuclearized) Muslim theocracy” is unthinkable, and as president he won’t rule out using force. Nor will he against Pakistan or likely any other Muslim state. Obama also calls his support for Israel “unwavering.” He fully endorsed the 2006 Lebanon war, and it’s no secret where Israel stands on Iran and Syria.

Clinton is even more menacing. One writer calls her a “war goddess,” and her rhetoric confirms it. On the one hand, “Israeli security” tops “any American approach to the Middle East….we must not – dare not – waver from this commitment.” She then calls Iran “pro-terrorist, anti-American and anti-Israel.” She says a “nuclear Iran (is) a danger to Israel (and we’ve) lost critical time in dealing” with the situation. “US policy must be clear and unequivocal. We cannot and should not – must not – permit Iran to build or acquire nuclear weapons.”

Worst of all was her comment on ABC’s Good Morning America in response to (a preposterous hypothetical) about Iran “launch(ing) a nuclear attack on Israel.” Her answer: “I want the Iranians to know that if I’m the president, we will attack Iran. And I want them to understand that. We would be able to ‘totally obliterate’ them (meaning, of course, every man, woman and child).” She then added: “I don’t think it’s time to equivocate. (Iran has) to know they would face massive retaliation. That is the only way to rein them in.”

At the same time, she, the other leading candidates, and nearly everyone in Washington ignore Iran’s official policy. The late Ayatollah Khomeini banned nuclear weapons development. Today, Ayatollah Ali Khamenei and President Ahmadinejad affirm that position, but western media won’t report it. They also play down IAEA reports confirming that no evidence shows Iran has a nuclear weapons program or that it’s violating the NPT.

Media Rhetoric Heating Up

It happens repeatedly, then cools down, so what to make of the latest Iran-bashing. Nothing maybe, but who can know. So it’s tea leaves reading time again to pick up clues about potential impending action. Without question, the administration wants regime change, and right wing media keep selling it – Iranian leaders are bad; removing them is good, and what better way than by “shock and awe.”

Take Fouad Ajami for example from his May 5 Wall Street Journal op-ed. It’s headlined – “Iran Must Finally Pay A Price.” He’s a Lebanese-born US academic specializing in Middle East issues. He’s also a well-paid flack for hard right policies, including their belligerency. He shows up often in the Wall Street Journal (and on TV, too) and always to spew hate and lies – his real specialty.

His latest piece is typical. Here’s a sampling that’s indicative of lots else coming out now:

— “three decades of playing cat-and-mouse with American power have emboldened Iran’s rulers;

— why are the mullahs allowed to kill our soldiers with impunity;”

— in Iraq, “Iranians played arsonists and firemen at the same time; (it’s) part of a larger pattern;

— Tehran has wreaked havoc on regional order and peace over the last three decades;”

— earlier, George HW Bush offered an olive branch to Iran’s rulers;

— “Madeleine Albright (apologized) for America’s role in the (1953) coup;”

— all the while, “the clerics have had no interest in any bargain;” their oil wealth gives them great latitude;

— “they have harassed Arab rulers while posing as status quo players at peace with the order of the region;”

— they use regional proxies like “Hezbollah in Lebanon, warlords and militias in Iraq, purveyors of terror for hire;

— the (earlier) hope….that Iran would refrain from (interfering) in Iraq (was) wishful thinking;” now there’s Iran’s nuclear “ambitions” to consider; the “Persian menace” has to “be shown that there is a price for their transgressions.”

Sum it up, and it spells vicious agitprop by an expert at spewing it. He’s not alone. Disputing one of his assertions, a May 5 AFP report quotes Iraq government spokesman Ali al-Dabbagh saying no “hard evidence” shows Iran is backing Shiite militiamen or inciting violence in the country.

Consider the Arab street as well. It’s unconcerned about Iran but outraged over US adventurism. Recall also that on March 2 Iranian President Ahmadinejad became the first Iranian head of state to visit Iraq in three decades. Prime Minister al-Maliki and President Talabani invited him and welcomed him warmly as a friend.

That doesn’t deter The New York Times’ Michael Gordon. He’s taken up where Judith Miller left off, and his May 5 piece is typical. It’s headlined “Hezbollah Trains Iraqis in Iran, Officials Say.” The key words, of course, are “Officials Say” to sell the idea that their saying it makes it so. No dissent is allowed to debunk them or other administration-supportive comments.

This one cites supposed information from “four Shiite militia members who were captured in Iraq late last year and questioned separately.” For Gordon and “Officials (who) Say,” it’s incriminating evidence for what Washington has long charged – “that the Iranians (are) training Iraqi militia fighters in Iran,” and Hezbollah is involved. The Pentagon calls them “special groups.”

Gordon goes on to report that Iran has gotten “less obtrusive (by) bringing small groups of Iraqi Shiite militants to camps in Iran, where they are taught how to do their own training,” American officials say.

Once trained, “the militants then return to Iraq to teach their comrades how to fire rockets and mortars, fight as snipers or assemble explosively formed penetrators, a particularly lethal type of roadside bomb….according to American officials.”

As usual, the “officials” are anonymous and their “information has not been released publicly.” Gordon continues with more of the same, but sum it up and he sounds like Ajami, Judith Miller, and growing numbers of others like them.

On March 17, Fairness & Accuracy in Reporting (FAIR) put out an Action Alert headlined “No Antiwar Voices in NYT ‘Debate.’ ” It referred to The Times March 16 “Week in Review” section on the war’s fifth anniversary featuring nine so-called experts – all chosen for their hawkish credentials. Included were familiar names like Richard Perle, Fred Kagan, Anthony Cordesman, Kenneth Pollack and even Paul Bremer. On May 4, The Times reconvened the same lineup for a repeat performance that would make any state-controlled media proud.

No need to explain their assessment either time, but the NYT op-ed page editor said this on July 31, 2005: The op-ed page (where the above review was published) is “a venue for people with a wide range of perspectives, experiences and talents (to provide) a lively page of clashing opinions, one where as many people as possible have the opportunity to make the best arguments they can.” As long as they don’t conflict with official state policy, offend Times advertisers or potential ones, acknowledge Iran’s decisive role in ending the recent Basra fighting, or mention the (latest) 2007 (US) National Intelligence Estimate that Iran halted its nuclear weapons program in 2003 – even though it’s likely one never existed and doesn’t now.

With Iraq still raging and hawkishness over Iran heating up, it’s disquieting to think what’s coming, and it’s got Middle East leaders uneasy. Not about Iran, about a rogue administration with over eight months left to incinerate the region in a mushroom-shaped cloud and no hesitation about doing it.

Enter the Generalissimo – Initials DP, Ambitions Outsized

Fallon is out, and, in late April, Defense Secretary Robert Gates said David Petraeus is being nominated to replace him as Centcom commander. General Raymond Odierno, Petraeus’s former deputy, will replace him as Iraq chief. New York Times reporter Thom Shanker said these “two commanders (are) most closely associated with President Bush’s current strategy in Iraq,” so they’re on board to pursue it and maybe up the stakes.

Besides being a Latin American expert, James Petras writes extensively on the Middle East and how the Israeli Lobby influences US policy. His 2006 book, “The Power of Israel in the United States,” is must reading to understand it. Petras has a new article on Petraeus. It’s incisive, scary, and unsparing in exposing the generalissimo’s true character, failings, and ambitions.

Competence didn’t make him Iraq commander last year. It came the same way he got each star. In the words of some of his peers – by brown-nosing his way to the top. It made him more than a general. He’s a “brand,” and it got him Time Magazine’s 2007 runner-up slot for Person of the Year.

The media now shower him with praise for his stellar performance in an otherwise dismal war. So do politicians. McCain calls him “one of (our) greatest (ever) generals.” Clinton says he’s “an extraordinary leader and a wonderful advocate for our military.” Obama was less effusive but said he supports his nomination as Centcom chief and added: “I think Petraeus has done a good tactical job in Iraq….It would be stupid of me to ignore what he has to say.” It would also hurt his presidential hopes as the right wing media would bash him mercilessly if he disparaged America’s new war hero with very outsized ambitions and no shyness in pursuing them.

He got off to a flying start after being appointed to the top Iraq job last year. The White House spin machine took over and didn’t let facts interfere with its praise. It described him as aggressive in nature, an innovative thinker on counterinsurgency warfare, a talisman, a white knight, a do-or-die competitive legend, and a man able to turn defeat into victory.

Others like Admiral Fallon had a different assessment, and Petras noted it in his article. Before his removal, Fallon was openly contemptuous of a man who shamelessly supported Israel “in northern Iraq and the Bush ‘Know Nothings’ in charge of Iraq and Iran policy planning.” It got Petraeus his April 16 promotion, and his Senate testimony a week earlier sealed it. He was strikingly bellicose in blaming Iran for US troop deaths. That makes points any time on Capitol Hill, especially in an election year when rhetoric sells and whatever supports war and Israel does it best.

Petras adds that Petraeus had few competitors for the Centcom job because other top candidates wouldn’t stoop the way he does – shamelessly flacking for Israel, the bellicose Bush agenda, and what Petras calls “his slavish adherence to….confrontation with Iran. Blaming Iran for his failed military policies served a double purpose – it covered up his incompetence and it secured the support of” the Senate’s most hawkish (independent) Democrat, Joe Lieberman.

It also served his outsized ambitions that may include a future run for the White House. His calculus seems to be – lie to Congress, hide his failures, blame Iran, support Israel and the Bush agenda unflinchingly, claim he turned Iraq around, say he’ll do it in the region, and make him president and he’ll fix everything.

Neither he nor the media will report how bad things are in Iraq or the toll on its people. They won’t explain the “surge’s” failure to make any progress on the ground. They won’t reveal the weekly US troop death and injury count, which is far higher than reported numbers. By one estimate (including weekly Pentagon wounded updates), it tops 85,000 when the following categories are included:

— “hostile” and “non-hostile” deaths, including from accidents and illness;

— total numbers wounded; and

— many thousands of later discovered casualties, mainly brain traumas from explosions.

Left out of the above figures are future illnesses and deaths from exposure to toxic substances like depleted uranium. It now saturates large areas of Iraq in the soil, air and drinking water. Also omitted is the vast psychological toll. For many, it causes permanent damage, and whole families become victims.

Consider civilian contractor casualties as well. They may be in the thousands. A February Houston Post report noted 1123 US civilian contractor deaths. It left out the number of wounded and any information about foreign workers. They may have been affected most.

Several other reports are played down. One is from the VA about 18 known daily suicides; the true number may be higher. Another comes from Bloomberg.com on May 5 but went unreported on TV news. It cited Thomas Insel, director of the National Institute of Mental Health, on an April 2008 Rand Corporation study. It found that about “18.5% of returning (Iraq and Afghan) US soldiers (are afflicted with) post-traumatic stress disorder (PTSD) or depression, and only half of them receive treatment.”

Much of it shows up later, and many of its victims never recover. A smaller psychiatric association study put the PTSD number at about 32%, and a January 2006 Journal of the American Medical Association study put it even higher – 35% of Iraq vets seeking help for mental health problems. A still earlier 2003 New England Journal of Medicine study reported an astonishing 60% of Iraq and Afghanistan veterans showing PTSD “symptoms.” Most victims said their duty caused it, but over half of them never sought treatment, fearing damage to their careers.

The same Rand study said another 19% have possible traumatic brain injuries ranging from concussions to severe head wounds. About 7% of vets suffer a double hit – both brain injury and PTSD or depression. It’s a wonder numbers aren’t higher as most active duty and National Guard forces serve multiple tours – some as many as six or more in Iraq and Afghanistan combined. Surviving that ordeal in one piece is no small achievement.

Petraeus’ calculus omits these victims and all other war costs abroad and at home. They’re consigned to an over-stuffed memory hole, along with whatever else exposes the facts on the ground or tarnishes his PR-enhanced image.

Petras strips it away and calls him “a disastrous failure” whose record is so poor it takes media magic to remake it. This man will now direct administration Middle East policy. He supports its aims, and if neocon wishes are adopted, it means continued war and occupation of Iraq, stepped up efforts in Afghanistan, and making a hopeless enterprise worse by attacking Iran. No problem for Petraeus if it helps his ambitions. They, of course, demand success, or at least its appearance, the way Petraeus has framed it so far. It remains to be seen what’s ahead, and how long defeat can be called victory.

And one more thing as well. Congress will soon vote on more Iraq-Afghanistan supplemental funding. Bush wants another $108 billion for FY 2008. In hopes a Democrat will be elected president, Congress may add another $70 billion through early FY 2009 for a total $178 billion new war spending (plus the usual pork add-ons) on top of an already bloated Pentagon budget programmed to increase.

It’s got economist Joseph Stiglitz alarmed and has for some time. In his judgment, the Iraq war alone (conservatively) will cost trillions of dollars, far more than his earlier estimates. That’s counting all war-related costs:

— from annual defense spending plus huge supplemental add-ons;

— outsized expenses treating injured and disabled veterans – for the government and families that must bear the burden;

— high energy costs; they’re affected by war but mostly result from blatant market manipulation; it’s not a supply/demand issue; there’s plenty of oil around, but not if you listen to industry flacks citing shortages and other false reasons why prices shot up so high;

— destructive budget and current account deficits; in the short run, they’re stimulative, but sooner or later they matter; they’re consuming the nation, and analysts like Stiglitz and Chalmers Johnson believe they’ll bankrupt us; others do as well like Independent Institute Senior Fellow Robert Higgs who last year outed the nation’s trillion dollar defense budget; in a recent May 7 article, he wrote: “As the US government taxes, spends, borrows, regulates, mismanages, and wastes resources on a scale never before witnessed in the history of mankind, it is digging its own grave;” others believe we’re past the tipping point and it’s too late;

— debts must be serviced; the higher they mount, the greater the cost; they crowd out essential public and private investment; require growing billions for interest payments; damage the dollar; neglect human capital; and harm the country’s stature as an economic leader; the more we eat our seed corn, the greater the long-term damage;

— debts also reduce our manoeuvring room in times of national crisis; limitless money-creation and reckless spending can’t go on forever before inflation debases the currency; that’s a major unreported threat at a time when monetary and fiscal stimulus has shifted financial markets around, and touts now predict we’re out of the woods; they don’t say for how long, what may follow, or how they’ll explain it if they’re wrong;

— add up all quantifiable war costs, and Stiglitz now estimates (conservatively) a $4 – 5 trillion total for America alone; watch for higher figures later; both wars have legs; another may be coming; leading presidential candidates assure us they’re on board and have no objection to out-of-control militarism;

— Stiglitz will be back; his estimate is low; before this ends, look for one of several outcomes – trillions more spent, bankruptcy finally ends it, or the worst of all possible scenarios: an unthinkable nuclear holocaust that (expert Helen Caldicott explains) “could end life on earth as we know it” unless sanity ends the madness.

The generalissimo is unconcerned. He’s planning his future. He envisions the White House, and imagine what then. Like the current occupant and whoever follows, look for more destructive wars to serve his political ambitions and theirs. They fall right in line with the defense establishment, Wall Street, and the Israeli Lobby.

Decades back, could anyone have thought things would come to this? Hopefully, good sense will gain currency and stop this madness before it consumes us.

Stephen Lendman lives in Chicago and can be reached at lendmanstephen@sbcglobal.net.

Also visit his blog site at sjlendman.blogspot.com and listen to The Global Research News Hour on RepublicBroadcasting.org Mondays from 11AM to 1PM for cutting-edge discussions with distinguished guests. Programs are also archived for easy listening.

http://www.globalresearch.ca/index.php?context=va&aid=8924

posted by Steve Lendman
http://sjlendman.blogspot.com/

The real Good Life: An entire village turns against supermarkets and grows its own food

Posted on

Yeah! Now this is the kind of good news I like to see. Hopefully a wave of the future.
K
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
By LUKE SALKELD

Last updated at 17:46 on 14th April 2008

It was a sitcom that inspired many a household to live off the land.

And although it might not attract the likes of Margo and Jerry to move to the area, an entire village is trying its hand at the Good Life.

In a bid to become less dependent on supermarkets, the residents of Martin are working together to become as self-sufficient as possible.

Villagers of Martin, Hants, who have shunned supermarkets to grow their own meat and veg

The Hampshire village is now home to hundreds of real life versions of the characters played by Felicity Kendal and Richard Briers, who lived off the land in the 1970s BBC comedy.

They work on a rota system and raise their own chickens and pigs and grow potatoes, garlic, onions, chillis and green vegetables on eight acres of rented land.

Of the 164 families who live in Martin, 101 have signed up as members of Future Farms for an annual £2 fee, although the produce can be sold to anyone who wants to buy it.

The “community allotment” sells 45 types of vegetables and 100 chickens a week, and is run by a committee which includes a radiologist, a computer programmer and a former probation officer.

In The Good Life, Tom and Barbara (played by Richard Briers and Felicity Kendal) try to live a self-sufficient lifestyle by converting their garden into allotments

Nick Snelgar, 58, who came up with the idea in 2003, said the project was gradually “weaning” villagers off supermarkets.

He said: “I like to think of it as a large allotment in which there are lots of Barbaras and Toms working away.

“There are also Margos as well, but everyone can get involved.

“The nearest supermarket is six miles away. Of course people still have to go there for things like loo roll and deodorant and fruit you can’t grow in Britain.

“So we aren’t boycotting supermarkets entirely but we are gradually weaning people off them and as a result are reducing our carbon footprint by not using carrier bags and packaging.”

Every Saturday the produce is sold at the village hall

The good life: The village of Martin nestles in the Hampshire countryside

Mr Snelgar, a horticulturalist, said the VAT-registered co-operative had grown so much that last year it had a turnover of £27,000 – most of which was ploughed back into the scheme.

He said: “We began with vegetables and we found that all the skills we needed were here in the village.

“After the vegetables we introduced chickens and then pigs and we learned inch by inch.

“We have other producers whose goods we sell and they include a sheep farmer and someone who has honey.

The farm sells 20 pigs a year as well as chickens and lambs and is now starting to sell beef

“It has been a fantastically interesting experience and we now have four plots of land covering eight acres.

“There are 164 families in the village and they include about 300 adults and 100 children, so there are about 400 creatures to feed.”

Every Saturday the community comes together with their produce which is sold at the village hall.

Mr Snelgar added: “The most popular thing we sell is carrots.

“People love the smell of fresh carrots, and we pull them out of the ground the day before we sell them.

“We don’t yet do dairy, but we hope to include that in the future and we also intend to grow raspberries and strawberries.

“We set the prices by working out how much the food costs to produce. We then add 20 per cent.

“Our pork sausages, for example, are sometimes cheaper than sausages you buy in the supermarkets. We break even and all money gets ploughed back in.

“When we started some people thought it would fail and we’d never last, but as the years have gone by more and more people have become involved.

“It is also a talking point in the village and it’s great to see people walking to the village hall on a Saturday morning talking to each other. It has created a sense of belonging.”

Source: The Daily Mail

Post by way of: http://whatreallyhappened.com/