
AlterNet

We need progressive religion

Often, religion offers much that progressives need to build movements for change

This article originally appeared on AlterNet.

One of the great historical strengths of the progressive movement has been its resolute commitment to the separation of church and state. As progressives, we don’t want our government influenced by anybody’s religious laws. Instead of superstition and mob id, we prefer to have real science, based in real data and real evidence, guiding public policy. Instead of holy wars, othering, and social repression — the inevitable by-products of theocracy — we think that drawing from the widest possible range of philosophical traditions makes America smarter, stronger, and more durable over time.

That said: while we all want a government free of religion, there are good reasons that we may not want our own progressive movement to be shorn of every last spiritual impulse. In fact, the history of the progressive movement has shown us, over and over, that there are things that the spiritual community brings to political movements that are essential for success, and can’t easily be replaced with anything else.

Religion has been central to the formation of human communities — and to how we approach the future — for as long as homo sapiens has been around. Apart from God-belief (which varies widely between religions), all successful religions thrive and endure because they offer their adherents a variety of effective community-building, social activism, and change management tools that, taken together, make religion quite possibly the most powerful social change technology humans have ever developed.

What does religion offer that progressives need to make our movement work?

First: there’s nothing like it if you want to bond a bunch of very diverse people into a tight community of shared meaning and value. A religious congregation brings together people of all ages, backgrounds, educational levels, professional rank, and life circumstances, and melds them into an enduring tribe that’s centered around a shared commitment to mutual trust and care, and (most importantly) has a clear and vivid shared vision of the future they’re trying to create.

There is simply no other organizational form that encourages people to share their time, energy, and resources so quickly, completely, or enduringly; or aligns so much conviction toward the same goal. (This is why the leaders of corporations, the marketers of sports teams, and the military all study religious cultures, and try to appropriate their tribe-building techniques for their own purposes.) The resulting tribes can last for many centuries — and acquire a resounding moral voice that can reverberate throughout their larger communities, and well beyond. If you want to change the world, this is the kind of group — deeply bound by faith, trust, love, history, and a commitment to each other and to the world they envision that transcends life and death — that’s most likely to get it done. Religion is the best way going to get people to consecrate themselves, body and soul, to a larger cause; and to take on the kind of all-or-nothing risks that are often required to really change the world.

Second, religious narratives center people in the long arc of history, telling them where they came from, who they are, what they are capable of, and what kind of future is possible. History does this, too; but religion does it at a deeper, mythic level that gives these stories extra emotional and cognitive resonance. For most of human history, in fact, the task of imagining a different future and giving people the inspiration and courage to reach for it has been the primary role of religious prophets. (So has the job of warning the people that they’re wandering into grave error or betraying their own values, and must change their ways or face disaster.) Religion is the native home of the prophetic voice — the voice that calls people to transformative change. Throughout America’s history, our most evocative political prophets — both Roosevelts, all the Kennedys, Martin Luther King, Cesar Chavez, Van Jones, Barack Obama — have invariably been people who spent a lot of time in the pews, learning to speak the kind of language that calls us to a better place.

Third, over the course of American history, liberal religious faiths have been the primary promoters of progressive values throughout the culture — and also the leading institutions when it came time to inculcate our progressive sensibilities into the next generation. Many, if not most, progressives in America are progressive specifically because they believe that this is what their faith demands of them. They’re raising their kids in churches and temples because they believe, as the Bible says, that “if you train up a child in the way that he should go, when he is old, he will not depart from it.”

Liberal congregations have etched our values onto the young souls of tens of millions of American progressives, over three centuries and dozens of generations. Do we really want to try to do without them now?

Fourth, progressive religion has always been America’s most credible and aggressive front-line defender of non-market-based values against the onslaught of capitalism and greed. In recent years, as the “free-market” fetishists took over (and gulled American Evangelicals into shilling for their hellish utilitarianism), our liberal faith communities — mainline Protestants and liberal Catholics, Jews and Quakers, Unitarian Universalists and the rising wave of reformist Muslims — have become the strongest cultural forces left with the moral authority to insist that we have a duty to the poor, that democracy cannot survive without a commitment to justice, and that compassion is always a better survival strategy than competition.

The market says: Everything and everybody has a price, and is for sale. Faith says: The most valuable things in our lives — good health, safe food, strong families, a clean environment, a just economy, meaningful work, access to opportunity — are beyond price, and should by right be available to us all. Our faith communities (especially, but not always exclusively, the progressive ones) have always held this light up within our culture, and it’s never been needed more than it’s needed right now.

Fifth, in a nation where over 90% of everybody has some kind of God-belief – and the overwhelming majority of them ground their political decisions in that belief — abandoning the entire landscape of faith to the right wing amounts to political malpractice. For most Americans, our religious worldviews are the epistemological soil in which every other decision we make is rooted — the basic model of reality that we use to navigate the world. When we stopped engaging people’s basic model of moral order, we effectively ceded the entire moral landscape of the nation to our enemies. It was, in retrospect, perhaps the most self-destructive error we’ve made over the past 40 years (and that’s saying something).

To our credit, a lot of our best organizers and activists are starting to realize the magnitude of this mistake. We’re paying a lot more attention these days to learning to clearly articulate progressive values, to express ourselves in explicitly moral language, and to put forward more strongly progressive frames, narratives, and future visions to counter the bankrupt conservative worldview that’s brought us to this sorry place in history.

But while we’re working toward some new understandings here, let’s also remember that the right wing’s success in taking this field was rooted directly in their ability to mobilize conservative churches to carry the moral banner forward into the culture for them. If we’re going to overwrite their brutal and anti-democratic story of how the world works, the most important step we can take is to tap into the vast reach and deep moral authority of our remaining progressive faith communities, and amplify their voices every way we can. Churches and temples have always been the first and most natural places Americans turn when it’s time to have serious cultural conversations about value and meaning and the future they desire. If we’re serious about changing the national story and bending the future in our preferred direction, then that’s where we need to be.

Sixth: Progressive faiths, across the board, promote the essential belief that human communities are, in themselves, inherently and intrinsically sacred. In fact, progressive atheists may be surprised to learn that among their more religious brothers and sisters, there’s very little agreement about the nature of God — but a very strong consensus that the act of radical community-making is the most intensely holy and essential work that they do.

If there is a God (and progressives of faith debate that question endlessly), then we might most reliably see the face of that divinity in that permanent circle of friends with whom we celebrate life’s passages and joys, and wrestle with its hardest challenges — the people whom we trust to stand with us no matter what comes, and who will work with us tirelessly toward our shared vision of a better world. It’s this deep faith in the dream of the beloved community that also feeds our faith in the potential of good government, and our confidence in the unleashed potential of the American people. (And furthermore: I don’t think I’ve ever met a progressive atheist who would disagree on this point.)

Across all the long centuries of the American progressive movement, we’ve never launched a successful change wave that didn’t draw most of its leadership, its base, and its moral grounding from the country’s deep liberal religious tradition.

Our churches and temples have been the fountain, the rock, the mother source of our movement from the very beginning. Progressives of faith have always played a central role in our political victories in the past. It’s time to stop imagining that somehow, we’re going to take the country back without them now.


Sara Robinson is a trained social futurist and the editor of AlterNet's Vision page.


Capitalism’s great heist

Elites say inequality encourages the rich to invest and the creative to invent. This works out well for the dogs of the 1%


Protestors affiliated with the “Occupy Wall Street” protests chant outside 740 Park Avenue, home to billionaire David Koch, in New York (Credit: AP Photo/Andrew Burton)
This article originally appeared on AlterNet.

Editor’s Note: When harmful beliefs plague a population, you can bet that the 1% is benefiting. This article is the first in a new AlterNet series, “Capitalism Unmasked,” edited by Lynn Parramore and produced in partnership with author Douglas Smith and Econ4 to expose the myths and lies of unbridled capitalism and show the way to a better future. 

Summer 2009. Unemployment is soaring. Across America, millions of terrified people are facing foreclosure and getting kicked to the curb. Meanwhile in sunny California, the hotel-heiress Paris Hilton is investing $350,000 of her $100 million fortune in a two-story house for her dogs. A Pepto Bismol-colored replica of Paris’ own Beverly Hills home, the backyard doghouse provides her precious pooches with two floors of luxury living, complete with abundant closet space and central air.

By the standards of America’s rich these days, Paris’ dogs are roughing it. In a 2006 article, Vanity Fair’s Nina Munk described the luxe residences of America’s new financial elite. Compared with the 2,405 square feet of the average new American home, the abodes of Greenwich, Connecticut, hedge-fund managers clock in at 15,000 square feet, about the size of a typical industrial warehouse. Many come with pool houses of over 3,000 square feet.

Steven Cohen of SAC Capital is a typical product of the New Gilded Age. He paid $14.8 million for his Greenwich home, which he stuffed with a personal art collection that boasts Van Gogh’s Peasant Woman Against a Background of Wheat (priced at $100 million); Gauguin’s Bathers ($50 million); a Jackson Pollock drip painting (also $50 million); and Andy Warhol’s Superman ($75 million). Not satisfied, Cohen spent millions renovating and expanding, adding a massage room, exercise and media rooms, a full-size indoor basketball court, an enclosed swimming pool, a hairdressing salon, and a 6,734-square-foot ice-skating rink. The rink, of course, needs a Zamboni ice-resurfacer which Cohen houses in a 720-square-foot shingle cottage. Munk quotes a visitor to the estate who assured her, “You’d be happy to live in the Zamboni house.”

So would some of the over 650,000 Americans sleeping in shelters or under highway overpasses.

By the time it was finished, Cohen’s house had swelled to 32,000 square feet, the size of the Taj Mahal. Even at Taj prices, cost mattered little to a man whose net worth is estimated by the Wall Street Journal at $8 billion — with an income in 2010 of over $1 billion. Cohen’s payday is impressive, but by no means unique. In 2005, the top 25 hedge-fund managers averaged $363 million. In cash. Paul Krugman observes that these 25 were paid three times as much as New York City’s 80,000 public school teachers combined. And because their pay is taxed as capital gains rather than salary, the teachers paid a higher tax rate!

Back in the 19th century, Alexis de Tocqueville called America the “best poor man’s country.” He believed that “equality of conditions” was the basic fact of life for Americans. How far we’ve come! Since then, the main benefits of economic growth have gone to the wealthy, including the Robber Barons of the Gilded Age whom Theodore Roosevelt condemned as “malefactors of great wealth” living at the expense of working people. By the 1920s, a fifth of American income and wealth went to the richest 1 percent, whose Newport mansions were that period’s Greenwich homes. President Franklin Roosevelt blamed these “economic royalists” for the crash of ’29. Their recklessness had undermined the stability of banks and other financial institutions, and the gross misdistribution of income reduced effective demand for products and employment by limiting the purchasing power of the great bulk of the population.

Roosevelt’s New Deal sought to address these concerns with measures to restrain financial speculation and to redistribute wealth down the economic ladder. The Glass-Steagall Act and the Securities Act restricted the activities of banks and securities traders. The National Labor Relations Act (the “Wagner Act”) helped prevent business depression by strengthening unions to raise wages and increase purchasing power. Other measures sought to spread the wealth in order to promote purchasing power, including the Social Security Act, with retirement pensions, aid to families with dependent children, and unemployment insurance; the Fair Labor Standards Act, setting a national minimum wage and maximum hours; and tax reforms that lowered taxes on workers while raising them on estates, corporations and the wealthy. And the kicker: through pronouncement and the Employment Act of 1946, the New Deal committed the U.S. to maintaining full employment.

The New Deal reversed the flow of income and wealth to the rich. For 25 years after World War II, strong labor unions and government policy committed to raising the income of the great majority ensured that all Americans benefited from our country’s rising productivity and increasing income.

Advocates of laissez faire economics warned that we would pay for egalitarian policies with slower economic growth because we need inequality to encourage the rich to invest and the creative to invent. But the high costs of inequality in reduced social cooperation and wasted human capital point to the giant flaws in this view. A more egalitarian income distribution provides better incentives for investment, and our economy functions much better when people can afford to buy goods and services.

The New Deal ushered in a period of unusually rapid and steady economic growth, with the greatest gains going to the poor and the middle class. Strong unions ensured that wages rose with productivity, while government tax and spending policies helped to share the benefits of growth with the poor, the retired and the disabled. From 1947-’73, the bottom 90 percent received over two-thirds of economic growth.

Then, the political coalition behind the New Deal fragmented in the 1960s. Opponents seized the moment and reversed its policies. They began to funnel income toward the rich. With a policy agenda loosely characterized as “neoliberalism,” conservatives (including much of the economics profession) have swept away the New Deal’s focus on employment and economic equity to concentrate economic policy on fighting inflation by strengthening capital against labor. That has worked out very badly for most of America.

The GOP has led the attack on Roosevelt’s legacy, but there has been surprising bipartisan support. President Carter got the ball rolling with his endorsement of supply-side taxation and his commitment to fight inflation by promoting labor market competition and raising unemployment. Carter’s policies worked to reverse the New Deal’s tilt toward labor and higher wages. Under his watch, transportation and telecommunications were deregulated, which undermined unions and the practice of industry-wide solidarity bargaining. Carter also campaigned to lower trade barriers and to open our markets to foreign trade. These policies were presented as curbs on monopolistic behavior, but the effect was to weaken labor unions and drive down wages by allowing business to relocate production to employ lower-wage foreign workers while still selling in the American market.

Carter also began a fatal reversal of economic policy by refusing to support the Humphrey-Hawkins Full Employment Act. Instead of pushing for full employment, Carter appointed Paul Volcker to chair the Federal Reserve with the charge to use monetary policy to restrain inflation without regard for the effect on unemployment. Since then, inflation rates have been brought down dramatically, but unemployment has been higher and the growth rate in national income and in wages has slowed dramatically compared with the New Deal era.

Already in the 1970s, a rising tide of anti-union activities by employers led Douglas Fraser, the head of the United Auto Workers, to accuse employers of waging a “one-sided class war against working people, the unemployed, the poor, the minorities, the very young and the very old, and even many in the middle class of our society.” Organized labor’s attempt to fight back with labor reform legislation amending the Wagner Act found little support in the Carter White House and went down to defeat in the Democratic-controlled Senate.

Any residual commitment to collective bargaining under the Wagner Act was abandoned during the administration of Ronald Reagan, who, ironically, was the only union president ever elected to the White House. Reagan, of course, is known as the president who fired striking air traffic controllers in 1981. He is also known for the devastating regulatory changes during his presidency and those of his Republican successors (the two Presidents Bush). Their appointments to the National Labor Relations Board helped to turn this agency from one charged with promoting union organization and collective bargaining to one charged with ensuring that employers were free to avoid unions. Under this new regime, private sector unionism, the unions covered by the Wagner Act, has almost disappeared.

The 1970s also saw a shift in tax policy away from the principles of ability-to-pay and income redistribution toward those associated with supply-side economists who argued for lower taxes on the rich to provide incentives to accumulate wealth. After campaigning for tax reform, Carter signed the Revenue Act of 1978, which gave small tax benefits for working people and dramatic cuts in capital gains and corporate taxes and on the top marginal rates. Since then, major reductions on taxes paid by the rich enacted under Presidents Reagan and George W. Bush have dramatically reduced the tax burden on the richest Americans.

Government spending policies have also turned away from ordinary Americans. In 1996, under President Bill Clinton, a vital piece of the New Deal safety net was repealed with the “Personal Responsibility and Work Opportunity Reconciliation Act.” Abolishing the provisions of the Social Security Act that established the program of Aid to Families with Dependent Children, the 1996 law ended the national right to relief. Along with restrictions on unemployment insurance, the abolition of programs of public jobs for the unemployed and gradual reductions in the real value of Social Security benefits, this act was another blow for working people.

The New Deal showed us how to combine economic growth and lower levels of unemployment. But the widening gap between rich and poor since the 1970s has been associated with higher levels of unemployment and a slowing of economic growth. Had economic growth rates continued after 1978 at the same rate as during the decades before, average income would have been more than $14,000 higher than it actually was in 2008.

The slowdown in growth since the abandonment of egalitarian New Deal policies has cost Americans about 30 percent of their income. And the massive redistribution of income away from average Americans and toward the rich has destroyed the sense that America is a land of opportunity for all. Quality of life has plunged because the shredding of social protections has exposed average Americans to much higher levels of risk. The substitution of defined contribution pensions, such as Individual Retirement Accounts or 401K plans, for defined benefit pensions has reduced retirement security for individuals while reducing the risk borne by employers or other social institutions. Just as important as declining income for many Americans, the stress and anxiety associated with the risk shift has contributed to rising levels of depression and morbidity and a decline in life expectancy for Americans compared with residents of other countries.

Workers’ security has been abandoned. But the government has let financial markets run wild. In 1982, Congress deregulated the thrift industry, freeing thrifts to engage in reckless and fraudulent behavior. In 1994, it removed restrictions on interstate banking. In 1998 it allowed Citigroup to merge with Travelers’ Insurance to create the world’s largest financial services company. And in the Gramm-Leach-Bliley Act of 1999, it repealed the remaining Glass-Steagall barriers between commercial and investment banking. Acting with the virtual consent of Congress and the president, in 2004, the Securities and Exchange Commission established a system of voluntary regulation that in essence allowed investment banks to set their own capital and leverage standards.

By then our financial regulatory system had largely returned to the pre-New Deal situation in which we trusted financial institutions to self-police. Advocates of deregulation, like Federal Reserve chair Alan Greenspan, were unconcerned because they expected banks and other financial firms to limit their risk for fear of failure. Either they misunderstood the incentives facing company managers, or they did not care. In practice, financiers are playing with other people’s money (ours). When they do well, their compensation is tied to profits and they can earn huge sums. But when their investments fail, they are protected because monetary authorities and the United States Treasury cannot allow “too big to fail” financial companies to go bust. So long as risky investments have periods of high returns, the managers of deregulated financial firms have an incentive to increase their risk, profiting from success while passing the costs of failure to the public. We have all been suffering from the consequences of their failures since the financial crisis of 2007-’08.

The share of income going to the top 1 percent has doubled since the 1970s, returning to the levels of the 1920s. The greatest gains have gone to the very wealthiest and to executives and managers, especially of financial firms. From 1973 to 2008, the average income of the bottom 90 percent of American households fell even while the rich gained. The wealthiest 1 percent gained 144 percent or over $600,000 per household; and the richest 1 percent of the 1 percent, barely 30,000 people, gained over 455 percent or over $19,000,000.

That’s enough to buy a nice doghouse. Or a mansion in Greenwich.

Gerald Friedman teaches economics at the University of Massachusetts, Amherst. He is the author, most recently, of “Reigniting the Labor Movement” (Routledge, 2007).


Climate change: ‘This is just the beginning’

The media should not continue to ignore the essential link between extreme weather and climate change


A helicopter fills a 600-gallon bucket with water from a local reservoir in Alpine, Utah, Wednesday, July 4, 2012. (Credit: AP Photo/The Salt Lake Tribune, Leah Hogsten)
This article originally appeared on AlterNet.

Evidence supporting the existence of climate change is pummeling the United States this summer, from the mountain wildfires of Colorado to the recent “derecho” storm that left at least 23 dead and 1.4 million people without power from Illinois to Virginia. The phrase “extreme weather” flashes across television screens from coast to coast, but its connection to climate change is consistently ignored, if not outright mocked. If our news media, including — or especially — the meteorologists, continue to ignore the essential link between extreme weather and climate change, then we as a nation, the greatest per capita polluters on the planet, may not act in time to avert even greater catastrophe.

More than 2,000 heat records were broken last week around the United States. The National Oceanic and Atmospheric Administration (NOAA), the government agency that tracks the data, reported that the spring of 2012 “marked the largest temperature departure from average of any season on record for the contiguous United States.” These record temperatures in May, NOAA says, “have been so dramatically different that they establish a new ‘neighborhood’ apart from the historical year-to-date temperatures.”

In Colorado, at least seven major wildfires are burning at the time of this writing. The Waldo Canyon fire in Colorado Springs destroyed 347 homes and killed at least two people. The High Park fire farther north burned 259 homes and killed one. While officially “contained” now, that fire won’t go out, according to Colorado’s Office of Emergency Management, until an “act of nature such as prolonged rain or snowfall.” The “derecho” storm system is another example. “Derecho” is Spanish for “straight ahead,” and that is what the storm did, forming near Chicago and blasting east, leaving a trail of death, destruction and downed power lines.

Add drought to fire and violent thunderstorms. According to Dr. Jeff Masters, one of the few meteorologists who frequently makes the connection between extreme weather and climate change, “across the entire continental U.S., 72 percent of the land area was classified as being in dry or drought conditions” last week. “We’re going to be seeing a lot more weather like this, a lot more impacts like we’re seeing from this series of heat waves, fires and storms … This is just the beginning.”

Fortunately, we might be seeing a lot more of Jeff Masters, too. He was a co-founder of the popular weather website Weather Underground in 1995. Just this week he announced that the site had been purchased by the Weather Channel, perhaps the largest single purveyor of extreme weather reports. Masters promises the same focus on his blog, which he hopes will reach the much larger Weather Channel audience. He and others are needed to counter the drumbeat denial of the significance of human-induced climate change, of the sort delivered by CNN’s charismatic weatherman Rob Marciano. In 2007, a British judge was considering banning Al Gore’s movie “An Inconvenient Truth” from schools in England. After CNN reported the story, Marciano said on air, “Finally. Finally … you know, the Oscars, they give out awards for fictional films, as well … Global warming does not conclusively cause stronger hurricanes like we’ve seen.” Masters responded to that characteristic clip by telling me, “Our TV meteorologists are missing a big opportunity here to educate and tell the population what is likely to happen.”

Beyond the borders of wealthy countries like the United States, in developing countries where most people in the world live, the impacts of climate change are much more deadly, from the growing desertification of Africa to the threats of rising sea levels and the submersion of small island nations.

The U.S. news media have a critical role to play in educating the public about climate change. Imagine if just half the times that they flash “Extreme Weather” across our TV screens, they alternated with “Global Warming.” This Independence Day holiday week might just be the beginning of people demanding the push to wean ourselves off fossil fuels and to pursue a sane course toward sustainable energy independence.

Denis Moynihan contributed research to this column.

Amy Goodman is the host of the nationally syndicated radio news program, Democracy Now!

Local food isn’t bad

A critique proves that models used in neoliberal economics do not accurately apply to food and agriculture


This article originally appeared on AlterNet.

A physicist, a chemist and an economist are stranded on an island, with nothing to eat. A can of soup washes ashore. The physicist says, “Let’s smash the can open with a rock.” The chemist says, “Let’s build a fire and heat the can first.” The economist says, “Let’s assume that we have a can-opener…”

Economists all know this joke, which “comes from the stereotype that many economic models require unrealistic or absurd assumptions in order to obtain results.” And yet, how many heed its warning?

A new book, “The Locavore’s Dilemma: In Praise of the 10,000-Mile Diet” by Pierre Desrochers and Hiroko Shimizu, uses arguments from neoliberal economics to explain why those who advocate eating local food are wrong. Often, their arguments require assumptions as silly as the one in the joke. For example, in making the case that the world moved from a diet of local food to a global food system for a good reason (and therefore we should not return to eating local), they assume that modern locavores will face the same technological limitations as our ancestors, who were also locavores. But aside from the numerous strawman arguments found throughout the book, there are several points where economics is properly applied to food and agriculture and – the authors charge – proves that local food is a bad idea.

But perhaps the opposite is true: the models used in neoliberal economics do not accurately apply here. Here are six economic principles that do not fit when it comes to food and agriculture:

1. Assume the Players Are Rational: In economics, one assumes all of the players are rational. When it comes to food, we are far from it. For example, frozen dinners had been introduced to supermarkets unsuccessfully before the TV dinner came along. The TV dinner succeeded because Americans were excited about TVs. When the Pepsi Challenge showed that Americans prefer the flavor of Pepsi to Coke, Coca-Cola took the bait and introduced New Coke, which tested better than Coke and Pepsi in taste tests. Turned out, consumers don’t drink Coke because of the flavor. They drink it because it’s “American” and “fun.” Coke learned its lesson, and its slogans have reflected it ever since (“Open happiness”).

2. Standardization of Food: Much of economic theory rests on the assumption that the goods in question are commodities. Our food is standardized so that it can be treated as a commodity. One Granny Smith apple is the same as any other Granny Smith apple, no matter where it’s from or how it was produced. But many foods are not so interchangeable, and indeed, when they are standardized, they often become standardly bad.

Take the strawberry. There’s nothing like the flavor of a fresh-picked, juicy strawberry. But if you pay $3.99 a pint for strawberries in your supermarket in January, the berries you buy will hardly even give a hint of flavor. They’ll be big and red, but who cares if the berry is perfectly red if it doesn’t deliver on strawberry flavor?

The market can deliver on the idea of year-round strawberries, but it cannot deliver on the delectable flavor one associates with those berries. Truly ripe strawberries are highly perishable, so they can’t be too ripe when picked. And strawberry flavor deteriorates when the berry goes in the fridge, but there’s no way to transport perishable berries across the country without refrigeration. Even in California, where nature provides fresh, local berries for about half the year, the flavor changes throughout the season. Early season berries aren’t very sweet or flavorful, unfortunately. It takes until May or June to get perfect, sweet, juicy strawberries. That goodness is fleeting and ephemeral, and it only comes once a year.

In addition to flavor, foods are not identical in terms of nutrition. It’s likely that supermarket eggs are standardized, all with roughly the same nutrition content. They were all produced in nearly identical conditions from genetically identical birds who were fed exactly the same feed. But when chickens can roam freely, eating grass and bugs in addition to chicken feed, their eggs become more nutritious. It would be extremely difficult to produce eggs this way on a large scale and do so profitably. But it’s easy and fun to do for homeowners with small backyard flocks.

Another factor is genetics. It’s certainly convenient to produce genetically identical food for the market, especially if you find the perfect combination of genetics to give you a high yield and great taste with disease and pest resistance. But what happens when Mother Nature throws a curve ball at you and a disease comes along that your crop has no resistance to? You need new genes. Sure, scientists can keep a supply of biodiversity in a seed vault and that might provide the new genes you’re looking for. But for genes that really keep up with the current challenges of nature, you need genetic diversity grown in nature, under conditions that change over time.

When governments signed NAFTA, they assumed that corn is corn is corn, and the U.S. produces corn cheaper than Mexican peasants are able to produce it. But those Mexican peasants are the guardians of the world’s most valuable supply of corn genetics. And in that sense, they can hardly be compared to an Iowa corn farmer who buys seeds each year from DuPont or Monsanto.

3. Creative Destruction: Economist Joseph Schumpeter spoke of creative destruction, explaining how new inventions will disrupt old, established businesses. When automobiles were introduced, carmakers and their employees succeeded as horse-drawn buggy makers went out of business and their employees lost their jobs. Therefore, under this line of thinking, it makes perfect sense for Florida to grow oranges until the global market some day decides to source oranges from Brazil instead. There’s no need for Florida to diversify its crops, not while Florida oranges are in demand. They can worry about switching to a more profitable crop (or perhaps, switch from farming to tourism) when the time comes, just like cassette tape makers in the 1980s did not stop making cassette tapes because some day CDs would become popular and put them out of business.

The devil here is in the details. Desrochers and Shimizu specifically say that “profitable monocultures should be pursued as long as they remain viable in a particular location.” So California should provide all of the nation’s almonds and strawberries, while the entire midwestern United States is blanketed in corn and soybeans. But it is nearly impossible to grow monocultures year after year in the same field without running into pest and fertility problems.

Right now some experts predict the downfall of the Cavendish banana, the variety of banana sold in U.S. supermarkets, due to a fungal disease. Enormous fields of genetically identical bananas make easy prey for pests and diseases. Accepting large-scale monoculture means accepting lots of pesticide use. When one crunches the numbers, the math might work out because revenues outweigh the extra money spent on pesticides, but the equation leaves out the impact those pesticides have on farmworkers, eaters and the environment.

4. Comparative Advantage: The idea expressed above goes hand in hand with the principle of comparative advantage. The idea is simple. If Idaho can produce potatoes cheaper than California can, and California can produce strawberries cheaper than Idaho can, then Idaho should grow all of the potatoes and California should grow all of the strawberries, and they should trade. To some extent, this makes sense. No one is suggesting that Mexico attempt to produce its own maple syrup or that Vermont should try to grow its own pineapple. But relying on large-scale monoculture as suggested by the notion that California should supply the nation with strawberries runs into the need for toxic agrochemicals.

Until recently, California was set to approve a potent carcinogen to fumigate strawberry fields. This chemical, methyl iodide, was to replace its predecessor, methyl bromide, which was phased out globally because it harmed the ozone layer. Should Californians be exposed to a carcinogen just so a few strawberry growers can get rich and the rest of the country can eat cheap (but flavorless) strawberries year-round? Or should midwesterners drink tapwater laced with the herbicide atrazine? When confronted with this question, Desrochers dismissed the idea that agrochemicals are harmful, pointing to the increasing human lifespan in recent times.

5. Legal System: In a recent case Nicaraguan farmworkers brought against Dole, the workers sued over health problems caused by a pesticide called Nemagon used on the Dole Plantations. Nemagon has been banned in the U.S. since 1979. Initially, the farmworkers won their case, but it was overturned in an appeal after Dole found 27 secret witnesses who testified that the plaintiffs were fraudulent. After Dole’s victory, several of the secret witnesses came forward and recanted their testimony, saying they testified because Dole offered them bribe money — and then never even paid them the bribe money.

Most recently, Dole settled out of court with the farmworkers. Economists might assume we can create a perfect legal system that passes laws protecting workers’ rights and enforces them, just as readily as they assume they have can-openers — but assuming doesn’t make either one true.

In our global food system, human rights abuses abound. Most are invisible to us when we shop at the grocery store. A recent WTO ruling challenged even the basic principle that we should be allowed to know what country our beef comes from. Pesticide standards only require that residue on food in our stores is limited, but nobody checks to find out what farmworkers were exposed to. Did the company that grew your food drive an indigenous community off its ancestral land with bulldozers, or did it irrigate its crops so heavily that an entire river the community relied on now runs dry? Did somebody’s property value plummet after a factory farm moved in next door, forcing them to smell manure night and day while simultaneously killing their chances to sell their home and move? These are not made-up scenarios — all of them have happened. But in the grocery store, you don’t know.

6. GDP Meets Human Health: From the perspective of maximizing GDP, our current food system cannot be beat. We have found ways to make people eat more than ever (and more processed foods than ever), and then they spend more money on diet books, weight loss programs, gyms, and health care for diet-related illnesses. This boosts the GDP much more than if people just ate the right amounts of a diverse mix of whole foods and then skipped the weight loss programs and the diabetes meds. But is it what we want?

For an economist, our current food system is highly efficient, producing, distributing, and selling the maximum amount of cheap food. A large amount of food is “value-added” (i.e. highly processed), which means that more companies and employees will earn money from each food. Instead of wheat or even wheat flour, a consumer buys a loaf of bread, giving jobs to the bakery and requiring dough conditioners, preservatives and a plastic bag that would not have been necessary if they just baked a loaf of bread at home.

Unfortunately, economics only quantifies dollars, not human health. According to those looking for GDP growth, it’s better if you buy a bottle of Heinz ketchup than if you grow tomatoes from seed in your garden, and it’s better if you buy a Snapple made with 10 percent juice than if you eat a piece of fruit. Maybe we would be better off if economists start assuming that we don’t have can-openers, so we have to cook healthy whole foods from scratch.

Jill Richardson is the founder of the blog La Vida Locavore and a member of the Organic Consumers Association policy advisory board. She is the author of “Recipe for America: Why Our Food System Is Broken and What We Can Do to Fix It.”


Is Mormon underwear magic?

To outsiders, there is little more fascinating about the Mormon religion than the secret underwear


This article originally appeared on AlterNet.

To outsiders, there is little more fascinating about the Mormon religion than the secret underwear that Mormon temple initiates are expected to wear day and night. As one former believer put it, “I’ve been an exmo since 1967. All that time, the underwear questions were the first ones I got from people who found out I had been Mormon. A friend brought it up again last week at lunch.” Another former Mormon agrees: “When people first find out I’m exmo, their first question/comment almost ALWAYS is, ‘So what is the deal with the magic underwear?!’ Honest! People outside the morg are spending WAY too much time thinking about garmies!”

(“Garmies” is insider slang for the sacred undergarments prescribed by the religion’s founder, Joseph Smith.)

Some outside interest may be driven simply by curiosity: Mormons have sacred underwear! What do they look like? Or incredulity: Religious leaders can tell women to wear undershirts with special symbols all the time beneath their bras and people do it?! But that’s not the whole story.

The idea of sacred, secret underwear seems a little kinky, at least to some outsiders. Commenters on blogs and forums confess the attraction.

I tell you guys who grew up taking ‘undergarments’ for granted — WE in the not-know find these items fetching indeed [here in Idaho]. (Kymba)

Mine had been in the bottom of my closet in a moving box in a paper bag for 5 years until a couple weekends ago when I modeled them for my boyfriend. He was intrigued by the whole thing and found them to be very sexy. (Randy)

It only makes sense that some subset of us would find the idea of Mormon undies titillating. They are novel, they are secret, they are taboo, and they are in constant contact with genitalia.

But are they kinky to insiders?

It’s hard to get a balanced sample from active Mormons, because the Garments, as I said, are sacred, and catering to the curiosity of prurient outsiders would violate a covenant sworn during the same temple ceremony in which a Mormon gets authorized to wear the Garment. Unfortunately, for those who have been fantasizing about a romp in which layers of white cotton create the perfect sense of mystery (or bondage), ex-Mormons offer few words of encouragement. Discomfort seems to be the predominant theme.

I was continuously battling wedgies — often in public; how the people would stare as I would try to wrestle crumpled material out of my crack. (Lady DB)

If you have ever worn the modern ones, you should appreciate the distance these have come. When I first got married they came in a one-piece get-up with a wide neck so you could step into them. The back had a split crotch (not the kind in kinky panties), but this huge, wide, sloppy split would separate under your clothes, leaving a draft in your nether region much of the time. The little panel they sew into the ladies’ special part was so poorly designed that it would roll and twist till you felt like you were skewered by a roll of old toilet paper. (Insanad)

Of all of the things about Mo-dom, the thing I miss the least is the underwear. (Zapotec)

Theologically, Mormon undergarments are said to be symbols of a covenant between God and the believer. Initiates pantomime their own death should they violate this sacred trust. The underwear have sacred symbols drawn from the Masonic Order into which Joseph Smith was initiated shortly before he proclaimed God’s desire that people wear the Garment. True-believing Mormons avoid allowing their Garments to touch the ground. They may cut off and burn the symbols when a Garment itself is worn out.

I thought the kitchen was on fire a few times until I found my mom burning the “sacred symbols” in tin cans before she cut the underwear into dust cloths. I was slapped a time or two for letting them fall or drag on the floor when I did laundry as a child. (Cheryl)

In Mormon folk religion, Garments have special powers. Stories are told of wearers being saved from bullets or a fiery death in a car crash. One story tells of a Mormon soldier during WWII who was killed by a Japanese flame thrower – but his Garment survived intact. The stories go back to Joseph Smith himself, who died in a hail of bullets without his Garment on. His companion, Willard Richards, who was wearing his, emerged unscathed. Mormon historian Hubert Bancroft described the incident in his 1890 “History of Utah”: “This garment protects from disease, and even death, for the bullet of an enemy will not penetrate it. The Prophet Joseph carelessly left off this garment on the day of his death, and had he not done so, he would have escaped unharmed.”

Today such accounts are not investigated or endorsed by the church authorities. The Catholic hierarchy has an established procedure for assessing claims about weeping statues or a miracle cure, but the Mormon hierarchy largely ignores stories about the real-world protective powers of the white underwear. In 1988, Mormon authorities asserted that the Garment serves as a protection “against temptation and evil.” Unfortunately, ordinary believers may take the broader protective power of their Garments seriously, sometimes with painful consequences.

Flame swept up my arm and no clothing burned at all except the entire sleeve and part of the shoulder of the Garment that burned/melted. I was burned where the material melted into globules. I was a good person. They did not work as claimed. I will never ever forget that day. (AmIDarkNow)

My TBM (True-believing Mormon) father was a radiologist and believed that his garmies would protect him from radiation. Needless to say, the bonehead died of leukemia at 49. (Jeebus)

Taking off the Garments is a big step for many people leaving the Mormon religion. Some people feel vulnerable when they first abandon the regulation underclothes.

Well, I still remember the first time I took them off. Half wondered if I was going to die in a car wreck. (nonyabiz)

I was on the lookout for the death from no garmies, too, for a bit!! Oh good Lord! (makesmyheadspin)

But others experience a sense of freedom:

I cannot believe I let another grown man ask my wife and I what kind of underwear we were wearing and volunteer the information with a cheery smile. What was even more sick is that I believed in a tyrannical God that cared about what kind of underwear I was wearing. (Mortimer)

My hubby and I were cooking on the grill in the back yard. All of a sudden, the wind changed, and a flame leaped out of the grill and came straight at my chest. I looked down and saw it hit my shirt and chest, then arch away from me, back in the direction it came from. I looked down, my plastic buttons weren’t melted and the shirt wasn’t singed at all. I automatically thought, in my old morg ways, “my Garments must have protected me!” THEN … I pulled my shirt collar forward and looked down my shirt … I couldn’t stop laughing!! I had no Garments on!! This was soooo refreshing and invigorating! I had been “protected” and I had no Garments on!! (Kathy S)

Sometimes the heart of that newfound freedom is the freedom to explore sexuality or intimacy.

When we began to just lay together, skin to skin, and talk to each other; to feel each other’s pulse and breath; to simply feel our physical selves, our body-shame began to dissipate. (Waking Up)

I remember when I first quit wearing my garments and how feminine I felt. WOW, I actually had breasts and a waist. It was very liberating, and for the first time in my life I began to feel sexual, not a droid without any sexuality. And even though I didn’t fit … what I perceived as the ideal weight, shape, looks, I felt sexy and powerful. (Anonymous)

This is not to say that Mormon Garments have no place in the history or future of erotica. You say tomayto, I say tomaaahhhto. Interestingly, the Garment may owe its existence to Joseph Smith wrestling with his own high libido. As recent research on homosexuality suggests, people who are struggling to contain or suppress their own sexuality may be particularly interested in controlling the sexuality of others. Historians are unclear on the number of women Smith actually married and the number with whom he simply had sexual relationships. The list of his wives, first published in the late 19th century and still debated, includes 27 names. Despite this, Smith preached against polygamy till his death. Was the design of the Garment (then a full-body, long-sleeved, button-up affair) the product of a divine revelation, Smith’s sexual tastes, or his effort to suppress desire? You decide.

Valerie Tarico is a psychologist and writer in Seattle, Washington, and the founder of Wisdom Commons. She is the author of “Trusting Doubt: A Former Evangelical Looks at Old Beliefs in a New Light” and “Deas and Other Imaginings.” Her articles can be found at Awaypoint.Wordpress.com.


Weimar America

Four major ways we're following in Germany's fascist footsteps


(Credit: AP/Al Grillo/Salon)
This article originally appeared on AlterNet.

What happens when a nation that was once an economic powerhouse turns its back on democracy and on its middle class, as wealthy right-wingers wage austerity campaigns and enable extremist politics?

It may sound like America in 2012. But it was also Germany in 1932.

Most Americans have never heard of the Weimar Republic, Germany’s democratic interlude between World War I and World War II. Those who have usually see it as a prologue to the horrors of Nazi Germany, an unstable transition between imperialism and fascism. In this view, Hitler’s rise to power is treated as an inevitable outcome of the Great Depression, rather than the result of a decision by right-wing politicians to make him chancellor in early 1933.

Historians reject teleological approaches to studying the past. No outcome is inevitable, even if some are more likely than others. Rather than looking for predictable outcomes, we ought to be looking to the past to understand how systems operate, especially liberal capitalist democracies. In that sense, Weimar Germany holds many useful lessons for contemporary Americans. In particular, there are four major points of similarity between Weimar Germany and Weimar America worth examining.

1. Austerity. Today’s German leaders preach the virtues of austerity. They justify their opposition to the inflationary, growth-creating policies that Europe desperately needs by pointing to the hyperinflation that occurred in 1923, and which became one of the most enduring memories of the Weimar Republic. Yet the austerity policies enacted after the onset of the Depression produced the worst of Germany’s economic crisis, while also destabilizing the country’s politics. Cuts to wages, benefits and public programs dramatically worsened unemployment, hunger and suffering.

So far, austerity in America has largely taken place at the state and local levels. However, the federal government is now working on undemocratic national austerity plans, in the form of so-called “trigger cuts” slated to take effect at the end of 2012. In addition, there’s the Bowles-Simpson austerity plan to slash Medicare and Social Security benefits along with a host of other public programs; and the Ryan Budget, a blueprint for widespread federal austerity should the Republicans win control of the Congress and the White House in November.

2. Attacks on democracy. Austerity was deeply unpopular with the German public. The Reichstag, Germany’s legislature, initially rejected austerity measures in 1930. As a result, right-wing Chancellor Heinrich Brüning implemented his austerity measures by using a provision in the Weimar constitution enabling him to rule by decree. More notoriously, Hitler was selected as chancellor despite his party never having won an election — the ultimate slap at democracy. Both these events took place amidst a larger backdrop of anti-democratic attitudes rampant in the Weimar era. Monarchists, fascists and large businesses all resented the left-leaning politics of a newly democratic Germany, and supported politicians and intellectuals who pledged to return control to a more authoritarian government.

Democracy is far older in the United States today than it was in Germany during the early 1930s. But that doesn’t mean that democracy is actually respected in practice today; it only means that attacks on it can’t be as overt as they were in Weimar Germany. From the Supreme Court’s Citizens United ruling to Republican voter ID laws to austerity proposals that bypass the normal legislative processes (remember the Supercommittee?), American democracy is under similar direct threats now.

3. Enabling of extremists. Well before Hitler was made chancellor in 1933, leading conservatives and business leaders had concluded that their interests would be better served by something other than the democratic system established in 1919. During the 1920s, they actively supported parties that promoted anti-democratic ideologies, from monarchism to authoritarianism. Nazis were just one of the many extremist groups that they supported during the Weimar era. In fact, initially, many on the German right had attempted to exclude the Nazis from their efforts; and as chancellor, Brüning had tried to marginalize the Nazi party. However, his successor, the right-wing Franz von Papen, believed he could control Hitler and needed the support of the Nazi members of the Reichstag. Conservative German leaders ultimately decided their hunger for power was more important than keeping extremists at bay — and their support finally gave the Nazi Party control of the country.

Tea Party activists aren’t Nazis. But with roots in the 20th century radical right, the Tea Party’s attack on the public sector, on labor unions, on democratic practices, and on people who aren’t white mark them as the extremist wing of American politics; and they bear many of the hallmarks that characterize fascist movements around the world. In recent years, Republican leaders have been enabling these extremists in a successful bid to reclaim political power lost to Democrats in 2006 and 2008. We don’t yet know where this enabling is going to lead the country, but it’s hard to imagine it will be anywhere good.

4. Right-wing and corporate dominance. One of the the most prominent German media moguls in the 1920s was Alfred Hugenberg, owner of 53 newspapers that reached over a majority of German readers. The chairman of the right-wing German National People’s Party, Hugenberg promoted Adolf Hitler by providing favorable coverage of him from the mid-1920s onward. Major German corporations such as Krupp, IG Farben and others spent money in the 1920s and early 1930s to support the rise of right-wing political parties, including the Nazis, as part of a strategy to undermine democracy and labor unions. Even if Hitler had never taken power, that strategy had already achieved significant returns on their substantial investment.

Here in the United States, one only needs to look at Charles and David Koch, Fox News and other right-wing funders and their media outlets to see the analogy. By funding right-wing politicians who promote austerity, undermine democracy and support extremism, they are active agents in the creation of Weimar America.

The Road Not Taken

None of this means that the United States is about to fall victim to a fascist coup d’etat as Germany did in January 1933. Remember that no outcome is inevitable. Nor would it be accurate to say that the United States is repeating the exact same events and taking the same course as Germany did during the 1930s, because many other important details are different. For example: Germany was a nation saddled with huge debts and lacking the global political power it needed to reverse its situation; but even with today’s high unemployment rates, the United States remains the globe’s largest economy, and therefore doesn’t face the same fiscal constraints Weimar Germany faced. In fact, a better current analogy may be Greece, which is in a far more similar predicament now.

Yet the underlying similarities ought to be troubling — and are enough to give us pause. The combination of austerity and well-funded right-wing political movements hostile to democracy destroyed Weimar Germany. And Spain and Italy both experienced a similar situation in their slide into authoritarianism in the 1930s. In those cases — and in ours — as people saw their own financial position weaken, and as their democratic rights were increasingly limited in favor of giving more power to the large corporations, the future of a democratic society with a strong middle-class was increasingly jeopardized. Fascism is what happens when right-wing plutocrats weaken the middle class and then convince it to turn its back on democracy.

Will Weimar America face the same disastrous fate Weimar Germany did? On our current path, democracy and shared prosperity are both in serious trouble. We owe it to ourselves, to our children, and to our world to look to the lessons of history, find a way to change course, and get to work building something better.

Robert Cruickshank is a political activist and historian, and a senior advisor to Seattle Mayor Mike McGinn. The views expressed here are his own.

