MINNEAPOLIS (The Borowitz Report)—A new study, by the University of Minnesota, indicates that fear of contracting the Ebola virus is highest among Americans who did not pay attention during math and science classes. According to the study, those whose minds were elsewhere while being taught certain concepts, like what a virus is and numbers, are at a significantly greater risk of being afraid of catching Ebola than people who were paying even scant attention.
Interviews conducted with people who spent math and science classes focussing on what they would be having for dinner or what the student in front of them was wearing revealed the difficulty they are currently having grasping basic facts about Ebola.
For example, when a participant in the study was told that he had a one-in-thirteen-million chance of contracting the virus, his response was, “Whoa. Thirteen million is a really big number. That is totally scary.”
Davis Logsdon, who conducted the study for the University of Minnesota, puts the number of Americans who did not pay attention during math and science classes at seventy-two per cent, but adds, “I seriously doubt most people will know what that means.”
Researchers have examined dozens of Megalodon shark fossils and estimate that the huge sharks went extinct about 2.6 million years ago. Megalodon was the largest shark ever to live, reaching some 60 feet in length.
Though researchers remain uncertain about the reason for the extinction, they now have a more accurate estimate of when it occurred.
Researchers from the University of Zurich and the University of Florida conducted the study with the aim of finding out how the removal of top predators affects local ecosystems. They analyzed the Megalodon fossil record and found that the extinction of Megalodon sharks was followed by growth in the size of whales, which then became the biggest animals on earth.
Megalodon sharks first appeared nearly 28 million years ago; at 40 to 60 feet in length and 50 to 100 metric tons in weight, they could easily have swallowed an adult human. It is believed that they preyed mostly on whales.
The researchers used available databases and a mathematical method developed by Christopher S. Clements, a co-author of the study, which has been used in recent studies to calculate the extinction times of several species. The research has been published in the journal PLOS ONE.
The researchers used Optimal Linear Estimation (OLE) to set an approximate extinction date for the Megalodon sharks, then studied the whale populations of that era. They found that whales grew in size and number after the removal of these sharks, their largest predators.
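For the technically curious, the OLE method is compact enough to sketch: it models the gaps between the youngest dated fossil occurrences with a Weibull-type extreme-value distribution and projects how far beyond the last record the species plausibly persisted. Below is a minimal Python sketch of the Roberts & Solow (2003) formulation that this line of work builds on; the fossil ages are invented placeholders, not the study's data, and the code is an illustrative reconstruction, not the authors' implementation.

```python
# Minimal sketch of Optimal Linear Estimation (OLE) for extinction dating,
# after Roberts & Solow (2003). The fossil ages below are HYPOTHETICAL
# placeholders for illustration, not data from the PLOS ONE study.
import numpy as np
from math import lgamma, log, exp

def ole_extinction(ages_ma, k=10):
    """Estimate an extinction date (in Ma) from the k youngest fossil ages (k >= 3)."""
    ages = np.asarray(ages_ma, dtype=float)
    k = min(k, ages.size)
    # Flip to a forward time axis (t = -age): the youngest fossil has the
    # largest t, and the extinction time lies somewhere beyond it.
    t = np.sort(-ages)[::-1][:k]              # most recent record first
    # Estimated shape parameter of the extreme-value model for the gaps.
    v = sum(log((t[0] - t[k - 1]) / (t[0] - t[i])) for i in range(1, k - 1)) / (k - 1)
    # Symmetric weight matrix of gamma-function terms (log-space for stability).
    lam = np.empty((k, k))
    for i in range(1, k + 1):
        for j in range(i, k + 1):
            lam[i - 1, j - 1] = lam[j - 1, i - 1] = exp(
                lgamma(2 * v + i) + lgamma(v + j) - lgamma(v + i) - lgamma(j))
    e = np.ones(k)
    w = np.linalg.solve(lam, e)
    w /= e @ w                                 # optimal weights, summing to 1
    return -(w @ t)                            # weighted estimate, back to Ma

# Invented ages (in millions of years) for the ten youngest occurrences.
fossil_ages = [2.8, 3.0, 3.1, 3.3, 3.4, 3.6, 3.7, 3.9, 4.0, 4.2]
print(f"Estimated extinction: {ole_extinction(fossil_ages):.2f} Ma")
```

Because the estimate depends only on the spacing of the youngest records, a denser, better-dated recent fossil record tightens the inferred extinction date, which is presumably how the team could sharpen the timing even without knowing why the sharks vanished.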
Lead researcher Catalina Pimiento of the Florida Museum of Natural History says that recent estimates from this research indicate that large-bodied, shallow-water species of sharks are at the greatest risk among marine animals, possibly because of climate change and warming waters.
RIO DE JANEIRO — Brazilian voters re-elected Dilma Rousseff as president on Sunday, endorsing a leftist leader who has achieved important gains in reducing poverty and keeping unemployment low over a centrist challenger who castigated her government over a simmering bribery scandal and a sluggish economy.
Ms. Rousseff of the Workers Party took 51.4 percent of the vote in the second and final round of elections, against 48.5 percent for Aécio Neves, a senator from the Social Democracy Party and scion of a political family from the state of Minas Gerais, electoral officials said Sunday night with 98 percent of votes in the country counted.
While Ms. Rousseff won by a thin margin, the tumultuous race was marked by accusations of corruption, personal insults and heated debates, revealing a deepening polarization in Brazil. Mr. Neves surged into the lead this month in opinion surveys, only to be eclipsed by Ms. Rousseff as the vote on Sunday approached.
“People without much money have seen their lives improve during recent years,” said Liane Lima, 62, a secretary in São Paulo who voted for Ms. Rousseff. “I think we should let Dilma finish what she started.”
Indeed, Ms. Rousseff’s victory reflects broad changes in Brazilian society since the Workers Party rose to power 12 years ago with the election of her predecessor and mentor, Luiz Inácio Lula da Silva, who chose Ms. Rousseff as his successor to run in the 2010 election and campaigned for her again this year.
Building on an economic stabilization project put in place by the Social Democrats in the 1990s, Ms. Rousseff and Mr. da Silva aggressively expanded social welfare programs, lifting millions of Brazilians out of poverty. Pointing to the popularity of the antipoverty spending, Mr. Neves, the challenger in the race, said he would not scale it back.
But while Ms. Rousseff campaigned largely on her government’s support for poor and working-class citizens, she faced fierce criticism over her economic policies, with Brazil struggling with slow growth throughout her first term and a recession this year. Brazil’s financial markets gyrated wildly throughout the race, reflecting skepticism over her management of the economy.
Ms. Rousseff, 66, a former Marxist guerrilla who was imprisoned and tortured by Brazil’s military dictatorship, rejected much of the criticism while emphasizing that she had no plans to shift away from policies involving greater state control over the economy. Still, she signaled openness to shaking up her cabinet, including replacing her unpopular finance minister, Guido Mantega.
In addition to facing turbulence in the markets, Ms. Rousseff will deal in her next four-year term with a sprawling scandal involving testimony of bribes and money laundering at Petrobras, the national oil company, which has eroded confidence in the Workers Party. A former high-ranking executive at Petrobras has testified that he channeled bribes to the party and its allies in Brasília.
“I always voted for the Workers Party, since I was a teenager, but this government hasn’t done anything different,” said José Abel, 48, who runs a tourist agency in Brasília and voted for Mr. Neves largely out of concern over corruption in Ms. Rousseff’s government. “They’re just the same as other parties now.”
Still, with the unemployment rate remaining near historical lows even during a recession, economic stability seemed to trump corruption as a major issue among voters. Many people who cast ballots on Sunday expressed concern that a change in government could erode welfare benefits which are now a fixture of society.
“My life is stable thanks to Dilma’s government,” said Diogo Bernardo, 28, an installer of telephone lines in Rio de Janeiro who voted for Ms. Rousseff, referring to her informally by her first name, as is common in Brazil. “She’s not great, but Aécio would have been worse since he cares less about the rights of working people. I voted for the lesser of two evils.”
Sold: The Butte Nugget, one of the biggest gold nuggets found in modern times in Northern California's historic Gold Country, was sold to a secret buyer for about $400,000.
A Palestinian boy looks at statues made of fiberglass and covered with clay by Palestinian artist Eyad Sabbah, depicting Palestinians who fled their homes under Israeli shelling during the most recent conflict between Israel and Hamas, in the east of Gaza City, October 21, 2014.
***
"Is Israel The World's Worst Terror State? An Israeli General's Son Thinks So"
Excerpt:"Hold that chocolate bar. The researchers also warn that the compound found in cocoa exists only in minuscule amounts in the average chocolate bar compared with the amount used in the study, so gorging on chocolate in the name of health and improving one’s memory could backfire... The study involved 37 healthy subjects who ranged in age from 50 to 69. On a random basis, they were given either a high-flavanol diet, consuming 900 milligrams a day, or a low flavanol diet, consuming 10mg per day. Brain scans, which measure blood volume in the dentate gyrus, and memory tests were used to evaluate the effect of the diet. Small said the typical candy bar contains about 40mg of flavanols.
Researchers said that if a person had the memory of a typical 60-year-old at the beginning of the study, after three months, on average, that person’s memory would function more like a 30- or 40-year-old’s. The researchers also cautioned that more work is needed because of the study’s small sample size... Most cocoa-processing methods in use today remove many of the flavanols found in cocoa. But the chocolate maker Mars produced a cocoa flavanol test drink specifically designed for the research, which the company also subsidized." (Alan: The experiment's high-flavanol diet included a daily dose of flavanols 22.5 times higher than the flavanols found in a typical chocolate bar.)
In case anyone needed another reason to love chocolate, a new study suggests that a natural compound found in cocoa, tea and some vegetables can reverse age-related memory loss.
The findings suggest that the compound increases connectivity and, subsequently, blood flow in a region of the brain critical to memory, the researchers said.
The study — published online Sunday in Nature Neuroscience and partly financed by a chocolate company — found that flavanols reverse mild memory loss in older adults. Using brain scans and memory tests, the latest study built on previous work showing that flavanols extracted from cocoa beans had improved neuronal connections in mice’s dentate gyrus, a part of the brain involved in memory formation.
But hold that chocolate bar. The researchers also warn that the compound found in cocoa exists only in minuscule amounts in the average chocolate bar compared with the amount used in the study, so gorging on chocolate in the name of health and improving one’s memory could backfire.
“It would make a lot of people happy, but it would also make them unhealthy,” Scott A. Small, a professor of neurology and director of the Alzheimer’s Disease Research Center at the Taub Institute at Columbia University Medical Center, said Friday.
Small said that even more important, the new study offers the first direct evidence that memory deteriorates with age because of changes in the dentate gyrus, a region of the hippocampus. Previous studies had shown a link between changes in this region of the brain and normal, age-related memory loss, but the Columbia University study asserts a causal link.
"It more firmly establishes that this is the anatomical source of age-related memory loss,” Small, the study’s senior author, said. He said the study also offered yet more evidence thatdietandhealthy lifestyles that increase blood flow to the brain can slow or reverse age-related cognitive decline.
The study involved 37 healthy subjects who ranged in age from 50 to 69. On a random basis, they were given either a high-flavanol diet, consuming 900 milligrams a day, or a low-flavanol diet, consuming 10 milligrams a day. Brain scans, which measure blood volume in the dentate gyrus, and memory tests were used to evaluate the effect of the diet. Small said the typical candy bar contains about 40 milligrams of flavanols.
Researchers said that if a person had the memory of a typical 60-year-old at the beginning of the study, after three months, on average, that person’s memory would function more like a 30- or 40-year-old’s. The researchers also cautioned that more work is needed because of the study’s small sample size.
The compounds appear to enhance connectivity and metabolic activity in the dentate gyrus. Aging appears to reduce the synapses, or connections, between neurons in that part of the brain. That decline, however, is not related to the severe memory loss and cell death in Alzheimer’s disease or other dementias, Small said.
Most cocoa-processing methods in use today remove many of the flavanols found in cocoa. But the chocolate maker Mars produced a cocoa flavanol test drink specifically designed for the research, which the company also subsidized.
Alan: If I saw the above image in anyone else's blog, I'd be furious.
As a blanket generality, it is simply untrue.
As a statistical generality, however, it is more true than positing: "Married parents raise future prison inmates."
A persistent shortcoming in society's treatment of single motherhood is that it lends itself to pharisaic finger-wagging that, paradoxically, contradicts the stated goal.
Yeshua Excoriates Fellow Pharisees: "The Woe Passages"
Many people - particularly American conservatives - are determined to feel good by "kicking the dog" while doing nothing to foster family-friendly culture in effective ways.
Here's a place to start. (Will any American conservative "get on board?")
Excerpt:"But the biggest social cost of less marriage involves children. “New choices for adults,” Sawhill writes, “have not generally been helpful to the well-being of children.” Single-parent families have exploded. In 1950, they were 7 percent of families with children under 18; by 2013, they were 31 percent. Nor was the shift isolated. The share was 27 percent for whites, 34 percent for Hispanics and 62 percent for African Americans. By harming children’s emotional and intellectual development, the expansion of adult choices may have reduced society’s collective welfare."
We Americans believe in progress, and yet progress is often a double-edged sword. The benefits and adventures of change often vie with the shortcomings and disruptions, leaving us in a twilight zone of ambiguity and doubt about the ultimate outcome. Few subjects better illustrate this than the decline of marriage, as Isabel Sawhill shows in her sobering book “Generation Unbound: Drifting into Sex and Parenthood without Marriage.”
Even those who know marriage is on the skids — presumably, most of us — may be surprised by the extent of its decline. A little history helps. To Americans coming of age in the 1950s, the expectation was that most would marry. It was part of society’s belief structure. And most did. Now these powerful social pressures have faded and, for many, disappeared.
Consider. In 1960, only 12 percent of adults ages 25 to 34 had never married; by the time they were 45 to 54, the never-married share had dropped to 5 percent. Now fast forward. In 2010, 47 percent of Americans 25 to 34 had never married. Based on present trends, this will still be 25 percent in 2030 when they’re 45 to 54, estimates the Pew Research Center.
The stranglehold that marriage had on middle-class thinking and behavior began to weaken in the 1960s with birth control pills, publication of Betty Friedan’s “The Feminine Mystique” — an assault on women’s traditional housecleaning and child-rearing roles — and the gradual liberalization of divorce laws.
The resulting expansion of personal choice has been breathtaking. Those liberalized divorce laws have freed millions of women and men from unsatisfying or abusive marriages. (From 1960 to 1980, the divorce rate rose nearly 150 percent; it has since reversed about half that gain.) Taboos against premarital sex and cohabitation have virtually vanished. So has the stigma of out-of-wedlock birth or, for married couples, of not having children. With more job opportunities, women flooded the labor market.
Naturally, there was a backlash. These changes spawned new discontents.
Balancing home life and work is stressful and often guilt-ridden. Women and men alike worry they are not devoting enough time to children and/or their jobs. Women resent men who don’t do enough around the house, even though they do much more than they once did. In 1965, wives spent almost seven times as much time on household and child-rearing as husbands; now women do only twice as much, while men still spend more time on the job.
The flight from marriage may also have subtracted from personal happiness. Sawhill quotes from one respected study that “married women and men live longer; they are less likely to be disabled. . . . [They] have better sex than the unmarried and they are less likely to be lonely.”
But the biggest social cost of less marriage involves children. “New choices for adults,” Sawhill writes, “have not generally been helpful to the well-being of children.” Single-parent families have exploded. In 1950, they were 7 percent of families with children under 18; by 2013, they were 31 percent. Nor was the shift isolated. The share was 27 percent for whites, 34 percent for Hispanics and 62 percent for African Americans. By harming children’s emotional and intellectual development, the expansion of adult choices may have reduced society’s collective welfare.
It is not (as Sawhill repeatedly says) that all single-parent households are bad or that all two-parent families are good. But the advantage lies with the approach that can provide children more financial support and personal attention. Two low-income paychecks, or two good listeners, are better than one. With a colleague, Sawhill simulated the effect today if the marriage rates of 1970 still prevailed. The result: The child poverty rate would drop by about 20 percent — a “huge effect” compared with most government programs.
There are other drawbacks. Social commentators Jonathan Rauch and Charles Murray, among others, have argued that marriage is becoming “the great class divide,” as college graduates continue to marry and less-educated couples increasingly don’t. More than 40 percent of births now go to the unwed. Some of these mothers, says Sawhill, will have multiple partners and subject their children “to a degree of relationship chaos and instability that is hard to grasp.”
Sawhill doubts that either liberals or conservatives have workable remedies for these problems. More social services (the liberals) could be “very expensive” and ignore Bill Clinton’s dictum: “Governments don’t raise children; parents do.” Reviving marriage (conservatives) presumes — unrealistically, she says — that many anti-marriage norms can be reversed. Her own preference is that young women better use contraceptives to prevent unwanted pregnancies. This, too, sounds wishful.
Along with the budget deficit, we have a family deficit. It explains some stubborn poverty and our frustrations in combating it. We’ve learned that what good families provide cannot easily be gotten elsewhere. For the nation, this is the deficit that matters most.
"The Secrets Behind The Mid-Terms"
By E.J. Dionne Jr., Opinion writer, October 26, 2014
Excerpt: "A Pew Research Center survey conducted Oct. 15 to 20 found that Democrats had a 21-point advantage among registered voters as the party “more concerned about needs of people like me,” up from a lead of only 11 points in October 2010. Voters picked Republicans over Democrats as the party “more influenced by special interests” by 46 percent to 32 percent, a 14-point Republican deficit, up from an 8-point deficit four years ago. And on a new measure, voters declared the Republican Party “more extreme in its positions” by 52 percent to 36 percent."
There’s a hidden history to the nasty midterm election campaign that will, mercifully, end on Nov. 4. What’s not being widely talked about is as important as what’s in the news.
Underappreciated fact No. 1: The number of Democratic seats that are not in play this year.
In planning their effort to take control of the Senate, Republicans shrewdly launched challenges to Democrats in states that would not automatically be on a GOP target list. “Broadening the map” is wise when you’re in a strong position.
Two of the states on that extended list, Colorado and Iowa, have paid off for Republicans. It’s still far from certain that they will defeat Sen. Mark Udall in Colorado or Rep. Bruce Braley in Iowa, who is trying to hold retiring senator Tom Harkin’s seat. Republicans have a clear shot at both, and this has strengthened their chances of taking the majority.
Just as striking is how many Democrats seem to have nailed down races the Republicans had once hoped to make competitive. This has narrowed the GOP’s path to a majority. Among them: Sen. Al Franken of Minnesota, Sen. Jeff Merkley of Oregon and Rep. Gary Peters of Michigan, who is likely to retain Sen. Carl Levin’s seat. Sen. Mark Warner of Virginia is also polling well, though he was always favored against former Republican National Committee chairman Ed Gillespie. Sen. Jeanne Shaheen of New Hampshire is in a tougher race with former Massachusetts senator Scott Brown, but she has led most of the way.
Democrats have benefited in these states from their opponents’ shortcomings. But what has been lost in the national news is underappreciated fact No. 2: How important economic issues have been in shoring up the party’s incumbents and in giving life to Democratic challengers in Georgia, Kentucky and (a much longer shot) South Dakota.
It’s commonly said that Republicans are nationalizing the elections against President Obama while Democrats are making them about “local” issues. That is true of North Carolina, where Democratic Sen. Kay Hagan has, so far successfully, targeted the unpopular Republican state government. But elsewhere, endangered Democrats are campaigning on a different set of national concerns related to economic worries. These include equal pay for women, relief for student loan recipients and a minimum-wage increase. Several Democrats, including Shaheen and Michelle Nunn in Georgia, have made an issue of opposing the outsourcing of U.S. jobs overseas.
This leads to underappreciated fact No. 3: Given Obama’s low approval ratings, Republicans could have been running away with this thing. They’re not, because they look more extreme and out of touch than they did four years ago.
A Pew Research Center survey conducted Oct. 15 to 20 found that Democrats had a 21-point advantage among registered voters as the party “more concerned about needs of people like me,” up from a lead of only 11 points in October 2010. Voters picked Republicans over Democrats as the party “more influenced by special interests” by 46 percent to 32 percent, a 14-point Republican deficit, up from an 8-point deficit four years ago. And on a new measure, voters declared the Republican Party “more extreme in its positions” by 52 percent to 36 percent.
Underappreciated fact No. 4: While opposition to Obama will motivate Republican turnout, particularly in contested red states, close races will hang on voters who are not casting ballots either for or against the president. The Pew survey found that 32 percent of voters saw their congressional ballot as a vote against Obama, and only 20 percent as a vote for him. That’s the GOP advantage.
But 45 percent said Obama would not be a factor in their decision. And within this group, 60 percent are Democrats or independents who lean that way and only 28 percent are Republicans or Republican-leaners. This is why Sen. Elizabeth Warren (D-Mass.) and Bill and Hillary Clinton have been in such demand on the campaign trail. Their messages, particularly Warren’s, have been about economics, which brings us back to fact No. 2. Warren has been especially effective in tagging the GOP as in league with the wealthy and the special interests, a lesson that forcefulness is more likely to grab voters’ attention than running scared is.
Many Democrats quietly concede that the Senate playing field still tilts the Republicans’ way. If Democrats upset expectations, these underappreciated factors will be the reason. If the Republicans prevail, the fact that the election has been so closely fought points to problems that even a victorious GOP will have to confront.
Alan: Yesterday, at the Novus Ordo site, I was struck by the clear "call" for "submission," the same word that translates the Arabic root of "Islam." Although submission can be virtuous, it lends itself to vice - viciousness - when informed by psychological Absolutism.
The trade-off of Absolutism is this: we will absolutely surrender to Authority (typically a male authority) with the understanding that Absolute Authority will take providential care of Everything, even ensuring that there is "good reason" for the worst tragedies - forgotten children dying in car seats, plagues, The Iraq War, multiple amputations from vaccine-preventable meningococcal infection, and -- pièce de résistance -- an ironclad guarantee of "my" personal salvation.
On the other hand, Liberalism is so saturatedly satanic that there is no possibility of salvation, only hope of Total Annihilation followed by Eternal Damnation.
Liberalism: "Satanic Rebellion Against God?" (The Thinking Housewife)
The ironic downside of total (totalitarian) submission to Absolute Authority is that Christian traditionalists, political Zionists and jihadists all adopt the same self-certain pose: "Not only is 'my' Absolutism better than 'your' Absolutism, 'my' Absolutism happens to be The One, True, Holy Absolutism which - in demonstration of God's Great Goodness - authorizes me to act as The Smiting Hand of Deity, working tirelessly for your total destruction while anticipating the great good cheer caused by your irredeemable damnation to The Lake of Unquenchable Fire."
Following the 2000 census, Republican control of the last redistricting process in just five states – Florida, Michigan, Ohio, Pennsylvania and Texas – resulted in a 31-seat swing in Congress.
Alan: The nation could easily move in the direction of "one person, one vote" democratic process by eliminating partisan control of redistricting.
Feb 2, 2013 - Although Democratic House candidates won more votes nationwide in 2012, Republicans won control of the House by a 234 to 201 margin - an outcome that REDMAP, the GOP's multiyear plan to influence redistricting, claimed credit for in its own report.
How the GOP Is Resegregating the South
Republicans are using the redistricting process to undermine minority voting power and ensure their party's dominance.
North Carolina State Senator Eric Mansfield was born in 1964, a year before the passage of the Voting Rights Act, which guaranteed the right to vote for African-Americans. He grew up in Columbus, Georgia, and moved to North Carolina when he was stationed at Fort Bragg. He became an Army doctor, opening a practice in Fayetteville after leaving the service. Mansfield says he was always “very cynical about politics” but decided to run for office in 2010 after being inspired by Barack Obama’s presidential run.
He ran a grassroots campaign in the Obama mold, easily winning the election with 67 percent of the vote. He represented a compact section of northwest Fayetteville that included Fort Bragg and the most populous areas of the city. It was a socioeconomically diverse district, comprising white and black and rich and poor sections of the city. Though his district had a black voting age population (BVAP) of 45 percent, Mansfield, who is African-American, lives in an old, affluent part of town that he estimates is 90 percent white. Many of his neighbors are also his patients.
But after the 2010 census and North Carolina’s once-per-decade redistricting process—which Republicans control by virtue of winning the state’s General Assembly for the first time since the McKinley administration—Mansfield’s district looks radically different. It resembles a fat squid, its large head in an adjoining rural county with little in common with Mansfield’s previously urban district, and its long tentacles reaching exclusively into the black neighborhoods of Fayetteville. The BVAP has increased from 45 to 51 percent, as white voters were surgically removed from the district and placed in a neighboring Senate district represented by a white Republican whom GOP leaders want to protect in 2012. Mansfield’s own street was divided in half, and he no longer represents most of the people in his neighborhood. The boundary of his new district runs for some 350 miles, roughly the distance from Fayetteville to Atlanta. Thirty-three voting precincts in his district have been divided to accommodate the influx of new black voters. “My district has never elected a nonminority state senator, even though minorities were never more than 45 percent of the vote,” Mansfield says. “I didn’t need the help. I was doing OK.”
Mansfield’s district is emblematic of how the redistricting process has changed the political complexion of North Carolina, as Republicans attempt to turn this racially integrated swing state into a GOP bastion, with white Republicans in the majority and black Democrats in the minority for the next decade. “We’re having the same conversations we had forty years ago in the South, that black people can only represent black people and white people can only represent white people,” says Mansfield. “I’d hope that in 2012 we’d have grown better than that.” Before this year, for example, there were no Senate districts with a BVAP of 50 percent or higher. Now there are nine. A lawsuit filed by the NAACP and other advocacy groups calls the redistricting maps “an intentional and cynical use of race that exceeds what is required to ensure fairness to previously disenfranchised racial minority voters.”
And it’s not just happening in North Carolina. In virtually every state in the South, at the Congressional and state level, Republicans—to protect and expand their gains in 2010—have increased the number of minority voters in majority-minority districts represented overwhelmingly by black Democrats while diluting the minority vote in swing or crossover districts held by white Democrats. “What’s uniform across the South is that Republicans are using race as a central basis in drawing districts for partisan advantage,” says Anita Earls, a prominent civil rights lawyer and executive director of the Durham-based Southern Coalition for Social Justice. “The bigger picture is to ultimately make the Democratic Party in the South be represented only by people of color.” The GOP’s long-term goal is to enshrine a system of racially polarized voting that will make it harder for Democrats to win races on local, state, federal and presidential levels. Four years after the election of Barack Obama, which offered the promise of a new day of postracial politics in states like North Carolina, Republicans are once again employing a Southern Strategy that would make Richard Nixon and Lee Atwater proud.
The consequences of redistricting in North Carolina—one of the most important swing states in the country—could determine who controls Congress and the presidency in 2012. Democrats hold seven of the state’s thirteen Congressional seats, but after redistricting they could control only three—the largest shift for Republicans at the Congressional level in any state this year. Though Obama won eight of the thirteen districts, under the new maps his vote would be contained in only three heavily Democratic districts—all of which would have voted 68 percent or higher for the president in 2008—while the rest of the districts would have favored John McCain by 55 percent or more. “GOP candidates could win just over half of the statewide vote for Congress and end up with 62 percent to 77 percent of the seats,” found John Hood, president of the conservative John Locke Foundation.
The same holds true at the state level, where only 10 percent of state legislative races can be considered a tossup. “If these maps hold, Republicans have a solid majority plus a cushion in the North Carolina House and Senate,” says J. Michael Bitzer, a professor of political science at Catawba College. “They don’t even need to win the swing districts.” North Carolina is now a political paradox: a presidential swing state with few swing districts. Republicans have turned what Bitzer calls an “aberration”—the Tea Party wave of 2010—“into the norm.”
Republicans accomplished this remarkable feat by drawing half the state’s black population of 2.2 million people, who vote overwhelmingly for Democrats, into a fifth of all legislative and Congressional districts. As a result, black voters are twice as likely as white voters to see their communities divided. “The new North Carolina legislative lines take the cake for the most grotesquely drawn districts I’ve ever seen,” says Jeff Wice, a Democratic redistricting lawyer in Washington.
According to data compiled by Bob Hall, executive director of Democracy North Carolina, precincts that are 90 percent white have a 3 percent chance of being split, and precincts that are 80 percent black have a 12 percent chance of being split, but precincts with a BVAP between 15 and 45 percent have a 40 percent chance of being split. Republicans “systematically moved [street] blocks in or out of their precincts on the basis of their race,” found Ted Arrington, a redistricting expert at the University of North Carolina, Charlotte. “No other explanation is possible given the statistical data.” Such trends reflect not just a standard partisan gerrymander but an attack on the very idea of integration. In one example, Senate redistricting chair Bob Rucho admitted that Democratic State Senator Linda Garrou was drawn out of her plurality African-American district in Winston-Salem and into an overwhelmingly white Republican district simply because she is white. “The districts here take us back to a day of segregation that most of us thought we’d moved away from,” says State Senator Dan Blue Jr., who in the 1990s was the first African-American Speaker of the North Carolina House.
* * *
Nationwide, Republicans have a major advantage in redistricting heading into the November elections. The party controls the process in twenty states, including key swing states like Florida, Ohio, Michigan, Virginia and Wisconsin, compared with seven for Democrats (the rest are home to either a split government or independent redistricting commissions). Republicans control more than four times as many seats at the Congressional level, including two-thirds of the seventy most competitive races of 2010.
This gives the GOP a major opportunity to build on its gains from 2010. Today GOP Representative Paul Ryan, nobody’s idea of a moderate, represents the median House district in America based on party preference, according to Dave Wasserman, House editor of the Cook Political Report. That district will become two points more Republican after the current redistricting cycle. “The fact of a Republican wave election on the eve of redistricting means that Republican legislators are in far better shape to shore up that wave,” says Justin Levitt, a redistricting expert at Loyola Law School. Though public dissatisfaction with GOP members of Congress is at an all-time high, Republican dominance of the redistricting process could prove an insurmountable impediment to Democratic hopes of retaking the House, where the GOP now has a fifty-one-seat edge. Speaker of the House John Boehner predicts that the GOP’s redistricting advantage will allow the party to retain control of the House, perhaps for the next decade.
Aside from protecting vulnerable freshmen, which would count as a major victory even if the GOP didn’t pick up any new seats, the party’s biggest gains will come in the South. Though the region has trended Republican at the presidential level for decades, Democrats managed to hang on to the Statehouses (which draw the redistricting maps in most states) for a remarkable stretch of time. Before 2010, Democrats controlled five Statehouses (in Alabama, Arkansas, Louisiana, Mississippi and North Carolina) and one chamber in two others (Kentucky and Virginia). Two years later, Republicans control every Southern Statehouse except the Arkansas legislature and Kentucky House.
Race has always been at the center of the Southern Strategy, though not always in ways you’d expect. In addition to pushing hot-button issues like busing and welfare to appeal to white voters, Southern Republicans formed an “unholy alliance” with black Southern Democrats when it came to redistricting. In the 1980s and ’90s, when white Democrats ruled the Statehouses, Republicans supported new majority-minority districts for black Democrats in select urban and rural areas in exchange for an increased GOP presence elsewhere, especially in fast-growing metropolitan suburbs. With Democrats grouped in fewer areas, Republicans found it easier to target white Democrats for extinction. Ben Ginsberg, a prominent GOP election lawyer, memorably termed the strategy “Project Ratfuck.”
Republicans prepared for the 2010 election with an eye toward replicating and expanding this strategy. The Republican State Leadership Committee (RSLC) unveiled the Redistricting Majority Project (REDMAP) in 2010 to target Statehouse races and put Republicans in charge of redistricting efforts following the election. Ed Gillespie, former chair of the Republican National Committee, became the group’s chair, while Chris Jankowski, a corporate lobbyist in Virginia, handled day-to-day operations. The group, which as a tax-exempt 527 could accept unlimited corporate donations, became the self-described “lead Republican Redistricting organization,” taking over many of the functions of the RNC. The RSLC attracted six- and seven-figure donations from the likes of the US Chamber of Commerce, tobacco companies Altria and Reynolds American, Blue Cross and Blue Shield, the Karl Rove–founded American Crossroads and the American Justice Partnership, a conservative legal group that has been a partner of the American Legislative Exchange Council, a state-based conservative advocacy group. Funding from these corporate interests allowed the RSLC to spend $30 million on state races in 2010, including $1.2 million in North Carolina.
One of the group’s largest funders in North Carolina was Art Pope, a furniture magnate who has bankrolled much of the state’s conservative movement. Pope’s Variety Wholesalers gave $36,500 to the RSLC in July 2010. The RSLC then gave $1.25 million to a group called Real Jobs NC to run attack ads against Democrats. In total, Pope and Pope-supported entities spent $2.2 million on twenty-two state legislative races, winning eighteen. After the election, the GOP redistricting committees hired the RSLC’s redistricting expert, Tom Hofeller, to redraw North Carolina’s districts. He was paid with state dollars through the General Assembly budget. (Hofeller says he has also been “intensely involved” in this cycle’s redistricting process in Alabama, Massachusetts, Texas and Virginia.)
Pope has long been “the moving force behind Republican redistricting efforts in North Carolina,” says Dan Blue Jr. (Pope says he supports an independent state redistricting commission.) In 1992 Pope urged Blue, then Statehouse Speaker, to create twenty-six majority-minority districts. Blue refused, creating nineteen instead. Pope then sued him. “He seemed to believe that African-Americans were required to be represented by African-Americans,” Blue says. Twenty years later, Hofeller enacted Pope’s strategy. “The best recent example of success is in North Carolina,” the RSLC wrote in a July 2011 blog post.
* * *
The strategy was repeated in other Southern states including Georgia, Louisiana and South Carolina, as Republicans created new majority-minority districts at the state level as a means to pack Democrats into as few districts as possible. They also increased the BVAP in existing majority-minority Congressional districts held by Democrats like Jim Clyburn in South Carolina and Bobby Scott in Virginia, who have occupied their seats for almost two decades.
Yet this year, unlike in past cycles, the unholy alliance between white Republicans and black Democrats has dissolved. Stacey Abrams, the first African-American leader of the Georgia House, denounced the GOP plan to create seven new majority-minority districts in the Statehouse but eliminate the seats of nearly half the white Democrats. “Republicans intentionally targeted white Democrats, thinking that as an African-American leader I wouldn’t fight against these maps because I got an extra number of black seats,” she says. “I’m not the chair of the ‘black caucus.’ I’m the leader of the Democratic caucus. And the Democratic caucus has to be racially integrated in order to be reflective of the state.” Under the new GOP maps, Abrams says, “we will have the greatest number of minority seats in Georgia history and the least amount of power in modern history.”
Democrats accounted for 47 percent of the statewide vote in Georgia in 2008 and 2010 but, thanks to redistricting, can elect just 31 percent of Statehouse members. Abrams is especially upset that Republicans pitted incumbent white Democrats against incumbent black Democrats in four House districts in Atlanta, which she sees as an attempt to divide the party through ugly racial politics. “They placed whites who represented majority-minority districts against blacks who represented majority-minority districts and enhanced the number of minority voters in those districts in order to wipe the white Democrats out,” she explains. The new districts slither across the metropolis to pick up as many black voters as possible. Abrams says the new maps “look like a bunch of snakes that got run over.”
The same thing happened in the Georgia Senate, where Republicans targeted State Senator George Hooks, who has been in the body since 1991 and is known as the “dean of the Senate.” Hooks represented the peanut farming country of rural Southwest Georgia, including Plains, the hometown of Jimmy Carter. Republicans dismantled his district, which had a BVAP of 43 percent, and created a new GOP district in North Georgia with a BVAP of 8 percent. They moved the black voters in his district into two adjoining majority-minority districts and two white Republican districts, and pitted Hooks against an incumbent black Democrat in a district that is 59 percent black. His political career is likely finished.
The GOP similarly took aim at Representative John Barrow, the last white Democrat from the Deep South in the US House. Republicans increased the BVAP in three of the four majority-minority Congressional districts represented by Georgia Democrats but decreased the BVAP from 42 to 33 percent in Barrow’s east Georgia seat, moving 41,000 African-Americans in Savannah out of his district. Just to be sure, they also drew Barrow’s home out of the district as well. Based on population shifts—Georgia gained one new seat from the 2010 census—the district could have become a new majority-minority district, but instead it’s much whiter and thus solidly Republican.
As a consequence of redistricting, Republicans could control ten of Georgia’s fourteen Congressional districts, up from eight in 2010, and could hold a two-thirds majority in the State Legislature, which would allow the party to pass constitutional amendments without a single Democratic vote. When the dust settles, Georgia and North Carolina could send twenty Republicans, five black Democrats and two white Democrats to the US House. That’s a generous number of Democrats compared with Alabama, Louisiana, Mississippi and South Carolina, which each have only one Democratic Representative in Congress—all of them black, from majority-minority districts.
In 1949 white Democrats controlled 103 of 105 House seats in the former Confederacy. Today the number is sixteen of 131, and it could reach single digits after 2012. “I should be stuffed and put in a museum when I pass away,” says Representative Steve Cohen, a white Democrat who represents a majority-minority district in Memphis, “and people can say, ‘Yes, a white Southern Democrat once lived here.’”
Unlike the Republican Party, which is 95 percent white in states like Georgia, North Carolina and South Carolina, the Democratic Party can thrive only as a multiracial coalition. The elimination of white Democrats has also crippled the political aspirations of black Democrats. According to a recent report from the Joint Center for Political and Economic Studies, only 4.8 percent of black state legislators in the South serve in the majority. “Black voters and elected officials have less influence now than at any time since the civil rights era,” the report found. Sadly, the report came out before all the redistricting changes had gone into effect. By the end of this cycle, Republicans in Georgia, South Carolina and Tennessee could have filibuster-proof majorities in their legislatures, and most white Democrats in Alabama and Mississippi (which haven’t completed redistricting yet) could be wiped out.
* * *
Texas, a state not known for subtlety, chose to ignore its rapidly growing minority population altogether. One of four majority-minority states, Texas grew by 4.3 million people between 2000 and 2010, two-thirds of them Hispanics and 11 percent black. As a result, the state gained four Congressional seats this cycle. Yet the number of seats to which minority voters could elect a candidate declined, from eleven to ten. As a result, Republicans will pick up three of the four new seats. “The Texas plan is by far the most extreme example of racial gerrymandering among all the redistricting proposals passed by lawmakers so far this year,” says Elisabeth MacNamara, president of the League of Women Voters.
As in the rest of the South, the new lines were drawn by white Republicans with no minority input. As the maps were drafted, Eric Opiela, counsel to the state’s Congressional Republicans, referred to key sections of the Voting Rights Act as “hocus-pocus.” Last year the Justice Department found that the state’s Congressional and Statehouse plans violated Section 5 of the VRA by “diminishing the ability of citizens of the United States, on account of race, color, or membership in a language minority group, to elect their preferred candidates of choice.” (Texas has lost more Section 5 enforcement suits than any other state.)
Only by reading the voluminous lawsuits filed against the state can one appreciate just how creative Texas Republicans had to be to so successfully dilute and suppress the state’s minority vote. According to a lawsuit filed by a host of civil rights groups, “even though Whites’ share of the population declined from 52 percent to 45 percent, they remain the majority in 70 percent of Congressional Districts.” To cite just one of many examples: in the Dallas-Fort Worth area, the Hispanic population increased by 440,898, the African-American population grew by 152,825 and the white population fell by 156,742. Yet white Republicans, a minority in the metropolis, control four of five Congressional seats. Despite declining in population, white Republicans managed to pick up two Congressional seats in the Dallas and Houston areas. In fact, whites are the minority in the state’s five largest counties but control twelve of nineteen Congressional districts.
Based on these disturbing facts, a DC District Court invalidated the state’s maps and ordered a three-judge panel in San Antonio to draw new ones that better accounted for Texas’s minority population, which improved Democratic prospects. The Supreme Court, however, recently ruled that the San Antonio court must use the state’s maps as the basis for the new districts, at least until a separate three-judge panel in Washington decides whether the maps violate the VRA. Final arguments will take place January 31, in a case that could have far-reaching ramifications for the rights of minority voters not just in Texas but across the South.
* * *
In a recent speech about voting rights at the LBJ presidential library in Austin, Attorney General Eric Holder noted that “no fewer than five lawsuits” are challenging Section 5 of the Voting Rights Act, which he called the “keystone of our voting rights laws.” Section 5 requires that states covered by the act receive pre-clearance from the Justice Department or a three-judge District Court in Washington for any election law changes that affect minority voters.
Conservatives want to scrub this requirement. In a 2009 decision, the Supreme Court stopped short of declaring Section 5 unconstitutional but asserted that “the Act’s preclearance requirements and its coverage formula raise serious constitutional questions.” Justice Clarence Thomas, in a dissent, sought to abolish Section 5, arguing that intentional discrimination in voting “no longer exists.” But in September a US District Court judge dismissed a challenge to Section 5, writing that it “remains a ‘congruent and proportional remedy’ to the 21st century problem of voting discrimination in covered jurisdictions.” Voting rights experts expect the Supreme Court to address this issue in the coming year.
Meanwhile, just as they’re seeking to declare Section 5 unconstitutional, Republicans are also invoking the VRA as a justification for isolating minority voters. “There’s no question that’s an unintended consequence,” says Jankowski of the RSLC (which takes no position on Section 5). “Republicans benefit from the requirement of these majority-minority districts. It has hurt the Democratic Party’s ability to compete in the South.” But Kareem Crayton, a redistricting expert at the UNC School of Law, argues that Republicans “clearly decided to ignore what federal law requires,” noting that “a party that doesn’t like federal mandates all of a sudden getting religion and talking about the importance of federal voting rights is more than a little ironic.”
The VRA states that lawmakers must not diminish the ability of minority voters to participate in the political process or elect a candidate of their choice. “There’s nothing out there that says a state can’t draw a 42 percent black district instead of a 50 percent black district as long as black voters still have the opportunity to elect a candidate of choice,” argues Paul Smith, a prominent redistricting lawyer at Jenner & Block in Washington. The VRA, in other words, did not compel Republicans to pack minority voters into heavily Democratic districts. “Using the Voting Rights Act to justify racial discrimination is anathema to the purpose of the Voting Rights Act,” says Stacey Abrams.
But it’s also difficult for voting rights advocates to prove in federal court that packing minority voters into majority-minority districts diminishes their ability to elect candidates of choice. That’s why the Justice Department has pre-cleared redistricting plans in every Southern state so far except Texas, much to the chagrin of civil rights activists. (Plaintiffs may have better luck in state court in places like North Carolina, where the court has acknowledged that civil rights groups have raised “serious issues and arguments about, among other things, the extent to which racial classifications were used.”) “I have not been at all satisfied with the civil rights division of the Justice Department under the Obama administration,” says Joe Reed, a longtime civil rights activist and redistricting expert in Alabama.
Wasserman says the Justice Department is saving its legal firepower to challenge restrictive voting laws passed by Republicans in half a dozen Southern states since 2010. The laws require proof of citizenship to register to vote, cut back on early voting, curtail voter registration drives and require voters to produce a government-issued ID before casting a ballot. The department has already objected to South Carolina’s voter ID law, since blacks are more likely than whites to lack the necessary ID. “Every method that human ingenuity can conceive of is being used to undermine, dilute and circumvent the rights of minority voters to enjoy the franchise,” says Reed.
The use of race in redistricting is just one part of a broader racial strategy used by Southern Republicans to not only make it more difficult for minorities to vote and to limit their electoral influence but to pass draconian anti-immigration laws, end integrated busing, drug-test welfare recipients and curb the ability of death-row inmates to challenge convictions based on racial bias. GOP presidential candidates have gotten in on the act, with Newt Gingrich calling President Obama “the best food-stamp president in American history.” The new Southern Strategy, it turns out, isn’t very different from the old one.
Bookmakers are giving Senate Democrats long odds for keeping their majority. The probability of a Republican victory is about 86 percent, according to betting markets, or about six to one. Justin Wolfers in The New York Times.
Crush the car dealerships. State laws banning automakers from selling directly to consumers add several percentage points to the cost of every car and make buying one a hassle. Uncle Sam can encourage the states to establish a free market in auto retail. Vox
***
"Level The Playing Field: Let Credit Unions Offer Same Services As Commercial Banks"
Just after Labor Day, the Gluten and Allergen Free Expo stopped for a weekend at the Meadowlands Exposition Center. Each year, the event wends its way across the country like a travelling medicine show, billing itself as the largest display of gluten-free products in the United States. Banners hung from the rafters, with welcoming messages like “Plantain Flour Is the New Kale.” Plantain flour contains no gluten, and neither did anything else at the exposition (including kale). There were gluten-free chips, gluten-free dips, gluten-free soups, and gluten-free stews; there were gluten-free breads, croutons, pretzels, and beer. There was gluten-free artisanal fusilli and penne from Italy, and gluten-free artisanal fusilli and penne from the United States. Dozens of companies had set up tables, offering samples of gluten-free cheese sticks, fish sticks, bread sticks, and soy sticks. One man passed out packets of bread crumbs, made by “master bakers,” that were certified as gluten-free, G.M.O.-free, and kosher. There was even gluten-free dog food.
Gluten, one of the most heavily consumed proteins on earth, is created when two molecules, glutenin and gliadin, come into contact and form a bond. When bakers knead dough, that bond creates an elastic membrane, which is what gives bread its chewy texture and permits pizza chefs to toss and twirl the dough into the air. Gluten also traps carbon dioxide, which, as it ferments, adds volume to the loaf. Humans have been eating wheat, and the gluten in it, for at least ten thousand years. For people with celiac disease—about one per cent of the population—the briefest exposure to gluten can trigger an immune reaction powerful enough to severely damage the brushlike surfaces of the small intestine. People with celiac have to be alert around food at all times, learning to spot hidden hazards in common products, such as hydrolyzed vegetable protein and malt vinegar. Eating in restaurants requires particular vigilance. Even reusing water in which wheat pasta has been cooked can be dangerous.
Until about a decade ago, the other ninety-nine per cent of Americans rarely seemed to give gluten much thought. But, led by people like William Davis, a cardiologist whose book “Wheat Belly” created an empire founded on the conviction that gluten is a poison, the protein has become a culinary villain. Davis believes that even “healthy” whole grains are destructive, and he has blamed gluten for everything from arthritis and asthma to multiple sclerosis and schizophrenia. David Perlmutter, a neurologist and the author of another of the gluten-free movement’s foundational texts, “Grain Brain: The Surprising Truth About Wheat, Carbs, and Sugar—Your Brain’s Silent Killers,” goes further still. Gluten sensitivity, he writes, “represents one of the greatest and most under-recognized health threats to humanity.’’
Nearly twenty million people contend that they regularly experience distress after eating products that contain gluten, and a third of American adults say that they are trying to eliminate it from their diets. One study that tracks American restaurant trends found that customers ordered more than two hundred million dishes last year that were gluten- or wheat-free. (Gluten is also found in rye and barley; a gluten-free diet contains neither these grains nor wheat.) The syndrome has even acquired a name: non-celiac gluten sensitivity. “I’ve been gluten-free these last four years, and it has changed my life,’’ Marie Papp, a photographer, told me at the expo. “I would have headaches, nausea, trouble sleeping. I know that I’m intolerant because I gave it up and I felt better. That explanation is probably not scientific enough for you. But I know how I felt, how I feel, and what I did to make it change.” She went on, “I’m a foodie. It’s been five years since I had biscotti. And I just had one here, gluten-free. And it rocks.”
Sales of gluten-free products will exceed fifteen billion dollars by 2016, twice the amount of five years earlier. The growing list of gluten-free options has been a gift for many children, who no longer have to go through life knowing that they will never eat pizza, cookies, or cake. As with organic food, which was at first sold almost exclusively by outlets with a local clientele, the market is increasingly controlled by corporations. Goya and ShopRite both had booths at the expo; so did Glutino, which was founded in 1983 and has grown into a gluten-free conglomerate. “There were a lot of smaller gluten-free companies that were mom-and-pop-type shops,” Steven Singer, the co-founder of Glutino, said in an interview last month with the Globe and Mail. “So they had, like, a baking mix or a cookie mix, and they were all great people, but there was no business. And that is what drove us, the idea of being that one-stop shop in gluten-free, the category leader, the category captain.”
For many people, avoiding gluten has become a cultural as well as a dietary choice, and the exposition offered an entry ramp to a new kind of life. There was a travel agent who specialized in gluten-free vacations, and a woman who helps plan gluten-free wedding receptions. One vender passed out placards: “I am nut free,” “I am shellfish free,” “I am egg free,” “I am wheat free.” I also saw an advertisement for gluten-free communion wafers.
The fear of gluten has become so pronounced that, a few weeks ago, the television show “South Park” devoted an episode to the issue. In the episode, South Park became the first entirely gluten-free town in the nation. Federal agents placed anyone suspected of having been “contaminated” in quarantine at a Papa John’s surrounded by razor wire. Citizens were forced to strip their cupboards of offending foods, and an angry mob took a flamethrower to the wheat fields.
“No matter what kind of sickness has taken hold of you, let’s blame gluten,’’ April Peveteaux writes in her highly entertaining book “Gluten Is My Bitch.” (Peveteaux maintains a blog with the same name.) “If you want or need to get gluten out of your diet, bravo! Kick that nasty gluten to the curb. . . . Not sure if gluten-free is for you? Perhaps gluten simply causes you some discomfort, but you’ve never been diagnosed. Then eff that gluten!’’
Wheat provides about twenty per cent of the world’s calories and more nourishment than any other source of food. Last year’s harvest, of seven hundred and eighteen million tons, amounted to roughly two hundred pounds for every person on earth. In the United States, wheat consumption appears to fluctuate according to nutritional trends. It rose steadily from the nineteen-seventies to about 2000, a reflection of the growing concern over the relationships between meat and saturated fat, cholesterol, and heart disease. Since then, the number of people who say that wheat, barley, and rye make them sick has soared, though wheat consumption has fallen.
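The per-person figure checks out with simple arithmetic. Here is a quick back-of-the-envelope sketch (the roughly 7.2 billion world population is an assumption taken from the population piece later in this collection; the conversion factor is the standard 2,204.6 pounds per metric ton):

    # Back-of-the-envelope check of the per-person wheat figure.
    harvest_tons = 718e6          # metric tons of wheat harvested last year
    lbs_per_ton = 2204.6          # pounds per metric ton
    world_population = 7.2e9      # assumed, from the UN figures cited below

    lbs_per_person = harvest_tons * lbs_per_ton / world_population
    print(round(lbs_per_person))  # -> 220, i.e., "roughly two hundred pounds"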
Wheat is easy to grow, to store, and to ship. The chemical properties of flour and dough also make wheat versatile. Most people know that it is integral to bread, pasta, noodles, and cereal. But wheat has become a hidden ingredient in thousands of other products, including soups, sauces, gravies, dressings, spreads, and snack foods, and even processed meats and frozen vegetables. Nearly a third of the foods found in American supermarkets contain some component of wheat—usually gluten or starch, or both.
The most obvious question is also the most difficult to answer: How could gluten, present in a staple food that has sustained humanity for thousands of years, have suddenly become so threatening? There are many theories but no clear, scientifically satisfying answers. Some researchers argue that wheat genes have become toxic. Davis has said that bread today is nothing like the bread found on tables just fifty years ago: “What’s changed is that wheat’s adverse effects on human health have been amplified many-fold. . . . The version of ‘wheat’ we consume today is a product of genetic research. . . . You and I cannot, to any degree, obtain the forms of wheat that were grown fifty years ago, let alone one hundred, one thousand, or ten thousand years ago. . . . We have to restrict other carbohydrates beyond wheat, but wheat still stands apart as the worst of the worst.’’ Perlmutter is less restrained: “As many as forty percent of us can’t properly process gluten, and the remaining sixty percent could be in harm’s way.”
Although dietary patterns have changed dramatically in the past century, our genes have not. The human body has not evolved to consume a modern Western diet, with meals full of sugary substances and refined, high-calorie carbohydrates. Moreover, most of the wheat we eat today has been milled into white flour, which has plenty of gluten but few vitamins or nutrients, and can cause the sharp increases in blood sugar that often lead to diabetes and other chronic diseases.
Donald Kasarda, a researcher at the U.S. Department of Agriculture, has studied wheat genetics for decades. In a recent study published in the Journal of Agricultural and Food Chemistry, he found no evidence that a change in wheat-breeding practices might have led to an increase in the incidence of celiac disease. “My survey of protein content in wheat in the U.S. over approximately the past one hundred years did not support such an increase on the basis of historical data in comparison with recent data,’’ he subsequently told an interviewer.
Joseph A. Murray, a professor of medicine at the Mayo Clinic and the president of the North American Society for the Study of Celiac Disease, has also studied wheat genetics. He agrees with Kasarda. “The wheat grain is not a lot different than it was fifty years ago,’’ Murray told me. “Chemically, the contents just have not changed much. And there is something more important to note. Wheat consumption is going down, not up. I don’t think this is a problem that can be linked to the genetics of wheat.”
But something strange is clearly going on. For reasons that remain largely unexplained, the incidence of celiac disease has increased more than fourfold in the past sixty years. Researchers initially attributed the growing number of cases to greater public awareness and better diagnoses. But neither can fully account for the leap since 1950. Murray and his colleagues at the Mayo Clinic discovered the increase almost by accident. Murray wanted to examine the long-term effects of undiagnosed celiac disease. To do that, he analyzed blood samples that had been taken from nine thousand Air Force recruits between 1948 and 1954. The researchers looked for antibodies to an enzyme called transglutaminase; they are a reliable marker for celiac disease. Murray assumed that one per cent of the soldiers would test positive, matching the current celiac rate. Instead, the team found the antibodies in the blood of just two-tenths of one per cent of the soldiers. Then they compared the results with samples taken recently from demographically similar groups of twenty- and seventy-year-old men. In both groups, the biochemical markers were present in about one per cent of the samples.
“That suggested that whatever has happened with celiac disease has happened since 1950,’’ Murray said. “The increase affected young and old people equally.” These results imply that the cause is environmental.
Nobody can say for sure why the rise in celiac disease has been so rapid. The modern diet may be to blame. And there is also growing evidence, in animal studies and in humans, that our microbiome—the many bacterial species inhabiting our gut—can have a significant impact on a range of diseases. None of that, however, explains why so many people who don’t have celiac disease feel the need to give up gluten.
Gluten anxiety has been building for years, but it didn’t become acute until 2011, when a group led by Peter Gibson, a professor of gastroenterology at Monash University and the director of the G.I. unit at the Alfred Hospital, in Melbourne, seemed to provide evidence that gluten was capable of causing illness even in people who did not have celiac disease. Gibson and his colleagues recruited thirty-four people with irritable-bowel syndrome, all of whom had complained of stomach ailments that largely disappeared when they stopped eating gluten. He put them all on a strictly monitored gluten-free diet, but, unbeknownst to the subjects, about half got muffins and bread with gluten. It was a double-blind study, so neither the doctors nor the patients knew which muffins and bread contained gluten. But most of those who ate the gluten reported that the pain returned; for most of the others it did not. The study was small but meticulous, and the results were compelling. Several similar studies are now under way, but dietary research is notoriously time-consuming and difficult.
Gibson published his findings in the American Journal of Gastroenterology, but, along with other experts, he urged restraint in interpreting data from such a small study. Nevertheless, millions of people with vague symptoms of gastric distress suddenly found something concrete for which to blame their troubles. The market boomed, but the essential mystery remained unsolved: Why was gluten suddenly so hazardous? Perhaps, researchers thought, farmers had increased the protein (and gluten) content of wheat so drastically that people could no longer digest it properly.
But there is more to wheat than gluten. Wheat also contains a combination of complex carbohydrates, and the Australian team wondered if these could be responsible for the problems. Gibson and his colleagues devised a different study: they recruited a group of thirty-seven volunteers who seemed unable to digest gluten properly. This time, the researchers attempted to rule out the carbohydrates and confirm gluten as the culprit. Gibson put all the volunteers on a diet that was gluten-free and also free of a group of carbohydrates that he and his colleagues called FODMAPs, an acronym for a series of words that few people will ever remember: fermentable oligosaccharides, disaccharides, monosaccharides, and polyols. Not all carbohydrates are considered FODMAPs, but many types of foods contain them, including foods that are high in fructose, like honey, apples, mangoes, and watermelon; dairy products, like milk and ice cream; and fructans, such as garlic and onions.
Most people have no trouble digesting FODMAPs, but these carbohydrates are osmotic, which means that they pull water into the intestinal tract. That can cause abdominal pain, bloating, and diarrhea. When the carbohydrates enter the small intestine undigested, they move on to the colon, where bacteria begin to break them down. That process causes fermentation, and one product of fermentation is gas. In Gibson’s new study, when the subjects were placed on a diet free of both gluten and FODMAPs, their gastrointestinal symptoms abated. After two weeks, all of the participants reported that they felt better. Some subjects were then secretly given food that contained gluten; the symptoms did not recur. The study provided evidence that the 2011 study was wrong—or, at least, incomplete. The cause of the symptoms seemed to be FODMAPs, not gluten; no biological markers were found in the blood, feces, or urine to suggest that gluten caused any unusual metabolic response.
In fact, FODMAPs seem more likely than gluten to cause widespread intestinal distress, since bacteria regularly ferment carbohydrates but ferment protein less frequently. Although a FODMAP-free diet is complicated, it permits people to eliminate individual foods temporarily and then reintroduce them systematically to determine which, if any, are responsible for their stomach problems. FODMAPs are not as trendy as gluten and not as easy to understand. But, biologically, their role makes more sense, Murray says.
“That first paper, in 2011, blew our minds,” Murray told me. “Essentially, it said that people are intolerant of gluten, and it was based on a well-designed, double-blind study. When people were challenged with gluten, by eating the muffins, they got sick. We just couldn’t figure it out. But then came the second study. By then, it was almost too late to put the genie back in the bottle. You have millions of people out there completely convinced that they feel better when they don’t eat gluten—and they don’t want to hear anything different.”
The FODMAP research, while influential and highly regarded, involved fewer than a hundred people, not enough to account definitively for the number of people who have abandoned foods that contain gluten. Several groups are trying to repeat those results. But studies like that take time. At present, there are no blood tests, biopsies, genetic markers, or antibodies that can confirm a diagnosis of non-celiac gluten sensitivity. There have been a few studies suggesting that people without celiac disease have a reason to eliminate gluten from their diet. But most of the data are unclear or preliminary. Doctors rarely diagnose non-celiac gluten sensitivity, and many don’t believe that it exists. Few people seem to have been deterred by the lack of evidence. “Everyone is trying to figure out what is going on, but nobody in medicine, at least not in my field, thinks this adds up to anything like the number of people who say they feel better when they take gluten out of their diet,” Murray said. “It’s hard to put a number on these things, but I would have to say that at least seventy per cent of it is hype and desire. There is just nothing obviously related to gluten that is wrong with most of these people.’’
About a month ago, in an attempt to gain a better understanding of the role that gluten plays in our diet, I flew to Seattle, then drove north for an hour, to Mount Vernon, where Washington State University’s Bread Lab is situated. The lab is part of the university’s wheat-breeding program; by studying the diversity of the grains grown in the Pacific Northwest, researchers there hope to determine which are most suitable for baking, brewing, and making pasta. Dan Barber, a chef and the co-owner of the Blue Hill restaurants, in Manhattan and in Pocantico Hills, had suggested that I visit Stephen Jones, a molecular cytogeneticist and the lab’s director. Barber, in his recent book “The Third Plate,” describes Jones as a savior of traditional wheat in a world that has transformed most crops into bland industrial commodities. I was more eager to hear what he had to say about the implications of adding extra gluten to bread dough, which has become routine in industrial bakeries.
Jones, a strapping man with an aw-shucks manner, has spent the past twenty-five years trying to figure out the best way to make a loaf of bread. The amount of gluten added to industrially made bread keeps increasing, and Jones has become acutely interested in whether that extra gluten may be at least partly responsible for the gastrointestinal distress reported by so many people. “My Ph.D. was on the genetics of loaf volume—looking at chromosomes and relating them to the strength of the dough in bread,’’ Jones said, as he greeted me at the entrance to the research center. The inviting, if somewhat incongruous, aroma of freshly baked bread filled the building. Jones’s lab is unique; few bakeries have Brabender farinographs, which Jones and his team use in their search for the ideal ratio of gluten to water in dough, and to measure the strength of flour. Nor can there be many labs with a Matador deck baking oven, which can accommodate more than a dozen loaves at a time, and which circulates heat uniformly, at hot enough temperatures, to insure a voluminous loaf and the strongest possible crust.
For all the high-tech gadgets on display in the Bread Lab, the operation is decidedly old-fashioned, relying on stone mills of a type that have not been used for more than a century and on a philosophy that all it takes to make genuine and delicious whole-wheat bread is time, talent, flour, a little salt, and lots of water. There are essentially two ways to turn flour into bread. The first is the way it was done for most of human history: let the flour absorb as much water as possible and give it time to ferment, a process that allows yeast and bacteria to activate the dough. Kneading then binds the two proteins that come together to form gluten. Most of the bread consumed in the United States is made the other way: in place of hydration, fermentation, and kneading, manufacturers save time by relying on artificial additives and huge industrial mixers to ram together the essential proteins that form gluten.
Until the late nineteenth century, when steel rollers and industrial mills came into use, wheat was ground on stones, a slow and imprecise process. Steel was fast, efficient, and easy to maintain, and it permitted millers to discard the germ and the bran in the wheat kernel and then rapidly process the starchy endosperm. This made white flour. Almost nobody seemed to notice, or care, that by tossing out the rest of the kernel industrial bakers were stripping bread of its vitamins, its fibre, and most of its healthy fats. White bread was seen as an affordable luxury. Like many Jews arriving from Russia at the turn of the twentieth century, my great-grandfather had never seen white bread before, but when he did he immediately made what was referred to, at least in my family, as an “American sandwich”: he took two pieces of the black bread that he had always eaten, and carefully placed a piece of industrially made white bread between them. He is said to have been delighted.
The Bread Lab team, which includes the patient, inventive baker Jonathan Bethony, uses whole grains, water, salt, and yeast. Nothing else. Whole-wheat bread, even when it’s good, is usually dense and chewy, and rarely moist; Bethony’s bread was remarkably airy and light. It contains only the natural gluten formed by kneading the flour. Most bakers, even those who would never go near an industrial mixing machine, include an additive called vital wheat gluten to strengthen the dough and to help the loaf rise. (In general, the higher the protein content of wheat, the more gluten it contains.)
Vital wheat gluten is a powdered, concentrated form of the gluten that is found naturally in all bread. It is made by washing wheat flour with water until the starches dissolve. Bakers add extra gluten to their dough to provide the strength and elasticity necessary for it to endure the often brutal process of commercial mixing. Vital wheat gluten increases shelf life and acts as a binder; because it’s so versatile, food companies have added it not only to bread but to pastas, snacks, cereals, and crackers, and as a thickener in hundreds of foods and even in some cosmetics. Chemically, vital wheat gluten is identical to regular gluten, and no more likely to cause harm. But the fact that it is added to the protein already in the flour worries Jones. “Vital wheat gluten is a crutch,’’ he said. “It’s all storage and functionality. No flavor. People act as if it were magic. But there is no magic to food.”
Jones is a careful scientist, and he said more than once that he had no evidence that a growing reliance on any single additive could explain why celiac disease has become more common, or why so many people say that they have trouble digesting gluten. But he and his colleagues are certain that vital wheat gluten makes bread taste like mush. “Flour that is sliced and packed into plastic wrapping in less than three hours—that’s not bread,’’ Jones said. He and Bethany Econopouly, one of his doctoral students, recently published an essay in the Huffington Post in which they argue that the legal definition of the word “bread” has become meaningless and ought to be changed: “FDA regulations state that for bread to be labeled as ‘bread,’ it must be made of flour, yeast, and a moistening ingredient, usually water. When bleached flour is used, chemicals like acetone peroxide, chlorine, and benzoyl peroxide (yes, the one used to treat acne) can be included in the recipe and are masked under the term ‘bleached.’ Optional ingredients are also permissible in products called bread: shortening, sweeteners, ground dehulled soybeans, coloring, potassium bromate . . . and other dough strengtheners (such as bleaching agents and vital gluten).”
Could millions of people simply be eating too much vital wheat gluten? There are no real data to answer that question, but Jones is not alone in seeking to gain a better understanding of the potential physiological impact. Joseph Murray, at the Mayo Clinic, has begun studying its effect on the immune system. Murray says, “This is a major component of the bread we eat, and we don’t know much about it. It’s very important that we figure out what effect, if any, there is when we add all that extra gluten to bread.’’
Paradoxically, the increased consumption of vital wheat gluten can be attributed, at least in part, to a demand for healthier baked goods. It is not possible to manufacture, package, and ship large amounts of industrially made whole-grain bread without adding something to help strengthen the dough. Jones refers to these products generically as “Bob’s groovy breads.’’ Look closely at labels of “healthy” whole-wheat breads, and it’s easy to understand what he means. (After my trip to Seattle, the first bread I saw that advertised itself as having been milled from hundred-per-cent whole grains contained many ingredients. The first four, listed in descending order of weight or volume, were whole-wheat flour, water, wheat gluten, and wheat fibre. In other words: gluten, water, more gluten, and fibrous gluten.) In the promotional videos for Dave’s Killer Bread, a popular brand, the founder, Dave, speaks glowingly about the properties of gluten. Pictures of the factory show pallets stacked with fifty-pound bags of vital wheat gluten. “I just wonder how much of this additional gluten our bodies can digest,’’ Jones told me when I was at the Bread Lab. “There has to be some limit.”
I was having trouble visualizing vital wheat gluten as a discrete substance. When I said that, Jones nodded at Econopouly, and she left the room. Two minutes later, she returned and handed me a shard of vital wheat gluten. It looked like a prehistoric weapon, or the hardened bone marrow of a small mammal. “We put a plug of gluten in Coke and it foamed for a while, then became a glob that sat there for weeks,’’ Jones said. “It didn’t disintegrate into slime and mush. It just stayed there.’’ He took the plug out of my hands and slapped it on the lab counter. Nothing happened. “The stuff is simply indestructible,’’ he said.
The next morning, before leaving Seattle, I stopped by the offices of Intellectual Ventures, the patent and invention factory run by Nathan Myhrvold, the former chief technology officer at Microsoft. Myhrvold has long been a serious amateur chef and has also served as a gastronomic adviser to the Zagat Survey. Three years ago, he published “Modernist Cuisine: The Art and Science of Cooking,’’ a six-volume, twenty-four-hundred-page set of books that quickly became an essential guide for chefs around the world. Since then, Myhrvold and his team have been working on an equally ambitious follow-up project, tentatively called “The Art and Science of Bread.’’ The book won’t be ready for at least another year, but Myhrvold has said that it will be both a comprehensive history of bread and an exhaustive guide to baking it.
The project’s chef, Francisco Migoya, asked me if I had ever eaten gluten by itself. I shook my head. He placed a small ball of raw gluten in a microwave and pressed start. After about twenty seconds, the gluten puffed up like a balloon, at which point it was removed, set carefully on a plate, and served. It had the texture of pork rind. Gluten has a long culinary history, and has become a common substitute for meat and tofu. In Asia, where it is particularly popular, gluten is called seitan, and it is often steamed, fried, or baked.
Myhrvold wasn’t in town that day, but I caught up with him later. He is highly opinionated, and delights in controversy; saying the words “gluten-free” to him was like waving a red flag at a bull. “When I was a kid, I would watch National Geographic specials all the time,’’ he told me. “Often, they would travel to remote places and talk to shamans about evil spirits. It was an era of true condescension; the idea was that we know better and these poor people are noble, but they think that spirits are everywhere. That is exactly what this gluten-free thing is all about.” He stressed that he was not referring to people with celiac disease or questioning the possibility that some others might also have trouble eating gluten. “For most people, this is in no way different from saying, ‘Oh, my God, we are cursed.’ We have undergone what amounts to an attack of evil spirits: gluten will destroy your brain, it will give you cancer, it will kill you. We are the same people who talk to shamans.
“To find out the effect something like gluten has on people’s diets is complicated,’’ he said. “We’ll need long-term studies, and there won’t be a useful answer for years. So, instead of telling everyone you are going on a gluten-free diet, what if you said, ‘Hey, I am going on an experimental regimen, and it will be years before we know what effect it might have.’ I don’t know about you, but instead of saying ‘Eat this because it will be good for you,’ I would say, ‘Good luck.’ ’’
Fad dieting is nothing new in America; it’s what we do instead of eating balanced, nutritious meals. Scarsdale, Atkins, South Beach, Zone, flexitarian, pescatarian, and paleo have all been awarded their fifteen minutes of fame and then shoved aside for the next great diet. They are rarely effective for long. Some nutrition specialists say that the current preoccupation with gluten-free products reminds them of the national obsession with removing fats from foods in the late nineteen-eighties. “Low-fat” foods are often packed with sugar and calories to make up for the lack of fat. The same is true of many products that are advertised as “gluten-free.”
While there are no scientific data to demonstrate that millions of people have become allergic or intolerant to gluten (or to other wheat proteins), there is convincing and repeated evidence that dietary self-diagnoses are almost always wrong, particularly when the diagnosis extends to most of society. We still feel more comfortable relying on anecdotes and intuition than on statistics or data. Since the nineteen-sixties, for example, monosodium glutamate, or MSG, has been vilified. Even now, it is common to see Chinese restaurants advertise their food as “MSG-free.” The symptoms that MSG is purported to cause—headaches and palpitations are among the most frequently cited—were initially described as “Chinese-restaurant syndrome” in a letter published, in 1968, in The New England Journal of Medicine. The Internet is filled with sites that name the “hidden” sources of MSG. Yet, after decades of study, there is no evidence that MSG causes those symptoms or any others. This should surprise no one, since there are no chemical differences between the naturally occurring glutamate ions in our bodies and those present in the MSG we eat. Nor is MSG simply an additive: there is MSG in tomatoes, Parmesan, potatoes, mushrooms, and many other foods.
Our abject fear of eating fat has long been among the more egregious examples of the lack of connection between nutritional facts and the powerful myths that govern our eating habits. For decades, low-fat diets have been recommended for weight loss and to prevent heart disease. Food companies have altered thousands of products so that they can be labelled as low in fat, but replacing those fats with sugars, salt, and refined carbohydrates makes the food even less healthy. “Almost all of this has proved to be nonsense,’’ Myhrvold said. “Research shows that the total amount of fat in the diet isn’t really linked to weight or disease. What matters is the type of fat and the total calories you consume.” Bad fats increase the risk of death from heart disease and good fats lower it.
Margarine is a bad fat. Yet for decades doctors encouraged consumers to eat it, instead of butter, because butter is laden with saturated fat, which was considered even more dangerous than the fat in margarine. The assumption was not tested until the early nineteen-nineties, when researchers at the Harvard School of Public Health began to analyze data from the Nurses’ Health Study, which had followed the health of ninety thousand nurses for more than a decade. The study showed that women who ate four teaspoons of margarine a day had a fifty per cent greater risk of heart disease than those who rarely or never ate margarine. Yet again, the intuitive advice followed by so many people had been wrong.
Peter H. R. Green, the director of the celiac-disease center at the Columbia University medical school and one of the nation’s most prominent celiac doctors, says that the opposition to gluten has followed a similar pattern, and that it is harming at least as many people as it is helping. “This is a largely self-diagnosed disease,’’ Green said, when I visited his office, at New York-Presbyterian Hospital. “In the absence of celiac disease, physicians don’t usually tell people they are sensitive to gluten. This is becoming one of the most difficult problems that I face in my daily practice.”
He went on, “I recently saw a retired executive of an international company. He got a life coach to help him, and one of the pieces of advice the coach gave him was to get on a gluten-free diet. A life coach is prescribing a gluten-free diet. So do podiatrists, chiropractors, even psychiatrists.’’ He stopped, stood up, shook his head as if he were about to say something he shouldn’t, then shrugged and sat down again. “A friend of mine told me his wife was seeing a psychiatrist for anxiety and depression. And one of the first things the psychiatrist did was to put her on a gluten-free diet. This is getting out of hand. We are seeing more and more cases of orthorexia nervosa”—people who progressively withdraw different foods in what they perceive as an attempt to improve their health. “First, they come off gluten. Then corn. Then soy. Then tomatoes. Then milk. After a while, they don’t have anything left to eat—and they proselytize about it. Worse is what parents are doing to their children. It’s cruel and unusual treatment to put a child on a gluten-free diet without its being indicated medically. Parental perception of a child’s feeling better on a gluten-free diet is even weaker than self-perception.”
The initial appeal, and potential success, of a gluten-free diet is not hard to understand, particularly for people with genuine stomach ailments. Cutting back on foods that contain gluten often helps people reduce their consumption of refined carbohydrates, bread, beer, and other highly caloric foods. When followed carefully, those restrictions help people lose weight, particularly if they substitute foods like quinoa and lentils for the starches they had been eating. But eliminating gluten is complicated, inconvenient, and costly, and data suggest that most people don’t do it for long.
The diet can also be unhealthy. “Often, gluten-free versions of traditional wheat-based foods are actually junk food,’’ Green said. That becomes clear after a cursory glance at the labels of many gluten-free products. Ingredients like rice starch, cornstarch, tapioca starch, and potato starch are often used as replacements for white flour. But they are highly refined carbohydrates, and release at least as much sugar into the bloodstream as the foods that people have forsaken. “Our patients have jumped on this bandwagon and largely left the medical community wondering what the hell is going on,’’ Green said.
“You know, people are always dropping off samples of gluten-free products at our office. And when I eat them I regret it. I get heartburn. I feel nauseous. Because what are the things that sell food? Salt, sugar, fat, and gluten. If the makers take one away, then they add more of another to keep it attractive to people. If you don’t have celiac disease, then these diets are not going to help you.” People seem to forget that a gluten-free cake is still a cake.
I have been baking bread for more than thirty years, and there are few things I find more satisfying than turning a pound of wheat into something that I can feed to my friends. But it’s not always easy to believe in gluten these days. A couple of years ago, having learned that the nutrients and vitamins in wheat berries begin to degrade soon after they are processed, I bought a home mill and began to make my own flour. I started ordering wheat, in fifty-pound buckets, from places in Montana and South Dakota. I bought books that explained the differences between hard red winter wheat, which is good for whole-grain bread, and soft white wheat, which has a lower protein content and is used mostly for cookies, cakes, and pastries. I acquired sourdough starter from a friend, and treat it like a pet.
I have run into a couple of problems, however. The first was technical: I couldn’t make the bread rise. I decided early on to bake only whole-wheat bread, but there just wasn’t enough protein in any combination of the grains I used. The bread often looked like brown matzoh, so I began to root around the Internet, and soon stumbled on the solution: vital wheat gluten. (“If you want to keep your bread 100% whole wheat, vital wheat gluten is your new best friend,’’ a message on one bread forum said. “This stuff is super-concentrated gluten flour, and it really helps to give low-gluten doughs better structure.”) That turned out to be true. It was like pumping air into a flat tire. A few tablespoons mixed into my flour, and the bread became elastic and chewy, and it looked like a normal loaf of bread; vital wheat gluten became my magic wand. Gradually, another problem arose, as more and more of my friends began to say, “Thanks, but I am staying away from gluten these days.”
I told Jonathan Bethony, the baker at the Bread Lab, about my gluten issue. Then he told me about his. “I went into baking because I thought it was a wholesome form of expression,’’ he said while kneading a loaf he would bake the next day. “I kept hearing about this gluten thing all the time. How gluten was so dangerous, and it was really getting me down in my heart. I started to ask myself, Am I making people sick? Have I become this spear of death?’’ He began to think about a different profession.
“It came to a head one day while I was working at a groovy natural health-food store in the Bay Area,” he went on. “My wife came home from work and said, ‘Sweetie, there is something I have to tell you. The doctor said that I am gluten intolerant. I can’t eat bread anymore.’ ” Bethony looked up from his dough. “I held it in as long as I could, but I just lost it. I had brought a loaf home with me, and I went charging up the stairs as fast as I could and launched that loaf from the balcony like a football.’’ Now Bethony wondered whether he ought to quit. But a famous baker who lived nearby encouraged him to stick with it, and taught him to bake with nothing but whole grains and lots of water, and to leave plenty of time for the bread to ferment. The results have been sublime.
Later that week, I flew back to New York, went home, and dumped my vital wheat gluten in the trash. I have returned to baking whole-wheat bread the way it is supposed to be made: water, yeast, flour, and salt. I will try to live without the magic wand. But I am certainly not going to live without gluten. That just seems silly. ♦
Michael Specter has been a staff writer at The New Yorker since 1998, and has written frequently about AIDS, T.B., and malaria in the developing world, as well as about agricultural biotechnology, avian influenza, the world’s diminishing freshwater resources, and synthetic biology.
Pollution is leading to blooms of luminescent algae in Florida waters. The rivers are glowing in the dark due to nitrogen, phosphorus and other pollutants. Destructive, sometimes toxic blooms are becoming more common around the country. In Toledo, residents were forced to shut off their taps. Joby Warrick and Darryl Fears in The Washington Post
"Behind Ohio Drinking-Water Ban, A Lake Erie Mystery," Christian Science Monitor
A view of NYC to the north from the 75th floor of 432 Park Avenue, October 15, 2014.
Alan: Population growth falls in lockstep with economic progress and improved living standards.
Case in point: in the last 40 years, the average number of children born to a Mexican woman has declined from nearly 7 to about 2.5.
I recall a University of Toronto lecture given in the late 1960s which examined an island deer population whose local wolves suddenly disappeared. In a short span, the deer population exceeded the carrying capacity of the land and then suddenly declined by one third... or two thirds. (Lamentably, I do not remember which of these percentages obtained.)
Although this regulatory mechanism is harsh, it reveals that population trend lines only continue so long before catastrophic "correction."
Stop pretending we can fix the environment by curbing population growth
When ecologist Corey Bradshaw gives public talks about the state of the globe's endangered wildlife, he sees a recurrent phenomenon. At the end of the talk, in the question and answer period, "someone will stand up and say, 'You’ve neglected the elephant in the room -- human population size is the principal problem,'" says Bradshaw, who is director of ecological modeling at the Environment Institute at the University of Adelaide in Australia.
It's easy to see where this idea comes from: There are a lot of humans on the Earth. There are currently more than 7.2 billion of us, and the UN projects there will be 9.6 billion by 2050, and 10.9 billion by 2100. In fact, though estimates vary, something on the order of 6.5 to 14 percent of all the people who have ever lived at any time, ever, are living now. (Stop and think about that for a minute.) And there's no denying these people take up space and consume resources.
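Those percentages can be sanity-checked by simple inversion; the back-calculation below is just arithmetic on the figures above, not an independent demographic estimate:

    # If today's 7.2 billion people are 6.5 to 14 percent of everyone who
    # has ever lived, the implied total of humans ever born is:
    alive_now = 7.2e9
    low_total = alive_now / 0.14      # living people are 14% of the total
    high_total = alive_now / 0.065    # living people are 6.5% of the total
    print(f"{low_total / 1e9:.0f} to {high_total / 1e9:.0f} billion humans, ever")

That works out to roughly 51 to 111 billion people ever born, a range consistent with commonly cited demographic estimates.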
So Bradshaw decided to look into this question of whether trying to reduce the size of the global population would help stave off climate change, the loss of species, and other environmental concerns. The resulting research, just out in the Proceedings of the National Academy of Sciences and co-authored by the University of Adelaide's Barry Brook, seriously challenges the idea that greens ought to be campaigning for population control.
On top of the serious ethical problems with trying to restrict the global population, the study also finds a purely practical one: It doesn't even appear possible. "No matter what levers you pull, we have such a huge demographic momentum, there’s no way we can rein in the human population fast enough to address sustainability issues in the next century," says Bradshaw. Or as the study itself puts it, the increase in population over the course of the 21st century is "virtually locked-in"; this means, the authors argue, that population reduction "cannot be argued to be the elephant in the room for immediate environmental sustainability and climate policy."
To show as much, the researchers used population models to examine nine different scenarios for the global human population over the course of the 21st century. One involved a "humane" approach to population control, in which female empowerment and greater access to contraception lead to better family planning and women having more control over their fertility. Two others modeled a much harsher "one child" policy, implemented either gradually by 2100, or more rapidly by 2045.
Rather grimly, the study also modeled a number of hypothetical disaster scenarios in which large swaths of humanity die due to climate change, pandemics, or wars. In one scenario, Bradshaw and Brook added together the number of people killed in World War I, World War II, and the deadly Spanish Flu (roughly 131 million) and calculated what percentage of humanity that represented at the close of the Second World War. The answer was about 5.2 percent. So in one scenario, the study assumed a similar 5 percent die-off of the human population circa 2056 (when that total would be 500 million people out of a projected 9.95 billion living on the globe). Higher catastrophe scenarios contemplated the staggering deaths of 2 billion and 6 billion people worldwide. In a climate-change scenario simulating food shortages, meanwhile, childhood mortality dramatically increased.
The upshot was that only very unrealistic and extreme scenarios -- 6 billion people suddenly killed in a catastrophic war or pandemic; or a sudden, draconian and globally enforced one-child policy -- dramatically changed the trajectory of population growth by 2100. With greater childhood mortality due to climate change, a dramatic die-off of 500 million people, and a gentle, humane move towards female empowerment, the result was always the same: We wound up roughly where the U.N. currently projects, or around 10 billion people by 2100. "The population has this natural resistance to catastrophe," says Bradshaw.
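That resistance is easy to illustrate. (The 5.2 percent figure above also checks out: 131 million was about one-twentieth of the roughly 2.5 billion people alive in the late 1940s.) The sketch below is deliberately crude -- simple compound growth, not the study's age-structured model -- with an assumed average growth rate of 0.5 percent a year, chosen only so that the no-catastrophe run lands near the U.N.'s roughly 11-billion projection for 2100:

    # Minimal sketch of demographic momentum (NOT the PNAS study's model).
    def project(pop, start_year, end_year, shock_year=None, shock_fraction=0.0):
        """Compound the population forward, with an optional one-time die-off."""
        for year in range(start_year, end_year):
            pop *= 1.005  # assumed 0.5% average annual growth
            if year == shock_year:
                pop *= 1.0 - shock_fraction
        return pop

    baseline = project(7.2e9, 2014, 2100)
    five_pct = project(7.2e9, 2014, 2100, shock_year=2056, shock_fraction=0.05)
    two_billion = project(7.2e9, 2014, 2100, shock_year=2056,
                          shock_fraction=2.0e9 / 8.9e9)  # ~8.9B alive in 2056 here
    print(f"{baseline / 1e9:.1f}B, {five_pct / 1e9:.1f}B, {two_billion / 1e9:.1f}B")
    # -> roughly 11.1B, 10.5B, and 8.6B by 2100

Even this toy model makes the paper's qualitative point: a one-time shock, even an enormous one, shifts the curve down proportionally, and compounding growth carries straight on through it.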
In some ways, just as significant as the new paper itself is the person who edited it: Stanford University's Paul Ehrlich, author of the 1968 classic book The Population Bomb, which in many ways put population concerns on the map. Population concerns have since become a third rail because of the ethical issues -- though Bradshaw emphasizes that he does not support any kind of population reduction except for the voluntary kind that naturally results from female empowerment, leading to greater control of fertility through contraception. "Fertility reduction through family planning is the only humane way to reduce the human population size," says Bradshaw.
By 2050, that approach might reduce the global population by a few hundred million people. That's not nothing, but Bradshaw emphasizes that it's no big dent in population projections. The reason is simply that at present, we're in a phase of population growth that is exponential. "Think about a speeding car," says Bradshaw. "We’re going 150 km per hour." We may slow it down eventually, but there is no way to take away the current momentum.
Alan: Yesterday's presidential election in Brazil (won by former revolutionary Dilma Rousseff) reminds me that in much of the world, elections are two-step affairs in which the lack of a majority winner in round one automatically triggers a second round of voting (usually a few weeks later) in which the top two vote getters go head-to-head.
If the United States adopted this electoral method, we would immediately know which "small-money" candidates are favored when people vote their conscience rather than choosing "the lesser of two evils."
Like any political method, two-tier voting would not guarantee "Kingdom Come" but would provide unprecedented insight into those "dark horses" that might reach the finish line first if their unexpectedly strong "flames" were fanned into real fire.
Alan: It occurred to me today that "the poor" don't vote and "the rich" do.
Although I like the rhetorical ring of Emma Goldman's quip, "If voting changed anything, they'd make it illegal," shunning the ballot box is politically counterproductive, diminishing the vision of poor people with the calamitous notion that they are "right" -- even "noble" -- not to vote.
Meanwhile the "smart money" "gets out" "the wealthy vote."
The rich throng the polls in such large numbers that a clear minority of Americans prolongs and expands the tentacular reach of plutocracy.
It is true that voting may not accomplish much.
But like Churchill's observation that "Democracy is the worst form of government except for all the others," the vote is the best political option we've got.
If everyone earning zippo to $50,000.00 a year voted, the federal government of the United States would be as enlightened as the governments of Europe, enacting single-payer healthcare, guaranteeing months of paid leave for new mothers and weeks of paid leave for new fathers, a minimum of four weeks' paid vacation, free college education, and a guaranteed national income of $25,000.00 (adjusted for inflation) like the one Richard Nixon proposed and, ironically, Milton Friedman supported.
Do the math, folks. If Richard Nixon got within spitting distance of a guaranteed national income, "the poor" can vote their way out of poverty simply by voting at the same rate wealthy people do.
"Guaranteed Minimum Income: The Most Conservative Way To Fight Poverty"
Part of the answer may be that the rich vote more than the poor. Seventy-eight percent of Americans making over $150,000 per year voted in the 2008 election, while less than half of those making under $30,000 per year voted.