Date: Tuesday, 16 Sep 2014 14:35

Over the next few weeks, American undergraduates flooding back to campus will take part in a university tradition even older than drinking from Solo cups or inhaling stale pizza: They’ll be setting up homes inside the rock-hewn walls of Gothic buildings that look like medieval castles, retrofitted for serious scholars. Many of these buildings were designed a century ago, when young American colleges—desperate to assert their legitimacy—went on a knock-off binge. They cloned British universities’ libraries, cathedrals, quads, sculptures, and even dress codes in the hopes of recreating the feel (and prestige) of Oxford and Cambridge.

These days, colleges in China are copying America’s copycat approach. There’s a university in Shanghai where faux English manor houses sit side-by-side with dorms modeled on Britain’s half-timbered homes. To the north, Hebei province boasts a university inspired by Harry Potter’s Hogwarts—itself fashioned on the traditional collegiate Gothic. Even specific colleges have been cloned. The University of Nottingham’s Ningbo campus features replicas of the U.K. school’s iconic landmarks, flanked by British gardens.

Elsewhere, Chinese developers hired the California-based Dahlin Group to design a high school resembling Stanford University. The theme was “generated by the developers’ love of the campus and the connotation of the highest quality of education,” says Dahlin Group partner Chip Pierson.

Certain campuses, like Ningbo’s Nottingham, are joint ventures between foreign schools and local administrators, who’ve used architecture to link their outpost with the mother ship. Others are located in second and third-tier cities—China’s less cosmopolitan but still-humming metropolises—at institutions that lack the name recognition of a school like Beijing’s top-rated Tsinghua University.

Yet not even Tsinghua is duplitecture-free. Its campus features a twin of the University of Virginia’s Rotunda, which is itself a reimagining of Rome’s Pantheon. Tsinghua’s copy-of-a-copy dates back to the first wave of collegiate Gothic that swept China in the early 20th century, when foreign missionaries embarked on an architecture spree, setting up universities that transposed the look and feel of places like Cambridge and Yale.

But the recent turn toward revival architecture has been largely driven by the Chinese themselves: Looking like the best schools in the world seems, to many, like the natural first step toward becoming one of the best schools in the world. It’s a “dress for the ranking you want, not the ranking you have” mentality, and the historic styles serve to make newer schools seem as though they’re bastions of a time-honored academic tradition.

All those arches and columns seem to be working. Laishan Lee, a 22-year-old from Hong Kong who is studying at the University of Nottingham Ningbo, says her school’s British buildings make her “feel the prestige” because the campus seems more advanced.

“In China, our culture really, really admires the western cultures,” Lee adds. Setting foot on Nottingham’s campus, “you feel like, ‘Oh, it’s different—you are in a higher-class society than the others.’”

The classical designs not only brand the institution, but offer the student body a taste of the overseas college experience, which is in high demand. The number of Chinese pupils enrolled in American schools more than doubled between 2008 and 2013 to over 235,000 people, according to the Institute of International Education. The number who apply and don’t get in is no doubt far greater. As Businessweek recently reported, one successful college consultant who works extensively with Asian clients and has offices in China can earn as much as $1.1 million for getting a single student into a top-ranked American school.

But for those who fail to gain one of these coveted spots, a place like Hebei University can deliver something that at least feels a bit like undergrad life abroad. It’s a taste of the Ivy League that doesn’t require leaving China.

“For sure [Nottingham's architecture] gives me more of an understanding of foreign culture and environments,” says Lee.

Another Nottingham student, 20-year-old Yangluan Luo, says that during the summer, Chinese high schoolers take classes at her university's campus because “they want students to experience this kind of architecture and culture.”

But this enthusiasm for foreign architecture is also a sign of something deeper: a shift toward what's seen as a more Western approach to education. From the 1950s, under Mao, through the early years of China’s political reforms in the 1980s, Chinese schools were meant to support the Communist Party’s revolutionary ideals and were closely supervised by the state. Bureaucrats imposed a rigid and centralized approach to schools' curricula—a mindset reflected in the imposing Soviet buildings common on campuses at the time.

In China’s current push to become the world’s superpower, such methods have given way to practices that more closely resemble those at Harvard or USC. Teachers working in China say there is a prevailing sense among parents and government officials that the nation’s universities are still playing catch-up to their counterparts in the West and must quickly learn how those institutions teach.

In 2001, China passed a set of reforms designed to prioritize “student-centered pedagogy,” emphasizing analytical thinking over memorization, and discussion rather than transmission, among other changes. China’s universities have also aggressively recruited foreign faculty to set up new programs, part of a national “Thousand Foreign Experts” initiative to lure skilled individuals to Chinese businesses and schools. Professors with experience teaching in China say they’ve seen a notable change over the past several years. There are more seminars, smaller classes, and more discourse and debate among their pupils—all contributing to a campus environment more similar to what they’d find overseas.

“There is definitely a trend of bringing more Western education ideas and practices [into colleges],” says Nini Suet, founder of Beijing-based Shang Learning, which offers leadership and skills training to Chinese students seeking to go abroad. “A lot of Chinese parents, they don't have a lot of faith in the pure Chinese educational system. They want their kids to receive the best education and to them the best education is a Western education.”

Gothic and revival styles can be one way to broadcast a school’s embrace of foreign ideas and practices—even though the actual changes within the classrooms may be far subtler than the over-the-top architecture implies.

For Luo, seeing the replica of Nottingham's clock tower on her Ningbo campus made her believe in the school’s commitment to a more British system.

“The first time I went to the university, I thought, ‘It’s real,’” says Luo. “Not just that people talk about that British system at the school, but that yes, I can see it. I can feel it. Everything is modern. I can see the integration between Chinese and U.K. culture.”

She explained this over Skype, on a call that started late because of a busy evening the night before. She’d been helping to organize an “Awards Ball” to welcome the new freshmen. The theme of the event was Harry Potter.

This article was originally published at http://www.theatlantic.com/education/archive/2014/09/chinese-colleges-are-trying-to-look-like-the-ivy-league/379997/

Author: "Bianca Bosker" Tags: "Education"

Date: Tuesday, 16 Sep 2014 14:00

In the American Association for Cancer Research's mammoth new cancer progress report lies the sad fact that about half of the 585,720 cancer deaths expected to occur in the United States this year are related to preventable behaviors. For a disease that often seems (and is) so senseless, it turns out that many cases can be avoided with lifestyle tweaks.

Smoking is the biggest one, associated with nearly 33 percent of preventable cancer diagnoses:


Top Preventable Causes of Cancer (American Association for Cancer Research)

But as this graph shows, a combination of weight problems, poor diet, and lack of exercise accounts for another third of all preventable cancers. Being overweight or obese is linked to colorectal, endometrial, gallbladder, kidney, pancreatic, and postmenopausal breast cancer.

The good news is that some kinds of cancer—like lung cancer—are on the decline. Others, though—like those of the pancreas, kidney, thyroid, and liver—are rising steadily.

"The cancers that are increasing are the ones that are associated with obesity," said AACR spokesman and University of Pennsylvania cancer epidemiologist Timothy Rebbeck.

Americans might be smoking less than ever, but obesity rates keep on climbing.

"These things are not independent of one another, so if you smoke and are overweight and are physically inactive, you have multiple hits," Rebbeck added.

The mechanism by which weight influences cancer varies by the type of cancer, but it has to do with the way a skewed body mass index disrupts the body's hormones, which then go on to disrupt DNA. "Obesity is also associated with inflammation, and cancer is a disease of inflammation," Rebbeck said.

It would appear, then, that diet and exercise are nearly as important as not smoking when it comes to preventing cancer.

In light of that, it seems strange that this isn't more of a public-health message. Usually the warnings around obesity center on heart disease and diabetes—two scary diseases that are nevertheless far less scary than cancer.

Rebbeck confirmed that so far, the public-health message on obesity has tended to sidestep the cancer risk.

"There's been more education in the population about the link between obesity and diabetes," Rebbeck said. "In the cancer community, we've focused on things like smoking and going out in the sun."

But that could start to change, he said, "because people fear cancer in ways that they don't fear other diseases."

In that case, we might soon see the rise of extremely graphic PSAs featuring people whose cancer could be attributed to their junk-food habits.

This article was originally published at http://www.theatlantic.com/health/archive/2014/09/diet-exercise-and-weight-cause-cancer/380213/

Author: "Olga Khazan" Tags: "Health"

Date: Tuesday, 16 Sep 2014 13:24

One informed observer of the national-security state, Marc Ambinder, is doing his best to puzzle out President Obama's approach to fighting the radical militia ISIS. He is sympathetic to the difficult decisions Obama faces, as am I, but his analysis leans heavily on the notion that counterterrorism is a gray zone beyond classification. That belief causes him to be too forgiving of the president's illegal power grabs.

"With his administration using the word 'war' and promising to 'destroy' ISIS, Obama has gotten himself into more of a pickle. I don't actually think, in his heart of hearts, Obama believes that the U.S. is going to 'war' with anyone," Ambinder writes. "Obama does not think the United States has to fight a 'war on terrorism.' There is no such thing. War, real war, is different. Qualitatively. Fundamentally. War implies an end. Combating violent extremism, which was the administration's phrase of choice, speaks to the enduring nature of the conflict."

The larger context, as Ambinder sees it (his emphasis):

Counterterrorism campaigns do not neatly fit into our black-and-white descriptions of the way conventional wars begin and end. There will never be "victory" in the sense that terrorists will stop trying to attack the United States. What there will be, instead, is managed risk. A constant effort to detect and degrade the threat. A balance of measures—political, military, legal, and otherwise—focusing on the capacity of terrorists to create havoc outside their geographical boundaries. Preventing them from obtaining or developing weapons of mass destruction.

Later he expands on why he doesn't think that we're really at war:

The number of U.S. combat troops in Somalia, Yemen, and Iraq—three places where Islamic extremists are on the march—is tiny, a fraction of the number that President Bush mobilized for Iraq and Afghanistan. The cost, too, is not war-like. If Obama keeps even 2,000 special operations forces and their enablers in Iraq until the end of his presidency, the cost is negligible.

This is a curious metric for war. Yes, the War on Terrorism is different from other conflicts in various ways. But why does the number of troops needed as compared to Iraq matter? Why does the cost, which is only "negligible" in terms of the rest of a gargantuan military budget, matter? Why must Obama be graded on a curve set by George W. Bush and Dick Cheney?

If American war planes are firing missiles at a foreign nation or militia, that is war. Everyone understands as much with respect to foreign countries. Imagine that an Iranian drone carried out a single targeted missile strike on an Israeli settlement. Would that be an act of war? Or not so much, because it's merely part of "a balance of measures—political, military, legal, and otherwise," to degrade Zionism? What if Russia stationed, in a foreign country, just a tiny fraction of the troops that Bush mobilized for the occupations of Iraq and Afghanistan?

The Framers gave Congress the power to declare war in part because they knew that war is the health of the state. They feared that the power incentives presidents have to wage war would cause them to do so in cases when they shouldn't. They trusted a body of representatives more than the instinct of one man. And they believed that a body directly responsive to the people should have a say.

Everything about that logic applies to the decision to fight ISIS.

As I see it, Obama is waging war illegally, just like he did previously (and without legal consequences) in Libya. He has failed to secure the congressional authorization needed to be consistent with his own avowed understanding of the War Powers Resolution. That legislation passed because a bygone president was able to enmesh America in a war that, for some years, didn't seem entirely like a war.

But say for the sake of argument that we aren't really at war, and that "in his heart of hearts," Obama doesn't believe himself to be at war. If that's so, the Obama administration has no right to take many of the actions it has justified by citing the president's war powers. Obama and his defenders can't have it both ways. If no war, then no increased powers as the commander-in-chief. In court, Obama's lawyers don't have any doubt about whether their boss considers himself to be at war as a matter of law. They don't much care what's in his heart.

Later in his piece, Ambinder hazards a guess as to why we're fighting ISIS. It's as good as any:

We are not fighting ISIS because ISIS is plotting an imminent attack on the U.S. We are fighting ISIS because (a) the U.S. does not want Iran to fight and defeat ISIS alone; (b) the Saudis recognize that ISIS poses an existential threat to them if not checked soon; (c) Obama believes the U.S. has a residual responsibility to try to help stabilize Iraq if Iraq asks for the help, which it now is; (d) ISIS, well-funded and well-armed, has threatened the United States directly, and there is no reason to think that they won't try to find some way to directly attack American interests down the road; (e) an ISIS unchecked could throw the entire region into complete chaos; (f) Syria seems to welcome the help, and in any case, the administration has signaled that airstrikes in Syria will be a very modest part of this campaign; and (g) the relative risk to American assets, people, and authority is low.

Again, using military force in foreign countries to undercut geopolitical rivals, protect allies, and preempt attacks sounds an awful lot like war, but whatever word one wants to use, shouldn't there be a national debate about whether our money and military ought to be used for those goals? Obama purports to believe that "it is always preferable to have the informed consent of Congress prior to any military action." He hasn't honored that belief or disavowed it. The ambiguity permits him to retain his greatest asset as a politician: slipperiness. He inspires in his supporters a desire to see in him whatever they want to see in him.

Though deeply skeptical of another military intervention, I've stayed agnostic about the best policy toward ISIS. What vexes me more than anything about Obama's approach is that in ways big and small, he keeps subverting the vital public debate that ought to influence American foreign policy. He has been instrumental in depriving Americans of congressional votes on matters of military force. Without those votes, there can be no accountability of the sort that contributed to an Iraq War supporter like Hillary Clinton losing a close 2008 primary.

Obama gets a pass from commentators who presume he is more than a politician winging it. Over time I've come to have a different theory of the man: The only "12-dimensional chess" he ever played was persuading a good portion of the country's political observers that he cared about civil liberties, limits on executive power, and fighting terrorism without compromising core values. Shame on him for misleading us. Continuing to extend to him the presumption of good faith? That is irrational.

I suspect Obama does not ask Congress or the people before doing whatever he believes is best in foreign affairs because he knows other "deciders" might thwart what he "knows" in his "heart of hearts" to be best. Call it arrogance or paternalism, or if you're an Obama apologist, call it his wisdom for seeing that while checks and balances and a legislature may be important generally speaking, America is better off setting all that aside, just so long as we have such a prudent, intelligent, Abraham Lincoln-like man in the White House (plus subordinates like CIA Director John Brennan, the most "priest-like" man ever to keep a death list).

Even if I felt Obama to be a man of uncommon prudence, character, and judgment, I'd understand the danger of the precedents he is so recklessly setting. The extreme theories of executive power that he has embraced won't end with his presidency. Unless and until the legislature acts to rein in the executive branch, future presidents will unilaterally start wars, and sooner or later the consequences will be so dire that no one will have any doubts about the wars being real. These future presidents won't care what was in Obama's heart; they will focus on what he actually did—and so should we. A president who makes war as he sees fit undermines important laws and norms. A president who uses war powers to engage in never-ending "combat against violent extremism" does just as much damage.

This article was originally published at http://www.theatlantic.com/politics/archive/2014/09/why-president-obama-should-ask-permission-to-wage-war/380256/

Author: "Conor Friedersdorf" Tags: "Politics"

Date: Tuesday, 16 Sep 2014 13:00

When actors play doctors on TV, that does not make them actual doctors. And that does not mean they should scour some Internet boards, confront their pediatricians, and demand fewer vaccinations for their children, as some Hollywood parents in Los Angeles have apparently been doing.

The Hollywood Reporter has a great investigation for which it sought the vaccination records of elementary schools all over Los Angeles County. It found that vaccination rates in elite neighborhoods like Santa Monica and Beverly Hills have tanked, and the incidence of whooping cough there has skyrocketed.

Here's a map of the schools with dangerously low vaccination rates (an interactive version is on their site). Note how the schools cluster together as little red dots all over the wealthy, crazy Westside—not unlike crimson spots on a measles patient:


Los Angeles Schools at "High Risk" of Outbreaks (The Hollywood Reporter)

Parents in these schools are submitting a form called a "personal belief exemption," which states that they are not vaccinating their kids due to "a diffuse constellation of unproven anxieties, from allergies and asthma to eczema and seizures," reporter Gary Baum writes.

In some schools, 60 to 70 percent of parents have filed these PBEs, indicating a vaccination rate as low as that of Chad or South Sudan. Unlike in Santa Monica, however, parents in South Sudan have trouble getting their children vaccinated because of an ongoing civil war.

And lo, it is these very same L.A. neighborhoods that are experiencing a resurgence of diseases like whooping cough, otherwise known as pertussis. Measles cases have also hit a high in California this year.

To be clear, not all PBEs are evidence of an anti-vaxxer parent. Schools require either a PBE or an up-to-date shot record for school attendance, and sometimes parents submit them if they simply aren't able to get the shots done on time. Still, the L.A. Times has previously reported that the percentage of kindergartens in which at least 8 percent of students were not fully vaccinated because of their parents' beliefs had more than doubled since 2007, and private-school parents were likelier to file the PBEs than their public-school counterparts. The paper found that the exemption rate for all of Santa Monica and Malibu was 15 percent.

The Hollywood Reporter interviewed several Westside doctors who seem to encourage this kind of vaccine roulette, as well as several area school administrators who seem only vaguely aware of the risk.

“I’d prefer that children be immunized," Shari Latta, director of Children’s Creative Workshop in Malibu, told THR. "We’ve been fortunate to avoid any outbreaks.”

So fortunate! If only there were something parents could do to improve those fortunes ... a way to, say, inoculate their families against the tragedy of untreatable disease.

The anti-vaxxer turn is a frustrating development for a city that's obsessed with health and fitness. There are very few things that we know prevent sickness without a doubt. The "Beaming" juice at L.A.'s Cafe Gratitude, which consists of carrot juice, camu-camu, and astragalus, and which is surely delicious and beam-inducing, is not one. Vaccines are.

It's tempting to suggest, as some of Baum's L.A. sources do, that these are just concerned, well-meaning parents who, along with forbidding processed food and dragging their offspring to baby yoga, also avoid any medications that aren't strictly "natural." (Of course, vaccines are natural—they're derived from the naturally-occurring pathogen itself.)

That kind of thinking ignores the way vaccines work, through herd immunity. A community can only be protected when 92 percent or more of a population is immunized, and many of L.A.'s elementary schools are dipping far below that number. These parents aren't just risking their own kids' health, they're risking everyone's.

Wealth enables these people to hire fringe pediatricians who will coddle their irrational beliefs. But it doesn't entitle them to threaten an entire city's children with terrifying, 19th-century diseases for no reason.

This article was originally published at http://www.theatlantic.com/health/archive/2014/09/wealthy-la-schools-vaccination-rates-are-as-low-as-south-sudans/380252/

Author: "Olga Khazan" Tags: "Health"

Date: Tuesday, 16 Sep 2014 12:10

In September of 1989, NBC aired a message from Tom Brokaw. "The more you know about an impending disaster, the more likely you are to do something about it," the news anchor told his audience. "That's why we're embarking on a campaign of public service messages. The campaign is called, appropriately, 'The More You Know.'"

In the 25 years since, the campaign Brokaw introduced back then has expanded its purview to include definitions of "disaster" that are both broadened and narrowed from the original: unhealthful eating habits, low self-esteem, environmental crises, cyber-bullying. The More You Know has won Emmys. It has counted presidents and First Ladies, as well as TV stars, as participants. It has also won what might be the best demonstration of cultural ascendance: widespread parody. (Doo-doo-doo-DOOOO…)

The campaign has also varied, enormously—not just in its messaging, but in its tone. Many of the ads have been earnest-and-serious:

Some of them have been earnest-and-jokey:

Some have been self-parodying:

And some have straddled the line between fact and fiction:

The tone of the spots has changed greatly as The More You Know has made its way from experiment to formula. One thing that hasn't changed, though, is the idea at the heart of the campaign: the notion that a monolithic "You"—a You who knows things, and who could know even more things, and who is looking to broadcast television networks to assist in that enlightenment—can exist in the first place. As Brokaw explains it, "You want your audience to care about you and know that you care about them, and the issues that are important to them." The spots, he adds, are a way to "remind everyone of our obligation as citizens."

With that in mind, here's a little more to know about The More You Know. Because the more you … well, you know.

* * *

The public-service announcement has long existed in the form of newspaper ads. But the shape The More You Know takes—the short, jaunty video, with a single message—was born in the UK, in the 1930s. The spots, featuring (and often directed by) the actor Richard Massingham, included such broadly relevant, pragmatic topics as how to prevent the spread of illness, how to swim, and—yes—how to safely cross a road. They were not high-minded. They featured pretty much the opposite of stars and rainbows. They were essentially the "For Dummies" book series, in televised pseudo-slapstick. Massingham's films struck a chord, though, and when Britain entered the war, its government commissioned him for a new set of spots, to be focused on the war effort. The short films spread to the U.S.—fitting accompaniments to the printed posters of Rosie the Riveter and Uncle Sam.

Peace returned, but the ads remained. They were useful, the networks realized, not just as a way to fill the air during unsold advertising time between scheduled programming, but also as a way of meeting the newly formed FCC's standard of public interest. (The Communications Act of 1934—still the governing charter for broadcast television—required that the networks operate in the "public interest, convenience, and necessity.")

The ads tended to focus on cultural education (like this little treatment on office etiquette, from 1953):

They also focused on public safety (like this, from 1961):

As PSAs evolved, they often doubled down on their own implicit paternalism by targeting themselves to children. In the 1980s, cartoons like He-Man and the Masters of the Universe and G.I. Joe: A Real American Hero used PSAs as their show-closing epilogues. ("Knowing," Joe jauntily informed a generation, "is half the battle.") NBC itself had a dedicated series of kid- and teen-targeted PSAs in the '80s, too. "One to Grow On" aired during the network's kid-targeted Saturday morning lineup from 1983 to 1989. It focused on community, ethics, and personal safety. It included spots like this:

And like this:

And then, in the late 1980s, a group of education-focused nonprofits came to NBC with a request: They wanted the network’s help in recruiting teachers. There was a shortage at the time, and the organizations wanted the network’s help not just in raising people’s respect for the profession, but also in attracting qualified applicants for open teaching positions across the country. NBC agreed. It recruited its star anchor, Tom Brokaw, to present The More You Know’s first message, with the idea that the ads would be part of a larger PSA campaign featuring a variety of NBC talent. In 1990, NBC ran this ad, featuring Bill Cosby making the case to go into (or at least respect the profession of) teaching:

It has now become something of a rite of passage for TV personalities to do one of the spots. The recent crop of The More You Know message-givers includes Amy Poehler, Coolio, Joan Rivers, Jack McBrayer, Tiki Barber, Shakira, Usher, Steve Harvey, Anjelica Huston, Ken Jeong, Questlove, and Jimmy Fallon. (And also Al Roker, who explains that "it's exciting to be part of something that does good, that gets families talking about issues.")

Often, celebrities are personally connected to the causes they discuss in the spots. Seth Meyers's mother is a teacher; he did an ad about education. Ken Jeong, who was a physician before he was an actor, did a spot about health. As Beth Colleton, NBCUniversal's senior vice president for corporate social responsibility, told me: "The spots are authentic in the sense that the talent speak to things they really care about. And I think we've seen that over the years."  

NBC is also expanding the campaign to make it more accessible to audiences who are, increasingly, not parked in front of televisions. It now runs its video PSAs as banner ads on its websites—the better, Colleton notes, to reach audiences across different platforms. Last year, the network released an ebook, aimed at parents and teachers, discussing how to talk to kids about navigating the Internet. NBC has also been working with the channel Sprout to develop spots targeted to pre-schoolers and their families. And, earlier this year, the Today Show ran its own PSA campaign—one that dealt with body image issues. It made its program more interactive than most, asking people to tag images of themselves with the punny hashtag #loveyourselfie.

The campaign may evolve, but its star—earnestness—stays the same. Doo-doo-doo-DOOOO...

This article was originally published at http://www.theatlantic.com/entertainment/archive/2014/09/theres-more-to-know-about-the-more-you-know/380242/

Author: "Megan Garber" Tags: "Entertainment"

Date: Tuesday, 16 Sep 2014 11:30
Dave Bevard, former union president of the Machinists in Galesburg, Illinois, with the last refrigerator ever produced at the Maytag plant there. (Chad Broughton)

Ten years ago on this day in September, the last Maytag refrigerator moved down the assembly line in Galesburg, Illinois, a quiet little city of 32,000 on the western edge of the Rust Belt. Workers signed the white appliance with a black Sharpie as it passed, said their goodbyes, and left to start new lives.

In that same spot, a century earlier, a few dozen men hammered out steel plowing discs in a little brick workshop for nearby prairie farmers. A sprawling patchwork of buildings swallowed the old workshop in the postwar years and, by the early 1970s, the factory buzzed with the work activity of nearly 5,000 people. Called “Appliance City” by some, it supplied millions of appliances each year to America’s kitchens. Today, a decade after the shuttering, the autographed refrigerator sits in the Galesburg Antiques Mall. The former Appliance City site—the size of over 40 football fields packed together—is now mostly rubble and weeds.

Inside the Maytag factory in 1994 (courtesy The Register-Mail)

Rusting line inside shuttered factory in 2008, before the building was razed (David Samuel Stern)

The Galesburg shuttering, highlighted in Barack Obama’s iconic 2004 Democratic National Convention keynote address, was part of a historic hollowing of the industrial base of the United States. Manufacturing job loss has been a fact of American life since the 1970s, but in the 2000s manufacturing stepped off a cliff, shedding 5.8 million jobs, or about one of every three—most of them before the Great Recession began at the end of 2007. Illinois alone lost 320,900 manufacturing jobs, or 36.6 percent of its total, in the 2000s. Good jobs for those without a college diploma disappeared in the 2000s and generally did not come back. In December of 2000, the ratio of unemployed job seekers to job openings had been 1.1 to 1. At the end of the decade, it spiked to 6.1 to 1. The 2000s was the first recorded decade of zero job growth.

Commentators debate offshoring’s role in the jobs crisis, but there’s no doubt that it became downright fashionable in the early Bush years—alongside an increasingly short-term focus of the investor class. Maytag, for example, shifted refrigerator production to Reynosa, Mexico, in 2004. According to manufacturing expert John Shook, “There was a herd mentality to the offshoring” in the early 2000s. In the 1990s, American multinationals added 4.4 million jobs in the U.S. and 2.7 million jobs overseas. But in the 2000s those same multinationals cut 2.9 million American jobs and increased overseas employment by an additional 2.4 million.


U.S. Manufacturing Jobs in Millions, 1939-2014 (Chart by author, using data from the Bureau of Labor Statistics)

Western Illinois had a front-row view of the results of that industrial hollowing: an eroded lower-middle class, a surge in downward mobility, and growing income inequality. In Galesburg, the relative equality of factory life at Maytag—where workers earned, on average, $15.14 an hour and had good benefits—gave way to wildly unequal outcomes. A lucky few got steady jobs with the expanding BNSF Railway (Galesburg is a railroad town) or with John Deere in Moline, Illinois, 50 miles north. They earned more than they had at Maytag, maybe $16 to $22 an hour, though it always came at some cost. One former Maytag worker, Aaron Kemp, calculated that he had logged a few hundred thousand miles in six years at BNSF traveling weekly by car between his family in Illinois and his work on the rails in Texas, New Mexico, Louisiana, and California.

Most of the laid-off earned less—often significantly less—than their Maytag wage, even after two or four years of schooling and retraining. One couple, Jackie and Shannon Cummins, together earned almost $60,000 (or about $76,000 in today’s dollars) at Maytag as assemblers; now, a decade later, they scrape by on a combined income of $37,000, even after both retrained and found nearly full-time jobs. Jackie works at a local hospital and Shannon works as a special-education aide at a small rural high school, and sorts and folds clothes at Goodwill over some summer breaks. Like many displaced refrigerator-makers, Jackie and Shannon entered the expansive lowest rungs of the healthcare and education sectors, both of which are heavily reliant upon public funding. The lower wages haven’t been the worst of it. For many, especially those with 15 or 20 years at the factory, the loss of good health insurance and a solid pension stings the most. Jackie and Shannon have cycled between private coverage, Medicaid, and no coverage at all in the past few years.

The Cummins family on the porch of their home in Abingdon, Illinois (Chad Broughton)

As ex-Maytag workers struggle, so does the little city. After Maytag left and more stores in Carl Sandburg Mall shut down, the tax burden shifted to residents, who’ve seen property taxes rise as the housing stock declines. When the Great Recession hit, the state’s fiscal problems pinched education, health, and social-services budgets, just when the need for those resources grew. Since the closing, the percentage of people on Medicaid in Knox County has nearly doubled as employer-based coverage at Maytag and other private employers has waned. Likewise, the percentage of children classified as “low-income” in cash-strapped District 205 in Galesburg increased from 43 percent in 2002 to 68 percent in 2013. This fall, the school year began with a tense 15-day strike as the school board and teachers in Galesburg faced off in part over ever-scarcer local resources.

Galesburg is in slow decline, but this is hardly the devastated ghost town that some predicted a decade ago. “This town is going to die,” R. J. West, a Maytag worker at the time, told The Register-Mail when the closing announcement was made. Defying the catastrophic forecasts, fewer than one thousand people trickled out of Galesburg in the decade since the shuttering. The railroad, buoyed by its key logistical role in international trade, which includes shipping manufactured products from China to American consumers, has helped, some, to fill the enormous void Maytag left. Though payday loan shops, fast-food joints, and a new Walmart Supercenter line its main commercial strips, there is a hint of gentrification downtown near Knox College, where a new craft brewpub and upscale coffee shop thrive. Consumption patterns in Galesburg, in a modest way, reflect a growing divide among high- and low-end consumers nationwide.

Galesburg endures because its residents have stuck it out. They stay for their friends and families, and because of the low cost of living, especially housing. As local employment counselor David Lindstrom said, “You can sell your house here, and you can buy a brick in Chicago.” The Maytag casualties—nearly half of whom were women—were mostly mid-life with families or nearing retirement, with little desire to start over or to deal with the bustle of the big city, where their wage prospects aren’t much rosier. They want to stay, and, for most, it makes sense to stay. Labor stays put if it can, as Adam Smith observed in the early years of the industrial revolution. “After all that has been said of the levity and inconstancy of human nature,” Smith wrote in The Wealth of Nations, “it appears evidently from experience that a man is of all sorts of luggage the most difficult to be transported.” Jackie Cummins said it this way: “We love apple pie, baseball games—we’re just kinda cheesy Midwesterners.”

The old Appliance City site today (Chad Broughton)

Galesburg may never make appliances again, and if it did it would not need the crowded work-city of the early 1970s to make them. There are forward-looking possibilities, however. Newton, Iowa—Maytag’s headquarters for over a century—now manufactures wind towers in a former Maytag laundry plant. Walla Walla, Washington—another town hard hit by free-trade agreements—now manufactures wine (yes, wine-making is manufacturing). To enter the green economy, Newton needed a federal tax credit for wind energy and an innovative state-local plan to get going. In Walla Walla, the local community college developed a specialty in viticulture and enology.

U.S. manufacturing faces not only foreign competition and incoherent federal support, but widespread indifference and even prejudice. For many it’s part of the old economy that is losing out in the inevitable advance toward the post-industrial age. That view is wrong. A 2012 report to the president shows how manufacturing is central to a thriving, modern economy—especially advanced manufacturing in areas like green energy, nanomanufacturing, and three-dimensional printing.

There are still more than 12 million manufacturing jobs in the U.S. and output is as high as ever, and just behind China’s. In an overlooked story, the United States added manufacturing jobs for 12 months in a row in the past year. The gains are modest, but such a winning streak has only happened four times in the last 30 years. Some business elites have shifted their thinking. General Electric’s CEO Jeffrey Immelt wrote in 2012, “Outsourcing that is based only on labor costs is yesterday’s model.”

When Obama came to Galesburg last year he noted the uptick in manufacturing in a major economic speech. “For the first time since the 1990s, the number of American manufacturing jobs has actually gone up instead of down,” the President said. “But we can do more… I know there’s an old site right here in Galesburg, over on Monmouth Boulevard—let’s put some folks to work!”

The Great Recession garners attention; the wrecking of manufacturing that predated it does not, but should. When manufacturing leaves, it takes more than jobs; it takes with it ingenuity, creativity, and an economy’s inventive edge. Manufacturing offers a nation the best mix of variously skilled jobs, strong exports to lower the trade deficit, and security in its food, energy, health, and cyber-defense systems. The American economy cannot thrive on the service sector alone. As Dave Bevard, a former union president of the Machinists in Galesburg, said, “I don’t know of an economy that can survive on the principle of ‘You mow my lawn, and I’ll wash your dishes.’ We’re great because we make things.”

 

This article was originally published at http://www.theatlantic.com/business/archive/2014/09/the-last-refrigerator/380154/

Author: "Chad Broughton" Tags: "Business"

Date: Tuesday, 16 Sep 2014 11:01

In honor of the 30th anniversary of the Coen brothers' debut, Blood Simple, I’m re-watching their 16 feature films and attempting to jot down observations on one per day, in order of their release. For a fuller explanation of what I’m doing and why, see my first entry, on Blood Simple. (Here, too, are my entries on Raising Arizona, Miller’s Crossing, Barton Fink, The Hudsucker Proxy, and Fargo. The landing page for the whole series is here.)

Notes on The Big Lebowski (1998)

• When the Coens offered their early homages to James M. Cain (Blood Simple) and Dashiell Hammett (Miller’s Crossing), they played them straight—at least by Coen standards. But when it came to the third cardinal of hardboiled fiction, Raymond Chandler, they decided to go another way. The Big Lebowski, very vaguely inspired by The Big Sleep, was the Coens’ loosest, loopiest film to date, a stoner crime comedy about bowling, Vietnam, and the critical importance of having that one interior-design element that ties the whole room together.

• Ironically, following in the wake of the utterly fabricated “true story” of Fargo, the Coens offered a wild fantasia of a movie that actually had its basis in real people. Their friend Pete Exline, a film producer (and Vietnam veteran), got them started with the ironic line about his scruffy rug “tying the room together” and a story about his having caught a teenage carjacker who’d made the mistake of leaving his homework in the vehicle. The characters of Jeff “the Dude” Lebowski (Jeff Bridges) and Walter Sobchak (John Goodman) were subsequently filled out with elements of two other Hollywood figures the Coens had gotten to know: Jeff Dowd, a scruffy producer and activist who was a former real-life member of the Seattle Seven; and John Milius, the storied screenwriter who is also a noted gun nut and military enthusiast.  

• Raymond Chandler once explained the difference between the classic murder mystery and the hardboiled genre he’d helped invent: In the former, the plot—the careful alignment of details that enabled the mystery to be solved—was paramount, while in the latter, “the scene outranked the plot, in the sense that a good plot was one that made good scenes.” I don’t know whether the Coens ever read this analysis, but they surely embraced its spirit in The Big Lebowski. As Joel explained, echoing Chandler, “The plot is kind of secondary to other things.” The Dude, having had his precious rug peed on by thugs who mistook him for a different, wealthy—i.e., “big”—Lebowski (David Huddleston), decides to visit the latter for compensation, and soon finds himself in the midst of a convoluted kidnapping plot that may or may not be genuine. Indeed, by the end of the film, it’s revealed that almost nothing that’s taken place has been genuine: The kidnapping was phony, the ransom bag that the Dude and Walter dropped was phony, and even the real ransom bag that they’d planned (but failed) to keep was phony.

• Nor is it only the plot that’s loose. In contrast to the Coens’ previous films, the movie is a clamor of looks and moods, from the dingy muddle of the Dude’s bungalow to the glimmering neon of the bowling alley to the Busby-Berkeley-meets-Salvador-Dali absurdity of the dream sequences. As cinematographer Roger Deakins said of the movie, “I don’t think it has one style.” Even the era is a little fuzzy around the edges. Though the movie is set in 1991, both the Dude and Walter are obsessed with their experiences in the late ’60s and early ’70s (political activism and Vietnam, respectively). Meanwhile, the bowling alley—and bowling generally—consciously conjures the 1950s. The Big Lebowski is also the first Coens movie in which the soundtrack figures more prominently than the score, and that soundtrack (the Coens’ first collaboration with T Bone Burnett) is an eclectic mix that spans decades, though again with an emphasis on the late ’60s and early ’70s.

• The cast, too, feels looser, as if the directors, at least to some degree, released the tight grip for which they are famous. Though John Turturro’s role as a bowling adversary is small, he was allowed to supply his own embellishments (the dance, the ball polishing). And Bridges and Goodman seem let off the leash altogether, the former delivering one of the most iconic stoner performances in history, and the latter … What should we say of Goodman? Written specifically for him, the role of Walter Sobchak is almost certainly the highlight of his career, an unforgettable opportunity to express both his outsized persona and his actorly control. As different as the characters are, when watching Walter’s ongoing battle between fury and self-restraint I was reminded of the brilliantly swerving moods of Turturro’s Bernie Bernbaum in Miller’s Crossing.

• Once again, Coen-world in-jokes abound. Jon Polito appears briefly as a P.I. trailing the protagonist in a VW Bug (Blood Simple), and commends the latter for “playing one side against the other, in bed with everyone” (Miller’s Crossing). Walter’s constant demand to their other bowling partner (Steve Buscemi) that he “Shut the fuck up, Donny,” is only secondarily intended for its perceived recipient; primarily, it’s a reference back to Buscemi’s logorrheic character in Fargo. The ransom note sent to the big Lebowski, demanding $1 million (Fargo) for the return of his trophy wife, Bunny (Tara Reid), is on stationery from the Hotel Earle (Barton Fink). Moreover, Bunny is really a girl named Fawn Knutson from Moorhead, Minnesota—a sister city lying directly across the Red River from Fargo, North Dakota. If that weren’t enough, Peter Stormare, playing one of the nihilists, finally gets the pancakes he’d been pining for.

• But the movie’s most convoluted inside joke doesn’t relate to the Coen brothers’ oeuvre at all. When the Dude first meets Bunny, she asks him to blow on her nail-polished toes, a near-certain reference to the legendary “you know how to whistle, don’t you” double-entendre that a 19-year-old Lauren Bacall had delivered to Humphrey Bogart in To Have and Have Not. The next film in which the two (now a couple) starred together was Howard Hawks’s adaptation of, yes, The Big Sleep. I refuse to believe that this is a coincidence. (It’s true that Bacall’s character in that film tracks more closely with Maude, the Julianne Moore character in The Big Lebowski, than with Bunny, but still … ) It’s worth noting here that Hawks’s straightforward Chandler adaptation served far less as a model for Lebowski than did Robert Altman’s offbeat 1973 variation on The Long Goodbye. The latter, in which Elliott Gould played a semi-comical, half-hearted Philip Marlowe, lies almost exactly at the midpoint between Chandler and the Coens’ satirical reinvention. (It also offered the second, uncredited, onscreen appearance of Arnold Schwarzenegger, who played a nameless hood.)

• One “joke” that was not deliberate is the date of the 69-cent check that the Dude writes out to pay for half and half (a crucial ingredient of his signature White Russians) at the beginning of the movie: September 11, 1991, exactly 10 years before the attacks on the World Trade Center and Pentagon. Rendering the coincidence creepier still is the movie’s Gulf War backdrop and the fact that immediately after writing the check, the Dude glances up at a television in the grocery store on which George H. W. Bush is promising that Saddam Hussein’s aggression against Kuwait “will not stand.” Conspiracy theorists, start your engines.

• So here’s where I confess that, as much as I like The Big Lebowski, it doesn’t quite make it into my very top tier of Coen brothers movies. [Ed. note: Wait! Don’t stop reading!] This is primarily because that top tier is really, really hard to get into, given how much I love a handful of Coens movies. But it’s also the case that, hilarious though it is, the movie is a little loose for my taste. Yes, this is true to Chandler, but I always thought he let himself off a bit easy as a writer with his whole narrative dichotomy: There’s no reason one can’t aim to write both excellent plots and excellent scenes. Where The Big Lebowski is concerned, almost all the pieces—Jackie Treehorn, Brandt, the nihilists(!)—fit nicely for me in their discursive, Chandlerian way. But there are two exceptions: Julianne Moore’s Maude Lebowski is so arch and mannered that she seems to have wandered in from another film altogether (maybe a David Lynch?), particularly in the scene that concludes with her and a friend played by David Thewlis tittering maniacally. And as much as I love Sam Elliott, his cowboy narrator, The Stranger, doesn’t make a lick of sense, and throws me completely out of the movie. Sure, he’s not really supposed to make sense—as Ethan Coen once said, “Sam would actually ask us, ‘What am I doing in this movie?’ We didn’t know either.” But when it comes to my very favorite Coens movies, it’s a game of inches, and these are enough to keep Lebowski from quite crossing the goal line.

• That said, as a courtesy to those hardcore Lebowski lovers who have not yet canceled your Atlantic subscriptions and/or dismantled your computers, I offer—as I did for Barton Fink—an opportunity for rebuttal by proxy. In this case, I promise that your appointed surrogate loves The Big Lebowski at least as much as, and almost certainly more than, you do. Indeed, if you have ever suffered a moment of doubt in the midst of your totalizing adoration, he’d probably throw you right on the pyre with confirmed heretics such as myself. He’s also responsible for what may be the best pop-cultural metaphor for American politics that I can remember reading in a long while. With that, I pass you on to the nuanced analysis and gentle suasion of my dear friend Jon Chait:

Shut the fuck up, Chris.

Well, I think that pretty much summarizes my response. But, in case more elaboration is needed, The Big Lebowski is not only a truly great movie but a uniquely great one. Its uniqueness can be seen in the deep continuing devotion of its fans—the festivals, books, plays, and other forms of tribute that live on. The best single way to explain its unique appeal is that The Big Lebowski is the only film I know of that is more enjoyable upon second or third, or even fifth or sixth, viewing than the first.

Now, this stems from (here I depart from almost all the Lebowski community) a flaw in the film that you cited: its confusing plot. Fascinating, hysterical, but nonetheless extraneous characters jump in and out of the film. Think of Turturro’s “the Jesus,” or Elliott’s narrator, or Maude’s strange art. Because they are so compelling, they divert the audience from the central characters and the plot involving them, which is perfect. This, I surmise, explains why so many viewers have trouble appreciating it fully on first viewing: You need to have a firm delineation between the plot and the sideshow in order to follow it, and to zero in on the central characters, the Dude and Walter.

Our otherwise excellent reviewer, unfortunately, developed a firm opinion of the movie many years ago after merely one viewing, and has clung stubbornly to it ever since. So you have no frame of reference, Chris. You're like a child who wanders in in the middle of a movie …

Beneath the confusing appendages is a relatively tight story about a crime, which lends itself to completely unexpected comic revelations. A lot of the comedy is narrowly targeted. Walter’s bitter recriminations against Germans—first the bowling league manager (“I told that kraut a fucking thousand times I don't roll on shabbas”) and then the nihilists (“Fucking Germans. Nothing changes”)—will strike anybody with older Jewish relatives as both hysterically familiar and hysterically anomalous, coming from a burly working-class Polish-American. Political thought runs throughout the story, with the right-wing and left-wing Lebowskis, the neoconservative Walter, and the nihilists applying their respective political philosophies to their role in the plot’s caper.

The movie is in large degree a political comedy that, without taking itself remotely seriously, goes far deeper than highbrow fare. That has always made Chris’s failure to fully appreciate its genius all the more disturbing to me.

This will not stand.

And there you have it. I, Dude-like, aspire only to find a way that we can all abide in peace. Jon, by contrast, embraces the combative moral certainty of Walter and the neocons, past and present. Maybe—maybe—my toe slipped over the line a little. Big deal. It’s just a game, man.

Et Cetera:

Where I rank The Big Lebowski among Coens films: #6 (out of 16)

Where I rank its soundtrack, curated by T Bone Burnett, among Coens soundtracks curated by T Bone Burnett: #2 (out of 4)

Best line: “Say what you will about the tenets of National Socialism, Dude, at least it’s an ethos.”

Best visual: The bowling-ball’s-eye views of the world

Best sound: Glass splinters, metal groans, and Walter utters the immortal line [as it was rendered on network TV], “This is what happens when you find a stranger in the Alps!”

Notable locale: Los Angeles

Notable Influences: Raymond Chandler, Robert Altman

Drinking-game-suitable catchphrase: “It really tied the room together.”

Bonus catchphrase for hard drinkers: “Shut the fuck up, Donny.”

Things that roll: Tumbleweeds, bowling balls

Dream sequence(s): Yes, emphatically

Important scene(s) set in a bathroom: Yes

Number of characters who vomit: Probably zero (It’s a little hard to tell: there’s a moment when it looks as though the Dude will, but ultimately he doesn’t seem to)

John Goodman going berserk: Constantly, magnificently

Next up: O Brother, Where Art Thou?

This article was originally published at http://www.theatlantic.com/entertainment/archive/2014/09/30-years-of-coens-the-big-lebowski/380220/

Author: "Christopher Orr" Tags: "Entertainment"

Date: Tuesday, 16 Sep 2014 11:00

It’s been seven years since the first launch of the iPhone. Before that, smartphones were a curiosity, mostly an affectation of would-be executives—Blackberry and Treo and so forth. Not even a decade ago, they were wild and feral. Today, smartphones are fully domesticated. Tigers made kittens, which we now pet ceaselessly. Over two-thirds of Americans own them, and they have become the primary form of computing.

But along with that domestication comes the inescapability of docility. Have you not accepted your smartphone’s reign over you, rather than lamenting it? Stroking our glass screens is just what we do now, even if it also feels sinful. The hope and promise of new computer technology has given way to the malaise of living with it.

Shifts in technology are also shifts in culture and custom. And these shifts have become more frequent and more rapid over time. Before 2007, one of the most substantial technological shifts in daily life was probably the World Wide Web, which was already commercialized by the mid-1990s and mainstream by 2000. Before that? The personal computer, perhaps, which took from about 1977 until 1993 or so to become a staple of both home and business life. First we computerized work, then we computerized home and social life, then we condensed and transferred that life to our pockets. With the newly announced Apple Watch, now the company wants to condense it even further and have you wear it on your wrist.

Change is exciting, but it can also be exhausting. And for the first time in a long time, reactions to the Apple Watch reveal seem to underscore exhaustion as much as excitement. But even these skeptical replies question the watch's implementation, rather than express lethargy at the prospect of living in the world it might bestow on us.  

Some have accused Apple of failing to explain the purpose of its new wearable. The wristwatch connoisseur Benjamin Clymer calls it a “market leader in a category nobody asked for.” Apple veteran Ben Thompson faults Cook for failing to explain “why the Apple Watch existed, or what need it is supposed to fill.” Felix Salmon agrees, observing that Apple “has always been the company which makes products for real people, rather than gadgets for geeks,” before lamenting that the Apple Watch falls into the latter category.

“Apple hasn’t solved the basic smartwatch dilemma,” Salmon writes. But the dilemma he’s worried about proves to be a banal detail: “Smart watches use up far more energy than dumb watches.” He later admits that Apple might solve the battery and heft problems in a couple generations, but “I’m not holding my breath.” Salmon reacts to the Apple Watch’s design and engineering failings, rather than lamenting the more mundane afflictions of being subjected to wrist-sized emails in addition to desktop and pocket-sized ones. We’re rearranging icons on the Titanic.

After the Apple keynote, The Onion joked about the real product Apple had unveiled—a “brief, fleeting moment of excitement.” But like so much satire these days, it’s not really a joke. As Dan Frommer recently suggested, the Apple keynote is no less a product than are its phones and tablets. Apple is in the business of introducing big things as much as it is in the business of designing, manufacturing, distributing, and supporting them. In part, they have to be: Apple’s massive valuation, revenues, and past successes have only increased the street’s expectations for the company. In a world of so-called disruptive innovation, a company like Apple is expected to manufacture market-defining hit after hit.

Indeed, business is another context we often use to avoid engaging with our technological weariness. We talk about how Apple’s CEO Tim Cook must steer the tech giant into new waters—such as wearables—to insure a fresh supply of desire, customers, and revenue. But the exigency of big business has an impact on our ordinary lives. It’s easy to cite the negative effects of a business environment focused on quarterly profits above all else, including maintaining job stability and paying into the federal or municipal tax base. In the case of Apple, something else is going on, too. In addition to an economic burden, the urgency of technological innovation has become so habitual that we have become resigned to it. Wearables might not be perfect yet, we conclude, but they will happen. They already have.

I’m less interested in accepting wearables given the right technological conditions than I am prospectively exhausted at the idea of dealing with that future’s existence. Just think about it. All those people staring at their watches in the parking structure, in the elevator. Tapping and stroking them, nearly spilling their coffee as they swivel their hands to spin the watch’s tiny crown control.

A whole new tech cliché convention: the zoned-out smartwatch early adopter staring into his outstretched arm, like an inert judoka at the ready. The inevitable thinkpieces turned non-fiction trade books about “wrist shrift” or some similarly punsome quip on the promise-and-danger of wearables.

The variegated buzzes of so many variable “haptic engine” vibrations, sending notices of emails arriving from a boss or a spammer or obscene images received from a Facebook friend. The terrible battery life Salmon worries about, and the necessity of purchasing a new $400 wristwatch every couple years, along with an equally expensive smartphone with which to mate it.

The emergence of a new, laborious media creation and consumption ecosystem built for glancing. The rise of the “glancicle,” which will replace the listicle. The PR emails and the B2B adverts and the business consulting conference promotions all asking, is your brand glance-aware?

These are mundane future grievances, but they are also likely ones. Unlike its competitor Google, with its eyeglass wearables and delivery drones and autonomous cars, Apple’s products are reasonable and expected—prosaic even, despite their refined design. Google’s future is truly science fictional, whereas Apple’s is mostly foreseeable. You can imagine wearing Apple Watch, in no small part because you remember thinking that you could imagine carrying Apple’s iPhone—and then you did, and now you always do.

Technology moves fast, but its speed now slows us down. A torpor has descended, the weariness of having lived this change before—or one similar enough, anyway—and all too recently. The future isn’t even here yet, and it’s already exhausted us in advance.

It’s a far cry from “future shock,” Alvin Toffler’s 1970 term for the post-industrial sensation that too much change happens in too short a time. Where once the loss of familiar institutions and practices produced a shock, now it produces something more tepid and routine. The planned obsolescence that coaxes us to replace our iPhone 5 with an iPhone 6 is no longer disquieting, but just expected. I have to have one has become Of course I’ll get one. The idea that we might willingly reinvent social practice around wristwatch computers less than a decade after reforming it for smartphones is no longer surprising, but predictable. We’ve heard this story before; we know how it ends.

Future shock is over. Apple Watch reveals that we suffer a new affliction: future ennui. The excitement of a novel technology (or anything, really) has been replaced—or at least dampened—by the anguish of knowing its future burden. This listlessness might yet prove even worse than blind boosterism or cynical naysaying. Where the trauma of future shock could at least light a fire under its sufferers, future ennui exudes the viscous languor of indifferent acceptance. It doesn’t really matter that the Apple Watch doesn’t seem necessary, any more than it once mattered that the iPhone didn’t. Increasingly, change is not revolutionary, to use a word Apple has made banal, but presaged.

Our lassitude will probably be great for companies like Apple, which have worn us down with the constancy of their pestering. The poet Charles Baudelaire called ennui the worst sin, the one that could “swallow the world in a yawn.” As Apple Watch leads the suppuration of a new era of wearables, who has energy left to object? Who has the leisure for revolution, as we keep up with our social media timelines and emails and home thermostats and heart monitors?

When one is enervated by future ennui, there’s no vigor left even to ask whether this future is one we want. And even if we ask, lethargy will likely curtail our answers. No matter, though: Soon enough only a wrist’s glance worth of ideas will matter anyway.

This article was originally published at http://www.theatlantic.com/technology/archive/2014/09/future-ennui/380099/








Author: "Ian Bogost" Tags: "Technology"
Date: Tuesday, 16 Sep 2014 10:00

Before IBM, before punch-card computers, before Charles Babbage's Analytical Engine, one of the very first machines that could run something like what we now call a "program" was used to make fabric. This machine—a loom—could process so much information that the fabric it produced could display pictures detailed enough that they might be mistaken for engravings.

Like, for instance, the image above: a woven piece of fabric that depicts Joseph-Marie Jacquard, the inventor of the weaving technology that made its creation possible. As James Essinger recounts in Jacquard's Web, in the early 1840s Charles Babbage kept a copy at home and would ask guests to guess how it was made. They were usually wrong.  

Weaving a pattern that detailed was, before the early 1800s, if not outright impossible, impossibly tedious. At its simplest, weaving means taking a series of parallel strings (the warp), lifting a selection of them up, and running another string (the weft) between the two layers, creating a crosshatch. Making more complicated patterns means choosing more carefully which of the warp strings lie on top of the weft each time it passes through. In the years before Jacquard invented his loom head, the best way to do this was by hand.

The Jacquard loom, though, could process information about which of those strings should be lifted up and in what order. That information was stored in punch cards—often 2,000 or more strung together. The holes in the punch cards would let through only a selection of the rods that lifted the warp strings. In other words, the machine could replace the role of a person manually selecting which strings would appear on top. Once the punch cards were created, Jacquard looms could quickly make pictures with subtle curves and details that earlier would have taken months to complete. And the loom could replicate those designs, over and over again.
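To make the punch-card idea concrete, here is a minimal illustrative sketch in Python of the logic described above; it is not from the original article, and the card values are hypothetical. Each card is treated as a row of holes: a 1 at position i lifts warp thread i so it shows above the weft, a 0 leaves the weft on top, and the chain of cards determines the picture row by row.

    # Illustrative sketch only: a toy model of Jacquard-style punch cards.
    # Each "card" is a row of holes; 1 = hole (warp thread lifted, visible on top),
    # 0 = no hole (the weft passes over that thread). Values are made up.
    cards = [
        [1, 0, 1, 0, 1, 0, 1, 0],
        [0, 1, 0, 1, 0, 1, 0, 1],
        [1, 1, 0, 0, 1, 1, 0, 0],
        [0, 0, 1, 1, 0, 0, 1, 1],
    ]

    def weave(cards):
        # One pass of the weft per card: '#' where the warp shows, '.' where the weft does.
        return ["".join("#" if hole else "." for hole in card) for card in cards]

    for line in weave(cards):
        print(line)
    # The sequence of cards fully determines the pattern, and the same chain of
    # cards reproduces the same picture every time: the "program" is the cards.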

This was exactly the sort of system that Babbage envisioned for his Analytical Engine—except instead of printing patterns, his machine would have performed mathematical operations. As Ada Lovelace wrote him: "We may say most aptly that the Analytical Engine weaves algebraical patterns just as the Jacquard-loom weaves flowers and leaves."

This article was originally published at http://www.theatlantic.com/technology/archive/2014/09/before-computers-people-programmed-looms/380163/








Author: "Sarah Laskow" Tags: "Technology"
Date: Tuesday, 16 Sep 2014 09:00

Israel has a substantial arsenal of nuclear weapons.

Former CIA Director Robert Gates said so during his 2006 Senate confirmation hearings for secretary of defense, when he noted—while serving as a university president—that Iran is surrounded by “powers with nuclear weapons,” including “the Israelis to the west.” Former President Jimmy Carter said so in 2008 and again this year, in interviews and speeches in which he pegged the number of Israel’s nuclear warheads at 150 to around 300.

But due to a quirk of federal secrecy rules, such remarks generally cannot be made even now by those who work for the U.S. government and hold active security clearances. In fact, U.S. officials, even those on Capitol Hill, are routinely admonished not to mention the existence of an Israeli nuclear arsenal and occasionally punished when they do so.

The policy of never publicly confirming what a scholar once called one of the world’s “worst-kept secrets” dates from a political deal between the United States and Israel in the late 1960s. Its consequence has been to help Israel maintain a distinctive military posture in the Middle East while avoiding the scrutiny—and occasional disapprobation—applied to the world’s eight acknowledged nuclear powers.

But the U.S. policy of shielding the Israeli program has recently provoked new controversy, partly because of allegations that it played a role in the censure of a well-known national-laboratory arms researcher in July, after he published an article in which he acknowledged that Israel has nuclear arms. Some scholars and experts are also complaining that the government’s lack of candor is complicating its high-profile campaign to block the development of nuclear arms in Iran, as well as U.S.-led planning for a potential treaty prohibiting nuclear arms anywhere in the region.

The U.S. silence is largely unwavering, however. “We would never say flatly that Israel has nuclear weapons,” explained a former senior State Department official who dealt with nuclear issues during the Bush administration. “We would have to couch it in other language, we would have to say ‘we assume’ or ‘we presume that Israel has nuclear weapons,’ or ‘it’s reported’ that they have them,” the former official said, requesting that his name not be used due to the political sensitivity surrounding the topic.

President Barack Obama made clear that this four-decade-old U.S. policy would persist at his first White House press conference in 2009, when journalist Helen Thomas asked if he knew of any nations in the Middle East with nuclear arms. “With respect to nuclear weapons, you know, I don’t want to speculate,” Obama said, as though Israel’s established status as a nuclear-weapons state was only a matter of rumor and conjecture.

So wary is Paul Pillar, a former U.S. national-intelligence officer for the Middle East, of making any direct, public reference to Israel’s nuclear arsenal that when he wrote an article this month in The National Interest, entitled “Israel’s Widely Suspected Unmentionables,” he referred to warheads as “kumquats” throughout his manuscript.

Even Congress has been coy on the subject. When the Senate Foreign Relations Committee published a 2008 report titled “Chain Reaction: Avoiding a Nuclear Arms Race in the Middle East,” it included chapters on Saudi Arabia, Egypt, and Turkey—but not Israel. The 61-page report relegated Israel’s nuclear arms to a footnote that suggested that Israel’s arsenal was a “perception.”

“This report does not take a position on the existence of Israeli nuclear weapons,” the report said. “Although Israel has not officially acknowledged it possesses nuclear weapons, a widespread consensus exists in the region and among experts in the United States that Israel possesses a number of nuclear weapons. For Israel’s neighbors, this perception is more important than reality.”

While former White House or cabinet-level officers—such as Gates—have gotten away with more candor, the bureaucracy does not take honesty by junior officials lightly. James Doyle, a veteran nuclear analyst at Los Alamos National Laboratory who was recently censured, evidently left himself open to punishment by straying minutely from U.S. policy in a February 2013 article published by the British journal Survival.

“Nuclear weapons did not deter Egypt and Syria from attacking Israel in 1973, Argentina from attacking British territory in the 1982 Falklands War or Iraq from attacking Israel during the 1991 Gulf War,” Doyle said in a bitingly critical appraisal of Western nuclear policy, which angered his superiors at the nuclear-weapons lab as well as a Republican staff member of the House Armed Services Committee.

Even though three secrecy specialists at the lab concluded the article contained no secrets, more senior officials overruled them and cited an unspecified breach as justification for censuring Doyle and declaring the article classified, after its publication. They docked his pay, searched his home computer, and, eventually, fired him this summer. The lab has said his firing—as opposed to the censure and search—was not related to the article’s content, but Doyle and his lawyer have said they are convinced it was pure punishment for his skepticism about the tenets of nuclear deterrence.

Neither Doyle nor his colleagues revealed if the sentence in his article about Israel’s arsenal was the one that provoked officials to nitpick about a security violation, but several independent experts have surmised it was.

Steven Aftergood, director of the Project on Government Secrecy at the Federation of American Scientists, said the clues lie in the Energy Department’s citation—in a document summarizing the facts behind Doyle’s unsuccessful appeal of his ill treatment—of a classification bulletin numbered “WPN-136.”

The full, correct title of that bulletin, according to an Energy Department circular, is “WNP-136, Foreign Nuclear Capabilities.” The classification bulletin itself is not public. But Aftergood said Doyle’s only reference to a sensitive foreign nuclear program was his mention of Israel’s, making it highly probable this was the cudgel the lab used against him. “I’m certain that that’s what it is,” Aftergood said in an interview.

The circumstances surrounding Doyle’s censure are among several cases now being examined by Department of Energy (DOE) Inspector General Gregory Friedman, as part of a broader examination of inconsistent classification practices within the department and the national laboratories, several officials said.

Doyle’s reference to the existence of Israel’s nuclear arsenal reflects the consensus intelligence judgment within DOE nuclear weapons-related laboratories, former officials say. But some said they find it so hard to avoid any public reference to the weapons that classification officers periodically hold special briefings about skirting the issue.

“It was one of those things that was not obvious,” a former laboratory official said, asking not to be identified due to the sensitivity of the topic. “Especially when there’s so much about it in the open domain.”

Israel’s nuclear-weapons program began in the 1950s, and the country is widely believed to have assembled its first three weapons during the crisis leading to the Six-Day War in 1967, according to the Nuclear Threat Initiative, a nonprofit group in Washington that tracks nuclear-weapons developments.

For decades, however, Israel itself has wrapped its nuclear program in a policy it calls amimut, meaning opacity or ambiguity. By hinting at but not confirming that it has these weapons, Israel has sought to deter its enemies from a major attack without provoking a concerted effort by others to develop a matching arsenal.

Israeli-American historian Avner Cohen has written that U.S. adherence to this policy evidently grew out of a September 1969 meeting between President Richard Nixon and Israeli Prime Minister Golda Meir. No transcript of the meeting has surfaced, but Cohen said it is clear the two leaders struck a deal: Israel would not test its nuclear weapons or announce it possessed them, while the United States wouldn’t press Israel to give them up or to sign the Non-Proliferation Treaty, and would halt its annual inspections of Dimona, the site of Israel’s Negev Nuclear Research Center.

As an outgrowth of the deal, Washington, moreover, would adopt Israel’s secret as its own, eventually acquiescing to a public formulation of Israeli policy that was initially strenuously opposed by top U.S. officials.

Golda Meir, Richard Nixon, and Henry Kissinger in 1973, four years after an earlier meeting that may have established the policy of mutual silence regarding Israel's nuclear armaments. (Wikimedia)

“Israel will not be the first country to introduce nuclear weapons into the Middle East,” the boilerplate Israeli account has long stated. “Israel supports a Middle East free of all weapons of mass destruction following the attainment of peace.” When Nixon’s aides sought assurances that this pledge meant Israel would not actually build any bombs, Israeli officials said the word “introduce” would have a different meaning: It meant the country would not publicly test bombs or admit to possessing them, leaving ample room for its unacknowledged arsenal.

“While we might ideally like to halt actual Israeli possession,” then-National Security Advisor Henry Kissinger wrote in a July 1969 memo to Nixon that summarized Washington’s enduring policy, “what we really want at a minimum may be just to keep Israeli possession from becoming an established international fact.”

Even when Mordechai Vanunu, a technician at Dimona, provided the first detailed, public account of the program in 1986 and released photos he had snapped there of nuclear-weapons components, both countries refused to shift gears. After being snatched from Italy, Vanunu was imprisoned by Israel for 18 years, mostly in solitary confinement, and subsequently forbidden to travel abroad or deal substantively with foreign journalists. In an email exchange with the Center for Public Integrity, Vanunu indicated that he still faces restrictions but did not elaborate. “You can write me again when I am free, out of Israel,” he said.

The avoidance of candor has sometimes extended to private government channels. A former U.S. intelligence official said he recalled being flabbergasted in the 1990s by the absence of any mention of Israel in a classified document purporting to describe all foreign nuclear-weapons programs. He said he complained to colleagues at the time that “we’ve really got a problem if we can’t acknowledge the truth even in classified documents,” and finally won a grudging but spare mention of the country’s weaponry.

Gary Samore, who was President Obama’s top advisor on nuclear nonproliferation from 2009 to 2013, said the United States has long preferred that Israel hold to its policy of amimut, out of concern that other Middle Eastern nations would feel threatened by Israel’s coming out of the nuclear closet.

“For the Israelis to acknowledge and declare it, that would be seen as provocative,” he said. “It could spur some of the Arab states and Iran to produce weapons. So we like calculated ambiguity.” But when asked point-blank if the fact that Israel has nuclear weapons is classified, Samore—who is now at Harvard University—answered: “It doesn’t sound very classified to me—that Israel has nuclear weapons?”

The U.S. government’s official silence was broken only by accident, when, in 1979, the CIA released a four-page summary of an intelligence memorandum titled “Prospects for Further Proliferation of Nuclear Weapons” in response to a Freedom of Information Act request by the Natural Resources Defense Council, a nonprofit environmental group.

“We believe that Israel already has produced nuclear weapons,” the 1974 report said, citing Israel’s stockpiling of large quantities of uranium, its uranium-enrichment program, and its investment in a costly missile system capable of delivering nuclear warheads. Release of the report triggered a spate of headlines. “CIA said in 1974 Israel had A-Bombs,” a New York Times headline declared. “Israel a Nuclear Club Member Since 1974, CIA Study Indicates,” announced The Washington Star.

But it stemmed from a goof.

John Despres, who was the CIA’s national-intelligence officer for nuclear proliferation at the time, said he was in charge of censoring or “redacting” the secret material from the report prior to its release. But portions he wanted withheld were released, he said in an interview, while sections that were supposed to be released were withheld.

“This was a sort of classic case of a bureaucratic screw-up,” said Despres, now retired. “People misinterpreted my instructions.” He said that as far as he knows, no one was disciplined for the mix-up. Moreover, in 2008, when the National Security Archive obtained a copy of the document under the Freedom of Information Act, that judgment remained unexcised.

But Washington’s refusal to confirm the obvious in any other way has produced some weird trips down the rabbit hole for those seeking official data about the Israeli arsenal. Bryan Siebert, who was the most senior career executive in charge of guarding DOE’s nuclear-weapons secrets from 1992 to 2002, said he recalls at one point seeing a two-cubic-foot stack of CIA, FBI, Justice, and Energy department documents about Israel’s nuclear program.

John Fitzpatrick, who since 2011 has served as director of the federal Information Security Oversight Office, confirmed that “aspects” of Israel’s nuclear status are considered secret by the United States. “We know this from classifying authorities at agencies who handle that material,” said Fitzpatrick, who declined to provide more details.

Kerry Brodie, director of communications for the Israeli embassy in Washington, similarly said no one there would discuss the subject of the country’s nuclear status. “Unfortunately, we do not have any comment we can share at this point,” she wrote in an email. A former speaker of the Israeli Knesset, Avraham Burg, was less discreet during a December 2013 conference in Haifa, where he said “Israel has nuclear and chemical weapons” and called the policy of ambiguity “outdated and childish.”

Through a spokesman, Robert Gates declined to discuss the issue. But a growing number of U.S. experts agree with Burg.

Pillar, for example, wrote in his article this month that the 45-year-old U.S. policy of shielding Israel’s program is seen around the world “as not just a double standard but living a lie. Whatever the United States says about nuclear weapons will always be taken with a grain of salt or with some measure of disdain as long as the United States says nothing about kumquats.”

Victor Gilinsky, a physicist and former member of the Nuclear Regulatory Commission who has written about the history of the Israeli program, complained in a recent book that “the pretense of ignorance about Israeli bombs does not wash anymore. … The evident double standard undermines efforts to control the spread of nuclear weapons worldwide.”

J. William Leonard, who ran a government-wide declassification effort as President George W. Bush’s director of the Information Security Oversight Office from 2002 to 2008, commented that “in some regards, it undermines the integrity of the classification system when you’re using classification to officially protect a known secret. It can get exceedingly awkward, obviously.”

Aftergood said the secrecy surrounding Israel’s nuclear weapons is “obsolete and fraying around the edges. … It takes an effort to preserve the fiction that this is a secret,” he said. Meanwhile, he added, it can still be abused as an instrument for punishing federal employees such as Doyle for unrelated or politically inspired reasons. “Managers have broad discretion to overlook or forgive a particular infraction,” Aftergood said. “The problem is that discretion can be abused. And some employees get punished severely while others do not.”

Dana H. Allin, the editor of Doyle’s article in Survival magazine, said in a recent commentary published by the International Institute for Strategic Studies in London that “anyone with a passing knowledge of international affairs knows about these weapons.” He called the government’s claim that the article contained secrets “ludicrous” and said Doyle’s ordeal at the hands of the classification authorities was nothing short of Kafkaesque.


This article appears courtesy of the Center for Public Integrity's National Security team.

 

This article was originally published at http://www.theatlantic.com/international/archive/2014/09/israel-nuclear-weapons-secret-united-states/380237/








Author: "Douglas Birch and R. Jeffrey Smith" Tags: "International"
Date: Monday, 15 Sep 2014 19:00

Until recently, Scotland's independence referendum received little attention in the United States. Americans had company in London, where political elites also paid little heed to the possible break-up of their country. Then, at the end of August, just weeks before the September 18 vote, polls detected a sudden rise in pro-independence ‘Yes’ sentiment. ‘No’ is still favored to win. But the result will be uncomfortably close.

The ‘Yes’ side has received a boost from a manipulative campaign by Scotland’s independence-minded local authorities. Back in 1998, Tony Blair’s government devolved considerable power to a newly created Scottish parliament. The separatist Scottish National Party (SNP) gained the first ministership in 2007 and won an outright majority in 2011. Tapping into this legislative power, the separatists have rigged the referendum rules in their favor.

Scots are a famously mobile people. “The noblest prospect which a Scotsman ever sees is the high road that leads him to England,” Samuel Johnson once gibed. Those Scots who have taken the road, however, forfeit their vote on whether that road shall remain open or be closed: Anyone who has resided outside of Scotland for more than a year is disqualified from the referendum. The franchise was extended downward to 16- and 17-year-olds: Teenagers, unsurprisingly, are more attracted to the high-risk, low-reward independence proposition than are grownups. The Scottish separatists deployed the resources of government to produce a hefty white paper on independence with blithe assurances that an independent Scotland will enjoy lower taxes, higher social benefits, and less public debt—and that all difficult problems, including currency questions, will be amicably negotiated to Scotland’s advantage after a ‘Yes’ vote.

The standard political expectation is that the ‘No’ side of any referendum will outperform its poll-measured strength, as natural voter caution asserts itself at the last minute. But Americans need to understand: Their national interests are also at risk on September 18.

A vote in favor of Scottish independence would hurt Americans in five important ways.

First, a ‘Yes’ vote would immediately deliver a shattering blow to the political and economic stability of a crucial American ally and global financial power. The day after a ‘Yes’ vote, the British political system would be plunged into a protracted, self-involved constitutional crisis. Britain’s ability to act effectively would be gravely impaired on every issue: ISIS, Ukraine, the weak economic recovery in the European Union.

Second, a ‘Yes’ vote would lead to a longer-term decline in Britain’s contribution to global security. The Scottish separatists have a 30-year history of hostility toward NATO. They abruptly reversed their position on the military alliance in 2012 to reassure wavering middle-of-the-road voters. But the sincerity of this referendum-eve conversion is doubtful. Even if it was authentic, the SNP’s continuing insistence on a nuclear weapons-free policy would lock U.S. and U.K. forces out of Scotland’s naval bases. The SNP’s instincts are often anti-American and pro-anybody-on-the-other-side of any quarrel with the United States, from Vladimir Putin to Hamas.

Third, a ‘Yes’ vote would embitter English politics and empower those who wish to quit the European Union. Since the 1990s, the central British government has attempted to appease Scottish separatism. Tony Blair devolved powers; David Cameron agreed that the U.K. would recognize a Scottish independence vote as binding. In the wake of a ‘Yes’ vote, however, English public opinion would harden. The bargaining over public debt, ownership of North Sea oil, and other contentious issues would be ferocious—and those English politicians who urge a tougher line on these matters would likely dominate the debate. Such politicians also tend to be Euro-skeptics. The United States has traditionally preferred an EU that includes the U.K., both because a cross-Channel common market makes it easier for U.S. businesses to conduct commerce and because U.K. leaders—from the Conservative and Labour parties alike—have historically pushed the EU in a more free-market direction.

Fourth, a ‘Yes’ vote would aggravate the paralysis afflicting the European Union. An independent Scotland would seek admission to the EU as a 29th member state. A club of so many member states cannot function by consensus, and the EU has yet to develop more effective decision-making methods. The result, much of the time, is that no decision is made at all—a dynamic that Vladimir Putin depended on when he picked a fight with a multinational entity that is notionally much richer and stronger than Russia is.

Fifth, a ‘Yes’ vote would only further encourage German domination of the European Union. The EU originated as a bloc of three large countries (France, Italy, and West Germany) and three smaller ones (Belgium, Luxembourg, and the Netherlands). West Germany ranked as first among equals, but France and Italy could make their voices heard too. Today, an expanded EU contains 19 members that are less populous than Belgium and one—Malta—that is even smaller than Luxembourg. Meanwhile, united Germany looms bigger and richer than ever, accounting for more than 20 percent of the EU’s gross domestic product. Scotland as a 29th member nation (and, who knows, perhaps Catalonia as a 30th), along with the emergence of a weakened United Kingdom, would push the EU in an even more lopsided direction: The median EU member by population would be Austria—a country of 8.5 million that sells 30 percent of its exports to Germany.

Germany is an important American ally, and there is nothing sinister about the role it’s currently playing in Europe. But Germany’s interests do not align precisely with those of the United States or other EU member states. Germany usually favors a more deflationary monetary policy and a more accommodating policy toward Russia than most U.S. administrations and many EU members would prefer. As the ranks of the small, Germany-beholden states of Europe proliferate, the EU is evolving away from its historic role as a restraint on German dominance within Europe—and toward a disturbing new role as a multiplier of that dominance.

In February 1995, Bill Clinton traveled to Ottawa to speak in favor of Canadian unity. “In a world darkened by ethnic conflicts that tear nations apart, Canada stands as a model of how people of different cultures can live and work together in peace, prosperity, and mutual respect,” Clinton told the Canadian Parliament. The U.S. president was a more popular figure in Quebec than that province’s own politicians, and his words likely contributed to the narrow margin of victory of the ‘No’ side in Quebec’s second and final secession referendum later that year. President Obama has played no equivalent role in the debate over the survival of America’s close ally, the United Kingdom. If the ‘Yes’ vote prevails on September 18, Obama’s omission should be remembered in the postmortem assignment of blame for a potential disaster for the peoples of Britain, Europe, and the Western alliance.

This article was originally published at http://www.theatlantic.com/international/archive/2014/09/americas-stake-in-the-scottish-referendum/380228/








Author: "David Frum" Tags: "International"
Date: Monday, 15 Sep 2014 18:00

I smoked my second cigarette on a dour July afternoon when I was 15.

When I was younger, in the years of Captain Planet and after-school specials, I had been stridently anti-smoking. I would hide my uncle and aunt's cigarettes, or harangue them to quit, fruitlessly but to the best of my 10-year-old abilities. I smoked my first cigarette around that time, when I snuck into my uncle's place and lit the longest butt in the ashtray just to be sure I didn’t like it. I didn’t—I don't remember if I coughed, but I do remember the smoke from the rapidly shortening nub rising up into my eyes, and burning so strongly that I dropped the cigarette and almost spilled the ashtray.

But as a teenager, when that summer day bore down on me, grabbing that Marlboro Menthol Light just seemed right. My friends and I were sitting on the sidewalk just off Grand Avenue in Queens, dressed better than our usual sneakers and jeans. Someone may have been wearing a tie. I don't recall anyone speaking, just a gauzy image of everyone plucking cigarettes as a girl swung an open pack in front of us. Our eyes were red. We'd all been crying. That shell that 15-year-olds walk around in that makes them feel invincible—mine had been shattered.

More accurately, I suppose, it had happened a few days earlier. I arrived home to a message from my friend, Justin. He spoke quickly, saying to call him, but it wasn't much stranger than the typical awkward messages he left on my answering machine. I called him back. I never quite understood why people say “You should sit down for this” when they have big news, but when Justin told me that our close friend Jorge was dead, I sat down.

Jorge was fun and loyal, the type to give you the shirt off his back, and on at least one occasion he actually did. We were both underdressed for a surprisingly cold morning and he handed me his Old Navy button-down, which wouldn't have done either of us any good against the breeze, but the gesture always stayed with me: my definitive Jorge moment. During the summer between 9th and 10th grades, he went to stay with family in Utah and drowned while swimming in a lake, some combination of his being a poor swimmer and catching a cramp. The exact circumstances didn't actually matter to me because they were an answer to the wrong question. The question wasn't “Why didn’t he wear a life vest?” or “Why wasn't anyone able to get to him?” It was an infinitely broad and dizzyingly cosmic “Why?”

After Justin's call, not knowing what else to do, I called Jorge's house. His sister Vivian answered. I choked out some tearful version of “Is it true?” She howled yes and we sobbed in each other's ears for a few minutes. But I had never been so shaken as at the funeral, when I caught that first glimpse of him in the casket. The world melted away and I can't tell you where anyone was or even the color of the coffin, but the warm-hued image of my friend seared itself deeply, viciously, into my mind. He looked mature and calm, and somehow unfamiliar. He was buried in his Yankees jersey. Whatever I felt burning in my chest at that moment, it was dwarfed entirely by the plaintive wailing of Jorge's mother. I felt terribly small in that room. And if smoking that cigarette wasn't right, then it was something—a single modicum of control, a fleeting break from helplessness. Even if it was destructive.

Jorge’s death is not the reason I became a smoker—it was merely the unfortunate happenstance that set off my smoking habit. If it were as simple as just needing the relief at the funeral, I wouldn't have kept smoking, but I did.

In fact, I soon grew to love smoking. It became a wonderful crutch to me over the years. I loved the sound of a Zippo lighter snapping open. I loved having an excuse to strike up a conversation with people outside bars, or to break the monotony of a long work day with a few trips outside. I especially loved nights sitting on my stoop with my cousin, smoking and talking until the paper got delivered, just as day broke.

I liked to use stress as an excuse for smoking because people barely argued with it, but the truth is that I rarely felt stressed. I had a habit, no doubt, but it was a behavioral craving, not a physical one. There were times to have a cigarette: before school, at the bus stop, after a big meal, before bed. And I liked the action of it. There was a sense of ritual: Slide a cigarette from the pack, tap it a few times against the lighter, the flame, the sizzle. Inhale, exhale. Repeat.

I cut back gradually after college, then drastically when I moved in with my girlfriend, only smoking on occasional weekends. But I liked knowing I could turn to it. I'd buy a pack for the last few weeks of a semester in graduate school, just to help the stress of those final assignments, but what would start as one or two cigarettes a day became two or three packs a week before I'd have to consciously pump the brakes. The truth is that I didn't particularly want to stop. I knew I should, but I felt that by declaring I quit, I'd be judged if I failed. What would I do when real stress crept up on me?

In the spring of 2013, I found out. My mother told me she had to go under the knife for a valve replacement. She scheduled her surgery for after my birthday, because as a single mother she was incapable of selfishness—she was facing a tremendously frightening unknown, and she didn't want to ruin my birthday if something went wrong. It was touching, even if it was pessimistic. Wrapping up a semester of grad school had been a welcome distraction, but as the day approached I found myself smoking more than normal.

My mother arrived at Mount Sinai before 5:00 a.m. with her boyfriend, Pete, and my grandmother. Just before she got called into the prep area, she and I broke away into the hallway for a moment. She told me she was proud of me, gave me a hug, and I steeled my nerves and told her to tell me all about it when she got out. As they carted her away, she blew us a small kiss and waved. I went outside for a cigarette. There were a few left in my pack, and I lit one as I sat on a bench outside Central Park. I remember watching the cigarette tremble in my hand.

The procedure was routine, as far as open-heart surgery goes. We spent the morning and most of the afternoon in the waiting area. It was a six-hour operation, they said. There was a board up on the wall, like the ones that show airline arrivals and departures, that was meant to let us know what stage the surgery had reached. The fact that it never changed was not reassuring. Pete fell asleep for a while, and my grandmother might have even snatched a few winks, but as the day went on, I shifted from my early no-news-is-good-news outlook to a more anxious state of sleeplessness. I spent hours craning my neck to check the board every few seconds and made a few more trips outside to smoke.

When the doctor called my cell phone to say it was over, his tone was upbeat, and as soon as I heard him use the word “perfect,” I felt a wave of relief wash over me. I didn't know I was holding my breath until I exhaled. The same relief was visible in my grandmother and Pete, and palpable over the phone when I called more family from the waiting room. Then I went outside for a smoke, to calm myself down. Later that day, my mother woke up for a second, and I told her she did well, that she was strong. She nodded at me hazily through the anesthesia, and we left so she could rest. When I finally arrived home later that night, I stayed out for a cigarette before going inside, one more to cap a long day.

That was the night that I decided to stop smoking for good.

Outside my house, I looked up at the moon and exhaled, watching it shine through my cloud of smoke. A startling thought crept in: we weren’t out of the woods yet, because bad news likes to wait until you're at ease before it crushes you. I felt my relief beginning to crack.

“If everything turns out alright with my mother, I'll stop smoking,” I thought. And if not, then quitting would be the last thing on my mind.

I do not believe in God, or any great sentient power. Beyond occasional conversational thoughts to my late grandfather, whom I had appealed to a lot that day, this cosmic bargain was the closest I've ever come to earnest prayer. Or maybe it was just growing up: from a teenager betrayed by mortality turning to self-destruction, to an adult throwing away a harmful habit as I watched someone I love fight to live.

When I went to see my mother the next day, she was awake and alert and making jokes. I knew I had to uphold my end of the deal. The truth is, I barely miss the smoking. A few months later I took a drag of a friend's cigarette, just to make sure I was done with them. I was.

I know now that quitting was probably just another desperate swipe at control. And I know that my mother's health did not hinge on my quitting, but I nonetheless feel obligated to my word. I started smoking in a vulnerable, angry, feeble attempt to assuage a loss. I quit to avoid another. So far, so good.

This article was originally published at http://www.theatlantic.com/health/archive/2014/09/my-life-in-cigarettes/379249/








Author: "Robert Tutton" Tags: "Health"
Date: Monday, 15 Sep 2014 17:50
Author: "Chris Heller" Tags: "Video"
Date: Monday, 15 Sep 2014 17:44

Are you having a good day? Are you feeling rested, and happy, and ready to conquer the week ahead with your signature mixture of aggression and aplomb? Are you on top of your game, and thus on top of the world?

Then you might want to get off the Internet. Because the Internet does not agree with your sparkly sense of optimism. The Internet—with all due no offenses and sorry, buts—does not think you are on top of your game. Your capacity to work? To love? To live your life? You may not have asked for the Internet's opinion on these matters, but it will tell you anyway: It thinks you are Doing It Wrong.

Well, not the Internet (if, when it comes to precision, we are Doing It Right). The people who are employed to write things for the Internet. "You're doing it wrong" has become a common headline cliche, a sassier, snarkier version of "8 Questions You Were Too Embarrassed to Ask" and "You'll Never Believe What Happened Next." Here, according to the Internet/journalism/your fellow humans, is an incomplete list of all the things you—yes, you—have recently been doing wrong:

Tweeting
Parenting
Exercising
Investing in biotech
Investing in general
Making French toast
Making French fries
Making potato salad
Making tomato soup
Making muffins
Eating cupcakes
Eating popsicles
Eating grapefruit
Eating cauliflower
Eating corn
Eating tacos
Eating Tic-Tacs
Following the Paleo diet
Taking the Ice Bucket Challenge
Marketing to millennials
Blogging
Surfing
Horseback riding
Demonstrating respect
Designing websites
Cutting cucumbers
Curing colds
Having relationships
Sleeping with dogs
Looking for jobs
Hiring employees
Encouraging a Feedback Culture
Recycling plastic
Using the bathroom
Making cornbread
Making pasta
Cooking eggs
Checking out at grocery stores
Science
Anthropology
Anatomy
Digital marketing
Social media marketing
Server performance
APIs
Bagels
Paella
Refried beans
Nuts
Rice
Religion
Summer
Puberty
Philanthropy
Life
Love
Working
Parking
Changing habits
Paying bills
Making music
Folding sheets
Folding clothes
Making bruschetta
Pronouncing "bruschetta"
Applying nail polish
Wearing pants
Writing novels
Peeling ginger
Peeling bananas
Wrapping burritos
Making crepes
Praying
Using jetpacks
Being a dad
Making people happy
Making people sad
Waging psychological warfare
Facing the future

Yeah. Sorry.

Like so many such things, "You're Doing It Wrong" started as a meme. In the early 2000s, you could often find it as the all-caps wording on "FAIL" images (like, say, a guy taking an electric shaver to his forearms, or George W. Bush talking into the earpiece, rather than the mouthpiece, of a telephone). But, unlike many such things, the meme didn't die. Instead, it evolved. Journalism reclaimed it for preachy headlines like "Doing It All: You're Probably Doing It Wrong." I've used it. Slate has a topic section dedicated to it.

We should probably stop with all this. The headlines are cheeky, sure, but in the aggregate, they are simply sad. In the name of helping people out, we have become a group of wanton, finger-wagging judgers. Which, no matter the particular moral or ethical code you subscribe to, is probably doing it wrong.

This article was originally published at http://www.theatlantic.com/technology/archive/2014/09/everything-youre-doing-it-wrong/380160/








Author: "Megan Garber" Tags: "Technology"
Date: Monday, 15 Sep 2014 17:15

1) Green Power in Vermont. Last year our American Futures team reported on several almost-too-good-to-be-true aspects of life in Burlington, Vermont. A print newspaper that was thriving. A commercial airport that was actually pleasant. A brewery whose output was so much in demand that it rationed sales to give everyone a chance. A strong business-and-social-responsibility culture, including in clean tech and info tech. Advances in traditional higher-ed but also in a "career-oriented" approach. An ability to absorb refugees and immigrants. Overall, effective governance and public-private collaboration, from the era of its onetime Socialist mayor Bernie Sanders to the current Democratic mayor Miro Weinberger.

Today, a significant news announcement via this AP story by Wilson Ring. It begins:

Vermont's largest city has a new success to add to its list of socially conscious achievements: 100 percent of its electricity now comes from renewable sources such as wind, water and biomass....

"It shows that we're able to do it, and we're able to do it cost effectively in a way that makes Vermonters really positioned well for the future," said Christopher Recchia, the commissioner of the Vermont Department of Public Service.

A lot of things are easier to do in a small state like Vermont than in a big, sprawling, quarrelsome country like America as a whole. Still, these things don't happen on their own, and this is another reason to offer congratulations.


2) Eagles Power in Allentown. As we previewed last week (and as was covered on Marketplace), this past Friday night was a huge watershed for the government, businesses, and people of Allentown, Pa. That was when the high-stakes effort to revive the tattered downtown had its debut event, with a concert by The Eagles at the new PPL Arena.

How did it go? Our friends at the Morning Call have a number of generally upbeat stories. For instance, here's the way their review of the concert itself was played on the Morning Call's site.

And here was the connection between the band and the town. The Eagles group, John Moser wrote,

which once was such a mighty force in music that it sold 60 million albums in the 1970s alone and became the third-best-selling band of all time, is back on tour after just one studio album in 35 years and no Top 10 songs since 1980.

That puts the band in much the same situation as Allentown — a city that is decades removed from its glory years and is looking to reassert itself.

Both were pretty impressive at the arena's first event Friday.

For other local coverage, you can start here or here, which includes a charming video of people who had come back into the downtown for the first time in years. This latter story includes a variety of reactions like this:

"I used to come here as a kid, but there was never any reason to come back unless you had jury duty," [a suburbanite] said. "This is a lot nicer than I've heard. They've really cleaned it up. I can't believe this is Allentown."

Allentown still has a million problems and a very long way to go, as people there are aware and as we will discuss. But to kick off the new season of American Futures coverage, please check out John Tierney's new post on the unusual tax plan that lies behind the downtown revival, and the people in state and local government behind that plan. He explains how it's different from urban-incentive programs in place anywhere else, the tradeoffs it involves, and why—in his view, as a one-time political-science professor* and a communitarian in outlook—he thinks it is a positive step.

More to come later this week, as we unveil a new look for this series. For now, good wishes to people working hard for their communities from Burlington to Allentown.


*Small-world department: When John Tierney was in graduate school at Harvard, his dissertation supervisor was none other than the late James Q. Wilson, who is famous in and beyond Atlantic precincts as co-author of the hugely influential article "Broken Windows." John studied urban politics and has been wrestling with these issues for a long time.

This article was originally published at http://www.theatlantic.com/business/archive/2014/09/greening-up-in-burlington-starting-up-in-allentown/380222/








Author: "James Fallows" Tags: "Business"
Date: Monday, 15 Sep 2014 16:06
Author: "--" Tags: "Video"
Date: Monday, 15 Sep 2014 16:00

Constitutional disputes sometimes turn on technical legal terms: What is “due process of law,” for example, or “double jeopardy”? But most of the Constitution isn’t written in legalese, and some important cases are about the meaning of ordinary language. National Labor Relations Board v. Noel Canning, last year’s “recess appointment” case, was decided by the court below with the help of an 18th-century dictionary on the meaning of “the.” (The Supreme Court majority didn’t decide the case on dictionary grounds, though Justice Antonin Scalia in his angry concurrence managed to kick up quite a row about the meaning of “happen.”)

Here’s another constitutional conundrum: What does “legislature” mean?

The answer could determine an issue at the heart of our current poisonous politics. Can the voters of a state take control of drawing House districts out of the hands of their elected legislators and entrust it to a bipartisan commission? That’s what Arizona voters did in 2010. Now the legislature is demanding to be allowed back in.  

Article 1, section 4, clause 1 of the Constitution says that “The times, places and manner of holding elections for Senators and Representatives, shall be prescribed in each state by the Legislature thereof; but the Congress may at any time by law make or alter such regulations ....” No one questions that state governments can draw their own legislative districts. But what does “legislature” mean? Does it mean “the legislative power of a state,” or “the bunch of politicians with bad haircuts who meet at the state capitol every year or so”?

Briefs filed with the Supreme Court recite this question in the elevated language of original understanding, Madisonian theory, and the Federalist. But like many, if not most, important constitutional cases, Arizona State Legislature v. Arizona Independent Redistricting Commission is really comic-opera politics in knee britches.

In 2000, civic groups in Arizona—including the League of Women Voters, Common Cause, and the Arizona School Boards Association—joined a bipartisan group of political leaders to propose a voter initiative, Proposition 106. Approved by 56 percent of the voters, it created a new, bipartisan panel called the Independent Redistricting Commission. The commission’s job is to create new districts for the legislature and Arizona’s nine members of the U.S. House. It is not permitted to consider protection of incumbents; it is, however, under a duty to make as many districts “competitive” as possible. The legislature may not approve or disapprove the commission’s maps.

The appointment process is labyrinthine: First a commission on appointments proposes names, then officials of the legislature choose two Republicans and two Democrats to serve. These two then select an independent to serve as chair. The governor can remove a member for neglect of duty or misconduct, but otherwise, political control is nonexistent.

Despite its good-government origins, the commission broke into partisan squabbling; the chair, a political independent, often sided with the two Democrats. Governor Jan Brewer, a Republican, fired her. The state supreme court reinstated the chair, saying she hadn’t neglected or abused her office.

The Arizona Republican Party bitterly protested the commission’s 2012 maps. In the most recent legislative elections, the voters picked Republicans by a 17-13 margin in the state Senate and 36-24 in the House. In the U.S. House, the margin flipped from 5-4 Republican to 5-4 Democratic. A partisan districting plan, however, could have given the Republicans a supermajority in the statehouse and kept one or more additional House seats for the GOP.

Now the GOP-controlled legislature has sued, arguing that the Constitution doesn’t allow redistricting of a state by any official body not controlled by “the legislature thereof.” A three-judge panel below dismissed the suit. The Court will decide, as soon as September 29, whether to affirm the three-judge court or put the case down for a full hearing. The constitutional issue is a close one; the political division underlying it is stark.

Our political system, as we all know, has degenerated into a partisan abattoir; in Congress and in many state capitals, compromise and conciliation are as out of fashion as the straw boater. One major reason is partisan gerrymandering, which produces legislators (on both sides of the aisle) who respond to no one except their wealthy funders and their partisan base. In a 2004 case called Vieth v. Jubelirer, the Supreme Court ducked the chance to put the brakes on this odious practice. Writing for four members of the Court, Scalia scoffed at the idea that partisan thimblerigging was worthy of the Court’s attention: “‘Fairness’ does not seem to us a judicially manageable standard.”

By the time of Vieth, the voters of Arizona had already decided to take politics out of redistricting themselves. The question then becomes whether the Constitution allows them to do so.  

The Court has decided a few cases—the most recent nearly 80 years ago—approving the involvement of a state’s governor, courts, and voters in the redistricting process. In a 1916 case called Ohio ex rel. Davis v. Hildebrandt, Ohio voters by referendum disapproved a legislature’s new district map; when the legislature sued, the Court said that “the referendum constituted a part of the state Constitution and laws, and was contained within the legislative power.” Article IV of Arizona’s state constitution sets up the legislative branch with this declaration: “The legislative authority of the state shall be vested in the legislature ... but the people reserve the power ... to approve or reject at the polls any act, or item, section, or part of any act, of the legislature.” Since statehood, the commission argues, the “legislative power” has been vested in the people.

Lawyers for the legislature respond with Supreme Court caselaw suggesting that “legislature” means “the representative body which made the laws of the people,” not the entire legal apparatus of a state. And they note correctly that none of the previous cases involved a system in which the legislature had no role at all in drawing district maps.

The law in this area, sparse as it is, seems to be relatively settled—the legislature lost below because of these precedents. This case, however, comes to the Court as an “appeal”—that is, directly from a decision by a three-judge district-court panel below. The Court is more likely to hear appeal cases than cases brought to it by petition. In theory, it’s supposed to hear all appeals, but in fact it can reject an appeal for lack of a “substantial question,” meaning in essence that there’s nothing in the case that interests the justices.

The valence of this case from day one has been sharply partisan. Republican conservatives and dark-money groups hate the very idea of “non-political” redistricting. And of course Democrats misuse their legislative majorities to the same end. The ill effects of this scorched-earth strategy are easy to see. The Court has refused to prevent gerrymanders. To insist that the people of a state can’t do it either would be something else again.

This article was originally published at http://www.theatlantic.com/politics/archive/2014/09/can-the-voters-take-politics-out-of-redistricting/380150/

Author: "Garrett Epps" Tags: "Politics"
Date: Monday, 15 Sep 2014 15:34

The news of NFL player Adrian Peterson’s arrest for child abuse came on the heels of a week focused on an act of domestic violence from another black football player. Some of his colleagues rushed to defend Peterson and suggest his charges were yet another example of our society’s widespread and unfair demonization of black men. “[Peterson] can’t play Sunday for disciplining his child[.] Jesus help us,” tweeted fellow NFL player Roddy White. And running back Mark Ingram chimed in, arguing that his parents gave him “more whoopins” than he can count because “they just wanted [him] to be the best human possible.”

To be sure, the prevalent use of corporal punishment among African-Americans is no secret. Some scholars have argued that beating children in the black community serves as some sort of traumatic reenactment of the brutal violence experienced during slavery, a remnant of centuries-old barbarity.

But black people are not the only ones who use corporal punishment when it comes to disciplining their children. About 80 percent of white and Hispanic parents admit that they spank their kids, too. Lots of Christians defend it as a religious practice, and in 19 states it’s still legal for teachers and staff to punish children with spankings at school.

So, when a black man gets arrested for an act that is still ambivalently upheld by our country’s legal and moral codes, many people wonder about the role that race might be playing.

For some folks, the very act of questioning black parenting triggers concerns about racism. And for good reason. The absolute devastation of the black family during slavery shaped the very definition of freedom around the ability to raise one’s own children. But even after slavery ended, black parents continued to experience extensive governmental surveillance, critique, and intervention in their homes. Even today, a black woman is much more likely (than a white woman under the same conditions) to be investigated for child abuse and have her children removed. And her children are likely to stay in foster care for much longer.

Criticism of black fathers is an even touchier subject. Whether it’s the rhetoric of our own black-father-in-chief President Obama, infamous government studies about the failures of black mothers, or general commentary in the mainstream press, it seems like everyone thinks that fatherhood is the cure to all that ails black Americans. These folks argue that without men, children lack discipline, stability, and proper guidance. And, historically, many black men challenging racial constraints were taken from their homes through acts of white terrorism and murder. As a result, attacks on black fathers are seen not only as attacks on black families, but also as assaults on black progress. This is even more true when it comes to criticizing a professional athlete like Adrian Peterson, who so clearly embodies ideals of black masculinity, strength, and success.

As the comments of some of Peterson’s fellow players show, there are even those in the black community who make a larger cultural argument that the strict punishment of black children is a necessary evil. Physical discipline at the hands of a loved one is seen as preferable to the always-looming life-and-death threat of white supremacist violence. Renisha McBride, Trayvon Martin, Mike Brown, Donald Maiden, Jr., Darius Simmons, and others function as cautionary tales for many black parents, reminding them that being a black child in America is a dangerous enterprise. The severity of their beatings warns black children that they can’t afford to mess up. And the anger, fury, and pain of heavy hands convey their parents’ profound fear.

But these are all hollow excuses and cultural pretense when it comes to discussing what happened to a little boy at Adrian Peterson’s hands a few months ago. The truth of the matter is that Adrian Peterson beat his four-year-old son so badly that he had bruises on his scrotum. His son reported that before Peterson struck him repeatedly with a stick, he stuffed his mouth with leaves and pulled down his pants. This was how Adrian Peterson aimed to teach his son how to share. But in truth, it was a lesson in patriarchy—that a man’s power comes from the control and degradation of weaker others.

More than 1,500 children died from abuse and neglect in 2012 alone, most of them younger than four. So, all of those folks upholding Peterson as a symbol of black male oppression or denigration need to take a step back. The bruises on that little boy’s body are not symbolic. His fear and trauma are not due to some grand media conspiracy. And hiding and rationalizing violence against weak and helpless people represents the very worst of humankind. The black community is more than black men; violence is not love. And if you think the media coverage of men like Ray Rice or Adrian Peterson makes black people look bad, then just think what it looks like when you defend and justify their abuse.

This article was originally published at http://www.theatlantic.com/entertainment/archive/2014/09/adrian-peterson-is-not-a-symbol/380199/


Author: "Khadijah Costley White" Tags: "Entertainment"
Date: Monday, 15 Sep 2014 15:18

The most culturally significant female artist of the 1980s? Janet Jackson.

I realize that’s a big claim for a decade that included such talents as Whitney Houston, Tina Turner, Annie Lennox, Cyndi Lauper, and Madonna. It may seem even more dubious given the fact that Janet really only emerged as a major figure in 1986 with the release of Control—and only released two substantial albums over the course of the decade. Janet didn’t have the vocal prowess of Whitney Houston, or the poetic subtlety of Kate Bush; she didn’t have Annie Lennox’s penchant for the avant-garde or Madonna’s predilection for shock.

But none of these artists achieved the cross-racial impact (particularly on youth culture) of Janet. And none of them had an album like Rhythm Nation 1814.

In his Rolling Stone cover story, journalist David Ritz compared Rhythm Nation 1814, released 25 years ago today, to Marvin Gaye’s landmark 1971 album What’s Going On—a pairing that might seem strange, if not sacrilege. But think about it, and the comparison makes a lot of sense. Both albums are hard-won attempts by black musicians to be taken seriously as songwriters and artists—to communicate something meaningful in the face of great pressure to conform to corporate formulas. Both are concept albums with socially conscious themes addressing poverty, injustice, drug abuse, racism and war. Both blended the sounds, struggles, and voices of the street with cutting-edge studio production. Both fused the personal and the political. And both connected in profound ways with their respective cultural zeitgeists.

Yet while What’s Going On has rightfully been recognized as one of the great albums of the 20th century, Rhythm Nation’s significance has been largely forgotten. At the time, though, it was undeniable: For three solid years (1989-1991), the album ruled the pop universe, the last major multimedia blockbuster of the 1980s. During that time, all seven of its commercial singles soared into the top five of the Billboard Hot 100 (including five songs that reached No. 1), surpassing a seemingly impossible record set by brother Michael’s Thriller (the first album to generate seven Top 10 hits). Janet’s record has yet to be broken.

During its reign, Rhythm Nation shifted more than seven million copies in the U.S., sitting atop the charts for six weeks in 1989 before becoming the bestselling album of 1990. It was the first album in history to produce No. 1 hits in three separate years (1989, 1990, 1991). Meanwhile, its innovative music videos—including the iconic militant imagery and intricate choreography of the title track—were ubiquitous on MTV.

But its impact was far more than commercial. Rhythm Nation was a transformative work that arrived at a transformative moment. Released in 1989—the year of Spike Lee’s Do the Right Thing, protests at Tiananmen Square, and the fall of the Berlin Wall—its sounds, its visuals, its messaging spoke to a generation in transition, at once empowered and restless. The Reagan Era was over. The cultural anxiety about what was next, however, was palpable.

* * *

The 1980s were a paradoxical decade, particularly for African-Americans. It was an era of both increased possibility and poverty, visibility and invisibility. The revolution of the pop-cultural landscape was undeniable. “Crossover” icons like Janet, Michael, Prince, and Whitney shattered racialized narrowcasting on radio, television and film, while hip hop emerged as the most important musical movement since rock and roll. The Cosby Show changed the color of television, as Spike Lee and the New Black Cinema infiltrated Hollywood. Oprah Winfrey began her reign on daytime television, while Arsenio Hall’s hip late-night talk show drew some of the biggest names in America. By 1989, from Michael Jordan to Eddie Murphy to Tracy Chapman, black popular culture had never been more prominent in the American mainstream. Over the course of the decade, the black middle and upper class more than doubled and integrated into all facets of American life, from college campuses to the media to politics.

But there was a flip side to this narrative—the decay and abandonment of inner cities, the crack epidemic, the AIDS crisis, the huge spike in arrests and incarceration (particularly of young black men), and the widening gap between the haves and have-nots, including within the black community. By the end of the 1980s, nearly 50 percent of black children were living below the poverty line. This was the reality early hip hop often spoke to and for. Chuck D famously described rap as “CNN for black people.”

It was these voices, these struggles, these ongoing divides and injustices that Janet Jackson wanted to represent in Rhythm Nation 1814. “We have so little time to solve these problems,” she told journalist Ritz in a 1990 interview. “I want people to realize the urgency. I want to grab their attention. Music is my way of doing that.” Pop stars, she recognized, had unprecedented multimedia platforms—and she was determined to use hers to do more than simply entertain. “I wanted to reflect, not just react,” she said. “I re-listened to those artists who moved me most when I was younger ... Stevie Wonder, Joni Mitchell, Marvin Gaye. These were people who woke me up to the responsibility of music. They were beautiful singers and writers who felt for others. They understood suffering.”

A sprawling 12-track manifesto (plus interludes), Rhythm Nation acknowledges this suffering and transfuses it into communal power. It was Janet’s second collaboration with Jimmy Jam and Terry Lewis, the talented duo from Minneapolis who miraculously merged elements of three existing musical strands—Prince, Michael, and hip hop—into something entirely fresh and unique. The Flyte Tyme sound featured angular, staccato-synth bottoms, often overlaid with warm, melodic tops. The sound was tailored to Janet’s strengths: her rhythmic sensibility, her gorgeous stacked harmonies, her openness to new sounds, and her wide musical palette. Jam and Lewis also took the time to learn who Janet was, who she wanted to be, and what she wanted to say, and helped translate those sentiments and ideas into lyrics. On Rhythm Nation, Janet wrote or co-wrote seven of the album’s 12 songs, interweaving social and personal themes.

Twenty-five years later, those songs still pop with passion and energy. Listen to the signature bass of the title track, based on a sample loop of Sly Stone’s “Thank You (Falletinme Be Mice Elf Again),” and the dense textures of noise that accentuate the song’s urgency. Listen to the funky New Jack riff in “State of the World,” again surrounded by a collage of street sounds—sirens, barking dogs, muffled screams—as Janet narrates vignettes of quiet desperation. Listen to the industrial, Public Enemy-like sermon of “The Knowledge.” The opening suite of songs feels like being inside a sonic factory: machines spurt, hiss, and rattle, as if unaccountably left on; glass breaks, metal stomps and clashes. All this is juxtaposed, of course, with Janet’s intimate, feathery voice, making it even more striking.

Listen to how she sings in a lower register in the first verse of “Love Will Never Do (Without You),” then goes up an octave in the second, before the chorus nearly lifts you off the ground. The album is full of sudden, unexpected shifts, as when the euphoric throb of “Escapade” transitions into the arena-rock stomp of “Black Cat.” On the final track, following the eerie strains of young children singing (“Living in a world that’s filled with hate/ Living in a world we didn’t create”), the album concludes as it began, with a somber bell tolling, perhaps a reference to John Donne’s famous meditation: “never send to know for whom the bell tolls; It tolls for thee.”

Taken as a complete artistic statement, Rhythm Nation 1814 was a stunning achievement. It married the pleasures of pop with the street energy and edge of hip-hop. It was by turns dark and radiant, calculated and carefree, political and playful, sensual and austere, sermonic and liberating. If Control announced the arrival of a young woman ready to take the reins of her personal life and career, Rhythm Nation revealed a maturing artist, surveying the world around her, determined to wake people out of apathy, cynicism, indifference. Writes Slant’s Eric Henderson, “Rhythm Nation expanded Janet's range in every conceivable direction. She was more credibly feminine, more crucially masculine, more viably adult, more believably childlike. This was, of course, critical to a project in which Janet assumed the role of mouthpiece for a nationless, multicultural utopia.”

“We are a nation with no geographic boundaries,” declared Janet on the album’s introductory “pledge,” “pushing toward a world rid of color lines.” Just seven years earlier, black artists couldn’t get on MTV; FM radio was dominated by album-oriented (white) rock; and the music industry was largely segregated by genre. Now a black woman was at the helm of a new pop-cultural “nation,” preaching liberation through music and dance, while calling on her audience to keep up the struggle. For all the inroads, she insisted, the battle wasn’t over.

Janet Jackson’s ascendance was significant for many reasons, not the least of which was how it coincided with (and spoke to) the rise of black feminism. Until the 1980s, feminism was dominated, by and large, by middle class white women. They defined its terms, its causes, its hierarchies, its representations, and its icons. It wasn’t, of course, that black feminists didn’t exist before the 1980s. From Sojourner Truth to Harriet Tubman to Ida B. Wells to Rosa Parks to Maya Angelou—black women made enormous contributions in the struggle for racial, gender, and class equality. But their contributions were often minimized, and their struggles marginalized. As Barbara Smith writes in her landmark 1977 essay, “Toward a Black Feminist Criticism,” “Black women’s existence, experience, and culture and the brutally complex systems of oppression which shape these are in the ‘real world’ of white and/or male consciousness beneath consideration, invisible, unknown ... It seems overwhelming to break such a massive silence.”

Black feminism, however, did just that in the 1980s. From Michele Wallace’s bestselling Black Macho and the Myth of the Superwoman (described by Ms. magazine as “the book that will shape the 80s”), to Alice Walker’s Pulitzer Prize-winning The Color Purple (which was adapted into a blockbuster film, directed by Steven Spielberg), black women achieved unprecedented breakthroughs over the course of the decade. In 1981, bell hooks released Ain’t I A Woman; in 1984, Audre Lorde published Sister Outsider; 1987 saw the arrival of Toni Morrison’s Beloved, perhaps the most universally canonized novel of the past 30 years. Appropriately capping the decade was Patricia Hill Collins’s Black Feminist Thought (1990), which documented and synthesized the flourishing movement’s central ideas and concerns. The book, Collins wrote, was intended to be “both individual and collective, personal and political, one reflecting the intersection of my unique biography with the larger meaning of my historical times.”

* * *

If there was one female artist in the 1980s who captured this spirit in popular music it was Janet Jackson in Rhythm Nation. It was an album that positioned a multifaceted, dynamic black woman as a leader, as someone whose ideas, experiences and emotions mattered. It challenged some of the most deeply entrenched scripts for women in popular culture. It also offered an alternative to the era’s other most powerful female icon: Madonna.

While they were not-so-friendly rivals, in certain ways Janet and Madonna helped trailblaze similar terrain. Both were strong, intelligent, fiercely ambitious artists. Neither expressed any reticence about their desire for mass commercial success. Both were engaged in similar struggles for respect, empowerment and agency in an industry dominated by men and male expectations. Both also faced serious pushback from music critics. In the 1980s, music reviews were frequently filtered through a rock-centric (read: white, male, and heteronormative) lens. “Pop creations” like Janet and Madonna were viewed with suspicion, if not outright contempt. The fact that they didn’t conform to traditional singer-songwriter expectations proved they lacked talent. The fact that they had talented collaborators and producers proved they lacked credibility. The fact that dance and image were important parts of their artistic presentation proved they lacked authenticity. As The New York Times’ Jon Pareles wrote in a 1990 review of Janet’s Rhythm Nation Tour: “Miss Jackson seems content simply to flesh out an image whose every move and utterance are minutely planned. Spontaneity has been ruled out; spectacle reigns, and the concert is as much a dance workout as a showcase for songs.”

In spite of such headwinds, however, Janet and Madonna became two of the most influential icons of the late 20th century, each offering distinct versions of feminist liberation and empowerment to a generation of young people coming of age in the 1980s and 1990s. VH1 ranked them No. 1 and No. 2 in its “50 Greatest Women of the Video Era.” On Billboard’s 2013 list of Top Artists in Hot 100 History, Madonna was No. 2 and Janet was No. 7. Over the course of their respective careers, Madonna has 12 No. 1 hits; Janet has 10. Madonna has 38 Top Ten singles; Janet has 27 (placing them both among the top 10 artists of all time). Both, meanwhile, have sold hundreds of millions of albums and influenced American culture in incalculable ways.

Yet in spite of their similar commercial achievements and cultural impact, Janet Jackson remains, by comparison, grossly undervalued by critics and historians. Try to find a book on her career, cultural significance, or creative work, and with the exception of her 2011 autobiography, True You: A Journey To Finding and Loving Yourself, which focuses on her struggles with body image and self-esteem, you will come up empty-handed. Do the same with Madonna, and you will find at least 20 books by major publishers.

The disparities are not simply in the amount of coverage, but in how each artist is interpreted and understood. In print coverage, both in the 1980s and today, Madonna is made the default representative of feminism and of the era (in a 1990 editorial for the New York Times, cultural critic Camille Paglia famously declared her “the future of feminism”). Madonna was perceived as somehow more important and interesting, more clever and cerebral. Her sense of irony and play with sexuality made her more appealing to postmodernists than Janet’s socially conscious sincerity. In 1989, Madonna was named “Artist of the Decade” by Billboard and MTV. Since that time, the appreciation gap has only widened.

In 2008, Madonna was inducted into the Rock and Roll Hall of Fame. In spite of her trailblazing career, Janet has yet to receive the same honor. She has been eligible for six years. Many believe she is still being punished for the 2004 Super Bowl controversy often referred to as “Nipplegate,” the response to which has been described as "one of the worst cases of mass hysteria in America since the Salem witch trials."​ It is hard to believe, given the controversies surrounding just about every artist inducted into the Hall of Fame, that this would be used as a legitimate rationale for her exclusion. But then again, it’s hard to imagine how an artist of Janet’s stature has yet to be nominated.

Long before Beyoncé, Janet carved out a space for the openly feminist, multidimensional pop star. She created a blueprint that hundreds of thousands of artists have followed, from Britney Spears to Ciara to Lady Gaga. Rhythm Nation 1814 was the album that revolutionized her career and the pop landscape. It demonstrated that black women needn’t be second to anyone. But it wasn’t individualistic. Its rallying call was about the collective we. We could be a part of the creative utopia—the rhythm nation—regardless of race, gender, class, sexuality or difference. It made you want to dance and change the world at the same time. Unrealistic, perhaps. But 25 years later, it’s still hard to listen and not want to join the movement.

This article was originally published at http://www.theatlantic.com/entertainment/archive/2014/09/the-world-changing-aspirations-of-rhythm-nation-1814/380144/


Author: "Joseph Vogel" Tags: "Entertainment"
Date: Monday, 15 Sep 2014 14:57

Jimmie Luthuli, 34, has worked as a waitress all over Washington. She's fetched drinks on U Street, wiped down tables on Barracks Row, and taken orders on K Street. The scenery changed but her pay didn't: $2.77 an hour before tips.

Those meager wages were typically swallowed up by taxes, she said, leaving her to subsist on an unsteady income of tips. "Sometimes I would only get one table the whole five-hour lunch shift and then that table would leave me $5, and that would basically be my pay for the day," Luthuli told me over her lunch break on Wednesday. "I was coming home with only $80 or $100 for a whole week of work."

In 2009, the federal government raised the minimum wage to $7.25 per hour, but under lobbying pressure from the restaurant industry, it did nothing for the many thousands of people who work for gratuities. Under a lesser-known subsection of the minimum-wage law known as the "tipped minimum wage," service workers may be paid as little as $2.13 an hour. Another lesser-known fact: That wage has been stagnant for the past 23 years.

While some states and D.C. have made modest increases to the tipped minimum wage on their own (modest in the District's case being 64 cents), and eight states have no tipped minimum wage at all, about half of all states continue to pay workers just $2.13 an hour, provided employers make up the difference if a server fails to achieve the standard minimum wage after tips. It's a particularly troubling statistic for women, who make up two-thirds of tipped workers nationally and are regularly (some would say systematically) paid less than men.

A new analysis by the National Women's Law Center found that in the eight "equal treatment" states where tipped workers are entitled to be paid the full federal minimum wage by their employers ($7.25), the wage gap between men and women is smaller, and poverty rates among tipped workers (particularly women) are lower than in states where tipped workers make a base wage of just $2.13 an hour. More specifically, the average wage gap is 17 percent smaller for women overall working full time; the average poverty rate for female tipped workers is 33 percent lower.

The analysis, which drew data from the Census Bureau's recent American Community Surveys, also emphasized that the situation is especially bad for people of color. In states with a tipped minimum wage of $2.13 per hour, African-American women working full time, year round, are paid on average just 60 cents for every dollar earned by their white male counterparts (a wage gap of 40 cents). In "equal treatment states," the wage gap drops to 33 cents. Hispanic women fare worst of all, earning just 51 cents on the white male dollar in the former (wage gap: 49 cents) and 53 cents on the white male dollar in the latter (wage gap: 47 cents).

"What was striking to me was the variety of ways in which the inequality that's associated with the existence of this large tip credit manifests itself," said Katherine Gallagher Robbins, a senior policy analyst at NWLC and coauthor of the report. "We see repeatedly, no matter how you slice the data, workers in these states are doing worse." Her coauthor, NWLC's Julie Vogtman, said she was struck by the extent to which the effects of a low cash wage for tipped workers "radiated out" to affect the wider wage gap in these states.

Industry groups have argued that the $2.13 base wage is a misnomer. Every tipped worker is guaranteed at least the minimum wage of their state, after all. And it's the employer's legal responsibility to make up the difference between the tipped minimum cash wage and the regular minimum wage, if a server's tips fall short. The trouble is, whether employers are deliberately dodging this requirement or are simply held back by logistics, many workers never see that money at all.

A recently released report from the Economic Policy Institute underscores the problem. In the most recent compliance sweep of full-service restaurants by the U.S. Labor Department's Wage and Hour Division, 83.8 percent of investigated restaurants had some type of violation. "In total, WHD recovered $56.8 million in back wages for nearly 82,000 workers and assessed $2.5 million in civil money penalties," the Labor Department reported. "Violations included 1,170 tip credit infractions that resulted in nearly $5.5 million in back wages."

The latest NWLC analysis comes as the Senate eyes a new vote on a minimum-wage bill that would increase the regular federal minimum wage from $7.25 to $10.10. The bill would also reestablish the link between the tipped and the regular minimum wages, raising the former to 70 percent of the latter within six years. Senate Majority Leader Harry Reid's office has been coy on timing, but people well sourced in the matter say the proposal could come up for a vote as soon as next week.

The minimum-wage bill isn't everything advocates are hoping for, but it would certainly improve the situation for tipped workers, adding greater stability to their income and boosting their total pay. In a conversation Wednesday, NWLC's Vogtman called it a "very important step in the right direction."

Luthuli, for her part, isn't waiting around to see how it goes. She's channeled her frustration by volunteering with the Restaurant Opportunities Center, an advocacy group that, according to organizers, is working on legislation to eliminate the tipped minimum wage in eight states. She's also working odd jobs to pay the bills, including some freelance consulting and four shifts per week as a food runner at a restaurant on U Street.

"I decided it's better to be a food runner than a server," she said. "It only pays around $6 an hour, but I like it because at least I know my hourly wages. For me it comes out to more money."

This article was originally published at http://www.nationaljournal.com/next-america/economic-empowerment/why-you-should-always-tip-your-waitress-20140904


Author: "Lucia Graves" Tags: "Business"