But their new movie ranks with their very best (Fargo, No Country for Old Men, A Serious Man) in its nearly pitch-perfect balance of biting satirical humour and deep reserves of feeling. The film's protagonist (played by singer-actor Oscar Isaac in a star-making, award-worthy turn) embodies the tricky duality of cruelty and tenderness that makes Inside Llewyn Davis such a treat. Navigating his mess of an offstage life—couch-hopping, mooching, and wrangling with his scam artist manager and another folk singer who may or may not be carrying his child (Carey Mulligan, radiating fury tinged with longing in a marvellously vivid comic performance)—Llewyn Davis is a schlumpy, scowling grump. But when he performs (glorious folk tunes arranged by T-Bone Burnett and sung live on set), revealing a honeyed, slightly raspy voice, his face mellows, his eyes close, and he seems to be opening his soul to the world. This jerk's music is his redeeming feature.
Indeed, the folk songs in Inside Llewyn Davis (even the ones the filmmakers mock affectionately), with their yearning lyrics and melancholy melodies, don't just offset the dry, Coen-esque wit of the screenplay; they give the movie a rich emotional subtext, allowing the characters' often laugh-out-loud gripes, swipes, one-liners, and accusations to echo with hints of regret and desire.
As Llewyn Davis fights to kick-start his floundering career, scrape together money without sacrificing his ambitions, get along with the people in his private and professional lives, and keep track of a cat he inconveniently finds himself caring for, we come to understand that the movie is a comedy about loss and letting go: of an artistic partner, a love interest, a childhood dream.
As usual, the Coens find a fitting visual match for their themes: cinematographer Bruno Delbonnel uses wintry whites, greys and browns, and repeated shots of hallways, long subway cars, and empty stretches of highway suggest the title character's existential anxiety.
Such images feel just right for a film that's not just a meditation on art, failure, responsibility, and self-acceptance, but also a portrait of a specific American cultural moment—at the start of the Vietnam War—when young people were facing stark choices of identity and values.
Along the way, Inside Llewyn Davis features one of the most sublime sequences the Coens have ever shot, involving a snowy highway in the middle of the night, that pesky cat Llewyn looks after, and a snippet of opera playing on the car radio.
The film also flaunts a uniformly superb ensemble, with Justin Timberlake, John Goodman, Garrett Hedlund, Jeanine Serralles, Max Casella, Stark Sands, F. Murray Abraham, Robin Bartlett, and Adam Driver (of HBO's Girls) delivering jewel-like supporting performances.
We haven't even reached the festival's halfway point yet, so it's far too early to start talking about the Palme d'Or. But Inside Llewyn Davis is the best thing I've seen at Cannes so far this year.
A version of this post appears on France 24, an Atlantic partner site.
Silverstein, the pilot in command, raised objections and was given three options: wait inside the FBO [the "Fixed Base Operator," the little office that exists at most small airports] or wait quietly outside, or be detained in handcuffs. An instrument-rated private pilot and AOPA member, Silverstein is also an active real estate investment banker who has never committed a crime, he said.
When last we left our hero—The Doctor of House Gallifrey, Eleventh of His Name, Predator of the Daleks, Scourge of the Cybermen, Lord of Time—he had just emerged from a nice, long mope. His previous traveling companions, Amy Pond and Rory Williams, had been zapped back in time to live out their lives without him. So he parked his time machine on top of a cloud in 1890s London so he could wear a crumpled hat and have a sad.
Then he met Clara Oswald, a woman (or as the show repeatedly insists, though the actress who plays her is now 27 years old, a "girl") leading a double life as a barmaid and a governess. She was brave, beautiful, smart, fast-talking, plus she snogged him, so naturally he invited her to travel with him—and then she got killed by an ice monster. Galvanized into action, the Doctor defeated the alien threat to London—and then discovered that he'd actually met the same woman once before, in the future, where she'd also died.
The Doctor's quest to unravel the mystery of who Clara really is and how the same person keeps popping up at different points in his life is the ostensible plot arc for the second half of the seventh season of the revived Doctor Who. Only it isn't an arc; that term implies movement, and showrunner Steven Moffat's structuring of the half-season gives viewers precious little of that, outside of camera movement and, you know, running.
The Doctor mostly just muses to himself about the mystery as he meets, saves, befriends and begins traveling with a 2013 version of Clara. The one time he actually asks her about it, it's in the most aggressive way possible, scaring the crap out of Clara towards the end of "Journey to the Centre of the TARDIS"—but then there's a reset button, so she forgets. The explanation has to wait until the finale, which aired Saturday night, and it's not particularly satisfying.
The mystery ultimately wasn't really a mystery: It lacked any sense of tension or progress. Unfortunately, the same can be said of the lead characters. Both the Doctor and Clara remained completely static throughout their adventures, each holding the other at arm's length. Matt Smith as the Doctor and Jenna-Louise Coleman as Clara turned in consistently strong performances, but the two characters' reluctance to engage fully with each other also kept viewers at an emotional remove from both of them.
That's not to say that Series 7B didn't have its moments. Smith's monologue in "The Rings of Akhaten" to a hungry, star-sized alien was delivered as powerfully as his other famous speech to a sky full of badness in "The Pandorica Opens," and with far more emotional weight. The herky-jerky glimpses of the Crooked Man in "Hide" made that rubber monster far scarier than it had any business being. And, as is usual with Who these days, great performances abounded, including from such distinguished guest stars as David Warner, Warwick Davis, Liam Cunningham, and Dame Diana Rigg.
But none of this added up to even one standout, all-time-classic episode. Unfortunately, Moffat's tendency to tell rather than show continued, as did his penchant for repurposing his own good ideas. ("I don't know where I am" is the new "Hey, who turned out the lights?") He and his writing team also evinced an increasing tendency to borrow from other iconic sci-fi/fantasy tropes. The format of Who has always allowed it to dabble in pretty much any genre, but this season, several moments stood out as cut-rate knock-offs.
Smith's back-and-forth dialogue with himself in "Nightmare in Silver"? Andy Serkis did it better in The Two Towers. The death of Victorian crime-fighter Jenny Flint in "The Name of the Doctor" while she was on the psychic conference call? Beautifully shot, wonderful acting job by Catrin Stewart—but not as effective as the unplugging sequence from The Matrix. The Whispermen? Cool looking, but derivative of the much scarier Gentlemen from the Buffy the Vampire Slayer episode "Hush." (The Whispermen kill you by stopping your heart. The Gentlemen steal your voice—and then cut out your heart. And both were heralded by a creepy nursery rhyme.)
In the finale, as per usual, Moffat struggles to confine himself to exploring a single Big Idea: He just keeps throwing them in there, one after the other, to see what sticks. The conference call/séance was a thoughtful nod to Victorian-era spiritualism. The idea that the one place a time traveler must never go is his or her own grave is solid. The true corpse of a Time Lord being not a physical cadaver, but a great ball of wibbly-wobbly timey-wimey stuff, is also neat.
But... the fact that anyone can walk into this glowing timeline and mess around with it, after that Time Lord is already dead? Seems like a bug, not a feature. By jumping into the dead Doctor's timeline, the villainous Great Intelligence tried to kill all of the Doctor's selves at once, while Clara, who jumped in after him, worked against him to save the Doctor. Which is how we got the montage of Clara in different costumes stalking footage of earlier Doctors. It looks like Clara's time-fragmentation will be the conceit that will allow her to interact with David Tennant and Billie Piper—the Tenth Doctor and his longest-serving companion, Rose Tyler—both of whom have been cast in the Doctor Who 50th anniversary special, set to air in November. We also met another announced cast member of that special: John Hurt as the Doctor's lost regeneration (OMG Gertrude Stein was a Time Lord!), most likely one who came between the Eighth and Ninth and, it is implied, committed the double genocide that ended the Last Great Time War.
So, the solution to Clara's mystery is the same as the solution to River Song's: This woman's life, from birth to death, is inextricably bound to the Doctor's. And as with River, since we'd already seen the end of her story, there was no tension when she had an opportunity to make a sacrifice for her man. Ah, there's a natural segue into a rundown of some of this half-season's nuggets of sexist bullshit, the kind that Moffat and his crew—which has included exactly zero women writers—seem to think is so droll...
In "Journey to the Centre of the TARDIS," the Doctor only lets Clara fly the TARDIS in "basic mode." When she asks if this is because she's a girl, he says no, and then makes a face.
At the end of "Nightmare in Silver," the Doctor refers to Clara as "a mystery wrapped in an enigma squeezed into a skirt that's just a little too... tight." And then he starts to smile—leer, really—and makes an "uhh" noise in the back of his throat before he checks himself and looks appalled. Too late, dude.
In "The Crimson Horror," Jenny rescues the Doctor from partial paralysis. In his elation, he grabs her, dips her, and forcibly kisses her on the lips. A woman he knows to be both queer and married. (This is the same guy who twice asked Rory for "permission" to hug Amy.)
Many reviewers didn't seem to think the last one was a big deal, reading Jenny's immediate slap of the Doctor as enough of a rebuke. Others were uncomfortable with it, and at least some were mad enough to call this nonconsensual kiss what it would be in the real world: sexual assault. As far as I'm concerned, both the kiss and the slap were played for laughs, letting the Doctor off the hook far too easily.
As a wise man once said in a very complex song-and-dance number involving puppets, Doctor Who is about the triumph of intellect and romance over brute force and cynicism. Kissing people when they don't want to be kissed is nothing if not the triumph of brute force, and it sure ain't romantic.
This Doctor is always telling us what's cool. You know what's cooler than all the bowties, fezzes, and Stetsons in the universe? Consent.
The season finale marked the last regular SNL appearance of Seth Meyers (slated to succeed Jimmy Fallon as host of NBC's Late Night), Fred Armisen, and Bill Hader. (Jason Sudeikis's return remains uncertain.) The show sent them off with a mostly strong episode and some fitting farewell moments. Host Ben Affleck was joined by his wife, Jennifer Garner, during the monologue. Amy Poehler joined Seth Meyers for Weekend Update. Musical guest Kanye West performed "Black Skinhead" and "New Slaves."
Seth Meyers beats out Anderson Cooper for Stefon's hand in marriage.
Amy Poehler joins Seth Meyers for one last edition of Really!?! with Seth and Amy.
Ex-porn stars Vanessa Bayer and Cecily Strong are back, pitching "Herman's" Handbags, with help from fellow porn veteran Girth Brooks (Ben Affleck). "Perfect for occasions like: everyday, business lunch, carrying, cesarians, and eating breakfast off of Tiffany..."
"Don't be overwhelmed by the most flawlessly executed wedding you've ever seen..." Xanax for Gay Summer Weddings.
Fred Armisen, as punk rocker Ian Rubbish, says goodbye—singing "It's Alright, It's Been a Lovely Life," accompanied by Bill Hader, Jason Sudeikis, Taran Killam, his Portlandia costar Carrie Brownstein, and musicians Steve Jones, J Mascis, Aimee Mann, Kim Gordon, and Michael Penn.
Also: Iranian president Mahmoud Ahmadinejad (Fred Armisen), irked by Argo's portrayal of Iran, makes his own movie about the making of Argo, starring himself as Ben Affleck; and cold open—a confused Al Sharpton (Kenan Thompson) addresses the IRS scandal.
The three Obama "scandals" have varying characteristics and varying levels of legitimacy, but all three share a meta-story. And I think I know whereof I speak as a former GOP staffer. Beginning with the dethronement of Jim Wright and the House banking scandal, and achieving escape velocity in the mid-1990s with Matt Drudge becoming the virtual assignment editor of the mainstream press during the Clinton impeachment, the Washington press corps has become increasingly "wired" to accept the Republican view of what constitutes a scandal. The public has been ignoring Benghazi for 8 months; as for the Washington press, we saw how Jonathan Karl got played by Republican staffers' misrepresentations of the administration's e-mails.
Let's think about the modern history of "the scandal," and how such episodes emerge. The modern saga all starts with Nixon. Obviously there have been scandals throughout political history, and in the immediate pre-Nixon era you had Bobby Baker, Billy Sol Estes and Walter Jenkins with LBJ; Sherman Adams under Eisenhower; and such different political dramas as the Army-McCarthy hearings in the early Eisenhower era and the Bay of Pigs aftermath under JFK. But Nixon marked the beginning of the modern scandal era. That is because the phenomenon of televised, high-stakes public inquiries (as with the Watergate hearings and impeachment preparations) really dates to then, as does modern press-consciousness of how coverage of a big, exciting scandal looks and feels.
I. The main stops along the way:

- Nixon: Watergate -> hearings -> dramatic revelations -> Supreme Court hearing -> impeachment -> resignation. Also under Nixon: Spiro Agnew taking a brown paper bag full of bribery money, in his Vice Presidential office, and having to resign, something barely remembered now. Nixon era: Teddy Kennedy and Chappaquiddick.
- Ford: None really.
- Carter: No long-running ones, despite flaps about his budget director and his chief of staff. But his era marks a major change, since round-the-clock news coverage was just getting going then, and Ted Koppel's Nightline, originally known as America Held Hostage, pioneered what we now think of as scandal-style coverage, of the American captives in Teheran.
- Reagan: The Iran-contra mess, complete with Fawn Hall and Oliver North.
- GHW Bush: None, really, though the Clarence Thomas nomination got scandal-style coverage because of the charges against him and the dramatic hearings.
- Clinton: First the phony scandals of Whitewater and Vince Foster. Then the real problems via Monica Lewinsky. Clinton era notable for the creation/revelation of something like a permanent-scandal mentality in politics and the press.
- GW Bush: Few scandals in the technical sense. But the election, recount, and judicial overreach known as Bush v. Gore got scandal-style coverage. Then Abu Ghraib, waterboarding and torture, and the war as a whole.
- Obama: In his first term, the phony scandals of birtherism and Shirley Sherrod. Now the three-"scandal" combo made of elements that have nothing whatsoever to do with one another and don't necessarily have anything to do with Obama himself but that nonetheless satisfy that phantom-limb craving for a good exciting scandal.

II. What a Scandal Takes, to Take Off

1. An underlying offense people can understand. Clinton and Monica meet this test. Also Nixon ordering wiretaps, or Agnew taking a bag of money. Iran-Contra was always sort of a struggle on this front -- for people to grasp what exactly the offense was. Today's IRS/Tea Party accusation meets the test (despite complexity of the underlying reality); Benghazi, less so.
2. Evidence of president's personal involvement. The Watergate tapes again lead the way here -- Nixon's own voice, cursing and swearing. Monica and Clinton -- whew. Obama "scandals" lacking here.
3. A formal hearing/investigative structure that guarantees an ongoing daily drip-drip-drip coverage. When there is a schedule of witnesses for a hearing, an upcoming set of votes, or a sequence of new revelations, then the story can keep going for weeks, months, even years. Darrell Issa, listen up!
4. A press culture and DC culture that is now wired to swing into "scandal mode," and start writing stories and giving commentary reflecting that "narrative."
5. A structure of news coverage that keeps the scandal narrative going. This was probably at its strongest in the era of the weekly news magazines (Time, Newsweek, etc.). Then you would have: daily coverage in the papers; nightly coverage on TV news; weekly advancing of the narrative by news mags (and Sunday talk shows); analysis of "Administration in crisis" and "President under pressure"; and it would all keep going. Now, in a sense, the hourly / minute-by-minute cycle can make scandals "burn out" too fast.
There's a country-wide intranet, and the content consists of state news and message boards. A custom-built operating system, Red Star, includes a mandatory readme file about "how important it is that the operating system correlates with the country's values." Whenever leader Kim Jong Un is mentioned, his name is displayed slightly bigger than the text around it.
And the weirdest part about it? It doesn't have to be that way.
In a recent conversation with The Atlantic's Steve Clemons, Google chairman Eric Schmidt and Google Ideas director Jared Cohen -- who co-wrote the new book The New Digital Age: Reshaping the Future of People, Nations, and Business -- mentioned that North Korea actually has the capacity for a full-fledged Internet network. It just chooses not to have one.
The mobile networks are there, Schmidt said, the country's leadership just hasn't turned on the data.
"It's just arrogance, stupidity and bad decisions that prevents this," Schmidt said. "There is literally one command to turn on the Internet."
North Korea has too many horrors to describe -- gulags, starvation, forced reverence of a demented leader -- but Internet access would speed the end of the regime. The utter lack of knowledge about the outside world there keeps a lid on dissent and domestic strife. North Koreans for the most part live in a total information dead zone, told they are lucky even as they endure food shortages and freezing winters without heating. The fact that it could all be undone with the flip of a switch is both maddening and heartbreaking.
Schmidt visited Pyongyang in January in part to try to push the leadership on the Internet issue, he later told reporters:
North Korea "is the last really closed country in the world," he said. "This is a country that has suffered from lack of information. The Internet was built for everyone, including North Koreans. The quickest way to get economic growth in North Korea is to open up the Internet. I did my best to tell them this."
When foreigners visit, Schmidt wrote earlier this year, "the government stages Internet browsing sessions by having 'students' look at pre-downloaded and preapproved content, spending hours (as they did when we were there) scrolling up and down their screens in totalitarian unison."
At the Atlantic event, Schmidt pointed out that at least now, smuggled mobile phones and DVDs from South Korea are proliferating, and where possessing these illicit materials might have carried the death penalty years ago, the government now appears to be letting such minor transgressions slide. Still, though, for a major policy decision like turning on the Internet, "the Kid has to decide," Schmidt said.
It's easy to assume that a global Internet, with all its promise of scaled communication and education and democratization, will eventually help to foster democracy. But it's also not entirely accurate to assume that. In a conversation with The Atlantic's Steve Clemons yesterday evening, Eric Schmidt and Jared Cohen -- co-Googlers and co-authors of The New Digital Age: Reshaping the Future of People, Nations, and Business -- made a point of emphasizing the limitations of technological innovation. Particularly when it comes to geopolitical change.
"We're very concerned about the balkanization of the Internet," Schmidt said -- and not just because division itself in so many ways runs contrary to the ideals the Internet, and the web, were founded on. Splintering, especially based on geopolitical divisions, could also have direct political and physical consequences. If you're an autocratic government that feels threatened by the existence of an open Internet, Schmidt noted, you're going to resist that Internet -- in the way that, say, Iran has resisted it. Last month, he pointed out, Iran announced plans for a state-run digital map that would function as an "Islamic Google Earth." ("I'm not making this up," Schmidt insisted, as the crowd laughed at the sad absurdity. "This is actually what they announced.")
Iran's willful exclusion of its citizens from the Internet much of the rest of the world knows -- its attempt to take the "worldwide" aspect out of the World Wide Web -- may well indicate weakness in the regime itself. ("If this is what passes for leadership," Schmidt put it, "these guys have a problem.") If so, though, it's a weakness belied by technological capabilities: the Iranian regime, at this point, does have the infrastructural power to filter the Internet. It can censor its citizens with increasingly surgical precision. It can create its own version of YouTube. It can bypass Google and Facebook and other American companies, shaping the experience of the Internet for all but the most technologically savvy of its citizens. "Islamic Google Earth" may be a joke; it may also be, however, a harbinger of what's to come.
There's another concern, too, Cohen pointed out. Internet balkanization could enable autocratic states to "band together" to create cyber-alliances that can collaboratively edit the web in order to enforce shared norms. The collectives might edit comments made about their regimes. They might cooperate to monitor user behavior. They might attempt to enforce their notion of "moral behavior" through censorship and other means.
And they might ultimately engage, Cohen continued, in a kind of "digital ethnic cleansing." Traditional legal and political checks on mass criminality have been developed within and for the physical world, he noted; in the digital, however, those checks are less developed. The web is simply too new. And you could imagine autocratic regimes or other communities taking advantage of that, creating a scenario in which one group finds a way to, for example, filter another group's content from the web. Or to shut down -- or severely slow down -- their Internet access. Or to infiltrate them with malware and/or orchestrate elaborate denial-of-service (DDoS) attacks. One group, in other words, could essentially annihilate the digital existence of another.
And then the perpetrator could simply, Cohen said, "make the whole thing look like technical difficulties." Making punishment or retribution, per physical-world systems, extremely difficult.
What all this comes down to, Clemons suggested, is something that Schmidt and Cohen emphasize in their book: the new need for a "virtual foreign policy." While the physical world and the digital obviously coexist, Schmidt pointed out, we have much more finely established ways to regulate behavior in the physical world. When people in the virtual community begin to misbehave, committing crimes that wouldn't be legal in the physical space, we currently have very few mechanisms for correction. As that reality plays out on the geopolitical stage, he said, you could have "this bizarre situation" in which, say, the U.S. and China have a generally good relationship in the physical world: cash flow, open communications, travel between the two countries, etc. And yet behind the scenes -- in the digital world -- those countries could be, effectively, waging war on each other through their digital infrastructure.
Without a more strategic merging between the physical world and the digital, Schmidt suggested, that scenario could well become reality. For now, Cohen said, the big question looming before us is this: "At what point is a cyber attack so big that its effects spill over into the physical world?" We don't yet know. But there's a good chance we'll soon find out.
Since 2005, a team at NASA has been monitoring meteoroid explosions on the surface of the moon. In March, they announced yesterday, they observed an explosion so bright it would have been visible from Earth without a telescope. "For about one second," NASA said in a statement, "the impact site was glowing like a 4th magnitude star." It was nearly ten times as bright as any other previously recorded impact.
Unlike our safely cocooned planet, the moon has no atmosphere to protect it, and meteoroids do not burn up as they approach the surface. Over the past eight years, NASA scientists have observed more than 300 impacts, some of which are mapped below, with the March strike denoted by the red square.
What does it mean for something to "explode" in an environment with no oxygen? What are you actually seeing in the video above? NASA explains:
Lunar meteors don't require oxygen or combustion to make themselves visible. They hit the ground with so much kinetic energy that even a pebble can make a crater several feet wide. The flash of light comes not from combustion but rather from the thermal glow of molten rock and hot vapors at the impact site.
In the case of this major explosion -- which occurred on March 17th, during a period of increased meteor activity in Earth's atmosphere as well -- the rock struck the lunar surface at a speed of 56,000 miles per hour. The next time NASA's Lunar Reconnaissance Orbiter is in the vicinity, it will have its sights set on the Mare Imbrium region where the explosion occurred, searching for a new crater that could be as many as 20 meters wide.
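To get a feel for why a relatively small rock can produce a flash visible from Earth, it helps to run the kinetic-energy arithmetic on the 56,000 mph figure from the article. The mass used below (40 kg, about boulder-sized) is a hypothetical illustration, not a number from this piece:

```python
# Back-of-the-envelope kinetic energy for the March 17 lunar impactor.
# Speed (56,000 mph) comes from the article; the 40 kg mass is an
# assumed, illustrative figure only.

MPH_TO_MS = 0.44704  # exact miles-per-hour to meters-per-second factor

def impact_energy_joules(mass_kg: float, speed_mph: float) -> float:
    """Kinetic energy E = 1/2 * m * v^2, with speed converted to m/s."""
    v = speed_mph * MPH_TO_MS
    return 0.5 * mass_kg * v ** 2

energy = impact_energy_joules(40, 56_000)
tons_tnt = energy / 4.184e9  # 1 ton of TNT is defined as 4.184e9 J
print(f"{energy:.2e} J, roughly {tons_tnt:.1f} tons of TNT")
```

Even at a modest mass, the v-squared term dominates: at lunar-impact speeds a boulder carries energy comparable to several tons of TNT, all released as heat and molten rock on contact.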
Fort Irwin is a U.S. Army base nearly the size of Rhode Island, located in the Mojave Desert about an hour's drive northeast of Barstow, California. There you will find the National Training Center, or NTC, at which all U.S. troops, from all the services, spend a twenty-one-day rotation before they deploy overseas.
Sprawling and often infernally hot in the summer months, the base offers free tours, open to the public, twice a month. We made the trip, cameras in hand and notebooks at the ready, to learn more about the simulated battlefields in which imaginary conflicts loop, day after day, without end.
Coincidentally, as we explored the Painted Rocks located just outside the gate while waiting for the tour to start, an old acquaintance from Los Angeles -- architect and geographer Rick Miller -- pulled up in his Prius, also early for the same tour.
We laughed, said hello, and caught up about a class Rick had been teaching at UCLA about the military defense of L.A. from World War II to the present. An artificial battlefield, beyond even the furthest fringes of Los Angeles, Fort Irwin thus seemed like an appropriate place to meet.
We were soon joined by a small group of other visitors -- consisting, for the most part, of family members of soldiers deployed on the base, as well as two architecture students from Montréal -- before a large white tour bus rolled up across the gravel.
Renita, a former combat videographer who now handles public affairs at Fort Irwin, took our names, IDs, and signatures for reasons of liability (we would be seeing live explosions and simulated gunfire, and there was always the risk that someone might get hurt).
The day began with a glimpse into the economics and culture of how a nation prepares its soldiers for war; an orientation, of sorts, before we headed out to visit one of fifteen artificial cities scattered throughout the base.
In the plush lecture hall used for "After Action Reviews" -- and thus, Renita apologized, air-conditioned to a morgue-like chill in order to keep soldiers awake as their adrenalin levels crash -- we received a briefing from the base's commander, Brigadier General Terry Ferrell.
With pride, Ferrell noted that Fort Irwin is the only place where the U.S. military can train using all of the systems it will later use in theater. The base's 1,000 square miles of desert is large enough to allow what Ferrell called "great maneuverability"; its airspace is restricted; and its truly remote location ensures an uncluttered electromagnetic spectrum, meaning that troops can practice both collection and jamming. These latter techniques even include interfering with GPS, provided they warn the Federal Aviation Administration in advance.
Oddly, it's worth noting that Fort Irwin also houses the electromagnetically sensitive Goldstone Deep Space Communications Complex, part of NASA's global Deep Space Network. As science writer Oliver Morton explains in a paper called "Moonshine and Glue: A Thirteen-Unit Guide to the Extreme Edge of Astrophysics" (PDF), "when digitized battalions slug it out with all the tools of modern warfare, radio, radar, and electronic warfare emissions fly as freely around Fort Irwin as bullets in a battle. For people listening to signals from distant spacecraft on pre-arranged frequency bands, this noise is not too much of a problem." However, he adds, for other, far more sensitive experiments, "radio interference from the military next door is its biggest headache."
Unusually for the American West, where mineral rights are often transferred separately, the military also owns the ground beneath Fort Irwin, which means that they have carved out an extensive network of tunnels and caves from which to flush pretend insurgents.
This 120-person-strong insurgent troop is drawn from the base's own Blackhorse Regiment, a unit of the U.S. Army that exists solely to provide opposition. Whatever the war, the 11th Armored is always the pretend enemy. According to Ferrell, their current role as Afghan rebels is widely envied: They receive specialized training (for example, in building IEDs) and are held to "reduced grooming standards," while their mission is simply to "stay alive and wreak havoc."
If they die during a NTC simulation, they have to shave and go back on detail on the base, Ferrell added, so the incentive to evade their American opponents is strong.
In addition to the in-house enemy regiment, there is an entire 2,200-person logistics corps dedicated to rotating units in and out of Fort Irwin and equipping them for training. Every type of ordnance the United States military has, with the exception of biological and chemical weapons, is used during NTC simulations, Ferrell told us. What's more, in the interests of realism (and expense be damned), troops train using their own equipment, which means that bringing in, for example, the 10th Mountain Division (on rotation during our visit), also means transporting their tanks and helicopters from their home base at Fort Drum, New York, to California, and back again.
Units are deployed to Fort Irwin for twenty-one days, fourteen of which are spent in what Fort Irwin refers to as "The Box" (as in "sandbox"). This is the vast desert training area that includes fifteen simulated towns and the previously mentioned tunnels and caves, as well as expansive gunnery ranges and tank battle arenas.
Following our briefing, we headed out to the largest mock village in the complex, the Afghan town of Ertebat Shar, originally known, during its Iraqi incarnation, as Medina Wasl. Before we re-boarded the bus, Renita issued a stern warning: "'Afghanistan' is not modernized with plumbing. There are Porta-Johns, but I wanted to let you know the situation before we roll out there."
A twenty-minute drive later, through relatively featureless desert, our visit to "Afghanistan" began with a casual walk down the main street, where we were greeted by actors trying to sell us plastic loaves of bread and piles of fake meat. Fort Irwin employs more than 350 civilian role-players, many of whom are of Middle Eastern origin, although Ferrell explained that they are still trying to recruit more Afghans, in order "to provide the texture of the culture."
The atmosphere was strangely good-natured, an effect at least partially amplified by a feeling of mild embarrassment, as the rules of engagement, so to speak, are not immediately clear; you, the visitor, are obviously aware that these people are paid actors, but it feels distinctly odd to slip into character yourself and pretend that you might want to buy some bread.
In fact, it's impossible not to wonder how peculiar it must be for a refugee, or even a second-generation immigrant, from Iraq or Afghanistan, to pretend to be a baker in a simulated "native" village on a military base in the California desert, only to see tourists in shorts and sunglasses walking through, smiling uncomfortably and taking photos with their phones before strolling away without saying anything.
Even more peculiarly, as we reached the end of the street, the market -- and all the actors in it -- vanished behind us, dispersing back into the fake city, as if only called into being by our presence.
By now, with the opening act over, we were stopped in front of the town's "Lyndon Marcus International Hotel" to take stock of our surroundings. In his earlier briefing, Ferrell had described the simulated villages' close attention to detail -- apparently, the footprint for the village came from actual satellite imagery of Baghdad, in order to accurately recreate street widths, and the step sizes inside buildings are Iraqi, rather than U.S., standard.
Dimensions notwithstanding, however, this is a city of cargo containers, their Orientalized facades slapped up and plastered on like make-up. Seen from above, the wooden frames of the illusion become visible and it becomes more and more clear that you are on a film set, an immersive theater of war.
This kind of test village has a long history in U.S. war planning. As journalist Tom Vanderbilt writes in his book Survival City, "In March 1943, with bombing attacks on cities being intensified by all sides, the U.S. Army Corps of Engineers began construction at Dugway [Utah] on a series of 'enemy villages,' detailed reproductions of the typical housing found in the industrial districts of cities in Germany and Japan."
The point of the villages at Dugway, however, was not to train soldiers in urban warfare -- with, for instance, simulated street battles or house-to-house clearances -- but simply to test the burn capacity of the structures themselves. What sorts of explosives should the U.S. use? How much damage would result? The attention to architectural detail was simply a subset of this larger, more violent inquiry. As Vanderbilt explains, bombs at Dugway "were tested as to their effectiveness against architecture: How well the bombs penetrated the roofs of buildings (without penetrating too far), where they lodged in the building, and the intensity of the resulting fire."
During the Cold War, combat moved away from urban settings, and Fort Irwin's desert sandbox became the stage for massive set-piece tank battles against the "Soviet" Blackhorse Cavalry. But, in 1993, following the embarrassment of the Black Hawk Down incident in Mogadishu, Fort Irwin hosted its first urban warfare, or MOUT (Military Operations on Urbanized Terrain), exercise. This response was part of a growing realization shared amongst the armed forces, national security experts, and military contractors that future wars would again take the city as their battlefield.
As Russell W. Glenn of the RAND Corporation puts it bluntly in his report Combat in Hell: A Consideration of Constrained Urban Warfare, "Armed forces are ever more likely to fight in cities as the world becomes increasingly urbanized."
Massed, professional, and essentially symmetrical armies no longer confront one another on the broad forests and plains of central Europe, the new tactical thinking goes; instead, undeclared combatants living beside -- sometimes even with -- families in stacked apartment blocks or tight-knit courtyards send out the occasional missile, bullet, or improvised explosive device from a logistically confusing tangle of streets, and "war" becomes the spatial process of determining how to respond.
At Fort Irwin, mock villages began to pop up in the desert. They started out as "sheds bought from Shed World," Ferrell told us, before being replaced by shipping containers, which, in turn, have been enhanced with stone siding, mosque domes, awnings, and street signs, and, in some cases, even with internal staircases and furniture, too. Indeed, Ertebat Shar/Medina Wasl began its simulated existence in 2007, with just thirteen buildings, but has since expanded to include more than two hundred structures.
The point of these architectural reproductions is no longer, as in the World War II test villages of Dugway, to find better or more efficient methods of architectural destruction; instead, these ersatz buildings and villages are used to equip troops to better navigate the complexity of urban structures -- both physical, and, perhaps most importantly, socio-cultural.
In other words, at the most basic level, soldiers will use Fort Irwin's facsimile villages to practice clearing structures and navigating unmapped, roofed alleyways through cities without clear satellite communications links. However, at least in the training activities accessible to public visitors, the architecture is primarily a stage set for the theater of human relations: a backdrop for meeting and befriending locals (again, paid actors), controlling crowds (actors), rescuing casualties (the eight amputees on Fort Irwin's roster are its most highly paid actors, we learned, in recompense for being literally dragged around during simulated combat operations), and, ultimately, locating and eliminating the bad guys (the Blackhorse regiment).
In the series of set-piece training exercises that take place within the village, the action is coordinated from above by a ring of walkie-talkie-connected scenographers, including an extensive internal media presence, who film all of the simulations for later replay in combat analysis. The sense of being on an elaborate, extremely detailed film set is here made explicit. In fact, visitors are openly encouraged to participate in this mediation of the events: We were repeatedly urged to take as many photographs as possible and to share the resulting images on Facebook, Twitter, and more.
Appropriately equipped with ear plugs and eye protection, we filed upstairs to a veranda overlooking one of the village's main throughways, where we joined the "Observer Coaches" and film crew, taking our positions for the afternoon's scripted exercise.
Loud explosions, smoke, and fairly grisly combat scenes ensued -- and thus, despite the simulated nature of the injuries, rendered with Hollywood-style prosthetics and fake blood, be warned that many of the forthcoming photos could still be quite upsetting for some viewers.
The afternoon's action began quietly enough, with an American soldier on patrol waving off a man trying to sell him a melon. Suddenly, a truck bomb detonated, smoke filled the air, and an injured woman began to wail, while a soldier slumped against a wall, applying a tourniquet to his own severed arm.
In the subsequent chaos, it was hard to tell who was doing what, and why: Gun trucks began rolling down the streets, dodging a live goat and letting off round after round as insurgents fired RPGs (mounted on invisible fishing line that blended in with the electrical wires above our heads) from upstairs windows; blood-covered casualties were loaded into an ambulance while soldiers went door-to-door with their weapons drawn; and, in the episode's climax, a suicide bomber blew himself up directly beneath us, showering our tour group with ashes.
Twenty minutes later, it was all over. The smoke died down; the actors reassembled, uninjured, to discuss what just occurred; and the sound of blank rounds being fired off behind the buildings at the end of the exercise echoed through the streets.
Incredibly, blank rounds assigned to a particular exercise must be used during that exercise and cannot be saved for another day; if you are curious as to where your tax dollars might be going, picture paid actors shooting entire magazines full of blank rounds out of machine guns behind simulated Middle Eastern buildings in the Mojave desert. Every single blank must be accounted for, leading to the peculiar sight of a village's worth of insurgents stooped, gathering used blank casings into their prop kettles, bread baskets, and plastic bags.
Finally, we descended back down onto the street, dazed, ears ringing, and a little shocked by all the explosions and gunfire. Stepping carefully around pools of fake blood and chunks of plastic viscera, we made our way back to the lobby of the International Hotel for cups of water and a debrief with soldiers involved in planning and implementing the simulation.
Our hosts there were an interesting mix of earnest young boys who looked like they had successful careers in politics ahead of them, standing beside older men, almost stereotypically hard-faced, who could probably scare an AK-47 into misfiring just by staring at it, and a few female soldiers.
Somewhat subdued at this point, our group sat on sofas that had seen better days and passed around an extraordinary collection of injury cards handed out to fallen soldiers and civilians. These detail the specific rules given for role-playing a suite of symptoms and behavior -- a kind of design fiction of military injury.
A few of us tried on the MILES (Multiple Integrated Laser Engagement System) harnesses that soldiers wear to sense hits from fired blanks, and then an enemy soldier demonstrated an exploding door sill.
While the film crew and Observer Coaches prepared for their "After Action Review," our guides were talkative but unwilling to discuss how well or badly the afternoon's session had gone. We asked, instead, about the future of Fort Irwin's villages, as the U.S. withdraws from Afghanistan. The vision is to expand the range of urban conditions into what Ferrell termed a "Decisive Action Training Environment," in which the U.S. military will continue to encounter "the world's worst actors" [sic]--"guerrillas, criminals, and insurgents"--amidst the furniture of city life.
As he escorted us back down the market street to our bus, one soldier off-handedly remarked that he'd heard the village might be redesigned soon as a Spanish-speaking environment--before hastily and somewhat nervously adding that he didn't know for sure, and, anyway, it probably wasn't true.
The "town" is visible on Google Maps, if you're curious, and it is easy to reach from Barstow. Tours of "The Box" are run twice a month and fill up quickly; learn more at the Fort Irwin website, including safety tips and age restrictions.
This post was originally published at V-e-n-u-e.com, an Atlantic partner site.
... be sure to come by the 5th anniversary Air Show / celebration for the Hangar 24 craft brewery. If I weren't committed to a policy big-think event in DC this weekend, I would be there myself.
I was living in Beijing when I saw news of Hangar 24's opening five years ago: that long-sought combination of small-airport aviation and micro-brewing in my original home town. I ended up spending more time there than I expected in 2008, as I went back and forth from China in my dad's final months. I've been there in happier circumstances as often as I can recently. Have a Columbus IPA or Orange Wheat for me if you go there -- or for my dad, whose 88th birthday it would be. And enjoy the air show!
It's not exactly The Cat Who Wished to Be a Man territory, but this extraordinarily determined and observant feline from Skopje, Macedonia, appears to have some unusual human-world savvy. "Leon the cat," according to YouTube uploader Marjan Kirovski, performs the trick of opening five doors in a row by jumping up on their handles and pulling them down with his paws until the catches release. Then he bats at each cracked door until it swings open, allowing him his next few steps of freedom. It helps that all the doors in the house appear to have the same kind of lever handle, which is non-standard in the United States. Doing this trick with slippery rounded doorknobs would be impossible for any cat less sticky than a gecko. It makes you wonder what other types of technology cats can figure out but lack the right anatomy to do anything with.
"We've seen some recent press allegations that the IRS is targeting certain Tea Party groups across the country -- requesting owners' documents requests, delaying approval for tax-exempt status and that kind of thing," Rep. Charles Boustany of Louisiana observed during the Q&A; portion of the hearing. "Can you elaborate on what's going on with that? Can you give us assurances that the IRS is not targeting particular groups based on political leanings?"Shulman's answer is worth considering at length, because it helps explain what otherwise seems a mystifying denial. It's all about the culture of the IRS and what its more general approach to people is. His reply:
Thanks for bringing this up because I think there's been a lot of press about this and a lot of moving information, so I appreciate the opportunity to clarify. First, let me start by saying, yes, I can give you assurances. As you know, we pride ourselves on being a non-political, non-partisan organization. I am the only -- me and our chief counsel -- are the only presidential appointees, and I have a five-year term that runs through presidential elections, just so we will have none of that kind of political intervention in things that we do.
For 501(c)(4) organizations, which is what's been in the press, organizations do not need to apply for tax exemption. Organizations can actually hold themselves out as 501(c)(4) organizations and then file a 990 with us.
The organizations that have been in the press are all ones that are in the application process.
First of all, I think it's very important to emphasize that all of these organizations came in voluntarily.
They did not need to engage the IRS in a back-and-forth.
They could have held themselves out, filed a 990, and if we had seen an issue, we would have engaged but otherwise we wouldn't....
And so what's been happening has been the normal back-and-forth that happens with the IRS. None of the alleged taxpayers -- and obviously I can't talk about individual taxpayers and I'm not involved in these -- are in an examination process.
They're in an application process which they moved into voluntarily.
There is absolutely no targeting.
In short, according to the IRS, if you're minding your own business and the agency decides to go after you or your group about your taxes -- for an examination or an audit -- you've been targeted. But if you apply for your group to be a tax-exempt one and your application gets sat on for more than a year or you receive a flurry of excessive and/or illegal information requests, that's not targeting. That's what former acting IRS commissioner Steve Miller, speaking at a new Ways & Means hearing on Friday, called "horrible customer service."
To a layperson, the semantic parsing over whether the Tea Party-type groups were targeted or not seems astonishing. Republican Rep. Tom Reed, a self-described "country lawyer from upstate New York," said the burgeoning scandal should be called "IRS Targeting-gate."
But if you look closely at Shulman's answer, it's clear that from an inside-the-agency perspective, it's hard to conceive of the tiny part of the massive national organization that involves review of voluntary requests for evaluation performing a targeted intervention, since the IRS does not require the groups to file such requests, and did not ask them to do it.
Such a way of defining targeting doesn't make any sense from a lay perspective -- but Shulman's more than year-old answer reveals a lot about the IRS perspective on the world and the agency's general level of power.
Computer History Museum
Zilog was founded by Intel veterans Federico Faggin and Ralph Ungermann in 1974. Their first microprocessor, the Z80, was a hit. Intel's products, the company's Dave House admitted, "kind of got stomped on by Zilog with their Z80." But Zilog's success brought trouble in an unlikely form: Exxon.
First, Exxon made a large investment in exchange for 51 percent of the company. Then they bought Zilog outright, even though its next-generation 16-bit microprocessor, the Z8000, had not met with tremendous success. It was downhill from there. And by 1985, having invested a billion dollars, Exxon sold the company back to some of its employees and the investment firm Warburg Pincus.
I ran across this story while reporting on the history of Intel; in those key days during the early 1980s, right before IBM decided to use Intel chips, Zilog was providing legitimate competition for the now-giant company.
What I soon discovered, though, was that Exxon was not alone in trying to make money off the computing boom. As Forbes recalled in 1997, many companies went chasing tech growth and came up empty-handed or worse.
There was something about the way these companies managed their businesses that seemed destined to run highly innovative chip companies into the ground.
It seemed like a good idea at the time. Schlumberger was flush with cash from its oil well logging business. Fairchild Camera & Instrument was a pioneer in the semiconductor industry and in need of capital. Semiconductor chips didn't seem too far afield from Schlumberger's expertise. Didn't oil well measuring tools use electronics heavily? Schlumberger wrote out a check for $425 million to purchase Fairchild.
This was in 1979, just before the great boom in personal computers got underway. Schlumberger should have made billions of dollars from its acquisition. But it didn't. In 1987 it sold Fairchild at a $220 million loss to National Semiconductor.
You could make a long list of merger fiascos in computers and electronics: Xerox paying $900 million for mainframe manufacturer Scientific Data Systems in 1969. Exxon buying Zilog, a microprocessor company, and then some word processor companies, into which it sank $1 billion before selling and writing off the businesses. AT&T losing $4 billion on NCR during a bull market.
Faggin, for his part, still sounds a little bitter about it all. He left Intel disgruntled and wanted to take the company on. "And we almost succeeded," he recalled in an oral history. "The Z80 was our first product and it became very successful. It took the business away from the [Intel] 8080. Zilog was winning in the market, but then IBM's choice to adopt the Intel 8086 reversed the direction. That was the turning point. By the way, the key reason IBM chose Intel was that our sole investor, Exxon Enterprises, had declared war on IBM."
But it's not just sour grapes. G. Dan Hutcheson, president of VLSI Research, told Forbes, "Zilog might have been what Intel is today, if Exxon hadn't tied them down."
What exactly did they do wrong (aside from tiffing with IBM)? Bernard Peuto, who was at Zilog in the early years and later went on to Sun Microsystems, had a simple answer for what tended to go wrong: The big companies gave Silicon Valley upstarts too much money.
"Quite frankly I blame Exxon," Peuto said in a panel about Zilog at the Computer History Museum. "Exxon essentially choked us with money. They basically gave us too much money and too many directions, which we then kind of went into and in some sense there are times where you have to refuse and that's very hard to do when somebody gives you dollars. But the reality was the reason we were doing too many things is because we could afford to do it because Exxon was kind of giving us the check. That's my personal view. The elephant [had grown too] complex."
And maybe that's the history lesson we can apply today: Too much money too fast breeds too little focus and too much complexity.
The scandals we are talking about in Washington today are not tied to the individual of Barack Obama. While there's still more information to be gathered and more investigations to be done, all indications are that these decisions - on the AP, on the IRS, on Benghazi - don't proceed from him. The talk of impeachment is absurd. The queries of "what did the president know and when did he know it" will probably end up finding out "just about nothing, and right around the time everyone else found out."
Is it your position that any government official should be able to leak any classified information to a journalist with impunity even when that leak endangers lives and compromises national security? Where are your boundaries?

And:
I don't think you're really grappling with President Obama's argument in favor of the leak investigation. His argument is straightforward: revealing national security secrets is a matter of life and death for Americans overseas. Anyone who reveals those secrets should be arrested and prosecuted as a matter of justice and deterrence. That's a solid argument, and for you to rebut Obama by talking about the lessons of history is an exercise in evasion. When Aldrich Ames exposed the names of CIA agents and sources to the Soviet Union, those agents and sources were promptly arrested and executed. It seems very likely that the wikileaks data dumps had the same result, especially since Julian Assange refused to redact any of the information. The Bradley Manning court case has been an embarrassment, but it's hard to argue that the federal government should not have moved heaven and earth to find the culprit and prosecute him.

Several people also pointed out this item, by Kevin Drum, on why the government took such a hard line in this leak case (although they've been consistently hard on leakers all along). And a university math professor said, in response to my claim that "secrets always get out," "My jaw dropped reading that, given the selection bias inherent in the claim!" (If I had said "all secrets always get out," I would have to respond Touché. My point is that every president has had to cope with "shocking" and "dangerous" releases of classified information.)
I could be persuaded that AG Holder was wrong ... and that President Obama was wrong in backing him. But I am skeptical that the verdict of history is self-evidently against the president, who after all does have a responsibility to protect national security. One of the temptations that presidents should avoid is worrying about looking better in history's eyes. As you know, history is greatly influenced by journalists, who have a certain conflict of interest on issues like this. I am sure the President would rather not prevent journalists from talking to sources (in contrast to President Bush, who would have been overjoyed to send a few journalists to prison), but it's not his top priority. Should it be? You still have to do the hard work of arguing that this tactic, in this instance, was misguided.
In explanation of my own hard-line tone, let me be more precise. On the "Administration's side" of the case, I recognize these points:
1) Leaks can do genuine, terrible damage -- mainly by exposing vulnerable informants and sources, in the way Kevin Drum explains. One reason I was never a fan of the Wikileaks approach is that I knew how many sources in China, in particular, were likely to be harmed by this indiscriminate info-dump.
2) Organizations can and should take reasonable steps to police themselves -- that is, to encourage their own members to observe codes of confidence, and to identify and if necessary punish those who transgress. It matters tremendously to me and other staff members of this magazine that we protect the confidence of people who share information with us. That matters in corporations; it matters in government agencies; it matters for doctors and teachers and detectives and on down a long list.
So what is my complaint?
3) There is a very long history of presidents losing all perspective about leaks, and compounding the problems of the original leak through a disproportionate reaction. Jonathan Bernstein explains some of that here (thanks to AS). That is the history that I said a figure as level-headed and unflappable as Barack Obama should be aware of. Also see James Traub on this pattern in Obama's time.
3a) There is also a history of leaks usually (though not always) being less damaging than initially claimed. See: the history of The Pentagon Papers.
4) An important exception to point (2) above is that these post-leak punitive hunts are most likely to lead to trouble when they spill over to the press. The CIA giving its own members lie-detector tests or intercepting their mail to see who's disloyal is one thing -- and generally a proper thing, in my view. Same for a police department, a military unit, or within reason a company.
It is something else to force reporters to testify (or go to jail if they refuse), or to seize records of their phone calls or meetings. Let's leave aside the First Amendment issues: the complications are similar to those involved in forcing clergy members to talk about their parishioners, or doctors about their patients, or attorneys about their clients, or husbands about their wives, or parents about their children. The CIA investigating its own is straightforward. Dragging in the press is different and has very rarely turned out well. That is the reality that I expected a leader with Obama's Niebuhrian awareness of tragic possibilities to be guided by.
I think this is it for me on this theme. I hope the first two faux-scandals peter out, and that the issues of secrecy and disclosure in the era of long twilight war get more serious examination.
Imagine being alone, in space. Just you and your shiny spacesuit and your tiny metal capsule, the world splayed beneath you in swaths of blue and swirls of white. The only immediate link to the humans below you being a faint, crackling radio line back to Earth.
It sounds kind of amazing, right?
The first fortunate human to experience this most sublime of rides was Yuri Gagarin, just over 52 years ago. And the last person to experience it -- for the U.S., at least -- was Leroy Gordon Cooper, Jr., who piloted NASA's final Mercury mission, Mercury-Atlas 9, 50 years ago this week.
Cooper, who was a little more commonly and a lot more awesomely known as "Gordo," wasn't merely the last American to make a solo journey into space. His trip also set a new record for the longest amount of time spent in space. He was, for a stretch of minutes that must have felt at once impossibly long and frustratingly short, the first American to really travel to space.
In that, though, Cooper followed a long line of sojourners. The Mercury program, overall, had two goals: send a human into orbit, and do it before the Russians. And while it didn't succeed in the latter mission, Mercury did end up sending a series of men beyond Earth's confines. Their flights, though, were relatively brief. Alan Shepard, who made the U.S.'s first suborbital flight into space, spent a mere 15 minutes away from Earth that first time out; John Glenn, who made the first orbits of the planet for the U.S., had a nearly 5-hour flight (as did Scott Carpenter, who made another three orbits in 1962).
Before NASA's Mercury-Atlas 9 mission, the longest amount of time an American had spent in space had been a whopping 9 hours and 13 minutes -- a record set by Wally Schirra, who made six orbits of Earth for the Mercury program in October of 1962.
And then Cooper, the seventh member of the "Original Seven," came along. Gordo, for his part, spent a total of one day, 10 hours, 19 minutes, and 49 seconds in space, making 22 full orbits of the planet before splashing down in the Pacific on May 16, 1963. (That's more than 34 hours, all told.) Over the course of his long voyage, Cooper had a dinner of "powdered roast beef mush" washed down with water. He captured mesmerizing pictures of the Earth below. He became the first American to sleep in space.
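As a quick aside, Cooper's numbers hang together. A back-of-the-envelope check (my own arithmetic, not drawn from NASA's records) converts his time aloft into an average orbital period:

```python
# Back-of-the-envelope check of the flight figures quoted above.
from datetime import timedelta

time_in_space = timedelta(days=1, hours=10, minutes=19, seconds=49)
orbits = 22

total_hours = time_in_space.total_seconds() / 3600
minutes_per_orbit = time_in_space.total_seconds() / 60 / orbits

print(round(total_hours, 2))        # 34.33 -- just over 34 hours
print(round(minutes_per_orbit, 1))  # 93.6 minutes per orbit
```

Ninety-odd minutes per circuit is exactly what you'd expect for a capsule in low Earth orbit.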
The story doesn't end there, though: Cooper also ran into some trouble. On his 19th orbit, the solo astronaut encountered a problem with the indicator light on the craft he named himself, Faith 7. On the 20th, he lost his attitude readings. On the 21st, a short-circuit occurred, leaving the tiny craft's automatic stabilization and control systems without electrical power.
Suddenly, the crackling radio connecting Gordo to Earth became even more crucial than it had been before. John Glenn, aboard a ship in Japan at the time, communicated with Cooper as he swept around the planet, helping the solo space traveler to revise the checklist NASA had prepared for his entry back to Earth. "Meanwhile, Mercury Control Center was in a flurry of worried activity," one history has it, "cross-checking Faith 7's problems and Cooper's diagnostic actions with identical equipment at the Cape [Canaveral] and in St. Louis, then relaying to each communications site questions to ask and instructions to give."
The team soon had another problem to wrestle with: rising carbon dioxide levels in Cooper's craft -- and in his suit. The cabin temperature was rising to 100 degrees Fahrenheit. "Things are beginning to stack up a little," Cooper told the ground, with some understatement.
Indeed. Throughout all this, reports make a point of emphasizing, Cooper -- alone in space, in a tiny, malfunctioning pod named Faith -- remained calm. This was in part because he had taken a medically prescribed pill of dextroamphetamine, stimulating his alertness. But it was also because these are the kinds of situations that astronauts, then as now, are trained for. Ground Control determined that Cooper, given the problems Faith 7 was experiencing, would need to make a manual re-entry back to Earth. The margin of error for angling the craft correctly would be slight: If Cooper came in too steeply, g-forces would crush him; if his trajectory were too shallow, the craft would bounce off the atmosphere and be shot back into space.
But a manual re-entry it was going to have to be. Cooper made his calculations, with help from the ground, based on his knowledge of star patterns. In the process, he disproved the "spam in a can" idea that Chuck Yeager had famously derided when it came to the Mercury missions: Gordo was much more than simply an experimental body in a NASA-piloted spaceship. He put his education -- and his environs -- to use, drawing lines on Faith 7's window to help him check his orientation against the constellations outside. He shifted from passenger to pilot. "I used my wrist watch for time," Cooper later recalled, "my eyeballs out the window for attitude."
He got it right: Cooper splashed down safely, right next to the aircraft carrier that had been dispatched to retrieve him. Which wouldn't come as a surprise to the NASA engineers who had worked with him on the final Mercury mission. The last American to solo in space was given that honor in large part because of his capacity for self-reliance. "He knew what he was doing," a NASA coworker would later recall, "and could always make things work."
I don't think suffering is good, but I do believe that we have to pay a price for past sins, and the longer we put it off, the higher the price will be.
But this cure has been one ice-cream sundae after another. It can't be that easy, can it? The puritan in me says that there has to be some pain. That's not to say that there hasn't been plenty of economic pain. But that pain has come from the recession itself, not the cure.
A Chinese firm best known for building air conditioning units is constructing a vertical city.
The 838-meter-tall (2,749 feet) tower, more commonly known as "Sky City," will be about 30 feet taller than the world's tallest building at present, Dubai's Burj Khalifa. Moreover, it will rise in Changsha, a southern provincial capital of about 7 million people -- tiny compared to cities like Beijing or Shanghai.
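The height comparison is easy to sanity-check. A minimal sketch in Python; note that the 828-meter figure for the Burj Khalifa is an assumption (widely cited, but not stated in the article):

```python
# Sanity check of the quoted height figures. The 828 m height of the
# Burj Khalifa is an assumption, not taken from the article itself.
M_TO_FT = 3.28084  # meters to feet

sky_city_m = 838
burj_khalifa_m = 828

sky_city_ft = sky_city_m * M_TO_FT
difference_ft = (sky_city_m - burj_khalifa_m) * M_TO_FT

print(round(sky_city_ft))    # 2749, matching the article's "2,749 feet"
print(round(difference_ft))  # 33, i.e. "about 30 feet taller"
```

The two rounded results line up with the figures quoted in the text.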
BSB (Broad Sustainable Building) likes to market the project as the "next step in urbanization." It will house about 30,000 people in a 202-floor building that will also include offices, a hotel, a school, and a hospital -- not to mention 92 elevators, a six-mile ramp between floors, and 17 helipads.
But whether BSB will actually pull off the project is up for debate. Chairman Zhang Yue made waves in 2011 when the company built a 30-story building in 15 days using prefabricated steel and concrete. Zhang plans to take a similar approach with Sky City and has bragged that the company would complete construction within 90 days. BSB has already had to delay its plans once, reportedly because local authorities withheld permits over concerns about safety, environmental impact, and congestion. On its website, BSB now says construction will take seven months.
Ta-Nehisi has used an image of Walter White, the first African American head of the NAACP, to illustrate the pliability of black identity. It certainly shows that there are no fixed definitions of race which are particularly useful. But the same could be said of much of biological science, which is rife with exceptions and boundary conditions, and characterized by an instrumental perspective. The data above suggest that self-identified African Americans exhibit a wide range of African ancestry, but over 90% are more than 50% African in ancestry. Walter White, who had five black great-great-great-grandparents and 27 white ones, was almost certainly less than 20% African in ancestry. There are such people even today, but they are not typical, and they do not disprove the reality that African Americans are predominantly of African ancestry.
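The arithmetic behind the "less than 20%" figure can be made explicit. A quick sketch, using the expected (average) genomic contribution per ancestor; actual inherited fractions vary around this expectation because of recombination:

```python
# Expected ancestry fraction at the great-great-great-grandparent
# generation: 2**5 = 32 ancestors, each contributing ~1/32 on average.
generations = 5
total_ancestors = 2 ** generations   # 32
black_ancestors = 5                  # per the Walter White example

african_fraction = black_ancestors / total_ancestors
print(f"{african_fraction:.1%}")     # 15.6%, consistent with "less than 20%"
```

Five of 32 ancestors gives an expected African ancestry of about 15.6%, which is why the text can say "almost certainly less than 20%."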
As expected, PCA on our entire sample revealed the greatest genetic differentiation between the US Caucasians and the Africans, with the African Americans intermediate between them, reflecting their recent admixture between ancestors from Europe and Africa. Our estimate of European individual admixture (IA) in the African Americans was also roughly consistent with prior studies, with an average of 21.9%. We found considerable variation among individuals in terms of European IA, and a number of individuals with particularly high European IA values (eight individuals of 136, or 6%, with values greater than 45%). Prior studies focusing on mtDNA and Y chromosomes have found a greater African and lesser European representation of mtDNA haplotypes compared with Y chromosome haplotypes in African Americans, suggesting a greater contribution of African matrilineal descent compared with patrilineal descent [6,7]. For example, Kayser and colleagues estimated that 27.5% to 33.6% of Y chromosomes in African Americans are of European origin, compared with 9.0% to 15.4% of mtDNA haplotypes.
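The "6%" figure quoted from the study follows directly from its counts:

```python
# Reproducing the quoted figure: 8 of 136 African American individuals
# had European individual admixture (IA) above 45%.
n_individuals = 136
n_high_european_ia = 8

share = n_high_european_ia / n_individuals
print(f"{share:.0%}")  # 6%
```

Eight of 136 is about 5.9%, which the study rounds to 6%.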
"Race" as a term is very nebulous. But human subgroups with similar ancestries can have group differences in DNA -- and intelligence is highly unlikely to have no genetic basis at all (although most now believe its impact is greatly qualified by cultural and developmental differences).
But what I really want TNC to address is the data. Yes, "race" is a social construct when we define it as "white," "black," "Asian," or, even more ludicrously, "Hispanic." But why, then, does the overwhelming weight of the data show IQ varying by statistically significant amounts between these completely arbitrary racially constructed populations? Is the testing rigged? If the categories are arbitrary, then the IQs should be randomly distributed. But they aren't, even controlling for education, income, etc.
For example, consider only affluent households whose incomes were above $75,000 in each year (adjusted for inflation). Table 2 shows that the average affluent white household lived in a neighborhood where the poverty share was under 10 percent in every year. But poor white households (incomes below $40,000) lived in neighborhoods with only slightly greater poverty shares, about 12 percent or 13 percent. In contrast, affluent blacks lived in neighborhoods that were 14-15 percent poor, and affluent Hispanics in neighborhoods that were about 13 percent poor. On average around the country, in this whole period of nearly two decades, affluent blacks and Hispanics lived in neighborhoods with fewer resources than did poor whites.