

Date: Friday, 12 Sep 2014 16:45

A Call To Duty (David Weber, Timothy Zahn; Baen Books) is a passable extension of Baen Books’ tent-pole Honorverse franchise. Though billed as by David Weber, it resembles almost all of Baen’s double-billed “collaborations” in that most of the actual writing was clearly done by the guy on the second line, with the first line there as a marketing hook.

Zahn has a bit of fun subverting at least one major trope of the subgenre; Travis Long is definitely not the kind of personality one expects as a protagonist. Otherwise all the usual ingredients are present in much the expected combinations. Teenager longing for structure in his life joins the Navy, goes to boot camp, struggles in his first assignment, has something special to contribute when the shit hits the fan. Also, space pirates!

Baen knows its business; there may not be much very original about this, but Honorverse fans will enjoy this book well enough. And for all its clichéd quality, it’s more engaging than Zahn’s rather sour last outing, Soulminder, which I previously reviewed.

The knack for careful worldbuilding within a franchise’s canonical constraints that Zahn exhibited in his Star Wars tie-ins is deployed here, where details of the architecture of Honorverse warships become significant plot elements. Also we get a look at Manticore in its very early years, with some characters making the decisions that will grow it into the powerful star kingdom of Honor Harrington’s lifetime.

For these reasons, if no others, Honorverse completists will want to read this one too.

Author: "Eric Raymond" Tags: "Review, Science Fiction"
Date: Thursday, 11 Sep 2014 21:39

Infinite Science Fiction One (edited by Dany G. Zuwen and Joanna Jackson; Infinite Acacia) starts out rather oddly, with Zuwen’s introduction in which, though he says he’s not religious, he connects his love of SF with having read the Bible as a child. The leap from faith narratives to a literature that celebrates rational knowability seems jarring and a bit implausible.

That said, the selection of stories here is not bad. Higher-profile editors have done worse, sometimes in anthologies I’ve reviewed.

Janka Hobbs’s Real is a dark, affecting little tale of a future in which people who don’t want the mess and bother of real children buy robotic child surrogates, and what happens when a grifter invents a novel scam.

Tim Majors’s By The Numbers is a less successful exploration of the idea of the quantified self – a failure, really, because it contains an impossible oracle-machine in what is clearly intended to be an SF story.

Elizabeth Bannon’s Tin Soul is a sort of counterpoint to Real in which a man’s anti-robot prejudices destroy his ability to relate to his prosthetically-equipped son.

P. Anthony Ramanauskas’s Six Minutes is a prison-break story told from the point of view of a monster, an immortal mind predator who steals the bodies of humans to maintain existence. It’s well written, but diminished by the author’s failure to actually end it and dangling references to a larger setting that we are never shown. Possibly a section from a larger work in progress?

John Walters’s Matchmaker works a familiar theme – the time traveler at a crisis, forbidden to interfere or form attachments – unfortunately, to no other effect than an emotional tone painting. Competent writing does not save it from becoming maudlin and trivial.

Nick Holburn’s The Wedding is a creepy tale of a wedding disrupted by an undead spouse. Not bad on its own terms, but I question what it’s doing in an SF anthology.

Jay Wilburn’s Slow is a gripping tale of an astronaut fighting off being consumed by a symbiote that has at least temporarily saved his life. Definitely SF; not for the squeamish.

Rebecca Ann Jordan’s Gospel Of is strange and gripping. An exile with a bomb strapped to her chest, a future spin on the sacrificed year-king, and a satisfying twist in the ending.

Dan Devine’s The Silent Dead is old-school in the best way – could have been an Astounding story in the 1950s. The mass suicide of a planetary colony has horrifying implications the reader may guess before the ending…

Matthew S. Dent’s Nothing Besides Remains carries forward another old-school tradition – a robot come to sentience yearning for its lost makers. No great surprises here, but a good exploration of the theme.

William Ledbetter’s The Night With Stars is very clever, a sort of anthropological reply to Larry Niven’s classic The Magic Goes Away. What if Stone-Age humans relied on electromagnetic features of their environment – and then, due to a shift in the geomagnetic field, lost them? Well done.

Doug Tidwell’s Butterflies is, alas, a textbook example of what not to do in an SF story. At best it’s a trivial finger exercise about an astronaut going mad. There’s no reveal anywhere, and it contradicts the actual facts of history without explanation; no astronaut did this during Kennedy’s term.

Michaele Jordan’s Message of War is a well-executed tale of weapons that can wipe a people from history, and how they might be used. Subtly horrifying even if we are supposed to think of the wielders as the good guys.

Liam Nicolas Pezzano’s Rolling By in the Moonlight starts well, but turns out to be all imagery with no point. The author has an English degree; that figures – this piece smells of literary status envy, a disease the anthology is otherwise largely and blessedly free of.

J.B. Rockwell’s Midnight also starts well and ends badly. An AI on a terminally damaged warship struggling to get its cryopreserved crew launched to somewhere they might live again, that’s a good premise. Too bad it’s wasted on empty sentimentality about cute robots.

This anthology is only about 50% good, but the good stuff is quite original and the less good is mostly just defective SF rather than being anti-SF infected with literary status envy. On balance, better value than some higher-profile anthologies with more pretensions.

Author: "Eric Raymond" Tags: "General"
Date: Thursday, 11 Sep 2014 09:11

Collision of Empires (Prit Buttar; Osprey Publishing) is a clear and accessible history that attempts to address a common lack in accounts of the Great War that began a century ago this year: they tend to be centered on the Western Front and the staggering meat-grinder that static trench warfare became as outmoded tactics collided with the reality of machine guns and indirect-fire artillery.

Concentration on the Western Front is understandable in the U.S. and England; the successor states of the Western Front’s victors have maintained good records, and nationals of the English-speaking countries were directly involved there. But in many ways the Eastern Front story is more interesting, especially in the first year that Buttar chooses to cover – less static, and with a sometimes bewilderingly varied cast. And, arguably, larger consequences. The war in the east eventually destroyed three empires and put Lenin’s Communists in power in Russia.

Prit Buttar does a really admirable job of illuminating the thinking of the German, Austrian, and Russian leadership in the run-up to the war – not just at the diplomatic level but in the ways that their militaries were struggling to come to grips with the implications of new technology. The extensive discussion of internecine disputes over military doctrine in the three officer corps involved is better than anything similar I’ve seen elsewhere.

Alas, the author’s gift for lucid exposition falters a bit when it comes to describing actual battles. Ted Raicer did a better job of this in 2010’s Crowns In The Gutter, supported by a lot of rather fine-grained movement maps. Without these, Buttar’s narrative tends to bog down in a confusing mess of similar unit designations and vaguely comic-operatic Russo-German names.

Still, the effort to follow it is worthwhile. Buttar is very clear on the ways that flawed leadership, confused objectives and wishful thinking on all sides engendered a war in which there could be no clear-cut victory short of the utter exhaustion and collapse of one of the alliances.

On the Eastern Front, as on the Western, soldiers fought with remarkable courage for generals and politicians who – even on the victorious side – seriously failed them.

Author: "Eric Raymond" Tags: "Review"
Date: Monday, 08 Sep 2014 04:06

The Abyss Beyond Dreams (Peter F. Hamilton, Random House/Del Rey) is a sequel set in the author’s Commonwealth universe, which earlier included one duology (Pandora’s Star, Judas Unchained) and a trilogy (The Dreaming Void, The Temporal Void, The Evolutionary Void). It brings back one of the major characters (the scientist/leader Nigel Sheldon) on a mission to discover the true nature of the Void at the heart of the Galaxy.

The Void is a pocket universe which threatens to enter an expansion phase that would destroy everything. It is a gigantic artifact of some kind, but neither its builders nor its purpose is known. Castaway cultures of humans live inside it, gifted with psionic powers in life and harvested by the enigmatic Skylords in death. And Nigel Sheldon wants to know why.

This is space opera and planetary romance pulled off with almost Hamilton’s usual flair. I say “almost” because the opening sequence, though action-packed, comes off as curiously listless. Nigel Sheldon’s appearance rescues the show, and we are shortly afterwards pitched into an entertaining tale of courage and revolution on a Void world. But things are not as they seem, and the revolutionaries are being manipulated for purposes they cannot guess…

The strongest parts of this book show off Hamilton’s worldbuilding imagination and knack for the telling detail. Yes, we get some insight into what the Void actually is, and an astute reader can guess more. But the final reveal will await the second book of this duology.

Author: "Eric Raymond" Tags: "Review, Science Fiction"
Date: Sunday, 07 Sep 2014 20:35

Sometimes reading code is really difficult, even when it’s good code. I have a challenge for all you hackers out there…

cvs-fast-export translates CVS repositories into a git-fast-export stream. It does a remarkably good job, considering that (a) the problem is hard and grotty, with weird edge cases, and (b) the codebase is small and written in C, which is not the optimal language for this sort of thing.

It does a remarkably good job because Keith Packard wrote most of it, and Keith is a brilliant systems hacker (he codesigned X and wrote large parts of it). I wrote most of the parts Keith didn’t, and while I like to think my contribution is solid it doesn’t approach his in algorithmic density.

Algorithmic density has a downside. There are significant parts of Keith’s code I don’t understand. Sadly, Keith no longer understands them either. This is a problem, because there are a bunch of individually small issues which (I think) add up to: the core code needs work. Right now, neither I nor anyone else has the knowledge required to do that work.

I’ve just spent most of a week trying to acquire and document that knowledge. The result is a file called “hacking.asc” in the cvs-fast-export repository. It documents what I’ve been able to figure out about the code. It also lists unanswered questions. But it is incomplete.

It won’t be complete until someone can read it and know how to intelligently modify the heart of the program – a function called rev_list_merge() that does the hard part of merging cliques of CVS per-file commits into a changeset DAG.

The good news is that I’ve managed to figure out and document almost everything else. A week ago, the code for analyzing CVS masters into in-core data objects was trackless jungle. Now, pretty much any reasonably competent C systems programmer could read hacking.asc and the comments and grasp what’s going on.

More remains to be done, though, and I’ve hit a wall. The problem needs a fresh perspective, ideally more than one. Accordingly, I’m requesting help. If you want a real challenge in comprehending C code written by a master programmer – a work of genius, seriously – dive in.

https://gitorious.org/cvs-fast-export/

There’s the repository link. Get the code; it’s not huge, only 10KLOC, but it’s fiendishly clever. Read it. See what you can figure out that isn’t already documented. Discuss it with me. I guarantee you’ll find it an impressive learning experience – I have, and I’ve been writing C for 30 years.

This challenge is recommended for intermediate to advanced C systems programmers, especially those with an interest in the technicalia of version-control systems.

Author: "Eric Raymond" Tags: "Software"
Date: Thursday, 04 Sep 2014 00:50

Yesterday I shipped cvs-fast-export 1.15, with a significant performance improvement produced by replacing a naive O(n**3) sort with a properly tuned O(n log n) version.
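To make the shape of the change concrete, here is a minimal C sketch – not the actual cvs-fast-export code; the element type and function names are made-up stand-ins – of replacing a hand-rolled scan-and-insert ordering with a single qsort(3) call and a comparator:

    #include <stdlib.h>

    /* Hypothetical stand-in for cvs-fast-export's real commit structures. */
    struct commit {
        long timestamp;
    };

    /* Comparator for qsort(3): ascending by timestamp. */
    static int commit_cmp(const void *a, const void *b)
    {
        const struct commit *ca = a, *cb = b;
        return (ca->timestamp > cb->timestamp) - (ca->timestamp < cb->timestamp);
    }

    /* One O(n log n) library call replaces the naive repeated scans. */
    static void sort_commits(struct commit *commits, size_t n)
    {
        qsort(commits, n, sizeof(struct commit), commit_cmp);
    }

The subtract-two-comparisons trick in the comparator avoids the integer-overflow risk of returning ca->timestamp - cb->timestamp directly.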

In ensuing discussion on G+, one of my followers there asked if I thought this was likely to produce a real performance improvement, since on small inputs the constant setup time of a cleverly tuned algorithm often dominates the nominal savings.

This is one of those cases where an intelligent question elicits knowledge you didn’t know you had. I discovered that I do believe strongly that cvs-fast-export’s workload is dominated by large repositories. The reason is a kind of adverse selection phenomenon that I think is very general to old technologies with high exit costs.

The rest of this blog post will use CVS as an example of the phenomenon, and may thus be of interest even to people who don’t specifically care about version-control systems.

Cast your mind back to the point at which CVS was definitively superseded by better VCS designs. It doesn’t matter for this discussion exactly when that point was, but you can place it somewhere between 2000 and 2004 based on when you think Subversion went from a beta program to a production tool.

At that point there were lots of CVS repositories around, greatly varying in size and complexity. Some were small and simple, some large and ugly. By “ugly” I mean full of Things That Should Not Be – tags not corresponding to coherent changesets, partially merged import branches, deleted files for which the masters recording older versions had been “cleaned up”, and various other artifacts that would later cause severe headaches for anyone trying to convert the repositories to a modern VCS.

In general, size and ugliness correlated well with project age. There are exceptions, however. When I converted the groff repository from CVS to git I was braced for an ordeal; groff is quite an old project. But the maintainer and his devs had been, it turned out, very careful and disciplined, and committed none of the sloppinesses that commonly lead to nasty artifacts.

So, at the point that people started to look seriously at moving off CVS, there was a large range of CVS repo sizes out there, with difficulty and fidelity of up-conversion roughly correlated to size and age.

The result was that small projects (and well-disciplined larger projects resembling groff) converted out early. The surviving population of CVS repositories became, on average, larger and gnarlier. After ten years of adverse selection, the CVS repositories we now have left in the wild tend to be the very largest and grottiest kind, usually associated with projects of venerable age.

GNUPLOT and various BSD Unixes stand out as examples. We have now, I think, reached the point where the remaining CVS conversions are in general huge, nasty projects that will require heroic effort with even carefully tuned and optimized tools. This is not a regime in which the constant startup cost of an optimized sort is going to dominate.

At the limit, there may be some repositories that never get converted because the concentrated pain associated with doing that overwhelms any time-discounted estimate of the costs of using obsolescent tools – or even the best tools may not be good enough to handle their sheer bulk. Emacs was almost there. There are hints that some of the BSD Unix repositories may be there already – I know of failed attempts, and tried to assist one such failure.

I think you can see this kind of adverse selection effect in survivals of a lot of obsolete technology. Naval architecture is one non-computing field where it’s particularly obvious. Surviving obsolescent ships tend to be large and ugly rather than small and ugly, because the capital requirement to replace the big ones is harder to swallow.

Has anyone coined a name for this phenomenon? Maybe we ought to.

Author: "Eric Raymond" Tags: "Software, version-control"
Date: Wednesday, 03 Sep 2014 05:35

Better Identification of Viking Corpses Reveals: Half of the Warriors Were Female insists an article at tor.com. It’s complete bullshit.

What you find when you read the linked article is an obvious, though as it turns out superficial, problem. The linked research doesn’t say what the article claims. What it establishes is that a hair less than half of Viking migrants were female, which is no surprise to anyone who’s been paying attention. The leap from that to “half the warriors were female” is unjustified and quite large.

There’s a deeper problem the article is trying to ignore or gaslight out of existence: reality is, at least where pre-gunpowder weapons are involved, viciously sexist.

It happens that I know a whole lot from direct experience about fighting and training with contact weapons – knives, swords, and polearms in particular. I do this for fun, and I do it in training environments that include women among the fighters.

I also know a good deal about Viking archeology – and my wife, an expert on Viking and late Iron Age costume who corresponds on equal terms with specialist historians, may know more than I do. (Persons new to the blog might wish to read my review of William Short’s Viking Weapons and Combat.) We’ve both read saga literature. We both have more than a passing acquaintance with the archeological and other evidence from other cultures historically reported to field women in combat, such as the Scythians, and have discussed it in depth.

And I’m calling bullshit. Males have, on average, about a 150% advantage in upper-body strength over females. It takes an exceptionally strong woman to match the ability of even the average man to move a contact weapon with power and speed and precise control. At equivalent levels of training, with the weight of real weapons rather than boffers, that strength advantage will almost always tell.

Supporting this, there is only very scant archeological evidence for female warriors (burials with weapons). There is almost no such evidence from Viking cultures, and what little we have is disputed; the Scythians and earlier Germanics from the Migration period have substantially more burials that might have been warrior women. Tellingly, they are almost always archers.

I’m excluding personal daggers for self-defense here and speaking of the battlefield contact weapons that go with the shieldmaidens of myth and legend. I also acknowledge that a very few exceptionally able women can fight on equal terms with men. My circle of friends contains several such exceptional women; alas, this tells us nothing about women as a class but much about how I select my friends.

But it is a very few. And if a pre-industrial culture had chosen to train more than a tiny fraction of its women as shieldmaidens, it would have lost out to a culture that protected and used their reproductive capacity to birth more male warriors. Brynhilde may be a sexy idea, but she’s a bioenergetic gamble that is near certain to be a net waste.

Firearms change all this, of course – some of the physiological differences that make women inferior with contact weapons are actual advantages at shooting (again I speak from experience, as I teach women to shoot). So much so that anyone who wants to suppress personal firearms is objectively anti-female and automatically oppressive of women.

Author: "Eric Raymond" Tags: "Martial Arts, Science"
Date: Thursday, 28 Aug 2014 12:31

Our new cat Zola, it appears, has a mysterious past. The computer that knows about the ID chip embedded under his skin thinks he’s a dog.

There’s more to the story. And it makes us think we may have misread Zola’s initial behavior. I’m torn between wishing he could tell us what he’d been through, and maybe being thankful that he can’t. Because if he could, I suspect I might experience an urge to go punch someone’s lights out that would be bad for my karma.

On Zola’s first vet visit, one of the techs did a routine check and discovered that Zola had had an ID chip implanted under his skin. This confirmed our suspicion that he’d been raised by humans rather than being feral or semi-feral. Carol, our contact at PALS (the rescue network we got Zola from), put some more effort into trying to trace his background.

We already knew that PALS rescued Zola from an ASPCA shelter in Cumberland County, New Jersey, just before he would have been euthanized. Further inquiry disclosed that (a) he’d been dumped at the shelter by a human, and (b) he was, in Carol’s words, “alarmingly skinny” – they had to feed him up to a normal weight.

The PALS people didn’t know he was chipped. When we queried Home Again, the chip-tracking outfit, the record for the chip turned out to record the carrier as a dog. The staffer my wife Cathy spoke with at Home Again thought that was distinctly odd. This is not, apparently, a common sort of confusion.

My wife subsequently asked Home Again to contact the person or family who had Zola chipped and request that the record be altered to point to us. (This is a routine procedure for them when an animal changes owners.)

We got a reply informing us that permission for the transfer was refused.

These facts indicate to us that somewhere out there, there is someone who (a) got Zola as a kitten, (b) apparently failed to feed him properly, (c) dumped him at a shelter, and now (d) won’t allow the chip record to be changed to point to his new home.

This does not add up to a happy picture of Zola’s kittenhood. It is causing us to reconsider how we evaluated his behavior when we first met him. We thought he was placid and dignified – friendly but a little reserved.

Now we wonder – because he isn’t “placid” any more. He scampers around in high spirits. He’s very affectionate, even a bit needy sometimes. (He’s started to lick our hands occasionally during play.) Did we misunderstand? Was his reserve a learned fear of mistreatment? We don’t know for sure, but it has come to seem uncomfortably plausible.

There’s never any good reason for mistreating a cat, but it seems like an especially nasty possibility when the cat is as sweet-natured and human-friendly as Zola is. He’s not quite the extraordinarily loving creature Sugar was, but his Coon genes are telling. He thrives on affection and returns it more generously every week.

I don’t know if we’ll ever find out anything more. Nobody at PALS or Home Again or our vet has a plausible theory about why Zola is carrying an ID chip registered to a dog, nor why his former owners won’t OK a transfer.

We’re just glad he’s here.

Author: "Eric Raymond" Tags: "General, Zola"
Date: Wednesday, 27 Aug 2014 08:58

I just had a rather hair-raising experience with a phase-of-moon-dependent bug.

I released GPSD 3.11 this last Saturday (three days ago) to meet a deadline for a Debian freeze. Code tested ninety-six different ways, run through four different static analyzers, the whole works. Because it was a hurried release I deliberately deferred a bunch of cleanups and feature additions in my queue. Got it out on time and it’s pretty much all good – we’ve since turned up two minor build failures in two unusual feature-switch cases, and one problem with the NTP interface code that won’t affect reasonable hardware.

I’ve been having an extremely productive time since, chewing through all the stuff I had deferred. New features for gpsmon, improvements for GPSes watching GLONASS birds, a nice space optimization for embedded systems, some code to prevent certain false-match cases in structured AIS Type 6 and Type 8 messages, merging some Android port tweaks, a righteous featurectomy or two. Good clean fun – and of course I was running my regression tests frequently and noting when I’d done so in my change comments.

Everything was going swimmingly until about two hours ago. Then, as I was verifying a perfectly innocent-appearing tweak to the SiRF-binary driver, the regression tests went horribly, horribly wrong. Not just the SiRF binary testloads, all of them.

My friends, do you know what it looks like when the glibc detects a buffer overflow at runtime? Pages and pages of hex garble, utterly incomprehensible and a big flare-lit clue that something bad done happened.

“Yoicks!” I muttered, and backed out the latest change. Ran “scons check” again. Kaboom! Same garble. Wait – I’d run regressions successfully on that revision just a few minutes previously, or so I thought.

Don’t panic. Back up to the last revision where the change comment includes the reassuring line “All regression tests passed.” Rebuild. “scons check”. Aaaand…kaboom!

Oh shit oh dear. Now I have real trouble. That buffer overflow has apparently been lurking in ambush for some time, with regression tests passing despite it because the phase of the moon was wrong or something.

The first thing you do in this situation is try to bound the damage and hope it didn’t ship in the last release. I dropped back to the release 3.11 revision, rebuilt and tested. No kaboom. Phew!

These are the times when git bisect is your friend. Five test runs later I found the killer commit – a place where I had tried recovering from bad file descriptor errors in the daemon’s main select call (which can happen if an attached GPS dies under pessimal circumstances) and garbage-collecting the storage for the lost devices.

Once I had the right commit it was not hard to zero in on the code that triggered the problem. By inspection, the problem had to be in a particular 6-line loop that was the meat of the commit. I checked out the head version and experimentally conditioned out parts of it until I had the kaboom isolated to one line.

It was a subtle – and entirely typical – sort of systems-programming bug. The garbage-collection code iterated over the array of attached devices conditionally freeing them. What I forgot when I coded this was that that sort of operation is only safe on device-array slots that are currently allocated and thus contain live data. The test operation on a dead slot – an FD_ISSET() – was the kaboomer.
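In outline – this is a hedged reconstruction for illustration, not the actual GPSD source; the structure and function names are invented – the loop and its fix looked something like this:

    #include <stdbool.h>
    #include <sys/select.h>

    #define MAX_DEVICES 4

    struct device {
        bool allocated;  /* true if this slot holds a live device */
        int fd;          /* file descriptor; stale garbage in dead slots */
    };

    /* Garbage-collect storage for lost devices. Calling FD_ISSET() on
     * the stale fd of a dead slot indexes outside the fd_set bitmask
     * whenever the garbage value is out of range -- the kaboom that
     * glibc's runtime checks were catching. The allocated guard is
     * the one-line fix.
     */
    static void gc_dead_devices(struct device devices[], fd_set *all_fds)
    {
        for (int i = 0; i < MAX_DEVICES; i++) {
            if (!devices[i].allocated)   /* the missing guard */
                continue;
            if (FD_ISSET(devices[i].fd, all_fds)) {
                FD_CLR(devices[i].fd, all_fds);
                /* ...free the slot's storage here... */
            }
        }
    }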

The bug was random because the pattern of stale data in the dead slots was not predictable. It had to be just right for the kaboom to happen. The kaboom didn’t happen for nearly three days, during which I am certain I ran the regression tests well over 20 times a day. (Wise programmers pay attention to making their test suites fast, so they can be run often without interrupting concentration.)

It cannot be said too often: version control is your friend. Fast version control is damn near your best friend, with the possible exception of a fast and complete test suite. Without these things, fixing this one could have ballooned from 45 minutes of oh-shit-oh-dear to a week – possibly more – of ulcer-generating agony.

Version control is maybe old news, but lots of developers still don’t invest as much effort on their test suites as they should. I’m here to confirm that it makes programming a hell of a lot less hassle when you build your tests in parallel with your code, do the work to make them cover well and run fast, then run them often. GPSD has about 100 tests; they run in just under 2 minutes, and I run them at least three or four times an hour.

This puts out little fires before they become big ones. It means I get to spend less time debugging and more time doing fun stuff like architecture and features. The time I spent on them has been multiply repaid. Go and do thou likewise.

Author: "Eric Raymond" Tags: "Software, GPSD"
Date: Wednesday, 27 Aug 2014 00:22

The newest addition to Rootless Root:

On one occasion, as Master Foo was traveling to a conference with a few of his senior disciples, he was accosted by a hardware designer.

The hardware designer said: “It is rumored that you are a great programmer. How many lines of code do you write per year?”

Master Foo replied with a question: “How many square inches of silicon  do you lay out per year?”

“Why…we hardware designers never measure our work in that way,” the man said.

“And why not?” Master Foo inquired.

“If we did so,” the hardware designer replied, “we would be tempted to design chips so large that they cannot be fabricated – and, if they were fabricated, their overwhelming complexity would make it impossible to generate proper test vectors for them.”

Master Foo smiled, and bowed to the hardware designer.

In that moment, the hardware designer achieved enlightenment.

Author: "Eric Raymond" Tags: "Hacker Culture, Software"
Date: Tuesday, 26 Aug 2014 04:15

Yes, I’m aware of the spam on the blog front page. The management does not hawk dubious drugs.

Daniel Franke and I just did an audit and re-secure of the blog last night, so this is a new attack. Looks like a different vector; previously the spam was edited into the posts and invisible, this time it’s only in the front-page display and visible.

It’s a fresh instance of WordPress verified against pristine sources less than 24 hours ago, all permissions checked. Accordingly, this may be a zero-day attack.

Daniel and I will tackle it later tonight after his dinner and my kung-fu class. I’ll update this post with news.

UPDATE: The initial spam has been removed. We don’t know where the hole is, though, so more may appear.

UPDATE2: It’s now about 6 hours later and spam has not reappeared.  I changed my blog password for a stronger one, so one theory is that the bad guys were running a really good dictionary cracker.

Author: "Eric Raymond" Tags: "Administrivia"
Date: Friday, 22 Aug 2014 17:48

Once Dead (Richard Phillips; Amazon Publishing) is a passable airport thriller with some SF elements.

Jack Gregory should have died in that alley in Calcutta. Assigned by the CIA to kill the renegade responsible for his brother’s death, he had nearly succeeded – until local knifemen took a hand. Bleeding, stabbed and near death, he is offered a choice: die, or become host to Ananchu – an extradimensional being who has ridden the limbic systems of history’s greatest slayers.

It’s a grim bargain. Ananchu will give him certain abilities, notably the ability to sense life at a distance and read the intentions of his enemies. But the cost is a near-uncontrollable addiction to danger and death. Gregory will literally be in constant struggle with an inner demon – and when the human who dragged his body from the alley is found insane, mumbling of the return of Jack the Ripper, a dark legend is reborn.

Airport-thriller action ensues as Gregory, believed dead by the CIA, goes freelance but is drawn into opposing a plot to cripple the U.S. with an EMP attack. There are lots of bullets, big explosions, a heavy from the Russian Mafia, treachery from rogues inside the CIA, torture scenes, exotic international locations, some sex, weapon porn, and a climactic Special-Ops-style assault on Baikonur. There’s not much surprising here, and the SF elements tend to recede into the background as the plot develops. There are clear indications that the author intends a series.

It’s not brilliant or terribly original, but it’s competently done. The author is a former Army Ranger; the gunplay, hand-to-hand fighting, and combat ops are written as by someone who has seen how it’s done right, if not done it himself. Read it on an airplane.

Author: "esr" Tags: "Review, Science Fiction"
Date: Tuesday, 19 Aug 2014 06:09

Before you read any further, go look at the drawing accompanying the New York Times article on the autopsy of Michael Brown.

There’s a story in that picture. To read it, you have to be familiar with pistol shooting and the kind of pistol self-defense training that cops and amateur sheepdogs like me engage in.

In the remainder of this post I’m going to walk you through the process of extracting the story from the picture.

In case the link I’m using disappears behind a paywall, here are the most salient features of what I see:

  • The entry and exit wounds form a nearly linear arc from the crown of the head to the right hand.
  • There are no entry or exit wounds on the back.
  • On the head, there are two entry wounds at the crown and right eye, and a wound in the jaw which could be entry or exit (it’s unclear in the drawing).
  • There is one wound at the base of the neck on the right hand side, drawn to suggest an exit wound.
  • There is one wound in the upper right pectoral muscle, drawn like an exit.
  • There are three wounds on the right arm. The topmost one is very near the pectoral wound and drawn like an entry. The torn shape of the middle one, on the upper arm, suggests an exit wound (later update: turns out it’s a graze, and the only one that could have been inflicted from the rear). The bottom one, on the forearm, is clearly drawn as an entry wound.
  • The wound on the hand is drawn to suggest that the bullet entered there at a shallow angle (more tearing would be shown if it were an exit wound).

The first thing that jumps out at me is that this was not wild, amateurish shooting. Had it been, the distribution of bullet holes would have resembled an irregular blob. The near-linear arrangement suggests a relatively steady hand and a shooter who wasn’t panicked.

It also strongly hints that Brown was not moving sideways when the shots were fired. He was either stationary or moving directly forward or away from Officer Wilson.

We know Wilson is a cop and we know how cops are trained – to aim for the target’s center of mass (COM). But that’s not where the shots landed. What I see here looks like good aim at the COM compromised by a mild case of trigger jerk from a right-handed shooter, pulling the muzzle slightly left from the point of aim.

This is probably the single most common shooting fault there is. I do it myself when I’ve been out of practice; my first target, at 30 feet, is likely to feature a vertical line of holes a few inches left of the X-ring. It’s a very easy mistake to make under fatigue or stress.

The location and angle of the head wounds, and the absence of wounds on the rear surfaces of the body, are also telling. For starters, they tell us that Brown was not shot in the back as some accounts have claimed.

I think the only posture that could produce this wound pattern is for Brown to have been leaning well forward when he was first shot, with his right arm stretched forward (the pair of wounds around the right armpit and the shallow-entry wound in the hand are suggestive of the latter).

I originally thought the head wounds indicated that Officer Wilson was shooting Mozambique drill – double tap to the body followed by a head shot. This is how police trainers teach you to take out a charging assailant who might be high out of his mind. I drill this technique myself, as do most serious self-defense shooters.

Now I think it’s equally possible that Brown began to collapse forward when he took the first bullet or two and his head fell into the path of Wilson’s following shots.

One possibility we can rule out is that Brown was shot while prone on the ground after collapsing. There are no wounds at the right places and angles for this. If he had been shot prone at close range, the angle of the crown wound would be impossible; if he had been shot while prone at some distance the crown wound might be just barely possible, but we’d also see shallow-angle wounds on the back.

Everything I see here is consistent with the report from an unnamed friend of officer Wilson that Brown charged Wilson and Wilson shot him at very close range, probably while Brown was grabbing for Wilson or the pistol with his right hand.

UPDATE: I failed to make clear that the reason I’m sure Brown was moving is the extreme torso angle suggested by the lack of exit wounds on the back. A human trying to do that standing still would overbalance and fall, which is why I think he was running or lunging when he took the bullets.

UPDATE2: We now have a bit more information on the report.

“Dr. Baden and I concluded that he was shot at least six times. We’ve got one to the very top of the head, the apex. We’ve got one that entered just above the right eyebrow. We’ve got one that entered the top part of the right arm. We’ve got a graze wound, a superficial graze wound, to the middle part of the right arm. We’ve got a wound that entered the medial aspect of the right arm, and we’ve got a deep graze wound that produced a laceration to the palm of the right hand,” Parcells said while pointing out the location of the wounds on a diagram.

Baden and Parcells concur that the head shots came last, and that the crown wound killed Brown. The middle wound on the arm was not, as I thought from the drawing, an exit; it was a graze. Their description of the hand wound as a graze causing a laceration confirms my reading that the bullet hit the hand at a very low angle – thus, Brown’s hands cannot have been up when he took the shot.

Author: "esr" Tags: "Firearms"
Date: Friday, 15 Aug 2014 16:33

I join my voice to those of Rand Paul and other prominent libertarians who are reacting to the violence in Ferguson, Mo. by calling for the demilitarization of the U.S.’s police. Beyond question, the local civil police in the U.S. are too heavily armed and in many places have developed an adversarial attitude towards the civilians they serve, one that makes police overreactions and civil violence almost inevitable.

But I publish this blog in part because I think it is my duty to speak taboo and unspeakable truths. And there’s another injustice being done here: the specific assumption, common among civil libertarians, that police overreactions are being driven by institutional racism. I believe this is dangerously untrue and actually impedes effective thinking about how to prevent future outrages.

In the Kilivila language of the Trobriand Islands there is a lovely word, “mokita”, which means “truth we all know but agree not to talk about”. I am about to speak some mokitas.

Let’s begin with some statistics. Wikipedia has this to say about race and homicide rates:

According to the US Department of Justice, blacks accounted for 52.5% of homicide offenders from 1980 to 2008, with whites 45.3% and Native Americans and Asians 2.2%. The offending rate for blacks was almost 8 times higher than whites, and the victim rate 6 times higher. Most murders were intraracial, with 84% of white homicide victims murdered by whites, and 93% of black victims murdered by blacks.

Moving forward from 2008 or back from 1980 would change these figures very little; I cite Wikipedia because it’s handy, but I already knew them within a couple of percentage points and they’ve been very similar since before I was born in the 1950s. And we can take homicide figures as representative of racial disparities in wider violent crime rates, because – observably – they are.

Now here are some more facts which taken together, change the implications of that 52.5% a lot. First: in any subpopulation, whether chosen by race or SES or any other criterion, almost all violent crime (up to statistical noise) is perpetrated by males between the ages of 15 and 25.

Second: The black population of the U.S., as of the 2010 census, is 12.61% of the total.

Third: Within that population, males 15-24 are approximately 8% of it (add up the 15-19 and 20-24 boxes in table 2 and divide by two to account for the fact that half of that percentage is female). Multiplying these, the percentage of black males 15-24 in the general population is about 1%. If you add “mixed”, which is reasonable in order to correspond to a policeman’s category of “nonwhite”, it goes to about 2%.

That 2% is responsible for almost all of 52% of U.S. homicides. Or, to put it differently, by these figures a young black or “mixed” male is roughly 26 times more likely to be a homicidal threat than a random person outside that category – older or younger blacks, whites, hispanics, females, whatever. If the young male is unambiguously black that figure goes up, about doubling.
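As a check on the arithmetic (a sketch only; the input percentages are the ones quoted above, and the rate multiple is computed relative to the population-wide average):

    #include <stdio.h>

    int main(void)
    {
        double black_share    = 0.1261; /* black fraction of U.S. population, 2010 census */
        double male_15_24     = 0.08;   /* males 15-24 within that population */
        double homicide_share = 0.525;  /* share of homicide offenders, DoJ 1980-2008 */

        double group = black_share * male_15_24; /* ~1% of general population */
        double with_mixed = 2.0 * group;         /* ~2% after adding "mixed" */

        printf("group share of population: %.1f%%\n", 100.0 * with_mixed);
        printf("rate multiple vs. average: %.0fx\n", homicide_share / with_mixed);
        return 0;
    }

which prints a share of about 2% and a multiple of about 26, matching the figures above.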

26 times more likely. That’s a lot. It means that even given very forgiving assumptions about differential rates of conviction and other factors we probably still have a difference in propensity to homicide (and other violent crimes for which its rates are an index, including rape, armed robbery, and hot burglary) of around 20:1. That’s being very generous, assuming that cumulative errors have thrown my calculations off by up to a factor of 6 in the direction unfavorable to my argument.

Now suppose you’re a cop. Your job rubs your nose in the reality behind crime statistics. What you’re going to see on the streets every day is that random black male youths are roughly 20 times more likely to be dangerous to you – and to other civilians – than anyone who isn’t a random black male youth.

Any cop who treated members of a group with a factor 20 greater threat level than population baseline “equally” would be crazy. He wouldn’t be doing his job; he’d be jeopardizing the civil peace by inaction.

Yeah, by all means let’s demilitarize the police. But let’s also stop screaming “racism” when, by the numbers, the bad shit that goes down with black male youths reflects a cop’s rational fear of that particular demographic – and not racism against blacks in general. Often the cops in these incidents are themselves black, a fact that media accounts tend to suppress.

What we can actually do about the implied problem is a larger question. (Decriminalizing drugs would be a good start.) But it’s one we can’t even begin to address rationally without seeing past the accusation of racism.

Author: "esr" Tags: "Politics, racism"
Date: Friday, 15 Aug 2014 09:47

One of the reasons I like cats is that I find it enjoyable to try to model their thought processes by observing their behavior. They’re like furry aliens, just enough like us that a limited degree of communication (mostly emotional) is possible.

Just now I’m contemplating a recent change in the behavior of our new cat, Zola. Recent as in the last couple of days. Some kind of switch has flipped.

When last I reported on Zola, about six weeks ago, he was – very gradually – losing his initial reserve around us, his behavior becoming more like Sugar’s. Not all that surprising in retrospect – those Maine Coon genes are telling.

In the last couple of days Zola has become markedly more affectionate and attention-seeking. He’s even taken to sleeping part of the night on the waterbed coverlet, which was something Sugar did that we liked (before a few days ago, he’d occasionally jump up but then skedaddle after less than a minute). We find it very restful to have a cat curled up nearby when we’re dozing off or wake up in the middle of the night.

I think I understand the long-term, gradual increase in affectionate behavior; Zola has been testing us and learning that we’re safe. I wish I understood the sudden jump. It’s as though we’ve moved to a different category in his representation of the world. It doesn’t feel like he’s testing us anymore; now he just cheerfully assumes that we love him and loves us right back. He’s happier and more relaxed – he’s almost stopped disappearing during the day (but is still more nocturnal than Sugar was).

The reason I’m writing about this is to invite speculation – or, better yet, reports from ethological studies – on what social classifications cats have other than “stranger”, “packmate”, and “kin”. Also, whether there’s any evidence for domestic cats putting humans in a close-kin category, or something else distinguishable from and more trusted than “packmate”.

Now, if we can just teach him not to sprawl where he might get stepped on. Without actually stepping on him *wince*…

Author: "esr" Tags: "Science, Zola"
Date: Tuesday, 12 Aug 2014 23:56

I shipped point releases of cvs-fast-export and reposurgeon today. Both of them are intended to fix some issues around the translation of ignore patterns in CVS and Subversion repositories. Both releases illustrate, I think, a general point about software engineering: sometimes, it’s better to punt tricky edge cases to a human than to write code that is doomed to be messy, over-complex, and a defect attractor.

For those of you new to version-control systems, an ignore pattern tells a VCS about things for the VCS to ignore when warning the user about untracked files. Such patterns often contain wildcards; for example, “*.o” is good for telling almost any VCS that it shouldn’t try to track Unix object files.

In most version control systems ignore patterns are kept in a per-directory dotfile. Thus CVS has .cvsignore, git has .gitignore, etc. Ignore patterns in Subversion are not kept in such a dotfile; instead they are the values of svn:ignore properties attached to directories.
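For illustration (with made-up patterns): a directory whose svn:ignore property has the value

    *.o
    core

corresponds, after conversion to git, to a .gitignore file in that same directory containing those same two lines. For simple wildcard patterns like these the translation is mechanical; the mess comes from the cases that don’t map one-to-one.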

Translating ignore patterns between version-control systems is messy enough that most conversion tools fluff it. My reposurgeon tool is an exception; it goes to considerable lengths to translate Subversion ignore properties into patterns in whatever kind of dotfile is required on the target system.

Unfortunately, this feature collides with git-svn. People using that tool to interact with a Subversion repository often create .gitignore files in the Subversion repository which are independent of any native svn:ignore properties it might have.

This becomes a problem when you try to convert the repo to git. In that case, .gitignore files created by git-svn users and .gitignore files generated from the native svn:ignore properties can step on each other in odd ways.

I’ve had a bug report about this in my inbox for a couple of months. Submitter innocently asked me to write logic that would automatically do the right thing, merging .gitignore patterns with svn:ignore patterns and throwing out duplicates. And somewhere in the back of my brain, a robot voice called out “WARNING, WILL ROBINSON! DANGER! DANGER!”

One of the senses you develop after writing complex software for a couple of decades is some ability to tell when a feature is going to be a defect attractor – a source of numerous hard-to-characterize bugs and a maintenance nightmare. That alarm rang very loudly on this one. But I was blocked for quite a while on the question of what, if any, simpler alternative to go for.

I resolved my problem when I realized that this challenge – merging the properties – will be both (a) uncommon, and (b) the sort of thing computers find difficult but humans find easy. Typically it would only have to be dealt with once in the aftermath of a repository conversion, rather than frequently as the repo is in use.

My conclusion was that the best behavior is to discard the hand-hacked SVN .gitignores, warning the user this is being done. It’s then up to the reposurgeon user to rescue any patterns that should be moved from the old hand-hacked .gitignores to the new generated ones.

Because the hand-hacked .gitignores are very often there just to duplicate what the native svn:ignore properties are doing, the user usually won’t have to do any work at all. The unusual cases in which that is false are the same unusual cases that automated merge code could too easily get wrong.

The general point here is that engineering is tradeoffs. Sometimes chasing really recondite edge cases piles up a lot of technical debt for only tiny gains.

The more subtle point is that if you don’t have any way at all to punt weird cases to a human, your software system may be brittle and overengineered – doing sporadic exceptional cases at a high life-cycle cost that a human could do cheaply and at a cumulatively lower defect risk.

This bears emphasizing because hackers have such a horror of manularity, going to extreme lengths to automate instead. Sometimes, doing that gets the tradeoff wrong.

Reposurgeon creates the option to get this right because it was designed from the beginning as a tool to amplify human judgment rather than trying to automate it entirely out of the picture. All other repository-conversion tools are indeed brittle in exactly the opposite way by comparison.

A similar issue arose with cvs-fast-export. I got a bug report that revealed a couple of issues in how it translates .cvsignore files to .gitignores in its output stream. Among other things, it writes a representation of CVS’s default ignore patterns into a synthetic .gitignore in the first commit. This is so users browsing the early history in the converted git repo won’t have untracked files jumping out at them that CVS would have kept quiet about.

With the report, I got a request for a switch to suppress this behavior. The right answer, I replied, was not to add that switch and some complexity to cvs-fast-export. Rather, I reminded the requester that he could easily delete that synthetic .gitignore from the history using reposurgeon. Then I added the command to do that to the list of examples on the reposurgeon man page.

The point, again, is that rushing in to code a feature would have been the wrong thing – programmer macho. Alternatively, we could view the cvs-fast-export/reposurgeon combination as an instance of the design pattern alternate hard and soft layers and draw a slightly different lesson; sometimes it’s better to manually exploit a soft layer than add an expensive feature to a hard one.

Author: "esr" Tags: "Software, reposurgeon"
Date: Tuesday, 12 Aug 2014 23:53

Unexpected Stories (Octavia Butler; Open Road Integrated Media) is a slight but revealing work; a novelette and a short story, one set in an alien ecology among photophore-skinned not-quite humans, another set in a near future barely distinguishable from her own time. The second piece (Childfinder) was originally intended for publication in Harlan Ellison’s never-completed New Wave anthology The Last Dangerous Visions; this is its first appearance.

These stories do not show Butler at her best. They are fairly transparent allegories about race and revenge of the kind that causes writers to be much caressed by the people who like political message fiction more than science fiction. The first, A Necessary Being, almost manages to rise above its allegorical content into being interesting SF; the second, Childfinder, is merely angry and trite.

The only real attraction here is the worldbuilding in A Necessary Being; Butler explores the possible social consequences of humanoids having genetic lines that differ dramatically in physical capabilities, mindset, and ability to express varying colors in the photophores that cover their skins. But having laid out the premise and some consequences, Butler never really gets to any moment of conceptual breakthrough; the resolution of the plot could have gone down the same way in any human tribal society with charismatic leaders. The counterfactual/SF premise is effectively discarded a half-step before it should have paid off in some kind of transformative insight that changes the condition of the world.

This illustrates one of the ways in which allegorical or political preoccupations can damage SF writers. A Necessary Being fails as SF because Butler was distracted by her allegorical agenda and forgot what she owed the reader. This is a particular shame because the story displays imagination and an ability to write.

Childfinder is not merely flawed, it is an actively nasty revenge fantasy. There’s very little here other than a thin attempt at justification for a black woman psychically crippling white children who might otherwise have become telepaths. The framing story is rudimentary and poorly written. It would probably be better for Butler’s reputation if this one had remained in the trunk.

Author: "esr" Tags: "Review, Science Fiction"
Date: Tuesday, 12 Aug 2014 07:18

The Gods of War (Graham Brown & Spencer J. Andrews; Stealth Books) is one of the better arguments for the traditional system of publishing-house gatekeepers I’ve seen recently. It is not merely bad, it is a stinkeroo that trumpets its awfulness from page one.

Straight up, a cabal of the shady super-rich in the year 2137 are told that the Earth’s ecology will collapse within about a year, and their maximum leader has a clever plan to evacuate them to…Mars. Which unspecified good guys have developed with a dream of turning it into (get this) an agricultural colony that can feed half the earth.

Nothing that calls itself SF but leads off with that much ignorance of the reality of Mars and the energy economics of space transport can possibly land anywhere good. Especially not when the prose is clumsy, the characters strictly cartoons, and the authors seem bent on writing a political allegory for which “hamfisted” and “stupid” will be among the mildest negative adjectives one could apply.

The only mercy in this book is that it is such an obvious waste of electrons that I was able to give up with a clear conscience on page twelve rather than forcing myself to slog through it all in hopes of finding some redeeming value. In brief: avoid.

Author: "esr" Tags: "Review, Science Fiction"
Date: Monday, 11 Aug 2014 03:44

I just got back from the 2014 World Boardgaming Championships in Lancaster, PA. This event is the “brain” half of the split vacation my wife Cathy and I generally take every year, the “brawn” half being summer weapons camp. WBC is a solid week of tournament tracks in strategy and war games of all kinds, with a huge open pickup gaming scene as well. People fly in from all over the planet to be here – takes less effort for us as the venue is about 90 minutes straight west of where we live.

Cathy and I aren’t – quite – steady world-championship-level players. I did make the Power Grid finals (top 5 players) two years ago, but have been unable to replicate that feat since. Usually we do make quarter-finals (top 64 to 125 players) or semi-finals (top 16 to 25) in a couple of our favorite games and then lose by the slimmest of margins to people just a hair better (or at least more obsessive) than we are. That’s pretty much what happened this year.

I’m not going to do a blow-by-blow of my whole week, but rather try to hit the dramatic high spots in a way I hope will convey something of the flavor to people who aren’t deeply immersed in tournament gaming. I think the best way to do that is to organize by game rather than chronology. The section titles link to game descriptions.

Terra Mystica

I’ve been enjoying this one a lot lately and was very pleased to be able to fit a pickup game in on the first night. Three to six players, 2.5-3 hours, fantasy-themed – contending factions with magical powers trying for interlocking levels of area control on a multicolored hex grid.

This game is strategically deep and tricky to play – very “crunchy” in gamer slang. Suits me fine; I like my games super-crunchy, which is an elite taste even among strategy gamers. If Terra Mystica becomes a WBC tournament game (which I think is extremely likely to happen within two years) a trophy in it will earn more respect than one in a lighter, “fluffier” game.

Some of you may be entertained to know that my joke name for this one is “Eclipse: The Gathering”. For the rest of you, this hints at similarities to a game (Eclipse) I often play, and another (Magic: The Gathering) that I used to play.

The one flaw the game seems to have is one that’s common in games with asymmetrical player powers; the factions aren’t quite balanced, with some chronically stronger or weaker than average (this sort of thing can slip through even careful playtesting sometimes). The Darklings, for example, are often said to be about the winningest side; my one time playing them I did very well, pulling a strong second.

This was about my fourth or fifth play of Terra Mystica. This time I drafted the Engineers – I’m trying to get around to playing every one of the 14 factions. I cannot recommend them. I had to work hard to pull second even though all the other players were less experienced than me; the Engineers have real trouble getting enough workers to expand, even though my first couple of actions were the expensive ones required to reduce my terraforming cost to 1 worker. Copping the bonus for most area controlled and maxing out the Air cult track helped a lot.

Commands & Colors: Ancients

I love ancient-period wargames. Phalanxes, peltasts, barbarians, war elephants – I actually prefer a straight historical to fantasy-themed stuff. I’d say my favorite single period is the wars of the Diadochi (lots and lots of war elephants, hurray!), but anything set from the late Bronze Age to the fall of the Western Roman Empire will easily catch my interest.

Commands & Colors: Ancients is, in my opinion, one of the best game systems ever devised for this span of time. While it’s not as crunchy as some of the older simulationist games like the PRESTAGS system, you get authentic results when you use period tactics. Knowing what historical generals did, and having some idea why, actually helps significantly. There are expansions and scenarios for hundreds of different historical battles.

Alas, the game is flawed for tournament play. The problem with it is that when two highly skilled players meet, they can counter each others’ tactics so well that the outcome comes down to who gets good breaks on the battle dice. I’m quite a good player, but I had skipped competing at WBC for the last few years because I found it too irritating to lose to the dice rather than my opponents.

This year, however, I had a hole in my WBC schedule where the C&C:A tournament was and decided to give it another try. The scenario was the battle of Bagradas, Carthaginians vs. Romans during the First Punic War. With elephants!

Three games later I had: one narrow loss to a player who afterwards shook his head and said “You played that very well, I just got better dice”; one solid win against a player not quite as good as me; and one heartbreaker of a loss to a player about my equal, where we both knew it came down to one failed die roll on my attempted final kill – which, by the odds, I should have made.

That wasn’t good enough to get me to the second round. And it was just about what I expected from my previous experience; the tournament game is a crapshoot in which it’s not enough to be good, you also have to be lucky. I prefer games in which, if there’s a random element at all, luck is nevertheless relatively unimportant.
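
For the probability-minded, here’s a toy calculation of just how swingy those battle dice are. This is a minimal sketch assuming the common case where each die hits the target on two of its six faces (the unit’s own symbol plus crossed swords); the dice counts are illustrative, not a reconstruction of my actual final attack.

    from math import comb

    def p_at_least(n_dice, k_hits, p_hit=2/6):
        """Chance of scoring at least k_hits on n_dice battle dice."""
        return sum(comb(n_dice, k) * p_hit**k * (1 - p_hit)**(n_dice - k)
                   for k in range(k_hits, n_dice + 1))

    print(f"{p_at_least(3, 1):.1%}")  # needing 1 hit from 3 dice: ~70.4%
    print(f"{p_at_least(4, 2):.1%}")  # needing 2 hits from 4 dice: ~40.7%

Even a 70% “sure thing” whiffs three times in ten; play enough rounds against an equal opponent and one of those whiffs will land on your tournament.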

I’ll probably sit out C&C:A next year.

Ticket To Ride

TTR is a railroad game in which you build track and connect cities to make victory points. It’s relatively fluffy, a “family game”, but has enough strategy so that serious gamers will play it as a light diversion when circumstances aren’t right for something crunchier.

I am difficult to beat at the Europe variant, which I like better than the American map; the geography creates more tactical complexities. In my first heat I curb-stomped the other three players, coming in 19 points ahead of second place and sweeping every scoring category and bonus.

The second heat looked like it was going to go the same way. I built both monster tunnels (St. Petersburg-Stockholm and Kyiv-Budapest) on successive turns for a 36-point power play, then successfully forced an early game end in order to leave the other players with unused trains (and thus unscored points). When we started endgame scoring, everyone at the table thought I had the win locked in.
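
Why 36 points: route scores in TTR grow steeply with length, which is what makes the monster tunnels worth fighting over. A quick sketch of the arithmetic, assuming the standard scoring table and the eight- and six-train lengths of those two tunnels:

    # Standard Ticket To Ride scoring: points per completed route, by length.
    ROUTE_POINTS = {1: 1, 2: 2, 3: 4, 4: 7, 5: 10, 6: 15, 8: 21}

    # St. Petersburg-Stockholm is an 8-train tunnel, Kyiv-Budapest a 6-train one.
    print(ROUTE_POINTS[8] + ROUTE_POINTS[6])  # 36

Note that two 4-train routes cost the same eight trains as one tunnel but score only 14; the long builds are where the leverage is.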

Sadly, in order to get rid of my own train tokens as fast as possible I had to give up on the longest-continuous-track bonus. Another player got it, and piled up just enough completed route bonuses to get past me by 1 solitary victory point. Hard to say which of us was more astonished.

After that, my schedule made it impossible to score the second win that would have guaranteed me a slot in the semifinals. But I was a high alternate and might have made it in anyway; I was just checking in with the GM when my wife ran in to tell me I’d squeaked into the Puerto Rico semifinals running at the same time – and that’s a game I take more seriously.

Ah well, maybe next time. I think none of the WBC regulars in this tournament would be very surprised if I took gold some year, if I’m not preoccupied with more serious games.

Puerto Rico

Puerto Rico was not quite the first of the genre now called “Eurogames”, but it was the first to hit big internationally back in 2002. The theme is colonization and economic development in the 16th-century Caribbean; you build cities, you run plantations, you trade, and you ship goods to Europe for victory points.

This game is to Eurogame as apple is to fruit, the one everyone thinks of first. It looks light on the surface but isn’t; it has a lot of subtlety and tremendous continuing replay value. It has outlasted dozens of newer, flashier games that had six months or a year of glory but now molder half-forgotten in closets.

My wife and I are both experienced and strong players. The WBC tournament referees and many of the past champion players know us, and we’ve beaten some of them in individual games. We seldom fail to make quarter-finals, and some years we make semi-finals. I think each of us can realistically hope for gold some year.

But maybe we’re not quite good enough yet. Cathy got two wins in the qualifying heats, good for a bye past the quarter-finals into the semis. I scored one utterly crushing victory at the only three-player table in the second qualifying heat, playing my default factory/fast-buildout strategy. Then, only a close second – but that made me second alternate (one of the guys I beat in that game was last year’s champion), and I got in because a couple of qualified players dropped out to do other things (like play in the Ticket To Ride semis I passed up to play in these).

Cathy pulled third in her game; she says she was outplayed. Me, I got off to a roaring start. Play order in the initial round is known from statistical analysis to be important, so much so that in championship play you bid competitively for it by agreeing to deduct victory points from your final score. I got fourth seat (generally considered second-best, after third) relatively cheaply for -1.5.

Usually I plan to play corn shipper when in fourth seat. But, due to the only random element in the game (the order in which plantation tiles come out) and some good money-accumulation moves, I managed to build and man a coffee roaster very early. That pointed me back at my default strategy, which aims at a fast city build-out with minimal shipping, using Factory as a money generator – one coffee crop comes close to paying for the (expensive) Factory.

Damned if it didn’t work perfectly. I had the only coffee in production, which scared other players off of triggering production, particularly stymying the bulk shippers. For most of the game it looked to everyone at the table like I was cruising to an easy win. There were admiring remarks about this.

The one drawback of this strategy, however, is that it has a slow ramp-up. You make most of your points quite near the end of the game through big buildings. If anyone can force game end before you’re expecting it, you take a bigger hit to your score than shippers who have been making points from the beginning.

That’s how I got whacked. There were these three or maybe four guys down from Quebec specifically for the Puerto Rico tournament; gossip said they weren’t playing anything else. One of them – Matthieu, I think his name was – was sitting to my left (after me in the turn order) and pulled a brilliant hack that shortened the game by at least two rounds, maybe more. Doing this deprived me of the last, crucial rounds of build-out time when I would have pulled down the biggest points.

Those of you who play the game know that one way to accelerate the end is to deliberately leave buildings unmanned, so that they suck colonists off the colony ship faster; when the colonist supply runs out, the game is over. There’s a recently discovered ambiguity in the rules that makes this tactic work much faster – it turns out that someone playing Mayor is allowed, under a strict reading, to refuse to take his colonists, casting them into the void and leaving his buildings empty to pull more out of the boat on the next round.
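
To see why this shortens the game so much, here’s a deliberately crude model. The rule, as I recall it, is that the ship refills with one colonist per empty building slot across all players, with a floor equal to the player count, and that the supply is about 75 colonists in a four-player game. Every number below is illustrative, not a reconstruction of our actual game:

    # Toy model: rounds until the colonist supply is exhausted, assuming
    # the Mayor phase happens about once per round and the ship refills
    # with max(empty building slots, number of players) colonists.
    def rounds_until_empty(supply, empty_slots, n_players=4):
        rounds = 0
        while supply > 0:
            supply -= max(empty_slots, n_players)
            rounds += 1
        return rounds

    print(rounds_until_empty(75, 5))  # normal play: ~15 rounds
    print(rounds_until_empty(75, 9))  # vanishing-colonist hack: ~9 rounds

Refusing delivered colonists keeps slots empty round after round, so the supply bleeds out several rounds early – exactly the rounds a slow-ramp strategy like mine needed most.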

The resulting vanishing-colonist play may be a bug produced by poorly crafted rules or a bad translation from the original German (wouldn’t be the first time that’s been a problem, either). The tournament referee is not happy with it, nor are the WBC regulars – it screws with the “book”, the known strategies, very badly. The ref intends to brace the game’s designer about this, and we may get a rules amendment disallowing the play next year.

In the meantime, nobody could argue that the guys from Quebec weren’t within their rights to exploit this hack ruthlessly. And they did. Three of them used it to finish at the finals table. Matthieu, the one that dry-gulched me, took the gold.

There was a lot of … I won’t say “angry”, but rather perturbed talk about this. I wasn’t the only person to feel somewhat blindsided and screwed (though we also admired their nerve and dedication). These guys were monomaniacs; unlike most top WBC gamers, who (like me) play up to half a dozen games very well, the Quebeckers were laser-focused on this one game and studied it to the point where they found the hack that would break the standard book.

Sigh…and that’s why no trophy for me this year. (Everyone in the final four would have gotten one.) Cathy and I will try again. Nobody would be surprised at either of us making the finals, but it could take a few years’ more practice.

Author: "esr" Tags: "Games"
Date: Wednesday, 06 Aug 2014 04:36

Nexus (Nicholas Wilson; Victory Editing) is the sort of thing that probably would have been unpublishable before e-books, and in this case I’m not sure whether that’s good or bad.

There’s a lot about this book that makes me unsure, beginning with whether it’s an intentional sex comedy or a work of naive self-projection by an author with a pre-adolescent’s sense of humor and a way, way overactive libido. Imagine Star Trek scripted as semi-pornographic farce with the alien-space-babes tropes turned up to 11 and you’ve about got this book.

It’s implausible verging on grotesque, but some of the dialog is pretty funny. If you dislike gross-out humor, avoid.

Author: "esr" Tags: "Review, Science Fiction"