Interesting to see what happens when two “well-liked” communities clash.
Background: Google released a web accelerator, which follows the HTTP specs (AFAICT), but wreaks havoc with a category of Web 2.0-style apps which, for convenience, ignored the part of the spec having to do with the idempotency of GETs (see Sam Ruby on the topic).
This web accelerator, because it runs in the browser, hence as the logged-in user, drills into sites which, while not spec-compliant, were considered “OK” because the rest of the web infrastructure (spiders, caching servers, etc.) wasn’t logged in, hence wasn’t exposing the flaw.
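The failure mode can be sketched in a few lines. This is a toy simulation, not any real app or Google’s actual accelerator: the routes, handler, and `prefetch` function are all hypothetical, but they show why an app that deletes on GET loses data the moment a link-prefetcher runs with the user’s session.

```python
# Toy sketch (hypothetical routes/names) of why a link-prefetching
# accelerator destroys data when an app wires destructive actions to GET.

# A tiny in-memory "app": items keyed by id.
items = {1: "draft", 2: "invoice", 3: "photo"}

def handle(method, path):
    """Dispatch a request. The unsafe route deletes on GET;
    the safe route deletes only on POST."""
    if path.startswith("/unsafe/delete/"):
        items.pop(int(path.rsplit("/", 1)[1]), None)  # destructive, even on GET!
    elif path.startswith("/safe/delete/"):
        if method == "POST":  # per the HTTP spec, GET must be safe
            items.pop(int(path.rsplit("/", 1)[1]), None)

def prefetch(links):
    """A web accelerator: speculatively GETs every link on the page,
    with the logged-in user's credentials."""
    for link in links:
        handle("GET", link)

# A page rendered with GET-based delete links: prefetching wipes the data.
prefetch([f"/unsafe/delete/{i}" for i in items])
print(items)  # {} -- everything deleted just by prefetching

# Restore, then render the same page with POST-only deletes: data survives.
items.update({1: "draft", 2: "invoice", 3: "photo"})
prefetch([f"/safe/delete/{i}" for i in items])
print(items)  # intact: destructive work requires an explicit POST
```

The fix the spec implies is the second pattern: anything destructive goes behind a POST (a form or button), so that spiders, caches, and accelerators blindly following GET links can do no harm.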
Some Ruby on Rails apps by the 37Signals crew were vulnerable, with the web accelerator causing data loss.
Google is following the spec; the Rails folks say “yeah, but the world hasn’t been following the spec for years”. A typical prescriptive vs. descriptive argument, but in a community which has stuck to standards over convenience for a decade (in great part in a battle to the death with Microsoft).
It’s quite a fascinating debate, one that Sam predicted. Myself, I hope that Google adjusts the program to be less destructive (for the sake of the users) but sticks to the principle of the sanctity of the spec, and that the Rails folks use their considerable smarts to find a way to route around the limitations of the spec.