Good news - I'm working on another book!
In the meantime, here's an interesting and forgotten page from the history of JavaScript that I stumbled upon thanks to Tavis. It's a long read, edited a bit for clarity; it's also a fascinating account of how close we came to replacing the same-origin policy - its faults notwithstanding - with something much worse:
"The security model adopted by Navigator 2.0 and 3.0 is functional, but suffers from a number of problems. The [same-origin policy] that prevents one script from reading the contents of a window from another server is a particularly draconian example. This [policy] means that I cannot write a [web-based debugger] and post it on my web site for other developers to use [in] their own JavaScript. [Similarly, it] prevents the creation of JavaScript programs that crawl the Web, recursively following links from a given starting page."
"Because of the problems with [the same-origin policy], and with the theoretical underpinnings of [this class of security mechanisms], the developers at Netscape have created an entirely new security model. This new model is experimental in Navigator 3.0, and may be enabled by the end user through a procedure outlined later in this section. The new security model is theoretically much stronger, and should be a big advance for JavaScript security if it is enabled by default in Navigator 4.0."
"[Let's consider] the security problem we are worried about in the first place. For the most part, the problem is that private data may be sent across the Web by malicious JavaScript programs. The [SOP approach] patches this problem by preventing JavaScript programs from accessing private data. Unfortunately, this approach rules out non-malicious JavaScript [code that wants to use this data] without exporting it."
"Instead of preventing scripts from reading private data, a better approach would be to prevent them from exporting it, since this is what we are trying to [achieve] in the first place. If we could do this, then we could lift most of the [restrictions] that were detailed in the sections above."
"This is where the concept of data tainting comes in. The idea is that all JavaScript data values are given a flag. This flag indicates if the value is "tainted" (private) or not. [Tainted values will be prevented] from being exported to a server that does not already "own" it. [...] Whenever an attempt to export data violates the tainting rules, the user will be prompted with a dialog box asking them whether the export should be allowed. If they so choose, they can allow the export."
Of course, tainting would not have prevented malicious JavaScript from relaying the observations about the tainted data to parties unknown.