Thanks for posting those, Tao! I do see a difference in the gradients, but the second image also appeared less saturated (I know, many variables). Also, I'm not disputing that people seem to be "pulling" more detail out of the highlights in RAW files; I just haven't been able to locate any empirical data showing it quantitatively.
Another significant difference between the two, as I understand it, is that a preset gamma is applied to the image's characteristic curve in a .JPG capture, while in a RAW capture it can be re-determined in post. This, too, is likely a key argument for shooting RAW. With the exception of RAW-based motion-video cameras like RED, most modern broadcast video cameras make many of these parameters field-adjustable (black level, gamma, knee, etc.). I guess there isn't really a parallel level of pre-capture control over the characteristic curve in even top-line DSLRs yet; a rough sketch of what "deferring the curve to post" means follows below.
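To make that concrete, here's a minimal sketch of applying an adjustable tone curve to linear data after the fact, assuming the RAW has already been demosaiced to linear floating-point values in [0, 1]. The black_level, gamma, knee_point, and knee_slope parameters are just my illustrative stand-ins for the kinds of controls a broadcast camera exposes, not any real camera's settings:

```python
import numpy as np

def apply_tone_curve(linear, black_level=0.0, gamma=1.0 / 2.2,
                     knee_point=0.8, knee_slope=0.25):
    """Map linear scene values in [0, 1] through an adjustable curve."""
    # Subtract the black level and re-normalize what remains.
    x = np.clip(linear - black_level, 0.0, None) / (1.0 - black_level)
    # Simple power-law gamma for the main body of the curve.
    y = np.power(x, gamma)
    # Above the knee point, compress the highlights with a gentler slope.
    over = y > knee_point
    y[over] = knee_point + (y[over] - knee_point) * knee_slope
    return np.clip(y, 0.0, 1.0)

# A JPG commits to one curve at capture; with RAW you can try several in post.
scene = np.linspace(0.0, 1.0, 4096)   # stand-in for 12-bit linear sensor data
flat = apply_tone_curve(scene, gamma=1.0 / 2.0, knee_slope=0.5)
punchy = apply_tone_curve(scene, gamma=1.0 / 2.4, knee_slope=0.15)
```

The point of the sketch is only that, with linear data in hand, the whole curve remains a free choice; a JPG has already baked one particular choice into its 8-bit values.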
Also, the key argument regarding bit-depth that got my attention was this one:
"JPG has 8 bits per color per pixel and raw may have 12 bits, but here's the big catch: raw is 12 bit linear, and JPG is 8 bit log, gamma corrected or some other non-linear transform derived from the 12 bit linear data." --that "Ken" guy.
While I understand the basic concept of bit-depth, this statement seems to imply that the in-camera gamma encoding achieves "useful" data compression in and of itself--possibly a "good" thing, depending on the definition of "good" here. Now I don't mean to start another heated debate about this; I just want to vet some of these statements so I fully understand what's actually going on and what, precisely, I'm giving up.
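Here's a back-of-the-envelope sketch of what I take that statement to mean. The 1/2.2 gamma and the idealized 12-bit linear ramp are my own assumptions, not any particular camera's pipeline; it just compares how many distinct 8-bit codes survive in the shadows versus the highlights under linear and gamma-encoded quantization:

```python
import numpy as np

# 4096 linear code values, as if from an idealized 12-bit sensor read-out.
linear12 = np.arange(4096) / 4095.0

# Path A: naive 8-bit *linear* quantization (hypothetical, for comparison).
lin8 = np.round(linear12 * 255).astype(np.uint8)

# Path B: 8-bit after a 1/2.2 gamma encode, roughly what a JPG pipeline does.
gam8 = np.round(np.power(linear12, 1 / 2.2) * 255).astype(np.uint8)

def distinct_levels(codes, lo, hi):
    """Count how many distinct 8-bit codes cover a slice of the linear range."""
    mask = (linear12 >= lo) & (linear12 < hi)
    return len(np.unique(codes[mask]))

# Darkest 10% of the scene: gamma encoding preserves far more distinct steps.
print("shadows   (0-10%):", distinct_levels(lin8, 0.0, 0.1), "vs",
      distinct_levels(gam8, 0.0, 0.1))
# Brightest 10%: the gap closes, and linear quantization may even edge ahead.
print("highlights (90-100%):", distinct_levels(lin8, 0.9, 1.0), "vs",
      distinct_levels(gam8, 0.9, 1.0))
```

If my reading is right, the "useful compression" is that the gamma encode spends most of its 256 codes on the shadows and midtones, where our eyes are most sensitive, while the 12-bit linear file keeps its full precision everywhere, including the highlights, which would square with the highlight-recovery claims above.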
Lastly, sRGB vs. Adobe RGB: I am also choosing to shoot in Adobe RGB, since I will always be aware that I'm in Adobe RGB in post, and I will only be sending files to Adobe RGB-aware printing devices.