Was checking some specs on film and realized that film grain is not measured in microns. Instead, a stat called 'rms' is used for slide film. Kodak's Ektachrome 100G is said to have 8 rms grain, while E100VS is listed at 11 rms. I gather that 'rms' is determined by reading a uniformly exposed patch through a 48-micron aperture and measuring the density fluctuations somehow.
To top that, Kodak has abandoned rms for its new Ektar negative film and instead uses something called 'print grain index', which is print-size dependent -- not measured on the film at all.
How does this compare, or can it compare, with a micron measurement, and why can't we simply state the grain size in microns?
So you know, I was hoping to compare in-camera pixel size with film graininess.
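For what it's worth, the pixel side of that comparison is just arithmetic: sensor width divided by horizontal pixel count gives the pixel pitch in microns. Here's a quick sketch -- the sensor numbers are made-up round examples, not any specific camera:

```python
def pixel_pitch_microns(sensor_width_mm, horizontal_pixels):
    """Pixel pitch in microns: sensor width divided by pixel count."""
    return sensor_width_mm * 1000 / horizontal_pixels

# Hypothetical full-frame-ish example: 36 mm wide, 6000 pixels across
print(pixel_pitch_microns(36.0, 6000))  # 6.0 microns
```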
wow, ok, this post here explaining grain measurement is uber-propellerhead: http://www.answers.com/topic/film-grain
I'm guessing that it's not particularly useful to measure grain size in microns because it varies so much within a sample, plus the problems with the dye layers and such that you mention. The rms method looks like it reads density over an area by measuring the light passing through it, generalizing across the sample -- sort of like describing soil particle size by measuring the rate that water passes through it.
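If it helps, here's a rough sketch in Python of what I understand the rms computation to be: take many density readings over a uniformly exposed patch (through that 48-micron aperture) and report 1000 times their standard deviation. The density values below are simulated noise, not real film data:

```python
import math
import random

def rms_granularity(density_readings):
    """RMS granularity: 1000 x the standard deviation of density samples."""
    n = len(density_readings)
    mean = sum(density_readings) / n
    variance = sum((d - mean) ** 2 for d in density_readings) / n
    return 1000 * math.sqrt(variance)

# Simulate readings around a mean density of 1.0 with small fluctuations;
# a std deviation of 0.008 should land near 8, like E100G's rating.
random.seed(42)
readings = [1.0 + random.gauss(0, 0.008) for _ in range(1000)]
print(round(rms_granularity(readings)))
```

So a perfectly uniform patch would score 0, and grainier film gives bigger density swings and a bigger number -- which is why it doesn't translate directly into a particle size in microns.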
This image from that link does, however, show a micron range in the notes. (It's from the early 1900s.)
As for your quest, I pretty much agree with the other posts: it's ultimately not going to tell you much, although it is kind of interesting to compare two completely different processes.