To avoid arguments that are more about language than photographic technique, let us agree to use words like "exposure" in the standard, agreed sense ("the amount of light reaching the sensor") and move on to the practical issues.
Glad you agree with me on this point, BJL, because we have had a few disagreements in the past.
Another issue which I think is also important is getting a grasp on the practical significance of any differences of exposure in conjunction with differences in ISO settings.
One may understand in terms of a theoretical concept that A is better than B, but in order to determine the practical significance of such differences, one might need to do one's own experiments and comparisons, rather than just accept the theoretical model and behave accordingly.
An example would be the way one uses a Canon 50D. This camera is specified as having an ISO range from 100 to 12,800, with ISOs above 3200 being expanded. What this means is that there is nothing to be gained, as regards fundamental image quality, by using ISO 12,800.
If one uses the same exposure (as defined by f/stop and shutter speed) at ISO 3200 as the exposure one would consider correct, or optimal, or ETTR at ISO 12,800, then the camera's LCD screen and histogram will show an apparent underexposure of 2 stops.
The image will look dark, but after appropriate processing in the RAW converter the ISO 3200 shot should have shadow detail just as good as the ISO 12,800 shot. Or, to be more precise, just as bad.
So what are the practical consequences here, one might ask. As I see it, the ISO 12,800 shot produces a review image on the camera's LCD screen which can be assessed more easily. Have you captured what you want, or do you have to take another shot?
The ISO 3200 image will likely be a bit hopeless for such assessments. The main advantage will be that there is far less risk of blowing highlights unintentionally if one uses the same exposure at ISO 3200.
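The relationship described above can be sketched numerically. Here is a minimal Python sketch, assuming (hypothetically) that the 50D's expanded ISO 12,800 is simply the ISO 3200 raw data multiplied by 4 in firmware, clipped at an assumed 14-bit white point; the specific raw values are made up for illustration:

```python
import numpy as np

WHITE_POINT = 16383  # assumed 14-bit raw white point

def push_two_stops(raw):
    """Simulate an 'expanded' ISO: multiply the raw values by 4
    (two stops) and clip at the white point. No new photon
    information is created, and bright values can be destroyed."""
    return np.clip(raw * 4, 0, WHITE_POINT)

# Hypothetical raw values from an ISO 3200 shot taken at the same
# f/stop and shutter speed as the ISO 12,800 shot.
iso3200_raw = np.array([100, 500, 2000, 4000, 5000])
print(push_two_stops(iso3200_raw))
# The last value (5000) clips to the white point after the 2-stop
# push, while it survives intact in the ISO 3200 file - which is
# why the lower ISO setting carries less risk of blown highlights.
```

The same multiplication is what the RAW converter does when you brighten the dark-looking ISO 3200 file by 2 stops, which is why the shadow detail ends up equally good (or bad) either way.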
Another example of the importance of conducting one's own tests, illustrated by the peculiarities of the 50D, is the practical significance of using the ISO 100 setting as opposed to ISO 200 on the 50D. There's no doubt that the same exposure (as defined by f/stop and shutter speed) at ISO 200 will produce lower shadow noise than you would get at ISO 100. But what happens if we compare an ETTR exposure at ISO 100 with an 'apparent' ETTR exposure at ISO 200? In these circumstances the sensor receives twice the amount of light at ISO 100. Shadow detail must surely be better.
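That expectation follows from simple shot-noise arithmetic. A hedged sketch, assuming pure Poisson photon noise (read noise ignored) and made-up photon counts:

```python
import math

def shot_noise_snr(photons):
    """Photon arrivals are Poisson-distributed, so for a mean count
    N the noise is sqrt(N) and the SNR is N / sqrt(N) = sqrt(N)."""
    return math.sqrt(photons)

# Hypothetical shadow patch collecting 100 photons in the ISO 200
# ETTR shot; the ISO 100 ETTR shot receives twice the light.
snr_iso200 = shot_noise_snr(100)  # 10.0
snr_iso100 = shot_noise_snr(200)  # ~14.1
print(snr_iso100 / snr_iso200)    # a factor of sqrt(2) ~ 1.414
```

On this naive model, doubling the light should make the ISO 100 shadows cleaner by a factor of root 2, which is why the test result described below came as such a surprise.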
For a whole year or more after buying a Canon 50D, taking photos in places like museums, churches and art galleries where flash and tripods were not allowed, I sometimes struggled to hold the camera steady. I often preferred to use ISO 100 with the widest aperture on the lens instead of its sharpest aperture, thus compromising resolution at least a bit, because I imagined I'd be getting better shadow detail and better SNR at ISO 100 with the 50D.
Some time later, after DXO had made its test results freely available to the public, and while I was in the market for an upgraded camera, I happened to compare the performance of the Canon 50D with other more recent models, using the DXOMark graphs for ISO sensitivity, SNR, DR, etc.
I got quite a surprise when I noticed that DXO seemed to be claiming that the ISO sensitivity of the 50D at ISO 100 is the same as at ISO 200, and that as a consequence their graphs for SNR, DR, etc. did not even include results for ISO 100.
They seemed to be implying that ISO 200 is the real base ISO for the 50D, and that ISO 100 is an 'expanded' ISO, as it is on the Nikon D3 and D700. Yet there is no mention of this in the Canon handbook and no menu setting for expanding ISO 200 down to ISO 100.
Of course, my immediate reaction was not to fall off my chair, but to go out and take a few shots of high-dynamic-range scenes, at ISO 100 and 200 for the purpose of comparison.
I was very surprised to discover that I could not see any significant difference at all between shots taken at twice the shutter speed at ISO 200 and shots taken at half the shutter speed at ISO 100.
At half the shutter speed, at ISO 100, the sensor receives twice the number of photons. What has happened to those photons, I asked myself? Why are the highlights in the ISO 100 shots not blown when the shots at both ISOs are ETTR? And if the highlights are not blown in either shot, why does the ISO 100 shot not have cleaner shadows? What electronic processes has Canon employed? It's as though they have created the equivalent of an electronic neutral density filter.
I'm completely mystified as to what's going on here, and a bit annoyed with myself for not discovering this situation sooner. I've got lots of photos taken in places like the Hermitage in St Petersburg, which I now realise could have been a bit sharper if I'd used either a faster shutter speed at ISO 200, or an aperture of f/4 at ISO 200 instead of f/2.8 at ISO 100.