There are different standards out there, but I don't understand why you guys are stressing over this. It's actually pretty simple now.
sRGB and Rec.709 define the same RGB primaries, and that's what you should calibrate to: it's the standard for images and the standard for HDTV. Their gamma differs slightly, but just aim for 2.2 to keep it simple; if you use something like madVR it will take care of the conversion for you based on the source information. 6500K (D65) is the white point / colour temperature to aim for.
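To show what "differs slightly" means in practice, here's a quick Python sketch (just an illustration, not taken from madVR or any calibration tool) comparing the official sRGB piecewise transfer curve against the plain 2.2 power law people calibrate to:

def srgb_to_linear(c):
    # Official sRGB decode: a linear segment near black,
    # then a 2.4-exponent power section above it
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def gamma22_to_linear(c):
    # Simple power-law decode, the usual 2.2 calibration target
    return c ** 2.2

# Compare the two decoded values at a few sample points (0..1 scale)
for c in (0.05, 0.25, 0.50, 0.75, 1.00):
    print(f"encoded={c:.2f}  sRGB={srgb_to_linear(c):.4f}  gamma2.2={gamma22_to_linear(c):.4f}")

Run it and the two columns differ by less than about 0.01 everywhere, which is why "just aim for 2.2" is close enough for normal viewing.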
As I said, those standards don't define a brightness, because brightness depends on ambient light. If you have sun shining onto your screen you'll need more brightness or the image gets washed out. In dark/controlled environments it's generally accepted that 100-120 cd/m² is the "standard" (and will keep eye strain away!).
The rest comes down to how the Blu-ray was produced. If they want you to watch a crazy oversaturated film, that's what they intended; if it's muted and less contrasty, that's what they intended as well. You shouldn't go changing settings to pump up a muted film just because you find the look boring; it wasn't what was intended.
So: one standard, sRGB/Rec.709. Forget the rest.