I am researching sensitivity, EI (exposure index), and related topics.
In the film era, sensitivity was described with the ASA and DIN numerical systems, and that numbering carried over into the ISO ratings still used on digital cameras today.
How do film, digital cameras, tape-based media, etc. define sensitivity so that the same settings produce a comparable brightness?
According to the ISO definition, in the days of black-and-white negative film the speed of a film could be calculated from its H&D (characteristic) curve.
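For reference, my current understanding of the black-and-white case (ISO 6, if I am reading the standard correctly) is that the speed point is read directly off the characteristic curve, roughly like this (the curve data below is invented just to show the calculation, not a real measurement):

```python
# Rough sketch of the ISO 6-style speed point for B&W negative film,
# using a made-up characteristic (H&D) curve: density vs. log10 exposure.
# Speed S = 0.8 / Hm, where Hm is the exposure (lux-seconds) at which
# density first rises 0.10 above base+fog; the standard also requires a
# specific development contrast, which this illustrative curve ignores.
import numpy as np

log_H = np.linspace(-3.5, -0.5, 301)                       # log10 exposure (lx*s)
density = 0.15 + 1.6 / (1 + np.exp(-3.0 * (log_H + 1.5)))  # invented S-shaped curve

base_plus_fog = density.min()
idx = np.argmax(density >= base_plus_fog + 0.10)           # first point 0.10 above fog
Hm = 10 ** log_H[idx]
print(f"Hm = {Hm:.4g} lx*s -> speed = 0.8 / Hm = {0.8 / Hm:.0f}")
```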
Modern digital cameras use a variety of OETFs and sensors, so how do camera vendors define sensitivity so that the same value gives similar brightness results?
How is the sensitivity value determined from the results of shooting at a given shutter speed, iris, scene illumination, and so on?
I believe exposure meters have used the same calculation method since the film days.
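My working assumption is that handheld meters still follow the ISO 2720 relationship between luminance, aperture, shutter time, and speed; here is a small sketch of how I picture it (the calibration constant K and the sample numbers are my own assumptions):

```python
# Rough sketch of the reflected-light meter relationship (ISO 2720):
#   N^2 / t = L * S / K
#   N = f-number, t = shutter time (s), L = scene luminance (cd/m^2),
#   S = ISO speed, K = meter calibration constant (commonly 12.5 or 14).
import math

def implied_iso(N, t, L, K=12.5):
    """Solve the meter equation for the S that makes (N, t) a 'correct' exposure."""
    return K * N**2 / (t * L)

def ev(N, t):
    """APEX exposure value: EV = log2(N^2 / t)."""
    return math.log2(N**2 / t)

# Example: f/4 at 1/50 s on a scene of 100 cd/m^2 implies roughly ISO 100.
print(f"S = {implied_iso(4.0, 1/50, 100.0):.0f}, EV = {ev(4.0, 1/50):.1f}")
```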
In particular, what definition of sensitivity (EI) does ARRI use?
I know this is a difficult question to phrase clearly, but if you have any information, please share it.
Thank you.