
Wednesday, July 19, 2017

HDR, EDR, wide gamuts (DCI-P3), 10 bits per RGB/transparency channel, and screens


"HDR (High Dynamic Range)" video describes pixels using three different criteria:

  • Greater resolution (“more pixels”)
  • Greater color saturation (“fatter pixels”)
  • Greater shadow and highlight range (“brighter pixels”); only this criterion applies to classic HDR.

DCI P3 (Digital Cinema Initiative P3) falls into the “greater color saturation” section. It sets the white point at 6500 K (D65) and is more saturated than traditional HD video, but not as deeply saturated as the ultimate goal, Rec. 2020. Apple calls this “wide color,” or “wide color gamut,” media.

None of the new Macs has an HDR screen. They offer a wider color gamut and higher resolution than previous-generation Macs, but they are not HDR.

DCI-P3 timeline

DCI-P3 was defined by the Digital Cinema Initiatives (DCI) organization and published by the Society of Motion Picture and Television Engineers (SMPTE) in SMPTE EG 432-1 and SMPTE RP 431-2.
In September 2015, Apple's iMac desktop became the first consumer computer with a built-in wide-gamut display, supporting the P3 color space.
Normal white-LED backlights produce a lot of blue light, but much less red and green. The late-2015 iMacs use the fairly recent GB-r LED backlight technology, which adds much stronger green and red spectral components to match the already strong blue. The larger gamut can reproduce more vivid greens and reds, and their derivatives (cyan, yellow, magenta). http://www.astramael.com/
In August 2016, the Samsung Galaxy Note 7 shipped with an HDR display with 100% DCI-P3 color gamut.
https://en.wikipedia.org/wiki/DCI-P3
In September 2016, Apple's iPhone 7 shipped with a wide-gamut display, supporting P3.
In October 2016, Apple's new MacBook Pro notebook computers were released with P3 displays.

DCI-P3

Traditionally, the folks who cared the most about extended color spaces were professional still photographers. This is because, in the past, HD video produced significantly lower-resolution images: a 1080p frame contained 2.1 million pixels at 8-bit depth, while a 4×5-inch digital negative could contain up to 40 million pixels at 16-bit depth.
In the world of still photos, Adobe RGB (and its close cousin sRGB) reigned supreme as the color space of choice because of its color range and dynamics.
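As a rough sanity check on the figures above, here is a back-of-the-envelope sketch (in Python) comparing the data in a 1080p video frame against a 40-megapixel, 16-bit scan; the pixel counts and bit depths are the ones quoted in the text.

```python
# Back-of-the-envelope comparison of the two formats mentioned above.
width, height, bit_depth, channels = 1920, 1080, 8, 3

hd_pixels = width * height                       # one 1080p frame
hd_bytes = hd_pixels * channels * bit_depth // 8

scan_pixels = 40_000_000                         # a 4x5-inch digital negative
scan_bytes = scan_pixels * channels * 16 // 8    # at 16-bit depth

print(hd_pixels)               # 2073600, the "2.1 million pixels"
print(scan_bytes // hd_bytes)  # the negative carries ~38x more raw data
```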
From what I have read, the DCI-P3 color space is optimized for projection systems, not for monitors. Why? Because P3 was designed for use by digital cinema projectors in theaters.
This means that many photographers are puzzled about why Apple chose a video standard for its monitors rather than a stills standard. Apple hasn't responded to these questions, but my guess is that consumers are embracing video in all aspects of their lives, both socially and professionally, and it makes sense to support them with greater video quality. Also, the P3 spec has more reds and warmer colors than Adobe RGB.
https://larryjordan.com/articles/configure-the-new-macbook-pro-to-p3-color-space/

To view or change the color space setting for your monitor, open System Preferences > Displays and click the Color tab. The top option, Color LCD, isn't labeled "P3," but that is what it is: compare the Color LCD setting with the Display P3 setting and you'll see they are essentially identical.

There’s a great way to explore the differences between the Adobe RGB, HD (Rec. 709), P3, and HDR (Rec. 2020) color spaces. It’s called ColorSync Utility, and it’s already installed on your system.
Open the Utilities folder inside Applications (shortcut: Shift + Cmd + U), then open ColorSync Utility. This program allows you to view and modify color profiles.
Click the Display P3 profile and, on the right, you’ll see a visual representation of the colors that can be displayed in the P3 color space, illustrated above.
One of the hidden features of ColorSync is the ability to compare color spaces, so you can easily see the differences between them.

macOS

Sierra & EDR (translated from French)

macOS Sierra supports a pseudo-HDR technology that Apple calls EDR.
The idea behind this pseudo-HDR is to increase contrast strongly in order to obtain more natural-looking images. The typical method, as implemented by Apple, is to lower the black level and sharply raise the screen's brightness to create a dazzling effect. Currently, HDR is mostly used for films: Netflix (in certain cases, such as the series Marco Polo) and some Ultra HD Blu-rays offer effects of this kind. HDR improves immersion with a genuinely dazzling sun, more realistic lighting, and so on.

Support in macOS Sierra seems limited for the moment: it only works with OpenGL or Metal (Apple's in-house graphics API), and only on the Retina 5K 27-inch iMacs. We can assume that video will take advantage of this capability in the final release, along with 10-bit color, wide-gamut panels (even though those are already supported) and, I hope, the HEVC codec (a.k.a. H.265).

http://www.journaldulapin.com/2016/07/07/macos-sierra-supporte-le-hdr/
http://www.journaldulapin.com/2017/04/25/wide-gammut-os-x/

To keep it simple: the computing world generally works with colors encoded on 8 bits, i.e. 256 possible values for each channel (red, green, blue, plus transparency, for 32 bits in total). Among monitors, the first LCDs worked with 6 bits; modern models obviously use 8 bits, but for a few years now 10-bit models (1,024 values per channel) have been arriving on the market. Support was added in OS X El Capitan (and obviously macOS Sierra), so it is now possible to connect a "10-bit" display to a Mac (a true HDR screen).
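A minimal sketch of why those extra bits matter for gradients: quantizing to 8 bits gives 256 codes per channel while 10 bits gives 1,024, so two nearby shades that collapse to the same 8-bit code (visible banding) can remain distinct at 10 bits. The `quantize` helper below is hypothetical, written just for this illustration.

```python
def quantize(value, bits):
    """Quantize a normalized [0.0, 1.0] value to an integer code."""
    levels = (1 << bits) - 1          # 255 for 8-bit, 1023 for 10-bit
    return round(value * levels)

# Codes available per channel:
print(1 << 8)    # 256 values  -> 8-bit
print(1 << 10)   # 1024 values -> 10-bit

# A subtle gradient step that 8 bits cannot distinguish but 10 bits can:
a, b = 0.500, 0.502
print(quantize(a, 8) == quantize(b, 8))    # True: same 8-bit code (banding)
print(quantize(a, 10) == quantize(b, 10))  # False: distinct 10-bit codes
```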

Video card

On a 2016 MacBook Pro, the video card is an Intel Iris Graphics 540.
https://www.notebookcheck.net/Intel-Iris-Graphics-540.149939.0.html
The revised video engine now decodes H.265/HEVC completely in hardware and thereby much more efficiently than before. Displays can be connected via DP 1.2 / eDP 1.3 (max. 3840 x 2160 @ 60 Hz), whereas HDMI is limited to the older version 1.4a (max. 3840 x 2160 @ 30 Hz). However, HDMI 2.0 can be added using a DisplayPort (DP) converter. Up to 3 displays can be controlled simultaneously.
https://www.intel.com/content/www/us/en/support/graphics-drivers/000022440.html

Safari accepts 10-bit images

The point of 10-bit display is, in particular, to improve gradients between colors, since more intermediate values are available. 10-bit screens also often use a wider color space, which makes it possible to display colors that do not exist in the classic color space. Concretely, an image like the one shown in this WebKit blog post (https://webkit.org/blog/6682/improving-color-on-the-web/) appears completely red on a classic 8-bit screen, but reveals a logo on a 10-bit screen with an extended color space. The red of the logo uses values that a classic screen interprets as the maximum red value, whereas it is a distinct shade on a modern model. Some screens can, moreover, combine classic 8-bit color with a wide gamut and still display the logo; that is the case for the 2016 MacBook Pro.
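A sketch of why the WebKit test image collapses to plain red on an ordinary display: converting a pure Display-P3 red into sRGB produces components outside [0, 1], which get clamped to ordinary maximum red. The 3×3 matrix below is an approximate linear Display-P3-to-sRGB conversion (D65 white point), quoted from memory, so treat the exact coefficients as an assumption.

```python
# Approximate linear Display P3 -> linear sRGB conversion matrix (D65).
# Coefficients are approximate; the point is the clamping behavior.
P3_TO_SRGB = [
    [ 1.2249, -0.2247,  0.0000],
    [-0.0420,  1.0419,  0.0000],
    [-0.0197, -0.0786,  1.0983],
]

def p3_to_srgb(rgb):
    out = [sum(m * c for m, c in zip(row, rgb)) for row in P3_TO_SRGB]
    # Clamp out-of-gamut components to the displayable [0, 1] range:
    return [min(1.0, max(0.0, c)) for c in out]

pure_p3_red = (1.0, 0.0, 0.0)
print(p3_to_srgb(pure_p3_red))  # [1.0, 0.0, 0.0]: after clamping, it is
                                # indistinguishable from ordinary sRGB red
```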

An image intended for display on a wide-gamut screen;
everyone else sees nothing but red


Support for 10-bit screens is valuable in the photography world, because many cameras can shoot RAW at 10, 12 or even 14 bits.

Wide gamuts mostly start from the Adobe RGB space (https://en.wikipedia.org/wiki/Wide-gamut_RGB_color_space) and, lately, P3 at Apple.
However (something rare for an sRGB display), it supports 10-bit color via DisplayPort.
The logo in the middle is outside the sRGB space. You don't need 10 bits to see it, just a wide-gamut screen (P3, for example, as on some Macs) or an unusual color profile. On my *non*-wide-gamut screen, some ICC profiles do display the logo.
It displays in Chrome and Safari but not in Firefox (June 2017 versions), on macOS Sierra with a 2016 MacBook Pro.
The color profile is embedded in the JPEG, but not all browsers load it...




Monday, July 17, 2017

HDR (high dynamic range): problems and advantages; photo, raw, DRO, video, games, science, computers, screens, cameras, the web platform; understanding Sony Auto HDR


Intro

https://en.wikipedia.org/wiki/High-dynamic-range_imaging

I will focus on the example of the Sony Alpha 77 and 65 (the first cameras with built-in auto-HDR).

Tone mapping

The method of rendering an HDR image to a standard monitor or printing device is called tone mapping. This method reduces the overall contrast of an HDR image to facilitate display on devices or printouts with lower dynamic range (LDR), and can be applied to produce images with preserved local contrast (or exaggerated for artistic effect).
https://en.wikipedia.org/wiki/Tone_mapping
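As a concrete illustration of tone mapping, here is a minimal sketch of the classic global Reinhard operator, which compresses unbounded HDR luminance into the [0, 1) range of an LDR display. Real implementations work per pixel on whole images and normalize by the scene's log-average luminance; this scalar version only shows the compression curve.

```python
def reinhard(luminance):
    """Map HDR luminance in [0, inf) to display range [0, 1)."""
    return luminance / (1.0 + luminance)

# Shadows are nearly preserved, highlights are strongly compressed:
for L in (0.05, 1.0, 10.0, 1000.0):
    print(L, "->", round(reinhard(L), 4))
```

Note how a highlight 1000x brighter than mid-gray still lands just below 1.0 instead of clipping, which is exactly the "reduce overall contrast while preserving local detail" behavior described above.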

The main problem is the dynamic range of common devices:

Dynamic ranges of common devices
Device                          Stops    Contrast
LCD                             9.5      700:1 (250:1 – 1750:1)
Negative film (Kodak VISION3)   13       8000:1
Human eye (static)              10–14    1000:1 – 15000:1
High-end DSLR camera            14.8     28500:1
Human eye (dynamic)             20       1000000:1
https://en.wikipedia.org/wiki/High-dynamic-range_imaging
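The "stops" and "contrast" columns in the table are two views of the same quantity: each stop doubles the light, so contrast = 2^stops. A quick sketch:

```python
import math

def stops_to_contrast(stops):
    # Each stop doubles the amount of light.
    return 2 ** stops

def contrast_to_stops(contrast):
    return math.log2(contrast)

print(round(stops_to_contrast(9.5)))        # ~724, matching the LCD's ~700:1
print(round(contrast_to_stops(28500), 1))   # ~14.8 stops for a high-end DSLR
```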

In our case (a camera), the CMOS sensor and its electronics are limited. For example:
The dynamic range is typically limited by the readout process of the CMOS imager pixel. Techniques have been developed in the past to cope with this, usually by non-linear compression of the signal.
ams Sensors Belgium developed a new CMOS image sensor pixel that allows readout of a photodiode with a wide dynamic range while maintaining a linear response to light. After exposure, the photodiode is read out via two transfer gates to two sense nodes, and two signals are then read from each pixel. The first signal reads only the charge transferred to the first sense node, with maximal gain; this sample is used for small charge packets and is read with low read noise. The second sample reads the total charge transferred to both sense nodes, with a lower gain. Pixels with a read noise of 3.3 electrons and a full-well charge of 100,000 electrons have been demonstrated, resulting in a linear dynamic range of 90 dB.
http://www.cmosis.com/technology/technology_overview/high_dynamic_range_pixels
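The 90 dB figure follows directly from the full-well capacity and the read noise quoted above, via the standard definition DR(dB) = 20·log10(full well / read noise):

```python
import math

def dynamic_range_db(full_well_electrons, read_noise_electrons):
    # Linear dynamic range: largest signal over smallest resolvable signal.
    return 20 * math.log10(full_well_electrons / read_noise_electrons)

dr = dynamic_range_db(100_000, 3.3)  # the figures quoted for the ams pixel
print(round(dr, 1))                  # ~89.6 dB, i.e. roughly 90 dB
```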

DRO, auto-HDR, Raw

In fact you have 5 possibilities:

  • DRO (how DRO works: https://www.dpreview.com/articles/3798759501/apical)
  • auto-HDR
  • shooting a bracket of 3–5 exposures (well within the camera's reach) in RAW, then doing the HDR work on the computer ("Merge to HDR" function)
  • research-domain work with your own software (macro photography, fluorescence photography, very-high-contrast photography, 3D photography)
  • one RAW file and manual or semi-automatic processing with specialized software (raw converters)


For merged HDR with many presets, see
https://www.hdrsoft.com/

Auto HDR from Sony

In French: "Le mode HDR automatique des Sony Alpha 450, 500 et 550" (2010)
http://www.alpha-numerique.fr/index.php/technique/elements-techniques/426-le-mode-hdr-automatique-des-sony-alpha-450-500-et-550

Auto HDR of alpha 65 and 77 (2012)
http://blog.william-porter.net/2012/01/sony-a77-expanding-dynamic-range-with.html
This excellent post shows the differences between DRO, HDR, and raw.
The best option is to select HDR and make your own contrast decision.

In-camera HDR ("high dynamic range") is a different way to solve the problem. Like on-the-computer HDR, in-camera HDR starts with several different exposures of the same scene, then combines them into a single output file in which the well-exposed bright areas from one shot have been combined with the best-exposed dark areas from another, and the composite file has been adjusted to make things look natural. Sony's programmers have written programs that seem to do a very good job (sometimes) of combining the exposures. But a key factor in getting good results is providing the processor with good source images. The new fixed-mirror (SLT) cameras from Sony are especially well suited to gathering the multiple exposures because, lacking a moving mirror, these cameras can take more shots per second than their traditional reflex (moving-mirror) competitors.
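The combining step can be sketched as a toy, per-pixel exposure fusion: weight each exposure by how well-exposed it is (a Gaussian peaking at mid-gray), then blend. This only illustrates the general idea; Sony's actual in-camera algorithm is not public, and real implementations also align frames and operate on whole images, not single values.

```python
import math

def well_exposedness(v, sigma=0.2):
    """Weight for a pixel value v in [0, 1]; highest near mid-gray."""
    return math.exp(-((v - 0.5) ** 2) / (2 * sigma ** 2))

def fuse(pixel_values):
    """Blend the same scene pixel as seen by several exposures."""
    weights = [well_exposedness(v) for v in pixel_values]
    total = sum(weights)
    return sum(w * v for w, v in zip(weights, pixel_values)) / total

# One scene pixel as captured by a -2 EV, 0 EV and +2 EV exposure:
print(round(fuse([0.02, 0.35, 0.98]), 3))  # dominated by the mid exposure
```

The nearly-black and nearly-clipped samples get tiny weights, so the well-exposed middle frame dominates the result, which is exactly why feeding the processor good source images matters.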

Raw file, unprocessed, with clipping regions shown (no contrast adjustment).
The red is where the scene is brighter than the camera could capture (with these settings)
and the blue is where the scene is too dark for the camera (with these settings) to retain detail.
This is what you see when you turn on Lightroom's "show clipping" feature.


 Raw file, unprocessed.

 Sony auto-HDR
Auto HDR has preserved the detail outside well
but surrendered detail in the shadows.
That is the choice of Sony's programmers.
 Sony auto-HDR
Knowing that the dynamic range of this scene
was fairly extreme, I set HDR to its max (6 EV).

The raw data file has a lot more latitude than a JPEG. To compare what I can get from the raw file with what Sony's in-camera DRO and HDR offer, I reshot the scene, saved the raw file, and processed it myself in Lightroom and other software.
I used an adjustment brush to bring the bright areas (the windows) down 1.5 stops, and a separate brush to bring the shadow areas up 1.5 stops. The result was pretty good; see the original RAW file at the beginning.

Moreover, because in-camera HDR takes multiple exposures and then processes them, achieving a single HDR result in the camera takes about five or six seconds. And you simply can't use it if anything in the scene is moving quickly. Finally, the A65's or A77's processor saves the HDR file as a JPEG, by necessity: the HDR file is a composite, a processed result, so there is no raw original of the HDR result.

And if you really want to hedge your bets, shoot RAW + JPEG with DRO AUTO enabled. You may find that the raw file is badly exposed but the JPEG is usable, and you won't have to fuss with the raw file on the computer.
 Sony auto-DRO
Sony DRO Lv5:
sensing that there was at least a five- to six-stop gap between the darks
and the lights here, I then changed the DRO setting from Auto to Lv5.

The feature is called "Auto HDR" and it has seven possible options: levels 1–6 plus an option called, confusingly, "Auto." It sounds tautological to say "put the Auto HDR feature on auto," but it's not.

But a word or two further about raw might be pertinent here.

There is never a question about shooting raw or not. We all shoot raw, always, willy-nilly: raw is how the camera works. The question is simply, where does the raw data get processed? Your choices are

  • (a) let the little processor with the no-choice software in your camera do it, and hope you're happy with the results, because you lose the chance to do it over; or
  • (b) keep the raw data, then process and reprocess it as many times as you like on a full-blown computer, using as many different raw converters as you like or as many different programs as you can afford. Put this way, it's not hard to see that, if you really need to get it right, you're better off shooting raw, shooting a bracket of 3–5 exposures (well within the camera's reach), and then doing the HDR work on the computer.


That said, while in-camera HDR might not produce the best results possible, I will readily admit that its results are pretty darned good. I'm still not quite sold, but I do acknowledge that Sony has done a tremendous job here, and likewise with MFNR (multi-frame noise reduction).
The HDR image has less noise in the shadows.
One of the downsides to shooting and processing HDR in-camera is that you have no control over the tone curve applied.

See also

https://www.dpreview.com/reviews/sonyslta77/12
https://www.dpreview.com/reviews/sonynexc3/7
The Complete Guide to Sony's Alpha 65 and 77 SLT Cameras B&W Edition Volume II:
https://books.google.fr/books?id=R_mWAwAAQBAJ&pg=PA456&lpg=PA456&dq=HDR+automatic+Sony+Alpha+65&source=bl&ots=-ajGH13gZg&sig=UkvJtQ1KE5laUZ7vKgPGtd0KTcc&hl=en&sa=X&ved=0ahUKEwiap-Gw4o3VAhWBB8AKHVuzAmIQ6AEIYjAJ#v=onepage&q=HDR%20automatic%20Sony%20Alpha%2065&f=false

Process

The HDR mode can be set to Auto, or set manually from 1 EV to 6 EV. This is the mode that takes 3 frames quickly and merges them together in camera to deliver a single merged HDR photo.

This is different from bracketing, where you manually take your own set of photos to merge in software after the fact, which is what most other folks on the web are talking about. For that, you need to check the camera's maximum bracketing range in exposure bracketing mode; with most Sony cameras it is indeed only ±0.7 EV, the A77 being an exception. Also of note: you don't HAVE to use bracketing mode to take manual HDR exposures to blend. If you set the camera up on a tripod and take a series of photos, manually adjusting the exposure by a stop or two at a time, you can cover any EV range you want with any number of photos. Bracketing is what people use when they are trying to eliminate motion between a series of handheld shots, since it takes 3 photos relatively quickly. Sony's HDR mode does the same, but also blends the HDR in the camera, so the output is a single, final HDR photo.

Page 129 of the A65 handbook states that you cannot use the Auto HDR function on RAW images. And when the exposure mode is set to

  • AUTO,
  • AUTO+,
  • Sweep Panorama,
  • 3D Sweep Panorama,
  • Continuous Advance Priority AE, or
  • Scene Selection,
  • or when Multi Frame Noise Reduct. is selected,

you cannot select Auto HDR.

I think that the delay between the 3 photos is around 150 ms (a rate of about 6.6 Hz, i.e. each interval is about 1/6.6 s, so the whole 3-shot burst takes about 1/3.3 s), so be careful when something in your photo is moving. The shutter sounds between the 3 photos indicate this tempo...
Afterwards, the in-camera processing takes around 5 s, during which your camera is blocked.

JPEG, RAW and high ISO

http://www.photographyblog.com/reviews/sony_a65_review/image_quality/

List of all cameras and Auto Exposure Bracketing option

Auto Exposure Bracketing settings by camera model (list of all cameras):
https://www.hdrsoft.com/resources/aeb.html

Camera Model     Auto-bracketed frames   Max EV step increment          Max EV range in AEB   Max burst rate
Sony Alpha A65   3                       0.7                            1.4                   10 fps
Sony Alpha A77   3 or 5                  3 (3 frames), 0.7 (5 frames)   6                     12 fps

Many digital cameras include an Auto Exposure Bracketing (AEB) option. When AEB is selected, the camera automatically takes three or more shots, each at a different exposure.
Auto Exposure Bracketing is very useful for capturing high-contrast scenes for HDR. However, AEB wasn't initially intended for HDR, but for ensuring that one of the shots taken is correctly exposed. This means that some camera models only offer a maximum of 1 EV spacing, or even less, across just three auto-bracketed shots.
Unfortunately, three shots spaced by one EV are often not sufficient for capturing high-contrast scenes.
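To see why three shots are often not enough, here is a small sketch estimating how many bracketed frames are needed to cover a scene, given the sensor's usable range per shot, the scene's dynamic range, and the EV spacing between frames. The numbers used are illustrative assumptions, not measurements.

```python
import math

def frames_needed(scene_ev, sensor_ev, step_ev):
    """Frames so that successive exposures overlap and span the scene."""
    extra = max(0.0, scene_ev - sensor_ev)  # range one frame can't cover
    return 1 + math.ceil(extra / step_ev)

# A 14 EV scene with a sensor capturing ~10 EV per shot:
print(frames_needed(14, 10, 1))    # 5 frames at 1 EV spacing
print(frames_needed(14, 10, 0.7))  # 7 frames at 0.7 EV spacing, far more
                                   # than the A65's 3-frame AEB offers
```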