15.6 4K LCD Panel HDR 10bit Free Sample

The 4K full frame S1 is on my desk at EOSHD HQ and for the first time it’s the final released camera that is shipping to stores, with firmware version 1.0. I couldn’t offer any original files from my shoot in Barcelona as the firmware back then was version 0.7. So now’s your chance, if you’re wondering how good the initial 10bit codec is at just 72Mbit or how well it grades, to download my files and have a look for yourself…

Broadly speaking, the Panasonic S1 H.265 codec carries on a trend that began with the Samsung NX1 in 2014 towards replacing H.264, but it’s been such a slow trend that advanced compressed RAW formats are now available which balance smaller file sizes, ease of playback and editing with image quality as well. Sony, for example, is said to be using H.265 compression in the RAW codec of the upcoming A7S III, an interesting hybrid of the two. Blackmagic of course now have BRAW on the Pocket 4K camera. Initially the Panasonic S1 will use normal H.265 at quite low bitrates with no raw, before a paid firmware update later unlocks higher bitrates in 10bit and V-LOG, but still no raw. The Nikon Z6 will get ProRes RAW via HDMI “from May/June onwards” and that’s free, so something for Panasonic to consider there.

The 10bit 4K offered out of the box right now on the S1 is equivalent to 140Mbit H.264 (similar to a Nikon Z6 or D850) but at half the file size. The bitrate is listed as 72Mbit in the menus, but that’s not a constant bitrate; it varies depending on the workload presented to the encoder by the images you’re shooting. Unlike the Fuji X-T3 10bit H.265, there is not yet an option to select a constant bitrate mode. I do wonder if Panasonic held back on certain advanced codec options on the S1 to leave room for an S1H or more video-oriented model later. Everything they have told me so far says that the S1 is a video-oriented model, but it clearly isn’t to the extent that the GH5 is.

The H.265 4K 10bit files are recorded to the DCIM folder along with photos, so there’s no PRIVATE directory to navigate as there is on a Sony camera, thankfully. The file wrapper is MP4, with no option for MOV. The camera uses a broadcast-safe RGB range in 10bit of 64-940… A bit like shooting 8bit at 16-235, this prevents the clipping of highlights and shadows by your display, PC or NLE. So all that’s fine.
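To make that range concrete, here’s a minimal Python sketch (my own illustration, not Panasonic’s documented math) of how 10-bit legal/video-range code values map to normalised 0-1 levels, the direct analogue of 16-235 in 8-bit:

```python
def video_range_to_full(code, bit_depth=10):
    """Map a legal/video-range code value to a normalised 0-1 level."""
    scale = 2 ** (bit_depth - 8)
    black, white = 16 * scale, 235 * scale  # 64 and 940 at 10-bit
    return (code - black) / (white - black)

print(video_range_to_full(64))    # 0.0   reference black
print(video_range_to_full(940))   # 1.0   reference white
print(video_range_to_full(1000))  # ~1.07 super-white headroom survives
```

Anything above 940 or below 64 is headroom and footroom rather than data your display or NLE should throw away, which is why the legal range is the safer default.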

You have to be in creative movie mode on the dial to access the Rec. File Format option for 10bit H.265 HDR. In the stills mode it only shoots 8bit 4K. That’s not ideal, in my opinion.

Only the Rec.2020 Hybrid LOG Gamma picture profile is available in this 10bit mode; there is no standard picture profile. There is a helpful Hybrid LOG Gamma View Assist so you don’t have to frame and expose from a low-contrast image, so that’s OK… After all, the 10bit mode is designed to deliver far more dynamic range than the standard 8bit codec with standard picture styles, including Cinelike D.

I’ve had experience of quite a few 10bit cameras which have failed to deliver the promised image quality advantage due to too much compression, banding, macro blocking, smoothed-away noise grain, plastic texture and even inter-frame noise reduction bugs like those found on the Sony FS5 when it first came out. Here, things are much better. There’s a significant step up in dynamic range from the standard 8bit video modes, not least because Hybrid LOG Gamma is only available in H.265 10bit.

72Mbit may seem ridiculously low, but as H.265 it’s equivalent to roughly 140Mbit H.264, above the 100Mbit Sony codec on the A7 cameras, and beyond this point the visible advantages aren’t linear with the increase in bitrate. The X-T3 has a VERY good 4K 10bit H.265 codec at 200Mbit and even 400Mbit ALL-I, but the S1 is easily competitive. ALL-I gives better motion cadence and looks very cinematic, but it comes at the cost of image quality per frame and huge file sizes. The X-T3 10bit codec at 400Mbit ALL-I has similar file sizes to 8bit MJPEG on the 1D C and 1D X Mark II!
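To put those numbers in card-space terms, here’s a quick back-of-the-envelope calculation (illustrative only, since these are variable-bitrate codecs that deviate with scene complexity):

```python
def gb_per_minute(mbit_per_s):
    """Convert an average video bitrate to storage used per minute."""
    return mbit_per_s * 60 / 8 / 1000  # Mbit/s -> MB/min -> GB/min

for label, rate in [("S1 H.265 10bit", 72), ("Sony A7 H.264", 100),
                    ("X-T3 H.265", 200), ("X-T3 ALL-I", 400)]:
    print(f"{label:>15}: {gb_per_minute(rate):.2f} GB/min")
# the S1 comes out at ~0.54 GB/min versus ~3 GB/min for 400Mbit ALL-I
```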

My only concern is that NLEs are starting to recognise files shot in Hybrid LOG Gamma and automatically apply a gamma curve to convert to a punchy HDR image. Final Cut Pro X does this and you may need a workaround. I like to use Hybrid LOG Gamma as a recording format to grade from and make use of the extra dynamic range in my own way without having a gamma curve forced on me by Apple or Adobe. So Panasonic needs to work closely with these companies to ensure better user experiences.

H.265 10bit 4K now plays back smoothly on most modern machines, thanks to better hardware acceleration. It’s no longer the case that support is lacking for basic playback and editing of H.265, as it was when the Samsung NX1 came out.

Apple Pro Display XDR: Mini-LED Backlighting and the TCON

Typical LCDs are edge-lit by a strip of white LEDs. The 2D backlighting system in Pro Display XDR is unlike any other. It uses a superbright array of 576 blue LEDs that allows for unmatched light control compared with white LEDs. Twelve controllers rapidly modulate each LED so that areas of the screen can be incredibly bright while other areas are incredibly dark. All of this produces an extraordinary contrast that’s the foundation for XDR.

With a massive amount of processing power, the timing controller (TCON) chip utilizes an algorithm specifically created to analyze and reproduce images. It controls LEDs at over 10 times the refresh rate of the LCD itself, reducing latency and blooming. It’s capable of multiple refresh rates for amazingly smooth playback. Managing both the LED array and LCD pixels, the TCON precisely directs light and color to bring your work to life with stunning accuracy.
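The underlying idea of any full-array local dimming scheme can be shown in a few lines. This is a toy sketch, not Apple’s actual TCON algorithm: drive each LED zone from the brightest pixel it has to show, then compensate the LCD layer so that pixel transmission times backlight reproduces the target image.

```python
import numpy as np

def local_dimming(target, zones=(24, 24)):
    """Toy FALD model: per-zone LED levels plus compensated LCD values."""
    h, w = target.shape
    zh, zw = h // zones[0], w // zones[1]
    # each zone's LED is set by the brightest pixel it must display
    led = target.reshape(zones[0], zh, zones[1], zw).max(axis=(1, 3))
    backlight = np.kron(led, np.ones((zh, zw)))  # upsample zone grid
    # LCD transmission = wanted luminance / available backlight
    lcd = np.divide(target, backlight,
                    out=np.zeros_like(target), where=backlight > 0)
    return led, lcd

frame = np.zeros((216, 384))
frame[100:116, 180:204] = 1.0        # one bright patch on black
led, lcd = local_dimming(frame)
print(led.max(), (led == 0).mean())  # lit zone at 1.0, most zones fully off
```

The blooming discussed later on this page falls out of this model naturally: any zone containing one bright pixel must light all of its pixels, which is why zone count and density matter so much.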

HDR Brightness and Luminance Variation Issues

One of the biggest issues, specifically with PQ-based HDR (although HLG can be affected too), is screen brightness/luminance variation, due to a range of associated problems with the way HDR works on most displays.

The basic issue is that HDR can change the screen/image brightness/luminance in ways that cause fundamental variations in the viewed image, potentially distorting the original artistic intent of the graded footage as defined by the film's director and colourist.

Such issues can be part of the expected HDR workflow, such as dynamic metadata; technical limitations of the display technology used, such as ABL (Auto Brightness Limiting) and Local Dimming; or unexpected brightness/luminance changes due to incorrect implementation within the display, specifically home TVs, where the display deviates from the expected HDR standard because the manufacturer believes it is generating a better final image.

The use of metadata to dynamically define the brightness of the displayed image is marketed by many HDR aficionados as a real benefit of PQ-based HDR, enabling dark scenes to be "brightened" and bright scenes to be darkened to preserve highlight detail.

In reality, as nominal diffuse white is defined to be approximately 100 nits, with just specular highlight information beyond that level, the inherent visual intent of the image will be contained below the 100 nit level. For correctly graded HDR, the theoretically best approach to displaying the image on a lower peak luminance display would therefore be to simply clip at the display's maximum luminance, potentially with a roll-off to prevent highlight "blocking", with no use of dynamic metadata at all.
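A sketch of that clip-plus-roll-off approach, with the knee position and curve shape as my own assumptions rather than anything from a standard:

```python
def rolloff(nits, display_peak=600.0, knee=0.8):
    """Pass the diffuse range through untouched, then compress highlights."""
    k = knee * display_peak           # start rolling off at 80% of peak
    if nits <= k:
        return nits                   # the visual intent below the knee is untouched
    excess = nits - k
    # asymptotically approach display_peak instead of hard clipping
    return k + (display_peak - k) * excess / (excess + (display_peak - k))

for v in (100, 480, 600, 1000, 4000):
    print(v, "->", round(rolloff(v), 1))  # 100, 480, 540, 577.5, ~596
```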

Another of the often overlooked potential issues with HDR has to do with the (legal) need to limit the power requirement of the display, as obviously extreme brightness causes excessive power consumption. That in itself is a cause for concern, based both on the power costs, and potential environmental issues. Hopefully, both those can be overcome with more efficient display back-lighting technologies.

However, in an attempt to overcome extreme power requirements, just about all HDR displays use one form or another of ABL (Auto Brightness Limiting, often called Power Limiting in HDR terminology). In very simple terms, ABL reduces the power to the screen dependent on the percentage of the screen area that goes over a predetermined brightness level, reducing the overall brightness of the scene. The PQ HDR specification defines what is known as MaxCLL (Maximum Content Light Level) and MaxFALL (Maximum Frame-Average Light Level), which are intended to be part of the HDR mastering metadata, from which the viewing display will calculate how to show the image, limiting potentially high power requirements.
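Conceptually, both values are simple statistics over the per-pixel light levels of the mastered content. A sketch, assuming frames expressed directly in nits:

```python
import numpy as np

def max_cll_fall(frames):
    """MaxCLL: brightest pixel anywhere; MaxFALL: brightest frame average."""
    max_cll = max(float(f.max()) for f in frames)
    max_fall = max(float(f.mean()) for f in frames)
    return max_cll, max_fall

frames = [np.full((1080, 1920), 80.0) for _ in range(3)]
frames[1][:100, :100] = 1500.0  # one small specular highlight
print(max_cll_fall(frames))     # (1500.0, ~86.9)
# a small highlight sends MaxCLL soaring while MaxFALL barely moves,
# which is exactly the distinction ABL and power limiting care about
```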

Local Dimming is used in LCD based HDR displays, and consists of an array of back-lights to provide localised bright image areas, without having to have a single backlight that is always bright, as that would greatly lift the black level, so greatly compromising the display.

Some newer LCD displays have what is effectively a backlight per pixel, such as the new Eizo Prominence CG3145, and FSI's XM310K, totally overcoming the Local Dimming issue.

OLED displays inherently have a backlight per pixel, as each pixel is self illuminating, but cannot reach the high peak luminance levels of LCD displays.

A final issue with a lot of displays, specifically home TVs, is the manufacturers deliberately deviating from the HDR specification, in an attempt to generate what they view as better images.

However, this issue is actually something we have sympathy for because, as mentioned previously, the PQ HDR specification is flawed: the standard is "absolute", and includes no option to increase the display's light output to overcome surrounding room light levels. The result is that in less than ideal viewing environments, where the surrounding room brightness is relatively high, the bulk of the HDR image will appear very dark, with shadow detail potentially becoming very difficult to see.

Shooting and Grading 4K HDR for YouTube

If you’re a YouTube creator, HDR (high dynamic range) video is a good way to draw eyeballs and make your footage look as beautiful as possible. Compared to standard video (SDR), it’s far brighter and more colorful — almost more real than reality.

With that in mind, I decided earlier this summer to create Engadget’s first 4K HDR video for YouTube on Fujifilm’s X-T4 and document the process in an explainer. How hard could it be? Little did I know that it would turn out to be a trainwreck in nearly every possible way. In the end, I ran out of time and failed to post the video in HDR.

If you’re a producer, HDR video can elevate your work because it’s simply brighter and more colorful than standard video. The benefits are more dramatic than 4K, which only delivers extra resolution that many people can’t even see.

While few people own HDR monitors, plenty of folks have HDR televisions and HDR smartphones. If you have a newer HDR-capable phone or TV and want to see the stunning difference between SDR and HDR, watch an ordinary video and then watch this one.

If you’re now feeling motivated, let me bring you back down to Earth. Producing HDR is extremely challenging, and depending on what equipment you have and your level of ambition, it may not be worth your time. It requires you to know a bunch of different standards, complex color space concepts and endless jargon like “MaxFALL” and “Rec.2020.”

It’s begging for a more streamlined process from acquisition to post-production to final delivery on YouTube. Unfortunately, the key players have focused on professional HDR production and streaming delivery to consumers on HDR TVs. Meanwhile, little attention has been paid to YouTubers or PC users. As a result, the experience is pretty miserable on a Windows 10 PC or Mac and browsers like Google Chrome.

To start, HDR is completely broken on YouTube on PCs right now, and has been for at least two months, thanks to a bug on all Chrome-based browsers and also the clunky way that Windows 10 handles HDR. While it might not affect that many users, it clearly affects creators who took the time to craft an HDR video. It’s also very demotivating for anyone thinking of making an HDR video (i.e., me) because many folks can’t even play it on their PC right now.

If you’re not too discouraged yet, read on. While shooting and editing 4K is not that different from producing HD, HDR changes the entire color space, forcing you to rethink how you shoot, edit and color-correct your videos.

To produce HDR, it’s best to have a decent understanding of how it works. For a more technical dive into HDR, check out Engadget’s explainer video. For the purposes of this story, though, I’ve included a brief explanation here, as well.

HDR’s main draw is the extra brightness. Display brightness is usually expressed in “nits” of peak brightness. HDR increases dynamic range by increasing brightness, which in turn boosts the contrast between light and dark images, measured in photographic “stops.” HDR is designed to go up to 10,000 nits, around 30 to 40 times more than the screen you’re probably looking at right now.
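The “stops” arithmetic is just doublings of light, so the relationship between nits and stops is a base-2 logarithm. A quick check with hypothetical display figures:

```python
import math

def stops(peak_nits, black_nits):
    """Dynamic range in photographic stops between two luminances."""
    return math.log2(peak_nits / black_nits)

print(round(stops(300, 0.30), 1))     # ~10 stops: a typical SDR LCD
print(round(stops(10000, 0.005), 1))  # ~21 stops: the HDR signal ceiling
```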

The other thing HDR offers is a wider range of colors (called a gamut) and more colors within that gamut (the bit depth). Certain hues, particularly very saturated colors, are visible on HDR displays but not on standard TVs, projectors or computer displays.

HDR devices also deliver more colors within that gamut by increasing bit depth. Older Rec.709 HDTVs and cameras were generally limited to displaying 8-bit color information, meaning each color channel has 256 levels between black and white. This means you can often see “banding,” or blocky color transitions, in subtle gradient areas like skies or shadows.

However, HDR TVs and cameras using Rec.2020 standards can record and display color in 10 or more bits, so you get 1,024 levels of luminance between the blackest black and whitest white and over a billion colors. That delivers much smoother gradients between colors and shades.
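The banding difference is easy to demonstrate: quantise the same smooth gradient at both bit depths and count the surviving steps.

```python
import numpy as np

ramp = np.linspace(0.0, 1.0, 4096)  # an ideal smooth gradient
for bits in (8, 10):
    levels = 2 ** bits
    quantised = np.round(ramp * (levels - 1)) / (levels - 1)
    print(f"{bits}-bit: {len(np.unique(quantised))} distinct steps")
# 8-bit: 256 steps, 10-bit: 1024 steps -> four times finer transitions,
# which is where visible banding in skies comes from (or doesn't)
```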

One key thing to know about HDR is that there are two flavors: PQ (or perceptual quantizer) and HLG (hybrid log gamma). Explaining those is beyond the scope of this article, but both use the Rec.2020 wide color gamut. The main difference is that HLG is designed for old-school broadcast signals, so it’s backward compatible with SDR’s Rec.709. PQ (also known as ST.2084) isn’t backward compatible, so it’s trickier to work with but generally delivers better HDR results.
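For the curious, PQ’s transfer function is published in ST.2084, and its EOTF (code value to absolute nits) is compact enough to write out; the constants below come from the standard:

```python
def pq_eotf(code):
    """ST.2084 PQ EOTF: normalised code value (0-1) -> absolute nits."""
    m1, m2 = 0.1593017578125, 78.84375
    c1, c2, c3 = 0.8359375, 18.8515625, 18.6875
    p = code ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

print(round(pq_eotf(0.508), 1))  # ~100 nits, around nominal diffuse white
print(round(pq_eotf(1.0)))       # 10000 nits, the format's ceiling
```

This absolute mapping is part of why PQ is trickier to work with: every code value means a fixed number of nits, whereas HLG is relative to the display’s own peak.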

The key to shooting HDR is to have the right equipment and setup. Beware that this is a simplified guide for novices. For a more detailed dive, check out this series from Mystery Box, a company responsible for many popular HDR videos on YouTube.

Can you shoot HDR with a camera that lacks an HDR-friendly log, along with 10-bit recording functions? Sure. Will it look great? No. You’re more likely to end up with banding, clipping, detail loss, washed-out colors and an image that just doesn’t measure up.

If you can afford it, your best bet is to use an external recorder with an HDR display, like Blackmagic Design’s $1,000 Video Assist 12G HDR or the $1,300 Shogun 7 HDR recorder from Atomos. That will allow you to apply a “LUT” (look-up table) setting that makes wide-gamut log footage look like regular Rec.709.

Once you start filming, exposure is crucial for HDR. Your best bet is to keep the log image away from the blacks and highlight regions and in the middle of your exposure range (using a zebra overlay or histogram can help with that). If you over- or under-expose the footage too much, you won’t be able to recover highlight or shadow detail.
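A zebra overlay is essentially a threshold test over the frame. Here’s a hedged sketch of that idea (the thresholds are my own; pick ones that suit your log curve), reporting how much of a normalised frame is near clipping or crushed:

```python
import numpy as np

def exposure_report(frame, lo=0.05, hi=0.95):
    """Fraction of pixels crushed near black or clipping near white."""
    n = frame.size
    return {"crushed_%": round(100 * (frame < lo).sum() / n, 2),
            "clipped_%": round(100 * (frame > hi).sum() / n, 2)}

# stand-in for a log-encoded frame, values normalised to 0-1
frame = np.clip(np.random.normal(0.45, 0.18, (1080, 1920)), 0, 1)
print(exposure_report(frame))  # keep both numbers low while shooting
```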

A faster PC would work better. Thirty-two gigabytes of RAM would be much, much better than 16GB. I’d prefer to have an RTX-level NVIDIA or Radeon 5000-series (5500 or above) GPU or a Radeon VII. Unlike with gaming, extra CPU cores greatly improve performance when it comes to HDR production. The more, the better.

If you’re looking into laptops, Gigabyte’s Aero 15 WB is a good mid-range option (yes, $1,900 is a mid-range option for HDR editing), with an RTX 2070 Max-Q GPU, 6-core 10th-generation Intel Core i7 CPU, 16GB of RAM and a 512GB NVMe SSD. Just add another SSD (it can take two) and you’re ready to go. On the higher end, the Razer Blade 15 Advanced Edition comes with an Intel Core i7-10875H CPU, an NVIDIA RTX 2080 Super Max-Q GPU, 16GB of RAM and a 1TB NVMe SSD for $3,100. If neither fits your budget, you can probably build a well-specced desktop PC for less.

In a pinch, you can use any decent 10-bit HDR monitor, with some models costing under $600. Just be aware that they won’t be as bright or as accurate as the monitors mentioned above. Along with the ASUS ProArt model, I edited this HDR video using BenQ’s $2,000 SW321C, which isn’t nearly as bright but is just as color accurate. Dell’s 27-inch 4K U2720Q offers DisplayHDR 400 brightness in an IPS panel, with 10-bit color for $600.

Another choice is just to use a good 4K HDR TV set. Generally, the more you spend, the brighter and more color accurate it’ll be. Many models from Vizio, TCL, Hisense, LG, Sony, Samsung and others should fit the bill. On top of an HDR grading monitor or TV, you’ll need a regular monitor for your editing software (unless you’re content to use a laptop display). That can be virtually any PC monitor, but as usual, it’s better to spend the most you can.

There’s one piece of equipment you need but may not even know exists. Video capture/playback devices let you properly display an HDR (PQ or HLG) signal on your monitor or TV. You’ll need this because your GPU isn’t capable of outputting a dedicated HDR video signal from editing software. So, without one, you’ll have no way of monitoring your HDR video for editing or color grading.

If you have a desktop PC and free PCIe slot, you can purchase a product like Blackmagic’s DeckLink Mini Monitor 4K for as little as $195. That will give you a 4K HDR output via HDMI 2.0, and you can connect that directly to your HDR display.

With a laptop you’ll need, at a minimum, Blackmagic Design’s $145 UltraStudio Monitor, which connects to your PC via a Thunderbolt port and supports HDR at up to 1080p30. While it can’t display 4K resolution, you’d see your colors accurately. To do this video, I borrowed Blackmagic Design’s $995 UltraStudio 4K Mini, which connected to the Thunderbolt port on my Gigabyte Aero 17. It supports 4K 60p HDR output and recording via HDMI 2.0b or professional SDI ports, and presented zero issues with either the ASUS ProArt or BenQ displays.

I’m using Blackmagic Design’s $300 DaVinci Resolve 16 Studio software for editing and color grading, simply because it has better tools than Premiere Pro for producing HDR content. (Note that the free Resolve version isn’t appropriate for HDR as it lacks many of the key features needed.) I’m assuming that you’re familiar with DaVinci Resolve editing and grading tools and have correctly set up the aforementioned gear. If not, I’d suggest starting your research here.

Editing and grading on Resolve in HDR is too complicated to explain in a single article. However, you can check out this excellent post on the subject from Mystery Box. That article is one in a five-part guide on doing HDR from theory to shooting to posting.

How-to videos are also, generally, hard to find on YouTube, but this one from Reelisations explains the basics and is actually in HDR, to boot. Another guide from Film Resolved explains exactly how to deliver HLG HDR video to YouTube if you don’t have an HDR display. Finally, the site Labo de Jay has some detailed YouTube explainers on HDR along with sample footage in all major HDR formats (in French with English subtitles).

By using HLG, you can edit and color correct your footage without the need for an HDR display or a video display device. Just cut and grade your video normally, then export it to YouTube and check the HDR image on an HDR smartphone or TV. If you don’t like what you see, you can make adjustments and repeat the process until you do. This can be a pain, but it does make HDR doable on the cheap.

As mentioned, I graded the video accompanying this article using HLG. As I have two HDR displays on loan, I was able to check my project both in SDR and HDR at the same time. I needed to make some compromises, though. If I set the contrast and saturation where I wanted in SDR, it was overly contrasty and saturated in HDR. However, if I adjusted the HDR to look reasonably natural, the SDR image lacked punch and color.

In the end, I figured that far more people would see the video in SDR, so I graded it for that. That made the HDR version hyper-realistic in terms of color and contrast, but it still gives folks who watched it on HDR smartphones and TVs a good idea of the potential. If you try to post your own HDR video and use HLG, you’ll have to figure out your own compromise. However, if you try to get the best of both and use PQ HDR — which uses metadata to help YouTube sort out HDR from SDR — then best of luck. It cost me many sleepless nights.

Despite its enormous potential, HDR for YouTube creators is still primitive and broken in some ways. That’s too bad, because all the pieces are there. Features like log and even RAW video have come to inexpensive cameras. And, with a relatively modest investment in a PC, display and some other extras, anyone can create a mini HDR production studio.

But it’s tricky to learn and the PC features required for YouTube creators are hard to use. HDR on YouTube for PCs depends on the Chrome browser engine, but Google and Microsoft have left the Windows version in a broken state for months.

It’s still worthwhile learning how to make HDR content, even if you don’t have an HDR display. However, clearly, creators and their fans won’t buy into it en masse if the companies behind it aren’t interested. So, the camera makers, editing software companies and streaming platforms have to work together to make it more functional. Until that happens, HDR will remain a YouTube niche.

Hisense U8H: A Bright, Affordable 4K HDR TV

The Hisense U8H matches the excellent brightness and color performance of much pricier LCD TVs, and its Google TV smart platform is a welcome addition. But it’s available in only three screen sizes.

The Hisense U8H is the best LCD/LED TV for most people because it delivers the performance of a much pricier TV yet starts at under $1,000 for the smallest (55-inch) screen size. This TV utilizes quantum dots, a full-array backlight with mini-LEDs, and a 120 Hz refresh rate to deliver a great-looking 4K HDR image. It’s compatible with every major HDR format. And it’s equipped with two full-bandwidth HDMI 2.1 inputs to support 4K 120 Hz gaming from the newest Xbox and PlayStation consoles. Add in the intuitive, fully featured Google TV smart-TV platform, and the U8H’s price-to-performance ratio is hard to argue with.

Chief among the U8H’s many strengths is its impressive peak brightness. When sending it HDR test patterns, I measured an average brightness of 1,500 nits, with peaks just north of 1,800 nits (a measurement of luminance; see TV features, defined for more info). To put that into perspective, consider that the 65-inch version of our budget 4K TV pick (the TCL 5-Series) typically costs around half as much as the 65-inch U8H but achieves only around 30% to 40% of its brightness. On the other side of the coin, the 65-inch version of our upgrade pick (the Samsung QN90B) costs almost twice as much as the 65-inch U8H, but it achieves only nominally higher brightness. Adequate light output creates convincing highlights and image contrast and (when necessary) combats ambient light from lamps or windows. It is a necessity for any TV worth buying—especially if you hope to watch HDR movies or play HDR games—and the U8H simply outpaces most TVs in its price range (and some in the next price bracket up, too).

The U8H’s brightness, black-level integrity, and local-dimming abilities make this an excellent TV for watching HDR content. The U8H is capable of playing HDR content in all of the major formats (HDR10, HDR10+, Dolby Vision, and HLG), but when it comes to impressive HDR, what’s under the hood is much more important than format compatibility. The most crucial thing for good HDR is high brightness and deep color saturation, and the U8H’s quantum dots achieve the latter. It’s not as simple as just having quantum dots, however: While many TVs (even the budget options) have quantum dots nowadays, what is often not taken into account is that brightness directly affects color saturation. For example, both the 2022 TCL 6-Series and the Hisense U8H are equipped with quantum dots, mini-LED backlights, and local dimming. But because the U8H is notably brighter than the 6-Series, it also achieves a higher total color volume. During our color-volume testing, the U8H exhibited color ranges at more than 100% of the DCI-P3 color space (the range of color needed to properly display HDR content), and it is capable of roughly 10% more total color volume compared with the 6-Series.

What does this mean in real-world terms? It means that the Hisense U8H truly excels as a modern 4K HDR TV, whether you’re watching the latest episode of Rings of Power or playing Overwatch 2. While watching HDR content side by side on the U8H and on our upgrade pick, the Samsung QN90B, I was truly surprised by how similar they looked at times, given that our upgrade pick is much more expensive. That said, though the U8H achieves impressive results where light output and color volume are concerned, it also exhibited some occasional video processing and upscaling issues (see Flaws but not dealbreakers), which videophiles and AV enthusiasts may take umbrage with. But in general, the picture quality punches well above its weight, metaphorically speaking.

The TV’s higher refresh rate also reduces motion blur in faster-moving sports and allows for smoother, more stable motion in games. Two of the four HDMI inputs support 4K gaming at 120 Hz. The U8H measured low input lag while playing in 4K resolution, and Hisense’s helpful GameZone setting in the picture menu allowed me to confirm the presence of 120 Hz playback and variable refresh rate during games.

In terms of design, the Hisense U8H is not as svelte as our upgrade pick, but it’s plenty sturdy and doesn’t look or feel cheap. Two narrow, metal feet jut out from beneath the panel and steadily hold the TV. They can be attached in two separate spots, either closer in toward the middle of the panel or out toward the edges, to account for different-size TV stands. The feet are also equipped with cable organization clasps—a nice touch for keeping your TV stand free of cable clutter. Though the TV is primarily plastic, its bezels are lined with metal strips, providing a bit more durability in the long run. I moved it around my home, and it was no worse for wear, but we’ll know more after doing some long-term testing.

The Hisense U8H has some difficulties with banding, or areas of uneven gradation, where transitions that should appear smooth instead look like “bands” of color (sometimes also called posterization). Like many current 4K HDR TVs, the U8H uses an 8-bit panel rather than a 10-bit panel, which affects the color decoding and color presentation process. This is usually relevant only with HDR video and games. When playing games on the PlayStation 5 and Xbox Series X, I saw a few instances where the content wasn’t rendered correctly and displayed ugly splotches of color on the screen. However, this almost always occurred during static screens (such as a pause menu or loading screen); I rarely spotted it during actual gameplay. Hisense has stated that it would address the problem in a future firmware update, but at the time of writing it was still present. This is a flaw that may give dedicated gamers pause, but we don’t consider it to be a dealbreaker for most people.

I also saw occasional instances of banding with TV shows and movies, though they were few and far between. The U8H isn’t the best at upscaling sub-4K content, so videos with a 1080p or lower resolution looked a little soft. You can get better overall video processing and upscaling by springing for our upgrade pick (this is one reason it’s more expensive, after all).

Although the U8H TV has four HDMI inputs, only two of them are fully HDMI 2.1–compatible. And one of those is designated as the eARC input (intended as an audio connection for a soundbar or AV receiver). So if you’re pairing an external audio system with the U8H, you may have only one input remaining that can support HDMI 2.1 features like 4K 120 Hz playback, variable refresh rate, and auto game mode; this could be a dealbreaker if you own more than one current-gen gaming console. If you’re in that boat, you may want to splash out some extra dough for our upgrade pick. Additionally, folks using pre-HDMI source devices—like the five-cable component connector with green, blue, and red video plus red/white audio inputs—should be aware that this TV requires an adapter to allow those devices to connect, and an adapter is not included in the box.

Finally, like most TVs that use vertical alignment (VA) LCD panels, the U8H has a limited horizontal viewing angle, which may be a bit annoying if you’re hoping to entertain a large crowd. Our upgrade pick uses a special wide-angle technology to address this.

If you’re watching in a darker room and want the most accurate picture you can get—preserving the director’s intent—select the U8H’s Filmmaker Mode as your picture mode. In a brighter room, we recommend the Theater Day picture mode. In either case, you should go into the backlight settings, disable the automatic light sensor, and set the backlight to your personal preference. This is true whether you’re watching SDR or HDR content.

HDR10 vs HDR10+ vs Dolby Vision

If you"re comparing the three main HDR formats, there are a few things you need to look at, including color depth, brightness, tone mapping, and metadata. HDR10 is the most basic format out of the three, and any modern 4k TV supports HDR10. Dolby Vision and HDR10+ are the more advanced formats, and while many TVs have either HDR10+ or Dolby Vision support, some TVs support both, so they"re not mutually exclusive. Below you can see the main differences between each format.

Color bit depth is the amount of information the TV can use to tell a pixel which color to display. If a TV has higher color depth, it can display more colors and reduce banding in scenes with shades of similar colors, like a sunset. 8-bit color, typically used for SDR content, allows for 16.7 million colors, while 10-bit color depth allows for 1.07 billion. 12-bit displays take it even further with an incredible 68.7 billion colors. Both Dolby Vision and HDR10+ can technically support content above 10-bit color depth, but that content is limited to Ultra HD Blu-rays with Dolby Vision, and even then, not many of them go up to 12-bit color depth. HDR10 can't go past 10-bit color depth.
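Those color counts are just the per-channel levels cubed, as a short check shows:

```python
for bits in (8, 10, 12):
    per_channel = 2 ** bits  # levels per R, G and B channel
    print(f"{bits}-bit: {per_channel:,} levels/channel, "
          f"{per_channel ** 3:,} colors")
# 8-bit: 16,777,216 | 10-bit: 1,073,741,824 | 12-bit: 68,719,476,736
```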

Winner: Tie between Dolby Vision and HDR10+. Even if both HDR10+ and Dolby Vision can support content with bit depth above 10-bit, most content won't reach that, and streaming content is always capped at 10-bit color depth, so there's no difference between the two dynamic formats.

When it comes to watching HDR content, a high peak brightness is very important as it makes highlights pop. HDR content is mastered at a certain brightness, and the TV needs to match that brightness. So if the content is mastered at 1,000 cd/m², you want it to display content exactly at 1,000 cd/m².

Dolby Vision and HDR10+ content are currently mastered between 1,000 and 4,000 cd/m², with most content at around 1,000 cd/m². HDR10 can be mastered anywhere up to 4,000 cd/m², depending on the content, but it doesn't have a minimum brightness. All three standards cater for images of up to 10,000 cd/m², although no display can currently reach that level. Therefore there's no real difference between the dynamic formats as they both top out at 4,000 cd/m².

Winner: Tie between HDR10+ and Dolby Vision. Both HDR10+ and Dolby Vision are currently mastered between 1,000 and 4,000 cd/m², so there's no difference there.

One of the ways the three formats differ is their use of metadata. HDR10 uses only static metadata. With static metadata, the boundaries in brightness are set once for the entire movie or show and are determined by the brightness range of the brightest scene. Dolby Vision and HDR10+ improve on this by using dynamic metadata, which tells the TV how to apply tone mapping on a scene-by-scene or even frame-by-frame basis. This provides a better overall experience, as dark scenes won't appear too bright.
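A toy illustration of why that matters (my own simplification, not any vendor's actual tone mapper): with static metadata every scene is scaled against the brightest scene in the film, while dynamic metadata lets each scene be scaled by its own peak.

```python
def tone_map(nits, mastering_peak, display_peak=800.0):
    """Naive linear tone mapping against a metadata-supplied peak."""
    if mastering_peak <= display_peak:
        return nits
    return nits * display_peak / mastering_peak

scenes = {"dark interior": 200.0, "sunlit exterior": 4000.0}
film_peak = max(scenes.values())
for name, peak in scenes.items():
    static = tone_map(peak, film_peak)  # whole film judged by its 4,000-nit peak
    dynamic = tone_map(peak, peak)      # each scene judged by itself
    print(f"{name}: static {static:.0f} nits, dynamic {dynamic:.0f} nits")
# the dark interior is dimmed to 40 nits under static metadata
# but keeps its full 200 nits under dynamic metadata
```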

However, some TV manufacturers ignore the metadata, and their TVs use their own tone mapping to process content, in which case the HDR format's metadata doesn't matter and the performance comes down to the TV.

Tone mapping describes how a TV handles colors and brightness levels that it can't natively reproduce. In other words, if an HDR movie has a bright red in a scene, but the TV can't display that particular shade of red, what does it do to make up for it? There are two ways for a TV to tone map content. The first is called clipping, where a TV gets so bright that you don't see details above a certain level of brightness, and there aren't any visible colors above that brightness; the alternative is to roll off highlights gradually, compressing detail rather than discarding it.

Between the three HDR formats, the differences come down to how tone mapping is handled. Dynamic formats like Dolby Vision and HDR10+ can tone map on a scene-by-scene basis, and sometimes the content is tone-mapped by the source, which saves processing power required from the TV. As for HDR10, since it uses static metadata, the tone mapping is the same across the entire movie or show, so content doesn't look as good.

Both HDR10+ and Dolby Vision are backward-compatible with static HDR formats on Ultra HD Blu-rays, so if you're watching older HDR content, you won't have to worry about which format it's in as your new TV will be able to display it. Dolby Vision and HDR10+ are both backward-compatible, but they use different technology to build upon older HDR formats. HDR10+ adds dynamic metadata to HDR10 content, so if an HDR10+ TV needs to display HDR10 content, it does so without the dynamic metadata. Dolby Vision is more complicated because it can use any static HDR format as a "base layer" and build from it. Because it builds from static metadata, Dolby Vision TVs can read the static metadata alone, making it backward-compatible.

All Blu-ray discs need to use HDR10 as a static metadata layer. This means they're backward-compatible with any TV; if it's a Dolby Vision disc and your TV only supports HDR10+, it'll play the movie in HDR10 instead. However, the same can't be said about streaming content, because a Dolby Vision movie on Netflix might not carry the HDR10 base layer, so if your TV doesn't support Dolby Vision, it will simply play in SDR instead.

If your TV supports Dolby Vision or HDR10+, but not both, you'll be limited in the type of HDR content you can watch in its intended format. If a TV doesn't support the HDR format your Blu-ray is in, playback will be limited to HDR10, so you can't watch the content in the intended format. For example, Samsung TVs don't support Dolby Vision, so any Blu-ray in Dolby Vision will be limited to HDR10, and if you're streaming a Dolby Vision movie that doesn't have the HDR10 base layer, the content will be in SDR. TVs that support both formats have an advantage, as you'll see content in its proper dynamic format.

The availability of the new HDR formats has drastically improved in recent years. All HDR content is at least available in HDR10, and Dolby Vision is available with most streaming services. Although not as common, HDR10+ is growing in popularity with Blu-rays and certain streaming services like Amazon Prime Video. As of October 2022, Apple now supports HDR10+ on the Apple TV+ app, and all of their HDR content has been updated with HDR10+ metadata. Find out where to find HDR content here.

While most TVs support HDR10 and many models support at least one of the more advanced formats, only a few brands like Vizio, Hisense, and TCL have support for both on their TVs. In the United States, Sony and LG support Dolby Vision, while Samsung TVs have HDR10+ support.

You shouldn"t expect the cheaper HDR TVs to use all the extra capabilities of the formats. For most of them, you won"t even be able to see a difference, as only high-end TVs can take advantage of HDR and display it to its full capabilities.

Although HDR was initially for movies, the advantages for gaming are undeniable. Modern consoles like the Xbox One and Xbox Series X both support Dolby Vision. Like with movies, game developers have to enable HDR support in their games. There are a handful of Dolby Vision games available for the PC and consoles, including Borderlands 3, F1 2021, and Call of Duty: Black Ops Cold War, to name a few. HDR10+ Gaming is an expansion of HDR10+ to focus on gaming, and PC gamers can take advantage of this, especially if you have a Samsung display, but consoles will stick with Dolby Vision support. Unfortunately, HDR isn't always implemented properly, so the actual performance varies.

Winner: Tie between Dolby Vision and HDR10+. Dolby Vision games are more widely available than HDR10+ games, especially when it comes to consoles, but HDR10+ is slowly making its way into the PC gaming world.

The vast majority of monitors support HDR, but this doesn't mean they're good for HDR, as they're behind TVs in that regard. Most monitors only support HDR10 and not HDR10+ or Dolby Vision, so you don't get dynamic metadata, and they usually have low contrast and low HDR peak brightness. If you want the best HDR experience possible, watch content on a TV.

Dolby Vision, HDR10+, and HDR10 aren't the only HDR formats. There's also HLG, or Hybrid Log Gamma. All modern TVs support it, and it aims to simplify things by combining SDR and HDR into one signal. It's ideal for live broadcasts, as any device receiving the signal can play it: if the device supports HDR, it displays the HDR version; if it doesn't, the SDR portion of the signal is played. As it's intended for live broadcasts, there's very little HLG content available.

Between Dolby Vision and HDR10+, there"s no clear winner from a technical standpoint because they both use dynamic metadata to help improve the overall quality. HDR10+ almost matches the capabilities of Dolby Vision but is lacking in content, and not as many TVs support HDR10+ as Dolby Vision. HDR10 has the distinct advantage of having more content available and being supported by every 4k TV.

Ultimately, the difference between the three formats isn't that important. The quality of the TV itself has a much bigger impact on HDR. The dynamic formats can produce much more dynamic images than what we used to see, and HDR delivers a more impactful movie experience as long as the TV displays it properly. There are limitations, though, because TVs can't yet reach the 10,000 nit peak brightness and all the colors HDR is capable of, but most TVs still deliver a satisfying HDR experience.

HDR vs SDR: What the Difference Means

High Dynamic Range (HDR) is the next generation of color clarity and realism in images and videos. Ideal for media that require high contrast or mix light and shadows, HDR preserves the clarity better than Standard Dynamic Range (SDR).

Continue reading to learn more about HDR technology (and get a handy checklist for making the switch). If you’d like to take a look at our range of monitors designed specifically for color accuracy, click here.

Chances are you have probably already heard about High Dynamic Range and how it’s going to take your viewing experience “to the next level”. However, since HDR is still a fairly new technology, some people are still not quite clear about how it actually works.

HDR is an imaging technique that captures, processes, and reproduces content in such a way that the detail of both the shadows and highlights of a scene are increased. While HDR was used in traditional photography in the past, it has recently made the jump to smartphones, TVs, monitors, and more.

For the purposes of this article, we will focus on HDR video content. To understand high dynamic range, we must first understand how standard dynamic range works.

When a monitor has a low contrast ratio or doesn’t operate with HDR, it is common to see colors and detail in an image being “clipped” as a result of the monitor’s display capabilities. As we mentioned earlier, any information that has been clipped is lost and cannot be seen. When a monitor is trying to reproduce a scene with a wide range of luminance, this problem becomes even more pronounced.

HDR remedies this problem by calculating the amount of light in a given scene and using that information to preserve details within the image, even in scenes with large variations in brightness. This is done in an attempt to create more realistic-looking images.

SDR, or Standard Dynamic Range, is the current standard for video and cinema displays. Unfortunately, it can represent only a fraction of the dynamic range that HDR is capable of. HDR therefore preserves details in scenes where the contrast ratio of the monitor could otherwise be a hindrance; SDR lacks this aptitude.

To put it simply, when comparing HDR vs. SDR, HDR allows you to see more of the detail and color in scenes with a high dynamic range. Another difference between the two lies in their inherent measurement.

For those of you familiar with photography, dynamic range can be measured in stops, much like the aperture of a camera. This reflects adjustments made to the gamma and bit depth used in the image and will differ depending on whether HDR or SDR is in effect.

On a typical SDR display, for instance, images will have a dynamic range of about 6 stops. Conversely, HDR content is able to almost triple that dynamic range, with an approximate total of 17.6 stops. With SDR’s narrower range, dark grey tones in a dark scene may become clipped to black, while in a bright scene some colors and detail may become clipped to white.

Dolby Vision is an HDR standard that requires monitors to have been specifically designed with a Dolby Vision hardware chip, for which Dolby receives licensing fees. Dolby Vision uses 12-bit color and a 10,000 nit brightness limit. It’s worth noting that Dolby Vision’s color gamut and brightness level exceed the limits of what can be achieved by displays being made today. Moreover, the barrier to entry for display manufacturers to incorporate Dolby Vision is still high due to its specific hardware support requirements.

HDR10 is a more easily adoptable standard and is used by manufacturers as a means of avoiding having to submit to Dolby’s standards and fees. By comparison, HDR10 uses 10-bit color and allows content to be mastered at 1,000 nits of brightness.

HDR10 has established itself as the default standard for 4K UHD Blu-ray disks and has also been used by Sony and Microsoft in the PlayStation 4 and the Xbox One S. When it comes to computer screens, some monitors from the ViewSonic VP Professional Monitor Series come equipped with HDR10 support.

It’s an easy mistake to make, especially if you own an HDR television, to assume that all content is HDR content. This is not the case; not all content is created equally!

To provide a relevant example, if you own a 4K television, you won’t be able to benefit from the 4K detail unless the content you’re watching is also in 4K. The same goes for HDR, in that in order to enjoy it, you’ll need to ensure that your viewing material supports such an experience.

Currently, HDR content is available in a number of ways. In terms of streaming, Netflix supports HDR on Windows 10, and Amazon Prime has jumped onto the HDR bandwagon as well. In terms of physical content, there are HDR Blu-ray disks and players available for purchase, along with the built-in players on the Sony PlayStation 4 and the Microsoft Xbox One S gaming consoles.

Once you’ve got your HDR content cued up, whether it be an HDR video or an HDR game, you’ll have to make sure your setup is capable of displaying that HDR content.

HDR can be displayed over HDMI 2.0 and DisplayPort 1.3 connections. If your GPU has either of these ports, it should be capable of outputting HDR content. As a rule of thumb, all Nvidia 9xx-series and newer GPUs have an HDMI 2.0 port, as do all AMD cards from 2016 onward.

As far as your display goes, you’ll have to make sure that it too is capable of supporting HDR content. HDR-compatible displays must have a minimum of Full HD 1080p resolution. Products like the ViewSonic VP3268-4K and VP2785-4K are examples of 4K monitors with HDR10 content support. These monitors also factor color accuracy into the equation in an attempt to make sure that on-screen images look as true to life as possible.

In the case of the latter-most example, high-definition televisions did not become available in the United States until 1998, and did not become popular until five to eight years thereafter. Today, Full HD resolution is commonplace throughout the country, with 4K UHD acting as the new fringe option.

The current modern battle between HDR vs. SDR reflects the same trend. Although HDR has been commonplace in the photography world for some time, it is the new kid on the block in terms of television and monitors. Some have even gone so far as to argue that 1080p HDR looks better than 4K SDR!

While of course, nothing is ever 100% certain, HDR technology has fortune in its favor. Currently, its inherent technology is tied closely to that of ultra-high definition resolution, otherwise known as 4K.

Since 4K is being adopted by the general market with remarkable ease and speed, it stands to reason that HDR will follow the same course going forward. We can compare HDR vs. SDR all day, but whether or not HDR is good for you will ultimately come down to your own personal experience. For now, feel free to explore ViewSonic’s range of HDR-compatible ColorPro monitors or dive deeper into the world of color correction and color grading.

Luckily for all of the early adopters out there, HDR products are not hard to come by. The benefits of HDR even extend into gaming by allowing you to see more detail in your games for a more realistic feel.

For more on gaming monitors with HDR, you can check out the ViewSonic XG3220 and XG3240C, which are equipped with gaming-centric features and HDR compatibility.

Panel Specifications

Anti-glare, 100% Adobe RGB color gamut, 100% sRGB color gamut, Hard Coating (3H), 100% Rec 709 color gamut, HDR (high dynamic range), 97.7% DCI-P3, 76.9% Rec 2020 color gamut

MacBook Pro Liquid Retina XDR Display Review

There are two versions of the new MacBook Pro and we've got the 16-inch version, although the 14-inch model's display is very similar, just smaller and with a different resolution. Apple calls this particular display a "Liquid Retina XDR display", which is typical Apple marketing speak. If I translate this into what Apple actually means, they are giving you a high resolution full array local dimming mini-LED LCD with true HDR functionality.

If we dive deeper into the specs, the 16.2-inch panel has a resolution of 3456 x 2234 which continues Apple's tradition of using non-standard resolutions across their line-up. Apple doesn't disclose the exact technology used here, but it's an LCD panel which appears to be IPS-like in design. The backlight has 10,000 mini-LEDs for impressive zone density at this size, allowing for a contrast ratio of 1,000,000:1 and peak brightness up to 1,600 nits in the HDR mode on paper.

As for refresh rate, Apple are offering up to 120Hz with adaptive sync, which they've rebranded as "ProMotion", although this sort of functionality has been available for many years now in other laptops and displays. The combination of everything though is a first, and the only rivals to this sort of panel are the latest wave of 4K OLED panels seen in a few high-end Windows laptops.

The MacBook Pro"s display is a wide gamut display with 99% coverage of the DCI-P3 color space. That"s an excellent result for any creator looking to produce content in that gamut. This also means perfect sRGB coverage, so if you"re designing web content, creating SDR videos, or working with wide gamut HDR videos then Apple is providing you the tools to do that.

Based on this you should probably just leave your MacBook in the Apple Display mode for everyday use as it's accurate enough for sRGB content and will also let you benefit from wide gamuts where needed. The performance in the Apple Display XDR mode is similar as well for SDR content, so that's an option if you want to also use HDR at times.

In the HDR mode, brightness is extremely impressive. There's no major difference between sustained and peak brightness, so there's no automatic brightness limiter that activates after a short period to dim the screen in intensely bright scenes. Brightness is as high as 1670 nits at small window sizes, and over 1500 nits at 50%, before dropping to around 1150 nits for a full screen sustained white window. That's impressive, although it does come with a corresponding increase to power consumption, so running the display at over 1000 nits all the time isn't advisable on battery.

Contrast behavior is also different in HDR compared to SDR. When displaying HDR content, the mini-LED backlight will, at times, fully switch off to display black, delivering an effectively infinite contrast ratio. That's the best case performance you'll see. In more tricky conditions, such as a checkerboard test or measuring light and dark areas close together, I measured a contrast ratio of slightly over 50,000:1. This is right where you'd want performance to be for HDR content: contrast ratios of 50,000:1 worst case and up to 1,000,000:1 or greater in other situations. Apple are meeting all the recommendations for performance that I've heard when speaking to HDR, calibration and mastering experts.
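The arithmetic behind those figures is plain division of the measured luminances (the example values below are illustrative, not my measurements), which is also why a fully switched-off zone reads as "infinite":

```python
def contrast_ratio(white_nits, black_nits):
    """Simultaneous contrast between a white patch and a black patch."""
    if black_nits == 0:
        return float("inf")  # backlight zone fully off
    return white_nits / black_nits

print(contrast_ratio(1000, 0.02))   # 50,000:1  checkerboard-style case
print(contrast_ratio(1000, 0.001))  # 1,000,000:1  spec-sheet case
print(contrast_ratio(1000, 0))      # inf  zone switched off entirely
```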

This performance also destroys basically any other LCD-based monitor I've looked at before. On the standalone monitor side, it's virtually unheard of right now to see LCD zone counts higher than a couple of thousand. This limits worst-case contrast to around 12,000:1 in the case of the 2,000-zone Samsung Odyssey Neo G9 with VA technology, or just 4,000:1 in a checkerboard test.

Apple choosing to use 5-10x the zone count massively improves the achievable contrast ratio in tricky situations and I'd say this amount of zones - and the density of zones - is what is required as a minimum for the best HDR experience with an LCD panel. Even Apple's own ridiculously overpriced Pro Display XDR doesn't compare as it has a paltry 576-zone backlight and it was criticized at launch for poor blooming compared to professional level HDR mastering displays. The MacBook Pro's display will be far better for producing HDR content, aside from the small size.

When actually viewing HDR content, the level of blooming is pretty minimal, even in tricky conditions like viewing Christmas lights or starfields. However it's not completely free of blooming, and the halo-like glow effect can be visible in some conditions if you look for it.

So from one perspective it's easily one of the best LCD-based HDR experiences I've seen, but on the other hand it isn't a self-lit panel like an OLED, which is completely free of blooming, and in some situations OLED still delivers better HDR. Of course, OLEDs have other drawbacks such as lower brightness levels and the risk of burn-in, so I can understand why Apple would opt for LCD instead. Besides this one complaint though, the HDR experience is excellent, especially for a laptop.

Unfortunately there is a major drawback to the Liquid Retina XDR display used on the new MacBook Pros, and that's the motion performance. While it's nice to see Apple upgrade the refresh rate to 120Hz compared to the 60Hz they were using previously, the display being used here doesn't have the appropriate level of response times to keep up with that 120Hz refresh rate. The panel is actually very, very slow, which is a disappointment.

This is exacerbated by the combination of IPS-like LCD technology and an always-active mini-LED backlight: both the LCD layer and the mini-LEDs need to change to complete a transition.

Luckily full transition fall times aren't as horrific, though still reasonably poor at over 15ms even with our very generous 20% tolerance. The real transition time is more like 35ms, so less than half that of the rise time, but far slower than most other LCDs out there. The best laptop grade OLED panels can perform these transitions in under 2ms with the same test conditions, making them an order of magnitude faster.

I tested a few more transitions of varying degrees and typically the MacBook Pro would fall between 20 and 40ms, though luckily there is no overshoot to speak of. When viewing UFO test results, you can see the product of these horrific response times: a substantial blur trail behind moving objects. Even though the panel can feel somewhat smooth to use because it has a moderate refresh rate of 120Hz, the actual clarity in motion is terrible and this impacts the usefulness of the higher refresh rate.

Right next to the MacBook Pro we have the Aero 15 OLED's panel, which has half the refresh rate at just 60Hz but massively faster response times. You'll see here that even though the MacBook Pro's display is twice as fast in refresh rate, the extremely slow response behavior limits motion clarity to more like a 60Hz monitor or worse. The level of smearing is insane and I'm not sure how a modern LCD could end up this slow; Apple really should have experimented with some sort of overdrive.

The Liquid Retina XDR display has impressive HDR specifications and performance. A mini-LED backlight zone count of 10,000 is the star of the show in this respect, significantly reducing blooming compared to other LCD-based HDR monitors, and providing exceptionally high brightness. The level of performance is good enough for both enthusiast level mastering and HDR playback, so the MacBook Pro is a great device for video editing on the go when you also factor in its overall performance.

A few nitpicks aside, the major downside to the display is motion performance. This display is exceptionally slow even for an LCD, despite packing a 120Hz refresh rate. This affects areas including web browsing and any work with text as you scroll through content, and blur trails can be visible across a wide range of use cases, not just gaming. It's not bad enough to negate the benefits you get elsewhere, but Apple needs to put a lot of work into optimizing how quickly their panels transition. I also find the lack of HDMI 2.1 on the MacBook Pro a bit puzzling; going HDMI 2.0 for external monitors (in addition to Thunderbolt) is a bit annoying.

The only real competition right now are OLED panels, which come with their own set of strengths and weaknesses. There are a few other mini-LED laptop options on the Windows side, like the screen you get in the Acer Predator Helios 500, but that display only has 512 zones, not the 10,000 on offer here. So it's a battle between the MacBook and the OLEDs you see in products like the Gigabyte Aero 15 OLED.

The reasons to get an OLED display over this LCD would be in terms of its self-lit pure HDR experience with zero blooming, significantly faster response times for better motion clarity, and wider color gamut allowing for accurate work in the Adobe RGB color space as well as P3 and Rec.709. However, the drawbacks are also significant, including a 60Hz refresh rate limitation with current 4K offerings, the risk of permanent burn-in, and significantly lower brightness. Actual implementations we've seen also lack the calibration Apple is offering.

On the balance of things, I'd prefer to get the Liquid Retina XDR in the new MacBook Pro than an OLED, especially for color-accurate content creation, and the HDR experience is close enough to OLED that I can forgive very minor blooming on occasion. I wouldn't say Apple is miles in front with this screen, but it's certainly very impressive and calling it the best display for production work is justified.

10-bit Color and Chroma Subsampling

But 10-bit color is your ticket to producing High Dynamic Range (HDR) content. Thankfully, 10-bit displays are increasing as HDR TVs become more common. Some phones support HDR now, and even some 8-bit displays can fake it using a technique called frame rate control (FRC).

Chroma subsampling is a separate beast altogether. This is often called color resolution, as compared to the spatial resolution, like 4K. As an example, 4K Ultra HD video has a spatial resolution of 3,840 x 2,160 pixels — but the color of each pixel is derived from a much smaller sampling than that.

Note that color resolution is tied to spatial resolution. A 4K video with 4:2:0 subsampling will still sample color from more pixels than a Full HD video with 4:2:2 subsampling.
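To see why, here is a sketch of the chroma-plane math. Real encoders differ in filtering and sample siting; this simply averages 2x2 blocks, which is the gist of 4:2:0:

```python
import numpy as np

def subsample_420(chroma):
    """Average each 2x2 block: one chroma sample per four luma pixels."""
    h, w = chroma.shape
    return chroma.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

cb = np.random.rand(2160, 3840)  # a chroma plane of a 4K frame
print(subsample_420(cb).shape)   # (1080, 1920)
# 4K 4:2:0 keeps 1920x1080 chroma samples, still more than the
# 960x1080 chroma samples of Full HD 4:2:2
```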

The Lumix GH5 was one of the first cameras that offered internal 4K recording with 10-bit 4:2:2 color, which can save videographers time and money by not requiring an external recorder.

LG 4K IPS Monitors

IPS Monitors: Boasting crystal-clear displays and true-to-life colors, LG 4K Ultra-HD IPS monitors let you see the bigger picture. Browse our selection of IPS monitors available in a range of sizes and ground-breaking features.