Category: The Front Line
Of Samsung, Big Screens, IoT, HDR, And Patience
- Published on Wednesday, 13 April 2016 16:30
- Pete Putman
- 0 Comments
Last Tuesday, April 12, Samsung held its annual press briefing and TV launch event at its new, “hip” facility in the Chelsea section of Manhattan. The multi-story building is known as Samsung837 (like a Twitter handle), as its location is on 837 Washington Street by the High Line elevated walkway.
Samsung, which has dominated the worldwide television market for many years – and which has a pretty good market share in smartphones, too – has been a leader in developing Ultra HD (4K) televisions with high dynamic range and wider color gamuts, most notably in its SUHD line.
At the briefing, they announced their new, top-of-the-line Ultra HDTVs, equipped for high dynamic range with quantum dot backlights manufactured by Nanosys of Sunnyvale, CA. There are a few new sizes in the line that are re-defining what a “small” TV screen means! The flagship model is the KS9800 curved SUHDTV, which will be available in a 65-inch size ($4,499), 78 inches ($9,999), and a mammoth 88-inch version ($19,999).
Stepping down, we find the KS9500-series, with a 55” model for $2,499, a 65” model for $3,699, and a 78” model for $7,999 (June). The flat-screen KS9000 comes in three flavors – 55” ($2,299), 65” ($3,499), and 75” ($6,499, June). There are two entry-level SKUs (if that’s even the right term to use) as well – the KS8500, a curved-screen version, is aimed at consumers wanting a smaller screen, with a 55” model for $1,999 and a 65” model for $2,999; a 49” model will be available in May for $1,699. The line is rounded out with the KS8000 flat SUHDTV (55” $1,799; 65” $2,799; with a 49” model for $1,499 and a 60” model for $2,299, both to come in May).
There’s not a huge difference between these models – the differences have mostly to do with curved and flat surfaces and the screen size options available. Plus a bevy of “bells and whistles.” Perhaps the most intriguing are a set of “connect and control” features.
Samsung’s been offering a Smart Hub feature for some time, and this year’s iteration lets you plug in a cable box from Comcast or Time Warner or a set-top from DirecTV, and the TV will automatically recognize the box and set up all the required control functions on the Samsung TV remote. All you have to do is plug in an HDMI cable.
On top of that, Samsung’s SmartThings feature provides on-off control of things like locks, lamps, and other devices connected via Wi-Fi, ZigBee, or Z-Wave protocols. The company offers switchable outlets, water sensors, proximity sensors, and motion sensors, all of which connect back to your television and smartphone for monitoring and control. (And yes, the television can also be controlled by this system.)
Samsung’s concept is this: Since we spend so much time in front of our big screen TVs, why not make them the hub of a home monitoring and control system? And why not make the connection and activation of everything from set-top boxes to remotely-controlled AC outlets a plug-and-play operation? A SmartThings starter kit is available for $249, and you can add compatible ZigBee and Z-Wave devices like thermostats, smoke and CO detectors, and locks from companies like Honeywell, Schlage, Cree, Leviton, and First Alert.
So why are Samsung and other TV manufacturers looking to get into home control systems? A combination of declining TV sales and falling prices has resulted in an accelerating transition away from Full HD (1920×1080) televisions and displays to Ultra HD (3840×2160), as TV manufacturing shifts to China and manufacturers frantically search for profitability.
Samsung – likely motivated by this trend – is looking for a way to add value to TV sales, pitching a complete home entertainment and control system (with sound bars, surround audio, and Ultra HD Blu-ray players, of course) to consumers. It’s all about the Internet of Things (IoT) – the idea that every electronic gadget in your home has an IP address and can be controlled with a driver and an app.
Think about this for a moment: Seven years ago, a first-tier 50-inch 1080p plasma equipped with active-shutter 3D playback was priced at $2,500. Today, you can buy four times the resolution, eight times the brightness, a much wider color gamut, a much lighter set with lower power consumption, and five more inches of screen for about $600 less.
Amazing, you’re thinking. My next TV is going to be an Ultra HDTV! Good thinking: any TV you buy at 55 inches or larger will probably be an Ultra HD set anyway, since TV manufacturers are ramping down production of 1080p sets and retailers are devoting more shelf space to UHD.
While there are and will continue to be some amazing deals on Ultra HD sets, don’t forget the enhancements. In addition to the aforementioned high dynamic range and wider color gamut, higher frame rates (HFR) will also become a part of the UHD ecosystem. (So will 8K displays, but I’m getting ahead of myself…)
The problem is that no two companies are implementing all of these add-ons the same way. We have competing systems for HDR (Dolby Vision, Technicolor, BBC/NHK HLG, and yes, Samsung), and yet another controversy about pixel resolution in displays using the PenTile red-green-blue-white (RGBW) pixel array (LG’s new Ultra HD OLEDs).
To date, only two HDR Blu-ray players have been announced, and only one (Samsung) is available at retail. A bigger problem: Many Ultra HDTVs have only one HDMI 2.0 input, which needs to support the CTA 861.3 HDR metadata standard. (DisplayPort 1.4 also works with CTA 861.3, but it was just announced). And HDMI 2.0 is barely fast enough for 4K HDR: If you want to connect a PC for Ultra HD gaming at 60Hz with 10-bit RGB (4:4:4) color, you’re out of luck.
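The bandwidth shortfall is easy to see with a little arithmetic. A rough sketch, assuming the standard CTA-861 4K/60 video timing (4400 × 2250 total pixels, blanking included) and HDMI’s 8b/10b TMDS encoding overhead:

```python
# Rough check of why HDMI 2.0 falls short for 4K/60 at 10-bit RGB 4:4:4.
# Assumes the standard CTA-861 4K/60 timing (4400 x 2250 total pixels,
# blanking included) and 8b/10b TMDS encoding overhead; HDMI 2.0 tops
# out at an aggregate 18 Gbps TMDS rate.

HDMI20_MAX_GBPS = 18.0

def tmds_rate_gbps(h_total, v_total, fps, bits_per_channel, channels=3):
    """Aggregate TMDS bit rate in Gbit/s, including 8b/10b overhead."""
    pixel_clock = h_total * v_total * fps                 # Hz
    payload = pixel_clock * bits_per_channel * channels   # bits/s
    return payload * 10 / 8 / 1e9                         # 8b/10b = x1.25

rate_8bit = tmds_rate_gbps(4400, 2250, 60, 8)    # ~17.8 Gbps: just fits
rate_10bit = tmds_rate_gbps(4400, 2250, 60, 10)  # ~22.3 Gbps: too fast
print(f"8-bit: {rate_8bit:.2f} Gbps, 10-bit: {rate_10bit:.2f} Gbps")
```

Eight-bit 4:4:4 squeaks under the 18 Gbps ceiling; ten-bit does not, which is why 4K/60 HDR sources fall back to 4:2:2 or 4:2:0 chroma subsampling over HDMI 2.0.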
In other words, it’s chaos as usual in the CE world – much like HDTV circa 1998. I don’t know how quickly these issues will be worked out. All HDR-10 compatible TVs should play back 10-bit content from Ultra HD Blu-ray discs and media files. When it comes to enhanced HDR systems, Vizio, TCL, and LG support Dolby Vision, but Samsung does not; nor do Panasonic and Sony.
Only a handful of TV models have opted to include the still royalty-free DisplayPort interface to overcome some of the UHD speed limit issues of HDMI. 4K content isn’t exactly in abundance, either. No broadcasts are planned in the near future, and a handful of cable systems are working on 4K channels (remember the 3D channels from Comcast and DirecTV?). Netflix and Amazon Prime do stream in UHD, but you need a TV that supports the VP9/VP10 and H.265 codecs to watch.
If you are considering a purchase of an Ultra HDTV and not in a big hurry, my advice is to sit on your hands for another year until many of these issues get ironed out. Sometimes doing nothing really is the best option…
To Cut, Or Not To Cut: That Is The Question…
- Published on Monday, 11 April 2016 18:00
- Pete Putman
- 0 Comments
A recent report from Convergence Consulting Group states that by their estimates, 1.13 million TV households in the United States canceled pay TV services in 2015, which is about four times the pace of cancellations in 2014.
The report is somewhat humorously titled “The Battle For The North America Couch Potato” and shows that even though pay TV subscription revenue increased by 3% in 2015 to $105B and is expected to tick up another 2% in 2016 to $107B, those single-digit gains don’t match the rapid growth now being experienced by over-the-top (OTT) video services like Netflix and Hulu.
Over the same time period, OTT subscription revenue increased by 29% to $5.1B in 2015, and is expected to grow another 20% this year to $6.1B. Now, that’s just 5.6% of the revenue forecast for conventional pay TV this year. But the growth rate of OTT is impressive and is mostly at the expense of conventional cable, fiber, and satellite TV subscriptions.
Convergence also reports that “cord never” and “cord cutter” households increased to 24.6M in 2015 from 22.5M in 2014. It’s expected that number will continue to increase to 26.7M households by the end of this year. (For some perspective, Comcast has a total of about 23 million broadband subscribers, which is more than their pay TV subscriber total.)
It’s no mystery why OTT continues to grow in popularity. Services like Netflix, Hulu, and Amazon Prime allow viewers to watch individual movies and episodes of TV shows on demand for reasonable prices, either as part of a low monthly subscriber fee or an annual membership fee plus small per-viewing charges.
In essence, what OTT viewers get is a la carte TV, instead of paying a hundred dollars or more for a service bundle that includes large blocks of TV channels that never get watched. (The average TV viewer watches about 17 different channels in a year.) And the key to making that possible is ever-faster broadband speeds, which (perhaps ironically) are being offered by cable TV companies to hold off the likes of Verizon’s FiOS, DirecTV, and Dish.
It’s the old analogy of handing someone the rope with which they’ll be hanged. As Internet speeds increase along with cable bills, the first thing to get dumped is the pay TV channels. Many families have also dropped landline service in favor of mobile phones, so there’s no need for a “triple play” package (or even a “double play,” which in baseball means you’re out!)
There aren’t enough studies on hand to show how many of those cord cutters have picked up on watching free, over-the-air (OTA) digital TV broadcasts. And there continue to be disputes among advocacy groups as to how much of the population actually watches OTA TV: I’ve seen estimates as low as 5-6% and as high as 20%.
Now, the second part of the story: Vizio, a leading TV brand, is now shipping a line of SmartCast Ultra HDTVs that will be “tuner-free.” You read that right: these TVs won’t have an on-board ATSC tuner for OTA broadcasts. An external tuner would be required, along with an HDMI connection and an indoor or outdoor antenna.
Technically speaking, a “TV” sold in the United States MUST have an ATSC tuner built-in, according to the FCC mandate that set a final compliance deadline of March 1, 2007. However, there is no reason why a company can’t sell a “monitor” or “display,” which would not be required to contain such a tuner. (The original FCC mandate exempted monitors that did not include analog tuners from having a digital tuner.)
According to a story on the TechHive Web site, the changes will apply to all of Vizio’s 4K Ultra HD TVs with SmartCast, including the new P-Series and upcoming E- and M-Series sets. In the story, a Vizio representative was quoted as saying that the company’s own surveys showed that less than 10 percent of their customers watched OTA broadcasts, and that a CEA (now CTA) study in 2013 claimed that just 7% of U.S. households used antennas to watch TV.
That figure looks significantly low. In the third quarter of 2015, the research firm Nielsen found that 12.8 million U.S. homes were relying solely on OTA TV reception, up from 12.2 million the year before – and that number didn’t include homes that combine antenna broadcasts and streaming. All told, the percentage of homes that use an indoor or outdoor antenna in some way to watch TV probably falls between 10% and 12% – and could be even higher.
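A back-of-the-envelope check supports the antenna-only floor. The 12.8 million antenna-only homes is Nielsen’s Q3 2015 count; the roughly 116.4 million total is Nielsen’s TV-household estimate for the 2015-16 season (an assumption on my part, not a figure from the report):

```python
# Back-of-the-envelope check on the OTA share. 12.8M antenna-only homes
# is Nielsen's Q3 2015 count; the ~116.4M total is Nielsen's TV-household
# estimate for the 2015-16 season (an assumption here, not from the text).
ota_only_homes = 12.8e6
total_tv_homes = 116.4e6

share_pct = ota_only_homes / total_tv_homes * 100
print(f"Antenna-only share: {share_pct:.1f}%")  # about 11%
```

That 11% counts only antenna-only homes, so once hybrid antenna-plus-streaming households are added, the 10-12% range cited above is conservative.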
So why would Vizio drop the tuner? There’s certainly a cost savings associated with it, and not just for the hardware – there are also royalties associated with the underlying technology. But given that you can buy an outboard ATSC tuner for as little as $40, it can’t be a huge cost savings.
What’s funny about Vizio’s approach is that retailers are offering more antennas and even bundling streaming media players with antennas. I’ve even noticed that the selection of indoor TV antennas has grown at the local Best Buy (outdoor antennas are still a tough sell; only we hard-core OTA viewers will take the time to install them).
It doesn’t appear any other TV brands are following suit. However, there is a fly in the ointment: ATSC 3.0, which as a completely new standard would require an outboard set-top box or perhaps a USB stick to work with existing TVs. That’s because it supports different transmission modes that are incompatible with current ATSC tuners.
Another wrinkle – there’s no timeline for adoption of version 3.0. Right now, we’re in the middle of the first wave of FCC channel auctions, meaning that the UHF TV spectrum may be somewhat truncated after all is said and done – and many stations will have to relocate. So moving to a new terrestrial broadcast standard won’t be a priority for broadcasters any time soon.
A More Mobile Mobile is Coming
- Published on Tuesday, 29 March 2016 17:25
- Ken Werner
- 0 Comments
Data traffic on mobile networks will reach 367 exabytes — that’s 367×10^18 bytes or 367 billion gigabytes — in 2020, up from 44 exabytes in 2015, according to Cisco Systems’ recently released Visual Networking Index Global Mobile Data Traffic Forecast, 2015-2020.
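As a quick sanity check on those units, and to make the implied growth rate explicit:

```python
# Unit sanity check plus the implied annual growth rate behind Cisco's
# forecast (44 EB in 2015 to 367 EB in 2020).
EXA, GIGA = 10**18, 10**9

traffic_2015 = 44 * EXA
traffic_2020 = 367 * EXA

print(traffic_2020 // GIGA)  # 367,000,000,000 GB, i.e., 367 billion GB

# Compound annual growth rate over the five-year span
cagr = (traffic_2020 / traffic_2015) ** (1 / 5) - 1
print(f"Implied CAGR: {cagr:.0%}")  # roughly 53% per year
```

In other words, Cisco is forecasting mobile data traffic to grow by more than half every year for five years running.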
A lot of that is video, which accounted for 55% of all mobile data traffic in 2015, and will account for 75% in 2020, predicts Cisco. But significant contributions to growth are expected from automotive infotainment and mobile networks to support the Internet of Things, including connected cars, said a Cisco Global Technology Policy VP in a company blog on February 3.
Auto manufacturers recognize that in-car electronic technology is now a stronger driver of sales than traditional measures of automotive performance like torque, power, straight-line acceleration, and cornering ability.
Automotive connectivity is here now, of course, in such common systems as GPS navigation and radar detectors that use GPS to recognize fixed sources of X-band emissions, such as automatic doors, and not count them as “threats.”
But there is much more to come, with the integration of in-car systems, vehicle-to-vehicle communication, and vehicle-to-mobile-network communication culminating in autonomous vehicles. Although we are beginning to talk (a lot) about autonomous vehicles, non-specialists may not have given much thought to the various levels of vehicle automation. Fortunately, our friends at the U.S. National Highway Traffic Safety Administration (NHTSA) have.
The NHTSA defines five levels of vehicle automation.
Level 0: No automation
Level 1: Function-specific automation, such as electronic stability control and ABS. This is standard today.
Level 2: Combined function automation. An example is adaptive cruise control combined with lane centering. Such systems are widely available today as extra-cost options.
Level 3: Limited self-driving automation. The driver can, under certain conditions, cede full control of all safety-critical functions to the vehicle. The driver must be available for occasional control, or to take over when conditions are no longer suitable for self-driving.
Level 4: Full self-driving automation. The driver is expected to provide destination or navigation input, but not to control the vehicle at any time during a trip.
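For readers who work with telematics data, the taxonomy above maps naturally onto an enumeration. A sketch – the names and helper function here are my own illustration, not an official NHTSA encoding:

```python
from enum import IntEnum

class NHTSAAutomation(IntEnum):
    """NHTSA's five levels of vehicle automation (names are illustrative)."""
    NO_AUTOMATION = 0         # driver does everything
    FUNCTION_SPECIFIC = 1     # e.g., stability control, ABS
    COMBINED_FUNCTION = 2     # e.g., adaptive cruise + lane centering
    LIMITED_SELF_DRIVING = 3  # car drives; driver must stay on call
    FULL_SELF_DRIVING = 4     # driver only supplies the destination

def driver_always_in_loop(level: NHTSAAutomation) -> bool:
    """True for Levels 0-2, where the driver never leaves the loop."""
    return level <= NHTSAAutomation.COMBINED_FUNCTION

print(driver_always_in_loop(NHTSAAutomation.COMBINED_FUNCTION))     # True
print(driver_always_in_loop(NHTSAAutomation.LIMITED_SELF_DRIVING))  # False
```

The boundary that helper draws – between Levels 2 and 3 – is exactly where the human-factors trouble discussed below begins.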
At TU Automotive’s one-day “Consumer Telematics Show,” held in Las Vegas during CES 2016, the members of a panel entitled “Making Autonomous Driving a Reality” opined that Level 2 is an impressive offering that makes commutes much less tiring. Level 4 is much harder to do, but Level 3 is tricky because it requires the driver to switch from complete uninvolvement to full control, perhaps rather quickly (although the NHTSA definition specifies that a reasonable amount of time should be provided for the driver to do this). People tend not to be good at this sort of transition, particularly when they don’t have the ongoing situational awareness an attentive driver would traditionally have. One panel member suggested that this Level 3 scenario may be unworkable and that we will have to leap from Level 2 to Level 4.
A final comment from a panel member was “Don’t forget the social component of the human-car interaction.” As the car does more, the driver is likely to anthropomorphize the car. Man-machine communication would be optimized if the car could respond in kind. “Hello, Chevy….” “Hello, Ken. My but you look sporty today.” What kind of holographic avatar should Chevy have to optimize this communication? For that matter, what kind of avatar should I have? I can’t have Chevy seeing me the way I really am.
OLED-TV Is Real: Sales Reached $1 Billion in 2015
- Published on Wednesday, 09 March 2016 17:10
- Ken Werner
- 0 Comments
One billion dollars’ worth of OLED-TV sets were sold last year, seven times the sales in 2014, according to a recent IHS report. Ninety percent of that market belonged to LG, but Shin Hyun-jun, an analyst with LIG Investment & Securities Co., expects Samsung to enter the OLED-TV market in late 2017, according to a late-February story from South Korea’s Yonhap News Agency.
This increase in TV sales drove a 12% increase in shipments of light-emitting OLED materials last year, for a yearly total of 26,000 tons, according to the latest issue of IHS’s “OLED Materials Market Tracker.” Revenues from these materials were $465 million, and IHS expects them to reach $1.8 billion in 2018.
Kihyun Kim, IHS Technology’s senior analyst for chemical materials research, said, “The market for small and medium OLED displays is stable, and OLED TV shipments are increasing, which is supporting OLED light-emitting materials market growth. Shipments of organic light-emitting materials for WOLED are expected to increase along with WOLED TV shipments, as more manufacturers are planning to adopt the technology. WOLED materials are expected to outstrip fine-metal-mask red-green-blue (FMM RGB) materials in 2017 for the first time.”
That means that for the first time OLED-TV will be the primary driver of OLED materials sales, not cell phones and tablets, and that change will be remarkably rapid — after years of not being rapid at all. FMM RGB materials took 82% of the market last year; WOLED will take 51% of shipments in 2017 and 55% in 2018, predicts IHS. Revenue growth for WOLED materials will be greater than shipment growth because WOLED materials remain more expensive than FMM RGB materials for now.
Samsung’s initial foray into OLED-TV was forced by LG’s aggressive leap into WOLED-TV sets. Samsung used its FMM RGB technology, and quickly withdrew from the market, saying the technology was not yet ready for commercial deployment. In the last year, Samsung has ramped up its OLED-TV development program significantly. Initial reports indicated that Samsung was pursuing a WOLED approach generally similar to LG’s. More recently, “remorts” (more than a rumor but not quite a report) have indicated a parallel development program for FMM RGB TV panels, which could use some kind of hybrid approach.
OLED-TV is also riding one of the overall TV market’s bright spots: 4K TV. (All of LG’s 2016 OLED-TVs are 4K). Half of 55-inch-and-larger TV shipments were 4K, and even at screen sizes of 48 to 50 inches, the 4K share was 30%, according to the latest IHS “TV Sets Market Tracker.”
For 2016, LG announced that its OLED-TV offerings will consist of a number of sets in several families, rather than just one or two models. This marks the evolution of OLED-TV into a product line-up appealing to a wider range of consumers at a wider range of price points. The strategy is being supported by the construction of a new advanced-generation panel fabrication facility.
At CES, LG’s top-of-the-line 4K UHD Premium sets were clearly “best in show.” With improving panel yields and significantly increased manufacturing capability, and with competition from Samsung just over the horizon, prices will continue to decline and will drive significantly increasing material, panel, and TV-set sales. Finally, OLED-TV as a premium mass-market product is within sight. OLED-TV is Real.
“HDR” Is Coming To Your Next TV. So What, Exactly, Does That Mean?
- Published on Monday, 07 March 2016 12:21
- Pete Putman
- 0 Comments
Thinking about buying a new Ultra HDTV? You might want to wait a few months…or maybe a year. HDR is coming!
I know, I know. It seems like the new TV you just bought is already obsolete (although it really isn’t; just a little behind the times). You can’t keep up – first it was 720p plasma, then the market moved to 1080p. Then it was 1080p LCD, followed by super-thin LCD televisions. Then “smart” TV and 3D (although the latter died a quick, merciful death).
And now, it’s Ultra HD. And OLED TV. When will it stop? Answer – it won’t, not with overcapacity for panel manufacturing in Asia and plummeting retail prices for bigger screens. In fact, as I’ve pointed out numerous times before, Ultra HD and Full HD televisions have essentially reached price parity. In many cases, an extra $100 will buy you Ultra HD resolution in the same screen size. Or $50 will get you an Ultra HDTV with five fewer inches of screen size.
The way things are heading, your next television purchase is almost certain to be an Ultra HDTV, provided it’s 50 inches or larger and you buy it no earlier than December. By then, prices will have fallen so much on UHD models that it wouldn’t make any sense to invest in a newer Full HD model. Not only that, but retailers are already allocating a larger percentage of inventory to Ultra HDTVs, cutting back on the number of Full HD models they stock.
There’s another reason you’ll want to wait until December (or later) to pick up a new Ultra HDTV, and that’s HDR – or, more specifically, high dynamic range.
HDR is the latest enhancement to come to television. Unlike 3D, you don’t need any special eyewear to see it. And the difference between standard televisions and HDR sets is dramatic – much brighter whites and higher contrast ratios on LCDs, greater shadow detail and brighter highlights on OLEDs. In other words, television pictures that approximate what your eyes see every day.
In the world of photography, we measure exposures in “stops” of light, like f2.8, 4, 5.6, 8, etc. Think of standard dynamic range as something in the range of 8 to 10 stops. In comparison, HDR can represent a minimum of 15 stops of light, with each additional stop being twice as bright as the previous one. (Some advanced HDR cameras can capture 20 stops of light!)
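Since each stop doubles the light, n stops span a 2^n luminance ratio, which makes the gap between SDR and HDR easy to quantify:

```python
# Each photographic stop doubles the light, so n stops span a 2**n
# luminance ratio between the darkest and brightest representable levels.
def stops_to_ratio(stops: int) -> int:
    """Luminance ratio covered by a given number of stops."""
    return 2 ** stops

print(stops_to_ratio(10))  # 1024  - roughly standard dynamic range
print(stops_to_ratio(15))  # 32768 - the HDR minimum cited above
```

Five extra stops don’t sound like much, but they multiply the representable brightness range by a factor of 32.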
It’s hard to describe the concept of HDR in words, but trust me; when you see it, you’ll know it. Combined with Ultra HD resolution, it’s an entirely different viewing experience from anything you’ve seen before. Even plain vanilla Full HDTV looks different with HDR content.
HDR has become such a big deal that a good portion of the Day 2 session at the recent Hollywood Post Alliance Technology Retreat was devoted to this topic, with a couple dozen speakers covering all aspects of capture, post, mastering, and distribution to the home. And to be honest, not many of these experts know how it will all work in the end, especially when it comes to the consumer viewing experience.
So, what do you need to watch HDR? First off, your TV must have some way of reproducing the high dynamic range signal, which means the basic white LED backlight with color filters used by just about every garden-variety LCD TV won’t work. Instead, you’ll want to look for LCD televisions using enhanced backlighting technology like quantum dots.
Quantum dots (QDs) are tiny nanocrystalline chemical compounds that emit high-intensity color light when stimulated by photons, usually from blue or ultraviolet light sources. (That’s the “quantum energy” effect.) Several different companies manufacture quantum dots – QD Vision makes them in light pipes for thin LCDs, while Nanosys and 3M have joined forces to produce a QD film layer for LCD displays.
Presently, Samsung (S-LCD), Vizio, and Sony (certain Triluminous models) sell Ultra HDTVs with quantum dot technology, and are soon to be joined by TCL and Hisense. LG has also shown LCD TVs with quantum dot technology, but they have a trick up their sleeve – organic light-emitting diode (OLEDs) televisions.
OLED technology can also reproduce HDR signals. LG’s white OLED emitters work with color filters in a red-green-blue-white stripe to achieve high brightness and strong color saturation, easily achieving the 15-stop threshold. While OLEDs can’t hit the peak brightness levels of HDR LCDs (800 nits or more), they do much better coming out of black and reproducing very low luminance steps – something that LCDs can’t do without tricks like dynamic backlight dimming and contrast/black level manipulation.
At the 2016 CES, the Ultra HD Alliance released their specifications for “premium” Ultra HD, a/k/a HDR. The sets must have a minimum resolution of 3840×2160 pixels and reproduce HDR signals using the SMPTE ST 2084 standard, with a minimum of 10 bits per color channel. (The current Blu-ray format, along with broadcast, cable, satellite, and streaming TV services, relies on 8-bit color formatting.)
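SMPTE ST 2084 defines the “PQ” electro-optical transfer function (EOTF) that maps a code value to absolute luminance. A minimal decode-side sketch, using the constants from the standard:

```python
# SMPTE ST 2084 (PQ) EOTF: decodes a normalized signal value in [0, 1]
# to absolute luminance in nits (cd/m^2). Constants are from the standard.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Map a normalized PQ code value (0.0-1.0) to luminance in nits."""
    e = signal ** (1 / M2)
    y = (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)
    return 10000.0 * y  # PQ's full range tops out at 10,000 nits

print(pq_eotf(0.0))  # 0.0 nits (black)
print(pq_eotf(1.0))  # 10000.0 nits (peak of the PQ range)
```

Note that PQ encodes absolute luminance up to 10,000 nits; today’s “premium” sets reproduce only the bottom portion of that range.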
For LCD Ultra HDTVs, the specification calls for a black level no higher than .05 nits (it can be lower) and a minimum brightness of 1,000 nits. For OLED TVs, the black level must be no higher than .0005 nits and white has to hit 540 nits. If you’re interested in the resulting contrast ratios, they work out to 20,000:1 for LCDs and over 1,000,000:1 for OLEDs.
Hand-in-hand with HDR is a new, wider gamut of colors (WCG) known formally as ITU Recommendation BT.2020. The “2020” color space is quite a bit larger than the current ITU Rec.709 color space that came into use with digital TV. With this new space, you’ll see brighter, more saturated greens and reds and over a billion shades of color. (8-bit color is limited to 16.7 million shades.) And to reproduce those shades of color, you need more horsepower under the hood. (Hence quantum dots and OLEDs.)
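Those specification numbers translate directly into the contrast ratios and shade counts quoted above:

```python
# Contrast ratios implied by the UHD Alliance luminance limits, and the
# shade counts implied by 8-bit versus 10-bit color per channel.
lcd_contrast = 1000 / 0.05      # 20,000:1 for LCD
oled_contrast = 540 / 0.0005    # ~1,080,000:1 for OLED

shades_8bit = (2 ** 8) ** 3     # 16,777,216 (~16.7 million)
shades_10bit = (2 ** 10) ** 3   # 1,073,741,824 (over a billion)

print(f"LCD {lcd_contrast:,.0f}:1, OLED {oled_contrast:,.0f}:1")
print(f"{shades_8bit:,} vs {shades_10bit:,} shades")
```

Two extra bits per channel multiply the palette by 64, which is why banding that shows in 8-bit gradients largely disappears in 10-bit HDR content.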
What about content? New standards have been released for HDR Blu-ray discs that follow the UHD Alliance Premium specs – 10-bit color, 3840×2160 resolution, and BT.2020 color space representation. In the Samsung booth at CES, a shelf display contained more than 100 Blu-ray movie packages that have been or will be mastered with HDR and WCG. Some of those titles are available now to play back on Samsung’s UBD-K8500 player ($350) or Panasonic’s DMP-UB900 (no price yet). Expect BD players from LG and Sony to make an appearance this year, too.
But the question now is the relevance of optical media. Numerous studies have shown that rentals of Blu-ray discs have been in decline for some time, and BD sales don’t make a dent in the ever-growing volume of transactional video-on-demand, streaming, and digital downloads.
The good news is that HDR content can be streamed or downloaded, although your Ultra HDTV or media player will likely require support for a new video compression/decompression (codec) standard, High Efficiency Video Coding (HEVC) H.265. Many new Ultra HDTVs support this standard. Google’s VP9 and VP10 codecs, used with YouTube 4K content, may also support HDR in the future.
And what about flavors of HDR? Right now, the system getting the most attention is Dolby Vision, which got out of the gate early and is now implemented on Vizio, TCL, Sony, and Philips HDR LCD Ultra HDTVs. LG announced at CES that they would also support Dolby Vision on their premium Ultra HD OLED TVs. Another system has been proposed by Technicolor and it appears that TV manufacturers will support it as well.
The trick is compliance with the CTA 861.3 standard for reading and understanding HDR “metadata” that will be encoded with the HDR movie or TV program. This metadata will travel through the HDMI or DisplayPort interface in what’s called an “info frame” and the Ultra HDTV should reproduce it correctly. For streaming content, HDR metadata will be embedded in the program and read by the TV on the fly.
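The static metadata itself is compact: essentially the SMPTE ST 2086 mastering-display description plus two content light levels. A hypothetical sketch of the fields (the names and example values are illustrative, not the standard’s actual bitstream layout):

```python
from dataclasses import dataclass

# The static HDR metadata is small: SMPTE ST 2086 mastering-display
# values plus two content light levels (CTA 861.3). Field names below
# are illustrative, not taken from the standard's bitstream layout.
@dataclass
class HDRStaticMetadata:
    primaries: tuple           # (x, y) chromaticities of R, G, B
    white_point: tuple         # (x, y) chromaticity of white
    max_mastering_nits: float  # peak luminance of the mastering display
    min_mastering_nits: float  # black level of the mastering display
    max_cll: float             # brightest single pixel in the content
    max_fall: float            # brightest average frame in the content

# Example values for a title mastered on a 1,000-nit BT.2020 display
meta = HDRStaticMetadata(
    primaries=((0.708, 0.292), (0.170, 0.797), (0.131, 0.046)),  # BT.2020
    white_point=(0.3127, 0.3290),                                # D65
    max_mastering_nits=1000.0,
    min_mastering_nits=0.0005,
    max_cll=900.0,
    max_fall=400.0,
)
```

The TV uses these few numbers to tone-map the program to its own capabilities – which is why a set that ignores the metadata can make the same disc look wildly different.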
At CES, both Samsung and LG showed HDR Ultra HD content as a broadcast signal, using the new ATSC 3.0 standard and a UHF TV channel. Not many people paid much attention to this demo, but it was significant as a demonstration that HDR content can be broadcast as well as streamed. Yet another HDR format, hybrid log gamma, has been proposed by the BBC and NHK as a way to transmit one signal with both SDR and HDR content, letting a compatible Ultra HDTV show it in the appropriate format.
We already have several precedents for this piggy-back backward-compatible approach, such as the NTSC color “burst” signal added to black-and-white television transmissions in the 1950s and the FM stereo sub-carrier that also appeared in the late 1950s. Viewers with older Ultra HDTVs (which wouldn’t be that old, trust me) would simply see an SDR signal, while newer sets would expand the dynamic range at the high (brighter) end to achieve HDR.
Now, a lot of what I’ve just described is still in the building stages. Only a handful of HDR Ultra HDTVs are available right now, and only Samsung’s HDR Blu-ray player is on store shelves. I don’t know of any streaming content providers that are formatting programs in HDR, although Netflix and Amazon Prime are streaming 4K video. There aren’t any 4K cable channels at present, nor are any broadcast networks transmitting 4K shows.
But they’ll all catch up over time. The key is to have an Ultra HDTV that supports HDR and WCG playback, preferably one with both HDMI 2.0a (HDR) and DisplayPort 1.4 inputs. The former interface is already supported, although on a limited basis, while the latter was just announced a week ago.
And that brings me back to my original premise – if you are considering the purchase of a new Ultra HDTV, you’d be smart to wait until the end of the year or even until mid-January when TV prices are historically their lowest. And check to make sure your new set supports HDR through ALL inputs, not just the HDMI connection.
By then, you’ll have a much larger menu of HDR content choices, and of course you can still enjoy watching SDR 4K content. (And by then, you’ll see that big-screen Full HD sets have largely disappeared from store shelves anyway!)