Posts Tagged ‘HDMI’

HDMI 2.1: The Need For Speed Continues

Ever since HDMI version 2.0 was announced in September 2013, I’ve been pretty vocal in criticizing its “not quite fast enough” speed upgrade from 10.2 to 18 Gb/s, which turned out to be barely adequate for transporting 4K (3840×2160) video at full color resolution (RGB, or 4:4:4 in the video world) at a frame rate of 60 Hz – and only with 8-bit color.

Given how quickly the display industry is shifting to 4K and even higher resolutions, it was inconceivable that this new interface would in effect create a “speed bump” in the 4K chain, particularly since high dynamic range (HDR) and wide color gamut (WCG) enhancements were becoming part of the UHD ecosystem. And both enhancements require at least 10-bit color rendering, something that would be impossible to pass through the HDMI 2.0 interface if using a full-resolution color format.
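For those who want to check the arithmetic, here’s a quick sketch (my own back-of-the-envelope math, assuming the standard CTA-861 total timing of 4400×2250 pixels for 2160p/60 and TMDS coding that sends 10 bits for every 8 bits of pixel data):

```python
# Rough bandwidth check for 3840x2160 @ 60 Hz over HDMI 2.0 (my assumptions:
# CTA-861 total timing of 4400 x 2250 pixels, 3 color channels, and TMDS
# coding that transmits 10 bits for every 8 bits of pixel data).
def tmds_link_rate_gbps(h_total, v_total, fps, bits_per_component):
    pixel_clock = h_total * v_total * fps            # pixels per second
    # Deep color scales the link rate by bits_per_component / 8.
    return pixel_clock * 3 * 10 * (bits_per_component / 8) / 1e9

HDMI_20_LIMIT_GBPS = 18.0  # aggregate limit across the three TMDS lanes

for depth in (8, 10, 12):
    rate = tmds_link_rate_gbps(4400, 2250, 60, depth)
    verdict = "fits" if rate <= HDMI_20_LIMIT_GBPS else "does NOT fit"
    print(f"2160p/60 RGB, {depth}-bit: {rate:5.2f} Gb/s -> {verdict} in 18 Gb/s")
```

At 8 bits per color, the link rate squeaks in just under 18 Gb/s; at 10 or 12 bits, it doesn’t even come close.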

It didn’t help that HDMI’s competitor – DisplayPort – had already broken the 20 Gb/s barrier back in 2010 with version 1.2, which could easily handle a 2160p/60 signal with 10-bit RGB color, and in 2014 VESA had announced version 1.3, which boosted the speed to 32.4 Gb/s.

For a time there, I thought the superMHL format, which had its debut at CES 2015, might be the successor to HDMI. It was faster (36 Gb/s), had a large, reversible connector, was compatible with USB Type-C Alternate Mode, and most importantly, supported Display Stream Compression.

Alas, it appears superMHL turned out to be mostly a science experiment. The MHL Consortium was conspicuous by its absence at CES 2017, but the HDMI Forum more than made up for it by unveiling version 2.1. And now, we’ve got a real horse race.

High dynamic range support will be much easier with version 2.1, especially deeper color from RGB sources.

THE DETAILS

The public press release on HDMI 2.1 is sketchy on details, except to say that the maximum speed of the interface has now reached a mind-boggling 48 Gb/s (that’s faster than most network switches!). Quite the leap from 18 Gb/s, wouldn’t you say?

The release goes on to talk about a new generation of 48G cables and a greatly improved audio return channel (eARC) with auto-detect, and finishes with a discussion of high dynamic range and higher video resolutions, both of which are made possible by faster data rates that enable higher frame rates and deeper color. And all of this happened while retaining the familiar 19-pin Type A connector. (Wha-a-a-t?)

But what’s really going on here? How did HDMI accelerate to 48 Gb/s? Hold on, and I’ll provide the details missing from the press release.

First off, the current version of HDMI uses three connections – we’ll call them lanes, as DisplayPort does – to transport red, green, and blue display pixels. There’s a fourth lane for the clock to synchronize frames, and the balance of the pins is used for functions like hot plug detect and the Display Data Channel (DDC), which carries the EDID. That doesn’t leave much room for expansion.

But HDMI 2.1 adds another lane for TMDS data (although it’s not really TMDS anymore) by taking over the clock lane and embedding clock data within the existing signal, much the same way it’s done with packet-based signaling systems.

Next, the physical data rate over each lane has been raised from 6 Gb/s to 12 Gb/s. I don’t know exactly how that 100% increase was pulled off, but it’s impressive, considering that we are still waiting for 12G-SDI cables to come to market.

The 12G number may also be a function of jiggering the acceptable signal-to-noise ratio (SNR), something proposed a year ago by Steve Lampen of Belden – but then again, we’re not likely to see 12 Gb/s of data traveling down any display pipes in the immediate future. (For comparison, DisplayPort’s HBR3 tops out at 8.1 Gb/s per lane.)

That’s not all. The coding format long used by HDMI, DVI, and DisplayPort (not to mention numerous other interfaces) is known as 8b/10b: 8-bit words are coded into 10-bit symbols, resulting in about 20% overhead. Example: a 4K/60 signal encoded as 8-bit RGB requires about 17.82 Gb/s on the wire, and 20% of that is overhead from 8b/10b coding.

HDMI 2.1 has adopted a more obscure form of coding known as 16b/18b. An IEEE paper from 1999 describes how it works; it’s formally known as a “partitioned DC-balanced 16b/18b transmission code.” The net effect of moving from 8b/10b to 16b/18b is to cut the overhead from 20% to about 12%. What’s interesting, though, is that in this mode the HDMI 2.1 signal isn’t really the TMDS we’ve come to know and love – it’s something else, possibly more of a packet structure.
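To put the coding change in perspective, here’s a simple comparison (my own illustration of the arithmetic, not anything from the spec):

```python
# Usable payload left over after line coding (illustrative arithmetic only).
def payload_gbps(link_gbps, data_bits, coded_bits):
    return link_gbps * data_bits / coded_bits

print(f"18 Gb/s link with 8b/10b : {payload_gbps(18, 8, 10):.1f} Gb/s of payload")
print(f"48 Gb/s link with 8b/10b : {payload_gbps(48, 8, 10):.1f} Gb/s of payload")
print(f"48 Gb/s link with 16b/18b: {payload_gbps(48, 16, 18):.1f} Gb/s of payload")
```

That extra four-plus Gb/s of usable payload at 48 Gb/s is the whole point of the new line code.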

HDMI is now compatible with USB Type-C Alternate Mode – a “must have” feature for any new display interface.

Last but not least, HDMI announced last fall that it was compatible with the USB Type-C Alternate Mode format. And now, it appears that HDMI 2.1 is also compatible with Display Stream Compression (DSC) 1.2, which is a much more efficient way to transport signals like 7680×4320/60 (8K, for those not paying attention) – although at 48 Gb/s, version 2.1 could theoretically transport that signal uncompressed using 4:2:0 color.
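A rough sanity check of that 8K claim (my assumptions: 8-bit 4:2:0 averages 12 bits per pixel, roughly 10% extra for blanking, and 16b/18b coding):

```python
# Can uncompressed 7680x4320 @ 60 Hz, 8-bit 4:2:0 ride on a 48 Gb/s link?
# Assumptions (mine): 4:2:0 averages 12 bits per pixel, ~10% extra for
# blanking, and 16b/18b coding multiplies the rate by 18/16.
active = 7680 * 4320 * 60 * 12        # bits/s of pixel data (~23.9 Gb/s)
with_blanking = active * 1.10
line_rate = with_blanking * 18 / 16

print(f"Estimated line rate: {line_rate / 1e9:.1f} Gb/s (link limit: 48 Gb/s)")
```

Under those assumptions it comes in around 30 Gb/s, with plenty of headroom to spare.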

Compatibility with DSC wouldn’t be that much of a shocker – superMHL also offered it, and it’s another TMDS-based format. In fact, at second glance, it appears that much of the engineering that went into superMHL has now migrated over to HDMI 2.1 (about time), and the most significant breakthrough is doubling the per-lane interface speed.

Given that 48 Gb/s is definitely optical fiber territory, the only remaining question is why we still haven’t seen a detailed HDMI specification for direct optical interfaces. 48G copper cables will be expensive and difficult to engineer, but multimode optical fiber can already do the job and is cheap. Coming up with 50-foot and longer manufactured optical cables for HDMI would be a piece of cake – it’s already been done for HDMI 1.3/1.4.

So there you have it: HDMI 2.1 – a faster, smarter, and more appropriate display interface as we head into the era of 4K and beyond. How soon will we see HDMI 2.1 interfaces and cables? Well, considering it took almost three years for version 2.0 to achieve any significant presence in commercial AV, I’d say maybe a year from now at the earliest…and perhaps not until 2019 in any quantity.

By then, a good deal of the industry may have already shifted to AV-over-IP for the bulk of its signal switching and distribution, using simple format conversion at the display end. And we have yet to see who is going to adopt DisplayPort 1.3/1.4, still a “no-royalty” interface that can hit 32.4 Gb/s and supports all the forward-looking necessities (Type-C Alternate Mode, DSC, HDR).

Gentlemen, start your engines…

Hey, Whatever Happened To superMHL?

There is no such thing as a ‘sure thing.’ You can have a 20-yard field goal try with 5 seconds left, two foul shots to ice the game, or a one-on-one penalty shot with your best wing on the ice. Doesn’t matter – things do go awry. In fact, sometimes they never get going in the first place.

Two years ago this coming January, Silicon Image (now Lattice Semiconductor) unveiled what they claimed to be the best next-generation display interface. They called it superMHL, and it was super indeed; sporting a large, 32-pin symmetrical plug design to go with a 36 gigabits-per-second (Gb/s) data transfer rate.

That wasn’t all. superMHL (basically MHL on steroids) also supported the new Display Stream Compression (DSC) 1.1 standard. And it would also work with the all-important USB 3.0 Type-C plug’s Alternate Mode, which multiplexes display connections and fast USB serial data through the same ‘smart’ plug.

Wow! I didn’t see this coming; neither did most of the trade press in attendance. Here was a connector faster than DisplayPort version 1.3 (32.4 Gb/s), plus it was symmetrical in operation (plug it in either way – it doesn’t care, and it’s smart enough to set itself up the right way). And it was compatible with the next generation of USB connectors.

Even more amazing, the MHL Consortium demo showed 8K content flowing to a large Samsung 8K TV through this interface, which claimed to support 7680×4320 video @ 60 Hz with 4:2:0 color (albeit using DSC to pack things down a bit in size). If there was ever a ‘sure thing,’ this was it!

It’s the fastest display interface out there – and no one uses it. Maybe they should call it HDMI 3.0?

I was assured in the following months that Lattice and the MHL Consortium would have several press announcements pertaining to design wins for the 2015 holiday season. I’d see several new Ultra HDTVs with at least one superMHL port, with the rest of the inputs being HDMI 2.0 connections. Thus, we’d be ready for the brave new world of 8K TV! (Never mind that 4K TV was still getting on its feet at the time!)

But it never happened. Black Friday, Christmas, New Year’s, and then ICES and the 2016 Super Bowl came and went with no announcements. At ICES 2016, the MHL Consortium once again had a demo of 8K content playback through an LG 98-inch LCD TV using the superMHL interface, and “yes, it looked great” and “we’re ready for 8K TV” and “it works with USB Type-C” and so on, and so forth.

Right now, it’s pretty much radio silence about superMHL. So what happened?

For one thing, the adoption rate of HDMI 2.0 since its formal unveiling in 2013 can be charitably described as “slow.” Early Ultra HDTVs had perhaps one HDMI 2.0 port on them, and not all of them supported the new HDCP 2.2 copy protection protocol. In our industry, we’re only now starting to see distribution amplifiers and switches with HDMI 2.0 connections – there’s still a lot of version 1.4 product out there, too.

Another perplexing question: Since superMHL fixes the speed limit problem of HDMI 2.0 by doubling the data rate – and also adds the all-important compatibility with USB Type-C (a must, going forward) along with support for DSC (critical as we push display resolutions beyond 5K) – why would Lattice continue to support both formats, or even suggest they could be mixed on future UHD+ televisions and monitors?

In other words: if there is a better option, then why wouldn’t you want that option?

To be sure, Lattice is in a tricky position. Through their subsidiary HDMI Licensing LLC, they reap millions of dollars each year in royalties associated with every HDMI port on every piece of consumer and commercial gear. That’s a nice cash flow, and who wants to mess with it?

But they really can’t lose here, inasmuch as they control the IP for all of these transition-minimized differential signaling (TMDS) interfaces. Why not bite the bullet and announce the phase-out of HDMI 1.3/1.4, and move everyone to version 2.0? Better yet, just announce a sunset for version 2.0 and start the transition to superMHL, a/k/a HDMI 3.0?

Yeah, it’s fun to demo 8K TV using superMHL, but that takes the focus off the real-world, practical interfacing solutions we’re facing now.

One problem Lattice created with this new connector is that its name is now effectively an oxymoron. MHL stands for Mobile High-definition Link, and it was originally designed to multiplex HDMI signals through 5-pin micro USB ports. The concept was that the single micro USB connector on your smartphone or tablet could connect to a television so you could play back videos, show photos, and share your screen. (Never mind that the majority of people prefer to do this over a wireless connection rather than a 15-foot HDMI-to-micro USB cable that often requires a power adapter.)

So MHL meant “small, fast, and powerful.” And now we have the ‘funny car’ of display interfaces with a large connector that will never get anywhere near your mobile device…and the way things are going, it may never get anywhere near your TV, either.

In previous columns and in my classes and talks, I’ve written about the deficiencies of HDMI 2.0 – slow speed, a non-symmetrical connector, no support for USB Type-C (finally remedied a few months ago), and no support for Display Stream Compression. superMHL fixes all of these problems in one fell swoop.

The answer? Re-brand this connector as HDMI 3.0 – which it really is – and make the appropriate announcement in two months at ICES 2017. Practically speaking, MHL has been a non-starter (among the major brands sold in the U.S., only Sony, Samsung, and LG have supported it on their smartphones and TVs), and the adoption rate for HDMI 2.0 is nowhere near as fast as it was for version 1.3. Too many interfaces and too much confusion!

After all, even Elvis Presley had to make a comeback…

AV-over-IP: It’s Here. Time To Get On Board!

At InfoComm next week in Las Vegas, I look forward to seeing many familiar faces – both individuals and manufacturers – that have frequented the show since I first attended over 20 years ago. And I also expect to find quite a few newcomers, based on the press releases and product announcements I’ve been receiving daily.

Many of those newcomers will be hawking the latest technology – AV-over-IP. More specifically, transporting video, audio, and metadata that are encoded into some sort of compressed or lightly-compressed format, wrapped with IP headers, and transported over IP networks.

This isn’t exactly a new trend: The broadcast, telecom, and cable/satellite worlds have already begun or completed the migration to IT infrastructures. The increasing use of optical fiber and lower-cost, fast network switches are making it all possible. Think 10 gigabit Ethernet with single-mode fiber interconnections, and you can see where the state-of-the-art is today.

You’ve already experienced this AV-over-IP phenomenon if you watch streaming HD and 4K video. Home Internet connection speeds have accelerated by several orders of magnitude since the first “slow as a snail” dial-up connections got us into AOL two decades ago. Now, it’s not unusual to have sustained 10, 15, 25, and even 50 megabit-per-second (Mb/s) connections to the home – fast enough to stream Ultra HD content with multichannel sound.

And so it goes with commercial video and audio transport. Broadcast television stations had to migrate to HD-SDI starting nearly 20 years ago when the first HDTV broadcasts commenced. (Wow, has it really been that long?) Now, they’re moving to IP and copper/fiber backbones to achieve greater bandwidth and to take advantage of things like cloud storage and archiving.

So why hasn’t the AV industry gotten with the program? Because we still have a tendency to cling to old, familiar, and often outdated or cumbersome technology, rationalizing that “it’s still good enough, and it works.” (You know who you are…still using VGA and composite video switching and distribution products…)

I’ve observed that there is often considerable and continual aversion in our industry to anything having to do with IT networks and optical fiber. And it just doesn’t make any sense. Maybe it originates from a fear of losing control to IT specialists and administrators. Or, it could just be a reluctance to learn something new.

The result is that we’ve created a monster when it comes to digital signal management. Things were complicated enough when the AV industry was dragged from analog to digital and hung its hat on the HDMI consumer video interface for switching and distribution. Now, the industry has created behemoth switch matrices to handle the current and next flavors of HDMI (a format that was never suitable for commercial AV applications).

We’ve even figured out a way to re-encode the HDMI TMDS signal and extend it over category wire, up to a whopping 300 feet. And somehow, we think that’s impressive? Single-mode fiber can carry an HD video signal over 10 miles. Now, THAT’S impressive – and it’s not exactly new science.

So, now we’re installing ever-larger racks of complex HDMI switching and distribution gear that is expensive and also bandwidth-capped – not nearly fast enough for the next generation of UHD+ displays with full RGB (4:4:4) color, high dynamic range, and high frame rates. How does that make any sense?

What’s worse, the marketing folks have gotten out in front, muddying the waters with all kinds of nonsensical claims about “4K compatibility,” “4K readiness,” and even “4K certified.” What does that even mean? Just because your switch or DA product can support a very basic level of Ultra HD video with slow frame rates and reduced color resolution, it’s considered “ready” or “certified?” Give me a break.

Digitizing HDMI and extending it 300 feet isn’t future-proof. Neither is limiting Ultra HD bandwidth to 30 Hz 8-bit RGB color, or 60 Hz 8-bit 4:2:0 color. Not even close. Not when you can already buy a 27-inch 5K (yes, 5K!) monitor with 5120×2880 resolution and the ability to show 60 Hz 10-bit color. And when 8K monitors are coming to market.

So why do we keep playing tricks with specifications and settling for Band-Aid solutions? We shouldn’t. We don’t need to. And the answer is already at hand.

It’s time to move away from the concept of big, bulky, expensive, and basically obsolete switching and distribution hardware that’s based on a proprietary consumer display interface standard. It’s time to move to a software-based switching and distribution concept that uses an IT structure, standard codecs like JPEG2000, M-JPEG, H.264, and H.265, and everyday off-the-shelf switches to move signals around.

Now, we can design a fast, reliable AV network that allows us to manage available bandwidth and add connections as needed. Our video can be lightly compressed with low latency, or more highly compressed for efficiency. The only display interfaces we’ll need will be at the end points where the display is connected.
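Here’s a trivial sketch of what that bandwidth management looks like in practice (the stream names and bit rates below are hypothetical, chosen only to illustrate budgeting a 10 GbE backbone, not measured from any product):

```python
# Hypothetical bandwidth budget for one 10 GbE backbone link in an AV-over-IP
# system. Stream names and bit rates are made up for illustration only.
LINK_CAPACITY_MBPS = 10_000

streams_mbps = {
    "lecture hall camera (JPEG 2000, lightly compressed)": 850,
    "confidence monitor feed (M-JPEG)": 300,
    "overflow room program (H.265, 4K)": 45,
    "digital signage channel (H.264)": 25,
}

total = sum(streams_mbps.values())
for name, rate in streams_mbps.items():
    print(f"{name:<52} {rate:>5} Mb/s")
print(f"{'TOTAL':<52} {total:>5} Mb/s of {LINK_CAPACITY_MBPS} "
      f"({100 * total / LINK_CAPACITY_MBPS:.0f}% of the link)")
```

The point isn’t the specific numbers; it’s that the link budget becomes a simple, manageable spreadsheet exercise instead of a rack-engineering problem.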

Even better, our network also provides access to monitoring and controlling every piece of equipment we’ve connected. We can design and configure device controls and interfaces using cloud-based driver databases. We can access content from remote servers (the cloud, again) and send it anywhere we want. And we can log in from anywhere in the world to keep tabs on how it’s all functioning.

And if we’re smart and not afraid to learn something new, we’ll wire all of it up with optical fiber, instead of bulky copper cables or yet another round of transmitter/receiver boxes that convert signals from one format to another and back. (Guess what? AV-over-IP is already digital! You can toss out those distance-limited HDMI extenders, folks!)

For those who apparently haven’t gotten the memo, 40 Gb/s network switches have been available for a few years, with 100 Gb/s models now coming to market. So much for speed limit issues…

To the naysayers who claim AV-over-IP won’t work as well as display interface switching: That’s a bunch of hooey. How are Comcast, Time Warner, NBC, Disney, Universal, Netflix, Amazon, CBS, and other content originators and distributors moving their content around? You guessed it.

AV-over-IP is what you should be looking for as you walk the aisles of the Las Vegas Convention Center, not new, bigger, and bulkier HDMI/DVI matrices. AV-over-IP is the future of our industry, whether we embrace it or are dragged into it, kicking and screaming.

Are you on board, or what?

CES 2016 In The Rear View Mirror

I’m a little less than a week back from one of the world’s largest trade shows, the 2016 International CES. According to press releases from the Consumer Technology Association (CTA), the new name for the Consumer Electronics Association, upwards of 170,000 people attended the show this year, which was spread out over several venues in Las Vegas.

Based on the crowds I saw, I’d say that number wasn’t far off. Walking through booths in the Las Vegas Convention Center gave me the feeling of strolling along the beach, unaware that a tidal wave was sneaking up on me – one minute I had a particular exhibit all to myself, and the next, I was swamped by a sea of bodies adorned with CES badges.

Trying to predict which trends in electronics will be “hot” each year is basically a fool’s errand. Going into the show, I was deluged with press releases about “Internet of Things” gadgets, and the show didn’t disappoint – I saw everything from connected thermostats and body sensors to pet food dispensers and shower heads that monitor how much water each member of your family uses – and record that data, too.

The LG floor-to-ceiling OLED wall at CES put many people into a trance.

TCL set up their usual tiny booth in the Central Hall.

Last year, the show was all about Ultra HDTV, with some unusual video aspect ratios and pixel counts thrown in. This year, I figured high dynamic range (HDR) would be the “hot” item in every booth. Surprisingly, it wasn’t generating all that much buzz, even though it was featured in the Sony, Samsung, LG, and Chinese TV booths. Instead, there seemed to be much more interest in virtual reality (VR), examples of which were to be found everywhere in the LVCC and also over at the Sands Expo Center.

What was an eye-opener (although not entirely unexpected) was the reduction in booth space devoted to televisions in the Samsung, Panasonic, and LG booths. Sony chose to use Ultra HDTVs to illustrate HDR, wide color gamut, and local area dimming concepts, while Panasonic largely ignored TVs altogether, featuring just a 65-inch UHD OLED TV in one part of their booth and a 55-inch 8K LCD set in another, primarily to demonstrate 8K signal transport over optical fiber.

LG and Samsung devoted more real estate than ever before to connected and “smart” appliances, tablets, smartphones, and personal electronics like smart watches, subtly pushing TVs (of which there were still plenty, believe me) to a secondary role with less square footage. The fact is, appliances are more profitable than TVs these days…WAY more profitable. And Samsung and LG had plenty of refrigerators, ovens, washers, and even dryers out for inspection.

For LG, CES was a big “coming out” party for their expanding line of OLED Ultra HDTVs – they were everywhere, dazzling with their deep blacks and saturated colors. But LCD still plays a part in the LG ecosystem: The 98-inch 8K LCD panel that blew us away last year made a return appearance, as did the 105-inch 21:9 5K (5120×2160) model.

This Innolux 8K LCD monster TV showed up in the Hisense booth and a few other locations.

Samsung showed the “World’s largest 170-inch TV.” Apparently there are smaller ones I didn’t know about.

Over in the Samsung booth, they kept the “mine’s bigger than yours” contest going with a 170-inch Ultra HDTV based on an LCD panel fabbed at CSOT in China and equipped with quantum dots. (Last year, Samsung insisted their quantum dot illumination technology was to be called “nanocrystals.” This year, they did a 180-degree turn and are now calling them quantum dots.) A curved 8K TV and some demos of live broadcast Ultra HD with HDR were also showcased alongside the company’s new Ultra HD Blu-ray player ($399 when it ships in the spring).

The “towers” and stacks of LG and Samsung televisions we used to marvel at a decade ago have now found their way into the ever-expanding booths of Chinese TV brands like Hisense, TCL, Changhong, Haier, Konka, and Skyworth. (Not familiar names? Don’t worry, you’ll get to know them soon enough.) And notable by its absence was Sharp Electronics, whose US TV business and assembly plant in Mexico were acquired by Hisense last year. That’s quite a change from ten years ago, when the company held a 21% worldwide market share in LCD TV shipments.

To be sure, there was a Sharp meeting room w-a-y in the back of the Hisense booth, which was enormous – almost as big as TCL’s behemoth in the middle of the Central Hall. And the Konka, Changhong, and Skyworth booths weren’t far behind in size. If you needed to see the writing on the wall regarding the future of television manufacturing, it couldn’t have been more clear – everything is slowly and inexorably moving to China. (It’s a good bet that the LCD panel in your current TV came out of a Chinese or Taiwanese assembly plant!)

TVs were just part of the story in Las Vegas. I had been waiting a few years to see which companies would finally pick up the baton and start manufacturing 802.11ad Wi-Fi chipsets. For those readers who haven’t heard of it before, 802.11ad – or, by its more common names, “Wireless Gigabit” and “Certified Wireless Gigabit” – is a standard that uses the 60 GHz millimeter-wave band to transmit high-speed data over 2 GHz-wide channels.

Letv demonstrated wireless 4K video streaming over 60 GHz 802.11ad, using this new smartphone and Qualcomm’s chipset.

Are you on the USB Type-C bandwagon yet? (Check your new laptop or smartphone…)

Considering that the current channels in the 2.4 GHz and 5 GHz bands are only 20 MHz wide, and that the 802.11ac channel-bonding protocol can only combine enough of them to create a 160 MHz channel, that’s quite a leap in bandwidth! The catch? 60 GHz signals are reflected by just about any solid object, limiting their use to inside rooms. But with high-power operation and steerable antennas, those signals can travel a pretty good distance.
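For a sense of scale, here’s the simple arithmetic (my numbers; I’m assuming the 2.16 GHz channel width defined in the 802.11ad channel plan):

```python
# Channel width comparison: 60 GHz 802.11ad vs. conventional Wi-Fi channels.
ad_channel_mhz = 2160     # one 60 GHz 802.11ad channel (~2 GHz wide)
ac_bonded_mhz = 160       # widest bonded 802.11ac channel
legacy_mhz = 20           # a single 20 MHz Wi-Fi channel

print(f"802.11ad vs. 160 MHz bonded channel: {ad_channel_mhz / ac_bonded_mhz:.1f}x wider")
print(f"802.11ad vs. a basic 20 MHz channel: {ad_channel_mhz / legacy_mhz:.0f}x wider")
```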

In-room, high-bandwidth operation is perfect for streaming video – even at 4K resolution – from phones, tablets, set-top boxes, and even Blu-ray players to TVs, projectors, AV receivers, and switching and distribution gear. Qualcomm had demos of numerous ready-to-manufacture tri-band modems (2.4/5/60 GHz), along with LETV’s latest smart phone with a built-in 60 GHz radio chip. And SiBEAM, a part of Lattice Semiconductor, showed 4K streaming through their WiHD technology, along with close-proximity interface coupling using SNAP to download images and video from a waterproofed GoPro camera.

Lattice had some other tricks up their sleeve in their meeting room. One of those was using a Windows 10 phone with an MHL (Mobile High-definition Link) connection through USB Type-C to create a virtual desktop PC. All that needed to be added was a mouse, a keyboard, and a monitor. In another area, they showed a scheme to compress Ultra HD signals before transmitting them over an HDBaseT link, with decompression at the far end – presumably to get around the 18 Gb/s speed limit of HDMI 2.0.

DisplayPort had a good demonstration of Display Stream Compression (DSC). That’s the chipset under that enormous fan.

Ultra HD Blu-ray is here, complete with high dynamic range mastering. How will it hold up against the growing trend to stream video?

Not far away, the “funny car” guys at the MHL Consortium showed their superMHL interface linking video to another LG 98-inch 8K LCD display. Converting what was once a tiny, 5-pin interface designed for 1080p/60 streaming off phones and tablets into a 32-pin, full-size symmetrical connector that can hit speeds of 36 Gb/s seems like putting Caterpillar truck tires and a big-block Chevy engine in a Smart Car to me…but they did it anyway, and added support for USB Type-C Alternate Mode. Now, they’re ready for 8K, or so they keep telling me. (That’s fine, but the immediate need is for faster interfaces to accommodate Ultra HD with 10-bit and 12-bit RGB color at high frame rates. Let’s hear about some design wins!)

At the nearby VESA/DisplayPort booth, there were numerous demonstrations of video streaming over USB Type-C connections in Alternate Mode, with one lash-up supporting two 1920x1080p monitors AND a 2550×1536 monitor, all at the same time. DP got somewhat faster with version 1.3 (32.4 Gb/s), and now a new version (1.4) will be announced by the end of January. The VESA guys also had a nice exhibit of Display Stream Compression (DSC), which can pack down a display signal by a 2:1 or 3:1 ratio with essentially no loss and negligible latency (a few microseconds). If we’re going to keep pushing clock speeds higher and higher, compression is inevitable.
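To make those ratios concrete, here’s a quick example of my own (not one of the VESA demos), applied to an uncompressed 2160p/60, 10-bit RGB stream with standard CTA-861 blanking:

```python
# What DSC's typical ratios do to an uncompressed stream (my own example).
# 3840x2160 @ 60 Hz, 10-bit RGB with CTA-861 blanking (4400 x 2250 total):
uncompressed_gbps = 4400 * 2250 * 60 * 30 / 1e9   # ~17.8 Gb/s of pixel data

for ratio in (2, 3):
    print(f"{ratio}:1 DSC -> roughly {uncompressed_gbps / ratio:.1f} Gb/s on the wire")
```

A signal that would otherwise strain an 18 Gb/s link drops to something a much slower pipe can carry, which is exactly why DSC matters as resolutions keep climbing.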

The world of display interfacing appears to be becoming more disjointed, what with the majority of consumer devices still supporting HDMI 1.4 and 2.0, while an increasing number of computer and video card manufacturers are jumping on the DisplayPort bandwagon (Apple, HP, and Lenovo, among others). How superMHL will fit into this is anyone’s guess: The format is TMDS-based, like HDMI, but outstrips it in every way (HDMI 2.0 does not support DSC or USB Type-C operation). Do we really need two TMDS-based interfaces going forward?

Speaking of USB Type-C, everybody and their brother/sister at CES had Type-C hubs, adapters, and even extenders out for inspection. If any connector is going to force the competing display interface standards to get in line, it will be this one. Apple, Intel, Lenovo, and several phone/tablet manufacturers are already casting their lots with Type-C, and it looks to be the next “sure thing” as we head toward a universal data/video/audio/power interface. I even came home with a credit card-sized press kit with a reversible USB 2.0 / 3.0 Type-C plug built-in!

First it was vinyl. Then cassettes. Now, Kodak is bringing back Super 8mm film and cameras. (I kid you not!)

Lenovo is one of four laptop manufacturers now offering OLED screens, here on a ThinkPad X1 Yoga (right).

So – how about HDR? Yes, a few companies showed it, and there were spirited discussions over dinner about whether OLEDs could actually show signals with high dynamic range (they most assuredly can, as they can reproduce 15 stops of light from just above black to full white without clipping) and whether you actually need thousands of cd/m2 to qualify as an HDR display (I’m not in that camp; displays that bright can be painful to look at).

For LCDs, quantum dots (QDs) will lead the way to HDR. Both QD Vision and 3M had demos of quantum dot illuminants, with QD Vision focusing on light pipes for now and 3M partnering with Nanosys to manufacture a quantum dot enhancement film. Both work very well and provide a much larger color gamut than our current ITU Rec.709 color space, which looks positively washed-out compared to the more expansive Rec.2020 color gamut associated with UHD and HDR. QD Vision also showed the reduction in power consumption over OLEDs when using QDs. However, you won’t get the deep blacks and wide viewing angles out of an LCD in any case, so a few more watts may not matter to the videophiles.

The Ultra HD Blu-ray format had its formal debut at CES, with Panasonic and Samsung both showing players. The latter can be pre-ordered for $399 and will ship in the spring. (Remember when Samsung’s first-ever Blu-ray player sold for nearly $2,000 almost a decade ago?) To support HDR – which requires 10-bit encoding – the HDMI interface must be version 2.0a to correctly read the metadata. That metadata can be in the Dolby Vision format or the Technicolor format, but the baseline definition is HDR-10.

LG Display’s flexible 18-inch OLED display was just too cool for words.

Stand four 65-inch UHD OLED panels on end, stitch them together, and this is what you get. Bibbedy-bobbedy-boo!

I saved the best for last. Every year, LG Display invites a few journalists up to what we call the “candy store” to see the latest in display technology. And this year didn’t disappoint: How about dual-side 55-inch flexible OLED TVs just millimeters thick? Or a 25-inch waterfall (curved) display that could form the entire center console in a car, with flexible OLEDs in the dashboard creating bright, colorful, and contrasty gauges?

LGD has WAY too much fun coming up with demos for this suite. I saw four 65-inch OLED panels stacked on end, edge to edge, and bent into an S-curve to create a 2.2:1 ratio widescreen UHD+ display. And it also had video playing on both sides. In another location, I saw a jaw-dropping 31.5” 8K LCD monitor with almost perfect uniformity, and an 82-inch “pillar” LCD display.

How about a 55-inch UHD OLED display rolled into a half-pipe, with you standing at the center, playing a video game? Talk about filling your field of view! Next to it was a convex 55-inch display, wrapped around a ceiling support pole. And next to that, a 55-inch transparent OLED display with graphics and text floating over real jewelry, arranged on tiers. The actual transparency index is about 40% and the concept worked great.

Toyota’s Future Concept Vehicle (FCV) is a bit roomier than last year’s sidecar-shaped model.

Wow, drones are getting REALLY big these days!

The icing on the cake was an 18-inch flexible OLED with 800×1200 resolution that could be rolled up into a tube or a cone-like shape while showing HD video. This was one of those “I gotta get me one of these!” moments, but more significantly, it shows how OLED technology has matured to the point where it can be manufactured on flexible substrates. And what is the largest market in the world for displays? Transportation, where G-forces and vibration eventually crack rigid substrates like LCD glass.

That’s just a snapshot of what I saw, and I haven’t even mentioned drones (buzzing all over the place), fold-up scooters and hoverboards, smart appliances, pet cams, alarms that alert you when an alarm goes off (really!), wooden smartphones (really!), talking spoons and forks (really!), toothbrushes linked to video games (would I kid you?), and 4K action cams with built-in solar cell chargers.

Gotta run now. My phone just sent me a Wi-Fi alarm that a Bluetooth-connected doorbell camera spotted the UPS guy delivering a package I was already alerted about via email to my desktop that signaled a buzzer via ZigBee in my virtual desktop PC that was connected wirelessly to my smartphone, currently streaming 4K video over a 60 GHz link to my “smart” TV that is also…also…also…

Oh, great. Now I’ve forgotten what I was talking about…Does anyone make an iRemember app? (Look for my “second thoughts” column later this month…)

HDMI 2.0 Is Here…And It’s Not Fast Enough?

This morning, the HDMI Forum announced the release of HDMI 2.0, which was almost two years in the making. The impetus for this new standard was and continues to be 4K, which requires such increases in data rates that the older 1.4 version can’t support it, except at slow frame rates.

Now, HDMI 2.0 has a maximum data rate of 18 gigabits per second (Gb/s), slightly faster than DisplayPort’s 17.2 Gb/s. If you do the math, this should be fast enough to transport 3840×2160 video at frame rates of 50 and 60 Hz, using 8-bit and 10-bit color (at 60 Hz, the data rate for 8-bit 4K works out to about 14.3 Gb/s; with 10-bit color, about 17.8 Gb/s).
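Here’s that math spelled out (my own sketch, assuming the standard CTA-861 total timing of 4400×2250 pixels for 2160p and ignoring TMDS coding overhead):

```python
# Payload data rates for 3840x2160 video (CTA-861 total timing of 4400 x 2250,
# 3 color components per pixel; TMDS coding overhead is NOT included here).
def payload_gbps(fps, bits_per_component):
    return 4400 * 2250 * fps * 3 * bits_per_component / 1e9

for fps in (50, 60):
    for depth in (8, 10):
        print(f"2160p/{fps}, {depth}-bit RGB: {payload_gbps(fps, depth):4.1f} Gb/s")
```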

Here are the highlights from the official press release:

“HDMI 2.0, which is backwards compatible with earlier versions of the HDMI specifications, significantly increases bandwidth up to 18Gb/s and adds key enhancements to support continuing market requirements for enhancing the consumer video and audio experience. New functionality includes:

– Support for 4K@50/60 (2160p), which is 4 times the clarity of 1080p/60 video resolution

– Up to 32 audio channels for a multi-dimensional immersive audio experience

– Up to 1536kHz audio sample frequency for the highest audio fidelity

– Simultaneous delivery of dual video streams to multiple users on the same screen

– Simultaneous delivery of multi-stream audio to multiple users (up to 4)

– Support for the wide-angle theatrical 21:9 video aspect ratio

– Dynamic synchronization of video and audio streams

– CEC extensions provide expanded command and control of consumer electronics devices through a single control point

HDMI 2.0 does not define new cables or new connectors. Current High Speed cables (Category 2 cables) are capable of carrying the increased bandwidth.”

After reviewing the specifications, it appears to me that the HDMI Forum was trying to squeeze every last drop of speed out of the existing connector/interface architecture without having to re-engineer the standard. There’s no mention of locking connectors (a bugaboo of the broadcast and AV industries). Nor is there any discussion of speeding up HDCP key exchanges beyond what’s already been accomplished with InstaPort. But the HDMI 2.0 standard should eliminate the need for two or even four separate HDMI ports to play back 4K content (several TV and projector manufacturers currently use this approach).

Adding multiple channels of audio and increasing the sampling frequency is relatively simple stuff, as the bit rates for audio are a small fraction of those needed for 2K and 4K video. And you can already deliver two separate video streams through one HDMI connector – it’s only a bandwidth issue; the new standard just establishes a protocol for doing so. Supporting 21:9 isn’t all that big a deal, either.

I’m not sure what “dynamic synchronization of video and audio streams” means yet and will have to talk to the folks at HDMI Licensing to get a better explanation. As for CEC, it appears that control functionality has been souped-up beyond the basic command sets used to operate AV receivers and Blu-ray players.

What’s clear now is that HDMI 2.0 is NOT going to be the big breakthrough many of us analysts and writers expected, and that it will NOT be able to transport 10-bit and 12-bit 4K video running at higher frame rates (>60 Hz). Both of these specifications are necessary to develop high dynamic range (HDR) video and movie content.

Nor is there any indication of supporting a high-speed data bus overlay like Thunderbolt, which is becoming more important with the growth in popularity of tablets and smart phones, not to mention ultrabooks. These devices are leading the industry changeover to single, dense, multifunction interfaces across all sorts of CE products.

In contrast, over at VESA, they’ve already commenced development of Display Stream Compression, a new scheme that will use “light,” visually lossless compression to push effective data rates up to 25 Gb/s and beyond over conventional DisplayPort connections. This is a more “future-proof” approach to display connectivity and reflects the current state of 4K and UHDTV product and content development, what with all of the 4K television announcements that have been made this year.

But the reality is that HDMI dominates the CE marketplace and is making major inroads into commercial AV and broadcast installations. The market has largely ignored DisplayPort, despite the fact that (a) there are currently no royalties associated with its use, (b) its connectors come in many different flavors, including support for mobile and fiber optic interfaces, and (c) it already supports a high-speed data bus overlay – the 20 Gb/s Thunderbolt layer.

Maybe they’ll get it right next time…