Posts Tagged ‘DisplayPort’

Measuring Up With DisplayHDR

For the past 16 years, the High Definition Multimedia Interface (HDMI) has ruled the roost for display connections, pushing aside VGA at first and then DVI on everything from televisions and Blu-ray players to laptop computers and camcorders. It’s evolved numerous times from a basic plug-and-play interface for televisions and AV receivers to a high-speed transport system for 4K and ultimately 8K video. Ironically, HDMI is often the input and output connection for video encoders and decoders that, in theory, could displace it from the market altogether.

But there are other players in the interfacing market, and that would be the folks at the Video Electronics Standards Association (VESA), who developed and periodically update DisplayPort. First launched in 2006, DisplayPort was intended to replace the old analog VGA connector with a newer, 100%-digital version that could handle many times the bandwidth of an XGA (1024×768) or UXGA (1600×1200) video signal.

Other forward-looking features included direct display drivers (no need for a video card), support for optical fiber, multiplexing with USB and other data bus formats, and even a wireless specification (it never really caught on). Like HDMI, DP had its “mini” and “micro” versions (Mini DP and Mobility DP).

In recent years, VESA stayed current by upping the speed limit from 21.6 to 32 gigabits per second (Gb/s), supporting Alternate Mode video connections over USB Type-C, adding some cool bells and whistles like simultaneous multi-display output, adopting the first compression system for display signals (Display Stream Compression), recognizing high dynamic range metadata formats, and even accepting color formats other than RGB.

Best of all, there continue to be no royalties associated with DP use, unlike HDMI. The specification is available to anyone who’s interested, unlike HDMI. And DP was ready to support deep color and high frame rate 4K video as early as 2013, unlike HDMI.

However…unlike HDMI, DisplayPort has had limited success penetrating the consumer electronics display interfacing market. While some laptop manufacturers have adopted the interface, along with commercial AV monitors and video cards for high-performance PCs, HDMI is still the undisputed king of the hill when it comes to plugging any sort of media device into a display.

Even long-time supporters of DP have switched allegiances. Apple, known for using Mini DisplayPort on its MacBook laptops, is now adding HDMI connections. Lenovo, another DP stalwart, is doing the same thing on its newer ThinkPad laptops.

One of the many DisplayHDR-certified monitors in VESA’s booth at CES 2018.

But VESA has a few more tricks up its sleeve. Earlier this year at CES, VESA had several stands in their booth demonstrating a new set of standards for high dynamic range and wide color gamuts on computer monitors – specifically, those using LCD technology. DisplayHDR calls out specific numbers that must be achieved to qualify for DisplayHDR 400, DisplayHDR 600, and DisplayHDR 1000 certification.

Those numbers cover luminance for a 10% white patch, full-screen white “flash,” and full-screen white “sustained” operation, along with maximum black level, minimum color gamut, minimum color bit depth, and black-to-white transition time. With interest in HDR video growing, the DisplayHDR specifications are an attempt to get around vague descriptions of things like color range (“70% of NTSC!”) and contrast ratios that don’t specify how the measurements were taken.

And this is actually a good thing. In the CE world, the UHD Alliance has a vague set of minimum requirements for a TV to qualify as high dynamic range. Compared to the more stringent DisplayHDR requirements, the UHD Alliance specs are equivalent to asking if you can walk and chew gum at the same time. HDMI version 2.0 (currently the fastest available) can safely transport an Ultra HD signal with 8-bit RGB color at 60 Hz, but that’s setting the bar kinda low in our opinion.

In contrast, DisplayPort 1.3 and 1.4 (which add HDR metadata and support for 4:2:0 and 4:2:2 color) aren’t even breathing hard with a 12-bit RGB Ultra HD video stream refreshed at 60 Hz. And that means a computer display certified to meet one of the DisplayHDR standards can actually accept a robust HDR signal. (Note that VESA isn’t choosing sides here – DisplayHDR-certified screens can also use HDMI connections, but signal options are limited by HDMI 2.0’s top speed of 18 Gb/s.) You can learn more about DisplayHDR here.
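
For the arithmetic-minded, here is a quick back-of-the-envelope sketch (in Python) of why that 12-bit RGB stream is comfortable for DisplayPort but not for HDMI 2.0. It counts active pixels only and ignores blanking and packet overhead, so real-world margins are tighter than shown.

    # Can each interface carry 3840x2160 RGB at 60 Hz with 12-bit color?
    # Active pixels only; blanking and protocol overhead beyond 8b/10b ignored.

    def payload_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
        """Raw video payload in gigabits per second (active pixels only)."""
        return width * height * refresh_hz * bits_per_channel * channels / 1e9

    uhd_12bit = payload_gbps(3840, 2160, 60, 12)   # ~17.9 Gb/s
    hdmi20_payload = 18.0 * 0.8                    # 18 Gb/s link, ~80% usable -> ~14.4 Gb/s
    dp13_payload = 32.4 * 0.8                      # DP 1.3/1.4 raw 32.4 Gb/s -> ~25.9 Gb/s usable

    print(f"UHD/60 12-bit RGB needs ~{uhd_12bit:.1f} Gb/s")
    print(f"HDMI 2.0 payload ~{hdmi20_payload:.1f} Gb/s -> fits: {uhd_12bit <= hdmi20_payload}")
    print(f"DP 1.3/1.4 payload ~{dp13_payload:.1f} Gb/s -> fits: {uhd_12bit <= dp13_payload}")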

With HDMI 2.1 looming on the horizon – a new version of the interface that liberally borrows from DisplayPort architecture – VESA will certainly have its work cut out for it. The accelerated trend to 4K and ultimately 8K imaging will help, as DP can get to the faster data rates more quickly than HDMI. And the DisplayHDR standards aren’t just fluff – they’re also a way to expand awareness of the DisplayPort brand.

CES 2016 In The Rear View Mirror

I’m a little less than a week back from one of the world’s largest trade shows, the 2016 International CES. According to press releases from the Consumer Technology Association (CTA), the new name for the Consumer Electronics Association, upwards of 170,000 people attended the show this year, which was spread out over several venues in Las Vegas.

Based on the crowds I saw, I’d say that number wasn’t far off. Walking through booths in the Las Vegas Convention Center gave me the feeling of strolling along the beach, unaware that a tidal wave was sneaking up on you – one minute you had a particular exhibit all to yourself, and the next, you were swamped by a sea of bodies adorned with CES badges.

Trying to predict which trends in electronics will be “hot” each year is basically a fool’s errand. Going into the show, I was deluged with press releases about “Internet of Things” gadgets, and the show didn’t disappoint – I saw everything from connected thermostats and body sensors to pet food dispensers and shower heads that monitor how much water each member of your family uses – and record that data, too.

The LG floor-to-ceiling OLED wall at CES put many people into a trance.

TCL set up their usual tiny booth in the Central Hall.

Last year, the show was all about Ultra HDTV, with some unusual video aspect ratios and pixel counts thrown in. This year, I figured high dynamic range (HDR) would be the “hot” item in every booth. Surprisingly, it wasn’t generating all that much buzz, even though it was featured in the Sony, Samsung, LG, and Chinese TV booths. Instead, there seemed to be much more interest in virtual reality (VR), examples of which were to be found everywhere in the LVCC and also over at the Sands Expo Center.

What was an eye-opener (although not entirely unexpected) was the reduction in space devoted to televisions in the Samsung, Panasonic, and LG booths. Sony chose to use Ultra HDTVs to illustrate HDR, wide color gamut, and local dimming concepts, while Panasonic largely ignored TVs altogether, featuring just a 65-inch UHD OLED TV in one part of their booth and a 55-inch 8K LCD set in another, primarily to demonstrate 8K signal transport over optical fiber.

LG and Samsung devoted more real estate than ever before to connected and “smart” appliances, tablets, smartphones, and personal electronics like smart watches, subtly pushing TVs (of which there were still plenty, believe me) to a secondary role with less square footage. The fact is, appliances are more profitable than TVs these days…WAY more profitable. And Samsung and LG had plenty of refrigerators, ovens, washers, and even dryers out for inspection.

For LG, CES was a big “coming out” party for their expanding line of OLED Ultra HDTVs – they were everywhere, dazzling with their deep blacks and saturated colors. But LCD still plays a part in the LG ecosystem: The 98-inch 8K LCD panel that blew us away last year made a return appearance, as did the 105-inch 21:9 5K (5120×2160) model.

This Innolux 8K LCD monster TV showed up in the Hisense booth and a few other locations.

Samsung showed the “World’s largest 170-inch TV.” Apparently there are smaller ones I didn’t know about.

Over in the Samsung booth, they kept the “mine’s bigger than yours” contest going with a 170-inch Ultra HDTV based on an LCD panel fabbed at CSOT in China and equipped with quantum dots. (Last year, Samsung insisted their quantum dot illumination technology was to be called “nanocrystals.” This year, they did a 180-degree turn, and are now calling them quantum dots.) A curved 8K TV and some demos of live broadcast Ultra HD with HDR were also showcased alongside the company’s new Ultra HD Blu-ray player ($399 when it ships in the spring).

The “towers” and stacks of LG and Samsung televisions we used to marvel at a decade ago have now found their way into the ever-expanding booths of Chinese TV brands like Hisense, TCL, Changhong, Haier, Konka, and Skyworth. (Not familiar names? Don’t worry, you’ll get to know them soon enough.) And notable by its absence was Sharp Electronics, whose US TV business and assembly plant in Mexico were acquired by Hisense last year. That’s quite a change from ten years ago, when the company held a 21% worldwide market share in LCD TV shipments.

To be sure, there was a Sharp meeting room w-a-y in the back of the Hisense booth, which was enormous – almost as big as TCL’s behemoth in the middle of the Central Hall. And the Konka, Changhong, and Skyworth booths weren’t far behind in size. If you needed to see the writing on the wall regarding the future of television manufacturing, it couldn’t have been more clear – everything is slowly and inexorably moving to China. (It’s a good bet that the LCD panel in your current TV came out of a Chinese or Taiwanese assembly plant!)

TVs were just part of the story in Las Vegas. I had been waiting a few years to see which companies would finally pick up the baton and start manufacturing 802.11ad Wi-Fi chipsets. For those readers who haven’t heard of it before, 802.11ad – better known as “Wireless Gigabit” or “Certified Wireless Gigabit” – is a standard that uses the 60 GHz millimeter-wave band to transmit high-speed data over 2 GHz-wide channels.

Letv demonstrated wireless 4K video streaming over 60 GHz 802.11ad, using this new smartphone and Qualcomm's chipset.

Are you on the USB Type-C bandwagon yet? (Check your new laptop or smartphone…)

Considering that the current channels in the 2.4 GHz and 5 GHz bands are only 20 MHz wide, and that the 802.11ac channel bonding protocol can only combine enough of them to create a 160 MHz channel, that’s quite a leap in bandwidth! The catch? 60 GHz signals are reflected by just about any solid object, limiting their use to inside rooms. But with high-power operation and steerable antennas, those signals can travel a pretty good distance.
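
To put those channel widths side by side, here is a quick sketch using the figures above. It compares spectrum only; actual throughput also depends on modulation, coding, and range.

    # Channel-width comparison (spectrum only, not real-world throughput).
    legacy_channel_mhz = 20      # one 2.4/5 GHz Wi-Fi channel
    bonded_ac_mhz = 160          # widest bonded 802.11ac channel
    ad_channel_mhz = 2000        # one ~2 GHz-wide 60 GHz (802.11ad) channel

    print(f"vs. a 20 MHz channel:  {ad_channel_mhz / legacy_channel_mhz:.0f}x the spectrum")
    print(f"vs. a 160 MHz channel: {ad_channel_mhz / bonded_ac_mhz:.1f}x the spectrum")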

In-room, high-bandwidth operation is perfect for streaming video – even at 4K resolution – from phones, tablets, set-top boxes, and even Blu-ray players to TVs, projectors, AV receivers, and switching and distribution gear. Qualcomm had demos of numerous ready-to-manufacture tri-band modems (2.4/5/60 GHz), along with Letv’s latest smartphone with a built-in 60 GHz radio chip. And SiBEAM, a part of Lattice Semiconductor, showed 4K streaming through their WiHD technology, along with close-proximity interface coupling using SNAP to download images and video from a waterproofed GoPro camera.

Lattice had some other tricks up their sleeve in their meeting room. One of those was using a Windows 10 phone with an MHL (Mobile High-definition Link) connection through USB Type-C to create a virtual desktop PC. All that needed to be added was a mouse, a keyboard, and a monitor. In another area, they showed a scheme to compress Ultra HD signals before transmitting them over an HDBaseT link, with decompression at the far end – presumably to overcome the 18 Gb/s speed limit of HDMI 2.0.

DisplayPort had a good demonstration of Display Stream Compression (DSC). That's the chipset under that enormous fan.

Ultra HD Blu-ray is here, complete with high dynamic range mastering. How will it hold up against the growing trend to stream video?

Not far away, the “funny car” guys at the MHL Consortium showed their superMHL interface linking video to another LG 98-inch 8K LCD display. Converting what was once a tiny, 5-pin interface designed for 1080p/60 streaming off phones and tablets to a 32-pin, full-size symmetrical connector that can hit speeds of 36 Gb/s seems like putting Caterpillar truck tires and a big-block Chevy engine in a Smart Car to me…but they did it anyway, and added support for USB Type-C Alternate mode. Now, they’re ready for 8K, or so they keep telling me. (That’s fine, but the immediate need is for faster interfaces to accommodate Ultra HD with 10-bit and 12-bit RGB color at high frame rates. Let’s hear about some design wins!)

At the nearby VESA/DisplayPort booth, there were numerous demonstrations of video streaming over USB Type-C connections in Alternate mode, with one lash-up supporting two 1920x1080p monitors AND a 2550×1536 monitor, all at the same time. DP got somewhat faster with version 1.3 (32 Gb/s) and now a new version (1.4) will be announced by the end of January. The VESA guys also had a nice exhibit of Display Stream Compression (DSC), which can pack down a display signal by a 2:1 or 3:1 ratio with essentially no loss or latency (a few microseconds). If we’re going to keep pushing clock speeds higher and higher, compression is inevitable.
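
Here is a rough illustration of why even 2:1 compression matters. The example uses a hypothetical 5120×2880 monitor at 60 Hz with 10-bit RGB (my assumption, not one of the demos described above), counting active pixels only and ignoring blanking.

    # Why 2:1 DSC buys real headroom on a DP 1.3 link.
    def payload_gbps(w, h, hz, bpc, channels=3):
        return w * h * hz * bpc * channels / 1e9

    five_k_10bit = payload_gbps(5120, 2880, 60, 10)   # ~26.5 Gb/s uncompressed
    dp13_payload = 32.4 * 0.8                         # ~25.9 Gb/s usable after 8b/10b

    print(f"Uncompressed: {five_k_10bit:.1f} Gb/s vs. {dp13_payload:.1f} Gb/s "
          f"-> fits: {five_k_10bit <= dp13_payload}")
    print(f"With 2:1 DSC: {five_k_10bit / 2:.1f} Gb/s "
          f"-> fits: {five_k_10bit / 2 <= dp13_payload}")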

The world of display interfacing appears to be becoming more disjointed, what with the majority of consumer devices still supporting HDMI 1.4 and 2.0, while an increasing number of computer and video card manufacturers are jumping on the DisplayPort bandwagon (Apple, HP, and Lenovo, among others). How superMHL will fit into this is anyone’s guess: The format is TMDS-based, like HDMI, but outstrips it in every way (HDMI 2.0 does not support DSC or USB Type-C operation). Do we really need two TMDS-based interfaces going forward?

Speaking of USB Type-C, everybody and their brother/sister at CES had Type-C hubs, adapters, and even extenders out for inspection. If any connector is going to force the competing display interface standards to get in line, it will be this one. Apple, Intel, Lenovo, and several phone/tablet manufacturers are already casting their lots with Type-C, and it looks to be the next “sure thing” as we head toward a universal data/video/audio/power interface. I even came home with a credit card-sized press kit with a reversible USB 2.0 / 3.0 Type-C plug built-in!

First it was vinyl. Then cassettes. Now, Kodak is bringing back Super 8mm film and cameras. (I kid you not!)

Lenovo is one of four laptop manufacturers now offering OLED screens, here on a ThinkPad X1 Yoga (right).

So – how about HDR? Yes, a few companies showed it, and there were spirited discussions over dinner about whether OLEDs could actually show signals with high dynamic range (they most assuredly can, as they can reproduce 15 stops of light from just above black to full white without clipping) and whether you actually need thousands of cd/m² to qualify as an HDR display (I’m not in that camp; displays that bright can be painful to look at).

For LCDs, quantum dots (QDs) will lead the way to HDR. Both QD Vision and 3M had demos of quantum dot illuminants, with QD Vision focusing on light pipes for now and 3M partnering with Nanosys to manufacture a quantum dot enhancement film. Both work very well and provide a much larger color gamut than our current ITU Rec.709 color space, which looks positively washed-out compared to the more expansive Rec.2020 color gamut associated with UHD and HDR. QD Vision also showed the reduction in power consumption over OLEDs when using QDs. However, you won’t get the deep blacks and wide viewing angles out of an LCD in any case, so a few more watts may not matter to the videophiles.

The Ultra HD Blu-ray format had its formal debut at CES, with Panasonic and Samsung both showing players. Samsung’s player can be pre-ordered for $399 and will ship in the spring. (Remember when Samsung’s first-ever Blu-ray player sold for nearly $2,000 almost a decade ago?) To support HDR – which requires 10-bit encoding – the HDMI interface must be version 2.0a to correctly read the metadata. That metadata can be in the Dolby Vision format or the Technicolor format, but the baseline definition is HDR-10.

LG Display's flexible 18-inch OLED display was just too cool for words.

Stand four 65-inch UHD OLED panels on end, stitch them together, and this is what you get. Bibbedy-bobbedy-boo!

I saved the best for last. Every year, LG Display invites a few journalists up to what we call the “candy store” to see the latest in display technology. And this year didn’t disappoint: How about dual-side 55-inch flexible OLED TVs just millimeters thick? Or a 25-inch waterfall (curved) display that could form the entire center console in a car, with flexible OLEDs in the dashboard creating bright, colorful, and contrasty gauges?

LGD has WAY too much fun coming up with demos for this suite. I saw four 65-inch OLED panels stacked on end, edge to edge, and bent into an S-curve to create a 2.2:1 ratio widescreen UHD+ display. And it also had video playing on both sides. In another location, I saw a jaw-dropping 31.5” 8K LCD monitor with almost perfect uniformity, and an 82-inch “pillar” LCD display.

How about a 55-inch UHD OLED display rolled into a half-pipe, with you standing at the center, playing a video game? Talk about filling your field of view! Next to it was a convex 55-inch display, wrapped around a ceiling support pole. And next to that, a 55-inch transparent OLED display with graphics and text floating over real jewelry, arranged on tiers. The actual transparency index is about 40% and the concept worked great.

Toyota's Future Concept Vehicle (FCV) is a bit roomier than last year's sidecar-shaped model.

Wow, drones are getting REALLY big these days!

The icing on the cake was an 18-inch flexible OLED with 800×1200 resolution that could be rolled up into a tube or a cone-like shape while showing HD video. This was one of those “I gotta get me one of these!” moments, but significantly, it shows how OLED technology has matured to the point where it can be manufactured on flexible substrates. And what is the largest market in the world for displays? Transportation, where G-forces and vibration eventually crack rigid substrates, like LCD glass.

That’s just a snapshot of what I saw, and I haven’t even mentioned drones (buzzing all over the place), fold-up scooters and hoverboards, smart appliances, pet cams, alarms that alert you when an alarm goes off (really!), wooden smartphones (really!), talking spoons and forks (really!), toothbrushes linked to video games (would I kid you?), and 4K action cams with built-in solar cell chargers.

Gotta run now. My phone just sent me a Wi-Fi alarm that a Bluetooth-connected doorbell camera spotted the UPS guy delivering a package I was already alerted about via email to my desktop that signaled a buzzer via ZigBee in my virtual desktop PC that was connected wirelessly to my smartphone, currently streaming 4K video over a 60 GHz link to my “smart” TV that is also…also…also…

Oh, great. Now I’ve forgotten what I was talking about…Does anyone make an iRemember app? (Look for my “second thoughts” column later this month…)

2016 – A Turning Point For Television

In a few short weeks, I (and hundreds of my colleagues in the press) will dutifully board planes for Las Vegas to once again spend a week walking the show floor at International CES. We’ll listen to PR pitches, grab fast-food meals on the fly, show up late for appointments, have numerous ad hoc discussions in hallways and cabs, and try to make sense of all the new technologies unveiled in the Las Vegas Convention Center and nearby hotels.

As usual, many of us will want to focus on televisions – or more specifically, what televisions are becoming. TVs have always been an important product category at CES, and that was particularly true with the introduction of digital, high definition TV in the late 1990s, followed by plasma and then LCD display technologies in the early to mid-2000s.

Today, the bloom is largely off the rose. TVs have become commodities, thanks to aggressive pricing and distribution by Korean manufacturers that have largely driven the Japanese brands out of the business. And we’re seeing that cycle repeat itself as China becomes the nexus for TV manufacturing and prices for 1080p sets continue in free fall.

But something new is here – Ultra HD (a/k/a 4K). And the transition is happening at a breathtaking pace: The first 4K / UHD sets appeared on these shores in 2012 with astronomically high price tags. Four years later, you can buy a 55-inch Ultra HDTV with “smart” wireless functions for less than $800, a price point that has forced same-size 1080p sets below $500.

And it’s not just more pixels. High dynamic range (HDR) is coming to market, as are new illumination technologies that will provide much larger color gamuts. LCD and OLED panel manufacturers are now able to drive pixels at 10 bits per color, breaking past the now-inadequate 8-bit standard that has held back displays of all kinds for over a decade.
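
A quick bit of arithmetic shows what that extra bit depth buys, per channel and per RGB pixel:

    # Shades per channel and total displayable colors for 8-bit vs. 10-bit panels.
    for bits in (8, 10):
        shades = 2 ** bits
        colors = shades ** 3
        print(f"{bits}-bit: {shades:,} shades per channel, {colors:,} colors per RGB pixel")
    # 8-bit:  256 shades per channel,   16,777,216 colors
    # 10-bit: 1,024 shades per channel, 1,073,741,824 colors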

Chinese manufacturer Hisense now owns the Sharp TV brand, and will bring a line of quantum dot-equipped Ultra HDTVs to market in 2016.

Screen sizes are getting larger, too. Ten years ago, a 42-inch TV was considered “big” and anything larger was a home theater installation. Today? Consumers are routinely buying 50-inch, 55-inch, and even 60-inch sets as prices have fallen. That same 42-inch set is often consigned to a bedroom or kid’s room, or maybe a summer home.

Back in September of 2008, I bought a Panasonic 42-inch 1080p plasma TV for about $1,100. It had two HDMI 1.3 connections, three analog composite/component video inputs, and no network connectivity of any kind. But wow, did it make great pictures!

Seven years later, that TV sits in my basement, unused. It was replaced by a price-comparable, more energy-efficient 46-inch LCD model after Hurricane Sandy killed our power for several days and I did a whole-house energy audit. (And no, the LCD picture quality doesn’t compare to the plasma.)

But that’s not all that changed. I picked up four HDMI 1.4 inputs along the way (yep, it was set up for 3D), plus built-in Wi-Fi and “smart” functions. And I added a sound bar to make up for the awful quality of the built-in speakers. Plus, I added a Blu-ray player to round out the package, although it hardly sees any discs these days – it’s mostly used for streaming.

So – let’s say I’d like to replace that TV in 2016, just five years later. What would my options be?

To start with, I’d be able to buy a lot more screen. Right now, I could pick up a Samsung or LG 65-inch smart 1080p set for what I spent in 2011. Or, I could bite the bullet and make the move to Ultra HD with a 55-inch or 60-inch screen, complete with four HDMI inputs (one or two would be version 2.0, with HDCP 2.2 support), Wi-Fi, Netflix streaming (very important these days), and possibly a quantum dot backlight for HDR and WCG support.

My new set should support the HEVC H.265 codec, of course. That will make it possible to stream UHD content into my TV at 12 – 18 Mb/s from Netflix, Amazon Prime, Vimeo, Vudu, and any other company that jumps on the 4K content bandwagon. I could even go out and buy a brand-new Ultra HD Blu-ray player to complement it. But it’s more likely I’d opt to stream UHD content over my new, fast 30 Mb/s Internet connection from Comcast.

Now, it might pay to wait until later in 2016, when I could be sure of purchasing an Ultra HDTV that would support one or more of the proposed HDR delivery standards for disc-based and streaming UHD movies. And maybe I’d have more “fast” inputs, like DisplayPort 1.2 or even 1.3 to go along with HDMI 2.0 (and quite possibly, superMHL).

And I might even swing back over to an emissive display, to replace the picture quality I got from my old plasma set. That would mean purchasing an OLED Ultra HDTV, which would also support HDR and WCG, plus all of the usual bells and whistles (Wi-Fi, multiple HDMI/DP inputs, streaming, apps).

My point? We’re going to see some amazing technology in the next generation of televisions at ICES. And consumers are apparently warming up to Ultra HD – while sales of 1080p sets continue to decline, Ultra HD sales are climbing by double-digit percentages. I expect that number to accelerate as we near the Super Bowl, even though it won’t be broadcast in 4K (yet!).

If you are thinking about upgrading your main TV, 2016 could give you plenty of reasons to do it. My advice? Wait until all the puzzle pieces are in place for delivery of HDR and WCG to your home, and look into upgrading your Internet connections – streaming 4K will be here faster than you realize. And if you can live with your 1080p set until the fall of 2016, you’ll be amazed and likely very pleased at the upgrade…

Display Interfacing: Welcome to Babylon

For many years, ‘interfacing’ a video signal meant plugging in a yellow RCA or silver BNC connector that carried composite video. As picture resolution went up, computers became commonplace at work and home, and the term ‘progressive scan’ entered the lexicon, we saw the birth of S-video and then component video (YPbPr and RGB).

So we adapted, building switching and distribution gear that could handle one-, two-, and three-wire formats. All was well and good…until ‘digital’ made its grand entrance about 15 years ago.

Now, we have digital versions of component and RGB video, starting with the Digital Visual Interface (DVI) and moving to the High Definition Multimedia Interface (HDMI), DisplayPort, and the new superMHL interface that (according to the MHL Consortium) will start appearing on televisions as soon as December.

If I’m a consumer, I mostly don’t care about any of this. As long as I can plug in my set-top box, Blu-ray player, and other gadgets with the right cables I can find at Best Buy, this is just a bunch of alphabet soup.

However, if I’m an integrator (consumer or commercial), then I care VERY much about where all of this is heading. And if I’m paying any attention at all to the growing market for 4K and UHD, then I’m rightfully concerned about the impending problems with interfacing these signals.

superMHL is certainly fast enough to handle UHD. But you can’t find it in pro AV gear yet. Is there a better way?

Consider that:

*Even though HDMI 2.0 was announced in September of 2013 – TWO FULL YEARS AGO – virtually no manufacturer in the pro AV space supports this interface on their switchers and distribution amplifiers. Instead, the vast majority are still providing version 1.4 while claiming these products are “4K compatible” or “4K ready” because version 1.4 is just fast enough to pass an Ultra HD (3840×2160) signal at 30 Hz with 8-bit RGB color. That’s setting the bar kinda low, isn’t it?

*Some computer manufacturers don’t even support HDMI, like Apple (DisplayPort) and Lenovo (also DisplayPort). So, now you have to carry dongles everywhere you go?

*HDMI 2.0 arrives hand-in-hand with a new version of copy protection (HDCP 2.2) which is much more rigorous than versions 1.3 and 1.4. If a valid HDCP key exchange isn’t made within 20 milliseconds, the connection will shut down. Period.

*HDMI 2.0 isn’t fast enough for what UHD is turning out to be – a real departure from 1080p and Wide UXGA, with a move to 10-bit color to support high dynamic range (HDR), wide color gamuts (WCG), and high frame rates (HFR). DisplayPort 1.2 can barely support these requirements; DP version 1.3 and superMHL are better positioned to handle the job.

*The intellectual property behind HDMI and superMHL is owned by the same company – Lattice Semiconductor – and whereas once there were clear dividing lines between the two interfaces (MHL was designed originally for smartphones and tablets), they are now competing against each other. I’ve even sat in on presentations where it was explained that both could exist on consumer TVs. (And why would that make sense, again, when neither interface has been widely deployed to date, and one is clearly an improvement over the other?)

You can imagine what this trend is doing to product designers and manufacturers. Sure, HDMI is a “safe bet” for now, but what if our UHD needs quickly outstrip its maximum clock speed? DP is certainly faster, and there appears to be more support for it from computer manufacturers. But superMHL is faster still. Shouldn’t your interfaces at least have a head start on where display manufacturers are heading?

This reliance on HDMI has led several manufacturers into a potential trap, investing heavily in signal distribution architectures that may quickly run into a “future-proofing” problem. In contrast, outside the commercial AV industry, everyone from cable TV system operators to broadcasters and telecom operators is busy migrating to an IP-based architecture.

Not only does IP-based architecture have the advantage of being a relatively open system, it also solves many of the speed issues as 1-gigabit and 10-gigabit networks are becoming more commonplace. (Heck, Comcast just upgraded my home Internet speeds to 75 Mb/s on downloads, which is more than fast enough for me to stream 4K content from Netflix and Amazon!)

So, why don’t we do the same in the commercial AV industry? It’s not for a lack of products – there are several companies offering AV-over-IP transmitters and receivers, along with encoders and decoders. I’ve also seen impressive demos of “middleware” used to locate, switch, and play out media assets over IP networks. All of these guys were at InfoComm 2015.

The big players in HDMI-based switching and distribution argue against AV-over-IP for in-room and short-run signal distribution, citing latency and compression issues. Well, we now have a new codec (HEVC H.265) to handle that end of things, and it’s possible to stream video at high resolutions with low latency. (How does 1920x1080p/60 at 1 to 2 Mb/s sound to you? Thought so.)
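
For perspective, here is a rough calculation of the squeeze those numbers imply, assuming an uncompressed 8-bit 4:2:0 baseline (12 bits per pixel, active pixels only):

    # How much compression does 1-2 Mb/s for 1080p/60 imply?
    raw_bps = 1920 * 1080 * 60 * 12        # ~1.49 Gb/s uncompressed 1080p/60, 8-bit 4:2:0
    for target_mbps in (1, 2):
        ratio = raw_bps / (target_mbps * 1e6)
        print(f"{target_mbps} Mb/s HEVC -> roughly {ratio:,.0f}:1 compression")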

High latency is often the result of over-compression and heavy forward error correction (FEC). But if video and audio assets are streaming on bandwidth-managed, private IP networks, there isn’t a lot of forward error correction required. Group of Pictures (GOP) sizes can also increase to reduce latency. So latency is sort of a “straw man” argument. (And HDMI 2.0 will have plenty of issues with HDCP 2.2, trust me. Talk about latency…)

As for copy protection: video and audio assets streaming over IP connections have their own security protocols. Practically speaking, what could be more secure than video content streaming directly into a UHDTV through an Ethernet connection? And you don’t even have to plug in a cable to make it work, unless you use a wired Ethernet hookup. Bandwidth issues? Well, how about 5 GHz 802.11ac channel-bonding routers? I’m getting 70+ Mb/s download speeds from mine with wired connections, and 25 – 30 Mb/s at some distance over my 5 GHz wireless link.

Again, looking outside our industry, the two most common signal distribution and switching architectures are based on HD-SDI or IP (or both). Not HDMI, and certainly not HDMI-derived, structured-wire systems like HDBaseT. If the rest of the world wants to multiplex video, audio, metadata, and low-bitrate control signals, they do it over optical fiber. (Did you know that multimode fiber is cheaper than Category 6 wire?)

I’ll wrap things up by saying that the smart move is for commercial AV integrators to move to an AV-over-IP signal distribution system at the core like everyone else, leaving the HDMI, DisplayPort, superMHL, and “whatever comes next” connections for the far ends, near the displays (if those far-end conversions are even needed at all).

Leave the core as a high-speed, copper bus or optical bus, software-based switcher. If there’s enough bandwidth (and there should be), that system can also carry local TCP/IP traffic, SMTP alerts from connected devices, and control signals to all devices. Not only does this approach free everyone from the “closed world” paradigm of HDMI, it also makes the system infinitely more appealing to end-users and facility administrators, an increasing number of whom come from the IT world.

Consider this ad that was posted recently on a listserv for higher education:

“We are looking for an experienced AV-IT Engineer for the role of Technical Guru. The position will provide planning and support for AV-IT systems used in teaching and learning spaces big and small. The person in this position will focus on design, installation, and troubleshooting of AV-IT systems in a variety of venues, including traditional classrooms, active learning classrooms, large auditoria, computer labs, and even Makerspaces…We are looking for a seasoned professional with a solid background in AV-IT systems. This is a great opportunity for a doer who is excited about not just maintaining but also shaping the future of AV-IT technology as a key element of the teaching mission of one of the world’s top universities.”

I rest my case. It’s time for the commercial AV industry to get in step with the rest of the world and move to AV-over-IP signal distribution.

Wake up. Have you smelled the coffee yet?

Look Out, HDMI – Here Comes Super MHL!

Yesterday, the MHL Consortium announced its newest flavor of display interface – Super MHL (or, more accurately, superMHL). MHL, which stands for Mobile High-definition Link, was originally developed to enable display connections over Micro USB ports on phones and tablets. You’ll find it most often on mobile devices and televisions from Samsung and Sony. It also pops up on LG products, and there’s even an MHL port on the Pioneer AV receiver I bought a couple of months ago.

There have been some clever demos of MHL applications at past CES events. One was to build a “dumb” laptop (no CPU or video card) – just keyboard, touchpad, and display – and use MHL to dock a smartphone into it to make everything work. Another demo in the Silicon Image booth featured smartphones being used as video game controllers, with the video playing back on the controller screen.

Yet another demo showed a Sony Xperia phone being used as a remote control with a Samsung TV to select inputs, play video, and launch Internet applications. It’s easy to do this stuff when you can multiplex video and serial data through the same connector, which in MHL version 3.0 can even play back Ultra HD video at 30 fps with 8-bit color.

Note the emphasis on the “mobile” part. In the world of transition-minimized differential signaling (TMDS), MHL is one of a few players, the others being HDMI (the dominant digital display interface), its predecessor DVI (still going strong although the standard isn’t updated anymore), and Micro and Mini HDMI, niche connectors on smartphones and cameras.

The advent of Ultra HD, 4K, and higher display resolutions – like those of the new “5K” widescreen workstation monitors – has created a problem: Our display interfaces need to get faster. A LOT faster!

But HDMI 2.0, announced in September 2013, isn’t fast enough. I get into frequent debates with people about why it isn’t, so let me clarify my position: HDMI 2.0 has a maximum overall clock (data) rate of 18 gigabits per second (18 Gb/s). Only 80% of that can be used to carry display signals; the rest is overhead from the 8b/10b encoding scheme.

So that limits HDMI 2.0 to supporting 3840×2160 pixels (4400×2250 pixels with blanking) in an 8-bit RGB signal format at a 60 Hz refresh rate. That’s the hard-and-fast speed limit. For anyone using a computer workstation or media player with RGB output, it’s a serious obstacle: How will people who buy the new HP/Dell 27-inch workstation monitors connect them? Their working resolution is 5120×2880 pixels, and at 60 Hz, that’s just too fast for HDMI 2.0.
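
Here is the back-of-the-envelope math behind that ceiling (active pixels only, 8-bit RGB, blanking ignored, so real margins are even tighter):

    # 18 Gb/s TMDS link, ~80% usable after 8b/10b, vs. each format's payload.
    def payload_gbps(w, h, hz, bpc=8, channels=3):
        return w * h * hz * bpc * channels / 1e9

    hdmi20_payload = 18.0 * 0.8                  # ~14.4 Gb/s usable
    uhd_60 = payload_gbps(3840, 2160, 60)        # ~11.9 Gb/s -> fits
    five_k_60 = payload_gbps(5120, 2880, 60)     # ~21.2 Gb/s -> does not fit

    for name, rate in (("3840x2160/60", uhd_60), ("5120x2880/60", five_k_60)):
        print(f"{name} 8-bit RGB: {rate:.1f} Gb/s -> fits in HDMI 2.0: {rate <= hdmi20_payload}")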

It looked like DisplayPort 1.2 would finally ascend to the top of the podium, since its existing speed of 21.6 Gb/s (17.28 Gb/s usable) was already faster than HDMI 2.0. And now, DisplayPort 1.3 has been announced, with a top speed of 32 Gb/s (about 26 Gb/s usable) and the adoption of Display Stream Compression. Indeed, more computer manufacturers are providing DP connections on laptops: Lenovo seems to have moved completely to this format, and Apple has been supporting DP for some time now.

8K is here! (Okay, maybe that's a few years away...)

With all of that in mind, I will admit I was completely blind-sided by superMHL at this year’s International CES. Instead of a 5-pin Micro USB connector, superMHL offers a 32-pin, full-size connector that’s symmetrical (the next big thing in connectivity, a la USB Type-C). It also supports Display Stream Compression, is compatible with USB Type-C (although not with all six lanes), and has a maximum data rate of 36 Gb/s across six lanes of data. (According to the MHL Consortium, that’s fast enough to transport an 8K (7680×4320) image with 120 Hz refresh and 4:2:0 color.)
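
Some rough sizing of that claim, using my own back-of-the-envelope figures (assuming 8-bit 4:2:0 at an average of 12 bits per pixel and counting active pixels only):

    # superMHL: 36 Gb/s across six lanes vs. approximate 8K payloads.
    supermhl_gbps = 36.0
    per_lane_gbps = supermhl_gbps / 6            # 6.0 Gb/s per lane

    bits_per_pixel = 12                          # 8-bit 4:2:0 average
    eightk_60  = 7680 * 4320 * 60  * bits_per_pixel / 1e9   # ~23.9 Gb/s
    eightk_120 = 7680 * 4320 * 120 * bits_per_pixel / 1e9   # ~47.8 Gb/s

    print(f"Per lane: {per_lane_gbps:.1f} Gb/s")
    print(f"8K/60 4:2:0:  {eightk_60:.1f} Gb/s -> fits raw: {eightk_60 <= supermhl_gbps}")
    print(f"8K/120 4:2:0: {eightk_120:.1f} Gb/s -> fits raw: {eightk_120 <= supermhl_gbps} "
          f"(this is where Display Stream Compression comes in)")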

The MHL Consortium’s announcement yesterday featured Silicon Image’s new Sil9779 port processor, which can also handle HDMI 2.0 signals. Here are the key specs from the superMHL press release:

  • 8K 60fps video resolution, as outlined in the superMHL specification
  • New, reversible 32-pin superMHL connector
  • USB Type-C with MHL Alt Mode
  • High Dynamic Range (HDR), Deep Color, BT.2020
  • Object audio – Dolby Atmos®, DTS:X, 3D audio, audio-only mode
  • High bit-rate audio extraction
  • HDCP 2.2 premium content protection

Here’s the 32-pin superMHL reversible connector.

Whew! That’s quite a jump up from MHL. Some might say that superMHL is just MHL on steroids, but no matter how you look at it, superMHL is now a serious contender for the next generation of display connectivity. In the press briefing, a representative of the MHL Consortium waxed on about the approach of 8K broadcasting (it’s already been operating for two years in Japan) and how we would see a migration to 8K TV and displays in the near future.

As Larry David says, “Curb your enthusiasm!” Supporting 8K would be nice, but we’ve barely started the transition to UHDTV. And right now, selling 8K TV to the average consumer is like trying to peddle a Ferrari to someone who lives on a dirt road.

Where superMHL will find its niche is in supporting the higher bit rates that high dynamic range (HDR), wide color gamuts (BT.2020), and higher frame rates (60/96/100/120 Hz) require. All will shortly become important parts of the next-generation (UHD) television system. DisplayPort is already there with version 1.3, and you’ll even find DP 1.2 connections on selected models of Ultra HDTVs so that gamers can connect laptops and desktops at Ultra HD resolutions with 60 Hz refresh.

Now, the elephant in the room: How does the emergence of superMHL affect HDMI? Even though version 2.0 is over a year and a half old, you don’t see many HDMI 2.0 jacks on Ultra HDTVs. Casual inspections at Best Buy, HH Gregg, and other outlets show that the typical HDMI 2.0 port count is usually one (1), even as we approach April of 2015.

In the superMHL presentation, the concept of a TV with multiple HDMI 2.0 inputs and one superMHL input was outlined. This would, in effect, be the next step up from where we are now, with the typical Ultra HDTV having one HDMI 2.0 input and three HDMI 1.4 inputs.

But if Silicon Image’s new Sil9779 port processor can handle both formats, why bother with HDMI 2.0 in the first place, especially with its speed limitations? Wouldn’t it make more sense to future-proof all inputs and go with superMHL across the board? (Of course, the cost of adopting superMHL could weigh heavy on that decision.)

In the commercial AV and broadcast worlds, it would definitely make sense to jump to superMHL in the interests of future-proofing installations. Given the limited rollout of HDMI 2.0 to date, maybe supporting both HDMI 1.4 for legacy devices and superMHL is a smarter approach. (Note that superMHL and HDMI 2.0 both support HDCP 2.2, which is the next level in encryption and NOT compatible with older versions of HDMI.)

Summing up, the race for faster interface speed just got a lot more interesting with the addition of superMHL to the lineup. I can imagine that manufacturers of AV matrix switchers and distribution amplifiers are feeling another migraine headache coming on…

EDITOR’S NOTE: Last week, it was announced that Silicon Image has been acquired by Lattice Semiconductor of Hillsboro, Oregon, “a leading provider of programmable connectivity solutions,” according to the press release. The acquisition price was about $600M, and the deal leaves Lattice in control of HDMI, MHL and superMHL, and SiBEAM (WiHD) patents and IP. More information can be found on the Lattice Web site at http://www.latticesemi.com/.