Posts Tagged ‘HDMI 2.0’

Hey, Whatever Happened To superMHL?

There is no such thing as a ‘sure thing.’ You can have a 20-yard field goal try with 5 seconds left, two foul shots left to ice the game, or a one-on-one penalty shot with your best wing on the ice. Doesn’t matter – things do go awry. In fact, sometimes they never get going in the first place.

Two years ago this coming January, Silicon Image (now Lattice Semiconductor) unveiled what they claimed to be the best next-generation display interface. They called it superMHL, and it was super indeed, sporting a large 32-pin symmetrical plug design to go with a 36 gigabit-per-second (Gb/s) data transfer rate.

That wasn’t all. superMHL (basically MHL on steroids) also supported the new Display Stream Compression (DSC) 1.1 standard, and it would work with the all-important USB 3.0 Type-C plug’s Alternate Mode, which multiplexes display connections and fast USB serial data in the same ‘smart’ plug.

Wow! I didn’t see this coming; neither did most of the trade press in attendance. Here was a connector faster than DisplayPort’s version 1.3 (32 Gb/s), plus it was symmetrical in operation (plug it in either way, it doesn’t care, it’s smart enough to set itself up the right way). And it was compatible with the next generation of USB connectors.

Even more amazing, the MHL Consortium demo showed 8K content flowing to a large Samsung 8K TV through this interface, which claimed to support 7680×4320 video @ 60 Hz with 4:2:0 color (albeit using DSC to pack things down a bit in size). If there was ever a ‘sure thing,’ this was it!

It’s the fastest display interface out there – and no one uses it. Maybe they should call it HDMI 3.0?

I was assured in the following months that Lattice and the MHL Consortium would have several press announcements pertaining to design wins for the 2015 holiday season. I’d see several new UHD televisions with at least one superMHL port, with the rest of the inputs being HDMI 2.0 connections. Thus, we’d be ready for the brave new world of 8K TV! (Never mind that 4K TV was still getting on its feet at the time!)

But it never happened. Black Friday, Christmas, New Year’s, and then ICES and the 2016 Super Bowl came and went with no announcements. At ICES 2016, the MHL Consortium once again had a demo of 8K content playback through an LG 98-inch LCD TV using the superMHL interface, and “yes, it looked great” and “we’re ready for 8K TV” and “it works with USB Type-C” and so on, and so forth.

Right now, it’s pretty much radio silence about superMHL. So what happened?

For one thing, the adoption rate of HDMI 2.0 since its formal unveiling in 2013 can be charitably described as “slow.” Early Ultra HDTVs had perhaps one HDMI 2.0 port on them, and not all of them supported the new HDCP 2.2 copy protection protocol. In our industry, we’re only now starting to see distribution amplifiers and switches with HDMI 2.0 connections – there’s still a lot of version 1.4 product out there, too.

Another perplexing question: superMHL fixes the speed limit problem of HDMI 2.0 by doubling its data rate, adds the all-important compatibility with USB Type-C (a must, going forward), and supports DSC (critical as we push display resolutions beyond 5K). So why would Lattice continue to support both formats, or even suggest they could be mixed on future UHD+ televisions and monitors?

In other words: if there is a better option, why wouldn’t you want that option?

To be sure, Lattice is in a tricky position. Through their subsidiary HDMI Licensing LLC, they reap millions of dollars each year in royalties associated with every HDMI port on every piece of consumer and commercial gear. That’s a nice cash flow, and who wants to mess with it?

But they really can’t lose here, inasmuch as they control the IP for all of these transition-minimized differential signaling (TMDS) interfaces. Why not bite the bullet and announce the phase-out of HDMI 1.3/1.4, and move everyone to version 2.0? Better yet, just announce a sunset for version 2.0 and start the transition to superMHL, a/k/a HDMI 3.0?

Yeah, it’s fun to demo 8K TV using superMHL, but that takes the focus off the real-world, practical interfacing solutions we’re facing now.

One problem Lattice created with this new connector is that its name is effectively an oxymoron. MHL stands for Mobile High-definition Link, and it was originally designed to multiplex HDMI signals through 5-pin micro USB ports. The concept was that the single micro USB connector on your smartphone or tablet could connect to a television so you could play back videos, show photos, and share your screen. (Never mind that the majority of people prefer to do this via a wireless connection and not a 15-foot HDMI-to-micro USB cable that often requires a power adapter.)

So MHL meant “small, fast, and powerful.” And now we have the ‘funny car’ of display interfaces with a large connector that will never get anywhere near your mobile device…and the way things are going, it may never get anywhere near your TV, either.

In previous columns and in my classes and talks, I’ve written about the deficiencies of HDMI 2.0 – slow speed, non-symmetrical operation, no support for USB Type-C (finally remedied a few months ago), and no support for Display Stream Compression. superMHL fixes all of these problems in one fell swoop.

The answer? Re-brand this connector as HDMI 3.0 – which it really is – and make the appropriate announcement in two months at ICES 2017. Practically speaking, MHL has been a non-starter (among the major brands sold in the U.S., only Sony, Samsung, and LG have supported it on their smartphones and TVs), and the adoption rate for HDMI 2.0 is nowhere near as fast as it was for version 1.3. Too many interfaces and too much confusion!

After all, even Elvis Presley had to make a comeback…

Ultra HDTV, HDMI 2.0, and HDCP 2.2 – Oh, What A Tangled Web We Weave…

A few days ago, I received an email from the president of an AV products manufacturer. He had purchased a Samsung UN65HU8550 65-inch Ultra HDTV back in 2014 and decided to take the plunge into Ultra HD Blu-ray.

Previously, he had been using an upscaling Blu-ray player to achieve 3840×2160 resolution, but now he wanted the real thing. So, he visited his local Best Buy and picked up Samsung’s UBD-K8500 UHD Blu-ray player, took it home, and connected it to one of the HDMI inputs on his UHDTV.

Sounds simple, right? Except that it didn’t work. The UHD disc spun up, started to play, and then a message was displayed that the player would down-convert to 1080p resolution because it didn’t detect support for HDCP 2.2 (the newest and most aggressive form of copy protection for optical disc media).

To him, this made no sense whatsoever. (Me, too!) Here he was, playing an Ultra HD Blu-ray disc, from a Samsung Ultra HD BD player, into a Samsung Ultra HDTV – and it wouldn’t work. I advised him to make sure he was truly using an HDMI 2.0 input (sometimes labeled as such, or color-coded).

He tried all of the inputs, including the MHL input that is supposed to be compliant with HDCP 2.2, but no luck. Again, the disc would spin up, and then display the same error message. (By the way, HDMI 1.3/1.4 inputs can also support HDCP 2.2.)

Another trip to Best Buy resulted in the purchase of Philips’ BDP7501 Ultra HD Blu-ray model, which was then connected to the Samsung TV and – voila! – it worked, playing back in true 2160p resolution. That is, it only worked when connected directly to the Samsung TV, and NOT through his existing Denon AVR (which likely doesn’t support HDCP 2.2 key exchanges on any of its HDMI ports).

Some quick checks on the Internet showed this wasn’t an isolated problem – others had purchased the same TV, or different screen size variations of it, and were unable to watch 4K movies from the Samsung player. One comment I read talked about going so far as to buy an HDCP 2.2 to HDCP 1.4 converter, a product I wasn’t even aware existed. And apparently, it worked! (Warning: This product may be illegal to purchase as it alters a copy-protection process. I’m only providing the URL as a reference.) (http://www.hdtvsupply.com/hdcp-2-2-to-hdcp-1-4-converter.html)

The next step was to check in with my friends at Samsung, who responded that an upgrade kit would fix the problem. It’s called the SEK3500U One Connect Evolution Kit, and it attaches to your Samsung 4K TV through a separate connector on the side panel. This $400 box – which resembles a thin Blu-ray player – provides four HDMI 2.0 inputs, all up to speed with HDCP 2.2 support, HDMI 2.0a compatibility for high dynamic range playback, and improved color rendering, according to several Amazon reviews I read. (https://www.amazon.com/Samsung-SEK-3500U-ZA-Evolution-Kit) Samsung also commented that frame rates may play a part in the problem: the Blu-ray Disc Association HDR specification for HDMI 2.0a calls for 2160p60 playback with 4:4:4 color, and using a lower frame rate might fool the UHDTV into down-converting to 1080p resolution.

All of this just confirms my continued advice to my friends and colleagues: “Wait just a little bit longer before you buy a 4K TV.” Too many things are still in a state of flux on the manufacturing side, not the least of which is support for multiple high dynamic range formats. And the issues with HDCP 2.2 support are, frankly, just ridiculous at this point: The standard’s been out for a few years, and it will be used exclusively with all HDMI inputs on Ultra HDTVs.

Another takeaway from this is the slow and steady move away from optical disc delivery of 4K movies and TV shows toward streaming connections. The protocols for copy protection are a bit different for streaming, but at least the underlying architecture is standard across all platforms (some sort of common streaming protocol like RTSP, carrying MPEG-4 H.264, HEVC H.265, or VP9 video with IP headers) and can be easily updated with software.

Given the continual increase in home broadband speeds – especially in metro areas – 4K streaming is fast becoming a realistic option. Granted, the image quality at 15 – 20 Mb/s won’t be as good as a file coming off an optical disc at 100 – 110 Mb/s, but as we’ve seen repeatedly, the vast majority of home viewers continue to choose convenience and price over quality. That may be one reason there are only three Ultra HD Blu-ray players on the market today: How many people are going to spend $300 – $400 – $500 for one?
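
To put rough numbers on that quality gap, compare the bit budget each source can spend on every pixel. Here’s a back-of-the-envelope sketch in Python – the 2160p60 playback assumption is mine, and real encoders spend those bits very differently depending on content:

```python
# Rough quality proxy: compressed bits available per pixel, per frame.
# Assumes 3840x2160 @ 60 Hz playback for both sources.
def bits_per_pixel(mbps, width=3840, height=2160, fps=60):
    return (mbps * 1e6) / (width * height * fps)

print(f"Streaming @ 20 Mb/s: {bits_per_pixel(20):.3f} bits/pixel")
print(f"UHD disc @ 100 Mb/s: {bits_per_pixel(100):.3f} bits/pixel")
# The disc gets roughly five times the bit budget per pixel -- a margin
# that shows up in dark scenes and fast motion, if anyone compares.
```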

As I write this, the SEK3500U is on its way, and my colleague will soon be enjoying true Ultra HD movies like he should have been from the start. I suppose the $400 cost is a small price to pay if you’ve already shelled out a few thousand dollars for an Ultra HDTV, but it would irk me to no end to be in that situation. (You know what they say about the “leading” edge often being the “bleeding” edge.)

To summarize, my advice to readers remains the same as it has been. If you are thinking of buying a new Ultra HDTV – like me – WAIT until next spring, or at least until Super Bowl time. Not only will you see lower prices, but you’re more likely to have all of the bugs out of the system – and you’ll be able to score a good deal on a set that can show high dynamic range content, too, ideally one supporting two or more of the new HDR formats.

And if you just gotta have an Ultra HD Blu-ray player? Those prices will have come down, too. A quick check on Amazon shows the UBD-K8500 currently available for $317.99, while the Philips BDP7501 will cost you $279.99.  (Panasonic’s DMP-UB900 player wasn’t shipping at the time this article was written.)

Caveat emptor….

2016 – A Turning Point For Television

In a few short weeks, I (and hundreds of my colleagues in the press) will dutifully board planes for Las Vegas to once again spend a week walking the show floor at International CES. We’ll listen to PR pitches, grab fast-food meals on the fly, show up late for appointments, have numerous ad hoc discussions in hallways and cabs, and try to make sense of all the new technologies unveiled in the Las Vegas Convention Center and nearby hotels.

As usual, many of us will want to focus on televisions – or more specifically, what televisions are becoming. TVs have always been an important product category at CES, and that was particularly true with the introduction of digital, high definition TV in the late 1990s, followed by plasma and then LCD display technologies in the early to mid-2000s.

Today, the bloom is largely off the rose. TVs have become commodities, thanks to aggressive pricing and distribution by Korean manufacturers that have largely driven the Japanese brands out of the business. And we’re seeing that cycle repeat itself as China becomes the nexus for TV manufacturing and prices for 1080p sets continue in free fall.

But something new is here – Ultra HD (a/k/a 4K). And the transition is happening at a breathtaking pace: The first 4K / UHD sets appeared on these shores in 2012 with astronomically high price tags. Four years later, you can buy a 55-inch Ultra HDTV with “smart” wireless functions for less than $800, a price point that has forced same-size 1080p sets below $500.

And it’s not just more pixels. High dynamic range (HDR) is coming to market, as are new illumination technologies that will provide much larger color gamuts. LCD and OLED panel manufacturers are now able to address pixels at 10 bits per color, breaking past the now-inadequate 8-bit standard that has held back displays of all kinds for over a decade.

Chinese manufacturer Hisense now owns the Sharp TV brand, and will bring a line of quantum dot-equipped Ultra HDTVs to market in 2016.

Screen sizes are getting larger, too. Ten years ago, a 42-inch TV was considered “big” and anything larger was a home theater installation. Today? Consumers are routinely buying 50-inch, 55-inch, and even 60-inch sets as prices have fallen. That same 42-inch set is often consigned to a bedroom or kid’s room, or maybe a summer home.

Back in September of 2008, I bought a Panasonic 42-inch 1080p plasma TV for about $1,100. It had two HDMI 1.3 connections, three analog composite/component video inputs, and no network connectivity of any kind. But wow, did it make great pictures!

Seven years later, that TV sits in my basement, unused. It was replaced by a price-comparable, more energy-efficient 46-inch LCD model after Hurricane Sandy killed our power for several days and I did a whole-house energy audit. (And no, the LCD picture quality doesn’t compare to the plasma.)

But that’s not all that changed. I picked up four HDMI 1.4 inputs along the way (yep, it was set up for 3D), plus built-in Wi-Fi and “smart” functions. And I added a sound bar to make up for the awful quality of the built-in speakers. Plus, I added a Blu-ray player to round out the package, although it hardly sees any discs these days – it’s mostly used for streaming.

So – let’s say I’d like to replace that TV in 2016. What would my options be?

To start with, I’d be able to buy a lot more screen. Right now, I could pick up a Samsung or LG 65-inch smart 1080p set for what I spent in 2011. Or, I could bite the bullet and make the move to Ultra HD with a 55-inch or 60-inch screen, complete with four HDMI inputs (one or two would be version 2.0, with HDCP 2.2 support), Wi-Fi, Netflix streaming (very important these days), and possibly a quantum dot backlight for HDR and WCG support.

My new set should support the HEVC H.265 codec, of course. That will make it possible to stream UHD content into my TV at 12 – 18 Mb/s from Netflix, Amazon Prime, Vimeo, Vudu, and any other company that jumps on the 4K content bandwagon. I could even go out and buy a brand-new Ultra HD Blu-ray player to complement it. But it’s more likely I’d opt to stream UHD content over my new, fast 30 Mb/s Internet connection from Comcast.

Now, it might pay to wait until later in 2016, when I could be sure of purchasing an Ultra HDTV that would support one or more of the proposed HDR delivery standards for disc-based and streaming UHD movies. And maybe I’d have more “fast” inputs, like DisplayPort 1.2 or even 1.3 to go along with HDMI 2.0 (and quite possibly, superMHL).

And I might even swing back over to an emissive display, to replace the picture quality I got from my old plasma set. That would mean purchasing an OLED Ultra HDTV, which would also support HDR and WCG, plus all of the usual bells and whistles (Wi-Fi, multiple HDMI/DP inputs, streaming, apps).

My point? We’re going to see some amazing technology in the next generation of televisions at ICES. And consumers are apparently warming up to Ultra HD – while sales of 1080p sets continue to decline, Ultra HD sales are climbing by double-digit percentages. I expect that number to accelerate as we near the Super Bowl, even though it won’t be broadcast in 4K (yet!).

If you are thinking about upgrading your main TV, 2016 could give you plenty of reasons to do it. My advice? Wait until all the puzzle pieces are in place for delivery of HDR and WCG to your home, and look into upgrading your Internet connections – streaming 4K will be here faster than you realize. And if you can live with your 1080p set until the fall of 2016, you’ll be amazed and likely very pleased at the upgrade…

Display Interfacing: Welcome to Babylon

For many years, ‘interfacing’ a video signal meant plugging in a yellow RCA or silver BNC connector that carried composite video. As picture resolution went up, computers became commonplace at work and home, and the term ‘progressive scan’ entered the lexicon, we saw the birth of S-video and then component video (YPbPr and RGB).

So we adapted, building switching and distribution gear that could handle one-, two-, and three-wire formats. All was well and good…until ‘digital’ made its grand entrance about 15 years ago.

Now, we have digital versions of component and RGB video, starting with the Digital Visual Interface (DVI) and moving to the High Definition Multimedia Interface (HDMI), DisplayPort, and the new superMHL interface that (according to the MHL Consortium) will start appearing on televisions as soon as December.

If I’m a consumer, I mostly don’t care about any of this. As long as I can plug in my set-top box, Blu-ray player, and other gadgets with the right cables I can find at Best Buy, this is just a bunch of alphabet soup.

However, if I’m an integrator (consumer or commercial), then I care VERY much about where all of this is heading. And if I’m paying any attention at all to the growing market for 4K and UHD, then I’m rightfully concerned about the impending problems with interfacing these signals.

superMHL is certainly fast enough to handle UHD. But you can’t find it in pro AV gear yet. Is there a better way?

Consider that:

*Even though HDMI 2.0 was announced in September of 2013 – TWO FULL YEARS AGO – virtually no manufacturer in the pro AV space supports this interface on their switchers and distribution amplifiers. Instead, the vast majority are still providing version 1.4 while claiming these products are “4K compatible” or “4K ready” because version 1.4 is just fast enough to pass an Ultra HD (3840×2160) signal at 30 Hz with 8-bit RGB color. That’s setting the bar kinda low, isn’t it?

*Some computer manufacturers don’t even support HDMI, like Apple (DisplayPort) and Lenovo (also DisplayPort). So, now you have to carry dongles everywhere you go?

*HDMI 2.0 arrives hand-in-hand with a new version of copy protection (HDCP 2.2) which is much more rigorous than versions 1.3 and 1.4. If a valid HDCP key exchange isn’t made within 20 milliseconds, the connection will shut down. Period.

*HDMI 2.0 isn’t fast enough for what UHD is turning out to be – a real departure from 1080p and Wide UXGA, with a move to 10-bit color to support high dynamic range (HDR), wide color gamuts (WCG), and high frame rates (HFR). DisplayPort 1.2 can barely support these requirements; DP version 1.3 and superMHL are better positioned to handle the job. (The quick comparison sketch after this list shows the arithmetic.)

*The intellectual property behind HDMI and superMHL is owned by the same company – Lattice Semiconductor – and whereas once there were clear dividing lines between the two interfaces (MHL was designed originally for smartphones and tablets), they are now competing against each other. I’ve even sat in on presentations where it was explained that both could exist on consumer TVs. (And why would that make sense, again, when neither interface has been widely deployed to date, and one is clearly an improvement over the other?)
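
Here’s the comparison sketch promised above: a quick Python calculation of usable interface payload against raw (uncompressed) format requirements. The HDMI figures lose 20% of the link rate to 8b/10b encoding (HDMI 1.4’s 10.2 Gb/s link rate is my addition, not quoted in this column), the DisplayPort usable rates are the commonly quoted ones, and superMHL is shown at its raw 36 Gb/s since its usable fraction isn’t quoted here. Blanking intervals are ignored to keep the arithmetic simple:

```python
# Usable interface payload vs. raw bandwidth needs of UHD formats.
INTERFACES_GBPS = {
    "HDMI 1.4": 10.2 * 0.8,  # 8.16 Gb/s after 8b/10b encoding
    "HDMI 2.0": 18.0 * 0.8,  # 14.4 Gb/s after 8b/10b encoding
    "DP 1.2":   17.28,
    "DP 1.3":   26.0,
    "superMHL": 36.0,        # raw link rate; usable fraction not quoted
}

def raw_gbps(w, h, hz, bits_per_pixel):
    """Uncompressed video data rate in Gb/s (blanking ignored)."""
    return w * h * hz * bits_per_pixel / 1e9

FORMATS = {
    "UHD/30, 8-bit RGB":   raw_gbps(3840, 2160, 30, 24),   # ~6.0 Gb/s
    "UHD/60, 10-bit RGB":  raw_gbps(3840, 2160, 60, 30),   # ~14.9 Gb/s
    "UHD/120, 10-bit RGB": raw_gbps(3840, 2160, 120, 30),  # ~29.9 Gb/s
}

for name, need in FORMATS.items():
    fits = [i for i, cap in INTERFACES_GBPS.items() if cap >= need]
    print(f"{name}: {need:.1f} Gb/s -> fits on: {', '.join(fits)}")
```

Even before blanking is counted, UHD at 60 Hz with 10-bit color slips past HDMI 2.0’s usable rate, and 120 Hz leaves everything but superMHL behind unless Display Stream Compression enters the picture.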

You can imagine what this trend is doing to product designers and manufacturers. Sure, HDMI is a “safe bet” for now, but what if our UHD needs quickly outstrip its maximum clock speed? DP is certainly faster, and there appears to be more support for it from computer manufacturers. But superMHL is faster still. Shouldn’t your interfaces at least have a head start on the displays they’ll be driving?

This reliance on HDMI has led several manufacturers into a potential trap, investing heavily in signal distribution architectures that may quickly run into a “future-proofing” problem. In contrast, outside the commercial AV industry, everyone from cable TV system operators to broadcasters and telecom operators is busy migrating to an IP-based architecture.

Not only does IP-based architecture have the advantage of being a relatively open system, it also solves many of the speed issues as 1-gigabit and 10-gigabit networks are becoming more commonplace. (Heck, Comcast just upgraded my home Internet speeds to 75 Mb/s on downloads, which is more than fast enough for me to stream 4K content from Netflix and Amazon!)

So, why don’t we do the same in the commercial AV industry? It’s not for a lack of products – there are several companies offering AV-over-IP transmitters and receivers, along with encoders and decoders. I’ve also seen impressive demos of “middleware” used to locate, switch, and play out media assets over IP networks. All of these guys were at InfoComm 2015.

The big players in HDMI-based switching and distribution argue against AV-over-IP for in-room and short-run signal distribution, citing latency and compression issues. Well, we now have a new codec (HEVC H.265) to handle that end of things, and it’s possible to stream video at high resolutions with low latency. (How does 1920x1080p/60 at 1 to 2 Mb/s sound to you? Thought so.)
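
To put that claim in perspective, here’s the arithmetic as a quick sketch – whether those ratios are actually achievable depends heavily on content, and a static presentation screen compresses far more gracefully than live sports:

```python
# How aggressive is "1080p/60 at 1 to 2 Mb/s"? Compare to the raw rate.
raw_mbps = 1920 * 1080 * 60 * 24 / 1e6  # ~2,986 Mb/s uncompressed (8-bit RGB)
for encoded_mbps in (1, 2):
    print(f"{encoded_mbps} Mb/s -> roughly {raw_mbps / encoded_mbps:,.0f}:1 compression")
# ~1,500:1 to ~3,000:1 -- ratios HEVC can approach on the mostly static
# material (slides, user interfaces) that dominates commercial AV.
```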

High latency is often the result of over-compression and heavy forward error correction (FEC). But if video and audio assets are streaming on bandwidth-managed, private IP networks, there isn’t a lot of forward error correction required. Group of Pictures (GOP) sizes can also increase to reduce latency. So latency is sort of a “straw man” argument. (And HDMI 2.0 will have plenty of issues with HDCP 2.2, trust me. Talk about latency…)

As for copy protection: video and audio assets streaming over IP connections have their own security protocols. Practically speaking, what could be more secure than video content streaming directly into a UHDTV over a network connection? With Wi-Fi, you don’t even have to plug in a cable to make it work. Bandwidth issues? Well, how about 5 GHz 802.11ac channel-bonding routers? I’m getting 70+ Mb/s download speeds from mine with wired connections, and 25 – 30 Mb/s some distance away over my 5 GHz wireless link.

Again, looking outside our industry, the two most common signal distribution and switching architectures are based on HD-SDI or IP (or both). Not HDMI, and certainly not HDMI-derived, structured-wire systems like HDBaseT. If the rest of the world wants to multiplex video, audio, metadata, and other low-bitrate control signals, they do it over optical fiber. (Did you know that multimode fiber is cheaper than Category 6 wire?)

I’ll wrap things up by saying that the smart move is for commercial AV integrators to move to an AV-over-IP signal distribution system at the core, like everyone else, leaving the HDMI, DisplayPort, superMHL, and “whatever comes next” connections for the far ends, near the displays (if those far-end conversions are even needed at all).

Leave the core as a high-speed, software-based switcher running over a copper or optical bus. If there’s enough bandwidth (and there should be), that system can also carry local TCP/IP traffic, SMTP alerts from connected devices, and control signals to all devices. Not only does this approach free everyone from the “closed world” paradigm of HDMI, it also makes the system infinitely more appealing to end-users and facility administrators, an increasing number of whom come from the IT world.
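
To make the “software-based switcher” idea concrete, here’s a minimal sketch of how many AV-over-IP systems handle switching: each encoder streams to its own multicast group, and a receiver changes inputs simply by leaving one group and joining another. The addresses and port below are placeholders of my own, not any vendor’s convention:

```python
import socket

PORT = 5004  # placeholder port for the video streams

def make_receiver():
    """Create a UDP socket that can subscribe to multicast video streams."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    return sock

def switch_input(sock, old_group, new_group, iface_ip="0.0.0.0"):
    """'Switching' in AV-over-IP: drop one multicast group, join another."""
    iface = socket.inet_aton(iface_ip)
    if old_group:
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_DROP_MEMBERSHIP,
                        socket.inet_aton(old_group) + iface)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP,
                    socket.inet_aton(new_group) + iface)

# A display endpoint cuts from one encoder's stream to another -- no
# crosspoint hardware involved, just an IGMP leave and join.
rx = make_receiver()
switch_input(rx, None, "239.1.1.10")          # tune to source A
switch_input(rx, "239.1.1.10", "239.1.1.20")  # cut to source B
```

The “crosspoint” lives entirely in software (and in the network’s IGMP snooping), which is exactly why the same core can also carry ordinary TCP/IP traffic and device control alongside the video.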

Consider this ad that was posted recently on a listserv for higher education:

“We are looking for an experienced AV-IT Engineer for the role of Technical Guru. The position will provide planning and support for AV-IT systems used in teaching and learning spaces big and small. The person in this position will focus on design, installation, and troubleshooting of AV-IT systems in a variety of venues, including traditional classrooms, active learning classrooms, large auditoria, computer labs, and even Makerspaces…We are looking for a seasoned professional with a solid background in AV-IT systems. This is a great opportunity for a doer who is excited about not just maintaining but also shaping the future of AV-IT technology as a key element of the teaching mission of one of the world’s top universities.”

I rest my case. It’s time for the commercial AV industry to get in step with the rest of the world and move to AV-over-IP signal distribution.

Wake up. Have you smelled the coffee yet?

Look Out, HDMI – Here Comes Super MHL!

Yesterday, the MHL Consortium announced its newest flavor of display interface – Super MHL (or, more accurately, superMHL). MHL, which stands for Mobile High-definition Link, was originally developed to enable display connections over Micro USB ports on phones and tablets. You’ll find it most often on mobile devices and televisions from Samsung and Sony. It also pops up on LG products, and there’s even an MHL port on the Pioneer AV receiver I bought a couple of months ago.

There have been some clever demos of MHL applications at past CES events. One was to build a “dumb” laptop (no CPU or video card) – just keyboard, touchpad, and display – and use MHL to dock a smartphone into it to make everything work. Another demo in the Silicon Image booth featured smartphones being used as video game controllers with the video playing back on the controller screen.

Yet another demo showed a Sony Xperia phone being used as a remote control with a Samsung TV to select inputs, play video, and launch Internet applications. It’s easy to do this stuff when you can multiplex video and serial data through the same connector, which in MHL version 3.0 can even play back Ultra HD video at 30 fps with 8-bit color.

Note the emphasis on the “mobile” part. In the world of transition-minimized differential signaling (TMDS), MHL is one of a few players, the others being HDMI (the dominant digital display interface), its predecessor DVI (still going strong although the standard isn’t updated anymore), and Micro and Mini HDMI, niche connectors on smartphones and cameras.

The advent of Ultra HD, 4K, and even higher display resolutions – like those of the new “5K” widescreen workstation monitors – has created a problem: Our display interfaces need to get faster. A LOT faster!

But HDMI 2.0, announced in September 2013, isn’t fast enough. I get into frequent debates with people about why it isn’t, so let me clarify my position: HDMI 2.0 has a maximum overall clock (data) rate of 18 gigabits per second (18 Gb/s). Only 80% of that can be used to carry display signals; the rest is encoding overhead from 8b/10b mapping.

So that limits HDMI 2.0 to supporting 3840×2160 pixels (4400×2250 pixels with blanking) in an 8-bit RGB signal format @ 60 Hz refresh. That’s the hard and fast speed limit, and for anyone using a computer workstation or media player with RGB output, it’s a serious obstacle: How will people who buy the new HP/Dell 27-inch workstation monitors connect them? Their working resolution is 5120×2880 pixels, and at 60 Hz, that’s just too fast for HDMI 2.0.
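
Here’s that arithmetic worked out as a back-of-the-envelope sketch, using the figures above:

```python
# Sanity check on the HDMI 2.0 speed limit described above.
def required_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
    """Raw video data rate in Gb/s for a given total raster and refresh."""
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

usable = 18.0 * 0.8  # 8b/10b mapping leaves 14.4 Gb/s for actual video

# 3840x2160 @ 60 Hz, 8-bit RGB, with blanking (4400x2250 total raster)
uhd_60 = required_gbps(4400, 2250, 60, 24)     # ~14.26 Gb/s -- just fits
# 5120x2880 @ 60 Hz, 8-bit RGB, ignoring blanking entirely
five_k_60 = required_gbps(5120, 2880, 60, 24)  # ~21.2 Gb/s -- no chance

print(f"HDMI 2.0 usable payload: {usable:.1f} Gb/s")
print(f"UHD/60, 8-bit RGB:       {uhd_60:.2f} Gb/s")
print(f"5K/60, 8-bit RGB:        {five_k_60:.2f} Gb/s")
```

UHD at 60 Hz squeaks under the 14.4 Gb/s payload ceiling with almost nothing to spare; the 5K panel blows past it before blanking is even counted.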

It looked like DisplayPort 1.2 would finally ascend to the top of the podium, since its existing speed of 21.6 Gb/s (17.28 Gb/s usable) was already faster than HDMI 2.0. And now, DisplayPort 1.3 has been announced, with a top speed of 32 Gb/s (about 26 Gb/s usable) and the adoption of Display Stream Compression. Indeed, more computer manufacturers are providing DP connections on laptops: Lenovo seems to have moved completely to this format, and Apple has been supporting DP for some time now.

8K is here! (Okay, maybe that’s a few years away…)

With all of that in mind, I will admit I was completely blind-sided by superMHL at this year’s International CES. Instead of a 5-pin Micro USB connector, superMHL offers a 32-pin, full-size connector that’s symmetrical (the next big thing in connectivity, a la USB Type-C). It also supports Display Stream Compression, and it’s compatible with USB Type-C, although not with all six lanes. Its maximum data rate is 36 Gb/s across six lanes of data. (According to the MHL Consortium, that’s fast enough to transport an 8K (7680×4320) image with 120 Hz refresh and 4:2:0 color.)
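
A quick sanity check on that 8K claim – my arithmetic, assuming 8-bit 4:2:0 color at an average of 12 bits per pixel and ignoring blanking – shows why Display Stream Compression is part of the package:

```python
# 4:2:0 subsampling averages 12 bits per pixel at 8-bit depth
# (8 bits of luma plus 4 bits of shared chroma). Blanking is ignored.
def raw_gbps(w, h, hz, bpp=12):
    return w * h * hz * bpp / 1e9

print(f"8K/60 4:2:0 raw:  {raw_gbps(7680, 4320, 60):.1f} Gb/s")   # ~23.9
print(f"8K/120 4:2:0 raw: {raw_gbps(7680, 4320, 120):.1f} Gb/s")  # ~47.8
# 8K/60 fits comfortably inside the 36 Gb/s link; 8K/120 overshoots it,
# which is where DSC (roughly 2:1 or better) has to take up the slack.
```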

The MHL Consortium’s announcement yesterday featured Silicon Image’s new Sil9779 port processor, which can also handle HDMI 2.0 signals. Here are the key specs from the superMHL press release:

  • 8K 60fps video resolution, as outlined in the superMHL specification
  • New, reversible 32-pin superMHL connector
  • USB Type-C with MHL Alt Mode
  • High Dynamic Range (HDR), Deep Color, BT.2020
  • Object audio – Dolby Atmos®, DTS:X, 3D audio, audio-only mode
  • High bit-rate audio extraction
  • HDCP 2.2 premium content protection

Here’s the 32-pin superMHL reversible connector.

Whew! That’s quite a jump up from MHL. Some might say that superMHL is just MHL on steroids, but no matter how you look at it, superMHL is now a serious contender for the next generation of display connectivity. In the press briefing, a representative of the MHL Consortium waxed on about the approach of 8K broadcasting (it’s already been operating for two years in Japan) and how we would see a migration to 8K TV and displays in the near future.

As Larry David says, “Curb your enthusiasm!” Supporting 8K would be nice, but we’ve barely started the transition to UHDTV. And right now, selling 8K TV to the average consumer is like trying to peddle a Ferrari to someone who lives on a dirt road.

Where superMHL will find its niche is in supporting the higher bit rates that high dynamic range (HDR), wide color gamuts (BT.2020), and higher frame rates (60/96/100/120 Hz) require. All will shortly become important parts of the next-generation (UHD) television system. DisplayPort is already there with version 1.3, and you’ll even find DP 1.2 connections on selected models of Ultra HDTVs so that gamers can connect laptops and desktops at Ultra HD resolutions with 60 Hz refresh.

Now, the elephant in the room: How does the emergence of superMHL affect HDMI? Even though version 2.0 is over a year and a half old, you don’t see many HDMI 2.0 jacks on Ultra HDTVs. Casual inspections at Best Buy, HH Gregg, and other outlets show that the typical HDMI 2.0 port count is usually one (1), even as we approach April of 2015.

In the superMHL presentation, the concept of a TV with multiple HDMI 2.0 inputs and one superMHL input was outlined. This would, in effect, be the next step up from where we are now, with the typical Ultra HDTV having one HDMI 2.0 input and three HDMI 1.4 inputs.

But if Silicon Image’s new Sil9779 port processor can handle both formats, why bother with HDMI 2.0 in the first place, especially with its speed limitations? Wouldn’t it make more sense to future-proof all inputs and go with superMHL across the board? (Of course, the cost of adopting superMHL could weigh heavily on that decision.)

In the commercial AV and broadcast worlds, it would definitely make sense to jump to superMHL in the interests of future-proofing installations. Given the limited rollout of HDMI 2.0 to date, maybe supporting both HDMI 1.4 for legacy devices and superMHL is a smarter approach. (Note that superMHL and HDMI 2.0 both support HDCP 2.2, which is the next level in content encryption and is NOT compatible with older versions of HDCP.)

Summing up, the race for faster interface speed just got a lot more interesting with the addition of superMHL to the lineup. I can imagine that manufacturers of AV matrix switchers and distribution amplifiers are feeling another migraine headache coming on…

EDITOR’S NOTE: Last week, it was announced that Silicon Image has been acquired by Lattice Semiconductor of Hillsboro, Oregon, “a leading provider of programmable connectivity solutions” according to the press release. The acquisition price was about $600M and now leaves Lattice in control of the HDMI, MHL and superMHL, and SiBEAM (WiHD) patents and IP. More information can be found on the Lattice Web site at http://www.latticesemi.com/.