Posts Tagged ‘Ultra HD’
InfoComm 2016 In The Rearview Mirror
- Published on Friday, 17 June 2016 12:48
- Pete Putman
Another InfoComm show has come and gone. This is my 23rd InfoComm and it’s hard to imagine when I first set foot in Anaheim way back in 1994 – ostensibly to cover the now-defunct Projection Shoot-Out – that I’d still be making the treks to Orlando and Las Vegas, let alone teaching classes and joining the InfoComm faculty.
For this recap, I’ll focus on trends I saw at the show that will continue to impact our industry for some time to come. And there were plenty of them, everywhere you looked.
First off: I’ve been saying for several years now that software is becoming more important than hardware in our industry (and across all market segments – look at how inexpensive Ultra HDTVs have become already), and that we’d start to see less of a focus on expensive hardware and more of an emphasis on software and managed services.
And that’s exactly what I spotted in Las Vegas. Astute observers noticed that the once humongous booths set up by the likes of Sony, Panasonic, Crestron, LG, Samsung, Hitachi, and other companies have gotten a bit smaller. (NEC, Da-Lite, and Christie were exceptions to the rule.)
AMX, when it was a stand-alone company, used to have an enormous booth at the show (not to mention a huge party every year). Now, AMX is limited to a few small stands within the Harman booth. Walk the show floor these days and you’ll recognize other once-mighty brands that have been acquired by holding companies and now occupy much smaller footprints.
And this trend shouldn’t be any surprise. When hardware used to sell for four and five figures (and in some cases, six figures), you could justify those million-dollar booths that looked like mini-malls. (Remember the huge tented Sanyo projector booths?) But that’s not the case anymore.
Practically speaking, how much real estate do you need to talk about software programs and managed services? The same thing is happening at NAB, where once humongous companies like Harris (now Imagine) are largely touting services and not hardware.
Even Digital Projection has scaled back its enormous multi-tier InfoComm booth. And projectiondesign has shed some square footage since being acquired by Barco, which has itself gone on a square footage diet. Ditto Sharp, which had one of the smallest booths ever at this show, perhaps related to the company’s ongoing financial challenges.
Surprisingly, Toshiba showed there is indeed a second act by showing up with a nice-size booth full of LCD monitors for tiled display walls. That’s not exactly an easy market to compete in, what with LG, Samsung, and NEC having a big footprint. But they’re giving it a shot.
Another trend that’s really picking up speed is the move away from projection lamps to solid-state illumination systems, most often lasers with color phosphor wheels. The availability of large, inexpensive LCD displays has cut deeply into sales of projectors – particularly in small classrooms and meeting rooms, where we used to put in “hang and bang” projection systems.
If you talk to people who’ve made the switch away from projection to direct-view, the reason they most frequently cite is that they don’t have to change out lamps anymore, and the LCD displays can be used under normal room lighting and turn on instantly.
Well, projector manufacturers have gotten the message and are moving en masse to solid state light sources. Early adopters like Casio have reaped the benefits, but now everyone from Sony and Panasonic to Vivitek and Optoma is on board.
Even so, the corner wasn’t really turned until this year when Epson – one of the big manufacturers of projection lamps – showed a 25,000-lumen 3LCD projector powered by a laser light engine. And I saw more than one UHD-resolution projector using the laser-phosphor combination, even in ultra-short throw configurations.
How much longer will we be changing out lamps? I don’t think it will be more than a few years before the majority of projectors offered for sale will use laser or LED light engines (or both). There will be exceptions for certain models, but for all intents and purposes, short-arc lamps are toast.
Here’s another trend – LED walls. I tried to count all of the LED wall exhibitors at InfoComm and lost track after wandering through the North Hall. And just about every single one was based in China, with names you would not recognize. Were they looking for U.S. dealer/distributor partners? It’s not likely many would pick up customers here, and that may be why Leyard (another Chinese manufacturer) bought Planar last year – everyone knows who Planar is.
I also saw LED walls with pitches as small as .9mm. That rivals the pixel pitch of a 50-inch 1366×768 plasma monitor! And if anyone continues to go big with their booths, it’s the LED wall manufacturers. (Not like they have any choice!) Leyard’s 100’+ 8K LED wall was a perfect example of why bigger is still better when it comes to a booth.
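Pixel pitch, incidentally, follows directly from a panel’s diagonal and resolution. Here’s a quick sketch (my own back-of-the-envelope math, assuming square pixels):

```python
import math

def pixel_pitch_mm(diagonal_inches, h_pixels, v_pixels):
    """Pixel pitch in mm for a flat panel, assuming square pixels.

    The diagonal measured in pixels is sqrt(h^2 + v^2), so the pitch
    is just the physical diagonal divided by the pixel diagonal.
    """
    return diagonal_inches * 25.4 / math.hypot(h_pixels, v_pixels)

# A 50-inch 1366x768 plasma panel works out to roughly 0.8mm --
# the same ballpark as today's finest LED wall pitches.
print(round(pixel_pitch_mm(50, 1366, 768), 2))   # 0.81
# An 84-inch Ultra HD panel, for comparison:
print(round(pixel_pitch_mm(84, 3840, 2160), 2))  # 0.48
```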
And Sony’s Cledis 8Kx2K LED wall shows just how much farther we’ve come with this technology, creating what appeared to be a perfectly seamless, pixel-free panoramic LED wall that dazzled with bright, super-saturated color images.
The Chinese dominance in LED displays shouldn’t be surprising. They’re moving to a similar level in the manufacturing of LCD panels, monitors, and televisions, undermining the Korean manufacturers (who undermined the Japanese, who took our U.S.-based television business away in the 1980s).
In fact, so much of our hardware is fabricated, soldered, and assembled in China and Southeast Asia these days that it should be no surprise prices have dropped as much as they have. Back in the day, a quality line doubler (remember those?) would set you back as much as $5,000 to $8,000. Today, you can buy a compact scaler that works to 1080p and Wide UXGA for a few hundred bucks.
My last trend has to do with the slow migration of video and audio signal distribution and switching away from hardware-intensive platforms based on display interface standards to software-based platforms that use IT switches, encoders, and decoders. Wow, did I spot a lot of those products at the show, even from some previously-vigorous defenders of HDMI-based architectures.
The interest in learning how to move to an “open” IP-type AV distribution architecture must be considerable: I taught a class on AV-over-IP this year at InfoComm and was astounded to see that 185 people had signed up to attend. And there were very few no-shows, as I found out when I had attendees sitting on the floor and standing along the back wall for almost the entire 90-minute class.
What’s more, a substantial portion of those attendees came from the higher education market segment, and an informal poll revealed that most of them were still upgrading from older analog systems to all-digital infrastructure. In essence, they were telling me that they preferred to skip past HDMI-based solutions and move directly to an IP-type solution.
Hand-in-hand with this discovery came more responses about transitioning to app-based AV control systems and away from proprietary, code-based control that requires specialized programming. Well, there were a few companies showing app-based AV control products in Vegas that had super-simple GUIs; software that just about anyone could learn to use in a few hours.
Throw in the accelerating transition to UHD resolution displays (they’ll largely replace Full HD within a year), and you have some very interesting times in store for the AV industry as this decade winds on…
Display Interfacing: Welcome to Babylon
- Published on Thursday, 10 September 2015 13:53
- Pete Putman
For many years, ‘interfacing’ a video signal meant plugging in a yellow RCA or silver BNC connector that carried composite video. As picture resolution went up, computers became commonplace at work and home, and the term ‘progressive scan’ entered the lexicon, we saw the birth of S-video and then component video (YPbPr and RGB).
So we adapted, building switching and distribution gear that could handle one-, two-, and three-wire formats. All was well and good…until ‘digital’ made its grand entrance about 15 years ago.
Now, we have digital versions of component and RGB video, starting with the Digital Video Interface (DVI) and moving to High Definition Multimedia Interface (HDMI), DisplayPort, and the new superMHL interface that (according to the MHL Consortium) will start appearing on televisions as soon as this December.
If I’m a consumer, I mostly don’t care about any of this. As long as I can plug in my set-top box, Blu-ray player, and other gadgets with the right cables I can find at Best Buy, this is just a bunch of alphabet soup.
However, if I’m an integrator (consumer or commercial), then I care VERY much about where all of this is heading. And if I’m paying any attention at all to the growing market for 4K and UHD, then I’m rightfully concerned about the impending problems with interfacing these signals.
*Even though HDMI 2.0 was announced in September of 2013 – TWO FULL YEARS AGO – virtually no manufacturer in the pro AV space supports this interface on their switchers and distribution amplifiers. Instead, the vast majority are still providing version 1.4 while claiming these products are “4K compatible” or “4K ready” because version 1.4 is just fast enough to pass an Ultra HD (3840×2160) signal at 30 Hz with 8-bit RGB color. That’s setting the bar kinda low, isn’t it?
*Some computer manufacturers don’t even support HDMI, like Apple (DisplayPort) and Lenovo (also DisplayPort). So, now you have to carry dongles everywhere you go?
*HDMI 2.0 arrives hand-in-hand with a new version of copy protection (HDCP 2.2) which is much more rigorous than versions 1.3 and 1.4. If a valid HDCP key exchange isn’t made within 20 milliseconds, the connection will shut down. Period.
*HDMI 2.0 isn’t fast enough for what UHD is turning out to be – a real departure from 1080p and Wide UXGA, with a move to 10-bit color to support high dynamic range (HDR), wide color gamuts (WCG), and high frame rates (HFR). DisplayPort 1.2 can barely support these requirements; DP version 1.3 and superMHL are better positioned to handle the job.
*The intellectual property behind HDMI and superMHL is owned by the same company – Lattice Semiconductor – and whereas once there were clear dividing lines between the two interfaces (MHL was designed originally for smartphones and tablets), they are now competing against each other. I’ve even sat in on presentations where it was explained that both could exist on consumer TVs. (And why would that make sense, again, when neither interface has been widely deployed to date, and one is clearly an improvement over the other?)
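The arithmetic behind that first bullet is easy to check. Here’s a rough sketch (my own math, using HDMI 1.4’s 340 MHz maximum TMDS clock and the standard CEA-861 Ultra HD timing of 4400×2250 total pixels with blanking):

```python
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz for a video timing (totals include blanking)."""
    return h_total * v_total * refresh_hz / 1e6

HDMI_1_4_MAX_CLOCK_MHZ = 340  # HDMI 1.4's top TMDS character rate

# Ultra HD (3840x2160 active; 4400x2250 total with blanking)
uhd30 = pixel_clock_mhz(4400, 2250, 30)   # 297 MHz -- just under the wire
uhd60 = pixel_clock_mhz(4400, 2250, 60)   # 594 MHz -- far beyond HDMI 1.4

print(uhd30 <= HDMI_1_4_MAX_CLOCK_MHZ)  # True: hence "4K ready" at 30 Hz
print(uhd60 <= HDMI_1_4_MAX_CLOCK_MHZ)  # False: full 4K/60 needs HDMI 2.0
```

That 297 MHz figure is exactly why “4K ready” gear tops out at 30 Hz with 8-bit color.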
You can imagine what this trend is doing to product designers and manufacturers. Sure, HDMI is a “safe bet” for now, but what if our UHD needs quickly outstrip its maximum clock speed? DP is certainly faster, and there appears to be more support for it from computer manufacturers. But superMHL is faster still. Shouldn’t your interfaces at least have some headroom over your displays?
This reliance on HDMI has led several manufacturers into a potential trap, investing heavily in signal distribution architectures that may quickly run into a “future-proofing” problem. In contrast, outside the commercial AV industry, everyone from cable TV system operators to broadcasters and telecom operators is busy migrating to an IP-based architecture.
Not only does IP-based architecture have the advantage of being a relatively open system, it also solves many of the speed issues as 1-gigabit and 10-gigabit networks are becoming more commonplace. (Heck, Comcast just upgraded my home Internet speeds to 75 Mb/s on downloads, which is more than fast enough for me to stream 4K content from Netflix and Amazon!)
So, why don’t we do the same in the commercial AV industry? It’s not for a lack of products – there are several companies offering AV-over-IP transmitters and receivers, along with encoders and decoders. I’ve also seen impressive demos of “middleware” used to locate, switch, and play out media assets over IP networks. All of these guys were at InfoComm 2015.
The big players in HDMI-based switching and distribution argue against AV-over-IP for in-room and short-run signal distribution, citing latency and compression issues. Well, we now have a new codec (HEVC H.265) to handle that end of things, and it’s possible to stream video at high resolutions with low latency. (How does 1920x1080p/60 at 1 to 2 Mb/s sound to you? Thought so.)
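To put that 1 to 2 Mb/s figure in perspective, compare it with the uncompressed rate. A quick sketch (my own arithmetic, assuming 8-bit 4:2:0 video, which averages 12 bits per pixel):

```python
def raw_rate_mbps(h, v, fps, bits_per_pixel=12):
    """Uncompressed video bit rate in Mb/s (default: 8-bit 4:2:0)."""
    return h * v * fps * bits_per_pixel / 1e6

raw = raw_rate_mbps(1920, 1080, 60)   # ~1493 Mb/s uncompressed
ratio = raw / 2                       # versus a 2 Mb/s HEVC stream
print(round(raw), round(ratio))       # 1493 746 -- roughly 750:1 compression
```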
High latency is often the result of over-compression and heavy forward error correction (FEC). But if video and audio assets are streaming on bandwidth-managed, private IP networks, there isn’t a lot of forward error correction required. Group of Pictures (GOP) sizes can also be shortened to reduce latency. So latency is sort of a “straw man” argument. (And HDMI 2.0 will have plenty of issues with HDCP 2.2, trust me. Talk about latency…)
As for copy protection: video and audio assets streaming over IP connections have their own security protocols. Practically speaking, what could be more secure than video content streaming directly into a UHDTV through an Ethernet connection? And over Wi-Fi, you don’t even have to plug in a cable to make it work. Bandwidth issues? Well, how about 5 GHz 802.11ac channel-bonding routers? I’m getting 70+ Mb/s download speeds from mine with wired connections, and 25 – 30 Mb/s some distance from my 5 GHz wireless link.
Again, looking outside our industry, the two most common signal distribution and switching architectures are based on HD-SDI or IP (or both). Not HDMI, and certainly not HDMI-derived, structured-wire systems like HDBaseT. If the rest of the world wants to multiplex video, audio, metadata, and other low bitrate control signals, they do it over optical fiber. (Did you know that multimode fiber is cheaper than Category 6 wire?)
I’ll wrap things up by saying that the smart move is for commercial AV integrators to move to an AV-over-IP signal distribution system at the core, like everyone else, leaving the HDMI, DisplayPort, superMHL, and “whatever comes next” connections for the far ends, near the displays (if those far-end conversions are even needed at all).
Leave the core as a high-speed, copper bus or optical bus, software-based switcher. If there’s enough bandwidth (and there should be), that system can also carry local TCP/IP traffic, SMTP alerts from connected devices, and control signals to all devices. Not only does this approach free everyone from the “closed world” paradigm of HDMI, it also makes the system infinitely more appealing to end-users and facility administrators, an increasing number of whom come from the IT world.
Consider this ad that was posted recently on a listserv for higher education:
“We are looking for an experienced AV-IT Engineer for the role of Technical Guru. The position will provide planning and support for AV-IT systems used in teaching and learning spaces big and small. The person in this position will focus on design, installation, and troubleshooting of AV-IT systems in a variety of venues, including traditional classrooms, active learning classrooms, large auditoria, computer labs, and even Makerspaces…We are looking for a seasoned professional with a solid background in AV-IT systems. This is a great opportunity for a doer who is excited about not just maintaining but also shaping the future of AV-IT technology as a key element of the teaching mission of one of the world’s top universities.”
I rest my case. It’s time for the commercial AV industry to get in step with the rest of the world and move to AV-over-IP signal distribution.
Wake up. Have you smelled the coffee yet?
4K, Collapsing Prices, and the Declining Importance of Hardware
- Published on Thursday, 10 September 2015 12:24
- Pete Putman
As I write this, the 2015 season of the National Football League is about to get underway, with last year’s Super Bowl champion New England Patriots taking on the Pittsburgh Steelers. If you’re not a football fan, why should you care?
Simple: Football, more than any other sport or event, drives the sale of televisions. And the TV business is in a major funk right now.
According to IHS’ latest survey of the global television market, worldwide shipments of TVs fell an astounding 8 percent year over year during the second quarter of 2015. Even though LCD TVs now account for almost 99% of all TV shipments, “…LCD TV sales have not made up for the lost volume of cathode-ray tube (CRT) and plasma televisions, which have largely left the marketplace.”
The one bright spot? 4K. The IHS report states, “4K TV was a bright spot in the global TV market, with unit shipments growing 197 percent year over year in Q2 2015, to reach 6.2 million units. The growth in 4K TVs is the direct result of increased price erosion and more affordable tiers of 4K models becoming available.”
I’ve written on numerous occasions that we’re on the cusp of an industry switchover from 1080p resolution to Ultra HD (3840×2160) for precisely this reason, plus the fact that it’s becoming increasingly difficult to make any money on the manufacturing and sales of 1080p-resolution LCD panels. That’s part of the reason that Sharp – once the premier brand of LCD televisions – finally threw in the towel and exited the North American television business, selling their Mexican factory and “Sharp” brand to Hisense.
Need proof? Check out the most recent HH Gregg and Best Buy circulars. You can now buy a 48-inch Haier 1080p LCD TV for $298 or a 60-inch LG 1080p smart TV for $898. Want Ultra HD resolution instead? Samsung’s got a curved 55-inch smart model for $1198, and a 60-inch smart set for $1498.
But here’s the kicker: Samsung’s HDR Ultra HDTVs (S-UHD) are almost the same price. A 50-inch model (UN50JS7000) is tagged at $1098 by HH Gregg, while the 55-inch version is $1298. Too expensive? Sharp’s got a 43-inch Ultra HD offering for $598, a 50-inch set for $748, and a 55-inch version for $848. (Not to be left out, LG has cut the price on their 55-inch smart Ultra HDTV to $998, and they’ve also got a 49-inch UHD set for $798.)
Now, step back from that mass of numbers, and think about this: Those are insanely low prices for Ultra HDTVs, which were tagged around $15 – $20K when they first came to these shores in 2012. I know of several friends and acquaintances that had to replace older TVs recently, and every one of them bought an Ultra HD set because of these falling prices.
If overall sales of TVs are falling but 4K TV sales are increasing, it doesn’t take a weatherman to see which way the wind is blowing: 4K and Ultra HD are rapidly taking over the TV marketplace for sets larger than 42 inches. This is happening so quickly that by the end of next year, ALL TVs larger than 50 inches will be Ultra HD models.
There’s a bigger message here. The money isn’t in hardware anymore – it’s moving to software. I find it hard to believe that I would spend more in a year for cable TV and Internet service than the cost of an Ultra HDTV, but that’s exactly what’s happening. Content is king, and who cares about the hardware?
So, why are TV sales in decline? It could be for a very simple reason: the average household already has a large enough TV with enough bells and whistles that it sees no reason to upgrade. If you already own a 55-inch or 60-inch 1080p set with “smart” functions (and the all-important Netflix streaming), then the speed of your Internet connection is much more important than adding another 5 inches in screen size or quadrupling your screen resolution.
There’s a corollary in the world of tablets, where sales and shipments are also slowing down much faster than analysts predicted. There are any number of reasons why, but the two most likely culprits are the shift in preferences for larger smartphone screens (“phablets”) and the fact that people just hang onto tablets longer (at least, until their batteries die), often passing them down to children or off to relatives when a new model is purchased.
This shift to 4K and Ultra HD resolution is also impacting the commercial AV industry, which is heading for some serious interfacing issues. More and more of the large displays that will be installed will have Ultra HD resolution. And that will create a major headache for integrators, as the predominant interface for pro AV is still HDMI 1.4, even though version 2.0 was announced two years ago.
None of this is good news for the projector manufacturers, who are struggling to defend their turf from the large, cheap LCD displays. Unlike panel manufacturers, projector brands are moving slowly to adopt 4K resolution, which isn’t surprising because of the cost involved to tool up and manufacture microdisplays with 4K resolution and the much smaller market for projectors.
As for the naysayers who still think 4K is a fad, I would just advise them to wake up and smell the coffee. The world of consumer electronics absolutely drives the world of commercial AV – what’s happening over there is going to happen here, and that means you as an integrator will be installing more and more displays with UHD resolution, from desktop monitors and TVs to single-panel and tiled wall-mounted displays.
Count on it!
NAB In The Rear View Mirror
- Published on Friday, 24 April 2015 21:28
- Pete Putman
It’s been over a week since I got back from Las Vegas and edited all of my photos and videos. But once again, NAB scored big numbers with attendance and there were enough goodies to be found in all three exhibit halls, if you were willing to put in the time to pound the pavement. Over 100,000 folks made their way to the Las Vegas Convention Center to see endless demos of streaming, drones, 4K cameras and post-production, and H.265 encoders.
We were also treated to a rare haboob, or dust storm, which blew through town late Tuesday afternoon and blotted out the sun, leaving a fine dusting of sand particles on everything (and in everyone’s hair, ears, and eyes). While most of the conferences and presentations tend to be somewhat predictable, the third day of the show featured the notorious John McAfee (yes, THAT John McAfee) as the keynote speaker at the NAB Technology Luncheon. Escorted by a security detail, McAfee walked up on stage and proceeded to warn everyone about the security risks inherent in loading apps onto phones and tablets. (Come to think of it, why does a flashlight app for my phone need permission to access my contact list and my camera?)
Some readers may remember the Streaming Video pavilion in the Central Hall at this show back in 1999. There, dozens of small start-up companies had booths showing how they could push 320×240-resolution video (“dancing postage stamps”) over 10 megabit and 100 megabit Ethernet connections, and not always reliably. (And not surprisingly, most of those companies were gone a year later.)
Today, companies like Harmonic, Elemental, Ericsson, Ateme, and the Fraunhofer Institute routinely demonstrate 4K (3840×2160) video through 1GigE networks at a data rate of 15 Mb/s, using 65-inch and 84-inch 4K video screens to demonstrate the picture quality. 4K file storage and editing “solutions” are everywhere, as are the first crop of reference-quality 4K displays using LCD and OLED technology.
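Those demos leave plenty of headroom, too. Here’s a back-of-the-envelope count of how many such streams fit on a link (my own sketch; the 25% headroom reservation for other traffic is an assumption):

```python
def streams_per_link(link_gbps, stream_mbps, headroom=0.25):
    """How many fixed-rate streams fit on a link, leaving some headroom."""
    usable_mbps = link_gbps * 1000 * (1 - headroom)
    return int(usable_mbps // stream_mbps)

print(streams_per_link(1, 15))    # 4K HEVC at 15 Mb/s over 1GigE: 50 streams
print(streams_per_link(10, 15))   # the same over 10GigE: 500 streams
```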
In some ways, the NAB show resembles InfoComm. Many of the exhibitors at NAB have also set up shop at InfoComm, waiting for the pro AV channel to embrace digital video over IP networks. (It’ll happen, guys. Just be patient.) In the NAB world, video transport over IP using optical fiber backbones is quite the common occurrence, although it’s still a novelty to our world. (Haven’t you heard? Fiber is good for you!)
I spent three and a half days wandering around the aisles in a daze, but managed to find some gems among the crowds. Here were some highlights:
Blackmagic Design drew a crowd to see its Micro Cinema Camera, and it is indeed tiny. The sensor size is Super 16 (mm) and is capable of capturing 13 stops of light. RAW and Apple ProRes recording formats are native, and Blackmagic has also included an expansion port “…featuring PWM and S.Bus inputs for airplane remote control.” (Can you say “drone?”) And all of this for just $995…
RED’s booth showed the prototype of a new 8K (7680×4320) camera body that will capture video at 6K resolution from 1 to 100 frames per second. In 4K (3840×2160) mode, the Dragon can record footage as fast as 150 frames per second. (Both of these are in RAW mode.) Data transfer (writing speeds) was listed at 300 Mb/s, and the camera has built-in wireless connectivity.
Arri showed a 65mm digital camera, resurrecting a format that goes back to the 1950s. The actual resolution of the camera sensor is 5120×2880, or “5K” as Arri calls it. This sensor size is analogous to the old 6 cm x 6 cm medium-format cameras made by Rollei and Yashica, and there is quite a bit of data flowing from this camera when it records! (Can you say “terabytes” of storage?)
Drones dominated the show, with powerhouse DJI setting up in the central hall and an entire section of the rear south hall devoted to a drone “fly-off” competition. Nearby, a pavilion featured nothing but drones, cameras, accessories, and even wireless camera links such as Amimon’s Connex 5 GHz system. (You may recognize this as a variant of the company’s WHDI wireless HDMI product.)
Sony had side-by-side comparisons of standard dynamic range (SDR) and high dynamic range (HDR) footage using their new BVM-X300 30-inch HDR OLED display. This is the 3rd generation of OLED reference monitor products to come out of the Sony labs, and it’s a doozy with 4096×2160 resolution (3G-SDI Quad-link up to 4096 x 2160/48p/50p/60p) and coverage of at least the DCI-P3 color space. The monitor can also reproduce about 80% of the new BT.2020 color gamut. Peak brightness (scene to scene) is about 800 nits, and color reproduction is very accurate with respect to flesh tones and pastels.
Canon also took the wraps off a new reference monitor. The DP-V2410 4K reference display has 4096×2160 pixels of resolution (the DCI 4K standard) and uses an IPS LCD panel that is capable of showing high dynamic range (HDR), usually defined as at least 15 stops of light. It supports the ITU BT.2020 color space, can upscale 2K content to 4K, and will run off 24 volts DC for field use.
Panasonic unveiled their first laser-powered 3-chip DLP projector, and it’s a doozy. Using an ultra-short-throw lens positioned about 12” – 16” in front of the screen, the Panasonic guys lit up a 10-foot diagonal screen with 12,000 lumens at WUXGA (1920×1200) resolution from the PT-RZ12KU. It uses a blue laser to excite a yellow-green phosphor wheel and create white light, which is then separated into red, green, and blue light for imaging. The projector weighs just 95 pounds.
Fine-pitch indoor and outdoor LED displays are a growing market. Both Leyard and Panasonic showed large LED displays with 1.6mm dot pitch, which isn’t much larger than what you would have found on a 768p-resolution plasma display from 15 years ago. The color quality and contrast on these displays was quite impressive and you have to stand pretty close to notice the pixel structure, unlike the more commonly-used 6mm and 10mm pitch for outdoor LED displays. Brightness on these displays is in the thousands of nits (talk about high-dynamic range!).
Speaking of HDR, Dolby had a demonstration in its booth of new UHDTVs from Vizio that incorporate Dolby’s version of high dynamic range. Vizio showed a prototype product a year ago at CES and it now appears close to delivery. The target brightness for peak white will be well over 1000 nits, but the challenge for any LCD panel is being able to show extremely low levels of gray – near black.
Vitec had what may be the world’s first portable HEVC H.265 encoder, the MGW Ace. Unlike most of the H.265 demos at the show, this product does everything in hardware with a dedicated H.265 compression chip (most likely from Broadcom). And it is small, at about ¾ of a rack wide. Inputs include 3G/SDI, composite video (yep, that’s still around), HDMI, and DVI, with support for embedded and serial digital audio. Two Ethernet ports complete the I/O complement.
Over in the NTT booth, a demonstration was being made of “the first H.265 HEVC encoder ever to perform 4K 4:2:2 encoding in real time.” I’m not sure if that was true, but it was a cool demo: NTT (a/k/a Nippon Telegraph and Telephone) researchers developed the NARA processor to reduce power consumption and save space over existing software- and hardware-based encoders. And it comes with extension interfaces to encode video at even higher resolutions.
NHK was back again with their extensive demo area of 8K acquisition, image processing, and broadcasting. (Yes, NHK IS broadcasting 8K in Tokyo, and has been doing so for a few years.) Among the cooler things in their booth was a 13-inch 8K OLED display – which almost seems like an oxymoron – and an impressive demonstration of 8K/60 and 8K/120 shooting and playback. On the 120Hz side of the screen, there was no blur whatsoever of footage taken during a soccer match.
This is just scratching the surface, and I’ll have more information during my annual “Future Trends” presentation at InfoComm in June. For now, I’ll let one of my colleagues sum up the show as being about “wireless 4K drones over IP.” (Okay, that’s a bit of a simplification…)
Look Out, HDMI – Here Comes Super MHL!
- Published on Tuesday, 17 March 2015 12:38
- Pete Putman
Yesterday, the MHL Consortium announced its newest flavor of display interface – Super MHL (or, more accurately, superMHL). MHL, which stands for Mobile High-definition Link, was originally developed to enable display connections over Micro USB ports on phones and tablets. You’ll find it most often on mobile devices and televisions from Samsung and Sony. It will also pop up on LG products and there’s even an MHL port on the Pioneer AV receiver I bought a couple of months ago.
There have been some clever demos of MHL applications at past CES events. One was to build a “dumb” laptop (no CPU or video card) – just keyboard, touchpad, and display – and use MHL to dock a smartphone into it and make everything work. Another demo in the Silicon Image booth featured smartphones being used as video game controllers with the video playing back on the controller screen.
Yet another demo showed a Sony Xperia phone being used as a remote control with a Samsung TV to select inputs, play video, and launch Internet applications. It’s easy to do this stuff when you can multiplex video and serial data through the same connector, which in MHL version 3.0 can even play back Ultra HD video at 30 fps with 8-bit color.
Note the emphasis on the “mobile” part. In the world of transition-minimized differential signaling (TMDS), MHL is one of a few players, the others being HDMI (the dominant digital display interface), its predecessor DVI (still going strong although the standard isn’t updated anymore), and Micro and Mini HDMI, niche connectors on smartphones and cameras.
The advent of Ultra HD, 4K, and higher display resolutions like the “5K” of new widescreen workstation monitors has created a problem: Our display interfaces need to get faster. A LOT faster!
But HDMI 2.0, announced in September 2013, isn’t fast enough. I get into frequent debates with people about why it isn’t, so let me clarify my position: HDMI 2.0 has a maximum overall clock (data) rate of 18 gigabits per second (18 Gb/s). 80% of that can be used to carry display signals; the rest is overhead using 8 bit/10 bit mapping.
So that limits HDMI 2.0 to supporting 3840×2160 pixels (4400×2250 pixels with blanking) in an RGB signal format @ 60 Hz refresh. That’s the hard-and-fast speed limit, and for anyone using a computer workstation or media player with RGB output, it’s a serious obstacle: How will people who buy the new HP and Dell 27-inch workstation monitors connect them? Their working resolution is 5120×2880 pixels, and at 60 Hz, that’s just too fast for HDMI 2.0.
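You can check those numbers yourself. A sketch (my own arithmetic; 8-bit RGB is 24 bits per pixel, the 4K timing uses the 4400×2250 blanking totals quoted above, and the 5K figure counts active pixels only, which actually understates the requirement):

```python
def payload_gbps(h, v, fps, bits_per_pixel=24):
    """Display payload in Gb/s (default: 8-bit RGB, 24 bits/pixel)."""
    return h * v * fps * bits_per_pixel / 1e9

HDMI_2_0_PAYLOAD_GBPS = 18 * 0.8   # 18 Gb/s line rate minus 8b/10b overhead

# 3840x2160 @ 60 Hz (4400x2250 with blanking): 14.256 Gb/s -- it just fits
print(payload_gbps(4400, 2250, 60) <= HDMI_2_0_PAYLOAD_GBPS)   # True

# 5120x2880 @ 60 Hz, active pixels alone: ~21.2 Gb/s -- no chance
print(payload_gbps(5120, 2880, 60) <= HDMI_2_0_PAYLOAD_GBPS)   # False
```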
It looked like DisplayPort 1.2 would finally ascend to the top of the podium, since its existing speed of 21.6 Gb/s (17.28 Gb/s usable) was already faster than HDMI 2.0. And now, DisplayPort 1.3 has been announced, with a top speed of 32.4 Gb/s (about 26 Gb/s usable) and the adoption of Display Stream Compression. Indeed, more computer manufacturers are providing DP connections on laptops: Lenovo seems to have moved completely to this format, and Apple has been supporting DP for some time now.
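To see why DisplayPort wins this race on paper, here’s a small sketch using the link rates above; the 5K blanking total (~5500×3000) is a hypothetical figure for illustration:

```python
# DisplayPort payload rates (8b/10b coding, 80% efficiency) vs. a 5K signal.
DP12_USABLE = 21.6 * 0.8   # 17.28 Gb/s usable on DisplayPort 1.2
DP13_USABLE = 32.4 * 0.8   # 25.92 Gb/s usable on DisplayPort 1.3

# 5120x2880 @ 60 Hz, 8-bit RGB, with an assumed ~5500x3000 total timing
five_k = 5500 * 3000 * 60 * 24 / 1e9   # ~23.76 Gb/s

print(f"DP 1.2 usable: {DP12_USABLE:.2f} Gb/s -> 5K fits: {five_k <= DP12_USABLE}")
print(f"DP 1.3 usable: {DP13_USABLE:.2f} Gb/s -> 5K fits: {five_k <= DP13_USABLE}")
```

Under these assumptions, DP 1.2 still falls short of 5K at 60 Hz, but DP 1.3 clears it uncompressed – no wonder the workstation monitor makers are lining up behind it.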
With all of that in mind, I will admit I was completely blind-sided by superMHL at this year’s International CES. Instead of a 5-pin Micro USB connector, superMHL offers a 32-pin, full-size connector that’s reversible (the next big thing in connectivity, a la USB Type-C). It supports Display Stream Compression, works over USB Type-C (although not with all six lanes there), and has a maximum data rate of 36 Gb/s across six lanes of data. (According to the MHL Consortium, that’s fast enough to transport an 8K (7680×4320) image at a 120 Hz refresh rate with 4:2:0 color.)
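The Consortium’s 8K figure can be sanity-checked with a little arithmetic (active pixels only; 4:2:0 sampling at 8 bits averages 12 bits per pixel). Interestingly, the raw rate comes out above 36 Gb/s, which is presumably where the spec’s Display Stream Compression support earns its keep:

```python
# Sanity check on the 8K @ 120 Hz, 4:2:0 claim for superMHL.
SUPERMHL_GBPS = 36.0   # six lanes, per the announcement

# 7680x4320 active pixels, 120 Hz, 4:2:0 8-bit (12 bits/pixel average)
raw_8k_120 = 7680 * 4320 * 120 * 12 / 1e9   # ~47.8 Gb/s, active pixels only

# The raw rate exceeds the 36 Gb/s link; a modest ~1.3:1 compression
# ratio (well within Display Stream Compression's range) closes the gap.
print(f"8K @ 120 Hz 4:2:0 raw: {raw_8k_120:.1f} Gb/s vs. link {SUPERMHL_GBPS} Gb/s")
print(f"Compression needed: ~{raw_8k_120 / SUPERMHL_GBPS:.2f}:1")
```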
The MHL Consortium’s announcement yesterday featured Silicon Image’s new Sil9779 port processor, which can also handle HDMI 2.0 signals. Here are the key specs from the superMHL press release:
- 8K 60fps video resolution, as outlined in the superMHL specification
- New, reversible 32-pin superMHL connector
- USB Type-C with MHL Alt Mode
- High Dynamic Range (HDR), Deep Color, BT.2020
- Object audio – Dolby Atmos®, DTS:X, 3D audio, audio-only mode
- High bit-rate audio extraction
- HDCP 2.2 premium content protection
Whew! That’s quite a jump up from MHL. Some might say that superMHL is MHL on steroids, but no matter how you look at it, superMHL is now a serious contender for the next generation of display connectivity. In the press briefing, a representative of the MHL Consortium waxed enthusiastic about the approach of 8K broadcasting (trials have already been operating for two years in Japan) and how we would see a migration to 8K TV and displays in the near future.
As Larry David says, “Curb your enthusiasm!” Supporting 8K would be nice, but we’ve barely started the transition to UHDTV. And right now, selling 8K TV to the average consumer is like trying to peddle a Ferrari to someone who lives on a dirt road.
Where superMHL will find its niche is in supporting the higher bit rates that high dynamic range (HDR), wide color gamuts (BT.2020), and higher frame rates (60/96/100/120 Hz) require. All will shortly become important parts of the next-generation (UHD) television system. DisplayPort is already there with version 1.3, and you’ll even find DP 1.2 connections on selected models of Ultra HDTVs so that gamers can connect laptops and desktops at Ultra HD resolutions with 60 Hz refresh.
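Those higher bit rates are easy to quantify. Here’s a sketch of raw payload rates (active pixels only – real timings with blanking need more), comparing 8-bit and 10-bit RGB at Ultra HD resolution:

```python
# Raw payload rates for Ultra HD signals, active pixels only (no blanking).
def gbps(w, h, hz, bpp):
    """Raw video rate in Gb/s for active pixels at a given bit depth."""
    return w * h * hz * bpp / 1e9

uhd_60_sdr  = gbps(3840, 2160, 60, 24)   # 8-bit RGB          ~11.9 Gb/s
uhd_60_hdr  = gbps(3840, 2160, 60, 30)   # 10-bit RGB (HDR)   ~14.9 Gb/s
uhd_120_hdr = gbps(3840, 2160, 120, 30)  # 10-bit RGB, 120 Hz ~29.9 Gb/s

for label, rate in [("UHD 60 Hz 8-bit ", uhd_60_sdr),
                    ("UHD 60 Hz 10-bit", uhd_60_hdr),
                    ("UHD 120 Hz 10-bit", uhd_120_hdr)]:
    print(f"{label}: {rate:.1f} Gb/s")
```

Note that simply stepping from 8-bit to 10-bit RGB pushes Ultra HD past HDMI 2.0’s 14.4 Gb/s payload before blanking is even counted, and higher frame rates double down from there.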
Now, the elephant in the room: How does the emergence of superMHL affect HDMI? Even though version 2.0 is over a year and a half old, you don’t see many HDMI 2.0 jacks on Ultra HDTVs. Casual inspections at Best Buy, HH Gregg, and other outlets show that the typical HDMI 2.0 port count is usually one (1), even as we approach April of 2015.
In the superMHL presentation, the concept of a TV with multiple HDMI 2.0 inputs and one superMHL input was outlined. This would, in effect, be the next step up from where we are now, with the typical Ultra HDTV having one HDMI 2.0 input and three HDMI 1.4 inputs.
But if Silicon Image’s new Sil9779 port processor can handle both formats, why bother with HDMI 2.0 in the first place, especially with its speed limitations? Wouldn’t it make more sense to future-proof all inputs and go with superMHL across the board? (Of course, the cost of adopting superMHL could weigh heavy on that decision.)
In the commercial AV and broadcast worlds, it would definitely make sense to jump to superMHL in the interests of future-proofing installations. Given the limited rollout of HDMI 2.0 to date, maybe supporting both HDMI 1.4 for legacy devices and superMHL is a smarter approach. (Note that superMHL and HDMI 2.0 both support HDCP 2.2, which is the next level in content encryption and NOT compatible with older versions of HDCP.)
Summing up: The race for faster interface speeds just got a lot more interesting with the addition of superMHL to the lineup. I can imagine that manufacturers of AV matrix switchers and distribution amplifiers are feeling another migraine headache coming on…
EDITOR’S NOTE: Last week, it was announced that Silicon Image has been acquired by Lattice Semiconductor of Hillsboro, Oregon, “a leading provider of programmable connectivity solutions,” according to the press release. The acquisition price was about $600M, which now leaves Lattice in control of the HDMI, MHL and superMHL, and SiBEAM (WiHD) patents and IP. More information can be found on the Lattice Web site at http://www.latticesemi.com/.