Posts Tagged ‘HDMI 2.0’
2016 – A Turning Point For Television
- Published on Monday, 07 December 2015 18:27
- Pete Putman
- 0 Comments
In a few short weeks, I (and hundreds of my colleagues in the press) will dutifully board planes for Las Vegas to once again spend a week walking the show floor at International CES. We’ll listen to PR pitches, grab fast-food meals on the fly, show up late for appointments, have numerous ad hoc discussions in hallways and cabs, and try to make sense of all the new technologies unveiled in the Las Vegas Convention Center and nearby hotels.
As usual, many of us will want to focus on televisions – or more specifically, what televisions are becoming. TVs have always been an important product category at CES, and that was particularly true with the introduction of digital, high definition TV in the late 1990s, followed by plasma and then LCD display technologies in the early to mid-2000s.
Today, the bloom is largely off the rose. TVs have become commodities, thanks to aggressive pricing and distribution by Korean manufacturers that have largely driven the Japanese brands out of the business. And we’re seeing that cycle repeat itself as China becomes the nexus for TV manufacturing and prices for 1080p sets continue in free fall.
But something new is here – Ultra HD (a/k/a 4K). And the transition is happening at a breathtaking pace: The first 4K / UHD sets appeared on these shores in 2012 with astronomically high price tags. Four years later, you can buy a 55-inch Ultra HDTV with “smart” wireless functions for less than $800, a price point that has forced same-size 1080p sets below $500.
And it’s not just more pixels. High dynamic range (HDR) is coming to market, as are new illumination technologies that will provide much larger color gamuts. LCD and OLED panel manufacturers can now drive panels at 10 bits per color, breaking past the now-inadequate 8-bit standard that has held back displays of all kinds for over a decade.
Screen sizes are getting larger, too. Ten years ago, a 42-inch TV was considered “big” and anything larger was a home theater installation. Today? Consumers are routinely buying 50-inch, 55-inch, and even 60-inch sets as prices have fallen. That same 42-inch set is often consigned to a bedroom or kid’s room, or maybe a summer home.
Back in September of 2008, I bought a Panasonic 42-inch 1080p plasma TV for about $1,100. It had two HDMI 1.3 connections, three analog composite/component video inputs, and no network connectivity of any kind. But wow, did it make great pictures!
Seven years later, that TV sits in my basement, unused. It was replaced by a price-comparable, more energy-efficient 46-inch LCD model after Hurricane Sandy killed our power for several days and I did a whole-house energy audit. (And no, the LCD picture quality doesn’t compare to the plasma.)
But that’s not all that changed. I picked up four HDMI 1.4 inputs along the way (yep, it was set up for 3D), plus built-in Wi-Fi and “smart” functions. And I added a sound bar to make up for the awful quality of the built-in speakers. Plus, I added a Blu-ray player to round out the package, although it hardly sees any discs these days – it’s mostly used for streaming.
So – let’s say I’d like to replace that TV in 2016, just five years later. What would my options be?
To start with, I’d be able to buy a lot more screen. Right now, I could pick up a Samsung or LG 65-inch smart 1080p set for what I spent in 2011. Or, I could bite the bullet and make the move to Ultra HD with a 55-inch or 60-inch screen, complete with four HDMI inputs (one or two would be version 2.0, with HDCP 2.2 support), Wi-Fi, Netflix streaming (very important these days), and possibly a quantum dot backlight for HDR and WCG support.
My new set should support the HEVC H.265 codec, of course. That will make it possible to stream UHD content into my TV at 12 – 18 Mb/s from Netflix, Amazon Prime, Vimeo, Vudu, and any other company that jumps on the 4K content bandwagon. I could even go out and buy a brand-new Ultra HD Blu-ray player to complement it. But it’s more likely I’d opt to stream UHD content over my new, fast 30 Mb/s Internet connection from Comcast.
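To appreciate what HEVC is doing at those rates, it helps to sketch the compression ratio a UHD stream implies. The figures below are illustrative assumptions (8-bit RGB, active pixels only, the midpoint of the 12 – 18 Mb/s range), not numbers from any spec or service:

```python
# Back-of-the-envelope: the compression ratio a 15 Mb/s UHD stream implies.
# All figures are illustrative assumptions, not from any spec or service.

width, height, fps, bits_per_pixel = 3840, 2160, 60, 24   # 8-bit RGB, active pixels only

raw_bps = width * height * fps * bits_per_pixel           # uncompressed video rate
stream_bps = 15e6                                         # mid-range of the 12 - 18 Mb/s figure

ratio = raw_bps / stream_bps
print(f"Raw UHD video: {raw_bps / 1e9:.1f} Gb/s")
print(f"Compression ratio at 15 Mb/s: ~{ratio:.0f}:1")
```

That works out to squeezing roughly 12 Gb/s of raw video through a pipe nearly 800 times smaller – which is why the codec matters as much as the connection.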
Now, it might pay to wait until later in 2016, when I could be sure of purchasing an Ultra HDTV that would support one or more of the proposed HDR delivery standards for disc-based and streaming UHD movies. And maybe I’d have more “fast” inputs, like DisplayPort 1.2 or even 1.3 to go along with HDMI 2.0 (and quite possibly, superMHL).
And I might even swing back over to an emissive display, to replace the picture quality I got from my old plasma set. That would mean purchasing an OLED Ultra HDTV, which would also support HDR and WCG, plus all of the usual bells and whistles (Wi-Fi, multiple HDMI/DP inputs, streaming, apps).
My point? We’re going to see some amazing technology in the next generation of televisions at CES. And consumers are apparently warming up to Ultra HD – while sales of 1080p sets continue to decline, Ultra HD sales are climbing by double-digit percentages. I expect that growth to accelerate as we near the Super Bowl, even though the game won’t be broadcast in 4K (yet!).
If you are thinking about upgrading your main TV, 2016 could give you plenty of reasons to do it. My advice? Wait until all the puzzle pieces are in place for delivery of HDR and WCG to your home, and look into upgrading your Internet connections – streaming 4K will be here faster than you realize. And if you can live with your 1080p set until the fall of 2016, you’ll be amazed and likely very pleased at the upgrade…
Display Interfacing: Welcome to Babylon
- Published on Thursday, 10 September 2015 13:53
- Pete Putman
- 0 Comments
For many years, ‘interfacing’ a video signal meant plugging in a yellow RCA or silver BNC connector that carried composite video. As picture resolution went up, computers became commonplace at work and home, and the term ‘progressive scan’ entered the lexicon, we saw the birth of S-video and then component video (YPbPr and RGB).
So we adapted, building switching and distribution gear that could handle one-, two-, and three-wire formats. All was well and good…until ‘digital’ made its grand entrance about 15 years ago.
Now, we have digital versions of component and RGB video, starting with the Digital Video Interface (DVI) and moving to High Definition Multimedia Interface (HDMI), DisplayPort, and the new superMHL interface that (according to the MHL Consortium) will start appearing on televisions as soon as December.
If I’m a consumer, I mostly don’t care about any of this. As long as I can plug in my set-top box, Blu-ray player, and other gadgets with the right cables I can find at Best Buy, this is just a bunch of alphabet soup.
However, if I’m an integrator (consumer or commercial), then I care VERY much about where all of this is heading. And if I’m paying any attention at all to the growing market for 4K and UHD, then I’m rightfully concerned about the impending problems with interfacing these signals.
- Even though HDMI 2.0 was announced in September of 2013 – TWO FULL YEARS AGO – virtually no manufacturer in the pro AV space supports this interface on their switchers and distribution amplifiers. Instead, the vast majority still provide version 1.4 while claiming these products are “4K compatible” or “4K ready,” because version 1.4 is just fast enough to pass an Ultra HD (3840×2160) signal at 30 Hz with 8-bit RGB color. That’s setting the bar kinda low, isn’t it?
- Some computer manufacturers don’t even support HDMI, like Apple (DisplayPort) and Lenovo (also DisplayPort). So, now you have to carry dongles everywhere you go?
- HDMI 2.0 arrives hand-in-hand with a new version of copy protection (HDCP 2.2), which is much more rigorous than versions 1.3 and 1.4. If a valid HDCP key exchange isn’t made within 20 milliseconds, the connection will shut down. Period.
- HDMI 2.0 isn’t fast enough for what UHD is turning out to be – a real departure from 1080p and Wide UXGA, with a move to 10-bit color to support high dynamic range (HDR), wide color gamuts (WCG), and high frame rates (HFR). DisplayPort 1.2 can barely support these requirements; DP version 1.3 and superMHL are better positioned to handle the job.
- The intellectual property behind HDMI and superMHL is owned by the same company – Lattice Semiconductor – and whereas once there were clear dividing lines between the two interfaces (MHL was designed originally for smartphones and tablets), they are now competing against each other. I’ve even sat in on presentations where it was explained that both could exist on consumer TVs. (And why would that make sense, again, when neither interface has been widely deployed to date, and one is clearly an improvement over the other?)
You can imagine what this trend is doing to product designers and manufacturers. Sure, HDMI is a “safe bet” for now, but what if our UHD needs quickly outstrip its maximum clock speed? DP is certainly faster, and there appears to be more support for it from computer manufacturers. But superMHL is faster still. Shouldn’t your interfaces at least have a head start on the displays they will feed?
This reliance on HDMI has led several manufacturers into a potential trap, investing heavily in signal distribution architectures that may quickly run into a “future-proofing” problem. In contrast, outside the commercial AV industry, everyone from cable TV system operators to broadcasters and telecom operators is busy migrating to an IP-based architecture.
Not only does IP-based architecture have the advantage of being a relatively open system, it also solves many of the speed issues as 1-gigabit and 10-gigabit networks are becoming more commonplace. (Heck, Comcast just upgraded my home Internet speeds to 75 Mb/s on downloads, which is more than fast enough for me to stream 4K content from Netflix and Amazon!)
So, why don’t we do the same in the commercial AV industry? It’s not for a lack of products – there are several companies offering AV-over-IP transmitters and receivers, along with encoders and decoders. I’ve also seen impressive demos of “middleware” used to locate, switch, and play out media assets over IP networks. All of these guys were at InfoComm 2015.
The big players in HDMI-based switching and distribution argue against AV-over-IP for in-room and short-run signal distribution, citing latency and compression issues. Well, we now have a new codec (HEVC H.265) to handle that end of things, and it’s possible to stream video at high resolutions with low latency. (How does 1920x1080p/60 at 1 to 2 Mb/s sound to you? Thought so.)
High latency is often the result of over-compression and heavy forward error correction (FEC). But if video and audio assets are streaming on bandwidth-managed, private IP networks, there isn’t a lot of forward error correction required, and GOP structures can be tuned – shorter GOPs, or gradual intra refresh – to cut latency further. So latency is sort of a “straw man” argument. (And HDMI 2.0 will have plenty of issues with HDCP 2.2, trust me. Talk about latency…)
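As a rough sketch of why managed networks change the latency picture, consider the two big contributors named above: the decoder’s buffer fill time and FEC interleaving delay. All numbers here are illustrative assumptions, not from any codec or spec:

```python
# Sketch: two big contributors to streaming latency - the decoder's buffer
# fill time and FEC interleaving delay. All numbers are illustrative only.

def stream_latency_ms(buffer_bits: float, bitrate_bps: float, fec_ms: float) -> float:
    """Playback can't start until the buffer fills; FEC interleaving adds delay."""
    return buffer_bits / bitrate_bps * 1000.0 + fec_ms

bitrate = 2e6   # ~2 Mb/s for HEVC 1080p/60, per the figure above

# Best-effort internet: a deep buffer (one second of video) plus heavy FEC
internet = stream_latency_ms(2e6, bitrate, fec_ms=200)
# Bandwidth-managed private LAN: a shallow 50 ms buffer and no FEC
lan = stream_latency_ms(1e5, bitrate, fec_ms=0)

print(f"Best-effort internet: {internet:.0f} ms")
print(f"Managed private LAN:  {lan:.0f} ms")
```

The point of the sketch: on a private, bandwidth-managed network you can strip out the FEC and shrink the buffer, and the latency objection mostly evaporates.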
As for copy protection: video and audio assets streaming over IP connections have their own security protocols. Practically speaking, what could be more secure than video content streaming directly into a UHDTV over an Ethernet connection? Go wireless, and you don’t even have to plug in a cable to make it work. Bandwidth issues? Well, how about 5 GHz 802.11ac channel-bonding routers? I’m getting 70+ Mb/s download speeds from mine with wired connections, and 25 – 30 Mb/s at some distance from my 5 GHz wireless link.
Again, looking outside our industry, the two most common signal distribution and switching architectures are based on HD-SDI or IP (or both). Not HDMI, and certainly not HDMI-derived, structured-wire systems like HDBaseT. If the rest of the world wants to multiplex video, audio, metadata, and other low bitrate control signals, they do it over optical fiber. (Did you know that multimode fiber is cheaper than Category 6 wire?)
I’ll wrap things up by saying that the smart move is for commercial AV integrators to move to an AV-over-IP signal distribution system at the core, like everyone else, leaving the HDMI, DisplayPort, superMHL, and “whatever comes next” connections for the far ends, near the displays (if those far-end conversions are even needed at all).
Leave the core as a high-speed, software-based switcher running on a copper or optical bus. If there’s enough bandwidth (and there should be), that system can also carry local TCP/IP traffic, SMTP alerts from connected devices, and control signals to all devices. Not only does this approach free everyone from the “closed world” paradigm of HDMI, it also makes the system infinitely more appealing to end-users and facility administrators, an increasing number of whom come from the IT world.
Consider this ad that was posted recently on a listserv for higher education:
“We are looking for an experienced AV-IT Engineer for the role of Technical Guru. The position will provide planning and support for AV-IT systems used in teaching and learning spaces big and small. The person in this position will focus on design, installation, and troubleshooting of AV-IT systems in a variety of venues, including traditional classrooms, active learning classrooms, large auditoria, computer labs, and even Makerspaces…We are looking for a seasoned professional with a solid background in AV-IT systems. This is a great opportunity for a doer who is excited about not just maintaining but also shaping the future of AV-IT technology as a key element of the teaching mission of one of the world’s top universities.”
I rest my case. It’s time for the commercial AV industry to get in step with the rest of the world and move to AV-over-IP signal distribution.
Wake up. Have you smelled the coffee yet?
Look Out, HDMI – Here Comes Super MHL!
- Published on Tuesday, 17 March 2015 12:38
- Pete Putman
- 0 Comments
Yesterday, the MHL Consortium announced its newest flavor of display interface – Super MHL (or, more accurately, superMHL). MHL, which stands for Mobile High-definition Link, was originally developed to enable display connections over Micro USB ports on phones and tablets. You’ll find it most often on mobile devices and televisions from Samsung and Sony. It also pops up on LG products, and there’s even an MHL port on the Pioneer AV receiver I bought a couple of months ago.
There have been some clever demos of MHL applications at past CES events. One was to build a “dumb” laptop (no CPU or video card) – just keyboard, touchpad, and display – and use MHL to dock a smartphone into it and make everything work. Another demo in the Silicon Image booth featured smartphones being used as video game controllers with the video playing back on the controller screen.
Yet another demo showed a Sony Xperia phone being used as a remote control with a Samsung TV to select inputs, play video, and launch Internet applications. It’s easy to do this stuff when you can multiplex video and serial data through the same connector, which in MHL version 3.0 can even play back Ultra HD video at 30 fps with 8-bit color.
Note the emphasis on the “mobile” part. In the world of transition-minimized differential signaling (TMDS), MHL is one of a few players, the others being HDMI (the dominant digital display interface), its predecessor DVI (still going strong although the standard isn’t updated anymore), and Micro and Mini HDMI, niche connectors on smartphones and cameras.
The advent of Ultra HD, 4K, and the even higher resolutions that the new “5K” widescreen workstation monitors use has created a problem: Our display interfaces need to get faster. A LOT faster!
But HDMI 2.0, announced in September 2013, isn’t fast enough. I get into frequent debates with people about why it isn’t, so let me clarify my position: HDMI 2.0 has a maximum overall clock (data) rate of 18 gigabits per second (18 Gb/s). 80% of that can be used to carry display signals; the rest is overhead using 8 bit/10 bit mapping.
So that limits HDMI 2.0 to supporting 3840×2160 pixels (4400×2250 pixels with blanking) in an RGB signal format @ 60 Hz refresh. That’s the hard-and-fast speed limit, and for anyone using a computer workstation or media player with RGB output, it’s a serious obstacle: How will people who buy the new HP/Dell 27-inch workstation monitors connect them? Their working resolution is 5120×2880 pixels, and at 60 Hz, that’s just too fast for HDMI 2.0.
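Those limits are easy to verify with a little arithmetic. Here’s a rough sketch using the 18 Gb/s raw rate and 80% 8b/10b efficiency described above; the UHD blanking totals are the representative figures from the text, not exact for every timing:

```python
# Quick check of the HDMI 2.0 ceiling: 18 Gb/s raw, 80% usable after 8b/10b.
# Blanking totals are the representative figures from the text above.

USABLE_BPS = 18e9 * 0.8   # 14.4 Gb/s of actual pixel data

def link_rate_bps(h_total: int, v_total: int, hz: int, bits_per_color: int = 8) -> int:
    """Total bit rate for an RGB signal (three color channels)."""
    return h_total * v_total * hz * bits_per_color * 3

uhd_60 = link_rate_bps(4400, 2250, 60)      # 3840x2160 plus blanking, 8-bit RGB
five_k_60 = link_rate_bps(5120, 2880, 60)   # 5K active pixels alone, 8-bit RGB

print(f"UHD/60, 8-bit RGB: {uhd_60 / 1e9:.2f} Gb/s  (fits: {uhd_60 <= USABLE_BPS})")
print(f"5K/60, 8-bit RGB:  {five_k_60 / 1e9:.2f} Gb/s  (fits: {five_k_60 <= USABLE_BPS})")
```

Note that the 5K figure counts active pixels only; add blanking and it lands even further past the limit. UHD/60 at 8 bits just squeaks under 14.4 Gb/s, which is exactly why there’s no headroom left for 10-bit color.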
It looked like DisplayPort 1.2 would finally ascend to the top of the podium, since its existing speed of 21.6 Gb/s (17.28 Gb/s usable) was already faster than HDMI 2.0. And now, DisplayPort 1.3 has been announced, with a top speed of 32 Gb/s (about 26 Gb/s usable) and the adoption of Display Stream Compression. Indeed, more computer manufacturers are providing DP connections on laptops: Lenovo seems to have moved completely to this format, and Apple has been supporting DP for some time now.
With all of that in mind, I will admit I was completely blindsided by superMHL at this year’s International CES. Instead of a 5-pin Micro USB connector, superMHL offers a 32-pin, full-size connector that’s symmetrical (the next big thing in connectivity, a la USB Type-C). It supports Display Stream Compression, is compatible with USB Type-C (although not with all six lanes), and has a maximum data rate of 36 Gb/s across six lanes of data. (According to the MHL Consortium, that’s fast enough to transport an 8K (7680×4320) image with 120 Hz refresh and 4:2:0 color.)
The MHL Consortium’s announcement yesterday featured Silicon Image’s new Sil9779 port processor, which can also handle HDMI 2.0 signals. Here are the key specs from the superMHL press release:
- 8K 60fps video resolution, as outlined in the superMHL specification
- New, reversible 32-pin superMHL connector
- USB Type-C with MHL Alt Mode
- High Dynamic Range (HDR), Deep Color, BT.2020
- Object audio – Dolby Atmos®, DTS:X, 3D audio, audio-only mode
- High bit-rate audio extraction
- HDCP 2.2 premium content protection
Whew! That’s quite a jump up from MHL. Some might say that superMHL is on steroids, but no matter how you look at it, superMHL is now a serious contender for the next generation of display connectivity. In the press briefing, a representative of the MHL Consortium waxed on about the approach of 8K broadcasting (it’s already been operating for two years in Japan) and how we would see a migration to 8K TV and displays in the near future.
As Larry David says, “Curb your enthusiasm!” Supporting 8K would be nice, but we’ve barely started the transition to UHDTV. And right now, selling 8K TV to the average consumer is like trying to peddle a Ferrari to someone who lives on a dirt road.
Where superMHL will find its niche is in supporting the higher bit rates that high dynamic range (HDR), wide color gamuts (BT.2020), and higher frame rates (60/96/100/120 Hz) require. All will shortly become important parts of the next-generation (UHD) television system. DisplayPort is already there with version 1.3, and you’ll even find DP 1.2 connections on selected models of Ultra HDTVs so that gamers can connect laptops and desktops at Ultra HD resolutions with 60 Hz refresh.
Now, the elephant in the room: How does the emergence of superMHL affect HDMI? Even though version 2.0 is over a year and a half old, you don’t see many HDMI 2.0 jacks on Ultra HDTVs. Casual inspections at Best Buy, HH Gregg, and other outlets show that the typical HDMI 2.0 port count is usually one (1), even as we approach April of 2015.
In the superMHL presentation, the concept of a TV with multiple HDMI 2.0 inputs and one superMHL input was outlined. This would, in effect, be the next step up from where we are now, with the typical Ultra HDTV having one HDMI 2.0 input and three HDMI 1.4 inputs.
But if Silicon Image’s new Sil9779 port processor can handle both formats, why bother with HDMI 2.0 in the first place, especially with its speed limitations? Wouldn’t it make more sense to future-proof all inputs and go with superMHL across the board? (Of course, the cost of adopting superMHL could weigh heavily on that decision.)
In the commercial AV and broadcast worlds, it would definitely make sense to jump to superMHL in the interests of future-proofing installations. Given the limited rollout of HDMI 2.0 to date, maybe supporting both HDMI 1.4 for legacy devices and superMHL is a smarter approach. (Note that superMHL and HDMI 2.0 both support HDCP 2.2, which is the next level in encryption and NOT compatible with older versions of HDMI.)
Summing up, the race for faster interface speed just got a lot more interesting with the addition of superMHL to the lineup. I can imagine that manufacturers of AV matrix switchers and distribution amplifiers are feeling another migraine headache coming on…
EDITOR’S NOTE: Last week, it was announced that Silicon Image has been acquired by Lattice Semiconductor of Hillsboro, Oregon, “a leading provider of programmable connectivity solutions” according to the press release. The acquisition price was about $600M and now leaves Lattice in control of HDMI, MHL and superMHL, and SiBEAM (WiHD) patents and IP. More information can be found on the Lattice Web site at http://www.latticesemi.com/.
There’s fast…and then there’s FAST.
- Published on Friday, 26 September 2014 16:42
- Pete Putman
- 0 Comments
A little over a year ago, the HDMI Forum announced the latest version of HDMI – 2.0. Among the enhancements to this interface was an increase in the clock rate to 600 MHz, allowing data rates as high as 18 gigabits per second (Gb/s).
Good thing, too, with Ultra HD televisions coming to market. The previous iteration of HDMI (v1.4) had a capped data rate of 10.2 Gb/s, which was barely fast enough for Ultra HD signals (3840×2160 pixels) refreshed at 30 Hz, with color depth not to exceed 8 bits per pixel.
By boosting the speed to 18 Gb/s, HDMI 2.0 can now pass a 60 Hz Ultra HD signal – but only with 8-bit RGB color. (If you’re willing to cut the color resolution in half, you can increase the bit depth.) To me, that’s not enough of an improvement: If you’ve seen what it takes to shoot, edit, and post 4K content, you’ll realize why 10-bit and even 12-bit encoding is the way to go. But HDMI 2.0 can’t handle that at high frame rates.
Earlier this month, the Video Electronics Standards Association “officially” announced what we knew was coming for some time – DisplayPort version 1.3, which will boost its data rate to a mind-boggling 32 Gb/s – almost twice as fast as HDMI 2.0 – and also employ, for the first time, a form of visually lossless compression known as Display Stream Compression.
Unlike HDMI, DisplayPort uses packet-based communication rather than a continuous TMDS stream. It transports video, audio, metadata, and even Ethernet, using four scalable “lanes” to carry signals. In the current version (1.2), the capped data rate for each lane is 5.4 Gb/s, but with version 1.3, it will rise to 8 Gb/s.
DisplayPort’s architecture is designed to be flexible. There are full-sized and mobile versions of the connector, along with wireless and optical fiber interface specifications. Users of Apple MacBooks are familiar with the Mini DisplayPort interface, and a mobile version (Mobility DisplayPort or SlimPort) is available for tablets and phones and uses a single lane for 1080p/60 playback.
The maximum data rate for all four lanes is 21.6 Gb/s, which can accommodate a 3840x2160p/60 signal encoded with 10 bits per pixel in the RGB format. That’s considerably faster than HDMI 1.4 and one reason why a handful of TV manufacturers are adding DP 1.2 connectors to their new 4K TVs. The other reason is the lack of royalties (for now) to use the interface.
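Using the per-lane figures quoted above, the usable throughput works out like this. This is a quick sketch: 8b/10b coding leaves 80% of the raw rate for payload, which is where the 17.28 Gb/s figure comes from:

```python
# DisplayPort throughput from the per-lane rates quoted above.
# 8b/10b coding leaves 80% of the raw rate for payload.

def dp_usable_gbps(lane_rate_gbps: float, lanes: int = 4) -> float:
    return lane_rate_gbps * lanes * 0.8

print(f"DP 1.2 (5.4 Gb/s/lane): {dp_usable_gbps(5.4):.2f} Gb/s usable")
print(f"DP 1.3 (8.0 Gb/s/lane): {dp_usable_gbps(8.0):.2f} Gb/s usable")
```

Both figures comfortably clear HDMI 2.0’s 14.4 Gb/s of usable payload, which is the whole argument for DP on 4K TVs.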
At CES, VESA announced DockPort, a multiplexed signal format that blends USB 3.0 connectivity with display signals in the standard and mini DP connectors. Now, there has been a major announcement by the USB 3.0 Promoter Group and the Video Electronics Standards Association (VESA) of something called “USB Type-C Alternate Mode.”
Drilling deeper, we find that the USB 3.0 Group has introduced a new variation of their interface, known as the Type-C connector. Unlike other versions (Types A & B and their more commonly used “full-size” and “mini” designations), the Type-C connector borrows a page from Apple’s playbook and is reversible. That is, it makes no difference which way you plug it in – there is no right side or wrong side up.
There’s more: The Type-C connector (about half as large as a conventional USB Type-A connector) can carry serial data at speeds up to 10 Gb/s (USB 3.1 Gen 2). It can also deliver up to 100 watts of power (20 volts DC at a maximum of 5 amperes) so that a connected device can be operated while its battery charges.
There are 12 pins in each row of a Type-C connector, arrayed along both edges of the blade. Viewed from the end, the top and bottom pins are reversed from left to right, which is how you can plug it in either way and it will still work. Two pins (1 and 12) are used for ground. Pins 2 and 3 are reserved for a high-speed transmit (TX) data path, while pins 10 and 11 are reserved for a receive (RX) data path.
Pins 4 and 9 provide bus power, and pin 5 (CC) is used to communicate with the connected device to determine operating mode. Finally, pins 6 and 7 function as a USB 2.0 interface. Needless to say, the host and connected device need a USB 3.0-compatible connector switch to determine the operating mode and enable data exchange in 2.0 or 3.0 formats.
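Here’s the pin row described above, sketched as a simple table, plus the positional flip that makes reversibility work. Pin 8 isn’t mentioned in the text but is the sideband-use (SBU) pin in the published pinout; the receptacle’s dual wiring is simplified away here:

```python
# One 12-pin row of the Type-C connector, per the description above. Pin 8
# (SBU, sideband use) isn't covered in the text but is in the published pinout.

PIN_FUNCTIONS = {
    1: "GND",  2: "TX+",  3: "TX-",  4: "VBUS",
    5: "CC",   6: "D+",   7: "D-",   8: "SBU",
    9: "VBUS", 10: "RX-", 11: "RX+", 12: "GND",
}

def flipped(pin: int) -> int:
    """Pin position a contact lands on when the plug is inserted upside-down."""
    return 13 - pin

# Ground stays on ground and power on power; TX pairs land on RX positions,
# which the receptacle's dual wiring resolves (simplified here).
for p in (1, 4, 2):
    print(f"pin {p:>2} ({PIN_FUNCTIONS[p]}) -> pin {flipped(p):>2} ({PIN_FUNCTIONS[flipped(p)]})")
```

Flip the plug and ground still meets ground, power still meets power – the symmetry is baked into the pin assignments, not handled by any active switching in the cable.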
What’s unique about Type-C cables is that so-called “full feature” passive cables will actually contain an internal ID chip that signals the source connection to turn on all USB 3.0 functions and enable high-speed data exchange. These cables will be able to transmit 10 Gb/s of data over 1 meter (3 feet) and 5 Gb/s over 2 meters (6 feet).
Using DisplayPort Alternate Mode, a maximum of four DP “lanes” of display data can travel over that same tiny USB connector, providing all the resolution and bit depth of the full-size and mini DisplayPort connectors. Or, you can reserve two lanes for USB 3.1 operation and employ the other two for displays, something you might want to do in a docking station application. USB 2.0 data exchange is always available through the same connector.
As you can see, the USB connector has gotten a lot smaller. It’s also a lot faster, and is symmetrical (no more fumbling around trying to orient the plug the right way). And it can provide the primary display connection for any device while also sending and receiving high-speed data.
With this announcement, the USB 3.0 Group and VESA have shown that “less is more” when it comes to digital signal interfacing with Type-C Alternate operation. Oh, did I mention that you will be able to buy and use a cable with a Type-C USB connector on one end and a DisplayPort plug on the other? Can’t get any easier to use than that!
CES 2014 In The Rear-View Mirror
- Published on Tuesday, 21 January 2014 15:21
- Pete Putman
- 0 Comments
Once again, CES has come and gone. It sneaks up on us right after a relaxing Christmas / New Year holiday. We’re jolted out of a quiet reverie and it’s back to the rush to board at the airport gate, walking the serpentine lines for taxis at McCarran Airport, and “late to bed, early to rise” as we scramble to make our booth and off-site appointments in Las Vegas.
We don’t make them all on time. Some we miss completely. But there’s a serendipity angle to it all: We might find, in our haste to get from one meeting to another, some amazing new gadget we didn’t know about as we take shortcuts through booths in the North, South, and Central Halls.
Or a colleague sends us a text or leaves a voicemail, emphatically stating “you have to see this!” Or a chance encounter leads to an ad hoc meeting, often off-site or over a hasty lunch in the convention center.
My point is this: You “find” as many cool things at the show as you “lose.” For every must-see product that you don’t see, there’s another one you trip over. Granted, many “must-see” products are yawners – you’ve figured it out 30 seconds into your carefully-staged meeting with PR people and company executives, and you’re getting fidgety.
My best CES discoveries involve products or demos where I can observe them anonymously, without PR folks hovering at my side or staring at my badge before they pounce like hungry mountain lions.
Unlike most of my colleagues in the consumer electronics press, I don’t need to break stories the instant I hear about them. There are already too many people doing that. What’s missing is the filter of analysis – some time spent to digest the significance of a press release, product demo, or concept demo.
And that’s what I enjoy the most: Waiting a few days – or even a week – after the show to think about what I saw and ultimately explain the significance of it all. What follows is my analysis of the 2014 International CES (as we are instructed to call it) and which products and demos I thought had real significance, as opposed to those which served no apparent purpose beyond generating daily headlines and “buzz.”
Curved TV screens: OK, I had to start with this one, since every TV manufacturer at the show (save Panasonic and Toshiba) exhibited one or more curved-screen OLED and LCD televisions. Is there something to the curved-screen concept? At first blush, you’d think so, given all of the PR hype that accompanied these products.
The truth is, really big TV screens do benefit a little from a curved surface, particularly if they are UHDTV models and you are sitting close to them. The effect is not unlike Cinerama movie screens from the 1950s and 1960s. (That’s how I saw Dr. Zhivago and 2001: A Space Odyssey back in the day.)
Bear in mind I’m talking about BIG screens here – in the range of 80 inches and up. The super-widescreen (21:9 aspect ratio) LCD TVs shown by Samsung, LG, and Toshiba used the curve to great effect. But conventional 16:9 TVs didn’t seem to benefit as much, especially in side-by-side demos.
The facts show that worldwide TV shipments and sales have declined for two straight years, except in China where they grew by double digits each year. TV prices are also collapsing – you can buy a first-tier 55-inch “smart” 1080p LCD TV now for $600, and 60-inch “smart” sets are well under $800 – so manufacturers will try anything to stimulate sales.
Is that the reason why we’re seeing so many UHDTV (4K) TVs all of a sudden? Partially. Unfortunately, there’s just no money in manufacturing and selling 2K TVs anymore (ask the Japanese manufacturers how that’s been working for them), and the incremental cost to crank out 4K LCD panels isn’t that much.
Chinese panel and TV manufacturers have already figured this out and are shifting production to 4K in large panels while simultaneously dropping prices. You can already buy a 50-inch 4K LCD TV from TCL for $999. Vizio, which is a contract buyer much like Apple, announced at the show that it would have a 55-inch 4K LCD TV for $1,299 and a 65-inch model for well under $2,000.
Consider that the going price for a 55-inch 4K “smart” LCD TV from Samsung, LG, and Sony is sitting at $2,999 as of this writing and you can see where the industry is heading. My prediction is that all LCD TV screens 60 inches or larger will use 4K panels exclusively within three years. (4K scaling engines work much better than you might think!)
And don’t make the popular mistake of conflating 4K with 3D as ‘failed’ technologies. The latter was basically doomed from the start: Who wants to wear glasses to watch television? Not many people I know. Unfortunately, glasses-free (autostereo) TV is still not ready for prime time, so 3D (for now) is basically a freebie add-on to certain models of televisions.
4K, on the other hand, has legs. And those legs will get stronger and faster as the new High Efficiency Video Coding (HEVC) chips start showing up in televisions and video encoders. HEVC, or H.265 encoding, can cut the required bit rate for 2K content delivery in half. That means it can also deliver 4K at the old 2K rates, somewhere in the ballpark of 10 – 20 Mb/s.
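To see what "4K at 2K rates" actually demands of an encoder, it helps to compare raw bit rates against a delivery budget. The figures below are back-of-envelope assumptions on my part (12 bits per pixel models 8-bit 4:2:0 video; 15 Mb/s sits mid-way in that 10 – 20 Mb/s ballpark):

```python
# Back-of-envelope look at "4K at 2K rates." All figures are
# illustrative assumptions: 12 bits/pixel approximates 8-bit 4:2:0
# video, and 15 Mb/s is an assumed streaming delivery budget.

def uncompressed_mbps(width, height, fps, bits_per_pixel=12):
    """Raw (uncompressed) video bit rate in Mb/s."""
    return width * height * fps * bits_per_pixel / 1e6

BUDGET_MBPS = 15.0  # assumed delivery budget

for name, w, h in [("1080p60", 1920, 1080), ("2160p60", 3840, 2160)]:
    raw = uncompressed_mbps(w, h, 60)
    print(f"{name}: raw {raw:,.0f} Mb/s -> "
          f"{raw / BUDGET_MBPS:,.0f}:1 compression needed")
```

The 4K frame carries exactly four times the pixels, so squeezing it into the same 15 Mb/s pipe demands roughly four times the compression ratio. HEVC's roughly 2x efficiency gain over H.264, plus the fact that fine 4K detail compresses relatively well, is what has to make up the rest.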
While consumer demand for 4K is slowly ramping up, there is plenty of interest in UHDTV from the commercial AV sector. And Panasonic focused in on that sector almost exclusively in their CES booth. I'm not sure why – there are plenty of inferences to draw here; most significantly, it would appear that Panasonic is exiting the money-losing television business entirely. (Ditto nearby Toshiba, which had similar 4K "applications" showcased and which also did not exhibit a line of 2014 televisions.)
Long story short: you may be buying 4K televisions in the near future whether you want 'em or not. It's a manufacturing and plant utilization issue, and if commercial demand for 4K picks up as expected, that will drive the changeover even faster.
As for sources of 4K content: Samsung announced a partnership with Paramount and Fox to get it into the home via the M-Go platform. Comcast had an Xfinity demo of connected set-top boxes streaming 4K, and of course Netflix plans to roll out 4K delivery this year direct to subscribers.
I’m not sure how they’ll pull that off. My broadband speeds vary widely, depending on time of day: I’m writing this at noontime and according to CNET’s Broadband Speed Test, my downstream bit rate is about 22 megabits per second (Mb/s). Yet, I’ve seen that drop to as low as 2 – 3 Mb/s during late evening hours, when many neighbors are no doubt streaming Netflix movies.
Even so, HEVC will definitely help that problem. I spoke to a couple of Comcast folks on my flights out to and back from CES, and they’re all focused on the bandwidth and bit rate challenges of 2K streaming, let alone 4K. More 4K streaming interface products are needed, such as Nanotech’s $300 Nuvola NP-H1, which is about the size of an Apple TV box and ridiculously simple to connect and operate.
Oh, yeah. I should have mentioned organic light-emitting diode (OLED) displays earlier. There were lots of OLED displays at CES, ranging from the cool, curved 6-inch OLED screen used in the new LG G-Flex curved smartphone to prototype 30-inch OLED TVs and workstation monitors in the TCL booth, and on to the 55-inch, 65-inch, and even 77-inch OLED TVs seen around the floor. (LG's 77-inch offering is currently the world's largest OLED TV, and of course, it's curved.)
OLEDs are tricky beasts to manufacture. Yields are usually on the low side (less than 25% per manufacturing run) and that number goes down as screen sizes increase, which explains the high prices for these TVs.
And there’s the unresolved issue of differential color aging, most notably in dark blue emitters. With current OLED science, you can expect dark blue emitters to reach half-brightness at about 5,000 hours of operation with a maximum brightness of 200 nits. Samsung addresses this quandary by employing two blue emitters for every red and green pixel on their OLED TVs, while LG has the more difficult task of managing blue aging in their white OLED emitters.
Several studies over the past three years consistently show people hanging on to their flat screen TVs for 5 to 7 years, which is likely to be a lot longer than 5,000 hours of operation. Will differential color aging rear its ugly head as early adopters shell out close to $10K for a 55-inch OLED TV? Bet on it.
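A little arithmetic shows why that bet is safe. The viewing-hours figure below is my assumption for illustration, not a number from any study:

```python
# How fast does the 5,000-hour blue half-life arrive for a typical
# owner? Daily viewing time is an assumed figure for illustration.

BLUE_HALF_LIFE_HOURS = 5_000   # dark-blue emitter half-brightness point
HOURS_PER_DAY = 5              # assumed average daily viewing

years_to_half_brightness = BLUE_HALF_LIFE_HOURS / (HOURS_PER_DAY * 365)
print(f"~{years_to_half_brightness:.1f} years")
```

At five hours a day, the blue emitters hit half-brightness in under three years – well inside the 5-to-7-year span people actually keep their sets.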
Turns out, there’s another way to get wide color gamuts and saturated colors: Quantum dots. QDs, as we call them, are inorganic compounds that exhibit piezoelectric behavior when bombarded with photons. They emit stable, narrow-bandwidth colors with no drift, and can do so for long periods of time – long enough to work in a consumer television.
QDs are manufactured by numerous companies, most notably Nanosys and QD Vision in the United States. The former company has partnered with 3M to manufacture an optical film that goes on the backside of LCD panels, while the latter offers Color IQ optical components that interface with the entire LED illumination system in edge-lit TVs.
Sony is already selling 55-inch and 65-inch 4K LCD TVs using the Color IQ technology, and I can tell you that the difference in color is remarkable. Red – perhaps the most difficult color to reproduce accurately in any flat-screen TV – really looks like red when viewed with a QD backlight. And it’s possible to show many subtle shades of red with this technology.
All you need is a QD film or emitter with arrays of red and green dots, plus a backlight made up of blue LEDs. The blue passes through, while the blue photons “tickle” the red and green dots, causing them to emit their respective colors. It’s also possible to build a direct-illumination display out of quantum dots that would rival OLED TVs.
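The physics behind that "tickling" is simple: down-conversion only works downhill in energy, and a blue photon carries more energy than a green or red one. A quick check, using typical emission wavelengths that I've assumed for illustration:

```python
# Why blue LEDs can pump red and green quantum dots: a blue photon
# carries more energy than a green or red one, so down-conversion
# works "downhill." Wavelengths are assumed typical values.

H = 6.626e-34     # Planck constant, J*s
C = 2.998e8       # speed of light, m/s
EV = 1.602e-19    # joules per electron-volt

def photon_energy_ev(wavelength_nm):
    """Photon energy in electron-volts for a given wavelength."""
    return H * C / (wavelength_nm * 1e-9) / EV

blue, green, red = 450, 530, 630  # nm, typical LED/QD emission peaks
assert photon_energy_ev(blue) > photon_energy_ev(green) > photon_energy_ev(red)
print(f"blue {photon_energy_ev(blue):.2f} eV -> "
      f"green {photon_energy_ev(green):.2f} eV, red {photon_energy_ev(red):.2f} eV")
```

The energy difference between the absorbed blue photon and the emitted green or red one is lost as heat in the dot – which is also why you can't run the trick in reverse and pump blue dots with a red backlight.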
How about 4K display interfaces? By now, you've probably heard that HDMI has "upgraded" to version 2.0 and can support a maximum data rate of 18 gigabits per second (Gb/s). Practically speaking, because of the way display data is transmitted, only about 14.4 Gb/s of that is really available for a display connection. Still, that's fast enough to show 4K content (3840×2160, or Quad HD) with a 60 Hz frame rate, using 8-bit color.
Over at the DisplayPort booth, I heard stories of version 1.3 looming later this spring. DisplayPort 1.2, unlike HDMI, uses a packet structure to stream display, audio, and other data across four scalable lanes, and has a maximum rate of 21.6 Gb/s – much faster than HDMI. Applying the "20 percent" rule, that leaves about 17.3 Gb/s to actually carry 4K signals. And those extra bits over HDMI mean that DP can transport 3840×2160 video with a frame rate of 60 Hz, but with 10-bit color.
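The arithmetic behind that comparison is easy to check. This sketch applies the same "20 percent" overhead rule to both interfaces – a simplification on my part, since the two links handle overhead differently:

```python
# Bandwidth check for 2160p/60 over HDMI 2.0 and DisplayPort 1.2.
# "Effective" rates apply a ~20% overhead rule to both links, which
# is a simplification -- the real overhead mechanisms differ.

def signal_gbps(width, height, fps, bits_per_color):
    """Uncompressed RGB video bit rate in Gb/s (3 color channels)."""
    return width * height * fps * bits_per_color * 3 / 1e9

HDMI20_EFFECTIVE = 18.0 * 0.8   # ~14.4 Gb/s usable
DP12_EFFECTIVE   = 21.6 * 0.8   # ~17.3 Gb/s usable

uhd_8bit  = signal_gbps(3840, 2160, 60, 8)    # ~11.9 Gb/s
uhd_10bit = signal_gbps(3840, 2160, 60, 10)   # ~14.9 Gb/s

print(uhd_8bit  <= HDMI20_EFFECTIVE)   # True: HDMI 2.0 carries 8-bit 2160p/60
print(uhd_10bit <= HDMI20_EFFECTIVE)   # False: 10-bit won't fit
print(uhd_10bit <= DP12_EFFECTIVE)     # True: DisplayPort has the headroom
```

Run the numbers and 10-bit Quad HD at 60 Hz lands at roughly 14.9 Gb/s – just over what HDMI 2.0 can deliver, and comfortably under DisplayPort 1.2's ceiling.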
Don’t underestimate the value of higher data rates: 4K could turn out to be a revolutionary shift in the way we watch TV, adding much wide color gamuts, higher frame rates, and high dynamic range (HDR) to the equation. HDMI clearly isn’t fast enough to play on that field; DP barely is. Both interfaces still have a long way to go.
So – why not make a wireless 4K connection? There were plenty of demos of wireless connectivity at the show, and I’m not just talking about Wi-Fi. Perhaps the most impressive was in the Silicon Image meeting room, all the way at the back of the lower South Hall, near the Arizona border.
SI, which bought out wireless manufacturer SiBEAM a few years ago, demonstrated super-compact 60 GHz wireless HDMI and MHL links using their UltraGig silicon. A variety of prototype cradles for phones and tablets were available for the demo: Simply plug in your handheld device and start streaming 1080p/60 video to a nearby 55-inch LCD TV screen.
Granted, the 60 GHz tech is a bit exotic. But it works quite well in small rooms and can take advantage of signal multipath "bounces" by using multiple, steerable antenna arrays built into each chip. And it can handle 4K, too – as long as the bit rate doesn't exceed the HDMI 2.0 specification, the resolution, color bit depth, and frame rate are irrelevant.
This sort of product is a "holy grail" item for meeting rooms and education. Indeed, I field numerous questions every year during my InfoComm wireless AV classes along these lines: "Where can I buy a wireless tablet dongle?" Patience, my friends. Patience…
The decline in TV shipments and sales seems to be offset by a boom in connected personal lifestyle and health gadgets, most notably wristbands that monitor your pulse and workouts. There were plenty of these trinkets at the show and an entire booth in the lower South Hall devoted to “digital health.”
Of course, the big name brands had these products – LG’s LifeBand was a good example. But so did the Chinese and Taiwanese manufacturers. “Digital health” was like tablets a few years back – so many products were introduced at the show that they went from “wow!” to “ho-hum” in one day.
This boom in personal connectivity extends to appliances, beds (Sleep Number had a model that can elevate the head of the bed automatically with a voice command), cars (BMW’s i3 connected electric car was ubiquitous), and even your home. Combine it with short-range Bluetooth or ZigBee wireless connectivity and you can control and monitor just about anything on your smartphone and tablet.
Granted; there isn’t the money in these small products like there used to be in televisions. But consumers do want to connect, monitor, and control everything in their lives, and their refrigerators, cars, beds, televisions, percolators, and toasters will be able to comply. (And in 4K resolution, too!)
Obviously, I didn’t visit the subjects of gesture and voice control. There were several good demos at the show of each, and two of the leading companies I showcased last year – Omek and Prime Sense – have been subsequently acquired by Intel and Apple. Hillcrest Labs, PointGrab, and other had compelling demos of gesture control in Las Vegas – a subject for a later time.
Summing up, let’s first revisit my mantra: Hardware is cheap, and anyone can make it. Televisions and optical disc media storage are clearly on the decline, while streaming, 4K, health monitoring, and wireless are hot. The television manufacturing business is slowly and inexorably moving to China as prices continue their free-fall.
Consumers are shifting their focus to all the devices in the home they use every day, not just the television. Connectivity is everything, and the television is evolving from an entertainment device into a control center or "hub" of connectivity. The more of those connections that are made wirelessly, the better – and that includes high-definition video from tablets and phones.
It’s going to be an interesting year…