Posts Tagged ‘UHD’

InfoComm 2017 In The Rear View Mirror

InfoComm 2017 has come and gone, and left us with lots to think about.

For me, this year’s show was hectic, to say the least. I presented my annual Future Trends talk on Tuesday to kick off the Emerging Trends session, then conducted a 3-hour workshop on RF and wireless that afternoon to the largest crowd I’ve ever had for the class. (It may be the largest crowd I ever get, as I’m thinking of shelving this class.)

Bright and early on Wednesday morning, I taught a 2-hour class on AV-over-IT (the correct term; you could also use “AV-with-IP”) to a full house. There were even some folks standing in the back of the room. I guessed at least 200 were in attendance.

Thursday morning found me back in the same space, talking about 4K and Ultra HDTV to a smaller crowd (maybe not as “hot” a topic?) and urging them to set their BS meters to “high” when they headed to the show floor to talk to manufacturers about 4K-compatible/ready/friendly products.

With other presentation commitments, it worked out to nearly 15 hours standing in front of crowds and talking. Tiring to say the least, but I did get a ton of great follow-up questions after each session. People were paying attention!

AV-over-IT was a BIG theme at InfoComm, and it was hard to miss.

Mitsubishi had a very nice fine-pitch LED display at the show – one of the few that are not built in China.

The migration to using TCP/IP networks to transport video and audio instead of buying and installing ever-larger and more complex HDMI switchers and DAs is definitely picking up steam. My colleagues and I have only been talking about this for over a decade, and it’s rewarding to see that both manufacturers and end-users are buying in.

And why not? Computer hardware couldn’t get much cheaper. For my AV/IT demo, I was streaming a local TV station, broadcasting in the 720p HD format, using an H.264 AVC encoder/decoder pair running through a 1GigE NetGear managed switch. The streaming rates were in the range of 15 – 18 Mb/s, so I had plenty of headroom.
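If you want to sanity-check that headroom claim, here’s a minimal back-of-the-envelope sketch in Python. The 75% “usable” figure is my own conservative assumption for protocol overhead and other traffic, not a measurement from the NetGear switch:

```python
# Rough headroom check for the demo above: H.264 streams on a 1GigE switch.
# The 75% usable fraction is an assumption, not a measured value.

LINK_MBPS = 1000        # nominal line rate of a 1GigE port
USABLE_FRACTION = 0.75  # assumed share of the link available for video

def headroom(stream_mbps: float) -> tuple[float, int]:
    """Return (spare Mb/s, how many such streams fit on the link)."""
    budget = LINK_MBPS * USABLE_FRACTION
    return budget - stream_mbps, int(budget // stream_mbps)

for rate in (15, 18):   # the 15 - 18 Mb/s range seen in the demo
    spare, streams = headroom(rate)
    print(f"{rate} Mb/s stream: {spare:.0f} Mb/s spare; ~{streams} streams fit")
```

Even at 18 Mb/s, roughly 40 such streams would fit on a single gigabit port. Plenty of headroom indeed.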

It worked like a champ. I was able to show how adjusting the group of pictures (GOP) length affected latency, along with the effects of constant bitrate (CBR) vs. variable bitrate (VBR) encoding. If I could have dug up the gear in time, I would have demonstrated UHD content through a 10 Gb/s switch – same principles, just a faster network.
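To see why GOP structure shows up in latency numbers at all, here’s a deliberately simple illustrative model. The frame counts and network delay are hypothetical figures chosen only to show the shape of the tradeoff; real encoders have many more knobs:

```python
# Illustrative latency model: B-frames force the encoder to hold frames
# (they reference future pictures), and the decoder buffers frames to
# ride out the bitrate spike of each GOP's large I-frame. All figures
# here are assumptions for illustration, not measurements.

FRAME_MS = 1000 / 60  # frame period at 60 fps, about 16.7 ms

def pipeline_latency_ms(b_frames: int, decoder_buffer: int,
                        network_ms: float = 5.0) -> float:
    """Very rough end-to-end delay: encoder hold + network + decoder buffer."""
    return b_frames * FRAME_MS + network_ms + decoder_buffer * FRAME_MS

print(f"I/P-only GOP, small buffer: {pipeline_latency_ms(0, 2):.0f} ms")
print(f"2 B-frames, deeper buffer:  {pipeline_latency_ms(2, 6):.0f} ms")
```

The network hop is a rounding error; it’s the frames held in encode and decode buffers that dominate, which is exactly what the GOP adjustments in the demo were poking at.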

I saw more companies than ever this year showing some sort of AV-over-IT solution. (Almost as many as those showing LED walls!) Lots of encoders and decoders, using H.264, Motion JPEG, and JPEG2000 formats; connected through fast switches and driving everything from televisions to projectors.

If it’s REALLY happening this time, then this is BIG. Migration to AV-over-IT is a big shot across the bow of companies that sell large HDMI-based matrix switches, not to mention distribution amplifiers and signal extenders – both made obsolete by this new technology. With AV on a network, all you need is a fast switch and a bunch of category cable. For longer runs, just run optical fiber to SFP ports on the switch.

LG showed off its unique curved OLED displays – and they’re dual-sided.

Meanwhile, Samsung unveiled the first digital signage monitors to use quantum dot backlight technology for high dynamic range and wide color gamuts.

Hand-in-hand with this migration to an IT-based delivery system is a steady decline in the price of hardware, one that has hit the consumer electronics industry even harder. Consider that you can now buy a 65-inch Ultra HDTV (4K) with “smart” capabilities and support for basic high dynamic range video for about $800.

That’s even more amazing when you consider that the first Ultra HD displays arrived on our shores in 2012 with steep price tags around $20,000. But the nexus of the display industry has moved to mainland China, creating an excess of manufacturing capacity and causing wholesale and retail prices to plummet.

There is no better example of China’s impact on the display market than LED display tiles and walls. These products have migrated from expensive, coarse-resolution models to super-bright thin tiles with dot pitches below 1 millimeter – about the same pitch as a 50-inch plasma monitor two decades ago.
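That plasma comparison is easy to verify with a little geometry. Panel resolutions varied in that era, so this quick sketch tries two plausible horizontal pixel counts for a 50-inch 16:9 plasma:

```python
# Checking the "about 1 mm" plasma pitch claim. The resolutions are
# representative guesses for 50-inch plasmas of that era.
import math

def pitch_mm(diag_in: float, h_pixels: int, aspect: float = 16 / 9) -> float:
    """Horizontal pixel pitch of a flat panel, in millimeters."""
    width_mm = diag_in * 25.4 * aspect / math.hypot(aspect, 1)
    return width_mm / h_pixels

for h in (853, 1366):  # WVGA-class and WXGA-class panels
    print(f"50-inch panel, {h} pixels wide: {pitch_mm(50, h):.2f} mm pitch")
```

Either way you slice it, the answer lands around a millimeter – the pitch that LED tiles are now dipping below.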

Talk to projector manufacturers and they’ll tell you that LED displays have cut heavily into their business, especially high-brightness projectors for large venues. LED wall manufacturers were prominent at the show, and some are hiring industry veterans to run their sales and marketing operations, removing a potential barrier to sales in this country by presenting potential customers with familiar faces.

Panasonic showed there are still plenty of applications for projection, especially on curved surfaces.

Absen is an up-and-coming LED brand, and they’re hiring veterans of the U.S. AV market to push sales along.

At the other end, large and inexpensive LCD displays with Full HD resolution have killed off much of the “hang and bang” projector business, and large panels with Ultra HD resolution are now popping up in sizes as large as 98 inches. The way things are going in Asia, Full HD panel production may disappear completely by the end of the decade as everyone shifts to Ultra HD.

Even the newest HDR imaging technology – quantum dots – made an appearance in Orlando in a line of commercial monitors with UHD resolution. Considering that QD-equipped televisions have only been around for a couple of years, that’s an amazingly accelerated timeline. But compressed timelines between introduction and implementation are the norm nowadays.

This was my 24th consecutive InfoComm and the 21st show (so far as I can remember) where I taught at least one class. When I went to my first show in Anaheim, CRT projectors were still in use, a ‘bright’ light valve projector could generate maybe 2000 lumens, LCD projectors cost ten grand and weighed 30 pounds, and composite video and VGA resolution ruled the day. RS232 was used to control everything and stereo was about as ‘multichannel’ as audio got.

All of that has passed into oblivion (except for RS232 and VGA connectors) as we continue to blow by resolution, size, speed, and storage benchmarks. The transition to networked AV will result in even more gear being hauled off to recycling yards, as will advances in wireless high-bandwidth technology, flexible displays, cloud media storage and delivery, and object-based control systems.

Can’t wait for #25…

Display Interfacing: Welcome to Babylon

For many years, ‘interfacing’ a video signal meant plugging in a yellow RCA or silver BNC connector that carried composite video. As picture resolution went up, computers became commonplace at work and home, and the term ‘progressive scan’ entered the lexicon, we saw the birth of S-video and then component video (YPbPr and RGB).

So we adapted, building switching and distribution gear that could handle one-, two-, and three-wire formats. All was well and good…until ‘digital’ made its grand entrance about 15 years ago.

Now, we have digital versions of component and RGB video, starting with the Digital Visual Interface (DVI) and moving to the High-Definition Multimedia Interface (HDMI), DisplayPort, and the new superMHL interface that (according to the MHL Alliance) will start appearing on televisions as soon as December.

If I’m a consumer, I mostly don’t care about any of this. As long as I can plug in my set-top box, Blu-ray player, and other gadgets with the right cables I can find at Best Buy, this is just a bunch of alphabet soup.

However, if I’m an integrator (consumer or commercial), then I care VERY much about where all of this is heading. And if I’m paying any attention at all to the growing market for 4K and UHD, then I’m rightfully concerned about the impending problems with interfacing these signals.

superMHL is certainly fast enough to handle UHD. But you can’t find it in pro AV gear yet. Is there a better way?

Consider that:

*Even though HDMI 2.0 was announced in September of 2013 – TWO FULL YEARS AGO – virtually no manufacturer in the pro AV space supports this interface on their switchers and distribution amplifiers. Instead, the vast majority are still providing version 1.4 while claiming these products are “4K compatible” or “4K ready” because version 1.4 is just fast enough to pass an Ultra HD (3840×2160) signal at 30 Hz with 8-bit RGB color. That’s setting the bar kinda low, isn’t it?

*Some computer manufacturers don’t even support HDMI – Apple and Lenovo, for example, both favor DisplayPort. So, now you have to carry dongles everywhere you go?

*HDMI 2.0 arrives hand-in-hand with a new version of copy protection (HDCP 2.2) which is much more rigorous than versions 1.3 and 1.4. If a valid HDCP key exchange isn’t made within 20 milliseconds, the connection will shut down. Period.

*HDMI 2.0 isn’t fast enough for what UHD is turning out to be – a real departure from 1080p and Wide UXGA, with a move to 10-bit color to support high dynamic range (HDR), wide color gamuts (WCG), and high frame rates (HFR). DisplayPort 1.2 can barely support these requirements; DP version 1.3 and superMHL are better positioned to handle the job. (The back-of-the-envelope sketch after this list shows the math.)

*The intellectual property behind HDMI and superMHL is owned by the same company – Lattice Semiconductor – and whereas once there were clear dividing lines between the two interfaces (MHL was designed originally for smartphones and tablets), they are now competing against each other. I’ve even sat in on presentations where it was explained that both could exist on consumer TVs. (And why would that make sense, again, when neither interface has been widely deployed to date, and one is clearly an improvement over the other?)
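Here’s the promised sketch to put numbers behind those bandwidth claims. It compares CTA-861 2160p signal rates (4400 × 2250 total pixels, including blanking) against the TMDS link rates of HDMI 1.4 and 2.0, with an 80% factor for 8b/10b coding. Treat it as an approximation – deep color actually raises the TMDS clock rather than the payload – but the conclusions hold:

```python
# Bandwidth sanity check for the list above. TMDS uses 8b/10b coding,
# so roughly 80% of the link rate carries pixel data. This is an
# approximation, not a full timing analysis.

LINKS_GBPS = {"HDMI 1.4": 10.2, "HDMI 2.0": 18.0}  # max TMDS link rates
CODING_EFFICIENCY = 0.8                            # 8b/10b coding

def signal_gbps(fps: int, bits_per_pixel: int) -> float:
    """2160p rate using the CTA-861 total timing of 4400 x 2250 pixels."""
    return 4400 * 2250 * fps * bits_per_pixel / 1e9

modes = {
    "UHD/30, 8-bit RGB":  signal_gbps(30, 24),
    "UHD/60, 8-bit RGB":  signal_gbps(60, 24),
    "UHD/60, 10-bit RGB": signal_gbps(60, 30),
}

for name, need in modes.items():
    fits = [l for l, rate in LINKS_GBPS.items()
            if need <= rate * CODING_EFFICIENCY]
    print(f"{name}: {need:.1f} Gb/s -> {', '.join(fits) or 'neither interface'}")
```

UHD/30 with 8-bit color squeaks through HDMI 1.4, UHD/60 needs HDMI 2.0, and 10-bit UHD/60 RGB fits neither – exactly the squeeze described above.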

You can imagine what this trend is doing to product designers and manufacturers. Sure, HDMI is a “safe bet” for now, but what if our UHD needs quickly outstrip its maximum clock speed? DP is certainly faster, and there appears to be more support for it from computer manufacturers. But superMHL is faster still. Shouldn’t your interfaces at least have a head start on what display manufacturers will ship next?

This reliance on HDMI has led several manufacturers into a potential trap, investing heavily in signal distribution architectures that may quickly run into a “future-proofing” problem. In contrast, outside the commercial AV industry, everyone from cable TV system operators to broadcasters and telecom operators is busy migrating to an IP-based architecture.

Not only does IP-based architecture have the advantage of being a relatively open system, it also solves many of the speed issues as 1-gigabit and 10-gigabit networks are becoming more commonplace. (Heck, Comcast just upgraded my home Internet speeds to 75 Mb/s on downloads, which is more than fast enough for me to stream 4K content from Netflix and Amazon!)

So, why don’t we do the same in the commercial AV industry? It’s not for a lack of products – there are several companies offering AV-over-IP transmitters and receivers, along with encoders and decoders. I’ve also seen impressive demos of “middleware” used to locate, switch, and play out media assets over IP networks. All of these guys were at InfoComm 2015.

The big players in HDMI-based switching and distribution argue against AV-over-IP for in-room and short-run signal distribution, citing latency and compression issues. Well, we now have a new codec (HEVC H.265) to handle that end of things, and it’s possible to stream video at high resolutions with low latency. (How does 1920×1080p/60 at 1 to 2 Mb/s sound to you? Thought so.)
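To appreciate how aggressive that target is, compare it to the uncompressed signal. A quick worked example (8-bit RGB, blanking and audio ignored):

```python
# How much squeezing does 1080p60 at 1-2 Mb/s imply? Compare against
# the uncompressed 8-bit RGB rate (blanking and audio ignored).
raw_bps = 1920 * 1080 * 60 * 24  # about 3 Gb/s uncompressed

for target_mbps in (1, 2):
    ratio = raw_bps / (target_mbps * 1e6)
    print(f"{target_mbps} Mb/s target: roughly {ratio:,.0f}:1 compression")
```

That’s on the order of 1,500:1 to 3,000:1 compression, which is exactly the kind of job HEVC was designed for.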

High latency is often the result of over-compression and heavy forward error correction (FEC). But if video and audio assets are streaming on bandwidth-managed, private IP networks, there isn’t a lot of forward error correction required. Group of Pictures (GOP) sizes can also be increased – fewer bitrate-spiking I-frames means smaller buffers – to reduce latency. So latency is sort of a “straw man” argument. (And HDMI 2.0 will have plenty of issues with HDCP 2.2, trust me. Talk about latency…)

As for copy protection: video and audio assets streaming over IP connections have their own security protocols. Practically speaking, what could be more secure than video content streaming directly into a UHDTV through an Ethernet connection? And unless you opt for a wired Ethernet hookup, you don’t even have to plug in a cable to make it work. Bandwidth issues? Well, how about 5 GHz 802.11ac channel-bonding routers? I’m getting 70+ Mb/s download speeds from mine with wired connections, and 25 – 30 Mb/s some distance from my 5 GHz wireless link.

Again, looking outside our industry, the two most common signal distribution and switching architectures are based on HD-SDI or IP (or both). Not HDMI, and certainly not HDMI-derived, structured-wire systems like HDBaseT. If the rest of the world wants to multiplex video, audio, metadata, and other low-bitrate control signals, they do it over optical fiber. (Did you know that multimode fiber is cheaper than Category 6 wire?)

I’ll wrap things up by saying that the smart move is for commercial AV integrators to move to an AV-over-IP signal distribution system at the core, like everyone else, leaving the HDMI, DisplayPort, superMHL, and “whatever comes next” connections for the far ends, near the displays (if those far-end conversions are even needed at all).

Leave the core as a high-speed, software-based switcher riding on a copper or optical bus. If there’s enough bandwidth (and there should be), that system can also carry local TCP/IP traffic, SMTP alerts from connected devices, and control signals to all devices. Not only does this approach free everyone from the “closed world” paradigm of HDMI, it also makes the system infinitely more appealing to end-users and facility administrators, an increasing number of whom come from the IT world.
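For a concrete flavor of what a “software-based switcher” looks like on a plain IP network, here’s a minimal Python sketch using standard multicast sockets. The group address and port are arbitrary examples, and real AV-over-IP products layer RTP, timing, and management on top of this:

```python
# Minimal sketch of software "switching" on an IP network: each source
# multicasts to its own group, and a display endpoint joins whichever
# group it wants. The group address and port are arbitrary examples.
import socket
import struct

GROUP, PORT = "239.1.1.10", 5004  # hypothetical group for one source

def join_source(group: str = GROUP, port: int = PORT) -> socket.socket:
    """Subscribe to a source; the Ethernet switch handles distribution."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    mreq = struct.pack("4s4s", socket.inet_aton(group),
                       socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock

# "Routing" a display to another source is just dropping one group
# membership and adding another - no matrix switcher required.
```

In other words, the “switcher” becomes a software decision about which multicast group to join, and the Ethernet switch does the heavy lifting.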

Consider this ad that was posted recently on a listserv for higher education:

“We are looking for an experienced AV-IT Engineer for the role of Technical Guru. The position will provide planning and support for AV-IT systems used in teaching and learning spaces big and small. The person in this position will focus on design, installation, and troubleshooting of AV-IT systems in a variety of venues, including traditional classrooms, active learning classrooms, large auditoria, computer labs, and even Makerspaces…We are looking for a seasoned professional with a solid background in AV-IT systems. This is a great opportunity for a doer who is excited about not just maintaining but also shaping the future of AV-IT technology as a key element of the teaching mission of one of the world’s top universities.”

I rest my case. It’s time for the commercial AV industry to get in step with the rest of the world and move to AV-over-IP signal distribution.

Wake up. Have you smelled the coffee yet?

Ultra HD: A Race To The Bottom?

On September 23, Vizio rolled out its new line of Ultra HD TVs at an art gallery in lower Manhattan. We’d been expecting these to show up ever since pricing was announced way back at CES in January, and there weren’t any real surprises in the lineup: Five models, ranging in size from 50” to 70” in 5” (diagonal) increments.

Unlike recent Ultra HD product launches from Seiki and TCL, the Vizio lineup sent a few tremors through the industry – in particular, at Samsung, LG, and Sony. Consider that each one of the Vizio TV models is a “smart” TV, and each uses full-array LED backlighting. You’ll find a bevy of HDMI 1.4 connectors on all of them, along with a single HDMI 2.0 interface. And the sets support HEVC H.265 decoding, too. (Can you say “Netflix 4K streaming”?)

In other words, these aren’t bargain-basement models, like the aforementioned Seiki. But what will raise a few eyebrows is the retail pricing: The 50-inch P502ui-B1 retails for $999, while the 55-inch P552ui-B2 goes for $1,399. The 60-inch P602ui-B3 is ticketed at $1,699, while the 65-inch P652ui-B2 will cost $2,199. And the “top of the line” 70-inch P702ui-B3 will be available for just $2,499. (All prices are in USD.)

To see exactly what impact that could have on the market, look at current prices for Samsung and LG 55-inch Ultra HDTVs. The current HH Gregg sales flyer for October 5 shows Samsung’s UN55HU6950 55-inch Ultra HD set for $1,599 – a drop of about $1,400 from their previous 55-inch model.

LG also started lowering prices on its Ultra HD sets in the late spring. Their 55-inch 55UB9500 Ultra HD set is now listed at $1,999, which is also a big markdown from earlier this year. How about Sony? The HH Gregg flyer shows the 65-inch XBR65X850B with Triluminos quantum dot backlight (by QD Vision) for $2,999, which (according to the flyer) represents a $1,000 discount. That’s still $800 more than the comparable Vizio model, which uses conventional LED backlights.

So why should any of this matter? Simple: Vizio is an established national brand that has enjoyed strong sales in large LCD TV screen sizes for several years. And they’ve expanded from their original bases in Costco and BJ’s to Wal-Mart, Sears, and now Best Buy.

The latter brick-and-mortar chain is where Samsung, LG, and Sony have been running an aggressive in-store promotion for Ultra HDTV since early August, playing back clips of 4K footage and raffling off Ultra HD TVs in an attempt to stir up business. TV sales have declined worldwide for the past two years, and the major TV brands are clearly hoping that Ultra HD will re-start the engine.

The decline in Ultra HDTV prices has been breathtaking, to say the least. One year ago, you could expect to shell out upwards of $4,400 to buy a new Samsung or Sony 65-inch Ultra HD set. 55-inch models were retailing for about $1,000 less. And now Vizio has pulled the rug out from under its competitors with a line of 4K sets that looked impressive at the NY event.

What does this mean for Ultra HD TV pricing down the road? Given the scramble to find any profit in manufacturing 2K LCD glass – a challenge even for the Koreans – and the determination of China to be a major player in 4K glass manufacturing, we can expect prices to drop even lower by next year’s Super Bowl. Right now, you can buy a nice 55-inch 2K LCD TV for $600, and I’d expect a 4K version to sell for just under $1,000 by late January.

Long term, the profit in manufacturing 2K LCD glass will mostly evaporate, leading fabs to switch to 4K glass for larger TV sizes. As a consequence, you will see most TVs larger than 55 inches utilize 4K resolution glass in a few years, just as the industry shifted from 720p and 768p panels to 1080p glass in the mid-2000s.

According to NPD DisplaySearch, more Ultra HD sets were sold in the second quarter of 2014 (2.1 million) than in all of last year (1.3 million). But we’re still talking about a small percentage of all TVs sold worldwide in 2013 (208 million). So it is surprising to see price wars already starting up this early in the game.

Who will blink next?