Posts Tagged ‘4K’

True Facts

A recent post on LinkedIn led me to write this, and it has to do with 4K video and imaging – or at least with how marketing types have redefined it as “True” 4K or “Faux” 4K.

The post in question had to do with a projector manufacturer’s 4K offerings and how other manufacturers may not be offering a “true” 4K product by comparison, calling those other products “faux” 4K (or “faux” K to be clever). That prompted more than a few comments about what “true” 4K is in the first place.

One comment pointed out that the projector brand behind the original post doesn’t even have a “true” 4K imaging device in its projector, as it uses Texas Instruments’ .66” DMD with 2716×1528 micromirrors and requires image shifting to create an image with full 4K resolution. (Some irony in that?)

Now, I know more than a few marketing folks in the AV industry, and they work very hard and diligently to promote their company’s products. However, sometimes they step out of bounds and create more confusion, particularly with new technologies. Which, by the way, was one reason I started teaching technology classes at InfoComm and other trade shows two decades ago – as a way to counter marketing hype with facts.

What, exactly, is “true” 4K? If you use spatial resolution as your benchmark, then your imager must have at least 4,000 pixels of horizontal resolution. The fact is, very few displays today have that much resolution, save for a limited number of digital cinema projectors, a handful of home theater projectors, and a small selection of reference and color grading monitors. All of which will set you back quite a few $$$.

Most displays that are lumped into the 4K category are really Ultra HD displays, having a fixed resolution of 3840 horizontal and 2160 vertical pixels. This would include every so-called 4K consumer TV, many digital signage displays, and production monitors. Are they “true” 4K? Going by spatial resolution, no.
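If you’d like to see that difference in raw numbers, here’s a quick back-of-the-envelope sketch (illustrative Python, nothing more):

```python
# Quick comparison of "true" 4K (DCI) vs. Ultra HD pixel counts.
dci_4k = (4096, 2160)   # digital cinema 4K
uhd    = (3840, 2160)   # consumer "4K" (Ultra HD)

dci_pixels = dci_4k[0] * dci_4k[1]   # 8,847,360
uhd_pixels = uhd[0] * uhd[1]         # 8,294,400

print(f"DCI 4K: {dci_pixels:,} pixels")
print(f"UHD:    {uhd_pixels:,} pixels")
print(f"UHD has {100 * uhd_pixels / dci_pixels:.1f}% of DCI 4K's pixels")
# UHD has 93.8% of DCI 4K's pixels -- and zero columns past 3840.
```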

What makes things even more confusing is projection specsmanship. Sony’s original SXRD projectors had Ultra HD resolution. Although Epson has shown a prototype HTPS LCD chip with UHD resolution, they’ve never brought it to market. And the only DMD that Texas Instruments makes with 4K resolution is the 1.38” dark chip they sell into the digital cinema marketplace.

What projector manufacturers do instead to get to 4K is to use lower-resolution chips and shift the image with very fast refresh rates to effectively create 4K images. I’ve seen demos of the .66” DMD creating 4K images vs. a native UHD imager and you can see the difference between native and shifted images, particularly with fine text and detail. But it represents a low-cost way to get something approaching UHD resolution.

Panasonic also did this with their PT-RQ32U 4K DLP projector, using 2560×1600 devices and mapping four shifted quadrants to get to an effective 5120×3200 total pixels. Presumably, they’ve retained this trick on their newer 4K models shown at InfoComm 2019.

Is that “true 4K”? Not when it comes to spatial resolution. But what if you base your claims on each finished frame of video, after all sub-fields are created? In that case, you could have an argument that your device is actually creating 4K video. Since our eyes can’t keep up with refresh rates much past 60 Hz, we’re not likely to see any flicker from this technique (also known as “wobbulation” and used by such luminaries as JVC and Hewlett-Packard on their display products in the past).
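Here’s the arithmetic behind the trick in a simplified sketch (the shift mechanics are idealized; real optics and timing are messier):

```python
# Pixel shifting in a nutshell: the imager draws two sub-fields per finished
# frame, the second offset diagonally by half a pixel, so it must refresh at
# twice the output frame rate to hide the trick.
frame_rate = 60                      # finished frames per second
subfields_per_frame = 2
imager_refresh = frame_rate * subfields_per_frame    # 120 Hz

mirrors = 2716 * 1528                          # micromirrors on the 0.66" DMD
samples_per_frame = mirrors * subfields_per_frame    # 8,300,096 samples
uhd_pixels = 3840 * 2160                       # 8,294,400 pixels in a UHD frame

print(f"Imager refresh: {imager_refresh} Hz")
print(f"Shifted samples per frame: {samples_per_frame:,}")
print(f"UHD pixels per frame:      {uhd_pixels:,}")
```

Two sub-fields from the 0.66” DMD add up to roughly the pixel count of a UHD frame, which is exactly the claim being marketed.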

In fact, Digital Projection’s Insight Laser 8K projector employs three 1.38” dark chip DMDs and some clever image shifting to get from native 4096 x 2160 resolution to 8K (presumably 8192 x 4320 pixels in the finished images). Native 8K DMDs don’t exist, and like 8K camera sensors, they wouldn’t come cheap if they did. Scaling down, it would make no sense financially to try and ship single-chip 4K DLP projectors with the 1.38” 4K DMD, not to mention the optical engine would have to be a lot larger, resulting in a bigger and heavier projector.

At this point, we should stop using the nomenclature “4K” altogether and switch to the more accurate CTA designation for Ultra HD (3840 x 2160) when we talk about the next generation of displays past Full HD (1920 x 1080) and 2K (2048 x 1080). Also, SMPTE designates two sets of resolutions that go beyond Full HD: UHD-1, anything up to and including 3840 x 2160, and UHD-2, anything beyond UHD-1 up to and including 8K (7680 x 4320).

From my perspective, if your imaging device can show me a complete frame of video with at least 3840 x 2160 pixels, refreshed at 60 Hz, then I’m okay with calling it UHD (NOT 4K). But there’s a catch: high frame rate video is going to be a big thing with UHD-1 and UHD-2 and will require refresh rates of 90, 100, 120, and even 240 Hz. Can your current projector show me a complete video frame with at least 3840 x 2160 pixels of spatial resolution when refreshed at 240 Hz? 120 Hz?

Boy, I can hardly wait for 8K projector marketing campaigns to start…

(This article originally appeared on 9/19/2019 in Display Daily.)

HDMI 2.1 Update – Pretty Much Status Quo

Last Thursday, the HDMI Licensing Administrator held a joint press conference in New York City to update attendees on the latest version of HDMI – version 2.1.

V2.1, which was officially announced at CES in 2017, represents a quantum leap over earlier versions. It’s the first HDMI architecture to use a packet-based signaling structure, unlike earlier versions that employed transition-minimized differential signaling (TMDS). By moving to a packet transport (an architecture that, according to my sources, borrows a lot from DisplayPort), the maximum data rate could be expanded several-fold, from the previous cap of 18 gigabits per second (Gb/s) to a stratospheric 48 Gb/s.

What’s more, the clock reference can now travel embedded in one of the four lanes. Previously, HDMI versions up to 2.0 were limited to three signal lanes and one clock lane. And of course, a digital packet-based signal stream lends itself well to compression, accomplished with VESA’s Display Stream Compression (DSC) system that is also the basis for Aptovision’s Blue River NT technology.
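The lane arithmetic behind those headline numbers is simple enough to sketch out. The per-lane rates below are my back-calculation from the 18 and 48 Gb/s totals, so treat them as illustrative:

```python
# Where HDMI's headline numbers come from (per-lane rates back-calculated
# from the quoted 18 and 48 Gb/s maximums).
tmds_lanes, tmds_gbps_per_lane = 3, 6.0   # HDMI 2.0: 3 data lanes + 1 clock lane
frl_lanes, frl_gbps_per_lane = 4, 12.0    # HDMI 2.1: clock embedded, 4 data lanes

print(f"HDMI 2.0 max: {tmds_lanes * tmds_gbps_per_lane:.0f} Gb/s")  # 18 Gb/s
print(f"HDMI 2.1 max: {frl_lanes * frl_gbps_per_lane:.0f} Gb/s")    # 48 Gb/s
```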

The HDMI Forum simply had to kick up the performance of the interface. Version 2.0, announced five years ago, was perceived by many (including me) to be too slow right out of the gate, especially when compared to DisplayPort 1.2 (18 Gb/s vs. 21.6 Gb/s). That perception was prescient: Just half a decade later, Ultra HDTVs are rapidly approaching the unit shipment numbers of Full HD models, and the bandwidth demands of high dynamic range (HDR) imaging with wide color gamuts (WCG) need much faster highways, especially with RGB (4:4:4) color encoding and 10-bit and 12-bit color rendering.

And if we needed any more proof that a faster interface was overdue, along comes 8K. Samsung is already shipping an 8K TV in the U.S. as of this writing, and Sharp has introduced a model in Japan. LG’s bringing out an 8K OLED TV in early 2019, and Dell has a 32-inch 8K LCD monitor for your desktop.

To drive this point home, IHS analyst Paul Gagnon showed numbers that call for 430,000 shipments of 8K TVs in 2019, growing to 1.9 million in 2020 and 5.4 million in 2022. 70% of that volume is expected to go to China, with North America making up a 15% share and western Europe 7%. Presumably, at least one of the signal inputs on these TVs will support HDMI 2.1, as even a basic 8K video signal (60p, 10-bit 4:2:0) will require a data rate of about 36 Gb/s, while a 4:2:2 version demands 48 Gb/s – right at the red line. (DSC would cut both of those rates in half.)
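If you want to check that math yourself, here’s a rough payload calculator. It counts active pixels only; the figures quoted above also include blanking and link overhead, which is why they run higher:

```python
# Rough uncompressed video payload calculator. Real interfaces also carry
# blanking intervals and protocol overhead, so on-the-wire rates run higher.
def payload_gbps(width, height, fps, bit_depth, chroma):
    bits_per_pixel = bit_depth * {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * fps * bits_per_pixel / 1e9

print(payload_gbps(7680, 4320, 60, 10, "4:2:0"))  # ~29.9 -> ~36 Gb/s on the wire
print(payload_gbps(7680, 4320, 60, 10, "4:2:2"))  # ~39.8 -> ~48 Gb/s on the wire
```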

Aside from stating that over 900 million HDMI-equipped devices are expected to ship in 2019 (including everything from medical cameras to karaoke machines), HDMI Licensing CEO Rob Tobias didn’t offer much in the way of real news. But I had a few deeper questions, the first of which was “Is there now native support for optical interfaces in the HDMI 2.1 standard?” (Answer – no, not yet.)

My next question was about manufacturers of V2.1 transmitter/receiver chipsets. Had any been announced that could actually support 48 Gb/s? According to Tobias, HDMI Forum member Socionext, a chip manufacturer in Japan, has begun production on said chipsets. I followed that reply up with a question about manufacturer support for DSC in televisions and other CE devices, but couldn’t get a specific answer.

Much of the discussion among the panel members – David Meyer (director of technical content for CEDIA), Brad Bramy (VP of marketing for HDMI LA), and Scott Kleinle (director of product management for Legrand, a supplier to the CEDIA industry) – focused on future-proofing residential installations that use HDMI interconnects.

But why not just go optical for all HDMI 2.1 connections and guarantee future-proofing? The responses I got to my last question were mostly along the line of “The installer just wants it to work the first time.” Yes, there are faster (Ultra High Speed) HDMI cables available now to work with V2.1 connections. But an HDMI cable that has to run 20, 30, or 40 feet at over a GHz clock rate is a pretty fat cable!

Multimode fiber cable is inexpensive compared to Cat 6 cable, and the terminations are not difficult to install. Running strands of fiber through conduit, masonry, and wall cavities seems to be the most logical solution at the required speeds, and it’s certainly what I’d recommend to installers in the commercial AV market. Properly terminated, optical fiber works the first time and every time, and it can run over a mile without significant signal degradation.

Once again, the HDMI Forum will have a booth at CES in the lower South Hall. With a new display wrinkle lurking in the shadows – high frame rate (HFR) video – there will be more upward pressure than ever on data rates for display connections. HDMI 2.1 may be up to the task (most likely aided by DSC), so I will be curious to see if there are any 8K/120 demos in Las Vegas. – PP

NAB 2018 In The Rear View Mirror

I just returned from my annual visit to the NAB Show in Las Vegas and the overall impression was of an industry (or industries) marching in place. Many booths were smaller; there were plenty of empty spaces filled with tables and chairs for eating and lounging, and at times you could hear crickets chirping in the North and Central Halls.  (Not so the South Hall, which was a madhouse all three days I visited.)

There are a number of possible reasons for this lack of energy. The broadcast and film industries are taking the first steps to move to IP backbones for everything from production to post and distribution, and that transition is moving slowly. Even so, there was no shortage of vendors trying to convince booth visitors that AV-over-IT is the way to go, chop-chop!

Some NAB exhibitors that were formerly powerhouses in traditional media production infrastructures have staked their entire business model on IT, with flashy exhibits featuring powerful codecs, cloud media storage and retrieval, high dynamic range (HDR) imaging, and production workflows (editing, color correction, and visual effects) all interconnected via an IT infrastructure.

And, of course, there is now a SMPTE standard for transporting professional media over managed IP networks (note the word “managed”), and that’s ST 2110. The pertinent documents that define the standards are (to date) SMPTE ST 2110-10/-20/-30, addressing system concerns and uncompressed video and audio streams, and SMPTE ST 2110-21, specifying traffic shaping and delivery timing of uncompressed video.

No doubt about it – the Central Hall booths were definitely smaller and quieter this year.


Canon’s Larry Thorpe and Ivo Norenberg talked about the company’s new 50-1000mm zoom lens for Full HD cameras.


Blackmagic Design’s Pocket Cinema Camera 4K is quite popular – and affordable.

Others at NAB weren’t so sure about this rush to IT and extolled the virtues of next-generation SDI (6G, 12G, and even 24G). Their argument is that deterministic video doesn’t always travel well with the non-real-time traffic you find on networks. And the “pro” SDI crowd may have an argument, based on all of the 12G connectivity demos we saw. 3G video, to be more specific, runs at about 2.97 Gb/s, so a 12G connection would be good for 11.88 Gb/s – fast enough to transport an uncompressed 4K/60 video signal with 8-bit 4:2:2 color or 10-bit 4:2:0 color.

I’ve talked about 8K video and displays in previous columns, but mostly from a science experiment perspective. Well, I was quite surprised – pleasantly so – to see Sharp exhibiting at NAB, showing an entire acquisition, editing, production, storage, and display system for 8K video. (Yes, that Sharp, the same guys that make those huge LCD displays. Now owned by Hon Hai Precision Industry.)

Sharp’s 8K broadcast camera – more precisely, the 8C-B60A – uses a single Super 35mm sensor with an effective resolution of 7680×4320 pixels arrayed in a Bayer format. That’s 16 times the resolution of a Full HD camera, which means data rates that are 16x that of 3G SDI. In case you are math challenged, we’re talking in the range of 48 Gb/s of data for a 4320p/60 video signal with 8-bit 4:2:2 color, which requires four 12G connections.
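Here’s that arithmetic in sketch form, using the 16x rule of thumb from the paragraph above:

```python
import math

# 8K has 16x the pixels of Full HD, so it needs roughly 16x the payload of a
# 3G-SDI link -- and that total has to be split across 12G-SDI connections.
SDI_3G_GBPS = 2.97
SDI_12G_GBPS = 4 * SDI_3G_GBPS          # 11.88 Gb/s

rate_8k = 16 * SDI_3G_GBPS              # ~47.5 Gb/s for 4320p/60, 8-bit 4:2:2
links_needed = math.ceil(rate_8k / SDI_12G_GBPS)
print(f"8K/60 needs ~{rate_8k:.1f} Gb/s -> {links_needed} x 12G-SDI links")  # 4
```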

Sharp is building 8K cameras for live coverage of the 2020 Tokyo Olympics.


NHK demonstrated an 8K 240Hz slow motion video playback system, along with other 8K goodies.


Soliton demonstrated H.265 encoding across multiple platforms, including Android devices.

And this isn’t a science experiment at all. Sharp is building cameras for the live 8K broadcasts to take place at the 2020 Tokyo Olympics, originating from Japanese broadcast network NHK. By now, this should be old hat, as NHK has been covering the Olympics in 8K since 2012 and showed different approaches to home viewing in Las Vegas. They also impressed with demos of 8K “slo-mo” video at a frame rate of 240 Hz, and yes, it is practical and ready to roll.

In the NHK booth, you could also watch a demonstration of 8K/60 video traveling through a 10 Gb/s switch using so-called mezzanine compression based on the TiCo system. In this case, NHK was using 5:1 TiCo compression to squeeze a 40 Gb/s 8K/60 video stream down to 8 Gb/s. (Four 12G video connections would come to nearly 48 Gb/s, in case you’re wondering.)

Not far from NHK’s booth last year was a virtual city of companies showing virtual reality (VR) and augmented reality (AR) hardware and software. That was about twice the size of the VR/AR exhibits in 2016, so I expected to find a sprawling metropolis of VR goodies. Instead, I came across a very large food court and lots of partitioned-off space. Turns out, what was left of the VR companies occupied a small pavilion known as “Immersive Storytelling.” Is VR the next 3D? (Probably not, but you couldn’t be blamed for thinking that.)

Panasonic’s got a 55-inch 4K OLED monitor for client viewing.


Epson showed an ultra short-throw laser projection system with excellent edge-to-edge sharpness.


The gadgeteers at NTT built a drone with a spinning LED sign shaped like a globe. Why? Because they could, I suppose.

Upstairs in the South Hall, there were dozens of companies hawking video compression tools, streaming and cloud services, targeted ad insertion, audience analytics, and a bunch of other buzzwords I’m probably getting too old to completely understand. (It will be interesting to see how many of these enterprises are still around a year from now.)

But my primary goal in that hall was to talk to folks from the Alliance for Open Media coalition. In case you haven’t heard of this group, they’ve been promoting an open-source, royalty-free codec labeled AV1 for “next-generation 4K video.” There are at least 18 prominent members of the group and you may recognize a few of them, such as Google, Apple, Mozilla, YouTube, Netflix, Facebook, and VideoLAN.

What they’re promoting is a codec very similar to HEVC (H.265), which is built on lots of intellectual property that requires licensing from an organization known as MPEG-LA (Licensing Authority, not Los Angeles). The AOM contingent thinks it is taking WAY too long to get H.265 off the ground and would rather just make a suitable codec free to anyone who wants to use it, to speed up the transition to 4K video.

In addition to giving out red, yellow, green, and blue lollipops, Google had its Jump 360-degree camera out for inspection.


Technicolor claims to have solved the problem of rapid switching between different HDR formats streaming in the same program.


Keep an eye on the AV1 codec. It could really upset the apple cart.

Of course, they didn’t have a ready answer when I questioned the future viability of any company that had sunk millions of dollars into H.265 development, only to see their hard work given away for free. The stock answers included “there will be winners and losers” and “some companies will probably be bought out.” Note that the primary goal of the members I listed is content delivery, not living off patent royalties, so that gives you some insight into their thinking.

The last puzzle piece was the new ATSC 3.0 standard for digital TV broadcasting, and it’s being tried out in several markets as I write this – most notably Phoenix. ATSC 3.0 is not compatible with the current version 1.0, as it uses a different modulation scheme (OFDM vs. 8-VSB) and is very much intertwined with IP to make delivery to mobile devices practical. WRAL in Raleigh, North Carolina, has been broadcasting in this format for almost a year now.

ATSC 3.0 is already being tested in several TV markets. Will it take off? And how will consumers choose to watch it?


CreateLED had this cool LED “waterfall” in their booth.

ATSC 3.0 is designed to be more bandwidth-efficient and can carry 1080p and 4K broadcasts along with high dynamic range video. At the show, I saw demos of ATSC 3.0 receivers married to 802.11ac WiFi routers, ATSC 3.0 set-top boxes, and even an autonomous shuttle vehicle between the Central and South Halls that was supposedly carrying live ATSC 3.0 mobile broadcasts. (It wasn’t working at the time, though. More crickets…)

All in all, a very subdued show, but reflective of an industry in transition from a world of deterministic video traveling uncompressed over coaxial cable to compressed audio and video packets streaming through wired and wireless networks with varying degrees of latency. Where do we go from here?


High Dynamic Range: It’s Here!

Ever since the launch of high definition television in the 1990s, it seems as if some new ‘bell and whistle’ enhancement comes along every few years. First it was the changeover to flat screen plasma displays in the late 1990s, followed by a shift to 1080p and Wide UXGA resolution in the early 2000s.

The industry transitioned to liquid-crystal display (LCD) panels for TVs and monitors a few years later. UHD (4K) imaging popped into our consciousness in 2012. And of course, 3D made its “once every few sunspot cycles” appearance in 2009, followed by virtual reality last year.

Some of these trends actually stuck, like 4K: Display manufacturers are slowing down production of Full HD (1920×1080) display panels in favor of UHD (3840×2160) as consumers increasingly choose the higher resolution. That, in turn, means that the displays we select for digital signage, classrooms, meeting rooms, and other applications will also be of the 4K variety.

The latest trend to rear its head is high dynamic range (HDR), which is accompanied by wide color gamut (WCG) imaging. In a nutshell, HDR means a greatly expanded range of tonal values that can be shown in still and moving images. Conventional cameras and imaging systems can capture anywhere from 9 to 11 f-stops of light. (Each f-stop increase represents a luminance value twice as bright as the previous one.)

HDR takes that to a higher level by capturing as many as 22 f-stops of light, and reproducing those tonal values is a real challenge for displays that employ conventional backlight or illumination systems. Hence, we are now seeing a new crop of LCD TVs with turbocharged backlights to reproduce the extreme dynamic ranges of HDR images. On the emissive display side, organic light-emitting diode (OLED) TVs can also reproduce HDR content, although with lower peak brightness levels.
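Since each f-stop is a doubling, the contrast range grows exponentially with the stop count – a quick sanity check:

```python
# Each f-stop doubles luminance, so n stops span a 2^n : 1 contrast range.
for stops in (11, 22):
    print(f"{stops} f-stops -> {2 ** stops:,}:1 range")
# 11 f-stops -> 2,048:1 range
# 22 f-stops -> 4,194,304:1 range
```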

At NAB 2017, NEC showed this 4K HDR encoder prototype, streaming 77 Mb/s with 99 ms latency.

For some perspective, the venerable CRT display had a peak brightness level somewhere around 29 foot-Lamberts (100 candelas per square meter), which represented close to 100% diffuse white. In an HDR display, that value largely holds for diffuse white, but more intense specular highlights (like the sun reflecting off a pane of glass or the water, or a bright streetlight at nighttime) can hit peaks much, much higher – in the thousands of cd/m2.
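For those who want to double-check the unit conversion (1 foot-Lambert is roughly 3.426 cd/m2):

```python
# Converting between foot-Lamberts and candelas per square meter (nits).
FL_TO_NITS = 3.426     # 1 fL ~= 3.426 cd/m^2

print(f"29 fL ~= {29 * FL_TO_NITS:.0f} cd/m^2")        # ~99 cd/m^2
print(f"1,000 cd/m^2 ~= {1000 / FL_TO_NITS:.0f} fL")   # ~292 fL
```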

And HDR isn’t just about extreme brightness. The entire grayscale is expanded, so we should see more shadow details along with intense specular light sources. When done correctly, HDR images are quite the departure from ‘everyday’ HDTV, and more closely resemble the range of tonal values our eyes can register – with their visual contrast ratio approaching 1,000,000:1.

There are numerous ways to achieve higher levels of brightness. Dense arrays of light-emitting diodes can do it when used in a direct-illumination architecture. However, the favored approach is to employ a special optical film embedded with nano-sized red and green quantum dot particles, stimulated by an array of blue LEDs. 2017 TV models using this approach can achieve peak small-area brightness values of 2,000 cd/m2.

For perspective, consider that an LED (emissive) videowall for indoor use will routinely hit 3,000 cd/m2 brightness with full white images, and you can appreciate just how much of a leap HDR represents over current imaging technology. What’s more significant is how quickly the prices for HDR displays are coming down, particularly as Chinese TV manufacturers enter the marketplace.

Just prior to the Super Bowl – the best time to score a deal on a new TV, by the way – it was possible to purchase a 55-inch ‘smart’ Ultra HDTV for just $499 from a Tier 1 manufacturer. And a 65-inch model with basic HDR (static metadata) could be had from a Chinese brand for less than $700, while a Tier 1 HDR model of the same screen size was less than $900.

I mentioned wide color gamut earlier. It stands to reason that if a camera can capture a much wider range of luminance values, it can also record a much wider range of color shades. And that’s exactly what winds up happening. With the current 8-bit color system widely in use for everything from broadcast and cable television to Blu-ray discs and streaming media, a total of 16.7 million colors can be represented.

With HDR and WCG, the playing field is expanded considerably and now requires 10 bits per color, resulting in 1,073,741,824 colors – over 1 billion color shades! That’s too much heavy lifting for LCD displays that use white LEDs with color filters, but it’s within reach of quantum dot LCDs and OLEDs.
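The bit math, for the skeptical:

```python
# Total colors = 2 ^ (bits per component x 3 components).
for bits in (8, 10):
    print(f"{bits}-bit color: {2 ** (bits * 3):,} shades")
# 8-bit color: 16,777,216 shades     (~16.7 million)
# 10-bit color: 1,073,741,824 shades (~1.07 billion)
```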

The availability of HDR/WCG content has also forced a speed upgrade to display interfaces. HDMI 1.3/1.4 simply can’t handle a 4K HDR signal, so we must use HDMI 2.0 to do the job. And even version 2.0 is barely fast enough – if the 4K video signal uses lower color resolution (4:2:0, 4:2:2), then it can transport HDR signals as fast as 60 Hz. But switch to RGB (4:4:4) color mode – such as we’d see with 4K video from a computer video card – and HDMI 2.0 can’t pass a 60 Hz signal with anything more than 8-bit color.
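Here’s a simplified “will it fit?” check that reproduces those limits. The bits-per-pixel figures are a rule-of-thumb model – it ignores how HDMI actually packs 4:2:2 video – so treat it as a sketch, not a spec citation:

```python
# Simplified HDMI 2.0 capacity check for 4K/60 formats. The 18 Gb/s TMDS
# link delivers ~14.4 Gb/s of payload after 8b/10b coding, and 4K/60 has a
# 594 MHz pixel clock including blanking (per CTA-861).
HDMI20_PAYLOAD_BPS = 14.4e9
PIXEL_CLOCK_4K60 = 594e6

formats = {
    "8-bit RGB (4:4:4)":  24,   # bits per pixel
    "10-bit RGB (4:4:4)": 30,
    "10-bit 4:2:2":       20,
    "10-bit 4:2:0":       15,
}

for name, bpp in formats.items():
    needed = PIXEL_CLOCK_4K60 * bpp
    verdict = "fits" if needed <= HDMI20_PAYLOAD_BPS else "too fast"
    print(f"{name}: {needed / 1e9:.2f} Gb/s -> {verdict}")
# 8-bit RGB squeaks through at 14.26 Gb/s; 10-bit RGB (17.82 Gb/s) does not.
```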

On the DisplayPort side, things are somewhat more accommodating. Version 1.2 (the current one) can pass a 3840x2160p/60 signal with 10-bit RGB (4:4:4) color, but nothing more. The newest DP version – 1.3 – raises the maximum speed to 32.4 Gb/s, which makes 12-bit and even 16-bit 4K HDR content possible. However, version 1.4 is required to recognize the HDR ‘flags’ that travel with the content and must be passed on to the display. (HDMI handles HDR and WCG through version extensions: 2.0a added static HDR metadata, and 2.0b dynamic metadata.)

Marketing folks have a field day confusing people with new display tech, and apparently they’re going to town with HDR. We’re now hearing about “HDR-compatible” products, particularly in signal interfacing. Nothing to see here, folks – if the signal distribution and switching equipment is fast enough to pass the required clock rate and hands over the HDR metadata (CTA-861.3) to the display without alteration, then it is indeed “HDR compatible.” Simple as that.

I return to my original question: Will HDR have an impact on our industry? The answer is an emphatic “yes!” There are many customers that would realize a benefit from HDR imaging – medical, surveillance, military, research, virtual reality, and simulation verticals will embrace it pretty quickly, and others will follow.

InfoComm 2016 In The Rearview Mirror

Another InfoComm show has come and gone. This was my 23rd InfoComm, and it’s hard to believe, when I first set foot in Anaheim way back in 1994 – ostensibly to cover the now-defunct Projection Shoot-Out – that I’d still be making the treks to Orlando and Las Vegas, let alone teaching classes and joining the InfoComm faculty.

For this recap, I’ll focus on trends I saw at the show that will continue to impact our industry for some time to come. And there were plenty of them, everywhere you looked.

First off, I’ve been saying for several years now that software is becoming more important than hardware in our industry (and across all market segments – look at how inexpensive Ultra HDTVs have become already), and that we’d start to see less of a focus on expensive hardware and more of an emphasis on software and managed services.

And that’s exactly what I spotted in Las Vegas. Astute observers noticed that the once humongous booths set up by the likes of Sony, Panasonic, Crestron, LG, Samsung, Hitachi, and other companies have gotten a bit smaller. (NEC, Da-Lite, and Christie were exceptions to the rule.)

AMX, when it was a stand-alone company, used to have an enormous booth at the show (not to mention a huge party every year). Now, AMX is limited to a few small stands within the Harman booth.  Walk the show floor these days and you’ll recognize other once-mighty brands that have been acquired by holding companies and now occupy much smaller footprints.

And this trend shouldn’t be any surprise. When hardware used to sell for four and five figures (and in some cases, six figures), you could justify those million-dollar booths that looked like mini-malls. (Remember the huge tented Sanyo projector booths?) But that’s not the case anymore.

Kramer’s huge booth at InfoComm, touting a shift away from “big boxes” to software and the cloud, was one of the exceptions to the trend to go smaller.

LG is doing some very cool things with curved displays, thanks to advancements in OLED and LCD manufacturing.

Practically speaking, how much real estate do you need to talk about software programs and managed services? The same thing is happening at NAB, where once humongous companies like Harris (now Imagine) are largely touting services and not hardware.

Even Digital Projection has scaled back its enormous multi-tier InfoComm booth. And projectiondesign has shed some square footage since being acquired by Barco, which has itself gone on a square footage diet. Ditto Sharp, which had one of the smallest booths ever at this show, perhaps related to the company’s ongoing financial challenges.

Surprisingly, Toshiba showed there is indeed a second act, turning up with a nice-sized booth full of LCD monitors for tiled display walls. That’s not exactly an easy market to compete in, what with LG, Samsung, and NEC having a big footprint. But they’re giving it a shot.

Toshiba has re-entered the super-competitive world of display walls…a market they once dominated 20 years ago.

The “surfer dude engineers” from Santa Barbara have a very nice 4K-over-IP encoder/decoder line-up!

Another trend that’s really picking up speed is the move away from projection lamps to solid-state illumination systems, most often lasers with color phosphor wheels. The availability of large, inexpensive LCD displays has cut deeply into sales of projectors – particularly in small classrooms and meeting rooms, where we used to put in “hang and bang” projection systems.

If you talk to people who’ve made the switch away from projection to direct-view, the reason they most frequently cite is that they don’t have to change out lamps anymore, and the LCD displays can be used under normal room lighting and turn on instantly.

Well, projector manufacturers have gotten the message and are moving en masse to solid state light sources. Early adopters like Casio have reaped the benefits, but now everyone from Sony and Panasonic to Vivitek and Optoma is on board.

Even so, the corner wasn’t really turned until this year when Epson – one of the big manufacturers of projection lamps – showed a 25,000-lumen 3LCD projector powered by a laser light engine. And I saw more than one UHD-resolution projector using the laser-phosphor combination, even in ultra-short throw configurations.

Epson finally got religion and showed its first laser/phosphor 3LCD projector this year – a 25,000-lumen model.

And Panasonic harnessed laser/phosphor technology to a new high-brightness 4K projector.

How much longer will we be changing out lamps? I don’t think it will be more than a few years before the majority of projectors offered for sale will use laser or LED light engines (or both). There will be exceptions for certain models, but for all intents and purposes, short-arc lamps are toast.

Here’s another trend – LED walls. I tried to count all of the LED wall exhibitors at InfoComm and lost track after wandering through the North Hall. And just about every one of them was based in China, with names you would not recognize. Were they looking for U.S. dealer/distributor partners? It’s not likely many would pick up customers here, and that may be why Leyard (another Chinese manufacturer) bought Planar last year – everyone knows who Planar is.

I also saw LED walls with pitches as small as .9mm. That’s smaller than the pixel pitch of a 50-inch 1366×768 plasma monitor from 1995! And if anyone continues to go big with their booths, it’s the LED wall manufacturers. (Not like they have any choice!) Leyard’s 100’+ 8K LED wall was a perfect example of why bigger is still better when it comes to a booth.

And Sony’s CLEDIS 8K x 2K LED wall showed just how far this technology has come, creating what appeared to be a perfectly seamless, pixel-free panoramic LED wall that dazzled with bright, super-saturated color images.

Sony’s CLEDIS 8K x 2K LED wall did an excellent job of hiding its seams – and pixels.

Planar (Leyard) is building some amazingly big and bright display walls. And they’ve got 8K resolution, too, thanks to using 16 2K panels.

The Chinese dominance in LED displays shouldn’t be surprising. They’re moving to a similar level in the manufacturing of LCD panels, monitors, and televisions, undermining the Korean manufacturers (who undermined the Japanese, who took our U.S.-based television business away in the 1980s).

In fact, so much of our hardware is fabricated, soldered, and assembled in China and Southeast Asia these days that it should be no surprise prices have dropped as much as they have. Back in the day, a quality line doubler (remember those?) would set you back as much as $5,000 to $8,000. Today, you can buy a compact scaler that works to 1080p and Wide UXGA for a few hundred bucks.

My last trend has to do with the slow migration of video and audio signal distribution and switching away from hardware-intensive platforms based on display interface standards to software-based platforms that use IT switches, encoders, and decoders. Wow, did I spot a lot of those products at the show, even from some previously vigorous defenders of HDMI-based architectures.

The interest in learning how to move to an “open” IP-type AV distribution architecture must be considerable: I taught a class on AV-over-IP this year at InfoComm and was astounded to see that 185 people had signed up to attend. And there were very few no-shows, as I found out when I had attendees sitting on the floor and standing along the back wall for almost the entire 90-minute class.

You know there’s considerable interest in AV-over-IP when these guys show up.

RGB Spectrum’s new Zio AV-over-IP system has one of the most user-friendly interfaces I’ve seen to date – touch and swipe to connect video streams.

What’s more, a substantial portion of those attendees came from the higher education market segment, and an informal poll revealed that most of them were still upgrading from older analog systems to all-digital infrastructure. In essence, they were telling me that they preferred to skip past HDMI-based solutions and move directly to an IP-type solution.

Hand-in-hand with this discovery came more responses about transitioning to app-based AV control systems and away from proprietary, code-based control that requires specialized programming. Well, there were a few companies showing app-based AV control products in Vegas that had super-simple GUIs – software that just about anyone could learn to use in a few hours.

Throw in the accelerating transition to UHD resolution displays (they’ll largely replace Full HD within a year), and you have some very interesting times in store for the AV industry as this decade winds on…