Posts Tagged ‘4K’
InfoComm 2016 In The Rearview Mirror
- Published on Friday, 17 June 2016 12:48
- Pete Putman
Another InfoComm show has come and gone. This is my 23rd InfoComm and it’s hard to imagine when I first set foot in Anaheim way back in 1994 – ostensibly to cover the now-defunct Projection Shoot-Out – that I’d still be making the treks to Orlando and Las Vegas, let alone teaching classes and joining the InfoComm faculty.
For this recap, I’ll focus on trends I saw at the show that will continue to impact our industry for some time to come. And there were plenty of them, everywhere you looked.
First off: I’ve been saying for several years now that software is becoming increasingly important than hardware is in our industry (and across all market segments – look at how inexpensive Ultra HDTVs have become already), and that we’d start to see less of a focus on expensive hardware and more of an emphasis on software and managed services.
And that’s exactly what I spotted in Las Vegas. Astute observers noticed that the once humongous booths set up by the likes of Sony, Panasonic, Crestron, LG, Samsung, Hitachi, and other companies have gotten a bit smaller. (NEC, Da-Lite, and Christie were exceptions to the rule.)
AMX, when it was a stand-alone company, used to have an enormous booth at the show (not to mention a huge party every year). Now, AMX is limited to a few small stands within the Harman booth. Walk the show floor these days and you’ll recognize other once-mighty brands that have been acquired by holding companies and now occupy much smaller footprints.
And this trend shouldn’t be any surprise. When hardware used to sell for four and five figures (and in some cases, six figures), you could justify those million-dollar booths that looked like mini-malls. (Remember the huge tented Sanyo projector booths?) But that’s not the case anymore.
Practically speaking, how much real estate do you need to talk about software programs and managed services? The same thing is happening at NAB, where once humongous companies like Harris (now Imagine) are largely touting services and not hardware.
Even Digital Projection has scaled back its enormous multi-tier InfoComm booth. And projectiondesign has shed some square footage since being acquired by Barco, which has itself gone on a square footage diet. Ditto Sharp, which had one of the smallest booths ever at this show, perhaps related to the company’s ongoing financial challenges.
Surprisingly, Toshiba proved there is indeed a second act, showing up with a nice-size booth full of LCD monitors for tiled display walls. That’s not exactly an easy market to compete in, what with LG, Samsung, and NEC having a big footprint. But they’re giving it a shot.
Another trend that’s really picking up speed is the move away from projection lamps to solid-state illumination systems, most often lasers with color phosphor wheels. The availability of large, inexpensive LCD displays has cut deeply into sales of projectors – particularly in small classrooms and meeting rooms, where we used to put in “hang and bang” projection systems.
If you talk to people who’ve made the switch away from projection to direct-view, the reason they most frequently cite is that they don’t have to change out lamps anymore, and the LCD displays can be used under normal room lighting and turn on instantly.
Well, projector manufacturers have gotten the message and are moving en masse to solid state light sources. Early adopters like Casio have reaped the benefits, but now everyone from Sony and Panasonic to Vivitek and Optoma is on board.
Even so, the corner wasn’t really turned until this year when Epson – one of the big manufacturers of projection lamps – showed a 25,000-lumen 3LCD projector powered by a laser light engine. And I saw more than one UHD-resolution projector using the laser-phosphor combination, even in ultra-short throw configurations.
How much longer will we be changing out lamps? I don’t think it will be more than a few years before the majority of projectors offered for sale will use laser or LED light engines (or both). There will be exceptions for certain models, but for all intents and purposes, short-arc lamps are toast.
Here’s another trend – LED walls. I tried to count all of the LED wall exhibitors at InfoComm and lost track after wandering through the North Hall. And just about every single one was based in China, with names you would not recognize. Were they looking for U.S. dealer/distributor partners? It’s not likely many would pick up customers here, and that may be why Leyard (another Chinese manufacturer) bought Planar last year – everyone knows who Planar is.
I also saw LED walls with pixel pitches as small as 0.9mm. That’s smaller than the pixel pitch of a 50-inch 1366×768 plasma monitor from 1995! And if anyone continues to go big with their booths, it’s the LED wall manufacturers. (Not like they have any choice!) Leyard’s 100’+ 8K LED wall was a perfect example of why bigger is still better when it comes to a booth.
And Sony’s Cledis 8Kx2K LED wall shows just how much farther we’ve come with this technology, creating what appeared to be a perfectly seamless, pixel-free panoramic LED wall that dazzled with bright, super-saturated color images.
The Chinese dominance in LED displays shouldn’t be surprising. They’re moving to a similar level in the manufacturing of LCD panels, monitors, and televisions, undermining the Korean manufacturers (who undermined the Japanese, who took our U.S.-based television business away in the 1980s).
In fact, so much of our hardware is fabricated, soldered, and assembled in China and Southeast Asia these days that it should be no surprise prices have dropped as much as they have. Back in the day, a quality line doubler (remember those?) would set you back as much as $5,000 to $8,000. Today, you can buy a compact scaler that works to 1080p and Wide UXGA for a few hundred bucks.
My last trend has to do with the slow migration of video and audio signal distribution and switching away from hardware-intensive platforms based on display interface standards to software-based platforms that use IT switches, encoders, and decoders. Wow, did I spot a lot of those products at the show, even from some previously-vigorous defenders of HDMI-based architectures.
The interest in learning how to move to an “open” IP-type AV distribution architecture must be considerable: I taught a class on AV-over-IP this year at InfoComm and was astounded to see that 185 people had signed up to attend. And there were very few no-shows, as I found out when I had attendees sitting on the floor and standing along the back wall for almost the entire 90-minute class.
What’s more, a substantial portion of those attendees came from the higher education market segment, and an informal poll revealed that most of them were still upgrading from older analog systems to all-digital infrastructure. In essence, they were telling me that they preferred to skip past HDMI-based solutions and move directly to an IP-type solution.
Hand-in-hand with this discovery came more responses about transitioning to app-based AV control systems and away from proprietary, code-based control that requires specialized programming. Well, there were a few companies showing app-based AV control products in Vegas that had super-simple GUIs; software that just about anyone could learn to use in a few hours.
Throw in the accelerating transition to UHD resolution displays (they’ll largely replace Full HD within a year), and you have some very interesting times in store for the AV industry as this decade winds on…
AV-over-IP: It’s Here. Time To Get On Board!
- Published on Friday, 03 June 2016 14:21
- Pete Putman
At InfoComm next week in Las Vegas, I look forward to seeing many familiar faces – both individuals and manufacturers – that have frequented the show since I first attended over 20 years ago. And I also expect to find quite a few newcomers, based on the press releases and product announcements I’ve been receiving daily.
Many of those newcomers will be hawking the latest technology – AV-over-IP. More specifically, transporting video, audio, and metadata that are encoded into some sort of compressed or lightly-compressed format, wrapped with IP headers, and transported over IP networks.
This isn’t exactly a new trend: The broadcast, telecom, and cable/satellite worlds have already begun or completed the migration to IT infrastructures. The increasing use of optical fiber and lower-cost, fast network switches are making it all possible. Think 10 gigabit Ethernet with single-mode fiber interconnections, and you can see where the state-of-the-art is today.
You’ve already experienced this AV-over-IP phenomenon if you watch streaming HD and 4K video. Home Internet connection speeds have accelerated by several orders of magnitude ever since the first “slow as a snail” dial-up connections got us into AOL two decades ago. Now, it’s not unusual to have sustained 10, 15, 25, and even 50 megabit per second (Mb/s) to the home – fast enough to stream Ultra HD content with multichannel sound.
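To put some numbers on that, here’s a quick back-of-envelope sketch. The per-tier bitrates are my own illustrative assumptions, not any provider’s published figures (Ultra HD streaming is commonly quoted somewhere in the 15–25 Mb/s range):

```python
# Which streaming tiers fit a given home connection? Bitrates below
# are illustrative assumptions, not any service's published numbers.
STREAM_BITRATES_MBPS = {
    "SD": 3,
    "HD 1080p": 5,
    "Ultra HD": 25,
}

def tiers_supported(connection_mbps, headroom=0.8):
    """Return tiers whose bitrate fits within a fraction of the link."""
    usable = connection_mbps * headroom  # leave margin for other traffic
    return [tier for tier, rate in STREAM_BITRATES_MBPS.items()
            if rate <= usable]

for speed in (10, 25, 50):
    print(f"{speed} Mb/s -> {tiers_supported(speed)}")
```

Run that and you’ll see why 50 Mb/s to the home is the tipping point: it’s the first tier with enough headroom left over to stream Ultra HD while the rest of the household stays online.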
And so it goes with commercial video and audio transport. Broadcast television stations had to migrate to HD-SDI starting nearly 20 years ago when the first HDTV broadcasts commenced. (Wow, has it really been that long?) Now, they’re moving to IP and copper/fiber backbones to achieve greater bandwidth and to take advantage of things like cloud storage and archiving.
So why hasn’t the AV industry gotten with the program? Because we still have a tendency to cling to old, familiar, and often outdated or cumbersome technology, rationalizing that “it’s still good enough, and it works.” (You know who you are…still using VGA and composite video switching and distribution products…)
I’ve observed that there is often considerable and continual aversion in our industry to anything having to do with IT networks and optical fiber. And it just doesn’t make any sense. Maybe it originates from a fear of losing control to IT specialists and administrators. Or, it could just be a reluctance to learn something new.
The result is that we’ve created a monster when it comes to digital signal management. Things were complicated enough when the AV industry was dragged away from analog to digital and hung its hat on the HDMI consumer video interface for switching and distribution. Now, that industry has created behemoth switch matrices to handle the current and next flavors of HDMI (a format that never was suitable for commercial AV applications).
We’ve even figured out a way to digitize the HDMI TMDS signal and extend it using category wire, up to a whopping 300 feet. And somehow, we think that’s impressive? Single-mode fiber can carry an HD video signal over 10 miles. Now, THAT’S impressive – and it’s not exactly new science.
So, now we’re installing ever-larger racks of complex HDMI switching and distribution gear that is expensive and also bandwidth-capped – not nearly fast enough for the next generation of UHD+ displays with full RGB (4:4:4) color, high dynamic range, and high frame rates. How does that make any sense?
What’s worse, the marketing folks have gotten out in front, muddying the waters with all kinds of nonsensical claims about “4K compatibility,” “4K readiness,” and even “4K certified.” What does that even mean? Just because your switch or DA product can support a very basic level of Ultra HD video with slow frame rates and reduced color resolution, it’s considered “ready” or “certified?” Give me a break.
Digitizing HDMI and extending it 300 feet isn’t future-proof. Neither is limiting Ultra HD bandwidth to 30 Hz 8-bit RGB color, or 60 Hz 8-bit 4:2:0 color. Not even close. Not when you can already buy a 27-inch 5K (yes, 5K!) monitor with 5120×2880 resolution and the ability to show 60 Hz 10-bit color. And when 8K monitors are coming to market.
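A little arithmetic shows why those bandwidth caps matter. This sketch computes the raw pixel payload of a few display modes – active pixels only, so real interfaces (which also carry blanking and line-coding overhead) need even more:

```python
def uncompressed_gbps(h, v, fps, bits_per_component, components=3):
    """Raw active-pixel payload in Gb/s (blanking intervals ignored)."""
    return h * v * fps * bits_per_component * components / 1e9

modes = {
    "UHD, 30 Hz, 8-bit RGB":  (3840, 2160, 30, 8),
    "UHD, 60 Hz, 10-bit RGB": (3840, 2160, 60, 10),
    "5K, 60 Hz, 10-bit RGB":  (5120, 2880, 60, 10),
}
for name, args in modes.items():
    print(f"{name}: {uncompressed_gbps(*args):.1f} Gb/s")
```

That 5K monitor alone needs roughly 26 Gb/s of payload – several times what a “4K ready” HDMI 1.4 switcher can pass.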
So why do we keep playing tricks with specifications and settling for Band-Aid solutions? We shouldn’t. We don’t need to. And the answer is already at hand.
It’s time to move away from the concept of big, bulky, expensive, and basically obsolete switching and distribution hardware that’s based on a proprietary consumer display interface standard. It’s time to move to a software-based switching and distribution concept that uses an IT structure, standard codecs like JPEG2000, M-JPEG, H.264, and H.265, and everyday off-the-shelf switches to move signals around.
Now, we can design a fast, reliable AV network that allows us to manage available bandwidth and add connections as needed. Our video can be lightly compressed with low latency, or more highly compressed for efficiency. The only display interfaces we’ll need will be at the end points where the display is connected.
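The bandwidth-management idea boils down to simple addition: on an IP backbone, every source is just a stream with a known bitrate. A minimal sketch, with bitrates I’ve assumed purely for the sake of the example:

```python
# Capacity planning on an IP backbone is just addition: each source
# is a stream with a known bitrate. All bitrates are assumptions.
LINK_CAPACITY_MBPS = 10_000  # a 10 GbE backbone

streams = {
    "lecture camera (H.264, 1080p)": 10,
    "presenter PC (JPEG2000, lightly compressed)": 800,
    "overflow-room feed (H.265, UHD)": 25,
    "digital signage loop (M-JPEG)": 40,
}

used = sum(streams.values())
print(f"{used} of {LINK_CAPACITY_MBPS} Mb/s in use "
      f"({used / LINK_CAPACITY_MBPS:.1%} of the backbone)")
```

Even with one lightly-compressed, low-latency feed in the mix, that backbone is barely 9% occupied – which is exactly the “add connections as needed” flexibility a matrix switch can’t offer.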
Even better, our network also provides access to monitoring and controlling every piece of equipment we’ve connected. We can design and configure device controls and interfaces using cloud-based driver databases. We can access content from remote servers (the cloud, again) and send it anywhere we want. And we can log in from anywhere in the world to keep tabs on how it’s all functioning.
And if we’re smart and not afraid to learn something new, we’ll wire it all up with optical fiber instead of bulky copper cables – no more boxes of transmitters and receivers converting signals to another format and back. (Guess what? AV-over-IP is already digital! You can toss out those distance-limited HDMI extenders, folks!)
For those who apparently haven’t gotten the memo, 40 Gb/s network switches have been available for a few years, with 100 Gb/s models now coming to market. So much for speed limit issues…
To the naysayers who claim AV-over-IP won’t work as well as display interface switching: That’s a bunch of hooey. How are Comcast, Time Warner, NBC, Disney, Universal, Netflix, Amazon, CBS, and other content originators and distributors moving their content around? You guessed it.
AV-over-IP is what you should be looking for as you walk the aisles of the Las Vegas Convention Center, not new, bigger, and bulkier HDMI/DVI matrices. AV-over-IP is the future of our industry, whether we embrace it or are dragged into it, kicking and screaming.
Are you on board, or what?
CES 2016: Some Second Thoughts
- Published on Thursday, 28 January 2016 13:21
- Pete Putman
We’re almost a month removed from the 2016 International CES, which was quite the crowded bazaar of electronic gadgets. I’ve already reported on what I saw at the show; now, I want to take a few minutes to do some “Monday morning quarterbacking.”
Quarterly reports this week from two of the CE world’s titans – Apple and Samsung – aren’t very rosy. In fact, both companies are predicting a slowdown in sales of smartphones, arguably the hottest CE category of the past six years (even more so than televisions). Although shipments of smartphones are predicted to rise this year, consumer demand for them is in decline.
That shouldn’t be surprising. I bought a Samsung Galaxy V in December of 2014 and it’s still serving me well. In fact, it can do more things than I need, so I’m not likely to replace it when my service contract expires this coming December. (Yep, I’m one of a dying breed of two-year service contract holders!) And I suspect that many other smartphone owners feel the same way.
Tablets were also supposed to be hot prospects for 2015, with some analysts predicting 18% year-to-year growth. Yet tablet shipments actually went into decline, while sales of laptop computers exceeded predictions. Once again, if you have a tablet that’s a couple of years old, there’s no real reason to replace it unless the battery goes dead.
The only drawback with some of these products is inadequate storage capacity. Most phones and tablets start with 16 GB of storage, expandable with microSD cards. Yet, given how quickly apps and downloads can gobble up that space, it’s wiser to start with 32 GB and maybe even 64 GB these days. After all, flash storage is cheap (unless you buy it from Apple).
So – mobile devices aren’t providing the stellar sales and returns we all hoped for. How about televisions?
There’s no question that shipments and sales of 1080p TVs are in a slow decline, and have been for a few years. Practically speaking, if you bought a big (46” and larger) “smart” Full HD LCD TV in the past five years, you already have fast Wi-Fi connectivity, Netflix and possibly Amazon streaming, and three or four HDMI inputs – most of which you’re probably not using, if you stream video.
So why would you shell out money for a new Full HDTV? You wouldn’t, except that you can now buy a much larger screen for the money. But that’s not what’s happening – people are opting to move up to Ultra HD resolution, as the prices for these sets have just about reached parity with same-size Full HDTVs. And not surprisingly, Ultra HDTV sales have been strong and are growing by double digits each year. Still a small portion of overall TV shipments, but essential to the bottom line of Samsung (37% UHDTV market share through June 2015), LG (17% share), and Sony (10% share).
What’s new this year is a stronger presence from China Inc. brands, notably TCL and Hisense. The former acquired the Sanyo brand and factory from Panasonic, while the latter now owns Sharp’s US TV business and a former assembly plant in Mexico.
Excepting Ultra HDTV, it’s very difficult to make any money in the TV biz these days. What we’re seeing is more manufacturing and display panel sourcing from China, as the quality of LCD panels for TVs made at BOE, CSOT, Hisense, and TCL is very good. (And they’re cranking out Ultra HD panels, too.)
2016 will be the year that OLED TV technology finally goes mainstream. LG has placed some big bets on their white OLED / RGBW process and is also selling OLED panels to five of the largest Chinese TV manufacturers. Prices continue to fall stateside; LG just announced a Super Bowl promotion through February 13 that will snag you a 55-inch Full HD curved set for $1,999 and a flat or curved 55-inch Ultra HD model for $2,999.
OLEDs are already in wide use in smartphones and tablets (both my Samsung tablet and smartphone use them) and we’re seeing them in smart watches, too. LG Display’s demonstrations of super-curved, warped, and roll-up OLED displays at CES show the promise of this technology for mobile displays, particularly in transportation applications.
For displays, we can expect more of the same in 2016 – ever-larger TVs at lower prices as retailers try to stir up sales of what has become a disposable commodity. You can buy a 50-inch Hisense Full HD set now for $399, amazingly, and 42-inch TVs are getting ever closer to the $200 price barrier.
So what’s going to change? It will take a while, but the 60 GHz wireless technology demos I saw in Las Vegas are very promising. Imagine streaming Ultra HD content with high dynamic range from your Ultra HD Blu-ray player to your 65-inch 4K OLED without cables. Or showing video clips from your phone or tablet the same way.
Better yet, how about downloading an HD movie before you travel in just 5 to 10 seconds? It’s possible with the new 60 GHz 802.11ad protocol, as demonstrated by Qualcomm with a bumper crop of tri-band (2.4/5/60 GHz) modems at CES, and a suitably-equipped phone or tablet. This one’s a game-changer, but I don’t think you’ll see many products with this feature until a year from now. Peraso’s aftermarket 60 GHz USB wireless links might help, as they can retrofit to any laptop or desktop computer.
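The math roughly checks out. Here’s a sketch assuming a ~4 GB HD movie file, taking 802.11ad’s quoted ~4.6 Gb/s peak channel rate and derating it by an assumed 80% protocol efficiency (both the file size and the efficiency figure are my assumptions):

```python
def download_seconds(file_gbytes, link_gbps, efficiency=0.8):
    """Transfer time, derating the raw link rate for protocol overhead."""
    return file_gbytes * 8 / (link_gbps * efficiency)

# A ~4 GB HD movie over an 802.11ad link at its ~4.6 Gb/s peak
# channel rate; 80% efficiency is an assumed figure.
t = download_seconds(4, 4.6)
print(f"~{t:.0f} seconds")
```

Call it single-digit seconds under good conditions – a different universe from the minutes-to-hours downloads we’re used to over 2.4 and 5 GHz Wi-Fi.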
The other category you’ll want to keep your eye on is the Internet of Things. It seems like every gadget has an IP address and can be controlled by an app. Throw in Wi-Fi, and you have home security systems you can install yourself for about $250. Or wireless doorbell cameras, or LED bulbs that double as cameras and motion detectors. (And even alarms that monitor your alarms.)
This continual downward pricing pressure (again, led by Chinese manufacturing) will shift profitability away from hardware to software. Verizon Wireless, the last company to abandon annual service contracts, doesn’t really care what you do on your phone. They just want that recurring monthly revenue stream that you generate. (Notice how nobody charges for voice calling and texting anymore, just blocks of data? The increasing use of Wi-Fi for smartphone connectivity has a lot to do with it.)
I’ve said it before, and I’ll say it again: “Hardware is cheap, and anyone can make it.” Software and services are where the growth lies as we enter the second half of this decade. You’ll see just how low prices will fall a year from now, when you’ll be able to buy a fully-featured smartphone for $300, score a 65-inch Ultra HD “smart” TV with HDR and WCG support for $800, and pick up a 4K “action” camera for less than $150.
May you live in interesting times!
Display Interfacing: Welcome to Babylon
- Published on Thursday, 10 September 2015 13:53
- Pete Putman
For many years, ‘interfacing’ a video signal meant plugging in a yellow RCA or silver BNC connector that carried composite video. As picture resolution went up, computers became commonplace at work and home, and the term ‘progressive scan’ entered the lexicon, we saw the birth of S-video and then component video (YPbPr and RGB).
So we adapted, building switching and distribution gear that could handle one-, two-, and three-wire formats. All was well and good…until ‘digital’ made its grand entrance about 15 years ago.
Now, we have digital versions of component and RGB video, starting with the Digital Video Interface (DVI) and moving to High Definition Multimedia Interface (HDMI), DisplayPort, and the new superMHL interface that (according to the MHL Alliance) will start appearing on televisions as soon as December.
If I’m a consumer, I mostly don’t care about any of this. As long as I can plug in my set-top box, Blu-ray player, and other gadgets with the right cables I can find at Best Buy, this is just a bunch of alphabet soup.
However, if I’m an integrator (consumer or commercial), then I care VERY much about where all of this is heading. And if I’m paying any attention at all to the growing market for 4K and UHD, then I’m rightfully concerned about the impending problems with interfacing these signals.
*Even though HDMI 2.0 was announced in September of 2013 – TWO FULL YEARS AGO – virtually no manufacturer in the pro AV space supports this interface on their switchers and distribution amplifiers. Instead, the vast majority are still providing version 1.4 while claiming these products are “4K compatible” or “4K ready” because version 1.4 is just fast enough to pass an Ultra HD (3840×2160) signal at 30 Hz with 8-bit RGB color. That’s setting the bar kinda low, isn’t it?
*Some computer manufacturers don’t even support HDMI, like Apple (DisplayPort) and Lenovo (also DisplayPort). So, now you have to carry dongles everywhere you go?
*HDMI 2.0 arrives hand-in-hand with a new version of copy protection (HDCP 2.2) which is much more rigorous than versions 1.3 and 1.4. If a valid HDCP key exchange isn’t made within 20 milliseconds, the connection will shut down. Period.
*HDMI 2.0 isn’t fast enough for what UHD is turning out to be – a real departure from 1080p and Wide UXGA, with a move to 10-bit color to support high dynamic range (HDR), wide color gamuts (WCG), and high frame rates (HFR). DisplayPort 1.2 can barely support these requirements; DP version 1.3 and superMHL are better positioned to handle the job.
*The intellectual property behind HDMI and superMHL is owned by the same company – Lattice Semiconductor – and whereas once there were clear dividing lines between the two interfaces (MHL was designed originally for smartphones and tablets), they are now competing against each other. I’ve even sat in on presentations where it was explained that both could exist on consumer TVs. (And why would that make sense, again, when neither interface has been widely deployed to date, and one is clearly an improvement over the other?)
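To see the numbers behind the bandwidth point, here’s a quick sketch comparing effective (post-line-coding) interface data rates against what UHD at 60 Hz with 10-bit RGB color actually demands. The interface figures come from the published specs; the mode requirement counts active pixels only, so the real need is somewhat higher once blanking is included:

```python
# Effective (post-8b/10b line coding) data rates in Gb/s.
INTERFACES = {
    "HDMI 1.4": 8.16,
    "HDMI 2.0": 14.4,
    "DisplayPort 1.2 (HBR2)": 17.28,
    "DisplayPort 1.3 (HBR3)": 25.92,
}

def payload_gbps(h, v, fps, bits_per_component):
    """Active-pixel RGB payload only; blanking pushes the real need higher."""
    return h * v * fps * bits_per_component * 3 / 1e9

need = payload_gbps(3840, 2160, 60, 10)  # UHD at 60 Hz, 10-bit RGB
for name, rate in INTERFACES.items():
    verdict = "fits" if rate >= need else "too slow"
    print(f"{name}: {verdict} ({rate} Gb/s vs ~{need:.1f} Gb/s needed)")
```

HDMI 2.0 comes up short of even the active-pixel payload, DisplayPort 1.2 just barely clears the bar, and DP 1.3 has genuine headroom – which is exactly the picture painted above.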
You can imagine what this trend is doing to product designers and manufacturers. Sure, HDMI is a “safe bet” for now, but what if our UHD needs quickly outstrip its maximum clock speed? DP is certainly faster and there appears to be more support for it from computer manufacturers. But superMHL is faster still. Shouldn’t your interfaces at least have a head start on display manufacturers?
This reliance on HDMI has led several manufacturers into a potential trap, investing heavily in signal distribution architectures that may quickly run into a “future-proofing” problem. In contrast, outside the commercial AV industry, everyone from cable TV system operators to broadcasters and telecom operators is busy migrating to an IP-based architecture.
Not only does IP-based architecture have the advantage of being a relatively open system, it also solves many of the speed issues as 1-gigabit and 10-gigabit networks are becoming more commonplace. (Heck, Comcast just upgraded my home Internet speeds to 75 Mb/s on downloads, which is more than fast enough for me to stream 4K content from Netflix and Amazon!)
So, why don’t we do the same in the commercial AV industry? It’s not for a lack of products – there are several companies offering AV-over-IP transmitters and receivers, along with encoders and decoders. I’ve also seen impressive demos of “middleware” used to locate, switch, and play out media assets over IP networks. All of these guys were at InfoComm 2015.
The big players in HDMI-based switching and distribution argue against AV-over-IP for in-room and short-run signal distribution, citing latency and compression issues. Well, we now have a new codec (HEVC H.265) to handle that end of things, and it’s possible to stream video at high resolutions with low latency. (How does 1920x1080p/60 at 1 to 2 Mb/s sound to you? Thought so.)
High latency is often the result of over-compression and heavy forward error correction (FEC). But if video and audio assets are streaming on bandwidth-managed, private IP networks, there isn’t a lot of forward error correction required. Group of Pictures (GOP) sizes can also be kept short to reduce latency. So latency is sort of a “straw man” argument. (And HDMI 2.0 will have plenty of issues with HDCP 2.2, trust me. Talk about latency…)
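The encoder side of that argument is easy to quantify: latency from buffering is roughly the number of frames the encoder holds divided by the frame rate. The frame counts below are illustrative assumptions about encoder tuning, not measurements of any particular product:

```python
def buffer_latency_ms(frames_buffered, fps):
    """Encoder-side delay contributed by buffered frames alone."""
    return frames_buffered / fps * 1000

# Frame counts are illustrative assumptions about encoder tuning.
for name, frames in [("intra-only / low-delay", 1),
                     ("short lookahead", 4),
                     ("long lookahead with B-frames", 30)]:
    print(f"{name}: ~{buffer_latency_ms(frames, 60):.0f} ms at 60 fps")
```

A low-delay profile buffering a single frame at 60 fps adds well under a frame-time of lag – hardly the deal-breaker the HDMI camp makes it out to be.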
As for copy protection: video and audio assets streaming over IP connections have their own security protocols. Practically speaking, what could be more secure than video content streaming directly into a UHDTV through an Ethernet connection? And you don’t even have to plug in a cable to make it work, unless you use a wired Ethernet hookup. Bandwidth issues? Well, how about 5 GHz 802.11ac channel-bonding routers? I’m getting 70+ Mb/s download speeds from mine with wired connections, and 25–30 Mb/s at some distance from my 5 GHz wireless link.
Again, looking outside our industry, the two most common signal distribution and switching architectures are based on HD-SDI or IP (or both). Not HDMI, and certainly not HDMI-derived, structured-wire systems like HDBaseT. If the rest of the world wants to multiplex video, audio, metadata, and other low bitrate control signals, they do it over optical fiber. (Did you know that multimode fiber is cheaper than Category 6 wire?)
I’ll wrap things up by saying that the smart move is for commercial AV integrators to move to an AV-over-IP signal distribution system at the core, like everyone else, leaving the HDMI, DisplayPort, superMHL, and “whatever comes next” connections for the far ends, near the displays (if those far-end conversions are even needed at all).
Leave the core as a high-speed, copper bus or optical bus, software-based switcher. If there’s enough bandwidth (and there should be), that system can also carry local TCP/IP traffic, SMTP alerts from connected devices, and control signals to all devices. Not only does this approach free everyone from the “closed world” paradigm of HDMI, it also makes the system infinitely more appealing to end-users and facility administrators, an increasing number of whom come from the IT world.
Consider this ad that was posted recently on a listserv for higher education:
“We are looking for an experienced AV-IT Engineer for the role of Technical Guru. The position will provide planning and support for AV-IT systems used in teaching and learning spaces big and small. The person in this position will focus on design, installation, and troubleshooting of AV-IT systems in a variety of venues, including traditional classrooms, active learning classrooms, large auditoria, computer labs, and even Makerspaces…We are looking for a seasoned professional with a solid background in AV-IT systems. This is a great opportunity for a doer who is excited about not just maintaining but also shaping the future of AV-IT technology as a key element of the teaching mission of one of the world’s top universities.”
I rest my case. It’s time for the commercial AV industry to get in step with the rest of the world and move to AV-over-IP signal distribution.
Wake up. Have you smelled the coffee yet?
NAB In The Rear View Mirror
- Published on Friday, 24 April 2015 21:28
- Pete Putman
It’s been over a week since I got back from Las Vegas and edited all of my photos and videos. But once again, NAB scored big numbers with attendance and there were enough goodies to be found in all three exhibit halls, if you were willing to put in the time to pound the pavement. Over 100,000 folks made their way to the Las Vegas Convention Center to see endless demos of streaming, drones, 4K cameras and post-production, and H.265 encoders.
We were also treated to a rare haboob, or dust storm, which blew through town late Tuesday afternoon and blotted out the sun, leaving a fine dusting of sand particles on everything (and in everyone’s hair, ears, and eyes.) While most of the conferences and presentations tend to be somewhat predictable, the third day of the show featured the notorious John McAfee (yes, THAT John McAfee) as the keynote speaker at the NAB Technology Luncheon. Escorted by a security detail, McAfee walked up on stage and proceeded to warn everyone about the security risks inherent in loading apps onto phones and tablets. (Come to think of it, why does a flashlight app for my phone need permission to access my contact list and my camera?)
Some readers may remember the Streaming Video pavilion in the Central Hall at this show back in 1999. There, dozens of small start-up companies had booths showing how they could push 320×240-resolution video (“dancing postage stamps”) over 10 megabit and 100 megabit Ethernet connections, and not always reliably. (And not surprisingly, most of those companies were gone a year later.)
Today, companies like Harmonic, Elemental, Ericsson, Ateme, and the Fraunhofer Institute routinely demonstrate 4K (3840×2160) video through 1GigE networks at a data rate of 15 Mb/s, using 65-inch and 84-inch 4K video screens to demonstrate the picture quality. 4K file storage and editing “solutions” are everywhere, as are the first crop of reference-quality 4K displays using LCD and OLED technology.
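It’s worth appreciating how much compression that 15 Mb/s figure implies. Assuming 60 fps and 8-bit 4:2:0 as the uncompressed baseline (an assumption on my part – demo frame rates varied):

```python
# Implied compression ratio for "4K at 15 Mb/s." Baseline assumptions:
# 3840x2160 at 60 fps, 8-bit 4:2:0 (which averages 12 bits per pixel).
h, v, fps, bits_per_pixel = 3840, 2160, 60, 12
raw_mbps = h * v * fps * bits_per_pixel / 1e6
ratio = raw_mbps / 15
print(f"raw: {raw_mbps:.0f} Mb/s, compression ratio ~{ratio:.0f}:1")
```

Squeezing roughly 6 Gb/s of raw video down to 15 Mb/s – on the order of 400:1 – while keeping the picture presentable on an 84-inch screen is a long, long way from those dancing postage stamps.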
In some ways, the NAB show resembles InfoComm. Many of the exhibitors at NAB have also set up shop at InfoComm, waiting for the pro AV channel to embrace digital video over IP networks. (It’ll happen, guys. Just be patient.) In the NAB world, video transport over IP using optical fiber backbones is quite the common occurrence, although it’s still a novelty to our world. (Haven’t you heard? Fiber is good for you!)
I spent three and a half days wandering around the aisles in a daze, but managed to find some gems among the crowds. Here were some highlights:
Blackmagic Design drew a crowd to see its Micro Cinema Camera, and it is indeed tiny. The sensor size is Super 16 (mm) and is capable of capturing 13 stops of light. RAW and Apple ProRes recording formats are native, and Blackmagic has also included an expansion port “…featuring PWM and S.Bus inputs for airplane remote control.” (Can you say “drone?”) And all of this for just $995…
RED’s booth showed the prototype of a new 8K (7680×4320) camera body. The current Dragon sensor captures 6K video from 1 to 100 frames per second, and in 4K (3840×2160) mode it can record footage as fast as 150 frames per second. (Both of these are in RAW mode.) Data transfer (write) speed was listed at 300 Mb/s, and the camera has built-in wireless connectivity.
Arri showed a 65mm digital camera, resurrecting a format that goes back to the 1950s. The actual resolution of the camera sensor is 5120×2880, or “5K” as Arri calls it. This sensor size is analogous to the old 6 cm x 6 cm box cameras made by Rollei and Yashica, and there is quite a bit of data flowing from this camera when it records! (Can you say “terabytes of storage”?)
Drones dominated the show, with powerhouse DJI setting up in the central hall and an entire section of the rear south hall devoted to a drone “fly-off” competition. Nearby, a pavilion featured nothing but drones, cameras, accessories, and even wireless camera links such as Amimon’s Connex 5 GHz system. (You may recognize this as a variant of the company’s WHDI wireless HDMI product.)
Sony had side-by-side comparisons of standard dynamic range (SDR) and high dynamic range (HDR) footage using their new BVM-X300 30-inch HDR OLED display. This is the third generation of OLED reference monitor products to come out of the Sony labs, and it’s a doozy with 4096×2160 resolution (quad-link 3G-SDI up to 4096×2160/48p/50p/60p) and coverage of at least the DCI-P3 color space. The monitor can also reproduce about 80% of the new BT.2020 color gamut. Peak brightness (scene to scene) is about 800 nits, and color reproduction is very accurate with respect to flesh tones and pastels.
Canon also took the wraps off a new reference monitor. The DP-V2410 4K reference display has 4096×2160 pixels of resolution (the DCI 4K standard) and uses an IPS LCD panel that is capable of showing high dynamic range (HDR), usually defined as at least 15 stops of light. It supports the ITU BT.2020 color space, can upscale 2K content to 4K, and will run off 24 volts DC for field use.
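If you’re wondering what “15 stops” actually buys you: each stop doubles the light, so stops map to contrast ratio as a power of two. A minimal sketch of the arithmetic:

```python
# Each photographic "stop" is a doubling of light, so N stops of
# dynamic range correspond to a 2**N ratio between the brightest
# and darkest distinguishable levels.
def stops_to_contrast(stops: int) -> int:
    return 2 ** stops

print(stops_to_contrast(15))   # 32768 -> roughly a 32,768:1 contrast ratio
```

That’s a far cry from the few-hundred-to-one range that ordinary SDR video was graded for.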
Panasonic unveiled their first laser-powered 3-chip DLP projector, and it’s another doozy. The Panasonic guys lit up a 10-foot diagonal screen with 12,000 lumens at WUXGA (1920×1200) resolution from the PT-RZ12KU. It uses a blue laser to excite a yellow-green phosphor wheel and create white light, which is then split into red, green, and blue light for imaging. The projector weighs just 95 pounds, and the demo used an ultra-short-throw lens positioned about 12” – 16” in front of the screen.
Fine-pitch indoor and outdoor LED displays are a growing market. Both Leyard and Panasonic showed large LED displays with 1.6mm dot pitch, which isn’t much larger than what you would have found on a 768p-resolution plasma display from 15 years ago. The color quality and contrast on these displays was quite impressive, and you have to stand pretty close to notice the pixel structure, unlike the more commonly used 6mm and 10mm pitches for outdoor LED displays. Brightness on these displays is in the thousands of nits (talk about high dynamic range!).
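Pixel pitch translates directly into how big a wall has to be to reach a given resolution, since each LED cluster is one pixel. A rough sizing sketch (the 1920-wide target is my own example, not a spec from the show floor):

```python
# Fine-pitch LED sizing: pitch (mm between pixel centers) times the
# horizontal pixel count gives the physical width of the wall.
def wall_width_m(h_pixels: int, pitch_mm: float) -> float:
    return h_pixels * pitch_mm / 1000.0

# A full-HD-wide (1920-pixel) wall at 1.6 mm pitch:
print(f"{wall_width_m(1920, 1.6):.2f} m")   # 3.07 m, roughly 10 feet wide
```

Run the same numbers at a 10mm outdoor pitch and you’d need a wall over 19 meters wide for the same pixel count — which is why fine pitch is what finally makes LED walls viable indoors.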
Speaking of HDR, Dolby had a demonstration in its booth of new UHDTVs from Vizio that incorporate Dolby’s version of high dynamic range. Vizio showed a prototype product a year ago at CES and it now appears close to delivery. The target brightness for peak white will be well over 1000 nits, but the challenge for any LCD panel is being able to show extremely low levels of gray – near black.
Vitec had what may be the world’s first portable HEVC H.265 encoder, the MGW Ace. Unlike most of the H.265 demos at the show, this product does everything in hardware with a dedicated H.265 compression chip (most likely from Broadcom). And it is small, at about ¾ of a rack wide. Inputs include 3G-SDI, composite video (yep, that’s still around), HDMI, and DVI, with support for embedded and serial digital audio. Two Ethernet ports complete the I/O complement.
Over in the NTT booth, the company was demonstrating “the first H.265 HEVC encoder ever to perform 4K 4:2:2 encoding in real time.” I’m not sure if that claim was true, but it was a cool demo: NTT (a/k/a Nippon Telegraph and Telephone) researchers developed the NARA processor to reduce power consumption and save space over existing software- and hardware-based encoders. And it comes with extension interfaces to encode video at even higher resolutions.
NHK was back again with their extensive demo area of 8K acquisition, image processing, and broadcasting. (Yes, NHK IS broadcasting 8K in Tokyo, and has been doing so for a few years.) Among the cooler things in their booth was a 13-inch 8K OLED display – which almost seems like an oxymoron – and an impressive demonstration of 8K/60 and 8K/120 shooting and playback. On the 120Hz side of the screen, there was no blur whatsoever in footage taken during a soccer match.
This is just scratching the surface, and I’ll have more information during my annual “Future Trends” presentation at InfoComm in June. For now, I’ll let one of my colleagues sum up the show as being about “wireless 4K drones over IP.” (Okay, that’s a bit of a simplification…)