Category: The Front Line
InfoComm 2016 In The Rearview Mirror
- Published on Friday, 17 June 2016 12:48
- Pete Putman
- 0 Comments
Another InfoComm show has come and gone. This is my 23rd InfoComm and it’s hard to imagine when I first set foot in Anaheim way back in 1994 – ostensibly to cover the now-defunct Projection Shoot-Out – that I’d still be making the treks to Orlando and Las Vegas, let alone teaching classes and joining the InfoComm faculty.
For this recap, I’ll focus on trends I saw at the show that will continue to impact our industry for some time to come. And there were plenty of them, everywhere you looked.
First off, I’ve been saying for several years now that software is becoming more important than hardware in our industry (and across all market segments – look at how inexpensive Ultra HDTVs have become already), and that we’d start to see less of a focus on expensive hardware and more of an emphasis on software and managed services.
And that’s exactly what I spotted in Las Vegas. Astute observers noticed that the once humongous booths set up by the likes of Sony, Panasonic, Crestron, LG, Samsung, Hitachi, and other companies have gotten a bit smaller. (NEC, Da-Lite, and Christie were exceptions to the rule.)
AMX, when it was a stand-alone company, used to have an enormous booth at the show (not to mention a huge party every year). Now, AMX is limited to a few small stands within the Harman booth. Walk the show floor these days and you’ll recognize other once-mighty brands that have been acquired by holding companies and now occupy much smaller footprints.
And this trend shouldn’t be any surprise. When hardware used to sell for four and five figures (and in some cases, six figures), you could justify those million-dollar booths that looked like mini-malls. (Remember the huge tented Sanyo projector booths?) But that’s not the case anymore.
Practically speaking, how much real estate do you need to talk about software programs and managed services? The same thing is happening at NAB, where once humongous companies like Harris (now Imagine) are largely touting services and not hardware.
Even Digital Projection has scaled back its enormous multi-tier InfoComm booth. And projectiondesign has shed some square footage since being acquired by Barco, which has itself gone on a square footage diet. Ditto Sharp, which had one of the smallest booths ever at this show, perhaps related to the company’s ongoing financial challenges.
Surprisingly, Toshiba showed there is indeed a second act by showing up with a nice-size booth full of LCD monitors for tiled display walls. That’s not exactly an easy market to compete in, what with LG, Samsung, and NEC having a big footprint. But they’re giving it a shot.
Another trend that’s really picking up speed is the move away from projection lamps to solid-state illumination systems, most often lasers with color phosphor wheels. The availability of large, inexpensive LCD displays has cut deeply into sales of projectors – particularly in small classrooms and meeting rooms, where we used to put in “hang and bang” projection systems.
If you talk to people who’ve made the switch away from projection to direct-view, the reason they most frequently cite is that they don’t have to change out lamps anymore, and the LCD displays can be used under normal room lighting and turn on instantly.
Well, projector manufacturers have gotten the message and are moving en masse to solid state light sources. Early adopters like Casio have reaped the benefits, but now everyone from Sony and Panasonic to Vivitek and Optoma is on board.
Even so, the corner wasn’t really turned until this year when Epson – one of the big manufacturers of projection lamps – showed a 25,000-lumen 3LCD projector powered by a laser light engine. And I saw more than one UHD-resolution projector using the laser-phosphor combination, even in ultra-short throw configurations.
How much longer will we be changing out lamps? I don’t think it will be more than a few years before the majority of projectors offered for sale will use laser or LED light engines (or both). There will be exceptions for certain models, but for all intents and purposes, short-arc lamps are toast.
Here’s another trend – LED walls. I tried to count all of the LED wall exhibitors at InfoComm and lost track after wandering through the North Hall. And just about every single one was based in China, with names you would not recognize. Were they looking for U.S. dealer/distributor partners? It’s not likely many would pick up customers here, and that may be why Leyard (another Chinese manufacturer) bought Planar last year – everyone knows who Planar is.
I also saw LED walls with pitches as small as .9mm. That’s smaller than the pixel pitch of a 50-inch 1366×768 plasma monitor from the late 1990s! And if anyone continues to go big with their booths, it’s the LED wall manufacturers. (Not like they have any choice!) Leyard’s 100’+ 8K LED wall was a perfect example of why bigger is still better when it comes to a booth.
And Sony’s Cledis 8Kx2K LED wall shows just how much farther we’ve come with this technology, creating what appeared to be a perfectly seamless, pixel-free panoramic LED wall that dazzled with bright, super-saturated color images.
The Chinese dominance in LED displays shouldn’t be surprising. They’re moving to a similar level in the manufacturing of LCD panels, monitors, and televisions, undermining the Korean manufacturers (who undermined the Japanese, who took our U.S.-based television business away in the 1980s).
In fact, so much of our hardware is fabricated, soldered, and assembled in China and Southeast Asia these days that it should be no surprise prices have dropped as much as they have. Back in the day, a quality line doubler (remember those?) would set you back as much as $5,000 to $8,000. Today, you can buy a compact scaler that works to 1080p and Wide UXGA for a few hundred bucks.
My last trend has to do with the slow migration of video and audio signal distribution and switching away from hardware-intensive platforms based on display interface standards to software-based platforms that use IT switches, encoders, and decoders. Wow, did I spot a lot of those products at the show, even from some previously-vigorous defenders of HDMI-based architectures.
The interest in learning how to move to an “open” IP-type AV distribution architecture must be considerable: I taught a class on AV-over-IP this year at InfoComm and was astounded to see that 185 people had signed up to attend. And there were very few no-shows, as I found out when I had attendees sitting on the floor and standing along the back wall for almost the entire 90-minute class.
What’s more, a substantial portion of those attendees came from the higher education market segment, and an informal poll revealed that most of them were still upgrading from older analog systems to all-digital infrastructure. In essence, they were telling me that they preferred to skip past HDMI-based solutions and move directly to an IP-type solution.
Hand-in-hand with this discovery came more responses about transitioning to app-based AV control systems and away from proprietary, code-based control that requires specialized programming. Well, there were a few companies showing app-based AV control products in Vegas that had super-simple GUIs; software that just about anyone could learn to use in a few hours.
Throw in the accelerating transition to UHD resolution displays (they’ll largely replace Full HD within a year), and you have some very interesting times in store for the AV industry as this decade winds on…
AV-over-IP: It’s Here. Time To Get On Board!
- Published on Friday, 03 June 2016 14:21
- Pete Putman
- 0 Comments
At InfoComm next week in Las Vegas, I look forward to seeing many familiar faces – both individuals and manufacturers – that have frequented the show since I first attended over 20 years ago. And I also expect to find quite a few newcomers, based on the press releases and product announcements I’ve been receiving daily.
Many of those newcomers will be hawking the latest technology – AV-over-IP. More specifically, transporting video, audio, and metadata that are encoded into some sort of compressed or lightly-compressed format, wrapped with IP headers, and transported over IP networks.
This isn’t exactly a new trend: The broadcast, telecom, and cable/satellite worlds have already begun or completed the migration to IT infrastructures. The increasing use of optical fiber and lower-cost, fast network switches are making it all possible. Think 10 gigabit Ethernet with single-mode fiber interconnections, and you can see where the state-of-the-art is today.
You’ve already experienced this AV-over-IP phenomenon if you watch streaming HD and 4K video. Home Internet connection speeds have accelerated by several orders of magnitude ever since the first “slow as a snail” dial-up connections got us into AOL two decades ago. Now, it’s not unusual to have sustained speeds of 10, 15, 25, and even 50 megabits per second (Mb/s) to the home – fast enough to stream Ultra HD content with multichannel sound.
And so it goes with commercial video and audio transport. Broadcast television stations had to migrate to HD-SDI starting nearly 20 years ago when the first HDTV broadcasts commenced. (Wow, has it really been that long?) Now, they’re moving to IP and copper/fiber backbones to achieve greater bandwidth and to take advantage of things like cloud storage and archiving.
So why hasn’t the AV industry gotten with the program? Because we still have a tendency to cling to old, familiar, and often outdated or cumbersome technology, rationalizing that “it’s still good enough, and it works.” (You know who you are…still using VGA and composite video switching and distribution products…)
I’ve observed that there is often considerable and continual aversion in our industry to anything having to do with IT networks and optical fiber. And it just doesn’t make any sense. Maybe it originates from a fear of losing control to IT specialists and administrators. Or, it could just be a reluctance to learn something new.
The result is that we’ve created a monster when it comes to digital signal management. Things were complicated enough when the AV industry was dragged away from analog to digital and hung its hat on the HDMI consumer video interface for switching and distribution. Now, that industry has created behemoth switch matrices to handle the current and next flavors of HDMI (a format that never was suitable for commercial AV applications).
We’ve even figured out a way to re-encode the HDMI TMDS signal and extend it over category wire, up to a whopping 300 feet. And somehow, we think that’s impressive? Single-mode fiber can carry an HD video signal over 10 miles. Now, THAT’S impressive – and it’s not exactly new science.
So, now we’re installing ever-larger racks of complex HDMI switching and distribution gear that is expensive and also bandwidth-capped – not nearly fast enough for the next generation of UHD+ displays with full RGB (4:4:4) color, high dynamic range, and high frame rates. How does that make any sense?
What’s worse, the marketing folks have gotten out in front, muddying the waters with all kinds of nonsensical claims about “4K compatibility,” “4K readiness,” and even “4K certified.” What does that even mean? Just because your switch or DA product can support a very basic level of Ultra HD video with slow frame rates and reduced color resolution, it’s considered “ready” or “certified?” Give me a break.
Digitizing HDMI and extending it 300 feet isn’t future-proof. Neither is limiting Ultra HD bandwidth to 30 Hz 8-bit RGB color, or 60 Hz 8-bit 4:2:0 color. Not even close. Not when you can already buy a 27-inch 5K (yes, 5K!) monitor with 5120×2880 resolution and the ability to show 60 Hz 10-bit color. And when 8K monitors are coming to market.
So why do we keep playing tricks with specifications and working with Band-Aid solutions? We shouldn’t. We don’t need to. And the answer is already at hand.
It’s time to move away from the concept of big, bulky, expensive, and basically obsolete switching and distribution hardware that’s based on a proprietary consumer display interface standard. It’s time to move to a software-based switching and distribution concept that uses an IT structure, standard codecs like JPEG2000, M-JPEG, H.264, and H.265, and everyday off-the-shelf switches to move signals around.
Now, we can design a fast, reliable AV network that allows us to manage available bandwidth and add connections as needed. Our video can be lightly compressed with low latency, or more highly compressed for efficiency. The only display interfaces we’ll need will be at the end points where the display is connected.
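The bandwidth argument behind “lightly compressed” AV-over-IP is easy to check with back-of-the-envelope arithmetic. The figures below (4K/60, 8-bit RGB, a 2:1 “light” compression ratio, a 10 GbE link) are illustrative assumptions for this sketch, not vendor specifications:

```python
# Back-of-the-envelope check: why AV-over-IP needs at least light compression.
# The resolution, frame rate, bit depth, and compression ratio here are
# illustrative assumptions, not any vendor's published numbers.

def raw_bitrate_bps(width, height, fps, bits_per_pixel):
    """Raw video payload rate, ignoring blanking intervals and network overhead."""
    return width * height * fps * bits_per_pixel

# Uncompressed 4K/60 with full 8-bit RGB (4:4:4) color: 24 bits per pixel
raw_4k60 = raw_bitrate_bps(3840, 2160, 60, 24)
print(f"Raw 4K/60 RGB: {raw_4k60 / 1e9:.1f} Gb/s")            # ~11.9 Gb/s

TEN_GIG_E = 10e9
print("Fits 10 GbE uncompressed?", raw_4k60 <= TEN_GIG_E)     # False

# With even a modest 2:1 JPEG2000-style light compression, it fits comfortably
print("Fits at 2:1 compression?", raw_4k60 / 2 <= TEN_GIG_E)  # True
```

In other words, raw 4K/60 full-color video slightly overflows a single 10 Gb/s link, which is exactly why low-latency mezzanine codecs (or faster switches) make the IT-based approach practical.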
Even better, our network also provides access to monitoring and controlling every piece of equipment we’ve connected. We can design and configure device controls and interfaces using cloud-based driver databases. We can access content from remote servers (the cloud, again) and send it anywhere we want. And we can log in from anywhere in the world to keep tabs on how it’s all functioning.
And if we’re smart and not afraid to learn something new, we’ll wire all of it up with optical fiber, instead of bulky copper cables and the point-to-point transmitter/receiver pairs we now use to stretch display interface signals. (Guess what? AV-over-IP is already digital! You can toss out those distance-limited HDMI extenders, folks!)
For those who apparently haven’t gotten the memo, 40 Gb/s network switches have been available for a few years, with 100 Gb/s models now coming to market. So much for speed limit issues…
To the naysayers who claim AV-over-IP won’t work as well as display interface switching: That’s a bunch of hooey. How are Comcast, Time Warner, NBC, Disney, Universal, Netflix, Amazon, CBS, and other content originators and distributors moving their content around? You guessed it.
AV-over-IP is what you should be looking for as you walk the aisles of the Las Vegas Convention Center, not new, bigger, and bulkier HDMI/DVI matrices. AV-over-IP is the future of our industry, whether we embrace it or are dragged into it, kicking and screaming.
Are you on board, or what?
The End Of One Era And The Start Of Another
- Published on Tuesday, 31 May 2016 18:38
- Pete Putman
- 0 Comments
Panasonic, a long-time leader in consumer electronics, announced on Tuesday (May 31) that it would stop manufacturing large LCD panels for televisions at its Himeji fab in western Japan. Production will wind down this summer and stop completely in September, with the balance of IPS LCD panels going to transportation and medical markets.
Himeji was a relatively new fab, having come online in 2010. This is where Panasonic’s IPS-Alpha LCD TVs were born even as the company’s plasma TV manufacturing business was going into a nosedive. And that line of TVs was well-received by the trade press and the general public.
But it turned out that LCD panel manufacturing in Japan is a costly effort compared to manufacturing in Korea and China. Indeed, many Panasonic LCD TVs use glass that’s made across the China Sea, and that trend goes back a couple of years. According to a story on the Reuters Web site, “…the plant has never logged a profit during years of heavy price competition with South Korean and Chinese rivals.”
It doesn’t help that Panasonic has virtually no U.S. market share in LCD TVs, having fallen behind Vizio and Hisense even as competitors like Toshiba, Mitsubishi, Pioneer, and Hitachi have all packed it in over the past decade. And with all of the company’s eggs in the plasma basket for many years, they were late to wake up and smell the coffee.
A quick online check of Best Buy, HH Gregg, and Fry’s Web sites showed no listings for Panasonic TVs – not even among Ultra HD models. So I went to the company’s U.S. Web site and searched again, finding six models of Ultra HDTVs ranging in size from 50 inches to 85 inches – and all of them carried the notation “Not available.”
I got the same results when I clicked on “View All TVs,” and actually got a smaller selection of Ultra HD models to peruse. So no major retailer carries Panasonic TVs now. (Not even Sears, which actually has a Kenmore line of HDTVs and even 4K TVs!)
According to the Reuters story, “The decision to close the (Himeji LCD TV panel) business comes after Panasonic scrapped a company-wide revenue target of 10 trillion yen ($90.1 billion) for the year through March 2019 to focus on profitability.” Based on what I found out, it would appear that the entire TV business in North America (if not elsewhere) is also at an end.
Ironically, I just received an invite from Panasonic’s PR agency to come see their new Blu-ray player (DMP-UB900) at the company’s corporate headquarters in a few weeks. So my question now becomes, “Why are you showing me an Ultra HD Blu-ray player when you don’t seem to have any Ultra HDTV models to go with it?”
Puzzling indeed, especially in light of the rapid move away from 1080p (Full HD) televisions to Ultra HD models that’s taking place all over the world! And you need no further proof than to go on the aforementioned big box store Web sites and take a gander at the selection of Full HD and Ultra HDTV models.
A quick search showed that Best Buy currently has 108 models of 2160p (Ultra HD) sets for sale, compared to 58 Full HD models and 27 720p models. The picture isn’t quite as clear at HH Gregg, as they show 133 “LED TVs” (presumably Full HD) and 78 Ultra HD models. And Fry’s splits “4K TVs” into two different categories, which together account for 173 models, compared with 220 Full HD models.
So what to make of this? Simple – the adoption rate for Ultra HDTV sets is accelerating to the point where (at Best Buy, at least) the offerings now exceed those with Full HD resolution. Keep in mind that the first Ultra HDTVs appeared on our shores not quite four years ago, used the LG 84-inch IPS LCD panel, and cost anywhere between $15,000 and $20,000.
Now, you can buy a first-tier 55-inch Samsung Ultra HD “smart” LCD TV for $700. As I’m writing this, Vizio has a 50-inch “smart” 4K model for $499, as does Insignia. Westinghouse goes them one better with a 55-inch 4K set for the same price, and LG is clearing out a 49-inch Ultra HD set for $550. And Hisense recently announced HDR models for less than $600! Mind-boggling.
So here we go at warp speed, zooming into a new world of 4K TVs as Full HD sets fade into the distance, having first appeared a little over a decade ago. I’ve said it before and I’ll say it again – your next big screen LCD TV is going to be an Ultra HD model, especially if you pick it up in December, or wait a little longer into 2017.
And your new Ultra HDTV will NOT be a Panasonic model, based on what my Web search revealed. In fact, there’s a good chance your next Ultra HDTV could be a Chinese brand, like Hisense or TCL, thanks to their very aggressive pricing. (But make sure your set supports high dynamic range to be future-proof!)
NAB 2016: Thoughts and Afterthoughts
- Published on Tuesday, 26 April 2016 20:06
- Pete Putman
- 0 Comments
I’m back from my 22nd consecutive NAB Show, and it’s always a worthwhile trip. NAB isn’t quite as large or crazy as CES, but it’s still sprawled out enough to require three full days to see everything. (Except that you don’t have to fight the insane crowds that fill the Las Vegas Convention Center in January.)
This year’s theme was “Unleash!” or something like that. I never was completely sure, and it sounded more appropriate for a competition of hunting dogs anyway. But the crowds came regardless (over 100,000 for sure) to see everything from 4K and 8K video to live demonstrations of the new ATSC 3.0 digital broadcasting system, a plethora of small 4K cameras, the accelerating move to IP infrastructures instead of SDI, and video streaming to every conceivable device.
My visit to the show had a threefold purpose. In addition to press coverage and checking out product trends for customers, I also delivered a presentation during the Broadcast Engineering Conference titled “Next Generation Interfaces: Progress, or Babylon?” The subject was a new wave of high-speed interfaces needed to connect 4K, 5K, 6K, and 8K displays (DisplayPort, HDMI 2.0, and superMHL, not to mention Display Stream Compression).
Besides hundreds of exhibits, there are the pavilions. Trade shows LOVE setting up pavilions to showcase a hot technology or trend. Sometimes they’re a bit premature: In 1999, the show featured an enormous “streaming media” area in the central hall of the Las Vegas Convention Center stuffed full of startup companies showing postage-stamp-sized video, streaming over DSL and dial-up connections. All of those companies were gone a year later.
In addition to the Futures Park pavilion – which showcased NHK’s 8K broadcasting efforts and ATSC 3.0, and which was mysteriously stuffed all the way at the back (east) end of the upper south hall, where few people go – there was the Sprockit startup pavilion in the north hall, near the Virtual Reality / Augmented Reality pavilion (more on that in a moment).
There was also a demonstration of ATSC 3.0 in the home, located at the upper entrance (west end) of the south hall. Outside, Nokia set up a concert stage and had entertainment each day, all day long, streaming the performances into the VR/AR booth for viewing and listening on appropriate headgear.
To set the table and see just how much the industry has changed in a little over 20 years, the “hot” broadcasting formats in 1995 were Digital Betacam (two years old), DVCPRO, and a new HD format called D5. Non-linear editing was just getting off the ground from the likes of Avid, Media 100, and Boxx Technologies. A decent SD camera for studio and field production cost about $20,000, and HD was still very much in the experimental stage – the Grand Alliance was heavily promoting its new HD format, model station WHD in Washington was conducting trial broadcasts, and there was no such thing as 720p/60/59.94 just yet.
The standard connectors for video? BNC and RCA for composite, with BNC doubling for the serial digital interface (SDI) connection. VGA was the connector of choice for PCs, and component video was tricky to implement. Tape was the preferred recording media, as optical disc hadn’t made its public debut yet. “High resolution” on a graphics workstation was around 1280×1024 (SXGA), a “bright” LCD projector could crank out about 500 lumens with 640×480 resolution, and the Internet was still a mystery to 90% of attendees.
We all know how the intervening years played out. TV broadcasters are now in the middle of a channel auction, and we may lose more UHF spectrum (in 1995, UHF channels ran from 14 to 69), possibly as much as 60 – 80+ MHz, or 10 – 14 channels. Demand for optical disc media is very much on the wane as streaming and cloud services are picking up the reins.
You don’t see very many transmitter and antenna manufacturers at the show any more, and when you do, their booths are pretty small. There’s been consolidation in the industry with antenna maker Dielectric shutting down a few years ago, then getting bought by the Sinclair Broadcast Group and revived (just in time for the auction!). Harmonic recently purchased Thomson, which explains the big empty booth where they should have been.
And the biggest booth at the show doesn’t belong to Sony, or Panasonic, or Imagine (Harris). Nope, that honor goes to Canon, showing you that there’s still plenty of money to be made in video and still cameras, optical glass, and camera sensors. In a sign o’ the times, Panasonic’s once-enormous booth, which occupied the full width of the central hall mezzanine, has shrunk down to about half its original size.
NAB now is all about “anytime, anywhere” content creation, mastering, storage, and delivery. The concept of broadcasting is almost quaint these days (ATSC 3.0 notwithstanding) as more and more viewers avail themselves of faster broadband speeds and opt for on-demand streaming and binge viewing of TV shows.
Brands like Netflix and Amazon are stirring the pot, not ABC and NBC. (Most of the TV shows in the top 20 every week are CBS programs.) YouTube now offers a premium ad-free service (ironic, since ten years ago it was a place to share videos commercial-free). And this year’s “3D” is virtual reality (VR), backed up by augmented reality (AR).
Not clear on the difference? VR presents a totally electronic “pseudo” view of the world, which can be represented by custom video clips or generated by computer graphics. AR takes real-world views and overlays text, graphics, and other picture elements to “augment” your experience.
Google Glass is a good example of augmented reality – you’d walk down the street and graphics would appear in the near-to-eye display, showing you the location of a restaurant, displaying a text message, or alerting you to a phone call. Oculus Rift and Samsung Galaxy Gear are good examples of virtual reality, immersing your eyes and ears in imaginary worlds with large headsets and earphones.
I’ve tried VR and AR systems a few times, and the eyewear works – but it’s heavy and quite bulky. And the multichannel spatial audio is also impressive, but I have to strap headphones over those enormous headsets. In fact, the biggest problem with VR and AR right now IS the headset. Galaxy Gear and other systems use your smartphone as a stereo display (you can do the same thing with a simple cardboard viewer), but the resolution of your smartphone’s display simply isn’t fine enough to work in a near-to-eye application.
After you wear a VR/AR headset for a while and stand up and take it off, you may find your sense of balance is also out of whack and that you momentarily have some trouble walking correctly. That’s another example of a spatial disorientation problem caused by the disconnect between your eyesight and other senses.
If some of these problems sound familiar, they should. We heard much the same thing during the latest incarnation of 3D from 2008 to 2012, particularly from people wearing active-shutter 3D glasses. During the roll-out of 3D, it became apparent that as much as 25% of the general population could not view 3D correctly because of eye disorders, spatial disorientation, incompatibility with contact lenses, and other problems.
Back to reality! Here are a few more interesting things I saw in Las Vegas:
ATSC 3.0 is ready for its day in the sun. A consortium of interest groups recently petitioned the FCC to make that happen, and based on the demos at the show, it has a fighting chance to ensure broadcasting sticks around for a while. For current TVs, some sort of sidecar box will be required. But you’ll be able to watch 4K (Ultra HD) broadcasts with spatial audio and stream broadcast content to phones, tablets, and laptops, too.
8K Real-Time HEVC Encoding was on display in the NTT and NEC booths. For those counting, there are 7680 horizontal and 4320 vertical pixels in one 8K image, and both companies had demos of 4:2:0 video streaming at about 80 Mb/s. Recall that 8K has 16 times the resolution of 1080p full HD, and you can see that a ton of computational power is required to make it all work.
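The pixel-count arithmetic behind those 8K figures is worth spelling out. The first comparison comes straight from the resolutions quoted above; the implied compression ratio for the ~80 Mb/s demo streams additionally assumes 60 fps and 10-bit 4:2:0 sampling, which are my assumptions rather than published NTT/NEC parameters:

```python
# The pixel-count arithmetic behind the 8K figures above.
pixels_8k = 7680 * 4320       # 33,177,600 pixels per frame
pixels_1080p = 1920 * 1080    # 2,073,600 pixels per frame
print(pixels_8k // pixels_1080p)   # 16 -- 8K is 16x full HD

# Rough compression ratio for the ~80 Mb/s demo streams, assuming 60 fps
# and 10-bit 4:2:0 (1.5 samples per pixel) -- assumptions, not demo specs.
raw_bps = pixels_8k * 1.5 * 10 * 60    # ~29.9 Gb/s uncompressed
print(f"Approx. compression ratio: {raw_bps / 80e6:.0f}:1")
```

Under those assumptions HEVC is squeezing roughly 370:1 in real time, which is why so much computational horsepower is needed.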
HEVC Encoding was also in abundance on the show floor. Vitec had some super-small contribution H.265 encoders, and Haivision brought out a new Makito H.265 portable encoder. The Fraunhofer Institute had an impressive demo of contribution 4K video with HDR and wide color gamut encoded at 16 Mb/s, resulting in picture quality that would rival an Ultra HD Blu-ray disc streaming six times as fast.
Organic Light-Emitting Diode (OLED) displays are gaining ground on LCD for studio and broadcast operations. Three different companies – Boland, Sony, and Fusion – were showing Ultra HD “client” and “reference” monitors based on a 55-inch RGBW panel manufactured by LG Display. Sony, of course, has 30-inch and 25-inch models, and some of the older 25-inch glass is being used in monitors made by companies like Flanders Scientific. Newer OLED panels use 10-bit drivers and can reproduce HDR signals with a wide color gamut.
High Dynamic Range was very much on people’s minds at NAB 2016. Dolby showed its Dolby Vision proprietary HDR system, and Technicolor privately demoed its dual SDR/HDR workflow and distribution scheme. Samsung was an expected visitor to the show floor – their booth featured a side-by-side comparison of SDR and HDR with dynamic tone mapping, a system they invented and will make available openly to anyone. It’s also a candidate for SMPTE HDR standards.
Super-fine pitch LED display walls are the next big thing, and I mean that – literally. Leyard, who bought Planar Systems last year, had an impressive 100-foot diagonal “8K” LED video wall (no mention of the dot pitch, but it had to be around 1.2mm) that dominated the floor. An industry colleague remarked that the brightness and size of this screen would be sufficient to replace cinema screens and overcome reflective, contrast-lowering glare. (Plus kick the electric bill up quite a few notches!)
Leyard also had a prototype 4K LED display wall using .9mm dot pitch LED emitters and not far away, Christie showed its Velvet series of LED walls, with dot pitches ranging from as coarse as 4mm (remember when that used to be a fine pitch?) to as sharp as .9mm. To put all of that into perspective, the first 42-inch and 50-inch plasma monitors that entered the U.S. market in the mid-1990s had a dot pitch of about 1mm, and 720p/768p plasma monitors were about .85mm. How far we’ve come!
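Those pitch comparisons can be sanity-checked from screen size and resolution alone. The sizes and resolutions below are the commonly quoted ones for early plasma panels, so treat the results as approximations:

```python
import math

# Sanity-checking the pixel-pitch comparisons above. Screen sizes and
# resolutions are the commonly quoted ones, so results are approximate.
def pixel_pitch_mm(diagonal_inches, h_pixels, v_pixels):
    """Pixel pitch = physical diagonal (in mm) / diagonal pixel count."""
    diagonal_px = math.hypot(h_pixels, v_pixels)
    return diagonal_inches * 25.4 / diagonal_px

# Mid-1990s 42-inch 852x480 plasma: roughly 1mm pitch, as cited above
print(f"{pixel_pitch_mm(42, 852, 480):.2f} mm")    # ~1.09 mm

# 50-inch 1366x768 plasma: in the .8-.85mm ballpark
print(f"{pixel_pitch_mm(50, 1366, 768):.2f} mm")   # ~0.81 mm
```

Both results land right around the figures cited, which makes the jump to .9mm LED walls all the more striking.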
And there’s still very much a place for AVC H.264 encoding. Z3 had a super-tiny DME-10 H.264 encoder for streaming over IP, as did Vitec. Matrox unveiled their Monarch Lecture Capture system (also based on H.264), and NTT had an impressive multistream H.264 / IP encoder/decoder system out for inspection. Some of these boxes would actually fit in your shirt pocket – that’s how small they’ve become.
Of course, the wizards at Blackmagic Design were at it again. This time, they showed an H.265-based recorder/duplicator system that can write 25 SD cards simultaneously with HEVC 2K and 4K video and audio – just plug ‘em in, and go! Over at the Adtec booth, the Affiniti system held the spotlight. This fast, “universal” bus for encoders and decoders is designed to be configured and maintained by anyone with minimal technical knowledge. It uses an SFP backplane, an approach more manufacturers are taking to keep up with the ever-higher speeds of 4K and UHD+ data.
Finally, I just had to mention the “world’s smallest 8K display,” as seen in the NHK booth. Yep, it measures just 13 inches diagonally and has an amazing pixel density of 664 pixels per inch (ppi). This display, made by the Semiconductor Energy Laboratory Company of Japan, has a resolution of 7680 by 4320 pixels and employs a top-emission white OLED layer with color filters. (Really???)
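The quoted pixel density checks out arithmetically. The 13.3-inch diagonal used below is my assumption (the 13-inch figure above reads like a rounded value), since that is what lands within a hair of the stated ~664 ppi:

```python
import math

# Checking the "world's smallest 8K display" numbers. The 13.3-inch
# diagonal is an assumption here; 13 inches appears to be rounded.
def ppi(diagonal_inches, h_pixels, v_pixels):
    """Pixels per inch = diagonal pixel count / physical diagonal."""
    return math.hypot(h_pixels, v_pixels) / diagonal_inches

print(f"{ppi(13.3, 7680, 4320):.0f} ppi")   # ~663 ppi, matching the quoted ~664
```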
Of Samsung, Big Screens, IoT, HDR, And Patience
- Published on Wednesday, 13 April 2016 16:30
- Pete Putman
- 0 Comments
Last Tuesday, April 12, Samsung held its annual press briefing and TV launch event at its new, “hip” facility in the Chelsea section of Manhattan. The multi-story building is known as Samsung837 (like a Twitter handle), as it’s located at 837 Washington Street by the High Line elevated walkway.
Samsung, who has dominated the worldwide television market for many years – and who has a pretty good market share in smartphones, too – has been a leader in developing Ultra HD (4K) televisions with high dynamic range and wider color gamuts, most notably in their S-line.
At the briefing, they announced their new, top-of-the-line Ultra HDTVs, equipped for high dynamic range with quantum dot backlights manufactured by Nanosys of Sunnyvale, CA. There are a few new sizes in the line that are re-defining what a “small” TV screen means! The flagship model is the KS9800 curved SUHDTV, which will be available in a 65-inch size ($4,499), 78 inches ($9,999), and a mammoth 88-inch version ($19,999).
Stepping down, we find the KS9500 series, with a 55” model for $2,499, a 65” model for $3,699, and a 78” model for $7,999 (June). The flat-screen KS9000 comes in three flavors – 55” ($2,299), 65” ($3,499), and 75” ($6,499, June). There are two entry-level SKUs (if that’s even the right term to use) as well. The KS8500, a curved-screen version aimed at consumers wanting a smaller screen, offers a 55” model for $1,999 and a 65” model for $2,999; a 49” model arrives in May for $1,699. The line is rounded out by the flat KS8000 SUHDTV (55” for $1,799 and 65” for $2,799, with a 49” model at $1,499 and a 60” model at $2,299, both coming in May).
There’s not a huge difference between these models – the distinctions mostly come down to curved versus flat screens and the sizes available, plus a bevy of “bells and whistles.” Perhaps the most intriguing is a set of “connect and control” features.
Samsung’s been offering a Smart Hub feature for some time, and this year’s iteration lets you plug in a cable box from Comcast or Time Warner or a set-top from DirecTV, and the TV will automatically recognize the box and set up all the required control functions on the Samsung TV remote. All you have to do is plug in an HDMI cable.
On top of that, Samsung’s SmartThings feature provides on-off control of devices like locks and lamps connected by Wi-Fi, ZigBee, or Z-Wave protocols. The company offers switchable outlets, water sensors, proximity sensors, and motion sensors, all of which connect back to your television and smartphone for monitoring and control. (And yes, the television itself can also be controlled by this system.)
Samsung’s concept is this: Since we spend so much time in front of our big-screen TVs, why not make them the hub of a home monitoring and control system? And why not make the connection and activation of everything from set-top boxes to remotely controlled AC outlets a plug-and-play operation? A SmartThings starter kit is available for $249, and you can add compatible ZigBee and Z-Wave devices like thermostats, smoke and CO detectors, and locks from companies like Honeywell, Schlage, Cree, Leviton, and First Alert.
So why are Samsung and other TV manufacturers looking to get into home control systems? A combination of declining TV sales and falling prices has resulted in an accelerating transition away from Full HD (1920×1080) televisions and displays to Ultra HD (3840×2160), as TV manufacturing shifts to China and manufacturers frantically search for profitability.
Samsung – likely motivated by this trend – is looking for a way to add value to TV sales, pitching a complete home entertainment and control system (with sound bars, surround audio, and Ultra HD Blu-ray players, of course) to consumers. It’s all about the Internet of Things (IoT) – the idea that every electronic gadget in your home has an IP address and can be controlled with a driver and an app.
Think about this for a moment: Seven years ago, a first-tier 50-inch 1080p plasma equipped with active-shutter 3D playback was priced at $2,500. Today, you can buy four times the resolution, eight times the brightness, a much wider color gamut, a much lighter set with lower power consumption, and five more inches of screen for about $600 less.
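Those multipliers are easy to verify. A quick check of the pixel math and the price delta (the $2,500 plasma price is the figure quoted above; today’s roughly $1,900 figure follows from “about $600 less”):

```python
# Pixel-count ratio: Ultra HD vs. Full HD
full_hd = 1920 * 1080      # 2,073,600 pixels
ultra_hd = 3840 * 2160     # 8,294,400 pixels
print(ultra_hd / full_hd)  # 4.0 -- "four times the resolution"

# Price delta: $2,500 for the 2009 plasma vs. roughly $1,900 for a UHD set today
print(2500 - 1900)         # 600 -- "about $600 less"
```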
Amazing! you’re thinking. My next TV is going to be an Ultra HDTV! Good thinking: any TV you buy at 55 inches or larger will probably be an Ultra HD set anyway, since TV manufacturers are ramping down production of 1080p sets and retailers are devoting more shelf space to UHD.
While there are and will continue to be some amazing deals on Ultra HD sets, don’t forget the enhancements. In addition to the aforementioned high dynamic range and wider color gamut, higher frame rates (HFR) will also become a part of the UHD ecosystem. (So will 8K displays, but I’m getting ahead of myself…)
Problem is, no two companies are implementing all of these add-ons the same way. We have competing systems for HDR (Dolby Vision, Technicolor, BBC/NHK HLG, and yes, Samsung), and yet another controversy about pixel resolution in displays using the pentile red-green-blue-white (RGBW) pixel array (LG’s new Ultra HD OLEDs).
To date, only two HDR-capable Ultra HD Blu-ray players have been announced, and only one (Samsung’s) is available at retail. A bigger problem: Many Ultra HDTVs have only one HDMI 2.0 input that supports the CTA 861.3 HDR metadata standard. (DisplayPort 1.4 also carries CTA 861.3 metadata, but it was just announced.) And HDMI 2.0 is barely fast enough for 4K HDR: If you want to connect a PC for Ultra HD gaming at 60 Hz with 10-bit RGB (4:4:4) color, you’re out of luck.
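A back-of-the-envelope calculation shows why that combination doesn’t fit, using the standard CTA-861 4K/60 video timing (594 MHz pixel clock, blanking included) and HDMI’s 8b/10b TMDS line coding:

```python
# Does 4K/60 at 10-bit RGB 4:4:4 fit inside HDMI 2.0's 18 Gbps TMDS budget?
active_pixels = 3840 * 2160
total_pixels = 4400 * 2250                 # active plus blanking (CTA-861 4K60 timing)
frame_rate = 60
pixel_clock = total_pixels * frame_rate    # 594,000,000 Hz

bits_per_pixel = 3 * 10                    # RGB 4:4:4, 10 bits per channel
tmds_overhead = 10 / 8                     # 8b/10b TMDS encoding adds 25%

required_gbps = pixel_clock * bits_per_pixel * tmds_overhead / 1e9
hdmi20_limit_gbps = 18.0                   # HDMI 2.0 maximum aggregate TMDS rate

print(f"Required: {required_gbps:.2f} Gbps vs. HDMI 2.0 limit: {hdmi20_limit_gbps} Gbps")
# About 22.3 Gbps required against an 18 Gbps ceiling -- it doesn't fit,
# which is why HDMI 2.0 falls back to 8-bit or 4:2:0 subsampling for 4K/60 HDR.
```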
In other words, it’s chaos as usual in the CE world – much like HDTV circa 1998. I don’t know how quickly these issues will be worked out. All HDR-10 compatible TVs should play back 10-bit content from Ultra HD Blu-ray discs and media files. When it comes to enhanced HDR systems, Vizio, TCL, and LG support Dolby Vision, but Samsung does not; neither do Panasonic and Sony.
Only a handful of TV models have opted to include the still-royalty-free DisplayPort interface to get around some of HDMI’s UHD speed limits. 4K content isn’t exactly abundant, either. No broadcasts are planned in the near future, and only a handful of cable systems are working on 4K channels (remember the 3D channels from Comcast and DirecTV?). Netflix and Amazon Prime do stream in UHD, but you need a TV that supports the VP9/VP10 and H.265 codecs to watch.
If you are considering a purchase of an Ultra HDTV and not in a big hurry, my advice is to sit on your hands for another year until many of these issues get ironed out. Sometimes doing nothing really is the best option…