Category: The Front Line
AV-over-IP: It’s Here. Time To Get On Board!
- Published on Friday, 03 June 2016 14:21
- Pete Putman
- 0 Comments
At InfoComm next week in Las Vegas, I look forward to seeing many familiar faces – both individuals and manufacturers – that have frequented the show since I first attended over 20 years ago. And I also expect to find quite a few newcomers, based on the press releases and product announcements I’ve been receiving daily.
Many of those newcomers will be hawking the latest technology – AV-over-IP. More specifically, transporting video, audio, and metadata that are encoded into some sort of compressed or lightly-compressed format, wrapped with IP headers, and transported over IP networks.
This isn’t exactly a new trend: The broadcast, telecom, and cable/satellite worlds have already begun or completed the migration to IT infrastructures. The increasing use of optical fiber and lower-cost, fast network switches are making it all possible. Think 10 gigabit Ethernet with single-mode fiber interconnections, and you can see where the state-of-the-art is today.
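To make the “wrap it with IP headers” idea concrete, here’s a minimal, purely illustrative sketch in Python – not any vendor’s actual protocol. It chops one compressed video frame into MTU-sized chunks and prepends a simplified RTP-style header, the kind of thing that would then be handed to a UDP socket:

```python
import struct

MTU_PAYLOAD = 1400  # bytes of video per packet, leaving room for IP/UDP headers

def packetize(frame: bytes, seq_start: int, timestamp: int, ssrc: int = 0xCAFE):
    """Split one compressed frame into RTP-style packets.

    Simplified header: 16-bit sequence number, 32-bit timestamp,
    32-bit stream ID, and a 1-byte marker set on the frame's last packet.
    """
    chunks = [frame[i:i + MTU_PAYLOAD] for i in range(0, len(frame), MTU_PAYLOAD)]
    packets = []
    for n, chunk in enumerate(chunks):
        marker = 1 if n == len(chunks) - 1 else 0
        header = struct.pack("!HIIB", (seq_start + n) & 0xFFFF, timestamp, ssrc, marker)
        packets.append(header + chunk)
    return packets

# A 100 KB compressed frame becomes 72 packets, each ready for sock.sendto(...)
pkts = packetize(b"\x00" * 100_000, seq_start=0, timestamp=90_000)
print(len(pkts), len(pkts[0]))
```

Real systems use the full RTP header (RFC 3550) plus forward error correction, but the principle is exactly this simple: video becomes ordinary network packets.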
You’ve already experienced this AV-over-IP phenomenon if you watch streaming HD and 4K video. Home Internet connection speeds have accelerated by several orders of magnitude ever since the first “slow as a snail” dial-up connections got us into AOL two decades ago. Now, it’s not unusual to have sustained speeds of 10, 15, 25, and even 50 megabits per second (Mb/s) to the home – fast enough to stream Ultra HD content with multichannel sound.
And so it goes with commercial video and audio transport. Broadcast television stations had to migrate to HD-SDI starting nearly 20 years ago when the first HDTV broadcasts commenced. (Wow, has it really been that long?) Now, they’re moving to IP and copper/fiber backbones to achieve greater bandwidth and to take advantage of things like cloud storage and archiving.
So why hasn’t the AV industry gotten with the program? Because we still have a tendency to cling to old, familiar, and often outdated or cumbersome technology, rationalizing that “it’s still good enough, and it works.” (You know who you are…still using VGA and composite video switching and distribution products…)
I’ve observed that there is often considerable and continual aversion in our industry to anything having to do with IT networks and optical fiber. And it just doesn’t make any sense. Maybe it originates from a fear of losing control to IT specialists and administrators. Or, it could just be a reluctance to learn something new.
The result is that we’ve created a monster when it comes to digital signal management. Things were complicated enough when the AV industry was dragged away from analog to digital and hung its hat on the HDMI consumer video interface for switching and distribution. Now, that industry has created behemoth switch matrices to handle the current and next flavors of HDMI (a format that was never suitable for commercial AV applications).
We’ve even figured out a way to digitize the HDMI TMDS signal and extend it using category wire, up to a whopping 300 feet. And somehow, we think that’s impressive? Single-mode fiber can carry an HD video signal over 10 miles. Now, THAT’S impressive – and it’s not exactly new science.
So, now we’re installing ever-larger racks of complex HDMI switching and distribution gear that is expensive and also bandwidth-capped – not nearly fast enough for the next generation of UHD+ displays with full RGB (4:4:4) color, high dynamic range, and high frame rates. How does that make any sense?
What’s worse, the marketing folks have gotten out in front, muddying the waters with all kinds of nonsensical claims about “4K compatibility,” “4K readiness,” and even “4K certified.” What does that even mean? Just because your switch or DA product can support a very basic level of Ultra HD video with slow frame rates and reduced color resolution, it’s considered “ready” or “certified?” Give me a break.
Digitizing HDMI and extending it 300 feet isn’t future-proof. Neither is limiting Ultra HD bandwidth to 30 Hz 8-bit RGB color, or 60 Hz 8-bit 4:2:0 color. Not even close. Not when you can already buy a 27-inch 5K (yes, 5K!) monitor with 5120×2880 resolution and the ability to show 60 Hz 10-bit color. And when 8K monitors are coming to market.
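Some back-of-envelope math shows why those capped formats squeak through while the full-fat ones don’t. The sketch below counts only the active pixel payload (blanking intervals ignored) and assumes HDMI 2.0’s usable payload is its 18 Gb/s TMDS rate less 8b/10b coding overhead, or roughly 14.4 Gb/s:

```python
def payload_gbps(w, h, fps, bits_per_pixel):
    """Active-pixel video payload in Gb/s (blanking intervals ignored)."""
    return w * h * fps * bits_per_pixel / 1e9

HDMI20_PAYLOAD = 18.0 * 8 / 10   # 18 Gb/s TMDS less 8b/10b coding ~= 14.4 Gb/s

formats = {
    "UHD 30 Hz, 8-bit RGB":   payload_gbps(3840, 2160, 30, 24),
    "UHD 60 Hz, 8-bit 4:2:0": payload_gbps(3840, 2160, 60, 12),
    "UHD 60 Hz, 10-bit RGB":  payload_gbps(3840, 2160, 60, 30),
    "5K 60 Hz, 10-bit RGB":   payload_gbps(5120, 2880, 60, 30),
}
for name, gbps in formats.items():
    verdict = "fits" if gbps <= HDMI20_PAYLOAD else "exceeds"
    print(f"{name}: {gbps:.2f} Gb/s ({verdict} the ~14.4 Gb/s payload budget)")
```

The two capped formats need under 6 Gb/s each; 4K60 with 10-bit RGB needs about 14.9 Gb/s and 5K60 about 26.5 Gb/s, which is why today’s gear can’t carry them.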
So why do we keep playing tricks with specifications and working with Band-Aid solutions? We shouldn’t. We don’t need to. And the answer is already at hand.
It’s time to move away from the concept of big, bulky, expensive, and basically obsolete switching and distribution hardware that’s based on a proprietary consumer display interface standard. It’s time to move to a software-based switching and distribution concept that uses an IT structure, standard codecs like JPEG2000, M-JPEG, H.264, and H.265, and everyday off-the-shelf switches to move signals around.
Now, we can design a fast, reliable AV network that allows us to manage available bandwidth and add connections as needed. Our video can be lightly compressed for low latency, or more highly compressed for efficiency. The only display interfaces we’ll need will be at the end points where the displays are connected.
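How far does one everyday 10 Gigabit Ethernet port actually go? Here’s a rough budgeting sketch; the per-stream bitrates are my illustrative assumptions, not any vendor’s specs:

```python
LINK_MBPS = 10_000      # one ordinary 10 GbE port
HEADROOM = 0.8          # reserve 20% for audio, control traffic, and bursts

# Illustrative per-stream bitrates -- assumptions, not vendor specs
profiles_mbps = {
    "JPEG2000, visually lossless 1080p60": 400,
    "JPEG2000, visually lossless UHD60":   1600,
    "H.264 contribution 1080p60":          20,
    "H.265 contribution UHD60":            16,
}

capacity = {name: int(LINK_MBPS * HEADROOM // mbps)
            for name, mbps in profiles_mbps.items()}
for name, count in capacity.items():
    print(f"{name}: ~{count} streams per link")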
Even better, our network also provides access to monitoring and controlling every piece of equipment we’ve connected. We can design and configure device controls and interfaces using cloud-based driver databases. We can access content from remote servers (the cloud, again) and send it anywhere we want. And we can log in from anywhere in the world to keep tabs on how it’s all functioning.
And if we’re smart and not afraid to learn something new, we’ll wire all of it up with optical fiber, instead of bulky cables or transmitters and receivers to convert the signals to a packet format and back. (Guess what? AV-over-IP is already digital! You can toss out those distance-limited HDMI extenders, folks!)
For those who apparently haven’t gotten the memo, 40 Gb/s network switches have been available for a few years, with 100 Gb/s models now coming to market. So much for speed limit issues…
To the naysayers who claim AV-over-IP won’t work as well as display interface switching: That’s a bunch of hooey. How are Comcast, Time Warner, NBC, Disney, Universal, Netflix, Amazon, CBS, and other content originators and distributors moving their content around? You guessed it.
AV-over-IP is what you should be looking for as you walk the aisles of the Las Vegas Convention Center, not new, bigger, and bulkier HDMI/DVI matrices. AV-over-IP is the future of our industry, whether we embrace it or are dragged into it, kicking and screaming.
Are you on board, or what?
The End Of One Era And The Start Of Another
- Published on Tuesday, 31 May 2016 18:38
- Pete Putman
- 0 Comments
Panasonic, a long-time leader in consumer electronics, announced on Tuesday (May 31) that it would stop manufacturing large LCD panels for televisions at its Himeji fab in western Japan. Production will wind down this summer and stop completely in September, with the balance of IPS LCD panels going to transportation and medical markets.
Himeji was a relatively new fab, having come online in 2010. This is where Panasonic’s IPS-Alpha LCD TVs were born even as the company’s plasma TV manufacturing business was going into a nosedive. And that line of TVs was well-received by the trade press and the general public.
But it turned out that LCD panel manufacturing in Japan is a costly effort, compared to manufacturing in Korea and China. Indeed, many Panasonic LCD TVs use glass that’s made across the China Sea, and that trend goes back a couple of years. According to a story on the Reuters Web site, “…the plant has never logged a profit during years of heavy price competition with South Korean and Chinese rivals.”
It doesn’t help that Panasonic has virtually no U.S. market share in LCD TVs, having fallen behind Vizio and Hisense even as competitors like Toshiba, Mitsubishi, Pioneer, and Hitachi have all packed it in over the past decade. And with all of the company’s eggs in the plasma basket for many years, they were late to wake up and smell the coffee.
A quick online check of Best Buy, HH Gregg, and Fry’s Web sites showed no listings for Panasonic TVs – not even among Ultra HD models. So I went to the company’s U.S. Web site and searched again, finding six models of Ultra HDTVs ranging in size from 50 inches to 85 inches – and all of them carried the notation “Not available.”
I got the same results when I clicked on “View All TVs,” and actually got a smaller selection of Ultra HD models to peruse. So no major retailer carries Panasonic TVs now. (Not even Sears, which actually has a Kenmore line of HDTVs and even 4K TVs!)
According to the Reuters story, “The decision to close the (Himeji LCD TV panel) business comes after Panasonic scrapped a company-wide revenue target of 10 trillion yen ($90.1 billion) for the year through March 2019 to focus on profitability.” Based on what I found out, it would appear that the entire TV business in North America (if not elsewhere) is also at an end.
Ironically, I just received an invite from Panasonic’s PR agency to come see their new Blu-ray player (DMP-UB900) at the company’s corporate headquarters in a few weeks. So my question now becomes, “Why are you showing me an Ultra HD Blu-ray player when you don’t seem to have any Ultra HDTV models to go with it?”
Puzzling indeed, especially in light of the rapid move away from 1080p (Full HD) televisions to Ultra HD models that’s taking place all over the world! And you need no further proof than to go on the aforementioned big box store Web sites and take a gander at the selection of Full HD and Ultra HDTV models.
A quick search showed that Best Buy currently has 108 models of 2160p (Ultra HD) sets for sale, compared to 58 Full HD models and 27 720p models. The picture isn’t quite as clear at HH Gregg, as they show 133 “LED TVs” (presumably Full HD) and 78 Ultra HD models. And Fry’s splits “4K TVs” into two different categories that together account for 173 models, with Full HD showing 220 models.
So what to make of this? Simple – the adoption rate for Ultra HDTV sets is accelerating to the point where (at Best Buy, at least) the offerings now exceed those with Full HD resolution. Keep in mind that the first Ultra HDTVs appeared on our shores not quite four years ago, used the LG 84-inch IPS LCD panel, and cost anywhere between $15,000 and $20,000.
Now, you can buy a first-tier 55-inch Samsung Ultra HD “smart” LCD TV for $700. As I’m writing this, Vizio has a 50-inch “smart” 4K model for $499, as does Insignia. Westinghouse goes them one better with a 55-inch 4K set for the same price, and LG is clearing out a 49-inch Ultra HD set for $550. And Hisense recently announced HDR models for less than $600! Mind-boggling.
So here we go at warp speed, zooming into a new world of 4K TVs as Full HD sets fade into the distance, having first appeared a little over a decade ago. I’ve said it before and I’ll say it again – your next big-screen LCD TV is going to be an Ultra HD model, especially if you pick it up in December, or wait a little longer into 2017.
And your new Ultra HDTV will NOT be a Panasonic model, based on what my Web search revealed. In fact, there’s a good chance your next Ultra HDTV could be a Chinese brand, like Hisense or TCL, thanks to their very aggressive pricing. (But make sure your set supports high dynamic range to be future-proof!)
NAB 2016: Thoughts and Afterthoughts
- Published on Tuesday, 26 April 2016 20:06
- Pete Putman
- 0 Comments
I’m back from my 22nd consecutive NAB Show, and it’s always a worthwhile trip. NAB isn’t quite as large or crazy as CES, but it’s still sprawled out enough to require three full days to see everything. (Except that you don’t have to fight the insane crowds that fill the Las Vegas Convention Center in January.)
This year’s theme was “Unleash!” or something like that. I never was completely sure, and it sounded more appropriate for a competition of hunting dogs anyway. But the crowds came regardless (over 100,000 for sure) to see everything from 4K and 8K video to live demonstrations of the new ATSC 3.0 digital broadcasting system, a plethora of small 4K cameras, the accelerating move to IP infrastructures instead of SDI, and video streaming to every conceivable device.
My visit to the show had a threefold purpose. In addition to press coverage and checking out product trends for customers, I also delivered a presentation during the Broadcast Engineering Conference titled “Next Generation Interfaces: Progress, or Babylon?” The subject was a new wave of high-speed interfaces needed to connect 4K, 5K, 6K, and 8K displays (DisplayPort, HDMI 2.0, and superMHL, not to mention Display Stream Compression).
Besides hundreds of exhibits, there are the pavilions. Trade shows LOVE setting up pavilions to showcase a hot technology or trend. Sometimes they’re a bit premature: In 1999, the show featured an enormous “streaming media” area in the central hall of the Las Vegas Convention Center stuffed full of startup companies showing postage-stamp-sized video, streaming over DSL and dial-up connections. All of those companies were gone a year later.
In addition to the Futures Park pavilion – which showcased NHK’s 8K broadcasting efforts and ATSC 3.0, and which was mysteriously stuffed all the way at the back (east) end of the upper south hall, where few people ever go – there was the Sprockit startup pavilion in the north hall, near the Virtual Reality / Augmented Reality pavilion (more on that in a moment).
There was also a demonstration of ATSC 3.0 in the home, located at the upper entrance (west end) of the south hall. Outside, Nokia set up a concert stage and had entertainment each day, all day long, streaming the performances into the VR/AR booth for viewing and listening on appropriate headgear.
To set the table and see just how much the industry has changed in a little over 20 years, the “hot” broadcasting formats in 1995 were Digital Betacam (two years old), DVCPRO, and a new HD format called D5. Non-linear editing was just getting off the ground from the likes of Avid, Media 100, and Boxx Technologies. A decent SD camera for studio and field production cost about $20,000, and HD was still very much in the experimental stage – the Grand Alliance was heavily promoting its new HD format, model station WHD in Washington was conducting trial broadcasts, and there was no such thing as 720p/60/59.94 just yet.
The standard connectors for video? BNC and RCA for composite, with BNC doubling for the serial digital interface (SDI) connection. VGA was the connector of choice for PCs, and component video was tricky to implement. Tape was the preferred recording media, as optical disc hadn’t made its public debut yet. “High resolution” on a graphics workstation was around 1280×1024 (SXGA), a “bright” LCD projector could crank out about 500 lumens with 640×480 resolution, and the Internet was still a mystery to 90% of attendees.
We all know how the intervening years played out. TV broadcasters are now in the middle of a channel auction, and we may lose more UHF spectrum (in 1995, UHF channels ran from 14 to 69), possibly as much as 60 – 80+ MHz, or 10 – 14 channels. Demand for optical disc media is very much on the wane as streaming and cloud services are picking up the reins.
You don’t see very many transmitter and antenna manufacturers at the show any more, and when you do, their booths are pretty small. There’s been consolidation in the industry with antenna maker Dielectric shutting down a few years ago, then getting bought by the Sinclair Broadcast Group and revived (just in time for the auction!). Harmonic recently purchased Thomson, which explains the big empty booth where they should have been.
And the biggest booth at the show doesn’t belong to Sony, or Panasonic, or Imagine (Harris). Nope, that honor goes to Canon, showing you that there’s still plenty of money to be made in video and still cameras, optical glass, and camera sensors. In a sign o’ the times, Panasonic’s once-enormous booth, which occupied the full width of the central hall mezzanine, has shrunk down to about half its original size.
NAB now is all about “anytime, anywhere” content creation, mastering, storage, and delivery. The concept of broadcasting is almost quaint these days (ATSC 3.0 notwithstanding) as more and more viewers avail themselves of faster broadband speeds and opt for on-demand streaming and binge viewing of TV shows.
Brands like Netflix and Amazon are stirring the pot, not ABC and NBC. (Most of the TV shows in the top 20 every week are CBS programs.) YouTube now offers a premium ad-free service (ironic, since ten years ago it was a place to share videos commercial-free). And this year’s “3D” is virtual reality (VR), backed up by augmented reality (AR).
Not clear on the difference? VR presents a totally electronic “pseudo” view of the world, which can be represented by custom video clips or generated by computer graphics. AR takes real-world views and overlays text, graphics, and other picture elements to “augment” your experience.
Google Glass is a good example of augmented reality – you’d walk down the street and graphics would appear in the near-to-eye display, showing you the location of a restaurant, displaying a text message, or alerting you to a phone call. Oculus Rift and Samsung Galaxy Gear are good examples of virtual reality, immersing your eyes and ears in imaginary worlds with large headsets and earphones.
I’ve tried VR and AR systems a few times, and the eyewear works – but it’s heavy and quite bulky. The multichannel spatial audio is impressive, too, but I have to strap headphones over those enormous headsets. In fact, the biggest problem with VR and AR right now IS the headset. Galaxy Gear and other systems use your smartphone as a stereo display (you can do the same thing with a simple cardboard viewer), but the resolution of your smartphone’s display simply isn’t fine enough to work in a near-to-eye application.
After you wear a VR/AR headset for a while and stand up and take it off, you may find your sense of balance is also out of whack and that you momentarily have some trouble walking correctly. That’s another example of a spatial disorientation problem caused by the disconnect between your eyesight and other senses.
If some of these problems sound familiar, they should. We heard much the same thing during the latest incarnation of 3D from 2008 to 2012, particularly from people wearing active-shutter 3D glasses. During the roll-out of 3D, it became apparent that as much as 25% of the general population could not view 3D correctly because of eye disorders, spatial disorientation, incompatibility with contact lenses, and other problems.
Back to reality! Here are a few more interesting things I saw in Las Vegas:
ATSC 3.0 is ready for its day in the sun. A consortium of interest groups recently petitioned the FCC to make that happen, and based on the demos at the show, it has a fighting chance to ensure broadcasting sticks around for a while. For current TVs, some sort of sidecar box will be required. But you’ll be able to watch 4K (Ultra HD) broadcasts with spatial audio and stream broadcast content to phones, tablets, and laptops, too.
8K Real-Time HEVC Encoding was on display in the NTT and NEC booths. For those counting, there are 7680 horizontal and 4320 vertical pixels in one 8K image, and both companies had demos of 4:2:0 video streaming at about 80 Mb/s. Recall that 8K has 16 times the resolution of 1080p full HD, and you can see that a ton of computational power is required to make it all work.
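The arithmetic is easy to check. Assuming the demos ran at 60 frames per second with 8-bit 4:2:0 color (12 bits per pixel) – my assumptions, not NTT’s or NEC’s published specs – the raw payload and resulting compression ratio work out like this:

```python
# Pixel counts: 8K really is 16x the resolution of 1080p
pixels_8k = 7680 * 4320        # 33,177,600 pixels per frame
pixels_hd = 1920 * 1080        #  2,073,600 pixels per frame
print(pixels_8k // pixels_hd)  # -> 16

# Raw payload of the demoed signal, assuming 60 fps and 8-bit 4:2:0 color
raw_gbps = pixels_8k * 60 * 12 / 1e9   # ~23.9 Gb/s uncompressed
ratio = raw_gbps * 1000 / 80           # vs. the ~80 Mb/s HEVC stream
print(f"raw: {raw_gbps:.1f} Gb/s, compression ~{ratio:.0f}:1")
```

Squeezing roughly 24 Gb/s down to 80 Mb/s is about a 300:1 reduction in real time, which is why the encoder racks were the size they were.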
HEVC Encoding was also in abundance on the show floor. Vitec had some super-small contribution H.265 encoders, and Haivision brought out a new Makito H.265 portable encoder. The Fraunhofer Institute had an impressive demo of contribution 4K video with HDR and wide color gamut encoded at 16 Mb/s, resulting in picture quality that would rival an Ultra HD Blu-ray disc streaming six times as fast.
Organic Light-Emitting Diode (OLED) displays are gaining ground on LCD for studio and broadcast operations. Three different companies – Boland, Sony, and Fusion – were showing Ultra HD “client” and “reference” monitors based on a 55-inch RGBW panel manufactured by LG Display. Sony, of course, has 30-inch and 25-inch models, and some of the older 25-inch glass is being used in monitors made by companies like Flanders Scientific. Newer OLED panels use 10-bit drivers and can reproduce HDR signals with a wide color gamut.
High Dynamic Range was very much on people’s minds at NAB 2016. Dolby showed its Dolby Vision proprietary HDR system, and Technicolor privately demoed its dual SDR/HDR workflow and distribution scheme. Samsung was an expected visitor to the show floor – their booth featured a side-by-side comparison of SDR and HDR with dynamic tone mapping, a system they invented and will make available openly to anyone. It’s also a candidate for SMPTE HDR standards.
Super-fine pitch LED display walls are the next big thing, and I mean that – literally. Leyard, which bought Planar Systems last year, had an impressive 100-foot diagonal “8K” LED video wall (no mention of the dot pitch, but it had to be around 1.2mm) that dominated the floor. An industry colleague remarked that the brightness and size of this screen would be sufficient to replace cinema screens and overcome reflective, contrast-lowering glare. (Plus kick the electric bill up quite a few notches!)
Leyard also had a prototype 4K LED display wall using .9mm dot pitch LED emitters, and not far away, Christie showed its Velvet series of LED walls, with dot pitches ranging from as coarse as 4mm (remember when that used to be a fine pitch?) to as sharp as .9mm. To put all of that into perspective, the first 42-inch and 50-inch plasma monitors that entered the U.S. market in the mid-1990s had a dot pitch of about 1mm, and 720p/768p plasma monitors were about .85mm. How far we’ve come!
And there’s still very much a place for AVC H.264 encoding. Z3 had a super-tiny DME-10 H.264 encoder for streaming over IP, as did Vitec. Matrox unveiled their Monarch Lecture Capture system (also based on H.264), and NTT had an impressive multistream H.264 / IP encoder/decoder system out for inspection. Some of these boxes would actually fit in your shirt pocket – that’s how small they’ve become.
Of course, the wizards at Blackmagic Design were at it again. This time, they showed an H.265-based recorder/duplicator system that can write 25 SD cards simultaneously with HEVC 2K and 4K video and audio – just plug ‘em in, and go! Over at the Adtec booth, the Affiniti system held the spotlight. This fast, “universal” bus for encoders and decoders is designed to be configured and maintained by anyone with minimal technical knowledge. It uses an SFP backplane, an approach more manufacturers are taking to keep up with the ever-higher speeds of 4K and UHD+ data.
Finally, I just had to mention the “world’s smallest 8K display,” as seen in the NHK booth. Yep, it measures just 13 inches diagonally and has an amazing pixel density of 664 pixels per inch (ppi). This display, made by the Semiconductor Energy Laboratory Company of Japan, has a resolution of 7680 by 4320 pixels and employs a top-emission white OLED layer with color filters. (Really???)
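A quick sanity check on that pixel density: if the panel’s diagonal is 13.3 inches (the figure I’ve seen commonly reported for this SEL prototype, versus the rounded 13 above), the quoted ppi comes out within rounding:

```python
import math

diag_px = math.hypot(7680, 4320)   # ~8812 pixels along the diagonal
ppi = diag_px / 13.3               # assuming a 13.3-inch diagonal
print(f"{ppi:.0f} ppi")            # ~663 ppi, within rounding of the quoted 664
```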
Of Samsung, Big Screens, IoT, HDR, And Patience
- Published on Wednesday, 13 April 2016 16:30
- Pete Putman
- 0 Comments
Last Tuesday, April 12, Samsung held its annual press briefing and TV launch event at its new, “hip” facility in the Chelsea section of Manhattan. The multi-story building is known as Samsung837 (like a Twitter handle), as its location is on 837 Washington Street by the High Line elevated walkway.
Samsung, who has dominated the worldwide television market for many years – and who has a pretty good market share in smartphones, too – has been a leader in developing Ultra HD (4K) televisions with high dynamic range and wider color gamuts, most notably in their S-line.
At the briefing, they announced their new, top-of-the-line Ultra HDTVs, equipped for high dynamic range with quantum dot backlights manufactured by Nanosys of Sunnyvale, CA. There are a few new sizes in the line that are re-defining what a “small” TV screen means! The flagship model is the KS9800 curved SUHDTV, which will be available in a 65-inch size ($4,499), 78 inches ($9,999), and a mammoth 88-inch version ($19,999).
Stepping down, we find the KS9500-series, with a 55” model for $2,499, a 65” model for $3,699, and a 78” model for $7,999 (June). The flat-screen KS9000 comes in three flavors – 55” ($2,299), 65” ($3,499) and 75” ($6,499, June). There are two entry-level SKUs (if that’s even the right term to use) as well – the KS8500, a curved-screen version, is aimed at the consumer wanting a smaller screen, with a 55” model for $1,999 and a 65” model for $2,999. A 49” model will be available in May for $1,699. The line is rounded out with the KS8000 flat SUHDTV (55” for $1,799 and 65” for $2,799, with a 49” model for $1,499 and a 60” model for $2,299, both coming in May).
There’s not a huge difference between these models – the differences have mostly to do with curved and flat surfaces and the screen size options available. Plus a bevy of “bells and whistles.” Perhaps the most intriguing are a set of “connect and control” features.
Samsung’s been offering a Smart Hub feature for some time, and this year’s iteration lets you plug in a cable box from Comcast or Time Warner or a set-top from DirecTV, and the TV will automatically recognize the box and set up all the required control functions on the Samsung TV remote. All you have to do is plug in an HDMI cable.
On top of that, Samsung’s Smart Things feature provides on-off control of things like locks, lamps, and other devices connected by Wi-Fi, ZigBee, or Z-Wave protocols. The company offers switchable outlets, water sensors, proximity sensors, and motion sensors; all of which connect back to your television and smart phone for monitoring and control. (And yes, the television can also be controlled by this system.)
Samsung’s concept is this: Since we spend so much time in front of our big screen TVs, why not make them the hub of a home monitoring and control system? And why not make the connection and activation of everything from set-top boxes to remotely-controlled AC outlets a plug-and-play operation? A Smart Things starter kit is available for $249, and you can add compatible ZigBee and Z-Wave devices like thermostats, smoke and CO detectors, and locks from companies like Honeywell, Schlage, Cree, Leviton, and First Alert.
So why are Samsung and other TV manufacturers looking to get into home control systems? A combination of declining TV sales and falling prices has resulted in an accelerating transition away from Full HD (1920×1080) televisions and displays to Ultra HD (3840×2160), as TV manufacturing shifts to China and manufacturers frantically search for profitability.
Samsung – likely motivated by this trend – is looking for a way to add value to TV sales, pitching a complete home entertainment and control system (with sound bars, surround audio, and Ultra HD Blu-ray players, of course) to consumers. It’s all about the Internet of Things (IoT) – the idea that every electronic gadget in your home has an IP address and can be controlled with a driver and an app.
Think about this for a moment: Seven years ago, a first-tier 50-inch 1080p plasma equipped with active-shutter 3D playback was priced at $2,500. Today, you can buy four times the resolution, eight times the brightness, a much wider color gamut, a much lighter set with lower power consumption, and five more inches of screen for about $600 less.
Amazing, you’re thinking. My next TV is going to be an Ultra HDTV! Good thinking, as your next TV sized 55 inches or larger will probably be an Ultra HD set anyway, since TV manufacturers are ramping down production of 1080p sets and retailers are devoting more shelf space to UHD.
While there are and will continue to be some amazing deals on Ultra HD sets, don’t forget the enhancements. In addition to the aforementioned high dynamic range and wider color gamut, higher frame rates (HFR) will also become a part of the UHD ecosystem. (So will 8K displays, but I’m getting ahead of myself…)
Problem is, no two companies are implementing all of these add-ons the same way. We have competing systems for HDR (Dolby Vision, Technicolor, BBC/NHK HLG, and yes, Samsung), and yet another controversy about pixel resolution in displays using the pentile red-green-blue-white (RGBW) pixel array (LG’s new Ultra HD OLEDs).
To date, only two HDR Blu-ray players have been announced, and only one (Samsung) is available at retail. A bigger problem: Many Ultra HDTVs have only one HDMI 2.0 input, which needs to support the CTA 861.3 HDR metadata standard. (DisplayPort 1.4 also works with CTA 861.3, but it was just announced). And HDMI 2.0 is barely fast enough for 4K HDR: If you want to connect a PC for Ultra HD gaming at 60Hz with 10-bit RGB (4:4:4) color, you’re out of luck.
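The “out of luck” claim is easy to verify using the standard CTA-861 timing for 3840×2160 at 60 Hz, which is 4400×2250 total pixels including blanking, or a 594 MHz pixel clock. Bumping to 10-bit RGB and adding TMDS 8b/10b coding pushes the required link rate well past HDMI 2.0’s ceiling:

```python
# CTA-861 timing for 3840x2160 @ 60 Hz: 4400 x 2250 total pixels incl. blanking
pixel_clock_mhz = 4400 * 2250 * 60 / 1e6             # 594 MHz
bits_per_pixel = 30                                   # 10-bit RGB (4:4:4)
tmds_gbps = pixel_clock_mhz * bits_per_pixel * 10 / 8 / 1000  # 8b/10b coding
print(f"required: {tmds_gbps:.1f} Gb/s vs. HDMI 2.0's 18 Gb/s ceiling")
```

That works out to roughly 22.3 Gb/s needed against 18 Gb/s available, so 4K/60 gaming in 10-bit full RGB simply can’t fit through the pipe.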
In other words, it’s chaos as usual in the CE world, like HDTV was circa 1998. I don’t know how fast these issues will be worked out. All HDR-10 compatible TVs should play back 10-bit content from Ultra HD Blu-ray discs and media files. When it comes to enhanced HDR systems, Vizio, TCL, and LG support Dolby Vision, but Samsung does not; neither do Panasonic and Sony.
Only a handful of TV models have opted to include the still royalty-free DisplayPort interface to overcome some of the UHD speed limit issues of HDMI. 4K content isn’t exactly in abundance, either. No broadcasts are planned in the near future, and a handful of cable systems are working on 4K channels (remember the 3D channels from Comcast and DirecTV?). Netflix and Amazon Prime do stream in UHD, but you need a TV that supports the VP9/VP10 and H.265 codecs to watch.
If you are considering a purchase of an Ultra HDTV and not in a big hurry, my advice is to sit on your hands for another year until many of these issues get ironed out. Sometimes doing nothing really is the best option…
To Cut, Or Not To Cut: That Is The Question…
- Published on Monday, 11 April 2016 18:00
- Pete Putman
- 0 Comments
A recent report from Convergence Consulting Group states that by their estimates, 1.13 million TV households in the United States canceled pay TV services in 2015, which is about four times the pace of cancellations in 2014.
The report is somewhat humorously called “The Battle For The North America Couch Potato” and shows that even though pay TV subscription revenue increased by 3% in 2015 to $105B and is expected to tick up another 2% in 2016 to $107B, those percentages don’t match up to the rapid growth now being experienced with over-the-top (OTT) video services, like Netflix and Hulu.
Over the same time period, OTT subscription revenue increased by 29% to $5.1B in 2015, and is expected to grow another 20% this year to $6.1B. Now, that’s just 5.6% of the revenue forecast for conventional pay TV this year. But the growth rate of OTT is impressive and is mostly at the expense of conventional cable, fiber, and satellite TV subscriptions.
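The report’s arithmetic is easy to check (revenue in billions). The OTT share comes out around 5.7%, close to the 5.6% figure above; the report presumably computed its percentage from unrounded revenue numbers:

```python
pay_tv_2015, ott_2015 = 105.0, 5.1    # 2015 revenue in $B, from the report
pay_tv_2016 = pay_tv_2015 * 1.02      # +2% forecast -> ~$107.1B
ott_2016 = ott_2015 * 1.20            # +20% forecast -> ~$6.1B
share = ott_2016 / pay_tv_2016 * 100
print(f"OTT 2016: ${ott_2016:.1f}B, {share:.1f}% of pay TV revenue")
```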
Convergence also reports that “cord never” and “cord cutter” households increased to 24.6M in 2015 from 22.5M in 2014. It’s expected that number will continue to increase to 26.7M households by the end of this year. (For some perspective, Comcast has a total of about 23 million broadband subscribers, which is more than their pay TV subscriber total.)
It’s no mystery why OTT continues to grow in popularity. Services like Netflix, Hulu, and Amazon Prime allow viewers to watch individual movies and episodes of TV shows on demand for reasonable prices, either as part of a low monthly subscriber fee or an annual membership fee plus small per-viewing charges.
In essence, what OTT viewers get is a la carte TV, instead of paying a hundred dollars or more for a service bundle that includes large blocks of TV channels that never get watched. (The average TV viewer watches about 17 different channels in a year.) And the key to making that possible is ever-faster broadband speeds, which (perhaps ironically) are being offered by cable TV companies to hold off the likes of Verizon’s FiOS, DirecTV, and Dish.
The analogy is that of handing someone the rope with which you will be hanged. As Internet speeds increase along with cable bills, the first thing to get dumped is the pay TV channels. Many families have also dropped landline service in favor of mobile phones, so there’s no need for a “triple play” package (or even a “double play,” which in baseball means you’re out!)
There aren’t enough studies on hand to show how many of those cord cutters have picked up on watching free, over-the-air (OTA) digital TV broadcasts. And disputes continue among different advocacy groups as to how much of the population actually watches OTA TV: I’ve seen estimates as low as 5-6% and as high as 20%.
Now, the second part of the story: Vizio, a leading TV brand, is now shipping a line of SmartCast Ultra HDTVs that will be “tuner-free.” You read that right; these TVs won’t have an on-board ATSC tuner for OTA broadcasts. An outboard tuner would be required, connected via HDMI, along with an indoor or outdoor antenna.
Technically speaking, a “TV” sold in the United States MUST have an ATSC tuner built-in, according to the FCC mandate that set a final compliance deadline of March 1, 2007. However, there is no reason why a company can’t sell a “monitor” or “display,” which would not be required to contain such a tuner. (The original FCC mandate exempted monitors that did not include analog tuners from having a digital tuner.)
According to a story on the TechHive Web site, the changes will apply to all of Vizio’s 4K Ultra HD TVs with SmartCast, including the new P-Series and upcoming E- and M-Series sets. In the story, a Vizio representative was quoted as saying that the company’s own surveys showed that less than 10 percent of their customers watched OTA broadcasts, and that a CEA (now CTA) study in 2013 claimed that just 7% of U.S. households used antennas to watch TV.
That figure is clearly on the low side. In the third quarter of 2015, the research firm Nielsen found that 12.8 million U.S. homes were relying solely on OTA TV reception, up from 12.2 million the year before, and that number didn’t include homes combining antenna broadcasts with streaming. All told, the percentage of homes that use an indoor or outdoor antenna in some way to watch TV probably falls between 10% and 12% – and could be even higher.
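The Nielsen number squares with the 10-12% range. A rough check, assuming a total TV-household universe of about 116 million for the 2015-16 season (that universe estimate is my assumption, not a figure from the article):

```python
# Back-of-the-envelope check on the OTA-only percentage.
ota_only_homes = 12.8e6   # Nielsen, Q3 2015: homes relying solely on OTA reception
total_tv_homes = 116e6    # ASSUMED Nielsen universe estimate for 2015-16

# Share of all TV households that are OTA-only (~11% with these inputs);
# adding antenna-plus-streaming homes would push the total higher.
ota_only_pct = ota_only_homes / total_tv_homes * 100
print(f"OTA-only homes: {ota_only_pct:.1f}% of TV households")
```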
So why would Vizio drop the tuner? There’s certainly a cost savings associated with it, and not just for the hardware – there are also royalties associated with the underlying technology. But given that you can buy an outboard ATSC tuner for as little as $40, it can’t be a huge cost savings.
What’s funny about Vizio’s approach is that retailers are stocking more antennas and even bundling them with streaming media players. I’ve noticed the selection of indoor TV antennas has grown at my local Best Buy (outdoor antennas are still a tough sell; only us hard-core OTA viewers will take the time to install them).
It doesn’t appear any other TV brands are following suit. However, there is a fly in the ointment: ATSC 3.0, which as a completely new standard would require an outboard set-top box or perhaps a USB stick to work with existing TVs. That’s because it supports different transmission modes that are incompatible with current ATSC tuners.
Another wrinkle – there’s no timeline for adoption of version 3.0. Right now, we’re in the middle of the first wave of FCC channel auctions, meaning that the UHF TV spectrum may be somewhat truncated after all is said and done – and many stations will have to relocate. So moving to a new terrestrial broadcast standard won’t be a priority for broadcasters any time soon.