Category: The Front Line

There’s More To The Story (There Always Is!)

A couple of weeks ago, I posted a story about a particularly irritating problem I was having with my high-speed Comcast internet service dropping out. After lots of troubleshooting, I thought I had cornered the culprit – a Samsung-manufactured set-top box that Comcast was using to deliver basic Xfinity services (no DVR).

When I connected the cable TV input connection to a spectrum analyzer, I saw some pretty nasty bursts of spectral noise that ranged from 11 MHz all the way up to 400 MHz. I figured this might have been the cause of the dropouts and promptly returned it to the local Comcast Store, only to find out that particular model wasn’t in use any more and that I’d have a much smaller, flatter version to take home – one that would link automatically to my main X1 DVR.

Coincidentally, the dropout problem stopped, so I did what any other reasonable person would do: I assumed that was the end of it.

Except it wasn’t. A few days after I replaced the box, the WAN connection started dropping out again. Some days it dropped only a couple of times, but on April 13, it dropped out some fourteen times in four hours. Out came the test equipment (and plenty of expletives) as I started testing every line in the house, taking more and more things off-line.

At one point, the only thing connected to the Comcast drop was my wireless gateway and my spectrum analyzer, through a brand-new 2-way splitter good to 1.5 GHz. Sure enough, the WAN connection dropped again – but this time, I caught something on the analyzer I hadn’t seen before.

Figure 1. The entire system (two cable boxes and a wireless gateway) is working just fine with these signals.

Figure 1 shows the ‘normal’ levels of QAM carriers coming through the drop. There’s a little up-and-down there, but the entire system – in particular, the downstream QAM carriers above 650 MHz – all measured at least 32 dB above the noise floor (about -87 to -88 dBm). In this condition, the wireless gateway was chugging along just fine and broadband speeds were pretty fast.

Just ten minutes later – while I was watching the analyzer screen – the QAM carriers from 50 MHz through 400 MHz dropped precipitously, as seen in Figure 2. Right on schedule, the WAN connection stopped working! Yet, I hadn’t touched, changed, or re-wired anything. This was starting to look like a classic ghost in the machine, and likely an issue outside my house. (Yes, the Samsung box did need to be replaced in any case – it was quite dirty, RF-wise.)

Figure 2. Ten minutes later, KABOOM – the WAN connection dropped and my analyzer showed some nasty QAM waveforms below 400 MHz. What was causing this?

Well, after escalating this problem to Comcast’s Special Operations unit (yes Virginia, they do have Special Ops guys), I was visited by Jason Litton and Fredrick Finger of Comcast. I asked them to replace the DC block/ground block outside the house and also to sweep the underground cable coming back to the house. I had previously gone out to check the ground block and discovered (a) it wasn’t grounded (the wire had come loose) and (b) there was a tiny bit of play in the connections at either end, which I tightened up before they arrived.

Long story short: the block was eventually replaced, re-grounded, and new connectors were installed at either end of the underground drop. During testing, Jason spotted noise coming from a neighbor’s coaxial drop and proceeded to install several more new connectors. I also took the opportunity to have them put in two brand-new splitters in my basement (overkill, but what the heck) and run a new coaxial line to my workbench.

And that finally did the trick. Whatever phantom was haunting my system had been exorcised for good. Using Comcast’s broadband speed test to New Castle, Delaware and Secaucus, New Jersey, I saw wired LAN and 5 GHz 802.11ac download speeds hitting 100 Mb/s. Using the popular TestMy.net server in Dallas, Texas, I measured download speeds around 30 – 48 Mb/s. Upload speeds to all servers were in the range of 10 – 12 Mb/s.

Figure 3. The final setup with all QAM levels where they should be!

 

Figure 4. Best of all, there’s no noise below 50 MHz on the upstream channels. FINALLY!

So what was the culprit? Most likely the cheapest thing in the system – the DC block. Noise from the Samsung STB didn’t help, and apparently neither did the noise coming from my neighbor’s cable drop. But the block probably had an intermittent connection and was creating some nasty standing waves, causing tilt on the lower QAM carriers and noise at the uplink frequencies around 30 MHz.

I’ll have more details on this unfortunate series of events during my RF/Wireless class at InfoComm in June. Until then, things are working well (knock on wood, or metal, or coax, or modem…)

Now You See It…Now You Don’t

Readers may recall my ongoing battles with Comcast a few years ago over a service reliability issue. I had asked from the start that the ‘drop’ – the underground RG-6 coaxial cable running from the sidewalk to my house – be replaced. I suspected the cable’s jacket had been compromised by moisture over time.

Several visits from Comcast later (plus two modem upgrades), they finally did just that. Lo and behold, the underground cable had been spliced at some point with a barrel and simply covered with dirt – no insulating tape or waterproofing applied.

In the middle of all this fun and games, I received an offer from Comcast to upgrade to their X1 platform. That meant two new set-top boxes – one with a DVR function (not quite an accurate description, as the “DVR” is actually a cloud server), and one ‘basic’ box without DVR access.

So I went ahead and replaced everything. My network speeds had also gone up considerably along the way, and I also had a new 802.11ac channel bonding modem with 2.4 and 5 GHz WiFi connections. Life was good, right?

Well, it was – until last fall, when I started experiencing intermittent dropouts of my Internet service. They came and went randomly and could last as long as 15 minutes before service was restored.

“Here we go again!” I thought to myself as several rounds of diagnostics ensued. Was it the drop to the street? Nope, my spectrum analyzer showed a full spectrum of strong, level QAM carriers all the way up to 700 MHz. Was it a modem problem? I switched out the 802.11ac Arris modem I originally received as part of my upgraded street drop for a Technicolor model.

That didn’t solve anything, either. I looked at all of the coaxial lines running through my house and also checked out the Cat 6 cables I had run after we installed hardwood floors. Was it possible that energy was coupling from the bundle of category cables into the modem? Not likely, as that arrangement had been working well for over a year.

Hard at work, running new network and coaxial cables. (Did you know you can shoot vertical panoramas with a smartphone? I didn’t…)

Finally, I got a Comcast technician to stop by the house and we started sweeping all of the coaxial lines for noise. Lo and behold, the line that feeds my family room and master bedroom boxes had noise around 26-29 MHz, which could have affected the upstream signal from the modem.

Temporarily running an RG-6 cable directly from my X1 DVR to the noise meter showed a clean spectrum, so I got to work running a new cable connection back to the two-way splitter in my basement. After dressing the wires and reconnecting everything, I figured I was out of the woods.

Except I wasn’t. Not long after, the intermittent drop-outs started again. In the middle of critical projects, I took to using my Samsung Galaxy as a temporary WiFi hot spot to wait things out. I rebooted the modem numerous times and was at wit’s end.

Calls to Comcast revealed there weren’t any service outages. Was it RFI from the category cables, by some crazy chance? I replaced the bundle with a single piece of Cat 6 STP (shielded) wire – drains soldered at both ends – and installed an 8-port 1 Gb switch to feed all of my networked devices behind my family room TV.

That didn’t fix the problem.  ARRGGGH! What could it be? The noise HAD to be coming through the coaxial cable – but from where?

After pondering my next move, I decided to connect my spectrum analyzer to each coaxial cable run and look very carefully for noise and unwanted energy around 25 – 30 MHz. Connected to my X1 DVR, I saw nothing. It was as clean as a whistle.

The guilty party under test. A short piece of coax runs from the CABLE IN connector directly to my spectrum analyzer.

However, connected to my ‘basic’ X1 box in the bedroom, I saw the noise floor briefly jump from about -85 dBm to -80 dBm – and this was happening about every two seconds, as regular as a clock. Well, this looked promising! So I switched on “persistence” mode on the analyzer, which allows any momentary spikes of energy to remain visible on screen.

What I saw next was jaw-dropping. High-energy spurious RF carriers dotted the spectrum, starting as low as 6 MHz and running all the way up past 50 MHz. Expanding my scan width to 100 MHz, I found even more of these little buggers – some of them as strong as -64 dBm. (For reference, the QAM carriers in my system are around -40 dBm.)
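To put those readings in perspective, differences in dBm translate directly into power ratios. Here’s a quick sketch (using the levels quoted above) showing that a -64 dBm spur sits only 24 dB below a -40 dBm carrier, roughly 1/250th of its power, which is far too close for comfort on a cable plant:

```python
def db_difference(carrier_dbm: float, spur_dbm: float) -> float:
    """Difference between two power readings, in dB."""
    return carrier_dbm - spur_dbm

def db_to_power_ratio(db: float) -> float:
    """Convert a dB difference to a linear power ratio."""
    return 10 ** (db / 10)

# Levels from the article: QAM carriers near -40 dBm, worst spurs near -64 dBm
gap_db = db_difference(-40.0, -64.0)   # 24 dB
ratio = db_to_power_ratio(gap_db)      # about 251x

print(f"Spurs are {gap_db:.0f} dB below the carriers "
      f"(about 1/{ratio:.0f} of their power)")
```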

I expanded the upper limit of my spectral scan to 200, 300, and 400 MHz, only to find RF spikes everywhere. Yikes! I ran upstairs, unplugged the X1 ‘basic’ box, brought it down to my lab, and connected the CABLE IN port directly to the input of my spectrum analyzer.

These RF spikes were enough cause for alarm, so I expanded my spectrum scan.

 

Things were looking even worse at 100 MHz!

 

What a mess! And some of the spikes were only 20 dB down from my QAM carriers, like the one at about 300 MHz.

And there was the answer. The X1 box (manufactured by Samsung) had apparently been generating broadband RF interference every 2 seconds for several months, and it was coupling back through the RF input into my system. When my modem sent an upstream request to the Comcast server that happened to coincide with one of these “bursts” of RFI, the connection to the server was lost…and our Internet was out. DOH!

A trip to the Comcast “store” (that’s a generous description as it’s more like a crowded reception room) to return the offending box revealed that design isn’t used any more. Instead, a small (about 6” x 6”) flat terminal was exchanged for my RFI generator. This new STB had just four ports – power, HDMI, RF input, and USB – and now syncs as a slave to my X1 DVR, emulating all of its functions.

In the meantime, our Internet service is now as clean as a whistle and drop-out free. Once again, some thorough detective work and analytical thinking solved what had become a frustrating problem with no apparent resolution.

I’ll discuss this fun episode from my life in more detail during my RF and wireless class at InfoComm this coming June. Come on by, and learn more about troubleshooting RF interference problems in the real world! (And keep an eye on your cable box while you’re at it…)

Broadcast TV Spectrum Repacking: The Devil Is In The Details

The FCC has concluded its spectrum auction, and although the winning bids generated only about ¼ of what was expected, plenty of TV stations will be moving to new channels.

But there’s a catch. And you probably won’t be happy when you hear about it.

Currently, the majority of TV stations broadcast in the UHF television spectrum from channels 14 to 51. Another smaller block of stations use the high band VHF channels (7 through 13) while fewer than 50 stations transmit on low band VHF channels (2 through 6).

That is certainly going to change, as it appears all UHF TV channels above 37 will be re-allocated for a variety of services, including Wi-Fi, mobile phones, and a bunch of other “white space” operations. There may even be a few TV stations still mixed in with these services, and we won’t know how broadcasters will be re-packed until early April.

Finding new homes for broadcasters who gave up their channels in return for some nice cash will be a pain in the neck. And it will certainly require some stations to move back to those low-band VHF channels, which were desirable back in the early days of television (Channels 2 and 3 were ‘golden’ then) but now make up what’s essentially the low-rent district of TV broadcasting.

Why? First off, much larger antennas will be required to receive these stations. A full wavelength at 56 MHz (channel 2) is about 5.4 meters, or about 17.5 feet. So a somewhat-efficient ¼-wave whip antenna to pull in that channel needs to be about 4.4 feet long. (Now you know why grandpa and grandma’s TVs had those long ‘rabbit ear’ indoor antennas!)
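Those lengths fall straight out of the standard wavelength formula. A quick sketch, using the 56 MHz channel 2 figure from above:

```python
C = 299_792_458.0   # speed of light, in m/s
M_TO_FT = 3.28084   # meters to feet

def wavelength_m(freq_hz: float) -> float:
    """Full wavelength in meters: lambda = c / f."""
    return C / freq_hz

def quarter_wave_ft(freq_hz: float) -> float:
    """Quarter-wave whip length in feet."""
    return wavelength_m(freq_hz) / 4.0 * M_TO_FT

f = 56e6  # channel 2
print(f"Full wave: {wavelength_m(f):.1f} m "
      f"({wavelength_m(f) * M_TO_FT:.1f} ft), "
      f"quarter wave: {quarter_wave_ft(f):.1f} ft")
```

Run it for a UHF channel (say, 600 MHz) and the quarter-wave length shrinks to under five inches, which is why small indoor antennas do fine up there.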

That’s not to say you can’t pick up these channels with smaller antennas – if the signal is strong enough, you probably can. But that means being much closer to the transmitter than you’d need to be with a high-band VHF or UHF signal, as it’s easier to design antennas with some gain for those channels.

Here’s another problem. The spectrum from 50 to 88 MHz is historically plagued with interference from impulse noise (vacuum cleaners, electric motors, switching power supplies, lightning, and other static). And during certain times of the year, atmospheric enhancement of radio signals occurs where distant stations come in stronger than local stations, creating plenty of interference (ionization of the E-layer of the ionosphere, a/k/a “E-skip”).

While the high-band TV channels are also susceptible to man-made and natural interference, it’s not as much of a problem. And UHF TV signals are essentially immune to impulse noise, although they can also experience signal ‘skip’ conditions, particularly in the late summer and early fall with tropospheric ducting.

During one of my RF and wireless classes at InfoComm, we aimed an antenna from the 2nd floor North Hall meeting rooms toward Black Mountain, a line-of-sight path parallel to the Las Vegas Strip, to try and bring up channel 2 (KSNV). Guess what? We couldn’t even find the signal using a spectrum analyzer, due to an extremely high noise floor of about -50 dBm. (The noise floor at my home office is about -88 dBm, which is moderately quiet.)

Reception of channel 2 became such a problem for KSNV that they eventually relocated to UHF channel 22, where they can easily be picked up with an indoor antenna. But other stations won’t be as fortunate, as the spectrum will be fully packed after this year with no place to move for a ‘do-over’ if reception is a problem.

When the DTV transition happened in June of 2009, three stations in Pennsylvania, New York, and Connecticut had to move to channel 6. It became apparent very quickly that DTV converter boxes weren’t selective enough to reject interference from nearby high-power FM broadcast stations. So WPVI in Philadelphia and WRGB in Schenectady applied for and got permission to double their transmitter power in hopes of fixing the problem.

WPVI’s signal on channel 6 is having a hard time up against the many Philly-area FM stations just higher in frequency.

You can see how much of a challenge this downward move will present to people using indoor antennas. Figure 1 shows how WPVI’s 8VSB carrier looked when I tested the Antennas Direct ClearStream Eclipse loop antenna with an amplifier, while Figure 2 shows channels 9 and 12 with the same rig – but a much lower noise floor. Note the strong FM stations immediately to the right of WPVI, a potential source of interference and receiver overload that would not be an issue on high-band VHF and UHF channels.

This view of WBPH-9 and WHYY-12 shows both carriers standing tall above the noise floor (about -85 dBm) and easy to receive.

Figure 3 shows a bunch of UHF channels received the same way – the Eclipse has some resonance at these frequencies as it is close to a full-wave loop antenna, so indoor reception is relatively easy.

Pulling in UHF TV stations is a much easier task for a small indoor antenna like the Eclipse. A low noise floor (-87 dBm) doesn’t hurt, either.

I have two pretty sophisticated rooftop antenna systems (one on a rotator) and I have trouble picking up KJWP-TV in Philadelphia on channel 2 – the signal breaks up frequently and there’s lots of broadband noise showing on my spectrum analyzer when I point the antenna in that direction. There’s also a station on channel 4 (WACP) that pops in from time to time, although in the other direction toward New York City.

If a station you like to watch has to relocate to the low-rent district, you may need to spring for a better antenna, and it might be larger than some of the indoor models you’re used to seeing. If you are 20 or more miles away from the transmitter, you can forget those small picture frame or box-shaped models – they won’t work.

You might even have to (“gasp!”) go back to using a pair of rabbit ears. Yes, they still make these; I found a pair at Best Buy the other day for about $15. Or it might be time to consider an outdoor antenna, and even that will have to be larger.

I’ll have more news once the spectrum repack is done later this month; the FCC usually provides a link to a listing of TV station channel assignments. If you live near a large city where most of the high-band VHF and low UHF channels are being used by major networks, you’re probably not going to see much in the way of musical channels.

But if you live in a market where all of the active channels are on UHF – say, like Syracuse, NY or Scranton/Wilkes-Barre, PA – don’t be surprised when you can’t pick up some of those stations in the future. They might have moved down the street…


Two Keys to Optimal HDR TVs: Dynamic HDR Metadata and Tone Mapping

The Society for Information Display LA Chapter held its 14th annual One-Day Conference on February 3, 2017 at the Costa Mesa Country Club in Costa Mesa, California. At the conference, Gerard Catapano gave a presentation entitled “HDR, Today into Tomorrow.”

Catapano, formerly Associate Director of Electronics Testing at Consumer Reports and now Director of Quality Assurance at Samsung’s QA Lab in Pine Brook, New Jersey, introduced high dynamic range (HDR) as “the latest and most innovative technology that helps film studios deliver a better expression of details in shadows and highlights to the consumer.” He presented the Consumer Technology Association’s definition of an HDR-compatible display as one that has at least these four attributes:

• Includes at least one interface that supports HDR signaling as defined in CEA-861-F, as extended by CEA-861.3.
• Receives and processes static HDR metadata compliant with CEA-861.3 for uncompressed video.
• Receives and processes the HDR10 Media Profile from IP, HDMI, or other video delivery sources. Other media profiles may be supported in addition.
• Applies an appropriate electro-optical transfer function (EOTF) before rendering the image.

The HDR standard has been endorsed by a variety of organizations, including the Blu-ray Disc Association, MPEG, the UHD Alliance, and the ITU. Although HDR is currently a premium feature, Catapano predicted that it will become a basic feature of TVs across all screen sizes and display technologies.

Samsung TVs support only the HDR10 media profile because it is an open standard that does not require licensing fees and, as a result, permits customization within the profile. Since use of at least HDR10 is required by the CTA definition of an HDR-compatible display, it will be supported by all major manufacturers. Catapano noted that at NAB 2016, the major encoder manufacturers were offering 4K HDR as an option, and the major mastering and editing tool sets were implementing it.

The CIE 1976 LAB Color Space clearly indicates the range of colors available for each level of luminance.

The CTA definition only requires HDR sets to support static HDR metadata: metadata that is constant throughout the entire film or video. But much more can be done with dynamic HDR metadata, which changes scene by scene. SMPTE ST.2094-40 provides for the use of dynamic metadata for tone mapping with HDR10. In a subsequent conversation, Mindoo Chun, an engineer at the QA Lab, told me the dynamic metadata and tone-mapping technology codified in ST.2094-40 was developed by Samsung and made available to SMPTE.

Tone mapping is a key technology in HDR TVs, Catapano said. It is a color-volume transform that renders incoming HDR content for a display having a dynamic range that is smaller than that for which the contents were coded. With static metadata, the only way to compress the scene with the greatest color volume so it fits into the set’s color volume is to over-compress the much larger number of less demanding scenes.

With dynamic metadata, each scene can be optimally compressed, with the result that many scenes will not require color-volume compression at all. Catapano observed that Samsung HDR TVs for the 2017 model year “are ready for ST.2094-40.”
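As a rough illustration of why per-scene metadata matters, here’s a toy luminance tone curve in Python. To be clear, this is not the actual ST.2094-40 transform (which operates on full color volumes, not just luminance); it simply shows how a curve driven by each scene’s own peak can leave dim scenes untouched, while a static curve built for the mastering peak compresses every scene:

```python
def tone_map(nits: float, scene_peak: float, display_peak: float,
             knee: float = 0.75) -> float:
    """Toy highlight roll-off: pass luminance through up to a knee,
    then linearly squeeze the rest of the scene's range into the
    display's remaining headroom. Illustrative only."""
    if scene_peak <= display_peak:
        return min(nits, display_peak)  # scene fits the display: no compression
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits
    # Map [knee_nits, scene_peak] onto [knee_nits, display_peak]
    t = (nits - knee_nits) / (scene_peak - knee_nits)
    return knee_nits + t * (display_peak - knee_nits)

DISPLAY_PEAK = 1000.0  # hypothetical premium LCD, in nits
MASTER_PEAK = 4000.0   # static metadata: one peak for the whole film

# A 900-nit highlight in a dim scene whose own peak is only 900 nits:
dynamic = tone_map(900.0, scene_peak=900.0, display_peak=DISPLAY_PEAK)
static = tone_map(900.0, scene_peak=MASTER_PEAK, display_peak=DISPLAY_PEAK)
print(f"dynamic: {dynamic:.0f} nits, static: {static:.0f} nits")
```

With per-scene metadata the highlight passes through at 900 nits; the static curve, sized for a 4,000-nit master, dims it to around 760 nits even though the display could have shown it faithfully.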

An ICC color profile based on CIELAB. As with most real devices, this one can reproduce only a portion of the colors defined in the CIELAB color space.

Let’s say a little more about “color volume.” The most common way of looking at color gamut is still with the 85-year-old CIE 1931 color diagram, which compresses the luminance (“brightness”) Z-axis so that the color space is pressed into a plane. With the limited luminance capabilities displays have had until recently, that was a simple and (perhaps) adequate approach. But with high dynamic range, you lose a lot of information that way. Over the years, many three-dimensional color spaces have been developed, with the CIELAB color space being the most common. Now you can think of each value of luminance as having a two-dimensional color gamut associated with it, and the entire color volume is the stack of these two-dimensional gamuts running from black to white. The gamut decreases at low and high luminance values, and one of the things you want in an HDR set is a relatively large gamut at high luminance levels so bright colors do not wash out.
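To make the “stack of gamuts” idea concrete, here’s a small sketch using the standard sRGB-to-CIELAB conversion (D65 white point). Chroma, the distance from the neutral gray axis, is effectively the width of the gamut slice at a given lightness, and you can watch it collapse as a color heads toward black or white:

```python
def srgb_to_lab(r: float, g: float, b: float):
    """Convert sRGB (components 0..1) to CIELAB via CIE XYZ, D65 white."""
    def lin(c):  # undo the sRGB transfer curve
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = lin(r), lin(g), lin(b)
    # sRGB primaries -> XYZ (D65)
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    L = 116 * fy - 16
    return L, 500 * (fx - fy), 200 * (fy - fz)

def chroma(L: float, a: float, b: float) -> float:
    """Distance from the neutral axis at lightness L."""
    return (a * a + b * b) ** 0.5

# Dark red, pure red, and a pale pink: chroma peaks at mid lightness
for rgb in [(0.1, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 0.8, 0.8)]:
    L, a, b = srgb_to_lab(*rgb)
    print(f"RGB {rgb}: L*={L:5.1f}, chroma={chroma(L, a, b):5.1f}")
```

Pure red carries a chroma above 100, while the dark red and the pale pink (same hue, pushed toward black and white respectively) drop to small fractions of that. Stack those shrinking slices from L* = 0 to 100 and you have the color volume.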

Tone mapping maps the colors in the program material’s color volume to the smaller color volume of a less capable TV set while providing the best possible picture.  From Samsung’s point of view, it’s very convenient that the OLED TVs of arch-rival LG inherently have a smaller color volume than HDR LCD sets because they have a substantially smaller maximum luminance.  There is much more to say about that, but we’ll save it for another time.

Now, when you go into Costco or Fry’s to buy your next TV set, you can ask the sales associate whether the set supports SMPTE ST.2094-40. I look forward to hearing how that conversation goes.
Ken Werner is Principal of Nutmeg Consultants, specializing in the display industry, manufacturing, technology, and applications, including mobile devices and television. He consults for attorneys, investment analysts, and companies re-positioning themselves within the display industry or using displays in their products. He is the 2017 winner of the Society for Information Display’s Lewis and Beatrice Winner Award.

You can reach him at kwerner@nutmegconsultants.com.