
Gaming Monitors that Don’t Tear or Stutter

At PEPCOM’s Holiday Spectacular, a press-and-analysts-only show held last month at the Metropolitan Pavilion on New York City’s West Side, I came across the new ASUS PG278Q 2560×1440 gaming monitor. In discussing the monitor with ASUS personnel, and then with nVidia’s Bryan Del Rizzo, I learned things I didn’t know about how high-performance games present their imagery to monitors, and how nVidia’s G-Sync technology solves the long-standing tearing and stutter problems that have beset gamers and their fixed-refresh-rate monitors.

“The source of [these problems] is that modern games don’t deliver a consistent frame rate due to the complexity and richness of the scenes being rendered. Some frames take longer to render than others. Frames that are rendered in 10ms will push the [frames per second (FPS)] in the game higher, while frames that take longer to render will reduce the overall FPS in the game,” according to a reviewer’s guide Del Rizzo sent me.

“Currently, gamers have few options when it comes to how frames are delivered. Most gamers disable V-Sync to get the best input response they can, but this introduces some serious visual artifacts: Tearing and Stutter. With V-Sync enabled, you eliminate the tearing, but introduce input delay and inconsistent frame delivery as the GPU will rarely generate frame rates in perfect sync with the refresh rate of the monitor,” the guide says.

With G-Sync, a G-Sync controller built into the monitor receives a signal from the graphics processing unit (GPU) in the computer. (Currently, G-Sync is supported by nVidia’s GeForce 700- and 900-series graphics cards.) When the GPU has finished rendering a frame, no matter how long it takes, it sends a signal that tells the monitor to update the display. The result is a variable-refresh-rate monitor that solves the problems of tearing and stuttering. Lag between a user input and the corresponding screen update is also reduced compared to V-Sync.
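
To make the timing difference concrete, here is a minimal sketch of the two presentation schemes. It is not nVidia’s implementation – render_frame() is a hypothetical stand-in for GPU work whose duration varies with scene complexity – but it shows why a frame-driven refresh avoids both the waiting and the missed ticks of a fixed refresh:

    import random
    import time

    def render_frame() -> None:
        time.sleep(random.uniform(0.008, 0.025))    # 8-25 ms, varying per frame

    def vsync_present(frames: int = 5, refresh_hz: int = 60) -> None:
        """Each finished frame waits for the next fixed refresh tick (added
        latency); a frame that misses its tick is held until the following
        one, which the viewer perceives as stutter."""
        interval = 1.0 / refresh_hz
        next_tick = time.monotonic() + interval
        for _ in range(frames):
            render_frame()
            while time.monotonic() > next_tick:     # missed one or more ticks
                next_tick += interval
            time.sleep(max(0.0, next_tick - time.monotonic()))
            print(f"V-Sync: displayed at {time.monotonic():.3f} s")
            next_tick += interval

    def gsync_present(frames: int = 5) -> None:
        """The monitor refreshes the instant each frame is ready."""
        for _ in range(frames):
            render_frame()
            print(f"G-Sync: displayed at {time.monotonic():.3f} s")

    vsync_present()
    gsync_present()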

ASUS advertises its PG278Q as the first 1440p monitor to feature G-Sync technology. Acer advertises its XB280HK gaming monitor — an example of which was in the nVidia booth — as the world’s first 4K display featuring nVidia G-Sync technology. Del Rizzo told me that G-Sync is currently available, or will soon be available, in gaming monitors from four or five different brands.

Del Rizzo said nVidia has been developing G-Sync for several years, and that this is serious technology developed to allow gamers to have more fun. But that should come as no surprise. Gaming is a very serious business.

The 800-pound gorilla just bellowed…

On Tuesday, September 23, Vizio officially launched its new P-Series of 4K (Ultra HD) televisions at an art gallery in Chelsea, NY.

These televisions weren’t a secret to anyone who attended CES w-a-y back in January. Vizio caused a bit of a stir by announcing five “smart” LCD TVs with direct LED backlighting that would retail at prices considerably lower than Sony, Samsung, LG, Sharp, and pretty much anyone else except the Chinese brands (TCL, Hisense, etc.).

But whereas those selfsame Chinese TV brands don’t get much respect from the general public and big box retailers, Vizio is different. They’ve been around for over a decade, and they have a track record of market disruption. Vizio has done so well selling TVs in the U.S. that they sponsored the Rose Bowl this year and also had a crew go out and shoot original 4K footage on a Red camera to be used for demos. Vizio is also going to license Dolby’s high-dynamic range process for upscale monitors and TVs.


Vizio’s event was staged in a gallery and featured “moving art” by Louie Schwartzberg.

 

The New York City event was minimalist, consisting of demos of each model hanging on the wall like artwork and showing 4K clips. In three back rooms, Vizio demonstrated (a) its superior motion-blur-correcting technology compared to a Samsung 4K TV, (b) its lower black levels compared to the same Samsung TV (and the lack of color-tint artifacts, thanks to full-array, direct-backlight illumination), and (c) higher contrast ratio and improved shadow detail.

I’ve been to enough side-by-side demos that I’ve learned not to believe anything I can’t verify on my own, but the Vizio sets did look impressive. What was even more impressive, however, was the pricing: The 50-inch P502ui-B1 retails for $999, while the 55-inch P552ui-B2 will sell for $1,399. That’s $600 less than the current price for a comparable 55-inch LG Ultra HD TV.

Vizio also has a 60-inch model (P602ui-B3) for $1,699 and a 65-inch version (P652ui-B2) for $2,199. The line is rounded out with a 70-inch Ultra HD set, the P702ui-B3, for $2,499. That last price is $500 lower than what I heard at CES, so there’s already been some market-tweaking.


All of the P-series Ultra HD TVs use full array LED backlighting.

 


At least one Vizio model (the P552ui-B2) uses IPS LCD glass, while the rest employ MDA LCD. Vizio claims this provides better black levels than Samsung’s PVA LCD panels.

 

Readers with reasonably long memories will recall that Sony’s and Samsung’s inaugural Ultra HD sets were priced close to $3,400 for 55-inch models and $4,400 for 65-inch versions. (Sony’s sets also use quantum dot backlights, something Vizio is not pursuing at the moment.)

But if you pull out the latest Sunday paper inserts from Best Buy and HHGregg, you will see that Samsung and LG have both slashed their prices considerably, perhaps in anticipation of Vizio (who, ironically, is now a featured brand at Best Buy!).

The Vizio sets are already designed to support 4K streaming from Netflix and Amazon Instant Video, and customers of the latter service will be able to buy or rent 4K content from UltraFlix for streaming. As for inputs, all five TVs come with four HDMI 1.4 connections and one (just one) HDMI 2.0 port. That last one is a puzzler, since the HDMI 2.0 standard has been out for a year already.

I was told by Vizio that these new sets support HEVC H.265-coded material, but I’m not sure if they are decoding in hardware or their own software. The lone HDMI 2.0 port is also fully compliant with HDCP 2.2, a controversial amendment to the HDCP standard that will cause some serious backwards-compatibility problems with older TVs, 4K models included.

So – here we go with the Ultra HD price wars. Samsung, Sony, and LG are nearing the end of an intensive in-store 4K education campaign (which, I must say, the majority of salespeople I encountered handled very well), timed perfectly with the fall college and NFL football seasons.

Now, here comes Vizio to steal their thunder and benefit from this campaign without having contributed a dime to it. Research by Nielsen and other companies has consistently shown that Americans just want “big, cheap TVs.” I’m not sure if that applies also to 4K, but if so, Vizio is ready and willing to deliver. It will be very interesting to see how the “Big 3” TV brands respond to this challenge and what price cuts will result over the next couple of months. Stay tuned…

 


There’s Fast…and then There’s FAST.

A little over a year ago, Silicon Image announced the latest version of HDMI – 2.0. Among the enhancements to this interface was an increase in the clock rate to 600 MHz, allowing data rates as high as 18 gigabits per second (Gb/s).

Good thing, too, with Ultra HD televisions coming to market. The previous iteration of HDMI (v1.4) had a capped data rate of 10.2 Gb/s, which was barely fast enough for Ultra HD signals (3840×2160 pixels) refreshed at 30 Hz, with color depth not to exceed 8 bits per color.

By boosting the speed to 18 Gb/s, HDMI 2.0 can now pass a 60 Hz Ultra HD signal – but only with 8-bit RGB color. (If you’re willing to cut the color resolution in half, you can increase the bit depth.) To me, that’s not enough of an improvement: If you’ve seen what it takes to shoot, edit, and post 4K content, you’ll realize why 10-bit and even 12-bit encoding is the way to go. But HDMI 2.0 can’t handle that with high frame rates.
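
A quick back-of-the-envelope check shows why the 18 Gb/s ceiling stops at 8 bits. The sketch below assumes the standard CTA-861 4K/60 timing of 4400×2250 total pixels (a 594 MHz pixel clock) and HDMI’s 8b/10b TMDS encoding overhead; it is an illustration, not a formal link-budget calculation:

    # Why HDMI 2.0's 18 Gb/s link handles 4K/60 RGB only at 8 bits per color.
    def hdmi_link_rate(h_total, v_total, refresh_hz, bits_per_color):
        pixel_clock = h_total * v_total * refresh_hz    # pixels per second
        payload = pixel_clock * 3 * bits_per_color      # RGB bits per second
        return payload * 10 / 8                         # 8b/10b overhead on the wire

    for depth in (8, 10, 12):
        rate = hdmi_link_rate(4400, 2250, 60, depth) / 1e9
        verdict = "fits" if rate <= 18.0 else "exceeds"
        print(f"4K/60 RGB at {depth} bits/color needs {rate:.1f} Gb/s -> {verdict} 18 Gb/s")

Running it gives roughly 17.8, 22.3, and 26.7 Gb/s, which is why only the 8-bit case squeezes through.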

Earlier this month, the Video Electronics Standards Association “officially” announced what we knew was coming for some time – DisplayPort version 1.3, which will boost its data rates to a mind-boggling 32 Gb/s – almost twice as fast as HDMI 2.0 – and also employ, for the first time, a form of visually lossless compression known as Display Stream Compression.

Unlike HDMI, DisplayPort is a pure digital transport, using packet-based communication. It transports video, audio, metadata, and even Ethernet, using four scalable “lanes” to carry signals. In the current version (1.2), the capped data rate for each lane is 5.4 Gb/s, but with version 1.3, it will rise to 8 Gb/s.

DisplayPort’s architecture is designed to be flexible. There are full-sized and mobile versions of the connector, along with wireless and optical fiber interface specifications. Users of Apple MacBooks are familiar with the Mini DisplayPort interface, and a mobile version (Mobility DisplayPort or SlimPort) is available for tablets and phones and uses a single lane for 1080p/60 playback.

The maximum data rate for all four lanes is 21.6 Gb/s, which can accommodate a 3840×2160p/60 signal encoded with 10 bits per color in the RGB format. That’s considerably faster than HDMI 1.4 and one reason why a handful of TV manufacturers are adding DP 1.2 connectors to their new 4K TVs. The other reason is the lack of royalties (for now) to use the interface.
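
Here is the same sort of sketch for DisplayPort 1.2, assuming a reduced-blanking (CVT-R2) timing of roughly 4000×2222 total pixels for 3840×2160/60 – an approximation used here purely for illustration – and 8b/10b encoding on each lane:

    # DP 1.2: four HBR2 lanes at 5.4 Gb/s each, 8b/10b encoded.
    lanes, lane_rate = 4, 5.4e9
    effective = lanes * lane_rate * 8 / 10          # usable payload capacity

    payload = 4000 * 2222 * 60 * 3 * 10             # 4K/60 RGB at 10 bits per color
    print(f"Link payload capacity:  {effective / 1e9:.2f} Gb/s")   # ~17.28 Gb/s
    print(f"4K/60 10-bit RGB needs: {payload / 1e9:.2f} Gb/s")     # ~16.00 Gb/s

About 16 Gb/s of video against roughly 17.3 Gb/s of usable capacity, so the 10-bit signal fits with a little room to spare.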

At CES, VESA announced DockPort, a multiplexed signal format that blends USB 3.0 connectivity with display signals in the standard and mini DP connectors. Now, there has been a major announcement by the USB 3.0 Promoter Group and the Video Electronics Standards Association (VESA) of something called “USB Type-C Alternate Mode.”

Drilling deeper, we find that the USB 3.0 Group has introduced a new variation of their interface, known as the Type-C connector. Unlike other versions (Types A and B, along with their more commonly used “full-size” and “mini” designations), the Type-C connector borrows a page from Apple’s playbook and is reversible. That is, it makes no difference which way you plug it in – there is no right side or wrong side up.

There’s more: The Type-C connector (about half as large as a conventional USB Type-A connector) can carry serial data at speeds up to 10 Gb/s (USB 3.1 Gen 2). It can also deliver up to 100 watts of power (20 volts DC at a maximum of 5 amperes) so that a connected device could be operated while its battery charges.

There are two rows of 12 pins on a Type-C connector, arrayed along both edges of the blade. Viewed from the end, the top and bottom pins are reversed from left to right, which is how you can plug it in either way and it will still work. Two pins (1 and 12) are used for ground. Pins 2 and 3 are reserved for a high-speed transmit (TX) data path, while pins 10 and 11 are reserved for a receive (RX) data path.

Pins 4 and 9 provide bus power, and pin 5 (CC) is used to communicate with the connected device to determine operating mode. Finally, pins 6 and 7 function as a USB 2.0 interface. Needless to say, the host and connected device need a USB 3.0-compatible connector switch to determine the operating mode and enable data exchange in 2.0 or 3.0 formats.
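
Laid out in one place, the pin assignments described above look like this (one of the two mirrored 12-pin rows; the second row repeats the pattern so the plug works either way up). This summary covers only the functions named here, not the complete USB Type-C pinout:

    # Type-C pin functions as described above (one 12-pin row).
    TYPE_C_ROW = {
        1:  "GND",
        2:  "TX+  (high-speed transmit)",
        3:  "TX-  (high-speed transmit)",
        4:  "VBUS (bus power)",
        5:  "CC   (configuration channel)",
        6:  "D+   (USB 2.0)",
        7:  "D-   (USB 2.0)",
        9:  "VBUS (bus power)",
        10: "RX+  (high-speed receive)",
        11: "RX-  (high-speed receive)",
        12: "GND",
    }   # pin 8 is not covered in the description above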


DisplayPort can run over the new USB Type-C connector – with no in-between adapters.

What’s unique about Type-C cables is that so-called “full feature” passive cables will actually contain an internal ID chip that signals the source connection to turn on all USB 3.0 functions and enable high-speed data exchange. These cables will be able to transmit 10 Gb/s of data over 1 meter (3 feet) and 5 Gb/s over 2 meters (6 feet).

Using DisplayPort Alternate Mode, a maximum of four DP “lanes” of display data can travel over that same tiny USB connector, providing all the resolution and bit depth of the full-size and mini DisplayPort connectors. Or, you can reserve two lanes for USB 3.1 operation and employ the other two for displays, something you might want to do in a docking station application. USB 2.0 data exchange is always available through the same connector.

As you can see, the USB connector has gotten a lot smaller. It’s also a lot faster, and is symmetrical (no more fumbling around trying to orient the plug the right way). And it can provide the primary display connection for any device while also sending and receiving high-speed data.

With this announcement, the USB 3.0 Group and VESA have shown that “less is more” when it comes to digital signal interfacing with Type-C Alternate operation. Oh, did I mention that you will be able to buy and use a cable with a Type-C USB connector on one end and a DisplayPort plug on the other? Can’t get any easier to use than that!

Trends: Ignore Them At Your Peril

On August 15, Leichtman Research Group of Durham, NH released its quarterly revenue and subscription numbers for U.S. cable TV providers. And there was a surprise to be found in the calculations.

For the first time ever, the number of broadband service subscribers for major cable TV service providers exceeded (barely) the number of cable TV channel subscribers. This happened during the 2nd quarter of 2014 and represents a milestone for pay TV services. (And yours truly predicted it would happen a year earlier, in a DD posted a few years back. Oh well, close enough for government work…)

The actual differential favoring broadband subscriptions was small, amounting to about 5,000 more broadband customers. The actual totals for cable TV systems (not including Wide Open West, an overbuilder) were 49,915,000 for broadband and 49,910,000 for cable channel service. What’s more interesting is that the thirteen largest pay TV providers in the US (about 95% of the market) lost about 300,000 net video subscribers in 2Q 2014, compared to a loss of about 350,000 video subscribers in 2Q 2013.

To offset that decline, the 17 largest pay TV providers added about 385,000 broadband customers during the same time period. Cable TV companies control the lion’s share of broadband service revenue and have a 59% market share vs. AT&T’s U-Verse and Verizon’s FiOS services. The latter companies stayed essentially flat in new subscribers as an almost equal number of customers dropped DSL service (627,000) compared to those who signed up for faster broadband (636,000).

For all cable and telcos that Leichtman surveyed, the total number of broadband subscribers was about 85 million. Of that total, industry giant Comcast claimed 21.27 million and #2 service provider Time Warner Cable accounted for 11.97 million. Among cable TV companies, those numbers represent 42% and 23% market shares, respectively. (Keep that in mind as you ponder the consequences of a potential Comcast – Time Warner merger.)

Now for some additional perspective: Netflix recently broke the 50 million worldwide subscriber mark, with 36 million of those subscribers located in the United States. That’s larger than any cable TV or telco subscriber base. In fact, it’s more than Comcast and Time Warner combined, and is indicative of the meteoric growth Netflix has experienced since it commenced a streaming service in 2007.

Combined with the shift toward consumption of digital media online vs. renting or buying optical discs (as outlined in my last Display Daily), it’s clear that broadband is becoming the more desirable service for many households. I’d also venture an educated guess that customers who subscribe only to broadband services tend to skew much younger (Millennials) while traditional cable TV channel subscribers skew older (Baby Boomers).

While AT&T and Verizon have a smaller share of the pie, it’s still a large enough slice to motivate Comcast, Time Warner et al. to keep increasing their broadband speeds and not lose any competitive edge. I am a Comcast subscriber, and while writing this article I checked my download speeds using CNET’s Internet Speed Test. The result? 20 Mb/s downstream at 5 PM, which is a considerable boost from what I had three years ago. Could the fact that Verizon ran optical fiber through my front yard a few years ago have anything to do with it?

What does all of this mean, long term? First off, the preference for faster broadband vs. a pile of pay TV channels that most people never watch will continue to re-shape the business model for cable TV companies. (The median number of channels watched in pay TV households currently stands at 17.) Continued price increases and increasing reliance on wireless (and not wired) phone service will prompt more customers to drop so-called “triple play” offerings and just go with broadband (and probably use services like Ooma for VoIP calling).

Secondly, the sheer size of Netflix and its expanding category of both rental movies and original series provide even more impetus for disgruntled pay TV subscribers to dump costly channel packages and stream everything from the Big Red Father. Both House of Cards and Orange Is The New Black are wildly popular – there’s no reason to assume Netflix won’t hit a few more home runs. (And their success is prompting HBO to finally discuss publicly a subscription streaming service independent of cable TV delivery.)

Finally, it may take more time than I prognosticated several years ago, but cable TV companies and telcos will slowly and inevitably morph into something that looks more like your local electric company, providing metered high-speed broadband connections and letting customers decide what they want to watch, and when. The DVR may even pay the ultimate price and fall by the wayside in favor of streaming from cloud servers as this comes to pass.

Even the biggest fires start with a tiny spark, and most people don’t even notice trends until they are well under way. Ignore them at your peril…

Mirasol Finds a Home

I may have been wrong. Here’s some history.

Some years ago Qualcomm purchased an MIT display spinout called Iridigm and renamed the technology mirasol (with a small “m”). At a time before Apple created the consumer tablet revolution, the standard for non-PC media consumption was the eReader, and the standard for low-power reflective displays was (and still is) E Ink’s electrophoretic display technology.

But as good as E Ink was for reading black-and-white text, it had obvious limitations that encouraged several companies to develop competing technologies. The limitations were that, at the time, E Ink was limited to black and white and refresh time was too slow for video or even smooth animation. Iridigm had developed a remarkably elegant reflective technology that promised to overcome E Ink’s limitations, and Qualcomm invested huge amounts of money developing it.

The Iridigm technology used optical interference to create color, with the interference changes created by MEMS-actuated mirrors. As elegant as this approach was, it had practical shortcomings. The technology never developed well-saturated reds, and since color rendering was based on interference, color was very sensitive to viewing angle.

Still, when the competition was E Ink and the application was eReaders, mirasol color might have been good enough, and the Korean Kyobo bookstore chain did produce a mirasol-based eReader for the Korean market. Not many units made it to the U.S., but I have one of them, and the color is unsaturated, varies with viewing angle, and has an iridescent quality (suggested in the original Iridigm name) that does not make reading or image viewing easier.

Timex Ironman ONE GPS+ with mirasol SMI reflective color display. Timex is accepting pre-orders for the watch. (Photo: Timex)

Nonetheless, other mirasol eReader projects were in the works when the consumer tablet revolution struck. Almost overnight, “good-enough” color wasn’t good enough, as the standard for color and motion was now that of the very well developed LCD. The other mirasol eReader projects vanished, and the other developmental color reflective technologies faded away to one degree or another. (LiquaVista, perhaps the most interesting of these, disappeared into the maw of Amazon. I have been assured that the LiquaVista program is alive and well, but beyond that NOBODY is saying ANYTHING.)

Qualcomm retreated into showing wristwatch prototypes, where mirasol’s deficiencies were less obvious and less objectionable, and they also purchased Pixtronix, which had developed a transflective in-plane MEMS technology with field-sequential LED-backlit color. I’ve gone on record as saying Pixtronix was much more likely to be successful than mirasol. (Sharp has combined the Pixtronix technology with its IGZO backplane and will soon be manufacturing these displays in commercial quantities, initially for industrial applications.)

Qualcomm, however, did not give up on mirasol. They developed a new generation of the technology called SMI. The original approach, called IMOD, could only implement one mirror position per subpixel, so a red subpixel was either red or black. SMI permits continuous mirror positioning, so each subpixel (now pixel) can render a full range of colors. In practice, color saturation is improved and the disturbing iridescent quality is reduced. Still, SMI would not be my choice for an eReader or tablet display.

But eReaders are not what Qualcomm is pushing. At SID 2014, Qualcomm was showing a 5.1-inch smart-phone display with 2560×1440 pixels. (SMI allows much higher pixel density than IMOD with its area dithering, and Qualcomm was showing this off.) Not bad, but it’s hard to see how SMI will match the color quality of AMOLED and quantum-dot-enhanced LCD for smart phones.

More convincing were a 1.45-inch display with 353×352 pixels and a 1.6-inch display with 384×384 pixels. Both displays were labeled “wearable” and both had essentially the same pixel density of about 343 ppi.
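
The density figure follows directly from the pixel counts and diagonal sizes quoted above; here is the arithmetic as a quick sketch (ppi is the diagonal pixel count divided by the diagonal size in inches):

    from math import hypot

    def ppi(h_px, v_px, diag_in):
        return hypot(h_px, v_px) / diag_in    # diagonal pixels per diagonal inch

    print(f"5.1-in, 2560x1440: {ppi(2560, 1440, 5.1):.0f} ppi")   # ~576 ppi
    print(f"1.45-in, 353x352:  {ppi(353, 352, 1.45):.0f} ppi")    # ~344 ppi
    print(f"1.6-in, 384x384:   {ppi(384, 384, 1.6):.0f} ppi")     # ~339 ppi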

Qualcomm was showing its own Toq smart watch, which incorporates the 1.45-inch display (I think), and which is available on Amazon for about $235. Qualcomm is not a manufacturer of consumer products, so we can assume the Toq is designed as a vehicle for mirasol development and exposure. Can this approach convince any “real” watchmakers? The answer is yes.

Timex has just announced its Ironman ONE GPS+ fitness-and-athletics-oriented smart watch, which contains a remarkable collection of hardware, in addition to its mirasol display, for an MSRP of $399.95 ($449.95 with a bundled heart-rate sensor). Timex is taking pre-orders now.

The watch, which operates independently of a cell phone, includes GPS, text messaging, music storage/playback, interval timers, and distance and speed calculation; your friends can also use the GPS to track you in real time. The watch is compatible with Bluetooth heart-rate sensors.

The characteristics of mirasol SMI — sunlight readability, very low power consumption, and color (where full color and color fidelity are not required) — are well suited to this application.

Will Timex’s highly functional but expensive athlete’s watch be a winner that stimulates additional wearable uses for mirasol? If Timex has miscalculated, it may deter other potential mirasol customers, even if the display was not at fault. We shall see.

Was I wrong about mirasol? I might have been, but the jury is still coming in.

Ken Werner is Principal of Nutmeg Consultants, specializing in the display industry, manufacturing, technology, and applications, including mobile devices and television. He consults for attorneys, investment analysts, and display-related companies. You can reach him at kwerner@nutmegconsultants.com.