
LG Is “All In” With OLED TVs

Yesterday (April 8), LG formally launched its new line of OLED televisions at The Garage on Manhattan’s Upper West Side. In addition to showcasing the 65-inch 65EG9600 ($8,999) and 55-inch 55EG9600 ($5,499) UHDTVs, LG also held a press briefing in conjunction with Netflix’s latest streaming series, Daredevil, which becomes available on Friday, April 10.

I had the opportunity to sit on this panel and answer a few technical questions about OLED picture quality. Scott Mirer, VP of device ecosystem at Netflix, was also on hand to offer his observations about the new OLED TVs, as was Matt Lloyd, director of photography for Daredevil (which, coincidentally, was shot in the adjoining Hell’s Kitchen neighborhood).

During my part of the discussion, I asked for a show of hands to see how many members of the press were currently using plasma TVs, and better than 60% of the hands went up. While LCD display technology currently owns about 95% of the worldwide television market, there’s just no comparison to a late-model Panasonic, Pioneer, or LG plasma set when it comes to video picture quality.

The panel discussion at LG’s OLED TV launch event. Left to right: yours truly, Tim Alessi of LG Electronics, Matt Lloyd, DP on Daredevil, Scott Mirer of Netflix, and moderator Shelly Palmer.

Many of us shed more than a tear when it was announced that Panasonic was departing from the plasma TV business a couple of years ago. And we all figured that OLED (organic light-emitting diode) televisions would quickly step into the breach.

That didn’t quite happen like we expected. Even though large OLED TVs have been shown for well over a decade (going back to Samsung’s and Epson’s 40-inch prototypes in 2003), they just never seemed to make it to the starting line.

In the summer of 2013, LG launched a 55-inch curved 1080p OLED TV with much splash and hoopla. Later that year, Samsung followed suit with its 55-inch curved OLED TV, pricing it almost $6,000 less than LG’s. And in short order, a price war ensued – but it didn’t last very long, as Samsung pulled its product off the market for undisclosed reasons.

LG’s OLED imaging panels employ a white OLED emitter and color filters arrayed in an RGBW stripe to provide brighter images. This technology originated in none other than Rochester, NY at Eastman Kodak and was an outgrowth of research and development in the late 1970s and early 1980s.

In 2009, Kodak sold its OLED patent portfolios and business to LG Electronics outright. Ever since then, LG has been working industriously to bring OLED TVs to market. The ‘catch’ was manufacturing yields, which not all that long ago were in the low double digits.

Although subsidiary LG Display won’t disclose its current OLED yields, they are believed to be better than 50%, which is probably why we’re now seeing several models of televisions finally coming to retail. Granted, they’re not cheap: in comparison, you can buy a 55-inch “smart” 1080p LCD TV for about $700 now, while a quantum dot-equipped 1080p LCD set currently runs about $3,000.

However, the market knows what it wants to pay for a television, and you can expect those prices to come down in short order. LG’s original 55EA9800 OLED set started out at just under $15,000, but it can be yours now for just one-fifth of that original price. (For those with short memories, that’s what a quality 50-inch plasma cost about 7-8 years ago.)

The OLED exhibit featured this comparison between a 55-inch LCD with LED backlight (left), a 55-inch OLED TV (middle), and a 55-inch LCD TV equipped with a quantum dot backlight (right).

While the rich blacks and saturated colors draw people like flies to OLEDs, it’s worth noting that those same deep blacks and consistent grayscale and color reproduction at very low luminance levels allow OLED displays to show images with high dynamic range. If we go by an industry definition of HDR as 15 stops of light, OLED is definitely up to the challenge: with full white at 500 nits, for example, the step above black would measure just around 0.1 nits.

That’s a level of black previously attained only by plasma TVs, or by LCD TVs with some trickery involved (black stretch, dynamic contrast, APL). But of course OLEDs can go much lower with grayscale reproduction: a more typical low gray (near-black) level on an OLED display might be around 0.05 nits or so.
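To make the arithmetic explicit, here’s a quick editorial illustration (the formula is the standard one; the luminance figures are simply those quoted in this article). One stop is a doubling of light, so dynamic range in stops is the base-2 logarithm of peak white divided by the darkest reproducible step:

```python
# Dynamic range in stops = log2(peak white / darkest reproducible step).
from math import log2

def stops(white_nits, near_black_nits):
    return log2(white_nits / near_black_nits)

print(round(stops(500, 0.1), 1))    # ~12.3 stops for 500 nits over a 0.1-nit first step
print(round(stops(500, 0.05), 1))   # ~13.3 stops at a 0.05-nit near-black level
print(round(500 / 2 ** 15, 3))      # ~0.015 nits: where that step must sit for a full 15 stops
```

Since an OLED pixel can simply switch off, the floor is set by how low the panel can render that first step above black rather than by backlight leakage, which is what puts the 15-stop target within reach.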

The clips of Daredevil provided by Netflix really showed off the abilities of OLEDs to handle dark scenes with point sources of high-key light, like streetlights. Another clip showed a fight scene in a dark hallway, with the only light coming from green-tinted fluorescent lamps. Yet, you could see details even in the darkest corners.

The consistent color tracking of OLEDs, their emissive structure, and their low operating voltages make them an ideal replacement for – nay, a step up from – plasma display technology, which had to rely on high-voltage, pulse-width modulation (PWM) drive to create images. OLEDs are also a lot thinner than any other display, and can even be printed onto flexible substrates.

But enough about technology! OLED televisions are finally coming to market, and that’s something to celebrate. As a bonus, both of LG’s newest OLED models are UHDTV-resolution (3840×2160 pixels) and have excellent 1080p upscaling, based on the Blu-ray clips of Skyfall that I saw at the event. Can’t wait for the rest of the lineup!

Look Out, HDMI – Here Comes Super MHL!

Yesterday, the MHL Consortium announced its newest flavor of display interface – Super MHL (or, more accurately, superMHL). MHL, which stands for Mobile High-definition Link, was originally developed to enable display connections over Micro USB ports on phones and tablets. You’ll find it most often on mobile devices and televisions from Samsung and Sony. It also pops up on LG products, and there’s even an MHL port on the Pioneer AV receiver I bought a couple of months ago.

There have been some clever demos of MHL applications at past CES events. One was to build a “dumb” laptop (no CPU or video card) – just keyboard, touchpad, and display – and use MHL to dock a smartphone into it and make everything work. Another demo in the Silicon Image booth featured smartphones being used as video game controllers with the video playing back on the controller screen.

Yet another demo showed a Sony Xperia phone being used as a remote control with a Samsung TV to select inputs, play video, and launch Internet applications. It’s easy to do this stuff when you can multiplex video and serial data through the same connector, which in MHL version 3.0 can even play back Ultra HD video at 30 fps with 8-bit color.

Note the emphasis on the “mobile” part. In the world of transition-minimized differential signaling (TMDS), MHL is one of a few players, the others being HDMI (the dominant digital display interface), its predecessor DVI (still going strong although the standard isn’t updated anymore), and Micro and Mini HDMI, niche connectors on smartphones and cameras.

The advent of Ultra HD, 4K, and even higher display resolutions – like those of the new “5K” widescreen workstation monitors – has created a problem: our display interfaces need to get faster. A LOT faster!

But HDMI 2.0, announced in September 2013, isn’t fast enough. I get into frequent debates with people about why it isn’t, so let me clarify my position: HDMI 2.0 has a maximum overall clock (data) rate of 18 gigabits per second (18 Gb/s). Because of its 8-bit/10-bit character mapping, only 80% of that (14.4 Gb/s) can actually carry display signals; the rest is overhead.

So that limits HDMI 2.0 to 3840×2160 pixels (4400×2250 pixels with blanking) in an RGB signal format at a 60 Hz refresh rate. That’s the hard and fast speed limit, and for anyone using a computer workstation or media player with RGB output, it’s a serious obstacle: how will people who buy the new HP and Dell 27-inch workstation monitors connect them? Their working resolution is 5120×2880 pixels, and at 60 Hz, that’s just too fast for HDMI 2.0.
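For readers who want to check the math, here’s a back-of-the-envelope calculation (my own illustration; the 4400×2250 total timing is the figure quoted above, and the 5K case is computed with zero blanking to keep the comparison conservative):

```python
# Back-of-the-envelope TMDS bandwidth check (editorial illustration; timings are nominal).

def tmds_rate_gbps(h_total, v_total, refresh_hz, bits_per_pixel=24):
    """Raw link rate in Gb/s, including the 25% overhead of 8b/10b character mapping."""
    payload = h_total * v_total * refresh_hz * bits_per_pixel   # pixel data, in bits per second
    return payload * 10 / 8 / 1e9                               # 10 link bits carry every 8 payload bits

# 3840x2160 @ 60 Hz RGB, with blanking (4400x2250 total pixels):
print(round(tmds_rate_gbps(4400, 2250, 60), 2))   # ~17.82 Gb/s -> just squeezes under HDMI 2.0's 18 Gb/s
# 5120x2880 @ 60 Hz RGB, even with no blanking added at all:
print(round(tmds_rate_gbps(5120, 2880, 60), 2))   # ~26.54 Gb/s -> far beyond HDMI 2.0
```

Chroma subsampling or Display Stream Compression can stretch those numbers, but for full RGB the ceiling sits exactly where the arithmetic says it does.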

It looked like DisplayPort 1.2 would finally ascend to the top of the podium, since its existing speed of 21.6 Gb/s (17.28 Gb/s usable) was already faster than HDMI 2.0. And now, DisplayPort 1.3 has been announced, with a top speed of 32 Gb/s (about 26 Gb/s usable) and the adoption of Display Stream compression. Indeed, more computer manufacturers are providing DP connections on laptops: Lenovo seems to have moved completely to this format, and Apple has been supporting DP for some time now.

8K is here! (Okay, maybe that’s a few years away…)

With all of that in mind, I will admit I was completely blindsided by superMHL at this year’s International CES. Instead of a 5-pin Micro USB connector, superMHL specifies a 32-pin, full-size connector that’s reversible (the next big thing in connectivity, a la USB Type-C). It supports Display Stream Compression, is compatible with USB Type-C (although not with all six lanes), and has a maximum data rate of 36 Gb/s across six lanes of data. (According to the MHL Consortium, that’s fast enough to transport an 8K (7680×4320) image with 120 Hz refresh and 4:2:0 color.)

The MHL Consortium’s announcement yesterday featured Silicon Image’s new Sil9779 port processor, which can also handle HDMI 2.0 signals. Here are the key specs from the superMHL press release:

  • 8K 60fps video resolution, as outlined in the superMHL specification
  • New, reversible 32-pin superMHL connector
  • USB Type-C with MHL Alt Mode
  • High Dynamic Range (HDR), Deep Color, BT.2020
  • Object audio – Dolby Atmos®, DTS:X, 3D audio, audio-only mode
  • High bit-rate audio extraction
  • HDCP 2.2 premium content protection

 

Here’s the 32-pin superMHL reversible connector.

Whew! That’s quite a jump up from MHL. Some might say that superMHL is MHL on steroids, but no matter how you look at it, superMHL is now a serious contender for the next generation of display connectivity. In the press briefing, a representative of the MHL Consortium waxed on about the approach of 8K broadcasting (it’s already been operating for two years in Japan) and how we would see a migration to 8K TVs and displays in the near future.

As Larry David says, “Curb your enthusiasm!” Supporting 8K would be nice, but we’ve barely started the transition to UHDTV. And right now, selling 8K TV to the average consumer is like trying to peddle a Ferrari to someone who lives on a dirt road.

Where superMHL will find its niche is in supporting the higher bit rates that high dynamic range (HDR), wide color gamuts (BT.2020), and higher frame rates (60/96/100/120 Hz) require. All will shortly become important parts of the next-generation (UHD) television system. DisplayPort is already there with version 1.3, and you’ll even find DP 1.2 connections on selected models of Ultra HDTVs so that gamers can connect laptops and desktops at Ultra HD resolutions with 60 Hz refresh.

Now, the elephant in the room: How does the emergence of superMHL affect HDMI? Even though version 2.0 is over a year and a half old, you don’t see many HDMI 2.0 jacks on Ultra HDTVs. Casual inspections at Best Buy, HH Gregg, and other outlets show that the typical HDMI 2.0 port count is usually one (1), even as we approach April of 2015.

In the superMHL presentation, the concept of a TV with multiple HDMI 2.0 inputs and one superMHL input was outlined. This would, in effect, be the next step up from where we are now, with the typical Ultra HDTV having one HDMI 2.0 input and three HDMI 1.4 inputs.

But if Silicon Image’s new Sil9779 port processor can handle both formats, why bother with HDMI 2.0 in the first place, especially given its speed limitations? Wouldn’t it make more sense to future-proof all inputs and go with superMHL across the board? (Of course, the cost of adopting superMHL could weigh heavily on that decision.)

In the commercial AV and broadcast worlds, it would definitely make sense to jump to superMHL in the interest of future-proofing installations. Given the limited rollout of HDMI 2.0 to date, maybe supporting both HDMI 1.4 for legacy devices and superMHL is the smarter approach. (Note that superMHL and HDMI 2.0 both support HDCP 2.2, which is the next level in content protection and is NOT backward-compatible with older versions of HDCP.)

Summing up: the race for faster interface speeds just got a lot more interesting with the addition of superMHL to the lineup. I can imagine that manufacturers of AV matrix switchers and distribution amplifiers are feeling another migraine headache coming on…

EDITOR’S NOTE: Last week, it was announced that Silicon Image has been acquired by Lattice Semiconductor of Hillsboro, Oregon, “a leading provider of programmable connectivity solutions” according to the press release. The acquisition price was about $600M, and the deal leaves Lattice in control of HDMI, MHL/superMHL, and SiBEAM (WiHD) patents and IP. More information can be found on the Lattice Web site at http://www.latticesemi.com/.

EE LLC: A Million PCs and Nobody Knows Its Name

There is a company you’ve probably never heard of whose display technology sits inside a million notebook PCs. Entertainment Experience LLC has developed a total color management system whose mathematical and vision models are embedded in multidimensional look-up tables (LUTs). Current customers include Dell and Quanta, the world’s largest notebook PC ODM.

This week, I spoke separately with CEO John Parkinson and with the technology’s inventor and developer, Jim Sullivan.

When the image captured by a typical professional video camera is remapped to Rec.709, at least 70% of the color information is thrown away, Sullivan said. One example: grass green and laser green map to the same RGB values. All the energy that goes into maintaining color fidelity is applied only to that reduced gamut and does not address the loss of perceived fidelity that was inflicted early in the process. Part of what EE LLC’s software product, eeColor, does is compensate for that loss in perceived fidelity by intentionally breaking away from “hardware fidelity.” Parkinson noted that the software recomputes the values for each pixel in the frame individually, in real time.
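EE LLC doesn’t publish the contents of its tables, but the general mechanism, remapping every pixel through a 3D look-up table, is easy to sketch. Here’s a minimal illustration using NumPy; the 17-point table size and the identity “transform” are placeholders of my own, not eeColor’s actual data:

```python
import numpy as np

# Hypothetical 17x17x17 RGB-to-RGB table. A real color management product would bake
# its gamut and vision models into these entries; an identity table is used here so
# the round trip can be verified.
N = 17
grid = np.linspace(0.0, 1.0, N)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
lut = np.stack([r, g, b], axis=-1)              # shape (N, N, N, 3)

def apply_3d_lut(pixels, lut):
    """Remap an (H, W, 3) float image in [0, 1] through the LUT with trilinear interpolation."""
    n = lut.shape[0]
    idx = pixels * (n - 1)                      # continuous LUT coordinates per channel
    lo = np.clip(np.floor(idx).astype(int), 0, n - 2)
    frac = idx - lo                             # fractional position inside the enclosing LUT cell
    out = np.zeros_like(pixels)
    for dr in (0, 1):                           # sum over the 8 corners of the cell,
        for dg in (0, 1):                       # weighted by distance from each corner
            for db in (0, 1):
                w = (np.where(dr, frac[..., 0], 1 - frac[..., 0]) *
                     np.where(dg, frac[..., 1], 1 - frac[..., 1]) *
                     np.where(db, frac[..., 2], 1 - frac[..., 2]))
                corner = lut[lo[..., 0] + dr, lo[..., 1] + dg, lo[..., 2] + db]
                out += w[..., None] * corner
    return out

frame = np.random.rand(4, 4, 3)                       # stand-in for a video frame
assert np.allclose(apply_3d_lut(frame, lut), frame)   # the identity table leaves pixels unchanged
```

The appeal of the approach is that however elaborate the underlying color and vision models are, the per-pixel work at playback time collapses to a table lookup and a little interpolation.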

Most natural colors occupy only the central portion of the display’s color space, and eeColor expands that portion to more completely fill the display’s gamut. Doing this blindly, however, would distort “memory colors” such as skin and sky, so eeColor identifies the ranges of color coordinates that contain these memory colors and preserves those pixel values while the overall gamut is expanded. EE LLC calls these patches of preserved color coordinates “filters.”
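To make the “filter” idea concrete, here’s a toy sketch of chroma expansion that spares a protected memory-color zone. It works on generic lightness/chroma/hue values, and the boost factor, hue band, and feathering are invented numbers for illustration; eeColor’s actual models are far more elaborate:

```python
import numpy as np

def expand_gamut(L, C, h, boost=1.25,
                 protect_hue=(10.0, 60.0), protect_max_chroma=0.35, feather=10.0):
    """Toy chroma expansion with a protected memory-color zone.

    L, C, h: arrays of lightness, chroma, and hue angle (degrees).
    Pixels whose hue falls inside protect_hue (a stand-in for skin tones) and whose
    chroma stays below protect_max_chroma keep their original chroma; the boost fades
    in over `feather` degrees outside the zone so no seams appear at the boundary.
    """
    lo, hi = protect_hue
    dist = np.maximum.reduce([lo - h, h - hi, np.zeros_like(h)])  # 0 inside the hue band
    outside = np.clip(dist / feather, 0.0, 1.0)                   # ramps to 1 leaving the band
    outside = np.where(C > protect_max_chroma, 1.0, outside)      # very saturated pixels aren't "memory" colors
    gain = 1.0 + (boost - 1.0) * outside
    return L, C * gain, h                                         # lightness and hue are untouched

# A skin-tone-like pixel keeps its chroma; a green foliage pixel gets the full boost.
L = np.array([0.6, 0.5]); C = np.array([0.20, 0.40]); h = np.array([30.0, 140.0])
print(expand_gamut(L, C, h)[1])   # -> [0.2 0.5]
```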

eeColor incorporates models of skin tone developed at the Rochester Institute of Technology and uses them to maintain skin tone under brightness changes and color remapping. Different skin tones, from African to Scandinavian, are mostly a matter of brightness rather than color shift. But there are cultural preferences: Asians seem to like displays to show skin tones a bit more blue than Americans and Europeans do. This is addressed with a slider in the UI.

Image after eeColor processing.

Venice: the original image.

Sullivan said that engineering the filters was the toughest part of the job. There are times, when the color space is remapped, that the skin-tone vector (for example) must move in the opposite direction from the vector for the overall scene. Strange things can happen at the boundaries between these filters and the overall display color space.

At EE LLC, they use “colorfulness” to describe the color content of a frame: the volume occupied by the frame’s colors within the 3D IPT color space. Percent of NTSC, which the hardware people have been trained to use, “is useless.”

They use the IPT color space because it preserves hue and brightness under gamut mapping. Moving the color portion (PT) of the vector does not change the perceived brightness (I). CIELAB doesn’t do this, nor does RGB, Sullivan said.
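As a rough illustration of that metric (my own sketch of the idea, not EE LLC’s code): if every pixel of a frame is expressed as an (I, P, T) triple, the volume of the convex hull those points span is one straightforward way to score colorfulness, and it also shows that scaling only P and T leaves the lightness channel untouched:

```python
import numpy as np
from scipy.spatial import ConvexHull

def colorfulness(ipt_pixels):
    """Volume of the convex hull spanned by a frame's pixels in IPT space."""
    return ConvexHull(ipt_pixels.reshape(-1, 3)).volume

rng = np.random.default_rng(0)
# Synthetic frame: I in [0, 1], P and T in [-0.2, 0.2] (stand-in values, not a real image).
frame = rng.random((64, 64, 3)) * [1.0, 0.4, 0.4] - [0.0, 0.2, 0.2]

boosted = frame * [1.0, 1.5, 1.5]                     # widen only the chroma (P, T) axes
print(colorfulness(frame), colorfulness(boosted))     # the hull volume grows...
print(np.allclose(frame[..., 0], boosted[..., 0]))    # ...while I (lightness) is unchanged -> True
```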

The vision model used by eeColor allows the viewer to observe Rec.709 imagery in a relatively bright home or office environment and see the colors as they would appear in a darkened cinema. The adaptive color boost works with the behavior of the human visual system.

eeColor’s “management produces greater brightness with the same power, while keeping color quality,” Sullivan said. It is therefore very attractive for battery-operated devices.

The transformations performed by eeColor affect color only. For a complete solution, eeColor needed to handle sharpness and contrast as well. For this, they teamed up with Razzor Technologies, an RIT spin-off. Razzor’s approach also uses LUTs, so it was possible to combine the two sets of technologies into a single software product. Razzor’s sharpening technology avoids the white fringe that convolution filters impose on sharpened edges.

The company’s approach allows every input color to be mapped to an output color based on sound visual models, which has interesting applications. Among these is the ability to adjust for unit-to-unit variations in color rendition, and also to allow product manufacturers to compensate for the color differences between panels from different manufacturers.

The company has also worked with LG Display to minimize use of the blue phosphor and thus retard blue-phosphor aging.

These functions can be implemented in hardware, but Sullivan says EE LLC’s licensing fee per unit is less than the $3.00 unit cost (in volume) of a popular graphics co-processor. And with current personal devices having processing power to spare, EE LLC believes that software is the way to go. (The folks at Pixelworks may disagree.)

eeColor is currently available for the Microsoft and Android operating systems running on popular chips.

Has Sony Finally Seen The Light?

Sony’s ongoing financial woes have been well-documented by this writer over the past few years. Gone are the days when the Tokyo-based electronics giant could invent and own all parts of a media format, like the Walkman and Betacam.

It’s exceedingly difficult to make any money selling hardware to consumers these days, as fellow CE giants Panasonic, Toshiba, and Hitachi have all found out. And one of the biggest loss leaders is the Bravia television business, thanks to cutthroat competition from Samsung and LG, and now Chinese brands like Hisense and TCL.

Sony’s late entry into the LCD television marketplace a little over 10 years ago didn’t help. Back then, the company had OEM deals for LCD and plasma TVs with Pioneer and the aforementioned LG, along with a joint venture with Samsung to manufacture LCD televisions (S-LCD). But even with the Sony brand and decent market share, profits were nowhere to be found.

As losses piled up in the television unit, more red ink started flowing from Sony’s VAIO computer operations (since sold off to Japan Industrial Partners). And in a real head-scratcher, Sony bought out Ericsson’s share of their mobile phone joint venture, only to see that miscalculation produce even more financial misery than the TV group ever did.

Now, CEO Kazuo Hirai has made it official: Sony will no longer chase higher sales in smartphones, where its Xperia models just can’t compete with Samsung and Apple. And Sony won’t get any traction in the world’s largest mobile phone market, China, where home-grown brands like Huawei play a dominant role.

Are the days of “Make Believe” over at Sony?

Significantly, Hirai also said that he would not rule out an exit strategy for both smartphones and televisions. (Sony’s TV operations were recently spun off as a separate operating unit so its losses can be clearly separated from those of the rest of the company.) Sony is on track to post a $1.5B loss for the current fiscal year that ends March 31, continuing a string of down years. Layoffs have continued company-wide, and about 1,200 more employees will be let go from the mobile division this year.

Despite the gloomy news, Sony’s ace in the hole is a burgeoning entertainment division. Sony Pictures, Sony Pictures Television, Sony Music, and PlayStation – taken together – are profitable operations. More than one institutional investor has called for Sony to exit the hardware business altogether and concentrate on content and software, which is where the money is nowadays.

But Sony has such a strong and rich legacy in consumer electronics that they can’t bring themselves to let go of the past, even after posting year upon year of record losses attributable to that same CE hardware. It’s gotten so bad that the company even announced last year that they would not pay a stock dividend for the first time in 50+ years. (Boy, did THAT news wake everyone up!)

In a recent Reuters story, Hirai stated that Sony would target a return on equity of more than 10% by 2018, aiming for an operating profit of $4.2 billion for fiscal 2017. That would be quite a turnaround, given Sony’s performance over the past five years. And it won’t be possible unless the company kisses the TV and phone businesses goodbye, once and for all.

Did Sony learn the lesson of Panasonic, who bit the bullet and shut down their plasma TV manufacturing business cold turkey in 2013, returning to profitability last year? (Panasonic is on track to make about a $2 billion profit for FY 2014.)

Panasonic also shut down other underperforming business units and shifted its focus to commercial products, and it would not surprise me to see them walk away from consumer TVs altogether in the next year or so as their market share is so small.

What about Sony’s Japanese competitors? Hitachi read the tea leaves several years ago and gave up on TVs altogether, while Toshiba is retrenching to the Japanese market. Sharp continues to struggle in the television business as its once-dominant 21% worldwide market share in TV shipments (2006) has dwindled to about 3% and a $250 million loss is staring them in the face for FY 2014.

It seems like everyone but Sony figured out the way back to profitability several years ago. Now, has Sony wised up? Have they finally seen the light?

Time will tell…

Pixelworks Improves Mobile Display Quality While Reducing System Cost

We have just succeeded in wrapping our heads around the fact that quantum-dot enhancement can significantly increase the color gamut and color saturation of LCDs with little — and eventually no — increase in system cost.

Now, with its “Iris” mobile display co-processor, Pixelworks is giving us another example of improved display performance with, in this case, reduced system cost.

In the Pixelworks suite at CES, Graham Loveridge (Senior Vice President of Strategic Marketing and Business Development), said Iris is the world’s first mobile display co-processor. Many of Iris’s functions have been performed by television video processing chips and cores for years. But incorporating those functions and others in a chip that takes up sufficiently little space and consumes sufficiently little power for a mobile device is new.

Pixelworks calls the display performance that results from Iris processing “True Clarity.”

One of the more obvious things Iris does is up-convert mobile-display video from 15 or 30 frames per second (fps) to 60 fps. In side-by-side demonstrations in the suite, this produced motion images with far less judder, much smoother scrolling, and noticeably less motion blur. That shouldn’t be a surprise, since we’ve seen the same evolution in large-screen television: Iris uses the same kind of motion estimation and motion compensation (MEMC) algorithms that TVs have employed for years. Loveridge said that Iris is unique in that it does MEMC without producing a halo around moving images.
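Production MEMC engines are heavily tuned and proprietary, but the core idea can be sketched in a few lines: estimate a motion vector for each block between two frames, then build the in-between frame by shifting blocks halfway along those vectors. The toy version below (my own simplification, with a brute-force block matcher and none of the occlusion or halo handling Loveridge alluded to) only illustrates that structure:

```python
import numpy as np

def interpolate_frame(prev, curr, block=8, search=4):
    """Toy MEMC: estimate per-block motion between two grayscale frames and
    synthesize the halfway frame by moving each block along half of its vector."""
    H, W = prev.shape
    mid = np.zeros_like(prev)
    for y in range(0, H - block + 1, block):
        for x in range(0, W - block + 1, block):
            ref = prev[y:y + block, x:x + block]
            best, best_dy, best_dx = np.inf, 0, 0
            # Brute-force search of a small window for the best-matching block in the next frame.
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy <= H - block and 0 <= xx <= W - block:
                        sad = np.abs(curr[yy:yy + block, xx:xx + block] - ref).sum()
                        if sad < best:
                            best, best_dy, best_dx = sad, dy, dx
            # Drop the block halfway along its motion vector to form the in-between frame.
            my = min(max(y + best_dy // 2, 0), H - block)
            mx = min(max(x + best_dx // 2, 0), W - block)
            mid[my:my + block, mx:mx + block] = ref
    return mid

# 30 fps in, 60 fps out: insert one interpolated frame between every original pair.
prev = np.random.rand(64, 64)
curr = np.roll(prev, 2, axis=1)          # simulate a simple 2-pixel horizontal pan
between = interpolate_frame(prev, curr)
```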

Pixelworks also claims enhanced colors and wider gamut through the use of a 3D look-up table, better contrast, better high-ambient visibility, and custom color tuning. The color tuning, Loveridge said, can be used to make sure that all displays in a production run look the same. But more than that, the OEM can buy displays from different manufacturers and tune them so they all look the same.

What is surprising is that all of this can be done with a power reduction of roughly 25%. Some of the savings come from the Iris chip off-loading functions from the GPU and CPU and performing them more efficiently.

Because the Iris chip permits savings elsewhere in the display electronics, it can save $6 on panel cost, said Loveridge. The power savings permit a smaller battery, which can save another $2. Depending on order size, the Iris chip can be had for less than $6. So, said Loveridge, “if a manufacturer is savvy he can improve system performance and simultaneously lower cost.”

TV-quality LCD cells are increasingly common in mobile devices. With the addition of TV-quality video processing, it will be even more appealing for viewers to do more of their entertainment viewing on mobile devices.