
A Trend Is A Trend – Until It Isn’t

A story posted on the CNET Web site on August 22 might have gone unnoticed – except that it shows that the tide is now flowing the other way when it comes to smartphones and tablets.

By “tide,” I mean market forces and analyst predictions. The former is showing a decided preference for ever-larger smartphone screens, while the latter is prematurely writing the epitaph for notebook and laptop computers.

When the first iPads burst onto the market, everyone had to have one. There was nothing like it, and what we know as a smartphone was still in the toddler stage, with small screens and limited ability to take photos and stream videos.

Indeed, as recently as two years ago, analysts were predicting that sales of desktop PCs would eventually fizzle out and notebook computers would follow in short order. To some extent, they were right – you can now buy high-powered notebooks for less than $500, a consequence of lowered demand and an oversupply of components, including LCD screens.

But analysts are often more wrong than right, and they definitely got it wrong with the future of larger smartphone screens. “No one will buy a phone with a 5-inch screen. And a 6-inch screen? That’s crazy!” they thundered.

Um, guys – the hottest smartphone category now is that same 5-inch to 6-inch screen size. Apple’s sold plenty of iPhone 6s, as has Samsung with their Galaxy 5 and 6. I upgraded from a Motorola Droid Razr Maxx (4.7” OLED screen) to a Galaxy 5 (5.5” OLED screen) last December, and love it. I rarely make calls with it, but I do text, take pictures, shoot video, check sports scores, and even read newspapers while having breakfast or when traveling.

The ever-larger size of smartphones, combined with a somewhat stagnant market for phone sales, has depressed the sales forecasts for tablets. According to the CNET article, “Sales of slate-style tablets are expected to fall 8 percent, according to a report from research firm Strategy Analytics. Sales in Apple’s iPad business, meanwhile, fell 18 percent year over year in its most recent quarter, the sixth consecutive quarterly decline.”

How fast things change. Back in early 2014, tablet sales were forecast to grow 18% by the end of the year. Now, we’re seeing the numbers run in reverse. And part of the problem is that people don’t turn over tablets as fast as they do phones – my wife still uses an iPad 2 from 2011, although the battery is starting to go.

I have a Barnes & Noble Nook HD that’s also vintage 2011 and hardly gets any use anymore, thanks to my new Samsung Galaxy Tab 8.4. Somewhere in a drawer, I have a Nook reader with “Glowlight” that crapped out about six months ago. (And my wife’s Nook Tablet, vintage 2010, still works just fine.)

So, where’s the growth in mobile computing devices? Looks like it’s now happening with so-called “2-in-1s”: devices that combine a detachable keyboard with a larger tablet screen. Microsoft’s Surface Pro is one example; Lenovo’s Yoga Pro is another. The CNET article says that sales of these devices are expected to grow by 5x this year over last, and new processors such as Intel’s Core M give them CPU speeds comparable to midrange laptops.

In terms of turnover, tablets are lasting 5 to 7 years. (Not good news for Apple, I suspect!) Smartphones are still driven by the length of service contracts, nominally 2 years. But Intel claims that buyers of 2-in-1s are turning over laptops and notebooks much more frequently – on average, 18 to 24 months.

We’ve also seen sales of much larger tablet screens pick up. Samsung’s 10.4-inch Galaxy Tab is popular, and the CNET story mentions a rumor that Apple plans to unveil a 13-inch iPad Pro this fall. (No word on whether it will have a detachable keyboard, a feature that Apple has resisted for now.)

The demand for the Surface Pro product stands in stark contrast to its earlier failures at launch three years ago. (Wow, has it been THAT long?) At one point, the company had hundreds of thousands of unsold units sitting in warehouses, no doubt due to the public’s emphatic rejection of Windows 8 software.

Now, Surface Pros are a popular product, and can run special versions of Office software. With the gradual acceptance of cloud-based storage in place of optical drives, these tablets can be quite powerful while staying thin and lightweight.

A move to larger screens on smartphones can’t continue indefinitely: The 6-inch Galaxy is about the largest phone size I can fit into a shirt or pants pocket, so we may be hitting a wall in that area (although I have heard of plans by one Chinese brand to come out with an 8-inch 4K smartphone!).

So if any device will be sacrificed on the CE altar, it will be mid-sized tablets – 7 to 9 inches – and that’s already happening, based on market numbers. As the owner of a still-running Toshiba 10.4” notebook with Windows 7, I’m intrigued by the idea of replacing all of that weight with a same-size tablet and keyboard – and a higher-resolution display, too.

For AV connectivity, the market switch creates its own headaches. Micro HDMI? MHL? Lightning? In all likelihood, the interface of choice will become wireless, most likely using 5 GHz Wi-Fi channel bonding technology for more reliable video streaming. Or, we may see some early adopters of 60 GHz wireless links for “2-in-1s,” using the 802.11ad protocol or SiBEAM’s Snap wireless docking system.

Keep your eye on the new USB 3.1 Type-C connector. This could be a game-changer: Like Lightning, it is symmetrical and thus reversible. It can carry high-speed data (up to 10 Gb/s) and DC power for charging, and in Alternate Mode it can transport display signals such as DisplayPort 1.3 (packet-based) and superMHL (TMDS).
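To put some rough numbers on that, here’s a quick back-of-the-envelope sketch of how much uncompressed video an Alternate Mode link could carry, using published DisplayPort 1.3 figures (HBR3 at 8.1 Gb/s per lane, 8b/10b coding, four lanes). It ignores blanking and protocol overhead, so treat it as an illustration rather than a spec calculation.

```python
# Back-of-the-envelope check: does an uncompressed video mode fit within a
# Type-C Alternate Mode link budget? The link figures are published DisplayPort
# 1.3 numbers; blanking and protocol overhead are ignored, so results are rough.

HBR3_LANE_RATE_GBPS = 8.1      # raw line rate per lane (DP 1.3 HBR3)
CODING_EFFICIENCY = 0.8        # 8b/10b encoding leaves 80% for payload
LANES = 4                      # all four high-speed lane pairs given to DisplayPort

def payload_capacity_gbps(lanes=LANES):
    return lanes * HBR3_LANE_RATE_GBPS * CODING_EFFICIENCY

def pixel_rate_gbps(width, height, fps, bits_per_pixel=24):
    # Active-pixel data only; a real link also carries blanking, audio, etc.
    return width * height * fps * bits_per_pixel / 1e9

for name, (w, h) in {"1080p60": (1920, 1080), "4K60": (3840, 2160)}.items():
    needed = pixel_rate_gbps(w, h, 60)
    available = payload_capacity_gbps()
    verdict = "fits" if needed < available else "does not fit"
    print(f"{name}: needs ~{needed:.1f} Gb/s, link payload ~{available:.1f} Gb/s -> {verdict}")
```

Even 4K at 60 Hz leaves plenty of headroom on paper, which is why the connector looks so attractive as a single do-it-all port.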

It’s a good bet that as the market ramps up production of “2-in-1s,” they’ll include the Type-C interface and probably drop everything else except power connections. For that matter, Type-C is in a position to displace everything from Mini DisplayPort to HDMI as it is the closest thing we’ll have to a do-everything, universal I/O connector going forward.

As for picking winners and losers in the smartphone/tablet/notebook/laptop game, better leave that to the “experts.” They’re just as confused as anyone else…

We’re Not Having Fun Anymore…

Last Friday (7/31), Sharp Corporation announced that it would finally throw in the towel and withdraw from marketing and selling televisions in “the Americas,” opting to sell the company’s LCD TV manufacturing plant in Mexico to emerging Chinese CE giant Hisense.

Sharp also indicated that it would allow Hisense to sell TVs on this side of the Pacific that are branded with the Sharp name. (Hitachi, JVC, and Toshiba have similar arrangements.) This announcement came just months after the announcement that industry marketing veteran Peter Weedfald was being hired by the company, presumably to try and turn the U.S. consumer electronics operation’s fortunes around.

Sharp also had a nice line show of nine new Ultra HD (4K) TVs back in May, signifying a commitment to the U.S. TV market. Now, it appears everything was for naught.

This has to be quite a blow to the ego of the company that basically created the LCD television business, and that just 9 years ago held a 21% worldwide market share in LCD TV shipments. Today? They’re not even on the radar, having ceded ground to Samsung, LG, Sony (who also is struggling), and most recently, Hisense and TCL.

There are nine new Ultra HDTVs in the Sharp line, ranging from 43 inches to 80 inches. What will happen to them now?

But it’s the right move. The company’s Generation 10 LCD fab in Sakai, Japan – the world’s largest – became a white elephant almost immediately as the world went into a recession in 2008-2009. Pressed for cash, Sharp sold 46% of the Sakai fab capacity to Hon Hai Precision Industries for about 20 cents on the dollar not long after the plant opened.

Several years of red ink followed, as did numerous rounds of financing. In the 1st quarter of this year (April – June), Sharp booked an operating loss of 28.8 billion yen ($231.87 million), down from a 4.7 billion yen profit a year prior, with a net loss of 34 billion yen ($270 million). The company still maintains it will be profitable to the tune of 80 billion yen ($644 million) by the end of March 2016.

According to a Reuters story, Sharp’s CEO Kozo Takahashi was “…open to major restructuring including some kind of strategic deal for its LCD business.” It’s well-known that Hon Hai CEO Terry Gou would love to buy the Sakai facility – he already owns almost 50% of its capacity, and Hon Hai (better known as Foxconn) sources LCD glass from Sharp for various Apple i-products. Gou also stated earlier this year that he would be willing to put more money into Sharp in exchange for a seat on its board.

While Sharp continues to wrestle with red ink, Sony posted what appeared to be positive financial results for its 1st quarter. The once-formidable CE brand logged an operating profit of 97 billion yen ($781 million), far exceeding the estimates of financial analysts.

There’s no question that camera sensor manufacturing is a lucrative business for Sony. Hundreds of millions of cameras and phones use Sony sensors, and the company announced a few months ago that it would expand sensor manufacturing capacity at two plants in Japan.

The strong first quarter was helped by an increase in operating income for its gaming (PlayStation) division of 350% to 19.5 billion yen ($153 million). So everything is coming up roses in Tokyo – right?

Not really. Sony’s beleaguered mobile phone division racked up a loss of 22.9 billion yen ($184 million) for the same quarter, and according to a Reuters story, the company is now predicting a loss of 60 billion yen ($483 million) for the fiscal year that ends next March, citing “a significant decrease in smartphone unit sales resulting from a strategic decision not to pursue scale in order to improve profitability”.

Drilling down into Sony’s Q1 FY2015 Consolidated Financial Results, the Home Entertainment and Sound group posted 168.9 billion yen ($1.36 billion) in sales during Q1, with operating income of just 7 billion yen ($56 million). (That is a margin of 4.1%.) Sales were down 13.8% from the same period last year due to a “…decrease in unit sales of LCD televisions, mainly in the mid-range” and a “decrease in home audio and video unit sales reflecting contraction of the market.”

For all of 2014, Sony sold 14.6 million LCD TVs. Their current forecast calls for 11.5 million to be sold by the end of next March, a drop of 21% Y-Y. (2.6 million LCD TVs were sold by the company in the first quarter.) The TV business has long been a cash-sucker and Sony has been racking up losses in this market segment for a decade.

If Sony were to cut loose its mobile and home entertainment businesses, it would be quite the profitable company. For that matter, even the digital camera segment is seeing a downturn: the full-year forecast for camera sales (5.9 million units) represents a drop of 30% from last year’s numbers – which were themselves down 26% from 2013.

Aside from PlayStation, the long-term view for Sony’s consumer business isn’t good. Cameras in general are being displaced by smartphones, and even powerhouses like Samsung are seeing their mobile phone business decline as Chinese companies gain more market share in Asia.

And it’s pretty clear what’s happened to the Japanese TV business – only Sony, Sharp, and Panasonic retain a presence in the United States, and Sharp just announced it’s getting out. Look for Panasonic to do the same by year’s end, as their market share is minuscule and supporting continued sales of televisions doesn’t make much sense financially.

That just leaves Sony, who once proudly exclaimed that they had a chain of products “from lens to screen.” Well, that was in the good old days, when everyone was having a great time selling consumer electronics.

But we’re not having fun anymore…

“The Only Disruptive Technology at Display Week”

At SID Display Week, in an aisle on the show floor, I had a brief conversation with Candice Brown-Elliott, Nouvoyance CEO and creator of the Pentile Matrix pixel configuration widely used in Samsung OLED displays. She said that micro LED was the only disruptive technology she saw at Display Week. In addition to being a trusted colleague, Brown-Elliott has the rare gift of being both an insightful technical visionary and an effective engineer who doesn’t mind getting her knuckles scraped and her fingernails dirty. When Brown-Elliott says a technology is disruptive, I pay attention.

Just as remarkable as this technology’s potentially transformative nature is that micro LEDs (or microscale LEDs or µ-ILEDs) were not well known outside the relatively small community of people who work on them before Apple acquired LuxVue last year, at which point a much wider community started scrambling to learn about them.

Clearly, it would be very attractive to make phone, tablet, and TV displays from inorganic LEDs, but there has been no inexpensive way to assemble LED chips into RGB arrays of the appropriate density. If it were possible, such displays could be several times as efficient as OLEDs and have longer lifetimes.

So, the room was crowded when John Rogers — a professor at the University of Illinois and co-founder of and technology advisor to X-Celeprint — presented a Monday seminar entitled “Microscale LEDs for Multifunctional Display Systems.” What Rogers and his colleagues, along with a handful of other micro-LED companies, have learned to do is initiate the epitaxial growth of AlInGaP LEDs on recyclable GaAs wafers. Rogers described a process for making multiple layers of LEDs with sacrificial layers in between that allow the layers to be lifted off. That’s impressive, but it solves only half the problem. If we went no further, we could do no more than make expensive wafer-sized displays.

Multilayer epitaxial lift-off (Graphic: John Rogers)

The second part of the solution was covered by Chris Bower, CTO of X-Celeprint (Cork, Ireland), who described the company’s technology for performing transfer printing of the chips using elastomeric stamps that exploit peel-rate-dependent adhesion. To oversimplify shamelessly: place the stamp on the layer of chips and peel it off quickly, and the chips adhere to the stamp. Press the stamp onto the target substrate and peel it off slowly, and the chips adhere to the target. This is also impressive, but it still doesn’t create LED arrays any larger than the original lattice-matched array.

As it turns out, it is relatively simple to impose patterns on the stamps that result in picking up every 10th, 20th, or nth LED before depositing them on the substrate. In this way, you can go from the dense array of the original wafer to a sparse array on the target substrate. In principle, this allows you to make µ-ILED displays of virtually any diagonal. Bower said that X-Celeprint has made 150-mm stamps. Making larger ones is just a matter of engineering, he said, not science.
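To make the pitch arithmetic concrete, here’s a small sketch of what “pick up every nth LED” buys you. The wafer pitch, stride, and panel dimensions below are placeholders I’ve made up for illustration; only the 150-mm stamp size comes from Bower’s talk.

```python
import math

# Sketch of the pitch arithmetic behind selective transfer printing: the stamp
# picks up every nth micro-LED from a dense donor wafer, so the printed array
# on the display substrate has n times the wafer pitch. Numbers are illustrative
# placeholders, not X-Celeprint's actual process values.

wafer_pitch_um = 30        # hypothetical LED center-to-center spacing on the donor wafer
stride = 20                # pick up every 20th LED in each direction

display_pitch_um = wafer_pitch_um * stride     # 600 um between printed LEDs
display_ppi = 25_400 / display_pitch_um        # 25,400 um per inch

stamp_size_mm = 150                            # stamp size X-Celeprint says it has made
panel_mm = (1210, 680)                         # roughly a 55-inch panel, for illustration
impressions = (math.ceil(panel_mm[0] / stamp_size_mm)
               * math.ceil(panel_mm[1] / stamp_size_mm))

print(f"printed pitch: {display_pitch_um} um (~{display_ppi:.0f} ppi)")
print(f"stamp impressions to tile a {panel_mm[0]} x {panel_mm[1]} mm panel: {impressions}")
```

The point is simply that the stride converts wafer-scale density into display-scale pitch, and the stamp count to tile a large panel stays manageable.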

Now, is it obvious that if you can transfer-print µ-ILEDs, you can also transfer-print CMOS switching circuits and no longer worry about the instability issues of a-Si and IGZO TFTs or the scalability issues of LTPS? Well, even if it’s not obvious, Rogers discussed it in his seminar. In fact, you can transfer-print many kinds of “chiplets,” and even assemble them in three-dimensional structures. Displays are only one application of the technology.

The first µ-ILED display we see in a commercial product may very well come from LuxVue and appear in an Apple Watch next year.

Looking farther forward, is it possible that µ-ILED, not OLED, will become the universal display that replaces LCD? All of us in the display community should be thinking about that question, and thinking hard.

How Sharp Makes Its “Free-form” Displays

For some time Sharp has been showing examples of its “free-form” displays, which do both the “row” and “column” driving through the same edge of the display, leaving the rest of the display to be cut in curves or other unusual shapes. But Sharp had not been willing to describe in detail how it distributed the gate drivers throughout the display so that no conventional row drivers mounted on a vertical display edge are necessary.

At the most recent SID Display Week, held in San Jose in early June, that changed. In the Sharp booth, Automotive Marketing Director for Display Products Thomas Spears did his best to explain the innovation, but it was hard for him to do so in any detail amidst the cut and thrust on the show floor. More detail was available from the invited paper by Hidefumi Yoshida and 13 colleagues from Sharp in Nara, Japan. The paper, “Flexible Flat-Panel Display Designs with Gate Driver Circuits Integrated within the Pixel Area,” described Sharp’s truly clever approach.

Yoshida and friends began with a well-known technology, gate driver monolithic circuitry (GDM). With GDM, the shift registers and output transistors of the gate drivers are deposited on the vertical edge of the display at the same time as the switching transistors are fabricated. This is an alternative to the more conventional approach of using ICs for the gate driver circuitry. Since GDM circuitry can occupy significant real estate on the vertical edge of the display, especially when implemented in amorphous silicon, it requires a wide bezel, which is not compatible with current display preferences or with gracefully curved display contours.
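For readers who haven’t run into GDM before, here’s a toy behavioral model of what those shift registers do: a single start pulse is clocked down the register, asserting one gate (row) line per clock so the source drivers can write that row. This is purely conceptual and says nothing about Sharp’s actual circuit or transistor layout.

```python
# Toy model of a gate-driver shift register: one start pulse propagates down
# the register, asserting one row's gate line per clock. GDM builds this out
# of TFTs on the panel itself; this sketch is conceptual only.

ROWS = 8   # tiny panel for illustration

def scan_frame(rows=ROWS):
    register = [0] * rows
    register[0] = 1                      # start pulse enters the first stage
    for clock in range(rows):
        active_row = register.index(1)
        print(f"clock {clock}: gate line {active_row} asserted")
        register = [0] + register[:-1]   # shift the pulse to the next stage

scan_frame()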

Here’s where Sharp’s cleverness comes into play.

At SID Display Week, Sharp showed this “free-form” LCD with curved corners and very thin bezel. (Photo: Ken Werner)

First, instead of putting the GDM circuitry on the vertical edge(s) of the display, Sharp locates it in one or more vertical “bands” within the display area. I’ve put “bands” in quotes because Sharp has done far more than simply shift the left-edge circuitry into the image area as a block (which would create dead areas within the display). Sharp also disperses the transistors of the GDM circuitry so individual transistors are located at separate pixel locations and interconnected via additional surface connections and a large number of through holes. Thus, the gate driver control signals enter through the bottom edge of the display, which is also where the source drive ICs are located. The gate signals travel horizontally from the dispersed GDM circuits to the pixels, entirely within the image area. This allows the left, right, and top edges of the display to have very thin bezels, which can be shaped with great freedom. Sharp has shown a triple-curve display that is appropriate for the tachometer, speedometer, and combined temperature/gas gauge in a primary automotive instrument display.

This is a significant innovation in display architecture that is, as Yoshida et al. carefully note, just as applicable to OLED displays as to LCDs. Sharp’s Thomas Spears said there was very significant interest in the displays from automobile manufacturers, and that Sharp was seriously engaged with all of them. We will see these displays in (or as) auto instrument clusters but, given automobile design cycles, probably not until 2017.

The Wires Remain The Same. Only the Format Has Been Changed (to Confuse the Innocent)

For the longest time, the pro AV industry was characterized by proprietary cabling formats: One piece of coax with BNCs (or yellow RCA plugs) for composite video. A 15-pin HD-15 connector for VGA. DIN connectors for S-video. And RJ-45 plugs for twisted-pair analog signal extenders.

With the advent of digital signal interfacing, we’ve got a slew of new connectors that look nothing like their predecessors: The 19-pin HDMI plug. The 20-pin DisplayPort plug. Micro USB. Type-C USB. DVI. And RJ-45 plugs for twisted-pair digital signal extenders.

Wait – what? We’re still using RJ-45 plugs and category wire? Apparently so, and we’ve now migrated to the more robust category 6 wire (rated for 1GigE connections), more often than not equipped with shielding to minimize crosstalk and ground wires for longer signal transmission distances.

The thing is, we’re now facing a new set of challenges in the way we multiplex and transport video, audio, RS232, IR, USB, metadata, and even power. One camp advocates for using a proprietary system (HDBaseT) that currently has a practical limit of about 330 feet and is still limited to supporting the older HDMI 1.4 standard. But it transports uncompressed signals and is very popular in the InfoComm world.

The other camp is advocating that we compress and convert all video/audio/data to packets and transmit them with IP headers through conventional networks. This method increases transmission distance considerably and can run over copper or optical fiber (or even coax, for that matter), through conventional, open-system network switches. This approach is favored by telecom companies, along with broadcast networks, IPTV services, and other multichannel video system operators.
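If “convert everything to packets with IP headers” sounds abstract, here’s a deliberately simplified sketch of the idea: chop a compressed frame into MTU-sized chunks, prepend a small sequence-number/timestamp header, and send them as ordinary UDP/IP datagrams that any switch can forward. Real systems use RTP and proper H.265 payload formats; this is just the concept, and the addresses are hypothetical.

```python
import socket
import struct

# Minimal sketch of the "compress, packetize, add IP headers" approach: one
# encoded video frame is split into MTU-sized payloads, each prefixed with a
# tiny sequence/length/timestamp header, and sent as plain UDP/IP datagrams.
# This is an illustration, not a real RTP/H.265 payloader.

MTU_PAYLOAD = 1400          # bytes of payload per datagram (leaves room for headers)
DEST = ("239.1.1.1", 5004)  # hypothetical multicast group and port

def send_frame(sock, frame_bytes, seq, timestamp):
    """Split one encoded frame into UDP datagrams with an 8-byte header."""
    for offset in range(0, len(frame_bytes), MTU_PAYLOAD):
        chunk = frame_bytes[offset:offset + MTU_PAYLOAD]
        header = struct.pack("!HHI", seq & 0xFFFF, len(chunk), timestamp)
        sock.sendto(header + chunk, DEST)
        seq += 1
    return seq

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    fake_encoded_frame = bytes(50_000)   # stand-in for one compressed access unit
    send_frame(sock, fake_encoded_frame, seq=0, timestamp=90_000)
```

Once the content looks like ordinary IP traffic, distance and routing become the network’s problem rather than the AV gear’s.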

Now, another camp says that they’ve developed a “better mousetrap” for doing AV-over-IP, using a low-latency protocol known as BlueRiver NT that uses light compression on video and audio.

So which is the way to go? That’s not an easy question to answer, but the most common approach to transmitting digital video and audio over long distances is solution #2: using MPEG compression and standard IP protocols to transport video and audio through everyday networks and switches.

What’s more, it’s likely to stay that way. While the HDBaseT format works very well, it is based on a proprietary pulse-amplitude modulation (PAM) scheme that requires chipsets manufactured by Valens Semiconductor. And there is that distance limitation, although support for optical fiber is now in the standard. But you can’t run HDBaseT signals through conventional network switches.

The BlueRiver NT approach (designed by AptoVision) claims to improve on conventional AV-over-IP transmission while retaining low latency with Adaptive Clock Re-synchronization. This technique interleaves audio, video, Gigabit Ethernet, and other signals with an embedded clocking mechanism.

According to AptoVision, this approach recovers the clocks for both audio and video at the decoder end with only a few lines of latency while remaining fully synched to the source clock across the entire network, even through switches. The light compression cranks down data rates by 50% with a “lossless” two-step codec.

While you can run BlueRiver NT-coded video and audio through conventional IP networks and switches, you must use their proprietary codec in transmitters and receivers. So it’s not a true “open” system, although it is more flexible than HDBaseT for installation in a network environment.

So, back to conventional AV-over-IP, which (come to think of it) isn’t really that “conventional” nowadays. Thanks to the new HEVC H.265 codec and a series of real-time protocols, it’s now possible to stream 1080p content with conventional IP headers through any network and switch and decode it with any H.265-compatible device, like a set-top box or media player, or even a new Ultra HDTV.

And your 1080p content can travel through networks at speeds as slow as 1 to 2 megabits per second, yet still yield good image quality when decoded. Compare that to the current 6 – 10 Mb/s requirement for 1080p/60 using H.264 AVC coding, and you’re seeing quite an improvement.
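Using those quoted bitrates, the stream-count arithmetic is easy to sketch. The 70 percent usable-capacity figure below is my own headroom assumption, not a measured number.

```python
# Rough stream-count arithmetic using the bitrates quoted above. The 70%
# usable-capacity figure is an assumed headroom allowance for overhead and
# other traffic, not a measured value.

link_capacity_mbps = {"1GigE": 1_000, "10GigE": 10_000}
usable_fraction = 0.7

# Worst-case ends of the quoted ranges: ~2 Mb/s for H.265, ~10 Mb/s for H.264.
per_stream_mbps = {"H.265 1080p": 2, "H.264 1080p/60": 10}

for link, cap in link_capacity_mbps.items():
    for codec, rate in per_stream_mbps.items():
        streams = int(cap * usable_fraction // rate)
        print(f"{link}: about {streams} simultaneous {codec} streams at {rate} Mb/s each")
```

Even on an ordinary gigabit network, the H.265 numbers leave room for hundreds of channels, which is exactly why the IT crowd likes this approach.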

H.265 decoder chips are now widely available from Broadcom, which means that a whole host of displays and media players can be used to decode AV-over-IP signals – you aren’t stuck with a proprietary system. What’s more, AV-over-IP systems aren’t restricted by bandwidth in their transmitters and receivers. If the network can handle 1 Gb/s of data, so be it. And if you are fortunate enough to tie into a 10GigE network with optical fiber, the sky’s the limit!

Now, none of what I just wrote says these systems can’t co-exist. It may make sense to use HDBaseT extenders (or BlueRiver NT versions) to connect from a decoder to distant displays. Or, the input of an encoder could be fed by an HDBaseT / BlueRiver receiver.

The advantage of a 100% AV-over-IP system is that it nicely sidesteps the current speed limit problems we’re experiencing with HDMI, and to a lesser extent, DisplayPort. We’ve reduced the video and audio signals to a baseband format and compressed them into packets, which can travel through ANY manufacturer’s IP switching and routing gear.

Best of all, the addressing is done in software with IP addresses, which means the “switch” never runs out of physical inputs and outputs and is easily scalable. If you didn’t specify enough inputs and outputs on a matrix switch for HDMI, you’ve got a problem! But if you need to connect more displays through an AV-over-IP system, you just need more IP addresses.
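A toy sketch makes the point: in an AV-over-IP system, the “matrix” is just a table mapping encoder addresses to subscribed decoders, so adding an output is adding an entry. The addresses are hypothetical, and real systems manage this with multicast (IGMP) subscriptions rather than a Python dictionary.

```python
# Toy sketch of software-defined AV routing: the "matrix" is a table mapping
# encoder addresses to the decoders subscribed to them. Adding a display means
# adding an IP address to a set; no hardware re-sizing. Addresses are hypothetical.

routes = {}

def connect(encoder_ip, decoder_ip):
    routes.setdefault(encoder_ip, set()).add(decoder_ip)

def disconnect(encoder_ip, decoder_ip):
    routes.get(encoder_ip, set()).discard(decoder_ip)

# Route one source to three displays, then add a fourth later.
connect("10.0.0.10", "10.0.1.21")
connect("10.0.0.10", "10.0.1.22")
connect("10.0.0.10", "10.0.1.23")
connect("10.0.0.10", "10.0.1.24")   # "adding an output" is just another entry

print(routes)
```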

In the near future, you can count on hearing plenty of debates about which of these formats is “the way to go” for digital signal distribution. HDBaseT is widely entrenched in the commercial AV world (and to some extent, in home theater). But it’s not popular with IT-savvy users, where conventional MPEG/AES and IP headers rule the day.

And it remains to be seen how much traction BlueRiver NT can gain in the pro AV space. Some manufacturers are already supporting this format as a better way to do AV-over-IP than H.265. Latency issues with any video codec are largely a result of both compression and forward error correction, and we’re still in the early stages of H.265 adoption. So it’s a little too early to pick winners and losers here.

Frankly, if I were designing a high-performance video network, I’d use 100% optical fiber cabling and H.265/IP to get the job done, running everything through 1GigE or (if the budget permitted) 10GigE switches and using fiber-to-“whatever” receivers/converters at all terminations.

That would essentially guarantee future-proofing of the installation, as all I’d need to do to connect an upgraded interface would be to swap out a plug-in card or install a low-cost black box as needed.

But that’s just me…