Category: The Front Line

Display Interfacing: Welcome to Babylon

For many years, ‘interfacing’ a video signal meant plugging in a yellow RCA or silver BNC connector that carried composite video. As picture resolution went up, computers became commonplace at work and home, and the term ‘progressive scan’ entered the lexicon, we saw the birth of S-video and then component video (YPbPr and RGB).

So we adapted, building switching and distribution gear that could handle one-, two-, and three-wire formats. All was well and good…until ‘digital’ made its grand entrance about 15 years ago.

Now, we have digital versions of component and RGB video, starting with the Digital Visual Interface (DVI) and moving to High Definition Multimedia Interface (HDMI), DisplayPort, and the new superMHL interface that (according to the MHL Alliance) will start appearing on televisions as soon as December.

If I’m a consumer, I mostly don’t care about any of this. As long as I can plug in my set-top box, Blu-ray player, and other gadgets with the right cables I can find at Best Buy, this is just a bunch of alphabet soup.

However, if I’m an integrator (consumer or commercial), then I care VERY much about where all of this is heading. And if I’m paying any attention at all to the growing market for 4K and UHD, then I’m rightfully concerned about the impending problems with interfacing these signals.

superMHL is certainly fast enough to handle UHD. But you can’t find it in pro AV gear yet. Is there a better way?

Consider that:

*Even though HDMI 2.0 was announced in September of 2013 – TWO FULL YEARS AGO – virtually no manufacturer in the pro AV space supports this interface on their switchers and distribution amplifiers. Instead, the vast majority are still providing version 1.4 while claiming these products are “4K compatible” or “4K ready” because version 1.4 is just fast enough to pass an Ultra HD (3840×2160) signal at 30 Hz with 8-bit RGB color. That’s setting the bar kinda low, isn’t it?

*Some computer manufacturers don’t even support HDMI, like Apple (DisplayPort) and Lenovo (also DisplayPort). So, now you have to carry dongles everywhere you go?

*HDMI 2.0 arrives hand-in-hand with a new version of copy protection (HDCP 2.2) which is much more rigorous than versions 1.3 and 1.4. If a valid HDCP key exchange isn’t made within 20 milliseconds, the connection will shut down. Period.

*HDMI 2.0 isn’t fast enough for what UHD is turning out to be – a real departure from 1080p and Wide UXGA, with a move to 10-bit color to support high dynamic range (HDR), wide color gamuts (WCG), and high frame rates (HFR). DisplayPort 1.2 can barely support these requirements; DP version 1.3 and superMHL are better positioned to handle the job.

*The intellectual property behind HDMI and superMHL is owned by the same company – Lattice Semiconductor – and whereas once there were clear dividing lines between the two interfaces (MHL was designed originally for smartphones and tablets), they are now competing against each other. I’ve even sat in on presentations where it was explained that both could exist on consumer TVs. (And why would that make sense, again, when neither interface has been widely deployed to date, and one is clearly an improvement over the other?)
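The bullet points above come down to simple arithmetic. Here’s a back-of-the-envelope sketch in Python: the link rates are the nominal aggregate figures from the published specs, and the 80% usable-payload factor is my own rough assumption standing in for 8b/10b-style line-coding overhead (real payload capacity also depends on blanking and format details).

```python
# Back-of-the-envelope video bandwidth math for the formats discussed above.
# Interface ceilings are nominal aggregate link rates from the published
# specs; the usable payload is lower after line-coding overhead.

def payload_gbps(width, height, fps, bits_per_channel, channels=3):
    """Uncompressed pixel data rate in Gb/s (ignores blanking intervals)."""
    return width * height * fps * bits_per_channel * channels / 1e9

formats = {
    "UHD 30 Hz, 8-bit RGB (the 'HDMI 1.4 is 4K-ready' case)": (3840, 2160, 30, 8),
    "UHD 60 Hz, 8-bit RGB": (3840, 2160, 60, 8),
    "UHD 60 Hz, 10-bit RGB (HDR/WCG)": (3840, 2160, 60, 10),
    "UHD 120 Hz, 10-bit RGB (HDR + HFR)": (3840, 2160, 120, 10),
}

# Nominal aggregate link rates in Gb/s, before line coding:
links = {"HDMI 1.4": 10.2, "HDMI 2.0": 18.0, "DisplayPort 1.2": 21.6,
         "DisplayPort 1.3": 32.4, "superMHL": 36.0}

for name, (w, h, fps, bpc) in formats.items():
    rate = payload_gbps(w, h, fps, bpc)
    # Assume roughly 80% of the raw link rate is usable payload:
    fits = [link for link, cap in links.items() if cap * 0.8 >= rate]
    print(f"{name}: {rate:.1f} Gb/s payload; fits on: {fits or 'none of the above'}")
```

Run the numbers and you’ll see why UHD at 60 Hz with 10-bit color squeezes past DisplayPort 1.2 but not HDMI 2.0 – and why high frame rates push everything toward chroma subsampling or faster links.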

You can imagine what this trend is doing to product designers and manufacturers. Sure, HDMI is a “safe bet” for now, but what if our UHD needs quickly outstrip its maximum clock speed? DP is certainly faster and there appears to be more support for it from computer manufacturers. But superMHL is faster still. Shouldn’t your interfaces at least have a head start on display manufacturers?

This reliance on HDMI has led several manufacturers into a potential trap, investing heavily in signal distribution architectures that may quickly run into a “future-proofing” problem. In contrast, outside the commercial AV industry, everyone from cable TV system operators to broadcasters and telecom operators is busy migrating to an IP-based architecture.

Not only does IP-based architecture have the advantage of being a relatively open system, it also solves many of the speed issues as 1-gigabit and 10-gigabit networks are becoming more commonplace. (Heck, Comcast just upgraded my home Internet speeds to 75 Mb/s on downloads, which is more than fast enough for me to stream 4K content from Netflix and Amazon!)

So, why don’t we do the same in the commercial AV industry? It’s not for a lack of products – there are several companies offering AV-over-IP transmitters and receivers, along with encoders and decoders. I’ve also seen impressive demos of “middleware” used to locate, switch, and play out media assets over IP networks. All of these guys were at InfoComm 2015.

The big players in HDMI-based switching and distribution argue against AV-over-IP for in-room and short-run signal distribution, citing latency and compression issues. Well, we now have a new codec (HEVC H.265) to handle that end of things, and it’s possible to stream video at high resolutions with low latency. (How does 1920x1080p/60 at 1 to 2 Mb/s sound to you? Thought so.)
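To give a sense of what that 1 to 2 Mb/s figure implies, here’s the arithmetic as a short sketch. This is illustrative only – real encoder performance varies widely with content – but it shows the scale of compression HEVC is being asked to deliver:

```python
# Rough compression ratio implied by streaming 1080p/60 at ~2 Mb/s.
# Illustrative arithmetic only; actual encoder results depend on content.

def compression_ratio(width, height, fps, bits_per_pixel, stream_mbps):
    """Ratio of uncompressed pixel data rate to the delivered stream rate."""
    uncompressed_mbps = width * height * fps * bits_per_pixel / 1e6
    return uncompressed_mbps / stream_mbps

ratio = compression_ratio(1920, 1080, 60, 24, stream_mbps=2.0)
print(f"1080p/60 at 2 Mb/s implies roughly {ratio:.0f}:1 compression")  # ~1493:1
```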

High latency is often the result of over-compression and heavy forward error correction (FEC). But if video and audio assets are streaming on bandwidth-managed, private IP networks, there isn’t a lot of forward error correction required. Group of Pictures (GOP) structures can also be tuned to reduce latency. So latency is sort of a “straw man” argument. (And HDMI 2.0 will have plenty of issues with HDCP 2.2, trust me. Talk about latency…)

As for copy protection: video and audio assets streaming over IP connections have their own security protocols. Practically speaking, what could be more secure than video content streaming directly into a UHDTV through an Ethernet connection? And with Wi-Fi, you don’t even have to plug in a cable to make it work. Bandwidth issues? Well, how about 5 GHz 802.11ac channel-bonding routers? I’m getting 70+ Mb/s download speeds from mine with wired connections, and 25 – 30 Mb/s at some distance over my 5 GHz wireless link.

Again, looking outside our industry, the two most common signal distribution and switching architectures are based on HD-SDI or IP (or both). Not HDMI, and certainly not HDMI-derived, structured-wire systems like HDBaseT. If the rest of the world wants to multiplex video, audio, metadata, and other low-bitrate control signals, they do it over optical fiber. (Did you know that multimode fiber is cheaper than Category 6 wire?)

I’ll wrap things up by saying that the smart move is for commercial AV integrators to move to an AV-over-IP signal distribution system at the core like everyone else, leaving the HDMI, DisplayPort, superMHL, and “whatever comes next” connections for the far ends, near the displays (if those far-end conversions are even needed at all).

Leave the core as a high-speed, software-based switcher built on a copper or optical bus. If there’s enough bandwidth (and there should be), that system can also carry local TCP/IP traffic, SMTP alerts from connected devices, and control signals to all devices. Not only does this approach free everyone from the “closed world” paradigm of HDMI, it also makes the system infinitely more appealing to end-users and facility administrators, an increasing number of whom come from the IT world.
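As a sketch of why the bandwidth works out, consider a 10-gigabit core link. The stream bitrates below are illustrative assumptions (not measured figures), and the 30% headroom reserved for control and TCP/IP traffic is likewise my own guess:

```python
# Sketch of a bandwidth budget for an AV-over-IP core on a 10-gigabit link.
# Stream bitrates and the headroom reservation are illustrative assumptions.

def streams_per_link(link_gbps, stream_mbps, reserve_fraction=0.3):
    """How many streams fit, reserving headroom for control/TCP traffic."""
    usable_mbps = link_gbps * 1000 * (1 - reserve_fraction)
    return int(usable_mbps // stream_mbps)

# e.g. lightly compressed UHD at 200 Mb/s vs. long-GOP HEVC UHD at 25 Mb/s:
for label, mbps in [("lightly compressed UHD", 200), ("HEVC UHD", 25)]:
    print(f"{label} ({mbps} Mb/s): {streams_per_link(10, mbps)} streams on 10 GbE")
```

Even with conservative numbers, dozens of compressed streams share one link – which is exactly why the IT world stopped building dedicated point-to-point video plants.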

Consider this ad that was posted recently on a listserv for higher education:

“We are looking for an experienced AV-IT Engineer for the role of Technical Guru. The position will provide planning and support for AV-IT systems used in teaching and learning spaces big and small. The person in this position will focus on design, installation, and troubleshooting of AV-IT systems in a variety of venues, including traditional classrooms, active learning classrooms, large auditoria, computer labs, and even Makerspaces…We are looking for a seasoned professional with a solid background in AV-IT systems. This is a great opportunity for a doer who is excited about not just maintaining but also shaping the future of AV-IT technology as a key element of the teaching mission of one of the world’s top universities.”

I rest my case. It’s time for the commercial AV industry to get in step with the rest of the world and move to AV-over-IP signal distribution.

Wake up. Have you smelled the coffee yet?

4K, Collapsing Prices, and the Declining Importance of Hardware

As I write this, the 2015 season of the National Football League is about to get underway, with last year’s Super Bowl champion New England Patriots taking on the Pittsburgh Steelers. If you’re not a football fan, why should you care?

Simple: Football, more than any other sport or event, drives the sale of televisions. And the TV business is in a major funk right now.

According to IHS’ latest survey of the global television market, worldwide shipments of TVs fell an astounding 8 percent year over year during the second quarter of 2015. Even though LCD TVs now account for almost 99% of all TV shipments, “…LCD TV sales have not made up for the lost volume of cathode-ray tube (CRT) and plasma televisions, which have largely left the marketplace.”

The one bright spot? 4K. The IHS report states, “4K TV was a bright spot in the global TV market, with unit shipments growing 197 percent year over year in Q2 2015, to reach 6.2 million units. The growth in 4K TVs is the direct result of increased price erosion and more affordable tiers of 4K models becoming available.”

I’ve written on numerous occasions that we’re on the cusp of an industry switchover from 1080p resolution to Ultra HD (3840×2160) for precisely this reason, plus the fact that it’s becoming increasingly difficult to make any money on the manufacturing and sales of 1080p-resolution LCD panels. That’s part of the reason that Sharp – once the premier brand of LCD televisions – finally threw in the towel and exited the North American television business, selling their Mexican factory and “Sharp” brand to Hisense.

Need proof? Check out the most recent HH Gregg and Best Buy circulars. You can now buy a 48-inch Haier 1080p LCD TV for $298 or a 60-inch LG 1080p smart TV for $898. Want Ultra HD resolution instead? Samsung’s got a curved 55-inch smart model for $1198, and a 60-inch smart set for $1498.

Samsung has slashed the prices on its new S-line of HDR Ultra HDTVs by as much as 20%.

But here’s the kicker: Samsung’s HDR Ultra HDTVs (S-UHD) are almost the same price. A 50-inch model (UN50JS7000) is tagged at $1098 by HH Gregg, while the 55-inch version is $1298. Too expensive? Sharp’s got a 43-inch Ultra HD offering for $598, a 50-inch set for $748, and a 55-inch version for $848. (Not to be left out, LG has cut the price on their 55-inch smart Ultra HDTV to $998, and they’ve also got a 49-inch UHD set for $798.)

Now, step back from that mass of numbers, and think about this: Those are insanely low prices for Ultra HDTVs, which were tagged around $15 – $20K when they first came to these shores in 2012. I know of several friends and acquaintances that had to replace older TVs recently, and every one of them bought an Ultra HD set because of these falling prices.

If overall sales of TVs are falling but 4K TV sales are increasing, it doesn’t take a weatherman to see which way the wind is blowing: 4K and Ultra HD are rapidly taking over the TV marketplace for sets larger than 42 inches. This is happening so quickly that by the end of next year, ALL TVs larger than 50 inches will be Ultra HD models.

There’s a bigger message here. The money isn’t in hardware anymore – it’s moving to software. I find it hard to believe that I would spend more in a year for cable TV and Internet service than the cost of an Ultra HDTV, but that’s exactly what’s happening. Content is king, and who cares about the hardware?

So, why are TV sales in decline? It could be for a very simple reason: the average household has a large-enough TV with enough bells and whistles that they see no reason to upgrade. If you already own a 55-inch or 60-inch 1080p set with “smart” functions (and the all-important Netflix streaming), then the speed of your Internet connection is much more important than adding another 5 inches in screen size or quadrupling your screen resolution.

There’s a corollary in the world of tablets, where sales and shipments are also slowing down much faster than analysts predicted. There are any number of reasons why, but the two most likely culprits are the shift in preferences for larger smartphone screens (“phablets”) and the fact that people just hang onto tablets longer (at least, until their batteries die), often passing them down to children or off to relatives when a new model is purchased.

This shift to 4K and Ultra HD resolution is also impacting the commercial AV industry, which is heading for some serious interfacing issues. More and more of the large displays that will be installed will have Ultra HD resolution. And that will create a major headache for integrators, as the predominant interface for pro AV is still HDMI 1.4, even though version 2.0 was announced two years ago.

None of this is good news for the projector manufacturers, who are struggling to defend their turf from the large, cheap LCD displays. Unlike panel manufacturers, projector brands are moving slowly to adopt 4K resolution, which isn’t surprising because of the cost involved to tool up and manufacture microdisplays with 4K resolution and the much smaller market for projectors.

As for the naysayers who still think 4K is a fad, I would just advise them to wake up and smell the coffee. The world of consumer electronics absolutely drives the world of commercial AV – what’s happening over there is going to happen here, and that means you as an integrator will be installing more and more displays with UHD resolution, from desktop monitors and TVs to single-panel and tiled wall-mounted displays.

Count on it!


xFinity: “The Future of Awesome” Is Looking A Bit Less Confusing…

In a previous post, I detailed the horror show I ran into trying to upgrade my old cable modem/router and migrate from a shopworn TiVo HD to two of Comcast’s new (Samsung) xFinity set-top boxes.

Those adventures took place w-a-y back in May and early June, and I won’t recap how everything turned into a three-ring circus. Instead, I will talk about the fact that since early June, I’ve lost (at one time or another) my Internet, TV channels, phone, and EVERYTHING. Yep, no signals at all.

There have been about six of these outages in all, and the one which cut off all service resulted in my waiting from 1 to 4 on a Friday afternoon for a tech who never showed up (apparently he went to the wrong address!). Ditto the following Monday, even though our service came back on its own.

Comcast has an “escalation” email address, we_can_help@comcast.net, that seems to get you through to the right people when you’ve hit a wall. I had suspected that our service problems were more due to things happening outside the house, possibly at the street drop where our underground cable connects up.

After all, the cable was installed nearly 30 years ago, when the house was built. And even “non-contaminating” coaxial cables eventually go bad and let in moisture, shorting out along the way. So my repeated calls to Comcast customer service stressed that a tech should check the outside wiring.

After getting nowhere with this approach, and experiencing another drop-out of signals (as witnessed with my spectrum analyzer), I sent yet another “fix the damn connection!” email to Comcast. I finally got a call from the executive office confirming that a tech would be here today (although no one had previously called to tell me that, or ask when I’d be around).

The executive “escalation group” had also done some checking into service records and discovered (lo and behold) that other customers in my neighborhood had also experienced service outages. (More ammunition for my argument.) So it sounded like I was finally making headway.

A three-truck service call is definitely an “escalation.” But it was about time!

The tech called and said he could be over in a jiffy – much earlier than expected. Great! Soon, one truck, then two trucks, then THREE trucks pulled up my street and parked. Wow, they really called out the cavalry!

In short order, the two techs and a supervisor opened the street drop, found a bad cable (it pulled right out of its connector) buried near the house, and ran a new waterproof coaxial cable to the street. They also installed a new junction box and ground wire, and ran a new cable into the basement where everything terminates. The street splitter “rang out” okay.

Now, to the basement. The lead tech advised me that the “new” Arris 2.4 GHz DOCSIS 3.0 wireless modem/router I had installed back in late May (ironically, at Comcast’s suggestion) was actually an “older” model. He had a newer, dual-band Cisco wireless router in the truck – did I want to swap it out?

The “drop” with a rat’s nest of cables and taps. It was determined that an entirely new cable to the house was needed. (Wow, what a surprise!)

The tech fastens the house cable while a supervisor installs a termination box and ground wire.

Well, of course! In no time at all, the “older” new router was removed and replaced. After some phone calls to the head end to activate everything, I now had a dual-band 2.4/5 GHz modem with 802.11ac channel-bonding capability. Given that I had recently gotten several teaser emails from Comcast, advising me that my Internet speeds had been upgraded to 75 Mb/s, I was quite happy to finally see that extra speed when all was said and done.

The good news is that both techs and the Comcast supervisor were “on the ball.” They hung around to make sure the modem and all phone lines were working, and also that I had reconfigured the wireless network names and new passwords correctly. (The Cisco modem has a quirky habit of re-booting every time you make even the slightest change to its settings.) It’s nice to work with people who listen to you, understand the problem, and work quickly to diagnose and repair as needed.

So – now, I finally have the new cable drop to the house I asked for multiple times. And even though I am working on my third wireless modem/router in three months, it is great to have the extra speed through the 5 GHz wireless connection (better than 50 Mb/s sustained). And I’m hoping that the service outages I’ve been plagued with are finally a thing of the past. (Knock on polypropylene.)

With that done and behind me, I’m now waiting for the “awesomeness” to set in…

Are the Folks at QD Vision Worried?

Although QD Vision’s message at Display Week in San Jose was bullish, if I worked there I’d be worried. Why is that?

TV set architecture is progressing along two parallel paths: 1) Very thin sets with LED edge lighting, and 2) somewhat thicker sets with direct backlighting. Initially, QD Vision’s Color IQ linear quantum-dot optical element seemed as if it would be the preferred approach for large-screen TV, but it is only applicable to sets with edge lighting. To the extent that direct backlighting takes a significant share of sales, QD Vision’s total available market will decline.

The good news for QD Vision is that quantum-dot backlight enhancement will capture only a small share of the market in 2015. Only 1.3 million TV sets sold in 2015 will be QD-TVs, growing to 18.7 million in 2018, according to IHS/DisplaySearch. So, in the immediate future, there’s plenty of growth to go around.

The bad news for QD Vision is that the direct-backlighting approach captured more than 60% of the market in 2014, according to TrendForce. Fundamental considerations strongly suggest that the penetration of direct backlighting will continue to grow, according to Nutmeg Consultants.  (Disclosure:  The author is Principal at Nutmeg Consultants.)

What are these fundamentals? First, direct backlighting is compatible with local dimming of the backlight, which is required for high dynamic range (HDR), one of the features to be hotly promoted for the next generation of premium TVs. Second, it is now possible to use fewer LEDs with direct backlighting than with edge lighting, reducing cost. Third, direct backlighting eliminates the relatively expensive light-guide plate. Fourth, wide-angle lenses for the LEDs are decreasing the thickness of direct-backlit sets to the point that it will not be objectionable to most consumers in most situations.

Both the linear optic of QD Vision and the Quantum-Dot Enhancement Film (QDEF) of 3M and Nanosys will provide the same enhanced color gamut, another of the features to be hotly promoted in new sets. So, how important is HDR likely to be?

For the first generation of UHD sets, the manufacturers stressed the increased pixel content, but the UHD spec encompasses far more than that. When fully implemented in succeeding generations of sets, UHD will also encompass extended color gamut (through quantum dots in LCD sets) and high dynamic range. Setmakers are moving energetically in this direction because they have been doing studies that show consumers are not excited by the difference in resolution between 2K and 4K sets. (Some of us may not share this perception, but if you are reading this column it’s a good guess that you are not a typical television consumer.) Typical consumers are, however, excited by increased color gamut and HDR. You can expect advertising for high-end sets to emphasize these aspects of the UHD spec soon.

The new Hisense H6510B2 has 3840×2160 pixels, extended color gamut, 10-bit color depth, high dynamic range, and local dimming, all for US $3000. (Photo: Hisense)

The movement toward direct backlighting and QDEF is illustrated by two press releases distributed by Nanosys in the last few days. The first concerns the launch of Hisense’s 4K ULED H10 Series 65-inch curved smart TV. The ULED series features the QDEF film and HDR. Hisense maintains its ULED TV will “compete with the picture quality of OLED TV and [Samsung’s] SUHD TV at a lower price.” For $3000, Hisense provides a 100% NTSC color gamut, 3840×2160 pixels, 10-bit color depth, 120 Hz frame rate, 900 nits peak luminance, H.265 video coding, and a claimed dynamic contrast of 800 million to 1. You will be able to buy it this month at Amazon.

Significant for the near future is the announcement that AUO will be making Advanced LCD (ALCD) display modules featuring UHD at up to an 85-inch diagonal. AUO’s modules are naturally available to all TV brands that wish to incorporate them. “AUO ALCD technology puts TV brands in position to create devices capable of reaching Rec.2020,” says the release. AUO says it will enter mass production by the end of this year.

At Display Week, after a QD Vision executive said the company was comfortable with the number of edge-lit sets that would continue to be sold, he said casually that of course the company was also looking at other form factors. I am guessing that this “look” is anything but casual. QD Vision probably leads its competitors in quantum dots that resist degradation from high temperature and luminous flux. That knowledge could be leveraged into a lower-cost solution that is compatible with 2D local dimming and HDR, but everyone I talk to agrees the needed materials research will take time. The race is on.

Ken Werner is Principal of Nutmeg Consultants, specializing in the display industry, manufacturing, technology, and applications, including mobile devices and television.  He consults for attorneys, investment analysts, and companies using displays in their products.  You can reach him at kwerner@nutmegconsultants.com.

A Trend Is A Trend – Until It Isn’t

A story posted on the CNET Web site for August 22 might have gone unnoticed – except that it shows that the tide is now flowing the other way when it comes to smartphones and tablets.

By “tide,” I mean market forces and analyst predictions. The former is showing a decided preference for ever-larger smartphone screens, while the latter is prematurely writing the epitaph for notebook and laptop computers.

When the first iPads burst onto the market, everyone had to have one. There was nothing like it, and what we know as a smartphone was still in the toddler stage, with small screens and limited ability to take photos and stream videos.

Indeed, as recently as two years ago, analysts were predicting that sales of desktop PCs would eventually fizzle out and notebook computers would follow in short order. To some extent, they were right – you can now buy high-powered notebooks for less than $500, a consequence of lowered demand and an oversupply of components, including LCD screens.

But analysts are often more wrong than right, and they definitely got it wrong with the future of larger smartphone screens. “No one will buy a phone with a 5-inch screen. And a 6-inch screen? That’s crazy!” they thundered.

Um, guys – The hottest category now for smartphones is that same 5-inch to 6-inch screen size category. Apple’s sold plenty of iPhone 6s, as has Samsung with their Galaxy 5 and 6. I upgraded from a Motorola Droid Razr Maxx (4.7” OLED screen) to a Galaxy 5 (5.5” OLED screen) last December, and love it. I rarely make calls with it, but I do text, take pictures, shoot video, check sports scores, and even read newspapers while having breakfast or when traveling.

The ever-larger size of smartphones, combined with a somewhat stagnant market for phone sales, has depressed the sales forecasts for tablets. According to the CNET article, “Sales of slate-style tablets are expected to fall 8 percent, according to a report from research firm Strategy Analytics. Sales in Apple’s iPad business, meanwhile, fell 18 percent year over year in its most recent quarter, the sixth consecutive quarterly decline.”

How fast things change. Back in early 2014, tablet sales were forecast to grow 18% by the end of the year. Now, we’re seeing the numbers run in reverse. And part of the problem is that people don’t turn over tablets as fast as they do phones – my wife still uses an iPad 2 from 2011, although the battery is starting to go.

I have a Barnes & Noble Nook HD that’s also vintage 2011 and hardly gets any use anymore, thanks to my new Samsung Galaxy Tab 8.4. Somewhere in a drawer, I have a Nook reader with “Glowlight” that crapped out about six months ago. (And my wife’s Nook Tablet, vintage 2010, still works just fine.)

So, where’s the growth in mobile computing devices? Looks like it’s now happening with so-called “2-in-1s”: devices that combine a detachable keyboard with a larger tablet screen. Microsoft’s Surface Pro is one example; Lenovo’s Yoga Pro is another. The CNET article says that sales of these devices are expected to grow by 5x this year over last, and new processors such as Intel’s Core M give them CPU speeds comparable to midrange laptops.

In terms of turnover, tablets are lasting 5 to 7 years. (Not good news for Apple, I suspect!) Smartphones are still driven by the length of service contracts, nominally 2 years. But Intel claims that buyers of 2-in-1s are turning over laptops and notebooks much more frequently – on average, 18 to 24 months.

We’ve also seen sales of much larger tablet screens pick up. Samsung’s 10.4-inch Galaxy Tab is popular, and the CNET story mentions a rumor that Apple plans to unveil a 13-inch iPad Pro this fall. (No word on whether it will have a detachable keyboard, a feature that Apple has resisted for now.)

The demand for the Surface Pro product stands in stark contrast to its earlier failures at launch three years ago. (Wow, has it been THAT long?) At one point, the company had hundreds of thousands of unsold units sitting in warehouses, no doubt due to the public’s emphatic rejection of Windows 8 software.

Now, Surface Pros are a popular product, and can run special versions of Office software. With gradual acceptance of cloud-based storage as opposed to CD drives, these tablets are quite powerful, thin, and lightweight.

A move to larger screens on smartphones can’t continue indefinitely: The 6-inch Galaxy is about the largest phone size I can fit into a shirt or pants pocket, so we may be hitting a wall in that area (although I have heard of plans by one Chinese brand to come out with an 8-inch 4K smartphone!).

So if any device will be sacrificed on the CE altar, it will be mid-sized tablets – 7 to 9 inches – and that’s already happening, based on market numbers. As the owner of a still-running Toshiba 10.4” notebook with OS 7, I’m intrigued by the idea of replacing all of that weight with a same-size tablet and keyboard – and a higher-resolution display, too.

For AV connectivity, the market switch creates its own headaches. Micro HDMI? MHL? Lightning? In all likelihood, the interface of choice will become wireless, most likely using 5 GHz Wi-Fi channel bonding technology for more reliable video streaming. Or, we may see some early adopters of 60 GHz wireless links for “2-in-1s,” using the 802.11ad protocol or SiBEAM’s Snap wireless docking system.

Keep your eye on the new USB 3.0 Type-C connector. This could be a game-changer: Like Lightning, it is symmetrical and thus reversible. It can carry high-speed data (up to 10 Gb/s), DC power for charging, and in Alternate Mode, transport display signals like DisplayPort 1.3 (packet) and superMHL (TMDS).

It’s a good bet that as the market ramps up production of “2-in-1s,” they’ll include the Type-C interface and probably drop everything else except power connections. For that matter, Type-C is in a position to displace everything from Mini DisplayPort to HDMI as it is the closest thing we’ll have to a do-everything, universal I/O connector going forward.

As for picking winners and losers in the smartphone/tablet/notebook/laptop game, better leave that to the “experts.” They’re just as confused as anyone else…