Pain Points and Compromises: How Ajay Bhatt Engineered USB Into a Standard

Ahead of his ACE Lifetime Achievement Award, Intel's Ajay Bhatt reflects on a storied career, the IoT's untapped potential and its need for standards, and why USB connectors aren't flippable.

Convincing Intel that USB was a project worth pursuing was itself a journey of more than two years.

“It took me about two to two-and-a-half years to convince people, even our customers, that it was a worthy investment, and to get like-minded people in one place,” Bhatt said. “And before we even got everyone in one place I had to come up with a fairly good proposal that described our vision and any possible answers to that vision.

“At the time I had just started working at Intel and there was a big issue with how difficult PCs were to use. So I went to my management and said, can we make the PC easy to use so people don't have to open up the box, install drivers, set switches, etc.? Most people said, 'Hey, you just don't understand the PC world. Compatibility is very stringent, and if you open it up you're going to break things.'”

In the mid-1990s, when USB was being developed, competitors like Apple already shared a vision similar to Bhatt's of a universal connector. Apple's vision would eventually culminate in the Lightning connector introduced in 2012, but before that there was IEEE 1394, also known as FireWire. In the early 2000s, if you owned a high-end PC, especially one for video editing, it likely had both a FireWire port and USB. Bhatt said it was FireWire's complexity and expense that eventually killed it off.

“Initially the goal [for USB] was quite modest,” Bhatt said. “But as we got into it we realized we could actually come up with something that would be more scalable.” Through its development, USB's data rate, initially around 100 Kbit/s, climbed to 12 Mbit/s; today USB 3.1 offers a transfer rate of 10 Gbit/s. “We told ourselves, let's define a physical layer and abstract that layer out from upper layers of the architecture, such as software drivers and operating systems, so as technology evolves we can still maintain the OS interface, keep it very modular, and yet take advantage of Moore's Law and improve the physical layer.”
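A rough way to picture the layering Bhatt describes: the software above depends only on an abstract physical-layer interface, so a faster transport can be swapped in without touching anything upstream. The sketch below is purely illustrative; the class and function names are invented, not part of any USB specification.

```python
from abc import ABC, abstractmethod

class PhysicalLayer(ABC):
    """Abstract transport: upper layers depend only on this interface."""
    @abstractmethod
    def send(self, data: bytes) -> None: ...

class FullSpeedPhy(PhysicalLayer):
    """The 12 Mbit/s era."""
    def send(self, data: bytes) -> None:
        print(f"clocking {len(data)} bytes at full speed")

class SuperSpeedPhy(PhysicalLayer):
    """The 10 Gbit/s era."""
    def send(self, data: bytes) -> None:
        print(f"clocking {len(data)} bytes at SuperSpeed+")

def os_write(phy: PhysicalLayer, payload: bytes) -> None:
    """The OS-facing call stays the same as the physical layer improves."""
    phy.send(payload)

# Swapping the transport requires no changes above the interface:
os_write(FullSpeedPhy(), b"hello")
os_write(SuperSpeedPhy(), b"hello")
```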

From there, Bhatt said, the biggest issue his team encountered was cost. Before USB could become a standard, it had to coexist with everything already out there until OEMs and consumers could make the full transition.

“The biggest deal was the cost,” Bhatt said. “The new standard had to coexist with everything that existed before. So before life got better it had to get a little more complicated, because you couldn't get rid of serial and parallel ports outright. You had to add one more port and, over time, move people from legacy to USB.” Dig up a 10-year-old computer today and remind yourself how crowded the back panel used to be with connectors.

The next challenge lay in creating an entirely new software ecosystem for USB – getting operating systems to support it and getting other companies to support the layered architecture that Bhatt and his team had envisioned. “Life prior to this was the world of MS-DOS, where people wrote apps that would directly touch the hardware,” Bhatt said. “We had to move people away from that and say, if you're writing an app for a device you really have to go through the OS, which will in turn call device drivers, which will call class drivers, and so on. Getting that ecosystem to move was a lot of heavy lifting.

“Once that happened people started seeing the value, but it took seven years for people to appreciate what USB could do for them.”
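To make the call chain in that quote concrete, here is a toy model of an application reaching hardware only through successive layers instead of poking it directly, which is the discipline Bhatt's team had to sell. All of the names here are hypothetical, chosen only to illustrate the idea.

```python
# Toy model of the layered request path Bhatt describes (all names invented).
# A DOS-era app would write to the hardware directly; under the USB model,
# every request descends through the OS and driver layers instead.

def device_driver_write(data: bytes) -> None:
    # Device-specific driver: the only layer that touches the hardware.
    print(f"bus transaction: {data!r}")

def class_driver_write(data: bytes) -> None:
    # Class driver (think mass storage or HID): one driver serves every
    # conforming device, delegating the specifics downward.
    device_driver_write(data)

def os_request_write(handle: str, data: bytes) -> None:
    # The OS entry point an application is allowed to call.
    print(f"OS dispatching write to {handle}")
    class_driver_write(data)

# The application never sees the hardware:
os_request_write("usb-disk-0", b"save file")
```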

On the Flip Side

For the record, Bhatt has seen the memes and heard all of the jokes. Most consumers at this point are well aware of the USB paradox – no matter which way you insert the connector, the first time will always be wrong. It's a design issue so common yet strangely endearing that it has even led to speculation that it may be proof the multiverse exists (somewhere there is an alternate world where the USB plug always goes in right side up the first time).

For their part, Bhatt and the team at Intel knew that making the connector flippable would go a long way in improving usability, but it was simply not feasible from a cost perspective.

“The biggest mess with USB was that the connector is not flippable,” he said. “Even today when you look at USB, if we had made it flippable it would have been a lot easier. We could have focused on a higher data rate to start, like FireWire did. They were at 100 Mbit/s while we were at 12 Mbit/s.

“But it was a matter of complexity. I think, even looking back, starting modestly was the way to go. It allowed us to focus on a much wider variety of products to start with, and we were able to improve the performance over time anyway.”

Making the connector flippable, he explained, would have required double the number of wires and even more circuitry – all things that cost money and would have passed a higher price down to the consumer.
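As a back-of-the-envelope illustration of that doubling: a reversible plug has to mirror its signal set so that either orientation lands on valid contacts, and the host needs extra circuitry to detect which set is in use. The pin lists below are a toy model, not an exact pinout of any real connector.

```python
# Toy model (not a real pinout) of why reversibility multiplies wiring.
USB_A_CONTACTS = ["VBUS", "D-", "D+", "GND"]  # the classic 4-contact plug

# A flippable plug mirrors the signal set so either orientation works,
# roughly doubling the contacts and adding orientation-detect circuitry.
FLIPPABLE_CONTACTS = USB_A_CONTACTS + list(reversed(USB_A_CONTACTS))

print(f"{len(USB_A_CONTACTS)} contacts vs "
      f"{len(FLIPPABLE_CONTACTS)} with reversibility")
```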

“If you have a lot of cost up front for an unproven technology, it might not take off. So that was our fear. You have to be really cost-conscious when you start out,” Bhatt said. “In hindsight you can say it could have been better, but compared to where we were with serial and parallel ports, where each of the ports was its own logical interface, with its own cables and multiple wires, and was not plug-and-playable, USB was significantly better.”
