Pain Points and Compromises: How Ajay Bhatt Engineered USB Into a Standard

Ahead of his ACE Lifetime Achievement Award, Intel's Ajay Bhatt reflects on a storied career, the IoT's untapped potential and need for standards, and why USB connectors aren't flippable.

Chris Wiltz

November 30, 2017


The co-inventor of USB is building a smart house. Ajay Bhatt, who spent over two decades at Intel specializing in platform architectures and developing technologies such as the accelerated graphics port (AGP) and PCI-Express, has devoted the year since his retirement to experimenting with the Internet of Things (IoT), specifically by building his own smart home.

Ajay Bhatt will be receiving a Lifetime Achievement Award at the 2017 ACE Awards. (Image source: Intel)

“While I was working I started designing this house and my goal was to put a lot of IoT devices in it. Part of my time was spent doing general construction and part of it was adding technologies that would scale over time,” Bhatt told Design News.

Aside from some advisory work for a few startups, Bhatt, who will be receiving a Lifetime Achievement award at the Annual Creativity in Electronics (ACE) Awards as part of the 2017 Embedded Systems Conference in Silicon Valley, says that he has been content to enjoy his retirement ... at least until something new comes along.

“My whole mission in life is to make things easy to work with and easy to design. I'm just starting to play with things and figure out how it should be. Before I know the answer I want to know what the current problems are with existing systems.”

The irony of what he's doing isn't lost on Bhatt. As one of the key developers of USB, Bhatt and his associates at Intel created a universal standard for connecting devices via cable. Today, USB ports are as ubiquitous as wall outlets, and you'd be hard-pressed to find a computer, laptop, tablet, or smartphone that doesn't take advantage of the standard in some way.

But right now the world of IoT is anything but standardized, as companies compete not only over who can offer the best and “smartest” devices for your home, office, and even the factory, but also over which standards and protocols will drive Industry 4.0. “Before I built the home I talked to [IoT] companies so I could understand what they do. And I quickly discovered they are very high end. They're closed systems; they don't scale very well, and they lock you into an ecosystem. I wanted to do something much more open and flexible,” Bhatt said.

Bhatt believes technology has to be fast. It has to scale, interoperate, and work seamlessly when you ask it to do something. “But there are so many standards and so many interfaces and it's really confusing,” he said. “So I'm trying to see how you can simplify this.”

Before Life Got Better ... It Got Complicated

Speak with Bhatt long enough and you get the sense that simplicity is a word he lives by. It shows in the measured way he speaks, carefully choosing every word, and it certainly shows in his work.

“When we started I don't think most people, including my colleagues at Intel, realized that USB was something that was needed,” Bhatt said. “I kept on saying that if you want to make computers usable to a common user then you have to make things easier to use. I always used my own family as an example. When my wife tried to use the PC at home she was always very frustrated, even with basic things like printing or scanning. She always used to say, 'What good is this? Any time I need to do something it just doesn't work.' ”

Convincing Intel that USB was a project worth pursuing was in itself a journey of over two years.

“It took me about two to two-and-a-half years to convince people, even our customers, that it was a worthy investment and to get like-minded people in one place,” Bhatt said. “And before we even got everyone in one place I had to come up with a fairly good proposal that described our vision and any possible answers to that vision.

“At the time I had just started working at Intel and there was a big issue with how difficult PCs were to use. So I went to my management and said, can we make the PC easy to use so people don't have to open up the box, install drivers, set up switches, etc.? Most people said, 'Hey, you just don't understand the PC world; compatibility is very stringent and if you open it up you're going to break things.'”

In the mid-1990s, when USB was being developed, competitors like Apple already shared a vision similar to Bhatt's of a universal connector. Apple's vision would eventually culminate in the Lightning connector introduced in 2012, but before that there was IEEE 1394, also known as FireWire. In the early 2000s, a high-end PC, especially one built for video editing, likely had both a FireWire port and USB ports. Bhatt said it was the complexity and expense of FireWire that eventually killed it off.

“Initially the goal [for USB] was quite modest,” Bhatt said. “But as we got into it we realized we could actually come up with something that would be more scalable.” Through its development, USB's data rates climbed from an initial 100 KB/s up to 12 Mbit/s; today's USB 3.1 specification offers transfer rates of up to 10 Gbit/s. “We told ourselves, let's define a physical layer and abstract that layer out from upper layers of architecture such as software drivers and operating systems, so as technology evolves we can still maintain the OS interface, keep it very modular, and yet take advantage of Moore's Law and improve the physical layer.”
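To put that scaling in perspective, here is a rough back-of-the-envelope Python sketch. The 700 MB file size is an arbitrary example, and real-world throughput is lower than the raw signaling rate once protocol overhead is included; the point is only the order-of-magnitude difference between the original full-speed rate and a modern 10 Gbit/s link.

```python
# Back-of-the-envelope comparison of the raw USB signaling rates mentioned above.
# These are theoretical line rates; actual throughput is lower due to protocol
# overhead, so treat the results as order-of-magnitude figures only.

FILE_SIZE_BYTES = 700 * 1024 * 1024  # an arbitrary 700 MB file

LINE_RATES_BITS_PER_SECOND = {
    "USB 1.x full speed (12 Mbit/s)": 12_000_000,
    "USB 3.x at 10 Gbit/s": 10_000_000_000,
}

for name, rate in LINE_RATES_BITS_PER_SECOND.items():
    seconds = FILE_SIZE_BYTES * 8 / rate  # convert bytes to bits, divide by line rate
    print(f"{name}: ~{seconds:,.1f} s")
```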

From there Bhatt said the biggest issue his team encountered was cost. Before USB could become a standard it had to coexist with everything already out there until OEMs and consumers could make the full transition.

“The biggest deal was the cost,” Bhatt said. “The new standard had to coexist with everything that existed before. So before life got better it had to get a little more complicated, because you couldn't get rid of serial and parallel ports outright. You had to add one more port and over time move people from legacy to USB.” Dig up a 10-year-old computer today and remind yourself how busy the back panel used to be with connectors.

The next challenge lay in creating an entirely new software ecosystem for USB – getting operating systems to support it and getting other companies to support the layered architecture that Bhatt and his team had envisioned. “Life prior to this was the world of MS-DOS, where people wrote apps that would directly touch the hardware,” Bhatt said. “We had to move people away from that and say, if you're writing an app for a device you really have to go through the OS, which will in turn call device drivers, which will call class drivers, and so on. Getting that ecosystem to move was a lot of heavy lifting.

“Once that happened, people started seeing the value, but it took seven years for people to appreciate what USB could do for them.”
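The layered model Bhatt describes, in which an application never touches the hardware directly, can be pictured with a loose Python sketch. The class and method names below are invented purely for illustration and do not correspond to any real operating-system API.

```python
# Hypothetical sketch of the layered model described above: an application never
# touches hardware directly; it calls the OS, which dispatches to a class driver,
# which in turn talks to a host-controller (physical-layer) driver.

class HostControllerDriver:
    """Bottom layer: owns the physical link; can be swapped out as the bus evolves."""
    def transfer(self, endpoint: int, data: bytes) -> None:
        print(f"[bus] moving {len(data)} bytes to endpoint {endpoint}")

class PrinterClassDriver:
    """Class driver: knows what a 'printer' is, not how bits move on the wire."""
    def __init__(self, bus: HostControllerDriver):
        self.bus = bus

    def print_document(self, document: bytes) -> None:
        self.bus.transfer(endpoint=1, data=document)

class OperatingSystem:
    """The only layer an application is allowed to call."""
    def __init__(self):
        self.printer = PrinterClassDriver(HostControllerDriver())

    def print(self, document: bytes) -> None:
        self.printer.print_document(document)

# An application goes through the OS rather than poking the hardware directly.
OperatingSystem().print(b"hello, world")
```

Because the upper layers only depend on the interfaces above them, the physical layer underneath can keep improving (as USB's data rates did) without breaking applications or operating systems.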

On the Flip Side

For the record, Bhatt has seen the memes and heard all of the jokes. Most consumers at this point will be well aware of the USB paradox – no matter which way you insert the connector the first time, it will always be wrong. It's a design issue so common yet strangely endearing that it has even led to speculation that it may be proof that the multiverse exists (somewhere there is an alternate world where the USB connector is always right side up on the first try).

For their part, Bhatt and the team at Intel knew that making the connector flippable would go a long way in improving usability, but it was simply not feasible from a cost perspective.

“The biggest mess with USB was that the connector is not flippable,” he said. “Even today when you look at USB, if we had made it flippable it would have been a lot easier. We could have focused on a higher data rate to start, like FireWire did. They were at 100 Mbit/s while we were at 12 Mbit/s.

“But it was a matter of complexity. I think, even looking back, starting modestly was the way to go. It allowed us to focus on a much wider variety of products to start with, and we were able to improve the performance over time anyway.”

Making the connector flippable, he explained, would have required double the number of wires and even more circuitry – all things that cost money and would have passed a higher price down to the consumer.

“If you have a lot of cost up front for an unproven technology it might not take off. So that was our fear. You have to be really cost conscious when you start out,” Bhatt said. “In hindsight you can say it could have been better, but compared to where we were with serial ports and parallel ports, where each of the ports were logical, had cables, and multiple wires, and were not plug-and-playable, USB was significantly better.”

Intel's Rock Star

Born in Vadodara, Gujarat, India, Bhatt began his career at Intel in 1990 as a senior staff chipset architect. “I was working on chipset and processor design and I had a proposal for a high-integration 486 processor,” Bhatt recalls. “Even when I was working on USB my boss would have liked me to be working on multimedia instruction set extensions, which at Intel was a very lucrative job. Once I started working on USB and it was successfully deployed, upper management approached me and said, 'We're starting this new line of 64-bit processors and workstations will be one of the killer apps used for them, but we don't have a good graphics interface and we would like you to lead that effort.'”

That request would be Bhatt's entry into developing AGP (Accelerated Graphics Port). Intended to be a successor to PCI in delivering high-performance graphics, AGP would eventually be phased out in favor of another of Bhatt's projects – PCI-Express (PCI-E).

“Once you succeed in one arena you keep developing along that path,” he said. “Once I worked on AGP I got a chance to look at a lot of graphics architecture, at which point we realized at the platform level that the existing PCI interface was not very scalable. Then I made the proposal for PCI-Express.”

The success of USB even made Bhatt a celebrity of sorts. A 2009 ad campaign by Intel featured an actor playing Bhatt, confidently striding through an office the way a rock star might slide through a club full of fans and groupies.

A 2009 Intel ad featured an actor portraying Bhatt.

Before retiring from Intel in 2016, Bhatt was working on creating another standard – a universal stylus for touch interfaces. Rather than making it proprietary to any particular device, Bhatt wanted to create a stylus that would be as universal to desktop and handheld devices as an ordinary pen is to a piece of paper. “I thought if you want the stylus to really take off you should make it like an actual pencil,” he said. “Normally a touch platform works on a special dedicated controller. We had designed a subsystem where touch processing was done on a GPU and it was very, very responsive – better than anything I had seen before.”

Bhatt worked with a team to create the initial specs for the stylus, but retired soon after. Since then the idea hasn't launched and the broader technology community has pushed the stylus to the wayside in favor of more sensitive and accurate touchscreens. Bhatt has always considered cost and interoperability to be core issues to address with any technology. In this instance it looks like industry settled on a standard without him.

Finding the Pain Points

Now, the man who spent decades creating standards is dabbling in a world that appears to be splintering before our very eyes. But the exact issues IoT is facing – cost, interoperability, and simplicity – are the same ones that have motivated Bhatt through his entire career.

He sees particular potential in contextual computing for IoT. “If we can take IoT and fuse some of the context awareness into devices, that would be very interesting. Machine learning has been an area of interest to me because you can take all of this data and make decisions with it. There is some awesome compute power that is coming online now, and if you can take all of the data from these IoT devices and also gather the context – user context, middle context, and device context – and fuse it to make decisions, that would be very interesting.”

So is there work in IoT in Bhatt's future? For now he's content to wait and see what the New Year will bring. “I'm just taking it easy for now,” he said. “I'm trying to form an opinion as a user of what I don't want to do at any point. I'm asking myself, what are the major pain points? Once you see the pain points and appreciate them then you start thinking about what it could be and how to solve the issue.”

The ACE Awards Honor Ajay Bhatt 
For his innovative work, Ajay Bhatt will be honored with a Lifetime Achievement Award at the Annual Creativity in Electronics (ACE) Awards. The ACE Awards will take place December 6 as part of the 2017 Embedded Systems Conference (ESC) in Silicon Valley.

Chris Wiltz is a Senior Editor at Design News covering emerging technologies including AI, VR/AR, and robotics. He has never successfully inserted a USB connector on the first try. 
