14 Corporate Innovation Lab Products That Changed The World

Today, it seems like every large corporation has its own “innovation lab.”

Designed to be nimble, startup-like working environments inside big corporations, these labs have produced some successes. They have also, like startups, produced a lot of failures.

Few are more critical of the model than startup entrepreneurs and VCs, who argue that corporate innovation labs are mostly innovation theater.

Every once in a while, however, a development emerges from one of these labs that pushes innovation forward and truly raises the bar for everyone else in the industry. Some have even become fundamental to the way we live and work today, including the Unix operating system, the smartphone, and the transistor, to name just three.

Here, we run down the most influential products that got their start in a corporate innovation lab — and how those innovations moved from the lab to the market.

We begin with Google’s self-driving car project and go backward, all the way to the innovations of the post-WWII era.

1. Waymo (Google X)

How Google X is taking self-driving car technology from moonshot to $70B juggernaut

Google’s self-driving car company, Waymo, was the first project of Google X, the company’s self-described “moonshot factory.” In December 2016, the project was spun off into its own subsidiary, Waymo LLC, named for the company’s mission to find a “new way forward in mobility.”

Waymo has long been considered a leader in the self-driving car race, thanks to the project’s longevity, key partnerships with established automakers like Fiat Chrysler and Jaguar Land Rover, and the amount of data it has been able to accumulate. In 2018, its cars logged more reported miles than all other self-driving car contenders combined.

That said, the project has been beset by delays, and Google executives have repeatedly pushed back their predictions of when the technology will be ready. In 2012, Google co-founder Sergey Brin stated that self-driving cars would be available to the general public in 2017. In 2014, then-project director Chris Urmson revised that date to 2020.

Waymo has launched several pilot programs, including a transportation service in Arizona that launched in 2018, and an autonomous truck project at a Google data center in Atlanta.

Competition in the self-driving car space has been heated. In February 2017, Waymo sued Uber, and its subsidiary self-driving trucking company, for allegedly stealing Waymo’s trade secrets and infringing upon its patents. The suit was settled in February 2018.

Waymo vehicles are hitting the road in pilot programs in select cities. Image source: Wikimedia


2. Amazon Alexa (Amazon Lab126)

How Amazon’s Lab126 planted a flag in the home automation space with a voice-activated device

Officially introduced in November 2014, the project that became Amazon Echo and Alexa started at Lab126 as far back as 2010. It was Amazon’s first effort to expand its hardware sales beyond Kindle (although the Fire TV would be launched before it).

To jump-start the Alexa project, Amazon hired several people who had worked at the speech recognition company Nuance, and bought two voice tech startups, Yap and Evi.

The Echo began as part of a bigger, more ambitious effort known internally as Project C. Amazon has remained tight-lipped about the project, which has since been canceled. However, patents filed at the time suggested an augmented reality concept in which virtual displays would follow people around their homes, offering services in response to voice commands and physical gestures, according to Bloomberg.

As of September 2017, Amazon had more than 5,000 employees working on Alexa and related products. In January 2019, Amazon’s devices team announced that they had sold over 100M Alexa-enabled devices. The company has also begun making Alexa available for integration with non-Amazon devices, such as Bose speakers and Rokus.

Even with 100M devices sold, Amazon is still in a heated race with other tech giants in the battle to control the $49B voice market. Apple and Google have both surpassed the 500M mark with Siri- and Google Assistant-enabled devices, respectively. Microsoft’s Cortana is available on over 400M devices.

Alexa has not been without controversy. There have been multiple reports of the speaker recording and sending content without users’ consent. Nevertheless, the success of Alexa has secured Amazon’s place as a leading innovator in the rapidly expanding field of artificial intelligence.

Amazon has sold more than 100M Alexa-enabled devices, including the Echo and Echo Dot. Image source: Flickr


3. Watson (IBM Research)

How IBM built a revolutionary artificial intelligence worthy of Jeopardy!

With its Watson AI, IBM established itself as a force to be reckoned with in the emerging space of artificial intelligence. Today, IBM is considered one of the foremost innovators in artificial intelligence, alongside Google, Microsoft, Amazon, Facebook, and Apple.

Watson was developed as part of the DeepQA project at IBM. Numerous universities have collaborated on DeepQA, including Carnegie Mellon, MIT, the University of Texas, and the University of Southern California.

According to IBM, “The goal is to have computers start to interact in natural human terms across a range of applications and processes, understanding the questions that humans ask and providing answers that humans can understand and justify.”

In 2011, Watson competed on the quiz show Jeopardy! against champions Brad Rutter and Ken Jennings, winning first prize.

Two years later, IBM announced the first commercial application for Watson, partnering with Memorial Sloan Kettering Cancer Center in New York City to train the system to provide cancer treatment recommendations. In 2014, IBM announced it was creating a business unit around Watson focused on enterprise applications for artificial intelligence in areas including healthcare, finance, advertising, education, Internet of Things (IoT), customer engagement, and talent management.

The road has not been entirely smooth for Watson — particularly in the healthcare space. In April 2019, STAT reported that IBM was halting sales of Watson for Drug Discovery — a service that uses Watson AI to analyze connections between genes, drugs, and diseases in search of new medications. Several healthcare experts told IEEE Spectrum that IBM had “overpromised and underdelivered” with Watson Health.

Watson was named after IBM’s first CEO, Thomas J. Watson. Image source: Wikimedia


4. The Kindle (Amazon Lab126)

How pressure to stay on top of the book world pushed Amazon to create its first hardware device

The Kindle was developed between 2004 and 2007 at Lab126, a lab Amazon created specifically for the project. It was the first hardware product to come out of Amazon.

The directive to create the Kindle came directly from Amazon CEO Jeff Bezos himself, when he urged Amazon employees to build the world’s best e-reader before Amazon’s competitors could.

For Bezos, the Kindle is an example of Amazon’s philosophy of building products by “working backwards from customer needs.”

“The skills-forward approach says, ‘We are really good at X. What else can we do with X?’ That’s a useful and rewarding business approach,” he writes. “However, if used exclusively, the company employing it will never be driven to develop fresh skills.”

The first edition Kindle was released on November 19, 2007 for $399 and sold out in five and a half hours, remaining out of stock for five months.

The Kindle has undergone multiple iterations and evolutions since its introduction, including versions with keyboards and illuminated screens. However, the e-reader has always been a single-purpose device for reading, rather than being multipurpose hardware that might create distractions while reading.

“We knew Kindle would have to get out of the way, just like a physical book, so readers could become engrossed in the words and forget they’re reading on a device,” Bezos wrote in a 2007 letter to shareholders. “We also knew we shouldn’t try to copy every last feature of a book — we could never out-book the book. We’d have to add new capabilities — ones that could never be possible with a traditional book.”

In 2011, Amazon announced that sales of e-books had surpassed sales of paperback books through Amazon for the first time ever. As of March 2018, the Kindle Store had over 6M e-book titles available in the United States. Amazon accounts for more than 80% of US e-book sales, according to a study by AuthorEarnings.

Kindle included free 3G data access nationwide. Image source: Tolbxela


5. The Ethernet (Xerox PARC)

How solving an internal problem led PARC to revolutionize connectivity and industrial manufacturing

Developed at the Xerox PARC innovation lab in Palo Alto between 1973 and 1974, Ethernet was one of the first networking solutions that allowed for high-speed connections between numerous computers. The technology has been massively influential in the industrial sector and the emerging Internet of Things (IoT), as it facilitated the rapid transfer of data from computer to computer and, crucially, from computer to machine.

Ethernet began life as a solution to an internal problem at PARC. Co-creator Robert Metcalfe was asked to create a local-area network to connect the Xerox Alto computer to the Scanned Laser Output Terminal, the world’s first laser printer, also developed at PARC.

Ethernet remained an in-house technology at Xerox until 1980. In 1979, Metcalfe left Xerox to form 3Com and convinced Digital Equipment Corporation (DEC), Intel, and Xerox to work together to promote Ethernet as a standard. Ethernet faced several competing technologies in its early days, including token bus (favored by General Motors for factory networking) and IBM’s Token Ring.

Several factors contributed to Ethernet’s eventual victory, including DEC’s early decision to support the technology and its relative openness as a standard. In 1983, Ethernet was approved as a standard by the IEEE 802.3 committee, which accelerated its adoption among corporations.

The original Ethernet cable carried only 3 Mbps. Today, the technology can reach speeds of up to 100 Gbps.
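To put that jump in perspective, here is a rough, back-of-the-envelope comparison of how long transferring a 1 GB file would take at each rate. The sketch below is purely illustrative and assumes ideal conditions, ignoring protocol overhead:

```c
#include <stdio.h>

int main(void) {
    /* Illustrative figures only: ideal transfer times for a 1 GB file,
       ignoring protocol overhead and real-world conditions. */
    const double file_bits    = 1e9 * 8;   /* 1 GB expressed in bits     */
    const double original_bps = 3e6;       /* original Ethernet: 3 Mbps  */
    const double modern_bps   = 100e9;     /* modern Ethernet: 100 Gbps  */

    printf("At 3 Mbps:   %.0f seconds (~%.0f minutes)\n",
           file_bits / original_bps, file_bits / original_bps / 60.0);
    printf("At 100 Gbps: %.3f seconds\n", file_bits / modern_bps);
    return 0;
}
```

At the original rate, the transfer takes roughly 45 minutes; at 100 Gbps, it finishes in well under a tenth of a second.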


6. Unix (AT&T Bell Labs)

How an abandoned partnership led Bell Labs to build the operating system that spawned iOS

Unix refers to a family of multitasking, multi-user computer operating systems that trace their origins back to the first Unix system, created by Bell Labs in the 1970s. Known for superior interactivity, running on inexpensive hardware, and being easy to adapt and move to different machines, among other benefits, the Unix system was hugely influential on other operating systems.

The history of Unix dates back to the mid-1960s, when AT&T Bell Labs was involved in a joint project with MIT and GE to develop an experimental time-sharing operating system called Multics. Frustrated by the limitations of Multics, Bell Labs left the joint project and began developing its own system called Unix.

Bell Labs produced several versions of Unix before selling the first source license for the system to the University of Illinois Department of Computer Science in 1975. Throughout the 1970s and 1980s, Unix gained increasing popularity in academic circles, eventually leading to the large-scale adoption of Unix by commercial startups.

In 2000, Apple released Darwin, a system derived from Unix, which became the core of the macOS and iOS operating systems. Today, iOS and macOS have 1.4B users combined.

Apple’s iOS and macOS are derived from Unix.


7. C programming language (AT&T Bell Labs)

How AT&T Bell Labs developed a language that became the foundation for modern programming

C is a programming language originally developed at Bell Labs by Dennis Ritchie, who also co-created Unix, in the early 1970s. Today, C remains one of the world’s most popular programming languages.

C was developed specifically to work with the Unix operating system, also developed at Bell Labs. The name “C” comes from the fact that it succeeded an earlier Bell Labs language called B, which was itself derived from BCPL.

C gained in popularity throughout the 1980s and also influenced many later languages, including C#, Go, Java, JavaScript, Python, and many more. Due to the language’s wide availability and efficiency, the compilers, libraries, and interpreters of other programming languages are often implemented in C.
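For a sense of how little the language’s surface has changed, the canonical first program from Kernighan and Ritchie’s The C Programming Language still compiles today, needing only a modernized declaration of main:

```c
#include <stdio.h>

/* The classic "hello, world" program popularized by Kernighan and Ritchie,
   lightly updated with a standards-conforming declaration of main(). */
int main(void) {
    printf("hello, world\n");
    return 0;
}
```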

Dennis Ritchie played a major role in the creation of both C and Unix.


8. The Charge-Coupled Device (AT&T Bell Labs)

How an ultimatum from the higher-ups at Bell Labs led to a revolution in camera technology

Charge-coupled devices (CCDs) are electronic sensors that provided the first practical way to let a light-sensitive silicon chip digitize an image. Applications of CCDs are found in many devices, including digital cameras, medical devices, and barcode readers.

The technology was invented at AT&T Bell Labs in 1969 by Willard Boyle and George Smith. At the time of the CCD’s development, Bell Labs was also working on a new type of computer memory. Impatient to speed up the development of the technology, upper management urged Boyle and Smith to come up with a solution using the semiconductor technology the organization was studying, or risk losing funding. Though perhaps not a management technique most would advocate for today, it’s hard to deny that the pressure got results. Boyle and Smith are said to have sketched out the basic design, principles of operation, and preliminary thoughts about applications for the technology during a brainstorming session that lasted less than an hour.

Steven Sasson, an electrical engineer working for Kodak, invented the first digital still camera in 1975, featuring the CCD technology developed by Boyle and Smith. That was followed by the first digital single-lens reflex (DSLR) camera. By the mid-2000s, DSLR cameras had largely replaced film cameras. The mass adoption of digital cameras spelled disaster for more than one company, including film giant Polaroid, which had to file for bankruptcy in 2001 (though it has since seen a resurgence in its fortunes).

In 2009, Boyle and Smith were awarded the Nobel Prize in Physics in recognition of the CCD’s far-reaching impact.

A CCD sensor from a Sony video camera. Image source: Wikimedia


9. The Smartphone (IBM Research)

How thinking small led to the creation of a forward-looking phone

The first commercially available smartphone was the IBM Simon, released in 1994. Development of the technology began at IBM Research in the early 1990s as part of an effort to introduce smaller, lighter products.

The officially credited “inventor” of the smartphone, Frank Canova, came up with the idea when he realized that chip-and-wire technology was becoming small enough to make a hand-held computer-like device possible. Canova’s boss, Jerry Merckel, was working on memory cards that would expand the memory of laptop computers, and saw that this technology could also be applied to the device described by Canova. Merckel then pitched the idea to his boss, Paul C. Mugge, as “the phone of the future.” Mugge greenlit the project.

The Simon phone did not have a web browser, but it did feature a touchscreen and many of the capabilities we associate with smartphones today, such as email, calendars, maps, stock reports, and news. While the term “smartphone” was not coined until after Simon was introduced, historians and industry experts have since argued that Simon featured enough of the defining features of a “smartphone” to deserve the title.

Poor battery life (just one hour) and rapid improvements from competitors killed Simon — it sold only 50,000 units in its six months on the market — but it laid the groundwork for Apple and others to pick up the mantle a decade later.

IBM’s Simon is widely considered to be the first smartphone. Image source: Rob Stothard/Getty Images


10. The Graphical User Interface (Xerox PARC)

How Xerox PARC transformed the computer interface

The graphical user interface (GUI) allows a user to issue commands to a computer by interacting with visual elements displayed on a screen, such as icons, input windows, or labeled buttons. It was an essential step in facilitating the widespread adoption of computers. Prior to GUIs, the most common way to interact with a computer program was to type command lines into a terminal, which required specialized knowledge that put computing out of reach for the majority of the population.

The GUI was developed at PARC in the 1970s by computer scientist Alan Kay, who built on earlier work that used text-based hyperlinks to connect related information. The first GUI included now-familiar elements such as windows, menus, and check boxes.

Xerox has faced hefty criticism for its failure to successfully commercialize the GUI. The first computer to feature a graphical user interface, the Xerox Alto, was designed for corporate and research purposes and not intended for mass commercialization.

In 1979, Xerox PARC received a visit from a young tech executive by the name of Steve Jobs. Jobs and other Apple employees reportedly saw a demonstration of the technology from Xerox as part of an investment deal Xerox had made in Apple. It has been speculated that the PARC visit influenced Apple’s decision to incorporate the graphical user interface design into both the Apple Lisa (1983) and the Macintosh (1984).

Today, graphical user interfaces are the norm, not only for computers, but for mobile devices, tablets, gaming consoles, and many other electronics. However, as we’ve seen, their status as the go-to user interface is being challenged by the rise of voice-activated devices like the Amazon Echo.

The Xerox Alto was the first computer to feature both a mouse and a graphical user interface. Image source: Flickr


11. The Computer Mouse (Xerox PARC)

How Xerox PARC stood on the shoulders of others and changed computer navigation forever

Like the GUI, the computer mouse was essential in spurring the mass adoption of computers. Before the mouse, the only way to interact with a computer was by typing commands, which made machines difficult and unwieldy to navigate. The introduction of the mouse and the graphical user interface, both developed at Xerox PARC, helped bring the computer from specialty tech to a consumer product with broad applications and accessibility.

Early iterations of a mouse-like computer input device have existed since 1949. But the first official “mouse” was invented by Douglas Engelbart in 1964, and consisted of a wooden shell, circuit board, and two metal wheels.

One of the early computers to feature a mouse was the Xerox Star, introduced in 1981, but it wasn’t until the introduction of the Apple Macintosh in 1984 that the mouse saw more widespread use. Twenty-four years later, in 2008, computer accessories giant Logitech announced that it had produced its billionth mouse.

The basic functionality of the computer mouse hasn’t changed much across the years. Image source: Wikimedia


12. The Laser Printer (Xerox PARC)

How Xerox consolidated its competitive advantage with the laser printer

Xerox was already the dominant player in the photocopier market in the 1960s when the idea for the laser printer was born. In 1995, the company ran magazine ads headlined “Who invented the laser printer?” with the answer “It’s Xerox.”

Gary Starkweather, a Xerox engineer, drew up the first plans for a laser printer in late 1969. Two years later, Starkweather transferred to PARC, where he modified an existing Xerox copier to create the Scanned Laser Output Terminal (SLOT).

The first commercial application of a laser printer came in 1975 with the IBM 3800. Xerox brought its first commercial laser printer, the Xerox 9700, to market in 1977.

Over the subsequent years, office and later home printers were introduced by companies including IBM, Canon, Xerox, Apple, and Hewlett-Packard. As of 2019, printing is estimated to be a $77B business, according to IBISWorld.

Laser printers, like this one from Sony, can trace their origin to Xerox PARC. Image source: Wikimedia


13. The Photovoltaic Cell (AT&T Bell Labs)

How Bell Labs put the silicon in Silicon Valley and helped launch the first solar-powered satellites into space

While experimental demonstrations of generating electricity from light stretch back all the way to the 1800s, it took a breakthrough at Bell Labs to bring photovoltaic (PV) cell technology to the commercial space.

The first demonstration of a practical photovoltaic cell took place at Bell Labs in 1954. The breakthrough came when inventors Calvin Souther Fuller, Daryl Chapin, and Gerald Pearson used silicon, which converts sunlight into electricity more efficiently than selenium, the previous focus of PV research.

One of the earliest adopters of solar cell technology was the US space program, which used solar cells to extend the life of satellite missions. The first solar-powered satellite, Vanguard I, was launched in 1958. By the 1960s, solar cells were the main power source for most Earth-orbiting satellites — and they continue to be today.

The US installed 11 gigawatts (GW) of solar PV capacity in 2018, bringing its total up to 64 GW, according to the Solar Energy Industries Association — roughly equivalent to the power needs of 12M homes.

The demand for solar panels in the US is increasing quickly. Image source: Flickr


14. The Transistor (AT&T Bell Labs)

How Bell Labs laid the foundation for modern computing

The transistor is a fundamental building block of modern electronic devices. Transistors are a key component in radios, calculators, and computers, as well as in the Kindle, the Echo, self-driving cars, and most of the other innovations on this list.

The transistor as we know it was invented at AT&T’s Bell Labs by John Bardeen, Walter Houser Brattain, and William Shockley in 1947. Early efforts to build the system proved challenging, but on the afternoon of December 23, 1947, Brattain and Bardeen successfully demonstrated the technology to several Bell Labs colleagues and managers, marking what many consider to be the birth of the transistor.

The first commercial production of the transistor began at a Western Electric plant in Pennsylvania in 1951. By 1953, the transistor was seeing adoption in some devices, such as hearing aids and telephone exchanges, but issues including sensitivity to moisture and the fragility of the wires slowed broader uptake at first. The technology’s commercial breakthrough came with the Regency TR-1 transistor radio, built around Texas Instruments transistors and released in 1954, which is estimated to have sold around 150,000 units. The proliferation of the transistor paved the way for future innovations such as portable CD players, personal computers, and, eventually, smartphones.

Bardeen, Brattain, and Shockley were awarded the Nobel Prize in Physics in 1956 in recognition of their work.

The first transistor was built in 1947. Image source: Wikimedia
