Today, we tend to associate the words “Silicon Valley” with the hundreds of Internet giants and billion-dollar startups headquartered in the San Francisco Bay Area. However, startups and Internet companies were not always the face of Silicon Valley. The “silicon” originally referred to the large number of companies in the region that specialized in making silicon-based chips and semiconductors during the early days of the computer industry.
In 1947, John Bardeen and Walter Brattain of Bell Labs demonstrated the first point-contact transistor; their colleague William Shockley followed with the bipolar junction transistor, first demonstrated in 1951. In 1956, Shockley moved to Mountain View, California to found Shockley Semiconductor Laboratory with the goal of replacing germanium transistors with silicon ones, which he believed would perform better. Although Shockley was technically brilliant, an accomplished researcher and a Nobel Prize recipient, he lacked management skills. His authoritarian management style caused many disputes with his employees, and one year after he founded the company, eight of his researchers resigned to start their own company, Fairchild Semiconductor. Among them were Robert Noyce and Gordon Moore, who would later go on to found Intel, one of the most influential companies of the following decades.
Fairchild Semiconductor made several important inventions during the 1960s, including the first monolithic silicon integrated circuit and the first monolithic op-amp. New advances in silicon technology soon drove down manufacturing costs, and in 1965, Gordon Moore projected that the number of transistors on an integrated circuit would double every year, a projection that became known as Moore’s Law. Three years later, Moore and Robert Noyce founded Intel (Integrated Electronics), which went on to create the popular x86 family of microprocessors.
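Moore’s projection is just compound doubling, so it can be sketched as a one-line exponential. The starting count and time spans below are illustrative assumptions, not historical figures:

```python
def moores_law(n0: int, years: float, doubling_period: float = 1.0) -> int:
    """Project a transistor count forward, doubling every `doubling_period` years.

    n0: starting transistor count (hypothetical here)
    years: how far forward to project
    """
    return int(n0 * 2 ** (years / doubling_period))

# With a hypothetical 64-transistor chip, doubling yearly for a decade:
print(moores_law(64, 10))   # 64 * 2**10 = 65536
# Moore later revised the period to two years, which slows the curve:
print(moores_law(64, 10, doubling_period=2.0))   # 64 * 2**5 = 2048
```

The same compounding is why even a small change in the doubling period produces wildly different long-range projections.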
During this time, the Cold War was in full swing. The Space Race, the failed Bay of Pigs invasion, the Cuban Missile Crisis, and the Vietnam War were reminders that technological advances were necessary to maintain the balance of power. The Advanced Research Projects Agency of the Department of Defense funded a project to link computers in different geographic regions so they could quickly transmit and receive data. This network was known as ARPANET, and in 1969, the same year mankind walked on the moon, ARPANET had successfully connected UCLA, Stanford, UCSB, and the University of Utah. In the following decades, new networking technologies, protocols, and organizations were created, including TCP/IP (which became the backbone of the Internet), FTP, SMTP, DNS, and the IETF. In 1990, ARPANET was decommissioned and the World Wide Web was born. Soon after, the Internet became commercialized and the browser war between Microsoft’s Internet Explorer and Netscape’s Navigator began. As the Internet became increasingly prevalent in daily life, more Internet-related companies were founded, including Amazon, eBay, and Google, and the surrounding hype led to a wave of speculation among investors that inflated the dot-com bubble, which burst in 2000.
Let’s head back to 1970, two years after Intel was founded. Xerox created its Palo Alto Research Center (PARC), which produced many influential inventions: the laser printer, Smalltalk, the first WYSIWYG text editor, the modern GUI, and Ethernet, and it popularized the computer mouse (invented earlier by Douglas Engelbart at SRI). Steve Jobs and Steve Wozniak founded Apple Computer in 1976 and visited Xerox PARC to learn more about these innovations while developing the Lisa and the Macintosh. Microsoft was founded in 1975 and became a competitor of Apple with the release of MS-DOS, Windows, and its productivity software.
Silicon Valley grew larger as more companies were founded and brought innovative technologies to market. It became known as a breeding ground for taking risks and making inventions that changed the world. Silicon Valley started with only a handful of tech companies but today hosts hundreds of mid-to-large-sized companies and thousands of startups. Although it is now known for its Internet-based companies and emerging technologies such as VR and machine learning, its roots remain firmly planted in the original silicon chip innovators of half a century ago.