
The History Of Intel

Although something of an anomaly in the computing industry, Intel is a market leader in computer processors and computing devices. Unusual, indeed. When we talk about a Californian startup, especially one active in the hi-tech sector, we immediately picture a group of scrappy founders with very little money, trying to make ends meet in a garage or the basement of their parents’ home. That is how Steve Jobs and Steve Wozniak took their first steps before Apple became the giant we know today; a similar story holds for Jeff Bezos and Amazon, “born” in the garage of a house on the outskirts of Seattle, and for many other big names in the world of hi-tech. 

For Intel, things went differently: the historic producer of processors, graphics chips and semiconductors (among the most important in the world by production and turnover) was founded in 1968 by the engineers Robert Noyce and Gordon Moore, adequately financed by Arthur Rock, the American financier who pioneered the concept of venture capital. At the time of Intel’s founding (the name is a contraction of Integrated Electronics), Noyce and Moore were two forty-somethings with a solid reputation in the semiconductor and integrated-circuit world: the former was among the inventors of the silicon integrated circuit and general manager at Fairchild Semiconductor; the latter ran the research and development laboratories of the same company. It was therefore almost natural for the two to “fish” in Fairchild Semiconductor’s workforce when staffing their new company.

The Beginnings

Intel took its first steps in the world of memory chips: a few months after its birth, it introduced the Intel 1101, the first metal-oxide-semiconductor (MOS) static RAM chip, but with little success. Noyce and Moore did not lose heart, and a short time later they returned to the market with the Intel 1103, a 1-kilobit dynamic RAM chip that immediately attracted the attention of the big names in the IT sector. The first to adopt it in its computer systems was Honeywell, drawn by the chip’s speed and its ability to hold a large amount of information (by the standards of the early 70s, of course) at a relatively affordable cost and with reduced power consumption.

In 1971, Intel introduced two innovations that would change the course of computing. The commercialization of EPROM memories (an acronym for erasable programmable read-only memory) began in the first part of the year; the second part brought the Intel 4004 (designed, among others, by the Italian engineer Federico Faggin), the first commercially available general-purpose 4-bit processor built on a single chip. In both cases, the company based in Santa Clara (California) scored great successes: EPROM memories would remain Intel’s best sellers until 1985, while the 4004 paved the way for a new generation of CPUs.

The Era Of Microprocessors

In the mid-70s, Intel underwent a profound corporate restructuring: management decided to gradually abandon the production of DRAM memories and focus on the design, development and production of CPUs. In 1972 the 8008 arrived on the market, an 8-bit CPU and direct descendant of the Intel 4004; 1974 brought the Intel 8080, roughly ten times more powerful than its predecessor; and in 1978 came the 8086, Intel’s first 16-bit processor. 

In 1981, IBM chose the Intel 8088 as the CPU of its first computer made for large-scale distribution (the original IBM PC), which also opened up to the Santa Clara company the market for devices “compatible” with the IBM machine. Among the many CPUs produced in this period, the most important was the Intel 80386, the first 32-bit processor backward compatible with previous models. This was an epochal change: thanks to compatibility extending across multiple generations of CPUs, software developers could be sure that programs written for old processors would also run on new-generation computing units.

The Pentium And Beyond

In 1993, the Santa Clara company changed the way it “baptized” its products, abandoning numbered series in favor of alphanumeric names: thus the Pentium era was born. The new family of processors, built on a new production process, was characterized by remarkable computing power: each CPU packed 3.1 million transistors, and performance grew further thanks to parallel execution. Pentium processors, combined with the Microsoft Windows 3.1 operating system, contributed to the expansion of the IT market and the spread of computers into all segments of society. 

From this moment on, despite occasional challenges, the Intel-Windows duo would know no rivals in the field of information technology, constituting a de facto monopoly repeatedly sanctioned by US and European authorities. In 2005, the last barrier fell: Apple, which since 1984 had used only Motorola 68000-series processors and later the PowerPC line, decided to make the leap and start building computers (first the iMac, then the MacBook) on the Intel platform.

Towards New Frontiers

Dominance of the processor market allowed the Santa Clara giant to branch into other sectors of the IT market as well. In the mid-90s, production of Intel-branded motherboards began: the technological and informational advantage enjoyed by Noyce and Moore’s company was such that Intel captured roughly a 40% share of the motherboard market within a few years. Next came Intel graphics accelerators (GPUs) and, from September 2008, solid-state drives (SSDs) as well.

The Era Of Acquisitions

To strengthen its position and expand into other market sectors, from 2009 onwards Intel embarked on a genuine buying campaign. Between 2010 and 2011, it completed the acquisition of McAfee, a historic software house specializing in computer-security programs and tools (antivirus software, but not only), for about 8 billion dollars. The operation enlarged Intel’s workforce enormously: the Santa Clara company came to count over 90,000 employees, including more than 12,000 software engineers. Also in 2010 came the agreement with Infineon Technologies to acquire Infineon Wireless Solutions, a business unit specializing in chips and devices for wireless networking. 

Intel’s goal was to use Infineon’s technology in laptops, smartphones and tablets to improve their performance over wireless connections. In 2011, Fulcrum Microsystems and Israel’s Telmap came under Intel’s sphere of influence. The first, specialized in network devices (in particular network switches), appeared on the Times’ list of the 60 emerging startups of 2011; the second was a development house creating software solutions for geolocation. Intel’s interest in software houses was confirmed in 2013, when the Santa Clara-based company completed the acquisitions of Omek Interactive and the Spanish company Indisys. 

The first designed and developed technologies for gesture-based software interfaces, while the Spanish development house created artificial-intelligence software specialized in speech recognition. The acquisition of a manufacturer of chips for network devices dates to February 2015, while June 2015 was the turn of Altera, a company specialized in the design of FPGA chips and devices. This latest deal aroused particular interest in the industry press: to finalize it, Intel wrote a check for over 16 billion dollars, making it the most expensive acquisition in Intel’s history.
