Intel CEO Pat Gelsinger inherited a troubled company that had lost its manufacturing edge and had ceded to rivals the hugely lucrative markets for chips used in mobile phones and artificial intelligence.
1968 – Robert Noyce and Gordon Moore found Intel, helping reshape California’s Santa Clara Valley from fruit orchards into the Silicon Valley tech hub.
1971 – Intel introduces the 4004, the world’s first commercially made programmable microprocessor, with 2,300 transistors. These were among the first chips that could be programmed to perform specific functions, unlike previous processors that were hardwired only for a certain task – a turning point in the history of the semiconductor industry that laid the groundwork for the development of CPUs.
1981 – Intel’s 8088 microprocessor, with 29,000 transistors, becomes the brain in the IBM Personal Computer, kicking off the era of personal computing.
1982 – Advanced Micro Devices, co-founded in 1969 by a fellow Fairchild Semiconductor alum, becomes a second source for the Intel 8086 microprocessor. The arrangement would later spark a long legal battle over AMD’s rights to use Intel’s x86 chip architecture, which remains the basis of a majority of PC and server chips today.
1985 – Intel decides to withdraw from the dynamic random access memory (DRAM) market – its initial claim to fame – as slumping demand prompts it to focus on microprocessors.
1985 – Intel begins cutting its workforce, hit by an industry-wide downturn brought on by an oversupply of memory chips. Intel would lay off thousands of workers over the remainder of the 1980s.
1987 – Andy Grove, famous for his motto, “Only the paranoid survive”, becomes Intel’s third CEO and steers the company through a massive slump in the chip market. Intel solidifies its position as a linchpin of the American semiconductor industry, while players like AMD and National Semiconductor struggle.
1991 – Intel