aurizon 8 hours ago

Once accountants started to run this ship, they sailed onto rocky shores. Profits should have been used for research; instead they wasted ~$100 billion on stock buy-backs to keep the funds happy. Those billions, if spent on research, might have kept them off the rocks.

  • whatever1 2 hours ago

    Depends on your scope. From an Intel perspective it would have been wise to keep the cash, and maybe they would be in a better position today. From a market perspective, we unlocked $100B and pumped it into other companies (e.g. Apple, TSMC, Nvidia) or IPO'ed new ones. Those seemed to achieve better multipliers on the capital, hence as a whole we are better off (theoretically).

    Now of course all of this is at the macro level. If you look closely, the collapse of Intel would cause severe disruption on both the business and the geopolitical fronts.

slowmovintarget 15 hours ago

This is the story of the birth of Intel, and with it so many of the firsts that laid the foundation for our current technology landscape: The first DRAM chip, the creation of the first microprocessor (the 4004), on through the release of the Intel 8080.

  • em3rgent0rdr 9 hours ago

    Debatable to claim the 4004 as "the first microprocessor". It's safer to specify it as the first "commercially available, general-purpose" microprocessor. See https://en.wikipedia.org/wiki/Microprocessor#First_projects for a few pre-4004 chips that are also debatably the first microprocessor:

    - Four-Phase Systems AL1 chip (1969), which was later demonstrated in a courtroom hack to act as a microprocessor (though there is much debate on whether that hack was too hacky)

    - The F-14 CADC's ALU chip (1970), which was classified at the time

    - Texas Instruments TMS 1802NC (announced September 17, 1971, two months before the 4004), which would more specifically be termed a microcontroller nowadays, but whose core was nevertheless entirely inside a single chip

    • adrian_b an hour ago

      I do not consider the 4004 "general purpose".

      It was designed to implement a desktop calculator, not a general-purpose computer. With some effort it could be repurposed as a simple controller, but it was completely unsuitable for implementing the processor of a general-purpose programmable computer.

      For implementing a general-purpose processor, using MSI TTL integrated circuits would likely have been simpler than using the Intel 4004.

      The Intel 8008 (which implemented the architecture of the Datapoint 2200) was the first commercially available monolithic processor that could be used to make a general-purpose computer, and which was actually used for this.

      Around the same time as the first monolithic processors, Intel invented the ultraviolet-erasable programmable read-only memory (EPROM).

      The EPROM was almost as important as the microprocessor in enabling cheap personal computers, because it avoided the need for other kinds of non-volatile program storage (e.g. punched-tape readers or magnetic-core memories), which would have cost more than the entire computer.

  • brcmthrowaway 13 hours ago

    How did Intel lose DRAM to Micron?!

    • adrian_b 30 minutes ago

      By the time the Japanese DRAM manufacturers were making DRAM chips both better and cheaper than Intel's, Intel decided (in 1985, probably rightly) that instead of investing a lot of money trying to catch up with the Japanese, they should cut their losses by exiting the DRAM market and concentrate on what they did better, i.e. CPUs for IBM-compatible PCs.

      This decision was taken not long after the launch of the IBM PC/AT, while Intel was preparing the launch of the 80386, so they were pretty certain they could make a lot of money from CPUs even after abandoning their traditional market.

      Intel likely reached the point where such a decision had to be taken because for many years they must have underestimated the competence of the Japanese, believing that they were not innovating but only copying what the Americans did, much as many Americans now claim about the Chinese. By the time they realized that Japanese DRAMs were actually of higher quality and that Japanese plants had much better fabrication yields, it was too late.

    • Panzer04 12 hours ago

      Deliberate decision to focus on higher-margin products that aren't commodities (like memory). I believe similar logic was used to justify the sale of their flash business.

      • xadhominemx 12 hours ago

        Micron itself was often touch-and-go until several competitors went bankrupt around 2010.

    • bee_rider 10 hours ago

      I don’t really get the Intel/Micron relationship. Much later, Intel collaborated with Micron on their NVMe tech (3D XPoint/Optane), but in the end they gave up the product line to Micron, right?

      Companies don’t have friends. But they seem quite cozy?

wslh 9 hours ago

> The 3101 held 64 bits of data (eight letters of sixteen digits)

The 3101 held 64 bits of data (eight bytes, each representing values from 0 to 255).
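The arithmetic behind this correction is easy to check. A trivial sketch (not from the article; per the 3101 datasheet the chip was actually organized as 16 words of 4 bits, which is probably what "sixteen digits" garbled):

```python
# 64 bits of storage can be grouped either way; both cover the same capacity.
TOTAL_BITS = 64

num_bytes = TOTAL_BITS // 8      # 8 bytes of 8 bits each
values_per_byte = 2 ** 8         # each byte holds one of 256 values (0-255)

num_nibbles = TOTAL_BITS // 4    # 16 four-bit "digits" (the 3101's actual
                                 # organization: 16 words x 4 bits)

print(num_bytes, values_per_byte, num_nibbles)  # 8 256 16
```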

aurizon 8 hours ago

I recall when Mike Magee of the UK's The Inquirer coined the term 'Chimpzilla' for AMD, Intel's ('Chipzilla') perpetual rival.