
“Perhaps it is simpler to say that Intel…was disrupted”


Rimbo


https://stratechery.com/2018/intel-and-the-danger-of-integration/

Perhaps this discussion fits in the "Video Games" board as well. Anyway...

This is a terrific article (with some terrible drawings). Fascinating to me because of my time following the graphics industry and the PC, which grew up at the same time I did. Intel's failure is the last loose end tied up with the PC revolution... that era is officially over, I think. Every major player who made it happen has either been blown out of the business (DEC, Commodore, Compaq, Radio Shack), disrupted (TI, Microsoft and now Intel), or moved on to other things in advance of being disrupted (IBM, Apple, AMD).
 


Good article. Thanks for posting.

Everything is clearer in hindsight. The author hinted at it: no CEO would've survived if, at the genesis of the troubling signs, he had suggested investing immense capital and expense to kill the company's main revenue stream. The investors and the board would never allow the company to stop milking the teat that has made the millions.

This brings up an interesting thought/question: has any company or CEO been able to reinvent themselves to avoid being disrupted? With the market's incredibly myopic focus on quarterly earnings, I don't think many CEOs have the leeway required to divert large sums of capital to things which may kill current revenue and margin.


I don't think it's possible. The pattern that hit Intel is the same one that hit Microsoft, IBM, AT&T, and DEC before them, and is the same one that will hit Apple in the mobile space eventually.

In the three examples I listed of companies that changed themselves -- IBM, Apple and AMD -- the company had either already been disrupted and changed in response to that (IBM), or was not the leader in the first place (Apple re: the Mac, and AMD).

Apple is an interesting example, because they actually were the market leader with the Apple II, but at the time of its market dominance the company's management had absolutely zero interest in even acknowledging the product's existence. They were putting all of their efforts into the Apple III instead; literally every ad for the Apple II was a third-party company selling pieces of the Apple II's ecosystem. (Source: iWoz, Steve Wozniak's autobiography.)


Seems early to say Intel has been disrupted. They have significantly higher margins than the rest of the industry, given their integrated model and market dominance. There are lots of headwinds facing Intel, one example being that they missed mobile. Processors are also moving into more niche use cases (e.g. AI, autonomous driving). I would not be surprised if Intel loses a lot of market share in five years' time, but to predict that and blame it on their model is a fairly bold prediction.

It is less about their integrated model and more about their inability to deploy R&D to find the next wave of processors, or to get lucky like Nvidia.


55 minutes ago, crash_davis said:

This brings up an interesting thought/question: has any company or CEO been able to reinvent themselves to avoid being disrupted? With the market's incredibly myopic focus on quarterly earnings, I don't think many CEOs have the leeway required to divert large sums of capital to things which may kill current revenue and margin.

Amazon was an online bookseller. Netflix rented DVDs. BASF made dyes and similar chemicals before risking everything to commercialize the Haber-Bosch nitrogen process.


So, I was following the graphics industry very closely at the time that Nvidia rose, and I can say with some amount of authority that Nvidia did not "get lucky." 3dfx got lucky, and I'll point you to guys who worked there at the time who can confirm this personally -- they got lucky when their high-end graphics accelerator became a consumer-grade masterpiece after an unexpected drop in the price of VRAM.

But Nvidia had a deliberate plan in place to make themselves a permanent fixture in the market, and executed it with backups to spare.

As for Intel, I don't think it's early to say it, because generally we can call these things out several years before we see them occur. We can call out Apple now too, I think, in the mobile handset space, because their model is starting to hit some problems (also this one and this one) that are, to me, signs that the industry has set its direction and that Google and Amazon are the new leaders.


3 minutes ago, Parliament said:

Amazon was an online bookseller. Netflix rented DVDs. BASF made dyes and similar chemicals before risking everything to commercialize the Haber-Bosch nitrogen process.

Netflix is a good example, but Amazon less so; Amazon's goal was always to expand. The interesting thing is how much AWS is part of their business now, but they remain dominant in the book space. They haven't abandoned books (FAR from it) in the way Netflix left DVD rentals behind.


4 minutes ago, Rimbo said:

So, I was following the graphics industry very closely at the time that Nvidia rose, and I can say with some amount of authority that Nvidia did not "get lucky." 3dfx got lucky, and I'll point you to guys who worked there at the time who can confirm this personally -- they got lucky when their high-end graphics accelerator became a consumer-grade masterpiece after an unexpected drop in the price of VRAM.

But Nvidia had a deliberate plan in place to make themselves a permanent fixture in the market, and executed it with backups to spare.

As for Intel, I don't think it's early to say it, because generally we can call these things out several years before we see them occur. We can call out Apple now too, I think, in the mobile handset space, because their model is starting to hit some problems (also this one and this one) that are, to me, signs that the industry has set its direction and that Google and Amazon are the new leaders.

The CEO was throwing so much shit at the wall and finally had a huge hit. We can argue whether it was a deliberate roadmap, but in my view it wasn't. I have no technical background, so I'm basing my opinion on ex-Intel employees, who are obviously biased.


1 hour ago, Washpark said:

The CEO was throwing so much shit at the wall and finally had a huge hit. We can argue whether it was a deliberate roadmap, but in my view it wasn't. I have no technical background, so I'm basing my opinion on ex-Intel employees, who are obviously biased.

What was their "one big hit"? I think you have them confused with 3dfx.

What Nvidia did was an iterative process: they had four teams on 18-month development cycles (so that if one fell behind, there'd be another in line to get there), releasing products in lockstep with the OEM cycle. They always overpromised and under-delivered, because they wanted the hype of the new version to keep interest away from competitors, but they would cut features if necessary to hit their release targets, so that every six months the OEMs had a new Nvidia product to buy. OEMs learned they could trust Nvidia to have an upgrade they could sell every six months, and the card from six months ago became the budget card... by fitting into their customers' needs, they grew and eventually dominated.

Their products didn't have to be ahead of the curve technologically (although they were always ahead of Intel, who was barely even pursuing 3D at the time), but by feeding their success back into the cycle, they eventually had so many resources brought to bear that they inevitably became the best-performing. Eventually the blue-ocean market that had allowed so many new companies to form collapsed as the product became a commodity, and Nvidia and ATI remained the winners while previous giants like Diamond, Matrox and S3 fell. ATI and Nvidia absorbed everyone left; ATI itself was sucked up by AMD a while back.

There was never any one-hit wonder for Nvidia. They just followed a business playbook to achieve business success. Technical dominance was just a happy coincidence.

It's worth adding: at the time, I was a HUGE 3dfx fanboy. But after it became clear even to those of us who were fanboys that they were doomed, I went back and looked honestly at what had actually happened, and realized that a lot of the things I had been criticizing Nvidia for were the very things leading to their success as a business.

Business success in technological markets rarely goes to the company with the best technology; it goes to the well-run business with good enough technology. In fact, I'd go as far as to say that having the best technology is almost always a hindrance, because the costs required to make the best tech are almost never recovered.


Also didn't help that 3dfx was doing shit like renting out the Playboy mansion for parties.

I'm still stumped that 3dfx's Glide compatibility and features like hardware FSAA lost out to the Nvidia feature set of stuff like 32-bit color and T&L, which weren't really useful anyway in their early-generation hardware. It seemed like OEMs wanted marketing-based features that looked cool on the box more than stuff you needed for actual gaming. Then again, fast-forward to now, when all the TVs are 4K HDR yet the shit my cable box pumps out is overcompressed 1080i/720p, which actually looks worse on a 4K display. Marketing always wins, especially when backed up with predictable execution.

But any day now, Rampage will be out, and things will be OK.

 


10 minutes ago, Paper_jam said:

Also didn't help that 3dfx was doing shit like renting out the Playboy mansion for parties.

I'm still stumped that 3dfx's Glide compatibility and features like hardware FSAA lost out to the Nvidia feature set of stuff like 32-bit color and T&L, which weren't really useful anyway in their early-generation hardware. It seemed like OEMs wanted marketing-based features that looked cool on the box more than stuff you needed for actual gaming. Then again, fast-forward to now, when all the TVs are 4K HDR yet the shit my cable box pumps out is overcompressed 1080i/720p, which actually looks worse on a 4K display. Marketing always wins, especially when backed up with predictable execution.

But any day now, Rampage will be out, and things will be OK.

 

Well, the thing is, even though their hardware T&L wasn't useful in those early generations, by getting the product out there into developers' and customers' hands, and then improving on it with release after semi-annual release, it became useful, while 3dfx had nothing that could do hardware T&L. In fact, by pooh-poohing the early GPUs, they painted themselves into a marketing corner, where it would've looked kind of bad if they did release T&L, unless it was a world-beating T&L. But they were already behind on it, and running out of money due to bad business decisions...

The horrifying truth was that hardware FSAA could simply be matched by having a faster renderer at higher resolutions. Even though Nvidia wasn't as fast at the time as, say, a Voodoo3, one year later they had matched and surpassed the Voodoo3's performance while 3dfx was still waiting for another home run.
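If it helps to see what "match FSAA with a faster renderer" means concretely, here's a minimal Python sketch -- the render() function is a made-up stand-in for a rasterizer, not anything from a real driver:

```python
import numpy as np

def render(w, h):
    """Toy stand-in for a rasterizer: a hard-edged diagonal (1.0 above
    the line, 0.0 below), so the edge aliases into visible stairsteps."""
    y, x = np.mgrid[0:h, 0:w]
    return (x / w + y / h < 1.0).astype(np.float32)

def downsample_2x(img):
    """Box-filter a (2H, 2W) image down to (H, W) by averaging each 2x2
    block -- the same per-pixel averaging 2x2 supersampled FSAA performs."""
    h, w = img.shape[0] // 2, img.shape[1] // 2
    return img.reshape(h, 2, w, 2).mean(axis=(1, 3))

aliased  = render(640, 480)                  # jagged edge
smoothed = downsample_2x(render(1280, 960))  # brute force: ~4x the fill rate
assert smoothed.shape == aliased.shape
```

Same smoothed edge either way; the brute-force route just burns roughly 4x the fill rate, which is exactly the resource Nvidia's release cadence kept compounding.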

To analogize with baseball: 3dfx basically got lucky and hit a home run, while Nvidia just kept on hitting singles. 3dfx kept swinging for another home run, and they ended up striking out or flying out every time. In the end, Nvidia had the higher OPS and WAR on account of their vastly superior OBP. :D


4 hours ago, Rimbo said:

I don't think it's possible. The pattern that hit Intel is the same one that hit Microsoft, IBM, AT&T, and DEC before them, and is the same one that will hit Apple in the mobile space eventually.

In the three examples I listed of companies that changed themselves -- IBM, Apple and AMD -- the company had either already been disrupted and changed in response to that (IBM), or was not the leader in the first place (Apple re: the Mac, and AMD).

Apple is an interesting example, because they actually were the market leader with the Apple II, but at the time of its market dominance the company's management had absolutely zero interest in even acknowledging the product's existence. They were putting all of their efforts into the Apple III instead; literally every ad for the Apple II was a third-party company selling pieces of the Apple II's ecosystem. (Source: iWoz, Steve Wozniak's autobiography.)

 

My understanding was that Woz wanted to continue and advance the market-leading Apple II, but Jobs pushed the shitty Mac and subsequently nerfed the Apple II platform's evolution. Evidently there are still legions of Apple fanboys pissed at Jobs about it.

 

 


Hit is the wrong word; more like being positioned for the shift in demand toward AI-focused processors. NVDA had languished for years, and from an outside perspective it seemed they stumbled onto this market by accident and just happened to be best positioned for it, rather than executing a visionary roadmap.


When AMD launched their first CPU outside the traditional AMD/GF fabs (Ontario, TSMC 40nm), I said that there could come a day when AMD had products at Intel fabs. This was around 2011? My point was that Intel needed to spin off their fabs into a new company.

They still should. The weakening PC market means that Intel cannot fill up their fabs. And they have dipped their toe into the foundry model (they have a side business), but for them to succeed at it, they need to go all-in. The design side would benefit because they'd have the flexibility to use TSMC 7nm; the foundry side would benefit because TSMC has no real competition (GF and Samsung, ehh).

I promise you: the last thing TSMC wants is to have Intel as a competitor on the foundry side.


One thing all this does tend to show is that even the most entrenched-seeming monopolist (Micro$oft, now Intel) can be unseated fairly rapidly. That lends a certain force to arguments for reforming antitrust law.

 

Also, I'm not sure TI was disrupted. They were always only a marginal player in the PC space, and at that time heavily involved in the defense industry. Divesting defense to Raytheon to focus on non-processor, application-specific electronics and analog devices was probably just a wise strategic move that concentrated on their core competence. Doing so took them out of the public eye, but most of their business over most of their history was out of the public eye.


14 hours ago, MagicSoccerSpray said:

 

My understanding was that Woz wanted to continue and advance the market-leading Apple II, but Jobs pushed the shitty Mac and subsequently nerfed the Apple II platform's evolution. Evidently there are still legions of Apple fanboys pissed at Jobs about it.

 

 

Jobs had nothing to do with it, nor did the Mac; the Mac didn't come out until much later. We're talking about a seven-year gap between the II (1977) and the Mac (1984).

What happened -- and this is according to Woz himself in his autobiography -- is that when they first had commercial success with the II, the Steves looked at each other and thought, "We need to hire people to run this company for us; we don't know how" (and Woz didn't want to), and so they did. The trouble is, the people they hired knew nothing about the emerging market the Steves had created, and their bell cow was going to be the Apple III. Of course, the III was a terrible product.

Both of them were marginalized by the people they hired -- remember Jobs' firing, which led to him founding NeXT, whose operating system eventually became the foundation of Mac OS X (when they brought him back).

Now, I was one of those fanboys mad at Jobs, but the truth was that the II was not a salvageable series by that point. The IIgs was just a stopgap to buy the Macintosh time. And Jobs himself was not running the company.


I'm a huge fan of Ben Thompson, but he's a bit off here. Why? Intel has a viable path -> they have to spin off the fabs. Even in a day of mobile and fewer desktop computers, there is a need for more foundries. TSMC is the gorilla, and they charge a pretty penny. Intel is a bit behind TSMC, but these guys can leapfrog each other and continue to fill their fabs. Running fabs isn't cheap, but we are in a world with more electronics, not less.

The team that can get disrupted is Intel design. Intel's reliance on x86 is similar to Microsoft's reliance on Windows. But without becoming a fully fabless design company, I don't see Intel changing.

For the record, their datacenter/server business is KILLING IT. And that can continue if they go fabless.


12 minutes ago, RD3 said:

I'm a huge fan of Ben Thompson, but he's a bit off here. Why? Intel has a viable path -> they have to spin off the fabs. Even in a day of mobile and fewer desktop computers, there is a need for more foundries. TSMC is the gorilla, and they charge a pretty penny. Intel is a bit behind TSMC, but these guys can leapfrog each other and continue to fill their fabs. Running fabs isn't cheap, but we are in a world with more electronics, not less.

The team that can get disrupted is Intel design. Intel's reliance on x86 is similar to Microsoft's reliance on Windows. But without becoming a fully fabless design company, I don't see Intel changing.

For the record, their datacenter/server business is KILLING IT. And that can continue if they go fabless.

Probably an astute point. TI was on the bleeding edge of RAM design and fabrication for a decade or more, to no end except patent royalties, because no one else could build those designs fast enough or cheaply enough for commercial use until they were a generation or so behind. Like everything else manufacturing-related, offshoring it is probably the path forward.


42 minutes ago, Hornius Emeritus said:

 

Apple did.

Yes, but it's worth adding that they weren't a market leader at the time they did so. It's a lot easier to reinvent yourself when you don't have a dominant market share to lose. When you're already in the leadership position? It's basically impossible.


23 hours ago, crash_davis said:

This brings up an interesting thought/question: has any company or CEO been able to reinvent themselves to avoid being disrupted? With the market's incredibly myopic focus on quarterly earnings, I don't think many CEOs have the leeway required to divert large sums of capital to things which may kill current revenue and margin.

The classic example is actually Intel. They started off as a memory company and were the worldwide leader in DRAM. Then the Japanese leapfrogged them, and they were faced with a dilemma: what to do? Chase DRAM, or put their chips all-in on their smaller microprocessor group? Andy Grove was a genius and asked: what would our successor executive team do? Gordon Moore (of Moore's Law fame) said microprocessors... Then why wait? And the rest is history.

http://watercoolernewsletter.com/the-revolving-door-test-how-intel-overcame-fear-by-gaining-an-outsiders-perspective/

Quote

In his memoir, Only the Paranoid Survive, Andy Grove recalled a tough dilemma he faced in 1985 as the president of Intel: whether to kill the company’s line of memory chips. Intel’s business had been founded on memories, and indeed for a time the company was the world’s only source of memory. But by the end of the 1970s, a dozen or so competitors had emerged. Meanwhile, a small team at Intel had developed another product, the microprocessor, and in 1981 that team got a big break when IBM chose Intel’s microprocessor to be the brain of its new personal computer.

 


23 hours ago, crash_davis said:

This brings up an interesting thought/question: has any company or CEO been able to reinvent themselves to avoid being disrupted? With the market's incredibly myopic focus on quarterly earnings, I don't think many CEOs have the leeway required to divert large sums of capital to things which may kill current revenue and margin.

Disruption (as Prof. Christensen defines it) is when a company keeps chasing higher margins, leaving lower-margin segments to other competitors, until those competitors scale up with different (disruptive) technology to compete at lower cost and unseat the original company. And the danger is that the original company is incentivized, in the short term, to continue on this high-margin path.
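You can see the trap in a toy model (all numbers and tiers invented for illustration):

```python
# Toy model of Christensen-style disruption: an incumbent keeps retreating
# upmarket because ceding the lowest-margin tier always looks locally rational.
tiers = ["low-end", "mid-range", "high-end"]  # ordered by margin, low to high
entrant_reach = 0  # index of the highest tier the entrant's tech can serve

for year in range(1, 6):
    held = tiers[entrant_reach:]
    print(f"year {year}: incumbent holds {held or 'nothing'}")
    if entrant_reach < len(tiers):
        entrant_reach += 1  # "good enough" technology climbs one tier per year
```

Each retreat raises the incumbent's average margin, right up until there is nowhere left to retreat to.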

IBM is another example, from when they introduced the PC. The original IBM PCs threatened their own higher-margin businesses... and IBM eventually made a ton of money off that forward-looking move. Of course, they opened Pandora's box, and the rest is history.


On 7/10/2018 at 9:52 AM, RD3 said:

When AMD launched their first CPU outside the traditional AMD/GF fabs (Ontario, TSMC 40nm), I said that there could come a day when AMD had products at Intel fabs. This was around 2011? My point was that Intel needed to spin off their fabs into a new company.

They still should. The weakening PC market means that Intel cannot fill up their fabs. And they have dipped their toe into the foundry model (they have a side business), but for them to succeed at it, they need to go all-in. The design side would benefit because they'd have the flexibility to use TSMC 7nm; the foundry side would benefit because TSMC has no real competition (GF and Samsung, ehh).

I promise you: the last thing TSMC wants is to have Intel as a competitor on the foundry side.

I agree with a lot of what you say, but I'm not as confident as you are that Intel could be successful in the near term (~5-10 years) going to a foundry model.

As you point out, the business has a design side and a manufacturing side. Intel has always been so far ahead on the design side, and their product was so valuable, that they never tolerated any shortage of capacity, so they built a lot of fabs. This was justifiable because the loss of revenue from being unable to meet demand due to supply constraints was much more significant than the extra expense of having a lot of fabs running below full capacity, so they overbuilt. That worked out great for them and was almost certainly the right decision at the time. But margins are too thin to run like that in a foundry. And it's not as simple as just saying "we'll fill up our factories and get more money out of them." Everything about running the factory has to change. Business processes involving QA, repair and maintenance, and operations have not, over the decades, been subjected to the same time-based urgency as at memory fabs or a pure-play foundry. Changing that mindset and culture is quite an undertaking.
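To put made-up numbers on the overbuilding logic (these figures are purely illustrative, not Intel's actual economics): overbuilding pays when margins are fat and loses when they're foundry-thin.

```python
# Compare expected profit of sizing capacity for average vs. peak demand.
scenarios = [(0.5, 80_000), (0.5, 120_000)]  # (probability, wafer demand)
capacity_cost = 1_000                        # annualized cost per wafer of capacity

def expected_profit(capacity, margin):
    revenue = sum(p * margin * min(d, capacity) for p, d in scenarios)
    return revenue - capacity * capacity_cost

for margin in (5_000, 1_500):                # fat IDM margin vs. thin foundry margin
    print(margin,
          expected_profit(100_000, margin),  # size for average demand
          expected_profit(120_000, margin))  # overbuild for peak demand
```

With the fat margin, overbuilding wins (380M vs. 350M here); with the thin one, it loses (30M vs. 35M). Same fabs, opposite answer.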

A second issue is all of the internal organization and infrastructure that has to be created to deal with all the different products and customers. At a single Intel fab they may make four families of processors, with several very similar versions in each family. And the customer for every single one of those chips is Intel, with very clear Intel requirements. At a TSMC fab they will simultaneously be making everything from the SoC for an iPhone, to image sensors for an Indian digital-camera customer, a display driver for an LG TV, infotainment parts for Ford, ASIC chips for bitcoin mining, etc. There would be hundreds of types of chips, over a thousand different products, and the insane number of reticle sets that go along with that. You have to have an organized way to maintain SPC charts and monitoring for that many devices. Device engineers who understand the layout of those chips have to be hired to diagnose issues during manufacturing. There need to be people who maintain relationships with hundreds of customers, each of which has different Ppk requirements and different requirements for manufacturing turnaround time.
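(For anyone who hasn't run into Ppk: it's a standard process-performance index -- how far the process mean sits from the nearest spec limit, measured in units of three standard deviations. A quick sketch with made-up measurements:)

```python
import statistics

def ppk(samples, lsl, usl):
    """Process performance index: distance from the mean to the nearest
    spec limit, in 3-sigma units. Higher is better; ~1.33+ is a common ask."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return min(usl - mu, mu - lsl) / (3 * sigma)

# e.g. a critical dimension (made-up numbers) against one customer's limits:
print(ppk([10.1, 9.9, 10.0, 10.2, 9.8], lsl=9.5, usl=10.5))  # ~1.05
```

Now multiply that bookkeeping by a thousand products, each with its own limits and its own customer.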

Intel is fucking great at designing CPUs based on x86. They are great at manufacturing very, very expensive chips in a slow and steady way. That has very little to do with running a foundry.


Yes, being a TSMC is very different from being an Intel. One is a services company and the other is a product company. Being a service provider means you are necessarily market-following and at least one level below the product company in the food chain, which is reflected in margins, brand, product roadmaps, business dealings, etc. Intel providing foundry services automatically means they have accepted defeat.

Intel's mistake is not integrating design with manufacturing, but sticking with x86 -- or more correctly, not developing new alternatives with enough enthusiasm. Their only other serious product apart from x86 that I can recall is Itanium (XScale doesn't count). It also shows how monumentally difficult changing philosophies is, when even Intel, with near-monopoly power over their customers, could not ram a new product through (x86-64 won).

nVidia may be a one-trick pony too, but they were visionary enough to see beyond polygon-pushing, and they have been beating the "a GPU is a better multi-threaded processor than a CPU" drum for many years. They worked with the scientific community and developed parallel computing solutions from workstations to supercomputers. So they were in the right place at the right time to take advantage of the current machine learning and crypto-mining craze.
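A minimal sketch of the kind of workload they evangelized -- SAXPY, the usual "hello world" of GPU computing (numpy stands in here; on a GPU, each element would get its own thread):

```python
import numpy as np

n = 1_000_000
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)
alpha = np.float32(2.0)

# SAXPY: y = alpha*x + y. No element depends on any other, so the work
# splits cleanly across thousands of GPU threads -- the structure nVidia
# spent years teaching the scientific computing crowd to exploit.
y = alpha * x + y
```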

AMD is the small and sometimes abused sibling of Intel; Intel takes the whole dining table and leaves scraps. Very hard to compete and develop when you are starving. Credit to them (despite a couple of dumb leadership phases) for hanging on.

Coming back to Intel: they are in serious trouble. They are the apex predator, and hence need the whole jungle to do well in order to thrive; otherwise they become just another component provider. But that is what is happening with the new ecosystems that are vertically integrated. Apple/Samsung don't need them; they need Apple/Samsung. Google/Amazon/Facebook are making (or will make) their own chips. Microsoft has rediscovered its identity as a software-first company, with hardware as an enabler of software development. If they feel like they need a Surface-like product line for chips to do selective AI/ML stuff, they will do it. They have already done many chips for Xbox, HoloLens, etc. That leaves only the system aggregators (HP/Dell/Lenovo) and tier-II players (HTC, Huawei, etc.). So this means only Xeon servers are the real money-makers, and how this market evolves as the big guys bring chip design in-house will be interesting to watch.

So, Intel desperately needs a hit. They are investing heavily in ML/AI and have acquired many companies. Can they come up with a solution when the big vertical guys have far more ammunition (algorithms, customer data from photos to shopping history, etc.) to develop better systems? The same question largely applies to Qualcomm too. I think this is what is driving Broadcom to swallow giants whole, to somehow become big enough to survive.

 

Rimbo said:

So, I was following the graphics industry very closely at the time that Nvidia rose, and I can say with some amount of authority that Nvidia did not "get lucky." 3dfx got lucky, and I'll point you to guys who worked there at the time who can confirm this personally -- they got lucky when their high-end graphics accelerator became a consumer-grade masterpiece after an unexpected drop in the price of VRAM.

But Nvidia had a deliberate plan in place to make themselves a permanent fixture in the market, and executed it with backups to spare.

As I understand it, the Riva 128 was the last backup after their previous chips failed on the market. But that chip hit, and the rest is history.

 


16 hours ago, JimmyGlass said:

I agree with a lot of what you say, but I'm not as confident as you are that Intel could be successful in the near term (~5-10 years) going to a foundry model.

...

Intel is fucking great at designing CPUs based on x86. They are great at manufacturing very, very expensive chips in a slow and steady way. That has very little to do with running a foundry.

Let me take a step back -> this won't be a slam dunk. It would be a huge undertaking. And that's what those guys are paid the big bucks to consider...


10 hours ago, elfenix said:

As I understand it, the Riva 128 was the last backup after their previous chips failed on the market. But that chip hit, and the rest is history.

 

Now that's ancient history, man. :) The Riva 128 basically rode the coat-tails of the VRAM price drop, just like 30 other companies' products did. That price drop allowed everyone to succeed.

