Here's a fun fact (and I suspect it shows what a loser I am :-)): Pat Gelsinger and I were colleagues at Intel in the early '80s. We both worked in what was called "MIPO" (Microprocessor Operation), but I was in Systems Validation and he was on the customer-facing side with field engineering. My career went off through a variety of engineering roles; his went into management and up. When he joined EMC, I told him I had always thought he would have been Intel's CEO. He told me, "Hey, maybe I still will be :-)" That was like 10 years ago, and here he is.
There are a lot of things you can say about Silicon Valley, but one of the more interesting aspects of it for me has been how "small" it is in terms of individual people having an outsized impact. I have never been one of those people, of course, just part of the entourage. But it has been interesting to watch and learn from folks who are good (and bad) role models.
Pat went to my high school and computer science program. It's in a small, rural Pennsylvania farming town - pop. 2,000. Somehow we got an amazing math teacher turned CompSci teacher, and he turned out world-class students - we ranked high enough each year to compete internationally in ACSL.
A few years back, when the teacher retired, we all gathered together to thank him, including Pat! World-class guy, and he makes us proud! So great to share with students who come from nothing a story of how you can leave our hometown and eventually be the CEO of Intel.
Wishing Pat the best of luck in his new role!
A single good teacher can make all the difference in the world.
From an economics perspective, it's kinda insane that teachers are paid so poorly, since they are such force multipliers. A single brilliant teacher can positively influence the future educational trajectory (and future earnings) of thousands of kids. Investing in making that happen more often seems like it would have great dividends.
Which is kind of sad, isn't it?
Doesn't that mean that a lot of people will never be on the receiving end just because they did not win the teacher-lottery (arguably the chance of getting such a teacher is rather slim)?
Shouldn't our systems be more robust? (Not talking about normalizing these effects down, but creating a better environment for everyone, using these effects en masse)
I have a few former colleagues who were equals (or close) with me early in my career, and are now senior executives at Fortune 500 companies, and it does always make me feel kind of like a loser!
But also I think about how much crap they put up with in those roles, and how relatively low-stress my life is, and then I feel better about it. They're cut out for that kind of job, and I most certainly am not.
If I may ask, any pointers on what the stress level in your role has been like, currently and over the years, and what the cumulative difference is between your compensation and your colleagues'?
Not the GP, but I'd guess the difference is an order of magnitude for both stress and comp.
This question makes me groan.
I see a remarkable number of people who have good personal experiences with Pat but I've got this concern.
Intel's technical failure with 10nm has gone hand-in-hand with financial success with 14nm. That is, without 10nm chips on the market in a meaningful way they've been able to raise prices for 14nm parts -- Intel put up better financial numbers than ever in a time when it has not been investing in future success.
VMWare was a big thing in 1998 but it was obsolete by the time Pat got involved -- a hypervisor is naturally part of the kernel and there is no way cloud providers are going to spend their margin on VMWare. Yes, many people in business are terrified of open source software and want a proprietary product so they have somebody to sue (VMWare) or they need somebody to hold their hand (Pivotal.) Either way, VMWare and Pivotal are units that can be merged and spun off whenever a company based in Texas (Dell) or Massachusetts (EMC) wants to look like it has a presence in San Francisco -- you see the vmware logo on CNBC every morning and somebody thinks the king is on the throne and a pound is worth a pound but that doesn't mean anything in the trenches.
Like Intel in the past 10 years, VMWare is entirely based on a harvesting business model. In the short term Intel made profits by pandering to cloud providers; but in the long term cloud providers invested their profits in better chips. (What if Southwest Airlines had developed a 737 replacement designed from the ground up for a low cost airline?)
Pat might be able to keep the game of soaking enterprise customers going for longer, but someday the enterprise customers will be running ARM and the clients will be running ARM and the coders will be thinking "did they add all of those AMX registers just to put dead space on the die to make it easier to cool?" and falling in love again with AVR8.
Speaking as an ex-Pivot (but only for myself), a couple of thoughts:
- VMware grew almost 2.5x under Pat. Obviously their core franchise is about harvesting, but new adjacent products like NSX and VSAN helped a lot with growth. They just took a long time to get traction given the customer base, who don’t like change. VMware Cloud on AWS has been surprisingly strong even to skeptics like myself.
On the other hand, companies would get rid of VMware if they could, and they tried and failed with OpenStack. The hypervisor is commodity, but the overall compute/network/storage private cloud system isn’t trivial. Turns out “holding hands” whether by software or services is pretty valuable?
Public cloud of course is a substitute, though private clouds and data centres have survived and thrived as well (for now).
- Pivotal had a boutique software and services model that would be difficult to scale as a public company given the current shifts in the enterprise fashion away from productivity-at-any-cost (PaaS and serverless) and towards perceived low cost building blocks (Kubernetes and its ecosystem).
But it would be a gross oversimplification to suggest this was merely a vanity project for EMC and Dell. It took in $500m+ annually on open source software (and another $300m in services), which is no small feat, though still minuscule given what surrounds it. But anything new has to start somewhere. Not enough to impact the mothership's balance sheet, but there was something special there that could be nurtured. Whether it can be, or whether the differences matter enough, is anyone's guess.
Pat could take Intel in a direction we don’t anticipate. VMware was written off for dead when Pat came on but he managed to buy it another 10-15 years and at peak a doubling of the stock price. He’s probably learned from that.
This is spot on. And I think the TL;DR of the hypervisor argument goes as follows:
The hypervisor is a commodity, but management and support of hundreds or thousands of them is not. You can either pay people to support them and fix the software when it breaks, or you can pay <vendor name here>. Given that the former requires expertise and planning, it's often more cost-effective to go the latter route.
Disclaimer: I'm employed by VMware (less than 1 year) and chose to come here based on pivots I feel they are making.
Any thoughts on what made him special (either from your perspective or from those choosing to promote him)?
Keeping in context that I've not worked with him in (gulp) 30 years! At Intel he was one of the folks who put problems into the whole picture. So a customer would say, "This chip doesn't work"; over at Systems Validation we would get (hopefully) enough information to re-create their problem on an Intel-built board, and Pat's job (at the time) was to coordinate between us, design engineering, and marketing to figure out how to tell the customer to proceed.
The "actual" problem could be anything from the customer misinterpreting the datasheet (Marketing/Comms problem), to insufficient testing (Factory/Production problem), to chip function (Design Engineering problem). As a "new college grad", or NCG in the lexicon, I admired how he dug out details from various folks to get to the real problem. He always had ideas for things Systems Validation (SV) could do to maybe trigger the problem that made sense to me. He really embodied the philosophy of fixing the problem not fixing blame on some group.
People that can ask the right question are much more impactful than people who can “only” find the answer. (NOT trivializing the smarts it takes to find the answer. )
My Pat Gelsinger story: I worked in a successor org to MIPO, in those days called MPG. One day during pre-silicon validation on the Pentium II, one of my NCG’s came to me and said: “An old test from the historical test archive is failing in the simulator. I want to find the author and ask him about it. Do you know a Pat Gelsinger?”
Me: “Well, he is an Intel Fellow now, so he might not remember what that test was supposed to do. “
"Notable absent from that list is he fired Pat Gelsinger. Please just bring him back as CEO." -  2012 on HN, when Paul Otellini Retired.
"The only one who may have a slim chance to completely transform Intel is Pat Gelsinger, if Andy Grove saved Intel last time, it will be his apprentice to save Intel again. Unfortunately given what Intel has done to Pat during his last tenure, I am not sure if he is willing to pick up the job, especially the board's Chairman is Bryant, not sure how well they go together. But we know Pat still loves Intel, and I know a lot of us miss Pat."  - June, 2018
"This is the same as Intel pushing out Pat Gelsinger. The product people get pushed out by sales and marketing. Which are increasingly running the show at Apple."  30 Days ago.
And numerous other references since 2009. Many more around various other forums and Twitter. I am getting quite emotional right now. I can't believe this is really happening. (I am writing this with tears in my eyes!) I guess Andy Bryant retiring makes the decision a little easier. And Pat has always loved Intel. I guess he is pissed those muppets drove it into the ground.
This is 12 years! 12 years to prove a point! Consider 4-5 years of lead-time on work since he left in 2009. That is 2014. Guess what happened after 2014?
Maybe it is too little, too late? Or maybe this will be another Andy Grove "Only the paranoid survive" moment?
The King is back at Intel. Despite being a fan of Dr Lisa Su, I am a little worried about AMD.
Pat's a great guy and first class engineer, but I think it's just too late for him to turn Intel around at this point. The problems have become too entrenched. They needed someone like him at the helm 10 years ago. Apple's M1 shows that the world has passed Intel by.
I wish him luck, though.
"It's just too late to turn them around" is pretty much what people said about Apple when Steve Jobs returned, though (in fact, it's more or less what I thought as well).
As Adam Smith said, "there is a great deal of ruin in a nation", and likewise, big companies generally get more opportunities to reinvent themselves than one would expect.
Similarly some (many?) thought Microsoft was a lost cause due to Ballmer (sales/business development background), but they seem to be doing okay under Nadella (engineering background). That said, only time will tell.
> As Adam Smith said, "there is a great deal of ruin in a nation"
Tired brain read this as "there's a great deal in a ruin of a nation." Close enough, I guess?
How often do big companies like Apple and Intel succeed in righting a sinking ship though? You can’t keep pointing to the one guy that succeeded.
Yea, I'm not sure what their plan is other than to keep trying to sell to institutions until everyone makes the switch to someone else.
If they actually want to survive long term, there are two paths as I see it:
A) Be legitimately better than AMD; this could include opening up the Management Engine, much higher performance chips at lower price points, or some sort of space magic utilizing their Altera acquisition.
B) Embrace RISC-V and push it to laptops and desktops HARD, while not pulling the Microsoft Embrace Extend Extinguish™ play. If they go this route then their stock becomes an exceptionally strong buy IMO.
They could give up on design and just try to be an American TSMC.
There's a lot of interest in that from a national security perspective anyway.
I would love a RISC-V laptop, but only if the vector and matrix extensions can be used for training and inferring neural networks. Having separate vectorized processing for the CPU (SSE) and a neural engine doesn't make sense to me.
I want to be able to train NNs with a laptop that can normally last 20 hours.
Imagine what Apple does to make x64 work on ARM. Intel, having Altera, could do x86 translation to FPGA gates... It wouldn't take whole programs, but it could reconfigure itself to run the most commonly executed routines on the FPGA.
The same could have been said about AMD 7 years ago. Then Lisa Su came.
Don't forget Keller's contributions there. It was a much smaller operation so it was easier for a couple of people to change it. Intel, on the other hand, is a supertanker that's going to be very difficult to turn around in time.
I recently purchased a MacBook Pro with an M1 chip and I'm taken aback at the value/dollar, compute/power, compute/heat, compute/noise. I love the MBP-M1.
I realize that WinTel machines aren't going anywhere because of entrenched business use -- but between NVIDIA GPUs for heavy workloads and M1s for day-to-day activities -- what is the feeling inside Intel right now? Is this like a Microsoft-Netscape moment in the 90s?
An M1 equipped Mac Mini with 16 GB of ram and some more storage is still more than £1000. The M1 has to get a lot cheaper to be anything more than a warning shot for Intel.
What it should definitely be, however, is a death-sentence for x86. I bet Intel are proud of it, but ultimately they must be at least somewhat jealous of those who don't have to use it.
Maybe more like Apple in the 90s?
Interesting - you think he can pull this off?
It'll be interesting to see what happens, I had written them off as on the path of inevitable decline and irrelevance.
If they have someone as CEO who understands the existential threat they're facing from everywhere, maybe they'll survive.
The irony of the other top comment is that the AMD threat wasn't the competitive threat that mattered. AMD is also screwed.
I'm old enough to remember writing IBM off in the early 90's and Microsoft off in the late 2000's. These companies had one thing in common: giant war chests of cash and legacy businesses that were still pumping out money by the truckload. It's not like a startup/small business that's riding on a razor's edge, these companies have so many resources they can keep searching for a way "out" of their predicament for a very long time until they finally land on the right combination of people, vision and timing.
We see it now with Google being the "evil empire". They haven't had a real hit since Android and seem to be floundering, but online ad revenue is such a huge geyser of cash that they're gonna be fine for a very, VERY long time.
I think you’re right that they’ll be around a while, but Microsoft (and Apple) were both saved by CEOs in fairly dramatic fashion.
I’d argue IBM largely is irrelevant today, but they technically still exist.
Google is lucky they have an ad monopoly because they don’t really have a coherent company vision, I wouldn’t buy their stock. They might get lucky with their deepmind purchase.
I don’t disagree with you, but if I was choosing companies in a strong position today it’d be Apple, Amazon, and Microsoft. I wouldn’t make a long term bet on Google or Intel.
Why would amd be screwed?
My long-term bet is on M1 and ARM/RISC-V for the future.
x86 is on the way out and can't compete on power or performance. In the end RISC won, it just took a while to get there.
As things trend that direction AMD and Intel don't really have much to offer, they're competing on legacy tech.
With TSMC providing fabs for designs from anyone, the serious players don't need AMD or Intel. Apple has a massive lead here, but others watching this will follow it.
Devil's advocate from a throwaway, for reasons: Pat foisted VMWare on a small startup trying to find its engineering footing culture-wise, after being invited in to advise on the business side (loan his name, mostly).
Cost a bunch of engineering time and forward motion, internal politicking. Eventually it got binned after months of not getting what we wanted out of it. There was no technical reason for it.
Maybe hardware really is his thing, but that quid pro quo hurt productivity.
I don't think I've seen a thread here before where the top 2 posts are such opposites of each other.
>A few years ago, back in 2016, Intel did a “RIF” (reduction in force) of about 11%. Intel had previously done a significant reduction way back in 2006 of about 10%
>In an industry that runs on “tribal knowledge” and “copy exact” and experience of how to run a very, very complex multi billion dollar fab, much of the most experienced, best talent walked out the doors at Intel’s behest, with years of knowledge in their collective heads
Bottom line: Intel created the hole by itself and jumped into the deep end.
Plus, Intel has for many years paid only slightly above-average wages. If I recall correctly, they targeted paying at about the 55-60th percentile. The problem with that is that the FAANG companies will literally pay their engineers twice that.
Many of the competent people I knew at Intel have left (not all), while many of the incompetent people I knew are still there.
Management rule of thumb: for every % you cut from the bottom, you lose a % from the top. Cut 10% of the workforce, and 10% of your best people say "yes" to the next headhunter trying to poach them.
It's actually more complex than that. When a big company wants to reduce its workforce, it will fire the bottom and offer retirement packages to its oldest employees, who are above or near retirement age, sometimes as much as a full year of salary.
The intuition is that older employees cost more, so by cutting them you can reduce your payroll more significantly while doing what looks from the outside like smaller cuts. This is often viewed favourably by investors because on paper it doesn't look as though the company is stalling (head count is still high, costs are down). The obvious issue is that these older employees are not easily replaceable, and you end up losing more velocity in the long run than originally anticipated.
The above is more applicable to traditional blue-chip businesses where workforce movements are more limited. For software engineering (which Intel is not really) your assumption is correct and once cuts are announced a lot of your great engineers will jump ship.
Those older employees better be replaceable! Many will be gone in a few more years because they retire anyway, so you should have a plan in place to save their knowledge.
The above applies to everyone. When I was an intern, the company folklore was full of horror stories because the last guy who knew anything about a very profitable product had died suddenly. (The product was for mainframes: clearly near end of life, but it was still mission-critical for major customers and still had to get minor updates.)
I've also known important people to find a better job. Even when an offer of more money gets them to stay, my experience is they always wonder if they made the right decision and so are never again as good as they were before.
Moral of the story: don't allow anyone in your company to become irreplaceable. This is good for you too: it means you won't stagnate doing the same thing over and over.
I work (but not for much longer) for a tech company that has done an ER package twice (and is well known for layoffs). The result is that the company has a sort of corporate Alzheimer's. It knows the inventory of all the things it used to know, but doesn't actually seem able to recall any of it. The trajectory and outlook are not good.
I don't think tech companies can afford this practice. So much knowledge resides in their senior talent, and the hard-won experience-based understanding and things gleaned through opportunistic exposure that they seem to voluntarily surrender.
Pat was a "boy wonder" at Intel and could do no wrong — until Larrabee. I was working at Intel at the time and remember always assuming that Pat would someday be CEO. His departure came as such a shock to a lot of us, as does his return.
He might have what it takes to turn Intel around.
There was also Intel's whole pursuit of frequency -- they demoed, I think, a 10GHz chip at IDF at one point (and Itanium was essentially an ILP-oriented design) -- and their resistance to multi-core. Some of it was doubtless Intel convincing themselves they could make it work. But they were also under a lot of pressure from Microsoft, who didn't have confidence that they could do SMP effectively -- at least that's what a certain Intel CTO told me. (Ironically, for various reasons, multi-core didn't end up being nearly the issue a lot of hand-wringing people at the time thought it would be.)
I’m not sure Itanium was a technical failure; to me it always was a business-model failure, as that CPU was co-developed with HP and essentially became a dedicated HP-Oracle box, and by the time the ecosystem was opened up it was too late.
The heavy reliance on the compiler for ILP was an “odd-choice” but not something that was unsound in principle.
If the ecosystem had been more open from the get-go and more vendors had been involved, it would have had a much better chance of taking off.
And if nothing else at least it was something new.
The biggest disappointment I have with Itanium is that it, and later Larrabee/Xeon Phi, kinda pushed Intel even further into their own little x86 box when it came to processing units.
I think that failure is also why they haven’t really done anything interesting with Altera.
They do have Xe graphics now. It's the closest Intel has come to a competitive non-x86 part in recent memory. It feels kind of forced, though: everyone else has their own CPU+GPU now, including Apple/Nvidia in addition to AMD/QC, so why wouldn't Intel? They also have oneAPI.
It would be interesting to see an explicitly JIT-based approach to ILP.
From my personal memory at the time, early NUMA multicore on Windows wasn't the smoothest sailing.
It wasn't. A few years earlier, I was the product manager for a line of large NUMA systems, which admittedly had far larger near-far memory latency differences than multicore systems do. Commercial Unix systems could still have issues for write-intensive workloads, but Windows was pretty much unusable for configurations that had far memory. Things were likely better by the mid-2000s, but Windows was definitely still behind Unix in this regard. (Don't really know where Linux was at that point, but IBM at least had done work in OSDL on various scale-up optimizations.)
It seems it's not great now either, seeing how the first iteration of AMD's Threadripper performed badly on Windows.
In fairness it's not like Microsoft are alone in that. Single-thread performance is still incredibly important. The Mill focuses on single thread performance almost exclusively for that reason: nobody ever made the mythical auto-parallelising compilers we were all supposed to have by now, not even for Haskell. The big wins for exploiting parallelism in most ordinary software have been just scaling up lots of independent single-threaded transactional workloads via sharding, and massively concurrent runtimes like the JVM where you can move all the memory management workload onto other cores and out of the critical paths. In terms of ordinary programmers writing ordinary logic, single-threaded perf is still where it's at which is why the M1 has 4 big super-wide cores and 4 small cores rather than 32 medium cores.
The trillion dollar question is:
Is the board leaning into the usual MBA Moves 101 and turning Intel into a "services company", gradually going fabless and milking those sweet patents, OR will they put the work boots on and start building an actual tech company, with the people who can actually save them on the payroll, cutting the usual contractor meat grinder and inviting the vast armies of middle-management and marketing drones to leave?
> Is the board leaning into the usual MBA moves 101
Probably not, considering the guy they're throwing out the back door is a finance dude with an MBA and Gelsinger was/is an actual engineer.
BK was an engineer too but still managed to pull an Elop.
BK wasn't even remotely in the same class as Gelsinger engineering-wise.
Pat Gelsinger fits the second option better, he has an actual engineering background and worked on some of the most important Intel products early in his career.
Swan was utterly the first option.
Pat Gelsinger didn't do that at VMW, and it seems unlikely he'll do it at Intel. He's an engineer.
All of the other fabs are so busy they can't handle Intel's chip production plus Intel's technology is wound around their own labs. Switching wouldn't be easy and might end up being a failure and taking the company with it.
The argument I've seen for outsourcing is that everyone uses machines from ASML et al anyway, so retooling to run a different company's silicon may not be as impossible as it seems.
I think there's an issue with helping Intel temporarily, because if you're TSMC you'd rather use your capacity to serve long-term partners rather than helping Intel bridge the gap to 7nm only to get dropped a couple of years from now when they get their chips in order.
To me, that move makes a lot of sense. But judging from the other comments, I'm the only one with that assessment.
In my opinion, the secret sauce that makes Intel dominate certain industries is software. And it has been for some years already.
If you need really fast mathematical number crunching, e.g. high-frequency trading or realtime audio filtering, then you need MKL, the Intel Math Kernel Library.
If you want to further reduce latency with parallelism, you need TBB, the Intel Threading Building Blocks.
Raytracing? Intel embree.
Once you are locked in that deeply, the raw Intel vs AMD performance comparison becomes meaningless. You only care about how fast the Intel libraries run.
So a CEO with experience building high performance low level software seems like an amazing fit.
Edit: And I almost forgot the Intel compiler, used in pretty much every PC game to speed up physics. Plus some people have seen success replacing GPUs with icc+AVX for huge deployment cost savings in AI.
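On the lock-in point, one nuance worth noting: most numerical code reaches MKL through the standard BLAS/LAPACK interfaces, which is exactly why drop-in alternatives can exist. A minimal sketch in Python/NumPy (which backend NumPy actually dispatches to -- MKL, OpenBLAS, BLIS -- is a build-time detail, not visible in the code):

```python
import numpy as np

# The same high-level call dispatches to whichever BLAS backend
# NumPy was linked against (MKL, OpenBLAS, BLIS, ...).
a = np.random.rand(512, 512)
b = np.random.rand(512, 512)
c = a @ b  # a dgemm call under the hood

# Sanity check: one entry matches a naive (slow) reference computation.
ref = sum(a[0, k] * b[k, 0] for k in range(512))
assert abs(c[0, 0] - ref) < 1e-8

# The backend in use is reported by the build configuration:
np.show_config()  # prints BLAS/LAPACK library info
```

So the "lock-in" is less about source-level dependence and more about whether the alternative backends actually match MKL's performance on your workload.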
To me this description sounds like a specialist high performance computing company rather than a consumer technology company. That may be a perfectly reasonable market to be in, but is that type of company worth $200bn? I'm not sure.
Roughly 40% of their revenue is consumer chips where, apart from some games optimisation, they are no longer standing out from the crowd, and the leader is arguably Apple, with AMD doing well. The next ~30% of their business is servers, where there may be a significant number of HPC clients, but the bulk of this is again likely to be VMs running non-Intel specific software, and this market is starting to realise that Intel is nothing special here.
Looking at their revenue breakdown, I struggle to put more than 20% into the things that you mention they are great at. Should they focus on this? It would lose them much of their market cap if they did.
>Roughly 40% of their revenue is consumer chips where, apart from some games optimisation, they are no longer standing out from the crowd, and the leader is arguably Apple, with AMD doing well.
You lost me at Apple. Apple owns around 15% of the PC market space and almost the entirety of that is Intel-based systems. Outside of HN, nobody cares about the M1 chip, it isn't a selling point to my mom or her friends. If someone at the Apple store recommends it they might buy it instead of an intel-based system but it definitely isn't something they're seeking out.
The only threat Intel has right now in the consumer space is AMD, and it's a very real threat. AMD won both Sony and Microsoft console designs, and the mobile Ryzen 5000 chips released at CES look to have enough OEM design wins to put a serious hurt on Intel in 2021.
Even if Apple goes 100% M1, there's the other 85% of the market that Intel is likely far more concerned about.
I get your point, but I think the M1 is more significant as proof of what is possible than because I think everyone will buy a Mac.
I can absolutely see Qualcomm offering laptop chips off the back of the M1's success. They may not be as good, but they might be much cheaper. I can also see Microsoft pushing Windows on ARM harder, and rolling out their own chips at some point.
Also once the market gets "used to" multi-architecture software (again), I think we'll see a renaissance of chip design as many more players crop up, because of the lower barrier to entry.
You misunderstand the point of the M1 out of Apple, and for that matter the graviton2 instances out of AWS. What was demonstrated in the marketplace is that the biggest tech companies are now able to develop in-house processors that are more cost efficient and more performant. These processors are based on ARM and have minimal overhead licensing costs, as compared to buying Intel or AMD chips for their vast fleets / products.
If AWS and Apple can do it, soon other very large companies will, but in a few years, even OEMs will be able to develop their own chips. The market for high end gaming is unlikely to be touched, but the vast consumer market is going to be eaten by custom made ARM-based chips.
So in a world where processor design becomes a commodity, what does that mean for Intel and AMD? And what does that mean for the overall datacenter, consumer markets?
>it isn't a selling point to my mom or her friends
Gargantuan battery life isn't a selling point? For laptop? In what universe?
My mom would love to be able to use the same apps on her phone and her computer. I've been thinking about suggesting her next computer be an apple one since she got her first iphone and this makes it easier. Your Mom May Vary.
I agree with your market breakdown, but surely not with your assessment.
In the consumer segment, you have regular people trying to make vacation videos with software like Adobe Premiere and Adobe Media Encoder, or Magix. Nvenc quality is bad. AMD is horribly slow. The only fast high quality encode is with Intel's dedicated CPU instructions, which both apps heavily promote to their users.
And the 30% that you mention that run VMs... Wouldn't they be pretty happy if Intel added dedicated CPU instructions to make VMware better?
I agree that for the work that I do, AMD is as good as or better. But people doing highly parallelizable tasks like compiling are the minority.
I think you might over estimate the prevalence of video editing software like this. Adobe don't appear to sell consumer versions anymore, it's only pro subscriptions now. Magix is sold at a "vacation video friendly" price, but doesn't mention Intel in their marketing material.
I just don't think the market for home devices is thinking about their video encoding time when they buy a laptop, but I do think they'll use an M1 Mac and find it surprisingly fast, or hear from a friend or family member that they are really good.
Intel just haven't been optimising for the main user experience seen by these people, or those writing "normal" server software either. They've been pushing AVX512 instead, which looks good for video or things like that, but not for regular use-cases.
> AMD is horribly slow
Not sure where you're getting that these days? Absolutely in the days of Bulldozer, but AMD's Zen 3 architecture has taken even the single core lead from Intel, not to mention the multi core lead they've held for several years now.
> AMD is horribly slow. The only fast high quality encode is with Intel's dedicated CPU instructions,
You might need a source for that.
>if Intel added dedicated CPU instructions to make VMware better
They (and AMD) did years ago. Intel VT.
The compiler tricks can only get you so far. I administered an HPC cluster, and we had a lot of software dependent on MKL and BLAS. However, with the substantial performance boost AMD seems to put out, open-source libraries like BLIS and OpenBLAS are attempting to fill the gaps. Trust me, no one likes the Intel lock-in if there is an alternative that is even close in performance.
> In my opinion, the secret sauce that makes Intel dominate certain industries is software. And it has been for some years already
Intel's secret sauce is inertia.
The assumption has been that Intel is not challengeable, and that the world doesn't need a company to dethrone it either.
But that assumption is no longer true, and the counter-movement is in full swing.
The future of computing is not on the CPU, if you ask me. It will move from general computing to heterogeneous computing, and possibly application-specific chips/FPGAs. MKL is fast, probably, but GPUs and ASICs would be even faster.
> If you want to further reduce latency with parallelism, you need TBB, the Intel thread building blocks.
That's not how latency works, and there is nothing too special about Intel's TBB library. It is a big, bloated group of libraries that doesn't actually contain anything irreplaceable. Don't be fooled by marketing or by people who haven't looked under the hood. It should also work on AMD CPUs.
> Raytracing? Intel embree.
Embree is a cool convenience, but it also doesn't marry anyone to Intel CPUs.
The CEO of a semiconductor company needs to have an engineering background, IMO. The tech is too complex and too important to the business to have a CEO who doesn't understand the nuances. Wish Pat all the success at Intel. We need Intel to do better.
Disclosure: I worked at AMD for about a decade, although that's a while back now. It is traditional in semiconductor companies (or was, anyway) to have a triumvirate:
1) the "outside" guy (sales, know the customer) 2) the "inside" guy (operations, now the employees) 3) the "tech" guy
Any of these three can run the company, but whichever one it is, they need to have the other two near at hand, and they need to listen closely to them. The problem comes when, as at Intel and perhaps also at Boeing, you have options (1) or (2) in charge, and they're not listening to the person who is position (3) in the triumvirate, or they don't have a triumvirate at all. If the person in position (3) is in charge (as at AMD currently), they will still need to have experts in (1) and (2), and they will need to listen to them.
I agree, that is why Lisa Su runs AMD so well
Gelsinger earned a master's degree from Stanford University in 1985, his bachelor's degree from Santa Clara University in 1983 (magna cum laude), and an associate degree from Lincoln Technical Institute in 1979, all in electrical engineering.
I'd call it an engineering background.
I've understood it as a critique of a previous CEO, not the new one.
Wasn’t he Andy Grove’s protégé?
Pretty damning for Bob Swan when a $200B market cap company jumps +8% on news that you are stepping down.
Pretty sure he gives zero fucks about this considering the juiciness of his golden parachute.
The only gig where you get rewarded handsomely even if you fail.
But us customers can watch Intel succeed again, rather than squander 7 years.
This x 1000.
Is there any security to short Bob Swan? Seems like a good bet.