Yeah, no, I think a lot of people don't know Intel when it comes to our strategy with AI, and so I'm happy to talk through that a little bit. So both in terms of hardware and software, we're really geared towards AI. On the hardware front, which people are perhaps more familiar with, we have our Xeon product line, which is a data center CPU that we use for inference in a lot of cases for AI workloads, and then more recently, we've announced the AI PC, and kind of started that whole category, which includes – you know, the machine has a CPU, a GPU, and then what's called an NPU, a neural processing unit… So that's exciting. That's kind of to optimize workloads on your local machine.
And then back on the data center side, we have the Gaudi product line, which is a really good performance alternative to a lot of the modern GPUs that are out there. So that's really exciting as well, really powerful data center hardware that we have.
So that covers some of our hardware. I guess the other big one that I want to emphasize as well is we have Falcon Shores coming out in the future, which is an all-purpose GPU, a data center GPU as well. So kind of leading into that is Gaudi, and we'll get more into that in the episode. But on the hardware front, we have a number of products for AI. So that's really exciting.
[00:06:05.10] And then on the software front – Intel, again, spans kind of the whole gamut of software for AI, for enabling workloads in AI. But rather than kind of going through the whole software stack that we have, I'll just talk about a couple of things that I'm excited about. So the PyTorch 2.4 release includes support for the Intel GPU. So right now that's the Max series GPU, and it will be Falcon Shores. So that's really exciting, that the upstream mainline version of PyTorch now has support for that. And then coming soon, PyTorch 2.5 will have support for the Arc GPU, which is part of the discrete GPU product line that we have, and that's also included in the AI PC. So those are a couple of exciting things with PyTorch that are happening.
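[Editor's note: as context for the PyTorch support mentioned above – upstream PyTorch exposes Intel GPUs as the "xpu" backend. The snippet below is a minimal sketch of how you might prefer that backend when it's available; the `pick_device` helper is hypothetical, written so the selection logic is testable without an Intel GPU present.]

```python
# Sketch: device selection now that upstream PyTorch (2.4+) exposes
# Intel GPUs via the "xpu" backend. The pick_device helper is a
# hypothetical illustration, not part of PyTorch itself.

def pick_device(available):
    """Return the preferred device string given backend availability flags.

    `available` maps backend names (e.g. "xpu", "cuda") to booleans.
    Preference order: Intel GPU ("xpu"), then "cuda", then CPU fallback.
    """
    for backend in ("xpu", "cuda"):
        if available.get(backend, False):
            return backend
    return "cpu"

# With torch installed, the flags would typically come from:
#   available = {"xpu": torch.xpu.is_available(),
#                "cuda": torch.cuda.is_available()}
# and you would move models/tensors with .to(pick_device(available)).
```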
And then we also have the OPEA ecosystem. That's the Open Platform for Enterprise AI. And it's kind of this open framework that we have that a lot of people can contribute to for gen AI workloads, such as chat Q&A, a code copilot, and various other gen AI examples. So yeah, those are a few of the things that I'm excited about on the software front.