Intel's Latest Driver Lets Users Allocate Up To 93% of System Memory To Arc iGPUs For Wider AI LLM Support

Intel just made it much easier to run larger AI models on Arc integrated graphics by letting users dedicate far more system memory to the GPU. A new Arc Pro Graphics HotFix driver (302.0.101.8517 – Q1.26 R2) increases the maximum shareable memory limit, which can be a game-changer if you’ve ever tried loading an AI LLM and hit a “not enough VRAM” wall.

With earlier drivers, Arc users could allocate up to 87% of system RAM to graphics. On a typical 32GB PC, that worked out to about 28GB usable for the GPU. The latest driver raises that ceiling to roughly 92% (the driver highlights cite up to 93%), meaning a 32GB system can now dedicate about 30GB to the iGPU. On a 64GB machine, the driver notes that up to 93% can be dynamically allocated, which works out to approximately 59.5GB available to the built-in Intel Arc Pro GPU. For AI workflows, those extra few gigabytes can be the difference between running a larger model natively and resorting to smaller variants, heavier quantization, or offloading more work to the CPU.
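The figures above follow from simple percentages of total RAM. A quick back-of-the-envelope sketch (the caps come from the article; actual allocation is dynamic and driver-dependent, so treat this as an approximation, not a guarantee):

```python
def shareable_gb(total_ram_gb: float, cap_pct: float) -> float:
    """Approximate iGPU-shareable memory under a driver cap.

    Simple percentage model: shareable = total RAM * cap.
    Real allocation is dynamic and varies by driver and workload.
    """
    return round(total_ram_gb * cap_pct / 100, 1)

# Old cap (~87%) vs. the new cap (up to 93%) on common configurations.
for ram in (32, 64):
    old = shareable_gb(ram, 87)
    new = shareable_gb(ram, 93)
    print(f"{ram}GB system: {old}GB -> {new}GB shareable")
```

Running this reproduces the article's numbers: roughly 28GB rising to about 30GB on a 32GB system, and approximately 59.5GB on a 64GB machine.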

This update targets Intel’s Arc Pro integrated GPUs, including products like Arc Pro B390 and Arc Pro B370, and it also supports several Arc Pro discrete GPUs across Intel’s Alchemist and Battlemage families. While the release notes don’t call out other major performance tweaks, the driver is positioned as part of Intel’s broader push for improved workstation readiness, including expanding ISV certifications for Arc Pro hardware.

From a competitive standpoint, this higher memory allocation is noteworthy. Many other platforms cap iGPU memory sharing around the high-80% range, which is still helpful for AI tasks but leaves less headroom for very large models. At the high end, platforms designed for massive shared-memory graphics can dedicate well over 100GB to the GPU when paired with 128GB of system memory, showing why memory scaling has become a key talking point for local AI.

The driver supports modern Windows 10 and Windows 11 versions, including Windows 10 22H2 and multiple Windows 11 releases from 21H2 through 25H2. On the hardware side, it applies across the Intel Arc Graphics family (Alchemist and Battlemage) and a wide range of Intel Core Ultra platforms, including Meteor Lake, Lunar Lake, Arrow Lake-S, Arrow Lake-H, and Panther Lake.

For anyone trying to run AI LLMs locally on Intel Arc iGPUs, this driver update is a practical upgrade: more usable “VRAM,” better odds of fitting larger models into memory, and fewer compromises when experimenting with modern generative AI on a workstation or capable laptop.