Press "Enter" to skip to content

Iris Xe Max (DG1): Dedicated Intel graphics should beat Geforce MX350

Last updated on March 9, 2021


Far superior to the competition in games, AI and encoding: Intel promises a lot for the dedicated Iris Xe Max, aka DG1.

A report by Marc Sauter, October 31, 2020, 5:00 p.m.

The Iris Xe Max is a dedicated graphics unit. (Image: Intel)

Intel has presented its first dedicated graphics chip in decades, the Iris Xe Max. The GPU, developed internally as DG1 (Dedicated Graphics 1), is designed for ultrabooks and positioned against Nvidia's dedicated Geforce MX350; it is also meant to keep AMD's integrated Vega 8 at a distance.

The Iris Xe Max, like the regular Iris Xe found in the Tiger Lake chips, is based on the Gen12 architecture. Specifically, it uses Xe LP (Low Power) with 96 execution units and a frequency of up to 1.65 GHz – effectively an Iris Xe with a slightly higher clock. The memory interface is also the same 128-bit bus that Tiger Lake uses.

A typical notebook pairs an Ice Lake or Tiger Lake processor with a variable amount of LPDDR4X memory, plus the Iris Xe Max with its own 4 GB of LPDDR4X. This is highly unusual for dedicated graphics units, where GDDR5 or GDDR6 is common. According to Intel, however, LPDDR4X-4266 was the better choice because it is more power efficient: even a fairly slow GDDR5 configuration would deliver the same 68 GB/s, but it would require more energy. The DG1 chip is manufactured on Intel's 10 nm SuperFin process.
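The 68 GB/s figure follows directly from the bus width and the transfer rate; a quick sanity check of the arithmetic:

```python
# Peak bandwidth of a 128-bit LPDDR4X-4266 interface, as used by the
# Iris Xe Max (and by Tiger Lake's integrated Iris Xe).
bus_width_bits = 128        # memory interface width
transfers_per_s = 4266e6    # LPDDR4X-4266: 4266 mega-transfers per second

bytes_per_transfer = bus_width_bits / 8              # 16 bytes per transfer
bandwidth_gb_s = transfers_per_s * bytes_per_transfer / 1e9

print(f"{bandwidth_gb_s:.1f} GB/s")                  # ~68.3 GB/s, Intel's quoted figure
```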

Presentation on Iris Xe Max (Image: Intel)

|                 | Iris Xe                  | Iris Xe Max             |
| --------------- | ------------------------ | ----------------------- |
| Type            | integrated (Tiger Lake U) | dedicated (DG1)        |
| Production      | 10 nm SuperFin           | 10 nm SuperFin          |
| Architecture    | Xe LP (Gen12)            | Xe LP (Gen12)           |
| Execution units | 48 / 80 / 96             | 96                      |
| Clock           | up to 1.35 GHz           | up to 1.65 GHz          |
| Interface       | 128 bit (shared)         | 128 bit                 |
| Memory          | LPDDR4X-4266 (variable)  | LPDDR4X-4266 (4 GByte)  |
| Power budget    | 12 to 28 watts           | up to 25 watts          |

Specifications of Intel's Iris Xe for Ultrabooks

To connect the Iris Xe Max to the processor, Intel relies on PCIe Gen4 x4. There is also what Intel calls Deep Link, which can shift the power budget between the SoC and the dGPU: depending on the workload, it can make more sense to allocate more of the budget to the CPU cores or to the Iris Xe Max. In latency-critical games such as League of Legends, the driver even falls back on the integrated Iris Xe, as this is supposed to improve performance. Corresponding profiles are stored and updated in the IGCC (Intel Graphics Command Center).
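As an illustration of the power-sharing idea behind Deep Link, here is a small, purely hypothetical sketch: the workload classes, the split logic and the 5-watt CPU floor are invented for illustration; only the roughly 28-watt combined budget and the 25-watt dGPU ceiling come from Intel's figures.

```python
# Purely hypothetical sketch of Deep Link-style power-budget sharing.
# Only the 28 W combined budget and the 25 W dGPU ceiling are Intel's
# figures; the workload classes and split logic are invented here.
TOTAL_BUDGET_W = 28   # shared SoC + dGPU budget in the reference design
DGPU_MAX_W = 25       # Iris Xe Max is designed for at most 25 W
CPU_FLOOR_W = 5       # assumed minimum left for the CPU cores

def split_budget(workload):
    """Return (cpu_watts, dgpu_watts) for a workload class."""
    if workload == "encode_or_ai":      # dGPU-heavy: give the Iris Xe Max as much as possible
        dgpu = min(DGPU_MAX_W, TOTAL_BUDGET_W - CPU_FLOOR_W)
    elif workload == "latency_game":    # e.g. League of Legends: render on the iGPU instead
        dgpu = 0
    else:                               # mixed default: split evenly
        dgpu = TOTAL_BUDGET_W // 2
    return TOTAL_BUDGET_W - dgpu, dgpu

print(split_budget("encode_or_ai"))     # (5, 23)
print(split_budget("latency_game"))     # (28, 0)
```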

A power budget of 28 watts or more is permissible for the SoC and dGPU together; Intel designed the Iris Xe Max for a maximum of 25 watts. In terms of power dissipation, it matches Nvidia's Geforce MX350, which according to Intel is slower on average in games: with low or medium details, titles like Gears Tactics or The Witcher 3 run at 1080p at 30 to 45 fps. For the comparison, Intel used an Acer Swift 3X (SF314-510G) with Tiger Lake and a Lenovo Slim 7 (14IIL) with Ice Lake.


Presentation on Iris Xe Max (Image: Intel)

Aside from games, Intel sees itself clearly ahead of the competition, specifically when encoding or artificial intelligence is required: thanks to Quick Sync, Handbrake uses four MFX (Multi Format Codec) engines – the two in the Iris Xe and the two in the Iris Xe Max – so that this combination is twice as fast as a Geforce RTX 2080 Ti via NVENC. With adapted software such as Topaz's Gigapixel, which accelerates inferencing via INT8 DP4A on the Iris Xe plus the Iris Xe Max – again coupling iGPU and dGPU – photos are upscaled at seven times the speed of a Geforce MX350.
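DP4A here refers to a dot-product instruction that multiplies four packed INT8 values from each operand and accumulates the result into a 32-bit integer, which is what makes INT8 inferencing cheap on Xe LP. A minimal reference sketch of that semantics (not Intel's kernel code; Gigapixel's actual pipeline is of course far more involved):

```python
# Reference semantics of a DP4A-style operation: dot product of four
# packed signed 8-bit values, accumulated into a 32-bit integer.
def dp4a(a, b, acc=0):
    assert len(a) == len(b) == 4
    assert all(-128 <= x <= 127 for x in a + b)   # operands must fit in int8
    return acc + sum(x * y for x, y in zip(a, b))

# Example: one 4-element slice of an INT8 convolution inner loop.
print(dp4a([1, -2, 3, 4], [5, 6, -7, 8]))   # 1*5 + (-2)*6 + 3*(-7) + 4*8 = 4
```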

The first ultrabooks with the Iris Xe Max are the Acer Swift 3X (SF314-510G), the Asus Vivobook Flip 14 (TP470) and the Dell Inspiron 15 (7000). All three devices should be available in Asia and North America from November 1, 2020; there is no date for Europe yet.

Incidentally, the Iris Xe Max is not Intel's very first dedicated graphics unit: the flopped i740 (Auburn) dates back to 1998, and after the turn of the millennium Intel worked on Larrabee, which was later converted into the MIC (Many Integrated Core) architecture and resulted in the now discontinued Xeon Phi accelerators.


Source: golem.de