Capable but not expensive, Intel Arc A750 vs. RX 6600 in gaming tests
Techtesters have an interesting video comparison between two graphics cards under $250, the Intel Arc A750 and the Radeon RX 6600 non-XT from AMD. Neither is a high-end gaming card, but for many gamers out there, either is still a reasonable option.
With the recently announced price drop for the Arc A750 to $249, Intel’s option is now looking even better, especially considering it is based on a relatively new architecture that debuted only last year. This model uses Intel’s best Alchemist ACM-G10 GPU, but it has been cut down to 28 of the 32 available Xe-Cores.
That said, both cards offer similar GPU core counts (the Radeon has 28 Compute Units), but Arc has more FP32 cores (3584 vs. 1792). Furthermore, both GPUs have the same memory capacity of 8GB GDDR6, but Arc has a memory bus twice as wide. On paper, the numbers are on Intel’s side, but not in all cases. The A750 has one major disadvantage: a TDP of 225W, far higher than the RX 6600’s 132W. But more on that later.
Intel Arc A750 vs Radeon RX 6600 in Gaming Tests, Source: Techtesters
Techtesters pitted both cards against each other at 1080p and 1440p resolutions with gaming settings set to high or ultra. These are fresh results with the latest drivers on a modern Ryzen 5 7600X platform. The A750 Limited Edition ended up around 3% slower on average at 1080p than a Radeon RX 6600 from Gigabyte (not overclocked), but Arc’s higher memory bandwidth shows its strength at 1440p, where Intel leads by 9% on average.
In gaming, the Intel A750 consumes 167W, which is not as bad as the official TDP figure of 225W, but still higher than the RX 6600’s 121W as measured by Techtesters. However, Arc still has an issue with idle power consumption at 35W, almost 9 times higher than the Radeon.
Intel Arc A750 vs Radeon RX 6600 cost difference, Source: Techtesters
Techtesters came up with a very interesting way to present this issue: long-term GPU usage and its costs. In this case, the estimate also accounts for the time when the GPU is not actually in use.
Depending on energy prices, this might be a crucial factor when deciding on a new GPU, especially for those living in Europe, where energy costs are higher. As measured, the cost of running the A750 could nearly double the cost of the purchase over 4 years, assuming the card is used for 2 hours of gaming and sits idle for 8 hours each day, at an energy price of 50 cents per kWh.
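The estimate above can be sketched with a few lines of arithmetic. This is not Techtesters' actual methodology, just a minimal reconstruction using the power draws measured in the review; the RX 6600's ~4W idle figure is an assumption inferred from the "almost 9 times higher" idle claim.

```python
# Rough sketch of the long-term electricity-cost estimate.
# Power figures are the measured draws reported in the article;
# the RX 6600 idle value (~4W) is an inferred assumption.
def energy_cost(gaming_w, idle_w, gaming_h=2, idle_h=8,
                price_per_kwh=0.50, years=4):
    """Electricity cost (in the same currency as price_per_kwh)
    of running a GPU for `years`, given daily usage hours."""
    daily_kwh = (gaming_w * gaming_h + idle_w * idle_h) / 1000
    return daily_kwh * price_per_kwh * 365 * years

a750 = energy_cost(gaming_w=167, idle_w=35)   # measured draws
rx6600 = energy_cost(gaming_w=121, idle_w=4)  # idle draw assumed

print(f"Arc A750 4-year energy cost:  ${a750:.0f}")   # ~$448
print(f"RX 6600 4-year energy cost:   ${rx6600:.0f}") # ~$200
```

Under these assumptions, the A750's 4-year electricity bill (~$448) approaches twice its $249 purchase price, and more than half of the gap to the RX 6600 comes from the idle draw alone.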
Intel is aware of the idle power issue on Arc Alchemist GPUs. The company has presented a workaround that requires changes to Windows and BIOS settings, but it does not work in every configuration. So which graphics card would VideoCardz users choose?
Source: