I’m considering building a new machine soon and was looking at the Intel Arc GPUs as a possibility. Anyone have experience using them in their system? I’m on Arch btw
I have an A770. The only issue I’ve had with what little gaming I do is that CS2 ran pretty terribly, although I tried again last night and it seemed much better.
Intel has just released a driver update to address this. It’s something to do with a translation layer implementation that has been massively improved, giving a 500%+ performance boost.
500%+ performance boost
To one game. Most others tested have seen a 5-15% increase in performance, and a couple have had 50% increases.
That’s still quite massive for a driver-based fix alone.
Appreciate it. It sounds like with the new announcement they’re putting quite a bit of support behind it, so I’m optimistic that improvements will come quickly.
I’ve been running an A770 Limited Edition on Arch for a year and I’m happy with it now. It was a rough start, with issues ranging from glitches and crashes to HDMI and DisplayPort audio/VRR problems, but these days it’s pretty solid. VRR works fine on my DisplayPort 144Hz 4K monitor. Most games perform pretty well, but temper your expectations: the A770 is a midrange card.
I can play Overwatch 2 at 4K 144Hz on low settings just fine and I don’t see many frame dips. It’s not noticeable when it does dip, thanks to VRR. CS2 performance isn’t amazing, but at low settings at 4K I get between 100 and 160 fps depending on scene complexity. I have FSR turned on. In Cyberpunk I also have FSR on; it can dip down to 20 fps when I’m out in the desert with the city in view, but it’s usually 40 to 60.
Thanks for this. I’m on 1440p, so hopefully the performance will be a bit better. The A770 seems to have great price-to-performance though, which puts it near the top of my list.
Glad to hear that support is solid on Arch
I upgraded from a 1440p 144Hz screen last month. The card works well at 1440p and you won’t need to rely on FSR as much as on a 4K 144Hz screen.
I wasn’t able to enable VRR on my monitor (which has FreeSync). I’m using KDE Wayland on Debian Testing, just wondering if you knew of a workaround or something?
What kernel are you using? Debian tends to lag behind with kernel updates, which makes it a bad choice for running new hardware. I switched from Debian to Arch when I got my A770 because, at the time, Debian’s latest kernel, even in sid, didn’t support Arc at all, while it worked fine on Arch.
I’m running 6.5.10, also with an A770. I could maybe try compiling 6.6 later, but I thought 6.5 seemed new enough.
Hmm, 6.5 should support VRR just fine yeah
Yeah, no change with 6.6. I guess I’ll probably open an issue somewhere when I have the time to figure out what’s broken.
Just to make sure, you’re using DisplayPort, right? I don’t think the Arc cards support VRR over HDMI. The HDMI port on the Arc is actually a built-in DisplayPort-to-HDMI converter, and I don’t think any of those converter chips support VRR modes.
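If you want to see what the kernel itself thinks, here’s a rough Python sketch that walks the connectors under /sys/class/drm and prints a vrr_capable attribute where it exists. Big caveat: I’m not sure every driver/kernel exposes that file in sysfs (it may not show up on i915), in which case you’d have to read the KMS property with something like drm_info instead.

```python
#!/usr/bin/env python3
# Rough probe: list DRM connectors and report vrr_capable where the kernel
# exposes it in sysfs (assumption: not every driver publishes this file).
from pathlib import Path

for conn in sorted(Path("/sys/class/drm").glob("card*-*")):
    status_file = conn / "status"
    status = status_file.read_text().strip() if status_file.exists() else "unknown"
    vrr_file = conn / "vrr_capable"
    vrr = vrr_file.read_text().strip() if vrr_file.exists() else "attribute not exposed"
    print(f"{conn.name}: status={status}, vrr_capable={vrr}")
```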
Yep, it’s definitely using DisplayPort!
You do know you can add the testing, sid, and experimental repositories, right? Sid and experimental have super new kernels/versions…
Yeah, I’ve noticed occasional regressions in video decode performance between kernel releases but they tend to fix them in the next release.
Otherwise smoother sailing than Nvidia for sure.
Does anyone know if the drivers are open source?
I’ve heard conflicting claims online, and I saw that Phoronix states that they are, but their article doesn’t provide any sources backing up that claim.
https://github.com/intel/media-driver/
https://gitlab.freedesktop.org/mesa/mesa/-/blob/main/src/intel/meson.build?ref_type=heads
https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git/tree/drivers/gpu/drm/i915
Yeah. The only thing missing is the GuC/HuC firmware source, and loading it should still be optional.
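If anyone wants to see what’s actually on their system, here’s a rough Python sketch that lists the GuC/HuC blobs shipped by linux-firmware and the i915 enable_guc setting. The /lib/firmware/i915 path and the enable_guc module parameter are the usual places on mainstream distros, but exact filenames depend on your card and distro, so treat this as a sanity check rather than anything definitive.

```python
#!/usr/bin/env python3
# Sanity check: list GuC/HuC firmware blobs and the i915 enable_guc setting.
# Paths are the usual ones on mainstream distros; adjust if yours differs.
from pathlib import Path

fw_dir = Path("/lib/firmware/i915")
if fw_dir.is_dir():
    blobs = sorted(p.name for p in fw_dir.iterdir()
                   if "guc" in p.name or "huc" in p.name)
    print(f"GuC/HuC blobs under {fw_dir}: {len(blobs)}")
    for name in blobs:
        print("  " + name)
else:
    print(f"{fw_dir} not found")

# enable_guc is the i915 module parameter controlling GuC/HuC loading.
param = Path("/sys/module/i915/parameters/enable_guc")
if param.exists():
    print("i915.enable_guc =", param.read_text().strip())
else:
    print("enable_guc parameter not available (i915 not loaded?)")
```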
thank you for the links/references btw! +1
As a general rule for new hardware from Linux-friendly companies, you’re pretty much in the best possible position already by using a rolling release distribution. It’s the same with AMD, where reports of bugs from 6 months ago are basically ancient history by now.
Glad I’m not the only one with this question. Feels like it’s difficult to find up-to-date information on the performance of these Arc cards on Linux; I’d like to support Intel’s move into this space but it’s hard without knowing how drastically it’s going to affect my gaming performance. 😅
I’m glad to hear the situation seems to be rapidly improving. I may pick up an A770 yet.
I would love to upgrade to one, but from the tests I’ve seen they have exceedingly bad idle power draw. Given that the card would idle most of the time, I don’t really want to waste power on it when Nvidia and AMD manage to stay far lower.
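If I do end up trying one, I’d at least check whether PCIe ASPM is actually active first, since from what I’ve read Arc’s idle draw depends heavily on ASPM being enabled in the BIOS and in the OS. A rough Python sketch of the check (standard sysfs paths, but which per-device files show up depends on your kernel and platform):

```python
#!/usr/bin/env python3
# Quick look at PCIe ASPM state, since Arc's idle power reportedly depends on it.
# Reads standard sysfs paths; per-device files vary by kernel and platform.
from pathlib import Path

policy = Path("/sys/module/pcie_aspm/parameters/policy")
print("ASPM policy:", policy.read_text().strip() if policy.exists() else "unavailable")

# Per-device ASPM controls live under each PCI device's link/ directory.
for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    l1 = dev / "link" / "l1_aspm"
    if l1.exists():
        try:
            print(f"{dev.name}: l1_aspm = {l1.read_text().strip()}")
        except OSError as err:
            print(f"{dev.name}: l1_aspm unreadable ({err})")
```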
It was pretty much plug and play for me, I don’t really play much but it’s worked for any game I’ve thrown at it (although there was some artifacting in CS2). I’ve also done some AI stuff with it and haven’t had any issues.
Intel A350 here; can’t say I have many complaints now. A lot of the issues have been ironed out. I’m not sure if the sparse work has landed in i915 yet, but once it does I don’t think I’ll have many major issues left. I’m getting some artifacts when using gamescope, but that’s not a major issue for me since I don’t really need gamescope.
UPDATE: I picked up the Arc A750. Been driving it around for a while. Older DirectX games perform on par or often even better on Linux with Arc than they do on Windows. DX12 games see only negligible performance boosts running on Windows vs. Linux with Arc, save for some big exceptions…
Certain DX12 titles, one of which I own (Halo Infinite), WILL NOT RUN under Linux with the Arc card due to a lack of features in Vulkan. There are still some DX12 calls that have no equivalents in Vulkan, and while some games flag this feature set without actually using it and MAY be able to be tricked into running without it, any game that actually USES those features will not run under Linux with the Arc card, period. So… research your newer AAA DX12 titles first.
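One thing worth checking before buying is whether your Vulkan driver reports the sparse* features, since from what I understand missing sparse support has been a common reason DX12 titles refuse to start under vkd3d on Arc (this ties back to the i915 sparse work mentioned above). Here’s a rough Python sketch that just greps those lines out of vulkaninfo; it assumes the vulkaninfo tool from vulkan-tools is installed and won’t cover every case.

```python
#!/usr/bin/env python3
# Print the sparse* feature/property lines your Vulkan driver reports, since
# missing sparse support is a common reason DX12 titles won't run under vkd3d.
# Assumes the vulkaninfo tool (from vulkan-tools) is installed.
import subprocess

try:
    out = subprocess.run(["vulkaninfo"], capture_output=True, text=True).stdout
except FileNotFoundError:
    raise SystemExit("vulkaninfo not found; install vulkan-tools first")

for line in out.splitlines():
    if "sparse" in line.lower() and "=" in line:
        print(line.strip())
```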
No personal experience but I heard support is good