The L3 finding on Linux is likely caused by the C3 entry cache flush, which goes away in our upcoming 6.1 kernel. It doesn't affect most game workloads but can be a real performance problem in some specific cases.
For those that don't know (like me, three minutes ago): gamescope [1] is a Wayland compositor custom-written for games (and, I believe, what the Steam Deck uses). It's open source, under the BSD 2-clause license.
It seems to be a conversation in which Deepak Sharma talks about needing to resubmit a patch (and a follow-up patch), rather than being the submitted patches themselves.
Cool. Can I ask why Valve/the Steam Deck uses Gamescope and not Wayland?
To my very limited understanding they both aim to do more or less the same thing, and I think we have enough fragmentation in the Linux ecosystem as it is, so to my uninitiated brain it would have seemed logical to put all that development effort into improving Wayland for the Steam Deck instead of building another compositor.
Wayland is just a protocol. GNOME and KDE both use their own Wayland compositors for example. Gamescope as a gaming focused Wayland compositor probably has more emphasis on things like latency than other compositors.
Wow I’m glad somebody is finally quantifying why the steam deck has terrible CPU performance. I thought it might be the tiny cache.
Many older games struggle to hit a consistent 60 FPS due to CPU limitations. With some testing you can usually figure out which options are mostly bottlenecked by the CPU.
There are some games, like Fallout 3, that will chug down towards 30 FPS in busy areas regardless of settings :/
I always wondered why the Dolphin (Wii and GameCube) emulator would chug on the Steam Deck. Very frequent single-digit FPS drops. My 2013 MacBook Air on Linux plays with zero drops.
It's a strange one, but try disabling SMT and limiting cores to 3 while playing Dolphin. That solved all of my issues. I'm guessing the Linux version has some issues when limiting cores.
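If you'd rather script that than flip settings by hand, here's a rough sketch using Linux's scheduler-affinity API. The `dolphin-emu` binary name is an assumption and may differ on your install, and disabling SMT itself needs root (e.g. writing `off` to `/sys/devices/system/cpu/smt/control`), so the sketch only handles the core-pinning half:

```python
import os
import subprocess

def run_on_cores(cmd, cores=(0, 1, 2)):
    """Launch cmd with its CPU affinity restricted to the given cores (Linux-only)."""
    def limit_affinity():
        # Runs in the child just before exec, so the emulator and every
        # thread it spawns inherit the restricted core set.
        os.sched_setaffinity(0, cores)
    return subprocess.Popen(cmd, preexec_fn=limit_affinity)

# Hypothetical usage -- adjust the binary name for your install:
# run_on_cores(["dolphin-emu"])
```

The same effect can be had from a shell with `taskset -c 0,1,2 dolphin-emu`; the Python version is just handier if you're already wrapping the launch in a script.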
I disagree, it's not really terrible: it's on par with an i7-4770, i7-7700HQ, and even a 10310U at two to four times less power consumed. 99% of games only need 4 cores to hit 60 FPS, and the APU's GPU is the only limit for them at these extremely low TDPs.
AMD put in a decent APU for balance, and compared to the 6800U, with twice the cores and a roughly 1.5x more powerful GPU, it only loses 10-20% while consuming half the power.
Where do you suspect the problem might be? I know that memory latency is the primary issue for that engine. It sucked on the CPUs of that era, it sucks on the Steam Deck, but it doesn't suck on most modern CPUs because they have way more L3 cache and better latency beyond it (as the article states).
Fallout 3 has massive issues on any new operating system. There are unofficial patches and mods to fix that but Valve does not auto-install anything unofficial, even if it would make things far better.
I recently played through Death Stranding which is more recent, better graphics and seemingly better optimized than Fallout 3. I'm pretty sure if the CPU can handle Death Stranding (and other more complex games) without problem, it could handle Fallout 3 if the programming was on point.
It's important to not forget that Skyrim has been ported to an insane amount of consoles over the years, which includes engine and optimization updates. It's a very actively maintained game that from what I can tell mostly follows sensible design practices with current hardware.
Fallout 3 (and by extension New Vegas) hasn't had that luxury; those games only got two releases (the initial one and the "ultimate edition"), both running on an engine where the glue holding things together is duct tape and technical debt going back to Oblivion. And it was released for a set of consoles that needed optimization to the degree where it was just easier to dump the entire world state into the save file as it changed than to store the bare minimum[0]. That's the sort of optimization that leads to bugs down the line as undocumented features and behaviors change (not to mention the already buggy release state of both games).
[0]: Which in turn caused another infamous bug, where the PS3 would run out of storage for individual game saves (the cap was something like 10 MB) after playing long enough.
Fallout 3 and New Vegas both have massive issues. You may be better off using Lutris with the GOG version on the Steam Deck and finding a config that installs the performance fix mods.
I tried performance patches for both FO3/FNV and couldn’t fix it for either. I believe it’s because every prefab that Bethesda renders requires a memory access on every frame. So areas with a ton of prefabs like Mr. House’s penthouse or rivet city market are extremely slow.
It checks out, given the article's coverage of just how poor the memory latency is once you exceed the tiny 4MB cache.
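For anyone curious what that kind of test looks like, here's a crude pointer-chasing sketch in the spirit of the article's latency measurements. Interpreter overhead swamps the absolute numbers in Python (a real test would be in C), so treat it strictly as a toy; the relative jump once the working set outgrows the cache is the interesting part:

```python
import random
import time

def chase_latency(size_bytes, accesses=1_000_000):
    """Rough per-access cost (ns) of a random pointer chase over a working set.

    Builds a single random cycle so the next index is unpredictable,
    defeating hardware prefetchers, then times a tight chase loop.
    """
    n = size_bytes // 8  # pretend each slot is one 8-byte pointer
    order = list(range(n))
    random.shuffle(order)
    chain = [0] * n
    for a, b in zip(order, order[1:] + order[:1]):
        chain[a] = b
    idx, start = 0, time.perf_counter()
    for _ in range(accesses):
        idx = chain[idx]
    return (time.perf_counter() - start) / accesses * 1e9

# Example sweep (working sets inside and well outside a 4 MB L3):
# for size in (256 * 1024, 4 * 1024 * 1024, 64 * 1024 * 1024):
#     print(size // 1024, "KiB:", round(chase_latency(size)), "ns/access")
```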
There are ways to “fix” it but they tend to suck. You can crank the draw distance of objects to the minimum but if it’s not done dynamically then every other scene becomes terribly ugly/broken. You can also install mods that remove some of the clutter but how do you know which clutter to remove? Some of it is aesthetically important and the tone of the scene can change dramatically by randomly removing stuff.
Yeah pretty early on I realized that I didn’t like playing demanding games on it for a few reasons - screen quality, fan noise, and CPU performance. It’s mostly smaller indie games and emulation for me.
You mean the game that is not marked as verified but rather as "playable" (which really means: it starts. Figure out yourself how to make it run well) doesn't run well?
CryoByte33 on YouTube[0] has videos explaining in detail different configuration presets to run games like God of War, Witcher 3, or Breath of the Wild.
He's also maintaining an impressive piece of software[1] that helps optimize your Steam Deck.
Worst case scenario, if you don't find the game you're looking for in his videos, chances are you will find a decent configuration on SteamDeckHQ[2]
Entirely unhelpful, but considering your original comment was pretty much complaining that <old device> cannot run <very new and demanding software>, I figured it was in the theme.
Yeah, but the Switch doesn't come close to the Deck's performance in any configuration - there's a staggering number of games that don't even reach 30 FPS consistently while looking like a blurry mess.
Don't let everyone get you down, I have one and it's extremely performant, even with games from the past few years. It'll work totally fine for stuff from that era.
You should give it a shot. HN can be too negative at times. At least the YouTube vids seem to indicate that the deck can handle the games of that era just fine.
(author of the article here) Agreed. I'm just commenting on the hardware architecture. The Deck is a pretty competent gaming device, given its ultraportable form factor and tight power constraints. You can find that out on any number of sites that have reviewed the Steam Deck from a gaming experience perspective, so I didn't think it was necessary to rehash that.
I’ve been very happy with mine playing games of all vintages and development budgets. Just this weekend I was playing Far Cry 4 and The Division 2 quite smoothly.
I've been playing Midnight Suns on mine, a very recent game, and it works extremely well (if you lock to 30 FPS). I've played a good chunk of Elden Ring, also completely locked to 30 FPS - no problem at all.
Someone said something about it before that makes sense to me - it's a portable PS4. If you are happy with playing games at the level a PS4 would play them, then it's a perfect device for you. If you expect proper next-gen games (Returnal) to run on it, then you will be disappointed.
I hope Valve will release an updated SteamDeck with a new SoC featuring Zen4 instead of Zen2 and RDNA3 vs RDNA2, all on the smaller 4nm process, as the current SoC is kind of holding it back a bit.
That would definitely boost performance and battery life. Wouldn't mind if it came with a price increase, as it would still be cheaper than a single high-end GPU alone.
There was an interview with Valve [1], and I'm pretty sure there will be no "Steam Deck 2" anytime soon. They want a better battery and screen (tbh, my Deck has pretty noticeable IPS glow in dark scenes), but that's it. So I expect a "Steam Deck Pro" to be like the Switch OLED compared to the Switch v1/v2 - better battery, better screen, same performance.
Also, keeping the current Deck hardware specs will allow developers to focus on a single device when optimizing a game for the Deck. Some games (e.g. Cyberpunk 2077) already have a dedicated "Steam Deck" graphics preset. Some (e.g. Sons of the Forest) have a steamdeck: true/false variable in their debug window. Maybe we'll see dedicated Deck support in some game engines? Who knows.
I am more amazed that Cyberpunk can work at all on the Deck. I thought it had such horrible performance optimization out of the gate that it was pulled from the PlayStation Store. Surprised that any amount of patching could make it workable on a portable device.
The base PS4 and Xbox One are essentially two netbook CPUs taped together with a GPU on top. The CPU in the Steam Deck is about 2x-4x faster than the one in those consoles.
Cyberpunk just has awful optimization. Sure, it's pretty (when it works), but even a 4090 can't run it at max settings at 4K@60fps after all the patches. People will say "it's 2 GPU generations ahead", but why would you release such a game? It's just an excuse to justify horrible optimization.
> Sure, it's pretty (when it works) but even 4090 can't pull it on max settings at 4k@60fps
That sounds strange, I'm using a 3090ti and can play it on 4K with more than 60fps (with DLSS). Maybe something else is the bottleneck in your setup? Or maybe you haven't tried it since the initial launch? Initially the performance was shit for me as well.
With DLSS, yeah. I expected the game to run without DLSS on a 4090, but it still can't hit 60 FPS even in the desert. And no, it's a GPU bottleneck; it uses 100% of the GPU's power.
DLSS is nice, but you can still notice some flickering on neon stuff, hair, etc. in Cyberpunk with DLSS enabled.
Why wouldn't a newer APU help lower battery drain, though?
> keeping current deck hw specs will allow developers to focus on single device when optimizing game for deck
That argument is used by incumbent console makers to refresh their hardware once every 5 years, which results in it being horribly behind and holding games back as well. So I don't really buy this logic.
We're talking about portable gaming here. There will always be some drawback due to power limits and other constraints. While I can relate to your point about games being horribly behind due to "platform parity", that probably won't apply to a portable platform.
Well, if it can run Cyberpunk 2077, which can be super demanding even on the most high-end desktops, I'd say it's really up to developers to make their games scale according to available resources, instead of thinking in terms of "I target this low-end device only".
With such an approach there is no need to stagnate available hardware on any platform.
Then you can get into a situation where the refresh Xbox plays games better than the original release. That confusion about, "Is this game compatible" is not great for consumers.
Isn't that exactly how it is on PC now? You get newer-generation hardware and performance gets better - except you can upgrade way more frequently than with consoles.
The biggest upgrade would actually be the screen. I already found the screen resolution plus the meh IPS panel made it unappealing to play demanding games anyway.
For example, I could get Monster Hunter World to run at 60 FPS, but many games like it just aren't designed for 720p. There are so many things on screen that get shrunk to a few pixels, and it just looks bad.
I’ve found that forcing games to run at above native resolutions can help quite a bit. The aliasing that happens at the native res can be pretty bad even with high AA settings depending on the implementation.
Also, the deck tends to have GPU headroom due to the awful CPU performance so you can do it for free usually.
It's not just that. Many modern games are simply not designed for low resolutions. 1080p is the bare minimum for them, and higher resolutions would be even better.
There will be too many objects on screen. When you shrink it to 720p, it's simply not viable anymore, as the objects and textures have too few pixels to represent them.
Sounds like more of a game problem. The screen is about 200dpi which is pretty good. As more games test on the deck that should improve.
I tried playing some steam board games on there, the text is readable but on a small screen it's not a good experience. It worked well with an external screen.
It's a tradeoff between cost, performance, portability, battery life and resolution. They've done a nice job in my opinion.
> I hope Valve will release an updated SteamDeck with a new SoC featuring Zen4 instead of Zen2 and RDNA3 vs RDNA2, all on the smaller 4nm process, as the current SoC is kind of holding it back a bit.
>
>That would definitely boost performance and battery life. Wouldn't mind it if came with a price increase, as it would still be cheaper than a single high end GPU alone.
That would be a terribly unwise decision from Valve. If there is one sure thing about video game consoles, it is that stability of the product is more important than outright performance. That is what makes them so successful relative to gaming PCs. If Nintendo or Sony released a new model every 2 years and people couldn't run new games on the old releases, only a fraction of those users, the very hardcore gamers, would follow the upgrade path, and most users would just leave the platform.
The best thing that can happen is the Steam Deck stays the same for 5 to 10 years and video game developers continue making sure their games run well on it.
I agree that it's too soon for a CPU refresh for the Steam Deck, but I disagree with the idea of thinking of it as a console. It's a PC, and PC game developers are already required to support a vast, unfathomable array of configurations far beyond what an upgraded Steam Deck would require.
It is definitely a PC, but if you don't treat it as a console and start selling newer upgraded versions, it will soon be dead as a platform. It is not a machine where you can swap the GPU or the motherboard or upgrade the memory at will, and PC game developers/companies mostly don't give a fuck about people who aren't using something at least close to the latest and fastest unless it is a highly sold platform.
I wonder whether the upcoming APUs will feature 3D cache. Assuming it serves pretty well in a gaming scenario (and more so when the CPU and GPU share the same memory bandwidth), it should be a pretty clear win. And since these chips are meant to draw less power than general-purpose ones, I guess another layer of 3D V-Cache would not be as big of a problem as it is on the desktop platform.
These chips are meant to be cost-competitive, not performant. There might be more incentive to add GDDR6 bandwidth rather than limited 3D cache. There's also HBM memory as another alternative that would reduce power draw better than GDDR.
The missing L3 cache is interesting, I can only assume that it's another sacrifice made for power optimisation. Maybe their perf metrics showed disabling L3 provided a worthwhile power/perf tradeoff?
It seems like a test artifact on the latency test.
> L3 issues seem to be gone with a bandwidth test, with similar results using Windows or Linux. We see over 200 GB/s of L3 bandwidth with an all-thread load.
Something's weird. There's definitely an L3 and it definitely works, but not as predictably as on most chips.
Maybe cost too. Custom 7nm silicon ain't cheap, and the Steam Deck is quite an affordable product considering Valve doesn't have access to the same economies of scale that Sony, Nintendo, or Microsoft do.
The L3 cache appears again under Windows, so it's clearly there physically on the die. But under the Deck's default performance tuning it seems like it's not really being used. Possibly they are turning the L3 off/down at runtime?
Big guess here (I mostly know CPU design from what EE I picked up in college), but my guess is that the L3 is shared per CCX, which probably makes it harder to properly power down/up when only one core is in use.
Maybe AMD took on a bit of the costs for developing the Steam Deck APU so that they can sell it to other users as well like NVIDIA did with the Tegra chip.
AFAIK nothing else uses the Steam Deck APU despite there being several PC handheld manufacturers. All the others go with standard laptop chips for better or worse.
Although I wonder how much exclusivity you get anyways with how remarkably similar the PS4/Xbox One and PS5/XSX designs are.
Well, the Steam Deck is meant as a dedicated gaming PC, and explicitly supports replacing the OS, so Windows performance is eminently relevant.
Also, every single reference to Windows is in the context of testing whether the performance issues they're seeing are OS or hardware related, which is exactly the sort of analysis work I'd like to see more of.
No, I don't see people benchmarking BeOS on a Surface. Then again, I've never heard anybody say "well, if this whole Windows thing doesn't work out, I'll install Be on my Surface".
At any rate, this article isn't about the Steam Deck's performance, it's about Van Gogh specifically, and how it relates to other pieces of Zen 2 hardware. Testing with both Linux and Windows is useful. There's a comment elsewhere in this discussion by a Valve engineer saying that one of the issues the author saw is due to an ACPI handling issue in Linux. That's the sort of thing you catch by testing with multiple OSes.
(author here) full disclosure, I work at Microsoft. On Azure though, not on Windows.
But using Windows, as another commentator pointed out, was done because my latency test gave very weird results from SteamOS and needed more investigation. I also used Windows to test the iGPU because Nemes's Vulkan test ran into problems under SteamOS.
Thanks, I just don’t know how relevant the results are under Windows though. It looks like you think the L3 “disappeared” but that just doesn’t make sense.
They are not relevant; the author made a totally irrational choice by benchmarking under Windows, not realizing the Deck is marketed as a Linux machine first and foremost, and they should be ashamed of themselves.
From the first day the Steam Deck was released, ALL FORUMS, SOCIAL MEDIA, AND OTHER ONLINE OUTLETS, BLOGS, AND THE TECH PRESS have been repeating the doubtless paid Microsoft astroturf that "it'll be a great Windows gaming platform", and lo and behold, we get yet another article in this vein.
It's like Linux on the Switch - you don't see the gaming press benchmarking the Switch using Linux. But then again the Linux Foundation doesn't pay thousands of "people" in the gaming press to astroturf its product. Microsoft does.
Was Van Gogh developed custom for Valve? I think I remember talk about Van Gogh from before the Steam Deck was announced, and at that time it wasn't presented as a custom chip but as a delayed and troubled cheap gaming APU project. Silicon bug workarounds could explain some of the strange behavior seen here. If Van Gogh wasn't a custom project and Valve just happened to have the perfect use case for it, that would be serendipitous.
There were rumors of both a Surface Book and a MacBook that would've used it when the first rumors of Van Gogh circulated. The persistent story I've seen is that Microsoft wanted an APU for a gaming-capable Surface Book, then scrapped it, and Valve ended up using said APU.
Yeah, if Linux gaming does take off, the story of the development of this chip might have the trappings of a "Pirates of Silicon Valley" story one day, at least to techies.
It would be interesting to get the details of how this all went down, considering that about a year before Van Gogh, at the beginning of 2020, there was a rumor that Apple was going to make a 5000 dollar gaming Mac (that would sort of slot into where the Mac Studio is now). If the "gaming Mac" portion of the rumor is true, that raises more questions: did Apple want to keep designing x86-64 Macs, and was AMD shopping the idea of this chip to a number of parties, or who put money into the development of this chip and when (as you pointed out)? Again, not because there's something nefarious going on, but just because it seems like a great story.
That's not really what happened. Nintendo was working with Sony to create a CD add-on for the SNES. As a part of that deal Sony was making a version of the SNES that included the CD drive add-on called the PlayStation similar to how Sharp made the Twin Famicom that incorporated the Famicom Disk System. Nintendo realized the deal with Sony would give Sony too much control over CD games so they scrapped it and went with Philips (which is how they got to make those bad Nintendo CDi games). In the end a SNES CD add-on was never released.
Sony was pretty pissed off when Nintendo announced they were going to be partnering with Philips when they thought Nintendo was going to be announcing they were working with Sony. So they ended up entering the gaming market to compete with Nintendo. However the PlayStation they came up with was an entirely new design focused on 3D graphics.
Just to add, the CD add-on to the SNES would have added CD-based audio and presumably larger/cheaper games, but would have added no additional computing power. None of the playstation hardware was based on this product.
Isn't that akin to what happened with Cell and IBM/Sony? With Microsoft and the 360 benefiting from the same work but dropping the SPEs?
Details are fuzzy for me so I might be misstating things but I always found that to be quite a fascinating tale of technology.
I believe the exact configuration of the Van Gogh APU is custom designed for Valve. There are very similar APUs but this exact chip is not used in laptops.
It was originally with Microsoft, to go in a Surface device; then that fell through and the Steam Deck picked it up. Not general use per se, but Microsoft's general use.
I'm wondering, is the WiFi chip actually an RTL8822CE?
I did some of my own exploration 1-2 months ago, and I identified it as an RTL8821CE combo WiFi/Bluetooth chip. It took a bit of time, because the chip uses PCIe for the WiFi part and USB for the Bluetooth part, so the different functions appear on different busses and tools.
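For anyone who wants to check their own unit, here's a small sketch that reads the sysfs vendor/device IDs of PCI devices. The two Realtek device codes in the table are my best guess from public PCI ID lists, so double-check them before relying on this:

```python
from pathlib import Path

# Realtek PCIe Wi-Fi parts (vendor 0x10ec). These device codes are
# assumptions worth verifying against the pci.ids database.
KNOWN_WIFI = {
    (0x10ec, 0xc821): "RTL8821CE",
    (0x10ec, 0xc822): "RTL8822CE",
}

def identify_wifi(sysfs_root="/sys/bus/pci/devices"):
    """Scan PCI devices via sysfs and report any recognized Wi-Fi chips."""
    found = []
    for dev in Path(sysfs_root).glob("*"):
        try:
            vendor = int((dev / "vendor").read_text(), 16)
            device = int((dev / "device").read_text(), 16)
        except (OSError, ValueError):
            continue
        name = KNOWN_WIFI.get((vendor, device))
        if name:
            found.append((dev.name, name))
    return found
```

Note this only finds the Wi-Fi half; as described above, the Bluetooth function hangs off USB, so it shows up under `lsusb` rather than `lspci`.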
I've never seen that before, though my Steam Deck audio setup is a bit weird: HDMI to HDMI audio splitter, with audio going out to S/PDIF and video going to an HDMI-to-DP converter.
A typical Zen 2 CPU has a 320-entry floating point physical register file. It's been a long time since I studied architecture, but the idea is that the instruction set's architectural floating point registers are mapped onto those physical entries at any given time. The core's renamer updates that mapping on the fly, which lets it execute instructions out of order and speculatively without false dependencies between instructions that happen to reuse the same register names.
The PS5's CPU has fewer physical floating point registers. Given that it's a game console running known workloads, a smaller speculation window was apparently acceptable, so the extra entries were just unnecessary die area.
If PS5 code uses fewer floating point operations than conventional desktop CPUs, there are fewer physical floating point registers needed for an equally sized speculation window.
A register window isn't used for context switching; it's used during function calls. In most such architectures the sliding is done implicitly by the call/ret instructions (named e.g. call4/ret4 to slide by 4). Some architectures also let you slide the window manually, but none (that I know of) allows the absolute positioning needed to mimic traditional register files/banking.
It would make sense if PS5 games don't have as much of a need to do floating point work on the CPU, since they have shared-memory access to a high-performance GPU and can do pretty much anything they want with it. AFAIK AMD has had asynchronous GPU compute support in their hardware for a long time, so I'm sure the chip in the PS5 has it.
You mean this article, https://chipsandcheese.com/2022/10/27/why-you-cant-trust-cpu..., where they fake a 10 core Zen 4 part as an example of how you can quite easily fake CPU benchmark reports and why you should be careful what information you trust? Where they're completely transparent about what they did, why they did it and how they did it to raise awareness of how this stuff can be faked?
These folks seem to have ultra-legit reviews with deeply in-depth testing and comparisons, and a long track record of it, with many folks commenting on HN to say how great these kinds of numbers-based reviews are to see.
You haven't made a single claim anyone could refute at this point, but you've pretty vaguely claimed a huge raft of issues. Are any of these things we can pin down or discuss? Or should we just take it on faith, trusting you, that this pretty well-reputed site is BS?
Sounds like the Steam Deck is the worst of all worlds. I am told people buy consoles because they just wanna play and not worry about tweaks or game config. A game will still work 5 years later on a console, but not on a PC. Looks like the Steam Deck implements NONE OF THAT!
https://lore.kernel.org/lkml/CAJZ5v0jWX=H=aZ25PzHdH05bRJvtYb...
The way the benchmark works likely triggers it consistently.