> prioritizing backwards compatibility

Backwards compatibility is very "cheap" these days though, isn't it? There are no arcane architectures or chip designs anymore: the PS5 and Xbox are basically generic PCs running a restricted OS, and the Switch is just a phone/tablet SoC.

Depends on the level of hardware access.

If the GPU access is through a relatively "thick" API like DX/Vulkan and shaders stored in an intermediate representation like DXIL or SPIR-V, sure, swapping out the hardware implementation is relatively easy.

But if they're shipping GPU ISA binaries as the shaders, you'll have a much harder time ensuring compatibility.

Same with things like synchronization, on both the CPU and GPU (and any other offload devices like DSPs or future NPUs). If they use API-provided mechanisms, and those are used /correctly/, then the implementation can likely be changed. But if they cycle-count and rely on specific device timing, again all bets are off.
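A toy illustration of the difference (a Python stand-in, not actual console code): waiting on an explicit, API-provided completion signal survives timing changes, while sleeping for "how long the device used to take" does not.

```python
import threading
import time

# Shared state produced by a "device" (here, a worker thread).
result = {}

def worker(done: threading.Event):
    time.sleep(0.05)          # simulate variable device timing
    result["value"] = 42
    done.set()                # explicit completion signal, like a fence/semaphore

done = threading.Event()
threading.Thread(target=worker, args=(done,)).start()

# Fragile approach ("cycle counting"): sleep for the duration the device
# took on the hardware you tested, then read. Breaks the moment timing
# changes on a different implementation:
#   time.sleep(0.01); result["value"]   # KeyError on a faster reader

# Robust approach: wait on the API-provided mechanism.
done.wait()
print(result["value"])  # 42
```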

Things like DX12 and Vulkan have a large number of sync points and state transition metadata to allow different implementations to be "correct" in things like cache behavior or format conversions (like compression). Not all those transitions are required on every vendor's hardware, and we regularly see issues caused by apps not "correctly" using them where the spec says they're required, because the vendor's hardware they happened to test on didn't need that one specific transition while another implementation might, or because the timing on that hardware never exposed the bug.

I guess my point is: compatibility is hard even when the APIs are intentionally designed to allow it. I have no idea how much the idea of such compatibility has been baked into console APIs in the first place. One of the primary advantages of consoles is the simplification that comes from targeting fixed hardware, so I can only assume they're less compatibility-focused than the PC APIs we already have Big Problems with.

It is cheap only if you don't change CPU or GPU architectures. This is why the PS4 doesn't have PS3 compatibility.

When Apple switched to ARM, even with the x64->ARMv8 translation layer (translation, NOT emulation) a lot of software was still noticeably slow. Even though some x64 games worked on ARM Macs, they still lost A LOT of performance.

The backwards compatibility of the PS2 was due to the PS2 literally including an extra PS1 CPU (technically a PS1-like CPU, underclocked to match the original PS1 CPU when running PS1 games). In PS2 games this PS1 CPU handled only I/O, so it wasn't completely wasted when running PS2 games.

https://en.wikipedia.org/wiki/PlayStation_2_technical_specif...

The PS2 CPU is a MIPS III while the PS1 CPU is a MIPS I. I am not an expert, but I think MIPS III is only backwards compatible with MIPS II, not MIPS I.