AMD just announced the end of video drivers for 32-bit operating systems:
https://www.anandtech.com/show/13520/amd-ceases-32-bit-graphics-driver-development
Nvidia had already ceased support of 32-bit operating systems earlier this year:
https://www.anandtech.com/show/12191/nvidia-to-cease-driver-development-for-32bit-operating-systems
That means that there are no longer any supported GPUs on 32-bit operating systems. Some game developers have already felt free to ignore the existence of 32-bit operating systems; any game that requires more than 4 GB of memory certainly does. But this pretty much makes it official that 32-bit PC gaming is dead.
Comments
The transition has been gradual, of course. For a game developer to go 64-bit only a week ago wasn't substantially different from doing so today. But the end of GPU driver support for 32-bit operating systems is about as clear a milestone on that path as we're going to get, so I thought I'd highlight it.
So among the Intel GPUs still on active support, the only reason they haven't yet dropped 32-bit support is that they never offered it in the first place.
I guess some people might have super old PCs and can't run new games anyway?
But I don't really know what other use having such an old PC would bring. You wouldn't even be able to make good YouTube videos on a PC that old (and no one is going to watch someone play on a junk PC), so no YouTube career. Most art programs take a pretty good PC to run, at least the good ones, so you can't do digital art either.
I think those stuck in the mindset that they can't upgrade from Windows XP are probably mostly seniors who can't adapt, or people in very poor countries (and I'll concede it's a shame for them, since for them upgrading can cost the equivalent of tens of thousands of dollars, while in the US it'd be about $1k for a really decent setup).
But it would be so limiting, because at that point the PC probably sucks anyway. New PCs ditched XP long ago and at best ship with Windows 7 if not Windows 10, and again, you can't do YouTube videos or much else on a bad PC. So I don't know what use it would have, except for seniors and for poor, undeveloped countries.
Maybe I'm missing something, but all the stuff I personally use needs 64-bit to be efficient. Photoshop? Needs a pretty decent PC or it's too slow. Sony Vegas Pro for YouTube? Again, needs a good PC. High-quality gaming videos? Again, a good PC, and all the newer games really need a 64-bit system to run.
PCs with 4 GB of RAM or less are still sold all the time today.
I do agree it would be silly to buy a brand new i9 and install just 2 GB of RAM, though.
It spent like one paragraph explaining the difference between float and double, then discarded float completely.
It is sometimes said that you can't address more than 4 GB of memory with a 32-bit CPU. That's not actually true, as you can do it by chaining together computations. For example, a Sega Master System had 8 KB of system memory with an 8-bit CPU, which is a lot more than the 256 bytes that you can directly handle with 8-bit computations. What is true is that you can't do it efficiently, and needing to do several computations to get a memory address every time you want to access memory will cause a huge performance hit.
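Here's a rough sketch of what that chaining looks like, in Java since that's the language mentioned elsewhere in the thread; the class and method names are made up for illustration. The point is that the address is wider than the "registers," so every memory access pays for extra address arithmetic first.

    // Sketch: a machine whose "registers" are 8-bit, addressing 8 KB of memory.
    // Each access first has to combine two 8-bit values into a 13-bit index,
    // analogous to the extra work a 32-bit CPU would do to reach past 4 GB.
    public class BankedMemory {
        private final byte[] ram = new byte[8 * 1024]; // 8 KB, indices 0..8191

        // 'high' and 'low' stand in for two 8-bit register values (0..255).
        int effectiveAddress(int high, int low) {
            return (((high & 0xFF) << 8) | (low & 0xFF)) & 0x1FFF; // stay within 8 KB
        }

        byte read(int high, int low) {
            return ram[effectiveAddress(high, low)]; // address math on every access
        }

        void write(int high, int low, byte value) {
            ram[effectiveAddress(high, low)] = value;
        }

        public static void main(String[] args) {
            BankedMemory m = new BankedMemory();
            m.write(0x1F, 0xFF, (byte) 42);         // top of the 8 KB space (index 8191)
            System.out.println(m.read(0x1F, 0xFF)); // 42
        }
    }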
A lot of things would eventually want more than 4 GB of memory, but servers were the first thing that really drove it. You might reasonably think of 64 GB of memory as being a lot today, and for a desktop or laptop it is, but it's really not very much for a server. Today, you can easily get 192 GB per socket even with cheap 16 GB modules, or as much as 2 TB per socket with some fancier server stuff, and the reason you can get it is that there are customers that need it. When AMD launched the Athlon 64 about 15 years ago, 1 GB was a lot in a desktop, but 4 GB wasn't nearly enough for a lot of servers.
There is already a need for larger than 64-bit computations for a variety of purposes. It's just a lot less common than needing 64-bit in cases where 32-bit isn't enough. For now, this is handled by chaining together 64-bit instructions. There are a lot of programming tools to do this for you under the hood, such as Java's BigInteger class.
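For instance, with Java's BigInteger a toy snippet like this can multiply values that don't fit in any 64-bit register; the library chains the narrower hardware operations for you:

    import java.math.BigInteger;

    public class WiderThan64 {
        public static void main(String[] args) {
            // Two values near the top of the unsigned 64-bit range.
            BigInteger a = new BigInteger("18446744073709551615"); // 2^64 - 1
            BigInteger b = new BigInteger("12345678901234567890");

            // BigInteger chains ordinary word-sized operations under the hood,
            // so the product can be wider than any hardware register.
            BigInteger product = a.multiply(b);
            System.out.println(product);
            System.out.println(product.bitLength()); // 128 bits for this pair
        }
    }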
If applications that need 128-bit computations become very common, then we'll see a rise of 128-bit CPUs to handle those computations. But until then, it's more efficient to just make cores better at 64-bit instructions and handle larger numbers by chaining together multiple instructions, as that gives better performance in most of the things that people do.
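To show concretely what "chaining together multiple instructions" means, here's a small sketch (illustrative layout, not anyone's actual implementation) of adding two 128-bit values held as pairs of 64-bit longs; the carry out of the low half is computed explicitly, which is roughly what a compiler or BigInteger does for you:

    public class Add128 {
        // A 128-bit unsigned value represented as two 64-bit halves.
        static long hi, lo;

        static void add128(long aHi, long aLo, long bHi, long bLo) {
            long sumLo = aLo + bLo;
            // If the low half wrapped around, carry 1 into the high half.
            long carry = Long.compareUnsigned(sumLo, aLo) < 0 ? 1 : 0;
            hi = aHi + bHi + carry;
            lo = sumLo;
        }

        public static void main(String[] args) {
            // (2^64 - 1) + 1 = 2^64, i.e. hi = 1, lo = 0.
            add128(0L, 0xFFFFFFFFFFFFFFFFL, 0L, 1L);
            System.out.println(Long.toUnsignedString(hi) + " " + Long.toUnsignedString(lo)); // 1 0
        }
    }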
The Nintendo 64 was a 64-bit gaming system, and it was released in 1996. Granted, subsequent consoles were mixed: Nintendo went back to 32-bit PPC for their next consoles, the PS1 was 32-bit, the PS2 was sort of 128-bit, and the current consoles from MS/Sony are all x86-64.
Guess it kinda goes to show that 64-bit in and of itself is just a means to an end; it doesn't really do much by itself.
Besides, floating-point data types are algebraically weird, as neither addition nor multiplication is associative. Very few people really want to know or care what they're doing, or just how big the rounding errors are. More bits in the mantissa gives you smaller rounding errors, which is sometimes a good thing and sometimes irrelevant, but pretty much never a bad thing.
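A quick Java illustration of the non-associativity (a standard example, not from the article): the same values added in a different grouping give different answers.

    public class NotAssociative {
        public static void main(String[] args) {
            double big = 1e16, small = 1.0;

            // Grouping changes the result: floating-point addition is not associative.
            double left  = (small + big) - big;  // the 1.0 is lost when added to 1e16
            double right = small + (big - big);  // the 1.0 survives

            System.out.println(left);   // 0.0
            System.out.println(right);  // 1.0

            // With float the same thing happens at much smaller magnitudes,
            // because the 24-bit mantissa rounds sooner than double's 53 bits.
            System.out.println((1.0f + 1e8f) - 1e8f); // 0.0
        }
    }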
When you run into situations where it does make a big performance difference, then maybe you use floats. This could be because you have a large number of them and doubles take twice as much memory. The difference between 4 bytes of memory for a variable and 8 doesn't matter if it's just one variable, but if you have an array of a billion of them, the difference between 4 GB and 8 GB might. Or if you're doing those computations on a system heavily optimized for 32-bit, such as a GPU, then you don't use doubles unless you're forced to or you're careless.
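The memory side of that trade-off is just arithmetic; a tiny sketch of the billion-element case mentioned above:

    public class FloatVsDoubleMemory {
        public static void main(String[] args) {
            long n = 1_000_000_000L; // a billion values

            // 4 bytes per float vs 8 per double: roughly 4 GB vs 8 GB of raw data.
            System.out.println("floats:  " + (n * Float.BYTES)  / 1e9 + " GB");
            System.out.println("doubles: " + (n * Double.BYTES) / 1e9 + " GB");
        }
    }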