
Game size is a perennial talking point and some games earn it and some games don’t.

There was a phase of games shipping with uncompressed audio in every language they supported, for no good reason, which ballooned file sizes (Titanfall).

Some games have gotten around this problem by making higher-resolution textures or additional languages optional installs within whatever the ecosystem's DLC implementation is. I think this is the smart way to do it.

Examples of games shrinking are hard to come by, but special credit to IO Interactive and Hitman 3.

The install size with all the content of the previous games included (if you own them) is less than the install size of the first game. That is impressive and commendable.

Games are weird and some people will look at the file size and think it means the game is better, just the way some people think games with more hours are better. The psychology of people who play video games is weird.

Duplication of assets on disk was a very weird byproduct of HDD file access conventions, but we shouldn't need that anymore. There is still a shrinking, but valid, crossover window before devs can assume all storage is solid state. I think things will get better, but the debate of file size vs. average/expected storage capacity vs. network speed will never end. There is certainly work to be done.



The .kkrieger devs really showed how far you can go to reduce download size:

https://en.wikipedia.org/wiki/.kkrieger

https://www.youtube.com/watch?v=ya_MUKc343U

It is dated, but looks pretty good by 2004 standards.
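
For context, the trick is generating assets at runtime from code and a seed instead of shipping the data. A toy sketch of the procedural-texture idea in Python (purely illustrative, not kkrieger's actual generator, which is C++ and far more sophisticated):

    import math
    import random

    def procedural_texture(size=256, seed=42):
        # Synthesize a marble-like grayscale texture from a seed alone;
        # nothing ships on disk except this code and the seed.
        rng = random.Random(seed)
        phase = rng.uniform(0, math.tau)
        freq = rng.uniform(4.0, 8.0)
        pixels = []
        for y in range(size):
            row = []
            for x in range(size):
                # Sine bands distorted by a second sine -> marble-ish veins.
                v = math.sin(freq * math.tau * x / size + phase
                             + 3.0 * math.sin(freq * math.pi * y / size))
                row.append(int((v + 1.0) * 127.5))  # map [-1, 1] to [0, 255]
            pixels.append(row)
        return pixels

    texture = procedural_texture()

A few kilobytes of generator code like this can stand in for megabytes of stored texture data, which is the whole 96 KB trick.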


Note that one of the .kkrieger devs (ryg aka Fabian Giesen) works at Epic Games Tools on the Oodle compressors mentioned in OP's post.


While I'm all for reducing download sizes, perhaps lekktor[1] is a bridge too far.

[1]: https://fgiesen.wordpress.com/2012/04/08/metaprogramming-for...


Sure. Focus on the procedural textures and see where that gets you.


Blizzard sorted the high res textures and some of the raids last so that you could start playing the game halfway through the download.

Another problem is chunking the data properly so that incremental updates are not huge.
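
The gist, as a hedged sketch (fixed-size chunks for simplicity; real patchers often use content-defined chunking, which survives byte insertions better): hash each chunk, and only re-download the chunks whose hashes changed.

    import hashlib

    CHUNK_SIZE = 1 << 20  # 1 MiB; a real patcher would tune this

    def chunk_hashes(path):
        # Per-chunk SHA-256 digests for a local file.
        hashes = []
        with open(path, "rb") as f:
            while True:
                chunk = f.read(CHUNK_SIZE)
                if not chunk:
                    break
                hashes.append(hashlib.sha256(chunk).hexdigest())
        return hashes

    def chunks_to_download(local_hashes, remote_hashes):
        # Only indices where the hash differs (or is missing locally) need fetching.
        return [i for i, remote in enumerate(remote_hashes)
                if i >= len(local_hashes) or local_hashes[i] != remote]

With fixed-size chunks, shifting an asset by a few bytes invalidates everything after it, which is one reason asset packing order matters so much for patch sizes.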


Blizzard is a bad example. Their games take incredible amounts of time to start. I think they don't play their own games and nobody gives a shit.

The launcher is bugged and downloads either empty or 137 MB patches (maybe a bad CRC at some server? Different CRCs at some?). Every time.

StarCraft: Remastered was bugged at some point and couldn't be played for a week.

If you start Heroes of the Storm, it takes a few minutes to download some 600-byte ghost patch, then does some (anti-hack?) processing... then you can start the queue after like 10 minutes.

It's probably like GTA 5, which also took ages to start due to poorly written code and devs not caring at all.


Your experience is not my experience at all. Blizzard is the gold standard for me in starting games faster than any other launcher. In fact they are the only one I’ve seen that lets you play the game part way into the download. No other dev seems to have this technology.

I waited hours to play The Witcher 3, Cyberpunk, and Assassin's Creed Odyssey. With WoW I was playing within minutes.


Other games have 'start with partial download' technology. In fact, the core tech of the team that eventually created Valve's Steam was downloading assets on-the-fly so that you could start playing a game before everything was downloaded.

I worked on Guild Wars 2, which has this feature. I made a first prototype of it that streamed all content on-the-fly. It's pretty easy to implement - you have an abstraction that asynchronously loads a file off of the disk, and you can just make that download from the network instead.
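
A minimal sketch of that abstraction (hypothetical names and a made-up CDN URL, not the actual Guild Wars 2 code): the loader tries local disk first and falls back to fetching over the network, and callers can't tell the difference.

    import asyncio
    from pathlib import Path
    from urllib.request import urlopen  # stand-in for a real async HTTP client

    CDN_BASE = "https://cdn.example.com/assets/"  # hypothetical asset server

    async def load_asset(name: str, cache_dir: Path = Path("cache")) -> bytes:
        # The engine awaits bytes either way, so it doesn't care whether the
        # asset was already installed or is being streamed mid-game.
        local = cache_dir / name
        if local.exists():
            return await asyncio.to_thread(local.read_bytes)
        # Cache miss: fetch from the network in a worker thread, then persist
        # it so the next load is a plain disk read.
        data = await asyncio.to_thread(lambda: urlopen(CDN_BASE + name).read())
        local.parent.mkdir(parents=True, exist_ok=True)
        await asyncio.to_thread(local.write_bytes, data)
        return data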

The tricky part is when you want to ensure all the assets are there for a specific area before you load in, or simply knowing what order to download things in. For example, there was a starter area of Guild Wars 2 that spawned monsters from many other areas, which meant that the manifest of what that area needed was enormous.

So the 'playability' threshold becomes a trade-off between game experience (assets popping in as you play) and quick entry.
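
One way to express that trade-off is a priority queue over the area's manifest: block entry only on assets tagged as critical, and stream everything else in the background. A sketch with made-up priorities, not ArenaNet's actual scheme:

    import heapq

    # Hypothetical per-area manifest: (priority, asset); lower downloads first.
    # Priority 0 blocks entry into the map; the rest streams in while playing.
    manifest = [
        (0, "terrain/starter_area.mesh"),
        (0, "textures/starter_atlas.tex"),
        (1, "npc/ambient_wildlife.model"),
        (2, "event/invasion_boss.model"),  # spawned from another area
    ]

    def split_downloads(manifest, entry_priority=0):
        heap = list(manifest)
        heapq.heapify(heap)
        blocking, background = [], []
        while heap:
            priority, asset = heapq.heappop(heap)
            (blocking if priority <= entry_priority else background).append(asset)
        return blocking, background

    blocking, background = split_downloads(manifest)

Raising entry_priority gives a smoother in-game experience at the cost of a longer wait before load-in, which is exactly the trade-off described above.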


Guild Wars 2 needs more respect from the MMO community. The ability to do testing on the live servers, or enable patches with a client reboot and no server downtime, is great.


Guild Wars 2 and EverQuest 2 also allow launching after a relatively small (maybe 10%) base amount of data has been downloaded. It's never ideal IMO, as it creates longer loading times for areas not already downloaded, but I appreciate that it's an option.


In the Beyond All Reason (still "alpha") RTS, the game itself is only a couple Go, but maps are downloaded on the fly, because downloading all of the featured ones would take a dozen Go.


what is 'Go' as a unit? size? time?


Probably Giga-octets. Either French or very formal.


gigaoctets


You waited hours... to play those games? From double clicking the icon to launch? I find that extremely hard to believe.

If you include download time, even that doesn't hold up to your claim.


Assassin's Creed Odyssey was apparently 44-50 GB at launch. That's about 2.5 hours @ 45 megabits/second.

This is not an uncommon download speed even today (let alone 5 years ago when it launched), according to the Steam download stats (it varies by country; check Australia for example, while the US averages a little over 100): https://store.steampowered.com/stats/content/

Even at 100 Mb/s, 45 GB is exactly 1 hour. The Witcher 3 looks to be about 30 GB, but Cyberpunk is apparently 70-100 GB.
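
The arithmetic, if you want to run other combinations (decimal gigabytes, ignoring protocol overhead, so real downloads run a bit longer):

    def download_hours(size_gb, speed_mbps):
        # GB -> megabits (x 8000), divide by megabits/second, seconds -> hours.
        return size_gb * 8000 / speed_mbps / 3600

    print(download_hours(50, 45))   # ~2.5 h: Assassin's Creed Odyssey at launch
    print(download_hours(45, 100))  # 1.0 h exactly
    print(download_hours(100, 45))  # ~4.9 h: a 100 GB Cyberpunk install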

Do the math next time :)


I regularly needed multiple hours (sometimes up to 12) to download large games until 2-3 years ago. So he's definitely correct.


Titanfall shipped uncompressed audio on purpose to cut out audio decompression CPU usage. You can disagree with the decision, but it wasn't an accident or negligence.


Why not ship the audio compressed and decompress it once when the game gets installed?

That wouldn’t help people whose issue with large game size is the amount of space used on their disk, but it would help people whose issue is the long download times.
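
As a hedged sketch of what that install-time/first-launch step could look like (assuming the ffmpeg CLI is available on the machine; a real installer would ship its own decoder), walk the audio directory and decode each compressed file exactly once:

    import subprocess
    from pathlib import Path

    def decompress_audio_once(audio_dir):
        # Decode shipped .ogg files to .wav on first launch, then delete the
        # originals: small download, cheap runtime, more disk used after install.
        for ogg in Path(audio_dir).rglob("*.ogg"):
            wav = ogg.with_suffix(".wav")
            if wav.exists():
                continue  # already decompressed on a previous launch
            subprocess.run(
                ["ffmpeg", "-loglevel", "error", "-i", str(ogg), str(wav)],
                check=True)
            ogg.unlink()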


For all we know it actually worked that way. Steam definitely decompresses game assets at install time. The people complaining about game sizes aren't looking at the depot files you download when installing the game, they're looking at their HDD (which is fair!)

My guess is Valve doesn't offer a 'decompress your ogg files to wav at install time' feature though.


Titanfall did work this way on the EA launcher. There was a tool that it'd run on first launch to decompress the compressed audio files.


But why.

Even when you do want lossless assets, as long as you compress in reasonable chunks there is no downside.


I realize they’re trying to optimize for a wide base, but I have plenty of spare cores to decompress audio, and SSD space is limited.

For me, the net result is that Titanfall was one of the first games to go when I needed space. Low hanging fruit.

Even if it was deliberate, I have to question the logic and wonder how much usage they enabled vs. uninstalls from file size over time.


You've already bought it, so the game studio got their money, despite having a "bloated" install. So the incentives aren't quite there to reduce the size.

I was thinking about buying Mass Effect yesterday (the recent remaster), but it requires 150 GB and I don't have a Windows partition that big! (I'm dual-booting Linux.) So I think money is being lost on people who aren't frequent gamers and haven't bought big disks for gaming.


> You've already bought it, so the game studio got their money, despite having a "bloated" install. So the incentives aren't quite there to reduce the size.

They may have some of my money, but not much (discount!), and they’re not getting money for DLCs, arguably the reason the base game was so cheap, and I’m not in a rush to buy more stuff that I can’t fit on my PC.

Prior to the era of the initial game purchase being just the entry point I’d have agreed with you, but many modern games have monetization strategies that make uninstallation a problem.

And if they have a strong enough incentive to care about CPU usage, they’re clearly putting effort into optimizing for some audience.


Do you have a big enough Linux partition? Mass Effect Legendary Edition runs pretty great via Proton.


Are spare cores really such a scarce resource? I thought most games didn't come close to using all available cores. And audio decompression is not known for being that cpu intensive.


It was a long time ago; the CPU usage was an actual problem back then.


It could've been handled as DLC that downloads with the game by default and can be uninstalled/disabled, if we are talking low-effort solutions.


I think their point was more on the "uncompressed _in all available languages_"


It's so different from game to game, some games are still only a couple Go despite having "4k" graphics...



