
TBH I don't think cloud gaming is a long-term solution. It might be a medium-term solution for people with cheap laptops, but eventually the chip in a cheap laptop will be able to produce photorealistic graphics, and there will be no point going any further than that.


Photorealistic graphics ought to be enough for anybody? This seems unlikely: there are so many aspects to graphical immersion that there's still plenty of room for improvement, and AAA games will find them. Photorealistic graphics is a rather vague target; it depends on what and how much you're rendering. Then you need to consider that demand grows with supply, e.g. higher resolutions and ever higher refresh rates.


There are diminishing returns. If a laptop could play games at the quality of a top-end PC today, would people really want to pay for an external streaming service, deal with latency, etc., just to get the last 1% of graphical improvement?

We have seen that in many aspects of computing, once it's good enough, it's good enough. Onboard DACs, for example, got good enough that even the cheap ones are sufficient, and the average user will never buy an actual sound card or USB DAC. The dedicated one is better, but it isn't that much better.


I think what you're missing is that

1) You still need to install and maintain it, and there are many trends, even in professional computing, toward avoiding exactly that.

2) Just because you could get it, many may not want it. I could easily see people settling for a nice M1 MBA or M1 iMac and just streaming the games if their internet is fine. Heck, wouldn't it be nicer to play some PC games in the living room, like you can with Steam Link?

3) Another comment makes a big point: this unlocks a new "type" of game, one designed to take advantage of more than a single computer's power, with massively shared state that couldn't reliably be done before.

To counter my own points: 1) I certainly have a beefy desktop anyway. 2) Streamed graphics are not even close to local graphics (a huge point). 3) There is absolutely zero chance they're going to stream VR games from a DC to an average residential home within 5 years, IMHO.


I think the new MacBooks are more a proof that cloud streaming won't be needed. Apple is putting unreal amounts of speed in low-power devices. If the M9 MacBook could produce graphics better than the gaming PCs of today, would anyone bother with cloud streaming when the built-in processing produces a result that is good enough? I'm not sure maintenance plays much of a part either; there is essentially no maintenance of local games, since the clients manage it all for you.

Massive shared state might be something useful. I have spent some time thinking about it, and the only use case I can come up with is highly detailed physics simulations with destructible environments in multiplayer games, where synchronization traditionally becomes a nightmare since minor differences cascade into major changes in the simulation.

But destructible environments and complex physics are a trend that came and went. Even in single-player games, where it's easy, they take too much effort to develop and are simply a gimmick that adds only a small amount of value for players. Everything else seems easier to synchronize by just passing messages around.


> If a laptop could play games at the quality of a top-end PC today, would people really want to pay for an external streaming service, deal with latency, etc., just to get the last 1% of graphical improvement?

Think of it from a different direction: if/when cloud rendering of AAA graphics is practical, you get a very low-friction, Netflix-like experience where you just sit down and go.


IMO the service Netflix provides is the content library, not the fact that it's streaming. If the entire show downloaded before playing, it would be only mildly less convenient. And I don't think streaming adds that much convenience to gaming. If your internet is slow enough that downloading the game beforehand is a pain, then streaming is totally out of the question. And gaming is far less tolerant of network disruption, since you can't buffer anything.
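The buffering point can be made concrete with some back-of-the-envelope arithmetic; the numbers below are illustrative assumptions, not measurements:

```python
# Why video streaming tolerates network hiccups but game streaming can't.
fps = 60
frame_budget_ms = 1000 / fps      # ~16.7 ms to deliver each frame

# A video stream plays from a buffer, so a multi-second stall is invisible:
video_buffer_ms = 30_000          # e.g. 30 s of video buffered ahead

# A game stream renders in response to *this frame's* input, so every frame
# must survive the full encode/network/decode trip, with nothing to fall
# back on when a packet is late (assumed, typical-looking figures):
encode_ms, network_rtt_ms, decode_ms = 5, 20, 5
game_latency_ms = encode_ms + network_rtt_ms + decode_ms

print(frame_budget_ms)   # ≈ 16.7
print(game_latency_ms)   # 30 ms of added latency, paid on every frame
```

Even with generous assumptions, the streamed pipeline alone exceeds a whole frame budget, and unlike video, there is no buffer to absorb jitter.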

Cloud gaming seemingly only helps when you have weak hardware but want to play AAA games. If we could put "good enough" graphics in every device, there would be no need to stream. And I think in 10 years probably every laptop will have built-in graphics so good that cloud gaming is more trouble than it's worth. It might sound unrealistic to say there is a "good enough," but a lot of things have already reached that point: these days screen DPI is largely good enough, sound quality is good enough, device weight/slimness is good enough, etc.


I'd (gently) say you may be generalizing your own behavior too much. I often have, say, 45 minutes to kill and will just browse Netflix for something I can start immediately; having to wait for a download would most likely send me elsewhere. Since COVID started, one thing I've heard repeatedly from friends with kids is that they manage to carve out an hour or so for a gaming session, sit down, and then have to wait through a mandatory update that eats much of it. Now add the popularity of Game Pass, and the possibility that a "cloud console" offers something similar... there are plenty of people who would love that service, IMO.


Cloud gaming allows for games with more shared state and more computationally intensive simulation (beyond just graphics). Maybe eventually clients will easily render 4K with tons of shaders, but the state they're rendering could still be computed remotely. In a way, that's kind of what multiplayer games are like already.
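A minimal sketch of that split, with an authoritative server owning the shared state and a thin client that only sends inputs and renders whatever snapshot arrives; the class names and JSON wire format here are hypothetical, for illustration only:

```python
import json

class Server:
    """Authoritative simulation: owns the shared game state."""
    def __init__(self):
        self.state = {"players": {}, "tick": 0}

    def apply_input(self, player_id, dx):
        # Inputs mutate server-side state; clients never compute it locally.
        self.state["players"].setdefault(player_id, 0)
        self.state["players"][player_id] += dx

    def tick(self):
        self.state["tick"] += 1
        return json.dumps(self.state)   # snapshot sent over the wire

class Client:
    """Thin client: forwards inputs, renders the latest snapshot."""
    def __init__(self, player_id):
        self.player_id = player_id
        self.last_snapshot = None

    def receive(self, snapshot):
        self.last_snapshot = json.loads(snapshot)

server = Server()
client = Client("p1")
server.apply_input("p1", dx=5)    # client input forwarded to the server
client.receive(server.tick())     # authoritative state streamed back
print(client.last_snapshot["players"]["p1"])  # → 5
```

Full cloud gaming just pushes this split further: the server computes (and renders) everything, and the client becomes a video decoder plus an input device.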



