I am probably unqualified to speak about this, as I am using a low-profile RX 550 and a 768p monitor and almost never play newer titles, but I want to kickstart a discussion, so hear me out.

The push for more realistic graphics has been going on for longer than most of us can remember, and it made sense for most of its lifespan, as anyone who has looked at an older game can confirm - I am a person who has fun making fun of weird-looking 3D people.

But I feel games’ graphics have reached the point of diminishing returns. Today’s AAA studios spend millions of dollars just to match the graphical level of their previous titles - often sacrificing other, more important things along the way - and people are unnecessarily spending lots of money on power-hungry, heat-generating GPUs.

I understand getting an expensive GPU for high-resolution, high-refresh-rate gaming, but for 1080p? You shouldn’t need anything more powerful than a 1080 Ti for years. I think game studios should slow down their graphical improvements, which are unnecessary - in my opinion - and just prevent people with lower-end systems from enjoying games. And who knows, maybe we would start seeing 50-watt gaming GPUs that are viable, capable of running games at medium/high settings, and going for cheap - even iGPUs render good graphics now.

TL;DR: why pay more and hurt the environment with higher power consumption when what we have is enough - and possibly overkill?

Note: it would be insane of me to claim that there is not a big difference between the two pictures - Tomb Raider (2013) vs. Shadow of the Tomb Raider (2018) - but can you really call either of them bad, especially the right one (5 years old)?

Note 2: this is not much more than a discussion starter and is unlikely to evolve into something larger.

  • LastYearsPumpkin@feddit.ch · 1 year ago

    Development has always been incremental, but as game engines get better, they get used for more and more things. AAA studios don’t always develop the graphics and game engine from scratch; most often they develop one technology and use it across many games, or even license a pre-built engine and “just” build the game around it.

    Unreal Engine has been used not just for games, but also for real-time, or near-real-time, filmmaking. The same engine used for playing a game can be used to create the background effects in TV shows, or whole scenes.

    It’s crazy to suggest we just stop working on something because it’s good enough, because that’s not what people do.

    • Crozekiel@kbin.social · 1 year ago

      > It’s crazy to suggest we just stop working on something because it’s good enough, because that’s not what people do.

      Came here to say this, glad it’s already been posted.

      Also, why is it that every time someone is being critical of advancements in “realistic graphics” they always post screenshots of Lara Croft?

      • LanAkou@lemm.ee · 1 year ago

        Lara Croft has been around since the triangle boobs. There aren’t many other characters that have been in 3D as long as Lara Croft (Mario 64 was released the same year, but Mario hasn’t come as far as Lara Croft in terms of photorealism). Plus, she’s instantly recognizable. Personally, I don’t think there’s any deeper reason than that.

        • Crozekiel@kbin.social · 1 year ago

          That might make sense if they were posting pictures of PS1 era Lara Croft, but they aren’t, they always use the newest examples… They would show a bigger gap in time passed if they used the dragonborn from the initial release of skyrim vs the newest remastered version.