Battlefield RTX Benchmarks

Discussion in 'Computers and The Internet' started by wooleeheron, Nov 15, 2018.

  1. wooleeheron

    wooleeheron Brain Damaged Lifetime Supporter

    Messages:
    9,025
    Likes Received:
    2,394


    Ray tracing is so new that people have been dying to see some real benchmarks, and developers have had to disappoint them so far because they haven't had the cards themselves for any length of time. The reviewer describes these benchmarks as among the worst he's ever seen, and that's not an exaggeration. A shooter like Battlefield V needs at least 50fps for a good experience on a PC today, and over 80fps at anything above 1080p. The crucial thing to note is that the dedicated ray-tracing hardware takes so long to crunch the numbers that half the power of the rest of the card goes unused, because the frame rate it puts out is so low.
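    For a rough feel for why those frame-rate targets matter, here's a back-of-the-envelope frame-time calculation. The ray-tracing cost below is a made-up number purely for illustration, not a measured Battlefield V timing; it just shows how a fixed per-frame cost eats the budget the rest of the card has to work with.

        # Rough frame-time arithmetic behind the fps targets above.
        # The 12 ms ray-tracing cost is an illustrative assumption, not a measurement.

        def frame_budget_ms(target_fps):
            """Milliseconds available per frame at a given frame-rate target."""
            return 1000.0 / target_fps

        for fps in (50, 80):
            print(f"{fps} fps target -> {frame_budget_ms(fps):.1f} ms per frame")

        rt_cost_ms = 12.0                      # assumed cost of the ray-traced effects
        budget = frame_budget_ms(50)
        print(f"Shading budget left at 50 fps: {budget - rt_cost_ms:.1f} ms "
              f"({100 * rt_cost_ms / budget:.0f}% of the frame spent on ray tracing)")

    At 50fps you only get 20 milliseconds per frame, so anything that burns half of that on its own leaves the rest of the GPU twiddling its thumbs.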

    Battlefield V is merely the first game to be tested, and I'm looking forward to someone benchmarking Metro: Exodus when its RTX update arrives, but it looks like ray-traced games will need a few more years before anyone can really enjoy the experience. Two RTX cards can be run together, and the new SLI technology is quite impressive, but that's a grand total of $2,500.00 for the video cards alone, and I have no clue whether it would help the ray-tracing benchmarks.

    The strongest implication to me is that ray tracing needs higher-speed FPGA-style circuitry and HBM4 memory to become a consumer reality, because you really want something like three times as much ray-tracing performance in a game as these cards can produce. It could even require a shrink beyond the current 7nm, and there's just no way to say for sure. By late next year or 2020 we should see AMD and Intel offering alternatives, and the interesting thing is that they both have their own variations on FPGA circuitry that might make a big difference. Intel, for the most part, either develops its own circuitry or buys the companies that make it, while Nvidia's tensor cores are, as I understand it, IBM technology, and AMD has partnered with one of the largest and oldest manufacturers of FPGA circuitry.

    Game developers could quickly find a variety of ways to run ray-traced games significantly faster, but the frame rates are so low even at 1080p that it will remain a novelty for some time to come. In the long run, path tracing is the way to go, but we'll have to wait and see what Intel and AMD can produce before we have any idea how long that might take to come to market. Basically, I'd say Nvidia added ray tracing to these cards mainly to introduce it to the public and, more importantly, to introduce all the other technology their tensor cores enable, which is considerable.
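    To give a sense of why path tracing is so much heavier than the single ray-traced effects in these games, here's a toy sketch of the nested sampling loop involved. It is not a real renderer; the sample counts, bounce depth, and surface numbers are all made-up stand-ins.

        import random

        # Toy sketch of path-tracing cost: every pixel averages many random light
        # paths, each of which bounces several times. All values are illustrative.

        MAX_BOUNCES = 4
        SAMPLES_PER_PIXEL = 64            # real path tracers often want far more

        def trace_path(depth=0):
            """Follow one random light path and return its contribution (stubbed)."""
            if depth >= MAX_BOUNCES:
                return 0.0
            emitted = 0.1                                  # pretend emitted light
            reflectance = random.uniform(0.3, 0.7)         # stand-in for a sampled bounce
            return emitted + reflectance * trace_path(depth + 1)

        def shade_pixel():
            """Average many random paths to get one pixel's value."""
            return sum(trace_path() for _ in range(SAMPLES_PER_PIXEL)) / SAMPLES_PER_PIXEL

        print(shade_pixel())

    One ray-traced reflection is roughly one extra ray per pixel; a path-traced pixel in this sketch is dozens of rays, each bouncing several times, which is why full path tracing is still years out for consumer frame rates.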

    Unfortunately, as useful as these tensor cores are, nobody has been able to use them for gaming and home applications before, and it could easily take three years just for developers to sort out all the fundamental ways they can be used. Physics, AI, and the obvious applications only begin to scratch the surface, with Nvidia working on an "infinite resolution" system that should enable significantly faster downloads. Battlefield V was a 45GB download and the recent patch was 55GB, and those kinds of downloads are ridiculous at today's internet speeds. About the only thing I can think of that these graphics cards are actually good for right now is photography and video editing, or playing at 1440p or higher resolutions with high frame rates and no ray tracing. The cheapest RTX card is about right for 1080p.
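    Just to put those patch sizes in perspective, here's a quick back-of-the-envelope calculation; the connection speeds are assumptions on my part, not anyone's measured line.

        # How long 45GB and 55GB downloads take at a couple of assumed speeds.

        def download_hours(size_gb, speed_mbps):
            """Hours to download size_gb gigabytes at speed_mbps megabits per second."""
            size_megabits = size_gb * 1000 * 8      # GB -> megabits (decimal units)
            return size_megabits / speed_mbps / 3600

        for size in (45, 55):
            for speed in (25, 100):
                print(f"{size} GB at {speed} Mbps ~ {download_hours(size, speed):.1f} hours")

    Even on a solid 100 Mbps line, a 55GB patch is over an hour of downloading, and on a more typical 25 Mbps connection it's most of an evening.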
     
  2. Irminsul

    Irminsul Valkyrie

    Messages:
    62
    Likes Received:
    111
    Lol. Battlefield.
     
  3. wooleeheron

    wooleeheron Brain Damaged Lifetime Supporter

    Messages:
    9,025
    Likes Received:
    2,394
    It's really a multiplayer game that's famous for its graphics and for letting you run and gun or jump into the cockpit of an airplane or tank. Open-world games are in huge demand, and the AAA developers are moving everything into the cloud, betting on being able to create the next multiplayer World of Warcraft, something that will totally eclipse all other video game sales. They literally can't count their money fast enough, with idiots even preordering games, which is why both AAA games and video cards are now three times more expensive, and they're trying to figure out how to collect money even faster from those same idiots with an online multiplayer game in the cloud. There is so much money involved that they are building nationwide cloud server networks right now to make sure everyone has a good connection.

    It's empire, baby, and this train ain't stopping until she derails. Hollywood is increasingly moving into the industry, and you don't really want to know how far it's gone. Despite graphics card prices shooting through the roof, Nvidia cannot make them fast enough, and neither can AMD. That's partly because the cellphone industry and blockchain are sucking up all the available memory for computers, and something like a trillion dollars has been invested in AI alone in the last year or so. AMD's newest server puts out a petaflop for cheap-cheap, and they're taking business Intel might otherwise get, but they're also generating new business from people who couldn't afford it before, which helps drive up demand and prices. In other words, even as they make the shit incredibly cheaper and faster, doing with a single consumer graphics card today what used to cost $60,000.00, they're generating business faster than they can keep up with production, and even Intel has had to delay some of its chips.

    Die-hard PC gamers and reviewers who know the industry can't stop going cross-eyed and holding their breath, waiting for prices to finally come down to sanity once the consoles begin to catch up with PC hardware. If you want a sure sign the end of the world is imminent, just check EA's stock price. These guys make Microsoft look like your best friend, and all I can say is thankfully Linux is about to explode onto the scene.
     
    Last edited: Nov 15, 2018
  4. Irminsul

    Irminsul Valkyrie

    Messages:
    62
    Likes Received:
    111
    It's a multiplayer game based around complete non-realism, and it's a complete joke if you ask me. Always has been. Not even a Barrett .50 cal will take a person down at distance with one hit.

    Post Scriptum, bud. You want a fair dinkum FPS, try Post Scriptum with me.
     
  5. wooleeheron

    wooleeheron Brain Damaged Lifetime Supporter

    Messages:
    9,025
    Likes Received:
    2,394
    Thanks, I've never heard of that one. But I'm an odd sort and just play shooters as a mindless distraction. I can do a Rubik's Cube in my head, and arcade-style 3D action, such as corridor shooters, offroad racing, or flying a Star Wars fighter, is just how I relax and think about other things. Sometimes I debate the finer points of quantum mechanics while blowing the heads off mutants.
     
  6. wooleeheron

    wooleeheron Brain Damaged Lifetime Supporter

    Messages:
    9,025
    Likes Received:
    2,394


    Here's another review of the DLSS technology the new tensor cores enable. You can think of the tensor cores as applying rough, approximate math to get a quick answer, which is then tuned by training the approximation against lots of variations. In this case, DLSS stands in for anti-aliasing, which normally adds a lot of extra number crunching just to get rid of the "jaggies". The problem is that your TV or monitor isn't 8K, so your eye can tell when the game resolution and the monitor resolution are too low to fake that kind of detail without extra help. The DLSS approach is effectively upscaling or upsampling: the game renders at a lower resolution and the card reconstructs a convincing higher-resolution image with fewer jaggies.

    The really nice part of this approach is that the same tensor cores can be used for other things, while using them for anti-aliasing produces a serious boost in frame rates and close-to-4K results on lower-resolution monitors. As wonderful as photorealism and 8K are, the truth is that most people agree the difference between 4K and 8K is so slight it isn't worth spending money on, while if you can get 4K-quality results on a 1440p monitor, that means almost perfect graphical fidelity on what are about to become the next standard monitors, just by spending a little more on a graphics card.
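    To make the upscaling idea concrete, here's a toy sketch of the baseline it improves on: render fewer pixels, then scale the frame up to the display resolution. This has nothing to do with Nvidia's actual network; the resolutions are just example numbers, and the dumb nearest-neighbour upscale is exactly the step DLSS replaces with a trained network running on the tensor cores.

        import numpy as np

        # Toy illustration of the upscaling idea: render at a lower resolution,
        # then scale up for the display. DLSS swaps this naive upscale for a
        # learned reconstruction, which is why it looks far better.

        render_h, render_w = 1440, 2560          # example internal render resolution
        target_h, target_w = 2160, 3840          # example display ("4K") resolution

        frame = np.random.rand(render_h, render_w, 3).astype(np.float32)  # fake rendered frame

        # Nearest-neighbour upscale: map each output pixel back to a source pixel.
        ys = np.arange(target_h) * render_h // target_h
        xs = np.arange(target_w) * render_w // target_w
        upscaled = frame[ys][:, xs]

        print(frame.shape, "->", upscaled.shape)
        # Rendering 1440p instead of 4K shades roughly (2560*1440)/(3840*2160) ~ 44%
        # of the pixels, which is where the frame-rate win comes from.

    The frame-rate gain comes entirely from shading fewer pixels; the quality question is just how well the reconstruction step hides that fact.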

    Of course, it's expensive by today's standards, with a system containing one of these graphics cards easily costing two thousand dollars, but within two years we should see prices start to plunge for a wide variety of reasons, including manufacturers getting a better grip on memory production. AMD pioneered the HBM standard that everyone adopted, but it has been in enormous demand for servers, and the market is simply starved for memory at this time.
     
