Ampere Rumors

Discussion in 'Computers and The Internet' started by wooleeheron, Apr 3, 2018.

  1. wooleeheron

    wooleeheron Brain Damaged Lifetime Supporter

    Messages:
    9,025
    Likes Received:
    2,394
    Market Realist

    Technically, Nvidia's Titan V, based on their Volta architecture, is the most powerful consumer graphics card on the market today, with 110 teraflops of tensor compute. That's almost half the estimated requirement for running a Star Trek holodeck, and the card can casually crush any video game on the planet at 4k resolutions and high frame rates. Unfortunately, it contains additional circuitry for AI that gamers don't need, is not really adapted for gaming, and costs no less than $3,000.

    For the rest of us mere mortals with shallow pockets, the standard today is the GTX 1080 Ti, which is widely considered the first affordable graphics card capable of doing 4k gaming serious justice, with just 11.5 teraflops at a third of what the Titan V costs. If it were not for mining driving prices up, the 1080 Ti would cost one fifth the price of the Titan V. Ampere looks likely to hit the shelves sometime in July or August and will be Nvidia's first real-time ray tracing graphics card.

    Very likely, it will have some of the same ASIC accelerators the Titan V does, dedicated to "Hybrid Ray Tracing" that adds ray-traced lighting and shadows to an otherwise rasterized scene. The truth is, rasterized engines are so good they can compete with ray tracing, but where they can't, such as rendering glass, reflections, and shadows, it's a huge leap forward in quality. Nvidia has huge advantages in both market share and technology, fostering early adoption, while AMD's advantage will become apparent with the release of the next generation Xbox and PlayStation in 2020. These will almost certainly use Ryzen chips and AMD graphics for ray tracing, compelling developers to adopt AMD's ray tracing system, which supports much more open platforms.

    In the meantime, the video game Star Citizen may be one of the few to adopt AMD's new Radeon Rays, while we can expect many AAA titles to adopt Nvidia's new RTX within the next couple of years. Among other things, the Titan V uses HBM2 memory, which is expensive, and Ampere is probably going to use GDDR6 instead, which still costs roughly 20% more than GDDR5. I would expect these new Ampere cards to put out somewhere up to 21 teraflops, with roughly 14 teraflops being enough for the job in most cases. But that's a guess, with hybrid ray tracing requiring roughly a third more power than standard 4k.
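
    To make the guess concrete, here is a quick back-of-the-envelope sketch in Python, using only the rough figures from this post (the 1080 Ti's ~11.5 teraflops as the plain-4k baseline and an assumed one-third overhead for hybrid ray tracing), not official specs:

        # Back-of-the-envelope estimate from the rough figures above, not official specs.
        tf_plain_4k = 11.5                  # roughly a GTX 1080 Ti at plain 4k
        hybrid_rt_overhead = 1 / 3          # assumed extra cost of hybrid ray tracing
        tf_target = tf_plain_4k * (1 + hybrid_rt_overhead)
        print(f"rough target: {tf_target:.1f} teraflops")   # ~15.3, inside the 14-21 range guessed above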
     
    Last edited: Apr 4, 2018
  2. Irminsul

    Irminsul Valkyrie

    Messages:
    62
    Likes Received:
    111
    I wish they'd just stick to one thing for a decade hey like this 4k shit, is it really necessary? I have a theory that with every new fancy pants TV and screen that comes out, so does your eye prescription as you realise your eyes suck on the new format.
     
  3. wooleeheron

    wooleeheron Brain Damaged Lifetime Supporter

    Messages:
    9,025
    Likes Received:
    2,394
    The 4k resolution standard is probably the end of the line, with everyone agreeing that 8k is not a significant enough improvement to be worth spending money on. Monitors are probably going to double in brightness, so they more often look like you are actually looking through a window, get much better color, and reach refresh rates over 200 Hz so you can't notice any blur. All of that is simply what is becoming cheaper to manufacture while still making a big difference. Adding ray tracing makes a huge difference as well and, personally, I'm excited, because PC video gaming lost a lot of its appeal for me over the last several years by merely making everything higher resolution.

    My own interest is in the 3D quality of the images, which better lighting enhances a great deal, and in the AI it makes possible. Ray-traced bots can easily be programmed to look you and each other in the eye and act as if they recognize what you are doing, while currently many bots might as well be bouncing off the walls for all the realism they display. The same ray tracing circuitry can also be used for some physics, like swirling leaves and whatnot.
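
    As a rough sketch of what that kind of bot logic looks like (hypothetical names throughout, not any real engine's API), the core of it is just a field-of-view check plus a ray cast from the bot's eye to the player's eye:

        import math
        from dataclasses import dataclass

        # Hypothetical sketch, not any real engine's API: a bot can "make eye
        # contact" if the target is roughly in front of it and nothing blocks
        # the ray from its eye to the target's eye.
        @dataclass
        class Agent:
            eye_pos: tuple   # (x, y, z) position of the eye
            forward: tuple   # unit vector the agent is facing along

        def can_make_eye_contact(bot, target, blockers, fov_deg=120.0):
            d = tuple(t - b for t, b in zip(target.eye_pos, bot.eye_pos))
            dist = math.sqrt(sum(c * c for c in d))
            if dist == 0:
                return True
            direction = tuple(c / dist for c in d)
            # Field-of-view check: is the target in front of the bot?
            if sum(f * c for f, c in zip(bot.forward, direction)) < math.cos(math.radians(fov_deg / 2)):
                return False
            # Stand-in for the engine's (or the RT hardware's) ray cast; here the
            # "scene" is just a list of sphere obstacles as (center, radius).
            for center, radius in blockers:
                to_c = tuple(c - b for c, b in zip(center, bot.eye_pos))
                t = max(0.0, min(dist, sum(a * b for a, b in zip(to_c, direction))))
                closest = tuple(b + t * c for b, c in zip(bot.eye_pos, direction))
                if math.dist(closest, center) < radius:
                    return False   # line of sight is blocked
            return True

        bot = Agent(eye_pos=(0, 1.7, 0), forward=(0, 0, 1))
        player = Agent(eye_pos=(0, 1.7, 10), forward=(0, 0, -1))
        print(can_make_eye_contact(bot, player, blockers=[]))                    # True
        print(can_make_eye_contact(bot, player, blockers=[((0, 1.7, 5), 1.0)]))  # False: something is in the way

    If the test passes, the bot can turn its head toward the target's eye position and play a recognition animation, which is most of what "looking you in the eye" amounts to.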

    This is related to Microsoft's next generation interfaces. Using just two cheap cameras, which can even be built into a laptop's display frame, the computer can recognize your gestures and expressions, and even track your eyes. Eye tracking is sometimes far superior to using a mouse: much more intuitive and faster.
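
    To give a rough sense of how little the basic detection step needs, here is a sketch using OpenCV's bundled Haar cascades to find a face and eyes in a single webcam frame (real gaze tracking needs calibrated cameras and far more math; this is only the first step):

        import cv2

        # Find a face and the eyes within it in one webcam frame using the
        # Haar cascades that ship with OpenCV. Only the detection step; real
        # gaze tracking needs calibration and far more math.
        face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

        cap = cv2.VideoCapture(0)          # the first (or only) webcam
        ok, frame = cap.read()
        cap.release()
        if ok:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
                face = gray[y:y + h, x:x + w]
                eyes = eye_cascade.detectMultiScale(face)
                print(f"face at ({x}, {y}), {len(eyes)} eye(s) found")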
     
    Last edited: Apr 4, 2018
  4. wooleeheron

    wooleeheron Brain Damaged Lifetime Supporter

    Messages:
    9,025
    Likes Received:
    2,394
    First Look: NVIDIA Next Gen Turing GeForce GTX Prototype Card Spotted

    This article got ahold of a picture of what appears to be an Nvidia Turing or Ampere video card with 12 GB of GDDR6, giving it roughly the bandwidth of a Titan V. That puts it squarely in the neighborhood of 13-14 teraflops of performance, or exactly what you would want from a video card capable of rendering just about anything, including real-time ray-traced lighting.
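
    The bandwidth claim is easy to sanity-check. Assuming, hypothetically, a 384-bit bus (twelve 32-bit GDDR6 chips) running at 14 Gbps, the rumored card lands right next to the Titan V's published ~653 GB/s:

        # Memory bandwidth = bus width in bytes x per-pin data rate.
        # The 384-bit / 14 Gbps figures for the rumored card are assumptions;
        # the Titan V numbers are its published HBM2 configuration.
        def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
            return bus_width_bits / 8 * data_rate_gbps

        print(bandwidth_gb_s(384, 14))    # ~672 GB/s for the rumored GDDR6 card
        print(bandwidth_gb_s(3072, 1.7))  # ~653 GB/s for the Titan V's HBM2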
     
  5. relaxxx

    relaxxx Senior Member

    Messages:
    3,459
    Likes Received:
    722
    Maybe I'm getting too old, but I can't see myself ever getting a 4K TV. It would have to be at least 90". At 84" it would have the exact same pixel size, or pitch, as my 42" 1080p TV. This is what I have my "gaming" PC connected to: a second-hand 4th-gen i5 with an R7 250 video card that I paid $33 Canadian for. Recently downgraded from a GTX 950 because it was more power than I needed. Anyway, I can't even stand to use my PC with the resolution set to 1080p. Text is too small, so I have to zoom my browser to 150%. At 6 feet away, I can barely tell the difference between 1080p video and 720p video. I prefer to actually be able to distinguish pixels in my games. This $33 card can more than handle 720p gaming; even set to 900p, most games run at 60 fps. I can't see the difference between 900p and 1080p at all.
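
    For anyone who wants to check the 84"/42" point, pixels per inch is just the diagonal pixel count divided by the diagonal size in inches, and the two come out identical:

        import math

        # Pixels per inch: diagonal resolution divided by diagonal size.
        def ppi(width_px, height_px, diagonal_in):
            return math.hypot(width_px, height_px) / diagonal_in

        print(ppi(1920, 1080, 42))   # ~52.5 PPI for a 42" 1080p TV
        print(ppi(3840, 2160, 84))   # ~52.5 PPI for an 84" 4K TV -- same pixel pitch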
     
    Last edited: Jun 27, 2018
  6. Asmodean

    Asmodean Slo motion rider

    Messages:
    50,556
    Likes Received:
    10,126
    1080p really is sharp enough indeed. Let the gadget horny consumers spend lots of money on the newest improvements. Smart sensible consumers enjoy the 'old' and now cheaper screens :p
     
    relaxxx likes this.
  7. wooleeheron

    wooleeheron Brain Damaged Lifetime Supporter

    Messages:
    9,025
    Likes Received:
    2,394
    We've all been ruined by flat screen displays. The truth is, the old Sony Trinitron CRT was the perfection of the classic boob tube, still makes most modern flat TVs and monitors look like crap, and was still being sold two decades on because it really was that good and flat alternatives were expensive. In comparison to the old CRT technology, the average flat TV and monitor is a piece of crap that is incredibly dim, has a super low refresh rate, and has horrible color and resolution.

    A standard 1080p display is good enough these days, but 4k is really what most people want and it will eventually become the new standard, perhaps in another ten years. In the meantime, HDR10+ is a newer standard already being adopted by the industry that lets even a 1080p monitor fake much of the impact of a 4k one. There are different designs, but I have one myself, and mine uses eight backlights it can control separately, so the sun in the sky can look brighter while the shadows remain dark instead of turning into a muddy mess. It can put out roughly twice the brightness of your average monitor, and by varying the brightness of the different backlights it can make that difference look even more dramatic. Just as importantly, it can display vastly more colors and shades of gray than your average 8-bit 1080p monitor today; 10-bit color works out to over a billion shades versus about 16.7 million. You can even buy a 1080p monitor or TV that has HDR, will display that wider range of colors, and does a decent fake of a higher-end picture.
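
    As a toy illustration of how those separately controlled backlights work (purely illustrative, not any real monitor's firmware), split the frame into eight zones and drive each zone's backlight from the brightest content inside it:

        import numpy as np

        # Toy eight-zone local dimming: each zone's backlight follows the
        # brightest pixel in its slice of the frame, so a bright sun can sit
        # next to zones that stay dim. Illustrative only.
        def zone_backlight_levels(frame_luma, zones=8):
            h, w = frame_luma.shape
            bounds = np.linspace(0, w, zones + 1, dtype=int)
            return [float(frame_luma[:, bounds[i]:bounds[i + 1]].max())
                    for i in range(zones)]

        frame = np.zeros((1080, 1920), dtype=np.float32)
        frame[100:300, 1700:1900] = 1.0        # a bright "sun" near the top-right corner
        print(zone_backlight_levels(frame))    # only the right-most zone goes to full brightness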

    Basically, 720p is also called high definition, but that's merely a marketing term. Somewhere around 1440p or slightly higher is a good midway point for approximating a 4k display, while, everyone agrees, 8k is not enough of an improvement over 4k to be worth paying the difference. Monitors and TVs today put out perhaps 300 nits of brightness on average, while broad daylight is 3,000-7,000 nits. If your TV were bright enough and had UHD resolution, it would much more often look indistinguishable from a window.

    Displays of any kind, along with photography and movies, are all about producing the best cheap fakes they can when getting the real deal onto the screen is just not cost effective. A color flat screen TV or monitor is dirt cheap today, but it's also not nearly as good as technology almost half a century old.
     
    Last edited: Jun 27, 2018
  8. Asmodean

    Asmodean Slo motion rider

    Messages:
    50,556
    Likes Received:
    10,126
    I disagree. Maybe that's true of mediocre ones. But for the people who have a use for it, the HD flat screen is an undeniable improvement on the old television screen or PC monitor.
     
  9. relaxxx

    relaxxx Senior Member

    Messages:
    3,459
    Likes Received:
    722
    What I really want is better CRT scanline emulation. I want my '80s arcade games to look like '80s arcade games.
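
    The crudest version of the effect is dead simple: upscale the low-res frame by an integer factor and darken one row per scanline. Real CRT shaders layer phosphor masks, bloom, and curvature on top of this, but as a sketch:

        import numpy as np

        # Crude scanline fake: integer upscale, then dim one output row per
        # original scanline. Real CRT emulation does far more than this.
        def fake_scanlines(frame, scale=4, dark=0.5):
            big = np.repeat(np.repeat(frame, scale, axis=0), scale, axis=1)
            big[scale - 1::scale] *= dark   # darken every scale-th row
            return big

        arcade_frame = np.random.rand(240, 320, 3).astype(np.float32)  # stand-in for a 320x240 frame
        print(fake_scanlines(arcade_frame).shape)                      # (960, 1280, 3)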
     
  10. wooleeheron

    wooleeheron Brain Damaged Lifetime Supporter

    Messages:
    9,025
    Likes Received:
    2,394
    Flat screens are much more portable and make larger TVs possible, and I'm certainly not suggesting we throw away our flat screens. I'm just saying the picture quality has been bad all along, but it is about to improve dramatically, even for cheap models. 1440p or higher is likely to become the next standard resolution, and along with the HDR standard it should make for a huge improvement.
     
  11. wooleeheron

    wooleeheron Brain Damaged Lifetime Supporter

    Messages:
    9,025
    Likes Received:
    2,394
    NVIDIA GeForce GTX1180 from ASUS listed by Vietnamese store, to be released on September 28th | DSOGaming | The Dark Side Of Gaming

    This is the best confirmation yet of the Nvidia Ampere specs, and they line up with what Nvidia has hinted at. That's 16 GB of GDDR6 RAM, which is so much memory I can't begin to fathom all the things you could do with it on a video card. You need at least 6 GB for decent UHD gaming, with maybe 14 GB being ideal. The most striking thing is that it only has a 256-bit bus; you would normally expect a wider bus in a video card with this much power, suggesting Nvidia tweaked the memory setup for graphics in some way. Part of the bus issue is how much advantage a graphics card can take of PCIe lanes for something other than just graphics, such as AI applications. Nvidia may be attempting to make sure their relatively cheap consumer gaming cards can't compete with their HPC ones costing tens of thousands, but we'll have to wait and see. At any rate, that would help draw the lines in the sand between Nvidia and their competition.
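
    A quick way to see why the 256-bit bus isn't automatically a step backwards: GDDR6 runs much faster per pin than the 1080 Ti's GDDR5X, so total bandwidth stays in the same league. The 14 Gbps figure for the new card is an assumption; the 1080 Ti numbers are its published 352-bit, 11 Gbps configuration:

        # Memory bandwidth = bus width in bytes x per-pin data rate.
        def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
            return bus_width_bits / 8 * data_rate_gbps

        print(bandwidth_gb_s(256, 14))   # 448 GB/s for the rumored 256-bit GDDR6 card (assumed data rate)
        print(bandwidth_gb_s(352, 11))   # 484 GB/s for the GTX 1080 Ti's GDDR5X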

    Nvidia says they will recommend an MSRP of $750, but Nvidia's partners are the ones who actually turn the chips into graphics cards and set the prices. This one is listed at $1,500, which should be taken with a grain of salt at this point, since one assumes it was simply posted for sale too soon by mistake. Nvidia's partners would never tolerate the chips being doled out one at a time, much less Nvidia playing favorites with them, and this launch should ensure everyone has them in stock already. The great thing about this video card coming on the market is that it means within a year or two a video card that rocks could cost around $350. If it were not for crypto mining, we would already have them.

    Already the 1080 Ti is coming down in price, and it is expected to drop significantly over the next month or two.
     
