NVIDIA Demos Turing GPU Based Quadro RTX Graphics Card in Ray Tracing

Turing is here, though not yet the consumer cards, only the more expensive professional versions. The consumer versions are coming soon, however, and this article illustrates what we might expect from those video cards in the way of ray tracing ability. Nvidia ran the Star Wars demo on a single video card, but that video card costs $6,000. Programmers keep finding new ways to optimize software and cut the requirements in half, and they have already found at least one such optimization since this Star Wars video was made. Assuming prices are cut in half every three years and the demo eventually requires half the power to run, within five years or so video cards capable of the most outrageous ray tracing will start to fall into more common consumer price ranges (see the toy projection at the end of this entry).

However, what Nvidia is pushing is hybrid ray tracing that mixes ray tracing with rasterization. The results can be quite impressive, and developers want to go in this direction anyway because they have already come so close to ray-traced lighting using rasterization alone. The AI, physics, and other things that even the consumer cards will be capable of rendering are a mind-blowing leap forward in themselves, and Nvidia rightfully claims this is the biggest innovation in consumer graphics in over ten years. Turing pairs extremely fast asynchronous processing with advances in memory and the kind of dedicated AI circuitry now going onto just about every new processor. Eventually, the end result will be even cheap home computers resembling a different kind of animal altogether than what we are familiar with: computers that don't have program bugs or bandwidth bottlenecks or any of the usual concerns of today, and that can adapt to each individual as if they were personal assistants. At that point, most people will cease to care about what exact hardware goes into the box and will simply buy brand names or generic computers the way you might buy a cheap watch that is capable of way more than anyone ever cares to know.
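As a rough illustration of that halving assumption, here is a toy projection in Python. The $6,000 starting point and three-year half-life are simply the figures quoted above, and the extra factor of two for software optimization is my own reading of the paragraph, not anything Nvidia has committed to.

```python
# Toy projection of what hardware capable of running the Star Wars demo might
# cost over time, assuming the price halves roughly every three years. These
# are loose assumptions pulled from the paragraph above, not a roadmap.
start_price = 6000          # rough price of the Quadro RTX hardware in the demo
half_life_years = 3         # assumed halving period for price
software_gain = 2           # assumed one-time gain from optimizations already found

for year in range(0, 13, 3):
    price = start_price / software_gain * 0.5 ** (year / half_life_years)
    print(f"Year {year:2d}: ~${price:,.0f}")
```

By that crude math, the hardware needed for the demo drifts into enthusiast territory around year five or six, which is roughly where the estimate above lands.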
GeForce RTX 2080 And RTX 2080 Ti Extravaganza: More Cards From MSI, Palit And Gigabyte Leak

This is the latest leak on the upcoming Turing offerings from MSI, and these are the first two RTX cards that will come to market. Both AMD and Nvidia tend to tune the performance of these graphics cards so they can extract the most money from consumers, and these two cards are probably not an exception. At roughly $1,200 they are way too expensive for most people's taste, but right in the price range for PC gaming enthusiasts, because Nvidia knows exactly how much those guys will pay. The average consumer wants to spend $200 or less on a graphics card upgrade for their crappy computer, while around $350 is what a lot of people would pay for significantly better graphics at things like 4K resolutions. This upcoming generation of ray tracing video cards will remain expensive but, later this year, we can expect stiffer competition among lower tier video cards such as the GTX 1060, which is capable of doing justice to 4K video gaming, and, hopefully, the blockchain will cooperate in helping to keep prices coming down. If I had to guess, the upcoming RTX 2080 Ti will become the next generation standard for ray tracing, but we'll have to see the benchmarks and how developers leverage the technology.
NVIDIA GeForce RTX 2070 Allegedly Sports 2304 CUDA Cores, 8GB GDDR6, $400-ish Price

Everybody is springing leaks now that MSI first spilled the beans. This one falls in the category of all too plausible, with the RTX 2060 quite likely capable of easily handling any game on the market today with good frame rates at 4K resolutions. The RTX 2070 will likely be the one enthusiasts go for, with $400 being about $50 more than they tend to prefer to pay. However, all these prices depend on the retailers and the blockchain, so keep your fingers crossed. What does seem certain is that the price of memory will stay about as low as possible for the foreseeable future. If you need RAM or something, now is the time to shop for deals. Nvidia actually wants to keep these prices low, because they need to restore their consumer gaming market, which has basically been strangled by the blockchain, and sales are exactly what the doctor ordered.
NVIDIA GeForce RTX 2080 Ti, RTX 2080 And RTX 2070 Launch September 20th Starting At $499

Here are three new demos of Nvidia's Turing ray tracing. The first video, of Battlefield V, demonstrates how ray tracing can be used to great effect just for adding reflections and realistic glass, while the second, a Metro Exodus demo, illustrates how ray tracing can provide much more realistic shadows that make all the rasterized artwork pop. No doubt all of these ran on an RTX 2080 Ti, and that card may very well be the minimum worthwhile amount of ray tracing power at this point, if you have that kind of money. HotHardware has an article that contains pretty much the same information as the last video, but the last video includes a good explanation of what all this means for the future of video gaming. Jay builds incredibly expensive PC gaming rigs and always gives the straight poop about prices and the marketing bullshit to watch out for. In this case, he notes that the real ray tracing ability of these video cards is still a big unknown, that this first generation of cards will merely provide a baseline, and that the next generation released in perhaps another two years should give us a better idea of what kind of prices and performance we can expect from the new technology going forward.

As for prices, it looks like Nvidia and their partners will attempt to gouge the market as much as possible with the first release of these "Founders Edition" cards, and they could conceivably keep gouging the public until someone else releases a competitive product in another year or so. The cheapest video card they plan on selling will cost at least $500 for what should only cost maybe $300. The situation sucks, but video gamers demanded Nvidia graphics cards and prices, along with Intel processors, and I have no sympathy for fanboys who complain and keep shelling out the money. Companies like AMD might be a few years behind in technology, but they are working hard to create open source alternatives that make everything more affordable. AMD's HBM standard promises much more affordable graphics cards in the long run, while Intel is working on introducing their first discrete graphics card in ages sometime next year. Nvidia might be ahead and able to charge through the nose right now, but that's what competition is for. Memory prices are crashing, each new generation of memory promises nearly twice the performance at a lower cost, and we'll just have to wait and see what the playing field looks like next year, after we have some actual benchmarks to compare. The Gamescom description of RTX shows how it can easily be turned on and off at will for different effects and, in general, benchmarking this kind of graphics is becoming more of an art form.
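To make the "turned on and off at will" point concrete, here is a minimal sketch of how a hybrid renderer might gate each ray-traced effect behind its own toggle, the way the Battlefield V and Metro Exodus demos each showcase a different effect. Every function here is a hypothetical placeholder for illustration, not a real engine or Nvidia API.

```python
# Toy sketch of a hybrid frame: primary visibility stays rasterized, and each
# ray-traced effect sits behind its own toggle. All functions are hypothetical
# stand-ins, not a real engine API.

def rasterize(scene, camera):            return "g-buffer"
def screen_space_reflections(gbuf):      return "approximate reflections"
def shadow_map_lookup(scene, gbuf):      return "approximate shadows"
def trace_reflection_rays(scene, gbuf):  return "ray-traced reflections"
def trace_shadow_rays(scene, gbuf):      return "ray-traced shadows"

def render_frame(scene, camera, rtx_reflections=False, rtx_shadows=False):
    gbuf = rasterize(scene, camera)
    # Each ray-traced pass only replaces its rasterized approximation when toggled on.
    refl = trace_reflection_rays(scene, gbuf) if rtx_reflections else screen_space_reflections(gbuf)
    shad = trace_shadow_rays(scene, gbuf) if rtx_shadows else shadow_map_lookup(scene, gbuf)
    return gbuf, refl, shad

# Battlefield V-style setup: ray trace the reflections, leave shadows rasterized.
print(render_frame("scene", "camera", rtx_reflections=True))
```

The practical upshot is the point made above: because each effect can be switched independently, benchmark numbers depend heavily on which passes a given game actually turns on.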
Jay is a popular video game rig builder who talks the straight dope. In this video he covers all the concerns about the RTX 2080 coming out in a few weeks. From what I can tell this early on, it seems like a video card should really have about three times as much ray tracing power as the most powerful of these cards, and frame rates are the big issue. We'll just have to wait and see what the frame rates are, but it's a very good reason not to buy one of these expensive cards right now and to wait until next year to even think about buying one. About 60 gigarays per second is what I think you ideally need, and it will be at least six years before that becomes anywhere near affordable. For a while there was speculation that AMD was attempting to use their Infinity Fabric to connect GPUs together. By connecting a bunch of cheaper, older chips with a lot of VRAM, you can create one hell of a GPU, but AMD has hinted that this has turned out to be harder than anyone anticipated. Similarly, Intel hired AMD's GPU architecture guru and has been developing a single-chip architecture for raw speed, which is their specialty, making silicon run faster. Theoretically, with next generation memory, Intel or AMD could produce a graphics card capable of up to 60 TFLOPS, which is a ridiculous amount for anything any consumer would ever know what to do with. That's roughly one third the power of a Star Trek holodeck, while we should see dramatic improvements in the technology at around 20 TFLOPS as early as next year. Anyway, next year should prove interesting, to say the least. Nvidia has the jump on everyone in ray tracing for a lot of reasons, but is relying on their old tactic of making an enormous chip that is more expensive to manufacture but helps them get their product to market faster. There is plenty of competition from others as well, such as Otoy, who have much more efficient and possibly cheaper to manufacture approaches, but may need a few more years to develop them.
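For a rough sense of where a figure like 60 gigarays per second comes from, here is the back-of-the-envelope math. The rays-per-pixel counts are my own assumptions for a light hybrid pass versus heavier path-traced sampling; they are not numbers from Nvidia or from Jay's video.

```python
# Back-of-the-envelope ray budgets at 4K and 60 fps. The rays-per-pixel figures
# are assumptions for illustration: a light hybrid pass vs. heavier path tracing.
pixels = 3840 * 2160   # one 4K frame
fps = 60

for label, rays_per_pixel in [("light hybrid effects (~10 rays/pixel)", 10),
                              ("heavier path tracing (~120 rays/pixel)", 120)]:
    gigarays = pixels * fps * rays_per_pixel / 1e9
    print(f"{label}: ~{gigarays:.0f} gigarays/sec")
```

Nvidia rates the RTX 2080 Ti at roughly 10 gigarays per second, which covers the first scenario with some headroom; the second scenario is the kind of budget the 60-gigaray guess above is pointing at.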
This guy is certainly no technical genius, but he knows the subject, gives good reviews, and is not the slightest bit hesitant to criticize companies like Nvidia and Intel. In this video he describes how the entire launch of Nvidia's RTX cards has gone south from the start. The cards are not even on the market yet, and Nvidia is spouting all sorts of misleading crap about them. So much misleading crap that you almost need an expert to sort it all out. The reviewer mentions that half of the 21 video games that will support the new RTX technology only use it for super sampling anti-aliasing (DLSS) and not for ray-traced lighting. Jaggies have always driven gamers up the wall, and RTX is the first truly fast and effective method of dealing with them. Then again, so is simply buying a 4K monitor, since jaggies are largely an artifact of lower resolutions. Being able to combine both high resolution monitors and RTX anti-aliasing means the picture will "pop" that much more in HDR, but these are minor improvements which Nvidia is asking way too much money for, and people are pissed. But every cloud has a silver lining and, in this case, the silver lining is that all the rumors and indications are that Nvidia has a huge pile of GTX 1080 Ti cards they have to sell at a discount. With any luck, the bad news will keep coming and their prices will drop through the floor.
This is an interview with one of the developers of Metro Exodus. He says their goal has been to provide 60fps ray tracing at 1080p, which makes sense. Crunching more rays gets dramatically harder as the resolution rises, since the ray count scales with the number of pixels, and 4K is roughly the point where the image gets so close to reality that most people say it isn't worth paying for more. Around 1440p could become the next standard resolution after 1080p, which closes the gap enough to make an eye-popping improvement. Once you get used to higher resolutions, you never look at lower ones the same way again. Within about two years, consumer graphics cards might have the ability to push that many rays simultaneously, and even to do it efficiently. We'll just have to wait and see what happens, because this is entirely new technology pushing new limits on what can be done with silicon. At any rate, it appears my estimates were roughly correct and we might have to wait as long as six years for the technology to truly mature and start coming down in price, but let's hope not. Right now this technology is excellent for photography, capable of making low resolution pictures look like 8K. Video is not that far off in the future, and a graphics card like this can do all sorts of other wild tricks that people haven't even thought up yet.
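To put some rough numbers behind why 1080p is the sensible first target, here is the pixel math. The 60 fps figure comes from the interview; the rays-per-pixel count is just an illustrative assumption.

```python
# How the ray budget grows with resolution at a fixed 60 fps and a fixed
# (assumed) rays-per-pixel count. The absolute numbers matter less than the
# scaling: 1440p costs ~1.8x a 1080p frame and 4K costs ~4x.
fps = 60
rays_per_pixel = 8   # assumed, purely for illustration

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    gigarays = w * h * fps * rays_per_pixel / 1e9
    scale = (w * h) / (1920 * 1080)
    print(f"{name}: ~{gigarays:.1f} gigarays/sec ({scale:.1f}x the 1080p pixel count)")
```

So a card that can just hold 60 fps with ray-traced effects at 1080p needs roughly four times the ray throughput to do the same at 4K, which is why 1440p looks like the natural intermediate step.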
This is a long video trashing Nvidia for their deceptive marketing. The guy who made the video is an expert in graphics, video cards, and displays, and he basically shows how the GTX 1080 was designed for 1080p resolution, the GTX 1080 Ti was designed for 1440p, and the RTX 2080 is designed for 1440p rather than 4K, and that the newer video cards are not remotely as fast as Nvidia's marketing department is attempting to make them sound. People who want 4K gaming can still expect to need two high end video cards, but prices on the GTX 1080 are expected to keep falling over the next several months, and they can cost half as much as the newest RTX 2080. We might even see a few serious sales come Black Friday and the start of the new year. Reviewers are calling additional Nvidia marketing claims total bullshit, and the entire launch of this new video card has gone south from day one, I assume because the people working at Nvidia know all too well they are spouting more bullshit than a cow with diarrhea, and would rather appear foolish than downright greedy. Assuming AMD and Intel release new graphics cards next year, we shall see just how loyal Nvidia fanboys are when they are presented with seriously inexpensive alternatives.
Video games are often dismissed as a waste of time, but people waste an awful lot of time watching 500 channels of unsatisfying crap, and at least video games are interactive. It's harder to forget who is in charge around here when it's you. The prices of RAM are also dropping rapidly, but the entire industry resembles the Mafia, so you have to buy crap whenever it's cheap, because they'll inevitably find new ways to make something outrageously expensive. Power supply manufacturers are the worst, but have gotten better in recent years due to the rest of the industry threatening to kill them. If you already have a monitor that's worth a crap, you can always wait to upgrade until you find something on sale and have the cash. The Box is what PC gaming is all about, and for a little over a thousand bucks you can build a pretty damned impressive box these days, one that can be cheaply upgraded for at least the next six to ten years. AMD's Ryzen chips have changed the whole ballgame, lowering the cost of high powered video gaming rigs to within easy consumer reach, but the damned blockchain and the Asian memory cartel strangled the market for RAM and video cards. Next year should definitely tell the tale of just how cheap prices can go.