Got buyer's remorse over your 8GB graphics card? Nvidia's AI texture compression promises huge benefits for GPUs with stingy amounts of memory.

This is becoming an all-too-familiar scenario with Nvidia: they sell graphics cards with marginal VRAM for gaming, then produce technology a few years later that makes them work better. Which is great for the little guys who have to buy outdated hardware, but it screws their high-end customers and makes AMD a more viable option. It reflects the fact that demand outstrips supply to the point where they're charging three times what the damn things are worth. For gaming you really only need a cheap CPU these days, but you want 16GB of RAM and 16GB of VRAM, while Nvidia has been doling memory out sparingly. Memory has become a bottleneck because it hasn't kept pace with the progress in the graphics chips themselves, and Nvidia is taking advantage of the lack of competition.

Thankfully, as we approach the point of squeezing a Star Trek holodeck onto a 15-watt chip, Nvidia will have to find someone other than gamers to squeeze for money. Very likely they will roll out their neural compression now and push path tracing for several years. Which means gamers will finally catch a break with the next generation of AMD all-in-wonder chips, which will come AI-ready, and most will gladly wait for path tracing to come down in price. Everyone agrees ray tracing is fine but unnecessary for most games. What it really enables is cheap and easy ways to use AI in games, and programmers will need a few years to figure out which of those ways work and are fun. AMD's current all-in-wonder chips are amazing, everything you could want, but too expensive, and it's all about price right now, with both vendors putting out impressive graphics and frame rates.
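For anyone wondering what "AI texture compression" actually means in practice, here is a rough conceptual sketch: instead of keeping a full texture in VRAM, you store the weights of a tiny neural network and decode texels on demand, trading a little compute for a lot of memory. This is only an illustration in plain Python/NumPy with made-up layer sizes and function names; it is not Nvidia's actual scheme.

```python
# Conceptual sketch of "neural texture compression": store small network
# weights instead of a full texture, and decode texels on the fly.
# All sizes and names are illustrative, not Nvidia's real implementation.
import numpy as np

rng = np.random.default_rng(0)

# A conventional 4K RGBA8 texture: 4096 * 4096 * 4 bytes ~= 64 MiB in VRAM.
conventional_bytes = 4096 * 4096 * 4

# "Compressed" representation: weights of a tiny 2-layer MLP (made-up sizes).
W1 = rng.standard_normal((2, 64)).astype(np.float16)   # (u, v) -> hidden
b1 = np.zeros(64, dtype=np.float16)
W2 = rng.standard_normal((64, 3)).astype(np.float16)   # hidden -> RGB
b2 = np.zeros(3, dtype=np.float16)
network_bytes = sum(a.nbytes for a in (W1, b1, W2, b2))

def sample_texture(u: float, v: float) -> np.ndarray:
    """Decode one texel on demand: a little math instead of a memory fetch."""
    h = np.maximum(np.array([u, v], dtype=np.float16) @ W1 + b1, 0)  # ReLU
    return 1 / (1 + np.exp(-(h @ W2 + b2)))                          # RGB in [0, 1]

print(f"full texture: {conventional_bytes / 2**20:.1f} MiB")
print(f"tiny network: {network_bytes / 1024:.1f} KiB")
print("texel at (0.25, 0.75):", sample_texture(0.25, 0.75))
```

The point of the toy example is only the trade-off: a few kilobytes of weights plus per-sample math stand in for megabytes of texture data, which is why this kind of technique matters most on cards with little VRAM.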
If you want to buy a new video card, first look at what you want from it (games, software, mining, etc.). Look for older factory-overclocked cards, and do some research before you buy (be it Nvidia or AMD). It saves money and disappointment. And "AI" is just a popular term to sell anything at the moment. Mzzls
Computing is really about using memory efficiently and, unless Nvidia releases its neural compression, we will have to wait for the next generation of 2D transistors to see a really significant improvement. The high cost of cramming more onto a single chip, and of developing processors and graphics cards, has inspired many to invest in the rest of the damn computer. It makes no damn sense to have a processor that works ten times faster than the rest of your laptop, unless it sells more laptops. With 2D transistors you can put everything on a single chip that sips power, which is exactly what they're all working up to. Intel even put its own power regulator on its chips, because you can't trust anyone in this business not to cut corners.
When it comes to computers, processing, and progress (AI), we as a society could already do so much more. Finance (profits), the economy (investment), and (geo)politics are holding most things back. People just don't care; they buy the next thing. Consumerism. I am thinking of building a new PC. Going to do some research first; I just want to play games (BF6). Mzzls
Think of a cheap Timex watch that does way more than anybody cares to know. Roughly 260 TFLOPS is enough to run a Star Trek holodeck, and to do almost anything anyone wants, including run a sizable corporation, and you can get that in a large desktop computer today. The only remaining issues are price and portability, which countless professionals pay attention to. Fifteen years ago, rendering something like a ray-traced intro would require a week on a desktop workstation, and these machines are only now getting fast enough to make that worthwhile. AMD just produced the first seriously fast desktop workstation that doesn't cost a million dollars and can even create new AI. In the past, people might have had to run the machine nonstop for a month. The next-gen chips should be much more efficient, and make it faster, cheaper, and easier for anyone to produce their own video games and Hollywood-quality movies. The real promise of consumer electronics is that they can be used for much more than surfing the web.
The big problem is heat (in small devices: laptops, smartphones, etc.). Memory and processor power (the tech) have long been ahead of their time, compared with what they could really do. Again, it's about getting it into the mainstream. Most big-tech (Silicon Valley) companies are on the stock exchange at the moment (profits, politics), and sadly not really busy with what's good for the bigger picture (people, the world). They all start out good (Apple, Google, etc.), but in the long run...? Mzzls
Intel's plan has always been to produce a chip that uses 3 watts or less, with at least half the compute power required to run a Star Trek holodeck. The next big leap forward we might see is MXene circuit boards, which could connect all the chips you want using ballistic electrons that travel at close to the speed of light. Our current circuit boards are outdated and remain the biggest bottleneck in any computer, along with memory speeds being years behind those of modern processors. Intel doesn't even want to be in the business anymore, but it can't sell it off. The industry is working toward streamlining everything for automation, and China has already automated its assembly processes. It's the assembly processes that are becoming national secrets, with Elon Musk making his fortune using things like 3D printing. Optics are the cutting edge of electronics now, and even our consumer devices are about to become optical electronics, relying on terahertz frequencies for up to a million times more bandwidth that will go right through your walls, or from one chip to another inside your laptop. Exactly where all this is going is still anyone's guess but, suffice it to say, wars inspire new technology that actually makes it into the public domain.