Ryzen 2

Discussion in 'Computers and The Internet' started by wooleeheron, Mar 16, 2018.

  1. wooleeheron

    wooleeheron Brain Damaged Lifetime Supporter HipForums Supporter

    Next month AMD is planning to release the first Zen+ processors, the 2600X, 2700X, and so on, along with the new 400-series chipsets with increased functionality. The processors are a die shrink with added transistors and architectural tweaks for greater efficiency and speed, and the new chips and chipsets will allow for higher-speed RAM and more PCIe lanes for greater overall bandwidth. With an eight-core Ryzen you can play games while running programs in the background, because most games are lucky if they require six cores. Threadripper has 16 cores and is also getting an update sometime in the second half of the year, but I recommend waiting for the new motherboards to come out if you don't already have one.

    Navi, their next GPU architecture, has been delayed until next year, but is rumored to use their Infinity Fabric to connect several GPU chips on a single package, along with a ton of high-speed HBM3 memory rumored to be 50% faster. Meanwhile, Nvidia has declined to bring its newest Tesla-class architecture to the consumer market, which reportedly includes a 38 TFLOPS graphics card. In other words, these graphics cards are already ridiculously powerful for any existing consumer application, and both companies are delaying more powerful releases in order to milk the market for every dime they can get.
     
    Last edited: Mar 16, 2018
  2. Irminsul

    Irminsul Valkyrie

    I wouldn't buy AMD again. I was always Nvidia. I switched to AMD a year ago and it's been nothing but problems. I'm going to get a 1080 Ti soon, I think, and bin this.
     
  3. wooleeheron


    LOL, Intel and AMD just combined their technology with impressive results. They are all soulless corporations, and Intel's latest problems with Meltdown and Spectre slowing down everyone's computers sent AMD's stock through the roof. About all I can say is that AMD chips run hotter, and thus noisier when overclocked without water cooling, and use at most $10/year more in electricity. However, a Ryzen 7 or Threadripper system can cost $500 less to buy, making their more common bugs worth dealing with for some, and they support a wealth of open standards. Thankfully, everyone is adding AI to everything, and bugs will soon be a thing of the past. The day is coming when nobody will know or care who makes the stupid chips inside the thing, any more than they normally do with a toaster.
     
  4. Irminsul


    Overheating is my problem. It goes from 30C to 90C in a minute unless I have the fans at 100%, and I have to take the side cover off the tower and run an external fan just to get it to stabilise at 65C.
     
  5. wooleeheron


    Noise is still an issue even with the new all-in-one (AIO) coolers, which are often cheap crap with noisy pumps and tiny radiators. That's the blunt, honest truth that no reviewer will ever say. You can cool a desktop by just opening the side panel and pointing a box fan at it; one group even ran coolant through a basement toilet to a tiny radiator and kept the entire computer cool. The radiators, fans, and plumbing fixtures might as well all be sold by the mafia: they're either outrageously overpriced quality parts or complete crap. Sadly, that's the entire industry, which is largely deregulated to promote economic markets. They commonly sell $300-1,000 power supplies that should actually cost no more than $50, but you could never trust a $50 PSU to do the same job.
     
  6. Irminsul


    Noise is the least of my concerns in computers lol. I usually have the music cranked or my gaming headset on. Not a concern to me.
     
  7. wooleeheron


    That's how I feel. The GPU, PSU, and CPU are the hottest and noisiest parts when gaming, but you crank up the speakers or wear headphones anyway. Still, if your box gets too hot it can become obnoxiously noisy and shorten its lifespan. All the chips are getting hotter too, as manufacturers max out how much heat silicon can take.
     
  8. I think the post was about CPUs, not GPUs. So it's an "AMD vs Intel" thing, not an "AMD vs Nvidia" thing.

    As far as graphics cards go, I stick with Nvidia too.
     
  9. I have overheating problems with my Intel 7700K. I cool it with a Corsair H100i V2, which is overkill since I don't even overclock it. When the CPU is doing heavy calculations, it still gets hot enough to shorten its lifespan, even if I force my pump and fans to run at 100%.
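    For what it's worth, instead of trusting the AIO's own software, you can log the temperatures yourself. A minimal sketch for Linux (assuming the kernel's hwmon sysfs interface, which reports millidegrees Celsius; on Windows or systems without /sys/class/hwmon it just returns nothing):

    ```python
    import glob

    def read_cpu_temps():
        """Return {sensor_path: degrees C} from the Linux hwmon interface.

        Returns an empty dict on systems without /sys/class/hwmon
        (e.g. Windows or containers), so callers must handle that case.
        """
        temps = {}
        for path in glob.glob("/sys/class/hwmon/hwmon*/temp*_input"):
            try:
                with open(path) as f:
                    # hwmon files report millidegrees Celsius
                    temps[path] = int(f.read().strip()) / 1000.0
            except (OSError, ValueError):
                continue
        return temps

    if __name__ == "__main__":
        for sensor, celsius in sorted(read_cpu_temps().items()):
            print(f"{sensor}: {celsius:.1f} C")
    ```

    Running that in a loop while the CPU is under load would at least tell you whether the 100% pump/fan speed is doing anything.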

    I think Intel deliberately stopped using solder to transfer heat in their CPUs, with the goal of shortening their lives so people are forced to buy new CPUs more often.
     
  10. wooleeheron


    Consumer electronics in general are rated for around a five-year lifespan, but Ivy Bridge turned out to be an unusually robust node that makes those chips last longer. The current 10-14nm generation has turned out to be about as disappointing a shrink as they get, and everyone is eagerly waiting for the coming 7nm generation. Sadly, the truth is that Intel stopped using solder simply because it was cheaper and didn't hurt performance enough to be worth the expense.

    Heat is the looming issue for everybody, CPU and GPU alike, because the manufacturers are learning how to maximize what silicon is capable of.
     
  11. I have no trouble cooling my GPU with its ordinary stock fan. I do force the fan to 100%, but doing so manages the heat excellently. Right now my GPU mines Ethereum whenever I'm not playing World of Warcraft. When Ethereum mining moves to proof of stake, I'll go back to helping Folding@home cure cancer.

    Despite being under constant load, I expect that my GPU will still be running strong when I replace it with a better model some years from now.

    I think Intel's choice to stop using solder was a deliberate move to shorten the lives of their CPUs and keep people coming back to buy new ones. If I'd known that my Intel CPU could not be cooled no matter what (at least not without delidding it), I would have gone with an AMD CPU when I built this computer.
     
  12. wooleeheron


    Trust me, Intel is just cheap and so far ahead of the competition that they take their time figuring out how to make their chips cheaper. It used to be that power supplies were the worst component in a computer, occasionally even catching fire and burning down the house, and power supplies are anything but "high tech". The entire industry resembles the Mafia, and if Intel could get away with shortening the lifespan of your chip, they would, but the damned thing only has a five- or seven-year lifespan to begin with, and the rest of the computer is not likely to last any longer.

    Essentially, their chips are tiny microwave ovens made of sand, and the idea that they will last more than five years or so on average is a joke. It's the same with cars or anything else: manufacturers could build cars that would last a hundred years, but nobody would buy them.
     
