Huawei shows off AI computing system to rival Nvidia's top product

This makes more sense of the Chinese government's recent announcement about sharing its AI technology with partner countries. Apparently, they found a way to use lots of slower chips together to crunch numbers faster than a single fast chip. Which means we could also see similar results in laptop chips, with slower RISC-V and ARM chips steadily closing the gap as the numbers they have to crunch get larger.

Architectures are all about scaling these days, and once you get up to a certain size, all bets are off, because nobody has really had the time or money to play around with all the different things you can do with a Star Trek holodeck computer. With AI now writing computer code, people will likely get very creative over the next decade, then spend a couple of decades working out all the problems they've created. We saw the same effect with video games, as people struggled to work out the basics of low-resolution graphics, and now we're about to find out what the basics are for a Star Trek holodeck AI. Within three years, building your own AI at home should cost around $3,000 in the US and $1,000 in Asia.
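
A rough back-of-envelope sketch of the "many slower chips" idea (every number below is a hypothetical placeholder, not a real Huawei or Nvidia spec): as long as the scaling overhead from the interconnect doesn't eat the gains, a bigger pile of slower chips can still come out ahead on aggregate throughput.

# All figures are made-up placeholders for illustration only.
def cluster_throughput(chips, flops_per_chip, scaling_efficiency):
    """Aggregate useful throughput, discounting interconnect/sync overhead."""
    return chips * flops_per_chip * scaling_efficiency

# Hypothetical "fast" system: fewer chips, higher per-chip speed.
fast = cluster_throughput(chips=72, flops_per_chip=2.5e15, scaling_efficiency=0.90)

# Hypothetical "slow but many" system: ~5x the chips at ~40% the per-chip speed,
# with a worse scaling efficiency to account for the bigger interconnect.
slow = cluster_throughput(chips=384, flops_per_chip=1.0e15, scaling_efficiency=0.70)

print(f"fast cluster : {fast:.2e} FLOP/s")
print(f"slow cluster : {slow:.2e} FLOP/s")
print(f"slow/fast    : {slow / fast:.2f}x")

With these placeholder numbers the bigger, slower cluster ends up roughly 1.7x ahead, which is the whole trick: the win depends entirely on keeping that scaling-efficiency factor high.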