Intel's Move Into AI

Discussion in 'Science and Technology' started by Wu Li Heron, Feb 12, 2017.

  1. Wu Li Heron

    http://www.theinquirer.net/inquirer/news/2477796/intels-nervana-ai-platform-takes-aim-at-nvidias-gpu-techology

    While IBM and Nvidia have combined forces to produce an AI machine with incredible bandwidth, including individual graphics cards capable of 40 Tb/s in a way that resembles how the visual system maximizes bandwidth, Intel is moving into accelerating artificial intelligence learning, promising that the next generation of AI will learn a hundred times faster than the current one. While all of this is going on, AMD has provided the HBM2 standard the entire industry is poised to adopt, which distributes repetitious computing tasks within the memory cells themselves to reduce traffic. Each company is staking out a different bandwidth territory within the same scalar architecture, one that obeys the simple self-organizing rule that nothing from nothing ain't nothing: what you don't know and cannot know supplies clues as to what you might yet learn. The more you can glean from what you do not know, the more sense its contents can make, and vice versa; the smallest pond can be the busiest place and shed invaluable light upon the Big Picture.

    As melodramatic as it might sound, they are exploring the mathematical foundations of space and time and, increasingly, find themselves staring into their own navels, even in the semiconductor industry, which is infamous for its ruthless, brute-force economic approach to problem solving. For more than a decade the NSA has set the tone in the industry, first by announcing that everyone needs to switch to focusing on efficiency and, more recently, by pointing out that what is required is something similar to the human mind and brain, which are self-organizing, empowering them to crunch enormous numbers with the greatest of ease, albeit at a much slower pace than electronic AIs. Past a certain point there is really no other way for our neurons to organize in such vast numbers, and that reflects the fundamentally analog nature of existence itself, being metaphorical rather than metaphysical. When life itself never made any sense to begin with, classical logic and causal physics provide the beauty and invaluable error correction required for us to make any sense out of anything, and even explain the arrow of time, but they should ultimately turn out to revolve around complete bullshit, because nothing from nothing ain't nothing when wonder remains the beginning of wisdom.
     
  2. Ged

    I also feel I don't know anything anymore. Or, more to the point, that I am letting go of all I thought I knew and starting over again. Like the narratives and tropes and little philosophies that so tenuously held my reality together. In a way it's liberating. And I have to admit I was largely wrong about most things. But of course I can't completely discard who I am and my history, for what it's worth - not much - but this is a chance for personal renewal which I will not let slip out of my hands this time.

    Also I very much want to learn about computers.
     
  3. Wu Li Heron

    The dark side of the force is full of its own crap and, thus, shit never works out for them in the end. Which is why a Jedi must remain regular if they are to feel the force flow through them.
     
  4. Wu Li Heron

    http://www.eetimes.com/document.asp?doc_id=1330854

    Here's a good article on what Intel is doing and how it's a direct attack on giants like Nvidia.

    Intel's Knights Bridge and Knights Landing are descended from their old Larrabee architecture, where they first attempted to genuinely combine a CPU and GPU in the same architecture rather than just pasting them together. That's the one thing Nvidia never did, since their whole approach revolves around hybrid architectures, and Intel knew that if they just bided their time a more fully integrated architecture would eventually acquire the decisive edge. Instead of Nvidia's hybrid approach and AMD's modular one, they are attempting to combine AI, graphics, and CPU functions all in the same architecture, with an emphasis on efficiency and bandwidth. Combining all of them means no more running back and forth from system memory to the CPU and the GPU and back again. Sometimes the only way to do things faster is to cut out the middleman, and they are attempting to do that with a scalar architecture that cuts traffic down to a minimum.

    Intel was forced some 15 years ago to switch to multicore processing by a limitation in silicon nobody knew existed at the time, one that prevented them from merely squeezing a larger single processor onto a chip to make it go faster. What they are doing here is creating a fundamentally new reconfigurable architecture that can act like one giant CPU, a collection of CPUs, a graphics card, or an AI, whatever you want or need on the fly, and crunch the numbers faster. Intel is famous for keeping latencies as low as possible, and this new effort is to redefine what low latency means in a computer by combining all the different types of computation commonly needed to do just about anything, so the data itself doesn't have to travel nearly as far or as often. It fits with Intel's famous engineering maxim of "good enough," where they have often preferred to wait and use a simpler, more direct approach to problem solving rather than adopt byzantine architectures that merely compensate for the fact that our current architectures are inefficient in so many respects. By starting with latencies, they can expand the system any way they want around keeping them down, leveraging things like the emergent effects their flexible circuitry can produce.
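
    To make that "cut out the middleman" point concrete, here is a minimal sketch, in Python with NumPy (my own illustration, not anything Intel ships), contrasting a pipeline that shuttles data between separate "host" and "device" buffers with one that computes in place. The explicit copies stand in for trips across a bus like PCIe:

    ```python
    import time

    import numpy as np

    N = 50_000_000  # large working set so the copy overhead is visible
    data = np.random.rand(N)

    # Hybrid-style pipeline: every stage ships the data to a separate
    # buffer (standing in for a discrete accelerator's memory) and back.
    start = time.perf_counter()
    device = data.copy()        # "host -> device" transfer
    device *= 2.0               # compute on the "device"
    result = device.copy()      # "device -> host" transfer
    print(f"copy-in/copy-out: {time.perf_counter() - start:.3f}s")

    # Integrated-style pipeline: the same arithmetic, but the data never moves.
    start = time.perf_counter()
    data *= 2.0                 # compute in place, no transfers
    print(f"in-place:         {time.perf_counter() - start:.3f}s")
    ```

    On most machines the in-place version wins by roughly the cost of the two copies, which is the argument for integration in miniature.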

    This is, again, pretty much what the NSA told the entire industry had to happen, because efficiency and low latencies are becoming synonymous with supercomputing and even with how the human brain works.
     
  5. NoxiousGas

    The technology is intriguing, but...
    You sure do have a tendency to take some big leaps from tech news into your philosophy.
    While I get what you are pointing at, I still think you throw a lot of fanciful and even magical thinking into the mix.

    I have to laugh at your bad assumption that our brains operate at a much slower pace than electronic AIs would... LOL

    Do you have even the remotest conception of the enormously vast amount of data your brain is dealing with at any given moment, the large majority of which we are never aware of?

    And do our brains really function completely as analog devices?
    As far as I know, some neurons only fire after transmitter levels in the synapse have reached a certain threshold, so in essence that is digital, as it takes a discrete packet of data to trigger a response.
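
    That threshold behavior is roughly what a leaky integrate-and-fire model captures: analog accumulation below threshold, an all-or-nothing spike above it. Here is a minimal sketch in Python (the threshold and leak constants are illustrative assumptions, not measured biology):

    ```python
    import numpy as np

    def leaky_integrate_and_fire(inputs, threshold=1.0, leak=0.95):
        """Analog accumulation with a digital, all-or-nothing output."""
        potential = 0.0
        spikes = []
        for x in inputs:
            potential = potential * leak + x  # analog: graded accumulation
            if potential >= threshold:
                spikes.append(1)              # digital: all-or-nothing spike
                potential = 0.0               # reset after firing
            else:
                spikes.append(0)
        return spikes

    # Weak noisy drive: spikes appear only once enough input has accumulated.
    rng = np.random.default_rng(0)
    print(leaky_integrate_and_fire(rng.uniform(0, 0.4, size=20)))
    ```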
     
  6. Wu Li Heron

    I'm agnostic, but there is no doubt whatsoever in my mind that life and existence itself are fundamentally magical, a gift from out of nowhere that none of us ever did a damned thing to deserve, and one we can return at any time for a full refund if not completely satisfied.

    The more intriguing question is why the human brain has an estimated petabyte of storage capacity, roughly the equivalent of the entire World Wide Web, yet human memory remains notoriously fallible. This makes absolutely no sense whatsoever from either a logical perspective or Darwinian survival of the fittest. However, assuming it reflects the analog nature of existence, it can be interpreted as the human mind and brain being more fundamentally creative engines that only incidentally happen to resemble computers, because that is another way of being creative. This is what the semiconductor industry is currently exploring as well, attempting to see how classical causal logic can provide the necessary error correction for their chips without sacrificing any of the efficiency and creativity of an analog approach. As far as I'm concerned, Darwin had it wrong, because sex is never about survival of the fittest but about the most creative, with survival of the fittest merely being how evolution inspires creativity.

    You could say, metaphorically speaking, that the particle-wave duality of quantum mechanics reflects both the initial creative impetus of the Big Bang, still expanding to this day, and the final demise of everything in a Big Crunch.
     
