The World's first Polymorphic Computer

Discussion in 'Computers and The Internet' started by raysun, Mar 22, 2007.

  1. raysun

    raysun D4N73_666 4861786f72

    Messages:
    932
    Likes Received:
    10
    One step closer to T1000
    Raytheon Company has developed the world's first computer that can adjust itself to the applications it processes. By adjustment I'm referring to its architecture. The name of this new technology is not the Terminator T1000 (we haven't come that far); it's called MONARCH (Morphable Networked Micro-Architecture). Royal name, nice choice.
    This technology was developed for the Department of Defense (gosh, what a surprise) and addresses the large data volumes of sensor systems, including their requirements for signal- and data-processing throughput.
    source:
    Code:
     http://news.softpedia.com/news/First-Polymorphic-Computer-in-the-World-50041.shtml
     
  2. MattInVegas

    MattInVegas John Denver Mega-Fan

    Messages:
    4,434
    Likes Received:
    17
    Maybe I don't fully understand, Raysun. But am I to imagine small automatons running around adding server blades? That's what this brings to mind.
     
  3. sentient

    sentient Senior Member

    Messages:
    1,718
    Likes Received:
    1
    Sounds like some kind of computer that works along the same lines as a neural network. Basically, it adapts to what its data tells it.

    I think this is where you will find all sorts of research to do with AI, internet snooping, and the processing of internet traffic; that seems to be all these people are doing these days, so it makes sense. It certainly isn't going to be used to cure disease, and it doesn't look that green, judging by the amount of airflow they need to cool it down.

    I think it may also just mean that redundant parts of the system cut in and take over processing tasks. Who knows; the article isn't that comprehensive.

    (are those "blast proof doors" behind those cabinets?)
     
  4. raysun

    raysun D4N73_666 4861786f72

    Messages:
    932
    Likes Received:
    10
    You could say that... :)
    It can also be seen as a system that runs more efficiently because it can reconfigure itself to optimize processing on the fly. For example, you can run the system to calculate stellar drift (or global positioning); each microstructure then calculates a piece, and at the end the pieces are brought together (the Σ step).
    This lets the system run more accurately and with more computing power...
    A desktop with a polymorphic quad-core processor would be cool. I was dreaming of it last night...
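    The article doesn't describe MONARCH's actual programming model, but the scatter-and-combine idea above can be sketched in software: split the work across units, let each compute its piece, then do the Σ step at the end. Everything here (the chunking, the thread pool, the example workload) is a loose analogy, not Raytheon's implementation:

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    """One 'micro-architecture unit' computes its share of the work."""
    return sum(x * x for x in chunk)

def distributed_sum_of_squares(data, units=4):
    """Split the input across units, compute in parallel, then combine (the Σ step)."""
    size = max(1, len(data) // units)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=units) as pool:
        partials = pool.map(partial_sum, chunks)
    return sum(partials)
```

    The point is only the shape of the computation: independent pieces, then one aggregation at the end, regardless of how many units did the work.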
     
  5. raysun

    raysun D4N73_666 4861786f72

    Messages:
    932
    Likes Received:
    10
    It looks like it.
    Those supercomputers are well protected...
     
  6. shaggie

    shaggie Senior Member

    Messages:
    11,504
    Likes Received:
    21
    There's a similar trend with programmable analog and digital 'systems on a chip' that can reconfigure their circuits on the fly to optimize their performance. Until recently, they were programmed once by an engineer and stayed the same throughout the application.

     
  7. shaggie

    shaggie Senior Member

    Messages:
    11,504
    Likes Received:
    21
    Polymorphic quad-core reminds me of the new V-8s that can run on fewer than 8 cylinders when cruising down the highway to increase fuel economy and reduce emissions.

    Considering how much extra power multi-core chips draw, it would be nice to turn the extra cores on only when the situation calls for them.
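    The cylinder-deactivation analogy maps to a simple policy: power on only as many cores as the current load needs, and keep the rest gated off. A toy sketch of that policy; the 8-core count and the per-core capacity figure are made-up numbers for illustration, not from any real chip:

```python
import math

def active_cores_needed(load, total_cores=8, capacity_per_core=0.125):
    """Return how many cores to power on for the current load (0.0 to 1.0),
    keeping the rest gated off to save power, like cylinder deactivation.
    Always keeps at least one core online."""
    needed = math.ceil(load / capacity_per_core)
    return max(1, min(total_cores, needed))
```

    Cruising (low load) runs on one core; flooring it (full load) lights up all eight.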

     
  8. Adderall_Assasin

    Adderall_Assasin Senior Member

    Messages:
    1,266
    Likes Received:
    1
    Cadillac had that in the 1970s, and it failed miserably. You don't hear about it because all those engines are scrap metal or already melted, lol.

    I wonder what kind of OS would be required to run a networked system like that? *snicker*
     
  9. shaggie

    shaggie Senior Member

    Messages:
    11,504
    Likes Received:
    21
    They didn't have computers running the engines back in the '70s. There are always a hundred reasons not to pursue a better approach.

     
  10. Adderall_Assasin

    Adderall_Assasin Senior Member

    Messages:
    1,266
    Likes Received:
    1
    Tuning an ECU won't change the air/fuel ratio if there is no fuel going into the other four cylinders, but this is a different topic. :)

    It probably took forever to build a hairy OS for a multiprocessing system that can repair and adapt itself. It's like the Matrix now.
     
  11. shaggie

    shaggie Senior Member

    Messages:
    11,504
    Likes Received:
    21
    Again, there are always a hundred reasons not to advance a new technology.

     
  12. shaggie

    shaggie Senior Member

    Messages:
    11,504
    Likes Received:
    21
  13. Adderall_Assasin

    Adderall_Assasin Senior Member

    Messages:
    1,266
    Likes Received:
    1
    I own two performance vehicles that use ECUs for tuning, and I tune them myself. It's OK, I know about them.
     
  14. Duck

    Duck quack. Lifetime Supporter

    Messages:
    22,614
    Likes Received:
    47
    Kinda like BitTorrent at the processing level, then?
     
  15. raysun

    raysun D4N73_666 4861786f72

    Messages:
    932
    Likes Received:
    10
    Come to think of it, you could say that indeed...
    The system is also able to modify itself on the fly and adapt to run an application more efficiently.
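    The BitTorrent comparison can be sketched as scatter/gather: pieces of the job are processed independently, may finish in any order (like torrent pieces arriving out of order), and are reassembled by index at the end. This is purely an illustrative analogy, not how MONARCH actually works:

```python
import random

def process_piece(index, piece):
    """Each 'peer'/unit processes one piece independently; order doesn't matter."""
    return index, piece.upper()

def scatter_gather(text, piece_size=4):
    """Split into indexed pieces, process them in arbitrary order,
    then reassemble by index, BitTorrent-style."""
    pieces = [(i, text[i:i + piece_size]) for i in range(0, len(text), piece_size)]
    random.shuffle(pieces)                                # pieces complete in any order
    results = dict(process_piece(i, p) for i, p in pieces)
    return "".join(results[i] for i in sorted(results))   # reassemble by index
```

    The shuffle stands in for unpredictable completion order; the sorted join is what makes the final result independent of it.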
     
  16. Adderall_Assasin

    Adderall_Assasin Senior Member

    Messages:
    1,266
    Likes Received:
    1
    Probably the biggest advantage is security: the system adapts to attempted security breaches and that sort of thing.
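    One way to picture that "adapting to breach attempts" idea is a moving-target defense: the system periodically reshuffles its internal layout so anything an attacker learned about it goes stale, while external behaviour stays the same. A toy sketch of the idea; nothing here is from the article:

```python
import random

class MovingTargetDispatcher:
    """Shuffles which internal slot each operation lives in on every
    reconfigure, so a layout an attacker mapped out earlier goes stale.
    External behaviour never changes; only the internal layout moves."""

    def __init__(self, operations, seed=None):
        self._rng = random.Random(seed)
        self._names = list(operations)
        self._funcs = dict(operations)  # name -> function
        self.reconfigure()

    def reconfigure(self):
        """Randomly remap operation names to slot indices."""
        order = self._names[:]
        self._rng.shuffle(order)
        self._slots = [self._funcs[name] for name in order]  # slot -> function
        self._index = {name: slot for slot, name in enumerate(order)}

    def call(self, name, *args):
        """Dispatch through the current (shuffled) layout."""
        return self._slots[self._index[name]](*args)
```

    Callers see identical results before and after each reconfigure; only the internal mapping moves.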
     
  17. shaggie

    shaggie Senior Member

    Messages:
    11,504
    Likes Received:
    21
    These adaptive circuits are going to be common in the future, even though right now there is a psychological inertia against them since they are a new concept. It's not just mainframes or CPUs; this is happening in other kinds of systems, such as electrically programmable analog and digital chips. I see many advantages to it. It's a departure from conventional circuits, which until now have been wired a particular way and stayed that way throughout their service life.

    I can see some possible drawbacks too, such as a chip accidentally reconfiguring itself in a way that locks it into some non-functioning configuration. There might even be viruses in the future that profit by suckering an adaptive chip into reconfiguring itself into a useless configuration. A hard-wired circuit has an advantage in that respect.
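    A common safeguard against that lock-up risk is to self-test any candidate configuration and roll back to the last known-good one if it fails, so a bad reconfiguration is never committed. A minimal sketch, assuming the reconfiguration controller can run such a self-test (nothing in the article confirms MONARCH does this):

```python
class SafeReconfigurator:
    """Keeps a known-good configuration and refuses (rolls back) any
    candidate that fails its self-test, so the system can't lock itself
    into a non-functioning state."""

    def __init__(self, initial, self_test):
        assert self_test(initial), "initial configuration must pass the self-test"
        self.active = initial
        self.self_test = self_test

    def apply(self, candidate):
        """Commit the candidate only if it passes the self-test.
        Returns True on commit, False on rollback to the known-good config."""
        if self.self_test(candidate):
            self.active = candidate
            return True
        return False
```

    The same pattern would also blunt the "sucker the chip into a useless configuration" attack: the malicious layout fails the self-test and is simply never adopted.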

     
  18. Adderall_Assasin

    Adderall_Assasin Senior Member

    Messages:
    1,266
    Likes Received:
    1
    People with MCSE certs get frustrated with new or better technology because they aren't qualified or educated to run it. Most certified people try to smash the idea of the "latest and greatest" technology simply because they can't operate it, and when a client asks for it the cert holder is SOL. It's complicated but very true. Also, programmers have to learn to program for 64-bit and 128-bit threads to keep their jobs.

    I can see why those people get mad, but can't they just look at it as something better?

    I agree with shaggie that there is a human tendency to reject newer concepts.
     
  19. sentient

    sentient Senior Member

    Messages:
    1,718
    Likes Received:
    1
    That's not quite true actually, Adderall. Many people who come out of university with qualifications can't cope with new technology either. I have known many graduates with great academic knowledge of systems, but when it comes to running the things they can't get beyond what the books say, which, of course, are all written for "ideal world" scenarios. A lot blame the technology, whereas what is at fault is their inability to cope with confusing sets of data in real-world situations, particularly in troubleshooting. They don't seem to be able to assess the root cause of a problem, because they are used to running only testbed systems in an ideal-world environment with very few problems.

    MCSE people, I find, can be worse, as they are not really (in my experience) doing the job because they have made their entire life about computers; they just need a job.

    I would say there's not much difference between a newly qualified uni grad and an MCSE grad, except that the uni grads believe they know everything, but don't, and the MCSE grads know they don't know everything but don't know where to start looking.

    It comes down to experience. When I started with mainframe computing, we simply learned everything about operating the machines from the onsite IBM engineers, but after a while we realised we had to learn beyond the manuals, and that way we understood so much more; we kinda wrote the real-world manuals. A lot of the time, ultra-modern technology is rejected simply because the engineers and programmers that built it made gaping mistakes or didn't have a clue what would happen in certain situations. You only have to look at the NHS computer systems in Britain to know that's true.
     
  20. Adderall_Assasin

    Adderall_Assasin Senior Member

    Messages:
    1,266
    Likes Received:
    1
    sentient, that's what I was trying to say. I guess I wasn't very clear. It does come down to experience, but there are other small things you learn at uni that are useful. Most uni grads here in the States (at least at my college) usually have a few certs before they even graduate.

    For the most part, that's what I was trying to say.
     