One step closer to T1000

Raytheon Company has developed the first computer in the world that can adjust itself to the applications it processes; by "adjust" I mean its architecture. The name of this new technology is not the Terminator T1000, we haven't come that far: it's called MONARCH (Morphable Networked Micro-Architecture). Royal name, nice choice. The technology was developed for the Department of Defense (gosh, what a surprise) and addresses the large data volumes of sensor systems, including their requirements for signal and data processing throughput.

Source: [url="http://news.softpedia.com/news/First-Polymorphic-Computer-in-the-World-50041.shtml"]news.softpedia.com/news/First-Polymorphic-Computer-in-the-World-50041.shtml[/url]
Maybe I don't fully understand, Raysun, but should I be imagining small automatons running around adding server blades? That's what this brings to mind.
Sounds like some kind of computer that works along the same lines as a neural network: basically it adapts to what its data tells it. I think this is where you will find all sorts of research to do with AI, internet snooping and processing of internet traffic - that seems to be all those people are doing these days, so it makes sense. It certainly isn't going to be used to cure disease, and it doesn't look that green judging by the amount of airflow they need to cool it down. I think it may just mean that redundant parts of the system cut in and take over processing tasks too - who knows, the article isn't that comprehensive. (Are those "blast-proof doors" behind those cabinets?)
You could say that... it can also be seen as a system that runs more efficiently because it can reconfigure itself to optimize processing on the fly. For example, you could run the system to calculate stellar drift (or global positioning...): each microstructure would then calculate a piece, and at the end the pieces would be brought together in a sum (Σ). That lets the system run more accurately with more computing power... a desktop with a polymorphic quad-core processor would be cool, I was dreaming of it last night...
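Purely to illustrate that piece-by-piece idea (a hypothetical sketch, not how MONARCH actually divides its work), here is a small Python example that splits a job across worker processes and sums the partial results at the end; the chunking scheme and the square-summing workload are stand-ins:

[code]
# Hypothetical sketch: split a big calculation into pieces, compute each
# piece on its own worker, then combine (sum) the partial results.
from multiprocessing import Pool

def process_piece(chunk):
    # Stand-in for whatever each "microstructure" would compute
    # (stellar drift, positioning, ...); here it just sums squares.
    return sum(x * x for x in chunk)

def split(data, n_pieces):
    # Divide the input into roughly equal chunks, one per worker.
    size = max(1, len(data) // n_pieces)
    return [data[i:i + size] for i in range(0, len(data), size)]

if __name__ == "__main__":
    data = list(range(1_000_000))
    with Pool(processes=4) as pool:           # four "microstructures"
        partials = pool.map(process_piece, split(data, 4))
    total = sum(partials)                     # the final Σ step
    print(total)
[/code]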
There's a similar trend with programmable analog and digital 'systems on a chip' that can reconfigure their circuits on the fly to optimize their performance. Up till recently, they were programmed by an engineer and stayed the same throughout the application.
Polymorphic quad-core reminds me of the new V8s that can run on fewer than eight cylinders when cruising down the highway to increase fuel economy and reduce emissions. Considering how much extra power multi-core chips draw, it would be nice to turn the extra cores on only when the situation calls for it.
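Just to make the software side of that concrete: Linux already exposes a knob for taking individual cores offline and bringing them back. The sketch below is a hypothetical illustration of that mechanism (Linux only, needs root, and cpu0 usually can't be taken offline), not anything MONARCH-specific:

[code]
# Hypothetical sketch (Linux only, requires root): park or wake a CPU core
# via the kernel's CPU hotplug interface in sysfs.
def set_core_online(cpu: int, online: bool) -> None:
    path = f"/sys/devices/system/cpu/cpu{cpu}/online"
    with open(path, "w") as f:
        f.write("1" if online else "0")

# Park core 3 while the load is light, wake it up again when needed:
# set_core_online(3, False)
# ...heavy workload shows up...
# set_core_online(3, True)
[/code]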
Cadillac had that in the 1970s, and it failed miserably. You don't hear about it because all the engines are scrap metal or already melted, lol. I wonder what kind of OS would be required to run a networked system like that? *snicker*
They didn't have computers to run the engines back in the 70s. There are always a hundred reasons not to pursue a better approach.
Tuning an ECU won't change the air/fuel ratio if there is no fuel going into the other four cylinders, but that's a different topic. It probably took forever to build a hairy OS for a multiprocessing system that can repair and adapt itself. It's like the Matrix now.
I own two performance vehicles with ECUs that I tune myself. It's OK, I know about them.
Come to think of it, you could say that indeed... the system is also able to modify itself on the fly and adapt to run an application more efficiently.
Probably the biggest advantage is security: the system adapts to attempted security breaches and that sort of thing.
These adaptive circuits are going to be common in the future, even though right now there is a psychological inertia against them since they are a new concept. It's not just mainframes or CPUs; this is occurring in other types of systems, such as electrically programmable analog and digital chips. I see many advantages to it. It's a departure from conventional circuits, which up to this point have been wired a particular way and stay that way throughout their service life. I can see some possible drawbacks too, such as a chip accidentally reconfiguring itself in a way that locks it into some non-functioning configuration. There might even be viruses in the future that capitalize on suckering an adaptive chip into reconfiguring itself into some useless configuration. A hard-wired circuit has some advantage in that respect.
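On the lock-up worry: one common software-level safeguard (just a hypothetical sketch here, not how any real adaptive chip does it) is to self-test a proposed configuration before committing it, and to keep a known-good configuration to fall back on:

[code]
# Hypothetical sketch: only commit a new configuration if it passes a
# self-test; otherwise keep running the last known-good configuration.
class AdaptiveUnit:
    def __init__(self, initial_config):
        self.active = initial_config
        self.known_good = initial_config

    def self_test(self, config) -> bool:
        # Stand-in check; real hardware would run built-in diagnostics.
        return config.get("pipelines", 0) > 0

    def reconfigure(self, proposed) -> bool:
        if not self.self_test(proposed):
            # Refuse the change; stay on the current working setup.
            return False
        self.known_good = self.active
        self.active = proposed
        return True

    def rollback(self):
        # Escape hatch if a committed configuration misbehaves later.
        self.active = self.known_good

unit = AdaptiveUnit({"pipelines": 4})
print(unit.reconfigure({"pipelines": 0}))   # False: would be non-functional
print(unit.reconfigure({"pipelines": 8}))   # True: accepted
[/code]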
People with MCSE certs get frustrated with new or better technology because they aren't qualified or educated to run it. Most certified people try to smash the idea of the "latest and greatest" technology simply because they can't operate it, and when a client asks for it the cert holder is SOL. It's complicated but very true. Also, programmers have to learn to program for 64-bit and 128-bit systems to keep their jobs. I can see why THOSE people get mad, but can't they just look at it as something better? I agree with shaggie that there is a human tendency to reject the newer concepts.
That's not quite true actually, Aderall. Many people who come out of university with qualifications can't cope with new technology either. I have known many graduates with great academic knowledge of systems, but when it comes to running the things they can't get beyond what the books say - which, of course, are all written for "ideal world" scenarios. A lot blame the technology, whereas what is really at fault is their inability to cope with confusing sets of data in real-world situations, particularly in troubleshooting. They don't seem to be able to assess the root cause of a problem, because they are used to running only testbed systems that operate in an ideal-world environment with very few problems.

MCSE people I find can be worse, as they are not really (in my experience) doing the job because they have made their entire life about computers; they just need a job. I would say there's not much difference between a newly qualified uni grad and an MCSE grad, except that the uni grads believe they know everything, but don't, and the MCSE grads know they don't know everything but don't know where to start looking.

It comes down to experience. When I started with mainframe computing, we simply learned everything from the onsite IBM engineers about how to operate the machines, but after a while we realised that we had to learn outside of the manuals, and that way we understood so much more - we kind of wrote the real-world manuals. A lot of the time ultra-modern technology is rejected simply because the engineers and programmers that built it made gaping big mistakes or didn't have a clue what would happen under certain situations. You only have to look at the NHS computer systems in Britain to know that's true.
Sentient, that's what I was trying to say; I guess I wasn't very clear. It does come down to experience, but there are other small things you learn at uni that are useful. Most uni grads here in the States (at least at my college) usually have a few certs before they even graduate. For the most part, that's what I was trying to say.