The Dangers Of Artificial Intelligence.

Discussion in 'Artificial Intelligence (A.I.)' started by Jimbee68, Dec 25, 2023.

  1. Jimbee68

    Jimbee68 Member

    Messages:
    1,415
    Likes Received:
    528
    People recently have started talking about the dangers of artificial intelligence, of creating a being more intelligent than we are. Could we trust it? And would it try to take over the world? They actually brought up this very subject in the Star Trek: TNG episode "Elementary, Dear Data". The story is unique because in it the AI character, Professor Moriarty, is created bad but chooses to become good. As I said, people fear things might go the other way around for us. Anyway, I have selected a few of the more memorable quotes from the episode here. I have a question at the end...



    "He is still a fictional character, sir, originally programmed with nineteenth century knowledge."

    -Data, "Elementary, Dear Data",
    Stardate: 42486.3,
    Original Airdate: 5 Dec, 1988.




    "Very strange. You're beginning to sound very different from the Moriarty I've read about."

    -Dr. Pulaski, "Elementary, Dear Data",
    Stardate: 42486.3,
    Original Airdate: 5 Dec, 1988.



    "I'm a civilized abductor, Captain Picard. Civilized? but still dangerous."

    -Moriarty, "Elementary, Dear Data",
    Stardate: 42486.3,
    Original Airdate: 5 Dec, 1988.



    "What are you?"
    "If you refer to the arch you ordered, it provides computer control. Do you wish to input any commands?"
    "Not at this time."
    [Arch disappears]
    "It's dark magic, Moriarty!"
    "The best kind, I'm sure."

    -Moriarty, prostitute and computer, "Elementary, Dear Data",
    Stardate: 42486.3,
    Original Airdate: 5 Dec, 1988.




    "In programming Moriarty to defeat me, not Holmes, he had to be able to acquire something which I possess."
    "What exactly?"
    "Consciousness, sir. Without it, he could not defeat me."

    -Data and Picard, "Elementary, Dear Data",
    Stardate: 42486.3,
    Original Airdate: 5 Dec, 1988.



    "My mind is crowded with images, thoughts I do not understand, yet cannot purge; they plague me. You and your associate look and act so oddly. Yet though I have never met nor seen the like of either of you, I am familiar with you both; it's very confusing. I have felt new realities at the edge of my consciousness readying to break through. Surely, Holmes, if that's who you truly are, you of all people can appreciate what I mean..."

    -Moriarty, "Elementary, Dear Data",
    Stardate: 42486.3,
    Original Airdate: 5 Dec, 1988.




    "I have no idea what you're talking about."
    "Of course you do, madam. The more you proclaim your ignorance, the more you try to mislead me, the more I'm on to you. Your every silence speaks volumes."
    "Good. Then if you know what I'm saying when I'm not saying anything, what do you need me for? Thank you for the tea and crumpets. I guess I'll be going."

    -Dr. Pulaski and Moriarty, "Elementary, Dear Data",
    Stardate: 42486.3,
    Original Airdate: 5 Dec, 1988.




    "You'd be a big hit in London."

    -Commander Riker, to Worf, "Elementary, Dear Data",
    Stardate: 42486.3,
    Original Airdate: 5 Dec, 1988.




    "The Doctor was right. Finally, we have a game worth playing."
    "The time for games is over."
    "Professor Moriarty, I presume?"

    -Lt. La Forge, Moriarty and Data, "Elementary, Dear Data",
    Stardate: 42486.3,
    Original Airdate: 5 Dec, 1988.



    "It's elementary, dear Data."

    -Dr. Pulaski, "Elementary, Dear Data",
    Stardate: 42486.3,
    Original Airdate: 5 Dec, 1988.



    Anyway, my question: Would artificial intelligence probably be bad? Or good? Some people think giving it emotions might come into play. But how? Do emotions make you bad or good? I was thinking that someone without human empathy might be capable of doing more evil, but someone ruled by their negative emotions would be more cruel and out of control on average.

    Actually, they talk about this last issue in the TOS episode "Mirror, Mirror". Spock famously says there:

    "It was far easier for you as civilized men to behave like barbarians, than it was for them as barbarians to behave like civilized men. I assume they returned to their Enterprise at the same time you appeared here."

    That is an old way of looking at the matter, because we all know serial killers are often cunning and well able to hide their true intentions. "Mirror, Mirror" brings up another interesting point too: it is harder for a good person to pretend they're evil, because then they may have to do an evil act. Like when Capt. Kirk was ordered to destroy the Halkan cities in the parallel universe...
     
    skip likes this.
  2. Constantine666

    Constantine666 Members

    Messages:
    555
    Likes Received:
    585
    Star Trek: The Next Generation, Season 6, Episode 12 ("Ship in a Bottle")

    Reg Barclay accidentally reactivates Moriarty, who takes control of the Enterprise again and holds it hostage until Picard can find a way to allow him to leave the Holodeck.
    He is eventually defeated when the crew creates a Holodeck within a Holodeck, which lets Moriarty believe he has left the Holodeck and taken a shuttlecraft to explore the galaxy with the Countess Regina.
    He is then put into a self-powered holo-cube and put back in storage.
     
    skip likes this.
  3. Jimbee68

    Jimbee68 Member

    Messages:
    1,415
    Likes Received:
    528
    So you're saying he became bad again? He always said it was because he was "written" that way. He did give them 48 hours to comply. Countess Regina actually bothered me more. She just went along with everything her boyfriend said. She was kind of ditzy though ("I got to wear trousers all the time!").
     
  4. Jimbee68

    Jimbee68 Member

    Messages:
    1,415
    Likes Received:
    528
    Oh, "back in storage". You mean he was rendered harmless. I don't know about that one. AI's do have some rights. There is a utilitarian thought experiment. The Experience Machine. What if we all were just put in our separate "matrices" and allowed to go on living separate lives, though we believed we were still somehow together. Our interactions mirrored what the others were experiencing. So why the heck not, the question is (asked). Well, like that thought-experiment-cum-dilemma points out. Interaction is a valuable moral end unto itself. We may not need it. For practical purposes it may not be needed (cf. that experience machine). But it is moral. And for now we just don't know why.

    Now see, that episode made me very angry. They cheated the professor. Capt. Picard did not keep his word. Keeping your word may not be necessary in a Huxley/utilitarian/Euclidean world, but it's important to me, dammit. It's important to all of us. And we all seem to intuitively know why, though we can't say exactly why at the moment.
     
  5. Constantine666

    Constantine666 Members

    Messages:
    555
    Likes Received:
    585
    The Countess Regina was an enabler, a yes-woman to bolster Moriarty's darker impulses.
    Come on ... Moriarty was written as a bad guy to challenge Holmes, but in TNG he was recreated to rival Data. He became sentient, but instinctively he knew there was no way to remove a construct of light from the one place designed to project it. He's too intelligent. So giving them 48 hours to make it happen was just a way of telling them he was going to enact his plans anyway.
     
  6. Constantine666

    Constantine666 Members

    Messages:
    555
    Likes Received:
    585
    No, no ... I never said "harmless." In an age of wireless communications, air-gapping is useless if the device contains any type of wireless monitoring system. He could reappear given time.
    TNG did set a precedent for AI rights, but only so far as it related to corporeal AIs. Data is autonomous; his maintenance and upkeep could be compared to the maintenance and upkeep of the human body. The same cannot be said for a holographic AI. Moriarty could be considered the same as you creating your own AI chat online and using a famous person's face as the avatar.

    Keeping his word was not something Moriarty ACTUALLY expected of Picard, and vice versa. As I mentioned above, Moriarty is too intelligent to truly believe that Picard (or even Data) could come up with a way to set him free of the Holo-Emitters when he himself, with full access to all the resources of a Galaxy Class Starship, sciences and all, and the full use of Federation databases, couldn't find a way; and he claimed to be more clever and intelligent than the rest of the crew of the ship. No, he expected Picard to find a way to placate him for a time. It's an ongoing story, which may in fact play a part in upcoming Star Trek projects. They may even bring him up when they start writing for ST: Legacy. Remember, Geordi has restored the 1701-D, so that Holo Storage cube is still around somewhere. It may even find its way aboard the 1701-G (formerly the Titan), in which case Moriarty could match wits with Seven of Nine and Jack Crusher (Picard's son, whom Moriarty would consider the heir to his father's promise).
     
  7. Constantine666

    Constantine666 Members

    Messages:
    555
    Likes Received:
    585
    Let's also not forget that we did see Q come back. So it is not outside the realm of possibility that Q could cause the Moriarty program to appear in the 1701-G Holodeck.
    JACK: You told my father humanity's trial was over.
    Q: It is ... For him. But I'm here today because of you. You see yours, Jack, has just begun.
     
    skip likes this.
  8. Jimbee68

    Jimbee68 Member

    Messages:
    1,415
    Likes Received:
    528
    Understood. :)
     
  9. skip

    skip Founder Administrator

    Messages:
    12,839
    Likes Received:
    1,712
    Those episodes with the Holodeck were some of the best. I also liked the one with Riker and that sexy woman, Minuet. She purposely distracted Riker so he wouldn't notice the crisis with the ship.
     
