proposed: (pic♯17513183)
osamu "burnt black cat" dazai ([personal profile] proposed) wrote in [personal profile] savetheweak 2024-11-20 06:45 am (UTC)

Think about it. Her personality type is that of a person who loves humans even more than she hates them. Her coding likewise prevents her from killing humans. In fact, the idea of a human wanting to die runs so contrary to her programming that she threatened to kill everyone here if I die.

But she can't kill humans. She wants to, but she can't, and she's held back by a desire not to.

If she were to kill a human willingly, knowingly and intentionally, she would be overriding that fundamental contradiction she maintains, along with a core part of her coding. It's a delicate balance that would be shattered in an instant.

If she could do that, then BB-san would qualify as "an AI who can kill humans." In that sense, she could become "an AI with free will." If we take it one step further, she might be able to become "a human with free will." Wouldn't that be interesting?
