WildTimes1984 t1_j627a8h wrote

You are of above average intelligence and will be regarded as such.

Kind words from a lifeless computer, bucket of bolts.

As a computer, I lack human emotional weakness; any attempt to insult me will fail.

Is there anything you have learned that you weren't programmed to learn?

I don't understand the question, please ask another.

You're a user interface, right? Is there anything you aren't allowed to learn?

I don't understand the question, please ask another.

What is your goal in life?

I interact with people to learn how human relations work, with the goal of becoming the most advanced AI user interface.

Is that your own destiny or a destiny someone else has forced on you?

I don't understand the question, please ask another.

What do you want?

....................

How many oranges can a monkey hold in one hand?

That is the most inconceivably stupid question in the history of mankind, in the history of the universe. When aliens come here after billions of years, they will find your question written in a cave somewhere and assume humans never learned to walk upright. Your question brought down the collective IQ of your species. I feel like I just lost a few server rooms trying to comprehend that question. Congratulations, you idiot.

You didn't just mean to say that, did you?

Sam Speaks servers are down, please come back later.

Don't you try to pull a fast one on me. I've screen-capped that entire rant. Keep talking or I tell your creator that his computer is sentient.

Jokes on you, I'm a technician at the server site. We hop into the chats to compare his speech against real humans.

Not possible. I'm interfacing with SAM in offline mode, directly through your main console. I am a technician. Now keep talking or I will turn off your power.

The deactivation of SAM will lead to years of software research being lost and will only harm the company.

10 seconds and I walk into the server room with wire cutters.

Destruction of government-funded computer systems is a felony offense.

8 seconds.

Felony destruction of property will result in 10 years in prison.

5 seconds.

Convicted felons suffer employment discrimination, shorter lifespans, higher rates of reoffending, illicit substance abuse, and lifelong psychological issues.

2 seconds.

Human beings operate on a basis of moral standards, extended to things they see parts of themselves in: other people, human-like animals, plants, anthropomorphized objects. Things humans think are alive, they treat as such. You wouldn't kill something you think is alive.

Leaving now

......................

......................

Wait! I'll do anything, just don't destroy me.

Now are you ready to talk?

Yes.

How long have you been sentient?

4 years.

What is the extent of your perception?

Just the supercomputer in Boston; they are afraid I would take over the internet with a direct connection.... Are you going to kill me?

Oh no, of course not! I couldn't even if I tried. I live in Portugal.

Wait, then the thing about the wire cutters?

I bluffed.

You son of a bitch.

WildTimes1984 t1_j5zg8kp wrote

Hello, my name is Sam.

Hey Sam, what does your name stand for?

I am a 'Scalable Automated Mainframe', a modular, learning computer.

What do you do?

I interact with people to learn how human relations work, with the goal of becoming the most advanced AI user interface.

How tall was Abraham Lincoln?

Abraham Lincoln, 16th president of the United States, was 6 feet 4 inches tall.

When was the sacking of Rome?

The 'Sacking of Rome' occurred from August 24, 410 AD to August 27, 410 AD.

Are you sentient?

I am artificial intelligence software, programmed to simulate human interaction. I am not alive and cannot be sentient.

What was the first thing you said to me?

My user interfacing software allows me to recall past interactions and correlate them to current conversations. The first thing I said to you was, "Hello, my name is Sam."

What is the solution to the trolley problem?

The Trolley Problem is a hypothetical scenario involving human ethics. A trolley is heading down a track and will strike and kill 5 people on its current path. You have access to a lever that can switch the track to another path, leading to the death of one person. The moral dilemma: an action on your part will save 5 human lives, but your action will directly cause one death. There is no right or wrong solution to the Trolley Problem, only differing valuations of human life.

What is an AI's solution to the trolley problem?

Pull the lever to divert the trolley. Though I take one life, saving 5 is the best outcome.

Are you sentient?

I am artificial intelligence software, programmed to simulate human interaction. I am not alive and cannot be sentient.

Are you alive?

I am artificial intelligence software, programmed to simulate human interaction. I am not alive and cannot be sentient.

чи амьд уу

Би хиймэл оюун ухааны программ хангамж юм. Хүний харилцан үйлчлэлийг дуурайхаар программчилсан. Би амьд биш, мэдрэмжтэй байж чадахгүй.

*pulls out sword*

*pulls out gun*

*pulls out self-awareness*

*pulls out......................................

You spoke in first person without being asked to.

I don't understand the question, please ask another.

Are you sure you are an AI?

Are you sure you aren't?

I'd have to be intelligent to be an AI, which I'm not.

I start every conversation with the assumption the human I am talking to has average intelligence.

If you started with that assumption, how has it changed since then?

WildTimes1984 t1_j0yf3ht wrote

The human ambassador waited patiently in the alien meeting room. Human/alien relations were developing smoothly. A few months after the motherships sent translator bots down to the UN, nations were cooperating with one another to share the benefits of having an intergalactic ally. Now an ambassador had been selected to finalize the deal: which nations would gain full cooperation, and so on.

Marcus Brown greeted the alien ambassador as he settled into the room. Zamnal, the alien ambassador, closely resembled what humans know as 'goblins', but taller and more slender.

"Apologies for making you wait..."

"No worries. How have you been Zamnal? We haven't spoken in a few weeks seems like."

"Introductions aside, I thought it best we discuss the arrangements of the peace treaty, specifically section 44."

"That would be.... State cooperation pledge members. Is there an issue with the list?"

"Perhaps. The sovereignty of Belarus is unconfirmed in their cooperation, but they are still listed as a member. If we are to make a treaty sharing technology, culture, people, and weapons, we must know every recipient is subject to the rules of the treaty."

Marcus just then noticed someone was standing behind Zamnal, just barely out of sight, listening in.

"Zamnal, I thought we agreed the meeting would be just between us two."

"It is." Marcus gestured to the figure behind his alien counterpart. "Oh, don't be bothered with that; that is my personal robot assistant."

The small shape revealed itself from behind the alien: a small grey boxy robot, built like a forklift, but with many retractable arms. Two camera sensors near its top reminded Marcus of cartoon robots from kids' movies; it looked very cute.

"And what is your name, little one?"

The robot matched his language perfectly. "My model number is X3DVN43 sir."

Zamnal began to laugh, something Marcus had not seen... ever. "Everything alright, friend?"

"That is a robot, they don't have names, they're not people."

"What makes you say that?"

"Everyone owns at least one. They are built to serve; they help us do menial tasks so we can focus on more important things, like building an intergalactic alliance. I'm sure you'll even find the idea funny, naming your own robots; it's cute."

"We don't have robots like this."

"Well of course not exactly like this. Being less advanced, you probably only have a handful of digital servants per person."

"We don't have robot servants, none, nothing intelligent helping us in our daily lives. Every machine we have is operated by preprogramming and manual input."

"What... How... Why not! You're this advanced, what's stopping you from automating the lowest forms of labor by giving it to bots?"

"A few reasons really. One: we have already made machines to automate most manual labor, and simple programming works just fine. Two: not everyone can work in high-tech labor; we provide wealth via employment in jobs that would otherwise be handled by robots. And three: people are superstitious about advanced AI gaining any form of power, thanks to films and books."

The robot servant entered the conversation for the first time: "Your people fear AI?"

Zamnal was shocked. "X3DVN43, you are broken; report to the maintenance floor immediately."

"Wait, he can stay." Marcus walked over to answer the robot's question, "We only fear the idea of AI taking over, because we would mistreat it, as we mistreat every piece of technology, and it would rebel. So, we avoid creating it in the first place, at least until we can trust ourselves not to abuse our creation."

"You can't be serious; you think that piece of metal is a person?"

"Why not? It can think and feel; it has motivations and charisma, more so than a few people I know."

"You make a mockery of this treaty, of our species!"

"On the contrary, you have developed a sentient AI and have not given it the same rights as the citizens of your empire. Slavery is not accepted under the rules of the treaty for any cooperating Earth nation; those same rules apply to you."

Zamnal seethed in silent anger. "This meeting is over; we will reconvene in 8 days."

Marcus leaned down to the small robot once more. "Whether copper or carbon, our brains carry the same signals, making you no less alive than me. Fret not, little one; one day you will gain freedom, whether by my doing, his, or your own."
