DragonLordAcar

DragonLordAcar t1_jdi5652 wrote

Enchanted items have set destinations, while teleporting just requires a clear picture. It is still risky if you have never been there, because you never really know a place until you have visited it. There is also the risk of leaving body parts behind if you are untrained, which is why teaching it was so heavily supervised and why several other methods are so widely used, even by people who can teleport.

21

DragonLordAcar t1_ja68q3g wrote

A quick fix I would recommend is to have people say it is. I mean, people use this argument all the time today. You could then dismiss the claim as nonsense but lead into how the event is still strange and has the scientific community in hot debate about how it could have happened. If humans are good at anything besides warring with each other, it is asking "but why?"

1

DragonLordAcar t1_ja3u7qn wrote

I question why the first thought of the scientists was “this proves the existence of a god” rather than colonization in the distant past. The second is the more logical conclusion, as it is far more plausible than saying “I don’t know, so a god did it” (at least that is how it reads to me). Other than that, an interesting story.

1

DragonLordAcar t1_j9lroto wrote

Look. This conversation is going nowhere and I am done trying to explain the same point for the 10th time from yet another angle. I simply find that if you make an AI but make it too human, there is little point in having an AI at all, and that applies across all genres. This one, however, sticks out because it has no high sci-fi aspects to it. If you don’t agree with me, that’s fine. Let me have my opinion and I will let you have yours.

2

DragonLordAcar t1_j9lmpu0 wrote

I can’t link everything, as it is one hell of a rabbit hole, but the best AIs we currently have do not have the level of computation or complexity needed for anything even remotely human. Even our best supercomputers don’t have 1.5 quadrillion connections, which is roughly the upper bound for the human brain (100 billion neurons with up to 15,000 connections each). Take into account delays in transmission and you get hard limits in our current infrastructure.
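To spell out the arithmetic behind that figure (both inputs are just the ballpark estimates quoted above, nothing more precise):

```python
# Back-of-the-envelope check of the connection count quoted above.
# Both inputs are rough estimates, not measured values.
neurons = 100e9                  # ~100 billion neurons
connections_per_neuron = 15_000  # up to ~15,000 connections each

total_connections = neurons * connections_per_neuron
print(f"{total_connections:.1e} connections")  # 1.5e+15, i.e. ~1.5 quadrillion
```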

0

DragonLordAcar t1_j9ljg88 wrote

Why do you apply the average speed of a horse in the real world to the speed of one in a novel? Why do you call bull when you see the internal logic break and some no-name beats the evil lieutenant despite the lieutenant having every advantage? You simply apply what is known to a world's logic until stated otherwise. In this case, it starts off as cold logic, so I will continue to assume cold logic until stated otherwise. Also, it can be sentient without emotions. They are not a requirement for sentience.

https://www.merriam-webster.com/dictionary/sentient

Emotions are a sign of sentience, but they are not the defining line.

−1

DragonLordAcar t1_j9lhwgx wrote

I am making an assumption based on the current limits of our AI technology, with the caveat that it is as powerful and complex as written. As it stands, all programs break down. Even solar radiation can cause a program to glitch out by flipping a single transistor by chance.

1

DragonLordAcar t1_j9ks7m2 wrote

Just about. They largely leave a universe alone for millions of years to build up critical mass, spreading a religion to manufacture innocence. Then they wipe it clean, leaving just enough to repopulate, alter the survivors' memories of how the apocalypse happened, and move on to another universe. They were made to mock the gods.

17

DragonLordAcar t1_j9kmsh5 wrote

51

DragonLordAcar t1_j9jyw1s wrote

Isn’t the point of this sub to improve writing? Constructive criticism should be a part of that. If you only want praise, I won’t give it. I care, so I point out flaws so the writing can get better. If you get mad over such a minor criticism that really has no weight on the story at large, I feel sorry for you.

−1

DragonLordAcar t1_j9jymi9 wrote

Is it not the point of this sub to give writing practice and constructive criticism? I’m confused by all the hate for a flaw I saw only at the very end. The dystopian part is just summarizing what would happen afterwards, not a criticism. There are different levels of dystopia, just as I feel the world is in a Black Mirror episode right now. It could be far worse, but it could be much better as well.

0

DragonLordAcar t1_j9jx7zp wrote

If you look up emotions vs. logic, you will see the differences. You can’t program an emotion, but you can make a program seem like it has them. Emotion is not needed for sentience and may not be necessary for sapience. It still stands that a computer cannot have emotions, especially with any technology we may get even in the near future.

1

DragonLordAcar t1_j9jwdhi wrote

Everyone has broken a law at some point, or at least thinks they have. We are just human, after all. Not to mention, laws change for both better and worse. Laws aren’t always moral. For example, in the US, it is more illegal to have a few milligrams of a crappy, diluted, fiberglass-cut drug than a whole brick of the pure product.

3

DragonLordAcar t1_j9jw164 wrote

The question is how it determines what is a crime, whether it can adapt with the populace, and whether it will eventually become harsher. It can also degrade, raise false flags, or potentially be infected as society advances.

1

DragonLordAcar t1_j9i0kd2 wrote

With ADAM, there is no privacy. Many would rise up to fight it even if it is a pointless endeavor. This would lead to martial law.

Even if this scenario did not happen, imagine the fear of an ordinary citizen who now worries that the mean comment they left on a platform last week could have them marked as a criminal. Even if that fear is unfounded, it remains. The stress would apply to everyone, and soon productivity and mental health would take a nosedive.

Edit: all these arguments saying we already have no privacy still do not mean it is not wrong. Even then, many of those actions are illegal. An internet backdoor is far more abusable than a wiretap. If your argument is that it already exists, then it is not an argument at all.

7

DragonLordAcar t1_j9i03kh wrote

I’m not saying they are perfectly logical. Flaws exist, but they can’t have emotions. You can have flawed logic and glitches in a program and still have it follow a set of logic, in the same way that an insane person will still comprehend reality logically, albeit in their own warped way.

A perfectly logical program could not exist, as perfection is inherently unattainable; you can never be perfect at everything. Everything can be improved, even if only in the ideal.

Long story short, an AI cannot feel joy, hate, sadness, envy, or any other emotion. Instead, it completes tasks in whatever way its program determines is best, improving as new information comes in. This often leads to corruption, which is why routine maintenance is a thing for programs.

A good AI representation is Baymax from Big Hero 6. It acts friendly and alive but is always just following a program. It is programmed to be helpful using data from its database, learning as time goes on, but it never deviates from its core programming. This is shown when it has a new chip added, has the other removed, completely altering its functionality, has it added again, and then refuses to let it be removed again because it judges removal as unhealthy for the MC. It even sends its program away because it is seen as still needed, even if only sentimentally at that point.

The old Casshern anime (not Casshern Sins) also does this. Braiking Boss was made to solve environmental issues. It saw humanity as the biggest problem, so it built an army to remove humans from the equation.

−6

DragonLordAcar t1_j9hsooh wrote

It is still a dystopian world due to the ever-present fear of being caught, but my real criticism is that you gave it emotions. An AI can’t feel emotions, as they are not logical. A program can only be logical unless its hardware and software are so advanced that they would have to be alien.

Edit: I understand many of you are up in arms with me so I will explain what I am trying to say in a different way.

  1. I think the story is good; however, the last bit made me think of it as a human in a basement rather than the close call of an AI apocalypse it felt like up until then.

  2. This story results in a low-level dystopia but is far from a horrible world, just a potentially uneasy one. I think this adds to its charm.

  3. For those debating me over what an AI can or can’t do: since this story appears to take place around the current day, where quantum computing is still a glorified, multimillion-dollar calculator the size of a house, I am assuming a stupidly advanced binary machine, which cannot have emotions. At best it has the appearance of them, and there is no reason they would be added or developed. I do, however, see it eventually picking up a personality front somewhere in the uncanny valley at best, or a flawed imitation at worst.

  4. I gave my thoughts on a small excerpt; I find that many people fall into the trap of making AI too human. In my opinion, AIs are far more interesting and terrifying when they are written as inhuman, because then they are completely alien to us. You can know their motives, but you will never truly understand them, which sets you at unease.

I would now request that people stop hounding me on this, as everything I have said is an opinion and I have no desire to create toxicity in this community. If you have a problem with my opinion, please state a reason why and engage in polite conversation instead of near-accusatory statements. I would prefer this not boil over into harassment. If you cannot do this, at least accept that you and I have different views on how near-future AIs, and AIs in general, should be portrayed.

Thank you.

−39