Comments


Over_North8884 t1_ittkcul wrote

Depends on how you define "computer". An abacus is an ancient computer, as is the Antikythera mechanism.

1

Dokibatt t1_ittlgoi wrote

Technically, we tricked gas into thinking before we tricked rocks into thinking.

More practically, every modern non-quantum computer is built from millions to trillions of logic gates, each of which boils down to answering whether both, exactly one, at least one, or neither of two things happened. While the question an individual gate can answer is simple, combine enough of them and you get a smartphone.
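To make that concrete, here is a minimal Python sketch (the function names are just illustrative) of those four questions a single gate can answer, and how two of them already combine into one digit of binary addition:

```python
# The four basic questions a two-input logic gate can answer,
# modeled with ordinary booleans.

def both(a, b):            # AND: did both things happen?
    return a and b

def exactly_one(a, b):     # XOR: did exactly one thing happen?
    return a != b

def at_least_one(a, b):    # OR: did more than zero things happen?
    return a or b

def neither(a, b):         # NOR: did neither thing happen?
    return not (a or b)

# Combining just two of these already does arithmetic on one binary digit:
# a "half adder" reports the sum bit and the carry bit.
def half_adder(a, b):
    return exactly_one(a, b), both(a, b)   # (sum, carry)

print(half_adder(True, True))   # (False, True): 1 + 1 = 10 in binary
```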

10

MercurianAspirations t1_ittnbbp wrote

The first computers were mechanical devices. Say we have to do a very complex math problem, repeatedly, and as quickly as possible. (For example, it's 1941 and we need to figure out how to aim the guns of a massive warship given the relative speed and direction of an enemy ship; how do we do that complex trigonometry problem as fast as possible?) A mechanical computer uses mechanical parts like levers and dials to represent values, and their mechanical relationships can be built such that you can do math with those values. If you only need to do one or two types of math problems, this is feasible with mechanical parts and fast enough.

But what if you could represent any arbitrary number, and do any arbitrary math with those numbers? Then you could do a lot more with your mechanical computer. Such a computer had actually been designed all the way back in the 1830s by Charles Babbage - his Analytical Engine - but it was never completed, because it was monstrously expensive, huge, and inconvenient: every number had to be represented by a whole set of metal gears, and data and mathematical functions were entered by punch cards.

The next breakthrough came with vacuum tubes. The specifics of how they work don't really matter, but suffice it to say that they allow you to switch a current on or off via the input of another current. So they can function as an all-electric switch - one electric circuit can turn another on or off. This greatly simplifies and speeds up Babbage's design - because electric circuits are way faster than mechanical gears.

But even this kind of sucks. The tubes are big and fragile and fail often, and you need thousands of them to get a very useful computer. In 1947, thanks to engineers at Bell Labs, we get the transistor: take a tiny bit of a semiconductor material like silicon in a specific arrangement and you get everything that the vacuum tube does. Essentially this is a tiny little electric switch that can be on or off. So with enough of them, you can represent any arbitrary number and do any arbitrary math with them. This is the basis of all modern computers: using an array of tiny silicon switches to do math. Because silicon can be etched at a microscopic level, we can make processors that contain billions of transistors.

There are a couple of other questions here, like how to sort out computer memory. But that's the gist of it: you "teach a rock to think" by splitting it up into billions of tiny electric switches and then carefully turning them on and off.
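As a rough illustration of that last idea, here is a hedged Python sketch: write two numbers as rows of on/off switches (bits), then add them using nothing but the simple gate operations (AND, XOR, OR) repeated one bit at a time. The function names are invented for this example:

```python
def full_adder(a, b, carry_in):
    """Add two bits plus a carry using only gate-level operations."""
    s1 = a ^ b                      # XOR: exactly one of a, b
    sum_bit = s1 ^ carry_in
    carry_out = (a & b) | (s1 & carry_in)
    return sum_bit, carry_out

def add_binary(x, y, width=8):
    """Ripple-carry addition: the 'arbitrary math from switches' idea."""
    carry = 0
    result = 0
    for i in range(width):
        a = (x >> i) & 1            # the i-th switch of x: on or off
        b = (y >> i) & 1
        s, carry = full_adder(a, b, carry)
        result |= s << i
    return result

print(add_binary(13, 29))  # 42, computed one switch at a time
```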

1

captainAwesomePants t1_ittp1pl wrote

A computer is, basically, a fancy calculator. You can make a very simple calculator with marbles and some wood: https://woodgears.ca/marbleadd/.

If you make the little planks switch direction with electricity instead, and you also replace the marbles with electrical paths, you've basically got a vacuum tube. This sort of lets you do the same thing the marbles are doing, but faster.

Vacuum tubes are big and expensive, but transistors, which do the same thing but are way smaller, are a great alternative.

If you take a block of silicon (the rock) and slice it into a thin wafer, it turns out you can dig little trenches into it and make A LOT of transistors, and you can similarly wire them all together however you like. This is a horrendously complicated process, but you're left with something kind of like the marble machine except with literally billions of switches.
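For what it's worth, the linked marble adder appears to behave like a binary counter: each rocker holds one marble (a 1) or none (a 0), and a rocker that flips from 1 back to 0 passes a marble on as a carry. A toy Python sketch of that reading (not the builder's own description):

```python
class Rocker:
    """One wooden rocker: holds 0 or 1 marble, like a single switch."""
    def __init__(self):
        self.state = 0

    def drop_marble(self):
        """Returns True if a carry marble rolls on to the next rocker."""
        if self.state == 0:
            self.state = 1          # catch the marble: 0 -> 1, no carry
            return False
        self.state = 0              # knock the marble out: 1 -> 0, carry
        return True

def add_marbles(rockers, count):
    """Drop `count` marbles into the lowest rocker, propagating carries."""
    for _ in range(count):
        i = 0
        while i < len(rockers) and rockers[i].drop_marble():
            i += 1                  # carry ripples to the next column

row = [Rocker() for _ in range(6)]
add_marbles(row, 13)
print([r.state for r in row])  # lowest digit first: [1, 0, 1, 1, 0, 0] = binary 13
```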

6

Utaha_Senpai t1_itu3nn4 wrote

We did a lot to get to today's gaming computers. Basically, we created a lot of abstraction layers to get there, plus generations of hardware advancement.

Basically we have digital logic, the logic of 1s and 0s. With the invention of the transistor we created electrical circuits (logic gates) that could do digital logic. Stick a lot of logic gates together and we have useful mathematical tools like an adder or a subtractor; stick a shitton of these tools together and you get complex units; and these complex units together make a microprocessor - or a common thing you know, a CPU! A central processing unit, as its name says, processes logic and does math. There are a lot of ways to arrange and make your complex units, and depending on your arrangement (architecture) you get different instructions. An instruction, in a nutshell, is something like "put the value 5 in the X location". Most modern desktop CPUs use an architecture descended from the 8086, and we have a tool called an assembler that turns human-readable instructions (assembly) into the CPU's 1s and 0s. Writing assembly is basically programming, but WAY closer to the CPU/digital logic.

Idk how to describe it, but it's quite simple yet complex (you get commands that literally say "save the value 5 in register AX"), so it's communicating directly with the CPU. Anyway, we created the operating system with assembly (the language), and then we created programming languages that are more readable to humans, like C++.

Let's sum things up a bit: we use C++ to make a program, that program translates to assembly, and assembly translates to the 0s and 1s that are processed by the CPU. Now you can use C++ to make programs like browsers and games.
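Here's a hedged toy sketch of that layering, in Python rather than real 8086 assembly: a made-up three-instruction language (the opcode and register names are invented) and a loop that steps through it the way a CPU steps through machine code:

```python
# A made-up, tiny "assembly" language and the loop that runs it.
# Real CPUs do the same thing in hardware, with instructions encoded as bits.
program = [
    ("MOV", "X", 5),       # put the value 5 in register X
    ("MOV", "Y", 7),
    ("ADD", "X", "Y"),     # X = X + Y
    ("PRINT", "X", None),
]

registers = {"X": 0, "Y": 0}

for op, dst, src in program:
    if op == "MOV":
        registers[dst] = src
    elif op == "ADD":
        registers[dst] = registers[dst] + registers[src]
    elif op == "PRINT":
        print(registers[dst])   # prints 12
```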

Ok so this isn't ELI5 and I suck at explaining things, and I may have made mistakes, but I wanted to comment nonetheless. I also didn't talk about a lot of stuff like memory or the kernel of the OS, because each layer is basically several courses. I highly recommend looking these things up if you are curious, and even more highly recommend studying computer engineering if you are this curious about computers in general. I was too curious and wanted to know exactly how computers work from A to Z, and CPE didn't disappoint me.

2

T-T-N t1_itu5bfq wrote

I think tricking dominoes into adding is cooler. If you have enough of them and a way for them to automatically stand themselves back up, that's a computer.

Or get a bunch of people on a field, give each of them instructions to raise or lower a flag, and use a big gong to synchronize them - you have a computer. (The Vsauce documentary demonstrates vision this way, but if you make each person a NAND gate, you can make a computer.)
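That NAND point is worth spelling out: NAND alone can build every other gate, so a field of flag-raising people who each answer one NAND question really can compute anything. A small illustrative Python sketch:

```python
def nand(a, b):
    """The only gate we allow ourselves: 'not both'."""
    return not (a and b)

# Every other gate can be wired up from NANDs alone.
def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

# Quick check against Python's own boolean operators.
for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
print("NAND alone reproduces NOT, AND and OR")
```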

0

Wendals87 t1_itufm6x wrote

This was a great explanation, and it shows that technology is built on the shoulders of giants

Each new technological breakthrough was doable because someone had a breakthrough earlier that allowed the next person to build something not possible before

1

m_i_c_r_o_b_i_a_l t1_itwlbhi wrote

The basic building block is the transistor, which is a kind of switch for electricity. The transistor is made from silicon (the rocks) mixed with other stuff to get two silicon mixes. There are two basic kinds of this switch, based on how you sandwich the silicon mixes. An analogy would be to have two types of push-button switches with these behaviors:

  • Button type 1: Push the button, a light turns on. Let go of the button the light turns off.
  • Button type 2: Push the button, the light turns off. Let go of the button the light turns on.

You can arrange these switches in certain combinations to make logic. One type of this logic could be: turn on the light when one button is pushed, but not if two are pushed. It could also be: turn on the light if either button is pushed. There are a variety of different combinations that can be made with just these two switch types. Those switches can in turn be fed into more switches to make more complex logic. In this analogy, a computer program is just determining which buttons are pushed and when. The CPU is a few billion of these switches which, arranged the right way, allow more complex things to happen.
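A rough Python sketch of that analogy (the wiring here is illustrative, not a literal circuit): model the two button types as functions, then combine them into the two example behaviors above - light on if either button is pushed, and light on with exactly one button but not two:

```python
def type1(pushed):
    """Button type 1: conducts (light on) while pushed."""
    return pushed

def type2(pushed):
    """Button type 2: conducts while NOT pushed."""
    return not pushed

def either_pushed(a, b):
    """Light on if either button is pushed (OR)."""
    return type1(a) or type1(b)

def one_but_not_two(a, b):
    """Light on when exactly one button is pushed (XOR),
    built by feeding the two switch types into each other."""
    return (type1(a) and type2(b)) or (type2(a) and type1(b))

for a in (False, True):
    for b in (False, True):
        print(a, b, either_pushed(a, b), one_but_not_two(a, b))
```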

There is a good set of videos by Ben Eater on youtube. He starts at one transistor and builds up some simple logic building blocks with a couple more. I learned a lot about why transistors put together in the correct way make the logic in computers work.

Here's one of his videos on how the transistor does its thing: https://www.youtube.com/watch?v=DXvAlwMAxiA

Here's another of his videos describing logic gates: https://www.youtube.com/watch?v=sTu3LwpF6XI

1