
UnrulyNemesis t1_iwjg0yh wrote

Reply to comment by Nieshtze in A typical thought process by Kaarssteun

Bro, you just spit in the face of everyone who works or studies in STEM lmao 😂 We are living in a time where people in every science field are overwhelmed by constantly changing and improving technology, and the improvement looks almost exponential. For example, AI recently went from being able to predict a couple hundred protein structures to predicting the structure of almost every known protein and many of their interactions. This means you can sit down and design a drug from scratch on your computer before you touch a single physical compound. That is going to revolutionize the pharmaceutical industry, and other industries are going through similar constant transformations.

If anything, I'm worried we are moving too fast without asking whether we should. For example, we should ban the creation of stronger synthetic opioids that can be abused and instantly kill someone if the dosage is off by a milligram.

Of course, I can see your point with some technology, like smartphones. However, those technologies are kept purposely stagnant to make more money. For example, why do you own both a phone and a laptop when your phone is as powerful as a supercomputer from a couple of decades ago? Because selling two devices is more profitable than selling a phone that can also connect to a bigger screen and keyboard and work perfectly as a laptop (the software on current devices that can do this is purposely glitchy and subpar).

Tldr: Technology is developing dangerously fast, and if you don't see that progress, you are probably not working in a specialized STEM field and are looking at technologies, like smartphones, that are kept purposely stagnant to make more money.


visarga t1_iwk988q wrote

Moore's law slowed down from 2x every 1.5 years to 2x every 20 years, and we're already 7 years deep into this stage. Because of that, AI research is expensive, state-of-the-art models are inaccessible to normal researchers, and democratic access to AI is threatened. Building a state-of-the-art fab is so necessary and so difficult that it has become a national security issue. I think there's room for concern, even while acknowledging the fast progress.
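To put those two doubling periods in perspective, here's a rough back-of-the-envelope sketch in Python (the 1.5-year and 20-year figures are just the ones quoted above, not measurements):

```python
# How much raw improvement 7 years buys under each doubling period.
years = 7
for label, doubling_period in [("old Moore's law (1.5 yr)", 1.5),
                               ("slowed-down rate (20 yr)", 20.0)]:
    factor = 2 ** (years / doubling_period)
    print(f"{label}: ~{factor:.1f}x in {years} years")
# old Moore's law (1.5 yr): ~25.4x in 7 years
# slowed-down rate (20 yr): ~1.3x in 7 years
```

That gap is roughly why extra compute now has to come from bigger budgets and bigger clusters rather than from free transistor scaling.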


UnrulyNemesis t1_iwl18w2 wrote

I agree that Moore's law is slowing down, but that isn't because scientists are stupid; it's because they are too good. Moore's law worked mainly because we kept shrinking the transistor (the functional unit of a microchip) to fit exponentially more transistors on a chip each generation. However, transistors have gotten so small that they no longer follow the same rules of physics as everyday objects; they follow quantum mechanics. This will open up a whole new avenue for quantum computers in the future, and in my opinion they will quickly be able to solve the hashes in a blockchain to the point that it destroys NFTs and crypto 👍 (there's a toy sketch below of what "solving a hash" even means).

Also, it's a good thing that progress on transistor size and sheer processing power has slowed, since the focus is now on other aspects of a microchip, such as efficiency. You can think of microchip development like a steam-powered train: instead of creating a new power source for a better train overall, we have been finding ways to shovel more coal into the same engine, and we have hit a natural limit. Developing different architectures is like switching the train to a different energy source; for example, the ARM chips in our phones are getting very powerful very quickly and are extremely energy efficient compared to desktop PC processors. Hopefully we will keep making these processors more efficient and more powerful, because soon everyone in developing countries will want these devices, and that energy consumption will add up quickly.

As for sheer processing power, cloud computing has become very popular and effective for researchers in the past couple of years, but I agree that it could be absolutely dangerous if someone hoards computing power to create a dangerous AI.
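To make the blockchain point concrete: "solving a hash" in proof-of-work mining means brute-forcing a nonce until the block's hash meets a difficulty target, which is exactly the kind of search a big quantum speedup would undermine. A minimal toy sketch in Python (illustrative only, not how a real miner works):

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Brute-force a nonce so that sha256(block_data + nonce)
    starts with `difficulty` hex zeros (toy proof-of-work)."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

nonce = mine("toy block", difficulty=4)  # ~65,000 attempts expected
print(nonce, hashlib.sha256(f"toy block{nonce}".encode()).hexdigest())
```

Each extra hex zero in the target multiplies the expected number of attempts by 16, which is why mining difficulty can be tuned so aggressively.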
