Yancy_Farnesworth

Yancy_Farnesworth t1_jaedutw wrote

It works exactly like looking around a room. Large objects take up more space in your vision, so they're easier to see. Smaller objects are harder to spot. Transparent objects are even harder to see.

Radar works like your eyes, except you can't really see the details of the object, just how big it is, like you're really nearsighted and everything is just blurry shapes in the distance. With stealth, they shape the object to make it mostly transparent to radar, like a glass window for visible light. They also use radar absorbent materials, which is like painting an object black to hide it in a dark room. Both of these reduce the radar cross section.

Radar cross section can be used to approximate what a jet would look like on radar relative to other things, and it basically describes how easy the jet is to spot on radar.
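
To put a rough number on that, here's a back-of-envelope sketch using the textbook radar range equation, where detection range scales with the fourth root of the RCS. The RCS values are made-up placeholders for illustration, not real aircraft figures.

```python
# Rough illustration of why a smaller radar cross section (RCS) is harder to
# spot. From the textbook radar range equation, the maximum detection range
# scales with the fourth root of the RCS (all else being equal). The RCS
# numbers below are illustrative placeholders, not real aircraft figures.

def relative_detection_range(rcs_m2: float, reference_rcs_m2: float) -> float:
    """Detection range relative to a reference target: R ~ rcs ** 0.25."""
    return (rcs_m2 / reference_rcs_m2) ** 0.25

reference = 10.0  # hypothetical conventional aircraft RCS, in square meters
for name, rcs in [("conventional", 10.0), ("reduced", 1.0), ("stealthy", 0.01)]:
    frac = relative_detection_range(rcs, reference)
    print(f"{name:12s} RCS = {rcs:6.2f} m^2 -> detected at {frac:.0%} of the range")
```

So cutting the RCS by a factor of a thousand doesn't make the jet invisible, it just shrinks the range at which a given radar can pick it up.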

2

Yancy_Farnesworth t1_ja9yiz5 wrote

There's no fixed standard. Publishers usually do things their own way and it will depend on the situation. For example, Microsoft approached Bungie about an acquisition to get Halo as a launch title for the Xbox. Sometimes studios are running low on funds, so they seek a publisher to support them until the game launches. They can reach out to each other directly; there are usually entire teams whose job is to manage these relationships. Alternatively, they can meet at networking events at industry conventions like GDC (the Game Developers Conference).

For 3 specifically, they don't do this randomly. They will negotiate terms for a contract that both parties have to sign; it's a contract where the publisher is selling its services for a price. In the software world, outright IP theft of things like game engines is rare because it's really easy to get caught. The publisher only has to look at the distributed game to spot clear markers of it being based on Frostbite. Having a past record of good projects will definitely give developers an easier time negotiating a contract than someone who doesn't have one. A publisher could opt to publish a game based on nothing but a pitch from a studio with no experience if it wanted to. It's just not likely, and more often than not they're going to ask for at least some evidence that they're not wasting their time.

1

Yancy_Farnesworth t1_ja9fyc0 wrote

Publishers basically do whatever it is the developers need them to do. Most of the time it's funding because making games is expensive. They will offer other services to developers depending on the situation.

Some games are developed completely independently, and the publisher only deals with distribution and marketing. That extends from simply listing games on Steam to all the legal paperwork and lawyers needed to negotiate contracts with distributors like Steam. The publisher also tends to have a lot of existing relationships for marketing and selling that it has built up over years, something that is very expensive and time consuming for a small developer to build, especially if they have never done it before. This is fairly common with indie games.

In other cases publishers will advise developers on their games, for example highlighting that some gameplay features might not land with the general public. Companies like Paradox, given their general focus on strategy games, can provide a lot of useful insight to developers. Whether or not the developer takes the advice depends on the nature of the relationship. Sometimes publishers can dictate what the dev does, seeing as the publisher usually funds the game. Other times they take a more hands-off approach.

Publishers also have access to a lot of other resources beyond financial/legal/etc. For example, they might have access to a really powerful game engine that a developer can use if they decide to build their game on it. Or maybe they have a large art team that a developer can leverage. You can see this with EA, which publishes games and also owns a number of dev studios around the world that share employees and tools like the Frostbite engine.

This is all to say that there's a lot of work beyond writing code and making art that goes into shipping a game. Publishers offer developers a way to navigate all of that with a partner that has done it many times before.

2

Yancy_Farnesworth t1_ja8ozlv wrote

Let's say you roll 2 dice one at a time. There are 36 possible outcomes, and each particular combination always had a 1 in 36 chance of coming up. That means rolling a 6 then a 6 was always 1 in 36.

After you roll the first die, the other 5 numbers on that die can no longer happen, so the chances of 30 of the combinations coming up drop to 0%. The remaining 6 combinations collectively now have a 100% chance, up from 6 in 36 (about 16.7%) before the first roll.

The logical fallacy comes from the fact that rolling a 6 and a 6 was always a 1 in 36 chance. But after you've rolled the first die, you're not just asking whether you're going to roll a 6 again. You're also asking what the probability is of rolling a double, which feels like it should be lower, right? The problem is that rolling two 6's wasn't the only possible double before you rolled the first die. You always had a 6 in 36 chance of rolling some double (1 and 1, or 2 and 2, etc). After rolling that first die, you only have one chance left to roll a 6, but there are no longer 36 possibilities, only 6.
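
If you want to convince yourself, here's a quick Monte Carlo sanity check (just a sketch assuming fair six-sided dice) that shows both numbers:

```python
# Quick Monte Carlo sanity check of both probabilities, assuming fair dice.
import random

TRIALS = 1_000_000
double_six = 0              # first roll is 6 AND second roll is 6
first_is_six = 0            # first roll is 6
second_six_after_six = 0    # second roll is 6, counted only when the first was 6

for _ in range(TRIALS):
    first = random.randint(1, 6)
    second = random.randint(1, 6)
    if first == 6:
        first_is_six += 1
        if second == 6:
            second_six_after_six += 1
            double_six += 1

print(f"P(6 then 6)               ~ {double_six / TRIALS:.4f} (1/36 ~ 0.0278)")
print(f"P(second 6 | first was 6) ~ {second_six_after_six / first_is_six:.4f} (1/6 ~ 0.1667)")
```

The unconditional double-six probability comes out around 1 in 36, but once the first 6 is locked in, the second 6 is just 1 in 6.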

1

Yancy_Farnesworth t1_j9uwkrk wrote

> where we have been through like 6 major variants, with a few dozen minor variants, inside 2 years time.

That's the point? It's a relative comparison between the flu virus and the COVID virus, not a claim that COVID doesn't mutate. We would be in deep trouble if COVID had the same potential to mutate that the flu virus does. We're lucky that the flu virus doesn't have the same level of immune evasion/suppression that a lot of coronaviruses have.

6

Yancy_Farnesworth t1_j9ul5fz wrote

> Because a vaccine is already developed.

People really don't understand how much effort goes into updating the vaccine for flu season every year. The flu virus mutates far more readily than COVID does, which is why we have an annual flu shot. If someone is infected with multiple flu variants at once, the variants can literally swap parts, creating essentially a new strain. We've been developing new flu vaccines every year for decades, and it's not just a few tweaks each year, it's effectively a new vaccine every year.

The WHO is responsible for tracking the flu strains circulating around the world and recommending which strains that year's vaccine should target. That requires a lot of worldwide surveillance of flu cases, analysis of the circulating strains, and then building a vaccine that targets them. And there are a lot of strains in circulation; it's literally impossible to vaccinate against all of them.

8

Yancy_Farnesworth t1_j6nladg wrote

> In short there is no large difference between GPU and CPU besides the GPU uses what is called SIMD (single instruction, multiple data) architecture which is what this analogy was trying to convey.

The GPU is heavily geared towards floating point operations, while the CPU is less so. CPUs used to need a separate FPU chip; transistors eventually got small enough that the FPU could fit on the CPU itself. Then the need for dedicated floating point performance skyrocketed with the rise of 3D games, which ultimately called for a separate dedicated chip that could do absurd numbers of floating point operations in parallel. The result was the GPU.
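
As a loose illustration of the data-parallel idea, here's the same floating point operation written element by element and then over a whole block at once. This is numpy on a CPU, not an actual GPU kernel, but it shows the SIMD-style pattern of one instruction applied across lots of data.

```python
# Loose illustration of the data-parallel (SIMD-style) idea: one floating
# point operation expressed over a whole block of data at once. This is
# numpy on a CPU, not a GPU kernel, but the pattern is the same.
import numpy as np

positions = np.random.rand(100_000, 3).astype(np.float32)  # e.g. vertex data
offset = np.array([0.5, -1.0, 2.0], dtype=np.float32)

# Scalar style: one element at a time, like a plain CPU loop.
moved_loop = np.empty_like(positions)
for i in range(positions.shape[0]):
    moved_loop[i] = positions[i] + offset

# Data-parallel style: the same add applied to every row in one shot,
# which vectorized hardware (SIMD units, GPUs) can chew through in bulk.
moved_parallel = positions + offset

assert np.allclose(moved_loop, moved_parallel)
```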

This floating point performance is why GPUs are a great tool for AI/ML and why Nvidia came to dominate hardware dedicated to AI/ML applications.

4

Yancy_Farnesworth t1_j09xl5d wrote

> it's now just a name for a fabrication process

Yes and no. The process name is now supposed to indicate an improvement in transistor density: within the same company, the next node is some percentage improvement over the previous one. They did this because below 7nm the nm figure became even more meaningless as an indicator of transistor size/density.

> has nothing to do with actual values compared to back when it used to

Even when the measurement applied to the smallest "feature size", it still didn't describe the size of the actual transistors or the transistor density. For example, Intel's 10nm process was more transistor dense than TSMC's 7nm process, and Intel's 7nm process was targeting a higher density than TSMC's 5nm process. Intel fell behind TSMC because they tried to do it with DUV machines rather than EUV, which set them back years while TSMC got access to EUV equipment first. Hell, even at 7nm, TSMC's process was better than Samsung's equivalent.

1

Yancy_Farnesworth t1_j09wcxt wrote

They're not starting from scratch. IBM demonstrated their 2nm process over a year and a half ago; they were the first to do so. They're now at the stage of getting it from the lab into production.

IBM's not a newcomer to the fab business; they've just been mostly focused on producing their own chips for their own enterprise/datacenter equipment rather than for the mass market. I'm betting that this partnership is IBM eyeing the fab-only business model, or looking to go fabless the way AMD did when it divested its fabs and spun off Global Foundries.

1

Yancy_Farnesworth t1_j09ug3j wrote

Neither. China has zero access to the equipment that fabs leading-node chips. People don't seem to understand that there's the fab process and then there's the fab equipment. TSMC, Samsung, and Intel all buy the same fab equipment from the same manufacturers; they don't make their own. All of those equipment makers are based in the US, the EU, and Japan, and it's impossible to keep the machines working without the expertise and replacement parts from those manufacturers.

If China ever goes to war with Taiwan, they lose access to the entire semiconductor supply chain. Not to mention they also lose access to silicon wafer producers who are almost entirely US and Japan based.

2

Yancy_Farnesworth t1_iy6o8iq wrote

I have to wonder how Grant would have done in the era of industrialized war. The Civil War was in a lot of ways a prelude to that. It seems like he would have had the right strategic chops for it given how he fought the Confederacy... He used the Union's material might to great effect, but it was costly in terms of the body count, something that generals would have to learn how to deal with in WWI and WWII.

6

Yancy_Farnesworth t1_iue3uez wrote

We really don't know. There's a lot of theories but nothing confirmed.

The commonly accepted theory these days is a rocky/metallic core (mostly iron and nickel), like the inner planets, with what is essentially an ocean of liquid metallic hydrogen right above it. Heat and pressure are essentially opposing forces when it comes to what state matter takes: heat pushes an element toward gas, pressure pushes it toward solid. In the center of planets like Jupiter the pressure mostly wins, hence the liquid metallic hydrogen ocean. The only reason we don't think there's solid hydrogen down there is that the pressure isn't high enough for hydrogen to solidify at the temperatures in Jupiter's core.

Note that gas giants can get big. We've seen some that are borderline stars; with just a little more matter they would have enough gravity, and therefore pressure, to kick off fusion and turn the core into plasma from the heat.

2

Yancy_Farnesworth t1_iu4q3pv wrote

Neutrons attach to atomic nuclei by literally slamming into them, transforming the atom into another isotope (and often, after it decays, another element). It's not really about convenience: even a 0.00001% chance of a neutron hitting an atom just right becomes inevitable when we're talking about the number of atoms in a given volume. The molar mass of atmospheric oxygen (a molecule of 2 oxygen atoms) is about 32 grams per mole, meaning 6 x 10^23 molecules of O2 weigh about 32 grams. There are a lot of atoms around a nuclear explosion taking place on Earth.
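
Just to show how big "a lot" is, here's a rough back-of-envelope estimate. It assumes air behaves like an ideal gas at roughly standard conditions, and the per-molecule activation chance is a made-up placeholder, not a real cross section.

```python
# Back-of-envelope estimate of how many molecules sit in one cubic meter of
# air, to show why even a tiny activation chance per molecule adds up.
# Assumptions: ideal gas at roughly standard temperature and pressure, where
# one mole of gas occupies about 22.4 liters. The activation chance below is
# a made-up placeholder, not a real cross section.
AVOGADRO = 6.022e23        # molecules per mole
MOLAR_VOLUME_L = 22.4      # liters per mole at STP (approximate)
LITERS_PER_M3 = 1000

moles_per_m3 = LITERS_PER_M3 / MOLAR_VOLUME_L     # ~45 moles
molecules_per_m3 = moles_per_m3 * AVOGADRO        # ~2.7e25 molecules

activation_chance = 1e-7   # hypothetical per-molecule chance
print(f"molecules per cubic meter: {molecules_per_m3:.2e}")
print(f"expected activations:      {molecules_per_m3 * activation_chance:.2e}")
```

Even with that tiny made-up chance, you'd still expect on the order of 10^18 transformed atoms per cubic meter.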

Any atom can be radioactive; it doesn't need to be heavy. Tritium, for example, is a radioactive form of hydrogen with 2 neutrons, while hydrogen with 0 or 1 neutrons is stable and not radioactive. There's generally a ratio between neutrons and protons (really simplifying this) where elements tend to be stable. Go outside of that ratio and the atom tends to be unstable and will decay into something else. The why comes down to quantum mechanics.

2