DuckQueue t1_j2wnm2i wrote

> After each generation you should adapt the number of potential vulnerable candidates compared to what initial R had.

In the abstract your point isn't wrong, but in the real world population sizes are much larger, resistance is imperfect to begin with, mutation occurs, and we're talking about a disease where previous infection doesn't confer a high degree of resistance that persists over the long-term - like COVID - so that isn't going to have nearly as large an impact as you're suggesting with your example.

1

SnooPuppers1978 t1_j2woh6q wrote

> In the abstract your point isn't wrong, but in the real world population sizes are much larger, resistance is imperfect to begin with, mutation occurs, and we're talking about a disease where previous infection doesn't confer a high degree of resistance that persists over the long-term - like COVID - so that isn't going to have nearly as large an impact as you're suggesting with your example.

Yes, all of this can be incorporated into the model, but the eventual result will still be a wave-like graph, where the 22% difference shrinks the higher the R0 is, and the end results in terms of magnitude would be very similar to what I showed above.

Whether the population is 1,000,000 or 1,000,000,000 doesn't make that much of a difference though. It's just a few generations more.

We could create a model that incorporates waning immunity based on the studies we've seen, and also run a vaccine-like intervention to see how the results would differ. We could use 8 billion people and try to roll out the vaccine to all of them within a certain timeframe. I might do it over the weekend if I have time.

Here are the results with a population of 1 billion - you can see that it's just a few more generations, and magnitude-wise, proportionally, there's not much difference:

| R0 | Gens before < 10 infections | Total Cases | Max Cases in a Gen |
|---|---|---|---|
| 0.78 | 10 | 428 | 78 |
| 1 | 2153540 | 45454565 | 100 |
| 1.28 | 117 | 403004862 | 25885890 |
| 1.64 | 60 | 662734374 | 88055684 |
| 2.1 | 40 | 822064894 | 170459771 |
| 2.68 | 30 | 913564065 | 254590943 |
| 3.44 | 23 | 963666407 | 343953730 |
| 4.4 | 19 | 986999921 | 429451582 |
| 5.63 | 16 | 996336642 | 456685926 |
| 7.21 | 13 | 999256872 | 437456355 |
| 9.22 | 12 | 999900871 | 527835860 |
| 11.81 | 10 | 999992570 | 696393878 |
| 15.11 | 9 | 999999726 | 623729861 |
| 19.34 | 8 | 999999996 | 744938203 |
1

DuckQueue t1_j2wpnl4 wrote

> It's just a few generations more.

Like... twice as many. Yes, not multiple orders of magnitude but still enough to make a huge difference, especially when you account for the other factors I mentioned.

And that still wouldn't account for how diseases actually spread in real populations, where not everyone has an equal chance of being exposed to any given other person. There's a reason actual models of the spread of disease are much more complex than the model you're providing. And a reason why observational estimates of the R0 for COVID haven't been appreciably declining over time.

1

SnooPuppers1978 t1_j2wqqe5 wrote

Yes, but there are also factors on the other side. As the OP above mentioned, people adapt their behaviour depending on how they perceive the risk to themselves. If people have been vaccinated and perceive the risk as lower, they will be more likely to go out. If there's a huge wave ongoing, people will be less likely to go out. If there's little threat, people will go out more, making a new wave more likely to start. Risk behaviour is another balancing factor that smooths out the waves whether you have the intervention or not. If people see death around them, they get scared and start avoiding contact; if they see no danger, they increase their risk behaviour. Behaviour influences R0 enormously: imagine being in contact with 50% fewer people than before - that alone would halve the R.
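The behavioural feedback described here could be sketched as a contact factor that scales R down as the visible wave grows. The `fear_scale` parameter, the linear response, and the 50%-of-contacts floor are purely illustrative assumptions, not estimates:

```python
# Illustrative sketch of risk-behaviour feedback: effective R drops as the
# currently visible wave grows, down to half of R0 (people in contact with
# 50% fewer others), per the reasoning in the comment above.

def effective_r(r0, current_cases, population, fear_scale=20.0):
    prevalence = current_cases / population
    # contact_factor falls linearly from 1.0 to 0.5 as perceived risk rises
    contact_factor = 1.0 - 0.5 * min(1.0, fear_scale * prevalence)
    return r0 * contact_factor
```

With no visible cases this returns R0 unchanged; once prevalence saturates the fear term, it returns half of R0.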

So in the end, with all those factors together, unless the efficacy is enough to create herd immunity, the result is going to be waves with not much difference in the total number of cases. Efficacy has to be enough for herd immunity, or very close to it; otherwise it will just be something like the above.

The main effect will be for risk groups, for whom the disease carries a much lower risk of hospitalisation and death thanks to their immune systems being primed by the vaccine. There would also be less overload on hospitals due to flattening the curve, but in the end the total number of infections is not going to be orders of magnitude apart, due to the inherent characteristics of this virus.

1

DuckQueue t1_j2ws3t1 wrote

> So in the end with all those factors together, unless the efficacy is enough to create herd immunity it's going to be waves with not much differing total amount of cases.

You seem to be assuming that the disease will exhaust itself and run out of people to infect, but as the real world shows, that isn't generally how infectious diseases - especially ones this effective at escaping the immune system - work.

It's only meaningful to talk about the total number of cases up to a given point in time - if you're trying to talk about the total where the number of new infections permanently drops to 0, you're talking about circumstances that might apply to some newly-arising zoonotic diseases but decidedly do not apply to the disease we're talking about.

1

SnooPuppers1978 t1_j2wsv97 wrote

> You seem to be assuming that the disease will exhaust itself and run out of people to infect

For a certain amount of time, hence the waves. A smaller share of the population will still be carrying the virus until it mutates or immunity wanes enough, after which it starts all over again.
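The wave mechanism described here - burn-out followed by waning immunity refilling the susceptible pool - can be sketched by extending a generation model with a waning term. The 2%-per-generation waning rate, population size, and R0 below are assumptions chosen only to make the waves visible, not estimates for COVID:

```python
# Sketch of waves driven by waning immunity (assumed, illustrative parameters):
# each generation a fraction of the recovered lose immunity and become
# susceptible again, so the epidemic resurges instead of ending after one wave.

def run_with_waning(r0, population=1_000_000, seed=100,
                    waning=0.02, generations=200):
    susceptible = population - seed
    infected = float(seed)
    recovered = 0.0
    history = []
    for _ in range(generations):
        new_cases = min(susceptible, infected * r0 * susceptible / population)
        regained = recovered * waning          # immunity wearing off
        recovered += infected - regained       # current infected recover
        susceptible += regained - new_cases
        infected = new_cases
        history.append(new_cases)
    return history
```

Running this with, say, `r0=2.5` produces a first wave, a deep trough as susceptibles are exhausted, and then a second wave once enough immunity has waned - the wave pattern argued for in the comment above.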

After a certain amount of time, like a year or two, the total case counts would be similar in terms of magnitude. They won't be 10x apart based on 22% efficacy.

Because you were suggesting 7k vs 1 million, which is a difference of orders of magnitude.

I'm suggesting the difference would probably be less than 2x after 2 years, for example. And if I had to guess, it would probably be something like a 25% difference, similarly, if we made our model more comprehensive.

1