
Cleistheknees t1_jbz7m32 wrote

Answer the question: would the world be better if animals didn’t consume other animals?

> Our morality shouldn’t be based off of what happens in nature.

Unless you believe in magic or other supernatural forces, everything comes from nature. There is nowhere else for things to come from.

1

Cleistheknees t1_jbyzv12 wrote

Every zealot thinks history will eventually yield to their zealotry.

There are hundreds of thousands of animal species which consume other animals, in ways drastically more horrific than even the worst factory farming setup. Are those deaths more immoral than livestock deaths? They’re certainly much more prolonged, painful, and stressful, and the victims are often infants and juveniles as well.

2

Cleistheknees t1_jadx5yb wrote

> I disagree about how emphatic this is. I think there’s ample support for its occurrence, of course, and I do believe it, but the primary source for it in modern humans comes from a single author (I don’t recall his name, but it might be Liebenberg) writing about the San, who are very much a removed population operating outside their indigenous cultural norms.

Louis Liebenberg did the most detailed ethnography, but he’s certainly not the only source. As a grad student I personally interviewed people in East Tanzania who hunt local ungulates in a way we categorize as persistence hunting, but absent the endurance running aspect, which is something Liebenberg discusses as a limitation. Like, they jog of course, but it’s more of a 4-5 hour jog-track-jog-track etc.

You’re probably thinking of Pickering and Bunn’s critical response to him a couple years after that first paper, and while I don’t want to speak for other people, I would confidently say most people agree their retort stepped way over the bounds of what is reasonable. Louis never claimed that endurance running and PH were the selection pressures that causally produced our endurance-running capabilities in the first place, because that would be a circular and nonsensical “just so” story, and certainly not a mistake a staff associate at Harvard would make. His work is really more about tracking than endurance running, and it predates the wave of very clarifying research from the 2010s that illuminated much of the transitional period between the Australopiths and early Homo, a lot of which was morphological, a lot out of Olduvai, etc.

It’s very important to distinguish the version of persistence hunting in the capital-H Hypothesis from the actual anthropological definition, which in colloquial language is basically just extended tracking at a pace the animal cannot maintain for a number of hours. Running a marathon chasing after a gazelle is a fantasy introduced by McDougall.

1

Cleistheknees t1_jad9ny7 wrote

> There are different kinds of bipedalism.

Sure, but limb development is highly canalized even in quadrupeds. You have to look at this scenario from the paradigm of the selection pressure which started our lineage on the trajectory towards bipedalism, and it very clearly extends far back beyond the earliest signal of even the Australopiths. A change in a trait like major skeletal morphology takes an extremely long time to fix.

> Early homo were unable to run even if they were obligate bipeds.

I’m going to push back on this. It’s too confident a claim for the physical evidence base, which is scarce. Unless some major work was released and I haven’t heard about it, which I feel is unlikely because I attend most seminars from the major paleoanthropology institutions like CARTA and Leakey.

> Nuchal ligaments, Long bone length, gait, narrowing of the hip, extension of the achilles tendon and arched foot, broader heel and short toes. I’m unsure of any others off the top of my head. These traits would have appeared around 1.8mya

All of these have pretty clear developmental trajectories extending far back beyond erectus. Australopith tibias are notably elongated.

> It very well might be that we just evolved a kit that could be repurposed for a hunting style - and we’re misunderstanding the cause and effect.

This is basically “just so” stories in a nutshell.

1

Cleistheknees t1_jabjzb8 wrote

That is unfortunately what we call a “just so” story, a common logical error when looking at traits and suggesting adaptive histories for them based on some trait interactions at a given point in time.

As for the timeline, bipedality is far older than “one million years before sapiens”; it in fact arrives around three million years before the earliest erectus. Australopiths are decidedly bipedal by 5mya, and there’s a strong case for pushing this another million years back. Erectus dates back to just barely 2mya.

So, we’ll say bipedality arrives at around 5mya, the earliest stone tools at 3.2, the persistent presence of stone tools at 2.7, the diaspora of toolmakers out of Africa at 2.2, and persistent butchery at 1.9, mostly of small animals not really suited to persistence predation. Megafauna butchery arrives at around 1.5, a full 3.5+ million years after bipedalism.
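If it helps to see the gaps at a glance, here’s the same timeline as a trivial Python sketch. The dates are just the estimates above, nothing new:

```python
# Rough timeline from the estimates above, in millions of years ago (mya).
timeline = {
    "bipedality": 5.0,
    "earliest stone tools": 3.2,
    "persistent stone tools": 2.7,
    "toolmaker diaspora out of Africa": 2.2,
    "persistent butchery (mostly small game)": 1.9,
    "megafauna butchery": 1.5,
}

# The gap the argument turns on:
gap = timeline["bipedality"] - timeline["megafauna butchery"]
print(f"bipedality precedes megafauna butchery by ~{gap} million years")
# -> bipedality precedes megafauna butchery by ~3.5 million years
```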

Hair loss is a hazier picture because we still have basically all of the mammalian genetics for fur, and the adaptations seem to be substantially in hundreds of regulatory elements, which are harder to date as precisely. That said, the error bars don’t really extend back much further than 900kya, which is a full million years after persistent butchery becomes commonplace and roughly 0.6my after megafauna butchery arrives. What we’d need to see is hair loss happening during the transition from persistent butchery of small animals to larger ungulates, etc, not the better part of a million years after.

Remember, this is all a position against the actual hypothesis coined the “Endurance Running/Persistence Hunting Hypothesis”, not against the occurrence of persistence hunting in human evolutionary history overall, which is incontrovertible. The hypothesis frames persistence hunting as the adaptive roadmap for bipedalism and hairlessness, and given the timeline of these elements it’s basically unsupportable.

1

Cleistheknees t1_j9wety8 wrote

> They are injured or bled to death, I don’t imagine a zebra walking away with a 1foot stick inside it’s bowels,

You would be amazed at what a wild animal is capable of in the most dire circumstances. I have personally shot a prairie elk through both lungs and had to track it about 2km before finding it. There’s all kinds of footage from various sources like wildlife documentaries and whatnot of animals with dramatic and mortal injuries pushing on for hours.

> humans have evolved to hunt down their injured prays over long distances while inflicting multiple shots from safe distances. You spend your precious arrows but get rewarded afterwards.

This is called the persistence hunting hypothesis, and as I’ve mentioned elsewhere in this thread, it’s really just a hypothesis at this point. There is virtually no material evidence of it being a consistent selection pressure across human evolutionary history. It was presented as fact in a book called Born to Run in 2009, and unfortunately got picked up by the lay public as if it were a settled question.

We do indeed have material evidence of consistent butchery going back ~2 million years, which reaches much farther back into our genus than sapiens, but putting a narrative of persistence hunting onto that is still a speculative leap. It could just as easily be opportunism, ambushes, scavenging, some mix of all three, etc.

3

Cleistheknees t1_j9s1g9t wrote

Anyone is welcome to argue whatever point they’d like.

In this context, large ungulates = hippopotamus, bison, elephants, rhinoceros, large boars, etc, because the actual animals in this discussion are generally extinct Pleistocene megafauna, not white-tailed deer, which I agree are not difficult at all to drop in one shot for an experienced or lucky hunter. The ambiguity here is probably because “large ungulate” means something different to me as an evolutionary biologist than it does to hunters. I hunt, but I wouldn’t really call myself “a hunter”, if that makes sense.

> If something sharp passes through the lungs or heart of a large ungulate like elk or moose, then they tend to die rapidly.

Rapidly seems kinda relative. I’ve double lung punched a prairie elk and had to go over two kilometers to get it.

6

Cleistheknees t1_j9r3b6v wrote

Large game is far more amenable to persistence predation, and non-fatal injuries are a vital part of that hunting style. It’s almost impossible to drop a large ungulate with a single shot, even with the most advanced compound bows available today, with carbon fiber shafts and titanium heads and all kinds of TactiCool gizmos.

8

Cleistheknees t1_j7oibty wrote

Well, as I’ve already said to the other person who said the same thing, it was a typo on my part exchanging “flowering plants” for “asterids”.

However, you should actually read the article you cited, because neither it nor the academic article it references presents any evidence whatsoever of plants 140mya with morphology similar to what we call flowers. It is entirely a theoretical exercise.

1

Cleistheknees t1_j7o11id wrote

That is correct. The second “flowering plants” was a typo on my part, and should have been asterids. That said, I could still certainly be outdated on my understanding of the advent of asterids. To my knowledge the oldest specimens are dated to the Coniacian.

2

Cleistheknees t1_j7mds8s wrote

Best to contextualize this with the knowledge that the initial transition towards pastoralism and agriculture (ie the Neolithic) had a major negative impact on overall human skeletal health.

There are hundreds of studies examining this question and generally coming to the same consensus, but here’s a couple more narrative and less jargony ones.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4466732/

https://pubmed.ncbi.nlm.nih.gov/21507735/

The overall takeaway is that agriculture resulted in a severe downturn for human health, but renders out to a net contribution to fitness because of massive increases in ovulation. Women even in extant hunter-gatherer populations experience a total number of ovulatory cycles somewhere around an order of magnitude less than women in agricultural societies, even in what we would today call “developing” conditions, or what older research sometimes labels “third world” or “unindustrialized”.

https://www.jstor.org/stable/4602928

https://www.jstor.org/stable/4602772

https://www.pnas.org/doi/10.1073/pnas.1524031113

From the third citation:

> We examine the effects of sedentarization and cultivation on disease load, mortality, and fertility among Agta foragers. We report increased disease and mortality rates associated with sedentarization alongside an even larger increase in fertility associated with both participation in cultivation and sedentarization. Thus, mothers who transition to agriculture have higher reproductive fitness. We provide the first empirical evidence, to our knowledge, of an adaptive mechanism behind the expansion of agriculture, explaining how we can reconcile the Neolithic increase in morbidity and mortality with the observed demographic expansion.

There is no precise physiological answer as to why, so if someone says they know, they’re either lying or not well-read enough. Agriculture means lots of carbohydrates, and insulin has major effects on sex steroid balancing via its modulation of aromatase, so that is probably a contributor. The massive decrease in population mobility also means a less energy-stressed environment, and energy stress is a potent suppressor of both ovulation and lactation. Lactation itself also suppresses ovulation, and breastfeeding duration is markedly longer in extant non-agricultural populations, though these groups tend to be somewhat hybridized and not nearly as mobile as their (and our) Paleolithic ancestors, so they should not be taken as a perfect model for those extinct groups.

28

Cleistheknees t1_j7m7qfy wrote

The surprise isn’t that an ancestor of those living plants was alive at that time, as that would be basically tautological: every living thing has at least one ancestor alive for every single moment in time, all the way back to the very origin of life on this planet.

The surprise is the morphology of this plant being dated so far back. Flowering plants as a whole (ie as a monophyletic group going back to a singular, morphologically similar ancestor) were thought to have arisen long after this, and a specimen being dated c 80mya with complex flowers means the onramp of development of this anatomy goes back much further.

This will certainly result in some cladistic musical chairs if the dating is accurate, and a bunch of botany students will have yet another outdated section in their textbooks. Some of us in evo joke that every time a plant is tossed somewhere else on the phylogenetic tree, somewhere a botanist takes a shot of tequila and cries.

Source: not a botanist but I do fight them over parking spots

36

Cleistheknees t1_j7cgenh wrote

To reiterate, if you’re saying “X is Y”, and it’s only true 30% of the time, then the statement is incorrect.

> They also provided me with This study that says there’s likely a genetic component since identical twins as well as non-identical if one has multiple of the antibodies.

There is most certainly a genetic component to autoimmune type 1 diabetes. That much is beyond question. However, you did say quite a bit more than “there’s a genetic component”.

2

Cleistheknees t1_j7c6xg2 wrote

Hey, just a heads up, basically everything you said here is wrong. Please avoid making authoritative claims like this if you aren’t educated on the topic, especially in subs like this. It’s how misinformation spreads.

> Type one is genetic

The ambiguous association patterns of genes like HLA and CTLA4 basically prove that this statement cannot be conclusively true.

> and requires some sort of trigger (thought to be usually a virus, though as far as I’m aware they’ve not been able to pinpoint anything).

Again, sometimes. There are monogenic, autosomal dominant forms of beta cell dysfunction, like GCK-MODY, that arise with no trigger at all. Further, there are no documented environmental triggers for T1D, so, again, stating this conclusively is wrong.

> Odds are that most people with the genetic predisposition will get it triggered at some point in their lives

There is no evidence to support this, at all, and quite a bit against it. The penetrance of genes implicated in T1D is low. The concordance rate in monozygotic twins is low. 90% of people with T1D have no known relatives with the disease. Etc.

> so COVID may have been the trigger for a lot of people. But it won’t be statistically any more than normal, since they’d likely be exposed to something that triggers it anyways.

What? This makes zero sense.

Source: doctorate in evolutionary biology, and 25+ years with T1D, but only the former matters here

17


Cleistheknees t1_j4rz4xk wrote

There’s already lots of epidemiology pointing in this direction, so it’s more than a suggestion. Most infant formula contains little to no human milk oligosaccharides (HMOs), and the oligosaccharides formulas do contain are generally synthesized from bovine milk. Human milk generally has the highest amount and diversity of oligosaccharides of any mammal’s, so bovine milk isolates are a poor substitute, and might be completely useless, since HMOs are just one piece of the extremely bioactive mixture in breastmilk.

21

Cleistheknees t1_j4rxbzx wrote

All oligosaccharides in mammalian breast milk are produced exclusively in the mammary glands during lactation, and do not enter or arrive from systemic circulation. It’s not something a breastfeeding mother can consume and have it show up in breastmilk.

https://academic.oup.com/jn/article/136/8/2127/4664766

15

Cleistheknees t1_j49r5gs wrote

What Eades is saying doesn’t really make sense. If a type 1 diabetic (ie, complete lack of endogenous insulin) consumes 15g of glucose or 150g, the change in serum glucose will be wildly different, and commensurate with the amount ingested. If what he’s saying were correct, that insulin-independent glucose disposal can account for even large boluses in the form of a meal, then the change in serum glucose in a T1D would be the same regardless of intake, as that change would be from hepatic output, not from the meal, which in the absence of insulin will be the same whether you’re eating a bowl of pasta or arugula with salmon. As someone whose last endogenous secretion of insulin was about 26 years ago, I can pretty confidently tell you that is not the reality. I can eat a turkey sandwich before a 20mi bike ride no problem, but a 16oz soda with 70g of sucrose is going to put me past 500mg/dL even with the same exercise.
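To put rough numbers on why dose has to matter: a crude back-of-envelope treats ingested glucose as distributing into extracellular fluid (~0.2 L/kg) and ignores disposal entirely, approximating the zero-insulin worst case. The volume figure is a textbook approximation I’m supplying, not anything from Eades, so treat this as a sketch:

```python
def peak_bg_rise_mg_dl(grams_glucose: float, weight_kg: float) -> float:
    """Worst-case serum glucose rise with zero insulin-mediated disposal.

    Assumes glucose distributes into extracellular fluid (~0.2 L/kg) and
    ignores hepatic uptake and renal spill -- a deliberately crude upper
    bound for illustration, not a dosing tool.
    """
    volume_dl = weight_kg * 0.2 * 10  # liters of ECF, converted to dL
    return grams_glucose * 1000 / volume_dl  # mg of glucose per dL

# The 15 g vs 150 g comparison from above, for a 70 kg person:
print(peak_bg_rise_mg_dl(15, 70))   # ~107 mg/dL
print(peak_bg_rise_mg_dl(150, 70))  # ~1071 mg/dL
```

The rise scales linearly with the dose, which is what lived T1D experience looks like and exactly what a “basal disposal covers the meal” model can’t produce.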

It is true that the effect of insulin on serum glucose in the advanced T2D patient specifically seems mostly based on suppressing hepatic output, and not entirely on opening GLUT4 channels in skeletal muscle as was thought before, but that doesn’t mean it’s black and white. GLUT4 exists, and is activated via IRS1, and GLUT1 is virtually unexpressed in skeletal muscle after the first year of life (Gaster et al, 2000). That is beyond contestation at this point. If Eades has a good argument as to why this exquisitely architected system of insulin-dependent glucose uptake into muscles that are perfectly prepped for its immediate oxidation would exist for no reason, I would love to hear it. In fact, we have a version of what Eades is describing, in the form of AMPK-activated GLUT4 translocation, which skips IRS1 and the need for insulin, but that happens when the myocyte senses energy deficiency, not at rest. You probably know as well as I do how inefficient exercise is at disposing of calories, so unless Eades expects us all to take 6-hour zone 2 jogs after every meal, at some point insulin-mediated glucose disposal has to be accepted.

Plus, we and our predecessors in Homo have been consuming starches for hundreds of thousands of years, and there is evidence of strong selection on regions like AMY1 extending to long before our genus, etc. Starch is clearly just as native to us at this point as any other nutrient. What’s not native is, like I said before, the context. The liver obviously can’t handle three 100g boluses of refined glucose per day, for 30 years, especially in a sedentary person. That doesn’t mean glucose, or insulin, is a thing to be suppressed at all costs, just like the mechanistic connection between fatty acids and cardiovascular disease doesn’t mean fatty acids should be suppressed at all costs. IMO, an insulin-sensitive person who exercises 4-6 times per week can eat whatever combination of whole foods they like and have basically zero fear of MetS or accelerated atherosclerosis. Remember that JAMA paper in Jan 2021 that computed hazard ratios for CVD from the WHI biomarkers? LDL was like 1.6, lipoprotein insulin resistance was higher, near to obesity, and T2D was like 10.5. To my mind, insulin resistance is the smoking gun; whether it’s the initial causative agent or not, it seems pretty clear that if you’re weight stable and insulin sensitive, you’re safe.

Plus, I feel like he of all people shouldn’t be relying entirely on rodent data. Isn’t that usually his thing? That metabolic observations from rats shouldn’t be extrapolated to humans?

> If blood glucose was stable and it went up 18mg/dl (1 mmol) wouldn’t that indicate the intake was too high ?

I mean, maybe in an absolute sense it means intake was greater than what basal glucose disposal could completely account for, but “too high” seems like it suggests pathology, and I don’t see that small of a disturbance to serum glucose being anywhere near such a threshold. Any healthy person’s glucose will rise more than that simply from waking up, or lifting weights for a bit.

Plus, insulin shuttling glucose into muscle and adipose tissue isn’t bad. It’s a normal physiological process. Provided both of those tissue categories remain sensitive to insulin and the person is at stable weight, I don’t see how you could sneak your way into MetS.

4

Cleistheknees t1_j493eoa wrote

> but is all sugar bad, period

The molecule is the same, but the “badness” is based on where it goes, and metabolism is contextual and prioritized. A lot of absolutist sucrose defenders put these studies on blast by saying things like “what about fruit!!!”, noting that the epidemiological data on “fruit” isn’t nearly as bad as on SSBs. Obviously, though, they’re either consciously or mistakenly forgetting that the concentration of sucrose in SSBs is much higher than in almost all fruits, and that SSBs lack a single gram of fiber, which nearly all fruits have. There is the added problem that “fruit” is a very nonspecific term. An apple and a fig have substantially different sugar concentrations and fiber ratios.

The most important factor is time. The longer you can stretch out your liver’s exposure to a given sugar bolus, the better. This is exactly what fiber does: it slows motility. The liver (and most peripheral tissues) can metabolize glucose without insulin, but at rest only at a basal rate, which matches the homeostatic demand for hepatic glucose output. The closer your sugar intake matches that “background level” of insulin-independent glucose metabolism, the less negative effect (ie insulin requirement) it has. Endothelial tissue gets its glucose via GLUT1 transporters, which are insulin-independent. That’s why the vasculature takes the brunt of chronic hyperglycemia: it isn’t able to prevent all that glucose from diffusing in and wreaking havoc on its internal machinery via glycation.

https://www.nature.com/articles/s41401-021-00647-y
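To put a very rough number on that “background level”: fasting endogenous glucose turnover is commonly quoted at around 2 mg/kg/min. That figure is a textbook estimate I’m supplying, not something from the linked paper, so take this as a sketch of the idea rather than clinical math:

```python
BASAL_TURNOVER_MG_PER_KG_MIN = 2.0  # textbook-ish fasting glucose turnover

def hours_to_match_basal(grams_sugar: float, weight_kg: float) -> float:
    """How long a sugar bolus would need to be absorbed over in order to
    stay near the basal, insulin-independent disposal rate."""
    basal_g_per_hr = BASAL_TURNOVER_MG_PER_KG_MIN * weight_kg * 60 / 1000
    return grams_sugar / basal_g_per_hr

# ~25 g of sugar (roughly a large apple) in a 70 kg person:
print(hours_to_match_basal(25, 70))  # ~3 hours, the kind of window fiber buys you
```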

Low-intensity exercise (ie Zone 2) also opens up an insulin-independent glucose pathway, but into skeletal muscle. That’s why taking a brisk walk after meals is consistently so effective at improving glycemia in diabetics.

https://pubmed.ncbi.nlm.nih.gov/35268055/

So, the “good or bad” gist would be that any sugar which exceeds the liver’s capacity to metabolize it, and which then gets exported as VLDL or contributes to chronic insulin spikes, is bad. I guess from a certain point of view the problem isn’t the molecule, it’s just that the sugar has nowhere to go without a pattern of insulin secretion that is desensitizing in the long term. Any sugar which is metabolized normally (especially into skeletal muscle) and without chronic insulin spikes is fine. Obviously this is just based on energy and ignores the other 99% of nutritional quality.

7

Cleistheknees t1_j490tsv wrote

They didn’t define it, just listed some common criteria. The ICD-10 criteria for metabolic syndrome (E88.81) include insulin resistance. In this constellation, IR is nearly 100% coincident with hyperinsulinemia. You can’t have one without the other: resistance to insulin by definition means the pancreas is secreting more than the observed serum glucose would suggest in an insulin-sensitive person.

http://www.icd9data.com/2015/Volume1/240-279/270-279/277/277.7.htm
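That insulin-relative-to-glucose relationship is exactly what surrogate indices like HOMA-IR (Matthews et al., 1985) put a number on. A minimal sketch using the standard mg/dL form of the published formula, with made-up example values:

```python
def homa_ir(fasting_glucose_mg_dl: float, fasting_insulin_uU_ml: float) -> float:
    """HOMA-IR: higher values mean more insulin is being secreted than the
    observed glucose would call for in an insulin-sensitive person."""
    return fasting_glucose_mg_dl * fasting_insulin_uU_ml / 405

# Same fasting glucose, different insulin: the hyperinsulinemic person scores
# as insulin resistant even though glucose alone looks "normal".
print(homa_ir(95, 5))   # ~1.2, insulin sensitive
print(homa_ir(95, 25))  # ~5.9, insulin resistant / hyperinsulinemic
```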

3

Cleistheknees t1_iylhxwc wrote

Got it, I’ll tell my girlfriend to take him off our Christmas party invite list.

Can you tell me an actual problem in the methodology of using Coca-Cola’s direct correspondence about their influence on nutrition research as… evidence of their influence on nutrition research?

6

Cleistheknees t1_iyl815h wrote

There are three authors of this paper, and the lead is a woman: Sarah Steele. Also, investigators generally aren’t “funded” by “an industry”. Studies can be, and certainly are; however, this one was not.

Do you have a specific problem with the methodology in this paper? In the kinds of studies Coca Cola is accused of influencing, sometimes the methodology is opaque enough to hide behind, even if it appears sound (food-frequency questionnaires, etc), but in this case they’re just using Coke’s own correspondences.

14

Cleistheknees t1_iy0u6dt wrote

Preface: the last “i” in HITI is actually for “integration”, not “insertion” as stated above, in case you wanted to google around for more info.

You guide the process in the same way. The donor sequence used (theoretically used, anyway) in HITI is capped by the same target sequences the gRNA-Cas9 complex is pointed at, and in fact the approach is exceptionally accurate. The idea behind HITI is that it isn’t limited to actively dividing cells.
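To make the donor layout concrete, here’s a toy sketch. The sequences are made-up placeholders, not a real guide or cargo, and the orientation detail is my summary of the published HITI design (Suzuki et al., 2016), where a backwards insertion reconstitutes a cuttable site and gets re-excised:

```python
# Hypothetical HITI donor (placeholder sequences, illustration only).
# The cargo is flanked by the same Cas9 target site (protospacer + PAM)
# the gRNA points to in the genome, so the donor is linearized by the same
# cut. Correct-orientation integration destroys the target sites at the
# junctions; reverse-orientation insertion regenerates intact sites, which
# get re-cut, biasing integration toward the correct orientation.

def revcomp(seq: str) -> str:
    """Reverse complement of a DNA string."""
    return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

target_site = "GACCTTAGCAATGCCTGAAC" + "TGG"  # placeholder 20nt protospacer + NGG PAM
cargo = "ATGGCTAGCAAA"                        # placeholder insert sequence

# Flanking sites in reverse orientation relative to the genomic target:
hiti_donor = revcomp(target_site) + cargo + revcomp(target_site)
print(hiti_donor)
```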

2