Leemour

Leemour t1_j9110bn wrote

>The most common complaint is that an atom can only absorb very specific wavelengths, but light of all wavelengths is slowed down by materials

Yeah, but not all wavelengths are slowed at the same rate (dispersion), which is where I think a quantum treatment of the system may be more insightful. The truth is there's a fundamental problem in trying to account for everything on the very small scale (scattering, absorption, phonon interactions, etc.) to match observations on the large scale ("simple" intensity and spectral measurements, maybe with a clock).
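To make the dispersion point concrete, here's a quick sketch (my own illustration, not from the comment) using the Sellmeier equation for fused silica; the coefficients are the standard published values for that glass, and the point is just that the phase velocity c/n differs across wavelengths:

```python
import math

# Sellmeier equation for fused silica; coefficients are standard
# published values, wavelength given in micrometers.
B = (0.6961663, 0.4079426, 0.8974794)
C = (0.0684043**2, 0.1162414**2, 9.896161**2)

def n_fused_silica(wl_um):
    """Refractive index of fused silica at wavelength wl_um (micrometers)."""
    wl2 = wl_um ** 2
    return math.sqrt(1.0 + sum(b * wl2 / (wl2 - c) for b, c in zip(B, C)))

# Shorter wavelengths see a higher index, i.e. a lower phase velocity c/n:
for wl in (0.4, 0.8, 1.55):
    print(f"{wl:5.2f} um  n = {n_fused_silica(wl):.4f}")  # n ~ 1.470 ... 1.444
```

Blue light (0.4 um) travels measurably slower in the glass than infrared (1.55 um), even though neither wavelength sits on an absorption line of silica, which is exactly the puzzle the comment is pointing at.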

It's like having a large container of 100+ tennis balls and trying to predict where they will all land if the container is flipped. I think it's unfair to dismiss the insights from the dynamics of a single tennis ball, but clearly that's not enough to predict how the group collectively behaves.

1

Leemour t1_j910acq wrote

Short answer is we don't know if it is the same photon or not, because our useful models treat light as fields and waves for the most part. We treat light as a particle when it's interacting with matter, but not when it's propagating. Whether we'd find the same unique ID or not is something that remains to be seen, maybe one day.

>Are all photons at the same wavelength identical so that it just doesn't make any sense to ask this question or are there some properties that are effectively randomized each time it is re-emitted?

This depends on the laser, and I don't think it's wise to write a whole wall of text explaining all the ways a laser may emit photons with the same energy while other factors differ and make the laser better or worse for the intended application. Given the nature of lasers (and of any radiating black body), we can't produce a laser whose photons all have exactly the same energy; we can roughly select for a desired wavelength, and far more efficiently than with conventional light sources, but every laser emission still has a bandwidth.

Depending on whether you want a light pulse of extremely short duration or one of extremely high temporal coherence, you'll pick a broad bandwidth or a narrow bandwidth respectively. Unfortunately, as of now this is not a simple switch we flip; it requires adding or removing so many optical components that it amounts to a total overhaul of the system.
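The bandwidth/duration trade-off mentioned above can be put in numbers with the time-bandwidth product (for a Gaussian pulse, FWHM duration times FWHM frequency bandwidth is about 0.441). This sketch is my own illustration; the example wavelengths and bandwidths are just plausible round numbers:

```python
# Transform-limited pulse duration from spectral bandwidth, assuming a
# Gaussian pulse shape (time-bandwidth product ~0.441).
C_LIGHT = 299_792_458.0  # speed of light, m/s

def transform_limited_fwhm(center_nm, bandwidth_nm, tbp=0.441):
    """Shortest possible FWHM pulse duration (s) for a spectral FWHM in nm."""
    # Convert the wavelength bandwidth to a frequency bandwidth: dnu = c*dl/l^2
    dnu_hz = C_LIGHT * bandwidth_nm * 1e-9 / (center_nm * 1e-9) ** 2
    return tbp / dnu_hz

# Broad bandwidth supports ultrashort pulses; narrow bandwidth means
# long pulses but high temporal coherence:
print(transform_limited_fwhm(800, 40))    # ~24 fs (broadband, Ti:sapphire-like)
print(transform_limited_fwhm(1550, 0.01)) # ~0.35 ns (narrow-linewidth)
```

A 40 nm bandwidth around 800 nm supports few-tens-of-femtosecond pulses, while a 0.01 nm linewidth at 1550 nm can't do better than roughly a third of a nanosecond; you genuinely cannot have both extremes at once from the same emission.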

1

Leemour t1_j5lokji wrote

> Believing that certain behaviors are sinful or immoral, likewise not intolerance. Nor is merely expressing such beliefs, however annoying, upsetting, or offensive they may be to those who hear them.

This is... not really applicable. The majority of hamartiology books would describe sin as something harmful, among other things. As long as a state of being is touted as "harmful", persons in such a state of being will be at risk of violence: it does more harm to tolerate it than not to.

Moreover, if we were to necessitate tolerance of such bigotry by allowing it to be said, then we would also have to tolerate the visceral anti-religious sentiment that such bigotry causes in the first place. Not only does this mean we could logically scorn bigotry with a bigoted attitude of our own, but it creates an unending cycle of insults and tension in which it's difficult to avoid spiraling into violence. It may be the hallmark of a "free" society, but it's definitely not the characteristic of a stable one.

24

Leemour t1_ixd8yjw wrote

The funny thing is that pension schemes don't work in most places. My grandparents rely on my parents, and I'm ready to support my parents along with my children in the future, because these pensions are useless when the economy is so volatile.

Even in Germany there is a name for the phenomenon of "pensioner poverty"; it's so frustrating.

2

Leemour t1_ittua47 wrote

"Technological mindset" is a weird mental gymnastic to avoid criticising capitalism and consumerism (the real culprits behind the exploitation of nature). Instead of romanticizing primitivism, we'd be better off with more useful hands and brains working toward a solution to climate change.

1

Leemour t1_isjj274 wrote

So many commenters obviously didn't read the article. He is the first openly gay Episcopal bishop, with a husband and an adult son. He was never in doubt about his faith or that he is loved by his God even though he is openly gay. He was only doubtful whether the church would let him be a bishop, which is why he thought this dream was beyond reach.

8

Leemour t1_irjpa50 wrote

Alright, tinfoil hatter. I totally wasn't trying to tell you that you need to read carefully what researchers publish, that peer review is not the same thing as doing a metastudy, and that all of this is separate from the process by which the medical community and drug administrations select their treatment methods and drugs.

The metastudy is major in the sense that it debunks the serotonin theory altogether, but many psychologists and therapists already knew that. Even the article points out that this myth was/is chiefly believed by the general public. Regardless, the drugs go through drug trials, which is a separate matter: they worked for a certain number of people, so they passed the test, maybe not for the reasons we understand, but if it works, it works, and despite this study it's not wise to force these drugs off the shelf when they succeed in trials.

Your not understanding the nuance of these things only proves your ignorance. The hysteria is just the cherry on top.

0

Leemour t1_irj0iuh wrote

I can't believe I have to point this out: what do you think "professional" means here? Researchers, therapists, or both, and what is the difference? In what world could a therapist force a patient off a demonstrably effective antidepressant (against the patient's wishes!!!) when such a comprehensive metastudy was lacking? The general public's view does not coincide with scientific consensus/discussion either, so again, what part of this is telling you it's a sham?

0

Leemour t1_irix8bn wrote

No, peer review =/= metastudy =/= drug approval. These are different processes with different functions and responsibilities. Not to mention that therapists don't rely on questionable research in psychology without informing their patients that they are taking part in a study, or using a questionable method that has an X% success rate with possible complications.

1

Leemour t1_irgdcj9 wrote

I disagree; you need tangible resources to record, interpret, and apply information. There is no process that perfectly and permanently records information, so just maintaining fidelity requires analysis and re-recording of the information, and even this assumes a perfect formulation of the information to begin with, which is also not the case. Denying this is unhelpful reductionism: culture is part of the problem, but solving that still leaves the problem standing.

1

Leemour t1_irfwv4y wrote

They're both scenarios where we push things to an extreme. As I see it (from a kind of Naturalist+Existentialist-I-guess perspective, based on my reading of Schrödinger's lectures on thermodynamics and more contemporary researchers who have posed existentialist questions), all things have this entropic nature in them: a pull toward disorder (don't read this as chaos), variation, and ultimately a sort of dispersion. If we do nothing, everything will crumble; if we do everything we can to stop one thing from crumbling, everything else will crumble faster. The key is to find the right amount of "push back" for the right things, and to aim for the optimal instead of the ideal.

So any system we create will need reforms and changes, but there is no formula for "when", "how often", and "how extensively"; besides, it's something that a different generation will need to deal with anyway, and we cannot predict what paradigms and contexts will be relevant in the future. Providing a framework for the means of making changes, and giving rational permission for occasional radicalism, is how systems can preserve their function overall.

The way this relates to the OP is that, with this unprecedented, massive influx of new research publications, a lot just goes more or less unchallenged. This ties in with entropy, which gets "stronger" with "bigger numbers". We can work against it by verifying the research in some way, but how many resources is that verification worth when research keeps getting more expensive? Currently the main form of verification is the metastudy, which collects a body of relevant studies and critically examines their methods, results, impact on other research, etc. to determine whether a finding has in some form been verified, falsified, or is suspiciously lacking. It is essentially doing science on science, and in many fields it seems to work just fine, though the typical criticism is that there aren't enough people doing it with enough resources. I agree with that, but I'd add that metastudies are hard to do precisely because of their highly critical focus.

Overall, this OP is a bit dramatic; there is no real crisis in the general scientific community, so no, scientists are not quacks (yet). But in some fields (namely psychology and medicine) this has grown out of proportion, and some change is needed there, or more frequent/sizable metastudies.

0