Viewing a single comment thread. View all comments

SubtlySubbing t1_j3dawdc wrote

I don't agree at all with what this article is saying. First of all, a theory is the hypothesis with the most proof. It doesn't mean it's proven. It just means it's better at predicting than other models.

> If a programme predicts nothing new or its predictions can’t be tested, then it is bad science, and might be degenerating to the point of pseudoscience.

Theories have always outpaced experimental advancements. General relativity was created at a time when it couldn't be proven; gravitational waves were only detected a few years ago. The Hot vs. Cold Big Bang theory couldn't be tested until COBE was launched into space to measure whether there was a Comic Microwave Background, falsifying the Cold Big Bang and advancing the Hot Big Bang theory. The dark matter theory has multiple splits, but the main one involves WIMPs (weakly interacting massive particles), which, as the name implies, barely interact with other matter, so how can you use matter to probe them? But DM is there, or at least, something is causing there to be more mass in the universe than there should be based on observations of ordinary matter. They mention string theory as bad science since it can't be falsified. But it just can't be falsified yet. That doesn't mean the theory is incorrect. We just can't prove any of it.

It just seems like the author is conflating "unable to be truly falsified" with "unable to be falsified yet," like their example of astrology vs. astronomy. If a theory intrinsically admits no experiment, then it isn't scientific. "God is blue" isn't the same as "Dark matter exists."

There's always been a cycle of conjuring a theory that then has to wait for experimental advancements to catch up, and I would argue that this strongly propels science forward. If we label these theories "bad science," then how would we ever advance? How would we ever challenge ourselves to improve experimental techniques?

Also the examples they give are just straight up wrong.

> Some are inherent in the mathematical formulation of the theory, such as the assumption in Isaac Newton’s theory that the masses of gravitating bodies are located at their centres.

That isn't an assumption of the theory at all. It can be proven with simple calculus that the cumulative Newtonian gravitational force of a sphere of matter on an outside body is exactly the same as if all the sphere's mass were located at its center point. It's a byproduct, not an assumption.
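To back that up with a quick numerical sanity check (my own sketch, not part of the original comment; the sample count, units, and test position are arbitrary choices), here is a Monte Carlo sum of the Newtonian pull from points filling a uniform sphere, compared against a single point mass at its centre:

```python
import numpy as np

# Quick numerical check of the shell theorem: the net Newtonian pull of a
# uniform solid sphere on an exterior test mass equals that of a point mass
# at the sphere's centre. Units are arbitrary: G = 1, total sphere mass = 1,
# sphere radius = 1, test mass = 1.
rng = np.random.default_rng(0)

# Sample points uniformly inside the unit sphere by rejection sampling.
pts = rng.uniform(-1.0, 1.0, size=(2_000_000, 3))
pts = pts[np.einsum("ij,ij->i", pts, pts) <= 1.0]
m = 1.0 / len(pts)                       # equal mass per sample, summing to 1

test = np.array([3.0, 0.0, 0.0])         # test mass sitting outside the sphere

d = pts - test                           # vectors from test mass to each source
r = np.linalg.norm(d, axis=1)
net = ((m / r**3)[:, None] * d).sum(axis=0)   # sum of G*m*d/|d|^3 contributions

print("summed force:    ", net)                               # ≈ [-1/9, 0, 0]
print("point-mass force:", -test / np.linalg.norm(test)**3)   # exactly [-1/9, 0, 0]
```

The summed force converges on the point-mass value, which is exactly what the calculus derivation (the shell theorem) says it must.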

> Others are necessary to simplify calculations, such as the assumption that in experimental studies of the electromagnetic force, the effects of other forces (such as gravity) can be safely ignored. This means that the resulting predictions are never derived directly from the theory itself, but rather from the theory as adapted by one or more ‘auxiliary’ hypotheses. If these predictions are then falsified, it’s never clear what’s gone wrong. It might be that the theory is indeed false, but it could be that one or more of the auxiliary hypotheses is invalid: the evidence can’t tell us which.

Yes, it can be safely ignored. The ratio of the electric force to the gravitational force between two electrons is about 4.17 × 10^42 (meaning the electric force is roughly 4,170,000,000,000,000,000,000,000,000,000,000,000,000,000 times stronger than their mutual gravity), well beyond any experimental error we can detect. And since forces are additive, if you wanted to include gravity you could just add its force to the equation (but you'd practically be adding zero). Because it's so small compared to the electric force, and you wouldn't be able to detect it within error, it is very, very safe to assume gravitational forces can be neglected when testing electric theories with subatomic particles. If there is a big enough discrepancy from the predictions of the electric forces, then it is very clear the problem isn't your neglect of gravity; it is something wrong with your understanding of electricity.

(This is actually another good point. We can't do experiments on quantum gravity yet because we don't have the equipment to detect such a small force. Does that mean the theory is bad science? Not at all.)

The inverse of this is measuring gravity in cosmology: we can safely assume the electrical force acting between two galaxies is negligible compared to their gravitational force, and any experimental discrepancy would be far too large to be caused by their electrical force. Physicists make approximations all the time, but that doesn't mean they are doing it willy-nilly.
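For anyone who wants to reproduce that ratio (a back-of-the-envelope sketch I'm adding, using standard CODATA constants; the separation between the electrons cancels, so no distance has to be assumed):

```python
import math

# Ratio of Coulomb to Newtonian gravitational force between two electrons.
# Both forces fall off as 1/r^2, so r cancels and the ratio is k*e^2 / (G*m_e^2).
k   = 8.9875517923e9     # Coulomb constant, N m^2 / C^2
G   = 6.67430e-11        # gravitational constant, N m^2 / kg^2
e   = 1.602176634e-19    # elementary charge, C
m_e = 9.1093837015e-31   # electron mass, kg

ratio = (k * e**2) / (G * m_e**2)
print(f"F_electric / F_gravity ≈ {ratio:.3g}")   # ≈ 4.17e+42
```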

The theory of QM has very different philosophical approaches to understanding what a probability wave means and how observation collapses it into a particle. It's very hard to truly falsify any of them, but you would be laughed at for saying the theory is bad science because we lack a fundamental understanding of its implications.

66

Seek_Equilibrium t1_j3em9ui wrote

> First of all, a theory is the hypothesis with the most proof. It doesn’t mean it’s proven.

This is a bit of a bugbear of mine. A theory is not what a hypothesis graduates into when it collects enough evidence. Theories are broad explanatory frameworks; they incorporate and generate many hypotheses/predictions. Some are highly backed by evidence (Darwinian evolutionary theory, general relativity, etc.), some are discredited (Lamarckian evolutionary theory, etc.), and still others are currently speculative, not yet decisively confirmed or disconfirmed.

28

Le_Chevalier_Blanc t1_j3dhk19 wrote

I think if there was a comic background radiation the world would be a much better place.

13

ehdontknow t1_j3f9b4t wrote

Well damn, now I know what my hypothetical nerdy comedy troupe would have been called. But for real, it’s such a perfect typo, it seems a shame for it not to end up being used for something.

4

ShalmaneserIII t1_j3f8i8p wrote

Basically, there's a difference between "Can't be tested" and "Can't be tested yet, but we can tell you how a test should work when we can do that."

12

GreenVikingApple t1_j3fn25r wrote

> Theories have always outpaced experimental advancements.

No; blackbody radiation, atomic spectra, and countless other examples show experiment getting ahead of theory.

5

not_jyc t1_j3fuxm7 wrote

Also tons of things in math: Fourier transforms came before analysis. Also every open conjecture.

6

tiredstars t1_j3gz9up wrote

> There's always been a cycle of conjuring a theory that needs to wait for experimental advancements to catch up. And I would agrue that this strongly propels science forward. If we name these theories as "bad science," then how would we ever advance?

There's a problem, though, isn't there, with a hypothesis that can be tested in theory but not in practice until some unknown point in the future? Especially if there's a risk of the theory being tweaked when the evidence doesn't come up: "OK, we didn't find what we were looking for with this particle collider, but if we build a bigger one we will..." (I think in Lakatos' terms that would be an "auxiliary hypothesis" created to protect the core hypothesis.)

As the article points out, this is a problem for Lakatos' ideas, as sometimes "degenerating" science does produce good results. Maybe that theory scientists have been pushing for half a century without empirical tests will turn out to be correct when the technology (or funding) is there. Or maybe you'll have wasted 50 years.

This kind of science is therefore risky: riskier than science that can be tested straight away or in the near future. The article argues for honesty about this risk and a clear assessment of it when funding or supporting science.

To pick up on /u/ShalmaneserIII's comment, there's a difference between "can be tested now", "can't be tested yet, but we can tell you how it should be done", and "can't be tested". The middle category falls somewhere between the ideal of the first and the junk of the last.

2

malament-hogarth t1_j3jg38e wrote

“Theories have always outpaced experimental advancements.”

No. Take the CMB, for example: it was discovered when nobody was looking for it. A totally accidental Nobel Prize.

String theory is an interesting discussion, because it is more closely woven to what is serendipitous (especially in the extraction of dynamics), the nature of a substrate, and the nature of operational distinguishability, with a very simple rule set. Like Newton's theory, it is a special use of general relativity that requires a displacement. Newton's theory is not really a new paradigm; it is just a subset of Einstein's, and Kuhn is wrong in this regard. He is right about the expansion of conceptual understanding.

Displacement is the core of the Calabi-Yau manifold: how to approximate distances. Nonlocality is the core assumption, which degenerates into separable fermionic and bosonic interactions. DM comes into play because of how the Lagrangian drives the modern theory; the jig is not up just because the anomalous magnetic dipole moment (the amplitude for an electron to absorb and emit a photon) is slightly off from the Dirac value. There is a means of correction with coupling constants to a vast array of conformal field theories. This is what it means for a field to be renormalized, which some interpret as a means of rescaling, others as a means of quantization. The constant of constants. Either way, the electron mass must be measured and plugged into the theory, past the standard model. The standard model represents the core of the fracturing of symmetry and shows the use of symmetry to conserve features independent of dimensionality.
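For a sense of the number being gestured at here (my addition, assuming the reference is to the electron g-factor's deviation from Dirac's exact prediction of g = 2): the leading QED correction is Schwinger's α/2π term, which already gets most of the measured deviation.

```python
import math

# Leading-order QED correction to the electron's magnetic moment.
# Dirac's theory predicts g = 2 exactly; the one-loop (Schwinger) term
# shifts it by a_e = alpha / (2*pi). Higher-order terms are ignored here.
alpha = 1 / 137.035999084            # fine-structure constant (CODATA)

a_e = alpha / (2 * math.pi)          # anomalous magnetic moment, one loop
g   = 2 * (1 + a_e)

print(f"a_e (one loop) ≈ {a_e:.6f}")   # ≈ 0.001161
print(f"g   (one loop) ≈ {g:.6f}")     # ≈ 2.002323, vs Dirac's exact 2
```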

You talk about safely ignoring the ratio between an electrical force and a gravitational one, but AdS is all about translational invariance in holography. The unification of fundamental forces implies a conservation.

Did nature cause decoherence for the fracturing of unitarity groups within the early universe? Was such a fracturing timeless, or "all at once" across all of time and space? Or is gravity a thermal emergence that occurs and is conserved in DM and DE, hence the fracturing of symmetry must have happened "all at once"? There are models for both of these, and very real philosophical interpretations of the nature of truth we ought to expect. And yet why would unitarity choose one over the other? The self-referential problem is that these principles seem to imply multiple decompositions we must take on the road to reality, conditional on how we define an action and what features we find 'anomalous' to reality. Parity, charge, and time all seem like very real things, but classical theory does not allow us to use all of them without including a degree of freedom, and we will never have all three. This is where Popperian thought collapses: what is the null hypothesis of handedness? Dimensionless features are no problem to project some density functions onto, yet there is no architecture, no experiment, for such a thing. We require the mapping to a degree of freedom (or the implications of 1/2, for that matter).

Lakatos is wrong because the philosophy of science begins on the degenerative. Should we use classical or conditional probability? We should use both. The program is not degenerative just because some aspect of subjectivity may be intractable for causal conditional probabilities; we simply switch to classical probability. Because classical special relativity is timeless, we ought to be neurotic? Well, good thing we are. But the classics can continue if we but assume one hidden layer (of course, any more than that, forget it) and work with what covariance affords: a context.

Aberration addiction is a problem of philosophy. It is a shortcut of the mind, useful to an extent, a distortion we can recognize, perhaps something we can even offload to some topological defect, or always make excuses for in Platonism where formalism fails. Heck, even intuitionism, where such lazy eliminativism is disallowed, fails.

What is so interesting is that within poetic naturalism, we can recognize the "mob psychology" and its need for change, but also the human limitations on piercing reality. A healthy psychology has the slightly new. Maybe we just need more conditional probability, as Bayesianism can commensurate the scientific method, yet Feyerabend will always be a great insurance policy. Some part of the pathos will keep the ensemble eloquent. Some part of the reduction will assume the collectively exhaustive. Some part of orthonormality will continue to be observable in the Bell test. The use of capital-T Truth and lowercase-t truth does not make model-based realism or the surrealism any less interesting. David Deutsch has a good take, where we should work toward what things are "possible".

That being said, I welcome what any imagination can bring to the absurdity of unitarity and yet the seeming rigidity of isometries. There is a mystery that will continue in the debatable ambiguity as coverage, or the rationale hiding within the indistinguishable, operationally equivalent. Our evolution is finitistic as the eon is defined, an extension of ourselves so familiar, yet alien. As auxiliary as is multiply realized, metastable claims are not sane; they are ahead in their thinking and behind for our kinesthetic senses. We can assume past the intractable, but we cannot state a reality wholly deterministic or ceramic. We are blessed with opportunity and knowledge, broken as a walk in three dimensions that seeks coverage.

2