Everything posted by Markus Hanke
-
Parameters of Theory of everything.
This is true, but one must bear in mind the local nature of the field equations - outside the system, in vacuum, the energy-momentum tensor is zero, so the equations you’re solving are actually just the vacuum equations \[R_{\mu \nu}=0\] There is no source term at all here, yet you’re still getting a curved spacetime, even far from the central source. This is precisely due to the non-linearity of the theory - the curvature inside the system can “bleed out” into the surrounding vacuum, because curvature at one point is itself a source of curvature for surrounding points. This is encoded in the non-linear structure of the equations themselves.
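For anyone who wants to check this explicitly: here's a small symbolic sketch (using Python with sympy; the index conventions and variable names are mine, purely for illustration) verifying that the Schwarzschild exterior metric has vanishing Ricci tensor - i.e. it solves the vacuum equations - while its Riemann tensor, the full curvature, does not vanish:

```python
import sympy as sp

# Schwarzschild exterior metric in (t, r, theta, phi) coordinates
t, r, th, ph = sp.symbols('t r theta phi')
rs = sp.symbols('r_s', positive=True)  # Schwarzschild radius
x = [t, r, th, ph]
f = 1 - rs / r
g = sp.diag(-f, 1 / f, r**2, r**2 * sp.sin(th)**2)
ginv = g.inv()

# Christoffel symbols Gamma^a_{bc}
Gam = [[[sp.simplify(sum(ginv[a, d] * (sp.diff(g[d, b], x[c])
                                       + sp.diff(g[d, c], x[b])
                                       - sp.diff(g[b, c], x[d]))
                         for d in range(4)) / 2)
         for c in range(4)] for b in range(4)] for a in range(4)]

# Ricci tensor R_{bc} = d_a Gamma^a_{bc} - d_c Gamma^a_{ab}
#                       + Gamma^a_{ad} Gamma^d_{bc} - Gamma^a_{cd} Gamma^d_{ab}
def ricci(b, c):
    return sp.simplify(sum(
        sp.diff(Gam[a][b][c], x[a]) - sp.diff(Gam[a][a][b], x[c])
        + sum(Gam[a][a][d] * Gam[d][b][c] - Gam[a][c][d] * Gam[d][a][b]
              for d in range(4))
        for a in range(4)))

# Vacuum equations hold everywhere outside the source: R_{bc} = 0
assert all(ricci(b, c) == 0 for b in range(4) for c in range(4))

# ...yet spacetime is still curved: e.g. the Riemann component R^t_{rtr}
a, b, cc, d = 0, 1, 0, 1
Riem = sp.simplify(
    sp.diff(Gam[a][d][b], x[cc]) - sp.diff(Gam[a][cc][b], x[d])
    + sum(Gam[a][cc][e] * Gam[e][d][b] - Gam[a][d][e] * Gam[e][cc][b]
          for e in range(4)))
assert Riem != 0  # nonzero curvature in empty space
```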
-
Parameters of Theory of everything.
It means that the gravity of a macroscopic system is in general different from a simple sum of the fields of its individual constituent particles taken in isolation. It also means that just because a volume of spacetime is empty, that doesn't necessarily mean it is flat. One has to start with the correct initial and boundary conditions.
-
Parameters of Theory of everything.
Gravity being self-coupling or non-linear (which means the same in this context) means - among other things - that you cannot simply add together gravitational fields of individual sources to obtain the field of a more complicated system. For example, the spacetime geometry around two bodies in close proximity is not just the sum of two Schwarzschild metrics, especially not if these bodies are in relative motion. This is why you get eg extra perihelion precession with Mercury, which you wouldn't expect in a linear theory. Another example is gravitational waves - when they traverse a region of background curvature (like a massive body, or another wave front), they interact in ways that deviate from ordinary linear wave dynamics. In more technical terms, if you take two metrics, both of which are valid solutions to the Einstein equations, and add them together, then the result is in general not itself a valid solution to the equations. It also means that a gravitational field can exist in the absence of any “ordinary” sources; for example, exterior Schwarzschild spacetime is everywhere empty, yet nonetheless curved. This is because curved spacetime itself contains energy, which can act as a source for further gravity (caveat: this form of energy cannot be localised, unlike ordinary sources). This is in contrast to Newtonian gravity, which is completely linear.
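As a toy illustration of this failure of superposition (just an analogy - a simple non-linear ODE, y'' = y², standing in for the vastly more complicated Einstein equations), one can check numerically that two exact solutions don't sum to a solution:

```python
import numpy as np

# y(x) = 6/(x - a)^2 exactly solves the non-linear toy equation y'' = y^2
# for any shift a. We test solutions and their sum by finite differences.
x = np.linspace(0.5, 1.5, 1001)
h = x[1] - x[0]

def residual(y):
    """Max of |y'' - y^2| on interior points (central differences)."""
    ypp = (y[2:] - 2 * y[1:-1] + y[:-2]) / h**2
    return np.max(np.abs(ypp - y[1:-1]**2))

y1 = 6 / x**2          # exact solution (a = 0)
y2 = 6 / (x - 3)**2    # another exact solution (a = 3)

print(residual(y1))       # ~0 up to discretisation error: y1 solves it
print(residual(y2))       # ~0: so does y2
print(residual(y1 + y2))  # large: their sum does NOT solve the equation
```

The cross-term that spoils the sum here (2·y1·y2) plays the same structural role as the self-interaction terms in the Einstein equations: each solution acts as a "source" for the other.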
-
Principle of Causality and Inertial Frames of Reference
Ok, this is basically what I had in mind. We now need to first make precise what we actually mean when we speak of observers and IFRs. I'm taking the following from Sachs/Wu, General Relativity for Mathematicians (1977), which is the standard formal definition. We start by defining spacetime to be a connected, time-oriented Lorentzian manifold (M,g), endowed with the Levi-Civita connection. An observer on M is a future-pointing time-like curve with appropriate parametrisation. A reference frame on M is a vector field whose integral curves are observers as defined above. An inertial frame is one that is constant with respect to the covariant derivative. Thus, choosing an IFR means choosing a vector field on the manifold M that fulfils the above requirements. There may exist infinitely many such choices, or none at all, depending on the geometry and topology of M.
This definition is clearly inconsistent with your idea, since it makes no sense to speak of connections, metrics, integral curves, and vector fields that span multiple manifolds whose points don't map into each other 1-to-1. The best you could do here is consider a foliation of some higher-dimensional space, where each spacetime is a hypersurface of some constant parameter. But it doesn't look like that's what you're doing.
Since you say that each IFR is its own spacetime manifold, what does it formally mean to be stationary relative to spacetime? And what does it mean for a spacetime to be in motion relative to another spacetime? If each IFR is its own spacetime manifold, and events on these manifolds don't map 1-to-1, then transmitting information between such manifolds would be a clear violation of local unitarity. This explicitly implies \[\nabla_{\mu} T^{\mu \nu}\neq 0\] on all manifolds involved. Yet you postulate that events are not the same in all IFRs, in the sense that there's no 1-to-1 map between points on different manifolds.
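For compactness, those definitions can be restated in symbols (my shorthand, following the spirit of Sachs/Wu): an observer is a curve \(\gamma\) with future-pointing tangent satisfying \[g(\dot{\gamma},\dot{\gamma})=-1\] a reference frame is a vector field Z whose integral curves are such observers, and the frame is inertial precisely when it is covariantly constant, \[\nabla Z=0\] which is the condition on the covariant derivative referred to above.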
If different IFRs are different spacetimes, then accelerating means the test particle is leaving one spacetime and entering another. So it does cease to exist on the original spacetime, irrespective of any dependencies. How is this possible if there's no 1-to-1 map between these manifolds? And what is the nature of these “dependencies”, since you also say that each IFR is subject only to its own causal relationships? There is a clear contradiction here: if the outcome of an experiment in an IFR were somehow dependent on non-local influences, we'd have spotted that by now. On the other hand, if there are no such influences, there is no mechanism to guarantee causal consistency between manifolds. But this is precisely not what you're claiming…? Refer to your earlier examples with the moon, and photons/muons.
-
Where does atheist morality come from?
Fear of punishment - whether earthly, or by some god - is not the same as genuine morality.
-
Principle of Causality and Inertial Frames of Reference
To be honest, I've been silently reading along this thread, and it is still not clear to me exactly what the hypothesis you're referring to actually is. You keep talking about events and causality, and how they change between IFRs, but those are all concepts that already presuppose the existence of a spacetime endowed with a connection and a metric. At the same time you refer to something “more fundamental than spacetime”, which is meant to be implied by the hypothesis. I'm so far failing to make the connection. I don't know - it was you who made this claim: if there's no mapping (invertible or not) between those frames, there would be no transformation that relates them to one another; but you did say that such a transformation exists. So you have two entirely separate patches of spacetime, each with their own set of events, which can't be mapped into one another by any 1-to-1 map. These patches would thus each have their own independent histories, which are not guaranteed to be mutually compatible (see your own example above). Yet the patches are still somehow IFRs in a Minkowski spacetime, and thus related via the usual SR transformations? Is that the idea?
-
Twin paradox (split)
If c wasn’t invariant, there’d be a plethora of unresolvable physical paradoxes, and the universe wouldn’t have evolved. An invariant c is a fundamental prerequisite for any internally self-consistent notion of spacetime and causality, among other things. To put it differently, in the abstract space of all conceivable sets of laws of physics, only those with an invariant c can give rise to a macroscopic, self-consistent spacetime.
-
Principle of Causality and Inertial Frames of Reference
I don't see what such a transformation could possibly look like. For starters, massless particles have no rest frame (inertial or otherwise) associated with them, so it is not clear at all what it actually is that you're mapping between. Furthermore, what would be the parameters of the map? It can't involve v, since v=c in all IFRs, so the map would be 1-to-many. But if it's not v, what else could it be, since that's the only parameter whereby IFRs are related to one another? Also, there's more than one massless particle in nature. What is it that determines that a particular set of IFRs maps a muon into a photon, and not a gluon (or hypothetically a graviton)? And what about the fact that muons decay, and gluons are QCD-confined, whereas photons are free and stable?
-
Parameters of Theory of everything.
I never heard of this before, and tbh I don’t see how this is even mathematically possible…? Naively it would seem to me that you can’t get the proper polarisation states in g-radiation fields with anything less than spin-2 quanta. But maybe I’m missing something.
-
Abuse of the term "conspiracy theory" in popular culture
I would even go so far as to say that FE is nothing but a conspiracy theory - the dynamic at the heart of this concerns a “them” hiding things from “us”; it is wholly about control and power in politics, and has little if anything at all to do with the science of planets. It’s about mistrust in authority.
-
Time and space displacement by gravity relative to mass
None of this is what General Relativity actually says. What is the question or discussion point here?
-
Length contraction in a doughnut shaped universe.
This falls under the area of General Relativity. If you already have a background in maths, my recommendation would be the book Gravitation by Misner/Thorne/Wheeler.
-
A nail in the coffin of Loop Quantum Gravity, or just a tack in its rubber sole?
I’d bet there’s something to it. In fact, I’d go so far as to say that complexity, chaos and emergence in general are seriously underrated and under-utilised in modern physics. Just my opinion though
-
A nail in the coffin of Loop Quantum Gravity, or just a tack in its rubber sole?
I wouldn't put it so strongly, it just means that the data places strong constraints on which models might be viable or not. On the other hand though, we have good reason to believe that there is entropy associated with the horizon of BHs - and since the concept of entropy only really makes sense for a system that exhibits discrete microstates in some form or another, the interior region cannot be smooth and continuous empty space everywhere. So I'd still bet my money on some deeper structure that underlies classical spacetime, even if that turns out to have nothing to do with spin foams.
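Just to put a rough number on that horizon entropy (a back-of-envelope sketch only, constants rounded), the Bekenstein-Hawking formula S = k·A/(4·l_p²) for a solar-mass black hole gives:

```python
import math

# Rounded physical constants (SI units)
G, c, hbar = 6.674e-11, 2.998e8, 1.055e-34
M_sun = 1.989e30  # kg

r_s = 2 * G * M_sun / c**2        # Schwarzschild radius, ~2.95 km
A = 4 * math.pi * r_s**2          # horizon area
l_p2 = hbar * G / c**3            # Planck length squared
S_over_k = A / (4 * l_p2)         # Bekenstein-Hawking entropy in units of k_B

print(f"{S_over_k:.2e}")          # on the order of 1e77
```

That's an enormous number of implied microstates for a region that classical GR describes as featureless vacuum - which is the tension I mean.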
-
A nail in the coffin of Loop Quantum Gravity, or just a tack in its rubber sole?
Shame, that…I had some hopes for LQG. But this is just how science works.
-
Dark matter ....
This is the trouble, really; all the known and proposed alternative models of gravity have some form of problem. Generally speaking, they tend to be able to model one particular (class of) phenomenon better than GR, but then fail spectacularly in other situations. Most of them are also mathematically complicated, and unwieldy to work with, and oftentimes they rely on additional assumptions (extra fields etc) for which we have no evidence.
-
Dark matter ....
Sabine Hossenfelder, for instance - though I wasn't immediately able to find a reference (have to look some more later). The idea isn't new, and isn't mine either, but I think it never really came to the forefront, since it's essentially untestable given the current limitations in computing power.
-
Latex switch to rich text format
I have experienced this switch from LaTeX to RTF numerous times also - for me this happens when I simply press “Edit” after submitting a post that contains LaTeX.
-
Dark matter ....
I cannot, of course, be sure about such specifics either, nor even about whether or not anything special would happen at all in an n-body problem. It's really just a hypothesis, based on emergent dynamics in non-linear systems. Basically I'm sceptical about both the particle as well as the alternative-model options, so it's good to have a third alternative. I agree with DanMP's earlier comment that DM is a big part of our current model of the universe, so this is a very important issue.
Well, I have never been trained. I simply base my thinking on what we do know - on situations with small n that can be exactly solved. For example the n=2 case: the spacetime of a binary body system is nothing like our naive idea of two Schwarzschild metrics superimposed. What happens for n in the billions is anyone's guess, because I don't think those non-linearities smooth out. So maybe DM is really a chaos-theoretic problem.
-
Dark matter ....
Yes I get you, but to me this simply is extrapolating a model which we know works very well on solar system scales, to larger scales. After all, there is no immediate reason to assume that gravity works differently on galactic scales than on solar scales - while that could be so, we have no evidence that it's actually the case. I therefore think it's important to try and find ways to figure out what GR actually predicts for large n-body systems, rather than just simplified models with an unknown error factor. One might also say that possible alternatives such as MOND etc are “cheating”, because all of those models postulate things (extra fields, new universal constants etc) for which we have otherwise no evidence. At the very least, GR is the simplest possible metric model of gravity that is fully relativistic.
-
Dark matter ....
This is just the point - if the hypothesis is correct, then, if we were able to actually solve the Einstein equations for a very large n along with the correct boundary conditions, the result would exactly match observation, without having to postulate the presence of anything else. In other words, Dark Matter would just be the error introduced by using idealised computations, rather than an actual n-body problem with large n.
The domain of applicability of GR is naturally limited - as a classical model, it can't eg account for quantum effects, so it won't ever be able to explain all observations. You just use it within its proper domain. The question is just how far this domain extends up and down, and we won't be able to definitively answer that question until we have clarity as to the precise nature of DM. It is definitely possible that an alternative model of gravity is required on larger scales, which would restrict the validity of GR to the intermediate scale (~tens/hundreds of Ly?). The problem with that is of course that all currently known alternatives to GR themselves suffer from a variety of problems, and don't match all observations.
The idea with the n-body dynamics is just a hypothesis, to add to the “particles” and “alternative model” options. Of course it could be wrong, though we have no way to check right now. But then again, we know already that ordered structures can emerge from otherwise chaotic dynamics in non-linear n-body systems; so the concept isn't that far-fetched at all.
-
Dark matter ....
These are all excellent points. Unfortunately I'm up to my eyeballs in real life at the moment, so I'll need to come back to this at a later point. Consider the following though.
Suppose you have an alien scientist whose species lives down in the ocean of a water-world (no solid land). One day he notices some sand on the bottom of the ocean, and begins to wonder: what would happen if you had a very large amount of grains of sand, without water, just under the influence of wind and gravity? He knows Newtonian physics, and he knows the Navier-Stokes equations. Based on these, he figures that each grain is blown about by the wind, pulled down by gravity, bounces about a bit in pretty much chaotic patterns, and might come to rest somewhere. Over large areas and long times, each point on the sand plain is equally likely to become the resting spot of a sand grain - so it's reasonable to expect that all inhomogeneities smooth out over time, and you end up with a more or less flat expanse of sand eventually. So now he jumps into his (water-filled) UFO and visits Earth. He lands in a desert, and imagine his surprise when he sees a vast field of ordered, rippled sand dunes. A naive application of Newtonian gravity and Navier-Stokes fluid dynamics would give no indication that a large number of essentially isolated sand grains undergoing essentially chaotic dynamics would give rise to large-scale ordered structures such as these. So our alien scientist would be forgiven for concluding that there must be some other influence that leads to the formation of dunes.
The situation in GR is similar. Each star or galaxy taken in isolation is locally near-Newtonian, and would thus be expected to behave that way on all scales. However, an n-body system with very large n undergoing chaotic dynamics under the laws of GR might form global spacetime geometries that are not immediately predictable, just like sand grains and the formation of dunes (which is meant just as an analogy, btw).
This holds for stars in a galaxy, or for the interaction between galaxies, or for galaxies in the universe. The point is we don’t know if that’s the case or not, because we don’t have the computing power necessary to model a GR n-body problem with very large n. So this is just a hypothesis, based on the fact that metrics don’t add linearly; the overall metric of an n-body system is not the sum of n metrics for the n constituent bodies. So it’s possible at least in principle that the actual global spacetime might look like it contains more mass than we can observe, even though in actual fact it doesn’t. That’s not really what I’m saying. We can use GR quite accurately so long as it is permissible to make enough simplifying assumptions to render the maths manageable. For example, a single body that can be considered isolated (asymptotic flatness) and is symmetric enough can be easily modelled, and the result matches observation very closely. I think the problems arise only if we are dealing with n-body systems, because the non-linearities inherent in GR may not smooth out and become negligible; they might in fact compound in large enough systems. And the trouble is we don’t have enough computing power to actually run such simulations, for large n.
-
Dark matter ....
The current overall state of affairs seems to be that:
1. All efforts in directly detecting the more plausible types of DM particles have come up negative, and particle physicists are forced to consider more and more exotic extensions to the Standard Model to come up with workable alternatives
2. There appears to be little to no statistically significant evidence to support any one of the various alternative gravity models, since they all suffer from more or less significant problems
While I think the current evidence isn't strong enough to definitively rule out either the particle or the alternative-gravity option, I personally tend towards a third option, which avoids both of those - namely that DM is actually an artefact of our inability to produce solutions to the ordinary GR field equations that aren't idealised. For example, even the best numerical approaches to modelling a spiral galaxy in the context of the GR equations need to be idealised - it's going to be some sort of continuous dust distribution, with appropriate density curves and initial and boundary conditions. But a real galaxy is not that - it's a discrete set of a large number of individual sources of gravity, all of which interact gravitationally and often also mechanically. Due to the non-linear nature of the GR equations, it is really not possible (with our current tools) to tell what kind of error is introduced by idealising this situation to make it modellable. We don't have nearly enough computing power (by many orders of magnitude) to numerically solve a GR n-body problem with n on the order of ~100 billion, plus realistic boundary conditions. At least in principle, the discrepancy between model and observation which we call Dark Matter could just be the error introduced by idealising a real-world scenario in a non-linear model. DM could be nothing more than a mathematical artefact.
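To get a feel for the scale of the problem, here's a back-of-envelope estimate (all numbers are rough, hypothetical guesses, and this is for the Newtonian case only - full numerical relativity would be vastly more expensive again):

```python
# Direct pairwise force evaluation scales as n^2 per time step.
n = 1e11                      # ~100 billion stars in a galaxy
pairs = n * (n - 1) / 2       # ~5e21 pairwise interactions per step
flops_per_pair = 20           # guess: cost of one Newtonian force kernel
steps = 1e6                   # guess: time steps for one model run

total_flops = pairs * flops_per_pair * steps
exaflop_machine = 1e18        # flops/s of a top current supercomputer

seconds = total_flops / exaflop_machine
print(seconds / 3.15e7, "years")  # thousands of years of compute time
```

Clever approximation schemes (tree codes etc) reduce this for Newtonian gravity, but no comparable machinery exists for the full non-linear GR field equations.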
-
Final Parsec Problem
I think this is the issue. From what I know, running an exact numerical solution with realistic background parameters (as opposed to idealised metrics) yields merging times far in excess of the age of the universe, so we shouldn’t be seeing SMBH mergers. Yet we do, so it seems we’re missing some mechanism or another that bleeds away angular momentum quickly enough.
-
Relativity Crisis
The physical content of Maxwell's equations is invariant under Lorentz transformations, so any physics predicted by them is guaranteed to be compatible with relativity. As you say, this effect is well known - the speed here is the phase velocity of the wave, not the speed at which any signal or energy propagates. There's no upper limit to phase velocity, and no information can be transmitted superluminally in this way.
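A minimal numerical sketch of this (using an illustrative waveguide/plasma-type dispersion relation, ω(k) = √(c²k² + ω₀²), in units of c; the numbers are arbitrary) showing the phase velocity exceeding c while the group velocity, which bounds signal propagation, stays below it:

```python
import numpy as np

# Illustrative dispersion relation: w(k) = sqrt(c^2 k^2 + w0^2)
c, w0, k = 1.0, 2.0, 1.5
w = np.sqrt(c**2 * k**2 + w0**2)

v_phase = w / k           # superluminal: 2.5 / 1.5 > 1
v_group = c**2 * k / w    # subluminal: signals travel at this speed

print(v_phase, v_group)         # v_phase > c, v_group < c
print(v_phase * v_group)        # their product is exactly c^2
```

No paradox arises, because the superluminal quantity carries no information.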