
fredrik's viewpoint on string-think and fundamental physics



Hey Martin,

 

Do you advocate LQG? If so, what is your current attitude towards the dimensionality of spacetime?

 

What bothers me more than anything is not just the geometry or structure of spacetime; it is perhaps even more the spacetime dimensionality itself. While the dimensionality may appear intuitively obvious, I don't think this is a valid assumption. It is far too strong. I'd expect there to be a better explanation for the apparent dimensions.

 

What is the mental rescue the LQG advocates use to handle this? This is to me an important point, and my feeling is that ignoring it is really too much.

 

While I'd like to answer that the answer is in the data, there still has to be an intelligent method for deriving "dimensionality" out of data.

 

Is this ignored in LQG as well, or have I missed some alternative, more advanced interpretations of LQG? (I ask because I am no LQG expert.)

 

/Fredrik

 

The answer would be different depending on which non-string QG approach you are asking about.

 

When string theorists say the main rival to string is "LQG", it can be a bit confusing----standard LQG was worked on a lot in the 1990s.

 

Now when they have a conference like Loops '05 they explicitly say it is for background independent approaches to quantum gravity.

 

(background independent means that spacetime itself is dynamic, generated by the theory----geometry is not fixed ahead of time----there are a number of ways to do this that are being explored)

 

Here is a dictionary of what people are working on:

 

LQC (Loop Quantum Cosmology; derives from vintage-1990s LQG and seems to have solved some problems by specializing to cosmology----it resolves the classical big bang and black hole singularities. Bojowald, Ashtekar, and Thiemann recently gave talks at Santa Barbara about resolving singularities, available as online videos)

 

AQG (Algebraic QG, a new version of LQG invented in 2006 by Thomas Thiemann)

 

spinfoams (the approach with the largest number of researchers---a connection with the Feynman diagrams of matter was found by Laurent Freidel)

 

CDT (causal dynamical triangulations; spacetime builds itself from little building blocks, and there are computer simulations of this----the correct number of dimensions emerges at the macroscopic scale, but an unfamiliar fractal structure emerges at the microscopic scale)

 

QEG (Martin Reuter's "Quantum Einstein Gravity"; he carries through a renormalization program on the straightforward quantization of Gen Rel that people decided didn't work in pre-1970 days---he forces the obvious initial approach to work, and it doesn't quite blow up. "Asymptotic safety" is a strange dark-horse candidate)

 

DSR and Cartan geometry (deformed special relativity; revise the concept of the continuum---replace the vintage-1850 differential manifold with its flat tangent space by a vintage-1923 Cartan manifold with curved tangent space)

 

DSR and NCG (deformed special relativity and non-commutative geometry, a way of obtaining the standard model of matter)

 

Trinions (approach based on spin-networks that also obtains matter, developed since 2005 by Smolin, intriguing but at early stages----gravity is geometry, geometry is a web of spatial relations, matter is twists and TANGLES in that web. The web is made of small simple components: topological elements called trinions; a trinion is a sphere with three punctures)

 

There is a book scheduled to come out this year, edited by Dan Oriti for Cambridge University Press, called Approaches to Quantum Gravity: Towards a New Understanding of Space, Time, and Matter.

This should help to define the field of non-string QG which string theorists (and sometimes even the researchers themselves) somewhat confusingly call "LQG".

All these lines of investigation I mentioned will, I believe, be represented.

Also, this year's "Loops '07" conference in June, if it is successful, will have papers delivered representing all these different background independent approaches.

 

That said, I can try to reply to your questions with reference to the kind of research that people are actually doing.


Only the tuner/builder understands all the effort that went into deciding string gauge, length, and tension, and only he, in tuning, actually retensions the string. . . . . .

 

:D Don't go there, Norman, we already have to contend with the celestial clockmaker, and we don't want to upset his sidekick, the celestial piano tuner, also.......!

 

I am mostly unread and untutored in the matters discussed in this thread, but when I hear the faint sound of lateral thinkers beginning to hammer at locked doors, I just want to join in. Locked doors always arouse my curiosity.


The piano frame sets the lengths. I am not convinced that such a thing exists in the vacuum itself. Massive Higgs, massive cast plate, whatever. I am looking at Planck's constant as the necessary proportionality between energy and frequency of bound states. The Ouroboros forms hoops of only certain sizes determined by a few constants and geometric facts. I shall leave tinkering to those wondering how cosmological space could deviate from flatness by just a little bit. I think it was Martin and Severian who held a good discussion on this last year. . . . .Triangles are certainly a good starting place, as they are the minimal form of a loop, and the logic of three elements is already ensconced in quarks and hadrons. Given time I do intend to develop <m=3> as I have the electron. Given a linear sum of <m=0>, <m=1>, and <m=3> you can see a three-lobed structure where two are the same.


What bothers me more than anything is not just the geometry or structure of spacetime; it is perhaps even more the spacetime dimensionality itself. While the dimensionality may appear intuitively obvious, I don't think this is a valid assumption. It is far too strong. I'd expect there to be a better explanation for the apparent dimensions.

 

OK, I'll try to answer your question. You are right to focus on dimensionality.

 

Of all these approaches, I think only CDT addresses this.

 

Martin Reuter has presented some papers saying that his approach, QEG, converges with CDT and gets some similar results, but that was back in 2005 and I haven't heard more about it since.

 

Dimensionality is a variable which can be measured by various methods, so it is an OBSERVABLE. It does not have to take on only integer values; it can for example be 1.9 or 2.1 or 3.87. The dimensionality of a space can depend on the scale at which you look. It can be 3D, or very close to 3, at a large scale, but it can be less, like 1.9, at a very microscopic scale.

 

In CDT research they run computer simulations and get universes to appear, evolve, and then go away according to their rules of spacetime dynamics. And in the computer, while the universe exists, they make measurements of the dimensionality. And they get strange results like these.

 

There are two natural ways to measure the dimensionality at a point:

A. Hausdorff dimension: you measure the radii and volumes of balls around that point and see how the volume relates to the radius----what power of the radius best approximates the volume. In CDT the spacetime assembles itself from tiny building blocks which are all considered to be the same size, so they have a natural idea of volume----just count the blocks.

 

This depends on the size of the balls you are measuring. If you look at big balls you can get one Hausdorff dimension and if you look at much smaller balls you can find a different Hausdorff dimension.
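To see the ball-counting idea in the concrete, here is a toy sketch (my own Python illustration, nothing from the actual CDT codes): put the "building blocks" on an ordinary 3D lattice, count how many blocks lie within graph distance r of a chosen point, and fit the power law N(r) ~ r^d between two radii.

# Toy illustration (mine, not actual CDT code): estimate a Hausdorff-style
# dimension by counting building blocks in balls of growing radius.
# Here the "blocks" are sites of a 3D cubic lattice; a CDT simulation
# would instead use the adjacency of its simplicial building blocks.
from collections import deque
import math

def neighbors(p):
    x, y, z = p
    return [(x+1,y,z), (x-1,y,z), (x,y+1,z), (x,y-1,z),
            (x,y,z+1), (x,y,z-1)]

def ball_volume(center, r):
    # breadth-first search: count all blocks within graph distance r
    seen = {center}
    frontier = deque([(center, 0)])
    while frontier:
        node, d = frontier.popleft()
        if d == r:
            continue
        for q in neighbors(node):
            if q not in seen:
                seen.add(q)
                frontier.append((q, d + 1))
    return len(seen)

def dimension(center, r1, r2):
    # what power of the radius best approximates the volume:
    # the slope of log N(r) versus log r
    v1, v2 = ball_volume(center, r1), ball_volume(center, r2)
    return (math.log(v2) - math.log(v1)) / (math.log(r2) - math.log(r1))

print(dimension((0, 0, 0), 4, 8))    # about 2.7 at these small radii
print(dimension((0, 0, 0), 8, 16))   # about 2.85, creeping up toward 3

Notice the estimate starts below 3 and climbs toward 3 as the radii grow: the same kind of scale dependence described above.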

 

IIRC "fractal" was named that because fractals often have fractional Hausdorff dimension. In CDT the spaces they generate have a fractal-like behavior at microscopic scale but look normal 3D or 4D at large scale. It is suggestive.

 

B. Spectral dimension: a way to measure the dimension by starting at a point, performing a RANDOM WALK, and seeing how easily you get lost and never find your way back. In high-dimensional spaces the walker returns to home base with very low probability. In low dimensions the walker has a better chance of actually returning to where he started.

In the computer simulations one has a universe appear, and then one can freeze it and study it. One can perform these random walks thousands of times, starting from a point, and EXPERIMENTALLY DETERMINE the probability of returning to the start point; then there is a formula which gives the spectral dimension.

 

In simple Euclidean situations the spectral dimension agrees with the traditional idea of dimension, so it is a good measure.

 

You can control scale by controlling how long you let the walk go for.
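Here is the same kind of toy sketch for the spectral dimension (again my own illustration, on a plain 2D lattice rather than a CDT universe). The standard relation is that the return probability of a t-step walk falls like P(t) ~ t^(-d_s/2), so d_s = -2 * d(ln P)/d(ln t).

# Toy illustration (mine): measure the spectral dimension of a 2D lattice
# by brute-force random walks. The return probability falls like
# P(t) ~ t^(-d_s/2), so d_s = -2 * d(ln P) / d(ln t).
import math
import random

MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def return_probability(t, walks=100_000):
    # fraction of t-step walks that end back at the origin
    # (t must be even; an odd-length walk can never return)
    back = 0
    for _ in range(walks):
        x = y = 0
        for _ in range(t):
            dx, dy = random.choice(MOVES)
            x += dx
            y += dy
        back += (x == 0 and y == 0)
    return back / walks

t1, t2 = 16, 64
p1, p2 = return_probability(t1), return_probability(t2)
d_s = -2 * (math.log(p2) - math.log(p1)) / (math.log(t2) - math.log(t1))
print(d_s)   # comes out near 2, the dimension of the lattice

Shorter walks probe smaller scales and longer walks probe larger ones; that is the knob the CDT people turn when they find something near 2 at very short scales and near 4 at large scales.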

=================

 

I am not advocating any particular approach to background independent QG.

I like it when researchers work on several approaches.

 

Right now I am interested in AQG-----I read some AQG papers recently, and it seemed to me that this approach (which started in 2006) ALSO has the feature that the dimensionality is not chosen ahead of time!

This is not entirely clear and needs confirmation, but I got this impression from the research I read. One thing that is explicit is that AQG does not decide on a fixed topology of spacetime at the beginning.

 

There are various degrees of background independence: how much baggage can you throw out and still get a model that runs? How many prior assumptions can you do away with?


Thanks for your overview of the LQG "flavours", Martin!

 

As a first approach I think you answered my question well enough for me to get a better picture of the LQG philosophy.

 

I'll try to look up CDT.

 

Dimensionality is a variable which can be measured by various methods, so it is an OBSERVABLE. [...] There are two natural ways to measure the dimensionality at a point: A. Hausdorff dimension [...] B. Spectral dimension [...]

 

Am I right that in the above suggestions you are presupposing a distance metric? From first principles, what would a distance between two samples mean, unless possibly interpreted in terms of their correlations?

 

Suppose you throw a die. What is the "distance" between 6 and 2? It seems to me the prior is that all samples are equidistant, if anything. Comments? Did I miss something?
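(A purely illustrative toy check of my own: take the only correlations a single fair die offers, those between the indicator variables 1{X=i}, and every pair comes out the same, about -1/5, so a distance built from those correlations would indeed make all outcomes equidistant.)

# Purely illustrative: for a fair die, the empirical correlation between
# the indicators 1{X=i} and 1{X=j} is the same (about -1/5) for every
# pair i != j, so a correlation-based distance makes all outcomes
# equidistant.
import random

N = 100_000
rolls = [random.randint(1, 6) for _ in range(N)]

def indicator_corr(i, j):
    a = [x == i for x in rolls]
    b = [x == j for x in rolls]
    ma, mb = sum(a) / N, sum(b) / N
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b)) / N
    return cov / ((ma * (1 - ma)) * (mb * (1 - mb))) ** 0.5

# all three pairs come out near -0.2
print(indicator_corr(6, 2), indicator_corr(1, 3), indicator_corr(4, 5))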

 

I'd like to start from just considering a plain set of inputs, possibly ordered and countable (I am not sure yet), without a defined metric.

 

What I picture is that all these things can evolve as data is consumed. Without even consuming the first sample, I have a hard time accepting things like a distance metric. Perhaps the sample space itself isn't complete. I think it may even expand. Suppose you start sampling real numbers. Do you know how large the numbers you get will be? It seems your sample space might be considered to increase on an "as needed" basis...
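A toy sketch of what I mean (mine, purely illustrative): a discrete model whose sample space grows only as outcomes are actually seen, with a lump of probability reserved for "something new". The alpha/(n + alpha) weight is just one standard way to do this, borrowed from the Chinese-restaurant-process predictive rule.

# Purely illustrative: a probability model whose sample space expands
# "as needed". Outcomes never seen before share a reserved lump of mass
# alpha/(n + alpha), the Chinese-restaurant-process predictive weight.
from collections import Counter

class GrowingSampleSpace:
    def __init__(self, alpha=1.0):
        self.alpha = alpha       # willingness to admit brand-new outcomes
        self.counts = Counter()  # support grows only as data arrives
        self.n = 0

    def observe(self, x):
        self.counts[x] += 1
        self.n += 1

    def prob(self, x):
        # predictive probability given what has been consumed so far
        if x in self.counts:
            return self.counts[x] / (self.n + self.alpha)
        # lump of mass for anything not yet in the sample space
        return self.alpha / (self.n + self.alpha)

m = GrowingSampleSpace()
for outcome in [6, 2, 2, 6, 4]:
    m.observe(outcome)
print(m.prob(6), m.prob(1))   # 1 was never seen, yet keeps nonzero mass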

 

how much baggage can you throw out and still get a model that runs? How many prior assumptions can you do away with?

 

This sure is a very good and unavoidable key question. It seems clear that the less informative the priors, the harder we have to work to build the new theory. And at some point there has to be a practical limit. This is of course a little fuzzy, but I try to get rid of as many assumptions "as possible", which really just means that I will do the best I can, given my tiny brain. The priors I accept are motivated by the fact that I simply can't do better at this point.

 

I remember discussing this with some of my old teachers (i.e. those who get paid to do research, professionals that is), and their defence was that the big problems are too complex and take too much time: chances are it would take so long to produce results worth publishing that the research funders might think you are not doing anything and withdraw the funding. The system encourages a "small step size", which means that some larger changes that might be beneficial are effectively suppressed by research politics. Perhaps this is one reason why so little fundamental progress has been made? What do you think, Martin?

 

Personally, being philosophically inclined, I try to get rid of everything that I cannot accept as an axiom or as a sufficiently necessary assumption, wherever I see a possible way to formalize it. One concern is obviously that we pose a question so hard that we cannot solve it within a reasonable time; but consider also the other risk, that we spend a lot of time working out an alternative solution that finally proves not to be good enough after all. Then, if the mistake was in the very foundations of the theory, it may not be as easy as just introducing correction terms... the whole thing might need reworking. Is the second scenario better? So I agree there has to be some balance.

 

/Fredrik


One very fundamental property I would like to see in a scientific method is that it should be as expandable as possible. That is, I think there should be defined "handles", or "expansion points", where all the inconsistencies are collected until they grow big enough to separate from the noise, thus giving birth to new concepts.

 

This is why I expect a new theory to incorporate, naturally and at a fundamental level, an evolutionary or adaptation step, so that the model can evolve and adapt in a more continuous way in response to new data. This is also IMO the most natural way to trace our way back to the "big bang".

 

This may be too complex a task, but at least from the philosophical point of view I feel this is fairly close to what I consider a good question. Should we answer a worse question just because it's easier? This is my objection to some of the current theories, including string theory: it simply doesn't target the questions I want answered. So I just don't find the motivation for it.

 

I am currently not all that well oriented in all the approaches being worked on. I know there are many interesting approaches with parts I like, but so far I haven't seen anything dead on my preferences. Closest to my taste are the Bayesian logic approaches, but I have some doubts about how the sample spaces are treated. Sometimes they are simply "given", and I find that somewhat disturbing. This is what currently keeps me awake at night.

 

/Fredrik


Here is another question regarding string theory. Since Martin seems well oriented in these fields, perhaps he could comment?

 

In defense of a "reformulated" string theory, whatever the name may be.

 

Ten years ago, I recall seeing one possible opening in the "string theory interpretation", given that you accept the underlying procedure. While the concept of the "string" has always seemed somewhat odd to me, what I could possibly motivate is to consider, instead of things, a Gaussian distribution that could be excited. This was my thinking way back, and I guess it is still a possibility, one that would probably be easier than the other stuff I have been talking about lately. Not as well founded, but still possibly a systematic procedure. The motivation for the Gaussian shape (in the continuous case) is the probabilistic one.

 

What I found to be an obvious interpretation is to consider quantization as an induction step: logically it simply means "relating the relations", or the probability of the probability, a kind of higher-order logic. This could inductively be carried to n'th quantization. In the frequentist interpretation this may seem pretty awkward, but in Bayesian thinking it seems natural and has a certain beauty from a philosophical point of view as well.
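A minimal sketch of the layering I mean (mine, purely illustrative, and certainly not a claim about what quantization formally does): a two-level hierarchical model in which the outcome probabilities are themselves drawn from a distribution, here a Dirichlet; the construction can obviously be stacked to further levels.

# Purely illustrative: "the probability of the probability" as a two-level
# hierarchical model. The die's outcome probabilities are themselves
# random, drawn from a Dirichlet; stacking more levels gives the
# inductive "n'th quantization" flavour I have in mind.
import numpy as np

rng = np.random.default_rng(0)

alpha = np.ones(6)                        # level 2: a distribution over distributions
theta = rng.dirichlet(alpha)              # level 1: one randomly drawn die
rolls = rng.choice(6, size=10, p=theta)   # level 0: data from that die

print(theta)
print(rolls + 1)   # report the faces as 1..6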

 

Can someone, maybe Martin, say whether this is anything like what the developed "M-theory" reworkings are leaning towards?

 

If this is anything like it, I would give the "reformulated string theory" some hope! Though it is still fuzzy, and at this point it seems to be only the second-best option if all else fails, which is not out of the question.

 

Comments?

 

/Fredrik


I'll try to look up CDT.

 

There was a short (one-page) article about CDT in the February SciAm.

 

Renate Loll has a bunch of links to popular articles that have appeared in the press (mainly in Europe)

 

http://www.phys.uu.nl/~loll/Web/press/press.html

============

 

If you can scan technical articles and find chunks of them that you can understand (without getting bogged down in the parts you can't), then you have a much wider choice. There are lots of good technical-journal articles on CDT, and many of them have long non-mathematical passages at the beginning and at the end.

 

Renate Loll's website lists some of these too and gives LINKS so you can quickly get the PDF files to look at:

http://www.phys.uu.nl/~loll/Web/publications/publications.html

 

Most of the papers on this list are ones you don't want; a preliminary selection would be:

 

 

Reviews, Overview Articles and Lecture Notes

[8] Quantum Gravity, or The Art of Building Spacetime (with J. Ambjørn, J. Jurkiewicz)

[arXiv: hep-th/0604212].

[7] The Universe from Scratch (with J. Ambjørn, J. Jurkiewicz)

Contemporary Physics 47 (2006) 103-117 [arXiv: hep-th/0509010].

 

 

Causal Dynamical Triangulations in Four Dimensions

[7] Counting a Black Hole in Lorentzian Product Triangulations (with B. Dittrich)

Classical and Quantum Gravity 23 (2006) 3849-3878 [arXiv: gr-qc/0506035].

[6] Reconstructing the Universe (with J. Ambjørn, J. Jurkiewicz)

Physical Review D 72 (2005) 064014 [arXiv: hep-th/0505154].

[5] Spectral Dimension of the Universe (with J. Ambjørn, J. Jurkiewicz)

Physical Review Letters 95 (2005) 171301 [arXiv: hep-th/0505113].

[4] Semiclassical Universe from First Principles (with J. Ambjørn, J. Jurkiewicz)

Physics Letters B 607 (2005) 205-213 [arXiv: hep-th/0411152].

[3] Emergence of a 4D World from Causal Quantum Gravity (with J. Ambjørn, J. Jurkiewicz)

Physical Review Letters 93 (2004) 131301 [arXiv: hep-th/0404156].

 

I am not bothering to give you the links. Her website gives the links so you can click directly on them.

I am also not narrowing down. I am giving you ten where you need to find one that you can read parts of.

Sorry not to be more helpful. Have to go.

