Martin


Posts posted by Martin

  1. Thanks for helping, AJB.

     

    I suspect the problem here is not that Michel doesn't understand, it is that HE HAS SOME AGENDA CAUSING HIM TO PRETEND NOT TO UNDERSTAND.

     

    If he can't find a flaw in what Lineweaver Davis say, then he will fault the way they say it! :D

     

    "In this sense, the universe is self-contained. It needs neither

    a center to expand away from nor empty space on the

    outside (wherever that is) to expand into. When it expands, it

    does not claim previously unoccupied space from its surroundings."

     

    So there are surroundings!

     

    No Michel, they do not say there are surroundings. This is getting boring. They say there are no surroundings.

    (and therefore the expansion does not claim space from unoccupied surroundings----since there are none.)

     

    Spare us the literary criticism, please.

  2. I don't see anything useful in this commentary.

     

    Can anyone else find something useful in it?

     

    Timo answered what seems to be the main objection. Just put ants on the balloon. They will not expand.

     

    The balloon analogy of expansion cosmology does not by itself specify what kind of objects one puts on the balloon. Could be pennies, could be raisins, could be ants. Whatever is chosen to represent stable clusters of galaxies is not going to expand.

     

    Michel, you seem to be just quibbling---arguing for the sake of it.

     

    I don't know where it is appropriate to put that kind of thing at SFN. We don't usually do it here.

     

    ==============

     

    Michel, much of what you say is ostensibly a criticism of Lineweaver Davis' excellent Sci Am article of March 2005.

     

    But the critique is not substantive. It strawmans the article, quibbles, doesn't connect. Comes across as either lightweight or peevish. Do you really want to keep this critique around or should we dispose of it?

     

    For example it is common to say that the BB occurred everywhere. That it was ubiquitous. That's a correct statement about what the standard model says. It had no particular location in today's space. All the space that existed in the early universe around the time of the BB participated.

    Ubique is Latin for everywhere, ubiquitous is English for everywhere.

     

    So what is your objection? You say Lineweaver Davis should not say that the beginning of expansion was ubiquitous because ubiquity is reserved for God. You admonish Lineweaver for that. Give us a break :D:rolleyes:

     

    =================

     

    The next thing I notice sounds like an Intelligent Design troll talking about the impossibility of biological evolution leading to complex life. The argument from incomprehension.

     

    How can structure... [arise]...when an expansion intuitively raises disorder. You need tons of books to explain that, there is no need to present that difficult point right in the beginning.

     

    It happens from gravity, very naturally. A nearly uniform gas cloud naturally condenses into cobwebby structure. Computer sims confirm this.

    You don't need tons of books.

    George Smoot does a good job explaining that in less than 15 minutes.

    Google "Smoot TED". The entire video is 18 minutes but he handles early universe structure formation in about 5 or 10.

  3. WOW!!! That's an incredibly big universe. But isn't that still not big enough for a margin of error of 2% for measuring the universe to be flat?

     

    Actually not! If you take the error bar (the currently available 95% confidence interval) for the curvature, take the upper limit, and calculate the radius of curvature from that, you get very close to 100 billion light years.

     

    An upper bound on curvature corresponds to a lower bound on the radius of curvature. With 95% certainty the radius of curvature (RoC) has to be at least that.

     

    So the circumference has to be at least TWO PI times that. Which is 628 billion light years. In what I wrote I rounded down to 600 so as not to make it seem like high precision. If it is at least 628 then obviously it is at least 600.

     

    My information on the RoC comes from table 2 on page 3 of the relevant WMAP report.

     

    As I recall the RoC is calculated using the formula

    RoC = (Hubble distance) / sqrt(|Omega_k|)

     

    Note that Omega_k = 1 - Omega_tot

    so it has the opposite sign from what you expect. The more positive the curvature the more negative Omega_k is. Just an accident of notation.

     

    If you are still getting numbers much bigger than 100 for the RoC, or much bigger than 628 for the circumference, let us know.
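    Just to make the arithmetic concrete, here is a rough sketch of that calculation. The numbers are placeholders I am assuming for illustration (a Hubble distance of roughly 13.9 billion light years and an upper limit of |Omega_k| around 0.02), not values copied from the WMAP table:

    ```python
    from math import sqrt, pi

    # Assumed illustrative numbers, not the actual WMAP table entries
    hubble_distance = 13.9e9    # light years, roughly c/H0
    omega_k_limit = 0.02        # assumed 95% upper limit on |Omega_k|

    roc = hubble_distance / sqrt(omega_k_limit)  # lower bound on the radius of curvature
    print(roc)                  # ~9.8e10  -> close to 100 billion light years
    print(2 * pi * roc)         # ~6.2e11  -> circumference of at least ~620 billion light years
    ```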

  4. A and G are not in the chronological past of the origin.

     

    Draw light cones originating at the points A, ..., G. See which cones contain the origin.

     

    From your answer, I can conclude that ONLY points B & F are observable.

     

    You concluded wrong. AJB meant you to see what SOLID cones (drawn forward in time) contain the origin.

     

    The correct answer is B, C, D, E, and F

     

    You said that only B and F.

     

    This thread is ridiculous since both Sisyphus and I have tried repeatedly to straighten things out and you will not listen. You keep arguing.

     

    It is too stupid to be suitable for Astro Cosmo, so my patience has run out and I will move it.

     

    If you want to discuss in astro cosmo please don't argue with the mods and other staff so much. It's boring and wastes time.

  5. Right. that is part of the definition of the word "observable".

     

    I didn't want to include all sciences in the debate. I was speaking only about astronomic data.

     

    I was not aware that neutrinos could travel at speed 50% c. That means the observable cone has a non-negligible width...

     

     

    Observable by LIGHT is not part of the definition of the word observable.

     

    I don't accept your arbitrary limitation on what constitutes astronomy.

    Divisions between fields are not absolute and immutable.

    For example counting isotopes in nearby matter lets us "see" processes in the very early universe which happened much earlier than what we see with light.

     

    All kinds of information can be relevant to the astronomer, not merely photons.

     

    An intelligent person observes with all his senses.

     

    Your picture of a thick cone that is spread out over, say, everything between 45 degrees and 30 degrees seems ugly and arbitrary.

     

    Why not have the cone be everything between 20 degrees and 45 degrees?

     

    What you really need is a solid cone.

     

    Neutrino astronomy is so far very primitive. One can detect only the highest energy neutrinos---those going near c.

    But there is no theoretical reason that one should not eventually detect less energetic ones.

     

    I forget what the temperature of the neutrino background (analogous to the CMB) is supposed to be. Something like 1.95 kelvin---roughly 2 kelvin.

    You can work out the speed if you are interested. And that would not be the lowest possible neutrino energy.
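    For anyone who wants to try that, here is a rough back-of-the-envelope sketch. The neutrino mass is not known, so the 0.05 eV figure below is purely an assumed illustrative value; the 1.95 K background temperature is the standard theoretical expectation:

    ```python
    from math import sqrt

    # Rough estimate of a relic background neutrino's speed (illustrative only)
    k_B = 8.617e-5            # Boltzmann constant, eV per kelvin
    T_nu = 1.95               # expected neutrino background temperature, kelvin
    pc = 3.15 * k_B * T_nu    # mean momentum (times c) of a thermal fermion, in eV
    m_c2 = 0.05               # ASSUMED neutrino rest energy in eV -- not a measured value

    E = sqrt(pc**2 + m_c2**2)  # total energy
    print(pc / E)              # v/c comes out around 0.01, i.e. roughly 1% of c
    ```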

  6. Michel you put in one of my favorite pictures.

    BBcurved.jpg

    First saw this years ago at Ned Wright's site. I had been wondering if lightcones wouldn't actually be teardrop-shape because of expansion (though normally drawn cone-shape) and seeing this was reassuring and even something of a relief.

     

    On the other hand those horn-diagrams are just artist renditions and not to be taken too seriously---not to scale or anything, as you probably realize.

     

    You seem to want to regard the observable universe as consisting ONLY of those events in the past which we observe at the present moment by means of LIGHT.

     

    In other words you want to equate the observable with the blue line in the picture. Not the events contained inside the teardrop shape.

     

    So, let's not quibble about words.

    Cosmic rays and neutrinos are matter. The solar wind and fossil dinosaur bones and the air trapped in Greenland ice are MATTER. I want to stress that we observe by smelling and feeling as well as by seeing. You want to stress visual observation, using only light.

     

    So go ahead and limit your idea of the observable universe to light. Just don't try to impose that on other people.

     

    Keep in mind that neutrinos can travel all different speeds (50% c as well as 99% c) and there are neutrino telescopes. And it is, in principle, possible to see farther back into the past with matter telescopes than with light telescopes.

     

    There are important parts of the past which cannot be observed with light which however can be observed with neutrinos.

    For example events which occurred when the age of expansion was much less than 300,000 years.

     

    There is a lot more to say about this. Measuring element and isotope abundance is a way of telling about events way earlier than expansion age 300,000.

     

    But light, including microwave, only takes us back to expansion age estimated at 380,000 as you doubtless know.

     

    So limiting your mind to the blue line is curiously like putting wax in your ears and nostrils. You throw away part of what is actually observable.

    I agree with what Sisyphus says here. Folks, remember that cosmology is a mathematical science, meaning that it deals with equation models and fitting them to the data so that you get the best fit----which translates into being able to predict the next round of data. As the instruments constantly increase resolution and depth, more keeps coming in.

     

    Issues like how big is the universe are not what it's about. It is about a predictive theory that has to fit literally millions of data points. And it has to agree with our well-tested theory of gravity (a theory of the universe's geometry).

     

    In the AstroCosmo forum our point of departure is the current standard model (LCDM, for lambda cold dark matter). Understand that first, then vary it all you like (if you too can fit the data and match the behavior of gravity/geometry as described by General Relativity).

     

    The starting point is not philosophy. If you want to philosophize about the universe, try Spec forum.

     

    What we mean by the size of the universe is simply the size of the current model. And because the model has errorbars, ranges to its parameters, there will naturally be uncertainty about that.

     

    That is what I was saying, yes. Anyway, it definitely doesn't have a center. The way expansion behaves isn't consistent with expanding away from some central point, like a conventional explosion. That doesn't mean it must be infinite, though. A relatively simple alternative is that it folds back on itself. i.e., that a long enough straight line just points back at itself. Think of it like a computer game where going off one end of the map puts you on the opposite side, but 3D. So the volume is finite, but it has no center and no edges.

     

    I forget where I saw it, but I remember some physicist claiming that if the universe is finite, then the ratio of universe to observable universe would have to be at least that of the Earth to a 1 inch sphere. I have no idea if that's a consensus or current view.

     

    At present, AFAIK, the WMAP data that came out in 2008 contain the best, most widely accepted estimates of the LCDM parameters, including an estimate of the size of the universe (which was certainly not the first thing they were after) if it is finite. According to the S3 picture (the 3D spherical surface that gib is talking about), the circumference is at least 600 billion lightyears.

     

    I have to do something else, back in a while. Will try to say a bit more about this.

     

    Hmm... I have difficulty dissociating this scenario with the model of a 4D sphere on which our 3D universe is its surface. I guess I could chalk the 4D sphere up to a useful conceptual tool only and that the real nature of space - that is, how it wraps around like you say - to something beyond my ability to comprehend.

     

    But is there anything wrong with imagining an infinite amount of matter and energy filling the universe?

     

    It's no big deal, not to stress. I have a hunch you will without even trying get the concept of a closed 3D universe.

    What Sisyphus was describing was a toroidal 3D universe. Like a PacMan 2D square with opposite edges identified, but the whole 2D world jacked up one dimension. So a PacMan 3D cube with opposite faces identified. Try daydreaming that you live in such a thing. A room where you can pass out thru the east wall and the arm you stick thru the wall comes in from the west wall.

     

    In cosmology one doesn't hear very much about the toroidal 3D topology. But it is a workable example of closed 3D space, which is how S. used it in his post. It's the surface of a donut, jacked up one dimension.
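    If it helps, here is a tiny toy sketch of that "PacMan cube" idea (the box size and the step are made-up numbers, just for illustration):

    ```python
    # A toy "PacMan cube": positions live in a 3-torus of side L, so stepping out
    # through one face brings you back in through the opposite face.
    L = 10.0

    def step(position, displacement):
        # move, then wrap each coordinate back into [0, L)
        return [(p + d) % L for p, d in zip(position, displacement)]

    print(step([9.5, 5.0, 5.0], [1.0, 0.0, 0.0]))  # [0.5, 5.0, 5.0] -- out the east wall, back in from the west
    ```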

     

    The more usual case is the so-called hypersphere or S3. This is the surface of a balloon, jacked up one more dimension.

    A balloon surface is S2, the so-called "two-sphere". It doesn't have to have a surrounding 3D space but can exist on its own as a closed 2D world. (That's important to realize.)

     

    The corresponding thing in one higher dimension also does not have to have a 4D surrounding---it can exist on its own. That's probably the most important thing to realize in all of differential geometry. Geometries can be experienced from within, curvature can be defined and measured from within---so they don't have to be embedded in higher dimensional surroundings.

     

    Use your imagination and try first to have the experience of a 2D creature living in the two-sphere. Slide around on the surface of a balloon. Go exploring.

     

    Then use your imagination and try to experience what it would be like to live in a reasonable size three-sphere. One not too large. So that you could circumnavigate.

     

    At this point we don't really have the right to believe finite and reject infinite----or the right to believe infinite and reject finite. We don't know. It could be infinite. It could be a finite three-sphere with circumference > 600 billion lightyears.

    There is virtually no difference numberwise. Both versions of the model give almost exactly as good a fit to the data.

     

    So it's incumbent on us to get used to imagining it both ways. It could be either one. Future observations will probably decide the issue.

  8. The sum of the intersection of all these cones surfaces will give you the "observable Universe"...

     

    Information travels at every possible speed and therefore we are talking about a solid cone not just the hollow surface of a cone. Is that how you picture it?

     

    I don't understand what you mean by the sum of the "intersections" of cone surfaces.

     

    You can think of it more simply. Just the sum of all the cone surfaces of each possible angle from 0 to 45 degrees.

     

    But it is even simpler to think of a solid 45 degree cone, isn't it? I find it simpler that way.

     

    To repeat something I mentioned earlier, the EARTH AS IT WAS 10 MILLION YEARS AGO is part of our observable universe because we can drill holes and take rock samples, take ice cores, find fossils, measure sequestered isotopes.

    These are signals which travel at very slow speed compared with light; they hardly seem to travel at all. But they are signals which carry info to us about the universe.

  9. You are assuming that information travels always at speed c.

     

    The rule is that information can travel slower, but cannot exceed c.

     

    Therefore we can, today, be receiving information from events which are inside the cone, not on its surface. We can be observing such events.

     

    So C D and E are part of what is observable. As well as B and F.
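    Here is a toy illustration of that point, with event coordinates I am simply making up for the example (the real positions of A through G depend on the diagram in question):

    ```python
    # Toy check in units where c = 1: an event at (t, x) with t < 0 is observable
    # at the origin if a signal travelling at some speed v <= 1 can reach us,
    # i.e. if it lies INSIDE or ON the past cone: |x| <= -t.
    def observable(t, x):
        return t < 0 and abs(x) <= -t

    # Hypothetical coordinates, invented purely for illustration
    events = {"A": (-1, -2), "B": (-2, -2), "C": (-2, -1),
              "D": (-2, 0), "E": (-2, 1), "F": (-2, 2), "G": (-1, 2)}
    print([name for name, (t, x) in events.items() if observable(t, x)])
    # ['B', 'C', 'D', 'E', 'F'] -- the surface events B and F plus the interior ones
    ```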

    ============

     

    I think what you are doing may be mere quibbling about words. How the term "observable" is to be defined.

     

     

    Have to go, no time to spell it out.

     

    Think of things like cosmic-ray telescopes and neutrino telescopes. Cosmic rays and neutrinos travel slower than light---nearly, but not quite, c.

     

    Think of other ways we image the world. Think of other ways information about the universe travels, slower than light. Other ways the universe is observed.

  10. This is not a theory. It is just the way I feel things must be,... The entire Observable Universe is upon the SOL diagonals...

     

    There are neutrino telescopes.

     

    There are cosmic ray telescopes.

     

    These signals travel less than the speed of light.

     

    Not all information about past events travels exactly at the speed of light.

     

    When a geologist takes a core rock sample and studies its sediment layers, he observes something about past events. When a paleontologist discovers a dinosaur bone, he observes---he gathers information about past events. The information has barely traveled at all, and certainly not at the speed of light.

     

    You have a feeling that the whole observable universe, all past events about which we gather info, and from which we receive signals, must be on the surface of the past lightcone. You have a feeling that all relevant information must be speed-of-light signals.

    I am not sure how to categorize this feeling, how to classify it.

    You present it as if you want to stimulate discussion! But as you say it is not a scientific theory, it is just a feeling.

     

    It is not supported by physical evidence. Indeed it is contradicted by the real-world evidence. I don't see it as a basis for scientific discussion. If anything it could spark a vacuous debate about words and feelings.

  11. Though I am unsure if it applies to pre bang conditions such a question is why I am so interested in wondering if conservation of energy can be violated. I think if energy conservation could be violated it would allow for a whole slew of weirdness, but on that note I am also unsure if energy conservation laws means that something prior to big bang had to exist.

     

    In short, from what I know it's sort of like the theory of evolution. Biology indeed studies possible environments or whatnot that could spawn life from inanimate matter, but it's not absolutely required for the theory to work scientifically in terms of understanding evolution. With that said, from what I know current physical models of the universe, and what supports them like tests and observations, currently cannot scientifically explain pre big bang.

     

    Two interesting observations.

     

    My comment about your comparison with biology would be that you never ultimately prove a theory true; theories are meant to be tested and improved. You can only show a theory is false---makes the wrong predictions---doesn't match the data.

     

    But theories that keep on matching gradually gain cred.

     

    So as an example I will take one theory, LQC, that I'm fairly familiar with.

     

    Current work by Aurelien Barrau and Julien Grain is aimed at testing LQC using CMB measurements.

     

    LQC predicts nearly the same stuff as classical General Rel cosmology but with some quantum effects making a slight difference.

     

    LQC could turn out to be the simplest explanation for what we see today, in the CMB and the overall structure of the cosmos, patterns in the distribution of radiation and matter.

     

    It could, then, gain credibility. Could. It stands a chance, but we don't know. And in LQC the BB proceeds from a prior phase of spacetime and matter which looks much the same as ours except that it is contracting. There is an intervening very high density regime where quantum gravity effects dominate, making gravity repulsive (instead of attractive) and causing a rebound.

     

    Those are features which were not put in by hand---they just turned out that way when the standard cosmology equations were quantized LQG style and LQC was formulated. Somewhat surprisingly, the classical singularity went away.

     

    What LQC therefore has to do is pass tests based on what we can observe currently about the universe.

    If it passes and gains cred, particularly if it is economical as well, offering the biggest bang of explanatory power for the least buck of complication---then it will bring along a certain pre-BB picture.

     

    What you test is the entire model; you do not specifically test one part of the picture (the pre-BB part), you test the organic whole.

     

    There are several other researchers working on the testing problem but just as a sample here are papers by Julien Grain and Aurelien Barrau:

    http://arxiv.org/abs/0911.1625

    http://arxiv.org/abs/0910.2892

    http://arxiv.org/abs/0902.3605

    http://arxiv.org/abs/0902.0145

    http://arxiv.org/abs/0911.3745

     

    As a more detailed sample, I will copy the abstracts of the last two papers:

     

    Loop quantum gravity and the CMB: toward pre-Big Bounce cosmology

    Aurelien Barrau

    Proceedings of the 12th Marcel Grossman Meeting on General Relativity

    (Submitted on 19 Nov 2009)

    This brief article sums up the possible imprints of loop quantum gravity effects on the cosmological microwave background. We focus on semi-classical terms and show that "Big Bounce" corrections, together with the "pre Big Bounce" state, could modify the observed spectrum.

     

    Loop Quantum Cosmology corrections on gravity waves produced during primordial inflation

    J. Grain

    To be published in the AIP Proceedings of the 'Invisible Universe International Conference', UNESCO-Paris, June 29-July 3, 2009

    (Submitted on 9 Nov 2009)

    Loop Quantum Gravity (L.Q.G.) is one of the two most promising tentative theory for a quantum description of gravity. When applied to the entire universe, the so-called Loop Quantum Cosmology (L.Q.C.) framework offers microscopical models of the very early stages of the cosmological history, potentially solving the initial singularity problem via bouncing solutions or setting the universe in the appropriate initial conditions for inflation to start, via a phase of super-inflation. More interestingly, L.Q.C. could leave a footprint on cosmological observables such as the Cosmic Microwave Background (CMB) anisotropies. Focusing on the modified dispersion relation when holonomy and inverse-volume corrections arising from the L.Q.C. framework are considered, it is shown that primordial gravity waves generated during inflation are affected by quantum corrections. Depending on the type of corrections, the primordial tensor power spectrum is either suppressed or boosted at large length scales, and strongly departs from the power-law behavior expected in the standard scenario.

     

    Here's how I would respond to your second point:

    ... from what I know current physical models of the universe, and what supports them like tests and observations currently cannot scientifically explain pre big bang.

     

    I think you want to put the emphasis on testing, and not on explanation. LQC has a good simple explanation for pre-BB, in the sense of offering a simple mechanism that over-rides the singularity in classical theory.

     

    What is missing is therefore not explanation but some convincing tests. We may or may not get tests (it looks like we probably will given the work of Barrau Grain and others.) And if we get tests, LQC may or may not pass them.

     

    So I would say that we are not lacking scientific explanation, we are lacking scientific verification. But however you say it, it's work in progress and still quite incomplete.

     

    ========================

    A new part of the picture that just came out in the past month is an unexpected special compatibility between the LQC and inflation:

    http://arxiv.org/abs/0912.4093

    ... I'm sure that before the big bang there must be something that occupies space the way we know it here on earth, ...

     

    That sounds like a working assumption that a lot of scientists make in studying the universe. Indeed it would be surprising if it turned out to be wrong.

     

    Considerable current research is going into developing models that extend back before BB. And additional effort goes into figuring out ways to test those models by astronomical observation.

     

    If you are a reader, Lucky, and are interested in finding out what the current pre-BB models are, just say and I will dig up some links. I don't know any popular writings in English, but there is plenty written at the professional level.

     

    ...It's a speculative question and should perhaps be moved, since none of this is really Science News.

     

    I suppose it depends case by case where you move threads like this.

     

    If someone were to ask a legitimate science question like "What do today's scientists, working cosmologists, think might have led up to the BB?" or "What models have they developed to explain the BB?" then I'd say move the discussion to the AstroCosmo forum.

     

    But another time it might be someone who thinks that he has solved the problem by his own common sense and that he's right and all the world's scientists are wrong because they think that before the BB there was Nothing.

    I'm not sure what to do in that case. It certainly isn't News, so maybe it is Speculation.

    The primary error is that starting off that way misrepresents the current situation in cosmology.

  13. Wow, that would really be something. Has there ever even been one that close in recorded history?

     

    If it actually did happen, it would be an amazing sight. Personally I never heard of anything like that happening in recorded history. Novas are fairly common, but supernovas are rare---I don't recall how rare they are estimated to be. Maybe you know.

     

    Hopefully we'll hear some more about this. Chance to pick up information.

     

    With core-collapse SNe, like type Ib, or II, the SN leaves a remnant, like a neutron star. But did you know that with type Ia it is different? Apparently there is no remnant left after the explosion.

     

    With type Ia it is a modest-size star that is not massive enough to do any more fusion----so after a while fusion stops with elements like Oxygen. But the star is gradually accreting material from a binary partner. And it gradually builds up to the critical mass of 1.4 solar. And then the whole thing goes, by abrupt nuclear fusion, leaving no remnant. Think of it as an "oxygen bomb" analogous to a hydrogen bomb----powered by fusion rather than by collapse.

     

    At least I think so, haven't checked. Maybe you can confirm. More usual SNe are powered by the sudden collapse, releasing gravitational energy, and the resulting rebound shock. So it's really different.

  14. T Pyxidis, estimated 3000 lightyears from us, is being presented in science media as a potential supernova.

     

    I'm not sure this is right, or that it will be confirmed. But I respect Steinn Sigurdsson. He's a reputable astrophysicist---in my opinion reliable and hardheaded, not a speculator. Judge for yourself from his other blog posting. Here's what he says about the T Pyxidis news:

    http://scienceblogs.com/catdynamics/2010/01/has_a_supernova_type_ia_progen.php

     

    The reports come from a group at Villanova University led by Prof. Edward Sion. What I get from this is that T Pyxidis is extremely interesting but most likely far enough away not to pose a threat to Earth, although this remains to be confirmed. And in any case I do not see how the data imply the possibility that the star will go supernova.

     

    The explosion they think T Pyxidis is preparing for is a type Ia. It is especially important to improve our understanding of the mechanism of type Ia supernova explosions because they are used as a standard candle in establishing distances in cosmology. From Sigurdsson's post I get the impression that although much is known about type Ia, from hundreds of observations, the mechanism that causes them is still imperfectly understood.

     

     

    Here's something from the media:

    http://www.spacedaily.com/reports/T_Pyxidis_Soon_To_Be_A_Type_Ia_Supernova_999.html

     

    Wikipedia authors seem to think the star could present some hazard:

    http://en.wikipedia.org/wiki/T_Pyxidis

     

    The star has recurrently gone nova. Flareups have been recorded in 1890, 1902, 1920, 1944 and 1966 (or 1967), but not since then. The increasing interval suggests to me that the star is actually losing mass, because it takes a progressively larger build-up on the surface for a nova flash to happen.

    ... waiting for something like the Scale Factor. Correct me, but the expansion of space is often mentioned as a "scale factor", which is a very weird and difficult concept for common people.

     

    Let's humanize the scale factor. Around 1922 a wonderful young Petersburg Russian named Alex Friedman came up with a formula for distance, technically a metric, that had a scale factor a(t) that could vary with time.

     

    He had a ridiculously bulbous head, like a lightbulb, or a comicbook alien.

    And at one time he held the world record for altitude. Ballooning.

    Probably without proper oxygen equipment. Incredible nerve, an adventuresome curious guy.

     

    So part of the formula had this factor a(t) multiplying it.

    It was a spacetime metric. The spatial part had the scale factor on it.
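    For the spatially flat case the line element has the familiar form below. (This is just the standard textbook way of writing it, not a quotation from Friedman's paper.)

    \[
    ds^2 = -c^2\,dt^2 + a(t)^2\left(dx^2 + dy^2 + dz^2\right)
    \]

    The factor a(t) multiplying the spatial part is the scale factor.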

     

    The amazing thing was that this metric was a solution to the 1915 equation of General Relativity. It was a solution that Einstein had not thought about and it seemed so strange to Einstein that at first he totally rejected it. But after a while he got around to agreeing and he recommended Friedman's paper for publication in the Zeitschrift für Physik.

     

    In Cyrillic his name is spelled with one n, Friedman, but the German spelling is Friedmann, so in an encyclopedia you will likely see it either way.

     

    Soon after publishing his papers, which provided the model still used by all cosmologists, the young Friedman died of typhoid fever. That was in 1925, as I recall.

     

    Look him up; he gave us the scale factor and the Friedman model. A Belgian named Lemaître later came up with something similar---so some people call it "Friedmann-Lemaître". Call it whatever; it's the math model of the universe that gives a meaning to the scale factor, an increasing function of time a(t) that describes the expansion history of space.

     

    The 1915 Einstein equation governs metrics. Metrics specify a geometry of the universe. That's what it's about. To be interesting, a metric has to be a solution of the Einstein (Gen Rel) equation.

     

    http://www-history.mcs.st-and.ac.uk/Biographies/Friedmann.html

     

    http://en.wikipedia.org/wiki/Alexander_Friedmann

  16. If you haven't seen this bit about the Scientific Peer Review system circa 1945 you may wish to check it out:

    A senior faculty member and his grad students/postdocs discuss the review of some research they have submitted for publication.

    A link was given to this earlier, as I recall, but some people may have missed it.

  17. qft :)

     

    hi Michel and iNow,

    thanks for the favorable comment, it's encouraging.

     

    iNow, as I recall from your past comments we have a fairly similar perspective on cosmology and you have some hands-on experience with the mainstream model embodied in Wright's and Morgan's cosmology calculators.

     

    Michel, my impression is you have mental energy that you put into assimilating and questioning new ideas. I have no idea of your actual age. The avatar is of a young person but you don't seem that young. You don't have fixed, set ideas AFAICS.

     

    It's risky getting into these borderline philosophical areas like what is distance what is time what is motion. How do we do geometry if we don't do it the way it is taught to adolescents in Middle School? What causes Euclidean geometry to work---why is it such a good fit to reality in ordinary circumstances? Part of the risk is that I will say something wrong and have to take it back or try to put it another way. Part of the risk is that by engaging these questions prematurely Michel just gets confused.

     

    Like rock climbers, be sure you have a safety rope. Be sure you can go back to your earlier world view, your previous practical understanding of time distance motion. In case of a wrong move, or somebody's foot slips.

     

    Michel, I haven't read every recent post on the SFN forums. I know only a small part of what you have written.

     

    So please fill me in. Are you familiar with those red and blue blotchy oval sky maps of the microwave background?

    Produced using data from various spacecraft missions: COBE in the 1990s, WMAP in the 2000s. And now Planck for the next couple of years. Until its liquid helium runs out.

     

    Do you have a mental picture of the black body thermal radiation curve? Not exact I mean, just a rough picture. The kind of lop-sided bell curve?

    It's a fairly deep feature of nature. Max Planck got the concept of it in 1900 and it triggered quantum mechanics. There is a different curve for each temperature. They all look the same, just morphed right or left to favor longer or shorter wavelengths. Same lopsided bell-curve shape but the peak is at a different place. The curves plot radiation intensity versus wavelength, or alternatively they plot intensity versus frequency.

     

    I have some things I want to say about time distance and motion. But first I want to make sure those oval maps of the CMB sky temperature are familiar. And also to mention that the way radiation sky temperature is measured is to gauge the intensity in each small wavelength bracket and plot a curve---and compare that observed curve with the theoretical one that Max Planck discovered. Adjusting the temperature parameter in the theoretical curve, so you get the snuggest fit to the data, ultimately tells you the temperature of the observed radiation.
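    Here is a small sketch of that fitting procedure using made-up noisy data. The frequency band, the noise level and the use of scipy's curve_fit are all just illustrative choices, not how the actual satellite pipelines work:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    h, c, k_B = 6.626e-34, 2.998e8, 1.381e-23   # SI constants

    def planck(nu, T):
        # Planck spectral radiance B_nu(T), W m^-2 Hz^-1 sr^-1
        return (2.0 * h * nu**3 / c**2) / np.expm1(h * nu / (k_B * T))

    # Made-up "observations": the true 2.725 K curve plus 1% noise over an assumed band
    nu = np.linspace(30e9, 600e9, 50)
    rng = np.random.default_rng(1)
    data = planck(nu, 2.725) * (1.0 + 0.01 * rng.standard_normal(nu.size))

    # Adjust the temperature parameter for the snuggest fit to the data
    (T_fit,), _ = curve_fit(planck, nu, data, p0=[3.0])
    print(T_fit)   # comes out close to 2.725 K
    ```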

     

    I'm no great expert on the CMB, just have a common-sense understanding of it as an interested bystander. I want to check to make sure that we are on common ground here. That you have some comparable familiarity. Both on the same page. I'll explain why later.

    ===============================

     

    Well, I guess I'll go ahead and say a bit without waiting for response to the above.

     

    In mainstream cosmo it is generally assumed the oval CMB temperature sky map is a photograph of the ancient universe (AU). Absolute motion relative to the AU is clearly revealed by a doppler dipole. In preparing the map, our own motion is already deducted. The solar system is moving about 370 km/s in the direction of the Leo stars and that causes a hotspot in the initial map of observations. And that hotspot is taken out already when you see the map.

     

    So we have a universal idea of a stationary observer (not moving with respect to AU) and we map the sky from the ideal standpoint of a stationary observer who dropped out of the solar system and of our galaxy at a certain moment. It doesn't make a lot of difference because 370 km/s is so slow. It's all approximate anyway. But that's the ideal.

    =================================

     

    On the topic of motion in general. In order to define and talk about motion you seem to need a specified observer. At least a definitely specified imaginary observer.

    The ideas of time and distance depend on the observer. So does the choice of which events are simultaneous. It seems to me that for spatial distance to have a meaning, it has to be between simultaneous events. Otherwise the separation is a mix of temporal and spatial separation. It seems you can get different answers for distance depending on how you measure.

    It is often said that in cosmo there is no one obvious most natural definition of distance. There are several distances---Wright's calculator gives several and there are others.

     

    =================================

     

    On the other hand in practice the situation is not as vague as that suggests because a lot of the actual work is done using an unambiguous idea of distance based on observers stationary with respect to the ancient universe. Not moving with respect to CMB. It is one of the easiest things to refer to, a kind of obvious landmark.

     

    Before the CMB was observed, they already had the same idea of stationary; they just called it "comoving with the Hubble flow". That means stationary with respect to the expansion process as a whole. There are other ways of telling that we are moving 370 km/s in the direction of the Leo stars. One doesn't need the CMB, but that is the modern accurate way.

  18. ... In Gen Rel there is no substance called space, there is only geometry and the geometry is dynamic and evolves according to certain rules.

     

    Since there is no substance called space, ...

     

    In General Relativity, there is no substance called space.

    The subsequent discussion assumes we are in the context of Gen Rel.

     

    I can't keep putting the words "in Gen Rel" into each sentence.

     

    So please keep the context in mind, as a qualification.

     

    No absolute claim has been made that "there is no substance called space." What I'm saying is that in the mathematical framework we use in describing the universe, there is no substance called space.

    There is no mathematical entity in the theory that expands----there are distances, describing geometry. And of course there is matter.

     

    So when people popularize---when they talk about the current models of the universe in layman terms---they actually give a distorted impression when they say "expanding space" as if it were a substance.

     

    THERE ACTUALLY COULD BE SUCH A SUBSTANCE, like an "ether". But there is no such thing in the theory. Therefore it is a miscommunication.

     

    I am always a little suspicious when science claims an absolute. "Since there is no substance called space ... " We are certain this statement is true because .... ?

     

    Sorry if I wasn't clear enough. I was not claiming an absolute certainty that there is no substance that expands. Such a thing might be discovered one day. But in Gen Rel, the basic math model we use in cosmology to describe and predict, to organize and fit data, there is no such substance. And popularizers mislead people if they speak as if there were.

     

    Intentionally or not, you were giving science a bad rap in the sentence I quoted. Or scientists, or just me. Whoever the suspicion was being directed at.

     

    A lot of people (not necessarily yourself) seem to have an anti-science bias. And they tend systematically to mis-state what scientists are trying to say, to make the message look doctrinaire, and cast doubt on its credibility.

    This is a kind of straw-man debating technique where one misrepresents what another person is saying in order to discredit it. When this sort of thing happens to me, I reflect that it is often my own fault for not being clear enough.

     

    Other times I think the main fault is with popularizers. There are scores of them, although the only names I can think of at the moment are Stephen Hawking and Brian Greene. They are doubtless not the worst.

     

    What you said before did not involve a nozzle. You said

     

    ...Furthermore, if we removed the walls of the box, giving the atoms infinite space in which to disperse, what 'would' be the maximum velocity achieved?...

     

    Now you imagine a surrounding vacuum chamber, a 100 ft cube.

    So a little box with 100 molecules sits inside the big cubical vacuum chamber. Somehow, in the twinkle of an eye, you instantaneously "remove the walls of the box" as you said.

    Now the atoms have more freedom to roam about.

     

    I told you I thought there would be no change in velocity. In other words no change in the temperature of the gas.

    The random thermal motion just has a larger scope.

     

    However you are now asking about something different. You now want there to be a nozzle. A channel in which a pressure gradient is temporarily maintained and pressure converted into directed velocity.

     

    I'm not sure how this is supposed to relate to galaxies. Maybe you should refine the image and try to make it relate to the universe in some way.

     

    You should probably note that relative to the background, galaxies are nearly stationary. They have random motions which, when possible to measure, turn out to be a few hundred km/second at most. Whereas light goes about 300,000 km/second. That is, galaxies move on the order of 1/1000 of the speed of light.

     

    However most of the galaxies which we can see with, say, the Hubble telescope are receding from us faster than the speed of light. That is, the distances from us to them are increasing at more than 300,000 km/second.

     

    Recession is not the same as ordinary motion. It does not get you closer to anywhere. It has no destination. It is not limited by the Special Relativity Speed Limit. It is just a rate of distance increase.
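    To put a made-up but representative number on that, here is a rough sketch. The Hubble rate of 70 km/s per Mpc and the 20 billion light year distance are assumed round figures, chosen only for illustration:

    ```python
    # Recession rate of a distant galaxy: v = H0 * D (assumed round numbers)
    H0 = 70.0 / 3.086e19      # 70 km/s per Mpc, converted to 1/s
    ly_km = 9.461e12          # kilometres in one light year
    D = 20e9 * ly_km          # a galaxy at a proper distance of ~20 billion light years
    v = H0 * D                # rate at which the distance to it is increasing, km/s
    print(v / 3.0e5)          # ~1.4 -- about 1.4 c, with no ordinary motion involved
    ```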

     

    A distant galaxy may give you the impression that it is moving and going somewhere. But it is not getting closer to anything. There is nothing ahead of it that it is getting closer to. It has no direction of motion. It is getting farther away from everything. It has no momentum associated with this change in geometric circumstance. Distances from it, as from us, can change at rates exceeding c. Because that change is not ordinary motion.

     

    So somehow you want to relate this to gas in a box? It sounds like an analogy with very limited applicability. Useful as a visualization but not something that would make for a reasonable numerical comparison.

     

    BTW Happy New Year! I hope we all learn some new things in 2010 and that the experience is on the whole pleasant.

  19. In the box thought experiment, the box sits in a static geometry. We don't put General Relativity into the picture (which allows geometry to change dynamically, so that distances between stationary points can change.)

     

    ...

    Furthermore, if we removed the walls of the box, giving the atoms infinite space in which to disperse, what 'would' be the maximum velocity achieved?...

     

    No gain in velocity, since there is no repulsive force. Remember we are thinking about this in the static framework of conventional Euclidean geometry, which is good on ordinary time and distance scales. If we were thinking in terms of cosmological scales---of the molecules' behavior over billions of years of time and millions of lightyears of distance---then Gen Rel geometry would complicate the answer; there would be noticeable effects of nature's dynamic geometry.

     

     

    But now you begin putting Gen Rel into the picture. You start talking about expansion cosmology.

    The other point being ... it seems illogical that 'space' is actually expanding between any two atoms, anymore than 'space' is expanding between galaxies.

     

    "Space expanding" is just what they tell children and laymen. It is not a good way to think (IMHO) and it is not what the Gen Rel math says. In Gen Rel there is no substance called space, there is only geometry and the geometry is dynamic and evolves according to certain rules.

     

    Since there is no substance called space, it is a bit confusing to speak of it as expanding. However spatial geometry can change with time. That is the whole point of "non-Euclidean". Nature does not provide us with exact static Euclidean geometry. When and where the geometry looks Euclidean that is because something CAUSED it to adopt that pattern. Gen Rel is the theory of geometry and what causes it to be the way it is and what causes it to change. It is a successful theory that has passed many tests. Euclidean static geometry is wrong, we know. Gen Rel dynamic geometry is remarkably close to right except in certain extreme circumstances, and those are being worked on as we speak.

     

    To get a useful idea of distance in Gen Rel, one has to specify how one wants to define the present moment. Which events are simultaneous. How you do that is to some extent arbitrary, although working cosmologists dealing with real data have agreed on a very natural-seeming way to do it, so it doesn't look arbitrary.

     

    Once one has specified how to slice the 4D history into slices of simultaneous events, one can give a meaning to distances.

     

    For a working cosmologist, the CMB background is the ancient light from the collective matter when it was approximately evenly spread out. Like a hot gas. Before it began to condense into blobs and start falling towards other blobs.

     

    So we can define that you are STATIONARY relative to the bulk matter of the universe if you are stationary relative to that ancient light.

     

    But that is very easy to check. If you are not moving the CMB should have the same temperature in all directions. If you are moving in some direction then you should see a doppler hotspot ahead of you (a blueshifted patch of sky) and a doppler coldspot behind you (a redshifted patch on the microwave map).
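    As a rough feel for the size of the effect: to leading order the hotspot is warmer by a fraction v/c. Here is a quick sketch with our own roughly 370 km/s plugged in (the leading-order approximation is standard; the numbers are round figures):

    ```python
    # Leading-order CMB dipole: dT/T ~ v/c (round illustrative numbers)
    T_cmb = 2.725              # kelvin
    v = 370.0                  # km/s, roughly our speed relative to the CMB
    c = 3.0e5                  # km/s
    print(T_cmb * v / c * 1e3) # ~3.4 -- a few millikelvin hotter in the direction of motion
    ```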

     

    So we can define who are the stationary observers throughout the universe. They are the ones for whom there is no doppler hotspot in their microwave sky. They are at rest with respect to the ancient uncondensed matter and its ancient light.

     

    OK, so now we can start doing geometry. We can take two widely separated stationary observers and see if the distance between them is changing with time. There is no reason to suppose it wouldn't change. Nature does not subscribe to Euclid's prescriptions.

    And we can, in effect, measure the angles of very large triangles between stationary observers and see what they add up to. There is no reason to suppose that they would always add up to 180 degrees.

     

    Py, in the part of your post I quoted, you say something isn't LOGICAL. I'm not sure what you mean. I might disagree. I would say that, knowing what we do about geometry, it is not logical to expect the distance now between two stationary observers to be the same as the distance sometime later.

    Or sometime earlier.

    Something similar can be said about very large triangles and 180 degrees.

     

    At small scales we don't have to worry about these things because the differences with Euclidean geometry are undetectable, or just barely detectable. It would be nonsensical to worry about such small differences.

  20. ...But this is what I would do:

     

    Since I think it is the 'homogenizing' property of space that makes the nitrogen released in a vacuum rush to equidistance, as opposed to a 'repelling' force of the atoms ( pulled, rather than pushed ), I would take the formula that expresses the accelerating velocity of that 'repulsion' and apply it to various ages of galaxies we know to be receding from us. I would then also apply it to the local group of galaxies to see if their gravitational attraction is in any way affected by space's 'tugging'. Maybe all this has already been done ... If so, my apologies.

     

    Anyway, I would think we would find a correlation between/among these phenomena.

     

    Bad thinking?

    ...

     

    Not bad thinking. It strikes me that you are here thinking like Aristotle rather than like Ludwig Boltzmann. Aristotle might have explained the observed effects of gravity by saying that things have a "downwards going" property---an innate tendency to go down. And if he had imagined air molecules he might have attributed to them a "repulsive" or a "spreading-out" tendency.

     

    But Boltzmann did not say that gas molecules repel each other. He did not attribute any innate dispersive tendency to them. He thought more deeply.

     

    There are more different ways they can be spread out. Boltzmann imagined what is called the multidimensional phase space of the molecules---where you describe the current situation by listing the position and velocity of each molecule.

    There is a bigger volume of phase space where they are spread out.

     

    If there are 100 molecules in the box, and you need 6 numbers to describe each (3 to tell position and 3 to tell speed and direction) then a point in "situation space" (called phase space) is simply a list of 600 numbers.

     

    There is more volume in this 600-dimensional space that has them spread out. If you pick a random point---a random list of 600 numbers satisfying whatever overall conditions, like temperature---the chances are very high that it will describe a spread-out situation.
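    Here is a toy illustration of that counting argument. It uses positions only, a uniform box, and 100 molecules---all made-up simplifications, just to show how rare the un-spread-out configurations are:

    ```python
    import numpy as np

    # Pick random positions for 100 molecules, many times over, and ask how often
    # they all happen to crowd into the left half of the box.
    rng = np.random.default_rng(0)
    n_mol, n_trials = 100, 100_000
    x = rng.random((n_trials, n_mol))          # random x-positions in [0, 1)
    frac_left = (x < 0.5).mean(axis=1)         # fraction of molecules in the left half

    print((frac_left == 1.0).mean())           # 0.0 -- "all on the left" essentially never happens
    print(frac_left.mean(), frac_left.std())   # ~0.5 +/- 0.05 -- spread-out situations dominate
    ```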

     

    So Boltzmann did not NEED to postulate a dispersive tendency in molecules, a desire to get away from other molecules. He flourished in the second half of the nineteenth century---Old Vienna---the Austrian Empire. He was a major force in getting other physicists to accept the idea of atoms and molecules. One of the greatest, like Aristotle or Archimedes. He was one of the founders of statistical mechanics.

     

    I've oversimplified and left out stuff but tried to suggest the main idea. If you divide up the phase space (the range of possible situation lists) into regions, where each region corresponds to some macro conditions like temperature and pressure----that is called 'coarse-graining' the phase space.

     

    Entropy is the log of the volume of a given coarse grain, a given region. Because you only care about the macro conditions, like pressure and temperature, that specify a region. You don't care how those conditions are implemented down at the level of individual molecule positions and motions.

     

    Boltzmann was able to give a mathematical definition of disorder. To quantify disorder so that one can track and predict and make equations to govern its evolution. This was the first time such a thing was done on our planet. Two great Viennese: Boltzmann and Mozart.

     

    The dispersion of galaxies is, as far as we know, NOT analogous to the dispersion of gas molecules in a box.

    However there is an Indian physicist named Thanu Padmanabhan who has been trying to persuade people that the theory of geometry (General Relativity) can be based on thermodynamics. That is, that the theory of gravity (which describes expansion of distance) can be derived from a more general thermodynamics. Padmanabhan has written a dialog about this (much as Galileo used to communicate radical ideas in the form of dialogs, or imaginary conversations). It sounds borderline but one must allow that Padmanabhan might turn out to be right. It is within the realm of possibility. However as far as we now know the geometrical expansion of distances between stationary points has nothing to do with matter-thermodynamics.

     

    In case anyone is interested, here is Thanu's dialog:

    http://arxiv.org/abs/0910.0839

    (proceed at your own risk).

  21. ...

    CMBR. This is the measurement that would 'prove' my model, I believe. As you know, it is a closed loop. If the universe is infinite, and generally isotropic, then there will have been an infinite number of 'bangs' in all directions in the past. And the properties of space would apply universally. If the universes are 'leaking' at all ( can't have just one with this 'defect' in infinity ), then we would be getting CMBR from across the 'voids'. ( Because photons would be travelling through the medium they always travel through. It doesn't matter how far it is when you have been travelling for eternity ...)

     

    Anyway, the red shift would show this, would it not? I don't think it does. The argument that space itself is expanding collapses with an infinite/eternal universe. For that would mean a beginning. A starting point. If it's going in one direction now ( expanding ) it would shrink going back in time, to an infinitely small point.

    ...

     

    Py, I have highlighted what I think is a mistake in reasoning. Accepting present expansion doesn't mean assuming a beginning. Because the expansion doesn't need to extend that far back.

     

    When people apply quantum mechanics to the law of gravity, one result that comes up (in some of the analyses) is that quantum effects make gravity repel at very high density (like what is conjectured around the big bang, 13-some billion years ago).

     

    That could mean that the bang was a bounce. A contracting phase reached a high enough density that the force reversed and started an expanding phase.

     

    I don't want to elaborate further right now, but logically one cannot infer back in time to an "infinitely small point."

     

    The most revolutionary thing going on in cosmology, at present, is the research called quantum cosmology. If you want get a taste of it, here is a keyword search in a research publications database called Spires:

     

    http://www.slac.stanford.edu/spires/find/hep/www?rawcmd=dk+quantum+cosmology+and+date%3E2005&FORMAT=WWW&SEQUENCE=citecount%28d%29

     

    Please let me know if your computer can't get the listing. I have set the date so only papers from after 2005 show up and I have set the ordering so that the most often referenced papers appear first. These are typically the ones that other researchers consider most important. Ground-breaking research tends to be referenced more often.

  22. If a group of people were in the event horizon of a blackhole, would they appear to be moving away from each other at an increasing rate?

     

    Let's answer it as if it were meant seriously, Michael. :D

    Arisone may have something serious in mind, like "is the whole universe a black hole?"

     

    Arisone, to be clear you really have to specify who the observer is.

     

    If a group of people were inside the EH of a black hole, how would they appear to an outside observer? They would not appear. The outside observer would only see them as they were in the process of falling through the horizon, on their way in.

     

    But how would they appear to one of their own bunch? An observer who is inside the EH falling along together with them. Say he is in the midst of the bunch and he looks around at his neighbors.

     

    Wouldn't this depend on which direction the neighbor is from him? If the neighbor is towards or away from the center---along a radial line---I suppose the neighbor would be getting farther away. If it is a very large black hole the effect could be very gradual.

     

    But if the neighbor is in a direction at right angles to the radial, then I suppose the neighbor would be getting closer. Possibly only very gradually.

     

    Am I missing something? Haven't thought about this much. Maybe someone else will respond.

     

    With this kind of problem I tend to want to picture the black hole as a supermassive black hole with a Schwarzschild radius of billions of miles.

    That way falling in thru the horizon doesn't have to be such an immediately traumatic experience. One has time to think and experience. Things happen relatively slowly.

     

    Eventually, as Hatten says, tidal force would begin to stretch them along a radial line, and ultimately cause damage. The feet get uncomfortably far from the head. But this might not happen for quite a while. Much of the journey might not be that eventful.
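    To get a rough sense of how gentle the start of that stretching can be, here is an order-of-magnitude sketch. The billion-solar-mass hole and the 2 m person are assumed illustrative numbers; the Newtonian tidal formula happens to give the right radial scaling for this case:

    ```python
    # Order-of-magnitude tidal stretch at the horizon of a very large black hole
    G, c = 6.674e-11, 2.998e8
    M = 1e9 * 1.989e30             # an assumed 10^9 solar-mass black hole, kg
    r_s = 2 * G * M / c**2         # Schwarzschild radius, ~3e12 m (billions of km)
    d = 2.0                        # head-to-foot distance of an infalling person, m
    print(2 * G * M * d / r_s**3)  # ~2e-8 m/s^2 -- utterly gentle at the horizon
    ```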

  23. This is in reply to your post #24. I hadn't yet seen your post #27.

    Py, you raise some serious epistemological issues. Do we ever know facts about the universe? And if we don't have absolute certainty about it, does this even matter?

     

    I'll have to get back to this later. One stance would be to say that scientific theories are not meant to be believed, they are meant to make predictions and to be tested.

     

    That is, we do not presume to KNOW about the universe. What we want to do is develop the simplest most Occam models that fit the data, and ideally which turn out to fit NEW data surprisingly well, as it comes in.

     

    If a model surprises us repeatedly by predicting new phenomena which are then confirmed and by fitting the data amazingly well, surviving test after test, we might sink into the rut of ACTING as if we believe it and talking carelessly to laymen as if we believe it to be true. But that is not what is at the heart of science. It is not belief, it is making predictions and testing them with measurements/observations.

     

    One always retains a corner of skepticism about any model. Eventually any model must be found wanting and must be improved on. Well that is one possible attitude one can take.

     

    And so far, working cosmologists are still using essentially the model that Friedman derived (from the 1915 Einstein equations) in 1922 by a process of making simplifying assumptions. And the Friedman model is amazingly successful. Also it is starkly simple. Really really Occam.

    Look up wikipedia "Friedmann equations" (they use the German spelling, but he was Russian, so people differ in how they spell it). If you are not into math it may look complicated but as such things go it is the epitome of classic beauty and economy. And it keeps on fitting the data year after year.
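    For reference, the key equation on that page is essentially this one (written here in the usual notation: a is the scale factor, rho the density, k the spatial curvature constant, Lambda the cosmological constant; conventions for where Lambda sits vary slightly):

        \left(\frac{\dot{a}}{a}\right)^2 = \frac{8\pi G}{3}\rho - \frac{kc^2}{a^2} + \frac{\Lambda c^2}{3}

    One short line governing the expansion history of the whole model, which is part of what I mean by really Occam.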

     

    What I would mean by finite versus infinite is which version of the Friedman model. That is potentially decidable. Which fits better, the infinite version of the model or the finite version? I expect that to be decided in my lifetime.

    But I do not believe the model. I have no need to believe any description of the universe and its history. I rely on it, in the absence of anything better, for calculation and communication, and it is eminently reliable, but why believe?*

     

    This is my personal attitude. (Perhaps I shouldn't bother you with that, but it could make it easier for us to arrive at an understanding.)

     

    *quantum gravitists are working on a revised (quantum) version of the Friedman model which does not suffer from a singularity. When that is ready, and if it passes the tests, I'll probably switch over and rely on the quantum version wherever they differ significantly. If you want a link to that development, just ask.


    Merged post follows:

    Consecutive posts merged
    By the way Martin ... I just take your word ( and other physicists ) on things like this ...

    I'm a retired mathematician who has become interested in cosmology and in the current developments in quantum cosmology. I'm a sidelines onlooker, not a pro.

    Swansont, for instance, is a working physicist.

    In cosmology I try to report the mainstream consensus as well as I can.

     

    ...

     

    Martin, could you please explain how this applies to bounded/unbounded space? My understanding ( and attempts at comprehension ) of space is of 'bound' energy permeating infinity in a generally isotropic manner. Quantum fluctuations ( I think of it as a build-up of static charge ) essentially 'clump' the 'fabric' of space together, temporarily stretching the lines of energy. And space tries to immediately 'unclump' them, always working toward a homogeneous state. This is how I view the 'expansion' of space. Meaning space isn't actually expanding but working within its own self-imposed limitations ( properties ) to smooth everything back out. This, as opposed to dark energy acting on space, stretching and expanding it ...

     

    Am I way off here? Or is this an equally possible alternative to dark energy?

     

    Like the atoms of gas 'trying' to become equidistant ( isotropic ) and gravity wanting to clump them together ( entropic ). A property rather than an outside force ...

     

    This is your speculation, then. A dynamic space endlessly struggling with matter's tendency to clump. There is no dark energy, but rather space has a kind of will of its own---it is trying to smooth everything back out.

    An unending tug of war between two principles.

     

    I see no reason anyone should object to your expounding this kind of worldview. You probably won't succeed in getting some mathematically adept person from among the SFN members to formulate your idea mathematically, in the form of equations. I say that just as a note of realism.

     

    If you did get it translated into equations then you could see what they say about the expansion history, and redshift data, and the microwave background. You could compare derived numbers with observed numbers.

     

    Another thing that could conceivably happen is someone might turn up some earlier work on a model LIKE yours in some essential way, and you might learn of an already existing mathematical formulation.

     

    I'm just kind of brainstorming, thinking out loud with no particular direction.

    Basically though, at least right at the moment, I find myself unable to respond. It doesn't look definite or quantitative enough to grapple with---a verbal embryo so to speak. Instinctively a hands-off feeling.

  24. The cosmic-web computer simulations are the work of Andrey Kravtsov. Truly wonderful. There are some longer ones, which were shown by Smoot in his TED talk. Google "Smoot TED" for a great 18-minute lecture on structure formation by a Nobel cosmologist.

     

    As I recall, Kravtsov's computer animations show the condensation of dark matter only. Since there is so much more of it, it dominates the process of structure formation.

     

    Penrose gave a talk at Cambridge in 2005 where he explained that the entropy of the gravitational field is defined differently from the entropy of, for example, a gas. In the case of a gas, uniformity = high entropy. The gas spreads out to fill the box uniformly and entropy increases.

    In the case of the gravitational field, uniformity is very unstable. Because of universal attraction. Clumping breeds more clumping. It is the opposite of gas intuition. In the gravitational field, uniformity = low entropy. The combination of geometry and matter naturally kinks and curdles and condenses (into cobwebby structure in this case) and entropy increases.
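    If you want a feel for how readily a nearly uniform cloud gives in to this, here is a tiny Python sketch of the classic free-fall time, t_ff = sqrt(3 pi / (32 G rho)). The density used is only an assumed round number for a diffuse protogalactic gas cloud, purely for illustration.

        # Illustration only: gravitational free-fall time of a uniform cloud.
        # The density is an assumed round number for a diffuse gas cloud.
        import math

        G = 6.674e-11                  # m^3 kg^-1 s^-2
        rho = 1e-21                    # kg/m^3 (assumed illustrative density)

        t_ff = math.sqrt(3 * math.pi / (32 * G * rho))   # seconds
        print("Free-fall time: about %.0f million years" % (t_ff / 3.156e13))

    Tens of millions of years, a blink on cosmic timescales, which is one way to see why uniformity does not last.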

     

    I don't agree with all of Penrose's talk, because some parts were exceedingly speculative, but his diagrams explaining entropy and the second law as applied to cosmology were pretty effective.

    http://www.newton.ac.uk/webseminars/pg+ws/2005/gmr/gmrw04/1107/penrose/

     

    There are some YouTube videos of him giving essentially the same talk somewhere in the US in 2007: a brief illustrated discussion of the entropy concept, and a 9-minute segment where he explains that the entropy of gravity is different.

    It's a popular non-technical talk. All on a pictorial/intuitive level. Don't watch unless you enjoy or at least have patience with that type of presentation.

  25. Thanks so much! I appreciate your conciliatory gesture. I will try to explain some stuff even though it is nearly midnight here (pacific time) and getting near bedtime.

     

    Standard cosmology goes back to 1915-1925: Einstein's equations, and then Friedman's simplification, which makes the uniformity assumption you mentioned (homogeneity and isotropy), which Einstein IIRC named the "cosmological principle".

    There are other names besides Friedman's (Lemaitre, Robertson, Walker) and various abbreviations for the model like FRW and FLRW. But let's just call it the Friedman model.

     

    Matter in this model is pictured as "dust", uniformly distributed throughout all space. And mathematically space can be either finite in volume or infinite in volume. Space has no boundary. This was the first big bang cosmology. Friedman presented it in 1922, with a second paper in 1924. At first Einstein didn't like it but then he did. I forget the details. It was published (I've seen a facsimile of the original.)

     

    There are many ways that space can be finite in volume and yet boundaryless. A simple example is the so-called 3-sphere or hypersphere. Unless you have a taste for mathematics, the details are not terribly important. A lower-dimensional analog would be a 2-sphere: space being only two dimensional and having the topology of the surface of a balloon.
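    For anyone who does want one mathematical detail: the 3-sphere of radius R is just the set of points at distance R from the origin in 4-dimensional Euclidean space, and its total 3-volume comes out finite even though it has no boundary anywhere. In the usual notation:

        S^3_R = \{(x,y,z,w) \in \mathbb{R}^4 : x^2 + y^2 + z^2 + w^2 = R^2\}, \qquad \mathrm{Vol}(S^3_R) = 2\pi^2 R^3

    Finite total volume, no edge, which is exactly the property wanted.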

     

    The Friedman model, and the Friedman equations that govern its expansion, can handle both the spatially finite and the spatially infinite case.

     

    Both versions of the model fit the observational data extremely well. One has to make very fine measurements to tell the difference. So far our observations are not fine enough to favor one over the other. But they get better every year. Practically speaking, it turns on a certain number which can be measured using data from supernovae, galaxy counts, and the cosmic microwave background. If this number is negative, the finite case is favored. If it is zero or positive then the infinite. The number is called omega-sub-k.
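    To make the sign convention concrete, here is a toy Python sketch. It is not the actual WMAP/Planck analysis, and the Hubble constant in it is just an assumed round value; it only shows what a given omega-sub-k would say about the geometry and, in the closed case, about the curvature radius.

        # Toy illustration of the omega-sub-k sign convention.
        # Not the real data analysis; H0 is an assumed round value.
        import math

        H0 = 70.0                       # Hubble constant, km/s/Mpc (assumed)
        c = 299792.458                  # speed of light, km/s
        hubble_radius = c / H0          # about 4300 Mpc

        def describe(omega_k):
            if omega_k < 0:
                R = hubble_radius / math.sqrt(-omega_k)   # curvature radius, Mpc
                return "omega_k = %+.3f: closed, finite volume, curvature radius ~ %.0f Mpc" % (omega_k, R)
            if omega_k == 0:
                return "omega_k = %+.3f: flat, infinite (in the simplest topology)" % omega_k
            return "omega_k = %+.3f: open, infinite (in the simplest topology)" % omega_k

        for value in (-0.01, 0.0, 0.01):
            print(describe(value))

    With omega-sub-k at minus one percent, the curvature radius would already be some ten times the Hubble radius, which is why such fine measurements are needed to tell the finite and infinite cases apart.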

     

    So whether space is finite or infinite volume is actually an observational problem. It is not a question for philosophers or for amateur common sense. We actually do not know. It could go either way. We have to measure.

    This year the European Space Agency put a craft up about one million miles from Earth to measure the microwave background, in part to be able to determine this omega-sub-k.

     

    As with much of science, when an important number is measured there is an error-bar. The error-bar we have for this number is a very small interval right around zero. As I said, it could go either way. When the European data comes in it will shrink the error-bar down some more, and it could still be around zero, or it could be on the positive side, or it could be on the negative side.

     

    The US NASA craft that they reported on last year and this year, the one with the best data so far, is called WMAP.

    The European craft that just started collecting data this year is called Planck.

    If you find any of this puzzling please ask questions.

    It's bed time for me. I'll check this tomorrow.
