Everything posted by Martin

  1. Jeykl and Widdekind, thanks for both your responses and for continuing the thread in a quantitative manner. Sorry not to have responded earlier. Jeykl, drunk or sober you clearly know how to calculate. What WMAP reports is a 95% confidence interval for Omega_k. You ask whether the actual measuring gives you Omega_k or Omega_tot, but these are logically the same because Omega_k is defined to be 1 - Omega_tot. The same data analysis procedure that gives a value for one gives a value for the other at the same uncertainty level. I am not expert enough to tell you why they have the alternate notation in the first place; it represents the same information, an apparently redundant notation. Jeykl, you ask how spatial curvature can be measured. They use several methods and check them against each other. One way is very simple to understand: galaxy counts. As you may know, methods of successive approximation involve an element of (not quite) circularity. You believe in near flatness, and if you just assume exact flatness for starters, then you can set up a conversion from redshift to present distance and make a catalog of how many galaxies there are by distance, at the present moment. Now if there is zero spatial curvature, then the volume of an R-ball should increase as R^3. And we assume a large-scale uniform distribution of galaxies, so the number within an R-ball should also increase as R^3. Counting galaxies is a way of measuring volume. But if there is positive spatial curvature, then the volume of space within distance R from us, the R-ball, will NOT increase as R^3; it will increase more slowly. By analogy, think of the 2D case on the surface of an ordinary sphere: the area within an R-disk does not increase as fast as R^2. By counting galaxies, thereby measuring the volume of an R-ball, and seeing how it depends on increasing radius, one can gauge the curvature. 
Now, to do successive approximations, one has to go back and CORRECT one's original formula for cranking out the present distance to a galaxy from its redshift. But fortunately this will change things only a little bit. The count within a given distance R will be slightly different, and one can get a corrected estimate of the curvature. But basically we're done; that's one way. Now I should go fetch that 95% confidence interval. BTW, for background context, google "Komatsu cosmology" and you get this UTexas page: http://gyudon.as.utexas.edu/~komatsu/ It has some latest news about the Planck satellite (the next thing after WMAP) and gives an idea what Eiichiro Komatsu is like. Then google "Komatsu wmap" and you get this NASA page: http://lambda.gsfc.nasa.gov/product/map/dr3/map_bibliography.cfm The one by Komatsu et al. is the WMAP paper relevant to cosmology. In that paper the confidence interval is given on page 4: −0.0179 < Ω_k < 0.0081. The greatest positive curvature is the case where Omega_tot = 1.0179, and that case gives the smallest radius of curvature. So you get a lower bound of 13.7/sqrt(0.0179) billion lightyears, which comes to 102.4 billion lightyears. However, some more recent data suggest that the figure for the Hubble radius should be 13.4 rather than 13.7; there is always some revision going on. So we can say that the lower-bound estimate is roughly 100 billion. The radius of curvature is at least that, and it might be much bigger. Indeed space could be flat, with infinite radius of curvature, not curved at all. We don't really know very much yet. On page 4 it is better to use the column labeled WMAP + BAO + SN, because that is based on WMAP data pooled with other projects' data, so it is more reliable.
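(For anyone who wants to redo this arithmetic, here is a minimal sketch in Python using the numbers quoted above; the 13.7 is the rough Hubble radius in billions of lightyears, and the −0.0179 is the low end of the quoted WMAP interval.)

```python
import math

hubble_radius_gly = 13.7      # rough Hubble radius c/H, in billions of lightyears
omega_k_low = -0.0179         # low end of the WMAP 95% interval for Omega_k

# The most positive curvature allowed (Omega_tot = 1.0179) gives the
# smallest, i.e. lower-bound, radius of curvature.
r_curv_min = hubble_radius_gly / math.sqrt(abs(omega_k_low))
print(round(r_curv_min, 1))   # -> 102.4 (billion lightyears)
```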
  2. Doctor, there is an auxiliary piece of notation which, even though drunk, you should start getting used to: [math]\Omega_k[/math]. It exists merely to simplify some formulas and is defined by [math]\Omega_k = 1 - \Omega_{tot}[/math] So in the case you offered as an example, suppose in fact that [math] \Omega_{tot} = 1.003[/math] Then an equivalent way to say this would be [math]\Omega_k = - 0.003[/math] In that case the standard cosmo model says that the universe is spatially closed, that space has the overall geometry of a 3D hypersphere, and it tells you how to compute the RADIUS OF CURVATURE of space, which is the radius the hypersphere would have if you imagine immersing it in 4D Euclidean space. The model doesn't say that such a surrounding 4D space exists; it assumes that whatever exists at this moment is concentrated in the 3D hypersphere. We have no evidence of anything outside or around it. But it still can have a radius of curvature. That would affect things like the sum of the angles of large triangles, and how long it would take to go all the way around if you could freeze expansion. The circumference is of course 2 pi times the radius. And the radius of curvature tells you the volume: the formula for a 3D hypersphere is 2 pi^2 R^3. Computing the radius of curvature is easy: it is just the Hubble radius divided by the square root of the absolute value of Omega_k. In the example, the absolute value of Omega_k is just 0.003, and the Hubble radius c/H is something like 13.7 billion lightyears. So you divide that by the square root of 0.003. Drunk or not you should be able to do this:D. What do you get? BTW if Omega_tot is exactly one, then Omega_k is exactly zero, and then the radius of curvature is infinite, which describes spatial flatness. 
There are exotic topologies, toroids, that are spatially flat and still spatially closed, and smartassers will mention them now and then, but the basic picture of spatially flat is spatially infinite: Euclidean space. I hope it turns out Omega_tot is just a tiny bit greater than one, so that we have the simple hypersphere spatially closed case and we never have to hear about toroids again. It would just be the simple surface-of-balloon picture raised from 2D to 3D. Tell me what you get for the RoC and I'll see if I agree -------------------------- For more information google "Komatsu cosmology wmap" and you will get things like this: http://arxiv.org/abs/0803.0547 Look in Table 2 on page 4 and you will see the 95% error bar for Omega_k, and you will also see how they calculate a 95% lower bound on the radius of curvature, using the highest Omega_k, the high end of their error bar, because that gives the smallest RoC.
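(If you want to check your answer: here is the same division done in a few lines of Python, a sketch using the rough 13.7 billion lightyear Hubble radius from the post.)

```python
import math

hubble_radius_gly = 13.7              # rough Hubble radius c/H, billions of ly
omega_tot = 1.003                     # the example value from the post
omega_k = 1 - omega_tot               # -0.003, so spatially closed

# Radius of curvature = Hubble radius / sqrt(|Omega_k|)
r_curv = hubble_radius_gly / math.sqrt(abs(omega_k))
circumference = 2 * math.pi * r_curv          # how far "all the way around" is
volume = 2 * math.pi ** 2 * r_curv ** 3       # 3-sphere volume, in (billion ly)^3

print(round(r_curv, 1))               # -> 250.1 (billion lightyears)
```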
  3. That is a breath-taking vision, Arch. Gee...I dunno. ======================= Wait. What's the motive for going outside your home galaxy? A civilization concerned about its longterm survival would have a motive to spread to other stars, and perhaps might spread to a substantial number in one galaxy. But what's the point of going beyond that? especially on a systematic routine basis?
  4. Thanks again for relaying this news! Having an alert lookout report like this is great. I saw a July 6, 2009 news bulletin http://planetquest.jpl.nasa.gov/news/roundUp.cfm but I'm not sure it covered the four you mentioned. Do you have links to any more descriptive info?
  5. Alan, I hope you proceed little by little and don't try to wrap your brain around too much all at once. Just take small steps and keep asking (that works for me and the students I've known, anyway). What DH said is right, so be sure you absorb that. Then about the first thing you said: I don't know of any quasars that are estimated to be receding at 6c or better. So let's get that straight for starters. Where did you get that? Maybe I'm wrong, but I think that figure of 6c is way off. Maybe you could instead be talking about a redshift of 6. I'd suggest you learn how to use the Cosmos Calculator. Just google "cosmos calculator", put in 0.25 for matter, 0.75 for lambda, 74 for hubble, and you are ready to convert any redshift to a recession speed, or to a distance if that interests you.
  6. Earlier I gave a link to Wikipedia "einstein aether theory". Anybody happen to check it out? Maybe this thread is not really about Einstein-Aether theory, which would mean I was mistaken in mentioning it. If it is, then here is a survey paper on the current status. http://arxiv.org/abs/0801.1547 Here is the original 2001 paper that kicked off the current research activity in Einstein Aether. http://arxiv.org/abs/gr-qc/0007031 The 2001 paper on Einstein Aether has been cited by 181 other papers. It got a lot of interest from other researchers. http://www.slac.stanford.edu/spires/find/hep?c=PHRVA,D64,024028 A lot of activity has been directed towards testing it observationally, trying to rule it out experimentally (which seems to be quite hard to do.) There have been variations on this theme---a general covariant theory that behaves outwardly like General Relativity, but has a global notion of time which somehow exists without being obtrusive or noticeable. New variants keep getting invented. If the thread is in fact not about Einstein Aether, and what I'm pointing to is not relevant to the topic, then please ignore. I tend to agree that what Einstein himself said about it back in 1920 is probably not too important and has likely been superseded.
  7. Reaper, I was glad someone gave this paper a critical reading. If you have some particular points of criticism, where you can give a page (and maybe even paragraph) reference and point out some statement that is wrong, that would be appreciated too. Not necessary! But it would give me something extra to chew over. Did you think the author favored some one particular explanation, or reply, to Fermi's Question? It seemed to me that he didn't pick a favorite or favorites. He just surveyed all the different possible responses that have been made, and tried to restate each argument concisely. Do you have a preferred explanation yourself? I don't have one, so far. It just seems bizarre to me that we have seen no evidence of another tech civilization. But I think it is just a fact that we have not---no alien beercans, no intercepted messages---and we just have to accept this as another bizarre unexplained fact, and hope that eventually we understand why. But I don't claim my attitude on this is especially correct or sound or an example to be followed:D, it is just my personal reaction. So I was interested in the paper because it surveyed what other people had thought about it and what their attitudes were. There were even some paranoid answers to the Fermi Question which he included, as I recall;)
  8. Too vague for a physics forum. Also the idea of time "moving" is outside the range of ordinary physics discourse. This is a personal idea more appropriate to Spec forum.
  9. Physman, please post your personal theories in the Speculation forum.
  10. Physman, I moved this to spec forum because it seems primarily philosophical in character. What do you mean by these terms? Is there some mathematical content? Sounds like purely verbal speculation, not related to quantitative empirical observation measurement etc. Want to clarify?
  11. Since you seem to know... please tell me: what did Einstein mean by "ether" when he said that? I've sometimes wondered. BTW I have Wilczek's book, The Lightness of Being. It's a fascinating book, which introduced new ideas and ways of thinking to me. I like it. But it doesn't make clear to me the idea of "ether", either Einstein's or Wilczek's. So you tell me what one or the other or both meant... Also BTW, Wikipedia has a pretty good article: http://en.wikipedia.org/wiki/Einstein_aether_theory And there is a copy of Einstein's 1920 ether speech online: http://www.tu-harburg.de/rzt/rzt/it/Ether.html I think all it means is the existence of a universal time, which is not necessarily the time which someone experiences. Maybe the universe would experience it, if the universe were able to experience. In cosmology the existence of a universal time is not so controversial. It is built into the model which nearly every cosmologist uses, because it works---the Friedmann model. One should not get so excited about this. There have been a number of papers recently about Einstein aether theory. You can do a respectable PhD thesis about it, and get a good postdoc job to pursue more study of it. The theory has been improved since 2004. It's not a dumb thing to work on. Very smart people: Ted Jacobson, David Mattingly, somebody at Utrecht in Renate Loll's group, I forget his name. Lee Smolin has just put out a paper with a new approach which gets a universal time as a byproduct and solves the cosmological constant problem as the main goal. These things have to be checked observationally. I don't think any of this would contradict what we already know about special and general relativity. The basic stuff about Lorentz transformations would still work as well as it always has. Special and general are known to be approximations---amazingly good approximations---but cannot possibly be fundamental. 
Special can't be right because its geometry is flat and doesn't expand---which is unrealistic---but it's a very good approximation because in our everyday world space is very nearly flat and expands only very, very slowly on a human timescale. General can't be correct because it breaks down at black hole centers and at the Bang. It has singularities, which are by definition where a theory fails. So it would be expected for Gen Rel to be replaced by an improved theory for which Gen Rel was a good temporary stand-in, and which Gen Rel closely approximates except where it breaks down. Par for the course, as physics theories go. If the replacement happens to have a universal time, well, so what? Nothing to get worked up about. Answer to your question: neither good news nor bad news. What Einstein said in 1920 probably doesn't matter. I would say what's more meaningful is what Jacobson, Mattingly, Smolin, and Renate Loll are saying now. Loll's triangulations quantum gravity has a universal time-ordering. There is a Scientific American article by her in my signature ("signallake.com"); have a look. Her approach to quantum gravity is terrific; she and her collaborators are being invited to many of the major conferences to talk about it. Currently making a hit. We'll see how it goes.
  12. That is a good observation! Radiation would certainly be a problem in the Jupiter system, as we know it. I guess it would be especially harsh in close, like at Io and Europa. I can't speculate about the atmosphere and magnetic field of a more massive satellite of a gas giant, if one were discovered in the habitable zone of some star. Thanks for the additional links! I am glad to see some other people are interested in exomoons.
  13. Can't answer your questions in quite the same way you pose them. Now this is according to the standard model---space became transparent to light at about year 380,000 after start of expansion. The CMB, the background microwave, comes from that time, and the matter that emitted it was an estimated 41 million lightyears from here (from our matter that became us and the rest of Milkyway galaxy.) That matter is now 45 billion lightyears from here. The distance has expanded by a factor of about 1090, while the light has been traveling to us. And the wavelengths of the light have been enlarged by the same factor of 1090. The Hubble rate of expansion of distance is not constant over time but has varied greatly. It is believed to be constant over space, the same everywhere at any given time, so it is sometimes called Hubble "constant". But it used to be thousands of times larger than it is now, it is certainly not constant in time. The current Hubble rate is that distances expand by about 1/130 of a percent every million years. So you asked how long would it take light to travel---say---41 million lightyears, allowing for expansion. Well at the present rate of expansion it would take approximately (just slightly over) 41 million years. Because in that time the distance it had to travel would expand only by about 41/130 of a percent, which is hardly anything. Of course at an earlier more rapid rate of expansion it could take considerably longer, and indeed the microwave background light has taken much of the whole 13-some billion years. You ask how "big" the universe was at the time of last scattering (the moment of effective transparency you mentioned). We don't know. It might have been infinite then and likewise be so now. On the other hand it might be finite and only a few times bigger (say ten times bigger) than what we can see. What we can see has a radius now of about 45 billion ly and a radius then of about 41 million ly. 
I agree that the basic facts of the standard cosmology are hard to comprehend. They do not obey the conventional nonexpansive Euclidean geometry we are used to, the model does not 'live' in that kind of geometry. So we all face the same conceptual hurdles at one time or another in one way or another. We can only do the best we can and be patient and gradually get used to things. A wise person (I forget who) said that one never really understands mathematics, one merely gets used to it. Heh heh, that is a kind of stoical joke. Anyway, have fun and keep asking questions.
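(The numbers in the post above chain together like this; just a back-of-envelope sketch of the same arithmetic, nothing more.)

```python
stretch = 1090               # factor by which distances/wavelengths have grown since last scattering
d_then_mly = 41              # distance then to the matter that emitted the CMB, millions of ly
d_now_gly = d_then_mly * stretch / 1000
print(round(d_now_gly, 1))   # -> 44.7, i.e. the "about 45 billion ly" quoted above

rate_pct_per_myr = 1 / 130   # current expansion: ~1/130 of a percent per million years
growth_pct = 41 * rate_pct_per_myr   # distance growth during a 41-million-year light trip
print(round(growth_pct, 2))  # -> 0.32 percent, which is hardly anything
```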
  14. Bascule, that depends so much on philosophy and language issues: unstated assumptions people make and what they mean by various words. The current situation in cosmology is that the LCDM model is generally accepted because it gives a remarkably close fit to the vast body of data that continues to accumulate. And it assumes a certain uniformity; I think you know: homogeneity and isotropy, the "cosmological principle". It is a basic assumption and it gets challenged from time to time, most recently by David Wiltshire (a New Zealander, very good cosmologist) and a handful of others like him. But the assumption that our location is not special and that things are approximately uniform throughout has turned out to be very durable, and people check it as best they can, though of course they cannot absolutely prove it true. So that assumption is often made without explicit statement. And I think you see immediately that this assumption goes beyond our particle horizon, outside our past lightcone, and is about stuff we really cannot know. It says no dragons. It says things out beyond 45 billion lightyears are more or less just like here. And of course we can't be sure that there are no dragons! Maybe things are quite different out there. Anyway in January 2009 the five-year WMAP data was published, and essentially everything in that report, all the parameters of every version of the model, assumes the basic uniformity. That is how you model. Our model of the universe is good, and it wouldn't work if there were dragons all over the place just out of sight over the horizon, exerting forces so far not observed in our part of the universe, etc. You have to keep it simple in order to get anywhere. So they should have stated that, and maybe they didn't make it plain enough. And about the size of the universe there are two main versions of the LCDM model----spatially infinite and spatially finite. Which is right depends on measuring the curvature more accurately. 
Finite holds if and only if the curvature is positive, and once you know the curvature you can calculate the circumference. It is pretty straightforward. The key assumption is uniformity, and if you don't suppose that, then the size estimates and pretty much everything else go out the window, because all the inferences based on what we can observe are made using the model. And if you do assume uniformity (and conventional physics of radiation and matter), then everything else falls into place. Personally I'm very comfortable with the uniformity assumption because of my sense that history has borne it out: the farther our instruments look, the more it continues to look the same. http://en.wikipedia.org/wiki/Eratosthenes Eratosthenes could not see beyond the Mediterranean world he lived in, but he estimated the circumference of the earth by assuming a uniformity which he could not prove. So WMAP has given us a lower-bound estimate of the circumference in essentially the same Eratosthenes way: by assuming uniformity and measuring a curvature. And that involves inference beyond the observational horizon, just as in the 200 BC Greek's case.
  15. Most of your post sounds pretty reasonable to me, like you have been thinking about this stuff and trying to work out a consistent picture. There seem to be two parts: first, a conjecture you want us to help check, and then some conclusions that you derive from it, plus a bit more discussion. This conjecture is something we can check, and maybe it can be proven wrong and ruled out. We will see. But even if the conjecture turns out to be wrong, your following discussion seems completely on track, at least to me. I think you are absolutely right to point out that even if the distance between two objects is increasing faster than c at present, there will be photons which are already on their way which will continue to arrive! So just having a recession rate bigger than c would certainly not shut off the flow of light. So the discussion makes sense. This is OK. So let's get back to the first part, where you hypothesize that the whole universe is within our past lightcone---they also use the technical term "particle horizon". The technical jargon doesn't matter; you say it very clearly. You conjecture that the universe is of finite volume, and small enough that any bit of matter could have sent us light that we are receiving or have already received. You know, that is just barely possible! I think that under reasonable assumptions it has been ruled out to something like the 95% confidence level. But it is not obviously wrong----I think. I have to go out, so I can't finish this post. But maybe someone else will jump in. Given time, I think I could persuade you that the universe is probably bigger than what we see or could see. There are reasons to think that it is bigger. But I don't believe I could prove it with absolute certainty. The mainstream consensus in cosmology, so far, is that we don't yet know whether the universe is finite or infinite in volume. 
And as of January 2009 when a bunch of WMAP data was published (I'll get a link to it later) the best estimate was that if it is finite then with 95% confidence the circumference is at least 600 billion lightyears. And also the best estimate currently is that even allowing for expansion the farthest matter we can see is today only 45 billion lightyears away. (It used to be much closer, when the light started its journey to us, but the distance has expanded to 45.) Now this is not absolutely certain, but it suggests that we are causally connected only out to a current distance of 45 billion lightyears. But if the universe is finite then the farthest anything is away from us is at least 300 billion, namely half the hypersphere circumference, with 95% confidence. Something like the surface of a balloon, only 3D whereas the surface of a balloon is only 2D----difficult to imagine a finite volume, boundaryless, space but that's the typical model that they use for the finite case. Have to go. Nice post by the way!
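(The 600 and 300 billion lightyear figures follow from the roughly 100 billion lightyear lower bound on the radius of curvature discussed in post 1; a quick sketch of that step:)

```python
import math

r_curv_min = 100                              # lower-bound radius of curvature, billions of ly
circumference_min = 2 * math.pi * r_curv_min  # ~628, quoted loosely as "at least 600"
antipode_min = circumference_min / 2          # farthest anything could be: halfway around

print(round(circumference_min), round(antipode_min))  # -> 628 314
```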
  16. There is no strictly scientific reason to imagine that time had a beginning. Certainly no reason to assume that time began at the big bang! Different pre-bang models are currently under study and say different things about conditions leading up to the big bang. Predictions can be derived from the models to compare with precision observations we can either make today or will be able to make with more advanced instruments. It will eventually be possible to distinguish between the models observationally, and rule some out---by testing. So far we can't pick a winner, so it will take more work. MacSwell is right: astronomers cannot say. However, even if they do not believe that the start of expansion was the beginning of time, they still use the imaginary big bang singularity as a convenient time-mark. This is well explained at Einstein-Online: A Tale of Two Big Bangs. The link to the E-O website is in my signature. It is the public outreach site of a top research institute, and is more up-to-date than the other cosmology public outreach that I know of on the web. Here is the link for A Tale of Two Big Bangs: http://www.einstein-online.info/en/spotlights/big_bangs/index.html You might find several other things of interest at Einstein-Online if you browse around some there. ============================= Is this the Ridpath book? http://www.amazon.com/Astronomy-Universe-Equipment-Eyewitness-Companions/ Don't worry if a book like that over-simplifies some, or is slightly out-of-date. It will still provide lots of useful (approximately correct) information. Just take what they say about the temperature (right after the start of expansion) with a bit of a grain of salt. There's a lot more work to be done about that.
  17. Dear Buddha, What you have here seems to me to be a kind of poem or dream of how the universe came into being and matured to its present state. It is potentially a beautiful story and it is in harmony with spiritual philosophy---also it stimulates the imagination (in a nonmathematical way). I would hope that you write more about this cosmic dream of yours and I look forward to seeing more about it here. I moved it to Speculations forum because it seemed out of place where it was originally posted. Astro/Cosmo is quantitative mathematical science. Basically Astro/Cosmo talks about the mathematical models of the cosmos which have been formulated using equations involving measurable quantities. The game is to fit observation data to the models and see which model gives the best match to the real world. It seemed that your philosophy was not that kind of activity. However it does seem well suited to Speculation forum.
  18. Systems have been found with Jupiter-mass planets in the habitable (liquid water) zone. Suppose a planet massing several Jupiters had one or more moons more massive than the Jovian ones (Io, Europa, Ganymede, etc.) in our system. Would Jupiter's moons be habitable if they were closer in, and warmer? And more massive as well? It's an interesting possibility. So here is an interesting variation or extension of the exoplanet search: we already know how to detect exoplanets (by star-wobble and transit methods), so how about searching for moons of exoplanets? A method has been worked out for detecting exomoons by precise transit timing. If any are found, and some turn out to be in the habitable zone around the star, then so much the better. http://www.centauri-dreams.org/?p=8791 The method was proposed by David Kipping (University College London): http://www.davidkipping.co.uk/ It is projected that the transit timing accuracy of the Kepler spacecraft would be sufficient to detect an exomoon in this way. http://kepler.nasa.gov/
  19. http://arxiv.org/abs/0907.3432 This is a 39 page article that delves into a dozen or so different answers to the Fermi question/paradox that have been proposed over the past half century. The author is a Belgrade astronomer/SETI expert/future generalist. He seems to know a lot and reason carefully, like a credible academic. Well, that's my first impression. Maybe you can find some flaws, if so please let the rest of us know where you think he is wrong. I don't think he has any preferred answer, he compares and weighs them all. Maybe he ends up throwing out some as relatively improbable, and narrowing down to the ones he thinks are better. But on the whole it looks like a balanced survey of all the ideas. If you are interested in Fermi question, have a look.
  20. It is possible that you think a galaxy is a planetary system like the solar system. A galaxy is a collection of stars like a billion stars, or 10 billion stars, or several hundred billion. Maybe Wikipedia has something for you about that. Galaxies are on the order of a million or more lightyears from our (milkyway) galaxy. So any probe we could construct would take a million or more years to get to another galaxy, and when it reported back to us what it saw there the radio message would take a million or more years to get back. I'm understating the typical distances and times. ==================== To make sense of your question I have to believe you didn't mean to say galaxy. You meant send a spacecraft probe to some nearby star that has planets. We know plenty of stars that have planets orbiting them. Some are only a few tens of lightyears from here. So a radio message from a probe would only take a few tens of years to get back to us. If humans survive as a technologically competent species, and don't trash or exhaust their planet, then I have no doubt that they will eventually get around to doing just what your post suggests---send probes to nearby planetary systems and check them out at close range. For now, the game is to build better telescopes so that we can study the planets orbiting nearby stars and learn more about them (without having to send probes there.) I would say that advanced telescopes able to detect oceans, and analyze cloud-cover, are at least as important for the longrange future of earth-life in space as manned missions to Mars---possibly more important.
  21. Good answer by Atheist. The technical term for the distance getting larger without the thing moving is recession. It is not like ordinary motion where you actually go somewhere. Recession rates are not real motion velocities and aren't limited by special relativity. They have no effect on time. They are just the rate at which some distance is increasing, which can easily be > c if the distance is large enough to start with. Most galaxies which we can see have redshift z > 1.4, and we are receding from any such galaxy at a rate > c. It has no effect on time or anything else. Point your telescope at a typical galaxy with redshift 1.4 or better, in any direction: the people in that galaxy see us receding at a rate > c. The distance between us and them increases by more than a lightyear each year that goes by. But aside from a small random individual motion that doesn't really count (such a small percentage of c), we are not going anywhere. There is no place we are getting closer to at any rate comparable to c. So there is no reason to worry about rates of recession having an effect on time.
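(A toy illustration of the point, assuming a rough Hubble radius of 13.7 billion lightyears: under the Hubble law, any distance bigger than the Hubble radius grows faster than c, with no motion involved.)

```python
hubble_radius_gly = 13.7       # rough Hubble radius c/H, billions of lightyears

def recession_rate_in_c(distance_gly):
    # Hubble law: rate of distance growth = H * D, written here as a multiple of c
    return distance_gly / hubble_radius_gly

print(recession_rate_in_c(27.4))   # -> 2.0: the distance grows at 2c, yet nobody "moves"
```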
  22. We don't know that there are more than three spatial dimensions. Right now the fuzzy consensus pendulum is actually swinging towards there NOT being any extra spatial dimensions. The biggest event in that particular controversy was the last 12 minutes of a July 7 talk at CERN (Geneva, Switzerland) by Nobel laureate Steven Weinberg. He was working on a nonstring idea that he invented in 1979, which has had a big resurgence of interest and which allows what he called "good old Quantum Field Theory" (just 3 spatial dimensions, no strings) to work, and be fundamental after all. If that works out there is no need to dream up extra spatial dimensions. Heard of Occam's razor? So the pendulum swings, and physics hasn't yet made up her mind. Time is a separate issue. The FQXi institute just had an essay contest on the topic What is Time?, and the prize essays all happened to argue that time does not exist as a geometric dimension but is something that somehow "emerges" at large scale. They were against the "block universe" concept. But the pendulum swings on that one too.
  23. What's it called? The Kerr solution? The Schwarzschild solution is only for the nonrotating case. There are two notable surfaces: the outer one is oblate---like a slightly squashed ball. But inside that there is another, spherical, surface which is the real event horizon. I dug up these sources; if anybody else has some good ones please share: http://www.astrophysicsspectator.com/topics/generalrelativity/BlackHoleKerr.html http://en.wikipedia.org/wiki/Kerr_metric The Wikipedia article looks very informative. I just read the first part of it.
  24. Reaper, I'm glad someone else is interested in this too! I don't understand this big jump in sales. I just checked at 5 PM Pacific time, July 3, and the book was at overall salesrank 833; it was the #3 Amazon bestseller in physics. I think this has to include some wider sector of the reading public. 833 is pretty good for a physics book (you are competing with all kinds of more popular genres). For comparison, at the same time (5 PM), Brian Greene's Fabric and Elegant were at 2286 and 3520, and Kaku's Hyperspace and Parallel Worlds were at 4588 and 6347. So you can see 833 is really good. ====== You mentioned Greene and Kaku in your post. I absolutely agree they are much more familiar names than Smolin out there in the general public----TV series and all that. But just at the moment Smolin's book is suddenly outselling them. And Hawking too. Odd. Probably just a brief flukey spike, but what caused it? How does a science book get such a burst of sales? If you would like to watch this curious phenomenon yourself, here is the Amazon physics bestseller list: http://www.amazon.com/gp/bestsellers/books/227399/ If you click on one of the books there, you will get a product description page for that book which gives its current salesrank plus lots of other information. Update: today (5 July) at noon the Smolin book's salesrank was 352 (Amazon-wide book sales rank). This was 16.1 times better than the top five stringy books, whose average salesrank was 5667.6. The five currently most popular stringy books were Elegant, Fabric, Parallel, Hyperspace, and the Idiot's Guide. I make a regular routine of checking at noon Pacific time, always using the five most popular string books as a benchmark for comparison. Earlier this summer the Smolin book was doing a modest 0.5 or 0.7 of par----only half or two-thirds as good as the benchmark. Now it is 16 times better. This jump in sales is so far unexplained and I'm wondering why it happened.
Did anybody see anything on television? Or on these Digg and Twitter things?
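For anyone checking the arithmetic, the 16.1 figure is just the benchmark average divided by the Smolin book's rank (both numbers from the 5 July check above; the individual ranks of the five string books weren't recorded, only their average):

```python
# Ratio of the five-book stringy benchmark average to the Smolin book's
# Amazon-wide salesrank on 5 July (lower rank = better sales)
smolin_rank = 352
stringy_avg_rank = 5667.6

ratio = stringy_avg_rank / smolin_rank
print(round(ratio, 1))  # 16.1
```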
  25. Your basic intuition is not far from mainstream---but the words you happened to choose to describe it are outside normal physics language and need a dictionary. I think this happens more often than we realize. Someone has an intuition and a mental picture which is up-to-date, and when they think how to express it they happen to choose words that make it sound eccentric just because they aren't the usual words. Where you say "the wave/particle ratio is high" other people might say "at the Planck scale" or "at densities approaching Planck density." Just this past week (up to July 4) there was a large conference in Poland on the Planck scale. Many physicists suspect that the Planck scale (if you zoom in very tiny) is where quantum effects begin to dominate over classical behavior in spacetime geometry. So there cannot be a classical "singularity". In a manner of speaking you could say the uncertainty principle prevents there being infinite density. Classical geometry (of 1915 gen rel) could describe a collapse or crunch, for instance, in which density goes higher and higher. But when density reaches a level called the Planck density, classical geometry fails to apply, because quantum effects dominate and geometry becomes "fuzzy"----curvature could be several things at once and things can't be in a definite location. When the fuzzy geometry takes over, it is like your example of something going through two slits, and the researchers suspect a classical singularity does not happen. In effect it is where wave behavior dominates over rock behavior. So you, as a pioneer, express this in your own vocabulary and say that at some point the wave/particle "ratio" increases to where wavy behavior dominates. Your intuition is that this has to happen because of "entropy"----visually, what I see from your words is the kind of fuzzy geometry conjectured to occur in what people sometimes call the quantum regime.
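For a sense of scale, the Planck density mentioned above can be cranked out directly from the fundamental constants as ρ_P = c⁵/(ħG²). A quick sketch (CODATA constant values hard-coded so the snippet is self-contained):

```python
# Planck density rho_P = c^5 / (hbar * G^2), from CODATA constant values
hbar = 1.054_571_817e-34   # reduced Planck constant, J s
G    = 6.674_30e-11        # Newton's constant, m^3 kg^-1 s^-2
c    = 2.997_924_58e8      # speed of light, m/s

planck_density = c**5 / (hbar * G**2)   # kg per cubic meter
print(f"{planck_density:.2e} kg/m^3")   # roughly 5.2e96 kg/m^3
```

That is about 10⁹⁶ times the density of water, which gives a feel for how far beyond ordinary matter the regime is where classical geometry is suspected to fail.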
I don't adopt your words (like wave-to-particle ratio, and geometric entropy) because I don't see a straightforward way to apply the familiar pictures of wave and particle, and the familiar tools of classical thermodynamic entropy, in this new situation. But I get a glimpse of an intuition that has become increasingly widespread and is attracting research interest. You offer some concepts. It isn't clear what the right concepts are. There are half a dozen main approaches to conceptualizing Planck scale behavior. That is why they had this conference last week. They don't know. Different bunches of people picture it different ways. It turned out to be a very successful conference. Big turnout. Highly "ecumenical" so to speak (string people, loop people, triangulations, asymptotic safety, noncommutative geometry, some approaches I don't even have a name for)----hundreds of people from all over getting together to describe and compare their different ways of talking about the Planck regime. The way nature behaves when, in your way of speaking, the wave-over-particle ratio increases to the point that the wavy aspect of geometry itself dominates and it becomes indefinite---the Planck scale. I will give a link to the conference, but I caution that you can't learn anything from it because a frontier area of physics tends to be confused and chaotic---they haven't settled on what the right ideas are, different tribes have different concepts, and they have to fight it out among themselves before there is a clear message. This kind of fighting is OK; it can be very constructive and civilized. It can be a polite struggle. But what I am saying is the conference link is useless. I only give it because I don't like anything, even chaos, to be concealed. Even messes should be out in the open, so people can (if nothing else) see how messy they are.
Here is what an Edinburgh string theorist blogged about the Planck Scale conference, from his distinctively stringy slant. Incidentally it is even possible that one of our most savvy people here at SFN knows this guy or is a colleague. Here is Jose Figueroa's page: http://www.maths.ed.ac.uk/~jmf/ Always try to know your source and understand his biases and point of view. Here is Jose Figueroa's account of the Poland conference: http://empg.maths.ed.ac.uk/blog/?p=503 You can see he wasn't entirely happy with it, but he had quite a bit to say. Now here is the official conference website: http://www.ift.uni.wroc.pl/~planckscale/index.html?page=timetable You can see from the titles of the talks listed here that they would be almost entirely incomprehensible to a general reader. But believe me these people are our friends: they are trying to understand what nature and the universe are like down at the Planck scale, which is a new place to take your mind to.