Everything posted by fredrik

  1. Ajb, what's your opinion and interpretation of QM in this context? Should we expect that the geometries must be constructible relative to the observer, in the sense that there exists in the observer's memory or internal microstate an "image" of the environment, which may in general disagree with other observers' images? The analogy here is that the subjectively physical geometry, so to speak, is the one that can "in principle" be mapped out by an observer living there - i.e. the intrinsic geometry. Intrinsic geometry ~ intrinsic information; but once you add the information-thinking, the geometry itself is of course part of the information - so it seems that the truly "physical geometry" - adding the QM stuff - is subjective to the observer living on the manifold, which makes it very strange. Classically one can imagine that an observer living on the manifold can "in principle", with clocks and rulers, find out the global geometry. But for several reasons this doesn't make sense. One is: how do we expect a finite observer living on the manifold to be able to relate to such a massive map (the information capacity issue)? Information-wise, it makes no sense. The second issue is that the information "ages" and changes (the time-stamping issue). I'm curious whether you have a different view of this, in your geometrized thinking so to speak? I'm not trying to "solve" these big issues in this thread, but I'm curious to hear alternative views and angles, crazy or not - it doesn't matter to me. Some madness might even help. /Fredrik
  2. The threads on the expansion of the universe prompted me to take the opportunity to express an interesting reflection that relates to the still open question of the conceptual unification of QM and GR. There are two conceptual areas that are commonly discussed, from both philosophical and conceptual viewpoints:

1) Relativity, both special and general, which relates to the concept of geometry and in particular intrinsic geometry. For a curved surface in general, there are two kinds of curvature - extrinsic and intrinsic. The extrinsic curvature relates to how the surface "curves" relative to a higher dimensional embedding space - for example a 2D surface curved in a 3D embedding. The physical curvatures in GR are the intrinsic ones. Although one can mathematically imagine the surface embedded in a higher dimensional space, this embedding is not unique and, more importantly, non-physical. So the physical curvature is what can be deduced by an observer that lives on the surface, not in the embedding space (a standard textbook example is appended at the end of this post). Martin wrote in http://www.scienceforums.net/forum/showthread.php?t=30787

2) Quantum mechanics, which at least in some interpretations, and according to Bohr, should deal with the observer's information about the system. I.e. QM, according to Bohr, speaks about what we can say about nature, not about what nature is.

In essence, I like to think of this as intrinsic information vs extrinsic information. As in the analogy of intrinsic vs extrinsic geometry, the observer constrained to the surface cannot necessarily say that there does not, in some sense, exist an embedding. The point is that this embedding is unverifiable for him. This is conceptually related to the idea that there can always "in some sense" exist information that the observer doesn't have; but then, as in the geometry example, information truly hidden from the observer has no impact on his expected life. The instant it does, it's not hidden anymore.

Similarly, if we consider an observer embedded in an environment, one comes to a situation that is strikingly similar to the geometry case. An observer is embedded in an environment, and anything the observer "knows" about the environment must be encoded in the part of the system that we call the observer. So one can consider two kinds of information: intrinsic to the observer, and extrinsic as stored in the environment. In geometry as in GR, the intrinsic one is the physical one; any embeddings are non-physical, redundant and ambiguous. One may be tempted to draw the same parallel: that in QM, the physical correspondence is the intrinsic information, and the externally "imagined" information encoded in the environment is non-physical and is judged relative to the observer. This is really the same argument as in the GR case.

But this leads to a completely subjective view of things - which is sort of an "issue". It would seem to totally wreck objectivity, since each observer would have its own physical picture, and it seems a lot of people don't see any sensible way to do science that way. The usual way to resolve this is to abandon the intrinsic information and instead, IMO, pick an embedding that many seem to agree upon. But here, IMO, things are missing. I think the parallel here is conceptually interesting, and comparing different problems with each other may open the door to new ideas. Maybe the missing understanding is how objectivity can emerge out of the seemingly only sensible, non-ambiguous, but subjective, physics? I just wanted to pose this reflection.
Maybe someone might add their own reflection to this. It might be interesting. /Fredrik
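To make the intrinsic/extrinsic distinction concrete (a standard textbook illustration, nothing specific to this thread): a cylinder of radius R is curved extrinsically in the 3D embedding, but its intrinsic (Gaussian) curvature vanishes, whereas a sphere of radius R has intrinsic curvature that an observer confined to the surface can detect, e.g. from the angle sum of a geodesic triangle:

[math] K_{\mathrm{cylinder}} = \kappa_1 \kappa_2 = \frac{1}{R}\cdot 0 = 0, \qquad K_{\mathrm{sphere}} = \frac{1}{R^2}, \qquad (\alpha + \beta + \gamma) - \pi = K \cdot A [/math]

where the last relation (for a surface of constant curvature K) shows how the "surface dweller" can measure K using only rulers and protractors, with no reference to any embedding.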
  3. A philosophical reflection. At this point I haven't asked myself this question too frequently, mainly since there are many other questions that hog my mind. But my first reflection is how certain I am about the measure implicit in the question. It seems clear that the certainty of the preferred measure and question changes what the right answer is. And in general, measures tend to be chosen based on particular models. To motivate my opinion I would have to spend time analysing the construction of the measure, and the current data. But the easiest motivation is due to the "could be" in the question. /Fredrik
  4. The concept of superposition is relative to an observer as well. As to why we seem to have a superposition of complex amplitudes rather than the classical statistical mixtures: that is not entirely understood either IMO, and it boils down to defining addition and the encoding of information. I think there may be a way to understand this by considering that the observer's microstructure encodes the information it is in possession of, but there are various ways to encode and store information. I like to think of it as transforming the information before building it in, and even this transformation may be encoded in the microstructure. This alone _could_ explain why the "statistical mixture" concept fails, but what remains is to understand why this has evolved to be like this. I don't think any string book would give you a satisfactory answer to these questions; I think this isn't understood yet either. In ordinary QM, the concept of an observer (needed for the construction) is highly idealized, to the point of not being realistic. IMO, ordinary QM foundations consider the concept of measurement in absurdum: defined without a clear connection to the microstructure of the observer, and without a prescription for the retention of measurement results. The standard drill about infinitely repeated experiments and the frequentist interpretation does not hold water except in an effective sense IMO. /Fredrik
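To spell out what I mean by the failure of the classical "statistical mixture" (just the standard two-alternative comparison, nothing new): with complex amplitudes the detection probability acquires an interference term that a classical union of events does not have,

[math] P_{\mathrm{quantum}} = |\psi_1 + \psi_2|^2 = |\psi_1|^2 + |\psi_2|^2 + 2\,\mathrm{Re}(\psi_1^{*}\psi_2), \qquad P_{\mathrm{classical}} = P_1 + P_2 [/math]

and the open question, in my view, is why the encoding of the observer's information has evolved so that amplitudes, rather than probabilities, are the things that add.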
  5. If I interpret you right, you note that QM deals with measurements - but in what context? You need a reference to make a measurement, and where does this reference come from? Do "measurements form spontaneously in the void"? If so, who is measuring what, and what happens to the measurement results? If this is what you mean, I think it's a really good question, but not so easy to resolve. It introduces the observer into the description not only as the reference but as part of the subject. On one hand the observer evolves, and it can make measurements which relate to itself. What we look for is the logic of self-organisation in that sense. Why do structures start to form, and why do these structures later start to "communicate/interact"? I think stability arises from relative inertia: even though technically everything is uncertain, inertia protects everything from happening at once. I agree this involves a deeper definition of time and is a challenge. Keep up the questions! /Fredrik
  6. I respect your opinion and view; I was just commenting from my viewpoint like everyone else. I didn't mean to pick on anyone, just to add my personal view for contrast. I would say that knowledge is also far more complicated than you indicated above. IMO irrationality is relative, and so is knowledge, although I can imagine a few people who disagree with this. I'm not sure what you mean by "knowledge is always true". If you mean that everybody (every observer) would come up with the same answer, then I disagree. I would say that even information and knowledge are still somewhat under debate, also from a physics point of view. It's not just "facts", because how do you define a fact without proposing a verification and acquisition process? And even the acquisition process itself (the measurement) is IMO defined relative to the questioner. So even plain knowledge is complicated enough as it is. But I think by tradition this is often oversimplified, because imagining a set of absolute truths is very convenient, no matter how nonsensical. Typically the common meaning of knowledge in society is the knowledge that is collectively agreed upon, but this changes too, only more slowly due to the collective inertia. Everything you describe fits into my view of information and knowledge, at least. So does knowledge as I use the term. I would describe that as a dynamic process as well: it's constantly remodelled, not just a "record of facts". Different observers have different knowledge, but this doesn't mean that some are right and some are wrong, because who is the judge? Hmm... theory? I'm not sure the representation needs to be on paper. How about the genome of life? Maybe it's nature's own theory of a survival strategy? I don't know the inventor, but it's outstanding. What's a human anyway? What about chimps? I am more leaning towards there being different degrees of consciousness. /Fredrik
  7. I see what you are saying, darkshade, but given that this is indeed a matter of definition, how can any "sensible" (scientific?) definition of a property of anything be "formulated", so to speak, in any other way than in terms of its interactive properties? /Fredrik
  8. Why expect a fish never to make the same mistake twice when we humans happily make the same mistake twice? A problem is clearly how to deduce from a single experience that it IS a mistake; this isn't easy even for something as (*cough*) intelligent as a human. An organism that responded to everything with an immediate correction would probably not live long either. Learning needs inertia, or otherwise you could easily "learn" the wrong things - you might mistake normal fluctuations for actual mistakes, for example. It's amazing how single cells learn and adapt. They have both short-term and long-term responses, and I for one am deeply impressed by the construction of something so small that it easily escapes the naked eye. I may agree that humans are quite a bit more clever than many animals, but I think the difference is exaggerated by our own inability to communicate with them. /Fredrik
  9. I'm sorry for the diversion Adib. I personally didn't quite follow the line of reasoning in your original post, but I agree that the nature of "space and spacetime" in QM is an interesting topic to reflect upon. /Fredrik
  10. Something like that, yes. But my suggestion is also extremely conservative, and I'd like to think it's _in a certain sense_ far more conservative than that paper. I agree completely that this should be tested, but it is an open question whether extrapolating GR into a domain where we have pretty much no empirical support is the least speculative (most conservative) strategy.

Yes, I read it, and I like the fact that they try to compute something without ordinary perturbation theory. The most interesting thing IMO is the dynamical dimensional reduction they found. But I am not personally satisfied with their starting points, and I don't think it's anywhere near a revolutionary paper. Reading this paper makes it clear how ambiguous and poorly understood the path integral and the action principle are. They also refer to measurements, in principle, with clocks and rods; I would expect to see the distinguishable rods and clocks of the micro world emerge out of the data. I'm not sure I follow their conclusion(?) that their choice of regularization and gluing rules has no impact on the resulting expectations (theory). I suspect this is based on some implicit, manually inserted ergodic hypothesis, and that type of reasoning is speculative IMO. It seems more like an assumption to me, and one I'm not sure I share. I think the ergodic inductions should come from a revised action principle. The ergodic hypothesis should IMO be supported by a history, and in this history the support should be quantifiable. And I think this quantity is closely related to the concept of inertia.

Also, they have made the blocks 4D by hand. Wouldn't you expect that there should be able to exist transitions between topologies? I think it is possible to rate possible topologies according to their expected fitness, and this induces from the observer's history a prior expectation, so no ergodic hypothesis is needed in the ordinary sense. I think this should also solve the regularization issues, because the options themselves are normalized one by one. I always felt this is a missing detail in Feynman's formalism. Still, the formalism is too nice and too good to be a coincidence, but I think we still await the ultimate polishing, and my personal expectation is that in that respect gravity will emerge. Now, if such a procedure could be shown to reproduce Einstein's equations in the appropriate limiting case, it would be interesting.

I totally agree that all options should be tested for viability in relation to expectations. But what I look for is, to me, even more conservative in that it tries to implement the essence of the scientific method in the "micro rules"! This means that I envision that these micro rules are nothing but the essence of scientific inductive reasoning, and the laws of physics should be the micro-implementation of what human philosophers consider to be "science". I have had the impression that it's extremely hard to communicate the ideas at this proto-stage. I hope to be able to find an explicit realisation of it, but it's hard with so little time. I try to be as conservative as possible, given that I hope to get something at least within a lifecycle. /Fredrik
  11. I find the idea that spacetime (any structure in general) is self-emergent and self-organised as per some "logic" of self-interaction right in line with my personal thinking. I like this! But I have some problems picturing the implementation of this "microscopic rule" without prior structure. It is a bit ad hoc IMO - not as ad hoc as many other things, but still not quite satisfactory. So I think the idea is right on, but more needs to be done. I think the extension of this idea is to also show how the microscopic rules themselves are emergent, and that in principle rules and structures are unified. I haven't found many papers on this, but it's what occupies my mind at the moment. I think that to expand the ideas of Loll, one must also take into account the observer, and the microstructure of the observer. I am starting to think the rules are implicit in the microstructure (not microstate; I make a distinction here) of the observer, and that the evolution of the rules goes hand in hand with the evolution of observers (which in my thinking are essentially self-preserving microstructures that interact with their environment); and since basically everything is observers in mutual interaction (if you consider particle interactions as two observers interacting), these things (structures and rules) tend to find their common base. I'm starting to be convinced that the path integral concept can be given an explanation from a pure information-processing perspective, but it seems very difficult to find the formalism. I don't like the fact that in that paper the rules are put in manually. You have to start somewhere, but taking the idea to the next level would IMO require showing how the rules are also emergent, and how rules and structure are unified. I am naive enough to think it can be done, but I don't know how. /Fredrik The idea then is that observers embodying inconsistent "rules" simply won't survive: there is a negative selection against them. And ultimately the selected rules will be what "most observers" agree upon, which means that the rules may not be completely homogeneous. /Fredrik
  12. I was expecting a microscope shot but this was funny. I don't know where you found it but I'd suggest not inhaling it. LOL /Fredrik
  13. Welcome to science! I started paying attention to chemistry when I was around 13-14. The first experiences were the smoke-and-fire kind of stuff, and needless to say my parents weren't approving of it. But I soon wanted to understand what I observed - how and why does this work? So I bought some books and started reading. My advice would be to tend your curiosity and your inspiration. Try to keep your curiosity alive and let it grow. School classes aren't always the best place for that, so I suggest you get yourself some chemistry books. Reading is harmless and I'm sure your parents won't object. If you don't know where to start, visit a bookshop or just get the first book that covers your first chemistry year. Answers usually feed new questions, so just get started and trust your inspiration for the journey. Science should be fun and rewarding, or something is wrong. Then you will have grown the inspiration and questions well in time to make better use of your teachers, and to make sure you are the one asking the questions and the teacher the one providing the answers, not the other way around. /Fredrik
  14. The association with Occam's razor that I make in this context relates to the information capacity bounds and the adaptivity. If we believe an observer is only able to hold a limited amount of information, then anything that doesn't "fit" (can't be projected) within the bounds is forbidden because it's *too complex* - and complex here means requiring too "massive" a representation structure - "Occam's razor". The other idea is that I argue that evolution will favour compact representations. More or less, I associate relative complexity with relative masses. Yes, I think something like that. And the beautiful part is that once something becomes random, it does not contain very much information - and can be discarded/released without information loss! This is related, IMHO and in my personal thinking, to _complexity scaling_ and _mass scales_. It seems that many physicists look for mathematical models that can solve physics problems. I think that sometimes this is limiting. I don't look at my tools and wonder what I can do with them; I try to decide, in terms of some internal representation, what I want to do, and then look for the tools. If the tools aren't around, then we'll have to invent them, like mankind always did to survive. I do not rule out that we may need a completely new formalism. But if that is what it takes, then that is what we must do. So the way I imagine this, the principle of Occam's razor is given a more specific meaning that is built into the model. /Fredrik
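One standard way to make "favour compact representations" quantitative - not something from the post above, just a possible formalisation - is the minimum description length idea: among models consistent with the retained data, prefer the one minimising the total code length

[math] L(M) + L(D \mid M) [/math]

where L(M) is the length of the encoded model (the "massive representation structure") and L(D|M) is the length of the data encoded with the model's help; random data gets no compression and can be discarded at little cost, which is roughly the intuition sketched above.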
  15. I'll keep assuming we are talking about the same thing... IMO this is related to the information capacity bounds of the interacting systems. This bounds the resolvable inhomogeneity, and it may seem that "continuous options" in between the observations are a natural a priori assumption or "minimal extension", but I disagree with this too. Either you decouple the model from reality, or you violate the information capacity bound by inflating it with a continuum. /Fredrik
  16. I'm roughly with you here: that the change, or as I like to think of it, the uncertainty in one observable can be taken to define the emergence of a new observable. This will also be emergence on demand, because it emerges only when the uncertainty calls for it. But I also want to understand how observables emerge. This is why I think in terms of uncertainty before I think of time. Because even without clocks there may be a certain amount of uncertainty, and we might simply have lost track of the timeline in the fluctuations. But observing the uncertainty closely might give rise to a distinguishable clock. I haven't found out exactly how to do it yet, though. As it stands, the information is just something human; I am looking for the one-to-one map between information and physical reality, and thus to make some sense of the usual claim that "the wavefunction is not real". As it stands, and as we understand it, this claim is correct, but it still doesn't smell right. If the wavefunction is redundant then I suggest we remove that redundancy from the formalism. I've got a strong feeling that the continuum background structures of space contain A LOT of redundancy. And although this redundancy may be mathematical and not physical, it suggests that the model isn't minimal. /Fredrik
  17. In general that's a complex question. Certainly there are by-products in most processes, and there are so many strains and species of bacteria that it may vary. From the beer and wine fields it's known that a range of bacteria as well as yeasts sometimes produce mixtures of ethanol, organic acids and aldehydes. But relatively speaking I think the acetaldehyde makes up a small part; it's often toxic even to the cell and puts additional stress on it. To monitor pH and from that deduce acid production, you also need to know the buffering system, which usually in itself contains many variables.

About measuring: I don't know whether this is a hobby project or a school project, or what the purpose is, but here are some ideas if you like playing. I have tried an electronic gas sensor that is sensitive to ethanol, for example http://www.efo.ru/doc/Sencera/Sencera.pl?919 - this is a type that can be used in simple digital breathalyzers. By hooking it up to a basic circuit you can easily build yourself, I was able to achieve excellent resolution on the ethanol level in beer. I simply suspend this sensor in the headspace over a glass of beer, wait 5 minutes for equilibration and take a reading. I tried several beers and the resolution suggested that it has potential. But the problem was getting a stable and reliable calibration, because temperature changes in the beer as well as in the electronics caused huge drifts. So for a serious device you probably need temperature compensation plus a stabilized electronic circuit; I think there was also an issue of self-heating of the sensor. There is a lot of fun stuff you can do at home with only basic electronic skills, a simple A/D device for your PC and imagination. Measurement errors can usually be compensated for by making mathematical error corrections (a rough sketch of what I mean is appended below), but that might be too complicated if you aren't into the chemistry.

I think you should first determine the purpose and what resolution you want in your measurements. There are many more or less crude methods. Brewers tend to use hydrometers to monitor the fermentation process, and I'm sure you can expand this to incorporate a pH reading and a typical buffering system with some assumptions, to also monitor a mix of acetic acid. Then you can add some theoretical modelling to your experiment and it will get a lot more fun. Also, whether you have a lot of acetaldehyde you can simply test: once you learn to identify the smell you can smell it - this compound has an extremely low threshold. /Fredrik
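A minimal sketch of the kind of mathematical error correction I mean, assuming a roughly linear sensor response and a linear temperature drift (the two calibration points, the drift coefficient and the reference temperature are all placeholders you would have to determine yourself against known samples):

[code]
# Hypothetical sketch: two-point calibration of a headspace ethanol sensor
# reading, with an assumed linear temperature-drift correction.
V_LOW, ABV_LOW = 0.40, 0.0      # sensor voltage measured over a 0 % ABV reference
V_HIGH, ABV_HIGH = 2.10, 5.0    # sensor voltage measured over a 5 % ABV reference
TEMP_COEFF = 0.02               # assumed fractional drift per degree C
T_REF = 20.0                    # temperature (C) at which the calibration was done

def ethanol_abv(voltage, temp_c):
    """Estimate % ABV from a raw sensor voltage, correcting for temperature drift."""
    corrected = voltage / (1.0 + TEMP_COEFF * (temp_c - T_REF))
    slope = (ABV_HIGH - ABV_LOW) / (V_HIGH - V_LOW)
    return ABV_LOW + slope * (corrected - V_LOW)

print(ethanol_abv(1.95, 23.0))  # reading taken at 23 C
[/code]

The point is only that a couple of reference measurements plus a drift model can recover useful numbers from a cheap, drifting sensor; the real calibration constants would have to be found empirically.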
  18. I was sloppy in my question. I consider mass and energy loosely related concepts, and so, it seems, do you - we share that so far. But then the question is still: what is energy, and what do you mean by localization? Classically I understand what you mean, but to merge this with a logic that also embraces quantum phenomena, we need to seek the common denominators that unite Einstein's thinking with the quantum world. This is what I tried to suggest. By localization I suspect you implicitly refer to space, but then what is space? How is the image of space formed from collected evidence? Imagine the localization to be emergent, in that things organize and the principle of locality can in reverse define spatial distance. So IMO the generalization of the locality principle to information systems is to simply let experience build the information geometry and space. Events are far away in space because experience expects low correlation between them. From there the jump is not far to thinking that maybe the mass of a structure is the amount of data supporting that particular structure - closely related to the confidence in the structure. A structure with high mass is dominant from an information capacity point of view. This is the thing I'm trying to elaborate. And it seems there exist natural links here between encoding capacity and inertia, but it's still a question of exactly how to put it together into a consistent formalism. I am imagining a relational idea, where even information quanta are relative, but everything is always formulated from a given starting point. This may also resolve the headache that if nature is decomposed into discrete units, then what determines their scale, and how are they determined? IMO this is relative, and the "discrete units" are not discrete blocks in the sense of classical realism. The units are rather relations between subsystems, and since the decomposition of systems into subsystems is not unique, neither, I think, are the discrete units. Objectivity is emergent as per some self-organisation. Once the bits and pieces are worked out, and how they interact, I would expect that interesting simulations could be made. /Fredrik

To analyze that, how about trying to analyse how you actually measure those things? Then you may or may not find that those measurements are not independent. /Fredrik

In a restricted sense I can agree with you here. The fact that I don't know something AND that all my current experience tells me that this seems impossible to know doesn't necessarily mean I can't learn and come to know. But in such a hypothetical scenario the original conditions are not the same, so it's not possible to make a foolproof comparison. IMO, the whole point of the subjective Bayesian interpretation of probability is that, by construction, we are trying to find the plausibility or "probability" of different possibilities that we, by the conditions, can't discriminate between. Our current and incomplete knowledge can only induce a probability for the various options, which is conditional on what we know, including our lack of discrimination - not on what we could know or will come to know. This is exactly why I think it makes no sense at all to imagine a probability conditional on hypothetical information pictures such as infinite measurements. It disrespects the measurement ideals IMO, which I consider to be subjective, because two observers may not interface identically with the same measurement device.
It's an unrealistic idealization IMO, but one that happens to make perfect practical sense in many experiments, which is why I think it has survived and why objections to it are often dismissed as irrelevant philosophy. Still, I think the inherently unknown things refer to the "expectedly inherently unknown", which means that unless new evidence appears, they seem to be inherently unknown. But I suspect it also depends on how you construct the formalism - what you start with and what you try to induce. If you start by postulating the commutator of x and p, or "define" p to obey it, then the inherent uncertainty follows by definition. /Fredrik

This relates to my ramblings above: a subjective conditional probability of 0 doesn't mean it won't happen. The point is that we never know what WILL happen until it has happened. All we've got are expectations. This can also intuitively explain how symmetries can be created and broken. A symmetry is an expected pattern, or rule, as induced by the incompleteness of a subsystem, and as long as the system is decoupled from the environment, or in agreement with it, this symmetry will be stable and remain unbroken. If you seek intuitive models, think of economic models and game theory. Consider the quoted value of a company on the stock market: one can say that it's not REAL, it's simply the collective expectation! If people THINK it's good, then the price rises and it becomes good. That's apparently how the world works, and I think insight into this will help in physics as well. Except of course that in economics there is still an underlying determinism that we can imagine, but that is really beside the point IMO. /Fredrik
  19. Here is more... the purpose is to convey what is IMO a potentially appealing, intuitive understanding of some of the weirdness of modern physics. This is quite analogous to the idea of geometry and geodesics, straight lines and force. The statement that "we base our actions upon our expectations, and without evidence suggesting otherwise, we have no reason to revise our expectations" says pretty much the same thing as "an object follows a geodesic when no forces are acting upon it". It's just that the statement can be more or less abstract, given a more or less general interpretation. It is also striking that the statement almost appears trivial and follows by definition. So it seems the non-trivial part is when the expectations are updated beyond the expected self-evolution. When analysing that, the interesting probabilistic interpretation and the analogy with inertia are not far away. This basic idea is exploited in a few approaches trying to connect information geometry with general relativity, but I don't think anyone has actually succeeded yet, so it's an open challenge.

Questions: 1) Exactly what is "mass", and how does the mass of a system evolve? 2) Exactly how do we restore a consistent probabilistic interpretation of the "information" induced and retained from our history of interactions, so that the "self-evolution" we typically describe by, say, the Schrödinger equation can be seen as the expected self-evolution? (Of course we would expect corrections to the Schrödinger equation to account for information capacity limits / mass and inertia.)

These are what I consider to be two key questions for physics. "We base our actions upon our expectations, and without evidence suggesting otherwise, we have no reason to revise our expectations" is generalised to "A system bases its actions upon its expectations, and without new interactions suggesting otherwise, it is unlikely to revise its expectations and thus its actions." I suspect this seems remote from the original topic to some, but I think it is close; abstracted, the principle can apply to an arbitrary scale. /Fredrik
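For reference, the "expected self-evolution" in question 2 above is the ordinary unitary evolution,

[math] i\hbar \frac{\partial}{\partial t}\psi = \hat{H}\psi [/math]

and the (admittedly speculative) suggestion is that this should be recoverable as the lowest-order expectation of an observer with effectively unlimited information capacity, with capacity-related corrections on top of it.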
  20. I wouldn't call this a comp.sci question, but perhaps someone knows this? I don't usually use Macs, but I've got a Mac for testing Mac applications and installed Boot Camp to get dual boot for Windows. The choice of Boot Camp was that it seemed to be the quickest solution at the time, although it was beta software. Now it is claimed that the beta license expires at the end of the year. Does anyone know if the bootloader itself will actually stop functioning, so that it won't be possible to boot the PC or switch between partitions? Or is it just the reconfiguration via the Boot Camp application that will no longer work? I.e. given that no changes need to be made to the dual boot setup, would it be safe to keep the existing setup after expiration, or should I expect a surprise on 1 Jan 2008? Anyone know? /Fredrik
  21. I think this is an interesting reflection. IMO there are at least two distinct types of cases where "things" are unpredictable. The uncertainty principle for position and momentum is of a type that I consider a kind of "logical entanglement", similar to how A and "not A" simply can't be true at the same time - it's just that at first sight it might not be apparent that the observables are related. This covers most normal QM. The other kind of unpredictability, IMO, is what I consider related to the constraints of limited information capacity. All our predictions and expectations are based on experience, and in particular on the compilation of retained experience that makes up our identity. This, I think, relates to inertia and the flow of time, as well as touching on the decoherence ideas of QM.

Predictions are based on expectations about our environment, and these expectations are based on the limited amount of information the observer is in possession of. This information has in turn been formed and _selected_ during the observer's history. Moreover, this constraint limits the possible questions the observer asks, which I also see as the possible measurements or interactions the observer _expects_ to participate in. The predictions cannot possibly be anything but expectations. Sometimes the expectations are wrong, and then feedback about this causes the observer to revise the expectations for future "predictions". But the revision of expectations is subject to inertia: one or two data points cannot possibly just flip expectations backed up by billions of data points (a toy example of this is sketched at the end of this post).

For example, I may examine myself and find zero support for a particular event in my retained past. Then I have no reason to expect that it will happen, and I assign it "probability" 0, which technically is just an "expected probability"; I see no sensible way to calculate the probability of that expectation being correct - the limit is set by the information capacity. Then maybe this event still happens. But that could IMO be explained by the limited "resolution" of the observer. The effect is that things that are "sufficiently" unlikely (from the bird's view) are assigned probability 0 from the subjective view - they are not EXPECTED to happen. That however does not mean they won't happen. And it's obvious that if our expectations are consistently violated, then evidence accumulates and this will update the expectations. Similarly, what is sufficiently probable may end up as a "certain expectation" of probability 1 - as induced from our limited (incomplete) knowledge. Still, this may be violated as new information arrives.

This way I imagine that the probabilities, and by extension (yet to be properly explained) the "wavefunction", are actually physically real, and that they make up the observer's identity. And to the limits of the observer's microstructure, the Born rule is IMO nothing but emergent expectation. Now, if a set of interacting systems behaves this way, it means that they can reach a state where the expected probability of change is 0. And if a local group of systems agrees on this, then stability is expected, in spite of the incompleteness. This is one possible way to picture quantum stabilisation in an intuitive way. It makes pretty decent sense to me, but current physics does not, as far as I know, give a satisfactory explanation of it.
So there are many missing pieces which, once in place, will I think make us understand QM much better - the mystery of the complex amplitudes, as well as inertia. I think the posed questions are relevant, and I for one do not think the last word on QM is on the table yet. But those who look to restore classical realism are in for a dark journey IMO. /Fredrik
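A toy example of the "inertia of expectations", just to illustrate the idea (standard Bayesian updating with a Beta prior, nothing specific to the above): if an observer's retained history is summarised by pseudo-counts [math]\alpha[/math] and [math]\beta[/math] for an event happening or not, then after n further trials with k occurrences the expected probability is

[math] E[p \mid k, n] = \frac{k + \alpha}{n + \alpha + \beta} [/math]

so a history with huge pseudo-counts barely moves when one or two new data points arrive, while an event with no support at all in a finite history gets an expectation near (or at) zero without being logically impossible.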
  22. Fred, I'm not sure what you mean. I am not talking specifically about human perceptions; I am talking about trying to understand the connection between ontology and epistemology. I adhere to some kind of measurement ideal and the philosophy of Bohr: that the task of physics is not to explain what nature is, but to explain what we can say about nature. And then I extend that. By "we" I don't mean just me, or humanity - I am talking about an arbitrary observer, a system in general. Not only does the environment scale, the observer itself can scale. I am still trying to find an information-theoretic understanding that is, to the limit of my understanding, consistent. Part of the problem is to define both ontological and epistemological things, and I think of them as complementary and inseparable, not something you choose between.

We learn by interacting, observing. But one cannot just talk about observations in free space. There has to be an observer, a system that relates to the observations and retains the results of the measurements in some kind of structure. This observer is itself part of the dynamics, and it seems clear that the observer's "strategy" of information processing is crucial for its persistence and survival. I think that the probability interpretations can be consistently restored, but it is not possible to make a fixed distinction between ontology and epistemology, observer and observations, matter and space. Empiricist ideals are excellent, but things aren't that easy, because the empirically collected experience needs to be processed and stored efficiently, since we have limited resources. A selection process will clearly favour those structures that can do this. The question is not what we can predict with certainty; the question is rather what is our best expectation. I think this can furthermore be reformulated as: what is most likely the more beneficial expectation? This can, I think, be attacked by a generalised probability theory, and I think the path integral can consistently be seen as a generalized diffusion. Of course this idea is old, but I have yet to see a proper formulation of it, and I think a proper review should also fix some of the loose/unclear ends in this reasoning.

With an "action" is associated a prior probability for the transformation. This probability evolves by two mechanisms: self-evolution, which occurs because the condition consists of mixed information (this is why it's not a "simple diffusion"; more interesting things happen, and supposedly this is described by the dynamical equations), but there may also be feedback that is not predictable, which updates the "self-evolution". The main philosophy behind this is that we know what we know, we don't know what we don't know, and it's not possible to put bounds on that. The best we have is expectations. Continuous agreement with interactions means that a steady state or equilibrium is attained in a certain sense. This relates closely, I think, to the path integral ideas, the action ideas and the foundations of QM - but most probably also to gravity. I think gravity will be emergent from the complexity. Maybe it is the very simplification in our description that shaves off gravity? I think we would benefit from a reconstruction here, to see if gravity pops up as another effect of higher-order complexity. /Fredrik

Now I'll mix everyday language with physics, but it happens to be interesting, so I'll do it... What determines your initiated actions, from your point of view?
Clearly it's your expectations of the world, right? Whether your expectations are "correct" (whatever that means) is completely irrelevant, because they are all you've got. But what about things that are forced upon you, beyond your control? New things, new information, clearly cause you to revise your expectations, so this is still consistent. The behaviour of a poker player is completely determined by what he expects of the other players' hands and strategies. Whether he is wrong is irrelevant; his expectations still govern his behaviour until they are updated. If we insist that we are never wrong, that our expectations are always met, then we risk failing to learn, and that's when we get toasted. /Fredrik
  23. Some other threads make me pose another question... In the introductory courses of classical analytical mechanics, the classical action principles were, in my experience, argued for by showing that they yield solutions equivalent to ordinary Newtonian mechanics. So they are in a sense nothing new, just a sort of reformulation whose main justification is agreement with the standard Newtonian equations of motion. Still, it is an intuitively appealing, beautiful approach that appeals to variational principles and ideas that seem very intuitive, such as "nature chooses the quickest path", giving the student the impression that there is after all something deeper here. It is appealing, in short. Then there is Feynman's path integral formulation of QM, which also makes use of similar-looking formalisms to evaluate the transition probability, but now with the concept of complex amplitudes. A connection to stat mech can be made by relating time to a complex temperature. Interesting connections, but I lack a rigorous and consistent connection.

Exactly how do the Feynman path integrals and the complex action connect to probabilistic reasoning? After all, exotic formalisms aside, what we want is to estimate the prior probability for observing a new state, given the information at hand, right? How can we _define_ a prior probability? Is there a physical meaning to such a prior probability? What forms does the "information" at hand take, and how does one define "addition" of such information? We have learned that superposition of complex amplitudes is generally quite a different thing from a classical union of events, but what's the connection? Many QG papers use these concepts, which are IMO still awaiting full understanding. What is the general view on this? Do you think the problem of QG is independent of the foundational consistency of QM? Or is the foundational issue of QM a non-issue? Ideas? Comments? /Fredrik

What I feel is suspect, and am currently reviewing, is the lack of detail in the constant of proportionality of [math] e^{iS/ \hbar} [/math], and this could possibly be the reason why the expression diverges at times. Compare this to the simpler classical case of the estimated transition probability from a prior frequency distribution to an observed relative frequency, which is proportional to [math]e^{-S_{KL}}[/math], where S_KL is the Kullback-Leibler (relative) entropy. This can be found by basic combinatorics. But the constant in front of this factor has a complicated dependence on the scale! As the population of the set goes to infinity, so that we get a continuum, this goes to 1, but in the discrete domain there is a complicated dependence on this factor. In simple combinatorial classical cases this factor can be derived explicitly. I think there is a link here which, combined with transformations between probability spaces, will give results very close to the path integral formulation. And most probably it will provide a much more general setting in which Feynman's original ideas can be understood from a deeper analysis based on probabilistic reasoning. And I think the connection to gravity will be understood much better, since this relates to complexity scaling (discrete -> continuum). Still, it takes some time, and I only resumed this recently. There must be tons of papers that have ripped this to pieces over and over again, but I haven't found them yet. Does anyone know what attempts have already been made along these lines? /Fredrik
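For concreteness, the combinatorial statement alluded to above (standard "method of types" counting, under a Stirling approximation) is that the probability of observing relative frequencies [math]f_i[/math] in N trials, given prior probabilities [math]q_i[/math], behaves as

[math] P(f \mid q) = \frac{N!}{\prod_i (N f_i)!} \prod_i q_i^{N f_i} \approx C(N) \, e^{-N S_{KL}(f \| q)}, \qquad S_{KL}(f \| q) = \sum_i f_i \ln\frac{f_i}{q_i} [/math]

where the prefactor C(N) collects the Stirling correction factors and depends on N and on the number of distinguishable bins - this N- and resolution-dependence of the normalisation is exactly the kind of "constant of proportionality" detail being questioned above.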
  24. I looked at Loll's desktop gravity a little closer, and it strikes me that the key seems to be how to properly interpret and use the path integral, how to make the partitions, etc. Loll seems to me to admit that while they don't do any foolproof reasoning, the main point is that they can at least make a REAL non-perturbative calculation. And indeed the result of the dimensional reduction from 4 to 2 is interesting; dynamical dimensionality is, I think, right on target. I don't know if this is explained elsewhere, but regardless of whether it works or not, the methods to compute the path integrals come out as somewhat ambiguous, also from the physical point of view. I want to see more progress in refining the nice ideas behind the path integral, but I've got a feeling that the formalism is still not finished. Loll doesn't specify the relation to the observer in the paper. This makes me feel something is missing, or that Loll is taking the hypothetical view of an omnipresent observer with infinite storage capacity and processing power. Since it seems that mathematical structure and physical structure are generally different things, the question is whether Loll in this simulation is observing the properties of the possibly redundant mathematical structure, or making hypothetical observations of the physical structure. Consider a real observer, trained in the now possibly 2D Planck domain: how and why would he induce the crazy idea of two extra dimensions? It is easier to picture dimensional reduction, but don't we also need to explain real dimensional formation? Martin, are you aware of what has been done on dynamical dimensionality in the non-string world? I'd expect something like phase changes, where excessive information capacity condenses and collapses dimensions in order to maximize information content / storage capacity, and similarly uncertainty and crazy dynamics select and grow new dimensions. The obvious challenge is how to do that in a no-cheating way without throwing out unitarity. /Fredrik
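For reference, the "dimensional reduction from 4 to 2" in the CDT simulations is usually stated in terms of the spectral dimension, extracted from the return probability P(σ) of a random walk (diffusion process) on the ensemble of triangulations:

[math] d_s(\sigma) = -2\, \frac{d \ln P(\sigma)}{d \ln \sigma} [/math]

with σ the diffusion time; the reported result is a scale-dependent d_s that is close to 4 at large σ and drops to roughly 2 near the Planck scale. I mention it only because it makes precise in what sense the dimensionality here is "dynamical" rather than put in by hand.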
  25. Given that I didn't quite understand the limited choices given, philosophy would be my choice - but I'd focus on the philosophy of science. Otherwise I've personally found some basic microbiology and molecular biology to be a nice complement to the typically dry physics/math education. That aspect of "life" and complexity has provided me with a healthy perspective, almost like a missing link, though I didn't appreciate this until afterwards. I think a broad education is better than a narrow one, to see the connections between the simple and the complex, big and small, dead and alive - the list goes on. /Fredrik