Everything posted by fredrik

  1. I have resumed working through the logic of modern physics, as part of a quest for a more systematic and philosophically consistent approach that may help solve some of the fundamental issues and unify not only the mathematics but also the logic and philosophy, which have been severely lagging for too long. Intuitively, general relativity can be given a very natural and plausible abstracted Bayesian interpretation, where the geometry is identified with probability priors and geodesics are simply the most basic a priori rule for evolution. Anyway, I have not spent a lot of time reading every possible treatise on the subject, so I wonder if anyone here happens to know of papers where this has already been elaborated, perhaps with some logical implications worked out explicitly? Considering the universe of ideas people are working on all over the place, it seems reasonable that there are at least some papers. I suspect, though, that they might have been published in philosophy, logic, or possibly math journals rather than physics journals. Or maybe it's in somebody's private waste basket; who knows what is important these days. I did a quick Google search and didn't find anything obviously interesting. Any pointers would be much appreciated. /Fredrik
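Edit: to be concrete about what I mean by geodesics as the most basic a priori rule, the structure I have in mind is just the standard geodesic equation (textbook form, not a derivation of my claim):

[math]\frac{d^2 x^\mu}{d\tau^2} + \Gamma^\mu_{\alpha\beta}\frac{dx^\alpha}{d\tau}\frac{dx^\beta}{d\tau} = 0[/math]

The speculation is that the connection coefficients could be read as encoding a prior, so that free fall is simply "evolution in the absence of new information".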
  2. > How do we connect zero-point energy here? Quantum mechanics says the ground state energy remains intrinsically in all space, so why doesn't this fill the bill?

There are various speculations by different people about a connection between vacuum energy and dark matter, which I suspect is your main idea? So the thought as such isn't bad. But since there is not yet a proper treatise of the topic, there is no authoritative way to calculate the expectation value of the vacuum energy density. The infinities in the zero-point energies not only fill the bill, they suggest far more energy than seems reasonable. There are various artificial tricks to reduce the energy by removing the constant but seemingly infinite part, but there doesn't seem to be a well-motivated, universal way to do it yet.

Until we have a consistent theory, I think there are many links, and I suspect these things may be connected one way or the other. "Renormalization", zero-point energies, the nature of spacetime, and time evolution (the arrow of time) are IMO related, and I expect a new theory to resolve all of them together.

From a logical point of view, I think it's fairly clear that the issue of zero-point energy is related to the causality constraints when we polarize the vacuum fluctuations. The attempt at a simple Fourier decomposition of the time parametrization while at the same time excluding the negative energies doesn't make complete sense to me, because it contains a clear restriction. I think that is one key source of headache. Sure, negative energies are a headache too, but I am not yet convinced about the procedure. I haven't come back to this stage yet, but I remember some leery issues from when I went through this a long time ago. The exploit, from what I recall, was to introduce spin-½ particles, which is sort of fine, but I have to review what the impact was on the sample space, because a piece of the original spacetime dynamics was taken to define a fermion system, used as the Dirac exploit. I have not checked again, but it is quite likely that such an exploit could be generalized.

The way I personally see it, there are so many things here, high and low, with open wires that it's hard to keep your head from twisting. /Fredrik
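Edit: to make the divergence I mentioned explicit, this is the textbook zero-point sum over field modes that I have in mind (a sketch, nothing more):

[math]E_0 = \sum_{k} \frac{1}{2}\hbar\omega_k \;\rightarrow\; \infty[/math]

Every mode contributes ½ħω, and since there is no upper bound on the mode frequencies the naive sum diverges; the "artificial tricks" amount to subtracting this formally infinite constant.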
  3. > Is everything predictable?

I'd say, unless I missed something big: apparently not.

Could everything "in principle" be predictable given X? What does "in principle" mean, and what is X? That we are complete, like some kind of God? Let's just suppose for fun that God does NOT play dice. How does that imply that *we* do not have to play dice? God may be complete, but we are hardly so. They are still two different things to me. The question "could I predict everything if I were God" is not really a clear question; the answer could be anything you choose, depending on how you define it.

Not really the same, but IMO "related": Gödel's incompleteness theorems http://en.wikipedia.org/wiki/G%C3%B6del's_incompleteness_theorem /Fredrik
  4. While I'd like to avoid the word "causal" at this point, when I read your comments again I think this is a good question, and maybe I misunderstood your comment... there is something to it, one way or the other. No doubt the boundaries and discreteness are related too... the boundaries and general properties of the sample space will definitely have an impact on the relations. This is why the whole issue of the event and sample spaces should be carefully considered: the properties of the basic set will have a big impact on the rest (and probably even dictate which method of quantization to use; who says we have to stick to only one scheme?), and also on the convergence properties of any expansions. There are several important issues at this point that aren't clear to me and need work. One complicating factor is that my instinct suggests the event spaces themselves are somehow not fixed; they should be "dynamic", changing and evolving in response to changing information. Now we only have to get this to fit together. /Fredrik
  5. I'm not sure if I understood you right here, but I agree that the *probabilistic event space* must be defined better. This is one major point indeed. To simply take some classical physical spacetime concept as part of the event space of particle positioning is wild, and I think plain wrong, at least in the general case; and at this basic level I find it unacceptable to step aside from the so-called "general case". That was the focus of the first post in this thread. I am not sure yet what the proper abstraction is, but I am working on it. /Fredrik
  6. > at least locally

Bad word choice... what I meant instead of "locally" is "with respect to a particular observer", but OTOH the nature of the observer is sort of incorporated into the priors. /Fredrik
  7. In my thinking, causality is treated on a logical or probabilistic level, not a "physical" level. I think of the "physical" level as built on top of the abstract information level. I see different interpretations of causality. Locality, OTOH, is yet another story. These things will, in my thinking at least, enter the model higher up in the development. At the most basic low-level treatise there is no such thing as locality, and the closest thing to causality I can think of is the obvious fact that the prior state influences the evolution into the posterior state. This would generate causal-like behaviour, at least locally. But the details remain to be elaborated. I believe strongly in the route suggested here, and I intend to try to walk it and see where it takes me. /Fredrik
  8. I don't want to obstruct my rethinking by getting ahead of myself, but my view of the various quantization approaches is that the result simply depends on what partitioning of the event space, and what prior, you choose. The simple canonical way, "[math]p = -i\hbar\frac{\partial}{\partial q}[/math]", is just one way, inspired by familiar Fourier theory, and historically I guess a natural one. Logically, I think it is by no means distinguished or the only way. We define a logical prior, which is information about p, to evaluate the conditional probability of q. I think of the quantization in this case as a plain consequence of our choice of partitioning of the event space and "prior event space". Nothing stops us from defining a mixed partitioning that could be anything we want, as an alternative to the normal (q,p) phase space, if we think that makes our treatment easier. As long as we do it properly, they should in a certain sense be equivalent, so changing partition should IMO not need to involve "assumptions" as such. It's just a change of representation that we are free to make.

I didn't explain everything in detail here, but IMO these things can be grasped on a principal level, and I find this important for internal guidance. The formalism itself is just a language, and its meaning can be seen without the full formalism. So far I think it's not so overly weird, but I think this alone is not sufficient. We need to go deeper yet, or alternatively iterate the same logic a few times. I will have no choice but to analyse this in more detail eventually. /Fredrik
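Edit: what I mean by "inspired by Fourier theory" is just the standard plane-wave kernel relating the q- and p-representations (textbook form, just to fix ideas):

[math]\langle q | p \rangle = \frac{1}{\sqrt{2\pi\hbar}}\, e^{ipq/\hbar}[/math]

Acting with [math]-i\hbar\frac{\partial}{\partial q}[/math] on this kernel returns p times it, so the canonical rule is exactly the statement that the p-partition is the Fourier transform of the q-partition; a different choice of kernel would define a different, but in principle equally valid, partitioning.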
  9. My main concern with what things are isn't what they are mathematically; it's more what they are, defined in terms of reality and knowledge. Although this connection is bound to be fuzzy, it is still important. Attaching the formalism to reality does require some philosophical argumentation; sloppy, half-hearted analogies aren't good enough. I do not like it when mathematical concepts are pulled out of nowhere, with (from a philosophical point of view) very vague or weak connections to the real issue, and then a machinery of mathematical exercises is initiated and all further work is done with blindfolds on. I find such treatments both unreadable and objectionable. It is more like something I would expect from mathematicians who find physical models "fun to play with". After all, there are plenty of researchers at mathematics departments who don't do physics, but rather simply study the mathematical models suggested by physicists. Which is of course excellent and necessary, but that is only part of the issue.

I remember one of my old math teachers who, if I remember correctly, would exclaim complaints about the quality of the math seen in physics papers. His expression was that "someone has to do it properly", and indeed, when it comes to evaluating the mathematical properties of certain objects, that is a job for a mathematician. He was interested in the models, not the purpose or usability of the models. And I think everyone should follow their own instinct. There has clearly been a mutual synergy between mathematics and physics, which is important. I like mathematics, but I am mainly a philosopher at heart, and always have been. Stringent formalisms are important, but the joy of playing with formulas must not take over the original purpose of someone who studies reality. /Fredrik
  10. > it is between you and God

Even though I am not one bit religious myself, I like this statement in the general sense.

At this point, your QM questions seem to be formulated, from my point of view, somewhere in the "application layer". And to not blur my own thinking, I will try not to enter these things until I have reworked the basics (in my own head, that is). Right now I am on a very abstract, general level, trying to define things like the event space and how to treat the conjugate event spaces properly. I am not sure if it's because I am stupid and everyone thinks it's too obvious to mention, or if these things are knowingly ignored, but to take an example: I tried to extract the philosophy of the CDT approach from one of the links posted by Martin, and already in the foundations there are some things that make me leery and that I need to verify.

In a very general sense, the Feynman path integrals (conditional or transition amplitudes) can be considered nothing but a summation over a complete partition of the event space. This partition is by definition arbitrary and could be exploited to simplify the formalism, which I suppose could be the exploit in their philosophy. However, I sense some asymmetry in the treatise when time comes into play... and there is also extensive use of the classical mechanics "action", as per the Feynman postulates. These jumps need to be motivated, because they are not obvious, to me at least. I also need to analyse their incorporation of the causality constraint. My first impression on reading was that it was used in choosing a partition that was "a priori" incomplete, but I am not sure that is the right interpretation. The way they present the material in the papers is, to my mind, lacking in philosophical argumentation, so I feel I have to rectify everything I read into my preferred thinking.

The definition of the event space and the treatise of time are crucial... and already at this level we make assumptions. Since I am rethinking at this basic level, I haven't participated in any higher-level discussions yet. The whole concept of photon localization and the concept of energy are (in my rethinking here) not yet formalized. My motivation is that I think a lot of the headache is founded at this low level. This is why I consider it well worth the time to rethink it. When this is in place, and only then, will I try to reconsider the applications and particle definitions. /Fredrik
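Edit: the "summation over a complete partition" I am referring to is just the usual composition rule for amplitudes, where a resolution of the identity is inserted at each intermediate time (standard textbook construction):

[math]\langle q_f, t_f | q_i, t_i \rangle = \int dq_1 \cdots dq_N \; \langle q_f, t_f | q_N, t_N \rangle \cdots \langle q_1, t_1 | q_i, t_i \rangle \;\rightarrow\; \int \mathcal{D}[q(t)]\, e^{iS[q]/\hbar}[/math]

Each insertion of [math]\int dq_k \, |q_k, t_k\rangle\langle q_k, t_k| = 1[/math] is exactly a complete partition of the event space at that step, which is why I say the partitioning is in principle arbitrary.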
  11. I agree that the breaking of vacuum symmetries, zero-point energies, etc. is a key topic, and an interesting/fun one too. It would be the easiest test case where I hope to try out some new ideas eventually. I don't want to go back and contaminate my brain with the standard procedures yet... I am trying to take advantage of the fact that I am "reset" now, and to rethink critically without falling back into the standard lines of thinking.

I am trying to rework all of it in a purely probabilistic/learning-model approach without overly silly prior assumptions. I want to view physics as a distinguished application of a general learning theory or information theory, or, if possible, to formalize the scientific method itself. My instinct is that this is without doubt possible; it's just a matter of exactly how complex it will be and what the preferred formalism is. In parallel I'm briefly scanning existing approaches to see if there are parallel projects going on that I like. In a certain sense a lot of the current theories already touch on all this... I just don't see why it's not abstracted harder, because I think there is a lot of power in the approach. What bothers me is that there is a mix of old "classical mechanics" stuff with good new ideas, and this mix blurs the logic as far as I can see. This is why I want to clean out everything that's out of fashion and start from square one. I think the correspondence principle might follow if the new theory is trained in the classical domain, so to speak. There should be no a priori assumptions hard-coded into it to fake classical mechanics in the limit; I think it will be a matter of training. /Fredrik
  12. I'm not sure to whom you refer, but I speak only for myself, and while my general philosophy is to encourage diversity and multiple attempts, I personally never liked the string theory thread. I am not a string expert, but I am capable of drawing limited conclusions from what I know of it, and on that I base my decisions about the best economic use of my resources in the future. I see much better options to put my energy into. Of course, what is the best option for me is not the best option for everyone; to each his own. So we do not really have an obvious contradiction here. I have posted a bit here to enter a brief communication with other minds. New ideas and experience are healthy. The proof is in the survival and success. More importantly, I pose my own questions, and the string theory thread, as I've currently seen it, simply doesn't answer my questions, no matter how good it may seem to others. I look for more than a mathematical exploit, but that's just me. I do not need to convert to a string expert to hold this opinion. There are possibilities if string theory is radically reformulated... but that is the end where I, by definition, do not have sufficient info. I am open to suggestions when they come, if others do the work. I work on my favourite leads, and others on theirs, just the way it's supposed to be.

I think one could also decompose theories into several aspects. First there are the starting points and priors. Ideally these should be cleanly posed axioms, methods, or some generally unavoidable assumptions. This is an important part, and contains some philosophy and possibly "metaphysics". Then, on top of that, comes the theory with its implications, theorems, and applications. Before I find it motivated to invest too much time (of which we humans do not have a lot) in something, I judge the starting points, the philosophy, first. From what I can see, the string approach seems pretty weak here. Rather than a philosophically well-formulated and generic approach, it seems like something that started with a "what if" game: what if particles really are strings... then the implications of that were worked out in a mathematical sense, with various consistency constraints added along the way... I am sorry, but what kind of procedure is that? The impressive part is the complex math, but from a philosophical point of view there isn't much new at all? At any rate, I don't see it.

I really don't consider myself more stupid than anyone else. I have a physics/math education and have spent some time pondering similar things; it's just that I took a 10-year break recently and only resumed these projects a few weeks ago because I found new hope. I don't get paid to do anything, so I work slowly. This is why I "just talk"... I am still working from scratch and have some work left to do before I am on to the next step. I keep open the possibility that I am just ignorant and miss the whole point, and in that case I offer my apology and hope that my "ignorance" will be my saviour. /Fredrik
  13. Here is another question regarding string theory. Since Martin seems well oriented in these fields, perhaps he could comment? This is in defense of a "reformulated" string theory, whatever its name may be.

Ten years ago, I recall one possible opening in the "string theory interpretation", given that you accept the underlying procedure. While the concept of the "string" has always been somewhat odd to me, what I could possibly motivate is to consider, instead of strings, a Gaussian distribution that can be excited. This was my thinking way back, and I guess it is still a possibility that would probably be easier than the other stuff I have been talking about lately. Not as well founded, but still possibly a systematic procedure. The motivation for the Gaussian shape (in the continuous case) is the probabilistic one. What I found to be an obvious interpretation is to consider the quantization as an induction step; logically it simply means "relating the relations", or the probability of the probability, a kind of higher-order logic. This could inductively be performed into an n'th quantization. In the frequentist interpretation this may seem pretty awkward, but in Bayesian thinking it seems natural, and it has a certain beauty from a philosophical point of view as well.

Can someone, maybe Martin, tell me whether this is anything like where the developed "M-theory" reworkings are heading? If it is anything like it, I would give the "reformulated string theory" some hope! Though it is still fuzzy, and at this point it seems to be only the second-best option if all else fails, which is not out of the question. Comments? /Fredrik
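Edit: by "the probabilistic motivation" for the Gaussian shape I mean the standard maximum-entropy argument (a known result, not my own): among all continuous distributions with fixed mean [math]\mu[/math] and variance [math]\sigma^2[/math], the differential entropy [math]-\int p(x)\ln p(x)\,dx[/math] is maximized by

[math]p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\mu)^2/2\sigma^2}[/math]

so the Gaussian is, in this sense, the least informative shape consistent with those two constraints.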
  14. One very fundamental property I would like to see in a scientific method is that it should be as expandable as possible. That is, I think there should be defined "handles", or "expansion points", where all the inconsistencies are collected until they grow big enough to separate from the noise, thus giving birth to new concepts. This is why I expect a new theory to incorporate, naturally and at a fundamental level, an evolutionary or adaptation step, so that the model can evolve and adapt more continuously in response to new data. This is also IMO the most natural way to trace ourselves back to the "big bang". This may be too complex a task, but at least from the philosophical point of view I feel it is fairly close to what I consider a good question. Should we answer a worse question just because it's easier? This is my objection to some of the current theories, including string theory: they simply don't target the questions I want answered, so I just don't find the motivation for them. I am not currently all that well oriented in every possible approach being worked on. I know there are many interesting approaches where I like parts, but so far I haven't seen anything that is dead on to my preferences. Closest to my taste are the Bayesian logic approaches, but I have some doubts about how the sample spaces are treated. Sometimes they are simply "given", and I find that somewhat disturbing. This is what currently keeps me awake at night. /Fredrik
  15. Thanks for your overview of the LQG "flavours", Martin! As a first pass I think you answered my question well enough for me to get a better picture of the LQG philosophy. I'll try to look up CDT.

Am I right that in the above suggestions you are presupposing a distance metric? From first principles, what would a distance between two samples mean, unless possibly interpreted in terms of their correlations? Suppose you throw a die: what is the "distance" between 6 and 2? It seems to me the prior is that all samples are equidistant, if a distance is defined at all. Comments? Did I miss something?

I'd like to start from just a plain set of inputs, possibly ordered and countable (I am not sure yet), without a defined metric. What I picture is that all these things can evolve as data is consumed. Without having consumed even the first sample, I have a hard time accepting things like a distance metric. Perhaps the sample space itself isn't complete; I think it may even expand. Suppose you start sampling real numbers. Do you know how large the numbers you will get are? It seems your sample space might be considered to grow on an "as needed" basis...

This is certainly a very good and unavoidable key question. It seems clear that the less informative the priors, the harder and longer we have to work to build the new theory, and at some point there has to be a practical limit. This is of course a little fuzzy, but I try to get rid of as many assumptions "as possible", which really just means I will do the best I can, given my tiny brain. The priors I accept are motivated by the fact that I simply can't do better at this point.

I remember discussing this with some of my old teachers, and their defence (i.e. those who get paid to do research, professionals that is) was that dealing with the big problems is too complex and takes too much time; the chances are it would take so long to produce results worth publishing that the research funders might think you are not doing anything and withdraw the funding. The system encourages a small step size, which means that some larger changes that might be beneficial are effectively repressed by research politics. Perhaps this is one reason why so little fundamental progress has been made? What do you think, Martin?

Personally, being philosophically inclined, I try to get rid of everything that I cannot accept as an axiom or a sufficiently necessary assumption, wherever I see a possible way to formalize it. One concern is obviously that we pose a question so hard that we cannot solve it within a reasonable time; but consider also the other risk, that we spend a lot of time working out an alternative solution that finally proves not good enough after all. Then, if the mistake was in the very foundations of the theory, it may not be as easy as just introducing correction terms... the whole thing might need reworking. And is the second scenario better? So I agree there has to be some balance. /Fredrik
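Edit: to illustrate what I mean by a distance interpreted in terms of correlations, one standard construction (just an example of the kind of thing I have in mind, not a proposal) turns a correlation coefficient [math]\rho_{ij}[/math] between two observables into

[math]d(i,j) = \sqrt{2\,(1 - \rho_{ij})}[/math]

which for standardized variables is just the root-mean-square Euclidean distance between them. For an ideal die, symmetry gives every pair of faces the same correlation, so every pair ends up equidistant; that is the prior I was referring to.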
  16. Hey Martin, do you advocate LQG? If so, what is your current attitude towards the dimensionality of spacetime? What bothers me more than anything is not just the geometry or structure of spacetime; it's maybe even more the spacetime dimensionality itself. While the dimensionality may appear intuitively obvious, I don't think it is a valid assumption. It is far too strong. I'd expect a better explanation for the apparent dimensions. What mental rescue do the LQG advocates use to handle this? This is an important point to me, and my feeling is that ignoring it is really too much. While I'd like to answer that the answer is in the data, there still has to be an intelligent method for deriving "dimensionality" out of data. Is this ignored in LQG too, or have I missed some alternative, more advanced interpretations of LQG? (I ask because I am no LQG expert.) /Fredrik
  17. I'm not sure if I caught your motive, but one natural way of viewing any dimensionality is by means of variation (change). Here is some of my personal thinking, which others are free to disagree with.

There is a little logical problem: we need to start somewhere. We may, for example, start by postulating a 3-dimensional sample space, and then experience reveals that the sample-to-sample variations are not random. This gives birth to a new dimension that we can parametrize by variations/deviations in the previous model.

Suppose we started by assuming a 1-dimensional, real-valued sample space, so that all our data would be single values. Suppose we have limited storage capacity; then we need to learn how to compress this data. We might find that this stream of single values can be interpreted as a natural consequence of projections of a higher-dimensional object, and this may help us comprehend the large amount of data better.

Suppose we have real {(x,y,z,t)} data from a real experiment, and suppose a monkey has scrambled all the data... so all we now have is a set of single numbers {q}, composed of the individual x, y, z, t values randomly scrambled. What is the chance that someone can guess the original unscrambled data with a probability higher than the a priori probability of plain guessing? /Fredrik
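Edit: here is a toy version of the monkey experiment, just to make the counting concrete (the toy data and names are of course made up for illustration):

[code]
import math
import random

# Toy (x, y, z, t) events; any numbers would do.
events = [(1.0, 2.0, 3.0, 0.1),
          (4.0, 5.0, 6.0, 0.2),
          (7.0, 8.0, 9.0, 0.3)]

# The "monkey" step: flatten the tuples and scramble,
# destroying all grouping and ordering information.
flat = [value for event in events for value in event]
random.shuffle(flat)

# With 4N distinct numbers there are (4N)! possible orderings,
# so a blind guess recovers the original sequence with
# probability 1/(4N)!.
n_values = 4 * len(events)
print("a priori odds of guessing right: 1 in", math.factorial(n_values))
[/code]

The point is that any method beating these odds must be exploiting structure in the numbers themselves, which is exactly the kind of compression that would justify reintroducing the higher-dimensional model.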
  18. Loosely speaking, string theory is an attempt to answer some questions about how the different forces in nature are related. For example, in cases where quantum effects and the effects of general relativity are both supposedly significant, there is currently no satisfactory model. If you have zero background in QM and relativity, I think it could be difficult to grasp, because the first, and by far the most important, step is to understand and acknowledge the questions string theory is trying to answer.

I think there are many issues with string theory: fundamental questions concerning serious philosophical and logical issues, and then other technical or mathematical consistency issues. One objection is that string theory cannot possibly be a fundamental theory, because its foundations ignore some of the more important issues. Some people seemingly find it acceptable to dismiss such issues as metaphysics. Others think that "at the other end" string theory will mature, and that there will be a reformulation that removes these issues. Whether you like it or not depends on what philosophy of the scientific method you hold. Of course, one should give people a break, because any theory may start out fuzzy and inconsistent... but my main issue with string theory is how the most fundamental issues have been allowed to remain. In my personal view, which I think is shared by many others, I can't see the good "scientific method" in the string theory project. It intuitively rejects my philosophical ideals from step 0. /Fredrik
  19. This is a personal answer to your question. If you are a biology student wondering what connections there might be between fundamental physics and biological reality, I'd pose things like: the arrow of time / time reversal, and creation/evolution processes. And what about consciousness? Is a yeast cell conscious? Is a DNA molecule conscious? If so, what does that mean? If not, why not? Is a bug conscious? Is a dog conscious? Concepts like self-information? The self? Is that even meaningful? Why, or why not? Biological evolution supposedly didn't start with an "egg in space", with the first step being the formation of particles, then atoms and molecules... But none of that is specific to "string theory"; I personally think there are more interesting theories of fundamental physics.

I used to be a physics student who asked the opposite question: from the point of view of a "reduced" physical reality, why do I need to study biology? I couldn't see it. It took me some time... but now I see it. There are many interesting links for the philosophically minded, and solving some of these deep problems will provide keys to many fields. How come a tiny cell can perform tasks that we, with our big brains, have a hard time getting supercomputers to do? Are we just plain stupid, or what is the problem? Orienting yourself in physics could possibly be fruitful for your future biology endeavours too, but it took me some time to find the motivation. Mine came from walking the reductionist line... concluding that something was wrong, and going back to where I came from. /Fredrik
  20. I never tried that experiment, but just speculating here: it doesn't seem unreasonable that 1) increasing the amount of simple sugars relative to complex sugars and proteins, and 2) boiling the honey while adding/mixing in sugar (heat treatment), might affect the binding properties of a clump of honey in water, and thus the rate at which it dissolves in a glass of water. Either way it seems like a fuzzy or "crude" test, but since most of us do not have access to lab analysis, some crude tests may have to do. Moisture level and reducing-sugar content can be estimated at home, though.

I brew beer, and I found out that there is a range of neat blood-sugar meters on the market designed for diabetics. Different brands are based on different enzyme assays: some are specific to glucose, while some react to many other reducing sugars, including the maltose and maltotriose found in brewer's wort. I've had excellent use for these meters in brewing by combining the measurement with a custom calibration, as well as a theoretical model of the typical relative distribution of sugars in wort. My meter doesn't react to fructose or sucrose, but does to glucose and some other sugars.

Perhaps a similar "dirty test" can be used to guesstimate the relative sugar distribution in honey. Obviously, if the honey has not been diluted so as to maintain the natural fructose/glucose ratio, you could detect the fake honey from the relation between reducing sugar and density (moisture). However, the blood-sugar strips have high deviations, so many tests are needed to get the uncertainty down. I've used the same stuff to analyse commercial beer (at home, that is, without nifty expensive lab equipment): the reducing-sugar reading in relation to the gravity reveals the relative yeast attenuation, and the sweetness of the beer, via a set of estimated correlations. /Fredrik
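Edit: for the beer case, the arithmetic I rely on is just the usual apparent-attenuation formula plus an averaged meter reading (the numbers and calibration factor below are made-up placeholders, not my actual calibration):

[code]
# Apparent attenuation from original and final specific gravity;
# standard homebrewing formula.
def apparent_attenuation(og, fg):
    return (og - fg) / (og - 1.0)

# The strips scatter a lot, so average several readings and apply
# a calibration factor mapping meter units to g/L reducing sugar.
# CALIBRATION is a hypothetical value, for illustration only.
CALIBRATION = 1.0
readings = [4.2, 4.6, 4.4, 4.1, 4.7]  # made-up meter readings
reducing_sugar = CALIBRATION * sum(readings) / len(readings)

print("attenuation:", apparent_attenuation(1.050, 1.012))
print("estimated reducing sugar (g/L):", reducing_sugar)
[/code]

The attenuation formula itself is standard; the calibration step is where the brand-specific enzyme assay and my own estimated correlations come in.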
  21. Hello Norman. You didn't receive much feedback, and I am not prepared to give any specific comments either, since I am now trying to re-evaluate the whole concept of spacetime and causality from scratch. I have a distinct feeling that just starting "in the middle" means everything is floating; the philosophical side of me rejects the problem as somewhat undefined or arbitrary. So while I don't have any sensible comments on what you are doing... the last time I looked briefly at the normal formalism, my hunch was that all of the fields might be transformed away. Of course it's not possible within the Lorentz group alone, you need more general transformations, but so what? I think that is not so strange, given the weird properties of the electron. From what I remember, the electron can consistently be interpreted as a "dynamical property" of a boson field, though I forget the details. Considering this "dynamical property" is also what generates the half-integer spin.

Then, when I realized that all of this work rested on floating ground, I lost the motivation to go on... until I could attach the floating ground to something more fundamental. So I postpone the higher-level applications until the fundamentals are in place, because I suspect that once the fundamentals are rectified, a lot of the higher-order formalism may be flipped and need reworking anyway, and many of today's problems may simply be gone at that point.

So... I have no other comments but to say keep working on your projects. I have a lot of catching up to do as well, since I didn't do much for 10 years. The good part is that I feel like I have a fresh mind now... starting by questioning things I didn't have time to question before. I sense that the concept of spacetime dynamics is really somewhat arbitrary until we have nailed the nature of spacetime itself, because that too is entangled with the foundations of defining our sample space. There seem to be several logically related sample spaces of different orders, and we tend to mix a priori information from different event spaces, making the interpretation of conditional probability hairy enough. /Fredrik
  22. I'll fiddle on and see what happens. I've found a number of interesting papers, but IMO they don't address the main point; rather, they elaborate the existing ad hoc theories from the point of view of probability and information theory, which is illuminating in itself, but I have in mind using the full power of information theory, together with some natural philosophical arguments, to see what the consequences are when these things are required as constraints. I see that along the way many, many assumptions are made to simplify, and I am not convinced this is acceptable; it may itself be the reason for several current problems.

The obvious main problems I can predict so far are of a technical nature, such as hard-to-manipulate formalisms, leaving no choice but to rely on numerical methods in the general case. I think it should ultimately be formulated as an optimization problem under a number of different constraints. The main task is to define the ultimate measure with a minimum of a priori assumptions. This is similar to the procedure I had in mind when thinking about how to properly simulate a cell: the main problem was to find the ultimate measure of success. Once that is found, it's merely a technical problem to find the best path. While I am not religious, I think that if there were a god who created this, all he had to do was define the magic universal measure of life and leave the rest to itself, and there would be spontaneous evolution. /Fredrik
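Edit: the canonical template for the kind of constrained optimization I mean is the maximum-entropy principle (standard form, just to illustrate the shape of the problem):

[math]\max_{p} \; -\sum_i p_i \ln p_i \quad \text{subject to} \quad \sum_i p_i = 1, \quad \sum_i p_i f_k(i) = F_k[/math]

where the [math]f_k[/math] encode whatever information is actually at hand. Whether entropy really is the "ultimate measure" is exactly the open question; I only mean that the formal structure, one measure plus constraints plus optimization, is what I'm after.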
  23. That came out wrong. What I meant was that time would be a "parametrization" of the most probable disturbance, and the actual "rate of time" would simply be related to changes in some reference subsystem, for example a clock device taken as the definition of the rate of time. So while the flow of time might be independent of the clock as such, the definition of a "second" would obviously be a matter of definition (an arbitrarily chosen reference). /Fredrik
  24. Background: currently the most sensible procedure to me is to consider spacetime as basically defined through stochastic evolution (relating to a kind of information entropy), with "classical spacetime" reached as the "best guesses", or expectation values. Time would naturally be considered the "most probable disturbance" given our conditional information (which could be anything, btw, as long as it's correct and not just an assumption). Between measurements the information entropy either increases or is held constant, but during interactions the entropy decreases as new information is gained. So interactions (communication) can be considered to maintain information.

I have an idea that out of this it might be possible to derive some very general equations of dynamics, corresponding to the "a priori" dynamics given no further info. Any further info is simply added on top of the former as a constraint on the equations of dynamics. The idea is not to start out with a spacetime notion, but rather to let the spacetime dimensions emerge as asymmetries in the stochastic evolution. So there will be no presupposition of spacetime; if the data then requires spacetime, it will come to us as the "most natural" interpretation. It will be more complex, but I think also more correct. And it would be extremely general, easily extending beyond physics to general intelligence applications.

This is very fuzzy, I know, and my intent with this post is definitely *not* to present a new theory here. I simply want to ask whether anyone here who has been working on this more than I have lately is aware of the current work on it, and whether anyone has links? I have some ideas, but before I start fiddling with equations from scratch I wonder if anyone knows of good papers on this that I had better read first. This post could, I guess, as well have been posted in some of the other sections, but I had to pick one. Ideas or pointers, anyone? /Fredrik
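Edit: the information-theoretic fact behind "entropy decreases as new information is gained" is the standard statement that conditioning cannot increase entropy on average (a known theorem, not something I derived):

[math]H(X|Y) \le H(X)[/math]

with equality iff X and Y are independent. Note that this holds in expectation over the outcomes of Y; a particular observation can of course leave you more uncertain than before.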