Everything posted by fredrik

  1. I still don't quite understand the "obsession" with classifying things as physical or non-physical. Suppose we say that "time is physical": what implications would you draw from that? The question whether physical reality is independent of human brains isn't that easy a question, IMO. If you take that view to its extreme, it is to say that things exist whether they could ever be observed or not, which IMO is an awkward position that doesn't make much sense. I think our brains tend to simplify things, and reality is really more complex than common sense suggests at times. "Measuring time" would mean making measurements on a clock device in parallel to the rest of your system, and then performing some calculations to arrive at the "progress parameter" we call time. If you wonder whether time is a degree of freedom like space, that could also be a matter of definition. But generally I would say time is not a degree of freedom in the way space dimensions are usually considered to be. My preferred view is rather that time is a parametrization of configurational changes. But since these things are all relational, there can be many apparently different alternative views which effectively represent the same underlying structure. Therefore the important thing is to understand the relation, and to see that the representation is not unique. /Fredrik
  2. Yes, I was trying to provide another point of view. What is "physical" and what is not has never occurred to me as an important question, but that's just me. To me, "physical" certainly isn't the same as "real". The question is more what's observable and what's not. If you asked me whether energy and matter are "physical", I think it can be debated, or considered a matter of definition; in either case I don't find it a fundamental question. My point is that even whatever we consider to be physical still needs to be communicated, so to speak, which brings us to the information concepts. Interaction = communication. But to play along, I would say that time, energy and mass are entangled with each other in a relational way. They are not independent things. Traditionally in QM, time is related to energy approximately the way space is related to momentum. They can be thought of as different points of view on a similar thing. Mass is something like constrained energy, as opposed to free energy (like radiation). /Fredrik
  3. In classical mechanics, there is only one path. In QM, there are many "virtual paths", which begs the question of how to parametrize a bunch of competing paths. The resolution is to consider the state of information, and parametrize that instead. Then we get only one path again, and the parametrization can go on. /Fredrik
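A minimal sketch of that last point (my own toy illustration, not a standard formalism): individual trajectories of a two-state random process branch into many possible paths, but the state of information about it, the probability vector, follows a single deterministic path:

    # Individual runs of this two-state process branch randomly,
    # but the probability vector p evolves along exactly one path.
    p = [1.0, 0.0]                 # information state: certainly in state 0
    T = [[0.9, 0.1],               # T[i][j] = probability of i -> j per step
         [0.2, 0.8]]

    for step in range(5):
        p = [p[0] * T[0][0] + p[1] * T[1][0],
             p[0] * T[0][1] + p[1] * T[1][1]]
        print(step, [round(x, 4) for x in p])   # one deterministic trajectory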
  4. I don't quite understand the meaning of your last question, but FWIW, I would say that time can be thought of as an internal parametrization of changes. I.e., you parametrize the evolution of the system by a small subsystem (a clock device). So what you are measuring is how the system as a whole evolves, relative to the evolution of this clock device. The problem is to find the proper measure of "evolution" when doing this comparison. The whole point with time, what makes it sensible, is that it is a kind of event ordering. One nice way of thinking of it, which merges nicely with both general relativity and QM, is that the system diffuses into the future. At each point there are typically different possible ways forward, but nature always chooses the easiest path, and time can be thought of as a parametrization of this path, at least in a differential sense. The topology of this future might in general be complex, like in general relativity. One choice of measure of evolution is based on generalized entropy principles. IMO one of the most beautiful parts of classical physics is statistical mechanics, and it can be extended to deal with more abstract models. In such a model, time can be defined as a parametrization of the system's diffusion into the future, so to speak. This thinking means the only measure you need is a measure of distinguishability, or change. The measures of space and time dimensions (meters and seconds) can then simply be thought of as constructs based on this more fundamental measure, in a relational way. /Fredrik
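To make the "clock as a small subsystem" idea concrete, here is a hedged toy sketch (my own construction; the drift, noise and tick probability are arbitrary assumptions): the whole configuration evolves step by step, and the rest of the system is recorded against the accumulated change of a small clock part rather than against the raw step counter:

    import random

    random.seed(0)

    clock = 0          # clock subsystem: counts its own state flips
    system = 0.0       # rest of the system: a drifting random walk

    trajectory = []    # (clock_reading, system_state) pairs
    for _ in range(1000):
        # the whole configuration changes in one elementary step...
        system += random.gauss(0.1, 1.0)   # drift + noise
        if random.random() < 0.5:          # ...and sometimes the clock ticks
            clock += 1
            # we only "observe" the system when the clock advances, so the
            # evolution is parametrized by clock ticks, not by raw steps
            trajectory.append((clock, system))

    print(trajectory[:5])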
  5. An alternative, or complementary, shorter comment on this: the reason is so we can understand where the deterministic mechanisms come from, in terms of a kind of learning logic. So in a sense all the deterministic mechanisms are removed altogether and replaced by a generic learning model. The deterministic rules are output of the learning model, but they are not fundamental per se. Instead I consider learning fundamental; the specific output depends on input, which in this case is data, or experiment, or experience. Of course this is a massive task, and I don't have all the answers yet. But even given that, I sense that the logic and philosophy of this is promising. Which is just my personal opinion, of course. Instead of trying to learn/discover/build something specific, I changed the focus to the learning/exploration/creative process itself, and I think it will be found that a lot of fundamentals in physics can in fact be traced back to such abstractions. There are already today some fairly deep similarities between relativity, QFT and stochastic processes, and it's hardly a coincidence. But the missing part is the adaptive step, the learning step itself, as a stochastic evolution. /Fredrik
  6. I'm not sure I understand your comment here. What I am suggesting is that realising that we simply can not "know everything", so to speak, will make it easier to understand what is really going on here. This is by no means in contradiction to the fact that we are learning. But there is likely a limit to what any given subject (= particle, observer, subsystem, organism) can learn, at which point we more or less hit some "residual uncertainty" we can not resolve. The residual uncertainty for the elementary particles is clearly massive as compared to the human perspective. The view of reality as perceived by, say, an electron is clearly quite different from ours. The human perspective is clearly different too (all perspectives are different anyway), but I think at a more abstract level it is not distinguished in any fundamental sense. I think the fundamental treatment must transcend these things and work in a more abstract way, or at least try to. The problem with thinking that all the answers are out there, and that they would be accessible to us if only we found them, is that it is likely to lead us into inconsistent models. We would be looking for something that probably does not exist. I think of ordinary QM of particle physics as a kind of information mechanics of the residual uncertainty domain. It is basically random mechanics under some particular set of constraints and priors. Of course, the more complex the systems we study, the more significant the learning. I think this is also the thinking needed when gravity comes into play. The basic rules of learning, and the updating that happens in the residual uncertainty domain, are of course exactly the same; it's just that in the latter domain the changes are distributed as per a fixed random distribution forever, and nothing new is learned. This is clearly the simplest of the possible cases, and I think we need to think differently here. /Fredrik
  7. Hello, I was away for a few days. Just got back. My point is that I think ignorance comes first, and I acknowledge my ignorance. Learning is step 2. The so-called deterministic rules we know are something we have learned by consuming and processing data, or in terms of physics, by a lot of interactions and experiments. But I think that in general the deterministic rules are not static and everlasting; there may come a time when they evolve as well. So the determinism is often effective, a "macroscopic determinism" indistinguishable from microscopic chaos. But my point is that whichever it is doesn't matter. The state we face is that we don't know certain things.

The "explanation" I am looking for is exactly how knowledge and structure grow out of ignorance and chaos. I don't label it "a non-deterministic mechanism in operation"; that makes no sense, as you imply, which was what I meant to say. Instead I just say "I don't know". And in reality it is more the rule than the exception that you need to make a decision based on incomplete information, yet you still need to argue for your chosen path. How to do that? You argue in terms of the information at hand, and find your weighted "best bet". Given a symmetric situation, you can perhaps deduce that you end up with a class of options that - to your information - are equally sensible, so you just pick one at random. If you want to explore and break the ignorance, then after some random walking you have acquired more data that allows your prior to be promptly updated. And along the way you have a minimalistic and motivated update rule. If I simply don't have a clue, I have no reason whatsoever to argue that one path is more preferred than another. If experience later shows that certain paths are in fact preferred, then my ignorance is broken and I update my prior. Future learning then always considers my total ignorance constrained to my prior. So my prior information does imply a pattern, or geometry, on my ignorance. If you choose to say that I have a non-deterministic rule, that sounds odd; I simply acknowledge that "I don't know" and leave it at that. I think that's about as simple and honest as it gets. The minimalist philosophy, in a sense. My preferred partitioning is into the information I have and the information I don't have. I have no reason at this point to speculate on the nature of what I don't know. A point here is that the assumption that I "could have" complete information and still be consistent is not proven; I think the evidence rather suggests it is often false.

But I think we are rather constantly learning. A big, complex subject like the human brain is clearly "learning". A much simpler subject like, say, an electron is simply updating or revising. There seems to be a limit to what an isolated electron as such can possibly ever "learn"; its learning curve quickly flattens and turns into a state of updates and revisions. I think there is a uniform system behind this that can be nicely worked out from first principles, and it could probably also explain a lot of things in depth from first information principles. One advantage is that this route will (yet to be formally proven, of course) from scratch have a natural integration of general relativity and quantum mechanics. I've got a clear gut feeling about this, and I think it's the way to go.

It can hopefully properly resolve several of the fundamental but historically ignored issues in physics, and it can do so without introducing ad hoc stuff that needs further explanation. Better still, I think fundamental physics can then be interpreted as learning mechanics. Analogous to frames of reference in GR and SR, the beefed-up QM will IMO need to acknowledge that interacting parties have different prior information. This also has the potential to completely unite the system vs observer issue. The observer will be treated just like a particle, because there is no reason whatsoever that the observer is distinguished. Details remain unclear, but somehow the information capability of the observer must be defined here. /Fredrik
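A minimal sketch of the "pick one at random under symmetry, let data break the ignorance" idea above (my own toy illustration; the two "paths" and their hidden success rates are hypothetical):

    import random

    random.seed(4)
    alpha = [1, 1]                      # Beta(1,1) priors: total ignorance,
    beta = [1, 1]                       # both paths equally sensible
    true_success = [0.3, 0.7]           # hidden; unknown to the learner

    for trial in range(200):
        choice = random.randint(0, 1)   # symmetric prior: explore at random
        reward = random.random() < true_success[choice]
        alpha[choice] += reward         # prior promptly updated by data
        beta[choice] += 1 - reward

    # posterior means now clearly prefer path 1: the symmetry is broken
    print([round(a / (a + b), 2) for a, b in zip(alpha, beta)])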
  8. A bit of philosophy, but... here is where I'd prefer to apply a philosophical principle and choose the simplest (minimal) representation consistent with data, inspired by the idea that if you assume representations appear randomly, a more complex representation is less likely to begin with. If we have no prior reason to assume any particular deterministic rule: a) from our effective point of view, it's "random" (which is not to imply it will stay random forever, as we will respond promptly to deviations); b) we could consider the probability distribution over "all possible" deterministic rules, but since they are all equally likely, we have really not gained any information; we have only enormously increased the complexity of the representation by introducing a lot of unknown variables. I think expanding the representation is only motivated by experimental evidence - i.e., as soon as we have significant deviations from a random distribution. /Fredrik
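Point (b) can be made concrete with a small sketch (my own hypothetical illustration): put a uniform prior over every deterministic rule mapping the previous bit to the next one. Before any data arrives, the Bayesian predictive probability is exactly 0.5, indistinguishable from "it's random", so the richer representation buys nothing until evidence does the work:

    from itertools import product

    # all deterministic rules from one bit of history to the next bit
    rules = list(product([0, 1], repeat=2))   # rules[r][prev] = next bit
    weights = [1.0] * len(rules)              # uniform prior over rules

    def predictive(prev):
        """Posterior-weighted probability that the next bit is 1."""
        total = sum(weights)
        return sum(w for r, w in zip(rules, weights) if r[prev] == 1) / total

    def update(prev, observed):
        """Zero out rules inconsistent with an observed transition."""
        for i, r in enumerate(rules):
            if r[prev] != observed:
                weights[i] = 0.0

    print(predictive(0))       # 0.5 -- the big prior gained us nothing
    update(prev=0, observed=1)
    print(predictive(0))       # 1.0 -- evidence, not representation, did it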
  9. IMO, if on the contrary there is no evidence for deterministic behaviour, the principle of insufficient reason suggests that we have no reason to *complicate things* any further than to treat it as random or pseudo-random, because the two are indistinguishable anyway. I.e., our best and simplest guess, in the spirit of the mentioned philosophical principle, is simply a uniform distribution, given no info that suggests otherwise. But at the same time, given that we have the memory capacity, we continuously keep evaluating the randomness of our ignorance. As soon as significant doubt is found, things take a new turn. Another reason for this is that any real observer has limited memory capacity and should not fill memory with what "seems to be" random data... often there are better uses for limited memory. However, suppose the observer is put into plain chaos; there simply isn't much else to feed on but the apparently random numbers, and chances are that eventually he will decode all the data, given his memory and processing constraints. That's how I see it. Sorry for the many posts. /Fredrik
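A hedged sketch of "continuously evaluating the randomness of our ignorance" (my own illustration, not anything from the thread): keep the uniform model for a symbol stream until a simple frequency test raises significant doubt. scipy's chisquare tests observed counts against a uniform expectation by default:

    import random
    from collections import Counter
    from scipy.stats import chisquare

    random.seed(1)
    stream = [random.randint(0, 3) for _ in range(400)]            # uniform
    biased = [0] * 300 + [random.randint(0, 3) for _ in range(100)]

    def still_looks_random(symbols, alpha=0.01):
        counts = [Counter(symbols).get(s, 0) for s in range(4)]
        _, p = chisquare(counts)      # H0: uniform over the 4 symbols
        return p > alpha              # keep the "random" model while p is large

    print(still_looks_random(stream))   # typically True: stay with the prior
    print(still_looks_random(biased))   # False: significant doubt, update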
  10. But still, I think that in the standard formalism of QM and QFT the deepest nature of this "information mechanics" is not yet understood. There is a lot of abstract thinking going on, and I think the future of this field should involve cooperation between physicists and some of the AI research community. I think bascule's association with data encryption and encoding is dead on. In that thinking, I think we are more or less on the same page. /Fredrik
  11. The difference between classical mechanics and QM is that in classical mechanics uncertainty wasn't taken seriously. The fact that observations are fundamentally correlated on an abstract information level was not acknowledged. Instead it was just considered experimental error that "in principle" could be resolved, so that once we made the ultimate device we could predict everything from the initial conditions. Even classically, it is easy to see that such a scenario is simply unreal and impossible. In the QM philosophy, the status of information is acknowledged to be fundamental - in fact more fundamental than physical reality. When the mechanics of this logic is worked out, there is an intrinsic uncertainty that has to do with information, relations between possible states of information, and the references. There are still a lot of conservation laws, because a lot of the information mechanics is just a zero-sum game, but in reality there is always an incomplete reference, which cannot possibly contain all the information. Incompleteness is a key in this philosophy; wrongly assuming completeness often leads to inconsistencies. This is why I think that at the fundamental level all we have is an information mechanics. And in this view I think what bascule writes can fit too; I'm just not sure what the philosophy is. /Fredrik
  12. Additional comments to my previous ones. Inspired by the philosophy that science deals with actual observations rather than, in some sense, "in principle possible" observations, I think that the effect X implies on Y (the real experienced effect relative to Y) cannot depend on details or variables that are unobservable to Y, because that would be sort of non-scientific IMO. I.e., to assume that there is something "unobservable" that might in fact resolve our uncertainty does not seem like a scientific statement, except insofar as it can be interpreted as "there is a possibility that we can learn new things" and "we may find that what we considered 'random' is in fact not", which is a statement I definitely agree with, but I consider that a different thing. I guess random is a weird concept anyway. I use it in the meaning of lack of prior information. And whether there is a rule, unknown to us, that renders it deterministic does not help, or make a difference, does it? Let me know if you feel I missed your point. Since I guess you are a computer guy, perhaps you had another intention? /Fredrik
  13. I think bascule sort of has a point; here are some comments. The way I personally see it, information is the core concept in a modern QM interpretation, and physics can be thought of as "information mechanics". The way I consider the indeterministic nature of QM is that the condition of the "what if" game - "what if we knew everything today, then we would know everything tomorrow" - can never be fulfilled in reality. I think the essence of the indeterminacy of the world is that there is a fundamental uncertainty in the knowledge/information one subset of the universe possesses about another subset. So whether there is a rule that for fundamental reasons we can never understand, or there is no rule at all - I think the two cases are not distinguishable from each other, and thus the response is the same.

Bascule, in a certain sense I think you have a point. However, I don't see it as a problem. The indeterminism is IMO about acknowledging our own incomplete information. It will give a more consistent theory, because incompleteness is a property of reality, and I think it affects our apparent physical reality more than may seem intuitively clear. In a real situation, if a variable appears random to someone/something, it means this someone/something has no information whatsoever about this variable. What is the best guess this someone/something can make about this variable as a basis for its responses? Clearly, it has absolutely no clue. So treating the variable as a "random variable", until there is sufficient new data to suggest otherwise, seems like the best choice. IMO, the real task here is to base a response on the information we have, given our incompleteness.

So in a certain sense I might think you have a point with your statement. However, I raise this point, which I think is the KEY here: equally likely relative to what, and in what setting? And what is the difference between an undetectable/unobservable determinism and real indeterminism? I think there is no difference; it's the same thing. However, the possibility remains that what we treated as noise before has now gotten structure. But that concerns evolving and learning, which IMO is a central property of a good theory. In this respect, I think the standard formalism of QM is insufficient and inconsistent and needs improvement.

To summarize my personal view: in my view of reality, indeterminism is real; however, this indeterminism is not static. Note that I always reference a real observer here. I.e., indeterminism is evaluated as per a particular observer. (Evaluations made by some imagined God, or in retrospect, are not valid references.) /Fredrik
  14. I'm not sure what he meant by free. Free as in "free of charge", or, for example, Gibbs free energy, which is the most common sense of useful energy? I suspect he meant the latter. http://en.wikipedia.org/wiki/Free_energy /Fredrik
  15. IMO, if you are searching for "faster than light" stuff that stands a chance of making some sort of sense, I'd look at the Planck scales, where things are more chaotic. /Fredrik
  16. A general non-technical comment. I consider myself quite philosophical and can accept transient fuzz no problem, but I find this to be too much analogy; it really doesn't give me a sensation of gravity explained. Analogies can be nice for inducing an intuitive feeling for something that is otherwise too abstract, but in my terminology that is not an explanation. For me, more often than not, "explanations" tend to incorporate deeper abstractions. They go back to first principles to explain previously ill-defined terms, show that previously empirical facts can be deduced in an expanded fundamental setting, and thus replace assumptions or postulates with further abstractions and a smaller number of more fundamental postulates - rather than arguing for something only in terms of common sense coming from analogies, which IMO tends to encourage circular reasoning. The first question I ask myself as a reader when I start reading this is: what kind of explanation is this intended to be? Explain gravity - in terms of what? And is a mathematical formalism supposed (in the end) to come out of this, or is it supposed to stay a verbal elaboration of gravity? This was just meant as sincere, reflective criticism based on the questions that pop into my head when I read this. /Fredrik
  17. Well, not exactly like burnt circuit boards, which tend to have a sharp burnt edge to them, but there are "similarities" in the aromas - sort of the same class, at least relative to my distorted brain. Though some wild yeast strains can also produce a spicy "burnt" phenolic character. If you don't like beer, did you try any beers other than the typical lame light lager beers? Before I started to brew (by accident) I didn't like it much either. But then I had never tried the GOOD, flavourful beers. Depending on where you live, the good beers may be found at speciality shops or quality pubs that have a good selection. /Fredrik
  18. I had a period a long time ago when I was more of a chem geek and had a bottle of crystalline phenol, but I discarded it at a recycling station when I stopped playing around like I used to. It should be treated with respect from the health point of view, but like YT2095 writes, I don't think it smells bad per se; on the contrary, it has a kind of sweetish aroma. Needless to say, I never put it in my mouth. It reminds me of burnt circuit boards, as well as of some Belgian beers (I'm a beer freak). "Phenolic" is in fact a flavour descriptor used in beer tastings. In beer it's not actually phenol, but a range of compounds that contain aromatic rings and are reminiscent of phenol; some go to the vanilla side too. These compounds originate from aromatic compounds existing as esters in the barley cell walls; they are freed during the process and converted by the yeast into nice aromatic compounds. The "phenolic" touch is a signature of many German wheat beers and Belgian ales. Leffe Blonde is my fave, but it also has significant amounts of banana esters and solvent-like nail-polish esters, mainly due to the yeast strains used. If you want an idea, get a bottle of Leffe Blonde and try to mentally subtract the nail-polish solvent and the banana, and there you've got the phenolic-like, sweetish, excellent aroma. Either you like it or you hate it. When it comes to beer, some people love the stuff, some don't; classifying aromas as positive or negative is often subjective. /Fredrik
  19. Ok, I'll probably get back on this. I guess a more sensible explanation of my suggestion will require starting from scratch. IMO, a good theory deals with observable quantities and abstractions thereof only; objects and phenomena should also be logically devised in terms of the latter. The probabilities I talked about mean that on the short time scale, time is more like a random walk, hardly distinguishable to a "micro-observer". But there is a "drift" in the random walk, and in the classical domain this drifting random walk appears completely continuous. From first principles of the random walk, interesting things can be said. I'm still struggling with the starting points though. Once I've finished it properly I'll attempt a better explanation. What I wrote was admittedly informal. /Fredrik
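A small sketch of the "drifting random walk" picture (my own toy illustration; the drift and noise values are arbitrary assumptions): at the micro scale each step is mostly noise, but coarse-grained over many steps the path is dominated by a smooth drift, as in the classical domain described above:

    import random

    random.seed(2)

    def walk(steps, drift=0.05, noise=0.5):
        x, path = 0.0, []
        for _ in range(steps):
            x += drift + random.gauss(0.0, noise)   # micro step: mostly noise
            path.append(x)
        return path

    path = walk(100_000)
    # coarse-grain: average position over blocks of 1000 micro-steps
    coarse = [sum(path[i:i + 1000]) / 1000 for i in range(0, len(path), 1000)]
    # consecutive coarse values rise by ~50 per block while fluctuating far
    # less, so at this scale the walk looks like near-continuous drift
    print([round(c, 1) for c in coarse[:5]])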
  20. > I think you're confusing "functional" with "procedural". Possibly. I wasn't too careful with the words there. What I meant is object-oriented vs function/procedure-oriented in the loose sense, not to imply any details. Btw, English isn't my mother tongue, so bear with me. /Fredrik
  21. IMO, the whole philosophy of GR is to formulate the theory in a coordinate-invariant way, so that the so-called laws of physics do not depend on such things as the choice of parametrization or frame of reference. When Einstein made SR, which holds true only for inertial frames, this was clearly unsatisfactory, because the laws of physics must clearly hold in ANY frame. This, I think, was the motivation for Einstein to look for general relativity: he knew SR was incomplete. The universal, invariant principle is that the state of a system will "diffuse into the future" in a direction suggested by the principle of insufficient reason - the shortest path, or the path of least resistance - what we call a geodesic. Einstein's idea was to look for the spacetime geometry that alone could explain away gravity as an additional force. He also assumed that SR holds locally (in the differential sense), meaning that the speed of light is constant in the differential sense. Of course, one can not non-trivially parallel-transport vectors between tangent spaces at different spacetime points.

I think Einstein's outline here is excellent, but some parts needed for the future (quantum gravity) are missing. For one thing, Einstein never spends a thought on the status of the event space, and on the extent to which points in event space are equally certain. I think of this as the classical (non-quantum) limit of the future theory we look for.

This is sloppy and very informal, but I'll spit it out anyway. For an alternative (admittedly speculative) elaboration of the upper bound of change, consider this. Pick a clock device of your choice, and define your choice of "time" units. Then use this as a parametrization of your observed, distinguishable changes. What will the next immediate future look like? We have a range of possibilities with different conditional probabilities (based on the present). Since time is simply an internal parametrization, some of the observables have to do with changes in the state of the clock device; thus the clock is a subset of the system. If we work in the classical domain, the expected path into the future is the most probable path, and the most probable path should be in the direction of maximum prior probability. If we now quantify the change into the future relative to the change of the clock device, it seems intuitive that this ratio can never exceed 1. It equals 1 only when the future of the pure clock device coincides with the future of the system. If there were no upper bound on this relative change, it would mean that a pure change in the clock device (other things held constant) is more probable than the change that was by definition chosen to be the most probable change - a contradiction, in other words. /Fredrik
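A toy sketch of one concrete reading of that bound (my own construction, not Einstein's formalism or fredrik's): if "change" is measured by Hamming distance on bit configurations and the clock is a subset of the system's bits, the clock's change can never exceed the whole system's change, so the ratio is bounded by 1, with equality only when all the change happened inside the clock:

    import random

    random.seed(3)

    N_SYSTEM, N_CLOCK = 16, 4          # clock = first 4 bits of the system

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    state = [random.randint(0, 1) for _ in range(N_SYSTEM)]
    for _ in range(10):
        new = [b ^ (random.random() < 0.2) for b in state]   # random flips
        d_sys = hamming(state, new)                  # change of whole system
        d_clk = hamming(state[:N_CLOCK], new[:N_CLOCK])   # change of clock
        if d_sys:
            assert d_clk / d_sys <= 1.0    # the bound holds by construction
        state = new
    print("clock change never exceeded system change")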
  22. Swanson, I keep noticing that some authors on their personal sites link both to the journals where their articles have been published (for example, Physical Review A), where you can purchase the PDF article, and, on the same page, to arXiv, where the same paper can be downloaded freely. Is this technically a violation of the "signed away copyright", or is it part of the "fair use" you talk about? /Fredrik
  23. I think once you get a little bit used to thinking like a programmer, it's not hard to learn new languages. There is a jump from the old functional programming to object-oriented programming, but I don't see why you couldn't practice both. I started programming when I ran into the limitations of my programmable pocket calculator. I first learned QuickBASIC, and then Turbo Pascal on my first computer, because those were the only compilers I owned and could practice with. Then during the programming courses I happened to be in the year when they changed the "first language" from Pascal to C++. So, oddly, I got the first semester in Pascal and the second, follow-up semester in C++. A bit odd, but that wasn't a big problem. Maybe it depends on what you intend to do? Are you doing database programming, web development, or are you implementing heavy numerical routines? These days, for light stuff, I still use VBA; Excel and VBA are my "pocket calculator" now, and I rarely use pocket calculators anymore. But for anything more advanced you need efficiently compiled code. For this reason I'm not much of a Java fan either. It was several years ago that I worked with it, and I am not sure if the Java engines have improved, but the last time I used it the performance sucked. My limited experience is that the cumbersome part of programming doesn't have much to do with the language; it's the complexity of the function or class library. /Fredrik
  24. I am not much into all the standards, but if I am not mistaken, the definition of the metre is (or used to be) tied to the speed of light, and the second to atomic radiation - meaning that the speed of light is constant by definition. But others know these standards better. /Fredrik
  25. First, I would like to say that's an excellent question. There are different answers depending on how you interpret it. In the pragmatic sense, one can say that all standard theories of modern physics take the speed of light to be constant. This is also one of Einstein's postulates (assumptions) on which he built his theory of relativity, and these theories are pretty consistent with the experimental data we have so far. Thus we have not yet seen a reason to abandon the idea; it seems to be a consistent one. If the speed of light were not constant, many working theories would have to be reworked.

To look for a philosophical why, here is my personal suggestion of a start, and it is to ask more questions: what is light, and what is speed? Speed should be a measure of position change with respect to time. And what is time? Supposedly, time is simply relative change in a clock device. This means that speed is somehow something like "relative change relative to another change". And what is "change"? Change is clearly related to the notion of information, of knowledge, of being able to distinguish one thing from another; I think it can be reduced to the relation of distinguishability. So if we for a second ignore "what is light", the answer seems to be related to relative changes in information, somehow. This is just talk at this point, but I think these things can be formalized into a more stringent formalism. It's part of what I'm currently working on, to try to formalize what is intuitively clear, and IMO it is very exciting. As for the particular number, it's clearly just a unit-convention thing. Why is one inch 2.54 cm? /Fredrik