SuperSlim

Senior Members
  • Posts

    83
  • Joined

  • Last visited

Everything posted by SuperSlim

  1. Let me at least try, once again then. It is not correct that the claim you refer to, which I made, is that Shannon entropy and thermodynamic entropy have the same definition. Nor is it correct, as you conclude, that entropy cannot decrease. You have perhaps forgotten what thermodynamic entropy is, or you haven't grasped the significance of Maxwell's demon. I guess the latter is excusable to some extent; it took over a century to explain why the demon can't violate the laws of physics. I honestly think I have been addressing them. I've already noticed that some posters here seem to have a problem with understanding what a computer is. That's actually a bit of a worry; I'm not sure I'm discussing the subject with people who really know what the word means. You yourself appear to have just suggested I can build a computer that doesn't have to erase information if it doesn't do anything. So I have to ask: do you really know what a computer is?
  2. I'm not sure what you mean there. In a deterministic machine (a digital computer running a program), storage and erasure are not concerned with "unavailable information". Storage and erasure are logical operations; both are local, not global, operations. A global reset is not usually considered a program, although it does leave the system in a known state. Powering down a computer is one way to leave it in a known state. Computation without energy doesn't make a lot of sense to me. If information is being transformed, and if information is physical, how can it be transformed without a source of energy to physically transform it? Or are you just pointing out that a square is topologically already a circle, no computing required?
  3. Even if it isn't true, our sense of being free to choose is fundamental to survival; evolution and the absence of free will don't seem to add up.
  4. Information: it's something whose nature depends on a choice. It can be all kinds of physical objects, but computation is a different level. Information can be defined in terms of static objects with fixed physical values, but computation has to transform some information, there has to be an information-transport mechanism that takes inputs to outputs. It seems that if we choose electric charge to be a computational basis (the presence or absence of fixed amounts), then we restrict the choice for erasure. Erasure has to mean a complete lack of any "forensic charges" left lying around, there has to be a large number of d.o.f. when electrons dissipate so a charge vanishes from some location (where it might have been stored briefly, or been conducted through a transistor, etc.). Erasure is by definition the same type of local change as storage.
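The erasure cost the post above alludes to can be made quantitative: Landauer's bound says erasing one bit must dissipate at least k_B·T·ln(2) of heat into the non-informational degrees of freedom. A minimal Python sketch (the function name is mine; the constant is the standard SI value):

```python
import math

def landauer_limit_joules(temperature_kelvin: float) -> float:
    """Minimum heat dissipated to erase one bit: k_B * T * ln(2)."""
    k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
    return k_B * temperature_kelvin * math.log(2)

# At room temperature (~300 K) the bound is a few zeptojoules per bit.
print(f"{landauer_limit_joules(300.0):.3e} J per bit erased")  # ~2.87e-21 J
```

The number is tiny compared with what real electronics dissipates per bit, which is why the limit matters in principle long before it matters in practice.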
  5. Right, a computer is indeed a physical system; however, in an electronic computer an active computation treats heat as non-informational. A quote from Charles H. Bennett (the inventor of the Brownian motion computer): https://arxiv.org/pdf/physics/0210005.pdf
  6. The non-informational d.o.f. of a physical system must be effectively infinite in number. This seems to be necessary to erase all evidence of some information-bearing physical quantity, such as a fixed amount of charge in a digital computer. It can't even be like burning paper, which "erases" a written message; there is evidence that a message was "erased", so in principle the message is still around. Not all the IBDGs have been dissipated into an ocean of "no information".
  7. I think what the (so far, confusing) storyline might be is that Baez conjectured whether a theory is like an algorithm, and an experiment is like a path or trajectory through an (abstract) machine. I guess I've picked up the baton here. In any experiment, what can you write down that is representative of the state of a system (of, y'know, particles), and then what do you do with the recorded information? Just to throw a spanner in, what if someone else steals your notes and burns them, and they're a pile of ashes now? (The machine has run an unexpected information-bearing algorithm.) I'll add the observation that Kirchhoff's laws are based on ideal elements; they say nothing about temperature. An ideal resistor has a fixed resistance; a real physical resistor has a resistance that depends on temperature. The trick is to restrict the ambient temperature by actively or passively cooling the physics down. So is there something like a "Kirchhoff's algorithm" for chemistry? If you can define an abstract machine and paths through it, and algorithms, well, maybe.
  8. I'll just add this comment: The mathematical Fibonacci sequence can be written down and assumed to be a static picture. That is, it can be assumed to be "free of physics". How accurate is that assumption, though?
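The "free of physics" question above can be made concrete: even a "static" Fibonacci sequence has to be produced by some process that performs additions, and on a real machine each addition is a physical operation. A minimal Python sketch (function name is mine):

```python
def fibonacci(n: int) -> list[int]:
    """First n Fibonacci terms. On any real machine, each '+' below is a
    physical operation, so the 'static picture' was never free to produce."""
    seq = [0, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

print(fibonacci(10))  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```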
  9. By finding a common algebra, preferably one that defines linear relations (such as the addition of voltages). I believe it's called information. Or, what it is you know, and what it is you can't know. The reality of information.
  10. I haven't seen that, but I have seen a claim that thermodynamic entropy is connected to Shannon entropy. I personally believe the connection is quite a strong one, mainly because thermodynamic entropy must be considered in any computational device.
  11. Yes absolutely. Erasing information means you erase the information-bearing degrees of freedom of what has been defined (before anything is written or erased) as having those IBDGs. You map those to NIBDGs, also pre-defined. The mapping is always a physical, dissipative process. That's quite a coincidence. I find you confusing, but then you have jumped to the conclusion that since I started talking about an example--electronics--that's what I have to stay with, I can't generalise it. I can't talk about a chemistry experiment or coupled pendulums, or anything else, because I have to talk about electronics.
  12. Look I'm sorry but that seems to be a mischaracterization. "The folks" are trying to explain why thermodynamics and information theory are connected. Because they are, I mean it's just one of those things. Take information entropy and multiply it by the total energy of a system, then divide that by the average energy per particle, and you get a number that has no physical dimensions. It's a ratio. So say information is also physical, it has the same kind of thermodynamic entropy when considered as a source of heat, surely? In a computer, the information is a resource, so is heat but heat is a "waste product". Turn that around and consider the information being processed by a cyclic heat engine, and say what the computer is doing is wasting a resource. Why doesn't that work? Or maybe, it does work. So that's the memo I guess. In any computation there are chosen information-bearing degrees of freedom, and so there are, although we almost don't bother with it, non-information-bearing degrees of freedom. I think this is some kind of fundamental principle, perhaps, so must apply to any computation. If an experiment is such a thing, and I can't see how it isn't, there we have it. Except for the choice part because most of them are made for us.
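The dimensionless ratio described above can be checked directly: total energy divided by average energy per particle is just the particle count N, so multiplying Shannon entropy by it gives a pure number. A minimal sketch with hypothetical values (the function name and numbers are mine):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

N = 1000                           # hypothetical particle count
e_avg = 4.0e-21                    # hypothetical average energy per particle, J
E_total = N * e_avg                # total energy, J
H = shannon_entropy([0.5, 0.5])    # one fair bit of information entropy
ratio = H * (E_total / e_avg)      # = H * N, a pure number with no units
print(ratio)  # ~1000
```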
  13. Why do you think that's what I'm laying out here? I was talking about an experiment with an electronic circuit. You've pointed out how in chemistry you can do non-electronic experiments. Something I know already, but thanks anyway. Obviously something has whooshed straight over that head of yours. I'm trying to examine what information there is in any experiment and how you would encode that information, not necessarily on paper or in an electronic circuit. I know how scared some people are of Shannon entropy, but it is what it is. So I point out something about how you can choose 0V in a circuit, and this doesn't seem to you to be something you can do in every experiment. Well, you could be right about that.
  14. I think, when asking about the elastic properties and what the restoring forces are, one should consider what happens in gravity-free situations. Does a bell vibrate if struck in space? Does water, when it forms blobs, wobble around because of gravity when in space, and why do blobs of liquid stay blobs? Given that the ISS is in free fall and has an atmosphere, I think a gas in a container in space is covered. From a physics textbook: "It is very important to understand what is propagated as a wave in a wave motion. . . . it is not matter that propagates, but the state of motion of matter." That is, wave motion is not a bulk flow of matter. Actually, Baez states it's the flow of momentum: parts of the medium transmit momentum to other parts, so that the matter moves the least distance to transmit the wave. Waves obey a principle of least action. And here's something to think about: the waves I produced on the surface of a large brandy bowl had a small wavelength; the pattern was like rays of standing waves, and near the center of this "wave motion" the wavelength was so small it effectively vanished. The waves near the middle could not have been gravitational; the wavelengths were too small. So, a blob of water in free fall: can you make capillary waves on the surface of the blob? How is gravity involved?
  15. And these quite different things are . . . ? Why don't you just point out what the problem is, with whatever it is you think you've spotted? Why not just quote something from a recognized source? Elastic waves in a compressible fluid should not be muddled with the elastic waves in an incompressible fluid, because . . . Restoring forces in compressible fluids are different to those in incompressible fluids, because . . . p.s. does it have anything to do with momentum?
  16. Which experiments don't involve anything electrical? Or do you mean the ones that don't need to plug anything into a mains supply? Unless the experiment involved the creation of some voltages, if that was the aim of the experiment, you mean? Say the experiment was about mixing reagents together and observing what happens. Say you also know what reagents are being mixed together. How would you relate something like Kirchhoff's laws to the chemistry in the reactions? Or would your opinion be that it isn't something anyone would bother with? Would someone who tries to formulate a linear algebra of reactions be just getting lost, you think? You know how easy that is; sometimes people don't manage to connect a post with the one preceding it!! And it's from the same person, and it's about the same thing!!!
  17. So when an organist stops pressing a key and the air relaxes in some pipes, that isn't an example of elasticity either? What about when a pipe organ key is pressed, or when I was making a liquid 'vibrate' with standing waves? Is either of those an example? If it isn't, please give a definitive example and explain why my examples aren't examples of elastic waves in a medium. Please bear in mind that a bloke named Chladni investigated glass plates covered with sand, and used a violin bow. Patterns--standing waves--appeared; when he stopped bowing the side of the glass, the sand stopped responding. A layer of sand doing nothing on a sheet of glass is what kind of example? Wait, I know what kind of example: it's an example of small bits of solid matter acting like the disconnected particles of liquid or gas do when they are subjected to elastic waves. Let me put it this way. Although elastic waves are common in sound, in liquids, and of course in solids, people who have a physics background assume that elasticity is restricted to solids and is a long-range phenomenon; it happens because solids have connected particles, which stay where they are. But it isn't just solids with their long-range version; liquids and gases have it as well. In fluids, elasticity and surface tension aren't restricted to surfaces; the whole medium has surfaces in it, right? When an organ pipe vibrates, elastic standing waves are set up; there are density variations along the pipe, so the air gets squeezed and stretched; the forcing means the air has to rearrange itself, 'elastically'. Ok?
  18. I guess what I'd like to do is verify that Landauer's principle holds for any experiment, including those that appear to violate Landauer's limit for the erasure of information. This erasure must be based on a choice of what the information is, and this appears to be unavoidably related to a choice of gauge. That is, choosing where 0V is, is something you have a few degrees of freedom for in the average electronic LRC circuit, including those with active elements. This choice determines the values (the information content) of other voltages. A string of voltages/currents that add to zero should obey Kirchhoff's laws (a linear algebra over V and i).
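Both claims above, Kirchhoff's laws as linear relations and the freedom to choose where 0 V is, can be checked on the simplest possible circuit. A minimal Python sketch with hypothetical values (9 V source, 1 kΩ and 2 kΩ in series):

```python
# Kirchhoff's voltage law on one loop, plus the 0 V "gauge" choice.
V, R1, R2 = 9.0, 1000.0, 2000.0
i = V / (R1 + R2)                   # single loop current (Ohm's law)
v1, v2 = i * R1, i * R2             # voltage drops across R1, R2
assert abs((v1 + v2) - V) < 1e-9    # KVL: drops sum linearly to the source

# Node voltages relative to one choice of ground, then a shifted one:
nodes_a = [0.0, v2, V]                  # ground at the bottom node
nodes_b = [n + 5.0 for n in nodes_a]    # redefine where 0 V is
diffs_a = [nodes_a[k + 1] - nodes_a[k] for k in range(2)]
diffs_b = [nodes_b[k + 1] - nodes_b[k] for k in range(2)]
# Shifting the reference changes node voltages but no voltage *difference*:
assert all(abs(a - b) < 1e-9 for a, b in zip(diffs_a, diffs_b))
print(i, v1, v2)  # 0.003 A, 3.0 V, 6.0 V
```

The gauge shift leaving all differences intact is the circuit-theory version of the "choice of 0 V" freedom the post describes.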
  19. Suppose you're doing an experiment that involves the use of an electronic circuit. Where do the electrons come from, and where do they go to? If there wasn't a "bath" of available electrons that can be pushed around a circuit then dumped back into the reservoir, electronics would be a lot more esoteric, like particle accelerator experiments with hadrons, maybe. Electrons are light particles and they're easy to gather together in classical amounts. The experiment focuses on the circuit and the voltages and currents in it; the fact that there is a reservoir of available electric charge is sort of a given; ignoring it has no effect on the experiment. But in quantum experiments, the electrons have to go back into the reservoir; if this didn't happen they wouldn't leave any marks on a screen, they wouldn't be detected. I'm trying to examine this idea of having a reservoir, part of which is given a known state (230V, 60Hz), does work, then returns to the same unknown state it was in. In particular, it seems to be an important detail in quantum experiments--particles that need to leave evidence behind also have to vanish, so nothing further can be known. What's the common denominator? Information has to be "written" somewhere, then it has to be "erased", at a cost.
  20. When I took Communication Theory (quite some years back), I learned to my surprise that information has entropy. A message that's been received has an information content, and before it arrives there is an expectation that the information will be received. Sending a message therefore comes with an expectation of it being received; this is something that also intersects the domain of engineering--you want the sender and receiver to have a reliable channel and an agreed protocol. So there is the connection: an experiment is in both domains; you need to do some engineering (after doing some design), then you get some information, and then . . . analysis. You hand in the lab assignment and hope you get at least a C. So where I'm trying to go is: when we do the experiment, what information isn't then in the analysis, what do we ignore, and does it matter? Why would or wouldn't that be true? I know it sounds trivial, but so does the reason Maxwell's demon doesn't get to "see" gas molecules and violate the 2nd law (you know, the one that also says, more or less, that time doesn't go backwards).
  21. Another thing about fluids like water: when you drop an object into it, it doesn't respond "gravitationally". The response is very much a property of liquids; the reaction, seen with high-speed photography, seems more elastic than gravitational. It suggests that liquids respond to a sudden increase in pressure much the way gases do, except for the compressibility. That gases are elastic is supported by the existence of wind instruments, and my old physics textbook. It seems that elastic properties are not confined to the solid state. p.s. thanks to swansont I now know about Faraday waves; I thought my little experiment might be an example of Chladni waves.
  22. I think you might be confusing long-range elasticity in solids, with the kind in fluids. When you drop something into water, there is a response, not a long range one except that after the local one (a splash, maybe some plumes), waves spread out on the surface. But these waves must involve the same short-range elasticity the water just demonstrated with the response to an impulse. The object you dropped did not just slip into an inelastic medium, it would have just disappeared soundlessly under the water; there would be no "long-range" waves on the surface.
  23. I might have worked it out. If you attach a weight and let go, it accelerates; now there are newtons and a second derivative to deal with. Exclude this by not letting the weight accelerate (duh!). Lower the weight at more or less constant velocity, declare that you in fact achieved this, and eliminate the time factor in the usual way with s = vt. What's left has units of kg.m. In other words, the integral over time of the momentum. With no acceleration I just "transfer" some to the bar and bend it.
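The units claim above checks out arithmetically: at constant velocity, s = vt and p = mv, so m·s = m·v·t = p·t, which is the time integral of a constant momentum in kg·m. A quick numerical check with hypothetical values:

```python
# Units check: m*s equals the time integral of momentum when v is constant.
m, v, t = 2.0, 0.5, 4.0      # hypothetical mass (kg), velocity (m/s), time (s)
s = v * t                    # distance lowered, metres
p = m * v                    # momentum, kg·m/s
assert m * s == p * t        # both sides are kg·m (here 4.0)
print(m * s)  # 4.0
```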
  24. I think a good way to look at randomness as emergent, is to consider binary strings of different lengths. The longer a binary string is, the more examples of random (i.e. incompressible) patterns there are. But each string might have the same probability, like if it's the outcome of a roulette wheel spin with a lot more slots than usual; like thousands.
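The counting argument behind the post above: there are 2^n strings of length n but fewer than 2^(n-k) descriptions shorter than n-k bits, so at most a 2^(-k) fraction of strings can be compressed by k or more bits, regardless of n. A minimal sketch (function name is mine):

```python
# Fraction of n-bit strings that CANNOT be compressed by >= k bits.
# The bound is independent of n, so longer strings give more (not a
# larger fraction of) incompressible examples.
def incompressible_fraction_lower_bound(k: int) -> float:
    return 1.0 - 2.0 ** (-k)

for k in (1, 8, 20):
    print(k, incompressible_fraction_lower_bound(k))
# Even compressing by just 8 bits is possible for under 0.4% of strings.
```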
  25. You need to find a good forensics chemist who can tell you what was in the flasks, or what the residues contain. I'd say they will want access to a gas chromatograph.