SuperSlim

Everything posted by SuperSlim

  1. That looks almost intelligent. What you've clearly forgotten is that languages are context-dependent. Strange, too, that after dismissing the need for a sender, a receiver and a channel, you invoke the concept of noise in a channel, and of filtering. You don't realise how inane that is, because you've granted yourself immunity, right? The difference between information and data: there is no physical difference; it's entirely artificial, one of those things called a choice. You can't or won't agree, of course, because this discussion is all about how much you can disagree with whatever you choose to disagree with. What fun you must be having.
  2. Sort of like how numbers play a trivial role in arithmetic, huh?
  3. Right. According to you a computer can be switched off and still be computing! What a fascinating worldview. More completely dumb stuff from an "expert". Shannon entropy is about the frequency of messages; it's about information content and how to encode it efficiently. "Surprise" is not some kind of highfalutin terminology, and "expectation" is not an ill-defined term in communication theory. My guess is you probably think data and information are different things too. You provide an example: the Swedish language without the extra marks. That's a change of encoding that makes almost no difference to the information content, so it has about the same entropy.
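     A minimal sketch of that comparison, using character-level entropy over a made-up Swedish phrase (the phrase and the character-level model are assumptions, purely for illustration):
     ```python
     from collections import Counter
     from math import log2

     def shannon_entropy(text: str) -> float:
         """Character-level Shannon entropy, in bits per symbol."""
         counts = Counter(text)
         n = len(text)
         return sum((c / n) * log2(n / c) for c in counts.values())

     # The same (made-up) phrase with and without the extra marks.
     with_marks = "flickan går över den gröna ängen"
     without_marks = "flickan gar over den grona angen"

     print(shannon_entropy(with_marks))     # the two values come out close,
     print(shannon_entropy(without_marks))  # since the letter statistics barely change
     ```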
  4. What I meant was that no reader of Swedish would find the missing marks surprising, because they expect to see them. So they would understand written Swedish with or without the marks; it's like how you can ndrstnd nglsh wtht vwls n t. Or wat. ys. mst frms r fr jrks. Lk xchmst. What a mature question; you must feel so proud of yourself; you don't even have to try, do you? Seriously, you don't have anything better than schoolboy jokes? What a bunch of clowns. Patting each other on the back about how much you like each other's inane posts. You can keep this; I'm wasting my time with it.
  5. One of the non-intuitive aspects of gyroscopic motion is that the response is perpendicular to the applied force: the spin axis turns at 90 degrees to the applied torque. If you spin a bicycle wheel and then whack it sideways, it turns about an axis 90 degrees away from the one you pushed on, whichever way you whack it. That's why, when you're riding a bike, you can recover from a sideways jolt by controlling the resulting torque with the handlebars.
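     A minimal numerical sketch of that perpendicular response, using the standard precession relation Ω = τ/(Iω); the wheel mass, radius, spin rate and the size of the sideways whack are all assumed numbers, just for illustration:
     ```python
     # Assumed bicycle-wheel parameters (illustrative only).
     mass = 1.5           # kg, treated as concentrated in the rim
     radius = 0.31        # m
     spin_rate = 30.0     # rad/s (about 33 km/h of road speed)
     side_force = 10.0    # N, a sideways whack applied at the rim

     I = mass * radius ** 2           # moment of inertia of a thin hoop
     L = I * spin_rate                # spin angular momentum, along the axle
     torque = side_force * radius     # torque of the whack about the hub

     # The torque changes L in a direction perpendicular to L itself, so instead
     # of simply tipping over, the spin axis turns (precesses) at right angles
     # to the push.
     precession_rate = torque / L     # rad/s
     print(f"precession rate ~ {precession_rate:.2f} rad/s")
     ```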
  6. Ok then. Well, I'm saying that's a bit of a misconception; actually it's a pretty big one. I would say there is no theory of computation in which information doesn't "play a role". I say that because any device that can reasonably be called a computer has to process physical information. It's also because the computer engineers who design and build computers have to decide how to measure outputs, and how the information is physically represented as the computer processes it. I mean, how did it occur to you that computational theory doesn't include the concept of information? It does include it. The MIT professors who put the course together might simply assume that students already know this (almost trivial) thing. I know I do.
  7. What isn't? Are you saying MIT's theory of computation online course isn't "about" information?
  8. Ka-ching! I should say, communication of information is about preserving it. Computation, well, just isn't. It can be logically reversible, however.
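     A minimal sketch of the reversibility point, contrasting an ordinary AND gate (irreversible, inputs not recoverable) with a Toffoli gate (reversible); the function names are just for illustration:
     ```python
     def and_gate(a: int, b: int) -> int:
         # Irreversible: an output of 0 could have come from (0,0), (0,1) or (1,0),
         # so the inputs cannot be recovered and information is lost.
         return a & b

     def toffoli(a: int, b: int, c: int) -> tuple[int, int, int]:
         # Reversible: flips c only when a and b are both 1.
         # Applying it twice returns the original inputs.
         return a, b, c ^ (a & b)

     for bits in [(0, 0, 0), (0, 1, 0), (1, 1, 0), (1, 1, 1)]:
         out = toffoli(*bits)
         assert toffoli(*out) == bits   # the gate undoes itself
         print(bits, "->", out, "| AND of first two:", and_gate(bits[0], bits[1]))
     ```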
  9. The way to really think about Shannon entropy is in terms of a sender, a receiver, and a channel. It's about how to encode a set of messages "efficiently". I'd say your example wouldn't change the coding efficiency much; not many words would be "surprising".
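     A minimal sketch of "encoding a set of messages efficiently": a tiny Huffman code built from assumed message probabilities, with its average code length compared to the entropy bound (the message set and the probabilities are made up):
     ```python
     import heapq
     from math import log2

     # Assumed message probabilities (illustrative only).
     probs = {"the": 0.4, "cat": 0.3, "sat": 0.2, "ärlig": 0.1}

     def huffman(probs):
         """Return a prefix code as {message: bitstring}."""
         heap = [(p, i, {m: ""}) for i, (m, p) in enumerate(probs.items())]
         heapq.heapify(heap)
         counter = len(heap)
         while len(heap) > 1:
             p1, _, code1 = heapq.heappop(heap)
             p2, _, code2 = heapq.heappop(heap)
             merged = {m: "0" + c for m, c in code1.items()}
             merged.update({m: "1" + c for m, c in code2.items()})
             heapq.heappush(heap, (p1 + p2, counter, merged))
             counter += 1
         return heap[0][2]

     code = huffman(probs)
     avg_len = sum(p * len(code[m]) for m, p in probs.items())
     entropy = -sum(p * log2(p) for p in probs.values())
     print(code)
     print(f"average code length: {avg_len:.2f} bits, entropy: {entropy:.2f} bits")
     ```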
  10. Does it have a good explanation of Shannon entropy? It might not, because Shannon entropy is about communicating information, not computing it. Or correct me if I'm mistaken.
  11. --https://www.cs.auckland.ac.nz/research/groups/CDMTCS/researchreports/352mike.pdf
  12. {0} with addition and multiplication forms the trivial ring. I remember an assignment where we were supposed to prove, using only the ring axioms, that 0 + 0 = 0 - 0 = 0. It suggests that the additive inverse of +0 is -0. In a general ring, 0 has no multiplicative inverse; oh dear, that makes the above look a bit harder to prove, but I've seen it done (no, I couldn't figure out the answer myself, but it's kind of gob-smackingly simple once you know).
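      A minimal sketch of that argument, written out from the additive axioms alone (so the missing multiplicative inverse never comes into it):
      ```latex
      \begin{align*}
      0 + 0 &= 0              && \text{additive identity axiom, applied to the element } 0 \\
      0 + (-0) &= 0           && \text{definition of the additive inverse } -0 \\
      -0 &= 0 + (-0) = 0      && \text{identity axiom again, then the previous line} \\
      0 - 0 &:= 0 + (-0) = 0  && \text{hence } 0 + 0 = 0 - 0 = 0.
      \end{align*}
      ```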
  13. Perhaps monoidal categories have more to do with computation than most scientists realise. You understand, I hope, that the "big idea" is that category theory can provide a common language that spans QFT, QIS, classical IS, and maybe some other disciplines? Unfortunately I'm in the process of moving house and I've stashed all my notes in storage. But that's the concept: that category theory can fill the gaps in understanding. However, it seems to still be largely not understood. I'll just trot this out, since I do know what the connection is between field theories and monoids. If you've looked at the computational problem of Maxwell's demon, the monoid in question is N molecules of gas in thermal equilibrium. The demon sees a "string of characters" which are all the same. If the demon could get some information about just one of the molecules and store it in a memory, the second law would be doomed. Since the second law doesn't seem to be doomed and time keeps ticking forwards, the demon can't store information about the molecules; it can't encode anything except a string of indeterminate length, from an alphabet of one character. But I'll leave you all to carry on, figuring out whatever it is you think you need to figure out. I can't help you, it seems. So good luck with your search.
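      A minimal sketch of the one-character-alphabet point: the per-symbol Shannon entropy of the demon's "message" is zero, so the only thing it can carry is its length (the strings and the character-level estimate are just for illustration):
      ```python
      from collections import Counter
      from math import log2

      def entropy_per_symbol(s: str) -> float:
          counts = Counter(s)
          n = len(s)
          return sum((c / n) * log2(n / c) for c in counts.values())

      demon_view = "AAAAAAAAAAAAAAAA"        # every molecule looks the same at equilibrium
      memory_register = "AABABBBAABABABAA"   # what a usable memory would need to look like

      print(entropy_per_symbol(demon_view))        # 0.0 bits per symbol
      print(entropy_per_symbol(memory_register))   # about 1 bit per symbol
      ```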
  14. The halting problem is sort of recursive. It says there is no Turing machine H which, given an arbitrary Turing machine T and its input, can decide whether T will halt. Generally, you assume there is such a machine H and then show this leads to a contradiction; so you prove it by contradiction.
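      A minimal sketch of that contradiction, with the assumed decider written as a hypothetical function halts(program, argument); nothing here is a real library call, it's only the shape of the argument:
      ```python
      def halts(program, argument) -> bool:
          """Hypothetical decider, assumed to exist for the sake of contradiction."""
          raise NotImplementedError("no such decider can actually exist")

      def paradox(program):
          # Ask the decider whether `program` halts when fed its own source,
          # then do the opposite of whatever it predicts.
          if halts(program, program):
              while True:       # told "it halts", so loop forever
                  pass
          return "halted"       # told "it loops", so halt immediately

      # Now consider paradox(paradox): if halts(paradox, paradox) returns True,
      # paradox loops; if it returns False, paradox halts. Either answer is wrong,
      # so the assumed machine H cannot exist.
      ```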
  15. Also, I'd recommend looking into monoidal categories. A free monoid is just the set of all strings over an alphabet, with concatenation as the operation. Formal languages can be free monoids, or monoids with restrictions on string composition (i.e. not free). A monoid is basically a group without inverses: an associative operation with an identity element (here, the empty string). A monoid is, according to category theory, a de-categorification of a symmetric monoidal category (!)
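      A minimal check of the free-monoid claim for strings: concatenation is associative and the empty string is the identity (the sample strings are arbitrary):
      ```python
      identity = ""                 # the empty string
      op = lambda x, y: x + y       # concatenation

      samples = ["abc", "", "ba", "cab"]

      for a in samples:
          # identity laws: e.a == a.e == a
          assert op(identity, a) == a and op(a, identity) == a
          for b in samples:
              for c in samples:
                  # associativity: (a.b).c == a.(b.c)
                  assert op(op(a, b), c) == op(a, op(b, c))

      print("concatenation with the empty string satisfies the monoid axioms on these samples")
      ```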
  16. Ok. Well, I'm sorry if that's how it comes across. I'm "dismissive" generally of people's naive ideas about certain subjects. It's fairly apparent that people generally believe they know all about the subject, what information is and what computation is, and I'm sorry again, but that is clearly not the case in this thread.
      Yes, you choose how to encode information. Or maybe the universe does. Information does not exist if it isn't encoded somehow, so your third sentence there is incorrect. The encoding must have a physical basis. Yes, it is. There may well be an infinite number of ways to encode the same finite amount of information; each encoding will be the same information, unless it's been transformed irreversibly.
      Yes, you did. You said you could write a list that represented some books. Because . . . you said I is some information, and information is always encoded. Seriously, if you handed in an assignment that said "I is information" and then didn't specify what kind, what physical units, what the encoding is, a professor would probably not give you a passing grade.
      p.s. the level of criticism I'm using in this thread is nothing compared to the real thing. When you study at university, particularly the hard sciences, you get criticised if you say something that's incorrect, or that isn't quite the whole story. This is not because university lecturers are nasty people; it's because they want you to learn something. One lesson is accepting that there are things you don't know about, and so don't understand.
      A review of what I just posted: you and I and everyone actually have a good idea of what information is and what computation is; however, getting the marks in that exam means understanding it in a formal way; you need to be able to trot out those equations. I guess I've been arguing the point somewhat, but so far neither studiot nor Ghideon has managed to leave the pier.
      With binary computers a lot of the heavy lifting has been done, thanks to the designers. Binary information is pretty obvious. But as I say, how do you know a particular binary word represents an address, or an address offset, or an instruction? How do you tell which binary strings are which? You don't have to; it's all been done for you. On the other hand, information from, say, the CMB is not different kinds of strings, instructions or addresses. It is encoded, though.
      Well, have you heard of Baez's cobordism hypothesis? In which a program is a cobordism between manifolds?
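      A minimal illustration of the "which binary strings are which" point: the same 32-bit word read as an unsigned integer, as an IEEE-754 float, and as four ASCII characters; the byte values are arbitrary, and nothing in the bits says which reading is intended:
      ```python
      import struct

      word = b"\x42\x48\x45\x4c"    # an arbitrary 32-bit pattern

      as_uint  = struct.unpack(">I", word)[0]   # big-endian unsigned integer
      as_float = struct.unpack(">f", word)[0]   # the very same bits as a float
      as_text  = word.decode("ascii")           # the very same bits as characters

      print(as_uint, as_float, as_text)
      # Whether a word is an address, an offset, an instruction or data is a design
      # choice made by whoever built the machine, not a property of the bits themselves.
      ```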
  17. No. An encoding doesn't encode meaning. The meaning of information is something above and beyond how it's encoded. For instance, if you try to run a program that isn't a program (it's not in the right format), you'd expect the computer to reject it and output an error message. If the "program" is, say, an mpg file, it wouldn't be expected to run like a program. If you interpret information, you apply meaning. This is something a digital computer does all the time: it has to be able to distinguish the meaning of an address word from an instruction word, and so on. Encoding is only important if you want to encode some . . . information!
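      A minimal sketch of "rejecting a program that isn't a program" by looking at leading magic bytes; the ELF and MPEG-4 signatures are real, but the little checker itself is only an illustration:
      ```python
      def classify(blob: bytes) -> str:
          """Guess a format from its leading bytes; reject anything unrecognised."""
          if blob.startswith(b"\x7fELF"):
              return "ELF executable: the loader will try to run it"
          if len(blob) >= 8 and blob[4:8] == b"ftyp":
              return "MPEG-4 container: one for a media player, not the loader"
          return "unknown format: reject it with an error"

      print(classify(b"\x7fELF" + b"\x00" * 12))
      print(classify(b"\x00\x00\x00\x20ftypisom" + b"\x00" * 8))
      print(classify(b"just some random bytes"))
      ```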
  18. 'sigh'. If that's what you want I to be, you should also define how the information is encoded. If you want to call J information that can be deduced from I, you should also define what that means: is J a copy of I, or another encoding of it? Yes; what you do is copy information from the form it's in when you look at the books to a different form, a list. The list is not the books, and nor is what you see when you look at them. So, information is something you defined, in terms of books. An arbitrary choice. Information is a choice of an encoding. Even when it's information that you know is unreadable, say the individual velocities of the molecules in a bottle of gas, you still know there is information, and you know something about how it's encoded. You can design an algorithm; the algorithm doesn't have to have a physical realization, it doesn't have to work. It just has to be an algorithm.
  19. The amateur experiment above is, I think, a trick this person has done before. After the initial demonstration, more water is added; not a lot, but it looks like the experimenter has some idea how to 'tune' the glass. It does look like it might be ordinary glass, but it's hard to tell. There is a well-known difference in the quality of sound between ordinary glass and crystal glass: crystal, or lead glass, is tougher; it's been tempered by adding metallic elements in a well-understood process. With the extra water the system now tries to move to a new equilibrium (a superposition of concentric waves and other modes). I'd guess that, because of the elliptical vibrations, there's a superposition of 'horizontal + vertical' waveforms. You'd need a wave equation for an elliptically vibrating wineglass; it seems the glass only needs to produce enough power to drive the inertial waves.
  20. I haven't until now really thought about what it would take to investigate this properly. I've been in a lab or two, so I imagine it would mean some kind of interferometry as a way to detect wavefronts: you want to count them, and you want to know whether the pattern rotates with the glass, or moves in any other way. I've gathered some more visual evidence, and it seems the surface waves I'm focused on are quite different from the kind you get when the glass container is less efficient at resonance than a crystal (lead) glass one. Ordinary glass is usually thicker, and there are more low-frequency harmonics. In some images you see how the water gets driven to an 'inertial' solution, involving more of the bulk and a greater surface area, to get to equilibrium. In these first shots there are evanescent-type waves around the glass, spreading out on the surface. The water has been dyed with green food dye, which improves the contrast (I guess red wine should work too, then).
  21. What the field equations keep telling us is that we haven't found the real solutions yet. All the solutions so far, including those for what happened at the beginning, are still approximate somehow.
  22. I don't know if you noticed, but the rows and columns represent a design choice. So does the coin. You ran an algorithm to do this designing and posted a diagram. It's a way to represent, I guess, a kind of memory device. There are a lot of things in nature that fit in this class: a store of information is a physical object with "extra" structure. Except that we decide what the extra structure is; it isn't "really" there, we just say it is. We prove it by drawing diagrams.
  23. Another thing I thought about: this is an example of a coherent output, a two-dimensional surface bending regularly into a third dimension (ok, it's already curled up at the edges). There's also the measurement idea to consider. If the coherent waveform exists because the water underneath (a mostly cohesive liquid with a gravitational potential gradient, i.e. inertia) is being pumped with sound waves, it's like laser coherence with light, except with sound you get something else. What keeps it coherent apparently doesn't depend on whether the bulk is rotating, or even sloshing (pitching and yawing); it only depends on how sound waves "pile up" as they reflect back and forth in the water. It looks like a kind of waveguide, or wave-antenna, IOW.
  24. The kind of experiment I'd like to do, or see done, involves a pair of glass bowls and an external speaker large enough to make both vibrate at resonance. An overhead camera records the patterns in some water in both bowls. The bowls should hold different levels of water, to investigate the frequency change, and the two patterns will either line up or they won't. The second result would indicate it's the inner glass surface, and the fact that it isn't exactly smooth, that fixes the pattern; otherwise there's a bit of explaining to do. The next thing I would try is picking up each bowl and making the water in them rotate in opposite directions, then applying the sound input again to see what happens. If the patterns in both bowls are fixed, what's fixing them? If it doesn't depend on rotation, then what does it depend on? I'd like to know.
  25. I keep remembering, but not saying much about it, that measurement isn't necessarily an output; it can be an input too. If you want to, just call the excitation of the glass a measurement. The guy rubbing the edge with his finger is measuring something physical (I don't know that you can measure anything else, but that's another story). The glass is responding, and there are sound waves travelling into his finger: a feedback mechanism. But is the idea of feedback, positive or negative, included in a measurement? Seeing the pattern, a surface response to sound, seems entirely passive; you don't do anything but look at it. The measurement you "perform" by looking at a pattern (maybe an interference pattern) includes the memory you then have of it. You have Shannon entropy. I think an interesting thing to pursue here is: what does determine the position and direction of the rays? Some experiments involving translation and rotation, using sensitive measuring equipment, might turn up something curious. But consider just explaining why, when you leave the glass where it is, the rays appear in the same place each time you get it to resonate. That's curious, because it implies the glass is fixing the pattern, and so rotating the glass should change it. So does it? I can't say; I didn't really try to look at that. But what does theory say about it? And what theory is it?