
SuperSlim

Senior Members
Posts: 83


SuperSlim's Achievements

Meson (3/13)

Reputation: -17

  1. That looks almost intelligent. What you've clearly forgotten is that languages are context-dependent. Strange, too, that after dismissing the need for a sender, a receiver and a channel, you invoke the concepts of noise in a channel and filtering. You don't realise how inane that is; you don't, because you have immunity, right? The difference between information and data: there is no physical difference. It's entirely artificial; it's one of those things called a choice. You can't or won't agree, of course, because this discussion is all about how much you can disagree with whatever you choose to disagree with. What fun you must be having.
  2. Sort of like how numbers play a trivial role in arithmetic, huh?
  3. Right. According to you, a computer can be switched off and still be computing! What a fascinating worldview. More completely dumbass stuff from an "expert". Shannon entropy is about the probability distribution over a set of messages; it's about information content and how to encode that content efficiently. The surprise factor is not some kind of highfalutin terminology, and expectation is not an ill-defined term in communication theory. My guess is you probably think data and information are different things too. You provide an example: the Swedish language without the extra marks. That's a change of encoding that makes almost no difference to the information content, so it has about the same entropy. What a pack of retards.
  4. What I meant was that no reader of Swedish would find the missing marks surprising, because they expect to see them. So they would understand written Swedish with or without the marks; it's like how you can ndrstnd nglsh wtht vwls n t. Or wat. ys. mst frms r fr jrks. Lk xchmst. What a mature question; you must feel so proud of yourself; you don't even have to try, do you? Seriously, you don't have anything better than schoolboy jokes? What a bunch of clowns. Seriously. What a pack of goddam idiots, patting each other on the back about how much you like each other's inane posts. Jesus Christ. You can keep this shit; I'm wasting my time with it. Eat shit and die, you dumb fucks.
  5. One of the non-intuitive aspects of gyroscopic motion is that the response is perpendicular to the applied torque. If you spin a bicycle wheel and then whack it sideways, it turns about an axis 90 degrees away from the one you pushed on. That's why, when you're riding a bike, you can recover from a sideways jolt by controlling the resulting torque with the handlebars (see the precession sketch after this list).
  6. Ok then. Well, I'm saying that's a bit of a misconception; actually it's a pretty big one. I would say there is no theory of computation in which information doesn't "play a role". I say that because any device that can reasonably be called a computer has to process physical information, and because the computer engineers who design and build computers have to decide how to measure outputs and how information is physically represented as the computer processes it. I mean, how did it occur to you that computational theory doesn't include the concept of information? It does include it. The MIT professors who put the course together might assume that students already know this (almost trivial) thing. I know I do.
  7. What isn't? Are you saying MIT's theory of computation online course isn't "about" information?
  8. Ka-ching! I should say, communication of information is about preserving it. Computation, well, just isn't. It can be logically reversible, however.
  9. The way to really think about Shannon entropy is in terms of a sender, a receiver, and a channel: it's about how to encode a set of messages "efficiently". I'd say your example wouldn't change the coding efficiency much; not many words would be "surprising". (There's a small entropy sketch after this list.)
  10. Does it have a good explanation of Shannon entropy? It might not, because Shannon entropy is about communicating information, not computing it. Or correct me if I'm mistaken.
  11. --https://www.cs.auckland.ac.nz/research/groups/CDMTCS/researchreports/352mike.pdf
  12. {0} with addition and multiplication forms the trivial ring. I remember an assignment where we were supposed to prove, using only the ring axioms, that 0 + 0 = 0 - 0 = 0. It suggests that the additive inverse of +0 is -0. But 0 has no multiplicative inverse (in any nontrivial ring); oh dear, you might think that makes the above a bit harder to prove, but I've seen it done (no, I couldn't figure out the answer myself, but it's kind of gob-smackingly simple once you know; the derivation is sketched after this list).
  13. Perhaps monoidal categories have more to do with computation than most scientists realise. You understand, I hope, that the "big idea" is that category theory can provide a common language that spans QFT, QIS, classical IS, and maybe some other disciplines? Unfortunately I'm in the process of moving house and I've stashed all my notes in storage, but that's the concept: that category theory can fill the gaps in understanding. However, it still seems to be largely not understood. I'll just trot this out, since I do know what the connection is between field theories and monoids. If you've looked at the computational problem of Maxwell's demon, the monoid in question is N molecules of gas in thermal equilibrium. The demon sees a "string of characters" which are all the same. If the demon could get some information about just one of the molecules and store it in a memory, then the second law is doomed. Since the second law doesn't seem to be doomed and time keeps ticking forwards, the demon can't store information about the molecules. It can't encode anything except a string of indeterminate length from an alphabet of one character, and such a string carries zero information (see the entropy sketch after this list). But I'll leave you all to carry on, figuring out whatever it is you think you need to figure out. I can't help you, it seems. So good luck with your search.
  14. The halting problem is sort of self-referential. It says there is no Turing machine H which, given any Turing machine T and input w, can decide whether T will halt on w. Generally, you assume such a machine H exists and then show this leads to a contradiction; so you prove it by contradiction (there's a sketch after this list).
  15. Also, I'd recommend looking into monoidal categories. A free monoid is just the set of strings over an alphabet, i.e. an alphabet with concatenation. Formal languages can be free monoids, or monoids with restrictions on string composition (i.e. not free). A monoid is basically a group without inverses, but there is an identity (the empty string). A monoid is, according to category theory, a de-categorification of a symmetric monoidal category (!) There's a small string-monoid sketch after this list.
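A worked version of the torque argument from post 5, as a hedged sketch: assume a wheel spinning about its axle with angular momentum of magnitude L = I\omega, and a sideways whack delivering a torque \vec{\tau} perpendicular to the spin axis. Then

    \vec{\tau} = \frac{d\vec{L}}{dt}, \qquad d\vec{L} = \vec{\tau}\,dt \perp \vec{L} \;\Rightarrow\; |\vec{L}| \text{ unchanged}, \qquad \Omega = \frac{\tau}{I\omega}

Because d\vec{L} is perpendicular to \vec{L}, the whack doesn't change the spin rate; it swings the spin axis 90 degrees away from the push, at the precession rate \Omega. That's the recovery a rider steers with the handlebars.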
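A minimal entropy sketch for the Swedish example in posts 3, 4 and 9, and for the one-character alphabet in post 13. This is an illustration under my own assumptions, not anyone's posted code: the sample sentence is hypothetical, and entropy is computed per character rather than per word.

    from collections import Counter
    from math import log2

    def shannon_entropy(text):
        # Shannon entropy, in bits per symbol, of the character distribution.
        counts = Counter(text)
        n = len(text)
        return -sum((c / n) * log2(c / n) for c in counts.values())

    with_marks = "flickan går över ån"     # hypothetical Swedish sample
    without_marks = "flickan gar over an"  # same sentence, extra marks dropped

    print(shannon_entropy(with_marks))     # with diacritics
    print(shannon_entropy(without_marks))  # nearly the same value
    print(shannon_entropy("aaaaaaaa"))     # -0.0 bits: one-character alphabet

Dropping the marks merges a few symbols into ones already in use, so the per-symbol entropy barely moves; that is the "about the same entropy" claim. The last line is the demon's predicament in post 13: a string over a one-character alphabet carries no information.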
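The assignment from post 12, reconstructed from the ring axioms alone; this is one standard route, not necessarily the one the course used:

    \begin{aligned}
    0 + 0 &= 0 && \text{(additive identity, applied to } x = 0\text{)} \\
    0 + 0 = 0 &\;\Rightarrow\; -0 = 0 && \text{(0 is its own additive inverse; inverses are unique)} \\
    0 - 0 &= 0 + (-0) = 0 + 0 = 0 &&
    \end{aligned}

No multiplicative inverse is needed anywhere, which is why the proof stays gob-smackingly simple.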
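A sketch of the contradiction from post 14, using the usual diagonal construction. Here halts() stands in for the hypothetical decider H; the name and the stub are mine, and the whole point is that no such total, always-correct function can exist.

    def halts(program, data):
        # Hypothetical decider H: would return True iff program(data) halts.
        # The construction below shows it cannot exist, so it is only a stub.
        raise NotImplementedError("no such decider can exist")

    def paradox(program):
        # Diagonal construction: do the opposite of whatever H predicts.
        if halts(program, program):
            while True:   # H said "halts", so loop forever
                pass
        else:
            return        # H said "loops", so halt immediately

    # If halts(paradox, paradox) returned True, paradox(paradox) would loop
    # forever; if it returned False, paradox(paradox) would halt. Either way
    # H answers wrongly about paradox, contradicting the assumption that H
    # decides halting for every machine and input.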
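Finally, a minimal string-monoid sketch for post 15: strings over an alphabet, with concatenation, satisfy the monoid laws (associativity plus an identity) and have no inverses. The particular strings are arbitrary examples.

    identity = ""            # the empty string is the identity element

    def op(a, b):
        return a + b         # concatenation is the monoid operation

    x, y, z = "ab", "ba", "a"    # arbitrary strings over the alphabet {a, b}
    assert op(op(x, y), z) == op(x, op(y, z))       # associativity
    assert op(identity, x) == op(x, identity) == x  # identity laws

    # No inverses: concatenation never shortens a string, so nothing composed
    # with "ab" can give back the empty string. That's the sense in which a
    # monoid is "a group without inverses".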
