Everything posted by DoorNumber1

  1. Zanthra knows what he's talking about. Trust me, nobody in security would do something as silly as a simple mapping from letters to hex symbols. A good hash has the property that a tiny change in the cleartext produces a large, unpredictable change in the hash (see the little sketch below). There are at least 4 different protocols Windows uses for login that involve hashes. LAN Manager (LM) is an old authentication protocol that uses a shitty, weak hash that can be brute forced almost instantly. NTLM is the updated version used since NT, but it can still be brute forced pretty quickly. NTLMv2 is an update that I don't know much about, but I'd hope that by this time they'd learned their lesson and done some decent hashing. Kerberos is what's used to log into a Windows Server 2000 or 2003 domain, and it finally does respectable crypto. Your Windows computer never wants to store or pass the password in the clear, so it stores and passes a hash of it instead. That's what you see as a hex string. Unfortunately, with the crappy Windows LM hashing, you can brute force the hell out of it and crack it pretty quickly. For the updated authentication protocols that take a bit longer, you can always send them off to someone with cycles to spare (like, well, loginrecovery.com) and let them brute force it for you. Also, keep in mind that hashing is not encryption. If you use a weak password, it can be cracked using dictionary attacks and intelligent brute force schemes (like ones that use common letter->number swaps and append a few numbers to the end of dictionary words). Just use a long, complex password... or use a Mac like I do. It's hard to work in security and feel good using a Windows box.
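     To make the avalanche point concrete, here's a tiny sketch (not the LM algorithm, obviously; I'm just running SHA-256 through Java's standard MessageDigest, and the two passwords are made-up inputs):

        import java.security.MessageDigest;

        public class Avalanche {
            // Render a digest as the familiar hex string.
            static String hex(byte[] bytes) {
                StringBuilder sb = new StringBuilder();
                for (byte b : bytes) sb.append(String.format("%02x", b));
                return sb.toString();
            }

            public static void main(String[] args) throws Exception {
                MessageDigest md = MessageDigest.getInstance("SHA-256");
                // One character of difference in the cleartext...
                System.out.println(hex(md.digest("password1".getBytes("UTF-8"))));
                System.out.println(hex(md.digest("password2".getBytes("UTF-8"))));
                // ...and the two hex strings come out looking completely unrelated.
            }
        }

     Run it and you'll see the two outputs share no obvious structure, which is exactly why you can't eyeball your way from hash back to password; you have to guess and check, i.e. brute force.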
  2. It's a valid question, BobbyJoeCool, and it's important to follow the solution to this fairly small problem. You solve lim x -> 0 sin(x)/x type questions using L'Hôpital's rule. In other words, when f(x)/g(x) gives you an indeterminate form like 0/0 or inf/inf, the limit of f(x)/g(x) equals the limit of f'(x)/g'(x). So lim x -> 0 sin(x)/x = sin(0)/0 = 0/0 = indeterminate, so we differentiate the top and bottom and get lim x -> 0 cos(x)/1 = cos(0)/1 = 1/1 = 1. And therefore, by L'Hôpital's rule, lim x -> 0 sin(x)/x = 1. Remember your calculus! (Worked out in symbols below.)
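     The same computation, written out properly:

        \lim_{x \to 0} \frac{\sin x}{x} \;\to\; \frac{0}{0} \quad\text{(indeterminate)}
        \qquad\Longrightarrow\qquad
        \lim_{x \to 0} \frac{\sin x}{x}
          = \lim_{x \to 0} \frac{(\sin x)'}{(x)'}
          = \lim_{x \to 0} \frac{\cos x}{1}
          = 1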
  3. The question isn't really whether the human eye can keep up with it; that's just a question about the mechanics of the eye. The upper speed your eye can track is (to a decent approximation) a function of the mass, size, and shape of your eye, the amount of force your eye muscles can exert, where they're attached, how far away the object is, the relative path it's on (to know the tangential component of its velocity), the mass, size, and shape of your head, how much force your neck muscles can exert... you get the picture. In the end, the number's pretty freaking high because your eye is very small, nearly spherical, and almost frictionless. Also, your eye muscles are (relatively) pretty strong. But this doesn't answer the real question, which is whether you'll perceive it. After all, you must perceive it to "track" it. This doesn't have much to do with focus speed or the speed of light, or anything like that. It's more a question of a) how much light the object is putting off, b) how much of it reaches your actual eye, and c) which part of your eye these photons land on. Imagine yourself staring straight ahead when an object appears out of nowhere. Before you "see" it, enough photons must reach the photosensitive cells in your eye to activate them. If enough of these cells are activated, they generate a signal strong enough to actually register in your brain (assuming you're not focusing on something else), which causes your eyes to turn to focus on it and elevates it to conscious thought. That's the order for most things: it registers on rod or cone cells, their activation registers in your brain, you turn your eye to it, and a small fraction of time later the whole thing registers in consciousness. The irony is that we usually think of it as "Oh, I saw it and turned to look at it...", which is exactly backwards. It also depends on where the light first hits your eye: in the center our eyes have way more resolution (especially for color), but near the outside the cells are much more sensitive to light. Okay, I'm going to stop and sum up the answer. The speed at which "the human eye can no longer keep up with it" is about more than just the eye's movement speed; it's also about our ability to track the object. That depends on whether it gets registered at all (so our eye actually turns), which in turn depends on how much light from the object reaches our eye and where it hits. Hope this helps!
  4. Yourdad, please get a life. Or at least a sense of perspective. Hell, a basic knowledge of linguistics concerning the critical periods for learning languages will do. Truth is, picking up a language as an adult is, to put it scientifically, hella-hard to do. After a month in China I learned a lot of random words and phrases, but all I remember how to do is greet people, express the thought "I am..." (useful for saying you're an American or a student or anything like that), and say thank you. And the word for bathroom. That was a very useful one to know. If I had stayed longer, with a bit of formal study, I don't doubt that I would have become functional, and rather quickly. But it would be broken and hideous to a native speaker of Mandarin. Put me around a group of English speakers there and I'll always revert to my native tongue. It was SUCH a relief to hear fluent English being spoken by anybody after a while. And it's hard, even when you're trying, to find your way around when none of the signs are in your native tongue. But I'm not stupid. Quite the opposite. And guess what? Most of those Spanish-speaking immigrants aren't either. One of my best friends' parents were biochemists. Oh, but that was in Nicaragua. Because they couldn't speak English fluently, they were assumed to be dumb, so they had to settle for low-paying janitorial jobs in Miami. Naturally the second generation picked up English flawlessly, as kids do, and they are now assimilated and adding a lot to our country. I met her here at Harvard, and I'd bet money she's 20 times smarter than you are, yourdad. And that's the point of America. Amazing people can come here, do their thing, and everyone benefits. As a final note, more to the point considering the implied ideologies behind your "lacking in patriotism" statement, ask yourself honestly if you'd react the same way to a cute girl speaking broken English but fluent French. Or how about Italian? Or German? You don't have to tell me the answer, but you should really think about it. And hard.
  5. I have a DVD-RW in my G5, and I have yet to use it. Why? Well, I just don't need to transfer that much info at a time! Truthfully, I'm happy to have my DVD burner because I can, in theory, back up quite a bit of information. But for everyday use, the most I'll need is a CD to back things up. I think Sayonara hit the nail on the head, though; some people would need more. There are tons of applications (mostly multimedia things) where one might like to transfer much more than a DVD can hold. What I would like to see is a study that collects stats on the average amount of data transferred to/from the average home computer, the way it gets there, and what it's used for. I think such data would be quite enlightening, and I would predict that the largest items would be media files that still fall within the size range of CDs or DVDs. Most uses would be rather small, and transferred through non-physical means.
  6. Okay, can somebody explain one thing to me: how did we solve the big problem of radiation shielding in deep space? While we're within Earth's magnetic field everything's just great, but once we leave it, a ship and everyone inside will get their collective asses kicked by the full, unrestrained output of our sun. The shielding required to keep a man healthy for such a long trip would make for a HUGE (read: tons of fuel required to get into orbit) craft! Also, what about the fast-forward osteoporosis that takes place in zero g or near zero g? There are tons of theories out there about what a craft going to another planet would have to look like, involving spinning disks and stuff (quick numbers on that below), but as far as I know we haven't found a good way of simulating gravity. There's so much damage done to astronauts nowadays in their relatively short stays... imagine a trip that'll take years! I think we have decades of committed experimentation and testing ahead of us before we can even begin to tackle the god of war. I'm all for it though. Uncovering the secrets of interplanetary travel is one of those goals that drove me to study physics through high school and my first years of college before I switched to computer science. And I agree with Lucid... it was a blatant attempt by an idiot president to appear "science friendly" after his ridiculous stances on other scientific issues. Making such a claim, to me, just highlights his stupidity and insecurity; he feels the need to connect himself to greatness by making such a Kennedy-esque statement. I'll be cheering loudly when he's out of office.
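     For the spinning-disk idea, the back-of-the-envelope numbers at least are easy (the 100 m radius here is just an assumed example):

        a = \omega^2 r
        \quad\Rightarrow\quad
        \omega = \sqrt{\frac{g}{r}}
               = \sqrt{\frac{9.81\ \mathrm{m/s^2}}{100\ \mathrm{m}}}
               \approx 0.31\ \mathrm{rad/s}
               \approx 3\ \mathrm{rpm}

     So a ring 100 m in radius spinning about three times a minute gives you 1 g at the rim. The engineering of actually building and launching such a thing is, of course, the hard part.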
  7. How much gas does a farting cow produce? Enough. We have an awful lot of cows in this world, ya know. A good question would also be "how much methane does a city full of farting humans produce?" I suggest we put a fart tax on people who sit around on their couches eating potato chips and contributing to global warming. 5 cents for every burst of methane-laden wind, eh? On a slightly more serious note, I just read an article a few days ago about huge greenhouse gas emissions coming from a Brazilian dam, because they basically flooded an entire forest that is now rotting underwater. Kind of sucks, when you think about it. Our stupidity is really catching up to us. Or, more truthfully, other people's stupidity is starting to affect all of us around the world.
  8. I already made that very logical argument. He promptly ignored it. And yes Sayonara, I think I'll fire away. Everything is interpreted. Period. Your computer can do nothing more than what is hard coded. You lay out this thing called an ALU and a register file (depending on your design) and a couple of other essentials, and you drive it with a primitive language called assembler. Your entire computer is nothing more than a translational unit that takes voltages on wires, interprets them as 0's and 1's, and picks from a set of hardcoded functions and performs them in some specified order (given by the program you feed it). Everything else is supplementary. Nothing else matters. A string of 0's and 1's that can be successfully interpreted by your machine as assembler is the heart of every piece of software. But this is not the definition of software. Software includes code that goes through an intermediate step to get to the assembler (which is why, as a million non-technical sites have said, anything except hardware is software). Like the Java virtual machine. So JVM + .class file = software, because the two of these together do, in fact, produce something that runs as assembler, and what runs is determined primarily by the .class file used as input. Is a .class file by itself software? Eh... I'll call it potential software. Feed a .class file to lots of things and it'll do nothing. Feed it to the JVM and voilà! You have a runnable program and, thus, software. So (JVM + *.class) as a package = software. A virus is no different. It's a script to be interpreted at some level of abstraction. Hell, it doesn't even have to be. In a less secure system I'm sure you could write a virus in as low-level a language as you want. People don't have to, though. It's the fact that it does harm that classifies it as a virus, not the language it's written in. Is that snippet of PHP code software? Could be. Wrap it with a PHP-enabled app server like any decent Apache installation and it sure as hell is. It's just abstracted by a step. Oh, and I apologize. I missed the post in which you wanted me to draw a distinction between software taking in an input file and viral instructions being picked up by a program. I must have been making a post myself... but I'll happily challenge your analogy, albeit a good one. Is a .jpeg software? I'm sure that there is some set of circumstances in which you can write an interpreter that takes a .jpeg and successfully maps it to assembler. The requirement, to me, is that the program that picks it up is merely an interpreter; the heart of the logic must be in the input file fed into it (see the toy example below). I'll explain why I have that requirement if it's really necessary... just ask. So a virus fed into a program (say, a web browser) that exists solely to interpret information coming into it, and that controls, for however long, the logic of what your machine is doing (again, there's a mapping between viral code and the logic the machine's performing that is missing in most simple text or image file inputs), is software. The layer of indirection doesn't invalidate it. So a malicious script + interpreter = virus = software. COULD a jpeg + a suitable interpreter be a virus/software? Maybe, but I tire of this conversation. You've said one thing that is quite correct... this is outside the scope of the original question. Hell, I wouldn't have even posted if not for the fact that I saw an obvious logical flaw being made and defended for many posts (who committed the "classical mistake" here?) in my field of study. It's only 8:35 here in Cambridge, MA, but this is exhausting. I need to spend a night drinking to make up for this. I quit. And, for the record, I agree with YT2095's earlier post. The most pointless piece of software ever written was Windows, by far! Not that it was pointless when originally conceived, but it certainly is today. Now where's my alcohol...
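     Here's the toy example, a sketch in Java of what I mean by "the logic lives in the input" (the mini-language is completely made up for illustration):

        import java.util.Scanner;

        // The "program" is pure input data, but fed to this loop it determines
        // what the machine actually does. Data + interpreter = software.
        public class TinyInterpreter {
            public static void main(String[] args) {
                int acc = 0;
                Scanner in = new Scanner(System.in);
                while (in.hasNext()) {
                    String op = in.next();
                    if (op.equals("ADD"))        acc += in.nextInt();
                    else if (op.equals("MUL"))   acc *= in.nextInt();
                    else if (op.equals("PRINT")) System.out.println(acc);
                    else throw new IllegalArgumentException("bad op: " + op);
                }
            }
        }

     Pipe it the text "ADD 2 MUL 3 PRINT" and it prints 6. The text file by itself does nothing; the interpreter by itself does nothing interesting. Together they're software, which is exactly the JVM + .class situation.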
  9. This discussion is going nowhere quickly. Sayonara: define "software" and prove that a virus is not it. If you cannot give a specific definition, do not expect us to conform to it. I tried to determine your definition of software from your previous posts, but it seems somewhat vague at best. The best I could extract from your words, as distinguishing factors, dealt with data manipulation and code written in a non-interpreted language. Tell me exactly what it is (not what most of it does) and we can get somewhere.
  10. Oh, and I'm a "he" for future reference. Typing he/she is waaaaaaaaay too long! And I agree; that was not central to my argument at all. I was merely providing an example of "software" that doesn't fit the data-manipulation requirement provided by Sayonara, who seems to be drawing a distinction between software written in a high-level vs. a low-level programming language.
  11. Sorry dude, but I'm not confused. Answer this question for me... is a program written in Java or C# software or not?
  12. I'm going to have to disagree with you, Sayonara. A virus is software. Software is just a set of instructions... ironically, the exact definition you gave for a virus. And that's it. As for your statement, it's simply not valid. Certain types of software do that, but the function of a piece of code doesn't determine whether it's software or not. That would be a great definition for useful software. The program posted earlier, int main(int argc, char** argv) { return 0; }, is a prime example. It's literally pointless... yet you can compile it down to assembler with a C/C++ compiler and run it. It's software. Technically it does something, because it returns 0, but it doesn't do a useful something. Trying to draw distinctions based on whether there's a running process of its own isn't valid either, because many interpreted languages don't spawn a new process for every set of instructions they run. You could call anything written in Java or C# a virus under that definition; there's a main process (a virtual machine) that kicks off, and other code is "injected" into it to be interpreted. Now I'm not about to start a debate over whether Java's useful, good, pointless, or anything else... but you certainly must admit that it produces software. Oh, and they're called "viruses" because they cause harm and tend to multiply, silly.
  13. Maybe I'm wrong, but I thought that the premise behind special relativity was that the speed of light is constant for all observers... period. Einstein based this upon earlier experiments that concluded the same thing. What he did was point out the inconsistency this creates with classical mechanics and show that, for it to be true, things like time and mass can't be assumed constant. Flak, you're arguing that something can travel faster than the speed of light using Newtonian principles, and that's the whole point of relativity... that doesn't work! You can't just dump more energy into something to make it move ever faster (the formula below shows why). In fact, time and mass adjust to make sure that it doesn't work (as was stated earlier, all of this has been verified). And anything that is traveling at the speed of light must have zero rest mass. I thought that even all of this junk about the "slowing" and "stopping" of light doesn't actually violate this... most of it is just producing materials with ridiculous indices of refraction, so that the photons interact with so many particles (well, their electrons) that they never make it very far without being absorbed and spit out again. Although I keep some physicist friends (one of whom was involved in research in this area and quoted in a few articles), I only studied it myself for a few years at a collegiate level, so someone please correct me if I'm mistaken. I like to know when I'm wrong. The comparison to a sound wave is, unfortunately, also incorrect. The speed of sound wasn't, isn't, and never will be considered constant for all observers. There's no rule saying that you can't travel at it or pass it. We don't slow down like some Matrix parody every time somebody shouts from a train just so its constant velocity is preserved. It's just a pressure wave traveling through a material, man.
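     The standard formula makes the "can't just dump in more energy" point directly:

        E = \gamma m c^2, \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}

     As v approaches c, \gamma (and therefore the energy you would have to supply) blows up to infinity, so no finite amount of energy accelerates a massive object to the speed of light.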
  14. I'd say it was comments like "just like the Americans and Bush have banned cloning" that left that bad taste in my mouth. You, btw, didn't make those comments. I try to give most foreigners as much credit as I give educated Americans when it comes to separating the official policy of a given country from its people.
  15. "Back-in-the-day" with C++, still my favorite language although I use Java at work on a daily basis (C++ wasn't made for the web), implementing a generic container was, in fact, done with templates. However, polymorphism worked essentially the same way it did in Java. The reason the STL in C++ uses templates for its containers is that it's much FASTER than doing a dynamic cast. Casting's only really good if you're doing it quick and dirty C-style but a true, object oriented, dynamic cast (as done in java or optionally done in C++ when you want flexibility over speed) is not a cheap process and accounts for some of Java's lag. That and object creation on the heap, of course. Honestly, I only use casting in java when dealing with their Collections framework or when creating objects from .class files at runtime. And in terms of the Collections framework, it would be awful nice if they had an ArrayList that could be fed a Class object in its constructor so you wouldn't have to cast what it spits out every time. After all, I've very infrequently (if EVER) had to store different types of objects in a java container that weren't related by a class or interface that I wrote anyhow.
  16. I'd just like to state for a second that most Americans, especially educated Americans, don't support Bush or his crazy, stupid-ass policies. The ones who do are either bible-thumpers who refuse to listen to reason anyhow (the crazy religious right), people too uneducated to see the long-term results of his policies (the underpopulated, rural center of the country), or people who don't care what happens to others as long as they get more money (a portion of the very rich). And no, those aren't the majority of Americans. I'll give you 10 bucks if you can find a large group of conservatives at Harvard. Please don't make comments that seem to imply that Bush speaks for the entire US, because to be honest, as a highly patriotic American who hates Bush's guts and what the far right is doing to our country through him, I'm highly insulted by them. Laugh at Bush, not the USA. And what's the point of protesting? I'll take to the streets in a second if I really disagree with something. The point is so those in power will KNOW that you disagree and, hopefully, your reasons will get some publicity. Also, so that people around the world don't look at certain policies, see a lack of protest, and suddenly assume that the American people are behind them. Assuming that foreigners see such things and don't make blanket assumptions, that is.
  17. Okay, again I disagree with you right here. Whether it's because I require a "paradigm shift" or whatever, I do. You're basing the truth of that last statement on the idea that no segment can be defined by a single point... who's trying to? That's obvious. I'm saying that you can test the length of any segment between that one point (your reference point) and another. And yes, if those segments are getting smaller, the second point is drawing closer to the first. To deny that is to say that there is no such thing as distance or space. That's what the concept of a limit is based upon, and I'll assume you understand the current concept of a limit at an intuitive level (although maybe not, if you disagree with its truth and usefulness so much). Not correct: if I am looking at the car from the zero state, I can distinguish between me (as the zero state) and any given car that obviously is not me (zero). Be careful. I try my best not to quote you out of context, so don't do it to me. I qualified my definition of "indistinguishable" in the very next sentence in terms of resolution, which itself is directly analogous to the concept of using an arbitrary epsilon. The point is, to any possible degree of accuracy that you choose (a millionth, billionth, trillionth? keep going) you can show that the series, if run long enough, passes your threshold and continues to get closer. I never claimed that they were the same number, just that you can get close enough that they're virtually indistinguishable. The fact that the number 1 can look at the number 0.999999999999999999 and say "you're not quite me" doesn't change the significance of the number 1 to the number 0.999999999999999999. Even if you choose to represent a number in a different manner that allows a concept of closeness to be defined only on a finite set, the point is that you can always take a finite approximation of the infinite set produced by a series (as long as you want to run it) and apply your concept of "close" to it. Screw it, let's use the {0.9, 0.99, 0.999, 0.9999, ...} example. You will notice, as the finite set you choose gets larger and larger (and therefore becomes a better and better approximation to the infinite one), that the numbers in your finite set do grow close to the number 1. And since you can't deal with an actual infinity, you can at least conclude that since the closer we come to it the closer the numbers pull toward 1, 1 must be the limit of the series that produced the set (the standard definition is written out below). Current math can't really handle completed infinities directly, but calculus is based upon the current definition of a limit, which itself is tied into infinity. And guess what? Calculus works. It produces answers to real-world problems that simply aren't solvable without the ability to deal with those concepts. If our framework is so inherently flawed, then why does it produce answers that work in the real world? After all, the real world IS the final test of truth. I'm not going to keep this up forever, but I genuinely do want to understand where you're coming from with your idea. But before I can, you have to convince me that my reasoning is flawed, and you have yet to do that (or, it seems from the previous posts, to convince anybody else). You create tons of terms and repurpose common words and invent new logic systems to create your framework, but have you considered that if it were really true you should be able to explain it in layman's terms to anybody?
The hallmark of most big breakthroughs is that they're the kind of thing that makes you slap your head and go "Why didn't I see that!?!?" once it's presented to you with all of its supporting facts. And they also have the ability to be explained to arbitrary degrees of precision, to convince a really well educated crowd or just convey the concept to the common man. Your ideas seem to lack that quality. Keep in mind that I am impressed by the time and thought you put into your theories, and I think that maybe you're on to something. Any criticism I give is meant to be goodhearted, and I hope it's interpreted in that manner. I think that sometimes a single person can have insight that can potentially shift or add to the world's understanding, but you also have to ask yourself this question: is it more likely that I'm wrong, or that THE REST OF THE ENTIRE WORLD is wrong? I apply that simple question to myself all of the time, and even if I stick with my opinions, which I often do, I at least try to give others the benefit of the doubt and look at their reasons for rejecting something that I'm saying. I don't believe that the majority's always right (or even frequently right), but at least pause to reconsider whether you're coming from as solid a position as you think you are. (Oh, and usually when I find that I'm wrong it's because of a limited understanding of the system I'm trying to debunk, and when I talk to really experienced professors they can immediately point out the error in my logic and the false assumptions I'm making. How well do you REALLY understand current mathematics?)
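     For reference, the standard definition I keep appealing to:

        \lim_{n \to \infty} a_n = L
        \quad\Longleftrightarrow\quad
        \forall \varepsilon > 0\ \ \exists N:\ n > N \implies |a_n - L| < \varepsilon

     For a_n = 1 - 10^{-n} and L = 1 we get |a_n - 1| = 10^{-n}, which drops below any \varepsilon you name once n > \log_{10}(1/\varepsilon). That is all "approaching" has ever meant.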
  18. Where do we go from there? We've already gone from there... and all of current physics follows from that.
  19. I'll admit in advance to being merely a college student without that much mathematical training (I tutor math, but only through calc/linear algebra, and I had to teach myself most of calculus because I came from a really, really shitty public school), and I've done a bit more number stuff in comp sci theory courses, but that's it. I don't have a PhD (yet), so I may not be able to put my thoughts into the most precise and bulletproof forms. I'm not even formally studying math; my area is AI, because I love studying how people think. So please, bear with me. Your first post stated your reason for disagreeing with the current theory of limits, and that fundamental point is one that I disagree with; it seems as though we understand the word "approaching" in different manners. I read your PDFs and looked at the car example, and it's a great example that explains perfectly well the standard concept of a limit. You say that nothing is approaching the zero state, yet obviously every new point the car is at is closer to the zero state than the last point. That qualifies as approaching zero. Of course, that also qualifies as approaching -1, or any number beyond zero. If you had extended your car example and drawn even more small cars, you would eventually have been making cars so small and so close to zero that they were indistinguishable from it (but they would never PASS zero, which is what rules out -1 or anything else). Of course "indistinguishable" in this sense is a matter of screen resolution. But no matter how far you zoom in, you can always repeat the process: keep drawing more cars, and they will always hit a point at which, eventually, they become indistinguishable from zero. Think of the screen resolution in this case as your epsilon. So how do you, in your reasoning, come to the conclusion that "you can clearly see that nothing is approaching to zero state" and that "Therefore no such constant can be considered as a limit of the above collection"??? Clearly nothing is reaching it, but EVERYTHING is approaching it. That constant is fundamental to the behavior of the series, and to state otherwise requires serious, heavy-duty revelations of an order of magnitude beyond what you have given. Later, drawing on this "conclusion", you state something I completely disagree with. You're essentially saying that there is no truth beyond the set A that's relevant to A; that there is no constant B that holds any relationship to it other than an exclusive-or relationship (A or B, but never both A and B... we're using the same definition of xor, right?). You could try, very weakly, to back that up if you were able to prove that the current concept of a limit is incorrect, but since you haven't, I'm afraid you've failed in this sense as well. You have yet to prove that, for a given set A, there is no number outside of the set that is fundamental to the description of A (as a limit or boundary would be). And if we can describe the behavior of A in terms of some constant B, then again all of your conclusions must be rethought. Then you LATER try to prove that all of our mathematical proofs concerning infinite sets are incorrect based on the first two conclusions that you failed to prove. You attack the ability to use any/all reasoning here, but there's no inconsistency. The inconsistency is in how YOU are defining "all". You are saying that using the word "all" means that we're assuming everything is inside our domain?... that is completely and utterly false.
"All" usually describes everything that IS ALREADY inside some defined domain that we give; it doesn't somehow pull everything into it! To give you some credit, I'll assume that you simply worded that incorrectly and you actually had some reasoning behind saying it. Or maybe I just require a paradigm shift or something like that. And your further claim relies COMPLETELY on your having supposedly proven that you can't bound an infinite set by constants (due to the failure of our concept of a limit), which you completely failed to do. You seem to be saying that since you've disproved our concept of a limit, and thus taken away our ability to define a boundary of an infinite set, the numbers 0 and 1 are meaningless, because to actually touch them our set would have to somehow fundamentally change what it is and undergo a "phase transition" to include them. Dude, do I have to point out how flawed this logic is? If you want, define [0, 1] using the sets {1, 0.1, 0.01, 0.001, ...} and {1 - 1, 1 - 0.1, 1 - 0.01, 1 - 0.001, ...} to pin down its endpoints, k? You later, further defending your theory that we can define no limit, issue a challenge. I happily take it, because it's such an easy one, but I refuse to work within the confines of your "smooth link without leaps" framework. I will simply show you that the number 1 is essential to the definition of this series, and thus even though it's not included in the series, it's still a fundamental part of it and can be associated with it. First, with whatever theory of numbers you may hold, do you agree that a given number has more than one representation? That 4 is completely equivalent to 2 + 2? If you do believe this, then I suggest that you simply rewrite the series {0.9, 0.99, 0.999, 0.9999, ...} as {1 - 0.1, 1 - 0.01, 1 - 0.001, 1 - 0.0001, ...}, which defines the elements by the difference between them and 1 (written out in symbols below). It's an equivalent set, but in this case it's obvious that 1 is fundamental to the rules that govern the series. Start with 1. Subtract off smaller and smaller and smaller amounts. Where do you end up? At 1. That's the limit of the series. Since the difference between 1 and the current number, the part that's changing, is getting arbitrarily close to 0, the current number must be growing arbitrarily close to 1. Is this so hard to see? Whether we choose to call this fundamental constant 1 a "limit," a "property," or a freaking "dinosaur" is a matter of semantics, but the fact that it's related by something other than an XOR relationship should be obvious. But no, you refuse to see it and seem to claim that... WHAT? You're claiming that, just because a number in a series might not ever reach the limit of the series, other numbers in the series can't grow arbitrarily closer to that limit? Using my definition of "closeness" as the difference between two numbers, |A - B|, your logic seems absurd. Is it even possible, with whatever your definition is, for one number to be smaller than another? If so, your claim is groundless. If not, you don't have a reasonable definition of "close". When you're unable to produce anything that seems to back up your claims except an unfounded criticism of the current system, you just accuse us all of needing a paradigm shift or, essentially, of being squares living and thinking in Flatland and waiting for you, oh mighty Sphere, to show us the way to reality. You claim that what you bring is a new way to view numbers that will show deeper relationships between them. Fine.
Maybe you do have some valid ideas that can add to the field of mathematics. But for goodness sake, just present those in a clear, backed-up manner and stop unsuccessfully trying to topple all of current mathematics. Then maybe people would buy your ideas.
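     The rewrite, in symbols:

        \{0.9,\ 0.99,\ 0.999,\ \dots\} = \{\,1 - 10^{-n} : n = 1, 2, 3, \dots\,\},
        \qquad
        \left|(1 - 10^{-n}) - 1\right| = 10^{-n} \;\xrightarrow{\ n \to \infty\ }\; 0

     The changing part is exactly the distance from 1, and it shrinks toward 0, so the terms grow arbitrarily close to 1. That's the whole argument.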
  20. Okay, let's suppose that time doesn't exist. Where do we go from there... nowhere! A theory's all fine and good, but only insofar as it doesn't contradict reality. And of course I'm not saying that we should base our theories only on what we see... they just can't completely conflict with it. If I theorize that the universe is really made of tiny, dancing leprechauns and that we can walk through walls if we hold 4-leaf clovers in front of us and dance a jig, that would be a mighty fine theory (I'd think it was cool, at least), but it would fall apart the moment we held a clover in front of us, twirled over to a wall, and still smashed face first into it. Great theories fall apart when proven wrong. Remember Michelson and Morley? Also, theories shouldn't replace a simple answer that explains everything with a more complicated answer that explains nothing more. Einstein won his Nobel Prize for his work on the photoelectric effect, which showed that light is a particle (or at least has unignorable particle-like properties), but it wouldn't even have happened if the pure wave theory hadn't predicted absurdities like infinite energies and thus broken apart. Relativity's a great idea, but it wouldn't have taken off if not for the fact that Newtonian mechanics, under extreme conditions, starts to conflict with what we observe to be true. Relativity still produces the world we see in front of us. So those of you who don't believe in time, answer me this: what does our current theory of time conflict with? How does a "no time" theory offer a simpler (or at least more universal) explanation of the universe's workings than one that has time? How do you explain our everyday observations in terms of there being no time? And please try not to use words like "motion" that are defined in terms of passage through time.
  21. I still hold FF7 and Neverwinter Nights up there as two of my favorite games ever made, in any context. Funny that both would be mentioned here.
  22. I think those bearded men need women. I've never seen one try to build a tractor before, but they have a wonderful way of making everything clearer! Time exists, even if it's not that much different from the other dimensions. In fact it doesn't seem all that different, except that we can choose the direction and speed in which we move through the others, while we can only change the speed we move through time by going really, really, really, really freaking fast (i.e., near the speed of light), and the direction part is debatable. The reason we perceive time the way we do is an interesting question and I'd love to know why. But the fact that "time" might mean something larger than what we perceive doesn't somehow invalidate the truth of what we perceive... it just means that it's not an absolute and total truth. Just because we may only know part of a larger truth doesn't necessarily mean that the part we know is incorrect. And hey, colors are as "real" as anything else. The fact that the electromagnetic spectrum is a rather large bit bigger than what we perceive as visible light doesn't somehow invalidate the differences we perceive between red and blue. They are specific frequencies, after all.
  23. Yes, I follow you. I think that human beings, in general, are subject to many, many more of these bacteria-like behaviors (okay, maybe not that low on the food chain) than we admit to ourselves. The majority of our actions aren't all that complicated. I guess it's when it's crunch time that we pull through and exhibit that tiny little spark of intelligence that sets us apart (at least in the degree to which we possess and exercise it). By the way, I was reading an article a while ago, after a pretty bad nightclub accident (in PA, USA, I think), in which the scientist was baffled by how much human beings, in situations of panic, behave like mice in labs. Specifically in the context of how we don't make the rational choice when it comes down to getting out of a place with few exits. What humans do in practice is rush blindly for the door, although reason would tell us that we stand a better chance of getting out (and of getting more people out with us) if we didn't clog the door and slow the rate at which people can exit.