DoorNumber1

Senior Members
  • Posts: 33
  • Joined
  • Last visited

About DoorNumber1

  • Birthday March 8

Profile Information

  • Location
    Boston
  • Interests
    life, love, sex, food, chess, ballroom dancing, people watching, reading, laughing
  • College Major/Degree
    Harvard/Computer Science
  • Favorite Area of Science
    Physics/Psychology
  • Biography
    I'm the one who stopped you from pressing the red button. You can thank me later.
  • Occupation
    Student/Programmer

DoorNumber1's Achievements

Quark (2/13)

Reputation: 10

  1. Zanthra knows what he's talking about. Trust me, nobody in security would do something as silly as a simple mapping from letters to hex symbols. A good hash has the property that a tiny change in the cleartext produces a large and unpredictable change in the hash. There are at least four different protocols Windows uses for login that involve hashes. LAN Manager (LM) is an old authentication protocol that uses a weak hash that can be brute-forced almost instantly. NTLM is the updated version used since NT, but it can still be brute-forced pretty quickly. NTLMv2 is a further update that I don't know much about, but I'd hope that by this time they'd learned their lesson and done some decent hashing. Kerberos is what's used to log into a Windows Server 2000 or 2003 domain, and it finally does respectable crypto. Your Windows computer never wants to store or pass the password in the clear, so it stores and passes a hash of it instead. That's what you see as a hex string. Unfortunately, with the weak Windows LM hashing, you can brute-force the hell out of it and crack it pretty quickly. For the updated authentication protocols that take a bit longer, you can always send the hash off to someone with cycles to spare (like, well, loginrecovery.com) and let them brute-force it for you. Also, keep in mind that hashing is not encryption. If you use a weak password, it can be cracked using dictionary attacks and intelligent brute-force schemes (like ones that use common letter-to-number swaps and append a few numbers to the end of dictionary words; there's a quick sketch of this below). Just use a long, complex password... or use a Mac like I do. It's hard to work in security and feel good using a Windows box.
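
    A minimal sketch of that avalanche property, and of why weak passwords fall to dictionary attacks. This is illustrative Python only: MD5 stands in for a generic hash (it is not the LM/NTLM algorithm), and the wordlist and mangling rules are made-up examples.

    ```python
    import hashlib

    def h(s):
        # Hex digest of a hash; MD5 here is a stand-in, not what Windows uses.
        return hashlib.md5(s.encode()).hexdigest()

    # Avalanche: a one-character change scrambles the whole digest.
    print(h("password1"))
    print(h("password2"))

    # A crude dictionary attack: common words, letter->number swaps,
    # and appended digits, hashing each guess until one matches.
    target = h("p4ssw0rd1")  # pretend this hash was pulled from a machine
    for word in ["secret", "password", "letmein"]:
        for guess in (word, word.replace("a", "4").replace("o", "0")):
            for suffix in ["", "1", "123"]:
                if h(guess + suffix) == target:
                    print("cracked:", guess + suffix)
    ```
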
  2. It's a valid question, BobbyJoeCool, and it's important to follow the solution to this fairly small problem. You solve lim x -> 0 sin(x)/x types of questions using L'Hôpital's rule. In other words, the limit of f(x)/g(x) equals the limit of f'(x)/g'(x), which is really useful for cases where the normal rules give you indeterminate forms like 0/0 or inf/inf. So lim x -> 0 sin(x)/x gives sin(0)/0 = 0/0, an indeterminate form, so we differentiate the top and bottom and get lim x -> 0 cos(x)/1 = cos(0)/1 = 1/1 = 1. And therefore, by L'Hôpital's rule, lim x -> 0 sin(x)/x = 1 (written out below). Remember your calculus!
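
    Written out as one line (a standard presentation of the same step):

    \[
    \lim_{x \to 0} \frac{\sin x}{x}
    \;=\; \lim_{x \to 0} \frac{(\sin x)'}{(x)'}
    \;=\; \lim_{x \to 0} \frac{\cos x}{1}
    \;=\; 1,
    \]

    where L'Hôpital's rule applies precisely because the original limit has the indeterminate form \(0/0\).
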
  3. The question isn't really whether the human eye can keep up with it; that's just a question about the mechanics of the eye. The upper speed your eye can track is (to a decent approximation) a function of the mass, size, and shape of your eye, the amount of force your eye muscles can exert, where they're attached, how far away the object is, the relative path it's on (to know the tangential component of its velocity), the mass, size, and shape of your head, how much force your neck muscles can exert... you get the picture. In the end, the number's pretty freaking high because your eye is very small, nearly spherical, and almost frictionless. Also, your eye muscles are (relatively) pretty strong. But this doesn't answer the real question, which is whether you'll perceive it. After all, you must perceive it to "track" it. This doesn't have much to do with focus speed or the speed of light, or anything like that. It's more a question of a) how much light the object is putting off, b) how much of it reaches your actual eye, and c) which part of your eye those photons land on. Imagine yourself staring straight ahead when an object appears out of nowhere. Before you "see" it, enough photons must reach the photosensitive cells in your eye to activate them. If enough of these cells are activated, they generate a signal strong enough to actually register in your brain (assuming you're not focusing on something else), which causes your eyes to turn to focus on it and elevates it to conscious thought. That's the order for most things: it registers on rod or cone cells, their activation registers in your brain, you turn your eye to it, and a small fraction of time later the whole thing registers in consciousness. The irony is that we usually think of it as "Oh, I saw it and turned to look at it..." which is completely wrong. It also depends on where the light first hits your eye: in the center our eyes have far more resolution (especially for color), but near the edges the cells are much more sensitive to light. Okay, I'm going to stop and sum up the answer. The speed at which "the human eye can no longer keep up" is about more than the eye's movement speed; it's also about our ability to track the object. That depends on whether it gets registered at all (so our eye actually turns), which in turn depends on how much light from the object reaches our eye and where it hits (a rough number for the required sweep rate is sketched below). Hope this helps!
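
    To put a rough number on the tracking requirement (my own back-of-the-envelope, not from the thread): an object at distance \(d\) moving with tangential speed \(v_{\perp}\) forces the eye to sweep at an angular rate of about

    \[
    \omega \approx \frac{v_{\perp}}{d},
    \]

    so a car doing 30 m/s at a distance of 10 m demands roughly 3 rad/s, about 170 degrees per second.
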
  4. Yourdad, please get a life. Or at least a sense of perspective. Hell, a basic knowledge of linguistics concerning the critical periods for learning languages will do. The truth is, picking up a language as an adult is, to put it scientifically, hella hard to do. After a month in China I learned a lot of random words and phrases, but all I remember how to do is greet people, express the thought "I am..." (useful for saying you're an American or a student or anything like that), and say thank you. And the word for bathroom. That was a very useful one to know. If I'd stayed longer, with a bit of formal study, I don't doubt that I would have become functional, and rather quickly. But it would be broken and hideous to a native speaker of Mandarin. Put me around a group of English speakers there and I'll always revert to my native tongue. It was SUCH a relief to hear fluent English being spoken by anybody after a while. And it's hard, even when you're trying, to find your way around when none of the signs are in your native tongue. But I'm not stupid. Quite the opposite. And guess what? Most of those Spanish-speaking immigrants aren't either. One of my best friends' parents were biochemists. Oh, but that was in Nicaragua. Because they couldn't speak English fluently, they were assumed to be dumb, so they had to settle for low-paying janitorial jobs in Miami. Naturally, the second generation picked up English flawlessly, as kids do, and they're now assimilated and adding a lot to our country. I met her here at Harvard, and I'd bet money she's 20 times smarter than you are, yourdad. And that's the point of America. Amazing people can come here, do their thing, and everyone benefits. As a final note, more to the point considering the implied ideology behind your "lacking in patriotism" statement, ask yourself honestly if you'd react the same to a cute girl speaking broken English but fluent French. Or how about Italian? Or German? You don't have to tell me the answer, but you should really think about it. And hard.
  5. I have a DVD-RW in my G5, and I have yet to use it. Why? Well, I just don't need to transfer that much info at a time! Truthfully, I'm happy to have a DVD burner because I can, in theory, back up quite a bit of information. But for everyday use, the most I'll need is a CD to back things up. I think Sayonara hit the nail on the head, though; some people would need more. There are tons of applications (mostly multimedia things) where one might like to transfer much more than a DVD can hold. What I would like to see is a study that collects stats on the average amount of data transferred to and from the average home computer, the way it gets there, and what it's used for. I think such data would be quite enlightening, and I would predict that the largest items would be media files that still fall within the size range of CDs or DVDs. Most uses would be rather small, and transferred through non-physical means.
  6. Okay, can somebody explain one thing to me: how did we solve the big problem of radiation shielding in deep space? While we're within Earth's magnetic field everything's just great, but once we leave it, a ship and everyone inside will get their collective asses kicked by the full, unrestrained output of our sun. The shielding required to keep a crew healthy for such a long trip would make for a HUGE (read: tons of fuel required to get into orbit) craft! Also, what about the fast-forward osteoporosis that takes place in zero g or near zero g? There are tons of theories out there about what a craft going to another planet would have to look like, involving spinning disks and such (the arithmetic for that is sketched below), but as far as I know we haven't found a good way of simulating gravity. There's so much damage done to astronauts nowadays in their relatively short stays... imagine a trip that'll take years! I think we have decades of committed experimentation and testing ahead of us before we can even begin to tackle the god of war. I'm all for it, though. Uncovering the secrets of interplanetary travel is one of those goals that drove me to study physics through high school and my first years of college before I switched to computer science. And I agree with Lucid... it was a blatant attempt by an idiot president to appear "science friendly" after his ridiculous stances on other scientific issues. Making such a claim, to me, just highlights his stupidity and insecurity; he feels he needs to connect himself to greatness by making such a Kennedy-esque statement. I'll be cheering loudly when he's out of office.
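
    For the spinning-disk idea, the basic arithmetic is just centripetal acceleration, \(a = \omega^2 r\). A quick Python sketch with an assumed, made-up habitat radius:

    ```python
    import math

    # Spin gravity: centripetal acceleration a = omega^2 * r stands in for weight.
    # The radius below is an assumption for illustration, not a real design.
    g = 9.81   # m/s^2, target "gravity"
    r = 50.0   # m, assumed habitat radius

    omega = math.sqrt(g / r)           # rad/s needed for 1 g at the rim
    rpm = omega * 60 / (2 * math.pi)   # revolutions per minute

    print(f"{omega:.3f} rad/s = {rpm:.2f} rpm")  # ~0.443 rad/s, ~4.2 rpm
    ```
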
  7. How much gas does a farting cow produce? Enough. We have an awful lot of cows in this world, ya know. A good question would also be "how much methane does a city full of farting humans produce?" I suggest we put a fart tax on people who sit around on their couches eating potato chips and contributing to global warming. Five cents for every burst of methane-laden wind, eh? On a slightly more serious note, I just read an article a few days ago about huge greenhouse gas emissions coming from a Brazilian dam, because they basically flooded an entire forest that is now rotting underwater. Kind of sucks, when you think about it. Our stupidity is really catching up with us. Or, more truthfully, other people's stupidity is starting to affect all of us around the world.
  8. I already made that very logical argument. He promptly ignored it. And yes, Sayonara, I think I'll fire away. Everything is interpreted. Period. (A toy example of what I mean is sketched at the end of this post.) Your computer can do nothing more than what is hard-coded. You lay out this thing called an ALU and a register file (depending on your design) and a couple of other essentials, and you drive it with a primitive language called assembler. Your entire computer is nothing more than a translation unit that takes voltages on wires, interprets them as 0's and 1's, and picks from a set of hard-coded functions to perform in some specified order (given by the program you feed it). Everything else is supplementary. Nothing else matters. A string of 0's and 1's that can be successfully interpreted by your machine as assembler is the heart of every piece of software. But this is not the definition of software. Software includes code that goes through an intermediate step to get to the assembler (which is why, as a million non-technical sites have said, anything except hardware is software). Like the Java virtual machine. So JVM + .class file = software, because the two of these together do, in fact, produce something that runs as assembler, determined primarily by the .class file used as input. Is a .class file by itself software? Eh... I'll call it potential software. Feed a .class file to lots of things and it'll do nothing. Feed it to the JVM and voilà! You have a runnable program and, thus, software. So (JVM + *.class) as a package = software. A virus is no different. It's a script to be interpreted at some level of abstraction. Hell, it doesn't even have to be. In a less secure system I'm sure you could write a virus in as low-level a language as you want. People don't have to, though. It's the fact that it does harm that classifies it as a virus, not the language it's written in. Is that snippet of PHP code software? Could be. Wrap it with a PHP-enabled app server like any decent Apache installation and it sure as hell is. It's just abstracted by a step. Oh, and I apologize. I missed the post in which you wanted me to draw a distinction between software taking in an input file and viral instructions being picked up by a program. I must have been making a post myself... but I'll happily challenge your analogy, albeit a good one. Is a .jpeg software? I'm sure there is some set of circumstances in which you could write an interpreter that takes a .jpeg and successfully maps it to assembler. The requirement, to me, is that the program that picks it up is merely an interpreter; the heart of the logic must be in the input file fed into it. I'll explain why I have that requirement if it's really necessary... just ask. So a virus fed into a program (say, a web browser) that exists solely to interpret information coming into it, and that controls, for however long, the logic of what your machine is doing (again, there's a mapping between the viral code and the logic the machine performs that is missing in most simple text or image file inputs), is software. The layer of indirection doesn't invalidate it. So a malicious script + interpreter = virus = software. COULD a jpeg + a suitable interpreter be a virus/software? Maybe, but I tire of this conversation. You've said one thing that is quite correct... this is outside the scope of the original question. Hell, I wouldn't have even posted if not for the fact that I saw an obvious logical flaw being made and defended for many posts (who committed the "classical mistake" here?) in my field of study. It's only 8:35 here in Cambridge, MA, but this is exhausting. I need to spend a night drinking to make up for this. I quit. And, for the record, I agree with YT2095's earlier post. The most pointless piece of software ever written was Windows, by far! Not that it was pointless when originally conceived, but it certainly is today. Now where's my alcohol...
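
    To make the "everything is interpreted" point concrete, here's a toy dispatch-table machine in Python. The instruction set is invented purely for illustration; a real CPU does the same dance in silicon, with voltages instead of tuples.

    ```python
    # A toy "machine": one accumulator and a hard-coded set of operations.
    # A program is just data that selects among those operations, in order.
    def run(program):
        acc = 0
        for op, arg in program:
            if op == "LOAD":
                acc = arg
            elif op == "ADD":
                acc += arg
            elif op == "PRINT":
                print(acc)
        return acc

    run([("LOAD", 2), ("ADD", 3), ("PRINT", None)])  # prints 5
    ```
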
  9. This discussion is going nowhere quickly. Sayonara: define "software" and prove that a virus is not it. If you cannot give a specific definition, do not expect us to conform to it. I tried to determine your definition of software from your previous posts, but it seems somewhat vague at best. The best I could extract from your words, as distinguishing factors, dealt with data manipulation and code written in a non-interpreted language. Tell me exactly what software is (not what most of it does) and we can get somewhere.
  10. Oh, and I'm a "he" for future reference. Typing he/she is waaaaaaaaay too long! And I agree; that was not central to my argument at all. I was merely providing an example of "software" that doesn't fit the data-manipulation requirement provided by Sayonara, who seems to be drawing a distinction between software written in a high-level vs. a low-level programming language.
  11. Sorry dude, but I'm not confused. Answer this question for me... is a program written in Java or C# software or not?
  12. I'm going to have to disagree with you, Sayonara. A virus is software. Software is just a set of instructions... ironically, the exact definition you gave for a virus. And that's it. As for your statement, it's simply not valid. Certain types of software manipulate data, but the function of a piece of code doesn't determine whether it's software or not. That would be a great definition for useful software. The program posted earlier, int main(int argc, char** argv){ return 0; }, is a prime example. It's literally pointless... yet you can compile it down into assembler with a C/C++ compiler and run it. It's software. Technically it does something, because it returns 0, but it doesn't do a useful something. Trying to draw a distinction based on whether a program gets a running process of its own isn't valid either, because many interpreted languages don't spawn a new process for every set of instructions they run. You could call anything written in Java or C# a virus under that definition; there's a main process (a virtual machine) that kicks off, and other code is "injected" into it to be interpreted. Now I'm not about to start a debate over whether Java is useful, good, pointless, or anything else... but you certainly must admit that it produces software. Oh, and they're called "viruses" because they cause harm and tend to multiply, silly.
  13. Maybe I'm wrong, but I thought the premise behind special relativity was that the speed of light is constant for all observers... period. Einstein based this on the conclusions of earlier experimenters who found the same thing. What he did was point out the inconsistency this created with classical mechanics and show that, for it to be true, things like time and mass can't be assumed constant. Flak, you're arguing that something can travel faster than the speed of light using Newtonian principles, and that's the whole point of relativity... that doesn't work! You can't just dump more energy into something to make it move ever faster (the formula at the end of this post makes that precise). In fact, time and mass adjust to make sure it doesn't work (as was stated earlier, all of this has been verified). And anything that travels at the speed of light must have zero rest mass. I thought that even all this business about the "slowing" and "stopping" of light doesn't actually violate this... most of it is just producing materials with ridiculous indices of refraction, so the photons interact with so many particles (well, their electrons) that they never make it very far without being absorbed and spat out again. Although I keep some physicist friends (one of whom was involved in research in this area and quoted in a few articles), I only studied it myself for a few years at a collegiate level, so someone please correct me if I'm mistaken. I like to know when I'm wrong. The comparison to a sound wave is, unfortunately, also incorrect. The speed of sound wasn't, isn't, and never will be constant for all observers. There's no rule saying you can't travel at it or pass it. We don't slow down like some Matrix parody every time somebody shouts from a train just so its constant speed is preserved. It's just a pressure wave traveling through a material, man.
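
    The "can't just dump more energy into it" point is exactly what the relativistic energy formula says:

    \[
    E = \gamma m c^{2}, \qquad \gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}},
    \]

    so \(E \to \infty\) as \(v \to c\): no finite amount of added energy gets a massive object to light speed.
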