Everything posted by TakenItSeriously

  1. I once heard Dr. Michio Kaku say in a documentary that Einstein was never comfortable with curved spacetime, and that he regarded it as just a model for demonstrating the effects of gravity that shouldn't be taken literally.
  2. I've always suspected that spaghettification near BHs is an observer-dependent spacetime illusion, similar to length contraction at close to light speed. The usual thinking is that the delta force between your head and feet would physically tear you apart. As I see it, you don't feel a thrust when in free fall because it is spacetime that is accelerating towards the planet. So you shouldn't feel the delta forces either, because it is spacetime that is getting stretched, not the body in free fall.
  3. It sounds like they ran a proof of concept, but it still seems too good to be true: a 23,560% boost to revenues while completely eliminating the pollution, CO2 emission free. That's the first time I've ever used five figures for a percentage increase. It also burns methane, which seems like it could provide another benefit by reducing methane as a greenhouse gas. Perhaps create a process to collect the methane from sewage treatment plants or old garbage dumps that would otherwise let it escape into the atmosphere as a greenhouse gas.
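For scale, here's the plain arithmetic on what that five-figure percentage means (nothing here is from the article beyond the quoted figure):

```python
# A 23,560% increase means multiplying the baseline by 1 + 235.6.
baseline = 1.0
increase_pct = 23_560
boosted = baseline * (1 + increase_pct / 100)
print(boosted)  # 236.6 -> revenues multiplied by roughly 237x
```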
  4. It actually said it requires 1/10 the code. I could see it as being a typo, but given the way it emphasized easier programming, I'm not so sure. I thought it was finding efficiencies by using the same piece of code in ten pipelines instead of ten separate threads? But really, if it's doing the same process, then all ten threads could call one subroutine. I'm also not sure I understand how slower memory could provide greater efficiency for parallel pipelines; it seems more like a limit on effectiveness if the memory can't handle the extra bit rates the parallel pipes can process. Oh, I see, it's putting much of the code into the hardware. Thanks.
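For what I mean by ten pipelines calling one subroutine, a minimal sketch (the names and the trivial work function are made up for illustration):

```python
import threading

def process(stream_id: int) -> None:
    # One shared subroutine: every pipeline runs the same code path,
    # so the logic is written once rather than ten times.
    print(f"pipeline {stream_id}: processing")

# Ten "pipelines" reusing the single routine above.
threads = [threading.Thread(target=process, args=(i,)) for i in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```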
  5. Well, the trouble with overpopulation is that it creates many, many problems that individually don't seem that threatening, but put them all together and they can have a devastating compound effect. We're slowly changing the atmosphere as the oceans produce less oxygen and more methane; oxygen levels have already dropped quite a bit over the past century. Similar issues with the melting tundra, deforestation, dying coral reefs, increasingly severe forest fires, etc. We're efficiently destroying the food chain as our population continues to grow at an exponential rate. The previous century took us from 1.5 billion to where we are now. This century?
     We can expect extinction rates to climb even more dramatically as climate change gets worse over the next 50 years, meaning unpredictable extreme weather patterns creating natural disasters that get worse year after year. On top of that:
     • Satellites under threat from space junk: a chain reaction could have a devastating effect on our technology and economy.
     • Infrastructure in the worst shape ever, with no money to make the repairs.
     • Fracking permanently polluting the water table in many regions.
     • Governments too polarized and too corrupt to be effective.
     • We could lose much of the grain belt to climate change.
     • Overfishing of polluted and dying oceans can't be sustainable for long.
     • Extinction of the bees could take out even more crops.
     So we should already be in pretty bad shape when the inevitable coastal flooding creates tens of millions of displaced refugees around the globe, raising social stress to unprecedented levels and possibly throwing many parts of the globe into anarchy. Topple a few of the wrong governments where nuclear arsenals get pillaged... Supervolcanoes are normally a low risk individually, but with three that are overdue, and maybe a 100-year window where we will be extra vulnerable and unable to bounce back, suddenly the odds are not looking so safe. ^Funny, I enjoyed it^
  6. I've simulated parallel processing using timers in AHK scripts when creating HUDs for online poker, which was kind of cool. It wasn't for efficiency but for continuously scanning for a number of trigger events across multiple tables; it just queued up the commands in a single pipe, though (a minimal sketch of that pattern follows below). Just curious: is the number of pipes limited to the number of cores you have available? Also, I don't understand how it could require 1/10 the code.
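For reference, the pattern I used looked roughly like this, translated from AHK into a Python sketch (the table count, interval, and trigger check are made up for illustration):

```python
import queue
import threading
import time

commands = queue.Queue()  # single "pipe": all tables feed one queue

def scan_table(table_id: int) -> None:
    # Stand-in for the real trigger check (e.g., "is it my turn to act?").
    if time.time() % 2 < 1:  # dummy condition for illustration
        commands.put(f"update HUD for table {table_id}")

def timer_loop(num_tables: int, interval: float, ticks: int) -> None:
    # Timer-driven scan: polls every table each tick, like AHK's SetTimer.
    for _ in range(ticks):
        for table_id in range(num_tables):
            scan_table(table_id)
        time.sleep(interval)

threading.Thread(target=timer_loop, args=(4, 0.25, 8)).start()
time.sleep(2.5)
# Consumer drains the single queue, one command at a time.
while not commands.empty():
    print(commands.get())
```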
  7. OK, but my question about radiation is not so much about the rate of expansion as about whether there is some other mechanism, one I'm not aware of, by which radiation can dissipate without any mass in the universe. Based on what we know, it seems like radiation is always going to be around, and I don't understand why current end-of-the-universe models don't account for perpetual, ever-expanding radiation. Previously, I used to think of the universe as ending when all matter was gone. Now I don't know what to think.
  8. Thanks Mordred, modeling the universe as permutations of its constituent parts makes a lot of sense. Radiation expansion brings up a question about the end of the universe. In an ever-expanding end-of-the-universe scenario, after the last remnants of the last BH disappear with a bang and we are left with only radiation, I've heard that it is supposed to eventually disperse into heat. But without matter there doesn't seem to be a mechanism to absorb the radiation and turn it into heat. So is it correct to assume that the universe could never end, and would expand forever as radiation in a steady-state expansion?
     Could you expand a bit on how it was surprising? E.g., did matter alone expand at the same rate as our universe, or did matter alone expand at a rate faster than what DE predicted?
     OK, but to be clear, are we talking about a universe such as Mordred's example, where at least one thing was still a part of the universe being studied, be it radiation, lambda, matter, or DM? Or are we talking about a universe that was truly empty of everything?
     By the way, I mentioned the Higgs field as an afterthought because, in theory, it is supposed to be everywhere. So I wondered if the Higgs field could define spacetime: it would be a single contiguous field that would contain all of the particles and all of the connections as FLRW defines them. Also, there are clearly problems with obtaining information about spacetime outside of our light cones, but it seems like there is a need to do just that. That's because there is a need for the universe to be homogeneous and isotropic for things like inertial frames to work. To explain: I assume that homogeneity and isotropy ensure all physical laws remain consistent no matter where we apply them. But what if an observable edge were very near an actual edge of the universe, or located next to an improbably large and dense cluster of galaxies, IDK, whatever is large enough to break the local laws of physics? Then it seems like there is potential for a paradox where light speed is violated by information within our domain. I see potential solutions using something like a brane theory, but that's getting into speculation. Still, I do see a definite need for a single field to be everywhere, and the Higgs field seems to fit the bill. Also, it imparts mass, which could make it related to a quantum loop theory... Sorry, I just had an epiphanic moment and I didn't want to lose my train of thought.
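On the radiation question, the standard FLRW bookkeeping may help frame it: each component's energy density dilutes at its own rate as the scale factor a grows, and radiation redshifts on top of the volume dilution. A minimal sketch of those textbook scalings (nothing here is specific to Mordred's example):

```python
# Standard FLRW dilution of energy density with scale factor a:
#   matter    ~ a**-3  (volume dilution)
#   radiation ~ a**-4  (volume dilution plus redshift)
#   lambda    ~ a**0   (constant)
for a in (1, 10, 100):
    matter = a**-3
    radiation = a**-4
    lam = 1.0
    print(f"a={a:>3}: matter={matter:.1e} radiation={radiation:.1e} lambda={lam}")
```

So radiation doesn't need matter to absorb it in order to fade; its energy density simply redshifts away, a factor of a faster than matter's, for as long as expansion continues.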
  9. I'm having trouble understanding what an empty universe is, or why they would try to distinguish it from nothing. Was it done arbitrarily, or were they trying to predict what would be left of a failed universe that recollapsed back in on itself? Could we assume that an empty universe is an appropriate application for Einstein's equations? Didn't they predict that space and time were created as a result of the first matter particles coming into existence? I also can't imagine what the properties were that could be compared to ours. Was it flat, open, closed, expanding, contracting, homogeneous, isotropic, etc.? Does any property besides steady state even make sense without matter? Can you tell us how they define an empty universe, or why they would want to define one, and what its properties are? Did it have a Higgs field?
  10. Nice! This is a good explanation for why there must be a theoretical maximum on speed. Edit to add: upon reflection, while it does sound like a paradoxical result, I'm not certain that it actually leads to a paradox. The dual existence could only last for an instant in time, like teleportation, and one could imagine teleportation without creating a paradox; for instance, it doesn't violate cause and effect.
  11. I'm not sure how that comment applies to my understanding of what a theory is. I did misunderstand that the derivations of the Planck numbers were what linked their values in the first place, which was never an issue for me as an engineer. So when Imatfal mentioned that time and distance were equal in Planck units, at first I didn't notice that this was due to how they were derived, and I erroneously thought the matching numbers proved the link between space and time, which made me question why it wasn't considered a law or principle; I corrected that oversight a minute later. Perhaps I should have deleted the erroneous statement, but I'm in the habit of forums permanently reflecting errors in posts, so I added corrections to the comment instead of replacing it, which may have confused you somehow. IDK. I can assure you that I have a pretty solid understanding of what qualifies as a theory now vs. what may have been required earlier, and why the shift in requirements is important. It essentially amounts to validating a new theory vs. adding new portions of theory to complete one. Solving problems that existing theories fail to solve, while remaining consistent with the complex set of theories already vetted over time, is self-vetting for the most part. It is the complexity and the broad range of vetted theories it needs to conform to that makes solving multiple problems, while conforming to known data in a consistent manner, a very difficult task to achieve with an invalid theory. It's fitting so many variables that makes it self-validating, like matching 15 points on a fingerprint.
  12. In the example of the cards in envelopes, I don't think it is literally true, though to be fair, Delta did use the expression "taken at face value". It would only be true if there were no possible way to determine the contents of the envelope through cheating, such as switching the envelopes with ones whose contents were known, but that would assume perfect security, which doesn't exist. We often think of it as hypothetically true because if no one were cheating, there would be no way to confirm or deny the contents, and it would be equivalent to being impossible to know; but you can never assume no one is cheating with absolute certainty either. Which, BTW, is why you should never use a scientist to debunk a con artist: they take too many hypothetical conditions for granted. I keep thinking of the gambler's fallacy, where if you flipped a fair coin a thousand times and it came up heads every time, the next flip should still be 50:50. But that's only the mathematician believing it's a fair coin because he was told to assume it was fair; it could be a fair coin, or the coin flipper could be using sleight of hand to control the results. In any case, believing the result should be 50:50 after seeing heads a thousand times in a row is clearly foolish reasoning.
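To put a number on that intuition, here's a minimal Bayesian sketch (the prior and the rigged-coin hypothesis are assumptions I picked for illustration): even a tiny prior suspicion of cheating overwhelms the fair-coin assumption after a long run of heads.

```python
from fractions import Fraction

# Prior: overwhelmingly confident the coin is fair, tiny chance it's rigged.
p_fair = Fraction(999_999, 1_000_000)
p_rigged = 1 - p_fair  # e.g., a two-headed coin or sleight of hand

n_heads = 1000
likelihood_fair = Fraction(1, 2) ** n_heads  # P(1000 heads | fair coin)
likelihood_rigged = Fraction(1, 1)           # P(1000 heads | always heads)

posterior_fair = (likelihood_fair * p_fair) / (
    likelihood_fair * p_fair + likelihood_rigged * p_rigged
)
print(float(posterior_fair))  # ~9e-296: the fair-coin assumption is dead
```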
  13. You're kidding! Wow, how is it that the linkage of space and time is still just a theory, if that's the case? Unless it's based on how Planck units are derived, like if the Planck distance were derived from the Planck time and c, which would perhaps make it a circular result. Oh, wait, that's what you were saying at the beginning. Lol. I was thinking they were independently derived for some reason, but that doesn't seem reasonable.
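For the record, the standard definitions make the circularity explicit: both units are built from the same constants, so their ratio is c by construction.

```latex
l_P = \sqrt{\frac{\hbar G}{c^3}}, \qquad
t_P = \sqrt{\frac{\hbar G}{c^5}}, \qquad
\frac{l_P}{t_P} = \sqrt{\frac{\hbar G}{c^3} \cdot \frac{c^5}{\hbar G}} = c
```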
  14. If you're looking for significance in the number itself, such as finding a perfect number or a factor of pi or something, don't forget that the units are arbitrary, so the literal number "299,792,458 m/s" is going to be arbitrary as well. Who knows, if you expressed c in Planck units and analyzed the number, maybe something would reveal itself, unless it's already been done. Imagine if a light year were the same number as a year in Planck units and no one ever bothered to check.
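As it happens, the check is nearly a one-liner: in Planck units c comes out to exactly 1, so a light year and a year really are the same number by construction (rounded CODATA values below):

```python
l_P = 1.616255e-35   # Planck length, m
t_P = 5.391247e-44   # Planck time, s
c = 299_792_458      # m/s

print(l_P / t_P)        # ~2.9979e8: the ratio of the units is c itself
print((l_P / t_P) / c)  # ~1.0: so c = 1 in Planck units

year = 365.25 * 24 * 3600  # s
light_year = c * year      # m
print(light_year / l_P)    # one light year, counted in Planck lengths
print(year / t_P)          # one year, counted in Planck times
# The last two prints agree: a light year and a year are the same number.
```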
  15. Not really; all entangled particles create a single indeterminate universe until the point of observation defines the event. Edit to add: if you think this is intriguing, the model is more than just a differential universe that completes QM. For instance, all charged particles become dipoles across time; therefore gravity becomes a loop that stretches across world lines. So we now have a mechanistic explanation of a macro version of the quantum gravity loop that is required for linking QM to relativity.
  16. In a sense, only instead of infinite realities, only two realities that are equal and opposite are needed. Think of it like yin and yang, or a universe that allows for free will or chance between outcomes. Without a differential universe, duality doesn't exist, and the only universe that could exist would be one that is completely deterministic, where cause and effect create a chain of events that can only end in a single outcome until the end of time.
  17. I've made no secret that I am not a physicist or mathematician, so formalism aside, I will instead try to explain it logically. In order to do this we must imagine a hypothetical differential universe that consists of dual realities that are equal and opposite and locked in entanglement. Therefore, in the example of the EPR entanglement paradox, there actually exist two realities, with two versions of Alice and two versions of Bob.
     In the first reality: Alice measures the X-spin with a 100% certain result of +, and Bob measures the Z-spin with a non-determinate result, 50:50 + or -.
     In the second reality: Bob measures the X-spin with a 100% certain result of -, and Alice measures the Z-spin with a non-determinate result, 50:50 - or +.
     Therefore the information does not actually travel any distance; it only travels between alternate realities, which occupy the same space in alternate timelines. (A small sketch of this outcome table follows below.)
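Purely as bookkeeping, here is that outcome table written out as data (the encoding is just illustrative; this is my hypothetical model, not standard QM):

```python
import random

# Two equal-and-opposite realities, encoded exactly as described above.
realities = [
    {"Alice": ("X", "+"), "Bob": ("Z", random.choice("+-"))},  # reality 1
    {"Bob": ("X", "-"), "Alice": ("Z", random.choice("+-"))},  # reality 2
]
for i, reality in enumerate(realities, 1):
    for observer, (axis, result) in reality.items():
        print(f"reality {i}: {observer} measures {axis}-spin -> {result}")
```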
  18. Ok, but the issue is, I could only provide the logical model, the formalism would require too long a learning curve on my part, though mostly it would only require logical states. If your still interested in seeing the model, I could spend some time cleaning up a more presentable synopsis, for the QM portion. BTW, there's a gravity loop portion in addition to this that I'm still trying to muddle my way though. Edit to add: the model does resolve entanglement paradox, but it also solves or begins to explain most of the duality issues as well.
  19. Is that allowed in these forums? I wouldn't mind, but wouldn't it be considered speculation?
  20. Local hidden variables don't necessarily require inequality as Bell views it. What if the hidden variables were differential instead? For example, consider a differential signal pair: differential signals are equal and opposite, or put another way, they are identical signals that are phase-shifted by 180°.
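To illustrate the analogy (this is electronics, not QM; the sine wave is just an example signal):

```python
import math

# A differential pair: the complement is the same signal shifted by 180
# degrees, which for a sine is just its negation.
for step in range(8):
    t = step / 8
    s_plus = math.sin(2 * math.pi * t)
    s_minus = math.sin(2 * math.pi * t + math.pi)  # 180-degree phase shift
    print(f"t={t:.3f}  s+={s_plus:+.3f}  s-={s_minus:+.3f}  sum={s_plus + s_minus:+.1e}")
```

The sum column comes out (numerically) zero at every instant, which is the equal-and-opposite property.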
  21. That's news to me, and I didn't see it mentioned in the Wikipedia article either. Can you provide a reference?
  22. Encryption key example: the useful, instantaneous information at a distance I was speaking of in that example was the encryption key they shared, which no one else could have any knowledge of, including the man in the middle. Pun intended. However, an even better example is the one given in the EPR paradox, which stands for the last initials of its authors: Albert Einstein, Boris Podolsky, and Nathan Rosen.
     EPR paradox summary: it was based on a mental model of an electron/positron entangled pair that involved the Heisenberg uncertainty principle. The gist of it was: given the two properties X-spin and Z-spin, uncertainty guarantees we cannot know the state of both properties at the same time, even when they are carried by entangled particles. So if Alice received an electron and spontaneously decided to measure either the X-spin or the Z-spin, then at any point after that:
     • If Bob, by chance, measured the opposite property from Alice (X/Z or Z/X) on the positron, his result would always be inconclusive, with a 50:50 probability of either + or -. Or...
     • If Bob, by chance, chose to measure the same property as Alice (X/X or Z/Z), then he would detect its state of + or - with 100% probability.
     Therefore Bob could always deduce which measurement (X or Z) Alice chose to make, instantly after she made it, despite the distance separating them. (A small simulation of these correlations follows below.) The authors of the article then acknowledged that one of two things must be true:
     • Alice violated the speed of light by instantly communicating which measurement she had just made to Bob over a distance, or...
     • Quantum mechanics is not a complete theory, and some missing portion of the theory must involve properties within each particle that are hidden from our domain.
     Note: I paraphrased those conclusions for clarity's sake. You can find the original quotes on Wikipedia under "EPR paradox".
     As to the question of the usefulness of a single bit of information, a couple of famous examples come to mind: Paul Revere's lantern signal at the beginning of the US Revolution, communicating the approach of the British from a church tower, "one if by land, two if by sea" as the song goes. If you're not a US citizen, then another famous single-bit communication is the white or black smoke that signals the cardinals' [un]successful voting in of a new pope.
     Note: while this is only a mental model, according to the Wikipedia article, experiments later verified the thought experiment as valid.
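Here's a minimal toy simulation of the statistics described above: same-axis measurements come out perfectly anti-correlated, different-axis measurements are a coin flip. It's just bookkeeping that reproduces the 100% vs. 50:50 pattern, not actual quantum mechanics:

```python
import random

def epr_pair_measurement(alice_axis: str, bob_axis: str) -> tuple[str, str]:
    # Toy rule from the summary above: same axis -> perfectly anti-correlated,
    # different axes -> Bob's result is an independent 50:50 coin flip.
    alice = random.choice("+-")
    if alice_axis == bob_axis:
        bob = "-" if alice == "+" else "+"
    else:
        bob = random.choice("+-")
    return alice, bob

trials = 10_000
same_axis = sum(
    a != b for a, b in (epr_pair_measurement("X", "X") for _ in range(trials))
)
diff_axis = sum(
    a != b for a, b in (epr_pair_measurement("X", "Z") for _ in range(trials))
)
print(same_axis / trials)  # 1.0: 100% anti-correlation
print(diff_axis / trials)  # ~0.5: inconclusive, 50:50
```

Note the toy model says nothing about how the anti-correlation is enforced at a distance, which is exactly the gap the EPR authors were pointing at.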
  23. I'm not sure this question makes sense. If any theory is in fact true, speculating about what would happen if it weren't true means the universe doesn't exist. However, we may have doubts about a theory's correctness and speculate about what would happen if such and such were not true while looking for an alternative theory that may make more sense. If that is the way you intended the question, then I don't know, but what Mordred said is probably a pretty good bet, at least until I've had a chance to fully digest his response.
     (Aside from sound) we mostly communicate using signal waves in electromagnetic fields through substrate materials, in which light plays only one role, i.e., fiber optics; but there are plenty of other examples of communication that do not involve light, such as any of the Ethernet protocols or wireless communications. They all still propagate at the speed of light, though, and are fundamentally based on Maxwell's equations.