Everything posted by Duda Jarek

  1. Quantum cryptography was believed to be ultimately secure, but in fact if someone could, for example, splice into the middle of such an optic cable and intercept the classical communication, he could tell A that he is B and use their protocol to receive the message. Then he can tell B that he is A and use the same protocol to send a message of his own. So the security of quantum cryptography in fact relies on the security of classical cryptography: the channels and the authentication. I believe you mean something different: using for coding a hypothetical computer which can quickly solve some NP problems - one which at each step chooses its behavior by analyzing an exponential number of possibilities. But if physics allowed solving NP problems quickly without restrictions, we could add all possible keys to this set of possibilities (to find the one that gives nontrivial correlations) - we could still break the cipher. If there are restrictions on the time and memory needed for verification, we can easily exceed any such limit by using a preinitialized cipher, like one based on ANS. About public-key cryptography - you are right, but if such a computer (with restrictions) appears, to make a cipher safe we would have to ensure that every step of encoding requires a large number of computations (much more than for RSA).
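To make the attack concrete, here is a minimal Python sketch (purely illustrative: qkd_exchange() is a stand-in for any quantum key agreement whose classical channel is unauthenticated, and the XOR 'cipher' is a toy):
[code]
import os

def qkd_exchange():
    # Stand-in for BB84 etc.: both ends of the (hijacked) link end up
    # sharing a key. Hypothetical abstraction, not a real QKD protocol.
    key = os.urandom(16)
    return key, key

def xor_cipher(key, data):          # toy cipher: XOR is its own inverse
    return bytes(k ^ d for k, d in zip(key, data))

# Eve splices into the cable: A runs the protocol believing she talks to B,
# B runs it believing he talks to A - both actually talk to Eve.
key_A, eve_key_A = qkd_exchange()   # A <-> Eve session
eve_key_B, key_B = qkd_exchange()   # Eve <-> B session

ct = xor_cipher(key_A, b"message for B")           # A encrypts for 'B'
intercepted = xor_cipher(eve_key_A, ct)            # Eve reads the message
forwarded = xor_cipher(eve_key_B, b"Eve's text")   # and sends her own to B
print(intercepted, xor_cipher(key_B, forwarded))   # B decrypts Eve's message
[/code]
Nothing quantum is violated here: Eve never touches the qubits, only the unauthenticated classical side.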
  2. OK - I was too pessimistic; we should be able to make protected and practical hybrid systems: a public-key cipher for a very short message, like the key for a secret-key cipher or a hash value for authorization. Most generally, a public key is a parameter of some transformation which is extremely difficult to reverse, and the private key is a kind of 'clue' which makes this reversal easy. So someone who could quickly solve NP problems could try all possible 'clues' and, for example, check whether encrypting and then decrypting some block with a candidate clue gives the same block back. If yes, he could try a few more blocks to be sure it is the correct private key. But there is also a more dangerous attack: searching not for these 'clues' but straightforwardly for the inverse function - having the encrypted message as a sequence of independent blocks, for each block he could try to encode all possible input blocks with the public key until he gets the given block. So to protect such a cipher, in analogy to secret-key ciphers, we would have to make the encoding itself require an extremely large amount of calculation. The problem is that this time these huge calculations cannot just be made once during initialization as before, but have to be made for each block - so it could practically be used only for extremely short messages, like the key for a secret-key cipher or a hash value.
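Both attacks in a minimal Python sketch, on a deliberately tiny toy RSA-like scheme (the numbers are illustrative; a real modulus would make both loops astronomically long):
[code]
n, e = 3233, 17                     # toy public key: n = 61*53

def encrypt(x):                     # the hard-to-reverse transformation
    return pow(x, e, n)

# Attack 1: try all possible 'clues' (private exponents d), checking that
# encrypting then decrypting a few test blocks returns them unchanged.
def find_private_key():
    tests = [2, 3, 5, 7]
    for d in range(2, n):
        if all(pow(encrypt(t), d, n) == t for t in tests):
            return d

# Attack 2: skip the 'clue' entirely and invert each block directly,
# trying every possible input block until its encryption matches.
def invert_block(c):
    for x in range(n):
        if encrypt(x) == c:
            return x

print(find_private_key())           # 413 (17*413 = 1 mod lcm(60,52))
print(invert_block(encrypt(1234)))  # recovers 1234
[/code]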
  3. I see that you believe the only real algorithmic advantage of QC is Shor's algorithm. Probably you are right, but... QC can theoretically make all calculations at once (it is almost a nondeterministic Turing machine), and the only problem is with the extraction. I'll show how to enhance it, but it would be strange if basic QC didn't already allow for more - maybe even solving NP problems in polynomial time. Observe that if someone found a way, he wouldn't necessarily announce it, but could for example try to become rich... In such a case a 1024-bit key would be a piece of cake - lengthening the key may not be enough. Especially since I believe we could easily protect secret-key cryptosystems against such eventualities by using preinitialized ones. But designing a protected public-key cipher is much more difficult, and I have no idea how to do it. The next argument for taking such scenarios seriously is that basic QC is maybe not the only possibility for massively parallel computation that physics gives us. First of all there is the so-called Feynman-Stueckelberg effect http://groups.google.com/group/sci.physics.research/browse_thread/thread/9d10b4e5cbda1108 which hasn't been taken seriously, but maybe that will change in a few months at the LHC... though such a computer would require a (huge?) accelerator. The other option is (quantum) loop computers http://forums.devshed.com/security-and-cryptography-17/new-computation-method-which-could-endanger-used-cryptosystems-580926.html I'm strongly confused about this idea, especially for classical computers. But... if used for quantum computation, such feedback should amplify the correct solution (wavefunction), making the rest vanish. It couldn't use the standard approach to QC, in which we apply some sequence of, for example, external fields to a lattice of atoms. We would need a circuit which can sustain the entanglement of many calculations. Observe that, similarly to benzene, a (-CH=CH-) sequence can be in quantum superposition with the shifted one (=CH-CH=) - we could use such a molecule as a wire for qubits. Unfortunately it has some resistance, but superconducting molecules of this kind are also known. We also know transistors made of a single molecule - they are irreversible, so they would destroy entanglement, but quantum gates made this way should also be possible. The question is whether such molecular quantum computers could sustain entanglement for a practically long time... There is also a problem with auxiliary variables - we need a lot of them, because in QC all calculations have to be reversible. They cannot be sent around the loop - they would have to be treated in some special way so as not to destroy the entanglement... ...but maybe? Probably physics doesn't allow solving NP problems in polynomial time, but I'm far from sure of it. And I believe that preinitialized cryptosystems should be practically protected against all the hypothetical possibilities presented here - and this protection is achieved practically for free.
  4. Physics most probably allows for non-classical computation methods which compute over all possible inputs in parallel - like the more and more realistic quantum computers. I think we should widely discuss whether this is a real threat and, if yes, how to design cryptosystems resistant to such eventualities. Especially since we wouldn't know if someone were already using it... Here is one discussion of this type: http://www.ecrypt.eu.org/stream/phorum/read.php?1,1021,page=1 and I wanted to start one here. Please share your opinions and links to interesting discussions on this topic. There is the known Shor's algorithm, which breaks RSA. Generally, cryptosystems always have a 'weakness' that makes them prone to brute-force attacks: there is a key which properly decrypts the message. To mount such an attack we can use the fact that there is practically only one key for which the decrypted message has some significant correlations. So using a quantum computer we could decrypt the message with all possible keys in entanglement, check for correlations, and somehow extract some information about the correct key. This extraction is more difficult than it looks, but assuming it is impossible could be dangerous. Quantum computers are extremely difficult to build and will most probably be limited to a relatively small number of qubits and a limited time over which they can sustain entanglement - so I think we can protect against imaginable QC by enforcing a large amount of required computation. To make such a cryptosystem practical, this computation should be done only once - specifically for a given key. The next advantage of such a preinitialized cryptosystem, like one based on asymmetric numeral systems, is that the processing of the data can then be extremely fast. Do you think it's a real threat? How strong is it - can it be used only for some algebraic attacks, or even for brute force? How can we protect against such eventualities? What about public-key cryptography?
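The brute-force-with-correlations idea, sketched classically in Python (a toy repeating-key XOR cipher and a crude printable-ASCII statistic stand in for a real cipher and a real correlation test):
[code]
from itertools import product

def xor_decrypt(key, ct):           # toy repeating-key XOR cipher
    return bytes(c ^ k for c, k in zip(ct, key * (len(ct) // len(key) + 1)))

def correlation(pt):
    # Crude statistic: fraction of printable ASCII bytes. The correct key
    # is practically the only one whose decryption scores near 1.
    return sum(32 <= b < 127 for b in pt) / len(pt)

def brute_force(ct, key_len=2):
    keys = product(range(256), repeat=key_len)
    return bytes(max(keys,
                     key=lambda k: correlation(xor_decrypt(bytes(k), ct))))

secret = bytes([7, 42])
ct = xor_decrypt(secret, b"attack at dawn, as usual")  # XOR: encrypt = decrypt
print(brute_force(ct) == secret)                       # True
[/code]
A quantum computer would run all the keys of this loop in superposition; the hard part, as said above, is extracting the one that scores well.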
  5. I've just finished a large paper about ANS. It adds some deeper analysis and gathers rethought material I've posted on different forums... It also shows that the presented data correction approach can really reach the theoretical Shannon limit, and it appears to run in expected linear time, so it should be much better than the only practical method of this kind in use - Low Density Parity Check (LDPC) http://arxiv.org/abs/0902.0271
  6. So please explain why you think it cannot be a realistic scenario. We have our subjective perception of the direction of time, but it is only the result of boundary conditions with relatively small entropy (the big bang). What is less subjective is causality: reason-result chains. And CPT conservation, which allows switching past and future, suggests that such relations should be possible in both time directions.
  7. In the many-worlds interpretation such a causality loop doesn't even have to close. We have plenty of mystical QM interpretations, and all of them want to look at reality from the perspective of a constant-time plane - maybe that is the problem! GR and field theories strongly suggest that we have to think about physics four-dimensionally - we live in some already created spacetime, and the time arrow we perceive is only some local solution (eternalism). As I've shown, choosing this view there shouldn't be a problem with interpreting QM. Even better - QM should occur there naturally! Making this assumption - eternalism, that the 4D spacetime is already there and stabilized - causality loops are already stabilized and the questions we ask are already answered. If physics couldn't stabilize such a loop, it could easily break its weakest link - the prediction - for example by choosing the statistics of the scattering so that we never spot these 'virtual' particles.
-------------------------
If the idea could really work, transferring many bits would require much more resources than transferring one, especially since in practical problems the first algorithm would require hundreds or thousands of them. So I thought about the second algorithm; let's recall it (we transfer one bit, B):
- if B, then 'input' -> next possible 'input' (cyclically)
- if 'input' verifies the problem
-- then transfer back in time B=false
-- else transfer back in time B=true.
If physics can close this loop, it should stabilize on B=false and some solution. The problem is that 'input' has to be initialized somehow beforehand. The solution could be a really good random number generator, for example measuring a photon at 45 deg to its polarization (before the loop). The causality loop should then make this generator have already chosen a good solution - causing something even in its own past. But because of that, this algorithm looks less stable - physics could make the prediction harder, for example by making the statistics of the scattering worse.
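Simulated with an ordinary forward-time loop (no retrocausality, of course), the one-bit algorithm degenerates into a plain cyclic scan; a Python sketch with a toy verifier:
[code]
def one_bit_loop(verify, n_inputs, start):
    inp, B = start, True                # 'input' seeded by the random generator
    for _ in range(n_inputs + 1):       # a stable loop must close by then
        if B:
            inp = (inp + 1) % n_inputs  # next possible 'input' (cyclically)
        B = not verify(inp)             # B=false is the loop's stable state
        if not B:
            return inp                  # stabilized on a solution
    return None                         # no correct input exists

# Toy verifier: accept x whose square ends in 76.
print(one_bit_loop(lambda x: x * x % 100 == 76, 100, start=5))   # 24
[/code]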
  8. I don't see a problem with such an architecture. It would make the calculation only once, but physics would ensure that it is the correct one. For practical reasons (the prediction) it should be extremely fast - so it doesn't even have to use a clock. For many problems, like universal 3SAT, we could build this verifier from some number of layers (O(log N)) of basic logic gates.
--------------------------------------
Physicists usually believe in CPT conservation, which means that on a small scale past and future should be symmetric. So if we perform a high-energy scattering in some accelerator, some amount of particles should also be created which travel into the past and hit e.g. some detector. From the perspective of our time perception, such a particle was created in this detector and goes straight to the scattering point - so we should be able to detect the scattering BEFORE it actually happens. Using the accelerator as a 'part' of a time-loop computer, we should then be able to close the causality loop. http://www.scienceforums.net/forum/showthread.php?p=455246
  9. I cannot agree - there is a qualitative difference. Light 'thinks' the geometry changes - that the world is curved - when it goes through different materials. If it were really geometry, X-rays would also 'think' so. People walking in parallel directions can be getting closer because of the positive curvature of the space, OR because the solutions of the field they are built of have an energetic tendency to bend that way.
  10. CPT symmetry conservation suggests that in high-energy scatterings particles should be able to choose between both light cones - future and past. We can call them virtual particles and say they have negative energy, because from our perspective of time it was the detector that emitted the particle which hit exactly the scattering point. So if we observed the matter near the scattering precisely enough, we should be able to see small changes just BEFORE the scattering - for example by using a mirror as the detector and analyzing a reflected low-energy beam. Coupling this with something which could, for example, bend the beam to make it miss the target (avoid the scattering), we would create a causality loop, which should allow for time-loop computations...? ps. Using a magnetic field we could make this 'virtual' particle travel in a (spatial) loop, which should allow us to make the time difference (for classical computation) quite large.
  11. Particles are some solutions of the field, localized in three dimensions and long in the last one. They prefer their long dimension to lie inside the local light cones, and so they don't like to turn back. But in high-energy scattering this restriction should be weakened. Shouldn't high-energy scattering produce some small number of particles with reversed temporal momentum? From the perspective of our perception of time, such particles would be produced before the scattering, probably by the matter of the detectors in the accelerator. Accelerators use extremely sensitive detectors, but they are specialized in measuring absorbed particles - they might not spot that they themselves emit a bit more particles than usual (just before the scattering). Is such an effect possible? Could we detect it? If yes, it could lead to extremely powerful computers: http://groups.google.com/group/sci.physics.foundations/browse_thread/thread/a236ada29c944ebb
----------
The first conclusion on another forum was that I'm talking about tachyons... No - I'm not talking about FTL. It may be possible for huge energies and very small scales (inflation), but from our perspective these nonperturbative solutions are rather impossible to use practically. As for tachyons - waves which could travel outside the cone of usual interactions - in my view of physics SRT doesn't forbid them; generally I think we are far from proving that they do or don't exist. I'm talking about something that shouldn't be controversial: particles produced in high-energy scattering which move inside the light cone, but in the reverse time direction. It's reversed only from their point of view - from ours they've been produced, for example, by the matter of the detectors and go straight to the scattering point. But from the causality point of view, their production by the matter of a detector is caused by the scattering in what we call the future.
  12. In this approach, instead of trying to solve the problem classically, we turn it into a continuous problem and use physics' natural ability to solve certain partial differential equations. Thanks to that, it could take a shortcut to the solution. It was pointed out to me that the problem is that gates lose their properties at high frequencies, but that may be a matter of choosing the proper technology. http://groups.google.com/group/sci.crypt/browse_thread/thread/736fd9f3e62132c2#
  13. Let's take some NP problem: we have a verifier which can quickly say whether a given input is correct, but there is a huge number of possible inputs and we want to tell whether a correct one exists (and find it). Such a problem can be, for example, finding a divisor (RSA breaking) or finding a key such that decrypting the beginning of the file with it gives significant correlations (a brute-force attack). Imagine we have a chip with 'input' and 'output' in which the following is implemented (e.g. in an FPGA):
IF 'input' verifies the problem
- THEN send 'input' to 'output'
- ELSE send the next possible 'input' (cyclically) to 'output'
such that this chip uses only basic logic gates, the computation is made in some small number of layers, and IT DOESN'T USE A CLOCK (we can do this for some NP problems, like 3SAT). Now connect its 'output' to its 'input' to make a loop. Such a loop will be stable only if it has found a solution (which can then be transferred out). If there were a clock, one input would be checked in each cycle. Without one, it is no longer a classical computer: by removing the clock/synchronization we destroy the ordering and release all its statistical, quantum properties to do what physics does best - solve its partial differential equations. The system becomes continuous, so it can follow the energy gradient to some local minimum, but the only local minima are the stable states (solutions). Every other state is extremely unstable - the electron fluid won't rest until it finds a solution. The statistics and the differences between propagation times will create an extremely fast, chaotic search for it. I'm not sure, but it could find the solution qualitatively faster than a classical computer. I know - there can be a problem with the nonlinearity of transistors. If so, there are plenty of technologies; maybe some of them could handle it? This loop-computer idea is practically a simplification of the time-loop computer idea: http://www.scienceforums.net/forum/showthread.php?p=453782
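Here is what the clocked version of this loop does, as a Python model (the clockless continuous regime is exactly what a program cannot capture); the 3SAT instance is an illustrative stand-in:
[code]
def make_verifier(clauses, n_vars):
    # Each clause is a tuple of literals: +i means variable i, -i means NOT i.
    def verify(x):                      # x encodes an assignment in n_vars bits
        bit = lambda i: (x >> (i - 1)) & 1
        return all(any(bit(v) if v > 0 else 1 - bit(-v) for v in cl)
                   for cl in clauses)
    return verify

def loop_computer(verify, n_vars, start=0):
    x = start
    for _ in range(2 ** n_vars):        # with a clock: one input per cycle
        if verify(x):
            return x                    # IF 'input' verifies: loop is stable
        x = (x + 1) % (2 ** n_vars)     # ELSE: next possible 'input'
    return None                         # unsatisfiable - never stabilizes

clauses = [(1, 2, 3), (-1, 3, 4), (2, -4, 1), (-2, -3, -4)]
print(bin(loop_computer(make_verifier(clauses, 4), 4)))   # 0b10: var 2 = true
[/code]
The conjecture above is precisely that the physical, unclocked circuit would not have to walk through the inputs one by one like this model does.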
  14. No. As I've already written: if physics can stabilize this causality loop, it will be done. If not, it will be stabilized by breaking its weakest link - by making the prediction give the wrong answer. Why this link? Because it would require an extremely precise measurement of a process which is already not energetically preferred. Creating causality paradoxes should be preferred even less, so it should be easier for physics, for example, to shorten the time of this reverse temporal propagation - especially since the rest of this causality loop is just classical computation. I think that such a spatial, purely classical loop should already have a (smaller, but still strong) tendency to stabilize. Without a clock it would be pure hydrodynamics of electrons: http://www.scienceforums.net/forum/showthread.php?t=37155
  15. Some physicists believe in the possibility of instant time travel. Let's hypothetically assume something much simpler and more plausible-looking: that the physics of the four-dimensional spacetime we live in allows for microscopic loops which include the time dimension. If they lasted at least microseconds, and we could amplify/measure them (Heisenberg's uncertainty principle...), we could send some information back in time. Observe that a computer based on such a loop could instantly find a fixed point of a given function. Take some NP problem: we can quickly check whether a given input is correct, but there is a huge (though finite) number of possible inputs. Such a computer could work as follows:
- take the input from the base of the loop,
- if it's correct, send the same input back in time to the base of the loop; if not, send the next possible input (cyclically).
If a correct input exists, it is the fixed point of this time loop; if not, the loop returns some trash - so we only need to verify the output once more at the end (outside the loop). Can such a scenario be possible? General relativity says that local time arrows are given by solutions of some equations for the boundary conditions (the big bang). CPT symmetry conservation suggests that there shouldn't be a large difference between past and future. These arguments point to the so-called eternalism/block-universe philosophical concepts - that spacetime is already somehow created and we are 'only' moving along its time dimension. I've recently made some calculations which give a new argument that this assumption actually produces quantum mechanics. Pure mathematics (maximizing uncertainty) gives a statistical property - the Boltzmann distribution - so it should be a completely universal statistics. If we use it to find the distribution on a constant-time plane, we get the stationary probability distribution [math]\rho(x)\sim e^{-V(x)}[/math]. If we use it to create statistics among paths ending at this moment, we get [math]\rho(x)\sim\psi(x)[/math] (the quantum ground state). If we use it to create statistics among paths that don't end at this moment but go further into the future, we get [math]\rho(x)\sim\psi^2(x)[/math] - as in quantum mechanics (a numerical sketch of these path statistics follows at the end of this post). So the only way to get QM-like statistical behavior is to treat particles as their paths in four-dimensional spacetime. Spacetime then looks like a four-dimensional jello - 'tension' from both past and future influences the present. http://www.scienceforums.net/forum/showthread.php?t=36034 It suggests that particles should, for example, somehow prepare before they are hit by a photon. The question is whether this can be measured (uncertainty principle). If yes - are these times long enough to be useful? Observe that if the answer is yes, such a computer could e.g. break RSA in a moment. To make cryptosystems resistant to such attacks, they should require a long initialization (like those based on Asymmetric Numeral Systems).
-----------------------------------
I thought about whether we could reduce the required number of bits transferred back in time, and it looks like one bit (B) should be enough (though this algorithm intuitively looks less stable?):
- if B, then 'input' -> next possible 'input' (cyclically)
- if 'input' verifies the problem
-- then transfer back in time B=false
-- else transfer back in time B=true.
If physics can close this loop, it should stabilize on B=false and some solution. This algorithm means using the output (B) of some physical process which can predict (microseconds ahead), for example, whether a photon will be absorbed, and at the end emitting this photon or not.
If physics can stabilize this causality loop, it should be done. If not, it would be stabilized by breaking its weakest link - by making the prediction give the wrong answer. I believe a discussion has just started here: http://groups.google.com/group/sci.physics/browse_thread/thread/c5f055c9fc1f0efb
-------------------------------------------------
To summarize: the verifier is a completely classical computer, but when it is coupled with some effect which can transfer data a few nanoseconds back in time, physics should make this couple create a stable causality loop - and that can only happen if, along the way, it solves the given NP problem (or e.g. finds a key such that the decrypted message appears to have significant correlations). If a dedicated chip were built for a given instance of the problem - making the calculation layer by layer, without a clock - it should complete the verification in nanoseconds, and such jumps are easier to imagine. This suggests a nice thought experiment: make such a loop, but much simpler - purely spatial (a tube in four dimensions). Take a chip with a verifier for some problem and the algorithm from the first post, and instead of sending the 'output' back in time, just connect it to the 'input'. Such a loop should quickly check input after input and finally stabilize if it can... Really? This scenario requires a clock, doesn't it? And what if there were no clock...? Shouldn't it find the solution practically instantly?
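The promised numerical sketch of the path statistics, in Python on a discretized 1D system (the potential, lattice and nearest-neighbour steps are my own illustrative choices, not from the paper):
[code]
import numpy as np

n = 50
x = np.linspace(-3.0, 3.0, n)
V = 0.5 * x ** 2                           # toy potential

# Symmetric transfer matrix: nearest-neighbour steps with Boltzmann weights.
M = np.zeros((n, n))
for i in range(n):
    for j in (i - 1, i, i + 1):
        if 0 <= j < n:
            M[i, j] = np.exp(-(V[i] + V[j]) / 2)

psi = np.abs(np.linalg.eigh(M)[1][:, -1])  # dominant eigenvector

# Weighted number of long path continuations from each point.
cont = np.ones(n)
for _ in range(2000):
    cont = M @ cont
    cont /= cont.max()                     # renormalize to avoid overflow

rho_end = cont / cont.sum()                      # paths ending here
rho_mid = cont * cont / (cont * cont).sum()      # paths passing through

print(np.allclose(rho_end, psi / psi.sum()),     # True: rho ~ psi
      np.allclose(rho_mid, psi ** 2 / (psi ** 2).sum()))  # True: rho ~ psi^2
[/code]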
  16. There is a problem with measuring angles - they depend on the reference frame. GR rotates the local light cones - the solutions for the waves of interaction - which makes it look as if we live in a Minkowski space. These rotations of the solutions can be caused by some field. So even when they are confirmed by observation, internal curvature won't be needed.
  17. But we can also think about GR as a flat spacetime with some interacting fields. That lets us understand how it results from microscopic physics (like photon-graviton scattering), and we avoid a huge number of philosophical questions raised by the internal-curvature interpretation. The maximum time travel this picture allows is to turn our reason-result line into the opposite time direction and, after some (negative) time, turn back. It would create a loop which cannot spoil the actual situation (like killing a grandfather) - so it suggests that the future is already somehow set: eternalism (the assumption which produces QM - see the link). Oh, I forgot to mention that such a causality loop would create a very strange topological singularity... so probably all time travels are forbidden...?
  18. So we don't need any mystical internal curvature... SRT can be derived from the assumption that light travels with a given constant speed. Gravitational waves have the same one. From the point of view of spacetime, this speed fixes the angle (45 deg) of the solutions for the waves which carry (probably?) all interactions. The only difference in GRT is that these solutions - the interaction/light cones - have changed their directions.
  19. Can't GR be viewed as the result of such graviton-graviton scattering?
  20. I was recently told ( http://groups.google.pl/group/sci.physics.foundations/browse_thread/thread/e6e26b84d19a17ff# ) that there is a quite new Relativistic Theory of Gravity (RTG) by the Russian physicist Logunov, which explains GR without internal curvature. It uses the speeds of clocks. But any clock (mechanical, atomic, biological...) is based on some chain of reason-result relations. These relations are made by some interactions, transferred by some waves... So the speed of a clock can be translated into a wave propagation speed. I have a question: does a strong (electro)magnetic field bend light? In field theories like QED we add some nonlinear terms (like phi^4) to introduce interactions between different frequencies... On the other hand, electromagnetic interaction has some similarities to gravitational interaction... Have you heard about such experiments or calculations? A dedicated experiment should find it, and so using different EM fields we could tell a lot about the fundamental details of QED... (and maybe GR...)
  21. A two-dimensional manifold with positive constant internal curvature should create a sphere... (somewhere...) If spacetime is really immersed somewhere, then when it should intersect with itself, it could probably change topology instead (like going through a critical point in Morse theory). I think that's the concept behind time travels/wormholes(?).
  22. OK, you are right... let's call it an immersion... So what do you think about the possibility of instant time/space travel?
  23. So maybe it only looks as if it were the result of a curvature? The Einstein-Hilbert equations connect internal curvature with energy/momentum, but how is this connection made physically? Until we understand it, these equations aren't an argument for internal curvature. Especially since it can be an effect analogous to the minimal-optical-path principle: because of interference with the responses of local atoms, light travels along a geodesic of the metric tensor 'refractive index' * 'identity matrix'. In GR, such a refractive index has to be four-dimensional and usually anisotropic (different wave speeds in different directions). It could be created, for example, by interactions being transferred by some waves of the field, which form some local structure of the field - small differences between being in different phases could cause large interactions to make the interference needed to change the propagation speed/direction of other waves. If GR is really the result of internal curvature and spacetime is not embedded anywhere, what would happen when it looks as if it should intersect with itself?
  24. When light goes through different materials, it chooses the path which locally minimizes the optical distance - its trajectory is a geodesic of some metric (usually diagonal - isotropic). This is the result of the microscopic structure of the material reducing the wave propagation speed. Microscopic models of physics usually assume that we have some field everywhere, and its fluctuations transfer interactions/energy/momentum. So maybe this microscopic structure can reduce wave propagation speeds? The reciprocals of these velocities create an (anisotropic) metric tensor (g), and so, for example, particles travel along geodesics as in general relativity (see the formula after this post). The standard interpretation of general relativity says that particles follow geodesics because of the internal curvature of spacetime: theory and experiments suggest some equations which look like the result of internal curvature. But if we live on some curved manifold, it intuitively should be embedded somewhere(?) (with, for example, black holes as some spikes). So why is our energy/matter imprisoned on something infinitely flat? Why don't we interact with the rest of this something? What happens if our manifold intersects with itself? (Optimists say it would allow for time travel/hyperspace jumps?...) And the less philosophical but most fundamental question (to connect GR and QM) is: how can energy/momentum density create curvature? Maybe it's not the only possible interpretation. Maybe we live in flat R^4 and GR is only the result of the microscopic structure of some field reducing the speed of wave propagation - a field which somehow happens to be ruled by equations similar to Einstein-Hilbert. This interpretation doesn't allow for instant time/space travel, but it gets rid of some inconvenient questions... and creates a chance to answer the last one. So how should such a connection of QM (QFT?) and GR look? What are particles? From the spacetime point of view, they are solutions of, let's say, some field equations, localized in three dimensions and relatively long in the last one. These solutions want to be more or less straight in four dimensions (constant velocities), but they turn according to the interactions transferred by the field. Many of them were created in the big bang (boundary conditions), so their long dimensions are similarly directed - creating the local time arrow (GR). A Boltzmann distribution among such trajectories can, purely classically, create QM-like statistical behavior ( http://www.scienceforums.net/forum/showthread.php?t=36034 ). Are there any arguments for the internal curvature of spacetime other than that the equations look like they result from it? What do you think about these interpretations? If curvature is the only option, is spacetime embedded somewhere...?
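The optical analogy can be written explicitly: for a medium with refractive index n (the reciprocal of the wave velocity, up to the constant c), Fermat's principle of minimal optical path is exactly the geodesic condition for the isotropic metric mentioned above:
[math]\delta\int n\,ds=0\quad\Leftrightarrow\quad\delta\int\sqrt{g_{ij}\,dx^i\,dx^j}=0,\qquad g_{ij}=n^2\,\delta_{ij}[/math]
so an anisotropic slowdown of propagation would analogously give a general, non-diagonal g.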
  25. First of all, it turned out that this way of thinking - that we are moving along the time dimension of some already created spacetime - is known and called eternalism/block universe. Its main arguments are based on general relativity, but also on a problem with CPT conservation and wave function collapse... The fact that a Boltzmann distribution among paths gives statistical behavior similar to that known from QM suggests even more: that QM is just the result of such a structure of spacetime ...and that wave function collapse is, for example, the time-reversed splitting of a particle (the splitting that let it go through two slits). This simple statistical physics among trajectories gives behavior similar to QM, but still qualitatively different - particles leave an excited state exponentially instead of in quick jumps, producing a photon for energy conservation. I think this difference arises because we assumed that at a given time the particle is at a given point, whereas in fact it is rather a density spread around this point. If instead of considering a single trajectory for a particle we take some density of trajectories, with some force which wants to hold them together, the particle's density, instead of slowly leaking, should wait for a moment and then quickly jump to the lower state as a whole. This model should be equivalent to a simpler one - a trajectory in the space of densities (instead of a density of trajectories). But I don't see how to explain the production of the photon - maybe it will occur as an artifact, or maybe energy conservation has to be added somehow artificially? The question is what holds them together to form exactly one whole particle - not more, not less? A rather similar question is why charges/quantum numbers come in integer multiples. I'll briefly present my intuitions about the deeper physics. The first is that the answer to these questions is that particles are some topological singularities of the field. That explains the spontaneous creation/annihilation of a pair, and that such a pair should have smaller energy when closer together - creating an attractive force. The qualitative difference between weak and strong interactions could be due to the topological difference between SU(2) and SU(3). So a particle would be some relatively stable state of the field (in which, for example, the spin has a spatial direction). It would have some energy, which should correspond to the mass of the particle. The energy/singularity densities somehow create the spacetime curvature...? Now, if particles are not just points, the field they consist of fluctuates - it still has some degrees of freedom (some vibrations). I think that quantum entanglement is just the result of these degrees of freedom - when particles interact, they synchronize the fluctuations of their fields. But these degrees of freedom are very sensitive - prone to decoherence... ps. If someone is interested in the inequality for the dominant eigenvalue ([math]\lambda[/math]) of a real symmetric matrix ([math]M[/math]) with nonnegative terms from the paper: [math]\ln(\lambda)\geq\frac{\sum_i k_i \ln(k_i)}{\sum_i k_i}[/math] where [math]k_i = \sum_j M_{ij}[/math], I've started a separate thread: http://www.scienceforums.net/forum/showthread.php?t=36717
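A quick numerical sanity check of this inequality in Python (a sketch of my own, not from the paper):
[code]
import numpy as np

rng = np.random.default_rng(0)
for _ in range(10000):
    A = rng.random((5, 5))
    M = (A + A.T) / 2                  # real symmetric, nonnegative terms
    lam = np.linalg.eigvalsh(M)[-1]    # dominant eigenvalue
    k = M.sum(axis=1)                  # row sums k_i
    assert np.log(lam) >= (k * np.log(k)).sum() / k.sum() - 1e-12
print("inequality held in all 10000 trials")
[/code]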
----------------------------------------------------------------------
Simplifying the picture: field theory says that every point of spacetime has some value, for example from U(1)*SU(2)*SU(3). This field doesn't have anything like a zero value, so the vacuum must be some nontrivial state, and intuitively it should be more or less constant in space. But the field can fluctuate around this vacuum state - these fluctuations should carry all interactions. It also allows for some nontrivial, spatially localized, relatively stable states - particles. They should be topological singularities (like a left/right swirl); another argument is that if they weren't, they could continuously drop to the vacuum state, which is smoother - has smaller energy - so they wouldn't be stable. Sometimes the fluctuations of the field exceed some critical value and spontaneously create a particle/antiparticle pair. Observe that the value around which the vacuum fluctuates should have a huge influence on choosing the stable states for particles. Maybe this value is even the reason for the weak/strong interaction separation (at high energies this separation should weaken). It could also be the reason for the matter/antimatter asymmetry... The problem with this picture is that it looks like the singularities could have infinite energy (I'm not sure that's necessary?). If so, the problem could be that the Lagrangian is too simple? The other question is whether field theory is really the lowest level - maybe it's the result of some lower structure...?
-------
I was thinking about how energy/singularities could create spacetime curvature... Let's think about what time is. It's usually described in the language of reason-result chains, which can run at different speeds in different frames of reference. These reason-result chains are microscopically the results of some (four-dimensional) wave propagations. But remember that, for example, the speed of light depends on the material... the wave propagation speed depends on the microscopic structure of the field it is going through. This field should be able to influence both time and spatial dimensions - slow waves down from the speed of light. In this picture spacetime is not some curved 4D manifold embedded in some multidimensional something, but just flat - the local microscopic structure of the field specifically slows down some time/space waves of the field. ...and, for example, we cannot escape a black hole horizon because the microscopic structure of the field (created by the gravity) won't allow any wave to propagate outward. This picture also doesn't allow for hyperspace jumps/time travels...
--------
Finally, the most controversial part - the whole picture... Imagine a particle, for example one created in the big bang. This stable state of the field in 4D is well localized in three of the dimensions and very long in the last one... most of those created in the big bang should choose these directions in a similar way... choosing some general (average...) direction for time (the long one) and space (the localized ones)... These trajectories entangle somehow in the spacetime... sometimes, for example because of the field created by some gravity, they change their (time) direction a bit - observed as general relativity... ...and their statistics create (purely Boltzmannian - without magic tricks like the Wick rotation) quantum-mechanics-like behaviour... What do you think about this picture? Is internal curvature really necessary, or maybe it's only an illusion (just as light 'thinks' the geometry changes when it changes material)? Are the rules for the time dimension really special? ...is time imaginary? Or is it maybe only a result of some solutions to these rules, especially due to the boundary conditions (big bang)...?