Duda Jarek
Everything posted by Duda Jarek

Any comments about Gryzinski freefall atomic model?
Duda Jarek replied to Duda Jarek's topic in Speculations
Bignose, so let's look at Gryziński's lectures - for example in http://www.cyf.gov.pl/gryzinski/teor7ang.html both calculations for the hydrogen molecule are compared. The quantum calculations (H. Haken, H. Ch. Wolf, Molec. Phys. and Elem. of Quant. Chem., pp. 45-51, Springer-Verlag, Berlin, Heidelberg, 1995) require heuristic modifications of the standard approach, introducing successive fitted artificial coefficients to get agreement with experiment ... comparable with what is obtained straightforwardly using just the Coulomb and Lorentz force ... how would you comment on that? And generally, look at the first posts - it's not about saying that only one of them is true! We know that QM implies the classical picture (e.g. the Ehrenfest theorem) ... it's about seeing them as equivalent - as just different pictures of the same thing. Like we can see the evolution of coupled pendulums through their positions (classical picture), or through their normal modes: eigenstates of the evolution operator - in this eigenbase the evolution is literally a 'superposition of rotations of coordinates' - unitary (quantum picture). I've heard many things here, but still no concrete counterarguments - so I ask again: why do you believe that these pictures cannot be just equivalent? That quantum orbitals are not just a simple mathematical representation of some stable dynamical state (like in the nuclear shell model) - that these probability densities cannot be sharpened - become governed by physics: deterministic, seen as made of concrete trajectories? What about the corpuscular nature of particles? The extremely small size of the electron as a particle? Seeing the double-slit or Stern-Gerlach experiment through concrete trajectories governed by the Coulomb and Lorentz laws? DJBruce, his classical scattering theory was widely used (around 450 citations) and I haven't seen negative comments about it (?). His atomic models are its natural consequence - just successive scatterings from nearby - and the only negative comment I've found is this enigmatic 'unsatisfactory'. 
One of the goals of this thread was to understand this situation - why are these understandable, non-controversial, finally working modern successors of the well-known Bohr model just unknown and not developed further? Doesn't this 'discussion' itself suggest that it's rather a sociological problem? Please give me one concrete argument why they cannot be just equivalent? 
Any comments about Gryzinski freefall atomic model?
Duda Jarek replied to Duda Jarek's topic in Speculations
DJBruce, you've confused me, so I had to look at the dictionary once again: http://www.thefreedictionary.com/pseudoscience pseudoscience - "A theory, methodology, or practice that is considered to be without scientific foundation." Which of: charge, the Coulomb law, magnetic moment, the Lorentz law, Lagrangian mechanics ... are considered to be without scientific foundation? If you want, here is a nicer set: out-of-physics collapses, splitting universes, infinite vacuum energy density creating virtual particles, infinite masses not always magically renormalizable, momentum chosen randomly (conservation?), almost pointwise particles being probability clouds, a cat simultaneously dead and alive, only subjective physics of different observers, nonlocality, indeterminism, a dozen basic interpretations, each of which everybody misunderstands in their own way ... And do you imagine a comprehensive search like the one in e.g. figs. 2, 3 from the link being done by a few people without modern computers? swansont, do you mean the Science article? http://www.sciencemag.org/cgi/content/abstract/328/5986/1658 So is the quantum mechanical description of the dynamics of photoemission satisfactory? Does a theory being mainstream mean that it's satisfactory? What is untested about the Coulomb law, the Lorentz law and Lagrangian mechanics? 
Any comments about Gryzinski freefall atomic model?
Duda Jarek replied to Duda Jarek's topic in Speculations
Great - so an extremely scientific description, 'unsatisfactory', in one paper means that understandable, non-controversial physics has to land in pseudoscience? As I've read in Gryzinski's lectures, he later improved his original models from 1965: http://www.cyf.gov.pl/gryzinski/teor5ang.html These calculations just required computers, and they are probably still simplified - there is room for improvement ... when I was young I believed that science is about expanding what we do understand ... By the way: do you find the quantum mechanical description of the dynamics behind wavefunction collapse, like photoemission, satisfactory? 
Any comments about Gryzinski freefall atomic model?
Duda Jarek replied to Duda Jarek's topic in Speculations
Ok, let's look at this link. 'Objective criticism' - did I miss it? 'No maths' - it's about approved and verified mathematical consequences, results of simulations of the Coulomb and Lorentz laws ...? 'Incomprehensible' - these laws and the way of searching for their consequences can be found in elementary books ...? 'You are contradicting accepted science' - doesn't glorious quantum mechanics say that particles also have a corpuscular nature? Particle physics that electrons are extremely small? It's exactly about their trajectories ...? 'No evidence' - these peer-reviewed papers show good agreement with many different kinds of experiments ...? 'No physical basis' - I would understand saying that the QM founders didn't have the experimental basis to know that wavefunction collapse isn't simultaneous, so they assumed it as a practical idealization ... but the models from this topic are more recent and didn't need any such speculations - their basis is just the Coulomb and Lorentz law ...? 'Obvious errors' - I thought I'd explained the problem with angular momentum using a cat (it can even be Schroedinger's) ... is there something more? 'It's not science' - they don't try to convince anyone that each wavefunction collapse, like photoemission, is an 'out of physics' or universe-splitting phenomenon, but just the opposite - they show that we don't need any new exciting explanations, that we can even try to look at the internal dynamics of such processes using standard, non-controversial physics ...? 'I have evidence, from this book and article' - there are some peer-reviewed papers from the best journals listed in the Wikipedia article I've linked, and more of them can easily be found ...? I have to admit that I still don't understand why it was moved. 
Any comments about Gryzinski freefall atomic model?
Duda Jarek replied to Duda Jarek's topic in Speculations
What is speculative about it? The Coulomb law? The Lorentz law? Their natural consequences, verified and approved by many world-class reviewers? That a cat can turn in the air with zero initial angular momentum? These new optical measurements from Science showing that the quantum SPECULATION that collapses are instantaneous is just wrong ...? http://www.sciencemag.org/cgi/content/abstract/328/5986/1658 Please, anybody, explain why it was moved? Speculation: 'opinion/reasoning based on incomplete information: a conclusion, theory, or opinion based on incomplete facts or information' (from http://encarta.msn.com/dictionary_1861711484/speculation.html ) 
Any comments about Gryzinski freefall atomic model?
Duda Jarek replied to Duda Jarek's topic in Speculations
Quantum mechanics says that during photoemission the electron is one probability cloud, then there is some mystical phenomenon and one of the new probability clouds is instantaneously(?) chosen (by what? out-of-physics supernatural forces? the universe splitting into parallel ones?) ... We are finally reaching the measurement precision to see that it isn't really instantaneous: http://www.sciencedaily.com/releases/2010/06/100630110910.htm So maybe there is some internal dynamics behind it - QM isn't the fundamental theory, but only a practical idealization, and so we can sharpen its probabilistic picture ... like imagining a concrete electron trajectory behind it, for an electron which particle physics believes to be extremely small ... The Heisenberg uncertainty principle restricts measurement capabilities - does it say that the picture is also blurred for physics itself - for the internal dynamics? That we cannot model it - imagine what's going on behind the curtain? Even in the double-slit or Stern-Gerlach experiment we imagine concrete trajectories ... aren't they also governed by the Coulomb and Lorentz laws? Why can't we imagine them behind the probability cloud of orbitals? 
Experimental evidence started in the 80s or earlier: http://www.sciencedirect.com/science?_ob=MiamiCaptionURL&_method=retrieve&_udi=B6TJ14WDGCNK1&_image=fig1&_ba=1&_user=4420&_coverDate=08%2F31%2F2009&_alid=1441903185&_rdoc=1&_fmt=full&_orig=search&_cdi=5297&_issn=09276505&_pii=S092765050900084X&view=c&_acct=C000059607&_version=1&_urlVersion=0&_userid=4420&md5=3d9cea58f6a7c0f05dcf360a29dabe08 and there is much more of it, for different isotopes and with extensive statistical analysis ... in many papers accepted by the reviewers of good journals. Do you know of peer-reviewed papers claiming the opposite result? What are they showing? That the mathematical idealization - a constant decay time of the nucleus - holds only approximately ... Why can we be sure that this coefficient from the Poisson theorem (extremely many extremely rare chances) is really constant? Shouldn't we really understand nuclear/particle physics before making such claims? Do we? Here is more discussion: http://www.symmetrymagazine.org/breaking/2010/08/23/thestrangecaseofsolarflaresandradioactiveelements/

Extremely important news: http://news.stanford.edu/news/2010/august/sun082310.html Decay times depend on solar activity! It's another argument that we shouldn't look at nuclei from the blurred, fluctuating quantum picture ( http://www.physorg.com/news199711994.html ), but rather as a concrete spatial structure near a (local?) energy minimum (a so-called soliton). To get it out of this local minimum, energy is needed - many orders of magnitude more than in chemistry. The standard assumption - the Boltzmann distribution - suggests that rarely, but really, it can spontaneously gather a huge amount of energy ... but maybe that's only an idealization, chemistry can have some limits ... and so we should search for another source of this energy ... like neutrinos! Consequences? Besides the need to reconsider datings and planetary models ... Look at hypothetical proton decay - required by particle models like supersymmetry, useful to explain the nonzero baryonic number of our universe ... but not observed in huge water tanks - maybe the energy required to get the proton out of its extremely deep potential well is just larger than what is accessible through chemistry or solar neutrinos? A safer place to search for it would be extreme temperatures like the core of a neutron star - such decay would be Nature's failsafe preventing infinite matter densities - matter would change into energy earlier ... it could also help to explain beyond-GZK cosmic rays ...

Any comments about Gryzinski freefall atomic model?
Duda Jarek replied to Duda Jarek's topic in Speculations
Would humanity have gotten to this point by developing mythology/metaphysics - because explaining lightning using Zeus worked? Or maybe we've gotten here because people slowly and consistently expanded what was understandable? No connection? Really? While particle physicists know well that particles are extremely small, in QM they are just blurred clouds, and then there is a mystical "zaaaaap" and voila: there appears (chosen by what? 'out of physics': supernatural forces? the universe splitting into parallel ones?) one of the new blurred clouds ... and by definition we just cannot improve that picture - imagine what's happening behind the curtains ... why? Heisenberg's commandment? But the uncertainty principle only gives restrictions for measurements, which unavoidably influence the system ... where does it say that the internal dynamics is also uncertain about itself? That the picture is blurred not only for the observer but also for physics? So what happens when someone shows that we shouldn't give up so early - that we can sharpen this blurred picture, see concrete motion behind it, think about finding the concrete dynamics of these 'modern lightnings', expand what we do understand ...? Does 'crashing against a wall of fanatics who believed that their glorious inconceivable understanding of nature by definition cannot be improved' sound familiar from history? These models are not meant to deny QM, but to explain it by sharpening its blurry picture - so that we can not only experience its mysterium, but maybe finally understand it - not as artificially introduced, but as naturally emerging. 
Any comments about Gryzinski freefall atomic model?
Duda Jarek replied to Duda Jarek's topic in Speculations
Angular momentum? But when the electron is at the farthest position, it just stops: it has zero angular momentum - the situation here is much more complicated ... For example, when a cat falls down upside down - even though it looks like a violation of angular momentum conservation, the cat is able to rotate ... http://en.wikipedia.org/wiki/Cat_righting_reflex It can be modeled by two cylinders ... we have a similar 'problem' in sports, especially in dives ... The point is that angular momentum conservation doesn't really forbid complex objects with zero angular momentum from rotating: like a cat, a diver ... or a proton-electron system. Anyway, however we name the different properties of a system, what matters is the real agreement with experiment - and that is what the author showed in his many papers ... 
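A toy version of the two-cylinder idea (all numbers below are hypothetical, not from the cat-righting literature): keep the total angular momentum exactly zero at every instant, but change the segments' moments of inertia between two twisting phases, and the body ends up rotated.

```python
# Two coupled rotors; total angular momentum I1*w1 + I2*w2 is held at zero,
# yet a net rotation appears when the inertias change between twists.
def twist(th1, th2, I1, I2, dphi):
    # a relative twist dphi, split so that I1*dth1 + I2*dth2 = 0
    return th1 + dphi * I2 / (I1 + I2), th2 - dphi * I1 / (I1 + I2)

th1 = th2 = 0.0
th1, th2 = twist(th1, th2, I1=1.0, I2=10.0, dphi=1.0)    # twist one way ...
th1, th2 = twist(th1, th2, I1=10.0, I2=1.0, dphi=-1.0)   # ... untwist, inertias swapped
print(th1, th2)   # equal and nonzero: the whole body has turned
```

Each `twist` conserves angular momentum exactly, yet after the untwist both segments sit at the same nonzero angle - the zero-angular-momentum rotation the post describes.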
Any comments about Gryzinski freefall atomic model?
Duda Jarek replied to Duda Jarek's topic in Speculations
Great discussion ... like talking to a wall ... who is saying that QM doesn't describe nature? We generally know that natural, understandable field theories (like GRT, EM ... but also Klein-Gordon, QFT) also quite often agree with experiment ... and these many papers from the best journals strongly suggest that the classical picture of atoms can give much better agreement than is generally believed ... so maybe they are just two different views of the same thing? I see it's not about arguments - you just have a deep internal faith that the common classical picture couldn't be equivalent to glorious, inconceivable quantum mechanics ... I've given some of my arguments ... For both quantum mechanics and (linearized) field theories, the basic evolution is unitary - untrue? But QM additionally has decoherence ... which in the modern view is believed not to be outside the unitary picture, but a thermodynamical consequence of interaction with the environment - untrue? Classical thermodynamics - that when we cannot trace a particle, we should assume a Boltzmann distribution among possible trajectories - leads to settling into the lowest Hamiltonian eigenfunction (with nonzero projection) - untrue? It explains decoherence and, for example, makes stable orbits under stochastic perturbation shift toward the nearest quantum state ... Do you have at least one CONCRETE ARGUMENT to support your faith? Why can't these pictures just be equivalent? If not, I'm still counting on some concrete comments on topic ... 
Any comments about Gryzinski freefall atomic model?
Duda Jarek replied to Duda Jarek's topic in Speculations
I haven't looked at these models closely yet, but the number of reviewers of world-class journals who verified and approved these papers - showing good agreement with different types of experiments (e.g. different scatterings, energy levels, magnetic properties, ...) of what the consequences of the Coulomb and Lorentz force really are - suggests that a simple counterargument won't 'prove' that they are just wrong ... To do that, science needs e.g. further simulations and comparisons with experiments ... that's why I'm asking if someone has experience with them ...? I'm not saying that nature has to be understandable, but rather that there is a dangerous social phenomenon well known from history: when people believe that they cannot understand something, they for example introduce Zeus to explain lightning ... having such an 'explanation' suppresses further search ... I believe scientists should be very careful about such giving up in difficult situations - it's a kind of accepting 'intelligent design' ... do you disagree? So before giving up on understanding, we should for example take what we do understand to its real limits of applicability. In my opinion, a situation in which the general belief of the scientific community is that the history of understandable models of atoms ended almost a century ago ... while there are in fact practically unknown, much better modern models ... is just sick. As a scientist, I believe that to understand something 'inconceivable', the basic approach is to really deeply understand the consequences of what we do understand - the essence of the 'inconceivability' has to be hidden in whatever this fully exhausted picture is still missing. The modern history of physics shows that this basic approach has not only been neglected ... it even seems to have been silenced ... and it's not the obsolete Bohr model people should be taught first! So do you believe we should just give up trying to understand the inconceivable dogmas of QM? 
If not, isn't a really deep understanding of the full consequences of what we can really be sure of (like the Coulomb and Lorentz force) the basic approach? 
Any comments about Gryzinski freefall atomic model?
Duda Jarek replied to Duda Jarek's topic in Speculations
The problem with QM is that Feynman's words, "I think I can safely say that nobody understands quantum mechanics", probably still apply - don't you see a problem in the foundation of our understanding being inconceivable? On the other side we have classical mechanics - intuitive, natural, without any controversies - don't you think that to finally understand QM it should be helpful to see what using just the classical Coulomb and Lorentz laws really leads to - take them to their limits of applicability and look closely to see what is still missing for 'full QM'? Especially since the Bohr model is still used to understand and calculate different phenomena - the Bohr radius is in common use ... So I think it should be useful to at least be aware that the history doesn't end with Bohr and Sommerfeld, as is generally believed, but that there are also modern classical models, which have been shown in a well peer-reviewed way to give much better agreement with experimental results - sometimes even better than QM (as in http://www.cyf.gov.pl/gryzinski/teor7ang.html ). In the 'classical' picture, electrons are localized objects, like in these photos of atoms: http://www.mizozo.com/tech/09/2009/15/firstpictureofanatom.html - we can measure where exactly single electrons were before being torn off. A classically point-like electron starts moving on some trajectory, thermodynamically stabilizing its own statistics (through some complicated deterministic motion) to the expected probability density (maximizing entropy), and is finally torn off by the potential - a natural thermodynamical model: a Boltzmann distribution among possible trajectories says that this stabilized probability density (time average) is exactly the same as for the lowest quantum state (similar to Feynman path integrals). 
Brownian motion is a good enough approximation of this finally mathematically correct thermodynamical model - it works for diffusion in liquids, but is no longer sufficient for the fixed structure of defects in solids: http://physicsworld.com/cws/article/news/41659 We can look at coupled pendulums through their positions (classical picture), but also through their normal modes - their evolution is a 'superposition of rotations of phases' in this eigenbase of the evolution operator (quantum picture). Taking a lattice of such pendulums, we get a crystal with phonons. Now take the infinitesimal limit - we get a field theory, like waves on water, GRT, EM, Klein-Gordon, QFT. If we go to the 'normal modes' - the eigenbase of the evolution differential operator - we get a 'superposition of rotations' - the quantum picture - like the interference of 'classical' waves on water. It's because these PDEs are hyperbolic - 'wave-like' - that in all these theories the basic excitations are waves. We can see classical mechanics as a result of quantum mechanics (as in the Ehrenfest theorem) - maybe it also works the opposite way - maybe they are just equivalent? These modern classical atomic models go some of the way toward seeing QM no longer as an inconceivable dogmatic theory, but for example as naturally emerging in mathematically clear and natural field theories. 
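The pendulum picture above can be sketched numerically (the gravity and coupling constants are arbitrary illustrative choices): diagonalizing the coupling matrix gives the normal modes, and in each normal coordinate the evolution is a pure phase rotation.

```python
import numpy as np

# Two identical pendulums coupled by a spring, x'' = -K x,
# with hypothetical w0^2 = 1 (gravity) and k = 0.25 (coupling).
w02, k = 1.0, 0.25
K = np.array([[w02 + k, -k],
              [-k, w02 + k]])
lam, V = np.linalg.eigh(K)   # normal modes: eigenbase of the evolution operator
print(lam)                    # squared mode frequencies: w0^2 and w0^2 + 2k
# In each normal coordinate q_i the motion is a rotation in phase space:
# q_i(t) = q_i(0)*cos(sqrt(lam[i])*t) + (q_i'(0)/sqrt(lam[i]))*sin(sqrt(lam[i])*t)
```

The in-phase mode oscillates at the bare pendulum frequency; the out-of-phase mode is stiffened by the spring - the 'superposition of rotations' the post describes.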
I've recently found that after the Bohr model, Gryzinski introduced a classical model in which electrons follow almost radial free-fall trajectories toward the nucleus; due to magnetic moments, the trajectory is bent by the Lorentz force and the electron returns to its initial distance. This model is a natural consequence of the classical scattering theory developed by the author. In almost 20 peer-reviewed papers in the best journals he claims to show that these models, using just the Coulomb and Lorentz force, give really good agreement with experiment (unlike Bohr's). These conceptually simple calculations were verified and approved by many world-class reviewers, so one could think that such impressive models should be well known ... but surprisingly I cannot even find any constructive comments about them? http://en.wikipedia.org/wiki/Freefall_atomic_model I'm very interested in finding some serious comments about these modern classical models that finally agree with experiment. Have you even heard about them? About someone working on them?

What if we make CPT transformation of free electron laser?
Duda Jarek replied to Duda Jarek's topic in Physics
Sounds similar? http://www.sciencenews.org/view/generic/id/61673/title/Behold%2C_the_antilaser "(...) cause the device to start and then stop absorbing light" ... 
A standard random walk on a graph - one where from every point each outgoing edge is equally probable - doesn't really maximize entropy as mathematics expects from thermodynamical models; it does so only locally. Such models lead to Brownian motion in the continuous limit, which is a good enough approximation for diffusion in fluids, but is no longer appropriate within the fixed structure of solids, like the recently measured stationary electron probability density on a defected lattice of potential wells on a semiconductor surface: http://physicsworld.com/cws/article/news/41659 We would rather say that this probability density is the quantum mechanical ground state ... but this sample is macroscopic, so we should expect some current flow behind it - some thermodynamical behavior of these 'quanta of charge'. It turns out that when we use a stochastic model which finally does what mathematics expects of us - really maximizes entropy - the stationary probability density goes exactly to the quantum mechanical ground state, as we would thermodynamically expect from quantum mechanics. So maybe starting from such models we could better understand the dynamics of current flow at the quantum scale ... I've just made a Mathematica demonstration which allows comparing electron conductance through a defected lattice using both models - the standard Generic Random Walk (classical) and these new models based on the Maximal Entropy Random Walk. It allows observing both the stationary probability distribution and the dynamics of current flow for different defect densities and applied potential gradients: http://demonstrations.wolfram.com/preview.html?draft/93373/000008/ElectronConductanceModelUsingMaximalEntropyRandomWalk or https://docs.google.com/leaf?id=0B7ppK4IyMhisMTRiNGZjYWItMDU0NS00OTFjLTg0NmQtOWE4ZTg5ZTkzMTJk&hl=en They give completely different qualitative pictures - I would like to ask which of them better corresponds to conductance at the quantum level? 
For example, in the standard model, for even the smallest applied potential we immediately get almost uniform current flow through the whole sample, while in these new models we usually require some nonzero minimal potential gradient to 'soak' out of entropy wells through a complicated entropic landscape. And generally I would be grateful for any remarks and comments about the demonstration.
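A minimal numeric sketch of the GRW/MERW contrast (the lattice size and defect positions are hypothetical, not taken from the demonstration): the GRW stationary density is proportional to node degree, while the MERW stationary density is the squared dominant eigenvector of the adjacency matrix - the ground-state density the post refers to.

```python
import numpy as np

# Path graph of N sites; 'defect' sites are disconnected entirely.
N = 40
A = np.zeros((N, N))
for i in range(N - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
for d in (10, 11, 29):        # hypothetical defect positions
    A[d, :] = 0.0
    A[:, d] = 0.0

# GRW (generic random walk): stationary probability ~ node degree
deg = A.sum(axis=1)
grw = deg / deg.sum()

# MERW: stationary probability is the SQUARED dominant eigenvector of A
evals, vecs = np.linalg.eigh(A)
psi = vecs[:, -1]
merw = psi**2                  # eigenvector is normalized, so this sums to 1

print(grw[:10].sum(), merw[12:29].sum())   # GRW leaks everywhere, MERW localizes
```

GRW spreads its mass over every connected segment, while MERW puts essentially all of it in the largest defect-free region - the 'entropic well' behavior described above.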

Can thermodynamical model be fundamental: reason not result?
Duda Jarek replied to Duda Jarek's topic in Physics
In the only models I can think of as theories of everything (the reason) - deterministic ones - at each point of spacetime there is some single situation - there is no point in talking about probabilities and hence entropy. While building some THERMODYNAMICAL MODEL OVER THIS SOLUTION, at each point we usually consider a ball and average over it, getting some effective local parameters like entropy or temperature - for our history of the Universe, this allows assigning to each point of spacetime a local entropy - and the 2nd law of thermodynamics says that we move along the four-dimensional gradient of this entropy - so there had to be an entropy minimum at our Big Bang (or maybe Bounce), and it probably created the entropy gradient giving the 2nd law of thermodynamics. Quantum mechanics by definition ignores the dynamics behind wavefunction collapse and just gives the probability distribution of its result - like thermodynamical models ... which some people interpret as spacetime being an infinitely quickly branching tree of parallel universes ... I completely disagree - the four-dimensional nature of our spacetime already leads to much non-intuitiveness, like the (confirmed) Wheeler experiment, or that to translate the amplitudes we work with into real probability we should square them, against Bell's intuition, or that it allows for powerful 'quantum' computers: http://www.scienceforums.net/forum/showthread.php?p=569143 
I always thought that thermodynamics/statistical physics is an effective theory - a statistical result of some fundamental physics below - but recently theories starting from an 'entropic force' as fundamental have become popular (based on holographic scenarios, as in http://arxiv.org/abs/1001.0785 ). I was taught that to introduce effective local thermodynamical parameters for a given concrete situation, for each point we average inside some ball around it to get e.g. local entropy or temperature. For a simple mathematician like me this sounds like nonsense - in a fundamental theory describing the evolution of everything there should be one concrete history of our universe - there is no place for the direct probabilities of scenarios required to define e.g. entropy. So I wanted to ask if someone could explain why we can even think about fundamental 'entropic' theories? To start the discussion, let me briefly recall what looks to me like a clear distinction between deterministic and stochastic/thermodynamical models.
DETERMINISTIC models - the future is completely determined:
- the evolution of gas in a tank is the full dynamics of all its particles - for a given valve opening, a concrete number of particles escapes,
- it's usually Lagrangian mechanics of some field - there is some scalar/vector/tensor/'behavior of functional' (QFT) at each point of our spacetime, such that 'the action is optimized' - each point is in equilibrium with its four-dimensional neighborhood (spacetime is a kind of '4D jello'),
- the evolution equations (Euler-Lagrange) are HYPERBOLIC PDEs - the linearized behavior of coordinates in the eigenbase of the differential operator is d_tt x = -lambda x (0 < lambda = omega^2), so in linear approximation we have a superposition of rotations of coordinates - 'unitary' evolution - and so such PDEs are called wave-like: the basic excitations on a water surface, in EM, GR, Klein-Gordon are just waves,
- the model has FULL INFORMATION - there is no place for direct probability/entropy in electromagnetism, general relativity, KG etc.,
- the model has some TIME (CPT) SYMMETRY INVARIANCE (no 2nd law of thermodynamics - there is still unitary evolution in a thermalized gas or a black hole).
THERMODYNAMICAL/STOCHASTIC models - there is some probability distribution among possible futures:
- the gas in a tank is usually seen as thermalized, which allows describing it with a few statistical parameters like entropy (sum of -p*lg(p)) or temperature (average energy per degree of freedom) - for a specific valve opening, the number of escaped particles is given only by a probability distribution,
- they are used when we don't have full information or want to simplify the picture - so we assume some mathematically universal STATISTICAL ENSEMBLE among POSSIBLE SCENARIOS (like particle arrangements) - maximizing entropy (uniform distribution) or free energy (Boltzmann distribution),
- thermodynamical/stochastic evolution is usually described by diffusion-like PARABOLIC PDEs - the linearized behavior of coordinates in the eigenbase of the differential operator is d_t x = -x/tau (tau - 'mean lifetime'), so in linear approximation we have exponential decay (forgetting) of coordinates - this evolution is called thermalization: in the limit only the modes with the largest tau survive - we call it thermodynamical equilibrium, and it can usually be described using just a few parameters,
- these models don't have time symmetry - we cannot fully trace the (unitary?) behavior, so we have INFORMATION LOSS - entropy growth - the 2nd law of thermodynamics.
Where am I wrong in this distinction? 
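The two linearized behaviors above can be put side by side numerically (the lambda and tau values are arbitrary illustrative choices): the hyperbolic normal coordinate rotates forever with preserved amplitude, while the parabolic one exponentially forgets its initial condition.

```python
import numpy as np

# Exact solutions of the two linearized normal-coordinate equations:
t = np.linspace(0.0, 10.0, 101)
lam, tau = 4.0, 2.0
wave = np.cos(np.sqrt(lam) * t)   # hyperbolic: d_tt x = -lam x, amplitude preserved
decay = np.exp(-t / tau)          # parabolic:  d_t x = -x/tau, exponential forgetting
print(np.abs(wave).max(), decay[-1])   # oscillation persists, diffusion forgets
```

This is the whole contrast in miniature: 'unitary' rotation of coordinates versus thermalization toward equilibrium.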
I agree that 'entropic force' is extremely powerful, but it is still a statistical result - for example, if during a random walk, instead of maximizing entropy locally (which leads to Brownian motion), we do it right - globally - we thermodynamically get the probability density going to the lowest quantum state - single defects create macroscopic entropic barriers/wells/interactions: http://demonstrations.wolfram.com/GenericRandomWalkAndMaximalEntropyRandomWalk/ For me, the problem with quantum mechanics is that it sits between these pictures - we usually have unitary evolution, but sometimes entropy grows while the wavefunction collapses - no mystical interpretation is needed to understand it: entropy maximization from a mathematically universal uncertainty principle is just enough ( http://arxiv.org/abs/0910.2724 ). What do you think about this distinction? Can thermodynamical models be not only effective (the result), but fundamental (the reason)? Can quantum mechanics alone be fundamental?

In the modern view of quantum mechanics, wavefunction collapse is no longer a 'mystical, out of physics' phenomenon, but is seen as a result of interaction with the environment ('einselection') - there is still some concrete unitary evolution behind it. So there should exist a 'Hamiltonian of the Universe' describing the evolution of everything. We have a similar situation in (classical!) field theories: for Euler-Lagrange equations (like Klein-Gordon: [math] d_{tt} \psi = \Delta\psi - m^2 \psi [/math] ) the evolution operator is self-adjoint - it can be diagonalized (spectral theorem). The evolution of the [math] \lambda [/math] coordinate is: [math] d_{tt} x = \lambda x [/math]. So this operator should be non-positive, because otherwise some coordinates would explode. For negative eigenvalues, we get unitary evolution - as in quantum mechanics, we can imagine it as a superposition of different eigenfunctions, 'rotating' with different speeds. And so such hyperbolic PDEs are called wave-like. We have limited knowledge and cannot fully trace these unitary evolutions - from our perspective they 'lose their coherence':
- we don't/can't know precise parameters, like initial conditions,
- we cannot fully trace complicated motion (chaos),
- thermodynamically stable states usually have their own dynamics, like atomic orbitals or quantum phases.
If we model this lack of knowledge with a proper statistical ensemble among possible scenarios - maximizing uncertainty not locally as in Brownian motion, but globally - we get thermodynamical convergence to the quantum mechanical ground-state probability density. These new models also show why, to translate from the amplitudes we work with to probability, we should take 'the square' ( http://arxiv.org/abs/0910.2724 ). To understand the strength of quantum computers, it should be enough to focus on models with a constant (fixed) number of particles, for which classical field theory is enough. 
What is non-intuitive about them is that the natural picture for such Lagrangian mechanics is a 'static 4D' one: particles are not just 'moving points', but rather their trajectories in spacetime ... let's look at what this gives us in computational capabilities. A quantum algorithm usually looks like this:
- initialize qubits,
- use Hadamard gates to get a superposition of all possible inputs,
- calculate a classical function of the input,
- extract some information from the superposition of results.
Look at the classical function calculation: it has to use reversible gates, like (x,y,z) -> (x,y,z XOR f(x,y)). They are also reversible classically, so we could seemingly reverse the whole function calculation on a standard computer. Unfortunately it's not so simple: such reversible calculations usually require quite a large number of auxiliary (qu)bits, initialized to zero. When taking the classical reverse of such a function, we cannot control that these auxiliary (qu)bits end up as zeros; they would usually just be random, so we wouldn't really have computed what we wished. If we could, for example, calculate the square of a number modulo N, or the product of two numbers, using a 'small' number of auxiliary bits, we could guess their final values (e.g. randomly) and in a small number of trials reverse such a function (getting all zeros), which would allow us to factorize N; so probably even simple multiplication requires a linear number of auxiliary bits. The strength of quantum computers is that they can 'mount qubit trajectories' in both past and future: simultaneously initialize the auxiliary qubits and, using measurement, focus only on scenarios having the same final value (the measured one). In the case of Shor's algorithm, we wouldn't even need to know all the scenarios to make the Fourier transform; knowing two would already be enough: if two powers give the same value modulo N, then raising to the difference of their exponents gives 1 modulo N.
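The reversible gate (x,y,z) -> (x,y,z XOR f(x,y)) mentioned above can be illustrated directly; with f = AND this is the Toffoli gate (the function names and the choice f = AND are mine, for illustration):

```python
# Reversible controlled gate: (x, y, z) -> (x, y, z XOR f(x, y)).
# Applying it twice restores the input, so it is its own classical inverse.
def controlled_f(bits, f):
    x, y, z = bits
    return (x, y, z ^ f(x, y))

AND = lambda x, y: x & y   # f = AND gives the Toffoli gate

for x in (0, 1):
    for y in (0, 1):
        for z in (0, 1):
            once = controlled_f((x, y, z), AND)
            twice = controlled_f(once, AND)
            assert twice == (x, y, z)   # self-inverse, hence reversible

# The catch described above: the result lands in the auxiliary bit z,
# which must start as 0; reversing a larger circuit classically requires
# guessing the final values of all such auxiliary bits.
print(controlled_f((1, 1, 0), AND))   # (1, 1, 1)
```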
On the 18th page of my presentation there is a diagram for Shor's algorithm: https://docs.google.com/fileview?id=0B7ppK4IyMhisODI5ZTU4YjYtNmU0MC00ZTM3LTg5MWQtMTJiYTY4MWVkOTJk&hl=en For physics it's natural to find the global minimum of action, but simulating such a computer in classical field theory, even after simplifications, would probably still be difficult. Anyway, it suggests that to attack algorithmically difficult problems, we should translate them into continuous ones. For example, in the 3SAT problem we have to assign values to variables to fulfill all alternatives of triples of these variables or their negations. Note that x OR y can be changed into minimizing [math] ((x-1)^2+y^2)((x-1)^2+(y-1)^2)(x^2+(y-1)^2)[/math], and analogously a product of seven terms for an alternative of three variables. Finding the global minimum of the sum of such polynomials over all clauses would solve our problem. I've just found information that something like this has apparently been done successfully for a few years, enforcing that there is only one minimum, so that the local gradient shows the way to the solution: http://en.wikipedia.org/wiki/Cooperative_optimization What do you think about it?
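The encoding above can be sketched generically: for each clause, take the product of squared distances to each of its satisfying 0/1 assignments; the product is zero exactly at those assignments. The function name and the sign convention (`+1` for a plain literal, `-1` for a negated one) are my own illustrative choices:

```python
import itertools

# Penalty polynomial for one clause: product of one squared-distance
# factor per satisfying 0/1 assignment of the clause's variables.
def clause_penalty(xs, signs):
    n = len(xs)
    penalty = 1.0
    for a in itertools.product((0, 1), repeat=n):
        satisfied = any((v == 1) if s > 0 else (v == 0)
                        for v, s in zip(a, signs))
        if satisfied:
            penalty *= sum((x - v) ** 2 for x, v in zip(xs, a))
    return penalty

# For x OR y this reproduces the three-factor polynomial in the post;
# a three-variable clause gets seven factors (its 7 satisfying assignments).
print(clause_penalty((1, 0), (1, 1)))   # satisfying assignment -> 0.0
print(clause_penalty((0, 0), (1, 1)))   # unsatisfying -> 2.0

# A full 3SAT instance would then be encoded as the sum of clause
# penalties; the global minimum is 0 exactly when the formula is satisfiable.
```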

Electromagnetic properties like charge or magnetic moment imply forces inversely proportional to some power of the distance; so, according to such a simplified picture, infinities appear inside/near particles. So at least they are some kind of singularities of the EM field; I would call that itself (a part of?) their internal structure, wouldn't you? Returning to the topic, as I've written there, I totally agree that such interactions would be extremely small, but such a lens for neutrinos, with a focal distance of e.g. millions of kilometers, could still be useful ... For example, to finally determine their magnetic moment, which, as you've suggested, could even be impossible with the detection approach used today ...

Neutrinos are extremely difficult to catch because they interact extremely weakly with matter ... but they probably have an internal magnetic structure and a magnetic moment, so shouldn't they interact a bit with strong EM fields? In accelerators we use magnetic devices to focus beams of charged particles; maybe an analogue could be constructed for particles having only a magnetic moment? I know, the first problem could be that their spin direction is random, but there are ways to order it, like in the Stern-Gerlach experiment (... for low energy) ... ? The other problem is that the focal distance of such a lens would depend on their energy ... but maybe there are ways to handle such 'chromatic aberrations' ... ? Let's imagine we could build such an extremely weak magnetic lens for neutrinos and place a relatively small detector at its focal point; at some scale it should become more effective than standard detectors ... The interesting fact is that it could have extremely long focal lengths. 1 km: we could place the detector underground ... 12700 km: we could place the detector on the other side of the Earth ... or much more, placing the lens on a spaceship ... What do you think about it? Is such a lens doable (for low-energy neutrinos)? Additionally, it should allow us to learn much more about neutrinos than standard detectors do ...

Precessive motion as the source of quantum uncertainty?
Duda Jarek replied to Duda Jarek's topic in Physics
When arguing for the quantization of the magnetic flux going through a superconducting ring, we say that the 'quantum phase' has to 'close up on itself': make some integer number of 'rotations' along this ring. But in such considerations (or e.g. the Josephson junction) hardly anybody thinks about this phase as literally the quantum phase of the single electrons, but rather as an ORDER PARAMETER: some local value of a statistical-physics model, describing relative 'quantum' phase, or more generally the phase of some periodic motion (e.g. http://rmf.fciencias.unam.mx/pdf/rmfs/53/7/53_7_053.pdf ). So what about solutions of the Schroedinger equation for hydrogen? We know well that there are better approximations of physics, like the Dirac equation ... which still ignores the internal structure of particles ... So doesn't that make Schroedinger's phase also just an order parameter of some approximate statistical model? ... describing the relative phase of the internal periodic motion of electrons ("zitterbewegung") ... What I'm trying to argue is that when we get below these statistical models, to 'the real quantum phase' (which for example describes the phase of a particle's internal periodic motion), we will no longer have to 'fuzz' everything, but will be able to work in a deterministic picture. These statistical models work only with the relative phase of some periodic motions and rather cannot distinguish the absolute phase ... but generally we should be careful about taking this gauge invariance as a fundamental assumption, i.e. that physics doesn't really care about its local values. I was recently told that prof. Gryzinski also didn't like the approach in which physicists couldn't handle some problems, so they hid everything behind the mystical cape of quantum mechanics and said that it is the fundamental level. He spent his life (he died in 2004) explaining with classical physics the atomic 'problems' which led to the belief that QM is the lowest level. There are many of his papers in good journals.
His lectures can be found here: http://www.cyf.gov.pl/gryzinski/ramkiang.html Another argument comes from recently mapped electron densities on the surface of semiconductors: http://physicsworld.com/cws/article/news/41659 So we have some potential wells and electrons jumping between them. These wells create a lattice with defects; we can approximately model it using a graph on which these electrons make some 'walks'. There is a huge number of small interactions there, so we should rather look for a statistical model: some random walk on this graph. Which random walk? The standard one (maximizing entropy locally) gives just Brownian motion, without localization properties ... From quantum mechanics we should rather expect going to the quantum ground state of this lattice ... and the random walk maximizing entropy globally indeed gives such a quantum ground-state probability density, with similar 'fractal patterns' as in the pictures from the experiment.
Why intrinsic curvature is better than gravitomagnetism?
Duda Jarek replied to Duda Jarek's topic in Physics
Ok, you say perturbative, but forgot to add: approximation ... What about the non-perturbative picture? In all of these theories time is continuous: when e.g. a particle decays, it's not that we have one particle in the first moment, then there is a magical 'poof' and we have two particles. Physics doesn't like rapid changes, and especially such discontinuities; it would make this process smooth: a continuous transformation of one particle into two... What the perturbative expansion does is consider different scenarios in some probability distribution. If they are not made of the field, then 'where' are they? It would suggest that they 'live' in parallel to the field (?), so how can they affect it? Why does the EM field 'care' about them? In QFT they are excitations of a harmonic potential well, so they still have some momentum structure and thus, after Fourier transform, some spatial structure, don't they? If they really don't have any internal structure, are they 'pointwise'? Do they have infinite density? So e.g. the electric field goes to infinity near them? Quantum mechanics is used as a magical cape protecting from inconvenient questions ... ... but there are also non-perturbative field theories: deterministic (!) mechanics of density functionals, governed by concrete Euler-Lagrange equations ...
Why intrinsic curvature is better than gravitomagnetism?
Duda Jarek replied to Duda Jarek's topic in Physics
I just don't like the picture in which the field is one thing and particles something completely different: abstract beings which somehow can influence the field ... while they could just be built of the same field, as some special local solutions, for example topological singularities, which we see in many fields. So how do you imagine particles? There also appears the question: if particles have some internal structure, is it affected by fields? Can fields change a bit their properties like mass, charge, magnetic moment? That could also look like time dilation ... Thanks for the paper.
Why intrinsic curvature is better than gravitomagnetism?
Duda Jarek replied to Duda Jarek's topic in Physics
Ohh .. I believe they are much more than only background fields. Look at the Gauss law: it sums the charges inside, charges which are almost pointwise, so we can almost define such an electron as a nearly pointwise topological singularity of the electric field ... maybe particles are just such special local solutions of some field ... for example, spin is often defined by the quantum phase making something like this around it: http://demonstrations.wolfram.com/SeparationOfTopologicalSingularities/