This PDF file contains the front matter associated with SPIE Proceedings Volume 9570 including the Title Page, Copyright information, Table of Contents, Introduction, and Conference Committee listing.
Proc. SPIE 9570, Quantum theory as a description of robust experiments: application to Stern-Gerlach and Einstein-Podolsky-Rosen-Bohm experiments, 957002 (10 September 2015); doi: 10.1117/12.2185704
We propose and develop the thesis that the quantum theoretical description of experiments emerges from the desire to organize experimental data such that the description of the system under scrutiny and the one used to acquire the data are separated as much as possible. Applications to the Stern-Gerlach and Einstein-Podolsky-Rosen-Bohm experiments are shown to support this thesis. General principles of logical inference, which have been shown to lead to the Schrödinger and Pauli equations and to the probabilistic descriptions of the Stern-Gerlach and Einstein-Podolsky-Rosen-Bohm experiments, are used to demonstrate that the condition for the separation procedure to yield the quantum theoretical description is intimately related to the assumptions that the observed events are independent and that the data generated by these experiments are robust with respect to small changes of the conditions under which the experiment is carried out.
Proc. SPIE 9570, A convergence: special relativity, zitterbewegung, and new models for the subcomponent structure of quantum particles, 957003 (10 September 2015); doi: 10.1117/12.2186369
Hestenes has presented an integration of Schrödinger's zitterbewegung with the spin matrices of the Dirac equation, suggesting the electron can be modeled by a rapidly rotating dipole moment with a frequency related to the de Broglie frequency. He presents an elegant spacetime algebra that provides a reformulation of the Dirac equation incorporating these real spin characteristics. A similar heuristic model for quantum particles has been derived by this author from a different, quasi-classical premise: that the most fundamental subcomponents of quantum particles all travel at a constant speed of light. Time is equated with the spatial displacement of these subcomponents – the speed of light is the speed of time. This approach suggests a means of integrating special relativity and quantum mechanics with the same concept of time. The relativistic transformation of spinning quantum particles creates the appearance of additional, compactified spatial dimensions that can be correlated with the complex phase of the spin matrices as in the Dirac formalism. This paper further examines the convergence toward such new models for quantum particles built on this rapid motion of particle subcomponents. The modeling leverages a string-like heuristic for particle subcomponents and a revised description of the wave-like properties of particles. This examination provides useful insights into the real spatial geometries and interactions of electrons and photons.
We present a particle model that was developed to explain special relativity by classical means. This model is also able to account for physical processes that are normally attributed to quantum mechanics: it describes several well-known QM processes by means of classical calculations, making them accessible to the imagination. An essential difference from the Standard Model of present-day particle physics is that, in the model presented, particles are viewed as being extended rather than point-like. In addition, the strong force is shown to be the universal force operating in all particles. Also, the photon, which quantum mechanics views as being nothing but a quantum of energy, can be understood to have an internal structure. The model presented here is not merely a different way of explaining physics with similar results; in contrast to quantum mechanics, it has the ability to provide deeper insights into physical processes.
Hugh Everett’s “Relative State” model [1] of the observing element in quantum theory is expanded in the Event Oriented World View [2] and used to show how inertial interactions suggested by “Mach’s Principle” can resolve the wave-particle duality. I argue that bullet-like photons are not ontologically real but have been introduced as interpretation aids for experiments, based upon an outdated concept of the observing mechanisms in such observers. However, denying the reality of photons is only one consequence of a more fundamental shift from object-oriented to event-oriented physical theories. I will show how adopting the concept that all systems are observers, and providing an event model for those observers, will give quantum theory a paradox-free ontological context. This context can, for example, provide an ontological explanation for the apparently random hits of individual matter-radiation interactions in the dual-slit experiment. Almost a century of attempts to find an acceptable interpretation for quantum theory have failed because mere interpretations do not go far enough: they do not treat the physical observer and his observations as incorporated into a single event. Event-oriented physics suggests an observer is an activity cycle and Hilbert space is a set of actual detector/actuator arrays, through which all knowledge is obtained. These arrays separate physical reality from the display of observable measurement results inside such observing activities. By assuming arrays are fundamentally describable as mass and charge densities, gravity and electricity are coupled together by internal material forces between mass and charge. Electromagnetic influences can only change the dynamic state of an observing activity when charge-mass forces produce compensating changes in the gravito-inertial field. Gravito-inertial field fluctuations limit the ability of an atom to make a transition that exactly matches the energy available in a stimulating electric field.
Thus, the propensity of photon absorption is determined by the field intensity, but the actual transition is determined by quasirandom gravito-inertial fluctuations. Disturbances in the electromagnetic field propagate as waves but are observed as well-localized phenomena, falsely suggesting that particles exist in the field.
Are we to accept quantization as a fundamental property of nature, the origin of which does not require or admit further investigation? To gain insight into this question we consider atomic systems as open systems, since they are by necessity in contact with the electromagnetic radiation field. This includes not only photonic radiation but, more importantly for our purposes, the random zero-point or nonthermal radiation that pervades the Universe. The Heisenberg inequalities, atomic stability and the existence of discrete solutions are explained as a result of the permanent action of this field upon matter and the balance between mean absorbed and emitted powers in the equilibrium regime. A detailed study carried out over the years has led to the usual quantum-mechanical formalism as a powerful and revealing statistical description of the behavior of matter in the radiationless approximation, as well as to the radiative corrections of nonrelativistic QED. The theory presented thus gives a response to the question posed above, within a local, realist and objective framework: quantization appears as an emergent phenomenon due to the matter-field interaction.
It is now generally recognized that physics has not contributed anything conceptually fundamentally new beyond the century-old Relativity and the 90-year-old Quantum Mechanics [1-4]. We have also started recognizing that there is an increasing rate of species extinction all over the world, especially since the last century [5]; and we are beginning to understand that the related problems are being steadily accelerated by human behavior aimed at conquering nature, rather than understanding nature as it is and living within its system logics [6,7]. We are beginning to appreciate that our long-term sustainability as a species literally depends upon proactively learning to nurture the entire bio-diversity [8-10]. Thus, humans must consciously become evolution-process-congruent thinkers. The evolutionary biologists have been crying out loud for us to listen [5,6,8-10]. Social scientists, political scientists and economic scientists [13] have started chiming in to become consilient thinkers [6] for re-constructing sustainable societies. But the path to consilient thinking requires us to recognize and accept a common vision-based thinking process, which functionally serves as a uniting platform. I am articulating that platform as “evolution process congruent thinking” (EPCT). Do physicists have any obligation to co-opt this EPCT? Is there any immediate and/or long-term gain for them? This paper argues affirmatively that co-opting EPCT is the best way to re-anchor physics back to reality ontology and to develop newer and deeper understanding of natural phenomena based on understanding of the diverse interaction processes going on in nature. Physics is mature enough to acknowledge that all of our theories are “work in progress”. This is a good time to start iteratively re-evaluating and re-structuring the foundational postulates behind all the working theories.
This will also consistently energize the follow-on generations of physicists to keep on fully utilizing their evolution-given enquiring minds without being intimidated by the prevailing “publish-or-perish” culture, which requires them to stay within the bounds of the prevailing theories as if they were final. Current physics thinking has been successfully driven by Measurable Data Modeling Epistemology (MDM-E), which is basically curve-fitting without demanding an understanding of the actual physical processes nature is carrying out. I am proposing to add an iterative repertoire, Interaction Process Mapping Epistemology (IPM-E), over and above the successful MDM-E. This will help physicists become conceptual reverse engineers of nature. The gap between physicists and engineers will begin to close, and our collective sustainability will be re-assured as we become successful engineers of nature.
The linguistic and epistemological constraints on finding and expressing an answer to the title question are reviewed. First, it is recalled that "fields" are defined in terms of their effect on "test charges" and not in terms of any, even idealistically considered, primary, innate qualities of their own. Thus, before fields can be discussed, the theorist has to have already available a defined "test particle" and field source. Clearly, neither the test nor the engendering particles can be defined as elements of the considered field without redefining the term "field." Further, the development of a theory as a logical structure (i.e., an internally self-consistent conceptual complex) entails that the subject(s) of the theory (the primitive elements) and the rules governing their interrelationships (axioms) cannot be deduced by any logical procedure. They are always hypothesized on the basis of intuition supported by empirical experience. Given hypothesized primitive elements and axioms, it is possible, in principle, to test for the 'completion' of the axiom set (i.e., any addition introduces redundancy) and for self-consistency. Thus, theory building is limited to establishing the self-consistency of a theory's mathematical expression and comparing that with the external, ontic world. Finally, a classical model with an event-by-event simulation of an EPR-B experiment to test a Bell inequality is described. This model leads to a violation of Bell's limit without any quantum input (no nonlocal interaction or entanglement), thus substantiating previous critical analysis of the derivation of Bell inequalities. On the basis of this result, it can be concluded that the electromagnetic interaction possesses no preternatural aspects, and that the usual models in terms of waves, fields and photons are all just imaginary constructs with a questionable relation to a presumed reality.
Proc. SPIE 9570, Did Planck, Einstein, and Bose count indivisible photons or discrete emission/absorption processes in a black-body cavity?, 957009 (10 September 2015); doi: 10.1117/12.2188291
Planck, Einstein and Bose all had to introduce statistics, and thus counting, in order to successfully derive an equation for the energy distribution within the black-body radiation spectrum, and what we now call Bose-Einstein statistics. Some of the details involved in the counting procedure vary while still giving the same result. However, the interpretations of what we count may differ dramatically from one another (as, for example, between Planck and Bose) without impacting the final mathematical result. We demonstrate here a further alternative, which differs both in the details of the counting and in the interpretation, while still producing the same well-known statistics. This approach puts the "quantumness" back into the radiation emission/absorption process, possibly dispensing with the requirement of quantized light, at least in the context of black-body radiation.
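The counting at issue can be made concrete. A minimal sketch (in Python, with illustrative small numbers, not values from the paper) of the Bose-style count of the ways to distribute n indistinguishable quanta among g cells, checked against brute-force enumeration:

```python
from math import comb
from itertools import product

def bose_count(n, g):
    """Bose-style count: ways to place n indistinguishable quanta in g cells."""
    return comb(n + g - 1, n)

def brute_force_count(n, g):
    """Enumerate all occupation vectors (n_1, ..., n_g) summing to n."""
    return sum(1 for occ in product(range(n + 1), repeat=g) if sum(occ) == n)

# the closed form agrees with explicit enumeration for small cases
for n, g in [(2, 3), (3, 4), (5, 2)]:
    assert bose_count(n, g) == brute_force_count(n, g)
```

It is this combinatorial count, rather than any particular interpretation of what is being counted, that fixes the resulting statistics; the abstract's point is that the interpretation can change while the count, and hence the distribution, stays the same.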
Considering an idea of F. Arago from 1853 regarding light dispersion through the light ether in interstellar space, this paper presents a new idea for an alternative interpretation of the cosmological red shift of the galaxies in the universe. The model is based on an analogy with the temporal material dispersion that light undergoes in an optical fiber core. Since intergalactic space is transparent, according to the model this phenomenon is related to the gravitational potential existing in the whole of space. Thus, it is possible to find a new interpretation of Hubble's constant. In space, light undergoes a dispersion process along its path, which is described by a red shift equation of the type Δz = HL, where H = (d^{2}n/dλ^{2}) Δv Δλ; here H is the Hubble constant, n is the refractive index of intergalactic space, Δλ is the spectral width of the extragalactic source, and Δv is the variation of the speed of light caused by the gravitational potential. We observe that this "constant" is governed by three new parameters. Light traveling through intergalactic space undergoes a red shift due to this mechanism, while the light amplitude decreases with time and the wavelength always increases, thus producing the same type of behavior given by Hubble's Law. It can be demonstrated that the dark matter phenomenon is produced by the apparent speed of light of the stars on the periphery of the galaxies, without the existence of dark energy. Based on this new idea, the model of the universe is static, lacking expansion. Other phenomena may be interpreted based on this new model of the universe. We have what we call temporal gravitational dispersion of light in space, produced by the variations of the speed of light due to the presence of the gravitational potential in the whole of space.
Photons are here considered to be resonant oscillations (solitons) in four dimensions (space/time) of an undefined ‘field’ otherwise generally existing at a local energy minimum. The photons’ constituent EM fields result in elevated energy, and therefore potentials, within that field. It is in the context of the standing waves of and between photons that the EM fields and potentials lead to a description of alternating (AC) ‘currents’ (of some form) of unquantized alternating ‘charge’ (of some sort). The main topic of this paper is the alternating charge.
The Dirac equation electron is modeled as a helically circulating charged photon, with the longitudinal component of the charged photon's velocity equal to the velocity of the electron. The electron's relativistic energy-momentum equation is satisfied by the circulating charged photon. The relativistic momentum of the electron equals the longitudinal component of the momentum of the helically-circulating charged photon, while the relativistic energy of the electron equals the energy of the circulating charged photon. The circulating charged photon has a relativistically invariant transverse momentum that generates the z-component of the spin ħ / 2 of a slowly-moving electron. The charged photon model of the electron is found to generate the relativistic de Broglie wavelength of the electron. This result strongly reinforces the hypothesis that the electron is a circulating charged photon. Wave-particle duality may be better understood due to the charged photon model—electrons have wavelike properties because they are charged photons. New applications in photonics and electronics may evolve from this new hypothesis about the electron.
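The momentum bookkeeping claimed above can be checked numerically. A minimal sketch (Python, standard SI constants; the charged-photon model itself is the author's hypothesis, not established physics): treating the circulating photon's momentum as E/c with E = γmc², its longitudinal component reproduces the relativistic electron momentum γmv, its transverse component is the invariant mc, and h divided by the longitudinal component gives the relativistic de Broglie wavelength:

```python
import math

# standard SI constants
h = 6.62607015e-34        # Planck constant, J*s
c = 2.99792458e8          # speed of light, m/s
m = 9.1093837015e-31      # electron rest mass, kg

def check(v):
    beta = v / c
    gamma = 1.0 / math.sqrt(1 - beta**2)
    E = gamma * m * c**2              # relativistic electron energy
    p_photon = E / c                  # momentum of the circulating charged photon
    p_long = p_photon * beta          # longitudinal component
    p_perp = math.sqrt(p_photon**2 - p_long**2)
    # longitudinal component equals the relativistic electron momentum gamma*m*v
    assert math.isclose(p_long, gamma * m * v, rel_tol=1e-12)
    # transverse component is the relativistically invariant m*c
    assert math.isclose(p_perp, m * c, rel_tol=1e-9)
    # de Broglie wavelength of the electron, h/(gamma*m*v)
    return h / p_long

for v in (0.01 * c, 0.5 * c, 0.99 * c):
    check(v)
```

The invariance of the transverse component across all three velocities is the arithmetic behind the claimed velocity-independent spin contribution.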
The nature of light is studied by comparison between the real and digital worlds. Combining the theoretical results obtained from a relativistic quantum mechanical equation suited to a particle with zero rest mass and spin quantum number 1/2 with the analysis of experimental results, we discuss the possibility that a photon could have structure. A possible photon model with internal structure is proposed and discussed. A photon could consist of a pair of particles, named x^{+} and its antiparticle x^{-}. Both x^{+} and x^{-} have zero rest mass and spin quantum number 1/2. They oscillate in opposite directions but spin in the same direction, which means there is an internal oscillation. The frequency of light is the frequency of this oscillation. The model can be used to explain not only all of the experimental phenomena but also the wave-particle duality of light. It can provide clear pictures of both the polarization and the coherence of light. It can also explain why there are two spin states, 1 and -1. Possible experiments that could be used to test the model are addressed as well.
There are two opposing points of view on the nature of light: the first takes the wave-particle duality to be a fundamental property of nature; the second claims that photons do not exist and that light is a continuous classical wave, while the so-called “quantum” properties of this field appear only as a result of its interaction with matter. In this paper we show that many quantum phenomena which are traditionally described by quantum electrodynamics can be described if light is considered within the limits of classical electrodynamics, without quantization of radiation. These phenomena include the double-slit experiment, the photoelectric effect, the Compton effect, the Hanbury Brown and Twiss effect, the so-called multiphoton ionisation of atoms, etc. We show that this point of view also explains the “wave-particle duality” of light in Wiener experiments with standing waves. We show that the Born rule for light can easily be derived from Fermi’s golden rule as an approximation for low-intensity light or for short exposure times. We show that the Heisenberg uncertainty principle for “photons” has a simple classical interpretation and cannot be considered a fundamental limitation on the accuracy of simultaneous measurements of position and momentum or time and energy. We conclude that the concept of a “photon” is superfluous in the explanation of light-matter interactions.
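The claimed link between Fermi's golden rule and the Born rule can be illustrated with a toy detection model. A minimal sketch (Python; the exponential form and the rate constant eta are illustrative assumptions, not taken from the paper): if a detector fires at a rate proportional to the classical intensity, the cumulative detection probability 1 - exp(-eta*I*t) becomes proportional to I for low intensity or short exposure, which is the Born-like behaviour described above:

```python
import math

def detection_prob(eta, intensity, t):
    """Cumulative firing probability of a toy detector firing at rate eta*I."""
    return 1.0 - math.exp(-eta * intensity * t)

eta, t = 0.1, 1e-3  # illustrative rate constant and exposure time
for intensity in (1.0, 2.0, 4.0):
    p = detection_prob(eta, intensity, t)
    linear = eta * intensity * t
    # relative deviation from strict proportionality is O(eta*I*t),
    # so the Born-like linear law holds for weak light or short exposure
    assert abs(p - linear) / linear < eta * intensity * t
```

At high intensity or long exposure the same model saturates, so the proportionality to intensity is only the small-argument limit, matching the abstract's "approximation" caveat.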
An experimentally verified mathematical model that precisely describes the attraction and motion between an electron and a positron does not yet exist. Although there have been no direct experimental measurements of the particle velocity as the distance between the two particles approaches zero, the basic inverse-square model used for point charges is thought to be inadequate because it would predict speeds in excess of c, the speed of light. Modifications to this basic model have been made by theorizing a variable, velocity-dependent relativistic mass or a velocity-dependent force. These models, which assume that the electron and positron both attain a velocity of approximately c during their annihilation collision, result in a very compelling model of a photon as an electron and positron traveling through space in a two-body orbital union. However, photon models based on this assumption show that the photon translational velocity must have some dependence on the photon wavelength. Further exploration of the basic inverse-square model of electron-positron attraction shows that it predicts the first-order two-body photon model without this wavelength-dependent dispersion. Furthermore, study of the electron-positron interaction with a hydrogen-like entity shows that the popular notion of a photon having an angular momentum on the order of ħ and an energy of ħω can be derived from first principles.
It is argued that, unlike material particles, electromagnetic quanta are devoid of individual identities. Their birth and death are dictated by the conservation principles involving the interacting partners that result in the emission or absorption. During its entire life, a quantum restlessly propagates, unnoticed by the media it passes through. An encounter with an interacting partner results in its demise. A photon lives and propagates like a phoenix, each successive photon arising from the ashes of its predecessor.
Our investigation shows that the wave properties associated with fermions imply their finite size. However, existing quantum theories are based on the point-particle concept and fail to maintain relativistic invariance for finite-sized charged particles. Notably, the quantum uncertainty principle introduces finite size in all elementary particles. Recently, we have developed a theory for understanding the quantum properties of finite-sized fermions and bosons that incorporates both special relativity and quantum uncertainty. Using this theory, we are able to demonstrate theoretically the physical appearance of bosons and fermions. Understanding the physical structure of electrons and photons will definitely help us advance our technology.
Proc. SPIE 9570, Wave interference: mechanics of the standing wave component and the illusion of "which way" information, 95700L (10 September 2015); doi: 10.1117/12.2187523
Two adjacent coherent light beams, 180° out of phase and traveling on adjacent, parallel paths, remain visibly separated by the null (dark) zone of their mutual interference pattern as they merge. Each half of the pattern can be traced to one of the beams. Does such an experiment provide both "which way" and momentum knowledge? To answer this question, we demonstrate, by examining the behavior of wave momentum and energy in a medium, that interfering waves interact. Central to the mechanism of interference is a standing wave component resulting from the combination of coherent waves. We show the mathematics for the formation of the standing wave component and for the wave momentum involved in the waves' interaction. In water and in open coaxial cable, we observe that standing waves form cells bounded by "reflection zones" where wave momentum from adjacent cells is reversed, confining oscillating energy to each cell. Applying principles observed in standing waves in media to the standing wave component of interfering light beams, we identify the dark (null) regions as the reflection zones. Each part of the interference pattern is affected by interactions between other parts, obscuring "which-way" information. We demonstrated physical interaction experimentally using two beams interfering slightly, with one dark zone between them. Blocking one beam "downstream" from the interference region removed the null zone and allowed the remaining beam to evolve to the footprint of a single beam.
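The standing wave component described above can be reproduced in a few lines. A minimal sketch (Python/NumPy, with an assumed 633 nm wavelength, a 1 mrad half-angle, and schematic time units, none taken from the paper): two coherent beams crossing at ±α with a 180° phase difference factor into a travelling wave times a transverse standing wave sin(kx sin α), whose null at x = 0 is the dark zone separating the two halves of the pattern:

```python
import numpy as np

k = 2 * np.pi / 633e-9   # assumed wavelength 633 nm
alpha = 0.001            # assumed half-angle, 1 mrad
omega = 1.0              # schematic angular frequency (time units arbitrary)

def field(x, z, t):
    """Two coherent beams crossing at +/- alpha, 180 deg out of phase."""
    e1 = np.cos(k * (z * np.cos(alpha) + x * np.sin(alpha)) - omega * t)
    e2 = -np.cos(k * (z * np.cos(alpha) - x * np.sin(alpha)) - omega * t)
    return e1 + e2

x = np.linspace(-2e-3, 2e-3, 2001)
ts = np.linspace(0, 2 * np.pi / omega, 200, endpoint=False)
intensity = np.mean([field(x, 0.0, t) ** 2 for t in ts], axis=0)

# the superposition factors into a transverse standing wave sin(k x sin(alpha))
expected = 2 * np.sin(k * x * np.sin(alpha)) ** 2
assert np.allclose(intensity, expected, atol=1e-6)
# null (dark) zone on the symmetry axis between the beams
assert intensity[1000] < 1e-12
```

The stationary transverse factor, with its fixed nulls, is the "standing wave component" of the abstract; the mechanical interpretation of those nulls as reflection zones is the paper's contribution, not part of this sketch.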
We provide a mystery-free explanation for the experimentally observed facts in the neutron interferometry quantum Cheshire cat experiment of Denkmayr et al. [Nat. Comm. 5, 4492, 2014] in terms of a discrete-event simulation model, demonstrating that the quantum Cheshire cat is an illusion.
A famous beam-split coincidence test of the photon model is described herein using gamma-rays instead of the usual visible light. A similar new test was performed using alpha-rays. In both tests, coincidence rates greatly exceed chance, revealing an unquantum effect. In contradiction to quantum theory and the photon model, these new results are strong evidence for the long-abandoned accumulation hypothesis, also known as the loading theory. Attention is drawn to assumptions applied to past key experiments that led to quantum mechanics. The history of the loading theory is outlined, and a few equations for famous experiments are derived, now free of wave-particle duality. Quantum theory usually works because there is a subtle difference between quantized and thresholded absorption.
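The "exceeds chance" comparison above rests on a standard estimate for accidental coincidences between uncorrelated detectors. A minimal sketch (Python; the singles rates, window, and measured rate below are hypothetical placeholders, not values from the experiment):

```python
# Standard accidental-coincidence estimate for two uncorrelated detectors:
# R_chance = R1 * R2 * tau, with tau the coincidence window.
R1, R2 = 200.0, 180.0   # hypothetical singles rates, counts/s
tau = 1e-6              # hypothetical coincidence window, s

R_chance = R1 * R2 * tau          # expected accidental rate, counts/s
R_measured = 0.5                  # hypothetical measured coincidence rate, counts/s

unquantum_ratio = R_measured / R_chance   # a ratio > 1 exceeds chance
```

In the loading-theory reading, a ratio well above unity signals accumulated (pre-loaded) energy released at both detectors; in the indivisible-photon model, beam-split coincidences should stay at the chance level.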
Lossless optical beam splitters are frequently encountered in fundamental physics experiments regarding the nature of light, including “which-way” determination or the EPR paradox, and in their measurement apparatus. Although they look like common optical components at first glance, their behaviour remains somewhat mysterious, since they apparently exhibit stand-alone particle-like features, and then wave-like characteristics when inserted into a Mach-Zehnder interferometer. In this communication, some basic properties of these beam splitters are examined and discussed, both from a classical optics and a quantum physics point of view. The most evident convergences and contradictions are highlighted, and the results of a few emblematic experiments demonstrating photon existence are discussed. Alternative empirical models are also proposed in order to shed light on some remaining issues.
Proc. SPIE 9570, Photon diffraction described by momentum exchange theory: what more can edge diffraction tell us?, 95700R (10 September 2015); doi: 10.1117/12.2186353
Previous papers have presented an alternative picture for photon diffraction based on a distribution of photon paths through quantized momentum exchange, with probabilities defined at the location of scattering, not the point of detection. This contrasts with the picture from classical optical wave theory, which describes diffraction in terms of the Huygens-Fresnel principle and sums the phased contributions of electromagnetic waves at the location of detection to determine probabilities. This alternative picture was termed “Momentum Exchange Theory” (MET), replacing the concept of Huygens wavelets with photon scattering (positive and negative dispersions) through momentum exchange with the scattering lattice. MET assumes a momentum representation for diffracted particles and has been applied to several different optical diffraction experimental configurations. Straight-edge diffraction has been a particularly revealing configuration, as it provides significant clues to the geometric parameters controlling exchange probabilities. Diffraction by an opaque disc is examined to provide further insight into negative (attractive) dispersions. This analysis indicates that the “diffraction force” is an integration of momentum exchange field interactions to derive an exchange probability at interaction points along the photon path – resembling aspects of the QED path integral formulation for particle interactions.
This paper addresses the interference of photons with themselves and the conditions under which a specific resonance creates the entangled electron/positron pair. Analysis of the forces and potentials in the interaction between photons has raised the issue of oscillating charge as the source of the alternating electric and magnetic fields comprising the photon. Since the photon is net neutral, yet is composed of electric fields, the object of this paper is to explore the physical mechanism(s) describing how the alternating fields of photons can be ‘rectified’ to produce mass and the separated opposite charges of the electron and positron pair.
The creation of charged elementary particles from neutral photons is explained as a conversion process of electromagnetic (EM) energy from linear to circular motion at the speed of light into two localized, toroid-shaped vortices of trapped EM energy that resist change of motion, perceptible as particles with inertia and hence mass. The photon can be represented as a superposition of left and right circularly polarized transverse electric fields of opposite polarity originating from a common zero-potential axis, the optical axis of the photon. If these components are separated by interaction with a strong field (nucleon), they would curl up into two electromagnetic vortices (EMV) due to longitudinal magnetic field components, forming toroids. These vortices are perceptible as oppositely charged elementary particles e^{±}. The spinning toroids generate extended oscillating fields that interact with stationary field oscillations. The velocity-dependent frequency differences cause beat signals equivalent to matter waves, leading to interference. The extended fields entangled with every particle explain wave-particle duality issues. Spin and magnetic moment are the natural outcome of these gyrating particles. As the energy, and hence mass, of the electron increases with acceleration, its size shrinks in proportion to its reduced wavelength. The artificial weak and strong nuclear forces can be easily explained as different manifestations of the intermediate EM forces. The unstable neutron consists of a proton surrounded by a contracted and captured electron. The associated radial EM forces represent the weak nuclear force. The deuteron consists of two axially separated protons held together by a centrally captured electron. The axial EM forces represent the strong nuclear force, providing stability for “neutrons” only within nuclei. The same principles were applied to determine the geometries of force-balanced nuclei.
The alpha-particle emerges as a very compact, symmetric cuboid that provides a unique building block for assembling the isotopic chart. An exotic neutron-4 appears viable, which may explain dark matter. The recognition that all heavy particles, including the proton, are related to electrons via muons and pions explains the identity of all charges to within 10^{–36}; greater deviations would overpower gravitation. Gravitation can be traced to EM vacuum fluctuations generated by standing EM waves between interacting particles. On that basis, gravity can be correlated via microscopic quantities to the age of the universe of 13.5 billion years. All forces and particles, and potentially dark matter and dark energy, are different manifestations of EM energy.
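One quantitative statement in the abstract above, that the electron's size shrinks in proportion to its reduced wavelength as its energy grows, can be checked against the standard reduced Compton wavelength ħ/(γmc). A minimal sketch (Python, CODATA constants; identifying this length with a physical electron size is the abstract's hypothesis, not established physics):

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s
c = 2.99792458e8         # speed of light, m/s
m = 9.1093837015e-31     # electron rest mass, kg

def reduced_wavelength(gamma):
    """Reduced Compton wavelength of an electron with Lorentz factor gamma."""
    return hbar / (gamma * m * c)

lam0 = reduced_wavelength(1.0)   # rest value, about 3.86e-13 m
# the length scales inversely with energy: doubling gamma halves it
assert math.isclose(reduced_wavelength(2.0), lam0 / 2, rel_tol=1e-12)
```

The 1/γ scaling is the exact form of the claimed shrinkage with increasing energy.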
We take the postulate of Special Relativity that the cosmic rules observable through physical phenomena are the same for all stars in all galaxies. We have deliberately avoided using the phrase “in all inertial frames of reference” to avoid conceptual mathematical debate in defining what such frames of reference are [1-4]. Then, accepting the universal validity of the light velocity defined by Maxwell’s wave equation, c^{2} = 1/(ε_{0}μ_{0}), we revive the old ether concept with the physically descriptive phrase that space is a continuous Complex Tension Field (CTF). This is strengthened by the fact that all non-dissipative tension fields allow for perpetual propagation of waves when excited within their linear restoration capability. We accommodate the particles as localized self-phase-looped resonant oscillations of the same CTF, thus integrating particles as another kind of excited state of the same CTF [5]. Further, all tension fields allow co-propagation and cross-propagation of multiple waves (preservation of wave properties and the respective Poynting vectors) through the same physical volume (the linear Superposition Principle) in the absence of perturbing resonant detectors within the volume of superposition. We have re-named this universal property of all waves the Non-Interaction of Waves [6,7]. Thus, Doppler-shifted waves from different stars and galaxies can cross through each other unperturbed while bringing to us the signatures of the properties of their parent stars. Now, if these light signals are wave excitations of the CTF, the optical Doppler effects must also be, as for sound waves in the air pressure tension field, separable into two different frequency shifts: one due to (i) the source velocity (distant stars) and one due to (ii) the detector velocity (that of the earth) [8,9]. In other words, we are proposing that the CTF (the modified old ether) is the stationary cosmic rest frame.
Since we have been routinely assuming that quantum phenomena are the same in all stars, we strengthen our position by analyzing the origin of absorption lines in distant stars as the same energy-level transition phenomenon that we observe on earth, well modeled by precision spectrometry and validated by the QM formalism. The analysis also reveals that the Cosmological (Hubble) Redshift cannot be due to the optical Doppler Effect, since the Doppler Effect is determined by the velocities of the source atoms within the star coronas. We have also proposed a satellite-based one-way light-propagation measurement, which could identify the absolute velocity of the satellite and validate that the CTF is the stationary rest frame for our observable universe.
Adhering to Werner Heisenberg's and the Copenhagen school's physical philosophy, we introduce the localized observer as an absolutely necessary element of a consistent physical description of nature. Thus we have synthesized the theory of the harmonicity of the field of light, which attempts to present a new approach to the events in the human-perceptible space. It is an axiomatic theory based on the selection of projective space as the geometrical space of choice, while its first fundamental hypothesis is none other than special relativity theory's second hypothesis, properly modified. The result is that all our observations and measurements of physical entities always refer not to their present state but rather to a previous one, a conclusion evocative of the "shadows" paradigm in Plato's cave allegory. In the kinematics of a material point we call this previous state the "conjugate position", which Richard Feynman called the "retarded position". We prove that the relation of the present position with its conjugate is ruled by a harmonic tetrad. Thus the relation of the elements of the geometrical (noetic) and the perceptible space is harmonic. In this work we show a consequence of this harmonic relation: the golden section.
The particle model presented here is able to explain the structure of leptons and quarks without reference to quantum mechanics. In particular, it is able to explain quantitatively the existence of inertial mass without any use of a Higgs field. An essential difference from the Standard Model of present-day particle physics is that in the model presented, particles are viewed as extended rather than point-like. In addition, it becomes apparent that the strong force is the universal force that is effective in all particles.
Proc. SPIE 9570, Quantum mechanical probability current as electromagnetic 4-current from topological EM fields, 957010 (10 September 2015); doi: 10.1117/12.2188288
Starting from a complex 4-potential A = αdβ, we show that the 4-current density in electromagnetism and the probability current density in relativistic quantum mechanics are of identical form. With the Dirac-Clifford algebra Cl_{1,3} as the mathematical basis, the given 4-potential allows topological solutions of the fields, quite similar to Bateman's construction, but with a double-field solution that was previously overlooked. A more general null-vector condition is found, and wave functions of charged and neutral particles appear as topological configurations of the electromagnetic fields.
Proc. SPIE 9570, The fine structure constant alpha: relevant for a model of a self-propelling photon and for particle masses, 957013 (10 September 2015); doi: 10.1117/12.2187396
A model for a self-propelling (i.e. massless) photon^{1} is based on oscillations of a pair of charges amounting to the elementary charge divided by √α, where α is the fine-structure (Sommerfeld) constant. When one assumes a similar model for particles that do have rest mass (i.e. which are non-self-propelling), α also plays a role in the rest masses of elementary particles. Indeed, all fundamental elementary-particle masses can be described by the α/β rule^{2}: m(particle) = α^{-n} × β^{m} × 27.2 eV/c^{2}, where β is the proton-to-electron mass ratio 1836.12 and n = 0…14, m = -1, 0, or … . Thus, photons and particle masses are intimately related to the fine-structure constant. If the latter had not been strictly constant throughout all times, this would have had consequences for the nature of light and for all masses, including those of elementary particles.
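As stated, the α/β rule is simple arithmetic. The sketch below (ours; the particular (n, m) assignments are illustrative choices, not taken from the paper) evaluates the rule and shows that (n = 2, m = 0) lands near the electron rest energy of about 0.511 MeV, while (n = 2, m = 1) lands near the proton's roughly 938 MeV:

```python
ALPHA = 1 / 137.035999  # fine-structure constant
BETA = 1836.12          # proton-to-electron mass ratio (as quoted in the abstract)
BASE_EV = 27.2          # base energy in eV (twice the Rydberg energy)

def alpha_beta_mass(n: int, m: int) -> float:
    """Rest energy in eV from the alpha/beta rule: alpha^-n * beta^m * 27.2 eV."""
    return ALPHA ** (-n) * BETA ** m * BASE_EV

print(f"(n=2, m=0): {alpha_beta_mass(2, 0) / 1e6:.3f} MeV")  # near the electron's 0.511 MeV
print(f"(n=2, m=1): {alpha_beta_mass(2, 1) / 1e6:.1f} MeV")  # near the proton's ~938 MeV
```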
A new topological study is presented of a semi-classical quantized model for the electron, consisting of a circularly bound monochromatic photon. This model for the electron includes a topologically created elementary charge, point-like behavior in high-energy scattering events, half-integral spin, and the magnetic moment of the electron. We will also present evidence for the causes of: 1) the magnetic moment of the electron, 2) the elementary charge, 3) a binding (Poincare) force which allows a photon to be confined so as to become an electron, 4) the property of inherent inertial mass in the electron, and 5) Relativity. Within this context we propose that all matter is composed of EM waves, and that Relativity and Quantum Mechanics are simply the result of the behavior of those EM waves. For the most part we will use simple scalar mathematics to support this premise, because most of the problems have relatively simple solutions. However, we will also submit that new formulations are needed for the field equations.
A new theory, describing both light and material particles, is proposed. The experimentally observed nature of space and time is brought into the theory at the most fundamental level. An equation encompassing the usual free-space Maxwell equations but similar in form to the Dirac equation is proposed. This equation has new kinds of solutions. Propagating, pure-field solutions may have any energy, but the energy transferred must be proportional to the frequency. These are identified with the physical photon. Solutions with a rest-mass term allow any incoming propagating field to merge into re-circulating, vortex-like solutions. The minimum-energy configuration "rectifies" the oscillating electric field of light into a uni-directional, radial (inward- or outward-directed) configuration. The resulting apparent external charge may be readily estimated and is found to be of the order of the elementary charge. The spin may likewise be calculated, and is found to be half-integral, exhibiting a double-covering internal symmetry. Charge is then not a fundamental quantity in the theory, but a result of the way the field folds from a rest-massless bosonic to a rest-massive fermionic configuration. The simplest such charged, fermionic particles are identified with the electron and positron.
A rigorous introduction of the underlying nature of space and time, through a sharpening of the principle of relativity, forces qualitatively new kinds of solutions in the classical theory of electromagnetism. A class of relativistic wave functions is derived which are solutions to the first-order, free-space Maxwell equation. These describe all photons, from radio waves to gamma rays, and are governed by a single parameter: the exchange frequency. Though the theory remains that of classical, continuous electromagnetism, the allowed travelling-wave solutions are quantised in that they come in "lumps" and their characteristic energy is proportional to frequency.
Recently the existence of space as a complex dynamical system was discovered, based upon various experiments going back to 1887. The early experiments, by Michelson and Morley in 1887 and Miller in 1925/26, used light-speed anisotropy detected with interferometers. Only in 2002 was the calibration theory first derived. More recently there have been other experimental techniques, including Doppler-shift effects detected by NASA using spacecraft Earth flybys. The most recent technique uses current fluctuations through the nanotechnology reverse-biased Zener-diode barrier potential, using two detectors and measuring the time delay in correlations to determine the speed and direction of the space flow. Physics has never had knowledge of this dynamical space; the theory is now well developed and is known to explain the origin of gravity, quantum fluctuations, borehole g anomalies, galactic rotations, galactic lensing of light, universe dynamics, laboratory G measurements, and more. This dynamical space supports a coordinate system, and it was this that was originally thought to be space itself.
A model of the universe based on energetic spacetime (zero-point energy) is expanded. The energy density of spacetime is calculated using only general relativity and acoustic equations. This energetic spacetime is shown to possess the properties required to be the new aether (Lorentz invariance, quantization of angular momentum, impedance, and quantum mechanical energy density). The contradictory wave-particle duality properties of a photon are resolved by a model in which a photon is a wave propagating in energetic spacetime but appears to have particle properties because it possesses quantized angular momentum. Compton scattering and the photoelectric effect are examined and found to be compatible with the proposed wave-based photon model.
Proc. SPIE 9570, Modeling superposition of 3- and N-polarized beams on an isotropic photo detector, 95701A (10 September 2015); doi: 10.1117/12.2188518
In a previous paper [SPIE Proc. Vol. 7063, paper #4 (2008)], we attempted to model, along with experimental validations, the possible modes of excitation that detecting dipoles carry out during the interaction process with EM waves before absorbing a quantum cupful of energy out of two simultaneously stimulating EM waves. Those experiments and analyses basically corroborate the law of Malus. For these two-beam cases, the cos θ factor (θ being the angle between the two polarization vectors) is too symmetric and too simple a case to assure that we are modeling the energy-absorption process definitively. Accordingly, this paper brings asymmetry into the interaction process by considering 3-beam and N-beam cases, to find out whether there are more subtleties behind the energy-absorption processes when more than two beams simultaneously stimulate a detector for the transfer of EM energy from these multiple beams. We have suggested a possible experimental setup for a three-polarized-beam experiment that we plan to carry out in the near future. We also present analyses for the 3-beam and simplified N-beam cases, and computed curves for some 3-beam cases. The results strengthen what we concluded in our two-beam experimental paper. We also recognize that the mode of mathematical analysis, based upon the traditional approach, may not be sufficient to extract any more details of the invisible light-dipole interaction processes going on in nature.
Proc. SPIE 9570, Tabletop demonstration of non-Interaction of photons and non-interference of waves, 95701B (25 September 2015); doi: 10.1117/12.2191014
Recently, the Non-Interaction of Waves (NIW) property has been proposed by one of the authors (CR) as a generic property of all propagating electromagnetic waves. In other words, optical beams do not interact with each other to modify or re-distribute their field-energy distribution in the absence of interacting materials. In this paper, the path taken to re-create CR's original demonstration of the NIW property as an on-site tabletop experiment is discussed. Since 1975, when the NIW demonstration was first reported, several advances in lasers and optical-component design architecture have occurred. With the goal of using low-cost components and having agility in setting up on non-conformable platforms for general viewing, a compact arrangement for demonstrating the NIW property was envisioned. In our experimental arrangement, a beam-multiplier element was utilized to generate a set of spatially separate, parallel beams out of an incident laser beam. The parallel beams emerging from the beam multiplier were then focused on a one-sided ground glass, the flat side facing the beam multiplier. This flat side reflects all the incident focused beams as fanning-out, independent laser beams, which remain unperturbed even though they reflect out of a common superposed spot. It is clear that there is neither "interference between different photons" nor "a photon interfering with itself", even within a region of superposed beams. In contrast, the ground-glass surface (the same silica molecules, but granular or lumpy) was anticipated to generate a set of crisp spatial fringes on its surface, as in the original experiment. The fringes arise because the granulated individual silica lumps respond simultaneously to the local resultant E-vectors due to all the superposed beams, scattering energy proportional to the square modulus of the sum of all the simultaneous dipolar amplitude stimulations.
The dark-fringe locations imply zero resultant amplitude stimulation and hence no scattering. Due to the multi-longitudinal-mode nature of the laser module, the fringes appeared washed out at the back side of the ground-glass plate. Experimental refinements, followed by our views on whether the fundamental physics behind the generation of superposition fringes by photodetectors differs from that due to a ground glass, are briefly discussed.
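The detector-response rule invoked in this abstract, scattering proportional to the square modulus of the summed dipolar amplitude stimulations, can be sketched numerically. This is our illustration with arbitrary amplitudes and phases, not the authors' computation:

```python
import cmath

def scattered_intensity(amplitudes, phases):
    """|sum of complex dipolar stimulations|^2 for N superposed beams."""
    total = sum(a * cmath.exp(1j * p) for a, p in zip(amplitudes, phases))
    return abs(total) ** 2

# Two equal beams in phase: constructive, four times a single beam's intensity.
print(scattered_intensity([1, 1], [0, 0]))
# Two equal beams pi out of phase: a dark fringe, essentially zero scattering.
print(scattered_intensity([1, 1], [0, cmath.pi]))
```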
Knowledge of the structural rules of the atomic nucleus and the properties of the vortex electromagnetic field allows us to construct, relatively precisely, the structures of individual atoms and molecules. The properties of atoms are largely described by the structure of their electron shells; however, the standard model of the atom does not allow this structure to be defined exactly. The new vortex-fractal-ring theory (VFRT) can address this shortcoming. VFRT uses the fractal ring structure of the electron, the proton and the neutron, and can describe the inner structure of atomic nuclei. Fractal descriptions of Nature are very promising. The atomic nucleus can be built from ring protons and neutrons. This new theory assumes that the arrangement of the electron shells arises from the structure of the atomic nucleus. Electrons are not in orbit around the atomic nucleus; rather, each electron levitates with the corresponding proton of the nucleus. The levitation bond between the electron and the proton is formed by an electromagnetic vortex structure. VFRT expands our understanding of nature through a new perspective on the evolution of lifeless nature, using vortex, fractal and ring substructures with self-organization, from quarks, electrons, protons and neutrons, through atoms and molecules, to the structure of complex organic compounds.
A critical error is found in the Special Theory of Relativity (STR): mixing up the concepts of the STR abstract time of a reference frame and the displayed time of a physical clock, which leads to using the properties of the abstract time to predict time dilation in physical clocks and all other physical processes. Actually, a clock can never directly measure the abstract time; it can only record the result of a physical process during a period of the abstract time, such as the number of cycles of oscillation, which is the product of the abstract time and the frequency of oscillation. After a Lorentz transformation, the abstract time of a reference frame expands by a factor gamma, but the frequency of a clock decreases by the same factor gamma, so the resulting product, i.e. the displayed time of a moving clock, remains unchanged. That is, the displayed time of any physical clock is an invariant of the Lorentz transformation. The Lorentz invariance of the displayed times of clocks can further prove, within the framework of STR, that our earth-based standard physical time is absolute, universal and independent of inertial reference frames, as confirmed both by the physical fact of the universal synchronization of clocks on the GPS satellites with clocks on the earth, and by the theoretical existence of the absolute and universal Galilean time in STR, which shows that time dilation and space contraction are pure illusions of STR. The existence of the absolute and universal time in STR directly denies that the reference-frame-dependent abstract time of STR is the physical time; therefore, STR is wrong and all its predictions can never happen in the physical world.
When we create mathematical models for the quantum theory of light, we assume that the mathematical apparatus used in modeling, at least the simplest mathematical apparatus, is infallible. In particular, this relates to the use of "infinitely small" and "infinitely large" quantities in arithmetic and the use of the Newton-Cauchy definitions of limit and derivative in analysis. We believe that is where the main problem lies in the contemporary study of nature. We have introduced a new concept of Observer's Mathematics (see www.mathrelativity.com). Observer's Mathematics creates a new arithmetic, algebra, geometry, topology, analysis and logic which do not contain the concept of the continuum, but locally coincide with the standard fields. We use Einstein's special relativity principles and obtain the analogue of the classical Lorentz transformation. This work considers this transformation from the Observer's Mathematics point of view.
From superclusters of galaxies down to the quarks in the proton, at all length scales the structure of matter is the result of a balance of forces. In this paper it is argued that with decreasing size there must necessarily be an increase in the fraction of kinetic and binding energy with respect to the total energy: smaller sizes require stronger forces, which represent more of the available energy. The smallest possible size of granularity is found where the internal kinetic energy and the total energy become comparable, which occurs at the size of the proton. We infer that the proton is the smallest stable particle, being a light-speed circulation of energy.
Proc. SPIE 9570, Are electrons oscillating photons, oscillating “vacuum," or something else? The 2015 panel discussion, 95701I (10 September 2015); doi: 10.1117/12.2205311
Platform: What physical attributes separate the EM waves of the enormous band from radio to visible to x-ray from the high-energy, narrow band of gamma rays? From radio to visible to x-ray, telescopes are designed based upon optical imaging theory, which is an extension of the Huygens-Fresnel diffraction integral. Do we understand the physical properties of gamma rays that defy our attempts to manipulate them similarly? One demonstrated unique property of gamma rays is that they can be converted into elementary particles (an electron-positron pair), or a particle-antiparticle pair can be converted into gamma rays. Thus, EM waves and elementary particles being inter-convertible, we cannot expect to understand the deeper nature of light without finding the structural inter-relationship between photons and particles. This topic is directly relevant to developing a deeper understanding of the nature of light, which will, in turn, help our engineers invent better optical instruments.