# Measuring coupling strengths

This is for Sylwester Kornowski, after the issue of measuring α_s, the strong coupling constant, came up in yet another thread of his over on SciForums. This not being the first time he and I have been down this particular road, I'm writing it up here for future convenience.

Sylwester makes, among his repetitive ramblings, two regular claims: that the Standard Model (SM) is a farce which will very soon be overturned, its proponents shamed, and that he is able to predict the value of α_s with incredible accuracy and predict its running better than the SM. Let's consider what that means for a moment, namely what precisely α_s is. It is the strong force's version of the fine structure constant α in electromagnetism.

In quantum electrodynamics (QED) the photon gauge field A_μ couples to a charged field, say the electron ψ, by a term of the form e ψ̄ γ^μ ψ A_μ. If e = 0 there is no coupling and the electron is not charged under electromagnetism. The charge e is not dimensionless, but it's possible to construct a dimensionless quantity which depends on it, namely the fine structure constant α = e²/(4πε₀ħc). When a quantum field theorist wants to answer a question like "What's the scattering behaviour of bouncing electrons off one another?" they will do plenty of calculations involving Feynman diagrams and obtain an answer which is a function of α. This means they have a formula which relates physically measurable things like momenta, angles and particle counts to something which is *not* directly measurable, namely the value of α. It's by this method that α can be calculated. For obvious reasons all electromagnetic processes have a dependency on α, so by doing experiments involving charged particles we can obtain QED-based predictions for the value of α in many different ways. If the values from different experiments didn't agree we'd know there's something wrong with QED.
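The dimensionless combination above is simple arithmetic on measured constants. A minimal sketch, using CODATA-style SI reference values (these numbers are inputs taken from measurement, not outputs of any theory discussed here):

```python
import math

# SI reference values (CODATA-style), treated here as measured inputs
e = 1.602176634e-19           # elementary charge, C
hbar = 1.054571817e-34        # reduced Planck constant, J s
c = 299792458.0               # speed of light, m/s
epsilon_0 = 8.8541878128e-12  # vacuum permittivity, F/m

# Fine structure constant: alpha = e^2 / (4 pi epsilon_0 hbar c)
alpha = e**2 / (4 * math.pi * epsilon_0 * hbar * c)
print(alpha, 1 / alpha)  # 1/alpha comes out at roughly 137.036
```

Of course, in practice the logic runs the other way: experiments measure scattering rates and the formulae are inverted to extract α.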

Now the important point here is that the value we obtain for α is dependent on the model we used. Suppose we thought there wasn't just an electron, muon and tau but also a fourth charged lepton more massive than the tau. In some experiments it would contribute to the behaviour of the particles, and when we worked through the predictions we'd end up getting a different value for α. It's precisely effects like this which allowed us to discover the W/Z bosons and the top/bottom quarks in the first place, as they contribute to processes dependent on all of the couplings: α, α_s and G_F (the electroweak Fermi constant). So processes which involve quarks, gluons or anything else charged under the strong force allow us to compute a prediction for α_s, but it is dependent on the model we use. Altering the SM would alter the predicted values.
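The model dependence is easy to see in the running of α_s itself. A sketch of the standard one-loop QCD formula, where the particle content enters through n_f, the number of quark flavours light enough to contribute (real extractions use higher-loop running and flavour-threshold matching, so this is illustrative only):

```python
import math

def alpha_s(Q, alpha_s_mz=0.118, mz=91.1876, n_f=5):
    """One-loop QCD running of the strong coupling from the Z mass to scale Q (GeV).

    alpha_s(Q) = alpha_s(M_Z) / (1 + b0 * alpha_s(M_Z) * ln(Q^2 / M_Z^2)),
    with b0 = (33 - 2*n_f) / (12*pi). Changing n_f (the assumed particle
    content) changes the prediction at every scale.
    """
    b0 = (33 - 2 * n_f) / (12 * math.pi)
    return alpha_s_mz / (1 + b0 * alpha_s_mz * math.log(Q**2 / mz**2))

print(alpha_s(1000))          # smaller at high energy: asymptotic freedom
print(alpha_s(1000, n_f=6))   # different assumed particle content, different value
```

The same measured cross-section, interpreted with a different n_f, would therefore yield a different extracted α_s, which is exactly the model dependence described above.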

Now recall Sylwester doesn't like the SM; he thinks it's rubbish and should be binned without delay. Yet at the same time he likes to tell people how his 'theory' outputs the value of α_s so accurately. So he's simultaneously denouncing a model while promoting the fact his work is consistent with it?

Any model attempting to replace the SM would have to demonstrate that when it is applied to *raw* experimental data it is consistent. It's a serious job; a great many physicists spend their careers converting raw data into SM predictions and then comparing them. For example, here is a paper published by an experimental collaboration showing the model-dependent processing which must be done, followed by a statistical analysis of the results and a comparison with other methods. Here is a similar collaboration paper which looks at slightly different processes but does much the same analysis. If widely different answers were obtained the model would be called into question.
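The "comparison with other methods" step is, at its core, a weighted combination and a consistency test. A sketch of that logic, using made-up numbers purely for illustration (these are not real measurements):

```python
import math

# Hypothetical independent extractions of alpha_s(M_Z): (value, uncertainty)
measurements = [
    (0.1179, 0.0010),
    (0.1185, 0.0016),
    (0.1170, 0.0019),
]

# Inverse-variance weighted average and its uncertainty
weights = [1 / s**2 for _, s in measurements]
mean = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
err = 1 / math.sqrt(sum(weights))

# Chi-squared per degree of freedom: a large value would signal that the
# different extraction methods disagree, calling the model into question
chi2 = sum(((v - mean) / s) ** 2 for v, s in measurements)
print(f"combined = {mean:.4f} +/- {err:.4f}, chi2/dof = {chi2 / (len(measurements) - 1):.2f}")
```

Real combinations are far more involved (correlated systematics, scheme and scale choices), but this is the shape of the test a replacement model would have to pass against raw data.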

This processing of data goes on all the time: CERN had to set up dedicated high-speed links to other research facilities around the world in order to transmit the many petabytes of data the LHC produces. It takes a long time to boil that down to values for the parameters in the SM. Sylwester seems completely unaware this is even done, let alone necessary, and that's after I've explained it to him several times over the course of years. Until Sylwester does this he is failing to justify his claims about correctly modelling experiments.

His reply:

> Can you within the SM calculate the physical constants?
>
> Can you within the SM calculate the neutrino speeds higher than the c? Are they consistent with the MINOS and OPERA experiments and the data concerning the supernova SN 1987A within one coherent model?
>
> Can you calculate within the SM the masses of the all leptons and quarks?
>
> Can you within the SM calculate the asymptotic value for the alpha-strong for the very high energies?
>
> No. No. No. No. No.
>
> I did it within the Everlasting Theory applying 7 parameters only. Calculated values are as follows (system SI):
>
> G: 6.6740007\*10^-11
>
> Planck constant/2\*pi: 1.054571548\*10^-34
>
> c: 299,792,458
>
> e: 1.60217642\*10^-19
>
> Fine-structure constant for low energies: 1/137.036001
>
> Neutrino speed depends on lifetime of particles which decay due to the weak interactions. The calculated neutrino speed for the MINOS experiment is 1.000051(21)c. The maximum neutrino speed is 1.000072c. The calculated time-distance between the fronts of the neutrino and photon beams for the OPERA experiment is 59.3 ns whereas the neutrino speed is 1.0000172(71)c i.e. maximum neutrino speed is 1.0000243c. The calculated time-distance between the fronts of the neutrino and photon beams, observed on the Earth, for the supernova SN 1987A is 3 hours whereas the neutrino speed is 1.0000000014(6)c.
>
> Masses: see the tables.
>
> Alpha_strong for very high energies: 0.1139
>
> So which theory is better?

As I replied on SciForums, you’re being blatantly dishonest.

For example, you ask whether the SM can predict the asymptotic value of α_s. You know full well it can. Asymptotic freedom is something you and I have repeatedly discussed. You've even explicitly stated you know what the SM value of α_s at various TeV-scale energies is, because you predict a different amount. So by saying "No" you're flat out lying. You are a liar. Your reason for saying 'no' is that the SM doesn't agree with you. What you should have asked is "Does the SM agree with my claims?"; then the answer is indeed no, but the answer to whether it predicts a value is YES.

Then consider what you say about α_s in general. As this post explains, if you change the SM you change the value of α_s extracted from experiments. Likewise for precision measurements of the other constants. They depend on how the SM thinks particles interact: their masses, their spins, their compositions. Change the SM, change the values.

Until you work with raw experimental data your claims are self-contradictory. Your inability to understand this shows how poor your grasp of science is.

His reply:

> In my theory, there is paragraph titled 'New interpretation of the Uncertainty Principle'. From it follows that the renewable particles such as electrons, sham quarks, muons appear in appropriate fields as the gluon or photon loops with spin equal to 1 and then transform into the gluon or photon particle-antiparticle pairs and then there is their collapse to gluon or photon balls. Then they disappear in one place of field and appear in another one, and so on.
>
> It looks as follows: gluon or photon loop–>torus–>ball, next loop–>torus–>ball, and so on.
>
> My non-perturbative theory is associated with the loop and torus and partially with the ball whereas the perturbative theories are associated with the balls. Lifetime of the torus is 2\*pi\*r/c whereas lifetime of the ball stadium is r/c. This means that the non-perturbative stadium lasts 2\*pi longer than the perturbative stadium.
>
> You should not compare the non-perturbative math and stadium with the perturbative math and stadium.