I found a quite easy way of measuring ei (the real value of e) in coulombs and comparing the "energies" of e and ei. Since single-electron transistors (SET) have evolved in accuracy and speed, one could count the elementary charges in a current, not only in theory. The SI ampere is defined so that 1/e elementary charges per second, times e, gives 1 A. Well, this is always true (e/e = 1), but since the CODATA e is about 1% off, such proofs have not been published; they would prove me right while others struggle with an impossible 1% error.
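To make the "counting charges in a current" idea concrete, here is a minimal Python sketch (my own illustration, not part of the original argument) of how many elementary charges per second the SI definition implies for a 1 A current, using the exact SI value of e:

```python
# Minimal sketch: with the SI ampere fixed by the value of e,
# a current of 1 A corresponds to 1/e elementary charges per second.

CODATA_E = 1.602176634e-19  # exact SI value of e, in coulombs

def electrons_per_second(current_amps: float, e_value: float = CODATA_E) -> float:
    """Number of elementary charges per second implied by a given current."""
    return current_amps / e_value

if __name__ == "__main__":
    # About 6.2415e18 electrons per second for 1 A.
    print(f"1 A corresponds to about {electrons_per_second(1.0):.4e} electrons/s")
```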
So a measurement as follows: a current source, an electron counter (SET) and an ammeter in series. Im = measured current, tm = measured time, nm = count during tm. The measured elementary charge is em = Im*tm/nm. Simple, but when e is ~1% wrong it does not give the CODATA e. The ammeter is calibrated to CODATA and has a "built-in" e value et (e tuning), and I add the true value ei. We get nm = (Im*tm/ei)*(et/ei), where the parenthesis corrects for the ammeter "adjustment", so nm = Im*tm*et/ei^2. We know et = CODATA e, and combining nm = Im*tm*et/ei^2 with em = Im*tm/nm gives em = ei^2/et, i.e. ei^2 = em*et. I have proved my theory right, since the measured em will be about 1.568845*10^-19 C and (1.60217*10^-19 * 1.568845*10^-19)^(1/2) = 1.585*10^-19 C.
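Here is a short Python sketch of that bookkeeping, using only the numbers quoted above. The helper arguments Im, tm and nm are placeholders for whatever a real SET-counting run would record; em and et are the values stated in the text:

```python
import math

def measured_e(I_m: float, t_m: float, n_m: float) -> float:
    """em = Im * tm / nm: charge per counted electron from the series setup."""
    return I_m * t_m / n_m

def corrected_e(e_m: float, e_t: float) -> float:
    """The claimed relation ei^2 = em * et, i.e. ei = sqrt(em * et)."""
    return math.sqrt(e_m * e_t)

if __name__ == "__main__":
    e_t = 1.60217e-19    # CODATA e as quoted in the post (coulombs)
    e_m = 1.568845e-19   # measured em claimed in the post (coulombs)
    # Prints about 1.585e-19 C, matching the number given above.
    print(f"ei = {corrected_e(e_m, e_t):.4e} C")
```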
Every ammeter in the world is adjusted to prove me right (they have other uses too).
Seriously, I am very sure this has been done thousands of times, but somehow there is a 2% insoluble error. It would make me happy to see one of these measurements before they recalibrate all SET microchips to apply the incredibly important SI definitions.
Now explain this without using any mathematics in your explanation.