Hi Tom,

Think about what you're saying. If the speed of light changed between reference frames, then the laws of optics (for example) would be different in different reference frames. That is clearly not the case, so something must still be missing from your picture. Let's see... The question is: who is measuring the time? If it's the moving object, remember that its clock ticks slower -- so in the end it still measures precisely c as the speed of light. If you are making your measurements from some other frame of reference, then of course you would have the impression you described above. But then, in your own reference frame you are still measuring the speed of light to be c, aren't you? The point is, everybody measures the speed of light as c, from any inertial reference frame.

That is precisely the opposite of what happened. From the start, Einstein assumed the speed of light to be a constant -- c -- for all inertial frames of reference; he was led to that assumption by consideration of Maxwell's electromagnetic equations describing light. In fact, Einstein was always very fond of Maxwell, and he was quite distressed by the conflict between the constancy of lightspeed that falls out of Maxwell's equations and the theoretically variable (zero to infinite) speed of light in Newtonian physics. Once he made his choice and stuck with Maxwell, he proceeded to work out the implications. To his possible surprise and probable delight, he discovered that the outcome is a self-consistent mathematics that reduces to Newtonian mechanics in the low-velocity limit.

Of course, the Lorentz transformations were never just artificially inserted into the theory; on the contrary, they are a direct logical consequence of the original premises (constancy of lightspeed and the principle of relativity). I have a book on relativity that derives the Lorentz transformations entirely from those two premises (in fact, the entire theory follows from them). If you want, I can reproduce that derivation for you.
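If it helps, here's a quick numerical sketch of both claims above, using the relativistic velocity-addition formula u' = (u + v)/(1 + uv/c^2), which is itself a consequence of the Lorentz transformations. (The function name and the chosen test speeds are just mine for illustration.) Feeding in u = c shows that a light beam comes out at c no matter how fast the observer's frame moves; feeding in everyday speeds shows the Newtonian answer u + v recovered to within rounding.

```python
# Relativistic addition of collinear velocities:
#   u' = (u + v) / (1 + u*v/c^2)

C = 299_792_458.0  # speed of light in m/s

def add_velocities(u, v, c=C):
    """Combine collinear velocities u and v per special relativity."""
    return (u + v) / (1 + u * v / c**2)

# A light beam (u = c) viewed from a frame moving at 0.9c:
# the result is still c (up to floating-point rounding), not 1.9c.
print(add_velocities(C, 0.9 * C))

# Two everyday speeds (30 m/s and 20 m/s): the correction term
# u*v/c^2 is ~7e-15, so the answer is Newtonian 50 m/s for all
# practical purposes -- the low-velocity limit mentioned above.
print(add_velocities(30.0, 20.0))
```

Notice that no matter what v you pick, plugging u = c in algebraically gives (c + v)/(1 + v/c) = c exactly; the code just confirms it numerically.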