Mentioned in post #26. Since the definition of \(f'(x_0)\) is \( \lim_{h\to 0} \frac{f(x_0 + h) - f(x_0)}{h}\), in any small neighborhood of \(x_0\) we have \(f(x) \approx f(x_0) + ( x-x_0) f'(x_0)\). So if both \(f\) and \(g\) have derivatives at \(x_0\), it follows from the definition of the derivative that \(\lim_{x\to x_0} \frac{f(x)}{g(x)} = \lim_{x\to x_0} \frac{f(x_0) + ( x-x_0) f'(x_0)}{g(x_0) + ( x-x_0) g'(x_0)}\), provided the latter limit exists. In particular, when \(f(x_0) = g(x_0) = 0\), the right-hand side reduces to \(f'(x_0)/g'(x_0)\). This generalizes to Taylor series approximations of all finite orders, but evaluation need not go past the first ratio of terms which is not 0/0.
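To illustrate the idea, here is a small sketch (my own, not from the thread) using sympy. It checks two limits of the 0/0 form against the ratio-of-derivatives shortcut described above: \(\sin x / x\) at 0, where the first derivatives already give the answer, and \((1-\cos x)/x^2\) at 0, where the first derivatives are again 0/0 so one goes to the second-order terms.

```python
import sympy as sp

x = sp.symbols('x')

# Example 1: f(0) = g(0) = 0, first derivatives suffice.
f, g = sp.sin(x), x
direct1 = sp.limit(f / g, x, 0)                       # direct limit
ratio1 = (sp.diff(f, x) / sp.diff(g, x)).subs(x, 0)   # f'(0)/g'(0) = cos(0)/1

# Example 2: first derivatives are also 0 at 0 (sin(0) = 0, 2*0 = 0),
# so take the ratio of second-order terms instead.
f2, g2 = 1 - sp.cos(x), x**2
direct2 = sp.limit(f2 / g2, x, 0)
ratio2 = (sp.diff(f2, x, 2) / sp.diff(g2, x, 2)).subs(x, 0)  # cos(0)/2

print(direct1, ratio1)   # both 1
print(direct2, ratio2)   # both 1/2
```

In each case the shortcut agrees with the direct limit, and in the second example evaluation stops at the first ratio of terms that is not 0/0, just as the post says.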
Correct. Thank you for not telling me the whole thing. I did remember you'd mentioned it once I found it. It was good for me to work it out for myself from there.
And since I'm handing out kudos, thanks to you as well, Dinosaur. I'd never have known what to look for in the first place without you.