No, I didn't. I repeat my post here so you can check again:
Assuming infinity does not exist in any way, how do you do basic arithmetic like 1 = 3 x 1/3 = 3 x 0.333... = 0.999...?
Clearly, 1 isn't equal to any finite decimal, like, say, 0.99999999999. We normally understand "0.999..." to mean an infinity of 9's, which is why it makes sense to regard 0.999... as equal to 1. If you think infinities don't exist, the conventional way of interpreting "0.999..." has to be discarded for good. The same goes for very many other arithmetic operations, like 1/7, 1/11, 1/17, 1/29, etc. So, what do you propose instead?
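To make the point concrete, here is a short sketch (Python, standard library only; the helper `repetend_length` is my own illustrative name): every finite truncation of 0.999... misses 1 by exactly 1/10^n, and the decimal expansions of 1/7, 1/11, 1/17 and 1/29 repeat forever, with period given by the multiplicative order of 10 modulo the denominator.

```python
from fractions import Fraction

# A finite string of n nines equals (10**n - 1) / 10**n,
# which falls short of 1 by exactly 1/10**n -- never zero for finite n.
for n in (1, 5, 10):
    gap = 1 - Fraction(10**n - 1, 10**n)
    print(n, gap)

def repetend_length(q):
    """Length of the repeating block in the decimal expansion of 1/q,
    for q coprime to 10: the multiplicative order of 10 modulo q."""
    k, r = 1, 10 % q
    while r != 1:
        r = (r * 10) % q
        k += 1
    return k

for q in (7, 11, 17, 29):
    print(q, repetend_length(q))  # periods 6, 2, 16, 28
```

So in exact rational arithmetic the gap never closes at any finite stage; only the limit of infinitely many 9's equals 1.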
And pi? The number pi is understood as having an infinity of decimal digits, with no sequence ever repeating. What do you propose to do instead?
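By way of illustration (a sketch, not a proposal): in any actual computation we only ever use finite approximations of pi, each carrying a definite, nonzero error.

```python
import math
from fractions import Fraction

# Every concrete computation replaces pi with a finite approximation;
# even math.pi is just the double-precision float closest to pi.
for approx in (Fraction(22, 7), Fraction(355, 113)):
    error = abs(float(approx) - math.pi)
    print(approx, error)  # nonzero in both cases: neither fraction *is* pi
```

The infinite, non-repeating expansion is what the symbol "pi" stands for; the finite numbers above are only stand-ins for it.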
Where do I "pretend he denied"?
Further, your interpretation of my phrase "Assuming infinity does not exist in any way" is obviously off. As I see it, maths is just an idea, an abstraction we use in our representations and models of the physical world. So, to me, infinity, like any mathematical entity, isn't something that can be said to exist in any way.
Given this, and considering that Seattle thinks infinity doesn't exist in the physical world, my question to Seattle is therefore to explain why infinity is so pervasive in our mathematical representations of the physical world and what alternative method he suggests we should use.
My point is that infinity is at the heart of our most basic mathematical representations of the world. It's not just something QM physicists negligently let slip into their equations. It was already there a very long time ago, when human beings started to use symbols to represent integers. As I see it, infinity is a direct consequence of the way the human mind works. I don't think we can do without it.
So, assuming infinity does not exist in any way in the physical world, how do you do basic arithmetic like 1 = 3 x 1/3 = 3 x 0.333... = 0.999... so that it reflects the finiteness of the world?
EB