
I think it's just a perspective shift. The main idea is that you can't ever measure a real number, only an approximation to one, so if two values differ by less than the resolution of your measurement, they are effectively the same. For example, consider the expansion f(x+dx) = f(x) + f'(x) dx + O(dx^2). The analysis version of the derivative says that in the limit dx -> 0 the O(dx^2) part vanishes, so the limit of [f(x+dx)-f(x)]/dx is f'(x). The 'finitist' version would be something like: the third term is of order dx^2, so pick a value of dx small enough that dx^2 is below your 'resolution', and then f'(x) is indistinguishable from [f(x+dx)-f(x)]/dx, with no reference to the concept of a limit.
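As a rough sketch of that recipe (assuming, purely for illustration, f = sin, x = 1.0, and a measurement 'resolution' of 1e-8; none of these values come from the comment itself), in Python:

    import math

    def finitist_derivative(f, x, resolution=1e-8):
        # Pick dx just small enough that the neglected O(dx^2) term
        # sits below the resolution -- no limit is ever taken.
        dx = math.sqrt(resolution)   # so that dx**2 == resolution
        return (f(x + dx) - f(x)) / dx

    approx = finitist_derivative(math.sin, 1.0)
    exact = math.cos(1.0)   # the "analysis" answer, for comparison
    print(approx, exact, abs(approx - exact))

With dx = 1e-4 the two numbers agree to within about 5e-5, so once the quotient itself can only be measured to that accuracy they read as the same value.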



Yes, but I was thinking more about how you'd do any kind of "and it vanishes" or even "becomes sufficiently small" argument with a gappy number system, since the quantity would have to pass through gaps where "undefined" non-rationals exist.

I guess my stance (which is not very well developed or anything) is that you try to learn to live with the gaps: define everything in terms of only what you can measure, and it no longer matters whether a number is rational or irrational, or infinitesimal vs small-but-finite, because you can't tell the difference. Instead of saying "it vanishes" as an absolute statement, you say "it appears to vanish from my perspective".
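A tiny sketch of that "can't tell" notion (the 1e-10 resolution here is a made-up value, not something from the comment):

    import math

    def same_at_resolution(a, b, resolution=1e-10):
        # Equality "from my perspective": differences smaller than the
        # resolution are invisible, rational or not.
        return abs(a - b) < resolution

    print(same_at_resolution(math.pi, 3.14159265358979))  # True
    print(same_at_resolution(math.pi, 3.1415926))          # False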


