docnet
TL;DR Summary: Confusion about infinity
We have come to accept that infinity times two is infinity. But in the sense of 'size' we use for everyday numbers, the rules of arithmetic with infinities seem like nonsense. For example, consider the computable number
$$0.100100100100100\ldots$$
In the decimal expansion, there are clearly twice as many zeros as there are ones. In fact, for any finite ##n##, if we partition the expansion into strings of ##3n## numerals, each string contains ##100\%## more zeros than ones. Yet in the full infinite expansion, we learned that ones and zeros each occur ##\aleph_0## times, implying the two sets of digit positions are equal in size. This, to me, seems to fall short of a precise definition of what we call 'infinity'.
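To make the counting explicit (indexing the digits after the decimal point from ##1##, a convention I am adding here): the ones sit at positions ##3k+1## for ##k = 0, 1, 2, \dots##, and the zeros fill the rest. Among the first ##3n## digits,
$$\#\{\text{zeros}\} = 2n, \qquad \#\{\text{ones}\} = n, \qquad \frac{2n}{n} = 2 \quad \text{for every finite } n,$$
yet both counts diverge, and the map ##k \mapsto 3k+1## already puts the ones' positions in bijection with ##\mathbb{N}##, so both position sets have cardinality ##\aleph_0##. The ratio ##2## is a statement about finite prefixes; ##\aleph_0## compares the two infinite sets as wholes.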
I guess I am not satisfied with the mysterious reality that math doesn't have answers to such puzzling apparent contradictions, where our intuitions break down. Must this mean that the language we use to express mathematics, and mathematics itself, ultimately supersedes our 'intuition', and must be accepted as such without question?
edit: changed ##50\%## to ##100\%##.