Infinity is not self-contradictory... it only contradicts the idea that "infinity" is a quantity. Infinity is used in limits, and is essential to understanding many different parts of calculus.
Never, and I mean never, confuse the concept of infinity in limits (something that never stops growing) with the concept of infinite sets (infinity as a quantity). They are in no way related, and in fact the size of an infinite set is not represented by the symbol ∞ but by the aleph symbol:
http://en.wikipedia.org/wiki/Aleph_number
Here is the Taylor expansion of e^x:
e^x = 1 + x/1! + x²/2! + x³/3! + ... ,  -∞ < x < ∞
Notice it uses positive and negative infinity to state, in mathematical terms, that x can be any real number, positive or negative.
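You can check this convergence numerically. Here's a quick Python sketch (the `exp_series` helper is my own name, nothing from the thread) that sums the series and compares it with `math.exp` for both negative and positive x:

```python
import math

def exp_series(x, terms=60):
    # Partial sum of e^x = 1 + x/1! + x^2/2! + x^3/3! + ...
    total = 0.0
    term = 1.0                # current term x^n / n!, starting at n = 0
    for n in range(terms):
        total += term
        term *= x / (n + 1)   # turn x^n/n! into x^(n+1)/(n+1)!
    return total

# The series converges for every real x, negative or positive:
for x in (-5.0, 0.0, 1.0, 10.0):
    print(x, exp_series(x), math.exp(x))
```

60 terms is plenty for these inputs; the series converges for all real x, just slower as |x| grows.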
Again, here you're referring to the limit sense of infinity. The expression -∞ < x < ∞ is just notation, and is only well-defined if you define the operator < for the extended real numbers:
http://en.wikipedia.org/wiki/Extended_real_number
However, the extended real numbers don't behave as well as the standard real numbers, since indeterminate forms (such as ∞ - ∞ or ∞/∞) cannot be evaluated.
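A rough way to see the problem: IEEE 754 floats, which Python uses, behave a lot like the extended reals, and they answer NaN ("not a number") for exactly the indeterminate forms. A small sketch:

```python
import math

inf = math.inf

# These extended-real operations are well-defined:
print(inf + 1)           # still inf
print(-inf < 0 < inf)    # True: the ordering -inf < x < inf

# But the indeterminate forms have no defined value, and
# IEEE 754 arithmetic returns NaN for them:
print(inf - inf)         # nan
print(inf / inf)         # nan
print(0.0 * inf)         # nan
```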
"lrn2divide
You cannot do that if x = 0."
You most likely say that I cannot because you don't understand the idea of 0 divided by itself. That is okay... I just recommend that you explore the concept.
(The calculator does not accept it because it is programmed not to accept division by 0. Here's a thought experiment: if you have 0 cookies shared amongst 0 people... how many cookies get shared?)
I recommend you explore the concept:
6/3 = 2 because 2 * 3 = 6
21/7 = 3 because 3 * 7 = 21
But 0/0 = 0 because 0 * 0 = 0
and 0/0 = 3 because 0 * 3 = 0
and 0/0 = a because 0 * a = 0
It's not well-defined at all. The result of an operation cannot be "any real number works". Even if you define a to be not a real number, but a new number which behaves differently (similar to how the imaginary unit was created) you'll get tons of contradictions when you work with that new number. So sorry but 0/0 is not defined.
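The argument above can be sketched in Python (the `divide` helper is hypothetical, just to make the defining property explicit): a/b is only well-defined when exactly one q satisfies q * b == a, and for 0/0 every candidate q passes that check.

```python
def divide(a, b):
    # a / b is defined as the unique q with q * b == a.
    # When b == 0 there is no unique q (none if a != 0,
    # every number if a == 0), so refuse.
    if b == 0:
        raise ZeroDivisionError("no unique q satisfies q * 0 == %r" % a)
    return a / b

# 6/3 = 2 because 2 * 3 == 6:
assert divide(6, 3) == 2

# For 0/0, every candidate q passes the defining check q * 0 == 0:
candidates = [0, 1, 3, -2.5]
assert all(q * 0 == 0 for q in candidates)
```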
"To be most precise, when you divide by x on both sides, what you actually have is 1(x/x) = (x/x). This is true. You may then replace (x/x) with 1 only if you assert that x ≠ 0."
That's not true. The assumption that x ≠ 0 has to be stated BEFORE dividing by x. That's the point of most of the 1 = 2 "proofs" around the internet. Keep in mind that x is not a variable here: since you have not defined any function, it is a constant, just a number. 1x = x is always true, but dividing both sides by x to get 1(x/x) = (x/x) is not a valid step if x = 0.
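In code terms, the cancellation x/x → 1 is only legal behind a guard. A minimal sketch (the `cancel` helper is made up for illustration):

```python
def cancel(x):
    # Replace x/x by 1 -- valid only after asserting x != 0 first.
    assert x != 0, "cannot cancel x/x when x = 0"
    return x / x

assert cancel(3.0) == 1.0
# cancel(0) would fail the assertion instead of "proving" 0/0 = 1.
```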
Graph the function [1/x=y] in the window
Xmin=-1
Xmax=1
Ymin=-20
Ymax=20
Here's where it comes to understanding infinity:
The function completely dodges the value 0: the smaller x gets in absolute value, the greater the absolute value of y. In order to get a VERY small result you need a VERY large number under the division sign. No matter what x equals, y cannot equal zero. You can cut up a cookie as many times as you want, but you will still have crumbs.
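You can watch this happen numerically. A short Python sketch sampling y = 1/x as x shrinks toward 0 from the right:

```python
# Sample y = 1/x at x values shrinking toward 0:
for k in range(7):
    x = 10.0 ** (-k)      # 1, 0.1, 0.01, ...
    y = 1.0 / x
    print("x =", x, " y =", y)

# |y| grows without bound, but y never equals 0 for any x:
assert all(1.0 / 10.0 ** (-k) != 0 for k in range(7))
```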
"x/x = 1? where x is zero? i think not." - I know basic algebra, people; I was just going beyond it. I was trying to get a concept across. Notice the dissonance at the end.
>.>
The function does not dodge the value 0. It has a hole right there. The hole is smaller than anything you will ever be able to plot, but it is there, and 0/0 is not defined. (almost 0)/(almost 0) is defined, however, and the result sure is 1; but as stated above, if you try to assert that 0/0 = 1 it'll result in lots of contradictions, including 1 = 2.
In fact, here is that proof:
a = b
ab = b²
-(ab) = -(b²)
a²-ab = a²-b²
a(a-b) = (a+b)(a-b)
a = a+b   (dividing both sides by a-b, which is 0 since a = b: this is the invalid step)
a = 2a
1 = 2
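You can verify in Python that every line of the proof is a true statement right up until the division by (a - b), which is division by 0. A sketch with a = b = 1:

```python
a = b = 1   # the proof starts by assuming a = b

# Every line up to the factored form is a true statement:
assert a * b == b ** 2
assert a ** 2 - a * b == a ** 2 - b ** 2
assert a * (a - b) == (a + b) * (a - b)   # both sides are 0

# The step "a = a + b" divides both sides by (a - b), which is 0,
# i.e. it silently assumes 0/0 = 1 -- and the conclusion is false:
assert a != a + b
```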
gl hf