@kennethamy,
The only thing Goedel says is that we cannot develop a recursively axiomatizable theory strong enough to express basic arithmetic (this requirement comes originally from the successor function, s(0)=1 and s(x)=x+1; hence people say that it holds for any system 'strong enough for counting') that is both provably consistent, i.e. there is no sentence such that both it and its negation are provable (among other equivalent definitions), and complete, in the sense that every sentence of the theory that is true can be proven true and every sentence that is false can be proven false.
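To make 'strong enough for counting' a bit more concrete, here is a small Python sketch (my own illustration, not anything from Goedel) of a Peano-style encoding where the successor function really is all you start from, and addition is defined recursively on top of it:

```python
# Toy Peano naturals: zero is the empty tuple, and the successor s(n)
# wraps n in one more layer of nesting. Counting is just nesting depth.
ZERO = ()

def s(n):
    """Successor: s(n) = n + 1, i.e. one more layer of nesting."""
    return (n,)

def to_int(n):
    """Decode a Peano numeral back into an ordinary Python int."""
    count = 0
    while n != ():
        n = n[0]
        count += 1
    return count

def add(a, b):
    """Addition from the successor alone: a + 0 = a, a + s(b) = s(a + b)."""
    if b == ():
        return a
    return s(add(a, b[0]))

one = s(ZERO)   # s(0) = 1
two = s(one)
print(to_int(add(two, two)))  # prints 4
```

Everything arithmetic in Peano's axioms is built up this same way, which is why "strong enough for counting" already pulls in the whole incompleteness phenomenon.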
This means that we probably cannot build any kind of foundation for mathematics from first order logic (what Russell wanted to do with PM, and what Frege wanted to do before him). This doesn't mean that mathematics is in any way, shape, or form undermined. It just means everything isn't as nice and tidy as people wanted it to be.
The really important thing is consistency. If you can prove something both true and false, you have a problem. Fortunately, calculus is safe, because
Gaisi Takeuti and friends showed that the real numbers are consistent (the real numbers are called 'second order arithmetic' in the wiki; you can read about that if you want, it's pretty cool. It seems like the Continuum Hypothesis must have been assumed, so if that puts you off then you may be in trouble).
A very nice result of the work done by Gödel and Turing is that we know what is and is not computable in theory.
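As a concrete taste of "not computable", here is Turing's diagonal argument sketched in Python (the function names here are my own; `halts` is hypothetical, and the whole point is that no real implementation of it can exist):

```python
# Sketch of the halting-problem argument. Suppose, for contradiction,
# that halts(f, x) could always decide whether f(x) eventually halts.
def halts(f, x):
    """Hypothetical oracle: True iff f(x) halts.
    By Turing's theorem no total, correct version can be written."""
    raise NotImplementedError("uncomputable")

def diagonal(f):
    """Do the opposite of whatever halts predicts about f(f)."""
    if halts(f, f):
        while True:   # predicted to halt, so loop forever
            pass
    return "halted"   # predicted to loop, so halt immediately

# diagonal(diagonal) contradicts any answer halts could give:
# if halts says it halts, it loops; if halts says it loops, it halts.
```

This is the same self-reference trick Gödel uses, just wearing a machine costume instead of an arithmetic one.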
I think that a very relevant topic is that of unification. The use of unification in logic programming is precisely that of being able to replace logically equal objects with one another. The idea is to have a way to automatically identify logically equivalent formulas (typically by reducing them to a normal form, i.e. by having a canonical way to write them out and an algorithm to convert them to that form).
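To show what unification actually computes, here is a minimal Robinson-style unifier in Python. The term representation is my own assumption: variables are strings starting with an uppercase letter (Prolog-style), and compound terms are tuples like ("f", "X"):

```python
def is_var(t):
    """Variables are strings that start with an uppercase letter."""
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    """Follow variable bindings in the substitution to a final value."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst=None):
    """Return a substitution making a and b equal, or None on failure."""
    if subst is None:
        subst = {}
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        return {**subst, a: b}
    if is_var(b):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

# f(X, g(Y)) unifies with f(a, g(b)) under the substitution {X: a, Y: b}
print(unify(("f", "X", ("g", "Y")), ("f", "a", ("g", "b"))))
```

For brevity this sketch omits the occurs check that a production unifier needs (to reject, e.g., unifying X with f(X)); real Prolog engines handle that and much more.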