This is a question that I have heard debated for a long time now. For me, it first became a topic of discussion right out of high school, when a co-worker suggested I watch Al Gore's "An Inconvenient Truth." Although there are issues with the documentary and I don't necessarily agree with all of his points, I do think this is a subject we need to discuss more.
I found the following article really interesting in that it took a very neutral stance.
No matter what side of the political aisle you sit on, we all need to agree that the earth is getting warmer (roughly 1°C over the past century). Whether we humans play a huge role in this can be debated, but why not at least attempt to change the future? It doesn't matter if you think Al Gore was crazy, or if you think we will all be fighting over the last habitable space on earth near the poles in a few years.
I think we owe it to all humankind to be better stewards of this beautiful earth we have been given.
Here is a link to the article: Is climate change real, and is the world actually getting warmer?