*The Final Gift* is a very powerful book, and I agree with Dotsie that it's well worth a read.

It's amazing to me how hard our American culture tries to pretend that death is not a part of life. Like it's somehow *un*natural. Now doesn't that seem crazy when you really think about it?

I think we lost something important as a culture when we moved death almost exclusively out of the home and into a hospital setting.

(Not that I have anything against hospitals, believe me!)