I was taught in public school, by my science teacher, from my science book, written by science scholars, approved by the state and federal governments, that only life can beget life. This was presented to me as a scientific fact...not theory...not a faith exercise...not a religious experience...not a believe-it-or-not, take-it-or-leave-it, it's-your-choice statement. It was an irrefutable fact. It is now 46 years later, and upon searching present-day life-science textbooks, I can find no trace of that "fact." What I do find, however, is a lot of theory presented as fact. I also find that during that 46-year span, no one has disproved the fact I was taught.
Where did it go? Is it no longer considered important enough to be discussed or even mentioned? Has life been devalued? What exactly does Science want me to believe?
michael
I wonder if these are the same people who are telling me that there is no cure for Parkinson's Disease? Just a thought.