Do Doctors Really Heal?

To answer the question "Do doctors really heal?" one must first define the term "health," to understand what it is doctors are supposed to be restoring, and then define the word "doctor." The concepts of "health" and "doctor" were understood very differently among the ancients in biblical times.