Informed consent means you have the legal right to be fully and accurately informed about the benefits and risks of a medical intervention, including a pharmaceutical product, and to make a voluntary decision about whether to accept that risk for yourself or your minor child without being coerced or punished for your decision.

Informed consent has guided the ethical practice of medicine since the Doctors' Trial at Nuremberg after World War II, where the informed consent principle was internationally acknowledged as a human right for individuals participating in scientific research. Today, informed consent to medical risk taking also means you have the legal right to be fully and accurately informed by a doctor or medical facility about the benefits and risks of a lab test, surgical procedure, prescription drug, or other medical intervention performed on you or your minor child, and to give your voluntary permission.