Medicine or drug — a chemical agent used to cure or modify a disease, or to relieve pain, by altering cellular protein production and physiological actions.
Eugenics — the science of improving a population by controlled breeding to increase the occurrence of desirable heritable characteristics.
We live in exciting times, Trumpian politics aside. In recent years, with the mapping of the human genome, scientists can now detect minute sequence changes in our genome after exposure to various environmental insults and drugs. Perhaps the most dramatic of these effects are seen in response to certain chemicals, including an array of pharmaceutical agents. New evidence now shows that protein-modifying effects in cells could cause genetic modifications for generations to come.
Recently, the effect of variation in genes coding for drug targets and for the enzymes involved in drug metabolism has highlighted the genetic component of drug response. This variation sits alongside the diversity already found in the composite environmental interactions between different individuals. Consensus has been reached amongst scientists that drug responses are linked to complex, multifactorial genetic traits, and the study of these genetic variations, termed pharmacogenetics, is analogous to the study of complex genetic disease in terms of the questions posed and the analytical possibilities. Just as DNA variants are associated with specific disease predispositions, so will they be associated with individual responses to specific drugs, responses that are further influenced by the environment.
Testing for drug response is starting to follow the same route as genetic testing for inherited disorders, and has reached the stage where both genome-wide analysis and single-gene analysis are accepted realities, a welcome advance.
Pharmaceutical agents are consequently seen as altering disease outcomes through genetic responses, via DNA and messenger RNA, in a cascade of chemical modifications that shows both common patterns and individual variation. Gone are the days of superficial objectivity, with adverse effects such as vomiting, diarrhoea, headaches or skin rashes tallied statistically from cohort studies of controls and trials. Operating now in a different world, drug response outcomes can be observed at the DNA level, as altered base pairs varying from individual to individual. And now, with the so-called surplus or ‘junk’ DNA also recognised as purposeful, things have become even more complex. Until recently this ‘junk’, or non-coding, DNA suited a reductionist model that classified it as an isolated ‘invader DNA’, finding support in a simple natural-selection-based version of Darwinism. More recently, however, it has been realised that minor sequence changes can boost gene expression and that surplus DNA acts as a not-so-feckless bystander in a complex, malleable and ‘perceptive’ interactive network: our evolution is, for the first time ever, manipulable in our own hands. Simultaneously, we are slowly beginning to realize how delicately interconnected and interdependent things are.
Historically, in February 2001 the Human Genome Project published its first draft of the sequence of the over 3 billion base pairs that make up the human genome. The finished sequence of five entire human chromosomes (chromosomes 22, 21, 20, 14, and Y) was published for the 50th anniversary of the publication of the structure of DNA by Watson and Crick on 8 April 2003. The finished DNA sequence of the entire genome was subsequently made available to the public by the International Human Genome Sequencing Consortium (IHGSC) on the internet, http://www.ncbi.nlm.nih.gov/Genbank/index.html. Over the past few years, more than 30 other organisms have had their genomes completely sequenced, with more in progress. Consequently, we find ourselves at a time in which new types of experiments are possible, and observations, analyses, and discoveries are being made on an unprecedented scale. It can be expected that genetic considerations will become important in all aspects of disease (and life), be they diagnosis, treatment, or prevention — with ethical implications and moral impact.
As is mostly the case in our search for simplicity, we instead find complexity, and, to prove the point, along came the introns. Part of the so-called surplus DNA, introns are segments of DNA within genes that do not code for proteins, yet they make up a huge portion of the human genome. Slowly their supporting role as part of a transgenerational interactive network is becoming recognized, and their initial rejection as ‘junk’ DNA is seen as completely unjustified. Found in all fully sequenced eukaryotic genomes, including nucleomorphs (Gilson et al., 2006), intron density ranges from a handful in the entire genome of some primitive single-celled organisms (Mair et al., 2000; Morrison et al., 2007) to about eight per gene in the human genome (Sakharkar et al., 2004).
Given the ill repute of these introns, they were seen as unnecessary and ‘parasitic’ in nature, potentially hazardous and of limited purpose. This initiated a quest for a function that could counter their apparent ineffectuality as elements of merely superfluous value or deleterious effect. The physicist, biochemist and Nobel laureate Walter Gilbert proposed, shortly after the discovery of the introns, what is now known as the intron-early theory (Gilbert, 1987). According to this theory, introns were pivotal in the formation of modern, complex genes by allowing constant shuffling of small, primordial mini-exons. Today introns are seen as absolutely essential in intron-rich species, as well as in many intron-poor ones (Lynch, 2007), vital to boosting gene expression. One of the best examples of the importance of intronic function in contemporary eukaryotes is the increase in protein abundance of intron-bearing genes. Subsequent works reported the same phenomenon for numerous other introns in many eukaryotic species, suggesting that this increased protein expression is wide-ranging (Le Hir et al., 2003). Introns seem to affect virtually every step of mRNA maturation, including transcription initiation, elongation and termination, polyadenylation, nuclear export, and mRNA stability (Chorev and Carmel, 2012).
As understanding and knowledge escalate in the field of genetics, and with drug responses now likened to complex, multifactorial genetic traits, what is becoming clear is not only that individual responses are more complex and more interlinked with other factors than previously thought, but that these genetic modifications may be carried forward into generations to come. Various chemicals, such as the now well-recognised BPA in plastics and other petrochemical agents, have been traced as affecting the rat genome for up to 10 generations after exposure during pregnancy (Brand, Kennedy and Müller, 2014).
Gone are the days of finding security in MTDs (minimum toxic doses) and MLDs (minimum lethal doses) when dealing with pesticides and certain chemicals.
Looking back, it was in 1941 that Isaac Berenblum, then a Riley-Smith Research Fellow in the Department of Experimental Biology & Cancer Research, University of Leeds, demonstrated through experimental research that carcinogenesis induced by chemicals involved 3 separate and independent processes: initiation, promotion, and latency. Berenblum(2) also observed that every carcinogen that produces a tumour at the site of application or injection is an irritant, in the sense that it induces a continued state of hyperplasia. He further indicated that, in all cases in which sufficiently accurate observations could be made, the primary tumour was seen to be preceded by a stage of hyperplasia. Berenblum concluded that although hyperplasia is a precursor of neoplasia, only some and not all irritants are carcinogenic.
Prior to this, in 1938, the Federal Food, Drug, and Cosmetic Act gave regulatory powers to the FDA, requiring, among other things, that new drugs be clinically tested and proven safe before being sold. The FDA offered guidelines for such studies, but there were no designated standards. The FDA's first published guidance for industry on assessing the toxicity of chemicals in food appeared in 1949. This so-called ‘Black book’ included a contribution by O. Garth Fitzhugh on long-term studies and their design. On unchallenged ground, Fitzhugh suggested that long-term feeding studies should investigate 2 species: the albino rat, studied for a lifetime of about 2 years, and a non-rodent second species (dogs or monkeys), studied for at least 1 year. Dose selection for these long-term studies would be based on the results of subacute studies. Four groups of at least 10 animals of each sex were then proposed: (1) a dietary control group, (2) a group fed a diet containing 100 times the amount of the substance proposed for use in food, (3) a group fed a diet containing the highest tolerated amount of the substance, and (4) a group given an intermediate dosage. Biochemical and haematology evaluations were to be made at 3-month intervals during the study. At the end of the study, autopsies were to be performed, along with weighing of the principal organs and preservation of tissues for microscopic examination. The pathologist Arthur Nelson described in further detail in the ‘Black book’ the tissues to be evaluated: lung, heart, spleen, pancreas, gallbladder, lymph nodes, stomach, small intestine, colon, kidney, adrenal, urinary bladder, testis or ovary, prostate or uterus, thyroid, parathyroid, submaxillary salivary gland, 4 levels of brain, hypophysis, bone, bone marrow, and voluntary muscle — reminiscent of my days as a student in the pathology labs.
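For illustration only, Fitzhugh's 1949 design can be summarised as a simple data structure. This is my own sketch; the group labels and field names are my shorthand, not FDA terminology:

```python
# Illustrative sketch of the 1949 Fitzhugh long-term feeding study design,
# as described in the text. Names are the author's shorthand, not FDA terms.

FITZHUGH_1949_DESIGN = {
    "species": {
        "albino rat": {"duration_years": 2},            # lifetime study
        "non-rodent (dog or monkey)": {"duration_years": 1},
    },
    "groups": [
        {"name": "dietary control", "dose": "none"},
        {"name": "100x proposed food-use level", "dose": "100x"},
        {"name": "highest tolerated amount", "dose": "maximum tolerated"},
        {"name": "intermediate dosage", "dose": "intermediate"},
    ],
    "min_animals_per_sex_per_group": 10,
    "clinical_pathology_interval_months": 3,
}

def total_minimum_animals(design):
    """Minimum animals for one species: groups x 2 sexes x animals per sex."""
    return len(design["groups"]) * 2 * design["min_animals_per_sex_per_group"]

print(total_minimum_animals(FITZHUGH_1949_DESIGN))  # 4 groups x 2 sexes x 10 = 80
```

Even this minimal reading of the protocol implies at least 80 animals per species per substance, before any of the later expansions in group sizes.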
It was only in 1955 that Lehman and coworkers (Lehman et al., 1955) updated the FDA guidance to include sections pertaining to drugs and recommendations for toxicity studies aimed at supporting marketing applications. These studies included acute, subacute, and chronic toxicity testing, with chronic toxicology studies having a suggested duration of 2 years in the rat and 1 year in the dog. At this time, designated carcinogenicity studies had not been established; the carcinogenic potential of a drug product was therefore assessed using a toxicological model based on a limited, fixed chronic exposure. With the shortfalls and limitations of the techniques realised, some uncertainty existed about how to progress beyond current methods.
In 1962, an amendment to the FDA's Food, Drug, and Cosmetic Act aimed at promoting drug safety shifted the burden of proof to drug manufacturers. For the first time, drug manufacturers had to prove that their products were both safe and effective before they could be sold. Subsequently, guidelines for toxicity tests for all drugs (known as the Lehman Guidelines) were written by Arnold Lehman, director of the Division of Pharmacology of the FDA, to aid the pharmaceutical industry in complying with the new law. Rats, dogs (beagles), and rabbits were the primary species for testing at this time, with chronic toxicology tests conducted for 1 year to 18 months depending on the species. Around the same time, also in 1962, the National Cancer Institute (NCI) Carcinogenesis Screening Program was initiated.
Early efforts to develop standardized carcinogenicity protocols came from the NCI scientists John and Elizabeth Weisburger, who in the 1960s began revising systemic carcinogenicity protocols based on FDA protocols. They ambitiously linked 55% of global cancers to inappropriate nutritional habits and another 35% to tobacco use, thus neatly diverting focus away from other potential factors(2). Their design is still used as the basis for the current design of 2-year carcinogenicity studies in rodents. The importance of stabilizing the effect of the test chemical, selection of an ‘appropriate’ test species, standardization of animal maintenance and environmental control (including temperature, humidity, hours of light, bedding, airflow, water, and diet), and issues related to routes of administration and dose selection were well recognised then, and all considered necessary to secure ‘purity’ in test outcomes.
To briefly reflect on the economic evolution of testing: in 1961 an NCI carcinogen screening test of a given chemical performed in one species took 8 months and cost about $10,000 to $15,000. In 1972, a more extensive test in 2 species with larger numbers of animals required about 30 months and cost about $75,000. By 2009, costs for carcinogenicity testing in 2 species were in the range of $2-4 million.
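Taking the mid-points of the nominal ranges above, the cost escalation can be roughed out. This is a back-of-envelope sketch in nominal dollars only, with no adjustment for inflation or for the larger scope of the later studies:

```python
# Back-of-envelope growth factors for carcinogenicity-testing costs,
# using mid-points of the nominal ranges quoted in the text (nominal dollars,
# no inflation adjustment; the studies also differ in scope across years).

costs = {
    1961: 12_500,      # one species, ~8 months, $10,000-15,000
    1972: 75_000,      # two species, ~30 months
    2009: 3_000_000,   # two species, $2-4 million
}

def growth_factor(costs, start, end):
    """Nominal cost multiple between two years."""
    return costs[end] / costs[start]

print(f"1961 -> 1972: {growth_factor(costs, 1961, 1972):.0f}x")   # 6x
print(f"1972 -> 2009: {growth_factor(costs, 1972, 2009):.0f}x")   # 40x
print(f"1961 -> 2009: {growth_factor(costs, 1961, 2009):.0f}x")   # 240x
```

Even in nominal terms, a roughly 240-fold increase over five decades helps explain the regulatory pressure to standardize and justify every design choice in these studies.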
It was at the beginning of 1968 that drug package inserts were required for newly approved drugs, including a section discussing carcinogenesis. The following year, the data elements to be captured in carcinogenicity studies were described by Berenblum(1), including descriptive information on the chemicals, animals, experimental design, survival, body weight, and individual pathologic results, as recommended by the International Union Against Cancer.(10)
With limited adjustment in study lengths and selection criteria during this time, it was quietly noted that the selected duration of ‘chronic’ studies was influenced by toxicities elicited by a number of anticonvulsants, analgesics, hypercholesterolemic agents, and tricyclic antidepressants, which manifested only between 6 and 12 months of treatment. In the 1970s, under the National Cancer Program and its Carcinogenesis Testing Program, the NCI was asked by the FDA to conduct carcinogenicity studies in rats and mice for some older drugs. During this time the first electronic data capture system for these NCI carcinogenicity studies, the carcinogenesis bioassay data system (CBDS), was also developed, and the NCI pathology working group was organized for peer review. The design of these early NCI studies was such that one set of 20 controls was used for several drug/chemical studies conducted in one room, with 50 animals used for each of the low- and high-dose groups for each unique drug/chemical. Protocols for carcinogenicity studies conducted by the NCI Carcinogenesis Testing Program were first standardized in 1976.
In a 1982 colour change the ‘Redbook’ came along: standardized carcinogenicity protocols were published by the Organization for Economic Co-operation and Development. These protocols specified the use of the today scarily sounding 3 dose groups, the highest being the maximum tolerated dose (MTD), defined as “the highest dose of a test agent used during the chronic study that can be predicted not to alter the animals’ normal longevity from effects other than carcinogenicity.” The MTD was now considered by the Centre for Drug Evaluation and Research (CDER)/FDA to also include severe alterations to homeostasis or other alterations that might interfere with interpretation of the studies. By the mid-1980s, carcinogenicity studies of new drugs conducted by a drug sponsor for the FDA had advanced to include 3 dose groups of at least 50 animals for rats and mice. In 1987, CDER was established within the FDA (from the Centre for Drugs and Biologics), and soon after, the Carcinogenicity Assessment Committee (CAC) and executive CAC were formed to review carcinogenicity protocols and results, ensuring consistency across the centre and that statistical analysis could be standardized across all carcinogenicity studies of drugs.
What is concerning in this brief history of our carcinogenic anxiety is how complacent we still are on these matters in our current world of mass drug and pesticide consumption. Disconcerting, too, is how focus is still directed at standardising study groups and protocols. Liberated, if we wanted to be, by an anti-reductionist science in which the interconnections seen in growing complexity fascinate most open-minded scientists, we should not only bravely confront significant new scientific challenges but also embrace the multidisciplinarity of a new, more morally truthful era, and openly question our approach in view of a new biology and evolution.
Exciting times, if complex ones, in which only faltering security can be found in science and research based on set models and protocols, anxiously attempting to appease those who distribute the funding.
*About the Author:
Dr Theo Holtzhausen is a practicing vet and the author of Sensible Gene Selfish Being (2010) and Spheres of Perception (2020), dedicated to a truthful Science.
1) Berenblum, I. The mechanism of carcinogenesis: a study of the significance of cocarcinogenic action and related phenomena. Cancer Res. 1941;1:807–814.
2) Berenblum, I. Irritation and carcinogenesis. Arch Pathol. 1944;38:233–244.
3) Brand, E.J.; Kennedy, J.L.; Müller, D.J. Pharmacogenetics of antipsychotics. Canadian Journal of Psychiatry. 2014;59(2):76–88. PMID 24881126.
4) Chorev, M.; Carmel, L. Front Genet. 2012;3:55. doi:10.3389/fgene.2012.00055. PMID 22518112.
5) Fedorov, A.; Roy, S.; Fedorova, L.; Gilbert, W. Mystery of intron gain. Genome Research. 2003. Open access.
6) Gilson, P.R.; Su, V.; Slamovits, C.H.; Reith, M.E.; Keeling, P.J.; McFadden, G.I. Complete nucleotide sequence of the chlorarachniophyte nucleomorph: nature’s smallest nucleus. Proc Natl Acad Sci USA. 2006;103:9566–9571. doi:10.1073/pnas.0600707103.
7) International Conference on Harmonisation. ICH M3(R2) guideline: nonclinical safety studies for the conduct of human clinical trials and marketing authorization for pharmaceuticals. 2010.
8) Jacobs, A.C.; Hatfield, K.P. History of chronic toxicity and animal carcinogenicity studies for pharmaceuticals. Research article, first published June 13, 2012.
9) Lehman, A.J.; Patterson, W.I.; Davidow, B. Procedures for the appraisal of the toxicity of chemicals in foods, drugs and cosmetics. Food Drug Cosmet Law J. 1955;10:679–748.
10) Mair, G.; Shi, H.; Li, H.; et al. A new twist in trypanosome RNA metabolism: cis-splicing of pre-mRNA. RNA. 2000;6:163–169. doi:10.1017/S135583820099229X.
11) National Cancer Institute. Guidelines for carcinogen bioassay in small rodents. DHEW Publ (NIH) 76-801. Bethesda, MD: National Cancer Institute; 1976:1–65.
12) US Food and Drug Administration. FDA history: part I. http://www.fda.gov/AboutFDA/WhatWeDo/History/Origin/ucm054819.htm. 2009.
13) Saxonov, S.; Gilbert, W. The universe of exons revisited. Genetica. 2003.
14) Sistare, F.D.; Morton, D.; Alden, C. An analysis of pharmaceutical experience with decades of rat carcinogenicity testing: support for a proposal to modify current regulatory guidelines. Toxicol Pathol. 2011;39:716–744.
15) Yamagiwa, K.; Ichikawa, K. Experimental study of the pathogenesis of carcinoma. J Cancer Res. 1918;3:1–29.