
The Complexity of Reductionism: A Case Study of Genetic Engineering

As Albert Einstein famously said – “We can’t solve problems by using the same kind of thinking we used when we created them.” In my opinion, there is no area where this statement holds more relevance today than in our tendency to view problems in isolation and to oversimplify complex discussions – in other words, reductionism.

There can be no doubt that reductionist approaches to science have given us a remarkable ability to influence and control the world around us. Unfortunately, those same approaches have also allowed us to influence the world to such an extent that our impact on the environment now threatens the very existence of humanity as we know it.

The pivotal problem of our time has become this – how can we reverse the rapid destruction of the global environment to which we are so intrinsically linked, whilst at the same time learning to live in harmony as part of it? Viewed in the context of Einstein’s statement above, reductionism is the “kind of thinking” that we used to create this problem; it is not, therefore, the way we will solve it.

This essay is an attempt to highlight a reductionist approach to science, and the complexity inherent within it, via the case study of genetic engineering. Whilst genetic engineering is by no means the only area where reductionist science sits at the forefront of the mainstream view, it is perhaps one of the starkest examples – a proverbial house of cards, built on a series of outdated, reductionist principles and approaches.

Genetic engineering is a highly complex discussion involving a wide range of self-perpetuating reductionist arguments. It is easy to get lost within this complexity and hence, in the aim of keeping this essay contained, I have chosen to focus on four specific areas:

  • The outdated, reductionist premise on which genetic engineering is based,
  • Simplistic, reductionist arguments that mislead the debate,
  • The corporate control of research, and finally,
  • The multidimensionality and complexity of environmental problems.

Complexity

Complexity, in broad terms, is the understanding that systems, by virtue of the multitude of interactions occurring within them, cannot be understood purely by breaking them down into their component parts. Or, as Aristotle said, “the whole is greater than the sum of its parts.”

In terms of a complexity-focussed approach to science, this definition by Daisy Allen fits well for me – “An important part of holistic vision is the focus on the relationships between things, rather than on the things themselves. This interconnectedness of systems leads to the understanding that the behaviour of even quite simple systems is very hard to predict – there are so many connections and thus potential actions within the system.” (Allen, 2010)

Complexity is essentially the study of relationships, with the understanding that form can emerge from these relationships that could not accurately be predicted via the study of the individual properties alone. A complexity approach to science, by definition, causes us to let go of definite, exact understandings of the world around us and replace them with a science of pattern and relationship – a science not only of quantity, but also of quality.
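
To make this point about unpredictability concrete, here is a minimal, purely illustrative sketch (my own aside, not drawn from Allen or any of the other authors cited): the logistic map, a one-line deterministic rule often used to introduce chaos, loses all memory of a one-in-a-million difference in its starting point within a few dozen steps. Knowing the rule – the “component part” – exactly is not the same as being able to predict the system’s behaviour.

    # Illustrative sketch only: the logistic map x -> r*x*(1-x) is a very
    # simple deterministic system, yet two starting points that differ by
    # one part in a million soon behave completely differently.

    def logistic_trajectory(x0, r=3.9, steps=50):
        """Iterate the logistic map from x0 and return the full trajectory."""
        xs = [x0]
        for _ in range(steps):
            xs.append(r * xs[-1] * (1 - xs[-1]))
        return xs

    a = logistic_trajectory(0.200000)
    b = logistic_trajectory(0.200001)  # a one-in-a-million change to the input

    for step in (0, 10, 25, 50):
        print(f"step {step:2d}: {a[step]:.6f} vs {b[step]:.6f}")
    # By around step 25 the two trajectories bear no resemblance to one another:
    # the rule is fully known, but long-range prediction is still impossible.

The point of the sketch is not the mathematics itself, but that even a system with one variable and one known rule resists prediction; biological systems, with their countless interacting components, are vastly less tractable again.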

If complexity is the science of uncertainty, then reductionism is without doubt the science of certainty. As I explored in my previous essay, certainty is something that I believe we have learnt to seek refuge in, feeling safe in the simplistic, black or white, yes or no answers that it provides; even when – as I will show via the genetic engineering case study – these answers are often anything but certain or simple.

Reductionism

But before I get into the subject of genetic engineering, it’s important to spend some time on reductionism itself, as it is an oft-used term which, on its own, is little more than meaningless. Terms like mechanistic and reductionist are used regularly by proponents and opponents alike, and hence have come to mean a whole range of different things to different people. Without wanting to get lost in the pandora’s box that this subject most definitely is, I find this (edited) explanation by the Nature Institute’s Steve Talbott a useful starting point for my purposes –

Reductionism, at root, is not so much a body of concepts as it is a way of exercising (and not exercising) our cognitive faculties. It is the inner act of isolating something so as to grasp it more easily and precisely, thereby allowing us to gain power over it. We want to be able to say, “I have exactly this – not that and not the other thing, but this”. Unfortunately, as Talbott points out, it is not the act of reduction that is pathological here, but rather its singleness. It is in the suppression of the necessary counterbalancing viewpoints that problems arise. (Talbott, 2004)

That is to say, in essence – there is nothing wrong with the reductionist approach to science, provided it is understood within its context. Where reductionism is used to mislead or oversimplify debates and to prematurely shut down or prevent genuine discussion, it is, in my view, extremely damaging. And this leads me to my case study topic – genetic engineering.

The Outdated, Reductionist Premise of Genetic Engineering

Perhaps the most obvious attraction to using genetic engineering (otherwise known as ‘biotechnology’ or ‘genetic modification’) as my case study issue for reductionism is the fact that the entire premise on which it sits – the concept that genes are: “functional units of information which can be characterised precisely, counted, added or subtracted, altered, switched on and off, or moved from one organism or one species to another by means of genetic engineering” (McAfee, 2003) – is not only reductionist, but ultimately, outdated and highly simplistic. Yet, remarkably, genetic engineering continues on, with barely a mention of this startling reality.

Without wanting to get bogged down in the highly technical territory of molecular biology, the basic situation is this – a wide range of unexpected discoveries over the past decade, arising primarily from the Human Genome Project, have completely rewritten basic gene theory. Researchers found that the human genome is not a tidy collection of independent genes, with each sequence of DNA linked to a single function, as had been the basic premise since Francis Crick and James Watson’s discovery of the DNA double helix in 1953. Instead, genes appear to operate in a complex network, interacting and overlapping not only with one another but also with other components, in ways that no-one claims to fully understand.

The reality for molecular geneticists today is that the entire concept of a gene has become little more than a simplistic and outdated metaphor. Helen Pearson, in her 2006 Nature article titled ‘Genetics: what is a gene?’ states in her opening line – “The idea of genes as beads on a DNA string is fast fading. Protein-coding sequences have no clear beginning or end and RNA is a key part of the information package”. In describing the outdated view that many scientists continue to hold on the subject of genes, Pearson goes on – “those at the forefront of genetic research see it as increasingly old-fashioned – a crude approximation that, at best, hides fascinating new complexities and, at worst, blinds its users to useful new paths of enquiry.” (Pearson, 2006)

Barry Commoner, a senior scientist at the Center for the Biology of Natural Systems, is a little more blunt in his assessment in an article titled ‘Unraveling the DNA Myth: The Spurious Foundation of Genetic Engineering’ – “The experimental data, shorn of dogmatic theories, points to the irreducibility of the living cell, the inherent complexity of which suggests that any artificially altered genetic system, given the magnitude of our ignorance, must sooner or later give rise to unintended, potentially disastrous, consequences.” (Commoner, 2002)

The unfortunate reality for the genetic engineering industry is that, far from being the precise and accurate method that industry claims, evidence from the history of genetic engineering itself points squarely at this uncertainty. For every one of the (few) genetic engineering ‘success’ stories, thousands of unexpected mutations and failures lie silently behind it.

That the state of play in molecular biology has changed completely over the past decade cannot be denied. These changes have, without doubt, also challenged the underlying assumptions of genetic engineering. What is perhaps most fascinating in the face of this reality is to ask – why hasn’t the field of genetic engineering come under the intense scrutiny one would expect, given that the foundation of its entire premise has been eroded?

An article in the business section of the New York Times titled ‘A Challenge to Gene Theory, a Tougher Look at Biotech’ made exactly this assertion, stating – “The $73.5 billion global biotech business may soon have to grapple with a discovery that calls into question the scientific principles on which it was founded.” (Caruso, 2007) But this article was written in 2007 and surprisingly little has been said since.

What is important to understand about this foundational premise of genetic engineering is that nearly every other premise on which the discipline is based depends on it. This includes, but is by no means limited to, the right to patent genes, and the regulatory process that currently treats genetically engineered crops as ‘substantially equivalent’ to natural crops. That is to say, the entire genetic engineering industry is a house of cards built almost solely on the foundation of a scientific principle that is now anything but solid.

Given such shaky foundations, it perhaps becomes easier to understand why scientists tend to get so heated in defending this topic. It is, after all, their entire careers that hang in the balance.

Simplistic, Reductionist Arguments that Mislead the Debate

The broadly acknowledged uncertainty inherent in definitions of the gene, and the outdated interpretations of molecular biology that persist, stand in stark contrast to the language that continues to flow from industry. For example, CropLife International, the global federation representing the plant science [including genetic engineering] industry, defines biotechnology in simplistic and misleading terms. This is the first paragraph taken from the biotechnology section of the CropLife International website –

“For thousands of years, farmers have been using breeding techniques to ‘genetically modify’ crops to improve quality and yield. Modern biotechnology allows plant breeders to select genes that produce beneficial traits and move them from one organism to another. Plant biotechnology is far more precise and selective than crossbreeding in producing desired agronomic traits.”

Firstly, the assertion that farmers have been using “genetic modification” for thousands of years is, at best, confusing and, at worst, purposely misleading. The term genetic modification (or genetically modified organism – GMO) is routinely used throughout the world to describe genetic engineering; by suggesting that farmers have been doing this for thousands of years, CropLife is blurring the line between natural breeding and genetic engineering. This line of argument doesn’t foster debate; it appears little more than a simplistic attempt at shutting it down.

Secondly, and perhaps more importantly, the language used only highlights the outdated, simplistic interpretation of genes held by the industry. It is quite extraordinary that, given the scientific reality now commonly accepted in molecular biology, the genetic engineering industry is still allowed to get away with using terms like ‘precise’. This is language directly contradicted by the current scientific data. Again, this is more than reductionism; it borders on outright deception that serves only to create a false sense of security and prevent discussion in the first place.

Corporate Control of Research

One of the most concerning realities of genetic engineering for anyone seeking an open and transparent approach to science is the level of control that corporations are given, via their patent rights, over the research that is conducted on their crops. This occurs because, to buy genetically engineered seed, purchasers must sign a contract that limits what can be done with that seed.

A 2009 Scientific American editorial titled ‘Do Seed Companies Control GM Crop Research?’ states – “For a decade their [referring to companies such as Monsanto, Pioneer and Syngenta] user agreements have explicitly forbidden the use of the seeds for any independent research. Under the threat of litigation, scientists cannot test a seed to explore the different conditions under which it thrives or fails. They cannot compare seeds from one company against those from another company. And perhaps most important, they cannot examine whether the genetically modified crops lead to unintended environmental side effects” (emphasis added). Research on genetically engineered crops is still published, clearly, but, as the article goes on to point out, only studies that the companies themselves have approved “ever see the light of a peer reviewed journal”. (Do Seed Companies Control GM Crop Research? 2009)

This is another startling reality of genetic engineering and, when viewed in the context of the previous two sections, it begins to paint a vivid picture of an industry with a lot to hide doing its best to keep it hidden.

Multidimensionality and Complexity of Environmental Problems

Finally, the irony of using genetic engineering as a case study for reductionism more broadly is that, from a social, political and regulatory point of view, genetic engineering must be one of the most complex issues there is. But, far from contradicting the starting premise, this complexity points to one of the primary problems of reductionism in our approach to science and regulation – complicated debates easily get lost in a sea of disciplinary silos and competing scientific and commercial agendas.

The complexity inherent within environmental issues is well summarised by Michael Carolan in a scientific paper on the multidimensionality of environmental problems –

“…environmental problems are ontologically multifaceted, involving the interpenetration of socio-cultural, economic and ecological systems; each of which are individually complex, but when taken together the emergent complexity far exceeds the sum of its parts. Consequently, being the finite creatures we are, it is difficult for us to arrange ‘the facts’ of a particular controversy in such a manner to reveal a picture of reality in its totality.” (Carolan, 2008)

In no issue is the above statement more relevant than in genetic engineering. In a paper titled ‘How science makes environmental controversies worse’, Daniel Sarewitz states that – “…scientific uncertainty, which so often occupies a central place in environmental controversies, can often be understood not as a lack of scientific understanding but as the lack of coherence among competing scientific understandings.” (Sarewitz, 2004)

Sarewitz goes on to talk specifically about genetic engineering, referring to a paper (Quist and Chapela, 2001) published in Nature on the occurrence of transgenic corn in Mexico. In critiquing the debate surrounding this paper, Sarewitz explains how its opponents fell very clearly within the discipline of molecular biology, while its proponents were largely ecologists. In describing the “disciplinary structure and disunity of science itself” as the “root of the controversy”, Sarewitz suggests: “The two sides of the debate represented two contrasting scientific views of nature – one concerned about complexity, interconnectedness, and lack of predictability, the other concerned with controlling the attributes of specific organisms for human benefit.” (Sarewitz, 2004)

And herein lies the central problem for supporters of complexity-based approaches over those of reductionists – our learned tendency to seek certainty, even where that certainty is itself unjustified, tends to favour definitive reductionist positions over the more circumspect positions of complexity theorists. Hence, a favourite catchphrase of genetic engineering proponents when criticising its opponents is “anti-science”. It matters not that the exact point of the opponents is that definitive information, when it comes to complex subjects such as these is, itself, the problem.

Conclusion

Even though genetic engineering is reductionist, it is also, at the same time, extraordinarily complex. It’s this complexity which makes it so hard for opponents to clearly and effectively communicate their messages in a coordinated way – particularly when there are reductionist branches of science that will immediately and viciously attack anyone who questions their methods.

I could have written ten essays of equal length focussing on different ways that genetic engineering uses reductionist approaches and arguments to achieve its agenda. However, it is in the false and outdated “central premise” of genetic engineering that I think the most important arguments against the technology originate.

What is most interesting and concerning about this case study is the way in which paradigm shifting scientific discoveries seem to have been brushed over completely. This points to a much darker side of reductionism – its potential for misleading debates when the science doesn’t suit a firmly entrenched agenda.

One of the biggest problems facing opponents of genetic engineering is that, at one time or another, they inevitably tend to sound like conspiracy theorists. But, rather than thinking of genetic engineering as a conspiracy, I much prefer to look at it from a realist perspective.

The genetic engineering industry was worth, according to the New York Times article I quoted earlier, $73.5 billion in 2007 (Caruso, 2007), and certainly more than that today. It is not a conspiracy theory to state that the genetic engineering industry wields immense political power, particularly in the United States. There is endless evidence of this; for example, whilst on the campaign trail in 2007, then-presidential candidate Barack Obama promised that, if elected, he would immediately seek to label GMO foods because “Americans should know what they’re buying”. But since being elected, and faced with the reality of the value of the industry to the US economy, it seems President Obama has quickly forgotten this promise.

There can be no doubt that the US Government sees the value of this industry to its economy. The problem is that, given it is the Government that regulates the industry, it is hard to put much faith in the regulatory system it sets up to oversee it. It is not that I necessarily believe the Government is intentionally misleading people in relation to genetic engineering; rather, I believe that once there is an agenda in play, or a will to see something a certain way, reductionist ‘evidence’ can always be mounted.

In addition to, and because of, the value of the industry to the US economy, genetic engineering necessarily employs thousands of scientists. These scientists are, by definition, reductionist scientists, as they are only “concerned with controlling the attributes of specific organisms for human benefit” (Sarewitz, 2004). They defend genetic engineering not because they are trying to mislead, but because they are trained to study only one very specific question; they absolutely believe what they say. These scientists provide the perfect foil for the light-touch regulatory practices of Government, and hence the self-supporting cycle of reductionism entrenches itself.

As I have certainly found in writing this essay, the more you get into this topic, the dizzier you can become from following the endless circular and self-supporting arguments of genetic engineering proponents, which seem only to pass you from one issue to the next – and this, I feel, is true of many of the most complex environmental issues we face.

But ultimately, I believe support or opposition for complex environmental issues like genetic engineering must come down to one simple question – how do we view environmental issues?

Do we believe our best hope lies in business-as-usual, simplistic, reductionist approaches to solving the complex problems we have created – approaches such as genetic engineering?

Or, do we believe (as I clearly do) that this is the type of thinking that has created these problems in the first place, and hence, favour an approach that acknowledges the complexity inherent within biological systems?

This, I feel, may be one of the greatest questions facing humanity.

References

Allen, D. (2010). Biosemiotics and the New Paradigm. Holistic Science Journal, 1(3).

Caruso, D. (2007). A Challenge to Gene Theory, a Tougher Look at Biotech. New York Times, p. 33. Retrieved from http://www.nytimes.com/2007/07/01/business/yourmoney/01frame.html?pagewanted=all

Commoner, B. (2002). Unraveling the DNA Myth: The Spurious Foundation of Genetic Engineering. Harpers Magazine. Retrieved from http://www.commondreams.org/cgi-bin/print.cgi?file=/views02/0209-01.htm

Carolan, M. S. (2008). The Multidimensionality of Environmental Problems: The GMO Controversy and the Limits of Scientific Materialism. Environmental Values, 17(1), 67–82. doi:10.3197/096327108X271950

Do Seed Companies Control GM Crop Research? (2009). Scientific American. Retrieved from http://www.scientificamerican.com/article.cfm?id=do-seed-companies-control-gm-crop-research

McAfee, K. (2003). Neoliberalism on the molecular scale. Economic and genetic reductionism in biotechnology battles. Geoforum. Retrieved from http://www.sciencedirect.com/science/article/pii/S0016718502000891

Pearson, H. (2006). Genetics: What is a gene? Nature, 441 (May).

Quist, D., & Chapela, I. H. (2001). Transgenic DNA introgressed into traditional maize landraces in Oaxaca, Mexico. Nature, 414(6863), 541–3. doi:10.1038/35107068

Sarewitz, D. (2004). How science makes environmental controversies worse. Environmental Science & Policy, 7(5), 385–403. doi:10.1016/j.envsci.2004.06.001

Talbott, S. L. (2004). The Reduction Complex. NetFuture, 158.

 

About the Author

Richard

Richard is an agricultural scientist with ten years’ professional experience in the agricultural sector, particularly in agricultural policy & sustainable agriculture campaigning. He has recently completed a Masters in Holistic Science at Schumacher College in the UK and is interested in projects that challenge current paradigm approaches to food production and water management globally. Richard is a board member of the Ecological Agriculture Association of Australia and is currently working on a range of projects whilst exploring agroecological food stories in the UK and Europe. Richard is the founder of Our Food Future and can be contacted via ourfoodfuture@outlook.com


Comments

  1. Bob Phelps

    Great work Richard! The genetic manipulation (GM) industry falsely promises benefits that are not possible with its twentieth century GM techniques. These promises include: higher yields, less synthetic chemicals, drought and salt tolerant crops, nitrogen fixation in grains, longer shelf-life food, food biofortified with Vitamin A and iron, etc. But GM techniques can’t deliver on such visions because it’s impossible to cut-and-paste the genetic interactions for complex traits between unrelated genetic systems.

    As Dr Richard Richards of Australian CSIRO Plant Industry says: “GM technologies are generally only suitable for the single gene traits, not complex multigenic ones.” Dr Heather Burrow, CEO of the former Beef CRC, also says that in animals: “… hundreds, even thousands, of interacting genes control important production traits like growth rate, feed efficiency and meat quality – not the handful that researchers had originally believed.” The Australian Bureau of Resource Sciences says: “GM crops with insect resistance, herbicide tolerance, high-lysine content and, to a lesser extent, disease resistance … are controlled by manipulating or inserting a single gene. As a general rule … traits such as water-use efficiency and heat tolerance have multi-genic inheritance patterns and, therefore, plants modified for these traits have not progressed far down the product development pipeline.”
    Even the industry-backed ISAAA (www.isaaa.org) agrees.

    The UN’s IAASTD and UNCTAD reports show how the human family can be fed with ecological agriculture systems and that GM crops are irrelevant. As oil and phosphates deplete, water and land become more scarce, and the climate changes, a transition to environment-friendly and sustainable systems will become essential. The sooner we start the change, the less the pain.


  2. Philip L Bereano

    I tried to check your Commoner reference, but the link was not alive. His article in Harpers on the “central dogma” is the best thing I am aware of to explode the superficiality of GE “research.” I think that is what you are referring to. I also agree with Bob that this is a very nice essay.

    1. Richard (Author)

      Thanks for the comment Philip, I have updated the link. It is a Harpers article called “Unravelling the DNA myth – the Spurious Foundation of Genetic Engineering”. Unfortunately, it requires a subscription to read it.

  3. Jack Heinemann

    “It matters not that the exact point of the opponents is that definitive information, when it comes to complex subjects such as these is, itself, the problem.”

    For me, this is the key idea in your essay, Richard. While I don’t see myself as an opponent (indeed, I’m a maker) of GMOs, I do see a critical difference between those who develop GMOs for application and those who test for safety. In our system, these two roles are mixed together, as I discuss in my blogs (e.g. http://rightbiotech.tumblr.com/post/100437995195/ultimate-experts). However, beyond the obvious conflict of interest in the roles, there is potentially a more problematic and fundamental difference in how essential questions are formulated.

    An example is in the debate of using profiling (i.e., ‘omics’ technologies) for hazard identification in GMO safety testing. Omics can be descriptive or reductionist, depending on how the question is composed and the nature of the hypothesis. Those who object to omics often do so based on the absence of hypothesis of harm: that is, scanning the transcriptome for new molecules and finding some does not prove adverse effect. Those, such as myself, who advocate its use (and really, the companies do it anyway), argue that it provides a description of novelty at the molecular level from which any targets of interest could be tested for adverse effect.

    I would call reductionism working from a hypothesis to parts that together form an explanation of what is observed. Reductionism is poor at predicting larger phenomena. That is, reductionist experiments that demonstrated that some sequences within DNA molecules were genes do not prove that all genes are DNA (http://biosafetycooperative.newsvine.com/_news/2013/12/24/22038524-lets-give-the-scientific-literature-a-good-clean-up). However, reductionism allows a series of hypotheses to be tested that each provides independent corroboration and thus confidence in the higher level observation that genes are DNA (at least much of the time).

    I call discovery or observational science working from a series of observations to build up to a hypothesis. Darwin’s science was mainly this and his technique led to the theory of evolution. It could not prove how things evolved at the molecular level, which has been the grand success story of reductionism.

    Both reductionism and observational science have their limits and their strengths (http://www.sciencedirect.com/science/article/pii/S0160412012000621). When applied incorrectly, they lead to error. For example, to demonstrate by reductionism which of the many sequences in a genome can be genes is not to prove that all genes are composed of DNA. To observe heritable traits with a different material basis requires the observations of genetics rather than the tool of DNA sequencing. An excellent example of the difference is the story of the discovery of prions.

  4. Richard (Author)

    Hi Jack, thanks so much for taking the time to read and comment on this piece. I have read your own blog often and appreciate the balanced view you bring to this debate.

    Regarding the line you selected as the central aspect of my essay, I am copying in the entire paragraph as the small section on its own didn’t even make sense to me –

    “And herein lies the central problem for supporters of complexity-based approaches over those of reductionists – our learned tendency to seek certainty, even where that certainty is itself unjustified, tends to favour definitive reductionist positions over the more circumspect positions of complexity theorists. Hence, a favourite catchphrase of genetic engineering proponents when criticising its opponents is “anti-science”. It matters not that the exact point of the opponents is that definitive information, when it comes to complex subjects such as these is, itself, the problem.”

    I completely agree regarding reductionist science being useful and essential. The key is that we are clear about when we are using it and when we are not, as it is, by its very nature, prone to massive blind spots.

    Cheers, Richard
