Wednesday, June 14, 2017

The Dynamics of Discovery...the start of the innovation process. Or, industry might want applied research but that is not what it needs...




From Archimedes to Edison, attempts to improve quality of life have dictated a need for advances in science and technology. These advances are now widely recognised, if not fully understood, as the key enablers of increasingly prosperous societies.

And despite this long history, the process of managing the expanding frontiers of new knowledge in a way that will benefit society is a work in progress. This is largely due to the unpredictable nature of scientific discovery, most famously illustrated by Archimedes when, upon stepping into the bath, he suddenly realised that the volume of water displaced was equal to the volume of the submerged portion of his body. His discovery provided the solution to the previously intractable problem of measuring the volume of irregular objects and led to further advances in assessing the density and purity of precious metals, among other things.
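As a simple, hypothetical illustration of why the insight mattered (the numbers here are mine, not the historical record): weigh the object, measure the water it displaces, and compare the resulting density with that of the pure metal.

density = mass / volume = 1,930 g / 100 cm³ = 19.3 g/cm³

Pure gold has a density of about 19.3 g/cm³, so an object giving this result is consistent with pure gold, while a noticeably lower figure would point to a less dense admixture such as silver (about 10.5 g/cm³).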

In the modern world little has changed in how new knowledge is acquired. However, in an attempt to get the best value for their limited investments, governments have devised processes to try to manage its discovery and application.

Interestingly there has been a propensity to divide scientific research into a one-dimensional continuum starting with pure (sometimes known as blue-skies) research progressing through to applied research and on to technology transfer; the defining characteristic of pure research being that it seeks new knowledge with no view as to its application, while applied research seeks solutions to industrial problems.

Such a continuum has been the basis of R and D funding prioritisation in advanced economies around the world since it was promulgated by Vannevar Bush following World War II. Persistent debates over public funding suggest this mindset does not accurately reflect the process of science and technology development. 

The dynamic nature of the discovery of new knowledge and its commercial application can be observed in the remarkable career of French chemist and microbiologist Louis Pasteur, whose breakthroughs ranged from the first rabies and anthrax vaccines to paving the way for germ theory and pasteurisation. Pasteur was not driven by a quest for new knowledge for its own sake but was motivated by a desire to better understand and solve the problems of industry.   His work is an early demonstration that many, perhaps most, near-to-market problems exist because the knowledge to solve them has not been discovered.

In his early career, he concentrated largely on uncovering new knowledge, but as he did so he came across other, previously unforeseen questions. For example, while working as a chemist at the age of 22 he sought a theoretical understanding of why tartaric acid crystals derived from bio-mass rotated the plane of polarised light while the chemically synthesised form did not.  His experiments revealed that the naturally occurring compound is chiral, meaning its molecules exist in one of two possible crystal structures, each the mirror image of the other. In the process of uncovering this new knowledge, he laid the building blocks for the modern experimental science of crystallography, which is today used in one form or another in everything from gemstone cutting to DNA analysis.

Pasteur’s remarkable career uncovered whole new branches of science – such as microbiology – and, as he developed as a scientist, he began to seek to satisfy both theoretical and practical goals.

Of particular note is the fact that as the problems Pasteur chose to solve became increasingly applied in nature, the research required to solve them became more fundamental.

I have drawn my Pasteur example from Donald Stokes. In his 1997 book Pasteur's Quadrant: Basic Science and Technological Innovation, Stokes argues that there is a far stronger link between research of a more basic nature and innovation in industry than many appreciate. He argues that, in fact, the dominant form of research is use-inspired, regardless of whether it is at the discovery or the application part of the cycle.

Pasteur’s research agenda was use-inspired. Understanding and exploiting the dichotomy between applied and theoretical goals is perhaps the reason behind the breadth of his contribution.

This philosophy could be instructive for modern policymakers seeking to get the most from limited investment funds and move away from the outmoded, linear model of R and D. The effective management of applied research operations is much more complicated than simplistic models, such as that of Vannevar Bush, suggest.

As previously stated, the common ongoing debate over research funding is about whether funding should be provided for pure research or for applied research. This debate is based on the erroneous assumption that industry benefits only from applied research, and that research directed at assisting industry must be applied - industry (and a large body of policy-makers) is led to believe that it needs applied research. Thus the attention has turned to wants, rather than needs.

What industry needs is research that is appropriate to solve the problem at hand, or exploit the opportunity recognised. And this is best driven by better problem definition, not the meaningless classification of science.

What many of us in science know is that the real needs of industry (and the many intractable problems of society) often cannot be met from available knowledge, which means that the research industry needs must be more discovery-oriented. As Stokes eloquently puts it, using Louis Pasteur as his example, the more involved you become in applying scientific knowledge in the market, the more fundamental the questions you uncover - and those fundamental questions must be answered to enable full exploitation in the market.

The lessons of Pasteur's example can be summed up:

  • Pasteur was a chemist and microbiologist
  • He was driven to solve the problems of industry (fermentation)
  • Along the way his breakthroughs included vaccines (rabies and anthrax), germ theory, and pasteurisation (of course)
  • He answered fundamental science questions because he needed the answers in order to answer industry questions
  • This suggests that industry-focused research includes both applied and pure/fundamental research, and
  • The focus should be on outcomes, not type of research

Countless hours are wasted on trying to determine whether the public should be supporting applied research or basic research. Far less time is spent on identifying the priorities to be researched, or the questions and challenges to be answered.  More time spent on the latter will enable the scarce public resources to be better targeted at activities that make a difference. 

Once the priorities are identified it is easier to determine how much effort is needed in discovery and how much in application – that choice depends on what we know about the field, how much information and knowledge has already been discovered, and what the unanswered questions remain.

Deciding what to do on the basis of whether research is pure or applied does little more than distort the research agenda. Research is research, and the nature of the research needed for any situation depends on how much knowledge we have in relation to the problem or the opportunity we are examining.

Now, a debate about national priorities - that's an entirely different beast! As is how much is needed to be invested!  How do we best define the problems, or characterise the opportunities? And how do we do a better job of telling the science story?

Scientific research is a resource that must be managed, and if it is to be managed it needs to be understood.

Further reading:
Stokes DE (1997) Pasteur’s Quadrant: Basic Science and Technological Innovation. Brookings Institution Press
Dodgson M and Gann D (2010) Innovation: A Very Short Introduction. Oxford University Press

Sunday, January 22, 2017

Farm-gate and consumer food prices - a need for genuine analysis.



What is it that we want our agriculture to do? 

Australian agriculture continues to put food on our tables three times a day. It continues to innovate and contribute to the nation's prosperity. It continues to eke out efficiencies in the production system. Though much is to be lauded, much needs to change.

Modern agriculture is grounded in the belief that the primary objective of the industry is to produce as much food and fibre as possible for the least cost.

These twin goals have long shaped farming and underpinned agricultural research. But with evidence that food is wasted in developed countries (and in developing countries), with food security now accepted as a major global issue, and with issues of environmental degradation and health problems such as obesity, we need to define what it is that we want contemporary agriculture to do.

And in doing so, we must be prepared to pay for the “qualities” we want in our food – ethical production, environmental values, animal welfare, safe products – not just accept the dictum that food is too expensive. Farm-gate prices would suggest we are not paying the full cost of production to the quality standards we expect. The wholesale/retail sector makes big margins – supermarkets drive the price to consumers down (or so they say) but seemingly not at the expense of profit. There is little transparency in the market chain – consumers do not really know what they are paying for, and for producers it is even more opaque.

Social media abound with comment, much of it ill-founded, on food and food security issues. In parallel, we see heightened interest in food and cooking, and in urban agriculture/vegetable growing. Then there is health – over-consumption of energy-rich food, imbalanced diets and obesity.

There is a sense we are missing the big picture. 

Is modern agriculture about producing cheap food? What other values might apply to agriculture, such as preserving landscape and countryside? Can we change the profitability of the system? What should the drivers be for a new agriculture? What is prosperity in contemporary agriculture?  What is the value proposition for all players in the market?

Engaging in public debate on these issues and acknowledging their complexity will help define the shape of future agricultural research and our farm and food systems.

There is nothing new in this. In recent reading I re-discovered a couple of papers in my files from David Fraser, an animal welfare researcher at the University of British Columbia:
  • Fraser D (1999) Animal ethics and animal welfare science: bridging the two cultures.  Applied Animal Behaviour Science, 65: 171-189, and
  • Fraser D (2001) The “New Perception” of animal agriculture: legless cows, featherless chickens and a need for genuine analysis. Journal of Animal Science 79: 634-641
The second title relates to an often-cited quote in animal welfare literature about a (disputed) claim by an animal geneticist that his organisation was attempting to ‘breed animals without legs and chickens without feathers’.

The quote highlights, however, concern felt in some quarters over the direction of modern agriculture. While gene technology is poised to deliver many benefits to agriculture in the fight against disease, reduced environmental impact and enhanced food nutrition and quality, it could fancifully be argued that the technology might one day be equally capable of delivering a legless cow.

Nowhere in modern agriculture is the polarisation of different viewpoints on the direction of animal agriculture more evident than in the fields of gene technology and animal welfare.

In these debates and others, such as the growing divide between production and sustainability science, a far better analysis of complex issues is required to answer the questions of what we want agriculture to do.

David Fraser describes the polarised views on modern agriculture in terms of the ‘new perception’ and the ‘neotraditional portrayal’. In the new perception, agriculture is regarded as detrimental to animal welfare, controlled by large corporations, motivated by profit, causing world hunger, producing unhealthy food and harmful to the environment. It is a dichotomy between a negative view that we are “Future Eaters” (as in Tim Flannery’s book of the same title) and the more constructive view that we are “Future Makers” (as argued by David F Smith in his paper “In Praise of Exotic Species”, Quadrant, February 2014).

At the other end of the spectrum, Fraser defines the neotraditional portrayal of the industry as beneficial to animal welfare, mainly controlled by families and individuals, motivated by traditional animal care values that lead to profit, augmenting world food supplies, producing safe and nutritious food and not harmful (often beneficial) to the environment.  

Literature from both ends of the spectrum tends to provide information that supports one of these polarised viewpoints, while often failing to acknowledge the complexity of the debate or to attempt to establish a middle ground.

Research undertaken by the International Food Policy Research Institute in Washington indicates that the demand for animal protein will double within 20 years. This demand is being propelled by urbanisation and increased income, particularly in the developing world.  It is common knowledge now that we are heading for a world population of 9 billion, so projections in demand are entirely credible.

However, if we are going to increase livestock production – for example, to double protein production – major changes will be required in how we produce our product. If we increase per-animal productivity two- or three-fold, then we would also have to reduce the environmental impact per animal by a similar amount, given present community expectations that total impact should not grow. While this may be technically possible within a reasonably short timeframe, is this what we want agriculture to do? How do we want to use the resource?

The agricultural production sector is often criticised for not meeting the triple bottom line (social, economic and environmental), yet by the same token we, at least in the developed world, continue to vote in the supermarkets for cheaper food.

This not only challenges the viability of farming, it also means that much of what we do as societies is at cross purposes.  There are many questions to be asked, such as:

  • Will we accept that profitability of farm enterprises, and especially family businesses, is a legitimate aspiration?
  • Will we enact the market and price reforms, and equity distributions, needed to achieve this?
  • Will consumers accept the harvesting of native species, such as the Red or Grey kangaroo in countries like Australia, as an ecologically sustainable source of meat?
  • Should we be paying more for food and consuming less in the interests of lower energy intake and lower obesity?

There is even the question of “What is food?” 

For example, rather than seeing beef just as a staple in the food system, could our mindset change so that we also think of beef as a source of zinc and iron that can be injected into diets at critical times in human development – for example, in early childhood for brain development and in the early teenage years to combat iron deficiency? (Zinc and iron deficiencies appear to be two major nutritional issues in both the developed and developing worlds.) In doing so, we change the whole value proposition for meat, with the prospect of better returns to producers and enhanced benefits to consumers. For those who are just after the eating experience, nothing changes.

There are myriad challenging questions to be asked, questions that cannot be answered with a simple “yes” or “no” but must be debated vigorously by a range of stakeholders in the public arena. An effective response will be systemic.

Critical is the need to somehow re-connect consumers with the processes of food production. There is some hope that growth in the urban agricultural sector will enable more people to understand the complexities and vagaries of producing food - but that is only the tip of the iceberg.

Consumer confidence in science has been shaken in recent times by issues relating to food safety and diseases, such as bird flu. To avoid misrepresentation, scientists have at times been reluctant to acknowledge any potential risk to food safety for fear that such an admission will distort the debate. Yet, with uncertainty comes awareness and planning for any potential unforeseen consequences. We cannot remain silent (see https://www.crawfordfund.org/news/news-what-happens-when-we-remain-silent-january-2015/).

Risks can be managed effectively without raising public concern if potential risks to the food chain are acknowledged and a system of surveillance, monitoring and detection is put in place that enables quick remedial action to address any problems that may arise.

Scientists should not isolate themselves from controversy; the technical complexity of the issues we are now dealing with in the community is such that we need to participate in the public debate – we need people who understand the science to engage. Nor should other members of the community ignore the requirement to engage openly and responsibly in that debate.

Undoubtedly, we need more simultaneous research at all levels – from sub-cellular to ecological – to develop a greater understanding of issues at the boundaries of science and social and community impacts.  We also need parallel efforts to explore how to reform our markets and to give better price signals and financial returns to our farmers, and to educate consumers about the true cost of food.

We need a public debate too - an informed debate based on genuine analysis.

And we must accept the urgent need to do this.

What is it that we want our agriculture to do? 


