
Biomarkers – why medicine is about to undergo the biggest single change ever

Professor Tony Freemont

Proctor Professor of Pathology, University of Manchester and Director of MMPathIC (Manchester MRC/EPSRC Molecular Pathology Innovation Centre).

A brief history of medical advancement

If you look at medical care over the past few centuries through the eyes of learned people living at the time, you gain a better appreciation of how the things we now take for granted impacted on their understanding of disease, illness and therapeutics.

The concept of the “seats of disease” (the relationship between abnormalities of organs and patients’ symptoms) had only just been proposed in the middle of the 18th century, when the physician and campaigner for public health, Thomas Percival (1740-1804), was embarking on his medical training. Even the classification of diseases as we might understand it today (infections, cancer, trauma, endocrine etc.) was only coming into acceptance when John Fielden (1784-1849), the politician and social reformer from Todmorden, was in his 20s.

Imagine how Joseph Whitworth (1803-1887), the engineer and entrepreneur who bequeathed much of his fortune to the people of Manchester, would have followed the development of anaesthetics in the 1840s, 50s and 60s, lest he should one day require surgery. In fact, the mid-19th century saw a mini-boom in medical care. This included, in addition to anaesthesia, identifying the role of bacteria in causing infections, the discovery of the “cell” as the fundamental component of the body (displacing, in this context, the organ), and the whole idea of the aetiology of disease (diseases having understandable causes), which went way beyond bacterial infections.

Over the next century there was a steady acquisition of medical advances, such as the development of X-rays for diagnosis and treatment, until the next boom in medical knowledge around the middle of the 20th century, when poet and author Ted Hughes (1930-1998) began publishing his work. Developments during this period still influence our own views of disease and medical care. They include the discovery of the structure of DNA, the development of antibiotics, the invention of joint replacements, and the use of CT and MRI scanners.

The next medical boom

If history is to repeat itself as we move towards the middle of the 21st century, we might expect to see the earliest stirrings of the next “medical boom”. I would like to suggest that we are about to see just such a technologically driven change in medical practice, one so fundamental, exciting and disruptive that it will cause a complete revision of our views of disease and of the ways in which medicine thinks of patients and prevents and manages disease.

Like the baby steps at the start of all scientific advance, the “new medicine” has evolved slowly and steadily. It is based on an increasing realisation over the last 50 years that disease can be understood at the molecular level (rather than that of the organ or cell). However, it is only through recent advances in engineering, such as rapid DNA sequencing and mass spectrometry, that it has become possible to identify and quantify large numbers of complex molecules. It is this that has opened the door to using molecular information to inform diagnosis and patient management.

The “OMICS” revolution

It is now possible to have a snapshot of the molecular make-up of blood, other body fluids and tissues in a way that defines individuals by their genome (DNA), transcriptome (expressed genes), proteome (proteins), lipidome (fats) and metabolome (small molecules). The same applies to defining diseases, bacteria, and cancers. This capacity is truly game-changing, hence the term “omics revolution”.

The problem we now face is that there are many orders of magnitude more molecules than cells, so understanding the make-up of individual patients, and how it interacts with the molecular biology of disease, requires us to handle a huge volume of data. We are each composed of thousands of fats, tens of thousands of sugars and hundreds of thousands of proteins, and these can occur in a myriad of combinations.

The need to handle such vast amounts of “omics” data has driven the development of powerful computers and a new science, “informatics”, designed to handle “big data”. These technological advances have allowed us to think about experimental bioscience in a new way. The fact that we no longer need to do hypothesis-driven science is a case in point. Not long ago this would have been considered treasonable!

Traditional science relies on hypotheses. A hypothesis is a statement, based on available evidence, that predicts something that is not yet known but can be supported or refuted through carefully crafted experimentation or observation. With “big science” we now have the alternative option of revealing knowledge from the analysis of vast amounts of data.

Take the ‘old-fashioned’ hypothesis: some cancers kill; I think it might be because they spread; I have discovered that cancers that spread express molecule ‘x’; therefore my hypothesis is that expression of molecule ‘x’ makes the outlook worse, and I will look for molecule ‘x’ in these cancers.

And the new: how do I predict who will die from their sarcoma? Some sarcomas kill because they have traits defined by genetic mutations in the sarcoma cells. Therefore, if I start by knowing all the genes abnormally expressed in sarcoma cells, I should be able to identify one or more that affect survival.
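This contrast between hypothesis-driven and data-driven discovery can be made concrete with a short sketch. The code below is purely illustrative and uses simulated data: the cohort size, gene panel and outcome labels are invented, and a real analysis would use a proper survival model and an independent validation cohort rather than this crude two-group comparison.

```python
# Purely illustrative, simulated screen: rank every measured gene by how
# strongly its expression separates patients with good and poor outcomes.
# All data below are randomly generated; nothing here comes from a real study.
import numpy as np

rng = np.random.default_rng(0)

n_patients, n_genes = 200, 5000
expression = rng.normal(size=(n_patients, n_genes))           # rows: patients, columns: genes
survived = rng.integers(0, 2, size=n_patients).astype(bool)   # hypothetical outcome labels

# Score each gene by the standardised difference in mean expression between
# the two outcome groups (a crude stand-in for a proper survival analysis).
mean_diff = expression[survived].mean(axis=0) - expression[~survived].mean(axis=0)
scores = np.abs(mean_diff) / (expression.std(axis=0) + 1e-9)

# The top-ranked genes are candidate prognostic biomarkers, which would then
# need validation in an independent patient cohort.
top_candidates = np.argsort(scores)[::-1][:10]
print("Candidate biomarker gene indices:", top_candidates)
```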

Biologics

Understanding disease at the molecular level has also spawned a new type of therapeutics based on the concept that most of us spontaneously launch molecular defences against diseases, which are more or less successful. If we could harness these naturally occurring molecules, or produce them outside the body, we would have treatments composed of naturally occurring molecules that have been tried and tested by nature for tens of thousands of years. This is now possible and has initiated a new area of therapeutic science based on the use of “biologics”.

Precision medicine

So the world of medicine has suddenly changed. By understanding individuals and diseases at the molecular level, there is now the possibility of tailoring treatments to the needs of specific patients. At present, a group of patients with the same disease will often receive the same initial treatment, yet not all will benefit from it. Of every 10 patients, say, seven may respond positively to different degrees, two may not benefit and one may be made worse by the treatment. There is now the prospect, albeit currently a distant one, of all 10 receiving a slightly different treatment so that all benefit optimally. This is known as “personalised” or “precision medicine”. It would be wrong to raise hopes prematurely, but the concepts and technologies exist now that will one day bring this goal to fruition.

But the technology extends further. There has been a drive to acquire molecular data on patients (and healthy people) when they are going about their daily activities, or recovering from surgery or illness, and certainly not to restrict it to when they are in hospital or the doctor’s surgery. This is no futuristic dream, but simply an extension of the Fitbit, the finger-prick test used by diabetics to monitor their blood sugar, or 24-hour heart rate monitoring.

Suppose we could harness molecular data from wearable or indwelling devices. How easy would it be to monitor a patient’s response to treatment, or a patient’s changing molecular profile, to detect when a disease starts to recur? The challenge is designing and powering sensors to do just this, and world-class engineers in Manchester are amongst those leading the field. Remote monitoring of patients has fostered another area of data science called “mobile” or “e-medicine”, and this is another area where Manchester is leading internationally.
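To illustrate the idea (this is a hypothetical sketch, not a description of any real device or clinical algorithm), readings from such a sensor could be compared with the patient’s own post-treatment baseline and flagged when they drift too far from it:

```python
# Hypothetical sketch: flag sensor readings that drift well outside a
# patient's own post-treatment baseline. All values are invented for illustration.
from statistics import mean, stdev

def flag_departures(readings, baseline, threshold=3.0):
    """Return the indices of readings more than `threshold` standard
    deviations away from the patient's personal baseline."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [i for i, value in enumerate(readings) if abs(value - mu) > threshold * sigma]

baseline = [5.1, 4.9, 5.0, 5.2, 4.8, 5.1, 5.0]   # stable biomarker levels after treatment
follow_up = [5.0, 5.1, 5.3, 6.9, 7.4, 8.2]       # later remote readings (arbitrary units)

print("Readings to review with the clinical team:", flag_departures(follow_up, baseline))
# -> Readings to review with the clinical team: [3, 4, 5]
```

The point of comparing against the patient’s own baseline, rather than a population average, is precisely the “precision medicine” one: each individual’s normal molecular profile is different.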

Infinite combinations?

However, there is a problem. If all individuals are different (with unique molecular make-ups), and if all diseases are different (not only in their complex molecular profiles, but also in the way their molecules interact with the molecules of the individual patient), how on earth do we harness all that data and turn it to diagnostic or therapeutic advantage in real time?

Whilst one day we might have the technology to allow this level of complexity to be assimilated and used, today we don’t. But nature comes to our aid. Molecules don’t work independently, but rather as a hierarchy.

Rheumatoid arthritis illustrates this point well. At the molecular level, rheumatoid arthritis is one of the most complicated diseases it is possible to imagine. In the 1980s a video was made of a woman with rheumatoid disease being treated with an antibody to a naturally occurring molecule called TNF-α. The effect was astonishing. Following treatment she was able to walk, when before she could only hobble. The experts behind the discovery had recognised that although the molecular profile of the disease was complex, and differed from patient to patient, TNF-α was a master regulator of the molecular events occurring in the joints of their patients. Turn it off and all the disease processes would be damped down. TNF-α therefore became a therapeutic target. All that was needed was the bullet. This came in the form of a “biologic”: a molecule, an antibody, against TNF-α.

Molecular biomarkers

Recognising that there is a hierarchy of molecules involved in disease means that the expression of certain specific molecules (or small groups of molecules) can be used diagnostically. Furthermore, changes in their expression can enable the success, or otherwise, of treatment to be monitored. These molecules, or groups of molecules, are called “molecular biomarkers”.

Identifying molecular biomarkers that assist diagnosis (diagnostic biomarkers), help monitor the effects of treatment (theranostic biomarkers), predict outcomes (prognostic biomarkers) or could become, like TNF-α, a target for treatment (therapeutic biomarkers) is taking off as a biomedical discipline. As new biomarkers are discovered (for example in cancers, inflammatory diseases, and in healing or non-healing wounds), there is a drive for them to enter clinical use.

Manchester is at the forefront of the endeavour to bring novel molecular biomarkers to the clinic. Manchester’s lead has been made possible, not only by the superb biomedical, engineering and clinical science across its Universities and Healthcare Trusts, but also by local politicians who have embraced the devolution of the health and social care budget from central government. Their vision for a healthcare system driven by technological advance and tailored for the local population is unique and inspirational. The ‘COPD Salford Lung Study’, a collaboration between public and private sector organisations, which used near real-time data, is just one example of the pioneering health analytics work taking place in Manchester.

What, then, will this “molecular revolution” in medicine mean for us? Medicine will change; of that there is no doubt.

Monitoring patients in the community will mean earlier detection of disease, earlier initiation of treatment, and a move towards real-time management of disease in, or near, the patient’s home. This will reduce the likelihood of disease going out of control and requiring admission to hospital for a change of treatment. Fewer hospitals, therefore, will be needed, making health services more cost-efficient.

Reclassification of disease

There will be a reclassification of disease. Already large companies, working in conjunction with the NHS and universities, are moving from a disease taxonomy based on changes at the organ and cellular level to one based on the expression of molecules. Diseases that until recently were regarded as quite disparate (e.g. cancers and joint diseases) are suddenly seen as close relatives in terms of deranged molecular function, with similar therapeutic biomarker profiles. Put another way, there are, for instance, inflammatory and malignant diseases that will respond to the same biologic. This shouldn’t be a surprise; after all, the drug methotrexate is used to treat both cancer and inflammatory arthritis. So all healthcare professionals will need to change the way they think about disease and how they investigate and manage patients.

Our increasing understanding of molecular biomarkers will enable us to define diseases in different ways, and completely new diseases will be recognised.

For instance, single disorders will be found to have several subtypes that have different outcomes (prognosis) and will require different treatments, as illustrated by the recent reclassification of Type 2 Diabetes.

The future of health-related professions

Healthcare professionals of the future will be guided as much by the results of molecular biomarker tests as they will by the patient’s history and physical findings. New specialisms and roles will emerge, not only in healthcare delivery itself, but also in the array of scientific, computing and engineering disciplines which are already supporting developments in biomarker discovery, identification, measurement, monitoring and therapeutic targeting.

New branches of medicine will evolve and there will be increasing numbers of doctors working behind the scenes at the molecular level, guiding diagnosis and even directing treatment. Not only that, but machine learning and artificial intelligence will help to hone complex tests to gain the most from the smallest impact, whether that is measured in terms of cost, pain or inconvenience to the patient.

For the doctor sitting with a patient in front of them, a key component of the next medical boom will be the creation of a platform that combines patient data (the “omics”, the biologics and monitoring) with the patient record. This will allow the doctor to combine the best “scientific” precision medicine with the “art” of medicine, using their skills of empathy and communication.

This report is based on a paper written by Tony Freemont, which formed the basis for his talk to the Hebden Bridge Literary & Scientific Society.

With thanks to Tony Freemont for providing this paper and to Ingrid Marshall for adapting it to reflect what was said on the night whilst retaining bonus explanatory material from the paper itself.


The BBC: Public Servant or State Broadcaster?

Dr Tom Mills

Our warm shared memories of favourite children’s television colour our attitude to the BBC – the well-meaning Auntie of the nation, provider of our popular cultural heritage. However, Dr Tom Mills, from the Centre for Critical Inquiry into Society and Culture at Aston University, speaking in Hebden Bridge, argued that we are buying into a myth of a disinterested public service.

His analysis of the way the BBC functions focused on the political stance of the organisation, and hence on its journalism rather than its arts or entertainment output. Evidence shows that while the broadcasting output includes different people with contesting ideas, the organisation as a whole does have a bias towards the prevailing ideologies of an ‘elite’.

A quick look at the history of the BBC provided him with some evidence for this institutional bias. The BBC was originally founded by radio manufacturers to provide content that would sell their products, but it soon became obvious that radio broadcasts could influence public opinion. The shaper of the BBC ethos, Director General Lord Reith, explicitly wanted it to be part of the establishment. After the defeat of the 1926 General Strike, for example, Reith broadcast a recital of Jerusalem, thanking God for saving the country. A more liberal Director General, Hugh Greene, presided over such anti-establishment programmes as ‘That Was The Week That Was’ but was also keen to monitor the activities of left-wing producers. Mills argued that there was inevitably a tension between politically appointed leaders of the Corporation and programme makers.

One of the biggest changes to the BBC came with the ‘neoliberalism’ of the Thatcher years, when, after the ‘brutal removal’ of Alasdair Milne, John Birt took over as Director General and set about embedding competition and an internal market within the structure of the BBC. This was extended under Greg Dyke. One of the effects of this, Mills said, was that long-established journalists who had covered industrial relations were replaced by journalists who reported on business, and a conscious pro-business consensus was institutionalised in BBC reporting. So coverage of the 2008 financial crisis was defined by the views of a political elite.

Ten years later there is more pressure to open the BBC to competition and make it an arm’s-length provider of services commissioned from the private sector. To call the BBC a ‘state broadcaster’ implies a direct political control which is not the case – individual journalists at the BBC are independent, but they are also drawn from an elite social stratum, with 54% being privately educated and 45% holding Oxbridge degrees. This maintains a gulf between those making programmes and selecting news stories and the consumers and ultimate owners of the Corporation.

Mills believes that the links that exist between the BBC and government are invidious, with regular Charter renewal, the setting of the licence fee and the direct political appointment of the Director General undermining the independence of the Corporation and keeping it in a precarious position. We are at a turning point when the BBC TV of our fond memories is dead, with on-line programmes allowing viewers to curate their own individual experience and social media providing news. This offers an opportunity for the kind of democratisation of the BBC that challenges the establishment bias. Mills strongly believes in the BBC as a public service broadcaster, and thinks the challenge has to come from those who support it, and not just from those who want it to fail. Questions from the audience in the packed Waterfront Hall suggested that his talk might have begun that process.

With thanks to Sheila Graham for this report

Judith Weir: A Composer’s Life

Judith Weir is one of the country’s leading composers and was appointed to the ancient office of Master of the Queen’s Music in 2014. Part of her role is to bring a greater awareness of music to the wider public, and her talk for Hebden Bridge Literary and Scientific Society made clear her conviction about the importance of music as she opened a window onto a composer’s life.

She began by considering ‘composer’ as a job title, looking at the woman considered the first named ‘composer’, Hildegard of Bingen, and at how, for many famous composers, the arrangement and creation of music was often an intrinsic part of their main job as church or court musicians. Bach, for example, was employed by churches and many of his compositions were written for them. Mozart and his family were touring musicians who also relied on wealthy patrons, as did Beethoven, though he was very much composing on his own terms. Women composers such as Clara Schumann and Anna Magdalena Bach were also performers whose family background gave them space to write music.

Most composers have relied on teaching to make a living, with Elgar, Vaughan Williams and Holst all being teachers. Arts Council grants in the late twentieth century made life easier for some composers, and institutions such as the BBC also commissioned new music. Now composition is seen as an academic discipline in its own right and taught up to PhD level in colleges.

For Judith Weir learning to be a composer requires a great deal of listening, which she felt modern technology made easier, and also playing an instrument, in order to learn how everything works. Time spent having fun making music with friends is as crucial as an academic approach. Most important was mindset – time to be alone and a driving need to compose music.

Her own route to becoming a composer grew from circumstance. She was brought up in Aberdeenshire, where entertainment in her family often involved groups of friends playing folk music together, so that music was always a social thing. She began writing for her own group of friends, initially as a way of avoiding being sent out to play on wet or cold days at school! The accident of having a teacher who knew John Tavener meant that she was able to meet and talk to him about the music she was composing, and that contact stiffened her resolve to make composing part of her life.

Making her living as a composer has continued to involve teaching and collaborating with others, as a community composer funded by the Arts Council, as part of the Glasgow City of Culture and as resident composer for the City of Birmingham Symphony Orchestra with Simon Rattle. It was clear that these collaborations were part of the creative process as well as a way of getting her music performed. She felt that taking commissions was in no way constraining, but involved a stimulating meeting of minds.

It was fascinating to see some of the physical evidence of her composing process in a current piece for oboe. The first ideas are sketched in pencil, and often contain as many words as musical notation. This stage can take months, as gradually the whole composition becomes more defined – though always with Tippex on hand for changes. Even when she is playing sections at the piano the eraser is close by. Eventually there is enough certainty to move to an ink copy, which her long-standing editor will then work with until a computer program produces the different parts for the orchestra. At rehearsals there is a chance to refine further, with the precious advice of the musicians.

Working with others is absolutely part of the creative process, but Judith also stressed the need for time and space alone to allow ideas to emerge and develop. While skills can be taught, there is something at the heart of a composer’s life that lies outside the scope of teaching: the need to create music.

The audience at Hebden Bridge Waterfront Hall felt privileged to be allowed these insights into a composer’s life and to hear parts of two of Judith Weir’s compositions, a choral piece Maria Regina Celorum and a lovely violin duet Atlantic Drift, originally written for children.

With thanks to Sheila Graham for this report

Before the Big Bang

Professor Jeff Forshaw

6th October 2017

Jeff Forshaw is a professor of theoretical physics at the University of Manchester, specialising in the phenomenology of particle physics. He is passionate about communicating the excitement of the pursuit of science, and in 2013 was awarded the Institute of Physics Kelvin Prize – whose previous winners include Brian Cox and Jim Al-Khalili – for “…outstanding contributions to the public understanding of physics”.

Jeff’s involvement in cosmology, apart from its need for particle physics expertise, was sparked, he revealed, by agreeing to give a course to 4th year students in Manchester. He began, though, in a way that could have been entitled “How do we know that?”, by walking us through some examples of how we corroborate our ideas by a process of prediction and measurement: how big is the Earth? how old is the Atlantic Ocean? and the Earth? and the sun? until some of them become “nailed-on facts”, as he put it. In that category is the Big Bang. The “archaeological evidence” from many independent lines of prediction and observation leads us to think, with an overwhelming degree of certainty, that the observable universe we see about us was once a very hot dense gas of particles in a volume no bigger than the Waterfront Hall. Already, at that time, it had slight fluctuations of density, whose slightly higher density portions gradually collapsed under gravity to form the galaxies, in filaments and voids, observed today. It expanded into the present state of affairs in a manner that is entirely predictable and understandable in terms of the properties of particles and gravity that we know about from our laboratory experiments. That hot gas and its subsequent expansion he defined as the Big Bang. It is also called the Standard Cosmological Model. What happened before that he would come to later.

The clinching evidence was a “photograph”, as he put it, of the last vestiges of free charged particles (plasma) 380,000 years after the Big Bang, the radiation (photons) from which forms the Cosmic Microwave Background. 90% of the photons from that era have travelled through the mostly empty space of our universe for nearly 14 billion years without collision. They have cooled from a few thousand degrees Celsius to about -270 °C (2.725 degrees above absolute zero, to be more accurate) on average, due to the intervening expansion of the universe, which stretches their wavelengths. From the “colour”, i.e. the energy, of the photons one can infer the temperature of the original plasma; it varies across the sky by minute amounts in a strikingly beautiful mottled fashion whose structure is spectacularly accurately predicted by the Model, using measurements and properties from completely independent observations. It could so easily have been otherwise. Jeff himself had written a computer program for the 4th year students to make that prediction – simple, he said, compared to predicting the outcome of kicking a bucket of water, for example.
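As an aside (this is standard textbook cosmology rather than a figure quoted on the night), the cooling follows directly from the way expansion stretches wavelengths: the photon temperature scales with the redshift z of the last-scattering surface, conventionally z ≈ 1100, so

\[
T_{\text{emission}} = T_{\text{today}}\,(1+z) \approx 2.725\ \text{K} \times 1100 \approx 3000\ \text{K} \approx 2700\ ^{\circ}\text{C},
\]

consistent with “a few thousand degrees Celsius” then and the 2.725 K (about -270 °C) we measure now.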

So how did the Big Bang, that hot dense gas the size of the Waterfront Hall with slight density perturbations, come to be? Here, he admitted, we were moving into the realm of speculation, but ideas that were seen as highly speculative 5-10 years ago are, through the process of working out bit-by-bit, predicting and comparing, becoming increasingly accepted as feasible. He postulated the existence of some “cosmic treacle” called the inflaton field. A present-day analogue surrounds us all the time: the Higgs field, which is responsible for generating mass, was predicted 50 years ago by Peter Higgs (and others), and its excitations were recently observed at the Large Hadron Collider at CERN, Geneva. A special property of the inflaton field is that its energy drives a huge inflation of the universe and guarantees certain properties of the present universe, for example “flatness”, that are difficult to explain otherwise. The Waterfront-Hall-sized region of hot dense gas, which eventually became our observable universe, may well have been a region no bigger than a billionth of the size of a proton. Some quanta of the field (inflatons), after a period of huge inflation, decay into “ordinary matter”, and the rest is history, as they say. Jeff showed us a computer simulation he had written of the fluctuating field expanding and decaying into particles, then subsequently condensing into galaxies. The inflaton field is subject to quantum fluctuations, which give rise to just the right size of density fluctuations at the start of the Big Bang.

Finally he speculated that inflation is a continuous process and that our universe is just one of many, possibly infinitely many, universes with the same—or possibly different—physical properties. This is the multiverse conjecture. It seems perhaps that the Big Bang and the Steady State Model may be companion theories after all.

Much of this is in his new book, “Universal: A Guide to the Cosmos”, co-authored with Brian Cox.

The audience gave Jeff spontaneous applause for his passionate and entertaining presentation. He fielded a few questions and afterwards was surrounded by inquirers until we dragged him and his partner away for dinner.

With thanks to John Allison for this report


Philosophy, Democracy and the Demagogue

Or how Plato can help us deal with Donald Trump:

Professor Angie Hobbs

24th March 2017

Angie Hobbs is the first academic to be appointed Professor of the Public Understanding of Philosophy in the UK (if not the world). With demagoguery on the rise, Professor Hobbs believes it is imperative to keep in mind the wisdom of ancient Greek philosophers, such as Plato, who themselves lived in tumultuous times.

After defining ‘democracy’ (rule by the people) and ‘demagoguery’ (leading of the people, often manipulatively), Professor Hobbs explored why she regards the ‘will of the people’ to be such a misleading term.

Who are ‘the people’ exactly? Everyone in a state or group? Everyone who can vote? Everyone who did vote? Or only those who voted for the winning side? It is dangerous to equate some or all of the electorate with ‘the people’ as a whole since this implies that those who did not or could not vote, or those who voted for the losing side, are not part of the people.

‘The majority’ is another term that warrants further scrutiny, since it is not as fixed as it may at first appear. Following an election, individual voters may have a change of heart. Also, the composition of the electorate is in a state of flux: new voters come of age and others die, for instance. Such changes could produce a different majority outcome.

Clearly it would be impractical to hold constant elections to take account of possible shifting majorities. Nevertheless, opined Hobbs, it is important to recognise that a democratically elected government may often not be supported by the majority of even the electorate – let alone the populace – for most or indeed all of its duration. This raises the question: what does ‘rule’ by the electorate really mean?

Is ‘rule’ no more than a ballot on a particular day? On the contrary, urged Hobbs, the intellectual foundations of liberal democracy recognise that each individual person has the right to a voice and to be heard. A ballot should, therefore, be seen as playing an important role in an ongoing conversation in which citizens – regardless of whom they voted for, or even whether they are of electoral age – can take part. Thus it is more accurate to refer to the ‘wills of persons’ than to the ‘will of the people’.

True ‘rule’, Plato controversially claimed, can only take place if each individual ‘ruler’ (or voter) acts on well-informed, deliberative rational choice. Those who don’t are at risk of being manipulated by an opportunistic demagogue who, claiming that only they truly understand the ‘will of the people’, is elected to power and then proceeds to subvert democracy into tyranny. Plato depicted just such an alarming chain of events in Republic 8 (562a-569c). It makes for a sobering read.

In terms of Plato’s analysis the United States is at a turning point, said Hobbs. Now is the time to heed Plato’s warning and stand up for liberal democracy in every peaceful way imaginable.

Thanks to Ingrid Marshall for this report.