Reports

Richard described his tussles with the publisher over the book’s title, driven by concerns that public preconceptions might limit the book’s ‘shelf’ appeal if the wrong impression were given. Those preconceptions originate in the large volume of writing on, or set in, Yorkshire since the Industrial Revolution, driven by demand from the increasingly affluent residents of the West Yorkshire towns, many of whom were immigrants from other parts of Britain who wanted to know about their new home.

Richard noted that inward migration and geology/geography have been highly significant drivers of human activity throughout Yorkshire’s history, influencing its culture, language and built environment, as well as its shifting boundaries and loyalties over time. From the Iron Age Parisi, who lived in East Yorkshire near Hull (they came from around, and gave their name to, Paris), to the Normans, who finally welded Yorkshire into the Kingdom of England by turning much of it into a wasteland, Romans, Welsh, Angles and Vikings have all ruled over all or parts of Yorkshire. All were attracted by the rich soils of the Vale of York and its stone, minerals and other resources, most notably wool; all have left their mark on the landscape and language; all are ancestral to those who live here today.

Yorkshire, centred on York, was for centuries a rival centre of wealth and power to London and the South, which anyone who would rule England had to control. The many great castles, estates and monasteries which still inhabit the landscape are a legacy of this, as are the sites of the great battles fought over them along the narrow corridor between the impassable marshy east and hilly west. The many navigable rivers, which made the marshes and up which invaders rowed, also facilitated trade from an early date. In medieval times the unique qualities of Yorkshire wool became a prized resource for the weaving of fine cloth in Flanders and Italy. Wool production and trade thus became a major driver of Yorkshire’s economy and of the development of a local cloth industry. The wealth and skills accumulated led to the early adoption of industrialised cloth manufacture, and to those affluent, literate folk who wanted to read about Yorkshire. But this is where we came in.

Susan Owens is a freelance writer, curator and art historian – until a few years ago Curator of Paintings at the Victoria and Albert Museum in London. She says she’s now transitioning to becoming a ‘cultural’ historian, and this broadening of approach certainly showed in her deeply fascinating lecture. The Waterfront Hall was packed (yet again!) to hear her talk about how the ghost has been represented in British art and literature from medieval times until the present day. What a brilliant and unexpected subject!

Susan’s book The Ghost: A Cultural History was published by Tate in 2017, and she began her talk by explaining where the idea had come from. When she and her husband moved from London to an ancient house in a Suffolk village five years ago, she said, she was surprised that the most frequent question her friends asked about it was ‘Is it haunted?’ This set her thinking …

She began with a simple example of how popular ‘celebrity ghosts’ now are – especially with the National Trust!! – by putting up images of six separate houses said to be haunted by Anne Boleyn! But centuries ago, ghosts were a matter of life and death. In the Middle Ages life in Britain was short and dominated by the Catholic Church. Most people could not read, so pictorial images in churches were important. Faded frescoes from parish churches show medieval ghosts represented as clothed skeletons, often taken to be messengers from individuals ‘trapped’ in Purgatory, pleading for prayers to be said on their behalf to release them to heaven.

The Reformation banished the idea of Purgatory, but Susan Owens showed how the skeletal ghost was transformed during the 17th Century into a malevolent ‘spirit’, an agent of the Devil, against whom good Christians had to fight. Preachers used their pulpits to persuade their congregations that these spirits were invisible but everywhere. This was the ghost used as a tool of social control.

The rationalist Enlightenment undermined the political purchase of organised religion, but it created an intellectual and artistic space for the appearance of a more individualistic and emotional ghost in the late 18th Century Romantic movement. Susan Owens used images of classic paintings by the likes of Romney and William Blake to show how certain iconographic traditions developed to portray ghosts as representing extreme emotional states of grief or fear. Ghosts came to be enveloped in clouds, and to wear white sheets – an obvious reference to burial shrouds.

Then in the mid-nineteenth century the camera appeared on the scene and both revolutionised and popularised the ghost. ‘The camera never lies’, it is said; but it certainly did when it came to ghosts. It took a long time for an image to be properly captured in early photography, so if a person stood in a shot for a brief moment and then moved out of frame, what was left was a faint, see-through image of an individual. Proof that ghosts existed! And a recognition, for the first time, that they were see-through! So a new culture of the ghost emerged that led to the establishment of a tradition of classic ghost stories by the likes of M.R. James, and of course the ghost has been a staple of the film industry for at least a century.

But why does our fascination with ghosts remain? Why do we so often ask the question of others, ‘Do you believe in ghosts?’ Susan Owens ended her talk by suggesting an answer to this – we all use the word ‘haunted’ to describe our own deepest fears or most fearful memories, so maybe ghosts play a necessary role in allowing us to confront our own private demons.

In an email to the Lit & Sci Society, Susan Owens said that it had been an ‘honour’ to speak to such a ‘fantastic audience’ as that in Hebden Bridge.

Chris Renwick teaches Modern British History at the University of York; and in 2017 he achieved fame beyond the confines of the academic world with the publication of his book Bread For All, in which he put forward a new and challenging view of the origins of the British Welfare State. Having done his first degree at Leeds, he expressed great pleasure in returning to Hebden Bridge, and it was to an almost full Waterfront Hall that he presented a precis of his book’s argument.

He began by outlining what our ‘welfare state’ comprises, and argued that we have come to have far too limited a view of its ‘cluster of institutions’, reducing it to the provision of state assistance to the poor, whereas it must be seen to also include pensions, the NHS, and state education. Just assisting the poor, in fact, he argued, was the system that the welfare state of the twentieth century replaced; and Chris gave some shocking evidence as to the brutality of the Poor Law and its prison-like institutions that existed across Britain from the 16th to the 19th centuries. It was on the back of the pioneering social research of institutions such as Rowntree that politicians in the decade before the First World War began to introduce initial provision of pensions and social insurance. Yet the real impulse to do something about the ‘state of the poor’ was not philanthropic, Chris Renwick argued, but came from the fact that such a high proportion of volunteers to fight in the Boer War were turned down on medical grounds. They were simply too feeble to fight for their country. We now think of eugenics as a tainted racist science, but in the first half of the last century it was a ‘respectable’ discipline, developed in an attempt to produce a healthier and stronger population in Britain. William Beveridge himself, Chris reminded his audience, had a long interest in eugenics.

Beveridge was also, politically, a Liberal; and at the core of Chris Renwick’s argument is a claim that our welfare state is most importantly a creation of liberalism, not socialism. The initial Beveridge Report of 1942 was all about establishing a universal scheme of national insurance. This wasn’t to be paid for out of taxation, however, but from contributions by every worker. Beveridge was very close to Maynard Keynes, and Renwick said this influenced his view that a system of national insurance would only do its job if three other things were also in place. These were a national health service free at the point of use; a universal system of free and compulsory state education up to the age of sixteen; and full employment. Only this package could provide an adequate basis on which communities and individuals could work to fulfil their potential – and thus become useful citizens. Our welfare state was never meant to be just a safety net, Chris Renwick argued; it was also about providing a solid basis on which individuals could go on to create a better society.

An animated Q & A session followed Chris’s talk, in which he shared the concern that many in the audience showed for the future of our welfare state in a political climate that once again seems to victimise the weakest in society and gives little positive role to the state. It was a sobering conclusion to an enlightening evening when Chris said present debates over the poor in Britain remind him of those in the early nineteenth century.    

Tim Birkhead launched the 2018/9 Season in great style to a hall filled to capacity. 

Tim delivered his information-packed talk in jargon-free plain English. Opening by saying that he ducked answering when reporters inevitably asked the question “Which came first, the chicken or the egg?”, he went on to describe the history of “oology” (the study of eggs). The talk dealt with egg collecting from the earliest 17th century naturalists through to the egg-collecting industry that developed in the 19th century, lasting until 1954 when it was made illegal after campaigns by the RSPB. Tim told how the beauty of the colouring and varied shapes of eggs, along with rivalry between collectors, had been drivers of this passion for egg shells and the high prices they could command. He then related how scientific study over the years had begun to cast light on the origins and role of these features in the process of avian reproduction as it evolved over geologic time. Starting with the dinosaurs, from which all birds descend, each bird species developed and adapted towards its current ecological niche, and its anatomy and eggs changed to maximise reproductive success.

This led into a detailed explanation of the anatomical structures and biochemical processes that female birds employ to produce eggs of such varied colours and shapes, highlighting areas that are still speculative and remain to be elucidated in future research. As an example, he related his own team’s experimental discovery that the entry of multiple sperm into the ovum (polyspermy) was essential to the viability of a fertilised bird egg. He offered some suggestions as to why this might be, but said no conclusive answer had yet been determined.

At the end of the lecture Tim pointed out that he had now incidentally answered the forbidden question: the answer being neither, because the dinosaurs came first.

Of course, one could then ask which came first, the dinosaur or the egg, but there’s a time and a place for everything.

After the usual thoughtful questions from our audience, Tim said that his very long-running study of the guillemots on Skomer Island, the longest continuous study anywhere, was being threatened by funding cuts. He felt that, given the mounting threats to the colony, largely as a result of global warming, it was vital that the work continue. He asked that people consider donating to the crowdfunding appeal which will finance the continuation of the research for the next 10 to 15 years.

If you wish to make a donation please use the link below.

Support Tim’s long-term study of guillemots on Skomer Island

Biomarkers – why medicine is about to undergo the biggest single change ever

Professor Tony Freemont

Proctor Professor of Pathology, University of Manchester and Director of MMPathIC (Manchester MRC/EPSRC Molecular Pathology Innovation Centre).

 A brief history of medical advancement

If you look at medical care over the past few centuries through the eyes of learned people living at the time, you gain a better appreciation of how the things we now take for granted impacted on their understanding of disease, illness and therapeutics.

The concept of the “seats of disease” (the relationship between abnormalities of organs and patients’ symptoms) had only just been proposed in the middle of the 18th century, when the physician and campaigner for public health, Thomas Percival (1740-1804), was embarking on his medical training. Even the classification of diseases as we might understand it today (infections, cancer, trauma, endocrine etc.) was only coming into acceptance when John Fielden (1784-1849), the politician and social reformer from Todmorden, was in his 20s.

Imagine how Joseph Whitworth (1803-1887), the engineer and entrepreneur who bequeathed much of his fortune to the people of Manchester, would have followed the development of anaesthetics in the 1840s, 50s and 60s, lest he might one day require surgery. In fact, the mid 19th century saw a mini boom in medical care. This included, in addition to anaesthesia, identifying the role of bacteria in causing infections, the discovery of the “cell” as the fundamental component of the body (displacing in this context the organ), and the whole idea of the aetiology of disease (diseases having understandable causes), which went way beyond bacterial infections.

Over the next century there was a steady acquisition of medical advances, such as the development of x-rays for diagnosis and treatment, until the next boom in medical knowledge around the middle of the 20th century, when poet and author Ted Hughes (1930-1998) began publishing his work. Developments during this period still influence our own views of disease and medical care. They include the discovery of DNA, the development of antibiotics, the invention of joint replacements, and the use of CT and MRI scanners.

The next medical boom

If history is to repeat itself as we move towards the middle of the 21st century, we might expect to see the earliest stirrings of the next “medical boom”. I would like to suggest that we are about to see just such a technologically driven change in medical practice, one that is so fundamental, exciting and disruptive that it will cause a complete revision of our views of disease and the ways in which medicine thinks of patients and prevents and manages disease.

Like the baby steps at the start of all scientific advance, the “new medicine” has evolved slowly and steadily. It is based on an increasing realisation over the last 50 years that disease can be understood at the molecular level (rather than that of the organ or cell). However, it is only through recent advances in engineering, such as rapid DNA sequencing and mass spectrometry, that it has become possible to identify and quantify large numbers of complex molecules. It is this that has opened the door to using molecular information to inform diagnosis and patient management.

The “OMICS” revolution

It is now possible to have a snapshot of the molecular make-up of blood, other body fluids and tissues in a way that defines individuals by their genome (DNA), transcriptome (expressed genes), proteome (proteins), lipidome (fats) and metabolome (small molecules). The same applies to defining diseases, bacteria, and cancers. This capacity is truly game-changing, hence the term “omics revolution”.

The problem we now face is that there are many orders of magnitude more molecules than cells, and so understanding the make-up of individual patients, and how that interacts with the molecular biology of disease, requires that we handle a huge volume of data. We are each composed of thousands of fats, tens of thousands of sugars and hundreds of thousands of proteins, and these can occur in a myriad of combinations.

The need to handle such vast amounts of “omics” data has led to the development of powerful computers and a new science, “informatics”, designed to handle “big data”. These technological advances have allowed us to think about experimental bioscience in a new way. The fact that we no longer need to do hypothesis-driven science is a case in point. Not long ago this would have been considered treasonable!

Traditional science relies on hypotheses. A hypothesis is a statement, based on available evidence, that predicts something that is not yet known but can be supported or refuted through carefully crafted experimentation or observation. With “big science” we now have the alternative option of revealing knowledge from the analysis of vast amounts of data.

Take the ‘old-fashioned’ hypothesis: ‘Some cancers kill. I think it might be because they spread. I have discovered that cancers which spread express molecule “x”; therefore my hypothesis is that expression of molecule “x” makes the outlook worse, and I will look for molecule “x” in these cancers.’

And the new: ‘How do I predict who will die from their sarcoma? Some sarcomas kill because they have traits defined by genetic mutations in the sarcoma cells. Therefore, if I start by knowing all the genes abnormally expressed in sarcoma cells, I should be able to identify one or more that affect survival.’
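As a purely illustrative aside (not part of Professor Freemont’s paper or talk), the shape of that data-driven reasoning can be sketched in a few lines of Python. The gene names, patient data and scoring rule below are all invented; a real screen would use proper survival models and correction for multiple testing.

```python
# Toy sketch of a data-driven biomarker screen (illustrative only).
# Assumes a matrix of gene expression values and a binary survival outcome;
# all data below are synthetic and the gene names are invented.
import numpy as np

rng = np.random.default_rng(0)

n_patients, n_genes = 200, 5000
genes = [f"GENE_{i}" for i in range(n_genes)]

# Synthetic expression matrix (patients x genes) and 5-year survival labels.
expression = rng.normal(size=(n_patients, n_genes))
survived = rng.integers(0, 2, size=n_patients).astype(bool)

# Make one hidden gene genuinely prognostic so the screen has something to find.
expression[~survived, 42] += 1.5

def screen(expression, survived):
    """Rank genes by the difference in mean expression between
    non-survivors and survivors, scaled by pooled standard deviation."""
    diff = expression[~survived].mean(axis=0) - expression[survived].mean(axis=0)
    pooled_sd = expression.std(axis=0) + 1e-9
    return diff / pooled_sd          # a crude effect-size score per gene

scores = screen(expression, survived)
top = np.argsort(-np.abs(scores))[:5]
for idx in top:
    print(f"{genes[idx]}: score = {scores[idx]:+.2f}")
```

The point of the sketch is simply that the analysis starts from everything that was measured and lets the data nominate candidate molecules, rather than testing one pre-specified hypothesis about molecule ‘x’.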

Biologics

Understanding disease at the molecular level has also spawned a new type of therapeutics, based on the concept that most of us spontaneously launch molecular defences against diseases, which are more or less successful. If we could harness these naturally occurring molecules, or produce them outside the body, we would have treatments that have been tried and tested by nature for tens of thousands of years. This is now possible and has initiated a new area of therapeutic science based on the use of “biologics”.

Precision medicine

So the world of medicine has suddenly changed. By understanding individuals and diseases at the molecular level there is now the possibility of tailoring treatments to the needs of specific patients. Presently a group of patients with the same disease will often receive the same initial treatment, yet not all will benefit from it. Of every 10 patients, say, 7 may respond positively to different degrees, 2 may not benefit and 1 may be made worse by the treatment. Now there lies the prospect, albeit currently distant, of all 10 receiving a slightly different treatment so that all benefit optimally. This is known as “personalised” or “precision medicine”. It would be wrong to raise hopes prematurely, but the concepts and technologies are around now that will one day bring this goal to fruition.

But the technology extends further. There has been a drive to acquire molecular data on patients (and healthy people) when they are going about their daily activities, or recovering from surgery or illness, and certainly not to restrict it to when they are in hospital or the doctor’s surgery. This is no futuristic dream, but simply an extension of the Fitbit, the finger-prick test used by diabetics to monitor their blood sugar, or 24-hour heart rate monitoring.

Suppose we could harness molecular data from wearable or indwelling devices. How easy would it be to monitor a patient’s response to treatment, or to detect from a changing molecular profile when a disease starts to recur? The challenge is designing and powering sensors to do just this, and world-class engineers in Manchester are amongst those leading the field. Remote monitoring of patients has fostered another area of data science called “mobile” or “e-medicine”, and this is another area where Manchester is leading internationally.
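As a small illustrative sketch (my own, not anything described on the night or built in Manchester), remote biomarker monitoring of this kind is conceptually simple once the sensor exists: keep a rolling baseline of each patient’s readings and raise an alert when a new value drifts far outside it. The device, the marker name and the thresholds below are invented.

```python
# Minimal sketch of remote biomarker monitoring (illustrative only).
# "CRP_LIKE_MARKER" and the alert rule are invented for this example.
from collections import deque
from statistics import mean, stdev

class BiomarkerMonitor:
    """Keep a rolling baseline of recent readings and flag outliers."""

    def __init__(self, window: int = 20, z_threshold: float = 3.0):
        self.readings = deque(maxlen=window)
        self.z_threshold = z_threshold

    def add_reading(self, value: float) -> bool:
        """Return True if this reading should trigger a clinical alert."""
        alert = False
        if len(self.readings) >= 5:                  # need a baseline first
            baseline_mean = mean(self.readings)
            baseline_sd = stdev(self.readings) or 1e-9
            z = (value - baseline_mean) / baseline_sd
            alert = abs(z) > self.z_threshold
        self.readings.append(value)
        return alert

# Simulated stream from a wearable sensor: stable readings, then a sharp rise.
monitor = BiomarkerMonitor()
stream = [5.1, 4.9, 5.3, 5.0, 5.2, 4.8, 5.1, 5.0, 5.2, 4.9, 12.4]
for day, value in enumerate(stream, start=1):
    if monitor.add_reading(value):
        print(f"Day {day}: CRP_LIKE_MARKER = {value}, review patient")
```

A real e-medicine system would of course need clinically validated thresholds, secure data transfer and human oversight; the sketch only shows that the data-handling side is not the hard part.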

Infinite combinations?

However, there is a problem. If all individuals are different (with unique molecular make-up) and if all diseases are different (not only in their complex molecular profiles, but also in the way their molecules interact with the molecules of the individual patient) how on earth do we harness all that data and turn it to diagnostic or therapeutic advantage in real time?

Whilst one day we might have the technology to allow this level of complexity to be assimilated and used, today we don’t. But nature comes to our aid. Molecules don’t work independently, but rather as a hierarchy.

Rheumatoid arthritis illustrates this point well. At the molecular level, rheumatoid arthritis is one of the most complicated diseases it is possible to imagine. In the 1980s a video was made of a woman with rheumatoid disease being treated with an antibody to a naturally occurring molecule called TNF-α. The effect was astonishing. Following treatment she was able to walk, when before she could only hobble. The experts behind the discovery had recognised that although the molecular profile of the disease was complex, and differed from patient to patient, TNF-α was a master regulator of the molecular events occurring in the joints of their patients. Turn it off and all the disease processes would be damped down. TNF-α therefore became a therapeutic target. All that was needed was the bullet. This came in the form of a “biologic”, a molecule, an antibody, against TNF-α.

Molecular biomarkers

Recognising that there is a hierarchy of molecules involved in disease means that the expression of certain specific molecules (or small groups of molecules) can be used diagnostically. Furthermore, changes in their expression can enable the success, or otherwise, of treatment to be monitored. These molecules, or groups of molecules, are called “molecular biomarkers”.

Identifying molecular biomarkers that assist diagnosis (diagnostic biomarkers), help monitor effects of treatment (theranostic biomarkers), predict outcomes (prognostic biomarkers) or could become, like TNF-α, a target for treatment (therapeutic biomarkers) is taking off as a biomedical discipline. As new biomarkers are discovered (for example in cancers, inflammatory diseases, and in healing or non-healing wounds), there is a drive for them to enter clinical use.

Manchester is at the forefront of the endeavour to bring novel molecular biomarkers to the clinic. Manchester’s lead has been made possible, not only by the superb biomedical, engineering and clinical science across its Universities and Healthcare Trusts, but also by local politicians who have embraced the devolution of the health and social care budget from central government. Their vision for a healthcare system driven by technological advance and tailored for the local population is unique and inspirational. The ‘COPD Salford Lung Study’, a collaboration between public and private sector organisations, which used near real-time data, is just one example of the pioneering health analytics work taking place in Manchester.

What, then, will this “molecular revolution” in medicine mean to us? Medicine will change; of that there is no doubt.

Monitoring patients in the community will mean earlier detection of disease, earlier initiation of treatment, and a move towards real-time management of disease in, or near, the patient’s home. This will reduce the likelihood of disease going out of control and requiring admission to hospital to change treatment. Fewer hospitals, therefore, will be needed, making health services more cost-efficient.

Reclassification of disease

There will be a reclassification of disease. Already large companies, working in conjunction with the NHS and universities, are moving from a disease taxonomy based on changes at the organ and cellular level to one based on the expression of molecules. Diseases which until recently were regarded as quite disparate (e.g. cancers and joint diseases) are suddenly seen as close relatives in terms of deranged molecular function, with similar therapeutic biomarker profiles. Put another way, there are, for instance, inflammatory and malignant diseases that will respond to the same biologic. This shouldn’t be a surprise: after all, the drug methotrexate is used to treat both cancer and inflammatory arthritis. So all healthcare professionals will need to change the way they think about disease and how they investigate and manage patients.

Our increasing understanding of molecular biomarkers will enable us to define diseases in different ways and completely new diseases will be recognised.

For instance, single disorders will be found to have several subtypes that have different outcomes (prognosis) and will require different treatments, as illustrated by the recent reclassification of Type 2 Diabetes.

The future of health-related professions

Healthcare professionals of the future will be guided as much by the results of molecular biomarker tests as they will by the patient’s history and physical findings. New specialisms and roles will emerge, not only in healthcare delivery itself, but also in the array of scientific, computing and engineering disciplines which are already supporting developments in biomarker discovery, identification, measurement, monitoring and therapeutic targeting.

New branches of medicine will evolve and there will be increasing numbers of doctors working behind the scenes at the molecular level, guiding diagnosis and even directing treatment. Not only that, but machine learning and artificial intelligence will help to hone complex tests to gain the most from the smallest impact, whether that is measured in terms of cost, pain or inconvenience to the patient.

For the doctor sitting with the patient in front of them, a key component of the next medical boom will be the creation of a platform to combine patient data (the “omics”, the biologics and monitoring) with the patient record. This will allow the doctor to combine the best “scientific” precision medicine with the “art” of medicine, using their skills of empathy and communication.

This report is based on a paper written by Tony Freemont, which formed the basis for his talk to the Hebden Bridge Literary & Scientific Society.

With thanks to Tony Freemont for providing this paper and to Ingrid Marshall for adapting it to reflect what was said on the night whilst retaining bonus explanatory material from the paper itself.

This is a copy of the Prof_Tony_Freemont presentation

 Dr Tom Mills

The BBC: Public Servant or State Broadcaster?

Our warm shared memories of favourite children’s television colour our attitude to the BBC – the well-meaning Auntie of the nation, provider of our popular cultural heritage. However, Dr Tom Mills, from the Centre for Critical Inquiry into Society and Culture at Aston University, speaking in Hebden Bridge, argued that we are buying into a myth of a disinterested public service.

His analysis of the way the BBC functions focused on the political stance of the organisation, and hence on its journalism rather than its arts or entertainment output. Evidence shows that while the broadcasting output includes different people with contesting ideas, the organisation as a whole does have a bias towards the prevailing ideologies of an ‘elite’.

A quick look at the history of the BBC provided him with some evidence for this institutional bias. The BBC was originally founded by radio manufacturers to provide content that would sell their products, but it soon became obvious that radio broadcasts could influence public opinion. The shaper of the BBC ethos, Director General Lord Reith, explicitly wanted it to be part of the establishment. After the defeat of the 1926 General Strike, for example, Reith broadcast a recital of Jerusalem, thanking God for saving the country. A more liberal Director General, Hugh Greene, presided over such anti-establishment programmes as ‘That Was the Week That Was’ but was also keen to monitor the activities of left-wing producers. Mills argued that there was inevitably a tension between politically appointed leaders of the Corporation and programme makers.

One of the biggest changes to the BBC came with the ‘neoliberalism’ of the Thatcher years, when, after the ‘brutal removal’ of Alasdair Milne, John Birt took over as Director General and set about embedding competition and an internal market within the structure of the BBC. This was extended under Greg Dyke. One of the effects of this, Mills said, was that long-established journalists who had covered industrial relations were replaced by journalists who reported on business, and a conscious pro-business consensus was institutionalised in BBC reporting. So coverage of the 2008 financial crisis was defined by the views of a political elite.

Ten years later there is more pressure to open the BBC to competition and make it an arm’s-length provider of services which are commissioned from the private sector. To call the BBC a ‘state broadcaster’ implies a direct political control which is not the case – individual journalists at the BBC are independent, but they are also drawn from an elite social stratum, with 54% privately educated and 45% holding Oxbridge degrees. This maintains a gulf between those making programmes and selecting news stories and the consumers and ultimate owners of the Corporation.

Mills believes that the links that exist between the BBC and government are invidious, with regular Charter renewal, the setting of the licence fee and the direct political appointment of the Director General undermining the independence of the Corporation and keeping it in a precarious position. We are at a turning point when the BBC TV of our fond memories is dead, with on-line programmes allowing viewers to curate their own individual experience and social media providing news. This offers an opportunity for the kind of democratisation of the BBC that challenges the establishment bias. Mills strongly believes in the BBC as a public service broadcaster, and thinks the challenge has to come from those who support it, and not just from those who want it to fail. Questions from the audience in the packed Waterfront Hall suggested that his talk might have begun that process.

With thanks to Sheila Graham for this report

 

 

Judith Weir: A Composer’s Life

Judith Weir is one of the country’s leading composers and was appointed to the ancient office of Master of the Queen’s Music in 2014. Part of her role is to bring a greater awareness of music to the wider public, and her talk for Hebden Bridge Literary and Scientific Society made clear her conviction about the importance of music as she opened a window onto a composer’s life.

She began by considering ‘composer’ as a job title, looking at the woman considered the first named composer, Hildegard of Bingen, and at how for many famous composers the arrangement and creation of music was often an intrinsic part of their main job as church or court musicians. Bach, for example, was employed by churches and many of his compositions were written for them. Mozart and his family were touring musicians who also relied on wealthy patrons, as did Beethoven, though he was very much composing on his own terms. Women composers such as Clara Schumann and Anna Magdalena Bach were also performers whose family background gave them space to write music.

Most composers have relied on teaching to make a living, with Elgar, Vaughan Williams and Holst all being teachers. Arts Council grants in the late twentieth century made life easier for some composers, and institutions such as the BBC also commissioned new music. Now composition is seen as an academic discipline in its own right and taught up to PhD level in colleges.

For Judith Weir learning to be a composer requires a great deal of listening, which she felt modern technology made easier, and also playing an instrument, in order to learn how everything works. Time spent having fun making music with friends is as crucial as an academic approach. Most important was mindset – time to be alone and a driving need to compose music.

Her own route to becoming a composer grew from circumstance. She was brought up in Aberdeenshire, where entertainment in her family often involved groups of friends playing folk music together, so that music was always a social thing. She began writing for her own group of friends, initially as a way of avoiding being sent out to play on wet or cold days at school! The accident of having a teacher who knew John Tavener meant that she was able to meet and talk to him about the music she was composing, and that contact stiffened her resolve to make composing part of her life.

Making her living as a composer has continued to involve teaching and collaborating with others, as a community composer funded by the Arts Council, as part of the Glasgow City of Culture and as resident composer for the City of Birmingham Symphony Orchestra with Simon Rattle. It was clear that these collaborations were part of the creative process as well as a way of getting her music performed. She felt that taking commissions was in no way constraining, but involved a stimulating meeting of minds.

It was fascinating to see some of the physical evidence of her composing process in a current piece for oboe. The first ideas are sketched in pencil, and often contain as many words as musical notation. This stage can take months, as gradually the whole composition becomes more defined – though always with Tippex on hand for changes. Even when she is playing sections at the piano the eraser is close by. Eventually there is enough certainty to move to an ink copy, which her long-standing editor will then work with until a computer program produces the different parts for the orchestra. At rehearsals there is a chance to refine further, with the precious advice of the musicians.

Working with others is absolutely part of the creative process, but Judith also stressed the need for time and space alone, allowing ideas to emerge and develop. While skills can be taught, there is something at the heart of a composer’s life, the need to create music, which is outside the scope of teaching.

The audience at Hebden Bridge Waterfront Hall felt privileged to be allowed these insights into a composer’s life and to hear parts of two of Judith Weir’s compositions, a choral piece Maria Regina Celorum and a lovely violin duet Atlantic Drift, originally written for children.

With thanks to Sheila Graham for this report

Before the Big Bang

Professor Jeff Forshaw

6th October 2017

Jeff Forshaw is a professor of theoretical physics at the University of Manchester, specialising in the phenomenology of particle physics. He is passionate about communicating the excitement of the pursuit of science, earning the Institute of Physics Kelvin Prize (2013), whose previous winners include Brian Cox and Jim Al-Khalili, for “…outstanding contributions to the public understanding of physics”.

Jeff’s involvement in cosmology, apart from its need for particle physics expertise, was sparked, he revealed, by agreeing to give a course to 4th year students in Manchester. He began, though, in a way that could have been entitled “How do we know that?”, by walking us through some examples of how we corroborate our ideas by a process of prediction and measurement: how big is the Earth? How old is the Atlantic Ocean? And the Earth? And the sun? until some of them become “nailed-on facts”, as he put it.

In that category is the Big Bang. The “archaeological evidence” from many independent lines of prediction and observation leads us to think, with an overwhelming degree of certainty, that the observable universe we see about us was once a very hot dense gas of particles in a volume no bigger than the Waterfront Hall. Already, at that time, it had slight fluctuations of density whose slightly denser portions gradually collapsed under gravity to form galaxies in the filaments and voids observed today. It expanded into the present state of affairs in a manner that is entirely predictable and understandable in terms of the properties of particles and gravity that we know about from our laboratory experiments. That hot gas and its subsequent expansion he defined as the Big Bang. It is also called the Standard Cosmological Model. What happened before that he would come to later.

The clinching evidence was a “photograph”, as he put it, of the last vestiges of free charged particles (plasma) 380,000 years after the Big Bang, the radiation (photons) from which forms the Cosmic Microwave Background. 90% of the photons from that era have travelled through the mostly empty space of our universe for nearly 14 billion years without collision. They have cooled from a few thousand degrees Celsius to about -270 C (2.725 degrees above absolute zero, to be more accurate) on average, due to the intervening expansion of the universe, which stretches their wavelengths. From the “colour”, i.e. the energy, of the photons one can infer the temperature of the original plasma; it varies across the sky by minute amounts in a strikingly beautiful mottled fashion, whose structure is spectacularly accurately predicted by the Model using measurements and properties from completely independent observations. It could so easily have been otherwise. Jeff himself had written a computer program for the 4th year students to make that prediction—simple, he said, compared to predicting the outcome of kicking a bucket of water, for example.
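As a brief worked aside (my addition, using standard textbook figures rather than numbers quoted in the talk), the cooling Jeff described follows directly from the stretching of wavelengths by the expansion:

```latex
% Standard relation between radiation temperature and redshift (not from the talk itself).
% Wavelengths stretch in proportion to the expansion, so the temperature falls accordingly:
\[
  T(z) = T_0\,(1+z), \qquad T_0 \approx 2.725\ \mathrm{K}.
\]
% The CMB photons were released at a redshift of roughly z \approx 1100, so the plasma was at
\[
  T \approx 2.725\ \mathrm{K} \times (1 + 1100) \approx 3000\ \mathrm{K},
\]
% i.e. a few thousand degrees, which has since cooled to 2.725 K (about -270 C) today.
```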

So how did the Big Bang, that hot dense gas the size of the Waterfront Hall with slight density perturbations, come to be? Here, he admitted, we were moving into the realm of speculation, but ideas that were seen as highly speculative 5-10 years ago are, through the process of working things out bit by bit, predicting and comparing, becoming increasingly accepted as feasible. He postulated the existence of some “cosmic treacle” called the inflaton field. There exists a present-day analogue that surrounds us all the time, the Higgs field, which is responsible for generating mass, predicted 50 years ago by Peter Higgs (and others) and whose excitations were recently observed at the Large Hadron Collider at CERN, Geneva. A special property of the inflaton field is that its energy drives a huge inflation of the universe and guarantees certain properties of the present universe, for example “flatness”, that are difficult to explain otherwise. The Waterfront-Hall-sized region of hot dense gas, which eventually became our observable universe, may well have been a region no bigger than a billionth of the size of a proton. Some quanta of the field (inflatons), after a period of huge inflation, decay into “ordinary matter”—and the rest is history, as they say. Jeff showed us a computer simulation he had written of the fluctuating field expanding and decaying into particles, then subsequently condensing into galaxies. The inflaton field is subject to quantum fluctuations, which give rise to just the right size of density fluctuations at the start of the Big Bang.

Finally he speculated that inflation is a continuous process and that our universe is just one of many, possibly infinitely many, universes with the same—or possibly different—physical properties. This is the multiverse conjecture. It seems perhaps that the Big Bang and the Steady State Model may be companion theories after all.

Much of this is in his new book, “Universal: A Guide to the Cosmos”, co-authored with Brian Cox.

The audience gave Jeff spontaneous applause for his passionate and entertaining presentation. He fielded a few questions and afterwards was surrounded by inquirers until we dragged him and his partner away for dinner.

With thanks to John Allison for this report


Philosophy, Democracy and the Demagogue

Or how Plato can help us deal with Donald Trump:

Professor Angie Hobbs

24th March 2017

Angie Hobbs is the first academic to be appointed Professor of the Public Understanding of Philosophy in the UK (if not the world). With demagoguery on the rise, Professor Hobbs believes it is imperative to keep in mind the wisdom of ancient Greek philosophers, such as Plato, who themselves lived in tumultuous times.

After defining ‘democracy’ (rule by the people) and ‘demagoguery’ (leading of the people, often manipulatively), Professor Hobbs explored why she regards the ‘will of the people’ as such a misleading term.

Who are ‘the people’ exactly? Everyone in a state or group? Everyone who can vote? Everyone who did vote? Or only those who voted for the winning side? It is dangerous to equate some or all of the electorate with ‘the people’ as a whole since this implies that those who did not or could not vote, or those who voted for the losing side, are not part of the people.

‘The majority’ is another term which warrants further scrutiny, since it is not as fixed as it may at first appear. Following an election, individual voters may have a change of heart. Also, the composition of the electorate is in a state of flux: new voters come of age and others die, for instance. Such changes could produce a different majority outcome.

Clearly it would be impractical to have constant elections to take account of possible shifting majorities. Nevertheless, opined Hobbs, it is important to recognise that a democratically elected government may often not be supported by the majority of even the electorate – let alone the populace – for most or indeed all of its duration. Which raises the question: what does ‘rule’ by the electorate really mean?

Is ‘rule’ no more than a ballot on a particular day? On the contrary, urged Hobbs, the intellectual foundations of liberal democracy recognise that each individual person has the right to a voice and to be heard. A ballot should, therefore, be seen as playing an important role in an on-going conversation in which citizens – regardless of whom they voted for or even if they are of electoral age – can take part. Thus it is more accurate to refer to the ‘wills of persons’, than the ‘will of the people’.

True ‘rule’, Plato controversially claimed, can only take place if each individual ‘ruler’ (or voter) acts on well-informed, deliberative rational choice. Those who don’t are at risk of being manipulated by an opportunistic demagogue who, claiming that only they truly understand the ‘will of the people’, is elected to power and then proceeds to subvert democracy into tyranny. Plato depicted just such an alarming chain of events in Republic 8 (562a-569c). It makes for a sobering read.

In terms of Plato’s analysis the United States is at a turning point, said Hobbs. Now is the time to heed Plato’s warning and stand up for liberal democracy in every peaceful way imaginable.

Thanks to Ingrid Marshall for this report.

Inequality and Social Anxiety

Kate Pickett and Richard Wilkinson

5th February 2017

Kate Pickett and Richard Wilkinson are Professors of Epidemiology who have collaborated for several years to investigate data relating to inequality. Their 2009 book The Spirit Level: Why Equality is Better for Everyone, which explored the effects of greater inequality within societies, became a surprise success and initiated a debate that has grown more pertinent as nations have struggled to recover from the global financial crisis. Their talk to Hebden Bridge Lit & Sci expanded on their research.

There was a time when it was felt that eliminating absolute poverty was key to improving society, but the overwhelming evidence is that it is the extent of inequality, the size of the difference between the incomes of the richest and the rest, that is at the root of failing societies. Our relative place in society gives us our sense of importance and status, and if we are low down in the hierarchy we are subject to more anxiety, stress and a sense of shame. Our sense of our own worth can affect cognitive ability – test participants who had to reveal their low caste status performed significantly worse than when this information was not made public. And the physiological effects of social anxiety have also been measured in non-human primates, where the animals lower in status had furred-up arteries and all the consequent health problems.

But it is the effects of inequality on nations that have been the main focus of the work of Kate Pickett and Richard Wilkinson. These have been measured in various ways, and the overwhelming evidence is that more unequal countries have worse outcomes across a range of areas. For example, the well-being of children is greater in societies which are more equal; there is more violent crime, and there are higher imprisonment rates, in countries which are more unequal; life expectancy is better in more equal countries; and drug and alcohol problems are greater in those which are more unequal. Inequality was described as a social pollution whose effects none of us can escape. A steeper hierarchical pyramid means a less cohesive society, where the stress effects of social anxiety can lead to a lack of trust in others and a decline in communal life, such as volunteering and caring. A society of barred gates, guards and razor wire is not a happy one.

It might seem obvious that it is in everyone’s interest to reduce these huge disparities so that we can all live in a better society, but it is proving a difficult task. Redistribution of wealth through taxation is one method, but one government’s decisions can easily be overturned by the next. What seems to be missing is what Professor Wilkinson described as a disciplining force against the prevailing direction of inequality – he felt this might come from a mass movement, and perhaps from more workers on the boards controlling remuneration, as happens in Germany. There is also the possibility of embedding measures of success other than growth in GDP – we could look to other values, such as the Sustainable Development Goals, which have been internationally agreed and can be monitored.

And on an individual level, we can seek to avoid the pressures of status consumerism by cultivating friendship – proved to be one of the biggest health benefits.

With thanks to Sheila Graham for this report