Graham (and ChatGPT) write … AI (Artificial Intelligence) has been in the news quite a bit recently, mostly it seems because of the perceived threat it poses to human existence. It can certainly be argued that AI, like any powerful technology, has the potential to be both beneficial and dangerous, depending on how it is developed and used. However, setting these issues aside, I thought it would be interesting to ‘interact’ with one of the many AI applications that are currently available to get a feel for how convincing it is at emulating human responses.

British mathematician and computer scientist Alan Turing proposed a test in the 1950s to assess a machine's ability to exhibit intelligent behaviour indistinguishable from that of a human. The ‘Turing Test’ has also been the focus of relatively recent cinematic offerings such as ‘The Imitation Game’ (2014) and ‘Ex Machina’ (2015). It was first introduced (‘Computing Machinery and Intelligence’, Mind, Vol. 59, pp. 433-460, 1950) as a way of considering the question ‘Can machines think?’. The setup for the classic Turing Test is pretty straightforward, involving three participants: a human interrogator, a human respondent and a machine respondent. All interactions occur via text, ensuring that physical appearance or voice do not influence the interrogator. The objective is for the interrogator to determine which respondent is the human and which is the machine. The machine passes the test if the interrogator cannot reliably distinguish it from the human, based on its responses.

It’s fair to say that, while the Turing Test is ‘iconic’, it nevertheless rests on a subjective judgement made by the interrogator. Another issue is that the test doesn’t really assess the machine’s ability to understand or reason – Turing’s original goal – but rather evaluates how well a machine mimics human communication.

Bearing in mind these limitations, I initiated a conversation with one of the online AI applications – ChatGPT – which can be accessed without cost at https://chatgpt.com. Obviously, this post doesn’t represent a classic Turing Test as there is no human respondent, but it’s interesting to see how well the AI app’s output emulates human-like responses. As mentioned above, this judgement is entirely subjective – some of you may be impressed, others not so much. I opened the ‘conversation’ with the following question …

Graham said: If God is loving and omnipotent, why is there so much suffering in the world?

ChatGPT said: The problem of suffering is one of the most profound and difficult questions in philosophy, theology, and human experience. It has been the subject of debate for centuries and is often framed as ‘the problem of evil’. Here’s an overview of how different perspectives – religious, philosophical, and humanistic – address this question:

1. The Free Will Defence
Key Idea: Human suffering is often the result of human free will.
Key Idea: Suffering is necessary for moral and spiritual growth.
Key Idea: Humans cannot fully understand God's purposes.
Key Idea: Suffering arises from living in a world governed by natural laws.
Key Idea: Suffering is the result of sin and separation from God.
Key Idea: Suffering is a natural part of existence in an indifferent universe.
Key Idea: God participates in human suffering.
Final Reflection: No single answer fully resolves the problem of suffering, and it remains a deeply personal and existential question. Different perspectives offer various ways to find meaning, hope, and action in the face of suffering. Many people draw on a combination of philosophical reasoning, religious faith, and human compassion to navigate this complex reality. How do you personally approach or reconcile this issue?

Graham said: I believe that human free will is responsible for most of the suffering in the world ...

ChatGPT said: That’s a thoughtful perspective. Believing that free will is the root of most suffering highlights the importance of human responsibility and moral choices. From this viewpoint, much of the world’s pain could be alleviated by better decisions—individually and collectively. How do you think free will interacts with situations beyond human control, like natural disasters or diseases? Do you see those as part of a broader context, or are they separate from moral responsibility?

Graham said: Earthquakes are actually necessary to support human life on planet Earth – recycling carbon etc. Some diseases such as cancer go hand in hand with the process of evolution. Evolution allows for the diversity of life. I have no answer for many other diseases, which seem to produce helpless suffering.

ChatGPT said: You’ve highlighted a nuanced understanding of how natural processes like earthquakes and evolution play essential roles in sustaining life and fostering biodiversity, even when they result in suffering. It’s a perspective that acknowledges the complexity of life on Earth—where the very mechanisms that support life can also cause pain. The challenge of diseases that seem to cause helpless suffering is indeed difficult. Some might argue that they reflect the randomness of existence, while others see them as part of a mystery yet to be understood. Do you think science and human intervention could eventually reduce or eliminate such suffering, or do you view it as an unavoidable part of life?

Graham said: Perhaps 'pointless suffering' might have been a better phrase, rather than 'helpless suffering' ...

ChatGPT said: That distinction makes sense. "Pointless suffering" emphasizes the difficulty of finding meaning or purpose in certain experiences, which can be even more troubling than feeling helpless. It raises profound questions about why such suffering exists if it doesn't seem to contribute to growth, survival, or a larger purpose. Do you think it's possible for meaning to emerge from what initially appears as pointless suffering, or does it remain inherently without purpose?

Graham said: I guess if you believe in a God, pointless suffering may help in building resilience and character?

ChatGPT said: Yes, many religious and philosophical perspectives suggest that even pointless suffering can contribute to building resilience, character, or spiritual depth. While it may not have an obvious purpose, it can still shape a person’s inner life in meaningful ways. Here are a few ways this idea is framed:

1. Resilience and Strength
Philosophical Tension
Even with these ideas, the tension remains: Why would a loving, all-powerful God allow suffering that seems excessive, random, or pointless? For many, this question is unresolved and leads to a balance between faith, doubt, and human action to reduce suffering where possible. Does this perspective resonate with you, or do you find certain aspects of it challenging?

The ‘conversation’ was terminated at this point, but it could have gone on for quite a while. ChatGPT seemed to want to have the last word … If you got this far in the ‘conversation’, then well done!

So what do you think? I was quite impressed with the AI’s responses, and I believe that the interaction helped me to understand the question of God and human suffering, and its implications, a little more clearly. Obviously, there is no simple answer, and it is definitely a matter of opinion – which is why the conversation could have gone on for some time, eventually ending up going round in big circles! Also, if there had been a human respondent, I don’t think I could have reliably identified who was who from the syntax (sentences and phrases), so I guess the AI passed the Turing Test in this respect. However, I have a feeling that I might have identified the AI from the semantics (concerned with meaning), simply because it did so well in the extent and logical structure of its responses. I think a human respondent would have been more hesitant, and somewhat less logical in framing their ideas. So, at the end of the day you could say that ChatGPT failed the Turing Test because it did too well in addressing the issue. I have to say that I didn’t see that one coming.
This brief experience with an AI application has demonstrated the power and utility of these tools, and how widely they can be applied. I hope to resist the temptation of allowing ChatGPT to write future blogs … as I have partially done with this one! May I take this opportunity to wish our readers a good 2025. Also please take the opportunity to leave a like, a greeting or a comment on this and future posts, so that we know you are there.

Graham Swinerd
Southampton, UK
January 2025
Graham writes … As December 2024 draws to a close, it’s clear that I’m not going to be able to post a blog for this month. Many apologies for that. It’s been a manic time, and not just related to Christmas festivities. I hope to be posting something on what AI ‘thinks’ about God and suffering in early January, so I hope you will come back for that. In the meantime, as the Christmas season draws to a close, I’d like to post one of my favourite pieces of ‘popular’ music, which I hope will bless you as we move from 2024 into the new year – ‘Mary, did you know …?’ performed by Pentatonix. Their performance is spine-tingling, as they bring clarity to the stunning words. If you follow our blog posts throughout the new year, please leave a greeting or a comment so that we know you are there. Wishing all our readers a happy new year!

Graham Swinerd
Southampton, UK
December 2024

John writes … Well, COP 29 has finished. Did it achieve anything? Here are two reactions to the conference. The first is a brief commentary, written by me for our church on-line newsletter, while the conference was still in progress:

Who is my neighbour? I was reading a passage in Chapter 2 of Habakkuk and was stopped in my tracks by the beginning of verse 8: 'Because you have plundered many nations …'. It is set in a passage about injustice, exploitation and institutional theft, but my mind immediately turned to climate change (1). Climate change has been caused by the emissions of carbon dioxide from burning fossil fuels, mostly in the richer industrialised nations in support of lifestyle, industry, commerce, and warfare. However, the majority of the countries most affected are amongst the poorer nations of the world (although I acknowledge the effects of recent extreme weather events in Europe and the USA). As I have written elsewhere (2), if we factor in a country’s ability to deal with climate change along with the severity of the effects, nine of the ten worst-off countries are in Africa. This may not be direct ‘plunder’ but it certainly says something about inequality and global justice. It is no surprise that at COP 29, there have been demands from poorer countries for financial help to deal with climate change – something which I believe we should support in whatever way is possible for us.

The second is a podcast, made by Chatham House (an independent international affairs think tank), towards the end of the conference while negotiations were still in full swing – to listen to the podcast, click on this link: https://www.chathamhouse.org/.../independent-thinking-has...

My final comment, now that the conference has finished, is four-fold.
First, the amount agreed for payment to poorer countries is somewhat of a compromise. Secondly, despite pressure from oil-rich nations, it is clear that we are making progress towards renewable energy sources and away from fossil fuels. Thirdly, however, that progress is not fast enough. We are still heading for a temperature increase of at least 2 degrees, with possibly disastrous effects on tipping points. Fourthly, if you are a praying person, please pray about this.

John Bryant
Topsham, Devon
November 2024

(1) I was pleased to see that a recent commentator on this passage, Revd. Amanda Bloor, reacted in a similar way.
(2) Introduction to Bioethics, 2nd Edition, Wiley, 2019.

John writes ... 'The earth is the Lord’s, and everything in it … for he founded it on the seas' (from Psalm 24). King David equated creation with ownership and went on to consider God’s awesome majesty, holiness and even ‘otherness’. Certainly, when we think about the universe in all its magnificence, awe is one of the emotions that we also feel. But look now at the previous psalm, the very well-known 23rd, in which David thinks of God as knowing him personally, nurturing, caring, providing for his needs – a very different picture from that of awesome majesty but equally valid. In our own reaction to God’s creation, many of us have experienced similar emotions, holding in tension the awesomeness of the creator with the understanding that he cares about us and indeed about the whole of creation. And so I end with this gentle exhortation entitled ‘On Holy Ground’ written by Peter Millar and published by the Iona Community*.

Go gently my friends: feel the good earth beneath your feet, celebrate the rising of the sun, listen to the birds at dawn, walk gently under the silent stars, knowing you are on holy ground held in love – in the wonder of God’s creation.

John Bryant
Topsham, Devon
October 2024

*50 New Prayers from the Iona Community, ed. Neil Paynter, Wild Goose Publications, 2012.

Graham writes ... Despite fears that the Europa Clipper (EC) spacecraft might not survive the radiation environment at Europa, the spacecraft launched on schedule. A SpaceX Falcon Heavy launch vehicle left the famed 39A launch pad (the historic Apollo 11 mission left from here in July 1969) at 12.06 EDT (16.06 GMT) on the 14th of October 2024, just 4 days after the launch window opened on 10 October. However, now we all need to be patient – if all goes well, EC will enter Jupiter orbit in April 2030. Nevertheless, this is a great opportunity to look forward to what it is hoped EC will achieve, at an overall mission cost of around US$5 billion.

Back in May of this year, the launch looked to be in jeopardy (for details see the previous ‘miniblog’ – click on ‘August 2024’ on the RHS of this page) when a problem was discovered concerning the reliability of electronic components already installed in the integrated spacecraft. An issue was suspected with the transistor elements, which are essentially the building blocks of the micro-processors onboard. The Jupiter system, where EC will operate, exposes the spacecraft to intense particle radiation similar to the Earth’s Van Allen radiation belts but 50 times more intense. To survive this, the electronic systems need to be housed in a protective radiation vault engineered with 9 mm thick walls composed of aluminium and titanium. In addition, the electronic components are required to be ‘radiation-hardened’ to achieve the spacecraft’s planned 3.5-year mission lifetime.
Earlier this year, it was found that identical transistors failed in testing earlier than they should have. This posed a real problem for the engineers. Would they have to replace the installed transistors, or would they have to redesign the flight profile to achieve the mission objectives in a shorter time scale? Both of these are major undertakings – either of which would have threatened the readiness of the spacecraft to launch within the bounds of the upcoming launch window.

Following this potentially devastating news, NASA instigated four months of round-the-clock intensive testing at three different facilities – JPL in Pasadena, California, the Johns Hopkins Applied Physics Laboratory in Laurel, Maryland and the NASA Goddard Space Flight Center in Greenbelt, Maryland. After evaluating spare components from the same batches that were installed on EC, the test engineers found that the spacecraft’s systems would, after all, perform as required. This conclusion partially rested on the fact that during the first half of its mission lifetime, the spacecraft will be in the most intense part of Jupiter’s radiation environment only one day out of every 21.

To get an idea of how this works, the diagram above shows a ‘plan view’ of the Jovian system with Jupiter at the centre and the orbits of the major (Galilean) moons shown in black. A representative elliptical EC orbit is shown in blue and the radiation hazard is shown roughly by the red, gold and beige areas, red being the most intense. As can be seen, in order to execute flybys of Europa the spacecraft must dip into the red zone, but there is a mitigating factor. Due to the nature of elliptical orbits, the spacecraft’s speed at perijove (the point of closest approach to Jupiter) is very much higher than its speed at apojove (the point of furthest distance), so EC will spend most of its time outside the damaging radiation environment (a rough calculation below illustrates the numbers). Also, during this time the components can be partially restored from radiation damage by gently heating them (a process called ‘annealing’).

So why is Europa, a body about the size of Earth’s moon, the subject of such intense interest? As I said in my previous blog post, it has long been thought to be a place in the Solar System where conditions may be suitable for life to develop. Looking at Europa – a distant and cold, ice-covered world – it doesn’t look at all like an environment where life could flourish. However, in this case, appearances are deceptive. There is strong evidence that beneath the ice crust there is a warm water ocean, the heat being generated most likely by volcanic vents on Europa’s ocean bed. It is currently believed that the depth of this ocean is up to 100 km, so that Europa may have twice the volume of water compared to terrestrial oceans!

So, how does this vulcanism work? The key to understanding lies with the massive gravitational pull Jupiter exerts on its moons. As Europa orbits the gas giant, tidal forces cause the icy moon’s interior to flex, which generates thermal energy within the moon’s rocky core, increasing the likelihood of volcanoes on the ocean bed. Other Galilean moons, Io in particular, exhibit intense volcanic activity on their surfaces driven by the same mechanism. Suspicions that there may be an ocean beneath Europa’s icy surface were first raised by imagery acquired by NASA’s Galileo spacecraft, which orbited Jupiter from 1995 to 2003.
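To put some illustrative numbers on the perijove/apojove speed difference, here is a minimal sketch using the vis-viva equation. The figures assumed (a 21-day orbit with perijove near Europa’s orbital radius) are representative guesses chosen for illustration, not actual mission design values.

```python
# A minimal sketch (not mission data): how much faster a spacecraft moves at
# perijove than at apojove on a representative 21-day elliptical orbit about
# Jupiter, using the vis-viva equation  v = sqrt(mu * (2/r - 1/a)).
import math

mu = 1.2669e17           # Jupiter's gravitational parameter GM, m^3/s^2
T = 21 * 86400.0         # assumed orbital period: 21 days, in seconds
a = (mu * T**2 / (4 * math.pi**2)) ** (1.0 / 3.0)   # semi-major axis via Kepler's third law

r_peri = 6.71e8          # assumed perijove radius, ~Europa's orbit (671,000 km), in m
r_apo = 2 * a - r_peri   # corresponding apojove radius, in m

def speed(r):
    """Orbital speed (m/s) at radius r on this ellipse (vis-viva equation)."""
    return math.sqrt(mu * (2.0 / r - 1.0 / a))

print(f"semi-major axis : {a / 1e9:4.2f} million km")
print(f"perijove speed  : {speed(r_peri) / 1000:5.1f} km/s")   # roughly 18 km/s
print(f"apojove speed   : {speed(r_apo) / 1000:5.1f} km/s")    # roughly 3 km/s
```

For these assumed figures the spacecraft moves five or six times faster at perijove than at apojove, so it races through the innermost, most hazardous region and spends the bulk of each 21-day orbit far from Jupiter – which is the essence of the ‘one day out of every 21’ figure. Returning to the evidence for the ocean itself, the Galileo imagery mentioned above provided the first strong hints.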
A good example is the image below of Europa’s icy surface taken at a range of about 200 km during a flyby in November 1997. In this region, the surface ice appears to have melted, broken up and then refrozen, suggesting surface thawing caused by heating from below. Furthermore, recent reanalysis of old Galileo mission data suggests that the Galileo spacecraft may have flown through plumes of water vapour emanating from the moon during flybys.

So, what do we hope to learn from the EC mission? Clearly, there is no way that it can perform an in-situ investigation of potential life signatures at the ocean bed. The objectives must be limited by what can be achieved remotely during the planned 50 flybys of Europa, at ranges as low as 25 km. The overriding motivation for the mission is to do a detailed study of the Europa system to investigate whether the icy moon could harbour conditions suitable for life to exist. To work out the orbital profile of the mission and to determine what payload instruments are required, the mission objectives (in general terms) may be expressed as: characterising the ice shell and confirming the nature of the ocean beneath it; determining the composition of the surface, ocean and thin atmosphere; and characterising the moon’s geology.
Based upon these broad objectives, the engineers were able to determine the characteristics of the spacecraft. In other words, they could decide on the payload instruments needed to achieve the objectives, and what services are required on board to support these instruments. The ‘services’ are usually referred to as subsystems, such as electrical power, communications, data handling, attitude control (required to point the instruments as appropriate), propulsion, structure, and so on. This process led to the design that we see in the final integrated spacecraft, which is now on its way to Europa.

With a dry mass of more than 3.2 tonnes, a height of roughly 5 metres, and a width of more than 30 metres with its solar panels fully unfurled, EC is the largest spacecraft that NASA has ever built for a planetary mission. To handle the large transmission rate of payload data over such large distances, the spacecraft is equipped with a 3 metre diameter communications dish. Most previous NASA deep-space missions have been much more compact due to the use of radioisotope thermoelectric generators (RTGs) to provide electrical power. These devices use the heat from radioactively decaying elements to produce electricity. The use of solar arrays in this instance is a significant change in design philosophy for NASA. Given that Jupiter is 5 times further from the Sun than Earth, the amount of power per unit area is 25 times less (the ‘inverse square law’!). This means that the incident solar power at Jupiter is only about 55 W per square metre, so lots of array surface area is needed to supply the spacecraft’s electrical needs, resulting in the spacecraft’s large size, as seen below (a quick check of this figure is sketched at the end of this section).

A major payload component is an ice-penetrating radar to determine the structure of Europa’s icy crust and to attempt to acquire direct evidence of sub-surface water. The spacecraft is also equipped with two sets of cameras, one operating in the visible part of the spectrum and the other in the infra-red (thermal imaging), to produce a high-resolution map of the moon’s surface and look for potential plumes. The thermal sensor will help pinpoint locations of warmer ice and perhaps recent eruptions of water. EC also carries spectrometers, again operating in different parts of the spectrum, to determine the composition of the surface and atmosphere. Spectrometers measure the intensity of reflected light from Europa across a band of frequencies (or colours, if the instrument is operating in the visible part of the spectrum). Particular surface elements or compounds will absorb light at particular frequencies (colours), leaving dark lines in the surface spectrum and revealing their presence. In addition, a magnetometer will measure the moon’s magnetic field, and perturbations to the vehicle’s trajectory during close flybys will provide information about its gravity field, both of which will offer clues about Europa’s internal structure.

Stepping back and looking out into the Solar System in general, there is a growing trend with current and proposed missions to give the search for life a high priority. When I was growing up and learning about our planetary system this was not the case. In those times, it was firmly believed that Earth was the only seat of life in the Solar System, and the remainder of the planets were considered to be sterile wastelands (both hot and cold). This was entirely understandable given that at that time everything we knew about our planetary system was acquired remotely through terrestrial telescopes.
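As an aside, the figure of about 55 W per square metre quoted above is easy to check with the inverse-square law. The short sketch below assumes the standard solar constant at Earth of roughly 1361 W per square metre, and shows both the round ‘5 times further’ figure and Jupiter’s actual mean distance of about 5.2 AU.

```python
# Rough check of the solar power available at Jupiter (inverse-square law).
# Assumes the standard solar constant at Earth of ~1361 W/m^2.
SOLAR_CONSTANT_1AU = 1361.0   # W/m^2 at 1 astronomical unit (Earth's distance)

def solar_flux(distance_au):
    """Solar power per unit area (W/m^2) at the given distance from the Sun in AU."""
    return SOLAR_CONSTANT_1AU / distance_au**2

print(f"at 5.0 AU : {solar_flux(5.0):5.1f} W/m^2")   # ~54 W/m^2, the round '5 times further' figure
print(f"at 5.2 AU : {solar_flux(5.2):5.1f} W/m^2")   # ~50 W/m^2 at Jupiter's actual mean distance
```

With only around 50 W falling on each square metre, and solar cells converting just a fraction of that into electricity, tens of square metres of array area are needed to keep the spacecraft supplied – hence the 30-metre ‘wingspan’.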
However, with the advent of the space age, an armada of spacecraft has ventured out into the Solar System, visiting each of the major planets. The recent emphasis on the search for life is epitomised by our current investigations of Mars. It is now recognised that in its early history Mars was a water world, and that life may have developed there. Current rovers are equipped with instruments to detect possible bacterial life that may still dwell in the rock beneath their wheels. However, Mars is not the ideal laboratory to undertake this investigation. It is not often acknowledged that Mars is not quarantined from Earth, so that any life found there may have originated here (or indeed vice versa). To understand how this is possible, we have to go back maybe 3 or 4 billion years when life was thought to be stirring on our world. Alongside this, the early Solar System was a very hazardous place with lots of debris left over from its formation. Large impact events on Earth occurred often, throwing rocks off the planet and into solar orbit. It is possible that ‘earthling bacteria’ could have hitched a ride on this ejecta, to ultimately arrive on the Martian surface. Interestingly, the recent discovery of underground reservoirs of water on Mars (see August 2024 blog – click on the date on the RHS of this page) may reshape the way we think about Martian life. This is why I find missions like Europa Clipper so exciting. If we find life there, it is very likely that it has nothing at all to do with life on planet Earth. The postulated Europan ocean is isolated by a crust of ice which is thought to be somewhere between 3 and 30 km thick. Though no sunlight can penetrate that shell to power life, there is a good theoretical basis to believe that vents that release heat from the moon's interior exist on its ocean floor. Indeed, similar vents in Earth’s deep oceans teem with life. There is a school of thought that advocates these terrestrial vents could have been where life originated on our planet. If, ultimately, we do find life on Europa we have no idea what form it will take. Will it be based upon a DNA-type organisation, but with a different code? Will we even recognise it as life at all? The discovery of non-terrestrial life is a whole new ball game that we have yet to play!
In passing, it is worth mentioning that there is another prime suspect in our Solar System where life may reside. A moon of Saturn called Enceladus is a body which exhibits a lot of similarities with Europa. Here NASA’s Cassini Saturn orbiter mission (2004-2017) identified water plumes emanating from an ice crust, below which is believed to lie an ocean warmed by tidal vulcanism. However, this is another story! Now that I have acquired the grand old age of 74, I’m not sure that I will be around to see the outcome of this mission! However, I do hope so – it could very well be a profound milestone in our understanding of whether extra-terrestrial life is extremely rare (and precious) or very abundant throughout the Universe. Graham Swinerd Southampton, UK October 2024 John writes ... Some personal history. One sunny morning in May 2011 I went to the local teaching hospital to receive a diagnosis based on a biopsy that had been taken a few days earlier. The result was very clear: the pigmented patch on the shin of my left leg was, as suspected by me and by the dermatology consultant, a melanoma, which is the most aggressive type of skin cancer (1). It was a very early-stage melanoma and was unlikely to have spread. The nurse-practitioner who conveyed the news indicated that the melanoma would be removed by surgery followed by a skin-graft. I had that treatment a few weeks later and having chosen to receive only local anaesthetic, was able to watch with interest the whole procedure. I would also have regular checks for a year to see whether or not the cancer had spread – which fortunately it never did. During the conversation about the diagnosis and projected treatment, the nurse-practitioner also explained that, at the time, there were no effective chemotherapies for melanoma (although some had been tried) and so, if the cancer had spread, surgery would again be the treatment. However, during the 15-minute drive home, I listened to a radio programme that presented recent advances in medical science. The first item that came up was the approval by the NHS of an immuno-therapy for treating advanced melanoma. In other words, the treatment involved stimulation of the body’s immune system to specifically target the melanoma cells. There followed testimonies from patients whose advanced melanomas had been treated very successfully by the new therapy. Indeed, one woman said that secondary tumours in her liver had been destroyed so quickly that for a while, the liver had holes and gaps in it, as the body’s regeneration mechanisms could not keep up with the rate of destruction of cancer cells. As I listened to this during my brief car journey my mood was significantly elevated. I realised that, if by chance the medics are wrong and my melanoma does spread, there is now a treatment for it. The immuno-therapy was widely taken up by doctors treating metastatic melanoma (i.e. melanoma which had spread to other parts of the body) and among its recipients was former USA president, Jimmy Carter who was treated very successfully at the age of 90! The treatment spreads. The announcement of an effective immuno-therapy for melanoma immediately led to the idea that similar therapies could be introduced for other cancers. It would require the identification for each cancer of a specific antigen for that cancer, for example, a protein on the surface of the cancer cells. The patient’s immune system would then be stimulated to raise antibodies to target the cancer cells, as had been done for melanoma. 
Immuno-therapeutic treatments have now been developed for a wide range of different cancers or cancer sub-types whilst, at the same time, several different versions of the original treatment have been developed, depending on which protein(s) is/are targeted (2). More recently, immuno-therapy has been combined with genome editing and genetic modification in two slightly different ways in order to cure very difficult cases of childhood leukaemia (3).

Back to the 1960s. In targeting antigens, whether in a cancer cell or a pathogen, we are targeting proteins, albeit that some of them may be modified, for example by the addition of carbohydrate groups to make glycoproteins. It is therefore helpful to remind ourselves of a couple of features of the synthesis of proteins. The structure of proteins is encoded in DNA with, for the most part, one protein being encoded in one gene. But the genes are separated from the sites in the cell where the proteins are made, so how does the code in a gene direct the synthesis of a protein? Several scientists, including Sydney Brenner at Cambridge, predicted that genes were copied into ‘messenger’ molecules that took the code to the sites of protein synthesis. Two international teams (French/British/American), assembled especially for the job and working in two different labs in the USA, confirmed that this hypothesis was correct. The messenger molecules are made of RNA and are hence known as messenger RNA. In bacteria, the messenger RNA (mRNA) molecules are relatively short-lived, as described in the papers published by the two teams (4), but in other organisms, the longevity of mRNA molecules varies between genes.

Working with messages. It goes without saying that we have come a very long way since those early discoveries. We now have means of identifying the mRNA copied from a specific gene amongst the thousands of mRNAs copied from other genes. This enables us to ascertain when a gene is actively being copied so that we can look at the patterns/timings of expression of particular genes in relation, for example, to particular life events. Thus, in my field, it has been possible to study the expression of different genes by measuring the amount of their particular mRNAs in relation to the phases of the cell division cycle (5). Further, it is now possible to make mRNA molecules corresponding to specific genes. This is not without its difficulties – single-stranded RNA molecules such as mRNA are susceptible to breakdown when existing outside the cellular environment. This is true both of those messages which are longer-lived in the cell and those which are shorter-lived. However, methods of very significantly improving mRNA stability have been developed – an advance which has opened up a huge range of possibilities.

Bringing everything together. Earlier in this article, we saw that many cancers can now be treated by immuno-therapy, inducing the body to make antibodies to particular cancer proteins (antigens), often by injecting the patient with the purified protein. But what if we asked a patient’s body to actually make an ongoing supply of the antigen to achieve a higher level of antibody production? How would we do this? I am sure that many of our readers have already reached the correct answer to that question but, in order to complete my story, I am going to return to the COVID pandemic.
The key feature which I want to emphasise is that several of the most successful vaccines did not involve injection of an antigen (the spike protein) but the mRNA encoding the antigen – so our own cells made the spike protein and our immune system made antibodies against it. Brilliant! The success of the Covid vaccines almost immediately led to questions in the cancer research community – if it works for vaccination, surely mRNA technology could also work for immuno-treatment of cancer. And the question has now been answered, as revealed in an announcement in February this year that clinical trials involving mRNA-driven cancer immuno-therapy had been initiated at Hammersmith Hospital, London (6). This announcement was followed just last month by the news that a lung cancer patient at University College London Hospitals (UCLH) had been started on immuno-therapy based on mRNA (7). Cancer does indeed ‘get the message’. Incidentally, both the announcement from UCLH and the BBC’s coverage of this development talked of a ‘novel cancer vaccine’, based on the similarity of the technique to that used in Covid vaccines. However, as I have emphasised elsewhere (8), this is not actually a vaccine (it is not used to try to prevent cancer) but a treatment for use when someone actually has cancer.
Concluding comments I have told a story here that entwines the applications of research from three different areas of biological/biomedical science, namely cancer cell biology (including cell division), immunology and molecular biology. As a long-term member of the molecular biology community (including work with mRNA) I am delighted that the discoveries made by scientists are being applied for human benefit. As a Christian I believe that science is a gift from God – curiosity about how the world works is embedded in the human species and some of us are privileged to exercise that curiosity in our work. Equally, I believe that it is important to use our knowledge wisely and where possible in fulfilling the commandment ‘Love your neighbour’ in practical ways, exemplified by the work described here. John Bryant Topsham, Devon September 2024 (1) The consultant also told me that long-distance runners were probably at greater risk of getting skin cancer on the legs than the general population. (2) Immunotherapy for Cancer – NCI. (3) ‘Introduction to Bioethics’(2nd Edition), Bryant and la Velle, Wiley/Blackwell (2019), p. 139; Cancer therapy involving genome editing cures another child’s leukaemia – Genomics Education Programme (hee.nhs.uk), Meet Alyssa | Great Ormond Street Hospital Charity (gosh.org). (4) Discovery of messenger RNA in 1961 (Pasteur.fr). (5) For example, Genes encoding two essential DNA replication activation proteins, Cdc6 and Mcm3, exhibit very different patterns of expression in the tobacco BY-2 cell cycle – PubMed (nih.gov). (6) First UK patients receive experimental mRNA therapy for cancer | NIHR. (7) First UK patient receives innovative lung cancer vaccine: University College London Hospitals NHS Foundation Trust (uclh.nhs.uk). (8) Link to author's interview on Youtube: www.youtube.com. Graham writes … The launch window for this mission to visit Jupiter’s moon Europa opens in October, but the engineers at NASA’s Jet Propulsion Laboratory are currently troubleshooting a serious issue with the Europa Clipper spacecraft. The objective of the mission is principally to determine whether Europa is a suitable place for life to develop, and as such it is generating a fair degree of excitement amongst astrobiologists. Looking at Europa – a distant and cold, ice-covered world – it doesn’t look at all like an environment where life could flourish. However, in this case, appearances are deceptive. There is strong evidence that beneath the ice crust there is a warm water ocean, the heat being generated most likely by volcanic vents on Europa’s ocean bed. The problem with the spacecraft lies with the transistor elements, which are essentially the building blocks of the micro-processors onboard. The Jupiter system, where Europa Clipper will operate, exposes the spacecraft to intense radiation similar to the Earth’s Van Allen radiation belts, but 50 times more intense. In order to survive this environment, the spacecraft’s electronics need to be ‘radiation-hardened’ to achieve its planned 4-year mission lifetime. However, the hardness rating for these elements turned out to be incorrect, and the transistors were found to fail before they should in laboratory tests. This poses a real headache for the engineers. 
There are currently two main avenues of investigation: firstly, the obvious route of replacing the transistors, and secondly, assessing how long the existing integrated spacecraft could survive the radiation environment and whether it could achieve its mission in a shorter time scale. The second option would at least allow them to launch the currently integrated spacecraft in October, but with the prospect of a shorter mission at Jupiter. The first option is also possible; however, it would risk missing the 3-week launch window in October. There are, however, later launch opportunities, but not until 2025 and 2026. So, everything is very much ‘up in the air’ at the moment, while the engineers mull over the options. I will most likely write a post on the various aspects of this fascinating mission in October – when hopefully we will know better what’s going on. I hope to ‘see’ you then, and in the meantime please see my main blog post for August below (… on water on Mars). Thereafter, I hand over to my co-author John for the September blog! God bless all.
Graham Swinerd
Southampton, UK
August 2024

Graham writes ... “When I consider your heavens, the work of your fingers, the moon and the stars, which you have set in place, …”: Psalm 8.

As we have commented before on this blog page, Mars has not always been the arid desert that we see today. The confirmation of this view from data acquired by orbiting, imaging spacecraft is overwhelming, with clear evidence of water erosion and features such as river deltas and lakes. See, for example, my blog post in February 2021 (just click on that date in the blog archive list on the right-hand side of this page), concerning the imminent adventures of NASA’s Perseverance rover as it set out to explore what was once a Martian lake bed. The second post in March 2021 looks more generally at the question of life elsewhere in the Universe.

Coming back to Mars, however, we can ask 'where has all the water gone?'. The planet is small – about half the size of the Earth – and the consequence of this is that Mars’ gravity field was not strong enough to retain the atmosphere that it had more than 3 billion years ago when it was a ‘water world’. As the atmosphere slowly leaked away into space, the conditions were set for the surface water to evaporate rapidly (in geological terms).

Recently, a groundbreaking discovery has added a new layer of intrigue to Mars – the presence of liquid water deep beneath its surface. This finding, made possible through the detailed analysis of seismic data from NASA’s InSight lander, marks a significant milestone in our understanding of Mars and its potential to support life. The InSight lander, which touched down on Mars in 2018, was equipped with a seismometer that recorded vibrations from marsquakes over four years. By carefully analysing these seismic waves, scientists were able to detect the presence of liquid water reservoirs located approximately 10 to 20 kilometres beneath the Martian surface – a process that is often used here on planet Earth to detect oil or water deposits underground. This discovery is particularly significant because it provides the first direct evidence of liquid water on Mars, beyond that previously identified frozen in Mars’ ice caps. The amount of water discovered is staggering – enough to uniformly cover the planet’s surface to a depth of more than a kilometre. There is speculation that this underground water was there in Mars’ early history when surface water was plentiful, and that its underground location sustained it as the surface was transformed into an arid landscape.

So, why does all this matter? Well, as the astrobiologists will tell you (or any other biologist come to that …), water is a crucial element for life as we know it. The presence of liquid water on Mars opens up new possibilities for the planet’s habitability. While the surface of Mars is a cold, arid desert, these underground reservoirs could potentially harbour microbial life. Moreover, any such underground life would likely be quarantined from Earth-based life, so providing an uncontaminated environment to try to understand how life began (both on Mars and the Earth). It is also clearly a great resource for future missions with the objectives of exploring and possibly colonising Mars – access to water would be vital for sustaining human life and supporting agricultural activities on the planet. However, before we get carried away with all this, it is obvious that accessing these deep reservoirs poses significant challenges.
The water is buried deep within the Martian crust, making it difficult to reach with current know-how. Future missions will need to take with them advanced drilling technology to tap into these resources. Additionally, the harsh conditions on Mars, including a global average temperature of -50 degrees Celsius, a harsh surface radiation environment (Mars has no protective magnetosphere) and surface dust that is potentially toxic to humans, present further challenges that need to be overcome!
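As a footnote to the point made above about Mars’s gravity being too weak to hold on to its early atmosphere, here is a minimal sketch comparing the escape velocities of Earth and Mars, using standard values for their masses and radii.

```python
# Escape velocity v_esc = sqrt(2*G*M / R): gas molecules whose thermal speeds
# approach v_esc are lost to space far more readily.
import math

G = 6.674e-11                       # gravitational constant, m^3 kg^-1 s^-2
bodies = {
    "Earth": (5.972e24, 6.371e6),   # (mass in kg, mean radius in m)
    "Mars":  (6.417e23, 3.390e6),
}

for name, (mass, radius) in bodies.items():
    v_esc = math.sqrt(2 * G * mass / radius)
    print(f"{name:5s} escape velocity: {v_esc / 1000:4.1f} km/s")
# Earth ~11.2 km/s, Mars ~5.0 km/s: roughly half, which (together with the loss
# of Mars's global magnetic field) is why the Martian atmosphere leaked away.
```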
If you would like to hear more on this, click here to hear the ‘5 Questions on’ podcast: ‘Huge reservoirs of water deep inside Mars’ (7 minutes), with the BBC’s science correspondent Victoria Gill talking with Michael Daventry. Graham Swinerd Southampton, UK August 2024 John writes … The heading of this blog post takes us back to the last words of my previous outing on these pages in which I wrote about the role of cold weather in regulating aspects of plant growth and development. Seeds of most plants growing in cool temperate regions are dormant – unable to germinate – when they are shed from the parent plant. In many species, dormancy is broken by an exposure to cold conditions, as I discussed in more detail in May. As also mentioned in that previous post, this is equally true of leaf and flower buds in biennial and perennial plants: in technical terms, the buds have to undergo a period of vernalisation (you will probably already know the word vernal which refers to things that happen in Spring such as the vernal equinox). Recent work by Prof Caroline Dean and Prof Martin Howard at the John Innes Centre in Norwich has started to unravel the mechanisms involved in vernalisation of flower buds. In autumn, the flower buds are dormant because the flowering process is held in check by the activity of a repressor gene. The activity of the repressor gene is sensitive to cold and so, during the winter, the gene is slowly switched off and eventually the genes that regulate flowering are able to work. OK then, plants have avoided leaf bud-burst or flowering at an inappropriate time but as Spring arrives, what is it that actually stimulates a tree to come into leaf or induces flowering in a biennial or perennial plant? Spring is characterised by several changes in a plant’s environment but the two most important are the increasing daytime temperature and the steady, day-by-day increase in daylength. In respect of temperature, it is clear that it is the major trigger for Spring-flowering plants. It is often said that Spring comes much earlier than it used to (even though, astronomically, the date of the equinox remains unchanged!). That observation was one of the catalysts for my writing these two blog posts and it has now been borne out by the data. In a recent research project carried out at Cambridge, on the effects of climate change, it was shown that in a range of 406 Spring-flowering trees, shrubs and non-woody plants, flowering now occurs a month earlier than it did in the mid-1980s (1). This ties in with my memories of Spring in Cambridge (I hope you’ll excuse a bit of reminiscing): when I was a student, the banks of the Cam were decorated with crocus flowers at the end of the Lent term, before we went home for Easter; when we came back for the summer term, it was daffodils that dominated the same banks. Now, the crocuses flower in the middle of the Lent term and the daffodils are in bloom at the end of that term. I will return to the induction of flowering later but now want to think about trees and shrubs coming into leaf. The situation is nicely illustrated by the old saying about oak (Quercus robur & Quercus petrea) and ash (Fraxinus excelsior): ‘Ash before Oak, we’re in for a soak; Oak before Ash, we’re in for a splash’. In colder, wetter Springs, oak budburst was delayed in comparison to ash and, in thinking about the weather, a cold wet Spring was believed to presage a wet summer (a ‘soak’). The folklore illustrates that budburst in oak is temperature-dependent. But what about ash? 
Its coming into leaf occurs at more or less the same time each year because the main trigger is increasing day-length. Thus, plants (or at least ash trees) have a light detection mechanism that can in some way measure the length of the light period. I need to add that because of climate change, these days, oak is nearly always before ash in respect of leaf production. Going back to the nineteenth century, Darwin’s experiments on the effects of unilateral illumination clearly showed that plants bent towards the light because of differences in growth rate between the illuminated and non-illuminated side (2). This phenomenon is known as phototropism and shows that light can affect plant growth in a way not directly connected with photosynthesis. This added to previously established knowledge that plants grown in the dark or in very deep shade grew tall and spindly (‘etiolated’) and made little or no chlorophyll. Transfer of etiolated plants into the light slowed down the vertical growth rate and also led to the synthesis of chlorophyll, again showing that light can affect plant growth and development. These phenomena, and many others, lead us to think that plants must possess light receptors which are able to transduce the perception – and even quantification – of light into effects on growth. Further, these days we would say the effects on growth indicate effects on the expression of genes that control growth. The role of chlorophyll as a photo-reactive molecule, active in photosynthesis, was well known but the effects I am describing cannot be ascribed to chlorophyll since they can occur in its absence. The first of these non-chlorophyll photo-reactive molecules was discovered at the famous Beltsville Agricultural Research Centre in Beltsville, Maryland, USA where Sterling Hendricks and Harry Borthwick showed that red light was particularly effective in promoting several light-dependent developmental processes and that this promotion was reversed by far-red light. They proposed that plants contained a photo-reversible light-detecting molecule that was responsible for transduction of the perception of light into effects on growth and development. Cynics named this as yet unknown light receptor a ‘pigment of our imagination’ but Hendricks and Borthwick were proved right in 1959, when a pigment which had the predicted properties was identified by Warren Butler and Harold Siegelman. Butler called the pigment phytochrome which simply means plant colour or plant pigment. Over subsequent decades it has become clear that plants possess several subtly different variants of phytochrome, each with a specific role and there is no doubt that these are major regulators of the effects of light on plant growth and development. However, as research progressed, it became apparent that not all the effects of light could be attributed to photoreception in the red/far-red region of the spectrum. There must be others, particularly sensitive to light at the blue end of the spectrum (as Charles Darwin had suggested in the 1880s!). At the time of detailed analysis of the effects of blue light, the receptors were unknown – and hence given the name cryptochrome – hidden colour/pigment. Three cryptochrome proteins were eventually identified in the early 1980s. And there’s more! The overall situation is summarised in the diagram which is taken from a paper by Inyup Paik and Enamul Huq (4). It is clear that plants are able to respond to variations in light quality and intensity right across the spectrum. 
They cannot move away from their light environment but have evolved mechanisms with which to respond to it. That brings me back to flowering. While there is no doubt that many spring-flowering plants are responsive mainly to ambient temperature (as described earlier) and are thus neutral with regard to day-length, there are many plants which have specific day-length requirements. These are typified by summer-flowering plants such as sunflower (Helianthus annuus) and snapdragon (Antirrhinum), which need n days in which daylight is longer than 12 hours (n differs between different species of long-day plants). Similarly, plants that flower in late summer or autumn, such as Chrysanthemum, require n days in which there are fewer than x hours of daylight. Note that x may be above 12 hours, but the key requirement is that days are shortening. So, the next time you are outside and thinking about your light environment (will it be sunny or cloudy?), just stop to ponder about the marvellous light response mechanisms that are happening in the plants all around you.

John Bryant
Topsham, Devon
July 2024

PS: For those who want to read more about plant function and development, this book has been highly recommended!

(1) Ulf Büntgen et al (2022), Plants in the UK flower a month earlier under recent warming
(2) As published in his 1880 book The Power of Movement in Plants.
(3) Yang, X., et al (2009).
(4) Inyup Paik and Enamul Huq (2019), Plant photoreceptors: Multi-functional sensory proteins and their signalling networks.

Graham writes ... Those of you who are regular visitors to this blog page may recall a post (1) in August 2023 when the so-called ‘Crisis in Cosmology’, or more formally what the cosmologists call ‘the Hubble tension’, was introduced and discussed. If you are not, then may I suggest that you have a read of the previous post to get a feel for the nature of the issue raised? Also please note that some sections of the previous post have been repeated here to make a coherent story.

It concerns the value of an important parameter which describes the current rate of expansion of the Universe, called Hubble’s constant, which is usually denoted by H0 (H subscript zero). This is named after Edwin Hubble, the astronomer who first experimentally confirmed that the Universe is expanding. The currently accepted value of H0 is approximately 70 km/sec per Megaparsec. As discussed in the book (2) (pp. 57-59), Hubble discovered that distant galaxies were all moving away from us, and the further away they were the faster they were receding. This is convincing evidence that the Universe is, as a whole, expanding (2) (Figure 3.4). The value of H0 above says that the speed of recession of a distant galaxy increases by 70 km/sec for every Megaparsec it is distant. As explained in (1), a Megaparsec is roughly 3,260,000 light years.

Currently there are two ways to establish the value of H0. The first of these, which is sometimes referred to as the ‘local distance ladder’ (LDL) method, is the most direct and obvious. This is essentially the process of measuring the distances and rates of recession of many galaxies, spread across a large range of distances, to produce a plot of points as shown below. The ‘slope of the plotted line’ gives the required value of H0.

The second method employs a more indirect technique using the measurements of the cosmic microwave background (CMB). As discussed in the book (2) (pp. 60-62) and in the May 2023 blog post, the CMB is a source of radio noise spread uniformly across the sky that was discovered in the 1960s. At that time, it was soon realised that this was the ‘afterglow’ of the Big Bang. Initially this was very high energy, short wavelength radiation in the intense heat of the early Universe, but with the subsequent cosmic expansion, its wavelength has been stretched so that it currently resides in the microwave (radio) part of the electromagnetic spectrum. The most accurate measurements we have of the CMB were acquired by the ESA Planck spacecraft, named in honour of the physicist Max Planck, who was a pioneer in the development of quantum mechanics (as an aside, I couldn’t find a single portrait of Max smiling!). The ‘map’ of the radiation produced by the Planck spacecraft is partially shown below, projected onto a sphere representing the sky. The temperature of the radiation is now very low, about 2.7 K (3), and the variations shown are very small – at the millidegree level (4). The red areas are the slightly warmer, denser regions and the blue slightly cooler. This map is a most treasured collection of cosmological data, as it represents a detailed snapshot of the state of the Universe approximately 380,000 years after the Big Bang, when the cosmos became transparent to the propagation of electromagnetic radiation.
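Before turning to how H0 is extracted from the CMB, here is a toy numerical illustration of the LDL ‘slope of the plotted line’ idea described above. The galaxy distances and velocities are fabricated purely for illustration – real analyses involve the careful calibration of every rung of the distance ladder.

```python
# Toy illustration of the local distance ladder (LDL) idea: Hubble's law says
# recession velocity v = H0 * d, so H0 is the slope of a straight-line fit to
# measured (distance, velocity) pairs.  All data below are made up.
import random

random.seed(1)
H0_TRUE = 70.0                                            # km/sec per Mpc, used to fabricate the data

distances = [random.uniform(20, 400) for _ in range(40)]  # Mpc
velocities = [H0_TRUE * d + random.gauss(0, 500) for d in distances]  # km/sec, with scatter

# Least-squares slope of a line through the origin: H0 = sum(v*d) / sum(d*d)
h0_fit = sum(v * d for v, d in zip(velocities, distances)) / sum(d * d for d in distances)
print(f"fitted H0 = {h0_fit:5.1f} km/sec per Mpc")        # recovers a value close to the 70 used above
```

With H0 of about 70 km/sec per Mpc, a galaxy 100 Mpc away (roughly 326 million light years) recedes at around 7,000 km/sec.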
To estimate the value of H0 based on the CMB data, cosmologists use what they refer to as the Λ-CDM (Lambda-CDM) model of the Universe (5) – this is what I have called ‘the standard model of cosmology’ in the book (2) (pp. 63-67, 71-76). The idea is that of using the CMB data as the initial conditions, noting that the ‘hot’ spots in the CMB data provide the seeds upon which future galaxies will form. The Λ-CDM model is evolved forward using computer simulation to the present epoch. This is done many times while varying various parameters, until the best fit to the Universe we observe today is achieved. This allows us to determine a ‘best fit value’ for H0, which is what we refer to as the CMB value. For those interested in the detail, please go to (1).

The ‘crisis’ referred to above arose because the values of H0 determined by each method do not agree with each other: H0 = 73.0 km/sec per Mpc (LDL), H0 = 67.5 km/sec per Mpc (CMB). Not only that, but the discrepancy is statistically very significant, with no overlap of the estimated error bounds of the two estimates. So how can this mismatch between the two methodologies be resolved? It was soon realised that the implication of this disparity was that either (a) the LDL method for estimating cosmic distances is flawed, or (b) our best model of the Universe (the Λ-CDM model) is wrong. Option (b), on the face of it, sounds like a bit of a disaster, but since the birth of science centuries ago this has been the way that it makes progress. The performance of current theories is compared to what is going on in the real world, and if a theory is found wanting, it is overthrown and a new theory is developed. And in the process, of course, there is the opportunity, in this case, to learn new physics.

Looking at the options, it would seem that the easier route is to check whether we are estimating cosmic distances accurately enough. Fortunately, we have a shiny new spacecraft available, that is, the James Webb Space Telescope (JWST), to help in the task. The LDL method of estimating H0, as described above, looks pretty straightforward, but it is not as easy as it sounds – measuring huge distances to remote objects in the Universe is problematic. The metaphor of a ladder is very apt, as the method of determining cosmological distances involves a number of techniques or ‘rungs’. The lower rungs represent methods to determine distances to relatively close objects, and as you climb the ladder the methods are applicable to determining larger and larger distances. The accuracy of each rung is reliant upon the accuracy of the rungs below, so we have to be sure of the accuracy of each rung as we climb the ladder. For example, the first rung may be parallax (accurate out to distances of 100s of light years), the second rung may be using Cepheid variable stars (2) (p. 58) (good for distances of 10s of millions of light years), and so on. Please see (1) for details.

The majority of these techniques involve something called ‘standard candles’. These are astronomical bodies or events that have a known absolute brightness, such as Cepheid variable stars and Type Ia supernovae (the latter can be used out to distances of billions of light years). The idea is that if you know their actual brightness, and you measure their apparent brightness as seen from Earth, you can estimate their distance.
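The arithmetic behind that last step is the standard ‘distance modulus’ relation, m - M = 5 log10(d / 10 parsecs). A minimal sketch (with made-up magnitudes, not real survey data) shows how a distance follows from an apparent and an absolute magnitude, and why a small calibration error in the absolute magnitude matters – a point picked up below.

```python
# Standard-candle distance from the distance modulus:  m - M = 5*log10(d / 10 pc)
#   =>  d = 10 ** ((m - M + 5) / 5)  parsecs.  Magnitudes below are illustrative only.
def distance_pc(apparent_mag, absolute_mag):
    """Distance in parsecs implied by an apparent and an absolute magnitude."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

m = 19.0                        # illustrative apparent magnitude of a standard candle
M = -19.3                       # illustrative absolute magnitude (roughly a Type Ia supernova)

d1 = distance_pc(m, M)          # distance if the adopted M is correct
d2 = distance_pc(m, M + 0.1)    # distance if the candle is really 0.1 magnitude fainter
print(f"inferred distance : {d1 / 1e6:6.0f} Mpc")
print(f"distance shift    : {100 * (d1 - d2) / d1:4.1f} %")   # about 4.5 %
# Since H0 is inversely proportional to the inferred distances, a shift of this
# size moves H0 by a similar amount, e.g. from ~73 towards ~69 km/sec per Mpc.
```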
It is also interesting to note that a difference of just 0.1 magnitude in the absolute magnitude adopted for a ‘standard candle’ feeds through to a roughly 5% difference in the inferred distances, and hence in the value of H0. In other words, a value of H0 = 73 versus H0 = 69! It would seem the route of investigating the accuracy of estimating cosmic distances is fertile ground for a variety of reasons.

And this is exactly what Wendy Freedman and her team of researchers at the University of Chicago did. However, I should say that the results that now follow are not peer-reviewed, and therefore may change. The story henceforth is based on a 30-minute conference paper presentation at the American Physical Society meeting in April 2024. Interestingly, the title of her paper was “New JWST Results: is the current tension in H0 signalling new physics?”, which suggests that the original intention, at the time of the submission of the paper’s title and abstract, was to focus on option (b) as mentioned above – in other words, looking at the implications of the standard model of the Universe being wrong. But in fact the focus is on (a) – an investigation of the accuracy of measuring distances. I can identify with this – when the conference deadline is so early that you’re not sure yet where your research is going!

So, what did Freedman’s team do and achieve? They used two different ‘standard candles’ to recalibrate the distance ladder, with encouraging results. The first of these is the class of TRGB (Tip of the Red Giant Branch) stars. Without going into all the details, this technique assumes that the brightest red giant stars have the same luminosity and can therefore be used as a ‘standard candle’ to estimate galactic distances. The second class, referred to as JAGB (J-region Asymptotic Giant Branch) stars, comprises carbon-rich stars that have near-constant luminosities in the near-infrared part of the electromagnetic spectrum. Clearly, these are useful as standard candles, and are also good targets for the JWST, which is optimised to operate in the infrared. The team observed Cepheid variable, TRGB and JAGB stars in galaxies near enough for the JWST to be able to distinguish individual stars to determine the distances to these galaxies. Encouragingly, each class of object gave consistent results for the test galaxies. Once a reliable distance to a particular galaxy was found, the team was able to recalibrate the supernova ‘standard candle’ data, which could then be used to re-determine the distances to very distant galaxies. After all that, they were able to recalculate the current expansion rate of the Universe as H0 = 69.1 ± 1.3 km/sec per Mpc. The results of the study are encapsulated in the diagram below, which shows that the new result agrees with the CMB calculation (labelled ‘best model of the Universe result’ in the diagram) within statistical bounds.

So, is that the end of the story? Well, as regards this study, it is yet to be peer-reviewed, so things could change. Another aspect is that the apparent success here may encourage other groups to look back at their (predominantly Hubble Telescope) data to recalibrate their previous estimates of galactic distances. So, I think this has a long way to run yet, but for now the Freedman team should be congratulated on their efforts to ease the so-called ‘crisis in cosmology’!
Graham Swinerd
Southampton, UK
June 2024

(1) Blog post August 2023, www.bigbangtobiology.net.
(2) Graham Swinerd & John Bryant, From the Big Bang to Biology: where is God?, Kindle Direct Publishing, November 2020.
(3) The Kelvin temperature scale is identical to the Celsius scale but with zero Kelvin at absolute zero (-273 degrees Celsius). Hence, for example, water freezes at +273 K and boils at +373 K.
(4) A millidegree is one thousandth of a degree.
(5) Here CDM stands for cold dark matter, and the Greek upper-case Lambda (Λ) refers to Einstein’s cosmological constant, which governs the behaviour of dark energy.