The Apple Experiment
Melissa Sterry. Published August 5th 2024
Picture this scene:
A teacher and twelve high school pupils are standing in a circle in a classroom. Each of the pupils has their arms stretched out in front of them, palms facing the ground, and holds an apple in each hand. The teacher asks them, ‘If you release the apple in one of your hands, will the apple fall or float?’, to which the pupils respond, ‘Fall’. The teacher then asks the pupils to release one apple from one hand, at which point that apple falls to the ground. She then asks them to release the apple from their other hand, at which point that too falls to the ground. The teacher then asks the pupils to undertake a thought experiment, wherein they are gathered not in the classroom, but on the International Space Station. She asks them to imagine they have formed a circle and, as in the experiment they have just completed in the classroom, have their arms stretched out before them, holding an apple in each hand. ‘If you open one of your hands, would the apple you had held in that hand fall or float?’, to which the pupils respond, ‘Float’. The teacher then asks the pupils to imagine they have released the apple from the other hand and repeats the question, ‘Would that apple fall or float?’, to which the pupils again reply, ‘Float’. The teacher then asks the pupils what The Apple Experiment has proven. One pupil correctly replies, ‘Newton’s Law of Universal Gravitation is correct’, to which the teacher responds by congratulating the pupil on recognising that the exercise was a means of demonstrating that Newtonian mechanics enables us to predict the motion of objects in time and space.
However, The Apple Experiment is not just an exercise in evidencing the proof in a gravitational pudding. What it also makes evident is that we can accurately predict the future in some circumstances. More specifically, when we understand the Laws of Science, and with that how those laws are expressed in different environments, we can anticipate the outcome of some events. Thus, as with so many other things, though the statement that ‘you cannot predict the future’ holds generally true, it’s a rule, not a law, and its expression is not universal.
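To make that concrete, here is a minimal sketch, in Python, of the two apple experiments as Newtonian calculations. The physical constants are standard; the hand height is an assumed figure for illustration.

```python
# The 'apple experiment' as computation: Newton's Law of Universal
# Gravitation lets us predict both outcomes described above.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # mass of Earth, kg
R_EARTH = 6.371e6    # mean radius of Earth, m

def g_at_altitude(altitude_m: float) -> float:
    """Gravitational acceleration at a given altitude above Earth's surface."""
    r = R_EARTH + altitude_m
    return G * M_EARTH / r**2

# Classroom: an apple released from an outstretched hand (an assumed ~1.2 m
# above the floor) accelerates downwards and lands in about half a second.
h = 1.2
g_surface = g_at_altitude(0.0)
fall_time = (2 * h / g_surface) ** 0.5
print(f"Surface gravity: {g_surface:.2f} m/s^2, fall time: {fall_time:.2f} s")

# ISS (~400 km up): gravity there is still roughly 90% of its surface value.
# The apple 'floats' not because gravity is absent, but because apple, pupils,
# and station are all in the same free fall around Earth, so their relative
# acceleration is effectively zero.
g_iss = g_at_altitude(400e3)
print(f"Gravity at ISS altitude: {g_iss:.2f} m/s^2 "
      f"({g_iss / g_surface:.0%} of surface gravity)")
```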
Since childhood, I have undertaken many ‘apple experiments’. Aged eleven I was creating trend reports filled with pencil sketches of the fashion, interior, and architecture concepts I thought would become popular in the coming year. I didn’t realise it then, but I was expressing a trait that’s fundamental to how my neurology works: pattern recognition. By twenty I was authoring a Bachelor’s thesis on the future of beauty, In the Eye of the Beholder [1], in which, having analysed millennia of visual representations of beauty ideals from across myriad cultures, and with that the development of the beauty industry in the 20th century, I predicted the trends that would shape the coming three decades. The trends I detailed in the thesis, which was completed in 1996, included bodily augmentation through both surgical and non-surgical procedures becoming a multibillion-dollar industry that saw a heavily homogenised facial and body ideal, juxtaposed against a counterculture trend that embraced heterogeneity and individualism. In the latter instance, I wrote that features that had previously been considered quirks and imperfections would become celebrated, particularly in high fashion.

In the mid 00s I published a series of essays, of which one was a cover story for Mensa magazine [2], in which I wrote of how shifts in media, press, and television were creating some highly adverse trends in society, trends whose implications were serious. These developments were principally driven by the fact that the quantity of content being produced was increasingly trumping the quality, and the primary cause was a shift in the financial models that content producers were working with. Whereas, for the better part of the 20th century, the model had been to make less, but higher-quality, content that reached large audiences, the 21st century was shifting towards a highly fragmented market model in which many production budgets were reduced. One of the ways that production companies were lowering costs was by replacing professionals with amateurs, and this was most evident in the genre dubbed ‘reality TV’. As anyone with any expertise in media production would have recognised, there was nothing ‘real’ about the content being produced. We were not witnessing the creation of content intended, in the style of Alan Whicker’s pioneering documentary works, to present an authentic window on the world. Instead, we were witnessing a form of televised theatre, wherein the set wasn’t built in a television or film studio, but created through post-production editing processes. And whereas in the past the performers, be they actors, broadcast journalists, or presenters, had been professionals, these performers were amateurs. More specifically, amateurs who often had no idea what they were letting themselves in for. However, though it was already clear that profits were often being put ahead of ethics, the ‘reality show’ stars weren’t the only possible victims. Surveys started to show that many young people were falling for what was, in effect, a lie, that lie being the idea that fame and fortune can come easily, which is why the title of one of the essays in that series was Easy Come, Easy Go: the beginning of the end of manufactured celebrity [3].
By the late 00s I was predicting the impact of participatory media [4, 5], and explaining how technical advances in personal communications devices, in tandem with social and cultural trends, pointed to a major trend that would see citizen journalists and presenters change media content production as we knew it. I called this the democratisation of media production, and detailed how it would present both pluses and negatives. On the one hand, mass micro media content production would make many more viewpoints visible; we’d hear more, not fewer, voices in the coming couple of decades. I explained that this shift would remove many of the barriers to entry into a career in industries including but not limited to media, as it would enable individual content creators to be discovered in the absence of the likes of talent agents and others who had historically served as gatekeepers. I also explained that this emerging trend would create a more heterogeneous media industry and ultimately challenge the media monopolies that had dominated the previous few decades. On the other hand, mass micro media content production would create a situation where the calibre of content creation was highly variable and, paradoxically, though it would enable copious content authored with authenticity and integrity to propagate, it would do the same for misinformation, which would become so abundant it would cause confusion, controversy, and conflict. Put another way, the growth in do-it-yourself content creation would have some very serious unintended consequences, and thus was by no means a content production panacea.
Between then and now, other predictions I delivered in keynote speeches, board advisories, and press articles that came to pass included the financial crisis of the late 00s; the environmental crisis spiralling to worst-case scenarios far faster than the consensus presented in the likes of IPCC reports of the time suggested; and, between 2014 and 2015, the verdict that Britain and America were about to enter periods of substantial political, economic, societal, and cultural unrest that would upend the prospect that many of the peachy predictions of others would become reality. Then, between 2017 and 2019, I delivered a series of talks in which I accurately predicted that the world would soon experience a pandemic of proportions that would cause chaos worldwide and necessitate that companies and civic institutions alike rethink their strategies from the ground upwards, a prediction referenced in an article published in Inside Housing several months later [6]. Last autumn, when sharing my predictions for the year ahead in a series of Instagram livestreams, I predicted that 2024 would, sadly, see significant civil unrest in Britain, including an uptick in riots and other direct action by those who had become so disenchanted with the state of the nation that they chose the path of violence ahead of negotiation and other more peaceful routes to reconciliation.
All the above are predictions that involve assessing the probability of the expression of patterns, and the relation of cause to effect more generally; that is, asking, ‘what are the system inputs, and what do science and history suggest the outputs will be?’. Another analogy I use to explain the logic I apply when assessing the probabilities attached to different possible outcomes is betting on the outcome of races. Gamblers don’t typically place their bets randomly. Instead, they take the odds as calculated by bookies as their guide. If they are high-stakes gamblers, they will usually study the form of the various competitors. Whether consciously or otherwise, these gamblers are attempting to spot a pattern, a pattern that indicates whether a competitor is likely to succeed or fail in their attempt to win the race.
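For those who like to see the arithmetic, here is a minimal sketch, in Python, of how bookmakers’ odds encode probabilities; the runners and their odds are invented purely for illustration.

```python
# Converting fractional betting odds into the probabilities they imply.

def implied_probability(numerator: int, denominator: int) -> float:
    """Fractional odds of numerator/denominator imply this win probability."""
    return denominator / (numerator + denominator)

# Hypothetical runners and odds, for illustration only.
odds = {"Runner A": (1, 1), "Runner B": (2, 1), "Runner C": (3, 1)}

for runner, (num, den) in odds.items():
    p = implied_probability(num, den)
    print(f"{runner} at {num}/{den}: implied probability {p:.1%}")

# The implied probabilities across a full field typically sum to more than
# 100%; the excess (the 'overround') is the bookmaker's margin. Studying form
# is, in effect, an attempt to find a runner whose true probability of winning
# exceeds the probability its odds imply.
total = sum(implied_probability(n, d) for n, d in odds.values())
print(f"Book total: {total:.1%} (overround: {total - 1:.1%})")
```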
Some foresight and futures professionals shun the idea that working in and with futures is concerned with predictions, suggesting instead that the discipline ought only to be concerned with creating scenarios. I beg to differ because, though scenarios are a useful tool for helping companies and other clients work out how they may prepare for various eventualities, many seek guidance on which events are most likely to unfold. Put another way, they are interested to know what Probability Theory suggests are the most likely of the many possibilities. That many clients seek such guidance makes sense, given that any board executive worth their salt is aware that running companies and other organisations involves making a series of what are, in effect, gambles, and they understand that the more information they have relating to their options, the more likely it is they will back the analogical winning competitor, not the loser, in any given race.
Most ‘apple experiments’ involve far greater levels of complexity than the classroom scenario with which I opened this essay. For example, understanding whether a new invention may scale to global impact isn’t just a question of understanding how it works, but with that, what alternatives exist; what resources are involved in their production, distribution, retail, and maintenance; how the availability of those resources may change over the coming years and decades; how shifts in economics, society, culture, and environment may influence future market demand; how competitor activity might do likewise; and perhaps above all else, whether or not the concept exhibits qualities that might appeal to a large enough populace to drive research, development, and commercialisation. Hence, whereas the classroom apple experiment involved factoring just two variables across space and time, assessing the likely trajectory of an invention or other concept involves factoring shifts in many. I’m by no means the first foresight and futures expert to take this tack. Others that came before me included the likes of Arthur C. Clarke and, I would argue, H. G. Wells, who, though principally regarded as a science fiction author, nonetheless made many accurate predictions about aspects of the future, such as conceiving of a handheld communication device that enabled images to be streamed between distant places in real time [7]. Clarke, Wells, and I share a commonality in our approach to our work, in that science, and STEM more generally, is central to our research practice and is the primary source of our inspiration, artistic and otherwise.
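To illustrate the difference in kind, here is a toy Monte Carlo sketch, in Python, of the sort of multi-variable assessment described above. Every factor, range, and threshold is invented purely for illustration, standing in for the far richer models such assessments actually draw upon.

```python
# A toy Monte Carlo model: where the classroom experiment had two variables,
# an invention's trajectory depends on many uncertain, interacting factors.
import random

def simulate_scale_up() -> bool:
    """One hypothetical future: does the invention scale to global impact?"""
    resource_availability = random.uniform(0.2, 1.0)  # do inputs stay affordable?
    market_demand = random.uniform(0.1, 0.9)          # does demand materialise?
    competitor_pressure = random.uniform(0.0, 0.8)    # do rivals erode its share?
    appeal = random.uniform(0.3, 1.0)                 # broad enough appeal?
    score = resource_availability * market_demand * appeal * (1 - competitor_pressure)
    return score > 0.15  # arbitrary threshold standing in for 'global impact'

# Running many hypothetical futures yields a probability, not a certainty.
runs = 100_000
successes = sum(simulate_scale_up() for _ in range(runs))
print(f"Estimated probability of scaling: {successes / runs:.1%}")
```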
Some muse as to the extent to which artificial intelligence will be able to make predictions about the future. Having examined this question through many lenses, I conclude that AI will achieve increasingly high levels of accuracy in some specific kinds of forecasting, as applies to both environmental and social issues. The reason I draw this conclusion is that AI works on pattern recognition, and in the context of probability. This is why AI text and image generators are usually accurate when answering questions on well-established fields, and vice versa on still emerging ones. In the former instance, the high level of correlation across a data set typically enables an AI system to calculate a largely, if not wholly, accurate answer. However, when there’s very little data available, as occurs with topics at the very edge of science and innovation, the process of pattern recognition becomes far trickier, which is why AI systems tend to deliver far less accurate answers to fundamentally speculative questions, including those that relate to aspects of the future.
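The point about data abundance can be shown in a few lines of Python. In this minimal sketch the ‘true’ probability is an invented figure; the procedure, which simply estimates a frequency, is identical in both cases, yet it is reliable on a large sample and erratic on a sparse one.

```python
# Pattern recognition under abundant versus sparse data.
import random

random.seed(42)
TRUE_PROBABILITY = 0.7  # the underlying pattern the estimator tries to recover

def estimate_from(n_observations: int) -> float:
    """Estimate the probability from n noisy observations."""
    hits = sum(random.random() < TRUE_PROBABILITY for _ in range(n_observations))
    return hits / n_observations

for n in (10_000, 100, 5):
    estimates = [estimate_from(n) for _ in range(5)]
    print(f"n={n:>6}: " + ", ".join(f"{e:.2f}" for e in estimates))

# With 10,000 observations per estimate, repeated runs cluster tightly around
# 0.70; with only 5, they scatter wildly. Well-established fields are the
# large-n case; questions at the very edge of science are the small-n case.
```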
Whether a prediction is being generated by a human or an AI, the accuracy of that prediction is nonetheless framed by human factors, hence why theory doesn’t always translate to practice. But which of these human factors is foremost? There is a paradox in prediction, something that has been widely understood by innumerable individuals since, at latest, the Bronze Age. What is that paradox? It’s perhaps best explained in the myth of the Trojan priestess Cassandra, who was given the gift of prophecy by the god Apollo upon the promise that she would grant him favours. However, when she failed to keep that promise, he fated her never to be believed. The trope repeats time and again across cultures both ancient and contemporary. In the case of the former we find examples in the many flood myths dating from prehistory onwards. In the latter instance, Cassandra takes the form of the character of Brody in Jaws as he warns beachgoers not to enter the waters around the fictional location of Amity Island, he being an archetype found in umpteen Hollywood action films, including Dante’s Peak and Jurassic Park. “Life finds a way”, as one of Michael Crichton’s most famous characters said [8]. Predictable though the narrative is, audiences connect with it because it’s all too familiar to them. We all of us experience instances where an individual or collective has warned one or more others of a pending disaster, only to be ignored. For example, just as John Evelyn had warned that a great fire would engulf London if mitigation measures were not taken, the residents’ committee of Grenfell Tower had flagged safety concerns relating to the risk of fire well ahead of the catastrophe that struck in 2017. Which raises the question of why ‘Cassandras’ have been ignored since prior, even, to the advent of human civilisation.
I posit, as have others before me, that it comes down to human neurology and, in turn, psychology. Humans don’t understand the external world in a homogeneous way. Both consciously and unconsciously we edit our version of reality. Some edit reality in extreme ways, others only partially. The extent to which we do is a product both of our capacity for dealing with issues of complexity and, with that, of our general character. Some intellectually tread where angels so-say fear to, while others do quite the opposite. However, whether an individual runs towards or away from information and experiences that can help them decipher fact from fiction, mental health studies have shown that we all of us tend to dislike unpredictability in at least one or more domains of our life, and especially so when it comes thick and fast. In other words, we all of us exhibit limits in our capacity to navigate uncertainty, and when those limits are surpassed, our mental health and wider wellbeing usually suffer. This is why it’s ‘only human’ to reach the point where we feel that ‘enough is enough’.
In not one, but many nations of the world, many citizens are showing strong signs of having had ‘enough’. As Alvin Toffler accurately predicted in Future Shock [9], periods of rapid change elicit varied responses across the populace, and a primary indicator of how an individual or collective will respond is whether they tend to fear change or to embrace it. For some, change is a stimulating experience, to such an extent that they actively seek it. Indicators of this disposition include pursuing life-long learning, both formal and otherwise, travelling far and wide, socialising with peoples of wide-ranging backgrounds and demographics more generally, and having highly varied interests, professional and personal. Added to this, this group tend to embrace ideas that are new and still emerging, and are comfortable with abandoning protocols and authoring anew. Whereas, for others, change tends to trigger a fear response that sees them trying to hold to the notion that it’s possible to create constancy and predictability, both in and beyond society. This group tend towards embracing traditions and feel nostalgic towards the past. They tend to believe in the idea of a Golden Age when the human world was a better place than today. This group is likely to live a lifestyle not dissimilar to that of their parents, and in turn their grandparents, and typically feel a sense of duty to uphold ideas and values authored not by themselves, but by past generations. Their preferences result from a sense of psychological comfort derived from familiarity. Both groups, and in many instances in equal measure, exhibit vulnerabilities: the former tends towards unrealistic expectations of the future, such as the idea that technology ‘will save us’; the latter towards idealised perceptions of the past, to such an extent that their mental version of it can become every bit as fictitious as the ideas held by their opposites. The language, and in turn the conversations, of those that embrace the future tend towards utopian visions, whereas the language of the latter tends towards dystopias. Thus, neither embracing nor rejecting change can necessarily protect an individual or collective from it, for though the former can help build resilience and enable adaptability, sometimes change comes in forms so catastrophic that the capacity for human agency is all but destroyed. Furthermore, when the future comes faster, not slower, in the sense that a series of factors combine to create shifts at great scale and speed, such is the level of chaos that ensues that even those who anticipated and tried to prepare for the most likely eventualities can get caught out.
I have hypothesised that the primary cause of the variance in human neurology and psychology is evolutionary; more specifically, that Homo sapiens, being a social species, have evolved such that skills, abilities, and ambitions across our population are not ubiquitous but varied, and in a way that enables specialisation, and thus more, not fewer, achievements of the kind that can only be attained through the study and practice of a specific subject over time. It is a hypothesis I have named the ‘Ecosystem of Neurologies’. I have posited that this evolutionary trait is both a blessing and a curse: the former because it has, I think, enabled us to advance to a high level of social, cultural, intellectual, and technical sophistication; the latter because, I argue, it has caused our species to have highly differentiated understandings of the external world and thus to be highly inclined towards disputes of the kind that, at best, cause discomfort, anxiety, and mental stress more generally, and at worst catalyse conflicts of the kind that see us tearing one another apart.
For several years now I’ve researched how, if the hypothesis holds ground, we might reconcile our differences. Unfortunately, though I conclude that there are points in human history when, despite our many and sometimes profound differences of perspective, we nonetheless exhibit the capacity to reconcile those differences on some issues and agree to disagree on others, there are also points when we do not. Historically, the odds that we will do the former appear to be higher in times of relative abundance of resources, and lower in periods when resources become scant. Consequently, the availability of resources is a primary indicator of conflict probabilities, this being something that military strategists know all too well. Given the trajectories that predictive modelling suggests to be most likely in the coming years and decades, that’s a coupling that doesn’t bode well for human civilisation today, which is one reason why, since 2009, I’ve been so focused on researching and developing solutions to worst-case scenarios at the level of both the environment and society, including but not limited to what I have anticipated will be rising risks across multiple classes of natural hazard. More recently I have also predicted that the collapse of our current civilisation has already begun; it’s just not evenly distributed, to borrow a phrase from William Gibson.
Of late, many foresight and futures practitioners have been concerned with creating ‘preferred futures’. Occasionally I too explore the future through this lens. However, when doing so I’ve been very much aware that different people’s preferences for the future vary greatly, and thus it’s less, not more, likely that any one such future will come to pass, and even if it does, it will likely be expressed within spatial and temporal boundaries whose limits are tightly bound by both environmental and social factors, therein heterogeneous in expression, not homogeneous. Globally, we’re witnessing, and in real time, an increasingly contentious wrestle between the preferred futures of inherently polarised perspectives. The fragmentation of opinion is expressed in relation to all manner of aspects of our lives today, from how we identify ourselves to how we love to how we work to how we relate to places, to possibilities, and, as is particularly visible at present, to other peoples at large. There is, it seems, little if any common ground, and arguments that flare about one issue often expand to incorporate many, particularly if they involve issues of race, gender, biological sex, and sexuality. Trust is now in very limited supply, be that trust in the media, trust in politicians, trust in scientists, trust in those previously revered as experts more generally, and, it seems in some instances, even trust in the self, as expressed by the fact that levels of anxiety are at a high, and anxiety is a fear-based response that essentially comes down to an individual fearing that they will lack the ability to navigate a given situation, or collection thereof, in a way that will yield a desirable, or even acceptable, outcome. Humanity is having an existential crisis that, as is, looks to get a lot worse before it might get better. If only predicting the future for civilisation were as simple as predicting whether an apple will fall or float given the gravitational environment it’s in.
As I reflect on the events that may unfold in the coming weeks, months, years, decades, and centuries, I am more aware than ever that the choices we make today will shape the future in ways that largely cannot be undone. In the earlier part of this century, though aware that time was of the essence, mine was nonetheless the sense that, at the very least, humanity most likely had just enough time to steer our civilisation towards a better, not worse, outcome. Now, however, as I look to the events unfolding about the world, I see many dice rolling, and I am highly conscious that my analysis of copious and varied data has led me to conclude that a series of catastrophic events will come to pass, and pass soon. All the while, I hope that my analysis is wrong and that, sooner not later, the many and profound troubles of the present will be largely gone, or at the very least much diminished in their size, distribution, and impact. If there’s a lesson to be learned from the Classical myth of Cassandra, it’s that, in and of itself, and regardless of accuracy, prediction often has very limited impact on the outcome of societal and wider events. Might the mythologies of ancient Greece, Rome, and beyond give us some insight into how we might better navigate the many problems we now face? Given that I think we, humanity, need all the help we can get now, I hope so, not least because we are not the first civilisation to face the prospect of possible imminent collapse, but just the latest in a long line. In my assessment, though on the surface of it we differ from our ancient forebears, at the level of our cognition, psychology, and behaviour we’re still one and the same creature. We exhibit today all the same fundamental characteristics of the peoples who came before. Which is why I think it imperative that, when looking to the future, we also look back, not through the lens of rose-tinted glasses but through that of a microscope, and in the process blend what insights we can glean from the sum of collective knowledge to date: from the sciences, from the humanities, and from the arts. Perhaps our efforts will turn out to be in vain, but if ever there were a high-stakes gamble, it’s the one we’re facing now.
[1] Sterry, M. L., (1996) In the Eye of the Beholder, BA thesis, University of Salford, Greater Manchester, UK
[2] Sterry, M. L., (2008) Culture Clash: Famous for being famous, Mensa magazine, London, UK
[3] Sterry, M. L., (2007) Easy Come, Easy Go: The Beginning of The End of Manufactured Celebrity, Medium Magazine, London, UK
[4] Sterry, M. L., (2009) Media Monopoly vs. Media Democracy, Keynote, InterACT Participatory Media Forum, Herefordshire, UK
[5] Sterry, M. L., (2009) Putting the ‘We’ in Web, Keynote, Media Ecology & Post Industrial Production conference, University of Salford, UK
[6] Taylor, P., (2020) If community groups deliver better support than social landlords then we need to revisit our purpose. Inside Housing, London, UK
[7] Wells, H. G., (1897) The Crystal Egg. Pearson’s Magazine, London, UK
[8] Crichton, M., (1990) Jurassic Park. Alfred A. Knopf, New York, US
[9] Toffler, A., (1970) Future Shock. Random House, New York, US
Disclaimer: This was drafted at speed, without planning, and has not been proofed. You can consider its doubtless many typos and other errors as evidence that it was authored by a human and not an artificial intelligence system.