Is Society Sick? Autism and the Extended Mind


As readers of this blog will know, neurodiversity proponents argue that autism is a neuro-minority rather than a mental disorder. In short, they deny neither that categories such as autism indicate disabilities, nor indeed the various forms of distress associated with being disabled. Rather, they locate this disablement and distress in society instead of framing it as stemming from a medical pathology in the individual. So an autistic person may, for example, have heightened sensory sensitivity; but they only become disabled in relation to the senses in a world that is not designed to accommodate this different way of processing, thus leading them, say, to experience sensory overload.

Nonetheless, critics of the neurodiversity movement like to point to examples of what they (misleadingly, as we shall see) call “severe” or “low-functioning” autism in order to limit or even dismiss the validity of the movement. Sure, they argue, even if some so-called “high-functioning” individuals seem different rather than disordered, and even if a few people given the “severe” label do advocate for the neurodiversity movement, these are not representative of the many who need 24-hour care, and who have (so far) been unable to communicate anything more than very basic needs to those around them. In short, these critics claim that the minds of at least some of those deemed “severe” are so deficient that they are not just undeniably bad, but inherently so.

I want to suggest here that these critics of neurodiversity base their view on an increasingly outdated, and indeed unjustified, theory of what a mind is as such. I will call this the ‘neuro-centric mind’ theory, and I take this (at least implicitly) to be the most widely held theory of the nature of the mind among the educated public today. In short, on this view, a mind stems from a brain, and consists in the psychological functioning that stems from that brain. Looked at this way, you might as well say that you are your brain. Or at the very least, the brain is the seat of the mind, and anything beyond the brain and its thoughts is not part of the mind as such.

This is a fairly widely held view of the nature of the mind, and one I am sure many readers of this blog will hold (including many neurodiversity proponents). But as the philosopher Andy Clark has long argued, this view is neither justified, nor able to accurately capture the phenomena of cognition. In short, he notes, many of our cognitive processes do not just take place within the broader world they are embedded in; rather, many forms of cognition wholly or partly rely on this outside world. As a recent commentator summarises, for example: ‘There [are] many kinds of thinking that [aren’t] possible without a pen and paper, or the digital equivalent—complex mathematical calculations, for instance. Writing prose [is] usually a matter of looping back and forth between screen or paper and mind: writing something down, reading it over, thinking again, writing again. The process of drawing a picture [is] similar.’

What Clark points out here is that, whilst the neuro-centric mind theory might frame these resources as mere external scaffolding for the mind, this vastly underestimates their importance. In fact, since the cognitive processes that arise in light of them could not happen without this scaffolding, this scaffolding is as integral to cognition as the brain itself.

Hence the superior alternative to the neuro-centric mind theory is Clark’s (and Chalmers’s) ‘extended mind theory’, which holds that the mind is not situated in the brain, but rather widely embedded in relation to various resources, from the neural to the technological (for example, even as I type this article, the many tabs open on my browser are constituting my memory as much as my brain is). By contrast, it is a mistake to think that ‘whatever matters about my mind must depend solely on what goes on inside my own biological skin-bag, inside the ancient fortress of skin and skull.’ This belief, although still widely held, rests on an unfounded assumption as to what constitutes cognition – one that unduly discriminates against environmental (and sometimes bodily) resources in favour of the neural.

Return here to the example of the so-called “severely” autistic person who needs a lot of help. Perhaps they need special technology to communicate (we might not even have invented this technology yet), and various technological resources in place to help them organise and plan their day. Perhaps they also need other people around the clock to help them with various tasks. And so forth.

On the one hand, according to the neuro-centric mind view, we should judge mental deficiency in terms of having to rely on external supports more than whatever we consider normal at a given time, since the need for external supports indicates that the mind (and brain) must be lacking. (In turn, if we then look at the neurology and it does indeed seem different to the norm, then this is taken to confirm that the mind was deficient – thus giving the impression of justifying the initial hunch.) Hence, on this view, it is hard to conceive of someone who needs 24-hour support, and so forth, as anything but mentally disordered, or indeed “low-functioning”.

By contrast, however, without unduly discriminating against the extended aspects of the mind, there is no good reason to think that any given mind is deficient merely for having to rely on the environmental aspect of the mind more than the neural. For if we refrain from such discrimination, then this seems like a matter of different distribution rather than inherent dysfunction. It is just that some minds rely on the neural aspect more heavily, whilst others rely on the environmental aspect more heavily. The case of the so-called “severely” autistic person who needs a lot of help is, then – at least when they do get the help they need – a case of widely distributed cognition, rather than deficient cognition.

Crucially too, then, whether they get this help or not is a matter of social justice, not of medicine. Just as it would be a case of social (in)justice rather than medical pathology if some group (for instance, all those with an IQ between 85 and 90) were needlessly excluded from being given calculators at school due to discrimination. The extended mind theory thus helps us understand how the disablement of the autistic extended mind is a matter of power-structures having excluded this mind in various ways, rather than a simple matter of impaired functioning.

This leaves me with the following two thoughts. On the one hand, for as long as there are autistic people whose extended minds cannot function, it may, despite initial appearances, not be the autistics who are sick. Rather, the sickness lies in the society that leads to the disablement of the extended minds of its minority members. On the other hand – and perhaps worse still – society’s need to project this pathology back onto the autistic person, rather than admitting its true societal nature, makes the sickness of that society seem much more insidious than it might initially appear.

New Research Suggests Social Issues are Down to Neurotypicals more than Autistics


Picture by Joan M. Mas

Autism is seen, in popular representations, largely as a social and communication disorder. Formerly framed as stemming from an autistic lack of a “social instinct”, the current dominant idea is that something is deficient or missing in autistic social cognition. Often referred to as a cognitive deficit in “empathy” or “theory of mind”, much research on autistic social issues has focused on trying to clarify and detect this inside autistic brains and minds. The search for an elusive broken “theory of mind module” or “empathy mechanism” in the brain, and its ensuing cognitive manifestations, however, has led to conflicting results – with some scientists even concluding that autistic people feel too much empathy rather than too little.

Another view is that this is not simply an individual neuro-cognitive issue, but rather a wider social problem. Against the idea that autistic people have too much or too little of anything, autistic neurodiversity movement advocates have long argued that the empathy problem is actually a two-way issue between neurotypicals and autistics, which only emerges when the two interact. Dubbed the “double empathy problem” by autistic scholar Damian Milton, this framing emphasises how communication and social encounters always happen between people – meaning that any breakdown in communication is always relational and down to both sides, not just an innate issue with one or the other.

Not much empirical research has been carried out in this regard (most funding is directed to and by pathology paradigm proponents, who have no interest in pursuing a neurodiversity paradigm reframing of autism). However, a new collection of studies, published in Nature, has given clear weight to the notion that autistic problems in socialising stem as much from the neurotypical side as from the autistic side.

The three studies, each of which drew on different samples and used a variety of methodologies, all found that

‘observers’ first impressions of individuals with ASD engaging in real-world social behavior were found to be robustly less favorable than those of matched TD [i.e. neurotypical] controls […] these impressions were associated with reduced intentions to socially engage by observers’

In other words, they found that an important contributor to social and communication problems stemmed not from the autistic individuals, but rather from the neurotypical reactions – based on what are, by definition, exclusionary social attitudes and first impressions – which led to a decreased drive to interact with autistic individuals. That is to say, neurotypicals tend to decide, within moments of meeting autistic people, that autistic people are less worth socialising with than neurotypicals.

Building on this, one of the studies further compared evaluations between written communication and speaking in person. What it found was that autistic people were not rated negatively by neurotypicals when only their writing was assessed. Rather, it was how autistic people looked, rather than the substance of what they said, that was the key factor in determining the neurotypical drive to exclude autistic individuals. This was further confirmed, note the researchers, when

‘a static image was sufficient for generating negative first impressions of those with ASD […] In contrast, first impressions of TD controls improved with the addition of visual information’

In other words, an accompanying photo of an autistic individual had a negative effect on neurotypical perceptions of the value of the writer, whilst an accompanying photo of a neurotypical tended to have a positive effect.

Given this, it is no surprise that autistic people experience problems when it comes to social interaction. As the authors further clarify, their findings suggest that the issues autistic people face are in fact relational:

The reluctance of TD individuals to engage in social interactions with their ASD peers further limits the opportunities for individuals with ASD to practice their already fragile social skills. This can have a significant negative impact on the ability of socially aware and socially interested individuals with ASD to improve their social communication abilities and work toward more successful social integration

The practical implication of this, conclude the researchers, is that

If our goal is to improve social interactions for individuals with ASD, it may therefore be equally important to educate others to be more aware and accepting of social presentation differences, rather than trying to change the many interwoven factors of self-presentation that mark the expressions of individuals with ASD as atypical.

This fits precisely with the notion that the empathy problem goes both ways rather than stemming from within autistic people – not to mention that all this happens in the context of pervasive ableist norms and attitudes that seek to alter rather than accommodate autistic ways of being. Given this, as many autistic individuals will surely testify, this important new research may be vital in helping to show policy-makers and the wider public that the key problem with regards to autism is not autism itself. Rather, as autistic self-advocates have long argued, it is neurotypical society and the ableism so deeply embedded throughout it.




The term ‘schizophrenia’ stems from the Greek words skhizein, which means ‘to split’, and phrēn, meaning ‘mind’. The initial thought was that the mind splits in some sense: not, as is sometimes thought, into multiple personalities; but rather into a fragmented and alienated personality, usually beginning in late adolescence or early adulthood. Beyond this, the condition is also centrally characterised in light of psychosis, hearing voices, paranoia, a general sense of apathy, and flat or fluctuating moods and emotions. When all these characteristics emerge together in one person to what is deemed a ‘clinically significant’ level, that person is taken within institutional psychiatry to “have” schizophrenia.

In general, these traits are almost universally thought to be inherently harmful deviations from the norm, and together they are taken to somehow destroy or at least radically damage the previously existing person. Given this, schizophrenia (and related ‘schizophrenic spectrum’ conditions such as schizoaffective disorder) is taken, within both institutional psychiatry and society more broadly, to be something terrible – a tragic medical disease to be combated with pharmaceutical drugs, institutionalization, and perhaps, one day, genetic engineering.

But the rise of the neurodiversity movement, which until now has focused mostly on the autism spectrum and other cognitive disabilities, gives reason to challenge this view of the schizophrenic spectrum. What neurodiversity movement proponents claim is that, even though the underlying neuro-cognitive differences captured by psychiatric labels indicate meaningful ways of being, the harm associated with these conditions is caused by society and ideology rather than by anything like innate medical pathology. In light of this, neurodiversity movement proponents argue that what they call ‘neurominorities’ – autism, dyslexia, and so on – are natural and legitimate, albeit oppressed or excluded, ways of being in the world.

In particular, neurodiversity proponents stress the need to embrace one’s neuro-type as an inherent part of one’s identity, instead of seeing it as an external pathology that is separate from one’s essential selfhood. Autistic neurodiversity proponents, for example, find the notion that you can have autism, rather than be autistic, highly problematic – much in the way that it is problematic to speak of gay people as ‘people with sexual orientation disturbance’, which was precisely how homosexuality was characterized within institutional psychiatry until the 1980s. In this regard, neurodiversity proponents stress the need to drop person-first language (e.g. ‘person with autism’) in favour of identity-first language (e.g. ‘autistic person’). Part of the thought behind this is that, in order to genuinely accept oneself, and in turn live a good flourishing life, seeing one’s neurocognitive style as an essential and valuable part of one’s selfhood, rather than an added pathology, seems vital.

Notably, although some groups of schizophrenic persons have resisted institutional psychiatry in various ways, for example by trying to demedicalise hallucinations, the neurodiversity movement calls into question whether schizophrenic selfhood, too, might be re-conceptualised and reclaimed in this way. That is, it points towards the possibility of not just demedicalising schizophrenia’s associated traits, as the ‘hearing voices’ and ‘mad pride’ movements have already started to work towards, but rather towards a more radical and affirmative embrace of the categorization as part of the personal and political identities of schizophrenic persons.

Of course, on the face of it, it might seem utterly bizarre to claim that schizophrenia could be seen as a neurominority rather than a disorder. After all, there is nothing good about hearing harsh and invasive voices, or thinking that people are out to get you. And we have all seen tabloid stories of schizophrenic spectrum persons harming themselves or others. In short, then – as many schizophrenic spectrum persons and their families will testify – those who end up being given this label do indeed encounter terrible amounts of distress, and many would do anything to return to their pre-schizophrenic selves.

Nonetheless, all this does not, in and of itself, necessarily mean that being on the schizophrenic spectrum is inherently harmful. It may still be both that the distress stems from cultural and social rather than neurological factors, and that we dismiss schizophrenic selfhood due to a particularly restricted conception of what it means to be a healthy person (or indeed have a self at all). If this is the case, then it may be that seeing schizophrenia as a natural part of human neurodiversity is preferable to seeing it as a medical disorder after all.

One recent scientific study which supports this hypothesis, for example, comes from the Stanford anthropologist Tanya Luhrmann and colleagues. By examining the experience of hearing voices (or auditory hallucinations) in different cultures, they found that whether this was associated with distress varied hugely. More specifically, in North America, voices were experienced as harsh and invasive strangers, and so caused a huge amount of distress to those who heard them. By contrast, in Africa and India, voices were much more likely to be experienced as playful and friendly.

Very significantly, Luhrmann and her colleagues note, the key factor in this difference was the socially constructed ideal of the self that dominated in each society. In North America (as with the rest of the West), an atomized, individualistic conception of self dominates, meaning that hearing outside voices was conceptualized and experienced as an external threat to autonomy and selfhood. By contrast, given that more collectivist conceptions of selfhood dominate in Africa and India, it did not matter if some internal voices were not one’s own, since others are already part of the self anyway – and so the voices were much less likely to be harmful, and were sometimes even helpful.

Interestingly, the claims of this anthropological study chime with those of Rutgers psychologist Louis Sass, who has argued over the past three decades that post-modern Western societies hugely amplify the kinds of problems associated with the schizophrenic spectrum. For Sass, because postmodernism is, like the schizophrenic spectrum, characterized by hyper self-reflexivity, fragmentation, alienation, and an increasingly individualistic yet decentered self, the two can accentuate each other in a way that makes the schizophrenic way of being manifest as more problematic than it otherwise would. In other words, it may be pervasive aspects of our post-modern culture, and the conception of selfhood that accompanies it, that amplify the problems schizophrenic spectrum persons seem to be prone to, rather than these problems being most centrally biomedical in nature.

If the claims of researchers like Sass and Luhrmann are even partially right, then it seems that it is social rather than neurological factors that are key in explaining the distress experienced by those on the schizophrenic spectrum. And in particular, it seems that historically contingent conceptions of selfhood – that is, pre-scientific, normative, and ideologically laden notions regarding what it means to be a person – are a key factor in the harm associated with being schizophrenic. Given this, learning to see schizophrenia as a natural part of neurodiversity may actually be better for schizophrenic persons in terms of the possibility of self-acceptance and living good, thriving lives.

Indeed, as the scientific medical literature itself indicates, it is precisely the case that schizophrenic spectrum persons tend to flourish, like everyone else, only when they come to accept and affirm themselves for who they are. As one schizophrenic man, Simon Champ, writes, it was precisely such an affirmative self-narration that

‘has given me the most precious thread, a thread that has linked my evolving sense of self, a thread of self-reclamation, a thread of movement toward a whole and integrated sense of self, away from the early fragmentation and confusion I felt as I first experienced schizophrenia.’

In contrast to seeing schizophrenia as a pathological destruction of the self caused by an internal dysfunction, then, it may help to work towards a more inclusive and diverse understanding of what it means to be and develop as a self. In particular, we may need to acknowledge that for some manifestations of human neurodiversity more than others, it is natural for selfhood to fragment and then resynthesize in new and creative ways (and to be more active in generating images and voices, perhaps especially in response to often harmful social environments).

In fact, we all do this to some extent: situations that cause a rupture in our existence – from life-changing tragedies to falling madly in love – can often cause selfhood to fragment, in some sense, and in turn to need some time before it can realign and grow. This is a natural reaction, and part of the human condition. In line with this, instead of automatically seeing them as deviations from ordinary selfhood, more schizophrenic kinds of fragmentation and self-other relations might also one day be seen as natural ways for selfhood to adapt and relate to (sometimes extreme) circumstances – part of the continual path of an ever-changing self existing in an imperfect world, rather than the destruction of one self replaced by a broken self.

Of course, I am not denying that those on the schizophrenic spectrum currently suffer terribly, nor even that this way of being comes with its own unique set of challenges and limitations. Clearly, many do experience terrible distress, and it is also important to acknowledge that in some extreme cases of, say, paranoid psychosis or severe depression, medication may be helpful (just as all humans need medication at some point in their lives). The point, rather, is that all human ways of being come with their own challenges and limitations: it is part of the human condition to suffer, to experience crises of selfhood, and to change as life happens to us. And in the case of the schizophrenic spectrum, there is good reason to at least entertain the possibility that the most terrible distress might actually be caused by life-events, society, and ideology, rather than by some inherent medical pathology.

Indeed, once we add to this that, as anti-psychiatrists such as Thomas Szasz have long stressed, those given the schizophrenic spectrum labels are routinely stigmatized, institutionalized, drugged, and repeatedly told they are inherently sick and broken, it is no wonder that so many become paranoid, depressed, and apathetic. Just as with other oppressed or excluded minorities who are thereby more prone to experiencing mental distress, so too it should not be surprising that schizophrenic spectrum persons encounter similar issues in an equally systematic and terrible way.

What I am denying, then, is that those given these labels are obviously inherently broken, sick, or pathological; and I am also denying, vehemently, that it is helpful to cast them as such. Rather than seeing schizophrenic traits as something that destroys the self, we need to work towards understanding how, for some neurodivergent selves, these kinds of self-relation are a natural reaction to the various struggles encountered in the human condition more broadly, as well as to more specific oppressive social conditions. The schizophrenic spectrum, in other words, must be welcomed into the arms of the already blossoming neurodiversity movement – making it, as with autism, a way of being with its own culture and affirmative political identity.



Theories regarding the “cause” of autism abound. In the 1960s much was made of the psychoanalytic notion that bad parents, especially “refrigerator mothers”, could “cause” autism in their children by disturbing them emotionally. This theory became dominant across the West – leading to children being cruelly ripped away from their guilt-ridden mothers – until the late 1970s, when the combined might of increasingly fierce parent advocacy groups, and growing scientific consensus, pushed it to the hazy fringes of psychiatric discourse.

Since the 1990s there has been increased talk of an “epidemic”, and blame has instead been placed on vaccines. This theory was first espoused by British researcher Andrew Wakefield, and it caused quite a stir. Parents stopped vaccinating their children, leading to outbreaks of disease and sickness worldwide. Nonetheless, as with the refrigerator mother theory, a scientific consensus emerged that there was, despite incessant searching, no evidence whatsoever to support this hypothesis. In fact, although the myth is still periodically dredged up by uninformed celebrities such as Donald Trump, Wakefield has now been exposed as a fraud, and struck off the medical register by the General Medical Council.

By contrast, more reputable researchers have tended to focus on twins, genetics, epigenetics, and various other environmental factors. From the 1970s onward, for example, twin studies indicated that autism was probably inborn and hereditary. Since then, there has been an incessant search for so-called “risk” genes, as well as research on genomic imprinting and environmental factors, in order to explain the “cause” of autism – which itself is now seen more as a neurocognitive style, accompanied by varying levels of disability, than as an emotional disturbance.

In line with this, much has also been made of the fact that most of those diagnosed have been boys, which scientists now often take to indicate some kind of inherent link between autism and the male sex. Professor Simon Baron-Cohen, for example, characterises autism in terms of the “extreme male brain”. This is something he associates with rational, systematic thinking coupled with a lack of social understanding, which he hypothesises to stem from an overload of “male” hormones in the womb during pregnancy. Similarly, Dr Christopher Badcock suggests that autism may be a “hyper-mechanistic” kind of mind that stems from paternal genomic imprinting, again linking autism to the male sex.

Nonetheless, these biologised and sexed ways of understanding autism are, ultimately, unconvincing as well. On the one hand, there are multiple problems with the purported association between autism and biological maleness. First, it seems to conflate sex (more biological) with gender (more cultural) in a dogmatic and problematic way, without taking into account the highly complex ways in which the two interact. Second, even though people on the autism spectrum do tend to have these purportedly “masculine” traits, they also tend to lack many other traits associated with masculinity – for example, being good at sport and banter. Third, even though autism has, historically, been diagnosed mostly in boys and men, there is now growing awareness that many girls and women also share the cognitive traits associated with the condition, further problematising the assumed association between autism and men in a neurobiological sense.

In turn, beyond these gender troubles, when we review the scientific literature as a whole, things begin to seem a lot more complicated as regards the purported biological essence of the condition as such. In fact, what the research findings have shown is that over a thousand genes, alongside a huge range of environmental factors – ranging from one’s proximity to busy roads to the age of the mother during pregnancy – seem to increase the chance of tending towards the separate cognitive and behavioural traits associated with the condition. But, crucially, the condition as a whole has no single cause, or even a range of combined causes. Similarly, there is no neurological essence of the condition: despite systematically misleading reporting by enthusiastic science journalists, the differences seem to be unique in each case, and studies purporting to find some neurological unity are rarely replicated.

What this indicates, as Professor Lynn Waterhouse argues in her 2013 book Re-thinking Autism, is that the fundamental error guiding our understanding of autism is the assumption that it really is one biological thing, or even one “spectrum” of things. ‘Autism’, she writes, ‘is not one disorder or many “Autisms” but is a set of symptoms. The heterogeneity and associated disorders suggest that autism symptoms, like fever, […] signal a wide range of underlying disorders’. Given this (and putting aside her problematically pathologising vocabulary), talk regarding the “cause” of autism as such, when thought of in terms of a physical cause, doesn’t really make much sense (and this is so even though each single case might have its own disparate physical cause). The question, in other words, is driven by a fundamental unsupported assumption: that autism is a natural category, like “gold” or “mammal”, rather than a social category, like “black” or “female”.

In contrast to these dominant biologised approaches, then, my own concern with the autism epidemic’s “cause” lies elsewhere – beyond the biomedical, and towards the normative. In fact, we should really be more interested in the social causes of our categorisation of autism (including those which have since caused that category to broaden and change), rather than the biological underpinnings of the traits we associate with autism in any given single case. For, once we accept that autism has no unified physical essence, this seems to me to be the most valid way of talking about what “caused” autism to come into being as a distinct human kind, and then to expand into a broad “spectrum” – or, indeed, “epidemic”.

When looked at from this angle, the first thing to note is that psychiatrists don’t just go around medicalising people at random. Rather, as the pioneering psychiatrist Karl Jaspers noted in his 1913 book General Psychopathology: ‘What is “ill” depends less on the judgement of the doctor than on the judgement of the patient and on the dominant views in any given cultural circle’. In other words, psychiatrists end up medicalising whoever happens to come to, or be sent to, them for “help” at any given time. But, in turn, whoever does end up being implicated as pathological, and in need of help, will already have been delineated by the wider norms of society – and, more specifically, by whom these norms exclude.

Consider, for example, how homosexuality was wrongly medicalised as a mental disorder in the mid-20th century. Initially, homosexuality ended up being medicalised, in part, because homosexuals (to use the lingo of the time) began going to, or being sent to, their doctors in large numbers to seek “help” with their homosexual urges. But the reason they were sent was that society was already homophobic; that is, society had already pathologised being gay as somehow sick, and beyond hegemonic hetero-normativity. Thus, in its medicalisation of homosexuality, institutional psychiatry acted more as a catalyst for these more general social norms than as their cause – and the same is the case for many other psychiatric classifications.

Bearing the relationship between social norms and psychiatric medicalisation in mind, we might, then, similarly ask which norms led to autism, once disentangled from intellectual disability, being categorised as a distinct kind of human, in need of medical attention, in the first place. In other words, to locate the “cause” of autism arising as a distinct human kind, we need to ask not what its biological underpinnings are, but rather which social norms changed, and in what way, such that those we now label as being “mildly” autistic or as having “Asperger’s syndrome” began to emerge as problematic – something that first happened briefly in Austria in the 1930s, and then again in Britain, before the rest of the West, in the 1980s.

Turn, first, to 1930s Austria, where Dr Hans Asperger and colleagues began noticing a newly distinctive kind of person. Notably, it had long been the case that those autistic persons with more notable disabilities – for example, profound intellectual disability – emerged as being problematic; it was just that they were thought of as, say, “feeble-minded” or “schizophrenic” rather than “autistic”. But around this time, and for the first time in history, various boys (they were always boys, back then) whom we might now class as having “Asperger’s syndrome” or “high-functioning autism” began being sent to clinics in the German-speaking world.

When considering the social causes of the emergence of autism in 1930s Vienna, it is initially significant that it coincided with the rise of Hitler, the Nazi Party, and the German occupation of Austria. On the one hand, as I have written about previously, the Nazi Party subscribed to a Social-Darwinist ideology that drove them to categorise and attempt to eliminate what they considered abnormal behaviours. This goes part way to explaining why divergent persons were increasingly pathologised. However, this alone doesn’t explain why those specific behaviours we now call autistic ended up being deemed abnormal, and only in boys, whilst other “male” behaviours – gambling, womanising, or lying – were not then seen as problematic.

As it turns out, though, this may be explained by gender norms in Nazi Germany, which were intertwined with the drive to sterilise and exterminate the cognitively disabled. On the one hand, in Nazi ideology, the key role of men was to contribute to the state, and the key role of women was to reproduce. Thus, neither men nor women who were profoundly cognitively disabled would be seen as fit to fulfil their gender roles, meaning they were exterminated. In turn, though, at a more subtle yet equally pervasive level, Nazi ideology also promoted a hyper-masculinity, whereby manliness was specifically associated with heroic group activities. The ideal traits associated with the “new man” were thus to develop a “soldier mentality”, join brotherly male-dominated organisations such as the SS, and fight together in battles. Aside from this, there was also huge patriarchal pressure for men to marry “hereditarily fit” Aryan women, reproduce, and instil Nazi values in their children. Without exhibiting all these traits, males would not be considered “real” men, and would have fallen outside the realms of normality.

This, more than anything else, may account for why those boys who were previously considered “normal” were suddenly showing up everywhere as problematic. Given that those we now label as having “Asperger’s syndrome” are more in line with what we now think of as “geek” culture – solitary, lacking social attunement, and interested in mechanistic or philosophical pursuits – they would have fallen well outside the Nazi ideal of the “new man”. That is to say, they would neither have seemed particularly suited to marrying, due to their purported problems in socialising, nor to falling in with this “soldier mentality”, since they tend to be isolated, original thinkers, unlikely to be swept up in crowd madness. In short, as Dr Asperger noted in 1944, his autistic patients tended to ‘follow only their own wishes, interests and spontaneous impulses, without considering restrictions or prescriptions imposed from outside’ – traits which would have made them highly problematic from the viewpoint of the Nazi drive towards a homogenous, hyper-masculine group mentality.

If this suggestion seems unreasonable, consider how long it took for Asperger’s syndrome to end up being deemed an issue in the UK and the rest of the West. Whilst it was deemed problematic in the German-speaking world, briefly, in the 1930s and 1940s, it didn’t systematically appear as an issue in the rest of the West until the 1980s. Although biologised approaches to autism cannot easily account for this huge gap, one clear social explanation concerns how, during the first half of the 20th Century, gender norms in the liberal West were very different from those in Nazi Germany. In fact, the modernist male ideal in the rest of the West was much more in line with those traits we now associate with Asperger’s: being rational, clear, fixed in focus, and lacking empathetic attunement were all celebrated in the modernist masculine ideal.

Consider, as Patrick McDonagh has argued, how many heroes and anti-heroes produced by modernist writers (ranging from Beckett to Kafka) can retrospectively be seen to exhibit remarkable similarities to those bodies now labelled as having Asperger’s syndrome. One example is Albert Camus’ “outsider” Meursault, who has been described as a ‘striking depiction of a high-functioning autistic’. This is not just in light of his intense sensory overload under the blazing Algerian sun, but also, as Camus himself described him, his being ‘an outsider to the society in which he lives, wandering on the fringe, on the outskirts of life, solitary, and sensual’. In stark contrast to the hyper-masculinity of Nazi-Germany, these traits were, wrongly or rightly, positively fetishized in men throughout the first half of the 20th Century in much of the modernist West, meaning that they would not have been deemed pathological and in need of medicalisation.

Whilst these traits were celebrated in the modernist era, they increasingly began to show up as problems in Britain during the 1980s – meaning that something had changed in British social normativity. Interestingly, according to critical psychiatrist Sam Timimi and colleagues, this largely happened in light of the rise of the neo-liberal market system, and in particular the services economy. In particular, this economic shift began to alter the notion of the ideal male: rather than being fixed in focus and obsessive, men now increasingly had to shift forever into new roles and to constantly sell their “selves” in order to fit in. Members of the workforce, in other words, now had to become increasingly agile, flexible, narcissistic, and hyper-social in order to succeed and be valued – and this economic drive became reflected in social normativity at all levels of society.

Thus, whereas modernist conceptions of masculinity tended to celebrate autistic traits, neoliberal economic ideology began to alter idealised conceptions of masculinity in such a way that these traits came to be seen as pathological. Boys who fell outside these norms began showing up at clinics, and a renewed interest in Hans Asperger’s previously overlooked publications from the 1940s, led by British psychiatrist Lorna Wing, suddenly emerged in order to account for this. By the mid-1990s, Asperger’s syndrome had been added to all the major diagnostic manuals, the “spectrum” had radically broadened, and diagnoses of the condition had skyrocketed.

In both the times and places where Asperger’s syndrome came to be seen as a distinctively problematic condition – first, briefly, in Nazi-occupied territory during the 1930s and 1940s, and then again in neo-liberal Britain, Europe, and the United States from the late 1980s onward – shifting gender norms help account for why the condition began to show up as problematic, and in ever more subtle cases. Gender norms, in other words, can account for the “cause” of autism, and the autism “epidemic”, in the only way that notion makes any sense: not as something physical, but rather as something that came into being, and grew, as a distinct social grouping at some point in history.



Finsbury Park, London. It is November and the falling leaves are beautiful: red, yellow, and brown; sinking, swirling, crunching. Of course, it is uncontroversial to point out how beautiful the natural world can be at this time of year. We all agree on this, partly, perhaps, since the notion is so deeply ingrained in our culture.

I am more interested in how the beauty of the falling leaves might seem paradoxical, once we consider that their beauty stems from their death – something usually associated with suffering and tragedy. Modernity, after all, makes us fearful of death, our ageing and decaying, to an unprecedented degree. We habitually shy away from finitude: hiding it away in the hospices and slaughterhouses, and stifling its signals through cosmetic and digital enhancement.

How, then, could the fresh corpses of a thousand leaves manifest as so effortlessly magnificent?

Because of this apparent paradox – call it the paradox of falling leaves – autumn always reminds me of the ancient Greek concept of a “beautiful death”. In the Homeric era, living well included dying well, which itself was associated with a process of swapping the finite (eschaton) for the infinite (telos). In this way, and in complete contrast to modern understandings, a good death was not just bearable: it was a thing of magnificence, to be celebrated rather than mourned.

I like this concept not just for its own sake, or because it helps dissolve the paradox, but also because it shows the boundless extent to which cultural norms and practices can affect how we react to natural and, indeed, inevitable aspects of human existence. The hope that arises from this is as follows: if it is possible to see even human death, or at least a good death, as beautiful, then it should similarly be possible to re-orientate ourselves towards seeing other natural aspects of human existence through a likewise more positive lens.

By partial analogy, consider how modern life is similarly so fearful of neurological difference. Currently, we pathologise and medicalise the neurodivergent; we associate neurodivergencies with suffering and tragedy; parents mourn when their children are identified as such; we try to coerce and train them into normality from the moment of identification; and we segregate them into sub-standard schools.

Whilst this may seem ordinary and intuitive to us, by contrast, many traditional cultures managed to see profound (albeit in some ways problematic) beauty in the neurodiverse. In medieval Russia, for example, those bodies now labelled autistic were often seen as “holy fools”. Far from being an insult, the term “fool” (durachok) in this context indicated a blessed, principled, and innocent detachment, which was appreciated by Russian society and celebrated by the likes of Dostoyevsky. Similarly, in traditional prehistoric cultures, those bodies we pathologise as “schizophrenic” were revered for their shamanic insight. Indeed, even in some contemporary non-Western societies, they are still taken to be seers who can, for example, draw wisdom from dead relatives, and so are considered valuable and important members of the community.

In noting this I am not calling for a return to these worldviews: they are long gone, or going. And let us also beware of how such representations fetishize neurodivergence. But I do think we can take the insight that neurological difference, more than just being (usually begrudgingly) accepted and accommodated, could again be seen as beautiful.

I am reminded here of one of the gentler passages from the 19th Century German philosopher Friedrich Nietzsche, who urged us to learn to see beauty in human difference in a similar way to how we habitually see beauty in nature. To quote Nietzsche himself:

“In the way we go around in nature with cunning and glee in order to discover and, as it were, to catch in the act the beauty that is particular to all things; in the way we, be it in sunshine, under stormy sky, in the palest twilight, make the attempt to see how every piece of coastline with its cliffs, inlets, olive trees, and pines achieves its perfection and mastery: so too ought we to go around among people, as their discoverers and scouts […] so that their own particular beauty can reveal itself”.

In other words, just as we learn to comport ourselves towards different natural landscapes – or, indeed, falling leaves – in order that their beauty emerges for us, so too should we learn to comport ourselves to natural human difference in a similar manner.

I think this consideration is particularly pertinent when it comes to the neurodiverse. Whether we come to see beauty in the physical appearance of the model with Down’s syndrome, relish the writings of the autistic poet, or take joy in the interactive performance of the dyspraxic actor – our capacity to see beauty in such difference will come in part from how we comport ourselves towards the unfamiliar, the different, and the seemingly disordered.

Much more importantly, this will also be the case when it comes to our encounters with those we consider to have multiple and more profound learning disabilities. Just as with anything else natural, ‘beauty’, Nietzsche goes on, ‘for one person unfolds in sunshine, for another in the storm, and for a third only halfway into the night when the sky is pouring rain’. That is to say, just as a particular tree, leaf, lake, or mountain can manifest as beautiful in some light or another, so too can each different person’s beauty manifest in the light of the angle best suited to them, even if it is hard to notice at first glance.

To make this a shared habit, perhaps especially when it comes to our encounters with the more profoundly disabled, could benefit us all. Just as we, as a culture, can see beauty in each different dead leaf – not to mention how they all appear together: for they are far more beautiful in their shared diversity than alone – so too might we one day learn to comport ourselves towards the different modes of being, forms of life, and ways of relating, that constitute the wonderful neurodiversity of the natural human world. And if at some point this becomes a deeply ingrained habit, then we will thereby have made the world more beautiful in the process.

Resisting the New Down’s Syndrome Eugenics



‘Freedom’, wrote the Marxist philosopher Rosa Luxemburg, ‘is always the freedom of the one who thinks differently.’ In her view, being able to think outside the usual norms of society is utterly vital. Not just for its own sake, but also for ‘all that is instructive, wholesome, and purifying in political freedom’.

Although she died before the Third Reich swept to power – she was killed by German government-sponsored paramilitaries in 1919 – a similar sentiment was later echoed by Dr Hans Asperger, who, at considerable risk to himself, argued in 1938 to a room full of Nazi medical officials that ‘Not everything that falls out of line, and thus is “abnormal,” has to be deemed “inferior”’. In fact, Asperger suggested, many differently-minded people could think creatively due to their abnormality. Moreover, he posited, the kind of creative freedom that emerged from this meant that they could, despite their apparent limitations, be deemed useful members of society due to their ability to produce ‘original ideas’.

The reason such positives were so important for Asperger to stress was that, at the time he gave his speech, it was routine for ‘abnormal’ children to be characterised as mere ‘useless eaters’ who were nothing more than a ‘burden’ on the Third Reich. Due to the eugenicist ideology that pervaded Nazi discourse, these ‘useless eaters’ were systematically institutionalised, stripped of human dignity, and, finally, exterminated on an industrial scale. Given this hellish context, Asperger’s notion that at least some of the differently-minded could in fact be useful members of society allowed him to save some unknowable number of children from being horrifically killed.

Today, it might seem, we are of course a long way from the social-Darwinist horrors of Nazi Germany. After all, we live in liberal democracies, where all citizens are, theoretically at least, supposed to be afforded equal rights. Moreover, much funding goes into ‘helping’ the differently-minded, and there are thriving industries designed to aid or spread ‘awareness’ regarding all the currently identified neurodiverse conditions.

Nonetheless, this rosy picture has recently been challenged by Sally Phillips’ provocative documentary on Down’s syndrome, a genetic condition often portrayed as a tragic divergence from proper human functioning. As Phillips, who has a son with Down’s syndrome, points out, despite it being the case that people with Down’s can, with the right environment, live good, flourishing lives, it is now the case that up to 100% of foetuses with the condition are aborted in some European countries. Moreover, although Phillips does not mention it, there are recent cases in which Down’s children have been systematically killed by doctors, who somewhat bizarrely couch it in terms of ‘after-birth abortion’ rather than murder.

In line with the current abortion of those with Down’s syndrome, much research in regards to other neurodiverse conditions such as autism and dyslexia is also geared precisely towards finding genetic markers. And, although this is not yet fully feasible, a large part of the driving force behind this is the hope that more neurodiverse foetuses can be aborted, and more differences eradicated – with part of the outcome being that humanity becomes ever more singular, homogenous, and fashioned in line with contemporary norms.

For those of us who value neurological difference and diversity, this seems scarily similar to the Nazi eugenics programme that sought to eradicate the differently-minded almost a century ago. Of course, there are important differences: current practices are, clearly, far less brutal. And most of those involved have, I think, the very best of intentions. But the underlying ideology has striking similarities: one doctor Phillips interviews, in an echo of the notion that they are ‘useless eaters’, characterised those with Down’s syndrome not as persons, but rather as a mere ‘burden that lasts for a long time’. Moreover, the final outcome is the same: the eradication of the differently-minded from the human race.

Many viewers, rightly, found such language outrageous. Nonetheless, writers in papers such as the Guardian and New Statesman have accused Phillips’ anti-eugenics position of being both classist and against women’s rights. On the one hand, they note, Phillips is a wealthy, middle-class woman who can afford to provide for all her son’s needs. And so, they argue, it is easy for her to preach about bringing up a disabled son; a poor, working-class single mother, by contrast, would have a much harder time. Given this, Phillips’ worries are rejected, with the implication being that there is nothing wrong with the systematic eradication of neurological difference.

On the other hand, Phillips’ critics have also, perhaps more worryingly, accused her of undermining the rights of women to have an abortion as and when they see fit. The idea behind this worry is that women are merely given the opportunity for prenatal screening, coupled with impartial information, by organisations such as the NHS, and that if they then decide to abort based on this information, then they have chosen rightly. Any attempt to question such practices is seen as being at least covertly pro-life; and much has been made of the fact that Phillips, although her arguments are entirely secular, also happens to be a committed Christian.

Unfortunately, however, things are not this simple. In practice, the information given to women in this position systematically highlights all the possible problems associated with parenting a child with Down’s syndrome, but none of the positives. Similarly, other neurodiverse conditions are also routinely represented in this unduly negative way, being constructed via lists of core ‘deficits’ and associated ‘problems’, rather than in a more balanced manner. This systematic prejudice in terms of representation, under the guise of being impartial, already pushes potential parents towards devaluing and fearing neurodiverse life – and so it is no wonder that aborting neurodiverse foetuses comes to seem, as eugenicist Richard Dawkins recently put it, not just acceptable but ‘very civilised’.

Although this Darwinian line of reasoning might make intuitive sense to many people, when it comes to neurodiverse conditions, a deeper problem is that the science itself on which such representations are based is often hugely biased. Almost without exception, researchers tend to drive their experiments and interpret their data in light of what neurodiversity activist Nick Walker has dubbed the ‘pathology paradigm’. But, far from being scientifically established, the pathology paradigm is, according to Walker, a fundamental set of unsupported normative assumptions regarding what should count as ‘normal’ for human beings – and it drives research and data in such a way that constructs any deviation from this ideological norm as inherently pathological.

Take the case of autism, for example. As one leading researcher interested in the possibility of implicit bias pervading autism research found, characteristics that would be described as ‘strengths’ in anyone else are routinely represented as mere ‘compensations’ in autistic persons, whilst any differences noted in autistic brain wiring were automatically deemed deficient – with the alternative possibility that they were merely different not even considered. In other words, scientists representing autism tend to interpret their data via a lens that already presupposes it is inherently pathological, which in turn leads the results of the studies to echo this underlying presupposition.

Perhaps the most destructive aspect of the knowledge-production regarding neurodiverse problems is that it routinely gives the impression that the issues associated with these conditions indicate some inherent and natural disposition towards them – rather than the problems stemming, at least in part, from how society is structured in a profoundly ableist manner. For example, statistics indicating that many people with some label or another tend to have high unemployment rates give the impression that they are inherently unable to work, rather than it being the case, say, that workplaces and interview processes routinely discriminate against minority ways of neurological functioning.

In fact, then, the real problem Phillips highlights with current eugenic practice is not with the women who choose to abort. Phillips herself is explicitly pro-choice, as she should be. Rather, the worry is this: currently, people are subtly (and sometimes, not so subtly) pushed, throughout their lives, into believing that all of the neurologically different are inherently ‘disordered’, ‘lacking’, and ‘deficient’ to the point of being unworthy of life. This ideological manipulation actually restricts choice – making the possibility of bringing up neurodiverse children unthinkable – all the while giving the illusion that individual autonomy is greater than it once was.

The issue, then, is more in line with the equally problematic sex-selection in India and China, where female foetuses are, despite laws forbidding this, routinely aborted due to patriarchal norms and social structures. Just as we can criticise these patriarchal norms and social structures without being pro-life, so too can we remain pro-choice whilst criticising hegemonic social-Darwinist ideology and the pathology paradigm it is bound up with. Thus, far from being incompatible with being pro-choice, Phillips’ position, which is based on challenging the forces that restrict choice (i.e. which make the abortion of difference seem necessary), complements it, by challenging unjust social structures and opening up new existential possibilities for mothers-to-be.

As to the deeper intersection between feminist and neurodiverse commitments, it is also worth stressing how our conceptions of neurological differences are themselves tied up with dominant gender norms at any given time. Similarly, it is not hard to see how the physical characteristics associated with Down’s syndrome fall outside what is currently considered aesthetically desirable for either males or females. In each case, those who fall outside currently dominant gender norms, often bound up with wider economic ideology, become pathologised as disordered and in turn targeted for extermination – not, perhaps, because of anything inherent to the conditions, but rather because gender norms lead to those characteristics associated with the conditions being at least temporarily devalued.

A similar point can be made in regards to class struggles. Very significantly, for many neurodiverse persons, their class gravely affects how society responds to their differences. The working-class girl with psychotic tendencies may be more likely to be pathologised as being on the schizophrenic spectrum, and to end up as a psychiatric inpatient; whilst the middle-class girl with similar tendencies may be more likely to be branded a disorganised creative, and helped towards becoming a playwright or poet. Similarly, the middle-class “eccentric” male, fetishized as a genius, may have been deemed merely to have a “mild” autistic “disorder” had he been from a less affluent background. Consider director Tim Burton, who identifies with the label “Asperger’s syndrome”: whilst he can clearly flourish as a celebrated, creative, middle-class eccentric, those with similar eccentricities from a more working-class background are routinely socially excluded and in turn pathologised as being inherently sick.

Rather than criticising Phillips for failing to mention the utterly obvious fact that it is a lot harder to bring up a disabled child if you are a working-class single mother – and inferring that this means we should silence any attempt to challenge eugenicist ideology – we could alternatively consider how class and neurodiverse interests also overlap. That is, we might think that combating class-oppression positively includes a commitment to better public services for neurodiverse children, so that the existential possibility of bringing up neurodiverse children is more open to those who are less affluent, and not just the rich. Indeed, it seems to me that being committed to the emancipation of the working-class makes opening up this choice vital, since it is precisely due to oppressive classist and ableist ideology and social structures that the possibility of bringing up disabled children is currently reserved for the very few.

In contrast to the claims of Phillips’ critics, then, it seems to me that the commitments of the differently-minded in the neurological sense, and the differently-minded in the political sense, overlap. If this is the case, then political freedom, as Rosa Luxemburg indicated, is indeed, I think, bound up with the freedom of the differently-minded – just as freedom of the differently-minded is likewise bound up with the freedom of other oppressed minorities globally. The upshot is that the anti-eugenicist neurodiversity movement and other emancipatory movements actually intersect – and to overlook or deny this, from any angle, hurts everyone.