


"POSIWID" - 5 new articles

  1. Just About Managing
  2. The Purpose of Surveillance
  3. Data and the Genome
  4. Algorithmic Intuition - Gaydar
  5. ChatGPT and the Defecating Duck

Just About Managing

In the UK from 2013 to 2016 there were old Etonians in three important posts: Prime Minister, Mayor of London, and Archbishop of Canterbury. Justin Welby had spent much of his life in the oil industry, and had been a bishop for just over a year when appointed to the top job in the Church of England. Initially praised for his crisp, business-like approach, and expected to drive improvements across the Anglican congregation, he was nonetheless judged by some critics to have achieved very little, reduced to bland words and sleight of hand.

When you spoke to him, you sensed he was a CEO who had mentally allocated you five minutes before passing on to the next matter to be dealt with. That is agenda-driven episcopacy, rather than a listening episcopacy. You can’t run a church with a handbook full of business buzzwords. Pepinster

Mr Welby has undoubtedly seen it as a big part of his job to hold together very different factions within the Church of England and, even more difficult, in the wider global Church, the Anglican Communion of 85 million people. ... He has expended a huge amount of energy in this endeavour of finding common ground through 12 years during which there has been other momentous social change, and at times has shown himself to be an astute political operator. Maqbool

But his failure to tackle the safeguarding issue properly has damaged the Church and brought an end to his tenure. Martyn Percy argues that the safeguarding measures that Welby oversaw are ill-thought-out and arbitrarily enforced, and deter the sort of volunteers on whom the church has traditionally relied for local good works.

And Ian Paul thought Welby was a poor leader overall. Justin managed to make enemies of every single group. He made enemies of liberals by talking about evangelism. He made enemies of evangelicals by talking about sexuality. He made enemies of conservatives by talking about new forms of church.

So much for his management skills then.



Andrew Anthony, The Church of England is beset by shame and division. Can it survive? (Observer 17 November 2024)

Stephen Bates, Just About Managing (The Tablet, 16 March 2017) (Note: link is to the archived page because of trojan warning on live page)

Stephen Bates, Justin Welby: why archbishop chosen for his managerial skills had to go (Guardian 12 November 2024)

Aleem Maqbool, Church at precarious moment after Welby resignation (BBC News, 13 November 2024)

Catherine Pepinster, Why did Justin Welby fall so tragically short? Because he was preoccupied with efficiency, not listening (Guardian 13 November 2024)

Harriet Sherwood, The C of E’s CEO: how will history judge Justin Welby’s tenure as archbishop of Canterbury? (Guardian 13 November 2024)



Wikipedia: Old Etonians

   

The Purpose of Surveillance

While surveillance has been a recurring topic on this blog, the technological environment has developed significantly over the past twenty years.

Once upon a time, the only form of real-time surveillance was the so-called closed circuit system (CCTV), providing a dedicated watcher with a view of what was going on at that moment. These systems now generally include a recording function, often operate retrospectively, and feed into an open-ended ecosystem of discipline-and-punish. As I noted in May 2008, the purpose of CCTV had extended from monitoring to include deterrence and penalty, and in the process it had ceased to be closed circuit in the original sense.

Fiction has provided some alternative models of surveillance and control. As well as Fritz Lang's 1960 film The Thousand Eyes of Dr Mabuse, there are the Palantíri in Tolkien's Lord of the Rings, which are indestructible stones or crystal balls enabling events to be seen from afar.

The data company Palantir, whose founders included Alex Karp and Peter Thiel, was originally established to provide big data analytics to the intelligence community. Geoff Shullenberger suggests that Palantir might be understood as an application of the ideas of Leo Strauss (who inspired Thiel): an enterprise that acknowledges the deep, dangerous undercurrent of human violence and harnesses the reams of data generated by the internet to monitor and control it. Meanwhile Moira Weigel notes the contribution of Adorno (who inspired Karp): Adorno’s jargon anticipates the software tools Palantir would develop. By tracing the rhetorical patterns that constitute jargon in literary language, Karp argues that he can reveal otherwise hidden identities and affinities—and the drive to commit violence that lies latent in them.

Geoff Shullenberger, The Intellectual Origins of Surveillance Tech (Outsider Theory, 17 July 2020)

Moira Weigel, Palantir goes to the Frankfurt School (Boundary2, 10 July 2020)

Related posts: Surveillance and its Effects (May 2005), What's in a Name - CCTV (May 2008), As Shepherds Watched (April 2024)

Surveillance@DemandingChange, Surveillance@POSIWID

   

Data and the Genome

The word data comes from the Latin meaning that which is given. So one might think it is entirely appropriate to use the word for our DNA, given to us by our parents, thanks to millions of years of evolution. DNA is often described as a genetic code; the word code either refers to the way biological information is represented in the molecular structure of chromosomes, or to the way these chromosomes can be understood as a set of instructions for building a biological entity. Watson and Crick used the word code in their 1953 Nature article.

However, when people talk about the human genome, they are often referring to a non-biological representation in some artificial datastore. In other words, given by biology to data science.

Shannon E French objects to talking about data stored on DNA like it’s some kind of memory stick, and Abeba Birhane sees this as part of the current trend that is so determined to present AI as human-like at all costs, in which describing humans in machinic terms has become normalised.

Elsewhere, Abeba Birhane is known for her strong critique of AI. As well as important ethical issues (algorithmic bias, digital colonialism, accountability, exploitation/expropriation), she has also raised concerns about the false promise of AI hype.

But describing humans (or other biological entities) in machinic terms, or treating them as instruments, is far older than AI. When we replace animals with technical devices (canaries, carrier pigeons, horses), the substitution implies that the animals had been treated as devices, the replacement often justified by the argument that technical devices are cheaper, more efficient, or more reliable, or don't require regular breaks - or are simply more modern. Conversely, when scientists try to repurpose DNA as a data storage mechanism, this also seems to mean treating biology in instrumental terms.

But arguably what is stored or encoded in the DNA - whether in its original biological manifestation or more recent exercises in bioengineering - is still data, regardless of how or for whom it is used.



Abeba Birhane, Atoosa Kasirzadeh, David Leslie and Sandra Wachter, Science in the age of large language models (Nature Reviews Physics, Volume 5, May 2023, 277–280)

Abeba Birhane and Deborah Raji, ChatGPT, Galactica and the Progress Trap (Wired, 9 December 2022)

Grace Browne, AI is steeped in Big Tech's 'Digital Colonialism' (Wired, 25 May 2023)

J.D. Watson and F.H.C. Crick, Genetical Implications of the Structure of Deoxyribonucleic Acid (Nature, 30 May 1953)

Related posts: Naive Epistemology (July 2020), Limitations of Machine Learning (July 2020), Mapping out the entire world of objects (July 2020), Lie Detectors at Airports (April 2022), Algorithmic Intuition (November 2023)

   

Algorithmic Intuition - Gaydar

When my friend A was still going out with women, other friends would sometimes ask if he was gay. An intuitive ability to guess the sexuality of other people is known as gaydar. There have been studies that appear to provide evidence that both humans and computers possess such an ability, although the reliability of this evidence has been challenged. For example, some of these studies have relied on images posted on dating sites, but images that have been crafted and selected for dating purposes may already reflect how a person of a given sexuality wishes to present themselves in that specific context, and may not reflect how the person looks in other contexts.

The latest study claims to assess sexuality from brain waves. This has been criticized as gross and irresponsible (Rae Walker) and as unscientific (Abeba Birhane), continuing a debate that had started with other methods of algorithmic gaydar.

More generally, there is considerable disquiet about computers attempting to segment people in this way. For a start, there are many parts of the world where homosexuality leads not only to social disapproval and harassment, but also to criminal penalties and sanctions. Even though the algorithms may be inaccurate, they might be used to discriminate against people, or trigger homophobic actions. Whether someone actually is gay or is a false positive is almost beside the point here; either way the algorithmic gaydar may result in individual suffering.

Furthermore, these algorithms appear to want to colonize aspects of subjectivity, of the subject's identity.

  • WyssBernard: I’m not going to accept a machine determination as to what I identify as.
  • Abeba Birhane: just let people be or let people identify their own sexuality

In an interview with the editor of Wired, Yuval Noah Harari wonders whether an algorithm might have guessed he was gay before he realised it himself. And if an algorithm had been the source of this wisdom about himself, would this not have been incredibly deflating for the ego?

And Lawrence Scott describes how his Facebook timeline started to be invaded by images of attractive men, suggesting that the algorithm had somehow profiled him as being particularly susceptible to these images.


to be continued




Isobel Cockerell, Facial recognition systems decide your gender for you. Activists say it needs to stop (Codastory, 12 April 2021)

Isobel Cockerell, Researchers say their AI can detect sexuality. Critics say it’s dangerous (Codastory, 13 July 2023)

Lawrence Scott, Hell is Ourselves (The New Atlantis #68, Spring 2022, pp. 65-72)

Nicholas Thompson, When Tech Knows You Better Than You Know Yourself (Wired, 4 October 2018)

Wikipedia: Gaydar

   

ChatGPT and the Defecating Duck

For dog owners, the intelligence of dogs shows itself (among other things) in their ability to learn tricks. For cat owners, the intelligence of cats shows itself (among other things) in their disdain for learning tricks. 

When Alan Turing conceived of a way to tell computers and humans apart, now known as the Turing Test, he called it the Imitation Game. His first example was to ask a computer to write poetry - specifically a sonnet on the subject of the Forth Bridge. And his idea of a plausible answer for the computer was to say: Count me out on this one. I never could write poetry.

No doubt many people have tested ChatGPT with exactly the same question. When Jessica Riskin tried it, she was not impressed by its efforts. She found Turing’s imaginary machine’s answer (Turing imitating a machine imitating a human) infinitely more persuasive (as an indicator of intelligence) than ChatGPT’s. Turing’s imagined intelligent machine gives off an unmistakable aura of individual personhood, even of charm.

An earlier article by Professor Riskin described a mechanical automaton that attracted large admiring crowds in 18th century Paris. This was a generative pretrained transformer in the shape of a duck, which appeared to convert pellets of food into pellets of excrement. The inventor is careful to say that he wants to show, not just a machine, but a process. But he is equally careful to say that this process is only a partial imitation.

Whereas ChatGPT's bad imitation of poetry is real shit.



Jessica Riskin, The Defecating Duck, or, The Ambiguous Origins of Artificial Life (Critical Inquiry, 2003)

Jessica Riskin, A Sort of Buzzing Inside My Head (New York Review of Books, 25 June 2023)

Alan Turing, Computing Machinery and Intelligence (Mind 1950)

   
