One of the first images of AI I encountered was a white, spectral, hostile, disembodied head. It was in the computer game Neuromancer, programmed by Troy Miles and based on William Gibson’s cyberpunk novel. Other people may have first encountered HAL 9000 from Stanley Kubrick’s 2001: A Space Odyssey or Samantha from Spike Jonze’s Her.
Images from pop culture influence people’s impressions of AI, but culture has an even more profound relationship to it. If there’s one thing to take away from this article, it is the idea that AI systems are not objective machines, but instead based in human culture: our values, norms, preferences and behaviours in society. These aspects of our culture are reflected in how systems are engineered. So instead of trying to decide whether AI systems are objectively good or bad for society, we need to design them to reflect the ethically positive culture we truly want.
Here’s an example: Roger Dannenberg, a professor at Carnegie Mellon University in Pittsburgh, has created an AI system that plays music with people. It accompanies performers based on ideas of pitch, scale, tempo and so on – what could be called facets of western music theory. In contrast, the composer and scholar George E Lewis, coming from a tradition based in the African diaspora – jazz and other traditions, as nurtured by Chicago’s Association for the Advancement of Creative Musicians – has created a system called Voyager that is a “nonhierarchical, interactive musical environment that privileges improvisation”. The outcomes are very different. Dannenberg’s system produces output that is effective at following a conventional performer – it sounds like it hits the expected notes. Lewis’s system, in contrast, generates surprises in dialogue with a performer, sometimes taking a solo, sometimes laying out while the human solos – neither the human nor the machine dominates.
Voyager’s sound is somewhere between that of the avant-garde jazz great Sun Ra and a Javanese gamelan ensemble. The values of each system are based on the cultures the musicians and programmers draw from – measurable control v improvised collaboration – making for important differences in the output of each system.
Nowadays, AI systems have roles in generating text and images, diagnosing disease and even in autonomous weapons systems. Such systems can be made to follow humans, or to creatively support them; we can treat the relationship between machine and human as one of command and control – or as a collaboration, figuring out what people do best and what the computer does best so that they can work fruitfully together. Both approaches have their place – but I feel the latter approach is currently less widely known and adopted. Creating AI systems based in many different cultures, with proper ethical guidelines, can help to rectify this.
People can intentionally design computing systems with the values and worldviews we want. For instance, the MIT Center for Advanced Virtuality, which I founded and direct, has built simulations such as Breakbeat Narratives, a collaboration with the Universal Hip Hop Museum and Microsoft. The system combines our centre’s technologies, characters created with the comic book artists Black Kirby, Microsoft conversational AI and archival music content from the TunesMap Educational Foundation to teach hip-hop history according to the user’s musical tastes and interests. If you like roots music and are interested in female hip-hop artists, for instance, you can get a short documentary on the self-representation of women in hip-hop, with a soundtrack of hip-hop songs that sample country and western and bluegrass.
I also had the pleasure of collaborating with the war photojournalist and VR artist Karim Ben Khelifa on a project he directed called The Enemy, which enabled viewers to hear, journalistically, the perspectives of combatants on both sides of conflicts in the Democratic Republic of the Congo, El Salvador and Gaza – while the experience was customised based on users’ body language, as a proxy for their potential biases and attentiveness.
We build AI and computing systems for creative expression, learning and the social good by meaningfully customising stories and experiences for the people using them. There is a great opportunity for AI to have a positive social impact through such design – but to realise it, the field will need to become more interdisciplinary, valuing the aims and insights of the arts, humanities and social sciences.
I’m not promoting a utopian view of AI. You’ve probably heard about recent approaches such as “deep learning” and “large language models”, and about systems such as Dall·E 2 and GPT-4. People have been using them for many purposes: gamers create characters for Dungeons & Dragons, attorneys craft legal motions.
Such systems are built on neural networks, using approaches such as deep learning and large language models. It is hard for humans to interpret exactly why they output the particular images or text that they do: the system’s “reasons” are a pattern of finely tuned statistical values and numerical weights. When processes are opaque, and driven by large datasets that are themselves based on cultural values, it is possible for unfair biases and other social ills to find their way into the systems.
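To make that opacity concrete, here is a minimal sketch in Python – a toy classifier written for illustration, not any of the systems discussed above, and vastly smaller than a real deep-learning model. Even at this scale, once the network is trained, everything it “knows” lives in grids of tuned numbers that resist human reading.

```python
# A toy neural network (illustrative sketch only): two clusters of points,
# a single hidden layer, trained by gradient descent. The point is that
# the trained system's "reasons" are just arrays of adjusted weights.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: two clusters of 2-D points, labelled 0 and 1.
X = np.vstack([rng.normal(-1, 0.5, (50, 2)), rng.normal(1, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# One hidden layer of eight units; the weights begin as random numbers.
W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)              # hidden-layer activations
    p = 1 / (1 + np.exp(-(h @ W2 + b2)))  # probability of class 1
    return h, p.ravel()

# Train by gradient descent on the cross-entropy loss.
for step in range(2000):
    h, p = forward(X)
    g_out = (p - y)[:, None] / len(X)      # gradient at the output
    g_hid = (g_out @ W2.T) * (1 - h ** 2)  # backpropagated through tanh
    W2 -= 0.5 * h.T @ g_out
    b2 -= 0.5 * g_out.sum(0)
    W1 -= 0.5 * X.T @ g_hid
    b1 -= 0.5 * g_hid.sum(0)

# The classifier now works, but its "explanation" is only this:
print(W1)  # a 2x8 grid of finely tuned weights with no human-readable meaning
```

Scale those few dozen weights up to billions, trained on datasets scraped from human culture, and the interpretability problem the paragraph above describes comes into view.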
We need to be aware of, and thoughtfully design, the cultural values that AI is based on. With care, we can build systems based on multiple worldviews – and address key ethical issues in design, such as transparency and intelligibility. AI does offer extraordinary creative opportunities, but creators need to do sociocultural work that is at least as hard as the technical engineering of these systems – and perhaps harder. There are times when a command-and-control paradigm is appropriate in computing. With AI, however, we should more often see jazz-like opportunities for creative, collaborative improvisation.
D Fox Harrell is professor of digital media, computing and artificial intelligence at the Massachusetts Institute of Technology