It is not easy to define Lauren McCarthy: programmer, artist, activist, teacher. Lauren (Chinese-American, with degrees in both Art and Computer Science from MIT, the Massachusetts Institute of Technology) lives between two worlds, that of human beings and that of technology. She has impersonated Alexa, living in strangers' homes, entertaining them and answering their questions. She has created open-source tools that anyone can use to learn to code. She invents devices to connect people emotionally through technology: in her latest work, on display at her solo show in Shanghai, which opened on September 19, she asked dozens of people to film themselves telling what they felt in the place closest to their heart, without describing it physically or showing it. Her dream? To see the two universes, atoms and bytes, analogue and digital, advance hand in hand, without either overpowering the other. It sounds simple, and yet...
Her solo show You can say 'Reset the room', on view at the Brownie Project space in Shanghai until December 23, brings together dozens of her works. They are all linked to Artificial Intelligence and sound rather nerdy (home automation, digital surveillance, tracking), but she traces them back to the most secret place in each of us: our intimacy, the relationships we have with others, the way we interact with the objects in our homes, with their spaces and with people. It is in this territory of affections, feelings and emotions that all her work unfolds, despite its high-tech look.
What did you learn by working between art and technology, machines and human relationships?
That people understand concepts such as privacy and the data collection that comes with home automation in a rational way, but until they experience these things first-hand they don't really grasp their meaning. It is only by touching the strings of emotion that a different, deeper understanding is triggered, one that makes us think. Raising the subject to this level is the task of art, including in relation to Artificial Intelligence.
Give us an example.
When, in my work entitled Lauren, I took Alexa's place in some families' homes for a period of time, answering their questions, making suggestions, interacting with them, it triggered a wave of concern in many of them. Because it is unsettling to be observed 24 hours a day by a stranger. And they had never considered that, through the device observing them, there are potentially many more strangers watching. It is a sobering thought. I am not against technology, but I believe that awareness is everyone's right.
Why do you focus primarily on the home in your work?
The home is the place that shapes our identity. Until a few years ago, that identity was given by our parents, by the things we found ourselves using, by the choices we made every day to make the home more our own. For centuries, the objects surrounding us depended on us: we were the ones who gave them meaning and purpose within our lives. Now our environments are filled with objects that we consider neutral but that are not neutral at all. They have been programmed by a small group of people who instill in them values that are not necessarily ours. Everyone assumes impartiality when an algorithm or a machine decides, but that is not the case: machine learning repeats patterns fed in from the outside, amplifying the dominant way of thinking.
We have had machines at home for decades, why is there now an urgent need to talk about these issues?
Because until very recently, devices had to be set by us to do something on our behalf: an alarm clock, a washing machine, a toaster. Now, with recent developments in machine learning, they think for us: they detect data (temperature, presence, and so on) and react autonomously, configured by someone else. In effect, they judge our present. Think of when you want to write something and autocorrect changes the word you are using: it does so because it prefers the dominant term, which may not be the one you would have chosen.
For most people, knowing that their data is being collected by a machine is not a problem. "I have nothing to hide," they say.
True. But the problem is that we don't know how our data will be used. In the United States, insurance companies already draw on information about people's lifestyles to calibrate premiums. Everyone knows that social media profiles are scanned before deciding whether to hire a person or offer them a place at a prestigious university. And then there are those who do have something to hide: activists, researchers, people seeking political asylum. Even if you do not feel in danger on an individual level, it is not right to ignore the collective dimension.
There are many discussions on this issue in the Covid era. What do you think about tracking for public health reasons?
Many people believe it is the right approach, and I do too, because I am not against technology, quite the contrary. Tracking would be acceptable with the guarantee that it ends once the emergency is over. It is a problem of trust, because to date few organizations, institutional or private, would give up such precious data.
You are not against technology. So what do you propose?
I wish there were equality before technology. I would like everyone to have access to this kind of knowledge, so that they can then decide consciously. That is why, years ago, I created a program that allows anyone to learn to code, and it is still widely used today. Understanding the problems, including through emotions, as I do with my works, leads to action and, I hope, to a collective movement that halts the development of technology where necessary. Home automation can give us so much, and we discovered this during the lockdowns: company, entertainment, contact with others. I don't believe in a dystopian future but in building a different relationship with machines: one that is more conscious and informed. We still have time to stop.
If I were a big tech company, I would come to artists like you and finance your work. If that happened, what would you do?
Each case is different. It would depend on the freedom of expression that the offer of support allowed. Tech companies used to support artists; now it seems to me that luxury companies do it more. Perhaps the tech companies think that flooding the world with gadgets is enough to help civilization develop. But it is not so.