
I have resisted diving into the world of AI. Part of me was overwhelmed, unsure where to begin, but the other part of me felt like I’d rather remain blissfully naïve. AI is one of those topics that everyone seems to have very strong opinions on, but no real sense of the whole landscape. I like clay. Clay is tangible, infinitely variable, but also knowable. AI, on the other hand, feels highly abstract, emergent, and unknowable. With AI, I have the sense that I’m standing on the shoreline of an uncharted body of water, complete with currents, riptides, and waves, not to mention whatever else is beneath the surface. I have been uncertain where to jump in and how to navigate if I do.
A week ago my willful and uneasy yet blissful ignorance was disrupted when I signed up for my first coding class. As part of my work on the Creative Uncommons project, I am considering designing an app, and I found that to work from a creative place where I can apply my post-capitalist values, I would need to learn how to write and edit code.
So, I have gone from having my back to the ocean, to jumping off a nearby cliff, headfirst into the water.
I had no idea just how interwoven coding in general, and AI in particular, are in everyday life. Think about how often you google something, for example. Or follow the suggestion of the “algorithm” for direction, ideas, leisure, or research. Or maybe you use ChatGPT or Google Translate to generate texts. Whatever the case, you are likely more entwined with AI than you think. I, for one, did not appreciate all the ways in which my life is already well into the water. Not only that, but everything I put out into the digital landscape shapes AI.
One of the very big things I am learning is that to be of any use, AI depends on vast amounts of high-quality data for analysis. Within this emergent field, software is being developed mostly by private companies with the funds to harvest data (through “web-crawling,” for example) to feed into whatever models, protocols, and applications they have designed.
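To give a sense of how simple that harvesting can be, here is a rough sketch of a web crawler in Python. The starting address (example.com) and the libraries it leans on (requests and BeautifulSoup) are illustrative choices on my part, not a description of any particular company's pipeline; real systems are vastly larger, but the basic loop is the same: fetch a page, keep its text, follow its links, repeat.

```python
# A minimal sketch of how a web crawler gathers text into a dataset.
# The starting URL and the libraries used here are illustrative only.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def crawl(start_url, max_pages=5):
    """Fetch pages, save their visible text, and follow links to new pages."""
    to_visit = [start_url]
    seen = set()
    dataset = []  # each entry: (url, extracted text)

    while to_visit and len(dataset) < max_pages:
        url = to_visit.pop(0)
        if url in seen:
            continue
        seen.add(url)

        try:
            page = requests.get(url, timeout=10)
        except requests.RequestException:
            continue  # skip pages that fail to load

        soup = BeautifulSoup(page.text, "html.parser")
        dataset.append((url, soup.get_text(separator=" ", strip=True)))

        # Queue every link on the page, turning relative links into full URLs.
        for link in soup.find_all("a", href=True):
            to_visit.append(urljoin(url, link["href"]))

    return dataset

if __name__ == "__main__":
    for url, text in crawl("https://example.com"):
        print(url, "->", len(text), "characters collected")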
And this is where I think artists, writers, curators, and cultural organizations need to be at least alert to, and ideally proactive in, the conversations around AI. As cultural producers it is important to understand that everything we put out into the digital world is being fed into this emergent body. And, while technically we could opt out of digital production, at this point in the game, that is opting into erasure. Also, I am not one to totally write off AI as a negative tool. It is just that: a tool. For AI to be useful and meaningful, it needs our data, but how it is mined, what it is used for, and how our work is represented are all critically important questions. And what changes around copyright and my “moral right,” not only in terms of ownership but also in terms of civic duty? If everything I digitize is consumed by a web crawler and reassembled into a dataset that I have no control over, have I lost ownership of my work?
One other thing I am learning is not to be afraid. AI is an emergent field and, yes, it feels nebulous and unknowable, but the same dynamics are still at work. There are still people with money and power making decisions that affect many people. So while the medium is emergent, the structure remains the same old story. Who owns what data, and how that data is being accessed and shared, are questions we all need to be asking, especially those of us with high stakes in data output, like artists.
One week in, I do not have my bearings, but I do have a combination of curiosity and urgency. I should have taken swimming lessons long ago. I think everyone with a device should. We are already in the deep end.
