Micaela
Hello, everyone. Welcome. It’s so nice that you could join us tonight for this very special conversation. On behalf of More Art, I want to welcome you, and I want to thank the Brooklyn Public Library for so graciously hosting this conversation. It’s been truly inspiring, as you know, to work with Stephanie for a little while now on this project. I hope all of you had a chance to experience the project downstairs. If you can, please contribute your stories, because obviously the more stories there are, the more effective the work is, and the more it can be representative of the complex and diverse communities that we are. It’s also a pleasure to announce that the project, which was supposed to close tonight, has instead been extended through November 2nd. So there is another month. Yes, it’s really excellent news. Tonight, I also have the pleasure of welcoming back Nzinga Simmons, who was our Curatorial Fellow over the summer, and it’s been so wonderful to work with her. She is a PhD candidate at Duke University and came back to New York City specifically to moderate this panel conversation.
It’s with great pleasure that I turn it over to Nzinga. Thank you so much.
Nzinga Simmons
Yeah, thank you. It’s so nice to be back and be back in conversation with you, Stephanie, and to get to meet you, Olivia, and you, Idris. Welcome, everyone. Thank you for joining us today for this conversation. I’m honored to be joined by these three panelists. First, we have Olivia, a Caribbean-American information worker and documentarian from Queens. Her practice in audiovisual and software preservation informs artistic research into the cybernetics of secrecy, power, and place. She is a deaf doula, a mathematician-in-training, as well as a teacher and student at the School for Poetic Computation. Current and previous collaborators include Performance Space New York, NEW INC, Rhizome, The Kitchen, Pioneer Works, Bel Moon Productions, and NEON. Her work has been featured in The Atlantic, Documentary magazine, Letterboxd, Movienotebook, Them, and Refinery29. I was first introduced to her work on the acclaimed documentary Seeking Mavis Beacon, where she worked with director Jazmin Jones; Stephanie also appears in the documentary, and I encourage you all to go watch it. It’s available on Amazon and Hulu. We’re also joined by Stephanie Dinkins, the artist behind the installation If We Don’t, Who Will?, whose work is on view downstairs on the Plaza and is open until November 2nd, as Micaela shared.
She’s a transdisciplinary artist whose work intersects emerging technologies and our future histories. Her art practice is deeply committed to creating platforms for dialogue about AI. Dinkins leverages technology and storytelling to challenge and re-imagine narratives surrounding Black and Brown communities. Through her installations, objects, and community-based projects, Dinkins seeks not only to question the current paradigms of AI development but also to forge paths toward more equitable and inclusive technological futures. Her work emphasizes the importance of incorporating diverse voices and perspectives into the design and application of AI systems, advocating for a future where technology uplifts and amplifies the narratives of the global majority. And last but not least is Idris Brewster, a Brooklyn-born artist and creative technologist who disrupts traditional narratives through spatial experiences, all while empowering others to do the same. Idris’s work explores the liminal space between the historical archive, public space, and technology. Idris is the executive director of Kinfolk Foundation, an augmented reality archive that puts the power of monument-making and historical preservation into the hands of Black and Brown communities. Before Kinfolk, Idris worked for Google, developing an educational program called Code Next that exposes Black and Brown youth to the world of computer science, giving them the tools to build their own futures.
Let’s give it up for our panelists. I really appreciate you guys all being here. I guess I just want to open it up very broadly and ask, what was your first introduction or first engagement with technology? I’m talking early, early first experiences, and how do you feel like that has shaped your trajectory into your current practice? When I say early, I’m talking, I guess I can share. My first was coding HTML in my MySpace About Me at 10 years old, trying to make it as glittery and pretty and watermark-free as possible. So, yeah, whoever wants to start can hop in.
Olivia
I can start. I started computer programming a bit precociously, at 9 or 10. My father drove for the MTA. He’s a Grenadian immigrant who lived in Brooklyn, and he would talk to me a lot about machines. He had dyslexia, so he couldn’t pass. I think he wanted to be an aviator, but he wasn’t really going to pass his exam; his handwriting, I still can’t read it. But he would talk to me a lot about machines. When there was something that needed fixing in the house, I was his only daughter, and he would call me over to watch him fix an outlet. The things that would stereotypically be things you teach your son, household electronics, were the things he would open up, like the back of the remote, and show me. I was the type of child who would ask follow-up questions and who was really receptive to that. And with computers, I got a lot of dopamine from him telling me, Oh, good job. So when I got into computers, he was super into that, because he liked Doctor Who and all of it, and Star Trek and stuff.
And so he was like, Oh, my gosh, my daughter’s a coder. And I was like, Yeah. This was also when the kids-learn-to-code movement of the early 2010s was really vicious, and there was all this nonprofit funding specifically targeting young children. But also, there was this idea of: Girls, learn to code, close the gender gap in technology, et cetera. The reason why girls aren’t in these spaces is because it needs to be pink. It has nothing to do with structural misogyny in the workplace. No, just make it pink. There was also a really welcoming community of adults that, instead of what I think 10 years earlier would have been, Log off the computer, do something else… I think a couple of years after I decided this was something I wanted to pursue, Hidden Figures came out. It was the perfect storm of: you are a little girl and you want to learn what to do with computers; we’re going to fill your life with affirming imagery around that. I feel actually very lucky.
Stephanie Dinkins
I guess I’ll keep it just running down the line. Hi, everyone, and thank you for waiting. I think I’m going to age myself. My first is probably CB radios: breaker, breaker, one-nine. I don’t know if anybody else got to share in that. When I was a kid, instead of computers, there was a moment when CB radios were a really big thing. You could talk to truckers on your CB radio, and I would talk… I had a pinball machine, and I would talk to the truckers as they were going by. And it’s interesting, because you’re asking me about technology, and I’m like, Oh, I think this is about far-reaching communication. Because after that, I remember distinctly being at a residency in Valencia, in Spain. We were out in a monastery in the middle of nowhere, and I had my shortwave radio, right? You could listen to the world through this radio; you could listen to Russia. Everything was coming at me.
Olivia
I’m studying for my ham radio license right now.
Stephanie Dinkins
Are you really? That is so cool, right? And Olivia, you’re making me think about these things. My very first car, right? And I think this is technology, too. You know, I got the book and changed my own brakes, and people were like, Are you crazy? I’m like, Well, if I can read, I guess I can do it. And so I did it. So I love what your father was doing. It’s like, just show the thing. And I guess it all comes down to tinkering and wanting to reach out to other folks. That’s the draw of technologies for me.
Idris Brewster
Yeah, I think for me, it was a lot about video games and play. I got a computer very early on, and I spent sleepless nights in my room just playing and ended up learning. I took a class in early middle school on how to make video games, and I started getting obsessed with that. But then, I guess, sixth, seventh grade hit, and that was when Facebook started to come around, and AIM, instant messenger, was a thing. So then a lot of my time in my computer days was really trying to communicate with folks and my friends remotely. I think that took up a lot of my world. And even then, Facebook at that time felt like a local community of sorts. It wasn’t what it is today, this broad projection to the world. It was intimate spaces for me to talk with my friends. That was really a lot of the obsession. Then trying to get my first Razr, BlackBerry, all of these little phones, was the pinnacle of technology in my mind at that moment in time, because I wanted to be the cool person that has all these different devices and can communicate with my friends.
That was something that started my journey into technology, that combined with trying to learn how to make technology. Then in high school, I learned about algorithmic art, which took me in a completely different direction. I spent a lot of time coding these algorithms to make paintings, which, wow, with AI now, is completely different and a lot easier to do. But a lot of the tinkering for me in high school was figuring out how to use computers to start to make visual art. That was a really big obsession of mine, through the use of algorithms, which-
Olivia
Yeah, that’s really interesting that your high school had an algorithmic art class.
Idris Brewster
They didn’t. They had a graphic design class. Then, through my explorations outside, I started learning… I think it was Processing, toward my senior year, where I started to really delve into algorithmic art. But no, they did not have that.
Olivia
The first time I saw Dan Shiffman in the flesh as an adult… it was like Sunday cartoons, it’s like Coding Train on YouTube. And then I became an adult, and you just walk around Greenwich Village, and it’s like, Oh, NYU is right there. Hi. In a way that so punctured the fourth wall. I was like, You’re the Bob Ross of algorithmic art. What are you doing walking down SoHo? And he’s like, I live here. And I was like, right.
Idris Brewster
I still remember the first time I met Dan Shiffman as well. I was like, Whoa, I’m walking amongst the gods. It was great.
Olivia
I was at the Eyeo festival in Minneapolis because I was a student volunteer, and he was like, Does anybody want to get on stage? And I was like, Me.
Nzinga Simmons
Wow. I feel like all of your responses really show how your introductions to technology came from a space of creativity and play and experimentation and curiosity. I feel like the technological landscape has shifted, particularly in this current moment. What does it mean for you to be making and creating and using technology now, and making that your practice, in this political moment, this rise of misinformation, the age of big data analytics, this contemporary moment that we’re living in?
Olivia
I guess we can keep going down the line. It’s definitely, I think, pulled me into a sharper sense of history and thinking about, Okay, what is computational art really about? What artistic questions are we asking when we make work with technology, about technology, or about society? Especially in the current political moment, there’s so much to be fascinated by, and there’s so much to get swept into. But it’s also really easy to become an agent of empire without realizing it, and to make art that feels more like a tech demo for carceral technologies instead of art that interrogates, or art that inspires, or art that makes you see things differently. And it makes me think a lot about the history of algorithmic art, about art that uses computing, and about algorithmic art that predates computing, like the Oulipo poets in France, who were doing this rule-based poetry, opening the dictionary to a random page or constructing large corpuses of work based on a discrete set of rules, and invoking ideas like randomness, like chance, like non-duality, all of these things that exist in the space of computing, that also exist in the space of society, that you can really dig deep into.
It’s interesting to see how we can approach the space like a craft, and what happens when AI tools get more and more embedded in that artistic process. What is the art about, now that we’ve diluted the human-computer interaction into a space that abstracts power away? I’ve been thinking about that a lot in terms of the current moment.
Idris Brewster
I have a lot more questions than I have answers at this moment in time. It’s a little bit of a crisis in my head: how do you navigate being a technologist, and is there any ethical consumption of technology? This has really cemented for me that technology is political. It always has been, but especially now, and how do we navigate that? I think it’s been tough, not tough, but there’s a lot of pressure on understanding: what is that future with technology? How can we actually imagine a space where technology is a tool for liberation? That is a question that keeps coming up for me, and I’m not sure I have an answer. And that is where it’s drawn me: on one hand, away from using technology in my daily life, trying to integrate myself with people, with nature, outside, and my sources of inspiration, moving away from technology. But then also drawing further into community, because I know I’m not the only one that has these questions. And now more than ever, especially under fascism, we need to come together in maybe more analog environments to really have critical conversations, and to spread those conversations among others, because folks in our community are definitely worried, they’re scared, and they have fear of the environment that we are in.
And so I’m trying to figure out ways to be more transparent, to pull back the curtain to really see what’s going on, and doing that with others is where I’ve been drawn in this moment. Because with the powers that exist, with the technological platforms that exist, I’m really at a crisis point of: do I abandon them? Do I form my own? Do I work with others? How do we navigate this? I think that’s a question we need to ask together, but that’s a lot of what’s been on my mind about how to move forward. And I’m definitely interested in Stephanie’s answer or thoughts.
Stephanie Dinkins
I think I hover between the two of you, between this craft and questioning. Because if I think of it, I think of it as part of a continuum, and the continuum goes on, whether it’s a technology or not, whether people turn toward it or not. And so then the question becomes, for me, well, what does it mean to survive and thrive? How does that work? And how do we do it together? For me, that becomes the idea of engagement, because if you don’t engage, then there’s no way to do anything about it or with it. The idea of crafting a technology and playing with it in that way so that you try, or at least I try, to divert it from its regular flow. And thinking about: the world is on fire. Okay. The world has been on fire before. True. But we didn’t have the same technologies. I’m always thinking about, well, what is it that I could do? Let’s say I, as a maker, could do with this technology in this moment. How much do I stick my head out? How much do I go underground and just try to do?
How much do I question? I was thinking about Reagan today, because this moment is like that one. I remember that, so here we go again. When I was going to school, it became a big question what I could do because of that political moment. And the idea of having to just live the moment and do the best you can through the moment is real. I feel like that’s where we are again: having to live the moment, having to do the best we can in the moment, using all the tools at our disposal and not turning our backs on the moment. That would be my thing. But how you do that is, whoa. The question of the how is a huge one. But I always turn to action. It’s like, well, what act, no matter where it is, can I do?
Nzinga Simmons
Well, something that has come to mind while you were responding is the title of the installation, If We Don’t, Who Will?, and how there’s this sense of responsibility. Do you imagine that you have a responsibility in the way that you use technology, and particularly also in your practice, in the way that you conceptualize or frame technology for an audience? Because I think a lot of times technology falls into this category of neutrality, where people think, Oh, it is a machine, so it’s unbiased. It’s not laden with our cultural and racial and all of our preconceived notions. And I guess I wonder, do you feel a responsibility? Or even in the title, I guess I should ask you, Stephanie, let’s just start there. When you were titling the installation If We Don’t, Who Will?, what were you imagining your responsibility to be? And then Olivia and Idris, how do you imagine your responsibility in framing and using technology?
Stephanie Dinkins
Sure. I think that art should do work in the world in some way, shape, or form. And not only should the art do work, because how many people are actually going to see the art? If the title can do work, then I feel like I’ve done something, because that’s going to travel so much further than the number of people who actually see the thing or engage the thing, let’s say. And so for me, I carry a big responsibility for what it is and then how it acts in the world, both in its passive state, the title, and its active state, the thing. And I want people to go to the questions. I think one of the things that we do really well in this society, especially American society, is consume, and passively consume. And the question is, well, how do we not just consume? I can’t tell you. It’s nice to hear your stories, right? Because often, when I teach, I’ll talk to people and I’ll ask them about, say, their phone, and they don’t dig into what it can do. I’m like, do you understand how powerful this little thing that you are carrying is?
Oh, sorry. Thank you. Do you understand how powerful this thing that you are carrying is if you put it to use, do something with it, do more than what is expected of you with it? And then, if you do put those things to work, what does that do for you? I will never forget being in Senegal, at this little museum. And the guy… So it was all international. There were English speakers, Japanese speakers, German speakers, like five different languages. This one man from Senegal had a flip phone, and on his flip phone he had Google Translate. He could run around to us; he was there to serve us. But he could go and ask anybody what they needed in their language. It took a minute, right? And it made a stream of help, yes. But I’m thinking about: what is the economic stream that that created for his family? Because he went the extra step to figure out, Oh, I can make a space with this thing. It’s not that extraordinary, but I can use it really well and make it help others, make it help me. And that was amazing to me, right? Because it’s like, Oh, it’s something we have.
And so I’m always wondering, Well, what are the things that we have? And then how can we engage them? And how can we engage them slightly differently? And then how can we put that power in other people’s hands to some extent?
Olivia
Your question makes me think a lot about our revolutionary ancestor, Assata Shakur, may she rest in power. One of my earliest encounters with revolutionary thought as a child was listening to and seeing primary-source, archival footage of her speaking to a group of children and saying, “it is our duty to fight for our people, it is our duty to win”. And I think about that in my relationship to technology as an information worker a lot, in terms of, one, this idea of code in a martial context and as a means of asymmetric warfare, whether it’s information warfare or actual physical warfare. And also through how we understand the Internet as a martial technology, as a technology that was developed by government agencies for the specific… using technologies that were developed by using Southeast Asia as a laboratory for very cruel and inhumane methods of both surveillance and war. I think about that origin a lot. So much of how our network society is built, the physical infrastructure of the Internet, is built to resist nuclear fallout: if one section of the Internet is destroyed, command and control can still be preserved through another.
And broadcast messages can still persist despite intense fallout between nations. And it’s really tempting, and you kind of hear this narrative a lot when people talk critically of technology, to say, yeah, the Internet was built by the government to do X, Y, Z, and that’s why it’s bad. And it’s like, that is why it is powerful, and our enemies are powerful. I feel very intensely, as I grow up in this imperial data environment, the need to not fear power, and to also understand: okay, the Internet is a martial technology, but for which army? Can it also be a technology for a people’s army, for a group of people who are trying to build networked power together? Earlier this year, and in the past couple of years and months since working on Seeking Mavis Beacon, I ended up falling down a crazy cybersecurity rabbit hole in tandem with archival research. I’m thinking a lot about ideas of secrecy. I taught a class at the School for Poetic Computation called Advanced Secret Keeping, partially out of just this sense of existential dread: if we cannot keep secrets from the dominant culture, it will kill us. And so also thinking about what a gift it is that there are actual ways to keep a secret that don’t require anyone to suspend belief, methods of encryption that someone can trust with their whole bodies.
This idea of, no, I promise you, the way that you’ve encrypted your message, it would take a computer a longer time than the Earth has to decrypt it with our current technologies. These types of mathematical assurances that are possible because of advanced… And even thinking about what mathematics is, in terms of just this God-shaped hole of apples falling down from trees. These types of universal ideas, that one plus one equals two, and you stack it up in a way that also results in an algebraically proven idea of safety and security for people who otherwise have to trust systems and services that have all these other ideas in mind. I guess, and I’ve done the whole wind-around now, how I see my role as an information worker and cultural worker is to cultivate revolutionary awareness in people, to understand where power is and when it’s in their hands, and the things they can do that are outside of their wildest dreams in that way.
Idris Brewster
I think for me, one of the original callings for my work came after I read about Saidiya Hartman’s critical fabulation, which basically says it’s the artist’s duty to fill in these gaps in history. That has led me on a journey of how the imagination can serve our knowledge and our memory. And that also has brought me down a spiral: this erasure of our narratives isn’t just in the past, it’s in the present, in the information systems that exist now. And what is the duty of critical fabulation in this technological time? I think there’s a lot of, as you said, that awareness play; folks being aware of how that erasure is taking place is super important. That’s where I feel my duty is. There are not as many critical conversations in the public sphere around technology as I think there should be. And that awareness that I don’t see too many folks having around their phone, and how that reflects power, is where we, one, need to have more critical conversations. But two, our access to information about the past is really being severed, and that is also not allowing us to understand the systems that are in place around us.
And so I think for us, a duty is to invoke and really reflect those systems, or how they’re failing, rather. And I think that’s a big thing, and technology plays a really large role. I mean, a lot of people are sucked into this loop of using our phones, which are extracting from us. How can we pull folks away from that loop and that cycle? It’s something I see a lot of people, including myself, struggling with. How can we root ourselves in knowledge of the past? For us, it’s building bridges to that, but also building bridges for folks to contribute to knowledge of that past. I think for me, my work at Kinfolk is: how can we create enough pathways for people to access and also contribute to our information systems in a way that is not extractive and doesn’t put them at risk? Because I’m also… I’m very dejected. The concept of a public technology in a public digital space is really concerning for me. As we continue to evolve, I’m having less and less faith and hope in that, and I’m more interested in more encrypted, more local systems of information sharing and engineering.
That’s sort of where my mind is going, and what I’m interested in as technology for liberation. If we don’t build it, who will? That comes back to it: that’s where the crisis comes in, because I’m severely questioning, but I also severely need to build, and build in community with others, because the pace at which this stuff is advancing, with AI, is just exponential, and even beyond exponential at this moment in time. And so I worry for us. I worry for myself. I worry for my community, for folks who rely on these systems, like WhatsApp and Instagram, to communicate. My whole family’s genealogical history is on a Facebook group right now. And that’s where it’s like, we don’t need to keep contorting these systems to work for us. We understand that, but then how do we move forward from it? I feel like it’s my duty, and also the duty of all of us around here, to really figure out, or at least contribute to conversations about, how that becomes a reality. Because this AI bubble is going to burst, and it is not this impenetrable thing, this reality that we have to live with; it’s going to require convening community to understand how to move forward.
But I don’t think it’s impossible. I do feel a duty to figure out or at least think about how we do that and involve others in that conversation as well.
Stephanie Dinkins
Yeah. Can I interject for one second? Of course. You’re both making me think of an argument that I’m having right now, which is an argument with language and reason, because I feel like that’s the first algorithm. Many of us don’t even question that level: the language we use, how we use this language, what it’s really saying, and how it’s putting logics into effect that we’re not even trying to question. Before we even get to the AI, it’s like, well, you all are complaining about this, but you never question the system that you are in, and you reinforce it. How are we going to start to pull at that? For me, that means that’s preparing for any technology, because then we’re starting to deal with what is underneath the things. Now, how we communicate after this, how we pull apart the language and start to use it, and not let it automatically apply the systems that we live within and function in right now, is a question for me. But I’m having this huge argument with reason and logics and the way that we organize things by default, even before I get to anything plugged in.
Olivia
A hundred percent. One of the biggest things that I find very annoying about the AI conversation is the fact that I’m forced to call it artificial intelligence every time. The idea that large language models are something that maps neatly onto what a human brain is, to me, is so reductive. Neurobiologists are screaming about how much that is not true. And the way we just accept these phrases, deep learning and avatars, and use all this language to shove this idea of life inside of a machine that is built using the lives of real people, is something that really bothers me. Having to use the same terms and participate in building that reality with language, on terms that I don’t agree with, or even this idea of intelligence. And it’s possible, I think. I think there is a type of intelligence that is created by building models that play with language at that scale or at smaller scales. But it’s an inferential logic. There’s no there there. And I think there are a lot of cool things that could be done with large language models on a smaller scale, that maybe don’t require a data center the size of a small nation or any of these things.
But on a level, every time I get to speak to a computer scientist who studies machine learning or anything like that, and I’m like, So how do you feel? And they’re like, Well, whenever someone says something about how this doesn’t really work, they get fired.
Nzinga Simmons
I was just having this conversation before the panel started, and I was like, I hate AI, secretly. But then I was like, Wait, no, I take that back. I actually hate large language models, and there’s a difference; this idea of how we conceptualize what intelligence is can be shifted. And I guess that opens up my next question: how do you think machines fail to understand Black culture, expression, nuance, memory, and how are the projects that you work on intervening in that? And then also, I guess for my own personal interest, do you think that Black people have a particular relationship with technology, considering that technology, zoomed out, is just a tool? At the genesis of the United States, we were brought here as tools, and so there’s a certain slippage there between-
Olivia
I do actually really think that.
In particular, I think that the relationship to technology, in terms of… I feel a political affinity to robots, because I do think that part of the reason why this country is preparing itself so quickly and irresponsibly to become dependent on large data centers is because we have built a fundamentally broken economy that needs an excess of free labor to survive. And if you hadn’t human-trafficked my ancestors to this side of the hemisphere, that probably wouldn’t be the problem that the nation has. But we’ve never actually recovered from that type of excess and that type of empire.
Nzinga Simmons
Just more of the expectation. Yeah.
Olivia
Even the California Gold Rush and this Manifest Destiny. You see these echoes of: we ran out of space on the continent, and so we moved into cyberspace. And then you get all of these dot-com companies with spooky names like Amazon and Internet Explorer and Safari. It starts to get really fucking weird, actually. And I do think that plantation logistical infrastructure is so deeply embedded in even how we look at information infrastructure, how things are arranged, what our grocery stores look like. Everything is in boxes. Not to give everyone zoochosis, but it’s genuinely… That is something that I think about very actively, in the sense of: you guys just can’t quit slavery, not even for a second. You just won’t pay anyone for labor.
Stephanie Dinkins
It’s interesting, because it’s the build of an economy and the way things are built. And then I think, okay, so this is where we are. I’m always like, okay, where are we and what do we do with it? And my approach, or my thought, is always that, yes, there is a way in and there is a… what kind of attachment? Like a way that Blackness relates to the technology in a sense. But my idea is that this technology, and in fact the nation, needs us so badly. What we know, the things that won’t be recognized about what we know. So I’m so interested in these things that are invisible, intangible, indescribable in many ways, that hold things together. One of the reasons I do, or have done, what I do is because I think that my family knows things that this nation needs to know. Like needs to know, not like it would be nice, but if they could actually hear it, actually take in the information, there’s a lot to learn. And then I can extrapolate that out to all the unseen people, because there are so many unseen people that we just deny very quickly without exploring.
And in fact, we tell them they don’t know what they know, because, I’m going to say the we, the big we, we don’t understand it, which means it can’t exist, right? It shouldn’t exist. And so for me, then it becomes, well, this is where this gift thing comes from, right? How do you gift a system something so that it can hold you? And in holding you, it holds the greater sense of a society, right? That’s the weird dream in my head. Now, whether that works or not, I don’t know. But I’m willing to give it some time and energy to try. And I’m convinced there are many things that many of us know, where we’re considered not to know, or to be in service of, as opposed to being the thing that could inform in ways it would never, ever get to otherwise, which is necessary. What was your question?
Olivia
No, you answered one of the questions, the second question. The first question was, are there-
Nzinga Simmons
How do machines fail to understand Black culture, expression, nuance, memory, et cetera?
Stephanie Dinkins
I think they fail to understand because they’re trained not to understand anything. It’s those things that aren’t recognized, which is why I think we have to train them back in, in a way. But that’s giving away a lot. You can see I’m going to equivocate, because I’m like, Oh, I think we should do this. And then I’m like, Oh, no, I don’t think we should do this at all. And I do this to myself all the time, because it’s a hard equation to square. But I also don’t see the world we get to live in going well without some gifts to that landscape, right?
Nzinga Simmons
We can’t opt out entirely.
Olivia
I think that culture is so- sorry.
Idris Brewster
But do they deserve to know? I guess it’s a question for me because I’m-
Nzinga Simmons
Do they deserve our gifts?
Idris Brewster
I understand what you’re saying, but it’s like, Oh, my gosh, we’re gifting it in service of what? That’s what’s in my head. Sometimes it isn’t a mutually beneficial exchange. That’s, I guess, the issue I see. I see a lot of Tech for Good, AI for Good projects.
Stephanie Dinkins
Not this garbage.
Idris Brewster
Not yet. Sorry. No, I’m just saying.
Stephanie Dinkins
You know what I mean? That feels very Pacific tech. All the civic tech. It’s like, no. Do they deserve it? I don’t think they deserve it, but I think I deserve it.
Idris Brewster
I think you deserve it.
Stephanie Dinkins
And without us doing it, it just won’t happen. That much I know. Yeah, no, they don’t deserve it.
Idris Brewster
They don’t.
Stephanie Dinkins
However, we are still living. We are. And we are here. And that becomes the, well, what do we deserve? And how do we make that thing for me? Because we could take our cookies and go. But what happens once you run with your cookies? I don’t know. I’ve tried to live in other places in the world. They just will not hold me in the way that this place will hold me. So then it becomes, well, how do I make this place into more? And I’ve tried. I’ve tried to find the place, but I wind up back every time.
Olivia
One thing about your question: I feel like a lot of Black cultural expression is very, and I mean this in the actual meaning of the word, not in the sense that it’s trivial, decorative. There’s a level of ornamentation, and celebration of ornament, and celebration of flourish, and glitter, and glamor, and ostentatiousness. And I think a lot of how capitalist logic informs how we build information systems collapses that, because it’s all about minimizing and maximizing, and clearing noise out of signal, and making things very readable, categorizable, sortable, accountable in that way. And so often, so much of… you’ll see all these memes on the Internet about how Black people are able to talk to each other and say the same phrase. There’s a meme that I’m specifically thinking of about Black New Yorkers: if you say the phrase “you good?” with different intonations, it means something completely different. And that is the kind of minutiae where, should you actually try to encode that social dynamic, you’d need a system on a quantum level that has all of these will-they, won’t-they states encoded in it. It’s a system that actually can’t really fully explain itself in a binary thinking model.
Nzinga Simmons
And I feel like when that culture then merges with technology or technological systems, it becomes beautiful. You think about Black Twitter, hip hop.
Olivia
It creates these gorgeous glitches. The thing that we perceive as gorgeous about it is the way it breaks the form. It is the record scratch and the sweat and the thing that shows that actually this is something that can only barely be held inside of its medium.
Stephanie Dinkins
I just did a project related to this, where we encoded stories into DNA, and then they were decoded back. And it’s so imperfect, but it is so beautiful, because it does this mathematical stuff, and it comes out like: nine, five, divided by 10, hallelujah, your chorus, blackness explores. And then it just keeps going, and you’re just like, oh my God, this is incredible. And I think it’s some of that stuff that you’re talking about, right? Those things that rub in that way, or that flow.
Nzinga Simmons
It’s also the questions, the prompts in the container. They’re so poetic, and they prompt you to share your experiences in excess. And then when you see the image on the outside of the container, you’re like, Okay, how does my story fit into it? But it’s this chaotic… It’s like jazz. Yeah. It’s a collision of all of these beautiful forms and things that come together to make something wholly new.
Idris Brewster
I think these conversations become a lot more important when machines are starting to, or at least people are trying to get them to, understand on a philosophical, fundamental level. I was in Paris last week at this AI convening, which had 15 computer scientists, 15 humanists, and then me. So I was just pulling my hair out. I was like, you all both don’t understand. But one of the interesting conversations was about white space: AI largely erases all of the white space in poetry, and it has no understanding of the importance that poets put into that white space, that blank, negative space. Machines do not really have a great way to understand that white space and that negative space. That is interesting to me, and it is also a metaphor for Black life within these knowledge systems: that white space, that missing space. We can account for everything that’s there, but what about the stuff that’s not there, and the inference needed to make conclusions or understand that? That’s fascinating to me, and I think it is part of the problem that we’re dealing with, of straight logic, facts, ground truth, data.
What about the in between?
Stephanie Dinkins
Well, this is something that happens to me every time I do a project. I’m like, No, I do not want it to fit the system that exists. You need to make your system accept the information that is given. And people just look at me like I have 19 heads. How are we going to do that? That’s not the way it works. It’s like, I understand that’s not the way it works, but there’s so much missing. And this goes right back to language. When I first started using future histories, everybody would correct it. They’d automatically just… It’s like, no, I put the S there on purpose. Put it back. Oh, no, but that… No, no. People are constantly correcting my stuff, and I just keep putting it back, constantly. Because it goes against the grain of what is acceptable, or the rules of language. Or I like to think of Grammarly, because the way that Grammarly polices language is very interesting: it’s just hard rules. There’s not a lot of the nuance, the spectrum of things. And if we can do anything, it’s like, if the inclusion of the spectrum is possible instead of the binary, that would be amazing.
Olivia
And it’s so interesting, because before Grammarly, people used to be taught grammar by people who were taught grammar by people. And so the way that grammar would shift and expand through that telephone chain of people teaching each other would expand with how society changes. But instead, we have an algorithm that informs how people see what’s correct and what’s not correct grammar, instead of people making decisions based on a wealth of human history and cultural understanding and stuff. The conversation you were having in Paris really annoys me because-
Idris Brewster
Yeah, it was so frustrating.
Olivia
Well, it’s frustrating on so many levels, but something that’s been really bothering me about artificial intelligence, and the way it’s risen in our conversations about technology, is that I feel like people are forgetting about other kinds of technology we’ve already invented, in deference to making a large language model do something that it doesn’t really need to try and do. People will ask ChatGPT to do math equations, like Wolfram Alpha wasn’t there to help you on your homework for decades? Or this idea that computers can’t understand white space in poetry, except we’ve been doing computer vision for a while now, actually.
We’ve been doing OCR. There actually are other types of non-AI technologies that aren’t asking for an entire ship’s worth of data from other people to do these kinds of calculations and create data that could be interesting to look at and interact with. But for some reason, there’s this arms race and data fetishism happening in the AI space, where so many people will hear the question, So what do you think about technology?, and start telling me stuff about AI. And it’s like, that’s not what I asked you.
Nzinga Simmons
I feel like, for the first time in the history of the advancement of technology, the research around emerging AI technology is not being done by people in the academy who are doing it from their own curiosities. It’s now being pushed forward by all of these money-hungry Silicon Valley tech organizations. And so all of the resources are going toward research around large language models, because they’re so predictive. And when something is predictive, it helps you predict market trends and helps you gain capital in ways that perhaps other forms of AI wouldn’t be as lucrative in doing. That’s just my…
Olivia
I really agree with you. Well, I don’t know. I’m curious what you guys think about this, because lately, this week in particular, I’ve been raising my eyebrow. Oftentimes, in an analysis of critical technology or our current data ecosystem, people will tell you that, yeah, data is the new oil. Data is the most lucrative thing you can collect about a person. And it’s like, how much of that is true just because they keep saying it? We’ve created a culture, and it’s like, how do we… The idea that you can exchange information for money is actually a little ridiculous. I would like to exchange goods in exchange for other goods.
Nzinga Simmons
Intangible things for money.
Olivia
And it’s like, why are we in this space where the ruling class is just able to push piles of money around in circles without actually doing any real work for or to each other, just telling each other stuff? Do we remember what information is, that we can just tell each other stuff? My brain has been breaking around that idea of: is data the new oil? Why is data the new oil? How can we take that power away from data? Is that possible? What’s going on?
Nzinga Simmons
Yeah, and the discernment of what is good data versus what is not good data, I feel like, is that being looked at? But to widen the conversation back up, I want to ask you, and also I want to turn to our audience after this final question: What do you think a just technological future looks like, and what principles do you think should guide its design and its application? It doesn’t have to be about AI, but how do you think we can move the needle forward technologically in ways that feel more community-oriented, more ethical?
Idris Brewster
For one thing, to try to build not-for-profit; I think the profit motive clouds a lot of the judgment around our systems. There’s very little building for how it actually affects and helps people, which is why I definitely believe that… I don’t have an answer to your question. I think that’s what I’m trying to figure out. But building that’s rooted in actual humanity and people and our own benefit is something that, weirdly enough, isn’t really being prioritized or funded in general. I mean, I presented a project at this Paris thing that was a small language model dataset application, and I was literally told, yeah, this doesn’t help large language models. I was like, that’s not what I’m trying to do. And they were like, well, if you want to get funding, you really should shift this to think about how you can use large language models and push it forward. And I’m just like, the money is controlling a lot of the technologies and what we build. I think that problem is fundamental to figure out, but it’s a hard one.
Stephanie Dinkins
Yeah, I would agree with that. I think it’s really interesting, and I think it’s about people and what we do as well. I can’t tell you whether it’s profit or nonprofit. I’ve been in many a room that is nonprofit space, but everybody’s got their heads so far up the funders’ butts. And the funders change their minds in a minute. So then you turn toward the next idea, and the next idea, chasing the funding instead of doing the thing. I don’t know what it is either, but I think we need to center… I don’t even want to say people. I want to say we need to center entities. We need to consider entities across the board and their value, and then figure out a way to back off of capital a little bit. Because capital just bends and corrupts things so badly. I can’t tell you how disappointed I often am, because you know what it is to be in these rooms. Theoretically, they’re beautiful rooms, rooms where you should get work done. And work hardly ever gets done, because of the structures. And so I’m always trying to think about, well, what and how do we shift those structures to be more like the unit here?
How do we have a dynamic conversation and send that back out and seed it and see where it goes, and then act upon it? And I’m not going to say I’m totally against capital, because you need some capital to do the things. This is the thing. And then once you have the capital, what do you do with it? It’s been an interesting ride to have some money and then try to push money to other people so they can do a little something, to make it possible for people to do a little something, but it’s a little something. But how many drops in the bucket fill the bucket? I did a project a long time ago, and somebody told me, One, one, full basket. And that was our grandmother’s thing: you put one by one and you get a full basket. And it’s like, how do we resist going toward the mean and do the thing that really is working, whether it’s for you personally, but always considering society, because I don’t think you personally survive otherwise. Yeah. Right? Yeah. Even if you’re doing the best possible, we have to support the rest so that we can have a raft to be on.
Yeah, I don’t know. It’s one of these places I could spin forever. I do spin forever. But I think gatherings and conversations and seeding have been interesting. And if we could make that really like the ocean, versus your occasional drop in it, it would be interesting. What are you thinking, Olivia?
Olivia
I think everything you guys are saying… The more life shuttles along, I used to feel like… You know that phrase? I don’t even remember who said it. I think it was Toni Cade Bambara: the role of the artist is to make the revolution irresistible. I always thought it was the most annoying person you know posting that on the Internet. It’s always the worst person I’ve ever met posting that. But I do feel culture moves much faster than policy, at a rate that’s terrifying. And technology is the thing that is accelerating that culture, because we’re communicating with each other in faster and faster ways. And I think, I don’t remember which of you said it first, or maybe you both said it in different ways, but the real thing is shifting the… figuring out a way to influence culture, to allow people to have the courage to leave these systems of creative economy and the way we reproduce society around these goals of efficiency. How do we actually move our value system away from an economy that rewards productivity and rewards maximizing some idea of profit that is so abstracted and not anything close to how we measure happiness?
How do we generate a culture, using these technologies that are so effective at generating culture, to really heighten that contradiction and bring people to a place of being conscious of how alienating that system is, in a way that gives them the courage to leave those things behind and to say, Okay, what if we didn’t spend money and we just took care of each other? What stuff would we make if the thing we needed to get stuff done wasn’t actually money, but time and resources, real stuff that we can touch and do stuff with? And it’s like, how do we create an abundance of that? And how do we create circular loops where we don’t have to worry about how we’re going to get more of that, and then build societies that work? I don’t know. Yeah.
Stephanie Dinkins
This is why I’m so invested in story, right? Yes. Because we’ve always told stories, and the more we say it and then enact it, the more it can happen. I love to ask people, well, how many of you could ask your neighbors to help you build a house in the next few months? Remember, people used to do this. People used to come together to help their neighbor build a house, and then build another house, and build another house. And now we don’t have that as a narrative. We don’t have that as a… It feels like a distant memory. It’s like, well, how do we tell the story that that’s something that is possible, maybe even expected, in a way? Last thing: I love the governor of my block. In Brooklyn, you get the governor and the mayor of your block. Yeah, I love him. I love him because he’s doing the work every day of stitching together a community through conversation, through making sure people know each other. And it’s that narrative, and the watching of that, that maybe gets somebody else lining up to be the next governor. But then, how do we take those responsibilities on truly?
It’s a responsibility that nobody’s giving you a check for.
Olivia
I had to coach my sister, because she moved into a suburban neighborhood where she and her husband are a two-income household, and she’s like, Yeah, I don’t know. My neighbors don’t need anything from me. How do I go over and be like, Oh, do you need sugar? And it’s like, No, we have sugar. We’re middle class. We have sugar. And I was like, So what you actually are going to have to do is start lying. You have to make your neighbor feel needed, and you have to come up with things for your neighbor to solve. And then you will just go back and forth. You have to be like, Oh, man, I made too much pie. How am I going to eat all of it? You just have to start making up things that you need, and build that neighbor power by extending that and sharing in that. This will be my last point, so you can moderate. But Ursula K. Le Guin has this cool essay where she talks about how a lot of our conceptualization of the first technology is in this mythology of the spear. It’s like, how phallic and annoying is that?
It’s this idea that the spear is an indicator of a civilization. She argues, well, actually, the technology that we should be looking for is the basket, because that is proof of a group of people who are gathering and providing for each other’s needs, and have this technology of gathering, as opposed to this technology of martial war and this idea of, Oh, you’ve killed for your food in this community.
Nzinga Simmons
I love that. Early last year, I went to an exhibition at the Schomburg, Code Switch, and it was a technology show. So I was thinking that there was going to be all of this art that’s very digital or new-media oriented. And there were these ceramic pots. And it was this beautiful story about how pots were the first technology, and enslaved people were making these pots. It really did make me think about what technology is in a different way, because it doesn’t have to be this extractive, exploitative thing. It could be something very community-oriented. But yeah, with that, I would love to open it up to the audience if you have any questions for our wonderful panelists.
Olivia
We’ll give you a round of applause. Don’t be shy.
Audience
Thank you for… I mean, I’m still processing so much of what you said. But because of your backgrounds, you’re at the intersection of philosophy, history, and technology, and as artists and creatives, you also recognize a responsibility in the trajectory you might set forward, integrating and tapping into all these resources and understandings. You’ve asked amazing questions, so I’m trying to take it even further, because there’s just so much to unpack there. But now, once again, I want to come back to you as artists and creatives. What do you think sets you apart, especially when you were in Paris, and the perspective and the nuance that you were able to lend? Because I haven’t been an artist myself, and when I was among philosophers, theologians, and even ethicists, or when I was in a room full of people working in government, getting my MPA, I struggled so much to create that lane and help them recognize how essential and critical the voice of an artist is, because of the way you can set forth possibilities, and conceive of things in ways they cannot.
And so, I mean, I don’t even know how to frame this question, but I think you might understand what I’m trying to get to: do you feel a sense of urgency as things are changing and evolving so exponentially? And where do you see your role in the room itself, even though they might not fully recognize the value you are bringing to the table, because they can’t conceive beyond the utility of technology and how it’s exploitative, and that is what they do because that is all they can conceive of? So, because of your consciousness and your awareness of the prophetic role that artists have historically tended to lend, and what you bring to the table, where do you think you could really help steer things, in the avenues that you are already in? And how do you see things coming together in such a way that you continue to cultivate that respect or influence moving forward, in whatever direction you are sensing? I know it’s a lot to unpack. I’d be curious to hear from all three of you.
Stephanie Dinkins
Do you have a… Go ahead. Well, I don’t… Okay, so I have a lot, because I try to quit doing the work I’m doing, a lot. But then I think that it still needs to be done. So I say, okay, I’m going to do it. I’ve given myself over to the work, whatever that is. But it’s interesting, because you’re making me think of a time that I was in a room like the one Idris is describing, and it was mathematicians, some of the top experts in the world, like 30 people in this room. We were all asked to give a little presentation, so I gave a little presentation, and then I asked them to stop and imagine something and then say it to the room. I stood at the podium and I watched their heads exploding. Their heads were literally exploding, because they did not know how to imagine into a space. And these are the people crafting the technologies that we are all working with. And that scared the crap out of me. Because if you can’t imagine, then how do you make? You can make a technological thing, but you don’t have the capacity to slide around.
In another room, and I think a lot about common languages, right? Because I've also been in rooms where people are having dinner and it's supposed to be light, people from different backgrounds, and the doctor will get up and speak doctorese. And I've actually said, Listen, we are all interested in what you're saying, but can you say that in a language that everyone at this table can speak and understand? And people give me the evil eye. I was like, No, if we're here to speak to each other, we have to be able to use a language we all understand. And I think the other thing is we can't just be these representatives of things. Because I've been the Black person, the woman, the disabled person, all rolled into one, supposed to be filling these slots. You're like, well, what is this? How do we do this weird Noah's Ark crap, right? Instead of really trying to do work. And you can usually see it in a room automatically. You're just like, oh, okay, I see what's going on here. You're trying to tick boxes. You're not trying to do work. And so the question becomes, well, what do you do with that?
Do you show up at all? Do you not show up? I feel like I'm of the generation that is supposed to show up. I don't want them to have to show up. Really, I just don't get it. No, you shouldn't have to show up. I get it. I'm older. There's a different path. I'm willing to show up. But you guys, why? I know the voices need to be in the room, sadly.
Olivia
They haven’t been listening to the voices.
Stephanie Dinkins
I know. Well, this is the thing, right?
Olivia
It really pisses me off that you’re in there and they’re not listening to you.
Stephanie Dinkins
No. And that's the thing, right? Because you'll say it, and it's like, Okay, why did you invite me? Clearly, you thought I was smart or something, or I checked your boxes. Now you don't like how smart I am and what I'm trying to tell you. And that's the listening piece of my work in particular. Because as much as I'm asking for the stories, I'm asking for people to listen to each other, which is something we tend to have trouble doing somehow. Sorry.
Idris Brewster
I mean, yeah, they're not listening. I'm questioning myself all the time. I was like, Why the hell am I even here? Why am I talking to you all and trying to convince you that we're even talking about humanity? I was like, Well, let's honestly understand that at a base level, because I'm not even sure that folks really understand that and what that really means. For me, I was getting imposter syndrome. I was dissociating, not speaking. I also don't want to give that part of me to them, but at the same time, there was a lot of concerning rhetoric and a lot of concerning conversations. At the same time, there was not an ability for anyone to understand what each other was saying. There were a lot of acronyms. There were a lot of phrases that have multiple meanings if we break them down. And there's definitely a language barrier and a space in which folks are not trying to understand; they're trying to talk past each other. And for me, I think it also comes down to an understanding of breaking rules. We need to break the rules.
We need to think of other ways. And the folks I was in the room with, by the way, were all academics. So I was also the one without… I don't even have a PhD, and that was already concerning for me. But they were just finely attuned to the system they were working within. Even when I presented ideas outside of those bounds, they were rejected in certain ways. And on top of that, the concerning thing was also the hunger and thirst for data, data, data. It was like, Oh, I need millions and millions and millions. It's like a kink. Yeah. They were really getting excited, to say it a certain way, about the amount of data that was available. And then when I told folks that I'm working with the Baldwin family and their data, the light was shining in their eyes. I was like, All right, that's what I really need to retract here, because I can't give that to them. I was like, You know what? Folks in that room were just machines looking for data, and that was very concerning. And that's where I was like, now I need to figure out how we can protect our data, because folks are not only using it, they're thirsty for it.
And do I want these machines to know about us or talk about us? Not in the form that they currently do. I don't know if you've ever asked ChatGPT to talk to you in Patois, but it is very weird. And that's also not the future I want, when our data is contributed to these systems and then they can mimic us in certain ways. That's just an evolving form of digital colonialism. I am very concerned about how to move forward. But I forgot your question, so yeah.
Olivia
One thing that I feel really excited about: I've been very privileged to work with, study, and teach about making art with information, as an educator and also as a documentarian, because it's strange. Science and industry don't listen to artists and humanistic thinkers and the humanities when we're physically in the same room. But there's something about mass media, like Dr. Strangelove, Neuromancer, all of this science fiction, the way that Hollywood manufactured consent for the Internet of Things with all of this smart-home science fiction, readers, and all of this craziness. There's this idea, Stephanie, of what you're saying in terms of people not really being able to listen to their imagination. And so we have this science fiction where they'll say, Don't build this. And they'll be like, Oh, what if we… Let's get it. It'll be these grimdark horrors of people trapped in their smart homes. And then you'll see, 30 years later, someone bursts into flames inside of a Cybertruck. And it's like, we told you not to build the thing. And I'm very curious, I think, in that way of…
I am not necessarily thirsty to be in the same rooms as the people who are developing these industries and pushing them out. I'm very interested in being a part of the media diet that they are forced to reckon with. And I'm very curious about how to create a culture that is hostile to the absolute abominations they create. How do I make Gen Alpha feel affirmed in calling robots clankers in the most disgusted tone of voice they can? How do I… I don't condone the use of this word, but as someone who was raised in New York, I've heard it a lot growing up, and I have overheard young children, middle schoolers, use it. I taught middle school in a public school in Queens a couple of months ago, and I overheard these girls referring to a classmate who they saw using ChatGPT to do an assignment as the R-word. And I was like, That's not kind, but it's also very interesting to me in terms of what the actual word, in the sense of mental retardation, means, and what we have actual evidence of active AI use and active chatbot use doing to people's brains, how they think, and their ability to make new connections unassisted.
It's very rude, but something very bad is happening to these kids. Yeah.
Nzinga Simmons
Like you said, I don't condone the verbiage, but I hope that they maintain that adversarial position towards things like ChatGPT, because I have taught an undergraduate class at Duke, and I feel like it's exactly the opposite. With the students in my class, it's almost like they think they're smarter if they're using it, because they're somehow getting away with it, getting over on it, or preserving their own labor in ways that are tactful and sneaky. I'm like, wow. So I love that those kids were like, Wait, no.
Olivia
It's like, no, you are less smart. I want to contribute to and glamorize a culture where people who build technologies like this can't have sex. I want you to be the most uncool, because that's the thing: that's what everybody wants at the end of the day. It's social capital. They're exchanging money, they're exchanging their labor. They want to feel their position in their community is valuable because they're a tech bro, or because they're in government, or because they're XYZ. What if I told you you were actually unlovable? What if every message in culture was telling you that this way of being is deeply unhip, that you are a figment of the past and everyone will forget you? I'm curious about creating that, instead of trying to have micro-conversations with people who aren't really interested in what I have to say, especially given the virality of Blackness and Black people. One of the things that was really funny in the Seeking Mavis Beacon documentary process: we were interfacing with a distribution company that was very white-dominated, basically a bunch of white dads. And we had this inside joke about how…
Because this is when I was very young. I was probably, like, 20 or 21 while we were in production. So we'd be having these conversations with the production company, and I had this mini secret power, because we were talking about how there's nothing more embarrassing than a Black teen girl telling a room of executives that they're deeply uncool, especially a group of executives who are trying to make media that goes places. Me being like, wow, that is so deeply unchic, actually. It's lame. Me saying something was lame was a way of protecting the project from going in a bunch of weird directions. I just had to say it was lame once, and it worked.
Nzinga Simmons
Do we have any other questions? Okay. Well, with that, I would like to offer you guys post-panel Prosecco in the corner over there. And that concludes our event. Thank you guys so much for joining us.