Just Tech: Centering Community-Driven Innovation at the Margins episode 1 with Desmond Patton and Mary L. Gray

Headshots of podcast guest Desmond Patton and host Mary Gray side by side and set against a dark purple background. Each headshot is contained within a hexagon shape.

Episode 133 | March 23, 2022

In “Just Tech: Centering Community-Driven Innovation at the Margins,” Senior Principal Researcher Mary L. Gray explores how technology and community intertwine and the role technology can play in supporting community-driven innovation and community-based organizations. Dr. Gray and her team are working to bring computer science, engineering, social science, and community together to boost societal resilience in ongoing work with Project Resolve. She’ll talk with organizers, academics, technology leaders, and activists to understand how to develop tools and frameworks of support alongside members of these communities.

In this episode of the series, Dr. Gray talks with Dr. Desmond Patton, whose work at the intersection of social work, social media, and technology seeks to understand the root of aggression, grief, and trauma in ways that can help inform interventions for social workers and broader communities. Together, they explore Patton’s learnings about the challenges of using AI in a field that’s full of nuance and how informed technology can make positive social impacts in partnership with local communities. Dr. Patton also shares how his work on gang violence has grown his understanding of how social media can influence and transform the narratives about people.

Subscribe to the Microsoft Research Podcast:
iTunes | Email | Android | Spotify | RSS feed

Transcript

[MUSIC PLAYS UNDER DIALOGUE]

MARY GRAY: Welcome to the Microsoft Research Podcast series “Just Tech: Centering Community-Driven Innovation at the Margins.” I’m Mary Gray, a Senior Principal Researcher at our New England lab based in Cambridge, Massachusetts. I use my training as an anthropologist and communication media scholar to study people’s everyday uses of technology. In March 2020, I took everything I’d learned about app-driven services that deliver everything from groceries and tutoring to telehealth to study how a coalition of community-based organizations in North Carolina might develop better tech to deliver basic needs and health support to those hit hardest by the pandemic. Our research together, called Project Resolve, aims to create a new approach to community-driven innovation. Project Resolve asks, “What difference could it make if we built with community organizations that provide direct services for those most in need and made community agendas the driving force and core of innovation?” To answer this call to action, we’re experimenting with bringing computer science, engineering, the social sciences, and community expertise together to accelerate the roles that communities and technologies could play to boost societal resilience. But there’s a larger mission here. We believe that social methods and theories, woven into the fabric of engineering and computing research, offer a new paradigm for responsible computing itself. For this podcast, I’ll be talking with researchers, activists, and nonprofit leaders about the promises and challenges of what it means to build technology with rather than for society.

[MUSIC ENDS]

My guest today is Dr. Desmond Patton. He’s joining us from the Columbia University School of Social Work, where he’s the Associate Dean for Innovation and Academic Affairs and the founding director of the SAFElab. He’s also the co-director of the Justice, Equity, and Technology Lab and a professor of social work. In addition to his numerous positions at the School of Social Work, he is the associate director of Diversity, Equity, and Inclusion. He also co-chairs the Racial Equity Task Force at the Data Science Institute, and he’s the founder of the SIM/ED tech incubator, both at Columbia University. Dr. Patton is a member of the board of directors at the Columbia Center for Technology Management and a faculty associate at the Berkman Klein Center for Internet and Society at Harvard University. Dr. Patton is a groundbreaker and leader in the field of making AI empathetic, culturally sensitive, and more attentive to bias. His research uses virtual reality to educate youth and policymakers about the way social media can be used against them and how race plays a part. His research on grief as a pathway to aggressive communication on Twitter, published originally in Nature, has been featured in The New York Times, the Chicago Tribune, and USA Today, among other outlets, and cited in a brief submitted to the United States Supreme Court examining the interpretation of threats on social media. Dr. Patton, welcome.

Desmond Patton: Hi, Mary, it’s good to be with you.

Mary Gray: Hey, can I just start calling you Desmond now?

Desmond Patton: Of course.

Mary Gray: I’m so excited to have you here, because when I think about people who, years ago, struck me as doing something that was both innovative but also an intervention in machine learning and thinking about what we could do with the social realities that bring offline and online together, your work just inspired me. So—

Desmond Patton: Thank you so much.

Mary Gray: —I mean it’s really profound, because you’ve never let go of the possibility of technology, but you also bring such an important critique. So I want to start us off with talking about the SAFELab. Can you talk a little bit about what is the SAFELab? Let’s start with what does “SAFE” stand for?

Desmond Patton: Yes, you know, it’s interesting because I remember my first year at Michigan wanting to create a research lab that was akin to what I experienced as a graduate student as a way to bring people together, and to just produce rigorous research. And I remember like having this moment of like thinking about the lab name. And I believe that “SAFELab” was supposed to mean “Supporting Aggression-Free Environments.” But at this point, I think, that the actual name has kind of morphed into its own thing. And so I don’t really stick to what, uh, the original idea was. But the idea around the SAFELab has always been to really create an interdisciplinary space to study issues of youth, gun, and gang violence. Um, I’m a qualitative researcher. I’m a social worker who is also a social scientist. And I wanted to get tenure, I wanted to produce high-quality research, and I wanted to have some fun doing it. So that’s the purpose of the lab.

Mary Gray: So I mean, check and check. [LAUGHTER] You definitely made an amazing space. Can you talk a bit about why making it interdisciplinary, and maybe why making it fun was important to the work of the SAFELab?

Desmond Patton: So I study violence, and people have been studying violence from different angles and perspectives for many, many years. Um, as a student at the University of Chicago, I took sociology classes, psychology classes, law classes, anthropology classes, public health courses—all with the idea of being able to merge disciplinary perspectives to understand the root causes of violence, and to think about different ways of intervening in violence. And so it became really clear to me that when I would start my own lab that these perspectives would be really important. I didn’t know that I would be involved in technology—in particular looking towards computer science, and machine learning, and data science—as spaces that I would be involved in, that I would learn from in order to study, um, root causes of violence. But I’ve always known that in order to really study any critical social problem that you need to have representation of opinion, and thought, and idea. And that’s something that I learned as a graduate student at U. Chicago; it became very clear in the study of violence, particularly violence that permeates black and brown communities.

Mary Gray: Can I circle back to that question of fun?

Desmond Patton: Yes. [LAUGHTER]

Mary Gray: You know, so I mean when I think about like the work you do, it’s heavy.

Desmond Patton: Yes.

Mary Gray: Right, I mean it’s hard work. So how do you bring—certainly not levity—but how do you bring some joy, some possibility for an imagined, freeing future to the work that you’re doing?

Desmond Patton: Gosh, it—it is so intentional, and has to be at the forefront of all the things that we’re doing in the lab. As you mentioned, the work is extremely heavy. It is daunting. It is heartbreaking. That has been a part of the lab for the last ten years. However, intentionally we make space to be a community, and so “community-building” is the term that I—you know, it can be kind of interesting. The idea that we are connected, that we’re in this together, that we’re learning from each other is really important. And so having social gatherings with lab members has been really important to just get outside of the lab. Having process groups, right, because a lot of the folks in the lab are looking at or imbibing really hard, um, content, so having a space where they can talk about their feelings, their emotions, what’s coming up for them—anything that might be coming up that’s leaning towards a bias or affecting their interpretation, being able to talk about that authentically and organically. Because oftentimes I feel like—especially now—we’re in a space where it’s challenging to talk about really hard and difficult things. But I think in the space that we’re in, in studying gun violence, we have to be able to articulate the things that are painful and be able to process those things. And we hang out. We go to dinner together. My husband and I bought a house a couple of years ago. We allow the students in the lab to spend the weekend at the house to just kind of get to know each other as fellows in the lab, as friends, as potential colleagues in the field. All of this has been really important. We also think about, like, different ways in which we study the problem. One of the critiques that I have for my discipline of social work is that we’ve been studying the same social problems for many, many years, using the same methodologies. And I think it’s time that we at least open our minds to the extent to which these new tools could be useful or harmful. But at least having the opportunity to play around with them, to study them, to unpack them within this context, has been thrilling and been really exciting for the work we’re doing in the lab.

Mary Gray: Can you describe for listeners some of the projects that you’re working on, and particularly what brought you to thinking about the connection across social media, the experiences of marginalized youth, particularly gang-involved youth, and AI? I mean, for many people they think like, “What do those have to do with each other?” But can you maybe walk us through a project where you’re bringing those together and what it is that is really different about the approach the SAFELab takes?

Desmond Patton: So my interest in this intersection of social media and community-based violence and gun violence is really something that was driven by young people. So as I said earlier, I’m a qualitative researcher who has been studying youth violence, and I came to this tech space as someone that was interested in understanding how young people were making sense of violence in the neighborhood. So my dissertation in Chicago focused on high-achieving black boys and men who were 4.0 students but were also living in what was then deemed to be the most violent neighborhood in Chicago. And I wanted to understand basically how they cognitively geocoded their world, both within their neighborhood and within their school, but what kept coming up in our interviews was how they were doing that online. And it was something that I didn’t expect because at this time, like, social media was gaining prominence, but in my mind, I saw this as being two different spaces. There was a virtual world, and there was a physical world. But what young people were telling me is that actually, no, this is not the case. They would say things like, “Facebook is life.” And I was like, “What does that mean?” And what it meant is that, you know, what you do in your digital world meant a lot for how you would navigate your offline world as well. And so that became really important. I went to the literature, um, to kind of dive deeper. I started to see more examples of this in popular media. There was virtually no empirical literature in this space around this topic. And so with some colleagues from the University of Chicago, we developed the first paper to really define and put some parameters around this phenomenon of social media violence that is happening in communities of color, and we called it “Internet banging.” So this happened around 2012/2013. And I started to investigate it qualitatively. And so I partnered with some folks from the YMCA in Chicago, and we conducted a year-and-a-half-long qualitative study where we were interviewing young boys and men, um, who either self-identified as being gang involved, or formerly gang involved. And we also interviewed outreach workers that were supporting those young men. And we just learned so much around the importance of understanding the digital environment, and that the digital environment really needed to be treated as a neighborhood, as an environmental context. We understood the importance of having young people as translators and communicators in this space. Because there’s a lot we didn’t know about how young people were talking online, how they were expressing themselves online, how they were using text, and images, and emoji, and hashtags, and memes to communicate. There was just so much we didn’t know that we understood that we really needed to work with them; that it—it couldn’t be the situation where we’re just extracting from them, that they needed to be co-partners, um, co-, um, investigators in this work. And so that then led to a series of projects where we hired young people as domain experts in our lab to really help us translate and interpret culture, context, and nuance on social media platforms. And they could apply that expertise and that knowledge to then look into how machine learning and computer vision could help us look at big datasets, in particular large amounts of Twitter data that may tell us something about the relationship between social media and gun violence.
And so, I would say I spent my early years doing those early qualitative studies, and then working with computer scientists and data scientists like Kathy McKeown, Shih-Fu Chang, to then understand the utility of machine learning and computer vision on this particular problem of gun violence. And what we learned is that it was actually very problematic, that these tools that everyone was so excited about and raving about were really harmful. They would misinterpret posts left and right. The most awkward conversation that I had was when I had to talk to my colleagues about the N word. Because the machine learning tools kept identifying the N word as being an aggressive term, and I had to educate them that not necessarily within the black community is the N word perceived to be, um, an aggressive term. And what I’ve appreciated about this work is that computer scientists and data scientists are not afraid of a hard problem. And so even though these tools didn’t really work, we kept pushing, and we kept thinking outside the box to try to figure out, you know, how do we improve accuracy? What is accuracy in this space? What data do we actually need? Um, and how do we make sure that these tools are aware of the biases that are coming into how we’re interpreting social media data? And so we added more data by using deep-learning techniques. We really centered the voices of black and brown community members in those interpretations. We allowed their view of language to be how we viewed the work. We brought in computer vision to look at how images were talking to text, and so forth and so on. So that produced a bunch of experiments where we never really got above 70% accuracy. And you know, that was interesting, because a lot of people would say, “Well, don’t you want the most accurate tool? Don’t you want to get as close to 100% accuracy in being able to identify aggression, and loss, and substance use?” And then the question that I have been really trying to wrestle with is, “So what does it mean to have an accurate tool that is interpreting black speech online? And is that helpful or harmful?” That was really kind of a pause moment for me in the work.
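
To give readers a concrete picture of the kind of text-classification setup being described, here is a minimal, hypothetical sketch of a classifier trained to assign psychosocial labels such as aggression or loss to posts. The label set, example posts, and model choice are illustrative placeholders, not the SAFElab pipeline; and as the conversation makes clear, a setup like this only captures surface wording, not the context and culture that human domain experts bring.

```python
# Hypothetical illustration only: a bag-of-words classifier for psychosocial labels
# like those discussed above. Example posts and labels are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

labeled_posts = [
    ("rest easy lil bro, miss you every day", "loss"),
    ("can't believe he's gone, heart heavy", "loss"),
    ("we gon see about that tonight", "aggression"),
    ("keep my name out ya mouth", "aggression"),
    ("studying for finals, pray for me", "other"),
    ("new kicks came in today", "other"),
]
texts, labels = zip(*labeled_posts)

# Word and bigram counts only: no neighborhood context, no history of the poster,
# no sense of whether a line is a lyric, a joke, or grief. That gap is the point.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(texts, labels)

print(model.predict(["miss you so much bro"]))  # predictions reflect the training labels, not ground truth
```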

Mary Gray: Thinking about the work that you’ve done on natural language processing, NLP, and the focus on text, you also have this deep experience with thinking about images. And I’m wondering just what differences you’re seeing between the work of modeling around text, compared to the work of what it takes to train computer vision to—you know—to see what? See racism, see bias? Like how do you think about the differences between those two modalities?

Desmond Patton: That’s a great question. So we’ve been engaged in multimodal approaches to really see to what extent can we automatically identify these psychosocial concepts of aggression and loss in both text and images. Because the question has been, “Does the image tell the same or different story? Does it complicate the story? Does it help us to understand a phenomenon differently?” What we learned is that, in our deployment of NLP and computer vision tools, that computer vision was better at automatically identifying aggression, because our understanding of what’s aggressive, or threatening, or problematic is more rapidly apparent in our minds. And so when we’re doing bounding boxes around faces and images, you can see a gun, you can see someone’s face. And there was more, uh, synchronicity in automatically identifying aggression in images than in text. We found that text and NLP tools were much better at expressions of loss, because people would use more words to express how they were feeling online. But we’ve been very concerned about how the appearance of someone’s face then also influences how you’re interpreting what’s happening in an image as well. And so one of the things that we’ve thought about is like what is the validity and utility of seeing a face? It could tell you a lot, and it could tell you a lot, right? And so—

Mary Gray: Mm-hmm. Mm-hmm.

Desmond Patton: —we have really tried to play around with that in various ways. When we present our work, we black out faces so that the audience can’t see. But we haven’t done it yet in our analysis work. And I think that this is an important space for us to go into, especially when we think about kind of racial coding of images as well.
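
As a small, concrete illustration of the face-redaction step Patton mentions (blacking out faces before images are shown), here is a hypothetical sketch using an off-the-shelf face detector. The file paths are placeholders and this is not SAFElab’s actual tooling; it simply shows one way detected face regions can be filled in before an image circulates.

```python
# Hypothetical sketch: detect faces in a post's image and black them out so that
# identity cues don't drive interpretation. Placeholder paths; not SAFElab tooling.
import cv2

def redact_faces(in_path: str, out_path: str) -> int:
    """Black out every detected face and return how many were found."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    image = cv2.imread(in_path)
    if image is None:
        raise FileNotFoundError(in_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Fill the bounding box completely rather than just outlining it.
        cv2.rectangle(image, (x, y), (x + w, y + h), (0, 0, 0), thickness=-1)
    cv2.imwrite(out_path, image)
    return len(faces)

if __name__ == "__main__":
    print(redact_faces("post_image.jpg", "post_image_redacted.jpg"))  # placeholder filenames
```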

Mary Gray: Yeah. Whoa, good stuff. I mean, I can spend another hour on that, so I’m going to stop us there. What have you—as you’re reflecting now on what accuracy means—like what was the goal in terms of being able to identify something particular on social media, and then what’s the intervention that you and the domain expertise of these young people have been able to bring to the table to really change the idea of what we should be trying to achieve?

Desmond Patton: Yes, so the goal has always been to identify automatically the psychosocial concepts in Twitter data that might tell us something about root causes of violence. What we learned qualitatively is that expressions of aggression and grief and trauma were paramount in the engagement that we were seeing, particularly among young people living in Chicago, young black people living in Chicago. And so these two dominant areas became the categories that we were most interested in when looking at social media data, and we wanted to see to what extent can an NLP tool automatically identify the psychosocial concepts of aggression and loss and trauma. And so what became clear is that number one, the machine learning tools are really bad at context. But what humans are really good at is context—and emotion. And so it became clear to me that the ways in which we do annotation or labeling of data actually can play a stronger role in NLP research. But how my colleagues were conceptualizing it was, “Oh, you just need to do these classifications. Just label it positive or negative, or one or two, or yes or no.” And that never sat well with me, and it became clear that it doesn’t sit well with me because it doesn’t allow us to really see the fullness of the human experience, or human condition, that we need deeper, more robust unpacking than was actually happening to actually be able to label in a more informed way. And my colleagues heard me. So we didn’t have this big fight or pushback. It was like, “Oh, I think you’re right. Let’s try that.” [LAUGHTER] And that was great. And so we spent, you know, as a social work team in a lab, we spent probably more time than people would like annotating data and creating a methodology around doing it, because we saw the importance of bringing in that insight. It also allowed us to check ourselves and to see visibly why and how we were interpreting data and the biases that were coming through. So we could have that conversation with each other when we were labeling something that a domain expert, a young person, may label differently. And if we were just doing it quickly with positive/negative, yes and no, it wouldn’t have afforded us that opportunity. So that became a really important piece of the work. But that wasn’t—that was never really the goal. It wasn’t—it’s not what I went into thinking about. It was kind of just hopping along making mistakes and learning from those mistakes. So what I always hoped to do was to, you know, be able to help social workers and outreach workers identify challenging conversations online that then they can intervene on. They are the experts in working in communities and deescalating violence. But the fights that were happening online—they were getting to them too late. I wanted to be able to mitigate that gap and support them. There’s so much challenge with doing that. A, you know, it’s still an interpretation issue. B, that work is really hard to get funded. There’s lots of ethical issues with doing that kind of work. So what actually emerged for me was the intervention with people. The intervention is kind of creating a cadre of young people from the community that want to do this work, that can do it on their own terms, how they would want to do it, them learning skills, them being a part of this process. So then they within their own communities can make decisions on how and whether or not technology is the answer.
And I think that it’s been so enlightening to be able to bring in young people through multiple mechanisms, either as research assistants or through our AI for our summer program, or through our work with the Brownsville Community Justice Center, and just to see the light bulb switch and be like, “Oh, like I’m an expert. I know this content. And that can actually do something to protect and keep my community safe.” That’s been the intervention. It’s actually not been the NLP tool.
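
To make the contrast between quick positive/negative flags and the richer annotation practice described above more concrete, here is a small, hypothetical sketch of a context-carrying annotation record, plus a check that surfaces where researchers and community domain experts read the same post differently. The field names are invented for illustration; they are not SAFElab’s actual schema.

```python
# Hypothetical sketch: an annotation record that carries context and rationale,
# not just a yes/no flag, so disagreements become visible and discussable.
from dataclasses import dataclass

@dataclass
class Annotation:
    post_id: str
    label: str               # e.g., "aggression", "loss", "other"
    annotator_role: str      # "domain_expert" or "researcher"
    context_notes: str = ""  # neighborhood events, slang, the surrounding thread
    rationale: str = ""      # why the annotator read the post this way

def disagreements(annotations: list[Annotation]) -> dict[str, set[str]]:
    """Group labels by post so researcher vs. domain-expert splits become visible."""
    by_post: dict[str, set[str]] = {}
    for a in annotations:
        by_post.setdefault(a.post_id, set()).add(a.label)
    return {pid: labels for pid, labels in by_post.items() if len(labels) > 1}

records = [
    Annotation("t1", "aggression", "researcher", rationale="reads as a threat"),
    Annotation("t1", "other", "domain_expert",
               context_notes="quoting lyrics that were popular that week",
               rationale="not directed at anyone"),
]
print(disagreements(records))  # one post, two readings: a conversation, not a coin flip
```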

Mary Gray: I think you were the person from whom I learned the framing of domain expertise, subject matter expertise, as a way of approaching humans in the loop.

Desmond Patton: Yes.

Mary Gray: And we both study these processes that are completely dependent on someone—

Desmond Patton: Yes.

Mary Gray: —looking at a piece of information and quickly being able to make a judgment about what could this be, to the points you’re making. As you’re thinking about this as an intervention, and particularly your role as a professor in social work, I’m wondering how you navigate the relationship between social work as a discipline that’s had a complicated relationship—

Desmond Patton: Yes.

Mary Gray: —to the carceral state, to prisons, to, um, identifying people who should be the targets of surveillance, and balancing that with creating interventions that are real alternatives and empower that expert that you’ve been talking about who knows that context, could use it to bring safety to their communities. How do you navigate that, particularly as an academic, as a teacher?

Desmond Patton: Yeah. I think it’s such a great question. I think number one is just having an awareness of the history of social work, and how that plays out, and how we think about treatment intervention. And I think some people kind of mystify social work as being, “Oh, you know, these great people that come in and make everything better,” but no, we actually have a problematic and racist history that still plays out in our child welfare systems, in our carceral systems. And I think having that awareness and naming it is a critical first step. That allows us to then have really difficult conversations that I think are critically important in and outside of the classroom. And what I’ve loved about social media, as challenging as social media can be, it does present the world in a very difficult, robust, and complicated way. And I think it’s a really exciting place to study the human condition when you don’t have to participate in it all the time either. [LAUGHTER]

Mary Gray: That’s true.

Desmond Patton: Because we also know that it can be really scary. But I’ve been able to use these examples in various workshop and classroom settings to help drive critical and reflexive conversations around power, and race, and oppression, and privilege. To be able to use a tweet and to look at a tweet from multiple angles, and to embed that tweet within systems, within an individual, within a family, within a society, within a community, is really important and has offered a new way in which we can think about the role of social work. I think before social media, we were just so used to a certain way of talking about people, and communities, and families. Now we get to see how people interact around content, around triggering events, around societal events. And we can use that to unpack and think about different and multiple intervention strategies and treatments. But the challenge has been social work as a field—it’s been really slow to move in this direction. So there are a few people and a few schools and centers that are now really excited about this work and are moving forward. But as a discipline we are not there yet. I’ve been—in this new kind of part of my life—trying to push for an agenda for social work values and thinking that can be embedded in technology, that also affects how social workers do social work in the community as well.

Mary Gray: I’m just putting together there’s this fantastic parallel to computer science and engineering that often thinks of itself as a kind of neutral bystander going to help—

Desmond Patton: Yep.

Mary Gray: —going to intervene by helping with tools. And it’s just striking me like what a great way to model that same sort of critical reflexivity for probably the computer scientists/data scientists around you, who also probably are not used to thinking, “I need to contextualize what’s happening.” Because I mean, isn’t most of computer science, after all, about abstraction?

Desmond Patton: Yep.

Mary Gray: It’s like, “Can I just get away from the messiness of the day-to-day and kind of model some abstract action of people as nodes and edges?” Have you had the computer scientists in your life reflect back to you that your bringing up the messiness is both hard but also exciting, maybe changing how they’re looking at what they do?

Desmond Patton: Absolutely! And it’s really helped me to see the contribution of social work in computer science. Because what to me feels very organic and natural, feels foreign to my computer science colleagues. But what I’ve appreciated is that most of them feel excited to learn more. So I’m not met with a lot of resistance, I’m met with confusion. I’m met with, you know, the need for support. And that feels deep and long for sure. But it has been a space where I can really firmly articulate the power and the importance of social work thinking in this space. And you hit the nail on the head when you said “reflexivity,” because it’s really in this ability to look outside of yourself and to ask deeper questions, to ask critical questions, to think about how you, um, interrogate what you have learned, what you thought was right, and to put it in a different context. And I think that’s been something that computer scientists and data scientists have not really had to do, because you’re thought of as being rigorous because you do math—that maybe you don’t need that kind of reflection. And I think now people are realizing that, “Eh, that’s actually wrong. That’s not the way to go.”

Mary Gray: [LAUGHTER] Not the way to go. Not the way to go. Well, and I mean, maybe for me this is the connection between the streams of work we have, is that, you know, we cross paths in thinking that sitting that expert who knows the context—

Desmond Patton: Mm-hmm.

Mary Gray: —next to the expert who can think about systems and the abstract is where the power is at. So I want to ask you—well, let me preface this with the work I’m doing, because I haven’t had a chance to tell you about it. So Project Resolve. Project Resolve is this attempt to build tech with communities, and it’s drawing on ethnographic data, it’s guidance from the community leaders, and really direction based on their agendas and their domain expertise. But also the domain expertise of social scientists, drawing on us to kind of be the doorstops that keep the door open between industries that maybe don’t listen to those experts—

Desmond Patton: Mm-hmm.

Mary Gray: —and also to be that bridge. So I’m wondering when you think about the work you’re doing and the realities of the communities that you’re working with, of what they need, what are some of the parallels you might see between projects like mine that are always trying to make their agendas come first, and projects like yours that also want to make these young people’s experiences the priority, but are, you know, part of building tools that are a part of systems, part of institutions that may not have their back, may not have their best interests at heart? Like, how do you balance the tension between who could benefit from what you’re doing?

Desmond Patton: You know, that’s such a good question. And I think a part of it is really the need to create a practice of humility in this work and a willingness to engage in active listening—and to identify your non-negotiables, which is something that I learned from you. I don’t know if you remember, we were at Cornell Tech years ago, and you said that in some context. And it was a light bulb moment for me and for the work—

Mary Gray: Hmm.

Desmond Patton: — because I don’t think that when you are—I’m a black gay man—and at the time I had not gotten tenure yet—at an Ivy League institution. And you’re moving really fast. You want to get tenure. You want to do the most important work. No one’s really telling you to slow down and to treat community members with respect, and to listen to them, and to partner with them in true ways. And when you said that, it became the light bulb moment for me. But it’s also risky within the context of whether you’re in academia, or you’re working for a big tech firm, or wherever else you might be working, to slow down and to actually put people before bottom lines. And I think the parallel is that if you’re really about this work, and if you really want to do it well, then I think that you have to be willing to have a conversation about the non-negotiables. And so, you know, this past year and a half I’ve been working with the Mayor’s Office of Criminal Justice here in New York City, on a very similar project around, “What can the city do to create better relationships with low-income residents in New York City?” And one of the non-negotiables for me is that we cannot do this work if it’s just about technology, if it’s just about scraping social media data to then inform some tool that community members would have no stake in or no participation in. So a part of that was to have a budget line, projects, and resources for the community-driven, community aspect of this project, where we have to listen to their voices, where we have to partner with them. And that’s been a game-changer because it also influences the decisions that we make with how we use machine learning, with how we’re communicating with the mayor’s office. And hopefully, with a new mayor, this work will then translate into new action and new work in the new mayor’s office that will lead to better communication. But again, I think it really started with being willing to identify and wrestle with those non-negotiables.

[MUSIC BREAK]

Mary Gray: So I don’t think that I’m shifting gears, but I want to ask you about the ethics and the tensions of data collection. Because, you know, when you’re effective at what you’re doing, it’s building, you know, an archive of how to understand people’s actions, their grief, their loss, their aggression. If you could just say a bit about how do you think about the balance or the usefulness of learning from that data, and the need to protect the privacy and the security of the folks who are producing this data, particularly for communities that have this history of being surveilled?

Desmond Patton: Yeah, this is the very thing that has always kept me up at night doing this work, because I think my work sits at two critical bookends. On one end, I know, observe, and have experienced young people dying because of what they say online. People dying because of what they say, what has been interpreted online. That is a real thing, it is not uncommon, and has become a global phenomenon. Add to that, parents and family members wanting whatever tool possible to keep their family safe. They don’t care if it is a surveillance tool, because at the end of the day, it’s about life or death. Versus also understanding that usually when we’re talking about these tools, we’re talking about their deployment in black and brown communities. There is no benefit of the doubt where interpretation can be thought of in positive and negative ways. It’s normally used as negative character testimony. That there’s been no input given from community members in the data collection process, in the interpretation process, and then how that tool is deployed in their communities. And that can lead to another form of violence, which is state violence, carceral violence, that is extremely problematic and also detrimental to black and brown communities. So I’m constantly wavering and toggling between those two spaces. And then, honestly, Mary, what’s been really difficult doing this work is that I feel like it’s been hard to find communities and colleagues that really get that. You know who does get it? Community groups, right? And so I can talk to, you know, a violence outreach worker, or a probation officer, or someone at the Giffords organization, or at the YMCA. Like, they get it because they’re constantly making those difficult decisions each and every day. And so I think for me, what I have decided to do is to lay out all of the challenges, and to be very clear about my positionality, about the questions that I’m asking, and being willing to change those questions if they are not the right questions, being really clear and transparent about interpretations and how those interpretations feed into classifications that feed into algorithmic systems, that feed into deployment of AI systems and being willing to not produce the thing that I want to produce if it’s not going to work. I’ve been doing this particular work for the last seven years, and we have never deployed any of these systems in a community. You know, as the researcher, I’m like, “Gosh, I feel like I’ve failed.” But as a human being, as a social worker, I feel like this is the right thing to do.

Mary Gray: I would say being able to prevent harm is probably the highest order—

Desmond Patton: Yeah. Yeah.

Mary Gray: —calling for your profession and for my profession. So, I mean, I think being able to identify systems that won’t work and keeping them from being deployed is incredibly important. Let me ask you, do you see a role for big companies like Microsoft to create tech alongside these communities so that they have more autonomy, that they have more control over how these systems would be deployed, or at least what happens to the data that might be collected to benefit them? Like you described those parents—what is a way to do right by them and also create some systems that make them the trusted stewards? Is there something companies can be doing differently?

Desmond Patton: Yeah, absolutely. I’ve long thought that big tech companies need to hyper-invest in education and in seamless pathways into technology that don’t have to be about which school you went to and which degree you have, but value persistence, and determination, and lived experience as being equally as important—and to help scale that up. And what does that look like? And so I would love to see companies like Microsoft put aside millions and millions of dollars to really, um, invest in that space. And maybe that’s happening, but I would love to see more of that happening. I would love to see investment in communities that are doing work with hard-to-reach populations that have figured out how to tap into that knowledge base in really important and critical ways. They need the support. And I think what’s hard is that these amazing organizations are constantly having to apply for funding, and it really detracts from the mission because you constantly don’t know if you’re going to be able to have the lights on in the next week or so. And so I think tech companies can really revolutionize that space as well. I want tech companies to really re-imagine who gets to work in these spaces. Do you need another engineer from MIT? Love MIT, love those engineers, but do you need another one? Is that going to make the difference for the tech company that embodies transformational justice? Who are the people, what content do you need, what experiences do you need to really get there? Because you’re clearly good at making money. You’re going to make money. There needs to be some kind of, you know, moment of reckoning where you understand that this is as important as the bottom line. And so I would love to see companies there. But I, you know, I’ve had really good experiences with Microsoft, not only being a visiting researcher in your lab, but being able to work with a variety of folks to kind of create new opportunities to number one, open up the conversation to different types of people to offer feedback to Microsoft. I have been the beneficiary of funding to create a Design Justice course based off of Sasha Costanza-Chock’s book, um, that allows us to bring social work students, and engineering students, and humanities students together to work on projects that could hopefully, you know, be beneficial for the work that’s happening at Microsoft. And so there are people within these organizations that really care, that have a justice mindset. And I hope that that kind of work and dedication is rewarded at places like Microsoft, that it is as exciting as you creating the new product.

Mary Gray: Can you say a bit more about what it is that you think that baseline of education is for Microsoft or large tech companies, the tech industry being able to say, “Oh, we need to invest in those domain experts,” for example, that you mentioned? Like, what’s the game-changer there? What is it about the education? Is it treating people as something other than a consumer? Is it—what is it?

Desmond Patton: Gosh, that’s just such a good question.

First, it is recognizing that expertise comes in different ways and is expressed in really dynamic ways within the human experience. And I think that that’s something that I had to untrain myself on, to be able to see that a young black child from Brownsville holds expertise. Because my training did not tell me that I should view them as an expert. My training told me that I am the expert because I have a PhD and I teach at Columbia, and that they should benefit from me. And the reality is that in my work, there was a humbling practice and process that forced me to realize that I didn’t know anything. And I think that there’s individual work and that work has to be extrapolated to product teams and big corporations. And it’s harder because big corporations like Microsoft and other spaces have long been viewed as being kind of almighty in this space, in this techno-chauvinist way of thinking about the world. I think it’s going to take these critical, like, moments of breaking down to understand the importance of this type of expertise, especially if you say that you actually care about the human condition, that you actually care about social justice. If you actually care about social justice, you will center the voices. And it’s beyond letters, and it’s also beyond just granting. It’s a retraining of your mind that takes time. It takes learning. It takes putting yourself in complicated and difficult conversations. So it is additional work. And it is so important for us to all be able to engage in this work to get to the other side. And so I’m hoping that these types of conversations become the norm within big tech companies like Microsoft.

Mary Gray: I would love for you to tell the listeners about the book that you’re working on, because I’m pretty sure when I hear you say it took you learning that lesson, that your book is going to be an opportunity for other people to learn from what you’ve learned. Can you just tell us a bit about that book?

Desmond Patton: Yeah, goodness. So since 2014, I’ve been studying a young black girl. Her name was Gakirah Barnes. She was murdered in April of 2014. And I learned about her in a popular news outlet that called her “the gun-toting gang girl of Chicago.” And they were comparing her to the character, Little Snoop, from “The Wire.” So basically, she was gang-involved, shoot first, ask questions later. And she had this deep mythology on Twitter of being, you know, this murderer, this person who had shot or killed up to 20 people before she was 17 years old. That was the mythology that was carried on social media and in the larger community. So I learned about her and wanted to leverage her story to understand this connection between social media and gun violence, because she had been murdered, shot nine times, and I wanted to see to what extent her friends on Twitter were going to retaliate. So what would be the mechanisms of the language, how would that happen, would it happen? But what Gakirah Barnes did in death is that she forced me to reckon with her humanity. So what I didn’t realize is that the questions that I was asking of Gakirah were full of white toxic notions of black girls, that I had imbibed as a black man, and was asking questions of her that did not see her as a person, did not level with her humanity. But she—she showed herself to me. She forced me to reckon with the trauma that she experienced every day, that she had experienced more death and more grief than probably most of us will ever experience in a lifetime. She forced me to see a young girl who loved, who laughed. She forced me to see a person, a child.

Desmond Patton: And that was an eye-opening moment for me. But it took time. It took poring over hundreds and thousands of tweets and images, and talking to her mom, and her best friend, and her cousins, and her family members to actually see this individual. And so the book is me wrestling with those big lessons, with me wrestling with how I viewed Gakirah, and the impact it had on how I interpreted her Twitter data, how my interpretations of Gakirah affected the tools that I used, and how I used those tools, how it affected the ways in which I wanted to effect change in those communities, and how I thought about expertise and partnership as well. I’m almost done with writing the book. I’m on chapter four of a five-chapter book. And you will hear from Gakirah through her tweets. You’ll hear from her mom, who I’m so grateful for—she spent time with me to help me understand her daughter. And honestly, her mom’s voice has been the most important voice to me because I was terrified that her mom thought that I was doing harm, that the work that I was doing was doing harm, that it was unethical, that it was just wrong. What her mom told me is that the only reason she responded to my DM on Facebook is because she thought that I was helping to change the narrative of her daughter.

Desmond Patton: And that, you know, lit a fire under me to really want to think about the importance of how social media and how technology can actually change narratives. So I don’t have to go down this route of painting this negative, hateful, scary picture, that I can actually paint a complicated, and beautiful, and nuanced picture that shows you the human experience in a different way. And that NLP and computer vision can do that if we allow ourselves to ask better questions, to ask more complicated questions, to put ourselves in communities and spaces where we have to wrestle with that. So that’s the hope of the book, that’s the promise of the writing, and I’m excited to move that forward.

Mary Gray: I feel like whenever I read your work, I can feel your effort to create empathy, you know, to really show me on the page how you’ve built a different perspective for yourself and that you’re sharing that. And in some of your work, you talk about making AI empathetic. Can you talk a little bit about what that means to you, and just how do we get there with AI?

Desmond Patton: You know, what we talk a lot about in social work is taking a strengths-based perspective. We say it so much that it becomes trite, but I think it became so clear for me in my work because I had lost sight of that. The power of the narrative around technology tools became so pervasive that I forgot about strengths. I forgot about positivity. I forgot about kindness. I forgot about all the things that make us who we are as people. And it’s all the things that I was able to see in Gakirah. So there was this completely horrible narrative about who she was, and in the same breath, I was also able to see all the things that make her a person. And I think that a part of that was a reckoning and awareness of what was happening to me, how history and context shapes how we interpret. Naming that, and the willingness to also put myself in spaces with people who can disagree with my approach, is what actually advances the work in a positive way. And so being able, again, to be interdisciplinary, transdisciplinary, and to talk to people like you, and anthropologists, and political scientists, and community members, and computer scientists. All of that matters for kind of getting outside of these, like, restricting, reductionist narratives. These processes of like naming, and active listening, and processing I think should be a part of training the person who develops AI, that you should not and cannot be an ethical engineer if you have not at least heard of or gone through a process in which you have to deal with yourself and how your self impacts the things that you’re developing. And I think that if we can make that a criterion, a requirement, then I think that we will slowly get to a space where people can at least be conversant in these conversations, and be willing to be checked and to listen. So that’s what I mean by making AI ethical, is about really bringing in a process for self-discovery, for reflexiveness, uh, for difficult conversations.

Mary Gray: And for that empathy, it’s interesting because I hear you describing a process that’s just adding dimensions, you know, what mathematicians would talk about as high-dimensional space.

Desmond Patton: Yeah, yeah. [LAUGHTER]

Mary Gray: They usually hate that because that’s not going to give you an elegant model. And I’m wondering as you’re asking, you know, a discipline that approaches the computational with a goal of reducing all that complication, if you see some tension between the early efforts to debias AI, and what I hear you calling for, which is really making it more culturally aware, more sensitive, and not trying to shut down the complexity. Like, have you seen a shift in the conversation within the discussion of bias and fairness that gives you hope that there’s willingness to see our complexity, to see our dimensions?

Desmond Patton: I do. It’s slow, but it really takes champions in this space to make a difference. So folks like Rediet Abebe—I’m maybe mispronouncing her last name—at Berkeley, who was really a social worker’s data scientist who has reached out for partnerships, and for paper collaborations, and research grants really, like, striving to understand social work principles for AI. And then she will inevitably create a community of students around her that will then take these ideas and move them forward. And there are lots of examples like that that are happening across the country. And I’ve been so thrilled and excited to see that. I’ve just been able to have the most robust conversations, the most exciting conversations with folks that are acting as champions. And so I hope that we are seeing these folks for who they are, and that they are rewarded for that. So when they are, you know, coming up for tenure, or trying to get that big grant, we recognize that as being as important as the findings from an article or their placement of a manuscript in whatever, you know, prestigious journal. And so I think we are moving in the right direction, but we have to honor the work that is happening in these communities as well.

Mary Gray: Yeah. So I know we’re running tight on time, and I don’t want to miss the chance to ask you, what are some projects you’re excited about for 2022?

Desmond Patton: Absolutely. So my work has moved me into the space of studying a couple of newer things for me, which will be grief, and joy, and racism on social media. So we received a couple of grants to study how black folk in New York City are expressing grief on digital platforms. We’re going to study that through using NLP, and linguistic approaches, and qualitative methods, with the goal of being able to figure out how we identify digital signs of stress online, and then how can we then use these tools to support new treatments and interventions. Lots of people are grieving, um, as it relates to COVID-19, and anti-black racism, and police violence. It’s just a really hard time for lots of people, but in particular black folk. Um, and so we want to be able to figure out what can the digital world tell us about grief, and how can we intervene? I have been working with folks from MIT, um, and Stanford on a new product called “InterpretMe,” which is built off of the Teaching Systems Lab’s Teacher Moments product, where we adapt this tool to help people in various jobs think about their interpretation of social media posts. And the goal isn’t to make you a better interpreter, it’s to help you realize bias in your interpretation, how that bias leads to more punitive decisions. And so you would then go through this experience and reckon with interpretations, get more context, and then make decisions, and then get automatic feedback as well. So this is really becoming kind of a DEI training tool. So that’s the second project. And then we are continuing our work with the mayor’s office to really create stronger tools that build better communication between low-income residents in New York City and the new mayor’s office as well. None of these projects are tech only. They are a merger of qualitative methods, and community insights, and linguistics, and anthropology, and computer science. And that’s been a really exciting space for me.
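
For a rough sense of the reflection loop Patton describes (record a first read of a post, add context, record a revised read, then surface the shift), here is a tiny, hypothetical sketch. It is only an illustration of the flow as described in conversation, not the actual InterpretMe or Teacher Moments software, and the example post and context are invented.

```python
# Hypothetical sketch of an interpretation-reflection exercise: compare a first
# read of a post with a read made after context is added, and report the shift.
# Not the InterpretMe product; the scenario below is invented for illustration.
from dataclasses import dataclass

@dataclass
class Scenario:
    post: str
    context: str  # what a community domain expert would want you to know first

def debrief(scenario: Scenario, first_read: str, read_with_context: str) -> str:
    """Return feedback comparing a first read with a read made after seeing context."""
    if first_read != read_with_context:
        return ("Your interpretation changed once context was added; "
                "that gap is what the exercise is meant to surface.")
    return "Same read either way; would a community domain expert agree?"

example = Scenario(
    post="we gon see about that tonight",
    context="Posted the day of a memorial; friends read it as grief, not a threat.",
)
print(debrief(example, first_read="threat", read_with_context="grief"))
```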

Mary Gray: So before we finish, are there any final points or thoughts you’d like to leave us with as we wrap up?

Desmond Patton: I am excited about the possibilities—and have hope—for transformational justice that can happen through technology. And I believe that if we reimagine a world that is hyper-inclusive, where everyone gets to have input, and everyone can participate in development, where everyone gets to win and thrive in these spaces, that we really can see another set of technology that actually can be a space for social justice. And I believe that conversations like the ones you are having, and the opportunities and resources that technology companies can provide, can be transformational on that journey.

Mary Gray: Thank you so much for joining us today, Dr. Desmond Patton. And thank you to everyone listening. As a reminder, this episode is part of the series “Just Tech: Centering Community-Driven Innovation at the Margins.” Check out microsoft.com/research to tune into past and future episodes and subscribe to the podcast wherever you listen to your favorite shows. Desmond, thank you. I appreciate your work so much. I’m so glad that you keep doing what you’re doing. And please give your husband my best.

Desmond Patton: Thank you, Mary, good to see you.

[End of episode]
