
Photo credit: © Katherine Cheng/SOPA Images via ZUMA Press Wire

In each of our pockets are devices with the power to bear witness. Suddenly, violations of human rights are documented for the ages and may be more easily prosecuted.

Digital technology has changed the world. That change is about more than instant, 24/7 news, or booking a vacation at a stranger’s house, or making calls on your wrist, or having a car show up magically to whisk you anywhere. In each of our pockets are devices that have the ability to bear witness to the world around us. Suddenly, violations of basic human rights are being documented for the ages. 

The visionary leading the effort to bring these two forces together is our guest on this week’s WhoWhatWhy podcast, Alexa Koenig, executive director of the University of California, Berkeley’s Human Rights Center. Koenig, Ph.D., J.D., whose center won the 2015 MacArthur Award for Creative and Effective Institutions, is also a lecturer at the UC Berkeley School of Law, where she teaches classes on human rights and international criminal law — with a particular focus on the impact of emerging technologies on human rights practice.

She co-founded the Human Rights Center Investigations Lab, which trains students and professionals to use social media and other digital content to expose gruesome atrocities, from Ukraine to Uganda.

She is also co-chair of the Technology Advisory Board of the Office of the Prosecutor at the International Criminal Court, co-chair of the International Bar Association’s Human Rights Law Committee’s Technology and Human Rights working groups, and the author of seven books.

Koenig details how she is marshaling the resources of videos, social media, and technology to strengthen human rights advocacy and prosecution. She is proving how the device each of us carries can be used to secure international justice.

She is breaking new ground in the ways courts around the world will deal with this new kind of evidence, and who can really be trusted to control this new visual power.  

Apple Podcasts | Google Podcasts | RSS | MP3


Full Text Transcript:

(As a service to our readers, we provide transcripts with our podcasts. We try to ensure that these transcripts do not include errors. However, due to a constraint of resources, we are not always able to proofread them as closely as we would like and hope that you will excuse any errors that slipped through.)

Jeff Schechtman: Welcome to the WhoWhatWhy podcast. I’m your host, Jeff Schechtman. We know that digital technology has changed the world. I’m sure that all of you have these debates about the outcome of that change. But it’s about more than instant 24/7 news, or the ability to book a vacation at a stranger’s house, or make calls on your wrist, or magically have a car show up and whisk you away. In each of our pockets are devices that have the ability to easily bear witness to the world around us.

Couple this with the surveillance that has become ubiquitous, and suddenly violations of basic human rights are documented for the ages. From the George Floyd video to what will be the world’s most documented war, in Ukraine, bearing witness is now a heavy personal and individual responsibility. But where does it all lead? What are the ethics that surround this newfound citizen power? If everybody has the power, then is anyone in charge?

Last week, US Attorney General Merrick Garland visited Ukraine to discuss the prosecution of Russian war crimes. How will courts around the world deal with this new kind of evidence and who can really be trusted to use this new visual power? These are just a few of the questions that our guest, Alexa Koenig, is dealing with as the Executive Director of the Human Rights Center at the University of California, Berkeley.

Among her many accomplishments, Alexa Koenig is the winner of the 2015 MacArthur Award for Creative and Effective Institutions, and a lecturer at UC Berkeley School of Law, where she teaches classes on human rights and international criminal law with a particular focus on the impact of emerging technologies. She co-founded the Human Rights Center Investigations Lab, which trains students and professionals to use social media and other digital content to strengthen human rights advocacy and accountability.

She is the co-chair of the Technology Advisory Board of the Office of the Prosecutor at the International Criminal Court and is co-chair of the International Bar Association’s Human Rights Law Committee. She’s the author of several books. She has her PhD and MA from Berkeley and a law degree from the University of San Francisco School of Law. It is my pleasure to welcome Alexa Koenig here to the WhoWhatWhy podcast. Alexa, thanks so much for joining us.

Alexa Koenig: Thank you so much for having me.

Jeff: Well, it is a delight to have you here. First of all, tell us a little bit about the Human Rights Center, a little bit about its history, how it came to be.

Alexa: Absolutely. The Human Rights Center here at UC Berkeley was established in 1994; we’re coming up on our 30th anniversary. The idea was that we would be one of the world’s first university-based human rights organizations, a place where people from across disciplines — whether you’re working as a journalist, as a lawyer, or in public health, sociology, et cetera — could come together around issues of mutual human rights concern, do that research, run those investigations, and hopefully give back to the world the facts found along the way.

We also, of course, because we’re on such a huge public campus, have an opportunity to really train and teach students and inspire them to think about human rights as a component of any of the work that they might do once they leave this campus. And of course also to work very closely with nonprofit organizations, legal organizations, and reporting outfits around the world while they’re doing their time here at Berkeley.

Jeff: And talk a little bit about how the center has changed during those 30 years, particularly with respect to the way the world has changed and the way issues of human rights sit in the world today.

Alexa: Sure. So I think any time you’re doing research or investigations, you’re very much beholden to the state of the information environment of which we’re all a part. As we’ve seen technologies change and shift in terms of their dominance and how we communicate as human beings, I think one of the things that we really recognized here at Berkeley is the need to also adapt our methods of really trying to get at what is happening in the world.

When we started 30 years ago, a big part of what we were looking at and doing with our partners was really more traditional investigations. Eric Stover — who’s our faculty director and did a lot to really set up the organization — was an expert in investigating war crimes: very boots-on-the-ground, exhuming mass-grave sites, using what was then the emerging technology of DNA analysis to help identify bodies and to help reunite people and families who’d been torn apart by war.

Over time, though, we began to realize that other new technologies were really coming into the process of understanding what had taken place in sites of atrocity. And so in 2009, we hosted what we think was probably the world’s first big conference on human rights and new technologies, where we brought together people who were really experimenting with how smartphones could be harnessed to do interviewing, to document what people had experienced in the context of war.

At the time, the cost of satellite imagery was dropping; it had become accessible to nongovernmental actors only within the previous decade, and I think it was giving us a very new physical perspective on what was happening around the world. We also, of course, had the rise of social media. Social media companies were entering areas that were unfortunately rife with conflict, providing new opportunities for people to communicate about what was happening on the ground.

So this really was a moment of reckoning of how do we — as human rights practitioners, as researchers, as reporters — pivot to take advantage of all of these new ways that people are reaching out and sharing about their experiences? After that, we began really asking the question, first in a legal context. So in 2011, we were coming up on the 10-year anniversary of the International Criminal Court and beginning to ask how strong this new experimental legal body was in terms of actually prosecuting the highest-level perpetrators for some of the world’s most grave crimes.

And we sent a researcher to the court. She came back and reported that, essentially, as we and many others had noticed, a lot of cases were falling apart and failing to proceed past fairly early stages of prosecution. Our researcher’s insight from spending time at the court was essentially that the judges were criticizing the Office of the Prosecutor for “over-relying” on the testimony of survivors without bringing in enough corroborating data to really meet the evidentiary thresholds needed for successful prosecutions.

So that really launched our last decade of work, of asking: How can we take advantage of some of these new digital technologies to bring that in to either support or disprove what survivors are saying is happening in their communities and hopefully get successful prosecutions, get governments to stop the overreach and, hopefully, contribute to a more peaceful future?

Jeff: Is there a downside to this technology with respect to the way in which people might bear witness? An analogy people can relate to might be travel: if you’re going on vacation and just observing, really being in the moment, it’s different from looking through the lens of your phone or a camera all the time and taking pictures.

Alexa: There are several downsides to the new information environment that I think we are all now a part of. One, of course, is, as you’re saying, the human connection and our understanding of what is actually taking place. There are a number of biases that really influence what we can perceive through the lens of a camera and a posting to social media. So, for example, one of the very common biases that we’re often aware of as investigators and researchers is access bias.

First, who even has access to a smartphone or to the internet to be able to post about their experiences? That can be very gendered, and can vary by geography, age, et cetera. Then think about where some of us may be living online these days: for me, it might be Twitter and Instagram; for my daughter, it might be TikTok; for my parents, it might be Facebook. So it’s really about understanding how that can vary, and how, if we rely on certain platforms for the information we get, we’re getting a very narrow slice of the world.

One area where we’ve been trying to do more and more research is how we can use some of this online open-source information, the information that’s publicly available in digital spaces, to investigate and potentially prosecute sexual and gender-based violence. There are a lot of assumptions about whether people are going to put that kind of sensitive material up on social media sites. I think one thing we’ve found is that people do post about all aspects of war, but sometimes the ways they communicate it can be very coded, et cetera.

And then I’d also really point out the ethics of some of this work. I think this is really a form of counter-surveillance, when you’re using new technologies to investigate international crimes, war crimes, et cetera. So as much as we, as individuals, may have a deep faith in, reliance on, and desire for privacy, I think understanding how these technologies can be abused, even by the “good guys,” is something that we all need to be thinking about.

When we’re doing these investigations, there are a number of competing tensions between privacy, access to information, freedom of expression, and national security. All of these are human rights interests and things that I think we as a community of practice need to be talking about.

Jeff: The other side of that, of course, is — as this kind of technology grows and the degree to which it is used — the way in which it impacts the crimes themselves.

Alexa: Yes, it does. Cyber-attacks, for example, can really undermine critical infrastructure in communities that are already hard hit by war; that’s just one of the ways the cyber domain can be used to gain military advantage and to actually harm large numbers of civilians. A colleague of mine, Lindsey Freeman, has been leading a project here at our center looking at cyber-attacks that have been perpetrated against Ukrainians.

And she is really trying to think about how this form of warfare needs to be better recognized by the international criminal legal community and baked into the kinds of crimes that we’re investigating and prosecuting today. I think another thing we really need to be thinking about is that, when we’re talking about information put into digital spaces, we really need ways to probe the quality of the information that we’re accessing.

So propaganda has been around since time immemorial — people are always going to try to manipulate our understanding of what’s happening on the ground, through what they share and how they frame it. I think the challenge with the digital domain is that the speed and the scale of this can be so much greater than what we may have previously experienced.

That’s one of the reasons we partnered with the United Nations Office of the High Commissioner for Human Rights to create something called the Berkeley Protocol on Digital Open-Source Investigations, which is really a set of guidelines for international investigators, researchers, and even journalists to better understand how to probe the quality of the kinds of digital information that we find online: how to find it in ways that overcome some of the biases in what we have access to in digital spaces, but then also how to verify that information so that we know whether it’s actually communicating something true about the world.

Jeff: Because there’s also the issue of context: even if a particular set of images or film is accurate, we don’t see what happens before it starts or after it ends, and without that context, it could be different.

Alexa: Absolutely. So there are a few different challenges that I think we need to be cognizant of when we’re relying on digital evidence. The first is this issue of cropping, or just the perspective of the lens, and how that really colors our understanding of events on the ground. There’s a big study, really well known in the psychology-of-law space, that I remember teaching to some students, which looks at how a jury’s perspective may shift if they are looking at video taken in an interrogation room.

Depending on whether the camera is looking from behind the shoulder of the interrogator, from behind the shoulder of the person being interrogated, or even from a purportedly neutral place somewhere in the middle, jurors’ perceptions of who was abusive or evading questions, who was right and who was wrong in terms of their behavior, really shifted, even with the same script being used in these psychological experiments. And I think that is something a lot of investigators are trying to grapple with now by trying to find out how many videos may actually exist of the same incident — whether it’s an attack on a hospital or the mass killing of people on a particular date and time — and trying to triangulate those and get imagery from all different angles.

Another big challenge, though, is what’s become known as shallow fakes: a photo that actually was taken of a real event in the wild but is then mis-framed in terms of what it claims to be. There’s an example that we use from our partnership with Amnesty International: a picture of two children who look to be Asian, huddled together. One allegation that was made was that these were two children who had been orphaned in Myanmar, in Burma.

But if you do what’s called a reverse image search, where you take that photo and run it across the internet, what you quickly find is that a lot of reporters used it to illustrate a story about, I think it was, an earthquake in Nepal in 2015. But if you go far enough back, you find even they got it wrong: it’s actually from Vietnam in 2007. And this really charming, heartwarming, heart-wrenching picture has been recycled over and over.
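What that looks like in practice: reverse image search rests on image fingerprinting. The sketch below is a hypothetical illustration of the underlying idea, not a tool Koenig’s lab describes; the file names are invented, and it uses the open-source Python libraries Pillow and imagehash.

```python
# Hypothetical sketch: flagging a recycled photo with a perceptual hash.
# Requires: pip install Pillow imagehash
from PIL import Image
import imagehash

def fingerprint(path: str) -> imagehash.ImageHash:
    """Perceptual hash: visually similar images yield nearby hashes,
    even after resizing or recompression."""
    return imagehash.phash(Image.open(path))

# Invented file names, standing in for the Myanmar claim and the 2007 original.
claimed = fingerprint("claimed_myanmar_orphans.jpg")
archived = fingerprint("vietnam_2007_original.jpg")

# Subtracting two hashes gives a Hamming distance; a small distance
# suggests the same underlying photograph recycled under a new claim.
if claimed - archived <= 5:
    print("Likely the same image; the newer claim deserves scrutiny.")
```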

And then the final part is, of course, deepfakes: the videos and photographs where we know the technology is getting better and better, increasingly making it look like something happened in the wild, that someone said or did something they never said or did, or that a person existed who never did but was basically constructed whole cloth by an algorithm.

Jeff: Is there a concern that so much of this imagery being around essentially forever will make truth and reconciliation where it’s desired after a war, after a conflict, virtually impossible?

Alexa: I think that’s an excellent question, because many of us are grappling with it at the moment. For us, as an investigations and research center, we really do have a fidelity to facts. I know one’s interpretation of facts can be highly subjective and really shaped by who you are and what your experiences have been. But I think in order to uphold democracy and the rule of law, we have to be able to get to the point where we agree that a fact exists as the basis for proving that a crime has taken place.

So a big piece of this is really figuring out how we grapple with both the temporal aspect of all this data, how long it sticks around, and also how widespread it is and how it’s being seeded and planted everywhere. I think one especially acute challenge is, of course, the tension that arises between reporters, who are trying to report out to the world what’s happening in the context of war, and legal investigators, who have a very different set of objectives and incentives.

So, for example, one challenge is of course that reporters, when they’re on the ground in a place like Ukraine, find a survivor of an attack and ask that person about their experience of witnessing it. That is deeply problematic from a legal investigator’s perspective, because suddenly there’s a narrative out there about what this person, who could have been a very powerful witness in a court of law, says that he or she experienced. It then becomes very difficult to call that person as a witness in court years later, because there’s this competing narrative.

We all tell stories about real-life events, not making things up, but we tell them a little bit differently every single time, or our memories start to slip, or new things jog our memory. I think a big piece of what we’re trying to figure out is how to keep witnesses from undermining their own credibility by putting these narratives out there in the first place. Of course, that’s not to say that journalists should stop doing what they do, and do so well. We need the pressure of journalists to bring the world’s attention to these atrocities.

That’s what motivates bringing these cases in the first place and creating new court systems to actually get justice for these crimes. But I think more and more, as we’re all working in these digital spaces, we’re finding that we need to be at least aware of and sensitive to the ways these can be problematic for the record. The other challenge we’re seeing is that, more and more, when legal investigators or reporters go out to report on conflict and war, they’re finding that a few key videos or photographs may become quite iconic.

And the people they’re talking with assume that the reporters and the legal investigators want to talk about that video or that particular incident, as opposed to really telling their own first-person experience from start to finish. Those videos and photographs may be beginning to color the ways people remember their own experiences, in ways that could be challenging for later verification processes and for establishing facts in a court of law.

Jeff: Do we have, or have there been developed, baselines yet by which this kind of digital evidence effectively works inside a court?

Alexa: Yes and no. It really depends on the court system itself. Some courts in some parts of the world are much more comfortable dealing with digital materials and, quite frankly, much better resourced to handle the kinds of information that we’re increasingly seeing come into these processes. We could look, for example, at the International Criminal Court: there’s a case there called the Al Mahdi case. Al Mahdi was a defendant who was accused of the destruction of cultural heritage property in Timbuktu, in Mali.

This was a huge case where an organization that we have occasionally worked with (a really fantastic group called SITU Research, based out of New York City) developed an interactive platform to help the judges better process and understand the digital evidence that was relevant to this case. So in the Al Mahdi case, there were videos and photographs that had been posted to social media, as well as digital photographs and videos that the war-crimes investigators themselves had collected.

And what SITU did was create an online system where you could look at each of nine different sites where these historic buildings had been destroyed, click on a site, and it would show you the satellite-imagery perspective of where that location sat relative to all the others. It would show you the videos relevant to Al Mahdi himself potentially being engaged in the destruction of these buildings, or other people who were affiliated with him. And it would show you all the photographs, and you could scroll around and play with it, so you really got a sense of digital materials that I think the court was otherwise less familiar with.

I think what we’re going to see over the next decade is an increasingly prevalent use of this kind of digital information coming in as evidence in war-crimes cases. Part of it is because international justice takes so long. It can take 5, 10, 15, sometimes even 30 years to bring these cases to court, so we have a lag right now in the international system: cases that took place during this digital era are only now emerging.

Even five years ago, we were still seeing a lot of cases where crimes had been perpetrated during the analog period. The scale of this evidence, the speed of its transmission, and our access to it have all changed in really profound and fundamental ways in just the last couple of years.

Jeff: If this digital information turns out to be even more successful as you talk about it, does it create a situation in the future where there is a bias towards prosecuting cases where there is digital evidence, as opposed to those where there may not be, but that are no less horrible in their consequences?

Alexa: Yes, absolutely. So, I did a big study with a gentleman by the name of Ulic [unclear] from Swansea University, and we were interviewing traditional boots-on-the-ground investigators, gender experts, and digital investigators about their experience with investigating and prosecuting sexual and gender-based violence.

What we repeatedly heard from the gender experts, and also from the boots-on-the-ground investigators, was that they were really concerned that rape cases, and other cases that have historically been really difficult to prosecute, were just going to become even more difficult, because judges would be anticipating and expecting a video or a photograph that could be used to corroborate what survivors said they’d experienced. They have a deep fear that this could pervert our understanding of what constitutes proof.

I do think, if you even look at some of the really fantastic digital investigations that are happening today, a lot of them, whether they’re done by human rights researchers or reporters or legal investigators, are focusing on things like attacks on hospitals and destruction of buildings and mass killings. And the reason for that is in part, I would hypothesize, that those are easier to document visually.

I think any kind of crime that tends to happen behind closed doors, that tends to leave fewer visible marks, is going to be much more difficult. If you think about the destruction of a hospital, for example, or a series of hospitals in Ukraine, you can pick that up on satellite imagery. But for sporadic rapes or individualized killings around a community, there’s going to be less of the kind of digital information available to triangulate.

Jeff: The other side of that is a situation like Ukraine, where there is just such an enormous volume of material in a system that, as you detail, is very slow to act. One wonders to what degree it all grinds to a halt simply because of the volume.

Alexa: Yes, that has been a huge piece of the challenge for war-crimes investigators. We’ve never had to deal with this scale and volume of digital content before, or the variety of forms it takes. The conflict in Syria was, of course, a first major milestone in terms of just seeing the volume of video that was being posted to YouTube and occasionally to other platforms. What makes Ukraine really another milestone is just the variety of platforms to which information is being posted.

Of course, we have the traditional ones like Facebook, YouTube, Twitter, et cetera, but we’re also seeing more and more on TikTok and Telegram. Meanwhile, our law schools and universities don’t really have systematized training for teaching people how to do fact-finding in these digital spaces, which is one of the reasons we created our Investigations Lab and are partnering with Amnesty International to create an international network of universities that can disseminate some of these skills.

But the variety of platforms is changing so quickly that it really does take a lot of time and effort to figure out how to effectively, efficiently, and ethically mine them for the information that you might need.

The other challenge is, of course, storage. This is so many terabytes, if not more, of data, and that’s a huge cost for often very cash-strapped war-crimes units. So how do we centralize and begin to create some kind of international repository that has the resourcing and the security practices in place to really safeguard this data throughout its life cycle?

And then finally, there’s thinking through how technology can be used to combat the challenges of technology. We are no longer talking about finding information at human scale when we’re talking about millions of videos related to a particular conflict. There’s a lot of innovation happening right now among people who are experts in artificial intelligence, and particularly machine learning, to figure out: if you have a database of, say, a million videos related to alleged atrocities in Ukraine, how do you parse those using machine-learning processes to find the ones that are specific to a particular date or location?

How do you figure out whether you can verify a video against a piece of satellite imagery, to say this actually did take place where the person who shared the video said it occurred? How do we use natural language processing to find the keywords and the terminology to bring this million-piece data set down to maybe a thousand videos, so that we can begin to systematically comb through them? I think this human-technological partnership is going to be a really critical aspect of the future of war-crimes investigations, but we have to figure out how we make sure the human remains at the center — whether the investigator, the survivor, the victim, et cetera — and that the technology is really used in service of those more traditional efforts at the same time that we’re entering this new domain.
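As a minimal sketch of the triage Koenig describes, and only that: the metadata fields, keywords, and file names below are invented for illustration, and real pipelines would layer machine-learning models on top. Even simple date and keyword filters can shrink a collection before human review:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class VideoRecord:
    path: str      # local path to the downloaded file
    posted: date   # date attached to the original post
    caption: str   # text shared alongside the video

# Invented keywords an analyst might choose for one alleged incident.
KEYWORDS = {"hospital", "shelling", "airstrike"}

def triage(records: list[VideoRecord], start: date, end: date) -> list[VideoRecord]:
    """Keep only videos inside the date window whose captions mention a
    keyword, shrinking the pile human reviewers must actually watch."""
    return [
        r for r in records
        if start <= r.posted <= end
        and any(k in r.caption.lower() for k in KEYWORDS)
    ]

# Toy example: narrow a collection to one week in March 2022.
videos = [
    VideoRecord("clips/a.mp4", date(2022, 3, 9), "Shelling near the hospital this morning"),
    VideoRecord("clips/b.mp4", date(2021, 7, 2), "Street market, summer festival"),
]
subset = triage(videos, date(2022, 3, 7), date(2022, 3, 13))  # keeps only clips/a.mp4
```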

Jeff: What if any nexus exists now between the work that you and your colleagues are doing at the Human Rights Center and so much of the technological prowess that’s needed from Silicon Valley?

Alexa: We are definitely in conversation with several of the tech platforms to talk about the future — actually, the present and the future — and how we handle the preservation of content that goes up on social media and has to be taken down because it’s graphic or violates their terms of service. Is there some way we can work together to ensure that it’s preserved to strengthen war-crimes prosecutions in the future? I think we have a couple of precedents that we can look to as we’re designing what that might look like going forward.

Certainly, there have been repositories created by the companies themselves for terrorist content — specifically for the hashes, basically an alphanumeric code that’s specific to each video or photograph that might contain terrorist content. They put those into a shared repository so that they can all scan their platforms, make sure this terrorist content isn’t circulating, and remove it if it is. We also have the example of child sexual exploitation material, which it’s illegal, of course, for any platform or any individual to have in their possession.

Once identified, it needs to be turned over to the National Center for Missing & Exploited Children. I think we’re trying to look to some of those precedents to see if something similar can be done in the context of international crimes. The University of Oxford has been working with us, and working independently as well, on trying to design such a system. We’ve seen some initial explorations of evidence repositories in the context of Syria and Myanmar for those two conflicts. I think we’ve realized there’s a real need to centralize and preserve all of that data, whether collected by the companies or by individuals on their smartphones: a place for it all to go so it can be safely held onto for when prosecutors eventually come knocking.
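For readers unfamiliar with the hashes Koenig mentions: in the simplest case, a hash is a fixed-length alphanumeric code computed from a file’s bytes, so platforms can compare codes without sharing the content itself. Below is a minimal sketch assuming a hypothetical shared set of known hashes; real systems typically use perceptual hashes that survive re-encoding, which exact-match SHA-256 does not.

```python
import hashlib

def file_hash(path: str) -> str:
    """SHA-256 digest: an alphanumeric code tied to the file's exact bytes."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):  # stream large video files
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical shared repository of hashes of already-identified material.
# (This entry is the well-known SHA-256 of empty input, used as a placeholder.)
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

# An exact match means this precise file has been identified before.
if file_hash("uploaded_video.mp4") in KNOWN_HASHES:
    print("Match found: route to preservation and review workflow.")
```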

Jeff: And because this is an international effort, talk a little bit about the ability of organizations to work together towards solving some of these complex issues and problems that we’ve been talking about.

Alexa: That’s been one of the biggest rewards, I think, of working in an academic institution: we’re often very well situated to be a hub between different sectors, whether traditional war-crimes investigators, social media companies, or other tech companies, and to host conversations here at Berkeley, and in other places around the world, to design that collective way forward. I think one thing that has been increasingly recognized by human rights practitioners, journalists, investigators, and others is that we can’t do this alone.

The scale is too big, the speed is too fast. There is too little training and too little guidance about how to do this work well, and we need to be working more efficiently and effectively by thinking of ourselves almost as links in a chain of accountability. That first link may be the person on the ground in Ukraine who happens to witness the killing of someone in their community or the destruction of a building, and who captures that on their smartphone.

Meanwhile, you have some great nonprofits out there that have been training people in conflict areas in how to capture the kinds of content that are really helpful for later legal processes. So not just the dead body or the destroyed building, but the surrounding information that may tell you who was present when this happened and may give some clues as to location and time, which are going to be important for verification later in court.

And then after that, when it’s posted to social media or sent to someone, who finds that information, who collects it, who preserves it forensically so that it’s reliable for courts of law? Who then analyzes it and assesses whether it’s propaganda or a deepfake or something we can actually count on to tell us something about the world? And then how does that get packaged and communicated in ways that judges and juries can understand and ultimately sent out more broadly so that the world knows what’s happening at these moments in time?

I think we’re realizing no one organization can do all of that and that we really need to figure out how we do a better job of communicating and coordinating all across that entire chain to ultimately work together.

I think the scale of the crimes today and the power of the perpetrators can only be countered by civil society really coming together and figuring out how we all do what we do best, while making sure we’re reinforcing and amplifying one another’s work.

Jeff: And finally, Alexa, how did you come by this work, this specialty?

Alexa: When I was in law school, 9/11 happened. I think that opened my eyes to the importance of really thinking about how we document atrocities that are happening in the world and how evidence becomes really critical to our understanding of the facts. I was at the time specializing in intellectual property and cyber law. So I was very curious, based here in San Francisco, about how information was changing and how digital affordances were creating new opportunities for understanding the world.

And then I began working a lot on issues related to Guantánamo and the men who’d been detained there, in a project at UC Berkeley and also a project at USF, documenting a lot of these men’s stories through video and recognizing the power of video to help communicate to the world the experiences of people in really, really difficult times in their lives.

While doing my PhD and my master’s degree here, I specialized in interrogations and investigations, and eventually was certified through the Institute for International Criminal Investigations. But as for how this all came together, I think part of it was geographic: being here in San Francisco, seeing what was happening in the tech sector, and really grappling with whether it could be used to strengthen human rights around the world. And then finding a home like UC Berkeley, where that was very much a question that others were asking as well.

Jeff: Alexa Koenig is the Executive Director of the Human Rights Center at the University of California, Berkeley. Alexa, I thank you so much for spending time with us today here on the WhoWhatWhy podcast.

Alexa: Thank you. I really appreciate the opportunity to speak with you.

Jeff: Thank you. And thank you for listening and joining us here on the WhoWhatWhy podcast. I hope you join us next week for another WhoWhatWhy podcast. I’m Jeff Schechtman. If you like this podcast, please feel free to share it and help others find it by rating and reviewing it on iTunes. You can also support this podcast and all the work we do by going to whowhatwhy.org/donate.

