Authenticating Video

The Guardian Project’s software authenticates human rights videos and protects activists in the field


Steven Cherry: Hi, this is Steven Cherry for IEEE Spectrum’s “Techwise Conversations.”

Suppose you’re a human rights organization. Someone sends you a video documenting an abuse of human rights—by a private militia in Chechnya, or the Abu Ghraib prisoner abuse photos, or children forced to pick cotton in Uzbekistan. The Internet has once again proved its worth—the pool of people able to video a human rights abuse and send it to you is a hundred- or maybe a thousandfold greater than it used to be. Awesome.

Unless the video is a fake. The Internet and modern software make it at least a thousandfold easier for someone to fake a video too. Not so awesome.

The human rights group known as Witness, with money from the Knight Foundation, has been developing a new app that would help it authenticate video, photos, or audio coming from mobile devices. The actual developer of the app is an organization with the confusing name of the Guardian Project. Its main mission is creating apps that help journalists and others “protect their communications and personal data from unjust intrusion and monitoring.”

My guest today is the founder of the Guardian Project, Nathan Freitas. He joins us by cellphone from a hotel bar in Thailand, where it is 11:30 at night.

Nathan, welcome to the podcast.

Nathan Freitas: Thanks for having me, Steven. Excited to be here to talk about our work.

Steven Cherry: So what are some fakes that your app has helped uncover?

Nathan Freitas: An example might be during the Tibetan uprising in 2008, there was a photo that was being circulated of a man with a machete sort of attacking, maybe, some Chinese tourists in Lhasa. And then there was another photo of a man without the machete, and everyone wanted to believe that Tibetans were not being violent, that this was peaceful. But it was also very confusing—which was the real photo and who was backing these various versions of this photo that were being distributed. So that’s one example.

And there’s another famous example, where there was a Palestinian activist who was shot by a tear gas canister and was killed, died from the impact. And the question was, Did the Israeli soldier shoot the canister in a way that was, you know, sort of weapon-like, or was it truly an accident? And this is something our partner Witness had worked on, combining multiple pieces of footage and evidence from the scene into a 3-D model. It was a very laborious process, but using the video data as the basis, they were able to determine that, indeed, the shot must have come in a downward direction, you know, aimed at the person.

Steven Cherry: Yes, so your application, which goes by the name of InformaCam, it starts by looking at the metadata that a mobile device generates. I guess we’re all used to seeing the date and time stamp on a photo, but there’s a lot more that isn’t shown.

Nathan Freitas: Exactly. The modern smartphone is full of sensors. You can think of it as a mobile sensor platform in your pocket. Most of the time, the sensor that a normal user will interact with is the GPS—you know, the global positioning sensor that tells you your latitude and longitude, and then perhaps also the altitude and compass, your heading.

Increasingly, users might know about the accelerometer or the gyroscope, based on games they play. There are also heat sensors and light sensors, and even more coming for health applications. All of this data is there to be stored and tracked and correlated with any photo you take or audio you record or video you record, but it’s just not happening. These two sources are not being combined. And that is the breakthrough of InformaCam: pairing these sensors with the media-capture process.
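
To make that pairing concrete, here is a minimal sketch, in Python rather than the Android code the app actually uses, of how a capture pipeline might bundle a snapshot of sensor readings with the media it has just recorded. The field names and readings are illustrative, not InformaCam’s real schema.

```python
import hashlib
import json
import time

def capture_with_sensors(image_bytes: bytes, sensors: dict) -> str:
    """Pair a media capture with a snapshot of device sensor readings.

    `sensors` is a hypothetical dict of current readings (GPS fix,
    compass heading, accelerometer, light level, and so on). Hashing
    the image ties the readings to this exact capture.
    """
    bundle = {
        "captured_at": int(time.time()),  # capture timestamp
        "media_sha256": hashlib.sha256(image_bytes).hexdigest(),
        "sensor_snapshot": sensors,
    }
    return json.dumps(bundle, indent=2)

# Example: the kind of readings a capture pipeline might hand over
print(capture_with_sensors(b"\xff\xd8...", {
    "gps": {"lat": 13.7563, "lon": 100.5018, "alt_m": 3.0},
    "compass_heading_deg": 212.4,
    "accelerometer_ms2": [0.02, -0.01, 9.81],
    "light_lux": 180,
}))
```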

Steven Cherry: So when somebody, when Witness gives you a video and says, “Was this really taken, you know, in Palestine at this date and time,” and so forth, you’re looking into the digital file itself. And I guess at the very beginning of the file there’s some of this information.

Nathan Freitas: Right. So in a typical file taken from a non-InformaCam-enabled device, you’ll have a variety of different headers in the file. So maybe a JPEG photo from a camera will have something known as EXIF, which is typically the date and time, the make and model of the camera, perhaps a GPS location, but not much more. A QuickTime movie from Apple or from an iPhone might have a date stamp and a modified stamp, and every format has something slightly different. If something is taken on an InformaCam device using the InformaCam app, it will have a new kind of metadata header in it that we call the J3M [pronounced “gem”], which is the JSON Evidentiary [Mobile] Multimedia Metadata standard.

Essentially, we’re saying it’s a very rich source of data. It has all of this sensor data in it. It’s easy to parse. It’s a standard format. And not only is the data there, but it also has a cryptographic signing layer, so that you can verify it hasn’t been manipulated, and that the media itself hasn’t been manipulated. So beyond just the data, chain of custody is very important, and knowing that, you know, even though this video was taken a month ago, and it took a month to travel out of Syria by mule over the border, we know that it hasn’t been modified in that month, and the data can show that as well.
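
As an illustration of that signing layer, here is a minimal sketch of signing and then verifying a J3M-style bundle. It uses Ed25519 from the Python cryptography package purely as a stand-in for whatever signing scheme the app actually uses, and the field names are placeholders rather than the real J3M format.

```python
import hashlib
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def sign_metadata(key: Ed25519PrivateKey, media: bytes, metadata: dict) -> dict:
    """Bind the sensor metadata to the media file, then sign the bundle."""
    metadata = dict(metadata, media_sha256=hashlib.sha256(media).hexdigest())
    payload = json.dumps(metadata, sort_keys=True).encode()
    return {"j3m": metadata, "sig": key.sign(payload).hex()}

def verify_metadata(pubkey, media: bytes, bundle: dict) -> bool:
    """Check the signature and that the media file itself is unmodified."""
    payload = json.dumps(bundle["j3m"], sort_keys=True).encode()
    try:
        pubkey.verify(bytes.fromhex(bundle["sig"]), payload)
    except InvalidSignature:
        return False
    return bundle["j3m"]["media_sha256"] == hashlib.sha256(media).hexdigest()

key = Ed25519PrivateKey.generate()
media = b"...video bytes..."
bundle = sign_metadata(key, media, {"gps": {"lat": 31.53, "lon": 34.47}})
assert verify_metadata(key.public_key(), media, bundle)            # intact
assert not verify_metadata(key.public_key(), b"tampered", bundle)  # altered file fails
```

The point is the one Freitas makes: a verifier who trusts the signing key can detect any change to either the metadata or the media, no matter how long the file took to reach them.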

Steven Cherry: So just to be clear: This isn’t a way of, I don’t know, authenticating a YouTube video. Basically, Witness is an organization that, among other things, hands out video equipment to people with the idea that they can take videos of potential human rights abuses. We should mention, I guess, the singer and activist Peter Gabriel was one of the founders, and it got its start after the Rodney King beating, which happened to be caught on video in 1991, before modern cellphones even had cameras. So the idea is that with this application as well, when the video is taken, all this additional metadata that doesn’t normally attach to a video does attach, including this encryption layer.

Nathan Freitas: Yes, that’s correct. And the legacy of Witness, or the track record of Witness, is such that they for 20 years have been successfully training groups that need to use visual media to defend themselves or their community or their cause. They’ve trained people and handed out video cameras. Now, more and more, people don’t know what to do with video cameras, or don’t want video cameras, and they say, “Well, I already have an Android phone, or an iPhone, can’t I just use that?” And so Witness had this realization a few years ago and said, “Wow, we’d better transition into this modern age and start developing software to run on existing phones.” And it’s a lot cheaper for them, and it also means that, you know, instead of giving out 100 cameras a year, they can distribute hundreds of thousands of copies of the software.

And so we actually have another application called ObscuraCam, which is sort of the alter ego of InformaCam. And ObscuraCam is for pixelating, or redacting, images and videos, so removing a face or blurring a shirt or removing some sensitive piece of information before you post to YouTube. And in that case, we built an app, and we have 100 000 downloads in the Google Play store. But really what these are, both ObscuraCam and InformaCam, are reference designs. And with ObscuraCam, we’ve been able to demonstrate what we’ve done to YouTube. And then subsequently, YouTube’s sort of human rights division has successfully lobbied to add face-blurring to YouTube. So if you upload to YouTube, now you can pixelate it, or it will automatically look for faces if you choose.
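
For a sense of what that redaction step involves, here is a rough sketch that pixelates detected faces using OpenCV’s stock Haar-cascade detector. ObscuraCam’s own Android pipeline works differently, and the file names here are placeholders.

```python
import cv2

def pixelate_faces(in_path: str, out_path: str, blocks: int = 12) -> None:
    """Detect faces and replace each with a coarse pixelated block."""
    img = cv2.imread(in_path)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
        face = img[y:y + h, x:x + w]
        # Downscale then upscale with nearest-neighbor to get the block effect.
        small = cv2.resize(face, (blocks, blocks), interpolation=cv2.INTER_LINEAR)
        img[y:y + h, x:x + w] = cv2.resize(small, (w, h), interpolation=cv2.INTER_NEAREST)
    cv2.imwrite(out_path, img)

pixelate_faces("protest.jpg", "protest_redacted.jpg")
```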

And the idea with InformaCam is exactly the same. We want to demonstrate what’s possible. We want to create free software, and freely licensed engineering standards and code and reference designs, and then we want every camera manufacturer out there to support these standards in the future. So we are starting with the app, because that’s the easiest, but our plans for impact are quite large. If you get into a car accident and you need to make an insurance claim, or a claim for your house, or other mundane parts of everyday life where people know to take a photo and submit it, we’re actually hoping InformaCam becomes sort of, you know, that broad.

Steven Cherry: Is there a concern? I mean, do insurance companies say, “How do we know this is an authentic photo?”

Nathan Freitas: Yeah, I think that what we’re seeing, I mean, going back to YouTube with, say, Occupy Wall Street, there were actually a number of cases where YouTube footage was used to either, you know, substantiate an arrest, or to say, “Nope, the police officer was lying. We have this YouTube clip.” But what people like the International Bar Association and the International Criminal Court are saying is, once the courts wake up and realize that there’s no chain of custody for all this social media footage, it’s going to be thrown out, and by extension anything that doesn’t have a way to really be verified will be thrown out as viable evidence.

Steven Cherry: So, really, this lets everybody become a human rights abuse videographer if they want to, and if they download the apps in advance and use them, then turning the video over to somebody like Witness would help with authentication. Now for us budding human rights activists in the field, you have two other apps that would be phenomenally useful, right? One of them is Orbot, and the other is Ostel, and the one lets people browse online anonymously, and the other lets them make secure, anonymous phone calls. Tell us how they work.

Nathan Freitas: Orbot is kind of our core engine that a lot of our apps are based around. It’s a building block in that it defends your network connection. It’s one part anonymity, but it’s also sort of antisurveillance and antifiltering.

So there was news this week that Egypt has blocked YouTube because of the very poorly made anti-Islamic film that we don’t really need to talk about. So Egypt has blocked YouTube for a month, and this means that if you’re in Egypt and there’s something you want to put on YouTube, you can’t upload it, because it’s blocked. Orbot enables a smartphone to route around those sorts of blocks. It also provides anonymity, so if you are uploading to YouTube, YouTube doesn’t necessarily know your IP address and can’t tie it back to a specific person. And, you know, this is critical as well: Everyone from activists to battered women or people who are being stalked often use this to avoid being tracked by their IP address by a spouse or former spouse. This is a common use.

And law enforcement uses it. You know, if you’re the FBI investigating organized crime and visiting a website, you don’t want them to see that the FBI visited their website, so anonymity has broad applications, and we provide that through something called Tor, the Tor Project. It’s a partner of ours, and we use their software on the smartphones. So that’s one piece, and that’s definitely our biggest app at this point. And Ostel is what we call an open-source, open-standard secure telephony network.
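
To show what Orbot provides from an application’s point of view, here is a small sketch that sends a request through a local Tor SOCKS proxy, so the destination sees a Tor exit node’s address rather than yours. It assumes Tor (or Orbot) is listening on its default SOCKS port, 9050, and that the requests library has SOCKS support installed (requests[socks]).

```python
import requests

# socks5h (rather than socks5) also resolves DNS through Tor.
TOR_PROXY = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

direct = requests.get("https://api.ipify.org").text
via_tor = requests.get("https://api.ipify.org", proxies=TOR_PROXY).text
print("IP seen directly:   ", direct)
print("IP seen through Tor:", via_tor)  # should differ from the direct IP
```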

Steven Cherry: Now, all of these apps are written for Android. We’ve heard from other developers that the iPhone is hard to develop for. Is that your reason as well?

Nathan Freitas: No, our choice of Android came from my instinct that, in the broader global world, it would be the platform that had the more affordable devices and the wider impact amongst people who tended to be working on human rights activism. And so what you see in a place like China is that while, you know, the rising upper middle class and the wealthier upper class of Chinese go crazy over iPhones in Beijing and Shanghai, 80 percent of the smartphone market is Android.

And what you’re seeing in places like Kenya or Nigeria or South Africa, for sure, is that Android is the US $100 smartphone that people are adopting, and that iPhones are just way out of reach. So when you’re talking about human rights and activism, it’s just a different set of judgments you make about who your target users are. We are increasingly looking at supporting the iPhone because it is getting more affordable in some cases. And it’s a great platform. I mean, Apple makes great technology. And so for some of the software, like InformaCam and ObscuraCam, we have work underway to port it to the iOS environment.

You know, Apple is a bit of a control freak, and again, if you do anything controversial as well, they reserve the right to not allow it in the app store. So one of my students…I teach at NYU, and one of my students built an app that was about tracking drone strikes. And Apple didn’t allow it in the app store, even though this was his project for the entire semester. So there are some downsides, again, with Apple, I mean, when you come at it from a sort of human rights and activism perspective.

Steven Cherry: Well, Nathan, I’ve been reporting on anonymizing software and servers and networks for more than a decade, and, of course, the effort to create them goes back a long time before that. Thanks for this important work and for speaking with us today.

Nathan Freitas: Thanks a lot, Steve, and we look forward to building more tools that your audience can use, and make their smartphones more than a distraction, and actually an empowering piece of technology in their lives.

Steven Cherry: We’ve been speaking with Nathan Freitas, developer, activist, and founder of the Guardian Project, about software that helps activists in the field and activist organizations back home.

For IEEE Spectrum’s “Techwise Conversations,” I’m Steven Cherry.

This interview was recorded 13 February 2013.
Segment producer: Barbara Finkelstein; audio engineer: Francesco Ferorelli
Read more “Techwise Conversations,” find us in iTunes, or follow us on Twitter.

NOTE: Transcripts are created for the convenience of our readers and listeners and may not perfectly match their associated interviews and narratives. The authoritative record of IEEE Spectrum’s audio programming is the audio version.
