Tuesday, March 28, 2023

It’s one thing to prove a photo is fake. Proving it isn’t is quite another


That truth is the first casualty of war is an old adage. A recent illustration is the dissemination of images and videos of things that did not happen in the ongoing wars in Ukraine and Syria. Some of these are outright fakes. Others are manipulated versions of honestly recorded material. Last year, an edited video surfaced of Ukrainian President Volodymyr Zelensky apparently asking Ukrainian troops to surrender.

However, the proliferation of such fakes has given rise to a second, more subtle approach to lying with images. It is to use the ubiquity of fakes to cast doubt on the veracity of inconvenient photographs that are in fact real. For example, shortly after Russia invaded Ukraine last year, the Associated Press released a video of doctors failing to revive a young girl wounded in the shelling of Mariupol. The footage soon appeared on Russian television with the word “fake” stamped on it. Since it is difficult to prove a negative (that is, that the material has not been tampered with), such evidence can be challenged, even in court, and charges based on it may fail to stand.

Methods of establishing the authenticity of digital imagery would therefore be valuable. And some are available now. So-called “glass-to-glass” warning systems create specialised software ecosystems within which images and videos can be taken, stored and transmitted in a way that alerts viewers to changes, regardless of when and where those changes are introduced in an image’s journey from camera lens to display screen.

A plate of hash


One such system has been developed by Eyewitness to Atrocities, a charity based in London. The charity’s eponymous app does two things. First, when a photo or video is taken by a phone paired with the app, it records the time and location of the event as reported by hard-to-deny electronic witnesses: GPS satellites, nearby mobile-phone towers and Wi-Fi networks. This is known as controlled capture of metadata, and is more secure than collecting such metadata from the phone itself, since the phone’s time and location settings can be changed.

Second, the app reads the entire digital sequence of the image (the zeros and ones that represent it) and uses a standard mathematical formula to calculate an alphanumeric value, known as a hash, that is unique to that picture. This done, it puts the metadata and the hash into a file, called a proof bundle, that is kept separate from the image, and sends an encrypted copy of the image and its proof bundle to a special server.
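The hashing step uses standard cryptography. Here is a minimal sketch in Python, assuming SHA-256 as the formula (the article does not say which one the app uses) and hypothetical field names for the proof bundle:

```python
import hashlib
import json

def make_proof_bundle(image_bytes, latitude, longitude, captured_at):
    """Build a proof bundle kept separate from the image itself."""
    # Hash the image's entire byte sequence; changing even a single
    # pixel would produce a completely different digest.
    digest = hashlib.sha256(image_bytes).hexdigest()
    # Controlled-capture metadata, as reported by external witnesses
    # (GPS satellites, cell towers), not by the phone's own settings.
    return json.dumps({
        "hash": digest,
        "algorithm": "sha-256",
        "latitude": latitude,
        "longitude": longitude,
        "captured_at": captured_at,
    })

bundle = make_proof_bundle(b"raw image bytes", 47.0951, 37.5413,
                           "2022-03-11T10:40:00Z")
```

The bundle travels alongside, but not inside, the image, so the image file itself stays byte-for-byte identical to what the camera produced.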

Wendy Betts of Eyewitness to Atrocities describes this server as a digital evidence locker. When the authenticity of an image needs to be verified, it is enough to re-scan its digital sequence, recalculate its hash and ask the repository whether it contains a matching one. If even a single pixel of the image has been changed, the recalculated hash will not match the original. If it does match, the image has not been retouched. As an additional service, around 80 lawyers work for the charity without pay for a few hours each week, reviewing incoming images. They package those that appear to record crimes into dossiers, which are then sent to prosecuting authorities, including Europol (a law-enforcement agency of the European Union), the International Criminal Court and Ukraine’s Office of the Prosecutor-General.
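Verification is the mirror image of capture: recompute the hash and look it up. A sketch, again assuming SHA-256 and modelling the evidence locker as a simple set of digests:

```python
import hashlib

def verify(image_bytes, evidence_locker):
    # Re-scan the image's digital sequence, recalculate the hash and
    # ask the repository whether it holds a matching entry.
    return hashlib.sha256(image_bytes).hexdigest() in evidence_locker

# The locker records the hash at capture time...
locker = {hashlib.sha256(b"footage as captured").hexdigest()}

# ...so any later alteration, however small, fails the lookup.
verify(b"footage as captured", locker)          # True: untouched
verify(b"footage as captured, edited", locker)  # False: altered
```

Note that the check is all-or-nothing: a match proves the bytes are unchanged, but a mismatch says nothing about what was altered, or by whom.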

Andrey Kostin, the prosecutor-general himself, is a fan of the Eyewitness system—and not just because it provides the proof of authenticity that courts require. He also likes the fact that it helps remove a second obstacle to his efforts: witnesses’ fear of being caught with incriminating material.

Making connections

In areas of Ukraine occupied by Russia, that is a serious risk. If, for example, soldiers stationed at a checkpoint were to search someone’s phone and find video evidence of war crimes collected by that person, the consequences could be grave. To make this less likely, the app’s icon does not reveal its purpose. Also, if the app is opened by an investigating officer and a wrong passcode is entered, it displays the phone’s normal photo gallery instead. Maryna Slobodianyuk, lead investigator at Truth Hounds, a human-rights group in Kyiv, said of the evidence of attacks she has collected using Eyewitness: “Even if I will be caught… no one will get to that.”
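The decoy behaviour described above can be sketched in a few lines. Everything here is hypothetical — the passcode, names and return values — since the app’s internals are not public:

```python
REAL_PASSCODE = "1397"  # hypothetical; the user's actual secret code

def screen_after_unlock(entered):
    # A wrong code does not refuse entry -- it quietly opens the
    # phone's ordinary photo gallery, hiding the evidence vault
    # from whoever is holding the phone.
    if entered == REAL_PASSCODE:
        return "evidence_vault"
    return "photo_gallery"
```

The design choice is deliberate: an explicit "wrong password" error would itself reveal that there is something to hide.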

The first edition of the Eyewitness system, which is free to use, was released in 2015, so most of the bugs have long since been dealt with. Uptake in Ukraine has soared over the past year. Ms Betts says that of the 40,000 submissions received in 2022 that her team considers relevant to investigations, more than 27,000 were sent from Ukraine.

Police officers and journalists are particularly keen users. So are analysts at the Ukrainian Healthcare Center, a think-tank in Kyiv that employs the app to gather evidence of attacks on medical facilities. Nor is Eyewitness the only provider of glass-to-glass services. The Guardian Project, based in Valhalla, New York, has released a smartphone app called ProofMode. Like Eyewitness, ProofMode adds controlled-capture metadata and a hash of the image to a proof bundle. However, rather than operating the recipient servers itself, the Guardian Project employs repositories run by other firms, such as Google, which act as notaries by logging the hashes. Viewers of an image taken with ProofMode can upload it to a Guardian Project website that recalculates its hash and checks the repository for a match. If it fails to find one, the image is declared changed.

Soon, the Guardian Project will add a new feature, Synchrony. It will link an image’s location and time of capture to OpenStreetMap, an online map of the world, and to a detailed geographical record of the world’s weather over the past few years (which record, exactly, has yet to be decided). This will make it easier to investigate discrepancies between the place and time someone claims a photograph was taken, and the local landscape and weather conditions on that day. The idea is to “sync the images to the real world, as it were”, says Nathan Freitas, founder of the Guardian Project. He hopes to link to other databases as well, including ones that record when and where street protests have occurred.
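The cross-check itself amounts to comparing claimed capture metadata against an independent record. A toy sketch, with the weather database modelled as a plain dictionary (the real data sources are, as noted above, yet to be decided, and this is not the Guardian Project’s actual code):

```python
def flag_discrepancy(claim, weather_records):
    """Return True if the claimed conditions contradict the record."""
    # claim: where and when the photographer says the image was taken,
    # plus the weather visible in the image itself.
    recorded = weather_records.get((claim["place"], claim["date"]))
    # No record for that place and day -> nothing to contradict.
    return recorded is not None and recorded != claim["weather"]

records = {("Kyiv", "2022-03-01"): "snow"}
flag_discrepancy({"place": "Kyiv", "date": "2022-03-01",
                  "weather": "clear"}, records)  # True: weather mismatch
```

A bright, snow-free scene claimed to have been shot on a day the record shows snowfall would be flagged for human review, not automatically declared fake.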

A third operator, Truepic, of La Jolla, California, is taking a more commercial approach. Charities pay nothing to use its software, but the companies that employ it to monitor things like supply chains, progress on construction sites, compliance with loan terms, and the whereabouts and condition of expensive kit, must pay.

Truepic offers two services. One scans smartphones for malware designed to facilitate the falsification of metadata. The other spots so-called rebroadcasting attacks, in which a tampered picture is re-photographed to produce a new image that carries no traces of tampering in its code. Munir Ibrahim, once a member of America’s diplomatic corps (he served, among other places, in Damascus, a hotbed of photographic deception) and now head of public affairs at Truepic, is coy about how this is done. But the trick, he notes, is to look for clues that all the pixels in an image have registered the same flat surface.

In 2021 Truepic joined with Adobe, Arm, the BBC, Intel and Microsoft to form the Coalition for Content Provenance and Authenticity (C2PA), which is trying to create a set of technical standards for image authentication aimed at makers of hardware and software. The aim is to eliminate the need to fuss with specialised apps. Instead, the alliance wants the metadata capture, hashing and transmission of data to a repository to happen behind the scenes, and without royalties.

If the C2PA standards were widely adopted, even web browsers would be able to check an online repository of hashes and put warnings on non-matching images. Eventually, hashes could be distributed automatically across blockchain ledgers. The Starling Lab, at Stanford University, is testing such a system.

However, hurdles remain. Jonathan Dotan, founding director of the Starling Lab, points to one in particular. The technology could allow authoritarian regimes to identify the devices, and therefore the people, that have taken damaging photographs. Researchers, he says, must first find a way to make such tracing impossible. Transparency is a great thing, but even its advocates recognise that, sometimes, there can be too much of a good thing.
