An Advanced Guide to Verifying Video Content


Aric Toler started volunteering for Bellingcat in 2014 and has been on staff since 2015. He currently heads up Bellingcat's training and research work, with a focus on Eurasia and Eastern Europe.

One of the most common tasks for researchers and journalists is verifying user-generated video content, most of which surfaces on social networks and file-sharing platforms such as YouTube, Twitter, and Facebook. There is no magic bullet that verifies every video, and it can be nearly impossible to verify some videos without obtaining the original file from the source. However, a handful of methods will let us verify most content, in particular by making sure that videos claiming to show breaking news events have not been recycled from earlier ones. There are many online guides to video verification, the best known being the Verification Handbook. This guide covers some additional tricks the Bellingcat team uses frequently and tries to offer readers ways around the limitations of the available tools. After reading it, you will hopefully not only know how to use this toolkit, but also how to work creatively around dead ends.

The first step in verifying video content is the same as for images: run a reverse image search through Google or another service such as TinEye. There is currently no free tool that lets you reverse search an entire video clip the way you can an image file, but we can do the next best thing by running thumbnails and screenshots through a reverse image search. People who produce fake videos are rarely very creative; they usually repost an easy-to-find video that has no obvious mismatches with the claimed event, such as a news chyron or an audio track in a language that does not fit the supposed location. This makes recycled videos relatively easy to fact-check.
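If you have a local copy of the clip, pulling screenshots by hand gets tedious. Below is a minimal sketch, assuming the third-party OpenCV package (opencv-python) is installed and a hypothetical local file named clip.mp4, that saves one frame every few seconds so each can be fed into a reverse image search.

```python
# Minimal sketch: extract periodic frames from a local video for reverse image searching.
# Assumes opencv-python is installed; "clip.mp4" is a hypothetical input file.
import cv2

def extract_frames(video_path: str, every_n_seconds: int = 5) -> list[str]:
    """Save one frame every N seconds and return the saved file names."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25  # fall back if FPS metadata is missing
    step = int(fps * every_n_seconds)
    saved, index = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:
            name = f"frame_{index:06d}.jpg"
            cv2.imwrite(name, frame)
            saved.append(name)
        index += 1
    cap.release()
    return saved

if __name__ == "__main__":
    print(extract_frames("clip.mp4"))
```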

There are two ways to run this search. The first is to take screenshots of the video manually, ideally at the start of the clip or at key moments, and upload them to a reverse image search service such as Google Images. The second is to rely on the thumbnails generated by the video host, usually YouTube. There is no simple way to predict which frames a video will use as thumbnails, as Google has developed a sophisticated algorithm for selecting the best thumbnails for uploaded videos (for more information, see the Google Research blog entry on the subject). Perhaps the best tool for finding these thumbnails is Amnesty International's YouTube DataViewer, which generates the thumbnails for a YouTube video and lets you run a reverse image search on each of them with one click.
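For readers who want to script this step, the sketch below is a rough approximation of what the DataViewer does, not its actual code: it builds the standard auto-generated thumbnail URLs for a YouTube video ID and turns each one into a reverse image search link. The query formats shown have worked historically but are assumptions and may change without notice.

```python
# Rough approximation of the thumbnail step: assumed URL patterns, not an official API.
from urllib.parse import quote

def thumbnail_urls(video_id: str) -> list[str]:
    # YouTube exposes several auto-generated thumbnails per video under img.youtube.com.
    names = ["0.jpg", "1.jpg", "2.jpg", "3.jpg", "hqdefault.jpg", "maxresdefault.jpg"]
    return [f"https://img.youtube.com/vi/{video_id}/{name}" for name in names]

def reverse_search_links(image_url: str) -> dict[str, str]:
    # These query formats have worked historically but may change without notice.
    encoded = quote(image_url, safe="")
    return {
        "google": f"https://www.google.com/searchbyimage?image_url={encoded}",
        "tineye": f"https://tineye.com/search?url={encoded}",
    }

if __name__ == "__main__":
    for thumb in thumbnail_urls("zX7gu_gS3zE"):  # the Action Tube video discussed below
        print(thumb)
        print(reverse_search_links(thumb))
```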

For example, a YouTube aggregator called Action Tube recently posted a video allegedly showing a convoy of military equipment in Lithuania, without providing any source material. There is also no indication of when the video was filmed, meaning it could have been taken yesterday or five years ago.

https://www.youtube.com/watch?v=zX7gu_gS3zE

If we run this video through the Amnesty International tool, we find the exact date and time Action Tube uploaded it, along with four thumbnails we can reverse search to look for the original source of the video.
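The same metadata the DataViewer displays (the exact upload timestamp plus the auto-generated thumbnails) can also be fetched directly from the YouTube Data API v3 videos.list endpoint. The sketch below assumes the third-party requests package and a valid API key in the placeholder YT_API_KEY; it is an illustration of the API, not the tool's own implementation.

```python
# Sketch: fetch upload time and thumbnail URLs via the YouTube Data API v3 (videos.list).
import requests

YT_API_KEY = "YOUR_API_KEY"  # placeholder: supply your own key

def video_metadata(video_id: str) -> dict:
    resp = requests.get(
        "https://www.googleapis.com/youtube/v3/videos",
        params={"part": "snippet", "id": video_id, "key": YT_API_KEY},
        timeout=10,
    )
    resp.raise_for_status()
    items = resp.json().get("items", [])
    if not items:
        raise ValueError(f"no video found for id {video_id!r}")
    snippet = items[0]["snippet"]
    return {
        "publishedAt": snippet["publishedAt"],  # upload time in UTC, ISO 8601
        "channel": snippet["channelTitle"],
        "thumbnails": [t["url"] for t in snippet["thumbnails"].values()],
    }

if __name__ == "__main__":
    print(video_metadata("zX7gu_gS3zE"))
```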

None of the results gives us an immediate hit on the original source; however, many of the results for the third thumbnail point to videos that at one point displayed this thumbnail on their page. If you click through to those videos you may not find the thumbnail, because the "up next" results on the right side of the YouTube page are tailored to each user. However, the video carrying this thumbnail was present when Google cached the page, meaning you can find it on the cached version of the page.

Again, none of these five results is the source of the video we are looking for, but when Google cached snapshots of their pages, the source video appeared among their suggested thumbnails. When we check the cached page of the first result above, we find the source of the video shared by Action Tube, titled "Enhanced Forward Presence Battle Group Poland Conducts a Road March to Rukla, Lithuania."

We now have all the information we need to track down the original video and verify that the Action Tube upload does indeed show military equipment recently moved into Lithuania. Searching for the title found in the thumbnail results turns up six uploads; if we sort them by date, we can find the earliest one, which served as the source material for Action Tube.
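This "sort the uploads by date" step can also be automated with the YouTube Data API's search.list endpoint, as in the sketch below. It again assumes the requests package and a placeholder API key, and uses the title recovered above as the query; it is one possible way to script the step, not the method used in the original research.

```python
# Sketch: search YouTube for an exact title and return the earliest matching upload.
import requests

YT_API_KEY = "YOUR_API_KEY"  # placeholder: supply your own key

def oldest_upload(title: str) -> dict:
    resp = requests.get(
        "https://www.googleapis.com/youtube/v3/search",
        params={
            "part": "snippet",
            "q": title,
            "type": "video",
            "maxResults": 25,
            "key": YT_API_KEY,
        },
        timeout=10,
    )
    resp.raise_for_status()
    items = resp.json().get("items", [])
    if not items:
        raise ValueError("no results for that title")
    # Sort matches by upload timestamp; the earliest is the best candidate for the source.
    items.sort(key=lambda item: item["snippet"]["publishedAt"])
    first = items[0]
    return {
        "videoId": first["id"]["videoId"],
        "publishedAt": first["snippet"]["publishedAt"],
        "channel": first["snippet"]["channelTitle"],
    }

if __name__ == "__main__":
    print(oldest_upload(
        "Enhanced Forward Presence Battle Group Poland Conducts a Road March to Rukla, Lithuania"
    ))
```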

This brings us to a video uploaded on June 18, 2017, the day before the June 19 Action Tube upload, by "Maj Anthony Clas". It is the same video that Action Tube shared.

https://www.youtube.com/watch?v=_4kHuTs1Nog

A quick search on the uploader shows that he has written articles about NATO activities in Europe for the U.S. Army website, meaning he is likely a communications officer. This gives us further confidence that his upload is the original source of the Action Tube video.

Although reverse image searching can unmask many fake videos, it is not a perfect solution. For example, the video below has more than 45,000 views and purports to show fighting between Ukrainian soldiers and Russian-backed separatist forces near Svitlodarsk in eastern Ukraine. The title translates as "Battle in the Svitlodarsk bulge area of the Donbas (filmed from the perspective of the Ukrainian Armed Forces)." There is heavy gunfire and artillery, and the soldiers appear to be laughing as the fighting unfolds.

When we enter the video's URL into Amnesty International's tool, we see the exact date and time it was uploaded, along with thumbnails we can reverse search.

Looking through the results, almost all of them date from roughly the same time as the upload, which makes the video appear to be a genuine depiction of fighting near Svitlodarsk in December 2016.

However, the video actually came from a Russian military training exercise in 2012.

Even the most creative reverse Google image searches and uses of the Amnesty International tool will not surface the original video in the results, except in an article describing the spread of the fake. For example, if we search for the exact title of the original video ("кавказ 2012 учения ночь", meaning "Kavkaz 2012 exercises night", referring to the Kavkaz 2012 military exercise) along with screenshots from the video, we only find results for the fake Svitlodarsk video. Recognizing that the video is fake requires one of two things: familiarity with the original video, or a keen eye (or ear) telling you that laughing soldiers do not fit the supposed battle.

So what can be done? There are no easy answers beyond creative searching. One of the best approaches is to try to think like the person sharing a potentially fake video. In the example above, the laughing soldiers give you a clue that this may not be a real battle, prompting the question: under what circumstances would Russian-speaking soldiers film an incident like this and laugh about it? If you wanted to find such a video, what would you search for? You would probably want night footage, so that fewer identifying details are visible. You would also look for dramatic-looking combat footage that Ukrainians and Russians would not easily recognize after years of war in the Donbas; exercise footage from the Russian, Ukrainian, or Belarusian armed forces could fit the bill, unless you found combat footage from another country and dubbed Russian speakers over it. Searching in Russian for phrases such as "training exercise" and "night" brings up this video as the first result. If you cannot find the original video, the best way to verify the footage is to contact the uploader.

Using digital tools to verify material has built-in limits, because algorithms can be fooled. People generally use simple tricks to avoid detection by reverse image search: mirroring the video, changing the color scheme to black and white, zooming in or out, and so on. The best way to overcome these tricks is to pay attention to the details, verifying every element of the video to make sure its surroundings are consistent with the event at hand.
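A small experiment illustrates why such simple edits work. The sketch below assumes the third-party Pillow and imagehash packages and a hypothetical screenshot named frame.jpg; it shows how mirroring or cropping a frame shifts its perceptual hash away from the original, which is the kind of drift that lets edited reuploads slip past automated matching.

```python
# Illustration: simple edits (mirroring, cropping) move a frame's perceptual hash.
# Assumes Pillow and imagehash are installed; "frame.jpg" is a hypothetical screenshot.
from PIL import Image, ImageOps
import imagehash

original = Image.open("frame.jpg")
mirrored = ImageOps.mirror(original)  # horizontal flip
w, h = original.size
cropped = original.crop((w // 10, h // 10, w - w // 10, h - h // 10))  # mild "zoom"

base = imagehash.phash(original)
for label, variant in [("mirrored", mirrored), ("cropped", cropped)]:
    distance = base - imagehash.phash(variant)  # Hamming distance between hashes
    print(f"{label}: hash distance {distance}")
```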

On September 19, 2016, reports emerged that the man responsible for three bombings in New York City and New Jersey had been arrested in Linden, New Jersey. Photographs and videos from various sources began to appear, including the following two photographs showing the suspect, Ahmad Khan Rahami, surrounded by police.

The exact address where he was arrested in Linden was not immediately known, but the two photographs were almost certainly genuine, as they show roughly the same scene from two different angles. The video embedded below also came from a local resident. In hindsight the video was clearly real, since it was shared widely by news outlets throughout the day, but how could we have verified it quickly, in the middle of a breaking news event?

The two photographs let us quickly work out where Rahami was arrested. In the bottom left of the second photograph we can see part of an advertisement with four digits of a phone number (8211) and word fragments such as "-ARS" and "-ODY". We can also see a nearby intersection sign for Highway 619, letting us narrow down the location further. Searching for businesses in Linden, New Jersey with a phone number containing 8211 leads us to Fernando's Auto Sales & Body Work, which completes the "-ARS" and "-ODY" fragments: cars and body. We can also find Fernando's address: 512 E Elizabeth Ave, Linden, NJ.

Checking the address on Google Street View lets us quickly confirm that we are on the right track.

Left: Photo of the suspect being arrested in Linden, New Jersey. Right: Google Street View imagery of the same location

The weather in the two photographs and the related video is the same: overcast and damp. Twenty-six seconds into the video, the driver passes a street sign for "Bower St" and another intersection marker for Highway 619, giving us a location to cross-check against the one we found from the two photographs.

A quick glance at Google Maps shows that Bower Street intersects East Elizabeth Avenue. The suspect was arrested near the auto body shop (marked with a yellow star).
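If you want to make that cross-check explicit, a quick distance calculation between the two geolocated clues (the auto body shop from the photographs and the Bower St / E Elizabeth Ave intersection from the video) confirms they sit within a plausible range of each other. The coordinates in the sketch below are placeholders to be read off Google Maps, not measured values.

```python
# Sanity-check sketch: great-circle distance between two geolocated clues.
# Coordinates are placeholders; replace them with values read from Google Maps.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two points, in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

auto_shop = (40.6310, -74.2520)     # placeholder: 512 E Elizabeth Ave
intersection = (40.6305, -74.2535)  # placeholder: Bower St & E Elizabeth Ave

print(f"Distance between the two clues: {haversine_m(*auto_shop, *intersection):.0f} m")
```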

If you have more time, you can pin down the exact spot where the video was filmed by comparing features visible in the video with Google Street View.

Left: Video from the day Rahami was arrested in Linden, New Jersey. Right: Google Street View imagery

While each of these steps may seem like a lot of work, the whole process should take no more than five minutes once you know what to look for. If you cannot reach the eyewitness who filmed the incident, verifying their footage requires nothing more than close attention to detail and a bit of legwork on Google Maps and Street View. Verifying video material should be a routine part not only of reporting, but also of sharing content on social networks, which is one of the fastest ways fake news spreads.

Compared with photographs, digitally altering a video so that elements are added or removed while still looking natural takes far more effort and skill. More often, videos are altered not to evade fact-checkers but to evade the algorithms that detect copyrighted content. For example, a film, TV show, or sporting event may be uploaded to YouTube as a mirrored video so that it is still watchable (if slightly disorienting) while avoiding DMCA takedowns. The quickest way to spot a mirrored video is to look for any text or numbers, which will look strange once flipped.
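A tiny helper along these lines, assuming Pillow and a hypothetical screenshot named suspect_frame.jpg: flip the frame horizontally so that any mirrored text or numbers (shop signs, licence plates, news tickers) become readable again and can be compared against the suspected source.

```python
# Un-mirror a suspect frame so flipped text and numbers read correctly again.
# Assumes Pillow is installed; "suspect_frame.jpg" is a hypothetical screenshot.
from PIL import Image, ImageOps

frame = Image.open("suspect_frame.jpg")
ImageOps.mirror(frame).save("suspect_frame_unmirrored.jpg")  # horizontal flip
```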

In the series of screenshots below, footage from the 2011 attack on Moscow's Domodedovo Airport was reused in fake videos about the Brussels and Istanbul airport attacks. Tricks used by the fakers include zooming into the clip, adding fake timestamps, and changing the color scheme to black and white. Flashy logos are also often layered over the footage, making reverse image searching even harder.

There is no easy way to detect these fakes with tools alone; you have to rely on common sense and creative searching. Just as with the Russian training video repurposed as fresh combat footage, think about what the faker would have searched for to find the source material. Searching for terms such as "airport bombing" or "CCTV terrorist attack" will surface the Domodedovo footage far faster than running screenshots through a reverse Google image search.

Many people see technological advances as the eventual remedy for fake news and fake content, but it is hard to imagine any digital method that could weed out fake videos and verify content with anything close to complete accuracy. In other words, unless strict controls on content sharing are imposed on social networks and YouTube, the arms race between developers and even semi-creative fake video makers is, at this point, a losing battle. Digital toolsets are important for verifying fake content, but a creative mindset matters even more.
