North America

Microsoft-Funded Professor Builds Software to Fight Terrorism

Dartmouth College computer science professor Hany Farid — using funding from Microsoft Corp. — has developed technology to help scrub extremist content from the internet.

Working with the nonprofit think tank Counter Extremism Project, Farid built software capable of identifying and tracking photo, video and audio files, even if they’ve been altered. The software, unveiled Friday, would allow websites such as Facebook Inc. to automatically catch flagged content and remove it or prevent it from being uploaded.
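The article does not describe Farid's algorithm in detail, but matching media "even if they've been altered" is the hallmark of perceptual (robust) hashing: a compact fingerprint that changes little under small edits, compared by Hamming distance against a database of flagged content. The sketch below is a toy average-hash demo under that assumption, not Farid's actual software.

```python
# A minimal sketch of perceptual ("robust") hashing, the general family
# of techniques behind content-matching systems like the one described.
# NOT Farid's actual algorithm -- an illustrative toy average-hash only.

def average_hash(pixels, size=4):
    """Hash a grayscale image (list of equal-length rows) to a bit string."""
    h, w = len(pixels), len(pixels[0])
    bh, bw = h // size, w // size
    # Downscale by block-averaging into a size x size grid.
    blocks = [
        sum(pixels[y][x]
            for y in range(i * bh, (i + 1) * bh)
            for x in range(j * bw, (j + 1) * bw)) / (bh * bw)
        for i in range(size) for j in range(size)
    ]
    mean = sum(blocks) / len(blocks)
    # Each block becomes one bit: above or below the overall mean.
    return ''.join('1' if b > mean else '0' for b in blocks)

def hamming(a, b):
    """Number of differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

# A tiny 8x8 "image": bright left half, dark right half.
img = [[200] * 4 + [20] * 4 for _ in range(8)]
# A brightness-shifted copy: the hash is unchanged, so it still matches.
altered = [[min(255, p + 30) for p in row] for row in img]

h1, h2 = average_hash(img), average_hash(altered)
print(hamming(h1, h2))  # -> 0: same fingerprint despite the alteration
```

Because the bits encode only relative brightness, uniform edits like brightness shifts or mild recompression leave the fingerprint intact, which is what lets a site catch re-uploads of flagged files automatically.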

On a call to discuss the technology, Farid, who is also a senior advisor to the CEP, said his software would allow companies to automatically remove posts that violate the sites’ terms of use. He also said deleting the content is not a freedom of speech issue because the companies have the right to dictate what’s suitable.



“We allow them to do it fast, accurately, automatically,” he said.

Many internet and social media companies, including Facebook and Twitter Inc., already have rules prohibiting posts from organizations involved in terrorist activity or organized crime, as well as excessively violent or graphic content. But objectionable content gets posted anyway, and its removal relies on manual flagging — more of a “Whack-a-Mole” approach, Farid said.

Read more at Bloomberg

Image courtesy of hdw.eweb4.com

About the SOFREP News Team

The SOFREP News Team is a collective of professional military journalists, managed by Editor-in-Chief Brandon Tyler Webb and Managing Editor Guy D. McCardle.
