U.S. government asks Silicon Valley to build a terrorist-spotting algorithm

In a recent meeting, U.S. officials asked the executives of top technology companies to make Minority Report a reality.

According to Fusion, policymakers at a counterterrorism summit between government officials and the heads of Silicon Valley’s biggest technology companies floated the idea that a computerized system might be able to catch terrorists before they act.

The proposed algorithm would flag radical activity online and alert law enforcement to behavior that might indicate terrorism.

The suggestion was put to companies like Facebook, Twitter, and Google on the grounds that they could do more to help the authorities keep the world safe by monitoring the enormous volumes of user data on their platforms. These services could watch for warning signs or notify the authorities when possible indicators of radicalization are identified.
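To see how crude such flagging can be in practice, here is a deliberately simple sketch of the kind of pipeline the proposal seems to imagine: score posts against a watchlist and queue high scorers for human review. The watch terms, threshold, and review step are hypothetical stand-ins, not anything a company or agency has described.

```python
from typing import List

# Hypothetical watchlist and cutoff; real systems, if any exist,
# are not public and would be far more complex than word matching.
WATCH_TERMS = {"attack", "recruit", "pledge"}
THRESHOLD = 2

def score_post(text: str) -> int:
    """Count how many watch terms appear in a post."""
    return len(set(text.lower().split()) & WATCH_TERMS)

def review_queue(posts: List[str]) -> List[str]:
    """Return posts whose score meets the threshold, for human review."""
    return [post for post in posts if score_post(post) >= THRESHOLD]

print(review_queue(["planning to recruit and pledge", "cat photos"]))
# -> ['planning to recruit and pledge']
```

Even this toy version makes the core difficulty visible: someone has to decide what goes on the watchlist and where the threshold sits, and both choices encode subjective judgments about what counts as radical.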

The problems with this suggestion are abundant, especially in the absence of any concrete detail. The proposal is broad and ill-defined, a shortcoming even the White House has acknowledged.

Before Friday’s briefing, a White House memo that went out to summit participants acknowledged that such a system would raise privacy and civil liberties concerns, and that it’s “unclear” whether radicalization is as easily measurable as credit scores. But the memo said that “such a measurement would be extremely useful to help shape and target counter terrorism and efforts focused on countering violent extremism.”

Earlier this month, Andre McGregor, a former FBI terrorism investigator who is now director of security at the Silicon Valley security company Tanium Inc., told the Guardian: “It’s a very fine line to get that information. You’re essentially trying to take what is in someone’s head and determine whether or not there’s going to be some violent physical reaction associated with it.”

Recent events, especially the San Bernardino shooting, likely prompted the call for an algorithmic watchdog that could spot possible extremists before they act. It was widely reported that one of the shooters had pledged allegiance to ISIS on Facebook; the Federal Bureau of Investigation (FBI) later disputed that report, but not before it had circulated for weeks.

This is not the first time such an idea has been floated, either. Republican presidential candidate Carly Fiorina suggested earlier this year that the government isn’t using the right algorithms to find terrorists online, and claimed that her time as CEO of Hewlett-Packard gives her the standing to persuade Silicon Valley companies to help.

Last year, the United Kingdom’s parliamentary intelligence and security committee approached Facebook about helping to identify terrorists using its scripts and tracking methods. The move raised plenty of questions among experts, including data strategist Duncan Ross, who ran an analysis to find out just how accurate such a system could be.

According to the Guardian, Ross found that even if such a system were 99.9 percent accurate, which is significantly better than any real-world application, close to 60,000 people would still be misidentified as suspicious. That is an enormous pile of false leads for law enforcement to work through.
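Ross’s point is straightforward base-rate arithmetic. As a rough check, assuming a monitored population of about 60 million accounts (roughly the UK’s population) and reading “99.9 percent effective” as a 0.1 percent false-positive rate, a few lines of Python reproduce the figure; both numbers are assumptions for illustration, not taken from Ross’s published analysis.

```python
# Back-of-the-envelope base-rate check. Both figures are assumptions:
# ~60 million monitored accounts, and "99.9 percent effective" read
# as a 0.1% false-positive rate.
population = 60_000_000
false_positive_rate = 1 - 0.999  # 0.1%

false_positives = population * false_positive_rate
print(f"Innocent users flagged: {false_positives:,.0f}")  # ~60,000
```

And because actual terrorists are vanishingly rare in a population that size, nearly everyone flagged would be innocent.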

All of this comes before taking privacy concerns into account. Any algorithm searching social networks for warning signs would likely be invasive and might require reaching into protected content. Facebook reportedly pointed to a similar system already used for the social network’s suicide prevention program, which relies on a user’s friends flagging a post to bring it to Facebook’s attention.

Facebook refused to comment on the report.

Recently, computer scientists at the University of Pennsylvania published a paper on an algorithm they developed, a system designed to spot terrorist activity without violating any user’s privacy.

Essentially, the researchers’ system turns the members of a given network into bits. The algorithm examines only certain bits at a time, learning specific information about a user without revealing that user’s complete identity. The system can then single out a likely target without disclosing any details about the rest of the user base.
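The paper’s actual construction is more involved, but a toy sketch can illustrate the bit-based idea the description gestures at: encode each user as a string of anonymous attribute bits, inspect only selected bit positions, and report matches under opaque identifiers so unflagged users are never exposed. Every name, attribute, and threshold below is invented for illustration; this is not the Penn researchers’ algorithm.

```python
import hashlib
from typing import Dict, List

def encode_user(attributes: List[bool]) -> str:
    """Turn a user's boolean attributes into a bit string."""
    return "".join("1" if a else "0" for a in attributes)

def opaque_id(user_id: str) -> str:
    """Replace a real identifier with a short non-reversible token."""
    return hashlib.sha256(user_id.encode()).hexdigest()[:8]

def flag_candidates(network: Dict[str, str],
                    positions: List[int],
                    threshold: int) -> List[str]:
    """Flag users whose bits at the given positions meet the threshold.

    Only the listed bit positions are inspected, and only flagged
    users are reported at all, under opaque tokens.
    """
    flagged = []
    for user_id, bits in network.items():
        score = sum(bits[p] == "1" for p in positions)
        if score >= threshold:
            flagged.append(opaque_id(user_id))
    return flagged

# Three users with four anonymous attribute bits each (made-up data).
network = {
    "alice": encode_user([True, False, False, True]),
    "bob":   encode_user([False, False, True, False]),
    "carol": encode_user([True, True, True, True]),
}
print(flag_candidates(network, positions=[0, 3], threshold=2))
```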

However, that still leaves open the questions of what would get a person flagged and, once someone is identified, what action should follow. Even if most users’ privacy is never directly violated, it remains unclear how something as subjective as the concept of being “radicalized” can be recognized by an algorithm that evaluates everything objectively.

Kavita Iyer
