The US Department of Defense (DoD) wants contractors to mine your social media posts to develop new ways for the US government to infer what you’re really thinking and feeling — and to predict what you’ll do next.
Pentagon documents released over the last few months identify ongoing classified research in this area, which the federal government plans to expand by investing millions more dollars.
The unclassified documents, which call on external scientists, institutions and companies to submit proposals for research projects, not only catalogue how far US military capabilities have come, but also reveal the Pentagon’s goals: building the US intelligence community’s capacity to forecast population behavior at home and abroad, especially groups involved in political activism.
They throw light on the extent to which the Pentagon’s classified pre-crime R&D has advanced, and how the US military intends to deploy it in operations around the world.
Could your social media signature reveal your innermost thoughts?
A new Funding Opportunity Announcement document issued by the DoD’s Office of Naval Research (ONR) calls for research proposals on how mining social media can provide insight on people’s real thoughts, emotions and beliefs, and thereby facilitate predictions of behavior.
The research for Fiscal Year 2016 is part of the Pentagon’s Multidisciplinary Research Program of the University Research Initiative (MURI), which was initiated over 25 years ago, regularly producing what the DoD describes as “significant scientific breakthroughs with far reaching consequences to the fields of science, economic growth, and revolutionary new military technologies.”
The document calls for new work “to understand latent communication among small groups.” Social meaning comes not just from “the manifest content of communication (i.e., literal information), but also from latent content — how language is structured and used, as well as how communicators address each other, e.g., through non-verbal means — gestures, head nods, body position, and the dynamics in communication patterns.”
The Pentagon wants to understand not just what we say, but what is “latent” in what we say: “Subtle interactions such as deception and reading between the lines, or tacit understanding between communicators, relative societal position or relationship between communicators, is less about what is said and more about what is latent.”
All this, it is imagined, can be derived from examining social media, using new techniques from the social and behavioral sciences.
The Pentagon wants to:
“… recognize/predict social contexts, relationships, networks, and intentions from social media, taking into account non-verbal communication such as gestures, micro-expressions, posture, and latent semantics of text and speech.”
By understanding latent communication, the Pentagon hopes to develop insight into “the links between actors, their intentions, and context for use of latent signals for group activity.” The idea is to create:
“… algorithms for prediction and collection of latent signals and their use in predicting social information.”
These algorithms also need to “accurately detect key features of speech linked to these structural patterns (e.g., humor, metaphor, emotion, language innovations) and subtle non-verbal elements of communication (e.g., pitch, posture, gesture) from text, audio, and visual media.”
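What would even a crude version of this look like in practice? As a rough illustration only, the sketch below trains a toy text classifier to guess a “latent” emotional label for a short post; the posts, labels and model choice are all invented here, and the classified systems the ONR is soliciting would draw on far richer multimodal signals (audio, video, gesture).

```python
# Minimal sketch: guessing a "latent" signal (here, an emotion label) from short posts.
# Toy data and labels are invented for illustration; the systems described in the
# ONR call would use far richer multimodal features (audio, video, gesture).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled training posts (label = latent emotion).
train_posts = [
    "can't believe they did this again, absolutely done with it",
    "so proud of everyone who showed up today",
    "whatever, it doesn't matter anyway",
    "we will not back down, see you at the square tomorrow",
]
train_labels = ["anger", "pride", "resignation", "determination"]

# TF-IDF over word n-grams feeds a simple linear classifier.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=1),
    LogisticRegression(max_iter=1000),
)
model.fit(train_posts, train_labels)

# "Prediction" here is just the most probable latent label for a new post.
new_post = "they think we'll stay quiet, they're wrong"
print(model.predict([new_post])[0])
print(dict(zip(model.classes_, model.predict_proba([new_post])[0].round(2))))
```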
The direct military applications of this sort of information can be gleaned from the background of the administrator of this new research program, Dr. Purush Iyer, who is Division Chief of Network Sciences at the US Army Research Laboratory (USARL).
Among the goals of Dr. Iyer’s research for the US Army is the expansion of “Intelligent Networks” that can “augment human decision makers with enhanced-embedded battlefield intelligence that will provide them with tools for creating necessary situational awareness, reconnaissance, and decision making to decisively defeat any future adversarial threats.”
Creeping police state
The drive to co-opt Big Data to enhance domestic policing is already picking up steam in the US and UK.
In the US, an unknown number of police authorities are already piloting software called ‘Beware’, which analyzes people’s social media activity, property records, and the records of friends, family or associates, among other data, to assign suspects a so-called “threat-score.”
That “threat-score” can then be used by police to pre-judge if a suspect is going to be dangerous, and to adapt their approach accordingly.
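Beware’s actual scoring logic is proprietary and has not been published, but a deliberately simplified, hypothetical sketch shows how a composite “threat-score” over disparate records might be assembled; every field, weight and threshold below is invented for illustration.

```python
# Hypothetical illustration only: how a composite "threat score" might be assembled
# from disparate records. The real 'Beware' scoring logic is proprietary and has not
# been published; the fields, weights and thresholds here are all invented.
from dataclasses import dataclass

@dataclass
class SubjectRecord:
    flagged_posts: int        # social media posts matching watchlist terms
    prior_contacts: int       # prior police contacts on record
    associate_flags: int      # flags attached to friends/family/associates
    address_incidents: int    # incident reports tied to the property

WEIGHTS = {
    "flagged_posts": 2.0,
    "prior_contacts": 3.0,
    "associate_flags": 1.5,
    "address_incidents": 1.0,
}

def threat_score(r: SubjectRecord) -> str:
    score = (WEIGHTS["flagged_posts"] * r.flagged_posts
             + WEIGHTS["prior_contacts"] * r.prior_contacts
             + WEIGHTS["associate_flags"] * r.associate_flags
             + WEIGHTS["address_incidents"] * r.address_incidents)
    # Bucket the raw number into colour-coded levels, as has been reported for such tools.
    if score >= 15:
        return "red"
    if score >= 6:
        return "yellow"
    return "green"

print(threat_score(SubjectRecord(flagged_posts=3, prior_contacts=0,
                                 associate_flags=2, address_incidents=1)))
```

Even in this toy version, the score can rise because of other people’s records rather than the subject’s own conduct, which is exactly what worries critics.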
Given the police’s discriminatory track record, and with shootings of unarmed black people skyrocketing, the prospect that such ‘Minority Report’-style policing could be used to justify yet more discrimination is alarming.
In the UK, Home Secretary Theresa May just last week told the Police ICT Suppliers Summit that police forces should use predictive analytics to “identify those most at risk of crime, locations most likely to see crimes committed, patterns of suspicious activity that may merit investigation and to target their resources most effectively against the greatest threats.”
Noting that the police have yet to catch up with the “vast quantities of data” being generated by citizens, she complained: “Forces have not yet begun to explore the crime prevention opportunities that data offers.”
In reality, the shift to predictive policing in the UK is well underway, with Greater Manchester, Kent, West Midlands, West Yorkshire and London’s Metropolitan Police having undertaken trials of a software known as “PredPol.”
According to the UK College of Policing’s National Policing Vision for 2016:
“Predictive analysis and real-time access to intelligence and tasking in the field will be available on modern mobile devices. Officers and staff will be provided with intelligence that is easy to use and relevant to their role, location and local tasking.”
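PredPol’s own model is reportedly adapted from earthquake-aftershock forecasting and is considerably more sophisticated, but the basic shape of predictive patrolling can be illustrated with something much cruder: score each map cell by exponentially decayed counts of past incidents and flag the top-scoring cells for extra patrols. All numbers in the sketch below are invented.

```python
# Illustrative sketch only: a crude "hotspot" predictor that scores map cells by
# exponentially decayed counts of past incidents. PredPol's actual model is reportedly
# adapted from earthquake-aftershock forecasting and is far more elaborate; this toy
# just shows the general shape of predictive patrolling.
import math
from collections import defaultdict

HALF_LIFE_DAYS = 14.0  # how quickly old incidents stop mattering (assumed value)

def hotspot_scores(incidents, today):
    """incidents: list of (day, grid_cell) tuples; today: current day number."""
    decay = math.log(2) / HALF_LIFE_DAYS
    scores = defaultdict(float)
    for day, cell in incidents:
        scores[cell] += math.exp(-decay * (today - day))
    return scores

incidents = [(1, "A3"), (5, "A3"), (20, "B1"), (26, "A3"), (27, "C2"), (28, "B1")]
scores = hotspot_scores(incidents, today=30)
# Flag the top two cells for extra patrols.
print(sorted(scores, key=scores.get, reverse=True)[:2])
```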
The next threat is social change, economic collapse
Driving the hunger to capture Big Data is a growing recognition that the post-2008 era of slow economic growth and geopolitical crisis is likely to lead to a continuing risk of civil unrest — both within Western homelands, and in foreign regions of strategic interest.
The Pentagon’s new research calls are designed to build on a wide range of already active programs developing ways to integrate open source data, including the social media footprints of entire populations, into sophisticated computer models.
One of the most disturbing applications of this sort of information was described in a new Funding Opportunity Announcement released last month for the Minerva Research Initiative, a DoD social science program founded in 2008.
Among the subject areas mentioned in the announcement is “Influence and mobilization for change”, which includes themes like:
“Analyses of the topology, power structure, productivity, merging and splitting, and overall resilience of change-driven organizations.”
Other overlapping themes the Pentagon wants input on are:
“Mechanisms of information dissemination and influence across diverse populations”; “Mechanisms of (and factors inhibiting) mobilization at individual and group levels”; “Factors that make specific individuals/groups influential within a particular cultural context”; and “The interaction between emotion and cognition and its impact on future behavior.”
These are generic themes concerning the dynamics of community-driven change activism in general. Yet the assumption implicit in the document is that change activism can, in some cases, itself constitute a threat to national security.
The document also explains that research on such themes:
“… will help the Department of Defense better understand what drives individuals and groups to mobilize for change and the mechanisms of that mobilization, particularly when violent tactics are adopted. This research will inform understanding of where organized violence may erupt, what factors might explain its spread, and how one might mitigate its effects.”
This and several other paragraphs are copied verbatim from an earlier Minerva call for research that I reported on about a year ago. As I observed then:
“At first glance, this seems fairly innocuous, but it reveals a disturbing ideological bias in the Pentagon’s conception of social and political dissent. The assumption that the adoption of ‘violent tactics’ is linked to the issues that motivate people to ‘mobilize for change’ conflates the dynamics of change activism in general with a risk of being involved in ‘organized violence.’”
The document does not specify particular types of organization or group that should be studied, except once in reference to “hacking forums,” which perhaps highlights the Pentagon’s increasing interest in decentralized networks like Anonymous.
The Pentagon appears to be particularly concerned about the potential risks of social crisis, civil unrest and collapse, both at home and abroad.
In a section calling for submissions on “Societal Resilience and Change”, the Minerva document states that “DoD seeks to develop new insights into the social dynamics within regions and states of strategic interest, and to examine the factors that affect societal resilience to external ‘shock’ events and corresponding tipping points.”
Without specifying what those “shocks” could be, the document does mention developing frameworks to improve policy “before, during, and after societal shifts like those seen during the so-called Arab Spring.”
It should be noted that the Arab Spring protests brought down or undermined brutal autocratic governments that had nevertheless been longstanding US allies.
The Minerva document also emphasizes the need to understand the impact of “changes in demographics (e.g., gender and age structure, wealth distribution) on internal and external stability,” especially what the Pentagon describes candidly as:
“Security implications of aging populations and shrinking working age populations worldwide.”
So the Pentagon anticipates a looming economic crisis driven by the unsustainable rise of the elderly population relative to the shrinking number of working-age people, and the document confirms that the Pentagon perceives this as a potential national security crisis.
The US, and major allies like Britain, Germany, France, and Israel, are among the top 20 countries that will be most impacted by these demographic trends.
Last year, the Wall Street Journal reported that in 2016, “the world’s advanced economies will reach a critical milestone. For the first time since 1950, their combined working-age population will decline, according to United Nations projections, and by 2050 it will shrink 5%. The ranks of workers will also fall in key emerging markets, such as China and Russia. At the same time the share of these countries’ population over 65 will skyrocket.”
From open source to ‘Minority Report’
By linking up metadata from social media with other forms of data — whether it’s mobile phone usage metadata, geolocation information, satellite data, personal records — the Pentagon hopes to find patterns that enable it to predict future behavior.
A third major subject-theme of the Minerva research call clarifies the Pentagon’s concern with enhancing its ability to predict the future.
Titled, “Analytic Methods and Metrics for Security Research,” the document calls for “rigorous, validated quantitative measurement and models” which can “compare information across sets of data and across time.”
Such models would enhance “opportunities for visualization of trends, and the potential to forecast future events.”
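In its simplest imaginable form, “forecasting future events” from data compared “across time” amounts to fitting a trend to an observed signal and extrapolating it forward, as the minimal sketch below does with invented weekly event counts; the Minerva call is asking for far more rigorous, validated models than this.

```python
# Minimal sketch: fitting a linear trend to weekly "event" counts and extrapolating
# forward. The weekly counts are invented; the Minerva call asks for far more
# sophisticated, validated models. This only illustrates what "forecasting a trend" means.
import numpy as np

weeks = np.arange(12)                                           # observed weeks 0..11
events = np.array([4, 5, 7, 6, 9, 8, 11, 10, 13, 12, 15, 14])  # invented weekly counts

slope, intercept = np.polyfit(weeks, events, deg=1)             # least-squares trend line
forecast_week = 16
print(f"forecast for week {forecast_week}: {slope * forecast_week + intercept:.1f}")
```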
Last summer, a similar research call appeared in a Broad Agency Announcement from the DoD’s Office of Naval Research (ONR), related to “Expeditionary Intelligence Surveillance, Reconnaissance Science and Technology.”
A significant portion of the ONR document is dedicated to outlining the need for predictive models.
“In being able to use social media as an ISR [intelligence, surveillance, reconnaissance] signal, ONR is interested in theoretical constructs that allow understanding and thus interpretation of an online open media signature and its relationship to on the ground sentiment and behavior.”
The Pentagon wants to develop approaches that will allow open source analysis of a person’s or group’s publicly available social media “signature” — the full array of their social media activities — and how this relates to both emotional “sentiment” and actual “behavior.”
ONR also wants to know “how social media can be used as a seed in a Global Knowledge Environment (cloud based, big data repository that includes imagery, video, ship tracks, METOC [meteorology and oceanography] and analytic products) to discover additional information about the physical, military, and sociocultural environment of an operational area of interest.”
Basically: everything in an ‘area of interest.’
The ‘Minority Report’-style implications of this sort of social media data mining are spelled out in some detail (a toy sketch of one of these items follows the list):
“Information demands that social media could be helpful in fulfilling include:
• Predict, detect, track violent behavior by groups
• Understand anomalous event/sentiment signals/signatures in a region of interest
• Derive sociocultural trends to assist in decision making
• Identify trends, local perceptions, media bias, cultural nuances, and environmental distinctions.
• Connecting people, places, and things to uncover physical, cyber, financial, social, operational aspects of an unknown or emerging threat
• Pattern of life analysis used to provide visibility and thus vulnerability to physical, informational, social aspects of a threat
• Radicalization methods, speed of spread (ISIL as an example) — signature to see tipping point or understand sooner (strategy, tactics, rhetoric, narrative, what can be tracked in social media).”
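To take just one item from that list, spotting “anomalous event/sentiment signals” in a region reduces, at its most basic, to flagging days when a signal departs sharply from its recent baseline. The toy detector below uses a rolling z-score over invented daily sentiment values; a real pipeline would derive that signal from bulk social media collection, which is precisely the point of concern.

```python
# Toy sketch of one item above: flagging "anomalous sentiment signals" in a region by
# comparing each day to a rolling baseline (z-score). The daily sentiment values are
# invented; real pipelines would derive them from bulk social media collection.
import statistics

daily_sentiment = [0.1, 0.12, 0.09, 0.11, 0.10, 0.13, 0.08, 0.11, -0.45, 0.10]

WINDOW = 7          # days of history used as the baseline (assumed)
THRESHOLD = 3.0     # how many standard deviations counts as "anomalous" (assumed)

for day in range(WINDOW, len(daily_sentiment)):
    history = daily_sentiment[day - WINDOW:day]
    mean, stdev = statistics.mean(history), statistics.stdev(history)
    z = (daily_sentiment[day] - mean) / stdev if stdev else 0.0
    if abs(z) > THRESHOLD:
        print(f"day {day}: anomalous sentiment {daily_sentiment[day]:+.2f} (z={z:.1f})")
```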
Prediction is repeatedly mentioned as a core goal:
“It may be possible to better predict what affect ‘aiding,’ ‘attacking’, ‘isolating’ will have in an area if behaviour/action surrogates can be found in historical data for which some ground truth exists.”
Social media data can thus be integrated with a wide range of open source information from other sources to generate complex, quantitatively grounded empirical models of population and group behavior.
The idea is to use such models “to explain, track, and anticipate key group behaviors including cooperation, communication (information operations), conflict, consolidation, and fragmentation that characterize the factional dynamics among multiple, independent armed actors in insurgencies and civil wars.”
The all-seeing eye
One significant area the document emphasizes is advancing the Pentagon’s ability to detect “complex events” using algorithms which can identify patterns of events within “large data streams.”
How, in other words, does the US intelligence community make sense of the massive amounts of surveillance data absorbed by the National Security Agency (NSA) and other agencies, with a view to detecting real threats?
The document confirms the longstanding position of critics of the NSA like Bruce Schneier, that although existing technologies work well for narrowly defined problems like detecting credit card fraud, they are virtually useless for detecting real terrorist activity:
“While this works well for the detection of a behavior exhibited by a subpopulation (e.g. credit card fraud), its application to complex patterns applied to diverse actors leads to a high false alarm rate.”
This has never been publicly admitted by the Pentagon or US intelligence community, but it is acknowledged here, clear as daylight.
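A back-of-the-envelope base-rate calculation makes the problem concrete (the numbers below are invented for illustration): even a detector that is right 99 percent of the time drowns in false alarms when the behavior it is hunting for is vanishingly rare in the monitored population.

```python
# Back-of-the-envelope base-rate arithmetic (all numbers invented): even a 99%-accurate
# detector drowns in false alarms when the target behavior is vanishingly rare.
population = 300_000_000      # people under collection (assumed)
true_actors = 3_000           # actual threats among them (assumed: 1 in 100,000)
sensitivity = 0.99            # chance a real threat is flagged (assumed)
false_positive_rate = 0.01    # chance an innocent person is flagged (assumed)

true_alarms = true_actors * sensitivity
false_alarms = (population - true_actors) * false_positive_rate

precision = true_alarms / (true_alarms + false_alarms)
print(f"flagged: {true_alarms + false_alarms:,.0f}, of whom genuine: {precision:.3%}")
# Roughly 3 million people flagged; fewer than 0.1% of them are actual threats.
```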
To address the problem, the Pentagon proposes to create new ways of integrating social media into a single, giant analytical system, which can feed directly into US military operations.
The ONR document describes, for instance, wanting to build a next generation of “Marine Civil Information Management System” (MARCISMS NEXGEN), to support the US Marine Corps, which “must be able to intelligently query both structured and unstructured data sources… Relevant area of operations (AO) data (e.g. social media, news reports, METOC, Automatic Information System (AIS), video, images, etc.) must be easily consumed.”
The new MARCISMS engine must also be “built on natural language processing, machine learning, predictive modeling, inference models, and confidence modeling.”
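The design details of MARCISMS NEXGEN are not public, but the requirement to “intelligently query both structured and unstructured data sources” can be illustrated, at its most basic, by a single keyword query run over both a relational table and a pile of free-text reports, with the results merged; everything in the sketch below (table, fields, reports) is invented.

```python
# Toy sketch only: one query run over both a structured table (SQLite) and unstructured
# free text, with results merged. The actual MARCISMS NEXGEN design is not public;
# the table names, fields and reports here are all invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ship_tracks (vessel TEXT, port TEXT, day TEXT)")
conn.executemany("INSERT INTO ship_tracks VALUES (?, ?, ?)", [
    ("MV Example", "Port Alpha", "2016-03-01"),
    ("MV Sample", "Port Bravo", "2016-03-02"),
])

news_reports = [
    "Crowds gathered near Port Alpha after rumours of a fuel shortage.",
    "Weather service warns of storms along the northern coast.",
]

def query_all(keyword):
    """Run one keyword query against structured and unstructured sources."""
    structured = conn.execute(
        "SELECT vessel, port, day FROM ship_tracks WHERE port LIKE ?",
        (f"%{keyword}%",)).fetchall()
    unstructured = [r for r in news_reports if keyword.lower() in r.lower()]
    return {"structured": structured, "unstructured": unstructured}

print(query_all("Port Alpha"))
```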
Population control
The association with civil-military operations demonstrates the importance of such predictive tools for counter-insurgency operations abroad and, accordingly, for increasing the effectiveness of US propaganda operations.
Models, the ONR document says, should “suggest ways to draw groups closer or further apart to each other or to a concept,” based on “predictions about whether groups ‘attract’ or ‘repel.’”
Much of the information used to run such models would come from “unclassified data.”
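What might a model that predicts whether groups “attract” or “repel” look like at its crudest? One hypothetical reading, sketched below with invented groups, posts and word lists, is to average a lexicon-based sentiment score over posts in which members of one group mention the other, and read the sign as attraction or repulsion; nothing here should be taken as the ONR’s actual method.

```python
# Hypothetical sketch: scoring whether two groups "attract" or "repel" by averaging a
# crude lexicon-based sentiment over posts in which one group mentions the other.
# The groups, posts and word lists are all invented; this is not the ONR's method.
POSITIVE = {"ally", "support", "together", "welcome"}
NEGATIVE = {"traitor", "enemy", "oppose", "boycott"}

posts_by_group = {
    "group_a": ["we support group_b, together we are stronger",
                "group_b are allies in this"],
    "group_b": ["group_a oppose everything we stand for",
                "boycott anything group_a touches"],
}

def stance(source, target):
    """Mean sentiment of source-group posts that mention the target group."""
    scores = []
    for post in posts_by_group[source]:
        if target in post:
            words = set(post.split())
            scores.append(len(words & POSITIVE) - len(words & NEGATIVE))
    return sum(scores) / len(scores) if scores else 0.0

for src, tgt in [("group_a", "group_b"), ("group_b", "group_a")]:
    s = stance(src, tgt)
    label = "attract" if s > 0 else "repel" if s < 0 else "neutral"
    print(f"{src} -> {tgt}: {label} ({s:+.1f})")
```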
In this context, these new technologies will help achieve a key goal of the US Marine Corps: to “maintain, influence, or exploit relationships between military forces and indigenous populations and institutions.”
Ultimately, then, this is not simply about predicting the behavior of diverse populations and social groups.
The Pentagon wants the ability to use this predictive capacity to manipulate human behavior, and thereby win wars.
One explicit discussion of this goal was recently published by the Joint Special Operations University (JSOU) in its 2016 Research Topics monograph, which highlighted subjects considered high priority by experts across the US Special Operations Forces (SOF) community.
“Defining and understanding the ‘human domain’ and how SOF can influence cognitive behavior in myriad operational environments continues to be a topic of interest,” the JSOU document asserts.
“What affects people’s perceptions and decision-making that SOF can favorably influence to prevent/mitigate/deter crisis and conflict? What are the future advanced technologies and cultural social practices for engaging underdeveloped populations in support of partner governments to achieve US interests?”
But what if those interests are at odds with popular demands for self-determination, economic independence and resource nationalism? The counter-democratic implications are already on display in US support for brutal autocratic regimes such as Saudi Arabia and Egypt.
These cases suggest that massive data-mining is designed to help US military agencies influence the “cognitive behavior” of “underdeveloped populations,” so that the governments that rule them may continue conforming to “US interests.”
In other words, the US military wants to mine the world’s social media footprint to suppress the risk of popular social movements undermining the status quo, at home and abroad.