Dawn Lim | Nextgov | January 14, 2013

Can you predict the future by reading Twitter? The Pentagon thinks maybe.


The Defense Department wants new computer tools to analyze mounds of unstructured text, blogs and tweets as part of a coordinated push to help military analysts predict the future and make decisions faster.

The search is part of the Office of Naval Research’s “Data to Decisions” program, a series of three-to-10-year initiatives that will address the volume of information that threatens to overwhelm planners in the digital age, contract databases indicate. The goal is to build an open source system that can unite various tools that collect, manage and draw relationships between data sets.

In a program announcement, ONR is calling for computer algorithms to predict events, fuse different forms of information and offer context on unfolding events. The office expects to spend $500,000 each year in funding. “The Department of Defense recognizes the potential for text analytics to play a vital role in future capabilities that inform timely and accurate situational awareness in time-constrained, uncertain, and complex environments,” the tender reads.

Defense is seeking ways to predict the future by monitoring Twitter, blogs and news, and determining the “frequency of contacts between nodes or clusters.” As networks grow larger and more complex, researchers have found it harder to monitor group behavior. ONR also wants researchers to uncover networks that could be hidden within networks, and to trace how information and money flow through a community.
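The announcement does not specify methods, but the idea of measuring the “frequency of contacts between nodes” can be illustrated with a minimal sketch. The contact log and node names below are entirely invented for illustration:

```python
from collections import Counter

# Hypothetical contact log: (sender, receiver) pairs observed in a network.
contacts = [
    ("A", "B"), ("B", "A"), ("A", "B"),
    ("C", "D"), ("A", "C"),
]

# Count how often each unordered pair is in contact
# (frozenset treats A->B and B->A as the same link).
pair_counts = Counter(frozenset(pair) for pair in contacts)

# Pairs in frequent contact may indicate a cluster worth closer analysis.
frequent = {tuple(sorted(p)): n for p, n in pair_counts.items() if n >= 2}
print(frequent)  # {('A', 'B'): 3}
```

In practice such counts would feed into community-detection or link-analysis tools rather than stand alone.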

Officials also want tools that fuse and assimilate multiple, incomplete data sets on agriculture, weather, terrain, demographics and economic indicators to find patterns. ONR is especially interested in ways to comb text-based information to provide more nuanced views of how groups, such as terrorists, operate by extrapolating the “stated values and beliefs that motivated behaviors of interest,” “community structure and clusters of social networks” and the level of “emotional support expressed towards topics or persons.”

The office also seeks better technologies for machine translation and processing that convert physical characters or sounds into a single machine-readable language.

Proposals are due Jan. 15 and funding decisions will be made by Feb. 15, contract documents indicate. Carey Schwartz, an ONR program officer and researcher affiliated with the Applied Research Laboratory at Pennsylvania State University, spearheads the program. So far, Clifton Park, N.Y.-based Kitware, which develops algorithms to analyze battlefield imagery, has been slated for funding, in partnership with software developer SOARTech, data-focused firm Signal Innovations Group, sensor processing company Systems Technology Research, defense giant Lockheed Martin Corp. and the University of Southern California.

Government agencies have been pushing technologists to refine techniques to sift through open source intelligence or publicly available data sets. The CIA, responding to mounting calls that such information needed to be systematically collected and better analyzed, created the National Open Source Center in suburban Northern Virginia around 2005 to acquire and analyze the information to support intelligence agencies. The Defense Advanced Research Projects Agency, the Pentagon’s research arm, in 2012 sought out research ideas on computer programs that predict “cyber terrorism events” by detecting how criminal groups and hackers interact on the Internet.
