VIEWPOINT: INTELLIGENCE AND SURVEILLANCE

Next-Gen Computers Will Soon Transform Battlefield Intelligence

1/20/2017
[Photo: A circuit board with SyNAPSE-developed chips]

Imagine that 30 days prior to the invasion of a city the size of Mosul, Iraq, the U.S. military deploys a network of micro drones overhead and sensors embedded in the city infrastructure.

The sensors begin gathering geospatial intelligence, infrared and multi-spectral images, signals intelligence, full-motion video, mobile data, pictures, emails and social media, along with the movement of people, vehicles and supply chains. The result is a 3D interactive map not only of structures and streets, but of people, systems, life and traffic patterns, power usage, commerce and the location of weapons and ordnance.

By the time forces are ready to enter the city, battlefield commanders have a recent historical record of events there as well as a real-time mobile big data map displayed as holographic images through virtual and augmented reality. Meta-tagged individuals and locations are updated in real time with GPS, sensor and drone video feeds to keep the map fresh and current.

The massive amounts of data collected could be crunched for predictive analysis to determine where the enemy is, and to resolve the identities of individuals and entities operating in and around the city, building an identity database that shows where they have been and suggests where they may be going.

That kind of knowledge could save warfighters’ lives and protect civilians, but these scenarios are not possible today.

The sensors and the drones are readily available, but not the computing power necessary to analyze such large amounts of data from so many diverse sources. However, the day when this scenario becomes reality is coming, and it will be made possible by a wave of new computing architectures.

These exponentially faster computers will marry with artificial intelligence, big data analytics and cloud computing to help crack some of the military and intelligence communities’ toughest problems — if they can take advantage of these technologies, which are largely being developed in the private sector.

Computing hasn’t fundamentally changed in 60-plus years. Hardware and silicon still process data as ones and zeros, and we will soon reach the physical and processing limits of today’s computers.

Moore’s Law, the observation that computing power doubles at regular intervals, is going to reach an end. What we need is processing that improves faster than Moore’s Law allows. A few major trends are leading computer engineers in this direction.

First are recent breakthroughs in artificial intelligence, or AI. The past three years have seen a transition from rule-based artificial intelligence to machine learning and deep learning, which give machines the capacity to self-organize and simulate human operations at speeds and scales we have never seen before.
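To make that distinction concrete, here is a minimal sketch in Python contrasting the two eras: a hand-coded rule versus a small neural network that learns an equivalent decision from examples. The data and model are invented for illustration.

```python
# A minimal sketch, with invented data, of the shift from rule-based
# AI to machine learning: the first classifier is logic a human wrote;
# the second learns an equivalent decision boundary from examples.
from sklearn.neural_network import MLPClassifier

# Rule-based era: a human encodes the decision logic explicitly.
def rule_based(features):
    return 1 if features[0] > 0.5 and features[1] > 0.5 else 0

# Learning era: the machine infers the logic from labeled examples.
X = [[0.1, 0.2], [0.9, 0.8], [0.2, 0.9], [0.8, 0.7], [0.3, 0.1], [0.7, 0.9]]
y = [rule_based(x) for x in X]  # toy labels standing in for real data

model = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
model.fit(X, y)
print(model.predict([[0.85, 0.9], [0.1, 0.4]]))  # learned, not hard-coded
```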

Image processing, self-driving vehicles and mobile smartphones are all examples of innovations driven by AI, all accomplished with the computer architectures of today. To receive the full benefit of artificial intelligence, a new generation of computers will be required. We are talking about very fast, more powerful data-crunching supercomputers and a new generation of thinking machines built on cognitive platforms that think the way humans do.

Another trend is the coming tsunami of data that will have to be organized and analyzed to deliver agile predictions and threat prevention. By 2020, there will be about 20 billion mobile devices and 100 billion intelligent, interconnected applications comprising the “internet of things.” Everything will be online, discoverable and able to broadcast. The need to process, store, manage and secure all of that data will outpace the capabilities of today’s computers.

Legacy computer systems built on ones and zeros will not keep pace with the escalating complex challenges and threats facing national security. We need to collapse time, to innovate faster and manage strategic surprise. To do this we need the next generation of computers to deliver the “what’s next” much faster than ever before.

Researchers and engineers who foresee this future are busy working on new computing architectures with the potential to disrupt the way the world, and by extension the U.S. military and intelligence community, crunches and uses vast amounts of data, generating insights faster and with greater understanding and operational capability.

There are three technology tracks, all in various stages of development.

Quantum computing encodes information in quantum bits, or qubits: the quantum states of individual particles, charged or polarized to represent ones and zeros. Because qubits can exist in superposition and be made to interact with one another, many calculations can happen simultaneously, whereas today’s computers perform one calculation at a time. The hard encryption quantum computing enables, along with advanced complex problem solving over exabyte-scale data in near real-time, could transform defense and intelligence operations.
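The mathematics can be previewed on an ordinary computer, though only at toy scale: simulating n qubits requires tracking 2^n complex amplitudes, which is exactly why native quantum hardware is attractive. Below is a minimal sketch in NumPy that entangles two simulated qubits into a Bell state; it is a toy model of the linear algebra, not of any particular quantum machine.

```python
# Minimal sketch: two simulated qubits entangled into a Bell state.
# This is a toy model of the linear algebra, not of quantum hardware.
import numpy as np

zero = np.array([1, 0], dtype=complex)        # qubit in state |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

state = np.kron(zero, zero)                   # |00>: 4 amplitudes for 2 qubits

# Put the first qubit in superposition, then entangle the pair with a CNOT.
state = np.kron(H, np.eye(2)) @ state
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
state = CNOT @ state

# Measurement statistics: 50% |00>, 50% |11>, never |01> or |10>.
print(np.round(np.abs(state) ** 2, 3))        # [0.5 0. 0. 0.5]
```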

Neuromorphic computing also has the potential to take a tremendous amount of data and crunch it very fast. These computers, operating in mobile clouds, will be modeled on the way the brain works, harnessing neurons and networks of synapses. The Defense Advanced Research Projects Agency’s Systems of Neuromorphic Adaptive Plastic Scalable Electronics program, better known as SyNAPSE, was a precursor to neuromorphic work taken on by IBM, which produced the company’s TrueNorth chip. It sought to create neuronal circuits for a computing architecture that mimicked the function of neurons, in other words, the way the brain thinks.
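The basic unit such chips implement in silicon is a spiking neuron. Here is a minimal sketch of a leaky integrate-and-fire neuron in Python; the parameters are illustrative and not drawn from any particular neuromorphic hardware.

```python
# Minimal sketch of a leaky integrate-and-fire neuron, the kind of
# spiking unit neuromorphic chips implement in silicon. Parameters
# are illustrative, not taken from any particular hardware.
import numpy as np

def lif_neuron(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Return spike times produced by a stream of input current samples."""
    v, spikes = 0.0, []
    for step, current in enumerate(input_current):
        # Membrane potential leaks toward rest while integrating input.
        v += dt * (-v / tau + current)
        if v >= v_thresh:           # threshold crossed: the neuron fires
            spikes.append(step * dt)
            v = v_reset             # potential resets after each spike
    return spikes

# A constant drive produces a regular spike train.
print(lif_neuron(np.full(100, 0.08)))
```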

The third is memory-driven computing, which restructures a computer’s architecture to put a large shared pool of memory, rather than the processor, at its center. Hewlett Packard Enterprise, HPE, has already demonstrated a memory-driven computer it calls “The Machine,” which it says is 8,000 times faster than current computers. It also features a photonics fabric, a new way to accelerate high-performance communications.
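HPE has not published The Machine’s programming model in this form, but the central idea, operating on one large pool of data in place rather than copying it between processor-owned tiers, can be loosely illustrated with memory-mapped files on a conventional operating system.

```python
# Loose illustration of the memory-centric idea: work on one large
# pool of bytes in place instead of copying data through processor-
# owned tiers. mmap on a conventional OS is only an analogy for
# fabric-attached memory, not HPE's implementation.
import mmap

# A 1 MiB backing file stands in for the shared memory pool.
with open("pool.bin", "wb") as f:
    f.write(b"\x00" * (1 << 20))

with open("pool.bin", "r+b") as f:
    pool = mmap.mmap(f.fileno(), 0)  # map the whole pool into memory

    pool[0:5] = b"hello"             # write in place, no copy
    print(pool[0:5])                 # b'hello': read in place

    pool.flush()
    pool.close()
```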

Memory-driven computing is far ahead of quantum and neuromorphic computing, which are still in the early stages of research and development. HPE says it plans to commercialize The Machine as early as 2018. It is likely that all three architectures will be used together to accomplish tasks that today we either don’t understand or that elude our computing capacity to identify, analyze or address.

The question, then, is what can be accomplished with these computers, which have the potential to be orders of magnitude more powerful than what we have today?

On a larger scale, this new wave of powerful computers combined with artificial intelligence can be used to solve some of humanity’s biggest social problems, many of which have been linked to conflict, terrorism and threats.

They could be used to defeat disease, invent clean energy, better manage climate change, or feed the planet. This would be a worthy set of challenges that I refer to as the “AI prosperity dividend.”

If we could unleash AI to address these global challenges, more than $100 trillion in prosperity could result. This estimate combines the economic losses from the health care costs of major diseases across the global population with the savings from reducing wasted energy and food production by reorganizing the global logistics infrastructure. Innovative ecosystems to better manage food production and distribution on a global scale could end the suffering of, and increase prosperity for, some 800 million people affected by hunger and malnutrition. There is a direct correlation between poverty, low per capita income and the rise of terrorism.

A harbinger of things to come can be seen in IBM’s Watson Health, which scans a cancer patient’s file and then examines the thousands of pages of evidence-based treatments found in more than 290 medical journals, 200 textbooks and 12 million pages of text. It’s simply not possible for a cancer specialist to keep up with this massive amount of reading. Watson then matches the known successful treatments with the patient’s file.
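IBM has not published Watson Health’s internals, but the underlying retrieval step, ranking a large literature corpus against a patient record, can be sketched with standard text-similarity techniques. The corpus and patient note below are invented for illustration.

```python
# Minimal sketch of the retrieval step: rank literature passages by
# similarity to a patient record using TF-IDF. The corpus and note
# are invented; this is not how Watson Health is implemented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "trial shows drug A effective in stage II lung carcinoma",
    "radiation plus drug B improves survival in brain glioma",
    "drug A resistance linked to EGFR mutation in lung cancer",
]
patient_note = "stage II lung carcinoma, EGFR wild type"

vectors = TfidfVectorizer().fit_transform(corpus + [patient_note])

# Score every passage against the patient's file and rank the results.
scores = cosine_similarity(vectors[-1], vectors[:-1]).ravel()
for score, text in sorted(zip(scores, corpus), reverse=True):
    print(f"{score:.2f}  {text}")
```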

In the realm of national defense, we are talking about the prevention of events such as 9/11, the attack on the Benghazi, Libya, diplomatic compound or the bombing of the USS Cole in Yemen. In all of these cases, post-event analysis showed there were many data points that, if tied together, could have warned of an attack. The computing and networking tools needed to do so did not exist, though.
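Tying such data points together is, at bottom, a graph problem. Here is a minimal sketch, with invented entities and edges, of how link analysis surfaces a chain of connections that no single report reveals.

```python
# Minimal sketch of link analysis: scattered reports become edges in
# a graph, and connections no single report reveals fall out of a
# traversal. All entities and edges here are invented.
import networkx as nx

g = nx.Graph()
g.add_edges_from([
    ("suspect_A", "phone_1"),    # from signals intelligence
    ("phone_1", "suspect_B"),    # shared device
    ("suspect_B", "vehicle_9"),  # from surveillance video
    ("vehicle_9", "warehouse"),  # from geolocation data
])

# The chain from a known suspect to a location emerges automatically.
print(nx.shortest_path(g, "suspect_A", "warehouse"))
# ['suspect_A', 'phone_1', 'suspect_B', 'vehicle_9', 'warehouse']
```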

The year 2016 will be remembered as the time when a state actor was accused of destabilizing an election through cyber attacks. Those attacks were persistent. We need better technology, better big data, thinking machines and new computing architectures to be able to develop insight into who the bad actors are even before they act, to predict and prevent and to be able to protect critical infrastructure.

The defense and intelligence communities can integrate AI with new computing architectures to identify threat scenarios, predict the likely emergence of threats, prevent them, carry out real-time operations and conduct post-event analysis. Some of these tasks can be executed today, but not all of them. We need to innovate faster.

Returning to the first scenario, Army and Marine Corps analysts have predicted that they will be fighting in complex, urban environments in the coming decades. A city the size of Mosul has some 600,000 residents, but the future will see an increase in megacities, where the military may need to operate new, diverse forces such as augmented expeditionary teams composed of human operators, AI agents, drones and robots among populations numbering in the millions.

Predictive, holistic theater big data analysis is something that doesn’t exist today. But it could be employed to send in a first wave of robotic systems to render threats safe. Afterward, soldiers, Marines or bots could follow in a much safer environment.

But the procurement system that would bring these new technologies to the battlefield is broken and doesn’t serve our national security interest in embracing and investing in emerging innovations. We are in a global competition with states and non-state actors that can innovate faster and smarter.

The innovation in AI and new computing architectures is largely happening outside the military and intelligence communities. And that’s OK, as long as government leaders have a grasp of what the private sector is doing and has to offer.

IBM, Google, Amazon and Apple are spending billions on AI, machine intelligence and deep learning. Second-tier companies are spending billions more. Facebook and Microsoft are far outspending the government on virtual and augmented reality and the tool sets needed to use them.

The military and the intelligence communities need to leverage these opportunities faster and with more agility, to build systems that are relevant to their mission — and the missions of the future. Enabling the warfighter to have both real-time and predictive analytics enhanced with AI to better execute operations in theater is critical as a force multiplier.

As an advisor to both the government and the private sector, I can tell you the acquisition system is doing better, but it’s not there yet. We need more agile and intimate collaboration between the two sectors to make this a reality. Supporting persistent, radical innovation in new computing architectures is a vital component of effectively managing strategic surprise.

Dr. James Canton is CEO and chief futurist of the Institute for Global Futures, a think tank he founded in 1990 that advises the Global Fortune 500, technology leaders and government clients. He is the author of Future Smart and The Extreme Future. He can be reached at globalfuturist.com.

Topics: Cybersecurity, Emerging Technologies
