Robotics & Geopolitics: Drones, AI, and National Security; Countries Collaborate on Robot Talent

The border between San Diego, Calif. (left) and Tijuana, Mexico (right). Source: Sgt. 1st Class Gordon Hyde

June 29, 2018      

Much of the current global interest in automation isn’t just around economic competitiveness; it’s also about national security. How will countries respond to the shortage of skilled robot developers, what national security threats do aerial drones pose, and how might artificial intelligence companies benefit from U.S. defense spending?

Robotics Business Review has partnered with Abishur Prakash at the Center for Innovating the Future to provide its readers with cutting-edge insights into recent developments in international robotics, AI, and unmanned systems. Are you ready to be updated?

Asian agreement centers on building robotics talent

Robotics development: Thailand and Japan have signed an agreement to jointly train 1,300 robotics specialists between 2018 and 2021.

This comes a year after Thailand met with 30 robotics companies, including Panasonic, Universal Robots, Kuka, and ABB. At the time, the robotics companies were interested in entering the Thai provinces of Chon Buri and Rayong.

These provinces, along with Chachoengsao, make up the Thai Eastern Economic Corridor (EEC), which is a $45 billion regional initiative launched in February to attract foreign investment.

Geopolitical significance: Companies trying to decide where to expand need to know where the talent is. For countries that want to become leaders in automation, nurturing technical talent is as important as regulations or investment.

Right now, though, there’s a huge shortage of robotics talent. According to Tencent, there are only 300,000 AI “researchers and practitioners” around the world.

A separate study found that China needs more than 5 million AI experts, even though there are only 1.9 million experts in the world right now.

In addition, in the U.K., the number of AI jobs has increased 485% since 2014. And in Germany, Chancellor Angela Merkel has noted that international competition could lead to a “brain drain” to foreign companies, affecting national security.

As demand for robotics and AI skills continues to increase, two trends are emerging. First, companies are opening up labs to tap foreign talent. For example, Adobe has announced an AI lab in Canada.

Second, nations are trying to develop their own local talent. For instance, a leading university in India plans to pump out 10,000 AI experts by 2021.

In South Korea, the government is investing $2 billion in AI R&D. Part of the investment will fund the training of 1,370 AI experts by 2022, along with six new AI research centers.

Countries like Taiwan, China, and the U.K. are working on similar talent-focused plans. As nations develop talent pools, a new challenge may emerge for robotics businesses. How do they tap this new talent, especially if it is being created for another country?

As Japan and Thailand create robotics specialists, who will benefit from these specialists? Consider that Japanese robotics companies are increasingly looking to foreign talent. If Japan already has a foothold in Thailand, Japanese robotics businesses may benefit more than anyone else.

Drones create more national security challenges

Robotics development: To identify areas where people can cross the U.S.-Mexico border, smugglers are turning to drones and video cameras.

Since October 2017, officials have reported more than three dozen sightings of drones operating near the border. U.S. Customs and Border Protection is concerned that drones could show smugglers in Mexico vulnerable points along the border where people and drugs can cross.

According to one official, even that count may be low, since small drones don't show up on radar as easily as larger aircraft; the true number could be substantially higher.

Geopolitical significance: Increasingly, drones are becoming a national security challenge for governments, one that extends well beyond smuggling to terrorism.

The New York Police Department has sounded the alarm on "drone terrorism," experts in the U.K. have warned that the country could be attacked by bomb-equipped drones, and both France and the Netherlands have been quietly training eagles to take down drones deemed hostile.


Drone companies must start to think about the national security risks their products could create, or they risk public policy that could hamper business. For example, in 2017, the United Nations called for a global drone registry. Will consumers still buy drones if they are being tracked?

Meanwhile, the U.S. Federal Aviation Administration (FAA) has banned drone flights over prisons and Coast Guard bases as illegal flights soar.

Instead of waiting for governments, drone companies may want to take the lead. In some instances, this is already happening. In April 2017, DJI enforced geofencing across most of Iraq and Syria, preventing terrorists from flying its drones in those countries.

Should the U.S. government work with China’s DJI to enforce the same geofencing along the U.S.-Mexico border?
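Geofencing of this kind is conceptually simple: before takeoff, the drone's firmware checks its GPS fix against a list of restricted polygons and refuses to fly inside them. The sketch below is only a minimal illustration of that idea in Python; none of the names come from DJI's actual firmware or SDK, and a real system would use geodesic-aware geometry rather than flat latitude/longitude math.

```python
# Hypothetical geofence check; all names are invented for illustration.
from dataclasses import dataclass

@dataclass
class Point:
    lat: float
    lon: float

def point_in_polygon(p: Point, polygon: list[Point]) -> bool:
    """Ray-casting test: cast a ray east from p and count edge crossings.
    An odd number of crossings means p is inside the polygon."""
    inside = False
    n = len(polygon)
    for i in range(n):
        a, b = polygon[i], polygon[(i + 1) % n]
        # Does edge (a, b) straddle p's latitude, crossing east of p?
        if (a.lat > p.lat) != (b.lat > p.lat):
            lon_cross = a.lon + (p.lat - a.lat) * (b.lon - a.lon) / (b.lat - a.lat)
            if lon_cross > p.lon:
                inside = not inside
    return inside

def takeoff_allowed(drone_pos: Point, no_fly_zones: list[list[Point]]) -> bool:
    """Refuse takeoff if the drone sits inside any restricted polygon."""
    return not any(point_in_polygon(drone_pos, zone) for zone in no_fly_zones)
```

In practice the hard part is not the geometry but keeping the zone database current and tamper-resistant on the aircraft itself, which is why enforcement ultimately depends on the manufacturer.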

If public policy doesn't work and drone companies can't work with governments, then the last option may be for governments to destroy the drones themselves, such as with Boeing's anti-drone laser. If drone companies thought their products being used by smugglers or terrorists was bad for public relations, wait until their drones are being blown out of the sky in the name of national security.

AI to fuel public-private partnerships in the U.S.

Robotics development: A team at the U.S. Pacific Northwest National Laboratory (PNNL), which is part of the U.S. Department of Energy, has applied AI to detect signs of a nuclear bomb test.

Specifically, the system looks for electrons that are released when a nuclear bomb is tested. The PNNL team trained the AI on 32,000 different variables so it could recognize what it was seeing and know when to tag something.
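The details of the PNNL model are not public, so the following is only a loose, hypothetical illustration of the tagging step, with the 32,000 variables reduced to a single electron-count reading and all names invented: flag any reading that deviates sharply from the background distribution.

```python
# Illustrative anomaly tagging only; the real PNNL system is not public.
import statistics

def tag_anomalies(readings: list[float], z_threshold: float = 3.0) -> list[int]:
    """Return the indices of readings whose z-score against the sample
    mean exceeds z_threshold, i.e. candidate detonation signatures."""
    mean = statistics.fmean(readings)
    stdev = statistics.stdev(readings)
    return [i for i, r in enumerate(readings)
            if stdev > 0 and abs(r - mean) / stdev > z_threshold]
```

A production detector would of course fuse thousands of such variables and use a trained model rather than a fixed threshold, but the shape of the task is the same: learn what background looks like, then tag what doesn't.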

Geopolitical significance: The program at the PNNL stands out as a government AI program not led by the U.S. Department of Defense (DoD).

The Pentagon, by contrast, runs Project Maven, which applies AI to analyze drone footage and produce actionable intelligence. The DoD is also launching its first Joint Artificial Intelligence Center (JAIC), one of the programs influenced by Project Maven.

The Pentagon also wants AI to predict nuclear weapons launches. For the U.S., AI is becoming core to protecting national security.

During a defense forum this month, the deputy joint chief of staff and a former deputy secretary of defense both made the same point: The U.S. will lose its military lead over China if it doesn't start investing in AI, robotics, big data, and hypersonics.

This could be a huge opportunity for AI startups. How? For several years, starting under the Obama administration, the Pentagon has been trying to form public-private partnerships with AI startups in the U.S.

Such partnerships have been core to China's rise as an AI power. But unlike in China, where business and government are tightly interlinked, in the U.S., the government and private sector (especially startups) have operated in independent realms.

The Pentagon’s AI programs may be the beginning of a new “bridge” for national security.

When it comes to AI hunting missiles, the White House is going all out. One of the Pentagon’s programs will receive $83 million, triple what it received last year. Could AI startups help develop some of the technologies behind this and other programs?

Even the Pentagon's Algorithmic Warfare Cross-Functional Team, the formal name for Project Maven, is receiving increased funding: $93 million for fiscal year 2019, compared with $70 million in 2017. All of this points to the DoD's rising focus on AI, partly to protect national security and partly to compete with other countries, like China.

But behind all this may be a huge, untapped opportunity for AI startups to develop and supply their innovations in a way they never could before.