AI + IoT = 1000X More Dense Networking Environments: How Intel, NSF Are Planning For Tomorrow


In a decade we could have 50 billion devices connected to the internet, more than double our current number. Not too long after, it could be 75 or 100 billion.
Smart, connected things everywhere may be the future, but there’s a problem.
How will we connect everything at such scale?
Intel and the National Science Foundation say we’ll need help from artificial intelligence. And mesh networks. Including, potentially, even SpaceX Starlink satellites in low earth orbit.
“Traditional 4G networks that your current mobile devices rely on typically can support a region that has about 300 to 2,000 devices in their coverage area,” Thyaga Nandagopal, a deputy division director at the National Science Foundation, told me recently on the TechFirst podcast. “We are thinking about device densities [with] tens of thousands in a small region … [going] all the way up to millions of devices in a coverage area of a single cell site in a wireless network.”
That’s easily 1,000 times denser wireless networking.
It includes a confusing array of 4G, 5G, 3G, WiFi, Bluetooth, and potentially other frequencies, all at once. And it’s not just about people on calls, or streaming media, or messaging friends, or posting to Facebook, either. It’s also about growing machine-to-machine traffic, which is currently projected to account for about 50% of network load by 2025. In situations like this, just determining which device to listen to can be a significant problem, Nandagopal says.

That’s why Intel and the National Science Foundation have jointly announced research awards for projects that seek to improve wireless networking in ultra-dense environments. There are 15 awards in total under the banner of Machine Learning for Wireless Networking Systems, or MLWiNS.

One project, with the University of North Carolina at Charlotte, aims to build smarter mesh networking.

“My project is to develop smart wireless multi-hop networks, such as mesh networks, to enhance the AI performance of actual devices, such as cell phones, robots, auto-driving cars,” says Pu Wang, an assistant professor at UNC Charlotte.

Wang is trying to make edge devices perform better by training them to coordinate in privacy-safe ways. Ultimately, the goal is to teach the network to make smart allocation decisions and optimize performance, using reinforcement learning AI models.
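To illustrate the general idea of reinforcement learning making allocation decisions in a wireless setting, here is a minimal sketch, not the project’s actual code: a single node learns which channel to transmit on via tabular Q-learning against a toy environment where each channel has a hidden, fixed success probability (the probabilities and parameters below are invented for illustration).

```python
import random

# Toy environment: each channel succeeds with a fixed hidden probability.
# These numbers are illustrative, not drawn from the research.
CHANNEL_SUCCESS = [0.2, 0.9, 0.5]

# One Q-value per channel (a single-state Q-learning / bandit setup).
q = [0.0] * len(CHANNEL_SUCCESS)
alpha, epsilon = 0.1, 0.1  # learning rate, exploration rate

random.seed(42)
for step in range(5000):
    # Epsilon-greedy: usually exploit the best-known channel, sometimes explore.
    if random.random() < epsilon:
        ch = random.randrange(len(q))
    else:
        ch = max(range(len(q)), key=lambda i: q[i])
    # Reward 1.0 for a successful transmission, 0.0 otherwise.
    reward = 1.0 if random.random() < CHANNEL_SUCCESS[ch] else 0.0
    q[ch] += alpha * (reward - q[ch])  # incremental value update

best = max(range(len(q)), key=lambda i: q[i])
print("learned best channel:", best)
```

A real system would face a far larger state space (traffic load, interference, topology), which is why the research projects use deep reinforcement learning rather than a lookup table.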

AI can help both on-device and in network routers, says Vida Ilderem, the Intel executive on the project.

One big challenge: managing spectrum. “Spectrum is a very scarce resource. It’s a very expensive resource … you pay sometimes billions of dollars for this spectrum,” says Ilderem. “AI can help … access that spectrum for managing the workload and the traffic in the pipe … there are many, many spectrum waves going on. So the idea is also for AI to manage interference between these different spectrum accesses and usages.”


That includes spectrum-hopping to find open channels for communication and dynamically adjusting spectrum for optimal performance, she adds. Part of that might require special chips for maximizing AI performance; part of it can be done on generic chips or GPUs (graphics processing units). In some cases you might need an ASIC solution: an application-specific integrated circuit.
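The spectrum-hopping idea can be sketched very simply: watch how busy each channel has recently been, and hop to the least-occupied one. The class and occupancy model below are invented for illustration, a hedged sketch rather than any project’s implementation.

```python
from collections import deque

class ChannelHopper:
    """Pick the least-occupied channel based on recent busy/idle observations."""

    def __init__(self, num_channels, window=100):
        # Sliding window of recent busy (1) / idle (0) samples per channel.
        self.history = [deque(maxlen=window) for _ in range(num_channels)]

    def observe(self, channel, busy):
        self.history[channel].append(1 if busy else 0)

    def occupancy(self, channel):
        h = self.history[channel]
        # Channels we have never sensed are optimistically treated as free.
        return sum(h) / len(h) if h else 0.0

    def best_channel(self):
        return min(range(len(self.history)), key=self.occupancy)

hopper = ChannelHopper(num_channels=3)
# Feed in sensing results: channel 0 mostly busy, channel 2 mostly idle.
for busy in (1, 1, 1, 0):
    hopper.observe(0, busy)
for busy in (0, 1, 0, 0):
    hopper.observe(2, busy)
print("hop to channel:", hopper.best_channel())
```

In practice the hard part is the sensing itself and coordinating hops among many devices without collisions, which is where learned models come in.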

Putting intelligence in the network is critical, says the NSF’s Nandagopal.

AI can prioritize critical communications, like calls from medical or emergency personnel. AI can also sense when an emergency occurs, automatically prioritize traffic in that area, and focus capacity where it’s needed.

There’s another angle, too: security. Smart home and IoT devices have been a cesspool of security failures that have left people’s personal data exposed while also providing foot soldiers for hackers to assemble digital armies and launch distributed denial-of-service attacks.

That’s not just homes. In 2017, hackers exploited a weakness in an internet-connected fish tank to gain access to a major casino, highlighting that any company’s network is only as good as its weakest link.

The problem: in a busy, full, talkative IoT world … there are a lot of links. One of them is almost guaranteed to be weak. And some of them hide very well.

“What is really worrisome to security experts is what we call low persistent threats,” Nandagopal says. “A large number of devices that are doing small things here and there that are not noticeable by a human being.”

Those low persistent threats won’t be noticeable to a human monitor. But they might just be biding their time, waiting to strike at the right moment. Smart systems would be better able to understand what kind of traffic is normal, permitted behavior, and what kind is abnormal — perhaps even coordinated with other compromised devices.
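A deliberately simple sketch of that normal-vs-abnormal idea, assuming per-device traffic counters (this is a toy statistical baseline, not a production detector or anything from the research): flag a reading that deviates far from a device’s own historical baseline.

```python
import statistics

def is_anomalous(history, latest, threshold=3.0):
    """Flag a traffic reading that deviates far from a device's own baseline.

    `history` is a list of past per-interval byte counts for one device;
    a reading more than `threshold` standard deviations from the mean of
    that history is flagged.
    """
    if len(history) < 2:
        return False  # not enough data to establish a baseline
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return latest != mean  # perfectly steady device: any change is odd
    return abs(latest - mean) / stdev > threshold

# A thermostat-like device reporting ~100 bytes per interval.
normal = [100, 110, 95, 105, 98, 102, 107, 99]
print(is_anomalous(normal, 104))  # a typical reading
print(is_anomalous(normal, 900))  # a sudden burst of outbound data
```

Real systems would learn richer features (destinations, timing, protocol mix) and correlate across devices, but the principle is the same: model normal behavior per device, then flag deviations.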

With 10 to 50 devices in the typical just-barely-smart home today, this is already a problem. With 100 to 500 in the future, it’s even tougher. Imagine a modern factory in five or ten years with hundreds of thousands if not millions of smart devices …

“Each of them may have their own software vulnerabilities and any one software creating a vector of attack suddenly leads to you having to figure out where did this attack come from,” Nandagopal says. “The network operator has no idea, right? Why is this device which was working fine until yesterday, suddenly behaving [badly]? Is it behaving badly because the owner, the person who owns this device, accidentally misconfigured the settings to keep sending too much data? Or is it behaving maliciously?”

“And again, if you look at it, the scale becomes much harder. If it’s 10 devices, 20 devices, it’s easier to pinpoint and kind of isolate it. You’re talking about millions of devices, now this becomes a problem.”

That’s not easy even with brand-new, recently updated devices. It’s even harder because some IoT devices will have lifespans measured in decades. The road sensor in the freeway, the stress detector embedded in a bridge: those might run for 10 or 20 years.

Smart systems that learn — and remember — should be able to help equipment, facility, and city managers both monitor and maintain all the smart, connected parts of their sensor grids, while also ensuring those devices stay on task and don’t get co-opted by third parties.

Ultimately, Pu Wang’s project at UNC Charlotte is just one part of that. In total, Intel and the NSF announced 15 awards for advanced research and innovation into future wireless systems under the Machine Learning for Wireless Networking Systems (MLWiNS) program.

Those include:

  • Physical Layer Communication Revisited via Deep Learning with University of Illinois Urbana-Champaign and University of Washington
  • Deep Neural Networks Meet Physical Layer Communications – Learning with Knowledge of Structure with Virginia Polytechnic Institute and State University (Virginia Tech) and Massachusetts Institute of Technology (MIT)
  • Artificial Neural Networks for Interference Limited Wireless Networks with Northwestern University, University of Minnesota-Twin Cities and Oregon State University
  • Dino-RL: A Domain Knowledge Enriched Reinforcement Learning Framework for Wireless Network Optimization with The Pennsylvania State University (Penn State) and University of Virginia
  • Reinforcement Learning-Based Self-Driving Wireless Network Management System for Quality of Experience (QoE) Optimization with University of California, Santa Barbara
  • Cross-Layer Integrated Radio Frequency-Based Data-Driven Wireless Device Classification Framework for Spectrum Access Awareness with Oregon State University
  • Quality vs. Quantity in Spectrum Sensing with Distributed Sensors with University of Notre Dame
  • Democratizing AI Through Multi-Hop Federated Learning Over-the-Air with University of North Carolina Charlotte
  • Collaborative Training and Inference at the Wireless Edge for Collective Intelligence with Georgia Tech Research Corporation
  • Ultra-Reliable Collaborative Computing for Autonomous Unmanned Aerial Vehicles with University of California, Irvine
  • Wireless On-the-Edge Training of Deep Networks Using Independent Subnets with Rice University
  • A Coding-Centric Approach to Robust, Secure, and Private Distributed Learning Over Wireless with University of Southern California (USC) and University of California, Berkeley (UC Berkeley)
  • Distributed Learning for the Nomadic Edge with University of Wisconsin-Madison
  • Resource-Constrained Mobile Data Analytics Assisted by the Wireless Edge with New York University
  • Hyperdimensional Computing for Scalable IoT Intelligence Beyond the Edge with University of California, San Diego
Reviewed by Tejkiran Kushwaha on July 12, 2020
