
K-State Today

Division of Communications and Marketing
Kansas State University
128 Dole Hall
1525 Mid-Campus Drive North
Manhattan, KS 66506
785-532-2535
vpcm@k-state.edu

November 19, 2021

K-State computer science faculty and ISCAAS lab director Arslan Munir receives funding from AFOSR

Submitted by Arslan Munir

Multimodal information fusion for human activity recognition is expected to outperform models that rely on a single modality, and the Air Force has shown considerable interest in multimodal fusion for activity recognition in recent years. Many existing approaches to activity recognition perform poorly under varying environmental and/or lighting conditions and are not suitable for real-time recognition. The Air Force Office of Scientific Research has awarded $149,900 to K-State computer science faculty member Arslan Munir for a project to investigate multimodal real-time activity recognition.

Munir, associate professor, Michelle Munson-Serban Simu Keystone Research Faculty scholar, and founding director of the Intelligent Systems, Computer Architecture, Analytics and Security, or ISCAAS, laboratory in the computer science department at the Carl R. Ice College of Engineering, is the investigative lead for the Kansas State University project "A Multimodal Attention-Based Deep Learning Framework for Real-Time Activity Recognition at the Edge."

This research project proposes a deep learning-based framework for real-time human activity recognition at the edge under varying environmental and/or lighting conditions by leveraging multiple sensor modalities, such as color cameras, infrared cameras, depth cameras and radars, together with an attention-based mechanism to fuse the sensor data. The framework first performs comprehensive preprocessing of the raw signal data, then applies a specialized convolutional neural network to each modality to extract meaningful features, and finally fuses spatial and temporal features using attention-based convolutional neural networks and recurrent layers. To enable real-time and energy-efficient activity recognition at the edge, the project also aims to develop innovative algorithms and techniques for hardware acceleration of the proposed framework.
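The general shape of such a framework can be illustrated with a short PyTorch sketch. All module names, layer sizes and the soft-attention weighting below are illustrative assumptions chosen for exposition, not the project's actual design:

import torch
import torch.nn as nn

class ModalityEncoder(nn.Module):
    """Per-modality CNN that maps one raw sensor frame to a feature vector."""
    def __init__(self, in_channels: int, feat_dim: int = 128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # global pooling -> (B, 64, 1, 1)
        )
        self.fc = nn.Linear(64, feat_dim)

    def forward(self, x):                      # x: (B, C, H, W)
        return self.fc(self.conv(x).flatten(1))  # (B, feat_dim)

class AttentionFusionHAR(nn.Module):
    """Soft-attention fusion of per-modality features, followed by a GRU
    over time and a classification head (illustrative sketch only)."""
    def __init__(self, modality_channels, feat_dim=128, hidden=256, n_classes=10):
        super().__init__()
        self.encoders = nn.ModuleList(
            ModalityEncoder(c, feat_dim) for c in modality_channels)
        self.attn = nn.Linear(feat_dim, 1)     # scores each modality
        self.gru = nn.GRU(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, streams):
        # streams: one tensor per modality, each shaped (B, T, C, H, W)
        B, T = streams[0].shape[:2]
        fused = []
        for t in range(T):
            feats = torch.stack(
                [enc(s[:, t]) for enc, s in zip(self.encoders, streams)],
                dim=1)                                   # (B, M, feat_dim)
            w = torch.softmax(self.attn(feats), dim=1)   # attention over modalities
            fused.append((w * feats).sum(dim=1))         # weighted fusion
        seq = torch.stack(fused, dim=1)                  # (B, T, feat_dim)
        out, _ = self.gru(seq)                           # temporal aggregation
        return self.head(out[:, -1])                     # logits (B, n_classes)

# Example: RGB (3 channels), infrared (1) and depth (1) streams of 8 frames.
model = AttentionFusionHAR(modality_channels=[3, 1, 1])
rgb = torch.randn(2, 8, 3, 64, 64)
ir = torch.randn(2, 8, 1, 64, 64)
depth = torch.randn(2, 8, 1, 64, 64)
logits = model([rgb, ir, depth])                         # (2, 10)

In this sketch, each modality stream is encoded frame by frame, a learned attention score weights the modalities before fusion, and a recurrent layer aggregates the fused features over time; the project's actual preprocessing, network architectures and hardware-acceleration techniques would differ.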

The success of this project will have a significant impact on safeguarding national security and defense. Activity recognition across different environments and conditions is particularly important for the Air Force. Furthermore, real-time activity recognition is crucial because a real-time response is often required to minimize losses.
