AI System Transforms Security Cams into Fire Detectors

NYU Tandon School of Engineering

Fire kills nearly 3,700 Americans annually and destroys $23 billion in property, with many deaths occurring because traditional smoke detectors fail to alert occupants in time.

Now, the NYU Fire Research Group at NYU Tandon School of Engineering has developed an artificial intelligence system that could significantly improve fire safety by detecting fires and smoke in real-time using ordinary security cameras already installed in many buildings.

Published in the IEEE Internet of Things Journal, the research demonstrates a system that can analyze video footage and identify fires within 0.016 seconds per frame, faster than the blink of an eye, potentially providing crucial extra minutes for evacuation and emergency response. Unlike conventional smoke detectors that require significant smoke buildup and proximity to activate, this AI system can spot fires in their earliest stages from video alone.

"The key advantage is speed and coverage," explained lead researcher Prabodh Panindre, Research Associate Professor in NYU Tandon's Department of Mechanical and Aerospace Engineering (MAE). "A single camera can monitor a much larger area than traditional detectors, and we can spot fires in the initial stages before they generate enough smoke to trigger conventional systems."

The need for improved fire detection technology is evident from concerning statistics: 11% of residential fire fatalities occur in homes where occupants received no alert, either because smoke detectors malfunctioned or because no detectors were installed at all. Moreover, modern building materials and open floor plans have made fires spread faster than ever before, with structural collapse times significantly reduced compared to legacy construction.

The NYU Tandon research team developed an ensemble approach that combines multiple state-of-the-art AI algorithms. Rather than relying on a single AI model that might mistake a red car or sunset for fire, the system requires agreement between multiple algorithms before confirming a fire detection, substantially reducing false alarms, a critical consideration in emergency situations.
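The agreement requirement can be illustrated with a minimal sketch. The voting scheme below is an assumption for illustration, not the paper's actual method: it confirms a detected fire region only when at least two models report overlapping bounding boxes, so a single model hallucinating flames in a sunset is discarded.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    if inter == 0:
        return 0.0
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def ensemble_confirm(detections_per_model, min_agree=2, iou_thresh=0.5):
    """Keep only boxes that at least `min_agree` models report in roughly
    the same location; lone detections are treated as likely false alarms.
    Thresholds here are illustrative placeholders."""
    confirmed = []
    for i, boxes in enumerate(detections_per_model):
        for box in boxes:
            votes = 1  # the model that proposed the box
            for j, others in enumerate(detections_per_model):
                if j != i and any(iou(box, ob) >= iou_thresh for ob in others):
                    votes += 1
            if votes >= min_agree and box not in confirmed:
                confirmed.append(box)
    return confirmed
```

Requiring spatial agreement between independently trained detectors trades a little recall for a large drop in false positives, which matters when an alert can trigger an emergency response.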

The researchers trained their models by building a comprehensive custom image dataset representing all five classes of fires recognized by the National Fire Protection Association, from ordinary combustible materials to electrical fires and cooking-related incidents. The system achieved notable accuracy rates, with the best-performing model combination reaching 80.6% detection accuracy.

The system incorporates temporal analysis to differentiate between actual fires and static fire-like objects that could trigger false alarms. By monitoring how the size and shape of detected fire regions change over consecutive video frames, the algorithm can distinguish between a real, growing fire and a static image of flames hanging on a wall. "Real fires are dynamic, growing and changing shape," explained Sunil Kumar, Professor of MAE. "Our system tracks these changes over time, achieving 92.6% accuracy in eliminating false detections."
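The idea of tracking region changes over consecutive frames can be sketched in a few lines. This is a simplified stand-in for the paper's temporal analysis, with an illustrative threshold: a real fire's detected area fluctuates and grows between frames, while a poster or screen showing flames keeps a near-constant footprint.

```python
def is_dynamic_fire(areas, change_thresh=0.05):
    """Decide whether a sequence of detected fire-region areas (one per
    consecutive frame) behaves like a real, evolving fire. A static
    fire-like object yields near-zero frame-to-frame change.
    `change_thresh` is an illustrative placeholder, not a published value."""
    if len(areas) < 2:
        return False  # not enough frames to judge dynamics
    changes = [abs(b - a) / a for a, b in zip(areas, areas[1:])]
    return sum(changes) / len(changes) > change_thresh
```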

The technology operates within a cloud-based Internet of Things architecture where multiple standard security cameras stream raw video to servers that perform AI analysis. When fire is detected, the system automatically generates video clips and sends real-time alerts via email and text message. This design means the technology can be implemented using existing CCTV infrastructure without requiring expensive hardware upgrades, an important advantage for widespread adoption.
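The alerting step of such a pipeline might look like the sketch below. The event fields and message format are assumptions for illustration; actual delivery via an email or SMS gateway is omitted.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class FireAlert:
    """Hypothetical detection event emitted by the cloud analysis service."""
    camera_id: str
    confidence: float
    clip_path: str       # server-side path of the auto-generated video clip
    timestamp: datetime

def format_alert(alert: FireAlert) -> str:
    """Render the notification body that would be sent by email/SMS
    once the ensemble confirms a detection."""
    return (
        f"FIRE DETECTED by camera {alert.camera_id} at "
        f"{alert.timestamp.isoformat()} "
        f"(confidence {alert.confidence:.0%}). "
        f"Video clip: {alert.clip_path}"
    )
```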

This technology can be integrated into drones or unmanned aerial vehicles to search for wildfires in remote forested areas. Early-stage wildfire detection would buy critical hours in the race to contain and extinguish them, enabling faster dispatch of resources and prioritized evacuation orders that dramatically reduce ecological and property loss.

To improve the safety of firefighters and assist during fire response, the same detection system can be embedded into the tools firefighters already carry, including helmet cameras, thermal imagers, and vehicle-mounted cameras, as well as into autonomous firefighting robots. In urban areas, UAVs integrated with this technology can help the fire service perform a 360-degree size-up, especially when fire is on higher floors of high-rise structures.

"It can remotely assist us in confirming the location of the fire and possibility of trapped occupants," said Capt. John Ceriello from the Fire Department of New York City.

Beyond fire detection, the researchers note their approach could be adapted for other emergency scenarios such as security threats or medical emergencies, potentially expanding how we monitor and respond to various safety risks in our society.

In addition to Panindre and Kumar, the research team includes Nanda Kalidindi ('18 MS Computer Science, NYU Tandon), Shantanu Acharya ('23 MS Computer Science, NYU), and Praneeth Thummalapalli ('25 MS Computer Science, NYU Tandon).
