The next breakthrough in operational planning, execution, and safety for first responders will come from Location-Based Services (LBS) that supply location and map information to personnel in chaotic and perilous situations. A critical requirement for LBS is applicability to any search-and-rescue setting, so an LBS system must integrate indoor positioning, navigation, and mapping into wearable devices and real-time platforms. To address these needs, CSDFD proposes to 1) implement a wearable, multi-sensor indoor positioning system using state-of-the-art technology capable of precisely tracking and positioning personnel and/or assets in terms of latitude, longitude, altitude, and orientation; and 2) develop a group-based mapping framework for real-time, collaborative construction of 3D maps, including optimizations for communication channels limited in throughput, reliability, and latency. The system will use Building Information Modeling (BIM) and will not only connect emergency first responders to those in need of rescue, but will also use physics-based fire modeling, Computational Fluid Dynamics (CFD), to show SMART users how a fire will propagate. Using CFD together with real-time data, the device will also support earthquake, flood, active-shooter, and other emergency situations. In a training scenario, the wearable device, in the form of a helmet or goggles, will simulate the desired emergency crisis so that first responders can interact realistically in a virtual reality environment.
The proposed system will provide a number of benefits to public safety organizations (PSOs) before, during, and after an operation. Indoor positioning will allow every first responder to be tracked, which is needed to improve coordination and safety. The collaborative mapping framework allows units to efficiently construct 3D maps of the environment, providing context for coordination and navigation. Furthermore, the position logs and constructed maps can be reviewed after an operation to evaluate performance and safety. Both contributions directly relate to the technical area “Location-Based Services (Indoor Positioning, Navigation, and Mapping)” by addressing the indoor positioning and mapping components. Successful LBS for first responders must be accurate, timely, and reliable across the widest possible range of physical environments and communication channels.
Meeting the requirements of LBS for first responders will require simultaneous localization and mapping (SLAM). SLAM techniques use vision, range-finding, and/or inertial sensors to track the motion of a device while simultaneously mapping the environment. These techniques are accurate, fast, and require no prior knowledge of the environment. Given the challenging requirements of public safety LBS and the unique advantages of SLAM, a viable LBS solution for first responders would incorporate SLAM technology, distributed positioning and mapping, and optimizations for impaired communications.
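The core SLAM idea described above, estimating the device's own motion and the positions of environmental features at the same time, can be illustrated with a deliberately simplified sketch. This is a toy illustration only, not production SLAM; all class names, values, and the averaging scheme are assumptions made for demonstration.

```python
import math

# Toy illustration of the SLAM idea: track device pose by dead reckoning
# while simultaneously estimating landmark positions from range/bearing
# observations. Real systems (e.g., visual-inertial SLAM) are far more
# sophisticated; everything here is illustrative.

class ToySlam:
    def __init__(self):
        self.x, self.y, self.heading = 0.0, 0.0, 0.0  # device pose
        self.landmarks = {}  # landmark id -> list of position estimates

    def move(self, distance, turn):
        """Integrate an odometry step (dead reckoning)."""
        self.heading += turn
        self.x += distance * math.cos(self.heading)
        self.y += distance * math.sin(self.heading)

    def observe(self, landmark_id, rng, bearing):
        """Convert a range/bearing observation into a map estimate."""
        lx = self.x + rng * math.cos(self.heading + bearing)
        ly = self.y + rng * math.sin(self.heading + bearing)
        self.landmarks.setdefault(landmark_id, []).append((lx, ly))

    def map_estimate(self, landmark_id):
        """Average repeated observations to smooth sensor noise."""
        pts = self.landmarks[landmark_id]
        n = len(pts)
        return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

slam = ToySlam()
slam.move(1.0, 0.0)                 # move 1 m along x
slam.observe("door", 2.0, 0.0)      # door 2 m straight ahead
slam.move(1.0, 0.0)
slam.observe("door", 1.0, 0.0)      # now 1 m ahead, same door
print(slam.map_estimate("door"))    # -> (3.0, 0.0)
```

Repeated observations of the same landmark from different poses converge on one map position, which is the essence of building a map while moving through it.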
Broadly, the necessary steps to achieve the proposed contributions are to build a test platform that incorporates SLAM technology, develop a group-based mapping framework, optimize information dissemination for impaired communication channels, and perform live test campaigns. The project has a timeline of 12 months. Existing, state-of-the-art hardware and software technologies may be leveraged to quickly prototype the system. SLAM algorithms and range-finding sensors have reached a maturity level where this approach is now possible.
The test platform will use commercial off-the-shelf (COTS) mobile devices with range-finding hardware and SLAM capabilities. An application will be custom-built to interface with the hardware and associated software development kits (SDKs) and to visualize real-time location and map data in 3D and 2D.
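The real-time data the application exchanges can be kept compact with a fixed binary layout. The field layout below (member id, timestamp, latitude, longitude, altitude, heading) is a hypothetical wire format sketched for illustration, not a defined protocol of the proposed system.

```python
import struct
import time

# Hypothetical wire format for a real-time position update: member id,
# timestamp, latitude, longitude, altitude, and heading. The exact
# layout is an illustrative assumption.
POSE_FORMAT = "<Bddddf"   # little-endian: uint8, 4 doubles, 1 float

def pack_pose(member_id, lat, lon, alt, heading, ts=None):
    ts = time.time() if ts is None else ts
    return struct.pack(POSE_FORMAT, member_id, ts, lat, lon, alt, heading)

def unpack_pose(payload):
    member_id, ts, lat, lon, alt, heading = struct.unpack(POSE_FORMAT, payload)
    return {"member": member_id, "ts": ts, "lat": lat,
            "lon": lon, "alt": alt, "heading": heading}

msg = pack_pose(7, 47.6205, -122.3493, 56.0, 90.0, ts=0.0)
print(len(msg))                    # 37 bytes per update
print(unpack_pose(msg)["member"])  # 7
```

At 37 bytes per update, even a large team produces only a trickle of position traffic, which matters on the impaired channels discussed below.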
Established networking methodologies will be used to optimize communications on impaired channels. Because every member needs position and map updates from every other member, these updates will be multicast to maximize throughput efficiency. Forward Error Correction (FEC) will be used to combat the bit errors common on unreliable wireless channels. Network outages will be addressed by octree synchronization methods.
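One plausible reading of octree synchronization after an outage can be sketched as follows: partition the map into coarse cells (stand-ins for octree branches), summarize each cell with a digest, and re-transmit only the cells whose digests differ between peers. The cell size, digest scheme, and data layout here are assumptions for illustration.

```python
import hashlib

CELL = 8  # voxels per cell edge (illustrative)

def cell_of(voxel):
    """Map a voxel coordinate to its coarse cell (octree-branch stand-in)."""
    x, y, z = voxel
    return (x // CELL, y // CELL, z // CELL)

def digest(voxels):
    """Stable digest summarizing one cell's contents."""
    data = ",".join(sorted(map(str, voxels))).encode()
    return hashlib.sha256(data).hexdigest()

def summarize(occupied):
    """Group occupied voxels by cell and digest each cell."""
    cells = {}
    for v in occupied:
        cells.setdefault(cell_of(v), set()).add(v)
    return {c: digest(vs) for c, vs in cells.items()}

def cells_to_sync(local_map, remote_digests):
    """Cells the remote peer must resend: missing or differing digests."""
    local_digests = summarize(local_map)
    return [c for c, d in remote_digests.items()
            if local_digests.get(c) != d]

peer_map = {(0, 0, 0), (1, 0, 0), (9, 0, 0)}   # peer's map
our_map  = {(0, 0, 0), (1, 0, 0)}              # ours, after an outage
peer_digests = summarize(peer_map)
print(cells_to_sync(our_map, peer_digests))    # only one cell differs
```

After an outage, peers exchange only the small digest table first, then transfer the few cells that actually changed, rather than the whole map.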
In order to evaluate the positioning, the mapping framework, and the communication optimizations, two test campaigns will be held. Test scenarios will be planned with first responder personnel to mimic the deployments and movements of first responders in emergency situations. The first test will specifically evaluate multi-user positioning and mapping. The second test will incorporate impaired-communications scenarios to test the optimizations to information dissemination.
As a SMART training device simulating real-life first responder scenarios, the proposed technology will pose no threat to users. The SMART helmet/goggles will display an emergency scenario digitally on the faceplate, correlated to the LBS data. As a SMART in-field device, the proposed technology has the potential to save the lives of civilians and first responders. Because development will occur in close conjunction with active firefighters to ensure system features are relevant to the needs of PSOs, the technology will be user-friendly, comfortable, and effective for everyday use by first responders, further increasing the safety of the public and emergency personnel.
Once a mapping platform is developed, the 3D mapping and sensing technology will be accessible through an application on any SMART device. CSDFD proposes a 911-GO app available for download on the SMART devices already widely used by the civilian population. Through 911-GO, civilians will be able to scan their homes, producing Building Information Modeling (BIM) data that will be uploaded to a secure network accessible to first responders. The BIM data will allow first responders to use LBS to locate victims of fire and other crises. 911-GO will show first responders and civilians the location of a fire and will be able to predict the development of such an emergency situation so that PSOs and civilians alike may take steps to avoid harm. Because SMART devices are now so widely used, broad access to the 911-GO app is realistic.
In order to use this technology as a training tool, PSOs will need to purchase SMART helmets/goggles so that a scenario may be displayed on the “screen.” Although 911-GO will be available on any SMART device capable of downloading an app, a helmet/goggle screen will be the most effective way for a first responder to train in a realistic, interactive environment. For civilian and PSO use in emergency situations, the greatest hindrance to full integration is public education. The public will need to be made aware of the technology and its importance in reducing injury and death in emergency crises, and will need to be encouraged to scan their homes into 911-GO so that the database holds up-to-date information for effective use. As with any technological progress, the most important step is education.
The proposed SMART helmet/goggles will be able to simulate changes in light, weather, noise, and physical instability through lifelike renderings on the helmet/goggle screen, along with synced headphones built into the helmet for simulated audio and communication with other first responders. The SMART helmet/goggles will have the ability to record video and audio of each user's training activities. The proposed platform will be able to integrate with other technologies that may be used during a training simulation, including many of those listed in the NIST References and Situations: Location-specific identification tools; Sensors; Touch displays; Audio cues; Voice commands/auditory controls; Data analytics and sharing tools; Smart Alerts; Data integration tools; Mapping technologies; Interface layouts; Biofeedback; Haptics; Gesture recognition; and Smart watches.
Currently available hardware and software include two Oculus headsets, two Falcon Nikki gaming computers, two Samsung Gear 360 video cameras, one ODG R-6 Augmented Reality glasses, and one Realwear HM-1 Augmented Reality (AR)-enabled wearable tablet. Additional equipment has been pre-promised to the project, including industrially hardened computer vision hardware, artificial intelligence software, and the expertise of industry professionals.
The Purchase Order List (POL) is commensurate with the solution's scale and available funding. Based on past, current, and expected donations, only minimal to modest equipment is required to achieve single-user and small-team Virtual Reality (VR) application successes.
Suggested future equipment for multi-player use includes:
Additional VR- and AR-enabled headsets and gaming computers.
Various 3D gaming software suites.
Additional 360 video cameras, including 6X and 24X lens configurations for in-video depth navigation.
Multiple GPUs to improve per-user rendering performance.
To evolve the VR-enabled platform into an AR real-time safety solution for emergency response, additional equipment and software needs will have to be met.
The 3D mapping and sensing technology will allow any building to be scanned and uploaded into the network database. The information may then be developed through Building Information Modeling (BIM) for a desired emergency simulation. For firefighters, this technology may be used to simulate smoke- and fire-filled public buildings and private homes and, most importantly, civilian rescue simulations. For police officers, this technology may be used in active-shooter simulations, whether in a building or outdoors at a public event with crowds of people. For Search & Rescue, the technology may simulate disaster response in both indoor and outdoor environments, including terrestrial simulations for search and rescue during an earthquake. For explosives and hazmat, the technology may simulate an unmanned ground vehicle scenario. The technology will be capable of integrating with other technologies to accomplish the desired training simulation.
The proposed design will enable the collection of data and metrics on trainee performance, including:
Time required for the trainee to rescue a person in a fire and extricate them from the scene.
Accuracy in completing a task.
Efficiency of interactive conversations and tasks between first responders and the command center.
Set-up time at an emergency scene.
Speed with which the command center accesses information from the scene, makes decisions, and sends directions back to the first responders.
Regarding technology performance, the design will enable the collection of data and metrics on:
Mistakes made with the technology interface in virtual space.
Accuracy and precision of location-based technologies.
Victim identification, medical record receipt, and display.
Upload and/or sharing of visual elements (photos, video, etc.), including the accuracy of the transmitted information.
The level of information or data actually required for optimal performance.
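Several of the trainee metrics above reduce to simple computations over a timestamped event log produced by the training platform. The sketch below shows two of them; the event names and log schema are assumptions made for illustration, not the platform's actual telemetry format.

```python
# Sketch of trainee metrics computed from a hypothetical timestamped
# event log: (seconds_since_start, event_name) pairs.

def time_to_rescue(events):
    """Seconds from scenario start until the victim is extricated."""
    times = {name: t for t, name in events}
    return times["victim_extricated"] - times["scenario_start"]

def task_accuracy(events):
    """Fraction of attempted tasks completed correctly."""
    done = sum(1 for _, name in events if name == "task_correct")
    failed = sum(1 for _, name in events if name == "task_error")
    attempted = done + failed
    return done / attempted if attempted else 0.0

log = [
    (0.0, "scenario_start"),
    (42.5, "task_correct"),
    (61.0, "task_error"),
    (75.0, "task_correct"),
    (180.0, "victim_extricated"),
]
print(time_to_rescue(log))   # 180.0 seconds to rescue
print(task_accuracy(log))    # 2 of 3 tasks correct
```

Because every metric is derived from one shared log, the same recording can feed both after-action review and longitudinal comparison across training sessions.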