An Almost Complete Journey of JIBC’s Fire Investigation Simulation

How It All Started

Back in October 2020, we had a conversation with instructors and leaders from JIBC’s Fire Fighting Program and learned that practical firefighting training can be costly, dangerous, and available only at specialized locations.

Taking fire investigation as an example, instructors need to purchase used furniture, set up scenarios such as a kitchen or a bedroom in shipping containers, burn them, and then put the fire out. Only after all these steps can students enter and investigate the cause and origin of the fire. In addition, instructors report that the burn patterns sometimes do not turn out as expected.

Bedroom in a Cubicle


Early Prototyping

  • In early 2021, we partnered with a team from the Centre for Digital Media (CDM) and created a working prototype for fire investigation over a period of 13 weeks. The CDM team did an excellent job of documenting their design and development process on their blog.

  • Additionally, within the handover package, the team provided a well-designed infographic to illustrate the main features of the simulation. Kudos to the team again for exceeding our expectations.

Piloting the Simulation

  • As our ultimate goal is to integrate this gamified simulation into the firefighting program, we decided to pilot the simulation in an upcoming course section.

  • Kavita and Dennis from our CTLI team designed an HTML page that includes all the necessary information for the learners.

Pivoting Based on User Feedback

  • Through the pilot, we collected and analyzed students’ feedback and found that one of the main issues was that students had a hard time downloading, installing, and accessing the simulation, especially Mac users.

  • Even for those who were able to access it, the simulation could be choppy depending on the performance of users’ computers. In short, our analysis suggested that we needed to address the accessibility and usability of the simulation.

  • While users acknowledged the potential of this fire investigation simulation, a better solution was needed. After discussions with our team, we decided to convert it into a web-based simulation while also improving usability within the simulation.

A New Direction

  • With a limited budget, we worked with a CDM alumnus, a software engineer with a passion for health and education simulations, to convert the simulation into a more accessible web-based solution.

  • The first prototype of the web-based solution was completed in July 2022, followed by user testing aimed at assessing its accessibility and usability.

  • User testing suggests that loading and response speed, navigation, and wayfinding are the key areas for improvement, along with other minor adjustments. For a more detailed report, please refer to the Fire Sim User Testing Report – Aug 2022.

What Next?

Design is a craft and sometimes a never-ending process. We have identified a list of achievable changes to improve user interaction and overall user experience. We are also hoping to pilot the web-based fire simulation in an upcoming course and continue to collect feedback from users for improvement.

What are the lessons learned?

  • Having pivoted a few times, from the original idea of a VR application to a computer-based one and finally to a web-based one, we learned again that good design needs to be accessible first and usable second.
  • User research is a critical task that should never be overlooked; as the False-Consensus Effect suggests, we are not our users, and we should not assume users will behave as we do.
  • Looking at the big picture, this simulation has a powerful impact on the overall course design: assignments and relevant instructions need to be adjusted accordingly, and instructors will play a key role in supporting students when a new tool is introduced.

Developing HoloLens Application for Aircraft Maintenance Engineers

In case you don’t know, our project is essentially to port an existing desktop Helicopter Rotor Head application to a Microsoft HoloLens application. In addition, we are adding voice control and multi-user networking.

(Image Credit: John Bondoc, UX Designer)

Sounds interesting? It’s actually quite scary, because none of our team members had prior experience developing HoloLens applications. So we learn as we go.

We just finished Sprint #6 and entered Sprint #7.

(Credit: Junsong Zhang, Project Manager)

In Sprint #5, we created a prototype based on the concept art below and started adding voice control to the application. The main changes were: 1) we added a voice control command board, and 2) the controls were moved below the rotor.

(Image Credit: John Bondoc, UX/UI Designer)

However, in Sprint #6, we tested the prototype and found that the main problems were:

  • Difficulty with the collective/cyclic controls.
  • Difficulty seeing effects such as airflow and swashplate movement.
  • Difficulty looking at the rotor while moving the cyclic/collective controls.
  • Slow response from voice control.
  • Confusion about the voice control menu.
  • Insensitivity of voice control in noisy environments.
  • No indication when the “make bigger/smaller” voice commands hit their limits.
  • Not enough training/instructions on how to use the HoloLens.
  • Users tend to treat the model as a 2D object instead of a 3D one.

Considering scope, we decided to work on the ones that are critical to the functionality of this application:

  • Improve cyclic function
  • Resize/reposition cyclic & collective controls
  • Resize the whole field of view so users don’t need to move their heads too much at the beginning.
  • Change how the model scales up and down: keep the rotor in the background while it grows or shrinks, instead of jumping to the front.
  • Redesign the voice control menu: instead of a command board, the new voice control menu will be interactions that give users instructions when they gaze/hover over the buttons.

(Credit: the entire team)

Based on that, we came up with a new sketch that reflects our new interface and interactions. The main changes are the positioning of the controls and the voice control menu, as well as how the interactions work. We’ll have to prototype and test it.

(Image Credit: John Bondoc, UX Designer)

From this week on, we will spend more time developing multi-user networking. The idea is to enable instructors to broadcast their views and modifications in the application to students in real time.
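At its core, that broadcast idea is a publish/subscribe pattern: the instructor publishes a change, and every connected student applies it. The sketch below is a hypothetical Python illustration of that shape only; the real application would use HoloLens/Unity networking, and all class and field names here are invented for the example.

```python
# Minimal publish/subscribe sketch of instructor-to-students broadcasting.
# Hypothetical names (Session, Student, rotor_scale); a real implementation
# would push these events over a network transport rather than in memory.

class Student:
    """A connected student client holding a local copy of the shared state."""

    def __init__(self, name):
        self.name = name
        self.state = {}

    def apply(self, event):
        # Merge the instructor's modification into the local view.
        self.state.update(event)


class Session:
    """The instructor's session: tracks students and broadcasts events."""

    def __init__(self):
        self.students = []

    def join(self, student):
        self.students.append(student)

    def broadcast(self, event):
        # Every modification the instructor makes is pushed to all students.
        for student in self.students:
            student.apply(event)


session = Session()
alice, bob = Student("alice"), Student("bob")
session.join(alice)
session.join(bob)

# Instructor scales the rotor model; both students see the same change.
session.broadcast({"rotor_scale": 1.5})
print(alice.state, bob.state)
```

In the actual app, `broadcast` would serialize each event and send it over the network, but the one-publisher-to-many-subscribers flow would be the same.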