
Introduction:

I recently found out about NASA’s Space Apps Challenge, an international hackathon that happens every year. At Space Apps, teams try to solve some of the world’s most pressing issues. Due to COVID-19, this year’s Space Apps was fully online. I got wind of this, applied, and over 48 hours my team and I built J.A.R.V.I.S. What follows is a direct copy of the project page on the Space Apps website.

Summary

COVID-19’s impact can be felt from pole to pole, leaving entire communities in desolation. The pandemic places an especially large burden on remote task forces, raising a team’s stress level and making its members prone to burnout, which can force a mission to be aborted. Project J.A.R.V.I.S. is designed to address these issues. Using AI-powered analytics, J.A.R.V.I.S. homes in on the factors that lead to burnout, analyzing crew members’ facial emotions, gait, social interactions, and more. From this, J.A.R.V.I.S. calculates recommended procedures to prevent burnout and, if a situation is severe, notifies the team captain and HQ about potential next steps.

How We Addressed This Challenge

Our team has identified the collateral effects of the pandemic on remote work sites. Such effects include emotional burnout, increased anxiety, and higher stress levels. These factors all increase the chance of personnel making mistakes in their work, and in some cases, can lead to the failure of an entire mission.

This is where J.A.R.V.I.S. comes in. J.A.R.V.I.S. mitigates the psychological issues that could impede a mission’s success. The AI-powered system monitors both the physical and mental health of personnel. This entails analyzing data from a crew member’s wearable technology, such as blood pressure and heart rate, as well as reading a person’s gait and emotions via security cameras. The data is then processed into readable graphs and compiled into a completely private profile for each crew member. Only the crew member can access his or her data, unless J.A.R.V.I.S. determines that the crew member is at risk of burning out or of harming others on the mission. J.A.R.V.I.S. can also give suggestions to a stressed crew member, such as taking a break, drinking water, or other ways to recover. It is important to note that ALL video and wearable data is deleted after processing, leaving only the analysis and the private profile.

An individual member’s private profile is then combined with the others to form a profile of the entire crew, which is shared with the team captain via a virtual dashboard. Statistics displayed on the team dashboard include each member’s chance of burnout and the team’s overall stress level. From there, the team captain can observe the mental performance of the crew and assess whether there is a risk of burnout and how it can be managed, e.g. by temporarily reducing the workload of some personnel.

Crew Dashboard
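The real pipeline lives in the repository; purely as an illustration of how per-member analyses might roll up into the team statistics shown on the dashboard, here is a minimal pandas sketch. The field names, threshold, and figures are our own illustrative assumptions, not project data.

```python
import pandas as pd

# Hypothetical per-member analysis output (names and values are illustrative).
members = pd.DataFrame({
    "member": ["A", "B", "C", "D"],
    "stress": [0.31, 0.74, 0.55, 0.82],  # normalized 0-1 stress estimate
})

BURNOUT_THRESHOLD = 0.7  # assumed cut-off, not the project's actual value

# Roll individual profiles up into team-level dashboard statistics.
team = {
    "mean_stress": members["stress"].mean(),
    "members_at_risk": int((members["stress"] > BURNOUT_THRESHOLD).sum()),
    "burnout_chance": float((members["stress"] > BURNOUT_THRESHOLD).mean()),
}
print(team)
```

In the actual system these aggregates would be fed to the Infogram dashboard rather than printed.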

An additional safety feature, as mentioned before, is that an individual’s data is made available to the team captain only if J.A.R.V.I.S. determines that the crew member is in a state of persistent anxiety. From there, the team captain can take specific measures to reduce that stress, e.g. adjusting the work schedule until the person recovers. This happens only when a team member shows a sustained state of anxiety; otherwise, the team captain has access only to his or her own private profile and the team dashboard.
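This escalation rule can be expressed as a small policy function. Below is a minimal sketch, assuming a stream of periodic anxiety estimates in [0, 1]; the threshold and window are invented for illustration and are not the values J.A.R.V.I.S. actually uses.

```python
def captain_can_view(anxiety_history, threshold=0.7, window=5):
    """Release a member's private data to the captain only when the anxiety
    estimate has stayed above `threshold` for `window` consecutive checks.
    (Both parameter values are illustrative assumptions.)"""
    recent = anxiety_history[-window:]
    return len(recent) == window and all(a > threshold for a in recent)

# Persistent anxiety: data is escalated to the captain.
print(captain_can_view([0.8, 0.9, 0.8, 0.75, 0.9]))  # True
# A single calm reading breaks the streak: data stays private.
print(captain_can_view([0.9, 0.9, 0.2, 0.9, 0.9]))   # False
```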

In conclusion, we believe that Project J.A.R.V.I.S. successfully addresses the challenge by dealing with the potential psychological issues of personnel at remote worksites in a non-intrusive manner, relying on reliable, science-based predictive diagnostics to enhance decision-making and the productivity of remote facilities.

So as not to bore the non-technical reader, all technical details of the code are in the README of the GitHub repository.

We wish to outline ways we could have improved the project given a longer period of time:

1. With more time to work on this project, we could train the AI to have a more granular and sophisticated analysis of the person’s mental and physical state, thus increasing the system’s reliability.

2. With access to higher-quality datasets, we could increase the accuracy of the system (with so many resources and datasets, it was hard to sort through them all).

3. As the recommendations that J.A.R.V.I.S. gives to the team have real-life impacts, we would look forward to working with psychiatrists and others with a medical background (none of us have one) to help develop J.A.R.V.I.S.’s recommendations. We also want to implement the protocols that NASA and other agencies have already developed with J.A.R.V.I.S. (we contacted people at NASA but, given the nature of these protocols, were unable to get access to them).

How We Developed This Project

Legend:

  1. What inspired your team to choose this challenge?
  2. What was your approach to developing this project?
  3. How did you use space agency data in your project?
  4. What tools, coding languages, hardware, software did you use to develop your project?
  5. What problems and achievements did your team have?

1. Our team was inspired by the opportunity to contribute to the space industry by designing a remote crew support system for situations, such as the current pandemic, in which HQ cannot be reached. We hoped it would help reduce mission failures and enable a task force to be self-sustaining. We also wanted to express our gratitude to the valiant space crews and researchers who risk their lives and well-being in the name of science.

2. First, we brainstormed ideas to develop a plan for this project. Having discussed and consolidated the final version of the plan, we mapped out each other’s skill sets to see how we could best create deliverables and bring the project together. These skills included data analysis, data engineering, ML, solution architecture, and keeping an organized schedule of activities via daily meetings.

Our first step was to identify which audience we wanted to focus on, which we did using the “Persona technique”. We were also keen to find an accurate but non-intrusive way of capturing data, and to decide what kind of data we would need to capture.

Our next task was to come up with a solution for autonomously monitoring crews without the need for human labor. Conversing on the subject, we concluded we could use small embedded devices such as Raspberry Pis to analyze crew members’ facial emotions. In parallel with the other tasks, we developed the entire hardware system.

J.A.R.V.I.S is able to analyze a person’s emotions
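The trained models themselves are in the repository; to illustrate the surrounding glue, here is a minimal NumPy sketch of the preprocessing and label mapping a FER2013-style classifier needs (48×48 grayscale input, seven emotion classes). The helper names are our own, and the resize is a deliberately simple nearest-neighbour one.

```python
import numpy as np

# The seven FER2013 emotion classes, in their conventional order.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def preprocess_face(gray_face, size=48):
    """Nearest-neighbour resize of a grayscale face crop to the 48x48 input a
    FER2013-trained Keras model expects, scaled to [0, 1] with batch and
    channel axes added."""
    h, w = gray_face.shape
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    resized = gray_face[rows][:, cols].astype("float32") / 255.0
    return resized[np.newaxis, :, :, np.newaxis]  # shape (1, 48, 48, 1)

def dominant_emotion(probabilities):
    """Map a model's 7-way softmax output back to an emotion label."""
    return EMOTIONS[int(np.argmax(probabilities))]
```

In the actual system the tensor returned by `preprocess_face` would be fed to the Keras model, and its softmax output passed to `dominant_emotion`.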

We then identified the next phase: how to implement privacy-friendly processing of that data and present it in a user-friendly way. We also acknowledged the importance of handling sensitive medical data carefully and built that into our plan.

As J.A.R.V.I.S. is a tool, we put a lot of dedication into developing a user-friendly way to interact with the complex AI system. One of our teammates had experience creating data pipelines and dashboards, so we spent time connecting our datasets and camera feeds to the dashboard. This allowed us to design a user-friendly visual that concisely summed up J.A.R.V.I.S.’s analysis.

Check out the dashboard @ Infogram

3. We used psychological data from the Performance Readiness Evaluation Tool (PRET), made available by the Canadian Space Agency. We picked this dataset because it provides the psychological perspective of potential end-users of our project and helped us visualize how our dashboard would work.

Through our research, we did not find other raw datasets (.csv files, spreadsheets, etc.) to assist us in our project. However, the following datasets were used:

We used biometric data from wearable devices, which came from the WESAD dataset, available in the UCI Machine Learning Repository. We chose this dataset because it matches the data we would be collecting from end-users. Due to our time constraints and a lack of essential hardware (we did not have access to a heart rate sensor), we analyzed what kind of response the AI could potentially recommend to the crew captain.
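As a sketch of the kind of rule the AI could apply here, the snippet below flags sustained elevated heart rate in a pandas series and emits a recommendation. The sample readings and both thresholds are invented for illustration and are not taken from WESAD.

```python
import pandas as pd

# Illustrative heart-rate stream (beats per minute); real input would come
# from a wearable sensor.
hr = pd.Series([72, 75, 96, 101, 99, 104, 98, 77], name="bpm")

ELEVATED_BPM = 95       # assumed resting-state cut-off
SUSTAINED_SAMPLES = 3   # consecutive elevated readings that count as stress

elevated = hr > ELEVATED_BPM
# True wherever the last SUSTAINED_SAMPLES readings were all elevated.
sustained = elevated.rolling(SUSTAINED_SAMPLES).sum() == SUSTAINED_SAMPLES

if sustained.any():
    print("recommendation: take a break and drink water")
```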

Finally, we used:

A segmentation model (which draws the person’s contour), trained on the MSCOCO dataset, which was fundamental to our project development.

An emotion analysis model trained on the FER2013 dataset, used to analyze human emotions, which is fundamental to the functioning of J.A.R.V.I.S.

4. We used draw.io to draw architecture diagrams. We also used Google Docs to exchange information, edit documentation, organize meetings, and share agendas. For meetings, we used Rocket.Chat, Zoom, and Google Meet.

For the project, we used:

Infogram: used for dataset illustration;

Python: Language used for the back end

Libraries used:

TensorFlow: Used for deep learning (using Keras);

OpenCV: Used for image manipulation and preparing our video demo;

Pandas: Used for data operations, tabular data analysis and manipulation;

Hardware:

Raspberry Pi: Used to collect, pre-process, and transmit video data (prototyping edge computing); in the future, this will be replaced by an application-specific embedded device;

AWS: Used as a computation platform;

Spark: Used as a data processing framework.

5. Outline of issues faced while making J.A.R.V.I.S:

1. Remote communication and idea sharing across multiple time zones – we agreed on meeting lengths in advance to make sure we planned ahead, didn’t spend too much time on any one issue, and didn’t exhaust team members in other time zones;

2. Language barrier – where a language barrier arose, team members would help translate or clarify the idea for the rest of the team;

3. Finding datasets that matched our project – we spread out our resources to try to find any relevant datasets, which turned out to be a success; however, we could find even more datasets next time;

4. Discussing the issue of privacy and how it would be embedded in our system – we designated a special call for this and prepared a document with all of our ideas, so we could see where we agreed and disagreed. After that, we debated and settled on the common elements of the privacy infrastructure in our project;

5. Integrating the subsystems: as we were limited to only two days of work, we developed the hardware, software, and visuals in parallel. This meant we had to dedicate a serious amount of time to fully integrating every part; and

6. Writing documentation: with a strict word limit, we chose every single character with care. This was definitely a tedious task.

We achieved the following:

1. We created a prototype of a significant part of our system, including data ingestion and processing, video recognition, information dashboards, risk detection, and an alerting engine. We also had time to produce a high-level design for the project and were able to fully validate the concept of J.A.R.V.I.S.

2. We successfully resolved emerging issues by maintaining ongoing communication and agenda-keeping, avoiding wasted time and staying focused on the core of the project. We stayed on task despite the urge to celebrate our small victories, which greatly benefited the project.

Project Demo

Github

Data & Resources

1) Biometric data from wearable devices: from the WESAD dataset, available in the UCI Machine Learning Repository

2) Psychological data: data from Performance Readiness Evaluation Tool (PRET), made available by the Canadian Space Agency

3) Video analysis: the segmentation model (which draws the person’s contour) was trained on the MSCOCO dataset, and the emotion analysis model was trained on the FER2013 dataset

Tags

#projectjarvis #wellbeing #AIsupport #Dashboard #AI #RemoteLocation #OpenCV #Python #Keras
