WasteWatch

Tags
python
tensorflow
nvidia-jetson
socketio
raspberry-pi
cockroachdb
javascript
html/css
flask
Published
September 18, 2023
Author
Samar Qureshi
Submitted to Hack the North 2023 by Annie Wang, Lee Zheng, Joe Dai, and me.

Introduction

The North American market for disposable medical devices and supplies stands at a staggering $188 billion USD, with projections estimating growth to $330 billion within the next five years [1]. Second only to wages, disposables represent a substantial variable cost for hospitals. WasteWatch is designed to rein in that cost: by systematically recording and analysing what's discarded in patient rooms and operating theaters each day, we aim to equip both clinicians and hospital administrators with actionable insights.

What it does

WasteWatch takes a comprehensive IoT approach: a node at each trash receptacle identifies exactly what is thrown away at that station, and the data is centralized and displayed so disposal patterns can be interpreted at a larger scale.
Because the hardware watches items as they are discarded, nurses no longer need to pause their tasks to scan individual barcodes, keeping their workflow seamless. This not only yields considerable cost savings by surfacing the wastage of unused items but also contributes to a reduced carbon footprint for hospitals.

How we built it

Each trash can is fitted with a node housing a Raspberry Pi Model B equipped with a camera, an MPU-6050 accelerometer on the I2C bus, and LED indicators. The system activates when the trash bin's lid opens beyond a 50-degree angle, lighting the LEDs and initiating a video stream to our core processor, an NVIDIA Jetson Orin Nano. The MPU-6050's integrated digital low-pass filter suppresses accelerometer noise from nearby vibrations, eliminating false triggers.
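The lid-angle trigger boils down to a tilt computation on the raw accelerometer axes. A minimal sketch, assuming the z-axis points out of the closed lid and readings are in g; the axis convention and function names are illustrative, not our exact node firmware:

```python
import math

LID_TRIGGER_DEG = 50  # lid angle beyond which the camera node activates

def lid_angle_deg(ax: float, ay: float, az: float) -> float:
    """Estimate lid tilt from gravity using raw accelerometer axes.

    With the z-axis normal to the closed lid, a closed lid reads ~1 g
    on z (angle ~0), and the angle grows as the lid opens.
    """
    return math.degrees(math.atan2(math.sqrt(ax ** 2 + ay ** 2), az))

def lid_open(ax: float, ay: float, az: float) -> bool:
    """True when the lid has swung past the trigger threshold."""
    return lid_angle_deg(ax, ay, az) > LID_TRIGGER_DEG
```

In practice the MPU-6050's on-chip low-pass filter smooths the raw axes before any computation like this, which is what kept nearby vibrations from producing false triggers.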
Our unsuccessful attempt at getting a WiFi access point going on the Jetson.
The garbage node with LED indicators and accelerometer.
The video stream is split into image frames, which are converted to binary for batch transmission via SocketIO. On the receiving end, the frames are held in memory with BytesIO and fed into our inference model, built and trained with the TensorFlow/Keras API on our own dataset. The resulting classifications are stored in CockroachDB, a serverless database, and simultaneously relayed to the user interface.
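A minimal sketch of the batching step, assuming each frame arrives already JPEG-encoded as `bytes`; the length-prefix framing and function names here are illustrative, not our exact wire format:

```python
import struct
from io import BytesIO

def pack_frames(frames: list[bytes]) -> bytes:
    """Length-prefix each encoded frame so a whole batch can travel
    as a single binary payload."""
    out = bytearray()
    for frame in frames:
        out += struct.pack(">I", len(frame))  # 4-byte big-endian length
        out += frame
    return bytes(out)

def unpack_frames(payload: bytes) -> list[BytesIO]:
    """Split a batch payload back into in-memory frame buffers."""
    buffers, i = [], 0
    while i < len(payload):
        (n,) = struct.unpack_from(">I", payload, i)
        i += 4
        buffers.append(BytesIO(payload[i:i + n]))
        i += n
    return buffers
```

Since python-socketio supports binary payloads, a batch like this could be emitted with a single `emit` call (the event name is an assumption) and each `BytesIO` buffer on the receiving side handed straight to the model's image-decoding step.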

Challenges we ran into

  • Overfitting of the model
  • Low initial frame rate
  • Converting image frames into binary and transmitting batches of data
  • Communication issues between nodes and asynchronous timing
  • Handling duplicate and successive objects discarded without closing the lid
  • Couldn't get a VNC connection to the Jetson working
  • Using the EDIMax WiFi adapter to establish an air-gapped WiFi network from the Jetson

Accomplishments that we're proud of

  • 2/4 members’ first in-person hackathon!
  • Integrating machine learning with hardware
  • Solving a real world problem with embedded systems engineering

What we learned

  • How to troubleshoot networks
  • WSGI and SocketIO
  • How to build and train our own TensorFlow neural network
  • How to use CockroachDB
Building our TF model.
Organization.
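For flavor, here is a small Keras classifier in the spirit of what we built and trained; the layer sizes, input resolution, and class count are placeholders, not our actual architecture:

```python
import tensorflow as tf

NUM_CLASSES = 4  # hypothetical number of waste categories

def build_model(input_shape=(224, 224, 3)) -> tf.keras.Model:
    """A compact CNN classifier sketched with the TF/Keras Sequential API."""
    model = tf.keras.Sequential([
        tf.keras.Input(shape=input_shape),
        tf.keras.layers.Rescaling(1.0 / 255),          # normalize pixel values
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(
        optimizer="adam",
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model
```

Training then reduces to `model.fit(...)` on the labeled frame dataset, with the usual regularization tricks to combat the overfitting we ran into.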

What's next for WasteWatch

  • Locally hosted and air gapped CockroachDB instance on every node
  • Predictive algorithm for items to bring to a surgery
  • Expanding our training dataset to include items in different states (e.g., soiled with bodily fluids)
  • Opting for a camera with a higher frame rate to account for faster falling objects

References