r-u-still-there

A home automation sensor for human presence using thermal cameras.

The most common way to detect if a space is occupied or not is with a PIR sensor, but this comes with the downside that it doesn't detect stationary people. r-u-still-there is an application that can be installed on an embedded Linux system along with a thermal camera. r-u-still-there will then notify your home automation system when it detects a person, and when that person leaves its view. In other words, you can use it to sense if a room is occupied, even if the people in that room are keeping still (like when they're watching a movie).

Features

  • Efficient usage of CPU and network.
  • Messages sent through an MQTT broker, allowing use by multiple different home automation systems.
  • Easy integration with Home Assistant.
  • MJPEG stream available so you can feel like the Predator.

Hardware

r-u-still-there has been tested and used on a variety of Raspberry Pis from the low-cost 0 and low-speed 1B+ up through an 8GB 4B. I also use it on BeagleBone Greens, and it should run on any Linux device that you can connect an I²C peripheral to.

Performance-wise, I recommend using something with at least an ARMv7 CPU if you can. The ARMv6 CPU on the Raspberry Pi 0 and 1 works, but it can struggle to render the image stream at higher frame rates and larger sizes. The SIMD instructions and faster clock speeds on newer processors make a noticeable difference.

Cameras

Currently (as of v0.2.0) three models of thermal camera are supported:

Camera             Resolution  Frame Rate                          Field of View
Panasonic GridEYE  8×8         1 or 10 FPS                         60°×60°
Melexis MLX90640   32×24       0.5, 1, 2, 4, 8, 16, or 32 FPS      55°×35° or 110°×75°
Melexis MLX90641   16×12       0.5, 1, 2, 4, 8, 16, 32, or 64 FPS  55°×35° or 110°×75°

A more powerful CPU is recommended for the Melexis cameras, especially if you intend to run them at one of the higher refresh rates. Also note that the higher refresh rates for those cameras require a 400kHz I²C bus speed (and possibly other configuration changes). See the documentation for the camera driver for more details.

Installation

You can install just the program from cargo (crate name 'r-u-still-there'). The preferred approach, though, is to use the .deb packages, either downloaded manually from the releases on GitHub or from my package repo. In any case, I recommend using the version that most closely matches your hardware, as each jump in ARM instruction sets brings a noticeable performance improvement (even on the exact same hardware).
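
For example, the cargo route is just:

cargo install r-u-still-there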

Debian-based Distributions (including Ubuntu, Raspberry Pi OS)

See the RaspberryPi.md file for a walkthrough of installing it on a Raspberry Pi. Installation on other Debian-based systems will be mostly the same, with the biggest difference in how the I²C bus is enabled on different devices.
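
As a sketch, installing a manually downloaded package looks like this (the filename pattern is illustrative; substitute the actual version and architecture you downloaded):

sudo apt install ./r-u-still-there_*.deb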

FAQ

Why is the CPU usage really high?

Drawing the text of the temperatures is fairly CPU intensive at the moment. Disabling that (by commenting out the render.units value in the config file) will bring CPU usage down. Another option is to limit the frame rate of the video stream with the frame_rate_limit setting. Finally, nudging the render.grid_size setting lower can help a little bit.
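
As a sketch, those knobs look something like this in config.toml (the values and the exact location of frame_rate_limit are assumptions; check the example config for the authoritative layout):

# frame_rate_limit's exact location is an assumption; it may live in a stream section
frame_rate_limit = 1   # cap the video stream at 1 FPS

[render]
# units = "celsius"    # leave this commented out to skip the CPU-heavy temperature text
grid_size = 20         # smaller values are cheaper to render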

Rendering the video stream is the most "expensive" part of r-u-still-there at the moment as it's all being done on the CPU. If there is no client connected to the MJPEG stream though, no rendering is done and CPU usage should drop back down.

How do I configure it?

For the Debian packages, the configuration file is located at /etc/r-u-still-there/config.toml. That is also the default location if no config file is given as a command line argument.
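
Assuming the config path is passed as a plain command line argument (as the sentence above implies), overriding the default looks like:

r-u-still-there /path/to/config.toml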

How do you connect the camera to the computer?

You need to connect the camera to your device's I²C bus. The exact pins vary between devices, but in general the camera's power, ground, SDA, and SCL lines connect to the matching pins on your board's I²C header.
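
Once wired up, i2cdetect from the i2c-tools package is a handy way to confirm the camera is visible on the bus (bus 1 here, which is the usual bus on a Raspberry Pi):

sudo i2cdetect -y 1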

How do I get more detailed logs?

Logging can be configured using the RUST_LOG environment variable. Setting RUST_LOG=debug will give pretty verbose logs, but if you want even more, trace is also available.
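
For example, when running the binary by hand:

# Verbose logs
RUST_LOG=debug r-u-still-there
# Even more detail
RUST_LOG=trace r-u-still-there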

What MQTT brokers can I use?

I use mosquitto, but any MQTT 3 compatible broker that supports retained messages should work.

How far away can the sensor detect a person?

This depends on the resolution and field of view of the camera you're using. A higher resolution and a narrower field of view let the camera detect a person from farther away. In my experience, a GridEYE can usually detect me (a moderately tall, average-build man) from about 4m (13 ft) away.

There's a lot of noise (rapidly changing, but not by much) in the temperatures. How can I get rid of that?

On the GridEYE, setting the camera to 1 FPS will internally de-noise the image. I'm also planning on adding other methods in the future.

How can I view the camera image?

There's an MJPEG stream available (if enabled) over HTTP on port 9000 at /mjpeg (so http://<IP address>:9000/mjpeg). If you want to have it available in Home Assistant, you'll need to configure it manually.
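
One way to do that manual configuration is with Home Assistant's MJPEG camera integration; a minimal sketch for configuration.yaml (the name and IP address are placeholders):

camera:
  - platform: mjpeg
    name: r-u-still-there
    mjpeg_url: http://<IP address>:9000/mjpeg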

This sounds a lot like what room-assistant does.

It does! I used room-assistant for a while, and think it's a really cool piece of software. If you want presence detection using Bluetooth, it's still what first comes to mind. Over time I encountered a few pain points that I feel r-u-still-there better addresses:

  • room-assistant can be pretty taxing on the CPU, as it renders the image for every frame, regardless of whether anyone is watching it. r-u-still-there only renders the image if there's an active client. This results in CPU usage of around 2% on a Raspberry Pi Model 1B+, while room-assistant would normally be around 50-60% on older versions, and maxed out on newer versions (v2.12.0 changed to using Javascript libraries for image rendering). Even when streaming video, though, r-u-still-there is generally more performant, with a 1 FPS stream taking roughly 15% CPU usage, and 5 FPS taking around 60% CPU.

  • r-u-still-there has a few more configuration knobs, such as the size of the generated thermal image, the color scheme used in that image, camera frame rate, and the units used for generated data.

  • Some of my devices have poor WiFi reception, so they slow down the other devices on the network when they need to communicate. room-assistant generates a lot of network traffic, with a full image being sent over MQTT every second in addition to a fair bit of multicast traffic (which can be turned off, but is enabled by default). r-u-still-there does not send images over MQTT (there may be an option to enable this in the future, but not currently).

  • Some cameras offer extra capabilities that room-assistant doesn't expose. For example, the GridEYE can be run at 10 FPS at the cost of increased noise in the image. Most thermal cameras also have an ambient temperature sensor, which r-u-still-there exposes as well.

All that being said, I'm still very thankful to room-assistant for inspiring me to create r-u-still-there.

What are the warning messages about a measurement sink lagging?

This usually happens if the CPU can't keep up with the frames coming off of the camera. Try some of the suggestions above for reducing CPU usage.

How can I stop r-u-still-there from crashing after a few minutes when using a Melexis camera on a Raspberry Pi?

The Raspberry Pi's I2C clocks are derived from the core clock, and the core clock will (by default) vary with the overall system's CPU usage. To stop this, you need to set core_freq=250. It is also possible to set force_turbo=1, but be aware that if you have any of the over_voltage_* values set, the warranty bit will be permanently set. See the documentation on overclocking options for more details.

If you have enable_uart set, core_freq=250 is implicitly set, so you don't need to make any changes unless you have also disabled Bluetooth.

If you hook a logic analyzer up to your I2C bus, this issue will manifest as the first few frames being transferred at full speed (default is 100kHz, but 400kHz is the recommended setting for r-u-still-there), then dropping to either 62.5% (older Pis) or 40% (Pi 4) of the normal speed.
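
Putting that together, the relevant /boot/config.txt lines on Raspberry Pi OS look something like this (dtparam=i2c_arm_baudrate is the usual way to get the 400kHz bus speed mentioned above, but treat this as a sketch for your particular setup):

# Pin the core clock so the I2C clock stays stable
core_freq=250
# Enable the ARM I2C bus and run it at 400kHz
dtparam=i2c_arm=on
dtparam=i2c_arm_baudrate=400000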

Development

This repository should just build with cargo once checked out from git:

git clone https://github.com/paxswill/r-u-still-there.git
cd r-u-still-there
cargo build

Development builds turn the optimizations up, as it's unusably slow without them.

Cross-compiling

Building on the target device itself can be very slow, and the device may not even have enough memory. Thankfully cross compilation is pretty easy with Rust.

Whichever way you end up building the package, if you're compiling for a 32-bit ARM architecture you'll need to pass some extra flags through to the C compiler (replacing cargo with cross as needed):

# ARMv6, for Raspberry Pi 0 and 1
TARGET_CFLAGS="-march=armv6+fp" cargo build --release --target arm-unknown-linux-musleabihf
# ARMv7, for earlier versions of the Raspberry Pi 2 and BeagleBones
TARGET_CFLAGS="-march=armv7-a+simd" cargo build --release --target armv7-unknown-linux-musleabihf
# 64-bit ARMv8, for Raspberry Pi Model 4
cargo build --release --target aarch64-unknown-linux-musl

glibc

The easiest way to cross-build for glibc targets I've found is with cross. It just works, and is also how the provided packages are built (along with cargo-deb).
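
As an example, a glibc build for an ARMv7 target looks something like this (the exact target triple depends on your device):

cross build --release --target armv7-unknown-linux-gnueabihf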

musl static builds

I use the musl targets for most of my development as they're easier to get working when cross-building from a FreeBSD-based system. The musl-based targets are also a bit slower in my experience, so by using them for development I get a nice little "performance boost" for free when using glibc for the packages. I've found musl-cross-make the easiest way to get a native cross-toolchain set up. Once the toolchains are installed and available in $PATH, you'll need to create .cargo/config.toml with contents similar to this:

[target.arm-unknown-linux-musleabihf]
linker = "arm-linux-musleabihf-gcc"

[target.armv7-unknown-linux-musleabihf]
linker = "armv7-linux-musleabihf-gcc"

[target.aarch64-unknown-linux-musl]
linker = "aarch64-linux-musl-gcc"

Packaging

Building the Debian package is done using cargo-deb. build.sh will build each architecture using cross, then package it up, leaving the .deb file in the project directory.
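
If you want to build a single package by hand, cargo-deb can also be invoked directly; a sketch, assuming the ARMv7 glibc target:

cargo deb --target armv7-unknown-linux-gnueabihf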
