I wrote a program that sends cats to my phone when I’m sad at the computer. I was inspired by a tweet I saw last week. I’ve lost the link but, to paraphrase, it went something like this:
I’m okay submitting myself to The Algorithm as long as it knows when I’m sad and forwards cats directly to my face
I figured that you could probably solve this problem locally without leaking any personal data.
Our computers are fast enough that we can run machine learning models in a browser in the background, maybe without even noticing. I tried out a few different JavaScript face recognition libraries that came prepackaged with trained models, and evaluated them by the following criteria:
- Is there example code I can easily hack on?
- Does it accurately report when I’m frowning, furrowing, or furious?
I went with vladmandic/human — another strong contender was justadudewhohacks/face-api.js. Both these libraries provide an API to get the weights of some common emotions.
['sad', 'angry', 'disgust', 'fear', 'neutral', 'happy', 'surprise']
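For example, here’s roughly how those weights can be read out of a detection result. This is a sketch assuming human’s result shape (a `face` array whose entries carry an `emotion` list of `{ score, emotion }` pairs), so check the library docs for the exact fields:

```javascript
// Sketch: collapse a detection result into a map of emotion -> weight,
// e.g. { sad: 0.7, neutral: 0.2, happy: 0.1 }
function emotionWeights(result) {
  if (!result.face || result.face.length === 0) {
    return null // no face in the frame
  }
  const weights = {}
  for (const { emotion, score } of result.face[0].emotion) {
    weights[emotion] = score
  }
  return weights
}
```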
I split the emotions into good vs. bad to get a clearer read of my mood. The overall score swings between -1 (very bad) and 1 (very good). I don’t want to be spammed with cats every time I scratch my nose and trigger a frame of video that’s interpreted as negative, so I added a three-second trailing average to look for prolonged periods of negative emotion. There’s also a five-minute timeout after sending a cat before it starts checking again.
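Here’s a sketch of that scoring logic, taking the weight map from the sketch above. The good/bad split, the three-second window, and the five-minute cooldown come from the description here; the exact grouping of emotions, the window bookkeeping, and the `sendCat()` helper are illustrative rather than the repo’s actual code:

```javascript
const GOOD = ['happy', 'surprise', 'neutral'] // counting 'neutral' as good is a judgment call
const BAD = ['sad', 'angry', 'disgust', 'fear']

const WINDOW_MS = 3 * 1000        // three-second trailing average
const COOLDOWN_MS = 5 * 60 * 1000 // five minutes between cats

let samples = [] // recent { time, score } readings
let lastCatAt = 0

function handleEmotions(weights) {
  // Good emotions push the score up, bad ones pull it down; the weights
  // sum to roughly 1, so the score stays between -1 and 1.
  let score = 0
  for (const e of GOOD) score += weights[e] ?? 0
  for (const e of BAD) score -= weights[e] ?? 0

  const now = Date.now()
  samples.push({ time: now, score })
  samples = samples.filter((s) => now - s.time <= WINDOW_MS)
  const average = samples.reduce((sum, s) => sum + s.score, 0) / samples.length

  // Only react to a prolonged dip, and at most once per cooldown period.
  if (average < 0 && now - lastCatAt > COOLDOWN_MS) {
    lastCatAt = now
    sendCat() // hypothetical helper that asks the local server to send a cat
  }
}
```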
You can see some of the emotion scores below in the debug console I added.
I wrote all the frontend code in an index.html file for the prototype. The main loop runs at 30-40FPS on a decade-old desktop (it reads emotion accurately at far lower FPS and should probably be capped to save resources; there’s a sketch of a capped loop after the snippet below).
```javascript
function main() {
  const config = { backend: 'webgl' }
  const human = new Human.Human(config)

  async function detectVideo() {
    // `inputVideo` is a video of a webcam stream
    const result = await human.detect(inputVideo)
    // `result` contains an array of faces along with emotion weights
    handleResult(result)
    requestAnimationFrame(detectVideo)
  }

  detectVideo()
}
```
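Since the emotion signal holds up at much lower frame rates, the loop could be capped by swapping `requestAnimationFrame` for a plain timeout. This is a sketch of what that might look like inside `main()`, not what the prototype currently does:

```javascript
// Run detection roughly ten times a second instead of once per frame
async function detectVideo() {
  const result = await human.detect(inputVideo)
  handleResult(result)
  setTimeout(detectVideo, 100)
}
```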
The web server runs locally and serves this file and the model data. The full source code is on healeycodes/if-sad-send-cat.
Notifications
I used Pushover to send notifications to my iPhone. The API/docs and community libraries are delightful, and there’s a one-month free trial (no credit card required, etc.). I had heard of programmers using Pushover as part of different home automation projects and was keen to try it out.
Here’s how I send a message and an image from server.py:

```python
# Token and user key come from the Pushover dashboard
r = requests.post("https://api.pushover.net/1/messages.json", data={
    "token": APP_TOKEN,
    "user": USER_KEY,
    "message": "🐱",
}, files={
    # the image attachment goes up as multipart form data
    "attachment": ("cat.jpg", open("cat.jpg", "rb"), "image/jpeg"),
})
```