I am decidedly not a software engineer or a web developer. That said, I am endlessly curious and a little too tenacious for my own good. By trade I am an aerospace engineer; by passion I am an artist. As an artist I am deeply interested in the intersection of visuals and music. When I paint, the vibes of my paintings are often reflected in what I am listening to, and vice versa. I created this project as an attempt to combine music and images in a fun way. Truthfully, the algorithm probably isn't the most elegant in the world, but this project gave me a chance to practice thinking about developing applications for scale, building a UI, integrating APIs, thinking about costing (no, this is not free to run), and playing around with machine learning. If you are curious about what I build in my free time, follow what I'm up to :). If not, eh, that's ok, hope you have fun with this project nonetheless.
A FastAPI application provides an endpoint for detecting emotions in images. It downloads an image from a provided URL, processes the image to detect emotions using the `fer` library, and returns the detected emotions in a JSON response. The application is also set up to be deployable as an AWS Lambda function using Mangum. Originally this was done with the Google Cloud Vision API, but that was determined to be too expensive at scale.