Hacker News

I find a lot of these frameworks to be overkill.

A rating event comes into our web server (Python) and is sent to SQS, where it is pulled by our custom artificial neural network (deep learning) training script, written in Go. A model is trained and serialized, then the serialized model is uploaded to Postgres, where it is fetched by the web service (also written in Go) to serve predictions.

We update our models within 15 seconds of a user rating. With millions of ratings per month, we re-train millions of models and serve billions of predictions.



Interesting. Do you mind going into more detail?

I'm thinking of implementing a similar system – though not using Go.


What sort of details are you looking for?


Mostly persistence and scaling.

I assume you're using a jsonb column, and going horizontal?


Honestly, we haven't needed to scale horizontally. The Postgres column is jsonb, though that's probably not necessary since we never query inside the serialized model. If our reads ever start to slow down, the plan is to just add a read follower.


Whoops. I was referring to the prediction tier.


What frameworks are you thinking of, specifically?



