uv has a killer feature you should know about
More from Lukas Valatka
I’ll give it a few years until MLflow dominates the model package format space, with alternatives like SageMaker models fading away, and sharing pure weights becoming an arcane art. But until that dominance is absolute, I’ve been thinking that there’s another quite obvious way to package models: just store them as wheels. Packaging == persisting...
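To make the wheel idea concrete, here is a minimal Python sketch of what loading bundled weights could look like; the package name (my_model) and file name (weights.pt) are made up for illustration, and a real project would also need to declare the data file in its build configuration so it actually lands inside the built wheel.

# Sketch: reading model weights shipped as package data inside a wheel.
# "my_model" and "weights.pt" are hypothetical names used for illustration.
from importlib.resources import files

def load_weights() -> bytes:
    # files() locates data installed alongside the package's code.
    return files("my_model").joinpath("weights.pt").read_bytes()

if __name__ == "__main__":
    weights = load_weights()
    print(f"loaded {len(weights)} bytes of model weights")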
For the past few months, I’ve been exploring Go. Having done quite a bit of grueling work shaving milliseconds off Python web apps, I’ve found Go to be incredible. You can schedule dirt-cheap concurrent operations simply by adding go in front of a function call, and achieve true parallelism across cores.
I have always been intrigued by Bloom filters. They answer membership queries much like a hash set, yet consume far less memory, because they store only a pattern of bits rather than the elements themselves. The price is that they can yield false positives (though never false negatives), a peculiar tradeoff.
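To make that tradeoff concrete, here is a minimal, untuned Python sketch of a Bloom filter; the bit count and number of hash functions are arbitrary choices for illustration, not anything from the post.

import hashlib

class BloomFilter:
    # A toy Bloom filter: each added item sets num_hashes bits.
    def __init__(self, num_bits: int = 1024, num_hashes: int = 3):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = 0  # the whole bit array packed into one integer

    def _positions(self, item: str):
        # Derive num_hashes bit positions from salted SHA-256 digests.
        for salt in range(self.num_hashes):
            digest = hashlib.sha256(f"{salt}:{item}".encode()).digest()
            yield int.from_bytes(digest, "big") % self.num_bits

    def add(self, item: str) -> None:
        for pos in self._positions(item):
            self.bits |= 1 << pos

    def __contains__(self, item: str) -> bool:
        # Can return True for an item never added (false positive),
        # but never False for an item that was added.
        return all(self.bits & (1 << pos) for pos in self._positions(item))

bf = BloomFilter()
bf.add("alice")
print("alice" in bf)  # True
print("bob" in bf)    # almost certainly False

The memory cost is just num_bits, regardless of how large the stored items are, which is where the saving over a hash set comes from.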
Recently, a data platform vendor introduced us to their latest offering: a feature store module for machine learning projects. They highlighted the usual selling points, like reducing train-serve skew, enabling feature backfilling, and improving feature documentation. However, one of our senior data engineers remained unconvinced. The question...
The walrus operator, formally the assignment expression, has been around in Python for some time now (since 3.8). I’ve grown fond of it :).
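For anyone who has not tried it yet, a small illustrative sketch (the variable names are my own, not from the post):

data = [1, 2, 3, 4, 5]

# Without the walrus operator: assign first, then test.
n = len(data)
if n > 3:
    print(f"long list: {n} elements")

# With the walrus operator: assign and test in one expression.
if (n := len(data)) > 3:
    print(f"long list: {n} elements")

# It also reads nicely in loops that consume a stream chunk by chunk:
# while (chunk := stream.read(8192)):
#     process(chunk)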