The Rise, Fall, and Future of Vector Databases: How to Pick the One That Lasts