Course: Large Language Models and Generative AI
More from Dmitry Kan on Medium
Special thanks to Doug Turnbull, Daniel Svonava, Atita Arora, Aarne Talman, Saurabh Rai, Andre Zayarni, Leo Boytsov, Pat Lasserre and Bob van Luijt for reading and commenting on the drafts of this post.

Jo Kristian Bergum recently wrote a massively influential X post: “The rise and fall of the vector database infrastructure...
Last week, I had the pleasure of teaching the Week-6 topic: “Use cases and applications of LLMs”. Week-5 on RAG can be found here. We looked at multimodal LLMs as a very interesting and in many ways still emerging trend in the LLM world, covering text, image, video and audio modalities (you can ask: “What do you hear in this video?”, for...
This Fall we are teaching a course on LLMs and Generative AI at the University of Helsinki, together with Aarne Talman (Accenture) and Jussi Karlgren (Silo.AI, now AMD).

[Screenshot of the PDF RAG streamlit app]

Syllabus:
Week 1: Introduction to Generative AI and Large Language Models (LLM)
Introduction to Large Language Models (LLMs) and their...
“Large Language Models are complex systems. So the output, the final weights of the neural network, is just one little part of the entire picture.” This is a quote from Alessandro, from the episode we recorded at Berlin Buzzwords’24. I also tweeted (X’d?) about how alarming it is to see the downward trend in open-sourcing various components of these...
Another re-blog: this time about Lucene’s TokenFilters (originally published on 9 June 2014). For those into neural search from scratch, I also wrote this piece, which deals with embeddings at the Lucene level. At the recent Berlin Buzzwords conference talk on Apache Lucene 4, Robert Muir mentioned Lucene’s internal testing library. This library is...
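To make the TokenFilter idea concrete, here is a minimal sketch of a custom Lucene filter. The wrapping pattern (extend TokenFilter, override incrementToken, read the token via CharTermAttribute) is standard Lucene API; the ReverseFilter itself is a hypothetical example, not from the original post:

```java
import java.io.IOException;

import org.apache.lucene.analysis.TokenFilter;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

/**
 * Hypothetical example: a TokenFilter that reverses each token.
 * Every Lucene TokenFilter follows this shape — wrap the upstream
 * TokenStream and rewrite tokens one at a time in incrementToken().
 */
public final class ReverseFilter extends TokenFilter {

  // Attribute giving read/write access to the current token's characters.
  private final CharTermAttribute termAtt = addAttribute(CharTermAttribute.class);

  public ReverseFilter(TokenStream input) {
    super(input);
  }

  @Override
  public boolean incrementToken() throws IOException {
    if (!input.incrementToken()) {
      return false; // upstream tokenizer/filter is exhausted
    }
    // Reverse the current token's characters in place.
    char[] buffer = termAtt.buffer();
    int len = termAtt.length();
    for (int i = 0; i < len / 2; i++) {
      char tmp = buffer[i];
      buffer[i] = buffer[len - 1 - i];
      buffer[len - 1 - i] = tmp;
    }
    return true;
  }
}
```

A filter like this would be chained after a Tokenizer inside an Analyzer, and Lucene’s test framework (the internal testing library mentioned above) can stress such filters with randomized input.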