Don't worry about LLMs