LLMs and Lisp
- Lisp Thinker

- Nov 23, 2025
- 9 min read
I recently listened to a talk by Anurag Mendhekar from the 2025 European Lisp Symposium, available on YouTube, entitled Back to the Future: Lisp in the New Age of AI. In the talk, he outlined the reasons why Lisp was the lingua franca of AI development up until the late 1980s, what happened to AI in the 1990s and 2000s, and how Lisp was replaced by Python during the machine learning era. Finally, he looked toward the future and how Lisp might still play a role in computer science going forward.
I encourage you to watch the full talk, and I won't spoil the ending. However, I've summarized some of the key points below:
Why Lisp was great for AI
Lisp was perfect for building complex knowledge graphs using very few constructs.
Lisp provides a very simple computational framework to access this knowledge with:
car, cdr, lambda, and cond
You can experiment with your knowledge graph interactively and add complexity incrementally with:
dynamic typing, REPL, garbage collection
Lisp programs are knowledge graphs themselves and therefore its computational framework can work on code as well using:
macros, eval, compile
The first part of the talk is a great tribute to Lisp and a succinct explanation of why the knowledge model used by Lisp, i.e. the "Lisp Epistemology" was a perfect model for AI research at the time.
From there, the talk becomes a little bit depressing!
AI Winter of the 1980s
knowledge graphs started becoming too complex and difficult to maintain.
hardware to run heuristics was too expensive.
many of these AI companies using Lisp crashed and ceased to exist taking the specialized Lisp hardware with them.
Machine Learning Advances in the 1990s and Early 2000s
with the advent of machine learning, knowledge began to be represented with probability distributions instead of knowledge graphs and the hardware moved to GPUs.
Python became the language of choice to bind together the lower level libraries used to perform machine learning.
Conclusion: there's no more need for Lisp's epistemology!
In the talk, he also mentions a (disheartening?) quote by Lisp expert Peter Norvig:
"Python is an acceptable Lisp."
Finally, Anurag finishes the talk with a section called: Lisp 2045, Lisp in the Next Age of AI. He is brutally honest about the future of software engineering as a career in the age of AI coding.
Spoiler alert: It is not looking good for Lisp programmers or any skilled coder for that matter!
I won't ruin the conclusion of the talk which ends with his proposal of where he thinks Lisp might have a place in computer science in the future. However, at least for me, the proposal felt a little less than satisfying for a language as elegant and enduring as Lisp.
That being said, it was a very interesting talk and I'm thankful that it has been made available on YouTube.
Another Take...
Rather than lament the end of Lisp epistemology or the rise of Python as a faux-Lisp, I think a more positive take is to look at ways that we can incorporate some of these new AI tools into our Lisp programming and couple them with the features he mentioned in the first part of the talk, i.e. the features that made Lisp great in the first place.
If anything, Large Language Models have ushered in a new era of knowledge and data abundance. They've ingested the entirety of the internet, and we can use them to extract some of that data to improve our Lisp programs. This is what this article is about.
Using LLMs to help with Lisp Projects
One simple way that LLMs can work for Lisp is by generating data. As mentioned in the talk, Lisp is perfect for stand-alone knowledge models; however, during previous eras of Lisp popularity, it may have been difficult to come up with data for these models. In this case, I am not talking about training data as in deep learning; I am talking about the concrete knowledge facts used by the knowledge model itself.
Personally, I am also a fan of low-requirement computing so having stand-alone knowledge systems is still something that interests me.
Before we see how to incorporate data into our programs, let's review hash tables in Common Lisp.
Hash Tables
To create a hash table in Lisp, use the make-hash-table function:
[1]> (defvar favorite-colors (make-hash-table))
FAVORITE-COLORS
To retrieve a value from a hash table using a key, use the gethash function:
[2]> (gethash 'bob favorite-colors)
NIL ;
NIL
We received NIL because we haven't stored anything in the hash table yet, let alone for the key 'bob. (Note that gethash returns two values: the stored value, and a second boolean telling us whether the key was actually present.)
The gethash function returns a place (generalized variable) in the hash table, therefore we can store a value in the hash table by using setf in conjunction with gethash.
[3]> (setf (gethash 'alice favorite-colors) 'blue)
BLUE
Now, when we use gethash, we will receive a value for that key.
[4]> (gethash 'alice favorite-colors)
BLUE ;
T
Assuming we've populated the hash table with more key/value pairs, we can also iterate over the keys with the built-in loop macro:
(loop for k being the hash-keys of favorite-colors do
  (format t "~% ~A's favorite color is ~A" k
          (gethash k favorite-colors)))
JACK's favorite color is CRIMSON
IRIS's favorite color is VIOLET
HENRY's favorite color is TEAL
GRACE's favorite color is PINK
FRANK's favorite color is ORANGE
EVE's favorite color is YELLOW
DAVID's favorite color is PURPLE
CAROL's favorite color is RED
BOB's favorite color is GREEN
ALICE's favorite color is BLUE
One detail: if you decide to use double-quoted strings for your keys (instead of symbols), that is perfectly fine; however, you should then specify 'equal as your key comparison test rather than the default eql, since strings that look the same are equal but not eql.
[18]> (eql "mushroom" "mushroom")
NIL
[19]> (equal "mushroom" "mushroom")
T
The :test argument in make-hash-table specifies the equality function used to compare keys:
(defvar emojis (make-hash-table :test 'equal))
Fun with Emojis
Now that we've reviewed hash tables in Lisp, let's see how to make use of LLMs to generate data for us.
If I define a hash table to store emoji (decimal) unicodes like this:
(defvar emojis (make-hash-table :test 'equal))
Then I can populate the table using setf:
(setf (gethash "smile" emojis) 128516)
(setf (gethash "grin" emojis) 128513)
We can also define a function to convert a stored unicode code point into a printable character:
(defun emoji (emoji-name)
(code-char (gethash emoji-name emojis)))
We can now print an emoji by referring to its description rather than remembering its code:
(format t "~% Hello world! ~a" (emoji "smile"))
Hello world! 😄
We now have enough structure to prompt the LLM (in this case Claude) to finish populating the table for us:
Hi. If I have a hash table for emojis defined like so:
(defvar emojis (make-hash-table :test 'equal))
(setf (gethash "smile" emojis) 128516)
Can you please finish populating the table with 100 of the most popular emojis?
I'm assuming this is a fairly easy task for an LLM, and sure enough, it grants our wish. Here is one section of emojis returned by Claude:
;; Objects
(setf (gethash "watch" emojis) 8986)
(setf (gethash "mobile phone" emojis) 128241)
(setf (gethash "laptop" emojis) 128187)
(setf (gethash "keyboard" emojis) 9000)
(setf (gethash "printer" emojis) 128424)
(setf (gethash "computer mouse" emojis) 128433)
(setf (gethash "joystick" emojis) 128377)
(setf (gethash "money bag" emojis) 128176)
(setf (gethash "dollar" emojis) 128181)
(setf (gethash "credit card" emojis) 128179)
(setf (gethash "gem" emojis) 128142)
(setf (gethash "tools" emojis) 128736)
(setf (gethash "hammer" emojis) 128296)
(setf (gethash "wrench" emojis) 128295)
(setf (gethash "gear" emojis) 9881)
(setf (gethash "lock" emojis) 128274)
(setf (gethash "unlock" emojis) 128275)
(setf (gethash "key" emojis) 128273)
(setf (gethash "bulb" emojis) 128161)
(setf (gethash "flashlight" emojis) 128294)
(setf (gethash "battery" emojis) 128267)
(setf (gethash "pill" emojis) 128138)
(setf (gethash "syringe" emojis) 128137)
(setf (gethash "test tube" emojis) 129514)
(setf (gethash "microscope" emojis) 128300)
(setf (gethash "telescope" emojis) 128301)
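One caveat before we use these: code-char will signal an error if gethash returns NIL for a name that isn't in the table. A small defensive variant (my own sketch, not part of the generated table) avoids that:

(defun emoji-safe (emoji-name)
  "Like EMOJI, but falls back to #\? for unknown names."
  (let ((code (gethash emoji-name emojis)))
    (if code
        (code-char code)
        #\?)))  ; fallback when the name isn't in the table

This way a typo in an emoji name shows up as a harmless ? in the output instead of crashing the program.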
We now have enough emojis at our disposal to spice up our Lisp programs! Here's a quick and dirty example:
(defvar emojis (make-hash-table :test 'equal))
;; Objects
(setf (gethash "watch" emojis) 8986)
(setf (gethash "mobile phone" emojis) 128241)
(setf (gethash "laptop" emojis) 128187)
; ... emoji codes populated by LLM
(defun random-choice (l)
  "Return a random element of the list L."
  (nth (random (length l)) l))
(defun emoji (emoji-name)
  (code-char (gethash emoji-name emojis)))
(setq *random-state* (make-random-state t))
(let ((delay 5)
      (n 20)
      (emoji-keys (loop for k being the hash-keys of emojis
                        collect k)))
  (loop for k = (random-choice emoji-keys)
        repeat n do
        (format t "~% You found a ~A [~a]" (emoji k) k)
        (sleep delay)))
You found a ⚙ [gear]
You found a 🧪 [test tube]
You found a 🔦 [flashlight]
You found a 💰 [money bag]
You found a 💻 [laptop]
You found a 🔋 [battery]
You found a 🔋 [battery]
You found a 💻 [laptop]
You found a ⌨ [keyboard]
You found a 🖨 [printer]
You found a 💻 [laptop]
You found a 🕹 [joystick]
You found a 🔬 [microscope]
You found a 💎 [gem]
You found a 💉 [syringe]
You found a 💵 [dollar]
You found a 🔭 [telescope]
You found a 🔋 [battery]
You found a 🧪 [test tube]
You found a 💰 [money bag]
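Since our emoji table lives only in memory, it's worth mentioning how easily Lisp can turn it into one of those stand-alone knowledge systems: the two helpers below (save-table and load-table are hypothetical names of my own, not code from the article) print the key/value pairs to a file and read them back:

(defun save-table (table filename)
  "Write TABLE's key/value pairs to FILENAME as a readable list."
  (with-open-file (out filename :direction :output :if-exists :supersede)
    (print (loop for k being the hash-keys of table
                 using (hash-value v)
                 collect (list k v))
           out)))

(defun load-table (filename)
  "Rebuild a hash table from a file written by SAVE-TABLE."
  (with-open-file (in filename)
    (let ((table (make-hash-table :test 'equal)))
      (dolist (pair (read in) table)
        (setf (gethash (first pair) table) (second pair))))))

Because print writes the data readably, the saved file is itself just Lisp data, very much in keeping with the "Lisp Epistemology" from the talk.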
At the beginning of this article, we were discussing knowledge models, not emojis, so let's return to something a bit more factual!
I prefer to write the functions first, with a few samples of data as an example, before prompting the LLM, and then include that code with the prompt. This keeps the LLM from writing the code for us (the interesting part!) and lets the chatbot do the boring part, i.e. exporting the data from its model.
Hi. If I define a hash table in Common Lisp where the key is the country name and the value is the population rounded to the nearest million and expressed in engineering notation like this:
(defvar countries (make-hash-table))
(setf (gethash "United States" countries) 343e6)
Can you finish populating the hash table with the top 100 most populous countries?
Since I wanted this data in one of my packages, I moved it to a function that creates the hash table at runtime:
;; Hash table of the top 100 most populous countries
;; Key: country name (string),
;; Value: population in engineering notation (rounded to nearest million)
(defun populate-countries ()
"Create the hash table, populate the countries at run time, and return the table. That way we don't populate it while loading the package."
(let ((countries (make-hash-table :test 'equal)))
;; Top 100 countries by population (2024/2025 data)
(setf (gethash "India" countries) 1.428e9) ; 1,428 mil
(setf (gethash "China" countries) 1.413e9) ; 1,413 mil
(setf (gethash "United States" countries) 343e6) ; 343 million
(setf (gethash "Indonesia" countries) 278e6) ; 278 million
(setf (gethash "Pakistan" countries) 241e6) ; 241 million
(setf (gethash "Nigeria" countries) 230e6) ; 230 million
(setf (gethash "Brazil" countries) 217e6) ; 217 million
(setf (gethash "Bangladesh" countries) 173e6) ; 173 million
(setf (gethash "Russia" countries) 144e6) ; 144 million
(setf (gethash "Mexico" countries) 129e6) ; 129 million
(setf (gethash "Ethiopia" countries) 132e6) ; 132 million
(setf (gethash "Japan" countries) 123e6) ; 123 million
(setf (gethash "Philippines" countries) 118e6) ; 118 million
(setf (gethash "Egypt" countries) 113e6) ; 113 million
(setf (gethash "Vietnam" countries) 99e6) ; 99 million
...
While working on this example, I was reminded that LLMs are still not perfect and hallucinations are still a possibility:
Great! However at least one of those is a city and not a country. Can you find it?
You're absolutely right! Hong Kong is not a country - it's a Special Administrative Region of China, not an independent nation.
Take that, non-Lisp Epistemology!
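With the data in place, querying it is plain Lisp. As a quick sketch (countries-over is a hypothetical helper of my own, not something from the prompt), here is one way to collect every country above a given population:

(defun countries-over (threshold countries)
  "Return the names of all countries whose population exceeds THRESHOLD."
  (loop for name being the hash-keys of countries
        using (hash-value population)
        when (> population threshold)
        collect name))

For example, (countries-over 2e8 (populate-countries)) should collect only the handful of countries above 200 million.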
Finally, a classic Lisp example of returning objects having a specific property, where the facts have now (conveniently) been generated by an LLM:
(defparameter *facts*
'((dog mammal)
(dog has-fur)
(cat mammal)
(cat has-fur)
(lizard reptile)
(lizard has-scales)
(snake reptile)
(snake has-scales)
(mouse mammal)
(mouse has-fur)
(elephant mammal)
(elephant has-thick-skin)
(elephant herbivore)
(lion mammal)
(lion carnivore)
(lion has-mane)
(tiger mammal)
(tiger carnivore)
(tiger has-stripes)
...
(bat mammal)
(bat can-fly)
(bat nocturnal)))
;; A function to find all animals with a given property
(defun search-animals (property)
  (mapcar #'car
          (remove-if-not (lambda (fact)
                           (equal property (cadr fact)))
                         *facts*)))
(search-animals 'reptile)
(LIZARD SNAKE TURTLE CROCODILE IGUANA)
Conclusion
It might be easy to think of Lisp as having been made obsolete by LLMs, but actually, the same things that made it great in the beginning still make it great now, especially when it comes to working with facts and data. And since data is now abundant within AI models, it is a very fun time to be a Lisp programmer.
In future articles, I hope to continue showing interesting ways to use Lisp in the age of modern AI. The obvious use of an LLM is to simply generate code from prompts; however, let's not forget that we can also get value out of LLMs by exfiltrating data from these models and then using the power of Lisp to create mini expert systems that can be installed anywhere Lisp runs.
Whichever AI techniques you end up trying out with your Lisp development, good luck with your journey and happy hacking!



