reversebias, nsarrazin
Fix prompt caching on llama.cpp endpoints (#920)
d96c921