6 Comments

I have an answer for you on the Perplexity name ... perplexity is an evaluation metric used in natural language processing to measure the performance of language models. Basically, it captures how well the model predicts the next word given the context provided by the previous words. (It does this through some probability maths: specifically, it is a measure of uncertainty in the value of a sample drawn from a discrete probability distribution, which is where the connection to the everyday meaning of "perplexity" comes from.)
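To make that a bit more concrete, here's a rough Python sketch of the calculation (my own illustration, not something from the post): perplexity is just the exponential of the average negative log-probability the model assigned to each word it actually had to predict, so confident correct predictions give a low score and uncertain ones give a high score.

```python
import math

def perplexity(token_probs):
    """Perplexity = exp of the average negative log-probability
    the model assigned to each actual next word."""
    if not token_probs:
        raise ValueError("need at least one probability")
    avg_neg_log_prob = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(avg_neg_log_prob)

# A model that is confident and right gets low perplexity;
# one that spreads its bets gets high perplexity.
print(perplexity([0.9, 0.8, 0.95]))   # ~1.1  (good)
print(perplexity([0.2, 0.1, 0.05]))   # ~10.0 (poor)
```

Roughly speaking, a perplexity of 10 means the model was, on average, as uncertain as if it were choosing between 10 equally likely words at each step.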


Thanks a bunch, Mark, for reminding me about the salt-in-tea thing. You might have stuck a trigger warning at the top of your post! Thanks for the search engines. I've tried Perplexity several times, and I like the fact that it gives citations and links to PDFs and websites. Looking forward to trying out the others now. Cheers
