

I don’t trust LLMs for anything based on facts or complex reasoning. I’m a lawyer and any time I try asking an LLM a legal question, I get an answer ranging from “technically wrong/incomplete, but I can see how you got there” to “absolute fabrication.”
I actually think the best current use for LLMs is itinerary planning and organizing thoughts. They’re pretty good at creating coherent, logical schedules from a set of simple criteria, and at making communications more succinct (although still not perfect).
And how many are bots?
I liked that show before I realized it was another blue-balls Lost situation where they keep edging you along and never give you any real closure. But I still watched damn near all of it because James Spader is the fucking man.
Like your wife, I thought it was an absolutely brilliant film 15 years ago when I was an enlightened college kid who had just discovered LSD. I watched it again last year and could barely get through it.
The philosophical ideas presented have no consistency or connection to whatever passes for a plot, the pacing/narrative is an afterthought, and (sorry to any Linklater fans) the rotoscoping animation started to give me a migraine.
“wait, aren’t these the Israel AI people?”
To be fair, that describes like 80% of tech companies nowadays