I am a paying customer of Kagi, decamping from a lifetime of Google as my default search engine. The search results are almost always better than Google’s, especially for the non-commercial web.
Like every other toaster on the planet, Kagi is using AI to deliver what they call “quick answers” (you can see them by clicking the “quick answer” link in search results, or simply by adding a question mark to the end of your search terms).
The current “quick answer” returned for a search for “Peter Rukavina” shows the downside of the current state of the art in munging together disparate web results and trying to stitch together a complete story: Kagi gets some of the details right (I am a writer, printer, and developer in Canada, and my blog is indeed here at ruk.ca), but it also reports that I died in 2012 after a setback in hospital (not true). About half of the result is “true” (if I’m the Peter Rukavina in question) and half is not.
I’m posting this here, in part, as a place marker to return to as technology advances and this sort of issue gets solved.
Comments
My condolences
Getting bullshit fed to you with an authoritative tone? Consulting companies are quaking in their boots
To be fair, the disclaimer says to 'exercise wisdom' with the AI summary, and there is another Peter Rukavina who did pass away. Quick Answer summarizes the search results, and from that standpoint it did its job; disambiguation is hard to do in one quick forward pass.