Louis (louis@emacs.ch)'s status on Tuesday, 12-Mar-2024 13:30:52 JST
@defanor Thank you for the bug report! I did not properly escape the NOT operator, that's fixed now 🙂
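For a concrete picture of what such a fix can look like, here is a minimal, hypothetical sketch (translate-query is an invented name, and UIOP is assumed to be available; GNV's actual query code is not shown in this thread) that passes boolean operators through to the search backend while quoting ordinary terms:

    ;; Hypothetical sketch: quote plain terms and pass boolean operators
    ;; (NOT, AND, OR) through unchanged, so "cheese NOT toast" reaches the
    ;; backend as a real NOT query instead of three quoted words.
    (defun translate-query (query)
      (let ((tokens (remove "" (uiop:split-string query :separator " ")
                            :test #'string=)))
        (format nil "~{~A~^ ~}"
                (mapcar (lambda (tok)
                          (if (member tok '("NOT" "AND" "OR") :test #'string=)
                              tok
                              (format nil "\"~A\"" tok)))
                        tokens))))

    ;; (translate-query "cheese NOT toast")  =>  "\"cheese\" NOT \"toast\""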
defanor (defanor@emacs.ch)'s status on Tuesday, 12-Mar-2024 13:30:53 JST
@louis "cheese NOT toast" does not yield any results, though simply searching for "cheese" returns results with no "toast" in sight, and more of them than "cheese AND toast" does.
As for the interactivity, I enjoy proper lispy REPLs as well, but to be fair, with less interactive languages one would probably make the state needed for resumption persistent, and probably have fewer runtime exceptions: it would be odd to keep restarting large-scale crawling on each adjustment.
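A minimal sketch of that resumption idea (purely illustrative; save-frontier and load-frontier are invented names, not anything from GNV or this thread):

    ;; Persist the list of not-yet-visited URLs so a crawl can be resumed
    ;; after a crash or restart instead of starting from scratch.
    (defun save-frontier (urls path)
      (with-open-file (out path :direction :output :if-exists :supersede)
        (dolist (url urls)
          (write-line url out))))

    (defun load-frontier (path)
      (when (probe-file path)
        (with-open-file (in path)
          (loop for line = (read-line in nil)
                while line collect line))))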
How much space does the crawled data take, by the way? If it is not too big, maybe you could also organize something similar to Common Crawl for Gopher by publishing it as a single dump. Not sure how useful that would be, though it would have been useful for this project if somebody else had done it earlier.
Louis (louis@emacs.ch)'s status on Tuesday, 12-Mar-2024 13:30:55 JST
GNV - the Gopher Web search engine - just got an update.
- Now includes Gemini
- Shows a context snippet of the content where the search query was found (see the sketch below)

The crawler is still working through more than 60'000 found links; this will require a few more hours to finish.
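A rough sketch of how such a context snippet could be produced (a guess for illustration, not GNV's actual code; context-snippet and the 60-character radius are invented):

    ;; Return a window of text around the first case-insensitive match of
    ;; QUERY in CONTENT, or NIL when the query does not occur at all.
    (defun context-snippet (query content &key (radius 60))
      (let ((pos (search query content :test #'char-equal)))
        (when pos
          (subseq content
                  (max 0 (- pos radius))
                  (min (length content) (+ pos (length query) radius))))))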
I've removed the Gopher image and binary proxy for the following reasons:
- It's a potential attack vector against external Gopher servers
- GNV is not meant to replace Gopher's functionality, but to encourage its use through a proper client
- I want to focus on features that increase the discoverability of Gopher/Gemini content and show how it is interconnected (inspired by a discussion with @screwtape, aka "How to link Gopher phlogs with each other")
Those who follow me know that GNV is written in Common Lisp. A great learning exercise for me. It's incredibly well suited for this task because I can modify the code for numerous edge cases _while_ the crawler is running, without starting from scratch every time an exception is thrown.
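To illustrate that workflow (a hedged sketch, not GNV's actual crawler loop; crawl-all, fetch-fn and the restart names are invented): wrapping each fetch in RESTART-CASE means an unexpected error drops you into the debugger, where you can recompile the offending function and then choose RETRY-URL, or SKIP-URL to move on, instead of restarting the whole crawl.

    (defun crawl-all (urls fetch-fn)
      (dolist (url urls)
        (loop
          (restart-case
              (progn
                (funcall fetch-fn url)   ; may signal on unexpected input
                (return))                ; done with this URL, go to the next
            (retry-url ()
              :report "Retry this URL with the (possibly recompiled) fetcher.")
            (skip-url ()
              :report "Give up on this URL and continue crawling."
              (return))))))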
Sure, once the project is finished and the problem is properly modeled, one could say: that would be easy to write in language X.Y.Z! But it's not the end product where Common Lisp shines, it's the journey.
With that, have a good night y'all.