making a full text index is turning out to be a complicated task
no matter, i will do it anyway
i've already done it the wrong way twice, so i will revise it again tomorrow after fiat mining is done
got some nasty reading work lined up to figure out how to actually secure a potentially spammy notification system
the problem so far with the indexing work is that i probably got one part of my first attempt right: run a background thread that gets thrown index jobs and does them as they come in, plus a main analysis thread that breaks the text down into words and punts them to the background thread
and i think now i need a third background thread whose only job is to persist the indexer's progress in the database so it doesn't do things twice
and a third, or is it fourth, thing is that i probably need an efficient way to check whether one of the inverted indexes already contains the new serial, so the value under that key doesn't keep growing to infinity
lots of things to figure out, but, manyana, because today is over
#GN
#realy #devstr #progressreport
after some fighting to get the huma api to play nice with my easy-registering router - mainly just realising that the huma servemux should match on path prefixes, not the whole path - i finally had the HTTP API presenting
first things first, i wanted to move more stuff into dynamic configuration stored in the database, and first thing to do is enable configuration of the admins in order to restrict access to the admin parts of the API
i haven't fully finished it yet, but it starts up wide open now; you use the /configuration/set endpoint to drop at least one npub into the Admins field, and voila, it is locked down
i have to start by adding just one configuration value to the environment: the initial password, which you put in the Authorization header field to get access. this ensures the relay is not open to the world should it be deployed on some far-distant VPS that can be spidered by nasty people who might quickly figure out how to break it on you
once those two pieces are in place, i need to put back the nip-98 expiring-token generator tool; then you can use that token temporarily to auth as an administrator and tinker with the other admin functions, but the configuration is the most important priority
so, a nice afternoon's work, dragging a bit into the evening, but i got my nice router library working with the huma API, and based on the code from the original realy i'll reinstate its whole functionality pretty quickly... likely along the way i'll find something to make a bit better, but overall it's fine as it is. it's just a bit clunky to use the export function in the scalar REST docs UI, but with my nip-98-capable curl tool, nurl, you can just use that and basta
now is time for bed tho
#gn
ahaha, so, nearly #gn again but i'm trying to get a bit actually done
i got the simple events endpoint done, though i haven't tested it. i figured the easiest way was to actually implement a simplified filter, which returns precisely the parameters needed for a simple events query (just a list of Ids) - so once i make the simple filter, i can make a filter query, then use the result to make an events query, and both are then tested...
the filter itself is done; it was just copypasta from the full filter, with ids, since, until and search removed. i had created a filter sort (so identical filters are the same if they contain the same elements) so i could make a filter index. i forget exactly why i made that - maybe to speed up subscriptions: since subscriptions are based on filters, if an identical subscription comes in and it's sorted, marshaled and hashes to the same fingerprint as an existing one, it only adds a new subscriber to that filter instead of another filter to pass through when events arrive (well, it's there, and i think it's mostly implemented so it can equally apply to simple filters later)
first, i need a new policy filter function, AcceptFilter, to match AcceptReq; it does the same thing except it only listens to one filter, because that's how this thing works.
yes, obviously i already created a new simplified filter type, it's a simple object/struct with authors, tags and kinds
further, we put "since" and "until" into the URL parameters, because we can, and why not. i think this also leaves open the possibility of later adding things like sort order, instead of the default reverse chronological (newest first)
processing the parameters is simple enough, split the Path by ? and then the second field contains parameters as key=value separated by &
this will be tomorrow afternoon's labors
it is dark, and when it gets dark, it is time to sleep, even if by the fiat clock it says 8pm that's only 6:40pm by sun clock but it's dead dark already... i think the moon is not up yet, last night later on it was very bright outside, and i was disappointed that they fixed the lights around the caminho and it was once again bright without lights on inside (two days it was dark, was so nice)
finally got around to finishing more of the HTTP nostr protocol implementation
i've already tested one method, which is for submitting events... and had used it, but then i was looking at it and realised that http methods should return http error status codes
so i went through and replaced all the ok,false etc envelopes with standard http statuses - mainly "NotAcceptable" for cases like trying to delete other users' events (need to make an exception for users authed as admins, or maybe just put it inside the main admin interface; just haven't got to that yet) and "InternalServerError", ie 500, which is what it returns if the database fetch fails for whatever reason - errors that likely don't happen very often
i forgot how laboriously complicated the delete event processing was, omg... and yeah, that's another reason why i'm probably not going to enable admin delete outside of the admin endpoints. this code is just for normal users; it's basically the same as the EVENT envelope flow but translated into HTTP
i still have to write the actual simple fetch-events-by-id `/events` query in the database; i've sketched it out in the http handler but the database side isn't done yet. i'm feeling a bit wired this evening so i'm gonna just get some sleep - it's my bedtime, sun goes down, my brain turns off
tomorrow morning it's fiat mining again for another 5 hours first, and then i can return to the nostr http work and implement the simple fetch protocol, and probably the `/filter` endpoint too, which is the same as a nip-01 filter except without the "ids" or "search" fields
for the actual "search" field i'm just gonna put a "NotImplemented" stub in there for now, because it requires a separate index to be created. i could probably make a special case of filter processing that requires at least a pubkey, a tag, and since/until boundaries no wider than, say, a month - it could then pull those with a standard filter and include everything that matches one of the keywords in the content field, and basta, full text search without a full text index... nah, seriously, i'm not going to do that. it's too fuzzy to define sufficiently tight requirements for an exhaustive scan of content fields, so it will stay that way until i do implement a fulltext search
and i will, just not now, it's important for a document repository use case and that's a key target for the #realy in the long term, in association with the #alexandria project
the tricky thing is that i have to implement that index manually: an index entry for every full word found, then a second indexing pass to find mashed-together words, then deciding whether it's case sensitive or not (probably it can just not be, and then check the content for a match if i want case sensitivity) hah... yeah, actually, it's a complicated task, probably a couple of weeks' work for me to build a full text index
anyway, getting off track, with fetch by id `/events` and `/event` for submitting events and `/filter` for searching events, you have your basic CRUD (when you include delete event processing, which i have) and that will be MVP
second target after that is to add a socket-using `/subscribe` endpoint that does filter matches only on newly arrived events, and a `/relay` method which ignores the nip-01 idea of kinds indicating an event is ephemeral, so the user can dictate that the relay not store the event even if it is storable. why? you ask - i'm glad you asked. i intend to later revise the event and request encoding scheme to plain text with line separations and other common conventions, with event kinds as path matching - like `note`, `note/html`, `note/markdown` and `note/asciidoc`, just as an example - so you could search for any kind of `note`, or for specifically formatted ones. in this future scheme there will be no notion of "ephemeral" or "replaceable" or "parameterized replaceable" events; there will just be `/relay` and `/replace`, and parameterized replaceables become client-side semantics, instead of the nebulous vague nonsense that plagues the whole nostr protocol specification from top to bottom
anyway, that's my debrief for my day's work today. it was a good day; i've mostly got a matching matrix table implemented for that now, and will actually be building a mock, having it populate the matrix table and then performing a bunch of queries... which then leads to making a specific query API for the recommender, which must also store a history of recommendations so it doesn't repeat itself
#gn
oh dear, i gave my skitty #mochi the new drontal internal antiparasitic and some other drug for the external stuff, and i see his eyes are all watery and he's behaving rather differently - playful and bold, sitting up on the shelf threatening to tamper with my knick-knacks
he is definitely in skitty mode right now
this weather at the moment is crazy, rain and wind and rain and wind and i don't know what way is up at the moment
time to go to bed, but first i must dispose of the skitty's deposits in the clay repository in the bathroom, and sweep the clay bits he has scattered around the floor, and then to welcome him to his favourite sleeping place next to me on the bed upstairs
too much friction, and probably i can't even catch a photo of him perched on the shelf with his visible slight wobble, which is probably induced by some cholinergic effect of the drugs i gave him to clear his parasites
well
inshallah this actually kills the ear mites and eye mites he has had, which the previous treatment did not affect
and i'm off to clean toilets and chew bubblegum, and go to sleep to the whooshy sound of the wind and the periodic random patter of rain
#GN