@zvava@twtxt.net Going to have to hard disagree here, I'm sorry. a) No-one reads the raw/plain twtxt.txt files; the only time you do is to debug something, or to have a sticky beak at the comments, which most clients will strip out and ignore. And b) I'm sorry, you've completely lost me! I'm old enough to pre-date Linux becoming popular, so I'm not sure what UNIX principles you think are being broken or violated by having a Twt Subject (Subject) whose content is a cryptographic content-addressable hash of the "thing"™ you're replying to, forming a chain of other replies (a thread).
I'm sorry, but the simplest thing to do is to make as few changes to the Spec as possible and all agree on a "Magic Date" from which our clients use the modified function(s).
@prologic@twtxt.net the simplest thing to do is to completely forgo hashing anything, because we are communicating using plain text files right now :3 while I agree hashes are incredibly helpful in the backend, I'm not sure they have a place outside of it; it basically eliminates two core design principles of twtxt (human readability and integrating well with unix command line utilities) and makes new clients more difficult to build than it should be
@bender@twtxt.net Well honestly, this is just it. My strong position on this is quite simple:
Do the simplest thing that could work.
It's one of the age-old UNIX philosophies.
Therefore, the simplest thing™ to do here is to just increase the hash length, mark a magic™ date/time as @lyse@lyse.isobeef.org has indicated and call it a day. We'll then be fine for a few hundred years, at which point there'll be no-one left alive to give a shit™ anyway 🤣
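For illustration only, here is a rough sketch (in JavaScript, to match the snippets later in this thread) of what "longer hashes after a magic date" could look like in a client. The cut-off date and the two lengths are made-up placeholders, and sha256/hex stands in for the actual blake2b/base32 pipeline of the Twt Hash extension so the example runs with Node's standard library alone.

```javascript
// Hypothetical sketch, NOT the spec: switch to a longer hash for twts
// written after an agreed "magic date". sha256/hex is a stand-in digest.
const crypto = require('crypto');

// Assumption: the community agrees on some cut-off; this date is invented.
const MAGIC_DATE = Date.parse('2026-01-01T00:00:00Z');

function twtHash(url, timestamp, content) {
  const digest = crypto
    .createHash('sha256')                      // stand-in for blake2b + base32
    .update(`${url}\n${timestamp}\n${content}`)
    .digest('hex');
  // Old twts keep the short hash; newer twts get a longer one to avoid collisions.
  const length = Date.parse(timestamp) < MAGIC_DATE ? 7 : 12;
  return digest.slice(-length);
}

console.log(twtHash('https://example.com/twtxt.txt', '2025-09-25T12:00:00Z', 'hello'));
console.log(twtHash('https://example.com/twtxt.txt', '2026-02-01T09:00:00Z', 'hello'));
```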
@prologic@twtxt.net considering the other alternatives we have seen (of which I have lost track already), yes. Why don't you guys (client makers) take a step at a time and, for now, increase the hash length to deal with the collisions? Then location-based addressing can be added… or not, you know.
@alexonit@twtxt.alessandrocutolo.it My problem is that I don't see a world where we don't employ some form of cryptography to use as keys for threads in databases and other such things, honestly. I'm not going to use url#timestamp as keys.
@prologic@twtxt.net I think a counter in the client is not a good choice given the decentralized nature of twtxt, especially if someone uses multiple clients together.
After thinking about it for a while I got to two solutions:
Proposal 1: Thread syntax (using subject)
Each post has an implicit and an optional explicit root reference:
Implicit (no action needed, all required data is already there)
- URL + timestamp
Explicit (subject required)
- Identity (client generated)
- External reference
- Random value
We then include a "root" subject in each post for generating explicit threads:
1. `[ROOT_ID] (REPLY_ID)`: simpler, with no need for prefixes
2. `(root:ROOT_ID) (reply:REPLY_ID)`: more complex, but could allow expansions
- `(rt:ROOT_ID) (re:REPLY_ID)`: the same, but in a compact version
- `($ROOT_ID) (>REPLY_ID)`: the same, but with single characters
Each post can have both references; like the current hash approach, the reference can be treated as a simple string and doesn't need to have a real meaning.
Using a custom reference this way allows a client to decide how to generate them (see the sketch after this list):
- Identity: can be a content hash or signature or anything else, without enforcing how it is generated we can upgrade the algorithm/length freely
- External references: can be provided from another system (e.g. 7e073bd345, yarnsocial/yarn latest commit)
- Random value: like a UUID (e.g. 9a0c34ed-d11e-447e-9257-0a0f57ef6e07)
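To make Proposal 1 a bit more concrete, here's a tiny parsing sketch for the `(root:ROOT_ID) (reply:REPLY_ID)` variant. The prefixes, field names and the example IDs are hypothetical, taken straight from the proposal above rather than from any existing spec.

```javascript
// Hypothetical parser for the "(root:ROOT_ID) (reply:REPLY_ID)" subject variant.
function parseExplicitSubject(text) {
  const root = text.match(/\(root:([^)\s]+)\)/);
  const reply = text.match(/\(reply:([^)\s]+)\)/);
  return {
    root: root ? root[1] : null,   // explicit thread root, if present
    reply: reply ? reply[1] : null // post being replied to, if present
  };
}

console.log(parseExplicitSubject('(root:9a0c34ed) (reply:7e073bd345) sounds good to me'));
// => { root: '9a0c34ed', reply: '7e073bd345' }
```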
Proposal 2: Threaded mentions (featuring zvava)
Inspired by @zvava@twtxt.net's solution, it could be simplified into: #<nick url#timestamp> or #<url#timestamp>
It can be shown like a mention or hidden like a subject.
If we're thinking of using a counter in the client, I think there's no point in avoiding the timestamp anymore.
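Purely as an illustration of Proposal 2, a client could pull the feed URL and verbatim timestamp out of such a threaded mention like this; the nick, URL and timestamp below are invented examples.

```javascript
// Hypothetical parser for "#<nick url#timestamp>" and "#<url#timestamp>" mentions.
function parseThreadedMention(twt) {
  const m = twt.match(/#<(?:(\S+)\s+)?(\S+?)#(\S+?)>/);
  if (!m) return null;
  return { nick: m[1] || null, url: m[2], timestamp: m[3] };
}

console.log(parseThreadedMention(
  'agreed! #<alice https://alice.example/twtxt.txt#2025-09-25T12:00:00Z>'
));
// => { nick: 'alice', url: 'https://alice.example/twtxt.txt', timestamp: '2025-09-25T12:00:00Z' }
```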
index.md a prehook and a few utilities:
@bender@twtxt.net Yes, I did, about a week or so ago. It took me a lot of effort to get the content even rendered in the first place. LOL, I had to basically export my blog as HTML (can you believe that?!), the Hugo export just didn't work at all 🤣
@prologic@twtxt.net While it might work if you want to keep both, I think the point was to be able to use one or the other; if we still have to generate the hash anyway, it might be pointless to use this format.
@prologic@twtxt.net I admit that I was a bit confused about the meaning of the message, at least I understood it was a "yes" from the last sentence.
@movq@www.uninformativ.de Yes, it's kind of terrible. Let's not do this 🤣
@bender@twtxt.net Really?
@bender@twtxt.net Well, you guessed correctly!
Would be nice to have a fixed fee for that, a car is a car anywhere in the world…
@prologic@twtxt.net That's a completely flat threading model (you can't reply to replies). Is that intentional?
@prologic@twtxt.net that's not too bad!
Each origin feed numbers new threads (tno:N). Replies carry both (tno:N) and (ofeed:<origin-url>). Thread identity = (ofeed, tno).
Example:
Alice starts thread #42:
2025-09-25T12:00:00Z (tno:42) Launching storage design review.
Bob replies:
2025-09-25T12:05:00Z (tno:42) (ofeed:https://alice.example/twtxt.txt) I think compaction stalls under load.
Carol replies to Bob:
2025-09-25T12:08:00Z (tno:42) (ofeed:https://alice.example/twtxt.txt) Token bucket sounds good.
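A minimal sketch of how a client might turn the (tno, ofeed) pair above into a thread key; the field names and the `origin#number` key format are assumptions made up for this example, not part of any spec.

```javascript
// Rough sketch: derive thread identity = (ofeed, tno) for the example above.
function threadKey(twt, feedUrl) {
  const tno = twt.match(/\(tno:(\d+)\)/);
  if (!tno) return null;                      // not part of any thread
  const ofeed = twt.match(/\(ofeed:(\S+?)\)/);
  // A reply carries (ofeed:...); a thread root implies its own feed URL.
  const origin = ofeed ? ofeed[1] : feedUrl;
  return `${origin}#${tno[1]}`;
}

console.log(threadKey('(tno:42) Launching storage design review.', 'https://alice.example/twtxt.txt'));
console.log(threadKey('(tno:42) (ofeed:https://alice.example/twtxt.txt) I think compaction stalls under load.', 'https://bob.example/twtxt.txt'));
// Both print "https://alice.example/twtxt.txt#42", so they land in the same thread.
```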
@itsericwoodward@itsericwoodward.com I'm glad to hear it 🤣
@alexonit@twtxt.alessandrocutolo.it Holy fuck! 🤣 I just realized how bad my typing was in my reply before 🤣 🤦‍♂️ So sorry about that haha, I blame the stupid iPhone on-screen keyboard ⌨️
@alexonit@twtxt.alessandrocutolo.it Maybe I misunderstood, but you have to keep the timezone offsets in mind. Simple alphabetical sorting of the timestamp strings does not yield a truly chronological order. It might be close enough for you, though.
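A tiny example of the pitfall being pointed out here: with offsets in play, sorting the raw timestamp strings differs from sorting by the actual instant. The timestamps below are invented.

```javascript
// Lexicographic vs. chronological ordering of RFC 3339 timestamps with offsets.
// 12:00+02:00 is really 10:00Z, i.e. earlier than 11:30Z, but string sorting
// puts it last because "12" > "11".
const stamps = ['2025-01-01T12:00:00+02:00', '2025-01-01T11:30:00Z'];

console.log([...stamps].sort());                                        // string order (wrong here)
console.log([...stamps].sort((a, b) => Date.parse(a) - Date.parse(b))); // true chronological order
```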
@alexonit@twtxt.alessandrocutolo.it that sounds pretty much like Italy! LOL. We pay $48 on renewal in Florida, US, but that fee isn't federal, so other states may pay more, or less.
@prologic@twtxt.net That is really great to hear!
If there are opposing opinions we either build a bridge or provide a new parallel road.
Also, I wouldn't call my opinion a "stance", I just wish for a better twtxt thanks to everyone's effort.
The last thing we need to do is decide a proper format for the location-based version.
My proposal is to keep the "Subject extension" unchanged and include the reference to the mention like this:
// Current hash format: starts with a '#'
(#hash) here's text
(#hash) @<nick url> here's text
// New location format: valid URL-like + '#' + TIMESTAMP (verbatim format of feed source)
(url#timestamp) here's text
(url#timestamp) @<nick url> here's text
I think the timestamp should be referenced verbatim to prevent broken references with multiple variations (especially with the many timezones out there) which would also make it even easier to implement for everyone.
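For what it's worth, here is one way a client could handle both subject forms side by side, keeping the timestamp verbatim as suggested above. The function name and the return shape are illustrative only.

```javascript
// Sketch: "(#abcdefg)" stays a hash reference, while
// "(https://example.com/twtxt.txt#2025-01-01T12:00:00Z)" is split into
// feed URL + verbatim timestamp. Not a spec, just an illustration.
function parseSubject(subject) {
  const inner = subject.replace(/^\(|\)$/g, '');
  if (inner.startsWith('#')) {
    return { kind: 'hash', hash: inner.slice(1) };
  }
  const at = inner.lastIndexOf('#');       // timestamps contain no '#'
  return {
    kind: 'location',
    url: inner.slice(0, at),
    timestamp: inner.slice(at + 1)         // kept verbatim, as proposed above
  };
}

console.log(parseSubject('(#abcdefg)'));
console.log(parseSubject('(https://example.com/twtxt.txt#2025-01-01T12:00:00Z)'));
```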
I'm sure we can get @zvava@twtxt.net, @lyse@lyse.isobeef.org and everyone else to help on this one.
I personally think we should also consider allowing a generic format to build on custom references; this would allow creating threads from any custom source (manual, computed or externally generated), maybe using a new "Topic extension". Here are some examples.
// New format for custom references: starts with a '!' maybe?
(!custom) here's text
(!custom) @<nick url> here's text
// A possible "Topic" parse as a thread root:
[!custom] start here
[custom] simpler format
This one is just an idea of mine, but I feel it can unleash new ways of using twtxt.
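As a toy illustration of that idea, a client could simply treat whatever follows the `!` as an opaque thread key, regardless of how it was generated. The `!` prefix and the `[!custom]` topic form are only proposals at this point, and the keys below are invented.

```javascript
// Toy sketch of the "custom reference" idea: group posts by an opaque key.
const posts = [
  { subject: '[!yarn-7e073bd]', text: 'start here' },
  { subject: '(!yarn-7e073bd)', text: 'reply via commit reference' },
  { subject: '(!9a0c34ed-d11e-447e-9257-0a0f57ef6e07)', text: 'another thread, UUID key' },
];

const threads = new Map();
for (const post of posts) {
  const m = post.subject.match(/^[([]!([^\])]+)[\])]$/);  // accept (!key) or [!key]
  if (!m) continue;
  const key = m[1];                                       // opaque, meaning-free string
  if (!threads.has(key)) threads.set(key, []);
  threads.get(key).push(post.text);
}

console.log(threads);
// Two threads: 'yarn-7e073bd' with two posts, the UUID key with one.
```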
@itsericwoodward@itsericwoodward.com I used the dates as-is for indexing them as strings; the ISO format allows for free automatic sorting.
@bender@twtxt.net What?! In my country you have to pay €100 every 10 years, of which about 75% is just taxes…
@movq@www.uninformativ.de I've got this magic spell in my config: -f bestvideo[height<=?1080]+bestaudio/best
@itsericwoodward@itsericwoodward.com pretty cool! Started following you, not to miss any progress. Thanks for the exhaustive reply!
@lyse@lyse.isobeef.org that's pretty cool! The first video of that kind I've seen on YouTube. Thanks!
@lyse@lyse.isobeef.org Hm, I couldn't trick yt-dlp into downloading the correct format. Works in the browser, though.
@bender@twtxt.net @movq@www.uninformativ.de I had automatically yt-dlped https://www.youtube.com/watch?v=OZTSIYkuMlU. It's only worth it as an experiment, no recommendation to watch.
@lyse@lyse.isobeef.org I can't remember the last time I came across a 360° video.
@bender@twtxt.net A renewed vision test might be a good idea for some people. I mean, it is kind of curious that you get this license as a young person and then it lasts a lifetime, without any further tests. As long as you don't screw up really badly, it remains valid …
@movq@www.uninformativ.de I tried making an ASCII scribble of a penguin waving back at you, but I gave up halfway through my first try xD … HI!
Thanks @bender@twtxt.net, it's been a long time indeed, but I was here the whole time. Just silent. I just didn't have much meaningful/worthwhile stuff to twt about … /me flips a bird to life
@bender@twtxt.net Thanks for asking!
So, I've been working on two main twtxt-related projects.
The first is a small Node / Express application that serves up a twtxt file while allowing its owner to add twts to it (or edit it outright), and I've been testing it on my site since the night I made that post. It's still very much an MVP, and I've been intermittently adding features, improving security, and streamlining the code, with an eye to releasing it after I get an MVP done of project #2 (the reader).
But that's where I've been struggling. The idea seems simple enough: another Node / Express app (this one with a Vite-powered front-end) that reads a public twtxt file, parses the "follow" list, grabs (and parses) those twtxt files, and then creates a river of twts out of the result. The pieces work fine in isolation (and with dummy data), but I keep running into weird issues when reading real, live twtxt files, so some twts come through while others get lost in the ether. I'll figure it out eventually, but for now I've been spending far more time than I anticipated just trying to get it to work end-to-end.
On top of that, the two projects wound up turning into four (so far), as I've been spinning out little libraries to use across both apps (like https://jsr.io/@itsericwoodward/fluent-dom-esm, and a forthcoming twtxt helper library).
In the end, I'm hoping to have project 1 (the editor) in beta by the end of October, and project 2 (the reader) in beta sometime after that, but we'll see.
I hope this has satisfied your curiosity, but if you'd like to know more, please reach out!
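Not Eric's actual code, but a rough sketch of the reader idea described above, under two assumptions: follows are declared as `# follow = nick url` metadata comments (the Yarn convention, as far as I know), and twt lines are `RFC3339-timestamp<TAB>text`. Requires Node 18+ for the global fetch; all URLs are placeholders.

```javascript
// Sketch: read a public twtxt file, follow its "# follow" list, merge the twts.
async function buildRiver(myFeedUrl) {
  const mine = await (await fetch(myFeedUrl)).text();

  // Collect followed feed URLs from the metadata comments.
  const followed = [...mine.matchAll(/^#\s*follow\s*=\s*\S+\s+(\S+)/gm)].map(m => m[1]);

  const twts = [];
  for (const url of followed) {
    try {
      const body = await (await fetch(url)).text();
      for (const line of body.split('\n')) {
        if (!line || line.startsWith('#')) continue;   // skip comments/metadata
        const [timestamp, ...rest] = line.split('\t');
        if (!rest.length) continue;                    // not a twt line
        twts.push({ feed: url, timestamp, text: rest.join('\t') });
      }
    } catch (err) {
      console.error(`skipping ${url}: ${err.message}`); // one bad feed shouldn't kill the river
    }
  }

  // Newest first, comparing absolute instants rather than raw strings.
  return twts.sort((a, b) => Date.parse(b.timestamp) - Date.parse(a.timestamp));
}

buildRiver('https://example.com/twtxt.txt').then(river => console.log(river.slice(0, 20)));
```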
@movq@www.uninformativ.de better than in the US. Ours lasts only 10 years, and you need to go through the vision test and, of course, pay. Recently they added a little gold star denoting "Real ID" compliance, and we had to pay $10 to get the old one replaced, out of the regular renewal "schedule".
Here it is all about control, and money.
@lyse@lyse.isobeef.org is it a 360 degree video online, or a local one?
@alexonit@twtxt.alessandrocutolo.it Yes! I kind of love your stance and position on this. If we are going to make changes to the threading model, let's keep content-based addressing, but also improve the user experience. So in fact, to answer your question, I think yes, we can do some kind of combination of both.
@lyse@lyse.isobeef.org @prologic@twtxt.net Can't we find a middle ground and support both?
The thread is defined by two parts:
- The hash
- The subject
The client/pod generates the hash and indexes it in its database/cache, then it simply queries the subjects of other posts to find the related posts, right?
In my own client's current implementation (using hashes), the only calculation is in the hash generation; the rest is a verbatim copy of the subject (minus the # character). If this is the commonly implemented approach, then adding the location-based one is somewhat simple.
function setPostIndex(post) {
// Current hash approach
const hash = createHash(post.url, post.timestamp, post.content);
// New location approach
const location = post.url + '#' + post.timestamp;
// Unchanged (probably)
const subject = post.subject;
// Index them all
addToIndex(hash, post);
addToIndex(location, post);
addToIndex(subject, post);
}
// Both should work if the index contains both versions
getThreadBySubject('#abcdef') => [post1, post2, post3]; // Hash
getThreadBySubject('https://example.com#2025-01-01T12:00:00') => [post1, post2, post3]; // Location
As I said before, the mention syntax is already location-based, @<example https://example.com/twtxt.txt>, so I think we should keep that in consideration.
Of course this will lead to a bit of fragmentation (without merging the two) but I think this can make everyone happy.
Otherwise, the only other solution I can think of is a different approach where the value doesn't matter, allowing anything to be used as a reference (hash, location, git commit) for greater flexibility and freedom of implementation (this probably needs a fixed "header" for each post, but it can be seen as a separate extension).
@lyse@lyse.isobeef.org I don't think there's any point in continuing the discussion of Location vs. Content based addressing.
I want us to preserve Content based addressing.
Let's improve the user experience and fix the hash collision problems.
@aelaraji@aelaraji.com welcome back dude! Long time no see!
@movq@www.uninformativ.de Yeah, it took quite some time to load. But then it was briefly back. Now it's 503ing immediately all the time.
@lyse@lyse.isobeef.org https://www.celestrak.com (where the TLE files are downloaded from) seems to be down. :/
@movq@www.uninformativ.de Woah, cool!
(WTF, asciiworld-sat-track somehow broke, but I have not changed any of the scripts at all. O_o It doesn't find the asciiworld-sat-calc anymore. How in the world!? When I use an absolute path, the .tle is empty and I get a parsing error. Gotta debug this.)
@prologic@twtxt.net I know we won't ever convince each other of the other's favorite addressing scheme. :-D But I wanna address (haha) your concerns:
I don't see any difference between the two schemes regarding link rot and migration. If the URL changes, both approaches are equally terrible, as the feed URL is part of the hashed value in one and part of the reference in the location-based scheme. It doesn't matter.
The same is true for duplication and forks. Even today, the "canonical URL" has to be chosen to build the hash. That's exactly the same with location-based addressing. Why would a mirror only duplicate stuff with location- but not content-based addressing? I really fail to see that. Also, who is using mirrors or relays anyway? I don't know of any such software, to be honest.
If there is a spam feed, I just unfollow it. Done. Not a concern for me at all. Not the slightest bit. And the byte verification is THE source of all broken threads when the start of a conversation is edited. Yes, this can be viewed as a feature, but how many times was it actually a feature, rather than behaving as an anti-feature in terms of user experience?
I don't get your argument. If the feed in question is offline, one can simply look in local caches and see if there is a message at that particular time, just like looking up a hash. Where's the difference? Except that the lookup key is longer or compound or whatever, depending on the cache format.
Even a new hashing algorithm requires work on clients etc. It's not that you get some backwards-compatibility for free. It just cannot be backwards-compatible in my opinion, no matter which approach we take. That's why I believe some magic time for the switch causes the least amount of trouble. You leave the old world untouched and working.
If these are general concerns, I'm completely with you. But I don't think that they only apply to location-based addressing. That's how I interpreted your message. I could be wrong. Happy to read your explanations. :-)
@movq@www.uninformativ.de Happy equinox!
It looks amazing from the map; you probably can't tell even by looking from space.
@kat@yarn.girlonthemoon.xyz Oh! A new place to spam dad jokes.