Searching yarn

Twts matching #4

Goodbye Blender, I guess? 🤔


A bit annoying, but not much of a problem. The only thing I did with Blender was make some very simple 3D-printable objects.

I’ll have a look at the alternatives out there. Worst case is I go back to Art of Illusion, which I used heavily ~15 years ago.

⤋ Read More
In-reply-to » @bender Gave it a try on Termux same thing @doesnm uses and it worked 👍 Media

@doesnm@doesnm.p.psf.lt No it’s all good… I’ve just rebuilt it from master and it doesn’t look like anything is broken:

~/GitRepos> git clone https://github.com/plomlompom/htwtxt.git
Cloning into 'htwtxt'...
remote: Enumerating objects: 411, done.
remote: Total 411 (delta 0), reused 0 (delta 0), pack-reused 411 (from 1)
Receiving objects: 100% (411/411), 87.89 KiB | 430.00 KiB/s, done.
Resolving deltas: 100% (238/238), done.
~/GitRepos> cd htwtxt
master ~/GitRepos/htwtxt> go mod init htwtxt
go: creating new go.mod: module htwtxt
go: to add module requirements and sums:
        go mod tidy
master ~/GitRepos/htwtxt> go mod tidy
go: finding module for package github.com/gorilla/mux
go: finding module for package golang.org/x/crypto/bcrypt
go: finding module for package gopkg.in/gomail.v2
go: finding module for package golang.org/x/crypto/ssh/terminal
go: found github.com/gorilla/mux in github.com/gorilla/mux v1.8.1
go: found golang.org/x/crypto/bcrypt in golang.org/x/crypto v0.29.0
go: found golang.org/x/crypto/ssh/terminal in golang.org/x/crypto v0.29.0
go: found gopkg.in/gomail.v2 in gopkg.in/gomail.v2 v2.0.0-20160411212932-81ebce5c23df
go: finding module for package gopkg.in/alexcesaro/quotedprintable.v3
go: found gopkg.in/alexcesaro/quotedprintable.v3 in gopkg.in/alexcesaro/quotedprintable.v3 v3.0.0-20150716171945-2caba252f4dc
master ~/GitRepos/htwtxt> go build
master ~/GitRepos/htwtxt> ll
.rw-r--r-- aelaraji aelaraji 330 B  Fri Nov 22 20:25:52 2024  go.mod
.rw-r--r-- aelaraji aelaraji 1.1 KB Fri Nov 22 20:25:52 2024  go.sum
.rw-r--r-- aelaraji aelaraji 8.9 KB Fri Nov 22 20:25:06 2024  handlers.go
.rwxr-xr-x aelaraji aelaraji  12 MB Fri Nov 22 20:26:18 2024  htwtxt                 <-------- There's the binary ;)
.rw-r--r-- aelaraji aelaraji 4.2 KB Fri Nov 22 20:25:06 2024  io.go
.rw-r--r-- aelaraji aelaraji  34 KB Fri Nov 22 20:25:06 2024  LICENSE
.rw-r--r-- aelaraji aelaraji 8.5 KB Fri Nov 22 20:25:06 2024  main.go
.rw-r--r-- aelaraji aelaraji 5.5 KB Fri Nov 22 20:25:06 2024  README.md
drwxr-xr-x aelaraji aelaraji 4.0 KB Fri Nov 22 20:25:06 2024  templates

⤋ Read More

China boosts support for Global South + 2 more stories
Chinese President Xi Jinping announces eight development measures for the Global South; President Biden visits the Amazon rainforest to boost climate efforts; Biden pledges record $4 billion to World Bank fund for poorest nations. ⌘ Read more

⤋ Read More

SPRF 10km: 6.28 miles, 00:07:55 average pace, 00:49:44 duration
played this by feel. such a cool day with a breeze making everything feel comfortable. at around 3 or 4 miles in i realized my pace was probably pretty quick and told myself i would just attempt to maintain. near mile five it started to hurt but just pushed through and got a pb!
#running #race

⤋ Read More
In-reply-to » Thank you, @eapl.me! No need to apologize in the introduction, all good. :-)

@lyse@lyse.isobeef.org

Regarding section 4 about feed discovery: Yeah, non-HTTP transport protocols are an issue as they do not have User-Agent headers. How exactly do you envision the discovery_url to work, though?

This is from a twt of mine from January 2022:

https://www.uninformativ.de/files/twtxt/2022%2D01%2D22%2D%2Dfollow%2Dendpoint.md

(This idea gets lost all the time, so I put it into a file now. 😅)

Not sure if this is what @eapl.me@eapl.me had in mind, obviously.

⤋ Read More
In-reply-to » Thanks @lyse! I'm replying here https://text.eapl.mx/reply-to-lyse-about-twtxt

Thank you, @eapl.me@eapl.me! No need to apologize in the introduction, all good. :-)

Section 3: I’m a bit on the fence regarding documenting the HTTP caching headers. It’s a very general HTTP thing, so there is nothing special about them for twtxt, and the Twtxt Specification doesn’t need to redo it. But on the other hand, a short hint could certainly help client developers and feed authors. Maybe it’s thanks to my distro’s Nginx maintainer, but I did not have to configure anything for the Last-Modified and ETag headers to be included in the response; the web server already did it automatically.

The more I think about it while typing this reply, the more I think your suggested recommendation is actually really great. It will definitely be beneficial for client developers. In almost all client implementations, I’d say, one has to do something specific in the code to send the If-Modified-Since and/or If-None-Match request headers. There is no magic that will do it automatically, as one has to combine data from the last response with the new request.

But I also came across feeds that don’t serve any response headers that would make caching possible at all. So, an explicit recommendation would also prompt feed authors to check their server setups. Yeah, let’s absolutely do this! :-)
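
To make that concrete, here is a minimal sketch of such a conditional fetch, using only Go’s standard library. The feed URL and the way the validators are stored are placeholders for the example, not anything from the spec or an existing client:

// Conditional fetch of a twtxt feed: reuse the Last-Modified/ETag values from
// the previous response as If-Modified-Since/If-None-Match on the next request.
package main

import (
	"fmt"
	"io"
	"net/http"
)

// Validators remembered from the previous fetch; a real client would persist these.
var lastModified, etag string

func fetchFeed(url string) error {
	req, err := http.NewRequest(http.MethodGet, url, nil)
	if err != nil {
		return err
	}
	if lastModified != "" {
		req.Header.Set("If-Modified-Since", lastModified)
	}
	if etag != "" {
		req.Header.Set("If-None-Match", etag)
	}
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()

	if resp.StatusCode == http.StatusNotModified {
		fmt.Println("feed unchanged, keep the cached copy")
		return nil
	}
	lastModified = resp.Header.Get("Last-Modified")
	etag = resp.Header.Get("ETag")
	body, err := io.ReadAll(resp.Body)
	if err != nil {
		return err
	}
	fmt.Printf("fetched %d bytes\n", len(body))
	return nil
}

func main() {
	_ = fetchFeed("https://example.com/twtxt.txt") // placeholder feed URL
}

The two header values are the only state a client needs to keep between runs to benefit from 304 Not Modified responses.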

Regarding section 4 about feed discovery: Yeah, non-HTTP transport protocols are an issue as they do not have User-Agent headers. How exactly do you envision the discovery_url to work, though? That said, I wouldn’t limit the transports to HTTP(S) in the Twtxt Specification. It’s up to the client to decide which protocols it wants to support.

Since I currently rely on buckket’s twtxt client to fetch the feeds, I can only follow http(s):// (and file://) feeds. But in tt2 I will certainly add some gopher:// and gemini:// at some point in time.

Some time ago, @movq@www.uninformativ.de found out that some Gopher/Gemini users prefer to just get an e-mail from people following them: https://twtxt.net/twt/dikni6q So, it might not even be something to be solved as there is no problem in the first place.

Section 5 on protocol support: You’re right, announcing the different transports in the url metadata would certainly help. :-)

Section 7 on emojis: Your idea of TUI/CLI avatars is really intriguing, I have to say. Maybe I will pick this up in tt2 some day. :-)

⤋ Read More
In-reply-to » Righto, @eapl.me, ta for the writeup. Here we go. :-)

@eapl.me@eapl.me here are my replies (somewhat similar to Lyse’s and James’)

  1. Metadata in twts: Key=value is too complicated for non-hackers and hard to write by hand. So if there is a need, we should just use #NSFW or the alt-text field in Markdown image syntax ![NSFW](url.to/image.jpg) if something is NSFW.

  2. IDs besides datetime: When you edit a twt, you should preserve the datetime if location-based addressing is to have any advantage over content-based addressing. If you change the timestamp, it’s a new post, just like in any other blog CMS.

  3. Caching: Yes, all good ideas, but that is more a task for the clients than for the serving of the twtxt.txt files.

  4. Discovery: User-Agent-based discovery can become better. I’m working on a wrapper script in PHP, so you don’t need to dig through Apache’s log files to see who fetches your feed (see the sketch after this list). But for Gemini and Gopher you need to rely on something else. That could be my webmentions-for-twtxt suggestion, or simply defining an email metadata field for letting a person know you follow their feed. There is an interesting read about why WebMentions might be a bad idea; twtxt being much simpler than a full-featured IndieWeb site, a lot of those concerns do not apply here. But that’s the issue with any open inbox: it is hard to solve without some form of (centralized or community) spam moderation.

  5. Support more protocols besides http/s: Yes, why not, if we can make clients that merge or differentiate between the same feed served via multiple URLs.

  6. Languages: If the need is big, make a separate feed. I don’t mind seeing stuff in other languages, as the volume is low, and you have translation tools if you need to know what’s going on. And again, when there is a need for easier switching between posting to several feeds, it’s about building clients with a UI that makes that easy. Not something that should take up space in the format/protocol.

  7. Emojis: I’m not sure what this is about. Do you want to use emojis as avatars in CLI clients, or is it just about rendering emojis?
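
A rough sketch of the discovery idea in point 4 (my own illustration, not the actual PHP wrapper): twtxt clients conventionally identify themselves with a User-Agent like twtxt/1.2.3 (+https://example.com/twtxt.txt; @somebody), so pulling the feed URL and nick out of that header is enough to see who follows you, without grepping the web server’s logs:

// Extract follower information from a twtxt-style User-Agent header.
package main

import (
	"fmt"
	"regexp"
)

// Matches the "(+<feed URL>; @<nick>)" part of the conventional twtxt User-Agent.
var followerRe = regexp.MustCompile(`\(\+(\S+);\s*@(\S+)\)`)

func parseFollower(userAgent string) (url, nick string, ok bool) {
	m := followerRe.FindStringSubmatch(userAgent)
	if m == nil {
		return "", "", false // browsers and other generic agents won't match
	}
	return m[1], m[2], true
}

func main() {
	ua := "twtxt/1.2.3 (+https://example.com/twtxt.txt; @somebody)" // example header
	if url, nick, ok := parseFollower(ua); ok {
		fmt.Printf("@%s follows you from %s\n", nick, url)
	}
}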

⤋ Read More
In-reply-to » I’m seeing strange lights in the sky. None of my cameras are sensitive enough to make a video.

@movq@www.uninformativ.de Some more options:

  1. Summer lightning.
  2. Obviously aliens!11!!!1

I once saw a light show in the woods originating most likely from a disco a few kilometers away. That was also pretty crazy. There was absolutely zero sound reaching the valley I was in.

⤋ Read More
In-reply-to » @bender @prologic I'm not exactly asking yarnd to change. If you are okay with the way it displayed my twts, then by all means, leave it as is. I hope you won't mind if I continue to write things like 1/4 to mean "first out of four".

@bender@twtxt.net I try to avoid editing. I guess I would write 5/4, 6/4, etc, and hopefully my audience would be sympathetic to my failing.

Anyway, I don’t think my eccentric decision to number my twts in the style of other social media platforms is the only context where someone might write 1/4 not meaning a quarter. E.g. January 4, to Americans.

I’m happy to keep overthinking this for as long as you are :-P

⤋ Read More
In-reply-to » @prologic I'm not a yarnd user, so it doesn't matter a whole lot to me, but FWIW I'm not especially keen on changing how I format my twts to work around yarnd's quirks.

@bender@twtxt.net @prologic@twtxt.net I’m not exactly asking yarnd to change. If you are okay with the way it displayed my twts, then by all means, leave it as is. I hope you won’t mind if I continue to write things like 1/4 to mean “first out of four”.

What has text/markdown got to do with this? I don’t think Markdown says anything about replacing 1/4 with ¼, or other similar transformations. It’s not needed, because ¼ is already a unicode character that can simply be directly inserted into the text file.

What’s wrong with my original suggestion of doing the transformation before the text hits the twtxt.txt file? @prologic@twtxt.net, I think it would achieve what you are trying to achieve with this content-type thing: if someone writes 1/4 on a yarnd instance or any other client that wants to do this, it would get transformed, and other clients simply wouldn’t do the transformation. Every client that supports displaying unicode characters, including Jenny, would then display ¼ as ¼.
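
For what it’s worth, here is a toy sketch of that compose-time transformation (my illustration, not actual yarnd code, and the replacement table is just an example):

// Pretty-print a twt once, at compose time, so the transformed text is what
// lands in twtxt.txt; feeds written by other clients are never touched.
package main

import (
	"fmt"
	"strings"
)

var prettify = strings.NewReplacer(
	"1/4", "¼",
	"1/2", "½",
	"3/4", "¾",
)

func main() {
	composed := "This is twt 1/4 in a series"
	published := prettify.Replace(composed)
	fmt.Println(published) // what gets appended to twtxt.txt
}

Since this runs before publishing, readers of the raw twtxt.txt see exactly what the author saw when composing.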

Alternatively, if you prefer yarnd to pretty-print all twts nicely, even ones from simpler clients, that’s fine too and you don’t need to change anything. My 1/4 -> ¼ thing is nothing more than a minor irritation which probably isn’t worth overthinking.

⤋ Read More
In-reply-to » @bender True, I'm just not sure we can have it both way? 🤔 I can turn smartypants off, but I do seem to recall you wanted it on 🤣

@prologic@twtxt.net I’m not a yarnd user, so it doesn’t matter a whole lot to me, but FWIW I’m not especially keen on changing how I format my twts to work around yarnd’s quirks.

I wonder if this kind of postprocessing would fit better between composing (via yarnd’s UI) and publishing. So, if a yarnd user types 1/4, it could get changed to ¼ in the twtxt.txt file for everyone to see, not just people reading through yarnd. But when I type 1/4, meaning first out of four, as a non-yarnd user, the meaning wouldn’t get corrupted. I can always type ¼ directly if that’s what I really intend.

(This twt might be easier to understand if you read it without any transformations :-P)

Anyway, again, I’m not a yarnd user, so do what you will, just know you might not be seeing exactly what I meant.

⤋ Read More

Inversion by Aric McBay was another random library pick. Like The Fall of Io, it’s the most recent in a series, though I think this series is pretty loosely connected. In contrast, the villain in this book is simple and cartoonishly evil. The book presents a design for utopia which was interesting but a little cloying. I’m not sure if I’m supposed to want to live there, but I don’t think I do. I enjoyed the book as easy reading, and might try the others in the series some time. (4/4)

⤋ Read More

I’m enjoying Wesley Chu’s Tao and Io series. Spies, action, ancient aliens. Some funny parts, some interesting world-building parts, some action-filled parts. I picked up The Fall of Io at random from a library a few weeks ago, and it turned out to be the last in a series of six (technically two series), so after finishing that I read the first and am partway through the second. Usually I try to read series in order, but this way is interesting. One thing I liked about The Fall of Io was that it followed many points of view with somewhat conflicting interests, some more evil than others, and I felt sympathy for most of them. (I was kind of hoping it would be about Jupiter’s moon Io, but it wasn’t, but I’m satisfied with what I ended up with.) (2/4)

⤋ Read More

👋 Thanks for joining us on our Sept monthly Yarn.social meetup today y’all 🙇‍♂️ We had @david@collantes.us @sorenpeter@darch.dk @doesnm@doesnm.p.psf.lt @falsifian@www.falsifian.org and @xuu@txt.sour.is 💪 Nice turn out! (not all at once of course, as we normally run this over 4 hours as we span many time zones!)

Things we talked about:

  • Decentralised vs. Distributed
  • Use of SHA256 for Twt Hash(es)
  • We solved Edits! 🥳
  • UUID(s) probably won’t work! (susceptible to spoofing)
  • Helped @sorenpeter@darch.dk write some PHP to process/parse User-Agent and serve his feed via a custom PHP script 😅
  • @falsifian@www.falsifian.org introduced himself 👌
  • Talked about Merkle Trees 🌳

Did I miss anything? 🤔

⤋ Read More

Recent #fiction #scifi #reading:

  • The Memory Police by Yōko Ogawa. Lovely writing. Very understated; reminded me of Kazuo Ishiguro. Sort of like Nineteen Eighty-Four but not. (I first heard it recommended in comparison to that work.)

  • Subcutanean by Aaron Reed; https://subcutanean.textories.com/ . Every copy of the book is different, which is a cool idea. I read two of them (one from the library, actually not different from the other printed copies, and one personalized e-book). I don’t read much horror so managed to be a little creeped out by it, which was fun.

  • The Wind from Nowhere, a 1962 novel by J. G. Ballard. A random pick from the sci-fi section; I think I picked it up because it made me imagine some weird 4-dimensional effect (“from nowhere” meaning not in a normal direction) but actually (spoiler) it was just about a lot of wind for no reason. The book was moderately entertaining but there was nothing special about it.

Currently reading Scale by Greg Egan and Inversion by Aric McBay.

⤋ Read More

More thoughts about changes to twtxt (as if we haven’t had enough thoughts):

  1. There are lots of great ideas here! Is there a benefit to putting them all into one document? Seems to me this could more easily be a bunch of separate efforts that can progress at their own pace:

1a. Better and longer hashes.

1b. New possibly-controversial ideas like edit: and delete: and location-based references as an alternative to hashes.

1c. Best practices, e.g. Content-Type: text/plain; charset=utf-8 (see the sketch after this list).

1d. Stuff already described at dev.twtxt.net that doesn’t need any changes.

  2. We won’t know what will and won’t work until we try them. So I’m inclined to think of this as a bunch of draft ideas. Maybe later when we’ve seen it play out it could make sense to define a group of recommended twtxt extensions and give them a name.

  3. Another reason for 1 (above) is: I like the current situation where all you need to get started is these two short and simple documents:
    https://twtxt.readthedocs.io/en/latest/user/twtxtfile.html
    https://twtxt.readthedocs.io/en/latest/user/discoverability.html
    and everything else is an extension for anyone interested. (Deprecating non-UTC times seems reasonable to me, though.) Having a big long “twtxt v2” document seems less inviting to people looking for something simple. (@prologic@twtxt.net you mentioned an anonymous comment “you’ve ruined twtxt” and while I don’t completely agree with that commenter’s sentiment, I would feel like twtxt had lost something if it moved away from having a super-simple core.)

  4. All that being said, these are just my opinions, and I’m not doing the work of writing software or drafting proposals. Maybe I will at some point, but until then, if you’re actually implementing things, you’re in charge of what you decide to make, and I’m grateful for the work.
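
For 1c above, a minimal sketch of that best practice when serving a feed yourself with Go’s standard library (the file path and port are placeholders):

// Serve twtxt.txt as text/plain with an explicit UTF-8 charset.
package main

import (
	"log"
	"net/http"
)

func main() {
	http.HandleFunc("/twtxt.txt", func(w http.ResponseWriter, r *http.Request) {
		// Set the header before ServeFile so it isn't guessed from the extension.
		w.Header().Set("Content-Type", "text/plain; charset=utf-8")
		http.ServeFile(w, r, "/var/www/twtxt.txt") // placeholder path
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}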

⤋ Read More
In-reply-to » I'd like to see them fine me 2% of zero dollars

83(4) GDPR sets forth fines of up to 10 million euros, or, in the case of an undertaking, up to 2% of its entire global turnover of the preceding fiscal year, whichever is higher.

Though I suppose it has to be the greater of the two. But I don’t even have one euro to start with.
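
Put as a worked equation (my reading of the clause, nothing official):

\max(\text{€}10\,000\,000,\; 0.02 \times \text{€}0) = \text{€}10\,000\,000

So a turnover of zero just means the flat €10 million cap applies.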

⤋ Read More

Gemini/Gopher Twtxt feeds account for only about 1% of the feeds in existence:

$ total=$(inspect-db yarns.db | jq -r '.Value.URL' \
    | awk -F'//' '{if ($1 ~ /^https?/) print "http/https:"; else print $1}' \
    | sort | uniq -c | awk '{sum+=$1} END {print sum}'); \
  inspect-db yarns.db | jq -r '.Value.URL' \
    | awk -F'//' '{if ($1 ~ /^https?/) print "http/https:"; else print $1}' \
    | sort | uniq -c \
    | awk -v total="$total" '{printf "%d %s %.2f%%\n", $1, $2, ($1/total)*100}' \
    | sort -r
7 gemini: 0.66%
4 gopher: 0.38%
1046 http/https: 98.96%

⤋ Read More
In-reply-to » This is only first draft quality, but I made some notes on the #twtxt v2 proposal. http://a.9srv.net/b/2024-09-25

Good writeup, @anth@a.9srv.net! I agree with most of your points.

3.2 Timestamps: I feel no need to mandate UTC. Timezones are fine with me. But I could also live with this new restriction. I fail to see, though, how this change would make things any easier compared to the original format.

3.4 Multi-Line Twts: What exactly do you think is bad about multi-line twts?

4.1 Hash Generation: I do like the idea of a new uuid metadata field! Any thoughts on two feeds selecting the same UUID for whatever reason? Well, the same could happen today with url.

5.1 Reply to last & 5.2 More work to backtrack: I do not understand anything you’re saying. Can you rephrase that?

8.1 Metadata should be collected up front: I generally agree, but if the uuid metadata field were a feed URL and not a real UUID, there should probably be an exception allowing the feed URL to change mid-file after relocation.

⤋ Read More
In-reply-to » Taking the last n characters of a base32 encoded hash instead of the first n can be problematic for several reasons:

@prologic@twtxt.net

There’s a simple reason all the current hashes end in a or q: the hash is 256 bits, the base32 encoding chops that into groups of 5 bits, and 256 isn’t divisible by 5. The last character of the base32 encoding just has that left-over single bit (256 mod 5 = 1).

So I agree with #3 below, but do you have a source for #1, #2 or #4? I would expect any lack of variability in any part of a hash function’s output would make it more vulnerable to attacks, so designers of hash functions would want to make the whole output vary as much as possible.

Other than the divisible-by-5 thing, my current intuition is it doesn’t matter what part you take.
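
A quick way to check this empirically, sketched in Go with sha256 standing in for whatever 256-bit digest the spec actually uses (the choice of hash function doesn’t matter for this property):

// Base32-encode many 256-bit digests and record the final character: because
// 256 mod 5 = 1, the 52nd character carries a single bit and can only be a or q.
package main

import (
	"crypto/sha256"
	"encoding/base32"
	"fmt"
	"strings"
)

func main() {
	lastChars := map[string]int{}
	for i := 0; i < 10000; i++ {
		sum := sha256.Sum256([]byte(fmt.Sprintf("twt number %d", i)))
		enc := strings.ToLower(base32.StdEncoding.WithPadding(base32.NoPadding).EncodeToString(sum[:]))
		lastChars[string(enc[len(enc)-1])]++
	}
	fmt.Println(lastChars) // only "a" and "q" ever show up
}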

  1. Hash Structure: Hashes are typically designed so that their outputs have specific statistical properties. The first few characters often have more entropy or variability, meaning they are less likely to have patterns. The last characters may not maintain this randomness, especially if the encoding method has a tendency to produce less varied endings.

  2. Collision Resistance: When using hashes, the goal is to minimize the risk of collisions (different inputs producing the same output). By using the first few characters, you leverage the full distribution of the hash. The last characters may not distribute in the same way, potentially increasing the likelihood of collisions.

  3. Encoding Characteristics: Base32 encoding has a specific structure and padding that might influence the last characters more than the first. If the data being hashed is similar, the last characters may be more similar across different hashes.

  4. Use Cases: In many applications (like generating unique identifiers), the beginning of the hash is often the most informative and varied. Relying on the end might reduce the uniqueness of generated identifiers, especially if a prefix has a specific context or meaning.

⤋ Read More

Taking the last n characters of a base32 encoded hash instead of the first n can be problematic for several reasons:

  1. Hash Structure: Hashes are typically designed so that their outputs have specific statistical properties. The first few characters often have more entropy or variability, meaning they are less likely to have patterns. The last characters may not maintain this randomness, especially if the encoding method has a tendency to produce less varied endings.

  2. Collision Resistance: When using hashes, the goal is to minimize the risk of collisions (different inputs producing the same output). By using the first few characters, you leverage the full distribution of the hash. The last characters may not distribute in the same way, potentially increasing the likelihood of collisions.

  3. Encoding Characteristics: Base32 encoding has a specific structure and padding that might influence the last characters more than the first. If the data being hashed is similar, the last characters may be more similar across different hashes.

  4. Use Cases: In many applications (like generating unique identifiers), the beginning of the hash is often the most informative and varied. Relying on the end might reduce the uniqueness of generated identifiers, especially if a prefix has a specific context or meaning.

In summary, using the first n characters generally preserves the intended randomness and collision resistance of the hash, making it a safer choice in most cases.

⤋ Read More
In-reply-to » Could someone knowledgable reply with the steps a grandpa will take to calculate the hash of a twtxt from the CLI, using out-of-the-box tools? I swear I read about it somewhere, but can't find it.

@quark@ferengi.one Do you mean something like this?

$ ./yarnc debug ~/Public/twtxt.txt | tail -n 1
kp4zitq 2024-09-08T02:08:45Z	(#wsdbfna) @<aelaraji https://aelaraji.com/twtxt.txt> My work has this thing called "compressed work", where you can **buy** extra time off (_as much as 4 additional weeks_) per year. It comes out of your pay though, so it's not exactly a 4-day work week but it could be useful, just haven't tired it yet as I'm not entirely sure how it'll affect my net pay

⤋ Read More
In-reply-to » @quark At the moment, the twt in question exists in the sixth archive:

@movq@www.uninformativ.de I didn’t run the command as you recommended, but I wiped things once more, ran jenny -f, and this time got:

david@arrakis:~$ jenny -f
Fetching archived feed https://anthony.buc.ci/user/abucci/twtxt.txt/1 (configured as abucci, https://anthony.buc.ci/user/abucci/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2024-04.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://darch.dk/twtxt-archive.txt (configured as soren, https://darch.dk/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2024-04-21_6v47cua.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://twtxt.net/user/prologic/twtxt.txt/1 (configured as prologic, https://twtxt.net/user/prologic/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2024-03.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2022-12-21_2us6qbq.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://twtxt.net/user/prologic/twtxt.txt/2 (configured as prologic, https://twtxt.net/user/prologic/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2024-02.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2022-01-14_ew5gzca.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://twtxt.net/user/prologic/twtxt.txt/3 (configured as prologic, https://twtxt.net/user/prologic/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2024-01.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-12-23_f6y65bq.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://twtxt.net/user/prologic/twtxt.txt/4 (configured as prologic, https://twtxt.net/user/prologic/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-12.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-12-04_e4x7yba.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://twtxt.net/user/prologic/twtxt.txt/5 (configured as prologic, https://twtxt.net/user/prologic/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-11.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-11-18_42tjxba.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://twtxt.net/user/prologic/twtxt.txt/6 (configured as prologic, https://twtxt.net/user/prologic/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-10.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-11-08_i2wnvaa.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-09.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-10-23_kvwn5oa.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-08.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-10-11_mljudaa.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-07.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-09-22_5mkqwua.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-06.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-07-27_xcnzmlq.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-05.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-06-16_mtedqya.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-04.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-04-29_z7lvzja.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-03.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-03-19_xjabvhq.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-02.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-02-24_te4a6oa.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-01.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-01-26_qxgigma.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-12.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2020-12-13_igfnala.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-11.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-10.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-09.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-08.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-07.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-06.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-05.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-04.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-03.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-02.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-01.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-12.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-11.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-10.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-09.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-08.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-07.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-06.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-05.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-04.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-03.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-02.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-01.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2020-12.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)

Notice that @prologic@twtxt.net’s /6 is there. I found the twt then. Kind of odd it didn’t show up before.

⤋ Read More
In-reply-to » @falsifian In my opinion it was a mistake that we defined the first url field in the feed to define the URL for hashing. It should have been the last encountered one. Then, assuming append-style feeds, you could override the old URL with a new one from a certain point on:

I was not suggesting that everyone needs to set up a working webfinger endpoint, but that we take the format of nick+(sub)domain as the base for generating the hash, together with the message date and content.

If we omit the protocol prefix from the way we do things now, will that not solve most of the problems? In the case of gemini://gemini.ctrl-c.club/~nristen/twtxt.txt they also have a working twtxt.txt at https://ctrl-c.club/~nristen/twtxt.txt … damn, I just noticed the gemini. subdomain.

Okay, what about defining a preferred protocol as part of the hash schema? So 1: https, 2: http, 3: gemini, 4: gopher?
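
A quick sketch of that idea (my own interpretation, not a spec): strip the scheme before hashing so the http, https, gemini and gopher variants of a feed URL all produce the same hash subject:

// Drop the protocol prefix from a feed URL before it goes into the hash.
package main

import (
	"fmt"
	"strings"
)

func hashSubject(feedURL string) string {
	for _, scheme := range []string{"https://", "http://", "gemini://", "gopher://"} {
		if strings.HasPrefix(feedURL, scheme) {
			return strings.TrimPrefix(feedURL, scheme)
		}
	}
	return feedURL
}

func main() {
	fmt.Println(hashSubject("https://ctrl-c.club/~nristen/twtxt.txt"))
	fmt.Println(hashSubject("gemini://gemini.ctrl-c.club/~nristen/twtxt.txt"))
	// As noted above, the gemini feed sits on a gemini. subdomain, so stripping
	// the scheme alone still wouldn't make these two identical.
}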

⤋ Read More