Good writeup, @anth@a.9srv.net! I agree with most of your points.
3.2 Timestamps: I feel no need to mandate UTC. Timezones are fine with me. But I could also live with this new restriction. I fail to see, though, how this change would make things any easier compared to the original format.
3.4 Multi-Line Twts: What exactly do you think is bad about multi-line twts?
4.1 Hash Generation: I do like the idea with a new uuid metadata field! Any thoughts on two feeds selecting the same UUID for whatever reason? Well, the same could happen today with url.
5.1 Reply to last & 5.2 More work to backtrack: I do not understand anything you’re saying. Can you rephrase that?
8.1 Metadata should be collected up front: I generally agree, but if the uuid metadata field were a feed URL and not a real UUID, there should probably be an exception to allow changing the feed URL mid-file after relocation.
@sorenpeter@darch.dk Points 2 & 3 aren’t really applicable here in the discussion of the threading model, I’m afraid. WebMentions are completely orthogonal to the discussion. Further, no-one that uses Twtxt really uses WebMentions; whilst yarnd supports the use of WebMentions, it’s very rarely used in practice (if ever) – in fact I should just drop the feature entirely.
The use of WebSub OTOH is far more useful and is used by every single yarnd pod everywhere (not that there are that many around these days) to subscribe to feed updates in ~near real-time without having to poll constantly.
Some more arguments for a location-based threading model over a content-based one:
The format:
(#<DATE URL>)
or (@<DATE URL>)
both make sense: # as prefix is for a hashtag, like we already got with the (#twthash), and @ as prefix denotes that this is a mention of a specific post in a feed, and not just the feed in general. Using either can make implementation easier, since most clients already got this kind of filtering.
Having something like (#<DATE URL>) will also make mentions via webmentions for twtxt easier to implement, since there is no need for looking up the #twthash. This will also make it possible to build third-party twt-mention services.
Supporting twt/webmentions will also increase discoverability as a way to know about both replies and feed mentions from feeds that you don’t follow.
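For illustration, here is a minimal Python sketch of how a client might parse such a reference. The exact delimiters (angle brackets, a single space between date and URL) are an assumption based on the examples above, not a finalised syntax.

import re
from typing import NamedTuple, Optional

class TwtRef(NamedTuple):
    kind: str       # "reply" for '#', "mention" for '@'
    timestamp: str  # timestamp exactly as written in the source feed
    url: str        # URL of the referenced twtxt.txt

# Matches (#<2024-09-15T12:06:27Z https://example.com/twtxt.txt>) or the @ variant.
REF_RE = re.compile(r"\((?P<kind>[#@])<(?P<ts>\S+)\s+(?P<url>\S+)>\)")

def parse_ref(twt_text: str) -> Optional[TwtRef]:
    m = REF_RE.search(twt_text)
    if not m:
        return None
    kind = "reply" if m.group("kind") == "#" else "mention"
    return TwtRef(kind, m.group("ts"), m.group("url"))

print(parse_ref("(#<2024-09-15T12:06:27Z https://example.com/twtxt.txt>) nice idea!"))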
Reddington Shores - Long run (part I): 3.05 miles, 00:11:32 average pace, 00:35:10 duration
slower second half. more of a walk-run due to the sun blaring and high heart rate.
#running
Pinellas County - Long run (part I): 3.26 miles, 00:09:16 average pace, 00:30:14 duration
quicker half down to the condo. about two miles in i was already missing the west coast non-existent humidity.
#running
And they have arrived (well, they did around 3 hours ago, LOL). Buttery smooth, my 16 Pro (one with dark cover). It took a bit over an hour to transfer all my data.
Could compressed_subject(msg_singlelined) be configurable, so only a certain number of characters get displayed, ending on an ellipsis? Right now the entire twtxt is crammed into the Subject:. This request aims to make twtxts display on mutt/neomutt, etc. more like emails do.
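Not jenny’s actual code, just a sketch of the requested behaviour; the max_len value is made up for illustration.

def compressed_subject(msg_singlelined: str, max_len: int = 72) -> str:
    # Truncate a single-lined twt for use as an email Subject, ending on an ellipsis.
    if len(msg_singlelined) <= max_len:
        return msg_singlelined
    # Cut at the limit and avoid breaking mid-word where possible.
    cut = msg_singlelined[:max_len].rsplit(" ", 1)[0]
    return cut + "…"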
@movq@www.uninformativ.de yes, that’s perfect! <3
the stem matching is the same as how Git does its short commit hashes. i think you can stem it down to 2 or 3 sha bytes.
if a client sees someone in a yarn using a longer hash it can lengthen to match, since it can assume that maybe the other client has a collision that it doesn’t know about.
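A rough sketch of that idea (illustration only; whether stems are taken from the front or the back of the hash is a spec detail):

def resolve_stem(stem: str, known_hashes: list[str]) -> list[str]:
    # Return all known hashes the stem could refer to; more than one match
    # means the stem is ambiguous and the client should lengthen it, git-style.
    return [h for h in known_hashes if h.startswith(stem)]

known = ["nef6byq", "nefzzzq"]  # one real hash from below, one hypothetical
print(resolve_stem("nef", known))   # ambiguous: two candidates
print(resolve_stem("nef6", known))  # unique match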
There’s a simple reason all the current hashes end in a or q: the hash is 256 bits, the base32 encoding chops that into groups of 5 bits, and 256 isn’t divisible by 5. The last character of the base32 encoding just has that left-over single bit (256 mod 5 = 1).
So I agree with #3 below, but do you have a source for #1, #2 or #4? I would expect any lack of variability in any part of a hash function’s output would make it more vulnerable to attacks, so designers of hash functions would want to make the whole output vary as much as possible.
Other than the divisible-by-5 thing, my current intuition is it doesn’t matter what part you take.
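A quick way to see the divisible-by-5 effect (using SHA-256 here purely as a stand-in 256-bit digest):

import base64
import hashlib
import os

# Base32-encode a bunch of random 256-bit digests and collect the final character.
last_chars = set()
for _ in range(1000):
    digest = hashlib.sha256(os.urandom(16)).digest()          # 32 bytes = 256 bits
    encoded = base64.b32encode(digest).decode().rstrip("=").lower()
    last_chars.add(encoded[-1])

# 256 mod 5 = 1, so the last character carries a single data bit:
# it can only be 'a' (00000) or 'q' (10000).
print(sorted(last_chars))  # ['a', 'q']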
Taking the last n characters of a base32 encoded hash instead of the first n can be problematic for several reasons:
1. Hash Structure: Hashes are typically designed so that their outputs have specific statistical properties. The first few characters often have more entropy or variability, meaning they are less likely to have patterns. The last characters may not maintain this randomness, especially if the encoding method has a tendency to produce less varied endings.
2. Collision Resistance: When using hashes, the goal is to minimize the risk of collisions (different inputs producing the same output). By using the first few characters, you leverage the full distribution of the hash. The last characters may not distribute in the same way, potentially increasing the likelihood of collisions.
3. Encoding Characteristics: Base32 encoding has a specific structure and padding that might influence the last characters more than the first. If the data being hashed is similar, the last characters may be more similar across different hashes.
4. Use Cases: In many applications (like generating unique identifiers), the beginning of the hash is often the most informative and varied. Relying on the end might reduce the uniqueness of generated identifiers, especially if a prefix has a specific context or meaning.
In summary, using the first n characters generally preserves the intended randomness and collision resistance of the hash, making it a safer choice in most cases.
@movq@www.uninformativ.de I didn’t run the command as you recommended, but I wiped things once more, ran jenny -f, and this time got:
david@arrakis:~$ jenny -f
Fetching archived feed https://anthony.buc.ci/user/abucci/twtxt.txt/1 (configured as abucci, https://anthony.buc.ci/user/abucci/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2024-04.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://darch.dk/twtxt-archive.txt (configured as soren, https://darch.dk/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2024-04-21_6v47cua.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://twtxt.net/user/prologic/twtxt.txt/1 (configured as prologic, https://twtxt.net/user/prologic/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2024-03.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2022-12-21_2us6qbq.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://twtxt.net/user/prologic/twtxt.txt/2 (configured as prologic, https://twtxt.net/user/prologic/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2024-02.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2022-01-14_ew5gzca.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://twtxt.net/user/prologic/twtxt.txt/3 (configured as prologic, https://twtxt.net/user/prologic/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2024-01.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-12-23_f6y65bq.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://twtxt.net/user/prologic/twtxt.txt/4 (configured as prologic, https://twtxt.net/user/prologic/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-12.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-12-04_e4x7yba.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://twtxt.net/user/prologic/twtxt.txt/5 (configured as prologic, https://twtxt.net/user/prologic/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-11.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-11-18_42tjxba.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://twtxt.net/user/prologic/twtxt.txt/6 (configured as prologic, https://twtxt.net/user/prologic/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-10.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-11-08_i2wnvaa.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-09.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-10-23_kvwn5oa.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-08.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-10-11_mljudaa.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-07.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-09-22_5mkqwua.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-06.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-07-27_xcnzmlq.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-05.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-06-16_mtedqya.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-04.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-04-29_z7lvzja.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-03.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-03-19_xjabvhq.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-02.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-02-24_te4a6oa.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2023-01.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2021-01-26_qxgigma.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-12.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://www.uninformativ.de/twtxt-old_2020-12-13_igfnala.txt (configured as movq, https://www.uninformativ.de/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-11.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-10.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-09.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-08.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-07.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-06.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-05.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-04.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-03.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-02.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2022-01.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-12.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-11.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-10.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-09.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-08.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-07.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-06.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-05.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-04.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-03.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-02.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2021-01.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Fetching archived feed https://lyse.isobeef.org/twtxt-2020-12.txt (configured as lyse, https://lyse.isobeef.org/twtxt.txt)
Notice that @prologic@twtxt.net’s /6 is there. I found the twtxt then. Kind of odd it didn’t show before.
(replyto:http://darch.dk/twtxt.txt,2024-09-15T12:06:27Z)
I think I like this a lot. 🤔
The problem with using hashes always was that they’re “one-directional”: You can construct a hash from URL + timestamp + twt, but you cannot do the inverse. When I see just a hash, I have no idea what it could possibly refer to.
But of course something like (replyto:http://darch.dk/twtxt.txt,2024-09-15T12:06:27Z)
has all the information you need. This could simplify twt/feed discovery quite a bit, couldn’t it? 🤔 That thing that I just implemented – jenny asking some Yarn pod for some twt hash – would not be necessary anymore. Clients could easily and automatically fetch complete threads instead of requiring the user to follow all relevant feeds.
Only using the timestamp to identify a twt also solves the edit problem.
It even is better for non-Yarn clients, because you now don’t have to read, understand, and implement a “twt hash specification” before you can reply to someone.
The only problem, really, is that (replyto:http://darch.dk/twtxt.txt,2024-09-15T12:06:27Z)
is so long. Clients would have to try harder to hide this. 😅
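For illustration, resolving such a (replyto:…) reference could be as simple as this sketch. It assumes the referenced feed uses exactly the same timestamp string as the reference, which is a real caveat.

import urllib.request
from typing import Optional

def fetch_parent(url: str, timestamp: str) -> Optional[str]:
    # Resolve a (replyto:<url>,<timestamp>) reference by fetching the feed
    # and returning the twt whose line starts with that timestamp.
    with urllib.request.urlopen(url, timeout=10) as resp:
        feed = resp.read().decode("utf-8", errors="replace")
    for line in feed.splitlines():
        if line.startswith(timestamp + "\t"):
            return line.split("\t", 1)[1]
    return None

print(fetch_parent("http://darch.dk/twtxt.txt", "2024-09-15T12:06:27Z"))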
@aelaraji@aelaraji.com makes sense, probably. The twtxt was already in my Maildir, that’s why I can fetch it. I fetch every 3 minutes (sssh, don’t tell anyone!). LOL!
More:
Subject: The [tag URI scheme](https://en.wikipedia.org/wiki/Tag_URI_scheme) looks interesting. I like that it human read- and writable. And since we already got the timestamp in the twtxt.txt it would be
somewhat trivial to parse. But there are still the issue with what the name/id should be... Maybe it doesn't have to bee that stick? Instead of using `tag:` as the prefix/protocol, it would more it clear
what we are talking about by using `in-reply-to:` (https://indieweb.org/in-reply-to) or `replyto:` similar to `mailto:` 1. `(reply:sorenpeter@darch.dk,2024-09-15T12:06:27Z)' 2.
`(in-reply-to:darch.dk/twtxt.txt,2024-09-15T12:06:27Z)' 2. `(replyto:http://darch.dk/twtxt.txt,2024-09-15T12:06:27Z)' I know it's longer that 7-11 characters, but it's self-explaining when looking at the
twtxt.txt in the raw, and the cases above can all be caught with this regex: `\([\w-]*reply[\w-]*\:` Is this something that would work?
Subject: The [tag URI scheme](https://en.wikipedia.org/wiki/Tag_URI_scheme) looks interesting. I like that it human read- and writable. And since we already got the timestamp in the twtxt.txt it would be
somewhat trivial to parse. But there are still the issue with what the name/id should be... Maybe it doesn't have to bee that stick? Instead of using `tag:` as the prefix/protocol, it would more it clear
what we are talking about by using `in-reply-to:` (https://indieweb.org/in-reply-to) or `replyto:` similar to `mailto:` 1. `(reply:sorenpeter@darch.dk,2024-09-15T12:06:27Z)` 2.
`(in-reply-to:darch.dk/twtxt.txt,2024-09-15T12:06:27Z)` 3. `(replyto:http://darch.dk/twtxt.txt,2024-09-15T12:06:27Z)` I know it's longer that 7-11 characters, but it's self-explaining when looking at the
twtxt.txt in the raw, and the cases above can all be caught with this regex: `\([\w-]*reply[\w-]*\:` Is this something that would work?
Notice the difference? Soren edited, and broke everything.
Also, what are the chances that the same human will make two different posts within the same second?!
Just out of curiosity, what would happen someday if I (maybe trolling) edit my twtxt.txt file manually and swap a couple of twt timestamps, or add in 3 different twts manually with the same timestamp?
@mckinley@twtxt.net Thanks for the feedback.
- Yeah, I agree that the nick should not be part of the syntax. Any valid URL to a twtxt.txt file should be enough and is more clear, so it is not confused with an email (one of the issues with WebFinger and fediverse handles).
- I think any valid URL would work, since we are not bound to look for exact matches. Accepting both http and https, as well as gemini and gopher, could all work as long as the path to the twtxt.txt is the same.
- My idea is that you quote the timestamp as it is in the original twtxt.txt that you are referring to, so you can do it by simply copy/pasting. Also, what are the chances that the same human will make two different posts within the same second?!
Regarding the whole cryptographic-keys-for-identity idea, to me it seems like an unnecessary layer of complexity. If you move to a new house or city you tell people that you moved; you can do the same in a twtxt.txt. Just post something like “I moved to this new URL, please follow me there!” I did that with my feeds at least twice, and you guys still seem to read my posts :)
The tag URI scheme looks interesting. I like that it is human read- and writable. And since we already got the timestamp in the twtxt.txt it would be somewhat trivial to parse. But there is still the issue of what the name/id should be… Maybe it doesn’t have to be that strict?
Instead of using tag: as the prefix/protocol, it would make it more clear what we are talking about by using in-reply-to: (https://indieweb.org/in-reply-to) or replyto:, similar to mailto:, for example:
(reply:sorenpeter@darch.dk,2024-09-15T12:06:27Z)
(in-reply-to:darch.dk/twtxt.txt,2024-09-15T12:06:27Z)
(replyto:http://darch.dk/twtxt.txt,2024-09-15T12:06:27Z)
I know it’s longer than 7-11 characters, but it’s self-explaining when looking at the twtxt.txt in the raw, and the cases above can all be caught with this regex: \([\w-]*reply[\w-]*\:
Is this something that would work?
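To sanity-check that regex against the three variants, a quick Python sketch:

import re

# The regex proposed above, matching the opening of any "(…reply…:" marker.
REPLY_MARKER = re.compile(r"\([\w-]*reply[\w-]*\:")

samples = [
    "(reply:sorenpeter@darch.dk,2024-09-15T12:06:27Z)",
    "(in-reply-to:darch.dk/twtxt.txt,2024-09-15T12:06:27Z)",
    "(replyto:http://darch.dk/twtxt.txt,2024-09-15T12:06:27Z)",
]

for s in samples:
    print(bool(REPLY_MARKER.search(s)), s)  # all three print True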
I was not suggesting that everyone needs to set up a working WebFinger endpoint, but that we take the format of nick+(sub)domain as the base for generating the hash, together with the message date and content.
If we omit the protocol prefix from the way we do things now, will that not solve most of the problems? In the case of gemini://gemini.ctrl-c.club/~nristen/twtxt.txt they also have a working twtxt.txt at https://ctrl-c.club/~nristen/twtxt.txt … damn, I just noticed the gemini. subdomain.
Okay, what about defining a preferred protocol as part of the hash schema? So 1: https, 2: http, 3: gemini, 4: gopher?
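A sketch of what that could look like when choosing which URL to hash; the preference order is just the one suggested above, nothing standardised.

from urllib.parse import urlparse

# Suggested preference order: https, then http, then gemini, then gopher.
PROTOCOL_PREFERENCE = ["https", "http", "gemini", "gopher"]

def strip_protocol(url: str) -> str:
    # Drop the scheme so the hash input doesn't change when a feed is
    # mirrored over a different protocol (same host/path assumed).
    parsed = urlparse(url)
    return parsed.netloc + parsed.path

def pick_preferred(urls: list[str]) -> str:
    # Pick the most-preferred URL among known mirrors of the same feed.
    return min(urls, key=lambda u: PROTOCOL_PREFERENCE.index(urlparse(u).scheme))

mirrors = [
    "gemini://gemini.ctrl-c.club/~nristen/twtxt.txt",
    "https://ctrl-c.club/~nristen/twtxt.txt",
]
print(pick_preferred(mirrors))
print(strip_protocol("https://ctrl-c.club/~nristen/twtxt.txt"))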
Interesting.. QUIC isn’t very quick over fast internet.
QUIC is expected to be a game-changer in improving web application performance. In this paper, we conduct a systematic examination of QUIC’s performance over high-speed networks. We find that over fast Internet, the UDP+QUIC+HTTP/3 stack suffers a data rate reduction of up to 45.2% compared to the TCP+TLS+HTTP/2 counterpart. Moreover, the performance gap between QUIC and HTTP/2 grows as the underlying bandwidth increases. We observe this issue on lightweight data transfer clients and major web browsers (Chrome, Edge, Firefox, Opera), on different hosts (desktop, mobile), and over diverse networks (wired broadband, cellular). It affects not only file transfers, but also various applications such as video streaming (up to 9.8% video bitrate reduction) and web browsing. Through rigorous packet trace analysis and kernel- and user-space profiling, we identify the root cause to be high receiver-side processing overhead, in particular, excessive data packets and QUIC’s user-space ACKs. We make concrete recommendations for mitigating the observed performance issues.
@falsifian@www.falsifian.org In my opinion it was a mistake that we defined the first url
field in the feed to define the URL for hashing. It should have been the last encountered one. Then, assuming append-style feeds, you could override the old URL with a new one from a certain point on:
# url = https://example.com/alias/txtxt.txt
# url = https://example.com/initial/twtxt.txt
<message 1 uses the initial URL>
<message 2 uses the initial URL, too>
# url = https://example.com/new/twtxt.txt
<message 3 uses the new URL>
# url = https://example.com/brand-new/twtxt.txt
<message 4 uses the brand new URL>
In theory, the same could be done for prepend-style feeds. They do exist, I’ve come across them. The parser would just have to calculate the hashes afterwards and not immediately.
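A sketch of that last-encountered-URL rule for append-style feeds; the twt line format (timestamp, tab, text) is the standard one, everything else here is illustrative.

def assign_hash_urls(lines: list[str]) -> list[tuple[str, str]]:
    # Walk an append-style feed top to bottom, remembering the most recently
    # seen "# url =" value, and pair each twt with the URL it should be
    # hashed against.
    current_url = None
    result = []
    for line in lines:
        stripped = line.strip()
        if stripped.startswith("# url ="):
            current_url = stripped.split("=", 1)[1].strip()
        elif stripped and not stripped.startswith("#"):
            result.append((current_url, stripped))
    return result

feed = [
    "# url = https://example.com/initial/twtxt.txt",
    "2024-01-01T00:00:00Z\tmessage 1 uses the initial URL",
    "# url = https://example.com/new/twtxt.txt",
    "2024-02-01T00:00:00Z\tmessage 3 uses the new URL",
]
for url, twt in assign_hash_urls(feed):
    print(url, "->", twt)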
On the Subject of Feed Identities; I propose the following:
- Generate a Private/Public ED25519 key pair
- Use this key pair to sign your Twtxt feed
- Use it as your feed’s identity: in place of # url =, use # key = ...
For example:
$ ssh-keygen -f prologic@twtxt.net
$ ssh-keygen -Y sign -n prologic@twtxt.net -f prologic@twtxt.net twtxt.txt
And your feed would look like:
# nick = prologic
# key = SHA256:23OiSfuPC4zT0lVh1Y+XKh+KjP59brhZfxFHIYZkbZs
# sig = twtxt.txt.sig
# prev = j6bmlgq twtxt.txt/1
# avatar = https://twtxt.net/user/prologic/avatar#gdoicerjkh3nynyxnxawwwkearr4qllkoevtwb3req4hojx5z43q
# description = "Problems are Solved by Method" 🇦🇺👨💻👨🦯🏹♔ 🏓⚯ 👨👩👧👧🛥 -- James Mills (operator of twtxt.net / creator of Yarn.social 🧶)
2024-06-14T18:22:17Z (#nef6byq) @<bender https://twtxt.net/user/bender/twtxt.txt> Hehe thanks! 😅 Still gotta sort out some other bugs, but that's tomorrows job 🤞
...
The Twt Hash extension would, of course, change to use a feed’s ED25519 public key fingerprint.
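If I understand the current construction correctly (a 256-bit hash over identity + newline + timestamp + newline + content, base32-encoded and truncated), swapping the URL for the key fingerprint might look roughly like this sketch. It is an illustration only, not the proposal itself, so verify the details against the actual extension spec.

import base64
import hashlib

def twt_hash(identity: str, timestamp: str, content: str, length: int = 7) -> str:
    # `identity` would be the feed's key fingerprint, e.g.
    # "SHA256:23OiSfuPC4zT0lVh1Y+XKh+KjP59brhZfxFHIYZkbZs", instead of its URL.
    payload = "\n".join([identity, timestamp, content]).encode("utf-8")
    digest = hashlib.blake2b(payload, digest_size=32).digest()
    encoded = base64.b32encode(digest).decode("ascii").rstrip("=").lower()
    return encoded[-length:]

print(twt_hash("SHA256:23OiSfuPC4zT0lVh1Y+XKh+KjP59brhZfxFHIYZkbZs",
               "2024-06-14T18:22:17Z",
               "Hello world"))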
Pinellas County Running: 3.02 miles, 00:08:52 average pace, 00:26:46 duration
just needed to get out. first run since the PTC and felt great.
#running
Base: 3.00 miles, 00:10:35 average pace, 00:31:45 duration
test full gear, cool down with ice, and 3’/1’ pacing strategies.
#running #treadmill
Base: 3.00 miles, 00:10:01 average pace, 00:30:03 duration
testing cool down strategy with ice on the neck.
#running #treadmill
There’s an issue in yarnd that’s been around for a while and is still present in the current version I’m running that lets a person hit a constructed URL like
@prologic@twtxt.net This does not seem to fix the problem for me, or I’ve done something wrong. I did the following:
- Pull the latest version from git (I have commit 7ad848, same as on twtxt.net I believe)
- make build and make install
- Restart yarnd
- Refresh cache in Poderator Settings
Yet I still see these bogus /external
things on my pod when I hit URLs like the one I sent you recently. When I hit such a URL with curl
I think it’s giving an error? But in a web browser, the (buggy) response is the same as it was before I updated.
So, this problem is not fixed for me.
@lyse@lyse.isobeef.org I have no idea, I still haven’t found a repair shop I can trust with my monitor. As for the blackouts, they don’t have a consistent frequency. Sometimes it’s once every 3 months… other times it’s 3 times a day 😂
Pinellas County Running: 3.03 miles, 00:10:00 average pace, 00:30:18 duration
little sprinkle so tried to get out and enjoy it, but only caught a few minutes. just took it easy.
#running
You might have seen me popping up on IRC. This is how it looks:
That’s EZirc from the 1990s. (It says it needs Warp 4, but runs fine on Warp 3.)
Lots of this old stuff still works (technically), but as @lyse@lyse.isobeef.org said: A lot of it really is dead. There’s not much going on anymore in Usenet.
HTTP/2 differs from 1.x by becoming a binary protocol; it also multiplexes multiple streams over the same connection and can push related content to the browser to lower perceived latency.
HTTP/3 moves the binary protocol from HTTP/2 over to QUIC, which is based on UDP instead of TCP. This makes it better suited to mobile or unstable networks, where transmission errors can be handled at a higher level.
Does anyone know what the differences between HTTP/1.1 HTTP/2 and HTTP/3 are? 🤔
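If anyone wants to check what a given server actually negotiates, here is a small sketch using httpx (needs the h2 extra; httpx doesn’t speak QUIC, so HTTP/3 is not covered, and the URL is just an example):

import httpx  # pip install "httpx[http2]"

# Compare the protocol negotiated with and without HTTP/2 enabled.
for http2 in (False, True):
    with httpx.Client(http2=http2) as client:
        resp = client.get("https://www.cloudflare.com/")
        print(f"http2={http2}: negotiated {resp.http_version}")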
Pinellas County - Recovery: 3.25 miles, 00:10:04 average pace, 00:32:44 duration
relaxed running to flush the legs out a bit before bowling with some friends. both of my shoes came undone which has not happened in years to that was a bit odd.
#running
receieveFile())? 🤔
@stigatle@yarn.stigatle.no @prologic@twtxt.net testing 1 2 3 can either of you see this?
I deleted them all right before I sent my previous message, and already, a few minutes later, there are two more:
abucci@buc:~$ du -sh /tmp/yarnd-avatar-3*
1.8G /tmp/yarnd-avatar-3122347915
2.4G /tmp/yarnd-avatar-3533381443
What is this?
I am so tired of human stupidity. 1/3, I inevitably cross paths with these people… I can’t help but think of my students, my BEST students, more diligent and intelligent than the others, who left a loved one at the bottom of the Mediterranean… don’t talk to me tonight :/
Pinellas County - Cool down: 2.35 miles, 00:09:59 average pace, 00:23:27 duration
walked about ¾ mile and then started c/d run. felt strong so endurance is in a great spot!
#running
Asked ChatGPT 3 yesterday to tone down an angry message I had to send, as in make it still polite, but make sure to get the message across, and it succeeded pretty well.
Pinellas County - Fun run: 3.14 miles, 00:08:29 average pace, 00:26:39 duration
just a fun run in the rain
#running
Hmm…
Jun 19 23:31:38 yarn_init.sh[61567]: [yarnd] 2024/06/19 23:31:38 (127.0.0.1:40254) “POST /post HTTP/1.0” 200 0 3.402208ms
[…]
Jun 19 23:31:39 yarn_init.sh[61567]: [yarnd] 2024/06/19 23:31:39 (127.0.0.1:40262) “GET /post HTTP/1.0” 404 729 123.474001ms
Pinellas County - Fun run: 3.02 miles, 00:08:39 average pace, 00:26:09 duration
blow off stress between meeting followed by an excuse to go outside with no comms.
#running
Base: 3.19 miles, 00:09:50 average pace, 00:31:21 duration
finishing it up
#running #treadmill
@prologic@twtxt.net
1. allow only members of your pod to submit feeds
2. some kind of captcha (eew)
3. probationary status after submittal which requires review of some sort
4. rate limiting to slow down submissions from the same source
Recovery: 3.11 miles, 00:10:55 average pace, 00:33:54 duration
Pinellas County - Long run 3’(mod) [1’ rec]: 7.47 miles, 00:09:46 average pace, 01:13:02 duration
again practicing the 3’ on and 1’ off strategy. thinking i will have to just be flexible and adapt it as the day goes on for PTC. bit of a hot one out there today.
#running
Base: 3.11 miles, 00:09:48 average pace, 00:30:26 duration
quick 5km on the tread. pre-cinco de mayo meal.
#running #treadmill
Recovery: 3.11 miles, 00:10:41 average pace, 00:33:10 duration
Started writing something from scratch yesterday using thread(3) and wow do I miss writing in Limbo instead. :-/ #plan9
Whoo! 🥳 I won 2 of my 3 singles tonight! 🏓
If you’re using jenny on Python 3.12, it will spit out a deprecation warning regarding datetime.utcnow(). This will be fixed in the next release.
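The usual replacement is a timezone-aware now(); presumably the fix will be something along these lines (just my guess at the change, not jenny’s actual patch):

from datetime import datetime, timezone

# Deprecated since Python 3.12: returns a naive datetime.
# stamp = datetime.utcnow()

# Preferred: an aware datetime in UTC.
stamp = datetime.now(timezone.utc)
print(stamp.isoformat())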
Pinellas County - Long Run: 10.19 miles, 00:09:43 average pace, 01:39:03 duration
practicing 3 minutes running and one minute walking. not only for the knee but also for the PTC (~46.6 miles) coming in about 17 weeks. the knee actually hurt a little the first 5 miles but afterwards nothing. not sure if i finally found my stride but it felt great once the dull pain was gone.
#running