Searching yarn

Twts matching #http

WOW LOL

fetch https://weaknotes.com/users/david: status 500 Internal Server Error

First real test failed trying to look up / follow @david@weaknotes.com
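(For context: resolving a fediverse handle like this usually takes two steps — a WebFinger lookup per RFC 7033 turns the handle into an actor URL, then the ActivityPub actor document is fetched from that URL. The 500 above looks like the second step failing on the remote end. A minimal sketch of the flow, in illustrative Python, not the bridge's actual code:)

```python
import json
import urllib.parse
import urllib.request

handle = "david@weaknotes.com"
user, host = handle.split("@")

# Step 1: WebFinger (RFC 7033) resolves the handle to an actor URL.
wf_url = (f"https://{host}/.well-known/webfinger"
          f"?resource=acct:{urllib.parse.quote(handle)}")
with urllib.request.urlopen(wf_url) as resp:
    jrd = json.load(resp)
actor_url = next(link["href"] for link in jrd["links"]
                 if link.get("rel") == "self")

# Step 2: fetch the ActivityPub actor document -- the kind of request
# that failed above with "status 500 Internal Server Error".
req = urllib.request.Request(
    actor_url, headers={"Accept": "application/activity+json"})
with urllib.request.urlopen(req) as resp:  # raises HTTPError on a 500
    actor = json.load(resp)
print(actor.get("inbox"))
```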


For those curious, the new Twtxt <-> ActivityPub bridge I’m building (bidirectional) simply requires three things:

  1. You register your Twtxt feed with the bridge: https://bridge.twtxt.net
  2. You verify that you in fact own/control the feed by putting the verification code somewhere on/in your feed (doesn’t matter where or how)
  3. You proxy/forward requests for /.well-known/webfinger to the bridge at bridge.twtxt.net (one way to do this with nginx is sketched right below).
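For step 3, assuming nginx sits in front of the domain your feed lives on, a single location block should do. A rough sketch (my illustration, not official bridge documentation):

```nginx
location /.well-known/webfinger {
    # proxy_pass without a URI part forwards the original request
    # URI and query string (?resource=acct:...) unchanged.
    proxy_pass            https://bridge.twtxt.net;
    proxy_set_header Host bridge.twtxt.net;
    proxy_ssl_server_name on;  # send SNI in the upstream TLS handshake
}
```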

I’m still testing things and ironing out bugs 🐛 Please be patient! 🙏

In-reply-to » @lyse then it was, most likely, space debris—which, sadly, makes up 98% of all space anomalies these days. And though they have applied to the Grant Wishes Council, they are yet to be approved. Keep playing, though. 😅

@bender@twtxt.net Hahahahahaahaaa, you’re right, it can’t be anything else! :‘-D Must have been one of these manmade objects. Let’s hope they will become a full member of the Grant Wishes Council soon. In any case, I will keep trying.

In-reply-to » @lyse I hope you were prepared to cram those wishes in 3 seconds. I am always prepared for that eventuality. You don't have to mutter a word, nor clearly think much about it---that is, you don't need to think your wish(es) word-by-word. As long as you stay within the wish(es) main goal(s), you should be fine, and it/they shall be granted, of course.

@lyse@lyse.isobeef.org then it was, most likely, space debris—which, sadly, makes up 98% of all space anomalies these days. And though they have applied to the Grant Wishes Council, they are yet to be approved. Keep playing, though. 😅

In-reply-to » @lyse I hope you were prepared to cram those wishes in 3 seconds. I am always prepared for that eventuality. You don't have to mutter a word, nor clearly think much about it---that is, you don't need to think your wish(es) word-by-word. As long as you stay within the wish(es) main goal(s), you should be fine, and it/they shall be granted, of course.

@bender@twtxt.net I wished my mate would see it, too. But he turned his head a second too late. :-(

In-reply-to » On today's night walk I came across an absolutely giant shooting star. With it being visible for three seconds, it's the second largest I've ever seen so far.

@lyse@lyse.isobeef.org I hope you were prepared to cram those wishes in 3 seconds. I am always prepared for that eventuality. You don’t have to mutter a word, nor clearly think much about it—that is, you don’t need to think your wish(es) word-by-word. As long as you stay within the wish(es) main goal(s), you should be fine, and it/they shall be granted, of course.

In-reply-to » @lyse Yeah, I noticed that too. I haven’t double-checked my code, though. Maybe it has something to do with selecting the correct URL? I mean, these feeds don’t have any # url = fields, so maybe that’s it?

@movq@www.uninformativ.de Haha, you were spot on! It took me a bit to figure this out on my own. I’m actually very surprised to have gotten this wrong. Oh well.

In-reply-to » Hmmm, looks like my twt hash algorithm implementation calculates incorrect values. Might be the tilde in the URL that throws something off. :-? At least yarnd and jenny agree on a different hash.

@lyse@lyse.isobeef.org Yeah, I noticed that too. I haven’t double-checked my code, though. Maybe it has something to do with selecting the correct URL? I mean, these feeds don’t have any # url = fields, so maybe that’s it?
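(For anyone following along: this is roughly how the Twt Hash extension computes hashes, with the URL selection made explicit. A hedged Python sketch, not jenny's or yarnd's actual code; the two example calls show how picking a different feed URL yields a different hash.)

```python
import base64
import hashlib

def twt_hash(url: str, created: str, content: str) -> str:
    # url should be the feed's first '# url =' value, falling back to
    # the URL the feed was fetched from -- picking the wrong one is
    # exactly the kind of mismatch discussed above.
    payload = f"{url}\n{created}\n{content}".encode("utf-8")
    digest = hashlib.blake2b(payload, digest_size=32).digest()
    b32 = base64.b32encode(digest).decode("ascii").rstrip("=").lower()
    return b32[-7:]

# Same twt, two candidate URLs -> two different hashes:
print(twt_hash("https://example.org/~user/twtxt.txt",
               "2020-07-18T12:39:52Z", "Hello world"))
print(twt_hash("https://example.org/user/twtxt.txt",
               "2020-07-18T12:39:52Z", "Hello world"))
```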

In-reply-to » @lyse nginx allows per-user logging, using variables defined in the configuration. Not sure, though, if a Tilde would be willing to go to those “extremes”.

@bender@twtxt.net Hmm, didn’t find anything. But you mean a giant bucketload of access_log /home/$USER/logs/access.log if=… where the condition matches the requested path for said user? Yeah, that gets annoying very quickly. :-D
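Roughly what that bucketload would look like, one map/access_log pair per user (a sketch with a made-up username, not tilde.club's actual config):

```nginx
map $request_uri $is_alice {
    ~^/~alice(/|$|\?)  1;   # requests under /~alice
    default            0;
}

server {
    # ... the existing server config ...
    # The line below is skipped entirely when $is_alice is "0".
    access_log /home/alice/logs/access.log combined if=$is_alice;
}
```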

In-reply-to » My goodness, a new level of stupidity.

This looks like a botnet, to be honest. The IPs are all over the place. Ethiopia, Brazil, Kenya, Lebanon, Netherlands, … I mean, that’s the logical thing to do, isn’t it? Do your web crawling on infected PCs. Nobody will block those, because those are the same IP ranges as legitimate requests. And obviously you don’t have to pay for computing time.

… and they all send invalid HTTP requests, all answered with HTTP 400 … How silly.

In-reply-to » Hmm, so it seems this Mike is the one who inherited it: https://tilde.club/~deepend/, but not too active anywhere, though pinging “deepend” on Libera might work...

@lyse@lyse.isobeef.org nginx allows per-user logging, using variables defined in the configuration. Not sure, though, if a Tilde would be willing to go to those “extremes”.

In-reply-to » Hmm, so it seems this Mike is the one who inherited it: https://tilde.club/~deepend/, but not too active anywhere, though pinging “deepend” on Libera might work...

@bender@twtxt.net Sounds about right.

I had a brainfart yesterday, though. For whatever reason I thought of subdomains, which are modeled with server blocks in nginx. So each could define its own access_log location. However, there are no subdomains in place! Searching around, I didn’t find any solution to give each user their own access log file.

One way would be a cronjob (er, a systemd timer, as I learned the other day) that greps the main access log and writes per-user access log files containing only the relevant entries.
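The splitting step of such a timer could be tiny. A sketch (hypothetical paths, assuming /~user/ request paths in the combined log format; rotation and appending are left out):

```python
import re
from collections import defaultdict

USER_RE = re.compile(r'"[A-Z]+ /~([A-Za-z0-9_-]+)[/ ?]')  # e.g. "GET /~alice/...

per_user = defaultdict(list)
with open("/var/log/nginx/access.log", encoding="utf-8",
          errors="replace") as f:
    for line in f:
        m = USER_RE.search(line)
        if m:
            per_user[m.group(1)].append(line)

for user, entries in per_user.items():
    with open(f"/home/{user}/logs/access.log", "w",
              encoding="utf-8") as out:
        out.writelines(entries)
```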


My goodness, a new level of stupidity.

The bots are now doing things like this:

GET http://uninformativ.de/projects/lariza/feednotify/datenstrahler/slinp/countty HTTP/1.1
  1. That URL does not exist.
  2. By using the absolute URI http://uninformativ.de/… as the request target, the client asks the webserver to act as an HTTP proxy. Of course, this isn’t allowed on my webserver (and shouldn’t be allowed on any normal webserver), resulting in HTTP 400. And even if it were, the target would be the exact same server, making a proxy request unnecessary.

And of course, it’s not just 50 hits like this or 100 or 1’000 or 10’000. No, it’s over 150’000 in the last 2 days. All from vastly different IP ranges of different cloud hosting providers.
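(Numbers like these are easy to pull out of the access log. A quick sketch — hypothetical log path, assuming the request line is quoted as in the combined format:)

```python
import re
from collections import Counter

ips = Counter()
with open("/var/log/nginx/access.log", encoding="utf-8",
          errors="replace") as f:
    for line in f:
        # Proxy-style request lines look like: "GET http://... HTTP/1.1"
        if re.search(r'"[A-Z]+ https?://', line):
            ips[line.split()[0]] += 1  # client IP is the first field

print(sum(ips.values()), "proxy-style requests from",
      len(ips), "addresses")
```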

This almost looks like a DDoS attack, but it’s just completely stupid. This feels more like some idiot vibe-coded a crawler.
