My open letter to the European Commission Digital Markets Act team:
Hello,
I am joining other developers concerned about Google's new plan to approve every app, which would effectively destroy most of the competing third-party stores. The biggest of these alternative stores, best known for its focus on user and developer privacy, already states that this would make it impossible for it to operate: https://f-droid.org/cs/2025/09/29/google-developer-registration-decree.html
Even communities like the XDA forum, where new developers are often introduced to the world of Android development, would likely be strongly impacted, as making, publishing and installing Android apps would become less accessible.
I am not just writing on their behalf; I run a small website myself (https://thecanine.ueuo.com/) that provides legal modifications for some Android apps - for example, adding an AMOLED dark theme to the most popular XMPP chat client for Android, or increasing the height of one of Android's keyboard apps. This all comes after Google's previous changes to the Android operating system, which prevent users from installing old apps ("old" to Google can mean only a couple of months without an update - https://developer.android.com/google/play/requirements/target-sdk - and the target version gets increased every year). I rely on apps developed by a single developer, even for things like making the pixel art presented on my website, and on sideloading as a way to make these apps work before developers can catch up to Google's new requirements - if Google is allowed to slowly kill these options, we digital artists will soon lose the tools we need to create digital art.
i'm torn between git-daemon and using a forgejo to store my flakes. i can do authz using yggdrasil addresses, but that's basically the limit. maybe that's not a bad thing.
@bender@twtxt.net We have quite a few that are basically part of our friendly neighborhood. They know we won't chase them away, scare them, etc. In fact some of us find little cockroaches to feed them, toss 'em up in the air and watch them sweep in and grab the little suckers 🤣
@alexonit@twtxt.alessandrocutolo.it Thanks mate! Ah cool, now I'm curious, what did you make? :-)
You used the rubber hammer to fold the metal, not to set the rivets, right? :-? I glued cork on my wooden mallet some time ago. This worked quite well for bending. But rubber might be even better as it is a tad softer. I will try this next time; I think I have one deep down in a drawer somewhere.
@zvava@twtxt.net @bender@twtxt.net At first I added it without thinking when planning the possible fields based on other UIs I was researching.
I was about to discard it, but after thinking about it a bit I noticed that services allowing a separate nick and display_name could unlock some good uses.
For example some added context or at-a-glance information like pronouns or statuses (like Artist [Accepting commissions] or App Name (v2.5)), while others used a more readable version of the nick (blog.domain.com became Person Name's Blog).
Of course it is absolutely optional and it can be safely ignored, but with my vision of being able to build more than a pure twtxt client, giving it first-class support just like the other known fields felt right to me.
@lyse@lyse.isobeef.org Great job!
I suggested it because I did it in the past, but never used it on bigger works.
In my case I did it exclusively on really small projects and used a thin rubber head hammer to prevent deforming the metal.
@movq@www.uninformativ.de I can confirm.
An intern practicing with turtle had an error when launching it the first time because it was missing tkinter, which it uses internally.
I experimented with a 2.4x7mm aluminium rivet I had on hand. As expected, it was quite a bit long. Using my pliers wrench, I was able to crush it down by quite some bit. I should have taken a photo right after the hand riveter for comparison. Now, it's much smoother and the chance of cutting my hand open is reduced by quite a bit. But breaking the burr with a few file strokes is still necessary. I should get 2.4x4mm rivets and try with them. I reckon they would be more suited for my 0.5mm sheet metal.
With the pliers wrench again, I was able to also crush down the chopped off 3mm copper nail and form a second head. That was surprisingly easy. Now, I need to figure out how to efficiently make a head on the remaining copper nail shaft, so that I can use this again.
Both are rock solid, there's absolutely no movement at all between the two sheet metal cutoffs.
@movq@www.uninformativ.de I never programmed with Tkinter myself and it's been ages since I ran a program which used it. I always thought that it looks awful. But maybe there are nicer themes these days. I just wanted to give the demo python3 -m tkinter a try, but this module doesn't exist. I was always under the wrong impression that Tkinter is bundled with Python.
From the chicken archive, 2017.
Not mine, these were more or less free roaming chickens. Farmers didn't use some of their fields for a while and allowed some other farmer to let the birds live there in the meantime.
@lyse@lyse.isobeef.org Xfce is nice, but it's also mostly GTK. I don't really know the answer yet. For now, I'll just avoid anything that uses GTK4.
For my own programs, I might have a closer look at Tkinter. I was complaining recently that I couldn't find a good file manager, so it might be an interesting exercise to write one in Python+Tkinter. (Or maybe that's too much work, I don't know yet.)
@movq@www.uninformativ.de can't you use generic drivers? I did that for an enterprise copier/printer/scanner we used to have at work, and it worked just fine!
All good things come to an end, I guess.
I have an Epson printer (AcuLaser C1100) and an Epson scanner (Perfection V10), both of which I bought about 20 years ago. The hardware still works perfectly fine.
Until recently, Epson still provided Linux drivers for them. That is pretty cool! I noticed today that they have relaunched their driver website – and now I can't find any Linux drivers for that hardware anymore. Just doesn't list it (it does list some drivers for Windows 7, for example).
I mean, okay, weāre talking about 20 years here. That is a very long time, much more than I expected. But if it still works, why not keep using it?
Some years ago, I started archiving these drivers locally, because I anticipated that they might vanish at some point. So I can still use my hardware for now (even if I had to reinstall my PC for some reason). It might get hacky at some point in the future, though.
This once more underlines the importance of FOSS drivers for your hardware. I sadly didn't pay attention to that 20 years ago.
Uuuhhh, that's rather interesting, I didn't know about that:
Aachen has been officially certified as "Bad Aachen", but for alphabetical reasons usually declines to use the prefix
– https://en.wikipedia.org/wiki/List_of_spa_towns_in_Germany#A
That made me chuckle.
It very much looks like the good @kat@yarn.girlonthemoon.xyz took her tin along to the spa town of Bad Gateway.
Sorry, this pun only works in German, where "Bad" means spa and is used as a prefix for spa towns.
Waste paper, like an opened envelope, suits a shopping list perfectly fine.
Indeed, I'm drowning in this stuff and I throw it away anyway, so I might just use it.
You've got nice handwriting, I like it.
Thanks. (It used to be horrible. Gosh, the teachers scolding me in school … Bah.)
@movq@www.uninformativ.de Not sure if this observation is correct. I know so many techies who use every latest shit and automate their homes, which is scary as hell to me.
@movq@www.uninformativ.de So damn true.
I have a friend who might lock himself out of his home if there's a power outage, while I keep removing apps and devices from my daily life instead.
I recently switched from all the todo apps I used to sticky notes on my monitors and a pocket notebook for sketching and quick notes.
Hello again everyone! A little update on my twtxt client.
I think it's finally shaping up a bit better now, but…
As I'm trying to put all the parts together, I decided to build multiple parallel UIs, to ensure I don't accidentally create a structure that is more rigid than planned.
I already decided on a UI that I would want to use for myself; it would be inspired by moshidon, misskey and some other "social feeds" mock-ups I found on dribbble.
I also plan on building a raw HTML version (for anyone wanting to do a full DIY client).
I would love to get any suggestions of what you would like to see (and possibly use) as a client, by sharing a link, app/website name or even a sketch made by you on paper.
I think I'll pick a third and maybe a fourth design to build together with the two already mentioned.
For reference, the screens I'm thinking of providing are (some might be optional or conditionally/manually hidable):
- Global / personal timeline screen
- Profile screen (with timeline)
- Thread screen
- Notifications screen or popup (both valid)
- DM list & chat screens (still planning, might come later)
- Settings screen (it'll probably be a hard-coded form, but better to mention it)
- Publish / edit post screen or popup (still analysing some use cases, as some "engines" might not have direct publishing support)
I also plan on adding two optional metadata fields:
display_name: To show a human-readable alternative for a nick, it falls back to nick if not defined
banner: Using the same format as avatar, but the expected image is wider, inspired by other social platforms
I also plan on supporting any metadata provided, including a dynamically parsable regex rule format for those extra fields. This should allow anyone to build new clients that don't limit themselves to just the social aspect of twtxt; I'm hoping to see unique ways of using twtxt! A rough example of such a feed header is sketched below.
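To make that concrete, here is a rough sketch of what such a feed header could look like; the display_name and banner fields follow my own proposal above, and the pronouns line is a purely hypothetical extra field, not part of any spec:

    # nick = example
    # url = https://example.com/twtxt.txt
    # avatar = https://example.com/avatar.png
    # display_name = Example Person [Accepting commissions]
    # banner = https://example.com/banner.png
    # pronouns = they/them

If display_name is missing, a client would simply keep showing nick, so existing feeds stay untouched.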
@bender@twtxt.net The first format uses the subject extension, while the other is a new format inspired by the mentions format. The first one should be compatible, but I'm not sure: if it's used verbatim by the client it would work, but if we consider the new proposal for it to have an optional part, it won't work on clients without changes.
@movq@www.uninformativ.de While using a fragment is pretty nice, I think we can have a twtxt-only format if the formatting seems to be a problem.
@lyse@lyse.isobeef.org I think it will be bad if handled incorrectly.
The client must reference both properly or it will miss posts; including both this way is a bit pointless if you can't use the hash or the url separately.
Since it's highly likely a breaking change anyway, I think @zvava@twtxt.net's proposal looks much better.
But you know what still works: my squeeze filler (didn't even refill it) and my old (super cheap) calligraphy set … I'll just use that.
https://movq.de/v/f48c7cda09/IMG_20251001_200317.jpg.jpg
https://movq.de/v/f48c7cda09/IMG_20251001_202438.jpg.jpg
@movq@www.uninformativ.de huh, firefox actually does seem to tolerate the dashes in the fragment. also, i did propose simply using an anchor link first, but prologic was not a fan of this :p
Spooky season is upon us, so I can take a month break from being a paper clip.

url metadata field unequivocally treated as the canon feed url when calculating hashes, or are they ignored if they're not at least proper urls? do you just tolerate it if they're impersonating someone else's feed, or pointing to something that isn't even a feed at all?
(#abcdefghijkl https://example.com/tw.txt#:~:text=2025-10-01T10:28:00Z), because it can be simply hacked in to clients currently on hashv1 and provides an off-ramp to location-based addressing
I like that property (an off-ramp to location-based addressing), so I think I could live with that approach.
(I'm not sure why we're using text fragments, though. Wouldn't that link to the first occurrence of 2025-10-01T10:28:00Z? That's not necessarily correct. And, to be proper URLs that Firefox and Chromium understand, it would also need to be written as 2025%2D10%2D01T10:28:00Z. The dash carries meaning, sadly. I think all this just creates needless complication. How about we just go with https://example.com/tw.txt#2025-10-01T10:28:00Z?)
url metadata field unequivocally treated as the canon feed url when calculating hashes, or are they ignored if they're not at least proper urls? do you just tolerate it if they're impersonating someone else's feed, or pointing to something that isn't even a feed at all?
@zvava@twtxt.net My client trusts the first url field it finds. If there is none, it uses the URL that I'm using for fetching the feed.
No validation, no logging.
In practice, I've not seen issues with people messing with this field. (What I do see, of course, is broken threads when people do legitimate edits that change the hash.)
I don't see how anyone could impersonate anybody else this way. Sure, you could use my URL in your url field, but then what? You will still show up as zvava in my client or, if you also change your nick field, as movq (zvava).
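In code, that selection boils down to almost nothing; a minimal sketch (not my client's actual code, just the idea) could look like this:

    // Pick the URL used for hashing from the raw feed text;
    // fall back to the URL the feed was fetched from.
    function hashingUrl(feedText, fetchUrl) {
        for (const line of feedText.split('\n')) {
            const match = line.match(/^#\s*url\s*=\s*(\S+)/);
            if (match) return match[1];   // first url field wins
        }
        return fetchUrl;
    }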
url metadata field unequivocally treated as the canon feed url when calculating hashes, or are they ignored if they're not at least proper urls? do you just tolerate it if they're impersonating someone else's feed, or pointing to something that isn't even a feed at all?
@zvava@twtxt.net Yes, the specification defines the first url to be used for hashing. No matter if it points to a different feed or whatever. Just unsubscribe from malicious feeds and you're done.
Since the first url is used for hashing, it must never change. Otherwise, it will break threading, as you already noticed. If your feed moves and you wanna keep the old messages in the same new feed, you still have to point to the old url location and keep that forever. But you can add more urls. As I said several times in the past, in hindsight, using the first url was a big mistake. It would have been much better if the last encountered url were used for hashing onwards. This way, feed moves would be relatively straightforward. However, that ship has sailed. Luckily, feeds typically don't relocate.
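So the header of a moved feed ends up looking roughly like this (example.org domains, obviously made up), with the original location kept first so that old hashes stay stable:

    # old location, kept first because it is used for hashing
    # url = https://old.example.org/twtxt.txt
    # new location, added afterwards
    # url = https://new.example.org/twtxt.txt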
@zvava@twtxt.net Ah okay. Yeah, so far, yarn looks pretty nice and is easy to use. I might even selfhost my own instance too!
url metadata field unequivocally treated as the canon feed url when calculating hashes, or are they ignored if they're not at least proper urls? do you just tolerate it if they're impersonating someone else's feed, or pointing to something that isn't even a feed at all?
@zvava@twtxt.net That was my greatest concern with how it is currently handled; I'm afraid to break threads even by fixing a typo.
Handling it via the pod might work, but I think it's not the best approach; external feeds and clients don't usually use a pod API but their own implementation, so any workaround won't work there.
That's why my proposals addressed those issues:
- the idea of using a "key" instead of the url (with the url as a fallback); the key could even be a public key, so it can be used verifiably in crypto functions
- using the timestamp to prevent content changes from breaking threads (plus being simpler to implement)
- using an explicit thread reference with an alternative subject format (like [#THREAD_ID] Hello world and replies with (#REPLY_ID) Ahoy), so the content can change without affecting the thread reference and anyone can use their own schemes freely (see the sketch below)
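To illustrate the last point, a thread under that scheme might look something like this (timestamp and content tab-separated as usual; the id is made up, and whether the reply id equals the thread id is still an open detail of my proposal):

    2025-10-01T10:00:00Z	[#a1b2c3d] Hello world
    2025-10-01T10:05:00Z	(#a1b2c3d) Ahoy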
@zvava@twtxt.net What is a nice twtxt client to use?
@lyse@lyse.isobeef.org I can suggest a trick for doing a "cold" weld.
Using a copper wire or a similarly malleable material, pass it through a drilled hole, hammer it on one end until flat, then do the same on the other side.
It does the same job as a rivet but it's flatter and looks nicer on both sides; it's of course weaker, but still strong enough for small objects.
It's sometimes used to reduce the risk of heat deformation in hand-crafted jewelry and to reduce the cost of small tools.
@zvava@twtxt.net CORS is our worst enemy. 🥷
I had the same issue, since it's a browser-based request, so the only solution is using a proxy.
For testing (and real personal use) I rely on this one https://corsproxy.io/.
In my client, I first check if the source allows me to fetch it without issues, and fall back to prefixing with a proxy if it gives an error.
For security reasons you don't get a readable error for CORS, so you must use a catch-all for that; if it fails again with the proxy, you can deal with any other errors it throws as you normally would (preferably outside of the fetch function).
After the fetch responds, I store the response.url value to fetch it again for updates without having to do extra calls (you can store it verbatim or as a flag to be able to change the proxy later).
Here's an extract of my code:
export async function fetchWithProxy(url, proxy=null) {
    // Try a direct fetch first; a CORS failure rejects the promise.
    return await fetch(url).catch(err => {
        if (!proxy) throw err;
        // Retry through the CORS proxy, appending the encoded original URL.
        return fetch(`${proxy}${encodeURIComponent(url)}`);
    });
}
// Using it with
const res = await fetchWithProxy('https://twtxt.net/user/zvava/twtxt.txt', 'https://corsproxy.io/?');
// Get the working url (direct or through proxy)
const fetchingURL = res.url;
// Get the twtxt feed content (or handle errors)
const text = await res.text();
I also plan to allow the user to define a custom proxy field. I like the solution used by Delta.chat in their Android app, where you can define the URL format with a variable, like https://my-proxy?$TWTXT_URL, since it gives you the freedom to use any proxy that doesn't follow a prefix format.
If the idea of using a third-party proxy is not to the user's liking, they can use a self-hosted solution like cors-anywhere or build their own (with twtxt it should just be a GET).
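Supporting that template is trivial; here is a minimal sketch (the function name and the $TWTXT_URL placeholder convention are just my working assumptions, nothing is final):

    // Build the proxied URL from a user-defined template.
    // "$TWTXT_URL" is replaced with the encoded feed URL; a template
    // without the placeholder is treated as a plain prefix proxy.
    function applyProxyTemplate(template, url) {
        const encoded = encodeURIComponent(url);
        return template.includes('$TWTXT_URL')
            ? template.replace('$TWTXT_URL', encoded)
            : `${template}${encoded}`;
    }

    // applyProxyTemplate('https://my-proxy?$TWTXT_URL', 'https://twtxt.net/user/zvava/twtxt.txt')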
Hopefully I can muster up the energy to start this new project:
Put up lots of thermometers and hygrometers in the apartment, have them report their readings wirelessly to a database.
I suspect that I'll have to "build" these myself, because ready-to-use kits most likely require some sort of cloud service. Dunno, haven't checked yet.
@bender@twtxt.net Yes and no.
To build a compliant PWA you need to provide a web app manifest (JSON) and a service worker.
Those requirements are not directly part of this project.
You can build the client as a standalone PWA or even as a widget inside an existing page.
The general steps are closer to how you would include a third-party library in an existing project, by importing it as a dependency and using it in your website.
I'm pretty sure most users would expect a PWA (me included), so I plan to provide a ready-made template that can be deployed as is.
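For reference, the bare minimum for that template is roughly a manifest plus a registered service worker; a tiny sketch with placeholder names and paths:

    { "name": "My twtxt client",
      "short_name": "twtxt",
      "start_url": "/",
      "display": "standalone",
      "icons": [{ "src": "icon-192.png", "sizes": "192x192", "type": "image/png" }] }

    // in the page's main script:
    if ('serviceWorker' in navigator) navigator.serviceWorker.register('/sw.js');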
Hi everyone, here's a little introduction to my twtxt client (still WIP).
The client I'm developing is a single-tenant project that runs entirely in the browser (it might use an optional backend).
It's entirely based on native web components and vanilla JS; it is designed to act closer to a toolkit than a full-fledged client, allowing users to "DIY" their own interface with pure HTML or plain JavaScript functions.
Users can also build their own engines by including a global JavaScript object that implements the defined internal API (TBD).
I'm planning to build a system that is easy enough to build and use at any skill level, using only pure HTML (with a homebrew minimal template engine) or via plain JS (I'll also be providing some pre-made templates).
Everything can be self-hosted on any static hosting provider; this allows twtxt to spread within communities like Neocities and similarly hosted websites (basically any Indieweb/Smallweb/Digital garden website and any of the common GitHub/Lab/Berg/lify Pages).
It will probably be named something like TxtCraft or craf.txt, but I'm not really sure yet… (Maybe some suggestions could help)
I'm still in the experimental phase, so there's no decent source code to share yet, but there will be soon enough!
I think I'm just about ready to go live with my new blog (migrated from MicroPub). I just finished migrating all of the content over, fixing up metadata, cleaning up, and migrating and optimizing media.
The new blog for prologic.blog, soon to be powered by zs using the zs-blog-template, is coming along very nicely. It was actually pretty easy to do the migration/conversion in the end. The results are not too shabby either.
Before:
- ~50MB repo
- ~267 files
After:
- ~20MB repo
- ~88 files
Pretty happy with my zs-blog-template starter kit for creating and maintaining your own blog using zs. Demo of what the starter kit looks like here – basic features include:
- Clean layout & typography
- Chroma code highlighting (aligned to your site palette)
- Accessible copy-code button
- "On this page" collapsible TOC
- RSS, sitemap, robots
- Archives, tags, tag cloud
- Draft support (hidden from lists/feeds)
- Open Graph (OG) & Twitter card meta (default image + per-post overrides)
- Ready-to-use 404 page
As well as custom routes (redirects, rewrites, etc.) to support canonical URLs or redirecting old URLs, plus the new zs external command capability itself that now lets you do things like:
$ zs newpost
to help kick-start the creation of a new post with all the right "stuff"™ ready to go and then pop open your $EDITOR
@prologic@twtxt.net I too self-host various services on a VPS (and am considering buying a mini PC to keep at home instead).
I use most of it as a hosting platform for personal use only and as a remote development environment (I do share a couple of tools with a friend though).
But given the constant risks of DDoS, hacking, bots, etc. I keep all of my public-facing resources purely static and on separate hosting providers (without lock-ins, of course).
Lately, I began using homebrew PWAs with CouchDB as a sync database; this way I get a fantastic local-first experience and also have total control of my data, which also syncs to a locally hosted backup instance in real time.
Also, I was already aware of Salty.im, but what I'm thinking of is a more feature-complete solution that even my family can use quickly; Delta.chat with the new chatmail provider (self-hostable) might be the solution for my needs.
But I'm still wondering if it's worth the trouble. I might just drop everything and only use safe channels to speak with them (free 24/7 family tech support is easy to manage).
Also, I'll be waiting for the day you share your story with us; I'm pretty curious about it!
@movq@www.uninformativ.de You didn't miss anything. Just time for more useful stuff. ;-)
Exactly, @zvava@twtxt.net, I agree. (Although, in my client at least, I wouldn't use hashes anywhere.)
@prologic@twtxt.net Hm, I don't know. Over here, we have parties that we would call "left" or "right"; one of them even calls itself "The Left". No idea about your political landscape, but it still makes sense for us. For me, at least.
I meant "jlj". He used to be at https://twt.nfld.uk/, long gone now too. I wonder…
https://zsblog.mills.io/ for anyone interested. I think I still have some small tweaking to do before I use this for realz.
@alexonit@twtxt.alessandrocutolo.it Yeah, I think we're overstating the UNIX principles a bit here 🤣 I get what you're trying to say though, @zvava@twtxt.net. If I could go back in time and do it all over again, I would have gotten the hash length correct and I would have used SHA-256 instead. But someone way smarter than me designed the Twt Hash spec, we adopted it, and well, here we are today, it works™
@alexonit@twtxt.alessandrocutolo.it Well, we have to really use the same spec or threading doesn't really work in a truly decentralized manner
That's what I'm using right now, while my own client is still in the making.
A simple bash script to write a post in a mktemp file then clean it with regex.
I don't even bother to hash the replies; I just open https://twtxt.net and copy the hash by hand, since I'm checking the new posts from there anyway (temporarily, as I might end up DoS-ing everyone's feed in my client right now).
@prologic@twtxt.net to clarify: i meant the ability to parse feeds using unix command line utilities, as a principle of twtxtv1's design. i'm not sure how feasible it is to build a simple feed reader out of common scripting utilities when hashing is in play, and;
i concede, it does make a lot of sense to fix up the hashing spec rather than completely supplant it at this point, just thinking about what the rewrite would be like is dreadful in and of itself x.x
Put another way, what you are proposing/pushing for requires hundreds of lines of code to change across a half dozen or so clients and lots of breaking changes, not to mention unknowns.
What I want us to do is make only a half dozen or so lines of code changes to our clients and minimize the breaking changes and unknowns.