when writing a new tool/software, write the doc first, explaining how it works. Then actually writing the code is much easier :)
Forensics Tools https://github.com/mesquidar/ForensicsTools
@eapl.me@eapl.me I have many fond memories of Turbo Pascal and Turbo C(++). They really did have a great help system. And debug tools! It's rare for language docs to be as approachable. QBasic was great. As were the PHP docs when I first came into web.
Obligatory Twtxt post: I love how I can simply use a terminal window and some very basic tools (echo, scp, ssh) to publish thoughts, as they pop up, onto the Internet in a structured way, that can be found and perhaps even appreciated.
@lyse@lyse.isobeef.org I wish more standardization around distributed issues and PRs within the repo, à la git-bug, was around for this. I see it has added some bridge tooling now.
Anyone have any ideas how you might identify the processes (PIDs) on a Linux machine that are responsible for most of the disk I/O on that machine and subsequently causing high I/O wait times for other processes?
Important bit: The machine has no access to the internet, there are hardly any standard tools on it, etc. So I have to get something onto it "air-gapped". I have terminal access to it, so I can do interesting things like base64-encode a static binary to my clipboard, paste it to a file, then base64-decode it and execute. That's about the only mechanism I have.
@prologic@twtxt.net do not use it, but gave it a try early on and was not impressed. it gave a good outline of what I asked but then unreliably dorked up all the crucial parts.
I will say though if it is truly learning at the rate they say then it should be a good tool.
In setting up my own company and its internal tools and services and supporting infrastructure, the only thing I haven't figured out how to solve "really well" is Email, Calendar and Contacts. All the options that exist "suck". They suck either in terms of "operational complexity and overheads" or "a poor user experience".
@prologic@twtxt.net Horseshit hype:
- AI that we have today cannot think: there is no cognitive capacity
- AI that we have today cannot be interviewed: "inter"-"viewing" is two minds interacting, but AI of today has no mind, which means this is a puppet show
- AI today is not free: it's a tool, a machine, hardly different from a hammer. It does what a human directs it to do and has no drives, desires, or autonomy. What you're seeing here is a fancy Mechanical Turk
This shit is probably paid for by AI companies who desperately want us to think of the AI as far more capable than it actually is, because that juices sales and gives them a way to argue they arenāt responsible for any harms it causes.
I'd love to read the original source code of this:
https://ecsoft2.org/t-tiny-editor
This was our standard editor back in the day, not an "emergency tool". And it's only 9kB in size … which feels absurd in 2023. The entire hex dump fits on one of today's screens.
Being so small meant it had no config file. Instead, it came with TKEY.EXE, a little tool to binary-patch T.EXE to your liking.
Yep, that's right, we have to use these tools in a proper way; the terminal is not a friendly tool to use for this kind of stuff on mobile devices, and web interfaces are prepared to give us a comfortable space.
Btw, I'm waiting for your PHP-based client, no pressure…
[lang=en] That was the reason for twtxt-php =P
I tried using CLI tools but it was too hacky, I think.
More so if we consider Jakob's Law, where we have prior expectations of a microblogging system.
A Web interface could be quite minimalistic and usable as well. (And mobile-friendly)
I've been on snac/the fediverse for a few days and already I've had to mute somebody. I know I come on strongly with my opinions sometimes and some people don't like that, but this person had already started going ad hominem (in my reading of it), and was using what felt to me like sketchy tactics to distract from the point I was trying to make and to shut down conversation. They were doing similar things to other people in the thread, so rather than wait for it to get bad for me I just muted them. People get so weirdly defensive so fast when you disagree with something they said online. Not sure I fully understand that.
@prologic@twtxt.net Well, you can mute or block individual users, and you can mute conversations too. I think the tools for controlling your interactions aren't so bad (they could definitely be improved ofc). And in my case, I was replying to something this person said, so it wasn't outrageous for his reply to be pushed to me. Mostly, I was sad to see how quickly the conversation went bad. I thought I was offering something relatively uncontroversial, and actually I was just agreeing with and amplifying something another person had already said.
What I see here is that when I was reading your .txt, the timestamp was like 40 minutes later than the current time. Say it's 1:00 pm and that twt is timestamped 1:40 pm.
No idea why; perhaps your server has a wrong timezone, or your twtxt tool is doing some timezone conversion?
Google Says It'll Scrape Everything You Post Online for AI
Google updated its privacy policy over the weekend, explicitly saying the company reserves the right to scrape just about everything you post online to build its AI tools.
Google can eat shit.
Seems to me you could write a script that:
- Parses a StackOverflow question
- Runs it through an AI text generator
- Posts the output as a post on StackOverflow
and basically pollute the entire information ecosystem there in a matter of a few months? How long before some malicious actor does this? Maybe it's being done already.
What an asinine, short-sighted decision. An astonishing number of companies are actively reducing headcount because their executives believe they can use this newfangled AI stuff to replace people. But, like the dot com boom and subsequent bust, many of the companies going this direction are going to face serious problems when the hypefest dies down and the reality of what this tech can and can't do sinks in.
We really, really need to stop trusting important stuff to corporations. They are not tooled to last.
@shreyan@twtxt.net I agree re: AR. Vircadia is neat. I stumbled on it years ago when I randomly started wondering "wonder what's going on with Second Life and those VR things" and started googling around.
Unfortunately, like so many metaverse efforts, it's almost devoid of life. Interesting worlds to explore, cool tools to build your own stuff, but almost no people in it. It feels depressing, like an abandoned shopping mall.
I have no interest in doing anything about it, even if I had the time (which I don't), but these kinds of things happen all day, every day, to countless people. My silly blog post isn't worth getting up in arms about, but there are artists and other creators who pour countless hours, heart and soul into their work, only to have it taken in exactly this way. That's one of the reasons I'm so extremely negative about the spate of "AI" tools that have popped up recently. They are powered by theft.
There is a "right" way to make something like GitHub CoPilot, but Microsoft did not choose that way. They chose one of the most exploitative options available to them. For that reason, I hope they face significant consequences, though I doubt they will in the current climate. I also hope that CoPilot is shut down, though I'm pretty certain it will not be.
Other than access to the data behind it, Microsoft has nothing special that allows it to create something like CoPilot. The technology behind it has been around for at least a decade. There could be a "public" version of this same tool made by a cooperating group of people volunteering, "leasing", or selling their source code into it. There could likewise be an ethically-created corporate version. Such a thing would give individual developers or organizations the choice to include their code in the tool, possibly for a fee if that's something they want or require. The creators of the tool would have to acknowledge that they have suppliers (the people who create the code that makes their tool possible) instead of simply stealing what they need and pretending that's fine.
This era we're living through, with large companies stomping over all laws and regulations, blatantly stealing other people's work for their own profit, cannot come to an end soon enough. It is destroying innovation, and we all suffer for that. Having one nifty tool like CoPilot that gives a bit of convenience is nowhere near worth the tremendous loss that Microsoft's actions in this instance are creating for everyone.
I was listening to an O'Reilly hosted event where they had the CEO of GitHub, Thomas Dohmke, talking about CoPilot. I asked about biased systems and copyright problems. He said that in the next iteration they will show name, repo and licence information next to the code snippets you see in CoPilot. This should give a bit more transparency. The developer still has to decide whether to adhere to the licence. On the other hand, I have to say he is right that probably every one of us has used a code snippet from Stack Overflow (where 99% of the time no licence or copyright is mentioned), from GitHub repos, or from some tutorial website without mentioning where the code came from. Of course, CoPilot has trained with a lot of code from public repos. It is more or less a much faster and better search engine than the existing tools have been; after all, how much code has been used from public GitHub repos without adding the source to the code you pasted it into?
I have to write so many emails to so many idiots who have no idea what they are doing
So it sounds to me like the pressure is to reduce how much time you waste on idiots, which to my mind is a very good reason to use a text generator! I guess in that case you don't mind too much whether the company making the AI owns your prompt text?
I'd really like to see tools like this that you can run on your desktop or phone, so they don't send your hard work off to someone else and give a company a chance to take it from you.
On LinkedIn I see a lot of posts aimed at software developers along the lines of "If you're not using these AI tools (X, Y, Z) you're going to be left behind."
Two things about that:
- No you're not. If you have good soft skills (good communication, showing up on time, general time management) then you're already in excellent shape. No AI can do that stuff, and for that alone no AI can replace people.
- This rhetoric is coming directly from the billionaires who are laying off tech people by the hundreds of thousands as part of the class war they've been conducting against all working people since the 1940s. They want you to believe that you have to scramble and claw over one another to learn the "AI" that they're forcing onto the world, so that you stop honing the skills that matter (see #1) and are easier to obsolete later. Don't fall for it. It's far from clear how this will shake out once governments get off their asses and start regulating this stuff, by the way: most of these "AI" tools are blatantly breaking copyright and other IP laws, and some day that'll catch up with them.
That said, it is helpful to know thy enemy.
@prologic@twtxt.net I get the worry about privacy, but I think there is some value in the data being collected. Do I think that Russ is up there scheming new ways to discover what packages you use in internal projects for targeting ads? Probably not.
Go has always been driven by usage data. Look at modules: there was a need for repeatable builds, so various package toolchains were made and evolved into what we have today. Generics took time and seeing pain points where they would provide value. They weren't done just so a box could be checked off on a list of features. Some languages seem to do that to the extreme.
Whenever changes are made to the language, there are extensive searches across public modules for where the change might cause issues or where code could be improved by the change. embed.FS and strings.Cut come to mind.
I think it's good that the language maintainers are using what metrics they have to guide where to focus time and energy. Some of the other languages could use it, so time and effort isn't wasted maintaining something that has little impact.
The economics of the "spying" are to improve the product and ecosystem. Is it "spying" when a municipality uses water usage metrics in neighborhoods to forecast the need for new water projects? Or is it to discover your shower habits for nefarious reasons?
I've never liked the idea of having everything displayed all of the time for all of history.
And I still don't: Search and Bookmarks are better tools for this IMO.
From a technical perspective however, we will not introduce any CGO dependencies into yarnd: it makes portability harder.
Also, I hate SQL.
restic · Backups done right! In case no one has used this wonderful tool restic yet, I can assure you beyond a doubt that it is really quite fantastic. #backups
Hashes carry a $name$ prefix, which is used to dispatch the hashing or checking to its specific format.
I have submitted this to be used as the hash tooling for Yarn. See it as a good example of using this in a production environment!
PSA: DMs on social media sites are not truly PMs. This is why we have a separate tool for private messaging from yarn. Always remember: if you don't own the infra (or the parts at the ends of e2e encryption) you don't own the data, and the true owners can view it any way they want!
https://twitter.com/TinkerSec/status/1587040089057759235?t=At-8r9yJPiG6xF17skTxwA&s=19
@jlj@twt.nfld.uk @xuu@txt.sour.is hello! @prologic@twtxt.net and I were chatting about the question of globally deleting twts from the yarn.social network. @prologic@twtxt.net noted that he could build the tools and endpoints to delete twts, but some amount of cooperation from pod operators would be necessary to make it all work together. He asked me to spawn a discussion of the subject here, so here we are!
I don't have enough technical knowledge of yarn.social to say with any credibility how it all should work, but I can say that I think it ought to be possible, and it'd be good to do for those rare times when it's needed.
Hey. I have my own local forward tool: https://github.com/JonLundy/sshfwd. It uses SSH port forwards.
Thanks to @TANTlab@twitter.com and @birkbak@birkbak.neocities.org for having me today at AAU CPH. Presentation notes can be found at: http://darch.dk/aau-tool-talk/
@lyse@lyse.isobeef.org there was an old tool for encrypted volumes where you could use random files as the unlock keys. And you could have multiple hidden volumes that would unlock depending on the files supplied.
No on GitLab. If it's self-hosted, Gitea is best in class.
I can see hosting a mirror on GitHub, if only for the redundancy/visibility. Some projects will host a mirror there but then direct contributions to their self-hosted instance, like Go does.
I would suggest using a vanity domain that can redirect tools like go get to the hosting of your choice, so you don't have to rewrite all the package paths any time it moves.
JavaScript : web apps
wut?! Seriously?!
Python : small tools
Okay
Go: micro services
Umm, bad generalization. Example: yarnd, which powers most of Yarn.social.
Java: enterprise software
Yes! Oh gawd yes! And Java™ needs to die a swift death!
C: crimes
Hmmm? I feel this one is going to get some backlash and/or go the way of "Hacker" being misconstrued to mean entirely different/incorrect things, as is what's happening in the media (for various definitions of "media").
Reconsidering moving Yarn.social's development back to GitHub: Speaking of which (I do not forget); @fastidious@arrakis.netbros.com and I were discussing over a video call two nights ago, as well as @lyse@lyse.isobeef.org who joined a bit later, the whole move of all of my projects and their source code off of GitHub. Some folks do understand and appreciate my utter disgust over what Microsoft and Copilot did by blatantly scraping open source software's codebases without even so much as any attempt at attribution or respecting the licences of many (if not all?) open source projects.
That being said however, @fastidious@arrakis.netbros.com makes a very good and valid argument for putting Yarn.social's codebases, repositories and issues back on GitHub, for reasons that leave me "torn" over my own sense of morality and ethics.
But I can live with this as long as I continue to run and operate my new (yet to get off the ground) company "Self Hosted Pty Ltd", which operates its own code hosting, services, tools, etc.
Please comment here with your thoughts. Let us decide together.
I use WKD with my gpg key tool. It's quite nice!
Lots. The system is small, coherent, and understandable in a way no modern unix is. The namespace operations remain incredibly powerful. And several of the tools built on it, like the way network listeners and the mail server are built, are just much nicer to use, modify, and build on.
Z7 is a new CC0-licensed 6x7 monospaced typeface for uxn environments. It's designed to be an alternative to the specter8-frag font used in various uxn tools https://sectordisk.pw/?sectors&s=1958 gopher://sectordisk.pw:70/0/cgi-bin/sector.cgi?1958
I wrote part of a configuration tool with embedded FORTH to validate schemas. It was awesome
If [you take] a look at how APLers communicate when they have ideas, you see code all the time, all day long. The APL community is the only one I've seen that regularly can write complete code and talk about it fluently on a whiteboard between humans without hand waving. Even my beloved Scheme programming language cannot boast this. When working with humans on a programming task, almost no one uses their programming language as the primary communication method between themselves and other humans outside of the presence of a computer. That signals to me that they are not, in fact, natural, expedient tools for communicating ideas to other humans. The best practices utilized in most programming languages are, instead, attempts to ameliorate the situation to make the code as tractable and as manageable as possible, but they do not, primarily, represent a demonstration of the naturalness of those languages to human communication. – aaron hsu
agile / scrum is fine, but as a tool for developers to keep tight feedback loops with people who don't know what they want well enough to give a formal spec; not as a tool for managers to squeeze productivity.
interesting RFC dated April 1st, 1998: Hyper Text Coffee Pot Control Protocol (HTCPCP/1.0):
looking at the date this was published, i think the authors originally meant this as an April Fools' joke/prank.
funny, because now we have IoT devices and this is somewhat a reality today :P
I wonder if email would be a reasonable way to enable interaction on twtxt… something like publishing an email address for replies in the preamble of your feed; then, like twtxt, the rest is up to you, but I could imagine a simple moderation queue that could be checked periodically, allowing the admin to move approved comments into some public space… I keep thinking I'll add ActivityPub comments to my site, but it seems more complex than I care for. Ironically, because of available tooling, email actually feels simpler for this… of course, there is spam…
Twtxt is still very much alive and well. I just wrote a quick tool to crawl as much of the Twtxt network as I could, and here's what the results are:
Crawled 516 feeds
Found 52464 twts
That means there are >500 unique Twtxt feeds/users, and ~52k twts posted to date.
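The format makes such a crawler almost trivial: a feed is plain text with one twt per line as an RFC 3339 timestamp, a tab, and the text, with #-prefixed metadata/comment lines. A sketch of the counting part (my own code, not the actual crawler):

```go
package main

import (
	"bufio"
	"fmt"
	"strings"
	"time"
)

// countTwts parses a twtxt feed body: one "timestamp<TAB>text" per line;
// blank lines and lines starting with '#' are metadata or comments.
func countTwts(feed string) int {
	n := 0
	sc := bufio.NewScanner(strings.NewReader(feed))
	for sc.Scan() {
		line := strings.TrimSpace(sc.Text())
		if line == "" || strings.HasPrefix(line, "#") {
			continue
		}
		ts, _, ok := strings.Cut(line, "\t")
		if !ok {
			continue // no tab separator: not a twt
		}
		if _, err := time.Parse(time.RFC3339, ts); err != nil {
			continue // skip malformed timestamps
		}
		n++
	}
	return n
}

func main() {
	feed := "# nick = example\n" +
		"2023-01-01T12:00:00Z\tHello, twtxt!\n" +
		"2023-01-02T08:30:00+01:00\tSecond twt\n"
	fmt.Println(countTwts(feed)) // 2
}
```

The rest of a crawler is just fetching each feed over HTTP and following the "# follow = nick url" metadata lines to discover new feeds.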
@xuu@txt.sour.is Not too happy with WKD's use of CNAME over SRV for discovery of openpgpkey: that breaks using SNI pretty quickly. I suppose it was set up as a temporary workaround in the RFC anyhow.
@prologic@twtxt.net It is some interesting work to decentralize all the things. The tricky part is finding tooling. I am using a self-hacked version of the Go openpgp library. A tool to add and remove notations would need to be local, since it needs your private key.
Fun setting up basic productivity tools with Syncthing and Todo.txt