@prologic@twtxt.net When you unpack what he's saying in that video (which I've watched, and just now re-watched), and strip away all his attempts to wrap this idea in fancy-sounding language, he is saying: it would be better if women were viewed as property of men, because then if they were raped, the men who owned them would get mad and do something about it. Rape would then be a property crime, like trespassing or theft. Left unspoken by him, but very much known to him, is that the man or men who "own" a woman can then have their way with her, just as they can freely walk around their yard or use their own stuff. In his envisioned better world, it'd be impossible for a husband to rape his wife, for instance, because she is his property and he can do almost anything he wants (that's literally what "property" means in Western countries).
It's so fucked up it's hard to put into words how fucked up it is. And this isn't the only bad idea he bangs on about!
Taking Jordan Peterson as an example, the only thing he "preaches" (if you want to call it that) is to be honest with yourself and to take responsibility.
This is simply untrue. Read the articles I posted, seriously.
In a tweet in one of the articles I posted, Peterson states there is no white supremacy in Canada. This is blatantly false. It is disinformation. Peterson has made statements that rape is OK (he uses "fancy" language like "women should be naturally converted into mothers", but unpack that a bit: what he means is legalized rape followed by forced conception). He is openly anti-LGBTQ and refuses to use people's preferred pronouns. He seems to believe that women who wear makeup at work are asking to be sexually harassed.
He's using his platform in academia to pretend that straight, white men are somehow the most aggrieved group in the world and everyone else is just whining and can get fucked. The patron saint of Men's Rights Activists and incels. I find him odious.
The Internet Isn't Meant To Be So Small | Defector
It's annoying to see millions of dollars thrown at making more-or-less literal dupes of internet companies that everyone is already using begrudgingly and with diminishing emotional returns. It's maybe more frustrating to realize that the goal of these companies is the same as their predecessors': to make the internet smaller.
Looks like Google's using this blog post of mine without my permission. I hate this kind of tech company crap so much.
@Phys_org@feeds.twtxt.net using the phrase "machine learning" in this article is misleading and bandwagoning. They used a neural model, which neuroscientists were doing long before "machine learning" became a popular term.
Tapetum Lucidum
⌘ Read more
@prologic@twtxt.net yes, I agree. It's bizarre to me that people use the thing at all, let alone pay for it.
BlueSky is cosplaying decentralization
I say "ostensibly decentralized" because BlueSky's (henceforth referred to as "BS" here) decentralization is a similar kind of decentralization to that of cryptocurrencies: sure, you can run your own node (in BS's case, "personal data servers"), but that basically does not give you any meaningful agency in the system.
I don't know why anyone would want to use this crap. It's the same old same old, and it'll end up the same old way.
I was listening to an O'Reilly-hosted event where they had the CEO of GitHub, Thomas Dohmke, talking about CoPilot. I asked about biased systems and copyright problems. He said that in the next iteration they will show name, repo, and licence information next to the code snippets you see in CoPilot. This should give a bit more transparency. The developer still has to decide to adhere to the licence. On the other hand, I have to say he is right that probably every one of us has used a code snippet from Stack Overflow (where 99% of the time no licence or copyright is mentioned), GitHub repos, or some tutorial website without mentioning where the code came from. Of course, CoPilot has been trained with a lot of code from public repos. It is more or less a much faster and better search engine than the existing tools have been, because how much code has been pasted in from public GitHub repos without adding the source to the code?
@thecanine@twtxt.net wow this is horrifying. What happened to Opera? It used to be my favorite browser but now they're like that one cousin who started getting into drugs, and then got in trouble with the law, and then before you know it they're scamming old ladies out of their pension money.
Definition of e
⌘ Read more
@prologic@twtxt.net @carsten@yarn.zn80.net
There is (I assure you there will be, don't know what it is yet…) a price to be paid for this convenience.
Exactly prologic, and that's why I'm negative about these sorts of things. I'm almost 50; I've been around this tech hype cycle a bunch of times. Look at what happened with Facebook. When it first appeared, people loved it and signed up and shared incredibly detailed information about themselves on it. Facebook made it very easy and convenient for almost anyone, even people who had limited understanding of the internet or computers, to get connected with their friends and family. And now here we are today, where 80% of people in surveys say they don't trust Facebook with their private data, where they think Facebook commits crimes and should be broken up or at least taken to task in a big way, etc. etc. Facebook has been fined many billions of dollars and faces endless federal lawsuits in the US alone for its horrible practices. Yet Facebook is still exploitative. It's a societal cancer.
All signs suggest this generative AI stuff is going to go exactly the same way. That is the inevitable course of these things in the present climate, because the tech sector is largely run by sociopathic billionaires, because the tech sector is not regulated in any meaningful way, and because the tech press / tech media has no scruples. Some new tech thing generates hype, people get excited and sign up to use it, then when the people who own the tech think they have a critical mass of users, they clamp everything down and start doing whatever it is they wanted to do from the start. They'll break laws, steal your shit, cause mass suffering, who knows what. They won't stop until they are stopped by mass protest from us, and the government action that follows.
That's a huge price to pay for a little bit of convenience, a price we pay and continue to pay for decades. We all know better by now. Why do we keep doing this to ourselves? It doesn't make sense. It's insane.
I have to write so many emails to so many idiots who have no idea what they are doing
So it sounds to me like the pressure is to reduce how much time you waste on idiots, which to my mind is a very good reason to use a text generator! I guess in that case you don't mind too much whether the company making the AI owns your prompt text?
I'd really like to see tools like this that you can run on your desktop or phone, so they don't send your hard work off to someone else and give a company a chance to take it from you.
@carsten@yarn.zn80.net Who says you need to use anything like that? Where's the pressure coming from?
Q: How do we feel about forking the Twtxt spec into what we love and use today in Yarn.social in yarnd, tt, jenny, twtr, and other clients? Thinking about (and talking with @xuu@txt.sour.is on IRC about) the possibility of writing a completely new spec (no extensions). Proposed name: yarn.txt or "Yarn". Compatibility would remain with Twtxt in the sense that we wouldn't break anything per se, but we'd divorce ourselves from Twtxt and be free to improve based on the needs of the community, and not the ideals of those that don't use it or contribute in the first place, or who fixate on nostalgia (which doesn't really help anyone).
On LinkedIn I see a lot of posts aimed at software developers along the lines of "If you're not using these AI tools (X, Y, Z) you're going to be left behind."
Two things about that:
- No, you're not. If you have good soft skills (good communication, showing up on time, general time management) then you're already in excellent shape. No AI can do that stuff, and for that alone no AI can replace people.
- This rhetoric is coming directly from the billionaires who are laying off tech people by the hundreds of thousands as part of the class war they've been conducting against all working people since the 1940s. They want you to believe that you have to scramble and claw over one another to learn the "AI" they're forcing onto the world, so that you stop honing the skills that matter (see #1) and are easier to make obsolete later. Don't fall for it. It's far from clear how this will shake out once governments get off their asses and start regulating this stuff, by the way: most of these "AI" tools are blatantly breaking copyright and other IP laws, and some day that'll catch up with them.
That said, it is helpful to know thy enemy.
We could ask them? But on the counter: would bukket or jan6 follow the pure twtxt feeds? Probably not, either way… We could use content negotiation as well: text/plain for basic and text/yarn for enhanced.
I'm not a huge fan of using JSON. I feel we could still use text as the medium, maybe a modified version to fix any weaknesses.
What if instead of signing each twt individually we generated a merkle tree from the twt hashes, then a signature of the root hash? This would ensure the full stream of twts is intact with minimal overhead, with the added bonus of helping clients identify missing twts when syncing/gossiping.
Have two endpoints: one as the webfinger to link profile details and avatar like you posted, plus the signature for the merkle-root twt; and the other a pageable stream of twts, or individual twts/merkle branches to incrementally access twt feeds.
Quick 'n Dirty prototype Yarn.social protocol/spec:
If we were to decide to write a new spec/protocol, what would it look like?
Here's my rough draft (back-of-paper-napkin idea):
- Feeds are JSON file(s) fetchable by standard HTTP clients over TLS
- WebFinger is used at the root of a user's domain (or multi-user) for lookup, e.g. prologic@mills.io -> https://yarn.mills.io/~prologic.json
- Feeds contain similar metadata to what we're familiar with: Nick, Avatar, Description, etc.
- Feed items are signed with an ED25519 private key. That is, all "posts" are cryptographically signed.
- Feed items continue to use content-addressing, but use the full Blake2b Base64 encoded hash.
- Edited feed items produce an "Edited" item so that clients can easily follow edits.
- Deleted feed items produce a "Deleted" item so that clients can easily delete cached items.
@abucci@anthony.buc.ci that is an ironic example, since the inventor of the seatbelt gave away the rights to use the technology freely.
Linguistics Gossip
⌘ Read more
I played around with parsers. This time I experimented with parser combinators for twt message text tokenization: basically, extract mentions, subjects, URLs, media, and regular text. It's kinda nice, although my solution is not completely elegant, I have to say. Especially the communication protocol between different steps for intermediate results is really ugly. Not sure about performance; I reckon a hand-written state machine parser would be quite a bit faster. I need to write a second parser and then benchmark them.
lexer.go and newparser.go resemble the parser combinators: https://git.isobeef.org/lyse/tt2/-/commit/4d481acad0213771fe5804917576388f51c340c0 It's far from finished yet.
The first attempt in parser.go doesn't work, as backtracking is not accounted for; I only noticed later that I have to do that. With twt message texts there is no real error in parsing, just regular text as a "fallback". So it works a bit differently than parsing a real language: no error reporting required, except maybe for debugging. My goal was to port my Python code as closely as possible. But then the runes in the string gave me a bit of a headache, so I thought I'd just build myself a nice reader abstraction. When I noticed the missing backtracking, I then decided to give parser combinators a try instead of improving on my look-ahead reader. It only later occurred to me that I could have just used a rune slice instead of a string. With that, porting the Python code should have been straightforward.
Yeah, all this probably doesn't make sense unless you look at the code. And even then, you have to learn the ropes a bit. Sorry for the noise. :-)
go mills()
So. Some bits.
i := fIndex(xs, 5.6)
Can also be
i := Index(xs, 5.6)
The compiler can infer the type automatically. Looks like you mention that later.
Also the inference is super smart: you can define functions that take functions with generic types in the arguments. This can be useful for a generic value mapper for a repository.
func Map[U, V any](rows []U, fn func(U) V) []V {
	out := make([]V, len(rows))
	for i := range rows { out[i] = fn(rows[i]) }
	return out
}
rows := []int{1,2,3}
out := Map(rows, func(v int) uint64 { return uint64(v) })
I am pretty sure the type parameters go the other way, with the type name first and the constraint second.
func Foo[comparable T](xs T, s T) int
Should be
func Foo[T comparable](xs T, s T) int
Salt Dome
â Read more
Omniknot
â Read more
@prologic@twtxt.net I get the worry about privacy. But I think there is some value in the data being collected. Do I think that Russ is up there scheming new ways to discover what packages you use in internal projects for targeting ads? Probably not.
Go has always been driven by usage data. Look at modules: there was a need for repeatable builds, so various package tool chains were made and evolved into what we have today. Generics took time, and came from seeing pain points where they would provide value. They weren't done just so a box of features could be checked off. Some languages seem to do that to the extreme.
Whenever changes are made to the language, there are extensive searches across public modules for where the change might cause issues or could be improved by the change. fs embed and strings.Cut come to mind.
I think it's good that the language maintainers are using what metrics they have to guide where to focus time and energy. Some of the other languages could use that, so time and effort isn't wasted maintaining something that has little impact.
The economics of the "spying" are to improve the product and ecosystem. Is it "spying" when a municipality uses water usage metrics in neighborhoods to forecast the need for new water projects? Or is it to discover your shower habits for nefarious reasons?
@prologic@twtxt.net the rm -rf is basically what go clean -modcache does. I think you can use another form that will remove just the deps for a specific module: go clean -r
@eldersnake@we.loveprivacy.club Several reasons:
- It's another language to learn (SQL)
- It adds another dependency to your system
- It's another failure mode (database blows up, schema changes, indexes, etc.)
- It increases security problems (now you have to worry about being SQL-safe)
And most of all, in my experience, it doesn't actually solve any problems that a good key/value store with good indexes and good data structures can't. I'm just no longer a fan. I used to use MySQL, SQLite, etc. back in the day; these days, nope, I wouldn't even go anywhere near a database (for my own projects) if I can help it. It's just another thing that can fail, another operational overhead.
Code Lifespan
â Read more
@mckinley@twtxt.net I use pass along with the Android and browser-pass clients. It is very good, and keeping in sync is pretty simple.
What password manager do you use? Or, why none?
Hey @kdx@kdx.re, what client are you using?
@xuu@txt.sour.is that doesn't seem to fit the spirit of the spec, at least by my read (I could be wrong, obviously). The example on Wikipedia's webfinger page,
{
"subject": "acct:bob@example.com",
"aliases": [
"https://www.example.com/~bob/"
],
"properties": {
"http://example.com/ns/role": "employee"
},
"links": [{
"rel": "http://webfinger.example/rel/profile-page",
"href": "https://www.example.com/~bob/"
},
{
"rel": "http://webfinger.example/rel/businesscard",
"href": "https://www.example.com/~bob/bob.vcf"
}
]
}
and then the comparison with how mastodon uses webfinger,
{
"subject": "acct:Mastodon@mastodon.social",
"aliases": [
"https://mastodon.social/@Mastodon",
"https://mastodon.social/users/Mastodon"
],
"links": [
{
"rel": "http://webfinger.net/rel/profile-page",
"type": "text/html",
"href": "https://mastodon.social/@Mastodon"
},
{
"rel": "self",
"type": "application/activity+json",
"href": "https://mastodon.social/users/Mastodon"
},
{
"rel": "http://ostatus.org/schema/1.0/subscribe",
"template": "https://mastodon.social/authorize_interaction?uri={uri}"
}
]
}
suggests to me you want to leave the subject/acct bit as is (don't add prefixes) and put extra information you care to include in the links section, where you're free to define the rel URIs however you see fit. The notion here is that webfinger is offering a mapping from an account name to additional information about that account, so if anything you'd use a "subject": "acct:SALTY ACCOUNT_REPRESENTATION" line in the JSON to achieve what you're saying, if you don't want to do that via links.
@prologic@twtxt.net Unfortunately the RFCs are a bit light in this regard. While they make mention of different kinds of accounts, like mailto: or status services, they never combine them. They do make mention of using redirects to forward a request to other webfingers to provide additional detail.
I am kinda partial to using salty:acct:me@sour.is, yarn:acct:xuu@txt.sour.is, and mailto:me@sour.is, each of which could redirect to a specific service, and a parent account acct:me@sour.is that would reference them in some way, either in properties or aliases.
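A hypothetical webfinger document sketching that layout: the parent acct: subject carries the scheme-prefixed identities as aliases, and the rel/href values are made up for illustration:

```json
{
  "subject": "acct:me@sour.is",
  "aliases": [
    "salty:acct:me@sour.is",
    "yarn:acct:xuu@txt.sour.is",
    "mailto:me@sour.is"
  ],
  "links": [
    {
      "rel": "http://webfinger.net/rel/profile-page",
      "href": "https://txt.sour.is/user/xuu"
    }
  ]
}
```

A resolver interested in only one service would match on the scheme prefix of the alias (salty:, yarn:, mailto:) and then follow the corresponding redirect.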
Interesting. I've been using backupninja with Borg for snapshots.
restic · Backups done right! – In case no-one has used this wonderful tool restic yet, I can beyond a doubt assure you it is really quite fantastic. #backups
Did something change with how the discover feed is generated? My pod's logout mode now only shows my twts. It used to be all twts from watcher observation, like my logged-in discover tab. @prologic@twtxt.net
One of the frustrating parts of using twtxt for conversations is the URLs are, well… ugly. Anyone (like y'all yarn folks) looked at using webfinger for translating user@domain accounts to URLs?
@prologic@twtxt.net and @justamoment, this Gitxt project sounds really interesting. Can you tell us about some of your goals?
An interesting read about testing code using nullable states instead of mocks.
https://www.jamesshore.com/v2/projects/testing-without-mocks/testing-without-mocks
@prologic@twtxt.net see where it's used, maybe that can help.
https://github.com/sour-is/ev/blob/main/app/peerfinder/http.go#L153
This is an upsert: I pass a streamID, which is like a globally unique ID for the object. Note how the type of the function's parameter is used to infer the generic type. Inside, it will create a new *Info and populate it from the datastore to pass to the function. The func will do its modifications, and if it returns a nil error it will commit the changes.
The PA type contract ensures that the type fulfills the Aggregate interface and is a pointer to the type at compile time.
A Modest Robot Levy Could Help Combat Effects of Automation On Income Inequality In US, Study Suggests
An anonymous reader quotes a report from MIT News: What if the U.S. placed a tax on robots? The concept has been publicly discussed by policy analysts, scholars, and Bill Gates (who favors the notion). Because robots can replace jobs, the idea goes, a stiff tax on them … ⌘ Read more
@prologic@twtxt.net so basically you would use cgit + gitbug with some webhooks?
One that I think is pretty interesting is building up dependent constraints. See here: it accepts a type but requires the use of a pointer to the type.
https://github.com/sour-is/ev/blob/main/pkg/es/es.go#L315-L325
Tutorial: Getting started with generics - The Go Programming Language – Okay @xuu@txt.sour.is I quite like Go's generics now! After going through this myself I like the semantics and the syntax. I'm glad they did a lot of work on this to keep it simple to both understand and use (just like the rest of Go).
#GoLang #Generics
I have submitted this to be used as the hash tooling for Yarn. See it as a good example on using this in a production environment!
Logged in using new argon2i password hash!
@me@eapl.mx you are lucky you can get off easy with just "give me $10"! In the US $10 does nothing. You need to give, at least, $50.
I made a thing. It's a multi password type checker. Using the PHC string format we can identify a password hashing format from the prefix $name$ and then dispatch the hashing or checking to its specific format.