is this a misskey moment, a shadowsocks-rust moment or a privoxy moment thats like. in an hour or so. i wonder how many of my (shit)posts were never federated
@[email protected] except half the fedi likely won't see those because of defed :D
> Based on the information provided, the Entry HAS NOT BEEN SELECTED for further processing for the Electronic Diversity Visa program at this time.

why do i even keep trying lol
@[email protected] delivering in cardboard boxes is actually kinda nice, wish they did that here
@[email protected] oh fuck i forgot you can go to grocery in person $[blur can you tell i haven't left my apt in ages?]
@[email protected] do they like deliver in them and have you unpack your stuff before the courier leaves?
how do i stop girlrotting
@[email protected] i think if it's just `/` you could do the other way round and it would be much easier
@[email protected] can't you just proxy by `accept` header? + `.well-known` routes for webfinger
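the accept-header idea above could be sketched like this — a minimal, hypothetical routing helper (`wantsActivityPub` and `pickUpstream` are made-up names, not any real proxy's API):

```typescript
// Hypothetical sketch: route requests asking for ActivityPub media types to
// the fedi backend, everything else to the web frontend. `.well-known`
// (webfinger, nodeinfo) always goes to the fedi backend.
const AP_TYPES = ["application/activity+json", "application/ld+json"];

function wantsActivityPub(accept: string | null): boolean {
  if (!accept) return false;
  // naive substring check; a real proxy would parse q-values properly
  return AP_TYPES.some((t) => accept.includes(t));
}

function pickUpstream(path: string, accept: string | null): string {
  if (path.startsWith("/.well-known/")) return "fedi";
  return wantsActivityPub(accept) ? "fedi" : "frontend";
}
```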
@[email protected] groups are not activated before you relogin https://unix.stackexchange.com/questions/277240/usermod-a-g-group-user-not-work
@[email protected] because there is no lightweight ast parser for rust for javascript and i dont want to use treesitter or something. regexes just work™ ([here's](https://github.com/teidesu/deno-types/blob/main/src/libs.ts#L36) the thing if ur interested btw)
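a toy version of the "regexes just work™" approach from the linked generator — this tiny extractor is my own illustration, not the actual code from that repo, and it will obviously break on macros/comments/weird formatting:

```typescript
// Pull `pub fn` names out of Rust source with a regex instead of a full
// parser. Good enough for well-formatted code, nothing more.
function extractPubFns(rustSrc: string): string[] {
  const re = /pub\s+(?:async\s+)?fn\s+([A-Za-z_][A-Za-z0-9_]*)/g;
  return [...rustSrc.matchAll(re)].map((m) => m[1]);
}
```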
@[email protected] they are small enough to fit there so why not. it's not like they can be arbitrarily large also db access is often faster than loading files because db has in-memory caching and stuff
> Error: Cannot perform I/O on behalf of a different request. I/O objects (such as streams, request/response bodies, and others) created in the context of one request handler cannot be accessed from a different request's handler. This is a limitation of Cloudflare Workers which allows us to improve overall performance. (I/O type: Native)
> at WebSocketTransport.send

what the fuck how does "not having a persistent connection" improve performance am i really supposed to open a new websocket every single request?.. this is so dumb
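the pattern that error pushes you towards looks roughly like this — a sketch with stand-in types (`Transport`, `makeTransport` are not a real Workers API), showing a fresh I/O object per request instead of a cached global connection:

```typescript
// I/O objects on Workers can't outlive the request that created them, so
// instead of caching a connection globally you create (and close) a
// transport inside each request handler.
interface Transport {
  send(msg: string): void;
  close(): void;
}

function handleRequest(
  makeTransport: () => Transport, // per-request factory, never a cached instance
  msg: string,
): string {
  const t = makeTransport(); // fresh I/O object for this request
  try {
    t.send(msg);
    return "ok";
  } finally {
    t.close(); // don't let it leak into another request's context
  }
}
```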
@[email protected] it fails to bundle them properly though (i use `new URL(..., import.meta.url)` to locate wasm blob and `import.meta.url` is just `undefined`). and yes i use wrangler
@[email protected] to be fair, wintercg's fetch doesn't define how the proxies should be passed

> Let proxies be the result of finding proxies for url in an implementation-defined manner. If there are no proxies, let proxies be « "DIRECT" ».
@[email protected] the thing is that the js standards just suck precisely because they're trying to be one-size-fits-all, they don't account for the use-cases that are specific to the platforms where the code is actually being run. ending up with runtimes just patching stuff in to make up for that. i doubt there is any way around that the way the standard apis are now...
oh for fucks sake don't touch dates ([src](https://twitter.com/acdlite/status/1785691330988986587))
@[email protected] doesn't node also have some non-standard options in the fetch api? i think it was called `dispatcher` pretty much [the same stuff](https://github.com/nodejs/undici/pull/1411), isn't it? and workers also have a fun quirk of automatically upgrading websockets. not very "standard", too
@[email protected] honestly if the site can't handle that load - its the site's issue.

> a single roughly ~3KB POST to Mastodon caused servers to pull a bit of HTML and… fuck, an image

<i>idk where did POST come from, but whatever</i> just make it uhh not return the entire page for preview generators? and put a cdn in front of your images? literally what

> In total, 114.7 MB of data was requested from my site in just under five minutes

thats like... nothing, lol. if you can't handle ~380KB/s (~3Mbit/s) of throughput how is your site even still alive :neocat_googly_shocked:
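back-of-the-envelope check of the quoted numbers (using decimal MB→KB and assuming a flat 5-minute window, both rounded assumptions on my part):

```typescript
// 114.7 MB served over ~5 minutes — what throughput is that actually?
const totalMB = 114.7;
const seconds = 5 * 60;
const kbPerSec = (totalMB * 1000) / seconds; // ≈ 382 KB/s
const mbitPerSec = (kbPerSec * 8) / 1000;    // ≈ 3 Mbit/s
```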
apparently they don't support `import.meta.url` too, lol. though that should be fixed ?[here](https://github.com/cloudflare/workerd/pull/1553) which was merged 4 hours ago. nice timing i guess lol. though i found people in their discord complaining about exactly that half a year ago :haggard:
> Date.now() returns the time of the last I/O; it does not advance during code execution.

wait what, what the fuck
```
service core:user:cloudflare: Uncaught Error: EvalError: Code generation from strings disallowed for this context
```

> `eval()` is not allowed for security reasons.
> `new Function` is not allowed for security reasons.

fucking what now cloudflare why there are very legitimate uses for both of them and it's so dumb to restrict them like that what the fuck luckily i don't do much codegen and those uses are kinda optional but still wtf
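since those codegen uses are optional anyway, one way to cope is a feature probe with a fallback — a sketch with made-up names (`codegenAvailable`, `makeAdder`), not anything from a real library:

```typescript
// Workers bans eval()/new Function at runtime, so anything optional that
// relies on codegen needs a probe + non-codegen fallback.
function codegenAvailable(): boolean {
  try {
    // throws EvalError on runtimes that disallow string codegen
    new Function("return 1");
    return true;
  } catch {
    return false;
  }
}

// example: use a compiled fast path only when codegen is allowed
function makeAdder(): (a: number, b: number) => number {
  if (codegenAvailable()) {
    return new Function("a", "b", "return a + b") as (a: number, b: number) => number;
  }
  return (a, b) => a + b; // plain-closure fallback
}
```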
yay, i did absolutely nothing (useful) today :neocat_googly:
@[email protected] yeah, that's true. but i don't find it reasonable enough to not use github *at all* in worst case scenario – you can always just migrate away if you end up needing stuff they made paywalled and dont want to pay. it's not like github is very vendor-locking – there are emulators for actions, and they provide api enough to export all issues/wiki/whatever.
why do people keep rejecting github lol

like yeah its *proprietary* and *owned by microsoft*. but like.. so what? they *actually* invest a lot of money in open source (i cant even imagine how much they lose on free-tier github actions) and the network effect, too. everyone has a github account, nobody has an account on your yet another forgejo instance. lowering entry barrier here is pretty good in many cases. i just don't get it.

if you are anti-corporate, microsoft will most likely lose *more* money if you just use their services extensively without paying them (e.g. abusing free actions as fuck, storing large files in repos, etc.) than if you didn't use github at all. instead of spending corporate money - people for some reason choose to spend their own money to build the same infra (that is also less convenient for outside contributors). literally why, i dont get it
@[email protected] i meant that if they figured out a way to send something privately (without exposing it to the entire ~~blockchain~~ repository), they could probably just do the same with multiple recipients to achieve followers-only posts

> unless you also want to sync them too

why not tho?
@[email protected] i wonder if implementing dms means that bluesky will eventually get followers-only posts
ended up overengineering things as usual and made an automated generator for those lol

parsing rust with regexes is not very fun :haggard:
what the fuck how are they so slow compared to the most naive implementation ever
oh fuck me, turns out node's `EventEmitter` throws an error if i `.emit('error')` and there is no registered error handler

this is so dumb lol
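that special-casing of `'error'` is documented EventEmitter behavior; a tiny guard like this (the `safeEmitError` helper is my own illustration) avoids the surprise throw:

```typescript
// node's EventEmitter treats 'error' specially: emitting it with no
// registered listener throws instead of silently doing nothing.
import { EventEmitter } from "node:events";

function safeEmitError(ee: EventEmitter, err: Error): boolean {
  // only emit if someone is listening, otherwise the emit itself throws
  if (ee.listenerCount("error") === 0) return false;
  return ee.emit("error", err);
}
```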
honestly, so far my entire experience with deno was very similar to writing code for browsers. i'm not *building* in browser, i use a proper tool for that (node, vite/rollup/tsc, ...) i might be wrong, but deno just feels like a very fragile thing, only suitable for a very limited number of use-cases. and once you need *a little bit* more control over what's happening - you end up hacking around the made up limitations of deno. it's just.. not fun dealing with all that instead of writing actual code. i can't point out exactly *why* i feel that way, but i can't shake this feeling off either.
they also interfere with the builtin libs like `dom` because fuck you that's why. so mixing them all just makes me put even more `any` in my code base even in absolutely unrelated places. thanks deno very cool. i should honestly probably just strip `lib.deno.d.ts` manually and only keep what's actually used in my code lol
its been like 3 weeks already since i started adding support for deno in my library, and honestly its definitely not the most pleasant experience. bun, despite its quirks, mostly just worked. probably because they didn't reinvent the wheel, idk
god im so tired of deno they have *some* interesting decisions, but i cant see deno becoming mainstream ever. it just.. feels like you are literally just running backend code in a headless browser for whatever reason. performance-wise too, despite all the ffi benchmarks and optimizations it still feels... slower than node.
i said that in the beginning, i'll keep saying that until the end. deno having a hard dependency on tsc and reimplementing half of the typechecking is a very bad design choice...

this is a misskey-tombstone instance loaded up with data from a now defunct misskey instance