Batteries exist, you donkey
- 1 Post
- 54 Comments
Hexarei@beehaw.org to Programming@programming.dev • Projects are shutting down due to Microslop's Github CoPilot making AI contributions easy and plentiful
3 · 7 days ago
Only if they're contributing through GitHub and not through local AI coding apps like Opencode or Claude CLI
Not in any recognizable capacity though, unfortunately
Sometimes they’re less of a short king and more like a small jester
Someone forgot a debounce
Hexarei@beehaw.org to Risa@startrek.website • The wildest part about this poll is that it was only shared to Star Wars sites (English)
21 · 15 days ago
… Or a new foe, two seconds later
Hexarei@beehaw.org to Programmer Humor@programming.dev • The official Introduction to Github page included an AI-generated graphic with the phrase "continvoucly morged" on it, among other mistakes.
1 · 15 days ago
And then made no effort to proofread it
Hexarei@beehaw.org to Linux@lemmy.ml • Cursed screenshot: XFCE desktop from remote machine launched over KDE Plasma of local machine
1 · 17 days ago
I've used Xpra for similar
Hexarei@beehaw.org to Asklemmy@lemmy.ml • I think Lemmy in general is very against AI. I'm rather new here, is it like a fediverse group thing or is this even based on reality?
3 · 17 days ago
I've had good success on similar hardware (5070 + more RAM) with GLM-4.7-Flash, using llama.cpp's --cpu-moe flag - I can get up to 150k context with it at ~20 tok/sec. I've found it to be a lot better for agentic use than GPT-OSS as well; it puts in a much more in-depth reasoning effort, so while it spends more tokens it seems worth it for the end result.
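A minimal sketch of the kind of llama.cpp invocation described above. The model filename, layer count, and port are assumptions for illustration, not from the comment; flag availability depends on your llama.cpp build, so check llama-server --help.

```shell
# Hypothetical llama-server invocation:
#   --cpu-moe  keep MoE expert tensors in system RAM, freeing VRAM
#   -ngl 99    offload all remaining layers to the GPU
#   -c 150000  request the ~150k-token context mentioned above
llama-server \
  -m ./GLM-4.7-Flash-Q4_K_M.gguf \
  --cpu-moe \
  -ngl 99 \
  -c 150000 \
  --port 8080
```

The idea behind --cpu-moe is that in a mixture-of-experts model, most parameters sit in rarely-hot expert tensors; parking those in system RAM leaves VRAM for the dense layers and KV cache, which is what makes large contexts feasible on a single consumer GPU.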
Hexarei@beehaw.org to Buy European@feddit.uk • GitHub - aliasvault/aliasvault: Privacy-first password manager with built-in email aliasing. Fully encrypted and self-hostable. (Dutch, AGPL-3.0 licence) (English)
6 · 17 days ago
Most of them are written in heavy legalese to cover asses
Hmmm. Anal Cobalt? Anal Grand Cherokee? Anal R8?
Hexarei@beehaw.org to Programmer Humor@programming.dev • o(1) statistical prime approximation
5 · 18 days ago
Because only 5% of those numbers are prime
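The 5% figure lines up with the prime number theorem: near n, roughly 1 in ln(n) integers is prime, so around 10^9 the density is about 1/ln(10^9) ≈ 4.8%. A quick sketch checking this (the bounds chosen here are my own, for speed, not from the thread):

```python
import math

def prime_count(limit: int) -> int:
    """Count primes below `limit` with a simple Sieve of Eratosthenes."""
    sieve = bytearray([1]) * limit
    sieve[0:2] = b"\x00\x00"  # 0 and 1 are not prime
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            # Cross off every multiple of p starting at p*p
            sieve[p * p::p] = bytearray(len(sieve[p * p::p]))
    return sum(sieve)

# Exact density of primes below 10^6: 78498 / 10^6 ≈ 7.8%
density = prime_count(10 ** 6) / 10 ** 6

# Prime number theorem estimate near 10^9 - roughly the
# "only 5% are prime" figure from the comment.
pnt_density = 1 / math.log(10 ** 9)
```

So the joke's "statistical approximation" is just guessing "not prime" and being right about 95% of the time at that scale.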
A perfect image to post to catbox
Those AI detectors don’t work, btw.
Interestingly, none of the official sources for the model weights clickwrap the download in a way that forces the user to read or agree to those terms before downloading. There is precedent for such terms being unenforceable when the user isn’t forced to agree to the terms.
There are lots of open-source models you can download from Hugging Face, Ollama, and even GitHub without signing any contracts or terms of use: Gemma3, Llama, Ministral, GLM, OLMo, and a bajillion others. GLM-4.7-Flash is a very capable agentic model that can run at very usable speeds on commodity hardware - and none of what it generates is dictated by any agreements or policies agreed to anywhere.
Not mine, I run my own 😜
This take is weird, because none of the companies that do inference claim ownership of the generated content in their contracts for one, and because anyone can download open source models and generate code without entering into any ToS, for two.
I signed no contracts to download the open source local models I use for code generation, just for what it’s worth
And when they wear out, they can largely be recycled