You could look into Prose: the interface of Slack/Discord/Mattermost, built on XMPP, with E2EE.
Bitwarden’s local cache does not include attachments, though. If you rely on those, you have to rely on the server being available.
While I like and appreciate the campaign, the issue is IMO bigger. IoT devices, for example, even have an environmental impact when the services behind them get discontinued.
I would therefore like a more general rule: whenever a product is discontinued, for whatever reason, all necessary documents, sources, etc. need to be released to allow third parties to take over maintenance (that also includes schematics for hardware repairs).
I don’t understand how that hybrid is supposed to work. Monospace is a binary attribute: either all characters have the same width or they don’t. So what is the font now?
Unreal Tournament
Most people in my company use macOS, followed by a few dozen Linux users (various distros; whatever each one prefers), followed by a few Windows users (for whatever reason they want that). So essentially: we can choose what we want to use.
They also fuck over their own OS. I don’t think they deliberately broke dual-boot installs; they simply don’t put enough effort into QA. (See their recent problems with BitLocker after an update. Or the one update that fails because some internal partition is too small. And so on.)
Fli4l is still around?! Crazy. I used that back in 2002 or so to turn an old i386 with 3 ISA HP 100Mbit network cards into a router + fileserver combo. Good times.
glibc’s malloc creates memory arenas based on the number of CPU cores you have. The JVM might spawn a shitload of threads, and each one gets assigned to an arena, which can increase the memory usage outside of the JVM’s heap considerably. You could try to run the JVM with tcmalloc (which will replace the malloc calls for the spawned process). Also, different JVMs bundle different memory allocators; I think Zulu could also improve the situation out of the box. tcmalloc might still help additionally.
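A minimal sketch of both workarounds, assuming a Debian/Ubuntu-style tcmalloc path and a placeholder app.jar (adjust both for your setup):

```
# Workaround 1: cap the number of glibc malloc arenas
# (by default glibc allows up to 8 arenas per core on 64-bit systems)
MALLOC_ARENA_MAX=2 java -jar app.jar

# Workaround 2: preload tcmalloc so the JVM's malloc calls go through it
# (library path varies by distro; this one is typical for Debian/Ubuntu)
LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libtcmalloc_minimal.so.4 java -jar app.jar
```

Note that MALLOC_ARENA_MAX only limits how many arenas glibc creates, while preloading tcmalloc swaps out the allocator entirely.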
I wonder if that would be a genuine use case for “AI”. If the voice actor consents to having his voice represented in such a scene but doesn’t want to act it out in a studio, the computer model could take over that part.
It’s an okay game, but far worse than the first two. They forced an open world onto it and made it pretty repetitive. The DLC is more linear and feels a lot more like typical Mafia storytelling.
It doesn’t say that devs need to target home machines; it says they need to provide the resources so people can host it themselves, period.
Before attacking me with such an arrogant rant, maybe read what I wrote.
I said:
Once they release the source, people can refactor or reengineer it to run at a smaller scale, replace proprietary databases with free ones, etc.
So of course it’s about releasing anything (!) at all.
I simply said that you can’t compare a small fan project like a self-hosted WoW server with Blizzard’s infrastructure and the requirements of a highly available setup for millions of players.
ArenaNet is quite open about their infrastructure, and you can see that it is far from trivial, but it also allows them to do zero-downtime updates. That is a huge feat, but it also means that self-hosting that thing will be a pain in the ass. Yet I would not want them to skip all that just so it could be easily (!) self-hosted some time in the distant future.
Such an architecture is typically shit. Building a system that is simple AND scales that high won’t work; complexity usually gets added to cope with scale. If we don’t allow companies to build scalable (i.e. complex) systems, we simply won’t get such games anymore.
Again: I am completely in favor of forcing devs to release everything necessary to host the game. I am not in favor of forcing devs to target home machines for their servers when those servers clearly have completely different requirements. That’s unrealistic.
Not a fair comparison. The private servers were written with small-scale hosting in mind. They would very likely never scale to what Blizzard has in place. For all I know, Blizzard could run their stuff on a mainframe with platform-specific optimizations against an IBM DB2.
But I also don’t think this has to be transferable to a local setup without effort either. Once they release the source, people can refactor or reengineer it to run at a smaller scale, replace proprietary databases with free ones, etc.
I ran Arch on a convertible laptop around 2006–2010. I took most notes in OpenOffice Writer, with hotkeys to quickly insert formulas. Drawings were done with the pen. Homework (where speed didn’t matter as much, but where I wanted high quality) was done in ConTeXt.
Programming was done in FreePascal using the Lazarus IDE or in Java using the NetBeans IDE, depending on the course and my personal preference.
I think I had no complaints from anyone. Quite the contrary: one professor even gifted me a book as thanks for the high-quality typesetting in my homework, since most students didn’t give a shit and had no fucking clue how to really use their beloved MS Word.
So “it’s weird then”. As I said. And basically as the person I replied to said.
Yeah, but why not both? Extra support shouldn’t hurt.
… each time the server restarts, and randomly during login.
They already accept donations as a means of continuous support. So I guess this is now just another channel for people who prefer buying a license over using GitHub donations.
Edit: oh I just realized they stopped donations with the restructuring. Ok, that’s weird then.
It’s more comparable to Snikket. Both Snikket and Prose use Prosody as the server, with their own extensions.