In theory they (or someone else) could just bundle an open source copy of the assets, no different from having a different texture pack in a game.
I think a lot of new indie devs, having little knowledge of the actual state of the industry, see a publisher offering them big money to make video games (their dream!) and ignore (or lack the resources to properly interrogate) the fine print. There are countless examples of publishers destroying indie devs on a whim like this.
https://awesomekling.substack.com/p/welcoming-shopify-as-a-ladybird-sponsor Not sure about the other anonymous donation he received.
ngl this looks pretty slick. Good stuff!
Serenity does have donors, but most of the donations are for Ladybird, their libre browser stack separate from Mozilla/Safari/Chromium. Most of the money came specifically from donors looking for improved support for their own websites on Ladybird.
Charging commercial users while staying free for personal use may not always be open source. Redis, for example.
Premium support channels - This is basically how Red Hat and Canonical make their money, while offering FOSS to individuals.
Donations - KDE and GNOME are largely donor-backed, both by individuals and corporate entities.
Commissions on features - Collabora for example is commissioned by Valve to improve KDE and SteamOS.
Software licenses - Certain FOSS licenses may permit paid access to software as long as the source is open, I think? There are also source-available projects, e.g. Aseprite, where the source is public but binaries are only offered to paying customers.
Add-on services - Your FOSS web app can offer paid hosting and management for clients. Your distro can offer ISOs with extra pre-downloaded software for a fee (Zorin). You can partner with hardware vendors to distribute your software (Manjaro, KDE).
Corporate employment - Hired by a company to work on your project and integrate it with their own stack. This is what Linus Torvalds did with Linux when he was first hired by Transmeta - part of his time was spent making Linux work better with the technology Transmeta used.
I look forward to more voice assistants that aren't as laughably broken as Google/Alexa.
My Google Nest can’t even subtract from a timer (it can add, though!). Such a blatant surveillance device.
Yeah it’s nice to have a “IaC-lite” setup for stuff. Could definitely do a more involved install, but for my use case it doesn’t need to be :)
TLDR it’s a Debian/Linux image that comes preconfigured for raspberry pis and other small single board computers.
Firstly, it’s quite minimal for a “full featured” Linux distro, reducing RAM and CPU usage, which are usually in high demand on SBCs. But it also doesn’t remove stuff that a typical Linux user needs, so there’s no weird configuration to get your regular suite of apps running.
Secondly, it has a library of utilities for managing your computer from the command line, such as common Raspberry Pi configuration, and setting up and managing cron jobs, services, DDNS, VPNs, disks, etc.
Thirdly, it has its own “repository” of applications, which are really just regular Debian packages but with extra scripts to configure said software for the typical user. Stuff like installing and configuring a database, web server, Python, or PHP is all done alongside your software setup, and it “just works”.
It’s usually used for hosting services like Plex, Jellyfin, Nextcloud, and other utilities with minimal effort but it’s really just like any other Linux and you can do whatever you like to it.
dietpi.com if you wanna read about it from the devs
oops meant to reply, not make a new comment 😌
DietPi, for setting up an SBC (e.g. a Raspberry Pi) with common server software. Very good for a first-time self hoster like myself.
I moved from a 1080p monitor to a 1440p one for my main display and it’s actually really worthwhile. Not only is your daily computing sharper, but multitasking becomes easier because smaller windows are still legible.
IMO it’s a lot easier on the eyes when things are sharper, too.
1080p is still more than enough, but I think 1440p is worth it for a screen you’re using for hours every day :)
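To put rough numbers on the sharpness difference, here’s a quick pixel-density sketch (the 24″ and 27″ diagonals are just assumed example sizes, not anything specific from this thread):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Assumed example panels: a 24" 1080p monitor vs a 27" 1440p one.
print(round(ppi(1920, 1080, 24)))  # ~92 PPI
print(round(ppi(2560, 1440, 27)))  # ~109 PPI
```

At those assumed sizes, the 1440p panel packs roughly 18% more pixels per inch, which is where the extra legibility of small, tiled windows comes from.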
Because the underlying dependencies of GIMP changed a lot, especially between GTK2 and GTK3. That pulls resources away from writing code and into figuring out how to migrate it to a new version; changes in the toolkit usually mean you have to rewrite stuff anyway, which can introduce more bugs or regressions in functionality that must then be addressed.
Writing from scratch, you can write code that immediately fulfils the spec you’re working to (they could’ve even skipped straight to GTK4!), using the old code as a reference for what users need whilst throwing out anything and everything else that isn’t strictly necessary.
It allows you to throw out old features and code that you had to work with in the past, but may not actually be used by anyone. If it WAS needed, it can be added later.
Most importantly, there’s little to no expectation that the new app will supersede the old one immediately, whereas during a toolkit migration you need to keep everything working for every release you do.
It’s like how building a new house can be faster than renovating an old historically-protected building to the same regulatory standard. There’s red tape, you have to work around old masonry and brickwork, and you have to avoid damaging the building while keeping its appearance the same. A new house may look different and miss the old charm or familiarity, but it’ll be built faster and cheaper, and have more features for the same price as restoring the old building.
That’s different from how dependencies are managed when developing an application. When updating GIMP from GTK2 to GTK3, they have to upgrade a bunch of old code that was meticulously fixed to work with GTK2, and things may be changed, broken, or missing entirely in the new toolkit. That’s the kind of work you need to do when upgrading an app versus writing a new one from scratch :)
If the modern internet teaches us anything, it’s that everything is ephemeral even when you stringently catalogue every last byte of data. People just don’t need access to 90% of YouTube’s library, yet YouTube has to pay big money to keep 100% of that library available 24/7, 365 days a year.
There are already rips at the seams of these systems. Time is not on YouTube’s side.
Because updating dependencies after a long time breaks most of that code anyway, so you have to do a lot of work just to get things working exactly how they were before, only now your code probably has a bunch of new bugs to fix, and it’s still not utilising the enhanced features that updated dependencies may offer.
Rewriting can take more time, but if your alternative is to slowly upgrade code originally written in the nineties, you can actually save time by using that experience to rewrite it.
Lol thankfully I stopped before it ate any important info, and I finally have all of that vital stuff being backed up to a Hetzner storage box weekly now :)
Rarely, but I’ve contributed to a couple that I use.
Also, just a note that writing bug reports is a valid contribution! It really helps the regular maintainers find and fix bugs, and it also gives new devs more potential work to pick up for first contributions.