• 0 Posts
  • 16 Comments
Joined 1 year ago
Cake day: December 25th, 2023

  • For me it’s very simple: NSFW can’t have a generally accepted definition because it depends on culture, background and personal beliefs. There is no way for a collection of communities to agree on a common definition, and even if they did, enforcement and interpretation are still done by volunteers.

    Therefore, the All feed is never safe for work unless I know that my tolerance is lower than that of every community within Lemmy AND I’m fine with an accidental penis or breast due to human error.



  • User perspective:

    If you want something big, I’d pitch NixOS; as in, the core distribution. It’s a documentation nightmare: as a user I had to go through the options search and then try to figure out what the options actually mean far more often than I found comprehensive documentation.

    That would be half writing and half coordinating writers, though, I suspect.

    Another great project with mixed-quality documentation is openHAB. It fits the bill of being more backend-heavy, and the devs are very open in my experience. I actually see its core concepts as superior to the far more popular Home Assistant in every aspect except documentation!

    That said: thanks for putting the effort in! ♥





  • I’d try ChatGPT for that! :)

    But to give you a very brief rundown: if you have no experience in any of these areas and are self-learning, you should expect a long ramp-up phase! Perhaps there is an easier route, but if there is, I’m not familiar with it.

    First, familiarize yourself with server setups. If you only want to host this, you won’t have to go into the networking details, but they can become a source of errors at some point, so be warned! The usual tip here is to get familiar enough with Docker that you can read and understand docker compose files. The de facto standard for self-hosting is Linux, but I have read of people who used macOS and even Windows successfully.

    One aspect quite unique to the model landscape is the hardware requirements. As much as it hurts my NVIDIA-despising heart, at this point in time they are the de facto monopolist. Get yourself a card with 12 GB of VRAM or more (everything below that will be painful, if you get things running at all; I’ve tried to pull and run smaller models on an 8 GB card but experienced a lot of waiting and crashes). Read a bit about CUDA on your chosen OS and what the drivers need.
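
    If you want to sanity-check a card before committing to anything, a few lines of Python do the job. This is only a minimal sketch and assumes a CUDA-enabled PyTorch build is installed; the exact tooling will depend on whichever stack the project you pick uses:

        import torch  # assumes a CUDA-enabled PyTorch build is installed

        if not torch.cuda.is_available():
            print("No usable CUDA device found; check the NVIDIA driver and CUDA setup")
        else:
            props = torch.cuda.get_device_properties(0)
            vram_gb = props.total_memory / 1024**3
            print(f"GPU: {props.name}, VRAM: {vram_gb:.1f} GB")
            if vram_gb < 12:
                print("Below ~12 GB, expect long waits, small models only, or crashes")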

    Once you’re comfortable with that, the whole port, container, path-mapping and environment-variable business will start to make sense; there is a small sketch of those pieces below.
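
    To make those concepts concrete, here is a minimal sketch using the Python Docker SDK. The image name, port, paths and variable are made-up placeholders, not taken from any particular project’s guide:

        import docker  # pip install docker; talks to the local Docker daemon

        client = docker.from_env()

        container = client.containers.run(
            "example/model-server:latest",  # placeholder image name
            detach=True,
            ports={"7860/tcp": 7860},  # container port -> host port
            volumes={"/srv/models": {"bind": "/models", "mode": "rw"}},  # host path -> container path
            environment={"MODEL_DIR": "/models"},  # variable visible inside the container
        )
        print(container.name, container.status)

    A docker compose file expresses exactly the same mappings declaratively, which is why being able to read one gets you most of the way.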

    Then it’s a matter of going to the GitHub page linked, following their guide and starting a container. Loading models is actually the easier part once you have the infrastructure running.
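
    To illustrate the “loading models is the easier part” bit: once the environment is in place, pulling a model down and running it is often only a few lines. A hedged sketch using the Hugging Face transformers library; the model id is just an example, pick whatever fits your VRAM:

        from transformers import AutoModelForCausalLM, AutoTokenizer

        model_id = "mistralai/Mistral-7B-Instruct-v0.2"  # example model id
        tokenizer = AutoTokenizer.from_pretrained(model_id)
        model = AutoModelForCausalLM.from_pretrained(
            model_id, device_map="auto", torch_dtype="auto"  # downloads and places the weights
        )

        inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
        outputs = model.generate(**inputs, max_new_tokens=50)
        print(tokenizer.decode(outputs[0], skip_special_tokens=True))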


  • No offense intended; it’s possible that I misread your experience level:

    I hear a user asking developer questions. Either you go the route of using the publicly available services (DALL-E and co.) or you start digging into hosting the models yourself. The page you linked hosts trained models for use in your own contexts, not a “click a button and it works” experience.

    As a starting point for self-hosting image generation I suggest https://github.com/AUTOMATIC1111/stable-diffusion-webui.
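
    For a feel of what “hosting the model yourself” looks like in code, independent of that webui, here is a minimal sketch using the diffusers library; the model id is just a common example and the first run will download several GB of weights:

        import torch
        from diffusers import StableDiffusionPipeline

        # downloads the weights on first run, then generates locally on the GPU
        pipe = StableDiffusionPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5",
            torch_dtype=torch.float16,
        ).to("cuda")

        image = pipe("a watercolor painting of a lighthouse at dusk").images[0]
        image.save("lighthouse.png")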

    For the training part, I’ll be very blunt: if you don’t intend to spend five- to six-digit sums on hardware or processing power, forget it. And even then you’d need the raw training data to pull it off.

    Perhaps what you want to do is fine-tune a pretrained model; that’s something I only have a bit of experience with for LLMs, though (and even there I don’t have the hardware to get beyond a personal proof of concept).
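
    If fine-tuning is the direction you end up going, the usual consumer-hardware route for LLMs is parameter-efficient fine-tuning rather than full training. A rough sketch using the peft library on top of transformers (model id again only an example; the actual training loop and data preparation are a separate topic):

        from transformers import AutoModelForCausalLM
        from peft import LoraConfig, get_peft_model

        base = AutoModelForCausalLM.from_pretrained(
            "mistralai/Mistral-7B-Instruct-v0.2", device_map="auto", torch_dtype="auto"
        )

        lora = LoraConfig(
            r=8,  # rank of the low-rank adapters
            lora_alpha=16,
            target_modules=["q_proj", "v_proj"],  # attach adapters to the attention projections
            lora_dropout=0.05,
            task_type="CAUSAL_LM",
        )
        model = get_peft_model(base, lora)
        model.print_trainable_parameters()  # only a small fraction of the weights is trainable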


  • You’re not being dramatic enough, if you switch your point of view from user to creator. I’ll try to explain my point of view:

    At best, this is a hardware company that is just learning open software, and there are a lot of worse scenarios.

    The homepage is an artificial loop. Correct me if I’m wrong, but LAN mode wasn’t even fully functional at rollout.

    They create awesome and exciting machines, but the fact that you paint a critique of their closed-system business practices as “overly dramatic” is interesting.

    From my point of view, notions like this will split the communities even further (makers vs. users vs. builders).

    This doesn’t have to be a bad thing from my perspective, nor is it Bambu’s fault; it’s just that individual people with their own focus will have vastly different perceptions of how much intensity (aka drama) is appropriate.