Alternate account for @simple@lemmy.world

  • 26 Posts
  • 173 Comments
Joined 2 years ago
Cake day: July 3rd, 2023


  • simple@lemm.ee to Open Source@lemmy.ml · Proton's biased article on Deepseek · edited 2 days ago

    I understand it well. It’s still relevant to mention that you can run the distilled models on consumer hardware if you really care about privacy (see the sketch below). 8 GB+ of VRAM isn’t crazy, especially with the unified memory on MacBooks, or on some Windows laptops releasing this year with 64+ GB of it. There are also sites re-hosting various versions of DeepSeek, like Hugging Face hosting the 32B model, which is good enough for most people.

    Instead, the article is written as if there were literally no way to use DeepSeek privately, which is simply wrong.
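
    A minimal sketch of what running a distill locally looks like, using the Hugging Face transformers library. The exact repo id is an assumption; pick whichever distilled size fits your VRAM:

    ```python
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Assumption: this is the Hugging Face repo id for the 32B distill;
    # smaller distills exist if you have less VRAM or unified memory.
    model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-32B"

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",  # keep the checkpoint's native precision
        device_map="auto",   # spread layers across GPU/CPU as needed
    )

    # After the one-time download, everything below runs on your own machine.
    inputs = tokenizer("Why does unified memory help with local LLMs?", return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=200)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
    ```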


  • simple@lemm.ee to Open Source@lemmy.ml · Proton's biased article on Deepseek · 2 days ago

    > DeepSeek is open source, meaning you can modify code on your own app to create an independent — and more secure — version. This has led some to hope that a more privacy-friendly version of DeepSeek could be developed. However, using DeepSeek in its current form — as it exists today, hosted in China — comes with serious risks for anyone concerned about their most sensitive, private information.

    > Any model trained or operated on DeepSeek’s servers is still subject to Chinese data laws, meaning that the Chinese government can demand access at any time.

    What??? Whoever wrote this sounds like they have zero understanding of how this works. There is no “more privacy-friendly version” that could be developed: the models are already out, and you can run the entire model 100% locally (see the sketch below). That’s as privacy-friendly as it gets.

    “Any model trained or operated on DeepSeek’s servers is still subject to Chinese data laws”

    Operated, yes. Trained, no. The model is MIT-licensed; China has nothing on you when you run it yourself. I expect better from a company whose whole business is built on privacy.
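
    To make “100% locally” concrete, here’s a small sketch that forces Hugging Face tooling into offline mode once the weights are cached, so the process can’t phone home even by accident. The env var is a real huggingface_hub switch; the repo id is an assumption:

    ```python
    import os

    # Refuse all network calls to the Hugging Face Hub; only the local cache is used.
    # Must be set before transformers is imported.
    os.environ["HF_HUB_OFFLINE"] = "1"

    from transformers import pipeline

    # Assumption: this distill was already downloaded into the local cache.
    chat = pipeline("text-generation", model="deepseek-ai/DeepSeek-R1-Distill-Llama-8B")
    print(chat("Summarize the MIT license in one sentence.", max_new_tokens=100)[0]["generated_text"])
    ```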


  • People keep meaning different things when they say “Generative AI”. Do you mean the tech in general, or the corporate AI that companies overhype and try to sell to everyone?

    The tech itself is pretty cool. GenAI is already being used to quickly subtitle and translate all kinds of media (see the sketch below). Image AI is really good at upscaling low-res images, filling in the gaps to make them clearer. Chatbots are fallible, but they’re still really useful for specific things like generating test data or helping with basic tasks that might otherwise have you searching for five minutes. AI is also huge in video games: upscaling tech like DLSS boosts performance by running the game at a low resolution and then upscaling it, and the results are genuinely great. It’s also used to de-noise raytracing and produce cleaner reflections.

    Also, people are missing the point of why AI is being invested in so much. No, I don’t think “AGI” is coming any time soon, but the reason it’s sucking in so much money is what it could be in five years. Saying AI is a waste of effort is like saying 3D video games were a waste of time because they looked bad in 1995. It will improve.
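
    For the subtitling/translation point, a minimal sketch with OpenAI’s open-source Whisper model, which runs entirely on your own machine. The file name is a made-up example:

    ```python
    # pip install openai-whisper  (also needs ffmpeg installed on the system)
    import whisper

    model = whisper.load_model("small")  # larger checkpoints are slower but more accurate

    # Assumption: episode.mp4 is a local video with non-English dialogue.
    # task="translate" transcribes the audio and renders it in English.
    result = model.transcribe("episode.mp4", task="translate")
    print(result["text"])
    ```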


  • For AI stuff you’d want something with at least an RTX 4060. AMD GPUs for laptops are not great, and most AI tooling targets Nvidia’s CUDA, so a lot of it won’t run on them. Any card that’s good for AI will also be good for gaming, so you’ll be fine there. (A quick way to check what a machine actually exposes is sketched below.)

    You probably want something with 32 GB of RAM, too.

    As for pulling out the wifi card, there’s no need. Most laptops let you disable Wi-Fi in the BIOS, turning it off completely at the system level.

    Dual booting works on anything.

    Generally I’d recommend Lenovo Legion or ASUS ROG machines. If you’re rich you can also look into Razer, I guess.
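
    The quick check mentioned above, assuming Python with PyTorch is installed; it reports whether a CUDA GPU is visible and how much VRAM it has:

    ```python
    import torch

    if torch.cuda.is_available():
        props = torch.cuda.get_device_properties(0)
        print(f"{props.name}: {props.total_memory / 2**30:.1f} GiB VRAM")
    else:
        print("No CUDA GPU detected; AI workloads will fall back to the CPU.")
    ```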