So… with all this OpenClaw stuff, I was wondering: what’s the FOSS status of something like that I can run locally? Can I get my own locally run agent that I can ask to perform simple tasks (go and find this, download that, summarize an article)? I’m just kinda curious about all of this.

Thanks!

    • cideyav138@lemmy.ml · 14 hours ago

      Ollama is a VC-backed copy/paste of llama.cpp.

      They have a history of using llama.cpp’s code (and bugs) without supporting the project or giving credit. llama.cpp is easy to use, more performant, and truly open source.
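For anyone wanting to try it, here is a minimal quickstart sketch (the model file name is just a placeholder; any GGUF model you have downloaded works):

```shell
# Clone and build llama.cpp (requires cmake and a C/C++ toolchain)
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
cmake -B build
cmake --build build --config Release

# Serve a local GGUF model over an OpenAI-compatible HTTP API
# (replace the path with the model file you actually have)
./build/bin/llama-server -m models/your-model.gguf --port 8080
```

Once the server is up, anything that speaks the OpenAI chat API can point at http://localhost:8080.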

    • iturnedintoanewt@lemmy.worldOP · 17 hours ago

      Thanks! I understand how to run these models as LLMs you can chat with, using tools like Ollama or GPT4All. My question is: how do I go from that to having them actually do things for me, handle files, etc.? As it stands, if I run any of these locally, it can just answer questions offline, and that’s about it. What about these “skills”, where it can go fetch files, find a specific URL, or summarize what a YouTube video is about based on what’s said in it?
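The piece that turns a chat model into an “agent” is a small tool-use loop around it: you describe tools to the model, it replies with a structured tool call instead of plain text, and your own code executes the call and feeds the result back. A minimal sketch of that dispatch layer (the tool names and the JSON shape here are illustrative assumptions, not any particular project’s API):

```python
import json

def fetch_url(url: str) -> str:
    """Hypothetical tool: download a page and return its text."""
    # A real agent would use urllib or requests here; this is a stub.
    return f"<contents of {url}>"

def summarize(text: str) -> str:
    """Hypothetical tool: produce a summary (in practice, another LLM call)."""
    return f"summary: {text[:40]}"

# Registry mapping tool names (as advertised to the model) to functions.
TOOLS = {"fetch_url": fetch_url, "summarize": summarize}

def dispatch(tool_call_json: str) -> str:
    """Parse the model's structured tool call and run the matching function."""
    call = json.loads(tool_call_json)
    fn = TOOLS[call["name"]]
    return fn(**call["arguments"])

# Simulated model output: a tool-calling model emits JSON like this
# instead of a plain chat reply.
model_output = '{"name": "fetch_url", "arguments": {"url": "https://example.com"}}'
result = dispatch(model_output)
# `result` would then be sent back to the model as a "tool" message so it
# can continue the conversation using the fetched content.
```

Local servers like llama.cpp’s llama-server and Ollama expose this through an OpenAI-style `tools` parameter on their chat endpoint, so in practice `model_output` comes out of an HTTP response rather than a hardcoded string.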

      • 9tr6gyp3@lemmy.world · 13 hours ago

        Sorry, wish I was able to share more. I honestly JUST started diving into this stuff after your post. Learning a lot from the various other comments though. Hopefully some of the other commenters can help you get the answers you’re looking for.

    • Auster@thebrainbin.org · 2 days ago

      There’s also a community for it here on the fediverse, for those interested: !Ollama@lemmy.world

      Also, from my tests, it works decently enough even in Android’s Termux, though you seem to need a powerful phone.

    • PetteriPano@lemmy.world · 1 day ago

      I’ve had better luck with llama.cpp for opencode. I’m guessing it handles the output formatting for tool use better.