Even quicker start: run LLMs locally

Install Ollama (https://ollama.com): it's just click-and-install.
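For reference, a minimal sketch of the install-and-run flow on Linux (on macOS and Windows it really is a clickable installer). The model name here is just an example; pick whatever fits your hardware.

```shell
# Official one-line installer for Linux (macOS/Windows use the GUI installer)
curl -fsSL https://ollama.com/install.sh | sh

# Download a model and chat with it in the terminal
ollama pull llama3.2
ollama run llama3.2
```

Once running, Ollama also exposes a local API (by default at http://localhost:11434), which is what the tools below talk to.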

Install Open WebUI (https://github.com/open-webui/open-webui?tab=readme-ov-file): it's two lines on the command line.
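The two lines, assuming you have a recent Python available (this follows the pip route from the project's README; a Docker route also exists):

```shell
# Install and start Open WebUI; it serves a ChatGPT-style UI
pip install open-webui
open-webui serve
```

Then open http://localhost:8080 in your browser; it should pick up your local Ollama models automatically.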

Next step: give it control of your browser by installing browser-use (https://github.com/browser-use/browser-use).
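A minimal sketch of wiring browser-use to a local Ollama model, assuming `pip install browser-use langchain-ollama` plus `playwright install chromium`, a running Ollama server, and the Agent API as documented at the time of writing; the model name and task are placeholder examples.

```python
# Sketch: let a local LLM drive a browser via browser-use.
# Assumes: browser-use + langchain-ollama installed, Playwright's Chromium
# downloaded, and an Ollama server running at its default localhost port.
import asyncio

from browser_use import Agent
from langchain_ollama import ChatOllama


async def main():
    # Any locally pulled model; tool-capable models work best (example name)
    llm = ChatOllama(model="qwen2.5:7b")
    agent = Agent(
        task="Open example.com and summarize the page",  # example task
        llm=llm,
    )
    await agent.run()


asyncio.run(main())
```

Note that agent quality depends heavily on the model: small local models can struggle with multi-step browser tasks, so start with something in the 7B+ range.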
