4.4 out of 5
Mohammad Kanawati (Feb 13, 2025)
Easy, light, and it just lets you chat. However, I wish it had more customization, a system prompt, or a memory section. As for multiline input, I wish it were the opposite (Enter to send, and Ctrl+Enter for multiline).
Sultan Papağanı (Oct 31, 2024)
Please update it and add more features. It's awesome! (An Enter-to-send setting, image upload/download if it doesn't exist, export chat to .txt, and renaming a saved chat's title (it asks for our name? it should say "Chat name" or something).)
Manuel Herrera Hipnotista y Biomagnetismo (Oct 27, 2024)
Simple solutions, as all effective things are. Thanks!
Bill Gates Lin (Aug 8, 2024)
How do I set the prompt?
Damien PEREZ (Dadamtp) (Aug 8, 2024)
Yep, it's true, it only works with Ollama on localhost. But my Ollama runs on another server, exposed via open-webui. So I set up a reverse proxy: http://api.ai.lan -> 10.XX.XX.XX:11435. But the extension can't access it. Then I also tested with the direct IP, http://10.1.33.231:11435, but you force the default port: failed to fetch -> http://10.1.33.231:11435:11434/api/tags. Finally, I made an SSH tunnel: ssh -L 11434:localhost:11435 USER@10.XX.YY.ZZ. It works, but it's not pretty.
Fabricio Cincunegui (Aug 5, 2024)
I wished for a low-end-friendly GUI for Ollama, and you made it. Thanks!
Frédéric Demers (May 19, 2024)
Wonderful extension, easy to get started with local large language models without needing a web server, etc. Would you consider including the MathJax library in the extension so that equations are rendered correctly? Something like <script type="text/javascript" src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/3.2.2/es5/latest.min.js?config=TeX-AMS-MML_HTMLorMML"> </script> or perhaps package MathJax as a local resource.
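For context on the suggestion above: MathJax 3 is normally configured by assigning a config object to `window.MathJax` before the MathJax script loads. A minimal sketch follows; the delimiter choices and the local script path are assumptions for illustration, not something the extension defines.

```javascript
// Sketch of a MathJax v3 configuration object. In a page it would be
// assigned to window.MathJax *before* the MathJax script tag runs.
const mathJaxConfig = {
  tex: {
    inlineMath: [["$", "$"], ["\\(", "\\)"]],   // inline TeX delimiters
    displayMath: [["$$", "$$"], ["\\[", "\\]"]] // display TeX delimiters
  },
  svg: { fontCache: "global" } // share glyph paths across all equations
};

// In the extension's HTML (with MathJax bundled locally, as the review
// suggests), this would look roughly like:
//   <script>window.MathJax = { ... };</script>
//   <script src="mathjax/es5/tex-svg.js"></script>
console.log(mathJaxConfig.tex.inlineMath.length); // 2
```

Bundling MathJax as a local resource (rather than loading it from a CDN) also fits Manifest V3, which disallows remotely hosted code in extensions.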
Hadi Shakiba (May 9, 2024)
It's great to have access to such a useful tool. Having 'copy' and 'save to txt' buttons would be a fantastic addition!
Daniel Pfeiffer (Apr 30, 2024)
I like it! So far the best way to easily chat with a local model in an uninterrupted way.
Luis Hernández (Apr 25, 2024)
Nice work. Just curious, how did you manage to get around Ollama’s limitation of only accepting POSTs from localhost, since the extension originates from chrome-extension://? Regards,
Hi! Great question: Chromium extensions let you control request headers in ways you can't from a web page. You can see how this is done here: https://github.com/ollama-ui/ollama-ui/blob/main/api.js#L2-L25 I think it also requires the `declarativeNetRequest` permission: https://github.com/ollama-ui/ollama-ui/blob/main/manifest.json#L8
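For readers unfamiliar with that API: a `declarativeNetRequest` rule can rewrite headers such as `Origin` on matching requests, which is how an extension can satisfy Ollama's localhost-origin check. Here is a minimal sketch in the rule shape Chrome expects; the specific header value and URL filter are assumptions for illustration, and the actual rule in ollama-ui's source may differ.

```javascript
// Hypothetical declarativeNetRequest rule: rewrite the Origin header on
// requests to the local Ollama server so they appear to come from
// localhost instead of chrome-extension://...
const rule = {
  id: 1,
  priority: 1,
  action: {
    type: "modifyHeaders",
    requestHeaders: [
      { header: "Origin", operation: "set", value: "http://localhost:11434" }
    ]
  },
  condition: {
    urlFilter: "http://localhost:11434/*",
    resourceTypes: ["xmlhttprequest"]
  }
};

// Inside an extension this would be registered with something like:
//   chrome.declarativeNetRequest.updateDynamicRules({ addRules: [rule] });
console.log(rule.action.type); // "modifyHeaders"
```

An alternative on the server side is setting Ollama's `OLLAMA_ORIGINS` environment variable to allow the extension's origin, but the rule above keeps the fix entirely inside the extension.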