Page Assist - A Web UI for Local AI Models
67 ratings
Overview
Use your locally running AI models to assist you in your web browsing.
Page Assist - A Sidebar and Web UI for Your Local AI Models

Utilize your own AI models running locally to interact with while you browse, or as a web UI for your local AI model provider such as Ollama, Chrome AI, etc.

Repo: https://github.com/n4ze3m/page-assist

Current Features:
- Sidebar for various tasks
- Support for vision models
- A minimal web UI for local AI models
- Internet search
- Chat with PDF in the sidebar
- Chat with documents (pdf, csv, txt, md, docx)

Supported Providers:
- Ollama
- [Beta] Chrome AI (Gemini Nano)
- [Beta] OpenAI-compatible API support (LM Studio, Llamafile, and many more providers)
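To make the provider integration concrete, here is a minimal sketch of the kind of request a client like Page Assist sends to a locally running Ollama server. It targets Ollama's `/api/chat` endpoint on the default port 11434; the model name `llama3` is an assumption and should be replaced with a model you have pulled locally.

```python
import json
from urllib import request

# Default local Ollama endpoint (the same kind of URL you configure
# in Page Assist's Ollama settings).
OLLAMA_URL = "http://localhost:11434/api/chat"


def build_chat_payload(model, prompt, stream=False):
    """Build the JSON body that Ollama's /api/chat endpoint expects."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    }


def ask_ollama(model, prompt):
    # Sends one chat turn and returns the assistant's reply text.
    # "llama3" below is a placeholder model name, not guaranteed
    # to be installed on your machine.
    body = json.dumps(build_chat_payload(model, prompt)).encode()
    req = request.Request(
        OLLAMA_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]
```

A call such as `ask_ollama("llama3", "Summarize this page")` would only succeed with an Ollama server actually listening on localhost:11434; the payload builder, however, works standalone.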
4.9 out of 5 (67 ratings)
Manuel Herrera Hipnotista y Biomagnetismo · Oct 27, 2024
Wow, really nice extension. It feels great to work with a local LLM and Ollama in a useful and practical way. Thanks!
Ritch Cuvier · Oct 23, 2024
I like this application. I wish it could use two different URLs or two different ports in the Ollama URL setting. That would allow me to utilize multiple PCs.
Leonardo Grando · Oct 13, 2024
Congrats and thanks for bringing us this extension. Fantastic!
Details
- Version: 1.3.1
- Updated: October 28, 2024
- Offered by: Muhammed Nazeem
- Size: 2.76 MiB
- Languages: 12 languages
- Developer email: nazeemnob17@gmail.com
- Non-trader: This developer has not identified itself as a trader. For consumers in the European Union, please note that consumer rights do not apply to contracts between you and this developer.
Privacy
This developer declares that your data is:
- Not being sold to third parties, outside of the approved use cases
- Not being used or transferred for purposes that are unrelated to the item's core functionality
- Not being used or transferred to determine creditworthiness or for lending purposes
Support
For help with questions, suggestions, or problems, please open this page on your desktop browser