Overview
Use your locally running AI models to assist you while you browse the web.
Page Assist - A Sidebar and Web UI for Your Local AI Models

Use your own locally running AI models to interact with while you browse, or as a web UI for your local AI model provider, such as Ollama.

Repo: https://github.com/n4ze3m/page-assist

Current Features:
- Sidebar for various tasks
- Support for vision models
- A minimal web UI for local AI models
- Internet search
- Chat with PDF in the sidebar
- [NEW BETA] Chat with documents (PDF, CSV, TXT, MD, DOCX)

Supported Providers:
- Ollama
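Since Page Assist fronts a locally running Ollama server, it can help to see what a direct request to that backend looks like. Below is a minimal sketch using Ollama's `/api/generate` endpoint; the default port 11434 and the model name "llama3" are assumptions, so substitute whatever `ollama list` reports on your machine:

```python
# Minimal sketch: calling a local Ollama server's REST API directly, the
# same backend Page Assist can use. Assumptions: Ollama's default port
# 11434 and a pulled model named "llama3" (check with `ollama list`).
import json
import urllib.request


def build_generate_request(prompt, model="llama3",
                           host="http://localhost:11434"):
    """Build the POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one JSON object instead of a token stream
    }).encode("utf-8")
    return urllib.request.Request(
        f"{host}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )


def ask_ollama(prompt, **kwargs):
    """Send the prompt and return the model's text response."""
    req = build_generate_request(prompt, **kwargs)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    try:
        print(ask_ollama("Summarize local-first AI in one sentence."))
    except OSError:
        print("No Ollama server reachable on localhost:11434")
```

If a model is pulled and the server is running, the script prints the model's reply; otherwise it reports that no server was reachable.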
4.9 out of 5 · 39 ratings
Ayman Konna, Jun 13, 2024
I am sponsoring this project on GitHub. I appreciate n4ze3m's work and contribution to the developer community. I believe that Page Assist has the potential to make a significant impact for anyone seeking a smooth entry into AI development.
Suhreed Sarkar, Jun 13, 2024
A perfect solution using Ollama and local models; it has RAG built in and is quite easy to configure. Organizing documents in a knowledge base and then interacting with the KB is useful.
Wooi Haw Tan, Jun 6, 2024
Provides an easy way to use local LLMs served by Ollama. Kudos to the developer.
Details
- Version: 1.1.12
- Updated: June 12, 2024
- Offered by: Muhammed Nazeem
- Size: 1.71 MiB
- Languages: 7 languages
- Developer email: nazeemnob17@gmail.com
- Non-trader: This developer has not identified itself as a trader. For consumers in the European Union, please note that consumer rights do not apply to contracts between you and this developer.
Privacy
This developer declares that your data is
- Not being sold to third parties, outside of the approved use cases
- Not being used or transferred for purposes that are unrelated to the item's core functionality
- Not being used or transferred to determine creditworthiness or for lending purposes
Support
For help with questions, suggestions, or problems, please open this page in your desktop browser.