Local LLama LLM AI Chat Query Tool
Overview
Query a local model from your browser.
Elevate your browsing experience with our Chrome extension, designed to interact with local models hosted on your own server. It lets you query local models directly from your browser and is compatible with both Llama CPP and .gguf models.

To get started, grab the latest version, which includes a sample Llama CPP Flask server. You can find the server in our GitHub repository (Local Llama Chrome Extension): https://github.com/mrdiamonddirt/local-llama-chrome-extension

To set up the server, install its pip package:

```
pip install local-llama
```

Then run:

```
local-llama
```

With just a few steps, you can use the extension: run the provided Python script, install the extension, and query your local models from the browser.
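For a rough idea of what a query to the local server looks like, here is a minimal sketch that posts a prompt over HTTP. The port (5000, Flask's default), the endpoint path, and the JSON fields are assumptions for illustration only; see the GitHub repository above for the routes the bundled server actually exposes.

```python
# Minimal sketch of querying a locally hosted Llama CPP Flask server.
# The URL, endpoint path, and JSON schema are assumed for illustration;
# consult the extension's GitHub repository for the real routes.
import requests

SERVER_URL = "http://localhost:5000/api/prompt"  # hypothetical endpoint

def query_local_model(prompt: str) -> str:
    """Send a prompt to the local model server and return its reply."""
    response = requests.post(SERVER_URL, json={"prompt": prompt}, timeout=120)
    response.raise_for_status()
    # Assumed response shape: {"response": "<generated text>"}
    return response.json().get("response", "")

if __name__ == "__main__":
    print(query_local_model("Summarize llama.cpp in one sentence."))
```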
5 out of 5 (1 rating)
Details
- Version: 1.0.6
- Updated: October 2, 2023
- Offered by: RowDog
- Size: 375 KiB
- Languages: English
- Developer: rw entreprise services, Piccadilly Business Centre, Aldow Enterprise Park, Manchester M12 6AE, GB
- Email: mrdiamonddirt@gmail.com
- Non-trader: This developer has not identified itself as a trader. For consumers in the European Union, please note that consumer rights do not apply to contracts between you and this developer.
Privacy
This developer declares that your data is:
- Not being sold to third parties, outside of the approved use cases
- Not being used or transferred for purposes that are unrelated to the item's core functionality
- Not being used or transferred to determine creditworthiness or for lending purposes
Support
For help with questions, suggestions, or problems, please open this page on your desktop browser