
LLM-X


Overview


LLM-X (also called llmx) is a web app and a Chrome extension. The entire codebase (no secrets!) is on GitHub, and the web app is hosted on GitHub as well. The app lets you chat with local AI models, including a text-to-image provider.

Supports:

  • Ollama
  • LM Studio (through the LM Studio SDK)
  • Gemini Nano (built into the browser; currently Chrome Canary only)
  • Automatic1111 (for image generation)
  • OpenAI-compatible endpoints

Features:

  • Run multiple models at the same time
  • Chat bot can send and receive images and text, and regenerate responses
  • Save multiple chats
  • Quick bar for easily wiping all data
  • Much more!

No RAG support yet, apologies. It should be coming soon!
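As a rough illustration of what "OpenAI-compatible endpoints" means here: local backends such as Ollama and LM Studio expose a `/v1/chat/completions` route that speaks the OpenAI chat schema, so a client like LLM-X can target any of them just by swapping the base URL. The sketch below builds such a request with the Python standard library; the base URLs and model name are assumptions for illustration (check your backend's docs), not details taken from this listing.

```python
import json
import urllib.request


def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request for a local backend.

    base_url is assumed to be something like http://localhost:11434/v1
    (Ollama's default) or http://localhost:1234/v1 (LM Studio's default);
    both ports are assumptions, not taken from the listing above.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Sending the request (requires a backend actually running locally):
# with urllib.request.urlopen(build_chat_request(
#         "http://localhost:11434/v1", "llama3", "Hello!")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because every backend accepts the same request shape, "supporting" a new provider mostly reduces to configuring a different base URL.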

0 out of 5 (no ratings)

Google doesn't verify reviews.

Details

  • Version
    1.1.4
  • Updated
    December 11, 2024
  • Offered by
    mr.demarcus.johnson
  • Size
    966KiB
  • Languages
    English (United States)
  • Developer
    Email
    mr.demarcus.johnson@gmail.com
  • Non-trader
    This developer has not identified itself as a trader. For consumers in the European Union, please note that consumer rights do not apply to contracts between you and this developer.

Privacy

LLM-X has disclosed the following information regarding the collection and usage of your data. More detailed information can be found in the developer's privacy policy.

LLM-X handles the following:

Website content

This developer declares that your data is

  • Not being sold to third parties, outside of the approved use cases
  • Not being used or transferred for purposes that are unrelated to the item's core functionality
  • Not being used or transferred to determine creditworthiness or for lending purposes

Support

Related

  • WebextLLM, 5.0 (1): Browser-native LLMs at your fingertips
  • Ollama EasyPHPRompt, 5.0 (1): Ollama EasyPHPRompt Chrome Extension
  • Local AI helper, 0.0 (0): LocalAI provides access to web content with your authorisation, without storing it. It can be configured to use files and…
  • Offload: Fully private AI for any website using local models., 5.0 (3): A fully private in-browser AI assistant. Works even offline. No external API dependencies.
  • sidellama, 5.0 (2): sidellama
  • open-os LLM Browser Extension, 4.5 (4): Quick access to your favorite local LLM from your browser (Ollama).
  • ollama-ui, 4.5 (31): This extension hosts an ollama-ui web server on localhost
  • Orian (Ollama WebUI), 3.1 (21): Quick access to your favorite local LLM from your browser (Ollama).
  • Page Assist - A Web UI for Local AI Models, 4.9 (183): Use your locally running AI models to assist you in your web browsing.
  • Ollamazing, 4.0 (3): Web extension to use local AI models
  • OpenTalkGPT - UI to access DeepSeek,Llama or all open source modal offilne., 4.7 (6): This extension hosts an Ollama UI on localhost and helps you access all open source models.
  • AIskNet, 5.0 (2): Locally-run AI that answers questions from your current webpage

