
Local LLama LLM AI Chat Query Tool
144 users
Version: 1.0.6
Updated: October 2, 2023

Available in the Chrome Web Store
Install & Try Now!

Elevate your browsing experience with our Chrome extension, designed to interact seamlessly with local models hosted on your own server. It lets you query your local models effortlessly and precisely, all from within your browser.
Our extension is fully compatible with both llama.cpp and .gguf models, giving you a versatile solution for all your modeling needs. To get started, grab the latest version, which includes a sample llama.cpp Flask server for your convenience. You can find the server in our GitHub repository:
GitHub Repository - Local Llama Chrome Extension:
https://github.com/mrdiamonddirt/local-llama-chrome-extension
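For a sense of what a server like this looks like, here is a minimal sketch of a llama.cpp Flask server built on the llama-cpp-python bindings. The route name, port, model path, and JSON fields below are illustrative assumptions, not the extension's actual API; see the repository for the real server.
```
# Minimal sketch of a llama.cpp Flask server (the /query route, port,
# and JSON fields are assumptions for illustration, not the real API).
from flask import Flask, request, jsonify
from llama_cpp import Llama

app = Flask(__name__)

# Load a local .gguf model; the path is a placeholder.
llm = Llama(model_path="models/your-model.gguf")

@app.route("/query", methods=["POST"])
def query():
    prompt = request.json.get("prompt", "")
    # Run a single completion against the local model.
    result = llm(prompt, max_tokens=256)
    return jsonify({"response": result["choices"][0]["text"]})

if __name__ == "__main__":
    app.run(host="127.0.0.1", port=5000)
```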
To set up the server, install the server's pip package with the following command:
```
pip install local-llama
```
Then, just run:
```
local-llama
```
With just a few straightforward steps, you can harness the capabilities of this extension: run the provided Python script, install the extension, and you can query your local models right away. Experience the future of browser-based model interactions today.
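Once the server is running, the extension talks to it over HTTP, so you can also hit it directly as a quick smoke test. The endpoint and port below are assumptions matching the sketch above; check the repository for the server's actual values.
```
# Quick smoke test against the local server (endpoint and port are
# assumptions mirroring the sketch above, not documented values).
import requests

resp = requests.post(
    "http://127.0.0.1:5000/query",
    json={"prompt": "Summarize this page in one sentence."},
    timeout=120,  # local generation can be slow on CPU
)
print(resp.json()["response"])
```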
Related

open-os LLM Browser Extension (577 users)
ChatLlama: Chat with AI (146 users)
sidellama (340 users)
Ollama Chrome API (85 users)
smartGPT Multi-Tab Extension (61 users)
ReadAnything (61 users)
Chatty for LLMs (32 users)
Llama 3.1 405b (132 users)
Offline AI Chat (Ollama) (76 users)
Local LLM Helper (117 users)
postpixie.com Chrome Extension (18 users)
WebextLLM (107 users)
Llama AI (133 users)
llama explain (65 users)
Smart Web Assistant (77 users)
Smatr-AI | GPT LLM for browser (73 users)
Orian (Ollama WebUI) (1,000+ users)
Local AI helper (48 users)
ollama-ui (10,000+ users)
LeafLLM (204 users)
Chat LLM (401 users)
Heystak (137 users)
HighlightX | AI Chat with Ollama, Local AI, Open AI, Claude, Gemini & more (575 users)
GL Git Clone (129 users)