Using Chatbots
From Miscellany
Contents
Links to Selected Chatbots
Claude
- From Anthropic, Claude is available as several LLMs. There is a free tier, and Claude Pro costs $18/month.
- Claude 3.5 Haiku
- Claude 3 Opus (premium)
- Claude 3.5 Sonnet (premium)
Copilot
- Microsoft's derivative of OpenAI's ChatGPT is being integrated into many products, and its capabilities have expanded to include vision, image generation, and more.
Gemini
- From Google, Gemini comes in various flavors. Probably the best way to learn about them is in Google's AI Studio, which also provides API access.
Grok
- Accessed via the sidebar on the X platform, Grok is free to use, and early versions can be downloaded for custom use on one's own hardware.
Llama
- An open-source LLM family from Meta; it is free to use.
Leo
- Leo is the AI assistant built into the Brave browser. It can use several different LLMs:
- Mixtral
- Claude 3 Haiku
- Claude 3.5 Sonnet (premium)
- Llama 3.1 8B
- Basic use is free, but Claude 3.5 Sonnet requires a premium subscription ($149/year or $14.99/month).
MetaAI
- Meta AI is Meta's assistant, built on its open-source Llama models and accessible via Facebook.
Mistral AI
- A French AI company offering several open-source LLMs, which are available via Leo AI and the OnlyOffice AI plugin, among others.
OpenAI
- The home of ChatGPT, which may currently be the best-known AI chatbot thanks to its first-out-of-the-gate status.
Perplexity
- Said to be currently the best for research because it searches the internet and provides citations, Perplexity Pro costs $16/seat/month, although there is also a free option.
Pi
- A free AI that is described as "your personal AI".
Ollama
- Ollama is an open-source CLI app, hosted on GitHub, that runs chatbots on various LLMs locally without requiring API keys. There is an official Ollama Docker image, and an extensive library of models is available to download; the built-in default model is llama3.
- To use Ollama in Docker, use these Terminal commands:
- Start the container (once; CPU-only):
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
- Pull a model:
docker exec -it ollama ollama pull [name of LLM]
- Run the model:
docker exec -it ollama ollama run [name of LLM]
- Once a model has been pulled and run, the following commands are available:
- /set — Set session variables
- /show — Show model information
- /load <model> — Load a session or model
- /save <model> — Save your current session
- /clear — Clear session context
- /bye — Exit
- /?, /help — Help for a command
- /? shortcuts — Help for keyboard shortcuts
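Besides the interactive CLI above, a running Ollama container also exposes a local REST API on port 11434, which programs can call directly. A minimal Python sketch, assuming the container is running locally and a model (here llama3, the default mentioned above) has already been pulled — the URL and model name are illustrative:

```python
import json
import urllib.request

# Ollama's default local API endpoint (assumes the Docker container
# was started with -p 11434:11434, as shown above).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model, prompt):
    """Build the JSON body for Ollama's /api/generate endpoint.
    stream=False asks for the full response in one JSON object."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model, prompt, url=OLLAMA_URL):
    """Send a one-shot generation request to a local Ollama server
    and return the generated text."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(generate("llama3", "Why is the sky blue?"))
```

This is only a sketch; if no Ollama server is listening on port 11434, the request will fail with a connection error.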