npmai
npmai is a lightweight Python package (current version 0.1.8) designed to bridge the gap between users and open-source LLMs. It provides instant access to Ollama and more than 10 other models without local installation, user logins, or API keys. It also supports building RAG agents without local or cloud setup, offering free and unlimited usage. The project has a consistent release cadence, with minor version updates delivering both features and bug fixes.
Warnings
- breaking: Parameters of the `Rag` class have been updated across minor versions (e.g., in v0.1.7) to support new functionality such as sending multiple file types and integrating with Supabase. Existing RAG agent implementations relying on previous parameter signatures may break.
- gotcha: While `npmai` advertises "zero setup, no API keys, no installation" for LLM access and RAG, it internally relies on hosted cloud services (e.g., a Hugging Face server for RAG processing, Supabase for vectorized document storage). Users should factor these external dependencies into reliability and production considerations.
- gotcha: As a library in its early development stage (v0.1.x), the API and class methods (e.g., the `Memory` class's `clear_memory` method added in v0.1.7) are subject to change in minor releases without explicit deprecation warnings or major version bumps. This can lead to unexpected behavior or breakage.
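Since minor releases can change the API without warning, one defensive pattern is to check the installed version at startup and warn when it drifts from the version the calling code was written against. A minimal sketch using only the standard library; `TESTED_VERSION` and `minor_matches` are illustrative names, not part of npmai:

```python
from importlib.metadata import version, PackageNotFoundError

# The npmai version this code was last tested against (assumption for the example).
TESTED_VERSION = (0, 1, 8)

def minor_matches(installed: str, tested: tuple) -> bool:
    """Return True if the installed major.minor matches the tested major.minor."""
    parts = tuple(int(p) for p in installed.split(".")[:2])
    return parts == tested[:2]

try:
    installed = version("npmai")
    if not minor_matches(installed, TESTED_VERSION):
        print(f"warning: npmai {installed} differs from tested "
              f"{'.'.join(map(str, TESTED_VERSION))}; APIs may have changed")
except PackageNotFoundError:
    # npmai is not installed in this environment; nothing to check.
    pass
```

Pinning an exact version in requirements (`npmai==0.1.8`) achieves the same goal at install time.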
Install
pip install npmai
Imports
- Ollama
from npmai import Ollama
- Memory
from npmai import Memory
- Rag
from npmai import Rag
Quickstart
from npmai import Ollama

# Initialize the Ollama model (llama3.2 is an example)
llm = Ollama(model="llama3.2", temperature=0.5)

# Define your prompt
prompts = "Hello, tell me a short summary of NPMAI"

# Invoke the LLM to get a response
result = llm.invoke(prompts)

# Print the result
print(result)
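Because npmai depends on hosted cloud services (see Warnings), an `invoke` call can fail transiently. A minimal retry wrapper with exponential backoff; it assumes only that the model object exposes `invoke` as in the quickstart above, and the helper name is illustrative:

```python
import time

def invoke_with_retry(llm, prompt, retries=3, backoff=1.0):
    """Call llm.invoke(prompt), retrying failed calls with exponential backoff.

    Waits backoff * 2**attempt seconds between attempts; re-raises the last
    error once all retries are exhausted.
    """
    for attempt in range(retries):
        try:
            return llm.invoke(prompt)
        except Exception:
            if attempt == retries - 1:
                raise
            time.sleep(backoff * 2 ** attempt)
```

Usage mirrors the quickstart: `result = invoke_with_retry(llm, prompts)`. Catching bare `Exception` is deliberately broad here because npmai's error types are not documented; narrow it once you know which exceptions the hosted backends raise.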