{"id":4604,"library":"langchain-pinecone","title":"LangChain Pinecone Integration","description":"langchain-pinecone is an integration package that connects LangChain applications with Pinecone, a managed vector database. It handles storing, retrieving, and managing vector embeddings to power AI search, recommendation, and generative AI features. The current version is 0.2.13; releases follow the cadence of the broader LangChain ecosystem, with frequent updates.","status":"active","version":"0.2.13","language":"en","source_language":"en","source_url":"https://github.com/langchain-ai/langchain-pinecone","tags":["vector database","langchain","pinecone","embeddings","RAG","AI","LLM"],"install":[{"cmd":"pip install -U langchain-pinecone pinecone-client langchain-openai","lang":"bash","label":"Recommended Installation with OpenAI Embeddings"}],"dependencies":[{"reason":"Core LangChain library for application logic.","package":"langchain","optional":false},{"reason":"Official Python client for interacting with Pinecone services.","package":"pinecone-client","optional":false},{"reason":"Common dependency for OpenAI embeddings, often used with Pinecone.","package":"langchain-openai","optional":true}],"imports":[{"symbol":"PineconeVectorStore","correct":"from langchain_pinecone import PineconeVectorStore"},{"note":"OpenAI embeddings were moved to the `langchain-openai` package in LangChain v0.1.x.","wrong":"from langchain.embeddings import OpenAIEmbeddings","symbol":"OpenAIEmbeddings","correct":"from langchain_openai import OpenAIEmbeddings"},{"note":"The `init` function was deprecated in `pinecone-client` v3.0.0; use the `Pinecone` class constructor instead.","wrong":"from pinecone import init","symbol":"Pinecone","correct":"from pinecone import Pinecone"}],"quickstart":{"code":"import os\nfrom langchain_pinecone import PineconeVectorStore\nfrom langchain_openai import OpenAIEmbeddings\nfrom langchain_core.documents import 
Document\nfrom pinecone import Pinecone, ServerlessSpec\n\n# --- Configuration (replace with your actual keys) ---\n# It's recommended to set these as environment variables.\nPINECONE_API_KEY = os.environ.get(\"PINECONE_API_KEY\", \"YOUR_PINECONE_API_KEY\")\nOPENAI_API_KEY = os.environ.get(\"OPENAI_API_KEY\", \"YOUR_OPENAI_API_KEY\")\n\nif PINECONE_API_KEY == \"YOUR_PINECONE_API_KEY\" or OPENAI_API_KEY == \"YOUR_OPENAI_API_KEY\":\n    print(\"Warning: Please set PINECONE_API_KEY and OPENAI_API_KEY environment variables.\")\n    print(\"Quickstart will likely fail due to missing credentials.\")\n\nindex_name = \"my-langchain-test-index\"\ndimension = 1536 # OpenAI text-embedding-ada-002 model dimension\nmetric = \"cosine\"\n\n# --- Initialize Pinecone Client (pinecone-client v3.x recommended) ---\n# The v3+ client takes only the API key; the legacy `environment` setting\n# from `pinecone.init()` is not used with serverless indexes.\ntry:\n    pc = Pinecone(api_key=PINECONE_API_KEY)\nexcept Exception as e:\n    print(f\"Error initializing Pinecone client: {e}\")\n    exit(1)\n\n# --- Create/Connect to Pinecone Index ---\nif index_name not in pc.list_indexes().names():\n    print(f\"Creating Pinecone index '{index_name}'...\")\n    pc.create_index(\n        name=index_name,\n        dimension=dimension,\n        metric=metric,\n        spec=ServerlessSpec(cloud=\"aws\", region=\"us-west-2\") # Adjust cloud/region as needed\n    )\n    print(f\"Index '{index_name}' created.\")\nelse:\n    print(f\"Connecting to existing Pinecone index '{index_name}'.\")\n\n# --- Initialize Embeddings Model ---\nembeddings = OpenAIEmbeddings(api_key=OPENAI_API_KEY)\n\n# --- Prepare Documents ---\ndocuments = [\n    Document(page_content=\"The quick brown fox jumps over the lazy dog.\"),\n    Document(page_content=\"A computer is an electronic device that processes data.\"),\n    Document(page_content=\"LangChain is a framework for developing applications powered by language 
models.\"),\n    Document(page_content=\"Pinecone is a vector database for building AI applications.\")\n]\n\n# --- Create the Vector Store from Documents ---\n# This method handles embedding and upserting the documents into the existing index.\nprint(\"Adding documents to Pinecone vector store...\")\nvectorstore = PineconeVectorStore.from_documents(\n    documents, embeddings, index_name=index_name\n)\nprint(\"Documents added.\")\n\n# --- Perform a Similarity Search ---\nquery = \"What is LangChain?\"\nprint(f\"\\nPerforming similarity search for: '{query}'\")\nresults = vectorstore.similarity_search(query, k=1)\n\nprint(\"\\nSearch Results:\")\nfor doc in results:\n    print(f\"- Content: {doc.page_content}\")\n\n# --- Optional: Clean up ---\n# print(f\"\\nDeleting index '{index_name}' for cleanup...\")\n# pc.delete_index(index_name)\n# print(f\"Index '{index_name}' deleted.\")\n","lang":"python","description":"This quickstart demonstrates how to initialize the Pinecone client, create a Pinecone index if it doesn't already exist, embed documents with OpenAIEmbeddings, store them in the Pinecone vector store, and perform a similarity search. Ensure you have `PINECONE_API_KEY` and `OPENAI_API_KEY` set as environment variables or replaced in the code."},"warnings":[{"fix":"Replace `from pinecone import init; init(api_key='...', environment='...')` with `from pinecone import Pinecone; pc = Pinecone(api_key='...')`; the v3+ client no longer takes an `environment` argument. Ensure your `langchain-pinecone` package is updated to work with the newer client as well.","message":"`pinecone-client` v3.0.0 introduced a breaking change to how the Pinecone client is initialized. 
The global `pinecone.init()` function was deprecated in favor of instantiating the `pinecone.Pinecone` class directly.","severity":"breaking","affected_versions":"pinecone-client <3.0.0 (old) vs pinecone-client >=3.0.0 (new)"},{"fix":"Always ensure the `dimension` parameter used when creating your Pinecone index matches the output dimension of your chosen embedding model. For OpenAI's `text-embedding-ada-002`, the dimension is 1536.","message":"Mismatch between the embedding model's dimension and the Pinecone index's dimension. If the index was created with one dimension (e.g., 384 for `all-MiniLM-L6-v2`) and you try to upsert vectors from an embedding model with a different dimension (e.g., 1536 for OpenAI `text-embedding-ada-002`), Pinecone will reject them with an error.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Double-check your `PINECONE_API_KEY` (or the `api_key` passed to `pinecone.Pinecone()`). With `pinecone-client` v3+ and serverless indexes, no environment setting is needed; only legacy clients (<3.0.0) required a `PINECONE_ENVIRONMENT` matching your Pinecone project's region (e.g., 'us-west-2', 'gcp-starter'). Using environment variables is recommended: `export PINECONE_API_KEY='...'`.","message":"Incorrect or missing Pinecone API key or environment configuration. This is a common setup issue that leads to authentication or connection errors.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Ensure the index is created in Pinecone before calling `from_existing_index()`. `PineconeVectorStore.from_documents()` likewise expects the index to exist; create it first with `pc.create_index(...)` (as in the quickstart) and then pass the `index_name` parameter.","message":"When using `PineconeVectorStore.from_existing_index()`, the specified Pinecone index must already exist. If it does not, this method will raise an error.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Upgrade your `langchain` and `langchain-pinecone` packages and use `from langchain_pinecone import PineconeVectorStore`. 
Migrate any direct imports from `langchain.vectorstores`.","message":"Older versions of LangChain (prior to v0.1.0) provided this integration as `langchain.vectorstores.Pinecone`. The recommended approach for LangChain v0.1.x and newer is the dedicated `langchain-pinecone` package.","severity":"deprecated","affected_versions":"langchain <0.1.0"}],"env_vars":null,"last_verified":"2026-04-12T00:00:00.000Z","next_check":"2026-07-11T00:00:00.000Z"}