{"id":3533,"library":"langchain-huggingface","title":"LangChain Hugging Face Integration","description":"Langchain-huggingface provides integrations to leverage Hugging Face models and pipelines within the LangChain ecosystem. This includes support for various Hugging Face LLMs and embeddings, allowing users to connect to the Hugging Face Hub, Inference Endpoints, or run models locally via the transformers library. As of its current version 1.2.1, it is a dedicated integration package that follows LangChain's modular architecture, with frequent updates aligning with the broader LangChain ecosystem.","status":"active","version":"1.2.1","language":"en","source_language":"en","source_url":"https://github.com/langchain-ai/langchain","tags":["LLM","Hugging Face","LangChain","AI","NLP","transformers","embeddings","chat"],"install":[{"cmd":"pip install langchain-huggingface","lang":"bash","label":"Install package"}],"dependencies":[{"reason":"Core LangChain functionality; required by all LangChain integration packages.","package":"langchain-core","optional":false},{"reason":"Required for interacting with the Hugging Face Hub (downloading models, using inference APIs).","package":"huggingface-hub","optional":false},{"reason":"Required for loading and running Hugging Face models locally via pipelines.","package":"transformers","optional":false}],"imports":[{"note":"The original `langchain` package was monolithic; the deprecated `HuggingFaceHub` LLM is superseded by `ChatHuggingFace` in the dedicated integration package.","wrong":"from langchain.llms import HuggingFaceHub","symbol":"ChatHuggingFace","correct":"from langchain_huggingface.chat_models import ChatHuggingFace"},{"note":"Moved from the monolithic `langchain` package to the dedicated integration package.","wrong":"from langchain.llms import HuggingFacePipeline","symbol":"HuggingFacePipeline","correct":"from langchain_huggingface.llms import HuggingFacePipeline"},{"note":"Embeddings were also moved to the dedicated integration package.","wrong":"from langchain.embeddings import HuggingFaceEmbeddings","symbol":"HuggingFaceEmbeddings","correct":"from langchain_huggingface.embeddings import HuggingFaceEmbeddings"}],"quickstart":{"code":"import os\nfrom langchain_huggingface.llms import HuggingFacePipeline\nfrom langchain_core.prompts import PromptTemplate\nfrom langchain_core.output_parsers import StrOutputParser\n\n# Set your Hugging Face API token if accessing models from the Hub\n# os.environ[\"HUGGINGFACEHUB_API_TOKEN\"] = \"hf_...\"\n\n# Initialize the HuggingFacePipeline with a small, accessible model\n# Ensure 'transformers' and a deep learning backend (e.g., 'torch') are installed.\nllm = HuggingFacePipeline.from_model_id(\n    model_id=\"google/flan-t5-small\",\n    task=\"text2text-generation\",\n    pipeline_kwargs={\"max_new_tokens\": 100},\n    # Pass a token explicitly if needed, e.g., for gated or private models\n    # model_kwargs={\"token\": os.environ.get(\"HUGGINGFACEHUB_API_TOKEN\")}\n)\n\n# Create a simple prompt template\ntemplate = \"Question: {question}\\nAnswer:\"\nprompt = PromptTemplate.from_template(template)\n\n# Create a chain\nchain = prompt | llm | StrOutputParser()\n\n# Invoke the chain\nquestion = \"What is the capital of France?\"\nresponse = chain.invoke({\"question\": question})\nprint(response)\n","lang":"python","description":"This quickstart demonstrates how to use `HuggingFacePipeline` to load a model and perform text generation. It uses `google/flan-t5-small` as an example, which works well for `text2text-generation` tasks. Ensure `transformers` and a deep learning backend (like `torch`) are installed."},"warnings":[{"fix":"Update all relevant import statements to target the new `langchain_huggingface` package structure (e.g., `from langchain_huggingface.llms import HuggingFacePipeline`).","message":"The LangChain ecosystem transitioned from a monolithic 'langchain' package to modular 'langchain-*' integration packages. Components previously imported directly from `langchain` (e.g., `langchain.llms.HuggingFaceHub`) are now in `langchain-huggingface` (e.g., `langchain_huggingface.llms.HuggingFacePipeline`).","severity":"breaking","affected_versions":"Migrations from langchain<0.1.0 to langchain-huggingface>=0.1.0"},{"fix":"For local inference, choose smaller, quantized models, or ensure adequate GPU/CPU resources. For larger models, consider using Hugging Face Inference Endpoints or other cloud-based LLM APIs.","message":"Running large Hugging Face models locally requires significant hardware resources (RAM/VRAM). Attempting to load or run large models on insufficient hardware will lead to slow inference, out-of-memory errors, or crashes.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Obtain an API token from Hugging Face and set it as an environment variable (e.g., `export HUGGINGFACEHUB_API_TOKEN='hf_...'`) or pass it explicitly to the model constructor via `model_kwargs`.","message":"Accessing many Hugging Face models via the Hub Inference API, or downloading private models, requires a `HUGGINGFACEHUB_API_TOKEN`. Without it, you may encounter authentication errors or rate limits.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Only enable `trust_remote_code=True` for models from trusted sources. Understand the implications before using it. If possible, prefer models that do not require custom code.","message":"Using `trust_remote_code=True` when loading models from Hugging Face can be a security risk, as it executes arbitrary code from the model repository. This is sometimes required for custom architectures or tokenizers.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Always install `langchain-huggingface` first to allow `pip` to resolve compatible dependencies. Use a virtual environment to isolate dependencies. If conflicts occur, check the dependency constraints in `langchain-huggingface`'s `pyproject.toml` for exact version ranges.","message":"Version conflicts can arise between `langchain-huggingface`, `langchain-core`, `transformers`, and `huggingface-hub`. These packages often have strict dependency ranges, and mismatched versions can cause unexpected behavior or import errors.","severity":"gotcha","affected_versions":"All versions"}],"env_vars":null,"last_verified":"2026-04-11T00:00:00.000Z","next_check":"2026-07-10T00:00:00.000Z"}