Databricks
The Databricks Data Intelligence Platform is the world's first data intelligence platform powered by generative AI, letting you infuse AI into every facet of your business.
Databricks embraces the LangChain ecosystem in various ways:
- Model Serving - Access state-of-the-art LLMs, such as DBRX, Llama3, Mixtral, or your fine-tuned models, on Databricks Model Serving via a highly available and low-latency inference endpoint. LangChain provides LLM (`Databricks`), Chat Model (`ChatDatabricks`), and Embeddings (`DatabricksEmbeddings`) implementations, streamlining the integration of models hosted on Databricks Model Serving with your LangChain applications.
- Vector Search - Databricks Vector Search is a serverless vector database seamlessly integrated within the Databricks Platform. Using `DatabricksVectorSearch`, you can incorporate this highly scalable and reliable similarity search engine into your LangChain applications (a minimal sketch follows this list).
- MLflow - MLflow is an open-source platform for managing the full ML lifecycle, including experiment management, evaluation, tracing, deployment, and more. MLflow's LangChain integration streamlines the process of developing and operating modern compound ML systems.
- SQL Database - Databricks SQL is integrated with `SQLDatabase` in LangChain, allowing you to access the auto-optimizing, exceptionally performant data warehouse.
- Open Models - Databricks open-sources models, such as DBRX, which are available through the Hugging Face Hub. These models can be used directly with LangChain, leveraging its integration with the `transformers` library.
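For example, the Vector Search and MLflow integrations plug into a LangChain application with a few lines of code. The following is a minimal sketch, assuming `databricks-langchain` and a recent `mlflow` are installed, your environment is authenticated to a Databricks workspace, and a vector search index named `main.default.docs_index` (a hypothetical name) already exists:

```python
import mlflow
from databricks_langchain import DatabricksVectorSearch

# Enable MLflow tracing for LangChain components (each chain/model call is traced).
mlflow.langchain.autolog()

# Connect to an existing Vector Search index. The index name is a placeholder;
# for an index with Databricks-managed embeddings only the name is required,
# otherwise also pass `embedding=` and `text_column=`.
vector_store = DatabricksVectorSearch(index_name="main.default.docs_index")

# Use the vector store as a retriever in any LangChain retrieval chain.
retriever = vector_store.as_retriever(search_kwargs={"k": 3})
for doc in retriever.invoke("What is Databricks Model Serving?"):
    print(doc.page_content)
```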
Installation
First-party Databricks integrations are now available in the databricks-langchain partner package.
```bash
pip install databricks-langchain
```
The legacy `langchain-databricks` partner package is still available but will soon be deprecated.
Chat Model
`ChatDatabricks` is a Chat Model class to access chat endpoints hosted on Databricks, including state-of-the-art models such as Llama3, Mixtral, and DBRX, as well as your own fine-tuned models.
```python
from databricks_langchain import ChatDatabricks

chat_model = ChatDatabricks(endpoint="databricks-meta-llama-3-70b-instruct")
```
See the usage example for more guidance on how to use it within your LangChain application.
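A minimal usage sketch, assuming the endpoint above is enabled in your workspace and your environment is authenticated to Databricks:

```python
from databricks_langchain import ChatDatabricks

chat_model = ChatDatabricks(
    endpoint="databricks-meta-llama-3-70b-instruct",
    temperature=0.1,
    max_tokens=256,
)

# Invoke with a list of (role, content) tuples; a list of message objects works too.
response = chat_model.invoke(
    [
        ("system", "You are a helpful assistant for Databricks users."),
        ("human", "What is Databricks Model Serving?"),
    ]
)
print(response.content)
```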
LLM
`Databricks` is an LLM class to access completion endpoints hosted on Databricks.
Text completion models have been deprecated, and the latest and most popular models are chat completion models. Use the `ChatDatabricks` chat model instead to access those models and advanced features such as tool calling.
```python
from langchain_community.llms import Databricks

# Name of your completion serving endpoint (placeholder value).
llm = Databricks(endpoint_name="your-completion-endpoint")
```
See the usage example for more guidance on how to use it within your LangChain application.
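A minimal usage sketch, assuming a text-completion serving endpoint exists in your workspace (the endpoint name below is a placeholder):

```python
from langchain_community.llms import Databricks

# Hypothetical completion endpoint name; replace with your own serving endpoint.
llm = Databricks(endpoint_name="your-completion-endpoint")

# The LLM interface takes a plain prompt string and returns the completion text.
print(llm.invoke("Summarize what a data lakehouse is in one sentence."))
```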