Each Databricks app can include dependencies for Python, Node.js, or both. You define these dependencies in language-specific files:
- Use a requirements.txt file to specify Python packages installed with pip. See Define Python dependencies with pip.
- Use a pyproject.toml file to specify Python packages installed with uv. See Define Python dependencies with uv.
- Use a package.json file to specify Node.js packages. See Define Node.js dependencies.
## Define Python dependencies with pip
Apps that use pip come with a set of pre-installed Python libraries. To define additional Python libraries, use a requirements.txt file. If any listed packages match pre-installed ones, the versions in your file override the defaults.
For example:
```text
# Override default version of dash
dash==2.10.0

# Add additional libraries not pre-installed
requests==2.31.0
numpy==1.24.3

# Specify a compatible version range
scikit-learn>=1.2.0,<1.3.0
```
### Pre-installed Python libraries
The following Python libraries are pre-installed for pip-based apps. You don't need to include them in your requirements.txt unless you require a different version.
| Library | Version |
|---|---|
| databricks-sql-connector | 3.4.0 |
| databricks-sdk | 0.33.0 |
| mlflow-skinny | 2.16.2 |
| gradio | 4.44.0 |
| streamlit | 1.38.0 |
| shiny | 1.1.0 |
| dash | 2.18.1 |
| flask | 3.0.3 |
| fastapi | 0.115.0 |
| uvicorn[standard] | 0.30.6 |
| gunicorn | 23.0.0 |
| huggingface-hub | 0.35.3 |
| dash-ag-grid | 31.2.0 |
| dash-mantine-components | 0.14.4 |
| dash-bootstrap-components | 1.6.0 |
| plotly | 5.24.1 |
| plotly-resampler | 0.10.0 |
## Define Python dependencies with uv
If your app uses uv for dependency management, define Python dependencies in a pyproject.toml file instead of requirements.txt. Pre-installed libraries aren't available for uv-based apps, so you must declare all dependencies in your pyproject.toml. Unlike pip-based apps, which run on Python 3.11, uv-based apps can target any Python version through the requires-python field.
During deployment, Databricks Apps selects an install strategy based on which files are present:
- If requirements.txt exists, the app uses pip to install dependencies, regardless of whether pyproject.toml is also present. requirements.txt always takes precedence.
- If requirements.txt doesn't exist and both pyproject.toml and uv.lock exist, the app uses uv to install dependencies from the lock file.
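For example, given a hypothetical app directory named my-app, the installer is selected as follows:

```text
# pip is used: requirements.txt takes precedence even though pyproject.toml exists
my-app/
├── app.py
├── requirements.txt
└── pyproject.toml

# uv is used: pyproject.toml and uv.lock are present, requirements.txt is not
my-app/
├── app.py
├── pyproject.toml
└── uv.lock
```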
The uv installer creates and manages its own virtual environment, so you don't need to create a .venv directory.
The following example shows a minimal pyproject.toml for a Databricks app:
```toml
[project]
name = "my-app"
requires-python = ">=3.11"
dependencies = [
    "dash==2.10.0",
    "requests==2.31.0",
]
```
To use uv, you must include a uv.lock file alongside your pyproject.toml. Generate it by running uv lock locally and include it in your app directory.
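To create the lock file, run uv locally in the directory that contains your pyproject.toml. This is a sketch assuming a recent uv release; check the options available in your installed version:

```bash
# Generate or refresh uv.lock from pyproject.toml
uv lock

# Verify the lock file is up to date before deploying (newer uv releases)
uv lock --check
```

Commit the resulting uv.lock alongside pyproject.toml so every deployment resolves the same dependency versions.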
## Define Node.js dependencies
To define Node.js libraries, include a package.json file in the root of your app. During deployment, Azure Databricks detects this file and runs npm install to install all dependencies listed in it.
For example, a package.json file for a React app using Vite might look like this:
```json
{
  "name": "react-fastapi-app",
  "version": "1.0.0",
  "private": true,
  "type": "module",
  "scripts": {
    "build": "npm run build:frontend",
    "build:frontend": "vite build frontend"
  },
  "dependencies": {
    "react": "^18.2.0",
    "react-dom": "^18.2.0",
    "typescript": "^5.0.0",
    "vite": "^5.0.0",
    "@vitejs/plugin-react": "^4.2.0",
    "@types/react": "^18.2.0",
    "@types/react-dom": "^18.2.0"
  }
}
```
Note
List all packages required for npm run build under dependencies, not devDependencies. If you set NODE_ENV=production in your environment variables, the deployment process skips installing devDependencies.
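For reference, this is a sketch of how NODE_ENV might be set, using the same env format shown later for private repositories; setting it to production is optional and triggers the devDependencies-skipping behavior described above:

```yaml
env:
  - name: NODE_ENV
    value: production
```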
## Avoid version conflicts
Keep the following in mind when you define dependencies:
- For pip-based apps, overriding pre-installed packages can cause compatibility issues if your specified version differs significantly from the pre-installed one.
- Always test your app to ensure that package version changes don't introduce errors.
- Pinning explicit versions in requirements.txt helps maintain consistent app behavior across deployments.
- When using uv, include a uv.lock file for fully reproducible installs across deployments.
## Dependency installation and management
Libraries defined in requirements.txt, pyproject.toml, and package.json are installed directly on the container running on your dedicated compute. You're responsible for managing and patching these dependencies.
You can specify libraries from multiple sources in your dependency files:
- Libraries downloaded from public repositories like PyPI and npm
- Private repositories that authenticate using credentials stored in Azure Databricks secrets
- Libraries stored in your /Volumes/ directory (for example, /Volumes/<catalog>/<schema>/<volume>/<path>)
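Putting these sources together, a single requirements.txt can mix entries; the package name and volume path below are hypothetical placeholders:

```text
# Public package from PyPI
requests==2.31.0

# Wheel file from a Unity Catalog volume (hypothetical path)
/Volumes/main/default/libs/my_package-1.0.0-py3-none-any.whl
```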
### Install from private repositories
To install packages from a private repository, configure environment variables for authentication. For example, set PIP_INDEX_URL to point to your private repository:
```yaml
env:
  - name: PIP_INDEX_URL
    valueFrom: my-pypi-secret
```
Your workspace network configuration must allow access to the private repository. See Configure networking for Databricks Apps.
### Install wheel files from Unity Catalog volumes
To install Python packages from wheel files stored in Unity Catalog volumes:
- Add the Unity Catalog volume as a resource to your app. See Unity Catalog volume.
- Reference the full wheel file path directly in your requirements.txt:

```text
/Volumes/<catalog>/<schema>/<volume>/my_package-1.0.0-py3-none-any.whl
```
Note
Environment variable references are not supported in requirements.txt. You must hardcode the full wheel file path.
To enhance security when accessing external package repositories, use serverless egress controls to restrict access to public repositories and configure private networking. See Configure networking for Databricks Apps.