[Doc] Add sphinx build for vllm-ascend (#55)

### What this PR does / why we need it?

This patch enables the documentation build for vllm-ascend:

- Add a Sphinx build for vllm-ascend
- Enable Read the Docs for vllm-ascend
- Fix CI:
  - Exclude `vllm-empty/tests/mistral_tool_use` to skip the `You need to agree
    to share your contact information to access this model` failure introduced
    in 314cfade02.
  - Install the test requirements to fix
    https://github.com/vllm-project/vllm-ascend/actions/runs/13304112758/job/37151690770:
      ```
      vllm-empty/tests/mistral_tool_use/conftest.py:4: in <module>
          import pytest_asyncio
      E   ModuleNotFoundError: No module named 'pytest_asyncio'
      ```
  - Exclude docs-only changes from triggering the test workflow (a sketch of the negated path filter follows this list).
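
For reference, the docs exclusion relies on a negated glob in the workflow's `paths` filter. A minimal sketch under assumed trigger contents (the real change is in the workflow diff below):

```yaml
# Sketch: a push that only touches files under docs/ matches no positive
# pattern once the negation applies, so the test job is not triggered.
on:
  push:
    paths:
      - '**/*.py'   # positive patterns opt matching paths in
      - '!docs/**'  # later negated pattern opts docs-only changes back out
```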

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
1. Tested locally:
    ```bash
    # Install dependencies.
    pip install -r requirements-docs.txt
    
    # Build the docs and preview
    make clean; make html; python -m http.server -d build/html/
    ```
    
    Launch your browser and open http://localhost:8000/.

2. CI passed with preview:
    https://vllm-ascend--55.org.readthedocs.build/en/55/
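
As an optional convenience when iterating on the docs locally (not part of this PR), `sphinx-autobuild` can replace the manual `make html` + `http.server` loop; a sketch assuming it is installed and run from the repository root:

```bash
# Rebuild and re-serve the docs on every source change.
pip install sphinx-autobuild
sphinx-autobuild docs/source docs/_build/html --port 8000
```

Pages served at http://localhost:8000/ then refresh automatically after edits.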

Signed-off-by: Yikun Jiang <yikunkero@gmail.com>
Author: Yikun Jiang
Date: 2025-02-13 18:44:17 +08:00
Committed by: GitHub
Parent: 63b11ec7e9
Commit: 46977f9f06
22 changed files with 255 additions and 36 deletions


@@ -25,6 +25,7 @@ on:
       - '*.txt'
       - '**/*.py'
       - '.github/workflows/vllm_ascend_test.yaml'
+      - '!docs/**'
   pull_request:
     branches:
       - "main"
@@ -32,6 +33,7 @@ on:
       - '*.txt'
       - '**/*.py'
       - '.github/workflows/vllm_ascend_test.yaml'
+      - '!docs/**'

 # Bash shells do not use ~/.profile or ~/.bashrc so these shells need to be explicitly
 # declared as "shell: bash -el {0}" on steps that need to be properly activated.

.readthedocs.yaml (new file, 20 lines)

@@ -0,0 +1,20 @@
# Read the Docs configuration file
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

version: 2

build:
  os: ubuntu-22.04
  tools:
    python: "3.12"

sphinx:
  configuration: docs/source/conf.py
  fail_on_warning: true

# If using Sphinx, optionally build your docs in additional formats such as PDF
formats: []

# Optionally declare the Python requirements required to build your docs
python:
  install:
    - requirements: docs/requirements-docs.txt


@@ -1,7 +1,7 @@
 <p align="center">
   <picture>
-    <source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/vllm-project/vllm-ascend/main/docs/logos/vllm-ascend-logo-text-dark.png">
-    <img alt="vllm-ascend" src="https://raw.githubusercontent.com/vllm-project/vllm-ascend/main/docs/logos/vllm-ascend-logo-text-light.png" width=55%>
+    <source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/vllm-project/vllm-ascend/main/docs/source/logos/vllm-ascend-logo-text-dark.png">
+    <img alt="vllm-ascend" src="https://raw.githubusercontent.com/vllm-project/vllm-ascend/main/docs/source/logos/vllm-ascend-logo-text-light.png" width=55%>
   </picture>
 </p>
@@ -71,7 +71,7 @@ curl http://localhost:8000/v1/models
 **Please refer to [official docs](./docs/index.md) for more details.**

 ## Contributing
-See [CONTRIBUTING](./CONTRIBUTING.md) for more details, which is a step-by-step guide to help you set up development environment, build and test.
+See [CONTRIBUTING](docs/source/developer_guide/contributing.md) for more details, which is a step-by-step guide to help you set up development environment, build and test.

 We welcome and value any contributions and collaborations:
 - Please feel free comments [here](https://github.com/vllm-project/vllm-ascend/issues/19) about your usage of vLLM Ascend Plugin.


@@ -1,7 +1,7 @@
 <p align="center">
   <picture>
-    <source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/vllm-project/vllm-ascend/main/docs/logos/vllm-ascend-logo-text-dark.png">
-    <img alt="vllm-ascend" src="https://raw.githubusercontent.com/vllm-project/vllm-ascend/main/docs/logos/vllm-ascend-logo-text-light.png" width=55%>
+    <source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/vllm-project/vllm-ascend/main/docs/source/logos/vllm-ascend-logo-text-dark.png">
+    <img alt="vllm-ascend" src="https://raw.githubusercontent.com/vllm-project/vllm-ascend/main/docs/source/logos/vllm-ascend-logo-text-light.png" width=55%>
   </picture>
 </p>
@@ -72,7 +72,7 @@ curl http://localhost:8000/v1/models
 **请参阅 [官方文档](./docs/index.md)以获取更多详细信息**

 ## 贡献
-有关更多详细信息,请参阅 [CONTRIBUTING](./CONTRIBUTING.md),可以更详细的帮助您部署开发环境、构建和测试。
+有关更多详细信息,请参阅 [CONTRIBUTING](docs/source/developer_guide/contributing.zh.md),可以更详细的帮助您部署开发环境、构建和测试。

 我们欢迎并重视任何形式的贡献与合作:
 - 您可以在[这里](https://github.com/vllm-project/vllm-ascend/issues/19)反馈您的使用体验。

docs/Makefile (new file, 21 lines)

@@ -0,0 +1,21 @@
# Minimal makefile for Sphinx documentation
#

# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS    ?=
SPHINXBUILD   ?= sphinx-build
SOURCEDIR     = source
BUILDDIR      = _build

# Put it first so that "make" without argument is like "make help".
help:
	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

.PHONY: help Makefile

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

docs/README.md (new file, 21 lines)

@ -0,0 +1,21 @@
# vLLM Ascend Plugin documents
## Build the docs
```bash
# Install dependencies.
pip install -r requirements-docs.txt
# Build the docs.
make clean
make html
```
## Open the docs with your browser
```bash
python -m http.server -d build/html/
```
Launch your browser and open http://localhost:8000/.


@@ -1,15 +0,0 @@
# Ascend plugin for vLLM

vLLM Ascend plugin (vllm-ascend) is a community maintained hardware plugin for running vLLM on the Ascend NPU.

This plugin is the recommended approach for supporting the Ascend backend within the vLLM community. It adheres to the principles outlined in the [[RFC]: Hardware pluggable](https://github.com/vllm-project/vllm/issues/11162), providing a hardware-pluggable interface that decouples the integration of the Ascend NPU with vLLM.

By using vLLM Ascend plugin, popular open-source models, including Transformer-like, Mixture-of-Expert, Embedding, Multi-modal LLMs can run seamlessly on the Ascend NPU.

## Contents
- [Quick Start](./quick_start.md)
- [Installation](./installation.md)
- Usage
  - [Running vLLM with Ascend](./usage/running_vllm_with_ascend.md)
  - [Feature Support](./usage/feature_support.md)
  - [Supported Models](./usage/supported_models.md)


@@ -0,0 +1,8 @@
sphinx==6.2.1
sphinx-argparse==0.4.0
sphinx-book-theme==1.0.1
sphinx-copybutton==0.5.2
sphinx-design==0.6.1
sphinx-togglebutton==0.3.2
myst-parser==3.0.1
msgspec


@@ -0,0 +1,2 @@
pytest-asyncio

docs/source/conf.py (new file, 105 lines)

@@ -0,0 +1,105 @@
#
# Copyright (c) 2025 Huawei Technologies Co., Ltd. All Rights Reserved.
# This file is a part of the vllm-ascend project.
# Adapted from vllm-project/vllm/docs/source/conf.py
# Copyright 2023 The vLLM team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# -- Path setup --------------------------------------------------------------
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))
# -- Project information -----------------------------------------------------
project = 'vllm-ascend'
copyright = '2025, vllm-ascend team'
author = 'the vllm-ascend team'
# The full version, including alpha/beta/rc tags
release = ''
# -- General configuration ---------------------------------------------------
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
# Copy from https://github.com/vllm-project/vllm/blob/main/docs/source/conf.py
extensions = [
    "sphinx.ext.napoleon",
    "sphinx.ext.intersphinx",
    "sphinx_copybutton",
    "sphinx.ext.autodoc",
    "sphinx.ext.autosummary",
    "myst_parser",
    "sphinxarg.ext",
    "sphinx_design",
    "sphinx_togglebutton",
]
myst_enable_extensions = [
    "colon_fence",
]
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
language = 'en'
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This pattern also affects html_static_path and html_extra_path.
exclude_patterns = [
    '_build',
    'Thumbs.db',
    '.DS_Store',
    '.venv',
    'README.md',
    # TODO(yikun): Remove this after zh supported
    'developer_guide/contributing.zh.md'
]
# -- Options for HTML output -------------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
html_title = project
html_theme = 'sphinx_book_theme'
html_logo = 'logos/vllm-ascend-logo-text-light.png'
html_theme_options = {
    'path_to_docs': 'docs/source',
    'repository_url': 'https://github.com/vllm-project/vllm-ascend',
    'use_repository_button': True,
    'use_edit_page_button': True,
}
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
# html_static_path = ['_static']
def setup(app):
    pass


@@ -1,4 +1,4 @@
-# Contributing to vLLM Ascend plugin
+# Contributing

 ## Building and testing

 It's recommended to set up a local development environment to build and test
@@ -45,7 +45,7 @@ git commit -sm "your commit info"
 ### Testing

-Although vllm-ascend CI provide integration test on [Ascend](.github/workflows/vllm_ascend_test.yaml), you can run it
+Although vllm-ascend CI provide integration test on [Ascend](https://github.com/vllm-project/vllm-ascend/blob/main/.github/workflows/vllm_ascend_test.yaml), you can run it
 locally. The simplest way to run these integration tests locally is through a container:

 ```bash


@@ -1,4 +1,4 @@
-# 为 vLLM 昇腾插件贡献
+# 贡献指南

 ## 构建与测试

 我们推荐您在提交PR之前在本地开发环境进行构建和测试。
@@ -41,7 +41,7 @@ git commit -sm "your commit info"
 ```

 ### 测试

-虽然 vllm-ascend CI 提供了对 [Ascend](.github/workflows/vllm_ascend_test.yaml) 的集成测试,但您也可以在本地运行它。在本地运行这些集成测试的最简单方法是通过容器:
+虽然 vllm-ascend CI 提供了对 [Ascend](https://github.com/vllm-project/vllm-ascend/blob/main/.github/workflows/vllm_ascend_test.yaml) 的集成测试,但您也可以在本地运行它。在本地运行这些集成测试的最简单方法是通过容器:

 ```bash
 # 基于昇腾NPU环境

docs/source/index.md (new file, 53 lines)

@@ -0,0 +1,53 @@
# Welcome to vLLM Ascend Plugin

:::{figure} ./logos/vllm-ascend-logo-text-light.png
:align: center
:alt: vLLM
:class: no-scaled-link
:width: 70%
:::

:::{raw} html
<p style="text-align:center">
<strong>vLLM Ascend Plugin
</strong>
</p>

<p style="text-align:center">
<script async defer src="https://buttons.github.io/buttons.js"></script>
<a class="github-button" href="https://github.com/vllm-project/vllm-ascend" data-show-count="true" data-size="large" aria-label="Star">Star</a>
<a class="github-button" href="https://github.com/vllm-project/vllm-ascend/subscription" data-icon="octicon-eye" data-size="large" aria-label="Watch">Watch</a>
<a class="github-button" href="https://github.com/vllm-project/vllm-ascend/fork" data-icon="octicon-repo-forked" data-size="large" aria-label="Fork">Fork</a>
</p>
:::

vLLM Ascend plugin (vllm-ascend) is a community maintained hardware plugin for running vLLM on the Ascend NPU.

This plugin is the recommended approach for supporting the Ascend backend within the vLLM community. It adheres to the principles outlined in the [[RFC]: Hardware pluggable](https://github.com/vllm-project/vllm/issues/11162), providing a hardware-pluggable interface that decouples the integration of the Ascend NPU with vLLM.

By using vLLM Ascend plugin, popular open-source models, including Transformer-like, Mixture-of-Expert, Embedding, Multi-modal LLMs can run seamlessly on the Ascend NPU.

## Documentation

% How to start using vLLM on Ascend NPU?
:::{toctree}
:caption: Getting Started
:maxdepth: 1
quick_start
installation
:::

% What does vLLM Ascend Plugin support?
:::{toctree}
:caption: Features
:maxdepth: 1
features/suppoted_features
features/supported_models
:::

% How to contribute to the vLLM project
:::{toctree}
:caption: Developer Guide
:maxdepth: 1
developer_guide/contributing
:::


@@ -1,6 +1,6 @@
 # Installation

-### 1. Dependencies
+## Dependencies
 | Requirement | Supported version | Recommended version | Note |
 | ------------ | ------- | ----------- | ----------- |
 | Python | >= 3.9 | [3.10](https://www.python.org/downloads/) | Required for vllm |
@@ -8,11 +8,11 @@
 | torch-npu | >= 2.4.0 | [2.5.1rc1](https://gitee.com/ascend/pytorch/releases/tag/v6.0.0.alpha001-pytorch2.5.1) | Required for vllm-ascend |
 | torch | >= 2.4.0 | [2.5.1](https://github.com/pytorch/pytorch/releases/tag/v2.5.1) | Required for torch-npu and vllm required |

-### 2. Prepare Ascend NPU environment
+## Prepare Ascend NPU environment

 Below is a quick note to install recommended version software:

-#### Containerized installation
+### Containerized installation

 You can use the [container image](https://hub.docker.com/r/ascendai/cann) directly with one line command:
@@ -33,13 +33,13 @@ docker run \
 You do not need to install `torch` and `torch_npu` manually, they will be automatically installed as `vllm-ascend` dependencies.

-#### Manual installation
+### Manual installation

 Or follow the instructions provided in the [Ascend Installation Guide](https://ascend.github.io/docs/sources/ascend/quick_install.html) to set up the environment.

-### 3. Building
+## Building

-#### Build Python package from source
+### Build Python package from source

 ```bash
 git clone https://github.com/vllm-project/vllm-ascend.git
@@ -47,7 +47,7 @@ cd vllm-ascend
 pip install -e .
 ```

-#### Build container image from source
+### Build container image from source

 ```bash
 git clone https://github.com/vllm-project/vllm-ascend.git
 cd vllm-ascend

(Two binary image files, 153 KiB each, unchanged in size; based on the README path updates these are the vllm-ascend logo images relocated under docs/source/logos/.)


@@ -1,6 +1,6 @@
 # Quickstart

-## 1. Prerequisites
+## Prerequisites

 ### Supported Devices
 - Atlas A2 Training series (Atlas 800T A2, Atlas 900 A2 PoD, Atlas 200T A2 Box16, Atlas 300T A2)
@@ -48,7 +48,7 @@ You will see following message:
 ```

-## 2. Installation
+## Installation

 Prepare:
@@ -84,7 +84,7 @@ cd ..
 ```

-## 3. Usage
+## Usage

 After vLLM and vLLM Ascend plugin installation, you can start to
 try [vLLM QuickStart](https://docs.vllm.ai/en/latest/getting_started/quickstart.html).


@@ -1 +0,0 @@
# Running vLLM with Ascend


@@ -15,6 +15,7 @@ norecursedirs =
     vllm-empty/tests/compile
     vllm-empty/tests/lora
     vllm-empty/tests/models
+    vllm-empty/tests/mistral_tool_use
     vllm-empty/tests/multimodal
     vllm-empty/tests/standalone_tests
     vllm-empty/tests/async_engine


@@ -1,3 +1,5 @@
 -r requirements-lint.txt
 modelscope
 pytest >= 6.0
+pytest-asyncio