vllm-ascend/docs/source/installation.md
Yikun Jiang 46977f9f06 [Doc] Add sphinx build for vllm-ascend (#55)
### What this PR does / why we need it?

This patch enables the doc build for vllm-ascend:

- Add sphinx build for vllm-ascend
- Enable readthedocs for vllm-ascend
- Fix CI:
  - Exclude `vllm-empty/tests/mistral_tool_use` to skip the `You need to agree
    to share your contact information to access this model` error, which was
    introduced in 314cfade02
  - Install the test requirements to fix
    https://github.com/vllm-project/vllm-ascend/actions/runs/13304112758/job/37151690770:
      ```
      vllm-empty/tests/mistral_tool_use/conftest.py:4: in <module>
          import pytest_asyncio
      E   ModuleNotFoundError: No module named 'pytest_asyncio'
      ```
  - Exclude docs PRs

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
1. Tested locally:
    ```bash
    # Install dependencies.
    pip install -r requirements-docs.txt
    
    # Build the docs and preview
    make clean; make html; python -m http.server -d build/html/
    ```
    
    Launch a browser and open http://localhost:8000/.

2. CI passed with preview:
    https://vllm-ascend--55.org.readthedocs.build/en/55/

Signed-off-by: Yikun Jiang <yikunkero@gmail.com>
2025-02-13 18:44:17 +08:00


# Installation

## Dependencies

| Requirement | Supported version | Recommended version | Note                                   |
|-------------|-------------------|---------------------|----------------------------------------|
| Python      | >= 3.9            | 3.10                | Required for vllm                      |
| CANN        | >= 8.0.RC2        | 8.0.RC3             | Required for vllm-ascend and torch-npu |
| torch-npu   | >= 2.4.0          | 2.5.1rc1            | Required for vllm-ascend               |
| torch       | >= 2.4.0          | 2.5.1               | Required for torch-npu and vllm        |
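
Once your environment is set up, you can check the installed versions against the table with a quick script like the one below (a rough sketch; the CANN info file path assumes the default install prefix under `/usr/local/Ascend` and may differ on your system):

```bash
# Check Python, torch and torch-npu versions against the table above.
python3 --version
python3 -c "import torch; print('torch', torch.__version__)"
python3 -c "import torch_npu; print('torch-npu', torch_npu.__version__)"

# CANN version (path assumes the default /usr/local/Ascend install prefix).
cat /usr/local/Ascend/ascend-toolkit/latest/*/ascend_toolkit_install.info
```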

## Prepare Ascend NPU environment

Below is a quick note on installing the recommended software versions:

### Containerized installation

You can use the container image directly with a one-line command:

```bash
docker run \
    --name vllm-ascend-env \
    --device /dev/davinci1 \
    --device /dev/davinci_manager \
    --device /dev/devmm_svm \
    --device /dev/hisi_hdc \
    -v /usr/local/dcmi:/usr/local/dcmi \
    -v /usr/local/bin/npu-smi:/usr/local/bin/npu-smi \
    -v /usr/local/Ascend/driver/lib64/:/usr/local/Ascend/driver/lib64/ \
    -v /usr/local/Ascend/driver/version.info:/usr/local/Ascend/driver/version.info \
    -v /etc/ascend_install.info:/etc/ascend_install.info \
    -it quay.io/ascend/cann:8.0.rc3.beta1-910b-ubuntu22.04-py3.10 bash
```

You do not need to install `torch` and `torch_npu` manually; they will be installed automatically as vllm-ascend dependencies.
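
Once inside the container, you can optionally confirm that the NPU devices are visible; `npu-smi` is mounted from the host by the command above:

```bash
# Inside the container: list the NPUs exposed through the mounted driver.
npu-smi info
```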

### Manual installation

Alternatively, follow the instructions provided in the Ascend Installation Guide to set up the environment.
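
For reference, a manual CANN setup typically looks like the sketch below. This is only an illustrative outline: the `.run` file names are examples based on the recommended versions above, and the exact packages you need come from the Ascend download page, so follow the Ascend Installation Guide for the authoritative steps.

```bash
# Illustrative only: install the CANN toolkit and kernel packages downloaded
# from the Ascend site (file names below are examples, adjust to your download).
chmod +x ./Ascend-cann-toolkit_8.0.RC3_linux-aarch64.run
./Ascend-cann-toolkit_8.0.RC3_linux-aarch64.run --install

chmod +x ./Ascend-cann-kernels-910b_8.0.RC3_linux.run
./Ascend-cann-kernels-910b_8.0.RC3_linux.run --install

# Make the CANN environment available in the current shell (default prefix assumed).
source /usr/local/Ascend/ascend-toolkit/set_env.sh
```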

## Building

### Build Python package from source

```bash
git clone https://github.com/vllm-project/vllm-ascend.git
cd vllm-ascend
pip install -e .
```
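
After the editable install completes, a quick import check confirms the package is picked up (a minimal check; it assumes the package is importable as `vllm_ascend`):

```bash
# Sanity check: the module name is assumed to be vllm_ascend.
python3 -c "import vllm_ascend; print(vllm_ascend.__file__)"
```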

### Build container image from source

```bash
git clone https://github.com/vllm-project/vllm-ascend.git
cd vllm-ascend
docker build -t vllm-ascend-dev-image -f ./Dockerfile .
```
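
You can then start a container from the locally built image, reusing the same device and driver mounts as in the containerized installation above (a sketch; adjust `/dev/davinci1` to the NPU you want to expose, and pick any container name):

```bash
# Run the locally built image with the same NPU device and driver mounts
# shown in the containerized installation section.
docker run \
    --name vllm-ascend-dev \
    --device /dev/davinci1 \
    --device /dev/davinci_manager \
    --device /dev/devmm_svm \
    --device /dev/hisi_hdc \
    -v /usr/local/dcmi:/usr/local/dcmi \
    -v /usr/local/bin/npu-smi:/usr/local/bin/npu-smi \
    -v /usr/local/Ascend/driver/lib64/:/usr/local/Ascend/driver/lib64/ \
    -v /usr/local/Ascend/driver/version.info:/usr/local/Ascend/driver/version.info \
    -v /etc/ascend_install.info:/etc/ascend_install.info \
    -it vllm-ascend-dev-image bash
```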