Commit a148efe

Merge pull request #2 from zhangtaolab/dev

2 parents e02d49f + f2b8ffd

9 files changed

Lines changed: 101 additions & 73 deletions

CONTRIBUTING.md

Lines changed: 3 additions & 3 deletions

````diff
@@ -59,7 +59,7 @@ source .venv/bin/activate  # Linux/macOS
 .venv\Scripts\activate     # Windows

 # Install DNALLM in development mode
-uv pip install -e '.[dev,test]'
+uv pip install -e '.[dev]'
 ```

 ### 2. Pre-commit Hooks (Optional)
@@ -627,7 +627,7 @@ We follow [Semantic Versioning](https://semver.org/):
 # Setup development environment
 uv venv
 source .venv/bin/activate
-uv pip install -e '.[dev,test]'
+uv pip install -e '.[dev]'

 # Pre-commit validation (run before every commit)
 # Option 1: Use automated script (recommended, code quality only)
@@ -658,7 +658,7 @@ mkdocs serve
 1. **Code check script fails**:
    - Make sure you're in the DNALLM root directory
    - Ensure virtual environment is activated: `source .venv/bin/activate`
-   - Install dependencies: `uv pip install -e '.[dev,test]'`
+   - Install dependencies: `uv pip install -e '.[dev]'`

 2. **Ruff formatting errors**:
    - Auto-fix: `python scripts/check_code.py --fix`
````

README.md

Lines changed: 45 additions & 11 deletions

````diff
@@ -77,11 +77,11 @@ pip install --upgrade pip
 # Install uv in virtual environment
 pip install uv

-# Install DNALLM with base dependencies
-uv pip install -e '.[base]'
+# Install DNALLM with all optional dependencies
+uv pip install -e '.[all]'

-# For MCP server support (optional)
-uv pip install -e '.[mcp]'
+# Or install only base development tools
+# uv pip install -e '.[base]'

 # Verify installation
 python -c "import dnallm; print('DNALLM installed successfully!')"
@@ -103,19 +103,19 @@ conda activate dnallm
 # Install uv in conda environment
 conda install uv -c conda-forge

-# Install DNALLM with base dependencies
-uv pip install -e '.[base]'
+# Install DNALLM with all optional dependencies
+uv pip install -e '.[all]'

-# For MCP server support (optional)
-uv pip install -e '.[mcp]'
+# Or install only base development tools
+# uv pip install -e '.[base]'

 # Verify installation
 python -c "import dnallm; print('DNALLM installed successfully!')"
 ```

 ### GPU Support

-For GPU acceleration, install the appropriate CUDA version:
+For GPU acceleration, install the appropriate CUDA version **after** installing the base package:

 ```bash
 # For venv users: activate virtual environment
@@ -127,11 +127,45 @@ source .venv/bin/activate  # Linux/MacOS
 # conda activate dnallm

 # CUDA 12.4 (recommended for recent GPUs)
-uv pip install -e '.[cuda124]'
+uv pip install -e '.[all,cuda124]'

 # Other supported versions: cpu, cuda121, cuda126, cuda128
 # Nvidia 5090 Please use cuda128 & torch==2.7
-uv pip install -e '.[cuda128]'
+uv pip install -e '.[all,cuda128]'
+```
+
+> **Warning:** Hardware groups (`cpu`, `cuda121`, `cuda124`, `cuda126`, `cuda128`, `rocm`, `mamba`) are mutually exclusive. You must choose exactly one. Do NOT combine multiple CUDA versions.
+
+### Dependency Groups
+
+| Group | Purpose | Includes |
+|-------|---------|----------|
+| `all` | Install everything | `base` + `docs` + `ui` |
+| `base` | Full dev environment | `dev` + `test` + `notebook` + `mcp` + extra tools |
+| `dev` | Development | `test` + `notebook` + linting/typing tools |
+| `test` | Testing only | pytest and plugins |
+| `notebook` | Interactive notebooks | Jupyter, Marimo |
+| `docs` | Build documentation | mkdocs and plugins |
+| `ui` | Gradio web interface | Gradio |
+| `mcp` | MCP server | (included in core) |
+
+**Hardware groups (mutually exclusive, NOT included in `all`):**
+
+| Group | PyTorch | Use Case |
+|-------|---------|----------|
+| `cpu` | 2.4.0-2.7 | No GPU |
+| `cuda121` | 2.2.0-2.7 | Older NVIDIA GPUs |
+| `cuda124` | 2.4.0-2.7 | Most modern GPUs (recommended) |
+| `cuda126` | 2.6.0-2.7 | Ada/Hopper with Flash Attention |
+| `cuda128` | 2.6.0-2.7 | RTX 5090 and latest hardware |
+| `rocm` | 2.5.0-2.7 | AMD GPUs |
+| `mamba` | 2.6.0-2.7 | Native Mamba architecture (requires CUDA) |
+
+```bash
+# Examples:
+uv pip install -e '.[all,cuda124]'  # Everything + CUDA 12.4
+uv pip install -e '.[base,mamba]'   # Dev tools + native Mamba
+uv pip install -e '.[test,cpu]'     # Testing only, no GPU
 ```

 ### Native Mamba Support
````
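The mutual-exclusivity warning added to the README above boils down to a simple rule: at most one hardware group per install command. The sketch below expresses that rule as a small validator. It is illustrative only; `HARDWARE_GROUPS` and `validate_extras` are hypothetical names, not part of DNALLM or its tooling.

```python
# Illustrative sketch of the rule in the README warning above.
# HARDWARE_GROUPS and validate_extras are hypothetical, not DNALLM code.
HARDWARE_GROUPS = {"cpu", "cuda121", "cuda124", "cuda126", "cuda128", "rocm", "mamba"}

def validate_extras(extras):
    """Return the hardware groups in `extras`; raise if more than one was mixed in."""
    chosen = [e for e in extras if e in HARDWARE_GROUPS]
    if len(chosen) > 1:
        raise ValueError(f"Hardware groups are mutually exclusive, got: {chosen}")
    return chosen

validate_extras(["all", "cuda124"])   # OK: exactly one hardware group
```

Calling it with two hardware groups, e.g. `validate_extras(["cuda124", "cuda128"])`, raises, which mirrors the dependency conflict `uv pip` would surface.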

docs/faq/models_troubleshooting.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -8,7 +8,7 @@

 ```bash
 # Install without the [mamba] extra
-uv pip install -e '.[base,mcp,dev,notebook]'
+uv pip install -e '.[base]'
 ```

 Then load a Mamba model through `transformers` (cpu only) as usual. Note that this fallback path is significantly slower than the optimized `mamba-ssm` kernels (which require Linux + CUDA).
````

docs/getting_started/installation.md

Lines changed: 29 additions & 23 deletions

````diff
@@ -113,28 +113,34 @@ DNALLM provides multiple dependency groups for different use cases:

 ### Core Dependency Groups

-| Dependency Group | Purpose | When to Use |
-|-----------------|---------|-------------|
-| **base** | Development tools + ML libraries | Recommended for most users |
-| **dev** | Complete development environment | For contributors |
-| **test** | Testing environment only | For running tests |
-| **notebook** | Jupyter and Marimo support | For interactive notebooks |
-| **docs** | Documentation building | For building docs |
-| **mcp** | MCP server support | For MCP deployment |
+| Group | Purpose | Includes |
+|-------|---------|----------|
+| **all** | Install all optional dependencies | `base` + `docs` + `ui` |
+| **base** | Full development environment | `dev` + `test` + `notebook` + `mcp` + extra tools (isort, types-transformers) |
+| **dev** | Complete development environment | `test` + `notebook` + linting/typing (ruff, flake8, pre-commit, mypy, pandas-stubs) |
+| **test** | Testing environment only | pytest and plugins |
+| **notebook** | Jupyter and Marimo support | Jupyter Lab, Marimo |
+| **docs** | Documentation building | mkdocs-material, mkdocstrings, mkdocs-jupyter |
+| **ui** | Gradio web interface | Gradio |
+| **mcp** | MCP server support | Included in core dependencies (no extra install needed) |

-> **Note:** Core ML libraries (torch, transformers, datasets, peft, accelerate) are installed automatically as main dependencies. The groups above add additional functionality.
+> **Note:** `mcp` is an empty extra because MCP dependencies (`mcp`, `starlette`, `uvicorn`, `websockets`) are already part of the core dependencies. You can still use `.[mcp]` for clarity but it won't install additional packages.

-### Hardware-Specific Groups
+### Hardware-Specific Groups (Mutually Exclusive)

-| Dependency Group | PyTorch Version | GPU Type | When to Use |
-|-----------------|-----------------|----------|-------------|
+> **Warning:** These groups are mutually exclusive. You MUST choose exactly one. Combining multiple hardware groups will cause conflicts.
+
+| Group | PyTorch Version | GPU Type | When to Use |
+|-------|----------------|----------|-------------|
 | **cpu** | 2.4.0-2.7 | CPU only | Development without GPU |
 | **cuda121** | 2.2.0-2.7 | NVIDIA (older) | Volta/Turing/Ampere early |
 | **cuda124** | 2.4.0-2.7 | NVIDIA (recommended) | Most modern GPUs |
 | **cuda126** | 2.6.0-2.7 | NVIDIA (latest) | Ada/Hopper with Flash Attention |
-| **cuda128** | 2.7.0+ | NVIDIA (cutting-edge) | Latest hardware |
+| **cuda128** | 2.6.0-2.7 | NVIDIA (cutting-edge) | RTX 5090 and latest hardware |
 | **rocm** | 2.5.0-2.7 | AMD GPUs | AMD GPU users |
-| **mamba** | 2.4.0-2.7 | NVIDIA + Mamba | For Mamba architecture models |
+| **mamba** | 2.6.0-2.7 | NVIDIA + Mamba | Native Mamba architecture (requires CUDA) |
+
+> **Note:** Hardware groups are NOT included in `all` because they conflict with each other. Always combine a hardware group with your chosen feature group: e.g., `.[all,cuda124]`

 ## Installation Scenarios

@@ -147,8 +153,8 @@ For development and testing without GPU acceleration:
 conda create -n dnallm-cpu python=3.12 -y
 conda activate dnallm-cpu

-# Install base dependencies and CPU version
-uv pip install -e '.[base,cpu]'
+# Install all dependencies and CPU version
+uv pip install -e '.[all,cpu]'

 # Verify installation
 python -c "import dnallm; print('DNALLM installed successfully!')"
@@ -166,8 +172,8 @@ nvidia-smi
 conda create -n dnallm-gpu python=3.12 -y
 conda activate dnallm-gpu

-# Install base dependencies and CUDA 12.4 support
-uv pip install -e '.[base,cuda124]'
+# Install all dependencies and CUDA 12.4 support
+uv pip install -e '.[all,cuda124]'

 # Verify installation
 python -c "import torch; print(f'PyTorch: {torch.__version__}'); print(f'CUDA available: {torch.cuda.is_available()}')"
@@ -182,7 +188,7 @@ For models with Mamba architecture (Plant DNAMamba, Caduceus, Jamba-DNA):
 conda create -n dnallm-mamba python=3.12 -y
 conda activate dnallm-mamba

-# Install base dependencies
+# Install base dependencies first
 uv pip install -e '.[base]'

 # Install Mamba support (requires GPU)
@@ -201,8 +207,8 @@ For contributors and developers:
 conda create -n dnallm-dev python=3.12 -y
 conda activate dnallm-dev

-# Install complete development dependencies
-uv pip install -e '.[dev,notebook,docs,mcp,cuda124]'
+# Install all dependencies + CUDA support
+uv pip install -e '.[all,cuda124]'

 # Verify installation
 python -c "
@@ -223,8 +229,8 @@ For MCP server deployment:
 conda create -n dnallm-mcp python=3.12 -y
 conda activate dnallm-mcp

-# Install MCP-related dependencies
-uv pip install -e '.[base,mcp,cuda124]'
+# MCP dependencies are included in core, just install with CUDA support
+uv pip install -e '.[base,cuda124]'

 # Verify installation
 python -c 'from dnallm.mcp import server; print("MCP server dependencies installed!")'
````
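The PyTorch Version column in the hardware-group table above uses pip-style ranges such as `torch>=2.4.0,<2.7`. A minimal stand-alone sketch of how such a range constrains a version follows; the comparison is hand-rolled for illustration, whereas real resolvers use `packaging.specifiers`.

```python
# Hand-rolled illustration of a pip-style version range like 'torch>=2.4.0,<2.7'.
# Real installers use packaging.specifiers; this is a simplified sketch.
def parse(v):
    """'2.6.1' -> (2, 6, 1); missing parts are padded with zeros."""
    parts = [int(x) for x in v.split(".")]
    return tuple(parts + [0] * (3 - len(parts)))

def in_range(version, lower, upper):
    """True if lower <= version < upper, mirroring '>=lower,<upper'."""
    return parse(lower) <= parse(version) < parse(upper)

# The 'cpu' group pins torch>=2.4.0,<2.7:
print(in_range("2.6.1", "2.4.0", "2.7"))  # True
print(in_range("2.7.0", "2.4.0", "2.7"))  # False: upper bound is exclusive
```

The exclusive upper bound is why, for example, a torch 2.7.0 wheel does not satisfy the `cpu` group's `<2.7` pin.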

docs/getting_started/quick_start.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -315,8 +315,8 @@ DNALLM supports the following task types:
 ## Need More Details?

 - See [Installation Guide](installation.md) for complete dependency information, including:
-  - All available dependency groups (base, dev, test, notebook, docs, mcp)
-  - Hardware-specific groups (cpu, cuda121, cuda124, cuda126, cuda128, rocm, mamba)
+  - All available dependency groups (`all`, `base`, `dev`, `test`, `notebook`, `docs`, `mcp`)
+  - Hardware-specific groups (`cpu`, `cuda121`, `cuda124`, `cuda126`, `cuda128`, `rocm`, `mamba`) — mutually exclusive
   - Installation scenarios for different use cases
   - Troubleshooting common issues
````

docs/index.md

Lines changed: 7 additions & 4 deletions

````diff
@@ -56,11 +56,14 @@ conda activate dnallm
 # Install uv in conda environment
 conda install uv -c conda-forge

-# Install DNALLM with base dependencies
-uv pip install -e '.[base]'
+# Install DNALLM with all optional dependencies
+uv pip install -e '.[all]'

-# For MCP server support (optional)
-uv pip install -e '.[mcp]'
+# Or install only core + base development tools
+# uv pip install -e '.[base]'
+
+# Add GPU support (choose ONE: cpu, cuda121, cuda124, cuda126, cuda128, rocm)
+# uv pip install -e '.[all,cuda124]'

 # Verify installation
 python -c "import dnallm; print('DNALLM installed successfully!')"
````

docs/user_guide/getting_started.md

Lines changed: 5 additions & 2 deletions

````diff
@@ -46,8 +46,11 @@ Getting DNALLM installed is the first step. We recommend using `uv`, a fast Pyth
 # Install uv, the fast package manager
 pip install uv

-# Install DNALLM and its core dependencies
-uv pip install -e '.[base]'
+# Install DNALLM and all optional dependencies
+uv pip install -e '.[all]'
+
+# Or install only base development tools
+# uv pip install -e '.[base]'
 ```

 4. **Verify the Installation**
````

docs/user_guide/tutorial/end_to_end_workflow.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -998,7 +998,7 @@ COPY README.md ./

 # Install Python dependencies
 RUN pip install --no-cache-dir uv && \
-    uv pip install -e '.[base,mcp,cuda124]' --system
+    uv pip install -e '.[base,cuda124]' --system

 # Copy application code
 COPY dnallm/ ./dnallm/
````

pyproject.toml

Lines changed: 8 additions & 26 deletions

````diff
@@ -76,20 +76,12 @@ Documentation = "https://zhangtaolab.github.io/DNALLM/"

 [project.optional-dependencies]
 dev = [
+    "dnallm[test,notebook]",
     "ruff==0.15.8",
     "flake8>=7.0.0",
     "pre-commit>=3.6.0",
-    "ipywidgets>=8",
-    "jupyter>=1.1.1",
-    "marimo>=0.16.3",
     "mypy>=1.15.0",
     "pandas-stubs>=2.3.0",
-    "pydantic>=2.10.6",
-    "pytest>=8.3.5",
-    "pytest-asyncio>=0.21.1",
-    "pytest-cov>=6.0.0",
-    "pytest-progress>=0.1.0",
-    "pytest-timeout>=2.3.1",
 ]
 test = [
     "pytest>=8.3.5",
@@ -99,7 +91,6 @@ test = [
     "pytest-timeout>=2.3.1",
 ]
 notebook = [
-    "ipywidgets>=8",
     "jupyter>=1.1.1",
     "marimo>=0.16.3",
 ]
@@ -109,26 +100,17 @@ docs = [
     "mkdocstrings-python>=1.16.10",
     "click<8.2.2",
 ]
-mcp = [
-    "mcp>=1.3.0",
-    "asyncio>=3.4.3",
+ui = [
+    "gradio>=4.0.0",
 ]
+mcp = []
 base = [
-    "asyncio>=3.4.3",
-    "ruff==0.15.8",
-    "flake8>=7.0.0",
-    "ipywidgets>=8",
+    "dnallm[dev,test,notebook,mcp]",
     "isort>=6.0.1",
-    "jupyter>=1.1.1",
-    "marimo>=0.16.3",
-    "mcp>=1.3.0",
-    "mypy>=1.15.0",
-    "pandas-stubs>=2.3.0",
-    "pydantic>=2.10.6",
     "types-transformers>=0.1.0",
-    "pytest>=8.3.5",
-    "pytest-cov>=6.0.0",
-    "pytest-progress>=0.1.0",
+]
+all = [
+    "dnallm[base,docs,ui]",
 ]
 cpu = [
     "torch>=2.4.0,<2.7",
````
