
# MoE-Visualizer

*(demo)*

## Introduction

This project is a visualizer for Mixture of Experts (MoE) models. It provides a visual tool to help users understand how experts are used in MoE models.

We designed a hook that can be mounted on a specific layer of an MoE model to record which experts are selected for each sample during inference. This lets us count how often each expert is used.

The hook is therefore a plug-and-play module that works with any MoE model; Qwen1.5-MoE-A2.7B is provided as an example.
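The counting logic behind the hook can be sketched in plain Python. This is a minimal, framework-agnostic illustration (the class name and method signatures are hypothetical, not the project's actual API): it assumes the router yields, for each token, the indices of its selected top-k experts, and the recorder simply tallies them.

```python
from collections import Counter

# Hypothetical sketch of the expert-usage bookkeeping a hook could maintain.
# Assumes the MoE router returns, per token, the ids of its top-k experts.
class ExpertUsageRecorder:
    def __init__(self, num_experts: int):
        self.num_experts = num_experts
        self.counts = Counter()

    def record(self, topk_indices):
        # topk_indices: one list of selected expert ids per token.
        for token_experts in topk_indices:
            self.counts.update(token_experts)

    def usage(self):
        # Dense list of counts, one entry per expert id.
        return [self.counts.get(e, 0) for e in range(self.num_experts)]

# Three tokens, each routed to 2 of 4 experts:
recorder = ExpertUsageRecorder(num_experts=4)
recorder.record([[0, 2], [1, 2], [2, 3]])
print(recorder.usage())  # [1, 1, 3, 1]
```

In a real model this logic would run inside a forward hook registered on the MoE layer, so the per-sample counts accumulate automatically during inference.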

## What we have done

- Visualize the usage of experts in the prefill and generation phases
- Support batch processing
- Support downloading the recorded data

## Models we support

- Qwen1.5-MoE-A2.7B

## How to use

### Step 1: Install the dependencies

```shell
pip install -r requirements.txt
```

### Step 2: Run the demo

```shell
python qwen1_5_moe.py
```

If this project helps you, please give us a star. 🌟