---
id: uniter
sidebar_label: UNITER
title: "UNITER: UNiversal Image-TExt Representation Learning"
---

This repository contains the PyTorch implementation of the UNITER model, originally released in [this repo](https://github.com/ChenRocks/UNITER). Please cite the following paper if you are using the UNITER model from mmf:

* Chen, Y.-C., Li, L., Yu, L., El Kholy, A., Ahmed, F., Gan, Z., Cheng, Y., and Liu, J. *UNITER: UNiversal Image-TExt Representation Learning.* In European Conference on Computer Vision, 2020. ([arXiv](https://arxiv.org/pdf/1909.11740))
```
@inproceedings{chen2020uniter,
  title={Uniter: Universal image-text representation learning},
  author={Chen, Yen-Chun and Li, Linjie and Yu, Licheng and Kholy, Ahmed El and Ahmed, Faisal and Gan, Zhe and Cheng, Yu and Liu, Jingjing},
  booktitle={ECCV},
  year={2020}
}
```

## Installation

Follow the installation instructions in the [documentation](https://mmf.readthedocs.io/en/latest/notes/installation.html).

## Training

To train a fresh UNITER model on the VQA2.0 dataset, run the following command:
```
mmf_run config=projects/uniter/configs/vqa2/defaults.yaml run_type=train_val dataset=vqa2 model=uniter
```
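
Like other MMF models, UNITER accepts config overrides directly on the command line as dotlist arguments. A minimal sketch, assuming the standard MMF config keys `training.batch_size`, `training.max_updates`, and `env.save_dir` (the values shown are illustrative, not recommended settings):
```
mmf_run config=projects/uniter/configs/vqa2/defaults.yaml \
    run_type=train_val dataset=vqa2 model=uniter \
    training.batch_size=64 \
    training.max_updates=88000 \
    env.save_dir=./save/uniter_vqa2
```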

To finetune a pretrained UNITER model on the VQA2.0 dataset, run:
```
mmf_run config=projects/uniter/configs/vqa2/defaults.yaml run_type=train_val dataset=vqa2 model=uniter checkpoint.resume_zoo=uniter.pretrained
```

To finetune a pretrained [VILLA](https://arxiv.org/pdf/2006.06195.pdf) model on the VQA2.0 dataset, run:
```
mmf_run config=projects/uniter/configs/vqa2/defaults.yaml run_type=train_val dataset=vqa2 model=uniter checkpoint.resume_zoo=villa.pretrained
```
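
After finetuning, the resulting checkpoint can be evaluated on the validation split by switching `run_type` and pointing `checkpoint.resume_file` at the saved weights. A sketch, assuming a checkpoint saved by one of the runs above (the path below is a placeholder):
```
mmf_run config=projects/uniter/configs/vqa2/defaults.yaml run_type=val dataset=vqa2 model=uniter checkpoint.resume_file=<path_to_save_dir>/uniter_final.pth
```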

To pretrain UNITER on the masked COCO dataset, run the following command:
```
mmf_run config=projects/uniter/configs/masked_coco/defaults.yaml run_type=train_val dataset=masked_coco model=uniter
```

Based on the config used and the `do_pretraining` flag defined in it, the model either follows the pretraining recipe described in the UNITER paper or is finetuned on a downstream task.
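
As a rough illustration, such a toggle typically lives in the model section of the YAML config. A minimal sketch, assuming the flag sits under `model_config.uniter` as is conventional in MMF configs (the exact surrounding structure may differ):
```
model_config:
  uniter:
    # true: use the UNITER pretraining recipe (masked-modeling objectives)
    # false: finetune on the downstream task selected by the dataset/config
    do_pretraining: true
```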