
Commit c5bf03b

contributions and bibtex
1 parent 91ef40a commit c5bf03b

1 file changed

Lines changed: 43 additions & 25 deletions

File tree

index.html

@@ -203,18 +203,23 @@ <h4 class="subtitle has-text-centered">
 <div class="container is-max-desktop">
 <div class="columns is-centered has-text-centered">
 <div class="column is-four-fifths">
-<h2 class="title is-3">Abstract</h2>
+<h2 class="title is-3">Our Contributions</h2>
 <div class="content has-text-justified">
-<p>
-Discrete diffusions models have been demonstrated to be surprisingly strong language models.
-In this work, we show that discrete diffusion language models can be further improved by adapting
-methods from continuous-state diffusion models. We establish a core property of uniform state diffusion:
-it stems from an underlying Gaussian diffusion process. This property allows us to improve both training
-by utilizing a curriculum learning strategy that reduces training variance and leads to \(\mathbf{2\times}\)
-faster convergence, as well as sampling by adapting efficient distillation methods from continuous-state diffusion models.
-As a result, models surpass an autoregressive model's zero-shot perplexity on
-3 out of 7 benchmarks and we manage to reduce the sampling steps by \(\textbf{two orders}\) of magnitude while preserving sample quality.
-</p>
+<ol>
+<li>
+We show that <b>uniform-state discrete diffusion emerges from Gaussian diffusion</b>,
+enabling the transfer of techniques from continuous to discrete domains.
+</li>
+<li>
+Building on this insight, we propose the DUO framework,
+which improves training through a low-variance curriculum.
+</li>
+<li>
+We further introduce Discrete Consistency Distillation, adapting consistency
+distillation to the discrete setting and accelerating DUO sampling
+by <b>two orders</b> of magnitude.
+</li>
+</ol>
 </div>
 </div>
 </div>
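The contributions above hinge on the claim that uniform-state discrete diffusion emerges from an underlying Gaussian diffusion via an argmax mapping. A minimal numerical sketch of that coupling (the function name, vocabulary size, and noise levels are illustrative assumptions, not details from this commit):

```python
import numpy as np

def argmax_diffusion_sample(x, alpha, vocab_size=8, n_samples=10_000, seed=0):
    """Map a Gaussian diffusion latent to a discrete token via argmax.

    The latent is alpha * one_hot(x) + sqrt(1 - alpha**2) * noise.
    Its argmax is a discrete token whose law interpolates between a
    point mass at x (alpha -> 1) and the uniform distribution over the
    vocabulary (alpha -> 0), mirroring the forward process of a
    uniform-state discrete diffusion.
    """
    rng = np.random.default_rng(seed)
    one_hot = np.zeros(vocab_size)
    one_hot[x] = 1.0
    latents = alpha * one_hot + np.sqrt(1.0 - alpha**2) * rng.standard_normal(
        (n_samples, vocab_size)
    )
    return latents.argmax(axis=1)

# Low noise (alpha near 1): the argmax almost always recovers token x.
# Pure noise (alpha = 0): the argmax token is uniform over the vocabulary.
```

Sweeping alpha from 1 down to 0 traces the same interpolation from clean data to uniform noise that the discrete forward process follows, which is what makes techniques from the continuous setting transferable.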
@@ -229,8 +234,9 @@ <h2 class="title is-3">Abstract</h2>
 <h2 class="title">Introduction</h2>
 <p>An eternal theme in mathematics is that discreteness emerges from underlying continuity.
 From quantum mechanics, where the quantized energy states of electrons arise as solutions
-to continuous wave equations, to the Fourier decomposition of the Heaviside function,
-which results in a trigonometric series, and to the binary logic of digital circuits,
+to continuous wave equations,
+<!-- to the Fourier decomposition of the Heaviside function, which results in a trigonometric series, and -->
+to the binary logic of digital circuits,
 fundamentally driven by smooth analog currents, discreteness has repeatedly and naturally
 emerged from an underlying continuum. Our work continues this tradition by demonstrating
 that a discrete diffusion process is, in fact, an emergent phenomenon of an underlying
@@ -312,19 +318,31 @@ <h2 class="title">Poster</h2>
 
 
 <!--BibTex citation -->
-<!-- <section class="section" id="BibTeX">
-<div class="container is-max-desktop content">
-<h2 class="title">BibTeX</h2>
-<pre><code>@misc{sahoo2024simple,
-title={Simple and Effective Masked Diffusion Language Models},
-author={Subham Sekhar Sahoo and Marianne Arriola and Yair Schiff and Aaron Gokaslan and Edgar Marroquin and Justin T Chiu and Alexander Rush and Volodymyr Kuleshov},
-year={2024},
-eprint={2406.07524},
-archivePrefix={arXiv},
-primaryClass={cs.CL}
-}</code></pre>
+<section class="section" id="BibTeX">
+<div class="container is-max-desktop content">
+<h2 class="title">BibTeX</h2>
+
+<div style="position: relative;">
+<button onclick="copyBibTeX()" style="position: absolute; top: 10px; right: 10px; z-index: 1;">
+Copy
+</button>
+<pre><code id="bibtex-code">@inproceedings{sahoo2025diffusion,
+title={The Diffusion Duality},
+author={Sahoo, Subham Sekhar and Deschenaux, Justin and Gokaslan, Aaron and Wang, Guanghan and Chiu, Justin T and Kuleshov, Volodymyr},
+booktitle={ICLR 2025 Workshop on Deep Generative Model in Machine Learning: Theory, Principle and Efficacy}
+}</code></pre>
 </div>
-</section> -->
+</div>
+</section>
+
+<script>
+function copyBibTeX() {
+const code = document.getElementById('bibtex-code').innerText;
+navigator.clipboard.writeText(code).then(() => {
+alert("BibTeX copied to clipboard!");
+});
+}
+</script>
 <!--End BibTex citation -->
 
 
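The copyBibTeX handler added in the last hunk relies on navigator.clipboard, which browsers expose only in secure (HTTPS) contexts. A hedged sketch of the same copy logic with the clipboard object passed in explicitly, so the missing-API case fails loudly rather than silently (copyText and its parameters are illustrative names, not part of the commit):

```javascript
// Illustrative sketch: the clipboard object is injected rather than read
// from navigator, which makes the logic testable outside a browser and
// makes the no-Clipboard-API case explicit.
async function copyText(text, clipboard) {
  if (clipboard && typeof clipboard.writeText === "function") {
    await clipboard.writeText(text); // Clipboard API path (secure contexts only)
    return true;
  }
  return false; // API unavailable; the caller can show a manual-copy fallback
}
```

On the page itself this could be invoked as `copyText(code, navigator.clipboard)` from the button handler, branching on the returned boolean to decide whether to show the success alert.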