This repository was archived by the owner on Mar 9, 2026. It is now read-only.
Commit eebc315
Bug fix: create a buffer so we aren't close to the 4k tokenization limit. I've seen cases where the tiktoken token count doesn't agree with OpenAI's count.
1 parent a1de539 · commit eebc315 · 1 file changed: +2 −2
lines changed

Diff summary (the changed code itself was not captured in this page export): two one-line replacements, one at line 111 (lines 108–114 shown for context) and one at line 234 (lines 231–237 shown for context).
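The fix described in the commit message, leaving headroom below the model's context limit because a local tiktoken count may undercount relative to OpenAI's server-side count, can be sketched as below. The function names and the buffer size are hypothetical, since the actual diff content was not captured on this page:

```python
MAX_MODEL_TOKENS = 4096  # the 4k context limit mentioned in the commit message
TOKEN_BUFFER = 256       # assumed safety margin; the commit does not state the exact value


def effective_limit() -> int:
    # Leave headroom so a tiktoken undercount can't push a
    # request over OpenAI's real limit.
    return MAX_MODEL_TOKENS - TOKEN_BUFFER


def fits(local_token_count: int) -> bool:
    # Accept the text only if it stays under the buffered limit,
    # not the raw model limit.
    return local_token_count <= effective_limit()
```

The exact buffer size is a tuning choice: it should be at least as large as the worst tokenizer-count disagreement observed in practice.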