Is it better to fine-tune Whisper for domain-specific terms, or is a LoRA approach good enough? #2754
KouissAbdessalam started this conversation in General
Hello! I am planning to fine-tune Whisper to improve transcription quality in contexts with many domain-specific words. My dataset will be around 3 to 4 hours of audio. My question: is it better to do full fine-tuning, or is PEFT LoRA fine-tuning the better choice?
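For context on the trade-off being asked about, here is a minimal NumPy sketch of the LoRA idea: instead of updating a full weight matrix, LoRA learns a low-rank additive update, so far fewer parameters are trained. The matrix sizes and scaling factor below are illustrative assumptions, not Whisper's actual dimensions.

```python
import numpy as np

# LoRA sketch: freeze pretrained weight W (d_out x d_in) and learn a
# low-rank update B @ A with rank r << min(d_out, d_in).
d_out, d_in, r = 1024, 1024, 8   # illustrative sizes, not Whisper's real ones

rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))       # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01    # trainable down-projection
B = np.zeros((d_out, r))                     # trainable up-projection, zero-init
alpha = 16                                   # LoRA scaling hyperparameter

# Effective weight used at inference; equals W before any training
# because B starts at zero.
W_adapted = W + (alpha / r) * (B @ A)

full_params = W.size             # parameters updated by full fine-tuning
lora_params = A.size + B.size    # parameters updated by LoRA
print(full_params, lora_params)  # 1048576 vs 16384 (~1.6% of the full count)
```

With only 3 to 4 hours of data, this parameter gap is the crux of the question: full fine-tuning updates every weight and risks overfitting a small dataset, while LoRA constrains the update to a small low-rank subspace.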