Hi, thanks for your project!
I've been trying to use your work to punctuate some audio in Portuguese, but I got stuck on some problems with the tokenizer.
First, I got this error in punctuate.py, line 84, in `__init__`:
`self.tokenizer = self.whisper_tokenizer.tokenizer`
`AttributeError: 'Tokenizer' object has no attribute 'tokenizer'`
After removing the `.tokenizer`, I got another error at punctuate.py, line 221: the tokenizer has no `convert_ids_to_tokens` method.
Do you have any idea why this is happening?
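In case it helps: my guess is that a newer release of openai-whisper replaced the HuggingFace-backed tokenizer with a tiktoken `Encoding` (exposed as `tokenizer.encoding`), so both `.tokenizer` and `convert_ids_to_tokens` disappeared. As a stopgap I sketched a small compatibility helper; the helper name `ids_to_tokens` and the attribute checks are my own guesses, not something from punctuate.py:

```python
def ids_to_tokens(tokenizer, ids):
    """Best-effort stand-in for convert_ids_to_tokens across
    openai-whisper versions (helper name is mine, not from punctuate.py)."""
    # Newer openai-whisper: Tokenizer wraps a tiktoken Encoding
    # exposed as `tokenizer.encoding`.
    enc = getattr(tokenizer, "encoding", None)
    if enc is not None:
        return [
            enc.decode_single_token_bytes(i).decode("utf-8", errors="replace")
            for i in ids
        ]
    # Older openai-whisper: Tokenizer wraps a HuggingFace tokenizer
    # exposed as `tokenizer.tokenizer`, which does have the method.
    hf = getattr(tokenizer, "tokenizer", tokenizer)
    return hf.convert_ids_to_tokens(ids)
```

I'm not sure this is the right fix for your code, but it might confirm whether the version mismatch is the cause.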