[Bugfix] fix detokenizer shallow copy (#5919)

This commit is contained in:
Aurick Qiao
2024-10-22 18:38:12 -04:00
committed by GitHub
parent 17c79f3c36
commit 23b899a8e6


@@ -90,7 +90,7 @@ class Detokenizer:
             prefix_offset = next_iter_prefix_offset
             read_offset = next_iter_read_offset
             if prev_tokens is None:
-                prev_tokens = next_iter_tokens
+                prev_tokens = next_iter_tokens.copy()
             else:
                 prev_tokens.extend(next_iter_tokens)
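The bug here is Python list aliasing: assigning `prev_tokens = next_iter_tokens` makes both names refer to the same list object, so the later `prev_tokens.extend(...)` silently mutates the caller's token list. A minimal standalone sketch (hypothetical helper names, not the actual Detokenizer code) illustrating the aliasing and the `.copy()` fix:

```python
def accumulate_buggy(prev_tokens, next_iter_tokens):
    # Buggy: on first assignment, prev_tokens aliases next_iter_tokens,
    # so a later extend() mutates the caller's list too.
    if prev_tokens is None:
        prev_tokens = next_iter_tokens
    else:
        prev_tokens.extend(next_iter_tokens)
    return prev_tokens


def accumulate_fixed(prev_tokens, next_iter_tokens):
    # Fixed: take an independent shallow copy on first assignment.
    if prev_tokens is None:
        prev_tokens = next_iter_tokens.copy()
    else:
        prev_tokens.extend(next_iter_tokens)
    return prev_tokens


first = ["a", "b"]
acc = accumulate_buggy(None, first)
acc = accumulate_buggy(acc, ["c"])
print(first)  # ['a', 'b', 'c'] -- the caller's list was mutated

first = ["a", "b"]
acc = accumulate_fixed(None, first)
acc = accumulate_fixed(acc, ["c"])
print(first)  # ['a', 'b'] -- untouched
print(acc)    # ['a', 'b', 'c']
```

A shallow `.copy()` is sufficient here because only the list itself is mutated via `extend()`; the token elements are never modified in place.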