[3.13] gh-119118: Fix performance regression in tokenize module (GH-119615) (#119682)
- Cache line object to avoid creating a Unicode object for all of the tokens in the same line.
- Speed up byte offset to column offset conversion by using the smallest buffer possible to measure the difference.

(cherry picked from commit d87b0151062e36e67f9e42e1595fba5bf23a485c)

Co-authored-by: Lysandros Nikolaou <lisandrosnik@gmail.com>
Co-authored-by: Pablo Galindo <pablogsal@gmail.com>
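The two optimizations can be sketched in pure Python. This is an illustrative approximation, not CPython's actual C-level implementation; the function names `byte_offset_to_col` and `tokens_with_cached_line` are hypothetical.

```python
# Illustrative sketches of the two optimizations described above.
# These are hypothetical helpers, not CPython's actual implementation.

def byte_offset_to_col(line_bytes: bytes, byte_offset: int) -> int:
    """Convert a byte offset into a character (column) offset by decoding
    only the prefix up to the offset -- the smallest buffer needed."""
    return len(line_bytes[:byte_offset].decode("utf-8"))

def tokens_with_cached_line(lines):
    """Yield (token, line) pairs, reusing one str object per source line
    instead of materializing a fresh line string for every token."""
    for line in lines:
        for token in line.split():
            # every token on this line shares the identical cached str
            yield token, line

# "é" occupies two bytes in UTF-8, so byte offset 3 maps to column 2
assert byte_offset_to_col("héllo".encode("utf-8"), 3) == 2

pairs = list(tokens_with_cached_line(["a b c"]))
assert pairs[0][1] is pairs[1][1] is pairs[2][1]
```

Decoding only the slice between two byte offsets keeps the conversion cost proportional to the token's width rather than the whole line, and caching the line object avoids one Unicode allocation per token.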
Miss Islington (bot) committed 0d0be6b3efeace4743329f81c08f9720cc221207
Parent: c0e9961
Committed by GitHub <noreply@github.com>
on 5/28/2024, 8:47:45 PM