[3.12] gh-119118: Fix performance regression in tokenize module (GH-119615) (#119683)
- Cache the line object to avoid creating a Unicode object for all of the tokens in the same line.
- Speed up byte offset to column offset conversion by using the smallest buffer possible to measure the difference.

(cherry picked from commit d87b0151062e36e67f9e42e1595fba5bf23a485c)

Co-authored-by: Lysandros Nikolaou <lisandrosnik@gmail.com>
Co-authored-by: Pablo Galindo <pablogsal@gmail.com>
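The second bullet can be illustrated with a minimal sketch: because a UTF-8 character may span multiple bytes, the tokenizer's byte offsets differ from character (column) offsets, and converting between them requires decoding. Decoding only the prefix up to the byte offset (the smallest buffer needed) avoids re-decoding the whole line. The helper name `byte_offset_to_col` is hypothetical, not the function used in CPython:

```python
def byte_offset_to_col(line_bytes: bytes, byte_offset: int) -> int:
    # Hypothetical sketch: decode only the bytes up to byte_offset
    # (rather than the entire line) and count the resulting
    # characters to obtain the column offset.
    return len(line_bytes[:byte_offset].decode("utf-8"))

line = "x = 'héllo'".encode("utf-8")
# 'é' occupies two bytes in UTF-8, so the closing quote sits at
# byte offset 11 but character column 10.
print(byte_offset_to_col(line, 11))  # → 10
```

The actual CPython change applies the same idea inside the C tokenizer, measuring offsets over the narrowest byte range possible.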
Commit: 4a0af0cfdcc0b81da5d78dc219df4985c4403f9c
Parent: 7f06cd3
Author: Miss Islington (bot)
Committed by: GitHub <noreply@github.com> on 5/28/2024, 8:49:02 PM