Why does lezer-javascript use ExternalTokenizer for template strings?

Nitpicky question again, but I noticed the use of an ExternalTokenizer for JavaScript template strings (javascript/tokens.js at main · lezer-parser/javascript · GitHub), where it seems this could be done in pure Lezer grammar as well with

  templateContent {
    (![\\`$] | "\\" _ | "$" ![{])+
  }
  InterpolationStart { "${" }
  templateEnd { "`" }

Is there something I’m missing? Any gains to using an external tokenizer here?

That will not match a `$` at the end of the content (i.e. a template like `` `$` ``) — the `"$" ![{]` branch requires another character after the dollar sign.

The new `@local tokens` feature might be useful in cases like this.
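
For reference, a rough sketch of what that might look like (untested, based on the `@local tokens` syntax in the Lezer system guide — the escape rule in particular is my assumption). Everything not matched by an explicit token in the block falls through to the `@else` token, so a trailing `$` simply becomes part of the content:

  @local tokens {
    templateEnd { "`" }
    InterpolationStart { "${" }
    templateEscape { "\\" _ }
    @else templateContent
  }
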