Jano, you have 50K followers, so you must be doing something right.
Still, I think the article misses the mark. Quantum algorithms are not good at processing large datasets or large parameter counts; what they're good at is searching through large spaces of alternatives.
This does not map onto the Transformer architecture or really any LLM architecture AT ALL.
It just does not apply to the part of AI that is the actual bottleneck, so it does nothing to relieve that bottleneck.
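To make the "searching large spaces" point concrete: the canonical quantum search result (Grover's algorithm, my example, not from the article) gives a quadratic speedup in *oracle queries* for unstructured search, roughly √N queries instead of N/2. A transformer forward pass is dense linear algebra over every parameter, not a needle-in-a-haystack search, so this speedup has nowhere to bite. A rough back-of-the-envelope sketch:

```python
import math

# Unstructured search over N alternatives:
#   classical expected lookups ~ N/2
#   Grover oracle queries      ~ (pi/4) * sqrt(N)
# These are standard asymptotics, not measurements.

def classical_queries(n: int) -> float:
    return n / 2  # expected lookups before finding the marked item

def grover_queries(n: int) -> int:
    return math.floor((math.pi / 4) * math.sqrt(n))  # optimal Grover iteration count

for n in (10**6, 10**9, 10**12):
    print(f"N={n:>13}: classical ~{classical_queries(n):.0f}, Grover ~{grover_queries(n)}")
```

The speedup only shows up when your cost is dominated by queries into an unstructured search space; an LLM's cost is dominated by matrix multiplies over its parameters, which is exactly the regime where this buys nothing.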