action becomes something like: [TOKEN name="use"] [TOKEN name="qd"] [TOKEN value="action"] [ENDT] [ENDT], and then I process that line by line? I can feel a stack is going to be needed. I think I need to have a play with getting results from the expected token format (written by hand) so I can be sure how the system will work post-tokenisation (then write the tokeniser).
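
A minimal sketch of that stack idea, assuming a hand-written token stream standing in for the tokeniser's eventual output. The tuple format and the Node class here are placeholders of my own, not the real token representation:

```python
class Node:
    """A named token that can contain child tokens/values."""
    def __init__(self, name):
        self.name = name
        self.children = []

    def __repr__(self):
        return f"{self.name}({', '.join(map(repr, self.children))})"

# Hand-written stand-in for the expected tokeniser output:
#   [TOKEN name="use"] [TOKEN name="qd"] [TOKEN value="action"] [ENDT] [ENDT]
tokens = [
    ("TOKEN", "name", "use"),
    ("TOKEN", "name", "qd"),
    ("TOKEN", "value", "action"),
    ("ENDT",),
    ("ENDT",),
]

def build_tree(tokens):
    root = Node("root")
    stack = [root]                  # top of stack = token currently being filled
    for tok in tokens:
        if tok[0] == "TOKEN" and tok[1] == "name":
            node = Node(tok[2])
            stack[-1].children.append(node)
            stack.append(node)      # a name token opens a new nesting level
        elif tok[0] == "TOKEN" and tok[1] == "value":
            stack[-1].children.append(tok[2])  # a value is a leaf, no nesting
        elif tok[0] == "ENDT":
            stack.pop()             # ENDT closes the current level
    return root

print(build_tree(tokens))  # root(use(qd('action')))
```

Each named token pushes a level, each ENDT pops one, so the flat stream comes back out as the nested structure (use containing qd containing the value "action") without any recursion.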