If the amount of data is at all reasonable, you could write a script that populates additional SQL tables whose purpose is to index the data you receive from this peculiar application. Create a suitable primary-key column, even if you have to do it by hand, then scan through the data, inserting rows into other tables that break down what is in the original data. For example, two entries in an index table for row #1 would record that it contains 'AA' and 'BB'. Once you have done this, you can begin to run sensible queries against the data.
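A minimal sketch of that index-table idea, using SQLite for illustration. The table and column names (`raw_data`, `raw_tags`) and the `';'` separator are my assumptions, not anything from your actual schema:

```python
import sqlite3

# In-memory database just for the sketch; raw_data holds the original
# awkward column, raw_tags is the index table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_data (
        id   INTEGER PRIMARY KEY,  -- surrogate key, added by hand if need be
        blob TEXT NOT NULL         -- the original, multi-valued column
    );
    CREATE TABLE raw_tags (        -- index table: one row per code found
        raw_id INTEGER NOT NULL REFERENCES raw_data(id),
        tag    TEXT    NOT NULL
    );
    CREATE INDEX idx_raw_tags_tag ON raw_tags(tag);
""")

# Row #1 contains both 'AA' and 'BB', so it gets two index entries.
conn.execute("INSERT INTO raw_data (id, blob) VALUES (1, 'AA;BB')")
conn.executemany("INSERT INTO raw_tags (raw_id, tag) VALUES (?, ?)",
                 [(1, 'AA'), (1, 'BB')])

# Now a sensible query is possible:
rows = conn.execute("SELECT raw_id FROM raw_tags WHERE tag = 'BB'").fetchall()
print(rows)  # [(1,)]
```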
A script that parses the original data stem to stern, inserting descriptive records into other tables, would be fairly easy to write and could accomplish its work in a single pass through the data. After that, you no longer have to depend on the application's weirdnesses. I truly believe you will never get a satisfactory program from the approach you are pursuing right now; I daresay the incoming data is full of special cases and exceptions that would vex you endlessly.
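The single-pass script could look something like this. Again, this assumes SQLite and a hypothetical `';'`-packed column; your real split logic would live where `blob.split(';')` is:

```python
import sqlite3

# Set up sample data: three rows of the original, awkward format.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_data (id INTEGER PRIMARY KEY, blob TEXT NOT NULL);
    CREATE TABLE raw_tags (raw_id INTEGER NOT NULL, tag TEXT NOT NULL);
""")
conn.executemany("INSERT INTO raw_data VALUES (?, ?)",
                 [(1, 'AA;BB'), (2, 'CC'), (3, 'AA')])

# One pass over the original table, stem to stern, writing index rows.
for row_id, blob in conn.execute("SELECT id, blob FROM raw_data"):
    for tag in blob.split(';'):  # replace with your real parsing rules
        conn.execute("INSERT INTO raw_tags VALUES (?, ?)", (row_id, tag))
conn.commit()

# From here on, queries hit raw_tags, not the weird original column.
hits = [r for (r,) in conn.execute(
    "SELECT raw_id FROM raw_tags WHERE tag = 'AA' ORDER BY raw_id")]
print(hits)  # [1, 3]
```

Every special case you discover later only has to be handled once, inside that inner loop, rather than in every query you ever write.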