> record their performance in previous runs and you sort them accordingly.
OK, some theoretical thoughts on how to sort them:
let c[i] be the average cost of running filter[i] on one object
let r[i] be the average ratio of remaining/checked objects after applying filter[i]
let o be the number of objects and c the total cost so far
then applying filter i updates
c += o*c[i] and o *= r[i]
(the cost is charged on the objects entering the filter, then o shrinks)
so with a chosen ordering 0, 1, 2, … we get
c = o*c[0] + o*r[0]*c[1] + o*r[0]*r[1]*c[2] + …
after restructuring we get
c = o * ( c[0] + r[0] * ( c[1] + r[1] * ( … ) ) )
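The nested formula can be evaluated directly; here is a small sketch (the filter statistics are made-up example values, and `expected_cost` is a hypothetical helper name):

```python
def expected_cost(order, costs, ratios, n_objects):
    """Expected total cost of applying filters in `order` to n_objects,
    via c = o * (c[0] + r[0] * (c[1] + r[1] * (...)))."""
    factor = 0.0
    # evaluate the nested expression from the innermost filter outwards
    for i in reversed(order):
        factor = costs[i] + ratios[i] * factor
    return n_objects * factor

costs = [1.0, 5.0, 0.5]   # c[i]: average cost per object for filter i
ratios = [0.5, 0.1, 0.9]  # r[i]: fraction of objects surviving filter i

print(expected_cost([0, 1, 2], costs, ratios, 1000))  # 3525.0
```

With these numbers the additive form gives the same result: 1000*1.0 + 1000*0.5*5.0 + 1000*0.5*0.1*0.5 = 3525.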
I don't know whether there is a direct solution to this optimization problem (I could ask some colleagues from uni; I'm pretty sure they'd know), but this equation is a good basis for a "branch-and-bound"¹ graph-search algorithm:
after choosing filter i first, the minimal achievable cost-factor is ≥ c[i] + r[i] * c[min], with c[min] being the minimal per-object cost among the remaining filters (i excluded), so any partial ordering whose bound already exceeds the best complete ordering found so far can be pruned.
That means even if your recorded performance numbers for each filter are fuzzy and unstable, you can always adapt and recalculate on the fly, even with more complicated models that use ranges of values for c and r.
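A minimal sketch of that branch-and-bound search over orderings, assuming the bound above generalized to any prefix (once a prefix of filters is fixed, the remaining cost-factor is at least the surviving fraction times the cheapest unused filter; `best_order` and the example numbers are my own, not from the original):

```python
def best_order(costs, ratios):
    """Find the filter ordering with minimal expected cost-factor
    by branch-and-bound over all orderings."""
    n = len(costs)
    best = [float("inf"), None]  # [best cost-factor, best ordering]

    def search(prefix, factor, survive):
        # factor: cost-factor accumulated by the prefix
        # survive: product of r[i] over the prefix (fraction still alive)
        remaining = [j for j in range(n) if j not in prefix]
        if not remaining:
            if factor < best[0]:
                best[0], best[1] = factor, list(prefix)
            return
        # lower bound: the next filter costs at least survive * min(c)
        bound = factor + survive * min(costs[j] for j in remaining)
        if bound >= best[0]:
            return  # prune: this branch cannot beat the best so far
        for j in remaining:
            search(prefix + [j],
                   factor + survive * costs[j],
                   survive * ratios[j])

    search([], 0.0, 1.0)
    return best[1], best[0]

order, cost = best_order([1.0, 5.0, 0.5], [0.5, 0.1, 0.9])
print(order, cost)  # [0, 2, 1] 3.5
```

This is exhaustive in the worst case (n! orderings), but the bound prunes most branches early once a decent ordering has been found, and the recorded c/r values can simply be refreshed between runs.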
¹) for examples of B&B applications see Possible pairings for first knockout round of UEFA champions league or Puzzle Time