Welcome to the Optibrium Community

Why do you advise against a sequential filtering approach to compound prioritisation?

Tuesday, 29 September 2009 21:17

One common approach to prioritising compounds is sequential filtering, whereby each property value is compared in turn with a required threshold. Compounds that fall on the wrong side of the threshold are rejected; those that pass progress to the next filter in the series, so the number of compounds is iteratively reduced. However, this process has a number of serious shortcomings:
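To make the process concrete, here is a minimal sketch of sequential filtering. The property names, threshold values, and helper function below are purely illustrative, not taken from any particular tool:

```python
# Hypothetical sketch of sequential filtering over a compound set.
# Each compound is a dict of property values; each filter is a
# (property name, pass predicate) pair applied in sequence.

def sequential_filter(compounds, filters):
    """Apply each filter in turn, discarding compounds that fail."""
    surviving = compounds
    for prop, passes in filters:
        surviving = [c for c in surviving if passes(c[prop])]
    return surviving

compounds = [
    {"logP": 2.1, "solubility": 65},
    {"logP": 5.8, "solubility": 120},
    {"logP": 3.0, "solubility": 20},
]
filters = [
    ("logP", lambda v: v < 5),          # reject high lipophilicity
    ("solubility", lambda v: v > 50),   # reject poorly soluble compounds
]
print(sequential_filter(compounds, filters))
# only the first compound survives both filters
```

Note that once a compound is rejected by an early filter, its remaining properties are never examined, which is exactly why incomplete information is generated on the full set.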

  • Because the number of compounds is reduced at each filter, often very few, or even no, compounds emerge from the sequence having passed all of the thresholds. In this situation it is difficult to select alternative compounds, as incomplete information has been generated on the full compound set.
  • Filters make artificial distinctions between compounds with property values that cannot be resolved within the uncertainties of a prediction or experimental measurement.
  • Errors rapidly accumulate as the number of filters increases. For example, if we consider 5 filters, each with an accuracy of 90%, the probability of a compound with ideal properties being correctly passed through all filters is only 59%. However, if 10 such filters are applied, this drops to 35%, meaning that this process is more likely to incorrectly reject an optimal compound than to pass it.
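The error-accumulation figures above follow from simple multiplication of independent pass probabilities; a quick check (assuming independent filters, each with the same accuracy):

```python
# Probability that an ideal compound is correctly passed by all n filters,
# assuming each filter independently passes a good compound with probability p.
def pass_probability(p, n):
    return p ** n

print(round(pass_probability(0.9, 5), 2))   # 0.59 -> 59%
print(round(pass_probability(0.9, 10), 2))  # 0.35 -> 35%
```

Since 0.35 < 0.5, a ten-filter sequence at 90% accuracy per filter is indeed more likely to reject an ideal compound than to pass it.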