
TRENDS IN HIGH THROUGHPUT SCREENING





November 8, 2021


by 20/15 Visioneers

September 2021

Contents

1. Executive summary

2. Preface

3. Historical perspective

4. Main actors in the HTS play

4.1 Targets on screening

4.2 Assay modalities and technologies

4.3 Bioreagents

4.4 Compound collections and chemical diversity

5. Automation trends

5.1 Robots

5.2 Smart and Cloud Labs environments

6. Data Sciences and Information Technologies

7. Social commitment in HTS

7.1 Public-private partnerships

7.2 Sustainable screening

8. A look to the future: AI and HTS

9. Conclusions

10. Bibliography

1. EXECUTIVE SUMMARY

High throughput screening (HTS) has been a paramount tool in the drug discovery process for decades. After years of maturation, the discipline has evolved from the initial “game of numbers” principle, based on the belief that the larger the number of samples being analyzed, the higher the probability of success, to a more information-driven strategy. Enzymes (kinases included) and GPCRs remain the most popular target classes in the discipline, but novel biological systems such as targeted protein degradation (TPD) have been incorporated into the HTS arena, opening the possibility of modulating physiological processes previously considered inaccessible to therapeutic intervention. Innovative chemical strategies such as DNA-encoded library technology (DELT) are being widely used to expand the chemical diversity of the molecules being screened, and molecular biology tools have enabled expression systems suited to the preparation of multiprotein complexes at a scale adequate for screening purposes. These advances are facilitating the modulation of targets that had remained elusive until now.

Flexibility in the design of robotic instruments, with modular systems contributing to a more fluid layout of HTS labs, allows for rapid implementation of novel assays in a flexible environment, far from the rigid factory-like settings so common in the first decade of this century. This expansion of HTS boundaries is accompanied by a democratization of HTS that makes it more accessible to research groups with financial constraints. For instance, Cloud Labs enable the use of expensive equipment at reasonable pay-per-use costs. Likewise, initiatives to promote public-private partnerships give research groups access to massive datasets and foster fruitful collaborations among them. Artificial intelligence tools are starting to enable the massive screening of molecules with only a limited number of real assay points, intertwining virtual and physical HTS. The Internet of Things enables more powerful, real-time quality control of operations. New liquid handlers facilitate the miniaturization of assays, yielding substantial savings that will also reduce the environmental impact of HTS, an area of concern that has recently been the subject of intense efforts.

Finally, improved data management has proved essential to maximizing the proficiency of the HTS process, and tools to harmonize and unify data structures are being developed. Such tools will be critical to making the data available for data science and analytics, including the artificial intelligence that will clearly drive the HTS of the future. Overall, as HTS has evolved it has remained a cornerstone of drug discovery, and it will likely remain so for years to come.

Figure 1: How HTS advances have positively impacted issues traditionally associated with the discipline.

2. PREFACE

With their focus on finding new molecules for therapeutic purposes, pharmaceutical companies embraced high throughput screening (HTS, i.e., the simultaneous evaluation of a vast number of molecules in biological assays expressing a given functionality, with the goal of identifying modulators of that functionality) several decades ago as a key strategy to identify new chemical matter. Starting from modest efforts intended to screen hundreds or thousands of molecules, technological evolution fostered a change in the discipline, which evolved into ultra-high throughput screening at the dawn of the 21st century, where millions of molecular entities were screened against every target in a short period of time. This sudden revolution transitioned over the last two decades into a calmer evolution in which quantitative changes were replaced by qualitative ones. Novel strategies in different fields have converged, and continue to converge, to gradually transform the HTS landscape as it matures and to increase its success rate. In this document we attempt to summarize the current status of HTS, highlighting the opportunities to further improve its efficiency, with a view on future directions for the discipline. The discussion focuses mainly on target-based HTS, leaving phenotypic screening to be discussed in a separate document. Likewise, it does not cover genomic screening, an emerging approach that has been reinvigorated by recent CRISPR advances (an excellent review on this topic can be found in a recent article [1]).

3. HISTORICAL PERSPECTIVE

In the last two decades of the 20th century a number of discoveries caused a significant leap forward in the drug discovery process. Until then, drug discovery was based primarily on serendipity and empirical observations in rudimentary physiological models, using animal tissues at first, and assays based on cell cultures and purified proteins later. These latter assays, though more efficient than the animal tissue models, were burdensome and only allowed the evaluation of a few tens of chemical compounds per week. Therefore, the number of opportunities to develop novel therapeutics was rather limited. However, advances in recombinant DNA technologies during the 1980s made available recombinant proteins that could be used as targets in assays intended to find ligands capable of modulating their function, either in cell-free or in cell-based models. This was the seminal development that gave rise to high throughput screening as a discipline in the following decade.

Advances in fluorescence-based reagents in the 1990s enabled the development of more efficient assays that could be performed in 96-well plates, thereby allowing the simultaneous screening of many molecules. Likewise, the low volume of the assays yielded substantial savings in chemical compounds, moving from 5-10 mg consumed per assay tube to 10-100 µg per assay well. The first articles visible in PubMed with the term “high throughput screening” date from 1991 [2-4] and used radioactive tracers to monitor ligand binding. However, most of the HTS work performed in pharmaceutical companies was not disclosed at that time, since it was perceived as a competitive advantage and its results were therefore treated as highly sensitive information. In fact, HTS campaigns had already been initiated in the 1980s, with a throughput that grew from fewer than 1,000 compounds screened per week in 1986 to nearly 10,000 in 1989 [5]. As throughput increased, corporate compound collections, typically containing 10,000-20,000 samples, became too small for the nature of the HTS endeavor, conceived as a game of numbers: the larger the number of samples being screened, the higher the chances of finding an interesting molecule. In addition, those collections were usually a reflection of the previous activity of the company and often included poorly developable molecules such as dyes. These facts limited the chemical diversity of the compound sets being screened, reducing the chances of finding novel chemotypes. However, the explosion of combinatorial chemistry and parallel synthesis methods by the mid-1990s, together with a new appreciation for the developability properties of molecules prompted by Lipinski [6] in 1997, changed the landscape completely, and in the early 2000s compound collections scaled up to include at least half a million different molecules.
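The rule-of-five criteria popularized by Lipinski are straightforward to express computationally. The sketch below is a minimal illustration, assuming pre-computed molecular properties (in practice these would come from a cheminformatics toolkit) and the classic thresholds; the compound entries are hypothetical.

```python
# Minimal, illustrative Lipinski "rule of five" filter. Property values are
# assumed to be pre-computed elsewhere; thresholds are the classic ones
# (MW <= 500 Da, logP <= 5, H-bond donors <= 5, H-bond acceptors <= 10).
from dataclasses import dataclass

@dataclass
class Compound:
    name: str
    mol_weight: float       # molecular weight in Da
    clogp: float            # calculated logP
    h_bond_donors: int
    h_bond_acceptors: int

def lipinski_violations(c: Compound) -> int:
    """Count rule-of-five violations; more than one is commonly used as a flag."""
    rules = [
        c.mol_weight <= 500,
        c.clogp <= 5,
        c.h_bond_donors <= 5,
        c.h_bond_acceptors <= 10,
    ]
    return sum(not ok for ok in rules)

# Hypothetical library entries, for illustration only
library = [
    Compound("cmpd-001", 342.4, 2.1, 2, 5),
    Compound("cmpd-002", 687.9, 6.3, 4, 11),
]
for c in library:
    print(f"{c.name}: {lipinski_violations(c)} rule-of-five violation(s)")
```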

The aforementioned advances in fluorescence chemistry gave rise to a plethora of reagents suitable for all forms of fluorescent readout: fluorescence intensity (FLINT), fluorescence polarization (FP), Förster resonance energy transfer (FRET), time-resolved fluorescence (TRF), fluorescence intensity distribution analysis (FIDA) and confocal fluorescence lifetime analysis (cFLA). All these technologies enabled the development of highly efficient assay formats in homogeneous “mix and read” mode, avoiding cumbersome washing steps and therefore considerably increasing screening throughput. Those advances led to a further decrease in assay volumes, transitioning from 96-well plates in the mid-1990s (100-200 µL assays) to 384-well plates (10-50 µL assays) at the end of the decade and 1536-well plates (3-10 µL assays) in the early 2000s. This was accompanied by significant progress in automation, with the development of more proficient robotic systems that made low-volume liquid handling more reliable. The founding of the Society for Biomolecular Screening (SBS, now the Society for Laboratory Automation and Screening, SLAS) prompted the adoption of microplate standards that were paramount to harmonizing the development and manufacturing of new instrumentation, from liquid handlers to plate readers and incubators, and helped their integration into large robotic platforms. The bottleneck then shifted from the laboratory to the information technology (IT) desks, as new tools capable of handling thousands of data points were demanded. The first versions of these tools were built as rudimentary, customized systems within each pharma company (usually as sophisticated macros operating in Excel) and were later replaced by more robust commercial LIMS platforms capable of capturing, processing and integrating all the information in corporate databases in a data integrity-compliant manner. Simultaneously, new quality control criteria were implemented to monitor and ensure assay performance, avoiding unwanted waste of costly reagents, and sound statistical criteria started to be used for the appropriate selection of hits.
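As a concrete illustration of these quality control and hit-selection practices, the sketch below assumes the widely used Z'-factor as the plate-quality metric, percent-inhibition normalization against plate controls, and a simple “three standard deviations above the neutral controls” cutoff for hits; the signal values, plate layout and thresholds are hypothetical and shown only to make the calculations explicit.

```python
# Illustrative plate QC and hit selection for a homogeneous inhibition assay.
import statistics as stats

def z_prime(pos, neg):
    """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
    Values >= 0.5 are generally taken to indicate a robust assay window."""
    return 1 - 3 * (stats.stdev(pos) + stats.stdev(neg)) / abs(stats.mean(pos) - stats.mean(neg))

def percent_inhibition(signal, pos_mean, neg_mean):
    """Normalize a raw well signal to % inhibition using the plate controls."""
    return 100 * (neg_mean - signal) / (neg_mean - pos_mean)

# Hypothetical raw fluorescence signals from one plate
pos = [120, 115, 130, 118]          # fully inhibited (reference) controls
neg = [1050, 990, 1020, 1005]       # uninhibited (neutral, e.g. DMSO) controls
samples = [980, 460, 1010, 150, 890]

print(f"Z' = {z_prime(pos, neg):.2f}")   # ~0.89 here, so the plate is accepted

mu_p, mu_n = stats.mean(pos), stats.mean(neg)
neg_inh = [percent_inhibition(s, mu_p, mu_n) for s in neg]
cutoff = stats.mean(neg_inh) + 3 * stats.stdev(neg_inh)  # statistical hit threshold
hits = [i for i, s in enumerate(samples)
        if percent_inhibition(s, mu_p, mu_n) >= cutoff]
print(f"wells flagged as hits: {hits}")
```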

These advances contributed to the industrialization of HTS, which evolved into “ultra-HTS” (uHTS), defined as the ability to screen more than 100,000 compounds per day against a given target. The availability of the human genome sequence [7] in 2001 nourished the identification of novel therapeutic targets, and all these factors triggered the definitive explosion of HTS in the pharmaceutical industry during the 2001-2005 period. As happens with any sudden explosion of technology, this created a wave of expectation that shortly afterwards led to disappointment, as some of the unfounded (and somewhat naive) expectations did not pay off, at least in the short term. Most of the reproaches against HTS were elegantly discussed and refuted in a Nature Reviews Drug Discovery article [8] published in 2011.

The fruits delivered by HTS are now becoming more apparent. It is very difficult to trace the origin of drugs already on the market or in late clinical trials to find out whether they originated from an HTS campaign, but surrogate approaches demonstrate that the contribution of screening strategies to the discovery of novel molecular entities has increased in recent years, and the discipline therefore seems to be paying off, as shown in Figure 2. In the 1990-99 period, 244 novel molecular entities (NMEs) were approved by the FDA, and 105 of them (43%) were new in their scaffold and shape (i.e., they were neither fast followers nor inspired by known chemotypes). In the 2000-09 period, this percentage grew modestly to 51% (84 out of the 164 NMEs approved), but the absolute number of these novel molecules decreased (from 105 to 84). However, in the 2010-19 period the percentage expanded to reach 67% (164 out of the 245 NMEs approved) and the absolute number rose drastically (from 84 to 164), with screening believed to be responsible for this growth [9].