
TRENDS IN HIGH THROUGHPUT SCREENING




November 8, 2021

by 20/15 Visioneers, September 2021

Contents
1. Executive summary
2. Preface
3. Historical perspective
4. Main actors in the HTS play
4.1 Targets on screening
4.2 Assay modalities and technologies
4.3 Bioreagents
4.4 Compound collections and chemical diversity
5. Automation trends
5.1 Robots
5.2 Smart and Cloud Labs environments
6. Data Sciences and Information Technologies
7. Social commitment in HTS
7.1 Public-private partnerships
7.2 Sustainable screening
8. A look to the future: AI and HTS
9. Conclusions
10. Bibliography

1. EXECUTIVE SUMMARY

High throughput screening (HTS) has been a paramount tool in the drug discovery process over many decades. After years of maturation, the discipline has evolved from the initial “game of numbers” principle, based on the belief that the larger the number of samples being analyzed, the higher the probability of success, to a more information-driven strategy. Enzymes (kinases included) and GPCRs remain the most popular target classes in the discipline, but novel biological systems like targeted protein degradation (TPD) have been incorporated into the HTS arena, opening the possibility of modulating physiological processes previously considered inaccessible to therapeutic intervention. Innovative chemical strategies such as DNA-encoded library technology (DELT) are being widely used to expand the chemical diversity of the molecules being screened, and molecular biology tools have enabled expression systems suited for the preparation of multiprotein complexes at an adequate scale for screening purposes. These advances are facilitating the modulation of targets that remained elusive until now. Flexibility in the design of robotic instruments, with modular systems contributing to a fluid layout of HTS labs, allows for rapid implementation of novel assays in a flexible environment, far from the rigid factory-like settings so common in the first decade of this century. This expansion of HTS boundaries comes along with a democratization of HTS that makes it more accessible to research groups with financial constraints. For instance, Cloud Labs enable the use of expensive equipment at reasonable pay-per-use costs. Likewise, initiatives to promote public-private partnerships facilitate massive data access for research groups and foster fruitful collaborations among them. Artificial intelligence tools are starting to enable the massive screening of molecules with only a limited number of real assay points, intertwining virtual and physical HTS. The internet of things enables more powerful, real-time quality control of operations.
New liquid handlers facilitate the miniaturization of assays, yielding precious savings that will also reduce the environmental impact of HTS, an area of concern that has recently been the subject of intense efforts. Finally, improved data management has proved essential to maximizing the proficiency of the HTS process, with tools to harmonize and unify data structures being developed. Notably, such tools will be critical to making the data available for data science and analytics, such as artificial intelligence, that will clearly drive the HTS of the future. Overall, as HTS has evolved, it has remained a cornerstone of drug discovery, and will likely remain so for years to come.

Figure 1: How HTS advances have positively impacted issues traditionally associated with the discipline.

2. PREFACE

With their focus on finding new molecules for therapeutic purposes, pharmaceutical companies embraced high throughput screening (HTS, i.e., the simultaneous evaluation of a vast number of molecules in biological assays expressing a given functionality, with the goal of identifying modulators of such functionality) several decades ago as a key strategy to identify new chemical matter. Starting from modest efforts intended for screening hundreds or thousands of molecules, technology evolution fostered a change in the discipline, which evolved into ultra-high throughput screening at the dawn of the 21st century, where millions of molecular entities were screened against every target in a short period of time. This sudden revolution transitioned in the last two decades into a calm evolution where quantitative changes were replaced by qualitative ones. Novel strategies in different fields have converged (and are converging) to slowly transform the HTS scenario, as it becomes more mature, to increase its success rate. In this document we attempt to summarize the current status of HTS, highlighting the opportunities to further improve its efficiency with a view on future directions for the discipline. It will focus mainly on target-based HTS, leaving phenotypic screening to be discussed in a separate document. Likewise, it will not cover genomic screening, an emerging therapeutic approach that has been reinvigorated by recent CRISPR advances (an excellent review on this topic can be found in a recent article [1]).

3. HISTORICAL PERSPECTIVE

In the last two decades of the 20th century, a number of discoveries caused a significant leap forward in the drug discovery process.
Until then, drug discovery was based primarily on serendipity and empirical observations in rudimentary physiological models, using animal tissues at first, and assays based on cell cultures and purified proteins later. These latter assays, though more efficient than the animal tissue models, were burdensome and only allowed the evaluation of a few tens of chemical compounds per week. Therefore, the number of opportunities to develop novel therapeutics was rather limited. But advances in recombinant DNA technologies during the 1980s fostered the availability of recombinant proteins that could be used as targets in assays intended to find ligands capable of modulating their function, either in cell-free or in cell-based models. This was the seminal point that gave rise to high throughput screening as a discipline in the following decade. Advances in fluorescence-based reagents in the 1990s enabled the development of more efficient assays that could be performed in 96-well plates, thereby allowing the simultaneous screening of many molecules. Likewise, the low volume of the assays yielded precious savings in chemical compounds, moving from spending 5-10 mg per assay tube to 10-100 µg per assay well. The first articles visible in PubMed with the term “high throughput screening” date from 1991 [2-4], using radioactive tracers to monitor ligand binding. However, most of the HTS work performed in pharmaceutical companies was not disclosed at that time, since it was perceived as a competitive advantage and its results were therefore considered highly sensitive information. Actually, HTS campaigns were first initiated in the 1980s, with a throughput that grew from fewer than 1,000 compounds screened per week in 1986 to nearly 10,000 in 1989 [5].
As throughput increased, corporate compound collections typically containing 10,000-20,000 samples became too small for the nature of the HTS endeavor, conceived as a game of numbers: the larger the number of samples being screened, the higher the chances of finding an interesting molecule. In addition, those collections were usually a reflection of the previous activity of the company and often included poorly developable molecules such as dyes. These facts contributed to limited chemical diversity in the set of compounds being screened, hence precluding the chances of finding novel chemotypes. However, the explosion of combinatorial chemistry and parallel synthesis methods by the mid-1990s, and a new appreciation for the developability properties of molecules brought on by Lipinski [6] in 1997, changed the landscape completely, and in the early 2000s compound collections scaled up to include at least half a million different molecules. The aforementioned advances in fluorescence chemistry gave rise to a plethora of reagents suitable for all forms of fluorescent readouts: fluorescence intensity (FLINT), fluorescence polarization (FP), Förster resonance energy transfer (FRET), time-resolved fluorescence (TRF), fluorescence intensity distribution analysis (FIDA) and confocal fluorescence lifetime analysis (cFLA). All these technologies enabled the development of highly efficient assay formats in homogeneous “mix and read” mode, avoiding cumbersome washing steps and therefore considerably increasing screening throughput. Those advances led to a further decrease in assay volumes, transitioning from 96-well plates in the mid-1990s (dealing with 100-200 µL assays) to 384-well plates (10-50 µL assays) at the end of the decade and 1536-well plates (3-10 µL assays) in the early 2000s. This was accompanied by significant progress in automation, with the development of more proficient robotic systems which made low-volume liquid handling more reliable.
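As a back-of-the-envelope illustration of what this miniaturization means in practice, the per-well volumes quoted above can be turned into the total assay volume of a campaign. The sketch below assumes midpoint volumes taken from the cited ranges and a hypothetical one-million-well screen:

```python
# Rough reagent-volume calculator for a screening campaign, using the
# per-well assay volumes cited in the text (midpoints are assumptions).
PLATE_FORMATS_LITRES = {
    "96-well (mid-1990s)": 150e-6,     # midpoint of 100-200 µL per well
    "384-well (late 1990s)": 30e-6,    # midpoint of 10-50 µL per well
    "1536-well (early 2000s)": 6.5e-6, # midpoint of 3-10 µL per well
}

def campaign_volume_litres(per_well_litres: float, n_wells: int = 1_000_000) -> float:
    """Total assay volume for a campaign of n_wells, one compound per well."""
    return per_well_litres * n_wells

for fmt, v in PLATE_FORMATS_LITRES.items():
    print(f"{fmt}: ~{campaign_volume_litres(v):.0f} L per million wells")
```

Moving from 96- to 1536-well plates thus cuts total assay volume by more than an order of magnitude, which is the driver behind both the cost and the environmental savings discussed elsewhere in this document.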
The founding of the Society for Biomolecular Screening (SBS, currently the Society for Laboratory Automation and Screening, SLAS) prompted the adoption of microplate standards that were paramount to harmonizing the development and manufacturing of new instrumentation, from liquid handlers to plate readers and incubators, and helped their integration into large robotic platforms. The bottleneck shifted from the laboratory to the information technology (IT) desks, as new tools capable of dealing with thousands of data points were demanded. The first versions of these tools were initially rudimentary, customized systems developed within each pharma company (usually as sophisticated macros operating in Excel) and later more robust commercial LIMS platforms capable of capturing, processing and integrating all the information in corporate databases in a data integrity-compliant manner. Simultaneously, new quality control criteria were implemented to monitor and ensure assay performance, avoiding unwanted waste of costly reagents, and sound statistical criteria started to be used for the appropriate selection of hits. These advances contributed to the industrialization of HTS, which evolved into “ultra-HTS” (uHTS), defined as the ability to screen more than 100,000 compounds per day against every single target. The availability of the human genome sequence [7] in 2001 nourished the identification of novel therapeutic targets, and all these facts triggered the definitive explosion of HTS in the pharmaceutical industry during the 2001-2005 period. And, as happens with any sudden explosion of technology, this caused a hype of expectation that shortly afterwards led to disappointment, as some of the unfounded (and somewhat naive) expectations did not pay off, at least in the short term. Most of the reproaches against HTS were elegantly discussed and refuted in a Nature Reviews Drug Discovery article [8] published in 2011.
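The sound statistical criteria referred to above are exemplified by the Z′-factor of Zhang et al. (1999), which measures the separation between the positive- and negative-control distributions on a plate and remains a standard pass/fail metric for HTS assay quality. A minimal sketch, with invented control readings for illustration:

```python
from statistics import mean, stdev

def z_prime(pos, neg):
    """Z'-factor (Zhang et al., 1999):
    Z' = 1 - 3 * (sd_pos + sd_neg) / |mean_pos - mean_neg|.
    Z' > 0.5 is conventionally taken to indicate an excellent assay."""
    separation = abs(mean(pos) - mean(neg))
    return 1 - 3 * (stdev(pos) + stdev(neg)) / separation

# Invented control-well readings (e.g. % signal) for illustration only
pos_controls = [98, 102, 101, 99, 100]  # full-effect wells
neg_controls = [10, 12, 9, 11, 8]       # no-effect wells
print(f"Z' = {z_prime(pos_controls, neg_controls):.2f}")
```

Plates falling below the chosen Z′ threshold are typically rejected and re-run, which is precisely how such criteria avoid the waste of costly reagents mentioned above.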
The fruits delivered by HTS are now becoming more apparent. It is very difficult to trace the origin of drugs already on the market or in late clinical trials to find out whether they originated from an HTS campaign, but surrogate approaches demonstrate that the contribution of screening strategies to the discovery of novel molecular entities has increased in the last few years, and the discipline therefore seems to be paying off, as observed in Figure 2. In the 1990-99 period, 244 novel molecular entities (NMEs) were approved by the FDA and 105 of them (43%) were new in their scaffold and shape (i.e., they were not fast followers, nor inspired by known chemotypes). In the 2000-09 period, this percentage grew modestly to 51% (84 out of the 164 NMEs approved) but the number of these novel molecules decreased (from 105 to 84). However, in the 2010-19 period the percentage expanded to reach 67% (164 out of the 245 NMEs approved) and the figure increased drastically (from 84 to 164), with screening believed to be responsible for this growth [9].
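The percentages quoted above follow directly from the approval counts; a quick arithmetic check, using only the numbers cited in the text:

```python
# Innovative-NME fractions per decade, from the FDA approval counts cited above
approvals = {
    "1990-99": (105, 244),  # (innovative NMEs, total NMEs approved)
    "2000-09": (84, 164),
    "2010-19": (164, 245),
}
for decade, (innovative, total) in approvals.items():
    print(f"{decade}: {innovative}/{total} = {innovative / total:.0%} innovative")
```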

Figure 2: Evolution of drug discovery outcomes in the last three decades. NMEs, novel molecular entities. The term “innovative molecules” refers to those not inspired by known chemotypes, thus being new in their scaffold and shape. The percentage of innovative molecules is calculated with respect to the number of NMEs.

As often happens with hype cycles, the disappointment following the hype was superseded by the degree of maturation and productivity that HTS has reached in the last few years. This maturation is reflected in the adoption of convergent, flexible strategies and the renunciation of previous myths and dogmas that were common in the early 2000s. For instance, diversity screens are combined with fragment-based screenings and rational drug design, complementing each other. Medium- or low-throughput assays, banned in screening platforms two decades ago as inefficient for screening large compound collections, are nowadays accepted using small subsets of the compound libraries or iterative approaches, if those assays are a more valid reflection of the physiology of the target or if they are the only viable option, thus enabling the screening of targets previously considered intractable. These are just a few examples showing how HTS strategies have evolved and matured. In the following chapters a more detailed analysis of the different aspects of HTS will be elaborated, with some considerations about future trends. As already mentioned in the Preface, most of these reflections will focus exclusively on target-based HTS, leaving the rich field of opportunities brought about by phenotypic screenings to be analyzed in a future document.

4. MAIN ACTORS IN THE HTS PLAY

4.1 TARGETS ON SCREENING

Given the sensitive nature of HTS campaigns in pharma companies, accurately determining the nature of the targets being screened in those companies is an almost impossible task, since in many cases this information might not be fully disclosed.
An approximation can be taken by looking at screening data publicly available in databases like PubChem BioAssay, with most of these data being deposited by academic and public institutions (https://www.ncbi.nlm.nih.gov/pcassay/). Alternatively, HTS data published by pharma companies in the most relevant journal in the field (SLAS Discovery, formerly Journal of Biomolecular Screening) can be retrieved for this analysis, but this likely would be incomplete since not all companies publish comprehensive data from their HTS campaigns. Therefore, the information available in the PubChem BioAssay database has been used as the most feasible approach to evaluate current trends in the type of targets being screened.

Figure 3: Frequency distribution of target types (human and non-human). A: Distribution in HTS-related assays (N=332,549 assays) registered in the PubChem BioAssay database. Targets were grouped according to the classification of the IUPHAR/BPS Guide to Pharmacology. The group “enzymes” includes kinases, and the group “receptors” comprises all types of receptors, including GPCRs and nuclear receptors. B: Distribution of targets among the FDA-approved drugs until December 2015, according to the Supplemental Information available in the original article from Santos et al. [10]. This frequency distribution is done on the total number of different targets (N=813), not on the number of drugs targeting them (i.e., each individual target is only counted once even if it is targeted by more than one drug). C: A closer look at the distribution in B, but focused only on small-molecule drugs approved in the 2011-2015 period. See main text for further explanation.

An exploration of this dataset, done by using the “Classification Browser” tool and arranging the information by target type according to the IUPHAR/BPS Guide to Pharmacology (https://www.guidetopharmacology.org/), reveals that more than 300K HTS-related assays (either primary HTS, secondary and confirmation assays, dose-response assays or mechanistic biology assays for HTS hits) have been deposited in the database since 2004. Figure 3A shows how these assays are distributed by target class, with enzyme assays being the most frequent (49%, kinases being included in this group), followed by receptors (40%, including G-protein coupled receptors (GPCRs), nuclear receptors and other membrane receptors). Worth noting, some of the assays considered in this analysis aimed to identify or characterize toxicity issues, hence the significant presence of transporter assays (2%). It is interesting to see how this distribution is similar to that observed in the panel of targets of FDA-approved drugs until 2015 (Fig.
3B), although receptor assays are more poorly represented in the latter panel. This difference is likely due to the fact that this target class includes many GPCRs considered intractable until the late 1990s. However, a number of events in the 2000s increased their perceived tractability: the development of novel molecular biology tools capable of generating transiently and stably transfected cell lines, thus making these targets amenable to screening campaigns; advances in crystallography techniques that uncovered the 3D structure of many of these proteins, thereby enabling rational drug design and virtual screening approaches; and the development of novel assay technologies. Among the latter, it is worth mentioning significant breakthroughs like 1) the discovery of fluorescent calcium indicators [11] that led to the development of FLIPR™ technology, which revolutionized GPCR drug discovery in the late 1990s [12], 2) the design of luminescence-based technologies (e.g., “enzyme fragment complementation” systems used to monitor β-arrestin recruitment as a generic method for all GPCR types, or the formation of cAMP in Gs receptors) and 3) the development of non-radioactive, FRET-based, ligand-binding assay formats. These advances caused a boost in GPCR-targeted HTS campaigns, and consequently they are largely represented in Fig. 3A. These increased efforts in GPCR drug discovery are starting to pay off. A closer look at the FDA-approved drugs in the 2011-2015 period (Fig. 3C) shows that the presence of GPCR targets is significantly higher in this period than in the whole historical series (21% vs. 11%), i.e., a significant part of the approved GPCR-targeted drugs is concentrated in recent years.
With this encouraging success, and considering the relevance of GPCRs in cell physiology, it is likely that this target class will continue to be prevalent in the drug discovery portfolios of many pharmaceutical companies, and therefore continuous improvement in assay formats is likely to occur. In particular, it is expected that cheaper and more accessible instruments to monitor intracellular calcium release (the preferred methodology when screening for Gq receptors) will become available, especially these days when public institutions, with more restricted budgets, are increasingly active in HTS. Indeed, less expensive alternatives to FLIPR™, like FlexStation™ from Molecular Devices (San Jose, CA), are available, and it is foreseeable that this trend will continue and expand in the coming years. Also included in the receptor group are many nuclear receptors that were extensively pursued as drug targets in the early 2000s. Interest in this target class has somewhat declined in the last decade. Nevertheless, there are many proficient assay platforms allowing the development of high throughput assays for nuclear receptors, most of them functional and based on the transcription event triggered by the receptor, leading to the expression of a reporter gene, which is easy to monitor in a high throughput-friendly format. Enzymes are the largest target class being screened and the one with the largest representation in the target set of FDA-approved drugs. Worth mentioning, kinases have been included within this group in the analysis shown in Fig. 3. As happened earlier with GPCRs, kinases were the subject of an explosion in drug discovery efforts in the first decade of the present century, especially for cancer drug discovery. However, many other non-kinase enzymes are responsible for the large impact of this target class.
Indeed, a closer look at the group of enzyme targets of FDA-approved drugs shows that only 16% of them were kinases (including non-protein kinases of human and microbial origin) and the remaining 84% were other enzymes. Nonetheless, and consistent with the explosion mentioned above, many of the drugs targeting human protein kinases were approved in the 2011-15 period; likewise, kinases are largely represented in the target sets of HTS-related assays within the PubChem BioAssay database: 43% of the screened enzymes are kinases. Among the non-kinase enzymes there are many proteases, phosphodiesterases, reductases, dehydrogenases, phospholipases and, interestingly, a large number of enzymes involved in the transfer of methyl groups, consistent with the burst in epigenetics drug discovery programs since 2005. Unlike GPCRs, where the preferred assay technologies involve the measurement of intracellular calcium release, there is no universal method for enzymes. Some groups of enzymes (kinases, phosphatases, NAD(P)H-dependent dehydrogenases), sharing common substrates or products, may benefit from a common assay format, but this is not applicable to all enzymes. Therefore, bespoke assays for individual enzymes are not unusual. On the other hand, in the hit selection and optimization phases of many drug discovery programs, especially enzyme-targeted ones, there is a growing trend to consider not only the thermodynamics of drug-target interactions (measured through surrogates of the equilibrium constant: EC50 and the degree of effect at a given drug concentration) but also their kinetics (i.e., off-rates and target residence times) [13]. Consequently, it would not be surprising to observe a trend towards assay formats enabling continuous, on-line readouts instead of formats exclusively allowing end-point detection, since the former are more suitable for the acquisition of kinetic data.
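As a simple illustration of the kinetic parameters mentioned above: the target residence time is the reciprocal of the dissociation rate constant (τ = 1/k_off), which is one reason compounds with similar equilibrium affinities can still be prioritized differently. A sketch with hypothetical off-rates:

```python
def residence_time_minutes(k_off_per_s: float) -> float:
    """Target residence time tau = 1 / k_off, expressed in minutes."""
    return 1.0 / k_off_per_s / 60.0

# Hypothetical compounds with similar affinity but different off-rates
for name, k_off in [("compound A", 1e-2), ("compound B", 1e-3), ("compound C", 1e-4)]:
    print(f"{name}: k_off = {k_off:g} s^-1 -> tau = {residence_time_minutes(k_off):.1f} min")
```

A tenfold slower off-rate translates directly into a tenfold longer residence time, which is the kind of information only a continuous, kinetic readout can provide.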
Ion channels are the third most frequently considered target class in drug discovery. Since the patch-clamp methods typically used in classical pharmacology studies for these targets are not amenable to high throughput efforts, alternative assay formats were developed. One of the most frequent approaches is the use of ligand-binding assays, but such assays are not considered truly functional, and they are biased towards the identification of compounds binding to the same site as the probe ligand. Functional alternatives were developed, the most popular being the use of voltage sensor probes (VSPs), which consist of FRET probe pairs that are disrupted upon membrane depolarization. Likewise, ion-specific fluorescent probes (like those used for GPCRs) are used to monitor changes in intracellular ions, e.g., calcium for calcium channels and for non-selective cation channels. Efflux assays are also used in several drug discovery programs, although to a much lower extent than those mentioned above, since they use radioactive tracers (usually 86Rb+) or demand low-throughput atomic absorption spectroscopy instruments to detect non-radioactive rubidium; in addition, they offer low temporal resolution. However, efforts in the last decades to adapt the gold-standard patch-clamp technology to high-throughput settings are maturing with the development of planar patch clamps [14]. IonWorks™ from Molecular Devices was the first platform to be developed (superseded by its newer versions IW Quattro™ and IW Barracuda™) and new instruments are currently available incorporating the precision and accuracy of manual electrophysiology: PatchXpress™ (Molecular Devices), IonFlux™ (Fluxion Biosciences; Alameda, CA), Qube™ and QPatch HT/HTX™ (Sophion; Ballerup, Denmark) and Patchliner™ and SyncroPatch™ (Nanion; Munich, Germany) are among those.
Although their throughput is vastly higher than that of manual patch-clamp, it is still low compared to other HTS-friendly technologies. However, they are very useful when screening small compound libraries or for screening campaigns using iterative approaches (like those described in chapter 8) to analyze a lower number of compounds. Still, the price of this instrumentation is nearly prohibitive for many medium or small companies, which will have to rely on alternative methods. It is anticipated that prices will decrease in the near future, and the predictable trend for ion channel screening will likely include a combination of these technologies. Most of the targets considered in drug discovery programs were selected with the intention of modulating cellular processes deemed critically involved in the onset of pathological conditions. The deciphering of cell signaling pathways in the 1980s and 1990s preceded the interest in GPCRs, kinases and nuclear receptors playing critical roles in those pathways. In the 2000s, unveiling the importance of epigenetics in modulating cell physiology fostered interest in many targets involved in the process, and therefore a large number of enzymes acting as “writers” (DNA methyltransferases, lysine histone methyltransferases), “erasers” (histone deacetylases, lysine demethylases) and “readers” (bromodomains) of epigenetic changes have been included in pharma companies' early drug discovery portfolios in the last 10-15 years. This effort has partially paid off with the first and second waves of FDA approvals for drugs directed at epigenetic targets, namely DNA methyltransferases (DNMTs) and histone deacetylases (HDACs).
Six drugs were approved across the two waves, all of them for the treatment of hematological malignancies: the first wave delivered azacitidine (2004) and decitabine (2006), targeting DNMTs, as well as vorinostat (2006) and romidepsin (2009), targeting HDACs; the second wave delivered belinostat (2014) and panobinostat (2015), both targeting HDACs. Since then, more efforts have focused on other epigenetic targets, but several issues are forestalling further success. One of them is the observed poor translation of in vitro results into in vivo efficacy data, for several reasons, the main one being that most epigenetic targets work in multiprotein complexes that are not included in the design of the in vitro assays (this point and its possible solutions will be discussed in section 4.3). In addition, concerns about toxicity usually arise during the development of epidrugs, due both to their poor selectivity for the intended target and to the interplay among epigenetic targets and chromatin-related proteins. Finally, lack of sufficient target validation also contributes to the poor translational success of many epigenetic drug discovery programs. Nonetheless, several epidrug candidates are currently in clinical trials, and efforts within the scientific community to overcome the challenges described above will surely increase success in this field. Targeted protein degradation (TPD) is the latest approach on the drug discovery radar and the one leading current trends at the dawn of the 2020s. The main conceptual difference in this case is that TPD is not being considered merely as a physiological process to be modulated for therapeutic purposes, but rather as cellular machinery to be exploited to modulate protein targets that were neither druggable by small molecule ligands nor accessible to biological agents. Basically, TPD is a strategy intended to label specific target proteins for proteasomal degradation.
It requires finding a suitable ligand for the target protein that can be conjugated through a suitable linker to a signal molecule that recruits a ubiquitin E3 ligase. The resulting molecule, called a PROTAC (“PROteolysis Targeting Chimera”), is a heterobifunctional small molecule formed by a linker and two warheads: one warhead binds to the target protein, and the other recruits the E3 ligase. In this way, the target protein can be ubiquitinated and degraded by the proteasome. The first successful use of PROTACs was published in 2001, using a peptide conjugated to a covalent ligand of the protein targeted for degradation [15]. Since then, the field has rapidly evolved towards the design of non-peptide small molecules acting as sensors for recruiting E3 ubiquitin ligases. The discovery of the E3 ligase cereblon as the target of thalidomide and its analogues, and the clinical success of these molecules in treating multiple myeloma, has propelled a significant amount of work in the field. Such efforts have mostly tried to exploit the E3 family in the search for the ideal ligase cognate for each target, while looking for the most appropriate ligands. This is a nascent field predicted to evolve and grow in the next few years, and it will benefit from efficient HTS strategies. Since new E3 ligases will likely be discovered (as of 2019, only 7 out of circa 600 E3 ligases in the human genome had been exploited for TPD [16]), novel ligase sensors will be needed, and hence suitable binding assays will have to be developed. Likewise, new assay systems will have to be designed to screen for the ligands most efficient at inducing degradation.
Companies like Amphista (London, UK), Lycia (San Diego, CA), C4 Therapeutics (Watertown, MA), Captor (Wroclaw, Poland), Arvinas (New Haven, CT) or Cullgen (San Diego, CA) are actively working in the field, using their own platforms and designing assays to investigate the efficiency of the hits found, considering not only the affinity for the intended target and the cognate E3 ligase, but also the extent and duration of the induced degradation. It is therefore predictable that the pharmaceutical industry will devote more resources to the field and that significant progress will be made in the coming years. Notably, as outlined below in section 4.4, TPD is a perfect match for DNA-encoded library technology (DELT, explained in the same section), and this fact will undoubtedly lead to successful outcomes.

4.2 ASSAY MODALITIES AND TECHNOLOGIES

In the previous section, some assay modalities were discussed in connection with particular target classes. However, it is worth mentioning that target-agnostic technologies will likely influence the evolution of HTS in the near future. In particular, biophysical, label-free technologies are endowed with features that make them attractive from an HTS perspective, as they do not need artificially created ligands or labelled molecules that may be slightly but significantly different from natural ligands. Such technologies would therefore enable the development of assays that more closely resemble the physiological function of the target. In addition, label-free technologies are not susceptible to interference from labeled reagents, which usually leads to unwanted false positives and negatives. Therefore, they do not need orthogonal secondary assays to eliminate such false positives, resulting in valuable savings in time and costs. Among the most widespread label-free assay formats amenable to HTS, it is worth mentioning those based on waveguides, impedance or surface plasmon resonance (SPR).
The latter was one of the first technologies to be developed for drug discovery, in the mid-1990s, although its use for screening purposes is relatively recent, following the availability of appropriate instruments. Though they are different technologies with different underlying events, SPR and waveguide methods are both based on changes in the properties of reflected light following a change in the mass of a target previously immobilized on a suitable surface. Analogously, impedance-based methods register changes in the electrical conductivity of an electrode array located at the bottom of a well containing cells, once such cells have undergone morphological changes as a consequence of treatment with a given compound (Figure 4). Besides enabling the development of “less artificial” assays, these technologies provide biophysical data that can illuminate the mode of action of identified hits, thus offering a means for hit prioritization. Despite these benefits, the implementation of these technologies in the HTS arena has not been as widespread as was predicted ten years ago. Several factors can explain the low uptake, the most important being their lower throughput compared to other HT technologies, the cumbersome preparation of reagents and plates (which also impacts throughput) and the high costs of instruments and, especially, consumables. Nonetheless, they are still used in secondary screenings, as described below for thermal-based methods.

Figure 4: Fundamentals of traditional label-free technologies used in HTS. A: Surface plasmon resonance. B: Waveguide. C: Impedance. Changes in mass (A, B) or in morphology (C) due to biological interactions (lower panels) induce a change in the reflection angle (A), the reflected wavelength (B) or the electrical impedance (C). Following these pioneering technologies, other approaches have become part of the screening armamentarium. Mechanistic assays based on shifts in thermal stability upon ligand binding are frequently considered in drug discovery programs: isothermal titration calorimetry (ITC), cellular thermal shift assay (CETSA) and microscale thermophoresis (MST) are becoming increasingly popular (the latter has been included in this group although it is not, properly speaking, a label-free technology). However, they are also low throughput and are therefore more likely to be used for screening small libraries or in fragment-based screening exercises, given the useful biophysical information they provide. To date, the most successful label-free technology for HTS purposes is mass spectrometry (MS). Initially, MS was coupled to multiplexed liquid chromatography to increase throughput, but this approach still required cycle times of more than 30 s per sample, which were unacceptably high for HTS purposes. Changing to solid phase extraction brought a threefold increase in throughput, still considered low. However, the development of highly efficient MALDI systems such as the RapifleX® MALDI PharmaPulse® (Bruker; Billerica, MA), capable of working with low volumes in 1536-well plates, has caused a dramatic decrease in cycle times to 0.5-1 s per sample, and this technology is therefore now being successfully applied in HTS campaigns17. Still, interference caused by salts is a significant problem when using MS in HTS. The emergence of self-assembled monolayers for MALDI (“SAMDI”) has overcome this important issue.
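To put these cycle times in perspective, the back-of-envelope sketch below (our own illustration, not from the source; only the cycle-time figures come from the text, and plate-handling overhead is ignored) converts per-sample cycle times into effective read throughput:

```python
# Illustrative throughput comparison for label-free MS readouts,
# using the per-sample cycle times quoted in the text.
def samples_per_hour(cycle_time_s: float) -> float:
    """Samples read per hour at a fixed per-sample cycle time."""
    return 3600.0 / cycle_time_s

def hours_for_library(n_compounds: int, cycle_time_s: float) -> float:
    """Total read time (hours) for a library, ignoring plate-handling overhead."""
    return n_compounds * cycle_time_s / 3600.0

lc_ms = samples_per_hour(30.0)   # multiplexed LC-MS: >30 s per sample
spe_ms = samples_per_hour(10.0)  # solid phase extraction: roughly threefold faster
maldi = samples_per_hour(1.0)    # MALDI PharmaPulse: 0.5-1 s per sample

print(f"LC-MS:  {lc_ms:,.0f} samples/h")
print(f"SPE-MS: {spe_ms:,.0f} samples/h")
print(f"MALDI:  {maldi:,.0f} samples/h")
print(f"1M-compound library by MALDI: {hours_for_library(1_000_000, 1.0):.0f} h")
```

At ~1 s per sample, reading a million-compound library takes under two weeks of continuous acquisition, versus close to a year at LC-MS cycle times, which illustrates why MALDI made MS viable for primary HTS.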
SAMDI (developed by SAMDI Tech; Chicago, IL) utilizes biochip arrays with self-assembled monolayers (SAMs) of alkanethiolates on gold, allowing assay execution under any condition: once the assay is completed, the reaction mixture is transferred to the SAM biochip, where the analyte of interest is captured and the rest of the reaction mixture is washed away. Finally, a suitable matrix is applied to the biochip and the sample is analyzed in a MALDI instrument18. Therefore, the only limitation for running screening assays using SAMDI is that the analyte has to be immobilized on the SAM biochip, but a vast number of immobilization chemistries are available and can be easily developed for most analytes. Recent advances allow the acquisition of bidimensional arrays of hundreds of thousands of MS spectra in cells and tissue samples, enabling MS imaging, which is already being used for screening19. The successful implementation of MS for HTS purposes is being followed by other spectroscopy modalities like nuclear magnetic resonance or Raman spectroscopy, and it is not unusual to find reports of HTS campaigns run with these technologies. These advances offer a promising field to be developed in the near future, increasing the arsenal of screening technologies and enabling the HTS of targets difficult to analyze by conventional technologies. 4.3 BIOREAGENTS The provision of bioreagents has been one of the bedrocks of a successful HTS campaign. As explained in the introduction, advances in DNA recombination in the 1980s contributed to the availability of constructs that allowed the preparation of large quantities of recombinant proteins with a high degree of purity, as well as (transiently or stably) transfected cell lines either overexpressing the target of interest or showing a target-driven specific phenotype, thus providing a suitable readout for high throughput assays.
At the beginning of this century it was quite common for big pharma to maintain in-house resources and large production facilities, equipped with bioreactors and protein purification instrumentation, to prepare their own reagents. There is now a growing trend to outsource the preparation of bioreagents, a decision which is even more common in small biopharma and academia. Classical suppliers of life science reagents (Sigma Aldrich, Merck, Promega, VWR, TebuBio, etc.) can provide bespoke bioreagents at reasonable prices depending on scale. Likewise, small companies have flourished in many countries offering similar services. Therefore, it is not difficult to find a solution for bioreagent provision at all scales, from large HTS premises in big pharmaceutical companies to small groups in biotech startups or academic consortia. Furthermore, many commercial suppliers offer tailored cell lines expressing selected targets, ready to use for screening assays after a few optimization steps. Companies like Promega (Madison, WI) or Eurofins DiscoverX (Fremont, CA) possess a rich catalogue of cell lines for screening modulators of the most demanded targets according to scientific trends. A good example these days is the availability of cell lines for a wide number of targets in immuno-oncology. That said, the availability of recombinant proteins or transfected cell lines should never be taken for granted. Improvements in assay technologies and miniaturization trends in HTS have reduced the amount of bioreagents needed, but some proteins remain hard to express in a functional state and in sufficient quantities for HTS purposes. Nevertheless, advances in molecular biology have helped reduce those challenges to a minimum, and savvy molecular biologists usually find a way to overcome expression and/or production issues.
Such advances in molecular biology currently permit scientists to express multiprotein complexes instead of individual entities, a significant leap forward that allows experimental in vitro systems to reflect disease pathophysiology more accurately. As outlined in section 4.1 for epigenetic targets, many cellular functions are possible due to the activity of multiprotein complexes performing in a coordinated way, with individual components modulating the activity of others. Though the presence of large multiprotein complexes in cellular systems has been known for some time, it has recently been discovered that there are many small protein complexes (of no more than five components) and, interestingly, many disease target proteins are part of these small complexes20. Given the prevalence of disease targets in such complexes, monitoring their activity in isolation seems an excessively simplistic approach that may fail to find the right therapeutic modulators and, therefore, may contribute to attrition in drug discovery. Powerful proteomic tools are helping to understand the nature of these complexes, including subunit composition, stoichiometry, and post-translational modifications. All this information is crucial to inform the correct cloning and expression strategy in appropriate systems. The selection of such systems must consider their suitability in terms of speed, reproducibility, scalability and economic feasibility.

Figure 5: The expression of multiprotein complexes involving disease target proteins helps the design of functional in vitro assays with physiological relevance to be used in HTS. Recent advances in molecular biology have made available several expression systems suitable for the preparation of multiprotein complexes at an adequate scale for drug discovery and screening purposes. There are tools that utilize tandem recombination strategies for the simultaneous co-expression of proteins or protein subunits in a single host. For instance, ACEMBL21 in E. coli (also applicable to eukaryotic hosts22), MultiBac23 for insect cells and MultiMam24 for mammalian expression hosts, as well as several tools for co-expression in plants (reviewed in 25), demonstrate that these strategies extend to all types of hosts. They can be used both for transient and stable expression, giving molecular biologists the opportunity to select the most appropriate option depending on the features of the target being considered. These tools clearly point to new trends in bioreagent supply for HTS and drug discovery which will deliver the necessary material to investigate the function of disease targets in a more physiological environment. These targets will not only be the subject of the structural studies needed to afford rational drug design or to run fragment-based screening campaigns, but will also constitute the right substrates for high throughput assays that should deliver more viable hits than those obtained in the past with isolated targets. 4.4 COMPOUND COLLECTIONS AND CHEMICAL DIVERSITY As explained in the introduction, the size of corporate compound collections for HTS grew drastically in the late 1990s in parallel with the deployment of combinatorial chemistry and subsequent array synthesis technology. Consequently, compound collections in big pharma reached sizes of more than one million samples in the early 2000s.
Nonetheless, a concern remained among all scientists involved in HTS regarding the diversity of such collections. Having been nurtured mostly from combinatorial chemistry applied to a limited number of chemistry projects (and thus to a few chemical structures), there was the caveat that the collections were richer in depth than in breadth, i.e., they contained a vast number of variations of a limited number of defined chemical scaffolds (chemotypes). In other words, the collections were poorly representative of the enormous diversity of the chemical space. Although the chemical space is far too large (above 10^60 molecules) to be covered with a full representation, having as accurate a depiction as possible, or at least covering the biologically relevant portions of the space, however those might be defined, is the unmet dream of most pharma company chemistry departments. Therefore, most have devoted significant resources to complementary strategies intended to explore novel regions of chemical space while trying to expand the diversity of their collections as much as possible by investing in the synthesis of novel compounds. The first approach includes several alternatives, all of them sharing rational, structure-driven strategies. Virtual screening, utilizing large in silico compound libraries, has been the most common strategy. It is generally applied in two different forms: ligand-based screening, which tries to find compounds matching the chemical features of known ligands, and structure-based screening, in which virtual libraries are interrogated for docking fitness to the binding regions of known 3D protein structures. Many in silico libraries are available, some of them particularly extensive, like ZINC (http://zinc15.docking.org/), comprising nearly 1 billion different compounds. In recent years, on-demand in silico libraries have been developed by applying known and validated chemical reactions to a divergent set of reagents.
Because they are based on validated reactions, these libraries offer the advantage of being realistic, i.e., their virtual compounds are synthesizable. With the obvious advantage of generating novel proprietary molecules, many pharmaceutical companies have expanded their internal diversity by leveraging their own synthetic experience and novel building blocks, as exemplified by the Pfizer Global Virtual Library (PGVL), the Proximal Lilly Collection (PLC) from Eli Lilly, Merck’s Accessible Inventory (MASSIV), and Boehringer Ingelheim’s BI-Claim26. This trend has also reached academic groups, and two good examples of virtual libraries developed in academia are SCOBIDOO27 and CHIPMUNK28, the latter being particularly rich in compounds well suited to modulate protein-protein interactions. Finally, chemical vendors have also built on-demand libraries, the most extensive being REAL (https://enamine.net/compound-collections/real-compounds/real-database), developed by Enamine (Kyiv, Ukraine), with nearly 2 billion virtual compounds. Interestingly, these libraries are not only being used for virtual screening in the two classical forms described above, but also in a third approach that takes advantage of the benefits of machine learning (ML): so-called generative models, which can build new molecules with desired properties based on the continuous vector representation of a set of molecules used to train the algorithm sustaining the model29. Fragment-based screening (FBS) is another alternative strategy to explore the chemical space with relatively small physical compound libraries. This approach was initiated in the early 2000s, when it became evident that compounds in corporate collections were similar in size to drugs, a feature that can be a disadvantage, as such drug-sized molecules may not be endowed with all the properties needed to display high affinity for their targets (in addition to other considerations related to their ADME-Tox profile).
It followed that screening at higher concentrations and using small fragments as starting points could be a valid and fruitful approach30. The strategy has a thermodynamic rationale based on the ligand efficiency concept31 and also on the entropic savings that lead to increases in affinity of several orders of magnitude when fragment-sized molecules with different enthalpic contributions are combined into a larger molecule preserving those contributions. This fragment-based approach has been pursued by several companies, and so far two drugs on the market originated from it: Vemurafenib, for the treatment of metastatic melanoma, and Venetoclax, for chronic lymphocytic leukemia. In addition to validating the strategy, these successful cases have encouraged drug discovery scientists to consider FBS a must-do activity in their projects, particularly when the structure of the biological target is suited to this approach because it contains binding pockets with cavities amenable to binding small fragments. This determination is propelling the evolution of FBS, led by two major drivers. The first is aimed at scoring the suitability of a given target for FBS, and several computational approaches have been developed to that end32: this allows focusing on the targets with the highest likelihood of success, avoiding wasting resources on targets not suitable for this strategy. The second relies on the development of new chemical approaches to synthesize novel and easy-to-grow fragments, with particular emphasis on structural diversity and three-dimensionality33. Computing tools will be paramount in this regard to mine the interface between chemical synthesis precepts and structural information from the pocketome, as well as to exploit the biological data generated in order to guide and prioritize the synthesis of new fragments34.
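The ligand efficiency concept can be made concrete with a small worked example. The sketch below is our own illustration (the Kd values and heavy-atom counts are hypothetical): it computes the binding free energy per heavy atom, LE = -ΔG/N with ΔG = RT·ln(Kd), and shows why a weakly binding fragment can still be a better starting point than a potent drug-sized hit:

```python
import math

R_KCAL = 0.0019872  # gas constant in kcal/(mol*K)

def ligand_efficiency(kd_molar: float, n_heavy_atoms: int, temp_k: float = 298.0) -> float:
    """Ligand efficiency: binding free energy per heavy atom, in kcal/mol.
    LE = -dG / N_heavy, with dG = RT * ln(Kd)."""
    delta_g = R_KCAL * temp_k * math.log(kd_molar)
    return -delta_g / n_heavy_atoms

# A weak fragment hit (hypothetical: Kd = 1 mM, 12 heavy atoms)...
fragment_le = ligand_efficiency(1e-3, 12)
# ...binds more efficiently per atom than a potent, drug-sized
# HTS hit (hypothetical: Kd = 100 nM, 35 heavy atoms).
hts_hit_le = ligand_efficiency(100e-9, 35)

print(f"fragment LE = {fragment_le:.2f} kcal/mol per heavy atom")
print(f"HTS hit  LE = {hts_hit_le:.2f} kcal/mol per heavy atom")
```

The fragment scores around 0.34 kcal/mol per atom versus roughly 0.27 for the larger hit, which is the quantitative sense in which fragments make efficient growth starting points.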
While FBS is a useful alternative strategy to increase the diversity of HTS compound collections, the most important revolution in this area has been the development of DNA-encoded library technology (DELT), an approach initiated 15 years ago that has matured over the past decade and is now playing a major role in most pharmaceutical companies, delivering successful results as exemplified by the RIP-1 kinase inhibitor GSK2982772, currently undergoing clinical development for psoriasis. The fundamentals of DELT (depicted in Figure 6) are connected to the original concepts of combinatorial chemistry and the “split and pool” procedures capable of generating a large number of molecules from multiple combinations of building blocks, followed by affinity screening of the resultant pools of molecules to identify putative hits. However, since each individual molecule is present in very small amounts, its correct identification is predicated on amplification of the DNA tag that encodes the molecule’s recipe or synthetic sequence. Labeling individual building blocks with DNA oligos of different sequences and lengths uniquely encodes the resulting molecule (Fig. 6A), thus enabling its identification by amplification and sequencing of its DNA tag (Fig. 6B). This identification is unequivocal and highly sensitive, enabling the detection of molecules present in tiny amounts that could not previously be identified by conventional analytical methods. The availability of next generation DNA sequencing, with massive capabilities, has increased the throughput of the identification step with respect to the early days of the technology, therefore sharpening and improving the vast potential of DELT in drug discovery.
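The combinatorial power of split-and-pool synthesis is easy to quantify: the library size is simply the product of the number of building blocks used in each cycle. The sketch below is our own illustration (the building-block counts are hypothetical; the per-compound cost figures are those quoted in this section, from ref. 37):

```python
from math import prod

def delt_library_size(bbs_per_cycle: list) -> int:
    """Split-and-pool combinatorics: the library size is the product
    of the number of building blocks used in each synthesis cycle."""
    return prod(bbs_per_cycle)

# Three cycles of 1,000 building blocks each already reach a billion compounds.
size = delt_library_size([1000, 1000, 1000])

# Cost comparison using the per-compound figures quoted in the text:
CONVENTIONAL_USD = 1000.0  # per compound, conventional collection
DELT_USD = 0.0002          # per compound (0.02 US cents)

print(f"library size: {size:,} compounds")
print(f"1,000,000 conventional compounds: ${1_000_000 * CONVENTIONAL_USD:,.0f}")
print(f"{size:,} DELT compounds: ${size * DELT_USD:,.0f}")
```

On these figures, a billion-compound DNA-encoded library costs less than a thousandth of a conventional million-compound collection, which is why the technology is within reach of academic labs and startups.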

Figure 6: Fundamentals of DELT. A: split and mix of building blocks and labeling of the resulting combination with unique DNA tags. B: the identification of individual hits in a HTS against an immobilized biological target (from 35 and 36). The implementation of DELT provides rapid access to large compound collections (with sizes that may reach billions of compounds) for big and small pharma alike, thus expanding their diversity. Since only minuscule amounts of each compound are necessary, the physical volume of these libraries is significantly smaller than that of conventional compound collections, despite their being orders of magnitude larger in compound numbers. Likewise, the savings in preparing such libraries are huge compared to conventional ones: estimates put the cost per compound in conventional libraries at 1,000 USD, whereas in DELT libraries it is reduced to 0.02 US cents37. This makes DELT libraries available even to academic labs or small startups with budget restrictions. In addition, since each individual molecule is easily identified, libraries are screened in mixtures of compounds, hence fewer assay points are needed to analyze the complete library, resulting in further savings. Furthermore, compounds are tested in binding assays against immobilized targets and, consequently, there is no need to develop functional assays, which can be challenging for some targets. Such functional assays are necessary, however, in a further step to confirm the biological activity of the identified molecule lacking its DNA tag, but since the number of entities to be confirmed will be relatively low, it may not be imperative to have the assays configured in high throughput format. As mentioned in section 4.1, another added advantage of DELT is that it fits very nicely with TPD: once a ligand of the protein to be degraded has been identified, it is very simple to replace the DNA tag with the ligase-recruiting moiety.
Hence, DELT and TPD seem to be a perfect match for many drug discovery efforts. Unfortunately, failure rates in the confirmation assays mentioned above usually reach 50%: many putative hits, prepared synthetically without the DNA tag, fail mainly due to DNA interference in the primary binding assay or to artifactual enrichment caused by incomplete chemical synthesis or tag degradation35. This is not the only inconvenience brought about by the DNA tag. It also limits the chemical reactions that can be used to prepare target molecules, precluding those requiring harsh conditions under which DNA may be labile. In addition, the resynthesis of some individual molecules at large scale, required for subsequent biological tests, can be challenging, as some rare building blocks may not be available in sufficient quantity to carry out the synthesis. These are some of the major challenges DELT is currently facing, which future work is expected to address. The catalogue of chemical reactions tolerated by DNA, however, continues to grow as chemists develop alternative conditions to carry out key chemical transformations. This trend will benefit from the addition of enzyme-catalyzed reactions and the incorporation of multifunctional building blocks. Progress is also expected in the DNA-labeling field to make this step more efficient. Finally, ML tools will likely contribute to the design of improved building blocks best suited for this technology. Increasing the diversity of corporate compound collections remains an important objective for medicinal chemistry departments, but not the only one: quality should never be compromised for quantity, and increasing the quality of the collections is paramount. Indeed, there has been growing concern about the presence of undesirable compounds (the so-called PAINS: pan-assay interference compounds38), so efforts have been undertaken to identify them as early as possible in HTS campaigns.
These compounds usually generate false positives due to interference with the assay readouts (e.g., fluorescent compounds or quenchers of emitted fluorescence), but they may also display apparent biological activity due to unwanted, nonspecific mechanisms like aggregation (forming micelles that trap the biological target or its ligands) or promiscuous chemical reactivity, including strong redox potential. Although secondary assays are commonly included within screening programs to filter out these molecules early in the process, the accumulation of data regarding their nature and the appropriate exploitation of such data will help prevent their inclusion in future collections. All the strategies described above to improve chemical diversity and ultimately discover novel chemical matter have advantages and limitations, and none can be considered a panacea. The wisest option for pharmaceutical companies, assuming they are not resource limited, is a multipronged approach integrating all these strategies in different proportions depending on the nature of the target being considered. Computational approaches and ML tools will play a significant role not only in improving these strategies, but also in deciding how to combine them for each individual target. 5. AUTOMATION TRENDS As explained in the introductory chapter, automated platforms and robotic systems played a prominent role in the expansion of HTS at the end of the 1990s. Different instruments from different vendors (e.g., the first versions of liquid handlers, plate readers, plate grippers or robotic arms moving plates from one device to another, and multiple peripheral components specialized in discrete tasks like removing plate lids or reading barcodes) were integrated using bespoke drivers and tools enabling communication and coordination among them.
This early configuration was soon replaced by bulky workstations, inspired by manufacturing robotics, which included all these functions in one single instrument, demanding considerable lab space and requiring huge investments. Throughout the following decades the role played by robotic units has evolved toward more flexible modular configurations that can be tailored to each laboratory’s needs in each specific situation. Those large robotic workstations are now being replaced by small subunits that are easily integrated and interchangeable. In addition, in recent years the externalization of certain tasks to specialized laboratory automation companies has become a strong trend in the field with the emergence of cloud labs. 5.1 ROBOTS Budget restrictions and, more importantly, the need to adapt to different project types with very different needs in a continuously evolving environment, are motivating pharma companies to change their investment criteria regarding instrumentation. Large investments in big, static workstations have been replaced by the purchase of smaller instruments, covering different individual tasks, which can easily be moved, interchanged, and utilized flexibly. Robotic arms are a good example of individual pieces offering flexible integration across a wide number of activities. In the initial days of HTS, robotic arms were simple plate movers with a single degree of freedom, only capable of transporting plates in a linear motion from one device to an adjacent one. Soon thereafter, limited-access arms, endowed with 2-4 degrees of freedom, were launched. Those degrees of freedom enabled these instruments to move plates among peripheral instruments arranged in a bidimensional layout. Finally, articulated robotic arms with 5 or 6 degrees of freedom became available, and their flexibility has put them at the heart of many robotic platforms, not only in HTS environments but also in many automated labs.

Figure 7: Examples of articulated arms. A: F7 from Thermo Fisher Scientific (Waltham, MA). B: Spinnaker from the same vendor. C: UR3 from Universal Robots. D: PF400 from Precise Automation. The first models of these articulated arms presented some inconveniences that, although common to all three types of robotic arms, were more pronounced for articulated arms given their flexibility of movement. First, the arms had to be taught the positions of the peripherals and had to acquire some spatial orientation. This learning process was complex and usually involved software packages with cumbersome interfaces, making the process excessively time consuming and arduous for the average operator. In addition, since the arms are heavy pieces moving at high speed, they pose a serious safety risk, and hence they were confined within closed platforms behind safeguarding barriers. Those barriers impeded flexible interaction of the operator with the HTS system for common tasks like refreshing reagents or fixing errors. Therefore, operators were sometimes tempted to circumvent the barriers, violating safety rules and putting themselves at risk. The most recent models of articulated arms (presented in Figure 7) have overcome these issues. They can acquire spatial orientation using a specific teaching command, enabling them to learn on the fly without the need for programming or sophisticated software instructions. More importantly, they are equipped with sensors able to detect the presence of external elements (humans or instruments) in their trajectory and respond by softly slowing down their movement, eventually coming to a complete stop when such presence is excessively close. This new generation of articulated robotic arms is commonly known as “cobots” (short for “collaborative robots”), and their use is expanding from the HTS arena to other laboratory operations.
Indeed, they have become common tools in many laboratories progressively enjoying the benefits of automation, providing a good example of the blurring edges between classical HTS environments and modern automated labs not necessarily intended for screening purposes. Together with companies that have been playing in the field since the early days, new ones have emerged in recent years developing novel instrumentation like the latest generation of robotic arms described above. Companies like Precise Automation (Fremont, CA) or Universal Robots (Odense, Denmark) are some of the leading players in the field, and the recent launch of the Universal Robots UR10, equipped with a dual gripper, clearly demonstrates the progress being made. In addition to robotic arms and plate readers, liquid handlers are among the most critical pieces of instrumentation in the automated HTS environment. Considering only the technology supporting their operation, seven types of liquid handlers are available: those based on air displacement, positive displacement, peristaltic pumps, capillarity, acoustic technologies, piezoelectric stacks, and solenoid valves. Together they cover a wide range of dispensed volumes, from pL to mL (see Figure 8).

Figure 8: Range of volumes dispensed by the seven different types of liquid handlers, depending on the technology supporting their operation. However, not all the liquid handlers using these technologies are available for common tasks in HTS like dispensing reagents during assay execution. High prices (e.g., for acoustic, piezoelectric or solenoid-based instruments) or design constraints limit the functionality of some. For instance, acoustic instruments are used exclusively for dispensing compounds, and their high prices and the need for special plates capable of transducing the sound waves are limiting factors for their common use. Capillarity-based systems are also used for compound distribution by replication of a source plate. Some solenoid-based systems, like the D300e from Tecan (Männedorf, Switzerland), are available at a reasonable price. These incorporate Hewlett Packard digital dispensing technology to deliver volumes as low as 12 pL. Again, this instrument is intended for compound addition, although its high accuracy when dealing with such small volumes makes it ideal for preparing concentration-response curves or combinations of compounds to monitor synergistic effects. Liquid handlers based on air displacement or positive displacement are the most common types for routine dispensing of reagents, together with the popular (and less expensive) peristaltic pumps, which are typically used for bulk additions across the entire plate. The continuous need to save precious reagents dictates the trend in the evolution of these instruments, with a two-pronged goal: to minimize dead volumes and to dispense small amounts of reagents for highly miniaturized assays. Air displacement liquid handlers have traditionally been the most popular instruments for that purpose because the air cushion between the moving piston and the liquid being dispensed through disposable tips prevents contamination.
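To illustrate how direct digital dispensing enables concentration-response work, the hypothetical helper below computes the nanoliter volumes of a compound stock to dispense into each well of a dilution series. The function name, parameters and example values are ours, not from any vendor API; it is a dilution-arithmetic sketch, not an instrument driver:

```python
def dispense_volumes_nl(top_conc_um: float, dilution_factor: float, n_points: int,
                        stock_mm: float, assay_volume_ul: float) -> list:
    """Volumes (nL) of stock to dispense directly into each well so that a
    fixed assay volume reaches a geometrically decreasing concentration series.
    moles needed = conc * assay volume; stock volume = moles / stock conc."""
    volumes = []
    for i in range(n_points):
        conc_um = top_conc_um / dilution_factor ** i
        # conc_um [umol/L] * assay_volume_ul [uL] / stock_mm [mmol/L] -> nL of stock
        vol_nl = conc_um * assay_volume_ul / (stock_mm * 1000.0) * 1000.0
        volumes.append(vol_nl)
    return volumes

# Hypothetical 10-point half-log series: top dose 10 uM, 10 mM DMSO stock,
# 10 uL final assay volume -> the top well needs only 10 nL of stock.
vols = dispense_volumes_nl(10.0, 10 ** 0.5, 10, 10.0, 10.0)
print([round(v, 3) for v in vols])
```

Note that the lowest points of such a series fall into the picoliter range, which is exactly why sub-nanoliter dispensers are the natural fit for concentration-response preparation.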
However, they show poor accuracy for volumes below 2 µL and are severely limited when dealing with viscous solutions. In contrast, positive displacement liquid handlers are liquid-agnostic and show high accuracy in the sub-µL range, but they typically use fixed tips that pose a risk of cross contamination; hence they demand washing steps that compromise speed and efficiency. In addition, they are usually not amenable to working with 1536-well plates. However, SPT Labtech (formerly TTP Labtech; Melbourn, UK) has recently launched the Dragonfly 2 model (Figure 9), which addresses both issues: it uses disposable tips equipped with plungers that couple to the moving piston, so no cross contamination occurs, and it is designed to work with 384- and 1536-well plates. On top of that, it requires very low dead volumes (200 µL), so it is perfectly suited for challenging assays requiring highly efficient use of costly reagents.

Figure 9: Dragonfly 2 from SPT Labtech, a positive displacement liquid handler addressing most of the issues associated with this kind of instrument (as detailed in the text). A: overall view of the instrument. B: detail of the disposable tips, showing the tip's internal plunger coupled to the moving piston. In the last few years, novel firms have appeared offering liquid handlers (usually air displacement-based) with flexible configurations, intended for HTS-related tasks such as assay development or lead optimization. Opentrons (New York, NY) launched the OT-2 instrument, which operates under open-source software enabling customized protocols through its Python-based application programming interface. The configuration of the OT-2 deck is flexible, allowing the assembly of different peripherals. Another example is the Andrew+ pipetting robot (from Andrew Alliance, now part of Waters; Milford, MA), which uses conventional electronic pipettes to perform repetitive pipetting operations. Although these instruments are not genuinely intended for HTS, they can be used in the HTS-related tasks described above, and they are undoubtedly key players in modern automated labs. Flexibility has become a key component of modern HTS laboratories. Instead of using fixed instruments and large robotic machines resembling those of traditional factories, the current trend is to utilize modular, easily interchangeable systems, so that the HTS lab design can vary from day to day, adapting to changing demands. Thermo Fisher InSPIRE™ was the first such modular platform, launched in 2018. It can be configured for a wide variety of workflows and is governed by the Momentum™ Workflow Scheduling Software using touch-screen devices. It is expected that by the end of 2021 Thermo Fisher will launch SmartCarts, a system intended to facilitate the docking of instruments within the platform that will help reconfigure InSPIRE™ smartly through visual and tactile tools.
Likewise, companies like HighRes Biosolutions (Woburn, MA) have specialized in developing modular high-end lab automation systems and instruments; in 2015 they partnered with AstraZeneca to develop and deploy several new modular automated screening systems.

With the growing implementation of the Internet of Things (IoT), novel pieces of equipment are being incorporated into HTS labs and into automated labs in general. Sensors and cameras enabling remote control and surveillance of operations are starting to be used, and machine vision will likely be exploited in the short term for automatic inspection, process control, and robotic guidance. For instance, scientists at the Scripps Florida HTS premises have developed an IoT-based system to continuously monitor liquid dispensing accuracy39. It is based on the weight of the reservoir containing the liquid to be dispensed, which sits on a precision balance connected to a microcontroller that captures the data and sends them to the corresponding database. In addition, the liquid handler is monitored by an infrared sensor that identifies when a dispense operation starts and finishes, so data are captured only during the dispensation process and displayed next to the robotic unit. Dispensing errors can be easily identified and narrowed down to the plate position if alterations are observed in the decreasing weight readout (Fig. 10).
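The core check in such a gravimetric monitor can be sketched in a few lines of Python. This is an illustrative simplification, not the Scripps implementation: it assumes one balance reading is logged per dispense step while the infrared sensor reports the dispenser as active, and it flags steps whose weight drop deviates from the expected dispensed mass.

```python
def flag_dispense_errors(weights_g, expected_drop_g, tolerance=0.2):
    """Return the 1-based indices of dispense steps whose reservoir
    weight drop deviates from the expected mass by more than the
    given fractional tolerance."""
    flagged = []
    for i in range(1, len(weights_g)):
        drop = weights_g[i - 1] - weights_g[i]
        if abs(drop - expected_drop_g) > tolerance * expected_drop_g:
            flagged.append(i)
    return flagged

# Reservoir weights captured after each plate; step 3 under-dispensed
readings = [100.0, 99.0, 98.0, 97.5, 96.5]
print(flag_dispense_errors(readings, expected_drop_g=1.0))  # [3]
```

Mapping a flagged step back to a plate barcode is then a simple database lookup, which is essentially how the Scripps system narrows an anomaly down to plate position.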

Figure 10: IoT-based monitoring system developed by Scripps Florida HTS scientists for quality control of liquid dispensing during HTS operations. A: Overview of the process, with data being read from a balance, captured by an Arduino microcontroller, stored in a database, and made available for review via a web interface. B: Graph of the decreasing weight readout from the balance throughout the course of a dispense. Dispenser state (right-hand vertical axis) oscillates between 0 (off) and 1 (on). C: Data captured during a typical HTS process. (From 39.)

Voice assistants are also becoming useful tools in everyday laboratories, and they will likely be implemented for HTS operations. Companies like LabVoice (Research Triangle Park, NC), LabTwin (Berlin, Germany) and HelixAI (Atlanta, GA) have already launched the first voice assistants for the lab. It will be interesting to see how they evolve, not only for lab operations but also for data handling and connectivity. All these innovations have already been implemented in automated labs and will likely make their way into HTS settings, again demonstrating the connections and blurred boundaries between these two highly related environments.

5.2 SMART AND CLOUD LABS ENVIRONMENTS

As already outlined above, lab automation, pioneered a few decades ago by HTS facilities, has spread beyond the HTS arena to reach other laboratories that are progressively embracing the benefits of automation. The emergence in the last few years of so-called “cloud labs” is fostering innovation in automation technologies, these cloud labs having learned from the past experience of HTS labs. Although they can perform screening tasks, they are not necessarily intended for that purpose and may also be involved in other activities strongly related to HTS, such as target validation, assay development, chemical lead optimization and biological lead profiling, to cite a few.
Therefore, HTS settings and cloud labs are strongly intertwined, with the latter focused on running complex suites of ever-changing protocols rather than on high-speed performance of a single fixed protocol. Cloud-based platforms promote accessibility, usability, and sustainability. They also offer vast improvements in experimental reproducibility, a matter of concern in modern science. A recent article40 shows that nearly 40% of the results published in selected papers from high-impact journals like Nature or Science failed to be replicated when the studies were performed by independent groups. Another report previously published in Nature41 shows that 70% of scientists failed to reproduce data generated by others and, even worse, that more than 50% of scientists failed to reproduce their own work. Not only does this lack of reproducibility have negative economic consequences (obvious in the drug discovery world, since wrong data leading to wrong decisions can cause huge late-stage losses), but it also damages society morally, as it erodes credibility and increases distrust in science. Setting aside rare cases of fabrication and falsification, poor reproducibility can often be traced to human error, either in “wet” experimental tasks or in data handling and processing, or to vague and incomplete documentation of experimental methods. Moral hazard also plays a role, particularly under the perverse incentives often found in academic settings, the high pressure to publish being among the most important. It follows that minimizing human interventions and replacing them with robotic actions following experimental methods described in executable code will reduce these errors and improve reproducibility. In addition to reducing human error, transparency is another feature of cloud labs that fosters reproducibility.
Because protocols are written as executable software, when data become public so do those protocols, enabling their execution by others using the same instruments and materials. Likewise, data reside in the cloud in compliance with FAIR (findable, accessible, interoperable and reusable) principles. Consequently, published data can become fully auditable, as permitted by the originating scientist depending on confidentiality constraints. Beyond improved reproducibility, lower costs, more rapid experimental progress, more methodical troubleshooting, and higher operating efficiency are driving the growth of cloud labs. A recent analysis by Emerald Cloud Lab (ECL; San Francisco, CA), one of the pioneering cloud lab companies, shows the cost differences for a startup running its experiments in a cloud lab versus investing in its own laboratory facilities and instrumentation. The results are clear: costs per experiment can be more than four times lower when run in a cloud lab, and savings due to more rapid milestone achievement may exceed 70%42. This cost analysis underscores one of the most convenient aspects of robotic cloud labs: access to millions of dollars’ worth of lab equipment on a subscription basis scaled to utilization levels, without having to bear the capital expense. In addition, the fact that these robotic labs can operate 24/7/365 offers unprecedented efficiency. Teams of scientists distributed around the world are already analyzing their results and designing new experiments during normal working hours, then dispatching those protocols to run overnight in a non-stop production cycle. Finally, remote 24/7/365 operation gives scientific teams high flexibility, making their productivity insensitive to interruptions such as the COVID-19 lockdowns.
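As a toy illustration of what “protocols as executable software” implies for transparency, the snippet below serializes a protocol description to JSON so it can be archived and published alongside the data it generated. The schema and field names here are invented for this sketch; real cloud labs such as ECL and Strateos each define their own protocol languages.

```python
import json

def protocol_record(name, instrument, steps, version="1.0"):
    """Bundle a machine-readable protocol description so it can be
    shared, audited, and re-executed with the results it produced
    (illustrative schema, not any vendor's actual format)."""
    return json.dumps(
        {"protocol": name, "version": version,
         "instrument": instrument, "steps": steps},
        indent=2, sort_keys=True)

record = protocol_record(
    "kinase_inhibition_assay", "dispenser-01",
    steps=[{"op": "dispense", "reagent": "ATP", "vol_ul": 5.0},
           {"op": "incubate", "minutes": 30},
           {"op": "read", "mode": "fluorescence"}])
print(record)
```

Because the record is plain structured text, it can be versioned, diffed, and attached to a dataset, which is precisely what makes published cloud-lab results auditable in a way that free-text methods sections are not.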

Figure 11: Emerald Cloud Lab facilities in San Francisco, CA (image from https://www.emeraldcloudlab.com/)

Founded in 2015, ECL is the pioneering company in the robotic cloud lab field. Strateos (Menlo Park, CA) emerged afterwards, following the merger of Transcriptic with 3Scan. ECL and Strateos offer vast facilities equipped with state-of-the-art robotic instrumentation covering diverse scientific disciplines, including organic synthesis, analytical chemistry, and biochemistry. ECL is expanding its portfolio to include microbiology and cell culture in the short term. Other companies focus on specific disciplines, like Synthego (Redwood City, CA) in genomic engineering and Arctoris (Oxford, UK) in drug discovery, combining HTS and AI approaches. As stated above, the flexibility of these robotic cloud labs makes them suitable for many drug discovery tasks that complement HTS, so they will foreseeably play a prominent role in the field within the next few years.

6. DATA SCIENCES AND INFORMATION TECHNOLOGIES

As mentioned in chapter 3, at the dawn of HTS in the 1990s data analysis was performed using customized systems developed within each company. When the amount of data grew exponentially, more robust commercial LIMS platforms became available. ActivityBase® (developed by IDBS; Guildford, UK) was one of the LIMS platforms used by many companies to capture, process, analyze and store their HTS data. Since then, other products with improved performance have appeared on the market. Screener® (from Genedata; Basel, Switzerland) is currently one of the most popular. It enables users to optimize experiments and data analyses while avoiding time wasted on lengthy data capture, processing, and management tasks. One of the Screener® features most appreciated by its users is its protocol development capability, which is more versatile and user friendly than that of other platforms.
Data visualization is integrated in the software without the need for separate interface development; its sophisticated data viewers allow users to assess the quality of available data across complete campaigns and to determine improvement strategies with advanced data analysis algorithms. Likewise, it seamlessly integrates the whole screening workflow, including compound management, without the need to combine multiple tools. In addition, the product is easy to integrate into corporate IT environments. Dotmatics Studies® (from Dotmatics; Bishop's Stortford, UK) is one of the newest tools for HTS data analysis and management. It extends beyond HTS to cover DMPK studies, hence integrating all the data in a single suite. Besides supporting very high data volumes and providing out-of-the-box and custom protocols and analyses (including plate- and non-plate-based assays), it allows manual or automated processing, a flexibility highly appreciated by customers. Another important advantage lies in its collaborative nature, including cloud-hosted deployments supporting external collaborations. Dotmatics also offers other tools, like Register® for compound registration and Vortex® for data visualization and mining, all of them easily integrated with Studies®. Tools for data visualization like Spotfire® (from Tibco; Palo Alto, CA) or those developed by Tableau (Seattle, WA) are also critical in the HTS process. Therefore, it is common for many software tools, likely from different vendors, to end up incorporated into the HTS data workflow. Proper integration of all those systems into a common network is imperative to prevent experimental data sets from remaining isolated in independent silos, accumulating data that ultimately becomes unusable. Needless to say, such integration must be automatic, avoiding manual data entry and manipulation steps that would put data integrity at risk.
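One standard quality metric that such analysis platforms compute across entire campaigns is the Z'-factor, which quantifies the separation between positive and negative assay controls. The formula is standard in HTS; the short function below is an illustrative implementation, not code from any of the products named above.

```python
from statistics import mean, stdev

def z_prime(pos_controls, neg_controls):
    """Z'-factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
    Values above ~0.5 are conventionally taken to indicate an assay
    with good separation between control populations."""
    return 1 - 3 * (stdev(pos_controls) + stdev(neg_controls)) / abs(
        mean(pos_controls) - mean(neg_controls))

pos = [100, 102, 98, 101, 99]   # e.g. full-signal control wells
neg = [10, 11, 9, 10, 10]       # e.g. background control wells
print(round(z_prime(pos, neg), 3))
```

Running this per plate and plotting the trend over a campaign is exactly the kind of QC view the data viewers described above provide out of the box.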
TetraScience (Boston, MA) offers cloud-based solutions to integrate all these tools. In addition, it harmonizes and transforms the data, unifying data structures in a centralized platform that makes them available for data science and analytics applications such as AI, thereby favoring compliance with FAIR guidelines (Figure 12).
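At its core, harmonization of this kind means mapping heterogeneous instrument exports onto a single canonical schema. The plain-Python sketch below shows the idea; the alias table is invented for illustration and is not TetraScience's actual mapping.

```python
def harmonize(records):
    """Map plate-reader exports with vendor-specific field names onto
    one common schema. Fields with no recognized alias are dropped."""
    aliases = {
        "well":   ("well", "Well", "well_id"),
        "signal": ("signal", "RFU", "OD", "value"),
        "plate":  ("plate", "Plate", "barcode"),
    }
    harmonized = []
    for rec in records:
        row = {}
        for canonical, names in aliases.items():
            for name in names:
                if name in rec:
                    row[canonical] = rec[name]
                    break
        harmonized.append(row)
    return harmonized

# Two instruments, two export formats, one unified result
print(harmonize([
    {"Well": "A1", "RFU": 1200, "Plate": "P1"},
    {"well_id": "B2", "value": 3.4, "barcode": "P2"},
]))
```

Real platforms add unit conversion, validation, and provenance tracking on top of this, but the mapping step is what turns per-instrument silos into analytics-ready, FAIR-friendly data.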

Figure 12: Scheme of the integrated solutions provided by TetraScience to favor adherence to FAIR guidelines. A: Typical data workflow, with data silos generated by independent systems using manual data entry and manipulation. B: Improved data workflow using TetraScience integrative solutions (images from https://www.tetrascience.com).

One of the major issues associated with these informatics systems is the difficulty that small enterprises and academic institutions have in accessing them due to budget limitations. Several factors are at play here. The first is the mindset of upper management, which very often regards these tools as “useful” rather than “absolutely necessary”; hence their acquisition falls low on the scale of priorities. In addition, these systems demand periodic updates and support, so they are perceived as a continuous source of expenditure. Finally, public funding for these informatics solutions is not as straightforward to obtain as for other investments, e.g., instrumentation or personnel. Agencies usually demand extended and carefully built justifications for investing in software tools, again owing to a misinformed mindset that considers them accessories rather than pillars on which to build the knowledge that will support future growth. The critical role of data management and the need for an optimized data workflow have been highlighted by many experts in recent times. The blooming of AI and ML approaches has made evident the absolute need for model-quality data fulfilling the FAIR principles mentioned above. Early implementation of tools like the ones described is critical for adopting data and process standards as well as for harmonizing and optimizing processes, avoiding costly corrections in the future. It is estimated that in the E.U.
alone, the costs of data wrangling in R&D environments (i.e., the process by which data are identified, extracted, cleaned and integrated to yield a data set suitable for exploration and analysis) exceed 28 billion USD per year. The economic impact of poor data management in the HTS field, where massive amounts of data are generated, is therefore significant. Doubtlessly, there is an urgent need to change old mindsets in order to ensure proper implementation of data management strategies in scientific organizations, as a way to raise the value of their most important scientific asset: experimental results and data.

7. SOCIAL COMMITMENT IN HTS

A notable recent aspect of the evolution of HTS is the growing awareness among practitioners of the negative environmental externalities generated by the disposal of vast amounts of consumables, along with the laudable desire to engage in more frequent collaborations with less economically resourced groups. The corporate goal is to improve the long-term sustainability of HTS, as well as the reputation of pharmaceutical companies, in the face of mounting societal pressures.

7.1 PUBLIC-PRIVATE PARTNERSHIPS

Given the high costs associated with HTS, the last decades have seen the emergence of many partnerships between pharmaceutical companies and academic groups. While the former contribute costly instruments and know-how in the drug discovery process, the