Hello everybody. Welcome to today's webinar, presented by Promega and Evotec, titled Enabling High Throughput Screening and Characterization of Targeted Protein Degraders with HiBiT Technology. I'll be your moderator today. My name is Simon Moe. Before we begin, I would like to cover a few housekeeping items. On your screen there are multiple windows, all of which are movable and resizable, so feel free to move them around to get the most out of your desktop experience. In this webinar we'll have many ways to be interactive. You can submit a question at any time during the webinar; we'll answer those at the end during a live Q&A session with the speakers. In the Resource library there is also a list of very helpful materials, which includes a copy of today's presentations. Feel free to download any of those resources or bookmark any links that you may find useful. After the presentations there will be a survey; please take a moment to answer those few questions, as we really appreciate your feedback. You're also welcome to share the webinar with any colleagues who you think would be interested in learning more about its content. Next, I'm going to introduce our featured presenters. First, we have Dr. Kristin Riching. She's a senior research scientist and group leader at Promega. She brings extensive experience in developing innovative products and technologies across diverse scientific fields, with one of her primary focuses being the creation of assays to monitor both the cellular kinetics and the mechanism of action of targeted protein degraders. Then we have Dr. Pierrick Rashard and Dr. Melissan Dewillis Piolaire, both of whom work at Evotec. Evotec is a life science company offering end-to-end drug discovery and development solutions, navigating customers' projects from target identification to the clinic. With more than 4,000 scientists across Europe and the US, the company integrates deep disease understanding, a multi-modality approach, advanced technologies and tailored solutions, all to enhance the probability of success. So this is the agenda for today's webinar, and we will jump right into the introduction on TPD and technologies, presented by Kristin at Promega.

Well, thank you for that introduction and thank you all for joining us today. I'm going to give a brief introduction to the technologies we have developed here at Promega that enable screening and characterization of targeted protein degraders. Targeted protein degradation is a promising emerging therapeutic modality that has a number of significant advantages over traditional occupancy-driven inhibition. It relies on a catalytic, complex series of events to drive a high potency of degradation, and this is a really significant advantage over occupancy-based inhibition because it allows for the potential to achieve high potency with significantly lower doses of compound and with extended durability within the cell. Essentially, by degrading a protein you can eliminate all functions of that protein in the cell, not just those that require some enzymatic function. We can also, with this approach, target proteins lacking well-defined binding pockets. So it truly is a strategy to potentially expand the druggable proteome. But of course, there are many steps in this pathway that have to occur for degradation ultimately to be successful.
And this relies on a very complex series of events. So at Promega, we asked the question: how can we build assays to characterize degrader efficacy? What we want to be able to do is monitor the real-time degradation kinetics. So not only the outcome of this entire process, the degradation of the target itself, but we also want to be able to confirm on-mechanism activity, look at all of the events upstream in this pathway, and potentially identify challenges at each of those upstream steps that might require optimization. And it's really this combination of kinetic and mechanistic characterization that enables deconvolution and some insight into how to optimize the entire process of targeted protein degradation. The main technology we have relied on to enable this is HiBiT technology. HiBiT is a small 11-amino-acid peptide, which complements with high affinity with the larger subunit called LgBiT. These two proteins complement to form a very bright luciferase. And because HiBiT is so small, it is very efficiently inserted into the genome using CRISPR/Cas9 editing technology, where we can precisely tag target proteins of interest in the cell. If we do that in a cell background that is expressing LgBiT, we have our protein of interest tagged with a luciferase that we can use to track the levels of our target protein. When we treat with a degrader, we can track the loss of that luminescent signal, which directly correlates with the loss of that protein over time. And we can generate these really beautiful, quantitative, data-rich degradation profiles, looking at degradation out to 24 hours and in some cases even beyond. Over the years we have used this approach extensively to monitor degradation kinetics of a variety of different proteins and with different degraders, including PROTACs and molecular glues. In many cases, for highly optimized PROTACs, we see very classical, potent and complete degradation. We can often see the presence of a hook effect, where saturation of the binary complexes outcompetes ternary complex formation. We can also see, for some proteins, evidence of rapid recovery following initial degradation. But in other cases, with less optimized compounds, we see more incomplete degradation: partial degradation, slower onset of degradation. This could be due to poor permeability, onset of cellular toxicity, or other parameters within the pathway that need to be optimized. And the shape of these profiles provides mechanistic insight that can be useful for understanding potential routes of degrader optimization. But how do we quantify and rank these kinetic responses? As I mentioned, there's a lot of information one can gain from these kinetic profiles. What we essentially do is take a single curve (each of these curves is a different concentration of compound) and quantify a number of different parameters from it. We can look at the rate of degradation and the degradation maximum. We can even start to look at parameters describing any recovery that's observed, including the recovery rate as well as the Rmax value, which indicates the maximum recovery.
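To make that per-curve quantification concrete, here is a minimal sketch assuming a simple one-phase exponential decay toward a plateau. The model form, parameter names and data are illustrative assumptions rather than Promega's exact fitting routine, and a fuller analysis would also include the recovery terms mentioned above.

```python
# Minimal sketch: fit one fractional-degradation trace to a one-phase decay.
# Assumed model (not Promega's exact equation):
#   f(t) = (1 - Dmax) + Dmax * exp(-k * t)
# where f(t) is the HiBiT signal normalized to the DMSO control at each time,
# k is the degradation rate and Dmax is the maximum fractional degradation.
import numpy as np
from scipy.optimize import curve_fit

def one_phase_decay(t, dmax, k):
    return (1.0 - dmax) + dmax * np.exp(-k * t)

# Hypothetical example data: hours post-treatment and normalized signal.
t_hours = np.array([0, 1, 2, 4, 6, 8, 12, 24], dtype=float)
signal = np.array([1.00, 0.82, 0.65, 0.45, 0.33, 0.28, 0.25, 0.24])

popt, _ = curve_fit(one_phase_decay, t_hours, signal,
                    p0=[0.5, 0.5], bounds=([0.0, 0.0], [1.0, 10.0]))
dmax_fit, k_fit = popt
half_time = np.log(2) / k_fit  # degradation half-time in hours

print(f"Dmax ~ {dmax_fit:.2f}, rate ~ {k_fit:.2f} /h, t1/2 ~ {half_time:.1f} h")
```

Fitting each concentration's trace this way yields one rate and one Dmax per concentration, which is what feeds the secondary, concentration-dependent analysis described next.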
And then what we can do is generate secondary plots of each of these parameters across concentration, to evaluate the kinetic potency of that parameter. We can also couple HiBiT technology with NanoBRET to study mechanisms of targeted protein degradation. NanoBRET is a bioluminescence resonance energy transfer technology that relies on a luminescent energy donor transferring energy to a fluorescent acceptor, in this case a HaloTag fusion protein. If we want to look at a protein-protein interaction, when the two proteins are brought into proximity there is transfer of energy from the luminescent donor, in this case either NanoLuc or the NanoBiT complex (HiBiT complemented with LgBiT), to the fluorescent HaloTag acceptor. And we have optimized a dye for HaloTag that has optimal spectral overlap and increased signal to background. Applying this technology to studying mechanisms of degradation, we have built assays to monitor cellular ternary complex formation via introduction of a HaloTag fusion to the E3 ligase, most commonly either cereblon or VHL. We can also, in the same way, look at ubiquitination by transfection of a HaloTag fusion to ubiquitin. And with a very similar approach, but now using a dye-conjugated small molecule, we can interrogate binary binding to either the target protein or the E3 ligase. So if we have a small-molecule fluorescent tracer that produces energy transfer when bound to the target, we can look at competition of that tracer with our unmodified compounds or PROTACs to interrogate binary engagement of those proteins. And the great thing about NanoBRET in all of these cases is that it is ratiometric, so we can detect these interactions despite ongoing degradation of the target. So we have built an entire portfolio of cellular assays that enable quantitative, real-time characterization of essentially every step within the degradation pathway, starting from characterization of the binary binding and an understanding of permeability: if we run the target engagement assay and compare the displacement IC50s in live and permeabilized cells, we can gain an understanding of PROTAC permeability. But as I mentioned, we can also study ternary complex formation kinetics inside cells and ubiquitination kinetics, in addition to, ultimately, degradation of the endogenous target protein. Very early on, we wanted to get a sense of the impact of ectopic expression of the target on being able to identify potent degraders. To do this, we looked at degradation of BRD9, and we compared transient expression of BRD9 driven either by a CMV promoter, so NanoLuc fused to BRD9, or, on the right here, NanoLuc-BRD9 driven from a weaker TK promoter. You can see that in the case of the CMV promoter, even when we diluted that plasmid a hundredfold into carrier DNA to dial back the expression of NanoLuc-BRD9, we're not able to see really any degradation, and the data are very noisy with all of these compounds. However, with the weaker expression, we can actually see some degraders that appear reasonably potent if we compare to the endogenously expressed BRD9; here we have HiBiT-BRD9 endogenously tagged in the cell.
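Before coming back to the BRD9 data, here is a minimal sketch, with hypothetical data and parameter choices, of how the live versus permeabilized target engagement comparison just described might be analyzed: fit each displacement curve with a standard four-parameter logistic and compare the apparent IC50s. The numbers, helper names, and the use of the IC50 ratio as a crude permeability index are illustrative assumptions rather than Promega's exact analysis pipeline.

```python
# Minimal sketch: compare tracer-displacement potency in live vs. permeabilized
# cells from NanoBRET target engagement data (hypothetical numbers).
# The ratiometric readout is assumed here to be in milliBRET units:
#   mBU = (acceptor signal / donor signal - background ratio) * 1000
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ic50, hill):
    # Four-parameter logistic (Hill) model of tracer displacement.
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

def fit_ic50(conc_um, mbu):
    p0 = [mbu.min(), mbu.max(), np.median(conc_um), 1.0]
    bounds = ([0.0, 0.0, conc_um.min() / 100, 0.2],
              [np.inf, np.inf, conc_um.max() * 100, 5.0])
    popt, _ = curve_fit(four_pl, conc_um, mbu, p0=p0, bounds=bounds)
    return popt[2]  # IC50, same units as conc_um

conc = np.array([0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])  # uM
mbu_live = np.array([20.0, 19.5, 18.8, 17.0, 13.5, 9.0, 5.5, 4.2, 4.0])
mbu_perm = np.array([20.0, 19.0, 16.5, 12.0, 7.5, 5.0, 4.3, 4.1, 4.0])

ic50_live, ic50_perm = fit_ic50(conc, mbu_live), fit_ic50(conc, mbu_perm)
print(f"IC50 live ~ {ic50_live:.2f} uM, permeabilized ~ {ic50_perm:.2f} uM, "
      f"shift ~ {ic50_live / ic50_perm:.1f}-fold")
```

A large right-shift of the live-cell curve relative to the permeabilized one is consistent with poor cell permeability of the compound.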
Coming back to that BRD9 comparison: while we detect the same compound rank order in the transient expression system and with the endogenous protein, the compounds look significantly more active and more potent when we look at the endogenous target. We see faster rates of degradation and greater overall depth of degradation. However, when we looked at a related family member, BRD7, since some of these compounds were also known to target BRD7, we again compared transient expression of NanoLuc-BRD7 from a weak promoter with the endogenous HiBiT-BRD7. Here we can see that there is really only one compound, the most potent compound, for which we're able to detect degradation in the overexpression system, compared to the endogenous target, where we're still able to pick up the other degraders that show modest but real degradation of BRD7. So it is possible to use overexpression, but we have to use weak promoters, and even in that context it's important to realize that this might result in missing some degraders that could be quite potent against the endogenous protein. Also, because of the differences in family member responses, we're really not able to rank family member responses to the same degrader, simply because it's difficult to control the expression levels of both targets. But there are some situations, as we've now seen, where there is interest in developing degraders against targets that are not natively expressed in the cell. So how could we actually study targeted protein degradation of these non-host-cell targets? And is it possible to further optimize expression conditions to give better reliability to this approach and more confidence in the screening results? For this, I would now like to hand it over to my colleagues at Evotec, Pierrick and Melissan. I'd like to thank you for your attention, and just to remind you that we will be holding time for questions at the end.

Hello everyone. Today we'll talk about our approach to targeted protein degradation at Evotec, starting with assay development. A very crucial step in targeted protein degradation research is to develop a good cellular model that is amenable to screening campaigns and hit validation. Here I will show you an example of the approach we took in my group, where we focus on targeted protein degradation of viral targets. We chose to develop inducible cell lines that express tagged viral proteins of interest, tagged either with a HiBiT tag or a NanoLuc tag. We first choose whether we tag the C-terminus or the N-terminus of the protein. Once we generate our cell lines, we first validate their biological function. Here I'm showing an example of a HiBiT-tagged viral polymerase. You can see that after induction, the polymerase is expressed in the cells. We then transfected into the cells all the components that are necessary for genome replication, and we measure genome replication by the polymerase. Here is a control where just the native polymerase is transfected into cells, and you can see the level of genome replication. And here is our cell line where we have induced the HiBiT-tagged polymerase, and you can see that the levels of genome replication are on par with the native protein. So the cell line is validated.
We also do a careful characterization of the tagged protein by monitoring its expression kinetics, which you can see here over time, and also its stability, to verify that the half-life of the tagged protein is close to that of the native protein. The big advantage of choosing an inducible cell line is that you can really control the timing of expression of the viral protein, but also modulate its expression level, as you can see here, depending on how much inducer you add to the cells. So once we've validated those cells, they can be used for high throughput screening campaigns.

Let's first try to summarize the key elements that we consider when choosing between endogenous or ectopic expression of the target protein. As always, there are pros and cons for these two options and no universal rule, but some target specificities will drive the decision. The main element in favor of monitoring the level of the endogenous protein is that you stay as close as possible to the natural conditions and, in particular, you benefit from the natural balance of the different players involved in the degradation process. And we have just seen some examples where this translates into higher sensitivity. But for that, you need to identify a suitable cell line that expresses your target of interest at decent levels, and the lower this expression level, the more limited your assay window and sensitivity. In some instances you won't find any suitable endogenous model, and it will make sense to go for ectopic expression; we have run successful HTS campaigns in such conditions. The use of part of the endogenous promoter of your protein of interest can represent a middle way between endogenous and ectopic expression; however, this does not guarantee that you will get good assay sensitivity. Optimizing the assay window can be done with overexpression from a constitutive promoter, and if you want to fine-tune your expression level you can use inducible systems, as Melissan has just shown. In some cases you may be worried about some kind of imbalance between the different actors, an issue that can be mitigated by simultaneous overexpression of the E3 ligase, something we have observed can improve assay sensitivity. Whatever the model, assay development proceeds more or less in the same way, and this one was straightforward. There is a first step of cell titration, where you determine the minimal amount of cells that you can use in your assay to get a signal strong enough to reliably quantify your protein of interest. The goal is also to confirm that you are within the linear range of the technology, and that any change in the amount of your protein will translate into a change in the luminescent signal you are measuring. We are used to running a CellTiter-Fluor assay in parallel as a surrogate of cell count. In case of low protein expression, you can eventually add a stimulator that is known to increase the level of the target of interest. In this example it does not increase the Z' factor, a marker of assay sensitivity, because that was already high enough in the absence of the stimulator, but what you can see is that it increases the assay window. This helps to improve operational efficiency, but also assay robustness when working with somewhat unstable systems, or to minimize batch-to-batch variability of transiently transfected cells.
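For reference, the Z' factor mentioned here is a simple statistic computed from control wells. Below is a minimal sketch with hypothetical well values; it is illustrative only, not Evotec's actual QC code.

```python
# Minimal sketch: Z' factor and assay window from plate control wells.
# Z' = 1 - 3 * (sd_pos + sd_neg) / |mean_pos - mean_neg|   (Zhang et al., 1999)
import numpy as np

def z_prime(pos, neg):
    pos, neg = np.asarray(pos, float), np.asarray(neg, float)
    return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

def assay_window(pos, neg):
    # Simple signal-to-background ratio.
    return np.mean(pos) / np.mean(neg)

# Hypothetical luminescence values (arbitrary units):
neutral_ctrl = [98000, 102000, 95000, 101000, 99000, 97000]   # DMSO wells, full signal
blank_ctrl   = [2100, 1900, 2300, 2000, 2200, 1800]           # no-cell wells, background

print(f"Z' ~ {z_prime(neutral_ctrl, blank_ctrl):.2f}, "
      f"window ~ {assay_window(neutral_ctrl, blank_ctrl):.0f}-fold")
```

A Z' above roughly 0.5 is generally taken to indicate a robust screening assay.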
We aim to run the assay with assay-ready cells, which are frozen cells stored prior to plate seeding, and we first check in a head-to-head experiment that fresh and frozen cells behave the same before moving to functional validation. When available, we use reference degraders to confirm the cell phenotype in the HiBiT assay, also taking advantage of the CTF readout to make sure we are actually measuring a specific decrease in the level of the target protein and not nonspecific toxicity. We also run functional validation with protein turnover modulators, such as blockers of protein synthesis or ubiquitin-proteasome system inhibitors, and this can be the opportunity to show that your protein of interest is already under the control of proteasomal degradation, when you see an increase in its steady-state level with inhibitors such as MLN4924, the neddylation inhibitor that prevents activation of a number of well-known E3 ligases. But be careful, because a number of these modulators also interfere with cell viability. We actively follow the rapidly evolving literature to take advantage of new modulators described to interfere with protein degradation systems. Once we have defined assay conditions and validated the primary pharmacology, we transfer the assay onto one of our robotic platforms where the high throughput screening will take place. Let's first describe how such an assay is conducted on our robotic platforms, with the example of an endpoint multiplexed assay measuring membrane integrity with the CellTiter-Fluor assay before lytic HiBiT detection. Our robotic systems are more or less large enclosures that contain all the devices you would use at the bench to perform such experiments in microplates, linked together by a robotic arm that replaces you in moving a plate from one device to another. Generally we seed plates offline on a semi-automated cell seeding platform, using an EL406/BioStack setup from Agilent BioTek, and we place the plates in the system in a Cytomat incubator, which provides homogeneous cell growth conditions. After an appropriate incubation time, generally around 14 to 24 hours, CTF reagent is added with a Multidrop Combi and plates are returned to the incubator for 30 minutes. After a short spin, the viability readout is performed on the reader, and HiBiT reagent is then added to each well with an EL406. After incubation at room temperature, which can vary a bit between cell lines, the experiment ends by reading the luminescent signal. Depending on the plate format (384- or 1536-well plates) and the number of readouts, we can in such a system reach a throughput of 10 to 30 plates per hour, corresponding to 50,000 to 200,000 compounds screened per day. I will now move to the strategies that we have progressively developed for hit triage, starting with a case study of a screen of 400,000 compounds from our screening collection performed in 384-well format. For confidentiality reasons I cannot detail the target or some specific aspects, but we should still be able to go through the whole process of hit identification and selection. This starts with primary screening, where we study the effect of our compounds in singlicate, here at 10 micromolar, in a single lytic HiBiT assay. We perform the analysis in Genedata Screener, where the raw signal obtained for each compound is normalized between two control values: the neutral control, corresponding to the vehicle, and the blank control, wells with no cells representing 100% protein loss, in the control area of the plate.
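To make that normalization step concrete, here is a minimal sketch of converting raw luminescence into percent activity (percent protein loss) between the neutral and blank controls. The well values and helper function are hypothetical; the actual analysis is done in Genedata Screener as described.

```python
# Minimal sketch: normalize raw HiBiT luminescence to % activity per plate.
# 0%   = neutral control (DMSO wells, full HiBiT signal)
# 100% = blank control (no-cell wells, complete loss of signal)
import numpy as np

def percent_activity(raw, neutral_wells, blank_wells):
    neutral = np.median(neutral_wells)   # robust plate-wise control estimates
    blank = np.median(blank_wells)
    return 100.0 * (neutral - np.asarray(raw, float)) / (neutral - blank)

# Hypothetical plate data (arbitrary luminescence units):
neutral_ctrl = [98000, 101000, 99500, 97000]
blank_ctrl = [2000, 2200, 1900, 2100]
compound_wells = [96000, 60000, 30000, 5000]

print(np.round(percent_activity(compound_wells, neutral_ctrl, blank_ctrl), 1))
# -> roughly [2.8, 40.1, 71.1, 96.9] % protein loss
```

Run-wise hit thresholds, such as the three-standard-deviation cutoff described shortly, can then be applied on this normalized scale.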
We also have a few wells with a given amount of a reference compound to monitor the reproducibility of its effect intra- and inter-run. At the bottom left you have an overview of normalized signal for a few plates, where white indicates an activity close to zero, meaning the same level as the DMSO wells, and red indicates a decreased HiBiT readout. You see the red line at the right of each plate, which corresponds to the blank control, and you can pick out a few scattered wells on several plates corresponding to potential hits. For QC analysis, we monitor different parameters across the different runs, such as the Z' factor, a global indicator of assay quality; the distribution of activity of the different well types; raw and normalized medians of the controls; and row-wise and column-wise data homogeneity. We can apply different strategies for hit selection, and in this case we determined a run-wise statistical threshold for hit identification based on three times the standard deviation of the mean of the neutral control wells. In this project we screened 400,000 compounds across around 1,200 384-well plates with good overall quality parameters. We ended up with a little more than 4,000 hits, corresponding to slightly more than a 1% hit rate, with activity thresholds spread from 26 to 48% and a rather typical distribution of the activity range. We then pushed those 4,000 hits to hit confirmation, where they were retested in duplicate in the same assay. 67% of the hits were confirmed based on the activity threshold of that experiment, calculated as during the primary screening. But when looking at this correlation plot, with the primary activity on the X axis and the confirmation activity on the Y axis, you see that most of the so-called non-confirmed hits are close to the statistical threshold and show rather similar activity in both experiments; in fact, less than 5% of the tested compounds represented truly non-confirmed hits. If you don't benefit from compound annotation, and you will see in the coming slides how precise these annotations need to be, you base the selection of confirmed hits for the next stage mostly on medicinal chemistry properties, after having removed the usual frequent hitters. In this case, after removal of the unsuitable profiles, we still had to apply chemical tractability and diversity filters to select our target of 750 hits for dose-response characterization. In this next phase, we ran the 736 compounds that were available for cherry picking in duplicate dose-response, ranging from 20 micromolar down to 15 nanomolar, in three different assays. First, the primary assay: you can clearly see in the upper part of the picture that most hits showed dose-dependent activity, shown by these red gradient profiles that occupy the whole plate. In fact, 99.6% of the compounds showed activity, which attests to the robustness of the hit selection. We also assessed cell viability simultaneously, taking advantage of the multiplexed CTF readout. 31% of the tested compounds showed activity in this assay, indicating that the decrease in the target protein level can be nonspecific, due to a decrease in viable cell count. And we lastly performed a HiBiT assay not on live cells but on cell lysates, prepared in a HiBiT lytic buffer and assessed in the same conditions as the primary assay.
But in that case, as the cells are not alive, any change in luminescence cannot come from a change in the amount of the HiBiT-tagged protein; rather it must come from interference with the assay readout, such as inhibition of NanoLuc reconstitution or activity. And you can clearly see in the lower part of the picture that many hits show activity in this lysate assay, representing up to 50% of the compounds we tested, and even 63% of the non-toxic compounds. In terms of hit triage outcome, the viability control screen filtered the 733 actives down to 533 non-toxic actives, and the technology interference assay filtered them further down to 209 hits that are potentially specific, from a technology perspective at least, with some example profiles illustrated at the bottom right: HiBiT activity is depicted in green, viability assessment in yellow, and interference with the technology in blue. One key finding from this study, following our classical workflow, is that a significant part of the hits that were pushed to the dose-response profiling stage are rather obvious artifacts, despite the use of generic filters which include frequent-hitter triaging. So we wondered how we could improve the process and triage out these unwanted compounds earlier. The first aspect we focused on is the identification of technology artifacts, essentially inhibitors of NanoLuc reconstitution or activity, as it looks like they were not very well identified by the frequent-hitter filter. There are actually significant differences between the different NanoLuc variants, with, for example, only 87.6% sequence similarity between NanoLuc and the HiBiT-reconstituted version. We studied the effect of a number of luciferase inhibitors on recombinant or cellular NanoLuc versus HiBiT or NanoBiT assays, and we found that most NanoLuc inhibitors have no impact on HiBiT or NanoBiT, but that, on the other hand, some HiBiT inhibitors are inactive against NanoLuc. Moreover, even within the same technological background, we need to pay attention to the assay conditions. In these two graphs at the bottom, we are working with cells that constitutively express the LgBiT subunit. On the left-hand graph we compare the activity of compounds that were incubated with cells for 20 hours prior to addition of the lytic substrate, on the X axis, to the activity obtained with virtually no incubation time, on the Y axis, which mimics the conditions of the lysate assay. You can see there that a number of NanoLuc interferers show the same activity in these two conditions. However, when we look at the same compounds on the right and compare the primary activity to the activity measured in a classical HiBiT assay, where we add, on top of the substrate, an excess of LgBiT subunit, many of those interferers now show different activities in the two settings. The difference between the two comparisons likely results from differences in LgBiT levels, LgBiT probably being the target of those differentially behaving compounds. So despite the existence of many annotations related to luminescent assays in screening libraries, limited extrapolation is possible between the different technologies, and we can only rely on very precise annotations, or well-controlled comparisons, to efficiently triage our technological artifacts. Maybe one additional comment on this topic, which is that this question will have a more or less dramatic impact on hit triage depending on the library you are screening.
You would expect to find more NanoLuc interference in a diverse library than in a molecular glue set, for example. The second category of hits that we want to triage out is cytotoxic compounds, as it is obvious that a decreasing cell count is associated with a decreasing HiBiT signal. Once again, it remains difficult to rely on annotations, since toxicity depends on the cellular model and incubation time. Cell viability can be assessed as early as primary screening using a multiplexed assay such as CellTiter-Fluor. However, the correlation between HiBiT and CTF activity appears rather low, and you will always be reluctant to eliminate compounds with 60 to 90% activity in the HiBiT assay but less than 40% activity on the CTF readout, even if we see in this graph that most of the hits identified as toxic in the single-concentration study, based on the statistical activity threshold, are indeed confirmed as active in further dose-response profiling, depicted as red dots here. Of course, a better estimation of toxicity is derived from dose-response studies, and it also looks interesting to include a counter assay at a short incubation time to mitigate the toxicity interference as much as possible. Finally, there is a third circumstance where you can expect to triage out artifacts based on a selectivity profile: when running several targets in parallel. This strategy can be highly efficient, as depicted in this figure, where you can observe a rather good alignment for most hits between two different targets (one on the X axis and the other on the Y axis) studied with the same technology, a lytic HiBiT assay, in the same cellular background. As expected, you see that toxic compounds appear well aligned, and the only specific hit was obviously identified as being active against only one of the two targets. However, for such a comparison to be powerful, it has to be conducted in the same background with the same technology, which is in line with a recent report from AstraZeneca that also points out the importance of the similarity between the two protein half-lives. At this stage, we should have identified compounds that decrease the amount of our protein of interest and ruled out the main causes of artifactual identification. Let's now see with Melissan how we characterize these hits to assess whether they are inducing genuine targeted protein degradation.

Our first step is to validate that the screening hits degrade the protein of interest in a proteasome-dependent manner. To do that, we can perform either chemical competition or genetic depletion of the E3 ligase of interest. On the left side, I'm showing an example of chemical competition experiments: what we want to see is that degradation of the protein of interest is competed by a neddylation inhibitor, an E3 ligase binder, or a proteasome inhibitor. I'm showing you here an example of what we want, using BRD4 and the cereblon-dependent PROTAC dBET6. You can see that there is complete degradation of BRD4 in the cells after treatment with dBET6, and this is completely rescued when we co-treat the cells with MLN4924, the neddylation inhibitor, or with a cereblon binder. We applied the same approach to the screening hits, in collaboration with Pierrick's group. You can see here an example, on the left, of dose-dependent degradation of a protein of interest: in blue, the HiBiT signal really decreases with higher concentrations of the hit.
And this degradation is completely abrogated when we co-treat with MLN4924 or with the E3 ligase binder. On the right side you have an example of a hit that is not at all proteasome dependent, because upon co-treatment with the two chemical competitors, you still see the same level of protein degradation after incubation with the hit. At Evotec we can also generate CRISPR knockout cell lines that are depleted of the E3 ligase of interest. This is what we did for some of the cell lines that express our tagged viral target. I'm showing you an example of the characterization of a cell line where we have clear depletion of the E3 ligase while we still get a good level of expression of our target of interest, at the same level as in the wild-type cells. So these cell lines can easily be used for hit validation: we can perform parallel degradation assays in both lines, and you can see, for the two hits exemplified here, that we really lose target degradation when we perform the testing in the E3 ligase knockout cell line. To further validate and characterize the screening hits, we also use the NanoBRET assays developed by Promega. I'm showing here, on the left, an example of ternary complex formation, where we monitor the ability of a degrader molecule to bring into proximity our protein of interest, tagged with NanoLuc, and an E3 ligase, tagged with HaloTag. This proximity generates a BRET signal. You can see here a really nice hit that shows a dose-dependent increase in BRET signal, showing that the two proteins are really brought into proximity by this degrader compound. We can also use the NanoBRET assay to monitor ubiquitination of the target protein; in this case the ubiquitin is tagged with HaloTag. You can see here an example where the increase in BRET signal is seen after incubation with those two degraders, showing that those degraders trigger ubiquitination of the target of interest. Overall, these assays are really useful to further validate the on-target activity of the degraders and rank them for future studies. In conclusion, at Evotec we have optimized a robust targeted protein degradation platform utilizing HiBiT technology. Our extensive experience allows us to select robust cellular models tailored to the target of interest, featuring either endogenous or ectopic expression of the HiBiT-tagged protein. Our robotic capabilities enable us to conduct high or ultra-high throughput screens with versatile readouts, including live-cell or lytic endpoint measurements. More importantly, our experience has led us to adapt the workflows and establish a specific hit selection process, which varies depending on the model and compound library in use. We have also developed a robust hit validation cascade, incorporating competition with ubiquitin-proteasome system inhibitors and evaluation of the loss of compound efficacy in E3 ligase-depleted cells. By leveraging NanoBRET technology, we can also validate ternary complex formation and ubiquitination. Additionally, we can conduct ad hoc experiments such as degradation kinetics or proteomic studies. Overall, this comprehensive process enables the selection of the most promising hits for evaluation in relevant disease models. Thank you for your attention.

All right. Now we are going to transition into a live Q&A with the featured presenters. So I will start with a question that was posed.
It goes: what do you think of a HiBiT cell lysate assay plus a CellTiter-Glo assay in tandem for toxicity counter-screening? Is there a significant benefit to using the live-cell platform for endpoint studies? So I can take this one. Hi everybody. First, the lysate assay answers a different question from the CTG assay, since the lysate assay needs to be performed without incubation of the compound. The goal is to make sure that there is no change in the level of the protein of interest, so that if there is a change in the HiBiT signal, we can be sure it is independent of the level of the protein. On the other side, for the CTG readout, we do need compound incubation to appreciate the toxicity of the compound. Great. And we'll jump into a couple of other questions. This one: what is an ideal hit triage workflow (this is for Evotec), and what does it look like in the context of a HiBiT-based high throughput screen for protein degraders? I can also take this one. First, within the context of a classical HTS workflow, we start with primary screening, with incubation of the compounds at a single concentration, and then we have a phase of hit confirmation where these primary hits are retested in the same settings in triplicate, before moving to dose-response characterization. I would recommend triaging out the technological artifacts as soon as possible, meaning at the confirmation stage. Of course, annotations can be used, but as we have seen, those annotations should be established in the exact same settings as the ones you are using at the moment, which is not always the case. I also believe it is interesting to include, at this confirmation stage, an early time point measurement, maybe not to triage out compounds that are not active at this early time point, around 6 to 8 hours, but mostly to prioritize the ones which are active and which have more chance of being independent of toxicity and of reflecting a direct effect. And I think a cytotoxicity readout at the confirmation stage, if it hasn't been obtained during the primary screening, will also be useful for characterizing compounds for the next phase. For this next phase of dose-response profiling, we would recommend once again repeating the technology and toxicity counter screens, and including the competition studies with UPS inhibitors, both at the final and at an early time point. Great. All right, moving to the next one. This one goes: seeing a loss in protein levels could be due to a lot of things; how do you best recommend determining whether it is specific to your PROTAC? Yeah, I think, Pierrick, you already kind of answered this one in the last question. You always want to confirm that what you're seeing is on mechanism, that it's due to degradation. Pierrick and Melissan showed that you also need to incorporate a cell viability readout as a counter screen, to make sure that you're not seeing a loss of signal that is just driven by toxicity. But then you also want to make sure that you can successfully compete the degradation with excess E3 handle, and you want to confirm that it's proteasome dependent; we typically use proteasome inhibitors or other UPS pathway inhibitors. And then, and this is I think what was really nice about the discussion, you also want to check for potential interference, compounds that can impart some interference to the signal itself.
And you can look at that just using cell lysates, dosing the compounds into the lysate, because you would not expect there to be any degradation in that context. So if you see a loss of signal in a lysate, that could just be indicative of interference. There are other ways to look for interference as well, because you would not expect degradation to be very rapid: even in a live-cell setting, if you added compounds and performed a measurement right away at time zero and saw a loss in signal, that is also indicative of potential interference. And then you could look at interference with just the purified proteins. If you combine purified HiBiT with LgBiT and you see that addition of compound results in a reduction in signal, that is another way to look at this and confirm the mechanism. So there are a lot of different ways to confirm that the loss is indeed due to degradation and is not some artifact of the system or an off-target effect. Great. And then we got a question that just came in asking about the advantage of running a protein-protein interaction study with NanoBRET rather than NanoBiT, so maybe a comparison between those two strategies? Yeah, I can probably take that question as well. We've looked at this quite a bit, in fact, and to be perfectly honest, both technologies can work if you want to look at an induced protein-protein interaction. I would say the main advantage that NanoBRET brings is that it's ratiometric. So if you're looking at an interaction that results in loss of one of the partners, as is typically the case with a degrader mechanism, the ratiometric nature of BRET means that you can still detect that interaction despite the degradation of the target. With NanoBiT, you would have to rely on a proteasome inhibitor to block the degradation, because if you saw a loss of the target, that would then lead to a loss in the NanoBiT signal, so it would be a little more difficult to uncouple degradation from the protein-protein interaction. But we've actually seen that when you do that, running the NanoBiT format with a proteasome inhibitor, you see very similar results and assay window, you know, the fold change of ternary complex formation, so both technologies can be used for this purpose. All right. And then this one, I think, might be appropriate for Evotec: what percentage signal loss would you say is common for initial hit calling, as we are looking for even a slight loss in signal, and overcoming even slight noise seems very important? We have had varied experience with that. Basically, we start getting interested in a loss of signal at around 30 to 40%, and we have seen for the initial hits losses that go from there up to almost 100%. Of course, after the screening stage we still have a number of experiments to perform to make sure that these are genuine hits, or at least genuine protein degraders. There is still some concern, even with the most potent compounds that show 100% signal loss, as to whether it is actually true degradation. So it's very difficult to answer this question; we see different kinds of profiles, and it's rather common to see degradation that is not total.
Yeah, I can add a little to that as well, because across many different targets and with different compounds, highly optimized compounds versus early-stage compounds, we can see very different levels of Dmax. In general, there are many things that can come into play here. As I showed from the kinetic perspective, some of those kinetic profiles can look very slow; the rate of degradation can be very slow, indicating an inefficient overall mechanism. Maybe the kinetics of permeability, ternary complex formation, ubiquitination and so on are just very inefficient, and this can drive a reduced depth of degradation, so not a very high Dmax. Potentially, if you were able to optimize those kinetics, you would drive deeper degradation, and in general we do see a pretty good correlation: if you're able to drive faster kinetics and improve the rate of degradation, that improves Dmax. But of course, the native turnover of the protein also comes into play, and where you get that plateau in the kinetic profile is very much dependent on the native resynthesis rate and the balance of synthesis with degradation. There can also be different populations of protein, whether within the cell or within the population of cells. Maybe not all cells are responding in the same way, or you have differentially localized protein, where perhaps your PROTAC is more effectively targeting one population versus another, and so you're really looking at the balance of those two populations. So there are many things that can come into play with respect to the depth of degradation. And maybe one additional comment regarding screening: being able to see a well-defined plateau is quite a rare case. Most of the time we have compounds with moderate potency, where the activity relies on only two to three concentrations, and in that case it's not very easy to define precisely. Great. And we have a kind of general question about HiBiT as a tag; it was asked how it might interfere with folding of the protein of interest. Yeah, I can maybe take that one. I think we would see it, especially with endogenous tagging, and even to some extent with overexpression: if HiBiT interfered with protein folding, you probably wouldn't be able to detect the protein, because if the protein weren't folded, I would expect its stability in the cell to be very poor. Ultimately, if you were establishing an expression system, whether endogenous or ectopic, and the protein were not folded, you would probably just never detect it; that would be my expectation. So in a way you're always screening against those artifacts, because they would show up as an assay that just isn't robust. Great. And then we'll probably go through maybe one or two more questions as we're closing in on the hour. Let me see if I can find one. This one is asking whether any of you have advice on time frames to assess degradation when kinetic experiments may not be possible. The time frame I could propose would be between 14 and 24 hours, and most of the time I would focus on 14 to 20 hours.
And I will mostly give a range of incubation times because, anyway, when dealing with screening we cannot have very precise control of the incubation time and we mostly rely on a window of conditions, so that would be from 14 to 20 hours. Yeah, and I would also add that I think it depends too on what you want to screen for. If you're really trying to optimize the kinetics and the rate of degradation, it can also be helpful to have an early time point. So sometimes we like to include, in addition to an overnight or 24-hour time point, an early four-to-six-hour time point, because you might want to see, of those compounds that do achieve robust degradation, how many achieved it very quickly. And what's nice about the early time point too is that you wouldn't always expect to see a significant impact of toxicity yet. So if there is degradation that leads to cell death, it can be helpful to deconvolute that a little better by seeing whether the degradation occurred earlier and the toxicity followed later. Exactly, this is why I was suggesting in the workflow to add this early time point as soon as hit confirmation. Yeah, that's great. All right. And then I think this might be a good question to wrap up on: beyond TPD, are there other applications for HiBiT technology in high throughput screening? For sure, at the moment there is a lot of excitement about TPD, so most of our HiBiT assays running at the moment are dealing with TPD. However, we have also run some other projects looking for something else with this technology, because the beauty of HiBiT is its HTS-friendly aspect and its robustness. For example, when we are dealing with a protein where the amount of the target protein of interest is a surrogate marker of the activation of events in a pathway, a lytic HiBiT assay can be something interesting there. And there is also the use of non-lytic HiBiT, where we use a cell-impermeant substrate in non-lytic conditions, so we can monitor the level of a membrane protein at the cell surface and therefore study either its cleavage or its internalization. All right, great. So we're coming up right on the hour. Just to be sensitive of time, I think we may have time for one more question, so I'll sneak this one in so the speakers have time to answer it quickly: have you multiplexed NanoBRET ubiquitination assays with CTF/CTG viability or HiBiT degradation assays? Yeah. So we can multiplex any of the luminescent assays (HiBiT or NanoBRET) with a cell viability readout, and it just depends on the format. If you're using a HiBiT lytic assay, you can multiplex with the CellTiter-Fluor viability assay, because the HiBiT lytic assay is not compatible with CellTiter-Glo; you'd have to do those in two separate plates. If you're doing a live-cell readout, however, with either HiBiT or a NanoBRET live-cell assay, you could multiplex either of those with the CellTiter-Glo assay, because you have a live-cell readout and then you come in at the end with CellTiter-Glo, which is lytic. Did that answer the question, or was there another component to it? Yeah, so they're asking if they can be multiplexed with CTF/CTG or HiBiT degradation assays, so NanoBRET ubiquitination multiplexed with HiBiT degradation assays. Mm-hmm. Yeah, so the other thing I was going to mention: the nice thing about NanoBRET, again, is that you get both readouts.
So you can multiplex a NanoBRET assay with a CellTiter-Glo cell viability readout, but from the NanoBRET assay itself, in this case ubiquitination, you also get degradation, because you're monitoring both your donor luminescence, which reflects your target protein levels, and your BRET signal, which reflects the interaction. So you can potentially get three readouts from one assay. Great. Thank you so much. OK, so now, for the sake of time, I will be wrapping up the live Q&A. If your question was not answered, we will be responding to you after the webinar; whoever is most appropriate from among the speakers will get back to you. But I want to say thank you very much for attending the webinar, and we hope you have a great rest of your day. Thank you very much.