The importance of analytics in biopharma is often overlooked, despite new technologies relying heavily on these processes for innovation. Here, we hear from Ruizhi Wang, Founder and CEO at Abselion, who shares his experiences and views on opening up the black box of analytics.
What first drew you to science and engineering – specifically electrical and mechanical engineering?
I studied mechanical and electrical engineering across Switzerland, the United States, and the UK, and later completed my PhD in Cambridge.
What drew me to engineering was my interest in physics and mathematics, and how they can be applied in real-world settings. Engineering offers a practical way to use those principles. There’s also an element of curiosity – wanting to understand how things work – which probably started early on.
What I find particularly interesting about engineering is its precision. Systems are well defined and behave predictably if you understand or measure them correctly. That’s especially true in semiconductor engineering, which is my specialty and is rooted in concepts like quantum mechanics and other key areas of modern physics.
Moving into biology was a contrast. Biological systems are far more complex and less controlled. They involve many variables and unpredictable conditions.
That challenge was appealing – the idea of applying an engineering mindset to a more complex and less structured field.
How did you get involved with co-founding Abselion?
The company grew directly out of my PhD work. During my PhD, I focused on semiconductor devices and surfaces, and that research became the technological foundation of Abselion.
The core technology is based on a semiconductor chip that enables a measurement method called redox electrochemical detection. Unlike most techniques used in life sciences, which are optical, this approach is solid-state and measures electrical signals.
The idea for applying this technology came from my own experience during my PhD. I spent a lot of time using advanced analytical tools, particularly electron microscopy. These systems require extensive training and are expensive, and even after training, access can be very limited. It made me realize that even the most advanced tools are not always useful if they are not accessible or easy to use.
I shared this frustration with others, and at Cambridge, I had the opportunity to interact with researchers from many different disciplines. Through these conversations, it became clear that similar challenges existed in the life sciences. That led to the early idea that the technology I was developing could help simplify analytical workflows in that field.
From there, it took several years of prototyping and generating proof-of-concept data to develop the technology into a product.
One of the biggest challenges in turning research into a company is the shift from a structured environment to one where you must define your own direction. As a student, there is usually a clear path and guidance. When starting a company, especially in a new area, you have to decide what to do at every stage. That need to set your own direction is one of the most difficult aspects of building something from scratch.
What trends are you seeing in bioprocessing and how are they affecting development processes?
It can be helpful to look at this in a modality-specific way, because the challenges differ depending on what type of therapy is being developed.
Our product can rapidly generate titer data, including measuring product concentrations directly from non-purified samples. We apply this across several areas, and the challenges vary by application.
For example, in biologics – particularly antibodies, which are the most established class – there are already well-defined processes and high expectations. However, the field is evolving beyond traditional monoclonal antibodies. New formats such as bispecific antibodies, antibody–drug conjugates (ADCs), and hexameric antibodies are becoming more common. These behave differently, and existing analytical tools are not always sufficient, so new analytical approaches are needed.
At the same time, development timelines are becoming shorter. There is increasing pressure to generate high-quality data more quickly, which affects all stages of development, including cell culture, screening, and analytical testing.
In contrast, gene therapy – particularly using adeno-associated virus (AAV) vectors – is a newer field. While there is strong potential, processes for development, manufacturing, and analytics are still evolving. There is greater variability in how work is done across different teams, and the field is still working toward standardization to ensure consistent and reproducible data.
Analytical complexity is also higher in gene therapy. For AAVs, there are multiple key measurements. These include capsid titer (the number of viral particles), genome titer (the number of particles containing the therapeutic genetic material), and infectious titer (the number of particles capable of delivering that material into cells). Interpreting these requires a detailed understanding of the system.
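The relationship between these three titers can be illustrated with simple ratios. The sketch below uses hypothetical numbers, not data from any real process: the fraction of full capsids and the genome-to-infectious ratio are common derived metrics in AAV analytics.

```python
# Illustrative only: hypothetical titer values for one AAV sample.
capsid_titer = 1.0e12      # total viral particles per mL (full + empty)
genome_titer = 4.0e11      # particles per mL containing the therapeutic genome
infectious_titer = 2.0e10  # particles per mL able to deliver that genome into cells

# Fraction of capsids that are "full" (carry the genetic payload).
percent_full = 100 * genome_titer / capsid_titer

# Ratio of genome-containing particles to infectious particles: a rough
# indicator of how many packaged genomes it takes to achieve one infection.
genome_to_infectious = genome_titer / infectious_titer

print(f"{percent_full:.0f}% full capsids")                      # 40% full capsids
print(f"{genome_to_infectious:.0f}:1 genome:infectious ratio")  # 20:1
```

With these made-up values, only 40% of particles carry the therapeutic genome, and roughly 20 genome-containing particles are needed per successful infection – exactly the kind of interpretation that requires understanding what each assay actually measures.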
In comparison, antibody development is more straightforward analytically, as it typically focuses on a single measurement of the target molecule without the need to distinguish between multiple functional states.
Could you give us an overview of your technology, how it supports different modalities, and how it has evolved over time?
At the core of our instrument, Amperion, is a semiconductor chip that is highly sensitive to electrochemical reactions.
To detect proteins – or viral vectors – we rely on binding events at the chip surface. When a target molecule binds, it triggers an electrochemical reaction that can be measured quickly and with high specificity.
Because this is a platform technology, it is largely agnostic to the type of molecule being measured. The system can be adapted to different modalities and a wide range of targets.
We initially focused on antibodies, as they are the most established class of biologics. From there, we expanded into AAV-based gene therapies, which are a growing area.
More recently, we have extended the platform to additional applications, including the quantification of tagged proteins, such as enzymes or reagents that carry specific tags. If a tag is present, it can be used to capture and measure the protein concentration.
We are also developing assays for host cell proteins. Unlike product-focused measurements, this application targets impurities during the production process, which is important for quality control.
Several of these assays are being launched this year, with additional applications in development. The flexibility of the platform allows it to be applied across multiple areas, making it suitable for a wide range of analytical needs in bioprocessing.
What are some of the biggest misconceptions in the industry around titer measurements?
One of the biggest misconceptions is that titer analytics can be treated as a “black box.”
There is often an assumption that you can simply run a sample, generate a number, and move on without considering how that result was produced. This is understandable – analytics is often just one part of a larger workflow, and teams are under pressure to move quickly. However, without understanding how measurements are generated, it becomes harder to interpret results correctly. In some cases, data may appear precise and reliable but not actually reflect what is happening in the sample or process. The same measurement can also have different meanings depending on the conditions or sample type.
This misconception also extends to how analytics is applied across development stages. From early discovery through to large-scale manufacturing, the same measurement – such as titer – is often performed using entirely different tools at each stage. While differences in scale are expected, the lack of consistency in analytical methods can create challenges. Results are not always directly comparable, and teams often need to redevelop analytical methods when moving between stages such as cell line development and process development.
This fragmented approach adds complexity, increases development time, and can make it harder to build a clear understanding of the data.
A more effective approach is to treat analytics as an integrated, well-understood part of the development process. Greater transparency in how measurements are generated, combined with the use of consistent analytical platforms across stages, can improve data interpretation, reduce inefficiencies, and support better decision-making throughout development.
How are regulatory expectations around titer measurements for antibodies and gene therapies evolving, and what does this mean for technology developers and process development labs?
Building on the earlier point about misconceptions, regulators are increasingly recognizing the importance of understanding how measurements are made – not just the final numbers produced.
Recent guidance, such as ICH Q14 (released in 2023 and now being adopted), reflects this shift. It emphasizes that analytical methods should not be treated as a “black box.” Instead, there is a greater focus on understanding how results are generated and on continuously validating methods throughout development.
For antibodies, this improves traceability across different stages of development, where concentration ranges, sample types, and impurity profiles all vary.
For gene therapies, it helps clarify how different assays relate to each other, particularly when multiple measurements – such as capsid, genome, and infectious titers – are required. It also supports more informed decision-making based on these data.
While some may see these changes as increasing regulatory requirements, they are likely to improve consistency, transparency, and overall process quality. From our perspective, this is a positive development and aligns with how we approach measurement and data generation.
If you had to choose one major improvement in titer analytics for biologics or gene therapies over the next five years, what would it be – and why?
The key need is closer integration and faster feedback.
One of the main challenges today is long turnaround times. Even for relatively simple titer measurements, results can take one to two weeks in large organizations. This is not due to limitations in the science itself, but rather workflow, logistics, and the way current technologies are used.
What we are trying to do is reduce this friction – freeing up scientists’ time so they can focus on higher-value work, such as developing new therapies and bringing products to market faster.
Looking ahead, the biggest improvement would be to fully integrate analytics into the production process. At the moment, analytics is often treated as a separate step, rather than part of the core workflow.
There has been progress with process analytical technologies (PAT), particularly for measuring parameters like pH, glucose, and other cell culture components. However, there are still gaps – especially in measuring the most critical parameter, which is product yield or titer.
Our approach is to move these measurements closer to the production line, reducing turnaround times from weeks to hours. The next step would be to embed these measurements directly into the process, making analytics a continuous and integrated part of development.
This level of integration could significantly improve efficiency and speed across bioprocessing workflows.