Innovation in pharma manufacturing as paradigms shift and biotech and personalized therapies move through clinical trials at pace
Over the last 10 years, there has been an important change of focus in pharma, away from small chemical molecules and towards biotech products and personalized therapies. These new treatments are now emerging into manufacturing, with direct implications for those scaling up from lab-based scientific research to human clinical trials and full-scale production, and for the way that quality is monitored, managed and assured. This article sets out practical advice for pharma manufacturers and their supply chain partners as they gear up for efficient, optimized and compliant production at scale in a cost-sensitive, biopharma-first world.
As the emphasis of new drug development gravitates away from chemical synthesis towards new, ground-breaking therapies and vaccines, the requirements of manufacturing sites, equipment and processes need to be appropriate for this new environment. For young biotechs, scaling up production brings green-field challenges, while for more established pharma a shift may be required to ensure that facilities are optimized for new ways of working.
Across all of this is the criticality of compliance with Good Manufacturing and Good Distribution Practice (GMP/GDP) and of a fit-for-purpose quality system. Engineering or redesigning facilities without due consideration of what’s needed, of what may be superfluous, or of how systems and processes will be validated over time, could lead to costly remedial action and delays in getting important and premium new products to market.
Beyond adapting existing sites or engineering new facilities geared to modern drug production, biopharma companies also need to be more cost-conscious than ever before in the way they run and monitor their manufacturing and supply chain operations. That’s because of the relative expense of emerging therapies and the pressure on pricing for continuing product lines.
Gearing up for efficient, optimized and compliant production at scale in a cost-sensitive, biopharma-first world means creating an environment in which quality can (and must) serve as a real-time facilitator of first-class manufacturing, rather than a controller and after-the-fact corrector of production and supply chain delivery.
So, what might that look like?
Historically, in some cases, pharma systems were put in place largely to help companies pass inspections, rather than to foster innovation by adding inherent value as part of the production process. When that happened, systems existed on the periphery of manufacturing, generating after-the-fact documentation rather than being an integral part and enabler of proceedings.
Today, whatever their starting point as they reposition for the biotech era, manufacturers have an opportunity to be more discerning about the measures they put in place to ensure and track quality, and to integrate and embed Quality Management much more within real-time processes, where it can more tangibly add value as a discipline.
This presents a chance to be smarter about the level and scope of detail captured, for instance, and to gear at least some of the tracking, analysis and issue-flagging towards internal efficiency gains, by digitalizing workflow management and applying intelligent process automation.
One of the drivers of the shift in quality in biopharma manufacturing is the evolving regulatory climate which—in an attempt to be more supportive of innovation—is shifting towards risk management and health authority approval based on the given risk profile. This is in contrast to the approach of 20+ years ago, which favored indiscriminate documentation, vetting and validation of every small detail against a positive list of requirements, for the sake of being thorough.
Today, as long as biopharma manufacturers respect and show commitment to the principles of GMP, and demonstrate that they are doing all they can to minimize any risk to the patient (the ultimate goal of pharma quality and safety measures), they can design their facilities and processes to be more fit for purpose, aligned with the respective risk. (Regulators are currently adapting their assessments and inspection criteria accordingly, not least by bringing on board new generations of inspectors who are trained in the new approach and more attuned to the innovation agenda.)
The ‘relaxing’ of inflexible quality-related regulatory requirements presents an opportunity for manufacturers to use systems and measures to their own operational advantage (without compromising safety, of course). When they apply digital solutions, these should not merely replicate old, manual processes, but enable process redesign and process optimization.
Before they can do that, however, companies must refresh and hone their process knowledge, enabling them to challenge existing ways of achieving something and to analyze the risk of doing it differently. That might involve redesigning everyday processes to be more efficient, or questioning which products are made at all, and which (e.g., lower-margin or loss-making assets) might be better produced through a subcontracting arrangement.
This is a chance for quality functions and systems to add new value for the manufacturer. Whereas in the past people working in production and quality might have spent the majority of their time putting out fires, the opportunity now is to improve processes, to prevent issues from arising in the first place, and to share findings and knowledge with key decision-makers, so that problematic processes can be assessed and redesigned to be more efficient.
Organizational considerations
Maximizing the potential isn’t only a matter of digitalization; there are organizational considerations, too. For instance, if quality teams ordinarily work during standard office hours, e.g., 8am-5pm, yet plants operate 24/7 across three shifts, it follows that quality will be ill-equipped to proactively prevent issues or intervene swiftly if an incident occurs when the function is offline. Any decisions pertaining to problems that have developed at, say, 10pm will then be made without the quality team’s real-time oversight.
An optimal solution, without driving up costs (e.g., by recruiting additional quality people), would be to put quality on the shop floor and make the function more integral to production processes, so that the team can monitor activity continuously, take an active real-time role in solutions, and record all decisions as they happen. Documenting all of this in real time saves a protracted quality review after the fact, which in turn increases the likelihood of a timely release.
New biotechs, emerging from the labs into commercial-scale production, have an advantage with all of this, not being influenced or unduly held back by legacy quality processes. As they establish full production facilities, they have a chance to build in quality ‘by design’, harnessing emerging best practice and applying risk management to determine what scaled-up production should look like in the 2020s, and which process weaknesses to avoid to achieve frictionless validation.
Moving straight to highly integrated and digitalized processes, which allow for real-time data to flow to where it’s needed and trigger next actions, can significantly heighten the chance of success—affording clear visibility across production, and creating the potential for automatic alerts, reminders and prompts based on given parameters to keep everything fluid and dynamic.
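By way of illustration only, the short Python sketch below shows one way such parameter-based alerting might be expressed: a process reading is checked against configured acceptance limits, and an alert is generated the moment it drifts out of range. All names and values here (ParameterLimit, check_reading, the bioreactor temperature limits) are hypothetical and do not refer to any particular monitoring product.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Hypothetical sketch of parameter-based alerting: each monitored process
# parameter carries validated acceptance limits, and incoming readings are
# checked in real time so a deviation can trigger a prompt immediately,
# rather than surfacing only in an after-the-fact quality review.

@dataclass
class ParameterLimit:
    name: str    # e.g. "bioreactor_temperature"
    unit: str    # e.g. "degC"
    low: float   # lower acceptance limit
    high: float  # upper acceptance limit

@dataclass
class Alert:
    parameter: str
    value: float
    timestamp: datetime
    message: str

def check_reading(limit: ParameterLimit, value: float) -> Optional[Alert]:
    """Return an Alert if the reading falls outside its validated limits."""
    if value < limit.low or value > limit.high:
        return Alert(
            parameter=limit.name,
            value=value,
            timestamp=datetime.utcnow(),
            message=(
                f"{limit.name} = {value} {limit.unit} is outside "
                f"[{limit.low}, {limit.high}] {limit.unit}"
            ),
        )
    return None

# Example with invented values: an out-of-range reading produces an alert
# that could feed a notification, reminder or workflow system.
temp_limit = ParameterLimit("bioreactor_temperature", "degC", 36.5, 37.5)
alert = check_reading(temp_limit, 38.1)
if alert:
    print(alert.message)
```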
From building management systems and equipment control software, to process monitoring data, to sampling and certificate of analysis (CoA) systems, companies now have a plethora of tools available to them to perform optimally and to capture all the data they need to satisfy regulators and auditors.
As they strive to become more proactive in quality management, to support ongoing innovation, companies can also link current systems to historical databases to discern any patterns in deviations/incidents and take positive, preventative action to ensure the same issues won’t recur.
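As a minimal, hypothetical sketch of that idea, the Python snippet below counts how often each root cause appears in a set of historical deviation records and flags those that recur. The record fields and threshold are invented for illustration; a real implementation would sit on top of the company's own quality database.

```python
from collections import Counter

# Hypothetical sketch: mine historical deviation records for recurring
# root causes so that preventative action can target the most frequent ones.

def recurring_deviations(records, min_count=3):
    """Return (root_cause, occurrences) pairs seen at least min_count times."""
    counts = Counter(r["root_cause"] for r in records)
    return [(cause, n) for cause, n in counts.most_common() if n >= min_count]

# Invented historical deviation records, for illustration only.
history = [
    {"id": "DEV-001", "root_cause": "operator data-entry error"},
    {"id": "DEV-002", "root_cause": "sensor calibration drift"},
    {"id": "DEV-003", "root_cause": "operator data-entry error"},
    {"id": "DEV-004", "root_cause": "operator data-entry error"},
]
for cause, n in recurring_deviations(history, min_count=3):
    print(f"Recurring deviation pattern: {cause} ({n} occurrences)")
```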
The more that functions, systems, and processes can be interlinked, the greater the scope for improvement, too. Connecting historical incident data to risk management systems for process optimization, and to ERP systems for materials management, offers production and quality teams optimal scope to stay ahead, while also reducing the number of review cycles (on the basis that if a computerized system has been validated, it can be trusted).
With so much great innovation happening at a biopharma product level, and so much price and cost sensitivity in today’s markets, biopharma manufacturers can’t afford to be held back by excessive red tape, unwieldy processes, or siloed teamwork. Fortunately, it doesn’t have to be like that—as long as good practice starts with the right advice.
Dr. Eduard Cayón is an experienced pharmaceutical industry consultant at bespoke technology and manufacturing supply chain compliance consultancy, TDV. He is also the founding partner, VP, and a director of pharma auditing organization Asociación Fórum Auditorías (AFA) in Barcelona. The two organizations recently merged with Rephine Ltd, to create the leading global force in GMP audit services.