Sharing information about processes through the stages of drug development enhances technology transfer.
A pharmaceutical product can take many years to go from drug discovery to approval. Throughout research and development (R&D), significant amounts of data and process information are collected. This information is generated by many departments, each of which can operate in a silo. While the results of process design are shared, the underlying data often is not, resulting in a loss of knowledge continuity and suboptimal technology transfer.
This and other related issues can be addressed through the intelligent application of advanced analytics software to coordinate data and information sharing among teams and functional areas. This article explores examples of how such an analytics application can support knowledge capture and collaboration, from the lab to the commercial manufacturing site.
Gaps in the flow of knowledge
As a drug moves through preclinical and clinical trials, the development team verifies that the process can scale to pilot and ultimately commercial manufacture, all without changing critical process parameters that might have an impact on the quality of the product. Pilot-scale data is typically stored in a process historian and compared to lab and process analytical technology (PAT) data, which is typically spread across disparate data sources. As a result, it is often difficult to directly compare pilot data to R&D studies. Disparate data sets must be extracted from their respective sources, and the data must then be aligned before performing the analysis. After this data comparison, a retrospective analysis of scalability is carried out.
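The alignment step described above can be sketched in a few lines of pandas. This is a minimal illustration, not the article's actual workflow: the tag names (`temp_pv`, `assay`), timestamps, and the 60-second tolerance are all hypothetical assumptions.

```python
# Sketch: aligning two disparate time-series sources before comparison.
# Column names, values, and the clock-skew tolerance are illustrative.
import pandas as pd

# Pilot-scale data as it might come from a process historian, sampled every 30 s
pilot = pd.DataFrame({
    "timestamp": pd.date_range("2021-01-01", periods=6, freq="30s"),
    "temp_pv": [25.0, 25.4, 25.9, 26.3, 26.1, 25.8],
})

# Lab / PAT data, sampled irregularly
lab = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2021-01-01 00:00:10", "2021-01-01 00:01:05", "2021-01-01 00:02:20",
    ]),
    "assay": [98.1, 97.6, 98.4],
})

# Pair each lab sample with the most recent historian reading,
# tolerating up to 60 s of clock skew between the two systems.
aligned = pd.merge_asof(
    lab.sort_values("timestamp"),
    pilot.sort_values("timestamp"),
    on="timestamp",
    direction="backward",
    tolerance=pd.Timedelta("60s"),
)
print(aligned[["timestamp", "assay", "temp_pv"]])
```

Once the two sources share a common timeline, pilot runs can be compared directly against the corresponding R&D measurements.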
Finally, if the product passes all phases of clinical trials and obtains approval from FDA or another agency, it enters commercial manufacturing, where the recipe and process set points are transferred to internal manufacturing sites or to external subcontractors. Failure modes for process safety and quality are assessed using cause-and-effect analysis, but for the most part, the manufacturing site starts from scratch in collecting data to improve process efficiency. In the event of a process deviation, R&D personnel can be contacted for additional information on a particular failure mode, but commercial-scale manufacturers are largely unable to leverage the data or knowledge of the development process as a whole. By the time a manufacturing gap occurs, R&D scientists may have moved on to other projects, so the process of obtaining materials, setting up analytical methods, and providing additional information is inefficient. There is also a lost opportunity when R&D staff are unable to work on new products because they have to investigate issues with older projects. Despite these difficulties, finding the root cause of a deviation is essential to avoid repeat deviations.
Data connectivity, integrity, and auditing
To support analytics at all stages of drug development and manufacturing, subject matter experts (SMEs) need to be able to easily access data throughout a product's lifecycle so that laboratory, pilot, and commercial manufacturing teams can perform comparable analyses and use the knowledge captured during technology transfer. It is crucial to minimize the time spent cleaning and aligning datasets so that these experts can accelerate time to insight. By connecting these disparate datasets to an analytics application, scientists, developers, and manufacturers can quickly find, explore, quantify, and document the results of their processes to support technology transfer.
However, data connectivity alone is not enough, as data integrity is also of paramount importance to ensure safe and efficient drug production. Information security considerations for authentication and authorization are necessary to ensure that only designated individuals can access the system and interact with the appropriate data.
Tracking changes to calculations created in an analytics application is another key factor for regulatory compliance. Data administrators need to be able to prove that data is being used correctly, especially when making decisions to start a production run or change parameters during manufacturing.
To support these types of efforts, companies must establish a designated good manufacturing practice (GMP) IT environment to use for production decisions. This environment can be a completely separate system with its own connections to data sources. Or, it may exist within the system used for engineering, as long as the system has access control settings to limit user changes to validated data, along with the ability to maintain an audit trail of changes to calculations and other analysis configurations.
In both cases, standard operating procedures and user permissions are leveraged to maintain the integrity of GMP content and to provide SMEs with a separate workspace in which to explore ongoing analyses.
Therefore, advanced analytics applications must provide secure connectivity to live, streamed, and validated data, with traceability and audit controls. Calculations should be transparent and repeatable to allow easy understanding of how the underlying data was processed.
Preparation for commercial manufacture
Advanced analytics applications enhance knowledge capture to support the technology transfer process by making experimental information available to commercial manufacturing personnel and other departments. Knowledge transfer is maximized by connecting to R&D, pilot, and manufacturing-wide data sources to overlay experiments on batches, and by providing tools to capture deep process learnings during scale-up.
In an example of a continuous twin-screw granulation process, an advanced analytics application was used to analyze data from a design of experiments and to create a quality-by-design (QbD) multivariate design space around the process for eventual commercial manufacturing. This model was developed by cleaning the experimental data to align the upstream and downstream process signals over time, then limiting the model inputs to steady-state operation.
These inputs were used in a multivariate regression model to determine the influence on critical quality attributes. The multivariate QbD model was then deployed in commercial manufacturing to provide a monitoring view of the process, reporting deviations from the defined quality specifications as they occurred to allow rapid remediation (see Figure 1).
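The two modeling steps just described, filtering to steady-state operation and fitting a multivariate regression against a critical quality attribute, can be sketched as follows. This is not the article's actual model: the variable names (`screw_speed`, `feed_rate`, `granule_size`), the rolling-window steady-state criterion, and the synthetic data are all assumptions chosen for illustration.

```python
# Sketch: steady-state filtering followed by multivariate regression on a
# critical quality attribute (CQA). All names and thresholds are illustrative.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "screw_speed": rng.normal(300, 5, n),   # rpm
    "feed_rate": rng.normal(10, 0.2, n),    # kg/h
})
# Synthetic CQA driven by both inputs plus measurement noise
df["granule_size"] = (
    0.5 * df["screw_speed"] + 20 * df["feed_rate"] + rng.normal(0, 1, n)
)

# Steady-state filter: keep rows where the rolling standard deviation of
# screw speed stays below a threshold (i.e., the process is not ramping).
rolling_sd = df["screw_speed"].rolling(10).std()
steady = df[rolling_sd < 6].dropna()

# Ordinary least-squares fit: granule_size ~ screw_speed + feed_rate
X = np.column_stack([np.ones(len(steady)),
                     steady["screw_speed"], steady["feed_rate"]])
coef, *_ = np.linalg.lstsq(X, steady["granule_size"].to_numpy(), rcond=None)
print("intercept, b_speed, b_feed =", coef)
```

The fitted coefficients indicate how strongly each process input influences the quality attribute, which is the basis for defining the multivariate design space.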
Continuous process verification with statistical control charts
By proactively monitoring manufacturing processes, pharmaceutical companies can control variation to ensure product quality. Statistical control charts are used to support continuous process verification (CPV), ensuring that processes perform correctly and consistently.
Control limits will change depending on the product recipe being run at the manufacturing site. It is therefore important to identify both the parameter to be monitored and the associated product campaigns so that the mean and standard deviation limits can be calculated for each product, as shown, for example, in Figure 2. After the sigma limits are created, run rules can be applied to look for process excursions and trends that can provide early warning of excursions.
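The per-product limit calculation can be sketched in a few lines. This is a minimal illustration under stated assumptions: the products, the monitored parameter (`fill_weight`), and the data are invented, and only the simplest run rule (a point beyond three sigma) is shown.

```python
# Sketch: per-product mean and +/-3-sigma control limits, plus a simple
# run rule. Product names, tag names, and values are illustrative.
import pandas as pd

data = pd.DataFrame({
    "product": ["A"] * 8 + ["B"] * 8,
    "fill_weight": [10.0, 10.1, 9.9, 10.2, 10.0, 9.8, 10.1, 10.0,
                    20.3, 20.1, 19.9, 20.0, 20.2, 19.8, 20.1, 20.0],
})

# Mean and standard deviation computed separately for each product campaign
stats = data.groupby("product")["fill_weight"].agg(["mean", "std"])
stats["ucl"] = stats["mean"] + 3 * stats["std"]  # upper control limit
stats["lcl"] = stats["mean"] - 3 * stats["std"]  # lower control limit

# Run rule: flag any point beyond its product's 3-sigma limits
joined = data.join(stats, on="product")
joined["excursion"] = (
    (joined["fill_weight"] > joined["ucl"]) |
    (joined["fill_weight"] < joined["lcl"])
)
print(stats)
print("excursions:", int(joined["excursion"].sum()))
```

Because the limits are keyed on the product column, the same logic applies unchanged when a new campaign is run, and additional run rules (e.g., trends of consecutive points) can be layered on the same joined table.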
Once the logic for the CPV control charts and the desired run rules is defined, it can be applied to any time period, or even executed online to track batch changes in near real time.
Continuous process monitoring
To support the pharmaceutical technology transfer process, SMEs must be able to connect to data from many different process, laboratory, maintenance and manufacturing sources to perform analysis. Once this data is collected, engineers and scientists can calculate statistical limits, key process indicators, and aggregations using analysis tools. Trends and important metrics can then be aggregated into static batch reports or live update dashboards for continuous process monitoring. These dashboards can continue to update and include the latest data to monitor the process in near real time.
These dashboards and reports promote knowledge capture and collaboration between organizations by bringing together multiple analytics, often configured by a team of process experts. These reports can be reviewed by operators, engineers and managers to ensure that each batch is progressing within specified limits during the drug development and pilot scale-up processes. Using digital tools that make data visible throughout the product lifecycle, any observed deviation can be analyzed against lab data to look for similar issues and learnings.
Throughout the product lifecycle, substantial knowledge about the process is gathered through R&D experiments and scale-up batches. Advanced analytics applications enable faster and more efficient technology transfer throughout the product lifecycle, reducing the time required for transfers between departments in an organization. During technology transfer, advanced analytics applications can help researchers understand relationships between variables and determine critical process parameters at the lab scale, as well as verify scalability and optimize process performance as it moves into pilot production. In addition, these learnings can be essential for monitoring process variability and quality and for investigating commercial-scale deviations.
Advanced analytics applications thus enable SMEs to document their analyses, capture insights, and draw on colleagues' past insights to accelerate process development timelines and improve efficiency at the commercial scale.
About the Author
Emily Johnston is a senior analytical engineer at Seeq.
Vol. 45, No. 2
When referring to this article, please cite it as E. Johnston, “Optimizing Tech Transfer with Advanced Analytics,” Pharmaceutical Technology 45 (2) 2021.