With more sophisticated therapies come larger and more complex data sets. Complexity has increased with advances in personalized medicine, which introduces new data points in apheresis (the extraction and infusion of a patient’s blood, cells, tissue, and/or regenerative medicinal compounds) and during cell and gene therapy processing.

While data integrity is critical to building trust in the supply chain and product quality, as well as meeting FDA compliance obligations, many organizations continue to rely heavily on spreadsheets, manual data entry, paper and email. This creates numerous opportunities for failure and can result in warning letters, fines or FDA recalls. While data “capture” can begin early in biopharmaceutical R&D, a variety of disparate IT systems are often installed without insight into data consistency throughout process development and clinical and commercial manufacturing.

These challenges are compounded by the universal dependence on external partners for significant process development and manufacturing operations.

To mitigate the risks of delayed, incomplete, and inconsistent data, biopharma companies need to establish a robust data management approach early in product development. Especially for startups that may not have a lot of IT experience or staff, this can be daunting.

The following items should be prioritized to better address and mitigate business risks related to data integrity and reliability:

  • Creating a digital data backbone across the product and process lifecycle and between internal and external teams, sites and partners
  • Interdepartmental review of quality and supply agreements with CDMOs [contract development and manufacturing organizations] to ensure data visibility, intellectual property ownership and process oversight

Establish a single digital data backbone in advance

There are new business demands for faster information processing. Early construction of a digital data backbone supports key downstream activities: late-stage process development, scale-up and technology transfer, and manufacturing where quality assurance and compliance requirements come into play.

New digital data systems maintain or establish the context and relative importance of the data collected across the IT infrastructure. By implementing a cloud-based data backbone, data can be collected and organized in a central platform without compromising context. Such a backbone can scale as the product and IT infrastructure mature, and it remains relevant as it integrates with systems such as LIMS [laboratory information management systems], historians, MES [manufacturing execution systems] and EBR [electronic batch records] software to serve as the single verifiable source of truth for critical data used in process control monitoring, analysis and reporting.
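Conceptually, the key property of such a backbone is that every data point is stored together with its source-system context, so a single query can reconstruct a batch history across otherwise disparate systems. The following is a minimal illustrative sketch, not a real platform API; the system labels, field names and classes are all hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass(frozen=True)
class ProcessRecord:
    """One measurement, kept with the context of the system that produced it."""
    source_system: str   # illustrative labels, e.g. "LIMS", "MES", "historian"
    batch_id: str
    parameter: str
    value: float
    unit: str
    recorded_at: datetime

class DataBackbone:
    """Hypothetical central store consolidating records from disparate systems."""
    def __init__(self) -> None:
        self._records: List[ProcessRecord] = []

    def ingest(self, record: ProcessRecord) -> None:
        # Records arrive from any source system; context travels with the data.
        self._records.append(record)

    def batch_history(self, batch_id: str) -> List[ProcessRecord]:
        """Single time-ordered view of a batch across all source systems."""
        return sorted((r for r in self._records if r.batch_id == batch_id),
                      key=lambda r: r.recorded_at)

# Usage: two systems report on the same batch; one query unifies them.
backbone = DataBackbone()
backbone.ingest(ProcessRecord("LIMS", "B-001", "pH", 7.1, "",
                              datetime(2023, 5, 1, 9, 0)))
backbone.ingest(ProcessRecord("historian", "B-001", "temperature", 37.0, "degC",
                              datetime(2023, 5, 1, 8, 0)))
history = backbone.batch_history("B-001")
```

In a real deployment the store would be a managed cloud platform with audit trails and access control, but the principle is the same: context is preserved at ingestion rather than reattached later.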

With growing demand for accelerated technology transfer, FDA filings, and commercialization, building a data backbone up front generates significant time and cost benefits: fewer PPQ [process performance qualification] runs, right-first-time technology transfer, simplified investigations, and earlier batch release.

While a cloud-based data management solution is the first step, companies also need to be vigilant when partnering with manufacturers.

Data visibility into quality and supply agreements

Given the accelerating development of new drugs and therapies, complex manufacturing requirements and the associated capital investments, the growth of outsourcing is expected to continue for the foreseeable future.

Despite outsourcing manufacturing, the drug owner (sponsor) remains accountable for meeting FDA standards for product quality, demonstrating control over the contract manufacturer and the drug manufacturing process, and establishing a process, product and quality data set with high integrity. The near-universal reliance on contract manufacturers and the FDA’s focus on data integrity issues in drug manufacturing have generated unprecedented scrutiny of manufacturing operations by the FDA, strategic purchasers and the SEC. As the supply chain continues to grow in complexity across process development and production, data management is an area that requires new approaches and innovation.

While data integrity challenges can lead to quality and operational issues, they can also create legal risks, such as the loss of manufacturing intellectual property and the inability to demonstrate control over the CDMO, which can affect the business value of the company.

While these challenges affect companies large and small alike, data visibility is a key pain point for small biopharma companies, as most are 100% reliant on CDMOs yet often lack the experience and/or negotiating leverage with increasingly consolidated CDMOs.

Despite FDA mandates to manage their CDMOs and manufacturing processes, drug owners struggle to meet these requirements: they are physically remote from manufacturing sites and often lack IT systems designed for sharing data between owner and contract partners. Failure to comply can result in FDA warning letters. In fact, about 50% of all FDA warning letters in 2019 concerned data integrity issues.

Supply agreements must anticipate data needs and emphasize data visibility and ownership of critical information, including process control parameters.

Fortunately, a growing number of CDMOs are realizing the compliance burden on their pharmaceutical sponsors and that the future of biopharma depends on collaboration and visibility into their manufacturing workflows. With state-of-the-art data management solutions and partnerships with CDMOs, biopharma companies can gain more confidence in the quality of their products and be better prepared to meet stringent compliance requirements.

Cloud-based data management solutions help the industry address business and compliance challenges. These platforms must replace traditional data management methods and workflows for biopharma companies and CDMOs seeking competitive advantages.
