High-tech digital applications tend to run in flavor-of-the-month fads, and few seem to have been so relentlessly hyped in the past year as cloud computing. To some, it’s simply a relabeling of what’s been going on for years in computing technology: outsourced or hosted data storage and applications, making use of fast data connections to provide performance comparable to having the software and data down the hall or under your desk. Nevertheless, there is a growing momentum toward tailored cloud services and solutions, and along the way the movement is picking up energy and capabilities from the wildly popular smartphones and touch tablets, all of which falls under the “mobile computing” umbrella.
That’s one of the reasons that Hewlett-Packard, the leading PC vendor, made the startling decision to dump its PC business last month. If everyone is migrating to ever more-powerful tablets and smartphones, who needs a desktop or bulky laptop? (That H-P has also withdrawn its tablet offering, for the time being, points to the strategic bind that the company is in, and the criticism its management is receiving from investors.)
That tailoring of applications is now going on for life sciences companies, and IT departments throughout the industry are trying to keep up with the new service offerings, while trying at the same time to keep critical proprietary data from floating out of secure data systems in someone’s smartphone.
“The pharma industry, like others, is recognizing that its core competency is not designing and operating data centers,” says Alan Louie, a life sciences analyst at IDC Health Insights (Framingham, MA). “Cloud computing concepts have been used for quite a while in sharing clinical data, and as more IT vendors migrate to software-as-a-service solutions, it is changing the entire IT ecosystem of the industry.”
Cost savings and speed to deployment are two of the main advantages of cloud computing. At Model N (Redwood Shores, CA), a vendor of revenue-management software and services for pharma and high-tech companies, Gopkiran Rao, product manager, says that first-year costs of an implementation can be cut by 30-40%, and deployment time drops from months to weeks. “We are transitioning our legacy customers from on-premises solutions to hosted solutions in the cloud, and eventually all our clients will be handled this way,” he says.
Why cloud, why now?
To understand where IT architectures are going, it’s necessary to agree on a broad set of concepts that the industry itself is not exactly united around. Anyone who jumps to a hosted e-mail service like Google Mail or Yahoo Mail is using the cloud, in this case the “public cloud” that is open to essentially anyone with an Internet connection and an e-mail identity—and this has been going on for years. The companies that manage massive Internet data centers (including all of the major IT services companies, as well as a growing number of dedicated “managed services” companies) offer various types of “private clouds” that enable companies to rent either data storage space, or CPU cycles, on a pay-as-you-go basis. Amazon, the leading e-commerce vendor, began offering a variety of cloud services on this basis over the past couple of years, and now Amazon Web Services (AWS) is a leading cloud vendor. (Its entry point for a customer is free—a pretty compelling offer.)
Another way of ordering the cloud environment is by the level of service being rented. Basic data storage and CPU cycle time is “infrastructure as a service” (IaaS); IaaS combined with supervisory IT management is “platform as a service” (PaaS); PaaS with specific applications (which the client might own, or might be renting as well) is software as a service (SaaS); and (in some circles) the concept of delivering the results of running a program is “data as a service” (DaaS).
“The value, and potential IT service enhancements, increase as you move from infrastructure to applications,” says Jennifer Goldsmith, product manager at Veeva, a customer-relationship management firm that began as a pure SaaS vendor four years ago, and announced “Veeva Vault” cloud services this spring (Pharmaceutical Commerce, May/June, p. 21).
In clinical data management (either for pure research or for conducting clinical trials), the necessity of accessing massive databases of, for instance, genomic data has driven pharma companies and academic institutions to adopt various types of cloud services for many years. “Data like genomics from publicly funded research projects have a relatively low value per byte, and many organizations are comfortable with these data residing in public clouds,” says Louie. “But clinical research results or ongoing internal studies are the most valuable data pharma companies have, and those are going to reside in private clouds with considerable security around them.”
Getting to collaboration
Whether an individual company buys and manages its own data center, or employs a private cloud from a managed services company, is more or less a business decision left up to an individual IT department. But the game gets considerably more complicated when the goal is to share data among different organizations, such as for joint research projects, or for integrated supply networks involving contract manufacturing organizations (CMOs), or even between trading partners (such as a manufacturer and wholesaler).
For many years, these collaborations involved some type of private network—and with the advent of broad Internet services, a virtual private network (VPN). VPNs usually involve setting up access to diverse data centers through corporate firewalls. Each VPN can become a project of its own, and a large organization can be compelled to set up, and manage, hundreds of VPNs. That’s where IT departments begin to choke, and where cloud vendors are offering to speed delivery of services.
“We set up an early adopter program when we began Veeva Vault,” says Goldsmith, “and we’ve seen the first implementations take only 12-18 weeks; now that is coming down to 6-12 weeks.” Veeva Vault is planned to be a comprehensive set of services built in “modules” for specific business applications: promotional material and other regulated content; sales and marketing data; human resources; clinical and others. “Initially, we’re going after unstructured, regulated content, such as the promotional materials that require FDA review, and whose content must be kept under version control,” she says.
The other type of content, “structured,” would be data such as production data from a manufacturing run—essentially, data that get put into fields on a screen, and where preserving the actual data, and time-stamping it, is very important. That’s the goal of companies like TraceLink (Woburn, MA), whose principals were involved in early adoption of pedigree standards for drug distribution, and are now setting up cloud-based networks for manufacturer-CMO partnerships. Catalent Pharma Solutions, Sharp Corp. and Patheon—all providers of CMO services—are early participants.
“Subscription costs, starting at $250/month/process, are directly proportional to the number of trading partners you have and the business processes you want to collaborate on with each one,” says Brian Daleiden, product manager at TraceLink. “Because of our ‘integrate-once, interoperate-with-everyone’ transaction platform, integration costs are minimal. Once you integrate to the TraceLink Network using our adapters for ERP and related enterprise systems, you are immediately connected to all of your trading partners on the Network. We don’t charge transaction fees, per-user fees or storage fees.”
On paper, all this sounds great—using an Internet browser to retrieve detailed, time-sensitive information in a snap. But the big hurdle—and the problem that is keeping many pharma companies from moving broadly into cloud services—is securing the data, and controlling access to it. Enter the world of digital signatures and (hold your breath!) “security as a service.”
Without really, really good security, data can either be exposed to the world (a recurring problem with HIPAA-restricted patient data in healthcare systems), or hacked and corrupted. Even more to the point, unless access to data is rigorously policed, and changes to documents or datasets recorded, a digital file could be changed by even a well-intentioned reviewer, and the audit trail lost.
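One common structural defense against the lost-audit-trail problem is to chain each log entry to the one before it with a cryptographic hash, so that any after-the-fact edit to a record breaks every entry that follows it. The sketch below is a hypothetical illustration of the idea, not any vendor’s actual implementation:

```python
import hashlib
import json

def append_record(trail, user, action, data):
    """Append a record whose hash incorporates the previous record's
    hash, making silent after-the-fact edits detectable."""
    prev_hash = trail[-1]["hash"] if trail else "0" * 64
    record = {"user": user, "action": action, "data": data, "prev": prev_hash}
    payload = json.dumps(record, sort_keys=True).encode()
    record["hash"] = hashlib.sha256(payload).hexdigest()
    trail.append(record)
    return trail

def verify_trail(trail):
    """Recompute every hash in order; any modified entry, or any break
    in the chain, causes verification to fail."""
    prev_hash = "0" * 64
    for record in trail:
        body = {k: v for k, v in record.items() if k != "hash"}
        if body["prev"] != prev_hash:
            return False
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != record["hash"]:
            return False
        prev_hash = record["hash"]
    return True
```

A reviewer who edits an earlier record without re-signing the whole chain would be caught the next time the trail is verified, which is the property regulators are after when they ask for audit trails.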
Many IT data-integrity requirements exist: besides HIPAA, the Sarbanes-Oxley law requires signatures and audit trails for accounting data; and FDA requires compliance with 21 CFR Part 11, ensuring that data like production batch records are preserved, and access to them is controlled. It is up to individual pharma companies to define to FDA (in the case of 21 CFR Part 11) how this data integrity is maintained.
For its part, Veeva has gone ahead and, working with IT services company QPharma, validated Veeva Vault for 21 CFR Part 11 compliance. “Like any good manufacturing practices (GMP) project, this goes through a series of steps—design qualification (DQ); installation qualification (IQ); operational qualification (OQ) and performance qualification (PQ),” says Goldsmith. “We can provide documentation to clients to cover the first three; only the PQ step, which is specific to that client’s implementation, needs to be conducted by the client themselves.”
21 CFR Part 11 comes out of the GMP manufacturing world, and many cloud or SaaS vendors have never heard of it. But the equivalent level of data integrity—secured access, and identity control—is offered by a variety of digital security vendors or consortia.
One of the more successful of these (and one of the more relevant to pharma) is SAFE-Biopharma Assn (Fort Lee, NJ; www.safe-biopharma.org), a non-profit set up in 2005 by a consortium of pharma companies along with FDA and some trade associations. The original goal was to share research data securely, but as the cloud has evolved, and as SaaS applications have broadened, the organization is essentially open to any pharma company, or any IT vendor, that wants to undergo its vetting process.
“We’re beginning to see some interest from commercial IT vendors,” says Mollie Shields-Uehling, CEO of the organization. “One of the attractions is compliance not only with FDA and other federal institutions, but also with the European Medicines Agency (EMA) and others, making it useful worldwide.”
SAFE-Biopharma has a proprietary Registration Authority System that is fairly expedient for pharma companies to adopt, but has rigorous requirements for IT vendors, who must undergo their own certification and audit process. SAFE-Biopharma is not only 21 CFR Part 11 compliant; it also meets the requirements of the Federal Bridge Certification Authority (FBCA)—the federal government’s own high-security signature system.
In practice, organizations like SAFE-Biopharma manage identity security, not data or document storage itself. Once identity credentials are accepted at the “identity trust hub,” access to a particular data center is granted. Other vendors (and other quasi-governmental bodies) have similar systems in place.
One private company looking to build a business in life sciences is EchoSign (Palo Alto, CA), recently acquired by Adobe Systems. EchoSign’s system allows business partners to share a document, with each signing based on the signature authority maintained by EchoSign, and then stores the document. EchoSign’s system doesn’t require the public-key infrastructure (PKI) technology that SAFE-Biopharma employs, but is compliant with the federal E-Signature Act of 2000, which allows for electronic (as opposed to “digital”) signatures to be legally binding in commercial transactions.
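The distinction between an electronic signature and a PKI-style digital signature can be made concrete with a toy sketch: a digital signature signs a cryptographic hash of the document with a private key, and anyone holding the matching public key can verify both the signer and that the document is unaltered. The RSA parameters below are textbook-tiny, for illustration only; real PKI uses large keys and padding schemes:

```python
import hashlib

# Textbook-tiny RSA parameters, for illustration only.
P, Q = 61, 53
N = P * Q    # modulus (3233)
E = 17       # public exponent
D = 2753     # private exponent (E * D mod ((P-1)*(Q-1)) == 1)

def digest(document: bytes) -> int:
    """Reduce the document to a fixed-size fingerprint (mod N here,
    because our toy modulus is so small)."""
    return int.from_bytes(hashlib.sha256(document).digest(), "big") % N

def sign(document: bytes) -> int:
    """Only the holder of the private exponent can produce this value."""
    return pow(digest(document), D, N)

def verify(document: bytes, signature: int) -> bool:
    """Anyone with the public exponent can check the signature; it fails
    if the document was altered after signing."""
    return pow(signature, E, N) == digest(document)
```

A plain electronic signature, by contrast, is just a recorded act of intent (a typed name, a click) backed by the service’s own records, which is why the E-Signature Act route demands less cryptographic machinery from participants.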
Nate McBride, CIO at AMAG Pharmaceuticals (Lexington, MA), has adopted EchoSign for contracts being developed with IT and other vendors, and is gradually rolling it out to other operations in the company. “There are clearly savings involved, most of which can be tied to mean time to execute (MTE) an agreement and average contracts signed per time period (e.g., month or quarter). Both of these vary, however, depending on how vigilant the staff is, how overburdened your resources already are and other intangibles like that. At AMAG, we have seen a significant decrease in MTE, and this gives IT an advantage: we can move much more quickly from selecting a vendor to performing work with that vendor.”
McBride adds that 21 CFR Part 11 compliance is as much in the hands of the pharma company as it is in the IT vendor that company might use. “We use a Quality Management System here made by a company called MasterControl. Because we would not allow the MasterControl system to authenticate back to our Active Directory, the login is essentially a standalone authentication system.
“To mitigate this, it is required that employees provide two signatures with two unique passwords to verify that they are who they say they are. This passes muster and is thereby ‘21 CFR 11’ compliant according to us,” he says. “21 CFR 11 comes down to the company’s interpretation of the predicate rule of what is and what is not governable electronic data. I’ve been in three different biopharma companies that have translated it three different ways. As long as the SOPs back it up, you can translate it almost any way you want. E-sigs and data integrity follow the same rules but everything else is fair game.”
Given how ill-defined the rules of the road are for cloud computing, and given how rapidly the field is evolving, it’s hard to say where it all will wind up. In the opinion of TraceLink’s Daleiden, the future will be “a small set of cloud vendors for life sciences, each one specializing in certain domains but with cross-linkage between one another. The cross-linkage will be particularly crucial as companies outsource significant chunks of their production capacity. For example, for closed-loop S&OP, unifying real-time sales/marketing data with supply chain supply/demand data and production capacity/schedules in an outsourced production world will be necessary, not a nice-to-have.”
He also points to a possible future of “MERP—multi-enterprise resource planning” as a next stage beyond today’s ERP systems, but there’s no clear path toward that goal. For now, the main objective seems to be to get a collar on rising IT costs, while ensuring the preservation of data integrity. PC