Showing posts with label document quality. Show all posts

03 February 2012

Need a New Mental Model for Regulatory Documents

Wow… I have been away from this blog a whole lot longer than intended. Those competing interests, as you all understand, are the bane of my existence.

For the past couple of months I have been looking closely at how people think about the “vehicles” used to communicate with regulatory health agencies. I use the word vehicle because I am trying to divorce myself from the notion of document, in particular, the notion of “a document.” In this era of on-screen reading and linked files, what is a document anyhow? To me it is the entire corpus somebody may be able to access, not just one slice of that body.

In my consulting/training interactions at McCulley/Cuppan, I find that the majority of people I work with in the client setting operate within the mindset of individual documents (some have even smaller boundaries and operate by document sections) that stand alone with well-defined boundaries (pages and page counts).

I want to argue that the vehicle of communication for regulatory submissions is not a document. It is the full and complete dossier submitted by the sponsor. Documents are just placeholders where I go to get a piece or pieces of information that help answer my questions. I want to argue that the regulatory reader does not see a dossier as a set of documents. Rather, they see a dossier as a corpus of information that they will use to answer questions and make decisions. The contents are just vehicles they peruse to get what they want.

Applying my working model means you stop seeing documents as “stand alone” and stop saying “this document has to tell a story.” I’d also like you to stop using the word document. That word has baggage I am trying to jettison. Instead I want people to view their work at least as “modules” and preferably as vehicles that help a user to answer very specific questions. Bottom line, a research report is just a part of the constellation that tells the stories. Note the plural form as we have many stories to tell in a dossier, not just one.

Applying my working model means you stop seeing your work as being like a novella—something to be read from page 1 to page n. Applying my working model means you see your body of work as something that is read in a coordinated manner, governed by narrowly defined rules of inclusion and exclusion. Applying my model means you stop seeing pages and sections and you start seeing concepts and topics.

My argument is that the selective professional reader at regulatory health agencies cares little about documents, sections, pages, and data tables. I am suggesting such readers care solely about making informed decisions and about where in the submission dossier they can find vehicles that answer their concept and topic questions.

09 January 2011

Minimal Time and Effort Should be Applied to the Creation of the Clinical Study Report Synopsis



How much time and effort to apply to the creation of the clinical study report synopsis?


This is another question I am asked on a regular basis and a line of discussion that repeatedly comes up when I am working with clients to help streamline work practices. I usually draw slack jaws accompanied by an incredulous stare as I give my answer: 


"The amount of time should be minimal, involve no more than three people, the level of effort better be next to nothing, and the time should be no more than an hour to create and a whole lot less time to review."


My reasoning is simple and straightforward: "You apply time and effort to the development of a product in relationship to the product's strategic value. The value of the CSR Synopsis to the regulatory reader is virtually zero."


Think about it. The CSR Synopsis affords little utility to what the reviewers are looking to accomplish when they choose to enter the framework (that is, the document) of an individual clinical study. If a regulatory reviewer wants a “snapshot” of a study, they will likely take a contextualized snapshot at a higher level of a drug submission dossier. That is, the documents in Module 2. They do not enter the framework of the study report to get generalized or summarized information. They are at the Module 5 level and embedding themselves in a clinical study report because they are seeking answers to narrowly defined questions.


At McCulley/Cuppan we have queried regulatory reviewers about how they "use" a study report synopsis. Their responses support the premise I have laid out for you in the above paragraph.


So this gets us back to the question on time and effort. Why generate a study report synopsis with every draft? Why allow the full team to look at the document? Talk about wasting time and energy. 


In our assessment of review practices at pharmaceutical and medical device companies we see the same review pattern played out time and time again. The study report synopsis is generated with the first draft, and in the review process it consistently garners the attention of the full review team (many of whom never make it all the way through the results sections during the course of their reviews). The same thing happens on each subsequent draft.


Given that the synopsis is but a summary of the body of work presented in the study report, it should not be generated until that body of work is completed and signed off as "good to go." The synopsis does not even warrant review; it warrants critique. A critique is a comparative read, a reading to ensure that the synopsis accurately and appropriately portrays the sum total of key details of the study. The critique process requires at best two people and certainly no more than three, all subject matter experts drawn from the key clinical disciplines represented in the research study.

16 September 2010

Proving Document Quality

You say you produce high-quality document products? Great, now prove it! Although many can recognize quality when they read it, the prospect of measuring documentation quality seems dubious to many.

I work with a great many clients whose authors say they generate high-quality documents, but they can only show that their documents are accurate. That is, accurate in terms of template, source data, and grammar conformance. Beyond these three inspection parameters, many regard documentation quality as inherently immeasurable. I argue that these three parameters by themselves are wholly insufficient measures of document quality and that important aspects of how documents convey meaning can be measured.

Surely they must see other metrics of value, especially when considering factors like the number of regulatory questions elicited by errors of omission and commission in their very accurate submission documents or the number of protocol amendments generated over the life of their very accurate research protocol.

Some see additional metrics as meaningful, but not for their type of documentation. In the realm of clinical development documentation, many consider their output unique and beyond measurement. I hear more than a few medical writers say, “We know our own field intimately, and everywhere we look we see shades of gray and unique situations, so measuring is not useful,” while others in clinical development say, “We are unique, and your notions of quality do not apply to us.”

Still others find the prospect of measuring output impractical or suggest that even if they had data, “Against what standards could we make comparisons?”

Even easily measured attributes like accuracy are not effectively tracked. Documents are routinely verified/corrected by quality control operators, yet the output is rarely tabulated and statistically analyzed to provide a measure of writer performance. The intention is merely to ensure the document is archived as accurate in terms of template, source data, and grammar conformance.

I do not know of any pharmaceutical or medical device company that makes use of any statistical quality control techniques to track their document quality. It would be easy to apply similar principles as used in the auto industry or pharmaceutical manufacturing. The auto manufacturers turn out millions of units every year, and each company tracks manufacturing defects, but not by examining every car. Instead, they pull a few cars off the assembly line and tear them apart. Pharmaceutical manufacturers test an appropriate sample from each manufacturing lot. In both cases the products are measured against a set of well-defined acceptance criteria.

Such acceptance criteria exist for technical and scientific documents (in our work at McCulley/Cuppan we have developed a set of criteria that we use during our training and consulting work for clients), so such quality control methods could easily be deployed.
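To make the sampling analogy concrete, here is a minimal sketch in Python of how an acceptance check might work: pull a small sample of documents from a batch, count defects found against a defined checklist, and compare the observed defect rate to an acceptance threshold. The function name, the sample numbers, and the 0.5-defects-per-page threshold are all invented for illustration; real acceptance criteria would come from a program like the one described above.

```python
def acceptance_check(defect_counts, pages_reviewed, max_defects_per_page=0.5):
    """Compare a sampled defect rate against an acceptance criterion.

    defect_counts: defects found in each sampled document
    pages_reviewed: pages inspected in each sampled document
    max_defects_per_page: hypothetical acceptance threshold
    """
    total_defects = sum(defect_counts)
    total_pages = sum(pages_reviewed)
    rate = total_defects / total_pages
    return rate, rate <= max_defects_per_page

# Hypothetical sample: three documents pulled from a larger batch,
# each inspected against the same defect checklist.
rate, accepted = acceptance_check(defect_counts=[4, 7, 2],
                                  pages_reviewed=[20, 25, 15])
print(f"defect rate: {rate:.2f} per page; batch accepted: {accepted}")
```

Tracked over time, the same per-batch rates could feed a simple control chart, which is exactly the kind of statistical tracking the auto and pharmaceutical examples above rely on.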


Originally published on our Knowledge Management blog

16 February 2010

Improved Document Quality Does Pay

Quality control and improvement in document content and design can provide a significant financial benefit to pharmaceutical and medical device organizations. It is our observation that companies must learn that writing is a process open to continuous improvement.

However, I find in my consulting work that many people working in the life sciences question the merits of attempts to improve the communication quality of their technical and scientific documents. Quite a few of these people subscribe to the notion that as long as they get the study design right or have the correct data, that is good enough. This may be the case for a small subset of documents, but the assumption clearly does not apply to the vast majority of documents produced in the life sciences. It is our observation that an effort to improve document quality generally does pay off for the individual and the organization, though these pay-offs may not be easily seen unless there is an attempt to capture data regarding the impact of better document quality.

Here are a few examples where organizations instituted quality control programs and measured change in selected outcomes in order to answer questions about their document products and processes. These organizations found that document quality by design does pay.

The Motorola Corporation substantially improved its operations after instituting a document quality program. The company made changes to its R&D and Finance operation guides, focusing on clearer directions and rationales and an easy-to-use format. These changes helped streamline processes and reduced delays from review and rework. The company calculated savings at US$20,000,000 per year. Source – Business Week, Oct. 25, 1992

The United Kingdom Department of Health and Social Security spent the equivalent of US$50,000 to develop and test a series of new forms. It has reported annual savings of over US$2.9 million in reduced staff time from the forms being properly completed the first time. Source – Plain Language Principles and Practice by Erwin R. Steinberg

The United Kingdom Department of Customs and Excise cut a 55% error rate to 3% by revising the lost-baggage claim forms used by airline passengers. Source – American Institutes for Research

An aerospace contractor lost 15 consecutive contract proposals over a three-year period before a quality proposal writing group was called in to review the quality of the proposals and to help the proposal development teams institute new work practices and improve writing skills for developing large proposals. After instituting changes in documentation work practices and processes, the contractor won 11 of its next 12 proposals. Source – Shipley Associates, private communication


Originally published on our Knowledge Management blog