
Ozan Okutan and the CSA Documentation Endurance Test


Ozan Okutan's CSA writing marathon shows why ISO 13485 alignment, FDA expectations, and clear documentation matter in QA.

Tags: LinkedIn content, viral posts, content strategy, regulatory compliance, computer software assurance, computer system validation, ISO 13485, FDA QMSR, quality assurance

Grow your LinkedIn to the next level.

Use ViralBrain to analyze top creators and create posts that perform.

Try ViralBrain free

Ozan Okutan, a Senior Quality Engineer, recently posted something that made me stop scrolling: "When I wrote in my post 'CSA meets ISO 13485:2016', 'If you're already feeling queasy at the thought of the presentation, just wait until you read the full text - it's a real endurance test,' I wasn't talking about the presentation itself."

That line is funny because it is true. Anyone who has lived through a serious validation program, a quality system audit, or a remediation effort knows the feeling: the hardest part is rarely the slide deck. The hard part is the "full text" behind it.

Ozan goes on to admit that the "full text" was "this very piece," and that it is only "one installment of a much longer odyssey." He even tallies the scale of the effort: more than 200 pages devoted to the mysteries of CSA, and potentially more than 300 pages once he stacks on Bob McDowall’s commentary. Then the perfect closing warning: "So don't say you weren't warned!"

I want to expand on what Ozan is really pointing at: Computer Software Assurance (CSA) is not just a new label for CSV, and it is not something you can understand from a one hour overview. If you want CSA to survive first contact with ISO 13485 expectations and FDA scrutiny, you need a lot of careful, written thinking. The endurance test is the point.

Why the "endurance test" exists at all

CSA is often introduced with a promise: less burden, more focus on what matters, faster delivery. That promise is attractive, especially to teams exhausted by documentation for documentation’s sake.

But Ozan’s post highlights the other side of that promise: when guidance shifts the philosophy from "document everything" to "assure what matters," you do not magically get less work. You trade rote activity for judgment. And judgment has to be made explicit if you want it to be repeatable, auditable, and defensible.

That is why the full text grows. To operationalize CSA, you have to define:

  • What is "risk" for your product and your patients?
  • What evidence is "enough" for different kinds of software features?
  • What do you test, what do you review, what do you monitor in production?
  • How do you prevent CSA from becoming an excuse to skip engineering discipline?

The irony of CSA: you can reduce low value artifacts, but you cannot reduce thinking. If anything, you have to write the thinking down more clearly than before.

CSA meets ISO 13485: what gets complicated

When Ozan references "CSA meets ISO 13485:2016," he is naming a real friction point. ISO 13485 is a quality management system standard built around consistent processes, documented information, and objective evidence. It does not demand waste, but it does demand control.

CSA, at its best, encourages right-sized assurance. That can align beautifully with ISO 13485, but only if you build the bridge carefully. Here are the places where teams often stumble.

1) "Less documentation" vs "documented information"

ISO 13485 expects you to define and maintain processes and records. CSA encourages you to avoid unnecessary paperwork. Those statements are compatible, but not identical. The practical solution is not "no documents." It is "fewer, sharper documents" that demonstrate control and rationale.

Examples of CSA-friendly, ISO-friendly documentation include:

  • A risk-based categorization method for software features
  • A test strategy that links assurance activities to risk
  • Traceable evidence of testing and review for critical functions
  • A justification for relying on supplier evidence or automated testing

2) Objective evidence still matters

Even if you stop producing giant validation summary reports, auditors will still ask: "Show me how you knew it worked." That is objective evidence.

CSA changes the shape of evidence (more automated logs, targeted testing, focused reviews), but it does not eliminate evidence. A good CSA program is often more evidence-rich, just less narrative-heavy.

3) Terminology is not harmonization

Many organizations try to "adopt CSA" by renaming CSV templates. That rarely survives an audit or an internal quality review. CSA is a set of principles that should change your decision-making and your evidence strategy. If all you did was swap acronyms, you did not reduce risk; you only reduced clarity.

If your CSA rollout is mostly vocabulary, you will end up writing even more later, during remediation.

CSA vs CSV: the real difference is the unit of value

Ozan’s pile of pages is a reminder that the community is still working out how to express CSA in practical terms. One of the simplest ways to see the shift is to ask what you optimize for.

  • Traditional CSV implementations often optimize for "audit survival" through completeness of documents.
  • Mature CSA implementations optimize for "confidence" through the right evidence, at the right depth, focused on patient safety and product quality.

That difference sounds small until you try to implement it. Then you hit the questions that create the endurance test:

  • Which requirements are truly critical?
  • Which tests give you the best signal?
  • When is exploratory testing appropriate, and how do you capture it?
  • When do you rely on vendor testing, and what verification do you still perform?
  • How do you maintain validated state when software changes weekly?

Those are not template questions. They are system questions.

Practical ways to survive (and benefit from) the documentation marathon

If CSA guidance feels like a long odyssey, the answer is not to avoid the journey. It is to structure it so the work compounds into reusable capability. Here are a few approaches I have seen work.

Build a "CSA to QMS" map

Treat CSA as an operating model that must plug into your existing QMS. Create a mapping that links:

  • Your software lifecycle (SDLC)
  • Risk management activities
  • Change control
  • Supplier management
  • Training and competency
  • CAPA and deviation handling

This makes CSA less of a special project and more of a system you can audit and improve.
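To make the map auditable rather than aspirational, it helps to treat it as a structure you can check for gaps. Here is a minimal sketch in Python; the touchpoint descriptions are my own illustration, not from any standard, and the process names simply mirror the list above:

```python
# Map each QMS process to the CSA touchpoint it must absorb.
# Descriptions are illustrative; your QMS defines the real ones.
CSA_QMS_MAP = {
    "software lifecycle (SDLC)": "where risk-based assurance activities are planned",
    "risk management": "source of the feature risk tiers that size the evidence",
    "change control": "gate deciding how much re-assurance a change needs",
    "supplier management": "basis for relying on vendor testing, with justification",
    "training and competency": "who is qualified to make assurance judgments",
    "CAPA and deviation handling": "where assurance failures feed back into the system",
}

def unmapped_processes(qms_processes: set[str]) -> set[str]:
    """Flag QMS processes that have no defined CSA touchpoint yet."""
    return qms_processes - set(CSA_QMS_MAP)
```

The value of the exercise is the gap check: any process in your QMS that falls out of `unmapped_processes` is a place where CSA has not yet been wired into the system.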

Define evidence packages by risk, not by document type

Instead of debating whether you "need" a validation plan, define evidence packages. For example:

  • High-risk features: requirements, design notes, targeted testing, independent review, release decision record
  • Medium-risk features: streamlined requirements, automated tests, peer review, release notes
  • Low-risk features: lightweight checks, monitoring, defect trending

Auditors care less about the label and more about whether your evidence matches the risk.
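The packages above can be owned as explicit data instead of prose buried in a template. A minimal Python sketch, assuming hypothetical tier names and artifact lists (your risk procedure defines the real ones):

```python
# Hypothetical risk tiers mapped to the evidence each tier requires.
# Tier names and artifact lists are illustrative only.
EVIDENCE_PACKAGES = {
    "high": [
        "requirements",
        "design notes",
        "targeted testing",
        "independent review",
        "release decision record",
    ],
    "medium": [
        "streamlined requirements",
        "automated tests",
        "peer review",
        "release notes",
    ],
    "low": ["lightweight checks", "monitoring", "defect trending"],
}

def required_evidence(risk_tier: str) -> list[str]:
    """Return the evidence package a feature's risk tier demands."""
    if risk_tier not in EVIDENCE_PACKAGES:
        # An unclassified feature is a process gap, not a default-to-low.
        raise ValueError(f"Unknown risk tier: {risk_tier!r}")
    return EVIDENCE_PACKAGES[risk_tier]
```

Once the package is explicit data, a reviewer can check mechanically that a feature's evidence matches its tier, which is exactly the question an auditor will ask.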

Make automation your evidence engine

CSA pairs naturally with automated testing and continuous integration, but only if you treat the outputs as controlled records. That means:

  • Ensuring test scripts are version controlled
  • Ensuring test execution results are attributable and retrievable
  • Ensuring failures trigger investigation when appropriate
  • Ensuring changes to test code are reviewed

Automation is not only a speed tool. It is also a traceability tool.
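One way to make automated runs behave like controlled records is to attach attribution and versioning metadata to every execution result. The sketch below is an assumption-laden illustration: the field names and the JSON Lines file format are mine, not from any guidance, and `script_version` stands in for the VCS commit hash your pipeline would supply:

```python
import datetime
import json
import os

def record_test_run(suite: str, passed: bool, script_version: str) -> dict:
    """Build an attributable, retrievable record for one test execution."""
    record = {
        "suite": suite,
        "result": "pass" if passed else "fail",
        # script_version would normally be the commit hash of the test code,
        # so the executed scripts are traceable to a version-controlled state.
        "script_version": script_version,
        # Attribution: who, or which CI identity, executed the run.
        "executed_by": os.environ.get("USER", "ci-runner"),
        "executed_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        # A failed run must be visible to the investigation process.
        "investigation_required": not passed,
    }
    # Append-only JSON Lines file keeps each record retrievable later.
    with open("test_evidence.jsonl", "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
    return record
```

The design choice worth copying is not the format but the invariant: every execution result carries who, when, against which version, and whether a failure must be investigated.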

Write the rationale once, then reuse it

The best payoff from Ozan-level deep work is reuse. Once you write a solid rationale for your assurance approach, you can reference it across projects, products, and audits. The organization stops reinventing the argument every time a new system launches.

Why long-form commentary helps the whole industry

Ozan mentions stacking up hundreds of pages and brings in Bob McDowall’s contributions as well. That matters because CSA is still being interpreted. The gap between "guidance" and "audit-ready practice" is where organizations get hurt.

Deep commentary does three things:

  1. It exposes ambiguous areas early, before you bet your compliance strategy on assumptions.
  2. It forces precise language, which is essential when you are aligning CSA concepts with ISO 13485 and FDA expectations.
  3. It creates shared patterns, so teams do not have to learn each lesson the hard way.

The endurance test is not bureaucracy. It is the cost of turning principles into repeatable, compliant practice.

A final thought, taking Ozan’s warning seriously

Ozan Okutan’s humor is a gentle way of delivering a serious message: if you want to do CSA well, you should expect to read, write, debate, and refine. The "full text" is where you earn the confidence that your software does what it should, under the controls your quality system promises.

If you are early in your CSA journey, take Ozan’s warning as encouragement. The work feels heavy because you are building a foundation that will eventually let you move faster with less risk.

This blog post expands on a viral LinkedIn post by Ozan Okutan, Senior Quality Engineer. View the original LinkedIn post ->
