A due diligence questionnaire (DDQ) is a formal process for verifying a third-party vendor's stated controls and capabilities. It is a structured method for examining whether the vendor's security, operations, and compliance practices align with its contractual commitments and your risk tolerance.
Why the Due Diligence Questionnaire Is a Critical Risk Management System

In a modern governance, risk, and compliance (GRC) framework, the DDQ is more than a vendor checklist. It is a systematic process for scrutinising a third party’s controls to ensure they meet specified requirements. This is a foundational component of any mature third-party risk management program.
The primary function of a DDQ is to create a verifiable record of a vendor’s security posture. For organisations in regulated industries, this is a mandatory activity. Regulations like DORA and GDPR require demonstrable oversight of the supply chain, particularly for critical third-party providers.
Establishing a Defensible Position
A well-executed DDQ process establishes a defensible position for audits and security incidents. It transitions the vendor relationship from one based on assumed trust to one based on verified compliance. By asking specific questions and requiring evidence—such as policy documents, configuration records, or recent penetration test reports—you establish a clear line of accountability.
A due diligence questionnaire transforms risk management from a theoretical exercise into an evidence-based discipline. Its value lies not in the questions asked, but in the auditable proof received in response.
This focus on evidence is what distinguishes a DDQ from an informal security survey. A survey might ask about a vendor's confidence in their security controls, whereas a DDQ requires them to prove their controls are implemented and operating effectively. This evidence-based approach gives the process its authority and makes the results reliable for risk assessments.
A Verification Tool for Both Parties
For the organisation issuing the DDQ, it functions as a critical control. It helps identify deficiencies in a vendor’s environment before they can translate into operational, financial, or reputational damage. The responses directly inform the risk assessment and the decision to engage the vendor.
For the vendor responding, a DDQ is an opportunity to demonstrate maturity and operational excellence. A thorough response, supported by verifiable evidence, serves as a competitive differentiator. It signals reliability and a systematic approach to security and governance. Vague or unsubstantiated answers are indicators of potential risk. You can read more about this in our article on foundational risk management and compliance.
Ultimately, the DDQ formalises the responsibilities of all parties. It establishes a documented baseline for security and operational resilience, making it an essential system within any robust GRC program.
The Anatomy of a Modern DDQ

While no two due diligence questionnaires are identical, they are no longer an arbitrary collection of security questions. A modern DDQ is structured logically, with each section designed to test a specific aspect of a vendor’s operational and security resilience.
A DDQ should be viewed as a layered system of verification, not a flat list of questions. Its purpose is to connect a vendor’s claims to tangible, auditable evidence. The domains it covers are a direct reflection of the risks an organisation inherits when integrating a third party into its operational environment.
Core Security and Resilience Domains
Most DDQs begin with an assessment of foundational security controls. This is where you verify the implementation and effectiveness of a vendor's basic cyber hygiene. The inquiry is not whether security measures exist, but how they are implemented, managed, and validated.
This domain consistently includes:
- Information Security and Cybersecurity: This is the most extensive section, examining controls from access management and data encryption to vulnerability management and penetration testing. The objective is to confirm the vendor has a defence-in-depth strategy to protect data assets.
- Business Continuity and Disaster Recovery (BCDR): The focus here shifts from prevention to resilience. Questions scrutinise the vendor’s documented ability to maintain service levels during a disruptive event. You will request BCDR plans, test results, and specific metrics like Recovery Time Objectives (RTOs) and Recovery Point Objectives (RPOs).
A DDQ is not built to collect "yes" or "no" answers. It is designed to compel the vendor to provide evidence that their controls operate as described. A policy document without proof of implementation is an artifact, not evidence of a functioning control.
Governance and Operational Integrity
Technical controls are ineffective without a governance framework to enforce them. A modern DDQ extends beyond system configurations to investigate the organisational processes and accountability structures that sustain security.
Key governance domains include:
- Data Governance and Privacy: This section focuses on the data lifecycle. It asks how a vendor classifies, handles, and disposes of sensitive information. This is where you verify compliance with regulations like GDPR and ensure data is managed according to contractual obligations.
- Corporate and Organisational Governance: This area assesses the vendor’s stability and operational maturity. Questions may cover financial viability, insurance coverage, employee background screening, and security awareness training. The goal is to determine if the vendor is a sound and reliable long-term partner.
Emerging Areas of Inquiry
Risk landscapes are not static. DDQs are evolving to address new risks associated with emerging technologies and increasingly complex supply chains. These newer sections demonstrate a forward-looking approach to risk management.
Standardised questionnaires from industry bodies like the Association for Financial Markets in Europe (AFME) are raising the bar. The AFME questionnaire dedicates over 50 questions to IT disaster recovery and business continuity planning, demanding specific metrics. Even so, reports indicate that 65% of financial vendors initially fail DDQ checks on fundamental controls such as encryption.
New domains now becoming standard include:
- AI Systems Governance: As AI is integrated into service offerings, DDQs are including questions about its governance. What is the provenance of the training data? How is model transparency ensured? Is there a human-in-the-loop for critical decisions? These questions align with emerging frameworks like the EU AI Act.
- Supply Chain Security (Fourth-Party Risk): This addresses the "vendor's vendor" problem. Assessors now require evidence of how a vendor manages risk with its own third parties. They expect to see a documented process for vetting sub-processors, ensuring risk is managed and not simply transferred down the supply chain.
Navigating Common Third-Party Due Diligence Challenges
A due diligence questionnaire is intended to be a foundational risk management tool, but in practice, the process is often fraught with operational friction and hidden compliance risks. For both the organisation issuing the DDQ and the vendor responding, the process can become a bottleneck that delays projects and strains resources.
One of the most significant issues is the reliance on unverified self-attestation. A vendor can affirm that a control is in place, but without supporting evidence—such as a policy, a configuration record, or a recent audit report—the answer has limited value. This creates a critical gap between declared security and demonstrable control.
Another major hurdle is the inefficiency of evidence collection. Responding teams often struggle to locate documents, answer redundant questions for different clients, and transmit sensitive files over insecure channels like email. This manual process is not only slow but also introduces errors and leaves a fragmented, untraceable evidence trail.
The Scope of the Problem in Regulated Industries
These challenges are amplified in regulated sectors. A recent Deloitte survey on DORA readiness identified a significant gap in third-party risk management across Europe's financial industry.
Of all entities surveyed across 28 countries, only 8% had achieved full compliance in managing their third-party risks. The Deloitte DORA European Survey also revealed that 17% of organisations view due diligence and third-party compliance as one of their most demanding challenges.
The core challenge of the due diligence process is not asking the right questions. It is systematically collecting and verifying the evidence that proves the answers are correct. Without this, the entire exercise is an administrative formality.
This data highlights a critical point: even with substantial compliance budgets—with 64% of firms planning to spend between €2 million and €5 million—the operational execution of due diligence remains a primary point of failure. The gap between investment and maturity indicates that financial resources alone are insufficient without effective systems and processes.
Addressing Compliance Gaps and Operational Friction
When findings from a DDQ are not properly tracked and remediated, they create significant compliance gaps. A finding that a critical vendor lacks an adequate incident response plan, for example, must be logged, tracked, and resolved. If it is merely noted in a spreadsheet and forgotten, the organisation has failed to perform its due diligence and remains exposed.
To overcome these challenges, organisations must treat the due diligence questionnaire as an engineering and governance discipline, not an administrative task. This involves:
- Centralising Evidence: Establishing a secure, version-controlled repository of evidence that can be reused across multiple DDQs.
- Mapping Controls to Evidence: Directly linking questionnaire answers to specific, dated, and validated proof.
- Automating Workflows: Using systems to manage question assignment, track progress, and securely handle submissions.
By implementing a structured, evidence-based system, both issuers and responders can transform the DDQ from a source of operational friction into a valuable and efficient verification process.
A Systematic Approach to Responding to a DDQ
Receiving a due diligence questionnaire often initiates a reactive, uncoordinated effort involving email chains, hastily gathered evidence, and looming deadlines. This ad-hoc approach creates internal friction and results in inconsistent, poorly substantiated responses that can project an image of organisational immaturity.
A DDQ response is not merely an administrative task; it is a demonstration of your organisation's control environment and governance maturity. The objective is to transition from a reactive scramble to a structured, evidence-based process that builds trust. This requires delivering precise answers, each linked to verifiable proof. Vague assurances are insufficient.
Deconstruct the DDQ and Assign Ownership
No single individual possesses all the necessary information to complete a DDQ, as its questions span multiple domains, including IT, cybersecurity, HR, and legal.
The first step is to manage the response as a formal project. Deconstruct the questionnaire and assign clear ownership for every question. A simple ownership matrix is an effective tool for this, mapping each DDQ section or question to a specific subject matter expert (SME) or team. This establishes immediate accountability and ensures that questions are directed to the appropriate individuals from the outset. This matrix serves as the project plan, preventing omissions and providing the coordinator—typically a CISO or compliance manager—with clear visibility into progress.
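The ownership matrix described above can be as simple as a structured mapping that a coordinator can query for gaps. A minimal sketch in Python; the section names, owners, and question IDs are illustrative, not taken from any real questionnaire:

```python
# Minimal sketch of a DDQ ownership matrix. Section names, owners,
# and question IDs are illustrative assumptions, not from a real DDQ.
ownership_matrix = {
    "Information Security": {"owner": "CISO team", "questions": ["IS-01", "IS-02", "IS-03"]},
    "BCDR": {"owner": "IT Operations", "questions": ["BC-01", "BC-02"]},
    "Data Privacy": {"owner": "Legal / DPO", "questions": ["DP-01"]},
    "HR Screening": {"owner": None, "questions": ["HR-01"]},  # not yet assigned
}

def unassigned_sections(matrix):
    """Return the sections that still lack a responsible owner."""
    return [name for name, entry in matrix.items() if entry["owner"] is None]

print(unassigned_sections(ownership_matrix))  # flags "HR Screening" as unowned
```

Even this trivial structure gives the coordinator the visibility the matrix is meant to provide: any section returned by `unassigned_sections` is a question that will otherwise fall through the cracks.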
Map Questions to Controls and Policies
With owners assigned, the next step is to map each question to your internal controls, policies, and procedures. This is the core of an evidence-based response. The goal is not simply to answer "yes," but to demonstrate how the requirement is met.
For example, a question such as, "Do you enforce multi-factor authentication for all administrative access?" requires more than a simple affirmative. A robust response links directly to:
- The specific internal policy that mandates MFA.
- A configuration screenshot from the identity provider showing the MFA policy is active for the relevant administrative group.
- An excerpt from an audit log demonstrating a successful MFA-protected login.
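An answer structured this way can be modelled as a record that carries its own traceability. The sketch below is one possible shape, with hypothetical field names and evidence IDs; the point is that an answer without a policy reference and dated evidence fails a simple substantiation check:

```python
from dataclasses import dataclass, field

@dataclass
class Evidence:
    """A single piece of proof: a policy excerpt, screenshot, or log extract."""
    description: str
    file_ref: str   # path or ID in the evidence library (illustrative)
    dated: str      # ISO date the evidence was captured

@dataclass
class DDQAnswer:
    question_id: str
    answer: str
    policy_ref: str                      # the internal policy mandating the control
    evidence: list = field(default_factory=list)

    def is_substantiated(self):
        """An answer without a policy reference and dated evidence is an assertion."""
        return bool(self.policy_ref) and len(self.evidence) > 0

mfa = DDQAnswer(
    question_id="IS-14",
    answer="Yes - MFA is enforced for all administrative access.",
    policy_ref="POL-AC-002 Access Control Policy, s.4.3",  # hypothetical reference
    evidence=[
        Evidence("IdP screenshot: MFA policy active for admin group", "EV-0231", "2024-04-02"),
        Evidence("Audit log excerpt: MFA-protected admin login", "EV-0232", "2024-04-02"),
    ],
)
print(mfa.is_substantiated())  # True: policy plus two dated artifacts
```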
A strong due diligence questionnaire response is built on a foundation of traceability. The reviewer should be able to follow a clear line from their question, to your answer, to the specific piece of evidence that proves your control is operational.
This process connects the abstract language of policy to the concrete reality of system implementation. It is also a critical part of managing evidence effectively, which is discussed in our guide on using a virtual data room for due diligence.
Build a Repeatable Evidence Workflow
Once questions are mapped to controls, you must collect the audit-grade evidence. Many organisations falter at this stage by treating it as a one-time data collection effort; what is needed instead is a structured process for collecting, managing, and submitting evidence.
The table below outlines the key stages of a repeatable DDQ response process, clarifying the objective and responsible roles at each step.
DDQ Response Process Key Stages and Responsibilities
| Stage | Objective | Primary Responsible Role(s) | Key Action |
|---|---|---|---|
| 1. Intake & Triage | Understand scope and assign a project lead. | CISO, Compliance Manager | Review the DDQ, establish deadlines, and assign a single coordinator. |
| 2. Deconstruction | Assign ownership for every question. | DDQ Coordinator | Create an ownership matrix mapping questions to specific SMEs. |
| 3. Evidence Mapping | Connect questions to internal controls and policies. | SMEs, Control Owners | Identify the exact policy, procedure, or configuration that answers the question. |
| 4. Evidence Collection | Gather and centralise all required proof. | SMEs, System Owners | Upload screenshots, logs, reports, and policy documents to a central location. |
| 5. Review & Approval | Ensure all answers are accurate, consistent, and backed by evidence. | DDQ Coordinator, CISO | Review the complete DDQ response for quality and consistency before submission. |
| 6. Secure Submission | Deliver the response and evidence securely to the requesting party. | DDQ Coordinator | Use a secure portal or data room to transmit sensitive information. |
This structured workflow transforms a chaotic exercise into a governable system.
A central evidence library is a non-negotiable component of a mature process. Instead of repeatedly searching for the same penetration test report, SMEs can retrieve the latest approved version from a secure repository. This library must incorporate strict version control. When a policy is updated or a new audit is completed, old evidence is archived and replaced with new, dated proof. This ensures every response is based on current, accurate information.
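The archive-and-replace behaviour described above can be sketched as a small in-memory model. This is a simplification under stated assumptions (a real library would persist to secure storage and record approvals), but it shows the core invariant: publishing a new version archives the old one rather than deleting it:

```python
from datetime import date

class EvidenceLibrary:
    """Sketch of a version-controlled evidence store: superseded versions
    are archived, never deleted, preserving the audit trail."""

    def __init__(self):
        self._current = {}   # doc_id -> (version, dated, content_ref)
        self._archive = []   # superseded versions, kept for the audit trail

    def publish(self, doc_id, content_ref, dated=None):
        if doc_id in self._current:
            # Archive the outgoing version instead of deleting it.
            self._archive.append((doc_id,) + self._current[doc_id])
        version = len([a for a in self._archive if a[0] == doc_id]) + 1
        self._current[doc_id] = (version, dated or date.today().isoformat(), content_ref)

    def latest(self, doc_id):
        """SMEs always retrieve the current approved version."""
        return self._current[doc_id]

lib = EvidenceLibrary()
lib.publish("pentest-report", "reports/pentest-2023.pdf", "2023-05-01")
lib.publish("pentest-report", "reports/pentest-2024.pdf", "2024-05-01")  # supersedes 2023
print(lib.latest("pentest-report"))  # version 2, dated 2024-05-01
print(len(lib._archive))             # the 2023 report is archived, not gone
```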
Using dedicated platforms to manage this entire workflow distinguishes mature organisations. These tools automate assignments, provide a secure portal for evidence, and create a complete audit trail. This approach eliminates the risky practice of emailing sensitive files and establishes a defensible process for every DDQ.
How Due Diligence Addresses AI Governance
As artificial intelligence transitions from a novel technology to a standard system component, the due diligence questionnaire is evolving. The focus is expanding beyond traditional IT security to include rigorous scrutiny of how vendors build, manage, and govern their AI systems. This is now a fundamental aspect of modern third-party risk management.
Regulators and clients no longer treat AI as an inscrutable black box. They understand it as a complex software system with defined operational limits and clear human accountability. A modern due diligence questionnaire now probes the governance framework surrounding these systems, ensuring a vendor's use of AI is transparent, controlled, and aligned with emerging legal and ethical standards.
Scrutinising AI Systems and Their Governance
In practice, this means vendors must be prepared to answer questions that dissect the entire lifecycle of an AI model. These questions are designed to verify the system's integrity and the robustness of its oversight. The objective is to obtain evidence that the vendor has an operational AI governance framework, not just a new technology feature.
Key areas of inquiry now include:
- Data Provenance and Quality: What was the source of the training data? What processes were used to ensure it is accurate, unbiased, and ethically sourced? Vendors need to provide proof of their data acquisition and cleansing procedures.
- Model Transparency and Explainability: Can the vendor explain how its AI model arrives at a decision? Assessors will request documentation on model architecture, its limitations, and how outputs are interpreted, particularly for high-risk applications.
- Human Oversight and Intervention: Is there a defined process for human intervention? A due diligence questionnaire will demand evidence of "human-in-the-loop" controls, including procedures for overriding or correcting AI-driven decisions.
The following chart illustrates the core steps in a structured DDQ response workflow. This process is essential for collecting the specific, traceable evidence required to answer AI governance questions.

This workflow—assigning questions, mapping them to controls, gathering evidence, and finalising answers—is what differentiates a verifiable, audit-grade response from a superficial one.
Standardising AI Due Diligence
Industry groups are developing formal, consistent standards for AI procurement. The launch of the Legal IT Innovators Group (Litig) AI Due Diligence Questionnaire on May 16, 2024, is a prime example. Backed by member firms representing 90,000 users, this DDQ is divided into six sections and directly addresses risks associated with the EU AI Act. This sends a clear market signal that standardised verification is now an expectation. You can learn more about how industry bodies are shaping AI vendor assessments on legaltechnology.com.
The introduction of AI-specific due diligence questionnaires codifies the principle of accountability. It forces vendors to treat AI not as a black box, but as a managed system component subject to the same rigorous oversight as any other critical infrastructure.
For vendors, this means that compliance with standards like ISO/IEC 42001 is rapidly becoming a baseline requirement. A successful response requires more than policy documents; it demands tangible evidence of ethical impact assessments, documented model limitations, and clear records of human oversight.
Reports suggest that 70–80% of AI vendors initially fail to provide adequate evidence for these new governance sections, highlighting a significant gap between claims and provable reality. In any competitive procurement process, the ability to answer these questions with hard evidence is no longer a differentiator—it is a prerequisite.
Evidence: From Unstructured Data to an Audit-Ready System

The answers provided in a due diligence questionnaire are incomplete without the evidence to substantiate them. A claim without proof is an assertion, not a verified control. The mechanics of managing and submitting evidence are as critical as the security measures themselves, representing the final step that transforms a checklist into a genuine, audit-ready demonstration of a security posture.
Building Your Evidence Library
The foundation of any robust DDQ response process is a central, secure evidence repository. This is not simply a shared drive; it is a controlled system designed to substantiate claims and eliminate the inefficiency of last-minute document searches. Its purpose is to ensure every piece of evidence is current, approved, and readily accessible.
The core principles for building an effective evidence library include:
- Immutability and Version Control: Every document, screenshot, or policy must have a clear version history. When a control is updated, the old evidence is not deleted but archived. This creates an immutable audit trail, demonstrating the state of compliance over time.
- Strong Encryption: Evidence must be protected at all stages, both in transit and at rest. Using strong encryption, such as AES-256 for data at rest, is not an optional feature but a baseline requirement for protecting sensitive internal data.
- Role-Based Access Control (RBAC): Access to evidence should be restricted based on functional roles. Granular access controls ensure that experts can only manage evidence relevant to their domains, such as network security or human resources.
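The RBAC principle in the list above reduces to a simple membership check: a role carries a set of evidence domains, and any request outside that set is denied. A minimal sketch; the role and domain names are illustrative assumptions:

```python
# Minimal role-based access check for an evidence repository.
# Role names and domain names are illustrative, not a real schema.
ROLE_DOMAINS = {
    "network-engineer": {"network-security"},
    "hr-manager": {"human-resources"},
    "ddq-coordinator": {"network-security", "human-resources", "governance"},
}

def can_manage(role, evidence_domain):
    """A role may manage evidence only within its assigned domains;
    unknown roles get an empty set, so they are denied by default."""
    return evidence_domain in ROLE_DOMAINS.get(role, set())

print(can_manage("network-engineer", "network-security"))  # allowed: own domain
print(can_manage("network-engineer", "human-resources"))   # denied: outside domain
```

Note the deny-by-default behaviour for unknown roles: in an evidence library, the safe failure mode is always to refuse access.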
An evidence library is more than a storage system. It is the single source of truth for your compliance posture. Its design and security reflect your organisation’s actual commitment to governance.
Secure Submission and Audit-Ready Exports
The method used to transmit the completed DDQ is a final test of professionalism and security competence. Attaching hundreds of sensitive files to an email is an immediate indicator of poor data handling practices.
Secure submission is paramount. The only professional method is to use a dedicated, encrypted portal where the requesting party can access the questionnaire and its associated evidence. This creates a clean, auditable log of what was sent, by whom, and when.
Security alone is not sufficient; the submission must also be usable for the reviewer. Simply providing a folder of unsorted files creates unnecessary work for the auditor and reflects poorly on your organisational processes. The goal is to provide a complete audit-ready export pack, typically as an indexed PDF or a structured ZIP file.
This package should contain:
- A clear index that maps every question directly to its corresponding evidence file.
- The completed questionnaire, with answers linking to the evidence.
- All supporting evidence, named and organised logically.
- An export log detailing when the pack was created and its contents.
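Assembling such a pack is mechanical once the question-to-evidence mapping exists. The sketch below builds an in-memory ZIP with Python's standard library; the file layout and index format are assumptions, not a prescribed standard:

```python
import io
import json
import zipfile
from datetime import datetime, timezone

def build_export_pack(answers):
    """Build an audit-ready ZIP in memory: evidence files organised per
    question, an index.json mapping each question to its evidence, and an
    export log with a creation timestamp. `answers` maps question IDs to
    lists of (evidence filename, payload bytes). Layout is illustrative."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as pack:
        index = {}
        for qid, artifacts in answers.items():
            index[qid] = []
            for name, payload in artifacts:
                path = f"evidence/{qid}/{name}"
                pack.writestr(path, payload)
                index[qid].append(path)
        # The index lets the reviewer trace every question to its proof.
        pack.writestr("index.json", json.dumps(index, indent=2))
        # The export log records when the pack was created and what it holds.
        log = {"created": datetime.now(timezone.utc).isoformat(),
               "files": sorted(pack.namelist())}
        pack.writestr("export_log.json", json.dumps(log, indent=2))
    return buf.getvalue()

pack_bytes = build_export_pack({
    "IS-14": [("mfa-policy-screenshot.png", b"...")],
})
with zipfile.ZipFile(io.BytesIO(pack_bytes)) as z:
    print(z.namelist())  # evidence file, index.json, export_log.json
```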
This structured approach streamlines the auditor’s review process and demonstrates the reliability of your internal systems. Our guide offers more detail on preparing and managing audit evidence for demanding regulatory environments. When evidence management is treated as an engineering discipline, it proves that your security posture is operational, not just theoretical.
Common Questions About the DDQ Process
When managing third-party risk, the due diligence questionnaire is where theory meets practice. Below are answers to practical questions that professionals encounter in their daily work.
How Often Should We Send a DDQ to a Vendor?
The frequency of a DDQ depends on the level of risk associated with the vendor.
For critical suppliers—those handling sensitive data or providing essential services—an annual review is a standard baseline. For lower-risk vendors, a biennial or triennial cycle may be sufficient.
However, a fixed calendar represents an outdated model of governance. Modern risk management is event-driven. A new DDQ should be triggered whenever a significant event occurs, such as:
- A security incident at the vendor.
- Major changes to the services they provide.
- A merger, acquisition, or other significant change in their corporate structure.
Continuous monitoring combined with event-driven assessments is becoming the new standard, ensuring that your understanding of a vendor’s security posture remains current.
What Is the Difference Between a DDQ and a SOC 2 Report?
This is a frequent point of confusion. A DDQ and a SOC 2 report are related but serve different functions and are not interchangeable.
A SOC 2 report is the product of an independent audit performed by a licensed CPA firm. It validates a vendor's controls against the AICPA's Trust Services Criteria, providing a standardised, third-party opinion on their control environment.
A due diligence questionnaire (DDQ), in contrast, is a direct inquiry from your organisation to your vendor. It allows you to ask specific, targeted questions relevant to your risk profile and the exact services being provided.
A SOC 2 report is a critical piece of evidence that you should request as part of your DDQ. The DDQ itself allows you to probe deeper into areas a generic audit might not cover, such as your specific data handling requirements or the vendor's reliance on their own third-party suppliers (fourth-party risk).
Can We Just Use One Standard DDQ for All Vendors?
Using a standardised questionnaire, such as the Cloud Security Alliance's CAIQ or Shared Assessments' SIG, is an effective practice. These frameworks provide a solid baseline and introduce consistency into the vendor assessment process.
However, a one-size-fits-all approach is a strategic error.
The optimal approach is to begin with a core, standardised set of questions and then augment it with modules based on the vendor's risk tier. A high-risk data processor warrants a far more detailed review than a low-risk provider of commodity goods. The objective is not to treat all vendors identically, but to ensure the level of diligence is always commensurate with the level of risk.