
GenAI Liability Management: Best Practices


As GenAI technologies become integral to business operations, companies must adopt comprehensive best practices to mitigate emergent risks and ensure compliance. 

As companies use, develop, and deploy AI technologies, they must navigate an increasingly complex regulatory and commercial landscape. It is critical to develop compliance mechanisms, with the guidance of legal counsel, to meet obligations under evolving applicable laws, guidance, and market standards. 

Overview: 7 best practices for mitigating liability in AI

This post provides guidance for companies using or offering AI products and services, including:

  • Implementing clear policies and procedures for AI usage and development to uphold ethical standards and regulatory requirements. 
  • Monitoring and tracking the use of training data to address compliance concerns (e.g., bias, accuracy, copyright, privacy, etc.). 
  • Implementing robust technical guardrails that reduce the likelihood of AI misuse or errors.
  • Conducting thorough vendor diligence to ensure third-party AI tools meet legal and operational standards. 
  • Securing sufficient IP rights (in, e.g., data licensing, end user, customer, vendor, partner, and other commercial agreements) to enable the company to collect, process, and otherwise use data necessary for its business. 
  • Managing potential liability through appropriate risk allocation mechanisms, including contractual indemnities and limitation of liability provisions. 
  • Obtaining AI-specific insurance for financial protection against unforeseen risks. 

Best practice #1: Adopt and implement internal policies

In addition to regulatory scrutiny, potential investors and acquirers expect target companies to adopt, implement, and maintain commercially reasonable internal policies and procedures to enable AI compliance. 

External use, publication, or distribution of GenAI outputs opens the door to potential exposure to infringement claims, so companies should implement a system for internal review and escalation, especially for higher-risk use cases (e.g., product development, marketing, etc.). 

Companies must maintain strict internal policies covering…

  • GenAI usage 
  • Data usage
  • Permissible purposes for data processing
  • Retention timelines
  • Deletion protocols
  • Business continuity/recovery procedures 

The company’s internal data usage policies should be reviewed by legal counsel to ensure they adhere to applicable privacy laws and any AI- or industry-specific regulations, and the company should regularly audit and review its compliance with such policies.

Working with a GenAI usage policy

GenAI usage policies outline employee requirements and restrictions on prompting and using GenAI outputs, including when to seek approval for higher-risk use cases. Companies will need to tailor their GenAI usage policy to address their unique business risks, which can range from a strict prohibition on using GenAI for any business purpose to a more permissive policy that sets guidelines and parameters for prompts and outputs for approved uses. 

Companies must also provide regular employee training and notify employees of any changes to the GenAI usage policy to ensure that all company personnel (e.g., employees, consultants, and other such service providers) understand the guidelines and responsibilities associated with GenAI use. 

Baseline prohibitions for a GenAI usage policy

In many ways, these GenAI internal compliance policies may mirror the procedures companies use to track licenses and monitor usage of open-source software in back- and front-end operations—such as including robust monitoring programs (e.g., regular code scans and audits), requiring human review and oversight, and tracking internal organization-wide usage of GenAI outputs. 

At a minimum, a GenAI usage policy should include prohibitions covering GenAI tools made available to employees on company-authorized accounts, as well as tools employees access on non-company accounts, such as:

  1. Do not use GenAI tools to conduct illegal activities (e.g., fraud, phishing, etc.); create illegal or unethical content; or manipulate or deceive another person.
  2. Do not use GenAI tools to invade the privacy of individuals; violate data protection and privacy laws; impersonate another person; or generate misrepresentations or falsehoods regarding another person.
  3. Do not use GenAI tools that infringe upon third-party IP rights.
  4. Do not use GenAI tools to disrupt, harm, or gain unauthorized access to systems or networks of the company or any third party.
  5. Do not use GenAI tools to create discriminatory content; or make decisions that have unfair or adverse impacts on people.
  6. Do not use GenAI tools to create content that could harm the reputation or interests of the company or its stakeholders.
  7. Do not ingest company confidential or proprietary information (e.g., customer or vendor lists, source code, product development details, presentations, user information, consumer personal information, etc.) as a prompt in any GenAI tools – unless the employee is using an enterprise account managed by the company.
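Prohibition #7 lends itself to technical enforcement. Below is a minimal sketch of a pre-submission check that blocks prompts appearing to contain confidential data unless the destination is a company-managed enterprise tool. The patterns and tool names are illustrative assumptions; a real deployment would use the company's own data-classification rules (e.g., an enterprise DLP service), not this short list.

```python
import re

# Illustrative patterns only -- stand-ins for company-specific classification rules.
CONFIDENTIAL_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),                  # SSN-like identifiers
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),            # email addresses
    re.compile(r"(?i)\b(api[_-]?key|password|secret)\b"),  # credential keywords
]

# Hypothetical allowlist of enterprise accounts managed by the company.
APPROVED_ENTERPRISE_TOOLS = {"company-managed-llm"}

def check_prompt(prompt: str, tool: str) -> tuple[bool, list[str]]:
    """Return (allowed, reasons). Block prompts that appear to contain
    confidential data when the tool is not on the enterprise allowlist."""
    if tool in APPROVED_ENTERPRISE_TOOLS:
        return True, []
    reasons = [p.pattern for p in CONFIDENTIAL_PATTERNS if p.search(prompt)]
    return (len(reasons) == 0), reasons

allowed, reasons = check_prompt(
    "Summarize the Q3 roadmap for alice@example.com", "public-chatbot"
)
# allowed is False: an email address was detected for a non-enterprise tool
```

A check like this belongs in whatever gateway or browser extension mediates employee access to GenAI tools, paired with the human review and escalation process the policy describes.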

Best practice #2: Monitoring and tracking use of data

Effective monitoring and tracking of data are critical for companies using, developing, or offering GenAI products and services. A robust data governance framework ensures compliance while minimizing risks related to data misuse or infringement.

In addition to the GenAI usage policies mentioned above, businesses should implement the following measures.

Data mapping and inventory 

Maintain a detailed “map” of all data used in training or deploying GenAI models, including…

  • Source/origin (e.g., end user, customer, vendor, partner, data broker, etc.)
  • Data type (e.g., categories of PII)
  • Nature of data (e.g., anonymized, de-identified, aggregated, etc.)

Clearly distinguish between proprietary, licensed, unlicensed, and publicly available data to assess risks relating to compliance, ownership, and licensed rights.
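The data map described above can be kept as structured records rather than a spreadsheet, which makes compliance checks scriptable. The sketch below is a minimal illustration; the field names, categories, and review rule are assumptions, not a standard schema.

```python
from dataclasses import dataclass
from enum import Enum

class DataNature(Enum):
    RAW_PII = "raw_pii"
    DE_IDENTIFIED = "de_identified"
    ANONYMIZED = "anonymized"
    AGGREGATED = "aggregated"

@dataclass
class DatasetRecord:
    """One entry in the data map. Fields mirror the bullets above."""
    name: str
    source: str            # e.g., "customer", "vendor", "data broker"
    data_types: list[str]  # e.g., categories of PII
    nature: DataNature
    license_status: str    # "proprietary", "licensed", "public", "unlicensed"
    used_for_training: bool = False

inventory: list[DatasetRecord] = [
    DatasetRecord(
        name="support-transcripts-2024",
        source="customer",
        data_types=["name", "email"],
        nature=DataNature.DE_IDENTIFIED,
        license_status="proprietary",
        used_for_training=True,
    ),
]

# Example review rule (an assumption): flag training data that is raw PII
# or whose license status is unresolved, for escalation to legal counsel.
needs_review = [
    r for r in inventory
    if r.used_for_training
    and (r.nature == DataNature.RAW_PII or r.license_status == "unlicensed")
]
```

Keeping the inventory in code form also makes the periodic audits discussed below easier to automate, since each audit question becomes a query over the records.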

Transparency and documentation

To demonstrate accountability, facilitate audits, and maintain an audit trail, companies must document data provenance, including:

  • GenAI training datasets
  • Modifications or preprocessing steps taken
  • Approvals/authorizations for GenAI usage policy exceptions
  • Records of acknowledging and fulfilling data subject requests
  • Retention and destruction of sensitive categories of data

All should be documented in accordance with internal policies as well as any applicable laws or regulations.

Periodic audits

In addition to regular data security and integrity audits, companies should also conduct regular legal compliance audits to verify adherence to applicable requirements, including…

  • Licensing agreements
  • Regulatory requirements
  • Flow-down usage limitations
  • Business restrictions

Companies may benefit from outsourcing these compliance functions instead of performing them in-house. For earlier-stage companies that may not have legal operations/compliance teams, there are many service providers that offer commercially available “responsible AI” and data compliance support services.

Best practice #3: Vendor diligence and management

Conducting comprehensive vendor and partner diligence is a critical step for companies using or offering generative AI products and services. Ensuring that third-party entities supplying data or AI tools adhere to rigorous legal, ethical, and technical standards helps mitigate downstream risks associated with non-compliance, data misuse, or poor-quality outputs. 

Consider these key elements of vendor and partner diligence.

  • Evaluate the vendor or partner’s track record. This includes prior compliance issues, data breaches, or legal disputes. Favor partners with a proven history of ethical data sourcing and GenAI development that adheres to applicable current industry standards.
  • Data verification. Require vendors to provide detailed documentation about the provenance of their data. Confirm that data sources comply with all applicable laws, including sector-specific and comprehensive privacy regulations (e.g., GDPR, CCPA), and do not include unauthorized or sensitive information without consent for sub-processing.
  • Contractual safeguards. Negotiate agreements that clearly outline data ownership, permissible uses, and liability for breaches or non-compliance. Include indemnification clauses and ensure vendors warrant that their data is legally and ethically sourced. See “Contractual Risk Allocation” below.
  • Compliance certifications and audits. Insist on relevant certifications (e.g., SOC 2 Type II report, ISO 27001 data security certification, etc.) and conduct regular audits or reviews of vendor data practices. Verify adherence to contractual obligations and legal standards. Review the vendor’s protocols for identifying and addressing risks, including those related to data misuse, privacy violations, and security breaches. Require robust incident response and remediation plans.
  • Technical and ethical obligations. Assess the vendor’s data curation and GenAI development processes for compliance with technical quality benchmarks and ethical guidelines, such as reducing bias, avoiding discrimination, and ensuring transparency. Establish mechanisms for continuous oversight—such as periodic reporting, independent audits, or use of automated tracking tools—to ensure vendors maintain compliance throughout the partnership.

When procuring GenAI tools, such as AI-powered coding companions or notetaking tools, carefully assess how your company will use these third-party products and identify strategies to mitigate potential infringement risks. 

During the review of vendor agreements, including arrangements governed by a vendor’s standard clickthrough terms, consider the rights your company needs and the safeguards the vendor provides for its GenAI offerings.

Best practice #4: Secure sufficient IP rights

Companies must ensure that their inbound and outbound IP license agreements secure sufficient IP rights to data necessary for the operation and conduct of business, while also insulating the company from potential down- or up-stream liability. 

Companies should… 

  1. Clearly define ownership rights to inputs (e.g., user-provided data/prompts) and outputs (e.g., AI-generated content) 
  2. Specify whether users retain ownership of their inputs, and whether outputs are proprietary to the user, shared, or owned by the company
  3. Specify whether the company or vendor retains rights to use AI-generated outputs for further model training, improvement, or other purposes
  4. Ensure that agreements explicitly outline the scope of IP rights granted to users or obtained from vendors (e.g., whether the use of outputs is restricted to personal or commercial purposes, whether sublicensing is allowed, etc.); and 
  5. Clearly define any limitations or exclusions to these rights.

Best practice #5: Contractual risk allocation

Companies can also manage potential liability through appropriate contractual risk allocation, including use of indemnification, limitation of liability, disclaimers, and representation and warranty provisions informed by and tailored to business-specific exposure (e.g., intellectual property disputes, data misuse, and harm arising from AI-generated outputs, etc.). 

The company should develop a contract playbook for engagement of customers, vendors, and partners that covers at least the following:

Customers
  • Indemnities: Limited indemnification for claims related to AI tools; require customer indemnification for improper use of outputs.
  • Limitations of liability: Cap liability tied to fees paid; exclude indirect or consequential damages, with carve-outs for gross negligence or fraud.
  • Outputs & disclaimers: Clarify customer responsibility for use of outputs and include disclaimers about limitations and potential biases.
  • Data governance: Limit liability to areas under company control; require customers to follow agreed data security standards.

Vendors
  • Indemnities: Require vendor indemnity for IP infringement, misuse of data, and claims caused by their AI tools.
  • Limitations of liability: Negotiate caps sufficient to cover risks like third-party claims; exclude caps for gross negligence or fraud.
  • Outputs & disclaimers: Ensure liability for vendor-provided outputs aligns with agreed warranties and compliance standards.
  • Data governance: Establish vendor responsibility for data compliance and breaches; require adherence to privacy laws.

Partners
  • Indemnities: Mutual indemnification for breaches, IP violations, or misuse of shared AI tools.
  • Limitations of liability: Define proportional caps based on each party's control over AI products or services.
  • Outputs & disclaimers: Define responsibilities and rights for outputs or jointly created IP.
  • Data governance: Define clear responsibilities for data governance and liability for misuse of shared datasets.

Companies can manage risk more effectively by carefully crafting risk-shifting contractual provisions. Legal counsel should regularly review contract templates, negotiate material customer and provider arrangements, and update the company’s contract negotiation and review playbook to align with new regulatory requirements, industry standards, and market trends.

Best practice #6: Implement technical guardrails

One of the most effective tools companies can use to limit liability associated with GenAI models is the implementation of robust internal technical guardrails.

Guardrails for a business may include the following:

  • Controls that minimize the risk of incorporating copyrighted works. (This is especially necessary for companies developing AI models.) These controls can include internal restrictions on permissible training data and clear guidelines for employees when utilizing third-party GenAI tools.
  • Acquisition of all necessary rights to data, particularly for purposes beyond providing services to customers. (This is essential when using data to train AI models or feed inputs into them.) This is especially critical when handling PII, where compliance with privacy laws and principles is imperative to avoid costly financial penalties.
  • Employee policies and oversight procedures that clearly define how these outputs can be used. (This is important for businesses integrating generative AI outputs into operations.) Restrictions on incorporating third-party generated content into products and services can help prevent unintentional violations of copyright or IP laws.

Note: AI data practices often conflict with traditional privacy principles. Companies should establish technical guardrails in accordance with the latest commercial industry standards and regulatory guidance to protect their ability to collect, use, and store the data essential to their operations. 

These measures ensure compliance with privacy regulations while supporting the company’s data-driven objectives. By proactively implementing these safeguards, organizations can reduce liability, promote ethical AI use, and align with legal and regulatory standards.

Best practice #7: Obtain AI insurance

GenAI continues to transform enterprise operations. The rise in potential liabilities associated with developing and utilizing AI models has exposed companies to unprecedented risks. 

Specialized AI insurance can provide an effective safeguard against liabilities that may arise from the use or deployment of AI models.

Ensuring that AI insurance policies align with a company’s specific practices and needs is critical. Tailored coverage can address risks unique to AI operations, such as intellectual property disputes, algorithmic errors, or misuse of AI-generated outputs. 

The Vouch approach to AI insurance

Vouch Insurance Advisors can help simplify the insurance process by working with you to identify your potential needs and explaining available coverage options and their associated costs. Our team is committed to providing clear information to support your decision-making process. 

Discover alternatives to traditional insurance for your startup and find tailored solutions that fit your unique needs. And, with just a few clicks, get a coverage recommendation based on your business stage and industry vertical. 
