

April 14, 2025

Vienna International Arbitral Centre Issues Guidelines on the Responsible Use of Artificial Intelligence in Arbitration

The VIAC releases a Note on the Use of AI in Arbitration, offering guidance on disclosure, confidentiality, and responsible AI use by tribunals and parties.

The Vienna International Arbitral Centre (“VIAC”) has published a Note on the Use of Artificial Intelligence in Arbitration Proceedings (“Note on AI”), available at https://www.viac.eu/viac-publishes-a-note-on-the-use-of-ai-in-arbitration-proceedings/. The non-binding guidance addresses the use of AI tools by the tribunal and the parties in VIAC arbitrations, including whether such use should be disclosed and the importance of safeguarding confidentiality.

The Note on AI is intended to facilitate discussion on the use of AI in VIAC arbitration proceedings, so that AI enhances, rather than impairs, the efficiency and effectiveness of an arbitration. In particular, it aims to promote the responsible use of AI tools, consistent with ethical standards and professional duties, while maintaining confidentiality and procedural fairness.

The introduction to the Note on AI explains that its application should be tailored to the specific requirements of a case and that, where the tribunal and parties to a VIAC arbitration agree that it should apply, they should also agree which AI tools fall within its scope.

The Note on AI addresses the use of AI by all participants in an arbitration, including the arbitrators, the parties and their counsel. According to the Note on AI, arbitrators should not use AI tools in substitution for their independent analysis of factual and legal issues, and they must not delegate to such tools any decision that may have an impact on the proceedings. However, it remains within an arbitrator's discretion whether to disclose to the parties their intended use of particular AI tools, where they consider this relevant and necessary.

The tribunal is also encouraged to discuss with the parties, at the case management conference, the potential use of AI in the proceedings, whether that use should be disclosed, and the potential impact of AI on the arbitration timeline and costs. In addition, the parties and the tribunal should consider agreeing provisions on confidentiality and the transparent use of AI for inclusion in the first procedural order. Arbitrators and parties are responsible for the outputs of any AI tools they use. Furthermore, arbitrators may direct that the parties disclose where fact or expert evidence is "produced by AI or with the support of AI". It is also within the tribunal's discretion to determine the admissibility, relevance, materiality and weight of any evidence produced by the parties with the support of AI.

Arbitrators and the parties are also required to respect the confidentiality of the arbitral process, ensure that any AI tools they use comply with those confidentiality obligations, and take all reasonable steps to prevent unauthorised access to sensitive case-related data.
