
The Vienna International Arbitral Centre (“<span class="news-text_medium">VIAC</span>”) has published <a href="https://www.viac.eu/viac-publishes-a-note-on-the-use-of-ai-in-arbitration-proceedings/" target="_blank" class="news-text_link">a Note on the Use of Artificial Intelligence in Arbitration Proceedings</a> (“<span class="news-text_medium">Note on AI</span>”). The non-binding guidance addresses the use of AI tools by the tribunal and the parties in VIAC arbitrations, including whether such use should be disclosed and the importance of safeguarding confidentiality.
The Note on AI is intended to facilitate discussion on the use of AI in VIAC arbitration proceedings so that it enhances, rather than impairs, the efficiency and effectiveness of an arbitration. In particular, it aims to promote the responsible use of AI tools, consistent with ethical standards and professional duties, while maintaining confidentiality and procedural fairness.
The introduction to the Note on AI explains that its application should be tailored to the specific requirements of a case and that, where the tribunal and parties to a VIAC arbitration agree that it should apply, they should also agree which AI tools fall within its scope.
The Note on AI addresses the use of AI by all participants in an arbitration, including the arbitrators, the parties and their counsel. According to the Note on AI, arbitrators should not use AI tools as a substitute for their independent analysis of factual and legal issues, and they must not delegate any decision that may have an impact on the proceedings. However, it remains within an arbitrator's discretion to disclose to the parties their intended use of particular AI tools, where they consider this relevant and necessary.
The tribunal is also encouraged to discuss with the parties, at the case management conference, the potential use of AI in the proceedings, whether that use should be disclosed and the potential impact of AI on the arbitration timeline and costs. In addition, the parties and the tribunal should consider agreeing provisions on confidentiality and the transparent use of AI for inclusion in the first procedural order. Arbitrators and parties are responsible for the outputs of any AI tools they use. Furthermore, arbitrators may direct the parties to disclose where fact or expert evidence is "produced by AI or with the support of AI". It is also within the tribunal's discretion to determine the admissibility, relevance, materiality and weight of any evidence produced by the parties with the support of AI.
Arbitrators and the parties are also required to respect the confidentiality of the arbitral process, to ensure that any AI tools they use comply with those confidentiality obligations, and to take all reasonable steps to prevent unauthorised access to sensitive case-related data.