The Supreme Court of Ukraine Prohibited the Use of ChatGPT and Grok Responses as Evidence in Courts


The Supreme Court of Ukraine has confirmed that responses from artificial intelligence, including models like ChatGPT and Grok, cannot be recognized as judicial evidence. This decision aims to uphold the principles of reliability in judicial information and the independence of judges.

This was reported by Business • Media.

Use of AI in Legal Cases: Limits and Restrictions

The Supreme Court emphasized that artificial intelligence technologies cannot replace judges or serve as a source of scientifically validated information. They may only be used as an auxiliary tool during case consideration, but not for making final decisions. Attempts to use AI responses to challenge court rulings have also been deemed unacceptable.

A precedent-setting decision was made by the Cassation Economic Court on July 8, 2025, in a case between the city council and a limited liability company regarding amendments to a land lease agreement and the recalculation of rent. During the proceedings, the defendant requested the use of chatbot responses as evidence; however, the court denied this request, stressing that decisions must be made exclusively by humans.

The Position of the Supreme Court and Its Significance for Justice

After the case was reviewed by the Cassation Court, the materials were forwarded to the Supreme Court of Ukraine for a final determination on the use of AI responses in legal proceedings. A special panel of judges emphasized that delegating decision-making to artificial intelligence contradicts judicial autonomy and the principles of fair trial.

The Supreme Court’s ruling clearly states that chatbot responses are not a reliable source of information, and the refusal to consider them as evidence is legitimate. Artificial intelligence technologies should only be used to support the court, without replacing judicial analysis and the decision-making process.


Thus, the Supreme Court of Ukraine has established the principle that artificial intelligence can only be a support tool, but not a source of evidence in judicial proceedings.

Additionally, a law clerk in the USA was earlier fired for using ChatGPT to prepare lawsuits, which further underscores the cautious approach to applying such technologies in the judicial sphere.