The Cabinet of Ministers of Ukraine has enacted a new policy requiring authors of dissertations that incorporate artificial intelligence (AI) technologies to personally verify and edit every AI-generated section of their work. The measure is intended to ensure that AI tools are used responsibly and to maintain the integrity of academic research: each author must certify that they have personally checked and edited the content, thereby preventing unverified AI-generated text from entering academic publications.
Under the new regulation, dissertation authors must complete a detailed certification confirming that they themselves edited and verified all AI-assisted content. The policy reflects growing concern across educational institutions that AI could be misused in academic settings, spreading inaccurate or plagiarized material. By requiring personal verification, the Ukrainian government aims to preserve the credibility of academic work and deter the unethical use of AI in scholarly contexts.
Implementing the policy is expected to increase the workload for graduate students and researchers, who must now set aside additional time to review and edit AI-generated content. Some argue that the added bureaucratic hurdles could slow academic progress, while others see the requirement as a necessary step to protect the authenticity of research and uphold standards in higher education. The Cabinet’s decision highlights the ongoing global debate over the ethical implications and regulation of AI in educational and research environments.