Recently, the Irish Examiner explored how easily convincing, fictional Leaving Cert research projects can be created using free artificial intelligence (AI) tools, raising serious concerns about educational integrity. The issue is pressing because, under new guidelines, these projects, termed additional assessment components (AACs), will account for 40% of a student’s overall marks in the relevant subjects starting next September.

The AACs are set to be implemented in stages across subjects including biology, chemistry, physics, and business. The guidelines require students to cite any use of AI in their projects, much as they would reference a book or online source. However, this requirement raises pressing questions about plagiarism, given the considerable freedom students have to use AI tools whose output can be virtually impossible to distinguish from genuine student work.

Despite the official stance, concerns among educators are intensifying. Teachers are apprehensive that the significant weight of the AACs, coupled with vague guidelines on AI usage, will inadvertently encourage academic dishonesty. Humphrey Jones, chair of the Irish Science Teachers’ Association (ISTA), noted that AI’s capabilities will only improve over time, stating, “Imagine what this technology will be like in two years when the first batch of sixth years are doing AACs.” He highlighted the frightening prospect of AI fabricating images and drawing graphs for students.

Testing various AI tools such as ChatGPT and Perplexity, the Irish Examiner successfully generated a comprehensive and realistic project, complete with background knowledge, research questions, and even a scientific method, all within 30 seconds. In business projects, AI suggested topics like the impact of social media marketing or consumer preferences for eco-friendly products, further illustrating the ease with which students could bypass genuine academic effort.

Mr. Jones noted that these projects could be crafted to look remarkably authentic with minimal input, placing an overwhelming burden on teachers to verify that work is genuine, a role he argued should remain primarily educational rather than one of policing. The guidelines also lack clarity on what referencing AI in a project actually means for assessment, leaving uncertainty about penalties and how such projects would be marked.

Comparison between Ireland and the UK shows a significant disparity in regulatory frameworks. The UK provides extensive guidelines detailing acceptable AI usage and defined consequences for misuse, whereas Ireland’s approach remains largely undefined. The absence of comparable guidelines in Ireland could encourage some students to exploit the situation, and it also raises equity concerns, as students from different backgrounds may have unequal access to advanced AI tools.

Conor Murphy, an English teacher and chair of the Drama, Film and Theatre Teachers’ Association, echoed Mr. Jones’ sentiments on how AI could replace critical skills in project work, ultimately undermining the educational objectives of the AACs. The pressure students experience during the points race for college admission may exacerbate the likelihood of resorting to AI-generated work as a desperate measure.

A spokesperson for the Department of Education asserted that careful monitoring and student engagement would help ensure authentic work prevails. AACs are designed to build a thorough understanding of the scientific process, encouraging students to develop critical thinking and analytical skills. Yet while the guidelines permit AI-generated materials to be referenced, they do not specify how such referenced work will be treated when marks are awarded, adding to the complexity of navigating this new academic landscape.