NTU Policy on the Use of Generative Artificial Intelligence in Research
NTU acknowledges that GenAI has the potential to enhance the quality and efficiency of research and to provide new modes of inquiry. However, it is important for researchers to recognise the risks associated with GenAI use. These include inaccurate outputs (or falsehoods), bias, lack of appropriate attribution (plagiarism), threats to data confidentiality, and infringement of intellectual property.
Researchers who use AI tools in their scholarly work must use them in a (1) responsible, (2) accountable, and (3) transparent manner. This includes acknowledging the use of any AI tools in their research proposals, manuscripts, and scholarly works, and stating the extent and nature of the involvement of AI in their work.
The Responsible Use of GenAI in Research
1. Acknowledging/Declaring the Use of GenAI
In the interests of transparency and integrity, the use of GenAI beyond basic spelling and grammar checks should be appropriately acknowledged and cited. Researchers should acknowledge the use of any AI tools in their research proposals, manuscripts, and scholarly works through a statement specifying the tool's full name and version, its purpose of use, and how it was used.
Not citing or acknowledging the use of GenAI could be considered plagiarism (i.e. a form of research misconduct), especially if GenAI was used to generate ideas or for literature reviews. Any direct quotation from AI output should be enclosed in quotation marks, similar to citing textual sources.
For Theses/Dissertations, the use of GenAI should be declared in the Statement of Originality.
2. Authorship
Authorship requires the acceptance of responsibility for the work described in any manuscript. Researchers should recognise that GenAI cannot be held responsible as an author for the accuracy, integrity, and content of such work.
Therefore, GenAI (e.g. ChatGPT) should not be listed as an author of any paper with an affiliation to NTU, nor as a Principal Investigator (PI), Co-PI, or collaborator in any research proposal.
Authors and/or PIs are fully responsible for the content of their scholarly materials (e.g., research proposals, grant applications, manuscripts for publication) that were prepared and/or developed with the use of GenAI. Researchers must therefore exercise caution and judgement when using GenAI, and be ready to verify the accuracy and validity of their work. (See Figure 1 under Resources.)
3. AI-Generated Images, Figures and Videos
The use of GenAI for image and video creation in research has given rise to new copyright and research integrity challenges. Most mainstream publishers prohibit the use of AI-generated images and videos in publications. NTU strictly adheres to established best practices in copyright compliance and research integrity; hence, researchers using AI-generated images, figures, and videos in research should adhere to the following:
i. All AI-generated images, figures, and videos must be clearly labelled as such. Researchers are required to document the methods, tools, and parameters used to generate these visuals in sufficient detail to ensure reproducibility.
ii. AI-generated images, figures and videos must not be presented as actual experimental data unless AI (or AI-assisted) tools are an essential component of the research design and/or methodology. The use of AI to fabricate images to resemble experimental results constitutes serious research misconduct. Similarly, the substantive modification of genuine experimental images using AI tools is prohibited unless clearly described in the methodology.
iii. Researchers are responsible for the content of the AI-generated images, figures and videos. They must ensure that the content does not infringe upon copyright or intellectual property rights by referring to the terms of use of the AI tool.
4. Data Privacy & Confidentiality
The use of GenAI to process or analyse research data must comply with all relevant data privacy and protection laws, regulations, and institutional policies, e.g., the Personal Data Protection Act (PDPA) and NTU's Data Governance Policy.
This includes data owned by external parties, including but not limited to businesses, organisations, and government ministries/agencies.
Any confidential or sensitive information, and/or personal data must not be uploaded to any external GenAI software, system, or platform unless:
- The activity does not contravene any applicable laws, regulations, or institutional policies;
- Access to the GenAI is controlled and restricted to only authorised study members involved in the research;
- The data is not retained in or by the GenAI; and
- Where applicable, written permission has been explicitly provided by the data owner, or the use of such GenAI has been agreed upon in the Research Collaboration Agreement (RCA).
Researchers must prioritise and safeguard the privacy and confidentiality of research data when using GenAI and will be held responsible for any leakage of confidential data.
5. Rules established by other parties
If the proposal, manuscript, or other document is being submitted to an agency, journal, or other party with rules concerning GenAI that differ from the NTU policy, then the stricter rules should be followed.
Date issued: 04 Jul 2023
Last reviewed: 07 Nov 2025
Disclosure
OpenAI’s ChatGPT (Mar 23 version) was used to improve the clarity and readability of this statement. No confidential or sensitive information was uploaded during this process.
Resources:
Cambridge University Press
Elsevier
Oxford University Press
Sage
Springer Nature
Taylor and Francis
Wiley
For NTU Students - Guidelines & Impact of GAI Tools on Education
For NTU Faculty & Staff – GAI Policy and Guidelines for Teaching and Learning

Figure 1: When is it safe to use ChatGPT?
(UNESCO / ChatGPT and Artificial Intelligence in higher education: Quick start guide, licensed under CC-BY-SA 3.0 IGO)