Using ChatGPT for solving homework or other tasks is not inherently unethical - unless it is specifically forbidden, of course.
There are two issues: plagiarism and quality.
1. ChatGPT text should be properly cited; otherwise it is plagiarism. The APA guidelines for citing ChatGPT are a good starting point. In short, APA treats ChatGPT output as the output of an algorithm: you credit OpenAI as the author, with an author-date citation in the text and a reference entry for the model (an example follows point 2 below; see the link at the end of the post).
But this is just a guideline. To be enforceable, it should be formalized in the rules of the school or course.
2. Quality. Any tool is suited to some purposes and unsuited to others. As discussed in previous articles (link at the end of the post), LLMs are great at processing text, e.g. summarizing documents or generating text similar to existing texts. They are, just as obviously, terrible at generating genuinely new ideas.
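To make the citation in point 1 concrete: the APA blog post linked below recommends (as of its 2023 version) a reference entry and an author-date citation along these lines. The model version and date here are illustrative; use the version you actually worked with, and check the linked post for the current format.

Reference entry:
OpenAI. (2023). ChatGPT (Mar 14 version) [Large language model]. https://chat.openai.com/chat

In-text citation:
... the ChatGPT-generated text suggested that ... (OpenAI, 2023).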
ChatGPT is ultimately just a tool. It is not responsible for the quality of your work or results - just as MS Word or a piece of paper is not responsible for the text that you write on it.
(Inspired by a question from Bogdan Dumitrescu).
APA guidelines for citing ChatGPT: https://apastyle.apa.org/blog/how-to-cite-chatgpt
My previous articles on ChatGPT use cases: https://blog.stefanmorcov.com/search/label/AI