AI tools can streamline academic processes, including some of those asked of you in these activities (e.g., writing in MarkUp, analyzing a text, naming and organizing files).
For this activity, think about the different stages of the research lifecycle (literature review, data analysis, dissemination, collaboration, etc.). Identify at least one way an AI tool (e.g., ChatGPT, Google Gemini, Microsoft Copilot) could be positively applied to open research, and at least one potential challenge or ethical concern regarding the use of AI in open research.
Complete this Activity
To complete the activity, post a comment with your answer.
GenAI Tools
Generative AI (GenAI) is rapidly transforming how we conduct research, scholarship, and teaching and learning. Permitted use of GenAI in these areas is constantly shifting and depends on the organizations or institutions with which you are engaging. Before sharing any outputs of your GenAI use, consult the relevant policies and guidelines on accepted use. To learn more, review the resources on Generative AI and Open Scholarship.
Image Credit: hand-3044387_1280 used under Pixabay Content License
AI tools could be positively applied to summarizing major themes across multiple papers for a literature review. However, AI could miss new or recent papers in its assessment, and may hallucinate information if its output is not thoroughly checked and verified.
I think that in the early stages of a project, AI can significantly enhance the literature review and background research process by rapidly scanning and processing large volumes of academic publications. Some specialized AI systems can identify relevant papers, summarize key findings, and highlight gaps in the existing research. This helps researchers by making much of this work faster and more “efficient”.
However, the use of AI risks overlooking subtle insights in papers, missing nuances like tone, skepticism, or methodological caveats that human researchers would catch. Additionally, AI often relies on existing databases that may be skewed toward English-language or high-impact journals, which can under-represent niche or non-Western research and create blind spots in the literature review. Some AI tools are not updated in real time and may miss recent studies, potentially leaving gaps in the most current research landscape. Finally, AI can hallucinate, which, without thorough review and analysis, could insert false information into a lit review.
I can see how AI could help with research dissemination. AI could be used to adjust language to be more suitable for individuals without the same scientific background. We could ask AI tools like ChatGPT or Microsoft Copilot to summarize the research in language that is more accessible to a broader audience. However, as mentioned by the user above, AI models can hallucinate or provide inaccurate information, and their output should always be reviewed for accuracy before sharing!