EARMA Conference Odense 2024

LLM in Research Support

Leveraging Large Language Models (LLMs) to Revolutionize Research Support Workflows

Conference: EARMA Conference Odense 2024

Format: Oral 30 Minutes

Topic: IT Systems and tools supporting RMA now and in the future

Abstract

In an era where data is abundant and time is scarce, the research support community faces the challenge of efficiently managing and interpreting vast amounts of information. This abstract presents a transformative approach to overcoming these challenges by integrating Large Language Models (LLMs) and other AI methodologies into the research support ecosystem. Our data team in the Research and Innovation Office (RIO) at the University of Southern Denmark (SDU) has been at the forefront of this innovation, experimenting with both online services and locally run AI chatbots. We combine AI and data with human expertise to best support our colleagues' work and relieve them of tedious tasks.

LLMs offer a wide range of applications that can significantly improve research support workflows. One of the most promising is the ability to summarize large text corpora extracted from PDF documents. This enables quick insights into extensive literature, thereby aiding literature reviews and meta-analyses. Another application is assistance in drafting research grant applications: LLMs can generate coherent, well-structured drafts, which experts can then refine, saving valuable time and resources.
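To make the PDF summarization idea concrete, the minimal sketch below extracts text from a PDF and asks an LLM for a summary. The specific libraries (pypdf, the OpenAI client) and the model name are illustrative assumptions, not a description of the pipeline used at RIO.

```python
# Illustrative sketch only: extract text from a PDF and ask an LLM for a summary.
# pypdf and the OpenAI chat API are assumptions, not necessarily the tools used at RIO.
from pypdf import PdfReader
from openai import OpenAI


def summarize_pdf(path: str, model: str = "gpt-4o-mini") -> str:
    # Collect the plain text of every page in the document.
    reader = PdfReader(path)
    text = "\n".join(page.extract_text() or "" for page in reader.pages)

    # Ask the model for a short, structured summary of the extracted text.
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": "Summarize research documents in five bullet points."},
            {"role": "user", "content": text[:100_000]},  # naive truncation; long PDFs need chunking
        ],
    )
    return response.choices[0].message.content


# Example usage:
# print(summarize_pdf("grant_call.pdf"))
```

In practice, long documents would be split into chunks and summarized in stages, but the basic pattern stays the same.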

In addition, LLMs can provide real-time online advice on a range of topics, from research methods to data analysis techniques. This kind of support can be especially beneficial for researchers who do not have direct access to human experts. Furthermore, vector databases enable the retrieval of reference documents, making the search for relevant literature more efficient and accurate.
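As a minimal sketch of vector-based retrieval, the example below stores a few reference documents in a local vector database and retrieves the most relevant ones for a query. ChromaDB and the sample document texts are illustrative assumptions, not the actual setup or content used in our office.

```python
# Illustrative sketch only: store reference documents in a local vector database
# and retrieve the most relevant ones for a query. ChromaDB and the example
# documents are assumptions, not the configuration used at RIO.
import chromadb

client = chromadb.Client()
collection = client.create_collection(name="reference_docs")

# Add a few reference documents; embeddings are computed by Chroma's default model.
collection.add(
    ids=["doc1", "doc2"],
    documents=[
        "Guidelines for budget justification in grant applications.",
        "Checklist for writing a data management plan.",
    ],
)

# Retrieve the documents most similar to a researcher's question.
results = collection.query(
    query_texts=["How do I justify personnel costs in a proposal budget?"],
    n_results=2,
)
print(results["documents"][0])
```

The retrieved documents can then be passed to an LLM as context, so answers are grounded in the office's own reference material rather than the model's general knowledge.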

Beyond these applications, LLMs have the potential to revolutionize other areas, such as data annotation, hypothesis generation, and even automatic answering of common queries, freeing up human resources for more complex tasks.
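For the data annotation case, one hedged sketch is to use an LLM as a zero-shot annotator for short texts, as below. The label set and model name are illustrative assumptions, not the categories or configuration we actually use.

```python
# Illustrative sketch only: use an LLM as a zero-shot annotator for short texts.
# The label set and model name are assumptions, not the RIO configuration.
from openai import OpenAI

LABELS = ["funding call", "policy document", "project report", "other"]


def annotate(text: str, model: str = "gpt-4o-mini") -> str:
    client = OpenAI()
    response = client.chat.completions.create(
        model=model,
        messages=[
            {
                "role": "system",
                "content": f"Classify the text into exactly one of: {', '.join(LABELS)}. "
                           "Answer with the label only.",
            },
            {"role": "user", "content": text},
        ],
    )
    return response.choices[0].message.content.strip()


# Example usage:
# annotate("Call for proposals: green transition pilot projects, deadline 1 March.")
```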

Our initial experiments in the RIO group at SDU have shown promising results in integrating LLMs into our workflow. These AI tools have not only increased efficiency but also improved the quality of our research support services.

In summary, integrating LLMs into the research support workflow offers a transformative approach to addressing the challenges of the modern research landscape. As AI technology continues to evolve, it is imperative for the research support community to adapt and leverage these tools for a more efficient and effective workflow.

In the upcoming presentation at the EARMA conference, we will take a closer look at concrete use cases to illustrate the practical implementation of Large Language Models in research support workflows. These real-world examples, drawn from our experiences in the RIO group at SDU, will demonstrate the tangible benefits and efficiency gains of integrating AI tools, from customizing your own prompts to creating visuals. From streamlining grant application processes to improving literature search methods, these case studies will give participants a comprehensive understanding of how AI can be used effectively to support research.