
8 - ChatGPT for Bibliometrics: Potential Applications and Limitations

Published online by Cambridge University Press:  13 September 2025

Paul Gooding, University of Glasgow
Melissa Terras, University of Edinburgh

Summary

Introduction

Large language models: Transforming artificial intelligence

Since the release of BERT (Bidirectional Encoder Representations from Transformers) by Google in 2018 and GPT-3 by OpenAI in 2020, large language models (LLMs) have revolutionised the fields of artificial intelligence (AI) and natural language processing. BERT pioneered the use of bidirectional transformers, which interpret each word by considering context from both sides, enabling a better grasp of natural language (Devlin et al., 2019). GPT-3 advanced this technology further with 175 billion parameters, providing unprecedented capabilities in text generation and comprehension (Frantar et al., 2023). These models are trained on vast amounts of textual data using deep learning techniques that capture the complex patterns and structures of human language. When analysing text sequences, LLMs use the transformer architecture, which allows the model to attend to different parts of the text simultaneously, improving the coherence and relevance of the generated responses. This capability underpins generative AI by helping it to create contextually appropriate content.

The transformer architecture was introduced in the seminal paper ‘Attention is All You Need’ (Vaswani et al., 2017). It demonstrated how self-attention mechanisms could outperform traditional recurrent neural networks in natural language processing tasks. This significantly improved the accuracy and efficiency of language models and laid the groundwork for subsequent developments.
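The self-attention mechanism described above can be illustrated with a minimal sketch of the scaled dot-product attention from Vaswani et al. (2017). This is a simplified single-head illustration in NumPy, not the implementation used by BERT or GPT-3; the toy inputs and dimensions are invented for demonstration.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention (Vaswani et al., 2017), single head.

    Q, K: arrays of shape (seq_len, d_k); V: (seq_len, d_v).
    Each output position is a weighted average of all value vectors,
    so every token 'attends' to every other token simultaneously.
    """
    d_k = Q.shape[-1]
    # Pairwise query-key similarity, scaled to stabilise the softmax
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over each row: attention weights sum to 1 per query
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Blend value vectors by attention weight
    return weights @ V

# Toy example: a 'sentence' of 3 tokens with 4-dimensional embeddings
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V
print(out.shape)  # (3, 4)
```

In a full transformer, Q, K and V are separate learned projections of the input and many such heads run in parallel; this sketch shows only the core operation that lets the model weigh all parts of a sequence at once rather than processing it token by token as a recurrent network would.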

Information

Type: Chapter
In: Library Catalogues as Data: Research, Practice and Usage, pp. 145-166
Publisher: Facet
Print publication year: 2025

