Title: Enhancing Scientific Title Generation via Instruction-Guided Extraction and Denoising Autoencoders

Conference: ACIIDS 2026

Tags: denoising autoencoder, keyword extraction, natural language generation, title generation, Transformer models

Abstract: We propose a novel framework for abstractive title generation from scientific abstracts by integrating cross-model keyword augmentation with a denoising autoencoder (DAE) auxiliary objective. In our approach, T5 is employed to extract salient keywords that guide BART’s title generation, ensuring improved lexical and semantic alignment. To further enhance robustness, a DAE auxiliary objective is incorporated into BART’s fine-tuning, refining abstract representations by mitigating noise and emphasizing core semantic concepts. This joint mechanism strengthens the interaction between keyword guidance and generative modeling, enabling more concise, informative, and contextually accurate titles. Experimental results demonstrate that the keyword-augmented, DAE-enhanced framework consistently outperforms existing baselines, offering a robust and interpretable approach to automatic title generation in the scientific domain.
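The abstract describes two core mechanisms: corrupting the input text for a denoising-autoencoder objective, and prepending extracted keywords to the abstract before generation. As a minimal illustrative sketch only (the function names, prompt format, and noising probabilities below are assumptions, not the authors' implementation), these two steps might look like:

```python
import random


def add_noise(tokens, drop_prob=0.1, mask_prob=0.15, mask_token="<mask>", seed=0):
    """Corrupt a token sequence for a DAE-style objective: randomly delete
    some tokens and replace others with a mask token. A model trained to
    reconstruct the original sequence from this noisy input learns to
    focus on core semantic content."""
    rng = random.Random(seed)
    noisy = []
    for tok in tokens:
        r = rng.random()
        if r < drop_prob:
            continue                  # token deletion
        elif r < drop_prob + mask_prob:
            noisy.append(mask_token)  # token masking
        else:
            noisy.append(tok)         # token kept unchanged
    return noisy


def build_input(keywords, abstract):
    """Prepend extracted keywords to the abstract so the generator can
    condition on them (a hypothetical prompt format)."""
    return "keywords: " + ", ".join(keywords) + " | abstract: " + abstract


# Example usage on a toy abstract.
tokens = "we propose a framework for abstractive title generation".split()
noisy = add_noise(tokens, seed=42)
model_input = build_input(["title generation", "denoising autoencoder"],
                          " ".join(tokens))
```

In practice the noisy sequence would be fed to the encoder and the clean sequence used as the reconstruction target, alongside the keyword-conditioned title-generation loss.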
Copyright © 2002 – 2026 EasyChair
