Pre-Trained Language Models and News Media
My research interests lie in exploring the opportunities and challenges of pre-trained language models (PLMs) in the news media domain.
Challenges in Practice
- Monitoring time-series performance degradation (AACL-IJCNLP 2022 & IC2S2 2023 => Journal of Natural Language Processing); see the sketch after this list
- Training data extraction
  - Survey paper (ACL 2023 Workshop)
  - Experiments on Japanese newspapers (INLG 2024)
- Hallucination analysis on domain-specific PLMs (Journal of Natural Language Processing)
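As a concrete illustration of the degradation-monitoring setting, the sketch below tracks a causal PLM's perplexity on time-sliced news snippets; rising perplexity on newer slices is one signal of temporal drift. This is a minimal sketch, not the setup from the papers above: the model name, snippets, and year buckets are illustrative placeholders.

```python
import math
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "gpt2"  # placeholder PLM; swap in the domain-specific model under study

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)
model.eval()

# Toy time-bucketed corpus {year: [news snippets]} -- replace with real articles.
corpus_by_year = {
    2019: ["The central bank kept interest rates unchanged on Tuesday."],
    2022: ["The new variant prompted fresh travel restrictions worldwide."],
}

def perplexity(text: str) -> float:
    """Token-level perplexity of `text` under the PLM."""
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        # Passing the input ids as labels yields the average cross-entropy loss.
        loss = model(**enc, labels=enc["input_ids"]).loss
    return math.exp(loss.item())

# A rising average on newer buckets suggests temporal performance degradation.
for year, texts in sorted(corpus_by_year.items()):
    scores = [perplexity(t) for t in texts]
    print(year, round(sum(scores) / len(scores), 2))
```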
Building and Applying PLMs
- Building domain-specific PLMs (Press release & Journal of Natural Language Processing)
- PLMs for the news industry:
  - Crossword puzzle generation (CIKM 2023)
  - Reading time estimation (BigData 2022 Industrial & Government Track)
  - Multilingual news similarity scoring (NAACL 2022 Workshop); see the sketch after this list
  - Next item recommendation (WSDM 2021 Workshop)
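For the multilingual news similarity task, a minimal baseline sketch (not the workshop system) is to embed a pair of articles with a multilingual sentence encoder and score them with cosine similarity; the encoder name and example texts below are illustrative assumptions.

```python
from sentence_transformers import SentenceTransformer, util

# Any multilingual sentence encoder works here; this one is a common, illustrative choice.
model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

article_en = "The government announced new subsidies for renewable energy."
article_ja = "政府は再生可能エネルギーへの新たな補助金を発表した。"

# Embed both articles and score them with cosine similarity (range roughly [-1, 1]).
embeddings = model.encode([article_en, article_ja], convert_to_tensor=True)
score = util.cos_sim(embeddings[0], embeddings[1]).item()
print(f"similarity: {score:.3f}")
```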
Reference
See Publications for details.