Citations
Ashton, N. (2013, October 21). Machine readability: Know your data formats. School of Data. https://schoolofdata.org/2013/10/21/know-your-data-formats/
Xiang, W., et al. (2024). Parsing and encoding interactive phrase structure for implicit discourse relation recognition. Neural Computing and Applications, 36, 13783-13797. https://doi.org/10.1007/s00521-024-09709-8
Holtzman, A., Buys, J., Du, L., Forbes, M., & Choi, Y. (2020). The Curious Case of Neural Text Degeneration. In International Conference on Learning Representations. https://doi.org/10.48550/arXiv.1904.09751
Fan, A., Lewis, M., & Dauphin, Y. (2018). Hierarchical neural story generation. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (pp. 889-898). Association for Computational Linguistics. https://aclanthology.org/P18-1082/
Welleck, S., Kulikov, I., Kim, J., Pang, R. Y., & Cho, K. (2020). Consistency of a recurrent language model with respect to incomplete decoding. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (pp. 5553-5568). https://doi.org/10.18653/v1/2020.emnlp-main.448
Meister, C., & Cotterell, R. (2021). Language model evaluation beyond perplexity. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics (pp. 5328-5339). https://doi.org/10.18653/v1/2021.acl-long.414
Zhang, H., Xu, J., & Wang, J. (2019). Pretraining-based natural language generation for text summarization. In Proceedings of the 23rd Conference on Computational Natural Language Learning (pp. 789-797). https://doi.org/10.48550/arXiv.1902.09243
Vijayakumar, A. K., Cogswell, M., Selvaraju, R. R., Sun, Q., Lee, S., Crandall, D., & Batra, D. (2018). Diverse beam search: Decoding diverse solutions from neural sequence models. In Proceedings of the AAAI Conference on Artificial Intelligence. https://doi.org/10.48550/arXiv.1610.02424
Keskar, N. S., McCann, B., Varshney, L. R., Xiong, C., & Socher, R. (2019). CTRL: A conditional transformer language model for controllable generation. arXiv preprint arXiv:1909.05858. https://doi.org/10.48550/arXiv.1909.05858
Merity, S. (2019). Single headed attention RNN: Stop thinking with your head. arXiv preprint arXiv:1911.11423. https://doi.org/10.48550/arXiv.1911.11423
Dathathri, S., Madotto, A., Lan, J., Hung, J., Frank, E., Molino, P., Yosinski, J., & Liu, R. (2020). Plug and play language models: A simple approach to controlled text generation. In International Conference on Learning Representations. https://doi.org/10.48550/arXiv.1912.02164
Hugging Face. (2024). Text generation strategies. Hugging Face Documentation. https://huggingface.co/docs/transformers/main/en/generation_strategies
Wolf, T., Chaumond, J., Debut, L., Sanh, V., Delangue, C., Moi, A., & Rush, A. M. (2020). Transformers: State-of-the-art natural language processing. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations (pp. 38-45). https://doi.org/10.18653/v1/2020.emnlp-demos.6