References
Alkomah, Fatimah, and Xiaogang Ma. 2022. “A Literature
Review of Textual Hate Speech Detection Methods and
Datasets.” Information 13 (6): 273. https://doi.org/10.3390/info13060273.
Almeida, Felipe, and Geraldo Xexéo. 2019. “Word
Embeddings: A Survey.” ArXiv,
January. https://www.semanticscholar.org/paper/Word-Embeddings%3A-A-Survey-Almeida-Xex%C3%A9o/e28e81a8cb6655aebb72357538f7b7a360366a29.
Barry, Paul. 2017. Python von Kopf bis Fuß. Translated by
Jørgen W. Lang. Zweite Auflage. Von Kopf bis Fuß. Beijing Boston
Farnham Sebastopol Tokyo: O’Reilly.
Camacho-Collados, Jose, and Mohammad Taher Pilehvar. 2020.
“Embeddings in Natural Language Processing.”
In Proceedings of the 28th International Conference on
Computational Linguistics: Tutorial
Abstracts, 10–15. Barcelona, Spain (Online):
International Committee for Computational Linguistics. https://doi.org/10.18653/v1/2020.coling-tutorials.2.
Castaño-Pulgarín, Sergio Andrés, Natalia Suárez-Betancur, Luz Magnolia
Tilano Vega, and Harvey Mauricio Herrera López. 2021. “Internet,
Social Media and Online Hate Speech. Systematic
Review.” Aggression and Violent Behavior 58 (May):
101608. https://doi.org/10.1016/j.avb.2021.101608.
Chollet, François. 2021. Deep Learning with
Python. Second edition. Shelter Island,
NY: Manning.
Chollet, François, Tomasz Kalinowski, and J. J. Allaire. 2022a. Deep
Learning with R. Second edition. Shelter Island,
NY: Manning.
———. 2022b. Deep Learning with R. Second edition.
Shelter Island, NY: Manning.
Downey, Allen B. 2021. Think Python: systematisch programmieren
lernen mit Python. Translated by Peter Klicman. 1. Auflage.
Heidelberg: O’Reilly.
Gallatin, Kyle, and Chris Albon. 2023. Machine Learning with
Python Cookbook: Practical Solutions from Preprocessing to
Deep Learning. Beijing Boston Farnham Sebastopol
Tokyo: O’Reilly Media.
George, Alexandra. 2022. Python Text Mining: Perform Text
Processing, Word Embedding, Text Classification and Machine
Translation. Delhi: BPB Publications.
Géron, Aurélien. 2023a. Hands-on Machine Learning with
Scikit-Learn, Keras, and
TensorFlow: Concepts, Tools, and Techniques to Build
Intelligent Systems. Third edition. Beijing Boston Farnham
Sebastopol Tokyo: O’Reilly.
———. 2023b. Praxiseinstieg Machine Learning mit Scikit-Learn, Keras
und TensorFlow: Konzepte, Tools und Techniken für intelligente
Systeme. Translated by Kristian Rother and Thomas Demmig. 3.,
aktualisierte und erweiterte Auflage. Heidelberg:
O’Reilly.
Hunt, Andrew, and David Thomas. 2000. The Pragmatic Programmer:
From Journeyman to Master. Reading, Mass.:
Addison-Wesley.
Hvitfeldt, Emil, and Julia Silge. 2021. Supervised Machine
Learning for Text Analysis in R.
1st ed. Boca Raton: Chapman and Hall/CRC. https://doi.org/10.1201/9781003093459.
Inden, Michael. 2023. Python lernen: kurz & gut. 1.
Auflage. O’Reillys Taschenbibliothek. Heidelberg:
O’Reilly.
James, Gareth, Daniela Witten, Trevor Hastie, and Robert Tibshirani.
2021. An Introduction to Statistical Learning: With Applications in
R. Second edition. Springer Texts in Statistics.
New York: Springer. https://link.springer.com/book/10.1007/978-1-0716-1418-1.
König, Tim, Wolf J. Schünemann, Alexander Brand, Julian Freyberg, and
Michael Gertz. 2022. “The EPINetz Twitter Politicians
Dataset 2021. A New Resource for the Study of the German
Twittersphere and Its Application for the 2021 Federal
Elections.” Politische Vierteljahresschrift 63 (3):
529–47. https://doi.org/10.1007/s11615-022-00405-7.
Kulkarni, Akshay, and Adarsha Shivananda. 2021. Natural Language
Processing Recipes: Unlocking Text Data with Machine Learning and Deep
Learning Using Python. Second edition. New
York: Apress.
Kurz, A. Solomon. 2021. Statistical Rethinking with Brms, Ggplot2,
and the Tidyverse: Second Edition. https://bookdown.org/content/4857/.
Lex, Alexander, Nils Gehlenborg, Hendrik Strobelt, Romain Vuillemot, and
Hanspeter Pfister. 2014. “UpSet:
Visualization of Intersecting Sets.” IEEE
Transactions on Visualization and Computer Graphics 20 (12):
1983–92. https://doi.org/10.1109/TVCG.2014.2346248.
Liu, Zhiyuan, Yankai Lin, and Maosong Sun, eds. 2023. Representation
Learning for Natural Language Processing.
Singapore: Springer Nature Singapore. https://doi.org/10.1007/978-981-99-1600-9.
McElreath, Richard. 2020. Statistical Rethinking: A
Bayesian Course with Examples in R and
Stan. 2nd ed. CRC Texts in Statistical
Science. Boca Raton: Taylor and Francis, CRC
Press.
Mikolov, Tomas, Kai Chen, Greg Corrado, and Jeffrey Dean. 2013.
“Efficient Estimation of Word
Representations in Vector Space.” September
6, 2013. https://doi.org/10.48550/arXiv.1301.3781.
Pennington, Jeffrey, Richard Socher, and Christopher Manning. 2014.
“GloVe: Global Vectors for Word
Representation.” In Proceedings of the 2014 Conference on
Empirical Methods in Natural Language Processing
(EMNLP), 1532–43. Doha, Qatar:
Association for Computational Linguistics. https://doi.org/10.3115/v1/D14-1162.
Pilehvar, Mohammad Taher, and Jose Camacho-Collados. 2021.
Embeddings in Natural Language Processing:
Theory and Advances in Vector
Representations of Meaning. Synthesis
Lectures on Human Language Technologies.
Cham: Springer International Publishing. https://doi.org/10.1007/978-3-031-02177-0.
Remus, Robert, Uwe Quasthoff, and Gerhard Heyer. 2010.
“SentiWS - a Publicly Available German-Language
Resource for Sentiment Analysis.” In Proceedings of the 7th
International Conference on Language Resources and Evaluation (LREC’10),
1168–71.
Rhys, Hefin. 2020. Machine Learning with
R, the Tidyverse, and mlr. Shelter Island,
NY: Manning.
Risch, Julian, Anke Stoll, Lena Wilms, and Michael Wiegand. 2021.
“Overview of the GermEval 2021 Shared Task on the
Identification of Toxic, Engaging, and Fact-Claiming Comments.”
In Proceedings of the GermEval 2021 Shared Task on the
Identification of Toxic, Engaging, and Fact-Claiming Comments,
1–12. Duesseldorf, Germany: Association for
Computational Linguistics. https://aclanthology.org/2021.germeval-1.1.
Rothman, Denis. 2022. Transformers for Natural Language Processing:
Build, Train, and Fine-Tune Deep Neural Network Architectures for
NLP with Python, Hugging Face,
and OpenAI’s GPT-3, ChatGPT, and
GPT-4. Second edition. Expert Insight.
Birmingham Mumbai: Packt.
Shannon, C. E. 1948. “A Mathematical Theory of
Communication.” Bell System Technical Journal 27 (3):
379–423. https://doi.org/10.1002/j.1538-7305.1948.tb01338.x.
Siegel, Melanie, and Melpomeni Alexa. 2020. Sentiment-Analyse
deutschsprachiger Meinungsäußerungen: Grundlagen, Methoden und
praktische Umsetzung. Wiesbaden: Springer
Fachmedien Wiesbaden. https://doi.org/10.1007/978-3-658-29699-5.
Silge, Julia, and David Robinson. 2017. Text Mining with
R: A Tidy Approach. First edition. Beijing;
Boston: O’Reilly. https://www.tidytextmining.com/.
Stone, James V. 2019. “Information Theory: A
Tutorial Introduction.” June 13, 2019. http://arxiv.org/abs/1802.05968.
Tunstall, Lewis, Leandro von Werra, Thomas Wolf, and Aurélien Géron.
2022. Natural Language Processing with Transformers: Building
Language Applications with Hugging Face. Revised edition.
Sebastopol: O’Reilly.
Vaswani, Ashish, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion
Jones, Aidan N. Gomez, Lukasz Kaiser, and Illia Polosukhin. 2023.
“Attention Is All You Need.” August 1, 2023.
https://doi.org/10.48550/arXiv.1706.03762.
Wickham, Hadley, and Garrett Grolemund. 2016. R for Data
Science: Visualize, Model,
Transform, Tidy, and Import
Data. O’Reilly Media. https://r4ds.had.co.nz/index.html.
Wiegand, Michael. 2019a. “GermEval-2018 Corpus
(DE).” heiDATA. https://doi.org/10.11588/data/0B5VML.
———. 2019c. “GermEval-2018-Data-master.” In
GermEval-2018 Corpus (DE).
heiDATA. https://doi.org/10.11588/data/0B5VML/XIUWJ7.
Yamada, Ikuya, and Hiroyuki Shindo. 2019. “Neural Attentive
Bag-of-Entities Model for Text Classification.” In
Proceedings of the 23rd SIGNLL Conference on
Computational Natural Language Learning, 563–73. Association
for Computational Linguistics.