Data visualisation covers the different technologies and tools that researchers can use to visualise their data and enhance the communication of their findings.
This presentation outlines the aim and scope of the Historical Farm and People Registry project, explains the development process and the problems encountered along the way, and demonstrates a use case for the ‘final’ product.
This lesson demonstrates how to build a basic interactive web application using Shiny, a library (a set of additional functions) for the programming language R. In the lesson, you will design and implement a simple application consisting of a slider that allows a user to select a date range; the selection then triggers R code that displays the corresponding points on an interactive map.
This lesson from the Programming Historian introduces the basic use of Map Warper for historical maps. It guides you from upload to export, demonstrating methods for georeferencing and producing visualisations.
This three-day international training school in Knowledge Extraction from Text from the CLS Infra project offered a crash course in how to “Dig for Gold” in a corpus of texts. From Stylometry to Natural Language Processing, learners were able to follow along using ‘plug and play’ tools, while also getting a brief introduction to Python and R.
This lesson is the second of a two-part series focusing on regression analysis. It provides an overview of logistic regression, shows how to build a logistic regression model in Python with scikit-learn, and discusses how to interpret the results of such an analysis.
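By way of illustration only (this is not the lesson's own code, and the data are invented), a minimal scikit-learn sketch of fitting a binary logistic regression and reading its coefficients as odds ratios might look like this:

```python
# Minimal, illustrative sketch only: fit a binary logistic regression with
# scikit-learn on a tiny invented dataset and inspect the coefficients.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical data: one numeric feature and a binary label (0/1).
X = np.array([[0.2], [0.5], [1.1], [1.8], [2.3], [2.9]])
y = np.array([0, 0, 0, 1, 1, 1])

clf = LogisticRegression()
clf.fit(X, y)

# Coefficients are on the log-odds scale; exponentiating gives odds ratios.
print("Log-odds coefficient:", clf.coef_[0][0])
print("Odds ratio:", np.exp(clf.coef_[0][0]))
print("Predicted probabilities:", clf.predict_proba(X)[:, 1])
```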
This lesson is the first of a two-part series focusing on an indispensable set of data analysis methods: logistic and linear regression. It provides an overview of linear regression and walks through running both algorithms in Python (using scikit-learn). The lesson also discusses how to interpret the results of a regression model and some common pitfalls to avoid.
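To give a rough taste of what this looks like in practice (a minimal sketch with invented numbers, not the lesson's own example), fitting a linear regression with scikit-learn takes only a few lines:

```python
# Minimal, illustrative sketch only: fit a simple linear regression with
# scikit-learn on a tiny invented dataset and report the fitted line.
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical data: a single numeric feature and a numeric target.
X = np.array([[1900], [1910], [1920], [1930], [1940]])
y = np.array([250.0, 275.0, 310.0, 330.0, 360.0])

model = LinearRegression()
model.fit(X, y)

print("Coefficient:", model.coef_[0])      # change in target per unit of X
print("Intercept:", model.intercept_)
print("R^2 on the training data:", model.score(X, y))
```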
Since their beginnings in the 17th century, newspapers have recorded billions of events, stories and personal names daily, in almost every language and country. This course from dariahTeach provides an introduction to digitised historical newspaper analysis, incorporating methods of Natural Language Processing for discovering, exploiting and visualising newspapers.
This course from dariahTeach introduces learners to the theoretical and practical foundations of analysing socio-cultural objects with Python, combining theoretical grounding with hands-on case studies. Students will work through several research use cases using basic machine learning, employ network analysis to split a small community network into groups and clusters (a minimal sketch of this step follows below), and finally learn more about visualisation and image analysis.
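To give a flavour of the network-analysis step (a minimal sketch assuming the networkx library, which the course description itself does not name), community detection on a small built-in social network could look like this:

```python
# Minimal, illustrative sketch only: split a small social network into
# communities using networkx's greedy modularity algorithm.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Zachary's karate club graph: a classic small community network that
# ships with networkx.
G = nx.karate_club_graph()

communities = greedy_modularity_communities(G)
for i, group in enumerate(communities):
    print(f"Community {i}: {sorted(group)}")
```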
The aim of this virtual course is to offer basic knowledge and skills in programming in Python. The target audience is undergraduate and graduate students in the Humanities and Social Sciences who want to acquire hands-on knowledge and skills for working with textual or quantitative data in language and humanities research.
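As an example of the kind of first exercise such a course typically includes (a minimal sketch, not taken from the course materials), counting word frequencies in a short text needs nothing beyond Python's standard library:

```python
# Minimal, illustrative sketch only: count word frequencies in a short text
# using only Python's standard library.
import re
from collections import Counter

text = "The quick brown fox jumps over the lazy dog. The dog sleeps."
tokens = re.findall(r"[a-z]+", text.lower())  # very simple tokenisation

counts = Counter(tokens)
for word, freq in counts.most_common(5):
    print(word, freq)
```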
Tableau is a powerful digital tool for data analysis that can help with mapping and interrogating datasets. This short guide focuses on an aspect of data analysis using mapping that has particular application to Holocaust and refugee studies.
EHRI (European Holocaust Research Infrastructure) supports the use of digital tools that can assist in the research of Holocaust and refugee related topics. In a continued effort to make these tools as accessible as possible, so that researchers with no experience of digital tools will consider new ways of using their data, this GitHub-based lesson showcases the use of entity-matching tools for working with geographic data.
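To illustrate the general idea only (the lesson uses dedicated entity-match tools, which are not reproduced here; this sketch falls back on Python's difflib and invented place names), fuzzy matching of variant spellings against a small gazetteer could look like this:

```python
# Rough stand-in sketch only: fuzzy-match variant place-name spellings
# against a small invented gazetteer using difflib from the standard library.
import difflib

gazetteer = ["Amsterdam", "Rotterdam", "The Hague", "Utrecht"]
records = ["Amsterdamm", "Roterdam", "Den Haag"]

for name in records:
    match = difflib.get_close_matches(name, gazetteer, n=1, cutoff=0.5)
    print(name, "->", match[0] if match else "no match")
```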
Data is now an indispensable part of investigative work and storytelling for journalists and newsrooms. Computational methods and artificial intelligence are making their way into newsrooms more than ever before, promising new opportunities for journalists as well as new challenges. This talk provides an overview of how data and artificial intelligence can be used in the journalism workflow, investigative reporting and storytelling.