Enabling digital scholarship – the University Library, your research partner
At the University of Basel, numerous projects and infrastructures work with digital methods in the humanities (e.g. Showcases). Promoting this field is part of the university’s strategy, and the University Library actively participates in this process by supporting researchers with the design, development and implementation of digital projects.
For information and advice on your research project and/or possible collaboration with the University Library, please don’t hesitate to contact us. We look forward to working with you.
Contact us if you have questions about, among other things:
- Digital research
- Digitisation and digital reproductions
- Digital humanities (methods, tools, projects, etc.)
- Opportunities for collaboration and research partnerships
Digitisation for research
The University Library has established processes that enable flexible digitisation, tailored to meet the needs of research.
We have set ourselves the goal of supplying research projects, even those with larger corpora, with the necessary digitised sources from our holdings within 6 to 12 months. We will work out the digitisation requirements directly with you, and if the standard services fall short, we will develop the necessary interfaces so that you can process the data seamlessly in your project.
Are you an academic at the University of Basel and wish to make the results of your research more visible?
We will support you with various options for making your publication openly accessible.
Research data management
We would be happy to advise you on the following topics:
- Research data
- Data management plans
- Publication in compliance with FAIR principles
- Data organisation and data archiving
- and many more
Are you interested in collaborating with the University Library? Would you like information about or support with working with digital methods in your field? Our subject librarians will be happy to help you. You will find the contact details on the respective subject pages.
For interdisciplinary enquiries, the Digital Humanities team is there for you.
The Digital Humanities Lab is an interdisciplinary institution with over 20 years of practical experience in digital humanities research. The task of the DHLab is to coordinate and promote research, teaching and infrastructure for digitisation in the humanities and social sciences.
The Research Infrastructure Service unit (RISE) supports researchers in the humanities and social sciences at the University of Basel with questions related to the design of computer-based research, the production and analysis of data, usage-oriented presentation as well as the sustainable and open availability of data.
The Center for Data Analytics (CeDA) supports selected research projects in accessing and making data available, as well as collaborating on data analysis. The service is aimed at all academic disciplines and operates according to FAIR principles in the handling of data.
The Data Service Center for the Humanities (DaSCH) is a SNSF-funded national data infrastructure and competence centre for the long-term use of digital data. As such, the DaSCH supports researchers in the humanities in their work with digital research methods.
Working with digital collections / Digital Humanities
The University Library’s digitised sources can either be used directly on the various publication platforms or downloaded as PDFs for further editing. Various open tools also allow further digital processing. If specific requirements call for adjustments or new development, we will work together with the Digital Humanities Lab of the University of Basel or with other partners.
In the broadest sense, the digital humanities encompass not only a field of research in its own right, but also all work in which digital methods, procedures and information technologies are applied to research in the humanities. The website www.whatisdigitalhumanities.com illustrates how difficult the digital humanities are to define clearly.
Although the University Library has been investing considerable resources in digitising its historical collection, only about one per cent of all material has been digitised and made available on the platforms so far. To find all the holdings, we therefore highly recommend a search in the catalogues, where you will also see whether a work is already available digitally.
Very often, the sources required for working on a specific research question are distributed across different platforms. In addition to downloading the sources locally, you can now also combine them into your own data set via IIIF (International Image Interoperability Framework). The Manifest Editor, for example, is suitable for this purpose. It allows multiple works, or only parts thereof, to be saved in a so-called IIIF manifest. Such a manifest can be published online, which allows the data set to be linked directly in the display and shared with others.
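To make this concrete, the sketch below shows in Python what a minimal IIIF Presentation 3.0 manifest looks like and how its canvases can be read programmatically. All identifiers and labels are placeholders, not real platform URLs; a real manifest would be fetched from one of the publication platforms.

```python
import json

# A minimal IIIF Presentation 3.0 manifest combining two canvases.
# The structure follows the IIIF spec; the URLs and labels below
# are illustrative placeholders only.
manifest_json = """
{
  "@context": "http://iiif.io/api/presentation/3/context.json",
  "id": "https://example.org/manifest",
  "type": "Manifest",
  "label": {"en": ["My research corpus"]},
  "items": [
    {"id": "https://example.org/canvas/1", "type": "Canvas",
     "label": {"en": ["Title page"]}},
    {"id": "https://example.org/canvas/2", "type": "Canvas",
     "label": {"en": ["Folio 1r"]}}
  ]
}
"""

manifest = json.loads(manifest_json)

def canvas_labels(manifest):
    """Return the English labels of all canvases in a manifest."""
    return [label
            for canvas in manifest.get("items", [])
            for label in canvas.get("label", {}).get("en", [])]

print(canvas_labels(manifest))  # → ['Title page', 'Folio 1r']
```

A manifest assembled this way, e.g. with the Manifest Editor, can mix canvases from different institutions, which is what makes IIIF useful for building cross-platform research corpora.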
Zentralbibliothek Zürich IIIF tutorials
The ZB-Lab of the Zentralbibliothek Zürich has created short video tutorials on the topic to facilitate getting started with IIIF. The videos are collected in the following playlist: https://youtube.com/playlist?list=PLxDekeBVQtVJeRqoTgsif7fJki2X96O-1
We guarantee the sustainability of our digital holdings. We would be happy to support you with regard to data management and the archiving of research data. See also the Basel Long-term Digital Archive project.
DH methods and tools
The digital humanities employ a variety of tools and methods in research. Some of them are presented and linked here; the list is by no means complete.
Do you have questions about the individual applications and methods? Or do you have a tool that you would like to see listed here? Please contact the staff of the Digital Services or, if specifically noted, the respective contact person.
Text and Data Mining
The term "Text and Data Mining" (TDM) covers algorithm-based analytical processes for discovering structures of meaning in text and other data. Both freely accessible data and data for which the necessary usage rights have been obtained are eligible.
In addition, members of the University of Basel can also make use of full-text data sets licensed by the University Library. For legal and technical reasons, HTML and PDF files on the publishers’ servers may not be downloaded programmatically in the quantities needed for TDM. Many publishers instead provide access to JSON or XML data via API for TDM purposes. The use of these data is subject to certain restrictions; in particular, copyright-protected texts may not be made freely accessible, or only in part (as snippets).
Additional information may be found on individual publishers’ webpages (e.g. Elsevier, Springer Nature, Wiley) or at CrossRef Text and Data Mining, a free, multi-publisher service (including AIP, APA, APS, Elsevier, HighWire Press, Springer, Taylor & Francis, Walter de Gruyter, Wiley). The corresponding agreements are made directly between the user and the publisher or with CrossRef.
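Once full texts have been obtained via such an API, a typical first TDM step is simple term counting. The Python sketch below illustrates this on a tiny made-up corpus; in practice the documents would be JSON or XML records fetched under licence, not hard-coded strings.

```python
from collections import Counter
import re

# Toy corpus standing in for texts obtained via a publisher API
# (contents are invented for illustration).
documents = [
    "Digital methods open new questions for the humanities.",
    "The humanities increasingly use digital methods and tools.",
]

def term_frequencies(docs):
    """Lowercase each document, tokenise on word characters, count terms."""
    tokens = []
    for doc in docs:
        tokens.extend(re.findall(r"\w+", doc.lower()))
    return Counter(tokens)

freq = term_frequencies(documents)
print(freq.most_common(3))
```

Real TDM pipelines add stop-word filtering, lemmatisation and statistical modelling on top of this basic counting step.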
Swissdox@LiRI (Linguistic Research Infrastructure): the University Library supports this service. For further information and data requests, see here.
TEI - Text Encoding Initiative
The Text Encoding Initiative (TEI) defines an XML-based document format that has become the standard for marking up and encoding texts. The TEI Guidelines are widely used in all areas of the humanities.
The TEI was developed by a consortium of the same name, which continues to maintain the standard.
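As a brief illustration, the sketch below shows a minimal TEI document and how it can be processed with Python's standard XML tools. The document structure follows the TEI Guidelines; the content itself is an invented placeholder.

```python
import xml.etree.ElementTree as ET

# A minimal TEI document: a header with mandatory metadata sections,
# followed by the encoded text. Content is a made-up example.
tei = """<TEI xmlns="http://www.tei-c.org/ns/1.0">
  <teiHeader>
    <fileDesc>
      <titleStmt><title>Sample edition</title></titleStmt>
      <publicationStmt><p>Unpublished example</p></publicationStmt>
      <sourceDesc><p>Born digital</p></sourceDesc>
    </fileDesc>
  </teiHeader>
  <text>
    <body>
      <p>First paragraph of the encoded <hi rend="italic">text</hi>.</p>
    </body>
  </text>
</TEI>"""

# TEI elements live in the TEI namespace, so queries must declare it.
NS = {"tei": "http://www.tei-c.org/ns/1.0"}
root = ET.fromstring(tei)
title = root.find(".//tei:titleStmt/tei:title", NS).text
print(title)  # → Sample edition
```

Because TEI is plain XML, any XML toolchain (XSLT, XPath, XQuery) can be used on TEI-encoded editions in the same way.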
MEI - Music Encoding Initiative
The MEI is a community-driven, open source project to develop machine-readable encoding of music notation.
A selection of tools and toolboxes for text and data mining. Further recommendations are gladly accepted.
Transkribus is a program for text recognition and transcription of manuscripts and old prints.
Voyant Tools is a web-based open source program for text analysis. It allows you to work with texts and/or (self-created) text corpora in different formats.
Google Ngram Viewer
Google Ngram Viewer can be used to display the frequency and distribution of words and phrases from Google Books text corpora. However, it is not possible to combine your own texts into a corpus and work with it.
The Text Analysis Portal for Research (TAPoR) provides an overview of, and curated lists of, tools for all aspects of text analysis.
Here you will find links to various tools and applications that can be used for working with and creating databases and corpora.
nodegoat is a tool for creating data sets based on your own data models; it enables relational, spatial and chronological analysis and visualisation of data.
SQLite is a public-domain software library that implements a relational database engine. It is the most widely deployed database engine in the world.
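SQLite needs no server and ships with Python, which makes it convenient for small research data sets. The sketch below builds an in-memory database and runs a relational query; the table and its contents are invented for illustration.

```python
import sqlite3

# In-memory database as a stand-in for a small research data set
# (table name, columns and rows are illustrative only).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE letters (sender TEXT, year INTEGER)")
con.executemany("INSERT INTO letters VALUES (?, ?)",
                [("Bernoulli", 1710), ("Euler", 1740), ("Euler", 1755)])

# A relational query: how many letters per sender?
rows = con.execute(
    "SELECT sender, COUNT(*) FROM letters GROUP BY sender ORDER BY sender"
).fetchall()
print(rows)  # → [('Bernoulli', 1), ('Euler', 2)]
con.close()
```

Replacing `":memory:"` with a file path persists the database as a single portable file, which is one reason SQLite is popular for sharing and archiving structured research data.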
OpenRefine is an open source program for data cleansing and conversion.
The Digital Research Infrastructure for the Arts and Humanities (DARIAH-DE) provides a large collection of services and tools for research in the humanities.
SSH Open Marketplace
The Social Sciences and Humanities (SSH) Open Marketplace offers a comprehensive collection of resources for researchers in the humanities and social sciences: tools, services, training materials, workflows and data sets.
The Programming Historian
The Programming Historian offers a large collection of peer-reviewed beginner-friendly tutorials and self-study courses on digital tools, techniques and workflows for researchers in the humanities.