6 Research products, page 1 of 1
- Publication . Article . 2018 . Open Access . English
  Authors: Laurent Romary; Charles Riondet
  Publisher: HAL CCSD
  Country: France
  Project: EC | EHRI (654164), EC | PARTHENOS (654119), EC | EHRI (261873)
This article tackles the issue of integrating heterogeneous archival sources into a single data repository, the European Holocaust Research Infrastructure (EHRI) portal, whose aim is to support Holocaust research by providing online access to information about dispersed sources relating to the Holocaust (http://portal.ehri-project.eu). The problem at hand is to combine data coming from a network of archives in order to create an interoperable data space that can be used to search for, retrieve and disseminate content in the context of archival-based research. The scholarly purpose has specific consequences for this task: it assumes that the information made available to the researcher stays as close as possible to the originating source, so that the ensuing analysis can be deemed reliable. In the EHRI network of archives, as already observed in the EU Cendari project, heterogeneity is unavoidable. The EHRI portal brings together descriptions from more than 1,900 institutions, and each archive comes with its own idiosyncrasies, reflecting the way it was set up and has evolved over time. Cataloguing practices also differ, and the degree of digitization ranges from the absence of any digital catalogue to a full-fledged online catalogue with the APIs needed for anyone to query and extract content. This contrasts with the international endeavour to develop and promote standards for the description of archival content as a whole. Nonetheless, in a project like EHRI, standards should play a central role. They are necessary for many tasks related to the integration and exploitation of the aggregated content, namely:
  - comparing the content of the various sources, and thus developing quality-checking processes;
  - defining an integrated repository infrastructure where the content of the various archival sources can be reliably hosted;
  - querying and re-using content in a seamless way;
  - deploying tools developed independently of the specificities of the information sources, for instance to visualise or mine the resulting pool of information.
The central aspect of the work described in this paper is the assessment of the role of the EAD (Encoded Archival Description) standard as the basis for achieving the tasks described above. We have worked out a strategy of defining specific customizations of EAD that can be used at the various stages of integrating heterogeneous sources. In doing so, we have developed a methodology based on a specification and customization method inspired by the extensive experience of the Text Encoding Initiative (TEI) community. In the TEI framework, as we show in section 1, one can model specific subsets or extensions of the TEI guidelines while maintaining both the technical content (XML schemas) and the editorial content (documentation) within a single framework. This work leads us to anticipate that the method we have developed may be of wider interest in similar environments and, we believe, for the future maintenance of the EAD standard.
Finally, this work, successfully tested and implemented in the framework of EHRI [Riondet 2017], can be seen as part of the wider endeavour of European research infrastructures in the humanities, such as CLARIN and DARIAH, to help researchers integrate standards into their scholarly practices. This is why the general workflow studied here has been introduced as a use case in the umbrella infrastructure project PARTHENOS, which aims, among other things, to disseminate information and resources about methodological and technical standards in the humanities.
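The abstract above describes a workflow in which archival descriptions from many institutions are checked against a project-specific customization of EAD. As a purely illustrative sketch (not taken from the article), the following Python snippet shows how such a validation step could look once the customization has been expressed as a RELAX NG schema, using the lxml library; the file names ehri-ead.rng and finding_aid.xml are hypothetical placeholders.

```python
# Illustrative sketch: validate an EAD finding aid against a customized
# RELAX NG schema (for instance, one derived from an ODD-style specification).
# File names are hypothetical placeholders, not EHRI deliverables.
from lxml import etree

def validate_ead(ead_path: str, schema_path: str) -> bool:
    """Return True if the EAD document is valid against the RELAX NG schema."""
    schema = etree.RelaxNG(etree.parse(schema_path))
    document = etree.parse(ead_path)
    if schema.validate(document):
        return True
    # Report every validation problem with its line number for quality checking.
    for error in schema.error_log:
        print(f"line {error.line}: {error.message}")
    return False

if __name__ == "__main__":
    validate_ead("finding_aid.xml", "ehri-ead.rng")
```

Such a check could be run on each incoming finding aid before ingestion, which is one concrete way the quality-checking processes mentioned in the abstract might be automated.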
- Publication . Part of book or chapter of book . 2018 . Open Access . English
  Authors: Federica Maietti; Roberto Di Giulio; Marcello Balzani; Emanuele Piaia; Marco Medici; Federico Ferrari
  Publisher: Springer
  Country: Italy
  Project: EC | INCEPTION (665220)
The ongoing EU-funded project “INCEPTION – Inclusive Cultural Heritage in Europe through 3D semantic modelling” proposes a workflow aimed at achieving efficient 3D digitization methods, post-processing tools for enriched semantic modelling, and web-based solutions and applications that ensure wide access for experts and non-experts alike. Nevertheless, generating high-quality 3D models can still be very time-consuming and expensive, and the outcome of digital reconstructions is frequently provided in formats that are not interoperable and therefore cannot be easily accessed. This challenge is even more crucial for complex architectures and large heritage sites, which involve a large amount of data to be acquired, managed and enriched with metadata. In order to face these challenges and to start solving the issues of large volumes of captured data and time-consuming processes in the production of 3D digital models, an Optimized Data Acquisition Protocol (DAP) has been set up. Its purpose is to guide the digitization of Cultural Heritage while respecting the needs, requirements and specificities of cultural assets, dealing with issues such as time-consuming processes, the limited budget available for 3D documentation, the accuracy of 3D models, the integration of metadata and semantics into the 3D model, and links with multimedia information. The DAP can be followed during the planning and execution of a 3D laser scanner survey of Cultural Heritage, and it applies at architectural, archaeological, urban and site scales.
- Publication . Article . 2017 . Open Access . English
  Authors: Margo Bargheer; Zeki Mustafa Dogan; Wolfram Horstmann; Mike Mertens; Andrea Rapp
  doi: 10.18352/lq.10174
  Publisher: openjournals.nl
  Country: Germany
  Project: EC | HIRMEOS (731102)
  Peer reviewed

In the light of new digital production and dissemination practices, the scholarly publishing system has seen significant and even disruptive changes, especially in STM (science, technology and medicine) and with regard to the predominant format, the journal article. The digital transformation also holds true for those disciplines that continue to rely on the scholarly monograph as a publication format and a means of reputation building, namely the Humanities and the qualitatively oriented Social Sciences (HSS). In our paper we analyse the reasons why the monograph has not yet reached its full potential in the digital paradigm, especially in the uptake of Open Access and innovative publishing options. We highlight some of the principal underlying factors and suggest how practices that are now more widespread in HSS, but which arose in the Digital Humanities, could help the scholarly monograph realise its rich digital potential.
- Publication . Conference object . 2017 . English
  Authors: Demonet, Marie-Luce
  Publisher: HAL CCSD
  Country: France
  Project: EC | HaS-DARIAH (675570)
Once upon a time, the preservation, study, and digitization of books were justified above all by a social or individual need to read: a utilitarian need, to obtain a qualification or to hold down a job; a need to share ideas; or a hedonistic need derived from the pleasure of reading. Nowadays, the building of corpora and the digitization of collections could replace books with “data”, the consultation of which threatens a close engagement with the linearity of text. I will comment on examples chosen from the library of Le Mans, and from my own collection, to question this apparent devaluing of reading in favour of results derived by means of tools only partly under human control.
- Publication . Other literature type . 2014 . English
  Authors: Uetani, Toshinori; Jimenes, Rémi; Breuil, Sandrine; Fins, Jorge; Demonet, Marie-Luce; Bertrand, Lauranne
  Publisher: HAL CCSD
The BVH project was awarded the 2014 Succeed award. This prize recognizes the successful implementation of a digitisation programme, especially one exploiting the latest technology and the output of research for the digitisation of historical texts. The BVH (Bibliothèques Virtuelles Humanistes: Virtual Humanistic Libraries) is a research programme devoted to the digitization and electronic publication of original source documents from the Renaissance period. Since 2003, its website has published digital facsimiles of selected Early Modern imprints (1450-1650), mainly from regional collections, and transcriptions of French texts of the same period, encoded according to the XML-TEI standard. Particular attention is paid to achieving great accuracy in the bibliographical description with respect to the true states of the originals, and to the closest possible correspondence between the two distinct corpora, facsimile and text, linked by several levels of metadata in the main catalogue. The BVH team works in close collaboration with researchers from the Computer Science Laboratory of Tours (LI-Tours) to develop new technologies in the fields of image processing and pattern recognition. Open-source software for layout analysis and text transcription, AGORA and RETRO, enables the automatic extraction of graphic components (characters and illustrations) from digitized books, and thus the building of specialized databases of iconographic and typographical material. As a member of the TEI consortium, the team actively contributes to the development of a specialised schema for the transcription of Renaissance documents. Each processing step and every component developed at the BVH is also intended for use by the whole digital community, creating a model for the digital library of the future.
- Publication . Article . 2013 . Open Access . English
  Authors: Milica Lajbenšperger; Marija Šegan; Sanja Rajić
  Publisher: Foundation Pro Scientia Publica
In the Republic of Serbia, the idea that the digitization of cultural heritage could be used to popularize modern technologies and put them to use in education was first realized in 2012. One of the results of this project was a user study. In this paper we describe some aspects of the project and present some of the results of the research conducted.