If the digital scholarly edition (DSE) is ever to replace the print scholarly edition it must be made truly interoperable, so that it can be easily secured, moved, published, aggregated, distributed and sold. Current DSEs are customised for particular projects and must be maintained by their creators, and their contents are not easily reusable by others. However, digital editions can be made truly interoperable by writing them directly for the Web, rather than first in XML and then converting them to HTML. For this to work, some changes to the organisation of the software and data of a DSE are needed. Instead of dividing the software into two parts that run on the server and the client, all software can be moved to the client. In this way a DSE can become portable, durable and directly usable in any web browser. Instead of XML, the textual data can use a simplified form of HTML consisting of only two elements, <p> and <span>, controlled and customised by a standard CSS stylesheet. The current practice of encoding alternatives such as variants can be replaced by versions and layers: versions are complete texts written by the author, and layers are notional transcripts of local changes ordered chronologically. In this way textual data can express most of the information formerly specified by the complex TEI-XML Guidelines (and the rest via other technologies), reorganising it in a form that allows easy comparison, editing, searching and textual analysis using standard software tools.
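To make the two-element idea concrete, here is a minimal sketch. The abstract's element names were lost in extraction; <p> (block) and <span> (inline) are assumed here as the natural minimal pair. The checker simply verifies that a document stays inside that subset, leaving all presentation to a CSS stylesheet:

```python
# Minimal sketch (not the paper's code): check that a document uses only
# the assumed two-element HTML subset -- <p> for blocks, <span> for inline
# spans -- with all styling delegated to CSS classes.
from html.parser import HTMLParser

ALLOWED = {"p", "span"}

class SubsetChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        if tag not in ALLOWED:
            self.violations.append(tag)

doc = '<p class="line">One <span class="del">struck-out</span> word</p>'
checker = SubsetChecker()
checker.feed(doc)
print("within subset" if not checker.violations else checker.violations)
```

Because only class attributes distinguish one kind of span from another, a project's customisation lives in its stylesheet rather than in a bespoke schema, which is what makes the format directly usable in any browser.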
Recent proposals for creating digital scholarly editions (DSEs) through the crowdsourcing of transcriptions and collaborative scholarship, for the establishment of national repositories of digital humanities data, and for the referencing, sharing, and storage of DSEs, have underlined the need for greater data interoperability. The TEI Guidelines have tried to establish standards for encoding transcriptions since 1988. However, because the choice of tags is guided by human interpretation, TEI-XML encoded files are in general not interoperable. One way to fix this problem may be to break down the current all-in-one approach to encoding so that DSEs can be specified instead by a bundle of separate resources that together offer greater interoperability: plain text versions, markup, annotations, and metadata. This would facilitate not only the development of more general software for handling DSEs, but also enable existing programs that already handle these kinds of data to function more efficiently.
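As an illustration of the proposed bundle, consider the sketch below. The field names and sample values are invented for illustration, not taken from the paper, but they show how plain text, standoff markup, annotations and metadata can travel as separate, individually reusable resources:

```python
# Sketch of a DSE as a bundle of separate resources rather than one
# all-in-one encoded file. The layouts are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class EditionBundle:
    text: str                                        # one plain-text version
    markup: list = field(default_factory=list)       # standoff ranges: (start, end, name)
    annotations: list = field(default_factory=list)  # editorial notes: (start, end, note)
    metadata: dict = field(default_factory=dict)     # bibliographic description

bundle = EditionBundle(
    text="Sweet Auburn! loveliest village of the plain",
    markup=[(0, 12, "placeName")],
    annotations=[(0, 12, "Auburn: possibly Lissoy, Co. Westmeath")],
    metadata={"author": "Oliver Goldsmith", "work": "The Deserted Village"},
)

# Because the text is plain, any generic tool can search or compare it;
# the markup and notes travel alongside without being entangled in it.
print(bundle.text[0:12], "->", bundle.markup[0][2])
```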
A Model of Versions and Layers
Schmidt, Desmond
Digital Humanities Quarterly, 01/2019, Volume 13, Issue 3
Journal Article
Peer reviewed
Open access
Our libraries are full of manuscripts, many of them modern. However, the digitisation of these unique documents is currently very expensive. How can we reduce the cost of encoding them in a way that will facilitate their study, annotation, searching, sharing, editing, comparison and reading over the Web? Unlike new documents prepared for the Web, historical manuscripts frequently contain internal variation in the form of erasures, insertions, substitutions and transpositions. Variation is also often expressed externally between copies of one work: in successive print editions, in manuscript copies or successive drafts. Current practice is to prepare separate transcriptions of each physical document and to embed internal variation directly into the transcribed text using complex markup codes. This makes the transcriptions expensive to produce and hard to edit, limits text reuse and requires that transcriptions be first disentangled via customised software for reading, searching, or comparison. An alternative approach, described here, is to separate out the internal variation of each document into notional layers. This is done primarily in order to facilitate the recording of revisions at any one point. The move is, of course, counter-intuitive since these document-wide layers were not intended by the author as texts to be read. But it proves itself in practice by radically simplifying the tasks of editing, searching and comparison. Versions, on the other hand, are higher-level constructs that do represent the state of a text as it was left at some point in time by an author or scribe. By employing layers to record complex revisions, the task of computing differences among intra-document layers and against versions of the same work in multiple documents may be delegated to the machine rather than having to be recorded laboriously by hand. The ensuing simplification of markup reduces transcription and editing costs, boosts text reuse and searching, and, by removing the need for customised software, increases the longevity of digital transcriptions.
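A minimal sketch of what delegating comparison to the machine means in practice, with Python's difflib standing in for a real text-alignment algorithm (the sample layers are invented): two notional layers of a revised passage are plain strings, and their differences are computed rather than hand-encoded in markup.

```python
# Compare two layers of one document word by word; only the points of
# revision are reported, so nothing variant-related needs to be encoded.
from difflib import SequenceMatcher

layer1 = "The quick brown fox jumps over the dog".split()
layer2 = "The quick red fox leaps over the lazy dog".split()

for op, a1, a2, b1, b2 in SequenceMatcher(None, layer1, layer2).get_opcodes():
    if op != "equal":
        print(f"{op}: {layer1[a1:a2]} -> {layer2[b1:b2]}")
```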
Detection and prevention of global navigation satellite system (GNSS) "spoofing" attacks, or the broadcast of false global navigation satellite system services, has recently attracted much research interest. This survey aims to fill three gaps in the literature: first, to assess in detail the exact nature of threat scenarios posed by spoofing against the most commonly cited targets; second, to investigate the many practical impediments, often underplayed, to carrying out GNSS spoofing attacks in the field; and third, to survey and assess the effectiveness of a wide range of proposed defences against GNSS spoofing. Our conclusion lists promising areas of future research.
The digitisation of cultural heritage and linguistics texts has long been troubled by the problem of how to represent overlapping structures arising from different markup perspectives (‘overlapping hierarchies’) or from different versions of the same work (‘textual variation’). These two problems can be reduced to one by observing that every case of overlapping hierarchies is also a case of textual variation. Overlapping textual structures can be accurately modelled either as a minimally redundant directed graph, or, more practically, as an ordered list of pairs, each containing a set of versions and a fragment of text or data. This ‘pairs-list’ representation is provably equivalent to the graph representation. It can record texts consisting of thousands of versions or perspectives without becoming overloaded with data, and the most common operations on variant text, e.g. comparison between two versions, can be performed in linear time. This representation also separates variation or other overlapping structures from the document content, leading to a simplification of markup suitable for wiki-like web applications.
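The pairs-list model is simple enough to sketch directly. In the toy example below (sample data invented), each pair couples a set of version identifiers with a shared text fragment, and both reading one version and comparing two versions are single linear passes over the list:

```python
# A sketch of the 'pairs-list' representation: an ordered list of
# (version-set, fragment) pairs modelling two variant readings.
pairs = [
    ({"A", "B"}, "The quick "),
    ({"A"}, "brown "),
    ({"B"}, "red "),
    ({"A", "B"}, "fox"),
]

def extract(version):
    # Reading one version is a single linear pass over the pairs list.
    return "".join(frag for versions, frag in pairs if version in versions)

def compare(v1, v2):
    # Variants fall out of the same pass: a fragment belonging to one
    # version but not the other is a point of variation.
    for versions, frag in pairs:
        if (v1 in versions) != (v2 in versions):
            yield frag

print(extract("A"))             # The quick brown fox
print(list(compare("A", "B")))  # ['brown ', 'red ']
```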
The digital humanities are growing rapidly in response to a rise in Internet use. Much of what humanists work on, and much of the content of our growing repositories, consists of digital surrogates of originally analog artefacts. But is the data model upon which many of those surrogates are based (embedded markup) adequate for the task? Or does it in fact inhibit reusability and flexibility? To enhance the interoperability of resources and tools, some changes to the standard markup model are needed. Markup could be removed from the text and stored in standoff form. The versions of which many cultural heritage texts are composed could also be represented externally, and computed automatically. These changes would not disrupt existing data representations, which could be imported without significant data loss. They would also enhance automation and ease the increasing burden on the modern digital humanist.
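A small sketch of the standoff idea, with invented offsets and property names: the text is stored clean, and each markup property is a separate range record, so overlapping perspectives can coexist without any tags being embedded in the text itself.

```python
# Standoff markup sketch: clean text plus external (start, length, name)
# records. Two of the ranges below overlap, which embedded XML tags
# could not express in a single hierarchy.
text = "To be, or not to be, that is the question"

standoff = [
    (0, 20, "line"),      # structural perspective
    (0, 5, "clause"),     # a second, overlapping perspective
    (33, 8, "emphasis"),
]

for start, length, name in standoff:
    print(f"{name}: {text[start:start+length]!r}")
```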
An intrinsic challenge associated with evaluating proposed techniques for detecting Distributed Denial-of-Service (DDoS) attacks and distinguishing them from Flash Events (FEs) is the extreme scarcity of publicly available real-world traffic traces. Those available are either heavily anonymised or too old to accurately reflect current trends in DDoS attacks and FEs. This paper proposes a traffic generation and testbed framework for synthetically generating different types of realistic DDoS attacks, FEs and other benign traffic traces, and for monitoring their effects on the target. Using only modest hardware resources, the proposed framework, built around a customised software traffic generator called ‘Botloader’, is capable of generating a configurable mix of two-way traffic for emulating either large-scale DDoS attacks, FEs or benign traffic traces that are experimentally reproducible. Botloader uses IP-aliasing, a well-known technique available on most computing platforms, to create thousands of interactive UDP/TCP endpoints on a single computer, each bound to a unique IP address, to emulate large numbers of simultaneous attackers or benign clients.
• A framework for generating realistic attack and benign traffic.
• The framework uses modest hardware and a customised traffic generator, Botloader.
• IP-aliasing is used to create thousands of interactive UDP/TCP endpoints.
• The framework successfully emulates a real-world DDoS attack and Flash Event.
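As a rough illustration of the IP-aliasing technique the highlights mention: this is not Botloader's code; the addresses, port and target are invented, and the alias addresses are assumed to have been added to an interface beforehand (e.g. with `ip addr add` on Linux). One process then binds one socket per alias to emulate many distinct clients.

```python
# Sketch: emulate several clients from one machine by binding one UDP
# socket to each pre-configured alias IP address.
import socket

aliases = [f"10.0.0.{i}" for i in range(2, 12)]   # ten emulated clients (assumed aliases)
endpoints = []
for addr in aliases:
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.bind((addr, 0))          # each socket sends from its own source IP
    endpoints.append(s)

for s in endpoints:
    s.sendto(b"probe", ("10.0.0.1", 9999))        # hypothetical target
```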
Although most would agree that the future of the scholarly edition lies in the digital medium, it is the print scholarly edition that is still more often cited and read. The production of digital scholarly editions (DSEs) is still seen as an experimental field whose methodology has not yet settled to the extent that a digital editing project can be approached with the same confidence as the making of a print edition. This article describes an experimental conversion of a print scholarly edition, Giacomo Leopardi's Idilli edited by Paola Italia (2008), into a DSE. The edition posed a challenge due to the complexity of its internal evidence, but it is also relatively short and therefore suitable for an experiment. Our objective was to assimilate into a web-based DSE all the information contained in the text and apparatus of the print edition. We also sought to discover whether making a DSE today that could fully utilize the affordances of the web would necessarily place a significant technical load on editors who are more accustomed to solving textual problems. We review briefly a number of generic tools for making DSEs and describe two attempts at making our own DSE of Leopardi's Idilli: a wiki edition whose primary purpose was pedagogical, and a DSE based on the software used to make the Charles Harpur Critical Archive (Eggert, 2019, Charles Harpur Critical Archive, http://charles-harpur.org). We compare these experiences and draw conclusions about the prospects of making DSEs today.
The Digital Humanist
Domenico Fiormonte; Teresa Numerico; Francesca Tomasi
12/2017
eBook
Open access
This book offers a critical introduction to the core technologies underlying the Internet from a humanistic perspective. It provides a cultural critique of computing technologies, by exploring the history of computing and examining issues related to writing, representing, archiving and searching. The book raises awareness of, and calls for, the digital humanities to address the challenges posed by the linguistic and cultural divides in computing, the clash between communication and control, and the biases inherent in networked technologies. A common problem with publications in the Digital Humanities is the dominance of the Anglo-American perspective. While seeking to take a broader view, the book attempts to show how cultural bias can become an obstacle to innovation both in the methodology and practice of the Digital Humanities. Its central point is that no technological instrument is culturally unbiased, and that all too often the geography that underlies technology coincides with the social and economic interests of its producers. The alternative proposed in the book is one of a world in which variation, contamination and decentralization are essential instruments for the production and transmission of digital knowledge. It is thus necessary not only to have spaces where DH scholars can interact (such as international conferences, THATCamps, forums and mailing lists), but also a genuine sharing of technological know-how and experience.