Biscoglio I, Ciancabilla A, Fusani M, Lami G, Trentanni G
Keywords: Requirements analysis, Safety standards
Methods and tools for detecting and measuring ambiguity in texts have been proposed for years, and their efficacy is still being studied and improved, encouraged by results in various application fields (requirements, legal documents, interviews, ...). The paper presents a newly started process aimed at validating such methods and tools by applying some of them to a semi-structured data corpus. This corpus contains the results of manual reviews carried out by international experts, together with the source texts they reviewed. The purpose is to check how consistent the results of the automated analysis are with the reviewers' reports. The application domain is that of safety-related system/software Standards in the Railway sector. Thus, increased confidence in the tools also increases confidence in the correctness of the Standards, which in turn affects the conformity of products.
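The consistency check between automated analysis and reviewer reports can be thought of as an agreement computation between two "raters". The sketch below is purely illustrative and not taken from the paper, which does not publish its comparison code: it assumes each Standard clause has an identifier and that both the tool and the reviewers flag a subset of clauses as ambiguous; the function and clause names are hypothetical.

# Illustrative sketch only; the paper's actual comparison procedure is not published.
# Assumed data model: each clause of the Standard has an ID; the tool and the
# human reviewers each flag a subset of clauses as ambiguous.
from typing import Dict, Set

def agreement_metrics(tool_flags: Set[str], review_flags: Set[str],
                      all_clauses: Set[str]) -> Dict[str, float]:
    """Compare tool-flagged clauses with reviewer-flagged clauses."""
    tp = len(tool_flags & review_flags)                # flagged by both
    fp = len(tool_flags - review_flags)                # flagged by tool only
    fn = len(review_flags - tool_flags)                # flagged by reviewers only
    tn = len(all_clauses - tool_flags - review_flags)  # flagged by neither

    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0

    # Cohen's kappa: chance-corrected agreement between tool and reviewers
    total = tp + fp + fn + tn
    po = (tp + tn) / total                             # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / (total ** 2)
    kappa = (po - pe) / (1 - pe) if pe != 1 else 1.0

    return {"precision": precision, "recall": recall, "kappa": kappa}

# Hypothetical usage: six clauses, three flagged by the tool, three by reviewers.
print(agreement_metrics({"c1", "c2", "c4"}, {"c1", "c2", "c5"},
                        {f"c{i}" for i in range(1, 7)}))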
Publisher: Springer
@inbook{oai:it.cnr:prodotti:415587,
  title     = {Comparing results of natural language disambiguation tools with reports of manual reviews of safety-related standards},
  author    = {Biscoglio I and Ciancabilla A and Fusani M and Lami G and Trentanni G},
  publisher = {Springer},
  year      = {2019}
}