571 result(s)
2016 Journal article Open Access OPEN
Retrieval and classification methods for textured 3D models: a comparative study
Biasotti S. M., Cerri A., Aono M., Hamza A. B., Garro V., Giachetti A., Giorgi D., Godil A. A., Li G. C., Sanada C., Spagnuolo M., Tatsuma A., Velasco Forero S.
This paper presents a comparative study of six methods for the retrieval and classification of textured 3D models, which have been selected as representative of the state of the art. To better analyse and control how methods deal with specific classes of geometric and texture deformations, we built a collection of 572 synthetic textured mesh models, in which each class includes multiple texture and geometric modifications of a small set of null models. Results show a challenging, yet lively, scenario and also reveal interesting insights into how to deal with texture information according to different approaches, possibly working in the CIELab as well as in modifications of the RGB colour space.
Source: The visual computer 32 (2016): 217–241. doi:10.1007/s00371-015-1146-3
DOI: 10.1007/s00371-015-1146-3
Project(s): IQMULUS via OpenAIRE, VISIONAIR via OpenAIRE

See at: The Visual Computer Open Access | ISTI Repository Open Access | The Visual Computer Restricted | Hyper Article en Ligne Restricted | link.springer.com Restricted | CNR ExploRA


2016 Report Unknown
La gestione della privacy all'interno dell'Istituto di Scienza e Tecnologie dell'Informazione A. Faedo
Deluca R.
This report describes the organisational system for privacy management set up by the Istituto di Scienza e Tecnologie dell'Informazione "A. Faedo" (ISTI), with the aim of ensuring that personal data are processed in accordance with the rights, fundamental freedoms and dignity of individuals, with particular attention to guarantees of data confidentiality and security.
Source: ISTI Technical reports, 2016

See at: CNR ExploRA


2016 Contribution to book Open Access OPEN
Non-conventional electrochemical and optical sensor systems
Di Natale C., Dini F., Scozzari A.
Electroanalytical methods are a common tool for the assessment of chemical peculiarities of aqueous solutions. Also, the analysis of water based on optical sensors is a mature field of research, which already led to industrial applications and standard laboratory practices. Nevertheless, scientific literature is still offering new sensor techniques and innovative measurement approaches in both fields. In particular, for fast characterisation of liquids and change detection applications in a continuous monitoring context, the technology of taste sensors based on electrochemical techniques is still witnessing a growing interest. Such devices are often defined as "electronic tongues" or "e-tongues". In addition, emerging inexpensive and portable devices with optical-sensing capabilities can be used for monitoring applications with a novel approach. This chapter gives an overview of recent techniques developed in both fields and presents several potential applications and case studies that deal with the context of water quality assessment. A brief introduction about the basics of each measurement technology, even if not exhaustive, is also provided.
Source: Threats to the Quality of Groundwater Resources: Prevention and Control, edited by A. Scozzari, E. Dotsika. London: Springer, 2016
DOI: 10.1007/698_2013_254

See at: ISTI Repository Open Access | doi.org Restricted | link.springer.com Restricted | CNR ExploRA


2016 Contribution to book Restricted
Humanity is much more than the sum of humans
Bolognesi T.
Consider two roughly spherical and coextensive complex systems: the atmosphere and the upper component of the biosphere - humanity. It is well known that, due to a malicious antipodal butterfly, the possibility of accurately forecasting the weather - let alone controlling it - is severely limited. Why should it be easier to predict and steer the future of humanity? In this essay we present both pessimistic and optimistic arguments about the possibility of effectively predicting and driving our future. On the long time scale, we sketch a software-oriented view of the cosmos in all of its components, from spacetime to the biosphere and human societies, borrowing ideas from various scientific theories or conjectures; the proposal is also motivated by an attempt to provide some formal foundations to Teilhard de Chardin's cosmological/metaphysical visions, which relate the growing complexity of the material universe, and its final fate, to the progressive emergence of consciousness. On a shorter scale, we briefly discuss the possibility of using simple formal models such as Kauffman's boolean networks, and the growing body of data about social behaviours, for simulating humanity 'in-silico', with the purpose of anticipating problems and testing solutions.
Source: How Should Humanity Steer the Future?, edited by Anthony Aguirre, Brendan Foster, Zeeya Merali, pp. 1–15, 2016
DOI: 10.1007/978-3-319-20717-9_3

See at: doi.org Restricted | link.springer.com Restricted | CNR ExploRA


2016 Journal article Open Access OPEN
Integrating adaptation rules for people with special needs in model-based UI development process
Miñón R., Paternò F., Arrue M., Abascal J.
The adaptation of user interfaces for people with special needs is a promising approach to enabling their access to digital services. Model-based user interfaces provide a useful approach for this purpose since they allow tailoring final user interfaces with a high degree of flexibility. This paper describes a system called Adaptation Integration System aimed at providing Cameleon Reference Framework model-based tools with a mechanism to integrate adaptation rules in the development process. Thus, more accessible user-tailored interfaces can be automatically generated. The services provided by the system can be applied at both design time and runtime. At design time, a user interface can be tailored at any abstraction level in the development process. At runtime, changes in the context of use trigger the adaptation process. Adaptation rules are stored in a repository tagged with meta-information useful for the adaptation process, such as the granularity of the adaptations and the abstraction level. As case studies, two applications have been developed using the services provided by the system. One of them exploits the benefits at design time, whereas the other application is devoted to describing the adaptation process at runtime. The results obtained in these two scenarios demonstrate the viability and potential of the Adaptation Integration System, since even inexperienced designers may efficiently produce accessible user interfaces.
Source: Universal access in the information society (Internet) 15 (2016): 153–168. doi:10.1007/s10209-015-0406-3
DOI: 10.1007/s10209-015-0406-3
Project(s): SERENOA via OpenAIRE

See at: ISTI Repository Open Access | Universal Access in the Information Society Restricted | link.springer.com Restricted | CNR ExploRA
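
A rough illustration of the rule-repository idea described in this record follows; all names, fields and the selection logic below are assumptions made for the example, not the actual Adaptation Integration System API.

```python
# Hypothetical sketch of adaptation rules tagged with meta-information (abstraction
# level, granularity, triggering context event); not the system's real data model.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class AdaptationRule:
    name: str
    abstraction_level: str          # e.g. "abstract UI", "concrete UI", "final UI"
    granularity: str                # e.g. "whole UI", "group", "element"
    context_event: Optional[str]    # None means applicable at design time
    transform: Callable[[dict], dict]

REPOSITORY = [
    AdaptationRule("enlarge-fonts", "final UI", "whole UI", "low-vision-user",
                   lambda ui: {**ui, "font_scale": 1.5}),
    AdaptationRule("linearize-layout", "concrete UI", "group", None,
                   lambda ui: {**ui, "layout": "single-column"}),
]

def select_rules(level: str, event: Optional[str] = None):
    """Design time: event is None. Runtime: event is the context change that fired."""
    return [r for r in REPOSITORY
            if r.abstraction_level == level and r.context_event in (None, event)]

ui_model = {"layout": "grid", "font_scale": 1.0}
for rule in select_rules("final UI", event="low-vision-user"):
    ui_model = rule.transform(ui_model)
print(ui_model)   # -> {'layout': 'grid', 'font_scale': 1.5}
```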


2016 Journal article Open Access OPEN
Experiencing ancient buildings from a 3D GIS perspective: a case drawn from the Swedish Pompeii project
Dell'Unto N., Landeschi G., Leander A. -M., Dellepiane M., Callieri M., Ferdani D.
In recent times, archaeological documentation strategies have been considerably improved by the use of advanced 3D acquisition systems. Laser scanning, photogrammetry and computer vision techniques provide archaeologists with new opportunities to investigate and document the archaeological record. In spite of this, the amount of data collected and the geometrical complexity of the models resulting from such acquisition processes have always prevented their systematic integration into a geographic information system (GIS) environment. Recent technological advances in the visualization of 3D content allowed us to overcome the aforementioned limitations and to set up a work pipeline in which it was possible to put the 3D models not only in the context of data visualization but also in the frame of spatial analysis. The case study described is a part of the Swedish Pompeii Project, a research and fieldwork activity started in 2000 with the purpose of recording and investigating an entire Pompeian city block, Insula V 1. As an additional part of the research, a laser scanning acquisition campaign was conducted in the last few years. The resulting models were thus meant to be used to develop further research lines: among these, a 3D GIS system was expected to be set up with the purpose of (i) collecting, in the same geo-referenced environment, the different typologies of documentation gathered in the context of the Swedish Pompeii Project; (ii) inter-connecting the 3D models with the project website; and (iii) using the third dimension as a further analytical field of investigation, in the form of spatial analysis and cognitive simulation.
Source: Journal of archaeological method and theory 23 (2016): 73–94. doi:10.1007/s10816-014-9226-7
DOI: 10.1007/s10816-014-9226-7

See at: ISTI Repository Open Access | Journal of Archaeological Method and Theory Restricted | link.springer.com Restricted | CNR ExploRA


2016 Journal article Open Access OPEN
Building a European geothermal information network using a distributed e-Infrastructure
Trumpy E., Coro G., Manzella A., Pagano P., Castelli D., Calcagno P., Nador A., Bragasson T., Grellet S., Siddiqi G.
Geothermal data are published using different IT services, formats and content representations, and can refer to both regional and global scale information. Geothermal stakeholders search for information with different aims. E-Infrastructures are collaborative platforms that address this diversity of aims and data representations. In this paper, we present a prototype for a European Geothermal Information Platform that uses INSPIRE recommendations and an e-Infrastructure (D4Science) to collect, aggregate and share data sets from different European data contributors, thus enabling stakeholders to retrieve and process a large amount of data. Our system merges segmented and national realities into one common framework. We demonstrate our approach by describing a platform that collects data from Italian, French, Hungarian, Swiss and Icelandic geothermal data providers.
Source: International journal of digital earth (Online) 9 (2016): 499–519. doi:10.1080/17538947.2015.1073378
DOI: 10.1080/17538947.2015.1073378
Project(s): IMARINE via OpenAIRE, GEOTHERMAL ERA NET via OpenAIRE

See at: International Journal of Digital Earth Open Access | Hyper Article en Ligne Restricted | www.tandfonline.com Restricted | CNR ExploRA


2016 Contribution to journal Open Access OPEN
Editorial preface for the JLAMP Special Issue on Formal Methods for Software Product Line Engineering
Ter Beek M. H., Clarke D., Schaefer I.
This special issue is devoted to the themes of the FMSPLE workshop series on formal methods and analysis in Software Product Line Engineering (SPLE). SPLE aims at developing a family of (software) systems by reuse in order to reduce time-to-market and to increase product quality. The correctness of the artefacts intended for reuse, as well as the correctness of the developed products, is of crucial interest for many safety-critical or business-critical applications. Formal methods and analysis techniques have been successfully applied in single system engineering in order to rigorously establish critical system requirements. While SPLE has matured considerably over the last decade, many challenges still remain, among which are efficient variability management, the consistency between domain and application engineering, the reduction of quality assurance efforts, and the consistent and sustainable evolution of product families. However, formal methods and analysis techniques are still not applied broadly enough in SPLE, despite their potential to improve product quality. One of the reasons for this is that existing formal approaches from single system engineering do not consider variability, the quintessential feature of product lines.
DOI: 10.1016/j.jlamp.2015.09.006

See at: Journal of Logical and Algebraic Methods in Programming Open Access | ISTI Repository Open Access | www.sciencedirect.com Open Access | CNR ExploRA


2016 Journal article Restricted
Scale space graph representation and kernel matching for non rigid and textured 3D shape retrieval
Garro V., Giachetti A.
In this paper we introduce a novel framework for 3D object retrieval that relies on tree-based shape representations (TreeSha) derived from the analysis of the scale-space of the Auto Diffusion Function (ADF) and on specialized graph kernels designed for their comparison. By coupling maxima of the Auto Diffusion Function with the related basins of attraction, we can link the information at different scales encoding spatial relationships in a graph description that is isometry invariant and can easily incorporate texture and additional geometrical information as node and edge features. Using custom graph kernels it is then possible to estimate shape dissimilarities adapted to different specific tasks and on different categories of models, making the procedure a powerful and flexible tool for shape recognition and retrieval. Experimental results demonstrate that the method can provide retrieval scores similar to or better than the state of the art on textured and non-textured shape retrieval benchmarks and give interesting insights on the effectiveness of different shape descriptors and graph kernels.
Source: IEEE transactions on pattern analysis and machine intelligence 38 (2016): 1258–1271. doi:10.1109/TPAMI.2015.2477823
DOI: 10.1109/tpami.2015.2477823

See at: IEEE Transactions on Pattern Analysis and Machine Intelligence Restricted | ieeexplore.ieee.org Restricted | CNR ExploRA


2016 Journal article Restricted
Simple outlier labeling based on quantile regression, with application to the steelmaking process
Bellio R., Coletto M.
This paper introduces some methods for outlier identification in the regression setting, motivated by the analysis of steelmaking process data. The proposed methodology extends to the regression setting the boxplot rule, commonly used for outlier screening with univariate data. The focus here is on bivariate settings with a single covariate, but extensions are possible. The proposal is based on quantile regression, including an additional transformation parameter for selecting the best scale for linearity of the conditional quantiles. The resulting method is used to perform effective labeling of potential outliers, with a quite low computational complexity, allowing for simple implementation within statistical software as well as commonly used spreadsheets. Some simulation experiments have been carried out to study the swamping and masking properties of the proposal. The methodology is also illustrated by some real life examples, taking as the response variable the energy consumed in the melting process.
Source: Applied stochastic models in business and industry (Online) 32 (2016): 228–232. doi:10.1002/asmb.2146
DOI: 10.1002/asmb.2146

See at: Applied Stochastic Models in Business and Industry Restricted | onlinelibrary.wiley.com Restricted | CNR ExploRA
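
As a minimal sketch of the idea in this record, the boxplot rule can be extended to the regression setting by estimating conditional quartiles with quantile regression; the paper's additional scale-transformation step is omitted here and the data below are synthetic.

```python
# Minimal sketch: conditional boxplot rule via quantile regression (statsmodels);
# synthetic data, and the paper's transformation parameter for the linearity scale
# is not implemented.
import numpy as np
import statsmodels.api as sm

def boxplot_rule_outliers(x, y, k=1.5):
    """Flag points outside the conditional fences [Q1 - k*IQR, Q3 + k*IQR]."""
    X = sm.add_constant(x)                         # linear conditional quantiles in x
    q1 = sm.QuantReg(y, X).fit(q=0.25).predict(X)  # conditional first quartile
    q3 = sm.QuantReg(y, X).fit(q=0.75).predict(X)  # conditional third quartile
    iqr = q3 - q1
    return (y < q1 - k * iqr) | (y > q3 + k * iqr)

rng = np.random.default_rng(0)
x = rng.uniform(50, 150, 300)                      # e.g. charge weight
y = 2.0 * x + rng.normal(0, 5, 300)                # e.g. energy consumed in melting
y[:5] += 60                                        # inject a few anomalous heats
print(np.flatnonzero(boxplot_rule_outliers(x, y))) # indices of labeled outliers
```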


2016 Journal article Open Access OPEN
Modelling and analysing variability in product families: model checking of modal transition systems with variability constraints
Ter Beek M. H., Fantechi A., Gnesi S., Mazzanti F.
We present the formal underpinnings of a modelling and analysis framework for the specification and verification of variability in product families. We address variability at the behavioural level by modelling the family behaviour by means of a Modal Transition System (MTS) with an associated set of variability constraints expressed over action labels. An MTS is a Labelled Transition System (LTS) which distinguishes between optional and mandatory transitions. Steered by the variability constraints, the inclusion or exclusion of labelled transitions in an LTS refining the MTS determines the family's possible product behaviour. We formalise this as a special-purpose refinement relation for MTSs, which differs fundamentally from the classical one, and show how to use it for the definition and derivation of valid product behaviour starting from product family behaviour. We also present a variability-aware action-based branching-time modal temporal logic to express properties over MTSs, and demonstrate a number of results regarding the preservation of logical properties from family to product behaviour. These results pave the way for the more efficient family-based analyses of MTSs, limiting the need for product-by-product analyses of LTSs. Finally, we define a high-level modal process algebra for the specification of MTSs. The complete framework is implemented in a model-checking tool: given the behaviour of a product family modelled as an MTS with an additional set of variability constraints, it allows the explicit generation of valid product behaviour as well as the efficient on-the-fly verification of logical properties over family and product behaviour alike.
Source: Journal of Logical and Algebraic Methods in Programming [online] 85 (2016): 287–315. doi:10.1016/j.jlamp.2015.11.006
DOI: 10.1016/j.jlamp.2015.11.006
Project(s): QUANTICOL via OpenAIRE

See at: Journal of Logical and Algebraic Methods in Programming Open Access | Flore (Florence Research Repository) Open Access | ISTI Repository Open Access | www.sciencedirect.com Restricted | CNR ExploRA


2016 Conference article Restricted
An experience on applying process mining techniques to the tuscan port community system
Spagnolo G. O., Marchetti E., Coco A., Scarpellini P., Querci A., Fabbrini F., Gnesi S.
[Context & Motivation] Business Process Management is an important and widely adopted approach for modelling process specifications and developing an executable framework for the management of the process itself. In particular, the monitoring facilities associated with on-line process execution provide an important means of controlling process evolution and quality. In this context, this paper reports an experience in the application of business process modelling and process mining techniques to TPCS, the Tuscan Port Community System. This is a web-services based platform with multilevel access control and data recovery facilities, developed for supporting and strengthening the Motorways of the Sea and Italian regulations. The paper describes a storytelling approach applied to derive the TPCS business process model and the conformance checking techniques used to validate it and improve the overall TPCS software quality.
Source: Software Quality. The Future of Systems and Software Development. 8th International Conference, pp. 49–60, Vienna, Austria, 18-21/01/2016
DOI: 10.1007/978-3-319-27033-3_4

See at: doi.org Restricted | link.springer.com Restricted | CNR ExploRA
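
For readers unfamiliar with conformance checking, the following generic sketch uses the open-source pm4py library rather than the tooling of the paper; the log file name is a placeholder and the model is discovered from the log instead of being derived from storytelling sessions.

```python
# Generic conformance-checking sketch with pm4py (not the paper's data or tooling);
# "tpcs_log.xes" is a placeholder event log.
import pm4py

log = pm4py.read_xes("tpcs_log.xes")              # event log exported from the system

# In the paper the model comes from storytelling sessions; here we simply discover one.
net, initial_marking, final_marking = pm4py.discover_petri_net_inductive(log)

# Token-based replay yields fitness measures: how well the log conforms to the model.
fitness = pm4py.fitness_token_based_replay(log, net, initial_marking, final_marking)
print(fitness)                                    # e.g. percentage of fitting traces
```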


2016 Conference article Open Access OPEN
MeshLab e Blender: software open source in supporto allo studio e alla ricostruzione virtuale della policromia antica
Siotto E., Callieri M., Dellepiane M., Scopigno R.
The paper presents our experience with the reconstruction of the ancient polychromy of the Roman sarcophagus dedicated to Ulpia Domnina (MNR-TD inv. no. 125891) through the use of open source digital technologies. In particular, MeshLab (an open source mesh processing tool) was used to better understand the styles and techniques used to apply the colour on the sarcophagus, and to create its virtual reconstruction; while Blender (an open source modelling and rendering tool) was used to render the different layers of paint and display their final effect. The combination of the two open source tools has also been tested on a selected area of Ulpia Domnina's sarcophagus that presents an interesting layering of colours. Their combination (necessary to overcome some of their intrinsic limitations) proved able to produce realistic results.
Source: Free, libre and open source software e open format nei processi di ricerca archeologica: VIII Edizione Catania 2013, pp. 210–219, Catania, Italy, 18-19/06/2013

See at: www.archaeopress.com Open Access | CNR ExploRA


2016 Journal article Embargo
3D reconstruction for featureless scenes with curvature hints
Baldacci A., Bernabei D., Corsini M., Ganovelli F., Scopigno R.
We present a novel interactive framework for improving 3D reconstruction starting from incomplete or noisy results obtained through image-based reconstruction algorithms. The core idea is to enable the user to provide localized hints on the curvature of the surface, which are turned into constraints during an energy minimization reconstruction. To make this task simple, we propose two algorithms. The first is a multi-view segmentation algorithm that allows the user to propagate the foreground selection of one or more images both to all the images of the input set and to the 3D points, to accurately select the part of the scene to be reconstructed. The second is a fast GPU-based algorithm for the reconstruction of smooth surfaces from multiple views, which incorporates the hints provided by the user. We show that our framework can turn a poor-quality reconstruction produced with state of the art image-based reconstruction methods into a high-quality one.
Source: The visual computer 32 (2016): 1605–1620. doi:10.1007/s00371-015-1144-5
DOI: 10.1007/s00371-015-1144-5
Project(s): HARVEST4D via OpenAIRE

See at: The Visual Computer Restricted | link.springer.com Restricted | CNR ExploRA


2016 Journal article Open Access OPEN
A proactive system for maritime environment monitoring
Moroni D., Pieri G., Tampucci M., Salvetti O.
The ability to remotely detect and monitor oil spills is becoming increasingly important due to the high demand of oil-based products. Indeed, shipping routes are becoming very crowded and the likelihood of oil slick occurrence is increasing. In this frame, a fully integrated remote sensing system can be a valuable monitoring tool. We propose an integrated and interoperable system able to monitor ship traffic and marine operators, using sensing capabilities from a variety of electronic sensors, along with geo-positioning tools, and through a communication infrastructure. Our system is capable of transferring heterogeneous data, freely and seamlessly, between different elements of the information system (and their users) in a consistent and usable form. The system also integrates a collection of decision support services providing proactive functionalities. Such services demonstrate the potentiality of the system in facilitating dynamic links among different data, models and actors, as indicated by the performed field tests.
Source: Marine pollution bulletin 102 (2016): 316–322. doi:10.1016/j.marpolbul.2015.07.045
DOI: 10.1016/j.marpolbul.2015.07.045
Project(s): ARGOMARINE via OpenAIRE

See at: ISTI Repository Open Access | Marine Pollution Bulletin Restricted | www.sciencedirect.com Restricted | www.scopus.com Restricted | CNR ExploRA


2016 Journal article Open Access OPEN
Automatic classification of climate change effects on marine species distributions in 2050 using the AquaMaps model
Coro G., Magliozzi C., Ellenbroek A., Kaschner K., Pagano P.
Habitat modifications driven by human impact and climate change may influence species distribution, particularly in aquatic environments. Niche-based models are commonly used to evaluate the availability and suitability of habitat and assess the consequences of future climate scenarios on a species range and shifting edges of its distribution. Together with knowledge on biology and ecology, niche models also allow evaluating the potential of species to react to expected changes. The availability of projections of future climate scenarios allows comparing current and future niche distributions, assessing a species' habitat suitability modification and shift, and consequently estimating potential species' reaction. In this study, differences between the distribution maps of 406 marine species, which were produced by the AquaMaps niche models on current and future (year 2050) scenarios, were estimated and evaluated. Discrepancy measurements were used to identify a discrete number of categories, which represent different responses to climate change. Clustering analysis was then used to automatically detect these categories, demonstrating their reliability compared to human supervised classification. Finally, the distribution of characteristics like extinction risk (based on IUCN categories), taxonomic groups, population trends and habitat suitability change over the clustering categories was evaluated. In this assessment, direct human impact was neglected, in order to focus only on the consequences of environmental changes. Furthermore, in the comparison between two climate snapshots, the intermediate phases were assumed to be implicitly included into the model of the 2050 climate scenario.
Source: Environmental and ecological statistics (Dordr., Online) 23 (2016): 155–180. doi:10.1007/s10651-015-0333-8
DOI: 10.1007/s10651-015-0333-8
Project(s): IMARINE via OpenAIRE

See at: link.springer.com Open Access | ISTI Repository Open Access | Environmental and Ecological Statistics Restricted | CNR ExploRA
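
A hedged sketch of the comparison-and-clustering step described in this record follows; the maps are random placeholders rather than AquaMaps output, and the discrepancy measures are deliberately simplistic.

```python
# Illustrative sketch only: compare current vs. 2050 habitat-suitability maps per
# species with simple discrepancy measures, then cluster species into categories.
import numpy as np
from sklearn.cluster import KMeans

def discrepancy_features(current, future):
    """current, future: 2D suitability grids in [0, 1] on the same cells."""
    gained = np.clip(future - current, 0, None).mean()   # average suitability gained
    lost = np.clip(current - future, 0, None).mean()     # average suitability lost
    shift = np.abs(future - current).mean()              # overall change
    return [gained, lost, shift]

# Placeholder inputs standing in for the 406 species maps used in the study.
rng = np.random.default_rng(1)
maps_now = {f"sp{i}": rng.random((20, 20)) for i in range(406)}
maps_2050 = {f"sp{i}": np.clip(m + rng.normal(0, 0.1, m.shape), 0, 1)
             for i, m in enumerate(maps_now.values())}

X = np.array([discrepancy_features(maps_now[s], maps_2050[s]) for s in maps_now])
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(labels))                                # species per response category
```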


2016 Journal article Open Access OPEN
Digital restoration of ancient color manuscripts from geometrically misaligned recto-verso pairs
Savino P., Tonazzini A.
We propose a fast automatic procedure for registration and restoration of images of recto-verso pairs of color manuscripts affected by bleed-through distortion. The registration algorithm assumes a rigid projective deformation of a side with respect to the other. The coefficients of the geometric transformation are computed from a large number of pairs of matching points, automatically detected by exploiting the estimates of local shifts between pairs of small patches. We validate the efficiency of the registration algorithm through the performance of a restoration method based on a model that relates each couple of corresponding pixels in the two images, and thus requiring a very accurate alignment of the two sides. The experiments show that this combined procedure of registration plus restoration can provide an excellent removal of the bleed-through pattern, while leaving unaltered the salient features of the original manuscript.
Source: Journal of cultural heritage 19 (2016): 511–521. doi:10.1016/j.culher.2015.11.005
DOI: 10.1016/j.culher.2015.11.005

See at: ISTI Repository Open Access | Journal of Cultural Heritage Restricted | www.sciencedirect.com Restricted | CNR ExploRA
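
The registration step of this record lends itself to a compact illustration; the OpenCV-based sketch below (with assumed patch size, confidence threshold and shift sign convention, and not the authors' implementation) estimates a projective transform from local patch shifts and warps the mirrored verso onto the recto.

```python
# Hedged sketch (not the authors' code): recto/verso registration from local patch
# shifts under an assumed projective deformation; parameters and the shift sign
# convention are assumptions that may need adjusting on real scans.
import cv2
import numpy as np

def register_verso(recto_gray, verso_gray, patch=64, step=64):
    verso_flipped = cv2.flip(verso_gray, 1)                # verso is mirrored w.r.t. recto
    src, dst = [], []
    h, w = recto_gray.shape
    for y in range(0, h - patch, step):
        for x in range(0, w - patch, step):
            a = recto_gray[y:y + patch, x:x + patch].astype(np.float32)
            b = verso_flipped[y:y + patch, x:x + patch].astype(np.float32)
            (dx, dy), response = cv2.phaseCorrelate(a, b)  # local shift estimate
            if response > 0.1:                             # keep confident patches only
                cx, cy = x + patch / 2, y + patch / 2
                src.append((cx + dx, cy + dy))             # point on the verso side
                dst.append((cx, cy))                       # matching point on the recto
    H, _ = cv2.findHomography(np.float32(src), np.float32(dst), cv2.RANSAC, 3.0)
    return cv2.warpPerspective(verso_flipped, H, (w, h))   # verso aligned to the recto
```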


2016 Journal article Open Access OPEN
Leveraging spatial abstraction in traffic analysis and forecasting with visual analytics
Andrienko N., Andrienko G., Rinzivillo S.
A spatially abstracted transportation network is a graph where nodes are territory compartments (areas in geographic space) and edges, or links, are abstract constructs, each link representing all possible paths between two neighboring areas. By applying visual analytics techniques to vehicle traffic data from different territories, we discovered that the traffic intensity (a.k.a. traffic flow or traffic flux) and the mean velocity are interrelated in a spatially abstracted transportation network in the same way as at the level of street segments. Moreover, these relationships are consistent across different levels of spatial abstraction of a physical transportation network. Graphical representations of the flux-velocity interdependencies for abstracted links have the same shape as the fundamental diagram of traffic flow through a physical street segment, which is known in transportation science. This key finding substantiates our approach to traffic analysis, forecasting, and simulation leveraging spatial abstraction. We propose a framework in which visual analytics supports three high-level tasks, assess, forecast, and develop options, in application to vehicle traffic. These tasks can be carried out in a coherent workflow, where each next task uses the results of the previous one(s). At the 'assess' stage, vehicle trajectories are used to build a spatially abstracted transportation network and compute the traffic intensities and mean velocities on the abstracted links by time intervals. The interdependencies between the two characteristics of the links are extracted and represented by formal models, which enable the second step of the workflow, 'forecast', involving simulation of vehicle movements under various conditions. The previously derived models allow not only prediction of normal traffic flows conforming to the regular daily and weekly patterns but also simulation of traffic in extraordinary cases, such as road closures, major public events, or mass evacuation due to a disaster. Interactive visual tools support preparation of simulations and analysis of their results. When the simulation forecasts problematic situations, such as major congestions and delays, the analyst proceeds to the step 'develop options' for trying various actions aimed at situation improvement and investigating their consequences. Action execution can be imitated by interactively modifying the input of the simulation model. Specific techniques support comparisons between results of simulating different "what if" scenarios.
Source: Information systems (Oxf.) 57 (2016): 172–194. doi:10.1016/j.is.2015.08.007
DOI: 10.1016/j.is.2015.08.007
Project(s): CIMPLEX via OpenAIRE, SoBigData via OpenAIRE

See at: Information Systems Open Access | City Research Online Open Access | ISTI Repository Open Access | Information Systems Restricted | Fraunhofer-ePrints Restricted | www.sciencedirect.com Restricted | CNR ExploRA
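
As a small illustration of the 'assess' step described in this record (the CSV file and column names are invented for the example; the paper's own tooling is visual-analytics based), per-link traffic intensity and mean velocity by time interval can be computed as follows.

```python
# Illustrative sketch with an assumed schema: trajectory points already mapped to
# areas are aggregated into area-to-area links with hourly flux and mean velocity.
import pandas as pd

# Assumed columns: vehicle_id, timestamp, from_area, to_area, speed_kmh
points = pd.read_csv("trajectories_abstracted.csv", parse_dates=["timestamp"])

moves = points[points.from_area != points.to_area].copy()   # area-to-area transitions = links
moves["hour"] = moves.timestamp.dt.floor("h")

links = (moves.groupby(["from_area", "to_area", "hour"])
              .agg(flux=("vehicle_id", "nunique"),           # vehicles crossing the link
                   mean_velocity=("speed_kmh", "mean"))
              .reset_index())

# The flux/velocity pairs per link can now be plotted or fitted; the paper reports
# they follow the fundamental-diagram shape known for physical street segments.
print(links.head())
```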


2016 Conference article Open Access OPEN
Building a digital library containing digital elaborations of ancient documents
Debole F., Savino P., Tonazzini A.
Digital archives containing digitized images and detailed descriptions of cultural heritage objects are of primary importance in order to guarantee the preservation and to foster the fruition of many fragile artifacts of our culture and history. Digital processing of these images is frequently needed in order to improve their readability, to correct degradations and damages, and to analyze their contents. This paper presents a metadata schema and a metadata editor supporting the description and the archiving of all elaboration activities performed. The archive allows one to perform content-based searches of the original objects' descriptions as well as of the results of the elaboration activities.
Source: Tenth International Conference on Digital Information Management, pp. 124–131, Jeju Island, South Korea, 21-23/10/2015
DOI: 10.1109/icdim.2015.7381855

See at: ISTI Repository Open Access | doi.org Restricted | ieeexplore.ieee.org Restricted | CNR ExploRA
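
Purely as an illustration of what such an archive entry might contain (all field names below are invented, not the schema proposed in the paper), a record describing an elaboration activity linked to its original object could look like this.

```python
# Hypothetical example record for one elaboration activity; field names are
# assumptions for illustration only, not the paper's metadata schema.
import json

elaboration_record = {
    "original_object": {
        "identifier": "ms-0042-f017r",
        "description": "Recto of folio 17, 15th-century manuscript",
    },
    "elaboration": {
        "type": "bleed-through removal",
        "tool": "in-house restoration pipeline",
        "parameters": {"registration": "projective", "model": "recto-verso mixing"},
        "performed_by": "restoration lab",
        "date": "2015-06-30",
    },
    "result": {"identifier": "ms-0042-f017r-clean", "format": "TIFF"},
}

print(json.dumps(elaboration_record, indent=2))   # searchable alongside the original description
```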


2016 Journal article Open Access OPEN
The Fuzzy Ontology Reasoner fuzzyDL
Bobillo F., Straccia U.
Classical, two-valued, ontologies have been successfully applied to represent the knowledge in many domains. However, it has been pointed out that they are not suitable in domains where vague or imprecise pieces of information play an important role. To overcome this limitation, several extensions to classical ontologies based on fuzzy logic have been proposed. We believe, however, that the success of fuzzy ontologies strongly depends on the availability of effective reasoners able to deal with fuzzy ontologies. In this paper we describe fuzzyDL, an expressive fuzzy ontology reasoner with some unique features. We discuss its possibilities for fuzzy ontology representation, the supported reasoning services, the different interfaces to interact with it, some implementation details, a comparison with other fuzzy ontology reasoners, and an overview of the main applications that have used it so far.
Source: Knowledge-based systems 95 (2016): 12–34. doi:10.1016/j.knosys.2015.11.017
DOI: 10.1016/j.knosys.2015.11.017

See at: ISTI Repository Open Access | Knowledge-Based Systems Restricted | www.sciencedirect.com Restricted | CNR ExploRA