453 result(s)
2003 Doctoral thesis Unknown
Criteria to improve web site usability and accessibility when interacting through screen readers: definition, application, and evaluation
Leporini B.
This research is related to the usability and accessibility of Web sites. Guidelines for Web site usability already exist, but they only marginally consider the needs of "special users", such as blind people or subjects with severe vision deficits. This study aimed to define Web site usability more precisely, in order to improve accessibility for such users, who must navigate the Internet through screen readers. First, 19 criteria (general principles) and 54 checkpoints defining those criteria (technical solutions) were proposed; then, possible ways of applying these criteria and checkpoints were specified. This represented the starting point for evaluating the usability of Web sites: in this work, a heuristic-based method was proposed and used to assign usability levels to several Web sites of interest. User testing was performed with 15 volunteers, chosen among blind and low-vision subjects. Two Web site prototypes were designed specifically for this purpose, differing only in the presence or absence of important usability criteria defined in this study. By comparing the time users spent navigating and performing assigned tasks on the two Web sites (with and without the criteria), the impact of applying the proposed criteria on the quality of navigation was estimated. Finally, an automatic tool, whose implementation is in progress, is briefly presented at the end of this work. This tool is a first step toward a complete automatic procedure able to evaluate real Web site usability, especially considering the constraints of blind and low-vision people. Further studies are in progress to reach this final goal.

See at: CNR ExploRA


2003 Journal article Restricted
Using spanning sets for coverage testing
Bertolino A., Marrè M.
A test coverage criterion defines a set Ec of entities of the program flowgraph and requires that every entity in this set is covered under some test case. Coverage criteria are also used to measure the adequacy of the executed test cases. In this paper, we introduce the notion of spanning sets of entities for coverage testing. A spanning set is a minimum subset of Ec, such that a test suite covering the entities in this subset is guaranteed to cover every entity in Ec. When the coverage of an entity always guarantees the coverage of another entity, the former is said to subsume the latter. Based on the subsumption relation between entities, we provide a generic algorithm to find spanning sets for control flow and data flow-based test coverage criteria. We suggest several useful applications of spanning sets: they help reduce and estimate the number of test cases needed to satisfy coverage criteria. We also empirically investigate how the use of spanning sets affects the fault detection effectiveness.
Source: IEEE transactions on software engineering 29 (2003): 974–984.
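
The subsumption idea can be pictured with a small sketch. Assuming the subsumption relation is given as a mapping from each entity to the set of entities whose coverage it guarantees (the greedy selection and the toy branch names below are illustrative assumptions, not the generic algorithm of the paper), a spanning subset can be extracted as follows:

```python
# Illustrative sketch: pick a small "spanning" subset of coverage entities.
# subsumes[e] = set of entities whose coverage is guaranteed whenever e is covered
# (e itself included). The greedy set-cover heuristic is an assumption for this
# example, not the paper's algorithm.

def spanning_set(entities, subsumes):
    uncovered = set(entities)
    spanning = []
    while uncovered:
        # choose the entity that guarantees coverage of the most still-uncovered entities
        best = max(uncovered, key=lambda e: len(subsumes[e] & uncovered))
        spanning.append(best)
        uncovered -= subsumes[best]
    return spanning

# Toy flowgraph example: covering branch "b3" also exercises "b1" and "b2".
entities = {"b1", "b2", "b3", "b4"}
subsumes = {
    "b1": {"b1"},
    "b2": {"b2", "b1"},
    "b3": {"b3", "b2", "b1"},
    "b4": {"b4"},
}
print(spanning_set(entities, subsumes))  # -> ['b3', 'b4']
```

Any test suite covering the returned entities necessarily covers the whole set, which is how spanning sets reduce the number of test requirements to track.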

See at: ieeexplore.ieee.org Restricted | CNR ExploRA


2003 Journal article Unknown
A unified method for designing interactive systems adaptable to mobile and stationary platforms
Paternò F., Santoro C.
The wide variety of devices currently available, which is bound to increase in the coming years, poses a number of issues for the design cycle of interactive software applications. Model-based approaches can provide useful support in addressing this new challenge. In this paper we present and discuss a method for the design of nomadic applications showing how the use of models can support their design. The aim is to enable each interaction device to support the appropriate tasks users expect to perform and designers to develop the various device-specific application modules in a consistent manner.
Source: Interacting with computers 15 (2003): 347–364.

See at: CNR ExploRA


2003 Journal article Unknown
A simulation model for analyzing brain structures deformations
Di Bona S., Lutzemberger L., Salvetti O.
Recent developments of medical software applications, from the simulation to the planning of surgical operations, have revealed the need for modelling human tissues and organs not only from a geometric point of view but also from a physical one, i.e. soft tissues, rigid body, viscoelasticity, etc. This has given rise to the term 'deformable objects', which refers to objects with a morphology and a physical and mechanical behaviour of their own that reflect their natural properties. In this paper, we propose a model, based upon physical laws, suitable for the realistic manipulation of geometric reconstructions of volumetric data taken from MR and CT scans. In particular, a physically based model of the brain is presented that is able to simulate the evolution of intra-cranial pathological phenomena of different nature, such as haemorrhages, neoplasms, haematomas, etc., and to describe the consequences caused by their volume expansion and the influence they have on the anatomical and neuro-functional structures of the brain.
Source: Physics in medicine and biology (Print) 48 (2003): 4001–4002.

See at: CNR ExploRA


2003 Journal article Unknown
Conceptual and linguistic constraints for the construction of a knowledge base in archaeology
Cappelli A., Catarsi M. N., Michelassi P., Moretti L.
This article describes a set of constraints for the construction of a knowledge base in archaeology which satisfies explicit conceptual and linguistic assumptions. The model of organization has been specified by integrating a formalism derived from structured inheritance networks with elements of conceptual dictionaries. Certain types of constraints have also been introduced to guarantee the coherence of the ontological modeling.
Source: Applied artificial intelligence 17 (2003): 835–858.

See at: CNR ExploRA


2003 Journal article Open Access OPEN
Brain volumes characterization using hierarchical neural networks
Di Bona S., Niemann H., Pieri G., Salvetti O.
Objective knowledge of tissue density distribution in CT/MRI brain datasets can be related to anatomical or neuro-functional regions for assessing pathologic conditions characterised by slight differences. The process of monitoring illness and its treatment could then be improved by suitable detection of these variations. In this paper, we present an approach for three-dimensional (3D) classification of brain tissue densities based on a hierarchical artificial neural network (ANN) able to classify the single voxels of the examined datasets. The method developed was tested on case studies selected by an expert neuro-radiologist and consisting of both normal and pathological conditions. The results obtained were submitted for validation to a group of physicians and they judged the system to be really effective in practical applications.
Source: Artificial intelligence in medicine (Print) 28 (2003): 307–322. doi:10.1016/S0933-3657(03)00061-7
DOI: 10.1016/s0933-3657(03)00061-7


See at: ISTI Repository Open Access | Artificial Intelligence in Medicine Restricted | www.sciencedirect.com Restricted | CNR ExploRA


2003 Journal article Restricted
D-Index: Distance Searching Index for Metric Data Sets
Dohnal V., Gennaro C., Savino P., Zezula P.
In order to speed up retrieval in large collections of data, index structures partition the data into subsets so that query requests can be evaluated without examining the entire collection. As the complexity of modern data types grows, metric spaces have become a popular paradigm for similarity retrieval. We propose a new index structure, called D-Index, that combines a novel clustering technique and the pivot-based distance searching strategy to speed up execution of similarity range and nearest neighbor queries for large files with objects stored in disk memories. We have qualitatively analyzed D-Index and verified its properties on actual implementation. We have also compared D-Index with other index structures and demonstrated its superiority on several real-life data sets. Contrary to tree organizations, the D-Index structure is suitable for dynamic environments with a high rate of delete/insert operations.
Source: Multimedia tools and applications 21 (2003): 9–33. doi:10.1023/A:1025026030880
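
As background to the pivot-based filtering mentioned above, the triangle inequality gives a cheap lower bound on the query-object distance: if |d(q,p) - d(o,p)| > r for some pivot p, object o cannot satisfy a range query of radius r. The sketch below is a generic illustration of that principle (the flat data layout is an assumption, not the actual D-Index organization):

```python
# Illustrative pivot filtering for a similarity range query in a metric space.
# Precomputed distances to pivots let many objects be discarded without
# computing their exact distance to the query; this is the general principle
# D-Index builds on, not its actual structure.

def range_query(query, radius, objects, pivots, dist, pivot_dists):
    """pivot_dists[i][j] = dist(objects[i], pivots[j]), computed at insertion time."""
    q_to_p = [dist(query, p) for p in pivots]
    results = []
    for i, obj in enumerate(objects):
        # lower bound: |d(q, p) - d(o, p)| <= d(q, o) for every pivot p
        lower = max(abs(qp - op) for qp, op in zip(q_to_p, pivot_dists[i]))
        if lower > radius:
            continue                      # pruned, no exact distance needed
        if dist(query, obj) <= radius:    # exact check only for survivors
            results.append(obj)
    return results
```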
DOI: 10.1023/a:1025026030880


See at: Multimedia Tools and Applications Restricted | CNR ExploRA


2003 Journal article Unknown
External memory management and simplification of huge meshes
Cignoni P., Montani C., Rocchini C., Scopigno R.
Very large triangle meshes, i.e. meshes composed of millions of faces, are becoming common in many applications. Obviously, processing, rendering, transmission and archival of these meshes are not simple tasks. Mesh simplification and LOD management are a rather mature technology that in many cases can efficiently manage complex data. But only a few available systems can manage meshes characterized by a huge size: RAM size is often a severe bottleneck. In this paper we present a data structure called Octree-based External Memory Mesh (OEMM). It supports external memory management of complex meshes, loading dynamically in main memory only the selected sections and preserving data consistency during local updates. The functionalities implemented on this data structure (simplification, detail preservation, mesh editing, visualization and inspection) can be applied to huge triangle meshes on low-cost PC platforms. The time overhead due to the external memory management is affordable. Results of the test of our system on complex meshes are presented.
Source: IEEE transactions on visualization and computer graphics 9 (2003): 525–537.
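
The external-memory idea can be pictured with a minimal sketch: keep only an index of octree leaves in RAM and load a leaf's triangles from disk on demand, evicting the least recently used leaf when a memory budget is exceeded. Everything below (file layout, class names, the pickle format, the LRU policy) is an assumption made for illustration, not the OEMM design:

```python
# Minimal sketch of lazily loading per-leaf mesh chunks (illustrative only).
import pickle
from collections import OrderedDict

class LazyLeafStore:
    def __init__(self, leaf_paths, max_loaded=8):
        self.leaf_paths = leaf_paths          # leaf_id -> path of its serialized triangles
        self.loaded = OrderedDict()           # leaf_id -> triangle list, kept in LRU order
        self.max_loaded = max_loaded          # RAM budget expressed as number of leaves

    def triangles(self, leaf_id):
        if leaf_id in self.loaded:
            self.loaded.move_to_end(leaf_id)  # mark as recently used
            return self.loaded[leaf_id]
        with open(self.leaf_paths[leaf_id], "rb") as f:
            tris = pickle.load(f)             # load only the selected section from disk
        self.loaded[leaf_id] = tris
        if len(self.loaded) > self.max_loaded:
            self.loaded.popitem(last=False)   # evict the least recently used leaf
        return tris
```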

See at: CNR ExploRA


2003 Journal article Unknown
Maximizing single connection TCP goodput by trading bandwidth for BER
Celandroni N., Potortì F.
All other conditions being equal, the end-to-end throughput of a TCP connection depends on the packet loss rate at the IP level. This is an issue when IP runs on a wireless link, where the bit error rate is variable and typically much higher than it is on fixed links. Especially on physical links where the bandwidth-delay product is high, TCP performance is significantly impaired by apparently low values of the bit error rate. Generally speaking, on a wireless link bandwidth can be traded for information quality (error rate), the simplest method being to change the type or parameters of forward error correction. On this basis, we show a general method of taking advantage of this trade-off in order to maximize the throughput of a TCP connection.
Source: International journal of communication systems (Print) 16 (2003): 63–79.
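
A rough way to see the bandwidth-for-BER trade-off is to combine a packet loss probability derived from the residual bit error rate with the well-known Mathis approximation of TCP throughput, throughput ~ (MSS/RTT)*sqrt(3/2)/sqrt(p), capped by the capacity left after FEC. The link numbers and the simple exponential BER-versus-code-rate model below are made-up assumptions, intended only to show the shape of the optimization, not the paper's link model:

```python
# Illustrative only: which FEC code rate maximizes TCP goodput on a noisy link?
import math

MSS_BITS = 1500 * 8          # packet size in bits (assumption)
RTT = 0.55                   # round-trip time in seconds, e.g. a GEO hop (assumption)
LINK_RATE = 2e6              # raw link bit rate, bit/s (assumption)

def residual_ber(code_rate):
    # Made-up monotonic model: stronger coding (lower rate) -> lower residual BER.
    return 1e-3 * math.exp(-40 * (1 - code_rate))

def goodput(code_rate):
    ber = residual_ber(code_rate)
    p = max(1 - (1 - ber) ** MSS_BITS, 1e-12)       # packet loss probability
    mathis = (MSS_BITS / RTT) * math.sqrt(1.5 / p)  # TCP throughput bound, bit/s
    return min(mathis, LINK_RATE * code_rate)       # cannot exceed coded link capacity

rates = [r / 100 for r in range(50, 100)]
best = max(rates, key=goodput)
print(f"best code rate ~{best:.2f}, goodput ~{goodput(best) / 1e3:.0f} kbit/s")
```

With these toy numbers the optimum sits at an intermediate code rate: spending some bandwidth on FEC lowers the loss rate enough that TCP more than recovers the capacity given up.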

See at: CNR ExploRA


2003 Journal article Restricted
Interiors of Small Bodies: Foundations and Perspectives
Binzel R. P., A'Hearn M., Asphaug E., Barucci M. A., Belton M., Benz W., Cellino A., Festou M., Fulchignoni M., Harris A. W., Rossi A., Zuber M.
With the surface properties and shapes of solar system small bodies (comets and asteroids) now being routinely revealed by spacecraft and Earth-based radar, understanding their interior structure represents the next frontier in our exploration of these worlds. Principal unknowns include the complex interactions between material strength and gravity in environments that are dominated by collisions and thermal processes. Our purpose for this review is to use the foundations for our current knowledge of small body interiors to define the science questions which motivate their continued study: In which bodies do 'planetary' processes occur? Which bodies are 'accretion survivors', i.e. bodies whose current form and internal structure are not substantially altered from the time of formation? At what characteristic sizes are we most likely to find rubble piles, substantially fractured (but not reorganized) interiors, and intact monolith-like bodies? We also seek to describe the prospects and requirements for answering these questions on a timescale of a decade or more. We note the motivation for finding these answers is both scientific and pragmatic, as understanding the interior properties of small bodies is essential for considering impact mitigation.
Source: Planetary and space science 51 (2003): 443–454. doi:10.1016/S0032-0633(03)00051-5
DOI: 10.1016/s0032-0633(03)00051-5


See at: Planetary and Space Science Restricted | HAL-UPMC Restricted | CNR ExploRA


2003 Journal article Restricted
Monte Carlo Markov chain techniques for unsupervised MRF-based image denoising
Tonazzini A., Bedini L.
This paper deals with discontinuity-adaptive smoothing for recovering degraded images, when Markov random field models with explicit lines are used, but no a priori information about the free parameters of the related Gibbs distributions is available. The adopted approach is based on the maximization of the posterior distribution with respect to the line field and the Gibbs parameters, while the intensity field is assumed to be clamped to the maximizer of the posterior itself, conditioned on the lines and the parameters. This enables the application of a mixed-annealing algorithm for the maximum a posteriori (MAP) estimation of the image field, and of Markov chain Monte Carlo techniques, over binary variables only, for the simultaneous maximum likelihood estimation of the parameters. A practical procedure is then derived which is nearly as fast as a MAP image reconstruction by mixed-annealing with known Gibbs parameters. We derive the method for the general case of a linear degradation process plus superposition of additive noise, and experimentally validate it for the sub-case of image denoising.
Source: Pattern recognition letters 24 (2003): 55–64. doi:10.1016/S0167-8655(02)00188-5
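
The "Markov chain Monte Carlo over binary variables only" step can be pictured with a toy Metropolis sampler for a binary line field: propose flipping one line variable, compute the local energy change, and accept with the usual exponential rule. The energy function below is a deliberately simplified stand-in (a fixed line penalty versus a quadratic smoothness cost), stated as an assumption and not the posterior actually used in the paper:

```python
# Toy Metropolis sweep over a binary horizontal line field (illustrative assumption).
import numpy as np

rng = np.random.default_rng(0)

def local_energy(img, lines, i, j, alpha=1.0, beta=0.5):
    # Energy of the line variable between pixels (i, j) and (i+1, j): pay `alpha`
    # for declaring a discontinuity, otherwise pay a quadratic smoothness cost.
    diff2 = (img[i + 1, j] - img[i, j]) ** 2
    return alpha if lines[i, j] else beta * diff2

def metropolis_sweep(img, lines, T=1.0):
    for i in range(lines.shape[0]):
        for j in range(lines.shape[1]):
            e_old = local_energy(img, lines, i, j)
            lines[i, j] ^= 1                       # propose flipping this line variable
            delta = local_energy(img, lines, i, j) - e_old
            if delta > 0 and rng.random() >= np.exp(-delta / T):
                lines[i, j] ^= 1                   # reject: flip back
    return lines

img = rng.normal(size=(16, 16))
lines = np.zeros((15, 16), dtype=int)
metropolis_sweep(img, lines)
```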
DOI: 10.1016/s0167-8655(02)00188-5


See at: Pattern Recognition Letters Restricted | www.sciencedirect.com Restricted | CNR ExploRA


2003 Journal article Restricted
On the estimates of the ring current injection and decay
Ballatore P., Gonzalez W. D.
In the context of space weather predictions, forecasting the ring current strength (and hence the Dst index) from upstream solar wind conditions is of specific interest for predicting the occurrence of geomagnetic storms. In the present paper, we have studied its two components separately: the Dst injection and decay. In particular, we have verified the validity of Burton's equation for estimating the ring current energy balance using the equatorial electric merging field instead of the original parameter VBs (V is the solar wind speed and Bs is the southward component of the Interplanetary Magnetic Field, IMF). Then, based on this equation, we have used the phase-space method to determine the best-fit approximations for the ring current injection and decay as functions of the equatorial merging electric field (Em). Results indicate that the interplanetary injection is statistically higher than in previous estimations using VBs. Specifically, a weak but non-null ring current injection can be observed even during northward IMF, when previous studies considered it to be always zero. Moreover, results about the ring current decay indicate that the rate of Dst decay is faster than predicted using VBs. In addition, smaller quiet-time ring current and solar wind pressure corrections contribute to Dst estimates obtained with Em instead of VBs. These effects compensate each other, so that the statistical Dst predictions using the equatorial electric merging field or using VBs are about equivalent.
Source: Earth, planets and space 55 (2003): 427–435.
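
For reference, the energy-balance relation usually referred to as Burton's equation is conventionally written as

    dDst*/dt = Q(t) - Dst*/tau,        Dst* = Dst - b*sqrt(Pdyn) + c,

where Q is the injection term (originally parameterized through VBs), tau is the ring current decay time, and Dst* is the pressure-corrected index. This standard textbook form is quoted here only as background; the paper's contribution is re-deriving the injection and decay terms as functions of Em rather than VBs.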

See at: www.terrapub.co.jp Restricted | CNR ExploRA


2003 Journal article Open Access OPEN
Pc5 micropulsation power at conjugate high-latitude locations
Ballatore P.
The micropulsation power, integrated over the Pc5 frequency range, has been calculated for the horizontal component of the geomagnetic field at two high-latitude conjugate locations: Dumont D'Urville (corrected geomagnetic coordinates: 80.61°S, 235.76°E) and Mould Bay (corrected geomagnetic coordinates: 80.85°N, 272.65°E). Because of the different distances between the geographic and the geomagnetic poles in each hemisphere, the comparison between the Pc5 power observed at Dumont D'Urville and at Mould Bay shows the relative importance of geomagnetic and solar illumination effects in driving low-frequency micropulsation activity. In particular, similarities observed at the two sites can be explained in terms of their common geomagnetic characteristics, while differences can be attributed to the different sunlight or solar zenith angle configurations. Results show that the local summer Pc5 power is statistically higher in the northern hemisphere than in the southern one. This hemispherical difference is smaller for the local equinoxes, and it is only very slight or absent for local winters. These findings are interpreted in terms of the proportionality between the Pc5 power and the ionospheric conductance, which is higher at Mould Bay owing to more permanent and direct sunlight conditions during local summers and equinoxes. Thus the different geographic coordinates affect the Pc5 power at the two considered sites so much so that their effect is visible regardless of the geomagnetic similarities. However, the influence of the geomagnetic activity on Pc5 power is found to be more significant than these geographical effects or than the seasonal effects. In fact, for Kp 2 the difference in simultaneous observations at Mould Bay and at Dumont D'Urville is
Source: Journal of geophysical research. Space physics (Print) 108 (2003): 1–15. doi:10.1029/2002JA009600
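
The quantity "micropulsation power integrated over the Pc5 frequency range" amounts, in essence, to estimating a power spectral density of the horizontal field component and integrating it over roughly 1.7-6.7 mHz (the conventional Pc5 band, 150-600 s periods). The sketch below is a generic illustration with assumed sampling parameters, not the processing chain used in the paper:

```python
# Illustrative band-integrated Pc5 power from a magnetometer time series.
import numpy as np
from scipy.signal import welch

def pc5_power(b_horizontal, fs=1.0, band=(1.7e-3, 6.7e-3)):
    """b_horizontal: horizontal field samples (nT); fs: sampling rate in Hz (assumed 1 Hz)."""
    freqs, psd = welch(b_horizontal, fs=fs, nperseg=3600)   # PSD in nT^2/Hz
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[mask], freqs[mask])                 # integrated band power, nT^2

# Example with synthetic data: a 5 mHz wave buried in noise.
t = np.arange(0, 6 * 3600, 1.0)
series = 2.0 * np.sin(2 * np.pi * 5e-3 * t) + np.random.default_rng(1).normal(0, 1, t.size)
print(pc5_power(series))
```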
DOI: 10.1029/2002ja009600


See at: Journal of Geophysical Research Atmospheres Open Access | Journal of Geophysical Research Atmospheres Restricted | www.agu.org Restricted | CNR ExploRA


2003 Journal article Unknown
Performance Evaluation of Atmospheric Density Models for Satellite Reentry Predictions with High Solar Activity Levels
Pardini C., Anselmo L.
In order to estimate the intrinsic accuracy of satellite reentry predictions, the residual lifetimes of eleven spacecraft and five rocket bodies, covering a broad range of inclinations and decaying from orbit in a period of high solar activity, were determined using three different atmospheric density models: JR-71, TD-88 and MSIS-86. For each object, the ballistic coefficient applicable to a specific phase of the flight was obtained by fitting an appropriate set of two-line orbital elements, while the reentry predictions were computed approximately one month, one week and one day before the final orbital decay. No clear correlation between the residual lifetime errors and the satellite inclination or type (spacecraft or rocket body) emerged. JR-71 and MSIS-86 were in good agreement, with comparable reentry prediction errors (~10%), semimajor axis residuals and ballistic coefficient estimations. TD-88 exhibited a behaviour consistent with the other two models, but was typically characterised by larger reentry prediction errors (~15-25%) and semimajor axis residuals. At low altitudes ( 250 km) TD-88 systematically overestimated the average atmosphere density (by ~25%) with respect to the other two models.
Source: Transactions of the Japan Society for Aeronautical and Space Sciences 46 (2003): 42–46.

See at: CNR ExploRA


2003 Journal article Unknown
Preliminary results on the foraging ecology of Balearic shearwaters (Puffinus mauretanicus) from bird-borne data loggers
Aguilar J. S., Benvenuti S., Dall'Antonia L., Mcminn-Grivè M., Mayol-Serra J.
A data logger devised and manufactured by our research team in order to study the homing routes of carrier pigeons was subsequently modified to study the homing and foraging strategies of breeding marine birds. Recent versions of the data logger, equipped with a flight sensor and depth meter or saltwater switch, were used in a study of the foraging strategies of chick-rearing Balearic shearwaters (Puffinus mauretanicus) in the framework of the project LIFE-Puffinus, financed by the Balearic Government and the EU. Due to low recapture rates (only 3 out of 6 tagged birds were recovered), only preliminary data from a small sample are available. Data loggers have recorded data on the pattern of nest attendance (including departure time to foraging trips and return time) and the diurnal pattern of flight and dive activity (including depth and duration of dives). Despite the small sample size, the results show that our data loggers can successfully be applied to the study of the breeding biology and foraging ecology, including the diving pattern, of Balearic shearwaters and similar species.
Source: Scientia marina 67 (2003): 129–134.

See at: CNR ExploRA


2003 Journal article Unknown
Skewed alpha-stable distributions for modelling textures
Kuruoglu E. E., Zerubia J.
In this letter, we introduce a novel family of texture models which provide alternatives to texture models based on Gaussian distributions. In particular, we introduce linear textures generated with a member of the alpha-stable distribution family, which is a generalisation of the Gaussian distribution. The new family of texture models is capable of representing both impulsive and unsymmetric (skewed) image data which cannot be accommodated by the Gaussian model. We present new techniques for texture model estimation and we demonstrate the success of the techniques on synthetic data.
Source: Pattern recognition letters 24 (2003): 339–348.
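
For context, a general alpha-stable distribution has no closed-form density and is defined through its characteristic function; in the common parameterization (quoted here as standard background, with the special case alpha = 1 omitted for brevity):

    phi(t) = exp{ j*delta*t - |gamma*t|^alpha * [1 - j*beta*sign(t)*tan(pi*alpha/2)] },    0 < alpha <= 2,

where alpha controls tail heaviness (alpha = 2 recovers the Gaussian), beta in [-1, 1] the skewness, gamma the scale and delta the location. This is what lets the model capture both impulsive and skewed data that a Gaussian cannot.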

See at: CNR ExploRA


2003 Journal article Restricted
Region proximity in metric spaces and its use for approximate similarity search
Amato G., Rabitti F., Savino P., Zezula P.
Similarity search structures for metric data typically bound object partitions by ball regions. Since regions can overlap, a relevant issue is to estimate the proximity of regions in order to predict the number of objects in the regions' intersection. The paper analyzes the problem by using a probabilistic approach and provides a solution that effectively computes the proximity through realistic heuristics that only require small amounts of auxiliary data. An extensive simulation to validate the technique is provided. An application is then developed to demonstrate how the proximity measure can be successfully applied to the approximate similarity search. Search speedup is achieved by ignoring data regions whose proximity with the query region is smaller than a user-defined threshold. This idea is implemented in a metric tree environment for the similarity range and nearest neighbors queries. Several measures of efficiency and effectiveness are applied to evaluate proposed approximate search algorithms on real-life data sets. Improvements of two orders of magnitude are achieved for moderately approximated search results. We demonstrate that the precision of proximity measures can significantly influence the quality of approximated algorithms.
Source: ACM transactions on information systems 21 (2003): 192–227. doi:10.1145/763693.763696
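
The approximation idea (skip data regions whose proximity to the query region falls below a threshold) can be sketched generically for ball regions. The proximity surrogate used below, based on how much two balls overlap relative to the distance between their centers, is a made-up stand-in for the paper's probabilistic measure:

```python
# Generic sketch of threshold-based pruning of ball regions during approximate search.
# `proximity` here is an illustrative surrogate, NOT the probabilistic measure
# proposed in the paper.

def proximity(center_a, radius_a, center_b, radius_b, dist):
    d = dist(center_a, center_b)
    if d == 0:
        return 1.0
    overlap = radius_a + radius_b - d          # > 0 when the balls intersect
    return max(0.0, min(1.0, overlap / d))

def approximate_range_search(query, radius, regions, dist, threshold):
    """regions: list of (center, region_radius, objects); threshold in [0, 1]."""
    results = []
    for center, r, objects in regions:
        if proximity(query, radius, center, r, dist) < threshold:
            continue                            # approximation: ignore unpromising region
        results.extend(o for o in objects if dist(query, o) <= radius)
    return results
```

Raising the threshold trades recall for speed, which is the knob the paper evaluates with its efficiency and effectiveness measures.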
DOI: 10.1145/763693.763696


See at: dl.acm.org Restricted | ACM Transactions on Information Systems Restricted | CNR ExploRA


2003 Journal article Restricted
Reply to Comment on Effects of fast and slow solar wind on the correlations between interplanetary medium and geomagnetic activity
Ballatore P.
The commented paper (B02) shows that the statistical significance of the correlations between the interplanetary parameters and the geomagnetic indices (Kp or Dst) is generally lower during the fastest solar wind. On the other hand, during these fast solar wind periods, the significance of the Kp vs. Dst correlation is equal to or higher than during slower solar wind. These results, together with further observations related to substorm periods and with previously published findings, are interpreted in terms of a difference in the interplanetary-magnetospheric coupling for solar wind faster or slower than a certain threshold (identified between about 500 and 600 km/s). Specifically, it is suggested that a linear approximation of the geomagnetic-interplanetary coupling is more appropriate for solar wind speeds (Vsw) below this threshold, with non-linear processes becoming more dominant at the fastest speeds. This Reply highlights that the correlation coefficients shown by Wang and Chao are in agreement with these findings. In addition, Wang and Chao show that the statistical significance of the difference between the correlation coefficients for Vsw ≥ 550 km/s and those for Vsw < 550 km/s would indicate that the interplanetary-geomagnetic correlations during the fastest speeds are not significantly different from those at slower Vsw ranges. Here we give evidence that, according to the common definition of this parameter, the calculation of the significance of the difference between two correlation coefficients made by Wang and Chao is wrong. Moreover, Wang and Chao re-calculate the correlations between the interplanetary parameters and ΔDst, instead of Dst, since they note that the time derivative of this index (not the index itself) is driven by the interplanetary medium. Here we note that, on the contrary, they show that the correlation coefficients between interplanetary parameters and Dst are larger than those obtained using ΔDst, and we suggest a possible interpretation.
Source: Journal of geophysical research. Space physics (Online) 108 (2003): 1387–1395.
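
For readers wanting the "common definition" of the significance of a difference between two correlation coefficients, the standard approach uses Fisher's z-transform of each coefficient; the formulas below are quoted as the textbook test, not as the specific computation debated in this Reply:

    z_i = arctanh(r_i) = (1/2) ln[(1 + r_i) / (1 - r_i)],    i = 1, 2,
    Z = (z_1 - z_2) / sqrt( 1/(n_1 - 3) + 1/(n_2 - 3) ),

where n_1 and n_2 are the sample sizes of the two correlations; under the null hypothesis of equal correlations, Z is approximately standard normal.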

See at: www.agu.org Restricted | CNR ExploRA


2003 Journal article Unknown
SAR image denoising via Bayesian wavelet shrinkage based on heavy-tailed modeling
Achim A., Tsakalides P., Bezerianos A.
Synthetic aperture radar (SAR) images are inherently affected by multiplicative speckle noise, which is due to the coherent nature of the scattering phenomenon. This paper proposes a novel Bayesian-based algorithm within the framework of wavelet analysis, which reduces speckle in SAR images while preserving the structural features and textural information of the scene. First, we show that the subband decompositions of logarithmically transformed SAR images are accurately modeled by alpha-stable distributions, a family of heavy-tailed densities. Consequently, we exploit this a priori information by designing a maximum a posteriori (MAP) estimator. We use the alpha-stable model to develop a blind speckle-suppression processor that performs a non-linear operation on the data and we relate this non-linearity to the degree of non-Gaussianity of the data. Finally, we compare our proposed method to current state-of-the-art soft thresholding techniques applied on real SAR imagery and we quantify the achieved performance improvement.
Source: IEEE transactions on geoscience and remote sensing 41 (2003): 1773–1784.
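
The role of the logarithmic transform mentioned above can be stated in one line: if the observed intensity is the product of the scene reflectance and the speckle, taking the log turns the multiplicative noise into an additive term, so wavelet-domain additive-noise estimators become applicable:

    I = R * F    (observed intensity = reflectance x speckle),
    log I = log R + log F    (multiplicative noise becomes additive).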

See at: CNR ExploRA


2003 Journal article Unknown
Source separation in astrophysical maps using independent factor analysis
Kuruoglu E. E., Bedini L., Paratore M. T., Salerno E., Tonazzini A.
A microwave sky map results from a combination of signals from various astrophysical sources, such as cosmic microwave background radiation, synchrotron radiation and galactic dust radiation. To derive information about these sources, one needs to separate them from the measured maps on different frequency channels. Our insufficient knowledge of the weights to be given to the individual signals at different frequencies makes this a difficult task. Recent work on the problem led to only limited success due to ignoring the noise and to the lack of a suitable statistical model for the sources. In this paper, we derive the statistical distribution of some source realizations, and check the appropriateness of a Gaussian mixture model for them. A source separation technique, namely independent factor analysis, has been suggested recently in the literature for Gaussian mixture sources in the presence of noise. This technique employs a three-layered neural network architecture which allows a simple, hierarchical treatment of the problem. We modify the algorithm proposed in the literature to accommodate space-varying noise and test its performance on simulated astrophysical maps. We also compare the performances of an expectation-maximization and a simulated annealing learning algorithm in estimating the mixing matrix and the source model parameters. The problem with expectation-maximization is that it does not ensure global optimization, and thus the choice of the starting point is a critical task. Indeed, we did not succeed in reaching good solutions with random initializations of the algorithm. Conversely, our experiments with simulated annealing yielded initialization-independent results. The mixing matrix and the means and coefficients in the source model were estimated with good accuracy, while some of the variances of the components in the mixture model were not estimated satisfactorily.
Source: Neural networks 16 (2003): 479–491.
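
In compact form, the independent factor analysis model behind this approach writes each observed pixel vector as a noisy linear mixture of independent sources, each modeled by a Gaussian mixture; the notation below is generic background, with the space-varying noise term n_p being the modification discussed in the abstract:

    x_p = A s_p + n_p,        p(s_j) = sum_k w_jk * N(s_j; mu_jk, sigma_jk^2),

where x_p collects the channel maps at pixel p, A is the mixing matrix whose entries weight each source at each observation frequency, s_p are the source signals and n_p is the (here space-varying) noise.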

See at: CNR ExploRA