70 result(s)
2006 Article Unknown

An integrated infrared-visible system for fire detection
Pieri G., Benvenuti M., De Michele P., Petri D., Salvetti O.
This paper describes the development of an information system for the automatic monitoring, detection and location of forest fires, using robotized stations equipped with combined infrared (IR) and visible cameras. The system is based on previously selected and studied algorithms: thermal and spatial information computed from the two imaging modalities is suitably fused, and real-time meteorological information together with previously stored morphological information is integrated and processed by a decisional component based on a fuzzy rule system, which gives the final response for an alert on an active fire.
Source: Forest ecology and management 234 (2006): S37.

See at: CNR People


2006 Article Unknown

SAR image filtering based on the heavy-tailed rayleigh model
Achim A., Kuruoglu E. E., Zerubia J.
Synthetic aperture radar (SAR) images are inherently affected by a signal-dependent noise known as speckle, which is due to the radar wave coherence. In this paper, we propose a novel adaptive despeckling filter and derive a maximum a posteriori (MAP) estimator for the radar cross section (RCS). We first employ a logarithmic transformation to change the multiplicative speckle into additive noise. We model the RCS using the recently introduced heavy-tailed Rayleigh density function, which was derived based on the assumption that the real and imaginary parts of the received complex signal are best described using the alpha-stable family of distributions. We estimate model parameters from noisy observations by means of second-kind statistics theory, which relies on the Mellin transform. Finally, we compare the proposed algorithm with several classical speckle filters applied on actual SAR images. Experimental results show that the homomorphic MAP filter based on the heavy-tailed Rayleigh prior for the RCS is among the best for speckle removal.
Source: IEEE transactions on image processing 15 (2006): 2686–2693. doi:10.1109/TIP.2006.877362
DOI: 10.1109/TIP.2006.877362

See at: DOI Resolver | ieeexplore.ieee.org | CNR People
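The homomorphic step described in the abstract above (a log transform turns the multiplicative speckle model into an additive one before filtering) can be sketched as follows. The simple local-mean smoother is a stand-in for the paper's heavy-tailed Rayleigh MAP estimator, and all names and parameter values are illustrative.

```python
import numpy as np

def despeckle_homomorphic(image, win=5):
    """Toy homomorphic despeckling: log transform, local-mean filter, exp back.

    The local mean only illustrates how the multiplicative speckle model
    I = R * n becomes additive in the log domain: log I = log R + log n;
    it is NOT the MAP estimator of the paper.
    """
    log_img = np.log(image + 1e-12)          # multiplicative -> additive noise
    pad = win // 2
    padded = np.pad(log_img, pad, mode="edge")
    smooth = np.zeros_like(log_img)
    h, w = log_img.shape
    for i in range(h):
        for j in range(w):
            smooth[i, j] = padded[i:i + win, j:j + win].mean()
    return np.exp(smooth)                    # back to the intensity domain

# usage: a constant radar cross section corrupted by synthetic speckle
rng = np.random.default_rng(0)
rcs = np.full((32, 32), 5.0)
speckled = rcs * rng.gamma(shape=4.0, scale=0.25, size=rcs.shape)  # 4-look speckle
restored = despeckle_homomorphic(speckled)
```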


2006 Article Unknown

HEARTFAID: a knowledge based platform for supporting the clinical management of elderly patients with heart failure
Conforti D., Costanzo D., Lagani V., Perticone F., Parati G., Kawecka-jaszcz K., Marsh A., Biniaris C., Stratakis M., Fontanelli R., Guerri D., Salvetti O., Tsiknakis M., Chiarugi F., Gamberger D., Valentini M.
Chronic heart failure is a major health problem in many developed countries, with strong social and economic effects due to its prevalence and morbidity. These effects occur particularly in the elderly, who have frequent hospital admissions and utilise significant medical resources. Studies and data have demonstrated that evidence-based heart failure management programs, utilising appropriate integration of inpatient and outpatient clinical services, have the potential to prevent and reduce hospital admissions, improve clinical status and reduce healthcare costs. HEARTFAID is a research and development project aimed at creating and validating an innovative knowledge-based platform to improve the early diagnosis and effective management of heart failure. The core of the platform is the formalisation of pre-existing clinical knowledge and the discovery of newly elicited knowledge. HEARTFAID has been designed to improve the processes of diagnosis, prognosis and therapy by providing the following services:
• electronic health records for easy and ubiquitous access to heterogeneous patient data;
• integrated services for healthcare professionals, including patient telemonitoring, signal and image processing, and alert and alarm systems;
• clinical decision support, based on pattern recognition in historical data, knowledge discovery analysis and inference from patients' clinical data.
Source: The journal on information technology in healthcare 4 (2006): 283–300.

See at: CNR People


2006 Article Unknown

Particle swarm optimization for the reconstruction of permittivity range profiles from microwave measurements
Genovesi S., Salerno E.
At the Signal and Images lab, ISTI-CNR, we are developing a new algorithm to reconstruct the permittivity range profile of a layered medium from microwave backscattering data. The algorithm is based on a particle swarm strategy to optimize a specific edge-preserving objective functional. Our technique is able to efficiently find the global optimum of the objective functional, while preserving the discontinuities in the reconstructed profile.
Source: ERCIM news 64 (2006): 42–43.

See at: CNR People
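The particle swarm strategy the abstract refers to can be sketched generically as below. The quadratic objective is only a stand-in for the paper's edge-preserving functional, and all parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, iters=200, seed=0):
    """Minimal global-best particle swarm optimizer.

    `f` stands in for the edge-preserving objective functional; this only
    demonstrates the swarm update rules, not the electromagnetic inversion.
    """
    rng = np.random.default_rng(seed)
    w, c1, c2 = 0.72, 1.49, 1.49            # commonly used inertia/attraction weights
    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_val = np.array([f(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()    # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        vals = np.array([f(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# usage: recover a 3-element "profile" by minimizing squared error to a target
target = np.array([1.0, 2.0, 3.0])
sol, val = pso_minimize(lambda p: ((p - target) ** 2).sum(), dim=3)
```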


2006 Article Unknown

Elements of the information technology of cytological specimen analysis: taxonomy and factor analysis
Gurevich I., Harazishvili D., Salvetti O., Trykova A., Vorob'Ev I.
The automated software system …
Source: Pattern recognition and image analysis 16 (2006): 113–115.

See at: springerlink.metapress.com | CNR People


2006 Conference object Unknown

A method for detection of transient events in EEG signals
Righi M., Starita A., Barcaro U., Erimakis S., Micheloyannis S.
A method is described for the detection of EEG transient events characterized by a transient decrease in the correlation between homologous frequency-band components of different traces. It consists of the following steps: computation of the frequency-band components; computation of the normalized correlation; application of two thresholds (one for the recognition of an event, the other for the measure of the event's time length). The method also provides a classification of the detected events, and the results can be subjected to statistical analyses. An application to seizure-free EEGs of epileptic subjects is described.
Source: Biopattern Brain Workshop, pp. 15–16, Göteborg, 18-19/05/2006

See at: CNR People
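The two-threshold correlation-drop detection described in the abstract can be sketched as follows on two pre-filtered band components. Window length and threshold values are illustrative assumptions, not the paper's.

```python
import numpy as np

def detect_transients(x, y, win=64, detect_thr=0.3, extent_thr=0.6):
    """Flag events where the normalized correlation between two homologous
    band components drops below `detect_thr`; the looser `extent_thr`
    measures the event's time length, mirroring the two-threshold scheme.
    """
    n = min(len(x), len(y)) - win
    corr = np.empty(n)
    for t in range(n):
        a, b = x[t:t + win], y[t:t + win]
        a = a - a.mean()
        b = b - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12
        corr[t] = (a * b).sum() / denom      # normalized correlation per window
    events, t = [], 0
    while t < n:
        if corr[t] < detect_thr:             # threshold 1: event recognised
            start = t
            while t < n and corr[t] < extent_thr:  # threshold 2: event extent
                t += 1
            events.append((start, t))
        else:
            t += 1
    return corr, events

# usage: two matched sine traces, with a decorrelated burst in one of them
t = np.arange(1500)
base = np.sin(2 * np.pi * t / 100)
rng = np.random.default_rng(0)
x = base + 0.05 * rng.standard_normal(1500)
y = base + 0.05 * rng.standard_normal(1500)
y[1000:1200] = rng.standard_normal(200)      # transient decorrelation
corr, events = detect_transients(x, y)
```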


2006 Conference object Unknown

A minimax entropy method for blind separation of dependent components in astrophysical images
Caiafa C. F., Kuruoglu E. E., Proto A. N.
We develop a new technique for blind separation of potentially non-independent components in astrophysical images. Given a set of linearly mixed images, corresponding to different measurement channels, we estimate the original electromagnetic radiation sources in a blind fashion. Specifically, we investigate the separation of the cosmic microwave background (CMB), thermal dust and galactic synchrotron emissions without imposing any assumption on the mixing matrix. In our approach, we use the Gaussian and non-Gaussian features of astrophysical sources and assume that CMB-dust and CMB-synchrotron are uncorrelated pairs while dust and synchrotron are correlated, which is in agreement with theory. These assumptions allow us to develop an algorithm which associates the minimum entropy solutions with the non-Gaussian sources (thermal dust and galactic synchrotron emissions) and the maximum entropy solution with the only Gaussian source, the CMB. This method is more appropriate than ICA algorithms because independence between sources is not imposed, which is a more realistic assumption. We investigate and compare two specific measures associated with entropy: the Gaussianity Measure (GM) and the Shannon Entropy (SE). Finally, we present a complete set of separation examples using these two measures, validating our approach and showing that it performs better than the FastICA algorithm. The experimental results presented here were obtained on an image database that simulates the measurements expected from the instruments that will operate onboard ESA's Planck Surveyor satellite to measure the CMB anisotropies over the whole celestial sphere.
Source: Bayesian Inference and Maximum Entropy Methods In Science and Engineering, pp. 81, Paris, France, 08-13/07/2006

See at: aip.scitation.org | CNR People
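The idea of ranking candidate sources by how Gaussian they look, as in the abstract above, can be illustrated with a much cruder score than the paper's entropy-based Gaussianity Measure: the magnitude of excess kurtosis, which is zero for a Gaussian. This is a named stand-in, not the GM or SE measures of the paper, and the simulated sources are synthetic.

```python
import numpy as np

def gaussianity(x):
    """Crude Gaussianity score: |excess kurtosis| (approx. 0 for a Gaussian).

    A stand-in for the paper's entropy-based measures; it only illustrates
    ranking separated components by Gaussianity (CMB ~ most Gaussian,
    dust/synchrotron ~ least).
    """
    z = (x - x.mean()) / (x.std() + 1e-12)
    return abs((z ** 4).mean() - 3.0)

rng = np.random.default_rng(2)
cmb_like = rng.standard_normal(50000)   # Gaussian source (CMB-like)
dust_like = rng.laplace(size=50000)     # heavy-tailed, non-Gaussian source
```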


2006 Conference object Unknown

An integrated infrared-visible system for fire detection
Pieri G., Benvenuti M., De Michele P., Salvetti O.
This paper describes the development of an information system for the automatic monitoring and detection of forest fires, using combined infrared and visible cameras. The proposed system is based on previously selected and studied algorithms.
Source: International Conference on Forest Fire Research, Figueira da Foz, Coimbra - Por, 27-30/11/2006

See at: CNR People


2006 Conference object Unknown

Active video-surveillance based on stereo and infrared imaging
Pieri G., Salvetti O.
Video-surveillance is a pressing and critical issue at the present time. Within this topic, we address the problem of first identifying moving people in a scene through motion detection techniques, and subsequently categorising them in order to identify humans and track their movements. The use of stereo cameras, coupled with infrared vision, makes it possible to apply this technique to images acquired under different and variable conditions, and allows an a priori filtering, based on the characteristics of such images, that gives evidence to objects emitting a higher radiance (i.e. a higher temperature).
Source: 14th European Signal Processing Conference, Florence, 04-08/09/2006

See at: CNR People
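The motion-detection stage mentioned in the abstract can be sketched with the simplest possible technique, absolute frame differencing; this is a deliberately minimal stand-in for the paper's pipeline, and the threshold is an illustrative assumption. An IR channel could be thresholded the same way to keep only warm (high-radiance) regions before tracking.

```python
import numpy as np

def moving_mask(prev_frame, frame, thr=25):
    """Per-pixel motion mask by absolute frame differencing.

    A toy stand-in for the motion-detection stage: pixels whose intensity
    changed by more than `thr` between frames are flagged as moving.
    """
    diff = np.abs(frame.astype(np.int32) - prev_frame.astype(np.int32))
    return diff > thr

# usage: a bright block "appears" between two frames
prev = np.zeros((120, 160), dtype=np.uint8)
cur = prev.copy()
cur[40:80, 60:90] = 200                    # hypothetical moving person
mask = moving_mask(prev, cur)
```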


2006 Conference object Unknown

Dependent component analysis as a tool for blind spectral unmixing of remote sensed images
Caiafa C. F., Salerno E., Proto A. N., Fiumi L.
In this work, we present a blind technique for the estimation of the per-pixel material abundances (endmembers) in hyperspectral remote-sensed images. Classical spectral unmixing techniques require knowledge of the existing materials and their spectra, which is a problem when no prior information is available. Some techniques based on independent component analysis proved not very effective, owing to the strong dependence among the material abundances always found in real data. We approach the problem of blind separation of endmembers by applying the MaxNG algorithm, which is capable of separating even significantly dependent signals. We also present a minimum-mean-squared-error method to estimate the unknown scale factors by exploiting the source constraint. The results shown here have been obtained from both synthetic and real data. The synthetic images were generated by a noisy linear mixture model with real, spatially variable endmember spectra; the real images were captured by the MIVIS airborne imaging spectrometer. Our results show that MaxNG is able to separate the endmembers successfully if a linear mixing model holds true, and under low-noise and reduced spectral-variability conditions.
Source: European Signal Processing Conference - EUSIPCO 2006, Florence, Italy, 04-08/09/2006

See at: CNR People
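The linear mixing model underlying the abstract can be made concrete with the classical, non-blind case it contrasts against: when the endmember spectra are known, abundances follow from per-pixel least squares. The MaxNG blind algorithm is not reproduced here, and all data below are synthetic.

```python
import numpy as np

# Linear mixing model: each pixel spectrum x = E @ a + noise, where the
# columns of E hold the endmember spectra and a the per-pixel abundances.
# This is the *classical* (non-blind) unmixing the abstract contrasts with.
rng = np.random.default_rng(1)
n_bands, n_endmembers, n_pixels = 20, 3, 100
E = rng.uniform(0.1, 1.0, (n_bands, n_endmembers))         # known spectra
A_true = rng.dirichlet(np.ones(n_endmembers), n_pixels).T  # abundances sum to 1
X = E @ A_true + 0.001 * rng.standard_normal((n_bands, n_pixels))

# Unconstrained least-squares abundance estimate, all pixels at once
A_hat, *_ = np.linalg.lstsq(E, X, rcond=None)
```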


2006 Conference object Unknown

Estimation of mixtures of skewed alpha stable processes with unknown number of components
Salas D., Kuruoglu E. E., Ruiz D. P.
Alpha-stable distributions are widely accepted models for impulsive data. Despite their flexibility in modelling varying degrees of impulsiveness and skewness, they fall short of modelling multimodal data. In this work, we present the alpha-stable mixture model, which provides a framework for modelling multimodal, skewed and impulsive data. We describe new parameter estimation techniques for this model based on numerical Bayesian methods, which can estimate not only the alpha-stable and mixture parameters but also the number of components in the mixture. In particular, we employ the reversible jump Markov chain Monte Carlo technique.
Source: European Signal Processing Conference, Florence, Italy, 04-08/09/2006

See at: CNR People


2006 Conference object Unknown

Estimation of mixtures of symmetric alpha stable processes with unknown number of components
Salas D., Kuruoglu E. E., Ruiz D. P.
In this work, we study the estimation of mixtures of symmetric α-stable distributions using Bayesian inference. We utilise numerical Bayesian sampling techniques such as Markov chain Monte Carlo (MCMC). Our estimation technique is capable of estimating not only the component parameters and mixing coefficients but also the number of α-stable components in the mixture, which is accomplished by the use of the Reversible Jump MCMC (RJMCMC) algorithm.
Source: Speech and Signal Processing, Toulouse, France, 14-19/05/2006

See at: CNR People
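The symmetric α-stable data such a mixture estimator targets can be simulated with the well-known Chambers-Mallows-Stuck method. This sketch generates the data only, not the RJMCMC estimator, and the parameter values are illustrative.

```python
import numpy as np

def sample_sas(alpha, size, rng):
    """Draw standard symmetric alpha-stable samples (Chambers-Mallows-Stuck).

    Simplified form valid for 0 < alpha <= 2 with alpha != 1: combine a
    uniform phase V on (-pi/2, pi/2) with a unit exponential W.
    """
    V = rng.uniform(-np.pi / 2, np.pi / 2, size)
    W = rng.exponential(1.0, size)
    return (np.sin(alpha * V) / np.cos(V) ** (1 / alpha)
            * (np.cos(V - alpha * V) / W) ** ((1 - alpha) / alpha))

rng = np.random.default_rng(0)
gauss_like = sample_sas(2.0, 200000, rng)   # alpha = 2 reduces to N(0, 2)
heavy = sample_sas(1.5, 200000, rng)        # heavy-tailed, infinite variance
```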


2006 Conference object Unknown

ISYREADET: un sistema integrato per il restauro virtuale
Console E., Burdin V., Legnaioli S., Palleschi V., Tassone R., Tonazzini A.
The Isyreadet project (Integrated System for Recovery and Archiving Degraded Texts), funded by the European Commission under the Fifth Framework Programme for Research, Technological Development and Demonstration (1998-2002), set out to build an integrated hardware and software system for the virtual restoration and archiving of damaged documents, using innovative methods and tools such as multispectral cameras and image-processing algorithms.
Source: IV Congresso Nazionale di Archeometria - Scienza e Beni Culturali, pp. 311, Pisa, 01-03/02/2006

See at: CNR People


2006 Conference object Unknown

Joint correction of cross-talk and peak spreading in DNA electropherograms
Tonazzini A., Bedini L.
In automated DNA sequencing, the final algorithmic phase, referred to as basecalling, consists of the translation of four time signals in the form of peak sequences (the electropherogram) into the corresponding sequence of bases. The most popular basecaller, Phred, detects the peaks based on heuristics and is very efficient when the peaks are well distinct and quite regular in spread, amplitude and spacing. Unfortunately, in practice the data are subject to several degradations, particularly near the end of the sequence; the most frequent are peak superposition, peak merging and signal leakage resulting in secondary peaks. In these conditions the experiment must be repeated and human intervention is required. Recently, there have been attempts to provide methodological foundations to the problem and to use statistical models to solve it. In this paper, we propose exploiting a priori information and Bayesian estimation to remove degradations and recover the signals in an impulsive form, which makes the task of basecalling straightforward.
Source: RECOMB 2006. The 10th Annual International Conference on Research in Computational Molecular Biology, Venice, 01-04/04/2006

See at: CNR People
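The cross-talk part of the degradation model in the title can be illustrated with a toy linear version: the recorded four-channel electropherogram mixes the ideal traces through a cross-talk matrix, so with the matrix known the correction is a matrix inversion. The paper estimates such degradations jointly in a Bayesian framework; this sketch, with synthetic traces and an assumed leakage matrix, shows only the cross-talk term.

```python
import numpy as np

# Toy cross-talk model between the four dye channels (A, C, G, T):
# observed traces Y = M @ S, where S holds the ideal impulsive traces and
# M is the (here assumed known) cross-talk matrix; correction inverts M.
rng = np.random.default_rng(3)
n = 500
S = np.zeros((4, n))                          # ideal impulsive traces
for ch in range(4):
    S[ch, rng.choice(n, 20, replace=False)] = rng.uniform(50, 100, 20)
M = np.eye(4) + 0.1 * rng.random((4, 4))      # mild channel leakage
Y = M @ S                                     # degraded electropherogram
S_hat = np.linalg.inv(M) @ Y                  # cross-talk corrected traces
```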


2006 Conference object Unknown

Shape comparison and deformation analysis in biomedical applications
Colantonio S., Moroni D., Salvetti O.
In this paper, we present a method for comparing shapes and analyzing deformations of 3D image objects. The method relies on an object model defined from multi-source 3D images. Typical applications of the proposed method include the analysis of deformable anatomical structures, useful for supporting medical diagnosis in routine clinical practice. In particular, preliminary aspects of a case study on heart dynamics are discussed.
Source: Eurographics Italian Chapter 2006, Catania, 2-22/02/2006

See at: CNR People


2006 Conference object Unknown

Statistical analysis of electrophoresis time series for improving basecalling in DNA sequencing
Tonazzini A., Bedini L.
In automated DNA sequencing, the final algorithmic phase, referred to as basecalling, consists of the translation of four time signals in the form of peak sequences (the electropherogram) into the corresponding sequence of bases. Commercial basecallers detect the peaks based on heuristics and are very efficient when the peaks are distinct and regular in spread, amplitude and spacing. Unfortunately, in practice the signals are subject to several degradations, among which peak superposition and peak merging are the most frequent. In these cases the experiment must be repeated and human intervention is required. Recently, there have been attempts to provide methodological foundations to the problem and to use statistical models for solving it. In this paper, we exploit a priori information and Bayesian estimation to remove degradations and recover the signals in an impulsive form that makes basecalling straightforward.
Source: ICDM 2006, Workshop on Mass-Data Analysis of Images and Signals in Medicine, Biotechnology and Chemistry MDA´2006, Leipzig, 13/07/2006

See at: CNR People


2006 Conference object Unknown

Support for the medical-clinical management of heart failure within elderly population: the HEARTFAID platform
Chiarugi F., Tsiknakis M., Conforti D., Lagani V., Perticone F., Parati G., Kawecka-jaszcz K., Marsh A., Stratakis M., Di Bona S., Fontanelli R., Guerri D., Salvetti O., Gamberger D., Valentini M.
Chronic heart failure, one of the most significant health problems in the western world due to its prevalence and morbidity, has a strong socioeconomic impact. Recent studies demonstrate that accurate heart failure (HF) management programs, based on coordinated inpatient and outpatient clinical procedures, might prevent and reduce hospital admissions, improving clinical outcomes and reducing costs.
Source: ITAB 2006, Ioannina, Greece, 26-28/10/2006

See at: CNR People


2006 Conference object Unknown

Technology for automated morphologic analysis of cytological slides. Methods and results
Gurevich I., Kharazishvili D., Murashov D., Salvetti O., Trykova A., Vlasova V., Vorobjev I.
An information technology for the automated morphologic analysis of cytological slides, taken from patients with lymphatic system tumors, was developed. The main components of the technology are: acquisition of cytological slides; a method for segmentation of nuclei in the cytological slides; synthesis of a feature-based description of the nuclei for subsequent classification; and nuclei image analysis based on pattern recognition and scale-space techniques. The experiments confirmed the efficiency of the developed technology, and the obtained results are discussed. The developed technology is implemented in a software system.
Source: International Conference on Pattern Recognition, ICPR 2006, Hong Kong, 20-24/08/2006

See at: CNR People


2006 Conference object Unknown

Bayesian inference on mixtures of stable densities
Kuruoglu E. E., Salas D., Ruiz D. P.
Stable distributions have attracted significant interest over the last decade in applications ranging from telecommunications to finance and from radar signal processing to biomedicine. This popularity is due to the fact that stable distributions provide a very flexible framework for modelling signals which exhibit impulsive behaviour that cannot be accommodated by the Gaussian distribution. In addition to being capable of modelling varying degrees of impulsiveness, they can also model skewed behaviour, which has been largely ignored. In addition to their empirical success, stable distributions have an important theoretical motivation: they are the outcome of a generalised version of the central limit theorem; moreover, they are generalisations of the Gaussian distribution and share attractive properties with it, such as the stability property. Despite this flexibility, stable distributions fall short of describing multimodal data, while many real-life data sets are multimodal, indicating contributions from different phenomena. Gaussian mixtures have been employed widely for modelling multimodal data with significant success, but for impulsive data there is still need for an alternative model. Moreover, although skewed data can be described with Gaussian mixtures, this comes at the expense of a large number of components. In this work we suggest stable mixture densities as an alternative which can model multimodal, impulsive and skewed data with a small number of components. We employ Bayesian inference, in particular Markov chain Monte Carlo techniques, for this task. The mixture weights are estimated using Gibbs sampling and the distribution parameters are estimated using Metropolis sampling. In addition to estimating the stable distribution parameters and mixing coefficients, the suggested technique is also capable of estimating the number of components, for which the reversible jump MCMC algorithm has been employed. Simulation studies demonstrate the success of the estimation technique, and we conclude that a very flexible modelling framework has been proposed in this work.
Source: Valencia International Meeting on Bayesian Statistics, Valencia, Spain, 1-6/06/2006

See at: CNR People


2006 Conference object Unknown

In pursuit of the big-bang signature: Bayesian separation of components in astrophysical radiation maps
Kuruoglu E. E.
In this work we present our research into the separation of astrophysical components using numerical Bayesian techniques. The work is motivated by the Planck satellite project. ESA's Planck satellite, which is to be launched in 2007, will provide 9 all-sky maps ranging in frequency from 30 GHz to 900 GHz, and in angular resolution from 30 to 4.5 arcminutes. Celestial microwave radiation is generated by various astronomical sources, and the measured signals are superpositions of the source signals, corrupted by measurement noise. Source signals include the cosmic microwave background (CMB), the thermal Galactic dust radiation, the synchrotron radiation (caused by the interaction of electrons with the magnetic field of the galaxy) and the free-free radiation (due to thermal bremsstrahlung from hot electrons accelerated by ions in the interstellar gas). Among these, the CMB is of paramount importance, since it is a relic radiation remaining from the first instant light was able to travel in the universe and therefore contains a picture of the very early universe. In addition, the measurement of the anisotropies in the CMB will place fundamental constraints on models for the evolution of large-scale structure in the universe. Each of the other source signals is also of interest in cosmology and astrophysics. Our goal is to reconstruct these signals. We first implement a Markov chain Monte Carlo (MCMC) algorithm to perform Bayesian source separation, with application to the separation of signals of different origin in sky radiation maps. The problem is formulated as the separation of an instantaneous linear mixing. Since MCMC methods provide samples from the full posterior distribution, one can easily infer other functions of the parameters and their uncertainties. The great flexibility of the sampling approach allows us to make appropriate modelling choices for our problem. In particular, we have used a Gibbs sampling scheme and have adopted a Gaussian mixture model for the sources. We also note that antenna noise is Gaussian but non-stationary, with a different but known variance at each pixel. To accommodate the non-stationarity in the noise and the signals, we then extend the work to sequential Monte Carlo techniques. Particle filtering gives significantly better results, which will be presented at the conference.
Source: Valencia International Meetings on Bayesian Statistics, Valencia, Spain, 01-06/06/2006

See at: CNR People | www.hereuare.com