2017
Journal article  Open Access

Mirror Mirror on the Wall ... An Unobtrusive Intelligent Multisensory Mirror for Well-Being Status Self-Assessment and Visualization

Henriquez P., Matuszewski B. J., Andreu-Cabedo Y., Bastiani L., Colantonio S., Coppini G., D'Acunto M., Favilla R., Germanese D., Giorgi D., Marraccini P., Martinelli M., Morales M. A., Pascali M. A., Righi M., Salvetti O., Larsson M., Stromberg T., Randeberg L., Bjorgan A., Giannakakis G., Pediaditis M., Chiarugi F., Christinaki E., Marias K., Tsiknakis M.

Keywords: psychosomatic status recognition, multispectral imaging, breath analysis, cardio-metabolic risk, signal processing, multimodal data integration, 3D face detection and tracking, unobtrusive well-being monitoring, 3D morphometric analysis, Electrical and Electronic Engineering, Computer Science Applications, Media Technology. Classification codes: I113, I140, I150, I440, I460, B990.

A person's well-being status is reflected by their face through a combination of facial expressions and physical signs. The SEMEOTICONS project translates the semeiotic code of the human face into measurements and computational descriptors that are automatically extracted from images, videos, and three-dimensional scans of the face. SEMEOTICONS developed a multisensory platform in the form of a smart mirror to identify signs related to cardio-metabolic risk. The aim was to enable users to self-monitor their well-being status over time and guide them to improve their lifestyle. Significant scientific and technological challenges have been addressed to build the multisensory mirror, from touchless data acquisition, to real-time processing and integration of multimodal data.

Source: IEEE Transactions on Multimedia 19 (2017): 1467–1481. doi:10.1109/TMM.2017.2666545

Publisher: Institute of Electrical and Electronics Engineers, Piscataway, NJ, United States of America


BibTeX entry
@article{oai:it.cnr:prodotti:376949,
	title = {Mirror Mirror on the Wall ... An Unobtrusive Intelligent Multisensory Mirror for Well-Being Status Self-Assessment and Visualization},
	author = {Henriquez P. and Matuszewski B. J. and Andreu-Cabedo Y. and Bastiani L. and Colantonio S. and Coppini G. and D'Acunto M. and Favilla R. and Germanese D. and Giorgi D. and Marraccini P. and Martinelli M. and Morales M. A. and Pascali M. A. and Righi M. and Salvetti O. and Larsson M. and Stromberg T. and Randeberg L. and Bjorgan A. and Giannakakis G. and Pediaditis M. and Chiarugi F. and Christinaki E. and Marias K. and Tsiknakis M.},
	publisher = {Institute of Electrical and Electronics Engineers, Piscataway, NJ, United States of America},
	doi = {10.1109/tmm.2017.2666545},
	journal = {IEEE Transactions on Multimedia},
	volume = {19},
	pages = {1467–1481},
	year = {2017}
}

SEMEOTICONS
SEMEiotic Oriented Technology for Individual’s CardiOmetabolic risk self-assessmeNt and Self-monitoring

