Link to original content: https://api.crossref.org/works/10.3390/S22041493
{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2024,9,19]],"date-time":"2024-09-19T16:18:31Z","timestamp":1726762711251},"reference-count":68,"publisher":"MDPI AG","issue":"4","license":[{"start":{"date-parts":[[2022,2,15]],"date-time":"2022-02-15T00:00:00Z","timestamp":1644883200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"National Institute of Health","award":["Grant No: R56 DK113819 and R01 DK127310","Cooperative Agreement 58-3092-5-001"]},{"DOI":"10.13039\/100000865","name":"Bill & Melinda Gates Foundation","doi-asserted-by":"publisher","award":["Contract ID: OPP1171395)"],"id":[{"id":"10.13039\/100000865","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"Knowing the amounts of energy and nutrients in an individual\u2019s diet is important for maintaining health and preventing chronic diseases. As electronic and AI technologies advance rapidly, dietary assessment can now be performed using food images obtained from a smartphone or a wearable device. One of the challenges in this approach is to computationally measure the volume of food in a bowl from an image. This problem has not been studied systematically despite the bowl being the most utilized food container in many parts of the world, especially in Asia and Africa. In this paper, we present a new method to measure the size and shape of a bowl by adhering a paper ruler centrally across the bottom and sides of the bowl and then taking an image. When observed from the image, the distortions in the width of the paper ruler and the spacings between ruler markers completely encode the size and shape of the bowl. A computational algorithm is developed to reconstruct the three-dimensional bowl interior using the observed distortions. Our experiments using nine bowls, colored liquids, and amorphous foods demonstrate high accuracy of our method for food volume estimation involving round bowls as containers. A total of 228 images of amorphous foods were also used in a comparative experiment between our algorithm and an independent human estimator. 
The results showed that our algorithm outperformed the human estimator who utilized different types of reference information and two estimation methods, including direct volume estimation and indirect estimation through the fullness of the bowl.","DOI":"10.3390\/s22041493","type":"journal-article","created":{"date-parts":[[2022,2,16]],"date-time":"2022-02-16T03:44:47Z","timestamp":1644983087000},"page":"1493","source":"Crossref","is-referenced-by-count":9,"title":["A Novel Approach to Dining Bowl Reconstruction for Image-Based Food Volume Estimation"],"prefix":"10.3390","volume":"22","author":[{"given":"Wenyan","family":"Jia","sequence":"first","affiliation":[{"name":"Department of Electrical and Computer Engineering, University of Pittsburgh, Pittsburgh, PA 15260, USA"}]},{"given":"Yiqiu","family":"Ren","sequence":"additional","affiliation":[{"name":"Department of Electrical and Computer Engineering, University of Pittsburgh, Pittsburgh, PA 15260, USA"}]},{"given":"Boyang","family":"Li","sequence":"additional","affiliation":[{"name":"Department of Electrical and Computer Engineering, University of Pittsburgh, Pittsburgh, PA 15260, USA"}]},{"ORCID":"http:\/\/orcid.org\/0000-0002-7770-5012","authenticated-orcid":false,"given":"Britney","family":"Beatrice","sequence":"additional","affiliation":[{"name":"School of Health and Rehabilitation Sciences, University of Pittsburgh, Pittsburgh, PA 15260, USA"}]},{"given":"Jingda","family":"Que","sequence":"additional","affiliation":[{"name":"Department of Electrical and Computer Engineering, University of Pittsburgh, Pittsburgh, PA 15260, USA"}]},{"given":"Shunxin","family":"Cao","sequence":"additional","affiliation":[{"name":"Department of Electrical and Computer Engineering, University of Pittsburgh, Pittsburgh, PA 15260, USA"}]},{"given":"Zekun","family":"Wu","sequence":"additional","affiliation":[{"name":"Department of Electrical and Computer Engineering, University of Pittsburgh, Pittsburgh, PA 15260, USA"}]},{"given":"Zhi-Hong","family":"Mao","sequence":"additional","affiliation":[{"name":"Department of Electrical and Computer Engineering, University of Pittsburgh, Pittsburgh, PA 15260, USA"},{"name":"Department of Bioengineering, University of Pittsburgh, Pittsburgh, PA 15260, USA"}]},{"ORCID":"http:\/\/orcid.org\/0000-0002-5080-108X","authenticated-orcid":false,"given":"Benny","family":"Lo","sequence":"additional","affiliation":[{"name":"Hamlyn Centre, Imperial College London, London SW7 2AZ, UK"}]},{"ORCID":"http:\/\/orcid.org\/0000-0001-7048-8337","authenticated-orcid":false,"given":"Alex K.","family":"Anderson","sequence":"additional","affiliation":[{"name":"Department of Nutritional Sciences, University of Georgia, Athens, GA 30602, USA"}]},{"given":"Gary","family":"Frost","sequence":"additional","affiliation":[{"name":"Section for Nutrition Research, Department of Metabolism, Digestion and Reproduction, Imperial College London, London SW7 2AZ, UK"}]},{"ORCID":"http:\/\/orcid.org\/0000-0002-5024-1465","authenticated-orcid":false,"given":"Megan A.","family":"McCrory","sequence":"additional","affiliation":[{"name":"Department of Health Sciences, Boston University, Boston, MA 02210, USA"}]},{"ORCID":"http:\/\/orcid.org\/0000-0001-7792-4234","authenticated-orcid":false,"given":"Edward","family":"Sazonov","sequence":"additional","affiliation":[{"name":"Department of Electrical and Computer Engineering, University of Alabama, Tuscaloosa, AL 35487, 
USA"}]},{"given":"Matilda","family":"Steiner-Asiedu","sequence":"additional","affiliation":[{"name":"Department of Nutrition and Food Science, University of Ghana, Legon Boundary, Accra LG 1181, Ghana"}]},{"given":"Tom","family":"Baranowski","sequence":"additional","affiliation":[{"name":"USDA\/ARS Children\u2019s Nutrition Research Center, Department of Pediatrics, Baylor College of Medicine, Houston, TX 77030, USA"}]},{"given":"Lora E.","family":"Burke","sequence":"additional","affiliation":[{"name":"School of Nursing, University of Pittsburgh, Pittsburgh, PA 15260, USA"}]},{"given":"Mingui","family":"Sun","sequence":"additional","affiliation":[{"name":"Department of Electrical and Computer Engineering, University of Pittsburgh, Pittsburgh, PA 15260, USA"},{"name":"Department of Bioengineering, University of Pittsburgh, Pittsburgh, PA 15260, USA"},{"name":"Department of Neurosurgery, University of Pittsburgh, Pittsburgh, PA 15260, USA"}]}],"member":"1968","published-online":{"date-parts":[[2022,2,15]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","unstructured":"Madival, S.A., and Jawaligi, S.S. (2020, January 3\u20135). A comprehensive review and open issues on food image analysis and dietary assessment. Proceedings of the 2020 3rd International Conference on Intelligent Sustainable Systems (ICISS), Thoothukudi, India.","DOI":"10.1109\/ICISS49785.2020.9315940"},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"38","DOI":"10.1038\/s41746-020-0246-2","article-title":"Automatic, wearable-based, in-field eating detection approaches for public health research: A scoping review","volume":"3","author":"Bell","year":"2020","journal-title":"NPJ Digit. Med."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"nzaa020","DOI":"10.1093\/cdn\/nzaa020","article-title":"Development and validation of an objective, passive dietary assessment method for estimating food and nutrient intake in households in low- and middle-income countries: A study protocol","volume":"4","author":"Jobarteh","year":"2020","journal-title":"Curr. Dev. Nutr."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"49653","DOI":"10.1109\/ACCESS.2019.2910308","article-title":"A systematic review of technology-driven methodologies for estimation of energy intake","volume":"7","author":"Doulah","year":"2019","journal-title":"IEEE Access"},{"key":"ref_5","doi-asserted-by":"crossref","unstructured":"Vu, T., Lin, F., Alshurafa, N., and Xu, W. (2017). Wearable food intake monitoring technologies: A comprehensive review. Computers, 6.","DOI":"10.3390\/computers6010004"},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"283","DOI":"10.1017\/S0029665116002913","article-title":"New mobile methods for dietary assessment: Review of image-assisted and image-based dietary assessment methods","volume":"76","author":"Boushey","year":"2017","journal-title":"Proc. Nutr. Soc."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"64","DOI":"10.1016\/j.jand.2014.09.015","article-title":"Image-assisted dietary assessment: A systematic review of the evidence","volume":"115","author":"Gemming","year":"2015","journal-title":"J. Acad. Nutr. Diet."},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Bekelman, T.A., Martin, C.K., Johnson, S.L., Glueck, D.H., Sauder, K.A., Harrall, K.K., Steinberg, R.I., Hsia, D.S., and Dabelea, D. (2021). 
A comparison of the remote food photography method and the automated self-administered 24-h dietary assessment tool for measuring full-day dietary intake among school-age children. Br. J. Nutr, 1\u201310.","DOI":"10.1017\/S0007114521001951"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"2358","DOI":"10.1038\/s41366-020-00693-2","article-title":"Review of the validity and feasibility of image-assisted methods for dietary assessment","volume":"44","author":"Hochsmann","year":"2020","journal-title":"Int. J. Obes."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"1926","DOI":"10.1109\/JBHI.2020.2987943","article-title":"Image-based food classification and volume estimation for dietary assessment: A review","volume":"24","author":"Lo","year":"2020","journal-title":"IEEE J. Biomed. Health Inform."},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"35370","DOI":"10.1109\/ACCESS.2019.2904519","article-title":"Vision-based approaches for automatic food recognition and dietary assessment: A survey","volume":"7","author":"Subhi","year":"2019","journal-title":"IEEE Access"},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"9297","DOI":"10.4081\/hpr.2020.9297","article-title":"A review on food recognition technology for health applications","volume":"8","author":"Allegra","year":"2020","journal-title":"Health Psychol. Res."},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"1793","DOI":"10.1111\/1541-4337.12492","article-title":"Application of deep learning in food: A review","volume":"18","author":"Zhou","year":"2019","journal-title":"Compr. Rev. Food Sci. Food Saf."},{"key":"ref_14","first-page":"e61906","article-title":"Deep neural networks for image-based dietary assessment","volume":"169","author":"Mezgec","year":"2021","journal-title":"J. Vis. Exp."},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Mezgec, S., and Korousic Seljak, B. (2017). NutriNet: A deep learning food and drink image recognition system for dietary assessment. Nutrients, 9.","DOI":"10.3390\/nu9070657"},{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"Sahoo, D., Hao, W., Ke, S., Xiongwei, W., Le, H., Achananuparp, P., Lim, E.-P., and Hoi, S.C.H. (2019, January 4\u20138). FoodAI: Food image recognition via deep learning for smart food logging. Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, Anchorage, AK, USA.","DOI":"10.1145\/3292500.3330734"},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Tahir, G.A., and Loo, C.K. (2021). A comprehensive survey of image-based food recognition and volume estimation methods for dietary assessment. Healthcare, 9.","DOI":"10.3390\/healthcare9121676"},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"12882","DOI":"10.1109\/JSEN.2020.3041023","article-title":"A systematic review of sensor-based methodologies for food portion size estimation","volume":"21","author":"Raju","year":"2021","journal-title":"IEEE Sens. J."},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"80","DOI":"10.3389\/fnut.2020.00080","article-title":"Future directions for integrative objective assessment of eating using wearable sensing technology","volume":"7","author":"Skinner","year":"2020","journal-title":"Front. Nutr."},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Eldridge, A.L., Piernas, C., Illner, A.K., Gibney, M.J., Gurinovic, M.A., de Vries, J.H.M., and Cade, J.E. (2018). 
Evaluation of new technology-based tools for dietary intake assessment-An ILSI Europe dietary intake and exposure task force evaluation. Nutrients, 11.","DOI":"10.3390\/nu11010055"},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"582","DOI":"10.1038\/s41430-020-00779-0","article-title":"Emerging trends of technology-based dietary assessment: A perspective study","volume":"75","author":"Zhao","year":"2021","journal-title":"Eur. J. Clin. Nutr."},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Fang, S., Zhu, F., Jiang, C., Zhang, S., Boushey, C.J., and Delp, E.J. (2016, January 25\u201328). A comparison of food portion size estimation using geometric models and depth images. Proceedings of the 2016 IEEE International Conference on Image Processing (ICIP), Phoenix, AZ, USA.","DOI":"10.1109\/ICIP.2016.7532312"},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"e15294","DOI":"10.2196\/15294","article-title":"Volumetric food quantification using computer vision on a depth-sensing smartphone: Preclinical study","volume":"8","author":"Herzig","year":"2020","journal-title":"JMIR Mhealth Uhealth"},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Makhsous, S., Bharadwaj, M., Atkinson, B.E., Novosselov, I.V., and Mamishev, A.V. (2020). DietSensor: Automatic dietary intake measurement using mobile 3d scanning sensor for diabetic patients. Sensors, 20.","DOI":"10.3390\/s20123380"},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Suzuki, T., Futatsuishi, K., and Kobayashi, K. (2018, January 21\u201323). Food volume estimation using 3d shape approximation for medication management support. Proceedings of the 2018 3rd Asia-Pacific Conference on Intelligent Robot Systems (ACIRS), Singapore.","DOI":"10.1109\/ACIRS.2018.8467253"},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"577","DOI":"10.1109\/TII.2019.2942831","article-title":"Point2Volume: A vision-based dietary assessment approach using view synthesis","volume":"16","author":"Lo","year":"2020","journal-title":"IEEE Trans. Industr. Inform."},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"36","DOI":"10.1109\/MIM.2018.8573592","article-title":"Food volume estimation based on stereo image analysis","volume":"21","author":"Subhi","year":"2018","journal-title":"IEEE Instrum. Meas. Mag."},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Rahman, M.H., Li, Q., Pickering, M., Frater, M., Kerr, D., Bouchey, C., and Delp, E. (2012, January 25\u201329). Food volume estimation in a mobile phone based dietary assessment system. Proceedings of the 2012 Eighth International Conference on Signal Image Technology and Internet Based Systems, Sorrento, Italy.","DOI":"10.1109\/SITIS.2012.146"},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Ando, Y., Ege, T., Cho, J., and Yanai, K. (2019, January 21\u201325). DepthCalorieCam: A mobile application for volume-based foodcalorie estimation using depth cameras. Proceedings of the 5th International Workshop on Multimedia Assisted Dietary Management\u2014MADiMa \u201819, Nice, France.","DOI":"10.1145\/3347448.3357172"},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Shang, J., Duong, M., Pepin, E., Xing, Z., Sandara-Rajan, K., Mamishev, A., and Kristal, A. (2011, January 6\u201313). A mobile structured light system for food volume estimation. 
Proceedings of the 2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops), Barcelona, Spain.","DOI":"10.1109\/ICCVW.2011.6130229"},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"Makhsous, S., Mohammad, H.M., Schenk, J.M., Mamishev, A.V., and Kristal, A.R. (2019). A novel mobile structured light system in food 3D reconstruction and volume estimation. Sensors, 19.","DOI":"10.3390\/s19030564"},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"1248","DOI":"10.1017\/S136898002000275X","article-title":"An automatic electronic instrument for accurate measurements of food volume and density","volume":"24","author":"Yuan","year":"2021","journal-title":"Public Health Nutr."},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"Konstantakopoulos, F., Georga, E.I., and Fotiadis, D.I. (2021, January 25\u201327). 3D reconstruction and volume estimation of food using stereo vision techniques. Proceedings of the 2021 IEEE 21st International Conference on Bioinformatics and Bioengineering (BIBE), Kragujevac, Serbia.","DOI":"10.1109\/BIBE52308.2021.9635418"},{"key":"ref_34","unstructured":"Ma, Y., Soatto, S., Kosecka, J., and Sastry, S.S. (2003). An Invitation to 3-D Vision: From Images to Geometric Models, Springer."},{"key":"ref_35","first-page":"532","article-title":"Single image-based food volume estimation using monocular depth-prediction networks","volume":"Volume 12189","author":"Antona","year":"2020","journal-title":"Universal Access in Human-Computer Interaction. Applications and Practice. HCII 2020. Lecture Notes in Computer Science"},{"key":"ref_36","doi-asserted-by":"crossref","unstructured":"Fatehah, A.A., Poh, B.K., Shanita, S.N., and Wong, J.E. (2018). Feasibility of reviewing digital food images for dietary assessment among nutrition professionals. Nutrients, 10.","DOI":"10.3390\/nu10080984"},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"1671","DOI":"10.1017\/S1368980013003236","article-title":"Accuracy of food portion size estimation from digital pictures acquired by a chest-worn camera","volume":"17","author":"Jia","year":"2014","journal-title":"Public Health Nutr."},{"key":"ref_38","doi-asserted-by":"crossref","first-page":"105701","DOI":"10.1088\/0957-0233\/24\/10\/105701","article-title":"Model-based measurement of food portion size for image-based dietary assessment using 3D\/2D registration","volume":"24","author":"Chen","year":"2013","journal-title":"Meas. Sci. Technol."},{"key":"ref_39","doi-asserted-by":"crossref","unstructured":"Chae, J., Woo, I., Kim, S., Maciejewski, R., Zhu, F., Delp, E.J., Boushey, C.J., and Ebert, D.S. (2011, January 23\u201327). Volume estimation using food specific shape templates in mobile image-based dietary assessment. Proceedings of the IS&T\/SPIE Electronic Imaging, San Francisco, CA, USA.","DOI":"10.1117\/12.876669"},{"key":"ref_40","first-page":"1153","article-title":"Reliability and validity of food portion size estimation from images using manual flexible digital virtual meshes","volume":"22","author":"Beltran","year":"2019","journal-title":"Public Health Nutr."},{"key":"ref_41","doi-asserted-by":"crossref","unstructured":"He, Y., Xu, C., Khanna, N., Boushey, C.J., and Delp, E.J. (2013, January 15\u201319). Food image analysis: Segmentation, identification and weight estimation. 
Proceedings of the 2013 IEEE International Conference on Multimedia and Expo (ICME), San Jose, CA, USA.","DOI":"10.1109\/ICME.2013.6607548"},{"key":"ref_42","doi-asserted-by":"crossref","unstructured":"Fang, S., Liu, C., Zhu, F., Delp, E.J., and Boushey, C.J. (2015, January 14\u201316). Single-view food portion estimation based on geometric models. Proceedings of the 2015 IEEE International Symposium on Multimedia (ISM), Miami, FL, USA.","DOI":"10.1109\/ISM.2015.67"},{"key":"ref_43","doi-asserted-by":"crossref","first-page":"360","DOI":"10.9746\/jcmsi.10.360","article-title":"Smartphone-based food weight and calorie estimation method for effective food journaling","volume":"10","author":"Akpa","year":"2017","journal-title":"SICE J. Control Meas. Syst. Integr."},{"key":"ref_44","doi-asserted-by":"crossref","first-page":"124","DOI":"10.1186\/s12966-017-0583-y","article-title":"The international food unit: A new measurement aid that can improve portion size estimation","volume":"14","author":"Bucher","year":"2017","journal-title":"Int. J. Behav. Nutr. Phys. Act."},{"key":"ref_45","doi-asserted-by":"crossref","unstructured":"Liu, Y., Lai, J., Sun, W., Wei, Z., Liu, A., Gong, W., and Yang, Y. (2020, January 8\u201311). Food volume estimation based on reference. Proceedings of the 4th International Conference on Innovation in Artificial Intelligence, Xiamen, China.","DOI":"10.1145\/3390557.3394123"},{"key":"ref_46","first-page":"1180","article-title":"Image-based food portion size estimation using a smartphone without a fiducial marker","volume":"22","author":"Yang","year":"2019","journal-title":"Public Health Nutr."},{"key":"ref_47","doi-asserted-by":"crossref","unstructured":"Myers, A., Johnston, N., Rathod, V., Korattikara, A., Gorban, A., Silberman, N., Guadarrama, S., Papandreou, G., Huang, J., and Murphy, K. (2015, January 7\u201313). Im2Calories: Towards an automated mobile vision food diary. Proceedings of the 2015 IEEE International Conference on Computer Vision (ICCV), Santiago, Chile.","DOI":"10.1109\/ICCV.2015.146"},{"key":"ref_48","doi-asserted-by":"crossref","unstructured":"Lo, F.P., Sun, Y., and Lo, B. (2019, January 8\u201312). Depth estimation based on a single close-up image with volumetric annotations in the wild: A pilot study. Proceedings of the 2019 IEEE\/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Hong Kong, China.","DOI":"10.1109\/AIM.2019.8868629"},{"key":"ref_49","doi-asserted-by":"crossref","unstructured":"Fang, S., Shao, Z., Mao, R., Fu, C., Delp, E.J., Zhu, F., Kerr, D.A., and Boushey, C.J. (2018, January 7\u201310). Single-view food portion estimation: Learning image-to-energy mappings using generative adversarial networks. Proceedings of the 25th IEEE International Conference on Image Processing (ICIP), Athens, Greece.","DOI":"10.1109\/ICIP.2018.8451461"},{"key":"ref_50","doi-asserted-by":"crossref","unstructured":"Hassannejad, H., Matrella, G., Ciampolini, P., Munari, I.D., Mordonini, M., and Cagnoni, S. (2017). A new approach to image-based estimation of food volume. Algorithms, 10.","DOI":"10.3390\/a10020066"},{"key":"ref_51","unstructured":"Liang, Y., and Li, J. (2022, January 25). Deep Learning-Based Food Calorie Estimation Method in Dietary Assessment. 
Available online: https:\/\/arxiv.org\/abs\/1706.04062."},{"key":"ref_52","doi-asserted-by":"crossref","first-page":"1090","DOI":"10.1109\/TMM.2016.2642792","article-title":"Two-view 3D reconstruction for food volume estimation","volume":"19","author":"Dehais","year":"2017","journal-title":"IEEE Trans. Multimed."},{"key":"ref_53","doi-asserted-by":"crossref","first-page":"1578","DOI":"10.1109\/TPAMI.2019.2954885","article-title":"Image-based 3d object reconstruction: State-of-the-art and trends in the deep learning era","volume":"43","author":"Han","year":"2021","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_54","doi-asserted-by":"crossref","unstructured":"Tahir, R., Sargano, A.B., and Habib, Z. (2021). Voxel-based 3D object reconstruction from single 2D image using variational autoencoders. Mathematics, 9.","DOI":"10.3390\/math9182288"},{"key":"ref_55","doi-asserted-by":"crossref","first-page":"463","DOI":"10.1007\/s11042-020-09722-8","article-title":"Single image 3D object reconstruction based on deep learning: A review","volume":"80","author":"Fu","year":"2021","journal-title":"Multimed. Tools Appl."},{"key":"ref_56","unstructured":"Wu, Z., Song, S., Khosla, A., Yu, F., Zhang, L., Tang, X., and Xiao, J. (2015, January 7\u201312). 3D ShapeNets: A deep representation for volumetric shapes. Proceedings of the 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA."},{"key":"ref_57","doi-asserted-by":"crossref","unstructured":"Naritomi, S., and Yanai, K. (2021, January 7\u20139). Hungry Networks. Proceedings of the 2nd ACM International Conference on Multimedia in Asia, Singapore.","DOI":"10.1145\/3444685.3446275"},{"key":"ref_58","doi-asserted-by":"crossref","first-page":"76","DOI":"10.1016\/j.jfoodeng.2011.09.031","article-title":"Image-based estimation of food volume using circular referents in dietary assessment","volume":"109","author":"Jia","year":"2012","journal-title":"J. Food Eng."},{"key":"ref_59","unstructured":"(2021, December 08). Bowl. Available online: https:\/\/en.wikipedia.org\/wiki\/Bowl."},{"key":"ref_60","unstructured":"Ruszczy\u0144ski, A. (2006). Nonlinear Optimization, Princeton University Press."},{"key":"ref_61","doi-asserted-by":"crossref","first-page":"624","DOI":"10.1109\/70.163786","article-title":"Three-dimensional location estimation of circular features for machine vision","volume":"8","author":"Tchoukanov","year":"1992","journal-title":"IEEE Trans. Rob. Autom."},{"key":"ref_62","doi-asserted-by":"crossref","unstructured":"Sun, M., Burke, L.E., Mao, Z.H., Chen, Y., Chen, H.C., Bai, Y., Li, Y., Li, C., and Jia, W. (2014, January 1\u20135). eButton: A wearable computer for health monitoring and personal assistance. Proceedings of the 51st Annual Design Automation Conference, San Francisco, CA, USA.","DOI":"10.1145\/2593069.2596678"},{"key":"ref_63","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1260\/2040-2295.6.1.1","article-title":"An exploratory study on a chest-worn computer for evaluation of diet, physical activity and lifestyle","volume":"6","author":"Sun","year":"2015","journal-title":"J. Healthc Eng."},{"key":"ref_64","unstructured":"Beltran, A., Dadabhoy, H., Chen, T.A., Lin, C., Jia, W., Baranowski, J., Yan, G., Sun, M., and Baranowski, T. (2016, January 25\u201327). Adapting the eButton to the abilities of children for diet assessment. 
Proceedings of the Measuring Behavior, Dublin, Ireland."},{"key":"ref_65","doi-asserted-by":"crossref","first-page":"32","DOI":"10.1186\/s12937-018-0341-2","article-title":"Utility of eButton images for identifying food preparation behaviors and meal-related tasks in adolescents","volume":"17","author":"Raber","year":"2018","journal-title":"Nutr. J."},{"key":"ref_66","unstructured":"McCrory, M.A., Sun, M., Sazonov, E., Frost, G., Anderson, A., Jia, W., Jobarteh, M.L., Maitland, K., Steiner, M., and Ghosh, T. (2019, January 8\u201311). Methodology for objective, passive, image- and sensor-based assessment of dietary intake, meal-timing, and food-related activity in Ghana and Kenya. Proceedings of the Annual Nutrition Conference, Baltimore, MD, USA."},{"key":"ref_67","doi-asserted-by":"crossref","first-page":"1330","DOI":"10.1109\/34.888718","article-title":"A flexible new technique for camera calibration","volume":"22","author":"Zhang","year":"2000","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_68","doi-asserted-by":"crossref","unstructured":"Ho, D.K.N., Chiu, W.C., Lee, Y.C., Su, H.Y., Chang, C.C., Yao, C.Y., Hua, K.L., Chu, H.K., Hsu, C.Y., and Chang, J.S. (2021). Integration of an image-based dietary assessment paradigm into dietetic training improves food portion estimates by future dietitians. Nutrients, 13.","DOI":"10.3390\/nu13010175"}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/22\/4\/1493\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,7,26]],"date-time":"2024-07-26T03:29:45Z","timestamp":1721964585000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/22\/4\/1493"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,2,15]]},"references-count":68,"journal-issue":{"issue":"4","published-online":{"date-parts":[[2022,2]]}},"alternative-id":["s22041493"],"URL":"http:\/\/dx.doi.org\/10.3390\/s22041493","relation":{},"ISSN":["1424-8220"],"issn-type":[{"value":"1424-8220","type":"electronic"}],"subject":[],"published":{"date-parts":[[2022,2,15]]}}}
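
The record above is the raw response from the Crossref REST API endpoint linked at the top of this page. As a minimal sketch (assuming the third-party Python `requests` package is available; the field names simply mirror the JSON shown above), the same metadata can be fetched and its key fields extracted programmatically:

# Minimal sketch: fetch the Crossref record linked above and pull out a few of its fields.
# Assumes the `requests` package; field names mirror the JSON record shown on this page.
import requests

URL = "https://api.crossref.org/works/10.3390/S22041493"

resp = requests.get(URL, timeout=30)
resp.raise_for_status()
msg = resp.json()["message"]                     # Crossref wraps the work record in "message"

title = msg["title"][0]                          # "A Novel Approach to Dining Bowl Reconstruction ..."
authors = [f'{a.get("given", "")} {a.get("family", "")}'.strip() for a in msg.get("author", [])]
ref_dois = [r["DOI"] for r in msg.get("reference", []) if "DOI" in r]

print(msg["DOI"], "-", title)
print(len(authors), "authors; first author:", authors[0])
print(len(ref_dois), "of", msg["references-count"], "references carry their own DOI")

Crossref also accepts an optional mailto query parameter for polite-pool access, but the bare request above is enough to reproduce the record shown here.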
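The abstract describes reconstructing the three-dimensional interior of a round bowl from the distortions of a paper ruler in a single image and then estimating food volume. The sketch below is not the paper's reconstruction algorithm; it only illustrates, for a hypothetical, rotationally symmetric interior radius profile r(h), the generic solid-of-revolution step V = pi * integral of r(h)^2 dh that turns such a profile into a volume.

# Hedged illustration only (not the authors' method): given an interior radius profile r(h)
# for a round bowl, estimate the liquid volume up to a fill height by disk integration.
import numpy as np

heights_cm = np.linspace(0.0, 6.0, 61)           # 0..6 cm above the bowl bottom (hypothetical bowl)
radii_cm = 2.0 + 1.2 * np.sqrt(heights_cm)       # hypothetical interior radius profile r(h), in cm

def bowl_volume_ml(h, r, fill_height_cm):
    """Trapezoidal estimate of pi * integral of r(h)^2 dh up to fill_height_cm (1 cm^3 = 1 mL)."""
    m = h <= fill_height_cm
    hh, rr = h[m], r[m]
    return float(np.pi * np.sum(0.5 * (rr[1:] ** 2 + rr[:-1] ** 2) * np.diff(hh)))

print(round(bowl_volume_ml(heights_cm, radii_cm, 4.0)), "mL when filled to 4 cm")

In the paper itself the interior profile is inferred from the observed ruler distortions rather than assumed; the integration shown here is simply the generic step for any rotationally symmetric container once that profile is known.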