This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License
Analyzing Sugar Nurse Diabetes Management Application Through the Lens of Affordance Theory
Zhengyang Liu, Albert Young Choi
DOI:10.17265/2159-5542/2024.03.002
Hanyang University ERICA, Ansan, South Korea
This research examines the integration of affordance theory in Sugar Nurse, a diabetes management application serving the dynamic Chinese market. Through an in-depth examination, the study identifies the distinctive affordances Sugar Nurse offers, tailored to a spectrum of user needs and environmental variables. Focusing solely on Sugar Nurse, the research highlights the pivotal role of functional, cognitive, behavioral, sensory, and emotional affordances in elevating user experience and fostering engagement. The findings underscore the need for a comprehensive design approach that acknowledges the multifaceted nature of user requirements and environmental influences when crafting user-centric digital health solutions, offering valuable guidance for the future development of health management applications.
diabetes management, interaction design, Chinese market, affordance
Psychology Research, March 2024, Vol. 14, No. 3, 100-109