It is well known that primary school children may face difficulties in acquiring mathematical competence, possibly because teaching is generally based on formal lessons with little opportunity to exploit more multisensory-based activities within the classroom. To overcome such difficulties, we report here the exemplary design of a novel multisensory learning environment for teaching mathematical concepts, based on meaningful input from elementary school teachers. First, we developed and administered a questionnaire to 101 teachers, asking them to rate, based on their experience, the learning difficulty of specific arithmetical and geometrical concepts for elementary school children. Additionally, the questionnaire investigated the feasibility of using multisensory information to teach mathematical concepts. Results show that the most challenging concepts differ depending on children's school level, thus providing guidance for improving teaching strategies and for designing new and emerging learning technologies accordingly. Second, we obtained specific and practical design input from workshops involving elementary school teachers and children. Altogether, these findings are used to inform the design of emerging multimodal technological applications that take advantage not only of vision but also of other sensory modalities. In the present work, we describe in detail one exemplary multisensory environment design based on the questionnaire results and the design ideas from the workshops: the Space Shapes game, which exploits visual and haptic/proprioceptive sensory information to support mental rotation, 2D–3D transformation, and percentages. Corroborating research evidence in neuroscience and pedagogy, our work presents a functional approach to developing novel multimodal user interfaces that improve education in the classroom.
Associations between sensory features of different natures are defined as crossmodal correspondences. In the context of size perception, low-pitch sound frequencies are often associated with larger objects and high pitch with smaller objects. Here we investigate such crossmodal correspondences in sighted and visually impaired children. In Experiment 1, after listening to sounds (250–5000 Hz pure tones), children aged 6–11 years were asked to draw a circle "as big as the sound was". In Experiment 2, children aged 6–14 years who were blind or had low vision performed a similar task. In accordance with previous research, we observed that the size of the drawn circle depends on participants' age, and we confirm the presence of pitch–size associations in sighted children. In visually impaired children, such associations are influenced by residual vision, suggesting an anchoring of size perception to the level of residual vision. These results reveal novel dynamics underlying the progression of visual loss and the emergence of compensatory mechanisms in childhood.
Recent scientific results show that audio feedback associated with body movements can be fundamental during development for learning new spatial concepts. Within the weDRAW project, we have investigated how this link can be useful for learning mathematical concepts. Here we present a study investigating how mathematical skills change after multisensory training based on human–computer interaction (the RobotAngle and BodyFraction activities). We show that embodied exploration of angles and fractions, associated with audio and visual feedback, can be used with typically developing children to improve cognition of spatial mathematical concepts. We finally present the exploitation of our results: an online, optimized version of one of the tested activities to be used at school. The training results suggest that audio and visual feedback associated with body movements is informative for spatial learning and reinforce the idea that the development of spatial representation is based on sensory-motor interactions.
Sensory cues enable navigation through space, as they inform us about movement properties, such as the amount of travelled distance and the heading direction. In this study, we focused on the ability to spatially update one's position when only proprioceptive and vestibular information is available. We aimed to investigate the effect of yaw rotation on path integration across development in the absence of visual feedback. To this end, we used the triangle completion task: participants were guided through two legs of a triangle and asked to close the shape by walking along its third, imagined leg. To test the influence of yaw rotation across development, we tested children between 6 and 11 years old (y.o.) and adults on their perception of angles of different degrees. Our results demonstrate that the amount of turn executed at the angle influences performance at all ages and, in some respects, also interacts with age. Indeed, whilst adults seemed to adjust their heading towards the end of their walked path, younger children took less advantage of this strategy. The amount of disorientation induced by the path also affected the maturation of the ability to spatially navigate without visual feedback: the greater the induced disorientation, the older children had to be to reach adult-level performance. Overall, these results provide novel insights into the maturation of spatial navigation-related processes.
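The geometry of the triangle completion task described above can be sketched in a few lines of code. This is a toy illustration of the task itself, not the study's analysis software; the starting convention (facing along +x) and the sign of the turn (positive = counterclockwise) are our assumptions.

```python
import math

def triangle_completion(leg1, turn_deg, leg2):
    """Return (distance, turn in degrees) needed to close the triangle
    after walking leg1, turning by turn_deg, and walking leg2.

    The walker starts at the origin facing along +x; a positive turn
    rotates the heading counterclockwise.
    """
    # Position after the first leg, still heading along +x.
    x, y = leg1, 0.0
    heading = math.radians(turn_deg)  # heading after the turn
    # Position after the second leg.
    x += leg2 * math.cos(heading)
    y += leg2 * math.sin(heading)
    # Homing vector points from the current position back to the origin.
    home_dist = math.hypot(x, y)
    home_dir = math.atan2(-y, -x)  # absolute direction of the homing vector
    # Turn required relative to the current heading, wrapped to (-180, 180].
    turn_needed = math.degrees(home_dir - heading)
    turn_needed = (turn_needed + 180.0) % 360.0 - 180.0
    return home_dist, turn_needed
```

For example, legs of 3 and 4 units joined by a 90° turn require a homing leg of exactly 5 units, the Pythagorean hypotenuse of the imagined triangle.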
While technology is increasingly used in the classroom, we observe at the same time that gaining acceptance from teachers and students is more difficult than expected. In this work, we focus on multisensory technologies, and we argue that the intersection between current challenges in pedagogical practices and recent scientific evidence opens novel opportunities for these technologies to bring a significant benefit to the learning process. In our view, multisensory technologies are ideal for effectively supporting an embodied and enactive pedagogical approach that exploits the best-suited sensory modality to teach a concept at school. This represents a great opportunity for designing technologies that are both grounded in robust scientific evidence and tailored to the actual needs of teachers and students. Based on our experience in technology-enhanced learning projects, we propose six golden rules we deem important for seizing and fully exploiting this opportunity.
When developing interactive systems for children, such as serious games in the context of educational technology, it is important to take into account and address the relevant cognitive and emotional experiences of the child that may influence learning outcomes. Previous work has analyzed and automatically recognized these cognitive and affective states from nonverbal expressive behaviors. However, there is a lack of knowledge about visually impaired children and the body language they use to convey those states during learning tasks. In this paper, we present an analysis of the nonverbal expressive behaviors of both blind and low-vision children, aiming to understand what type of body communication can be an indicator of two cognitive states: engagement and confidence. In the study we consider data collected within the EU-ICT H2020 weDRAW Project, while children were asked to solve mathematical tasks with their body. For this dataset, we propose a list of 31 nonverbal behaviors, annotated both by rehabilitators experienced in working with visually impaired children and by naive observers. In the last part of the paper, we present a preliminary study on the automatic recognition of engagement and confidence states from 2D positional data. The classification results reach an F-score of up to 0.71 on a three-class classification task.
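As a minimal sketch of what a three-class recognition pipeline on 2D positional features might look like, the following uses synthetic data and a nearest-centroid classifier chosen purely for illustration; the paper's actual features, classifier, and dataset are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for 2D positional features of three hypothetical
# states (e.g. low / medium / high engagement); the real features used
# in the study are not specified in this sketch.
means = np.array([[0.0, 0.0], [5.0, 5.0], [0.0, 5.0]])
X = np.vstack([rng.normal(m, 0.5, size=(60, 2)) for m in means])
y = np.repeat([0, 1, 2], 60)

# Shuffle and split into train/test halves.
perm = rng.permutation(len(y))
X, y = X[perm], y[perm]
X_tr, X_te, y_tr, y_te = X[:90], X[90:], y[:90], y[90:]

# Nearest-centroid classifier: assign each test sample to the class
# whose training centroid is closest.
centroids = np.array([X_tr[y_tr == c].mean(axis=0) for c in range(3)])
pred = np.argmin(((X_te[:, None, :] - centroids) ** 2).sum(-1), axis=1)

# Macro-averaged F-score over the three classes.
f1s = []
for c in range(3):
    tp = np.sum((pred == c) & (y_te == c))
    fp = np.sum((pred == c) & (y_te != c))
    fn = np.sum((pred != c) & (y_te == c))
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
macro_f1 = float(np.mean(f1s))
```

On these well-separated toy clusters the macro F-score is near 1.0; real expressive-behavior data are far noisier, which is why scores such as 0.71 are meaningful results.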
Haptic devices have the potential to enhance the learning experience by foregrounding embodied, sensory, and multi-modal elements of learning topics. In this paper, we report ongoing work investigating a game prototype with haptic feedback for seven-year-old children's engagement with geometrical concepts, as part of an iterative design study. Our findings include a new gameplay mode adopted by the children, which empowers the use of haptic feedback in gameplay and has the potential to enable the enactment of shape properties during play.
An increasing body of work provides evidence of the importance of bodily experience for cognition and the learning of mathematics. Sensor-based technologies have potential for guiding sensori-motor engagement with challenging mathematical ideas in new ways. Yet designing environments that promote appropriate sensori-motor interaction, effectively supporting the salient foundations of mathematical concepts, is challenging and requires an understanding of the opportunities and challenges that bodily interaction offers. This study aimed to better understand how young children can, and do, use their bodies to explore the geometrical concepts of angle and shape, and what contribution the different sensori-motor experiences make to the comprehension of mathematical ideas.
Touching objects in a virtual environment is a challenge that has yet to be addressed convincingly, in part because haptic technology and, in particular, low-cost haptic technology have strong limitations. This work aimed to assess the impact of Chai3D texture rendering parameters on texture perception. We used multidimensional scaling techniques to build psychological scales for the texture level, stiffness, dynamic friction, and several texture patterns. Two perceptual dimensions were generally necessary to fully account for a one-dimensional parameter change. The scales for the texture level, dynamic friction, and texture pattern parameters were markedly larger than the stiffness scale, indicating the potential of these parameters to generate well-differentiated textures.
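Classical (Torgerson) multidimensional scaling, the simplest member of the family of techniques mentioned above, can be sketched as follows. The dissimilarity matrix here is a toy stand-in for perceptual judgments of, say, four stiffness levels; it is not the study's data or code.

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (Torgerson) multidimensional scaling.

    D: (n, n) symmetric matrix of pairwise dissimilarities.
    Returns an (n, k) embedding whose pairwise Euclidean distances
    approximate D (exactly, when D is Euclidean and k is large enough).
    """
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # double-centered Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)  # ascending eigenvalue order
    idx = np.argsort(eigvals)[::-1][:k]   # keep the top-k eigenvalues
    scale = np.sqrt(np.clip(eigvals[idx], 0, None))
    return eigvecs[:, idx] * scale

# Toy dissimilarities between four equally spaced parameter levels:
# points on a line embed exactly, so the 2-D solution reproduces them.
D = np.abs(np.arange(4)[:, None] - np.arange(4)[None, :]).astype(float)
X = classical_mds(D, k=2)
recovered = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
```

With real perceptual judgments the recovered distances only approximate the input, and the number of dimensions needed for a good fit (two, in the study above) is itself an informative result.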
In this short review, we aim to provide an update on recent research on force-feedback devices in educational settings, with a particular focus on primary school teaching. The review describes haptic devices and educational virtual environments before entering into the details of domain-specific applications of this technology in schools. Currently, the number of studies that have investigated the potential of haptic devices in educational settings is limited, particularly for primary schools.
Multisensory learning has long been considered a relevant pedagogical framework for education, and several authors support the use of a multisensory and kinesthetic approach in children's learning. Moreover, results from psychophysics and developmental psychology show that children have a preferential sensory channel for learning specific (spatial and/or temporal) concepts, providing further evidence of the need for a multisensory approach. In this work, we present an example of a serious game for learning a particularly complicated mathematical concept: fractions. The main novelty of our proposal comes from the role played by communication between sensory modalities, in particular movement, vision, and sound. The game has been developed in the context of the EU-ICT-H2020 weDRAW Project, which aims at developing new multimodal technologies for multisensory serious games on mathematical concepts for primary school children.
The orientation of the body in space can influence the perception of verticality, sometimes leading to biases consistent with priors peaked at the most common head and body orientation, that is, upright. In this study, we investigate the haptic perception of verticality in sighted individuals and in early and late blind adults when tilted counterclockwise in the roll plane. Participants were asked to perform a stimulus-orientation discrimination task with their body tilted 90° relative to gravity, towards their left ear. Stimuli were presented using a motorized haptic bar. In order to test whether different head-relative reference frames influence the perception of verticality, we varied the position of the stimulus along the body's longitudinal axis. Depending on stimulus position, sighted participants tended to show biases away from or toward their body tilt. Visually impaired individuals instead showed a different pattern of verticality estimates.
Recent results from psychophysics and developmental psychology show that children have a preferential sensory channel for learning specific concepts. In this work, we explore the possibility of developing and evaluating novel multisensory technologies for deeper learning of arithmetic and geometry. The main novelty of these new technologies comes from a renewed understanding of the role of communication between sensory modalities during development, namely that specific sensory systems have specific roles in learning specific concepts.
In primary school, children tend to have difficulties in discriminating angles of different degrees and categorizing them as either acute or obtuse, especially at the first stages of development (6–7 y.o.). In the context of a novel approach that intends to use sensory modalities other than vision to teach geometrical concepts, we ran a psychophysical study investigating angle perception through spatial navigation. Our results show that the youngest group of children tends to be more imprecise when asked to discriminate a walked angle of 90°, which is pivotal for learning to differentiate between acute and obtuse angles. These results are then discussed in terms of the development of novel technological solutions aimed at integrating locomotion into the teaching of geometrical concepts.
WeDRAW aims to mediate the learning of primary school mathematical concepts, such as geometry and arithmetic, through the design, development, and evaluation of multisensory serious games using a combination of sensory interactive technologies. Working closely with schools and using participatory design techniques, the WeDRAW system will be embedded into school curricula and be configurable by teachers. Beyond application to typically developing children, a major goal is to examine this multisensory approach with visually impaired and dyslexic children.