Rachel Smith

‘Motion’, the change in the position of objects over time, and ‘gestures’, the non-vocal communications we make through our body movements, are both cardinal elements of thinking and communication.

Historically, the use of body movements to lead or assist our thinking was treated as synonymous with spatial thinking - for example, consciously or subconsciously using hand gestures to describe the orientation of a place or object. Because gestures have so often been associated with these specific use cases, motion and spatial thinking have been marginalised (much like smell or music) instead of being recognised as anything more universal.

Barbara Tversky, a Stanford University psychologist and the author of ‘Mind in Motion’, recently appeared on a very informative podcast with Sam Harris¹, where she explained why these connotations of motion and spatial thinking need rewriting. As Tversky puts it, “any time you take a shortcut or play chess or basketball or rearrange your furniture in your mind, you've done something remarkable: abstract thinking without words.”

Tversky proposes that motion plays a far greater role than originally thought: without it, she suggests, we would not only struggle drastically to communicate, we would also fail to understand others.

The idea that motion serves as the foundation of thought is supported by a large body of research on hand gestures. We commonly use hand gestures to describe directions or to encode mental representations - and this works for both the speaker and the listener. Tversky describes how, when you watch people gesture, you get the feeling that you are watching someone think in real time; it’s almost as if the gestures translate the speaker’s thoughts into a form you can immediately and easily process.

In research studies, participants who were encouraged to use gestures during mental rotation tests performed significantly better (they made fewer errors) than participants who were told to sit on their hands². Similar findings come from a study in which participants watched a video of someone teaching anatomy using hand and body gestures³: those who were allowed to imitate the gestures as they were being made later showed much higher recall of structure names and their locations on a diagram than those who weren’t allowed to imitate them.

We often don’t look at our own hands when we’re gesturing, and, interestingly, studies show that people with visual impairments also gesture - reinforcing the idea that gestures are not merely a visual aid but play a far more pivotal role. All of these findings have gradually altered how motion is perceived: motion and spatial thinking are now recognised as central faculties with a much larger role to play in our day-to-day lives.

How is motion being used in new technology?

Whereas previous advances in technology have focused on access to information, modern technology is shifting its focus to innovating the experience itself. The aim is for technology to be more intuitive to how users think, so that rather than an action following a thought, the action is initiated by the thought. For example, at this month’s ‘Made by Google’ event, Google announced the Pixel 4’s Motion Sense feature, which will allow users to snooze alarms, silence calls and skip songs with basic hand gestures. Though these steps may seem incremental, they are a great progression towards giving motion the role it deserves in technology.

At the other end of the spectrum, virtual reality (VR) has probably made the greatest advances in catering to our need to experience the world at a much greater scale. VR is known to be one of the most effective tools for learning⁴, for many reasons including the rich context it provides. For gestures specifically, the hand is the most suitable communication device for human-computer interaction⁵, and so hand gestures play a huge role in the success of learning and communicating in VR.

How do we use gesture at Adludio?

Another real-world application of motion and gesture is Adludio’s unique proposition of tactile engagement with ads. Motion and gestures create really impactful ad experiences because adverts that require physical interaction employ Kinesthetic Learning techniques.

Kinesthetic Learning is a well-documented tool for remembering complicated concepts. The technique proposes that the movement of your body creates rich cues which assist learning and recall. It’s believed that body motion leads to increased blood flow, which floods the brain with oxygen and results in higher cognitive performance and increased attention. In the traditional practice of this technique, the ‘movement’ or ‘motion’ often involves the whole body - for example, a speaker might ask for audience participation in the form of standing up. However, the benefits of Kinesthetic Learning aren’t restricted to large movements: research shows that they can also be reaped from much smaller movements, such as playing with fidget toys or taking notes. It is therefore likely that the small gestures required to initiate our ads can assist in the absorption and consolidation of information into memory.

If you’d like to learn more about how Kinesthetic Learning applies to ads, read our in-depth article here [link]. If you’d like to learn more about how gestures and sensory interaction create a heightened and memorable brand experience, get in touch with us at hello@adludio.com.

Sources:

¹ https://samharris.org/podcasts/168-mind-space-motion/

² https://www.apa.org/pubs/journals/releases/xge-140-1-102.pdf

³ https://www.apa.org/pubs/journals/releases/xge-140-1-102.pdf

⁴ https://elearningindustry.com/vr-enhances-elearning-improves-skills-effectively

⁵ https://eprints.soton.ac.uk/263149/