
Fusion of data extends our range of senses

I see what you mean. I hear what you say. I feel for you. It leaves a bad taste. It touched my heart.

In a nutshell

  1. We are influenced by all of our senses, though everyone has one sense that tends to dominate. For most people that sense is visual.
  2. The insight industry's persistence with asking questions verbally and in written questionnaires doesn't make use of our visual senses and visual communication.
  3. The recent ESOMAR Fusion conference was a refreshing demonstration of a rebalancing of the senses, with an emphasis on visual data made viable by AI and image recognition.

These are all very common ways of speaking, and all show how our senses drive how we interact with and make meaning of the world. And it’s our senses that trigger and create our emotional responses too.

The smell of bacon triggers special memories of my Dad and me on Sunday mornings while Mum slept in. It makes me feel happy and sad, as Dad is no longer with me. It makes me feel connected, and I re-experience the feeling of love and safety.

We are influenced by all of our senses, though everyone has one sense that tends to dominate. For most people that sense is visual. Brain scientists estimate that around 50% of our cortex occupies itself with the business of visual processing. It’s how the human species evolved: standing upright and looking into the distance enabled us to master small-scale precision tasks like tool making and writing.

It’s not surprising, therefore, that we’ve used technological developments to communicate and connect visually. It used to be faster to write a letter than to send someone a painting, whereas now posting to Instagram takes only a second or two, and even making a video for YouTube is an easy task.

So far, nothing new. But equally there’s not much new in our persistence with asking questions verbally and in written questionnaires, or in relying on text analysis and word clouds. There is indeed some validity in this: complex speech is a uniquely human skill that we use in much of our communication. It’s really a question of balance, and currently not enough of the emphasis is on visual communication.

The recent ESOMAR Fusion conference was a refreshing demonstration of a rebalancing of the senses. Many speakers showed case studies with an emphasis on visual data. AI and image recognition have made this a viable option, making Instagram the current darling of insights professionals. Both numeric and qualitative analysis of images is being employed to tease out brand narratives, journey maps, innovation mining and a whole host of other outcomes. Instagram is not the only source of visual data, of course. Blogs, websites, review sites, Facebook posts and non-passive data such as image diaries are all rich veins to mine.

We’ve already experimented with some of these approaches at TRA, and we’re fresh back from the Fusion conference with lots of new application ideas. Watch this space for more of the valuable insight to be gained through our dominant visual sense.

Colleen Ryan
Partner at TRA
