Integrating body language into text communication 

Many people feel like they are able to communicate with their loved ones better over FaceTime than they can over text. I decided to explore the root of the issue and play around with trying to find a solution! This is, of course, pure experimentation. The prototypes shown are surely not the solution to this issue! I hope you have as much fun reading as I did making!

I began this experimentation after a conversation with a friend who became my first persona: Sophie.


Meet Persona One: Sophie

Sophie explained that she had been experiencing a bit of trouble when having conversations with her long-distance partner during her depressive episodes. During these episodes, she wants to express what she is feeling without stressing her partner out. However, when she sends a text that is too light-hearted, her partner misreads her feelings and replies with humor. On the other hand, when she sends a text that is more descriptive, it worries her partner. In person, her partner always reacts appropriately; over text, however, it is impossible for her to just say "I am having a hard day."

"How is a bee emoji supposed to make me feel better?"

“A bitmoji of a sun will make her smile more than the words good morning”

As I explored this tension in text conversation more, I discovered another persona: Georgia.

Meet Georgia

Georgia is a mother who wants to check in on her busy son at an out-of-state college. She doesn't like sending her son long or multiple texts in a day, but she does want to show that she cares and be a mother. While she can use bitmojis as a substitute for texts, she has found that bitmojis fail in emotionally sensitive conversations because they feel comedic.

It was clear: Text-based communication cannot meet the needs of emotionally dynamic conversations

When conversations are no longer simple and need to carry a specific tone (like wanting to express being upset without causing concern), text and emojis were not cutting it. (If they were, long-distance relationships would not be hard anymore, haha!) I was curious whether I could find an alternative solution!

How do we understand emotionally dynamic conversations?

So I turned to some cognitive science research papers for insights. It may be apparent to all of us that body language and facial expressions help us understand one another; however, as this research paper reveals, during emotional decision-making our minds prioritize body language even more and compare it against the verbal information. This means that during text conversations, half of the information we need to assess a situation is gone. No wonder there are so many problems... I decided to go deeper.


I discovered that when relying on faces, our minds look for information from our eyes and mouth. 

The eyes and mouth distinguish 21 emotions!

I found this fascinating because the eyes and mouth are two oval-like shapes. A slight change in the curvature of an oval allows us to distinguish, when someone says "I'm surprised," between angry surprise, happy surprise, and sad surprise: variation that is not seen in text conversations.
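To make the idea concrete, here is a minimal sketch (all names hypothetical, not from any prototype code) of how a single curvature parameter could bend the flat line of plain text into a smile or a frown, the way a slight change in an oval's curve shifts an expression:

```python
# Hypothetical illustration: one number bends a flat line (plain text)
# into a smile or a frown. Function and parameter names are my own.

def mouth_points(curvature, width=1.0, samples=5):
    """Sample a simple parabolic 'mouth' curve.

    curvature == 0 gives the flat line of plain text;
    curvature > 0 lifts the corners above the center (a smile);
    curvature < 0 drops the corners below the center (a frown).
    """
    half = width / 2
    xs = [-half + width * i / (samples - 1) for i in range(samples)]
    # y = curvature * (x^2 - half^2): zero at the corners,
    # dipping or rising at the center depending on the sign.
    return [(x, curvature * (x * x - half * half)) for x in xs]

flat = mouth_points(0.0)    # every y is 0: the "tone" of raw text
smile = mouth_points(1.0)   # center sits below the corners
frown = mouth_points(-1.0)  # center sits above the corners
```

The point of the sketch is only that tiny changes to one curve parameter are enough to separate very different readings of the same shape.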

Double-checking (but it turns out it's true: curves are important)

I decided to explore whether this theory of mine on curvature extends past realistic faces to the abstract. As you can see, the helmet and root both display vivid emotion, while the charger is a bit harder to read, perhaps because it utilizes flat lines. And unfortunately, flat lines are what texts are.

I decided to get weird and experiment with giving texts facial expressions and body language

I wanted to imagine what it would be like if texts offered the same information that the curves of our eyes and mouths provide in conversation.

Prototypes (far from reality but deep in the unknown)

In my prototype, a user could hit the tapback feature (where they typically go to add flair to a text) and modify a text to have curvature. Perhaps these curves would provide the necessary context to the respondent's brain, helping them assess the text better.

This is obviously an experiment, so it is far from done, but I am currently iterating on this idea. I really think there's potential to add body language to text, and I am excited to see where this leads.