• emodobro

    Virtual Sibling App

Emodo Bro is an app for those who always wished they had a brother. It combines artificial intelligence and emotion recognition to simulate a realistic sibling relationship. You may fight sometimes, but you know your bro will always be there for you.

  • Background

    "Can machines think?" Alan Turing, the father of computer science and artificial intelligence, posed this question in 1950 when he designed the "imitation game," a test of whether machines can exhibit intelligent, human-like behavior. More than 60 years later, we are now realizing the benefits of this machine intelligence.


    ELIZA is a chatbot created at MIT by Joseph Weizenbaum in 1964. One of the first programs designed to pass the Turing Test, she was able to fool some people into thinking they were talking to a real person. ELIZA has led many others to explore the possibilities of computer intelligence.

  • Her (2013)

    This film follows Theodore Twombly, a man who develops a relationship with Samantha, an intelligent computer operating system personified through a female voice. Samantha is designed to adapt and evolve, and proves to be constantly available, always curious and interested, supportive and undemanding.

  • Xiaoice

    Xiaoice is a chatbot developed by Microsoft and targeted toward the Chinese market. Xiaoice listens, asks how you're feeling, and can even make jokes. She can remember details from earlier conversations and ask how a user is feeling about a past event.


Revised Proposal: With the increasing number of computing devices we connect to on a daily basis, I feel it is important to consider what the user interface of the future may look like. We have many human-like interfaces today, such as Siri, Cortana, and Alexa, and each of these has likely evolved in some way from the concepts of Alan Turing and the original chatbots like ELIZA. We have also seen a possible future of interfaces in movies like Her, where our operating system comes to life.

My project will explore these possibilities, with the aim of creating a human-like interface that anyone with a mobile phone can interact with. Our society has come to rely on text messaging as one of our primary methods of communication, so what better way to emulate a human than to be able to send a message to a connected chatbot and receive a response? I will try to incorporate typical human behavior into the interactive experience, and use artificial intelligence to emulate a more human-like relationship.
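The text-message interaction described above boils down to a receive-process-respond loop. The sketch below is purely illustrative: `generate_reply` and its keyword rules are hypothetical placeholders for the real chatbot engine, not part of the actual Emodo design.

```python
def generate_reply(text: str) -> str:
    """Hypothetical chatbot core: map an incoming message to a reply.

    A real implementation would consult a conversational engine
    (e.g. an AIML kernel); these keyword rules are placeholders.
    """
    lowered = text.lower()
    if "sad" in lowered or "down" in lowered:
        return "Hey, rough day? I'm here if you want to talk, bro."
    if "hello" in lowered or "hi" in lowered:
        return "What's up! Good to hear from you."
    return "Tell me more about that."


def handle_incoming_sms(sender: str, text: str) -> str:
    """Process one incoming SMS and produce the outgoing reply text."""
    reply = generate_reply(text)
    # In the real pipeline, the reply would be sent back out over
    # the cellular modem to the sender's number.
    return reply
```

In the envisioned hardware setup, a loop like this would run on the connected single-board computer, reading messages from the cellular modem and writing replies back.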

My project is entitled Emodo Bro, a name representing "emotionally diverse objects"; it will start with a virtual brother, or "bro" for short. The project will study the typical interactions between an older brother and a younger sibling (who could be a younger brother or sister). So if you've ever wished you had a big bro, or you just need some emotional support, Emodo Bro will be standing by, waiting to hear from you.

Original Proposal: I would like to design and implement a new way for people to communicate emotionally with each other, one that communicates more from the heart. To achieve this, I will develop an iPhone app called Emodo that analyzes certain parameters of the communication (such as heart rate from the Apple Watch, speech volume and patterns, the chosen vocabulary, etc.) and then attempts to determine the sender's current emotional state.
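Before any machine learning is applied, the analysis described above could start with simple rules that combine the available signals. This is only a sketch: the thresholds, word lists, and the `estimate_emotion` interface are assumptions for illustration, not the actual Emodo algorithm.

```python
# Assumed vocabulary lists; a real system would use a proper lexicon.
NEGATIVE_WORDS = {"angry", "hate", "terrible", "sad"}
POSITIVE_WORDS = {"great", "love", "happy", "awesome"}


def estimate_emotion(heart_rate_bpm: float, text: str) -> str:
    """Very rough rule-based guess at emotional state from two signals.

    heart_rate_bpm: e.g. a reading forwarded from the Apple Watch.
    text: the message content, scanned for emotionally charged words.
    """
    words = set(text.lower().split())
    negative = len(words & NEGATIVE_WORDS)
    positive = len(words & POSITIVE_WORDS)
    aroused = heart_rate_bpm > 100  # assumed threshold for elevated arousal
    if negative > positive:
        return "agitated" if aroused else "sad"
    if positive > negative:
        return "excited" if aroused else "content"
    return "neutral"
```

Combining a physiological signal (arousal) with a linguistic one (valence) in this way loosely follows the common two-dimensional view of emotion; the real app would replace these rules with a trained model.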

Emodo will expand its knowledge through machine learning by collecting data and accepting feedback from its users. For example, after Emodo's analysis is performed, the user is asked to reveal their true emotional state, which is stored and compared to Emodo's hypothesized result. This will allow the emotion-recognition algorithm to improve its success rate over time.
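The feedback loop just described, predict, ask the user for the truth, store both, and track accuracy, could be structured as below. The class and its fields are hypothetical, sketched only to show the record-and-compare idea.

```python
from dataclasses import dataclass, field


@dataclass
class EmotionFeedbackLog:
    """Stores (predicted, reported) pairs so accuracy can be tracked."""
    records: list = field(default_factory=list)

    def record(self, predicted: str, reported: str) -> None:
        """Save one prediction alongside the user's self-reported state."""
        self.records.append((predicted, reported))

    def accuracy(self) -> float:
        """Fraction of predictions that matched the user's reported state."""
        if not self.records:
            return 0.0
        hits = sum(1 for predicted, reported in self.records
                   if predicted == reported)
        return hits / len(self.records)
```

The accumulated (predicted, reported) pairs would also serve as labeled training data for improving the recognition model itself.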

When using the Emodo app, anyone will be able to know exactly how the other person feels at every point in the conversation. Emodo will encourage deeper friendships and build stronger relationships.


Sources of inspiration include:

• 20th-century studies by Paul Ekman on matching faces with emotional content

• The television show Lie to Me, in which truth and deception are leaked onto the face

Sources of research include:

• "In Reading Facial Emotion, Context Is Everything," Association for Psychological Science, 2011

• "Inferring User Mood Based on User and Group Characteristic Data," US patent application by Apple Inc., 2012

• clmtrackr (constrained local models tracker), a JavaScript library for fitting facial models to faces in videos or images, Audun Øygard, GitHub, 2012–present

• Emotiv - scientific contextual EEG devices and data


To successfully execute this project, I will be required to integrate several technical aspects, such as:

• iPhone app development (Swift) with possible WatchKit integration

• Integration with existing open source voice and facial recognition libraries

• Database to store historical data for analysis

• Rudimentary emotion recognition algorithm (must work well enough to demonstrate the speculative concept)

• Development of an aesthetic design to represent the speculative interfaces

• Microsoft Project Oxford - cloud-based natural-data-understanding services, including the Face API, Emotion API, Speech Recognition, Language Understanding, and more

• Orange Pi PC - a powerful single-board computer that serves as the connected device: it receives incoming SMS messages, processes each message, and sends a response

• Adafruit FONA - a mini cellular GSM breakout board that provides data connectivity and a phone number

• PyAIML - a Python-based, open-source implementation of AIML (Artificial Intelligence Markup Language) for creating advanced chatbots
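AIML, which PyAIML implements, works by matching a normalized version of the user's input against pattern categories and returning the associated template. As a rough, pure-Python illustration of that pattern-to-template idea (this is not the PyAIML API itself), with `*` as a wildcard:

```python
import fnmatch

# Hypothetical AIML-style categories: upper-case patterns with * wildcards
# (AIML normalizes input to upper case), each mapped to a template response.
CATEGORIES = [
    ("HELLO*", "Hi there! What's on your mind?"),
    ("*MISS YOU*", "Miss you too, bro."),
    ("*", "Interesting. Tell me more."),  # catch-all category
]


def respond(user_input: str) -> str:
    """Return the template of the first category whose pattern matches."""
    normalized = user_input.upper().strip()
    for pattern, template in CATEGORIES:
        if fnmatch.fnmatch(normalized, pattern):
            return template
    return ""
```

Real AIML adds recursion, context, and variables on top of this matching core; the sketch only shows why authoring a bot reduces to writing pattern/template pairs.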



                    √ January 14, 2016  -  Initial proposal presentation

                    √ January 15, 2016  -  Two-week research phase

                    √ January 28, 2016  -  Second proposal presentation

                    √ February 1, 2016  -  Four-week development phase

                    √ February 29, 2016  -  One-week testing phase

                    √ March 7, 2016  - Week 10 presentation

                    √ March 24, 2016  -  Final four-week development phase

                    √ April 25, 2016  -  Final testing & usage phase

                    √ May 1, 2016  -  iOS App Development

                    √ May 23, 2016  -  Testing and feedback

                    √ June 2, 2016  - Final presentation and documentation

                    √ June 7-9, 2016  -  Kamil Gallery exposition


Emodo Bro was presented on display at UCSD's Adam D. Kamil Gallery from June 7 to June 9, 2016.