SHERPA
Smart Hub: Emotionally Responsive, Private, and Adorable
How might we design privacy-driven interactions into our smart home devices? How might we make the smart home more relatable?
Harvard GSD Research Seminar :: Enactive Design :: Spring 2021
A Collaboration with Grace Chee. Instructor: Jose Luis Garcia del Castillo y Lopez
Part of the winning portfolio submission for the GSD Digital Design Prize 2021
Featured in Harvard GSD Platform 13
Privacy and emotional relatability are common themes in recent Human-Building Interaction research, but are rarely addressed in interactions with the smart home systems in occupants’ most intimate domestic spaces. Our design for the SHERPA smart hub applies Human-Computer Interaction principles to reframe privacy control and design empathetic interactions with the smart home.
SHERPA (Smart Hub: Emotionally Responsive, Private, and Adorable) prototypes our concept of “tangible privacy” interactions, where users physically enable or disable their device's data collection for more reassuring control over data security. SHERPA also tests strategies for more empathetic interactions between occupant and IOT devices, including anthropomorphized device components and actuation of other smart devices in the home based on the user’s inferred emotional state.
Problem Space :: Privacy and Relatability in the Smart Home
The smart home feels more alive than ever before. We've given our internet-of-things-laden devices a mind of their own, allowing objects like smart locks and automated blinds to run completely autonomously. We speak to our homes as though they were named, personified beings ("hey Alexa, can you turn off the lights?"). We've even begun to co-habit our spaces with a range of robotic beings, with autonomous vacuum cleaners zooming through our living rooms and devices like the Gita cargo bot following us around with an almost pet-like attentiveness.
While the smart home moves ever closer to these fantastical visions of digitally animated spaces, there is a certain anxiety that the smart home could become as intrusive as it is personable. The smart home has become the next frontier for surveillance capitalism (a term used to describe the commodification of digital footprints), as devices like Alexa monitor their users silently and pervasively in the background.
Anyone who has ever felt the need to tape over an unused laptop webcam, or feels uncertain about the efficacy of the mute button on their Amazon Echo, knows the feeling of being surveilled in their most private, domestic spaces. Recent patents from Amazon and Google, which describe methods for collecting marketing data and tailoring advertisements through smart home hubs, prove these fears to be not entirely meritless.
SHERPA tests two techniques for addressing issues of privacy and agency in the smart home:
SHERPA introduces herself. Click to play.
1. Tangible Privacy
Tangible privacy designs reassuring privacy-driven interactions between occupant and smart home by giving occupants the ability to physically enable and disable data-collecting sensors. In what we affectionately call the "Mr. Potato Head" model of privacy-based interaction, sensors can be physically unplugged to reliably disable them, giving occupants direct control over data collection.
2. An Empathetic IOT
An empathetic IOT creates the sensation of empathy, or emotional resonance, between occupant and smart home through anthropomorphized smart home interfaces and automatic actuation of the smart home in response to occupants’ activity. The smart home is "taught" to recognize its occupants’ emotional state based on their activity in the home, and responds empathetically.
Tangible Privacy :: Designing SHERPA's Anatomy
SHERPA tests our concepts for tangible privacy in the smart home through a range of smart home sensors embedded in its body parts. A series of sensor breakout boards are encased in an anthropomorphized shell, with each body part correlating to the type of data being collected. For example, its ears collect audio data on the room, and its eye houses a camera collecting visual information (see a gallery of SHERPA's body parts, below).
SHERPA's microcontroller guts process these sensor readings and output sensor data via serial communication to an attached computer. Separate code written in Python takes these outputs and formats them into an Excel-compatible file for data analysis. Connecting SHERPA to this adjacent computer allows it to communicate with other IOT devices in the home.
SHERPA tells us about her sensors. Click to play.
Ear: An electret microphone processes audio data.
Eye: A webcam detects the occupant's level of movement, the area of the room they are in, how many people are in the room, etc.
Nose: A pushbutton detects whether SHERPA has been dressed up with glasses. The occupant uses this interaction to tell SHERPA they are working from home.
Left Hand: When attached, a tilt sensor detects if the occupant is picking up or putting down their phone.
Training Pad: When users first bring SHERPA home, they place her on a pad with sliders to report their "emotive state." This helps SHERPA correlate sensor data to mood.
Right Hand: When attached to the keyboard, a vibration sensor detects how much the occupant is typing.
SHERPA is designed with tangible methods for enabling and disabling data collection from these sensors, so that the user can relax, relieved of the feeling of being “watched” in their private space. Following our “Mr. Potato Head” model for privacy-driven interactions, these interactions let the user intuitively understand what is and isn’t being monitored by plugging and unplugging sensors from SHERPA’s body. For example, unplugging an ear might intuitively be understood to disable sound recording (see image above right), while covering its eye with a mop of hair might be understood to block video data collection (see image below right).
These interactions are designed as playful, tangible analogues to human-to-human interactions. For example, the placement of SHERPA’s hand on your keyboard, indicating that you would like it to monitor computer use, might be analogous to asking SHERPA to help you with your work. These pet-like interactions between device and occupant give SHERPA a sense of agency or vitality and endear the robot to users, making it easier to relate to and more comfortable to use.
Enabling or disabling audio data collection, by plugging or unplugging SHERPA's ear. Click to flip.
Enabling or disabling visual data collection, by covering SHERPA's eye with a mop of hair. Click to flip.
An Empathetic IOT :: How SHERPA "Feels" You
To create the sensation of empathy between smart home and occupant, SHERPA is designed to understand the occupant’s emotional state, and actuate the IOT-connected objects in the occupants’ home in response. SHERPA becomes an empathetic intermediary between the user and their smart home devices — it acts as a personified physicalization of the intangible IOT network users might otherwise struggle to understand and feel comfortable with.
Inspired by the MIT Media Lab's SNAPSHOT study, we conceptualized a system for training SHERPA to correlate its sensor data collection to occupants’ reported emotive states. For a two-week period, new SHERPA users would place their hub on the “training pad” and report their emotive state along four tracks: Alert vs Tired, Happy vs Sad, Stressed vs Calm, and Introverted vs Extroverted.
SHERPA tells us how you can train her to empathize with you. Click to play.
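To make the training-pad idea concrete, here is a toy stand-in for the learning step: it stores the slider positions reported alongside each discretized sensor pattern, then predicts the averaged mood of the closest known pattern. This nearest-pattern average is an illustrative simplification, not the neural network envisioned for SHERPA, and all names and scales are assumptions:

```python
from collections import defaultdict


class MoodModel:
    """Correlate sensor snapshots with mood sliders reported on the
    training pad (four axes: alert/tired, happy/sad, stressed/calm,
    introverted/extroverted), each in the range 0.0 to 1.0."""

    def __init__(self):
        # discretized sensor pattern -> list of reported mood tuples
        self.samples = defaultdict(list)

    def train(self, sensors, mood):
        """Record one training-pad session: sensor readings plus sliders."""
        key = tuple(round(v, 1) for v in sensors)
        self.samples[key].append(tuple(mood))

    def predict(self, sensors):
        """Average the moods stored for the nearest known sensor pattern."""
        best = min(
            self.samples,
            key=lambda k: sum((a - b) ** 2 for a, b in zip(k, sensors)),
        )
        moods = self.samples[best]
        return tuple(sum(axis) / len(moods) for axis in zip(*moods))
```

After the two-week training period, `predict` would run continuously on live sensor data, and its output would drive the empathetic responses described below.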
SHERPA is designed to take data from its input sensors, and actuate the smart home in response. Many previous "responsive environment" installations have tested regulating occupants’ emotive states in installation-like spaces using lighting and sound. However, smart home devices connected over local Wi-Fi networks can transplant many of the same effects outside of pre-constructed installation spaces and into regular domestic environments.
Taking advantage of these connections, SHERPA communicates its sensor data to a nearby computer via serial communication, where this data is processed and used to automatically actuate smart lights and speakers. These correlations between the data SHERPA collects from its occupants and the way it actuates its occupants’ domestic environment create the sensation of empathetic responses between the occupant and their smart home.
We designed four specific responses SHERPA will make in the smart home in response to occupants’ activities:
SHERPA's head turns to attentively track its co-inhabitants as they move around the room.
SHERPA controls smart bulbs in the home, changing light intensity or even setting dramatic, colored mood lighting.
SHERPA can control the home's smart speaker, setting music or playing ambient soundtracks.
SHERPA's hands vibrate if the occupant's phone or keyboard use is excessive.
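The four responses above can be sketched as a single dispatch from a sensor-state snapshot to a list of smart home actions. The field names and thresholds here are illustrative placeholders, not SHERPA's tuned values:

```python
def plan_responses(state):
    """Map a snapshot of sensor state to smart home actions, mirroring
    SHERPA's four designed responses. All thresholds are placeholders."""
    actions = []
    # 1. Track the occupant's position with SHERPA's head.
    if state.get("occupant_position") is not None:
        actions.append(("turn_head", state["occupant_position"]))
    # 2. High movement -> energetic mood lighting via smart bulbs.
    if state.get("movement", 0.0) > 0.7:
        actions.append(("set_lights", "bright orange"))
    # 3. Calm periods -> ambient soundtrack on the smart speaker.
    if state.get("movement", 1.0) < 0.2:
        actions.append(("play_audio", "ambient"))
    # 4. Excessive phone or keyboard use -> vibrate SHERPA's hands.
    if state.get("phone_use", 0.0) > 0.8 or state.get("typing", 0.0) > 0.9:
        actions.append(("vibrate_hands", True))
    return actions
```

Each action tuple would then be sent on to the relevant device (servo, smart bulb, or speaker) over the home's Wi-Fi network.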
User Testing :: Adopting SHERPA for a Week
SHERPA actuates your smart home to empathize with how you're feeling.
In our first iteration of SHERPA, the device reacts to dramatic changes in its sensor inputs, which likely signal that the occupant has switched to a different activity. For example, if SHERPA is in “focus mode” (indicated by the user placing glasses on its nose), excessive phone use triggers vibration motors in its hands, nudging the user to put down their phone. Similarly, if SHERPA's camera picks up a large increase in movement, it sets bright orange “mood lighting” in response to its energized, active occupant.
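Detecting a "dramatic change" can be as simple as comparing each new reading against a running mean of recent ones. A minimal sketch, assuming normalized sensor values; the window size and threshold are illustrative, not SHERPA's tuned parameters:

```python
class ChangeDetector:
    """Flag 'dramatic' changes in one sensor stream by comparing each
    new reading to the mean of a short sliding window of past readings."""

    def __init__(self, threshold=0.5, window=10):
        self.threshold = threshold
        self.window = window
        self.history = []

    def update(self, value):
        """Record a reading; return True if it departs sharply from recent ones."""
        dramatic = False
        if self.history:
            mean = sum(self.history) / len(self.history)
            dramatic = abs(value - mean) > self.threshold
        self.history.append(value)
        self.history = self.history[-self.window :]
        return dramatic
```

One detector per sensor stream would be enough for this first iteration; a flagged change then triggers the corresponding smart home response.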
We tested SHERPA for one week -- the video above shows recreations of users' experiences with SHERPA during their time adopting her. In future work, the actuation of these smart home devices will be correlated to SHERPA’s neural network predictions of occupants’ emotive states.
In future work, SHERPA's sensor data will be correlated to occupant's reported emotive state.
Click for Full-Sized Gallery: