Fifteen-year-old Sarafina, a female student in the capital city of Liberia, had a distressing problem at school:
Her math teacher refused to give her a report card unless she had sex with him.
Every day at school, he would request sexual favors and touch her inappropriately. Embarrassed, Sarafina kept the issue hidden from everyone, even her parents, until her father overheard a sexually harassing phone call the teacher made to their home. Sarafina’s father successfully confronted the man and got the report card, but his daughter was reprimanded for reporting her teacher’s sexual advances and forced to move to another school.
How Chatbots Are Used To Advance Social Impact & Humanitarian Initiatives
1. Bots Can Uncover The Truth
In Liberia, teachers enjoy high social status, but children, especially young girls, are culturally trained not to speak out, leading to a culture of silence and tolerance. While Sarafina’s story sounds extreme to Westerners, her experience is painfully common yet largely ignored in her country.
Enter UNICEF’s U-Report, a social reporting bot that enables young people in developing countries to report issues in their communities via SMS and other messaging platforms. U-Report polled 13,000 users in Liberia, asking whether teachers at their schools were exchanging grades for sex. An astonishing 86% of respondents said yes.
Within a week of U-Report’s discovery of the “Sex 4 Grades” epidemic, help hotlines around the country were inundated with reports of child abuse. Simply exposing a pervasive taboo inspired victims to speak up and reach out for help. Since then, UNICEF and Liberia’s Minister of Education have collaborated on a plan to stop the practice.
“U-Report is not just about getting questions answered, but getting answers back out,” explains Chris Fabian, Co-Lead of UNICEF’s Innovation Unit. “We get responses in real-time to use the data for policy change.” With over 2.6 million U-Reporters worldwide and deep expertise building technology for developing economies, the U-Report team is uniquely positioned to tackle challenging social issues like violence against children, HIV/AIDS policy, climate change, and war and conflict.
2. Bots Raise Awareness & Funds
Less than 50% of Ethiopia’s population has access to clean water, and only 21% has access to proper sanitation services. Unfortunately, cold statistics like these rarely move people to take action.
That’s why Charity: Water teamed up with Lokai and AKQA to create Yeshi, a Facebook Messenger chatbot that humanizes the water crisis in Ethiopia. Yeshi is a young girl in Ethiopia who walks 2.5 hours every day to the nearest reliable water source. She travels alone and straps huge plastic jugs to her back so she can bring gallons of water home to her family. You learn about her dreams of going to school and see a map of her journey.
Yeshi even asks you to send her a picture of where you live. “Wow! Our worlds are so different,” she remarks, before leaving you to continue her arduous walk alone again. The experience of “Walking With Yeshi” is undeniably emotional. Conversational experiences like this can be powerfully effective ways to convey the humanitarian challenges that face the global poor and inspire action.
Besides raising awareness, charities can also use bots and messaging platforms to raise critical funds. Charity: Water recently worked with Assist to let supporters donate directly from Facebook Messenger.
3. Bots Fight Bureaucracy & Inequality
19-year-old Joshua Browder is no typical teenager. The Stanford Computer Science undergraduate has single-handedly beaten over 160,000 unfair parking tickets with his bot, Do Not Pay. The sophisticated “robot lawyer” also helps tenants fight negligent landlords and the homeless apply for much-needed government support. Browder was inspired to help the most vulnerable segments of society acquire legal help they would otherwise never be able to afford.
“With parking tickets, the robot lawyer takes money from the government. However, so many government bureaucracies can be automated, like the DMV. Eliminating bureaucracy will actually save the government money,” points out Browder. “In the UK, there is this really broken system where the government pays a lawyer to file an application back to the government for a homeless person to receive support. The government wastes so much money with the application process when they should just spend that money on houses.”
Browder’s vision for Do Not Pay extends far beyond fighting parking tickets and filing homelessness applications. While some aspects of the law, like bankruptcy, are complicated and unintuitive, many legal processes can be modeled by computers as logical decision trees, as sketched below. Browder’s mission is to turn Do Not Pay into a legal bot platform where lawyers can identify aspects of the law that are automatable and create their own bots.
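To make the decision-tree idea concrete, here is a minimal Python sketch of a toy parking-ticket appeal flow. The questions, branches, and advice are invented for illustration and are not Do Not Pay’s actual rules; the point is simply that each step of such a process is a yes/no branch a bot can walk automatically.

```python
# Illustrative only: a toy decision tree for a parking-ticket appeal.
# The questions and outcomes are hypothetical, not Do Not Pay's real rules.

from dataclasses import dataclass
from typing import Union


@dataclass
class Outcome:
    advice: str


@dataclass
class Question:
    text: str
    if_yes: Union["Question", Outcome]
    if_no: Union["Question", Outcome]


# Each branch point is a plain yes/no question, so the whole "legal process"
# is just a tree the bot walks from root to leaf.
APPEAL_TREE = Question(
    text="Was the parking sign clearly visible from where you parked?",
    if_yes=Question(
        text="Did a medical or vehicle emergency force you to stop?",
        if_yes=Outcome("Appeal on grounds of emergency; attach any evidence."),
        if_no=Outcome("An appeal is unlikely to succeed; consider paying the fine."),
    ),
    if_no=Outcome("Appeal on grounds of inadequate signage; include a photo."),
)


def walk(node: Union[Question, Outcome], answers: dict) -> str:
    """Follow recorded yes/no answers down the tree until an outcome is reached."""
    while isinstance(node, Question):
        node = node.if_yes if answers[node.text] else node.if_no
    return node.advice


if __name__ == "__main__":
    answers = {
        "Was the parking sign clearly visible from where you parked?": False,
    }
    print(walk(APPEAL_TREE, answers))
```

In practice a bot would ask these questions one at a time over chat, but the underlying structure stays the same: a tree of conditions ending in a recommended action.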
Government bureaucracy is so pervasive that many other bots have cropped up to simplify civic matters. Against the backdrop of election cycle drama, several voter registration bots – HelloVote, GoVoteBot, and VotePlz – emerged to allow voters to skip onerous and error-prone paperwork and register simply through SMS and Facebook Messenger.
While competition between most Silicon Valley companies is fierce, voter registration is not a zero-sum game where startups squabble over limited scraps. According to Sam Altman’s VotePlz, only 54% of eligible young people were registered for the last presidential election and 10% of millennials aren’t sure if they’re even registered. Everyone is fighting for the same social goal: boost voter turnout and help citizens do their civic duty.
4. Bots Provide Social & Health Services
With the threat of Zika looming over the Americas, knowing whether you’ve contracted the disease is critical to getting timely and adequate treatment. GYANT, a healthbot on Facebook Messenger, walks you through a questionnaire of symptoms to identify your likelihood of having Zika. Concerned users can get a personalized answer immediately rather than wait for a doctor’s appointment or ignore the problem.
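For illustration, here is a minimal Python sketch of how a symptom-checker of this kind might turn a questionnaire into a rough risk estimate. The symptoms, weights, and thresholds below are invented and are not GYANT’s actual screening model or medical guidance.

```python
# Illustrative only: a toy symptom questionnaire that maps yes/no answers
# to a coarse risk bucket. The symptoms, weights, and thresholds are invented
# and are NOT GYANT's actual screening logic or medical advice.

ZIKA_SYMPTOM_WEIGHTS = {
    "fever": 2,
    "rash": 3,
    "joint_pain": 2,
    "red_eyes": 2,
    "recent_travel_to_affected_area": 4,
}


def assess(answers: dict) -> str:
    """Sum the weights of reported symptoms and return a coarse risk bucket."""
    score = sum(weight for symptom, weight in ZIKA_SYMPTOM_WEIGHTS.items()
                if answers.get(symptom, False))
    if score >= 8:
        return "High likelihood -- please contact a clinic for testing."
    if score >= 4:
        return "Moderate likelihood -- monitor symptoms and consider seeing a doctor."
    return "Low likelihood -- keep an eye on any new symptoms."


if __name__ == "__main__":
    # A user who reports a rash, joint pain, and recent travel.
    print(assess({"rash": True, "joint_pain": True,
                  "recent_travel_to_affected_area": True}))
```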
Many non-profits and government agencies run hotlines and support groups that face high demand with insufficient human staff. Some, like Samaritans, a suicide prevention group, are reportedly working on chatbots to offer faster response times and around-the-clock support. Such social support, whether given by human or bot, has a huge impact on people. Even gifting senior citizens a robotic seal has been shown to reduce stress and improve socialization. Beyond mechanical robots that address the physical challenges of old age, social chatbots can be built to address emotional and mental needs.
In the healthcare industry, providers are overwhelmed by the number of patients, most of whom need continued social and emotional support outside of their doctor and hospital visits. Sensely is a digital nurse bot with a human-like avatar that can save up to 20% of a clinician’s time by monitoring whether patients are dutifully following their prescribed regimens.
For mental health, conversational avatars like Ellie, a digital therapist developed by USC’s Institute for Creative Technologies, can interview patients and detect signs of depression and anxiety by analyzing their words, facial expressions, and tone of voice. Professor Louis-Philippe Morency, co-creator of Ellie, says the bot cannot replace a human therapist, but is a decision support tool that helps to “gather and analyze an interaction sample” for doctors.
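As a rough illustration of the multimodal idea, the sketch below combines separate scores for words, facial expressions, and tone into a single flag for a clinician to review. The weights and threshold are assumptions for the example and have no clinical basis; this is not Ellie’s actual model.

```python
# Illustrative only: a toy "late fusion" of per-modality signals, loosely in the
# spirit of multimodal systems like Ellie. The weights and threshold are invented
# and carry no clinical meaning.

def screen_interview(text_score: float, face_score: float, voice_score: float) -> dict:
    """Combine normalized scores (0..1) from words, facial expressions, and tone."""
    weights = {"text": 0.4, "face": 0.3, "voice": 0.3}  # assumed weights
    combined = (weights["text"] * text_score
                + weights["face"] * face_score
                + weights["voice"] * voice_score)
    return {
        "combined_score": round(combined, 2),
        # A flag for a human clinician to review -- not a diagnosis.
        "refer_to_clinician": combined >= 0.6,
    }


if __name__ == "__main__":
    print(screen_interview(text_score=0.7, face_score=0.5, voice_score=0.8))
```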
5. Bots Motivate The Right Actions
Can’t kick your nicotine addiction but too embarrassed to nag a friend to help every time you feel a craving? Public Health England experimented with a Facebook Messenger bot for their month-long Stoptober campaign to help smokers quit. Stoptober successfully helped 500,000 people quit smoking last year, an impressive 20% of the 2.5 million smokers who registered.
PHE’s marketing director Sheila Mitchell believes the addition of the Facebook Messenger bot as a support tool for smokers will increase the percentage of successful quitters. “The heart of the campaign is social,” explains Mitchell. “We found that the big numbers and responses come from social and that within this Facebook is absolutely dominant.”
Where Humanitarian Bots Fall Short
Addressing social issues requires emotional sensitivity, a critical skill that bots universally lack. LawBot is a legal bot created by Cambridge University students to help users in the UK understand the complexities of the law and identify whether a crime has been committed. People can use the bot to report rape, sexual harassment, injuries and assaults, and property disputes.
Unfortunately, the bot uses a strict checklist to assess whether a “crime” has been committed. If your report of sexual harassment doesn’t fit its preset criteria, the bot flatly responds that no crime was committed.
According to the Guardian, over half of women in the UK have been sexually harassed at work. Even if the offending actions don’t fit a neat legal box of being a “crime,” unwanted aggressions can cause lasting psychological damage and unnecessary suffering. Additionally, many corporate cultures discourage reporting to avoid expensive legal battles or PR nightmares.
When a bot bluntly tells a potential victim of sexual harassment that “no crime was committed here” without a detailed understanding of the situation, the results can be counterproductive and further discourage victims from speaking up. Even if LawBot deems that a crime has occurred, the bot’s only response is to send you the address of the nearest police station.
While artificial intelligence has not yet matured to the point where bots can respond to difficult situations with emotional acuity, a better approach for LawBot would be to connect distressed users to sympathetic hotlines, support groups, or expert human lawyers once the conversation exceeds the bot’s domain expertise.
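One way to picture that handoff is the Python sketch below: when a report touches a sensitive topic or the bot’s confidence is low, it routes the user to a human resource instead of declaring that no crime occurred. The topic list, threshold, and contact descriptions are placeholders, not LawBot’s code.

```python
# Illustrative only: a sketch of the human-handoff pattern described above.
# Topic names, the confidence threshold, and contacts are placeholders, not LawBot's code.

SENSITIVE_TOPICS = {"sexual_harassment", "rape", "domestic_abuse"}

HUMAN_RESOURCES = {
    "sexual_harassment": "a workplace harassment support line",
    "rape": "a sexual assault crisis hotline",
    "domestic_abuse": "a domestic abuse helpline",
    "default": "a qualified human lawyer",
}


def respond(topic: str, checklist_matched: bool, confidence: float) -> str:
    """Escalate to a human whenever the topic is sensitive or the bot is unsure."""
    if topic in SENSITIVE_TOPICS or confidence < 0.8:
        contact = HUMAN_RESOURCES.get(topic, HUMAN_RESOURCES["default"])
        return ("This sounds serious and may fall outside what I can assess. "
                f"I can connect you with {contact} right now.")
    if checklist_matched:
        return ("Based on what you described, a crime may have been committed. "
                "Here is the nearest police station.")
    return ("I couldn't match this to a specific offence, but that doesn't mean "
            "nothing happened. Would you like to talk to a human advisor?")


if __name__ == "__main__":
    print(respond("sexual_harassment", checklist_matched=False, confidence=0.9))
```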
Helping Bots Develop Empathy & Compassion
Do Not Pay’s Browder cautions that “the really big challenges are ones that require compassion and human judgment. If someone is to be granted bail, there is no set criteria. Bots work really well when there is a clear decision tree.” When bots address ambiguous issues, like rape or sexual harassment, even well-intentioned efforts like LawBot risk backfiring.
Many technology powerhouses are working to give bots the emotional sensitivity needed to make complicated judgments that can’t simply be captured in decision trees. As mentioned earlier, digital therapists like Ellie factor in facial expressions and vocal characteristics. Amazon is working to make Alexa, the cloud AI that powers the Amazon Echo, detect and respond to your emotions. SRI International, the research lab that created Apple’s Siri, is building new virtual assistants that emote just like you do.
“Humans change our behavior in reaction to how whoever we are talking to is feeling or what we think they’re thinking. We want systems to be able to do the same thing,” says William Mark, head of SRI International’s Information and Computing Sciences Division.
Even the emotionless bots of today have changed the world for the better, from revealing epidemics of violence against young girls (U-Report) to automating government bureaucracy like homelessness applications (Do Not Pay). Bots can tell stories to help you empathize with humanitarian crises (Yeshi), assist your healthcare providers (GYANT, Ellie, Sensely), and help you quit your worst habits (Stoptober).
We can’t wait to see what the empathic and compassionate bots of the future will do.
Watch My Talk At The Gates Foundation
I recently delivered a talk at the Bill & Melinda Gates Foundation where I highlighted the most successful use cases of chatbots for social impact.