Humanitarian organizations are actively researching and testing how messaging apps and bots can help refugees and people directly affected by natural disasters.

UNICEF created its own bot, U-Report, to engage young people on a variety of issues. The bot, available via Twitter and Facebook Messenger, polls its followers on a range of topics and uses the data to help influence public policy. UNICEF's bot has had some early successes.

In Liberia, the bot asked 13,000 young people if teachers at their schools were exchanging grades for sex. 86% said yes, uncovering a widespread problem and prompting Liberia’s Minister of Education to work with UNICEF on addressing it.

Bots aren't just a technical challenge for nonprofits; they also present ethical problems, particularly for fundraisers.

Amazon and Netflix use algorithms to manipulate our choices of books and movies. Facebook manipulates what we see on the site to keep us there longer.

But bots may be able to manipulate our emotions in unprecedentedly unhealthy ways. Emotions have always had an appropriate place in fundraising storytelling; we give because we are empathetic beings. Bots, however, can offer donors companionship, empathy, and support, and then ask for donations. Where is the line between cultivation and manipulation? And who decides where that line is?

Read the full article on bots by Beth Kanter