Bark CEO Brian Bason founded his company after having a couple of kids of his own and finding himself concerned about online safety. Bark offers a machine learning-backed app and service to help parents work together with their children to navigate the dangers of today's digital world. Bark launched in January 2016 after about half a year of machine learning training and testing.
In a phone interview, Bason told All Analytics that the development process involved extensive testing and data labeling. Prior to launch, Bark ran not just a beta round but an alpha round as well. In the first round, Bark worked with about 50 users; in the second, it progressed to a few hundred users from different areas to encompass a range of demographics. That "gave us great insight and confidence that algorithms were production ready," Bason said.
Bark's algorithms are based on "advanced machine learning and statistical analysis techniques to recognize potential problems," according to the company's website. Bason said that because human language is "so highly nuanced," relying on keywords or even phrases is not sufficient to generate alerts, because the same phrase can convey different intents in different contexts. He offered the example of "I hate you." The sentence can express "standard envy or something more sinister," so contextual clues are needed, too. Bark's algorithms are designed to identify problematic messages because they have been trained on the millions of conversations "fed" into the system. Those messages teach the algorithm the contexts that enable it to accurately flag phrases signifying cyberbullying, sexting, and other potential problems.
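Bark's actual models are proprietary, but the limitation Bason describes is easy to demonstrate. This toy sketch (hypothetical watchlist and messages, not Bark's system) shows how a naive keyword filter fires identically on a playful message and a hostile one, which is exactly why context-aware classification is needed:

```python
# Toy illustration only -- NOT Bark's actual system.
# A naive keyword filter flags a phrase regardless of context.

FLAGGED_PHRASES = ["i hate you"]  # hypothetical watchlist


def keyword_flag(message: str) -> bool:
    """Flag a message if it contains any watchlist phrase, ignoring case."""
    text = message.lower()
    return any(phrase in text for phrase in FLAGGED_PHRASES)


# Both messages trigger the same alert, though only one is worrying:
playful = "I hate you, you always win at Mario Kart!"
hostile = "I hate you and everyone at school will hate you too"

print(keyword_flag(playful))  # True -- a false positive
print(keyword_flag(hostile))  # True
```

A keyword match treats both strings the same; distinguishing them requires the surrounding conversational context that Bark's trained models take into account.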
Bark's alert levels are tailored to the child's age. The company serves users from 5 to 17 years old, though the majority fall into the 8- to 14-year-old range. Bark is not just for kids who already have their own phones, either, because many non-phone devices given to young kids today are also capable of messaging. Bark offers "options to configure and customize a few factors like history of messages between participants."
Bark's approach seems to be working. Bason reports that more than 54% of children have experienced at least one issue that triggers an alert, and 80% of parents admitted they had not been aware of those issues before the Bark alert. It's all about that awareness, Bason said. Parents can't help their children if they are unaware of the problem.
Bason said parents come to Bark for a wide variety of reasons. In some cases, they come after their children have experienced some form of online bullying. Sometimes they come because they've seen these issues highlighted in the media. Parents may also learn about the dangers of social media from someone in their family or social circles. There are also parents who are buying their child their first device and want to provide some level of protection. Some of the parents may have attempted to use other products that they have found "too time consuming or too invasive."
Sometimes the child is the one who suggests the service. Bark doesn't give parents "unfettered access" to all a child's messages. It only shows parents the messages that were flagged as potentially problematic. The benefit for kids is that they don't have to sacrifice their privacy for the sake of safety. Bark also offers a way of "surfacing problems" for kids that they may feel uncomfortable bringing up themselves. The alert system means they don't have to report problems themselves, which takes pressure off them and can open the way for conversations with their parents.
Fostering these types of "healthy conversations" is a big part of the Bark solution. Aside from the alerts, the service says it provides advice on issues, including what practical steps to take, whether that means simply blocking an account or going so far as alerting authorities. Bason said that parents consider the alerts "a great opportunity to talk about the risks that are out there." Reviewing the alert and the recommended actions constitutes a "teachable moment" and an opportunity to get parents and children on the same page about online behavior.