One thing I sometimes enjoy more than mindlessly scrolling through social media is reading the comment sections: overly dramatic, funny, intelligent, sad, the comments are essential to my media consumption. Yesterday, while going through yet another random comment section, I got a pop-up message and was presented with a different interface, pictured below. I was encouraged not simply to like the comments, but also to rate whether they were civil and relevant. Upon further research, it seems the option has been available in certain markets since as early as April this year.
Given the network’s tumultuous past, including accusations that it fueled unrest in Myanmar and failed to stop hate speech fast enough, this looks like an attempt to crowdsource what the platform is still unable to do itself. If this feature ever becomes a standard option on the platform, it is likely that the input from millions of people, provided it is of good quality, will be used for machine learning, so that in the future “uncivil” and “irrelevant” content can be blocked automatically and as early as possible.

Naturally, Facebook is not the only company exploring automation and machine learning. It is inevitable that both will be huge parts of our lives in the future; how and when is less clear. What I suggest will happen is that a couple of years from now machines will have accumulated so much data that they will be effectively better at social interactions than most humans. I can imagine future motivational speakers at self-growth conferences simply serving big-data analysis in answer to the attendees’ questions.

Naturally, this will provoke a guerrilla response: the poor management of the new technology will drive thousands underground in an attempt to sabotage machine learning. Young and old will mark millions of relevant comments as uncivil and civil comments as irrelevant, adding heart and angry emojis here and there. Big companies will respond by democratizing the online space. No moderation. No curation. No sanitation. This is where the actual machine learning will begin.

The essential argument here is that trading our privacy for comfort is becoming a less tolerable option, and the lengths to which companies will go to collect information will surpass many expectations as more people grow aware.