
These days, it's easy for humans to blame smartphones for "technologically caused" changes that are too negative for our taste. For example, we blame technology for today's "disconnectedness," choosing to text or message a friend or family member rather than sitting down and having a real human connection. But is it really technology that is to blame here? Or is there more to the story? A case in point is Facebook's move to replace journalist editors with a computer algorithm.

Recently, Facebook faced a barrage of accusations of left-wing bias in what appears in its "Trending" section. Surprisingly, Facebook editors admitted to the bias: to some extent, the editors were suppressing conservative news stories and topics. If you are Facebook, the logical solution is to replace the human editors with an algorithm that curates articles based on comments, likes, shares, and other engagement signals that do not depend on an editor's politics.
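
To make the idea concrete, here is a minimal sketch in Python of what engagement-based curation might look like. Facebook has not published its actual ranking method, so the signals, weights, and names below are purely illustrative assumptions.

    # Hypothetical sketch of engagement-based trending curation.
    # The signals and weights are illustrative assumptions, not
    # Facebook's actual algorithm.
    from dataclasses import dataclass

    @dataclass
    class Article:
        title: str
        comments: int
        likes: int
        shares: int

    def trending_score(article: Article) -> float:
        # Arbitrary weights, chosen only to show that the ranking
        # depends on engagement counts rather than an editor's politics.
        return 3.0 * article.shares + 2.0 * article.comments + 1.0 * article.likes

    def trending_section(articles: list[Article], top_n: int = 10) -> list[Article]:
        # Rank every candidate article by its score and keep the top N.
        return sorted(articles, key=trending_score, reverse=True)[:top_n]

    if __name__ == "__main__":
        candidates = [
            Article("Story A", comments=120, likes=900, shares=340),
            Article("Story B", comments=45, likes=2000, shares=150),
            Article("Story C", comments=300, likes=400, shares=80),
        ]
        for a in trending_section(candidates, top_n=2):
            print(a.title, trending_score(a))

The appeal of such a system is obvious: no single editor's taste decides what trends, only the raw numbers do.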

Now, Facebook is being publicly criticized for the move, and critics seem to be blaming technology for the loss of yet another human job. But is it really Facebook's fault?

There are about 1.7 billion articles circulating on Facebook on a daily basis. Now, imagine you are one of the editors who decide which articles go into the "Trending" section. It's easy to imagine being overwhelmed, exhausted, and stressed out.

Keep in mind that cognitive psychologists and neuroscientists have already shown that mental exhaustion erodes our ability to suppress prejudice and bias. Essentially, a mentally exhausted person tends to simply ignore what they don't like and accept what they already preferred in the first place. You can probably relate to this: ignoring data that disagrees with your position and welcoming data that supports your opinions.

In journalism, being unbiased is a top priority. But when exhaustion sets in, opposing views become uncomfortable for the mind, and it's far easier to turn a blind eye to an article one doesn't like. We all know that such a state of mind produces politically biased choices. There is also another pressing problem: the push toward personalization.


Social media and e-commerce giants are investing in technology that guesses, often accurately, what we like. Exposing us to products we already like substantially increases conversion rates and, with them, the company's profits. This may be great for the bottom line, but it is a dangerous proposition for society itself: such technology only reinforces our existing biases and dulls our openness to new ideas and ways of thinking.
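
As a rough illustration of how personalization can reinforce existing preferences, here is a minimal Python sketch of a naive recommender that only surfaces items similar to what a user has already consumed. The data and scoring are invented for illustration; no specific company's system is being described.

    # Hypothetical sketch of preference-reinforcing personalization.
    # Data and scoring are invented; this is not any real recommender.
    from collections import Counter

    def recommend(user_history: list[str], catalog: dict[str, set[str]], top_n: int = 3) -> list[str]:
        # Count how often each tag appears in the items the user already read.
        tag_counts = Counter(tag for item in user_history for tag in catalog.get(item, set()))

        def score(item: str) -> int:
            # An item scores higher the more it overlaps with the user's past tags.
            return sum(tag_counts[tag] for tag in catalog[item])

        unseen = [item for item in catalog if item not in user_history]
        # Items sharing the user's favorite tags float to the top, so
        # unfamiliar viewpoints are rarely shown: the filter-bubble effect.
        return sorted(unseen, key=score, reverse=True)[:top_n]

    catalog = {
        "article_1": {"politics", "left"},
        "article_2": {"politics", "right"},
        "article_3": {"sports"},
        "article_4": {"politics", "left", "economy"},
    }
    print(recommend(["article_1"], catalog))  # ranks article_4 above article_2

Even this toy version shows the pattern: the more you engage with one viewpoint, the less likely you are to ever be shown another.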

Here's the point: maybe we should not be so quick to judge technology as humanity's big problem. If you truly think about it, technology is just a mirror of ourselves. After all, did technology plant the idea of easy communication, or did we want it in the first place? Now that we have instant texting and messaging, we hardly look up when we go out with friends and family. Instead, we send emojis, comments, and likes while blaming technology for the disconnectedness. In the end, we should take a hard look at ourselves rather than blame Facebook for creating an algorithm that replaces human editors.