This vote really has nothing to do with the 'karma system' or its implementation, so bringing it up in this discussion is a bit off topic.
But I will say this on it: it just seems ironic that someone who stumped for freedom of expression wishes to remove a tool that users use to express themselves.
And it's also ironic that someone who is against the binary polarization of political speech wants us to follow probably the single most harmful design choice made by the large Silicon Valley companies: the like/dislike buttons, which led us into a more binary expression trap. They have trained us that we can only thumbs up, thumbs down, or not vote at all.
I remember when YouTube had the star system, and I still like it better than the like/dislike system.
(The real reason YouTube made the switch, by the way, isn't that they found the star system too focused on people's feelings. It's that it's harder to train an A.I. when people give middle-of-the-road ratings about the quality of the content they're watching, so they made the decision binary so the computer could tailor its recommendations based on cleaner black-and-white input. AKA, programmer laziness.)