
> at this point in events it is certain that the algorithm builders have very fine control over human mental patterns

IMO this dramatically oversells the power of recommender systems, and in a way which further serves their owners' interests rather than challenging them. In fact, I think what's clear is that they are at best able to achieve a very gross level of control over human mental patterns, one which is not meaningfully different than previous forms of media that have popped up throughout history. "Engagement", keeping someone scrolling long enough that they accumulate a nontrivial probability of clicking on an ad, is the lowest common denominator of marketing. Television, radio, and print media have long understood how to keep people serially "engaged" (consider the 'if it bleeds it leads' mentality of local news, or the emergence of 'angertainment' on CNN or FOX in the 90s and onwards).

But stimulating engagement is very different from actually controlling someone or altering their behavior in a way beyond "hey look at this interesting thing!". Consider that the click-through rate for Meta ads is on the order of ~1%, and this is literally their most valuable metric. They achieve this not by persuading people that an ad they don't care about is actually interesting (which to me would be the real acid test of whether they have 'fine control'), but rather by (a) effectively segmenting the audience in a way TV can't and (b) keeping the audience engaged long enough that maybe they click an ad. While they're no doubt good at both of these things, I think it's telling that the best these platforms have been able to do is the same strategy that every other form of mass media has also stumbled on: throw enough sensational crap your way that you stick around long enough to maybe click an ad.

To your point more directly: I agree that being able to agitate large groups of people in the same way is a dangerous ability, but I think it's also one that's very old and very common. It is not the unique province of 'algorithms', it's just the nature of mass media acting as a demagogue (look at the role of newspapers in the lead-up to the Spanish-American War, for an example that predates our modern era). The way we challenge this is IMO not by treating the problem as something entirely new and overwhelmingly powerful ("big tech algorithms are mind control rays"), it's by looking at the historical record and recycling the strategies which have worked before (libel and slander laws, journalistic ethics, and trust-busting). Certainly there are elements of the problem which are new and unique, but from where I'm sitting the differences seem smaller than the similarities.


