• LarmyOfLone@lemm.ee · 1 month ago

    Thanks, really interesting analysis! I'd argue that profit-seeking is still a significant driver of this change, though. Polarizing content provokes anger and higher engagement, so any algorithm written or trained to maximize advertising revenue will encourage that outcome.

    I believe there are other influences too: the fascists and Putin trolls (agitprop) have led to a sort of "automatic downvote and ban" reflex. On Lemmy the mods are basically power-tripping non-stop, trying to curate their fief into a single-minded community that brooks no dissent. There is one narrative, and anyone dissenting is a <insert slur>. On all sides, people are just sick of the bullshit and are on a hair trigger.

    This seems to be the result of the last decade of mainstream media and social media being run for profit through engagement. I have no idea how to reverse it.

    • OpenStars@discuss.online · 1 month ago

      Lemmy isn't run for profit - mostly (though there are some amounts of money involved, and moreover power & fame) - but being based on Reddit, it still uses that identical model. The same goes for Mbin, Sublinks, Piefed, Tesseract, etc. Someone would basically need to do all of the following: have an idea for alternative mechanisms, write the code for it, start up an instance, and promote it so others know about it - and a failure at any of those steps would prevent its acceptance by the global community. Meanwhile, Facebook, Threads, Xhitter, Bluesky, and yes Reddit can continue to innovate, possibly stealing the idea out from under someone and twisting it to meet their profit-seeking needs - though conversely, those platforms also generate ideas that non-profit projects can steal in return.

      One example is Reddit's automated CrowdControl (an optional feature available to mods of all subs): instead of a mod needing to outright "remove" an unpopular comment in a post, it simply gets collapsed by default. That works against the tendency toward echo chambers by letting people post dissenting opinions in the very same space as the majority of the community, who control what they want to see with their votes. Similarly, posts that are too lengthy could be cut off after a point, requiring a click to continue reading, but thereby letting you scroll past something you don't want to spend time on. But these are tiny things, and still many people wouldn't bother making many comments they know in advance will be unpopular - b/c what would they even gain from that? (Besides a brief relief at getting something off their chest - but how long can that keep someone coming back, for weeks and months and years?)
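      To make the collapse-instead-of-remove idea concrete, here is a tiny sketch of the shape of it (Python, purely illustrative - the threshold value and field names are made up by me, not anything Reddit or Lemmy actually exposes):

      ```python
      # Toy sketch: collapse (don't remove) comments the community has voted down.
      # The threshold and field names are invented for illustration only.

      COLLAPSE_THRESHOLD = -3  # net score at which a comment gets folded, not deleted

      def render_state(comment: dict) -> str:
          """Decide how a comment is shown: visible, collapsed, or removed."""
          if comment.get("removed_by_mod"):
              return "removed"       # mod removal still exists as a last resort
          net_score = comment["upvotes"] - comment["downvotes"]
          if net_score <= COLLAPSE_THRESHOLD:
              return "collapsed"     # dissent stays in the thread, one click away
          return "visible"

      # Example: an unpopular but rule-abiding comment stays readable on demand.
      print(render_state({"upvotes": 1, "downvotes": 6}))  # -> "collapsed"
      ```

      The point of the sketch is just that the unpopular comment never leaves the thread; the community's voting decides its visibility, not a mod's delete button.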

      One reason for that is power dynamics, which - regardless of for-profit vs. non-profit organization - still grant greater power to one "side" of a transaction than the other. Voting, for instance, is anonymous, whereas posting is not, hence voters (even lurkers!) have more power than content creators. All someone has to do is spin up their own instance, or join one of the many that don't even require an email sign-up, and they can generate as many votes as they want, for "free". As so many discussions have highlighted, content creators really are at a severe disadvantage compared to unethical voters, mods, and ofc admins - especially during those first few vulnerable minutes before a post has received any upvotes. After all, *I* may cast fewer than one downvote per day, maybe per week, and I also routinely sort content by New, but that's not what others (seem to) choose to do. So should downvotes be rationed? Or the source made publicly viewable? Mbin does the latter, btw, though it calls them "reduces" rather than the "downvotes" it shares with Lemmy.
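      If downvotes were rationed, the mechanics could be as dumb as a per-account daily budget - something like this sketch (the budget size, names, and storage are all invented for illustration; a real instance would persist this server-side):

      ```python
      # Toy sketch of "rationed" downvotes: each account gets a small daily budget,
      # so drive-by mass-downvoting actually costs something. Numbers are made up.
      from collections import defaultdict
      from datetime import date

      DAILY_DOWNVOTE_BUDGET = 5

      _used_today: dict[tuple[str, date], int] = defaultdict(int)

      def try_downvote(user_id: str) -> bool:
          """Return True if the downvote is accepted, False if the budget is spent."""
          key = (user_id, date.today())
          if _used_today[key] >= DAILY_DOWNVOTE_BUDGET:
              return False
          _used_today[key] += 1
          return True

      # Example: the sixth downvote in a day simply doesn't register.
      print([try_downvote("someuser@example.instance") for _ in range(6)])
      # -> [True, True, True, True, True, False]
      ```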

      Which further illustrates the trend towards echo chambers: they tend to work, to cut out some of the bull crap - if you ban an agitator, then all of their BS goes out the door with them: their downvoting, their harassment, their toxicity, etc. Speaking of harassment, another example of unequal power dynamics is receiving messages from many different people. E.g. I did not know what ChapoTrapHouse was all about, so when I replied there and subsequently received messages from different users for WEEKS and WEEKS afterwards, and then again from something on lemmygrad.ml, I had no option at the time but to receive those notifications. I almost left social media entirely, b/c that is an absolute waste of my time & attention, and by flooding me with unwanted spam they essentially took away the normal intended functioning of the notifications feature.

      But then the ability to block those instances was added, and now, after blocking them + lemmy.ml, I enjoy myself here. The only way offered to me to not receive tens and tens and tens and tens of replies was to cut myself off from them, i.e. curate my feed - which, if not a full-on echo chamber, is at least one step towards it. And yet… what other alternative is there? Ignore my notifications entirely? There are SO MANY of them, but only one of me, and this unequal power dynamic leaves me with no other choice - after all, it's not like I can apply filters to my notifications, where I could still receive messages from them but just not treat them with the same absolute highest-priority status that is assigned to every other notification. Also, prior to blocking the instance, they had the ability to live rent-free in my head, since I would need to read every one of those messages before I could know what it was about. This is not "fair", nor equal - hence echo chambers are not the absolute worst things in all of existence; rather, they are a poor solution to problems that are far worse (e.g. not having an echo chamber, and instead having nobody left in a community who is willing to speak, or perhaps even to lurk anymore, i.e. its death).
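      The notification filter I wish existed is something like this - purely a sketch of the missing feature; the instance names are just the ones from my own story, and everything else is invented:

      ```python
      # Toy sketch: replies from muted instances still arrive, they just land in a
      # low-priority bucket instead of demanding immediate attention.

      LOW_PRIORITY_INSTANCES = {"lemmygrad.ml", "lemmy.ml"}

      def triage(notification: dict) -> str:
          """Sort an incoming reply into 'inbox' or 'low_priority'."""
          sender_instance = notification["sender"].split("@")[-1]
          if sender_instance in LOW_PRIORITY_INSTANCES:
              return "low_priority"   # readable later, never pings you
          return "inbox"

      print(triage({"sender": "someuser@lemmygrad.ml"}))    # -> "low_priority"
      print(triage({"sender": "someuser@discuss.online"}))  # -> "inbox"
      ```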

      These are the tools that we have. If we want better ones, we need to make them - and that will require an emotional intelligence that most of us seem extremely unwilling to ponder. E.g. one idea, which most people seem to find really bad on first hearing, would be to implement something we already kinda do as humans: assign greater weight to people's votes based on their community-specific karma. This would be rough on new people joining, but if someone has been posting half of a community's content and then ten new accounts join - posting nothing at all, instead harassing the existing users and downvoting everything that doesn't match what they want - then those votes should count for "less", and the votes of the pillars of the community should count for "more". New people can always start new communities of their own - ofc that gets back to the "discoverability" issue - but it would virtually eliminate some of the less-organized "noise" trends that so often pollute social media streams, similar to how anti-cheating or captcha devices work: if someone can do as well as a human who knows the material, then that's arguably more of a success than a failure? :-P
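      Here is roughly what I mean, as a toy sketch - the log scaling and the numbers are completely arbitrary, just to show the shape of the idea:

      ```python
      # Toy sketch of community-specific vote weighting: a vote from someone with a
      # long history *in this community* counts a bit more than a brand-new account's.
      import math

      def vote_weight(community_karma: int) -> float:
          """Map a user's karma in this community to a vote weight >= 1.0."""
          return 1.0 + math.log10(1 + max(0, community_karma))

      def weighted_score(votes: list[tuple[int, int]]) -> float:
          """votes is a list of (direction, voter_karma) pairs, direction in {+1, -1}."""
          return sum(direction * vote_weight(karma) for direction, karma in votes)

      # Ten zero-karma drive-by downvotes vs. three long-time contributors upvoting:
      drive_by = [(-1, 0)] * 10
      regulars = [(+1, 500)] * 3
      print(round(weighted_score(drive_by + regulars), 2))
      # -> 1.1: the regulars narrowly outweigh the pile-on
      ```

      With flat one-vote-one-point counting the same post sits at -7; with weighting, the pile-on still registers but can't erase the community's own judgment outright.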


      But it would also come at a severe cost: tying a community's content to its content creators. And yet… is that such a bad thing? It essentially redistributes power from mods to users - not all users, but those who contribute the most. Then again, maybe this idea really is a horrible one - in which case, again, we would need to come up with a better one, somehow.