[personal profile] stardust_rifle
I generally think it's okay to have rules about what is and isn't morally okay that aren't "does this hurt people"; however, I don't think it's okay when those rules override "does this hurt people".*

As a related point: in the abstract, "is it okay to do a small amount of harm to someone in order to prevent a greater harm from happening later" is a question I'd probably answer "yes" to. But that doesn't hold when the small harm is definite and concrete (making people feel bad over their lack of religiosity or their porn preferences) and the "greater harm" is abstract and far-off, or just straight-up fake bullshit (the atheist eventually going to Hell, OP's porn preferences increasing the amount of sexism in the world and thereby making more women unhappy in the future).


* (For a definition of "hurt" that includes emotional damage, that is.)

Date: 2022-04-13 12:22 am (UTC)
From: [personal profile] feotakahari
Most utilitarians ditch the hedonic calculus, but I think Certainty, at least, is a very important point. It's often a bad idea to value what may happen in the future over what's definitely happening now.

https://en.m.wikipedia.org/wiki/Felicific_calculus
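
(Purely as an illustration of the certainty point, not anything Bentham actually prescribed: if you discount each harm by how likely it is to actually occur, a definite small harm can easily outweigh a speculative large one. The function name and numbers below are made up for the example.)

# Illustrative sketch only: a certainty-weighted ("expected") harm comparison.
# The harm magnitudes and probabilities are invented, not taken from the
# felicific calculus itself.

def expected_harm(magnitude: float, certainty: float) -> float:
    """Harm discounted by how certain it is to occur (certainty in 0.0-1.0)."""
    return magnitude * certainty

# A definite, small harm: making someone feel bad right now.
small_now = expected_harm(magnitude=1.0, certainty=1.0)      # 1.0

# A speculative, large harm: abstract future damage that may never happen.
big_later = expected_harm(magnitude=100.0, certainty=0.001)  # 0.1

print(small_now > big_later)  # True: here the certain small harm dominates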
