this post was submitted on 24 Oct 2024
134 points (95.9% liked)
Gaming
The willingness to be responsible for consequences does factor in. If you round the corner and crash into someone, you probably didn't intend to, but whether you'll be an ass about it and yell at the other person or whether you'll apologise and check they're alright makes a difference.
In a perfect-information setting, intent equals result: if I know what my actions will cause and carry them out anyway, the difference between "primary objective" and "accepted side effect" becomes academic. But in most cases, we don't have perfect information.
I feel like the intent approach better accounts for blind spots and unknowns. I'll construct two examples to illustrate my reasoning. Consider them moral dilemmas, as in: arguing around them "out of the box" misses the point.
Ex. 1:
A person is trying to dislodge a stone from their shoe and, in doing so, leans on a transformer box to shake it out. You see them leaning on the box and shaking, and suspect they might be suffering an electric shock, so you try to save them by grabbing a nearby piece of wood and knocking them away from the box. They lose their balance, fall over and get a concussion.
Are you to blame for their concussion, because you knocked them over without need, despite your (misplaced) intention to save them?
Ex. 2:
You try to kill someone by shooting them with a handgun. The bullet misses all critical organs, they're rushed to a hospital and in the process of scanning for bullet fragments to remove, a cancer in the earliest stages is discovered and subsequently removed. The rest of the treatment goes without complications and they make a speedy and full recovery.
Does that make you their saviour, despite your intent to kill them?
In both cases, missing information and unpredictable variables are at play. In the first, you didn't know they weren't actually in danger and couldn't have predicted they'd get hurt so badly. In the second, you probably didn't know about the tumor and couldn't have predicted that your shot would fail to kill them. In both cases, I'd argue it's your intent that matters for moral judgement, while the outcome comes down to (bad) "luck" in the sense of circumstances beyond human control coinciding. You aren't responsible for the concussion, nor do you deserve credit for saving that life.