#1
2017-09-02, 22:12
Darkflame
Classic
Join Date: Feb 2001
Location: Sol, Earth, NL
Posts: 23,836
Darkflame's musings on a universal ethical principle.

Ok, a bit weird, but what follows is random ramblings on my attempt to find a universal ethical principle.

Been thinking about this stuff a long time, kinda needed to finally get it down.
(This isn't preaching everyone should follow this either. More I just want to get my ideas out there/feedback etc.)

Over my life I have slowly been trying to develop and refine an ethical principle. A simple guide to how to act in order to be a good person. Not that I won't fall short, but a thing to aim for.

"Do no harm" is a simple one, for example.

I used to be more specific though, and inspired a bit by Asimov I had:

"Thou shalt not, through action or inaction, allow a sentient being to come to harm"

To me action/inaction are essentially inseparable: any conscious choice to pick one future over another carries at least some responsibility. "Sentient being" was simply me replacing Asimov's "humans" with something more generic.

Eventually I shortened this idea down, though:

"Minimise harm"

After all, it's very hard to do actions that don't harm some people in some way, regardless of how slight. Actions have all sorts of consequences.
This can be interpreted as looking for the "lesser evil", which I think is fine, provided you're honestly trying to reduce net harm overall - and not merely using _some_ benefit as an excuse for something you want to do anyway.

However, I then realised that "harm" means different things to different people. I had shifted the complexities of the world and what is "good" onto a single word.
Not only does the word mean different things to different people, different people consider different things "harm" to themselves as well. Without a firm definition it's kinda useless.

So I tried to define it:

"Minimise Harm*

*Where Harm is defined as something the potential 'harmed' person would not want to have happen to them."


I was quite pleased with this. It's theory-of-mind based, but I think that's as it should be. It's not about what *I* consider harm; it's about what the potentially harmed person thinks. *Would they want this happening to them?* is the question to ask when determining whether someone is being harmed or not.

So, this takes into account various religious/cultural/individual preferences.
However, it doesn't take into account how people change day to day and over their life.
A child might not like getting vaccinated - but as an adult they probably appreciate that they were.

So I had to write a patch:

"Minimise Harm*

*Where Harm is defined as something the potential 'harmed' person would not want to have happen to them - assuming that person has full knowledge of the situation and its consequences, and has the mental competence to process it"


Not ideal, as it involves a lot more judgement calls. But I never said good ethics would be easy.
Shortly after that though, I realised there was a simpler way to look at it that amounted to the same thing:

"Minimise Harm*

*Where Harm is defined as something the potential 'harmed' person would _later_ not have wanted to have happened to them"

i.e.
Consider the future: would this person have wanted this to happen to them in their life?
It still involves extrapolating both future events and what the other person feels, but as a concept it seems a bit simpler to express.

That said, it's still messy to have a definition like this.

Eventually, just a few months back, I had a eureka moment, and I think I've got an ethical equivalent that amounts to the same thing but as a concept seems to work a lot better.

You see, once you hit upon the idea of "would this person have wanted this in their life?", I think you hit upon a more fundamental idea than just the negative of "harm".
What state do people want to be in?

So rather than saying "minimise harm",

why not the positive:

"Maximise people being in the state they would want to be in."

?

Now "state" could be anything, but it clearly excludes anything that would be considered harm in the early definition, while also encourging things to get better.
Can you make a persons life a bit better without making other people in a state they would not want to be?

It's still not by any means an easy principle to follow. Not by a long shot. It's not a simple rule that can be applied blindly - applying it to anything important like politics would take a lot of thinking, and you have to be very careful to be honest and without preconceptions when you do so.
But it's easy to understand, right?
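
If it helps, here's a toy sketch of how I picture it as a decision rule. Everything in it is invented for illustration - the names, the numbers, and especially the predict() step, which is where all the real difficulty (extrapolating consequences, and people's later, informed preferences) actually lives:

[CODE]
# Toy sketch only: one way to read "maximise people being in the state they
# would want to be in" as a decision rule. All names and numbers are made up.

def satisfaction(preferences, outcome):
    # How much this person would, in hindsight and fully informed, have wanted
    # this outcome, on a -1 (strongly against) to +1 (strongly for) scale.
    return preferences.get(outcome, 0.0)

def choose_action(actions, people, predict):
    # Pick the action whose predicted outcomes best match what the affected
    # people would want for themselves, summed over everyone affected.
    return max(
        actions,
        key=lambda action: sum(
            satisfaction(prefs, predict(action, name))
            for name, prefs in people.items()
        ),
    )

# Example: vaccinate a child or not, judged by what each person would later want.
people = {
    "child": {"vaccinated": 0.8, "sick": -1.0},
    "parent": {"child_vaccinated": 0.9, "child_sick": -1.0},
}

def predict(action, person):
    # In reality this is the hard part: extrapolating consequences and
    # informed, later preferences. Here it is just a lookup.
    if action == "vaccinate":
        return "vaccinated" if person == "child" else "child_vaccinated"
    return "sick" if person == "child" else "child_sick"

print(choose_action(["vaccinate", "do_nothing"], people, predict))  # -> vaccinate
[/CODE]

The plain sum there is just a stand-in, of course; how you weigh different people against each other is another judgement call entirely.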

Does it have obvious flaws? Things overlooked?
"Bad things that can happen if this idea is used as a guide to my actions in life"?

oh, and thanks for listening to my ramble
__________________
http://fanficmaker.com <-- Tells some truly terrible tales.
-
Phones & Tricorders & Blobs & Bombs & 3D Printers & TVIntros also;stuff
#2
2017-09-03, 09:18
kash
lba 4 life
Join Date: Jun 2014
Location: switzerland
Posts: 48
Very interesting rambling on ethics. OK, let's say I'm wondering if I should give money to a homeless person. How can I know what would maximize the well-being of this person? If I don't help him, will it mean that I harmed him? I think my choice would be biased by personal experience.
We all agree on fundamental ethical principles such as don't kill, etc. But the more intricate a situation gets, the more subjectivity seems to take hold of our opinions. A multitude of behaviors can be considered "good actions". Thus that would imply substantial freedom within the idea of general ethics, where each individual is left with the choice he thinks best according to his personal beliefs, as long as it stays within the ethical limits dictated by society, which themselves are prone to fluctuations.
Ultimately we must ask ourselves if such a "loose" sense of morality is acceptable. One possibility is to admit several levels of morality: first, fundamental laws like "don't kill"; then, on another level, guidelines like "be nice to people". The higher the level, the less people would be coerced into following it - hence more flexibility in decision making, and less importance given to the consequences of our actions depending on what level they are taken at. So to come back to what adequate behavior to have regarding the homeless person, I think it doesn't really matter... without wanting to sound pessimistic.
So anyway Darkflame, your point of view seems solid to me: trying to maximize people being in the state they want to be in; as long as it makes you happy and other people happy, I don't see any flaws with it.
#3
2017-09-03, 12:01
Jasiek
Do the evolution.
Join Date: Jul 2003
Location: You forgot Poland.
Posts: 8,090
Sam Harris has a bunch of books on that, Dan Dennett as well (and of course a crap ton of other philosophers going back, but let's disregard those old farts).

Basically, since neither good nor bad are real, existing, tangible "things", an academic pursuit of a definition might be seen as folly.

Our views on good and bad stem from our evolution as a pack animal, and as an animal in general. To me, trying to define morality without an evolutionary basis is unrealistic - meaning whatever you come up with might be an internally consistent system, but it most likely will not be THE system that's in play.

Animals generally (with a bunch of exceptions, of course) don't kill others of the same species, because if they did they'd go extinct (which has probably happened, leaving us with those that mostly leave each other alone).

Then, having to rear young makes animals develop empathy, meaning they can internalise states other animals experience - like hunger, anger, distress, or want - and feel the same emotions, so that they are forced to act upon them.

Then, as we go further and pack instincts form, everything clicks into place and we're left with basic tenets of moral behaviour:

(These are according to Jonathan Haidt - as I understand them)
harm/care
fairness/reciprocity/justice
ingroup/loyalty (with a subset of authority/respect)

Harm/Care:
Allows us to recognise harmful behaviour which we label as BAD, and helpful behaviour which we label as GOOD.
Fairness/Reciprocity:
Makes us recognise injustice (like an uneven distribution of food), so we can label fair behaviour as GOOD and unfair behaviour as BAD.
Ingroup/Loyalty:
Makes us able to recognise actions that are GOOD or BAD for the entire group.
This last one often misfires and causes things like... tradition... religion... where something is recognised as something that has always been, and therefore a positive thing for the group...

Harm, then, is defined as anything that causes the animal distress - pain, hunger, discomfort - or an action that is categorised by our internal compass as unjust.

Any of those three tenets may be overridden by the others in borderline cases - like causing harm being OK when it is just. The ingroup/loyalty tenet may completely dominate the others, and cause harmful, unjust behaviour for the sake of loyalty or tradition...
__________________
Little Script Adventure
Join the Little Script Adventure team
Download Little Script Adventure

Tags
ethics, good, morality, philosophy
