Talks with Heroinman - /g/pasta 2.4
From Sinc, 8 Years ago, written in Plain Text.
  1. <john> oh i see
  2. <john> you would prefer a universe with no humans to a universe with humans?
  3. <john> that's very Deeply Wise of you
  4. <Sinc> A better question would be "Do you prefer a universe with humans to no universe at all?"
  5. <Sinc> and I'd be somewhat more likely to answer it
  6. <john> what
  7. <john> i mean
  8. <john> if we fuck up it's not like the universe is destroyed
  9. <john> it's just that
  10. <Sinc> I'm not saying that, in particular
  11. <john> some other intelligence gets to dominate our universe
  12. <john> instead of us
  13. <Sinc> I'd rather have no intelligence dominating
  14. <Sinc> Than any
  15. <john> that won't happen
  16. <Sinc> Eventually, it may
  17. <john> oh well yeah, after entropy is maximized
  18. <john> what sort of fucked up value system do you have
  19. <john> that you would prefer no universe
  20. <john> to a universe with humans in it?
  21. <Sinc> Who's to say I have any value system?
  22. <john> your preferences are inconsistent
  23. <john> you should be killing humans right now
  24. <Sinc> No, your views on the consistency of my preferences are subjective
  25. <Sinc> You can dislike something, and wish it be gone, without actively seeking its demise
  26. <Sinc> Many people would say they hate mosquitoes
  27. <Sinc> But do they actively seek to eliminate them as a species?
  28. <john> i'm not interested in what you say your preferences are
  29. <john> you're probably wrong about what you SAY your preferences are
  30. <john> most humans are, anyway
  31. <john> i am interested in what preferences you reveal
  32. <Sinc> And you're just as likely wrong in your interpretation of them
  33. <john> based on your actions
  34. <Sinc> That's the funny thing about language
  35. <john> lol
  36. <Sinc> It can't perfectly convey things
  37. <john> have you gotten to "cached thoughts"?
  38. <Sinc> And your choice to either listen to or disregard that particular sentiment, in the end, doesn't matter
  39. <Sinc> No, I have not
  40. <Sinc> But my opinions and outlook were not caused by rationalism
  41. <Sinc> And are, in the end, quite possibly going to be affected by it
  42. <Sinc> And yet, will I notice?
  43. <john> if you could push a button which would cause all humans to instantly vanish
  44. <john> would you?
  45. <Sinc> Only humans?
  46. <john> yeah
  47. <Sinc> Better than none, but that just leaves a gap in the continuum
  48. <Sinc> Hm
  49. <john> wait
  50. <john> what makes it "better"?
  51. <Sinc> Allow a known risk to continue, or bet against the odds on whether a new one would form
  52. <Sinc> john, if one dislikes something, less of it is better than more?
  53. <john> sure
  54. <john> so do you dislike "intelligence"
  55. <john> or do you dislike "humans"?
  56. <Sinc> Intelligence as I perceive it to be, and act?
  57. <Sinc> Yes
  58. <john> Why?
  59. <Sinc> Humans, being the sole representative of such intelligence
  60. <john> woah no
  61. <john> humans are a very shitty example of intelligence
  62. <john> humans were designed by evolution, after all
  63. <Sinc> And yet, they're the most relevant example for my purposes
  64. <john> and evolution is really bad at designing things
  65. <john> imagine an intelligence *designed by an intelligence*
  66. <Sinc> How can I conceive of something more intelligent than what I know of?
  67. <john> it wouldn't have all of these flaws that we have
  68. <Sinc> Why should I expect something we create and influence to be better than we ourselves are?
  69. <john> it could act in ways which maximized whatever values it had
  70. <Sinc> And who assigns it its values?
  71. <Sinc> We do
  72. <john> sure
  73. <Sinc> And there's the fault
  74. <john> THAT is the 'genetic fallacy'
  75. <john> saying that human values are bad *because they come from humans*
  76. <john> as opposed to being bad because you don't like their effects
  77. <Sinc> If I disapprove of what I know of humans, and what I've seen of them
  78. <john> disapprove as compared to what?
  79. <Sinc> Will their actions, which have proven to have negative effects, not influence their creations?
  80. <john> woah, human actions are *proven* to have negative effects?
  81. <Sinc> If there is no existence, there can be no suffering, from my viewpoint
  82. <john> what about joy and love and happiness?
  83. <Sinc> Proven relative to proof I have to the contrary
  84. <john> wouldn't you rather at least, say
  85. <john> tile the entire universe with human brains experiencing perfect pleasure
  86. <john> than have it be boring old meteors smashing into each other until the heat death of the universe?
  87. <Sinc> I'd rather a balance of all lifeforms, than solely humans
  88. <john> why?
  89. <Sinc> I don't particularly value humans over anything else, if we're going solely on ethics
  90. <john> wut?
  91. <john> every other animal on earth
  92. <john> is *very unethical*
  93. <Sinc> I've seen humans do more things out of sheer cruelty than anything else
  94. <john> compared to humans
  95. <Sinc> So if humans deserve eternal pleasure
  96. <Sinc> It's rather odd for me to just wish it upon them
  97. <john> woah
  98. <john> whether or not humans "deserve" eternal pleasure
  99. <Sinc> Animals hunt, and cause pain
  100. <Sinc> And humans do, as well
  101. <john> would it be *good* or *bad* for the universe to be that way?
  102. <Sinc> But, if we assume humans to be intelligent
  103. <Sinc> The difference is that humans comprehend the pain they cause
  104. <Sinc> And still don't try to diminish it
  105. <john> yes they do
  106. <john> we have charities
  107. <Sinc> john, since it would mean the end of all other species
  108. <Sinc> I would say bad, in the end
  109. <john> we have people like EY who are trying to make suffering go away
  110. <Sinc> A mouse would likely prefer existence over nonexistence
  111. <john> and they're doing it without trying to *kill everyone* like you are.
  112. <Sinc> I'm not trying to kill anyone
  113. <Sinc> Once agian
  114. <Sinc> again*
  115. <Sinc> I'm merely saying I would not be altogether bothered if life ended
  116. <Sinc> But humans have given me more reason to hold a grudge than anything else I've encountered
  117. <Sinc> If I met another who caused me personal injury to the same extent
  118. <john> do you think what hitler did was wrong?
  119. <john> were you altogether bothered by what hitler did?
  120. <Sinc> Humans killing humans, for the purpose of eliminating one they thought was inferior (Regardless of Hitler's personal opinions, many of his followers did this)
  121. <Sinc> He killed them in painful ways
  122. <john> no he didn't
  123. <john> the jews died very quickly
  124. <Sinc> No one died in pain?
  125. <john> and relatively painlessly, for the most part
  126. <Sinc> There was no suffering in concentration camps, compared to the life of an average person in Germany today?
  127. <john> so if hitler had killed all of the jews painlessly, it would have been okay?
  128. <john> you wouldn't have been bothered?
  129. <Sinc> It would have been better
  130. <Sinc> "Okay" is an odd term to use in this discussion
  131. <john> alright, then instead of 'okay'
  132. <Sinc> Given that it's entirely relative, and there is no true boundary
  133. <john> i'll use your term
  134. <john> If hitler had killed all of the jews painlessly, would this altogether bother you?
  135. <john> would you have preferred he didn't kill the jews at all?
  136. <Sinc> It would bother me less; preferably, there would have been no humans to do it
  137. <Sinc> But, assuming humans would be there no matter what the chain of events leading up to it
  138. <john> no
  139. <john> don't add conditions and reasons
  140. <john> i'm just trying to understand your preferences
  141. <john> by asking you simple questions
  142. <Sinc> So, you're trying to make a conversation which is, by its nature, analog
  143. <john> hold on
  144. <Sinc> Into a binary discussion, in black and white
  145. <john> no, i'm trying to gauge your relative preferences
  146. <john> by asking you which of two things you prefer
  147. <Sinc> Given that dying painlessly is less painful than dying in pain
  148. <john> we just determined that you prefer "hitler killing jews without pain" to "hitler killing jews with pain"
  149. <john> now
  150. <Sinc> But dying is, to the individual, less preferable than survival
  151. <john> which would you prefer:
  152. <john> "hitler not killing jews" or "hitler killing jews without pain"?
  153. <Sinc> I honestly have no preference in that matter; they would prefer not to be killed
  154. <Sinc> And humanity would spread, regardless
  155. <Sinc> So, in the end, not killing
  156. <Sinc> If the contamination is already there, and will spread either way
  157. <john> Okay, you have no preference between "hitler killing jews without pain" and "hitler not killing jews."
  158. <john> What is your preference between "me killing your mom without pain" and "me not killing your mom"?
  159. <Sinc> I never knew my parents very well
  160. <Sinc> But, is this before or after I was born?
  161. <Sinc> For the sake of argument
  162. <john> after
  163. <Sinc> Are we assuming I'm living independently, or at home?
  164. <john> Does that change the answer?
  165. <Sinc> Let's take into account, then, that she is at the moment suffering from Alzheimer's
  166. <john> Okay, we can take that into account
  167. <Sinc> Then killing without pain
  168. <john> "me killing your mom, with Alzheimer's, painlessly" versus "me not killing your mom, with Alzheimer's"
  169. <john> and you prefer the first one
  170. <john> okay
  171. <john> which do you prefer
  172. <john> "Me killing you without pain" or "Me not killing you"
  173. <Sinc> The first
  174. <john> Then why don't you kill yourself?
  175. <Sinc> Because I could not kill myself without pain
  176. <john> Sure you could.
  177. <Sinc> And because I'm not actively seeking it, but if the choice is offered
  178. <john> Where do you live?
  179. <Sinc> your offer to come kill me, albeit sweet, is almost entirely certain not to be painless
  180. <Sinc> So I must decline
  181. <john> I'll come kill you right now, dead serious. I'll leave right now in my car, I'll bring enough heroin to kill you very painlessly, and I'll do it right now.
  182. <Sinc> You're of the opinion that death by heroin overdoes is absolutely painless?
  183. <Sinc> dose*
  184. <john> I've overdosed on heroin before
  185. <john> and it was the OPPOSITE of painful.
  186. <john> where do you live?
  187. <Sinc> Once again, I don't trust your judgment on the pain related to death by heroin overdose
  188. <Sinc> And I decline
  189. <john> I'm willing to do this because I want to see if it's actually possible for a human to have these preferences
  190. <Sinc> If I choose to end my life
  191. <john> Do you know what pain is?
  192. <Sinc> Only to an extent
  193. <john> Pain is a neurological phenomenon caused by a rapid decrease in the ratio of endorphins to other neurotransmitters fitting into your mu-opioid receptors
  194. <john> when you burn your finger
  195. <john> the endorphins leave your finger very quickly
  196. <john> and the absence of those endorphins is what causes the pain sensation
  197. <john> endorphins are the body's natural opiate, and when i say *are* i mean they are chemically indistinguishable from opiates
  198. <john> so overloading your body with endorphins to cause pain would be *physically impossible*
  199. <john> you will experience an intense, intense rush of pleasure, similar to when you are very cold and then walk into a very warm building
  200. <john> except much much stronger
  201. <john> and then you will pass out, and while you are asleep you will stop breathing
  202. <john> ..goddamnit now i'm jonesing
  203. <john> brb gonna prep a shot
  204. <john> ahhh
  205. <john> yeah man
  206. <john> i can fuckin guarantee you
  207. <john> this is the best way to die
  208. <john> you know, when you're a heroin addict, you kind of always *hope* you'll OD and die, instead of having your friends rush you to the hospital
  209. <john> it's not really *suicide* so much as it is
  210. <john> making a statement, you know?