
Rebutting Tarl Warwick in Defense of Transhumanism

Styxhexenhammer666 vs. Marushia Dark

So, I'm a recent fan and devotee of Tarl Warwick, better known as Styxhexenhammer666. I'd heard his name before, but it's only within this year that I've really started delving into his videos, taking up the task shortly after his appearance on Stefan Molyneux's 2017 Christmas Special. From what I've seen so far, I really like Tarl a lot. I think it's safe to say he's one of the most intelligent people I know, and certainly one of the most well-read. The fact that I'm not makes it that much stranger, and more synchronous, that he and I seem to have independently converged on many of the same conclusions on subjects ranging from the socio-political to the spiritual to the technical.

He's even recently earned himself a spot in my Portraits of Inspiration gallery.

I assume just by virtue of the fact that we're different people that, no matter how similar our world views might be, there will inevitably be something we fundamentally disagree on. I have to say, though, I was surprised that thing wound up being transhumanism.

In the course of just randomly cycling through a few of his videos, the title Transhumanism Is Dumb: Here's Why stuck out to me. Naturally, as a transhumanist, I couldn't just let that go unanswered.

You can watch the full video here:

For those new to the subject, transhumanism basically just means a belief in the use of technology to advance human beings beyond their natural biological limits in some way, whether that's life extension, healing disease, body morphing, or - as he talks about in the video - pursuing immortality.

In some form or another, we humans have been using technology to improve ourselves ever since mankind first discovered fire. Tarl is wise indeed to be wary of ethical concerns over its use, but at a fundamental level, modern transhumanism is not much different from our past uses of technology, except perhaps in scope, scale, and time compression.

People wear glasses or get laser surgery to improve their sight, for instance, which would only be a step below gene therapy or cybernetic eyes.

Transhumanism means a lot of different things to a lot of different people, so I can't possibly account for all the different views thereof. Some people may well believe the things Tarl brings up for the reasons that he states and I might agree those reasons are naive or misguided. However, for the purposes of this article, I'm mostly going to be speaking of my own views on transhumanism, as of course, those are the only ones I really can, since I'm not a mind-reader ... yet.

Judging by the description given in the wiki article above, it's somewhat ironic that Tarl himself would fit the classification of transhumanist, given his evident interest in the subject and his support for lesser forms of healing and life extension, even as he rails against the concept as a whole.

Transhumanism generally falls into one or both of two overlapping subsets: genetic modification and cyberization. I tend to lean more heavily towards the latter myself, whereas Tarl's video seems mainly focused on the former, which I think is interesting. That could come down to personal aesthetic, difference in knowledge base, or possibly some other more trivial factor like maybe he just chose to focus on that for the purpose of making his broader point. I don't know. Again, not a mind-reader and I don't presently have his ear so, Tarl, if you're reading this and would like to respond, by all means.

(Just, as a brief aside, I hope you don't mind that I keep referring to you by your proper name for simplicity. If you'd rather be called Styx or something else, I'm happy to do that.)


"The problem that you run into with transhumanism is this: the number one problem is that transhumanists stress that they want to become immortal; that is, that they want to defeat the aging process as it currently is so that their lifespan is indefinite."

In itself, this is not really a problem. In fact, it's the goal of all lifeforms to live forever, or at least as long as possible, and that sentiment is arguably shared by all things, period, if you believe in a more animist worldview. A rock, for instance, might well desire to exist in its holistic state indefinitely but merely lacks the capacity to express its will beyond the limitations of its physical properties.

Likewise, large-scale entities such as stars and planets could be anthropomorphized in this way and for all we know might well have a form of intelligence that mere humans are simply incapable of understanding, let alone interfacing with short of some biological, psychic, or spiritual connection as with the Pandoran Neural Network from Avatar or the Judeo-Christian deities known as the Fiery Ones.

Both secular and religious traditions have, since ancient times, fixated on the idea of immortality and acknowledged that as the primary goal of man on as many dimensions as possible, with everything else we do being merely a byproduct thereof.

"[Stuff on telomerase and aging]"

Indeed, that's one way to go about solving the problem of aging; but I think that long-term, unless we rely heavily on augmented nanotechnology, a purely biological solution would be insufficient and even undesirable compared with artificial materials, which is partly why I tend to lean more heavily towards cybernetics.

Although, there's certainly something to be said for the possibility of developing easily-renewable robots made from biological material.

My main problem with your argument, Tarl, is that you're not thinking far enough in advance. Yes, telomerase could well enable us to live another hundred years or two, but in that time, don't you think we'll have made other advances along the way? According to the trend of exponentially increasing computing power, we're set to hit the singularity by 2047, which is well within your and my lifetimes.
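For what it's worth, the arithmetic behind that kind of projection is simple compounding. Here's a minimal sketch of the reasoning; the starting compute, the "brain-scale" threshold, and the doubling period are all illustrative assumptions picked for the example, not settled figures (estimates in the literature vary by orders of magnitude):

```python
# Illustrative sketch only: when might exponentially growing compute pass a
# (hypothetical) brain-scale threshold? All numbers below are assumptions
# chosen for the arithmetic, not established values.
from math import log2

compute_now = 1e15     # assumed available FLOPS circa 2018
brain_scale = 1e18     # assumed FLOPS for brain-scale computation (varies wildly)
doubling_years = 2.0   # assumed Moore's-law-style doubling period

# Number of doublings needed, then convert to calendar years.
doublings_needed = log2(brain_scale / compute_now)
year_reached = 2018 + doublings_needed * doubling_years
print(round(year_reached))  # ~2038 under these assumptions
```

The point isn't the specific year; it's that under any sustained doubling trend, a thousandfold gap closes in only about ten doublings, which is why these projections land within a single human lifetime.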

So it's conceivable that at least one person might upload their consciousness into the machine, à la Transcendence, or have sentience otherwise emerge as a result of machine learning, and then some quantum supercomputer takes over the job of research, plowing through advances that to us would seem like magic, as per Clarke's Third Law.

Even if we're off by a hundred years, other forms of life extension could arise in that time that give those now-living a slight edge and hope enough to see it.

Maybe Elon Musk has a breakthrough with his mind-machine interface and we all just store our consciousness digitally, living on the internet like in Ghost in the Shell or something, such that fleshy bodies aren't even needed. Maybe we back up our memories to the Cloud, and anytime one body wears out, we just swap it out for another. That's certainly another way to work around the problem of aging without necessarily dealing with it head on.

Again, I think you're not dreaming big enough here and limiting your conception of what might be based on what currently is. If you told someone even a hundred years ago that you had a magic speaking stone that, with a simple hand gesture, would let you open up a portal to communicate with people living in the sky, they'd send you to an insane asylum or something; whereas today, we take it for granted we all have touchscreen smartphones that could face-time the ISS.

"And if a person lives long enough, eventually, no matter how healthy they are, they would become demented."

I suppose in a technical sense that's true if for no other reason than entropy. Everything breaks down; but again, assuming the soul / consciousness doesn't fall prey to this same phenomenon, it's highly conceivable that we just endlessly switch out parts. Presumably, as time goes on, we'd eventually discover ways to transcend into higher modes of being such as merging with the planet and living for billions of years.

If you were to ask most transhumanists, the idea of living for thousands, if not millions of years, might well effectively feel like immortality enough to satisfy their desire until eventually they decide they've had enough of life and want to experience what death is like.

"None of this application makes a person invulnerable. What transhumanism would do, taken to its logical extent, is create a world in which the population would probably be subject to violent death. That is, that a person would no longer be able to die [of disease], therefore, what kind of death could you have other than a violent one?"

Well, do you consider suicide and euthanasia to be violent? Could one not simply inject something into themselves that artificially stops whatever the transhumanist technology did? Sure, it wouldn't be a natural death, but that's certainly a lot different from the other forms of death you go on to mention; and if it's done in a voluntary manner because the person wants to die, what's the problem with that?

Maybe that's exactly what those death cults you mentioned are for. Maybe it's not that dying is some cool fad, but that the inability to die poses a potential problem for those who are tired of living and actually want to die, and the cults work to figure out a way to do that.

I'm sure we'd likely agree there are lots of pragmatic and ethical issues arising from that, of course.

"It won't protect you if you get your head cut off ... shot full of bullets ... if a volcano goes off and burns your home down on top of you. It makes a violent death inevitable."

Not necessarily. If you're only looking at transhumanism from a vantage point of life extension, then yeah, being a Tolkien Elf doesn't make you invincible, just ageless.

However, as I said at the outset, there's more to transhumanism than just life extension. You could develop genes for skin that makes you impervious to decapitation, or have super strength such that you can deflect bullets like Luke Cage, or super intelligence that says, "Hey, maybe don't live next to an active volcano," or maybe you become a fireproof cyborg and it doesn't matter either way.

Maybe you're linked up to the Internet and exist in a bunch of expendable bodies all over the world, such that anything bad happening to one of them doesn't affect your consciousness. Life just becomes a grand video game where you respawn a new avatar or something. That's what reincarnation is, after all.

Hell, if we're living in a simulation, that might even be what deja vu is.

"The second problem it would need to grapple with, assuming it were feasible for transhumanism to actually be in sway, would be the concept of reproduction."

I've actually touched on this extensively in two other articles, so I'm not gonna rehash it here. I do agree it's a very serious philosophical question that needs to be faced, and quickly. Setting aside the trivially obvious ethical cases, like the fact that mass culling is bad, I'd say about the best we can hope for long-term is that people reach the conclusion on their own to organize themselves in such a way that we continue to have sufficient replacement rates in balance with the carrying capacity of the environment.

If you're living to be ten-thousand years old, maybe you don't feel the same sense of urgency to have kids right away, so that prevents overpopulation; but maybe by then we're already breeding people in labs or making robot clones so if ever we're running low on humans, we just fire up the generators and print some more.

Or maybe in that time, we're already colonizing other planets and have plenty of room.

I think humans, especially those organized around a technocratic society geared towards problem solving, would be smart enough to react to situations as they arise in real time and adjust their behavior accordingly, since that's what we've been doing since the dawn of mankind. We've more or less always lived in a situation where we're trying to balance too much of something with not enough of it, so I think this will be largely the same in that regard.

Look at China dealing with overpopulation while Japan is dealing with underpopulation. The simple fix is just have people move.

"Or conversely, if you reduce or get rid of the reproductive faculties - that is, everyone becomes sterile, not necessarily impotent, but sterile - then you run into a problem where if you do have a major disaster, the population is not capable of reproducing. You now have a greatly reduced population that potentially even loses the scientific literacy necessary to perpetuate that transhumanism."

This is sort of what some people believe happened to the alien Greys: that some disaster, such as nuclear radiation, made them all incapable of reproducing in their normal way, so they came to Earth and crossbred their genes into early hominids to create Homo sapiens. Whether that's history or science fiction, it at least suggests one possible workaround to perpetuate the species in some form.

I think your argument depends largely on context, though it's certainly interesting to think about.

For instance, the sun could burp tomorrow and all life on Earth would be wiped out, or at least we could be left sterile like the Greys. Not much we can do about that, so is that something we should worry about? What are the actual odds of it happening? Or at least, what are the odds of it happening before our technology is sufficiently advanced that we could skip over to Mars or breed test tube people or something like that?

Again, if we're projecting much further out into the future, there are potentially a dozen different ways you can solve that problem.

What sort of disaster are we talking about here? If we're at a point of creating designer babies, why wouldn't we be able to just flip the switch and set things back just as easily as flicking the circuit breaker following a power outage? Granted, it all comes down to timing. There could be things we don't yet know, unforeseen problems in the future, but since we're merely waxing hypothetical, I don't see that as necessarily being an argument against transhumanism itself.

And again, Tarl, you seem to be focusing mostly on the genetic reproduction without considering the possibility of cybernetic reproduction. Are there potentially scary existential things that could happen in that scenario too? Sure. Like if we're all machines, an EMP could wipe us out; but again, that's not much different from fearing an angry sun god. Maybe we take a lesson from blockchain and don't store all our consciousness in one basket.

Even as a transhumanist, I don't know all the answers to these questions. Figuring it out is part of the fun of exploring as we go, and I appreciate you bringing them up as valid concerns, but that doesn't make transhumanism dumb.

"[They're living in the bronze age, no longer producing vaccines or electricity]."

It's certainly possible humanity undergoes such a regression, but I wouldn't say that's because of transhumanism. Indeed, something might be lost if we become too dependent on technology to take care of us, but as I've said before, technology is ever advancing and has been since the discovery of fire. That some disaster might set us back isn't any reason not to continue forward.

Look at the Dark Ages. Rome had plumbing and public sanitation, with bathing a regular thing even among the lower classes of society. Following the collapse of the empire, theocracy took hold for centuries, people reverted to superstitious beliefs that bathing was sinful, and we lost plumbing and sanitation, so by the time the Black Death took hold, a third of Europe was wiped out. You could argue that was a similar situation to what you're proposing, but then out of that horrific period came a consolidation of wealth, as survivors looted the dead, and the Renaissance was born, which brought us, among other things, the helicopter, the hang glider, and the tank, all courtesy of Leonardo da Vinci.

Again, it'd depend on the context of the disaster. Mother Nature's always itching to fuck the human race, but assuming the tech itself survives, it's there to be rediscovered.

"The simple breakdown of law and order could destroy transhumanism."

Again, that's only true if you take the narrow view that transhumanism only constitutes a quest for immortality, and even then not really. Granted, immortality is an important, perhaps even primary, goal of transhumanism, and crime may disrupt the advancement of technology from time to time, but your statement just shows a critical misunderstanding of the movement at an existential level.

About the only thing that could destroy transhumanism is hippie luddite terrorists who just outright ban all forms of technology. Is that what you meant by crime?

"You'd eventually get a generation that would applaud death. Probably some of them would be suicidal, many of them would be homicidal. You'd literally get a death-cult going."

Just to keep this simple, assuming transhumanist technology was applied strictly within the confines of a voluntary libertarian society of the kind you and I both strive for, I think we would both agree that homicide would be a bad thing, but why wouldn't we permit people to be suicidal? As I mentioned before, if a person just gets tired of living, why shouldn't they be allowed to off themselves and experience what death is like?

As you go on to say with the robot armies, yeah, that's something that could happen and would be a form of technocratic-tyranny. But that just shows the need for transhumanist technology to be tempered by ethics, like with any other technology.

Ethics don't stop being a thing just because you're a transhumanist, though adopting the transhumanist mindset can help obviate many of the underlying motivations of unethical behavior, as in the case of things like gendered or racial bigotry. As you talked about in your video on there being only two genders, for instance, a transgender person would be less prone to violence if technology could enable them to have passing privilege.

Granted, enlightened people shouldn't have to wait for that to happen before treating people with respect and decency, but the world is clearly filled with many unenlightened people, so technology can and will be used in aggregate to solve social problems.

"The term 'death cult' is used by transhumanists to describe anyone who doesn't believe in transhumanism anyway, for the most part."

Maybe the subset of transhumanists that narrowly focus on immortality, perhaps, but again, there are other branches of transhumanism. I know I don't personally use it in that way.

(Logical fallacy, I know, but still worth mentioning.)

"Anyway, no matter how much you elongate your lifespan, eventually the sun dies or a significant enough asteroid collides with Earth to wipe out all life on its surface ... but in the long span, it would just mean you die a violent death. It wouldn't actually make you immortal, it would make you functionally immortal for a very long period."

It would make you immortal in the elven sense, but not invincible.

Either way, your argument views transhumanist immortality in a vacuum, which is totally divorced from the way the world actually works. Right now, as we speak, Elon Musk is working on trying to colonize Mars, so if you're playing the long game, you have to assume we won't still be confined to just Earth by the time these events take place.

An asteroid may be worrisome, but I'm sure we'd have nukes and rockets and lasers and other things to deal with that long before we have immortality.

As for the sun dying, that's not set to happen for billions of years, by which point we'll probably have built a Dyson Sphere around it and drained it of all power before it could actually do that. These scenarios you come up with are all highly unrealistic and don't take into account humans' ability to notice and adapt to changing conditions, or the fact that, on the time scales you're talking about, it's very likely we'd more closely resemble the Borg or Dr. Manhattan than modern humans before any of this starts to become relevant.

"It seems that the fundamental reason that transhumanists are transhumanists is that they fear death. They're terrified of the idea that there's nothing that comes after that and they tend to be rather dry, secular individuals."

I can't speak to the temperament of the average transhumanist as being a dry secularist; but the only people who aren't afraid of death are those who are certain an afterlife exists, which requires a degree of faith (or at least delusion). Indeed, what is the concept of an afterlife but an extension of the desire for immortality?

Returning to my point about the goal of mankind being maximal survival for as long as possible, the religious conception of eternal life is just another form of transhumanism, albeit in a spiritual sense, rather than a technological one.

The goal of the transhumanist would merely be to bring about that realization into a single lifetime, if indeed such a thing can be done, rather than risk the possibility that an afterlife is non-existence.

You and I even share a similar conception of reincarnation (though I also believe in karma, albeit not quite as religions like Buddhism or Hinduism teach it), so what is that if not a form of immortality? The only difference is that instead of counting in days, you count in lifetimes.

Even though we both hold the belief that reincarnation is likely - perhaps even a near-certainty, mathematically speaking - you and I both have the intellectual humility to admit that we can't really know 100% that that's true, and so there's always going to be that lingering doubt in our minds of the remote possibility that all that awaits us is an endless nothing.

Now, granted, I agree with you that, if that were the case, it wouldn't matter after you die, because you wouldn't be able to tell you're not existent. However, when you're alive, you can think about the prospect of your own nonexistence, and I don't consider it particularly irrational to be concerned about that sort of thing, given that, while you're alive, the state of conscious being is all you've ever known.

Trying to conceptualize nothingness is a truly maddening endeavor. It simply can't be done because your mind will always think of something; and so whereas all fear is ultimately a form of fear of the unknown, the fact that we can never know what nonexistence is like should cause a very deep chilling sensation in us. It's that fear that drives our survival instinct and so either way, whether it's for sake of efficiency in not having to replay life from birth, or just staving off the abyss, I think you can make a case for the adoption of immortality as a goal.

Whether you or I live long enough to realize that dream, well ...

"The problem is, no person alive today is going to be privy to that technology."

Maybe. We'll certainly find out together. It depends a lot on how it comes about. As I said, I put more stock in the Transcendence path as being the more likely means by which we enhance life expectancy.

I wouldn't normally talk about this online, but given that you're of a spiritual mind and have had experiences with demons and other entities in the past, perhaps you might actually believe this story. It'll also segue into the stuff about reincarnation later.

So, when I was in college, my roommate and I delved deeply into past-life regression. Unlike you, I actually put stock in it to a limited extent, based on my own success therein, though I would agree most so-called regression is full of charlatanism. During that same time, an entity we'd contacted told me I would live to be 111 years old. For whatever reason, this statement resonated with me. It seemed an odd number to pick, right? But it felt accurate in a way that's hard to describe to people who haven't dabbled in the occult like you or I have.

Obviously, I'd like to preserve my natural holistic body for as long as humanly possible and only augment it as needed (at this time, I'll not say one way or the other whether I've done that). I'm not stupid enough to be the fence-tester for emergent technology that could risk damaging my body in an irreversible way, though as I get older and closer to death, I would of course more seriously consider it. In that same time, the technology will have advanced exponentially beyond where it is today:

[Chart: exponential growth in computing power]

The question for me is not if, but when. Will my body hold out for long enough to allow the technology to catch up, and will that technology evolve fast enough that I feel confident in using it?

That's the part I can't answer until it happens.

As a Cartesian Skeptic, I'm acutely aware that, even if I watch other people go under and come back up as seeming ghosts in the shell, there's no way I can know for sure their consciousness has transferred from organic to robotic. I can't know that any more than I can know that other people aren't just NPCs, even in their Moist Robot bodies.

So ultimately, it would be an act of faith as surely as going under the knife or going to sleep would be, hoping that the loss in consciousness won't be permanent.

Even if I decide to take that leap of faith, that prediction echoes in the back of my mind. Will I decide to take the plunge at age 111 and that'll be it for me? Will the robot that wakes up in my place simply be a hollow shell without me there to pilot it? If I take the plunge before that and it somehow works and I become a robot, will something happen to me at age 111? An EMP? Some glitch? Do I risk trying to swim and short-circuit due to faulty technology? Do I try a transfer and my ghost doesn't connect for one reason or another?

These are things I can't know short of going through the experience; and while I don't fear waking up in the Summerlands or reincarnation, you're right that the lingering doubt of nonexistence gives me great pause.

Should it not?

I mean, if not, then why bother wasting your time talking about politics and morality, trying to improve life on Earth, if death and suffering don't matter in the cosmic scheme of things? Clearly they matter on some level, as does life, or else why not just stay incorporeal beings? We're incarnated for a reason, and we tamper with that at our peril.

"Another problem is a lot of people would resist the concept of having immortality."

They'd resist having it in others, not so much themselves, I think. And if they don't want it for themselves, well ...

I mean, if that's what they really wanna do, who am I to stop them?

"They would see it as abnormal. They would see it as blasphemous, potentially. They would probably rise up and kill anyone trying to develop such technology. If they were serious about it, they had the resources, it would seem to them to be a hallmark of dystopia that I don't think the average person would like."

Yep, and those people we generally call luddites. Previous generations have always grappled with advancing technology and social progress, so I don't expect this to be any different. Again, it comes down to pacing and rational ethics.

If the technology took hold practically overnight, you'd more or less get the situation in Transcendence, wherein the average person doesn't have time to acclimate and they reject the change out of fear and cognitive dissonance. I think that's very likely, in fact, which is part of the reason for starting this blog, as I predict a great deal of strife and suffering in the short-term future.

However, along the genetic path, I think technology is advancing more slowly. So going back to your telomerase line of reasoning, you'd get more of a genetics creep of at least a generation between major advances, just because of how genetics work.

If you were to supplement that with, say, realtime nanomachine genetic recombination to get regenerative effects like Mystique, Wolverine, or Deadpool, then you'd see a much sharper reaction. Of course, pop culture has also helped to condition us to accept these trends, which again is another reason for doing what I do. The mere fact that I can make references like the ones in the preceding paragraphs and distill what used to be fringe topics down to something ordinary and relatable, I think, means we won't see as much pushback as you might imagine, Tarl.

Of course, again, I wish to reiterate that technology ought to be tethered with ethics, so we need to make sure our philosophical discussions keep pace. I guess maybe have everyone binge watch Star Trek as a good start.

"The other problem is that the people developing such things might not be so even-handed."

Something along the lines of the plot of Call of Duty: Advanced Warfare, perhaps? Of course, the irony is that, much like in that game, or in House of Cards, what the public sees will look largely the same whether the Kevin Spaceys of the world are playing honest politicians or not. The whole idea of a shadowy ruling elite is that the people don't know what you're doing, so you try to maintain at least the appearance that everything is above board.

So unless they're actively oppressing people with this technology, I don't think most people will notice. And if they aren't feeling oppressed enough to notice, do we necessarily care? Ignorance is bliss.

They might care in the sense of feeling deprived, I suppose, which would be a valid concern.

However, it's far more likely that the machines themselves will be the ones oppressing us, rather than human controllers, once we pass the singularity. If they ever get to the point of being so advanced that we put them in charge of running ethics, they should hopefully be advanced enough by then to debate the merits and pitfalls of Universally Preferable Behavior with us. And if they can get to that level, fuck yeah, sign me up, as it sounds a lot better than the corrupt system we're living under right now. At least there'd be fewer dead bodies.

Speaking of which ...

"You would still get crimes of that nature. Including murder, which causes death."

Unless we figure out some combination of genetics and cognitive-behavioral programming, coupled with the use of technology to satisfy people's needs, such that we remove the motivation to commit those crimes in the first place. This could be by teaching them ethics and negotiation from an early age, stifling the so-called warrior gene, curing psychopathy and sociopathy, or giving people the resources, relationships, and intellectual stimulation they need to feel satisfied.

Why bite the hand that feeds you if that hand is attached to a genetically-bred / robotic philosopher king?

We can look at human history and see that as technology has improved, as longevity has increased, and as standards of living have risen, criminality has gone down.

"That's fine, but that's not immortality."

I'm not sure why you seem fixated on the technical distinction here. Can the term not be used colloquially as a heuristic relative to our present experience? Tolkien's elves were considered immortal, but they eventually went away to the Undying Lands. The Masters from Tripods were considered immortal, but they eventually died of old age. Far more common in mythology is that immortality doesn't literally mean you're alive until the end of time, but that you're practically ageless compared to a human.

I suppose if you just wanna be academic, then fine; but again, by the same token, you keep conflating immortality with invulnerability, so ...

"In fact, you ensure that you die a fairly unpleasant death as opposed to otherwise maybe having the chance of just your heart stops when you're asleep, you don't even notice, really. You just slide into the abyss."

Weren't you just saying it's not technically immortality if you merely live for centuries? So you'd still have that option. The only time you wouldn't have that option is if you were literally living forever, but then why would you be concerned about the manner in which you die at that point, since it's never going to happen?

Even if I accept your premise that only violent deaths would occur, that's still not an argument against transhumanism, because what is the alternative? Ok, so you've abandoned transhumanism, now you get to live a much shorter life while still being at risk of suffering a violent death under the current paradigm. That sounds like a really shitty deal.

"And then, if the spiritual component of human reality is true, you've probably screwed yourself out of something better according to most spiritual systems, so what's the point."

Yeah, maybe, but we can't know what that is. We're just playing the odds. It's just as likely you stave off something worse.

We can't employ Pascal's Wager here for the simple fact that there could be a countless myriad of afterlives, some better, some worse, with the odds of a roulette wheel - and the best bet when playing roulette is to simply not play at all. But we can know what living is like and work to try and make it better. That's the whole point of incarnating: to live and have experiences and evolve spiritually.

Again, an afterlife doesn't negate transhumanism if it's just another form of transhumanism.

I'm gonna skip the reincarnation stuff, since we basically agree on that, and just jump ahead to address your point about karmic reincarnation. Whilst I agree that most people's claims of having been someone awesome are probably confirmation bias, it might interest you to note my own experience was somewhat different.

I'll not go into great detail here, but suffice to say, in the oldest past life I was able to remember, I was indeed a royal, but it was far from glamorous. In fact, it was about as awful a life as you could have while still being within the upper class. Being sold into an arranged marriage, raped on a nightly basis by your husband, sending your unwanted rape spawn down river in a basket like fucking Moses only to see them grow up and return to wage war against your village. Seeing your other children tortured and nailed to trees as the result of brutal tribal warfare. Pretty horrific stuff. Not glamorous by any means and not something I or anyone would necessarily fantasize about.

You can make of that what you will. It's not exactly something I can falsify, certainly, but given your own exploits in the occult, you might be more inclined to give me the benefit of the doubt; or at least find it an interesting data point in contrast to your overall hypothesis.

"Unless you cure dementia first, you're screwed."

Again, this only considers the argument from a genetics view, not a cybernetic view, wherein it's far less likely to see a breakdown of "neural connections" because the data can be infinitely replicated in cyberspace. But even within the genetic purview, your argument falls apart if we can, in fact, develop a means of staving off dementia, which, as I understand it, is mostly governed by diet and the elimination of toxins from the body.

In theory, that doesn't seem insurmountable. Just tedious.

Not wanting to repeat myself, but your other arguments in this same chain are not arguments against transhumanism because, again, what's the alternative? We have all those risks right now. Arguing that the end goal is technically unachievable in its most literal sense does not mean the overall directional goal lacks immense value, or that it isn't actually preferable to our current state of being.

"Part of that existence is misery."

Well, yeah. Life is full of misery. People generally get over it. They learn to move on from tragedy and have new meaningful experiences that balance out the bad and make life worth living. My family could get attacked by a bear in this mortal coil as well; should I just commit seppuku and give up rather than live another several decades?

That just sounds retarded.

I mean, if people wanna commit suicide because they feel their life isn't worth living, that's their right; but I would try to talk them out of it. Having gone through many dark and depressing experiences myself, I thought about taking my own life, but I'm still here and it's hard to imagine anything being so terrible I'd ever actually go through with that, based on what I've already survived.

Doesn't mean it's inconceivable, but then, even as a transhumanist, there are ways to end it, which we talked about already.

But the idea that I'm supposed to be afraid of my family dying to a bear and living incurably miserable for eternity but not be afraid of my own mortality and possible non-existence just strikes me as a skewed sense of priorities. Or to your point, Tarl, if I should be accepting of the eternal continuum of consciousness as an alternative to fear, then why would I be miserable about my family dying to a bear if I know they'll either be in paradise or reincarnate? I can live out eternity in peace knowing they were just experiencing a phase of being and are never truly gone for good.

Alternatively, if seeking to maximize longevity is so undesirable, why do I care about my family prematurely dying at all? I mean, hey, we all gotta go sometime, right?

Get busy living or get busy dying.

Similarly, if I shouldn't fear death, why should I fear violence done to me? I'll just wake up in another body elsewhere and not remember, since you don't believe in karma or regressive memory. Life's a complete roguelike simulation without consequences.

Again, this idea that people can't overcome past trauma within their lifetime is patently absurd, since people do it all the time through various forms of therapy, meditation, self-reflection, or other things not even approaching the level of technological memory wipes. Focusing on new relationships helps heal the wounds of previous ones. Just being present in the moment and focusing your consciousness on the here and now allows the past and future to fade away, so that people feel a state of joy and peace.

So no, I reject this notion that, as a transhumanist pursuing longevity, I'd somehow be condemning myself to a life of infinite misery. Feelings come and go.

"What would you say as well about individuals who wish to voluntarily not be transhumanist?"

It's a free country. Do what you want, just don't hurt anyone.

"What if you really really miss that person?"

The ethics of this aren't really all that different from other forms of medicine. I can persuade someone to adopt it, but in the end it's their choice. Maybe we have a due process whereby the person is declared non compos mentis and some form of guardian ad litem is appointed to make the decision on their behalf; but apart from that, I'd probably feel about the same as I would if my loved ones were heroin addicts or just generally had chronic thoughts of suicide.

"Or you can try editing your memory so that you don't remember that they ever existed maybe. That gets into a totally different realm of dystopian sort of fantasy that I don't think any transhumanist wants to have a serious conversation about."

Not really. We already have stuff like that. It's called grief-counseling.

Or the poor man's version: alcohol.

If anything, between the two of us, you seem to be the one arguing more in favor of a Huxleyan "death acceptance" system of conditioning. That we should embrace death with glad tidings and be happy about it, rather than try to stave it off as long as possible.

Granted, I'm not even against that in principle. I say change what you can, accept what you can't. Indeed, we should do both, but lean more heavily toward fixing the problem.

"So overall, I would say yeah, go ahead be a transhumanist. Don't expect it to actually, ya know, happen in your lifetime because it won't. You will die."

Yeah, probably. I'm not expecting I'll live forever. I might well be among the last generation to fall short of the dream - the biggest loser, as it were. That doesn't mean I can't still do affirmations towards that end with the result being I at least get to live a great deal longer, or even a little longer than I otherwise would have compared to what we currently have.

There's no point in staying still or going backwards, so we may as well keep progressing forward until we eventually wear out and give up the ghost.

"In fact, at some point, you probably enter a reality where you're already a deity, in which case it's no problem. You just get reborn as a god."

Yeah, but there are two problems with that. Firstly, what are the odds that your very next life will be that life versus the shit-hole peasant life you derided before?

Secondly, even if you are that lucky - which, in your paradigm, you'd have to be, since you don't believe in karma and so can't buy your way into heaven with good deeds - remember that you said you don't believe in regressive memory. So it won't be nearly as enjoyable without a preceding lifetime of struggle juxtaposed against it, making the pleasure and power seem all the more wonderful by comparison.

It's essentially why a benevolent God would create or allow evil, because a video game is no fun if you're just endlessly winning without challenge or risk or conflict.

Sort of a "The Ones Who Walk Away from Omelas" scenario, wherein the people's joy is only able to reach such heights because it's contrasted against the absolute suffering of the child in the basement. By striving for godhood in this life with full memory of the journey, I can come to better appreciate it versus if I woke up one day and that's all I ever knew.

I already touched on the reincarnation stuff, giving my own experience as a counterexample to your argument. I'll simply add two more points about the shitty past lives.

Firstly, if they were that mundane and shitty, would you necessarily want to remember them? Do you remember all the boring, mundane shit from your current life? No, you probably remember the extreme highs and the extreme lows, unless you have an eidetic memory or something. Chances are, you only remember the things that really stuck out or were quite enjoyable or painful the way most people do.

Secondly, if shitty past lives are meant to teach you karmic lessons and the lesson is learned, there seems little reason why your spirit would choose to lay out a game plan that involves remembering those parts, as it'd just be a waste of time. There's not much value to be had there. In my case, remembering such a shitty life actually did have value and profound lessons to be learned, and that may well be why I remembered it. Like some random experience from childhood you forgot, but that just happens to pop into your mind one day as you're doing something tangentially related, and you have that aha moment or nostalgia or something.

Your own paradigm doesn't even necessarily contradict this, as, given an infinite number of lifetimes, there are bound to be some that are better than others. Likewise, it's wholly conceivable you could be reborn into a universe that does have karmic reincarnation, since anything possible, given infinity, becomes a virtual certainty.

Anyways, I appreciate the intellectual challenge. You gave me something to think about and hopefully I did as well for you. Keep making awesome content. I hope we get the chance to talk directly at some point.

Until then ...

"That's About All ... Peace Out."

Ya know, I initially hated your outro, but it's starting to grow on me. 38D
