9 Comments
Rai Sur

The substack has been helpful for me. Especially the post on Newcomblike problems. It's actually great to go around the world from that frame, trusting that if I become more like the person I want, even in private moments/conversations, it will leak out and provide sufficient correlational evidence to get me what I want. I'm seeing unprecedented signs of it working!

Kayla

I’d like to know your take on the role of rationality in dating. Surely there still is one? I don’t observe rats being noticeably more successful in dating, but it’s an open question in my mind whether that’s actually due to their choices or to simple gender-ratio problems.

Jacob Falkovich

I think the biggest contribution of rationality to dating is that it gives both the toolbox and the permission to separate what is socially approved of from what is actually good for you. This is a big theme of Second Person as well: that looking to be "normal" in dating by following general dateability advice, following the discourse, copying lists of mimetic desiderata, or letting hostile actors set rules for you is bad for your dating life. As long as you're not so "abnormal" as to scare away literally 100% of the hoes, you do better by finding your own desires, your own value in the dating market, and your own approach for making the two meet.

This is very much the lesson of what I would call the second wave of rationality, the one centered around Scott's great sociology posts of 2014-15, signaling theory, etc. Bayes and biases don't really figure into it. And the sort of person that rationalism selects for, someone who is good at decoupling and doesn't want to do anything else, doesn't really help here either.

John Wentworth

Overcompressed summary of this post: "Look, man, you are not bottlenecked on models of the world, you are bottlenecked on iteration count. You need to just Actually Do The Thing a lot more times; you will get far more mileage out of iterating more than out of modeling stuff.".

I definitely buy that claim for at least some people, but it seems quite false in general.

Like, sure, most problems can be solved by iterating infinitely many times. The point of world models is to not need so many damn iterations. And because we live in a very high-dimensional world, naive iteration will often not work at all without a decent world model; one will never try the right things without having some prior idea of where to look.

Example: Aella's series on how to be good in bed. I was solidly in the audience for that post: I'd previously spent plenty of iterations becoming better in bed, ended up with solid mechanics, but consistently delivering great orgasms does not translate to one's partner wanting much sex. Another decade of iteration would not have fixed that problem; I would not have tried the right things, my partner would not have given the right feedback (indeed, much of her feedback was in exactly the wrong direction). Aella pointed in the right vague direction, and exploring in that vague direction worked within only a few iterations. That's the value of models: they steer the search so that one needs fewer iterations.

That's the point of all the blog posts. That's where the value is, when blog posts are delivering value. And that's what's been frustratingly missing from this series so far. (Most of my value from this series has come from frustratedly noticing the ways in which it fails to deliver, and thereby better understanding what I wish it would deliver!) No, I don't expect to need, e.g., 0 iterations after reading, but I want it to at least decrease the number of iterations.

And in regards to "I don’t think you know what you want in dating"... the iteration problem still applies there! It is so much easier to figure out what I want, with far fewer iterations, when I have better background models of what people typically want. Yes, there's a necessary skill of not shoving yourself into a box someone else drew, but the box can still be extremely valuable as evidence of the vague direction in which your own wants might be located.

yossarian

I generally refer to that kind of problem as an "inner monkey" problem. The overall trouble with purely rational solutions is that while the intellectual part of one's mind can understand them, to the inner monkey part that governs the emotions and a lot of in-the-moment decision making it's still all strange numeric mumbo-jumbo. The advantage that spirituality-based help programs have is that they are generally written in terms of relationships (with self, others, and the world in general), and that's much closer to the native language of the inner ape. So, for solving life problems, it's often better to have a merely good model that one can apply and follow wholeheartedly than to have a very good model that one would have to consciously struggle to follow.

XBTUSD

I think this is your best post. It seems like you've discovered what this blog really is about. You keep hammering the same theme and I think this post gets to the heart of it. There are no +-EV moves, there are no priors, there is only you and what actions you take, what worldview you allow to guide you into action or inaction.

XBTUSD

I should actually say you've been slowly discovering what this blog is really about, i.e., your overarching thesis/worldview.

anh
Apr 7 (edited)

Thank you for an interesting analysis of irrational decisions made by rationalists. Imho, some limitations stem from people who see themselves as "decouplers", i.e., mostly male, with a Bayesian mindset, a math-oriented education, rule-based reasoning, and a tendency to hold utilitarian views. The problems with these traits might be:

- Bayesian approach without prior odds: When Bob meets Alice for the first 10 minutes, he has no prior knowledge of her. Bayesian models don't work well here, because intuitive priors have to be used instead.

- Dating women: Since most rationalists are male, they tend to date women in a non-homosexual context. Women are often more emotional and have less consistent moods. Does rule-based reasoning work with them, or is a more complex, real-time adaptive system needed?

- Utilitarianism in dating: The rationalist might initially struggle to define what he wants in dating, making it impossible to calculate the benefits of any given scenario. How can he assess which scenario will yield the greatest benefit without knowing the ultimate goal?

P.S.: Outside of dating, there's a contradiction in how rationalists view themselves as the "cognitive decoupling elite." They assume this self-perception is true, setting the probability of their assumption to 1. Who ensures they can decouple without bias?
