---
title: Coup de Poing
date: 2012-09-12
techne: :done
episteme: :log
---

Did the first half of my [epistemic tag][Epistemic State] update. I wanted to add more fine-grained distinctions, but I had a hard time figuring out how to rank a metaphysical belief. The main problem with accurate belief assignment is that it depends on a good prior. But even getting a well-defined prior (i.e. one that isn't zero on the best answer, which you typically get by making it non-zero everywhere) requires you to know what the hypothesis space even *is*.

Say you've only heard of consequentialism and nihilism, and nothing else. You hear the arguments and assign consequentialism a 60% probability and nihilism 40%. Now Kant comes along and blows your mind with deontology, but unfortunately you've already used up all the probability mass. Dammit, so you switch over to likelihood ratios. Consequentialism is at least as likely as deontology, you think, and each is twice as likely as nihilism, so it's 2:2:1. Fine, if something better comes along, you can easily add it. But now how can you express how certain you are of consequentialism? You'd have to say "I give consequentialism 2:3 odds, compared to everything else I've heard", but for that to be meaningful, you'd *also* have to tell me *what else* you've heard. Ain't nobody got time for that!
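
For concreteness, here's that arithmetic as a quick sketch (a toy illustration, using only the numbers above):

```ruby
# Likelihood ratios over the hypotheses heard so far: 2:2:1.
ratios = { consequentialism: 2, deontology: 2, nihilism: 1 }
total  = ratios.values.inject(:+) # => 5

ratios.each do |view, weight|
  probability = weight.to_f / total           # relative to everything heard so far
  odds        = "#{weight}:#{total - weight}" # this view against the rest
  puts "#{view}: P = #{probability}, odds = #{odds}"
end
# consequentialism: P = 0.4, odds = 2:3
# deontology: P = 0.4, odds = 2:3
# nihilism: P = 0.2, odds = 1:4
```

Which is the point: the numbers only mean anything relative to the keys of that hash.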

Compression sucks. But wait a minute - I'm a [Yangming][Wang Yangming]-ite; I believe in the unity of knowledge and action. Correct beliefs must result in virtue[^virtue] and vice versa. So why not just express *that*?

[^virtue]:
    <% skip do %>
    Yangming only made the argument for moral beliefs and actions, but I think it should be extended to the general case; then again, I also think morality should be extended to swallow everything. (Eventually.) I suspect (but I'm not sure) that Yangming would agree with that.
    <% end %>

So now, for certain things I believe to some degree, instead of saying how much I believe them, I'm saying what kind of *duel* I would accept over them. This separates hipster beliefs from Serious Business. If you challenge me and you win, I concede the belief and accept your position. If I win, I expect you to do the same.

For an important but still highly speculative belief, I accept fitness challenges, like who can do the most push-ups in one week. Sure, I may easily lose those, but they sound like fun and I win either way (by becoming more awesome).

For something more serious, I'd fight you in Quake 3 [like a man][Notch Q3]. I require up to 3 months of preparation for those, though.

For the true hardcore stuff, I'd accept a fight to the death. Up to 1 year of preparation; we decide on a time, place and weapon, and at the end of the day, at most one of us is still standing. That person is [deemed right][trial by combat].
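
Summed up as a sketch (the tier names are mine, nothing official):

```ruby
# Duel tiers: how seriously I hold a belief, expressed as stakes.
DUEL_TIERS = {
  speculative: { duel: "fitness challenge (e.g. most push-ups in a week)",
                 preparation: "none specified" },
  serious:     { duel: "Quake 3 deathmatch",
                 preparation: "up to 3 months" },
  hardcore:    { duel: "fight to the death",
                 preparation: "up to 1 year" },
}
```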

Yes, I'm completely serious. No, there is no hardcore belief yet. (In public anyway.) More specific rules are outlined on the [Epistemic State][] page. I've yet to re-tag old pages. (That's the second half.)

(HT to Will for the idea.)

---

May I comment on my work? AAAAH. That pretty much sums up this week.