mirror of
https://github.com/fmap/muflax65ngodyewp.onion
synced 2024-07-05 11:20:42 +02:00
---
title: Information Wants to Pwn You
date: 2011-09-05
techne: :wip
episteme: :broken
---

Hacker Culture
==============

> Information wants to be free.
>
> -- a hacker motto

At first, I believed this statement solely on political grounds. When I grew up, everyone who wanted to control information was evil - the record industry, old politicians, you know, those kinds of people. Sharing information was an act of rebellion, no matter what the information actually *was*. People didn't want you to have free access, so you simply created it, regardless of content, be it the Anarchist's Cookbook, warez or pr0n.

I grew up during the early Windoze years. One day, I accidentally opened an .exe file in a text editor and saw a lot of gibberish. I was amazed how someone could even *produce* this noise, let alone make it *work*. Later, I learned to program (and what machine code and compilers are) and adopted the culture of programmers, specifically open source ones.

It was obvious to me that information should be shared. Open your source code and others can learn from it, find bugs for you and even implement new features. Everybody wins. The only people wanting to hide their code were those more interested in making money. (Which was considered suspect in the communitarian culture I grew up in.) Worse, they were essentially only making money from *ignorance*. If everyone knew their code, or how to produce it themselves, then they wouldn't actually provide any worthwhile service at all.

This all convinced me that the motto was right, information really ought to be free. Up until now[^wikileaks], that is.

Bad News
========

The idea of psychological hijacking, in the form of indoctrination, for example, was always vaguely known to me, but I always thought that it was both a) hard to do and b) something that affects only *other* people, certainly not me. Weak-minded idiots become cult members and suicide bombers[^suicide]; I'm far too intelligent for that.

[^suicide]:
    I see now how wrong I was about fanatics after having read the latest research into suicide bombers. In fact, I can see that I am *exactly* the kind of person who, under the right environmental factors, becomes just that. As a defense mechanism, I get very nervous whenever a belief I hold creates any strong emotions or radical disagreement with the culture it originated in.

I became more aware of the problem when I fell into the trap of a particularly nasty conspiracy theory[^conspiracy]. When I crawled my way out of it, I only concluded that I must become *smarter* and more *rational*. I thought of the problem in terms of psychology (being attracted by certain crowds and adopting their beliefs) and faulty reasoning (learn about fallacies and biases and you are safe). This changed when I learned about memetics and was provided with a (basic) mechanism of how this actually happens.

A meme is a "unit of cultural transmission", the idea-equivalent of a gene, like an earworm. As memes are themselves replicators, they follow all the laws of evolution. I first applied those ideas by thinking about the implications of considering [music][Letting Go of Music] as a replicator. I wasn't quite sure what to make of my conclusions, but I didn't seriously deal with them (beyond downsizing my music library from 200GB to about 30GB) until now. (I also should revisit the article and fix several blatant flaws.)

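The replicator framing can be made concrete with a toy simulation. This is purely my own sketch, not anything from the memetics literature; the meme names and the transmission numbers are made up for illustration. Two memes spread through a population by contact, and the catchier one takes over regardless of how useful it is - selection acts on replication alone.

```python
import random

random.seed(42)

# Hypothetical memes, differing only in transmission probability per
# contact. Usefulness plays no role in the dynamics at all.
MEMES = {"useful_but_dull": 0.01, "useless_earworm": 0.30}

def spread(population_size=500, contacts_per_round=5, rounds=15):
    # Each mind holds a set of memes; seed one carrier per meme.
    minds = [set() for _ in range(population_size)]
    for i, meme in enumerate(MEMES):
        minds[i].add(meme)
    for _ in range(rounds):
        for person in range(population_size):
            for _ in range(contacts_per_round):
                # Talk to a random other person; each meme they carry
                # has a chance to jump over, weighted by its catchiness.
                other = random.randrange(population_size)
                for meme in list(minds[other]):
                    if random.random() < MEMES[meme]:
                        minds[person].add(meme)
    # Count how many minds each meme has colonized.
    return {meme: sum(meme in m for m in minds) for meme in MEMES}

counts = spread()
```

Run it and the earworm has saturated the whole population after a handful of rounds, while the dull meme still lingers with its original few carriers - the same asymmetry the essay worries about, in twenty lines.
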
It really clicked upon encountering the concept of the [Langford Basilisk]. Let this neat picture explain it:

<%= image("parrot.jpg", "The Parrot") %>

A Langford Basilisk is a genuinely dangerous idea. In its original form, it works by making the brain think an impossible thought - essentially setting off a logic bomb. I don't believe that the human brain is actually susceptible to this kind of attack, but a poorly designed AI might be. Regardless, there are other forms of Basilisks, some of which I actually know to work (under certain conditions).

Consequences
============

Ok, maybe ideas *are* dangerous, not just in the "this exposes my own flaws or crimes and helps my opponents" kinda sense, but in the "computer virus" sense. Still, what should we do about that? To be honest, I'm not quite sure. But I can at least provide some examples and how I plan to handle them in the future.

The most common example I have seen of a memetic hazard being treated as such is the TV Tropes wiki (intentionally not linked). It's a black hole for any culture whore (like myself) that sucks up your free time without any end in sight. I easily lost *weeks* of my life in there. Many tropers follow up links to it with a warning. I am slightly immune to it now, but only because I know most of it by heart. That's like becoming an atheist by going to seminary[^seminary]. Not really practical. I had tried to limit my exposure through time limits, but it didn't really help. So I needed a systematic approach.

So let's draft a little catalogue of memetic hazards.

<%= image("memetically_active.jpg", "Memetically Active") %>

Structural Hijacking
--------------------

Things that are dangerous because of their structure. The most common example is anything that resembles a Skinner box. Most notorious are Twitter, MMOs and email.

Emotional Hijacking
-------------------

Things that hide themselves by taking over your emotional system. Many drugs, particularly heroin, come to mind as non-meme examples. But what would their equivalent look like as an idea? Something that controls your emotions directly to serve its own purpose (or that of its creator)?

What about music? When I revisited some old music I hadn't listened to for a few years, it became obvious to me. It puts me in a specific emotional state and tries to keep me there for as long as it can, not unlike an addiction. The emotional control itself wasn't the immediate problem (if I have a song that makes me wide awake, motivated and happy, why not listen to it?), but rather that it would force emotions on me I *didn't* actually want. Some songs would make me angry or sad and there was little I could actually do against it! Very, very evil.

Our brains have no natural distinction between "I believe this" and "I observe this". *Everything* that happens is at first taken at face value, taken to be true. If there is sadness, then *I* must be sad and must have a reason to be sad. That I am just reacting to a superstimulus goes undetected. The same effect, of course, is dramatic when it comes to our beliefs. Plenty of experiments have demonstrated that merely *stating* an opinion, even explicitly just to repeat something someone else said, will cause our own opinion to shift in that direction unless proper measures are taken. If I merely get you to think about a proposition and you don't think it through yourself, you are very likely to become a little bit more convinced of it and to identify with it.

The important conclusion to be drawn is that there is no such thing as neutral observation. You can't perform emotionally powerful acts without them controlling your mind. The Buddhists have warned us about this for centuries: if you lie, you will harm *yourself* in the process. You will start to believe your own lies, whether you want to or not.

The way to handle this is by a) being as honest as you possibly can (so you never state or do something you wouldn't want to be a part of you) and b) putting off [proposing any solution] to a problem until you have understood it. The moment you start defending or attacking a solution, you likely become stuck, and changing your mind later is quite difficult.

But you can also use this to your advantage! The Tibetans in particular have been teaching that loving-kindness and a general good mood are not magical things that just happen, but skills to be learned. At first you just pretend to feel like the kind of person you'd like to be, and through some regular practice you actually start feeling like that automatically. Very cool and powerful. Just sitting down and forcing myself to be calm and smile for 15 minutes has helped me greatly through phases of depression.

I also apply this to the recreational media I watch. I now only watch TV shows or movies that have characters in them I want to identify with - protagonists that are actual role models. I don't do this for moralistic reasons (you should be a nice person!), but purely pragmatic ones (I enjoy being nice, so I won't watch shows with asshole protagonists, as I will become more like them whether I want to or not, regardless of how much I enjoy the show).

Remember that there is no such thing as a "real" and a "fake" emotion. Emotions are (biochemical) brain states, like a tag, and can be changed at will. They are not "layered" or even aware of any content at all. You don't like your current state? Hack it! It's like changing your wallpaper - there's no "true" wallpaper underneath and you can't just "try on" another one. There is only one, right now, and whatever you choose, that's it. So make it a pretty one.

[proposing any solution]: http://lesswrong.com/lw/ka/hold_off_on_proposing_solutions/

Intellectual Hijacking
----------------------

Knowing just enough to be dangerous.

As a general rule, treat information exchange like sex. It might be fun, but that's a side-effect that has only been built into you so you would actually do it a lot. The purpose really is reproduction, so make sure to be safe. Watch your partners and don't engage in just any practice.

[^wikileaks]: At the time of writing (December 2010), Wikileaks is all over the news. It's great to finally see someone pull a Hagbard Celine, but even greater to be made aware by the fallout of how afraid of chaos I had become. I was seriously worried that this could cause some of the major political players to become even more paranoid, putting many (semi-)stable arrangements at risk of collapse. I was particularly worried about what it would do to fuel the increasing [neo-fascism] of the US. Luckily, my Discordian training eventually kicked in and I remembered that what I was seeing was not a threat to order, but rather an exposure of the inherent chaos.

[^conspiracy]: I'm unwilling to publicly state the conspiracy theory I believed, but if you send me an [email](/about.html) and ask me in private, I'll discuss it.

[^seminary]: Amusingly, this seminary effect actually happens. I used to study religions (in a historical context) and met someone who studied theology. He told me that about half the students each year would start out as Christians and be atheists by the end, once they learned how the bible actually came to be and stuff like that. Information kills religions dead.

[neo-fascism]: http://zompist.com/fascism.html

[Langford Basilisk]: http://www.ansible.co.uk/writing/c-b-faq.html