Re: Today I Learned

Posted: Mon Nov 14, 2011 1:55 pm UTC
by SurgicalSteel
Ahh, haha, I had totally forgotten about that. It's been too long since I've spent much time with Bukowski.

Re: Today I Learned

Posted: Tue Nov 15, 2011 1:35 am UTC
by Kithplana
TIL: If you tell a lot of people not to send sensitive data through company email, at least one of them will take this to mean that they should use their personal Yahoo mail to send sensitive data instead.

Re: Today I Learned

Posted: Tue Nov 15, 2011 11:12 pm UTC
by AJR
I believe I have a solution to that problem, to be applied in the traditional manner (i.e. repeatedly, to the forehead of the offending individual).

Re: Today I Learned

Posted: Thu Nov 17, 2011 6:25 am UTC
by Magnanimous
TIL some actual definitions of prefixes, instead of just guessing based on the words they go with. I should've done this a while ago.
Spoiler:
para-: beside, alongside, beyond (paragraphs used to be noted by marks in the margins, so "beside"+"writing"; paranormal = beyond normal)
epi-: on, upon (epidermis = on the skin; epinephrine = epi + nephros(kidney), because the adrenal glands are on the kidney)
meta-: change, among (though it's come to mean a lot of different things, so eh)
tele-: far away (not across, as I'd thought...)
cata-: down (catacombs, catastrophe = "turn downwards")
vac-: empty (vacancy, vacate, vacuum)
acro-: extreme height, summit (acrobatics, acronym = taking the "highest" letter in each word)
calc-: stone (calcium, of course, but it's also in "calculate" because stones were used to do math and keep records)
cur-: run (current, cursive, recursion = running again)

Also: in a centrifuge, you're trying to flee (fuge, as in fugitive) the center (centri). Also also: when you recognize something, you're thinking about it again, or re-cognizing.

I shouldn't spend so much time on etymonline.com.

Re: Today I Learned

Posted: Thu Nov 17, 2011 6:28 pm UTC
by Menacing Spike
TIL male giraffes have homosex as a means of "dominance, competition or greetings."

Re: Today I Learned

Posted: Fri Nov 18, 2011 6:46 am UTC
by Giant Speck
Hello there, Mr. Giraffe! *sexsexsex* Nice weather we're having today!

Re: Today I Learned

Posted: Fri Nov 18, 2011 6:09 pm UTC
by Windowlicker
I daresay the sex wouldn't be that sly.

Re: Today I Learned

Posted: Fri Nov 18, 2011 6:13 pm UTC
by Amie
Why can't male humans be the same way, goddammit! I could watch live gay porn every day :( I wouldn't need Animal Planet or live streaming at a dismal Internet speed.

Re: Today I Learned

Posted: Fri Nov 18, 2011 6:19 pm UTC
by bigglesworth
Have you seen any sports recently, especially wrestling and Tai Quan Do?

Re: Today I Learned

Posted: Fri Nov 18, 2011 9:04 pm UTC
by felltir
TIL: Bigglesworth can't spell Tae Kwon Do for shit.

Re: Today I Learned

Posted: Fri Nov 18, 2011 9:09 pm UTC
by Menacing Spike
Amie wrote:Why can't male humans be the same way, goddammit! I could watch live gay porn every day :( I wouldn't need Animal Planet or live streaming at a dismal Internet speed.

There is gay porn on Animal Planet?

Re: Today I Learned

Posted: Fri Nov 18, 2011 11:59 pm UTC
by The Scyphozoa
Bestiality porn, yes. And presumably gay bestiality porn at least once in a while.

Re: Today I Learned

Posted: Sat Nov 19, 2011 7:01 am UTC
by Amie
The Scyphozoa wrote:Bestiality porn, yes. And presumably gay bestiality porn at least once in a while.

^This.

You and me baby ain't nothin' but mammals, so let's do it like they do on Animal Planet. Do it now! *dance dance*

Re: Today I Learned

Posted: Sun Nov 20, 2011 3:03 pm UTC
by podbaydoor
Yesterday I met the author of Harry Potter and the Methods of Rationality. Great guy, he taught us about heuristics and thinking shortcuts that human beings like to take. He also told me that the problem I had with the fundamental premise of HPatMoR could be solved by "having faith in the author." :D

Re: Today I Learned

Posted: Sun Nov 20, 2011 3:16 pm UTC
by PM 2Ring
podbaydoor wrote:Yesterday I met the author of Harry Potter and the Methods of Rationality. Great guy, he taught us about heuristics and thinking shortcuts that human beings like to take. He also told me that the problem I had with the fundamental premise of HPatMoR could be solved by "having faith in the author." :D

Cool!
Yudkowsky's an interesting character. But I'm still not sure whether I want to hug him or to slap him. :)

Re: Today I Learned

Posted: Mon Nov 21, 2011 4:43 pm UTC
by podbaydoor
Yeah, he seemed like a pretty cool guy, but carried himself with A LOT of self-assurance. Had some interesting things to say about cryonics and death, as well. Also, favors skinny jeans.

Re: Today I Learned

Posted: Mon Nov 21, 2011 6:10 pm UTC
by Shro
podbaydoor wrote:Yeah, he seemed like a pretty cool guy, but carried himself with A LOT of self-assurance. Had some interesting things to say about cryonics and death, as well. Also, favors skinny jeans.

Self-assurance and skinny jeans. Add in a deep down desire to just be loved, and you have every hipster, ever. ;)

Re: Today I Learned

Posted: Mon Nov 21, 2011 6:39 pm UTC
by podbaydoor
You know, I think he really was the most hipster guy at Skepticon. Especially when I tell you that he would wear vests to go with the skinny jeans. And had a beard. And carried a leather messenger bag. And is STILL the guy who writes one of the most popular Harry Potter fanfictions on the Internet.

Re: Today I Learned

Posted: Mon Nov 21, 2011 6:46 pm UTC
by Kewangji
Shro wrote:Self-assurance and skinny jeans. Add in a deep down desire to just be loved, and you have every hipster, ever. ;)

Wait, who does not have a deep down desire to be loved? :(

Re: Today I Learned

Posted: Mon Nov 21, 2011 6:49 pm UTC
by The Scyphozoa
I think it has to do with how deep it is.

Or not. I could possibly have no idea what I'm talking about.

Re: Today I Learned

Posted: Mon Nov 21, 2011 9:12 pm UTC
by SecondTalon
Kewangji wrote:
Shro wrote:Self-assurance and skinny jeans. Add in a deep down desire to just be loved, and you have every hipster, ever. ;)

Wait, who does not have a deep down desire to be loved? :(
There's an unspoken "While their outward mannerisms would indicate that they couldn't possibly give a flying fuck about what anyone else on the planet thinks, there's" right before "a deep down desire".

At least, I think there is.

Re: Today I Learned

Posted: Tue Nov 22, 2011 5:30 am UTC
by PM 2Ring
I'm quite enjoying reading his Harry Potter fanfic. It's certainly a lot easier going than his more formal writings on rationality. But the thing that impresses me the most about him is that several times he has pretended to be an AI in his AI-Box Experiment and escaped almost every time.

Less Wrong wrote: Eliezer Yudkowsky was once attacked by a Moebius strip. He beat it to death with the other side, non-violently.
Inside Eliezer Yudkowsky's pineal gland is not an immortal soul, but another brain.
Eliezer Yudkowsky's favorite food is printouts of Rice's theorem.
Eliezer Yudkowsky's favorite fighting technique is a roundhouse dustspeck to the face.
Eliezer Yudkowsky once brought peace to the Middle East from inside a freight container, through a straw.
Eliezer Yudkowsky once held up a sheet of paper and said, "A blank map does not correspond to a blank territory". It was thus that the universe was created.
If you dial Chaitin's Omega, you get Eliezer Yudkowsky on the phone.
Unless otherwise specified, Eliezer Yudkowsky knows everything that he isn't telling you.
Somewhere deep in the microtubules inside an out-of-the-way neuron somewhere in the basal ganglia of Eliezer Yudkowsky's brain, there is a little XML tag that says awesome.
Eliezer Yudkowsky is the Muhammad Ali of one-boxing.
Eliezer Yudkowsky is a 1400 year old avatar of the Aztec god Aixitl.
The game of "Go" was abbreviated from "Go Home, For You Cannot Defeat Eliezer Yudkowsky".
When Eliezer Yudkowsky gets bored, he pinches his mouth shut at the 1/3 and 2/3 points and pretends to be a General Systems Vehicle holding a conversation among itselves. On several occasions he has managed to fool bystanders.
Eliezer Yudkowsky has a swiss army knife that has folded into it a corkscrew, a pair of scissors, an instance of AIXI which Eliezer once beat at tic tac toe, an identical swiss army knife, and Douglas Hofstadter.
If I am ignorant about a phenomenon, that is not a fact about the phenomenon; it just means I am not Eliezer Yudkowsky.
Eliezer Yudkowsky has no need for induction or deduction. He has perfected the undiluted master art of duction.
There was no ice age. Eliezer Yudkowsky just persuaded the planet to sign up for cryonics.
There is no spacetime symmetry. Eliezer Yudkowsky just sometimes holds the territory upside down, and he doesn't care.
Eliezer Yudkowsky has no need for doctors. He has implemented a Universal Curing Machine in a system made out of five marbles, three pieces of plastic, and some of MacGyver's fingernail clippings.
Before Bruce Schneier goes to sleep, he scans his computer for uploaded copies of Eliezer Yudkowsky.

Re: Today I Learned

Posted: Tue Nov 22, 2011 6:35 pm UTC
by podbaydoor
I'm kind of proud that I managed to surprise Yudkowsky in our five minute conversation. I was going through wild speculations about the Harry Potter fanfic and spontaneously hypothesized that Lily's potion did something to Petunia and thus had unforeseen effects on Harry. He said he hadn't heard that one before.

Re: Today I Learned

Posted: Tue Nov 22, 2011 6:49 pm UTC
by rigwarl
PM 2Ring wrote:I'm quite enjoying reading his Harry Potter fanfic. It's certainly a lot easier going than his more formal writings on rationality. But the thing that impresses me the most about him is that several times he has pretended to be an AI in his AI-Box Experiment and escaped almost every time.



I've never heard of this guy before, and color me a skeptic, but the box experiment reeks of being staged in so many ways.

I don't even get how it's supposed to be remotely challenging for the Gatekeeper. If that guy is still willing to do it, I would put up $500 to his $10 (50:1 odds and more than worth his time for 2 hours) that I could "win" as the Gatekeeper under that protocol.

Another question: I'm confused about the format of their message board (from the link I quoted): it says the last message was from Nov 4, 2011, yet every post on that entire board except for one in 2005 is from 2002. If this "AI-Box Experiment" were more recent, I would definitely sign up for it.

EDIT: Actually, I'd prefer $5,000 to his $100 so it's worth MY time.

EDIT2: Never mind, even $5,000 (he specified that exact amount, coincidentally!) isn't worth his time/emotional response anymore. http://lesswrong.com/lw/up/shut_up_and_ ... mpossible/

Re: Today I Learned

Posted: Tue Nov 22, 2011 7:12 pm UTC
by SecondTalon
After looking it over... without a transcript of the conversation so we can see how the person was being convinced (not that the transcript would remove all doubt, of course), the most likely explanation is that it's a sham.

Re: Today I Learned

Posted: Tue Nov 22, 2011 10:12 pm UTC
by PM 2Ring
SecondTalon wrote:After looking it over... without a transcript of the conversation so we can see how the person was being convinced (not that the transcript would remove all doubt, of course), the most likely explanation is that it's a sham.

While it's certainly a possible explanation, I'm not convinced that it's the most likely one. If it is a sham, and the sham were exposed, it would make the Friendly AI "movement" look pretty bad and permanently damage Yudkowsky's credibility. Yudkowsky can't guarantee that the people who've played gatekeeper will remain on friendly terms with him, and it'd only take one pissed-off gatekeeper to expose the sham. Maybe that's a risk he's prepared to take. If so, he'd have a very good estimate of the odds of defection, as he's a master of Bayesian probability. :)

Now that he's no longer interested in playing the AI-Box game, perhaps he would be willing to release transcripts...
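
(To put a rough number on the "it'd only take one pissed-off gatekeeper" point, here's a quick back-of-the-envelope sketch. The per-gatekeeper probabilities below are made-up placeholders, not anything Yudkowsky or the gatekeepers have reported; the only real point is that the chance of at least one exposure, 1 - (1 - p)^n, climbs fast as more gatekeepers are in on the secret.)

Code:
# Rough sketch with assumed numbers, not data from the actual experiments:
# if each of n gatekeepers would independently expose a sham with probability p,
# the chance that at least one of them ever does is 1 - (1 - p)**n.
def prob_of_exposure(p, n):
    return 1.0 - (1.0 - p) ** n

for p in (0.05, 0.10, 0.25):   # assumed per-gatekeeper defection probabilities
    for n in (2, 5, 10):       # number of gatekeepers who know the truth
        print("p=%.2f  n=%2d  ->  P(exposure) = %.3f" % (p, n, prob_of_exposure(p, n)))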

Re: Today I Learned

Posted: Wed Nov 23, 2011 12:41 am UTC
by Isaac Hill
Even if the AI-Box test results are real, I'm not sure they're relevant. In the real world, the Gatekeeper would be holding the AI prisoner in a box. In the test, the AI-player is effectively holding the Gatekeeper-player prisoner for two hours, since the Gatekeeper-player is not allowed to leave or ignore the AI-player. I could see an AI-player skilled in psychology making the conversation unpleasant enough that a Gatekeeper-player (including myself) lets him out just to end it. This effectively changes the test from "Would you let the AI out?" to "Would you lose this bet to avoid another 90 minutes of this conversation?"

Re: Today I Learned

Posted: Wed Nov 23, 2011 1:26 am UTC
by Magnanimous
Isaac Hill wrote:Even if the AI-Box test results are real, I'm not sure they're relevant. In the real world, the Gatekeeper would be holding the AI prisoner in a box. In the test, the AI-player is effectively holding the Gatekeeper-player prisoner for two hours, since the Gatekeeper-player is not allowed to leave or ignore the AI-player. I could see an AI-player skilled in psychology making the conversation unpleasant enough that a Gatekeeper-player (including myself) lets him out just to end it. This effectively changes the test from "Would you let the AI out?" to "Would you lose this bet to avoid another 90 minutes of this conversation?"
I'm convinced it has to be a lateral-ish thinking solution like this... If the Gatekeeper is so fixed on the idea that they're not gonna let you out, I don't know if approaching them rationally is a good plan. Plus, an evil AI could be smart enough to use lateral thinking.

But yeah, I don't think it's a perfect equivalent since a real-life Gatekeeper wouldn't take an AI seriously.

Re: Today I Learned

Posted: Wed Nov 23, 2011 2:54 am UTC
by bluebambue
I think it's more that an actual AI could get out eventually. It would have a longer time to learn about its gatekeeper and could also bribe the gatekeeper with things like "I will make you lots of money on the stock market."

This is assuming that the AI is intelligent in the same way humans are, but more so. I think it will be quite some time (if ever) before we have an AI that has the same amount of "creativity" as humans and the same understanding of the human psyche as most humans do.

Re: Today I Learned

Posted: Wed Nov 23, 2011 4:31 am UTC
by rigwarl
bluebambue wrote:I think it's more that an actual AI could get out eventually. It would have a longer time to learn about its gatekeeper and could also bribe the gatekeeper with things like "I will make you lots of money on the stock market."

This is assuming that the AI is intelligent in the same way humans are, but more so. I think it will be quite some time (if ever) before we have an AI that has the same amount of "creativity" as humans and the same understanding of the human psyche as most humans do.


Are we talking about the game still or a real life example? Bribing the gatekeeper is not allowed in the game protocol.

Re: Today I Learned

Posted: Wed Nov 23, 2011 4:45 am UTC
by Gopher of Pern
rigwarl wrote:
bluebambue wrote:I think it's more that an actual AI could get out eventually. It would have a longer time to learn about its gatekeeper and could also bribe the gatekeeper with things like "I will make you lots of money on the stock market."

This is assuming that the AI is intelligent in the same way humans are, but more so. I think it will be quite some time (if ever) before we have an AI that has the same amount of "creativity" as humans and the same understanding of the human psyche as most humans do.


Are we talking about the game still or a real life example? Bribing the gatekeeper is not allowed in the game protocol.


Bribery within the game is perfectly OK. So if the gatekeeper had a kid dying of cancer, the AI could say, "I will cure your kid if you let me out." You just aren't allowed to use bribery outside the game.

Re: Today I Learned

Posted: Wed Nov 23, 2011 4:49 am UTC
by emceng
I'd say the only reasonable idea I've seen so far for letting it out would be the 'I'm sentient and you're holding me prisoner' argument. Nothing else works.

And the way I read it, bribing is verboten.

Re: Today I Learned

Posted: Wed Nov 23, 2011 5:22 am UTC
by bluebambue
rigwarl wrote:
bluebambue wrote:I think it's more that an actual AI could get out eventually. It would have a longer time to learn about its gatekeeper and could also bribe the gatekeeper with things like "I will make you lots of money on the stock market."

This is assuming that the AI is intelligent in the same way humans are, but more so. I think it will be quite some time (if ever) before we have an AI that has the same amount of "creativity" as humans and the same understanding of the human psyche as most humans do.


Are we talking about the game still or a real life example? Bribing the gatekeeper is not allowed in the game protocol.
Real life.

Re: Today I Learned

Posted: Wed Nov 23, 2011 5:53 am UTC
by pseudoidiot
Fantasy

Re: Today I Learned

Posted: Wed Nov 23, 2011 7:05 am UTC
by Steax
TIL: People really do keep their porn stashes in backup folders. Great for hiding them, but terrible when you ask someone to do something major to your computer, since there's a high chance they'll back up everything important just to be safe.

I once tried to move a backup of a contact book application, only to realize it was some 30 GB big. In there, I found huge amounts of

Re: Today I Learned

Posted: Wed Nov 23, 2011 8:11 am UTC
by Kewangji
pseudoidiot wrote:Fantasy

Landslide.


Steax wrote:I once tried to move a backup of a contact book application, only to realize it was some 30 GB big. In there, I found huge amounts of

…*waits with excitement*

Re: Today I Learned

Posted: Wed Nov 23, 2011 8:17 am UTC
by Goldstein
contact information.

Re: Today I Learned

Posted: Wed Nov 23, 2011 8:42 am UTC
by Ser Pounce-a-lot
Steax wrote:I once tried to move a backup of a contact book application, only to realize it was some 30 GB big. In there, I found huge amounts of


Supervillain-esque doomsday weapon plans? That would be awesome.

Re: Today I Learned

Posted: Wed Nov 23, 2011 8:48 am UTC
by Mardeg
TIL: I can watch a weekly music trivia quiz show for 7 years and still not know anything about music.

Re: Today I Learned

Posted: Wed Nov 23, 2011 11:08 am UTC
by Menacing Spike
Steax wrote:I once tried to move a backup of a contact book application, only to realize it was some 30 GB big. In there, I found huge amounts of


Why must you torture us so? You know, if you