(no subject)
Dec. 28th, 2018 10:10 pm

reblogged your post and added:
so i think the set of matrices you’re talking about is a real thing: the orthogonal matrices with determinant 1. The reflections have determinant -1
I really like the Cover and Hart result, that you can do nearest neighbor classification (classify to the category of the most similar previous observation), and asymptotically get an error rate less than twice that of the Bayes classifier (classify to the category with the highest posterior probability).
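The nearest neighbor rule itself is simple enough to sketch in a few lines. A minimal version in numpy (the toy data here is made up for illustration):

```python
import numpy as np

def nn_classify(X_train, y_train, x):
    """Classify x to the label of its nearest training point (1-NN)."""
    dists = np.linalg.norm(X_train - x, axis=1)
    return y_train[np.argmin(dists)]

# toy data: two small clusters with labels 0 and 1
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])

print(nn_classify(X, y, np.array([0.05, 0.1])))  # → 0
print(nn_classify(X, y, np.array([0.95, 1.0])))  # → 1
```

Cover and Hart’s result is about what happens to this rule’s error rate as the number of training points goes to infinity.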
Actually, don’t think of your dataset as observations. Think of them as samples from the posterior distribution, which you generated computationally. So you can do classification by sampling from the posterior distribution, instead of by calculating posterior probability for individual given cases.
And then, sampling from the posterior distribution seems analogous to sitting around and thinking of hypothetical situations which seem plausible to you. So this is a model of reasoning that has a place for thinking, I mean, a kind of thinking which isn’t like, counting possible worlds that remain after eliminating those inconsistent with your observations. Although it doesn’t tell you precisely how much thinking you need to do or when you need more of it, so it’s not a model of reasoning that can really guide thinking? Except, maybe, “try to think of a representative sample of plausible cases”. Rather than “think of plausible cases in which your favored presidential candidate would have a positive impact”, or something.
fnord888 reblogged your post and added:
Not a contradiction, just a and c can’t be integers (I think can’t be rational, but that’s not a proof of that).
Also, any number that’s not itself perfect square is the same, right?
Yeah, it’s a contradiction under the assumption that a and c are integers.
And yeah, that’s true, it works not just for 2 but any number that is not itself a perfect square.
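The general version can be sketched by comparing prime factorizations (here v_p denotes the exponent of the prime p in an integer’s factorization):

```latex
% Suppose sqrt(n) = a/c for integers a, c with c nonzero. Then:
a^2 = n\,c^2
\quad\Longrightarrow\quad
v_p(n) = 2\,v_p(a) - 2\,v_p(c) \text{ is even for every prime } p
% So every prime appears in n to an even power, i.e. n is a perfect
% square. Contrapositive: if n is not a perfect square, then
% sqrt(n) is irrational.
```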
My dad, I think, has some vague spiritual beliefs involving reincarnation; I could be wrong, he doesn’t really talk about it. My mom, after she got lupus, started going to the Chabad center and keeping kosher and such. And driving to schul on Shabbat, but parking kind of far away so nobody could see that she had driven in.
My mom started raising us Jewish, and my brother was to have his bar mitzvah, and my dad allowed all this but drew the line at tefillin. Nobody’s putting leather straps on his kid. No way my mom was going to win that argument. But it was what she was supposed to do; her son was supposed to start putting on tefillin after his bar mitzvah.
She was talking about this with my paternal grandfather. He took out his father’s tefillin, tried to put them on, couldn’t remember how to do it. Later, he was talking to an Orthodox Jew who goes to his gym, who was going to Israel. They ended up agreeing: this Orthodox guy would pick up some kosher tefillin in Israel, and my grandfather would buy them from him. My mom didn’t put him up to this; it was an independent idea.
So he did, and gave me and my brother tefillin as bar mitzvah gifts, and my dad couldn’t say anything to that, and of course all the Lubavitchers at my mom’s schul were saying it was divine providence.
finest-quality reblogged your post “real orthogonal matrices do rotations”
The way I learned the definition of orthogonal, the reflection A= [-1 0; 0 1] is orthogonal because A transpose = A inverse
Yup. That’s a reflection, and it’s orthogonal.
My reasoning was incorrect, because
v . w = ||v|| ||w|| cos(θ)
and
Av . Aw = ||Av|| ||Aw|| cos(-θ) = ||v|| ||w|| cos(θ) = v . w
So, while I was right that the angle between them “reverses” in a certain sense--you turn one direction to get from v to w, and another direction to get from Av to Aw--I was wrong that this flips the sign of the dot product. In fact it doesn’t, because the dot product has the cosine of the angle, and cos(-θ) = cos(θ).
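A quick numerical check of this (numpy; the particular vectors are arbitrary choices): the reflection A = [-1 0; 0 1] preserves dot products even though it reverses orientation.

```python
import numpy as np

A = np.array([[-1.0, 0.0],
              [0.0, 1.0]])  # reflection across the y-axis

v = np.array([2.0, 1.0])
w = np.array([0.5, -3.0])

# A is orthogonal: A' A = I, so (Av).(Aw) = v.(A' A w) = v.w
print(np.dot(v, w))          # → -2.0
print(np.dot(A @ v, A @ w))  # → -2.0
```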
adzolotl replied to your post “real orthogonal matrices do rotations”
Might be a reflection!
No way.
Shit.
Really?
OK, if I’ve got v, and rotate it by θ counterclockwise, you’ve got w. And then you do a reflection, and you’ve got Av and Aw.
Then, starting at Av, you rotate by θ clockwise to get Aw.
So I’m gonna go ahead and say that v . w = - (Av . Aw)
So no, it can't be a reflection.
Right?
new cards don’t have the regenerate keyword?
I always took it for granted but it was actually kind of confusing. If it would die later in the turn, it doesn’t; instead you remove it from combat, remove all damage from it, and tap it?
Apparently now they just have ‘indestructible until end of turn.’ That’s better.
When I first learned regenerate was gone it was like the floor had been taken out from under me. The Apprentice guy is president. The world I grew up in is gone. But, at least as far as Magic: The Gathering keyword effects go, the new world makes more sense now.
Let Σ be a symmetric, positive definite matrix. I have in mind a covariance matrix.
Diagonalize it:
Σ = P D P⁻¹
Since it’s symmetric, P⁻¹ = P’ (by which I mean, P transpose):
Σ = P D P’
Since it’s positive definite, all the elements in the diagonal matrix D are positive, so you can express D as the square of another diagonal matrix:
D = Q²
Σ = P Q² P’
Define
A = P Q P’
Note that A is symmetric.
Then
A’ A = P Q P’ P Q P’ = P Q² P’ = Σ
This means that quadratic forms of Σ can be expressed as the squared lengths of vectors after being transformed by A:
|| A v ||² = v’ A’ A v = v’ Σ v
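A numerical sketch of the construction (numpy; the particular covariance matrix is an arbitrary example):

```python
import numpy as np

# an arbitrary symmetric positive definite matrix
S = np.array([[2.0, 0.8],
              [0.8, 1.0]])

# diagonalize: S = P D P'  (eigh returns orthonormal eigenvectors
# for a symmetric input, so P' really is P inverse)
d, P = np.linalg.eigh(S)
Q = np.diag(np.sqrt(d))   # D = Q², possible since all eigenvalues > 0
A = P @ Q @ P.T           # the symmetric square root

# check: A'A = S, and quadratic forms are squared lengths under A
v = np.array([1.0, -2.0])
print(np.allclose(A.T @ A, S))                       # → True
print(np.allclose(np.dot(A @ v, A @ v), v @ S @ v))  # → True
```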
I was thinking about this because of the optimization problem
Maximize u’ Σ u subject to ||u|| = 1
In two dimensions, I can visualize this. The set allowed by the constraint is a circle. The image of that circle under A is an ellipse, since I interpret P and P’ as rotation matrices, and Q as scaling the x and y axes. Take a vector pointing to a point on the circle, rotate it clockwise, scale the axes, rotate it back counterclockwise.
Since u’ Σ u = || A u ||², our goal is to get the point on the ellipse that is as far as possible from the center. So we want a vector Au that points along the major axis of the ellipse. And our maximum possible value will be the squared length of this vector.
It turns out this comes from a vector u which points along the major axis--this vector is an eigenvector, and it points along the major axis before and after the transformation. I came to that conclusion thinking about what A = P Q P’ does in three parts: rotate clockwise, scale the axes, rotate counterclockwise.
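A check of that conclusion (numpy; same arbitrary example matrix as above): sweeping unit vectors around the circle, the maximizer of u’ Σ u lines up with the top eigenvector, and the maximum value equals the largest eigenvalue.

```python
import numpy as np

S = np.array([[2.0, 0.8],
              [0.8, 1.0]])

# sweep unit vectors over half the circle (u and -u give the same value)
thetas = np.linspace(0.0, np.pi, 10001)
us = np.stack([np.cos(thetas), np.sin(thetas)], axis=1)
vals = np.einsum('ij,jk,ik->i', us, S, us)  # u' S u for each u

u_best = us[np.argmax(vals)]

d, P = np.linalg.eigh(S)         # eigenvalues in ascending order
top_val, top_vec = d[-1], P[:, -1]

print(np.isclose(vals.max(), top_val, atol=1e-6))         # → True
print(np.isclose(abs(u_best @ top_vec), 1.0, atol=1e-3))  # → True
```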