Generating GLM Data and Eccentricity

At work, I had to generate random data to run logistic regressions on. In one unusual case, the slice sampler was performing far worse than expected. The code was simple and contained no mistakes; we thought something had gone terribly wrong with the whole testing framework.

What ended up happening was that our data matrix X was generated from a uniform distribution on 0 to 1, while the reference runs were generated from -0.5 to 0.5. The parameters \beta of the logistic regressions were generated from -0.5 to 0.5 in both cases.

Logistic regression is a special case of a GLM, so we have a linear predictor X\beta. Theoretically, the linear predictor should be centered at 0 in both cases, since \beta has mean zero. Yet it's (almost) always true that shifting X away from 0 makes the condition number larger.

Maybe a small proof will come later? It seems to have to do with the eigenvalues of the sum of two random matrices… which is not entirely trivial.
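
To make the effect concrete, here is a minimal NumPy sketch (not the actual testing framework; the dimensions and seed are made up) comparing the condition number of a design matrix drawn from a uniform on (0, 1) against one drawn from a uniform on (-0.5, 0.5):

    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 1000, 10  # arbitrary sample size and number of predictors

    # Same uniform noise; one design is shifted away from 0, one is centered at 0.
    X_shifted = rng.uniform(0.0, 1.0, size=(n, p))    # what we generated
    X_centered = rng.uniform(-0.5, 0.5, size=(n, p))  # what the reference runs used

    print("cond(X), U(0, 1):     ", np.linalg.cond(X_shifted))
    print("cond(X), U(-0.5, 0.5):", np.linalg.cond(X_centered))

The shifted design is (in distribution) the centered one plus 0.5 times the all-ones matrix; that rank-one offset inflates the largest singular value while leaving the smallest roughly where it was, which is one way to see where the larger condition number comes from.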

Work

Work is so tough. I thought it'd be pretty chill and that I'd have tons of time to relax at night. Nope… too tired to do other stuff.

I’ll be posting some statistics soon though!

Paper

Busy with final projects, and watching Pokemon.

*Team Rocket does something stupid*
Jessie: Why did we do that?
James: Because we have to fill the half hour.

May It Never Come

One began to hear it said that World War I was the chemists’ war,

World War II was the physicists’ war,

World War III (may it never come) will be the mathematicians’ war.

Monies

After 11 years of playing the clarinet, I have finally made some money from playing it. This past Sunday, I played a gig for some Ithaca music group as the cuckoo in Saint-Saëns' Carnival of the Animals.

Grand total profits? Fifteen dollars.

Made quite a dent in my two-thousand-dollar clarinet, and in the upwards of five thousand dollars' worth of lessons.

Notes: SSD edition

Some notes from the past week:

  1. It is incredibly easy to be an impostor at a more academic party. First of all, most of the people will already be intoxicated to the point where bullshit science can't be discerned from actual science. This is good, as I can just say random facts I remember from Popular Science.

    Another acceptable thing to do is to just ask questions upon questions. "What's your research? … Oh, that's so cool! Tell me more about it! … So does this connect to [insert scientific news here]? Wow." That'll burn around 5 minutes minimum.

    The main problem comes when you run out of questions in the initial barrage. It also fails when the person is laconic or can't speak English.

  2. Installing an SSD is extremely easy, but installing operating systems is hard. Right now, I have around 8 entries in my GRUB menu before I migrate everything over to my new distro.

    I followed the mount guide provided here, which is intuitive enough about where to put mount points. I've also learned that
    mount

    and

    df -h

    are my friends. There’s also that good GParted software.

  3. The Trefethen numerical linear algebra book is quite good for a quick overview of the subject. It doesn't get bogged down in the analysis, and generally refers to other books (mainly the Van Loan) throughout.
  4. Holy shit URF mode.
  5. I need to be more brave in a certain subject….

Damn

Didn’t even make top 500 for Putnam this past year. What a disappointment for me…

Ray Casting with JOGL

I won’t post the entire code here, because it’s pretty damn ugly. But here’s what I ended up doing:

  1. I used the
    gluUnProject

    function to find the beginning and end points to extrapolate a line from.

  2. Now that I have a line, I use the point-to-line distance formula from Wikipedia, in its vector formulation.
  3. Simply loop over the vertices and find the minimum distance (a rough sketch of the whole procedure is below).
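
I won't reproduce the actual JOGL code, but here is a rough NumPy sketch of just the geometry, assuming gluUnProject has already been called at window depths 0 and 1 to get the two points; the names ray_near, ray_far, and vertices are made up for illustration:

    import numpy as np

    def closest_vertex_to_ray(ray_near, ray_far, vertices):
        """Index of the vertex nearest to the picking ray.

        ray_near, ray_far: 3-vectors from gluUnProject at winZ = 0 and winZ = 1.
        vertices: (n, 3) array of vertex positions.
        """
        a = np.asarray(ray_near, dtype=float)
        direction = np.asarray(ray_far, dtype=float) - a
        direction /= np.linalg.norm(direction)  # unit direction of the line

        # Vector form of the point-to-line distance: drop the component of
        # (p - a) along the line direction and measure what is left over.
        diffs = np.asarray(vertices, dtype=float) - a
        proj = diffs @ direction                  # scalar projections onto the ray
        perp = diffs - np.outer(proj, direction)  # perpendicular components
        return int(np.argmin(np.linalg.norm(perp, axis=1)))

This is the same thing as the cross-product form ||(p - a) x n|| on the Wikipedia page; the projection version just vectorizes cleanly over all the vertices.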

Sorry for not posting recently… I got caught up in things… 🙁