Love’s Philosophy

BY PERCY BYSSHE SHELLEY

The fountains mingle with the river
And the rivers with the ocean,
The winds of heaven mix for ever
With a sweet emotion;
Nothing in the world is single;
All things by a law divine
In one spirit meet and mingle.
Why not I with thine?—

See the mountains kiss high heaven
And the waves clasp one another;
No sister-flower would be forgiven
If it disdained its brother;
And the sunlight clasps the earth
And the moonbeams kiss the sea:
What is all this sweet work worth
If thou kiss not me?

Musings

  • I understand now why people tend to break up around Thanksgiving.
  • People in groups need to carry their own weight.
  • Damn chops are so useless.

CS3410

Huh? This doesn’t compute…

We are 3 CS majors looking for more students with C experience to join the group.

Idiot’s Guide to GMRES (written by an idiot)

I’m doing some programming work for Professor Bindel in the CS department, and it’s pretty novel stuff to me. I figured that if I write out what I know about the original method he’s trying to improve on, it’ll help me with the overall picture. Here it is:

The entire premise of GMRES (generalized minimal residual method) is to solve systems of equations Ax = b, where A is a matrix and x, b are vectors with appropriate dimensions. First, though, we need some additional machinery before we can cover the main method (don’t worry, the machinery does most of the work).

Overview

First, some definitions and an overview before the nitty-gritty: a Krylov subspace is a magical thing that is quite simple to think about. It’s defined as K_k(A, b) = \text{span}\{b, Ab, A^2b, \dots, A^{k-1}b\}. It’s really nothing but the span of the vectors we get from multiplying b by A over and over.
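To make that concrete, here’s what those Krylov vectors look like in NumPy (a toy sketch; the matrix, the seed, and the subspace size are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))   # a small random matrix
b = rng.standard_normal(5)

# Columns are the Krylov vectors b, Ab, A^2 b, A^3 b spanning K_4(A, b).
K = np.column_stack([np.linalg.matrix_power(A, j) @ b for j in range(4)])
```

In practice you’d never form A^j b by powering A; you’d multiply the previous vector by A. Also, these raw vectors quickly become nearly parallel (they all drift toward the dominant eigenvector), which is exactly why they get orthonormalized before use.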

What GMRES aims to do is minimize the norm of the residual, \|A\hat{x} - b\|, over \hat{x} \in K_k(A, b).

Arnoldi Iterations

I’m going to assume that you’ve seen power iteration. We basically apply GS (Gram–Schmidt) to those Krylov vectors to obtain an orthonormal basis. Our end goal is to obtain an expression of the form AV_{k} = V_{k+1} H_k, where A is the matrix from our problem and the V matrices are the orthonormalized Krylov subspace vectors combined into a matrix. H_k is a (k+1) \times k upper Hessenberg matrix (i.e. an upper triangular matrix with the first subdiagonal filled in).

The algorithm is as follows (shamelessly copied from wiki):

  • Start with an arbitrary vector q_1 with norm 1.
  • Repeat for k = 2, 3, …
      • q_k \leftarrow Aq_{k-1}
      • for j from 1 to k − 1
          • h_{j,k-1} \leftarrow q_j^* q_k
          • q_k \leftarrow q_k - h_{j,k-1} q_j
      • endfor
      • h_{k,k-1} \leftarrow \|q_k\|
      • q_k \leftarrow \frac{q_k}{h_{k,k-1}}

What does it mean? It’s just like (modified) GS! For each additional vector from the Krylov subspace, you subtract off its components along the vectors already processed and store those coefficients in the H matrix. Sounds simple but looks rough…
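The pseudocode above translates almost line-for-line into NumPy. A minimal sketch (the function name is mine, and it assumes no breakdown, i.e. the subdiagonal entries of H never hit zero):

```python
import numpy as np

def arnoldi(A, b, k):
    """k steps of Arnoldi with modified Gram-Schmidt.

    Returns V (n x (k+1)) with orthonormal columns spanning K_{k+1}(A, b),
    and the (k+1) x k upper Hessenberg H satisfying A @ V[:, :k] = V @ H.
    """
    n = len(b)
    V = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        w = A @ V[:, j]
        for i in range(j + 1):           # subtract components along earlier q's
            H[i, j] = V[:, i] @ w
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)  # leftover norm goes on the subdiagonal
        V[:, j + 1] = w / H[j + 1, j]
    return V, H
```

Comparing `A @ V[:, :k]` against `V @ H` numerically is a good way to convince yourself the Arnoldi relation really holds.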

GMRES

Now we can actually get to the method (which is simple if you’re still reading). We try to minimize the residual \|Ax - b\| (in the Euclidean norm, for those wondering) over x \in K_k(A, b) (i.e. the kth Krylov subspace). We can rewrite x = V_k y because, from our discussion of the Arnoldi iterations above, V_k is the matrix whose columns are the orthonormal Krylov basis vectors.

Now, we perform some algebra…

\|Ax - b\|=\|AV_ky - b\|=\|V_{k+1}H_ky-b\|, \qquad \|V_{k+1}H_ky-b\|^2=\|H_ky-V_{k+1}^Tb\|^2 + C

The first equality comes from substitution, the second from the Arnoldi relation, and the last step is kind of tricky (note it holds for the squared norms). C in our equation is the squared norm of the projection of b onto the orthogonal complement of the range of V_{k+1} (i.e. the part of b that nothing in the Krylov subspace can reach). Think of this as the Pythagorean theorem applied to the part of b that lies within the span and the part that lies orthogonal to it. Since C doesn’t depend on y, minimizing the left-hand side is the same as minimizing \|H_ky-V_{k+1}^Tb\|.

Finally, the last bit is just the observation that the first column of V_{k+1} is b, normalized to unit length. So V_{k+1}^Tb=\|b\|e_1.
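Putting the pieces together, a bare-bones (full-memory, no-restart) GMRES sketch looks like this. The function name is mine, and it assumes Arnoldi doesn’t break down before step k:

```python
import numpy as np

def gmres_k(A, b, k):
    """GMRES sketch: run k Arnoldi steps, then solve the small
    least-squares problem min_y ||H y - ||b|| e1|| and return x = V_k y."""
    n = len(b)
    V = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    beta = np.linalg.norm(b)
    V[:, 0] = b / beta
    for j in range(k):                    # Arnoldi with modified Gram-Schmidt
        w = A @ V[:, j]
        for i in range(j + 1):
            H[i, j] = V[:, i] @ w
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / max(H[j + 1, j], 1e-300)  # guard exact breakdown
    rhs = np.zeros(k + 1)
    rhs[0] = beta                         # V_{k+1}^T b = ||b|| e1
    y, *_ = np.linalg.lstsq(H, rhs, rcond=None)
    return V[:, :k] @ y
```

With k equal to the dimension n (and a reasonably conditioned A), the residual should drop to around machine precision, since the full Krylov subspace contains the exact solution.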

Problems with GMRES

If you look back at the Arnoldi iterations, you have more and more vectors to loop over as the Krylov subspace gets larger… and you have to store all those vectors too! This is why people commonly use GMRES with restarts, where they erase all the previous iterations and reconstruct the Krylov subspace from the current, closest solution.
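A restarted-GMRES sketch, to make the trade-off concrete (the function name, parameters m, restarts, and the tolerance are my choices; only m + 1 basis vectors are ever stored):

```python
import numpy as np

def gmres_restarted(A, b, m=20, restarts=50, tol=1e-8):
    """GMRES(m) sketch: run m Arnoldi steps from the current residual,
    update x via the small least-squares solve, throw the basis away,
    and restart from the improved x."""
    n = len(b)
    x = np.zeros(n)
    bnorm = np.linalg.norm(b)
    for _ in range(restarts):
        r = b - A @ x                      # restart from the current residual
        beta = np.linalg.norm(r)
        if beta <= tol * bnorm:
            break
        V = np.zeros((n, m + 1))
        H = np.zeros((m + 1, m))
        V[:, 0] = r / beta
        for j in range(m):                 # m Arnoldi steps (modified GS)
            w = A @ V[:, j]
            for i in range(j + 1):
                H[i, j] = V[:, i] @ w
                w = w - H[i, j] * V[:, i]
            H[j + 1, j] = np.linalg.norm(w)
            V[:, j + 1] = w / max(H[j + 1, j], 1e-300)
        rhs = np.zeros(m + 1)
        rhs[0] = beta
        y, *_ = np.linalg.lstsq(H, rhs, rcond=None)
        x = x + V[:, :m] @ y               # correction from this cycle
    return x
```

Production implementations (e.g. `scipy.sparse.linalg.gmres`, whose `restart` parameter defaults to 20) handle breakdowns properly and typically update the small least-squares solve incrementally with Givens rotations instead of calling a dense solver each cycle.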

The problem with this is that there are situations where convergence to the solution actually depends on the restart value! If you don’t choose a good one, convergence to the solution might not occur at all. That’s bad.

Another way to deal with this is to demand that the matrix of basis vectors be well conditioned… but I don’t know much about this. And another way is to use Chebyshev polynomials to somehow do it (more on this later, after I read the paper).

The Push

I finally decided to drop algebra, which puts me at 20.5 credit hours this semester. Honestly, it wasn’t a tough decision.

More and more, I see pure math as a mental excursion to go on when one wants to find fascinating sights. But that’s all it is: a mental treat, as I don’t see myself becoming a pure mathematician.

Am I smart enough for pure math? Maybe.

Am I motivated enough for pure math? Maybe.

One does not pursue grad school when the answers to those questions are just maybe.

Holy Gamole.

I know I loved Frozen’s entire presentation, but this guy’s cover of “Let It Go” is unreal.

Alexis Ohanian

Alexis Ohanian, known as the co-founder of reddit, came to Cornell last night on his book tour. I guess the two points I took from his talk were the following:

  1. If you think about it, cat memes have more competition than any of us. There are hundreds of thousands of cute cats, and just as many video-taking owners. For Grumpy Cat to remain at the top of the pile for… grumpy cat pictures is pretty amazing by itself.
    Screw everything.

    The moral of this point was to not worry about the competition and focus on your work. If you put enough thought and care into it, everything will work out.

  2. Ideas are worthless. Enough said…

He’s a pretty interesting character all in all.


Snow Day

7:50 AM: Alarm rang. Decided not to wake up.

8:01 AM: Alarm rang. Decided to wake up and take a shower.

8:45 AM: Finished eating breakfast, and now checking emails. Noticed something about school reopening at 9:30 AM.

9:00 AM: Dressed in a suit and ready for an interview.

9:20 AM: Trudged through the snow and arrived at Carpenter Hall.

9:22 AM: Was informed by the attendant that my interview had been cancelled.

9:22:30 AM: Proceeded to curse profusely in my head.

9:22:32 AM: “Thanks! So how should I reschedule….”

9:25 AM: While walking outside Carpenter on the way to my class, received an email saying my 10:10 class was cancelled.

9:50 AM: Arrived back home.

9:51 AM: Proceeded to fume profusely.

WHAT I HAD TO TRUDGE THROUGH!