Realist Musings

This stream of consciousness will be a classic case of the Dunning-Kruger effect, speaking as someone whose education in international relations began and ended with an “Introduction to IR” class, where I ended up with a solid B, never spoke up in class, and was generally confused by the readings.

But one of the classic theories I remember is Thomas Friedman’s Golden Arches Theory of Conflict Prevention. It states that:

No two countries that both have a McDonald’s have ever fought a war against each other

which, given the current situation, can probably be declared officially dead. Side note: does the theory still apply if McDonald’s retroactively pulls out of Russia?

Perhaps a better theory would add the caveat that both countries have functioning democracies. This seems better supported by the small data set at hand: the main counterexamples on Wikipedia (with the exception of Israel/Lebanon) seem to justify the addendum.

This makes intuitive sense. Having a McDonald’s franchise necessarily means that the country is fairly stable, has a solid logistical base, and can generate income. However, that says little about the country’s politics. Case in point: China.

It’s also worrying to view what has happened in Ukraine through the lens that it’s the death rattle of Putin. There have been rumors of Putin’s declining health, and perhaps this is his need to create a “legacy”? This doesn’t bode well for the Taiwan-China situation either, as China has a huge demographic time bomb caused by the one-child policy. What will happen once that bomb goes off?

I really don’t need this century to have another global-order-redefining war.

Putnam 2020 A2

For $k$ a non-negative integer, evaluate
\begin{align*}
\sum_{j=0}^k 2^{k-j} \binom{k +j}{j}.
\end{align*}

Apparently, induction is the only technique I remember these days, but this problem is straightforward using it. The base case $k = 0$ is immediate: the sum is $\binom{0}{0} = 1 = 2^0$. For the inductive step, assume that for a fixed $k$
\begin{align*}
\sum_{j=0}^k 2^{k-j} \binom{k +j}{j} = 2^{2k}.
\end{align*}
We will proceed to show
\begin{align*}
\sum_{j=0}^{k + 1} 2^{k + 1-j} \binom{k + 1 +j}{j} = 2^{2k + 2}.
\end{align*}

By Pascal’s rule, $\binom{k+1+j}{j} = \binom{k+j}{j} + \binom{k+j}{j-1}$, we have
\begin{align*}
\sum_{j=0}^{k + 1} 2^{k + 1-j} \binom{k + 1 +j}{j} &= \sum_{j=0}^{k + 1} 2^{k + 1-j} \binom{k+j}{j} + \sum_{j=0}^{k + 1} 2^{k + 1-j} \binom{k +j}{j - 1}.
\end{align*}
Let $A$ and $B$ be the first and second sums, respectively, on the right-hand side. Then
\begin{align*}
A &= 2\sum_{j=0}^{k} 2^{k-j} \binom{k+j}{j} + \binom{2k + 1}{k+1} \\
&= 2^{2k + 1} + \binom{2k + 1}{k+1}.
\end{align*}
For the other term, the $j = 0$ summand vanishes since $\binom{k}{-1} = 0$, so after shifting the index,
\begin{align*}
B &= \sum_{j=1}^{k + 1} 2^{k + 1-j} \binom{k +j}{j - 1} \\
&= \sum_{j=0}^{k} 2^{k-j} \binom{k +1 +j}{j} \\
&= \frac{1}{2} \sum_{j=0}^{k + 1} 2^{k + 1-j} \binom{k +1 +j}{j} - \frac{1}{2} \binom{2k + 2}{k + 1}.
\end{align*}
Thus, rearranging and using $\binom{2k+2}{k+1} = 2\binom{2k+1}{k+1}$, we have our result:
\begin{align*}
\frac{1}{2}\sum_{j=0}^{k + 1} 2^{k + 1-j} \binom{k + 1 +j}{j} &= 2^{2k+1} + \binom{2k + 1}{k+1} - \frac{1}{2} \binom{2k + 2}{k + 1} \\
&= 2^{2k+1}.
\end{align*}
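The closed form is also easy to sanity-check numerically. A quick script using Python’s `math.comb` (my own check, not part of the official solution):

```python
from math import comb

def lhs(k):
    # sum_{j=0}^{k} 2^(k-j) * C(k+j, j)
    return sum(2 ** (k - j) * comb(k + j, j) for j in range(k + 1))

# the closed form 2^(2k) = 4^k holds for every small k we try
print(all(lhs(k) == 4 ** k for k in range(12)))  # True
```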

New Mexico Tourism

Within the last month, I’ve hosted a friend and my brother in Albuquerque, which meant coming up with things to take them to do. Here is my list, which doesn’t include climbing around the state (that warrants a post of its own one day), along with my opinions:

  1. Meow Wolf: a must do. Unique, fun and modern. Truly an art museum that is worth every dime. One can easily take the opportunity while up in Santa Fe to eat some better food, and visit their downtown (which is better than ABQ’s Old Town)… speaking of…
  2. Sandia Tramway and ABQ Old Town: I honestly don’t think ABQ as a city is very worthwhile to visit. The Sandia Tramway is nice for an overview of the city and for some mountain air, but it’s fairly expensive for the experience; I think La Luz might be the better option for those who are young and fit. Old Town is really just a shopping area and pales in comparison to Santa Fe’s corresponding area. The only things I liked were the BioPark and Explora (but this zoo/garden/aquarium and museum are not unique to ABQ).
  3. Petroglyph National Monument: I took both people to the Piedras Marcadas canyon trail for the petroglyphs. It’s honestly not that spectacular a hike, considering the flatness and the fact that it’s next to a suburb. It is useful as a first-day visit to acclimate to the altitude.

  4. Bisti Badlands/Valley of Dreams: for the outdoorsy type, I thought this was a fantastic place (one of the pictures at the top of my banner is from there). The one drawback is that it is a full-day trip, and summer is brutal there. But for geologists or otherwise curious types, definitely worth a visit.

  5. Ten Thousand Waves Spa: I brought my friend there to let him relax. It wasn’t too expensive once we split the cost, and I really enjoyed my time there. The caveat is that there are closer spas in ABQ, but none had the Japanese aesthetic I wanted.

  6. Tinkertown Museum: a hidden gem. I loved the little place. It’s filled with tchotchkes, figurines and models that one person collected over his lifetime. On the surface, this may sound mundane, but the craftsmanship and themes really bring the exhibits together. Truly great. It’s on the Turquoise Trail, which lets one visit Madrid, NM on the way.

  7. Wheeler Peak Hike: a hard one. Do not attempt if not fit. The payoff is great, though, and it’s a very nice trip up to Taos (visit Chokola, unrelated to Dracula, for great chocolate) to escape the heat. One can make this a quick weekend trip by camping in the valley, then hiking to Williams Lake rather than the summit (only 4 miles and much less strenuous).

The Last Thing He Told Me

I honestly spent some three months trying to read Donna Tartt’s The Little Friend. It was well written, and I generally like her writing style (I did finish The Goldfinch and The Secret History), but somehow this particular novel didn’t work for me. Out of frustration, I just borrowed the most popular available book on Libby, which happened to be the titular book by Laura Dave.

Oh boy, was the difference in writing stark. It was distinctly less sophisticated, and the vocabulary more elementary. That made for much faster reading: the novel was done in less than a week. This is probably also due to the over-explanation at every point.

Was it worth it? Eh.

I honestly thought the plot was unrealistic with hackneyed tropes: a teenager being difficult, the protagonist doing things for “love”, a clear misunderstanding of what technology can do. Still, I finished it.

Why is that? I don’t know.

An Improvisational Note

For the last few weeks, I’ve been taking a beginner improv class as a way to meet people and broaden my horizons. Overall, the experience has been highly positive, and I’ve learned a great deal about not just improv, but comedy as a whole.

The phrase I keep using to tell friends what the classes are like is “adult daycare.” Certainly a lowbrow phrase, but how else can one interpret a class where we try to channel different levels of an animal? Or one where we try to perform a Busby Berkeley dance without a priori discussion? Another improv game that I (and, I suspect, many kids) would enjoy is essentially everyone-tells-a-story-one-word-at-a-time.

At some point, silliness from all becomes just straight-up fun. The key phrase there is “from all”: we all paid for the lessons, and thus had a stake in being whimsical. I’m glad none of my classmates hesitated to do any of the exercises, and in fact most put themselves out there. This was one of the lessons from improv: action with confidence supersedes hesitancy, especially if no one knows what you’re doing.

Joking aside, it’s actually been very informative about what exactly makes good improv, well, good. Some of it can be attributed to the spontaneity of the performers, but there are general frameworks that help add structure. For one, we learned that it’s much easier to grasp a scene if we self-impose a goal and add some familiar relationship between the characters. The audience can latch onto the characters much more easily that way, and there are established dynamics to manipulate.

Something I signed up for willy-nilly turned out to be quite the happy accident. I really did enjoy the class and the challenges it gave me.

Saddle Point Property

I felt dumb while trying to derive this, so here it is. Sourced from page 129 of Braess’s FEM textbook.

Let $X, M$ be two Hilbert spaces, and $a: X \times X \to \mathbb{R}, \, b: X \times M \to \mathbb{R}$ continuous bilinear forms.
Assume that $a$ is symmetric and that $a(u, u) \ge 0$ for all $u \in X$.
Let $f \in X', g \in M'$ and let
\begin{align*}
\mathcal{L}(v, \mu) &= \frac{1}{2}a(v, v) - \langle f, v \rangle + (b(v, \mu) - \langle g, \mu\rangle)
\end{align*}
which is simply the Lagrangian of a constrained minimization problem.
Assume that $(u, \lambda)$ satisfies
\begin{align}
a(u, v) + b(v, \lambda) &= \langle f, v \rangle \qquad &\forall v \in X, \\
b(u, \mu) &= \langle g, \mu \rangle \qquad &\forall \mu \in M,
\end{align}
then one has the saddle point property
\begin{align*}
\mathcal{L}(u, \mu) \le \mathcal{L}(u, \lambda) \le \mathcal{L}(v, \lambda) \qquad \forall (v, \mu) \in X \times M.
\end{align*}

The first inequality is actually an equality: by (2), $\mathcal{L}(u, \mu) = \frac{1}{2}a(u, u) - \langle f, u \rangle = \mathcal{L}(u, \lambda)$.
For the other inequality, let $v = u + w$, then
\begin{align*}
\mathcal{L}(v, \lambda) = \mathcal{L}(u + w, \lambda) &= \frac{1}{2}a(u + w, u + w) - \langle f, u + w \rangle + (b(u + w, \lambda) - \langle g, \lambda\rangle) \\
&= \mathcal{L}(u, \lambda) + \frac{1}{2}a(w, w) - \langle f, w \rangle + a(u, w) + b(w, \lambda) \\
&= \mathcal{L}(u, \lambda) + \frac{1}{2}a(w, w) \ge\mathcal{L}(u, \lambda)
\end{align*}
where the symmetry of $a$ and (1) (with $v = w$) are used to collapse the cross terms.
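A finite-dimensional sanity check is straightforward with numpy: take $a(u, v) = v^\top A u$ with $A$ symmetric positive semidefinite, $b(v, \mu) = \mu^\top B v$, solve (1)–(2) as a KKT system, and probe the saddle inequalities at random perturbations. The matrices, sizes, and seed below are all illustrative choices of mine:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 2
C = rng.standard_normal((n, n))
A = C @ C.T                      # a(u, v) = v^T A u: symmetric, positive semidefinite
B = rng.standard_normal((m, n))  # b(v, mu) = mu^T B v
f = rng.standard_normal(n)
g = rng.standard_normal(m)

def L(v, mu):
    # Lagrangian: 1/2 a(v, v) - <f, v> + (b(v, mu) - <g, mu>)
    return 0.5 * v @ A @ v - f @ v + (B @ v - g) @ mu

# Solve the KKT system (1)-(2): A u + B^T lam = f,  B u = g
K = np.block([[A, B.T], [B, np.zeros((m, m))]])
sol = np.linalg.solve(K, np.concatenate([f, g]))
u, lam = sol[:n], sol[n:]

# Saddle point property at random perturbations
for _ in range(100):
    v = u + rng.standard_normal(n)
    mu = lam + rng.standard_normal(m)
    assert L(u, mu) <= L(u, lam) + 1e-8
    assert L(u, lam) <= L(v, lam) + 1e-8
print("saddle point property verified")
```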

A fixed order double integral

I have a good example of why copying code straight from Stackoverflow can be a terrible idea. Consider the simple rectangle $[0, 1] \times [0, 2]$ over which we want to compute a double integral. Of course, there’s the classic way using scipy’s $\texttt{dblquad}$:

In [1]:
from scipy.integrate import dblquad

def f(x, y): 
    return x + 2 * y

dblquad(lambda y, x: f(x, y), 0, 1, lambda x: 0, lambda x: 2)[0]
Out[1]:
5.0

The above is all well and nice, but $\texttt{dblquad}$ is very accurate, and hence slow. What if we don’t need that level of accuracy? We can try the $\texttt{fixed_quad}$ function in scipy. Here’s a Stackoverflow question about the exact same thing, with some sample code.

Let’s copy its code and test it. It seems to work:

In [2]:
from scipy.integrate import fixed_quad

fixed_quad(lambda x: fixed_quad(f, 0, 1, args=(x, ), n=5)[0], 0, 2, n=5)[0]
Out[2]:
5.000000000000001

Of course, we get a very nice speedup, at least on my laptop:

In [3]:
%timeit fixed_quad(lambda x: fixed_quad(f, 0, 1, args=(x, ), n=5)[0], 0, 2, n=5)[0]
10000 loops, best of 3: 39.3 µs per loop
In [4]:
%timeit dblquad(lambda y, x: f(x, y), 0, 1, lambda x: 0, lambda x: 2)[0]
1000 loops, best of 3: 261 µs per loop

Seems to work well, right? Too bad it fails for very simple cases…

In [5]:
def g(x, y): 
    return x * y

print(dblquad(lambda y, x: g(x, y), 0, 1, lambda x: 0, lambda x: 2)[0])
print(fixed_quad(lambda x: fixed_quad(g, 0, 1, args=(x, ), n=5)[0], 0, 2, n=5)[0])
0.9999999999999999
1.333333333333333

Ah, what’s going wrong? Turns out it has to do with how $\texttt{fixed_quad}$ handles arrays. The nested call doesn’t evaluate the integrand on the tensor product of the quadrature points, but only along a subset of it.
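Concretely, the inner $\texttt{fixed_quad}$ receives the outer rule’s five nodes via $\texttt{args}$ and evaluates $g$ elementwise against its own five nodes, so only the paired nodes $(x_i, y_i)$ — the “diagonal” of the grid — ever get sampled, not all 25 combinations. Here is a hand reconstruction (my own, matching what I observed) of the wrong value:

```python
import numpy as np
from scipy.special import roots_legendre

roots, weights = roots_legendre(5)
x = (roots + 1) / 2   # inner nodes mapped to [0, 1]
y = roots + 1         # outer nodes mapped to [0, 2]

# Inner rule: elementwise products x_i * y_i collapse to a single scalar.
inner = (1 - 0) / 2 * np.dot(weights, x * y)
# Outer rule then just multiplies that scalar by sum(weights) == 2.
outer = (2 - 0) / 2 * np.sum(weights) * inner
print(outer)  # ~1.3333, the wrong nested result, instead of the true 1.0
```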

A better implementation is just to rewrite the code myself (or to use something like quadpy):

In [6]:
from scipy.special import roots_legendre
import numpy as np
roots, weights = roots_legendre(5)

def dbl_integral(func, left: float, right: float, bottom: float, top: float):
    r"""
    Calculates the double integral \int_bottom^top \int_left^right f(x, y) dx dy.
    """
    # map the Gauss-Legendre nodes from [-1, 1] onto each interval
    x = (right - left) * (roots + 1) / 2.0 + left

    y = (top - bottom) * (roots + 1) / 2.0 + bottom

    total = 0
    for index_i, i in enumerate(x):
        # ones_like(y) ensures func(i, y) broadcasts to a vector
        # even if func returns a scalar for scalar inputs
        total += np.dot(np.ones_like(y) * func(i, y) * weights[index_i], weights)

    return (top - bottom) * (right - left) / 4 * total
In [7]:
print(dblquad(lambda y, x: f(x, y), 0, 1, lambda x: 0, lambda x: 2)[0])
print(fixed_quad(lambda x: fixed_quad(f, 0, 1, args=(x, ), n=5)[0], 0, 2, n=5)[0])
print(dbl_integral(f, 0, 1, 0, 2))

print(dblquad(lambda y, x: g(x, y), 0, 1, lambda x: 0, lambda x: 2)[0])
print(fixed_quad(lambda x: fixed_quad(g, 0, 1, args=(x, ), n=5)[0], 0, 2, n=5)[0])
print(dbl_integral(g, 0, 1, 0, 2))
5.0
5.000000000000001
5.0
0.9999999999999999
1.333333333333333
1.0

And we also maintain most of the speed (not quite as fast, but it’s not hard to rewrite the above in einsum or optimize it a bit more), assuming we precalculate the roots and weights:

In [8]:
%timeit dblquad(lambda y, x: g(x, y), 0, 1, lambda x: 0, lambda x: 2)[0]
%timeit fixed_quad(lambda x: fixed_quad(g, 0, 1, args=(x, ), n=5)[0], 0, 2, n=5)[0]
%timeit dbl_integral(g, 0, 1, 0, 2)
1000 loops, best of 3: 246 µs per loop
10000 loops, best of 3: 38.1 µs per loop
10000 loops, best of 3: 74.1 µs per loop
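For completeness, here’s the einsum variant alluded to above. One caveat: it assumes $\texttt{func}$ broadcasts over NumPy arrays (both test integrands here do), since it evaluates on the full tensor grid in one shot:

```python
import numpy as np
from scipy.special import roots_legendre

roots, weights = roots_legendre(5)

def dbl_integral_einsum(func, left, right, bottom, top):
    """Tensor-product Gauss-Legendre rule over [left, right] x [bottom, top]."""
    x = (right - left) * (roots + 1) / 2 + left
    y = (top - bottom) * (roots + 1) / 2 + bottom
    # Evaluate func on the full tensor grid via broadcasting, then contract
    # both axes against the weight vector in a single einsum call.
    values = func(x[:, None], y[None, :])
    total = np.einsum("i,ij,j->", weights, values, weights)
    return (top - bottom) * (right - left) / 4 * total

print(dbl_integral_einsum(lambda x, y: x * y, 0, 1, 0, 2))  # ~1.0
```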

The “Ahhh” Shower

Rodrigo never intended to have an especially busy day on the 12th. Of course, he had to make a trip to the farmers market to get those specialty onions he loved (they were sweeter than the ones in the grocery store), but otherwise the Saturday was completely free. It was a much-needed reprieve from the recent onslaught of work, with another twelve-hour shift looming on the 13th.

The day started out as planned: waking up, in no particular rush, with the sun’s rays gently urging his eyelids to make way. The drive to the market was a smooth twenty minutes. Somehow, traffic was especially clear, and so was the line for the produce. Within an hour, Rodrigo was back home, caramelizing the onions for soup and toasting bread. The rest of the day was a blur of mindless entertainment from the multitude of streaming sources while sipping the French onion soup.

Trish’s Saturday was quite the opposite. Since winter still maintained its grasp on the mornings, she wanted to hike along the foothills in the weak heat of the afternoon. The morning therefore was filled with scrubbing and cleaning. She always loved the sense of accomplishment from finishing chores.

As the hike was finishing up, she got a text from Frank, that cute guy she’d been texting.

“hey, lost our fourth from trivia in 30 @ gecko’s, you in? sorry for short notice!”

The deliberation was short. Whilst her legs were jelly, she didn’t smell too bad since it was still chilly and Frank was, by all accounts, a catch.

“ofc, see you soon :)”

Trish and Frank’s team never came close to winning. No one on the team knew any Madonna songs or had read Julius Caesar. She got home more than ten hours after she had left for the trailhead.

Though their days progressed thoroughly differently, Rodrigo and Trish both ended with a hot shower, one where “ahhhh”s were spoken, the true mark of a good day.