Quadrature Rules on Triangles

Quadrature rules on the interval [-1, 1] are well documented in many textbooks (see, for example, Cheney and Kincaid). By transformations, we can easily integrate over other intervals, or even the whole real line. Going into two dimensions, we can take tensor products to integrate over rectangles easily. On the other hand, integration on the 2D simplex (i.e., triangles) is a bit harder. There really exist two types of quadrature rules for the triangle:

  1. Tensor products + transformations
  2. “Symmetric quadrature rules”

The latter offers better convergence with fewer points, and its points are more evenly spread out than those of the tensor-product version, but it has less support in software packages; we refer the reader to the following paper. We will discuss the first type and exposit a little more; I will be borrowing heavily from Karniadakis and Sherwin’s fluid dynamics textbook.

Suppose our triangle with vertices (-1, -1), (1, -1), and (-1, 1) (the so-called reference triangle) uses x, y as its coordinates. If we use the change of coordinates u = 2\frac{1 + x}{1 - y} - 1, v = y, it will map the triangle to the square with vertices (-1, -1), (1, -1), (-1, 1), and (1, 1). Hence, we can express the integral using a change of variables (details omitted):

\int_T f(x, y) \, dx\, dy = \int_{-1}^1 \int_{-1}^1 f\!\left(\frac{(1 + u)(1 - v)}{2} - 1,\; v\right) \frac{1 - v}{2} \, du\, dv.

We note that the extra factor is simply the Jacobian of the change of variables. Now it’s easy to use Gaussian quadrature on this! But there are better versions: the Jacobian term \frac{1 - v}{2} has exactly the form of a Gauss-Jacobi weight (with \alpha = 1, \beta = 0 in the v direction), so looking up the quadrature rules, we see that Gauss-Jacobi fits our bill and the factor can be absorbed into the weights. We refer the reader to page 144 of the Karniadakis and Sherwin textbook for the full details.

A simple implementation is below (this is far from the best). I guess I’ll talk about this at the end of the week.

from scipy.special import roots_jacobi as rj

def quadtriangle(f, w00, w10, x00, x10):
    # Tensor-product rule on the collapsed coordinates (u, v):
    # Gauss-Legendre (alpha = beta = 0) in u, Gauss-Jacobi (alpha = 1, beta = 0)
    # in v, so the (1 - v) part of the Jacobian is absorbed into w10.
    total = 0
    for i in range(len(w00)):
        subsum = 0
        for j in range(len(w10)):
            # Map (u, v) back to (x, y) on the triangle; the remaining 1/2
            # of the Jacobian (1 - v)/2 shows up as w10[j] / 2.
            subsum += w10[j] / 2 * f([(1 + x00[i]) * (1 - x10[j]) / 2 - 1, x10[j]])
        total += w00[i] * subsum
    return total

def f(x):
    return x[0]**2 + x[1]**2

p = 4
x00, w00 = rj(p + 1, 0, 0)  # Gauss-Legendre nodes/weights for u
x10, w10 = rj(p + 1, 1, 0)  # Gauss-Jacobi (1, 0) nodes/weights for v
print(quadtriangle(f, w00, w10, x00, x10))  # exact value is 4/3

Morality of the Mob

Before you read on, watch the following clip from the Hateful Eight:

Of course, there are no more hangmen left in the civilized world; their work has been outsourced to giant pharma companies or the electric chair. But crimes of passion carried out under the banner of revenge will always live on. How true is the quote “justice delivered without dispassion is always in danger of not being justice”?

I bring this up because a few weeks ago I found out that my landlord (hereby denoted CB) was recently murdered. From the context, it was definitely a crime of passion, which requires us to go back in time. In 1999, a police report stated that neighbors accused CB of being a pedophile, but no charges were filed.

Later, CB lost his chiropractic license due to having sex with a patient in 2003. His license was reinstated sometime after; we know this because it was revoked again in Sept. 2017 following a complaint from a patient. Four months later, CB was found dead in his home.

The alleged killer of CB is a 21-year-old college student who filed the complaint that led to CB losing his license. The charge was “violating the professional boundaries of the chiropractic physician-patient relationship.” What exactly that means, one can only guess, but with the recent case against Nassar, I find myself unable to think of other infractions that could provoke such a visceral response as a targeted stabbing. Was our alleged killer ignored when he cried out for help?

CB was not a good man, by all accounts it seems. His former neighbor is quoted as saying ‘it didn’t come as a surprise to him.’ I don’t know who this neighbor is, but finding out someone was murdered should be a huge fucking surprise… unless the neighbor believed that CB was a terrible person.

In the end, is this justice for the alleged killer? He will have to face trial for murder, and will possibly end up in prison for the rest of his life. If he lives an average life span, that’s over 60 years in jail for a crime committed against a varmint. And CB did not deserve to die such a painful death either; no matter how heinous his past deeds, CB should have walked through the gates of a court. It seems justice was left by the wayside.

In the end, I feel this shows that the system is tarnished, and justice in this case will never be reached. CB should never have had his license reinstated after the first complaint. The alleged murderer should have had avenues of help that could have dealt with this. The fact that vigilante justice was involved at all is upsetting, but one can hope the recent activism will legitimately change our system in the future.

Legend of Zelda: Breath of the Wild

It’s been a week or two since I officially finished my first BOTW playthrough, and I think I can finally digest the whole thing. First of all, it’s an awe-inspiring game, comparable in scale to the Witcher. Since the Witcher was my previous big RPG, I guess I’ll do some comparison/review against it.

The freedom which the game gives you is truly unbounded; couple this with a pretty interesting physics engine, and you get crazy ways to defeat enemies. This is especially true early in the game, where the enemies you encounter are weak enough to be killed with bombs and thrown objects. The combat mechanics are actually much deeper than I initially thought, though there’s really only one fighting style that works. Games like Skyrim or the Witcher let the player choose a genuinely different path (e.g. mage, sneak, one-hander…), which BOTW doesn’t have. Maybe that will change in the future, but it might go against the theme of Zelda.

As the game progressed, I really did feel like a much stronger and better fighter. This connected incredibly well with how the story plays out: Link wakes up 100 years after failing to defeat Ganon and must train again. This compares favorably to other RPGs, which seem indecisive about how strong you are throughout the game (Geralt is supposed to be an accomplished Witcher already, so why the level-up system?).

Finally, the endgame proved to be a nice “endgame fantasy” of sorts. Urbosa’s Fury is simply the strongest ability in the game, with the power to damage and stun enemies for a substantial amount of time. I managed to clear out the Yiga hideout, instead of sneaking through, by using Blizzard/Ice rods and Urbosa’s Fury. Couple this with the DLC’s cooldown reduction, and everything fell immediately.

Speaking of the DLC, it was underwhelming. I did enjoy the additional storyline cutscenes, but compared to the Witcher’s massive HoS/BW expansions, which cost only $5 more than Zelda’s, it really pales in comparison. The way the game makes combat harder is not through more intricate fighting patterns or dodging, but by taking away your powers in the trials. It’s a bit frustrating, to say the least.

I didn’t even really mention the dungeons; I guess they were a thing too. In the grand scheme of things, they were far overshadowed by other aspects of the game. I just have one thing to say: motion control can go piss off. The story element is also weaker, with the characters very underdeveloped. I would really like to know more of the Champions’ backstory than what the game gave us, though the Zelda–Link relationship is quite fleshed out. The world does do a good job of showing us the history of Hyrule, though: the juxtaposition of the Forgotten Temple and the Temple of Time is a really cool example of how ruins can show what happened.

I normally don’t play games a second time, but I may have to dive into BOTW again…

 

 

SPD Matrices and a False Inequality

This one is from my research, and it’s a doozy. Given two vectors x, y such that each element of x is less than the corresponding element of y in absolute value, show that for any SPD matrix A, x^TAx \le y^T A y.

After spending a good amount of time trying to prove this, I realized that it is not true in general (in fact, the result I was supposed to be chasing would have been disproven if the above statement were true). Consider the following counterexample, which comes from a Bernstein basis application:

Let x = [1, 0, -1/3], y = [1, -1, 1/3]. Let the matrix be

A = \begin{bmatrix} 2/7 & 1/7 & 2/35 \\ 1/7 & 6/35 & 9/70 \\ 2/35 & 9/70 & 6/35 \end{bmatrix}.

Then the quadratic forms are x^T A x = 4/15 and y^T A y = 1/7 respectively, so the inequality fails even though |x_i| \le |y_i| for every i.
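A quick numerical check (a numpy sketch of my own) confirms that A is SPD, that |x_i| \le |y_i| entrywise, and that the quadratic form is nevertheless larger at x:

import numpy as np

x = np.array([1.0, 0.0, -1.0 / 3.0])
y = np.array([1.0, -1.0, 1.0 / 3.0])
A = np.array([[2 / 7, 1 / 7, 2 / 35],
              [1 / 7, 6 / 35, 9 / 70],
              [2 / 35, 9 / 70, 6 / 35]])

print(np.all(np.abs(x) <= np.abs(y)))   # True: |x_i| <= |y_i| for every i
print(np.linalg.eigvalsh(A).min() > 0)  # True: A is symmetric positive definite
print(x @ A @ x, y @ A @ y)             # 4/15 ~ 0.267 vs 1/7 ~ 0.143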

A Matrix Inequality

An exercise from Braess’s FEM book: for A, B symmetric positive definite matrices, suppose A \le B (i.e., x^T A x \le x^T B x for all x). We want to show that the inverses satisfy the reversed property B^{-1} \le A^{-1}.

The book actually gave quite a lot of hints for this one. Writing B^{-1} = A^{-1/2} \cdot A^{1/2} B^{-1} and applying the Cauchy–Schwarz inequality,

\begin{aligned}x^TB^{-1}x &= x^T A^{-1/2} A^{1/2} B^{-1} x \\&\le \sqrt{x^T A^{-1} x} \sqrt{x^T B^{-1} A B^{-1} x}.\end{aligned}

Then from the hypothesis, A \le B means x^T(B - A)x \ge 0 for all x; substituting x = B^{-1}y gives y^T(B^{-1} - B^{-1}AB^{-1})y \ge 0, i.e., y^T B^{-1} A B^{-1} y \le y^T B^{-1} y.
So we can plug this into the inequality above to find that x^TB^{-1}x \le \sqrt{x^T A^{-1} x}\sqrt{x^T B^{-1}x}. Dividing both sides by \sqrt{x^T B^{-1} x} (the case x = 0 is trivial) and squaring yields x^T B^{-1} x \le x^T A^{-1} x, which is exactly B^{-1} \le A^{-1}.
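As a sanity check (a small numpy sketch of my own, not from the book), we can generate a random pair with A \le B and verify that A^{-1} - B^{-1} is positive semidefinite:

import numpy as np

rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + np.eye(n)   # a random SPD matrix
N = rng.standard_normal((n, n))
B = A + N @ N.T           # B = A + (positive semidefinite), so A <= B
# B^{-1} <= A^{-1} iff A^{-1} - B^{-1} is positive semidefinite.
diff = np.linalg.inv(A) - np.linalg.inv(B)
print(np.linalg.eigvalsh(diff).min())  # should be >= 0 (up to roundoff)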

Larson 1.11.5

Here’s a cute little problem: let f(x) be a polynomial of degree n with real coefficients such that f(x) \ge 0 for all real x. Show that S(x) = f(x) + f'(x) + \cdots + f^{(n)}(x) \ge 0 for all real x.

Notice that S(x) goes to positive infinity in both limits: since f(x) \ge 0 everywhere, n is even and the leading coefficient is positive, and S has the same leading term as f. So S attains its global minimum at some critical point x_0, i.e., where S'(x) = f'(x) + \cdots + f^{(n)}(x) = 0 (the term f^{(n+1)} vanishes). But then S(x) = f(x) + S'(x), so evaluating at x_0 gives S(x_0) = f(x_0) \ge 0, and hence S(x) \ge 0 everywhere.
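A quick sanity check with a toy example of my own (not from Larson), f(x) = x^2, where S(x) = x^2 + 2x + 2 = (x + 1)^2 + 1:

import numpy as np

# f(x) = x**2 is non-negative everywhere; S = f + f' + f'' = x**2 + 2*x + 2.
xs = np.linspace(-10.0, 10.0, 2001)
S = xs**2 + 2 * xs + 2
print(S.min())  # 1.0, attained at x = -1, where S'(-1) = 0 and S(-1) = f(-1) = 1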

1962 Putnam

A6.  X is a subset of the rationals which is closed under addition and multiplication. 0 ∉ X. For any rational x ≠ 0, just one of x, -x ∈ X. Show that X is the set of all positive rationals.

I’m not sure if the Putnam used to be easier back in the day, like the AMC/AIME, but this A6 was ridiculously simple. The problem pops up as 1.9.5 in Larson’s book with some hints, which I’ll use to outline the solution.

First, note that 1 is in our set. This can be seen several ways, the official solution being the most elegant: exactly one of 1 and -1 is in the set, and if it were -1, then (-1)(-1) = 1 would also be in the set by multiplicative closedness, a contradiction. With 1 in hand, additive closedness puts every positive integer in the set.

Finally, assume that some negative rational -p/q is in the set (p, q positive integers). Adding it to itself q times gives the negative integer -p in the set, while the positive integer p is also in the set; contradiction. So for every nonzero rational, the member of the pair x, -x that lies in X must be the positive one, which means X is exactly the set of positive rationals.

Symmetry

Going on vacation really puts a lot of things into perspective, though I want to focus on how far my mathematics and problem-solving skills have developed since high school. I really think I was most creative at a younger age, when doing all those math contests, but I’ve matured now.

It seems that at Brighton’s age and level, as an 8th grader, I can provide a level of guidance similar to how I feel now with respect to my adviser. It’s really a nice feeling to think that I am indeed getting smarter.

Update

It’s been awhile since I last posted, just wanted to document a little of what happened.

  1. Passed my prelim exam. Overall, it went pretty well in my opinion. It seems the minor actually went much worse than my major. I’m glad that’s over.
  2. Went to the Gene Golub summer school immediately after. It was an extremely fun program where the emphasis seemed to be on making connections rather than actually learning the mathematics. I believe I spent more time playing volleyball than actually doing math.
  3. Traveled alone in Germany: first to Heidelberg, then to Boppard, and finally back to Berlin. It was extremely relaxing, and I met some incredible people whom I hope to keep in touch with in the future.
  4. Back to work now…

The (lack of a) Matrix

I think I finally understand why software packages like PETSc have an option to pass an abstract operator, rather than a matrix, when doing something like conjugate gradient. Why isn’t having a matrix good enough for everyone?

Well, it turns out that while every linear operator can be represented as a matrix, that may not be the best way to represent it. As an example, consider a basis transformation from Bernstein polynomials to Jacobi polynomials (or vice versa). It’s certainly possible to construct a matrix which does the operation, but it’s ugly.

On the other hand, it’s not that bad to write code which exploits the properties of the polynomials and performs the conversion in O(n^2) time. The key is that a Jacobi polynomial is a sum of Bernstein polynomials, and Bernstein polynomials can be degree-raised or degree-lowered at will.

This function will outperform the matrix in several senses. For one, there’s no need to construct the matrix, which already takes O(n^2) operations. Next, working with the explicit matrix costs O(n^2) per application on top of that setup (or O(n^3) if it ever has to be multiplied against another matrix), so a sufficiently optimized structured routine comes out ahead. Finally, it’s really less painful to code, because each line of the function serves a visible purpose.
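The same pattern is what scipy exposes through LinearOperator. Here is a minimal sketch of the matrix-free idea; the tridiagonal stencil below is just a made-up stand-in for a real structured operator such as the basis conversion:

import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

n = 100

def apply_A(v):
    # Apply an SPD operator without ever forming its n-by-n matrix.
    # Placeholder: a simple [-1, 2, -1] tridiagonal stencil; a real routine
    # would instead exploit the structure of the Bernstein/Jacobi conversion.
    out = 2.0 * v
    out[:-1] -= v[1:]
    out[1:] -= v[:-1]
    return out

A = LinearOperator((n, n), matvec=apply_A)
b = np.ones(n)
x, info = cg(A, b)
print(info)  # 0 means CG converged, never having seen an explicit matrix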

Anyways, I’m sold.

(I’ll eventually publish the code in the summer)