Arcane Season 2

I wrote about Arcane season 1 a while ago.

The second season is also fantastic.

A bit rushed and somewhat confusing at times (one must pay attention at all times!), but the meat of the story is well-written with quite a few seriously poignant scenes.

Still highly recommend.

 

Remarkably Bright Creatures

I went on a short road trip to Denver this past week, and finally used the Spotify audiobooks feature. The bright cover caught my eye, and I listened to the whole of Remarkably Bright Creatures over the 12 hours on the road.

The book received a lot of attention from the internet, but, frankly, I was more annoyed by it than I enjoyed it. The characters, while technically fleshed out, all had personality traits that gnawed at me.

The shopkeeper couldn’t stop gossiping. The main character Tova embodied some of the worst of what I think of as “boomer” traits. The biggest culprit was Cameron, arguably the worst man-child I’ve ever encountered in a book, and I found it hard to listen to his excuses in his chapters. To be fair, it could be that the audiobook did a fantastic job of voicing him.

The plot was also laid wide open before the halfway point, after the octopus revealed he could discern genetic relations (which is… silly, but this is fiction after all). The book then became an exercise in dramatic irony, with the main question being what sort of small knots the author would introduce before a happy conclusion.

I was… also frankly disappointed by the message. Tova is a character beset by tragedy. Her husband passed a few years ago, and her son passed long before that from suicide. Instead of exploring how her grief plays into a mentor-mentee relationship, the author put in the twist of making Cameron and Tova related. This seems to be the only thing that satisfies Tova: finding family again. Why couldn’t she live with grief, something all of us must toil through? Should I start donating sperm to have unexpected grandkids in the future?

Maybe I need to stick with “high” literature for now. Compared to other books I’ve read recently, this one just seemed so lacking in substance.

I did like the octopus’ sass though. He was awesome.

Pied-à-terre

small living unit, e.g., apartment or condominium, often located in a large city and not used as an individual’s primary residence

Of course there’s a French word for this.

A depressing set

My little trek through the world of literature of the current century has slowed down slightly over the last month. The opening of a new book always brings with it some emotional investment, and the last three books that I finished demanded even more than usual: Leaving the Atocha Station by Lerner, Never Let Me Go by Ishiguro, and Austerlitz by Sebald. This will be a short and sweet discussion of the last two, and is really a “feelings” discussion rather than a “literature, here are quotes that support me” one.

As a side note, it’s not that Lerner’s book was bad. It felt like an author’s bildungsroman told through the main character’s journey in Spain. Lerner is a poet, so the language was actually lovely, though it sometimes leans on an everything-as-metaphor stream of consciousness that can be difficult to read. It’s just not as impactful as the other two.

Ishiguro’s book is set in a parallel universe where the biological sciences advanced far beyond the capabilities of our current world, to the point where clones are created for the sole purpose of organ harvesting. We follow one of these clones in her later years as she examines her life and the relationships formed while attending a boarding school. Sebald’s is a fictional account of an immensely troubled man reflecting on his personal history as he pieces together how he, a Czech Jew, arrived in Wales from Prague during the summer of 1939 as a five-year-old.

There’s a deep sense of tragedy in both of these stories which tantalized me. Neither author ever explicitly discusses at length the impending or past dooms that haunt the characters; both constantly obfuscate. Ishiguro masks the truth with words like “complete” to denote the death of a clone, reminding me of Carlin’s bit on euphemisms, while Sebald’s narrator somehow always gets distracted, perhaps in a constant bid to procrastinate the discovery of the truth. The two stand in contrast, one facing imminent death and the other a crippling past, but neither fully articulates the magnitude of the tragedy.

It’s almost as if we’re watching these characters swimming in the ocean with cuts and scrapes while silhouettes of sharks lurk beneath. I couldn’t look away even while knowing what fate will befall our hopeful protagonists. In a world where so much detail is explicitly stated for the readers, it’s actually refreshing to have things left unspoken. Paired with the way both authors write, the effect is an ethereal experience.

Both also rely on the power of memory as a tool of introspection, and they grossly remind me of how terrible mine is (and how I should start journaling again). As our protagonists learn, they refract the past through this new prism. Sebald’s character finally understood why he suffered a mental collapse at Marienbad, while Ishiguro’s character guessed, incorrectly, at the purpose of the “gallery”, which was a mystery until the end. This introspection inevitably guided our characters toward the future, past the last pages of the novels.

Neither story is complete by the end. Kathy, the clone, left her love interest before his final “donation” and got her first donation summons, while Austerlitz was still traveling to find the whereabouts of his father. If my reading of Sebald’s themes in the last few pages in Paris is correct, he will never find his father; history, especially the grotesque, is increasingly guarded. And Kathy will not escape her fate of donations, no matter how much humanity she displays; Ishiguro’s tale is not a hero’s conquest, but more of a melancholic struggle.

Why is good literature always so sad? Onto My Brilliant Friend. That should be a happier book.

Grasshopper

The grasshopper lay on the small open-air passageway between the stairs and the front door of my apartment. A streak of ardent green juxtaposed against the gray, brutalist pockmarks of the concrete walkway. And yet, I almost stepped on it. Not on purpose, mind you, but because of the very subtle pull of peripheral vision; a beckoning of sorts when one’s mind is on autopilot.

It looked like it was dead. A grasshopper is not considered an elegant insect, with its many sharp angles and rectilinear tagmata. Nothing like the gentle curves of a butterfly. But with this comes a natural orientation that I could clearly see even while standing. It was lying on its side, throwing off the alignment to the ground attained through millions of years of evolution.

But occasional twitches showed that specks of life remained. Unfortunately, my hands were full carrying trash to the bin, and saving this tiny green mote involved several steps. I would have had to lean my trash bag against the wall, find and gently use a piece of card stock or paper to scoop up the little fellow, and finally take this little specimen down the flights of stairs to deposit it among the shrubs.

Maybe “several steps” is overselling it, but I ultimately did nothing and continued with my chores after returning from the bins. Was it really that hard to do something for a helpless creature stuck in a foreign land? Was the activation energy required so large that I chose inactivity? (To be fair, it was three flights of stairs…)

Or was my laissez-faire attitude the correct choice because it was too weak to survive anyway? The wind was strong that day, and I suspected that it had been blown from the nearby tree onto the balcony. Perhaps the traveler was just catching its breath and would straighten up by itself after several minutes.

Twenty minutes later, when I was throwing away the recycling, it was gone.

 

Piles

Living is the management of piles. Piles of laundry, piles of dishes, piles of books to read before we fade into the dirt…

Thank you, NYTimes, for publishing the list of your top books. It’s been a while since I read a novel due to the pile (ahem) of New Yorker magazines on my coffee table, but it’s truly different to read a full-length novel versus just a ten-page short story.

Here are some notes from the past few books I finished:

  • Exit West by Hamid: frankly, I thought the novel was aimed at a YA audience. I did not like it at all. The premise, while interesting, meant that the plot could be guessed by page forty or so.
  • All the Light We Cannot See by Doerr: apparently a bad Netflix adaptation, but a solid novel. I really enjoyed the time jumps, much like in Cloud Cuckoo Land, with an increasingly detailed world and building anticipation, though the anti-war message is pretty heavy-handed.
  • The Emperor of All Maladies by Mukherjee: marvelous writing balancing hope with despair. It’s certainly a difficult topic to read about, but I couldn’t put my Kindle down for the entire book. I do wish it were… actually more technical… but the level at which it is written is understandable.

Currently working through Never Let Me Go by Ishiguro and have Austerlitz arriving in a few weeks.

Slurping up SLERP

For work recently, we had to deal with functions on the surface of the sphere. While conceptually simple, this actually introduces more complexity to computations. Even something as simple as a distance calculation is now different due to the curvature.
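For example, on the unit sphere the distance between two points is the angle between them (the great-circle distance) rather than the straight-line chord. Here's a minimal numpy sketch of the difference (the function name is just mine):

import numpy as np

def great_circle_distance(p0, p1):
    # Arc length between two unit vectors on the unit sphere
    # (clip guards against floating-point values slightly outside [-1, 1])
    return np.arccos(np.clip(np.dot(p0, p1), -1.0, 1.0))

p0 = np.array([1.0, 0.0, 0.0])
p1 = np.array([0.0, 1.0, 0.0])

print(great_circle_distance(p0, p1))  # pi/2 ~ 1.5708, along the sphere
print(np.linalg.norm(p1 - p0))        # sqrt(2) ~ 1.4142, straight through it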

Most of these concepts are really well covered by Wikipedia and other resources. The only qualm I have is that there is no standardization of notation: the usage of $\theta$ and $\varphi$ differs from source to source, and this, frankly, cost me a lot of grief.

However, the concept of shortest-path interpolation between two points on the sphere (so-called slerp) was, I think, really not well explained. There is a Scipy implementation and a short Wikipedia article, but neither really discusses the proof of why the formula

\begin{align}
S(p_0, p_1, t) = \frac{\sin((1 - t)\theta)}{\sin(\theta)} p_0 + \frac{\sin(t\theta)}{\sin(\theta)} p_1
\end{align}

where $\theta = \arccos(p_0 \cdot p_1)$, actually stays on the sphere. I was using this for plotting purposes, so I didn’t care about the “interpolation” aspect, but see this post for a derivation from that point of view.
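As an aside, a direct numpy translation of the formula is only a few lines. This is just a minimal sketch; it ignores the degenerate cases $\theta \approx 0$ and $\theta \approx \pi$ where $\sin(\theta) \approx 0$:

import numpy as np

def slerp(p0, p1, t):
    # Spherical linear interpolation between unit vectors p0 and p1
    theta = np.arccos(np.clip(np.dot(p0, p1), -1.0, 1.0))
    return (np.sin((1 - t) * theta) * p0 + np.sin(t * theta) * p1) / np.sin(theta)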

Let’s quickly give a detailed proof of why this formula works: for any $0 \le t \le 1$, we get a point on the sphere that lies on the great circle between the two points $p_0, p_1$. On a sphere of radius 1, $\theta$ is in fact the great-circle distance between $p_0$ and $p_1$.

To this end, we have to show that the point defined by slerp has norm 1 and also lies on the plane defined by the origin, $p_0$, and $p_1$ (the definition of the great circle). This second point is easy: we simply have to show that $S(p_0, p_1, t) \cdot (p_0 \times p_1) = 0$ (i.e., the slerp points are orthogonal to the vector which defines the plane). This holds because $S$ is a linear combination of $p_0$ and $p_1$, and both $a \cdot (a \times b) = 0$ and $b \cdot (a \times b) = 0$.

Thus the slerp points lie on the plane defined by the great circle; it remains to show that the norm is actually 1. This can be done with just the sine subtraction formula and the usual Pythagorean identity. First note
\begin{align*}
\left\Vert S(p_0, p_1, t) \right\Vert^2 &= \frac{\sin^2((1-t)\theta) + \sin^2(t \theta) + 2 \sin((1-t)\theta) \sin(t\theta) \cos(\theta) } {\sin^2(\theta)}
\end{align*}
by the fact that $p_0$ and $p_1$ have norm 1 and $p_0 \cdot p_1 = \cos(\theta)$.
Expanding the first term, we have
\begin{align*}
\sin^2((1-t)\theta) = \sin^2(\theta) \cos^2(t\theta) + \cos^2(\theta) \sin^2(t\theta) - 2 \sin(\theta) \cos(\theta) \sin(t\theta) \cos(t\theta).
\end{align*}
and note that the cross term can be expanded using the same sine difference formula
\begin{align*}
2 \sin((1-t)\theta) \sin(t\theta) \cos(\theta) &= 2 \sin(\theta)\cos(\theta)\sin(t\theta)\cos(t\theta) - 2 \cos^2(\theta) \sin^2(t\theta).
\end{align*}
Thus, cancelling the common terms, we find the numerator is equal to
\begin{align*}
&\sin^2(\theta) \cos^2(t\theta) + \cos^2(\theta) \sin^2(t\theta) + \sin^2(t\theta) - 2 \cos^2(\theta) \sin^2(t\theta) \\
&\quad= \sin^2(\theta) \cos^2(t\theta) + \left(\cos^2(\theta) + 1 - 2 \cos^2(\theta)\right) \sin^2(t\theta) \\
&\quad= \sin^2(\theta) \cos^2(t\theta) + \sin^2(\theta) \sin^2(t\theta) = \sin^2(\theta),
\end{align*}
which equals the denominator.
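As a quick numerical sanity check of both claims (unit norm and coplanarity), here is a small sketch with random unit vectors:

import numpy as np

rng = np.random.default_rng(0)
p0, p1 = rng.normal(size=3), rng.normal(size=3)
p0, p1 = p0 / np.linalg.norm(p0), p1 / np.linalg.norm(p1)

theta = np.arccos(np.clip(np.dot(p0, p1), -1.0, 1.0))
normal = np.cross(p0, p1)  # normal vector of the great-circle plane

for t in np.linspace(0, 1, 5):
    s = (np.sin((1 - t) * theta) * p0 + np.sin(t * theta) * p1) / np.sin(theta)
    # Both should print ~1.0 and ~0.0 up to floating-point error
    print(np.linalg.norm(s), np.dot(s, normal))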

My (Not-So-Successful) Quest to Conquer the NYT Connections Game with Word2Vec

The New York Times’ Connections game: a fairly simple puzzle that has been rising in popularity. The objective? Find four groups of four within a sample of sixteen total words such that each group has an overarching theme.

I thought this would be fairly easy to solve with some simple usage of word embeddings and K-means clustering. After all, if it can figure out king - man + woman = queen, then surely it can figure out that these are all sandwich ingredients. There are enough pretrained models out there that it was easy to install one in under a minute, and I just used a simple K-means.

However, I quickly ran into problems. The biggest is that K-means doesn’t always give four groups of four. Seeing as this was the case, I switched to a constrained K-means algorithm. Another thing I noticed is that the word embedding probably doesn’t account for the fact that repetition might be used (e.g. ‘tom tom’ rather than ‘tom’).

It’s curious to wonder what a better approach would be, as spending some two hours on this little question proved not very fruitful, even for some relatively simple puzzles. Maybe a contextual embedding is needed, rather than just a static GloVe model.

I also thought a more curated, greedy algorithm might work rather than K-means: take the two most similar words and assume they must be a group, average the two word vectors, then find the next closest word from the now-reduced list, and repeat. I gave this a whack (see the sketch below), but it also didn’t turn out too well…
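A rough sketch of that greedy idea (assuming the same gensim GloVe model as in the code further down; the helper name greedy_groups is just mine, and it assumes the word count is a multiple of the group size):

import numpy as np

def greedy_groups(words, model, group_size=4):
    # Repeatedly seed a group with the closest remaining pair, then grow it greedily
    remaining = list(words)
    groups = []
    while remaining:
        vecs = {w: model[w] / np.linalg.norm(model[w]) for w in remaining}
        # Seed with the most similar pair among the remaining words
        pairs = [(w1, w2) for i, w1 in enumerate(remaining) for w2 in remaining[i + 1:]]
        w1, w2 = max(pairs, key=lambda p: np.dot(vecs[p[0]], vecs[p[1]]))
        group = [w1, w2]
        centroid = (vecs[w1] + vecs[w2]) / 2
        # Grow the group one word at a time, keeping a running average vector
        while len(group) < group_size:
            candidates = [w for w in remaining if w not in group]
            best = max(candidates, key=lambda w: np.dot(vecs[w], centroid))
            group.append(best)
            centroid = np.mean([vecs[w] for w in group], axis=0)
        groups.append(group)
        remaining = [w for w in remaining if w not in group]
    return groups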

… maybe this is a more difficult puzzle than I originally thought.

Nevertheless, below is some sample code:

 

import gensim.downloader
from sklearn.metrics.pairwise import cosine_similarity
from k_means_constrained import KMeansConstrained

words = [
    'plastic', 'foil', 'cellophane', 'lumber',
    'organism', 'harpoon', 'trudge', 'limber',
    'stomp', 'elastic', 'glove', 'bassinet',
    'mask', 'plod', 'jacket', 'supple'
]

# Load model
model = gensim.downloader.load('glove-wiki-gigaword-300')

# Generate similarity matrix
word_vectors = [
    model[word] for word in words # We assume all words exist in corpus
]
sim_matrix = cosine_similarity(word_vectors)

# Cluster the rows of the similarity matrix (each word's similarity profile)
# into exactly four groups of four, matching the puzzle's structure
clf = KMeansConstrained(n_clusters=4, size_min=4, size_max=4, random_state=0)
clf.fit_predict(sim_matrix)

# Print the words sorted by their assigned cluster, then the cluster labels
print([x for _, x in sorted(zip(clf.labels_, words))])
print(sorted(clf.labels_))