Cosine Similarity & Liking
For graduates of the Industry Graduate Diploma of Data‑Driven CX/UX Management and Design 😊—wow, that's lengthy.
The picture: how angle becomes affection
Imagine varying the angle between your preference vector and the frontier you're learning, and watching how cosine similarity and liking move together.
As the angle shrinks (you and the subject align), cosine similarity rises from −1 to +1, and liking rises with it. Familiarity does the rotation for you.
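To make that concrete, here is a tiny standalone sketch in plain Python (the angles are chosen purely for illustration): as the angle falls from 90° to 0°, cosine similarity climbs from 0 toward 1.

```python
import math

# For two unit vectors separated by angle theta, cosine similarity
# is exactly cos(theta) -- so "angle shrinks" means "similarity rises".
for theta_deg in [90, 60, 30, 10, 0]:
    theta = math.radians(theta_deg)
    u = (1.0, 0.0)
    v = (math.cos(theta), math.sin(theta))
    cos_sim = u[0] * v[0] + u[1] * v[1]  # both vectors have unit length
    print(f"angle {theta_deg:>2} deg -> cosine similarity {cos_sim:.2f}")
    # prints 0.00, 0.50, 0.87, 0.98, 1.00
```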
The lesson—why "I don't like it" usually means "I'm not used to it yet"
We often think we don't like something—math, code, a new tool, a strange idea—when really, we just haven't done it enough times for it to feel easy. In the picture, that's a wide angle: your preference vector points one way, the new subject points another. Cosine similarity is low, so liking is low. It feels like a verdict. It's actually just a geometry of unfamiliarity.
Here's what happens when you keep showing up—week after week, module after module—even when it's uncomfortable: every solved problem, every clean function, every "ohhh, that's what that meant" nudges your vector closer to the subject's. The angle shrinks. Cosine similarity rises. Liking follows. That isn't a personality change—it's an alignment.
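One way to picture that "keep showing up" loop is a toy simulation (the 20% step rate is an invented parameter, not a claim about real learning): each rep blends your vector a little toward the subject's, and similarity only goes up.

```python
import math

def cos_sim(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

you = [1.0, 0.0]      # where you start
subject = [0.0, 1.0]  # the new, uncomfortable frontier
rate = 0.2            # hypothetical: how far each rep moves you

for week in range(1, 11):
    # one practice rep: step a fraction of the way toward the subject
    you = [(1 - rate) * a + rate * b for a, b in zip(you, subject)]
    print(f"week {week:>2}: similarity {cos_sim(you, subject):.2f}")
```

Run it and the similarity column is strictly increasing, closing in on 1.0: no single week changes much, but the direction of every step is the same.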
That's the quiet superpower this program tried to give you: the willingness to stay in the high‑angle zone long enough for cosine similarity to do its work. Now you have a habit your future self will thank you for—the ability to learn anything, even the things that feel cold at first, by trusting the curve.
One more thing, and maybe the most important
This is what cosine similarity has been quietly teaching us the whole time. Look at the formula: it normalizes the length of both vectors away. It doesn't care how long your vector is—how fast you went, how many hours you put in, how loud you were about it. It only asks one question: which way are you pointing?
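Here is that formula, with the cancellation spelled out: scale your vector by any c > 0 and the c appears in numerator and denominator alike, so it divides out; only direction survives.

```latex
\[
\operatorname{cos\_sim}(\mathbf{u}, \mathbf{v})
  = \frac{\mathbf{u}\cdot\mathbf{v}}{\lVert\mathbf{u}\rVert\,\lVert\mathbf{v}\rVert}
  = \cos\theta,
\qquad
\operatorname{cos\_sim}(c\,\mathbf{u}, \mathbf{v})
  = \frac{c\,(\mathbf{u}\cdot\mathbf{v})}{c\,\lVert\mathbf{u}\rVert\,\lVert\mathbf{v}\rVert}
  = \operatorname{cos\_sim}(\mathbf{u}, \mathbf{v})
  \quad (c > 0).
\]
```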
So when the work feels slow, when a classmate seems to be racing ahead, when a module takes you three weeks instead of three days—none of that changes your similarity to the thing you love. What changes it is direction. Keep pointing toward Math, Programming, Artificial Intelligence, Machine Learning, Causal Inference, Quantitative Modeling, Bayesian Inference, or whatever frontier calls you next—and the humane, customer‑centered problems you came here to solve—and the angle will keep closing, at whatever pace is honest for you.
A short walk in the right direction beats a sprint in the wrong one. Trust your bearing more than your speed.
The code—same idea, in two languages
Run it, fork it, stick it in your notebook. The math fits in eight lines.
```python
import jax
import jax.numpy as jnp

def cosine_similarity(u, v):
    return jnp.dot(u, v) / (
        jnp.linalg.norm(u) * jnp.linalg.norm(v)
    )

def liking(u, v, slope=1.0, baseline=0.0):
    # the curve from the lesson
    return slope * cosine_similarity(u, v) + baseline

liking_jit = jax.jit(liking)

# vectorize over a whole class of learners
batch_liking = jax.vmap(liking, in_axes=(0, None))

you = jnp.array([1.0, 0.0])
topic = jnp.array([0.92, 0.39])  # after week 12
print(liking_jit(you, topic))  # ~0.92
```
```javascript
function cosineSimilarity(u, v) {
  const dot = u.reduce((s, ui, i) => s + ui * v[i], 0);
  const nU = Math.sqrt(u.reduce((s, x) => s + x * x, 0));
  const nV = Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (nU * nV);
}

function liking(u, v, slope = 1, baseline = 0) {
  // the curve from the lesson
  return slope * cosineSimilarity(u, v) + baseline;
}

const you = [1.0, 0.0];
const topic = [0.92, 0.39]; // after week 12
console.log(liking(you, topic).toFixed(2)); // 0.92
```
The Python version uses jax.jit and jax.vmap—the same primitives behind modern AI/ML pipelines—to compile and batch the same little formula across thousands of learners.
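To see the batching in action, here is a hypothetical use of batch_liking (the learner vectors are invented for illustration): jax.vmap maps the scalar function over a stack of learner vectors while holding the topic fixed.

```python
import jax
import jax.numpy as jnp

def cosine_similarity(u, v):
    return jnp.dot(u, v) / (jnp.linalg.norm(u) * jnp.linalg.norm(v))

def liking(u, v, slope=1.0, baseline=0.0):
    return slope * cosine_similarity(u, v) + baseline

# in_axes=(0, None): map over rows of the first argument,
# broadcast the second argument unchanged
batch_liking = jax.vmap(liking, in_axes=(0, None))

learners = jnp.array([
    [1.0, 0.0],  # week 1: pointing away
    [0.7, 0.7],  # midway: rotating
    [0.4, 0.9],  # week 12: nearly aligned
])
topic = jnp.array([0.39, 0.92])

print(batch_liking(learners, topic))  # one liking score per learner
```

The same call scales to thousands of rows without a Python loop, which is the leverage the paragraph above is pointing at.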
Tiny math, big leverage. That's the pattern of this whole field.
With heart, and with the curve on your side
Dear graduate—you have just finished the Industry Graduate Diploma of Data‑Driven CX/UX Management and Design. If there is one thing I want you to carry forward, it is this: the discomfort you felt at the start of every module was never a sign that the subject wasn't for you. It was the angle. And angles close, slowly, with practice.
You now like Math, Computational Programming, Artificial Intelligence, Machine Learning, Causal Inference, Quantitative Modeling, Bayesian Inference, or whatever frontier you choose next—and the way these tools crack open tech‑business problems with heart. Not because you were chosen, but because you chose, again and again, to keep rotating. Trust that move for the rest of your life. Anything you want to learn, you can. The curve is on your side.
Go build with kindness, with rigor, and with the heart you brought into this room.