
Mathematical Aside: Golden Mean Shift and Pascal’s Triangle

I noticed something curious while working on my summer project last year. Consider the set c_n of binary strings of length n (each symbol is a 0 or a 1) in which no two 1's are adjacent. We can construct a table of these strings:

 

Length 1: 0; 1
Length 2: 00; 01; 10
Length 3: 000; 001; 010; 100; 101
Length 4: 0000; 0001; 0010; 0100; 0101; 1000; 1001; 1010

And then count the number of elements of length n with m ones.

m (ones) \ n (length)   1   2   3   4   5   6
0                       1   1   1   1   1   1
1                       1   2   3   4   5   6
2                       0   0   1   3   6   10
3                       0   0   0   0   1   4
4                       0   0   0   0   0   0

 

Do you see Pascal's Triangle hiding inside? Read the table along its diagonals, starting from the top left and moving down and to the right: the first such diagonal is 1, 2, 1, the next is 1, 3, 3, 1, and so on.

The explanation is simple enough. The subset of c_n consisting of strings with exactly m ones (let's call it c_{n,m}) can be built either by taking a string in c_{n-1,m} and appending a zero, or by taking a string in c_{n-2,m-1} and appending a zero-one. This gives the recurrence |c_{n,m}| = |c_{n-1,m}| + |c_{n-2,m-1}|. If we let d_{n,m} denote the n^{th} entry in the m^{th} diagonal of Pascal's Triangle, we realize that d_{n,m} follows the same recurrence. This recurrence is well documented in the literature for computing |c_n|; here we are just applying the same idea to the subsets of c_n with exactly m ones, in order to connect it with Pascal's Triangle.
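
If you would like to check the pattern yourself, here is a small Python sketch (my own illustration, not the original project code) that builds the table from the recurrence and compares it with the binomial coefficients C(n-m+1, m), which is exactly where the diagonals of Pascal's Triangle come from:

```python
from math import comb

def count_table(max_n, max_m):
    """c[n][m] = number of binary strings of length n with m ones and
    no two ones adjacent, via c[n][m] = c[n-1][m] + c[n-2][m-1]."""
    c = [[0] * (max_m + 1) for _ in range(max_n + 1)]
    c[0][0] = 1                  # the empty string
    c[1][0], c[1][1] = 1, 1      # "0" and "1"
    for n in range(2, max_n + 1):
        for m in range(max_m + 1):
            c[n][m] = c[n - 1][m] + (c[n - 2][m - 1] if m >= 1 else 0)
    return c

c = count_table(6, 4)
for n in range(1, 7):
    for m in range(5):
        # the closed form hiding in Pascal's Triangle
        expected = comb(n - m + 1, m) if n - m + 1 >= 0 else 0
        assert c[n][m] == expected
print([c[n] for n in range(1, 7)])
```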

The adventure continues in Part 2!

Course Review: MATH 418

Probability

“I heard you had a lot of difficulty with the last homework. So you will be relieved to note that next homework you will have another opportunity to practice similar problems.”

Text:  A First Look at Rigorous Probability Theory, 2nd ed by J.S. Rosenthal

Prof: Dr Gordon Slade

Prof Slade is very clear and keeps things simple. He also has no problem slowing down to answer questions from students if they are not following along. He has a dry sense of humor that keeps the class interesting, and he can even be funny when he is talking in earnest.

Difficulty

If you have taken measure theory, several sections of the course can feel like review. This was a good thing for me, as I found MATH 420 a tad fast. There are a handful of new techniques that you learn in the homework and in class, but there are not that many new concepts if you have some background in measure theory and probability. MATH 421 material also comes up, in the form of weak convergence.


Key Concepts

Probability Triples

Random Variables

Distributions

Expectation

Borel-Cantelli

Modes of Convergence

Law of Large Numbers

Central Limit Theorem

Characteristic Functions

Hard Concepts

Tail Events: Kind of funny to think about. They also involve the definitions of the limit supremum and limit infimum of a sequence of events, which can be difficult to convert into statements about limits of random variables.
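
For reference (these are the standard definitions, not anything specific to the course), for a sequence of events A_1, A_2, ... one sets limsup_n A_n = ∩_{N≥1} ∪_{n≥N} A_n (the event that A_n occurs infinitely often) and liminf_n A_n = ∪_{N≥1} ∩_{n≥N} A_n (the event that A_n occurs for all but finitely many n). The bridge to random variables is through indicators: 1_{limsup_n A_n} = limsup_n 1_{A_n} and 1_{liminf_n A_n} = liminf_n 1_{A_n}.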

Weak Convergence: There are a lot of equivalent statements, and if you pick the wrong one it can be a mission to prove that convergence occurs.

Conclusion

A good review of measure theory, and it gives you a mathematical foundation for elementary probability.

Course Review: MATH 421

Real Analysis II

“If everyone is getting closer together, and you are going to Cleveland, then everyone’s going to Cleveland!”

Text: Real Analysis: Modern Techniques and Their Applications by Gerald B. Folland

Prof: Dr Brian Marcus

Prof Marcus is one of the kindest professors I know. He also explains each concept from multiple perspectives and breaks the concepts down to simple intuition. His notes are super useful and he has a great sense of humor.

Difficulty

The class moves somewhat slower than MATH 420, so we went a lot deeper into the details of certain proofs and where they come from. This meant the classes were easier to follow (and, in my view, more enjoyable!). Further, the homework was largely straightforward. The one challenge in this course was that the material towards the latter half of the term was slightly more complex and was not assessed on homework. We also may not have covered all of the material generally covered in a similar course.


Key Concepts

Banach spaces

Hilbert spaces

Linear functionals

Hahn-Banach Theorem

Open mapping theorem

Uniform boundedness

Riesz representation theorems

Hard Concepts

Weak Convergence: Can be hard to think about neighborhoods in this topology.

Hilbert Spaces: There are a lot of nifty tricks in Hilbert spaces that are not immediately obvious, and do not work in other spaces.

Conclusion

A really fun course, and I learnt a lot. A nice conclusion to my studies of analysis, in some sense.

Course Review: MATH 420

Real Analysis I

“F just comes along for the ride…”

Text: Real Analysis: Modern Techniques and Their Applications by Gerald B. Folland

Prof: Dr Stephen Gustafson

Prof Gustafson is an engaging professor. He is able to give a quick, intuitive overview of quite complex topics in class, leaving us to work out the technical details on our own. He is also very helpful during office hours and posted his notes online.

Difficulty

The classes move very fast, so one might find it challenging to keep up with the readings. Also, though the classes are easy to follow, the homework is increasingly challenging and time-consuming. The final exam was quite hard; practice and recalling theorems and definitions are all important. This course covers a lot of material, so watch out!


Key Concepts

Sigma-Algebras

Measures

Differentiation of Measures

Lebesgue integration

Convergence of Functions

L^p spaces

Hard Concepts

Lebesgue-Radon-Nikodym Theorem: A quite powerful theorem; you need to notice where to apply it. It can also be hard to find the decomposition at times.

Fatou's Lemma: Tricky to work with because of the limit infimum; also quite powerful and pops up in unexpected places.
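
For reference, the standard statement: if f_1, f_2, ... are non-negative measurable functions, then ∫ (liminf_n f_n) dμ ≤ liminf_n ∫ f_n dμ. The inequality can be strict; for instance, for f_n = 1_{[n, n+1]} on the real line the left-hand side is 0 while the right-hand side is 1.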

Conclusion

One of the harder courses I have ever taken. I started learning some really powerful theorems, about convergence for example.

Course Review: MATH 322

Introduction to Group Theory

“We say groups have a rich structure because it keeps us mathematicians in business”

Text: Basic Algebra I: Second Edition by Nathan Jacobson

Prof:  Dr Vinayak Vatsal

Nike is a humorous professor who cares about his students. The homework he set was also useful in getting a mastery of the material in the course. Unfortunately, the textbook was made up of large, listless walls of text that prove half a dozen theses arbitrarily before deciding to declare that a theorem worthy of a name has been proven. Thus it cannot be readily used as a reference book and is difficult to read. The classes were significantly better, but from time to time also suffered from a lack of direction and interest. Group theory, on the other hand, is a fascinating field and the proofs often use a lot of creativity.


Difficulty

 

I found the material in this course difficult, but since it was an introductory class, I felt I had enough time to master the relevant concepts. Unfortunately, the relatively few iClicker questions were weighted at 10% of the grade, and the exams were not difficult but punished small mistakes heavily. Also, the final exam used a peculiar notation, seemed almost entirely about symmetric groups, and had no questions about abstract group theory. As a result, the average was quite low in this course.

Key Concepts

Groups

Subgroups

Group actions

Orbit-Stabilizer

Class equation

Finite Abelian groups

Isomorphism Theorems

Homomorphism

Free group and generators

Hard Concepts

Group actions: Useful to get some practice with a couple of standard actions before one can start spotting the action that occurs in problems

Proving isomorphisms/finding a bijection: Often one can use the obvious or 'canonical' map; other times you need to do something clever

Counting elements of a certain order in abelian groups: Can make careless mistakes, double check your answer!

Conclusion

An interesting topic with fun homework problems; I wish the textbook and exams had been structured differently.

Course Review: STAT 406

Methods for Statistical Learning

“To fix ideas…”

Text: An Introduction to Statistical Learning by James, G., Witten, D., Hastie, T. and Tibshirani, R.

Prof:  Dr. Matías Salibián-Barrera

Matías is a very earnest professor who often works through helpful examples in class. He also provides supplementary class notes that are useful to review in your own time. His slides can be a bit confusing from time to time but are generally understandable.


Difficulty

This was my first statistics course. I found the material quite challenging at first because I was unfamiliar with a lot of the vocabulary and notation used. Also, we used quite a few novel results from advanced linear algebra and probability. However, I soon got used to the notation and realized that to succeed in the course one does not need to fully understand the mathematics operating in the background (understanding a little is enough). The first midterm was short, free response, and had a low average. The subsequent midterms and the final were largely multiple choice and were easier; the only difficulty was often the wording of some of the options. A lot of the material was also review from CPSC 340. One aspect of the course that was a bit annoying was the WeBWorK: the problem sets had a time limit and would often start by accident, or give you an incorrect mark on your first attempt. The labs were pretty reasonable, though you need to be quite quick and compare answers with your friends.

Key Concepts

Model Selection, esp Cross-Validation

Elastic Net Methods

Tree-based methods

Splines

Bagging and Boosting

Clustering Techniques

PCA

Hard Concepts

AIC Derivation: Can be pretty confusing as each formula makes different assumptions

Cross-Validation as Expectation: Make sure one understands the notation for expectation used

Linear Predictor: Some models may be linear predictors though they do not appear all that linear

Conclusion

A handy review of CPSC 340, except in R. I would have liked to delve more into the statistics of it and how one chooses an appropriate model for a given data set.

Course Review: CPSC 421

Introduction to Theory of Computing

“Let me start off by telling you a fact without enough background to prove the fact”

Text: Introduction to the Theory of Computation by Michael Sipser

Prof:  Dr. Nick Harvey

Dr Nick Harvey is a very clear professor. He makes effective use of slides and slowly explains complex topics, making his lectures both interesting and easy to follow. For example, his explanation of the diagonalization argument was much clearer than any of the presentations on the same topic I have received in math classes. He also has a peculiar sense of humour and is a big soccer fan, so the class is interspersed with entertaining distractions.


Difficulty

The material starts off fairly simple, with a bit of review from CPSC 121. It picks up a bit when you start performing reductions between problems from various complexity classes, though there is a bit of recap from CPSC 320. The first midterm was straightforward; however, given its short length, it was possible to do poorly if one tripped on only one or two questions. The final was a bit harder; however, the midterm was good preparation. The homework included some questions that could be genuinely hard to understand. However, he gave generous hints as the deadline approached, and they were the type of questions where, once you understood what was being asked, the proof was often one of the first few obvious approaches you would think of. Given that he dropped the lowest couple of homework scores, it is no surprise that the overall average in the course was 80%.

Key Concepts

Finite Automata and Regular Languages

Turing Machines

Non-determinism

Time complexity classes

Mapping Reductions

Probabilistic Turing Machines

Communication Complexity and Fooling sets

Hard Concepts

Context Free Languages: Can be hard sometimes to come up with CFG or PDA for a given context-free language

Probabilistic Turing Machines: Give rise to quite convoluted complexity classes.

NP-Hard reductions: Might require a bit of imagination to come up with the reductions, at times.

Fooling Sets and Covers: One can sometimes get fooled by these; make sure you recall the definition of rectangles and what exactly needs to hold for a fooling set.

 

Conclusion

A fun course. Dr Nick Harvey can make the course appear deceptively simple at times, but you actually learn a lot of quite complex material.

Course Review: CPSC 303

Numerical Approximation and Discretization

“One of the questions I got at the end of last week was ‘How do you deal with the loneliness?'”

Text: A First Course on Numerical Methods, by Uri M. Ascher and Chen Greif

Prof:  Dr. Jessica Bosch

Dr. Jessica Bosch is literally the kindest professor I have ever met. She is super-receptive to feedback and cares (possibly too much) about her students' trials and tribulations. She once spoke about her current research on mathematical models of tumour growth, which was pretty interesting. Her slides (along with the helpful summary slides) are usually very clear and well structured. She also runs in-class chapter group quizzes, which are very useful in preparing students for the midterm and final.


Difficulty

The content of the course is in some ways very simple: it is a bunch of algorithms for solving well-known math problems. However, I think this simplicity is deceiving, and the course hints at a fundamental aspect of computing: how problems get transformed as they are represented in the computer, and which properties are preserved. From this perspective, the course could be very complicated. Fortunately, we do not do very complex mathematics in the course, and it is largely quite manageable. The midterm, given the practice material, was pretty doable. The final was a bit trickier but was combined with very liberal scaling. The assignments were not bad, though Matlab can get annoying. There was also a bonus assignment which, along with the liberal final scaling, contributed to the A- average in the course.

Key Concepts

Floating point number systems

Polynomial Interpolation

Numerical Differentiation

Numerical Integration

Numerical Solutions to ODEs

Global error (Convergence) vs local error (Consistency)

Taylor Series

Hard Concepts

Butcher Tableau: Used to represent an instance of a general Runge-Kutta method. It is easy to get mixed up about the meaning of the different coefficients; see the sketch below.
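
As a rough illustration (my own sketch, in Python rather than the course's Matlab, using the classical RK4 coefficients as the example tableau), a single explicit Runge-Kutta step driven by a Butcher tableau (A, b, c) looks like this:

```python
import numpy as np

def explicit_rk_step(f, t, y, h, A, b, c):
    """One step of an explicit Runge-Kutta method given its Butcher tableau:
    A couples the stages, b holds the quadrature weights, c the stage times."""
    s = len(b)
    k = [np.zeros_like(y) for _ in range(s)]
    for i in range(s):
        # A is strictly lower triangular for an explicit method,
        # so stage i only uses the already-computed k[0..i-1].
        yi = y + h * sum(A[i][j] * k[j] for j in range(i))
        k[i] = f(t + c[i] * h, yi)
    return y + h * sum(b[i] * k[i] for i in range(s))

# Classical RK4 tableau
A = [[0, 0, 0, 0],
     [1/2, 0, 0, 0],
     [0, 1/2, 0, 0],
     [0, 0, 1, 0]]
b = [1/6, 1/3, 1/3, 1/6]
c = [0, 1/2, 1/2, 1]

# Example: y' = -y, y(0) = 1, one step of size h = 0.1 (exact value exp(-0.1))
print(explicit_rk_step(lambda t, y: -y, 0.0, np.array([1.0]), 0.1, A, b, c))
```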

Barycentric weights: Used for efficient computation of the Lagrange form of the interpolant. Not something that would occur to you naturally (see the sketch below).
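
Roughly, the idea (again my own Python sketch, not the course's code): precompute the weights w_j = 1 / prod_{k != j} (x_j - x_k) once, and then every evaluation of the interpolant is cheap via the barycentric formula:

```python
import numpy as np

def barycentric_weights(x):
    """w_j = 1 / prod_{k != j} (x_j - x_k) for distinct nodes x."""
    x = np.asarray(x, dtype=float)
    return np.array([1.0 / np.prod(x[j] - np.delete(x, j)) for j in range(len(x))])

def barycentric_eval(x, y, w, t):
    """Evaluate the interpolating polynomial at t:
    p(t) = sum_j (w_j y_j / (t - x_j)) / sum_j (w_j / (t - x_j))."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    if np.any(t == x):               # t coincides with a node: return the data value
        return y[np.argmax(t == x)]
    r = w / (t - x)
    return np.dot(r, y) / np.sum(r)

# Interpolating y = x^2 at three nodes recovers the quadratic exactly.
xs, ys = [0.0, 1.0, 2.0], [0.0, 1.0, 4.0]
w = barycentric_weights(xs)
print(barycentric_eval(xs, ys, w, 1.5))   # 2.25
```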

Order of a method: Each chapter seems to use a slightly different definition of the order and precision of a method. It is good to keep track of what it means in a particular context.

Conclusion

Interesting course. A combination of annoying Matlab along with neat mathematics and clever algorithms.

Course Review: CPSC 317

Internet Networking

“Most people’s code must have resembled a state machine… Only two of you [had code resembling a finite state machine]?! There’s gonna be a lot of zeros on the next assignment!”

Text: Computer Networking: A Top-Down Approach, 7/E by James Kurose and Keith Ross

Prof:  Dr. Alan Wagner

The professor is an engaging speaker and likes to intersperse the class with somewhat philosophical activities. Though he cares about his students' learning, he is somewhat disorganized, and several lectures were very unclear and confusing even if you had done the reading. The class is not boring, however, and Dr. Wagner has a lot of useful analogies, aphorisms and anecdotes that you can learn from, even if you will not be assessed on them.


Difficulty

Unfortunately, there were several issues with clarity and many more administrative issues that prevented the course from being the dynamic course Dr. Wagner intended. This amplified the difficulty of the conceptual material being taught, which is actually quite manageable. A lot of terms were not clearly defined, the slides were not structured or written well (especially if they are to be used as a study resource), and the in-class drawings on the slides often added to the general confusion rather than alleviating it. The textbook is sporadically useful: we often deviate from it, and at other times, since we jump around, we lack the prerequisite knowledge to understand the assigned readings. The assignments are fun, but the evaluation criteria were not clearly defined and kept getting changed in different ways by various contradictory Piazza posts from instructors and TAs.

Key Concepts

Networking Layer

Link Layer

Transport Layer

Various Protocols (IP, TCP, ARP etc)

DNS Lookup

Performance metrics

Communicating Finite State Machines

Autonomous Systems (ASes)

Hard Concepts

DNS assignment: Implement a DNS server. The recursive calls along with the caching can get confusing, best not to do it at the last moment.

Layers: Separating out the flows in different layers can be confusing, best to read and understand an example protocol at each layer to get an idea of how it fits together.

Conclusion

Important topic, confusing delivery, still worth taking.

Course Review: CPSC 340

Machine Learning and Data Mining

“I don’t care if these parameters don’t mean anything to you – they don’t mean anything to me either.”

Text: (none)

Prof:  Dr. Michael Gelbart

A very affable professor. He is keen that his lectures (and even Piazza) not be engulfed in sub-optimal questions, though, so the lectures are very well scheduled and structured. His background is in biophysics, which is pretty cool, as he will occasionally mention an application or connection to the physics that governs amoebae. He is very concerned about not contributing negatively to students' mental health, which often sparks heated Piazza discussions. Every now and then he will also drop some absolute gems such as:

“It’s easy to see that this is a good idea, but it takes a while longer to realize that it’s not a horrible idea”

“There’s a lot more to your loss function than just your GPA”

“Most of us already find it hard to find our way in this three-dimensional world”

“The mindset you should have right now is ‘I’m awake'”

“I’m a bit afraid to try this, but… Whatever”

“I was so much happier living a happy life, predicting the mean every time”

[Credits to Philip Haupt for recording these XD]


Difficulty

This course covers a LOT of material. The assignments are also very time-consuming. The actual material varies from trivial facts about parabolas to quite complex gradient calculations involving matrices. That said, the midterms and finals do not go very deep into the material. Ultimately the average for the course was an A and I don’t think there was any scaling.

Key Concepts

Classification

Regression

Clustering

Dimensionality Reduction

Loss Functions

Regularization

Fundamental Trade-off

Neural Networks

Hard Concepts

Convolutional Neural Networks: Not really sure how they worked. Thankfully, no question was asked about them on the final exam.

Kernel Regression and the Kernel Trick: It is not immediately obvious how these things work and how they are interrelated. It is possibly useful to think of the underlying mathematics of inner products (see the sketch below).
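
To make the inner-product point concrete, here is a tiny sketch (my own toy example, not from the course) showing that the degree-2 polynomial kernel on R^2 is just an ordinary dot product taken after an explicit feature map, which is the identity the kernel trick exploits:

```python
import numpy as np

def phi(v):
    """Explicit feature map for the degree-2 polynomial kernel on R^2:
    (x1, x2) -> (x1^2, sqrt(2)*x1*x2, x2^2)."""
    x1, x2 = v
    return np.array([x1**2, np.sqrt(2) * x1 * x2, x2**2])

def poly_kernel(u, v):
    """k(u, v) = (u . v)^2, computed without ever forming phi explicitly."""
    return float(np.dot(u, v)) ** 2

u, v = np.array([1.0, 2.0]), np.array([3.0, -1.0])
print(poly_kernel(u, v))               # 1.0
print(float(np.dot(phi(u), phi(v))))   # the same value: the kernel trick
```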

Conclusion

A broad introduction to the hot-topic that is Machine Learning. Enjoyable though time-consuming class.

Course Review: PHIL 321

Induction, Decision and Game Theory

“If my parents had brought me up in a house where all the coins were 99% heads, what probability should I assign to a coin flip?”

Text: An Introduction to Decision Theory by Martin Peterson

Prof:  Dr. Christopher Stephens

Entertaining lecturer. Full of funny examples and jokes. Also willing to go through examples quite slowly to ensure everyone understands.


Difficulty

The course is largely mathematical in nature, and as there are not a lot of prerequisites, the complexity of the mathematics is not very high. We do cover a fair deal of diverse content in the course, though, so you might need to allocate some time to learn the material even if you do not find the mathematics challenging. The midterm average was quite low, and the highest grade was only 92%, so it did not turn out to be much of a GPA booster, if that was students' motivation for taking the course.

Key Concepts

Decisions under Ignorance

Decisions under Risk

Interpretations of Probability

Game Theory

Hard Concepts

Dutch Book Arguments: You need to learn how to set the different cases up correctly. One useful intuition is that you usually want to encourage the victim to bet for what they are over-confident about and against what they are under-confident about; a toy example follows below.
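
As a toy illustration (my own numbers, not an example from the course): suppose the victim's credences are P(A) = 0.7 and P(not A) = 0.4, which sum to more than 1. Selling them a $1 ticket on each event at their own fair prices guarantees them a loss:

```python
# The victim's (incoherent) credences sum to 1.1 rather than 1.
p_A, p_not_A = 0.7, 0.4

# At their own fair prices they will pay p * $1 for a ticket that pays $1
# if the event occurs. Sell them one ticket on A and one on not-A.
cost = p_A * 1.0 + p_not_A * 1.0   # $1.10 paid up front

payoff_if_A = 1.0                  # only the ticket on A pays out
payoff_if_not_A = 1.0              # only the ticket on not-A pays out

print(payoff_if_A - cost)          # -0.10: a sure loss
print(payoff_if_not_A - cost)      # -0.10: a sure loss either way
```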

Bayes' Theorem: They sometimes ask Bayes' Theorem questions in an odd way; I am not sure if it is mathematically incorrect or if I am just being stupid. The confusion is related to sampling.

Final Paper: Not that used to writing a paper. Spent a great deal of time getting my thesis to be specific.

Conclusion

A decent introduction to Decision and Game Theory that is more nuanced and skeptical than you might get from an Economics or Mathematics class. Would have enjoyed even more emphasis on the philosophical issues and less on the mathematics.