Eric’s Enlightenment for Wednesday, June 3, 2015

  1. Jodi Beggs uses the Rule of 70 to explain why small differences in GDP growth rates have large ramifications.
  2. Rick Wicklin illustrates the importance of choosing bin widths carefully when plotting histograms.
  3. Shana Kelley et al. have developed an electrochemical sensor for detecting selected mutated nucleic acids (i.e. cancer markers in DNA!).  “The sensor comprises gold electrical leads deposited on a silicon wafer, with palladium nano-electrodes.”
  4. Rhett Allain provides a very detailed and analytical critique of Mjölnir (Thor’s hammer) – specifically, its unrealistic centre of mass.  This is an impressive exercise in physics!
  5. Congratulations to the Career Services Centre at Simon Fraser University for winning TalentEgg’s Special Award for Innovation by a Career Centre!  I was fortunate to volunteer there as a career advisor for 5 years, and it was a wonderful place to learn, grow and give back to the community. My career has benefited greatly from that experience, and it is a pleasure to continue my involvement as a guest blogger for its official blog, The Career Services Informer. Way to go, everyone!

Eric’s Enlightenment for Wednesday, May 20, 2015

  1. A common but bad criticism of basketball analytics is that statistics cannot capture the effect of teamwork when assessing the value of a player.  Dan Rosenbaum wrote a great article on how adjusted plus/minus accomplishes this goal.
  2. Citing Dan’s work above, Neil Paine used adjusted plus/minus (APM) to show why Jason Collins was one of the top defensive centres in the NBA and the most underrated player of the last 15 years of his career.  When Neil mentions regularized APM (RAPM) in the third-to-last paragraph, he calls it a Bayesian version of APM.  Most statisticians are more familiar with the term ridge regression, which is one type of regression that penalizes the inclusion of too many redundant predictors.  Make sure to check out that great plot of actual RAPM vs. expected PER at the bottom of the article.
  3. In a 33-page article that was published on 2015-05-14 in Physical Review Letters, only the first 9 pages describe the research; the other 24 pages list its 5,514 authors – setting a record for the largest known number of authors on a single research article.  Hyperauthorship is common in physics, but not – apparently – in biology.  (Hat Tip: Tyler Cowen)
  4. Brandon Findlay explains why methanol/water mixtures make great cooling baths.  He wrote a very thorough follow-up blog post on how to make them, and he includes photos to aid the demonstration.

Eric’s Enlightenment for Friday, May 1, 2015

  1. PROC GLIMMIX Contrasted with Other SAS Statistical Procedures for Regression (including GENMOD, MIXED, NLMIXED, LOGISTIC and CATMOD).
  2. Lee-Ping Wang et al. recently developed the nanoreactor, “a computer model that can not only determine all the possible products of the Urey-Miller experiment, but also detail all the possible chemical reactions that lead to their formation”.  What an exciting development!  It “incorporates physics and machine learning to discover all the possible ways that your chemicals might react, and that might include reactions or mechanisms we’ve never seen before”.  Here is the original paper.
  3. A Quora thread on the best examples of the Law of Unintended Consequences.
  4. In a 2-minute video, Alex Tabarrok argues why software patents should be eliminated.

Getting Ready for Mathematical Classes in the New Semester – Guest-Blogging on SFU’s Career Services Informer

The following blog post was slightly condensed for editorial brevity and then published on the Career Services Informer, the official blog of the Career Services Centre at my undergraduate alma mater, Simon Fraser University.


As a new Fall semester begins, many students start courses such as math, physics, computing science, engineering and statistics.  These can be tough classes with a rapid progression in workload and difficulty, but steady preparation can mount a strong defense to the inevitable pressure and stress.  Here are some tips to help you to get ready for those classes.



Mathematics and Applied Statistics Lesson of the Day – The Harmonic Mean

The harmonic mean, H, for n positive real numbers x_1, x_2, ..., x_n is defined as

H = n \div (1/x_1 + 1/x_2 + \ldots + 1/x_n) = n \div \sum_{i = 1}^{n} x_i^{-1}.

This type of mean is useful for measuring the average of rates.  For example, consider a car travelling for 240 kilometres at 2 different speeds:

  1. 60 km/hr for 120 km
  2. 40 km/hr for another 120 km

Then its average speed for this trip is

S_{avg} = 2 \div (1/60 + 1/40) = 48 \text{ km/hr}

Notice that the speeds for the 2 segments have equal weight in the calculation of the harmonic mean – this is valid because the car travelled an equal distance at each of the 2 speeds.  If the distances were not equal, then a weighted harmonic mean would be needed instead – I will cover this in a later lesson.

To confirm the formulaic calculation above, let’s use the definition of average speed from physics.  The average speed is defined as

S_{avg} = \Delta \text{distance} \div \Delta \text{time}

We already have the elapsed distance – it’s 240 km.  Let’s find the time elapsed for this trip.

\Delta \text{time} = 120 \text{ km} \times (1 \text{ hr}/60 \text{ km}) + 120 \text{ km} \times (1 \text{ hr}/40 \text{ km})

\Delta \text{time} = 5 \text{ hours}


S_{avg} = 240 \text{ km} \div 5 \text{ hours} = 48 \text{ km/hr}

Notice that this explicit calculation of the average speed from the kinematic definition agrees with the average speed that we calculated from the harmonic mean!
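
The two calculations above can be sketched in a few lines of Python (this snippet is my own illustration, not part of the original lesson; the variable names are mine):

```python
# Harmonic mean of the two speeds; equal weights are valid here
# because each segment covers the same distance (120 km).
speeds = [60, 40]  # km/hr
n = len(speeds)
harmonic_mean = n / sum(1 / s for s in speeds)

# Cross-check with the kinematic definition: total distance / total time.
distance_per_segment = 120  # km
total_distance = distance_per_segment * len(speeds)          # 240 km
total_time = sum(distance_per_segment / s for s in speeds)   # 2 hr + 3 hr = 5 hr
average_speed = total_distance / total_time

print(harmonic_mean)   # 48.0
print(average_speed)   # 48.0
```

Both computations give 48 km/hr, matching the hand calculation.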