FlexEvents: WOD 3 Simulation

During a recent CrossFit competition, Flex on the Mall, we had to accomplish a team chipper in which we could pick the order of team members to minimize our workout time. The workout was simple: 50 burpees, 40 over-the-box jumps, and 30 kettlebell snatches. Each exercise had to be accomplished in serial: no one could start an exercise until the previous team member had finished it. This meant everyone waited while the first person started. As simple as this was, I got confused when figuring the optimal order for team members: should the slowest go first or last?

At the time, I thought the best strategy was to have the fastest go first, to avoid the scenario where the fast folks were waiting and unable to help the team. I was focused on the idea that you didn’t want anyone waiting, so clear out the fast people first. My partners wanted the slowest to go first, because the slower participants could rest, go super slow, and not affect the final score.

They were correct and I was wrong, but it didn’t make sense at the time because I was viewing the whole thing as a linear operation where order didn’t matter, but waiting on someone would definitely slow down the overall time. It turns out that if you put the slowest last, no one is waiting, but the clock runs entirely on the slowest person’s time, when you otherwise could have hidden their slow time behind everyone else’s work. The workout came down to the following critical path: the time it took to do everyone’s burpees plus the last participant’s time on the remaining events.

However, this is only true if the time it took to do burpees was significantly more than the other events and there was not a significant difference in fitness between team members. After the competition, I wanted to understand the dynamics of this workout and build a quick model to understand where these assumptions held true and hone my intuition for stuff like this.

It turns out the worst thing to do is have the slowest person go last. The reason is really simple: you are putting them in the critical path. In fact, an optimal strategy is to have the slowest always go first. To see why, assume all four members have different speeds and work with the expected values of their workout times. Let’s assume the four members have the following expected completion times (in notional units), where person 1 is the fastest and each successive participant is slower in all events.

Note: These are totally made up numbers and have nothing to do with our team . . . says the man whose wife now routinely kicks his scores to the curb.

person:     1   2   3   4
burpees:    9  10  15  17
box jumps:  7   8  13  15
KB snatch:  5   7  11  13


In this case, I wrote some Matlab to loop through all 4! = 24 permutations. (Remember: 1 is the fastest, 4 the slowest.)

[Plot: CrossFit order — completion times for all 24 permutations]

This is such a simple event that you can see the basic building blocks without much introspection: 21 is the fastest individual time through the sequence and 45 is the slowest. If the team were comprised entirely of person 1, they could complete the whole thing in 48. As this fictional team stands, burpees alone will account for 51 regardless of order, but if the slowest goes first you only add 17 on to the total time (finishing in 68), versus adding 28 (finishing in 79) if the slowest person goes last.
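My Matlab isn’t reproduced here, but the brute force is easy to sketch in Python. This assumes the flow-shop reading of the rules above: a person starts an exercise only after finishing their previous one and after the person ahead has cleared that exercise.

```python
from itertools import permutations

# Expected completion times (notional units) from the table above.
times = {
    1: (9, 7, 5),     # burpees, box jumps, KB snatches
    2: (10, 8, 7),
    3: (15, 13, 11),
    4: (17, 15, 13),
}

def makespan(order):
    """Team finish time for a given order: a person starts an exercise
    only after finishing their previous one AND after the person ahead
    has cleared that exercise."""
    done = [0] * 3   # completion time of the person ahead, per exercise
    t = 0
    for person in order:
        t = 0
        for e in range(3):
            t = max(t, done[e]) + times[person][e]
            done[e] = t
    return t

results = sorted((makespan(p), p) for p in permutations(times))
best_time, best_order = results[0]
worst_time, worst_order = results[-1]
print(best_time, best_order)    # 68, with the slowest (person 4) first
print(worst_time, worst_order)  # 79, with the slowest last
```

Every optimal ordering puts person 4 first, and every ordering that puts person 4 last costs the full 51 + 28 = 79.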

A few more things. The math here is simple enough to work with variables instead of numbers, and it probably should be done that way. A better analysis would show how much variation you could tolerate between athletes before the dynamics above stop applying. Maybe another lunch break for that.

OK, on further thought, I decided to look at several scenarios and show the optimal strategies.

A team member is much slower on burpees, but faster on the other events.

In this case, it still makes sense for her to go last. The burpee total is fixed, but speed in the last two events determines the best order. It helps me to assign a color to each team member and see all 24 strategies on one plot. The lightest shade is member one, who has a burpee time of 40 compared to 4 for the others, but is faster than everyone in the other two events. In the plot below, the fastest combinations are at the top, where you can see that all the fastest outcomes have member one going last.

[Plot: all 24 orderings when member 1’s burpees are slower]

Burpee times equal KB snatch times

So suppose their event times look like this, where everyone’s KB snatch time equals their burpee time:

burpee times:    1   2   3   4
box jump times:  2   2   2   2
KB snatch times: 1   2   3   4

Then all outcomes are equal: you get the exact same finish time regardless of order. So in this case it doesn’t matter what order you do the workout in, even though person 4 is much slower than person 1. Interesting and counter-intuitive.
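A quick Python check confirms the claim by brute force (again assuming the serial flow-shop reading of the rules; the times are from the table above):

```python
from itertools import permutations

# burpees, box jumps, KB snatches; KB snatch equals burpee time for everyone
times = {1: (1, 2, 1), 2: (2, 2, 2), 3: (3, 2, 3), 4: (4, 2, 4)}

def makespan(order):
    """Team finish time under the serial hand-off rules."""
    done = [0] * 3
    t = 0
    for person in order:
        t = 0
        for e in range(3):
            t = max(t, done[e]) + times[person][e]
            done[e] = t
    return t

spans = {makespan(p) for p in permutations(times)}
print(spans)  # a single value: order never matters for these times
```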


My code is below for any of those interested.

CY2014 Quarter 1 Financial Review

Chrissy and I review our spending on a quarterly basis. Updating every 90 days isn’t too long to correct mistakes and remember purchases, but it also allows for the busy multi-week sprints that life presents us. While we have used nearly every financial management program available, I’ve found the most straightforward and flexible solution is to download historical transactions into Excel, where I can assign categories and do the type of analysis you can see below. This works for me because I have complete control. All the other solutions I’ve used (MS Money, Quicken, Mint, GNU Wallet) introduced errors that required lots of time to fix (or couldn’t be fixed), but more importantly, they constrained me to their interface, so I got used to exporting information into tools that could flexibly answer my questions.

My basic workflow is to download statements from all our bank accounts and credit cards and put them all into one spreadsheet, where I ensure a consistent list of categories. I can do this quickly by filtering and sorting, as most of our expenses are cyclical. Once everything is in the right format, I use lots of Excel SUMIF and SUMIFS functions to produce reports.
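As a sketch of what those SUMIF reports compute, here is a hypothetical Python equivalent (the transactions are made up, not our real data):

```python
import csv
from io import StringIO

# A few made-up transactions in the downloaded-statement format.
transactions = """Date,Description,Category,Amount
1/05/14,SAFEWAY,Groceries,84.12
1/07/14,SHELL OIL,Gas,41.00
1/12/14,SAFEWAY,Groceries,65.30
"""

# Equivalent of =SUMIF(CategoryRange, "Groceries", AmountRange),
# computed for every category at once.
totals = {}
for row in csv.DictReader(StringIO(transactions)):
    totals[row["Category"]] = totals.get(row["Category"], 0.0) + float(row["Amount"])

print(round(totals["Groceries"], 2))  # 149.42
```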

My financial review is intended to accomplish the following:

  • Quality check (Are we getting paid the right amounts? Any incorrect expenses?)
  • Spending feedback (Are we overpaying in any categories? Anything we need to rein in?)
  • Tax Production

While the tax production and quality check were very helpful to me, I wanted to share the results of the spend analysis in case my reports might be useful to others.

Spending feedback

In summary, we had a small rise in our overall Grocery and Dining out categories, but the major cost drivers were:

  • Ellie’s 12 cavities were very expensive (no dental insurance)
  • We bought a new espresso machine (major purchase for us)
  • We bought a new car
  • We went crazy on clothes
  • We committed (again) to Army Navy Country Club


Where are we spending?

This doesn’t have a real effect on our spending, but I thought this was interesting. We don’t have savings/investments in here; this is just “spending”. I treated stuff like insurance, taxes, medical, fees, haircuts, etc. as “cost of life” — things I feel we can’t avoid and don’t really have discretion in spending. Some other stuff that might fit this category (power bill) gets lumped into household (as does home maintenance and mortgage). I would love to do some more analysis and compare our spending to this article.


Daily Feedback

The plot below has categories on the Y-axis and days on the X-axis. Intensity of color is the spend amount. I used Matlab to produce this plot. I like it because the colormap filters everything in a way that comes out like a log scale, and that tells me what is a big deal and what is noise. The interesting dynamic is the frequency/magnitude trade-off in spending: medical arrives in seldom, big chunks, while grocery expenses are a constant but smaller expense.


You can see that our daily spending has a huge variance: the standard deviation was twice our average daily spend, so big purchases had a pronounced effect. I show four levels of spending: total discretion (dining out), some and limited discretion (haircuts, medical), and committed (mortgage, tax) at the bottom.


Weekly Feedback



So how much can we control this?

If I break down spending into four categories:

  • Committed — We have to pay it (i.e. Mortgage)
  • Limited Discretion — We can commit extra time to reduce it (i.e. Home and Car Maintenance)
  • Some Discretion — We can make choices to decrease our quality of purchase (i.e. Groceries)
  • Total Discretion — We can do without this if we have to (i.e. Dining Out/New Clothes)

It turns out that a third of our expenses are committed, while about a quarter each fall under limited and some discretion. Roughly 20% of our expenses are totally discretionary, and 70% of our expenses could be changed if we had to. The takeaway for me is to focus on eliminating the stuff we pay for but don’t enjoy (fees) and the things that don’t bring joy/reward for their cost.

Fun with Bessel Functions

Well, I certainly forget things faster than I learn them. Today is a quick review of Bessel functions and their applications to signal processing.

The Bessel functions appear in lots of situations (think wave propagation and static potentials), particularly those that involve cylindrical symmetry. While special types of what would later be known as Bessel functions were studied by Euler, Lagrange, and the Bernoullis, the Bessel functions were first used systematically by F. W. Bessel to describe three-body motion, with the Bessel functions appearing in the series expansion of the planetary perturbation.

First, I think they should be called Bernoulli-Bessel functions both because that sounds more pompous and because they were discovered by Daniel Bernoulli and generalized by Friedrich Bessel. While they sound (and can be) complicated, they are the canonical solutions of Bessel’s differential equation:

$$ x^2 \frac{d^2 y}{dx^2} + x \frac{dy}{dx} + (x^2 - \alpha^2)y = 0 $$

for an arbitrary complex number $\alpha$ (where $\alpha$ denotes the order of the Bessel function). The most important cases are for $\alpha$ an integer or half-integer. Since all math ties together, I find it pretty cool that Bessel functions appear in the solution to Laplace’s equation in cylindrical coordinates.

Although $\alpha$ and $-\alpha$ produce the same differential equation, it is conventional to define different Bessel functions for these two values in such a way that the Bessel functions are mostly smooth functions of $\alpha$.

Bessel functions of the first kind:

Bessel functions of the first kind, known as $J_\alpha(x)$, are solutions of Bessel’s differential equation that are finite at the origin ($x = 0$) for integer or positive $\alpha$, and diverge as $x$ approaches zero for negative non-integer $\alpha$. It is possible to define the function by its Taylor series expansion around $x = 0$:

$$ J_\alpha(x) = \sum_{m=0}^\infty \frac{(-1)^m}{m! \, \Gamma(m+\alpha+1)} {\left(\frac{x}{2}\right)}^{2m+\alpha} $$

where $\Gamma(z)$ is the gamma function, a shifted generalization of the factorial function to non-integer values.
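The series above is easy to evaluate numerically. Here is a short Python sketch — a naive partial sum, fine for moderate $x$ but not a production implementation:

```python
import math

def bessel_j(alpha, x, terms=40):
    """Partial sum of the Taylor series for J_alpha(x) around x = 0.
    Accurate for moderate |x|; a sketch, not a library-grade routine."""
    total = 0.0
    for m in range(terms):
        total += ((-1) ** m
                  / (math.factorial(m) * math.gamma(m + alpha + 1))
                  * (x / 2) ** (2 * m + alpha))
    return total

print(bessel_j(0, 0.0))                 # J_0(0) = 1.0
print(bessel_j(0, 2.404825557695773))   # ~0: the first zero of J_0
```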

Bessel functions of the second kind:

The Bessel functions of the second kind, denoted by $Y_\alpha(x)$, are solutions of the Bessel differential equation that have a singularity at the origin.

For non-integer $\alpha$, $Y_\alpha(x)$ is related to $J_\alpha(x)$ by:

$$ Y_\alpha(x) = \frac{J_\alpha(x) \cos(\alpha\pi) - J_{-\alpha}(x)}{\sin(\alpha\pi)} $$

In the case of integer order $n$, the function $Y_n(x)$ is defined by taking the limit as a non-integer $\alpha$ tends to $n$:

$$ Y_n(x) = \lim_{\alpha \to n} Y_\alpha(x). $$

There is also a corresponding integral formula (for Re(x) > 0),

$$ Y_n(x) =\frac{1}{\pi} \int_0^\pi \sin(x \sin\theta - n\theta) \, d\theta - \frac{1}{\pi} \int_0^\infty \left[ e^{n t} + (-1)^n e^{-n t} \right] e^{-x \sinh t} \, dt.$$

$Y_\alpha(x)$ is necessary as the second linearly independent solution of Bessel’s equation when $\alpha$ is an integer. But $Y_\alpha(x)$ can also be considered as a ‘natural’ partner of $J_\alpha(x)$.

Moreover, when $n$ is an integer, as was similarly the case for the functions of the first kind, the following relationship is valid:

$$ Y_{-n}(x) = (-1)^n Y_n(x).\,$$

Both $J_\alpha(x)$ and $Y_\alpha(x)$ are holomorphic functions of $x$ on the complex plane cut along the negative real axis. When $\alpha$ is an integer, the Bessel functions are entire functions of $x$. If $x$ is held fixed at a non-zero value, then the Bessel functions are entire functions of $\alpha$.

Bessel Filters

In electronics and signal processing, a Bessel filter is a type of linear filter with a maximally flat group delay. The denominator of its transfer function is a reverse Bessel polynomial, which is where the filter gets its name. Bessel filters are often used in audio crossover systems. Analog Bessel filters are characterized by almost constant group delay across the entire passband, thus preserving the wave shape of filtered signals in the passband.
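Those reverse Bessel polynomials can be generated from the recurrence θ_n(s) = (2n − 1)·θ_{n−1}(s) + s²·θ_{n−2}(s), with θ_0 = 1 and θ_1 = s + 1. A quick Python sketch (the coefficient-list representation, lowest degree first, is my own illustration):

```python
def reverse_bessel_poly(n):
    """Coefficients of the reverse Bessel polynomial theta_n(s),
    lowest degree first, built from the recurrence
    theta_n = (2n - 1) * theta_{n-1} + s^2 * theta_{n-2}."""
    theta = [[1], [1, 1]]  # theta_0 = 1, theta_1 = 1 + s
    for k in range(2, n + 1):
        coeffs = [0] * (k + 1)
        for i, c in enumerate(theta[k - 1]):   # (2k - 1) * theta_{k-1}
            coeffs[i] += (2 * k - 1) * c
        for i, c in enumerate(theta[k - 2]):   # s^2 * theta_{k-2}
            coeffs[i + 2] += c
        theta.append(coeffs)
    return theta[n]

print(reverse_bessel_poly(3))  # [15, 15, 6, 1] -> s^3 + 6 s^2 + 15 s + 15
```

θ_3 is the denominator of the classic third-order analog Bessel filter (up to frequency normalization).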

A low pass active filter with a Bessel response is used when the filter needs to exhibit minimum differential delay between the various frequency components of interest contained within the input signal being filtered. In essence this means that the fundamental frequency of, say, an applied square wave experiences the same input-to-output delay as the other harmonics within the filter’s pass-band. This results in a high degree of fidelity of the output signal relative to the input signal.

Excel Sorting and Grouping

I had two tables downloaded from Amazon:


Order Date  Order ID    Title   Category
1/26/14 102-4214073-2201835     Everyday Paleo Family Cookbook
1/13/14 115-8766132-0234619     Awesome Book A
1/13/14 115-8766132-0234619     Awesome Book B



Order Date  Order ID    Subtotal
1/6/14  102-6956821-1091413 $43.20
1/13/14 115-8766130-0234619 $19.42
1/16/14 109-8688911-2954602 $25.86

I’m building our Q1 2014 taxes and needed rows in the following format:

1/13/14 115-8766132-0234619 $22.43 Awesome Book A, Awesome Book B

In order to do this without using SQL, I did the following. If column B corresponds to Order ID and column C corresponds to the item Title, then I put the following formula in cell N3

=+IF(B3=B2,N2 & " | " & C3,C3)

and in cell O3, a column that flags the last row of each order (call it “last before change?”):

=+IF(B3=B4,"", TRUE)

Then I could easily sort out the unwanted values. Done. Still, I would like to better automate this. Any thoughts appreciated.
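One way to automate the same grouping outside Excel is a short Python script. The file contents here are hypothetical stand-ins for the two Amazon downloads above; the csv module does the parsing:

```python
import csv
from io import StringIO

# Hypothetical file contents standing in for the two Amazon downloads.
items_csv = """Order Date,Order ID,Title
1/13/14,115-8766132-0234619,Awesome Book A
1/13/14,115-8766132-0234619,Awesome Book B
1/26/14,102-4214073-2201835,Everyday Paleo Family Cookbook
"""
totals_csv = """Order Date,Order ID,Subtotal
1/13/14,115-8766132-0234619,$22.43
"""

# Group titles by Order ID, keeping the first-seen date for each order.
titles, dates = {}, {}
for row in csv.DictReader(StringIO(items_csv)):
    titles.setdefault(row["Order ID"], []).append(row["Title"])
    dates.setdefault(row["Order ID"], row["Order Date"])

# Look up each order's subtotal from the second download.
subtotals = {row["Order ID"]: row["Subtotal"]
             for row in csv.DictReader(StringIO(totals_csv))}

rows = [f'{dates[oid]}\t{oid}\t{subtotals.get(oid, "")}\t{", ".join(ts)}'
        for oid, ts in titles.items()]
for line in rows:
    print(line)
```

This replaces the helper columns entirely: one pass groups the titles, and a dictionary lookup joins in the subtotal.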