The Hierarchical Dirichlet Process Hidden Semi-Markov Model

In my work at DARPA, I’ve been exposed to hidden Markov models in applications as diverse as speech, handwriting, and gesture recognition, musical score following, and bioinformatics. My background is in stochastic modeling and optimization, and hidden Markov models are a fascinating intersection between my background Continue Reading

Gradient Descent

It has been a while since I’ve studied optimization, but gradient descent is always good to brush up on. Most optimization methods rely on derivatives. Often known as the method of steepest descent, gradient descent works by taking steps proportional to the negative of the gradient of the function at the current point. Continue Reading
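As a minimal sketch of that update rule (my own illustration, not code from the post), here is gradient descent on the quadratic f(x, y) = x² + 10y²; the step size and iteration count are arbitrary choices for the example:

```python
import numpy as np

def grad_f(p):
    """Gradient of f(x, y) = x^2 + 10*y^2."""
    x, y = p
    return np.array([2.0 * x, 20.0 * y])

p = np.array([5.0, 2.0])   # starting point
lr = 0.05                  # step size (proportionality constant)

for _ in range(200):
    p = p - lr * grad_f(p)  # step along the negative gradient

print(p)  # approaches the minimizer at (0, 0)
```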

Fun with Bessel Functions

Well, I certainly forget things faster than I learn them. Today is a quick review of Bessel functions and their applications to signal processing. The Bessel functions appear in lots of situations (think wave propagation and static potentials), particularly those that involve cylindrical symmetry. While special types of what would Continue Reading
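For a concrete taste of that signal-processing connection, here is a quick sketch (my own example, not from the post) that evaluates Bessel functions of the first kind with SciPy; Jₙ appears, for instance, in the sideband amplitudes of an FM-modulated signal:

```python
import numpy as np
from scipy.special import jv  # Bessel function of the first kind, J_v(x)

x = np.linspace(0.0, 20.0, 5)
for n in range(3):
    print(f"J_{n}:", jv(n, x))  # jv(order, argument)
```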

Sampling Exploration

I needed to review the Nyquist–Shannon sampling theorem. The coolest thing about Nyquist is that it ties the required sample rate to the function’s bandwidth (sample faster than twice the highest frequency) and leads to a formula for the mathematically ideal interpolation algorithm. What is sampling? Sampling is nothing more than converting a signal into a numeric Continue Reading
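The ideal interpolation the theorem leads to is the Whittaker–Shannon formula, x(t) = Σₙ x[n]·sinc((t − nT)/T). Here is a minimal sketch (my own illustration, with made-up frequencies, not from the post) of reconstructing a sine sampled above its Nyquist rate:

```python
import numpy as np

f0 = 3.0          # signal frequency in Hz
fs = 10.0         # sample rate in Hz, above the Nyquist rate 2*f0 = 6 Hz
T = 1.0 / fs

n = np.arange(0, 20)                      # sample indices
samples = np.sin(2 * np.pi * f0 * n * T)  # the sampled signal x[n]

t = np.linspace(0.0, n[-1] * T, 500)      # dense grid for reconstruction
# np.sinc is the normalized sinc, sin(pi x)/(pi x), exactly what the
# interpolation formula calls for with argument (t - nT)/T.
recon = np.array([np.sum(samples * np.sinc((ti - n * T) / T)) for ti in t])

# Away from the endpoints (where truncating the infinite sum hurts),
# the reconstruction closely matches the original signal.
interior = (t > 0.3) & (t < n[-1] * T - 0.3)
print(np.max(np.abs(recon[interior] - np.sin(2 * np.pi * f0 * t[interior]))))
```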