Analysing harmonic motion using computer vision

September 2023

This is more of a personal journey of winning a coding competition than a technical guide to computer vision.


A few days ago, I saw the Creative Coding Competition by Radu Mariescu-Istodor. I’ve been following his work for a while.

When this competition popped up, I had to give it a shot. It was way outside my comfort zone, but it ended up being one of the most fun coding experiences I’ve had.

P.S. I actually scored the highest in the competition (46/50). Here's Radu Mariescu-Istodor's video reviewing my project.


The Challenge

The challenge was straightforward: take video footage and transform it with a visual effect, using nothing but code.

This is my submission: github.com/rudrodip/Harmonic-Oscillator-CV


Overview of the project

It's a simple tool to analyse the harmonic motion of a pendulum, given a video of it. It uses popular Python libraries like OpenCV, SciPy, and NumPy to detect, analyse, and predict the pendulum's harmonic motion. It can take a video feed from a file, webcam, or URL, and tries to estimate the critical parameters of the motion.


How to use it?

Check out the UI section first; it'll make setup way easier and help you understand how everything works.

Before you begin, ensure you have Python 3.6 or higher installed on your system. You can download Python from python.org.

Setting up the project

Clone the repository

git clone https://github.com/rudrodip/Harmonic-Oscillator-CV
cd Harmonic-Oscillator-CV

Create a virtual environment

python -m venv venv

Activate the virtual environment

On Windows:

venv\Scripts\activate

On MacOS and Linux:

source venv/bin/activate

Installing Dependencies

Once you have your virtual environment set up and activated, you can install the project's dependencies:

pip install -r requirements.txt

Usage

  1. Run the main application script:
python app/app.py

This project uses opencv-python-headless (not opencv-python), so make sure the correct library is installed in your virtual environment. Otherwise, uninstall opencv-python and reinstall opencv-python-headless with this command:

pip install opencv-python-headless --no-cache-dir

User Interface

The UI is straightforward, built with Python and PyQt5.

Buttons and Dropdowns

image

Main controls:

  • Run/Stop: Start and stop video processing
  • Select Video/Webcam/URL: Choose your video source
  • Hide Params: Toggle visual guides on/off

Dropdowns let you switch between display modes (raw video, contours, mask) and detection methods (HSV color detection is recommended).

HSV Range Sliders

image

Fine-tune color detection here. My workflow: set display to "Mask," run the video, then adjust sliders until the white blob matches your pendulum bob. Save when it looks good.

Analyze Widget

image

Hit Estimate once you have good tracking to fit a curve to your data. Shows calculated physics parameters that you can save for later.

Graph

image

Built with pyqtgraph: zoom, pan, scale, and export however you want.


Project Sections

This project has three main parts:

  1. Programming: Here I built the user interface and set up image processing using OpenCV and cvzone. The UI was pretty easy to make with Python and PyQt5.

  2. Mathematics: This part is about using math to get useful info from the data.

    • Curve Fitting with curve_fit: I use SciPy’s curve_fit to match the data to a damped oscillation curve. This helps find and analyze the motion patterns.
    • least_squares for Circular Path Detection: SciPy’s least_squares is used not only for curve fitting but also for assessing how closely the tracked path matches a circle.
  3. Physics: Physics provides the necessary equations to model and estimate physical parameters like pendulum length and pivot—crucial for understanding the motion.


Programming

Image Processing

OpenCV handled the video feed, object detection, and tracking. cvzone made contour detection easy, especially for finding objects by HSV color range.

Main Loop Overview

Each frame went through this cycle:

  1. Capture Frame – Get the latest image from the camera.
  2. Find Contours – Use cvzone to detect objects by color.
  3. Pivot & Path Prediction – With SciPy’s least_squares, estimate the pivot point and predict motion.
  4. Send Data to UI – Position and frame count updates for display.
  5. Render Frame – Show processed video in real time.
  6. Repeat – Frame after frame for smooth live analysis.

Mathematics

Detecting Circular Motion with least_squares

Even if the video is rotated, circular motion can be detected reliably:

  1. Extract Contours – Get the object’s outline each frame.
  2. Residuals Calculation – Compare actual points to a circle model:
import numpy as np

def circle_residuals(params, x, y):
    a, b, r = params  # circle center (a, b) and radius r
    return np.sqrt((x - a)**2 + (y - b)**2) - r

Residuals (per point): for data points $(x_i, y_i)$ and circle $(a, b, r)$:

$$\varepsilon_i = \sqrt{(x_i - a)^2 + (y_i - b)^2} - r$$

  3. Optimize Parameters – Let least_squares adjust the center $(a, b)$ and radius $r$ to minimize $\sum \varepsilon_i^2$.
  4. Pivot & Rotation – After estimating the pivot and rotation angle $\theta$, transform coordinates to a de-rotated frame:

$$\begin{bmatrix} x' \\ y' \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \left( \begin{bmatrix} x \\ y \end{bmatrix} - \begin{bmatrix} a \\ b \end{bmatrix} \right)$$
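Here is how that circle fit might look end-to-end with SciPy's least_squares, on synthetic points along an arc (the center and radius below are made-up test values, not from the project):

```python
import numpy as np
from scipy.optimize import least_squares

def circle_residuals(params, x, y):
    a, b, r = params  # circle center (a, b) and radius r
    return np.sqrt((x - a)**2 + (y - b)**2) - r

# Synthetic arc: the bottom half of a circle centered at (2, -1) with
# radius 3, roughly what a pendulum bob traces below its pivot.
t = np.linspace(np.pi, 2 * np.pi, 50)
x = 2 + 3 * np.cos(t)
y = -1 + 3 * np.sin(t)

fit = least_squares(circle_residuals, x0=[0.0, 0.0, 1.0], args=(x, y))
a, b, r = fit.x  # should recover roughly (2, -1, 3)
```

The final residual magnitudes also double as a "how circular is this path?" score, which is the accuracy check mentioned above.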

Fitting Harmonic Motion with curve_fit

Goal: fit the exact motion equation.

Model (underdamped oscillator):

$$x(t) = A\,e^{-\gamma t}\,\cos(\omega t + \phi) + C$$

import numpy as np

def underdamped_harmonic_oscillator(t, A, gamma, w, phi, C):
    return A * np.exp(-gamma * t) * np.cos(w * t + phi) + C

Process

  • Collect $(t, x(t))$ data.
  • Fit with curve_fit.
  • Extract $A, \gamma, \omega, \phi, C$.

Result: a clean curve matching real motion.
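A self-contained sketch of that fit on synthetic tracking data (the parameter values here are arbitrary, chosen just to exercise the fit):

```python
import numpy as np
from scipy.optimize import curve_fit

def underdamped_harmonic_oscillator(t, A, gamma, w, phi, C):
    return A * np.exp(-gamma * t) * np.cos(w * t + phi) + C

# Synthetic "tracking data": 5 s sampled at 50 fps, no noise.
t = np.linspace(0, 5, 250)
true_params = (1.5, 0.3, 2 * np.pi, 0.2, 0.5)
x = underdamped_harmonic_oscillator(t, *true_params)

# Rough initial guesses keep curve_fit out of local minima,
# especially for the frequency w.
p0 = [1.0, 0.1, 6.0, 0.0, 0.0]
popt, pcov = curve_fit(underdamped_harmonic_oscillator, t, x, p0=p0)
A, gamma, w, phi, C = popt  # should recover the true parameters
```

On real, noisy centroid data the fit is less exact, but with a sensible frequency guess it still locks onto the dominant oscillation.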


Physics

Understanding the physics behind the motion is key to making sense of the data.
Our object behaves like a damped harmonic oscillator, so the starting point is the classic equations of motion.


From Springs to Pendulums

A damped mass–spring system is described by:

$$m\,\ddot{x}(t) + \gamma\,\dot{x}(t) + k\,x(t) = 0$$

  • $m$ → mass
  • $\gamma$ → damping coefficient (friction, air resistance, etc.)
  • $k$ → stiffness or restoring constant

A simple pendulum with small oscillations follows a very similar form:

$$\ddot{\theta}(t) + 2\gamma\,\dot{\theta}(t) + \omega_0^{2}\,\theta(t) = 0$$

  • $\theta(t)$ → angular displacement
  • $\omega_0$ → natural angular frequency (no damping)

This similarity is why the math we use for springs also works for pendulums.


The Underdamped Case

When damping is small ($\gamma < \omega_0$), the motion looks like a decaying cosine:

$$x(t) = A\,e^{-\gamma t}\,\cos(\omega t + \phi) + C$$

Here:

  • $A$ → amplitude
  • $\gamma$ → damping rate
  • $\omega$ → damped angular frequency
  • $\phi$ → phase
  • $C$ → vertical offset

The link between damped and natural frequencies is:

$$\omega = \sqrt{\omega_0^{2} - \gamma^{2}}$$

This means damping slows the oscillation slightly compared to the ideal case.


Connecting to String Length

For a simple pendulum:

$$\omega_0 = \sqrt{\frac{g}{L}}$$

where $L$ is the string length and $g$ is gravitational acceleration.

From the fitted parameters in our data:

  1. Get $\omega$ and $\gamma$ from the curve fit.
  2. Compute the natural frequency: $\omega_0 = \sqrt{\omega^{2} + \gamma^{2}}$
  3. Solve for the length: $L = \frac{g}{\omega_0^{2}}$
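In code, those three steps collapse to a one-liner (the example fit values below are illustrative, with $g = 9.81\ \mathrm{m/s^2}$):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def string_length(w, gamma):
    """Back out the string length from the fitted damped frequency w
    and damping rate gamma: w0^2 = w^2 + gamma^2, then L = g / w0^2."""
    return G / (w**2 + gamma**2)

# e.g. a fit of w ≈ 3.13 rad/s with light damping gives L ≈ 1 m
```

Note that ignoring $\gamma$ would slightly overestimate $L$, since the damped frequency is lower than the natural one.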

If the Pendulum isn’t “Simple”

Real pendulums can be compound pendulums — the bob has its own size and mass distribution.
We then need its moment of inertia $I$:

Moment of inertia of a solid sphere about its center:

$$I_{\text{bob}} = \frac{2}{5} m r^{2}$$

Shifted to the pivot point using the parallel-axis theorem:

$$I = I_{\text{bob}} + m\,d^{2}$$

The natural frequency becomes:

$$\omega_0 = \sqrt{\frac{m g d}{I}}$$

where $d$ is the distance from the pivot to the center of mass.

In practice:

  • Use the fitted motion data to estimate $\omega_0$.
  • Apply the correct formula (simple or compound) to back-calculate $L$.
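As a sketch, for a solid-sphere bob hanging so its center is a distance $d$ from the pivot (masses and radii below are made-up), the compound formula looks like this. It reduces to the simple-pendulum result as the bob radius shrinks:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def omega0_compound(m, r, d):
    """Natural frequency of a solid-sphere bob (mass m, radius r) whose
    center hangs a distance d from the pivot, via the parallel-axis theorem."""
    i_pivot = (2 / 5) * m * r**2 + m * d**2
    return math.sqrt(m * G * d / i_pivot)

def omega0_simple(d):
    """Point-mass approximation: omega_0 = sqrt(g / d)."""
    return math.sqrt(G / d)
```

For a small bob ($r \ll d$) the two agree closely, which is why the simple formula usually suffices in practice.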

So that's it. It's super simple, and I know there are tons of better ideas out there and plenty of optimizations to be done, but it was a fun project to work on.

I really want you guys to try it out, it feels magical when you see the plots being drawn in real time.

If you want to chat about it, you can reach me on X. Give it a star on GitHub if you like it.

See ya 👋!