Okay, so check this out, I was messing around with some AI stuff the other day, right? And I stumbled across this video – Nick Kyrgios reacting to Federer smashing his racket. Classic, right?

I thought, “Hey, wouldn’t it be cool to see if I could get some AI to, like, analyze Kyrgios’ reaction?” So, I started digging around for tools.
First thing I did was download the video. Found a decent quality version on YouTube, used one of those online downloader thingies. Nothing fancy.
Then I needed to break it down. I used FFmpeg to split the video into individual frames. It’s command line, kinda geeky, but it gets the job done. Something like: `ffmpeg -i video.mp4 -r 30 frame%04d.jpg` (where `video.mp4` is whatever you named the download, and `-r 30` should match the video’s frame rate). That spits out a ton of JPGs, one for each frame.
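Since the rest of the pipeline is Python anyway, you could drive FFmpeg from the script too. A minimal sketch of that glue (the video filename and output pattern here are just placeholders, not my actual paths):

```python
import subprocess

def build_ffmpeg_cmd(video_path, fps, out_pattern="frame%04d.jpg"):
    """Build the FFmpeg command that dumps one JPG per frame."""
    return ["ffmpeg", "-i", video_path, "-r", str(fps), out_pattern]

cmd = build_ffmpeg_cmd("kyrgios_reaction.mp4", 30)
# subprocess.run(cmd, check=True)  # uncomment if ffmpeg is on your PATH
print(" ".join(cmd))
```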
Next up, the AI part. I decided to try and use some facial recognition and emotion detection APIs. Google Cloud Vision, Amazon Rekognition, and Microsoft Azure Face API all looked promising. I ended up going with Azure because it had a free tier that was good enough for my little experiment.
I wrote a quick Python script to loop through all those frame images. For each image, it sent it to the Azure Face API and got back a JSON response. The response had stuff like the coordinates of Kyrgios’ face, and estimates of his emotions: anger, happiness, sadness, surprise, etc.
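The parsing half of that script looked roughly like this. The sample JSON below is made up by me, just shaped like what the Azure Face detect endpoint documents (a list of detected faces, each with a `faceRectangle` and `faceAttributes.emotion` scores), so don’t treat it as real API output:

```python
import json

# Made-up response in the shape the Azure Face detect endpoint returns:
# a list of detected faces, each with a rectangle and emotion scores.
sample_response = json.dumps([
    {
        "faceRectangle": {"top": 120, "left": 310, "width": 95, "height": 95},
        "faceAttributes": {
            "emotion": {
                "anger": 0.01, "contempt": 0.35, "disgust": 0.0, "fear": 0.0,
                "happiness": 0.02, "neutral": 0.12, "sadness": 0.05,
                "surprise": 0.45,
            }
        },
    }
])

def extract_emotions(response_text):
    """Return the emotion dict for the first detected face, or None."""
    faces = json.loads(response_text)
    if not faces:  # the API sometimes finds no face at all
        return None
    return faces[0]["faceAttributes"]["emotion"]

emotions = extract_emotions(sample_response)
print(max(emotions, key=emotions.get))  # the dominant emotion for this frame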
This is where things got interesting.
- I had to clean up the data a bit. The API wasn’t perfect: sometimes it wouldn’t detect a face at all, and the emotion scores could jump around wildly from one frame to the next.
- So I smoothed the emotion scores over time with a moving average. That way you don’t get crazy spikes and dips that are probably just noise.
- Then, I plotted the emotion scores over time, alongside the video itself. I used Matplotlib for the plotting. Simple line graphs for each emotion.
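The smoothing step is just a windowed average over each emotion’s per-frame scores. A quick sketch (the window size is arbitrary, and the scores here are invented for illustration):

```python
def moving_average(scores, window=5):
    """Smooth a list of per-frame scores with a simple moving average."""
    smoothed = []
    for i in range(len(scores)):
        lo = max(0, i - window + 1)   # shrink the window at the start
        chunk = scores[lo:i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

surprise = [0.1, 0.1, 0.9, 0.1, 0.1]  # a one-frame spike, probably noise
print(moving_average(surprise, window=3))
```

The spike gets spread out and damped, which is exactly what you want before plotting.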
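The plotting is one line per emotion against frame index. A minimal Matplotlib sketch, again with made-up smoothed scores rather than my actual data:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; drop this line to show a window
import matplotlib.pyplot as plt

# Made-up smoothed scores: one list per emotion, one value per frame.
emotion_series = {
    "surprise": [0.1, 0.2, 0.6, 0.5, 0.3],
    "contempt": [0.1, 0.1, 0.4, 0.5, 0.4],
    "sadness":  [0.0, 0.1, 0.1, 0.2, 0.1],
}

fig, ax = plt.subplots()
for name, scores in emotion_series.items():
    ax.plot(range(len(scores)), scores, label=name)
ax.set_xlabel("frame")
ax.set_ylabel("emotion score")
ax.legend()
fig.savefig("emotions.png")
```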
And… well, the results were kinda what you’d expect. As Federer smashes the racket, Kyrgios’ “surprise” and “contempt” scores definitely spiked. There was a little bit of “sadness” too, which I thought was interesting. Maybe he felt a little bad for Roger?
It wasn’t perfect, mind you. The AI isn’t going to tell you exactly what Kyrgios was thinking. But it did give a pretty decent, objective measure of how his facial expressions changed during that moment.
Learned a few things along the way:
- AI emotion detection is cool, but still needs a human touch to interpret the results.
- FFmpeg is your friend for video manipulation.
- Python is awesome for gluing everything together.
Overall, it was a fun little project. Might try it again with a different video next time. Maybe some drama movie scene or something?