This talk is about pursuing a dumb idea to its extremes. In this case, the dumb idea was to scrape GIFs off Tumblr, play them back fullscreen, and beat-match them to music from Rdio. An instant GIF party, if you will. It’s been my hobby project for over a year, and covers a huge range of web technologies.
We’ll explore some of the things I’ve learned along the way, such as:
How inflexible and inefficient the GIF format is, yet how ubiquitous and beautiful GIFs themselves are
Writing a binary GIF parser in JS vs. using Emscripten to compile existing C code (see the first sketch after this list)
Different ways of detecting a GIF’s pacing, rhythm and mood
Repurposing CloudFront to circumvent same-origin policy problems and proxy someone else’s CDN
Synchronising audio, metadata, GIFs and resources from three external APIs within a requestAnimationFrame loop (sketched below)
The incredible depth of audio analysis information available from the EchoNest API
Using Heroku and a few lines of Node.js to refactor an external API (sketched below)
The potential of using a computer’s microphone to do ambient beat detection of music playing elsewhere in the room (see the last sketch below)
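To give a flavour of the hand-rolled JS route, here is a minimal sketch of a binary GIF parser. It assumes the file has already been fetched into an ArrayBuffer, and it only reads the header and walks the block structure to report dimensions and frame count; a real player would also decode the LZW pixel data (or hand that part to Emscripten-compiled C).

```js
// Minimal GIF header/block parser sketch. Input: an ArrayBuffer of GIF data.
function parseGif(buffer) {
  const bytes = new Uint8Array(buffer);
  const view = new DataView(buffer);

  // Header: "GIF87a" or "GIF89a".
  const signature = String.fromCharCode(...bytes.subarray(0, 6));
  if (signature !== 'GIF87a' && signature !== 'GIF89a') {
    throw new Error('Not a GIF: ' + signature);
  }

  // Logical Screen Descriptor: 16-bit little-endian width/height, packed flags.
  const width = view.getUint16(6, true);
  const height = view.getUint16(8, true);
  const packed = bytes[10];
  let pos = 13;

  // Skip the Global Color Table if its flag bit is set.
  if (packed & 0x80) pos += 3 * (1 << ((packed & 0x07) + 1));

  // Walk the block stream, counting image descriptors as frames.
  let frames = 0;
  while (pos < bytes.length) {
    const block = bytes[pos++];
    if (block === 0x3b) break;            // trailer: end of stream
    if (block === 0x21) {
      pos += 1;                           // extension: skip the label byte
    } else if (block === 0x2c) {
      frames++;
      const localPacked = bytes[pos + 8];
      pos += 9;                           // image descriptor
      if (localPacked & 0x80) pos += 3 * (1 << ((localPacked & 0x07) + 1));
      pos += 1;                           // LZW minimum code size
    } else {
      break;                              // unknown block type: give up
    }
    // Extensions and image data both end in size-prefixed sub-blocks,
    // terminated by a zero-length block.
    while (bytes[pos] !== 0) pos += bytes[pos] + 1;
    pos++;                                // the terminator itself
  }

  return { width, height, frames };
}
```

Feeding it data is just `fetch(url).then(r => r.arrayBuffer()).then(parseGif)`, provided the response is readable cross-origin, which is where the CloudFront proxying comes in.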
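The synchronisation itself can stay surprisingly small. Below is a sketch of a requestAnimationFrame loop that swaps the fullscreen GIF on each beat; it assumes `beats` is an array of beat start times in seconds (the shape beat data from an audio-analysis API like the Echo Nest takes), `audio` is a playing audio element, `gifs` is a list of preloaded GIFs, and `showGif` is a hypothetical helper that does the actual swap.

```js
// Sketch: advance through a list of beat timestamps as the track plays,
// swapping the GIF once per beat inside a requestAnimationFrame loop.
function runGifParty(audio, beats, gifs, showGif) {
  let beatIndex = 0;

  function frame() {
    const now = audio.currentTime;

    // Catch up on every beat whose start time has been crossed.
    while (beatIndex < beats.length && beats[beatIndex] <= now) {
      showGif(gifs[beatIndex % gifs.length]);
      beatIndex++;
    }

    if (!audio.ended) requestAnimationFrame(frame);
  }

  requestAnimationFrame(frame);
}
```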
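“A few lines of Node.js” really can be a few lines. This is a hedged sketch of the idea rather than the talk’s actual code: a tiny Express app (assuming Node 18+ for the global fetch, and a placeholder upstream URL and field names) that proxies an external API, reshapes its verbose response into just what the client needs, and adds the CORS header the browser wants.

```js
// Sketch: a tiny Express proxy that "refactors" an upstream API on Heroku.
const express = require('express');
const app = express();

app.get('/gifs/:tag', async (req, res) => {
  try {
    const upstream = `https://api.example.com/v1/posts?tag=${encodeURIComponent(req.params.tag)}`;
    const data = await (await fetch(upstream)).json();

    // Trim the upstream payload down to the fields the player needs.
    const gifs = (data.posts || []).map(post => ({
      url: post.image_url,
      width: post.width,
      height: post.height,
    }));

    res.set('Access-Control-Allow-Origin', '*');  // sidestep same-origin problems
    res.json(gifs);
  } catch (err) {
    res.status(502).json({ error: String(err) });
  }
});

// Heroku supplies the port via the environment.
app.listen(process.env.PORT || 3000);
```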
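And for the ambient-listening idea, a rough sketch using getUserMedia and a Web Audio AnalyserNode: it flags a “beat” whenever low-frequency energy jumps well above a running average. This is a crude peak detector to illustrate the concept, not a proper onset-detection algorithm.

```js
// Sketch: crude ambient beat detection from the microphone.
async function listenForBeats(onBeat) {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const ctx = new AudioContext();
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 256;
  ctx.createMediaStreamSource(stream).connect(analyser);

  const bins = new Uint8Array(analyser.frequencyBinCount);
  let average = 0;

  function tick() {
    analyser.getByteFrequencyData(bins);

    // Sum the lowest bins, where kick drums and bass live.
    let energy = 0;
    for (let i = 0; i < 8; i++) energy += bins[i];

    // Fire when energy spikes well above the running average.
    if (average > 0 && energy > average * 1.4) onBeat();
    average = average * 0.95 + energy * 0.05;

    requestAnimationFrame(tick);
  }

  tick();
}

// Usage: listenForBeats(() => console.log('beat'));
```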
I hope you learn something along the way too.