So what if you want to take the lovely Katy Perry and use your graphics card to melt her face? Well, you could write some crazy C/CUDA code to mash memory around, or some PyCUDA, which is just Python with the CUDA C kernels embedded as strings.
But you don't want to do that. You'd rather fight Haskell's typechecker and end up with some magic EDSL code that does what you want in a lovely functional way.
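If you've never seen a GPU EDSL in Haskell, here's a taste of the style. This is just a toy sketch of my own using the Accelerate library (`Data.Array.Accelerate`), one such EDSL, not the code this post builds: you write ordinary Haskell describing an array computation, and a backend's `run` function compiles and executes it.

    -- Toy sketch of the EDSL flavour, assuming the Accelerate library.
    -- 'Acc' values *describe* an array computation; a backend 'run'
    -- compiles and executes it.
    import qualified Data.Array.Accelerate as A
    import Data.Array.Accelerate (Acc, Vector, Z(..), (:.)(..))
    -- Reference interpreter backend; a CUDA/PTX backend's 'run' would
    -- execute the same description on the graphics card instead.
    import qualified Data.Array.Accelerate.Interpreter as I

    -- Brighten every sample of a flat greyscale image, clamped to [0,1].
    brighten :: Acc (Vector Float) -> Acc (Vector Float)
    brighten = A.map (\x -> A.min 1 (x * 1.5))

    main :: IO ()
    main = print . I.run . brighten . A.use $
             A.fromList (Z :. 4) [0.1, 0.4, 0.7, 0.9]

No pointer juggling, no kernel launches, no strings of C: just `map` over an array, typechecked end to end.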
Without further ado:
(Compile with $ ghc -threaded -main-is Katy)
(Stitch the output frames into a looping GIF with $ convert -loop 0 lol* katy.gif)