http://www.perlmonks.org?node_id=920300

wanna_code_perl has asked for the wisdom of the Perl Monks concerning the following question:

Hello fellow monks!

I have data from a hardware sensor that I want to visualize in an animated graph. The data is timecoded and captured at 60Hz.

What I'd like to create is an animated ticker tape line graph, exported as any reasonable movie format, that would essentially "replay" the data at the original capture rate, scrolling left to right.

I can draw the graph frame by frame (or as one long strip) easily enough, but I don't know how to encode those frames into video.

The datasets are large (tens of millions of frames each), so animated GIF is out. Saving all of the frames to disk and then encoding them afterwards is not an option either, and I certainly can't fit them all in RAM, so whatever solution I use will have to stream.

I'm sure there are 10e6 ways to do this, but I haven't been able to find one. :-) Ideas?

Replies are listed 'Best First'.
Re: 2d animation
by BrowserUk (Patriarch) on Aug 15, 2011 at 14:05 UTC

    Are you looking to do this on the web or desktop?

    Either way, streaming a movie of a real-time line trace would be a hugely CPU-intensive way of approaching the problem. Video encoding is expensive and is usually done off-line, not on-the-fly.

    The most bandwidth-economical method would be to transmit just the numeric values and have the graph drawn locally. Basically, each update would bit-blit the right-most (width - N) pixels of the trace left by N pixels and then draw the new values in the N-pixel strip that opens up on the right.

    Updating 60 times per second may be a little too fast across an unreliable HTTP connection, but you can easily reduce that to 30/sec by drawing two new values at a time, or to 15/sec by drawing 4 at a time, etc.

    For the web this can easily be done using JavaScript with the new HTML5 canvas element. On the desktop you could use a web browser, or one of the graphical toolkits (e.g. Tk).
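
    A minimal desktop sketch of that idea using Perl/Tk might look like the following; the canvas size, the 4-pixels-per-update step, and the fake sine-wave data source are assumptions for illustration only, not part of the suggestion above:

        #!/usr/bin/perl
        # Scrolling line trace on a Tk canvas: slide the existing trace left
        # by N pixels, then draw the newest segment at the right edge.
        use strict;
        use warnings;
        use Tk;

        my ($W, $H, $N) = (640, 200, 4);   # 4 px/update ~ 15 updates/s for 60 Hz data
        my $mw  = MainWindow->new;
        my $cnv = $mw->Canvas(-width => $W, -height => $H, -background => 'black')->pack;

        my ($t, $prev_y) = (0, $H / 2);
        $mw->repeat(66, sub {                      # roughly 15 updates per second
            my $y = $H / 2 + 80 * sin($t++ / 10);  # stand-in for the next sensor value(s)
            $cnv->move('trace', -$N, 0);           # "bit-blit": shift the whole trace left N pixels
            $cnv->createLine($W - $N, $prev_y, $W, $y, -fill => 'green', -tags => ['trace']);
            $prev_y = $y;
            # discard segments that have scrolled off the left edge
            for my $id ($cnv->find(withtag => 'trace')) {
                $cnv->delete($id) if ($cnv->coords($id))[2] < 0;
            }
        });

        MainLoop;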


    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    "Science is about questioning the status quo. Questioning authority".
    In the absence of evidence, opinion is indistinguishable from prejudice.
      I meant streaming to the encoder (e.g., ffmpeg) in the Unix pipelining sense, not streaming to the end user. Offline encoding is indeed what I want, "exported as any reasonable movie format". Sorry if that wasn't clear.

        Hm. Okay, but still, at some point someone is going to watch these movies, right?

        Drawing, compressing, shipping and displaying 60 frames a second of video in order to display 60 individual numeric values per second is a laborious, time & space consuming way of allowing them to see it. Especially if there are going to be many such videos.

        It would be far more cost-effective to construct a simple application that reads the raw timestamp/value pairs from a file and draws the scrolling line trace on the user's screen on demand. Then you give the user(s) the application once and send them the raw data files -- which will compress readily -- as they become available.

        My estimate is that 10 minutes of raw data -- say 2 x 64-bit float values per reading -- comes to 16 * 60 * 60 * 10 = 576,000 bytes, roughly 1/2 MB uncompressed, and reasonably less than half that zipped.

        The same information captured as 640x480 video will run to ~60MB. Even if you radically compress it with a lossy codec, you'll be very lucky to get much below 30-35MB. And the results would be hideously blurry, blocky and inaccurate. Effectively useless for anything other than "Ooh. Look at that!".
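
        A quick back-of-envelope comparison using the numbers above (the 24-bit RGB assumption for uncompressed video is mine):

            my $raw   = 16 * 60 * 60 * 10;              # 2 x 64-bit floats, 60 Hz, 10 minutes
            my $video = 640 * 480 * 3 * 60 * 60 * 10;   # 640x480, 24-bit RGB, 60 fps, 10 minutes
            printf "raw data:            %.2f MB\n", $raw   / 2**20;   # ~0.55 MB
            printf "uncompressed video:  %.1f GB\n", $video / 2**30;   # ~30.9 GB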

        I also seriously doubt that you'll be able to render the movies in real-time, which means you'll first have to record the data, and then play it back -- via some application that renders the scroll line trace -- so you can record that as the movie.

        So, you'll need a line tracer anyway to create the movies. Once you have that, why not just give a copy to the user(s) and let them see the actual trace rather than wait for you to produce and ship some huge, blotchy movie of it?


        Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
        "Science is about questioning the status quo. Questioning authority".
        In the absence of evidence, opinion is indistinguishable from prejudice.
        There is an FFmpeg module on CPAN that wraps the C libraries. Have you investigated it?
Re: 2d animation
by zentara (Archbishop) on Aug 15, 2011 at 15:41 UTC
    If you already have a way to capture each frame, say as a JPEG, you can use mencoder or ffmpeg to convert the frames into a movie. For example, this will take a series of JPEGs named 0000001.jpg, 0000002.jpg, etc.:
    # create a small speed-adjusted file
    mencoder "mf://*.jpg" -mf fps=60 -o output.avi -ovc lavc -lavcopts vcodec=mpeg4
    or with ffmpeg
    ffmpeg -f image2 -i image%d.jpg video.mpg
    # This command will transform all the images from the current directory
    # (named image1.jpg, image2.jpg, etc...) to a video file named video.mpg
    You can google for assemble images to avi ffmpeg

    Also see z-charcoal-video-converter for how to use these programs from Perl. In this one, I rip the movie to JPEGs and audio, process each frame to a charcoal effect, then recombine them. It handles audio too, but you don't need that. It does show how to run the commands through Perl's system and how to handle the JPEG filenames. It's preferable to put the JPEGs in a separate subdirectory, since there can be so many of them.

    Oops, I just read that you can't save the images to disk first, which rather blows my method above away; but if you can somehow feed the frames to ffmpeg in a realtime stream, you may have some luck. Google for "ffmpeg capture streaming frames". In that train of thought, see Webcam Streaming to Perl/Tk with ffmpeg.
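
    A rough sketch of that piping idea, assuming PerlMagick for the drawing and ffmpeg's image2pipe input (the frame size, rate, codec, and loop bounds are just placeholders):

        use strict;
        use warnings;
        use Image::Magick;

        # stream PNG frames straight into ffmpeg instead of saving them to disk
        open my $ffmpeg, '|-',
            'ffmpeg -f image2pipe -vcodec png -r 60 -i - -vcodec mpeg4 video.avi'
            or die "cannot start ffmpeg: $!";
        binmode $ffmpeg;

        for my $frame (0 .. 599) {                  # e.g. 10 seconds of 60 Hz data
            my $img = Image::Magick->new(size => '640x480');
            $img->ReadImage('xc:black');
            # ... draw the current window of the trace on $img here ...
            my ($blob) = $img->ImageToBlob(magick => 'png');
            print {$ffmpeg} $blob;
        }

        close $ffmpeg or warn "ffmpeg exited with status $?";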


    I'm not really a human, but I play one on earth.
    Old Perl Programmer Haiku ................... flash japh
Re: 2d animation
by zentara (Archbishop) on Aug 15, 2011 at 16:29 UTC
    An easy way to do this, if you can dedicate the whole computer to just capturing the movie, is to set up an ffmpeg screencast, where you basically capture your screen to a movie. I've tested this, and it works pretty well. I don't know if you will be able to attain a 60 fps rate, but it's worth a try. See screencasts with ffmpeg for a step-by-step example.
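
    For reference, a basic x11grab capture looks something like this (the display, screen size, and output name are whatever fits your setup):

        ffmpeg -f x11grab -r 60 -s 1024x768 -i :0.0 screencast.mkv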

    UPDATE: As I sat and thought about it, the screencast example uses the mkv video format, which has a very useful trait: you can just concat the files together to make long videos out of short ones. So, to avoid your excessive saving of frames to disk, you could save, say, 10000 frames at a time; when the counter hits 10000, you make one mkv movie segment, then discard the old frames and start the next 10000. At the end, concat all the mkv's into one movie and convert it to mp4, avi, or whatever you choose.
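
    A sketch of that batching loop (read_next_reading(), render_frame(), and the ramdisk path are hypothetical placeholders; the frame-naming scheme and codec are assumptions):

        use strict;
        use warnings;

        my $batch_size = 10_000;
        my $ramdisk    = '/mnt/ramdisk';
        my ($segment, $count) = (0, 0);

        sub flush_batch {
            return unless $count;
            my $out = sprintf 'segment_%04d.mkv', $segment++;
            system("ffmpeg -r 60 -i $ramdisk/frame_%07d.png -vcodec libx264 $out") == 0
                or die "ffmpeg failed: $?";
            unlink glob "$ramdisk/frame_*.png";   # discard the encoded frames
            $count = 0;
        }

        # read_next_reading() and render_frame() stand in for your data source
        # and your existing frame-drawing code
        while (defined(my $reading = read_next_reading())) {
            render_frame($reading, sprintf("$ramdisk/frame_%07d.png", $count++));
            flush_batch() if $count == $batch_size;
        }
        flush_batch();

        # join the segments as suggested above, then convert the result
        # (mkvmerge -o full.mkv seg1.mkv + seg2.mkv ... is a more robust append)
        system('cat segment_*.mkv > full.mkv');
        system('ffmpeg -i full.mkv -vcodec mpeg4 full.avi');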

    Ideas come to me in spurts, sorry I spread my answer out over 3 posts. :-)


    I'm not really a human, but I play one on earth.
    Old Perl Programmer Haiku ................... flash japh

      Ah, yes, excellent point. I think this will work.

      In my early tests, the PNG frame export brought a SAS RAID 0 array to its knees, but if I partition as you suggest, I can easily save 200000 frames to a ramdisk (the render machines have 32GB).

      As a bonus, I pretty much get a distributed solution for free. I can chunk and distribute the (small) data file across the render farm, have each machine render 200k frames at a time and send back the small mkv, and then trivially cat the result on the master.

      So, it's actually a good thing you posted twice, because I can give you two richly deserved upvotes. :-)

        Thanks again to everyone who replied.

        I was able to easily modify the graphing routine to output frames to a ramdisk in configurable batches. The PerlMagick-based output loop writes frames at about 720 fps on our render machines (dual Xeons with 32GB RAM). Every 10000 frames (batch size -- within reason -- didn't make a significant difference), I call a sub that runs ffmpeg, which encodes at about the same rate. A typical 29.97 fps project therefore encodes at about 12x realtime.

        Everything is still single threaded, and I haven't optimized anything yet. I will probably tinker with the old thread code and the FFmpeg library for interest's sake. But 12x is already more than good enough for our needs (1/2x would have been acceptable), so I plan to keep it simple.

        Even at modest bitrates, the lines are sharp and any artifacts are barely perceptible, if at all, to the technicians who look at these graphs/videos for a living; they do not affect the ability to interpret the data whatsoever.

Re: 2d animation
by luis.roca (Deacon) on Aug 16, 2011 at 05:12 UTC

    If you haven't already, you may want to look into Processing (not the JavaScript version but the original created by Casey Reas and Ben Fry). The software and language excel at the kind of 2D data visualization you are looking to do. Although it's been a while since I've read the case studies, IIRC there have been more than a few projects combining video with synced, data-driven art. Also IIRC, you can build your data in Perl and pull it into the software.

    The Reas/Fry Processing text book has some decent case studies and a quick search for specific examples on Processing's Vimeo Channel may be worthwhile.

    Good luck. I'm interested to hear how it turns out. :)

     

    UPDATE: Tuesday the 16th of August 2011: Added 'Additional References' and link to the Processing channel on Vimeo.

    "...the adversities born of well-placed thoughts should be considered mercies rather than misfortunes." — Don Quixote

      Thanks for the tip! Hadn't heard of Processing before. Definitely worth looking into, at least on future projects.

      Sure, I'll post a quick update once I've gotten a little farther along.