{"id":690,"date":"2012-04-29T17:43:27","date_gmt":"2012-04-29T22:43:27","guid":{"rendered":"http:\/\/osric.com\/chris\/accidental-developer\/?p=690"},"modified":"2016-06-11T13:55:16","modified_gmt":"2016-06-11T18:55:16","slug":"using-ffmpeg-to-programmatically-slice-and-splice-video","status":"publish","type":"post","link":"https:\/\/osric.com\/chris\/accidental-developer\/2012\/04\/using-ffmpeg-to-programmatically-slice-and-splice-video\/","title":{"rendered":"Using FFmpeg to programmatically slice and splice video"},"content":{"rendered":"<p>My wife has a research project in which she needs to analyze brief (8-second) segments of hundreds of much longer videos. My goal was to take the videos (~30 minutes each) and cut out only the relevant sections and splice them together, including a static marker between each segment. This should allow her and her colleagues to analyze the videos quickly and using precise time-points (instead of using a slider in a video player to locate and estimate time-points). I&#8217;ve posted my notes from this process below for my own reference, and in case it should prove useful to anyone else.<\/p>\n<p>To my knowledge, the best tool for the job is <a href=\"http:\/\/ffmpeg.org\/\">FFmpeg<\/a>, an open source video tool. <!--more--> FFmpeg provides much of the underlying processing functionality for other popular video tools, such as <a href=\"http:\/\/handbrake.fr\/\">Handbrake<\/a>, <a href=\"http:\/\/www.mirovideoconverter.com\/\">Miro<\/a>, and <a href=\"http:\/\/www.mplayerhq.hu\">MPlayer<\/a>. Since you compile FFmpeg from source code, it should run on any system with a C compiler, including my OS X box. Unfortunately, it&#8217;s not the most user-friendly software package in the world.<\/p>\n<p>The developers recommend using the latest version from <a href=\"http:\/\/git-scm.com\/\">Git<\/a> (which finally forced me to install Git, something I&#8217;d been meaning to do anyway). 
One <a href=\"http:\/\/stephenjungels.com\/jungels.net\/articles\/ffmpeg-howto.html\">how-to for FFmpeg on OS X<\/a> suggested that I&#8217;d also need <a href=\"http:\/\/lame.sourceforge.net\/\">LAME<\/a> in order to process audio. (The audio is irrelevant for my use case, so I didn&#8217;t bother with LAME.) But I couldn&#8217;t compile FFmpeg because, apparently, OS X doesn&#8217;t include GCC. To get GCC on OS X the official Apple way, I needed Xcode from the <a href=\"https:\/\/developer.apple.com\/technologies\/tools\/\">Apple Developer Tools<\/a>. To get those you have to sign up for an Apple Developer account. Welcome to Dependency Hell, now featuring bureaucracy!<\/p>\n<p>Hours later, I&#8217;d compiled FFmpeg. However, the videos I was dealing with were raw H.264 video files from a <a href=\"http:\/\/www.nightowlsp.com\/Products\/Complete-Kits\/K-44500-C\">Night Owl K-44500-C<\/a> surveillance system. Also, I wanted to save them as H.264-encoded MP4 files. That meant I needed additional H.264 support&#8211;or, at least, I thought I did&#8211;from the <a href=\"http:\/\/www.videolan.org\/developers\/x264.html\">x264<\/a> project. 
And you need <a href=\"http:\/\/yasm.tortall.net\/\">YASM<\/a> to compile x264.<\/p>\n<p>I was actually unable to compile YASM from the Git repository, although I was able to compile it following the instructions in this <a href=\"https:\/\/trac.handbrake.fr\/wiki\/CompileGuide\">Handbrake for Mac OS X guide<\/a>.<\/p>\n<p>I recompiled FFmpeg with the --enable-libx264 and --enable-gpl switches:<br \/>\n<code>.\/configure --enable-libx264 --enable-gpl<br \/>\nmake<br \/>\nsudo make install<\/code><\/p>\n<p>To select just the relevant portions of the video, I used the -ss (start\/seek position) and -t (time\/duration) flags, e.g.:<br \/>\n<code>ffmpeg -f h264 -i input-video-file.264 -ss 180 -t 8 output-video-file.mp4<\/code><\/p>\n<p>The above example takes 8 seconds of the input video starting at the 3-minute (180-second) mark.<\/p>\n<p>However, when I played back the output, it played much too fast. The source videos included a timestamp, and about 4 seconds ticked by for every second of video! It turned out that the source videos were recorded at 7 fps (frames per second). I added a flag to specify the framerate:<br \/>\n<code>ffmpeg -f h264 -r:v 7 -i input-video-file.264 -ss 180 -t 8 output-video-file.mp4<\/code><\/p>\n<p>After running this a few times for different 8-second segments, I needed to put the segments back together again. This is a common enough use case that FFmpeg has instructions in its FAQ: <a href=\"http:\/\/ffmpeg.org\/faq.html#How-can-I-join-video-files_003f\">How can I join video files?<\/a> The first method&#8211;concatenating MPEG-2 files&#8211;seemed like the easiest option. However, MPEG-2 only supports a fixed set of framerates, and 7 fps is not among them.<\/p>\n<p>I tried the other suggested method of concatenating videos, using named pipes. 
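<\/p>
<p>The FAQ&#8217;s named-pipe method boils down to this: several producer processes each write an intermediate stream into a FIFO, and a single consumer reads the FIFOs in order to produce the spliced stream. Here is a minimal sketch of just the pipe mechanics, with printf and cat standing in for the ffmpeg producers and consumer (the filenames are examples, not from my script):<\/p>

```shell
# Sketch of the named-pipe splice pattern from the FFmpeg FAQ, with printf
# and cat standing in for the ffmpeg producers and consumer (all filenames
# here are examples).
mkfifo seg1.fifo seg2.fifo

# Each "producer" writes one segment into its FIFO; the write blocks until
# a reader opens the other end, so the producers run in the background.
printf 'clip-one\n' > seg1.fifo &
printf 'clip-two\n' > seg2.fifo &

# The single "consumer" reads the FIFOs in order, producing the spliced stream.
cat seg1.fifo seg2.fifo > spliced.txt
wait

rm seg1.fifo seg2.fifo
```

<p>In the real version, each producer is an ffmpeg process writing an intermediate MPEG stream into its FIFO, and the consumer is a single ffmpeg reading the concatenated stream from standard input. 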
This worked, although the BASH script was convoluted and very particular.<\/p>\n<p>Another thing I wanted to add to the video was a separator&#8211;some static frames to divide each 8-second clip&#8211;and a title card. At first I created a JPEG and turned it into a 1-frame video, concatenated it with itself to create a 2-frame video, and then 4, 8, 16, etc. However, I discovered a much easier method of creating a video from a single image using the loop flag:<br \/>\n<code>ffmpeg -f image2 -loop 1 -r:v 7 -i image.jpeg -pix_fmt yuv420p -an -t 2 image-movie.mpeg<\/code><\/p>\n<p>(The -pix_fmt flag sets the correct color space, and the -an flag disables the audio channel.)<\/p>\n<p>Now I had title cards, clip separators, and 8-second video clips that I could combine into a single video. But I needed to do this hundreds of times! I wrote a Python script to generate the appropriate BASH script based on the input filename. The BASH script would create the title cards and clip separators using <a href=\"http:\/\/www.imagemagick.org\">ImageMagick<\/a>, and then call the appropriate FFmpeg commands to create and concatenate the video.<\/p>\n<p>The ImageMagick commands look like this:<br \/>\n<code>convert -size 704x480 -background SteelBlue1 -fill black -font Helvetica -pointsize 72 -gravity center label:[Video Title] titlecard.jpg<\/code><\/p>\n<p>Then I used the find command to run the processing script on all the video files:<br \/>\n<code>find *.264 -maxdepth 1 -exec .\/process.sh '{}' \\; -print<\/code><\/p>\n<p>Eureka! It worked. Almost perfectly.<\/p>\n<p><em>Almost.<\/em><\/p>\n<p>Some (but not all) of the output videos would produce solid gray frames after a certain time point. 
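<\/p>
<p>The script-generation step above can be sketched directly in shell as well&#8211;a loop that writes one ffmpeg command per clip into a helper script. This is only an illustration of the idea: the start times and filenames here are invented, and my actual version was a Python script driven by the input filename:<\/p>

```shell
# Hypothetical sketch of the generation step: given an input video and a
# list of start times (in seconds), emit one ffmpeg command per 8-second
# clip into a helper script. Time points and filenames are invented.
INPUT="input-video-file.264"
i=0
for start in 180 414 1022; do
  i=$((i + 1))
  printf 'ffmpeg -f h264 -r:v 7 -i %s -ss %s -t 8 clip-%02d.mp4\n' \
    "$INPUT" "$start" "$i"
done > process-clips.sh
chmod +x process-clips.sh
```

<p>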
When I reviewed the FFmpeg output for those files, I found an error:<br \/>\n<code>[h264 @ 0x9460340] FMO not supported<\/code><\/p>\n<p>FMO stands for Flexible Macroblock Ordering, and based on the response to a <a href=\"http:\/\/roundup.libav.org\/issue1440\">libavcodec issue from 2009<\/a>, the FFmpeg developers don&#8217;t plan to support it (although one of the developers suggested that, if someone would like to create a software patch to enable FMO support, the community would welcome it).<\/p>\n<p>I wrote to technical support for the camera system and asked if they had any suggestions. They replied that, although conversion to MPEG formats was not supported, they do provide a conversion utility to convert to AVI. I was able to convert the AVIs to MP4s successfully. An annoying extra step, but one that is only necessary in a subset of cases.<\/p>\n<p>This solution took me several weeks to figure out, although it should save quite a bit of time in the long run.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>My wife has a research project in which she needs to analyze brief (8-second) segments of hundreds of much longer videos. My goal was to take the videos (~30 minutes each) and cut out only the relevant sections and splice them together, including a static marker between each segment. 
This should allow her and her &hellip; <a href=\"https:\/\/osric.com\/chris\/accidental-developer\/2012\/04\/using-ffmpeg-to-programmatically-slice-and-splice-video\/\" class=\"more-link\">Continue reading <span class=\"screen-reader-text\">Using FFmpeg to programmatically slice and splice video<\/span><\/a><\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[60],"tags":[197,245,244,249,250,251,358,243],"class_list":["post-690","post","type-post","status-publish","format-standard","hentry","category-video","tag-bash","tag-dependency-hell","tag-ffmpeg","tag-h-264","tag-imagemagick","tag-mp4","tag-python","tag-video-2"],"_links":{"self":[{"href":"https:\/\/osric.com\/chris\/accidental-developer\/wp-json\/wp\/v2\/posts\/690","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/osric.com\/chris\/accidental-developer\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/osric.com\/chris\/accidental-developer\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/osric.com\/chris\/accidental-developer\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/osric.com\/chris\/accidental-developer\/wp-json\/wp\/v2\/comments?post=690"}],"version-history":[{"count":8,"href":"https:\/\/osric.com\/chris\/accidental-developer\/wp-json\/wp\/v2\/posts\/690\/revisions"}],"predecessor-version":[{"id":1515,"href":"https:\/\/osric.com\/chris\/accidental-developer\/wp-json\/wp\/v2\/posts\/690\/revisions\/1515"}],"wp:attachment":[{"href":"https:\/\/osric.com\/chris\/accidental-developer\/wp-json\/wp\/v2\/media?parent=690"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/osric.com\/chris\/accidental-developer\/wp-json\/wp\/v2\/categories?post=690"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/osric.com\/chris\/accidental-developer\/wp-json\/wp\/v2\/tags?pos
t=690"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}