I am horrible at encoding and capturing video, but I got a new mountain bike and decided it was time to turn the spare Pi from a Mumble server into a portable video camera.
From my experience, capturing streaming video at any decent frame rate is a challenge without proper GPU hardware encoding. Here is what I tried.
I used a Logitech C310 webcam – about $30. The C310 works very well with the Pi, and I have been able to run it without a powered USB hub (granted, it is the only USB device attached). It is happy outputting YUYV and MJPEG video at the following resolutions:
352×288 = default
MJPEG video will also do: 640×480, 752×416, 800×448, 800×600, 864×480, 1280×720
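If you want to double-check what your own camera advertises, v4l2-ctl can list the supported frame sizes per format. A sketch, assuming the v4l-utils package is installed and the camera is /dev/video0:

```shell
# list the frame sizes the camera advertises for MJPEG
# (assumes v4l-utils is installed and the camera is /dev/video0)
v4l2-ctl -d /dev/video0 --list-framesizes=MJPG
```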
First, I started playing with Motion due to the ease of install. It is pretty much ready to run out of the box. Motion is geared toward remotely monitoring for (guess what) motion.
The default settings are pretty much ready to go with the Logitech C310, but you do have to start the application yourself or change the defaults to have it run as a daemon on startup.
# to start or stop Motion
/etc/init.d/motion start
/etc/init.d/motion stop
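On Debian-style installs, if I recall correctly, the init script also checks a flag in /etc/default/motion before it will daemonize at boot (verify the exact variable name on your system):

```
# /etc/default/motion (Debian/Raspbian)
start_motion_daemon=yes
```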
I would normally run Motion in the “no create output file mode” so I could watch what errors my tweaks were generating:
The configuration files with the default installation from apt-get on Wheezy are located at:
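To watch those errors live, Motion can be run in the foreground instead of as a daemon. A sketch, assuming the Debian default config path and Motion's -n (non-daemon) flag:

```shell
# run Motion in the foreground so warnings and errors print to the console
# -n = non-daemon mode; the config path is the Debian default, verify locally
motion -n -c /etc/motion/motion.conf
```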
It does exactly what it says it does, and does it well. Run the daemon and it will capture X pictures or X seconds of video after it detects movement. The motion-detection threshold is adjustable.
Through the configuration file you can enable a webserver to remotely monitor the MJPEG stream, enable a second webserver to modify the configuration over the web, and, most importantly for my goal, have the streaming output encoded and saved by ffmpeg.
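For reference, in the 3.2.x Motion that Wheezy shipped, the relevant knobs looked roughly like this. Option names have changed between Motion versions, so treat this as a sketch and check your own motion.conf:

```
# motion.conf sketch (Motion 3.2.x option names; verify against your version)
threshold 1500        # pixels that must change before motion is detected
webcam_port 8081      # live MJPEG stream for remote monitoring
control_port 8080     # web interface for changing settings on the fly
ffmpeg_cap_new on     # have ffmpeg encode and save video of detected motion
```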
I turned off everything possible within Motion and on the Pi, but I still could not get past 10–15 fps. I did take this version on a couple of bike rides just to see what it looked like, and it was pretty much a giant fail for video.
One note – to get around the lack of a start/stop button and an easy way to connect to the Pi in the field, I added a startup script that killed Motion after 5 minutes and shut down the Pi.
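The script itself was nothing fancy. A sketch of that helper (the filename is made up, and it would need to be launched at boot, e.g. from /etc/rc.local):

```shell
# write the auto-stop helper: record for five minutes, then stop Motion
# and power the Pi down (launch this at boot, e.g. from /etc/rc.local)
cat > auto_stop.sh <<'EOF'
#!/bin/sh
sleep 300               # give Motion five minutes to record
/etc/init.d/motion stop # stop Motion cleanly
shutdown -h now         # then power the Pi off
EOF
chmod +x auto_stop.sh
```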
So Motion, while awesome, was not going to work. I decided to play directly with ffmpeg next; the thought was that maybe I could squeeze something more out of it.
FFmpeg is an extremely versatile tool when it comes to encoding video. It is used by a lot of commonly used video converters (like HandBrake).
My first attempts were simple – I just wanted to get 30 seconds of OK video.
ffmpeg -f video4linux2 -i /dev/video0 -r 30 -t 00:00:35 /home/pi/webcam`date +%j%H%M%S`.mpg
Again, no luck at the high frame rates I needed. So I thought: maybe if I cut down on the encoding/resizing/reformatting, I could make it go faster. The camera already outputs YUYV and MJPEG, so why convert it at all – why not just capture it straight to disk? First, I double-checked the output formats:
sudo v4l2-ctl --list-formats-ext

ioctl: VIDIOC_ENUM_FMT
	Index       : 0
	Type        : Video Capture
	Pixel Format: 'YUYV'
	Name        : YUV 4:2:2 (YUYV)
	...
	...
	Index       : 1
	Type        : Video Capture
	Pixel Format: 'MJPG' (compressed)
	Name        : MJPEG
Then, after consulting ‘man ffmpeg’ in Google, I tried:
ffmpeg -f video4linux2 -vcodec mjpeg -i /dev/video0 raw.mjpg
ffmpeg -f video4linux2 -r 30 -i /dev/video0 -vcodec rawvideo raw.yuv
Both came out jittery. Looks like I will have to do more research and come back to this.
For those wanting to play the raw video: Rather than encoding it after the initial capture, I used vlc to play it directly with:
vlc --demux rawvideo --rawvid-fps 30 --rawvid-width 320 --rawvid-height 240 --rawvid-chroma=YUY2 --rawvid-aspect-ratio 16:9 t:\raw.yuv
I figured out the proper chroma/pixfmt setting from the vlc-devel site.
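If you would rather transcode the raw YUYV capture into something portable after the fact, ffmpeg can read it back in as rawvideo. A sketch, where the pixel format, frame size, and frame rate are assumptions that must match how the file was captured:

```shell
# wrap the raw YUYV frames in a real container/codec after the fact
# pix_fmt, -s, and -r must match the original capture or the output is garbage
ffmpeg -f rawvideo -pix_fmt yuyv422 -s 320x240 -r 30 -i raw.yuv -c:v mpeg4 raw.mp4
```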
If anyone has some other insights, I would love to hear them.