Raspberry Pi record video to file and pipe frames

I’m trying to record a video to a file while simultaneously extracting frames from the video stream, on a Raspberry Pi with a Pi Cam.
If anyone has a better idea on how to do this, please let me know.
Here’s what I have now:

w,h = (640, 480) # image dimensions
bytesperframe = w * h # number of bytes in the image
camera_cmd = `raspivid -w $w -h $h --output - --timeout 0 --nopreview` # calling raspivid and piping out the stream
tee_cmd = `tee test_video.h264` # split the stream into 2, one gets saved to file
frame_cmd = `ffmpeg -f h264 -i pipe: -r 1 -f image2pipe -` # and one gets processed by ffmpeg for me to extract frames from
io = open(pipeline(camera_cmd, tee_cmd, frame_cmd)) # start it all
bytes = read(io, bytesperframe) # read one frame
img = reshape(bytes, w, h) # reshape
read!(io, img) # subsequent frames: read in place, reusing the buffer (faster)

The problem with this is that whenever I close(io) I get Broken pipe errors, which makes sense, because how on earth could this piped process stop elegantly? The consequence is that the video file is sometimes corrupt or only partially saved.
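
One idea (only a sketch, not something I’ve verified on the Pi): instead of close(io), first ask raspivid to stop, so that tee and ffmpeg see end-of-stream and can flush the file before the pipe is torn down. This assumes pkill is available and that raspivid finalizes cleanly on SIGINT:

function stop_pipeline(io)
    run(`pkill -INT raspivid`) # stop the producer; tee/ffmpeg should drain and exit on EOF
    sleep(1)                   # give them a moment to flush test_video.h264
    close(io)                  # closing our end should no longer break the pipe
end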

My goal is to be able to have a video file recording while at the same time accessing (in “real time”) frames from that video so that I can process them. The results of that processing are then served with JSServe for diagnostics viewed by users, again in “real time” (by which I mean within a second or so).

Thanks in advance!

The solution seems to be using VideoIO.appendencode, where I can grab images off of the camera, do what I will with them, and encode them to a video file at the same time. Brilliant! But… the encoding process on the RPi is about 60 times slower than on a PC, resulting in a frame rate of about 2 FPS. More details here:
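
For reference, this is roughly the loop I mean (a minimal sketch based on the older VideoIO encoding API with prepareencoder/appendencode!/finishencode!/mux; exact names and keywords may differ between VideoIO versions, and process! is a placeholder for whatever per-frame analysis is needed):

using VideoIO

cam = VideoIO.opencamera()               # grab frames straight from the camera
img = read(cam)                          # first frame, used to set up the encoder
props = [:priv_data => ("crf" => "22", "preset" => "ultrafast")]
encoder = prepareencoder(img, framerate = 5, AVCodecContextProperties = props, codec_name = "libx264")
io = Base.open("temp.stream", "w")
for i in 1:100                           # or a `while recording` loop
    read!(cam, img)                      # next frame
    process!(img)                        # placeholder: do what I will with it
    appendencode!(encoder, io, img, i)   # append it to the raw video stream
end
finishencode!(encoder, io)
close(io)
mux("temp.stream", "video.mp4", 5)       # wrap the raw stream in an mp4 container
close(cam)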

I ran into the same problem, and have been wondering if it is possible to just stream raw video data to a server (eg on the same WLAN) that does the encoding, but I have not figured this out yet.
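
Something like this is what I have in mind, transport-wise (only a sketch: the address, port, and frame geometry are made up, the camera capture and the actual encoder are left out, and I haven’t tried it):

using Sockets

const W, H = 640, 480 # assumed raw frame size, 1 byte per pixel

# on the Pi: forward raw frames from some source `io` (e.g. a camera pipe) to the server
function send_frames(io, host = ip"192.168.1.10", port = 2000)
    sock = connect(host, port)
    frame = Vector{UInt8}(undef, W * H)
    while !eof(io)
        read!(io, frame)
        write(sock, frame)
    end
    close(sock)
end

# on the server: accept a connection and hand each frame to an encoder callback
# (VideoIO, or an ffmpeg process reading rawvideo on its stdin)
function receive_frames(encode!, port = 2000)
    conn = accept(listen(IPv4(0), port)) # listen on all interfaces
    frame = Vector{UInt8}(undef, W * H)
    while !eof(conn)
        read!(conn, frame)
        encode!(frame)
    end
end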

Hi @Tamas_Papp!

I’ve made some progress here, but not quite all the way. First, here is a functioning example of how to save a video file while at the same time saving the last frame to an image file (in these examples I crop the image to a square and resize it; you can ignore those parts):

using FFMPEG_jll

const SZ = 640 # width and height of the images
const FPS = 5 # frames per second
const IMGPATH = "frame.png" # this is the image file of the last frame

start_camera(file) = ffmpeg() do exe
    run(`$exe -y -hide_banner -loglevel error -f v4l2 -i /dev/video0 -filter_complex '[0:v]crop=in_h:in_h,split=2[out1][out2]' -map '[out1]' $file -map '[out2]' -s $(SZ)x$SZ -r $FPS -update 1 $IMGPATH`, wait = false)
end

p = start_camera("tmp.mp4")
sleep(10) # record for 10 seconds
kill(p) # finish

This works very well, but it writes each frame to disk, so there are a lot of unnecessary IO operations.

Instead, here is an in-memory version, but there’s a catch at the end (the following readpngdata was shamelessly taken from @Per’s readpngdata):

using FFMPEG_jll, FileIO

# Read one complete PNG image from `io` and return its raw bytes.
# The stream is parsed chunk by chunk (a chunk is 4-byte length + 4-byte type
# + data + 4-byte CRC) and accumulated until the IEND chunk ends the image.
function readpngdata(io)
    blk = 65536
    a = Array{UInt8}(undef, blk)
    readbytes!(io, a, 8) # the 8-byte PNG signature
    if view(a, 1:8) != magic(format"PNG")
        error("Bad magic.")
    end
    n = 8
    while !eof(io)
        if length(a) < n + 12
            resize!(a, length(a) + blk)
        end
        readbytes!(io, view(a, n+1:n+12), 12) # length + type + the next 4 bytes of the chunk
        m = 0
        for i = 1:4
            m = m << 8 + a[n+i] # big-endian data length of this chunk
        end
        chunktype = view(a, n+5:n+8)
        n = n + 12
        if chunktype == codeunits("IEND")
            break # IEND terminates the image
        end
        if length(a) < n + m
            resize!(a, max(length(a) + blk, n + m + 12))
        end
        readbytes!(io, view(a, n+1:n+m), m) # the remaining m bytes of the chunk (data + CRC)
        n = n + m
    end
    resize!(a, n)
    return a
end

const SZ = 640 # width and height of the images
const FPS = 5 # frames per second
const IMG = Ref{Vector{UInt8}}() # a container for the last frame

start_camera(file) = ffmpeg() do exe
    io = open(`$exe -y -hide_banner -loglevel error -f v4l2 -s 1920x1080 -i /dev/video0 -filter_complex '[0:v]crop=in_h:in_h,split=2[out1][out2]' -map '[out1]' $file -map '[out2]' -s $(SZ)x$SZ -r $FPS  -vcodec png -f image2pipe -`)
    @async while process_running(io) # repeatedly update the IMG container with the last frame
        IMG[] = readpngdata(io)
    end
    return io
end

p = start_camera("tmp.mp4")

While this works, I cannot close, kill, or terminate the process gracefully, and all of my attempts to do so (calling kill(p) twice terminates it) result in a corrupt video file.

I’ll keep stabbing at this, but if you can figure out a way to terminate this Process, that would be golden. Maybe there is an intrinsic difficulty in terminating a process whose output is split like that (it’s split by what I think is ffmpeg’s version of tee; see more details here), or maybe it’s solvable. Alternatively, if we could somehow send a 'q' to the process then it should terminate gracefully (but I’m not sure about that, since this process comes from an open, not a run).
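
Here is a sketch of the 'q' idea (untested with the split filter graph, so treat it as a guess rather than a fix): open the command for reading and writing so we keep a handle on ffmpeg’s stdin, then ask it to quit by sending its interactive quit key:

start_camera2(file) = ffmpeg() do exe
    cmd = `$exe -y -hide_banner -loglevel error -f v4l2 -s 1920x1080 -i /dev/video0 -filter_complex '[0:v]crop=in_h:in_h,split=2[out1][out2]' -map '[out1]' $file -map '[out2]' -s $(SZ)x$SZ -r $FPS -vcodec png -f image2pipe -`
    io = open(cmd, "r+") # "r+": read its stdout and write to its stdin
    @async while process_running(io) # same frame-grabbing loop as above
        IMG[] = readpngdata(io)
    end
    return io
end

function stop_camera(io)
    write(io, "q") # ffmpeg treats a lone 'q' on stdin as "quit and finalize the output"
    flush(io)
    wait(io)       # let it flush the mp4 before moving on
end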

Any feedback about this would be greatly appreciated!