How can I stream H.264 video from the Raspberry Pi camera module via a web server?

  • So I got the Raspberry Camera today and got stills working fine.

    Capture an image in JPEG format:

    raspistill -o image.jpg

    Capture a 5-second video in H.264 format:

    raspivid -o video.h264

    I do not want to install any extra applications, as I want to leverage HTML5, which is readily available. Since Chrome and Safari have built-in decoders for H.264, I just want to point my browser to the URL and watch the stream.

    How can I achieve this?

    I'm working on this, too. I _think_ you need to add MP4 support to nginx or something like that. Will let you know if I have a breakthrough.

    @recantha Have you had any new breakthroughs with streaming video?

    The best solution I've found is based on Silvan Melchior's RaspiMJPEG. Take a look at my blog, which contains a link to the Raspberry Pi Foundation's forum thread that explains everything.

    Yea, that looks awesome, being able to stream to various devices. What FPS and lag do you get? I did manage to get uv4l to work with VLC pretty well, plus OSD. A very short and bad demo - will make a better one soon. It was made late at night after hours of trial and error.

    @ppumkin How can I record through a Python script while RaspiMJPEG is running? It gives a start recording option, but it records in .h264 format. How can I make a Python script run on pressing start_recording?

    I think using the Video for Linux 2 (`v4l2`) API driver (official one from the Raspberry Pi Foundation) to stream data straight from the camera is better than using `raspivid`. I'm trying a udp stream with cVLC (`sudo apt-get install vlc`), using memory mapped I/O (mmap) options for `v4l2-ctl` and I would convert this to a `mpeg-dash` stream on the Pi to view the video over HTTP in a web browser.

    Also, there's a useful Raspberry Pi Forums page at ("v4l2 does not work with cVLC"), with some help about streaming the camera with the command line VLC tool and Video for Linux API driver.

    So the `v4l2` "driver" used here is not an official API from the Raspberry Pi Foundation?

    V4L is a standard API for video capture that's part of the Linux kernel, but a driver is needed to get it working on Raspberry Pis with their camera module. It was started by community users, and then an official one was made by the Foundation; information can be found on one of their forum pages.

  • Streaming with HLS

    Apple's proprietary method of streaming live video is called HTTP Live Streaming (HLS), and it is mainly supported by Apple's technology. Google (Chromium / YouTube) uses its own implementation called MPEG-DASH, and everybody else is either confused or using H.264 encapsulated in MP4.


    Pros:

    • Can stream HD 1080p on LAN to any device that supports .m3u8 playlists
    • Uses HTML5 semantics (but it is not a standardised format)
    • Some support in third-party premium software, such as JW Player 6


    Cons:

    • Has a delay of at least 5 seconds (in this application; mirroring from iPhone to Apple TV somehow achieves 50 ms - 500 ms), so it's not good for remote-controlled applications where instant reactions are required, i.e. robots or helicopters.
    • You have to pay for third-party software if you want broader browser support, which may involve Flash.


    • .m3u8 is simply a UTF-8 version of the M3U format. (.m3u files can have various encodings.) Some people claim that renaming a .m3u8 to .m3u will work as expected on all HTML5 browsers. I tried this, and it did not work for me.

    The concept behind this streaming is that short segments of video, at least 5 seconds long (in this example; it's possible newer methods can speed this up), are recorded and saved to individual files. The playlist file is updated with each new file name, and the client constantly polls this playlist and downloads the most recent file. Some mechanics are involved to merge the video seamlessly on the client. This is why other developers do not want to implement it: it requires a lot of effort and does not comply with HTML5 standards (even though there is no proper HTML5 standard for live streams... ehh, sigh).
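    To make the playlist mechanism concrete, here is a sketch of what a rolling live .m3u8 playlist looks like at one moment in time (segment names, durations and sequence numbers are illustrative):

    ```text
    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:4
    #EXT-X-MEDIA-SEQUENCE:120
    #EXTINF:4.000,
    segments/00000120.ts
    #EXTINF:4.000,
    segments/00000121.ts
    #EXTINF:4.000,
    segments/00000122.ts
    ```

    On the next poll, #EXT-X-MEDIA-SEQUENCE has increased, the oldest segment has dropped off the list, and a new one has been appended; the client stitches consecutive segments together for seamless playback.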


    You need to compile FFmpeg yourself - do not use `apt-get install ffmpeg`.

    This can take up to 5 hours. It has to be version 1.1 or higher, which supports segment streaming. Clone it and compile it:

    cd /usr/src
    git clone git://
    cd ffmpeg
    ./configure
    make && make install
    • Install nginx (engine-x) - nginx was specially designed for embedded devices and is the lightest and fastest PHP-enabled web server available at the moment. (Yes, it is better than bulky Apache.)
    • Create a directory, for example, live in your www folder, /usr/share/nginx/www/

    Make a Bash script, apply `chmod +x` to it, and paste this in. Change the base folder to wherever your HTTP server lives. I used nginx, /usr/share/nginx/www/

    #!/bin/bash
    # base must point at the "live" directory created above; I used nginx
    base=/usr/share/nginx/www/live

    cd "$base"
    mkdir -p segments
    # clean up the playlist and segments when the script exits
    trap "rm -f stream.m3u8 segments/*.ts" EXIT

    raspivid -n -w 720 -h 405 -fps 25 -vf -t 86400000 -b 1800000 -ih -o - \
    | ffmpeg -y \
        -i - \
        -c:v copy \
        -map 0:0 \
        -f ssegment \
        -segment_time 4 \
        -segment_format mpegts \
        -segment_list "$base/stream.m3u8" \
        -segment_list_size 720 \
        -segment_list_flags live \
        -segment_list_type m3u8 \
        "segments/%08d.ts"
    # vim:ts=2:sw=2:sts=2:et:ft=sh

    Create an HTML file that will load the playlist:

        <video controls="controls" width="1280" height="720" autoplay="autoplay" >
          <source src="stream.m3u8" type="application/x-mpegURL" />
        </video>


    • iPhone opens the page, but drops into QuickTime. The quality is really amazing!
    • Windows, Safari: streams fine.
    • Macintosh or Windows, QuickTime: streams fine.
    • Android 2.3.5: did not work, but it was supposed to be supported since 2.1.x.
    • Windows, Chrome: nothing.
    • Windows, Internet Explorer 10: nothing (unsupported video type).
    • Windows, VLC media player: nothing.
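    For the browsers above that show nothing (Chrome, Internet Explorer, etc.), the playlist can still be played through Media Source Extensions with a JavaScript library such as hls.js. A minimal sketch, assuming hls.js has been downloaded next to the page and stream.m3u8 is served from the same folder:

    ```html
    <video id="video" controls width="1280" height="720"></video>
    <script src="hls.js"></script>
    <script>
      var video = document.getElementById('video');
      if (video.canPlayType('application/vnd.apple.mpegurl')) {
        // Safari and iOS play HLS natively
        video.src = 'stream.m3u8';
      } else if (Hls.isSupported()) {
        // Other browsers need the MSE-based shim
        var hls = new Hls();
        hls.loadSource('stream.m3u8');
        hls.attachMedia(video);
      }
    </script>
    ```

    Note that this only helps browsers with MSE support; it does not remove the inherent HLS delay.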


    Original code:

    *In regards to speeding up ffmpeg's compilation:* to circumvent the Raspberry Pi's low computational capacity and the long compile times for ffmpeg, I attempted using QEMU with the Wheezy image, but came across some obstacles logging in and had to try an Arch image instead. This worked. I also attempted Squeeze on an Ubuntu host, through VirtualBox.

    Is there a way to automatically delete old segments? The SD card gets full after some time. I would also like them to be deleted so I can run this on a tmpfs and not ruin the SD card.

    @Dimmme If you add `-segment_wrap 10` as an argument to ffmpeg it will use max 10 segment files.

    Has anyone gotten this to work? The files are created, but they seem to be missing SPS/PPS info, so the video won't play in iOS Safari or VLC. The stream.m3u8 also didn't include `segments/` when pointing to the segment files, so I dropped the segments folder. Did I misunderstand something?

    You need to pipe the stream through the PSIPS filter binary. The newest version of raspicam was supposed to do this, but for some reason I couldn't get it working without PSIPS.

    Can ffmpeg pick up this stream?

    [stream_segment,ssegment @ 0x2387a10] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead. [stream_segment,ssegment @ 0x2387a10] Failed to open segment 'segments/00000000.ts'

    OK, fixed paths. Now I get `[stream_segment,ssegment @ 0x2503120] Timestamps are unset in a packet for stream 0. This is deprecated and will stop working in the future. Fix your code to set the timestamps properly` followed by `frame= 1941 fps= 27 q=-1.0 size=N/A time=00:01:17.60 bitrate=N/A speed=1.07x`, but the video isn't streaming in Safari or on iPhone.

    You need to pipe the stream through PSIPS first to add timestamps.


    @gregers I got this working in macOS and iOS Safari with the directions above without problem. Probably not of consequence, but I wrote the stream to a `tmpfs` and used the `-segment_wrap` feature to keep the stream files off disk.


    Thanks to comment from @mpromonet for the update on the Linux-Projects V4L2 driver that now implements MMAL very efficiently - but it is still a work in progress.

    Follow these instructions to install the linux-project repository and install the UV4L driver with extras. Then install the server and mjpeg. If you want, you can experiment with the others too.

    After you install everything, you can access the HTTP server on port 8080. You should also check the /etc/uv4l/conf file and set whether you want MJPEG or H.264, as it makes a difference; you can also adjust a few settings via the built-in web server.

    HTML 5

    This is what we were all waiting for (called WebRTC) and thanks to the new driver it works great (on a Raspberry Pi 2).

    First, follow these steps,

    curl | sudo apt-key add -
    # Add the following line to the file /etc/apt/sources.list
    # deb wheezy main
    sudo apt-get update
    sudo apt-get install uv4l uv4l-raspicam
    sudo apt-get install uv4l-raspicam-extras

    Then, on your Raspberry Pi 2, install the WebRTC extension (for a Raspberry Pi 1, read the linked site for other options):

    sudo apt-get install uv4l-webrtc

    Restart all the drivers and go to


    You now have low-latency, high-quality video streaming directly into a modern browser like Chrome or Firefox. (Maybe Safari, but I can't check because they don't do Winblows any more, and Internet Explorer... eh.)


    By default, it uses MJPEG at 1080p, and it's very sluggish. I tweaked it to an 800x600 frame size, and using something like iSpy to process the video for security, I get about 10 fps on a crisp video. That is way better than the 3 fps at 640x480 before this driver. It works on iPhone with Safari, Android Chrome and almost everything else.


    This also means that motion should work a lot better now (I still need to test and compare). Make sure to set the configuration to use `v4l2_palette 8` or `v4l2_palette 2`.


    This has now been fixed for "streaming", and we don't have to go to great lengths to watch H.264 video through VLC media player. The stream is still raw H.264, so you need to demux it or transcode/encapsulate it if you need it to work somewhere else. You should tweak `bitrate=xxxxxx` in the configuration file if you are streaming over Wi-Fi.

    In VLC media player, you must tell it that you want to use the H.264 demuxer. If you're using the GUI, make sure to add the argument `:demux=264`. From the command line: `vlc http.../video.h264 --demux h264`. Otherwise, you will just see a blank screen even though the camera LED is turned on.


    Voila! HD streaming with roughly 500 ms lag (with tweaking, down to 200 ms). It is definitely much easier than the old methods. Quality and FPS are superb, but you can't embed this in HTML5 without transcoding to MP4 or WebM. I hope this will be implemented, as it would truly make this a great standalone server.


    Not supported/implemented


    There is no video4linux driver available yet. This means that we can't use ffserver to stream data using /dev/video0 or similar, like a USB webcam.

    That is why it is so difficult to find proper live streaming for HTML5 browsers.

    Now there is a `video4linux` driver: the official V4L2 driver, bcm2835-v4l2, and the userspace V4L2 driver.

    Is it a real v4l driver, or is it just that wrapper around raspivid that gives terrible performance?

    The official driver uses the MMAL interface; see the source code. Performance seems correct.

    I have been playing with this for 3 days now. The MJPEG encoding is definitely much more stable, and I can view the stream reliably on iPhone, Android or iSpy. H.264 is great at 1080p 30fps, and we can view this in VLC using the `--demux h264` flag. We still need to transcode this for use on mobile, or for embedding as MP4/WebM on web pages. But it is a really great move forward in efficiency and quality. Don't confuse it with the "other" UV4L non-linux-projects driver thing, which is rubbish.

    Note that adding `:demux=264` in the H264 method is for the VLC server, not the VLC client. So the command line to start streaming on the Raspberry Pi, to get compatibility with VLC on smartphones, is: ```/usr/bin/cvlc v4l2:///dev/video0 --v4l2-width 800 --v4l2-height 400 --v4l2-chroma h264 --sout '#standard{access=http,mux=ts,dst=}' :demux=264```

    I am not able to install `uv4l`. When I do `sudo apt-get install uv4l`, it says `E: Unable to locate package uv4l`. What am I missing?

    Possibly the repository; did you add it? You need to manually edit the sources.list file.

    Do we have to install `uv4l uv4l-raspicam` on the Pi or on the remote computer? You mention `webrtc` is to be installed on the Pi. I installed everything on the Pi and went to `http://raspberry:8080` but don't see anything. Where do I give the IP address of the remote computer? Or will this address give me the camera output in any browser?

    Replace `raspberry` with the Pi's IP address, or with `localhost` if you are accessing it from the Pi itself. Make sure your `uv4l-raspicam` service is running with `sudo service uv4l-raspicam restart`.

    The context in this post is now way out of date.

    @MKNWebSolutions is there a better solution?

    @JFA I find HLS to be the easiest; you can get the delay down to as little as 4-6 seconds, too. However, I find that RTP may be the best (and cleanest) solution, but I don't have the exact details noted for it just yet. I've been experimenting with other IP cameras (e.g. I just got a cheap Sercomm that's very stable with RTP / all open source), and I'm getting somewhere!

    UV4L offers WebRTC with under 1 second of delay now (read the first two sections, since they are the most relevant). It still requires work for higher resolutions, but I know the developer, and he is working as hard as he can to sort it out.

    @ppumkin I'd gladly take WebRTC over everything, but the issue is still browser support. I guess it's the same thing with RTP/RTSP; it would still require Flash.

  • Streaming with MJPEG


    A kernel interface with a built-in HTTP(S) server.

    Raspberry Pi Cam Web interface

    A nice project by silvanmelchior that deploys a web server and a DVR-like, multi-target streaming server. Needs more information.

    Legacy method

    Streaming with MJPEG is supported by almost all browsers, including Internet Explorer 6. A lot of cameras from before H.264 used hardware MJPEG, which essentially dumped JPEG files into a folder as fast as possible, while the streamer read each file into a buffer and deleted it. Some devices could achieve up to 25 fps, and even on a bad connection you would get at least 1 fps.

    Support for MJPEG was dropped in HD cameras because the JPEG files just got too large to stream over the Internet, and H.264 is a much faster and better-quality codec.

    Since we have no way to broadcast H.264 using the camera module natively, this seems like a viable fallback...

    It is pretty much instant, but don't expect to get more than 1.5 fps. This is down to raspistill being extremely SLOOOW! Using the time-lapse function set to 100 ms, which should give us 10 fps, does not work, because raspistill just chokes up and has serious performance issues within itself.

    1. Change /tmp to use RAM for speed: in /etc/default/tmpfs, change RAMTMP=yes (this is an effort to increase fps, but raspistill just cannot keep up with itself)
    2. Reboot
    3. apt-get install git
    4. apt-get install libjpeg8-dev
    5. apt-get install libv4l-dev
    6. apt-get install imagemagick
    7. cd /usr/src, mkdir mjpg-streamer, cd mjpg-streamer ...
    8. git clone
    9. make USE_LIBV4L2=true clean all
    10. OPTIONAL If you have errors
    11. sudo ln -s /usr/include/libv4l1-videodev.h /usr/include/linux/videodev.h
    12. sudo ln -s /usr/include/lib4l2.h /usr/include/linux/lib4l2.h
    13. Inside the Makefile, comment out all the plugins except for input_file and output_http, and run make again. I had a lot of issues here.
    14. Copy the binary mjpg_streamer and its plugins input_*.so and output_*.so to /usr/local/bin. Otherwise, run it directly from the src directory.
    15. Optional end
    16. mkdir /tmp/stream
    17. raspistill -w 640 -h 480 -q 5 -o /tmp/stream/pic.jpg -tl 100 -t 9999999 -th 0:0:0 &
    18. LD_LIBRARY_PATH=./ ./mjpg_streamer -i "input_file.so -f /tmp/stream" -o "output_http.so -w ./www" (run this where the binary and plugins are)
    19. Go to http://<IP-address>:8080
    20. There are a few options here; enjoy "live" streaming the old-fashioned way, supported by most browsers: modern, old and experimental.

    I struggled to compile it for about 5 hours... sigh. But I think I will use this, as I can access the stream from any phone and any browser. I just have to wait until we get better drivers... another year or two. :(

    No matter what quality I try, I get no faster and no slower than 1 fps using this stream. I tried 720p and 1080p, and only the image quality gets better; the fps makes no difference on LAN. I suppose smaller settings will help with WAN/3G or other radio transmissions.

    raspistill writes the image to a single file. This could be a bottleneck: it writes the file, mjpg-streamer reads it and deletes it, causing blocking I/O, so raspistill cannot write to the file.

    The only thing I can think of is piping raspivid into FFmpeg to create the JPEG files for us; I need to try this, and possibly it's much faster than using raspistill. I managed to get 25 fps at a shocking quality, and it was delayed about 10 seconds... Tweaking the settings got me about 3 fps, but at 100% CPU. No hardware is being used to process the video stream...

    raspivid -w 640 -h 480 -fps 25 -vf -t 86400000 -b 1800000 -o -  \
    | ffmpeg -i - \
        -f image2(?) \
        -c:v mjpeg \

    I was also reading and found that we can use %d in the raspistill output file name. I wonder if that will boost the fps. Also, JPEG encoding is hardware-accelerated in raspistill, so I am really struggling to figure out why it's so slow...

    I got a staggering 2 FPS using %d in the filename. For some reason, writing the JPEG file is horribly slow from raspistill. Sigh.

  • As of 2017 (or perhaps earlier) raspivid is no longer the preferred method, with the Pi devs recommending people use V4L2 instead.

    So this method allows you to stream H264 via RTP using V4L2 instead of raspivid. I noticed this method results in fewer dropouts and allows a higher bitrate:

    # Use V4L2 (preferred) instead of raspivid
    # exposure_dynamic_framerate=1 (raspivid --fps 0) - reduce framerate/increase exposure in low light
    # scene_mode=8 (raspivid --exposure night) - allow framerate reduction to increase exposure
    v4l2-ctl -v width=1296,height=972,pixelformat=H264 \
            --set-ctrl=exposure_dynamic_framerate=1 \
            --set-ctrl=scene_mode=8 \
            --set-ctrl=video_bitrate=5000000
    exec ffmpeg -f h264 -probesize 32 -r 30 -i /dev/video0 -vcodec copy -an -f rtp_mpegts udp://

    This script multicasts the video, and it can be viewed on another machine on the LAN with a command like this:

    ffplay -sync ext -an -fast -framedrop -probesize 32 -window_title "Raspberry Pi" -an udp://

    -sync ext causes the video to be played as fast as possible so it will run in real time, as opposed to running it at a fixed framerate and lagging if the Pi is capturing frames faster than this. There's still some lag with this method, but no worse than the other raspivid methods.

    (Tip: if you're plugged into a router or switch that supports IGMP, make sure IGMP is not firewalled on your machine; otherwise, when the router asks your PC whether it wants any multicast traffic, the PC will never respond and you'll never see any video.)

    Recording to disk

    As I mentioned recording in the comments below, I'll expand on that here. You can use a command like this to record the network stream to disk:

    ffmpeg -y -i udp:// -c copy \
      -f segment -segment_atclocktime 1 -segment_time 900 \
      -reset_timestamps 1 \
      -strftime 1 /path/to/storage/pi-%wT%H%M.mkv

    Look at `man strftime` for the meanings of the % symbols in the filename. The ones in this example give the day number (0=Sunday, 1=Monday, etc.) followed by a T and then the time. It starts a new file every 15 minutes.
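    The pattern is easy to sanity-check with GNU date, which understands the same strftime codes (the date below is just an example Sunday):

    ```shell
    # 2024-01-07 13:05 was a Sunday, so %w expands to 0
    date -d '2024-01-07 13:05' '+pi-%wT%H%M.mkv'
    # prints: pi-0T1305.mkv
    ```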

    Just to be clear, this recording command is meant to be run on a remote PC (not on the Pi itself) although it will probably work on the Pi too (untested).

    Since you get a new file every 15 minutes with the day and time in the filename, it means that after one week you'll start to get filenames generated that have already been used, causing the oldest files to get overwritten. In other words, you'll end up with a rolling loop of the previous week's worth of footage. This is ideal for a security camera where you will rarely need to go back more than a week.

    As a side note, this produces about 500 GB worth of files per week, so you may want to adjust the bitrate or resolution, or overwrite the files sooner (say, every 24 hours) if you don't want them taking up so much space.
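    That figure lines up with the 5 Mbit/s bitrate set in the capture script: a back-of-the-envelope check (raw video only, ignoring container overhead) gives roughly 378 GB per week:

    ```shell
    # 5 Mbit/s * seconds per week / 8 bits per byte / 10^9 bytes per GB
    echo $((5000000 * 7 * 24 * 3600 / 8 / 1000000000))
    # prints: 378
    ```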

    Cool - thanks for sharing this. Can you explain why the use of multicast is necessary here, though? From what I have learnt, multicast is rarely used, so I was wondering what it brings to the table here. Still, the script looks great and I am sure it will help a lot of people. Thanks +1

    Multicast is optional - you can just substitute a normal IP address if you wish - but you will need to change the command to use `ffserver` or some other server system if you want more than one machine to display the feed. Then after maybe 2-3 clients (depending on the video bitrate) the Pi's USB Ethernet adapter will run out of bandwidth. With multicast there's no need to run a server (client machines just choose whether to listen to the traffic or ignore it) so you can have thousands of machines displaying the video with no impact on the Pi, which only ever sends out a single video stream.

    Thanks for explaining. But multicast only works on internal networks? If an ISP gets a multicast packet, they usually just strip it, so it's not like you can just broadcast to everybody on the Internet. I suppose if you've got a large internal network, multicasting a massive stream may also impact your network? But yeah, just for me to view a stream I would just UDP to a selected IP... but I like the multicast option anyway :D Will try it this weekend, just because I never did it before. :) Thanks

    Yes multicast is mainly for internal networks. It's supposed to work better with IPv6 but I think it will still need cooperation from the ISP. I use it because it means I don't have to run a server on the Pi, and I can view the streams from two different machines plus record it to disk without changing the Pi's configuration, or overloading the Pi's network bandwidth. If your internal network is large then you will probably be using IGMP-capable switches which are designed to only send multicast traffic where it's needed to make the impact no different to normal.

    Thanks for explaining. I can now see many benefits of using multicast, with minor caveats that won't really impact home users. I will definitely give this a try. It is the simple and obvious things, sometimes, that need to be pointed out to make sense. And looking at your update, the recording bit is actually really, really cool!

    I got `unknown control 'video_bitrate'` errors, and other settings threw errors too. Removing them got me `Could not find codec parameters for stream 0 (Video: h264, none): unspecified size`. Not working out of the box for me.

    @PhilippeGachoud: Sounds like your camera isn't installed properly, your drivers are too old, or it's conflicting with another video device (e.g. USB camera). If you can't get it to work, post a new question so we can troubleshoot.

    @Malvineous that's it, there is another USB camera plugged in... I think I have to find another solution than ffmpeg.

    @PhilippeGachoud: ffmpeg will still work, but you'll have to adjust the options to suit the capabilities of your camera. This example expects hardware-compressed H.264 data from the camera (as supplied by the Pi camera), but many USB cameras deliver MJPEG or another video format, so you'll need to experiment with the options. I have found that MJPEG is very high bandwidth and the Pi is not powerful enough to transcode the video, so I haven't had a lot of luck streaming from USB cameras. But the Pi camera itself works very well.

    Do you have a source for the devs recommending v4l2 over raspivid? The documentation still shows raspivid here:

    @ElliottB: This GitHub issue is where one of the devs said `raspivid` was only meant to be a demo, and now V4L2 supports everything raspivid does.

  • I managed to stream from my Raspberry Pi to a web server with the compiled-in module nginx-rtmp.

    To save hassles with ffmpeg, I recommend a rolling distribution like Arch Linux Arm.

    raspivid -vf -t 0 -fps 25 -b 2000000 -o - |
    ffmpeg -i - -vcodec copy -an -r 25 -f flv rtmp://x220/myapp/mystream
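    For the rtmp://x220/myapp/mystream URL above to be accepted, nginx needs a matching rtmp application block. A minimal sketch of the relevant nginx.conf section (the application name matches the URL; the other values are nginx-rtmp defaults):

    ```nginx
    rtmp {
        server {
            listen 1935;        # default RTMP port
            application myapp {
                live on;        # accept live publishing from ffmpeg
                record off;     # do not record incoming streams to disk
            }
        }
    }
    ```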

    Some notes:

    So on this basis, I think live streaming from a Raspberry Pi might be OK for a temporary broadcast, but not for an always-on webcam, since it's too bandwidth-hungry. You will not get audio, and if you do, it will be a mission to sync.

    You can record audio separately, and more efficiently, at the same time as recording video. Later you could mux the audio feed in, convert it to WebM, and put it on your httpd as a static file with an HTML video tag. The workflow is pretty awkward, though it's the best I can think of for an efficient broadcast that will work painlessly across browsers.

    You can control the bandwidth and resolution, though. If it's local LAN streaming for CCTV use, then that's not even a problem. Broadcasting over the Internet might need to be on demand and/or at a much lower resolution. But it's another way of doing it. Thanks +1

    And how is it supposed to work? It doesn't for me... FFmpeg says "RTMP_Connect0, failed to connect socket. 111 (Connection refused)".

  • UV4L now supports live Audio & Video Streaming with WebRTC and HTML5.

    just read the link above...

    Works really well!

    How? The link to its example page is broken...

    I have been through those tutorials and I can confirm they do not work

  • Piotr Kula's answer seems to be on the right track, but it is outdated for Raspbian Stretch.

    There are updated instructions for uv4l on Raspbian Stretch at

    # switch to superuser mode
    sudo -s
    # add the repository key for uv4l
    curl | sudo apt-key add -
    # add the url for the uv4l repository to apt
    echo "deb stretch main" >> /etc/apt/sources.list
    apt-get update
    apt-get install uv4l uv4l-raspicam
    apt-get install uv4l-raspicam-extras
    # do not forget to install the server - see below what happens if you forget
    apt-get install uv4l-server

    You can tweak the uv4l options via /etc/uv4l/uv4l-raspicam.conf and then restart the service with

    sudo service uv4l_raspicam restart
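    The configuration file uses one option = value line per command-line flag. A minimal sketch of /etc/uv4l/uv4l-raspicam.conf (the values here are illustrative, mirroring the flags used further down):

    ```text
    driver = raspicam
    encoding = mjpeg
    width = 960
    height = 540
    framerate = 30
    ```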

    In my case, things didn't work out of the box (I forgot to install uv4l-server ...). The following comments might help you debug similar problems.

    I checked that the server is running with:

    pgrep -fla uv4l
    995 /usr/bin/uv4l -f -k --sched-fifo --mem-lock --config-file=/etc/uv4l/uv4l-raspicam.conf --driver raspicam --driver-config-file=/etc/uv4l/uv4l-raspicam.conf --server-option=--editable-config-file=/etc/uv4l/uv4l-raspicam.conf

    and whether it listened with

    sudo netstat -tulpn 

    but there was no entry for uv4l in the list; I had expected one for port 8080.

    So I tried the command from "How to configure UV4L?":

    uv4l --sched-rr --mem-lock --driver raspicam \
    > --width 960 --height 540 --framerate 30 \
    > --encoding mjpeg --vflip --hflip
    <notice> [core] Trying to loading driver 'raspicam' from built-in drivers...
    <notice> [core] Loading driver 'raspicam' from external plug-in's...
    <notice> [driver] Dual Raspicam & TC358743 Video4Linux2 Driver v1.9.63 built Oct  6 2018
    <notice> [driver] Detected camera imx219, 3280x2464
    <notice> [driver] Selected format: 960x544, encoding: mjpeg, JPEG Video Capture
    <notice> [driver] Framerate max. 30 fps
    <notice> [core] Device detected!
    <notice> [core] Registering device node /dev/uv4l

    But still the server didn't start automatically ...

    man uv4l

    then showed me the option

    --enable-server [=arg(=required)] (=auto)
              enable the streaming server. Possible values are: 'auto' (tenta‐
              tively start the server), 'required' (exit if failing  to  start
              the  server,  only  works if --foreground is enabled), 'off' (no
              server at all).

    so I tried:

    pkill uv4l
    sudo uv4l --sched-rr --mem-lock --driver raspicam --encoding mjpeg --enable-server=required
    <notice> [core] Trying to loading driver 'raspicam' from built-in drivers...
    <notice> [core] Loading driver 'raspicam' from external plug-in's...
    <notice> [driver] Dual Raspicam & TC358743 Video4Linux2 Driver v1.9.63 built Oct  6 2018
    <notice> [driver] Detected camera imx219, 3280x2464
    <notice> [driver] Selected format: 1920x1080, encoding: mjpeg, JPEG Video Capture
    <notice> [driver] Framerate max. 30 fps
    <notice> [core] Device detected!
    <notice> [core] Registering device node /dev/uv4l

    but still no server running on port 8080 or elsewhere. It seems I forgot the "--foreground" option, which the man page states is necessary:

    sudo uv4l --sched-rr --mem-lock --driver raspicam --encoding mjpeg --enable-server=required --foreground
    <notice> [core] Trying to loading driver 'raspicam' from built-in drivers...
    <notice> [core] Loading driver 'raspicam' from external plug-in's...
    <notice> [driver] Dual Raspicam & TC358743 Video4Linux2 Driver v1.9.63 built Oct  6 2018
    <notice> [driver] Detected camera imx219, 3280x2464
    <notice> [driver] Selected format: 1920x1080, encoding: mjpeg, JPEG Video Capture
    <notice> [driver] Framerate max. 30 fps
    <notice> [core] Device detected!
    <notice> [core] Trying to load the the Streaming Server plug-in...
    <warning> [core] cannot open shared object file: No such file or directory
    <alert> [core] No Streaming Server detected

    Now that's a clear hint! There seems to be no server yet - so install it:

    sudo apt-get install uv4l-server

    and try again:

    sudo uv4l --sched-rr --mem-lock --driver raspicam --encoding mjpeg --enable-server=required --foreground
    <notice> [core] Trying to loading driver 'raspicam' from built-in drivers...
    <notice> [core] Loading driver 'raspicam' from external plug-in's...
    <notice> [driver] Dual Raspicam & TC358743 Video4Linux2 Driver v1.9.63 built Oct  6 2018
    <notice> [driver] Detected camera imx219, 3280x2464
    <notice> [driver] Selected format: 1920x1080, encoding: mjpeg, JPEG Video Capture
    <notice> [driver] Framerate max. 30 fps
    <notice> [core] Device detected!
    <notice> [core] Trying to load the the Streaming Server plug-in...
    <notice> [server] HTTP/HTTPS Streaming & WebRTC Signalling Server v1.1.125 built on Mar  9 2019
    <warning> [server] SSL is not enabled for the Streaming Server. Using unsecure HTTP.
    <notice> [core] Streaming Server loaded!
    <notice> [core] Registering device node /dev/uv4l
    <notice> [server] Web Streaming Server listening on port 8080

    The server is now available at http://pi:8080 (replace pi with your server's IP address or hostname).

    After a reboot, it worked without entering another command.

  • UV4L now supports live audio and video broadcasting to Jitsi Meet rooms over the Web. No special configuration is required. It's as easy as filling in your name and room and clicking Start.

    Which browser are you using? Jitsi only supports Chrome, Chromium, Opera, and Firefox Nightly, of which only Chromium is available on the Pi. But Chromium gives me a `webkitRTCPeerConnection is not defined` error. I normally use IceWeasel for WebRTC, but that is not supported by Jitsi.

    On the Pi there is no browser supporting WebRTC, except an almost-broken implementation in IceWeasel. The way I am using it is: Pi -> Jitsi server in the cloud -> my PC elsewhere.

    UV4L supports hardware-encoded H264 live streaming with no latency.

License under CC-BY-SA with attribution

Content dated before 6/26/2020 9:53 AM