I'm trying to capture my built-in webcam with OpenCV in C++ and do some processing. This is working so far.
Now I want to stream the webcam to the browser. How can I achieve that?
- Should I create a WebSocket? Or use a UDP socket?
- Poco::Net::WebSocket
- How can I display that content in the browser? Is that possible with HTML5 and JS?
Thank you.
asked Feb 18, 2016 at 10:44 by dab0bby, edited Feb 25, 2016 at 8:40
- In the browser it is possible to use WebSockets and connect to a live stream. The biggest question, I guess, would be to run the actual streaming server and decide on a compatible video format. I have never tried to do it by myself, but this question got me really interested and I have found this interesting discussion: stackoverflow.com/questions/21921790/… – user151496, Feb 18, 2016 at 10:51
- @user151496 It's a pretty interesting topic. I've been able to stream webcam (HTML5) > server > video (HTML5) over WebSockets; audio was a major issue, as capturing the stream has very limited support and is in general hard to work with. OP should take a look at MPEG-DASH, it was much easier to get up and going than WebSockets. – 8eecf0d2, Feb 18, 2016 at 11:06
2 Answers
I may be a little late, but as I didn't find a completely updated solution for C++ and MJPEG on Stack Overflow, I thought about writing a new answer.
There are now some good and simple libraries for the task in C++ (C++ MJPEG streaming to HTML):
https://github.com/nadjieb/cpp-mjpeg-streamer
https://github.com/jacksonliam/mjpg-streamer
https://github.com/codewithpassion/mjpg-streamer/tree/master/mjpg-streamer
I found the first one to be very simple. You need CMake and make installed on the system.
git clone https://github.com/nadjieb/cpp-mjpeg-streamer.git;
cd cpp-mjpeg-streamer;
mkdir build && cd build;
cmake ../;
make;
sudo make install;
- Make sure you have the correct version of OpenCV installed.
Now, write the streamer:
mjpeg_server.cpp
#include <iostream>
#include <opencv2/opencv.hpp>
#include <nadjieb/mjpeg_streamer.hpp>

// for convenience
using MJPEGStreamer = nadjieb::MJPEGStreamer;

int main()
{
    cv::VideoCapture cap;
    cap.open("demo.mp4");
    if (!cap.isOpened())
    {
        std::cerr << "VideoCapture not opened\n";
        exit(EXIT_FAILURE);
    }

    std::vector<int> params = {cv::IMWRITE_JPEG_QUALITY, 90};

    MJPEGStreamer streamer;
    // By default 1 worker is used for streaming;
    // if you want to use 4 workers: streamer.start(8000, 4);
    streamer.start(8000);

    // Visit /shutdown or another defined target to stop the loop and shut down gracefully
    while (streamer.isAlive())
    {
        cv::Mat frame;
        cap >> frame;
        if (frame.empty())
        {
            std::cerr << "frame not grabbed\n";
            exit(EXIT_FAILURE);
        }

        // http://localhost:8000/bgr
        std::vector<uchar> buff_bgr;
        cv::imencode(".jpg", frame, buff_bgr, params);
        streamer.publish("/bgr", std::string(buff_bgr.begin(), buff_bgr.end()));

        cv::Mat hsv;
        cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);

        // http://localhost:8000/hsv
        std::vector<uchar> buff_hsv;
        cv::imencode(".jpg", hsv, buff_hsv, params);
        streamer.publish("/hsv", std::string(buff_hsv.begin(), buff_hsv.end()));
    }

    streamer.stop();
}
Write the CMakeLists.txt
cmake_minimum_required(VERSION 3.1)
project(mjpeg_streamer CXX)

find_package(OpenCV 4.2 REQUIRED)
find_package(nadjieb_mjpeg_streamer REQUIRED)

include_directories(${OpenCV_INCLUDE_DIRS})

add_executable(stream_test mjpeg_server.cpp)
target_compile_features(stream_test PRIVATE cxx_std_11)
target_link_libraries(stream_test PRIVATE
    nadjieb_mjpeg_streamer::nadjieb_mjpeg_streamer
    ${OpenCV_LIBS})
|--- mjpeg_server.cpp
|--- CMakeLists.txt
|--- demo.mp4
|--- build
|--- ...
Now, we can build the streamer.
mkdir build && cd build;
cmake ../;
make;
./stream_test
Now, if you go to "http://ip_address:port/bgr" or "http://ip_address:port/hsv", you should be able to see the stream. In my case, ip = 192.168.1.7 / localhost, port = 8000.
If you want to grab the stream with another server,
index.html
<html>
<body>
<img src="http://localhost:8000/bgr">
<img src="http://localhost:8000/hsv">
</body>
</html>
serve.py
import http.server
import socketserver

class MyHttpRequestHandler(http.server.SimpleHTTPRequestHandler):
    def do_GET(self):
        if self.path == '/':
            self.path = 'index.html'
        return http.server.SimpleHTTPRequestHandler.do_GET(self)

# Create an object of the above class
handler_object = MyHttpRequestHandler
PORT = 8080
my_server = socketserver.TCPServer(("", PORT), handler_object)

# Start the server
my_server.serve_forever()
python3 serve.py
Finally, even though it's extremely simple, it's not secure.
So I found a solution myself. The concept is like this:
My server is a WebSocket server built with the POCO library.
Server:
In the main thread, initialize the camera (the camera has to be initialized in the main thread). After a WebSocket connection is established, the server captures a frame from cv::VideoCapture, converts the frame to JPEG, encodes the image to a Base64 string, and finally sends that string back to the client.
Browser:
In the browser, the received Base64 string can be interpreted as an image by the img tag.
<img id="image" src="" width="1280" height="720"/>
ws.onmessage = function(evt)
{
    $("#image").attr('src', 'data:image/jpg;base64,' + evt.data);
};
So if the server now sends 30 frames per second, there is a smooth live stream in the browser.