{"id":2138,"date":"2025-09-01T16:10:51","date_gmt":"2025-09-01T16:10:51","guid":{"rendered":"https:\/\/oasees-project.eu\/?p=2138"},"modified":"2025-09-01T16:10:51","modified_gmt":"2025-09-01T16:10:51","slug":"oasees-rust-detection-dataset-for-critical-infrastructure-and-instructions-on-how-to-locally-deploy-it","status":"publish","type":"post","link":"https:\/\/oasees-project.eu\/?p=2138","title":{"rendered":"OASEES: Rust Detection Dataset for Critical Infrastructure And Instructions On How To Locally Deploy It"},"content":{"rendered":"\n<p>In today\u2019s privacy-conscious world, running video streaming and AI inference locally has become increasingly important. Whether you\u2019re building a security system, monitoring industrial processes, or just experimenting with computer vision, having full control over your data pipeline is crucial. This guide will walk you through setting up MediaMTX for local RTSP streaming and integrating it with YOLOv5 for real-time object detection.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">What is MediaMTX?<\/h2>\n\n\n\n<p>MediaMTX is a powerful, lightweight media server that supports multiple protocols including RTSP, RTMP, HLS, and WebRTC. It\u2019s perfect for creating local streaming solutions without relying on cloud services, giving you complete control over your video streams and ensuring your data stays private.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Prerequisites<\/h2>\n\n\n\n<p>Before we begin, make sure you have:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>A Raspberry Pi (or any Linux machine) with network access<\/li>\n\n\n\n<li>Python 3.7+ installed<\/li>\n\n\n\n<li>Basic familiarity with SSH and terminal commands<\/li>\n\n\n\n<li>A camera or video source to stream from<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\">Step-by-Step Setup Guide<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">1. 
Connect to Your Raspberry Pi via SSH<\/h3>\n\n\n\n<p>First, establish an SSH connection to your Raspberry Pi from your local machine:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>ssh pi@&lt;RaspberryPi_IP_Address&gt;<\/code><\/pre>\n\n\n\n<p>Replace&nbsp;<code>&lt;RaspberryPi_IP_Address&gt;<\/code>&nbsp;with your Raspberry Pi\u2019s actual IP address. You can find this by running&nbsp;<code>hostname -I<\/code>&nbsp;on your Pi or checking your router\u2019s admin panel.<\/p>\n\n\n\n<p>When prompted, enter your password to complete the connection.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">2. Locate and Configure the MediaMTX Configuration File<\/h3>\n\n\n\n<p>Once connected via SSH, navigate to your MediaMTX installation directory:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>cd \/path\/to\/mediamtxDIR\nls<\/code><\/pre>\n\n\n\n<p>You should see the&nbsp;<code>mediamtx.yml<\/code>&nbsp;configuration file in the directory listing. This YAML file contains all the settings for your RTSP server, including stream sources, authentication, and network configurations.<\/p>\n\n\n\n<p>To start the MediaMTX server with your configuration:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>.\/mediamtx mediamtx.yml<\/code><\/pre>\n\n\n\n<p>If MediaMTX is installed globally on your system, you can also use:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>mediamtx \/path\/to\/mediamtxDIR\/mediamtx.yml<\/code><\/pre>\n\n\n\n<p>At this point, your RTSP server should be running and ready to accept connections.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">3. Set Up the YOLOv5 Inference Environment<\/h3>\n\n\n\n<p>Open a second terminal window (you can SSH again or use a local terminal) and navigate to your YOLOv5 directory:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>cd \/path\/to\/yolov5<\/code><\/pre>\n\n\n\n<p>Activate your Python virtual environment. 
The exact command depends on your setup:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code><em># For venv users<\/em>\nsource venv\/bin\/activate\n\n<em># For conda users<\/em>\nconda activate your_yolo_env<\/code><\/pre>\n\n\n\n<p>If you haven\u2019t installed the required dependencies yet, do so now:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>pip install -r requirements.txt<\/code><\/pre>\n\n\n\n<p>This will install PyTorch, OpenCV, and other necessary packages for YOLOv5 inference.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">4. Run YOLOv5 Inference on the RTSP Stream<\/h3>\n\n\n\n<p>Now comes the exciting part \u2013 connecting your AI model to the video stream. Run your YOLOv5 inference script that\u2019s configured to read from the RTSP stream:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>python detect.py --source rtsp:\/\/localhost:8554\/your_stream_name --weights yolov5s.pt --conf 0.25<\/code><\/pre>\n\n\n\n<p>Your inference script should:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Connect to the MediaMTX RTSP stream<\/li>\n\n\n\n<li>Process each frame through the YOLOv5 model<\/li>\n\n\n\n<li>Log detection results and confidence scores<\/li>\n\n\n\n<li>Optionally save annotated frames or send results to a web interface<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">5. 
View Results in Your Browser<\/h3>\n\n\n\n<p>If your setup includes a Flask web interface for visualizing detections, you can access it through your browser:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>http:&#047;&#047;&lt;InferenceMachine_IP&gt;:5000<\/code><\/pre>\n\n\n\n<p>Replace&nbsp;<code>&lt;InferenceMachine_IP&gt;<\/code>&nbsp;with the IP address of the machine running your YOLOv5 inference server.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Benefits of This Local Setup<\/h2>\n\n\n\n<p><strong>Privacy and Security<\/strong>: Your video data never leaves your local network, ensuring complete privacy and compliance with data protection regulations.<\/p>\n\n\n\n<p><strong>Low Latency<\/strong>: Direct local processing eliminates network delays, providing near real-time inference results.<\/p>\n\n\n\n<p><strong>Cost Effective<\/strong>: No cloud processing fees or bandwidth costs for uploading video streams.<\/p>\n\n\n\n<p><strong>Customization<\/strong>: Full control over your pipeline allows for custom models, preprocessing, and post-processing logic.<\/p>\n\n\n\n<p><strong>Reliability<\/strong>: No dependency on internet connectivity or third-party services.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Troubleshooting Tips<\/h2>\n\n\n\n<p><strong>Connection Issues<\/strong>: Ensure your firewall allows traffic on the RTSP port (default 8554) and web interface port (5000).<\/p>\n\n\n\n<p><strong>Performance Problems<\/strong>: If inference is slow, consider using a smaller YOLOv5 model (yolov5n.pt) or reducing input resolution.<\/p>\n\n\n\n<p><strong>Stream Not Found<\/strong>: Verify that MediaMTX is running and the stream name in your inference script matches your configuration.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Next Steps<\/h2>\n\n\n\n<p>Once you have this basic setup working, consider these enhancements:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Add multiple camera streams<\/li>\n\n\n\n<li>Implement custom detection classes for your specific 
use case<\/li>\n\n\n\n<li>Set up automated alerts based on detection results<\/li>\n\n\n\n<li>Create a database to store detection history<\/li>\n\n\n\n<li>Add authentication to your web interface<\/li>\n<\/ul>\n\n\n\n<h2 class=\"wp-block-heading\">Conclusion<\/h2>\n\n\n\n<p>Setting up MediaMTX locally with YOLOv5 provides a powerful foundation for privacy-first video analytics. This combination gives you professional-grade streaming capabilities with state-of-the-art AI inference, all running on your own hardware.<\/p>\n\n\n\n<p>The flexibility of this setup makes it suitable for everything from home security systems to industrial monitoring applications. Best of all, you maintain complete control over your data and can customize every aspect of the pipeline to meet your specific needs.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Download Custom Trained Weights<\/h2>\n\n\n\n<p>To help you get started quickly, we\u2019re providing a zip folder containing custom-trained YOLOv5 weights for detecting rust on critical infrastructure in video streams. These weights were trained on a custom rust detection dataset covering corrosion patterns commonly encountered in infrastructure monitoring footage.<\/p>\n\n\n\n<p><strong><a href=\"https:\/\/core-rg.iit.demokritos.gr\/wp-content\/uploads\/2025\/08\/Weights.zip\">Download Custom Weights Package \u2192<\/a><\/strong><\/p>\n\n\n\n<p>The custom weights included in this package will enhance your detection accuracy for specialized scenarios beyond the standard COCO dataset objects. 
Simply replace the&nbsp;<code>--weights yolov5s.pt<\/code>&nbsp;parameter in the command from step 4 with the path to the&nbsp;<code>best.pt<\/code>&nbsp;file extracted from the downloaded package.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Links<\/h2>\n\n\n\n<p>MediaMTX on GitHub:&nbsp;<a href=\"https:\/\/github.com\/bluenviron\/mediamtx\">https:\/\/github.com\/bluenviron\/mediamtx<\/a><\/p>\n\n\n\n<p>YOLOv5 on GitHub:&nbsp;<a href=\"https:\/\/github.com\/ultralytics\/yolov5\">https:\/\/github.com\/ultralytics\/yolov5<\/a><\/p>\n\n\n\n<h2 class=\"wp-block-heading\">References<\/h2>\n\n\n\n<p>DOI:&nbsp;<a href=\"https:\/\/doi.org\/10.5281\/zenodo.16679949\" target=\"_blank\" rel=\"noreferrer noopener\">10.5281\/zenodo.16679949<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>In today\u2019s privacy-conscious world, running video streaming and AI inference locally has become increasingly important. Whether you\u2019re building a security system, monitoring industrial processes, or just experimenting with computer vision, having full control over your data pipeline is crucial. 
This guide will walk you through setting up MediaMTX for local RTSP streaming and integrating it [&hellip;]<\/p>\n","protected":false},"author":3,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_eb_attr":"","site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"set","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"categories":[1],"tags":[],"class_list":["post-2138","post","type-post","status-publish","format-standard","hentry","category-uncategorized"],"_links":{"self":[{"href":"https:\/\/oasees-project.eu\/index.php?rest_route=\/wp\/v2\/posts\/2138","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/oasees-project.eu\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/oasees-project.eu\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/oasees-project.eu\/index.php?rest_route=\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/oasees-project.eu\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=2138"}],"version-history":[{"count":1,"href":"https:\/\/oasees-project.eu\/index.php?rest_route=\/wp\/v2\/posts\/2138\/revisions"}],"predecessor-version":[{"id":2139,"href":"https:\/\/oasees-project.eu\/index.php?rest_route=\/wp\/v2\/posts\/2138\/revisions\/2139"}],"wp:attachment":[{"href":"https:\/\/oasees-project.eu\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=2138"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/oasees-project.eu\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=2138"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/oasees-project.eu\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=2138"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}