An application suite, including an open-source inference server and web UI, for deploying any YOLOv8 model to NVIDIA Jetson devices and visualizing captured streams with a single line of code.
Updated Jan 8, 2025 - Shell
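As context for the "single line of code" claim above, here is a minimal sketch of how compact YOLOv8 inference on a Jetson device typically looks using the standard Ultralytics Python API. This is not the suite's own interface; the weights file, the optional TensorRT export step, and the RTSP source URL are placeholder assumptions.

```python
from ultralytics import YOLO

# Load any YOLOv8 checkpoint (pretrained or custom-trained); path is a placeholder.
model = YOLO("yolov8n.pt")

# Optional on Jetson: export to a TensorRT engine and reload it for faster inference.
engine_path = model.export(format="engine")
model = YOLO(engine_path)

# Run inference on a captured stream; results are yielded frame by frame.
for result in model.predict(source="rtsp://<camera-ip>/stream", stream=True):
    for box in result.boxes:
        print(int(box.cls), float(box.conf), box.xyxy.tolist())
```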
Setting up the NVIDIA Jetson Orin Nano Developer Kit
Jetson MINI CUBE NANO Case supporting Jetson NANO/Orin NANO/Orin NX/Xavier NX/TX2 NX