Jetson Inference for Demo

Option 1: Install without Docker

# Reference
# https://github.com/dusty-nv/jetson-inference

# Download
cd ~/Desktop
git clone --recursive https://github.com/dusty-nv/jetson-inference
cd jetson-inference


# Make
mkdir build
cd build
cmake ../

## Download Model
cd ~/Desktop/jetson-inference/tools
./download-models.sh
# Enable the models you want to run

# Install
cd ~/Desktop/jetson-inference/build
make -j $(nproc)
sudo make install
sudo ldconfig

## Run
cd ~/Desktop/jetson-inference/build/aarch64/bin
./posenet /dev/video0 
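Besides a V4L2 camera like /dev/video0, posenet accepts other input/output URIs (e.g. csi://0, a video file, or an image glob) and a --network flag. A small sketch that assembles such a command line; the posenet_cmd helper is hypothetical, and resnet18-body is assumed as the default network:

```python
def posenet_cmd(input_uri, output_uri=None, network="resnet18-body"):
    """Build a posenet invocation for a given input/output URI.

    input_uri examples (assumed from jetson-inference's streaming docs):
    /dev/video0 (V4L2 camera), csi://0 (MIPI CSI camera), a video file,
    or an image glob such as "images/humans_*.jpg".
    """
    cmd = ["./posenet", f"--network={network}", input_uri]
    if output_uri:
        # when no output URI is given, posenet renders to the display
        cmd.append(output_uri)
    return " ".join(cmd)

print(posenet_cmd("/dev/video0"))
print(posenet_cmd("input.mp4", "output.mp4"))
```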

Option 2: Install with Docker

For more information, please visit https://github.com/dusty-nv/jetson-inference

# Make sure you have Docker installed

sudo apt update

sudo apt install -y nvidia-container-toolkit

# Optional step: make the NVIDIA runtime Docker's default

sudo nano /etc/docker/daemon.json

{
    "runtimes": {
        "nvidia": {
            "path": "/usr/bin/nvidia-container-runtime",
            "runtimeArgs": []
        }
    },
    "default-runtime": "nvidia"
}

# Restart Docker to apply the change
sudo systemctl restart docker
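Malformed JSON in /etc/docker/daemon.json will prevent the Docker daemon from starting, so it is worth validating the file after editing. A minimal sketch; the check_daemon_config helper is hypothetical:

```python
import json

def check_daemon_config(text):
    """Return True if the config registers the NVIDIA runtime and
    sets it as the default. Raises on malformed JSON."""
    cfg = json.loads(text)
    has_nvidia = "nvidia" in cfg.get("runtimes", {})
    is_default = cfg.get("default-runtime") == "nvidia"
    return has_nvidia and is_default

# The same config as above, as a string for illustration; in practice
# read it from /etc/docker/daemon.json.
sample = '''
{
    "runtimes": {
        "nvidia": {
            "path": "/usr/bin/nvidia-container-runtime",
            "runtimeArgs": []
        }
    },
    "default-runtime": "nvidia"
}
'''
print(check_daemon_config(sample))
```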

# Check status

docker info | grep -i runtime

Runtimes: nvidia runc
Default Runtime: nvidia
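When scripting this check, the `docker info` output can be parsed instead of eyeballed. A sketch assuming the line format shown above; the default_runtime helper is hypothetical:

```python
def default_runtime(docker_info_lines):
    """Extract the default runtime name from `docker info` output lines,
    or return None if no 'Default Runtime:' line is present."""
    for line in docker_info_lines:
        line = line.strip()
        if line.lower().startswith("default runtime:"):
            return line.split(":", 1)[1].strip()
    return None

# Illustrative output; in practice capture `docker info` with subprocess.
sample = [" Runtimes: nvidia runc",
          " Default Runtime: nvidia"]
print(default_runtime(sample))
```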

# Clone the repo (if you haven't already), then download the models
git clone --recursive https://github.com/dusty-nv/jetson-inference ~/jetson-inference
cd ~/jetson-inference/tools
./download-models.sh

You may need to reference the page for using GMSL cameras in Docker.
