NVIDIA Tegra Jetson Nano

The official Mender documentation explains how Mender works. This is simply a board-specific complement to the official documentation.

Board description

NVIDIA Jetson Nano developer kit hardware booting from SD-card:

URL: https://developer.nvidia.com/embedded/jetson-nano-developer-kit
Wiki: https://elinux.org/Jetson_Nano

Note: the Jetson Nano 2GB model is not yet supported; see https://github.com/OE4T/meta-mender-community/issues/4 for the latest status.

Test results

The Yocto Project releases in the table below have been tested by the Mender community. Please update it if you have tested this integration on other Yocto Project releases:

Yocto Project Build Runtime
warrior (2.7) :test_works: :test_works:
zeus (3.0) :test_works: :test_works:
dunfell (3.1) :test_works: :test_works:

Build: the Yocto Project build using this Mender integration completes without errors and outputs images.
Runtime: Mender has been verified to work on the board. For U-Boot-based boards, the integration checklist has been verified.

Getting started


  • A supported Linux distribution and dependencies installed on your workstation/laptop as described in the Yocto Mega Manual
    • NOTE: instructions depend on which Yocto version you intend to use.

Setup Yocto Environment

Dunfell and later releases use the tegra-demo-distro to demonstrate a working Mender configuration.

Please use the setup-env script in tegra-demo-distro to set up and configure the tegrademo-mender distribution. Use the branch corresponding to the Yocto version you plan to target. For instance, for the dunfell branch and NVIDIA L4T release 32.4.3, use these commands from the root of the cloned tree:

git submodule update --init
git checkout dunfell-l4t-r32.4.3

Then use

source ./setup-env --distro tegrademo-mender --machine jetson-nano-devkit

to set up your build environment for the Jetson Nano.

See the OE4T/meta-tegra wiki for more information about the meta-tegra project.

Setup build environment

Initialize the build environment:

source ./setup-env --distro tegrademo-mender --machine jetson-nano-devkit

or, if you’ve already set up your initial environment, you can use the simpler option:

source ./setup-env

Building the image

Optional: to build for Hosted Mender, add the following to your conf/local.conf:

# To get your tenant token:
#    - log in to https://hosted.mender.io
#    - click your email at the top right and then "My organization"
#    - press the "COPY TO CLIPBOARD" button
#    - assign content of clipboard to MENDER_TENANT_TOKEN
MENDER_SERVER_URL = "https://hosted.mender.io"
MENDER_TENANT_TOKEN = "<copy token here>"

You can now proceed with building an image:

bitbake demo-image-base

Using the build output

After a successful build, the images and build artifacts are placed in tmp/deploy/images/jetson-nano-devkit.

The disk image package for Jetson has a name like demo-image-base-jetson-nano-devkit.tegraflash.tar.gz. This file can be used along with recovery mode on the Jetson to initially provision the device.

You should insert an SD card into the device before flashing.

To enter recovery mode:

  • For the Nano carrier board, unless you have connected buttons to the header pins, use a jumper (REC to GND) to put the device into recovery mode on power up.
  • Connect the USB cable to power on the Nano.

You should see your NVIDIA device listed in the lsusb device output.
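As a quick sanity check, you can grep the lsusb output for NVIDIA's USB vendor ID (0955). The helper below is a hypothetical sketch, not part of the official instructions:

```shell
#!/bin/sh
# Hypothetical helper: a Jetson in recovery mode enumerates as a USB
# device with NVIDIA's vendor ID 0955. Check lsusb output for it.
in_recovery_mode() {
    if lsusb | grep -qi "0955.*nvidia"; then
        echo "Jetson detected in recovery mode"
    else
        echo "No Jetson found; re-check the jumper and USB cable"
    fi
}
```

Run `in_recovery_mode` after connecting the USB cable; if no device shows up, re-seat the REC jumper and power-cycle the board.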

Next, extract the .tar.gz file and then run the ./doflash.sh script inside to flash the build to the device. The script at repos/meta-mender-community/meta-mender-tegra/scripts/deploy.sh can be customized to automate this step if desired. See the instructions on the OE4T wiki for additional details.
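The extract-and-flash steps can be sketched as a small helper. This is a hypothetical wrapper (the tarball name in the usage example matches a default dunfell build; doflash.sh typically requires root privileges):

```shell
#!/bin/sh
# Hypothetical helper: extract a tegraflash package into a scratch
# directory and run the bundled doflash.sh from there.
flash_tegra_package() {
    tarball=$1
    flashdir=$(mktemp -d)
    tar -xzf "$tarball" -C "$flashdir"
    # doflash.sh expects to run from the directory with the extracted files
    ( cd "$flashdir" && ./doflash.sh )
}
```

For example: `flash_tegra_package tmp/deploy/images/jetson-nano-devkit/demo-image-base-jetson-nano-devkit.tegraflash.tar.gz`, run as root while the device is in recovery mode.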

On the other hand, if you already have Mender running on your device and want to deploy a rootfs update using this build, you should use the Mender Artifact files, which have .mender suffix. You can either deploy this Artifact in managed mode with the Mender server (upload it under Releases in the server UI) or by using the Mender client standalone mode.
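In standalone mode, the update is driven from the device's own shell with the Mender client's `mender install` and `mender commit` commands. A hypothetical sketch (the Artifact path in the usage example is an assumption):

```shell
#!/bin/sh
# Hypothetical sketch of a standalone-mode rootfs update on the device.
standalone_update() {
    artifact=$1
    # Writes the new rootfs to the inactive partition and marks it for boot
    mender install "$artifact" || return 1
    echo "Artifact installed; reboot to test the new rootfs."
    echo "If the new rootfs works, make it permanent with: mender commit"
}
```

For example: `standalone_update demo-image-base-jetson-nano-devkit.mender`. If you reboot again without running `mender commit`, the device falls back to the previous rootfs.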


References

  • meta-tegra: The base BSP layer with support for NVIDIA Tegra.

  • meta-mender-tegra layer in meta-mender-community: customization for Mender on Tegra, including U-Boot and partitioning. See the OE4T layer at this link for any changes, or to file issues against Mender and NVIDIA support.

  • The official Mender documentation explains how Mender works. This is simply a board-specific complement to the official documentation.

Known issues

Let us know if you find one!


Hopefully I did not miss a point here, but these instructions do not seem to be working for me.

When I try to build the jetson-nano image with MACHINE=jetson-nano bitbake core-image-base, the build fails:

NOTE: Your conf/bblayers.conf has been automatically updated.
Parsing recipes: 100% |####| Time: 0:00:58
Parsing of 907 .bb files complete (0 cached, 907 parsed). 1437 targets, 91 skipped, 0 masked, 0 errors.
NOTE: Resolving any missing task queue dependencies
ERROR: Nothing PROVIDES 'u-boot-bup-payload'
u-boot-bup-payload was skipped: incompatible with machine jetson-nano (not in COMPATIBLE_MACHINE)
NOTE: Runtime target 'mender-tegra-bup-payload-install' is unbuildable, removing...
Missing or unbuildable dependency chain was: ['mender-tegra-bup-payload-install', 'u-boot-bup-payload']
ERROR: Required build target 'core-image-base' has no buildable providers.
Missing or unbuildable dependency chain was: ['core-image-base', 'u-boot', 'mender-tegra-bup-payload-install', 'u-boot-bup-payload']
Summary: There were 2 ERROR messages shown, returning a non-zero exit code.

I know that this is quite a new wiki entry and related to the pull request, but even working with the feature/jetson-nano branch did not help. It feels like I missed something, even though I am not sure what :confused:.
It would be great if you could point me in a direction that might be helpful.

@Langhalsdino sorry I just noticed this post.

Can you please confirm you have the latest in the PR here and especially this change:
RDEPENDS_${PN}_tegra186 += "mender-tegra-bup-payload-install"

You could also try out the latest zeus support at this PR, you’ll need to use the zeus mender branch here. This is working for me on nano after testing it today.

Thank you for the good information regarding Yocto. I have a question during the image creation to install Yocto on nVidia Nano Devkit.

With the above, core-image-base is working, but core-image-weston and core-image-minimal fail as follows:

[5.xxxxx] EXT4-fs (mmcblk0p1) : bad geometry: block count 1074176 exceeds size of device (1048576 blocks) mount: mounting /dev/mmcblk0p1 on /mnt failed: Invalid argument

There may be a problem with the SD card partitioning for the core-image-weston and core-image-minimal images.
Do you have any idea about this problem?


Hi @bg.kim,

There was a similar question here, NVIDIA Tegra Jetson TX2, which might help you move forward. There was a bug in the integration, but the fix has been merged on the warrior branch in meta-mender-community.


2 posts were split to a new topic: NVIDIA Tegra Jetson Nano - Debian support?


I’m new to Yocto, Mender, and the Nano. I followed this guide and got an image successfully, but I get no HDMI output with the base or sato image when I install it on my Nano dev kit. Am I missing something? Is there debug console output available with these images? I’m connected to the micro-USB port but have no tty. Do I need to use J41 instead?


Will this work with the Jetson Xavier? It does run Ubuntu 18.04 apart from all the JetPack components.

@nishad1092 the Xavier isn’t supported yet, but support is planned; you can also watch this thread on meta-tegra.


Is there debug console output available with these images?
Do I need to use J41 instead?

Yes, you can find a pinout here. You can use this with a TTL to USB serial converter to debug boot issues.
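For example, with a USB-TTL adapter enumerating as /dev/ttyUSB0 (an assumption; check dmesg for the actual node), a hypothetical wrapper around GNU screen:

```shell
#!/bin/sh
# Hypothetical wrapper: attach to the Nano's UART header via a USB-TTL
# adapter. Tegra boot consoles run at 115200 baud, 8N1.
serial_console() {
    dev=${1:-/dev/ttyUSB0}
    screen "$dev" 115200
}
```

Run `serial_console` (or `serial_console /dev/ttyUSB1` for another node); quit the session with Ctrl-A, then k.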

Sure, will explore on this, Thank you @dwalkes

I’ve had the same issue. Building with warrior, I can get the Jetson to boot just fine and I can shell in no problem.

Looking at the serial output, it detects that an HDMI cable is plugged in and detects the panel just fine. It just doesn’t output anything, and the panel falls back to another active input. Similar behavior with DP.

I don’t know enough about Yocto or even Linux in general to know what that ultimately means or how to fix it.

@Mapleguy sorry, but I don’t have much experience at all with HDMI on the Nano, so I don’t think I can help. I haven’t specifically tested this with the Mender warrior setup myself, and I’m not sure if @Ecordonnier has. I’d start by mining issues on meta-tegra, if you haven’t already, and then posting an issue there. Sorry I’m not more help.

I got it working a little while after my post. I followed the steps here but replaced mender-tegra/sources/meta-tegra with a newer version from Git before building. Using branch warrior-l4t-r32.2 I built just fine and had video output. Here’s a link to the repo and branch I used.

Not sure if @ndry has solved his problem, but if not, here you go.


Hi, I followed your post but failed in the end.

  1. When I set MACHINE=jetson-nano, the system said jetson-nano is invalid, and the machine fell back to jetson-tx2.

  2. When the build finished, I went into the /build/tmp/deploy/images/jetson-tx2 folder, but there was no core-image-base-jetson-nano.sdcard file.

  3. Could you please tell me which files are used for booting?


You suggest we use the warrior or zeus release, but those have no GCC 7 SDK, and it is impossible to build CUDA and OpenCV with a newer GCC. It would be better to use the thud release.

You can use older GCC versions on warrior/zeus.

Yep, I have found that, but it is not mentioned in this topic.

Can someone tell me what this method allows you to update? I’m looking for a solution that can update everything, including the DTBs (both kernel and bootloader DTBs) and the bootloader.