NVIDIA Tegra Jetson Nano

The official Mender documentation explains how Mender works. This is simply a board-specific complement to the official documentation.

Board description

NVIDIA Jetson Nano developer kit hardware booting from SD-card:

URL: https://developer.nvidia.com/embedded/jetson-nano-developer-kit
Wiki: https://elinux.org/Jetson_Nano

Test results

The Yocto Project releases in the table below have been tested by the Mender community. Please update it if you have tested this integration on other Yocto Project releases:

Yocto Project    Build           Runtime
warrior (2.7)    :test_works:    :test_works:
zeus (3.0)       :test_works:    :test_works:

Build: the Yocto Project build using this Mender integration completes without errors and outputs images.
Runtime: Mender has been verified to work on the board. For U-Boot-based boards, the integration checklist has been verified.

Getting started

Prerequisites

  • A supported Linux distribution and dependencies installed on your workstation/laptop, as described in the Yocto Mega Manual
    • NOTE: Instructions depend on which Yocto Project version you intend to use.
  • Google's repo tool installed and in your PATH.

Setup Yocto environment

Set the Yocto Project branch you are building for:

# set to your branch, make sure it is supported (see table above)
export BRANCH="zeus" 

Create a directory for your mender-tegra setup to live in, and change into it:

mkdir mender-tegra && cd mender-tegra

Initialize repo manifest:

repo init -u https://github.com/mendersoftware/meta-mender-community \
           -m meta-mender-tegra/scripts/manifest-tegra.xml \
           -b ${BRANCH}

Fetch layers in manifest:

repo sync

Setup build environment

Initialize the build environment:

source setup-environment tegra

Configure Mender server URL (optional)

This section is not required for a successful build, but the images generated by default are only suitable for use with the Mender client in standalone mode, since they lack server configuration.

You can edit the conf/local.conf file to provide your Mender server configuration, so that the generated images and Mender Artifacts connect to the Mender server you are using. There should already be a commented section in the generated conf/local.conf file; simply uncomment the relevant configuration options and assign appropriate values to them.
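
For example, if you run your own Mender server rather than Hosted Mender, the uncommented section might look like the fragment below. The URL and IP address are placeholders of my own, not values from this integration; MENDER_DEMO_HOST_IP_ADDRESS is only relevant when using the Mender demo server:

```
# Example conf/local.conf fragment for a self-hosted Mender server.
# Replace the URL with your own server address.
MENDER_SERVER_URL = "https://mender.example.com"
# Only needed when using the Mender demo server setup:
MENDER_DEMO_HOST_IP_ADDRESS = "192.168.0.100"
```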

Build for Hosted Mender:

# To get your tenant token:
#    - log in to https://hosted.mender.io
#    - click your email at the top right and then "My organization"
#    - press the "COPY TO CLIPBOARD" button
#    - assign the content of the clipboard to MENDER_TENANT_TOKEN
#
MENDER_SERVER_URL = "https://hosted.mender.io"
MENDER_TENANT_TOKEN = "<copy token here>"

Building the image

You can now proceed with building an image:

MACHINE=jetson-nano bitbake core-image-base

Replace core-image-base with your desired image target here and in the sections below.

Using the build output

After a successful build, the images and build artifacts are placed in tmp/deploy/images/jetson-nano.

The SD card image for the Jetson Nano has a name like core-image-base-jetson-nano.sdcard. This file can be used to initially provision the SD card using dd (replace /dev/sdc with the device node of your SD card):

sudo dd bs=4M status=progress if=tmp/deploy/images/jetson-nano/core-image-base-jetson-nano.sdcard of=/dev/sdc conv=fsync

On the other hand, if you already have Mender running on your device and want to deploy a rootfs update using this build, use the Mender Artifact files, which have a .mender suffix. You can deploy this Artifact either in managed mode with the Mender server (upload it under Releases in the server UI) or with the Mender client in standalone mode.

See additional notes below about standalone updates on warrior and later branches.

References

  • meta-tegra: The base BSP layer with support for NVIDIA Tegra. See the completed pull request here for modifications to support Mender.
  • meta-mender-tegra layer in meta-mender-community: Customization for Mender on Tegra, including U-Boot and partitioning. See the completed pull request here for initial modifications to support the NVIDIA Jetson TX2 (pull request for the Jetson Nano here).

Standalone Updates and Bootloader Components

Updates of bootloader components, as referenced on the TX2 Mender Hub page, are not yet supported on the Jetson Nano, due to the lack of support from NVIDIA for redundant boot on this platform. NVIDIA may add support in the future. This means standalone updates do not use the mender-install-manual script as done on the TX2, but use the standard mender -install command. It also means Mender upgrades across meta-tegra Yocto branches are unlikely to work properly, given possible dependencies between boot components and root filesystem components.

Known issues

  1. The Jetson-Nano production module is not supported.

If this post was useful to you, please press like, or leave a thank you note to the contributor who put valuable time into this and made it available to you. It will be much appreciated!


Hey,
hopefully I did not miss something here, but these instructions do not seem to be working for me.

When I try to build the jetson-nano image with MACHINE=jetson-nano bitbake core-image-base, the build fails during dependency resolution.

NOTE: Your conf/bblayers.conf has been automatically updated.
Parsing recipes: 100% |####| Time: 0:00:58
Parsing of 907 .bb files complete (0 cached, 907 parsed). 1437 targets, 91 skipped, 0 masked, 0 errors.
NOTE: Resolving any missing task queue dependencies
ERROR: Nothing PROVIDES 'u-boot-bup-payload'
u-boot-bup-payload was skipped: incompatible with machine jetson-nano (not in COMPATIBLE_MACHINE)
NOTE: Runtime target 'mender-tegra-bup-payload-install' is unbuildable, removing...
Missing or unbuildable dependency chain was: ['mender-tegra-bup-payload-install', 'u-boot-bup-payload']
ERROR: Required build target 'core-image-base' has no buildable providers.
Missing or unbuildable dependency chain was: ['core-image-base', 'u-boot', 'mender-tegra-bup-payload-install', 'u-boot-bup-payload']
Summary: There were 2 ERROR messages shown, returning a non-zero exit code.

I know that this is quite a new wiki entry and related to the pull request, but even working with the feature/jetson-nano branch did not help. It feels like I missed something, even though I am not sure what :confused:.
It would be great if you could point me in a direction that might be helpful.

@Langhalsdino sorry I just noticed this post.

Can you please confirm you have the latest in the PR here and especially this change:
RDEPENDS_${PN}_tegra186 += "mender-tegra-bup-payload-install"

You could also try out the latest zeus support at this PR, you’ll need to use the zeus mender branch here. This is working for me on nano after testing it today.

Thank you for the good information regarding Yocto. I have a question about image creation when installing Yocto on the NVIDIA Nano devkit.

With the above, core-image-base works, but core-image-weston and core-image-minimal fail as follows:

[5.xxxxx] EXT4-fs (mmcblk0p1): bad geometry: block count 1074176 exceeds size of device (1048576 blocks)
mount: mounting /dev/mmcblk0p1 on /mnt failed: Invalid argument

There may be a problem with SD card partitioning for the core-image-weston and core-image-minimal images.
Do you have any idea about this problem?
Thanks!

from,
BG

Hi @bg.kim,

There was a similar question here: NVIDIA Tegra Jetson TX2, which might help you move forward. There was a bug in the integration, but the fix has been merged on the warrior branch in meta-mender-community.



Hi,

I’m new to yocto, mender, and the nano. I followed this guide to get an image successfully, but I get no HDMI output with the base or sato image when I install into my nano dev kit. Am I missing something? Is there debug console output available with these images? I’m connected to the microUSB but have no tty. Do I need to use J41 instead?

Hi,

Will this work with the Jetson Xavier? It runs Ubuntu 18.04 apart from all the JetPack components.

@nishad1092 the Xavier isn't supported yet, but support is planned; you can also watch this thread on meta-tegra.

@ndry

Is there debug console output available with these images?
Do I need to use J41 instead?

Yes, you can find a pinout here. You can use this with a TTL to USB serial converter to debug boot issues.

Sure, will explore on this, Thank you @dwalkes

I’ve had the same issue. Building with warrior, I can get the Jetson to boot just fine and I can shell in no problem.

Looking at serial output, it’s detecting an HDMI cable is plugged in and detecting the panel just fine. It just doesn’t output anything and the panel goes back to an active input. Similar behavior with DP.

I don’t know enough about Yocto or even Linux in general to know what that ultimately means or how to fix it.

@Mapleguy sorry, but I don't have much experience at all with HDMI on the Nano, so I don't think I can help. I haven't specifically tested this with the Mender warrior setup myself; not sure if @Ecordonnier has. I'd start by mining issues on meta-tegra if you haven't already, and then posting an issue there. Sorry I'm not more help.

I got it working a little while after my post. I followed the steps here but replaced mender-tegra/sources/meta-tegra with a newer version from Git before building. Using branch warrior-l4t-r32.2 I built just fine and had video output. Here’s a link to the repo and branch I used.

Not sure if @ndry has solved his problem, but if not, here you go.

1 Like

Hi, I followed your post but failed in the end.

  1. When setting MACHINE=jetson-nano, the system said jetson-nano is invalid, and the machine fell back to jetson-tx2.

  2. When the build finished, I went into the /build/tmp/deploy/images/jetson-tx2 folder, but there was no core-image-base-jetson-nano.sdcard file.

  3. Could you please tell me which files are used for booting?


You suggest using the warrior or zeus release, but those have no GCC 7 version in the SDK, and it is impossible to build CUDA and OpenCV with a newer GCC. It is better to use the thud release.

You can use older gcc versions on warrior/zeus.

Yep, I have found that, but this is not mentioned in the topic.