GPT PMBR size mismatch

Hello everybody,

I am trying to update a self-built computer containing an Intel Atom E3950 and running a Yocto image. I have built a Yocto image and already integrated the meta-mender-core and meta-mender-demo layers into it, but now I am facing the problem that whenever I write the image to a USB stick with:
sudo dd if=men-intel-test-cm50-20191106080607.uefiimg of=/dev/sdb bs=1M && sync
I get this error:

GPT PMBR size mismatch (15368191 != 15425535) will be corrected by w(rite).
Disk /dev/sdb: 7,4 GiB, 7897874432 bytes, 15425536 sectors
Units: sectors of 1 * 512 = 512 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disklabel type: gpt
Disk identifier: ----

Device         Start       End  Sectors  Size  Type
/dev/sdb1      16384     49151    32768   16M  EFI System
/dev/sdb2      49152   7569407  7520256  3,6G  Linux filesystem
/dev/sdb3    7569408  15089663  7520256  3,6G  Linux filesystem
/dev/sdb4   15089664  15351807   262144  128M  Linux filesystem

I am able to fix this issue with parted:
sudo parted /dev/sdb

Not all of the space available to /dev/sdb appears to be used, you can
fix the GPT to use all of the space (an extra 57344 blocks) or continue with
the current setting?

but I do not know why I keep getting this issue in the first place.
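For reference, the interactive parted fix can also be scripted. A minimal sketch, assuming the gptfdisk package (sgdisk) is installed; the device node below is a placeholder, so point DEV at the stick you actually flashed (in this thread that was /dev/sdb):

```shell
# DEV is a placeholder device node -- replace it before running for real.
DEV="${DEV:-/dev/sdX}"

if [ -b "$DEV" ]; then
    # sgdisk -e ("move backup structures to end of disk") rewrites the
    # backup GPT header and table at the device's last sectors and updates
    # the primary header accordingly -- the same repair parted offers.
    sudo sgdisk -e "$DEV"
    STATUS="fixed $DEV"
else
    STATUS="no block device at $DEV, nothing to fix"
fi
echo "$STATUS"
```

This is worth doing right after dd, before the kernel or other tools start complaining about the mismatched backup table.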

I believe this relates to how the wic plugin generates GPT images: it does not add the redundant GPT partition table at the end of the disk, hence this warning.

I am not sure there is an “easy fix”, but this generally does not cause any problems at run-time.

@kacf has more experience with this and might have some additional information.


I think wic is adding the GPT copy at the end, but due to various padding values being subtracted from the total size of the image, the final size is not 100% accurate down to the sector count. So the image ends up being just a little too small, and therefore the GPT copy is not found by the kernel in the expected location. In general this message is harmless, and is a weakness of the GPT format (strange that they made the validity of the GPT table dependent on the size of the storage medium).
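As a rough sketch of the arithmetic behind that message, using the numbers from the fdisk output above (512-byte sectors): the backup GPT header lands in the last sector of the image, but GPT requires it in the last sector of the device.

```shell
# Numbers taken from the fdisk output earlier in this thread.
IMAGE_SECTORS=15368192    # sectors actually written by dd (image size)
DEVICE_SECTORS=15425536   # real size of the USB stick

# wic put the backup GPT header in the image's last sector ...
BACKUP_LBA_WRITTEN=$((IMAGE_SECTORS - 1))
# ... but the GPT layout expects it in the device's last sector:
BACKUP_LBA_EXPECTED=$((DEVICE_SECTORS - 1))

echo "backup header written at LBA $BACKUP_LBA_WRITTEN, expected at LBA $BACKUP_LBA_EXPECTED"
echo "gap: $((DEVICE_SECTORS - IMAGE_SECTORS)) sectors"
```

The 57344-sector gap is exactly the “extra 57344 blocks” parted offers to reclaim, and the two computed values match the 15368191 != 15425535 pair in the warning (fdisk is comparing the size recorded in the protective MBR against the real device size, and both happen to equal the respective last LBAs).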

This would be fixable by making the Mender image generator sector accurate, but since this has no ill consequences, and only wastes a tiny amount of space, no one has stepped up and fixed it so far.


@kacf has this been fixed in the meantime?

@deffo: No, I do not believe that this has been fixed.