Disclaimer: Most of this I found through debugging and reading the docs, so take it with a grain of salt.
The devices (HMD + controllers) are recognized by Unity, but because no layout exists for them, the default layouts (XRHMD and XRController) are used as the base. Unity then goes through the list of device capabilities, looks for a matching InputControl in the inherited base layout, and creates a new one if none is found. Here are the capabilities of the HMD, for example -- as you can see, the names do not match the ones commonly used.
What I did was simply add aliases for those properties. When building the managed representation of the device (and its InputControls), Unity then uses the common names instead of the WaveSDK names, so GetChildControl() will find the controls by their common name. The mapping of primary/secondary etc. is probably already done on the native side that provides the capabilities.
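Roughly, a layout with aliases looks like this (a sketch only -- the `WVR_*` names below are placeholders, substitute whatever names actually show up in the device capabilities; requires the Unity Input System package):

```csharp
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.Controls;
using UnityEngine.InputSystem.Layouts;
using UnityEngine.InputSystem.XR;

[InputControlLayout(displayName = "Wave Controller")]
public class WaveController : XRController
{
    // The control keeps its common name; the alias matches the name the
    // native capabilities report, so the layout builder maps the capability
    // onto this control instead of generating a new one.
    [InputControl(aliases = new[] { "WVR_TriggerValue" /* placeholder */ })]
    public AxisControl trigger { get; private set; }

    [InputControl(aliases = new[] { "WVR_GripButton" /* placeholder */ })]
    public ButtonControl gripButton { get; private set; }

    protected override void FinishSetup()
    {
        base.FinishSetup();
        // The controls are now resolvable under their common names.
        trigger = GetChildControl<AxisControl>("trigger");
        gripButton = GetChildControl<ButtonControl>("gripButton");
    }
}
```

You'd still have to register the layout with a matcher (e.g. via `InputSystem.RegisterLayout<WaveController>(...)`) so it takes precedence over the generated one.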
I based those classes on layouts I found in the Oculus XR Plugin btw. They seem to be doing something similar.
Also take a look at XRLayoutBuilder#Build() -- that's where the InputControls are created.
Regarding workaround A: I'm not sure how you would access them. The controllers do also use common usages (like "GripButton"), but maybe you would have to create the paths manually using the path syntax.
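Something along these lines might work for looking the controls up by usage instead of by name (untested sketch; `InputSystem.FindControls()` takes a control path, where usages go in braces and layouts in angle brackets):

```csharp
using UnityEngine.InputSystem;

public static class GripButtonLookup
{
    public static void LogGripButtons()
    {
        // All controls on any XRController-based device that carry the
        // "GripButton" usage. The returned InputControlList is backed by
        // unmanaged memory, hence the using block.
        using (var gripButtons = InputSystem.FindControls("<XRController>/{GripButton}"))
        {
            foreach (var control in gripButtons)
                UnityEngine.Debug.Log(control.path);
        }
    }
}
```

Narrowing to one hand should also be possible through the device usage, e.g. `"<XRController>{LeftHand}/{GripButton}"`.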