Hopefully the OS support isn't as awful as it usually is on the Jetson platforms. Unless things change, you'll get one or two major kernel updates ever and have to do bizarre stuff like installing a six-year-old Ubuntu on your x86 PC just to run the utility that flashes the OS.
Keep in mind that a kernel module != driver. It's just doing initialization and passing data to/from the driver, which is closed source and in user space.
It has always been the userspace on the Jetsons that was closed source and tied to Nvidia's custom kernel. I haven't heard of people running JetPack on a userland other than the one Nvidia provides. The companies/labs that update the OS don't care about CUDA: Nvidia contributes to Mesa support for the Jetsons, and some of them only need a bit more GPU power than a RasPi.
The Jetson Orin Dev Kit is squarely aimed at being a dev kit for those using the Jetson module in production edge compute (robotic vision and the like). The only reason it's so well known in tech circles is "SBC syndrome": people get excited about what they think they could do with it, and then 95% end up in a drawer a year later because what it's actually good at is unrelated to why they bought it.
This is more accurately a descendant of the HPC variants the article talks about: intentionally meant to be a useful entry-level machine for those wanting to do or run general AI work better than a random PC would anyway.
The AGX Orin had only 64GB of LPDDR5 and was priced at $5k, so this does seem like a bargain in comparison with 128GB of presumably HBM. But Nvidia never lowers its prices, so there's a caveat somewhere.
I've seen some claims on Reddit that it can do 512 GB/s (not sure where they got that from), which would imply a roughly 384-480 bit bus with LPDDR5X, depending on the per-pin rate.
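Back-of-envelope sketch of what a 512 GB/s figure would imply for bus width, since bandwidth = (bus width in bits / 8) x per-pin data rate. The per-pin rates below are common LPDDR5X speed grades I'm assuming, not anything Nvidia has confirmed for this part:

```python
# Hypothetical sanity check: bus width needed to hit a target bandwidth
# at a given per-pin data rate. Not based on any official GB10 spec.
def bus_width_bits(bandwidth_gbs: float, pin_rate_gtps: float) -> float:
    # bandwidth (GB/s) * 8 bits/byte / per-pin rate (Gb/s per pin)
    return bandwidth_gbs * 8 / pin_rate_gtps

for rate in (8.533, 9.6, 10.7):  # assumed LPDDR5X speed grades, GT/s
    print(f"LPDDR5X @ {rate} GT/s -> {bus_width_bits(512, rate):.0f}-bit bus")
# LPDDR5X @ 8.533 GT/s -> 480-bit bus
# LPDDR5X @ 9.6 GT/s -> 427-bit bus
# LPDDR5X @ 10.7 GT/s -> 383-bit bus
```

Whatever the exact speed grade, 512 GB/s would need a much wider interface than a typical 128- or 256-bit mobile-class bus.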
"According to the Grace Blackwell's datasheet- Up to 480 gigabytes (GB) of LPDDR5X memory with up to 512GB/s of memory bandwidth. It also says it comes in a 120 gb config that does have the full fat 512 GB/s."
Keep in mind the "full" Grace is a completely different beast with Neoverse cores. This new GB10 uses different cores and might well have a different memory interface. I believe the "120 GB" config reflects ECC overhead (ECC is stored inline on Nvidia GPUs), and Neoverse cores have various tweaks for larger configurations that are absent in the Cortex-X925.
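The inline-ECC interpretation does line up numerically. A common carve-out for inline ECC is 1/16 of capacity (~6.25%), which I'm assuming here; whether Grace actually uses that ratio is speculation on my part:

```python
# Hypothetical: if inline ECC reserves 1/16 of raw capacity (a common
# ratio, ~6.25%), a 128 GB part presents 120 GB to software.
raw_gb = 128
usable_gb = raw_gb * (1 - 1 / 16)
print(usable_gb)  # 120.0
```

That 128 -> 120 GB drop is exactly the gap between the marketed capacity and the "120 GB" config in the datasheet quote above.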
I'd be happy to be wrong, but I don't see anything from Nvidia that implies a 512-bit-wide memory interface on Nvidia Project Digits.
https://www.okdo.com/wp-content/uploads/2023/03/jetson-agx-o...
I wonder what the specifications are in terms of memory bandwidth and computational capability.