Help with Base Plane Calibration and Depth Image Orientation for AR Sandbox Project

Problem / Question

Hello,
We are currently working on an AR Sandbox project using the Ensenso N45 camera and are facing a few challenges on which we would appreciate your guidance:

1. Base Plane Calibration

We would like to define a base plane for our sandbox setup. The idea is to place a large flat board on top of the sandbox and tell the camera that this surface represents a consistent Z-value (i.e., a flat plane).
Is there a recommended or systematic way to calibrate such a base plane in NxLib or via any other method using the Ensenso SDK? Ideally, we would like to be able to transform the point cloud such that the base plane corresponds to Z=0.

2. Depth Image Orientation

When visualizing the depth images using OpenCV or Open3D, the scene appears flipped or viewed from below. Is this expected behavior for Ensenso cameras? Or is there a typical transformation or correction that needs to be applied to orient the depth data correctly for top-down AR projection?

I have attached three screenshots: two of them show how the point cloud appears in Open3D when we don't flip it manually, and the third is a 3D view from NxView that shows our setup.

Our goal is to create an interactive AR Sandbox in which a projector projects the real-time depth data onto the sand surface. Any advice or resources related to these topics would be greatly appreciated.

Best regards,

Hello,

You can use a combination of the commands FitPrimitive, CalibrateWorkspace and StoreCalibration to solve this. FitPrimitive can fit a plane to the current point map, and CalibrateWorkspace can use this plane to set the camera's link so that the plane corresponds to Z=0. With StoreCalibration this link can be written to the camera's EEPROM, so it is reused the next time the camera is opened.

The corresponding C++ code might look like this:

#include "nxLib.h"

#include <iostream>

void displayException()
{
	try {
		throw;
	} catch (char const* errorMessage) {
		std::cerr << errorMessage;
	} catch (std::string const& errorMessage) {
		std::cerr << errorMessage;
	} catch (std::exception const& e) {
		std::cerr << e.what();
	} catch (NxLibException const& e) { // Display NxLib API exceptions, if any
		std::cerr << "An NxLib API error with code " << e.getErrorCode() << " (" << e.getErrorText()
		          << ") occurred while accessing item " << e.getItemPath() << "";
		if (e.isCommandExecutionFailure()) {
			std::cerr << "\nCommand error: " << e.getCommandErrorSymbol() << ": " << e.getCommandErrorText();
		}
	} catch (...) { // Display other exceptions
		std::cerr << "Something, somewhere went terribly wrong!";
	}
	std::cerr << std::endl;
}

int main(int argc, char** argv)
{
	int error{0};
	std::string serial;
	std::string const serialCmdKey("--serial");
	for (int i = 0; i < argc - 1; ++i) {
		if (serialCmdKey == argv[i]) {
			serial = std::string(argv[i + 1]);
			std::cout << "Found serial " << serial << std::endl;
			break;
		}
	}
	if (serial.empty()) {
		std::cerr << "No serial given as command line parameter" << std::endl;
		return 1;
	}
	try {
		nxLibInitialize();
	} catch (...) {
		displayException();
		error += 1;
	}
	if (error == 0) {
		try {
			std::cout << "Opening camera " << serial << std::endl;
			NxLibCommand open(cmdOpen);
			open.parameters()[itmCameras] = serial;
			open.execute();

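			// ResolvedLink contains the camera's current transformation into the workspace (world) frame.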
			std::cout << "Old Pose" << NxLibItem(itmCameras)[serial][itmResolvedLink].asJson() << std::endl;

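			// Capture an image pair; the disparity and point map computed from it are the input for FitPrimitive.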
			std::cout << "Capturing image camera " << serial << std::endl;
			NxLibCommand capture(cmdCapture);
			capture.parameters()[itmCameras] = serial;
			capture.execute();
			std::cout << "Compute disparity map " << serial << std::endl;
			NxLibCommand computeDisparityMap(cmdComputeDisparityMap);
			computeDisparityMap.parameters()[itmCameras] = serial;
			computeDisparityMap.execute();

			std::cout << "Compute point map " << serial << std::endl;
			NxLibCommand computePointMap(cmdComputePointMap);
			computePointMap.parameters()[itmCameras] = serial;
			computePointMap.execute();

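			// Fit a plane primitive to the current point map (the flat board placed on top of the sandbox).
			// Iterations and InlierThreshold control the robustness of the fit.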
			(std::cout << "Fit Plane: ").flush();
			NxLibCommand fit(cmdFitPrimitive);
			fit.parameters()[itmPrimitive][itmType] = valPlane;
			fit.parameters()[itmIterations] = 2000;
			fit.parameters()[itmInlierThreshold] = 1.5;
			fit.execute();
			std::cout << fit.result().asJson() << std::endl;

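			// FitPrimitive returns a list of found primitives; take the first (and only) fitted plane.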
			NxLibItem plane = fit.result()[itmPrimitive][0];

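			// Use the fitted plane to recalibrate the workspace so that this plane becomes Z=0.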
			(std::cout << "Define pose: ").flush();
			NxLibCommand definePose(cmdCalibrateWorkspace);
			definePose.parameters()[itmCameras] = serial;
			definePose.parameters()[itmType] = valPlane;
			definePose.parameters()[itmPlane] << plane.asJson();
			definePose.execute();
			std::cout << definePose.result().asJson() << std::endl;

			// Store the new calibration to the camera's EEPROM.
			(std::cout << "Store Link: ").flush();
			NxLibCommand storeCalibration(cmdStoreCalibration);
			storeCalibration.parameters()[itmCameras] = serial;
			storeCalibration.parameters()[itmLink] = true;
			storeCalibration.execute();
			std::cout << storeCalibration.result().asJson() << std::endl;

			std::cout << "New Pose" << NxLibItem(itmCameras)[serial][itmResolvedLink].asJson() << std::endl;
		} catch (...) {
			displayException();
			error += 1;
		}
	}
	try {
		nxLibFinalize();
	} catch (...) {
		displayException();
		error += 1;
	}
	return error;
}
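
Regarding the flipped view from your second question: as long as no workspace calibration is applied, the point map is given in the camera's coordinate system, where Z points away from the camera into the scene. A viewer whose default camera looks along the negative Z axis will therefore show the cloud from below until it is rotated. A common fix is a 180 degree rotation about the X axis, i.e. negating Y and Z. A minimal sketch of such a flip (assuming the point map has been read into an interleaved XYZ float buffer; the function name is just for illustration):

#include <cstddef>
#include <vector>

// Rotate the cloud 180 degrees about the X axis so that a viewer looking along
// the negative Z axis sees the scene from above instead of from below.
void flipForTopDownView(std::vector<float>& xyzPoints)
{
	for (std::size_t i = 0; i + 2 < xyzPoints.size(); i += 3) {
		xyzPoints[i + 1] = -xyzPoints[i + 1]; // negate Y
		xyzPoints[i + 2] = -xyzPoints[i + 2]; // negate Z
	}
}

Whether this flip is needed at all depends on the conventions of your viewer and projection pipeline, so please verify the sign convention in your own setup.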

Does this answer your question?

Kind regards

Joel

You can actually do the plane alignment right from NxView on the Workspace Calibration tab. The function is called Align plane with depth data.

Then you can set some parameters in the FitPrimitive dialog and click Start to store the alignment as a link in the camera's EEPROM.