Retrieve colored point clouds in MVTec Halcon

General Information

  • Product: C57-6-S
  • Serial Number: 233387
  • Ensenso SDK Version: 4.2.1821
  • Operating System: Windows 11

Attachment:

Problem / Question

Hello,

I am using MVTec Halcon 24.11 to acquire images with an Ensenso C57-6-S camera through the Ensenso-NxLib interface.

Based on the process described in this webinar https://en.ids-imaging.com/visionchannel-media-details/how-to-build-3d-camera-data-using-true-colour-information.html, I want to generate colored 3D depth maps in the coordinate system of the built-in color camera. I tried the approach using the RenderPointMap command (see the attached hdev example), but the result is not as expected: the quality of the 3D data is worse than with a normal acquisition, no color information is matched to the 3D data, and the resolution is smaller than that of the color camera.

Do you have a Halcon example showing how to get colored 3D depth maps from a C57-6-S with the built-in color camera? Or which steps am I missing to generate colored 3D depth maps?

Thank you in advance for your efforts.

Kind regards.

Hi Braitmaier_PR,

I suspect that the implicit execution of the command “RenderPointMap” (triggered by requesting the image “RenderPointMap” via the grab_data items) is not suitable here, because no command parameters are provided and therefore a telecentric view of the point map is computed.
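For explicit execution, the parameters go into the NxLib command tree before the command is run. As a rough sketch only (the exact parameter names and their availability depend on your SDK version, so please verify them against the NxLib command reference; the serial number below is a placeholder), the relevant entries include the target camera and the clipping planes:

```json
{
  "RenderPointMap": {
    "Parameters": {
      "Camera": "<serial-of-color-camera>",
      "Near": 500,
      "Far": 2000
    }
  }
}
```

With a camera and Near/Far planes set, the command renders a perspective view from that camera instead of the default telecentric one.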
I’ve uploaded an HDevelop sample that runs with two file cameras recorded from an Ensenso C57; all required data is attached here. The sample shows the relevant differences to your code. There are some lines of code that you might want to delete, as the script is made for training purposes.
The uploaded data contains one file camera for the stereo camera and one for the color camera, a parameter file for each camera, and the script. The script asks you to navigate first to the 3D file camera and its parameter file (in that order) and then to the 2D file camera and its parameter file. The file cameras are the *.zip files.

Kind regards

Ute

ColoredPointMap_FileCam.hdev (99.5 KB)

C57_Metal_FV16_calib.json (18.2 KB)

C57_Metal_FV16.zip (97.7 MB)

C57_Metal_Color_FV16.zip (3.4 MB)

C57_Metal_Color_calib.json (4.1 KB)

Edit: info regarding the concept of getting color information for the 3D data.

After you’ve executed “RenderPointMap”, your point cloud is still without color!
Think of the command as “rearranging” the 3D data *, from the pixel grid of the left-rectified stereo camera to the pixel grid of the additional color camera. You then have two image containers that share the same pixel grid. To get colored points, you can add the color information pixelwise to the coordinate information.
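The pixelwise merge can be sketched in generic Python/numpy terms (the array names, shapes, and dummy data below are purely illustrative, not SDK API; in HDevelop you would apply the same idea to the X/Y/Z coordinate images and the rendered color image):

```python
import numpy as np

# Assumed inputs, both on the SAME pixel grid (the color camera's grid,
# as produced by RenderPointMap): an HxWx3 float array of X/Y/Z
# coordinates and an HxWx3 uint8 RGB image. Dummy data for illustration.
h, w = 4, 5
xyz = np.random.rand(h, w, 3).astype(np.float32)
rgb = (np.random.rand(h, w, 3) * 255).astype(np.uint8)

# Mark one pixel as invalid (no 3D data), as the stereo matcher would.
xyz[0, 0] = np.nan

# Pixelwise merge: every valid pixel becomes one colored 3D point.
valid = np.isfinite(xyz).all(axis=2)
points = xyz[valid]   # (N, 3) coordinates
colors = rgb[valid]   # (N, 3) colors, aligned 1:1 with the coordinates
colored = np.hstack([points, colors.astype(np.float32)])  # (N, 6) rows of X Y Z R G B

print(colored.shape)
```

Because both images share one pixel grid, no resampling or nearest-neighbor search is needed; the alignment is purely index-based.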

If you don’t need the 3D data in the “image” format and a PLY file is enough, you can directly save the colored 3D mesh from NxView (File / Save / 3D Mesh). You can find some more information plus a PDF regarding color + 3D in this topic.

* In fact, the command “RenderPointMap” does more than rearranging the 3D data, but for understanding the command, that’s the most important part.
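If NxView is not at hand, such merged points can also be written out as an ASCII PLY file directly; a minimal Python sketch (the (N, 6) X/Y/Z/R/G/B input layout is an assumption matching the pixelwise merge described above, and the header follows the PLY format specification):

```python
import numpy as np

def write_colored_ply(path, points_xyz_rgb):
    """Write an (N, 6) array of X Y Z R G B rows as an ASCII PLY file."""
    n = len(points_xyz_rgb)
    header = "\n".join([
        "ply",
        "format ascii 1.0",
        f"element vertex {n}",
        "property float x",
        "property float y",
        "property float z",
        "property uchar red",
        "property uchar green",
        "property uchar blue",
        "end_header",
    ])
    with open(path, "w") as f:
        f.write(header + "\n")
        for x, y, z, r, g, b in points_xyz_rgb:
            f.write(f"{x} {y} {z} {int(r)} {int(g)} {int(b)}\n")

# Example with two dummy colored points.
pts = np.array([[0.0, 0.0, 1.0, 255, 0, 0],
                [0.1, 0.0, 1.2, 0, 255, 0]])
write_colored_ply("colored_cloud.ply", pts)
```

The resulting file opens in any viewer that understands vertex colors (e.g. MeshLab or CloudCompare).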

Thank you for your quick and thorough reply. I will work through the example to get a better understanding of the topic.

One thing I noticed while looking into the example hdev file is that it calls a procedure named nxLibGetPose. In my environment this procedure is unknown, which indicates that it is not a local procedure included in the file but an external procedure referenced from a library. Is it possible to provide the missing procedure? If not, it is no big problem; I will find a workaround.

Thank you and kind regards.

Hi @Braitmaier_PR ,

the procedures are included in the SDK installation, but you have to copy them manually or add them to the search path in Halcon. See this topic for where the procedures are located and where Halcon expects them.

You can find this procedure in the Ensenso installation under Ensenso/Development/Halcon/Procedures.

I’ve used it to check whether it is possible to obtain the required values for the Near and the Far plane programmatically. If you already know the correct values for Near and Far, you can use them directly.
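For reference, one generic way to derive usable Near/Far values programmatically is to take the Z range of an initial (uncolored) point map and widen it by a safety margin. A Python/numpy sketch of the idea only (this is not the nxLibGetPose procedure; the function name, margin, and dummy millimeter values are assumptions for illustration):

```python
import numpy as np

def near_far_from_z(z_image, margin=0.1):
    """Derive near/far clipping planes from the valid Z values of a
    point map, widened by a relative margin (10% by default)."""
    z = z_image[np.isfinite(z_image)]
    z_min, z_max = float(z.min()), float(z.max())
    span = z_max - z_min
    return z_min - margin * span, z_max + margin * span

# Dummy Z image in millimeters with invalid (NaN) border pixels.
z_img = np.full((4, 4), np.nan)
z_img[1:3, 1:3] = [[800.0, 820.0], [900.0, 1000.0]]

near, far = near_far_from_z(z_img)
print(near, far)  # 780.0 1020.0
```

Keeping the planes slightly wider than the measured range avoids clipping valid points at the edges of the working volume.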

I got it running now with my setup, thank you again for your quick help!

I have one further question, a little off topic but related: the C57 model we are currently using is on loan until the delivery of an ordered CR57-4-S model. Will the procedure for getting colored depth data be the same for the new CR57 model, or are there any changes planned?

Kind regards,

Braitmaier

I haven’t had the chance to test it, but from what I know of the X versus XR series, there may be some additional nodes to control the download of images. The part of your code that deals with adding color attributes to the points should also work with the CR57.