Lidar data coordinate system for preprocessed data #11
Hi, I have updated a tutorial for comparing meshes and point clouds in common world coordinates. You can check it out here.

Apologies for the lack of clarity. There are actually multiple valid choices for processing the LiDAR data, leading to different combinations of LiDAR transforms and lidar_gts = (rays_o, rays_d, ranges). As long as you apply the stored transform to the lidar_gts, all of the choices are correct. This is mainly for flexibility: both settings are read in the same way in our code. We always first read the LiDAR transform (whether it's an identity matrix or a real pose), and then use that transform to convert the point cloud to world coordinates.

To transform the LiDAR GT points into the same "world" coordinates, you need to read the LiDAR and ego_car transform data stored in the sequence data's scenario.pt, and then use them to transform the rays_o and rays_d (or the point clouds calculated from them), which are stored in the LiDAR's local coordinates.

Running the tutorial pops up a visualization window. You can also uncomment the last few lines to compare lidar points from all frames with the extracted mesh. Finally, you can get an interactive player (demonstrated in lidar_mesh_video.mp4) with:

```python
scene.debug_vis_anim(
    scene_dataloader=scene_dataloader,
    # plot_image=True, camera_length=8.,
    plot_lidar=True, lidar_pts_ds=2,
    mesh_file=mesh_path
)
```
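The transform chain described above can be sketched as follows. This is a minimal illustration, assuming you have read a 4x4 LiDAR-to-ego pose `l2e` and a per-frame ego-to-world pose `e2w` from scenario.pt; those names are illustrative, not the repo's actual keys.

```python
import numpy as np

# Compose the lidar->ego and ego->world poses, then map the locally stored
# rays into world coordinates. `l2e`/`e2w` are hypothetical names for the
# transforms read from scenario.pt.
def rays_to_world(rays_o, rays_d, ranges, l2e, e2w):
    l2w = e2w @ l2e                              # compose lidar -> world (4x4)
    valid = ranges > 0                           # drop beams with no return
    pts = rays_o[valid] + rays_d[valid] * ranges[valid, None]  # lidar-local points
    return pts @ l2w[:3, :3].T + l2w[:3, 3]      # rotate + translate into world
```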
Thanks for the clarification and tutorial!
According to the document, the npz lidar files store rays_o and rays_d in the world coordinate system.
However, the preprocessed data here doesn't seem to agree with that statement.
For example, this computes the max of the norm of rays_o over the session from "Quick download of one sequence (seg100613)'s processed data".
We expect it to be comparable to the vehicle's travel distance, but we get 0.42 m instead.
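A hypothetical version of that check: if rays_o were stored in world coordinates, the ray origins would spread over the whole trajectory, so their maximum norm should be on the order of the travel distance; a sub-meter value instead suggests the rays live in the lidar's local frame. The key name `rays_o` follows the document; the helper itself is illustrative.

```python
import numpy as np

# Load each per-frame npz and report the largest ray-origin norm seen.
def max_origin_norm(npz_paths):
    return max(
        np.linalg.norm(np.load(p)["rays_o"], axis=-1).max()
        for p in npz_paths
    )
```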
For background, I'm interested in comparing the extracted mesh (through extract_mesh.py) with the lidar ground truth by overlaying them, so I'm looking for a way to merge the lidar point clouds into a single cloud in the same world coordinate frame as the extracted mesh.
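The merging step could look roughly like this, assuming you have already collected one (rays_o, rays_d, ranges, l2w) tuple per frame, with l2w the 4x4 lidar-to-world pose for that frame read from scenario.pt (a sketch with illustrative names, not the repo's API):

```python
import numpy as np

# Transform each frame's lidar points into world coordinates and stack them
# into one cloud, which can then be overlaid with the extracted mesh.
def merge_lidar_frames(frames):
    clouds = []
    for rays_o, rays_d, ranges, l2w in frames:
        valid = ranges > 0                                     # keep real returns
        pts = rays_o[valid] + rays_d[valid] * ranges[valid, None]
        clouds.append(pts @ l2w[:3, :3].T + l2w[:3, 3])        # to world frame
    return np.concatenate(clouds, axis=0)                      # one merged cloud
```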