2. Cameras

2.1. Learning Objectives

In this example, we will learn how to

  • Add single and multiple camera publishers.

  • Send ground truth synthetic perception data through rostopics.


2.2. Add A Single Camera Publisher

We’ll start with a scene of a simple room and two AprilTag-tagged cubes on the floor. To load the scene, go to the Content tab, navigate to Isaac/Samples/ROS/Scenario/simple_room_apriltag.usd, and open the stage in a new editing layer.

To add a camera rostopic, go to the menu bar and select Create -> Isaac -> ROS -> Camera. A ROS_Camera prim will appear in the Stage Tree. This newly added ROS camera topic publishes the images seen in the viewport.

To view the published image, open a new terminal with the ROS environment sourced, make sure roscore is running and the simulator is playing, and run the command rosrun image_view image_view image:=/rgb to see the image published on the /rgb topic. It should reflect what is seen in the viewport; if you move the camera around in the viewport, the published image should follow.
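If you prefer to check the stream programmatically, a minimal rospy subscriber such as the sketch below can print the resolution and encoding of each frame. It assumes the default /rgb topic of type sensor_msgs/Image; the node name rgb_listener is hypothetical.

    # Sketch only: subscribe to the /rgb topic published by the camera bridge
    # and report basic information about each received frame.
    import rospy
    from sensor_msgs.msg import Image

    def on_image(msg):
        # Print resolution and pixel encoding of the incoming image.
        rospy.loginfo("Received %dx%d image, encoding=%s",
                      msg.width, msg.height, msg.encoding)

    rospy.init_node("rgb_listener")  # hypothetical node name
    rospy.Subscriber("/rgb", Image, on_image, queue_size=1)
    rospy.spin()

Run it in a ROS-sourced terminal while the simulator is playing; you should see one log line per published frame.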

2.3. Add Multiple Camera Publishers

You might have more than one camera in your scene, or perhaps your robot has a stereo camera. Here we will add publishers for two separate cameras. Inside the Stage tab, open up the World branch. You should see the two AprilTag cubes, Tagged_Cube_1 and Tagged_Cube_2, as well as Camera_1 and Camera_2. We will first associate the camera publisher we just added in the previous section with Camera_1, and then add a second publisher for Camera_2.

To attach the existing camera topic to Camera_1, select ROS_Camera in the Stage Tree and open the Raw USD Properties in the Property tab. Find the cameraPrim field, click Add Target(s), locate Camera_1 in the pop-up window’s Stage Tree, and click Add.

Before adding a second camera topic, let’s open a second viewport so we can see both cameras’ views. Go to Window -> New Viewport Window to open it, and select the desired camera view from the Cameras icon in the upper-left corner of the viewport.

Add another camera topic by repeating the steps from the previous section. To distinguish the rostopics of the two cameras, go to the Property tab of the second camera topic and edit the rgbPubTopic field to /rgb_2.
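As an alternative to clicking through the Property tab, the same configuration can be done from Isaac Sim’s Script Editor with the USD Python API. This is only a sketch under assumptions: the prim paths /ROS_Camera_01 and /World/Camera_2 are hypothetical and must be adjusted to match your Stage Tree; the cameraPrim relationship and rgbPubTopic attribute are the same properties shown in the Raw USD Properties panel above.

    # Sketch only: configure the second camera bridge prim from a script.
    # Prim paths below are assumptions; adjust them to your stage.
    import omni.usd
    from pxr import Sdf

    stage = omni.usd.get_context().get_stage()
    ros_cam = stage.GetPrimAtPath("/ROS_Camera_01")  # hypothetical path of the second camera topic prim

    # Point the publisher at Camera_2 (equivalent to "Add Target(s)" in the UI).
    ros_cam.GetRelationship("cameraPrim").AddTarget(Sdf.Path("/World/Camera_2"))

    # Rename the RGB topic so it does not clash with the first camera.
    ros_cam.GetAttribute("rgbPubTopic").Set("/rgb_2")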

This time we will check the published images in RViz. In a ROS-sourced terminal, run rviz -d ros_workspace/src/isaac_tutorials/rviz/turtle_stereo.rviz, and you’ll see that both images shown in the viewports are also displayed in RViz.
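If you want to consume both streams in your own node, a sketch like the following pairs roughly simultaneous frames from the two topics using message_filters. It assumes the /rgb and /rgb_2 topic names configured above; the node name stereo_listener is hypothetical.

    # Sketch only: pair frames from the two camera topics with an
    # approximate time synchronizer.
    import rospy
    import message_filters
    from sensor_msgs.msg import Image

    def on_pair(left, right):
        # Called with one message from each topic, matched by timestamp.
        rospy.loginfo("Pair received: %dx%d and %dx%d",
                      left.width, left.height, right.width, right.height)

    rospy.init_node("stereo_listener")  # hypothetical node name
    left_sub = message_filters.Subscriber("/rgb", Image)
    right_sub = message_filters.Subscriber("/rgb_2", Image)
    sync = message_filters.ApproximateTimeSynchronizer(
        [left_sub, right_sub], queue_size=10, slop=0.1)
    sync.registerCallback(on_pair)
    rospy.spin()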

2.4. Synthetic Ground Truth Data

In addition to the RGB image, the following synthetic sensor and perception data are also available for any camera.

  • Depth

  • BoundingBox 2D

  • BoundingBox 3D

  • Segmentation labels

  • Semantic/Instance labels

  • Point Cloud

The details of synthetic data generation are covered in the Replicator Tutorials series. Here we will show how to enable publishing this information as ROS messages, using depth as an example.

Open the Raw USD Properties tab for the ROS camera bridge. You will see many items other than RGB; by default, only RGB (and the camera itself) is enabled. Enable depth by checking the depthEnabled field. Depth information is now published on the topic named in the depthPubTopic field. We can visualize it using the image_view method again: rosrun image_view image_view image:=/depth.
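You can also inspect the raw depth values with a small subscriber. The sketch below assumes the default /depth topic and that the depth image is a 32-bit float image (encoding "32FC1", values in meters); check the encoding field of the message on your system before relying on it.

    # Sketch only: print the finite depth range of each received depth image.
    import rospy
    import numpy as np
    from sensor_msgs.msg import Image
    from cv_bridge import CvBridge

    bridge = CvBridge()

    def on_depth(msg):
        # Convert to a numpy array of floats (assumes 32FC1 encoding).
        depth = bridge.imgmsg_to_cv2(msg, desired_encoding="32FC1")
        finite = depth[np.isfinite(depth)]
        if finite.size:
            rospy.loginfo("Depth range: %.2f m to %.2f m", finite.min(), finite.max())

    rospy.init_node("depth_listener")  # hypothetical node name
    rospy.Subscriber("/depth", Image, on_depth, queue_size=1)
    rospy.spin()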

Try enabling the other rostopics, and use rostopic echo <topic_name> to see the raw information being published.

2.5. Troubleshooting

If your depth image shows only black and white sections, it is likely because some part of the field of view has “infinite” depth, which skews the contrast. Adjust the field of view so that the depth range in the image is limited.
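If adjusting the camera is not convenient, you can instead clamp far values before visualizing. The sketch below republishes a contrast-friendly 8-bit version of the depth image; the /depth_clamped output topic, the 10 m cutoff, and the 32FC1 encoding are assumptions, not part of the tutorial assets.

    # Sketch only: clamp "infinite" depth to a finite cutoff and republish
    # an 8-bit image that image_view can display with reasonable contrast.
    import rospy
    import numpy as np
    from sensor_msgs.msg import Image
    from cv_bridge import CvBridge

    MAX_DEPTH = 10.0  # meters; assumed cutoff, tune for your scene
    bridge = CvBridge()

    def on_depth(msg):
        depth = bridge.imgmsg_to_cv2(msg, desired_encoding="32FC1")
        # Replace NaN/inf with the cutoff, then clip and scale to 0-255.
        depth = np.nan_to_num(depth, nan=MAX_DEPTH, posinf=MAX_DEPTH)
        depth = np.clip(depth, 0.0, MAX_DEPTH)
        vis = (depth / MAX_DEPTH * 255.0).astype(np.uint8)
        pub.publish(bridge.cv2_to_imgmsg(vis, encoding="mono8"))

    rospy.init_node("depth_clamp")  # hypothetical node name
    pub = rospy.Publisher("/depth_clamped", Image, queue_size=1)
    rospy.Subscriber("/depth", Image, on_depth, queue_size=1)
    rospy.spin()

With the node running, view the clamped image with rosrun image_view image_view image:=/depth_clamped.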

2.6. Summary

This tutorial introduced the camera publisher and all of its associated synthetic data rostopics.

2.6.1. Next Steps

Continue on to the next tutorial in our ROS Tutorials series, Lidar Sensors, to learn how to add a lidar sensor to the Turtlebot3.

2.6.2. Further Learning

  • Additional information about synthetic data generation can be found in the Replicator Tutorial Series.

  • Examples of running a similar environment using the Standalone Python workflow are outlined here.