
Questions about VRX 2019 Rules


  • Questions about VRX 2019 Rules


    I am working on the VRX 2019 competition and have the following questions about the rules:
    1. Can we change the position and number of the thrusters? Can we also add degrees of freedom to them (e.g., add azimuth joints)?
    2. Can we change the position and number of the sensors?
    I have already read the rule book, but regarding the second question in particular, I think the current position of the 3D lidar makes sensing really difficult. That is why I am asking.

    Best regards, and thank you in advance.


  • #2

    Thank you for bringing up these questions. We are glad to see that you are looking through the guidance documents!

    We are actively working on the two areas of customization that you suggest - sensors and propulsion. We are trying to come up with a solution that allows teams some degree of customization within some bounds.

    For example, we are looking at the feasibility of specifying a common suite of sensors (GPS, IMU, cameras, lidars, etc.), but the number of sensors would be limited - for example, teams wouldn't be able to have 10 3D lidars and 20 cameras pointed in every direction. Teams would be able to specify the location and orientation of the sensors on the WAM-V. Again, the placement would be limited to physically feasible solutions.
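    The bounded customization described above can be pictured as a simple check over a team-supplied sensor list. A minimal sketch, assuming hypothetical field names and per-type limits (these are illustrative, not the actual VRX rules):

```python
# Hypothetical sensor-suite check: the sensor types, field names, and
# per-type limits below are illustrative assumptions, not official VRX rules.
MAX_SENSORS = {"gps": 1, "imu": 1, "camera": 4, "lidar_3d": 2}

def validate_sensor_suite(sensors):
    """Return a list of rule violations for a proposed sensor suite.

    `sensors` is a list of dicts like
    {"type": "camera", "position": [x, y, z], "orientation": [r, p, y]}.
    """
    violations = []
    counts = {}
    for s in sensors:
        counts[s["type"]] = counts.get(s["type"], 0) + 1
    for sensor_type, count in counts.items():
        limit = MAX_SENSORS.get(sensor_type)
        if limit is None:
            violations.append(f"unknown sensor type: {sensor_type}")
        elif count > limit:
            violations.append(f"too many {sensor_type}: {count} > {limit}")
    return violations

suite = [
    {"type": "lidar_3d", "position": [0.7, 0.0, 1.8], "orientation": [0, 0, 0]},
    {"type": "camera", "position": [0.75, 0.1, 1.5], "orientation": [0, 0, 0]},
]
print(validate_sensor_suite(suite))  # an allowed suite -> []
```

    A physical-feasibility check (e.g., that sensor positions fall within the WAM-V envelope) could be added to the same function in the same style.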

    Similarly, we are exploring the potential to provide options for propulsion. In the simulation we have a few pre-configured options. We are looking at the support needed to make a fixed number of configurations available to teams.

    I anticipate that we'll be able to provide limited customization for the VRX competition. This likely means that the model teams use in VRX will not be configured exactly like their physical WAM-V, because we just can't support that many different solutions, but hopefully it will be close enough that the software teams use for VRX will be essentially equivalent to the software they use for the physical competition.

    We understand that the position and attitude of the sensors are important to teams, so that is where we are concentrating our efforts for now. Are there other aspects that are particularly important to your team?

    The task descriptions and technical documents are still preliminary, so we are very interested in any and all feedback from the community.

    Thank you,

    PS - Many teams use the VRX project beyond the competition. For those uses the configurations are very customizable, and we've included a few tutorials.


    • #3

      I am sorry for the late reply.

      It is great that you are giving so much consideration to the robot descriptions.
      I appreciate it.
      I would like to confirm one point: if there are customization options that we can set from the launch file or in the URDF (more precisely, Xacro), is customizing them allowed in the competition?
      For example, choosing the thruster type, positions, and so on.

      Thinking about these options, we believe the following properties are important:

      1. The number of sensors should be limited (for example, up to three 3D lidars and up to four cameras).
      2. The position and pose of the sensors should be configurable as options.

      Also, the GPS sensor is currently implemented to publish only a position in the world frame.
      But real sensors also detect the direction.
      Thus, we think the GPS plugin should publish both position and direction (pose) in the world frame.

      Best Regards,


      • #4

        Thank you for the reply. This is indeed a work in progress and something the team continues to discuss. I would recommend that you take a look at the wiki describing how to configure the sensors...

        For thruster/propulsion configuration you may want to look at...

        We would welcome any comments or suggestions from the community. We particularly appreciate suggestions that come in the form of pull requests that we can integrate into the project! In particular, you mentioned that you might have sensor suggestions; the current sensors we are considering are listed in the documentation. If you have other configurations or sensors, it would be great to describe them as possible additions.

        Thank you,


        • #5

          Thank you for the reply.

          We have now read the wiki about configuration that you recommended.

          My team (OUXT-Polaris) plans to create a new GPS plugin that returns both position and pose in the world frame, and we would like to contribute it upstream.
          A team member will comment on the details.

          Best Regards,


          • #6
            Hi Brian, I am now developing an nmea_gps_plugin for Gazebo.
            I want to release this plugin as a binary so that it can be installed via rosdep.
            After that, we will send a pull request to the vrx repo.


            • #7
              That sounds like a great plan. Looking forward to the contribution.


              • #8
                Brian, I have developed the nmea_gps_plugin package for Gazebo and released it.
                The package should be installable via apt in about one month.


                • #9

                  I've added an issue to the VRX project which includes evaluating how your plugin could be used to support VRX.
                  Please feel free to comment or contribute to the issue.

                  One thing I do notice is that your plugin publishes the sensor data as a series of NMEA sentences. The sensor plugins we have been using in VRX so far provide a higher-level interface that generalizes the sensor information independently of the particular sensor communication protocol - e.g., the current GPS sensor plugin we use publishes fix and fix_velocity information directly.
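                  To make that distinction concrete, here is a small sketch of what consuming raw NMEA output involves, compared with subscribing to a ready-made fix topic. The sample sentence is a standard textbook GGA example, and the parser is illustrative, not code from either plugin:

```python
# Illustrative parser for one NMEA GGA sentence; the sample sentence below is a
# standard textbook example, not output from the actual nmea_gps_plugin.
def nmea_checksum_ok(sentence: str) -> bool:
    """Verify the two-hex-digit XOR checksum of a '$...*hh' NMEA sentence."""
    body, _, checksum = sentence.lstrip("$").partition("*")
    calc = 0
    for ch in body:
        calc ^= ord(ch)  # NMEA checksum is the XOR of all bytes between $ and *
    return f"{calc:02X}" == checksum.strip().upper()

def parse_gga(sentence: str):
    """Extract (latitude, longitude) in decimal degrees from a GGA sentence."""
    if not nmea_checksum_ok(sentence):
        raise ValueError("bad NMEA checksum")
    fields = sentence.split(",")
    # Latitude is ddmm.mmmm, longitude is dddmm.mmmm; convert minutes to degrees.
    lat = int(fields[2][:2]) + float(fields[2][2:]) / 60.0
    if fields[3] == "S":
        lat = -lat
    lon = int(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[5] == "W":
        lon = -lon
    return lat, lon

gga = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
lat, lon = parse_gga(gga)  # roughly (48.1173, 11.5167)
```

                  A fix-style topic delivers the last two numbers directly, which is why the higher-level interface is convenient for consumers even when the underlying hardware speaks NMEA.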



                  • #10

                    I have a new question about the robot configuration.
                    How will the operators manage the robot configurations?
                    Each team should have its own configured robot model (the configuration should be settable as an option in the launch file).
                    If that is the case, how will the operators manage each team's configuration?

                    We (our team) want to keep as much time as possible to modify our configuration.
                    To that end, we propose two methods:
                    1. The robot configuration is published as a rosparam based on our own URDF.
                    This is the method we prefer.
                    First, we set the robot_description parameter based on the URDF on our PC, and the operators' PC retrieves that robot_description to launch the Gazebo model.
                    To verify that the URDF complies with the rules, the operators should have a program that checks the robot_description automatically.
                    The benefit of this method is that we can keep modifying our configuration until as late as possible.
                    2. The robot configuration file is submitted by a deadline that the operators decide, and the operators' PC launches the submitted robot model.
                    This method is simple.
                    But compared with the first method, we have less time to modify the robot configuration.

                    Please let us know which of these (or other) methods you (the operators) will take.

                    Best Regards,


                    • #11

                      This feature - allowing teams to customize the propulsion and sensor configuration - is being actively developed and documented. We are working towards a new release of the Technical Description document that will describe a solution for the VRX competition. That release is due out at the end of this month (June), at which point we'll work with the teams to integrate their feedback.

                      The current process is documented on this wiki page.
                      We would encourage you to try that process and to submit any issues or recommended enhancements.

                      Thank you,


                      • #12

                        Thank you for pointing us to the URDF auto-generation program.
                        And I am sorry it took me a while to understand your intention.

                        Following the wiki page, I generated our own URDF file with the auto-generation program from our own configuration file.

                        I also understand that the program automatically checks the limits on the number of sensors.
                        With that done, I have a few related questions and requests:

                        1. By when should we submit our YAML files or generated URDF file to the judges? Or is submission unnecessary?
                        2. Our team would like the limit on the number of 3D lidars to be two.
                        3. Our team would like an NMEA-type GPS sensor.

                        About the first point:
                        I understand that the configuration files should be thruster_config.yaml and sensor_config.yaml.
                        But I want to know when the judges will check each team's configuration files.
                        Or, since the auto-generation program already checks the configuration files, will the judges not check the configuration or URDF files at all?

                        About the second point:
                        We think two 3D lidars are necessary: one to observe a widespread area, and the other to observe the near field.
                        In fact, at RobotX 2018 we used two Velodynes in exactly that way.

                        About the third point:
                        You kindly mentioned hector_gazebo_plugins, and I understand that plugin is included in the latest vrx package.
                        We understand that hector_gazebo_plugins returns the longitude and latitude data on the fix topic and the velocity on the fix_velocity topic.
                        But from the velocity information we cannot determine the orientation (direction) in the world frame.
                        An NMEA sensor does return the orientation (direction) in the world frame.
                        In fact, we used such a sensor at RobotX 2018, and it is in commercial use.
                        That is why we created the NMEA Gazebo plugin, and we would like permission for it to be integrated into the VRX package.
                        Here is the created one.

                        Best Regards,


                        • #13

                          Thank you for the additional information. The details of these aspects of the competition are still being refined, so your comments are timely.

                          For the numbered items you highlight:

                          1. The YAML configuration files will be part of the submission. We are working on updates to the VRX Technical Guide and VRX Task Descriptions; preliminary drafts are already available. These updates will provide more details on the submission process. We expect to have all of this released with the new software release at the end of this month.

                          2. We are working on defining the constraints for sensor configurations - types and numbers of available sensors. I'll pass on your request for multiple 3D lidars to the development team and solicit their feedback.

                          3. Could you provide some information on the "NMEA sensor" you are recommending? I will bring this up with the development team as well.

                          Thank you for continuing to contribute to the development of the project!



                          • #14

                            Sorry for the late reply.

                            1. Regarding the configuration file submission, we will wait for your update!
                            2. We saw the latest version of the vrx package and understand that the limit on the number of lidars is now two. Thank you very much for considering our opinion.
                            3. The NMEA sensor we are referring to behaves as follows.
                            With hector_gazebo_plugins, for example, if the ship is pushed sideways by wind or waves while its orientation does not change, the plugin reports the sideways velocity.
                            But what we want is the orientation of the ship in the world frame.
                            An NMEA sensor, and our nmea_gazebo_plugin, returns that global orientation information.
                            We need this kind of information for our localization method.
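                            The difference between the velocity direction a fix_velocity-style topic lets you recover (course over ground) and the ship's actual heading can be sketched numerically. The axis convention (x north, y east) and the numbers are made up for illustration:

```python
import math

# Illustrative numbers: a boat pointing north (heading 0 rad) while wind and
# waves push it sideways, so its world-frame velocity points mostly east.
# Convention assumed here: x = north, y = east.
heading = 0.0       # ship orientation (yaw) in the world frame, radians
vx, vy = 0.2, 1.5   # world-frame velocity: 0.2 m/s north, 1.5 m/s east

# Course over ground is all that velocity alone can tell you:
course_over_ground = math.atan2(vy, vx)

print(round(math.degrees(course_over_ground), 1))  # 82.4 deg, not 0 deg
```

                            The two angles coincide only when the boat moves straight ahead; under sideways drift they diverge, which is why a heading measurement is needed in addition to velocity.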

                            Thank you for your kind consideration.
                            Best Regards,

                            Last edited by Ryodo Tanaka; 07-03-2019, 06:51 PM.


                            • #15

                              As for the NMEA sensor, the scenario we are advocating is that the WAM-V simulation includes both GPS and IMU plugins (from hector_gazebo_plugins). In this scenario the vehicle heading would be observed from the /imu message (sensor_msgs/Imu). Functionally this would be equivalent to having a single sensor that measures position, velocity, and attitude, but with the information published on multiple topics (/fix, /fix_velocity, and /imu). I believe that would be a reasonable approximation of your physical setup - is that right?
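                              As a sketch of that scenario, the heading can be recovered from the orientation quaternion carried in a sensor_msgs/Imu message. The conversion below is the standard ZYX (yaw-pitch-roll) formula, and the quaternion is a made-up example of a pure 90-degree yaw, not data from the VRX stack:

```python
import math

def yaw_from_quaternion(x: float, y: float, z: float, w: float) -> float:
    """Yaw (heading) in radians from an orientation quaternion (ZYX convention)."""
    siny_cosp = 2.0 * (w * z + x * y)
    cosy_cosp = 1.0 - 2.0 * (y * y + z * z)
    return math.atan2(siny_cosp, cosy_cosp)

# Made-up example: a pure 90-degree yaw, as it might appear in /imu orientation.
qx, qy, qz, qw = 0.0, 0.0, math.sin(math.pi / 4), math.cos(math.pi / 4)
heading = yaw_from_quaternion(qx, qy, qz, qw)
print(round(math.degrees(heading), 1))  # 90.0
```

                              Combining this heading with /fix and /fix_velocity gives the same position, velocity, and attitude information a combined NMEA-style sensor would report, just split across topics.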