I have a question on how the tool is able to run modules which require lidar/camera/ground-truth data like prediction/perception during 'Sim Control'?
I am able to route and actuate the vehicle on some demo maps using bootstrap and Sim Control without sending any explicit sensor data channels for the modules to read from. How is this possible in Sim Control? Does the sensor data come from pre-simulated data?
For the planning module to operate, it needs to receive prediction messages, and SimControl publishes dummy prediction messages with no obstacles by default. So to run SimControl, essentially only the planning and routing modules need to be turned on, and there will be no obstacles in the simulation.
// From SimControl: publish an empty PredictionObstacles message so that
// planning can run without the real prediction module. The flag is
// cleared when a real prediction module takes over the channel.
void SimControl::PublishDummyPrediction() {
  auto prediction = std::make_shared<PredictionObstacles>();
  {
    std::lock_guard<std::mutex> lock(mutex_);
    if (!send_dummy_prediction_) {
      return;
    }
    FillHeader("SimPrediction", prediction.get());
  }
  prediction_writer_->Write(prediction);
}
There are also tools like replay_perception that publish perception messages; you can then turn on the prediction module and see Apollo reacting to those obstacles in SimControl simulation.