g2o tutorial explained

Disclaimer

This is a quick and dirty explanation of how the g2o tutorial works. It is not really a tutorial on using g2o per se, but I hope it helps you get a first “jump” into g2o and its inner workings :).

How to use G2O

Here is the code from the file tutorial_slam2d.cpp:

#include <iostream>
#include <cmath>

#include "simulator.h"
#include "vertex_se2.h"
#include "vertex_point_xy.h"
#include "edge_se2.h"
#include "edge_se2_pointxy.h"
#include "types_tutorial_slam2d.h"

#include "g2o/core/sparse_optimizer.h"
#include "g2o/core/block_solver.h"
#include "g2o/core/factory.h"
#include "g2o/core/optimization_algorithm_factory.h"
#include "g2o/core/optimization_algorithm_gauss_newton.h"
#include "g2o/solvers/csparse/linear_solver_csparse.h"

using namespace std;
using namespace g2o;
using namespace g2o::tutorial;

int main()
{
// TODO simulate different sensor offset
// simulate a robot observing landmarks while travelling on a grid
SE2 sensorOffsetTransf(0.2, 0.1, -0.1);
int numNodes = 300;
Simulator simulator;
simulator.simulate(numNodes, sensorOffsetTransf);

/*********************************************************************************
* creating the optimization problem
********************************************************************************/

typedef BlockSolver< BlockSolverTraits<-1, -1> > SlamBlockSolver;
typedef LinearSolverCSparse<SlamBlockSolver::PoseMatrixType> SlamLinearSolver;

// allocating the optimizer
SparseOptimizer optimizer;
SlamLinearSolver* linearSolver = new SlamLinearSolver();
linearSolver->setBlockOrdering(false);
SlamBlockSolver* blockSolver = new SlamBlockSolver(linearSolver);
OptimizationAlgorithmGaussNewton* solver = new OptimizationAlgorithmGaussNewton(blockSolver);

optimizer.setAlgorithm(solver);

// add the parameter representing the sensor offset
ParameterSE2Offset* sensorOffset = new ParameterSE2Offset;
sensorOffset->setOffset(sensorOffsetTransf);
sensorOffset->setId(0);
optimizer.addParameter(sensorOffset);

// adding the odometry to the optimizer
// first adding all the vertices
cerr << "Optimization: Adding robot poses ... ";
for (size_t i = 0; i < simulator.poses().size(); ++i) {
const Simulator::GridPose& p = simulator.poses()[i];
const SE2& t = p.simulatorPose;
VertexSE2* robot = new VertexSE2;
robot->setId(p.id);
robot->setEstimate(t);
optimizer.addVertex(robot);
}
cerr << "done." << endl;

// second add the odometry constraints
cerr << "Optimization: Adding odometry measurements ... ";
for (size_t i = 0; i < simulator.odometry().size(); ++i) {
const Simulator::GridEdge& simEdge = simulator.odometry()[i];
EdgeSE2* odometry = new EdgeSE2;
odometry->vertices()[0] = optimizer.vertex(simEdge.from);
odometry->vertices()[1] = optimizer.vertex(simEdge.to);
odometry->setMeasurement(simEdge.simulatorTransf);
odometry->setInformation(simEdge.information);
optimizer.addEdge(odometry);
}
cerr << "done." << endl;

// add the landmark observations
cerr << "Optimization: add landmark vertices ... ";
for (size_t i = 0; i < simulator.landmarks().size(); ++i) {
const Simulator::Landmark& l = simulator.landmarks()[i];
VertexPointXY* landmark = new VertexPointXY;
landmark->setId(l.id);
landmark->setEstimate(l.simulatedPose);
optimizer.addVertex(landmark);
}
cerr << "done." << endl;

cerr << "Optimization: add landmark observations ... ";
for (size_t i = 0; i < simulator.landmarkObservations().size(); ++i) {
const Simulator::LandmarkEdge& simEdge = simulator.landmarkObservations()[i];
EdgeSE2PointXY* landmarkObservation = new EdgeSE2PointXY;
landmarkObservation->vertices()[0] = optimizer.vertex(simEdge.from);
landmarkObservation->vertices()[1] = optimizer.vertex(simEdge.to);
landmarkObservation->setMeasurement(simEdge.simulatorMeas);
landmarkObservation->setInformation(simEdge.information);
landmarkObservation->setParameterId(0, sensorOffset->id());
optimizer.addEdge(landmarkObservation);
}
cerr << "done." << endl;
/*********************************************************************************
* optimization
********************************************************************************/

// dump initial state to the disk
optimizer.save("tutorial_before.g2o");

// prepare and run the optimization
// fix the first robot pose to account for gauge freedom
VertexSE2* firstRobotPose = dynamic_cast<VertexSE2*>(optimizer.vertex(0));
firstRobotPose->setFixed(true);
optimizer.setVerbose(true);

cerr << "Optimizing" << endl;
optimizer.initializeOptimization();
optimizer.optimize(10);
cerr << "done." << endl;

optimizer.save("tutorial_after.g2o");

// freeing the graph memory
optimizer.clear();

// destroy all the singletons
Factory::destroy();
OptimizationAlgorithmFactory::destroy();
HyperGraphActionLibrary::destroy();

return 0;
}

And now a little more explanation of what this tutorial file does:

First step : create the simulation

We want to simulate a robot travelling on a grid and observing landmarks. The class SE2 represents a 2D isometry (a rotation plus a translation) and is used both for vertex estimates and for the relative transformations stored in edges. The simulator keeps, for every pose, a true pose and a simulated (noisy) pose, and for every landmark a vector called seenBy listing the poses that observed it.

By calling the function simulate(numNodes, sensorOffsetTransf), a set of landmarks is created and observed by the simulated robot from its different poses.

First, poses are created by placing the robot somewhere and moving it around using a random number generator.

while ((int)poses.size() < numNodes) {
  // add straight motions
  for (int i = 1; i < steps && (int)poses.size() < numNodes; ++i) {
    Simulator::GridPose nextGridPose = generateNewPose(poses.back(), SE2(stepLen,0,0), transNoise, rotNoise);
    poses.push_back(nextGridPose);
  }
  if ((int)poses.size() == numNodes)
    break;

  // sample a new motion direction
  double sampleMove = Rand::uniform_rand(0., 1.);
  int motionDirection = 0;
  while (probLimits[motionDirection] < sampleMove && motionDirection+1 < MO_NUM_ELEMS) {
    motionDirection++;
  }

  SE2 nextMotionStep = getMotion(motionDirection, stepLen);
  Simulator::GridPose nextGridPose = generateNewPose(poses.back(), nextMotionStep, transNoise, rotNoise);

  // check whether we will walk outside the boundaries in the next iteration
  SE2 nextStepFinalPose = nextGridPose.truePose * maxStepTransf;
  if (fabs(nextStepFinalPose.translation().x()) >= bound[0] || fabs(nextStepFinalPose.translation().y()) >= bound[1]) {
    //cerr << "b";
    // will be outside boundaries using this motion, try to find a valid one
    for (int i = 0; i < MO_NUM_ELEMS; ++i) {
      nextMotionStep = getMotion(i, stepLen);
      nextGridPose = generateNewPose(poses.back(), nextMotionStep, transNoise, rotNoise);
      nextStepFinalPose = nextGridPose.truePose * maxStepTransf;
      if (fabs(nextStepFinalPose.translation().x()) < bound[0] && fabs(nextStepFinalPose.translation().y()) < bound[1])
        break;
    }
  }

  poses.push_back(nextGridPose);
}
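The direction-sampling loop above is a small inverse-CDF draw: probLimits stores cumulative probabilities, and the uniform sample selects the first bucket whose limit exceeds it. Here is a self-contained sketch of just that logic (the probability values in the usage example are invented, not the tutorial's):

```cpp
#include <vector>

// Pick the first index whose cumulative probability limit exceeds the sample,
// exactly like the motionDirection while-loop in the simulator.
int sampleDirection(const std::vector<double>& probLimits, double sampleMove) {
  int motionDirection = 0;
  while (probLimits[motionDirection] < sampleMove &&
         motionDirection + 1 < (int)probLimits.size()) {
    motionDirection++;
  }
  return motionDirection;
}
```

With cumulative limits {0.8, 0.9, 1.0}, a sample of 0.5 selects direction 0, 0.85 selects direction 1, and 0.95 selects direction 2, so the first direction is drawn 80% of the time.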

struct G2O_TUTORIAL_SLAM2D_API GridPose
{
EIGEN_MAKE_ALIGNED_OPERATOR_NEW;
int id;
SE2 truePose;
SE2 simulatorPose;
LandmarkPtrVector landmarks; ///< the landmarks observed by this node
};

Every pose possesses both a true and a simulated (noisy) variant. It also holds the set of landmarks seen from that pose.

Then landmarks are randomly created on the grid. The landmark structure looks like this:

struct G2O_TUTORIAL_SLAM2D_API Landmark
{
EIGEN_MAKE_ALIGNED_OPERATOR_NEW;
int id;
Eigen::Vector2d truePose;
Eigen::Vector2d simulatedPose;
std::vector<int> seenBy;
Landmark() : id(-1) {}
};

It contains a true position, a simulated position, and the list of poses it was seen by. From every pose, landmark observations are created by checking which landmarks are visible from that pose. Note that the observations are computed from the actual true pose of the robot, and Gaussian noise is then added to them.
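As a sketch of that noise step (this is not the tutorial's actual Rand helper, and the Vec2 struct is a stand-in for Eigen::Vector2d), corrupting a true measurement looks like this:

```cpp
#include <random>

// Minimal stand-in for Eigen::Vector2d.
struct Vec2 { double x, y; };

// Corrupt a true 2D observation with independent Gaussian noise on each axis,
// as the simulator does before storing simulatorMeas.
Vec2 addObservationNoise(const Vec2& trueMeas, double sigma, std::mt19937& rng) {
  std::normal_distribution<double> noise(0.0, sigma);
  return Vec2{trueMeas.x + noise(rng), trueMeas.y + noise(rng)};
}
```

The noisy measurements are unbiased: averaged over many observations they recover the true value, which is what lets the optimizer reconcile repeated observations of the same landmark.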

Adding the odometry is pretty straightforward from the poses.

Next we add the landmark observations to the simulator object itself, storing every observation (even repeated observations of the same landmark) with noise added. Landmark observations are stored in:

struct G2O_TUTORIAL_SLAM2D_API LandmarkEdge
{
int from;
int to;
Eigen::Vector2d trueMeas;
Eigen::Vector2d simulatorMeas;
Eigen::Matrix2d information;
EIGEN_MAKE_ALIGNED_OPERATOR_NEW;
};

which represents the edge between a robot pose and a landmark.

tl;dr : simulate a robot on a grid with landmarks by generating a bunch of poses and landmarks; for every pose, compute which landmarks are seen and store each noisy observation as an edge between the pose and the landmark.

Second step : creating the optimization problem

The optimization problem is composed of two parts: the optimizer, which represents the optimization problem itself, and the measurements.

Using the graph of g2o

We first need to allocate the optimizer. We create a SparseOptimizer, derived from OptimizableGraph, an abstract class that represents one optimization problem. It specializes the general graph to contain special vertices and edges: the vertices represent parameters that can be optimized, while the edges represent constraints. This class also provides basic functionality to back up and restore portions of the vertices.

The Edge dependency tree is like this :

EdgeSE2 -> BaseBinaryEdge -> BaseEdge -> OptimizableGraph::Edge

To attach additional information to a vertex or an edge, one needs to extend the class Data:

class G2O_CORE_API Data : public HyperGraph::HyperGraphElement {
public:
Data();
~Data();
//! read the data from a stream
virtual bool read(std::istream& is) = 0;
//! write the data to a stream
virtual bool write(std::ostream& os) const = 0;
virtual HyperGraph::HyperGraphElementType elementType() const { return HyperGraph::HGET_DATA;}
inline const Data* next() const {return _next;}
inline Data* next() {return _next;}
inline void setNext(Data* next_) { _next = next_; }
inline DataContainer* dataContainer() { return _dataContainer;}
inline const DataContainer* dataContainer() const { return _dataContainer;}
inline void setDataContainer(DataContainer * dataContainer_){ _dataContainer = dataContainer_;}
protected:
Data* _next; // linked list of multiple data;
DataContainer* _dataContainer;
};

See the class RobotData in types/data/robot_data.h and then RawLaser in types/data/raw_laser.h for an example.

How to get a Data element from the graph : RobotLaser* rl1 = dynamic_cast<RobotLaser*>(r1->userData());

That Data is then used by the optimizer in the function optimize(), and in particular by _algorithm, which is an OptimizationAlgorithm with a solve function. Here we use OptimizationAlgorithmGaussNewton, derived from OptimizationAlgorithmWithHessian.

tl;dr : to build the graph representing the problem, we can attach custom data structures derived from Data, which the graph handles through pointers and virtual functions. I still need to figure out where those data are compared/optimized; I need to read the paper more for that.


Create the graph

The vertices of the graph are all the robot poses and the landmark positions. There is no real distinction between the two other than semantics. As edges we add the odometry measurements, plus an edge between every landmark vertex and each robot pose that observes it. Every edge carries a measurement relating the two vertices it connects. Again, other than the semantics, no distinction is made between those two types of edges.

By adding all the edges this way, we introduce a number of conflicting constraints into the graph, and it is exactly those conflicts that the optimizer will try to reconcile.

Create Solver

Once we’ve created an optimizer, we need to give it an algorithm, i.e. the solver used by the optimizer. Here we use LinearSolverCSparse, a linear solver based on CSparse. The base class for linear solvers is LinearSolver, a basic solver for Ax = b that has to be reimplemented for each linear algebra library. A is assumed to be symmetric (only the upper triangular block is stored) and positive semi-definite.

The linear solver is used by the block solver. The block solver inherits from the class Solver, a generic interface for a sparse solver operating on a graph, which solves one iteration of the linearized objective function. Thus BlockSolver is a solver with a solver inside.

This block solver is then used by OptimizationAlgorithmGaussNewton, which inherits from OptimizationAlgorithm, a generic interface for a non-linear solver operating on a graph.
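Under the hood, the linear solver factorizes the symmetric positive-definite system matrix, typically via Cholesky. Here is a dense 2x2 sketch of the same idea, written from scratch rather than taken from g2o:

```cpp
#include <array>
#include <cmath>

// Solve A x = b for symmetric positive-definite A via Cholesky (A = L L^T),
// the dense analogue of what LinearSolverCSparse does on the sparse system.
std::array<double, 2> choleskySolve2x2(const double A[2][2], const double b[2]) {
  // Factor: A = L * L^T with L lower triangular.
  double l00 = std::sqrt(A[0][0]);
  double l10 = A[1][0] / l00;
  double l11 = std::sqrt(A[1][1] - l10 * l10);
  // Forward substitution: L y = b.
  double y0 = b[0] / l00;
  double y1 = (b[1] - l10 * y0) / l11;
  // Back substitution: L^T x = y.
  double x1 = y1 / l11;
  double x0 = (y0 - l10 * x1) / l00;
  return {x0, x1};
}
```

Because A is symmetric only one triangle needs to be stored and factored, which is why LinearSolver assumes the upper triangular block representation mentioned above.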


The setBlockOrdering call is only a flag telling the linear solver whether to compute a block ordering of the matrix before factorizing it (here it is disabled).

We also need to register the parameter representing the sensor offset.

Next we add all the odometry measurements to the simulator. The simulator only exists so that we can generate every pose of the robot; the simulator object is thus a world representation.

From the odometry, one then needs to add the odometry constraints to the optimizer.

A landmark observation is simply the vector between the robot position and the observed landmark. It is expressed relative to the robot pose, not in global coordinates.
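A hand-rolled sketch of that change of frame (this SE2 struct is written from scratch for illustration; g2o's tutorial SE2 class wraps the same math via Eigen):

```cpp
#include <cmath>

// Minimal 2D rigid transform: rotation by theta, then translation (x, y).
struct SE2 {
  double x, y, theta;
  // Map a world-frame point into this pose's local frame (inverse transform):
  // local = R(theta)^T * (p - t). This is how a landmark observation relative
  // to the robot is generated from global coordinates.
  void worldToLocal(double wx, double wy, double& lx, double& ly) const {
    double dx = wx - x, dy = wy - y;
    double c = std::cos(theta), s = std::sin(theta);
    lx =  c * dx + s * dy;
    ly = -s * dx + c * dy;
  }
};
```

For example, a robot at (1, 1) facing along +y observing a landmark at (1, 3) measures it as (2, 0) in its own frame: two meters straight ahead.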

Third step : optimization

First we save the initial state to disk.

We then have to call initializeOptimization on the optimizer. It initializes the structures for optimizing the whole graph. Before calling it, be sure to invoke setMarginalized() and setFixed() on the vertices you want to include in the Schur complement or to keep fixed during the optimization (here the first robot pose is fixed, to account for gauge freedom). The level argument is the level in a multilevel optimization. It returns false if something goes wrong.

We then launch the optimization using optimize(). It starts one optimization run given the current configuration of the graph and the current settings stored in the class instance. It can only be called after initializeOptimization.
