[Math] Is an object located within the field of view of a robot

geometry, linear algebra, self-learning, trigonometry

Background:

Assume we have three robots as given in the figure.

The robot in the middle has a camera in the front and one in the back.

How to calculate whether:

  1. the grey robot is in the field-of-view (FOV) — marked with orange line — of the front camera of the green robot?
  2. the blue robot is in the FOV of the back camera of the green robot?

[Figure: the three robots; the orange lines mark the FOV of the green robot's cameras]


Current knowledge:

  • Ignore the rectangular shape of the robots (if that helps).
  • The coordinates of all robots are known; they are given as Coord(x, y).
  • The maximum range of the FOV is 100 m; this is the length of the orange lines.
  • The half-angle of the FOV is 45 degrees (the angle between the blue arrow and an orange line).
  • The coordinate system used for the robots' headings ranges from -pi to pi (as shown in the figure below), so the heading of the green robot is -pi.

[Figure: the heading convention, with angles ranging from -pi to pi]
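For instance, with this convention the green robot's heading of $-\pi$ corresponds to a front-facing unit vector of $$(\cos(-\pi),\ \sin(-\pi)) = (-1,\ 0),$$ and the back camera faces the opposite direction, $(1,\ 0)$.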


What has been done so far:

So far I have tried to adapt an existing implementation to my case, but I am not sure whether it is correct, so I would appreciate validation or comments on it.

double distanceFromGreen2Grey = greenPos.distance(greyPos);

// Angles
double greenHeading = r0->getHeading();
double greyHeading = r1->getHeading();

// Calculate normalized vectors
Coord vectorFromGreen2Grey = (greyPos - greenPos) / distanceFromGreen2Grey;
Coord greenHeadingVector = Coord(cos(greenHeading), sin(greenHeading)) * 1; // <-- 1 used for front
Coord greyHeadingVector = Coord(cos(greyHeading), sin(greyHeading)) * -1; //  <-- -1 used for back

// why use acos here?
double angle = acos(vectorFromGreen2Grey * greenHeadingVector) / M_PI * 180;
double bearing = acos(vectorFromGreen2Grey * greyHeadingVector) / M_PI * 180;


bool inRange = distanceFromGreen2Grey <= 100;
bool inAngle = (vectorFromGreen2Grey * greyHeadingVector) > cos(M_PI/4);
bool inAngle2 = ((greyPos - greenPos)/cos(M_PI/4))*greyHeadingVector >= distanceFromGreen2Grey;
bool withinBearing = (vectorFromGreen2Grey * greenHeadingVector) < 0;

Extra question: How do I find the end points of the orange lines, so that I can draw the lines from the green robot's coordinates to those endpoints? The FOV lines should move as the robots move.


References:

I found other questions on SE which address this topic, but they don't use acos as in my code.

  1. https://stackoverflow.com/questions/22542821/how-to-calculate-if-something-lies-in-someones-field-of-vision mentions something like a velocity vector, but I could not map that to my case.
  2. Something similar has also been asked here: How do I plot a 'field of view' in 2D space and find if a certain point lies within the space?

Best Answer

This is almost exactly the same problem as one of the ones you cite. The only interesting difference is that instead of having to determine the camera’s facing from the endpoints of a line segment (robot arm section), you already know it. In addition, the field of view has a limited range, but that’s not a significant complication, nor is the addition of a rear-facing camera.

To review, in the cited question the target point is in the field of view if $$(Q-P_4)\cdot(P_4-P_3)\ge\|Q-P_4\|\,\|P_4-P_3\|\cos\theta,\tag{*}$$ where $Q$ is the target point, $P_4$ is the location of the camera, $P_3$ the other end of the robot arm segment, and $\theta$ is the half-angle of the field of view. Since we don’t have a robot arm holding the camera, let’s just call the camera’s location $P$ and let’s call the green robot’s heading $\phi$. $P_4-P_3$ gives us a vector in the direction of the camera’s facing, so assuming that the camera is facing directly forward, we can use the unit vector in that direction: $(\cos\phi,\sin\phi)$. Using a unit vector also eliminates the factor of $\|P_4-P_3\|$, since that came from normalizing the camera-direction vector. For this camera, $\theta=\pi/4$, so $\cos\theta=1/\sqrt2$. Putting this all together and moving the $\sqrt2$ to the left side, we get $$\sqrt2\,(Q-P)\cdot(\cos\phi,\sin\phi)\ge\|Q-P\|.\tag{1}$$ The range check is, of course, $\|Q-P\|\le100$. To check the rear view, just reverse the unit facing vector to $(-\cos\phi,-\sin\phi)$.
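As a concrete illustration, here is a minimal C++ sketch of test (1); the Vec2 type, function name, and default parameters are my own illustrative choices, not the Coord API from your snippet:

#include <cmath>

struct Vec2 { double x, y; };

// Test (1): is the target Q inside the camera's cone?
// P      : position of the camera (the green robot)
// phi    : the green robot's heading in radians (front direction)
// facing : +1 for the front camera, -1 for the back camera
bool inFieldOfView(Vec2 P, Vec2 Q, double phi, int facing,
                   double range = 100.0, double halfAngle = M_PI / 4)
{
    double dx = Q.x - P.x, dy = Q.y - P.y;
    double dist = std::hypot(dx, dy);              // ||Q - P||
    if (dist > range) return false;                // range check
    double fx = facing * std::cos(phi);            // unit facing vector
    double fy = facing * std::sin(phi);
    // (Q - P) . facing >= ||Q - P|| cos(theta), i.e. inequality (1) with theta = pi/4
    return dx * fx + dy * fy >= dist * std::cos(halfAngle);
}

Calling this with facing = -1 is exactly the rear-camera check with the reversed facing vector.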

Since you have two conditions that both involve testing against $\|Q-P\|$, it’s more efficient to test the one that requires less work first. I suggest testing the square of the distance first, i.e., $\|Q-P\|^2=(Q-P)\cdot(Q-P)\le100^2$, to avoid computing a square root if the target is too far away. I can’t tell from your code snippet whether or not that’s practical, though. If the target isn’t too far away, then you can go ahead and take the square root and compute the rest of test (1).
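A sketch of that ordering, reusing the illustrative helper above but deferring the square root until after the cheap squared-distance rejection:

// Same test as above, with the squared-distance early-out.
bool inFieldOfViewFast(Vec2 P, Vec2 Q, double phi, int facing,
                       double range = 100.0, double halfAngle = M_PI / 4)
{
    double dx = Q.x - P.x, dy = Q.y - P.y;
    double distSq = dx * dx + dy * dy;             // ||Q - P||^2, no sqrt yet
    if (distSq > range * range) return false;      // too far away: reject cheaply
    double fx = facing * std::cos(phi);
    double fy = facing * std::sin(phi);
    // Only pay for the square root once the target has passed the range check.
    return dx * fx + dy * fy >= std::sqrt(distSq) * std::cos(halfAngle);
}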

Looking at your code snippet, it looks like you’re trying to do something along these lines, but there’s an error. The gray robot’s heading is the wrong thing to use for inAngle: the target’s heading is irrelevant to deciding whether or not it’s visible. In the original formula, $(P_4-P_3)/\|P_4-P_3\|$ corresponds to your greenHeadingVector instead. Observe, too, that at least for this computation, there’s no need to normalize vectorFromGreen2Grey, which saves you a division operation. Of course, if you need the normalized value for other things, it’s almost certainly more efficient to normalize once, as you do.
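Applied to your snippet, the correction might look like the sketch below. I am assuming, as your code seems to, that Coord’s operator* is a dot product and that r0 is the green robot; the variable names are otherwise yours:

double distanceFromGreen2Grey = greenPos.distance(greyPos);
bool inRange = distanceFromGreen2Grey <= 100;

// Use the GREEN robot's heading for the angle test, not the grey one's.
double greenHeading = r0->getHeading();
Coord greenFront = Coord(cos(greenHeading), sin(greenHeading));     // front camera facing
Coord greenBack  = Coord(-cos(greenHeading), -sin(greenHeading));   // back camera facing

Coord vectorFromGreen2Grey = (greyPos - greenPos) / distanceFromGreen2Grey;

// No acos needed: for unit vectors, angle <= 45 degrees  <=>  dot product >= cos(45 degrees).
bool inAngleFront = (vectorFromGreen2Grey * greenFront) >= cos(M_PI / 4);
bool greyInFrontFOV = inRange && inAngleFront;
// The blue robot / back camera check is identical with (bluePos - greenPos) and greenBack.

If you prefer to keep the acos form, the equivalent test is acos(vectorFromGreen2Grey * greenFront) <= M_PI / 4, which also answers your inline question: acos merely converts the dot product of two unit vectors into the angle between them, so you can compare either the angles or their cosines.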
