author  Wilrik de Loose <wilrik@wilrik.nl>  2008-06-11 07:25:50 (GMT)

committer  Wilrik de Loose <wilrik@wilrik.nl>  2008-06-11 07:25:50 (GMT)
commit  6059a52daa590419b3c601d127c0a87c47a5a638 (patch)  
tree  7934b9e8fbbad62b81eae86397f56393df052753  
parent  a4385a3137a9d4f7e36ecfc2e5d95eac41f2d47d (diff)  
download  2iv55-6059a52daa590419b3c601d127c0a87c47a5a638.zip 2iv55-6059a52daa590419b3c601d127c0a87c47a5a638.tar.gz 2iv55-6059a52daa590419b3c601d127c0a87c47a5a638.tar.bz2
Changes Wilrik
-rw-r--r--  report/chapter2.tex  8
-rw-r--r--  report/chapter3.tex  4
-rw-r--r--  report/chapter4.tex  4
-rw-r--r--  report/wiimote_ir.tex  40
4 files changed, 28 insertions, 28 deletions
diff --git a/report/chapter2.tex b/report/chapter2.tex
index cc1d11c..bcd9e55 100644
--- a/report/chapter2.tex
+++ b/report/chapter2.tex
@@ -27,10 +27,6 @@ The number of test cases to implement was the first small problem the group had
In order to support a Wiimote in a program like MatchBlox, a number of development libraries are available on the web. The library that was initially used for the application suffered a lot of lag (or latency). It took some time to determine the error and switch to another library.
\subsubsection{IR smoothing}

...

\subsubsection{Depleted batteries}
A second Wiimote is used to track the head using the infrared camera in front of the Wiimote. The user places two infrared LED's which the Wiimote can keep track of. A wireless Wii sensorbar\footnote[1]{The Wiimote senses IR light from the sensor bar. The light emitted from each end of the sensor bar is focused onto the image sensor, which sees the light as two bright dots separated by a distance $m_i$ on the image sensor. The second distance $m$, between the two clusters of light emitters in the sensor bar, is a fixed distance. From these two distances $m$ and $m_i$, the distance between the Wiimote and the sensor bar can be calculated.} was used for this. This has the great advantages that it's wireless (as opposed to the wired variant shipped with the Wii console) and that no home made infrared LED's need to be used. \\
@@ -45,8 +41,8 @@ There exist a technique that reduces this effect. What it basically does is subt
\subsection{Evaluation}
-The conclusions of the MatchBlox project are explained in chapter 4 or this report. This paragraph deals with the evaluation of the project and how it was executed. \\
+The conclusions of the MatchBlox project are explained in chapter 4 of this report. This paragraph deals with the evaluation of the project and how it was executed. \\
A lot of new techniques were applied during the implementation: head tracking, stereo vision, a Wiimote as input device, etc. To fit this all into one big project was a bit over the top and therefore unfeasible. The troubles with the infrared LED's and the Wiimote in general consumed a lot of time and energy of the project. \\
-Although the project finished the application thus far, it's not useable for an actual experiment. This doesn't mean all was in vein and no conclusion can be drawn from the results of the project. All of this will be explained in the last chapter.
\ No newline at end of file
+Although the project was able to complete the application, it's not usable for an actual experiment. This doesn't mean all was in vain and no conclusion can be drawn from the results of the project. The conclusions will be explained in the last chapter.
\ No newline at end of file
diff --git a/report/chapter3.tex b/report/chapter3.tex
index 5c2400a..c9d5a47 100644
--- a/report/chapter3.tex
+++ b/report/chapter3.tex
@@ -8,7 +8,7 @@ \subsection{Stereo vision}
-To be able to view a world in 3 dimensions with real depth information, one must offer each eye an image of its own, taken from a different angle. A problem arises however when there is only one screen available. How can both eyes get one image each from one screen? \\
+To be able to view a world in 3 dimensions with realistic depth, one must offer each eye an image of its own, taken from a different angle. A problem arises however when there is only one screen available. How can both eyes get one image each from one screen? \\
There are several solutions available. There are so called shutter glasses. These glasses first block one eye, then the other and then back to the first one again. This process has to be in synchronization with the screen. So frame \textit{A} is rendered, the left eye is open and the right eye is closed. Then frame \textit{B}, with the left eye closed and the right eye opened. The problem with this solution however is that you need a screen capable of refreshing at at least 100Hz, since each eye will only get a refresh rate of 50Hz. Anything below that will become very noticeable as flickering. Most laptop screens, if not all, are incapable of doing this. \\
@@ -18,3 +18,5 @@ There are several solutions available. There are so called shutter glasses. Thes
\end {center}
A different approach is using red-blue stereo glasses. These glasses allow the left eye to see everything that is not red and the right eye to view everything that is not blue. The application then draws the scene twice, once so that it is only viewable by the right eye in blue and simultaneously a second time, in red, for the left eye. Here both frames are rendered simultaneously, so there is no refresh rate issue.
The biggest disadvantage however is that the feel of true colors is lost. In both techniques cross talk can occur. For the case of stereo vision, this is explained in chapter 2.3. \\
+
+With the additional head tracking information, the red-blue images can be improved. If the distance between the user and the screen is getting smaller, the distance between the red and blue images will be larger. This effect is easily checked when looking at your hand and drawing it nearer to your eyes.
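For illustration, the red-blue composition described in this hunk can be sketched in a few lines. This is only a sketch: `compose_anaglyph` and `disparity_pixels` are hypothetical helper names, and scaling the eye separation inversely with head distance is an assumed model of the head-tracking improvement mentioned above, not the project's actual code.

```python
def compose_anaglyph(left, right):
    """Combine left/right grayscale images (rows of 0-255 ints) into an
    RGB anaglyph: the left view goes to the red channel, the right view
    to the blue channel, matching the red-blue glasses described above."""
    assert len(left) == len(right) and len(left[0]) == len(right[0])
    return [[(l, 0, r) for l, r in zip(lrow, rrow)]
            for lrow, rrow in zip(left, right)]

def disparity_pixels(base_disparity, ref_distance, head_distance):
    """Assumed inverse-proportional model: moving the head closer to the
    screen increases the red-blue separation, as the text suggests."""
    return base_disparity * ref_distance / head_distance
```

With this sketch, halving the viewing distance doubles the red-blue offset, which matches the hand-in-front-of-your-eyes observation in the text.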
\ No newline at end of file
diff --git a/report/chapter4.tex b/report/chapter4.tex
index 3890216..d070360 100644
--- a/report/chapter4.tex
+++ b/report/chapter4.tex
@@ -8,7 +8,7 @@ can drawn some conclusions of the project.
With the OpenGL library it's not difficult to make a good perspective
world. It can be done by mainly one function. The difficulty for the
head tracking was to cooperate well with the Nintendo Wii controller
-and his sensor bar, which is being discussed in the problem section.
+and its sensor bar, which is being discussed in the problem section.
A disadvantage of using the Wii controller is the small angle of the
camera. To have a good moving space in which the camera can track the
dots, you must be at a long distance from the camera, and so the
@@ -17,3 +17,5 @@ If you are to close to the camera, you easily move out of the range and your head cannot be tracked anymore.
\subsection{Stereo Vision}
+
+The applied red-blue stereo vision is one of the least realistic alternatives of stereo vision. Yet, it is the easiest to implement and still very effective. The created images provide the user the illusion of depth in a very usable way. The distance between objects can easily be seen from a still image. This is also a big disadvantage of head tracking; without moving, the image provides no additional "depth info".
\ No newline at end of file
diff --git a/report/wiimote_ir.tex b/report/wiimote_ir.tex
index 0b01dda..b17c955 100644
--- a/report/wiimote_ir.tex
+++ b/report/wiimote_ir.tex
@@ -3,7 +3,7 @@
As mentioned in the introduction, we used the Nintendo Wii controller for both head tracking and 3D (mouse) input. In both applications we made use of the infrared sensing ability of the controller. A Wiimote has a built-in infrared camera that is used to track up to four infrared sources at a time. An on board image processing chip processes the images acquired by the camera and outputs the coordinates of the infrared sources in camera coordinates, which are reported to the client (pc) via Bluetooth at a frequency of 100Hz. Several sources report that the infrared camera has a horizontal and vertical resolution of $ 1024 $ by $ 768 $ pixels with a viewing angle of $ 40^{\circ} $ and $ 30^{\circ} $ respectively. However, the author of the wiimote interface library that we used claimed that his measurements indicated that the reported coordinates never exceeded the range $ [0,1015]\times[0,759] $. Since there are no official specifications of the hardware inside the Wii controller we assumed the resolution of the camera to be $ 1016 \times 760 $.
-In order to use the wiimote as a 3D mouse, two infrared sources/beacons are required to be positioned a certain distance apart, above or below the display. The camera coordinates of these beacons are used to calculate world coordinates (in $ \mathbb{R}^3 $) of the mouse cursor. The beacon shipped with the Wii console is misleadingly called the 'sensor bar'. It is a plastic bar that houses two groups of five infrared LEDs positioned at the tips of the bar. The distance between the LED groups is approximately $ 20.5 $ cm. Other, after market sensor bars, that we used during testing and development of the software, contained two groups of three LEDs each.
+In order to use the wiimote as a 3D mouse, two infrared sources/beacons are required to be positioned a certain distance apart, above or below the display. The camera coordinates of these beacons are used to calculate world coordinates (in $ \mathbb{R}^3 $) of the mouse cursor. The beacon shipped with the Wii console is misleadingly called the 'sensor bar'. It is a plastic bar that houses two groups of five infrared LED's positioned at the tips of the bar. The distance between the LED groups is approximately $ 20.5 $ cm. Other, after market sensor bars, that we used during testing and development of the software, contained two groups of three LED's each.
To control the 3D mouse cursor, the user points the Wii remote at the display such that both LED groups are registered by the wiimote's camera. The xy position of the mouse cursor is set by pointing the wiimote at the desired location. The position of the cursor in the z direction can be controlled by moving the remote toward or away from the screen.
@@ -11,9 +11,9 @@ To control the 3D mouse cursor, the user points the Wii remote at the display su
\subsubsection{Mapping to 2D}
-First we will discuss the mapping of the wiimote infrared output to a 2D mouse cursor position, from there we extend the mapping to 3D. In the ideal situation the wiimote reports the 2D camera coordinates of the two LED groups of the sensor bar. From these coordinates $ g_l $ and $ g_r $, we calculate a cursor position $ c_{pix} $ in pixel coordinates. Because the screen resolution is likely to be different from the camera resolution, we actually calculate a cursor position $ c_{rel} \in [0,1]^2 $ relative to the camera resolution and use that result to get the actual pixel coordinates simply by multiplying $ c_{rel} $ with the screen resolution.
+First we will discuss the mapping of the wiimote infrared output to a 2D mouse cursor position, from there we extend the mapping to 3D.
+In the ideal situation the wiimote reports the 2D camera coordinates of the two LED groups of the sensor bar. From these coordinates $ g_l $ and $ g_r $, we calculate a cursor position $ c_{pix} $ in pixel coordinates. Because the screen resolution is likely to be different from the camera resolution, we actually calculate a cursor position $ c_{rel} \in [0,1]^2 $ relative to the camera resolution and use that result to get the actual pixel coordinates simply by multiplying $ c_{rel} $ with the screen resolution.
-Figure \ref{fig:2d_mapping} shows a diagram of a camera image of $ g_l $ and $ g_r $ and the corresponding screen with cursor position $ c_{pix} $. In our mapping we first calculate $ g_c $ as the average point in camera coordinates between $ g_l $ and $ g_r $. Then $ g_c $ is inverted and counter-rotated before it is mapped to $ c_{rel} $ and finally converted to $ c_{pix} $.
+Figure \ref{fig:2d_mapping} shows a diagram of a camera image of $ g_l $ and $ g_r $ and the corresponding screen with cursor position $ c_{pix} $. In our mapping we first calculate $ g_c $ as the average point in camera coordinates between $ g_l $ and $ g_r $. Then $ g_c $ is inverted and counter-rotated before it is mapped to $ c_{rel} $ and finally converted to $ c_{pix} $.
\begin{figure}[h!]
\begin{center}
@@ -33,10 +33,10 @@ The next step is to counterrotate $ g_c $ to compensate for the rotation of the
The rest of the mapping is quite trivial, let $ g'_c $ be the corrected camera coordinates, $ c_{rel} $ is computed by dividing $ g'_c $ by the camera resolution and $ c_{pix} $ is calculated by multiplying $ c_{rel} $ by the screen resolution.
-%When the x- and y-axis of the screen and camera are alligned, all horizontal and vertical movement of the wiimote with respect to the screen, will result in the expected cursor movements.
-However, rotating the wiimote will also rotate the when the wiimote is rotated at angle of $ 90^{\circ} $ over its z-axis,
-%The roll correction compensates for the rotation of the wiimote over its z-axis, which causes the coordinates of $ g_c $ to be rotated. If the user is holding the wiimote on its side (rotated $90^{\circ}$) and points to the right, the cursor will move down, instead of the expected direction. As a solution, the angle of the line $ \overline{g_l g_r} $ with the x-axis is calculated and is used to counter rotate $ g_c $ over the center of the camera. This rather simplistic solution works when the sensor bar is assumed to be aligned with the screen's x-axis. In order to determine whether the Wii remote is rotated over $ 180^{\circ} $ one can use the accelerometer data, but note that this will only give a reliable orientation estimate when the wiimote is not accelerating.
+%When the x- and y-axis of the screen and camera are alligned, all horizontal and vertical movement of the wiimote with respect to the screen, will result in the expected cursor movements. However, rotating the wiimote will also rotate the when the wiimote is rotated at angle of $ 90^{\circ} $ over its z-axis,
+%The roll correction compensates for the rotation of the wiimote over its z-axis, which causes the coordinates of $ g_c $ to be rotated. If the user is holding the wiimote on its side (rotated $90^{\circ}$) and points to the right, the cursor will move down, instead of the expected direction. As a solution, the angle of the line $ \overline{g_l g_r} $ with the x-axis is calculated and is used to counter rotate $ g_c $ over the center of the camera. This rather simplistic solution works when the sensor bar is assumed to be aligned with the screen's x-axis. In order to determine whether the Wii remote is rotated over $ 180^{\circ} $ one can use the accelerometer data, but note that this will only give a reliable orientation estimate when the wiimote is not accelerating.
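The 2D mapping steps described in these hunks (averaging the two LED groups, counter-rotating for roll, inverting, and scaling to screen coordinates) can be sketched as follows. This is a minimal sketch, not the project's implementation: the $1016 \times 760$ camera resolution is the value assumed in the text, and `map_2d` is an illustrative name.

```python
import math

CAM_W, CAM_H = 1016, 760  # camera resolution assumed in the text

def map_2d(g_l, g_r, screen_w, screen_h):
    """Map camera coordinates of the two LED groups to a screen cursor
    position: average to g_c, counter-rotate around the camera center
    by the roll angle of the line g_l-g_r, invert, then scale."""
    # g_c: average point between the two LED groups
    gcx = (g_l[0] + g_r[0]) / 2.0
    gcy = (g_l[1] + g_r[1]) / 2.0
    # roll: angle of the line g_l-g_r with the x-axis
    roll = math.atan2(g_r[1] - g_l[1], g_r[0] - g_l[0])
    # counter-rotate g_c over the center of the camera
    cx, cy = CAM_W / 2.0, CAM_H / 2.0
    dx, dy = gcx - cx, gcy - cy
    rx = cx + dx * math.cos(-roll) - dy * math.sin(-roll)
    ry = cy + dx * math.sin(-roll) + dy * math.cos(-roll)
    # invert: pointing right moves the image left on the camera sensor
    rx, ry = CAM_W - rx, CAM_H - ry
    # c_rel in [0,1]^2, then scale by the screen resolution
    c_rel = (rx / CAM_W, ry / CAM_H)
    return (c_rel[0] * screen_w, c_rel[1] * screen_h)
```

As in the text, this simple roll correction assumes the sensor bar itself is aligned with the screen's x-axis.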
-%As a third correction step one could compensate for the offset in height between the sensor bar and the actual center of the display. Note that we take $ g_c $ to be the average of the coordinates of the two LED groups, in order for the user to put the sursor in the center of the screen, the user has to aim the wiimote at the center of the sensor bar and not the center of the screen, the line of sight over the Wii remote and the position of the cursor on the screen do not coincide.
+%As a third correction step one could compensate for the offset in height between the sensor bar and the actual center of the display. Note that we take $ g_c $ to be the average of the coordinates of the two LED groups, in order for the user to put the sursor in the center of the screen, the user has to aim the wiimote at the center of the sensor bar and not the center of the screen, the line of sight over the Wii remote and the position of the cursor on the screen do not coincide.
%Let $ g'_c $ be the corrected coordinates of $ g_c $, the relative coordinates are computed by dividing the x component of $ g'_c $ by the width of the camera in pixels and dividing the y component of $ g'_c $ by the height of the camera resolution in pixels.
@@ -44,7 +44,7 @@ The rest of the mapping quite trivial, let $ g'_c $ be the corrected camera coor
For the 3D mapping we convert $ g_l $ and $ g_r $ to a coordinate $ c_{rel} \in [0,1]^3 $, relative to an axis aligned box $ \mathcal{B} $ in world coordinates. The box restricts the movement of the 3D cursor and is defined by two corner points $ p_{min} $ and $ p_{max} $ with the minimum and maximum coordinates respectively. From $ c_{rel} $ we compute the world coordinates $ c_{world} $ of the cursor by:
\[ c_{world} = p_{min} + c_{rel} \cdot (p_{max} - p_{min}) \]
-The relative x and y coordinates are computed in the same way as in the 2D case.
-The relative z coordinate is calculated from the measured distance $ d $ between the sensor bar and the wiimote as illustrated in Figure \ref{fig:z_mapping}.
+The relative x and y coordinates are computed in the same way as in the 2D case. The relative z coordinate is calculated from the measured distance $ d $ between the sensor bar and the wiimote as illustrated in Figure \ref{fig:z_mapping}.
\begin{figure}[!h]
\begin{center}
@@ -61,12 +61,12 @@ z_{rel} & = 1 && \text{iff} \; d \geq d_{max} \\
z_{rel} & = \frac{d - d_{min}}{d_{max} - d_{min}} && \text{otherwise}
\end{align*}
-The distances $ d_{min} $ and $ d_{max} $ depend on where the user is standing and have to be initialised according to the user's initial position. The distance between them determines the amount a user has to reach in order to touch the far end of $ \mathcal{B} $.
+The distances $ d_{min} $ and $ d_{max} $ depend on where the user is standing and have to be initialized according to the user's initial position. The distance between them determines the amount a user has to reach in order to touch the far end of $ \mathcal{B} $.
-The calculation of the relative z-coordinate requires us to calculate the distance between the Wii remote and the sensor bar. The distance is calculated by measuring the angle between the lines of sight of the left and right LED groups from the camera's point of view and calculating the length of a side of a right triangle. Let $ \Delta $ be the distance between the LED groups in the sensor bar. The distance $ d $ between the wiimote and the sensor bar can be calculated as shown in Figure \ref{fig:wiimote_dist_calc}. The angle $ \theta $ is computed as the distance between $ g_l $ and $ g_r $ multiplied by the angle per pixel:
+The calculation of the relative z-coordinate requires us to calculate the distance between the Wii remote and the sensor bar.
+The distance is calculated by measuring the angle between the lines of sight of the left and right LED groups from the camera's point of view and calculating the length of a side of a right triangle. Let $ \Delta $ be the distance between the LED groups in the sensor bar. The distance $ d $ between the wiimote and the sensor bar can be calculated as shown in Figure \ref{fig:wiimote_dist_calc}. The angle $ \theta $ is computed as the distance between $ g_l $ and $ g_r $ multiplied by the angle per pixel:
\[ \theta = | g_l - g_r | \cdot \theta_{pix} \]
Because the ratio in camera pixels and the ratio in viewing angles is equal we don't have to distinguish between vertical and horizontal pixel viewing angles and can define the angle per pixel as:
-\[ \theta_{pix} = \frac{\pi}{180}\cdot\frac{40}{1016} \]
+\[ \theta_{pix} = \frac{\pi}{180}\cdot\frac{40}{1016} \]
Assuming that the wiimote is positioned perpendicular to the sensor bar, the distance can be calculated by solving:
\[ d = \frac{\frac{1}{2}\Delta}{\mathsf{tan}(\frac{1}{2}\theta)} \]
@@ -78,19 +78,19 @@ Assuming that the wiimote is positioned perpendicular to the sensor bar, the dis
\label{fig:wiimote_dist_calc}
\end{figure}
-Because the wiimote never is perpendicular to the sensor bar, the distance will be an estimate. If the angle between the wiimote and the sensor bar deviates from $ 90^{\circ} $, being an angle $ \alpha $ in the range $ (0, 180) $, then the measured distance $ | g_l - g_r | $ is a factor $ \mathsf{sin}(\alpha) $ from the actual distance that would be perceived from a perpendicular viewing angle where $ \alpha = 90^{\circ} $. The computed distance $ d $ will therefore be larger then the actual distance. However, the angle $ \alpha $ can be measured when using customized sensor bar with three LED groups with equally space in between. The angle can then be computed from the ratio between $ | g_l - g_c | $ and $ | g_r - g_c | $, where $ g_c $ are de camera coordinates of the center LED group.
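The distance estimate and the clamped z-mapping in these hunks can be sketched numerically as follows. The per-pixel angle and the example value $\Delta = 0.0205$ follow the assumptions in the text; the function names are illustrative, not from the project's code.

```python
import math

# angle per camera pixel, as defined in the text
THETA_PIX = (math.pi / 180.0) * (40.0 / 1016.0)

def distance_to_bar(g_l, g_r, delta):
    """Estimate the wiimote-to-sensor-bar distance d:
    theta = |g_l - g_r| * theta_pix, d = (delta/2) / tan(theta/2)."""
    pixel_dist = math.hypot(g_r[0] - g_l[0], g_r[1] - g_l[1])
    theta = pixel_dist * THETA_PIX
    return (0.5 * delta) / math.tan(0.5 * theta)

def pixel_dist_at(d, delta):
    """Inverse helper: pixel distance between g_l and g_r at distance d."""
    return 2.0 * math.atan((0.5 * delta) / d) / THETA_PIX

def z_rel(d, d_min, d_max):
    """Relative z coordinate, clamped to [0, 1] between d_min and d_max."""
    if d <= d_min:
        return 0.0
    if d >= d_max:
        return 1.0
    return (d - d_min) / (d_max - d_min)
```

With the text's example values ($\Delta = 0.0205$, $d = 1$ m), `pixel_dist_at` reproduces the roughly 30-pixel separation derived later in the section.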
-%a third beacon positioned at equal distances in between the other two.
-%Our 3D mouse implementation does not require such precise distance calculation, because the
+Because the wiimote never is perpendicular to the sensor bar, the distance will be an estimate. If the angle between the wiimote and the sensor bar deviates from $ 90^{\circ} $, being an angle $ \alpha $ in the range $ (0, 180) $, then the measured distance $ | g_l - g_r | $ is a factor $ \mathsf{sin}(\alpha) $ from the actual distance that would be perceived from a perpendicular viewing angle where $ \alpha = 90^{\circ} $. The computed distance $ d $ will therefore be larger than the actual distance. However, the angle $ \alpha $ can be measured when using a customized sensor bar with three LED groups with equal space in between. The angle can then be computed from the ratio between $ | g_l - g_c | $ and $ | g_r - g_c | $, where $ g_c $ are the camera coordinates of the center LED group.
+%a third beacon positioned at equal distances in between the other two.
+%Our 3D mouse implementation does not require such precise distance calculation, because the
\subsubsection{Problems}
-During the implementation of the control scheme described here, we ran into some problems, or actually one problem with multiple causes: the cursor was impossible to keep steady when holding the wii remote in hand. The cursor seemed to oscillate around a point, while frequently jumping to another position. Only when the controller lay perfectly still on a surface, the cursor on screen seemed to be steady.
+During the implementation of the control scheme described here, we ran into some problems, or actually one problem with multiple causes: the cursor was impossible to keep steady when holding the wii remote in hand. The cursor seemed to oscillate around a point, while frequently jumping to another position. Only when the controller lay perfectly still on a surface, the cursor on screen seemed to be steady.
-One cause for the cursor instability was the method with which we selected the coordinates for $ g_l $ and $ g_r $ from the points returned by the wiimote. As mentioned the wiimote can track up to four infrared sources. If the wiimote is close enough to the sensor bar it will recognise the positions of the individual LEDs withing the LED groups. When this happens more than two coordinates are returned by the wiimote. In our original implementation we selected the two points with the greatest distance between them. As a solution, we implemented an algorithm that sorts the points returned by the wiimote into two groups based on their proximities, one group per LED group. The algorithm returns the average coordinates per group as the coordinates for $ g_l $ and $ g_r $. This solved the large jumps in the cursor position, but the did not help the stability of the cursor very much.
+One cause for the cursor instability was the method with which we selected the coordinates for $ g_l $ and $ g_r $ from the points returned by the wiimote. As mentioned the wiimote can track up to four infrared sources. If the wiimote is close enough to the sensor bar it will recognize the positions of the individual LED's within the LED groups. When this happens more than two coordinates are returned by the wiimote. In our original implementation we selected the two points with the greatest distance between them. As a solution, we implemented an algorithm that sorts the points returned by the wiimote into two groups based on their proximities, one group per LED group. The algorithm returns the average coordinates per group as the coordinates for $ g_l $ and $ g_r $. This solved the large jumps in the cursor position, but this did not help the stability of the cursor very much.
-Another cause, which is more apparent, is the fact that it is very hard for humans to keep their arm perfectly still when holding a wiimote.
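The grouping step described in this hunk might look as follows. This is a sketch under the assumption that the two mutually farthest points can serve as cluster seeds; it is not the project's actual algorithm, and `group_led_points` is an illustrative name.

```python
def group_led_points(points):
    """Sort the wiimote's reported IR points (up to four (x, y) tuples)
    into two clusters by proximity and return the average of each
    cluster as (g_l, g_r), ordered left to right."""
    def dist2(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

    # seeds: the pair of points with the greatest distance between them
    seed_a, seed_b = max(
        ((p, q) for i, p in enumerate(points) for q in points[i + 1:]),
        key=lambda pq: dist2(*pq))

    # assign every point to the nearest seed (one cluster per LED group)
    groups = {seed_a: [], seed_b: []}
    for p in points:
        seed = seed_a if dist2(p, seed_a) <= dist2(p, seed_b) else seed_b
        groups[seed].append(p)

    def average(group):
        n = len(group)
        return (sum(p[0] for p in group) / n, sum(p[1] for p in group) / n)

    g1, g2 = average(groups[seed_a]), average(groups[seed_b])
    return (g1, g2) if g1[0] <= g2[0] else (g2, g1)
```

Averaging per cluster removes the jumps that occur when individual LEDs within a group appear and disappear, matching the behaviour described in the text.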
-As a solution we implemented an exponential smoothing algorithm to calculate a weighted average over $ g_l $ and $ g_r $ before they were processed. The smoothing stabilised the cursor in the x- and y-direction while still being quite responsive. In the z-direction however the cursor remained unsteady, albeit to some lesser extend.
+Another cause, which is more apparent, is the fact that it is very hard for humans to keep their arm perfectly still when holding a wiimote. As a solution we implemented an exponential smoothing algorithm to calculate a weighted average over $ g_l $ and $ g_r $ before they were processed. The smoothing stabilized the cursor in the x- and y-direction while still being quite responsive. In the z-direction however the cursor remained unsteady, albeit to some lesser extent.
The reason for the stability problem in the z-direction is the limited resolution of the mapping in the z-direction as opposed to the x and y directions. For the latter two, the resolution corresponds to the resolution of the camera. For the z-direction the resolution is limited by the chosen values of $ d_{min} $ and $ d_{max} $ and the distance between the LED groups in the sensor bar. Suppose we define $ d_{min} $ as 1 meter and $ d_{max} $ as 2 meters and use a standard sensor bar such that $ \Delta $ in the distance calculation is 0.0205 meters (see Figure \ref{fig:wiimote_dist_calc}). The distance in pixels between $ g_l $ and $ g_r $ at distance $ d_{min} $ is:
\[ \frac{2\,\mathsf{arctan}(\frac{\frac{1}{2}\Delta}{d_{min}})}{\theta_{pix}} = 29.832867646 \approx 30 \text{ pixels} \]
-Compared to the distance in pixels at $ d_{max} $ which is approximately $ 15 $ pixels, gives a resolution in the z-direction of a mere $ 15 $ camera pixels. The resolution can be increased by using a wider sensor bar or by sitting closer to the sensor bar, reducing $ d_{min} $ and $ d_{max} $.
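The exponential smoothing mentioned in this hunk amounts to a weighted average of each new measurement with the previous smoothed value. The class below is a sketch; the name and the smoothing factor are illustrative assumptions, not the project's code.

```python
class ExponentialSmoother:
    """Exponentially weighted moving average for a 2D camera point.
    A smaller alpha smooths more aggressively at the cost of more lag,
    which mirrors the z-direction trade-off described in the text."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.state = None  # last smoothed point, None until first update

    def update(self, point):
        """Feed one (x, y) measurement and return the smoothed point."""
        if self.state is None:
            self.state = point
        else:
            a = self.alpha
            self.state = (a * point[0] + (1 - a) * self.state[0],
                          a * point[1] + (1 - a) * self.state[1])
        return self.state
```

In a setup like the one described, one smoother per LED group (for $g_l$ and $g_r$) would be applied before the 2D/3D mapping, with a much smaller alpha for the z-coordinate.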
-Choosing a bounding box $ \mathcal{B} $ with a depth as small as possible can also make vibrations in the z-direction less visible. In our implementation we severely smoothed the z-coordinate of the cursor position to finally stabilise the cursor. The downside of the smoothing is that we are left with a noticeable lag in movements in the z-direction.
+Compared to the distance in pixels at $ d_{max} $ which is approximately $ 15 $ pixels, this gives a resolution in the z-direction of a mere $ 15 $ camera pixels. The resolution can be increased by using a wider sensor bar or by sitting closer to the sensor bar, reducing $ d_{min} $ and $ d_{max} $. Choosing a bounding box $ \mathcal{B} $ with a depth as small as possible can also make vibrations in the z-direction less visible. In our implementation we severely smoothed the z-coordinate of the cursor position to finally stabilize the cursor. The downside of the smoothing is that we are left with a noticeable lag in movements in the z-direction.
@@ -120,13 +120,13 @@ Compared to the distance in pixels at $ d_{max} $ which is approximately $ 15 $
%We are going to compute the position of the cursor relative to the dimensions of .
-%to move the cursor further in the y-direction, the camera have to be inverted before the are mapped to the xy-plane. This is because the
+%to move the cursor further in the y-direction, the camera have to be inverted before the are mapped to the xy-plane. This is because the
-%In order to determine the xy-postion for a simple 2D mouse cursor, one would simply take the average of $ g_l $ and $ g_l $ to get the center of $ \overline{g_l g_r} $ and convert that to screen coordinates by multiplying by the 2D vector $ (\frac{S_w}{C_w}, \frac{S_h}{C_w}) $.
+%In order to determine the xy-postion for a simple 2D mouse cursor, one would simply take the average of $ g_l $ and $ g_l $ to get the center of $ \overline{g_l g_r} $ and convert that to screen coordinates by multiplying by the 2D vector $ (\frac{S_w}{C_w}, \frac{S_h}{C_w}) $.
-% the of the cursor we take the average of
-%beacons as two points in camera coordinates. Let these two dots be $ L_c $ and $ R_c $, the coordinates of the left and right beacons in camera coordinates. The camera coordinates origin $ (0,0) $ is located in the lower left corner.
-%One problem of the multiple led beacons in the sensorbars is that at close range the
+% the of the cursor we take the average of
+%beacons as two points in camera coordinates. Let these two dots be $ L_c $ and $ R_c $, the coordinates of the left and right beacons in camera coordinates. The camera coordinates origin $ (0,0) $ is located in the lower left corner.
+%One problem of the multiple led beacons in the sensorbars is that at close range the
% noise in ir dot coordinates