This question is known to cause confusion, and even once investigated, the sign definition is hard to remember. We therefore recommend using the slopes generator to investigate this whenever needed.


**As a general rule for WaveView**

- If you look at the screen, you see the wavefront from behind, as it hits the microlens array! (WaveKit respects the CCD convention, i.e. the image is mirrored.)
- The convention of WaveView is to depict
*phase delay*.

- In other words: a diverging (convex) wavefront is depicted in blue-ish (negative values) in the center.

The same sign convention also holds for the SDK. In even more words: the red zones of the wavefront lag behind the blue parts. Imagine you look at the screen from the direction of incidence, and the wavefront is a surface; its regions still closer to you are depicted with positive values (red).

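The rule above can be checked numerically. The following is a minimal sketch (plain NumPy, not the WaveKit API; the radius of curvature and grid size are arbitrary assumptions): a diverging spherical wavefront, with piston removed as displays typically do, comes out negative (blue) in the center and positive (red) at the lagging rim.

```python
import numpy as np

# Sketch of the phase-delay convention for a diverging (convex) wavefront.
# The center of the aperture arrives first, so its phase delay is smallest
# -> negative values (blue); the lagging rim is positive (red).
n = 64
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
r2 = x**2 + y**2

R = 0.5                   # radius of curvature, arbitrary units (assumed)
w = r2 / (2 * R)          # spherical sag, paraxial approximation
w -= w.mean()             # remove the piston term

center = w[n // 2, n // 2]
edge = w[n // 2, 0]
print(center < 0 < edge)  # True: center negative (blue), rim positive (red)
```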

**About the question of where is left, right, up and down**

Raw camera image in WaveKit: (0,0) is at the upper left.
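In array terms this is the usual row-major image layout. A short sketch (frame size and pixel values are arbitrary assumptions, not WaveKit specifics):

```python
import numpy as np

# With (0, 0) at the upper left, the row index increases downwards and the
# column index increases to the right, i.e. frame[row, col].
frame = np.zeros((480, 640), dtype=np.uint8)  # height x width (assumed size)
frame[0, 0] = 255    # upper-left pixel
frame[0, -1] = 128   # upper-right pixel
frame[-1, 0] = 64    # lower-left pixel

print(frame[0, 0], frame[0, 639], frame[479, 0])  # 255 128 64
```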

**What about astigmatism?**

Let's try it out with the slopes simulator.

*Positive Zernike astigmatism coefficient:*

*Negative Zernike astigmatism coefficient:*

**Here you can see that in both cases the focus closer to the screen is the sagittal focus.**

However, the estimated astigmatism angle changes by 90°!

Since, in general, astigmatism can occur at any angle, WaveView always reports the distance to the closer focal point as the sagittal length, together with an angle giving the deviation from a true sagittal cut (= longitudinal = vertical image axis).
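The 90° jump in the fitted angle follows directly from the form of the astigmatism term: since cos(2(θ + 90°)) = −cos(2θ), flipping the sign of the coefficient is the same as rotating the astigmatism axis by 90°. A quick numerical check (plain NumPy, not WaveKit; the coefficient value is an arbitrary assumption):

```python
import numpy as np

# Zernike astigmatism term (0°/90° variant): Z(r, theta) = a * r^2 * cos(2*theta).
theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
r = 1.0
a = 0.7  # arbitrary positive coefficient (assumed)

negative_coeff = -a * r**2 * np.cos(2 * theta)            # sign-flipped coefficient
rotated_90 = a * r**2 * np.cos(2 * (theta + np.pi / 2))   # axis rotated by 90 degrees

print(np.allclose(negative_coeff, rotated_90))  # True
```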

