The automated design of imaging systems with minimal or no human effort has long been a goal of scientists, researchers, and optical engineers. In addition, choosing an appropriate starting point for an optical system design is challenging. In this paper, we present a novel design framework based on a point-by-point design process that can automatically obtain high-performance freeform systems. This framework requires only a combination of planes as the input, based on the configuration requirements or the prior knowledge of the designers. This point-by-point design framework differs from the decades-long tradition of optimizing surface coefficients. In the traditional design method, the selection of the starting point and the optimization process are independent of each other and require an extensive amount of human effort; in our design framework, there is no clear boundary between these two processes, and the entire design process is mostly automated. This automated design process significantly reduces the amount of human effort required and does not rely on advanced design skills and experience. To demonstrate the feasibility of the proposed design framework, we successfully designed two high-performance systems as examples. This point-by-point design framework opens up new possibilities for automated optical design and can be applied in areas such as remote sensing, telescopy, microscopy, spectroscopy, virtual reality, and augmented reality.
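The abstract does not detail the point-by-point construction itself, but the general idea behind such methods can be illustrated with basic geometrical optics: for each sampled ray, the surface normal needed to redirect the ray toward its target is computed directly, rather than being found by optimizing surface coefficients. A rough 2D mirror sketch under that assumption (all function names and the example geometry are illustrative, not the paper's method):

```python
import numpy as np

def mirror_normal(d_in, d_out):
    """Unit surface normal that reflects ray direction d_in into d_out.
    By the law of reflection, the normal is parallel to d_out - d_in."""
    n = d_out - d_in
    return n / np.linalg.norm(n)

def reflect(d_in, n):
    """Reflect direction d_in about unit normal n."""
    return d_in - 2.0 * np.dot(d_in, n) * n

# Example: find the local mirror orientation that turns a horizontal
# ray into a vertical one, then verify it with the reflection formula.
d_in = np.array([1.0, 0.0])
d_out = np.array([0.0, 1.0])
n = mirror_normal(d_in, d_out)
assert np.allclose(reflect(d_in, n), d_out)
```

Repeating this normal computation over many feature rays, and integrating the normals into a surface, is the kind of direct construction that "point-by-point" design frameworks build on.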
Accommodation and convergence play critical roles in the natural viewing of three-dimensional (3D) scenes, and these cues must be accurately matched to avoid visual fatigue. However, conventional stereoscopic head-mounted displays cannot adjust the accommodation cue: they have only a single, fixed image plane, while the 3D virtual objects generated by a pair of stereoscopic images are displayed at different depths, either in front of or behind the focal plane. Therefore, to view objects clearly, the eyes are forced to converge on those objects while keeping accommodation fixed on the image plane. By employing freeform optical surfaces, we design a lightweight and wearable spatially multiplexed dual-focal-plane head-mounted display. This display can adjust the accommodation cue in accordance with the convergence cue and can also generate the retinal blur cue. The system has great potential for applications in both scientific research and the commercial market.
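The accommodation-vergence mismatch described above is conventionally quantified in diopters (reciprocal meters of viewing distance). A minimal sketch of that bookkeeping (the function names and example distances are illustrative assumptions, not values from the paper):

```python
def diopters(distance_m):
    """Convert a viewing distance in meters to diopters (1/m)."""
    return 1.0 / distance_m

def accommodation_vergence_conflict(image_plane_m, virtual_object_m):
    """Mismatch between the fixed accommodation cue (the image plane)
    and the vergence cue (the virtual object's depth), in diopters."""
    return abs(diopters(image_plane_m) - diopters(virtual_object_m))

# A fixed image plane at 2 m showing an object rendered at 0.5 m:
print(accommodation_vergence_conflict(2.0, 0.5))  # 1.5 (diopters)
```

A dual-focal-plane display reduces this conflict by letting content be assigned to whichever image plane is closer in diopters to the object's rendered depth.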
Accommodation and convergence play critical roles in the natural process of depth perception, and natural three-dimensional (3D) perception in stereoscopic displays has been extensively explored. However, no prototypes of these natural 3D displays are wearable in practice because of their size and weight. In addition, few studies have involved subjects with ametropia. We propose and develop an optical see-through head-mounted display (HMD) capable of diopter adjustment for both the virtual image and the real-world scene. The prototype demonstrates a diagonal field of view (FOV) of 42°, an exit pupil diameter of 9 mm, and a diopter adjustment range of -5.5 D to 0 D. Depth adjustment of the virtual image is demonstrated experimentally; the results show that the HMD can be further used to investigate the accommodation and convergence cues in depth perception in augmented reality (AR) environments, particularly for users with different degrees of ametropia.
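The stated adjustment range maps directly to apparent image distances, since a setting of D diopters corresponds to a distance of 1/|D| meters, with 0 D at optical infinity. A small sketch (the function name is an assumption for illustration):

```python
def image_distance_m(diopter_setting):
    """Apparent virtual-image distance, in meters, for a diopter setting.
    A setting of 0 D places the image at optical infinity."""
    if diopter_setting == 0.0:
        return float("inf")
    return 1.0 / abs(diopter_setting)

# The prototype's -5.5 D to 0 D range spans roughly 0.18 m to infinity:
print(image_distance_m(-5.5))  # near limit, about 0.18 m
print(image_distance_m(0.0))   # far limit, inf
```

This is why such a range can accommodate myopic users: the virtual image can be pulled in to match the wearer's far point rather than sitting fixed at infinity.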