We present a low-cost, stand-alone, AI-based wavefront sensor (AIWFS) trained only on synthetic data. A single defocused image of a source provides unambiguous phase retrieval, competing with traditional wavefront sensors such as the Shack-Hartmann (SH) sensor. An artificial neural network (ANN) is trained to output the Zernike coefficients, or any other relevant figures of merit, exclusively from synthetic data. The synthetic data typically contain random Zernike coefficients or wavefronts, noise, and a defocus error to avoid any stringent alignment-accuracy requirement. Once trained, the AIWFS can be used directly in many other applications without retraining. In its simplest form, the AIWFS's hardware is just a camera taking defocused images of a point source, such as a star. With the proper synthetic data, however, many types of sources and optical layouts can be accommodated, such as multi-point or extended sources, to determine both on-axis and field-dependent wavefront performance simultaneously from a single measurement. In applications using actual stars, the ANN also provides the Fried parameter as an estimate of atmospheric turbulence. The ANN outputs are computed directly; there is no numerical iteration and no convergence consideration. The system can run at video rates, in real time, and is therefore suitable for analyzing systems with vibrations or moving parts. The AIWFS requires only a single camera, making it a simple, cost-effective solution that can take advantage of a camera already present in the optical system. This paper shows results using the AIWFS on telescopes with both actual and artificial stars.
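
The synthetic training pipeline described above (random Zernike coefficients plus noise and a deliberate defocus offset, propagated to a defocused image that serves as the ANN input) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the mode set, coefficient statistics, defocus amplitude, and noise model are all assumed for the example.

```python
import numpy as np

def zernike_basis(npix=64):
    """Return a few low-order Zernike modes (tip, tilt, defocus, astigmatism)
    sampled on a unit-disc pupil grid. Normalization is illustrative only."""
    y, x = np.mgrid[-1:1:1j * npix, -1:1:1j * npix]
    r = np.hypot(x, y)
    theta = np.arctan2(y, x)
    pupil = (r <= 1.0).astype(float)
    modes = np.array([
        2 * r * np.cos(theta),                 # tip
        2 * r * np.sin(theta),                 # tilt
        np.sqrt(3) * (2 * r**2 - 1),           # defocus
        np.sqrt(6) * r**2 * np.cos(2 * theta), # astigmatism
    ])
    return modes * pupil, pupil

def synthetic_sample(rng, modes, pupil, defocus_rad=4.0, noise=0.01):
    """One (image, label) training pair: random Zernike coefficients plus a
    fixed defocus offset, propagated to a noisy focal-plane intensity."""
    coeffs = rng.normal(0.0, 0.3, size=len(modes))  # radians RMS (assumed)
    phase = np.tensordot(coeffs, modes, axes=1)
    phase += defocus_rad * modes[2]                 # deliberate defocus
    field = pupil * np.exp(1j * phase)
    # Fraunhofer propagation: focal-plane intensity is |FFT of pupil field|^2.
    psf = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
    psf /= psf.max()
    psf += rng.normal(0.0, noise, psf.shape)        # additive detector noise
    return psf.astype(np.float32), coeffs.astype(np.float32)

rng = np.random.default_rng(0)
modes, pupil = zernike_basis()
image, labels = synthetic_sample(rng, modes, pupil)
```

In this sketch, `image` is the defocused intensity the ANN sees and `labels` holds the Zernike coefficients it learns to regress; generating many such pairs with randomized coefficients, defocus errors, and noise levels is what lets the trained network generalize to real optical systems without retraining.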