Analysis of point cloud records, in any dimension, has been shown to benefit from examining the topological invariants of simplicial (or cell) complexes built with these points as vertices. This approach rests on rapid advances in computational algebraic topology that underpin the innovative paradigm of Topological Data Analysis (TDA). Simplicial complexes (SCs) of a given point cloud are constructed by connecting vertices to their near neighbours and gluing in the interior of a k-simplex (k ≥ 2) for each set of (k+1) pairwise connected vertices. This process is usually repeated over an increasing sequence of distance thresholds to generate a nested sequence of SCs. The Persistent Homology (PH) of such a nested sequence of SCs records the lifespans of the homology invariants (numbers of connected components, holes, voids, etc.) across the sequence of thresholds. Despite numerous success stories of TDA and its PH tool in computer vision and image classification, its deployment lags well behind the exponentially growing Deep Learning schemes based on Convolutional Neural Networks (CNNs). The excessive computational cost of extracting PH features beyond small-size images is widely reported as a major contributor to this shortcoming. Many methods have been proposed to mitigate this problem, but the gains for large images remain modest, because the bottleneck lies in the way images are represented by very large point clouds rather than in the cost of the PH computation itself. We propose an innovative approach that represents images by point clouds consisting of small sets of texture image landmarks, thereby creating a large number of efficiently extractable PH features for image analysis. We demonstrate the success of this approach on several image classification tasks as case studies.
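To make the filtration idea concrete, the following is a minimal sketch (not the paper's implementation, and ignoring higher-dimensional simplices) of 0-dimensional persistent homology for a Vietoris-Rips filtration: as the distance threshold grows, connected components merge, and each merge event records the death of one component. The function name `h0_persistence` and the Kruskal-style union-find approach are illustrative choices; production work would use a library such as GUDHI or Ripser.

```python
import numpy as np

def h0_persistence(points):
    """0-dimensional persistence (connected-component lifespans) of a
    Vietoris-Rips filtration, computed with a Kruskal-style union-find.
    Every component is born at threshold 0; a finite bar (0, d) is recorded
    when two components merge at threshold d, and one bar (0, inf) remains
    for the final component.  Illustrative sketch only."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    # All pairwise edges, sorted by Euclidean length (the filtration order).
    edges = sorted(
        (np.linalg.norm(pts[i] - pts[j]), i, j)
        for i in range(n) for j in range(i + 1, n)
    )
    parent = list(range(n))
    def find(x):
        # Union-find with path halving.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:  # two components merge at threshold d
            parent[ri] = rj
            deaths.append(float(d))
    return [(0.0, d) for d in deaths] + [(0.0, float("inf"))]

# Example: three points on a line merge at thresholds 1 and 2.
bars = h0_persistence([[0.0], [1.0], [3.0]])
# bars == [(0.0, 1.0), (0.0, 2.0), (0.0, inf)]
```

Higher-dimensional invariants (holes, voids) require tracking the boundary matrices of the k-simplices as well, which is where the computational cost reported in the abstract arises.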