A multi-grid model based on the ADI algorithm (called MADI) is presented for solving the diffusion equation in medical applications and instrumentation development. Multiple nested blocks of different grid sizes are used in the simulation to save both memory and time. MADI uses the results computed on a coarse grid to provide approximate dynamic boundary conditions, block by block, toward the area of interest, which carries the finest grid. The precision of MADI is compared with, and shown to match, that of a simulation on a globally uniform finest grid. The algorithm's error bound is analyzed using matrix theory. It is concluded that the thickness of the boundary area and the width of the finest-grid center block determine the precision, and the algorithm is shown to converge with only a fixed depth of boundary regions. Finally, the MADI simulation results are compared with both those of the traditional ADI method on uniform grids and experimental measurements. The results show significant savings (80%–95%) in memory requirements over uniform-grid computation with essentially no loss of precision. This demonstrates the feasibility of using MADI algorithms to achieve high-resolution (fine-grid, sub-millimeter) computation for realistic organ-size simulations.
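The building block shared by the uniform-grid ADI baseline and each MADI sub-block is a single alternating-direction implicit sweep of the diffusion equation. The abstract gives no code, so the following is only an illustrative sketch of one Peaceman–Rachford ADI time step for the 2-D diffusion equation on a single uniform block with zero Dirichlet boundaries; the function names, the grid setup, and the choice of boundary condition are assumptions for the example, not the paper's MADI implementation (which would instead take its block boundary values from the coarser grid).

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system: sub-diagonal a, diagonal b,
    super-diagonal c, right-hand side d (all 1-D arrays)."""
    n = len(d)
    cp = np.empty(n)
    dp = np.empty(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):           # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):  # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def adi_step(u, r):
    """One Peaceman-Rachford ADI step for u_t = alpha*(u_xx + u_yy)
    on a uniform grid with zero Dirichlet boundaries,
    where r = alpha*dt / (2*h**2)."""
    n, m = u.shape
    half = np.zeros_like(u)
    # Sweep 1: implicit in x, explicit in y, one tridiagonal solve per column.
    a = np.full(n - 2, -r); b = np.full(n - 2, 1 + 2 * r); c = np.full(n - 2, -r)
    for j in range(1, m - 1):
        rhs = u[1:-1, j] + r * (u[1:-1, j - 1] - 2 * u[1:-1, j] + u[1:-1, j + 1])
        half[1:-1, j] = thomas(a, b, c, rhs)
    out = np.zeros_like(u)
    # Sweep 2: implicit in y, explicit in x, one tridiagonal solve per row.
    a2 = np.full(m - 2, -r); b2 = np.full(m - 2, 1 + 2 * r); c2 = np.full(m - 2, -r)
    for i in range(1, n - 1):
        rhs = half[i, 1:-1] + r * (half[i - 1, 1:-1] - 2 * half[i, 1:-1] + half[i + 1, 1:-1])
        out[i, 1:-1] = thomas(a2, b2, c2, rhs)
    return out
```

A typical use would place a point source at the block center and advance it a few steps, e.g. `u = np.zeros((33, 33)); u[16, 16] = 1.0` followed by repeated calls to `adi_step(u, r=0.25)`. In a MADI-style scheme, the interior of such a fine block would be advanced this way while its boundary rows and columns are overwritten each step with values interpolated from the enclosing coarser block.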