Smart grids rely on digital controllers to guarantee the reliable and efficient operation of their components. However, the numerical simulation of such controllers is challenging due to the numerous discrete-time events they introduce. The interpolation-based method (IBM) can effectively simulate smart digital controllers alongside the differential-algebraic equations of the system under control. Unlike the traditional step-reduction method (SRM), which reduces the time step at each discrete event during the simulation, IBM allows variable time steps while preserving simulation accuracy. However, when the controllers are computationally demanding, the performance of IBM degrades significantly. This paper introduces a modified version of IBM that can handle computationally demanding controllers in the dynamic simulation of power systems. The performance of the proposed method is assessed and compared against SRM and IBM by simulating a test system with different controllers.
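To make the contrast concrete, the following minimal sketch illustrates the general idea of interpolation-based event handling versus step reduction. This is an illustrative toy (forward Euler, linear interpolation, invented function and variable names), not the paper's actual implementation: the solver takes full integration steps and recovers the state at each controller sampling instant by interpolating within the step, instead of shrinking the step to land exactly on every event.

```python
def simulate_ibm(f, x0, t_end, h, event_times):
    """Integrate dx/dt = f(t, x) with forward Euler at a fixed step h,
    recovering the state at each discrete event by linear interpolation
    (hypothetical sketch of the interpolation-based idea)."""
    t, x = 0.0, x0
    samples = []                 # (event_time, interpolated_state)
    events = sorted(event_times)
    i = 0
    while t < t_end:
        h_step = min(h, t_end - t)
        x_next = x + h_step * f(t, x)   # one full integration step
        t_next = t + h_step
        # Controller sampling instants falling inside (t, t_next] are
        # served by interpolation, so the step size is never reduced.
        while i < len(events) and t < events[i] <= t_next:
            alpha = (events[i] - t) / h_step
            x_ev = x + alpha * (x_next - x)   # linear interpolation
            samples.append((events[i], x_ev))
            i += 1
        t, x = t_next, x_next
    return samples

# Toy usage: dx/dt = -x, step h = 0.5, controller samples at t = 0.25, 0.75.
out = simulate_ibm(lambda t, x: -x, 1.0, 1.0, 0.5, [0.25, 0.75])
```

In an SRM-style loop, each event in `event_times` would instead force the integrator to cut `h_step` so a step boundary coincides with the event, which is what makes frequent discrete events expensive.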