Hi,
This is probably a very basic question, but I haven't been able to find an answer on my own. I have a very simple model (<10 subsystems with basic functions, nothing fancier than an integrator) where the simulation becomes really slow after I change the initial condition of a single integrator (from -65 to -30). The slowdown persists even if I remove that integrator completely. Additionally, if I let the simulation run, the scope omits the first 10 seconds of the simulation. I have tried various ODE solvers and adjusted the tolerance settings a bit, with no luck. Could anyone help me understand why this is happening, or point me toward a source of help?
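For context, here is roughly what I'm doing (a minimal sketch; 'myModel' and the 'Integrator' block path are placeholders for my actual model):

% change the integrator's initial condition (this is the change that slows things down)
set_param('myModel/Integrator', 'InitialCondition', '-30');

% the kind of solver/tolerance changes I have been experimenting with
set_param('myModel', 'Solver', 'ode45', 'RelTol', '1e-3', 'AbsTol', 'auto');

% run the simulation
out = sim('myModel');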
Best,