Simulation calculates wrong times
I built a simple process and set up the simulation with the correct times and resources. One whole instance of the process should take about 3 minutes, and I run the simulation for 50 instances. Since the 50 instances are processed simultaneously (as they flow through the process), the total time for these 50 instances should be about 30 to 45 minutes, which is the run duration I defined in the properties settings.
However, when the simulation runs, the results show a total time of about 20 hours, and the waiting times in the process are also very high, around 6 hours. The Min time, Max time, and Avg time all look excessive: they appear to represent the entire processing of all 50 instances rather than each individual instance, which should average only about 3 minutes.
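For context on why I suspect the totals are cumulative: here is a minimal sketch (my own illustration, not the tool's actual engine) of how per-instance time and the reported totals can diverge when instances queue for a single shared resource. The single-resource and simultaneous-arrival assumptions are mine, just to show the effect.

```python
# Illustration only: with one shared resource, each instance still takes
# 3 minutes of work, but cumulative totals balloon because every instance
# waits for the ones ahead of it in the queue.

SERVICE_MIN = 3        # minutes of actual work per instance
INSTANCES = 50         # all assumed to arrive at time 0

clock = 0.0
waits, cycles = [], []
for _ in range(INSTANCES):
    waits.append(clock)      # time spent queueing before service starts
    clock += SERVICE_MIN     # the resource handles one instance at a time
    cycles.append(clock)     # arrival-to-finish time for this instance

print(f"total elapsed:  {clock} min")                 # 150.0
print(f"avg wait:       {sum(waits)/INSTANCES} min")  # 73.5
print(f"avg cycle time: {sum(cycles)/INSTANCES} min") # 76.5
```

So even in this toy model the "average" time is 76.5 minutes, not 3, because it measures arrival-to-finish across the whole queue. My results look like the same kind of cumulative measurement, only larger.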
Is this an error in the calculation? Or is there a setting I need to change so that the Min, Max, Avg, and Total times are calculated per instance rather than across all instances?