Simulate v0.2.0, a Julia package for discrete event simulation

Simulate.jl provides three schemes for modeling and simulating discrete event systems (DES): 1) event scheduling, 2) interacting processes and 3) continuous sampling. It introduces a clock and allows scheduling of arbitrary Julia functions or expressions as events, processes or sampling operations on the clock’s timeline. It aims for simplicity and flexibility in building models and for performance in simulation.

Please take a look and tell me what you think. I would be happy if you find it useful.

Paul Bayer


Have you done performance comparisons with SimPy or other DES frameworks?

Is the process-oriented part of this based on Julia’s builtin coroutines (Task) and just using wait and Condition under the hood? If so this is very exciting! Building something like that has been on my personal list of things to try for a while now.

I’ve been using SimJulia, but its reliance on ResumableFunctions makes debugging nearly impossible, and performance seems to drop dramatically any time the simulation is more than trivial (my suspicion is that some interaction of @resumable and my code is causing type instability, but I haven’t had time to get to the root of the problem yet).


I haven’t taken performance measurements yet. My first experiments showed simulations in a Julia framework to be much faster than in Python/SimPy – no surprise there.

The process-oriented part of Simulate.jl is only a few lines of code using Julia Tasks reading from and writing to Channels – and of course yield – which was the simplest approach and worked very well. The standard library’s channel functions then use wait and Condition, as you mentioned.
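The underlying mechanism can be illustrated in plain Julia – a minimal sketch of tasks coordinating over Channels, not Simulate.jl’s actual code (`worker` and the channel names are made up for illustration):

```julia
# A "process" as a Task reading from one Channel and writing to another.
# Channel operations implicitly yield, so the scheduler can switch tasks.
function worker(input::Channel{Int}, output::Channel{Int})
    for x in input              # blocks (yields) until a value arrives
        put!(output, 2x)        # blocks if the output buffer is full
    end
    close(output)
end

input  = Channel{Int}(4)
output = Channel{Int}(4)
@async worker(input, output)    # schedule the task; it runs when we yield
foreach(x -> put!(input, x), 1:3)
close(input)
result = collect(output)        # → [2, 4, 6]
```

`collect` blocks on the output channel, which yields control to the worker task – the same cooperative hand-off a clock-driven process scheme builds on.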

Everything works well even with millions of events, as you can see in the dice game example. So far I don’t know of any scaling problems. The approach is well suited to experimenting with the newer possibilities of Julia 1.3 (@spawn …), which I will do soon.

Would you be interested in including some benchmarking results in the docs if I do some comparisons later?

Does Simulate.jl support multi-core parallel computation? Some benchmark comparisons with SimJulia.jl and EventSimulation.jl would be very useful.

@non-Jedi: It would be useful to have benchmarking results, and they should also be in the docs. Until now my focus was mainly on getting it running. I am sure that things can be sped up further. Benchmarks could be useful for improvement too.

@zhangliye: until now I have run entire simulations in parallel with the @threads macro, which works well. I want to experiment with events and processes on parallel threads in the current development version. I don’t see any major obstacle to making that work, but until now I have had only rather lightweight tasks to coordinate, so the speedup will perhaps not be as large as with parallel simulations. If you have heavier tasks to coordinate, running them in parallel should speed things up more. Are you interested in trying it out?
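Running whole replications in parallel can be sketched in plain Julia like this – `run_replication` is a hypothetical stand-in for a complete, independent simulation run, not a Simulate.jl function:

```julia
using Base.Threads, Random

# Hypothetical stand-in for one complete, independent simulation run:
# seed a private RNG and return a summary statistic.
run_replication(seed::Int) = sum(rand(MersenneTwister(seed), 100))

results = Vector{Float64}(undef, 8)
@threads for i in 1:8           # each replication may run on its own thread
    results[i] = run_replication(i)
end
# results now holds one summary statistic per replication
```

Because each replication owns its RNG and writes to a distinct slot of `results`, the iterations are independent and need no locking – which is why whole-simulation parallelism scales so easily compared to coordinating individual events across threads.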

It would be cool if Simulate.jl supported parallel simulation as discussed in “Does SimJulia support parallel computation?”. I will have a try. Thanks!

I’d like to suggest you rename the package DiscreteEvent.jl or DiscreteEventSimulate.jl.

There are a whole lot of other systems that can be simulated computationally: differential equations, partial differential equations, chemical reactions, planetary disc accretion, global weather and climate models, etc. Using “Simulate.jl” for just discrete event simulation carves out an inappropriately large portion of the package namespace.


Thanks, you are right. I particularly like DiscreteEvent.jl. I opened an issue.


I am rewriting a traffic simulation platform which currently uses SimJulia.jl and want to compare the efficiency. Simulate.jl has good documentation. However, for users of SimPy or SimJulia, it is difficult to find some familiar concepts such as Resource and resource consumption in the manual. If you could explain them and add an example from SimPy/SimJulia, it would be very helpful for SimPy/SimJulia users. Ross example

SimJulia and SimPy provide the same short tutorial: SimPy in 10 mins, SimJulia in 10 mins. As a SimPy user, I could get the key idea after reading it in several minutes. If Simulate.jl translated this short tutorial, it would make it easier for some users to try this package.


Simulate.jl has no resource concept, since this can easily be expressed as tokens in a native Julia Channel.

The SimPy example thus becomes in Simulate.jl:

using Simulate, Printf
struct Cable end    # a cable is the resource of the battery charging station bcs

function car(env::Clock, name::String, bcs::Channel, driving_time::Int, charge_duration::Int)
    delay!(env, driving_time)
    now!(env, SF(println, @sprintf("%s arriving at %d", name, tau(env))))
    cable = take!(bcs) # take a cable from the charging station
    now!(env, SF(println, @sprintf("%s starting to charge at %d", name, tau(env))))
    delay!(env, charge_duration)
    put!(bcs, cable)   # put it back
    now!(env, SF(println, @sprintf("%s leaving the bcs at %d", name, tau(env))))
end

env = Clock()
bcs = Channel{Cable}(2)    # define the battery charging station
for i in 1:2
    put!(bcs, Cable())     # with two cables (resources)
end
for i in 0:3
    process!(env, SP(i, car, env, "Car $i", bcs, i*2, 5), 1)
end
run!(env, 20)

This gives the same output as SimPy:

Car 0 arriving at 0
Car 0 starting to charge at 0
Car 1 arriving at 2
Car 1 starting to charge at 2
Car 2 arriving at 4
Car 0 leaving the bcs at 5
Car 2 starting to charge at 5
Car 3 arriving at 6
Car 1 leaving the bcs at 7
Car 3 starting to charge at 7
Car 2 leaving the bcs at 10
Car 3 leaving the bcs at 12

"run! finished with 20 clock events, 0 sample steps, simulation time: 20.0"

I agree that I should explain that better in Getting Started and will update the documentation soon. Thank you for pointing this out.


Thank you so much for your response! It is very helpful. I like Simulate.jl very much, and it should have a performance benefit. I have written a bike-share example, described as follows, based on your example.

In a bike-share system, there is one bike and two users. A user uses the bike as follows.
Step 1: The user starts the trip at time start_t from the bike station.
Step 2: If a bike is available, the user rents it, uses it for trip_t and returns it;
if no bike is available, the user waits there.

I wrote a struct Resource in resource.jl to describe the resource, similar to SimPy and SimJulia.jl, as follows.

using Simulate
import Base.isempty
import Base.length

abstract type ResType end
abstract type AbstractResource end
struct DiscreteSource <: ResType end

struct ResourceBase{T<:ResType} <: AbstractResource
    container::Channel{T}
    res_type::Type{T}
end

function ResourceBase{T}(n::Int=0) where T<:ResType
    res = Channel{T}(n)
    for i in 1:n
        put!(res, T())
    end
    return ResourceBase{T}(res, T)
end

request!(res::AbstractResource) = take!(res.container)
release!(res::AbstractResource, source::ResType) = put!(res.container, source)
release!(res::AbstractResource) = put!(res.container, res.res_type())
isempty(res::AbstractResource) = !isready(res.container)
length(res::AbstractResource) = length(res.container.data)
capacity(res::AbstractResource) = res.container.sz_max
capacity!(res::AbstractResource, n::Int) = (res.container.sz_max = n)
slots(res::AbstractResource) = res.container.sz_max - length(res)

const Resource = ResourceBase{DiscreteSource}

The simulation is as follows:


function ride_bike(sim::Simulator, name::String, bikes::Resource, start_time::Int, trip_duration::Int, user_id::Int)
    delay!(sim.env, start_time)
    println( @sprintf("%s starting at %d", name, tau(sim.env)) )      # this line will result in a logic error
    push!(sim.log, @sprintf("%s starting at %d", name, tau(sim.env)) )
    request!(bikes)    # rent a bike (blocks until one is available)
    println( @sprintf("%s rent a bikeat %d", name, tau(sim.env)))  # this line will result in a logic error
    push!(sim.log, @sprintf("%s rent a bike at %d", name, tau(sim.env)) )

    delay!(sim.env, trip_duration)
    release!(bikes)    # return the bike
    println( @sprintf("%s return the bike at %d", name, tau(sim.env)))
    push!(sim.log, @sprintf("%s return the bike at %d", name, tau(sim.env)) )
    @show user_id, driving_time, tau(sim.env)
end

function sim()
    bikes = Resource(1)
    sim = Simulator()

    id = 0
    start_t = [0, 2]
    rent_t = [5, 1]
    for i in 1:2
        id += 1
        t = start_t[i]
        d = rent_t[i]
        process!(sim.env, SP(id, ride_bike, sim, "Car $id", bikes, t, d, id), 1 )    # process starts immediately
    end

    @show "start simulate"

    run!(sim.env, 20)

    @show sim.traffic_inf
    for r in sim.log
        println(r)
    end
end


The result is not correct, as follows:

"start simulate" = "start simulate"
Car 1 starting at 0
Car 1 rent a bikeat 0
Car 2 starting at 2
Car 2 rent a bikeat 7
Car 1 return the bike at 7
(user_id, driving_time, tau(sim.env)) = (1, 0, 20.0)
sim.traffic_inf = Dict{Int64,Tuple{Int64,Int64}}()
Car 1 starting at 0
Car 1 rent a bike at 2
Car 2 starting at 7
Car 2 rent a bike at 20
Car 1 return the bike at 20

If I remove the println() in function ride_bike(), the result is correct.

function ride_bike(sim::Simulator, name::String, bikes::Resource, start_time::Int, trip_duration::Int, user_id::Int)
    delay!(sim.env, start_time)
    #println( @sprintf("%s starting at %d", name, tau(sim.env)) )
    push!(sim.log, @sprintf("%s starting at %d", name, tau(sim.env)) )
    request!(bikes)    # rent a bike (blocks until one is available)
    #println( @sprintf("%s rent a bikeat %d", name, tau(sim.env)))
    push!(sim.log, @sprintf("%s rent a bike at %d", name, tau(sim.env)) )

    delay!(sim.env, trip_duration)
    release!(bikes)    # return the bike
    #println( @sprintf("%s return the bike at %d", name, tau(sim.env)))
    push!(sim.log, @sprintf("%s return the bike at %d", name, tau(sim.env)) )
    @show user_id, driving_time, tau(sim.env)
end

The correct result is as follows:

"start simulate" = "start simulate"
(user_id, driving_time, tau(sim.env)) = (1, 0, 5.0)
(user_id, driving_time, tau(sim.env)) = (2, 2, 6.0)
sim.traffic_inf = Dict{Int64,Tuple{Int64,Int64}}()
Car 1 starting at 0
Car 1 rent a bike at 0
Car 2 starting at 2
Car 1 return the bike at 5
Car 2 rent a bike at 5
Car 2 return the bike at 6

Two questions:
(1) Why is the result different when I use println() in ride_bike()?
(2) How can I model “If there is no bike available, the user waits for t; after that the user leaves (cancels the request for a bike)”?
Thank you so much!


For your first question, please look at IO-operations in the documentation. Please enclose IO operations in a task in a now!() call like this:

now!(sim.env, SF(println, @sprintf("%s rent a bikeat %d", name, tau(sim.env))))

This transfers the print operation to the clock, so the clock cannot proceed asynchronously before printing is finished. Then it should work.

Your second question concerns customer reneging: if a task (a customer) waits for a channel via a take! call, it blocks and cannot renege if it takes too long. For reneging to work, your request! function instead has to wait! conditionally, like this:

function request!(res::AbstractResource, clk::Clock, timeout::Float64=Inf)
    # condition: resource available OR maximum waiting time exceeded
    wait!(SF((ch, clock, tmax) -> isready(ch) || tau(clock) >= tmax,
             res.container, clk, tau(clk) + timeout))
    isready(res.container) ? take!(res.container) : nothing
end

This checks the channel repeatedly via isready without blocking, until there is a token available or the maximum waiting time is exceeded. In the latter case it returns nothing.
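The same reneging pattern can be sketched in plain Julia against wall-clock time – `request_with_timeout` is a made-up name, and `sleep`/`time` stand in for Simulate.jl’s clock-based wait! and tau:

```julia
# Poll a token Channel with a deadline instead of a blocking take!.
function request_with_timeout(ch::Channel, timeout::Float64)
    deadline = time() + timeout
    while !isready(ch) && time() < deadline
        sleep(0.01)                       # yield to other tasks while waiting
    end
    isready(ch) ? take!(ch) : nothing     # nothing signals "reneged"
end

ch = Channel{Int}(1)
a = request_with_timeout(ch, 0.05)        # empty channel: times out, returns nothing
put!(ch, 42)
b = request_with_timeout(ch, 0.05)        # token present: returns 42
```

The caller then branches on `nothing` to model the customer leaving, exactly as with the wait!-based request! above.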

Since this is a standard problem, future versions of Simulate.jl should provide a standard pattern/function for it. I hope this helps for now.

PS (edit): I opened an issue on this.