100 Julia exercises with solutions

Here is a Julia version of the exercises, with solutions:

100 Julia exercises with solutions

which is based on the exercises given here: GitHub - rougier/numpy-100: 100 numpy exercises (with solutions)

Alternate and/or more efficient solutions are welcome. Feel free to suggest your solutions in the comments here or in the blog comments.

I have attempted the first 50 questions. The remaining questions will be addressed in due course.

11 Likes

For exercise 10, it seems like you check for not equal to one instead of zero?

10. Find indices of non-zero elements from [1,2,0,0,4,0]

findall(!isequal(1), a)
1 Like

findall(!isequal(0),a) would be the correct command. I will change accordingly.

!iszero does the same thing.
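To illustrate, a minimal sketch on the vector from the exercise; all three predicates pick out the same indices:

```julia
a = [1, 2, 0, 0, 4, 0]

findall(!isequal(0), a)   # indices of the non-zero elements: [1, 2, 5]
findall(!iszero, a)       # same result
findall(!=(0), a)         # same result, via partial application of !=
```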

7 Likes

8. Reverse a vector (first element becomes last)

a[end:-1:1]
reverse(a)

Either the question should be more like

8. Return the reverse of a vector (first element becomes last)

Or the answer should be

reverse!(a)
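A quick sketch of the difference, assuming a small test vector:

```julia
a = collect(1:5)

b = reverse(a)    # returns a new reversed vector; a is untouched
reverse!(a)       # reverses a in place and returns it; a is now [5, 4, 3, 2, 1]

b == a            # true: both are [5, 4, 3, 2, 1]
```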
3 Likes

14. Create a random vector of size 30 and find the mean value

using Statistics
a = rand(1,30)

This is a 1x30 Matrix. A Vector of size 30 would be:

a = rand(30)

So, the answer depends on how stringent you want to be with the term “vector”.
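A minimal sketch of the difference; the exercise uses random data, so only the shapes are deterministic:

```julia
using Statistics

m = rand(1, 30)   # 1×30 Matrix{Float64}
v = rand(30)      # 30-element Vector{Float64}

size(m)           # (1, 30)
mean(v)           # mean works the same way on either shape
```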

5 Likes

Changed.

Updated the solution

Included in the answer.

It would be easier to submit pull requests (if this were on github).

For problem 1, you can just do import LinearAlgebra as la these days.

For 7, you can simply do a = [10:49;]
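For reference, a sketch of what the trailing `;` does here:

```julia
a = [10:49;]            # the ; inside [] concatenates the range into a Vector
a == collect(10:49)     # true
length(a)               # 40
```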

For 10, it’s probably more natural to use !=(0) than !isequal(0).

For 11, you can do I(3).

For 15, it would be simpler to just assign the corners twice, e.g. a[:,1] .= 1 instead of a[2:end-1,1] .= 1. (The performance difference is negligible.)
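A minimal sketch of exercise 15 in that style, assuming a hypothetical 5×5 integer array:

```julia
a = zeros(Int, 5, 5)

# Assign full rows and columns; the four corners are simply written twice.
a[:, 1]   .= 1
a[:, end] .= 1
a[1, :]   .= 1
a[end, :] .= 1
```

For a 5×5 array this sets the 16 border cells to 1 and leaves the 3×3 interior at 0.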

For 16, I would do b = zeros(eltype(a), size(a) .+ 1)

For 20, you can do CartesianIndices((6,7,8))[100]

Exercise 22 seems ambiguous. Often in linear algebra, “normalize” would mean dividing by some norm, but that’s not what you’re doing here.

For exercise 24 I wouldn’t make it mutable, which would be a bad idea for efficiency of storing arrays of that type.

For 25, I would use x[a .<= x .<= b] .*= -1.

For 27, I would say “valid” rather than “legal”. (No one is going to arrest you if you throw an exception.)

For 28, the analogue of the numpy exercise would be [0] ./ [0] and [0] .÷ [0], not [0] / [0] and [0] // [0].
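The distinction in a sketch: broadcast float division yields NaN, while broadcast integer division by zero throws.

```julia
[0] ./ [0]             # [NaN]: float division defines 0/0 as NaN

err = try
    [0] .÷ [0]         # integer division by zero throws
    nothing
catch e
    e
end
err isa DivideError    # true
```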

For 35, you could use @. A = (A+B)*(-A/2) to compute it in-place and store the result in A.
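Sketched on a hypothetical 2×2 pair of matrices; the @. macro dots every call and the assignment, so the whole expression fuses into a single elementwise loop that overwrites A without allocating temporaries:

```julia
A = [1.0 2.0; 3.0 4.0]
B = [0.5 0.5; 0.5 0.5]

@. A = (A + B) * (-A / 2)        # fused, in-place elementwise update

A == [-0.75 -2.5; -5.25 -9.0]    # true
```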

For 36, I wouldn’t add the Int64. call at all. If you call trunc.(...), the results are already integer-valued. (Even if they are stored as floating-point numbers, they are still integers.)

I don’t know why you have all the trailing commas in a = rand(10,) etcetera; it’s not idiomatic.

For 44, I would use hypot to compute the radius, and cis(x) instead of exp(im*x).
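A sketch of both points on a hypothetical (3, 4) point:

```julia
x, y = 3.0, 4.0

r = hypot(x, y)        # 5.0; more robust than sqrt(x^2 + y^2) against overflow
θ = atan(y, x)         # two-argument atan picks the correct quadrant

cis(θ) ≈ exp(im * θ)   # true: cis(θ) is cos(θ) + im*sin(θ)
```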

9 Likes

Fantastic review! I have updated my blog, taking your feedback into account.

Nice work! Small typo: Exercise 33 should read Dates.Day(1)

1 Like

Corrected. Actually, the Dates prefix is not a must.

A few suggestions to add to those made above:

  • You frequently use a trailing comma in function calls, for example in rand(10,). This comma isn’t necessary. Usually you would see it written rand(10).
  • You frequently vectorize code in the solutions (presumably coming from python, where it is needed for performance). Vectorizing is not considered best practice in julia, because it often involves allocation of unnecessary intermediate arrays. In python that allocation is worth the speed gain, but since julia is already fast allocating unnecessarily only slows it down.
  • Some more notes on specific exercises:
  • extrema will find both numbers in one pass:
minm, maxm = extrema(a)
  • An array comprehension would work nicely here:
a = [Int(isodd(i+j)) for i in 1:8, j in 1:8]
  • We can make this more general (meaning for any number of dimensions) and also more “julian” by writing the following function:
function to_cartesian_ind(shape, linear_ind)
    map(shape) do l 
        linear_ind, cartesian_ind = fldmod1(linear_ind, l)
        cartesian_ind
    end
end

to_cartesian_ind((6,7,8), 100)  # returns (4,3,3)

Using fldmod1 is a nice trick here to get the index in the current dimension (with mod1, which has range (0, n]), and the remainder for the next iteration in a single line.

  • Again, don’t vectorize. It is helpful to use a helper function here:
to_polar(x, y) = hypot(x, y) * cis(atan(y, x))

[to_polar(x...) for x in eachrow(a)] # the "splat" operator `...` turns the 2-vector x into two separate numbers for `to_polar`.

# equivalent:
[to_polar(a[i, 1], a[i, 2]) for i in axes(a, 1)]  

Technically, this way (and yours) might not be in the spirit of the exercise, since the output doesn’t match the input (it’s a vector of complex numbers instead of a matrix). If we wanted a matrix with (r, θ) rows instead, we would have to go about things a bit differently (could be a useful exercise to try to do that as well!)

  • Using a comprehension here is nice! You could also take advantage of broadcasting, which takes into account the shape of the inputs (not just their length):
C = 1 ./ (X .- Y')
  • Technically, this doesn’t print all the values in the array. It prints as many as fit in the terminal (or other output). To print them all, you could do, for example, foreach(println, a).

  • argmin with a function as the first argument will return the value that minimizes the function. If we take abs(y-x) as that function, we will get the value closest to x. So you could do:

argmin(y -> abs(y - x), a)
5 Likes

argmin(y -> abs(y - x), a)
ERROR: MethodError: no method matching argmin(::var"#11#12", ::Vector{Float64})
Closest candidates are:
argmin(::Any) at array.jl:2346
Stacktrace:
[1] top-level scope
@ REPL[15]:1

True, this method was added in Julia 1.7.

This is so nice.
I thought it was a great opportunity to take the ride and try Literate.jl to do the work of publishing.
So I started a small project at Julia 100 Exercises.
Everything is in a single Julia file (.jl).

5 Likes

That’s nice. Happy learning Julia!

1 Like

I have finished all 100 questions in Julia 100 Exercises.
It was a great experience to work with Literate.jl.

3 Likes