Thank you everyone for supporting me all along. Now, I know what to do.
I’ve practiced algorithms until I got pretty good at them, with roughly 200 problems solved on HackerRank and other sites. Now it’s time for me to do the same with pull requests.
Let me be upfront here. My code quality might not be very clean, I might take up a lot of reviewers’ time, and my pull request success rate might be low, but I will accept the opportunity if given a chance.
I’m willing to implement small new functionality, optimize the code, add new specialized methods, or fix other issues you find.
200 pull requests. Not sure whether I’ll be able to keep going, but if I don’t get started, I’ll miss every shot I don’t take.
I’m tired of having a bunch of ideas that I can’t find a way to execute, and I’m pretty sure the Julia community is just as enthusiastic about fixing issues whenever they arise, even going to crazy lengths to fix them.
Time to develop my bias for action and coding.
Thank you again for supporting me all this time.
Please get me onboard. Thank you.
Do you have a specific area you’re interested in? Which of the “major” Julia packages make sense depends on that.
Optimizing things for performance would be my favorite; that should be doable for someone with an understanding of algorithms and some basic math (calculus level). Maybe autograd would be okay if it doesn’t involve heavy computation. Implementing machine learning pieces like neural network layers would be good too. In general, anything that someone with CS101 knowledge and perhaps some data science background can onboard onto quickly would work well.
I made my first Stockfish patch, and Stockfish itself is quite a complicated program, so I was encouraged that I could do it. Maybe what I need is a package that’s open to new contributors and a few developers who encourage me to dive in. I learn pretty quickly, so I should be fine with most things where the missing knowledge can be filled in quickly and doesn’t take years of learning the basics to even get started (like finite element analysis, though maybe someone working on FEM might prove me wrong on that as well).
Lux is trying to move a lot of layers to Reactant; there is a tracking issue for it in Lux.jl.
Also, a personal project that could help make Julia shine would be moving Transformers.jl to Lux.jl + Reactant.jl, but you will need a good enough CUDA GPU.
As for optimization, work is still focused on Optimization.jl and JuMP.jl.
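If you haven’t used Lux before, here’s a minimal sketch of defining and calling a layer (plain Lux.jl API as I understand it; I’ve left out the Reactant compilation step, since that part is exactly what’s still in flux):

```julia
# Minimal Lux.jl sketch: a tiny MLP, its parameters/state, and a forward pass.
using Lux, Random

rng = Random.default_rng()

# 4 inputs -> 16 hidden units (relu) -> 2 outputs
model = Chain(Dense(4 => 16, relu), Dense(16 => 2))

# Lux layers are stateless: parameters and state live outside the model
ps, st = Lux.setup(rng, model)

x = rand(rng, Float32, 4, 8)          # batch of 8 samples
y, st = Lux.apply(model, x, ps, st)   # forward pass returns output and updated state
```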
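For reference, the basic Optimization.jl workflow looks roughly like this (the standard Rosenbrock example; I’m assuming OptimizationOptimJL and ForwardDiff are available for the solver and the AD backend):

```julia
# Sketch of the Optimization.jl workflow on the Rosenbrock function.
using Optimization, OptimizationOptimJL, ForwardDiff

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2

u0 = zeros(2)        # initial guess
p  = [1.0, 100.0]    # problem parameters

# Wrap the objective with an AD choice so gradients are generated automatically
f    = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(f, u0, p)

sol = solve(prob, LBFGS())   # sol.u should be close to [1.0, 1.0]
```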
I’ll look at Lux! Thanks!