Amazing that Julia got the lion’s share of the slots for the NumFocus organization!
How many are you personally working with / mentoring / co-mentoring this year, Chris?
My name is Benjamin Chu from UCLA, 1st time GSoC student mentored by Kevin Keys. Congratulations to all!
I’m working on a package that does iterative hard thresholding (IHT.jl). The algorithm is fast enough (it avoids Hessian matrices!) that we can run a standard biology dataset of size ~10,000 x 1,000,000 on a personal computer today and finish within an hour or so. The idea has been around for maybe 10 to 20 years(?), but I guess it’s new enough that it still doesn’t have a Wikipedia page
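For the curious, the core update is simple enough to sketch in a few lines of Julia. This toy version (dense matrices, fixed step size, made-up names, not what IHT.jl actually does) just repeats b ← H_k(b + μ·Xᵀ(y − Xb)), where H_k keeps the k largest-magnitude coefficients:

```julia
using LinearAlgebra, Random

# Keep only the k largest-magnitude entries of b, zeroing the rest.
function hard_threshold(b, k)
    idx = partialsortperm(abs.(b), 1:k, rev = true)
    out = zero(b)
    out[idx] = b[idx]
    return out
end

# Toy IHT for the sparse least-squares model y ≈ X*b.
function iht(X, y, k; iters = 200, μ = 1 / opnorm(X)^2)
    b = zeros(size(X, 2))
    for _ in 1:iters
        b = hard_threshold(b .+ μ .* (X' * (y .- X * b)), k)
    end
    return b
end

Random.seed!(1)
X = randn(100, 20)
btrue = zeros(20); btrue[[3, 7, 11]] = [2.0, -1.5, 1.0]
y = X * btrue
bhat = iht(X, y, 3)
```

On easy noiseless data like this it recovers the true support; the real package adds all the GWAS-specific machinery on top.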
I’m planning to add 3 features to the current IHT package for analyzing GWAS (i.e. genetics) data. Some of these are more or less a grind, such as learning to manipulate binary data files, but others, such as grouping predictors, are quite non-trivial for me in terms of how to actually do them. But I live by the principle that if I try my best, life will figure out a way (most of the time), so I’ll probably be okay
I work on genetics data, and I’m good at math!
I am Tejan Karmali, a third-year undergrad at National Institute of Technology Goa, India. I’ll be working on adding new deep learning models in the model zoo of Flux.jl. I’ll be mentored by Mike Innes, Simon Danisch, Chris Rackauckas and Stefan Karpinski.
As part of the project, I’ll be implementing some cool deep learning papers from recent years, like AlphaGo, Decoupled Neural Interfaces, and Spatial Transformer Networks. I aim to demonstrate the ease with which a variety of models can be implemented in Flux. In the process, I’ll also be contributing to NNlib.jl by adding some modules that my project will require.
Really excited to be part of Julia community! Looking forward to an amazing summer ahead!
I am Soham Tamba, a fourth-year undergrad at National Institute of Technology Goa, India. My main goal is to add parallel versions of the functions available in LightGraphs.jl. I will also be adding some new functionality.
Currently, I am trying to familiarize myself with all of Julia’s parallel computing functionality, as well as learn the principles to follow while writing such code. The main challenge I anticipate is producing efficient code, given that some of the usual parallel computing functionality for Julia has not been developed yet.
Mentors: Divanyansh Srivastav and Chris Rackauckas
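To give a flavor of what a parallel graph function can look like, here is a toy per-vertex computation parallelized with Base.Threads (my own sketch using a plain vector of neighbor lists, not LightGraphs.jl code):

```julia
using Base.Threads

# Sum the degrees of each vertex's neighbors, one vertex per loop
# iteration, split across threads. Each iteration writes to its own
# slot of `out`, so no locking is needed.
function sum_neighbor_degrees(adj)
    deg = length.(adj)
    out = zeros(Int, length(adj))
    @threads for v in 1:length(adj)
        s = 0
        for u in adj[v]
            s += deg[u]
        end
        out[v] = s
    end
    return out
end

adj = [[2, 3], [1, 3], [1, 2]]  # a triangle
sum_neighbor_degrees(adj)
```

The pattern only pays off when the per-vertex work is heavy enough to amortize the threading overhead, which is exactly the kind of trade-off the project will have to benchmark.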
I am Avik Pal, a first-year undergrad at Indian Institute of Technology, Kanpur, India. This summer I shall be working on the model zoo of Flux. I will be mentored by Mike Innes, Christopher Rackauckas and Viral B. Shah.
In my project, I shall be implementing deep learning models for computer vision tasks. The main tasks that I look forward to accomplishing include object classification and object localization. I will also be adding some advanced layers to Flux which will make creating these models easier. Finally, I shall also be writing detailed tutorials describing how to go about creating new architectures using Flux.
Looking forward to a really productive and awesome summer with the Julia community.
I think I should clarify (as I did here). I am a mentor on each Julialang project for administrative purposes, but am personally mentoring only the 6 DiffEq-related projects.
Who are your mentors?
Congrats, and welcome to the Julia Community!
Hello, everyone! So excited to be here!
My name is Matt, and I’m a PhD student in linguistics, specializing in experimental phonetics, at the University of Alberta. My project is centered on implementing a deep neural network for speech recognition in Flux. Concretely, it’s the model described in Zhang et al.'s (2017) “Towards end-to-end speech recognition with deep convolutional neural networks.” I’ll be mentored by Mike Innes, and Chris Rackauckas is listed as well (though he’s mentioned above what his involvement in the projects is).
In my experience, papers on automatic speech recognition with deep learning have been sparse on implementation details, and often have no associated implementation code, which is bad for science. So, this project is a step toward remedying the small number of working examples for speech recognition by implementing a state-of-the-art neural net. Additionally, I plan to produce documentation geared toward getting newcomers to speech recognition with deep learning up to speed, because I’ve found that it’s difficult to claw your way into this area, and if we can pull them into using Julia-based tools, that would be good as well.
Hello everyone, glad to be part of GSoC this year. My name is Mohamed, I am a first year PhD student in topology optimization at UNSW Canberra, Australia.
In my project, I will implement the locally optimal block preconditioned conjugate gradient (LOBPCG) algorithm to find minimum and maximum generalized eigenvalues of large sparse symmetric systems, and their corresponding eigenvectors. I will also play with and develop a few algebraic multigrid preconditioners as well as other common preconditioners and put/interface them together in a package. The LOBPCG algorithm itself will be interfaced in IterativeSolvers.jl.
Testing and benchmarking of the packages will be done using dummy matrices at first, then using finite element linear buckling analysis problems built with the help of JuAFEM.jl. The competition here is mainly Base.eigs and JacobiDavidson.jl, and so far I am not disappointed: https://github.com/mohamed82008/LOBPCG.jl.
Documentation and porting to v1.0 are also part of the project, and shared-memory parallelism is an extra nugget. Wish me luck!
Mentors: Harmen Stoppels, Alan Edelman, and Chris Rackauckas.
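For anyone curious about the flavor of the algorithm: at each step, LOBPCG does a Rayleigh-Ritz projection onto the span of the current iterate, its residual, and the previous search direction. A stripped-down single-vector version for a standard symmetric problem (my own sketch: no block, no preconditioner, no generalized B-matrix, unlike the real thing) might look like:

```julia
using LinearAlgebra, Random

# Single-vector LOBPCG sketch for the smallest eigenpair of a
# symmetric matrix A: Rayleigh-Ritz on span{x, r, p} every step.
function lobpcg_smallest(A; iters = 200, tol = 1e-8)
    n = size(A, 1)
    x = normalize(randn(n))
    p = zeros(n)
    λ = dot(x, A * x)
    for _ in 1:iters
        r = A * x - λ * x
        norm(r) < tol && break
        S = Matrix(qr(hcat(x, r, p)).Q)   # orthonormal basis of the subspace
        F = eigen(Symmetric(S' * A * S))  # tiny 3x3 Rayleigh-Ritz problem
        λ = F.values[1]
        xnew = S * F.vectors[:, 1]
        p = xnew .- x .* dot(x, xnew)     # keep the new search direction
        x = normalize(xnew)
    end
    return λ, x
end

Random.seed!(2)
A = Matrix(Diagonal(1.0:10.0))  # smallest eigenvalue is exactly 1
λmin, v = lobpcg_smallest(A)
```

The real implementation adds blocking, B-orthogonalization, and preconditioning, which is where all the interesting numerical trouble lives.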
Let me know if you want to chat or get more info about audio DSP in Julia (maybe in a separate thread or on slack). I’d definitely love to see the various JuliaAudio packages working well with the machine learning ecosystem, so let me know if you hit rough edges.
I’m Sean Innes, a third-year CS student at the University of Bristol, England. Happy to be working with Julia for my first GSoC project.
I’m being mentored by Tom Short and my project is going to help get Julia compiling down to WASM. Hopefully I can get some simple mathematical / matrix libraries compiling by the end of the summer.
Looking forward to getting started!
Hello everyone, glad to be a part of this community!
I’m Ayush Shridhar, a second-year undergraduate at the International Institute of Information Technology, Bhubaneswar, India. I’ll be working on model import, model export, and computer vision models in Julia.
In this project, I aim to build a reader for models saved in other formats, such as ONNX. This would help us load pretrained models into Flux. I’ll also be working on a few other computer vision packages, such as Metalhead.jl. I also aim to load Detectron models, to bring object detection to Julia. I’ve begun work on the ONNX.jl package, which can be found under the FluxML organization.
Mentors: Mike Innes, Phil Tomson, Chris Rackauckas
Looking forward to an amazing summer!
Hello everyone, feels great to become a part of one of the most amazing open source communities out there!
I am Vaibhav Kumar Dixit, a second-year undergraduate student of Mathematics and Computing at Indian Institute of Technology (B.H.U.), Varanasi. My project for GSoC involves parameter estimation of dynamical models, an area I have come to enjoy quite a lot. I will mainly be working on implementing Bayesian techniques for the problem, namely SAEM and MAP, in DiffEqBayes.jl. I am being mentored by Vijay Ivaturi and the amazing Christopher Rackauckas.
Looking forward to a tremendously enriching and productive summer.
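As a toy illustration of the MAP side of the project (every number, the prior, and the grid search here are made up for illustration; DiffEqBayes.jl wraps real samplers and real DiffEq solvers): for the one-parameter model u' = -p·u with Gaussian noise and a Gaussian prior on p, the MAP estimate just minimizes a penalized least-squares objective.

```julia
using Random

# Synthetic noisy observations of u(t) = exp(-p*t) with p = 1.5.
Random.seed!(42)
ptrue = 1.5
ts = 0.1:0.1:2.0
data = exp.(-ptrue .* ts) .+ 0.01 .* randn(length(ts))

# Negative log-posterior: Gaussian likelihood (σ = 0.01) plus a
# N(1, 1) prior on p; the MAP point minimizes this.
neglogpost(p) = sum(abs2, exp.(-p .* ts) .- data) / (2 * 0.01^2) + (p - 1)^2 / 2

# Crude grid search for the MAP point, just for illustration.
grid = 0.0:0.001:3.0
p_hat = grid[argmin([neglogpost(p) for p in grid])]
```

With tight observation noise the likelihood dominates the prior, so the MAP point lands near the true decay rate; the interesting cases are exactly the ones where it doesn’t.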
Hi everyone and congratulations to all who put in efforts and got selected!
My name is Jiawei Li; right now I’m an Economics student at the University of Birmingham for my exchange (third) year. My home university is the University of Nottingham Ningbo campus in China. I’ve been passionate about data science and applying computational techniques to social science research. Getting involved in GSoC gives me an invaluable chance to learn a lot.
This summer I’m working with my mentor Patrick Mogensen (@pkofod) on JuliaNLSolvers. JuliaNLSolvers mainly has three packages:

- Optim.jl for solving optimization problems,
- NLsolve.jl for solving systems of nonlinear equations, and
- LsqFit.jl for curve fitting or parameter estimation.

Currently, LsqFit.jl is at a very early stage and a lot of features could be added, such as a trust region method and non-vector functionality. I’ll also be working on documentation and benchmarks for NLsolve.jl. Any suggestions are welcome!
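To illustrate the kind of machinery a trust region method builds on, here is a toy Levenberg-Marquardt loop in plain Julia, where the damping λ plays the role of the trust region radius. This is not LsqFit.jl’s API or code; the function names and the test problem are made up:

```julia
using LinearAlgebra

# Forward-difference Jacobian of a residual function r at p.
function jacobian(r, p; ε = 1e-7)
    res = r(p)
    J = zeros(length(res), length(p))
    for j in 1:length(p)
        dp = copy(p); dp[j] += ε
        J[:, j] = (r(dp) .- res) ./ ε
    end
    return J
end

# Toy Levenberg-Marquardt: damped normal equations with a
# multiplicatively adapted damping λ, the crude cousin of a trust region.
function lm_fit(r, p0; iters = 100, λ = 1e-3)
    p = copy(p0)
    for _ in 1:iters
        res = r(p)
        J = jacobian(r, p)
        δ = (J' * J + λ * I) \ (-(J' * res))
        if sum(abs2, r(p .+ δ)) < sum(abs2, res)
            p .+= δ; λ /= 2    # good step: loosen the damping
        else
            λ *= 4             # bad step: tighten it
        end
    end
    return p
end

# Fit y = p[1] * exp(p[2] * x) to noiseless synthetic data.
xdata = collect(0.0:0.1:1.0)
ydata = 2.0 .* exp.(0.8 .* xdata)
resid(p) = p[1] .* exp.(p[2] .* xdata) .- ydata
p_fit = lm_fit(resid, [1.0, 0.0])
```

A proper trust region method replaces the ad-hoc λ update with an explicit step-radius and a model-agreement ratio, which is exactly the kind of upgrade the project is about.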
I’m so excited to be part of the Julia community! I absolutely love Julia and functional programming. I hope everyone here has great fun coding this summer.
Hello everyone! My name is Xingjian Guo, and my GSoC project is a Native Julia Implementation of Exponential Runge-Kutta Integrators, with @ChrisRackauckas as my mentor.
I’m currently pursuing an MSc in Scientific Computing at NYU. I’m passionate about numerical computing in general, and differential equations in particular. The aim of my project is simple: implement robust, state-of-the-art exponential RK solvers in native Julia under the JuliaDiffEq framework. To achieve this, I need to delve into the deep waters of Krylov subspace methods and write efficient routines to handle the matrix phi-functions. Another important aspect of the project is to structure the algorithms in a way that is compatible with different problem formulations (semi-linear vs. general, time-independent vs. time-dependent).
This is an ambitious project, so goodbye to travel and video game time. On the other hand, it’s really cool to contribute to the Julia community! I wish the best for your projects too!
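For a taste of why the phi-functions need care, here is a scalar sketch of φ₁, my own toy rather than project code (the solvers need matrix versions, computed via Krylov methods):

```julia
# φ₁(z) = (eᶻ - 1)/z, with φ₁(0) = 1, is the simplest phi-function
# that exponential integrators need. Naively computing exp(z) - 1
# cancels badly near z = 0, so small arguments get a Taylor series;
# scalar expm1 sidesteps the cancellation, but has no matrix analogue.
function phi1(z)
    if abs(z) < 1e-3
        # Truncated Taylor series of Σ zⁿ/(n+1)!; error is O(z⁵).
        return 1 + z/2 + z^2/6 + z^3/24 + z^4/120
    else
        return expm1(z) / z
    end
end
```

The matrix case trades this branch for Padé approximations with scaling-and-squaring, or Krylov projections when the operator is large and sparse, which is where most of my summer will go.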
My name is Arijit Kar. I am a second year Computer Science undergrad at Indian Institute of Technology Kharagpur, India. As part of GSoC, I will be working on implementing Optical Flow and Object Tracking algorithms in pure Julia. Zygmunt Szpak and Chris Rackauckas will be mentoring my project.
I will be adding two optical flow algorithms, namely Lucas-Kanade and Farneback, the two most widely used; they suit different scenarios depending on whether speed or accuracy is the major requirement. For the object tracking module, I will be adding four tracking algorithms that have different strengths in terms of tracking accuracy, ability to handle occlusion, speed, etc.
Very excited to be working with the Julia community, and I look forward to a wonderful summer!
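As a taste of the math, here is a minimal single-window Lucas-Kanade estimate in plain Julia (my own sketch on a synthetic ramp image, nothing like the eventual package code):

```julia
using LinearAlgebra

# Single-window Lucas-Kanade: under brightness constancy and small
# motion, solve the 2x2 normal equations
#   [ΣIx²  ΣIxIy; ΣIxIy  ΣIy²] * [u, v] = -[ΣIxIt, ΣIyIt]
# for one translation (u, v) over the whole window.
function lucas_kanade(I1, I2)
    Ix = (I1[2:end-1, 3:end] .- I1[2:end-1, 1:end-2]) ./ 2
    Iy = (I1[3:end, 2:end-1] .- I1[1:end-2, 2:end-1]) ./ 2
    It = I2[2:end-1, 2:end-1] .- I1[2:end-1, 2:end-1]
    A = [sum(Ix .^ 2)   sum(Ix .* Iy);
         sum(Ix .* Iy)  sum(Iy .^ 2)]
    b = -[sum(Ix .* It), sum(Iy .* It)]
    return A \ b
end

# A quadratic intensity surface shifted one pixel to the right.
I1 = [Float64(x^2 + y^2) for y in 1:8, x in 1:8]
I2 = [Float64((x - 1)^2 + y^2) for y in 1:8, x in 1:8]
flow = lucas_kanade(I1, I2)  # roughly (1, 0)
```

The real algorithm runs this per feature point over small windows and inside an image pyramid to handle larger motions, which is where the implementation work is.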
Hi Vaibhav, great to hear. Feel free to come forward for Bayesian Inference on (Stochastic) Differential Equation topics. I have an introduction into the topic here: http://mschauer.eu/journal/2018/01/19/parameter-inference-for-a-simple-sir-model/
My name is Gaetano Priori, I’m a third-year student in Mathematics at the Polytechnic University of Turin and this is my first GSoC.
I’m being mentored by Jameson Nash and Valentin Churavy on improving thread safety in the Julia compiler. The two main goals for the end of the summer are implementing a per-thread RNG and providing a race-free method cache by testing RCU and memory barriers on TypeMap. During the summer I will also develop new features to detect bugs in multithreaded applications.
Personally I see this as a great opportunity to get involved in the Julia community, and I am very excited to start!
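To illustrate the user-visible idea behind a per-thread RNG (the actual GSoC work happens inside the runtime, so this user-level pattern is only an analogy):

```julia
using Random, Base.Threads

# Each thread draws from its own seeded generator, so concurrent
# random-number draws never touch shared mutable state; a per-thread
# default RNG makes plain rand() behave like this automatically.
rngs = [MersenneTwister(1000 + i) for i in 1:nthreads()]
results = zeros(nthreads())
@threads for i in 1:nthreads()
    rng = rngs[threadid()]   # this thread's private generator
    results[i] = sum(rand(rng) for _ in 1:1000)
end
```

Sharing one generator across threads instead would both race on its internal state and make results irreproducible, which is exactly what the runtime-level fix prevents.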
Thanks, Spencer! I just might have to take you up on that offer in the near future here; I’m not super familiar with all that Julia has to offer for DSP, but I’m eager to learn!
@mschauer I do plan on taking up Bayesian inference on SDEs; currently I am working on implementing a multiple shooting objective (within DiffEqParamEstim.jl), primarily for SDEs. The link you provided is quite detailed and I think it will be really helpful in setting up support for SDEs in DiffEqBayes.jl. Thanks!