What are you working on? [Feb 2019]

#1

An earlier ‘what are you working on?’ post was pretty well received, so I figured I’d make a new topic so people can chat without notifying all the previous responders.

If the mods think this should just be a continuation of the old thread, feel free to close it.


I’ll start off: I’m writing some code for calculating certain corrections to the BCS theory of superconductivity. This basically involves using NLsolve.jl to solve a set of rather ugly Euler-Lagrange equations in momentum space.
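For anyone curious, the NLsolve.jl workflow for this kind of system looks roughly like the following — a minimal sketch with a made-up two-equation residual standing in for the actual Euler-Lagrange equations:

```julia
using NLsolve

# In-place residual F(x) = 0 for a toy 2-component system
# (hypothetical equations, not the actual BCS corrections).
function residual!(F, x)
    F[1] = x[1]^2 + x[2]^2 - 1
    F[2] = x[1] - x[2]
end

sol = nlsolve(residual!, [0.5, 0.5]; autodiff = :forward)
sol.zero  # ≈ [1/√2, 1/√2]
```

For expensive residuals, supplying an analytic Jacobian instead of `autodiff = :forward` is usually worth the effort.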

I’ve tested my code in the BCS limit against some code a colleague wrote in Mathematica, to make sure I reproduce the right values when I turn off my corrections. My code was around 100 times faster than his, so I’m quite pleased about that, especially since the correction I’m considering is waaaay more computationally expensive than the simple case considered in the test.

I still had to use Mathematica for finding the symbolic form of my corrections though. I dream of the day when Julia will trump Mathematica’s symbolic capabilities.

7 Likes

#2

Are you not able to use SymPy via Julia to meet your needs? Otherwise there are native Julia symbolic maths libraries e.g. SymEngine to name one.
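For a sense of what SymEngine.jl covers today, a minimal sketch (basic expansion, differentiation, and substitution, using the API as I understand it):

```julia
using SymEngine

@vars x y                      # declare symbolic variables
expr  = expand((x + y)^2)      # x^2 + 2*x*y + y^2
dexpr = diff(expr, x)          # 2*x + 2*y
subs(dexpr, x => 1, y => 2)    # evaluates to 6
```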

0 Likes

#3

I mean, SymPy might be able to do what I need, but in my experience SymPy is just not as good as Mathematica (plus I’m more familiar with Mathematica than with SymPy). SymEngine is not native Julia either; it’s a Julia wrapper around a C++ library, and last I heard its functionality was a strict subset of SymPy’s, just faster.

If I had to guess, I’d say it’ll be a few years before I have enough enthusiasm and trust to use a symbolic Julia library over Mathematica, though I look forward to that day!

1 Like

#4

Btw, REDUCE is actually the programming language that inspired the creation of Mathematica.

My package Reduce.jl is intended to provide a preview of what symbolic computation in Julia might someday look like. I’ve discussed with the developers of the computer algebra system what would be needed to implement the REDUCE language in native Julia: essentially, a lower-level parser that can read the underlying source code and transpile it into Julia, since the system is hundreds of thousands of lines of code in its own special language, intentionally designed to be parsed for code generation. However, I’m not concerned with that right now.

Currently, I am working on optimizing the type stability and minimizing code footprint for Grassmann.jl

7 Likes

#5

I’ve been trying to make a standalone, relocatable version of my Gtk-based editor using ApplicationBuilder. After some struggle with relocating and fixing the paths of all the binary dependencies, I managed to make it work on macOS (assuming it’s installed in /Applications/). The application is pretty large (800 MB), but given all that’s included with it, that’s not too bad.

That said, I had some networking issues that are very hard to debug, since they only show up when running the application on another computer, so the whole thing wasn’t very useful yet. But with some more work we should be able to bundle and distribute Gtk-based apps.

2 Likes

#6

I’m finishing (hopefully) a paper about constrained nonlinear least squares, which we implemented in Julia, and also trying to get all the main packages in JuliaSmoothOptimizers to stable versions.

6 Likes

#7

Among other things, I’m working on a Julia implementation of stochastic arithmetic: a method that helps diagnose floating-point errors in numerical code. I’m revisiting some experimental, notebook-based work I did last year, improving it and converting it into a full-fledged package.
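The underlying idea can be sketched in a few lines — this is a toy CESTAC-style illustration of my own, not the package’s implementation or API:

```julia
# Randomly perturb the last bit of a Float64.
perturb(x::Float64) = x + rand((-1, 1)) * rand() * eps(x)

# Run f many times under perturbation and estimate how many
# significant digits of the result survive.
function sig_digits(f; nsamples = 100)
    ys = [f(perturb) for _ in 1:nsamples]
    m  = sum(ys) / nsamples
    s  = sqrt(sum(abs2, ys .- m) / (nsamples - 1))
    s == 0 ? 15.9 : max(0.0, log10(abs(m) / s))
end

stable(p) = p(2.0)^2                    # numerically benign
cancel(p) = p(1.0 + 1e-13) - p(1.0)     # catastrophic cancellation
sig_digits(stable)   # most digits survive
sig_digits(cancel)   # only a few digits survive
```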

Some things I learned in the process:

  • how to use Documenter.jl
  • how to set up CI tools to run tests, track coverage and update documentation (skeleton.jl helped a great deal there)
  • tons of cool stuff about macros (which was one of the original goals of this experiment)
7 Likes

#8

A few things are going on simultaneously for me:

  • I’m working on getting AMD’s ROCm/HSA runtime supported in Julia, so that those of us with AMD GPUs can benefit from the amazing progress in GPU support across the Julia ecosystem. Currently, the lack of cohesive documentation on their object/executable loading has made this annoying, but I’m making good progress nonetheless! I hope to have an HSARuntime.jl package out on GitHub in the coming days/weeks. After that comes the much harder job of setting up something akin to CUDAnative.jl for AMD GPU support, but thankfully there’s already great work done by Tim to use for inspiration.
  • Plugging away (for the last ~3 years) at my collection of packages I’m tentatively calling “ChimeraUniverse”, which are designed to work together to create a system for automatically evolving, training, and testing AI “agents” based on spiking neural networks. The recent blog post on DiffEqFlux.jl has inspired me to start porting the core algorithms over to the DiffEq ODE format (since SNNs are ODEs at their core, after all), and re-using Flux’s various layers to make life easier. My first goal is to train agents which have the Unix shell/TTY as their primary I/O interface, and I will start with training them to do routine sysadmin tasks and simple maintenance. This is my main research project, and once it starts bearing fruit, I’ll make a Discourse post and accompanying blog post to share more details on the project, for anyone who might be interested.
  • Sending in patches here-and-there to Drew Devault’s SourceHut project to make it more compatible with Julia’s standard workflow. This is a really interesting project, in my opinion, and one that I look forward to using to host my git repos, test my projects via awesome CI, and just generally provide all the different services required to support open source projects like the ones mentioned above. I’ll also be writing a Discourse+blog post combo on this once I get the chance, with a focus on how Julia developers and users can get started with SourceHut for software development.
  • Various little things on the side, some Julia related, some not, but trying to involve myself in the open source community as much as I can!
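The “SNNs are ODEs” point above can be made concrete with a leaky integrate-and-fire neuron in DifferentialEquations.jl form — a hypothetical minimal model, not the project’s actual code:

```julia
using DifferentialEquations

# Membrane potential relaxes toward an input current; a callback
# detects the threshold crossing (the "spike") and resets it.
const τ, V_th, V_reset, I_in = 10.0, 1.0, 0.0, 1.5
lif(v, p, t) = (-v + I_in) / τ
spike(v, t, integrator) = v - V_th             # event: v hits V_th
reset!(integrator) = (integrator.u = V_reset)  # post-spike reset
cb = ContinuousCallback(spike, reset!)

prob = ODEProblem(lif, 0.0, (0.0, 100.0))
sol  = solve(prob, Tsit5(); callback = cb)     # spikes roughly every τ·ln(3)
```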
8 Likes

#9

I’m rewriting https://github.com/rdeits/cryptics in Julia! Among other things, that has meant:

The new solver is not yet complete, but it does work pretty well already: https://github.com/rdeits/CrypticCrosswords.jl

The next steps are:

  • Even more performance improvements
  • Creating a simple web interface and figuring out how to host it
6 Likes

#10

I’m writing a processor that takes the measurements from a satellite scatterometer and uses them to estimate the normalised radar cross-section of the Earth’s surface. My hope is that the experimental processor written in Julia will be a fraction of the size of, and faster than, the official processor being written by a subcontractor in Java.
Also having fun comparing Crystal lang with Julia lang.

8 Likes

#11

I’m agonizingly close to releasing a Julia translation of the UCR suite’s optimized Dynamic Time Warping implementation. I haven’t done much profiling yet, but it seems pretty fast. More importantly, it fixes a number of bugs in the original implementation and adds new features!
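For context, here is the textbook O(nm) recurrence that the UCR suite then accelerates with lower bounds, early abandoning, and a warping window — plain DTW in a few lines:

```julia
# Classic dynamic-programming DTW distance between two series.
function dtw_distance(a::AbstractVector, b::AbstractVector)
    n, m = length(a), length(b)
    D = fill(Inf, n + 1, m + 1)
    D[1, 1] = 0.0
    for i in 1:n, j in 1:m
        cost = (a[i] - b[j])^2
        D[i+1, j+1] = cost + min(D[i, j+1], D[i+1, j], D[i, j])
    end
    sqrt(D[end, end])
end

dtw_distance([0, 1, 2, 1, 0], [0, 0, 1, 2, 1, 0])  # → 0.0 (pure time warp)
```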

5 Likes

#12

Having a play around with CUDA atomics to create a fast countmap for some data types.
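The atomic-counter idea, sketched on the CPU with Base.Threads (a hypothetical helper of my own, not any package’s API — on the GPU the same shape works with per-bucket atomic adds inside a kernel):

```julia
using Base.Threads

# Lock-free countmap for bytes: one atomic counter per possible value,
# so threads can increment buckets concurrently without a lock.
function countmap_bytes(xs::Vector{UInt8})
    counts = [Atomic{Int}(0) for _ in 1:256]
    @threads for i in eachindex(xs)
        atomic_add!(counts[Int(xs[i]) + 1], 1)
    end
    Dict(UInt8(v - 1) => counts[v][] for v in 1:256 if counts[v][] > 0)
end

countmap_bytes(UInt8[1, 1, 2, 255])  # Dict(0x01 => 2, 0x02 => 1, 0xff => 1)
```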

0 Likes

#13

I am detecting and analyzing the echolocation clicks of odontocetes in an 80 TB (and growing) acoustic dataset recorded by a broadband hydrophone in Monterey Bay, California. This is the first project where my entire workflow has been in Julia, from reading raw .wav files, to pre-filtering the data, to detecting clicks, to clustering and classifying them using unsupervised (and eventually supervised) learning, to visualizing and analyzing the resulting time series. Being able to do all this has really driven home for me how far the language and package ecosystem have come in just the past year or two!
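As a flavour of the detection step, a naive energy-threshold click detector — a toy sketch of my own, not the actual pipeline, which does proper filtering, clustering, and classification:

```julia
# Flag windows whose energy exceeds k times the median window energy —
# a crude stand-in for an echolocation-click detector.
function detect_clicks(x::Vector{Float64}; win = 32, k = 10.0)
    nwin = length(x) ÷ win
    e = [sum(abs2, @view x[(i-1)*win+1 : i*win]) for i in 1:nwin]
    thresh = k * sort(e)[(nwin + 1) ÷ 2]   # median-based noise floor
    [(i - 1) * win + 1 for i in 1:nwin if e[i] > thresh]
end

x = 0.01 .* randn(10_000)
x[5_000:5_031] .+= 1.0        # inject one synthetic "click"
detect_clicks(x)              # start indices of windows near 5000
```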

11 Likes

#14

I’m working on a few half-finished packages that make large environmental datasets easy to use in ecological models.

GrowthRates.jl is for constructing organism growth rate layers from SMAP data. It has modular growth models you can chain arbitrarily, and can run them on the GPU.

Microclimate.jl wraps the microclim datasets for spatial mapping of ecophysiological models.

1 Like

#15

I’m very new to Julia, but I’ve been creating a package, Paillier.jl, which implements the Paillier partially homomorphic cryptosystem.

One example use is carrying out private set intersection: https://github.com/hardbyte/Paillier.jl/blob/master/examples/private_set_intersection.jl
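The property that makes tricks like private set intersection possible is the additive homomorphism: multiplying ciphertexts adds plaintexts. A from-scratch toy instance (wildly insecure key size, and not Paillier.jl’s actual API):

```julia
# Toy Paillier keypair over tiny primes — illustration only!
p, q = big(293), big(433)
n, n2 = p * q, (p * q)^2
g = n + 1
λ = lcm(p - 1, q - 1)
L(u) = (u - 1) ÷ n
μ = invmod(L(powermod(g, λ, n2)), n)

encrypt(m, r) = mod(powermod(g, m, n2) * powermod(r, n, n2), n2)  # r: random, coprime to n
decrypt(c)    = mod(L(powermod(c, λ, n2)) * μ, n)

c1 = encrypt(big(20), big(17))
c2 = encrypt(big(22), big(29))
decrypt(mod(c1 * c2, n2))  # → 42: product of ciphertexts decrypts to sum of plaintexts
```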

2 Likes

#16

This is so awesome! Can you talk more about what you expect to find?

0 Likes

#17

Tearing my hair out trying to load Fannie Mae data into JuliaDB.jl

1 Like

#18

I am working on Bayesian inference for hierarchical models with many parameters (think 400k–10m). While allocation for NUTS/MCMC was insignificant before (it did happen, but it didn’t matter), it is now becoming costly.

I just rewrote parts of LogDensityProblems.jl to support pre-allocated buffers, and DynamicHMC.jl is in the process of being adapted. I am also taking this opportunity to make parts of the latter more modular and robust, and to improve test coverage.
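In generic form, the buffer pattern looks like this — a hypothetical sketch, not the actual LogDensityProblems.jl interface: the caller owns a gradient buffer that every evaluation overwrites, so the sampler’s hot loop allocates nothing.

```julia
# A log-density evaluator that writes its gradient into a caller-owned
# buffer instead of allocating a fresh vector on every call.
struct BufferedLogDensity{T}
    grad::Vector{T}
end

function logdensity_and_gradient!(ℓ::BufferedLogDensity, x::AbstractVector)
    @. ℓ.grad = -x                   # toy standard-normal gradient
    return -sum(abs2, x) / 2, ℓ.grad
end

ℓ = BufferedLogDensity(zeros(3))
v, g = logdensity_and_gradient!(ℓ, [1.0, 2.0, 2.0])
# v == -4.5; g aliases ℓ.grad, so repeated calls reuse the same memory
```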

Also working on some code to make posterior predictive checks easier. I think that PPC is an integral part of Bayesian inference, but by the time the model is estimated there is often little energy left for it, and it involves a lot of boilerplate code. It turns out that a flatter parameter structure makes this a lot easier, so I am considering changes to TransformVariables.jl to that effect.

6 Likes