Calling libjulia functions from C with Unicode keyword arguments

Hello,

I’ve posted two questions on Stack Overflow about my confusion using libjulia from C.

http://stackoverflow.com/questions/42625565/libjulia-julia-c-api-named-variable-convention-for-function-call

I’m confused about calling a function that has named Unicode parameters, for example a function with this signature:

function bjt(typ; is=1e-12, η=1, isc=is, ise=is, ηc=η, ηe=η, βf=1000, βr=10)

Thank you very much if you can find the time to answer me.

Regards.

There’s no direct way to call functions with keyword arguments through the C API, mainly because there’s no direct representation of them in C. The easiest way is to just construct a Julia function that translates the positional arguments into keyword arguments.

Also note that you have to root the result of jl_box_float64, or you’ll get segfaults from time to time.
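
A minimal sketch of both points, with a made-up target f(x; k = 0) and a wrapper name f_pos invented for the example:

#include <julia.h>

int main()
{
    jl_init();

    // Stand-in for a function with a keyword argument.
    jl_eval_string("f(x; k = 0) = x + k");
    // Wrapper that translates positional arguments into keyword arguments,
    // so the C side never has to construct kwargs itself.
    jl_eval_string("f_pos(x, k) = f(x; k = k)");

    jl_function_t *f_pos = jl_get_function(jl_main_module, "f_pos");

    // Root the boxed arguments: jl_box_float64 allocates, so the GC could
    // free the first box while creating the second one if it isn't rooted.
    jl_value_t *x = NULL, *k = NULL;
    JL_GC_PUSH2(&x, &k);
    x = jl_box_float64(1.0);
    k = jl_box_float64(2.0);
    jl_value_t *result = jl_call2(f_pos, x, k);
    (void)result;
    JL_GC_POP();

    jl_atexit_hook(0);
    return 0;
}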

The easiest way is to just construct a Julia function that translates the positional arguments into keyword arguments.

If I understand correctly, I need to patch the Julia package that I want to call with an additional “bjt” function that wraps the original bjt function behind a positional-argument signature? If I’m right, that would also let me avoid the Unicode characters (beta/eta).

Is there no way to do a simple call with jl_eval_string if I replace the Unicode characters with a \u0000 UTF-8 equivalent, or by some other, more or less mysterious, method? The Julia documentation says that the parser accepts UTF-8 \u escape strings, but what about a C string (char*)?

Thank you for your time.

No, you can just define a new function with jl_eval_string.
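
For instance, something along these lines (a sketch; bjt_pos is a name invented here, and it assumes the ACME package is already loaded):

// Define the wrapper in Main instead of patching the package; it simply
// forwards its positional arguments to the keyword arguments of ACME.bjt.
jl_eval_string("bjt_pos(typ, isc, ise) = ACME.bjt(typ; isc=isc, ise=ise)");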

Just to be clear, I don’t see how this is related to Unicode at all; it’s just about calling functions with kwargs, whether they are ASCII or not.

Just to be clear, I don’t see how this is related to Unicode at all; it’s just about calling functions with kwargs, whether they are ASCII or not.

OK, to be precise, and if you want to test what I mean, the package is ACME:

Pkg.add("ACME")

With this framework, you can create electronic elements and connect them into an electronic circuit. For my example, I use the specific transistor type BJT (bipolar junction transistor), whose Julia function signature is defined as:

 function bjt(typ; is=1e-12, η=1, isc=is, ise=is, ηc=η, ηe=η, βf=1000, βr=10)
 ...
 end

By using the main evaluation function of libjulia:

jl_eval_string

I can get a correct result with:

jl_eval_string("bjt(:npn)");

or with the symbol :pnp; either way, I get a correct pointer to an Element (the main element type of the ACME package).

If I call the evaluation with dummy values (the same as the defaults):

jl_eval_string("bjt(:npn, isc=1e-12, ise=1e-12)");

I get a good pointer too. But if I call the evaluation with:

jl_eval_string("bjt(:npn, 1e-12, 1e-12)");

I get an exception:

MethodError(ACME.bjt,(:npn,0.0,0.0))

I also get an exception when I call the evaluation with all of the parameters (unnamed) in the correct order:

MethodError(ACME.bjt,(:npn,0.0,0.0,1.0,1.0,1000.0,10.0)) 
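
As an aside, here is a minimal sketch of how such exceptions can be detected on the C side, using jl_exception_occurred and jl_typeof_str from the embedding API:

jl_value_t *ret = jl_eval_string("bjt(:npn, 1e-12, 1e-12)");
if (jl_exception_occurred()) {
    // ret is NULL here; jl_typeof_str yields the exception's type name,
    // e.g. "MethodError" for the failure above.
    fprintf(stderr, "Julia exception: %s\n",
            jl_typeof_str(jl_exception_occurred()));
}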

So I need to call jl_eval_string with all of the kwargs named; you’re right.

Now, about the Unicode question: as you can see in the signature of the bjt function, the kwargs use two Unicode characters (beta and eta).

If I call jl_eval_string with the Unicode characters in the string:

bjt(:npn, isc=0.000000, ise=0.000000, ?c=1.000000, ?e=1.000000, ?f=1000.000000, ?r=10.000000)
ErrorException("syntax: keyword argument is not a symbol: \"*(?,c)\""

If I call jl_eval_string with the Julia Unicode “tab completion sequences” as defined here:

http://docs.julialang.org/en/stable/manual/unicode-input/

I get an exception:

bjt(:npn, isc=0.000000, ise=0.000000, \etac=1.000000, \etae=1.000000, \betaf=1000.000000, \betar=10.000000)
ErrorException("syntax: \"\\\" is not a unary operator")

I get the same result when I call bjt with UTF-8 (\x) or UTF-16 (\u) escapes:

          eta (η)     beta (β)
UTF-8  : \xce\xb7    \xce\xb2
UTF-16 : \u03b7      \u03b2
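
For reference, when the \x escapes are written in the C string literal itself, the compiler (not Julia) expands them into the raw UTF-8 bytes of η, so the parser receives the real character; a sketch, assuming ACME is already loaded:

// \xce\xb7 are the two UTF-8 bytes of η. The literal is split in two
// because \x would otherwise swallow the following 'c' as a hex digit
// ("\xce\xb7c" is one overlong escape, not η followed by c).
jl_eval_string("bjt(:npn, \xce\xb7" "c=1.0, \xce\xb7" "e=1.0)");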

I think it’s because jl_eval_string takes a plain narrow string as its input:

const char*

not a wide string:

wchar_t*

Or maybe I’m misunderstanding something somewhere.

Thank you very much for taking the time to help me.

Max

I found the solution.

By using a UTF-8 string literal, I get a correct pointer. So there is no need to explicitly convert the Unicode characters to UTF-8 or UTF-16/32; just call the function like this:

std::string expr = string_format(std::string(u8"bjt(%s, isc=%f, ise=%f, ηc=%f, ηe=%f, βf=%f, βr=%f)"),
                                 type == npn ? ":npn" : ":pnp", isc, ise, ηc, ηe, βf, βr);
std::string id = owner.make_unique_ID(type);
jl_value_t* ptr = jl_eval_string((id + " = " + expr).c_str());

The UTF-8 literal conversion works here on VS2017 RC:

const char* utf8literal = u8"This is a Unicode UTF-8 string! with my beta β and eta η";
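
For completeness, a self-contained variant of the same idea, using std::snprintf instead of my string_format helper (the values are illustrative):

#include <julia.h>
#include <cstdio>

int main()
{
    jl_init();
    jl_eval_string("using ACME"); // make bjt available

    // Format the call into a u8 (UTF-8) literal so Julia receives the real
    // η/β characters. Note: u8 literals are const char* up to C++17; under
    // C++20 they become char8_t* and would need a cast for snprintf.
    char expr[256];
    std::snprintf(expr, sizeof expr,
                  u8"bjt(%s, isc=%g, ise=%g, ηc=%g, ηe=%g, βf=%g, βr=%g)",
                  ":npn", 1e-12, 1e-12, 1.0, 1.0, 1000.0, 10.0);

    jl_value_t *elem = jl_eval_string(expr);
    if (jl_exception_occurred())
        std::fprintf(stderr, "Julia exception: %s\n",
                     jl_typeof_str(jl_exception_occurred()));
    (void)elem;

    jl_atexit_hook(0);
    return 0;
}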

Thank you very much for your support.

Long live Julia.