Just to be clear, I don’t see how this is related to Unicode at all; it’s just about calling functions with kwargs, whether they are ASCII or not.
OK, to be precise, and in case you want to test what I mean, the package is ACME:
Pkg.add("ACME")
With this framework, you can create electronic elements and connect them into an electronic circuit. For my example, I use the specific transistor type BJT (bipolar junction transistor), whose Julia function signature is defined as:
function bjt(typ; is=1e-12, η=1, isc=is, ise=is, ηc=η, ηe=η, βf=1000, βr=10)
...
end
By using the main evaluation function of libjulia:
jl_eval_string
I can get a correct result with:
jl_eval_string("bjt(:npn)");
or with the symbol :pnp. I get a valid pointer to an Element (the main element type of the ACME package).
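For context, my C side looks roughly like this (simplified sketch, error handling mostly omitted):

#include <julia.h>
#include <stdio.h>

int main(void)
{
    jl_init();                          /* start the embedded Julia runtime */
    jl_eval_string("using ACME");       /* load the package */

    /* evaluate the call; the returned jl_value_t* is the ACME Element */
    jl_value_t *elem = jl_eval_string("bjt(:npn)");
    if (jl_exception_occurred())
        printf("exception: %s\n", jl_typeof_str(jl_exception_occurred()));
    else
        printf("got element pointer %p\n", (void *)elem);

    jl_atexit_hook(0);
    return 0;
}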
If I call the evaluation with dummy values (the same as the default values):
jl_eval_string("bjt(:npn, isc=1e-12, ise=1e-12)");
I get a valid pointer too. But if I call the evaluation with:
jl_eval_string ("bjt(:npn, 1e-12, 1e-12)");
I get an exception :
MethodError(ACME.bjt(:npn,0.0,0.0))
I also get an exception when I call the evaluation with all the parameters (unnamed) in the correct order:
MethodError(ACME.bjt,(:npn,0.0,0.0,1.0,1.0,1000.0,10.0))
So I need to call jl_eval_string with the kwargs given by name; you’re right.
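In practice, since keyword arguments in Julia can only be passed by name, I build the call string with the names spelled out, something like this (sketch; the make_npn helper name is just for illustration):

#include <julia.h>
#include <stdio.h>

/* Sketch: build the call with named (ASCII) keyword arguments only.
 * Keyword arguments cannot be passed positionally, which is why the
 * unnamed variants above raise MethodError. */
static jl_value_t *make_npn(double isc, double ise)
{
    char buf[128];
    snprintf(buf, sizeof buf, "bjt(:npn, isc=%g, ise=%g)", isc, ise);
    return jl_eval_string(buf);  /* the Element, or NULL if an exception was raised */
}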
Now, about the Unicode question: as you can see in the signature of the bjt function, the kwargs use two Unicode characters (beta and eta).
If I call jl_eval_string with the Unicode characters in the string:
bjt(:npn, isc=0.000000, ise=0.000000, ?c=1.000000, ?e=1.000000, ?f=1000.000000, ?r=10.000000)
I get:
ErrorException("syntax: keyword argument is not a symbol: \"*(?,c)\"")
If I call jl_eval_string with the Julia Unicode “tab completion sequences” as defined here:
http://docs.julialang.org/en/stable/manual/unicode-input/
I get an exception:
bjt(:npn, isc=0.000000, ise=0.000000, \etac=1.000000, \etae=1.000000, \betaf=1000.000000, \betar=10.000000)
ErrorException("syntax: \"\\\" is not a unary operator")
I get the same result when I call bjt with UTF-8 (\x) or UTF-16 (\u) escape sequences for eta (η) and beta (β):
UTF-8: \xce\xb7, \xce\xb2
UTF-16: \u03b7, \u03b2
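To be explicit about what I mean by the UTF-8 escapes, a sketch of building such a string in C would look like this (illustration only; the string literals are split so that the \x hex escape does not absorb the letter that follows it):

#include <julia.h>
#include <stdio.h>

/* Sketch: embed the raw UTF-8 bytes of η (0xCE 0xB7) and β (0xCE 0xB2)
 * in the string passed to jl_eval_string. */
static jl_value_t *make_npn_utf8(void)
{
    char buf[256];
    snprintf(buf, sizeof buf,
             "bjt(:npn, isc=%g, ise=%g, "
             "\xce\xb7" "c=%g, \xce\xb7" "e=%g, "
             "\xce\xb2" "f=%g, \xce\xb2" "r=%g)",
             1e-12, 1e-12, 1.0, 1.0, 1000.0, 10.0);
    return jl_eval_string(buf);
}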
I think it’s because jl_eval_string uses a plain ASCII string signature as input:
const char*
not a wide string:
wchar_t*
Or maybe I misunderstand something somewhere.
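One thing I still need to verify is whether that const char* is really ASCII-only or whether it accepts UTF-8 bytes; a quick test (run after jl_init(), assuming a C11 compiler for the u8 literal) would be something like:

/* Quick test: pass a UTF-8 identifier (η, bytes 0xCE 0xB7) through the plain
 * const char* interface.  u8"" and \u03b7 are C11 and produce UTF-8 bytes. */
jl_value_t *v = jl_eval_string(u8"\u03b7 = 2.0");
if (jl_exception_occurred())
    printf("exception: %s\n", jl_typeof_str(jl_exception_occurred()));
else
    printf("η accepted, value = %g\n", jl_unbox_float64(v));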
Thank you very much for taking the time to help me.
Max