About operators & some other questions & possible suggestions

Hi. I am new to Julia and have enjoyed it so far.

What a beautiful language. I have not made anything special or serious with it yet, but I have been playing with it and getting used to it. I have used C and C-like languages, (Object) Pascal, R, and functional and logical languages before, but Julia is the one that really interests me, because I can see it in time replacing both static and dynamic languages, and certain others (though probably not logic programming languages, much as I would like to see that). If there were one language such as Julia, a swift, dynamic, batteries-included language, well suited to the different areas it already targets, with the possibility of static compilation to efficient stand-alone binaries, then that would be the language I would want to use both as a general-purpose language and for specific purposes (e.g., statistics). I would even like to see it used for game development, though I suppose one could simply call an existing C library (I do wonder how much of a hassle that would be, and what the gotchas are). I think it could at some point, or perhaps already, be used by those learning computer science or a first programming language.

Something else now: I noticed that symbols much more in tune with standard logical and mathematical notation may be used. This is great. I was wondering whether Julia will be enhanced further in this area. What about including symbols such as:

¬ for negation (equivalent to !);
× for multiplication (equivalent to *);
{ and } for sets;
∧ for conjunction (equivalent to &&);
∨ for inclusive disjunction (equivalent to ||)

so that Julia’s expressiveness and beauty are enhanced? There are after all already symbols such as ≠ for !=, ∈ for element of (or in), and other nice ones. It would sure make Julia nicer.

I was also wondering about the possibility of Julia as a logic programming language: that it might even have a kind of logic programming mode, or be somehow combined with other paradigms. I never liked Prolog, because its syntax is so ugly. It is called a logic programming language, and it is one, and it may be used for serious business, but as of this writing it does not adhere to the more traditional standard logical notation (∧, ∨, ¬, ⊃, etc., including the generally accepted conventions of the respective disciplines). Looking at Prolog syntax almost disgusts me. Could Julia at some point, after other, probably more pressing business has been taken care of, be enhanced in this area? I was thinking of something built on the already existing abstract types and hierarchies, but meh… In the meantime, let me dream of using Julia as a logical language with querying capabilities and great notational expressiveness. Yes, it could even have quantifier symbols such as ∀ and ∃.

Wouldn’t that be great? It would require great work. But let me dream about it, will you?

All in all, I am excited about Julia and its future. It looks very bright. Please continue all your good work.


Hi and welcome!

Note that some of these symbols are already parsed as operators and you can define them as you want, e.g.,

julia> ×(a,b) = *(a,b)
× (generic function with 1 method)

julia> 2×3
6

Defaulting to this definition for everybody is not a good idea, since some people use × for other things, like the cross product.
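The same trick works for several of the logic symbols from the original post, since ∧ (\wedge), ∨ (\vee), and ¬ (\neg) already parse as operators in Julia; they simply have no methods in Base. A minimal sketch, where all definitions are hypothetical user code rather than anything built in (note that a function version of && cannot short-circuit, because both arguments are evaluated before the call):

```julia
# Hypothetical user definitions; these symbols parse as operators
# out of the box, they just have no methods in Base.
∧(a::Bool, b::Bool) = a && b   # no short-circuiting: both args are evaluated first
∨(a::Bool, b::Bool) = a || b
¬(x::Bool) = !x

@show true ∧ false   # false
@show true ∨ false   # true
@show ¬true          # false
```

Because these are ordinary generic functions, they stay local to whatever module defines them, which is exactly why they work as personal notation but not as a global default.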


I should also note that in general I would suggest avoiding “personal synonyms” for standard operations like *, because you’ll end up writing in a weird personal idiom of Julia. When in Rome….

In cases where there are perfectly good, short, ASCII operators already we tend not to add additional synonyms. See also the long discussion about && and || synonyms here: https://github.com/JuliaLang/julia/pull/19788


Thanks, baggepinnen.

I see what you mean, and I had actually forgotten that I could define them myself. Well, too bad! The multiplication dot came to mind, but maybe it would also be problematic. Is there no solution at all (besides personal definitions)? (My post became longer than I expected, so I wrote some things down below. Maybe you are interested.)

Thanks, stevengj, for the links and words. There are some nice discussions going on, with many interesting comments.

I myself don’t mind using && and ||, but they are really a kind of leftover from the ugly C-like syntax (where there is hardly any respect, if I may say so, for what came before it, and hence less overall consistency throughout history); a simple & and | (or, if you dare, even a reserved letter v) look better and are shorter to type. I don’t see the problem with confusing && and &, though. At least compared to C, Wirth showed much more respect with his languages, which are unfortunately underused, and for the wrong reasons!

Here comes the long part. I expect that some, maybe many for all I know here, will disagree.

To me it is a kind of no-brainer, if I may use that word, to use what is most consistent, especially since it was already standard before computer science even seriously hit the scene (which was, what, only half a century ago or so, though I am no specialist historian): it seems obvious that one should respect and stay in tune with already existing conventions (∧, ∨, ⊻, ¬, &, ~, :=, =, etc.) and, given the nature of programming, make them work in different contexts, which would also relieve people from learning too many different symbols. One might say that &&, ||, & and |, as C and its lookalikes use them, are also existing conventions, but of course, as mentioned, these simply went against what already existed and should have been respected. (Maybe there were no other options available at the time, though I like to think the design of C’s syntax could have been better, even with those limitations. We now live in the 21st century, so we had better move on.)

Additionally, the following fact is very simple and important (and I at least already mentioned or implied it, and it is obvious to us): Julia already conforms, with great boldness and grace, if such words are sufficiently accurate, to established principles, so why not go all the way and make the full change now? Learning symbols is easy anyway, and always necessary. (One may as well learn how it is supposed to be, how it has been established.) Those who are unfamiliar with, e.g., formal logical symbols will become better for having learnt something new, and should they ever engage in, e.g., formal logic (where even a little bit is good for anyone, by the way), they will have the advantage of already knowing some essential operators. So there would be even more cohesion and holism in matters of knowledge. (What I write below about the input of such symbols would also help.)**** So why not simply use…

∧, &: in different contexts (logical conjunction, bitwise operation);
∨: in contexts of logical inclusive disjunction and bitwise operation;
~, ¬: negation in different contexts;
⊻: bitwise operation;
:= (or even the Unicode symbol) for assignment;
= for testing equality;
{ } for sets (instead of using Set);
even find a good way to standardize × and ⋅ ;
etc. etc. …?

Why not find a good way, a good design, to conform as much as possible to what was already agreed upon, and, where there are possible clashes in symbols (such as ×), find a kind of good working respecting balance (taking keyboards, LaTeX notation, handiness, etc. into account), so as to let the syntax be even more beautiful and expressive, since Julia has already gone this route significantly?

****(Yes, besides using LaTeX notation in code editors and in the REPL, editor plugins could make it easy to input symbols by pressing the standard ASCII character (e.g., &) and then suggesting, or immediately substituting, the now-standard symbol (e.g., ∧). Maybe this could even be done in the official REPL. Some years ago I actually tried to design a programming language, though I never got around to building it; but the ecosystem around a language, even if LaTeX + Tab is available, can greatly assist the coder in inputting those Unicode symbols normally unavailable on standard keyboard layouts.)


I think one problem is that there is always someone using an editor without support for entering these characters, or a display that does not render them correctly (Chrome on Android fails to display many Unicode characters when I’m browsing code), so the ASCII alternatives always have to be present.

Then, some of the symbols you propose have a well-established meaning, or have different meanings in different fields. ∧, for instance, is the min operator in some fields, i.e., x ∧ y = min(x,y). See here for instance, or indeed the many different definitions for various operators here. It could thus be very confusing for some people to have it mean &&.
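To illustrate the clash: nothing stops a user from giving ∧ the lattice/min meaning mentioned above instead of the logical one, so a global default would surprise one camp or the other. A hypothetical sketch:

```julia
# Hypothetical: in lattice theory and min-plus algebra, x ∧ y often means min(x, y),
# not logical conjunction.
∧(x::Real, y::Real) = min(x, y)

@show 3 ∧ 5   # 3
```

Both camps can define the operator they want in their own code, which is arguably the best Julia can do without privileging one field's notation.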

Also note that ⋅ (\cdot) is already defined in LinearAlgebra to mean the dot product:

julia> using LinearAlgebra

julia> ones(3)⋅randn(3)

Also note that CS notation is optimized for keyboards/typewriters, while mathematical notation is optimized for pen & paper / chalk & blackboard, using its much richer set of glyphs to permit single-character variable names, and using the fact that it is interpreted by smart humans instead of stupid silicon to employ strategic ambiguity. It appears quite clear that CS traditions are more relevant than mathematical traditions in this respect (after all, you are using a keyboard and text editor (not word processor!) to enter code, and are reading it in a monospaced font instead of compiled latex output).

TeX/LaTeX agrees, i.e. it takes the CS route of focusing on 7-bit ASCII (even though Unicode support is getting better and better). APL had some good ideas, but its insane character set was not one of them.

CS tradition is very US-centric (keyboard ergonomics of characters like $/{}), but that’s a bullet we all have to bite.


The character input would be no problem, because the ecosystem would have taken care of it in the first place (as already explained). As for someone using a display incapable of showing the Unicode characters, inclusion could be used (something that already came to mind: one may allow many different symbols for the same things), but this would defeat Julia’s purpose of setting the standard once and for all. Julia already seems to make concessions or compromises anyway, and who uses such a primitive display these days (which is arguably not Julia’s problem)? Not only that, but it looks as though the REPL on Windows is buggy, yet they have not fixed that, have they, and they still put it out there? Then they may as well follow this plan of setting the standard once and for all and go all the way. (Certainly, certain things have to be taken into consideration, as already expressed, but the essential idea may as well be carried out.)

About those established meanings: logic comes first, as a superior or much more essential subject, for without its basic principles one cannot even think, let alone have science, let alone, especially, construct a programming language; so the symbols used in logic (¬, ∧, ∨, etc.), having greater rank, should obviously be used this way.

The whole thing, more or less as it was stated before, would greatly improve things.

foobar_lv2, then why is Julia already filled with all sorts of syntax much more in tune with mathematical notation?

Besides that, why must we care in this day and age about that old CS tradition, so disrespectful of the subjects that preceded it, knowing (and you seem to agree, in your own words: “. . . but that’s a bullet we all have to bite.”) that it is indeed in a sense a bad thing, if now is the time to set things straight with Julia?

I really don’t understand why both of you oppose this.

There are people who are minimalists and don’t like carrying around large font packs on their system just to be able to write code :) I’m not one of them, but I’ve thought about it once or twice.

Sorry, but you’re not going to get a positive response to this statement. There are so many different kinds of people using Julia, and I’m sure many of them could easily argue a counterpoint that is just as valid as this one. Also, the assumption that logic is the quintessential foundation of everything, without fully knowing the architecture of our universe and reality (which is likely impossible), is itself a logical fallacy. Let’s just leave these symbols up to the user to define in the way they find most useful and “logical”.

Probably because most people who use Julia have also used another language that does have these sorts of semantics and syntax, so it is comfortable and easy for them to grasp quickly. This is key to the growth of an up-and-coming language like Julia. Maybe in the future we can change these things if they truly are too cumbersome, but doing so right now would likely be a large misstep.

¬, ∧ and ∨ are just notation. !, && and || are also just notation. Why is one any better than the other? The only differences are familiarity and convenience. The former set may be familiar if you have studied advanced mathematics, but entering it is pretty inconvenient on most keyboards. The latter are familiar to people who have used many programming languages and are very convenient, since they have keys on almost all keyboards.