Problem solved with ChatGPT

Hey Julianners,
just wanted to share that I didn’t have to ask a question here, because ChatGPT answered it instead of the masters.

I just wanted to share how freaking incredible this is.

Since Google likes indexable text, I’ll copy the conversation here too.

Is it possible to get the future type of a variable in Julia and initialize it with that type using template programming?

In Julia, it is not possible to get the type of a variable before it has been defined. You can, however, initialize a variable with a specified type using type annotations. For example:


julia> x = 5 # x is an Int64 by default

julia> y::Int64 = 5 # y is an Int64 because we specified its type

Type annotations can be used with template programming to define a function that takes a variable of a specific type as an argument. For example:


julia> function f(x::T) where T
          # Function body here
       end

In this case, the function f can be called with an argument of any type T, and the type of the argument will be known at compile time. This can be useful for writing functions that are generic over a variety of types, or for optimizing code by allowing the compiler to generate more efficient machine code for specific types.

1 Like

This is unfortunately not correct and a common misunderstanding (I am ignoring the heuristics around non-specializing).

8 Likes

I think it is correct.
Or maybe I am not sure what you are trying to say.

The chatbot-provided answer isn’t actually very good. There are different solutions to this problem employed in different packages.

BTW I think this thread should be moved from General Usage to Off-topic, but I can’t seem to do that anymore.

2 Likes

It’s not correct. It doesn’t make the compiler optimize the code any differently (for almost all types).

6 Likes

Well, if you mean that it optimizes pretty well anyway without specifying the type, then of course, strictly speaking, the annotation won’t optimize it further. But the AI could also mean it is better than dropping the parametric type from a struct you define. You could also use the T to generate some optimized code in certain edge cases, and a parametric struct can produce better code than one that hard-codes a field type. But of course I’ll leave this topic off.
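To illustrate the struct point, here is a minimal sketch (the struct and function names are mine, not from the thread): a parametric struct keeps its field type concrete in each instance, so the compiler can infer field accesses, while an abstractly typed field cannot be inferred beyond its declared abstract type.

```julia
# Parametric struct: T is concrete in each instance (e.g. WithParam{Int64}),
# so the compiler knows the type of `x` at compile time.
struct WithParam{T<:Real}
    x::T
end

# Abstractly typed field: every access to `x` only infers as `Real`.
struct WithoutParam
    x::Real
end

getx(p) = p.x

# `getx(WithParam(3))` infers to Int64, while `getx(WithoutParam(3))` only
# infers to Real, which generally forces boxing and dynamic dispatch
# in downstream code.
```

You can confirm the difference yourself with `@code_warntype getx(WithoutParam(3))`, which flags the abstractly typed field access.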

Well, if it is so, then it is sad.
If you know of any of them, could you point me in the right direction to find them?

The response by ChatGPT is useless at best, but more likely misleading. For one thing, it says that you can use an annotation to restrict the type of an argument, then gives an example that does not restrict the type at all. It then strongly implies (without quite saying so explicitly) that the annotation T is required in order to know the type at compile time. This is completely false and misleading. For example:

julia> f(x::T) where T = x
f (generic function with 1 method)

julia> g(x) = x
g (generic function with 1 method)

julia> @code_typed f(1)
CodeInfo(
1 ─     return x
) => Int64

julia> @code_typed g(1)
CodeInfo(
1 ─     return x
) => Int64

It would be interesting to see if you can get good advice on this kind of thing from ChatGPT. Maybe by changing the input. I haven’t tried it myself.

11 Likes

I don’t have enough time right now, but I suggest you create another thread specifically for your question, if you can’t find an already existing answer by searching.

Part of good advice is consistency. Knowing that there is a substantial risk that GPT is wrong (like here) makes all its advice dubious. You’ll have to verify with a more trustworthy source anyway.

5 Likes

Yes. That’s true. I’m interested in seeing when it does things correctly. It’s certainly not consistently correct.

1 Like

I tried some Winograd Schemas. ChatGPT seems a bit confused by some of them and kind of misses the point, but TBH in a way that reminds me of (sometimes obtuse) people.

For other examples of these schemas, it knocks it out of the park. For example, an article from 3 years ago quotes a researcher saying, well, if AI gets these correct slightly more than half the time, that’s only slightly better than random. But with ChatGPT you can be sure it’s not a guess, because it doesn’t just give the answer. It writes a little essay demonstrating clearly that it “understands” the situation.

It seems to be oversubscribed recently, but it might be interesting to see if it could be otherwise useful in writing software. For example find some simple errors in Julia code. Or improve it in some way. … Replace x with a descriptive name, or any number of other chores. Julia specific things would be cool. Like tell you that a particular use of @inbounds is not safe, or add one that is.

Just add in a little ChatGPT compiler pass :robot: :rofl:

1 Like

Not quite yet! I just did this:
Me: Can you improve this Julia language function?

function f(v::AbstractVector)
    _sum = 0
    for i in 1:length(v)
        _sum += v[i]
    end
    return _sum
end

It replies (with syntactically highlighted code)
ChatGPT: Sure, here is one way to improve this function:

function f(v::AbstractVector)
    return sum(v)
end

But then it repeats the rookie mistake above (but with properly quoted code):

ChatGPT: This new version of the function uses the sum function from the Base library to compute the sum of the elements in v, which is more concise and efficient than the original version.

Note that AbstractVector is a broad type that allows the function to accept any type of vector, such as a Vector, Array, or Range, as its argument. However, it would be more specific and efficient to use a concrete type, such as Vector, instead of AbstractVector if you know that the function will always be called with a Vector argument.
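A quick sketch of why that last paragraph is the same rookie mistake (the function names below are mine): Julia compiles a specialized method for the runtime argument type either way, so the concrete annotation does not make the call faster, it only narrows which arguments are accepted.

```julia
# Both functions compile to identical specialized code for a Vector{Int}
# argument; the annotation only restricts which calls are allowed.
f_abstract(v::AbstractVector) = sum(v)
f_concrete(v::Vector) = sum(v)

# In the REPL, `@code_typed f_abstract([1, 2, 3])` and
# `@code_typed f_concrete([1, 2, 3])` should show the same body.
```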

ChatGPT reminds me of a disorder that causes a person to do something akin to lying.

I add my own editorial comment in square brackets below.

Me: Can you improve this Julia language function?

function f(v::AbstractVector)
    _sum = 0
    for i in 1:length(v)
        _sum += v[i]^2
    end
    return _sum
end

ChatGPT: [CORRECT]
One way to improve this function is to use the built-in sum function to calculate the sum of the squared elements in the vector, rather than using a for loop. This would make the function more concise and efficient. The updated function would look like this:

function f(v::AbstractVector)
    return sum(x -> x^2, v)
end

ChatGPT: [CORRECT] Alternatively, if the goal is to calculate the sum of the squares of the elements in the vector, then the sum function can be called directly on the squared vector, without the need for a lambda function:

function f(v::AbstractVector)
    return sum(v.^2)
end

ChatGPT: [CORRECT] Another potential optimization is to use the built-in dot function to calculate the dot product of the vector with itself. Since the dot product of a vector with itself is the sum of the squares of its elements, this would also be a more efficient and concise way to calculate the sum of the squares of the elements in the vector. The updated function would look like this:

function f(v::AbstractVector)
    return dot(v, v)
end

ChatGPT: [WRONG, mostly. sumabs2 was removed from Julia many years ago] Finally, it is also possible to use the sumabs2 function to calculate the sum of the squares of the elements in the vector. This function is optimized for efficient calculation of this quantity, and may be faster than the other alternatives mentioned above. The updated function would look like this:

function f(v::AbstractVector)
    return sumabs2(v)
end

Me: sumabs was deprecated in Julia in 2016. Should I still use it? Do you think it’s even available in recent versions?

ChatGPT: [WRONG. I did mistakenly write sumabs instead of sumabs2. But ChatGPT now replies as if sumabs is a real thing, when it’s not. It also claims falsely that sumabs2 is up to date, when in fact it is long gone.]
It is not recommended to use the sumabs function, as it has been deprecated in Julia and may not be available in recent versions. Instead, it is recommended to use the sumabs2 function, which calculates the sum of the squares of the elements in a vector, or to use the dot function to calculate the dot product of the vector with itself. These alternatives are both more efficient and more up-to-date than the deprecated sumabs function.

Me: I made a mistake, I meant that the sumabs2 function has been deprecated for eight years. What’s the latest version of Julia I could use that would include sumabs2?

ChatGPT: [WRONG. It most definitely was deprecated and removed about six years ago] The sumabs2 function has not been deprecated in Julia, and it is still available in the latest versions of the language. As such, you can use the sumabs2 function in any recent version of Julia without encountering any issues.
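For anyone who lands here from a search: sumabs2 really is long gone (deprecated back in 2016, as noted above), but the same quantity is easy to compute in current Julia. The helper names below are mine:

```julia
using LinearAlgebra  # `dot` lives here in Julia 1.x

# Modern replacements for the removed `sumabs2`:
sum_sq_map(v) = sum(abs2, v)  # applies abs2 elementwise, no temporary array
sum_sq_dot(v) = dot(v, v)     # dot product of v with itself
```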

2 Likes

I think ChatGPT is good at talking nonsense. I asked it to give some functions in Julia. It gave some examples using fake or nonexistent packages (it even gave me an R package name) and illustrated that you start with using XXX and then call the function. I told it that there is no such package in Julia. It admitted the mistake and then recommended more fake packages.

Another example is its answer to 4/15. It answered that 4/15 can be a decimal with value 0.266666666666 and it can also be a fraction which can be simplified to 2/9.
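Ironically, Julia itself answers 4/15 correctly: `/` gives a Float64 and `//` an exact rational, and 4//15 is already in lowest terms, so it certainly does not simplify to 2/9.

```julia
q_float = 4 / 15    # Float64 division, approximately 0.2667
q_exact = 4 // 15   # exact Rational{Int64}, automatically kept in lowest terms

# Rationals compare exactly, so the bogus "simplification" is easy to refute:
# q_exact == 2 // 9 is false.
```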

4 Likes

Even though you prepend the first paragraph with “ChatGPT”, it is still really ambiguous whether the second paragraph is your own or ChatGPT’s (though I’ll assume it’s a mistake you wouldn’t make :wink: )

Perhaps you can put quotes in a quote block? Or even do this:

[quote=“ChatGPT”]
Gobbledygook.

More talk.
[/quote]

to get

As for using ChatGPT, I’ve seen it do some incredible stuff, and I’ve also seen it just make up functions and packages out of thin air, and repeatedly give wrong answers. It’s a very mixed bag.

3 Likes

Not about julia but I got interested. Below is some math homework help :man_facepalming:

Sounds/reads like an authority on the subject, makes an elementary mistake, gets it wrong, continues to sound like an expert… Poor teachers who have to correct AI assisted homework problems :smiley:

Me: solve \int_0^\infty 1/(1+x^2) dx

ChatGPT:
To solve this integral, we can use a substitution. Let u=1+x^2, so du=2x\ dx. Then the integral becomes:

\int_0^\infty \frac{1}{1+x^2}\ dx = \int_1^\infty \frac{1}{u}\ du = \left[\log u\right]_1^\infty = \boxed{\infty}.

Note that the integral is not defined, as it approaches infinity as the upper limit goes to infinity. This is because the function \frac{1}{1+x^2} does not have a finite limit as x goes to infinity.
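For the record, both the substitution and the conclusion above are wrong: with u = 1 + x^2 we get dx = du / (2\sqrt{u - 1}), not du, and the integral converges. The standard evaluation uses the arctangent antiderivative:

\int_0^\infty \frac{1}{1+x^2}\ dx = \left[\arctan x\right]_0^\infty = \frac{\pi}{2} - 0 = \frac{\pi}{2}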

1 Like

I’ve edited my post to try to clarify what is reasonable and what is BS.

[WRONG AI-generated content]

Note that the integral is not defined, as it approaches infinity as the upper limit goes to infinity. This is because the function does not have a finite limit as x goes to infinity.

Wow. I wonder where it finds that text. I guess correct statements on this outweigh incorrect ones online.