Restructure of the promotion mechanism for broadcast #18642
stevengj merged 3 commits into JuliaLang:master
Conversation
Force-pushed 58796f1 to 87b0cb7
Will this mean that …
base/broadcast.jl (Outdated)

```julia
@inline broadcast(f, x::Number...) = f(x...)
@inline broadcast{N}(f, t::NTuple{N}, ts::Vararg{NTuple{N}}) = map(f, t, ts...)
@inline broadcast(f, As::AbstractArray...) = broadcast_t(f, promote_eltype_op(f, As...), As...)
@inline broadcast(f, As::AbstractArray...) = _broadcast(f, As...)
```
Is there still a point in calling this _broadcast instead of broadcast? Isn't multiple dispatch enough here?
Calling broadcast now first checks the container types to determine the resulting container. In this case I was trying to avoid that check for the case where we know all the arguments are Array-like containers.
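A rough sketch of that layering (hypothetical names and assumed behavior, not the PR's actual code): the public entry point classifies the containers to pick the output container, while the underscore version is the fast path that assumes everything is already Array-like.

```julia
# Hypothetical sketch, not the PR's implementation.
container_kind(::AbstractArray) = :array
container_kind(::Tuple)         = :tuple

function broadcast_sketch(f, As...)
    # Generic entry point: classify the containers first.
    all(a -> container_kind(a) === :array, As) ||
        error("non-Array containers would take a different path here")
    _broadcast_sketch(f, As...)   # fast path, no further container checks
end

# Stand-in for the real fused broadcasting loop.
_broadcast_sketch(f, As...) = map(f, As...)
```

Dispatching straight to the internal method when the argument types are already known to be `AbstractArray` avoids re-running the classification.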
```julia
# PR 16988
@test Base.promote_op(+, Bool) === Int
@test isa(broadcast(+, [true]), Array{Int,1})
@test Base.promote_op(Float64, Bool) === Float64
```
Are you dropping this test because it would fail now? That would be unfortunate in case something outside Base relies on it...
base/broadcast.jl (Outdated)

```julia
end

@inline broadcast_t(f, T, As...) = broadcast!(f, similar(Array{T}, broadcast_indices(As...)), As...)
@inline function _broadcast_t(f, T, shape, iter, As...)
```
add some comments here and below describing what these are for
```julia
function broadcast_c(f, ::Type{Tuple}, As...)
    shape = broadcast_indices(As...)
    check_broadcast_indices(shape, As...)
```
why is this being removed?
broadcast_indices already fails when the sizes are not compatible. The only place where we really need it is broadcast!, where we don't know the size of the supplied container.
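For illustration (my example, hedged; the exact error type and message vary by Julia version), computing the common broadcast shape already rejects incompatible sizes before any elements are touched:

```julia
# A 2×2 matrix cannot be broadcast against a length-3 vector,
# so the shape computation itself throws.
A = [1 2; 3 4]
v = [1, 2, 3]
try
    broadcast(+, A, v)
catch err
    println("rejected: ", typeof(err))   # a DimensionMismatch-style error
end
```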
```julia
# PR 16988
@test Base.promote_op(+, Bool) === Int
@test isa(broadcast(+, [true]), Array{Int,1})
@test Base.promote_op(Float64, Bool) === Float64
```
does this now return something different, or is it a method error?
It returns Any because I removed the specialized method for when the first argument is a type. promote_op would only be used by the unary and binary elementwise operations in Base, as well as by the matrix multiplication methods, where we know that the operator is not a type.

I guess I could leave the definition for uses outside Base, addressing @martinholters' concern above.
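To illustrate the distinction being discussed (my example; the exact results are version-dependent): promote_op on an ordinary operator still infers a concrete type, while a constructor as the operator, once the specialized method is gone, falls back to the generic path.

```julia
# Ordinary operators: still inferred precisely by Base's promotion design.
Base.promote_op(+, Bool, Bool)   # Int under Base's Bool+Bool promotion

# A type (constructor) used as the operator: with the specialized method
# removed, this may only infer Any unless a deprecated fallback is kept.
Base.promote_op(Float64, Bool)
```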
should it be deprecated if base won't need it going forward?
Yeah, I guess so, will do.
FWIW,

```julia
promote_op(op::Type, T) = Core.Inference.return_type(op, Tuple{T})
promote_op(op::Type, T1, T2) = Core.Inference.return_type(op, Tuple{T1,T2})
```

seems to be inferable and handles cases where the constructor specializes, e.g. promote_op(Array{Int64}, Int) then correctly returns Array{Int64,1}, not just Array{Int64}.

Not that I'm too worried about improving the obsolete promote_op, but maybe you can employ this somehow to increase the chances of getting a leaftype in _broadcast?
@TotalVerb: yes. The only case where it would work as before is for elementwise unary and binary operators in Base, where it would still preserve the more general abstract type.
Force-pushed 6b55f4a to 0b7e17f
Comments so far addressed. Can we check performance?

@nanosoldier

Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels

Force-pushed 0b7e17f to 5d4cd87

Can we ask nanosoldier again?

@nanosoldier

Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels

Force-pushed 20f5ba0 to fa03b1d

Ok. I believe there should be no more performance regressions.

@nanosoldier

Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels

All right, there are no more regressions (just the usual noisy benchmarks). But feel free to review or make more suggestions.
base/deprecated.jl (Outdated)

```julia
# promote_op method where the operator is also a type
function promote_op(op::Type, Ts::Type...)
    depwarn("promote_op(op::Type, ::Type...) is deprecated as is no longer " *
```

Force-pushed 7cca21a to 6bd6840
Best to run the benchmarks again, since … @nanosoldier

Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels

That's odd, some of those performance regressions weren't present before the last rebase (in which there are no changes, just the rebase). Probably other PRs touching … Anyway, I'll see what I can do.

@pabloferz, the previous time the benchmark was run on this PR was September 28, which was before the broadcast benchmarks were added to BaseBenchmarks. ... oh wait, nanosoldier is showing an improvement in the broadcast fusion benchmarks. The regressions are in linalg?

The regressions in linalg may come from the fact that some matrix products, e.g. …

The problem was with … Should be better now.

Force-pushed 8e78b26 to a174718

Let's see: @nanosoldier

Your benchmark job has completed - possible performance regressions were detected. A full report can be found here. cc @jrevels

Ah yes, much better. What about the sparse transposes? Unrelated/noise?

I believe that the sparse transposes are either unrelated to the PR or noise.

Yeah, it doesn't look like the sparse-transpose code uses anything touched by this PR. The Travis failure seems to be an unrelated network problem with … I think this is okay to merge.
```julia
@inline broadcast_elwise_op(f, As...) =
    broadcast!(f, similar(Array{promote_eltype_op(f, As...)}, broadcast_indices(As...)), As...)

ftype(f, A) = typeof(a -> f(a))
```
Can this just be typeof(f)?
Yeah, I think so, I don't know why I put it like this (same for the one below).
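For context (my illustration, not from the PR): every Julia function has its own singleton type, and each anonymous-function expression introduces a fresh closure type, so wrapping f in `a -> f(a)` only obscures the type that inference would otherwise see directly.

```julia
# typeof(f) already identifies f uniquely; wrapping it in an anonymous
# function produces a different, freshly generated closure type.
square(x) = x^2
wrapped = a -> square(a)

typeof(square) === typeof(wrapped)   # false: the wrapper has its own type
typeof(square) === typeof(square)    # true: the function's type is stable
```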
```julia
T = _promote_op(f, _default_type(S))
@_pure_meta
Z = Tuple{_default_type(S)}
T = _default_eltype(Generator{Z, typeof(a -> f(a))})
```
* Restructure the promotion mechanism for broadcast
* More broadcast tests
* Use broadcast for elementwise operators where appropriate
This addresses #18622 and replaces #18623. This PR is an overhaul of the promotion mechanism for `broadcast` and gets rid of some `promote_eltype_op` and `_promote_op` methods. The latter is somewhat replaced by the already existing `_default_eltype`, which is used by `map` and comprehensions.
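To illustrate the mechanism the description leans on (hedged; `_default_eltype` is a Base internal whose name and behavior have changed across Julia versions): map and comprehensions choose an element type by running inference over a Generator, which is what gives even an empty result a useful eltype, and this PR reuses that machinery for broadcast.

```julia
# map infers the element type from the function and the input eltype,
# so even an empty Bool input can yield a result typed more precisely
# than Any (Bool + 1 promotes to Int in Base's design).
r = map(x -> x + 1, Bool[])
eltype(r)   # an inferred element type rather than Any
```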