Releases · LuxDL/Lux.jl
MLDataDevices-v1.15.3
Diff since MLDataDevices-v1.15.2
Merged pull requests:
- Fix isbits type support for GPU device transfer (#1587) (@Copilot)
Closed issues:
- [MLDataDevices] broken support for isbitstypes movement (#1586)
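PR #1587 above restores device movement for arrays whose elements are plain `isbits` structs (issue #1586). A minimal sketch of that transfer path, assuming MLDataDevices is installed; the `Point` struct is purely illustrative, and `gpu_device()` falls back to the CPU device (with a warning) when no GPU backend package is loaded:

```julia
using MLDataDevices

# Illustrative isbits element type; any plain immutable struct qualifies.
struct Point
    x::Float32
    y::Float32
end

data = [Point(rand(Float32), rand(Float32)) for _ in 1:16]

gdev = gpu_device()   # CUDA/AMDGPU/Metal device if such a backend is loaded, else CPUDevice
cdev = cpu_device()

data_dev  = gdev(data)         # functional form: devices are callable
data_back = data_dev |> cdev   # pipe form works the same way

@assert isbitstype(eltype(data_back))
```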
Lux v1.27.1
Merged pull requests:
- chore: bump actions/download-artifact from 5 to 6 (#1575) (@dependabot[bot])
- ci: fix download path for cuda ci (#1576) (@avik-pal)
- chore: bump crate-ci/typos from 1.39.2 to 1.40.0 (#1578) (@dependabot[bot])
- test: Metal test now works (#1581) (@avik-pal)
- Add AbstractChar array support to MLDataDevices (#1582) (@Copilot)
- Mark LuxCore imports as public using @public (#1585) (@Copilot)
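For #1585 just above: once names are marked with `@public`, they are discoverable through Julia's public-name machinery on 1.11 and later. A minimal check, assuming Julia ≥ 1.11 and using illustrative symbol names (exported names also count as public):

```julia
using Lux

# Base.ispublic is available on Julia 1.11+ and returns true for names that are
# exported or explicitly marked public (e.g. via an @public-style macro).
if VERSION >= v"1.11"
    for name in (:Chain, :Dense)   # illustrative; swap in the LuxCore names you care about
        @info "public binding?" name Base.ispublic(Lux, name)
    end
end
```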
Closed issues:
MLDataDevices-v1.15.2
Diff since MLDataDevices-v1.15.1
Merged pull requests:
- feat: precompile common workloads (#1485) (@avik-pal)
- feat: batched_jacobian for Reactant (#1507) (@avik-pal)
- test: update tests for enzyme (#1552) (@avik-pal)
- fix: update how the error message looks (#1553) (@avik-pal)
- Use `|>` for moving data to devices (#1559) (@abhro) (see the sketch after this release's notes)
- feat: return sequence properly + checkpointing + mincut (#1561) (@avik-pal)
- fix(LuxLib): avoid extra copy if input and output are aliased (#1562) (@avik-pal)
- ci: use dependabot for updating compat entries (#1563) (@avik-pal)
- chore: bump actions/checkout from 5 to 6 (#1564) (@dependabot[bot])
- chore: bump crate-ci/typos from 1.39.0 to 1.39.2 (#1565) (@dependabot[bot])
- Put plot labels within plotting directives (#1566) (@abhro)
- Remove unnecessary `begin...end` markers (#1567) (@abhro)
- ci(docs): update cpu builds to use default gh actions (#1569) (@avik-pal)
- chore: bump actions/download-artifact from 5 to 6 (#1575) (@dependabot[bot])
- ci: fix download path for cuda ci (#1576) (@avik-pal)
- chore: bump crate-ci/typos from 1.39.2 to 1.40.0 (#1578) (@dependabot[bot])
- test: Metal test now works (#1581) (@avik-pal)
- Add AbstractChar array support to MLDataDevices (#1582) (@Copilot)
- Mark LuxCore imports as public using @public (#1585) (@Copilot)
Closed issues:
- Add simple tests for other accelerators (#686)
- DifferentiationInterface testing (#769)
- Enzyme Cache Invalidation Failure with v1.10 (#1551)
- OneHotArrays + Reactant with cross entropy loss (#1556)
- Overhead of convolution on AMD GPU (#1557)
- `LuxTestUtils` not re-exported (#1579)
- [MLDataDevices] failure at transferring non-numerical array (#1580)
- Mark LuxCore imports in Lux as public (#1584)
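As referenced at #1559 in the merged-PR list, the docs now consistently use the `|>` pipe for moving data to devices. Devices returned by MLDataDevices are callable and recurse through containers such as tuples and NamedTuples, so parameters, states, and batches can all be piped the same way. A minimal sketch, assuming Lux and MLDataDevices are installed; without a GPU backend loaded, the CPU fallback behaves identically:

```julia
using Lux, MLDataDevices, Random

rng = Random.default_rng()
model = Chain(Dense(4 => 8, relu), Dense(8 => 2))
ps, st = Lux.setup(rng, model)

dev = gpu_device()           # CPUDevice fallback if no GPU backend is loaded
ps, st = (ps, st) |> dev     # devices recurse through tuples and NamedTuples
x = rand(rng, Float32, 4, 32) |> dev

y, st = model(x, ps, st)
```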
Lux v1.27.0
Merged pull requests:
- feat: precompile common workloads (#1485) (@avik-pal)
- feat: batched_jacobian for Reactant (#1507) (@avik-pal) (see the sketch after this list)
- test: update tests for enzyme (#1552) (@avik-pal)
- fix: update how the error message looks (#1553) (@avik-pal)
- Use `|>` for moving data to devices (#1559) (@abhro)
- feat: return sequence properly + checkpointing + mincut (#1561) (@avik-pal)
- fix(LuxLib): avoid extra copy if input and output are aliased (#1562) (@avik-pal)
- ci: use dependabot for updating compat entries (#1563) (@avik-pal)
- chore: bump actions/checkout from 5 to 6 (#1564) (@dependabot[bot])
- chore: bump crate-ci/typos from 1.39.0 to 1.39.2 (#1565) (@dependabot[bot])
- Put plot labels within plotting directives (#1566) (@abhro)
- Remove unnecessary `begin...end` markers (#1567) (@abhro)
- ci(docs): update cpu builds to use default gh actions (#1569) (@avik-pal)
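PR #1507 in the list above extends `batched_jacobian` to the Reactant path. The call shape matches the existing API, so a plain AD backend illustrates it just as well; a minimal sketch, assuming the exported `batched_jacobian(f, backend, x)` form with the batch along the last dimension (the toy closure and sizes are illustrative):

```julia
using Lux, Random

rng = Random.default_rng()
model = Dense(3 => 2)
ps, st = Lux.setup(rng, model)

# f maps a batched input (features × batch) to a batched output.
f = x -> first(model(x, ps, st))

x = rand(rng, Float32, 3, 8)                 # batch of 8 along the last dimension
# AutoForwardDiff is re-exported from ADTypes (add `using ADTypes` if it is not in scope).
J = batched_jacobian(f, AutoForwardDiff(), x)
# One Jacobian slice per batch element; see the Lux docs for the exact layout.
```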
Closed issues:
LuxLib-v1.13.1
Merged pull requests:
- feat: precompile common workloads (#1485) (@avik-pal)
- feat: batched_jacobian for Reactant (#1507) (@avik-pal)
- feat: more informative error on constructing trainstate with compiled function (#1547) (@avik-pal)
- fix: minor reactant stuff + docs build (#1548) (@avik-pal)
- feat: use a caching allocator for GPUArrays workflows (#1549) (@avik-pal)
- Avoid reconstruction in `Internal.unsafe_free!` (#1550) (@AntonOresten)
- test: update tests for enzyme (#1552) (@avik-pal)
- fix: update how the error message looks (#1553) (@avik-pal)
- Use `|>` for moving data to devices (#1559) (@abhro)
- feat: return sequence properly + checkpointing + mincut (#1561) (@avik-pal)
- fix(LuxLib): avoid extra copy if input and output are aliased (#1562) (@avik-pal)
- ci: use dependabot for updating compat entries (#1563) (@avik-pal)
- chore: bump actions/checkout from 5 to 6 (#1564) (@dependabot[bot])
- chore: bump crate-ci/typos from 1.39.0 to 1.39.2 (#1565) (@dependabot[bot])
- Put plot labels within plotting directives (#1566) (@abhro)
- Remove unnecessary `begin...end` markers (#1567) (@abhro)
- ci(docs): update cpu builds to use default gh actions (#1569) (@avik-pal)
Closed issues:
Lux v1.26.0
Merged pull requests:
- feat: migrate DDIM to Reactant (#1158) (@avik-pal)
- docs: stop manual specification of precision config (#1536) (@avik-pal)
- CompatHelper: add new compat entry for TensorBoardLogger at version 0.1 for package DDIM, (keep existing compat) (#1537) (@github-actions[bot])
- CompatHelper: add new compat entry for ImageShow at version 0.3 for package DDIM, (keep existing compat) (#1538) (@github-actions[bot])
- CompatHelper: add new compat entry for OhMyThreads at version 0.8 for package DDIM, (keep existing compat) (#1539) (@github-actions[bot])
- chore: bump crate-ci/typos from 1.38.1 to 1.39.0 (#1541) (@dependabot[bot])
- refactor: use `EnzymeRules.@easy_rule` in Lux.jl (#1542) (@avik-pal)
- Fix identity_init filling entire submatrix instead of diagonal (#1544) (@Copilot)
- feat: more informative error on constructing trainstate with compiled function (#1547) (@avik-pal) (see the training-loop sketch after this list)
- fix: minor reactant stuff + docs build (#1548) (@avik-pal)
- feat: use a caching allocator for GPUArrays workflows (#1549) (@avik-pal)
- Avoid reconstruction in `Internal.unsafe_free!` (#1550) (@AntonOresten)
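PR #1547 above improves the error raised when a `TrainState` is built around an already-compiled function. For context, a minimal sketch of the plain (non-Reactant) TrainState workflow that check guards, assuming Zygote and Optimisers are available; the model and data are toy placeholders:

```julia
using Lux, Optimisers, Random, Zygote

rng = Random.default_rng()
model = Chain(Dense(2 => 16, tanh), Dense(16 => 1))
ps, st = Lux.setup(rng, model)

# TrainState bundles the model, parameters, states, and optimizer state.
tstate = Training.TrainState(model, ps, st, Adam(0.001f0))

x = rand(rng, Float32, 2, 64)
y = rand(rng, Float32, 1, 64)

# One gradient step; repeat inside a loop for actual training.
# AutoZygote is re-exported from ADTypes (add `using ADTypes` if it is not in scope).
grads, loss, stats, tstate = Training.single_train_step!(AutoZygote(), MSELoss(), (x, y), tstate)
```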
Closed issues:
MLDataDevices-v1.15.1
Diff since MLDataDevices-v1.15.0
Merged pull requests:
- feat: migrate DDIM to Reactant (#1158) (@avik-pal)
- feat: support distributed training via TrainState API (#1529) (@avik-pal) (see the sketch after this release's notes)
- ci: run LuxCore + MLDataDevices testing on 1.12 (#1534) (@avik-pal)
- docs: stop manual specification of precision config (#1536) (@avik-pal)
- CompatHelper: add new compat entry for TensorBoardLogger at version 0.1 for package DDIM, (keep existing compat) (#1537) (@github-actions[bot])
- CompatHelper: add new compat entry for ImageShow at version 0.3 for package DDIM, (keep existing compat) (#1538) (@github-actions[bot])
- CompatHelper: add new compat entry for OhMyThreads at version 0.8 for package DDIM, (keep existing compat) (#1539) (@github-actions[bot])
- chore: bump crate-ci/typos from 1.38.1 to 1.39.0 (#1541) (@dependabot[bot])
- refactor: use `EnzymeRules.@easy_rule` in Lux.jl (#1542) (@avik-pal)
- Fix identity_init filling entire submatrix instead of diagonal (#1544) (@Copilot)
- feat: more informative error on constructing trainstate with compiled function (#1547) (@avik-pal)
- fix: minor reactant stuff + docs build (#1548) (@avik-pal)
- feat: use a caching allocator for GPUArrays workflows (#1549) (@avik-pal)
- Avoid reconstruction in `Internal.unsafe_free!` (#1550) (@AntonOresten)
Closed issues:
- Reactant get_device with sharding throws error inside of MLDataDevices, impossible to use with TrainState API (#1520)
- Automatically cache allocations for JuliaGPU workloads (#1527)
- Exporting to Jax manual entry segfaults with recent reactant (#1540)
- Identity matrix initialization fills all entries with ones (#1543)
- Embedding Layer results in scalar indexing with Reactant? (#1546)
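PR #1529 above (see also the sharding issue #1520) routes distributed data-parallel training through the TrainState API. A minimal sketch of the surrounding `DistributedUtils` setup under an MPI launch; the names follow the documented distributed utilities, but the exact interaction with `TrainState` added in #1529 is not spelled out here:

```julia
using Lux, MPI, Optimisers, Random

# Initialize the MPI-based distributed backend once per process (run under mpiexec).
DistributedUtils.initialize(MPIBackend)
backend = DistributedUtils.get_distributed_backend(MPIBackend)

rank  = DistributedUtils.local_rank(backend)
nproc = DistributedUtils.total_workers(backend)

rng = Random.default_rng()
model = Dense(4 => 2)
ps, st = Lux.setup(rng, model)

# Keep parameters identical across workers, then average gradients through the
# distributed optimizer wrapper.
ps  = DistributedUtils.synchronize!!(backend, ps)
opt = DistributedUtils.DistributedOptimizer(backend, Adam(0.001f0))

tstate = Training.TrainState(model, ps, st, opt)
```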
LuxLib-v1.13.0
Merged pull requests:
- feat: migrate DDIM to Reactant (#1158) (@avik-pal)
- Add type-stable eltype control to device adaptors with comprehensive testing (#1498) (@Copilot)
- chore: bump crate-ci/typos from 1.36.2 to 1.36.3 (#1499) (@dependabot[bot])
- chore: bump crate-ci/typos from 1.36.3 to 1.37.2 (#1500) (@dependabot[bot])
- CompatHelper: bump compat for BFloat16s to 0.6 for package CIFAR10, (keep existing compat) (#1501) (@github-actions[bot])
- CompatHelper: bump compat for BFloat16s to 0.6 for package Qwen3, (keep existing compat) (#1502) (@github-actions[bot])
- CompatHelper: bump compat for JLArrays to 0.3 for package test, (keep existing compat) (#1503) (@github-actions[bot])
- ci: use 1.11 (#1504) (@avik-pal)
- feat: JVP and VJP APIs for Reactant (#1506) (@avik-pal)
- chore: bump crate-ci/typos from 1.37.2 to 1.38.1 (#1508) (@dependabot[bot])
- CompatHelper: bump compat for Optimization to 5 for package GravitationalWaveForm, (keep existing compat) (#1510) (@github-actions[bot])
- CompatHelper: bump compat for Optimization to 5 for package OptimizationIntegration, (keep existing compat) (#1511) (@github-actions[bot])
- CompatHelper: bump compat for BLISBLAS in [weakdeps] to 0.2 for package LuxLib, (keep existing compat) (#1512) (@github-actions[bot])
- CompatHelper: bump compat for BLISBLAS to 0.2 for package test, (keep existing compat) (#1513) (@github-actions[bot])
- feat: move rng to reactant device (#1517) (@avik-pal)
- fix: donation errors for reactant (#1518) (@avik-pal)
- feat: allow passing a sync option (#1519) (@avik-pal)
- chore: bump actions/upload-artifact from 4 to 5 (#1526) (@dependabot[bot])
- feat: support distributed training via TrainState API (#1529) (@avik-pal)
- feat: support track numbers via reactant device API (#1533) (@avik-pal)
- ci: run LuxCore + MLDataDevices testing on 1.12 (#1534) (@avik-pal)
- docs: stop manual specification of precision config (#1536) (@avik-pal)
- CompatHelper: add new compat entry for TensorBoardLogger at version 0.1 for package DDIM, (keep existing compat) (#1537) (@github-actions[bot])
- CompatHelper: add new compat entry for ImageShow at version 0.3 for package DDIM, (keep existing compat) (#1538) (@github-actions[bot])
- CompatHelper: add new compat entry for OhMyThreads at version 0.8 for package DDIM, (keep existing compat) (#1539) (@github-actions[bot])
- chore: bump crate-ci/typos from 1.38.1 to 1.39.0 (#1541) (@dependabot[bot])
- refactor: use `EnzymeRules.@easy_rule` in Lux.jl (#1542) (@avik-pal)
- Fix identity_init filling entire submatrix instead of diagonal (#1544) (@Copilot)
Closed issues:
- Rethinking `eltype` conversions in Adaptors (#1015)
- CUDA.jl alone cannot trigger automatic GPU backend selection (#1245) (see the backend-selection sketch after this list)
- Fix remaining CUDA testing (#1457)
- Global configuration for setting `sync=true` in training API (#1509)
- Invalid buffer donation in new Reactant versions (#1514)
- Lux.jl and Reactant and StableRNG interaction (#1515)
- memory leak (?) on AMD MI250X GPUs (#1516)
- Reactant get_device with sharding throws error inside of MLDataDevices, impossible to use with TrainState API (#1520)
- Error "failed to run pass manager on module" only on Vector input (#1521)
- Local MPI rank is always 0 if `Ipopt` solver is imported before Lux and MPI (#1525)
- Reactant RNG handling broken in latest release (#1531)
- Exporting to Jax manual entry segfaults with recent reactant (#1540)
- Identity matrix initialization fills all entries with ones (#1543)
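Issue #1245 in the list above concerns backend autodetection when only CUDA.jl is in the environment. When autodetection does not pick the intended accelerator, MLDataDevices can be told which backend to prefer; a minimal sketch, assuming CUDA.jl is installed (the preference is stored via Preferences.jl and only takes effect after restarting the session):

```julia
using MLDataDevices

# Backends MLDataDevices knows how to select from.
@show supported_gpu_backends()

# Persist a backend preference; takes effect in the next Julia session.
gpu_backend!("CUDA")

# After restarting with the backend package loaded, gpu_device() should resolve to it.
using CUDA
dev = gpu_device()
```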
WeightInitializers-v1.2.2
Diff since WeightInitializers-v1.2.1
Merged pull requests:
- feat: migrate DDIM to Reactant (#1158) (@avik-pal)
- feat: use new NCCL version (#1492) (@avik-pal)
- feat: replace Compat.jl with SciMLPublic.jl for @public macro (#1497) (@Copilot)
- Add type-stable eltype control to device adaptors with comprehensive testing (#1498) (@Copilot)
- chore: bump crate-ci/typos from 1.36.2 to 1.36.3 (#1499) (@dependabot[bot])
- chore: bump crate-ci/typos from 1.36.3 to 1.37.2 (#1500) (@dependabot[bot])
- CompatHelper: bump compat for BFloat16s to 0.6 for package CIFAR10, (keep existing compat) (#1501) (@github-actions[bot])
- CompatHelper: bump compat for BFloat16s to 0.6 for package Qwen3, (keep existing compat) (#1502) (@github-actions[bot])
- CompatHelper: bump compat for JLArrays to 0.3 for package test, (keep existing compat) (#1503) (@github-actions[bot])
- ci: use 1.11 (#1504) (@avik-pal)
- feat: JVP and VJP APIs for Reactant (#1506) (@avik-pal)
- chore: bump crate-ci/typos from 1.37.2 to 1.38.1 (#1508) (@dependabot[bot])
- CompatHelper: bump compat for Optimization to 5 for package GravitationalWaveForm, (keep existing compat) (#1510) (@github-actions[bot])
- CompatHelper: bump compat for Optimization to 5 for package OptimizationIntegration, (keep existing compat) (#1511) (@github-actions[bot])
- CompatHelper: bump compat for BLISBLAS in [weakdeps] to 0.2 for package LuxLib, (keep existing compat) (#1512) (@github-actions[bot])
- CompatHelper: bump compat for BLISBLAS to 0.2 for package test, (keep existing compat) (#1513) (@github-actions[bot])
- feat: move rng to reactant device (#1517) (@avik-pal)
- fix: donation errors for reactant (#1518) (@avik-pal)
- feat: allow passing a sync option (#1519) (@avik-pal)
- chore: bump actions/upload-artifact from 4 to 5 (#1526) (@dependabot[bot])
- feat: support distributed training via TrainState API (#1529) (@avik-pal)
- feat: support track numbers via reactant device API (#1533) (@avik-pal)
- ci: run LuxCore + MLDataDevices testing on 1.12 (#1534) (@avik-pal)
- docs: stop manual specification of precision config (#1536) (@avik-pal)
- CompatHelper: add new compat entry for TensorBoardLogger at version 0.1 for package DDIM, (keep existing compat) (#1537) (@github-actions[bot])
- CompatHelper: add new compat entry for ImageShow at version 0.3 for package DDIM, (keep existing compat) (#1538) (@github-actions[bot])
- CompatHelper: add new compat entry for OhMyThreads at version 0.8 for package DDIM, (keep existing compat) (#1539) (@github-actions[bot])
- chore: bump crate-ci/typos from 1.38.1 to 1.39.0 (#1541) (@dependabot[bot])
- Fix identity_init filling entire submatrix instead of diagonal (#1544) (@Copilot) (see the sketch after this release's notes)
Closed issues:
- Rethinking `eltype` conversions in Adaptors (#1015)
- CUDA.jl alone cannot trigger automatic GPU backend selection (#1245)
- Fix remaining CUDA testing (#1457)
- Relax NCCL dep for testing (#1479)
- Use `SciMLPublic.jl` instead of `Compat` for `@public` (#1496)
- Global configuration for setting `sync=true` in training API (#1509)
- Invalid buffer donation in new Reactant versions (#1514)
- Lux.jl and Reactant and StableRNG interaction (#1515)
- memory leak (?) on AMD MI250X GPUs (#1516)
- Reactant get_device with sharding throws error inside of MLDataDevices, impossible to use with TrainState API (#1520)
- Error "failed to run pass manager on module" only on Vector input (#1521)
- Local MPI rank is always 0 if `Ipopt` solver is imported before Lux and MPI (#1525)
- Reactant RNG handling broken in latest release (#1531)
- Exporting to Jax manual entry segfaults with recent reactant (#1540)
- Identity matrix initialization fills all entries with ones (#1543)
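The #1544 fix above (closing #1543, listed just before this) makes `identity_init` write only the diagonal instead of filling the whole block with ones. A minimal check, assuming the `identity_init([rng], dims...; gain)` form from WeightInitializers with its Float32 default:

```julia
using WeightInitializers, Random, LinearAlgebra

rng = Random.default_rng()

W = identity_init(rng, 4, 4)            # square case: expect the identity matrix post-#1544
@assert W ≈ Matrix{Float32}(I, 4, 4)

# Non-square case: the gain sits only on the main diagonal, zeros elsewhere.
Wr = identity_init(rng, 4, 6; gain = 2.0f0)
```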
Lux v1.25.0
Merged pull requests:
- chore: bump actions/upload-artifact from 4 to 5 (#1526) (@dependabot[bot])
- feat: support distributed training via TrainState API (#1529) (@avik-pal)
- feat: support track numbers via reactant device API (#1533) (@avik-pal)
- ci: run LuxCore + MLDataDevices testing on 1.12 (#1534) (@avik-pal)
Closed issues:
- Fix remaining CUDA testing (#1457)
- memory leak (?) on AMD MI250X GPUs (#1516)
- Reactant get_device with sharding throws error inside of MLDataDevices, impossible to use with TrainState API (#1520)
- Error "failed to run pass manager on module" only on Vector input (#1521)
- Local MPI rank is always 0 if `Ipopt` solver is imported before Lux and MPI (#1525)
- Reactant RNG handling broken in latest release (#1531)