
Multimodal: Add Cross-Attention and Mixture-of-Transformers Archs#79

Merged
amazloumi merged 9 commits into main from feat/multimodal
May 7, 2026

Conversation

@amazloumi
Member

This PR simply merges PRs #77 and #78 into main.
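The PR title names a cross-attention architecture for multimodal fusion. As a rough illustrative sketch of the general technique only (this is not the kempnerforge implementation; all names, shapes, and weights below are invented for the example), cross-attention lets text-token queries attend over image-patch keys and values:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(text, image, wq, wk, wv):
    """Single-head cross-attention: text queries attend over image keys/values."""
    q = text @ wq                              # (T, d) queries from text tokens
    k = image @ wk                             # (S, d) keys from image patches
    v = image @ wv                             # (S, d) values from image patches
    scores = q @ k.T / np.sqrt(q.shape[-1])    # (T, S) scaled dot-product scores
    return softmax(scores) @ v                 # (T, d) image-conditioned text features

rng = np.random.default_rng(0)
d = 8
text = rng.normal(size=(4, d))    # 4 text tokens
image = rng.normal(size=(6, d))   # 6 image patches
wq, wk, wv = (rng.normal(size=(d, d)) for _ in range(3))
out = cross_attention(text, image, wq, wk, wv)
print(out.shape)  # → (4, 8)
```

The output keeps the text sequence length but mixes in image information, which is the fusion pattern cross-attention layers provide in vision-language models.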

@amazloumi amazloumi requested review from Naeemkh and mmshad May 7, 2026 18:11
@codecov

codecov Bot commented May 7, 2026

Codecov Report

❌ Patch coverage is 95.03106% with 16 lines in your changes missing coverage. Please review.

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| kempnerforge/model/mot.py | 93.85% | 4 Missing and 3 partials ⚠️ |
| kempnerforge/model/__init__.py | 81.25% | 1 Missing and 2 partials ⚠️ |
| kempnerforge/model/transformer.py | 95.16% | 1 Missing and 2 partials ⚠️ |
| kempnerforge/distributed/parallel.py | 0.00% | 1 Missing and 1 partial ⚠️ |
| kempnerforge/model/vlm.py | 94.44% | 1 Missing ⚠️ |

| Files with missing lines | Coverage Δ |
| --- | --- |
| kempnerforge/config/vlm.py | 100.00% <100.00%> (ø) |
| kempnerforge/model/cross_attention.py | 100.00% <100.00%> (ø) |
| kempnerforge/model/modality.py | 100.00% <100.00%> (ø) |
| kempnerforge/model/vlm.py | 98.96% <94.44%> (-1.04%) ⬇️ |
| kempnerforge/distributed/parallel.py | 58.33% <0.00%> (-0.71%) ⬇️ |
| kempnerforge/model/__init__.py | 90.00% <81.25%> (-10.00%) ⬇️ |
| kempnerforge/model/transformer.py | 93.67% <95.16%> (+0.60%) ⬆️ |
| kempnerforge/model/mot.py | 93.85% <93.85%> (ø) |
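The coverage report lists a new kempnerforge/model/mot.py for the Mixture-of-Transformers architecture. As a minimal sketch of the general idea only, assuming the standard MoT pattern of modality-specific feed-forward parameters selected per token (all names and shapes here are invented; this is not the repo's actual code):

```python
import numpy as np

def modality_ffn(x, modality_ids, ffn_weights):
    """Route each token through the feed-forward weights of its own modality."""
    out = np.empty_like(x)
    for m, w in enumerate(ffn_weights):
        mask = modality_ids == m
        # ReLU feed-forward applied only to tokens of modality m.
        out[mask] = np.maximum(x[mask] @ w, 0.0)
    return out

rng = np.random.default_rng(1)
d = 8
tokens = rng.normal(size=(10, d))
modality_ids = np.array([0] * 6 + [1] * 4)  # 6 text tokens, 4 image tokens
ffn_weights = [rng.normal(size=(d, d)) for _ in range(2)]  # one weight per modality
y = modality_ffn(tokens, modality_ids, ffn_weights)
print(y.shape)  # → (10, 8)
```

In a full MoT block, attention is typically computed jointly over all tokens while non-attention parameters (FFNs, norms) are untied per modality, as sketched above.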

@amazloumi amazloumi merged commit adb6aa3 into main May 7, 2026
6 checks passed
@amazloumi amazloumi deleted the feat/multimodal branch May 7, 2026 19:35

2 participants