pstoica/senseweave

SenseWeave Architecture Overview

SenseWeave is a Unity + OneJS hybrid for real-time particle-based drawing. This README captures the current architecture so we can reason about the system without spelunking through scripts every time.

High-Level Flow

  1. OneJS (React/TypeScript) UI boots the Unity scene, instantiates the key managers, and mirrors Unity state through signals.
  2. LayerManager owns every renderable layer (IRenderLayer) and orchestrates rendering, grouping, ordering, and undo history.
  3. BrushCanvas + BrushLayer collect drawing input and generate particle output textures for brush content.
  4. EffectLayer instances apply shader effects (per-layer or global) through the modular shader pipeline.
  5. ParticleRenderFeature (URP) composites layer outputs, invokes global effects, and writes the final frame.
OneJS UI ──> LayerManager ──> LayerGroups ──> Brush/Effect Layers ──> Shader pipeline ──> URP Render
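The flow above can be sketched in plain TypeScript. This is a minimal, illustrative model only; the names (`RenderedLayer`, `compositeFrame`, `GlobalEffect`) are hypothetical and the real work happens in the C# layer stack and URP render feature.

```typescript
// Hypothetical sketch of the render flow, not the real Unity/OneJS API.
type Texture = { pixels: number[] };

interface RenderedLayer {
  bypass: boolean;
  opacity: number;   // 0..1
  render(): Texture; // BrushLayer/EffectLayer output analogue
}

type GlobalEffect = (frame: Texture) => Texture;

// LayerManager-style pass: skip bypassed layers, blend outputs in stack
// order, then run global effects (ParticleRenderFeature's job in Unity).
function compositeFrame(layers: RenderedLayer[], effects: GlobalEffect[]): Texture {
  const frame: Texture = { pixels: [0, 0, 0, 0] };
  for (const layer of layers) {
    if (layer.bypass) continue;
    const out = layer.render();
    out.pixels.forEach((p, i) => {
      // simple "normal" blend: lerp toward the layer by its opacity
      frame.pixels[i] = frame.pixels[i] * (1 - layer.opacity) + p * layer.opacity;
    });
  }
  return effects.reduce((f, fx) => fx(f), frame);
}
```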

Core Systems

Layer Stack & Groups

  • IRenderLayer defines the shared API (bypass, opacity, blend mode, render target).
  • LayerManager keeps a flat stack and a hierarchical LayerGroup tree so we can reorder and nest layers while keeping rendering straightforward.
  • LayerGroup composites child outputs into its own RenderTexture; groups can be nested and moved across the tree.
  • BrushLayer implements IRenderLayer and renders brush strokes into a dedicated RenderTexture via CustomParticleRenderer.
  • EffectLayer implements IRenderLayer, consumes an input texture, and emits a processed output using the shader pipeline.
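The flat-stack-plus-tree idea above can be sketched as a small layer tree where groups composite child outputs into their own buffer, mirroring LayerGroup's RenderTexture role. `LeafLayer` and `GroupNode` are illustrative names, not the C# types, and the additive blend stands in for the runtime's opacity/blend-mode logic.

```typescript
// Illustrative sketch only: groups composite children into their own buffer.
type Buffer = number[];

interface LayerNode {
  name: string;
  render(): Buffer;
}

class LeafLayer implements LayerNode {
  constructor(public name: string, private content: Buffer) {}
  render(): Buffer { return [...this.content]; }
}

class GroupNode implements LayerNode {
  constructor(public name: string, public children: LayerNode[]) {}
  // Composite children bottom-up into this group's own buffer (additive
  // here for simplicity; the runtime blends by opacity/blend mode).
  render(): Buffer {
    const out: Buffer = [0, 0, 0];
    for (const child of this.children) {
      child.render().forEach((v, i) => (out[i] += v));
    }
    return out;
  }
}
```

Because a group is itself a `LayerNode`, nesting and moving groups across the tree needs no special casing in the renderer.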

Shader Effect Pipeline

  • ShaderEffect is now a polymorphic hierarchy (PixelateShaderEffect, MeltShaderEffect, etc.) rather than a field soup. Each subclass knows how to serialize/deserialize its parameters and how to push them to a material (ApplyToMaterial).
  • ShaderEffect subclasses expose their own parameter definitions; effect layers use these directly to apply shaders, and layerStore mirrors Unity’s ordered layer stack (brush + FX) for the UI.
  • EffectLayer caches per-effect materials, sets the _EffectType shader keyword, and calls ApplyToMaterial before blitting.
  • ParticleRenderFeature finds the cached effect manager, then applies enabled global effects with command buffers during URP’s render loop.
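A hedged TypeScript sketch of the polymorphic-effect idea: each subclass owns its parameters, serialization, and material application. The class and property names here (`PixelateSketch`, `MeltSketch`, `pixelSize`, `MaterialLike`) are assumptions for illustration; the real hierarchy lives in ShaderEffect.cs.

```typescript
// Illustrative analogue of the ShaderEffect hierarchy, not the C# API.
type MaterialLike = { keyword: string; floats: Record<string, number> };

abstract class ShaderEffectSketch {
  abstract readonly effectType: string;
  abstract serialize(): Record<string, unknown>;
  abstract applyToMaterial(mat: MaterialLike): void;
}

class PixelateSketch extends ShaderEffectSketch {
  readonly effectType = "Pixelate";
  constructor(public pixelSize: number) { super(); }
  serialize() { return { type: this.effectType, pixelSize: this.pixelSize }; }
  applyToMaterial(mat: MaterialLike): void {
    mat.keyword = this.effectType;            // _EffectType keyword analogue
    mat.floats["_PixelSize"] = this.pixelSize;
  }
}

class MeltSketch extends ShaderEffectSketch {
  readonly effectType = "Melt";
  constructor(public meltAmount: number) { super(); }
  serialize() { return { type: this.effectType, meltAmount: this.meltAmount }; }
  applyToMaterial(mat: MaterialLike): void {
    mat.keyword = this.effectType;
    mat.floats["_MeltAmount"] = this.meltAmount;
  }
}
```

The payoff over a "field soup" is that adding an effect means adding one subclass, with serialization and material binding in the same place.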

Input & Brush Runtime

  • BrushCanvas manages input, per-brush undo history, and serialization of stroke particle data.
  • BrushLayer holds stroke collections, owns the per-layer render target/camera, and routes requests to CustomParticleRenderer.
  • CustomParticleRenderer (not documented here in detail) batches particles, feeds GPU buffers, and renders into the layer render target.
  • BrushProperties.Shape/Image/Color modules expose geometry, texture, and color state independently so future brushes can mix sources (image-only, masked image, color-only shapes) without rewriting the renderer.
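The modular brush-state idea in the last bullet can be sketched as independent modules that a brush combines freely. The module shapes and the `resolveBrush` helper are hypothetical, for illustration only.

```typescript
// Hypothetical modules mirroring BrushProperties.Shape/Image/Color.
interface ShapeModule { kind: "circle" | "square"; size: number }
interface ImageModule { textureId: string; masked: boolean }
interface ColorModule { rgba: [number, number, number, number] }

interface BrushConfig {
  shape?: ShapeModule;
  image?: ImageModule;
  color?: ColorModule;
}

// Decide what the renderer needs without hard-wiring one combination:
// image-only, masked image, and color-only shapes all resolve cleanly.
function resolveBrush(cfg: BrushConfig): string {
  if (cfg.image && cfg.shape) return cfg.image.masked ? "masked-image" : "image-in-shape";
  if (cfg.image) return "image-only";
  if (cfg.shape && cfg.color) return "colored-shape";
  return "default";
}
```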

OneJS Integration

  • App/components/DrawingSystem.tsx creates Unity GameObjects for LayerManager, BrushManager, ShapePackManager, and the main camera.
  • App/stores/projectStore.ts mirrors Unity state. As of the bypass refactor, UI models track bypass flags rather than isVisible to stay aligned with Unity semantics.
  • Types live in App/types/SceneTypes.ts; regenerate app.d.ts whenever C# signatures change.
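The bypass-over-isVisible convention can be sketched in plain TypeScript. Store and method names here (`LayerStoreSketch`, `syncFromUnity`) are assumptions, not the actual projectStore API; the point is that visibility is derived, never stored.

```typescript
// Sketch only: UI models track a bypass flag to match Unity semantics.
interface LayerModel { id: string; name: string; bypass: boolean }

class LayerStoreSketch {
  private layers = new Map<string, LayerModel>();

  // Called when Unity pushes layer state to the UI.
  syncFromUnity(models: LayerModel[]): void {
    this.layers = new Map(models.map((m) => [m.id, { ...m }]));
  }

  // UI toggles bypass; the real store would also notify Unity.
  setBypass(id: string, bypass: boolean): void {
    const layer = this.layers.get(id);
    if (layer) layer.bypass = bypass;
  }

  // "Visible" is computed from bypass, so the two sides cannot drift.
  isVisible(id: string): boolean {
    const layer = this.layers.get(id);
    return layer ? !layer.bypass : false;
  }
}
```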

Key Files & Directories

  • Assets/ – Unity scripts, shaders, materials (see Assets/README.md for asset-specific notes).
  • App/ – OneJS (TypeScript + React) UI and integration logic.
  • ARCHITECTURE_REDESIGN.md – Working roadmap/planning doc. Use it for sprint details and experiments.
  • ShaderEffect.cs – Polymorphic effect definitions + serialization utilities.
  • LayerManager.cs / LayerGroup.cs – Layer orchestration.
  • ParticleRenderFeature.cs – URP render feature that pulls everything together.

Working Notes

  • Serialization: ShaderEffectSerializer can read both the new format and the legacy flat model. Hook it into any persistence path that saves effect stacks.
  • Typing: Always regenerate app.d.ts after C# signature changes so the TypeScript bindings stay in sync.
  • Debug Logging: Several components still emit verbose Debug.LogWarning calls (especially the new effect pipeline). Trim or gate them once the pipeline stabilizes.
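The dual-format serialization note above can be illustrated with a small reader that accepts both a tagged shape and a legacy flat model. All field names here (`effectType`, `params`, `legacyType`, `strength`) are assumptions; the real logic lives in ShaderEffectSerializer.

```typescript
// Illustrative only: normalize new and legacy payloads into one shape.
interface EffectData { type: string; params: Record<string, number> }

function deserializeEffect(raw: Record<string, unknown>): EffectData {
  if (typeof raw.effectType === "string") {
    // New polymorphic format: explicit type tag + parameter bag.
    return { type: raw.effectType, params: (raw.params as Record<string, number>) ?? {} };
  }
  // Legacy flat model: recover the type and lift loose fields into params.
  return {
    type: typeof raw.legacyType === "string" ? raw.legacyType : "Unknown",
    params: { strength: typeof raw.strength === "number" ? raw.strength : 1 },
  };
}
```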

Future Documentation

  • Layer blend modes & opacity are still TODO in the runtime; document the blending strategy once implemented.
  • When new layer types (image/video/audio) come online, extend this README with their lifecycle and render integration.
  • If we formalize save/load flows, add a persistence section covering canvas data, effect stacks, and UI state syncing.
