- Application Overview
- Key Features
- Hardware & Peripherals Used
- System Architecture
- Event System
- Rhythm Detection (Microphone Processing)
- Menu System
- Media References
- Design Reflection (OOAD Perspective)
## Application Overview

This project is an embedded rhythm-training application developed for a Microcontrollers course, built on an STM32 (ARM Cortex-M) microcontroller with the STM32 HAL. It received a grade of 5/5.
The system focuses on real-time signal processing, an event-driven architecture, and tight integration between hardware abstraction and application logic.
The system runs entirely on the STM32 and provides three rhythm‑related modes accessible through a joystick‑driven menu rendered on a character LCD.
**Metronome**: Generates a steady tempo using a buzzer and LED feedback. The BPM is user-selectable through the joystick-driven menu, allowing real-time tempo changes.

**Clap BPM Detection**: Uses microphone input to detect user claps and estimate BPM in real time. This mode demonstrates ADC + DMA signal acquisition, peak detection, and timing analysis.

**Training / Accuracy Mode**: Users clap along to a target tempo and receive timing-accuracy feedback. The system measures clap timing offsets relative to expected beats and computes accuracy statistics over time.
## Key Features

- Custom event-driven architecture
- Fully joystick‑driven menu system rendered on LCD
- Multiple operating modes (metronome, BPM detection, training)
- Real‑time microphone signal processing
- Visual (LCD, LEDs) and audio (buzzer) feedback
- Accuracy and timing statistics for rhythm training
- Strong separation between hardware abstraction, application logic, and UI
## Hardware & Peripherals Used

- STM32 microcontroller (ARM Cortex-M)
- I2C – Character LCD interface
- ADC + DMA – Microphone sound sensor and joystick analog inputs
- GPIO / EXTI – Button input handling (joystick)
- Timers / PWM – Buzzer output and RGB LED timing
## System Architecture

The project is structured around a central event system. Hardware modules never directly control application logic or UI; instead, they generate events that are handled centrally based on the current system state.
Hardware input → Event → Central dispatcher → Active subsystem
This architecture avoids tight coupling, reduces duplicated logic, and allows each module to remain focused on a single responsibility.
## Event System

The event system acts as the backbone of the application. The main loop pops pending events and dispatches them centrally:
```c
Event event;
if (Event_Pop(&event)) {
    EVENT_handle(&event);
}
```

- Events are pushed by hardware-level modules (joystick, microphone, timers)
- A central handler dispatches events to the appropriate subsystem
- Behavior changes naturally with system mode, without branching inside drivers
Example joystick event routing:
```c
void EVENT_handle_joystick_event(Event* event) {
    JOYSTICK_handle_event(hsystem_state.current_menu, event);
}
```

## Rhythm Detection (Microphone Processing)

Rhythm and clap detection is one of the most critical parts of the system and is handled inside the microphone abstraction layer.
- Microphone sound sensor input is sampled using ADC + DMA
- A dynamic noise floor is maintained using a moving average
- A clap is detected when:
  - The signal rises above a threshold relative to the noise floor
  - A valid falling edge confirms a peak (prevents false positives)
- Time between detected claps is measured
- A smoothed average interval is maintained
- BPM is calculated as `BPM = 60000 / average_interval_ms`
- A `ready` flag is set once BPM stabilizes
When a new BPM estimate is available, the microphone module simply publishes an event:

```c
void MIC_process_event(MICType* mic, EventQueueType* evq) {
    if (MIC_update_bpm(mic, HAL_GetTick())) {
        EVENT_que_push_back(evq, (Event){EVENT_SOUND_DETECTED});
    }
}
```

This design ensures:
- No UI or mode‑specific logic exists inside the microphone driver
- BPM detection can be reused by multiple modes (display, training, statistics)
During training mode:
- Each clap is timestamped
- The expected beat time is known from the target BPM
- The timing offset (early/late) is computed
- Accuracy percentage is calculated and averaged over time
## Menu System

The menu system is fully data-driven and tightly integrated with the event system. Menus are not hardcoded; instead, they are defined through structures and callbacks.
```c
MenuItemType menu_items_home[] = {
    {MENU_ITEM_STATIC_TEXT, 1, 0, 0, "Play", NULL, &MENU_change, &menu_bpm_play},
    ...
};

MENU_init(&menu_home, &hsystem_state.hlcd, MENU_TYPE_HOME,
          menu_items_home, 4);
```

Menus support:
- Static and dynamic text
- Dynamic values (BPM, accuracy, timing)
- Navigation and action items
```c
typedef struct MenuType {
    TextLCDType* lcd;
    MenuInstanceType instance;
    MenuItemType* menu_items_non_sel;
    MenuItemType* menu_items_sel;
    uint8_t item_count_non_sel;
    uint8_t item_count_sel;
    uint8_t selected_index;
} MenuType;
```

```c
typedef struct MenuItemType {
    MenuItemFormatType format;
    uint8_t is_selectable;
    uint8_t col;
    uint8_t row;
    char text_format[MENU_MAX_TEXT_LENGTH+1];
    MenuCallback update_callback;
    MenuCallback action_callback;
    void* in_action_callback;
    char text_buffer[MENU_MAX_TEXT_LENGTH+1];
} MenuItemType;
```

- Each menu item owns its behavior via callbacks
- `update_callback` allows live values without UI logic duplication
- `action_callback` cleanly triggers state changes or events
- Menu logic is entirely decoupled from application modes
- Works seamlessly with the joystick event system
Joystick presses are routed to the selected item's action callback:

```c
void JOYSTICK_handle_event(MenuType* menu, Event* event) {
    if (event->type == EVENT_JOYSTICK_PRESS) {
        MenuItemType* current_item =
            &menu->menu_items_sel[menu->selected_index];
        current_item->action_callback(current_item->in_action_callback);
    }
}
```

## Design Reflection (OOAD Perspective)

This project was developed prior to my formal experience with Object-Oriented Analysis and Design (OOAD). It follows a procedural, event-driven style typical of embedded C, using shared state and function-pointer-based abstraction.
With my current OOAD knowledge, and keeping the same language and platform (C on STM32), I would refine the design as follows:
**OOAD issue**

- Event ownership, handling responsibility, and lifecycle are implicit and depend on global interpretation and external system state.

**OOAD-in-C improvement**

- Define explicit event ownership contracts where each event type has a single owning subsystem responsible for handling behavior and side effects.
- Replace global dispatch with object-level handlers, for example:

```c
RhythmController_HandleEvent(&rhythm, &event);
MenuController_HandleEvent(&menu, &event);
```

This corresponds to embedded implementations of the Command and Observer patterns using C structs and function interfaces.
**OOAD issue**

- Shared global system state creates coupling between subsystems and forces implicit dependencies on mode flags and external context.

**OOAD-in-C improvement**

- Model major subsystems as long-lived domain objects (e.g., `RhythmController`, `TrainingSession`, `MenuController`) that own their internal state and expose narrow APIs.
- Reduce the global system state to a routing and lifecycle coordinator rather than a shared data container.
This reflects OOAD principles of encapsulation and aggregate ownership, implemented in C via disciplined interfaces.
**OOAD issue**

- Menu behavior is driven externally, with input handling, selection changes, and rendering controlled outside the menu's ownership boundary.

**OOAD-in-C improvement**

- Promote the menu to an active UI controller object responsible for input handling, selection state transitions, and rendering decisions.
- Interact with the menu only through explicit interfaces, for example:

```c
Menu_HandleEvent(&menu, &event);
Menu_Update(&menu);
```
This formalizes menu ownership and reduces coupling between input handling, UI logic, and system state.
**OOAD issue**

- Clap detection, tempo estimation, and training analysis are combined along a single control path, coupling multiple responsibilities.

**OOAD-in-C improvement**

- Decompose rhythm functionality into explicit single-responsibility components:
  - `ClapDetector` - signal processing and clap detection
  - `TempoEstimator` - interval smoothing and BPM calculation
  - `TrainingAnalyzer` - timing offset and accuracy evaluation
- Compose these components explicitly instead of conditionally reusing logic.
This preserves existing algorithms and performance while improving responsibility boundaries.
