This page shows how to create a model by using Cascade Windowing.
Cascade windowing is a special case of ensemble learning based on the concatenation of several sub-segments (cascades). By combining the information collected from each cascade, a model can be created for longer time frames.
Case Study: Creating a Prediction for a 3 sec audio time frame
Sample rate of the data is 16kHz.
Create the 3 sec segments in DCL
Since we are targeting a 3 sec audio data frame (16,000 × 3 = 48,000 time points), the Data Collection Lab (DCL) segments must be at least 3 sec long. If a segment is shorter than 3 sec, cascade windowing ignores it. If a DCL segment is longer than 3 sec, cascade windowing uses only the first 3 sec and discards the rest. For example, if a DCL segment is 4 sec (64,000 time points), cascade windowing discards the last second of data.
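This segment-length rule can be sketched as follows. The function name `usable_points` is hypothetical, not part of the SensiML API; it only illustrates the behavior described above.

```python
# Illustrative sketch (not the SensiML implementation) of how cascade
# windowing treats DCL segments relative to the 3 sec target.
SAMPLE_RATE = 16000                       # 16 kHz audio
TARGET_POINTS = SAMPLE_RATE * 3           # 3 sec = 48,000 time points

def usable_points(segment_length):
    """Return how many points cascade windowing uses from a DCL segment,
    or None if the segment is shorter than 3 sec and therefore ignored."""
    if segment_length < TARGET_POINTS:
        return None                       # too short: segment is ignored
    return TARGET_POINTS                  # only the first 3 sec are used
```

For a 4 sec segment (64,000 points), only the first 48,000 points are kept; the last second is discarded.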
We will use 6 cascades, each 0.5 sec long.
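A minimal sketch of this split: at 16 kHz, each 0.5 sec cascade holds 8,000 time points, so a 3 sec segment divides evenly into 6 cascades. The variable names here are illustrative.

```python
import numpy as np

SAMPLE_RATE = 16000
NUM_CASCADES = 6
CASCADE_POINTS = int(SAMPLE_RATE * 0.5)          # 8,000 points per cascade

segment = np.zeros(SAMPLE_RATE * 3)              # placeholder 3 sec segment
# Split the 48,000-point segment into 6 consecutive 0.5 sec cascades.
cascades = segment.reshape(NUM_CASCADES, CASCADE_POINTS)
```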
For each cascade, we create features using the SensiML feature libraries (for details, see the attached notebook). For simplicity and demonstration purposes, we use only 7 features. The Feature Cascade transform then concatenates the features from 6 consecutive cascades into a single feature vector.
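The concatenation step can be sketched as below. The 7 statistics chosen here (mean, standard deviation, and so on) are stand-ins for the actual SensiML feature generators used in the notebook; only the shape of the result (6 cascades × 7 features = 42 values) matters for the illustration.

```python
import numpy as np

def cascade_features(cascade):
    """Compute 7 illustrative features for one 0.5 sec cascade."""
    return np.array([
        cascade.mean(),
        cascade.std(),
        cascade.min(),
        cascade.max(),
        np.abs(cascade).mean(),           # mean absolute value
        np.percentile(cascade, 25),
        np.percentile(cascade, 75),
    ])

def feature_cascade(segment, num_cascades=6):
    """Concatenate per-cascade features into one feature vector."""
    cascades = np.array_split(segment, num_cascades)
    return np.concatenate([cascade_features(c) for c in cascades])

vec = feature_cascade(np.random.default_rng(0).normal(size=48000))
# vec holds 6 cascades * 7 features = 42 entries
```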
Selection of Significant Features
After the features are created, we select the most significant ones using one of the feature selection algorithms from the SensiML feature selection library.
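As an illustrative stand-in for such an algorithm (not the SensiML implementation), the sketch below ranks features by a simple Fisher-style score: the ratio of between-class mean separation to within-class variance, so features that separate the classes well score highest.

```python
import numpy as np

def fisher_scores(X, y):
    """Score each feature column by between-class vs. within-class spread."""
    classes = np.unique(y)
    overall = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        between += len(Xc) * (Xc.mean(axis=0) - overall) ** 2
        within += len(Xc) * Xc.var(axis=0)
    return between / (within + 1e-12)     # avoid division by zero

def select_top_k(X, y, k):
    """Return the indices of the k most significant features."""
    return np.argsort(fisher_scores(X, y))[::-1][:k]

# Toy data: feature 0 is made informative by shifting one class.
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 5))
y = np.repeat([0, 1], 20)
X[y == 1, 0] += 5.0
top = select_top_k(X, y, 2)               # feature 0 ranks first
```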
Significant Feature Distribution for 3 sec cascades
The figure below shows how the significant features are distributed for each class.
Model based on significant features
Significant features are used to train and optimize a model that creates a prediction for every 3 sec of the audio data frame. The figure below shows a simple model structure. Each color represents a different class. Neurons (squares) of the same color show how specific data groups occupy the solution space. Each small circle represents a significant feature vector. When an unknown vector falls within the influence field of a neuron, the model creates a prediction with that neuron's label.
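The influence-field idea can be sketched as below. This is a conceptual model in the style of an RCE/PME-type classifier, not SensiML's exact engine: each neuron stores a center vector, an influence radius, and a class label, and an unknown vector takes the label of a neuron whose field it falls inside.

```python
import numpy as np

def classify(neurons, vector):
    """Return the label of the first neuron whose influence field
    (a sphere of the given radius around its center) contains the vector."""
    for center, radius, label in neurons:
        if np.linalg.norm(vector - center) <= radius:
            return label
    return "unknown"                      # outside every influence field

# Two hypothetical neurons: (center, influence radius, class label).
neurons = [
    (np.array([0.0, 0.0]), 1.0, "class_A"),
    (np.array([5.0, 5.0]), 1.5, "class_B"),
]
```

A vector near (0, 0) falls inside the first neuron's field and is labeled class_A; a vector far from both centers falls outside every field and yields no class prediction.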