|
Figure 1. Synchronization variance in 2D heatmaps and the expected output of each behavior. An example of each type of froglet is compared with the ideal correlation of each of the three types of synchronization previously defined: synchronized, independent and inverted. Rows show, respectively: type of movement, ideal correlation, ideal distribution, one experimental example, measured correlation and original class.
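
As a concrete illustration of the measured correlation in the last row, the sketch below computes the Pearson correlation between the right- and left-foot angle traces of a video. This is a minimal sketch, not the paper's implementation; the function name `synchronization` and the synthetic sine traces are assumptions used only for illustration.

```python
# Minimal sketch (not the authors' implementation): quantify the
# synchronization feature (F1) as the correlation between the per-frame
# right- and left-foot angle traces extracted from a video.
import numpy as np

def synchronization(alpha_r: np.ndarray, alpha_l: np.ndarray) -> float:
    """Pearson correlation of the two foot-angle time series.

    Values near +1 suggest synchronized strokes, near 0 independent
    movement, and near -1 inverted (alternating) movement.
    """
    return float(np.corrcoef(alpha_r, alpha_l)[0, 1])

# Example with synthetic traces (hypothetical data):
t = np.linspace(0, 4 * np.pi, 200)
print(synchronization(np.sin(t), np.sin(t)))          # ~ +1, synchronized
print(synchronization(np.sin(t), np.sin(t + np.pi)))  # ~ -1, inverted
```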
|
|
Figure 2. Symmetry variance in 2D heatmaps. Two experimental examples illustrate how a difference in symmetry is reflected in the heatmaps. The white line shows perfect symmetry, while the black dashed line shows the line corresponding to the estimated slope β0. In the healthy froglet, the black and white lines almost coincide, giving a β0 close to 1. In the hemisected froglet, by contrast, the right foot tends to move approximately half the distance of the left foot's stroke, making the white and black lines differ markedly and revealing a lack of symmetry.
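
A minimal sketch of how such a slope could be estimated, assuming the black dashed line comes from a least-squares fit of the right-foot angle against the left-foot angle through the origin; the exact fitting procedure used in the paper may differ, and the function name is illustrative.

```python
# Minimal sketch, assuming the symmetry feature (F2) is the slope beta_0 of a
# least-squares fit of the right-foot angle against the left-foot angle
# (regression through the origin).
import numpy as np

def symmetry_slope(alpha_r: np.ndarray, alpha_l: np.ndarray) -> float:
    """Slope beta_0 minimizing ||alpha_r - beta_0 * alpha_l||^2."""
    return float(np.dot(alpha_l, alpha_r) / np.dot(alpha_l, alpha_l))

# beta_0 close to 1 -> symmetric strokes; beta_0 around 0.5 -> the right
# foot travels roughly half as far as the left, as in the hemisected example.
```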
|
|
Figure 3. Examples of the distribution of the 90 videos in the feature space using two kinematic features. Points are colored by their original class (uninjured, hemisected and transected). Panel A shows the synchronization (F1) and symmetry (F2) features, whereas panel B shows the right and left foot angle ranges (F3 and F4).
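
For readers who want to reproduce this kind of plot, a minimal sketch with matplotlib is given below; the feature values and labels are random placeholders, not the paper's data.

```python
# Minimal sketch: scatter videos in a 2D feature space, colored by class.
import numpy as np
import matplotlib.pyplot as plt

f1 = np.random.rand(9)  # placeholder for synchronization (F1)
f2 = np.random.rand(9)  # placeholder for symmetry (F2)
labels = np.repeat(["uninjured", "hemisected", "transected"], 3)

for cls in np.unique(labels):
    mask = labels == cls
    plt.scatter(f1[mask], f2[mask], label=cls)
plt.xlabel("F1 (synchronization)")
plt.ylabel("F2 (symmetry)")
plt.legend()
plt.show()
```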
|
|
Figure 4. Two examples of the measured right foot range (F3). The blue shadow marks the moment when the froglet's foot is in its highest position, while the green shadow marks the lowest position. The range is defined as the arc between the highest and lowest positions. The healthy froglet has a relatively large range of movement, whereas the transected one only moves its pelvis and therefore has a relatively small range of movement.
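
A minimal sketch of such a range feature, assuming it is taken as the difference between the largest and smallest foot angle observed over all frames; the function name and degree units are assumptions.

```python
# Minimal sketch of the range feature (F3 for the right foot, F4 for the
# left): the arc swept by one foot over the whole video.
import numpy as np

def angle_range(alpha: np.ndarray) -> float:
    """Maximum foot angle minus minimum foot angle across all frames."""
    return float(np.max(alpha) - np.min(alpha))
```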
|
|
Figure 5. Diagram illustrating the five-step process applied to each video. The whole video is processed frame by frame and then summarized in the feature extraction phase. Finally, the video is classified into one of the three levels of spinal cord damage.
|
|
Figure 6. Experimental setup. (A) Illustration of spinal cord injuries at the level of the 6th vertebra of X. laevis froglets in dorsal and frontal views: (i) uninjured, (ii) transected and (iii) hemisected animals. (B) Picture of the setup used to record the froglet movies: a tripod holding the video camera, a box internally illuminated with LED lights, a glass container with Barth solution, and a spoon and a Pasteur pipette used to stimulate the froglets to swim. fv, frontal view; dv, dorsal view of the spinal cord; sc, spinal cord; gm, grey matter; wm, white matter; cc, central canal.
|
|
Figure 7. Diagram showing the steps to detect the froglet's limbs. The algorithm detects limb zones, joint zones and endpoints for each of the four quadrants. Each limb is defined by the endpoint furthest from its joint.
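
A minimal sketch of the last step named in the caption (selecting, among candidate endpoints, the one furthest from the joint); coordinates are assumed to be (x, y) pixel tuples and the helper name is illustrative.

```python
# Minimal sketch: pick the candidate endpoint with the largest Euclidean
# distance from the detected joint of a quadrant.
import numpy as np

def limb_endpoint(joint: tuple, endpoints: list) -> tuple:
    """Return the candidate endpoint furthest from the joint."""
    joint = np.asarray(joint, dtype=float)
    dists = [np.linalg.norm(np.asarray(p, dtype=float) - joint) for p in endpoints]
    return endpoints[int(np.argmax(dists))]
```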
|
|
Figure 8. Example of how a 2D histogram is constructed and output by the algorithm. In the examples, the angle-pairs (αR, αL) are (49°, 88°) and (52°, 45°). For these frames, the bins (49, 88) and (52, 45) of the 2D histogram are each increased by one. The histogram is represented on the right as a heatmap with a color scale: the deepest red is associated with the most common angle-pair observed throughout the video, while blue is associated with angle-pairs never seen during the video. The white line represents perfect symmetry, meaning that both feet mirror each other horizontally.
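
A minimal sketch of how per-frame angle pairs could be accumulated into such a 2D histogram, assuming 1° bins over a 0–180° range; both assumptions are for illustration only and may not match the paper's binning.

```python
# Minimal sketch: accumulate (right, left) foot-angle pairs into a 2D histogram.
import numpy as np

def angle_heatmap(alpha_r: np.ndarray, alpha_l: np.ndarray) -> np.ndarray:
    """Count how often each (right, left) angle pair occurs in the video."""
    bins = np.arange(0, 181)  # 1-degree bins from 0 to 180 degrees (assumed)
    hist, _, _ = np.histogram2d(alpha_r, alpha_l, bins=[bins, bins])
    return hist  # e.g. hist[49, 88] holds the count for the (49°, 88°) pair

# The heatmap can then be drawn from `hist` with, e.g., matplotlib's imshow:
# frequent pairs appear red, pairs never observed appear blue.
```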
|
|
Figure 9. Examples of the output of a classifier plotting all 90 videos using two features: synchronization (F1) and symmetry (F2). (A) Diagram of the classification process. Labeled training data is used to compute a decision surface in the feature space; this surface is a mathematical formula that depends on the type of classifier. New data can then be tested and classified depending on where it falls relative to the decision surface. (B) Different classifiers and how they divide the feature space with decision boundaries.
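
A minimal sketch, assuming scikit-learn, of the process panel A describes: a classifier is fitted on labeled (F1, F2) points and a new video is then labeled according to where it falls relative to the decision surface. The feature values below are placeholders, and the SVM is just one of the classifier types panel B could refer to.

```python
# Minimal sketch: train a classifier on two features per video and predict
# the damage level of a new video. Values are illustrative placeholders.
import numpy as np
from sklearn.svm import SVC

X_train = np.array([
    [0.95, 1.00], [0.90, 0.92],    # uninjured-like (F1, F2) points
    [0.40, 0.55], [0.35, 0.60],    # hemisected-like points
    [-0.70, 0.10], [-0.65, 0.05],  # transected-like points
])
y_train = ["uninjured"] * 2 + ["hemisected"] * 2 + ["transected"] * 2

clf = SVC(kernel="rbf").fit(X_train, y_train)

# A new video is classified by the side of the decision surface its
# (F1, F2) point falls on:
print(clf.predict([[0.85, 0.95]]))
```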
|
|