Camera Exercise Configuration
The Camera Exercise API provides a fullscreen camera interaction with different gamemodes: object detection or face-emotion detection. Detection mode, camera facing, sensitivity, predictions-panel placement, and debug overlays are all configured via URL parameters.
Basic Parameters
Core parameters that define the camera exercise behavior.
gamemode
Type: String. Selects the detection mode. Options: detectObject, detectFaceEmotion.
Usage: ?gamemode=detectFaceEmotion
Examples:
/?gamemode=detectObject
/?gamemode=detectFaceEmotion
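A minimal sketch of how a page could read and validate this parameter with the standard URLSearchParams API. The helper name parseGamemode and the detectObject fallback are illustrative assumptions, not part of the documented API:

```typescript
// Valid gamemode values, per the reference above.
type Gamemode = "detectObject" | "detectFaceEmotion";

// Assumed fallback when the parameter is missing or unrecognized.
const DEFAULT_GAMEMODE: Gamemode = "detectObject";

// Reads ?gamemode=... from a query string and validates it.
function parseGamemode(search: string): Gamemode {
  const value = new URLSearchParams(search).get("gamemode");
  return value === "detectObject" || value === "detectFaceEmotion"
    ? value
    : DEFAULT_GAMEMODE;
}
```

In the browser this would typically be called as `parseGamemode(location.search)`.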
cameraFacing
Type: String. Chooses which camera to use: user (front) or environment (back).
Usage: ?cameraFacing=user
Examples:
/?cameraFacing=user
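The user/environment values match the standard facingMode field of getUserMedia video constraints, so a sketch of mapping the parameter to a constraints object might look like this (the helper name and the user default are assumptions):

```typescript
// Maps ?cameraFacing=... to a getUserMedia-style constraints object.
// "user" and "environment" are the standard facingMode values.
function facingConstraints(search: string): { video: { facingMode: string } } {
  const facing = new URLSearchParams(search).get("cameraFacing");
  const facingMode = facing === "environment" ? "environment" : "user"; // assumed default: front camera
  return { video: { facingMode } };
}
```

In the browser, the result could be passed to `navigator.mediaDevices.getUserMedia(...)`.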
sensitivity
Type: Number (0-1). Detection sensitivity.
Usage: ?sensitivity=0.8
Examples:
/?sensitivity=0.8
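Since the value must fall in the 0-1 range, a parser would typically clamp out-of-range input. A sketch, where the helper name and the 0.5 fallback are assumptions:

```typescript
// Reads ?sensitivity=... and clamps it into [0, 1].
// The 0.5 fallback for missing/invalid input is an assumption.
function parseSensitivity(search: string, fallback = 0.5): number {
  const raw = new URLSearchParams(search).get("sensitivity");
  const n = raw === null ? NaN : Number(raw);
  return Number.isFinite(n) ? Math.min(1, Math.max(0, n)) : fallback;
}
```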
displayHorizontal
Type: String. Controls the horizontal position of the predictions panel: left, center, or right.
Usage: ?displayHorizontal=left
Examples:
/?displayHorizontal=center
displayVertical
Type: String. Controls the vertical position of the predictions panel: top, middle, or bottom.
Usage: ?displayVertical=middle
Examples:
/?displayVertical=bottom
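The two display parameters together pick one of nine panel positions. One way to sketch this is mapping them to flexbox alignment values; the helper name and the right/top defaults are assumptions:

```typescript
// Translates ?displayHorizontal=... and ?displayVertical=... into
// flexbox alignment values for a container wrapping the predictions panel.
function panelPosition(search: string): { justifyContent: string; alignItems: string } {
  const params = new URLSearchParams(search);
  const hMap: Record<string, string> = { left: "flex-start", center: "center", right: "flex-end" };
  const vMap: Record<string, string> = { top: "flex-start", middle: "center", bottom: "flex-end" };
  const h = params.get("displayHorizontal") ?? "right"; // assumed default
  const v = params.get("displayVertical") ?? "top";     // assumed default
  return {
    justifyContent: hMap[h] ?? "flex-end",
    alignItems: vMap[v] ?? "flex-start",
  };
}
```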
maxDisplayItems
Type: Number. Maximum number of predictions shown simultaneously.
Usage: ?maxDisplayItems=8
Examples:
/?maxDisplayItems=3
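A sketch of applying this cap to a predictions list before rendering. The helper name and the fallback of 5 items are assumptions:

```typescript
// Truncates a predictions array to at most ?maxDisplayItems=... entries.
// The fallback of 5 for a missing/invalid parameter is an assumption.
function limitPredictions<T>(predictions: T[], search: string, fallback = 5): T[] {
  const raw = new URLSearchParams(search).get("maxDisplayItems");
  const n = raw === null ? NaN : Number.parseInt(raw, 10);
  const max = Number.isInteger(n) && n > 0 ? n : fallback;
  return predictions.slice(0, max);
}
```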
cameraOpacity
Type: Number (0-1). Controls the camera feed opacity, from 0.0 (fully transparent) to 1.0 (fully opaque).
Usage: ?cameraOpacity=0.7
Examples:
/?cameraOpacity=0.5
/?cameraOpacity=0.8
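A sketch of turning this parameter into an inline style for the video element, clamping into the 0-1 range; the helper name and the fully-opaque default are assumptions:

```typescript
// Reads ?cameraOpacity=... and produces a CSS declaration for the video element.
// Missing or invalid values fall back to full opacity (assumed default).
function cameraOpacityStyle(search: string): string {
  const raw = new URLSearchParams(search).get("cameraOpacity");
  const n = raw === null ? 1 : Number(raw);
  const opacity = Number.isFinite(n) ? Math.min(1, Math.max(0, n)) : 1;
  return `opacity: ${opacity}`;
}
```

In the browser, the clamped value could instead be assigned directly to `videoElement.style.opacity`.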
Complete Example
A full camera configuration example:
/?gamemode=detectFaceEmotion&cameraFacing=user&sensitivity=0.7&cameraOpacity=1.0
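The full URL above can be assembled programmatically with URLSearchParams, which handles encoding and ordering. The config interface and builder name are illustrative assumptions:

```typescript
// Illustrative config shape covering the parameters in this reference.
interface CameraExerciseConfig {
  gamemode: "detectObject" | "detectFaceEmotion";
  cameraFacing?: "user" | "environment";
  sensitivity?: number;
  cameraOpacity?: number;
}

// Builds the exercise URL from a config object; base is the page origin/path.
function buildExerciseUrl(base: string, config: CameraExerciseConfig): string {
  const params = new URLSearchParams();
  for (const [key, value] of Object.entries(config)) {
    if (value !== undefined) params.set(key, String(value));
  }
  return `${base}/?${params.toString()}`;
}
```

Note that numeric values are serialized canonically, so a cameraOpacity of 1.0 appears as `cameraOpacity=1` in the resulting URL.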