Quick Start Guide
1. Open the Application

Navigate to http://localhost/classroomai/ in Chrome or Edge. The entry splash page will appear. Click Enter System → to go to the Dashboard.

2. Start a Live Camera Session

Click Live Feed in the sidebar. Click Start Session. Allow browser camera and microphone permissions. The AI models load from the CDN; this takes about 15 seconds the first time and is instant on subsequent visits (cached). The noise meter and behaviour breakdown update in real time.
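
If the camera prompt never appears, you can verify device permissions by hand from the browser console. The snippet below is a minimal check, not the app's own code; it requests a test stream and immediately releases it.

    // Minimal permission check (illustrative; not the app's actual code).
    // Run in the browser console at http://localhost/classroomai/.
    navigator.mediaDevices.getUserMedia({ video: true, audio: true })
      .then(stream => {
        console.log('Granted tracks:', stream.getTracks().map(t => t.kind).join(', '));
        stream.getTracks().forEach(t => t.stop());   // release the test stream
      })
      .catch(err => console.error('Camera/microphone blocked:', err.name));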

3. Watch the AI Overlay

Colour-coded bounding boxes appear in real time on each detected student face, labelled with that student's current behaviour. The event log below records every behaviour transition. The noise meter shows ambient class noise via the microphone.
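
The overlay itself is a canvas positioned over the video element. As a rough sketch (the fields on det are illustrative names, not the app's real data structure), drawing one detection looks like this:

    // Hypothetical sketch of drawing one detection box on the overlay canvas.
    // The fields of `det` (x, y, width, height, label, colour) are illustrative.
    function drawDetection(ctx, det) {
      ctx.strokeStyle = det.colour;                  // e.g. 'green' for Attentive
      ctx.lineWidth = 2;
      ctx.strokeRect(det.x, det.y, det.width, det.height);
      ctx.fillStyle = det.colour;
      ctx.font = '14px sans-serif';
      ctx.fillText(det.label, det.x, det.y - 6);     // behaviour label above the box
    }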

4. Save & Review

Click Stop & Save Session. The session data is stored locally. Visit Dashboard for a summary, Statistics for charts, or Session Report for a detailed printable report.
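
Saving uses plain browser storage. A minimal sketch of what storing a session summary could look like, assuming a cai_sessions localStorage key (the key name and fields are assumptions, not the app's confirmed schema):

    // Hypothetical: append one session summary to localStorage.
    // The key 'cai_sessions' and the summary fields are assumed, not confirmed.
    function saveSession(summary) {
      const sessions = JSON.parse(localStorage.getItem('cai_sessions') || '[]');
      sessions.push({ ...summary, savedAt: new Date().toISOString() });
      localStorage.setItem('cai_sessions', JSON.stringify(sessions));
    }

    saveSession({ attentivePct: 82, students: 24, durationMin: 40 });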

5. Analyse a Video

Go to Video Analysis. Drag and drop (or browse) a classroom video file (MP4, WebM, MOV). Click Analyse Video. The AI processes the video frame by frame with the same overlay, and a summary report is displayed when it finishes.
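
Under the hood, analysing an uploaded file generally means playing it into a video element and sampling frames on a timer. The sketch below illustrates that pattern; analyseFrame is a placeholder for the app's real detection call, not its actual API.

    // Rough sketch of sampling frames from an uploaded video file.
    // `analyseFrame` is a placeholder for the app's actual detection routine.
    async function analyseVideoFile(file, analyseFrame, intervalMs = 500) {
      const video = document.createElement('video');
      video.src = URL.createObjectURL(file);
      video.muted = true;
      await video.play();
      const timer = setInterval(async () => {
        if (video.ended) { clearInterval(timer); return; }
        await analyseFrame(video);      // run detection on the current frame
      }, intervalMs);
    }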

Frequently Asked Questions
Why does the camera not work?
Camera access via getUserMedia requires either localhost or an HTTPS connection. If you open the file directly (file://) or from a plain HTTP domain other than localhost, the browser will block camera access. Run the app through Laragon/XAMPP at http://localhost/classroomai/.
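You can confirm the page is running in a camera-capable context from the browser console:

    // getUserMedia is only exposed on secure contexts (https:// or http://localhost).
    console.log('Secure context:', window.isSecureContext);
    console.log('getUserMedia available:',
      !!(navigator.mediaDevices && navigator.mediaDevices.getUserMedia));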
The AI models take too long to load. What can I do?
AI model weights (~7MB total) are downloaded from the jsDelivr CDN on first use. After the first load, your browser caches them — subsequent loads are instant. Ensure you have a stable internet connection for the first use. If loading fails, check the browser console for network errors and try refreshing.
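As an illustration of what the first-time download involves, a typical face-api.js style model load from jsDelivr is sketched below. This is an assumption about the loading pattern; the exact library, model names, and CDN path used by ClassroomAI may differ (see js/ai-engine.js for the real code).

    // Illustrative only: a face-api.js style model load from jsDelivr.
    // Assumes the face-api.js library is already loaded on the page.
    (async () => {
      const MODEL_URL = 'https://cdn.jsdelivr.net/gh/justadudewhohacks/face-api.js/weights';
      await faceapi.nets.tinyFaceDetector.loadFromUri(MODEL_URL);
      await faceapi.nets.faceLandmark68Net.loadFromUri(MODEL_URL);
      console.log('Models fetched once, then served from the browser cache');
    })();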
Why are some faces not being detected?
Face detection accuracy is affected by: lighting (bright, even light works best), distance (students too far from the camera may not be detected), angle (faces need to be roughly frontal), and occlusion (hands covering faces). Try improving classroom lighting and positioning the webcam at the front/centre of the class.
Can I adjust the AI sensitivity?
Yes — go to Settings. The AI Sensitivity slider adjusts the face detection confidence threshold. Increasing sensitivity detects more faces but may produce false positives. The Detection Interval slider controls how often the AI analyses a frame (200ms–1000ms). Shorter intervals use more CPU.
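Conceptually, the two settings drive the detection loop as sketched below; detectFaces is a stand-in for the real detector in js/ai-engine.js, not its actual function name.

    // Conceptual sketch of how the two Settings values affect the detection loop.
    const settings = {
      confidenceThreshold: 0.5,   // set by the AI Sensitivity slider
      detectionIntervalMs: 500    // Detection Interval: 200 ms to 1000 ms
    };

    async function detectFaces(video) {
      // Stand-in for the real detector; it would return [{ box, score, label }, ...]
      return [];
    }

    const videoEl = document.querySelector('video');
    setInterval(async () => {
      const faces = await detectFaces(videoEl);
      const accepted = faces.filter(f => f.score >= settings.confidenceThreshold);
      console.log('Faces above threshold:', accepted.length);
      // Shorter intervals analyse frames more often and therefore use more CPU.
    }, settings.detectionIntervalMs);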
Is my data sent to any external server?
No. All AI processing happens entirely in your browser using WebGL-accelerated TensorFlow.js. No video frames, images, or biometric data leave your device. Session statistics (behaviour counts, attention percentage) are only stored in your browser's localStorage and optionally in the local PHP backend — never transmitted externally.
The Suspicious label appears too often. Why?
The Suspicious label is triggered when a student's nose-tip landmark shifts more than 28% from the horizontal face centre — indicating head-turning. This can be a false positive when students look at the blackboard or a neighbouring student for legitimate reasons. Reduce false positives by adjusting the detection angle threshold in the code (js/ai-engine.js) or simply treat Suspicious alerts as conversation-starters rather than definitive flags.
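In code terms, the rule compares the nose-tip x-coordinate with the centre of the face box. The worked sketch below uses illustrative variable names; only the 28% figure comes from the behaviour described above.

    // Worked sketch of the head-turn rule described above.
    // noseX = nose-tip landmark x; faceX / faceWidth = face box position / width.
    function isSuspicious(noseX, faceX, faceWidth, threshold = 0.28) {
      const faceCentreX = faceX + faceWidth / 2;
      const offsetRatio = Math.abs(noseX - faceCentreX) / faceWidth;
      return offsetRatio > threshold;   // more than 28% off-centre => head turned
    }

    // Example: face box at x=200, width 200 (centre 300); nose at x=230.
    // The offset is 70 / 200 = 35% of the face width, so the label is triggered.
    console.log(isSuspicious(230, 200, 200));   // true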
How do I export session data?
Go to Settings → Data Management → Export Session Data. This downloads all sessions as a JSON file. For a single-session printable report, go to Session Report, select the session, and use the Print / Export PDF button (uses the browser's built-in Print dialogue — choose "Save as PDF").
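If the Settings page is unavailable, the same data can be dumped by hand from the console. The sketch below assumes the sessions live under a cai_sessions localStorage key; check your localStorage keys for the real name.

    // Hypothetical manual export: save a localStorage key as a JSON file.
    // The key name 'cai_sessions' is an assumption, not the confirmed key.
    function downloadKeyAsJson(key, filename) {
      const data = localStorage.getItem(key) || '[]';
      const blob = new Blob([data], { type: 'application/json' });
      const a = document.createElement('a');
      a.href = URL.createObjectURL(blob);
      a.download = filename;
      a.click();
      URL.revokeObjectURL(a.href);
    }

    downloadKeyAsJson('cai_sessions', 'classroomai-sessions.json');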
How do I view error logs?
Open your browser's Developer Console (F12 → Console). All AI events and errors are logged there with colour-coded prefixes. Additionally, all logs are stored in localStorage under the key cai_logs and can be downloaded as JSON by running Logger.exportLogs() in the browser console.
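The stored entries can also be inspected directly in the console (the cai_logs key is documented above; the shape of each entry is whatever the app logged):

    // Inspect the stored log entries directly, then export via the app's helper.
    const logs = JSON.parse(localStorage.getItem('cai_logs') || '[]');
    console.table(logs.slice(-20));   // show the last 20 entries
    Logger.exportLogs();              // download all logs as JSON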
Behaviour Label Reference
Label | Colour | Meaning | Recommended Action
✓ Attentive | Green | Calm, focused, processing information | None required
★ Engaged | Cyan | Actively enjoying or interested | Positive — maintain activity
! Alert | Blue | Suddenly surprised or alert | Observe — may need context
− Disengaged | Amber | Withdrawn, uninvolved | Check in with student
⚠ Anxious | Orange | Stressed, fearful, test anxiety | Offer reassurance
✗ Disruptive | Red | Frustrated, hostile behaviour | Immediate attention needed
? Confused | Purple | Does not understand content | Rephrase or re-explain
👀 Suspicious | Dark Red | Looking sideways (head turned) | Investigate — possible copying
Tips & Troubleshooting
Refresh AI models | Ctrl + Shift + R
Toggle dark / light mode | Moon icon (topbar)
Check error logs | F12 → Console
Export logs to JSON | Logger.exportLogs()
Stop camera stream | Stop & Save button
Clear all session data | Settings → Clear Data
View this help page | Sidebar → Help
Print session report | Report → Print PDF
For further support or to report an issue, contact the project team: Precious Lydiah & Shirleen Wambui, Maryhill Girls High School — or visit the project documentation page.