Video Analytics

Barring rare exceptions, the final stage of electronics manufacturing is largely manual or semi-manual. It typically consists of Final Assembly, where mechanical parts, connectors, and wired components are fitted, and Box Build Assembly / System Integration, during which the PCB is enclosed, cables are connected, displays are attached, and structural elements are secured. The quality of the final product therefore depends heavily on the skill and performance of the workers on the assembly lines.

As the first offering in its video-enabled manufacturing-efficiency portfolio, Alumnus has developed a video-analytics solution that monitors the Final Assembly and Box Build Assembly stages, providing actionable insights into worker performance and identifying opportunities to improve productivity.

The solution can be deployed to analyse worker movements, either at the individual level to identify training needs, or at the aggregate level to highlight broader process improvements. By tracking the sequence of micro-activities (small but measurable actions such as picking, placing, aligning, or fastening), the system verifies whether each task is executed within expected time windows and Standard Operating Procedure (SOP) guidelines. Specifically, the system can identify:
  • Deviations from SOPs, such as performing assembly steps out of the prescribed sequence
  • Micro-activities where the time taken consistently exceeds benchmark levels
  • Erratic behavioural patterns
  • Bottleneck micro-activities where a higher concentration of errors results in productivity losses, indicating clear retraining needs or process improvements
  • Workers with significantly deviant activity patterns that fall outside acceptable norms
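The SOP checks described above can be illustrated with a minimal sketch. The activity names, benchmark durations, and the `check_against_sop` helper below are all hypothetical, chosen only to show the idea of comparing a segmented sequence of micro-activities against a prescribed order and per-step time benchmarks; they are not the product's actual API.

```python
from dataclasses import dataclass

# Hypothetical SOP: prescribed order of micro-activities with
# benchmark durations in seconds (illustrative values only).
SOP = [("pick", 2.0), ("align", 3.5), ("fasten", 4.0), ("place", 1.5)]

@dataclass
class Observation:
    """One micro-activity segment recovered from video."""
    activity: str
    duration: float  # seconds

def check_against_sop(observations, sop=SOP, tolerance=1.25):
    """Flag out-of-sequence steps and durations that exceed
    benchmark * tolerance. Returns a list of human-readable findings."""
    findings = []
    expected = [name for name, _ in sop]
    observed = [o.activity for o in observations]
    # Sequence deviation: steps performed out of the prescribed order.
    if observed != expected:
        findings.append(f"sequence deviation: expected {expected}, saw {observed}")
    # Timing deviation: a step took longer than its benchmark allows.
    benchmarks = dict(sop)
    for o in observations:
        limit = benchmarks.get(o.activity)
        if limit is not None and o.duration > limit * tolerance:
            findings.append(
                f"{o.activity}: {o.duration:.1f}s exceeds benchmark {limit:.1f}s"
            )
    return findings
```

In practice the observation stream would come from the action-segmentation model rather than being constructed by hand, and the findings would feed the aggregate analytics described below.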
The solution can generate a wide range of custom, on-demand analytics tailored to the specific needs of the production environment. It is designed to handle the inherent complexity of manual and semi-manual assembly processes, as well as the natural variability of human movement, while maintaining a high level of accuracy in detecting anomalies.

Instead of generic "fast vs slow" observations, the system pinpoints exactly which motion elements cause delays, enabling targeted skill development through personalized coaching while reducing training time. Additionally, by recognizing unsafe or non-ergonomic body movements, it helps prevent repetitive strain injuries.
The results produced by the system are reliable, consistent, and fully verifiable through audit trails and repeatable observations. Data captured across shifts and operators can reveal optimal motions, sources of variation, and redesign opportunities that drive Kaizen, Lean, and industrial engineering initiatives. Appropriate safeguards ensure compliance with all relevant privacy guidelines, including measures that prevent identification of individuals unless explicitly required and authorised.

The solution is engineered to operate reliably across variable lighting conditions, different camera resolutions, and even high-speed video streams. At its core, it uses a customised and performance-tuned Multi-Stage Temporal Convolutional Network (TCN), implemented in PyTorch, for precise action recognition and segmentation. This architecture enables the system to track micro-activities with high scalability and consistency across diverse tasks and production environments.
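For readers curious about the architecture, a minimal PyTorch sketch of a multi-stage TCN for frame-wise action segmentation is shown below. This follows the general multi-stage design (stacked stages of dilated 1-D convolutions, each stage refining the previous stage's predictions); the layer counts, channel widths, and class count are illustrative assumptions, not the tuned configuration used in the product.

```python
import torch
import torch.nn as nn

class DilatedResidualLayer(nn.Module):
    """Dilated 1-D convolution with a residual connection."""
    def __init__(self, dilation, channels):
        super().__init__()
        self.conv_dilated = nn.Conv1d(channels, channels, kernel_size=3,
                                      padding=dilation, dilation=dilation)
        self.conv_1x1 = nn.Conv1d(channels, channels, kernel_size=1)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.conv_dilated(x))
        return x + self.conv_1x1(out)

class SingleStageTCN(nn.Module):
    """One stage: exponentially increasing dilations widen the
    temporal receptive field without pooling."""
    def __init__(self, in_dim, channels, num_classes, num_layers=10):
        super().__init__()
        self.conv_in = nn.Conv1d(in_dim, channels, kernel_size=1)
        self.layers = nn.ModuleList(
            [DilatedResidualLayer(2 ** i, channels) for i in range(num_layers)])
        self.conv_out = nn.Conv1d(channels, num_classes, kernel_size=1)

    def forward(self, x):
        x = self.conv_in(x)
        for layer in self.layers:
            x = layer(x)
        return self.conv_out(x)

class MultiStageTCN(nn.Module):
    """Later stages take the previous stage's class probabilities as
    input and refine the frame-wise segmentation."""
    def __init__(self, num_stages, in_dim, channels, num_classes):
        super().__init__()
        self.stage1 = SingleStageTCN(in_dim, channels, num_classes)
        self.stages = nn.ModuleList(
            [SingleStageTCN(num_classes, channels, num_classes)
             for _ in range(num_stages - 1)])

    def forward(self, x):  # x: (batch, features, time)
        out = self.stage1(x)
        outputs = [out]
        for stage in self.stages:
            out = stage(torch.softmax(out, dim=1))
            outputs.append(out)
        return outputs  # per-stage logits, (batch, classes, time) each
```

Each stage outputs per-frame class logits over the micro-activity vocabulary; supervising every stage (not just the last) is what gives the multi-stage design its smooth, over-segmentation-resistant predictions.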

25+ Years of Engineering to the Core

We believe in making a difference through our work, and we do it with a passionate sense of purpose.

info@alumnux.com
