Can artificial intelligence become your next personal trainer? One Maltese start-up is redefining fitness by merging AI, computer vision, and physiotherapy to create a virtual coach that understands your body’s every move – guiding you toward safer, smarter, and truly personalised exercise.
Technology and movement are merging in fascinating ways, and one Maltese start-up is taking this fusion to the next level. UN1Q Fitness Ltd is developing a smartphone application that offers a world-class experience in exercise, nutrition, and behaviour change, tailored to each user. Acting as a virtual coach, the app will seamlessly integrate into each client’s lifestyle, guiding them safely and effectively toward their fitness goals. Founded and led by physician Dr Malcolm Borg, UN1Q Fitness draws on leading Maltese expertise in musculoskeletal physiotherapy and computer vision (the field that enables computers to interpret the visual world, identifying objects, people, and features in images and videos).
Where AI Meets Movement
The power behind UN1Q’s innovation lies in advanced artificial intelligence algorithms supported by computer vision. By analysing smartphone images and videos, the system identifies static posture and movement dysfunctions using specific anatomical landmarks. This analysis covers 11 foundational movements – including the squat, deadlift, and push-up – which form the building blocks of all human movement. The resulting data is then processed to create a personalised exercise plan that accounts for the individual’s movement limitations, helping them train safely and confidently.
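A landmark-based analysis of this kind typically reduces each frame to joint angles computed from 2D landmark coordinates. The sketch below is a minimal illustration of that idea, not UN1Q’s actual implementation: it computes the angle at one joint (e.g. the knee, from hip–knee–ankle landmarks), with the pixel coordinates being made-up example values.

```python
import math

def joint_angle(a, b, c):
    """Angle at vertex b (degrees) formed by 2D landmarks a-b-c,
    e.g. hip-knee-ankle for knee flexion."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    # Clamp to [-1, 1] to guard against floating-point drift.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Hypothetical landmark coordinates (pixels) from one video frame:
hip, knee, ankle = (320, 240), (330, 360), (325, 480)
print(round(joint_angle(hip, knee, ankle), 1))
```

Tracking how such angles change across the frames of a repetition is one common way to characterise a movement pattern.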


(Images courtesy of UN1Q Fitness)
Beyond this, the software also identifies potential causes of postural or movement issues. It then generates a corrective exercise programme designed specifically for each user – delivered directly through the app.
The STEPS Project: From Vision to Validation
To advance its computer vision capabilities, UN1Q Fitness launched the STEPS project in collaboration with Dr John Xerri de Caro from the Department of Physiotherapy at the University of Malta, funded by Xjenza Malta. The goal of STEPS was to acquire a diverse dataset from human participants to validate and optimise the algorithms powering the postural and movement analysis function.
By combining diagnostic rule-based algorithms with this new data, the UN1Q system can identify problem areas and their underlying causes, allowing the app to prescribe exercises safely within the user’s available range of motion. When recurring movement faults are detected, the diagnostic engine uses pattern recognition to suggest targeted corrective strategies.
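A rule-based diagnostic step of this kind can be pictured as a lookup from detected faults to suspected causes and corrective strategies. The fault names, causes, and exercises below are illustrative assumptions, not UN1Q’s actual rule set:

```python
# Hypothetical rule table: fault -> (suspected cause, corrective strategy).
RULES = {
    "heel_rise": ("limited ankle dorsiflexion", "calf and ankle mobility drills"),
    "limited_hip_flexion": ("hip mobility restriction", "hip hinge patterning"),
    "knee_valgus": ("weak hip abductors", "banded squats, side-lying abduction"),
}

def suggest_corrections(detected_faults):
    """Return (fault, suspected cause, corrective strategy) triples
    for every recognised fault; unrecognised faults are skipped."""
    return [(f, *RULES[f]) for f in detected_faults if f in RULES]

for fault, cause, fix in suggest_corrections(["heel_rise", "knee_valgus"]):
    print(f"{fault}: likely {cause} -> {fix}")
```

In a production system the rules would be far richer, but the principle is the same: recurring faults trigger specific, targeted corrections.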
Building the Data Foundation
At the start of the STEPS project, the team acquired essential equipment: a body composition analysis machine, fitness equipment, smartphones, tablets, DSLR cameras, RGB-D cameras, and powerful workstations. Physiotherapists were brought on board to supervise movements and classify faults, while technical researchers handled data capture and editing.


Before data collection began, specific movement faults were defined for each exercise. By the end of the project, 246 participants had taken part in the STEPS study. Each participant underwent a physical assessment, followed by guided movement sessions supervised by physiotherapists. Their movements were recorded from multiple angles using various cameras, allowing researchers to classify them and improve the Computer Vision Engine (CVE) – enhancing feature extraction, setting thresholds, and strengthening robustness. This culminated in a prototype able to efficiently analyse key postural and movement faults, particularly in the squat.
For instance, the system may analyse recordings of the squat from a lateral (side) view to extract up to ten potential faults, such as a heel rise or limited hip flexion. It can currently detect most of these faults with an accuracy above 84%. Likewise, squat movements captured from a frontal view are analysed for three key faults at 90% accuracy. A separate feature analyses postural images, identifying six potential postural defects at accuracy rates above 73%.
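One simple way a fault like heel rise can be flagged is by comparing a landmark’s position across frames against a calibrated threshold. The sketch below assumes this thresholding approach; the 15-pixel threshold and the frame values are illustrative, not UN1Q’s calibrated figures:

```python
# Flag a "heel rise" fault when the heel landmark lifts more than a
# threshold above its standing baseline during the squat.
def detect_heel_rise(heel_y_per_frame, baseline_y, threshold_px=15):
    # Image y grows downward, so a rising heel means a *smaller* y value.
    lift = max(baseline_y - y for y in heel_y_per_frame)
    return lift > threshold_px

frames = [480, 479, 470, 458, 462, 475, 480]  # heel y across one rep
print(detect_heel_rise(frames, baseline_y=480))  # heel lifted 22 px -> True
```

Reported accuracy then comes from checking such automated decisions against the fault labels assigned by the supervising physiotherapists.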


The project’s final milestone is to integrate the developed algorithms into the app’s front-end interface. Users will be able to upload images of their posture and single-rep videos of a squat, which the CVE will process to extract base features from their posture and movement. The CVE will then report which faults were detected and how far each deviates from normal.
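The report returned to the user could take a shape like the sketch below – a detected fault plus its deviation from a reference value. The field names and numbers are assumptions for illustration, not UN1Q’s actual data model:

```python
from dataclasses import dataclass

@dataclass
class FaultReport:
    """Hypothetical per-fault result returned by the CVE to the app."""
    fault: str
    measured: float  # e.g. peak hip flexion in degrees
    normal: float    # reference value for the movement

    @property
    def deviation(self) -> float:
        return self.measured - self.normal

report = FaultReport(fault="limited_hip_flexion", measured=95.0, normal=120.0)
print(f"{report.fault}: {report.deviation:+.1f} deg from normal")
```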
What’s Next for UN1Q Fitness
The UN1Q app will launch in January 2026 with multiple features, including limited computer vision functionality that assesses body composition – body fat and muscularity – from an image, alongside the posture and squat analysis already integrated.
Beyond its original scope, the STEPS project has enabled the team to extend the AI model to other movements. The aim is to develop algorithms for 13 fundamental movements over the course of the year and integrate the technology into the application by the end of 2026.
As UN1Q Fitness continues to refine its platform, expect the fusion of AI, computer vision, and physiotherapy to pave the way toward smarter, safer, and more personalised fitness coaching – right in the palm of your hand.

The project ‘STEPS’ by UN1Q Fitness Ltd was supported by Xjenza Malta, through the FUSION: R&I Technology Development Programme LITE (Grant No: R&I-2024-017L).



