In the field of manufacturing quality control, we have developed computer vision solutions that identify product defects in real time, significantly reducing reliance on manual inspection.

Years: 2020 (Šibo – plastic caps), 2024 (Talum – aluminium products, with TriSense)
Technology: computer vision
Context: industrial quality control, automation of production processes
1. Šibo (2020) – plastic caps
We developed an AI module that automatically detects defects on plastic caps (cracks, deformations, structural irregularities).
The system was designed to operate directly on the production line, enabling faster response to non-conforming products in a series.
2. Talum (2024) – aluminium products (with TriSense)
Together with our partner TriSense, we developed a solution for identifying surface defects on aluminium products, with higher precision and in a more demanding lighting environment.
The system enables more consistent, repeatable, and significantly faster quality control than manual inspection. For this project, we received the GZS Silver Award for Best Innovation 2024.
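As a minimal illustration of the kind of in-line check described above (not the deployed systems, which rely on trained vision models; the function name and threshold here are hypothetical), a surface-defect test can be sketched as flagging pixels that deviate strongly from the surface's mean intensity:

```python
# Illustrative sketch only: a crude surface-defect check based on local
# intensity deviation. The production systems described above use trained
# vision models; this function and its threshold are hypothetical.

def find_defects(image, threshold=60):
    """Flag pixels that deviate strongly from the image's mean intensity.

    image: 2D list of grayscale values (0-255).
    Returns a list of (row, col) coordinates of suspect pixels.
    """
    flat = [v for row in image for v in row]
    mean = sum(flat) / len(flat)
    return [
        (r, c)
        for r, row in enumerate(image)
        for c, v in enumerate(row)
        if abs(v - mean) > threshold
    ]

# A uniform cap surface with one dark "crack" pixel.
surface = [[200] * 5 for _ in range(5)]
surface[2][3] = 40  # simulated defect
print(find_defects(surface))  # → [(2, 3)]
```

Running directly on the production line, even a check this simple can trigger an immediate response to a non-conforming item; the real systems replace the hand-set threshold with learned models robust to lighting variation.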
Year: 2015 (core library)
Technology: computer vision
Context: EU project and subsequent commercial use
Within the European project Empathic (2015), we developed a core library for facial gesture recognition, which detected facial expressions in real time and enabled the interpretation of basic emotional states.
We designed the library to be modular, which later allowed it to be reused in two quite different, yet technologically related applications:
1. Call Centre Analysis – Studio Moderna (2018)
Application of the library for analysing facial expressions in recorded call centre interactions.
Objective: To understand the correlation between employee facial expressions and customer satisfaction, and to support consultant training.
2. Panacea Gaming – EU Project (2020)
Application of the library in a gamified learning tool for teaching children with autism to recognise and express emotions.
Objective: To create an interactive game that encourages children to recognise and express emotions through rapid feedback and encouragement.
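The kind of mapping such a library performs can be sketched as follows. This is purely illustrative: the Empathic library used trained facial-gesture models, whereas the features, thresholds, and labels below are hypothetical.

```python
# Illustrative sketch only: mapping crude facial-geometry features to a
# basic expression label. The actual library used trained facial-gesture
# models; the features and thresholds here are hypothetical.

def classify_expression(mouth_curve, eye_openness):
    """Classify a basic expression from two toy features.

    mouth_curve: positive when mouth corners are raised, negative when lowered.
    eye_openness: in [0, 1], where 1.0 means eyes wide open.
    """
    if mouth_curve > 0.2:
        return "happy"
    if mouth_curve < -0.2:
        return "sad"
    if eye_openness > 0.8:
        return "surprised"
    return "neutral"

print(classify_expression(0.5, 0.5))  # → happy
print(classify_expression(0.0, 0.9))  # → surprised
```

The same feature-to-label mapping serves both downstream uses: aggregated over recorded calls for the Studio Moderna analysis, and as rapid per-frame feedback in the Panacea game.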
Impacts:
At a time when GPS navigation in cities could be off by as much as 50 m, and indoor navigation relied on WiFi transmitters, we worked with the Association of the Blind and Visually Impaired to assess how AI (computer vision) could increase the safety and independence of people with visual impairments.
Year: 2016
Client / partners: EU project; Association of Societies of the Blind and Visually Impaired of Slovenia
In the FIONA project, we developed a pilot solution to support people with visual impairments while walking indoors. The project was created under a European tender, in collaboration with the Slovenian Association of Societies of the Blind and Visually Impaired, so the entire development was focused on users’ real needs.
We developed an interface for guiding blind and visually impaired people through indoor spaces where GPS did not work. We paired image recognition from the mobile phone camera with visual annotation data, and a voice interface guided the user along the route.
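The guidance loop described above (recognised landmark in, spoken instruction out) can be sketched as a lookup against a pre-annotated route. The landmark names, route, and prompts below are hypothetical examples, not the project's actual annotation data.

```python
# Illustrative sketch only: matching landmarks recognised by the camera
# against a pre-annotated indoor route and producing voice-style prompts.
# The landmark names, route, and instructions are hypothetical examples.

route = ["entrance", "corridor_junction", "stairs", "office_door"]
instructions = {
    "entrance": "Walk straight ahead.",
    "corridor_junction": "Turn left at the junction.",
    "stairs": "Stairs ahead, hold the rail.",
    "office_door": "You have arrived.",
}

def next_instruction(recognised_landmark, route, instructions):
    """Return the voice prompt for the landmark the camera just recognised,
    or a safe fallback when the landmark is not on the annotated route."""
    if recognised_landmark in route:
        return instructions[recognised_landmark]
    return "Landmark not on route, please stop."

print(next_instruction("stairs", route, instructions))
# → Stairs ahead, hold the rail.
```

The safe fallback for unrecognised landmarks reflects the safety-first focus of the project: when the system is unsure, it asks the user to stop rather than guessing.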
In brief:
At a time when GPS navigation in cities could deviate by up to 50 m, we collaborated with the Association of the Blind and Partially Sighted to explore how computer vision could enhance the safety and independence of visually impaired individuals.
Year: 2014
Client / Partners: EU project; Association of Societies of the Blind and Partially Sighted of Slovenia
In the ALICE project, we developed a pilot solution to assist visually impaired individuals with outdoor mobility. The project was initiated under a European tender and carried out in close cooperation with the Association of Societies of the Blind and Partially Sighted of Slovenia, which helped us understand user needs and test the solution.
Using computer vision, we supplemented GPS navigation data with recognised environmental elements. The mobile application guided the user verbally along pre-annotated paths in urban environments.
A brief summary of the findings:
For Comland, the project also brought a profound experience of participating in inclusive development with users, and above all, an even greater awareness of the responsibility we, as a development team, have towards the users of our solutions.
Our first AI projects date back to 2005 and 2007. At a time when computer vision was primarily a research topic, we developed two pilot projects for the Ministry of Transport of the Republic of Slovenia for the automatic recognition of traffic signage in real-world environments.
Client: Ministry of Transport of the Republic of Slovenia
Technology: computer vision
Years: 2005 (vertical signage) and 2007 (horizontal signage)
The goal was to verify whether computer vision technologies could shorten and simplify the procedures for field maintenance of traffic infrastructure. In the first project in 2005, we implemented recognition of vertical signage (certain types of traffic signs), and in the second project in 2007, horizontal signage (pedestrian crossings).
The projects demonstrated the technology's practical usefulness while also clearly exposing its limitations under the real-world conditions of the time: even the most advanced recognition methods were still highly sensitive to road conditions, e.g. glare and reflections in wet weather, or dark shadows in summer.
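To illustrate the era's typical approach, and why glare was such a problem, a colour-based candidate search can be sketched as below. This is a hedged example only: real pipelines combined colour, shape, and template cues, and the thresholds and function names here are hypothetical.

```python
# Illustrative sketch only: a colour-based candidate search of the kind
# used in early sign-recognition pipelines. Glare and reflections wash out
# colour cues, which is one reason such methods struggled in wet weather.
# Thresholds and function names are hypothetical.

def red_fraction(region):
    """Fraction of (r, g, b) pixels in a region that are strongly red,
    a crude cue for candidate traffic-sign areas."""
    red = sum(1 for (r, g, b) in region if r > 150 and g < 80 and b < 80)
    return red / len(region)

def is_sign_candidate(region, min_red=0.3):
    """Flag the region as a possible sign if enough of it is strongly red."""
    return red_fraction(region) >= min_red

# A region dominated by red pixels (e.g. a stop-sign border).
region = [(200, 40, 40)] * 6 + [(90, 90, 90)] * 4
print(is_sign_candidate(region))  # → True
```

A wet road reflecting a red sign, or glare brightening every channel, shifts pixel values past such hard thresholds, which is exactly the sensitivity to conditions the pilots exposed.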
Effects: