Integrated Machine Vision and PLC Commanding for Efficient Bottle Label Detection in Industrial Processes: A Unified Approach for Quality Control

Miras Akhmetov, Damir Kanymkulov, Amir Amirov, Almira Askhatova, Tohid Alizadeh

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

This paper presents an integrated approach to bottle label detection in industrial processes, combining machine vision techniques with Programmable Logic Controller (PLC) commanding for an efficient and reliable quality control system. The logic of the label detection station was developed using ladder diagrams to simulate the process and decide whether to retain or remove a bottle on the conveyor based on label presence. The image processing component, executed in Python, set the main variable indicating the label's presence on the bottle. CODESYS was employed to visually represent the process; in the simulation phase, the bottle traversed the conveyor and halted when the proximity sensor activated. Python code provided the labeling status, and the corresponding image was displayed on a simulation screen. To bridge the gap between image processing and logic control, the Python image-processing routine was linked with the CODESYS logic. An OPC UA server in CODESYS facilitated external connections, enabling Python to access and configure CODESYS variables. Challenges such as the scanning-rate disparity between Python and CODESYS were addressed by introducing sessions synchronized by a counter. The technical steps involved using an OPC UA client program to monitor server availability and access CODESYS program variables, and installing the Security plugin in CODESYS to ensure secure external connections. The project's key outcome was the seamless linkage between image processing and PLC logic, demonstrating an effective integration of machine vision into industrial processes. For bottle label detection, the paper employed OpenCV for object recognition. Image segmentation with adaptive thresholding distinguished the bottle from its background, optimizing the separation step. Contours were identified using findContours(), and a cleanup pass using the arcLength() and approxPolyDP() functions ensured that only relevant label contours remained. The combination of geometric analysis and parameter optimization resulted in precise and effective label detection. In conclusion, the proposed approach showcases the successful integration of machine vision and PLC commanding for bottle label detection in industrial settings. The synergy between image processing and logic control offers a fast and error-free solution for quality control inspections, laying the groundwork for future advancements in industrial automation.
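As a reading aid, the following minimal Python sketch illustrates the kind of pipeline the abstract describes: OpenCV-based label detection (adaptive thresholding, findContours(), arcLength(), approxPolyDP()) whose Boolean result is written to a CODESYS variable over OPC UA. This is not the authors' code; the python-opcua client library, the endpoint URL, the node identifier, the image file name, and all threshold/area parameters are illustrative assumptions, and the counter-synchronized session mechanism described in the paper is omitted.

```python
# Minimal sketch of a vision-to-PLC pipeline, assuming the python-opcua
# (FreeOpcUa) client library. Endpoint, node id, file name, and all numeric
# parameters below are hypothetical placeholders, not values from the paper.

import cv2
from opcua import Client, ua

ENDPOINT = "opc.tcp://localhost:4840"                            # assumed CODESYS OPC UA endpoint
LABEL_NODE_ID = "ns=4;s=|var|Application.PLC_PRG.bLabelPresent"  # hypothetical node id


def label_present(frame, min_area=2000):
    """Return True if a roughly rectangular, label-like contour is found."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Adaptive thresholding separates the bottle/label from the background.
    binary = cv2.adaptiveThreshold(
        gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
        cv2.THRESH_BINARY_INV, 11, 2)
    contours, _ = cv2.findContours(
        binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for cnt in contours:
        if cv2.contourArea(cnt) < min_area:
            continue                                  # discard small noise blobs
        peri = cv2.arcLength(cnt, True)
        approx = cv2.approxPolyDP(cnt, 0.02 * peri, True)
        if len(approx) == 4:                          # four corners -> label-like rectangle
            return True
    return False


def main():
    client = Client(ENDPOINT)
    client.connect()                                  # requires the CODESYS OPC UA server to be running
    try:
        node = client.get_node(LABEL_NODE_ID)
        frame = cv2.imread("bottle.png")              # placeholder for a camera capture
        if frame is None:
            raise FileNotFoundError("bottle.png not found")
        # Write the detection result to the PLC variable as a BOOL.
        node.set_value(label_present(frame), ua.VariantType.Boolean)
    finally:
        client.disconnect()


if __name__ == "__main__":
    main()
```

In the setup the paper describes, the Boolean written here would be read by the ladder-diagram logic in CODESYS to decide whether the bottle is retained on the conveyor or rejected.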

Original language: English
Title of host publication: 2024 10th International Conference on Control, Automation and Robotics, ICCAR 2024
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 66-70
Number of pages: 5
ISBN (Electronic): 9798350373172
DOIs
Publication status: Published - 2024
Event: 10th International Conference on Control, Automation and Robotics, ICCAR 2024 - Singapore, Singapore
Duration: Apr 27 2024 - Apr 29 2024

Publication series

Name: 2024 10th International Conference on Control, Automation and Robotics, ICCAR 2024

Conference

Conference: 10th International Conference on Control, Automation and Robotics, ICCAR 2024
Country/Territory: Singapore
City: Singapore
Period: 4/27/24 - 4/29/24

Keywords

  • Bottle Label Detection
  • Image Processing
  • Industrial Automation
  • Machine Vision
  • OPC UA server
  • PLC Commanding
  • Quality Control

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Science Applications
  • Media Technology
  • Control and Optimization
  • Modelling and Simulation
