Industrial Products | Image Processing

Machine Vision: Designing for Success
Published on: 17 May, 2018

Tags: Flexible Feeding, High-speed I/O Triggering, High-speed Sorting, Machine Vision, Synchronized Integration, Synergetic Integration, Vision-guided Motion, Visual Servo Control, Web Inspection System

In our previous blog, we looked at some of the best practices to keep in mind when designing machine vision solutions. To reiterate, a well-designed machine vision system enables manufacturers to improve product quality, enhance process control, and increase manufacturing efficiency while lowering the total cost of ownership. Good design starts with selecting a motion-vision integration type, based on the machine’s automation tasks.

Integrated machine vision design

In an integrated machine vision system, the motion and the vision systems can have varying levels of interaction, from basic information exchange to advanced vision-based feedback. The level of interaction depends on the requirements of the machine, that is, the sequence, the accuracy and precision, and the nature of the tasks that must be performed by the machine. Depending on the level of interaction between the motion and the vision systems, a design can be based on one of the following four types of integration: synergetic integration, synchronized integration, vision-guided motion, and visual servo control. For a high ROI, the machine must meet the specified requirements at deployment and must scale well with next-generation process and product improvements. Hence, integrators must first identify the current and future requirements and use those requirements to determine the type of integration that will best suit the application.
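
As a rough rule of thumb, that selection step can be sketched in Python. The flags and their ordering below are illustrative assumptions distilled from the four integration types described in this post, not an LTTS selection tool.

def choose_integration(needs_feedback_during_move: bool,
                       needs_vision_guidance: bool,
                       needs_io_synchronization: bool) -> str:
    # Illustrative only: pick the least complex integration type that still
    # satisfies the stated requirements, per the four types described below.
    if needs_feedback_during_move:
        return "visual servo control"      # continuous vision feedback during the move
    if needs_vision_guidance:
        return "vision-guided motion"      # vision guidance at the start of each move
    if needs_io_synchronization:
        return "synchronized integration"  # high-speed I/O triggering between the systems
    return "synergetic integration"        # basic information exchange only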

Synergetic integration

Synergetic integration is the most basic type of integration: the motion and the vision systems exchange basic information such as velocity or a time base, and the communication between the two systems typically takes place on the order of tens of seconds. A good example of synergetic integration is a web inspection system (Figure 1). In a web inspection system, the motion system moves the web, usually at a constant velocity. The vision system generates a pulse train to trigger the cameras and uses the captured images to inspect the web; it needs to know the velocity of the web in order to determine the rate at which to trigger the cameras.
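
A minimal sketch of that calculation in Python, assuming the motion system shares the web velocity and the integrator has chosen an along-web sampling pitch (both values in the example are illustrative):

def camera_trigger_rate_hz(web_velocity_mm_s: float, sample_pitch_mm: float) -> float:
    # Trigger frequency needed so that consecutive captures are spaced
    # sample_pitch_mm apart along the moving web.
    if sample_pitch_mm <= 0:
        raise ValueError("sample pitch must be positive")
    return web_velocity_mm_s / sample_pitch_mm

# Example: a web moving at 500 mm/s, inspected every 0.1 mm along its length,
# needs a 5 kHz camera trigger.
print(camera_trigger_rate_hz(web_velocity_mm_s=500.0, sample_pitch_mm=0.1))  # 5000.0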

Synchronized integration

In synchronized integration, the motion and the vision systems are synchronized through high-speed I/O triggering. High-speed signals wired between the motion and the vision systems are used to trigger events and communicate commands between the two systems. This I/O synchronization effectively synchronizes the software routines running on the individual systems. A good example of synchronized integration is high-speed sorting, in which objects are sorted based on the difference in specific image features, such as color, shape, or size.

In a high-speed sorting application, the vision system triggers a camera to capture an image of a part as it moves through the camera's field of view (Figure 2). The motion system uses the same trigger to capture the position of the part. Next, the vision system analyzes the image to determine whether a part of interest is present at that position. If it is, the position is buffered. Because the conveyor moves at a constant velocity, the motion system can use the buffered position to fire an air nozzle farther down the conveyor: when the part reaches the nozzle, the nozzle is triggered and pushes the part onto a different conveyor, sorting parts by the inspected feature. High-speed sorting is widely used in the food industry to sort product types or discard defective products. It achieves high throughput, lowers labor costs, and significantly reduces defective shipments caused by human error.
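
The scheduling logic can be sketched in Python as follows. This is a simplified illustration under the assumptions above (constant conveyor velocity, a shared camera trigger that latches the axis position); the class and parameter names are hypothetical, not part of any LTTS product.

from collections import deque

class SortScheduler:
    def __init__(self, camera_to_nozzle_mm: float):
        # Distance from the camera's field of view to the air nozzle (assumed known).
        self.offset_mm = camera_to_nozzle_mm
        self.pending = deque()  # latched positions (mm) of parts flagged for sorting

    def on_camera_trigger(self, latched_position_mm: float, flagged: bool) -> None:
        # Called with the axis position captured by the same trigger that fired the camera.
        if flagged:
            self.pending.append(latched_position_mm)

    def parts_at_nozzle(self, current_position_mm: float) -> int:
        # Because the conveyor velocity is constant, a part reaches the nozzle once the
        # axis has advanced by the camera-to-nozzle distance; fire the nozzle once per part.
        fires = 0
        while self.pending and current_position_mm - self.pending[0] >= self.offset_mm:
            self.pending.popleft()
            fires += 1
        return fires

In a real cell, logic like this would run in the motion controller's real-time loop so that the nozzle fires within a deterministic window.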

Vision-guided motion

In vision-guided motion, the vision system provides some guidance to the motion system, such as the position of a part or the error in the orientation of the part. As we move from a basic to a more advanced integration type, there is an additional layer of interaction between the motion and the vision systems. For example, you can have high-speed I/O triggering in addition to vision guidance.

A good example of vision-guided motion is flexible feeding, in which parts arrive in random positions and orientations. The vision system takes an image of a part, determines its coordinates, and provides them to the motion system (Figure 3). The motion system uses these coordinates to move an actuator to the part and pick it up; it can also correct the orientation of the part before placing it. With this implementation, you do not need fixtures to orient and position the parts before the pick-and-place process. You can also overlap inspection steps with the placement tasks: for example, the vision system can inspect the part for defects and provide pass/fail information to the motion system, and the actuator can then discard a defective part instead of placing it.
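
A minimal sketch of the coordinate hand-off in Python, assuming a pre-computed camera calibration; the mm-per-pixel scale and frame offset below are placeholder values, not figures from the article.

from dataclasses import dataclass

@dataclass
class PickPose:
    x_mm: float
    y_mm: float
    angle_deg: float  # orientation correction applied before the part is placed

def pixel_to_pick_pose(px: float, py: float, part_angle_deg: float,
                       mm_per_pixel: float = 0.2,
                       frame_offset_mm: tuple = (150.0, 75.0)) -> PickPose:
    # Map the vision system's pixel coordinates into the actuator's frame using a
    # simple scale-and-offset calibration; a real cell would use a full camera model.
    x = frame_offset_mm[0] + px * mm_per_pixel
    y = frame_offset_mm[1] + py * mm_per_pixel
    return PickPose(x_mm=x, y_mm=y, angle_deg=-part_angle_deg)

# The motion system moves the actuator to (x_mm, y_mm), applies angle_deg, picks the
# part, and can skip the place step if the vision system also reported a failed inspection.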

Figure 4 shows the block diagram of the vision-guided motion system described in Figure 3. The vision system provides the position of the part to the motion trajectory generator at least once every second. This type of processing requires fast real-time systems that can meet the timing and processing needs of a vision-guided motion system.

In a vision-guided motion system, the vision system provides guidance to the motion system only at the beginning of a move. There is no feedback during or after the move to verify that the move was correctly executed. This lack of feedback makes the move prone to errors in the pixel-to-distance conversion, and the accuracy of the move is entirely dependent on the motion system. These drawbacks become prominent in high-accuracy applications with moves in the millimeter and submillimeter range.

Visual servo control

The drawbacks of vision-guided motion can be eliminated if the vision system provides continual feedback to the motion system during the move. In visual servo control, the vision system provides initial guidance to the motion system as well as continuous feedback during the move. The vision system captures, analyzes, and processes the images to provide feedback in the form of position setpoints for the position loop (dynamic look and move) or actual position feedback (direct servo). Visual servo control reduces the impact of errors from pixel-to-distance conversions and increases the precision and accuracy of existing automation. With visual servo control, you can solve applications that were previously considered unsolvable, such as those that require micrometer or submicrometer alignments.

Visual servo implementations, especially those based on the dynamic look-and-move approach, are becoming viable through field-programmable gate array (FPGA) technologies, which provide hardware acceleration for time-critical vision processing tasks and can achieve the response rates required to close the fast control loops used in motion tasks.
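
As a rough illustration of the dynamic look-and-move variant, the Python loop below repeatedly measures the residual alignment error with the vision system and issues corrected setpoints to the position loop. The callbacks, gain, and tolerance are assumptions for the sketch; a production system would run the time-critical vision steps on FPGA hardware as noted above.

def visual_servo_align(measure_error_mm, move_to_setpoint, start_mm: float,
                       gain: float = 0.6, tolerance_mm: float = 0.001,
                       max_cycles: int = 200) -> float:
    # Dynamic look-and-move: each cycle captures and processes an image to measure the
    # remaining error, then feeds a corrected position setpoint to the motion system,
    # so pixel-to-distance errors are driven out over successive iterations.
    setpoint_mm = start_mm
    for _ in range(max_cycles):
        error_mm = measure_error_mm()          # image capture + processing
        if abs(error_mm) <= tolerance_mm:
            break
        setpoint_mm += gain * error_mm         # corrected setpoint for the position loop
        move_to_setpoint(setpoint_mm)
    return setpoint_mm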

Authors

L&T Technology Services’ Industrial Products Division
