AIT Asian Institute of Technology

Visual feedback robot control based on artificial neural network

Author: Supattra Plermkamon
Call Number: AIT Diss no. ISE-02-01
Subject(s): Robots--Control systems; Neural networks (Computer science)

Note: A dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Engineering, School of Advanced Technologies
Publisher: Asian Institute of Technology
Series Statement: Dissertation ; no. ISE-02-01
Abstract: Versatile vision systems are increasingly employed in automated visual-feedback intelligent robot control to perform complex manufacturing tasks, especially tracking and grasping dynamic objects on a conveyor, and they allow a work-cell to use less structured feeding techniques. Such vision systems can interpret a scene without prior knowledge of the arbitrary objects in it and can automatically generate an optimum tracking trajectory for the robot. However, the large number of newly introduced part products burdens manufacturing engineers; the need to pre-program robots places severe demands on operational precision, especially in repetitive tasks; and current automated visual-feedback robot systems treat the vision system and the robot as separate tools. These problems still cannot be solved easily. This study therefore developed a new, single adaptive prediction-and-execution algorithm for picking and placing static and dynamic objects in a real-time robot work-cell, so that an operator can perform all tasks with this one algorithm.

The proposed system integrates a stationary monocular CCD camera, an off-the-shelf frame grabber, and an industrial robot into a single MATLAB application. This study presents a new adaptive linear robot control system for a robot work-cell that can visually track and intercept static and dynamic objects undergoing arbitrary motion anywhere along their predicted trajectories within the robot's workspace. The system combines model-based object recognition with an LVQ network to classify non-overlapping static objects; it uses an optical-flow technique to determine the target trajectory and a MADALINE network to generate a predicted robot trajectory based on visual servoing, in both off-line and on-line processes. The on-line planning program operates without pre-programming every required task in excruciating detail, a task can be changed without changing the robot program, and no model of the robot, the camera, the static and dynamic objects, or the environment is needed.

The system tracks both static and dynamic objects efficiently. The conveyor speeds that give the smallest error are 35-86 mm/s with the robot at its full speed of 2 m/s. For static objects, the system accurately classifies the arbitrary object to be selected; for dynamic objects, classification is not supported and, because of time constraints, the implementation covers only one-dimensional motion. After the learning process, the KUKA robot is shown to adaptively track and intercept both static and dynamic objects at an optimal rendezvous point on the conveyor, accurately and in real time.
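The MADALINE trajectory predictor described in the abstract is, at its core, a layer of adaptive linear (ADALINE) units trained with the least-mean-squares (LMS) rule. The sketch below, written in MATLAB since the dissertation implements its system on MATLAB, illustrates how such an adaptive linear unit could forecast the next conveyor position of a tracked object from a window of recent visually measured positions, matching the one-dimensional dynamic-tracking case reported above. The window length, step size, variable names, and simulated measurements are illustrative assumptions, not the dissertation's actual code.

% Minimal sketch of an ADALINE-style one-step-ahead position predictor
% trained with the normalized LMS rule. All parameters are assumed for
% illustration; the simulated centroids stand in for optical-flow output.

n  = 4;            % number of past positions per input window (assumed)
mu = 0.5;          % normalized LMS step size, 0 < mu < 2 (assumed)
w  = zeros(n, 1);  % adaptive weights of the linear unit
b  = 0;            % bias term

% Simulated 1-D centroid positions (mm) of an object moving at ~60 mm/s,
% sampled at 10 Hz, within the 35-86 mm/s range reported in the abstract.
t   = (0:0.1:10)';
pos = 60 * t + 2 * randn(size(t));

for k = n:numel(pos) - 1
    x     = pos(k-n+1:k);       % window of the n most recent positions
    y_hat = w' * x + b;         % predicted next position
    e     = pos(k+1) - y_hat;   % prediction error against the measurement
    w     = w + mu * e * x / (x' * x + eps);  % normalized LMS update
    b     = b + mu * e;
end

% One-step-ahead estimate that a rendezvous planner could pass to the robot.
next_pos = w' * pos(end-n+1:end) + b;
fprintf('Predicted next position: %.1f mm\n', next_pos);

In a full MADALINE, several such units would run in parallel (for example, one per coordinate or per trajectory segment), but the single-unit case already captures the adaptive linear prediction that drives the rendezvous-point calculation.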
Year: 2002
Corresponding Series Added Entry: Asian Institute of Technology. Dissertation ; no. ISE-02-01
Type: Dissertation
School: School of Advanced Technologies (SAT)
Department: Department of Industrial Systems Engineering (DISE)
Academic Program/FoS: Industrial Systems Engineering (ISE)
Chairperson(s): Afzulpurkar, Nitin V.
Examination Committee(s): Bohez, Ir. Erik L.J.; Pham Minh Dung; Lee, Gerald Seet Gim
Scholarship Donor(s): Ministry of University Affairs
Degree: Thesis (Ph.D.) - Asian Institute of Technology, 2002

