Research Article

Implementation of Intelligent Solutions for IBVS Realization Problems on an Industrial Robot Manipulator

Year 2020, Volume: 7, Issue: 2, 764–789, 30.12.2020
https://doi.org/10.35193/bseufbd.682875

Abstract

Image-based visual servoing (IBVS) is one of the popular visual servoing (VS) approaches for robot manipulators, since it does not require pose estimation. Beyond its theoretical problems, IBVS must also cope with problems that arise during implementation. Three main realization problems are obtaining the inverse of the interaction matrix, selecting a suitable constant gain value for the controller, and keeping the features within the field of view. Although the interaction matrix in IBVS is used through its pseudoinverse, the control law fails when singularities occur. A constant gain value, in turn, forces a trade-off between convergence speed and end-effector speed. It is also a common problem that features may leave the field of view during IBVS operation. This study aims to implement the intelligent approaches proposed to solve these problems on an industrial robot manipulator. In the first stage of the implementation, intelligent approximator units replace the inverse of the interaction matrix, eliminating the singularity problem. In the second stage, instead of a fixed gain, a fuzzy logic unit computes the gain at each iteration from the error and its derivative. In the third stage, regions are defined in the image plane, and a fuzzy logic unit keeps the features within the field of view. Experimental results for all implementations are presented and discussed.
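The realization problems listed above all stem from the classic IBVS control law v = -λ L⁺ e. The Python sketch below illustrates that baseline under the standard normalized-coordinate formulation (as in the Chaumette & Hutchinson tutorial cited below); it is not the paper's method. The function names are illustrative, and `adaptive_gain` is only a smooth stand-in for the fuzzy gain-scheduling unit described in the abstract, not its actual rule base.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction (image Jacobian) matrix of one point feature (x, y)
    at depth Z, in normalized image coordinates (standard IBVS form)."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x**2), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y**2, -x * y, -x],
    ])

def ibvs_velocity(features, desired, depths, gain):
    """Classic IBVS control law v = -gain * pinv(L) @ e for (N, 2)
    arrays of current and desired point features."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    e = (features - desired).ravel()
    # Moore-Penrose pseudoinverse; near singularities of L this step is
    # ill-conditioned -- the motivation for replacing it with learned
    # approximators in the paper.
    return -gain * np.linalg.pinv(L) @ e  # (vx, vy, vz, wx, wy, wz)

def adaptive_gain(e_norm, de_norm, lam_min=0.1, lam_max=1.0, k=5.0):
    """Illustrative error-dependent gain (hypothetical stand-in for the
    fuzzy unit): large error -> larger gain for fast convergence,
    damped when the error changes quickly to limit end-effector speed."""
    lam = lam_min + (lam_max - lam_min) * (1.0 - np.exp(-k * e_norm))
    return lam / (1.0 + de_norm)
```

With a constant gain, the same λ that gives fast convergence far from the goal produces excessive end-effector speeds early on; scheduling the gain from the error and its derivative, as the second stage does with fuzzy logic, relaxes that trade-off.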

Supporting Institution

TÜBİTAK

Project Number

117E511

Acknowledgments

This study was supported by TÜBİTAK under the 3001 Starting R&D Projects program, grant number TÜBİTAK-117E511.

References

  • Hill, J. & Park, W. T. (1979). Real time control of a robot with a mobile camera. Proceedings of the 9th ISIR, 233–246.
  • Hutchinson, S., Hager, G. D. & Corke, P. I. (1996). A tutorial on visual servo control. IEEE Trans. Robot. Autom., vol. 12, no. 5, 651–670, doi: 10.1109/70.538972.
  • Chaumette, F. & Hutchinson, S. (2006). Visual servo control, Part I: Basic approaches. IEEE Robot. Autom. Mag., vol. 13, no. 4, 82–90, doi: 10.1109/MRA.2006.250573.
  • Collewet, C., Marchand, E. & Chaumette, F. (2008). Visual servoing set free from image processing. Proc. IEEE Int. Conf. Robot. Autom., 81–86, doi: 10.1109/ROBOT.2008.4543190.
  • Kallem, V., Swensen, J. P., Hager, G. D. & Cowan, N. J. (2007). Kernel-based visual servoing. IEEE/RSJ Int. Conf. Intell. Robot. Syst., 1975–1980.
  • Tahri, O. & Chaumette, F. (2005). Point-based and region-based image moments for visual servoing of planar objects. IEEE Trans. Robot., vol. 21, no. 6, 1116–1127, doi: 10.1109/TRO.2005.853500.
  • Bakthavatchalam, M., Tahri, O. & Chaumette, F. (2018). A direct dense visual servoing approach using photometric moments. IEEE Trans. Robot., vol. 34, no. 5, 1226–1239, doi: 10.1109/TRO.2018.2830379.
  • Bateux, Q. & Marchand, E. (2017). Histograms-based visual servoing. IEEE Robot. Autom. Lett., vol. 2, no. 1, 80–87.
  • Bourquardez, O., Mahony, R., Hamel, T. & Chaumette, F. (2006). Stability and performance of image based visual servo control using first order spherical image moments. IEEE/RSJ Int. Conf. Intell. Robot. Syst., 4304–4309, doi: 10.1109/IROS.2006.281963.
  • Malis, E., Chaumette, F. & Boudet, S. (1999). 2 1/2 D visual servoing. IEEE Trans. Robot. Autom., vol. 15, no. 2, 238–250, doi: 10.1109/70.760345.
  • Corke, P. I. & Hutchinson, S. A. (2001). A new partitioned approach to image-based visual servo control. IEEE Trans. Robot. Autom., vol. 17, no. 4, 507–515, doi: 10.1109/70.954764.
  • He, Z., Wu, C., Zhang, S. & Zhao, X. (2019). Moment-based 2.5-D visual servoing for textureless planar part grasping. IEEE Trans. Ind. Electron., vol. 66, no. 10, 7821–7830.
  • Chaumette, F. (1998). Potential problems of stability and convergence in image-based and position-based visual servoing. Lecture Notes in Control and Information Sciences, 66–78.
  • Kumar, P. P. & Behera, L. (2010). Visual servoing of redundant manipulator with Jacobian matrix estimation using self-organizing map. Rob. Auton. Syst., vol. 58, no. 8, 978–990, doi: 10.1016/j.robot.2010.04.001.
  • Kosmopoulos, D. I. (2011). Robust Jacobian matrix estimation for image-based visual servoing. Robot. Comput. Integr. Manuf., vol. 27, no. 1, 82–87, doi: 10.1016/j.rcim.2010.06.013.
  • Zhong, X., Zhong, X. & Peng, X. (2015). Robots visual servo control with features constraint employing Kalman-neural-network filtering scheme. Neurocomputing, vol. 151, 268–277, doi: 10.1016/j.neucom.2014.09.043.
  • Gonçalves, P. J., Mendonça, L. F., Sousa, J. M. C. & Pinto, J. R. C. (2008). Uncalibrated eye-to-hand visual servoing using inverse fuzzy models. IEEE Trans. Fuzzy Syst., vol. 16, no. 2, 341–353.
  • Mansard, N. & Chaumette, F. (2007). Task sequencing for high-level sensor-based control. IEEE Trans. Robot., vol. 23, no. 1, 60–72, doi: 10.1109/TRO.2006.889487.
  • Kermorgant, O. & Chaumette, F. (2014). Dealing with constraints in sensor-based robot control. IEEE Trans. Robot., vol. 30, no. 1, 244–257.
  • Chesi, G., Hashimoto, K., Prattichizzo, D. & Vicino, A. (2004). Keeping features in the field of view in eye-in-hand visual servoing: A switching approach. IEEE Trans. Robot., vol. 20, no. 5, 908–913, doi: 10.1109/TRO.2004.829456.
  • Yoshikawa, T. (1985). Manipulability of robotic mechanisms. Int. J. Rob. Res., vol. 4, no. 2, 3–9.
  • Zhao, Z. Y., Tomizuka, M. & Isaka, S. (1993). Fuzzy gain scheduling of PID controllers. IEEE Trans. Syst. Man Cybern., vol. 23, no. 5, 1392–1398, doi: 10.1109/21.260670.
  • Mezouar, Y. & Chaumette, F. (2002). Path planning for robust image-based control. IEEE Trans. Robot. Autom., vol. 18, no. 4, 534–549.
  • Chesi, G. & Hung, Y. S. (2007). Global path-planning for constrained and optimal visual servoing. IEEE Trans. Robot., vol. 23, no. 5, 1050–1060.
  • Gonzalez, R. C. & Woods, R. E. (2008). Digital Image Processing, 3rd ed. Pearson.
  • Haykin, S. (1998). Neural Networks: A Comprehensive Foundation, 2nd ed. Upper Saddle River, NJ, USA: Prentice Hall PTR.
  • Huang, G., Zhou, H., Ding, X. & Zhang, R. (2012). Extreme learning machine for regression and multiclass classification. IEEE Trans. Syst. Man Cybern. B Cybern., vol. 42, no. 2, 513–529.
  • Janabi-Sharifi, F., Deng, L. & Wilson, W. J. (2011). Comparison of basic visual servoing methods. IEEE/ASME Trans. Mechatronics, vol. 16, no. 5, 967–983, doi: 10.1109/TMECH.2010.2063710.



Details

Primary Language Turkish
Subjects Engineering
Section Articles
Authors

Tolga Yuksel 0000-0003-4425-7513

Project Number 117E511
Publication Date 30 December 2020
Submission Date 31 January 2020
Acceptance Date 20 July 2020
Published in Issue Year 2020, Volume: 7, Issue: 2

Cite

APA Yuksel, T. (2020). GTGS Gerçekleme Problemleri İçin Akıllı Çözümlerin Endüstriyel Bir Robot Manipülatöre Uygulanması. Bilecik Şeyh Edebali Üniversitesi Fen Bilimleri Dergisi, 7(2), 764-789. https://doi.org/10.35193/bseufbd.682875