
This thesis investigated the use of catadioptric cameras to overcome the limited field of view of conventional cameras. Because of the complex geometry of omnidirectional image formation, adapting classical processing tools to this type of image was unavoidable. In this perspective, we presented a study on the development of a new processing tool for spherical images: an edge-detection operator based on a spherical model of virtual electric charges. Comparing its performance with that of classical operators clearly demonstrated the improvement achieved. Such a tool could serve as a basis for surveillance and object-recognition applications.
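To make the idea concrete, here is a minimal planar sketch of a charge-based edge detector: each pixel is treated as a virtual charge equal to its intensity, and the edge strength is the magnitude of the resultant Coulomb-like force exerted by its 3x3 neighbourhood. This is an illustrative reconstruction of the general principle, not the thesis' spherical operator, whose charges live on the unit sphere of the catadioptric projection model.

```python
import numpy as np

def electrostatic_edges(img):
    """Edge strength as the norm of the resultant force that the 3x3
    neighbourhood (charges = intensities) exerts on each pixel.
    Force contributions follow an inverse-square law: q * r / |r|^3."""
    img = img.astype(float)
    fx = np.zeros_like(img)
    fy = np.zeros_like(img)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]
    for dy, dx in offsets:
        d3 = (dx * dx + dy * dy) ** 1.5
        # shifted[y, x] = img[y + dy, x + dx]; np.roll wraps at the
        # border, so only the interior response is meaningful here
        shifted = np.roll(np.roll(img, -dy, axis=0), -dx, axis=1)
        fx += shifted * dx / d3
        fy += shifted * dy / d3
    return np.hypot(fx, fy)

# toy usage: a vertical step edge; in flat regions the neighbour forces
# cancel by symmetry, so the response concentrates on the step
step = np.zeros((8, 8))
step[:, 4:] = 100.0
edges = electrostatic_edges(step)
```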

As an application, the proposed contributions and concepts enabled the development of new techniques for using omnidirectional vision sensors in navigation. We designed and built a mobile robot named ESCALADE360. This robotic platform was developed as part of our research project, which aims to design and test new algorithms for object tracking and visual servoing with omnidirectional cameras. Specifically, we addressed the real-time detection and tracking of moving targets in dynamic environments. For this scenario, we proposed a color-detection technique that is less sensitive to variations in scene illumination. Experimental results show that the proposed method, based on chromatic invariance, can effectively distinguish and track moving targets in both indoor and outdoor environments.
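The chromatic-invariance principle can be sketched in a few lines: normalised chromaticity (r, g) = (R, G) / (R + G + B) discards the intensity component, so a brighter or darker version of the target colour still matches. The tolerance value and the centroid-based localisation below are illustrative choices, not the exact pipeline used on ESCALADE360.

```python
import numpy as np

def track_by_chromaticity(frame, target_rg, tol=0.05):
    """Return the centroid (x, y) of pixels whose normalised
    chromaticity lies within `tol` of the target, or None."""
    rgb = frame.astype(float)
    s = rgb.sum(axis=2) + 1e-9          # avoid division by zero
    r = rgb[..., 0] / s
    g = rgb[..., 1] / s
    mask = (np.abs(r - target_rg[0]) < tol) & (np.abs(g - target_rg[1]) < tol)
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return xs.mean(), ys.mean()

# toy frame: the same red target under two illumination levels
frame = np.zeros((10, 10, 3), dtype=np.uint8)
frame[2:4, 2:4] = (200, 20, 20)        # bright red
frame[6:8, 6:8] = (100, 10, 10)        # same chromaticity, half as bright
centroid = track_by_chromaticity(frame, target_rg=(200 / 240, 20 / 240))
```

Both patches match despite the 2x brightness difference, which is exactly the robustness to illumination change that the tracker relies on.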

We also presented a new solar-tracker concept based on computer vision. This tracker uses an omnidirectional imaging system to provide accurate information about the sun's position. Tracking is performed in real time, independently of spatio-temporal coordinates, with reduced sensitivity to weather conditions and a wide tracking field. Several experiments were carried out to compare the electricity production of this solar tracker with that of a fixed panel and of a conventional tracker, and they confirmed the reliability of the concept. The rich information delivered by the acquisition module can be exploited not only for tracking the solar spectrum but also for predicting cloud cover, which will eventually allow intelligent management of a photovoltaic installation.
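The core step of vision-based sun tracking can be illustrated as follows: locate the sun as the centroid of the brightest pixels in the panoramic image, then convert image polar coordinates into azimuth and elevation. This is a simplified stand-in for the thesis' method; in particular, the linear radius-to-elevation mapping is an assumption, since a real sensor requires the calibrated projection model of its mirror.

```python
import math
import numpy as np

def sun_direction(gray, center, max_radius):
    """Estimate (azimuth, elevation) in degrees from the centroid of
    near-saturated pixels in an omnidirectional grayscale image.
    The radius-to-elevation mapping is assumed linear (illustrative)."""
    ys, xs = np.nonzero(gray >= 0.98 * gray.max())
    u, v = xs.mean() - center[0], ys.mean() - center[1]
    azimuth = math.degrees(math.atan2(v, u)) % 360.0
    rho = math.hypot(u, v)
    elevation = 90.0 * (1.0 - min(rho / max_radius, 1.0))
    return azimuth, elevation

# synthetic image: a saturated spot halfway to the mirror's rim
img = np.zeros((200, 200))
img[100, 150] = 255.0
az, el = sun_direction(img, center=(100, 100), max_radius=100)
```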

In a second part devoted to omnidirectional stereovision, we studied conic sections and surveyed existing catadioptric systems. Building on this study, we presented the design of two new catadioptric omnistereo systems with spherical mirrors in a vertical configuration. Two passive triangulation models were developed for depth estimation, and the mathematical foundation of the depth-error variation was also presented. Experimental sequences were carried out to evaluate the proposed concepts. As future work, we plan to use the second proposed omnistereo sensor as the on-board acquisition module of the ESCALADE360 robotic platform, extending the tracking application to dynamic environments with depth-perception capability. This line of research can be deployed in the field of simultaneous localization and 3D reconstruction.
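The principle behind passive triangulation with a vertical baseline, and the growth of the depth error with distance, can be sketched generically (this simplified two-ray model stands in for the thesis' mirror-specific formulations): two viewpoints separated by a known baseline observe the same point under two elevation angles, and intersecting the rays recovers the horizontal distance.

```python
import math

def depth_from_elevations(alpha_top, alpha_bottom, baseline):
    """Vertical-baseline passive triangulation: two viewpoints separated
    by `baseline` see the point under elevations alpha_top/alpha_bottom;
    intersecting the rays gives the horizontal distance."""
    return baseline / (math.tan(alpha_top) - math.tan(alpha_bottom))

# point 2 m away, seen at heights 1 m and 0.5 m from the two viewpoints
b = 0.5
a1 = math.atan2(1.0, 2.0)
a2 = math.atan2(0.5, 2.0)
d = depth_from_elevations(a1, a2, b)            # recovers ~2.0 m

# for a fixed angular error (0.1 deg here) the depth error is nonzero
# and grows with distance, which is the behaviour the error analysis
# in the thesis formalises
eps = math.radians(0.1)
err = abs(depth_from_elevations(a1 + eps, a2, b) - d)
```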

