NRC Publications Archive
This publication could be one of several versions: author's original, accepted manuscript, or the publisher's version.
Development of a Real-Time Laser Scanning System for Object
Recognition, Inspection, and Robot Control
Beraldin, Jean-Angelo; Rioux, Marc; Livingstone, F.R.; King, L.
NRC Publications Record:
https://nrc-publications.canada.ca/eng/view/object/?id=1a1b62c2-5457-4d27-b6e0-3eaec766698e
https://publications-cnrc.canada.ca/fra/voir/objet/?id=1a1b62c2-5457-4d27-b6e0-3eaec766698e
NRC 35063. SPIE Vol. 2057, Telemanipulator Technology and Space Telerobotics (1993), pp. 454–461.
Development of a Real-Time Laser Scanning System for Object Recognition, Inspection, and Robot Control

F.R. Livingstone, L. King
Hymarc Ltd.
38 Auriga Drive
Ottawa, Ontario, Canada K2E 8A5

J.-A. Beraldin, M. Rioux
Institute for Information Technology
National Research Council Canada
Ottawa, Ontario, Canada K1A 0R6

Abstract
A real-time laser scanner for object recognition, inspection, and robot control is presented in this paper. Range and intensity images are generated in perfect registration at a rate of 10 mega-samples per second. Images have a resolution of 483 lines, each having 512 pixels. Owing to its compatibility with the video standard RS-170, the range camera can be directly interfaced to an image processor through a video frame grabber with either an analog or digital input. The angular field of view is 30° × 30°. The stand-off distance is 0.5 m and the operational depth of field is 1 m. Furthermore, cooperative targets can be measured up to several meters away from the camera. The range resolution is 12 bits. Under normal operating conditions, range precisions of 0.2 mm and 2 mm are achieved at 0.5 m and 1 m respectively. The camera weighs only 2.75 kg. The source in the present system is a Nd:YAG laser emitting a laser beam with a wavelength of 1064 nm and a maximum power on the scene of 2 W. Future developments will include the use of a source at 1540 nm to make the system eye-safe. The testing of the prototype is well under way and soon, images of a variety of representative objects will be available.

1 Introduction
Canada is participating in the Space Station Freedom Program by providing a Mobile Servicing System (MSS) that will be used in assembling and maintaining the station as well as servicing and manipulating payloads and loading/unloading the shuttle. The MSS consists of two elements on the station, the Mobile Servicing Centre (MSC) and the MSS Maintenance Depot (MMD). Through the Strategic Technologies in Automation and Robotics (STEAR) Program, the Canadian Space Agency (CSA) is fostering additional technological development beyond the activity being undertaken by the MSS industrial team (SPAR, IMP, CAE, CAL, SED, MDA). The technical objective of one of the projects is to develop vision system technology for application to the control and operation of robotic systems. The purpose of the project is not to build actual flight hardware or software for the MSS but to develop technology that will meet the needs of the longer term growth and upgrading of the MSS. Concepts may be developed in an application area where the Contractor already has expertise, as long as the level of complexity or difficulty is comparable to that envisaged on the MSS.
This paper describes a real-time laser scanner built by Hymarc Ltd. under contract to the CSA. The system is based on recent advances made at the National Research Council on such a system [1]. The laser scanner generates registered range and intensity images according to the RS-170 standard. Compatibility with this video standard allows for a direct link to the Artificial Vision Function (AVF) that is a key element of the MSS. Integration of the laser scanner with the AVF will allow three-dimensional surveying of the robotic work-site, identification of known objects from registered range and intensity images, and object tracking based on range data. The investigation of algorithms for model-based tracking is reported in a separate article [2].

The general requirements of the MSS are reviewed in Section 2. Section 3 discusses the need for vision in space robotics. An in-depth description of the scanner and its calibration are presented in Section 4. Finally, conclusions are presented in Section 5.
[Figure 1: Impact of growth on basic AVF. Block diagram showing MSS command processing, AVF tracking control mode, AVF command processing, camera/PTU control, text/graphics and video displays, image processing, video switch, AVS software, cameras, pan-tilt unit (PTU), end effector (EE), laser scanner, and object. AVF = Artificial Vision Function; AVU = Advanced Vision Unit.]

2 MSS Vision Requirements
The current AVF is based on the Space Vision System (SVS) that was demonstrated on CANEX-2 [3]. The SVS analyses video signals from the closed circuit television system of the space shuttle and provides real-time position/orientation and velocity information of an object with a cooperative target array. The baseline requirements for the Artificial Vision Function (AVF) are as follows:

- To identify suitably illuminated objects (targets) from their video images.
- To estimate the position, attitude, translational rate, and rotational rate of the object.
- To provide appropriate camera control to track the object.
- To be able to track objects before capture by MSS manipulators and berth objects/payloads handled by manipulators.

The CSA intends to enhance the capabilities of the AVF by developing additional vision systems such as a laser-based range finding camera, as shown in Figure 1. The enhanced AVF will be capable of performing labelled and unlabelled object identification, shape determination of unknown objects, automatic target acquisition, update of the world model database, and manipulator endpoint and link deflection sensing. These new features will be used in the following areas: enhanced target recognition in AVF tracking/berthing, world map verification for collision detection/avoidance, and object recognition and verification during automatic operations, e.g., truss assembly tasks.

3 Vision for Space Robotics

3.1 General
Computer vision will be integral to the success of the Space Station Project. The space environment is an open one, and events occur unexpectedly and with deviations from any pre-specified plan. Thus, sensory-based feedback is necessary, and vision is the sense through which some of the most valuable information about the environment is available. Space operations are also characterized by an overriding requirement for operational safety and reliability. Given the vast amounts of uncertainty that characterize most complex robotic operations and given their highly unstable nature, it follows again that large amounts of sensing data are required for reliability. To date, most sensing has been interactive. A human operator has direct visual feedback either through the cockpit windows or via a video imaging system mounted ... at the present level of planning of the Space Station Project, the amount of extravehicular activity (EVA) required appears high. Thus, any technology likely to decrease this need will reduce operational costs and increase the safety of the astronauts.
3.2 Active versus Passive Vision

Vision involves the analysis of the properties of the luminous flux reflected or radiated by objects. To recover the geometrical structures of these objects, either to recognize or to track them through time, two basic vision strategies are available. One strategy, passive vision, attempts to analyze the structure of the scene under ambient light. In contrast, active vision is an attempt to reduce the ambiguity of scene analysis by structuring the way in which images are formed. By keeping the properties of invariance in rotation and translation of 3-D objects, sensors that capitalize on active vision can resolve most of the ambiguities found with 2-D imaging systems. Moreover, with structured light approaches, the 3-D information becomes insensitive to background illumination and to surface color and texture. Thus, the task of processing the 3-D data is greatly simplified. Triangulation-based laser range cameras are examples of vision systems built on such a strategy.
For close-range space activities, a range camera based on triangulation with a single-spot scanning mechanism is recommended. Single-spot scanning is emphasized here because a typical scene in space will be high in contrast and will generate spurious reflections. These originate from the intense natural radiation field caused mostly by the sun. Also, the thermal blankets used to protect some pieces of equipment are highly specular at the operating laser wavelength. Further, when the space station is in the shadow of the Earth, the vision system will have to operate in quasi-darkness. This situation calls for a camera that can operate over a wide dynamic range of intensities. Through the control of the camera parameters on a pixel to pixel basis, increased dynamic range of the range camera is achievable. Hence, range cameras based on single-spot scanning can be used in a longer time window during one orbit of the space station.
3.3 Synchronized Scanning Principle

Rioux introduced an innovative approach to triangulation-based range imaging by using a synchronized scanning scheme which allows very large fields of view with small triangulation angles without compromising precision. With smaller triangulation angles, a reduction of shadow effects is inherently achieved. Implementation of this triangulation technique by an auto-synchronized scanner approach gives a considerable reduction in the optical head size compared to standard triangulation methods. Range measurement and lateral position measurement are essentially decoupled and there is an improved ambient light immunity due to the small instantaneous field of view.
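For readers unfamiliar with the underlying geometry, the following sketch shows conventional single-spot triangulation, not the auto-synchronized variant described above; the symbols (baseline b, focal length f, deflection angle theta, detector spot position p) are illustrative assumptions, not taken from this paper.

```python
import math

def triangulation_range(b, f, theta, p):
    """Range from conventional single-spot triangulation (illustrative).

    b:     baseline between projection and collection axes (m)
    f:     effective focal length of the collecting lens (m)
    theta: deflection angle of the projected beam (rad)
    p:     spot position on the position detector (m)
    """
    # Intersect the projected ray with the line of sight of the detected
    # spot: z = b * f / (p + f * tan(theta)).
    return b * f / (p + f * math.tan(theta))

# A spot imaged 1 mm off-axis with a 0.1 m baseline and a 10 mm lens
# corresponds to a surface 1 m away (theta = 0).
z = triangulation_range(0.1, 0.01, 0.0, 0.001)
```

Differentiating this relation with respect to p shows the range uncertainty grows as z²/(f·b) for a fixed detector resolution, which is why small triangulation angles normally cost precision and why the synchronized scanning scheme matters.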
4 Real-Time Laser Scanner

4.1 Vision System Electronics and Data Processing

Many issues affect the requirements for the signal processing hardware. The basic and key requirement is that the electronics provide range and intensity data at nominally 10 MHz. However, a final design must address other issues, including design and manufacturing costs, system size, and power requirements. The underlying consideration, for any space or terrestrial product, is that the system has four basic functional elements:

1. Control of the polygonal mirror and y-axis scanner and generation of position and intensity signals at 10 MHz
2. Data acquisition, point processing (e.g., calibration) and intermediate storage, all in real time (10 MHz)
3. Image processing
4. Man/machine interface

Each of these four main areas will require unique electronics to satisfy the technical requirements. However, many of these requirements are achievable, from a technical point of view, with technology readily available today. This is not to say that an implementation with today's technologies would be appropriate for a space environment, but rather
[Figure 2: System architecture. VMEbus-based system with a local bus and MAXbus: MVME-147 CPU, MaxScan with daughter card, MaxGraph, and two MegaStore-8 boards; analog, timing, and compute units; polygon controller; laser; real-time scanner; console terminal; RS-170 video monitor; 340 MB disk and tape drive.]
... and y-axis scanner, scan synchronization, signal generation, analog-to-digital conversion, real-time calibration, and short-term high-speed storage are areas where solutions exist or technologies are readily available to create solutions.
The functional elements 2 and 4 are implemented on MAXVideo family image processing boards from Datacube. Figure 2 shows a diagram of the system architecture. The architecture of the real-time laser scanner is built around the industry standard VMEbus, a Motorola 68040 general purpose CPU running under the real-time operating system OS-9 from Microware, some MaxVideo boards and 4 custom signal processing boards. At the moment, the image processing is done on a different platform. Specialized hardware boards can be added in a later phase when all the algorithms required for the control and operation of robotic systems have been fully investigated.

4.2 Video Rate Scanner
Figure 3a shows a schematic diagram that emphasizes the different elements of the laser scanner. The raster type images are obtained in the following way. The laser beam is relayed to the polygon scanner through a series of focusing lenses. The rotation of the polygon imparts a scanning motion to the laser beam along the x axis. The scan along the y axis is achieved with a galvanometer (or motor). The reflected light from the scene is collected back through the mirror mounted on the galvanometer (or motor), a facet of the polygon scanner opposite to the projection of the laser beam, the collecting lens, and finally it impinges on a position detector where the information on depth and intensity is derived. The auto-synchronization effect is achieved by two facets of the polygonal mirror.

In the RS-170 standard, a full image is composed of 525 horizontal lines (483 are active) which are displayed on a monitor in 1/30 second. Reproduction of a full image on a monitor is achieved by the sequential transmission of two interleaved fields at 60 Hz. The equivalent line rate is therefore 15750 lines/second. At a sampling rate of 10 MHz, 512 samples can be acquired per line. Due to the size of the collecting mirror, it is not possible to achieve the required 15750 scans/second using a galvanometer driven mirror or resonant scanner. The technology that can achieve this speed is a polygonal mirror driven at high speed by a motor. A polygonal mirror with N facets rotates the beam through 720°/N. A 30° scan angle will require a 24-facet polygon, which is one of the standards in the industry. To achieve video line rate, it must rotate at 15750/24 revolutions per second, i.e., 656.25 rps. Such scanners usually rotate continuously in one direction at a fixed speed to provide repetitive unidirectional scans. The mirror is mounted directly on the electric motor shaft. The combined inertia of the polygon and motor rotor contribute to rotational stability. This high inertia renders them impractical for applications requiring rapid changes in scan velocity. For true RS-170 compatibility, the galvanometer must be driven at 60 Hz. With this prototype, it is driven at 30 Hz (also known as ...).
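The timing figures in this subsection follow from simple arithmetic; as a sanity check, with numbers taken from the text above:

```python
# RS-170 timing and polygon geometry, using the figures quoted in the text.
lines_per_frame = 525                              # total lines (483 active)
frames_per_second = 30                             # two interlaced 60 Hz fields
line_rate = lines_per_frame * frames_per_second    # lines per second

# One line period at a 10 MHz sampling rate leaves room for the
# 512 samples per line that are actually kept.
sample_slots = 10e6 / line_rate

# Each facet of an N-facet polygon deflects the beam through 720/N degrees,
# so a 30 degree scan angle needs N = 24 facets.
facets = 720 / 30

# Polygon rotation speed needed to reach the video line rate.
polygon_rps = line_rate / facets

print(line_rate, facets, polygon_rps)
```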
[Figure 3: Schematic diagram of (a) complete optical arrangement, (b) main optical path unfolded. Panel (a) shows the source, polygon scanner, galvanometer scanner, and position detector with x, y, z axes; panel (b) shows the source, collimating lens, imaging lens, position sensor, and field of view with parameters β, γ, d, a, and fo.]

4.3 Optical Design
With any laser-based range camera, depth of field (DOF) and accuracy are related. Increased accuracy can be achieved by reducing the depth of field. For robotic applications, a depth of field of 1000 mm is reasonable, according to the technical literature and feedback from users of robots. Figure 3b shows the geometrical arrangement of the triangulation approach used here with the y-axis scanner removed for clarity. The laser beam is collimated and focused in the volume of view in such a way that the laser spot size is almost constant within it. To get maximum resolution from the imaging device, the desired depth of field was matched to the Rayleigh criterion of a focused Gaussian laser beam. This leads to the following relationship,

    d = 2 π r0² / λ    (1)

where λ is the wavelength of the source, d is the depth of field, and r0 is the radius of the focused laser spot at the center of the DOF. Using d = 1000 mm, a mid-DOF distance of 1000 mm, and a source emitting at 1064 nm, one gets r0 ≈ 0.41 mm. With a field of view of about 400 mm at a distance of 1000 mm, then with a spot radius of 0.41 mm, more than 975 pixels can be sampled along the scan axis. In practice, 512 pixels per line are sampled in order to match RS-170-compatible image processing systems. The specifications of the robotic vision system are summarized in Table 1.
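The spot-size figure can be reproduced by rearranging Eq. (1) for r0; a quick numerical check (units in metres):

```python
import math

wavelength = 1064e-9   # Nd:YAG source (m)
dof = 1.0              # desired depth of field (m)

# Eq. (1): d = 2*pi*r0**2 / lambda, solved for the focused spot radius.
spot_radius = math.sqrt(wavelength * dof / (2 * math.pi))  # ~0.41 mm

# Resolvable elements across the ~400 mm field of view at mid-DOF,
# counted as field of view over spot radius as in the text.
field_of_view = 0.4
resolvable = field_of_view / spot_radius   # ~970+, well above the 512 used

print(round(spot_radius * 1e3, 2), int(resolvable))
```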
The estimation of the required laser power is an important step in the design process. By taking into account the normal operating range of the position detector, a 2 W laser is required to measure surfaces with varying reflectance. A diode laser is preferred because of its size; the single-mode type gives better beam quality than multi-mode ones. At the present time, the maximum laser power available with this type of laser is 1 W. Because of the level of laser power required by this range camera, eye safety became important from the start of the project. Two possibilities had to be considered, i.e., either locate or design a position detector with a current gain mechanism (e.g., avalanche or micro-channel plate-based position sensitive detectors) or move to a safer operating wavelength. The latter solution was favored because of technical limitations of the previous devices. An InGaAs position detector sensitive at 1540 nm was selected [6]. This continuous response detector produces two currents related to the position of the centroid of the light distribution impinging on its surface. The bandwidth of this InGaAs detector is 7.2 MHz, which meets the required video bandwidth of 4 MHz. In the visible spectrum, a fraction of a milliwatt is sufficient to cause eye damage, since the power of coherent light entering the pupil is magnified, at the retina, through the lens, by a factor of roughly 10^5. At longer wavelengths, however, strong absorption by water present in the eye serves to reduce optical power reaching the lens, thus increasing the damage threshold. At 1540 nm the threshold power for eye damage is roughly 10^4 times greater than what it is in the visible spectrum [7]. Also, the noise effective power of the InGaAs position detector is about ...
Table 1: Specifications of the real-time laser scanner.

    Stand-off                      500 mm
    X-axis scan (at mid-DOF)       400 mm
    Y-axis scan (at mid-DOF)       400 mm
    Range precision at min-DOF     0.2 mm
                    at mid-DOF     2.0 mm
                    at max-DOF     12 mm
    Sampling rate                  10 MHz
    Video rate: RS-170             512 samples per line
                                   525 lines per image
                                   30 images per second

[Figure 4: Concept for the scanner (not to scale): A = 130 mm, B = 70 mm, C = 40 mm, D = 70 mm, E = 40 mm, F = 190 mm.]
... surfaces are similar at 1540 nm and in the visible region. Thus, there is potential for signal improvement of 40 dB (optical) if 1540 nm is chosen instead of visible wavelengths for making eye-safe measurements. For this first prototype, a Nd:YAG laser (1064 nm) was used instead of the erbium-doped fiber laser because of the laser power required. At the present rate of development, however, it is reasonable to expect higher powers in the near future.
The mechanical layout of a scanner for robotic vision is mainly determined by the space requirements of the polygonal mirror, drive motor, and the y-axis motor and mirror. A concept for the scanner is illustrated in Figure 4. The mirror/motor assembly is mounted on an optical bedplate which also supports all of the optical components and thus ensures a stable optical alignment. The laser input is from a fibre optic connection and the beam is directed onto the polygonal mirror by a steering mirror. The portion of the casing that contains the y-axis motor and scan mirror has been made modular so that it can be removed and used as a single-axis scanner. The module could also be replaced by a special lens assembly for a high accuracy version of the scanner. The dimensions of the range camera are given in Figure 4. The polygon motor, which is the heaviest component, weighs 0.6 kg. The range camera weighs about 2.75 kg.
4.4 Range Data Calibration

The range camera yields three quantities per sampling interval: two for the angular position of the mirrors and one for the range. These raw quantities are not directly usable by image processing algorithms. A re-mapping of these variables to a more common coordinate system like a rectangular system is therefore required.
The calibration is performed with a flat plate composed of an array of targets that is positioned at known locations in front of the range camera with a precise linear stage. This plate is made up of evenly spaced targets on a grid of 11 × 13. Once the registered range and intensity images have been acquired for all the target array positions, the edges of those targets are extracted to sub-pixel accuracy using a moment preserving algorithm. The center of each target (in terms of the raw camera coordinates) is then determined by fitting an ellipse to the edge points. An iterative nonlinear simultaneous least-squares adjustment method extracts the internal and external parameters of the camera model [8]. The re-mapping of the raw data to a rectangular coordinate system is done off-line through software. Unfortunately, results could not be shown due to unforeseen problems with the laser source. The testing of the prototype is well under way and soon, images of a variety of representative objects will be available.
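As an illustration of the target-center step, the sketch below fits a general conic to edge points by linear least squares and takes the center where the conic gradient vanishes. The paper's actual procedure (moment-preserving edge extraction followed by an iterative simultaneous least-squares adjustment) is more elaborate; this is only a minimal stand-in with illustrative names.

```python
import numpy as np

def ellipse_center(xs, ys):
    """Least-squares conic fit a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0
    to edge points; returns the center of the fitted ellipse."""
    x = np.asarray(xs, dtype=float)
    y = np.asarray(ys, dtype=float)
    # Each row of D multiplies the conic coefficient vector; the
    # right-singular vector with smallest singular value is the
    # least-squares conic up to scale.
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    a, b, c, d, e, _ = np.linalg.svd(D)[2][-1]
    # The center is where the conic's gradient vanishes:
    #   [2a  b] [x0]   [-d]
    #   [ b 2c] [y0] = [-e]
    return np.linalg.solve([[2 * a, b], [b, 2 * c]], [-d, -e])

# Synthetic elliptical target edge: center (10, -3), semi-axes 5 and 3.
t = np.linspace(0, 2 * np.pi, 48, endpoint=False)
cx, cy = ellipse_center(10 + 5 * np.cos(t), -3 + 3 * np.sin(t))
```

In the real calibration, the centers recovered from all plate positions feed the simultaneous adjustment that extracts the camera model parameters.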
5 Conclusion

This paper presented a real-time laser range camera for object recognition, inspection and robot control. The vision requirements for the Mobile Servicing System that will be developed by Canada through the Canadian Space Agency and its partners (industrial, university and research laboratories) were reviewed. The need for vision in space robotics was clearly identified. In particular, a range camera based upon single laser spot scanning was identified as a very good strategy that can meet those vision requirements.
The proposed system measures registered range and intensity images at 10 mega-samples per second according to the RS-170 video standard, i.e., 483 lines with 512 pixels per line. With a stand-off distance of 0.5 m, an operational depth of field of 1 m, and a range resolution of 12 bits, the range camera will produce images adequate for various robotic tasks, e.g., collision detection and avoidance. Here, a rate of 30 images per second should provide enough data about the environment surrounding the robot. The control of the camera, the data storage and low-level point processing are implemented on a commercial platform. Future developments will include dedicated boards to perform the real-time correction and resampling of the images on a regular grid. Also, software will be required to interpret the data and communicate its meaning to the system user. The particular tasks considered are object identification from surface structure, work cell modelling, and pose determination. Gathering the range data is only part of the solution in providing a vision system capability.
6 Acknowledgements

The authors wish to thank the Canadian Space Agency for the financial and technical support along the course of this project. The authors wish to express their gratitude to G. Thompson for his technical assistance provided in the course of the system testing and L.R. Schmidt from the STEAR program for his valuable comments. Finally, the authors wish to thank E. Kidd for her help in preparing the text and P. Amirault for the technical illustrations.
References

[1] J.-A. Beraldin, M. Rioux, F. Blais, J. Domey and L. Cournoyer, "Registered Range and Intensity Imaging at 10-Mega Samples per Second," Opt. Eng. 31(1), pp. 88–94 (1992).
[2] W. Cheung, P. Letts and G. Roth, "Model-based Tracking using a Real-time Laser Range Finder," In Press, SPIE Vol. 2063 (1993).
[3] S.G. MacLean, M. Rioux, F. Blais, J. Grodski, P. Milgram, H.F.L. Pinkney and B.A. Aikenhead, "Vision System Development in a Space Laboratory," SPIE Vol. 1395, 8–15 (1990).
[4] F. Blais, M. Rioux, and S.G. MacLean, "Intelligent, Variable Resolution Laser Scanner for the Space Vision System,"
[6] P. Maigne, J.-A. Beraldin, T.M. Vanderwel, M. Buchanan and D. Landheer, "An InGaAs/InP Position Sensing Photodetector," IEEE J. of Quant. Electr. 26(5), 820–823 (1990).
[7] M. Rioux, J.-A. Beraldin, M.S. O'Sullivan and L. Cournoyer, "Eye-Safe Laser Scanner for Range Imaging," Appl. Opt. 30(16), 2219–2223 (1991).
[8] J.-A. Beraldin, S.F. El-Hakim and L. Cournoyer, "Practical Range Camera Calibration," In Press, SPIE Vol. 2067.