To link to this article: DOI: 10.1016/j.compind.2013.03.018
http://dx.doi.org/10.1016/j.compind.2013.03.018

To cite this version: Fillatreau, Philippe; Fourquet, Jean-Yves; Le Bolloc'h, Romain; Cailhol, Simon; Datas, Adrien; Puel, Bernard. Using virtual reality and 3D industrial numerical models for immersive interactive checklists. (2013) Computers in Industry, vol. 64 (no. 9), pp. 1253-1262. ISSN 0166-3615.

This is an author-deposited version published in: http://oatao.univ-toulouse.fr/ (Eprints ID: 10646). OATAO is an open access repository that collects the work of Toulouse researchers and makes it freely available over the web where possible. Any correspondence concerning this service should be sent to the repository.
Using virtual reality and 3D industrial numerical models for immersive interactive checklists

P. Fillatreau a,*, J.-Y. Fourquet a, R. Le Bolloc'h a, S. Cailhol a, A. Datas a, B. Puel b

a LGP-ENIT, INPT, Université de Toulouse, 47 Avenue d'Azereix, BP 1629, 65016 Tarbes Cedex, France
b Alstom Transport, France
1. Introduction
The methods and tools for 3D imaging and interactive simulations in virtual reality (VR) have now penetrated industrial activities. They are used in particular for project (e.g. design) reviews, or for the implementation of Computer Aided Design (CAD) models in manufacturing industries [34,16,19]. This is particularly the case in activities connected to the aerospace and automotive industries.
In parallel, the means dedicated to the verification of procedures linked to quality and control processes, design, engineering and manufacturing appeal to the logic of checklists, in a digital format (spreadsheet), but stored essentially in text form. Various checklists are used in Product Lifecycle Management (PLM) concerning requirements, design, maintenance, etc. They are mainly characterized by the need to share information efficiently in collaborative work and by a hierarchy of questions on a textual basis. They require in succession navigation in the checklist, inspection of the item and/or experimentation with the functionality, and an update of the current information concerning the item. While there is an increasingly strong need for integrated numerical tools that may be used in all stages of industrial Product Lifecycle Management (PLM), the state-of-the-art studies claiming a VR-PLM integration in the literature always focus on a particular step or aspect of the PLM. As an answer to these needs, we propose a generic tool for immersive checklist-based project reviews that applies to all steps of the PLM. It links the virtual experiment of 3D visualization and navigation, the manipulation of 3D digital models and the interactive update of an industrial checklist describing a procedure supplied by an industrial partner. So, for each of the questions listed in the checklist, the following possibilities are given to the user:
- to share with other operators or industrial sites the immersive visualization of the 3D digital model of the system to be checked;
- to interact intuitively with the 3D model in order to perform the tests listed in the checklist virtually;
- to save simulations or snapshots related to each step of the procedures listed in the checklist;
- to update the checklist (e.g. attach text comments or 3D multimedia simulations to the corresponding tests listed in the checklist).
These actions are made possible by the implementation of a virtual reality platform composed of a 3D immersive display, tools for motion capture, a haptic arm [13], a data glove [26] and a software layer allowing the implementation and running of interactive real-time scenarios, which enables the association of manipulation properties with the 3D objects stemming from a CAD model of an industrial system. The contribution is three-fold:

1. a generic tool for virtual experiment-based checklists that applies to all steps of the PLM.
2. the integration of various VR tools and concepts in an industrial framework.
3. a new paradigm of 3D immersive protocol for design, quality, control, engineering, etc. of industrial parts or systems, based on original gesture recognition.

ABSTRACT
At the different stages of the PLM, companies develop numerous checklist-based procedures involving prototype inspection and testing. Besides, techniques from CAD, 3D imaging, animation and virtual reality now form a mature set of tools for industrial applications. The work presented in this article develops a unique framework for immersive checklist-based project reviews that applies to all steps of the PLM. It combines immersive navigation in the checklist, virtual experiments when needed and multimedia update of the checklist. It provides a generic tool, independent of the considered checklist, relies on the integration of various VR tools and concepts, in a modular way, and uses an original gesture recognition.
Feasibility experiments are presented, validating the benefits of the approach.

Keywords: Virtual reality; 3D industrial inspection; Quality and control processes; 3D visualization; Sensorimotor interfaces

* Corresponding author. Tel.: +33 5 62 44 50 80.
E-mail addresses: Philippe.Fillatreau@enit.fr (P. Fillatreau), fourquet@enit.fr (J.-Y. Fourquet), simon.cailhol@enit.fr (S. Cailhol).
In this paper, we first present a state of the art of virtual tools for industry, and the industrial context of this work. Then, we present our novel and generic VR review paradigm, focusing on VR tools for checklists and on a 3D immersive protocol based on hand posture and position recognition. Implementation details and experimental results are provided, and the paper ends with the conclusions and perspectives of this work.
2. State of the art of VR tools for industry

2.1. VR applications in general
Virtual Reality (VR), as defined in [12], is "a scientific and technical domain exploiting the possibilities of computers and behavioural interfaces to simulate in a virtual world the behaviour of 3D entities, which interact in real time with each other and one or more users in pseudo-natural immersion through sensorimotor channels". The techniques of immersion in virtual reality call for different means corresponding to the user's senses. They connect the implementation of sensorimotor interfaces to the processing of the virtual world in which the user is immersed. Thus, the operator receives information about the digital scene (3D stereo immersion using a big screen, collision force feedback through a haptic device to sense the contact with virtual objects in the environment, etc.) and he can act on this scene thanks to the implementation of driving interfaces (haptic arm, motion capture system allowing to pilot the point of view of the user or the animation of a human avatar, data glove). VR has been developing over the past few decades and has especially gained tremendous attention and developments in the last ten years. This technology has penetrated into the fields of games, health, biology, military operations, education, learning and training, etc., and given birth to applications like tele-presence and tele-operation (see [17] for tele-presence for conferences or [35] for robot tele-operation), data visualization and exploration (see [11] for immersive data visualization to help decision making), or augmented reality (see [6] for a maintenance application in the field of aeronautics).
2.2. Virtual prototype versus physical prototype

Meanwhile, the emergence during these last decades of computing technologies, and more recently of tools for building digital mock-ups or 3D digitalization of existing objects, has made virtual prototyping a more and more common practice in many industrial sectors. At the development stage of products, industrial companies now prefer, when applicable, exploiting digital models rather than expensive real physical prototypes. Generally, virtual prototypes are used in the upstream design steps, and real prototypes in the downstream steps.
Using such digital prototypes poses a number of questions. Choosing the type of model to be built is a critical one. Virtual prototypes usually correspond to 3D CAD models. As the prototypes to be validated become more and more complex, the corresponding 3D models and involved geometries become increasingly sophisticated. Up-to-date CAD tools also allow associating useful properties with the geometric objects involved, e.g.:

- information on which material the object is made of (often in relation to appearance through the implicit use of a textures library);
- mechanical characteristics such as modulus of rigidity, Young's modulus, Poisson's ratio, or friction coefficient, to be used by materials resistance or finite elements computation tools;
- appearance properties (transparency or opacity, texture);
- other physical properties, like qualifying a given geometric object as an obstacle or not in 3D simulations where the objects are in motion.

Model structure is also a key question; an appropriate structure can be designed for specific tasks; a relevant organization of the model geometries into a hierarchy can be used to automatically generate nomenclatures. Last but not least, format conversions and compatibilities are another central question posed by virtual prototyping in industry, as a virtual prototype is generally processed through different tools (no integrated tool over the complete Product Lifecycle Management exists) and data often have to be shared between different industrial sites.
Physical models are still necessary at some downstream steps (for example, the car industry uses sculpture clay models for designers to check the physical form of a car). But, when applicable, the use of virtual prototypes provides many advantages. Dépincé et al. [8] have listed the expected benefits of Virtual Manufacturing. From the product point of view, they emphasize the reduction of time-to-market and of the number of physical prototype models, and the improvement of quality. In the design phase, listed benefits include the possibility of simulating manufacturing alternatives, to optimize the design of products and processes for specific tasks (e.g. assembly) or evaluate various production scenarios; from the production point of view, the authors emphasize the reduction of material waste and cost of tooling, the improvement of confidence in the process, or lower manufacturing costs. Mousavi et al. [28] present a survey carried out among several international and domestic car companies of Malaysia about the benefits and barriers of using VR prototypes and systems. Emphasized benefits include reduction of rework, improvement of quality, cost savings, better client satisfaction, marketing effectiveness, and productivity, while highlighted barriers include the possible lack of trained people, the time needed to get proficient, software and hardware costs, and the lack of software and hardware standards. Thus, both studies define mainly trade-oriented qualitative impact factors for the use of VR techniques, and very few numerical indicators to measure this impact are proposed in the literature. More recent works [36] emphasize other advantages of the use of VR prototypes or techniques:

- digital models make it possible to automatically generate associated documentation (e.g. nomenclature);
- virtual prototypes allow implementing multi-site remote collaborative work (involving remote users and data sharing) (see e.g. [17] for a Mixed Reality teleconference application with several remotely located users in a shared virtual world with shared virtual objects).
2.3. VR-PLM integration applications

As more and more powerful Computer-Aided Engineering (CAE) tools arise, there is an increasingly strong need for integrated numerical tools that may be used in all the stages of the PLM.

In a rare review of VR applications in manufacturing process simulation, Mujber et al. [29] propose a classification of VR applications in industry into three groups: design (design and prototyping), operations management (planning, simulation and training) and manufacturing processes (machining, assembly, inspection). Although this demonstrates a wide range of industrial applications for VR, the use of VR is far from reaching all stages of the PLM (product requirements validation and acceptance of product, for example, are not really addressed by VR applications). As a matter of fact, if we look at studies claiming a VR-PLM integration in the literature, they always focus on a particular stage or aspect, and we can distinguish works dealing with:
- assembly or disassembly applications: Bordegoni et al. [3] involve two 6-DOF interfaces (a haptic device and a Wiimote control). Li et al. [21] propose VR tools for disassembly and maintenance training, the disassembly action sequence being automatically generated from the assembled geometries describing the system to be operated. Loock and Schomer [22] focus on the modelling of rigid and deformable objects for assembly simulations.
- ergonomics analysis: Moreau et al. [27] deal with the design of a haptic device to study the ergonomics of a push button, [32] with the modelling and animation of virtual hands for the manipulation of 3D objects using a haptic interface. Di Gironimo et al. [9] present an innovative methodology for assessing the usability of a product, focusing on the definition of a synthetic usability index. Marc et al. [24] show how virtual reality can be a tool for a better consideration of the usability of a product, highlighting the contributions of engineering and psychoergonomic approaches to integrate health and safety from the design stage on. Chedmail et al. [5] present a multi-agent architecture to validate a path planner for a manikin or a manipulator robot for access and visibility tasks, and to allow the user to take into account (among other things) ergonomic constraints for the manikin, or joints and mechanical limits for the robot. More generally, ergonomic requirements are often expressed in a checklist (see for example [14] or [1]) and can take advantage of VR techniques.
- product modelling: Bordegoni et al. [2] deal with the development of a VR application and a haptic interface intended for designers or sculptors; the development is based on the observation of professional designers while modelling, to reproduce in the virtual world the modelling techniques used in the real world. Bourdot et al. [4] and Picon [31] deal with the integration of classical CAD modelling techniques in VR, enriched by haptic feedback. Meyrueis et al. [25] define and use the D3 procedure for immersive design: (1) Draw (selection of surfaces to be modified), (2) Deform (definition and making of modifications), (3) Design (transfer of modifications made to the CAD models). Raposo et al. [33] integrate VR and CAD in engineering (maintenance) projects for large complex petroleum engineering projects. Further works propose approaches for coupling VR and PLM systems by reducing the authoring burden, like Noon et al. [30] (rapid design and assessment of large vehicles) or Makris et al. [23] (involving a semantic-based taxonomy for immersive product design).
- simulation: Dangelmaier et al. [7] use VR and augmented reality techniques for manufacturing processes, and propose a generic solution for user interface and data structure, and Lee et al. [20] involve augmented reality for workshop design and update simulations.
- decision-making help: Dijkstra and Timmermans [10] develop a research tool for conjoint analysis, i.e. analysis of parameters having an influence on a consumer's decision to buy a product. Eddy and Lewis [11] propose a cloud representation of data for decision-making help. Kieferl et al. [18] use Virtual and Augmented Reality for project planning in architecture or urbanism.
2.4. VR-PLM: synthesis and challenges

VR is now used in many industrial applications and cuts costs during the implementation of a PLM. The main challenges are a result of the following drawbacks:

- implementation of a CAE simulation is a time-consuming process;
- VR systems used in industry focus on one or a few particular steps of a development cycle (e.g. design review), and may be used in the framework of the corresponding product development project review. There is no VR tool in the current state of the art which enables us to deal globally with the different steps of the PLM and the corresponding project reviews.

The work presented here can bring an improvement with respect to these commonly encountered drawbacks. It aims at proposing an integrated multimodal VR project review tool. The benefits of this tool in the PLM will be threefold:

- first, it constitutes a generic project review tool which may be used for any project review in the PLM: only the virtual experiments differ, and the project review script we developed only needs to involve the right experiment-oriented simulation module. This will allow a drastic reduction of development times for project review simulations;
- second, its multimodality enables us to produce multimedia documents associated with the whole PLM (not only project reviews but also virtual prototypes, or scenarios and reports for product or system operation simulation);
- third, the proposed project review paradigm will offer rich interaction and immersion means, as it may use any of the VR simulation modules integrated in our platform.
3. Industrial context

The work presented here has been developed in partnership with a transportation company. In this domain, products or systems are complex and arise from the integration of various components, which involve different competences: mechanical design, sizing of electric actuators and converters, adaptation of air/water cooling systems, etc.

In order to develop such complex products or systems, a global scheme indicating the different steps of the PLM, together with the associated gate reviews, has been developed, allowing for a step-by-step checking. The gate reviews are usually described as a list of questions to be answered; some of them may require performing experiments (for example, checking the presence of a specific part or system at the design stage, or verifying the operation of a system to simulate operations at assembly or maintenance stages). The PLM approach intends to provide, among other things, traceability between PLM steps and the ability to detect errors as early as possible, and therefore to anticipate potential errors before future steps are reached. For example, at a given system design phase, an exhaustive verification of the design should also provide a way to validate future operations of the system, e.g. assembly or maintenance. The numerous questions to be answered and tests to be performed during a given project gate review are usually gathered in checklists. These checklists are generally made up of worksheets that should be updated textually by integrating the results obtained for each question set or test to be performed.

To perform these tests using traditional methods, industry needs to build various physical prototypes of the parts or systems to be produced, and a physical manipulation of the real components is often required. Review checklists then allow the exhaustive tests needed to validate any stage of the PLM, from the requirements, to the design and compatibility of system components, to the final assembly, operation and maintenance. Note that the way the experiments are performed on the real prototype is generally not memorized or linked to the updated checklist.

Now, CAD models are progressively integrated into PLM software suites. Industrial companies aim at running as many of the project review checklists as possible using virtual numerical models of the parts or systems and their components instead of a real prototype, in order to optimize development costs. Fig. 1, inspired by Alstom Transport's product development philosophy, shows a product or system development cycle. It presents the main development steps and the associated gate reviews (red diamond-shaped boxes at the passage from a given project step to the following one): the SGR (Specification Gate Review) is the requirements review; the PGR (Preliminary Gate Review) is the preliminary design review; the CGR (Critical Gate Review) is the detailed design review; the FEI (First Equipment Inspection) is the prototype review; and the IQA (Initial Quality Approval) is the industrialization review. Fig. 1 shows how industrial companies intend to use virtual prototypes to check as many steps as possible. Only the steps that cannot be performed without a physical prototype should involve a real prototype.

However, many project reviews require experimentation (i.e. manipulation or operation of parts or systems), and building and running the corresponding simulations remains a difficult task, as the state of the art presented in Section 2 shows. It requires simultaneously visualizing and manipulating the corresponding 3D components or systems in a realistic and intuitive way. The increasingly powerful immersive capabilities of VR tools open the way to new possibilities of performing checklist tests in VR, by using more and more off-the-shelf or specially tailored interactive means on virtual prototypes.
4. A generic VR project review paradigm

Our contribution relies on the digital model of an electrical power converter. The virtual world is thus constituted by a workshop in which the operator can act on the various 3D digital components of the converter designed by the engineering department of the company. This 3D environment is placed in the background of a given checklist usually used by the company for a project review (e.g. a product design review). Thus, every item of this checklist is dynamically connected to a scene in which a certain number of actions are made available to the user. The user can move in the scene thanks to the motion capture of markers he is wearing. He can also adapt his point of view of the scene thanks to the motion capture of his 3D glasses. For object manipulation, the chosen mode of interaction rests on the simultaneous use of a data glove, which enables us to measure the position of the various fingers of the operator's hand, and of the optical motion capture of a frame attached to the operator's hand equipped with markers. These two pieces of information are used to navigate in the scene, seize objects and visualize the avatar of the user's hand, but also to navigate through the interactive menu of the checklist in order to update it (e.g. attach multimedia information about the tests performed). The simultaneous detection of the postures of the hand (relative positioning of fingers) and of the hand trajectories allows a large variety of intuitive means of interaction and constitutes an original scientific contribution of this work.

To meet our objectives, we integrated the Alstom checklist (encompassing all the steps and gate reviews related to a given project) into our VR platform. We created a VR checklist paradigm which aims at running any gate review of an industrial project, using all the basic module scripts already developed in our VR platform, e.g. interactive planning for systems assembly or disassembly [19], parts manipulation using haptics or cyberglove devices, virtual visit or inspection using motion capture or a haptic device, human movement generation, motion capture and analysis [15], etc.
4.1. Integration of various VR modules and concepts in an industrial framework

A VR environment must provide the operator with the ability to execute a multiplicity of scenarios by using 3D and multimedia contents through sensorimotor devices. In order to realize the global integration of the multimodal checklist VR module and of all other modules and devices on our VR platform, we have designed the modular architecture presented in Fig. 2. Thus, the project review simulation environment is based on:

- devices: a set of sensorimotor interfaces and their controllers (blue boxes in Fig. 2): data glove, IR motion capture, flystick, haptic arm, stereo visualization.
- modules: a set of VR modules (grey boxes in the green box) corresponding to independent scripts. "Checklist" enables us to open a review checklist file, to explore all corresponding procedures and to integrate answers to the corresponding questions. "Motion Capture Manipulation" enables us to use the data glove and the motion capture systems to manipulate objects. "Human Movement Analysis" allows for human movement generation, motion capture and analysis. "Motion Capture Navigation" enables us to navigate (e.g. visit or explore the scene or inspect objects) in the virtual scene using the motion capture system. "Haptic Manipulation" and "Haptic Navigation" correspond respectively to object manipulation and navigation in the virtual scene using the haptic arm.
- data flows: starting the simulation script triggers all VR modules, which keep running in parallel. All device controllers are also started in parallel, so each device integrated in the platform is also ready for use. Fig. 2 also describes the data exchanged between all involved modules, and the direction in which the data is exchanged. The black arrows describe the data exchange between each device and its controller; the blue arrows describe the data exchanges between the device controllers and the VR modules. The data exchanged between the devices and their controllers are basically measures and/or control signals; the data exchanges between the device controllers and the VR modules are linked to the nature of the devices and the interaction and immersion possibilities they offer. For example:

- the "Checklist" VR module specifically uses a combination of the 3D pose (position and orientation) of a marker-based target attached to the data glove and the hand configuration (provided by the 22 sensors of the data glove), according to the hand posture and pose paradigm presented in Section 4.3. These data are respectively provided by the motion capture system and the data glove.
- the data exchange between the haptic device and the "Haptic Manipulation" and "Haptic Navigation" VR modules is bidirectional, and the data exchanged consist mainly of the position and orientation of the end-effector of the haptic arm.

Fig. 2 also shows that the stereo visualization and flystick systems are used by all VR modules. The stereo visualization module of course uses the 3D geometries of the objects of the scene, but also the localization of the head supplied by the motion capture system, in order to control the point of view in the scene (see Section 6 for further details).
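The start-up behaviour just described, in which every VR module and every device controller is launched in parallel and exchanges measures or control signals, can be sketched as follows. This is a minimal illustration using Python threads and queues: the module and device names come from Fig. 2, but the concurrency scheme is an assumption made for illustration, not the actual Virtools mechanism.

```python
import queue
import threading


def device_controller(name, out_q, stop):
    # Would poll the physical device; here it just emits placeholder measures.
    while not stop.is_set():
        out_q.put((name, "measure"))
        stop.wait(0.01)


def vr_module(name, in_q, stop):
    # Consumes device data (e.g. glove angles, target poses) while running.
    while not stop.is_set():
        try:
            device, data = in_q.get(timeout=0.01)
        except queue.Empty:
            continue


def start_simulation():
    """Start all device controllers and VR modules in parallel (cf. Fig. 2)."""
    stop = threading.Event()
    q = queue.Queue()
    threads = [
        threading.Thread(target=device_controller, args=(d, q, stop))
        for d in ("data glove", "IR motion capture", "haptic arm")
    ] + [
        threading.Thread(target=vr_module, args=(m, q, stop))
        for m in ("Checklist", "Motion Capture Manipulation", "Haptic Navigation")
    ]
    for t in threads:
        t.start()
    return stop, threads
```

In this sketch a single shared queue stands in for the per-device data flows (black and blue arrows) of Fig. 2; a faithful version would give each controller-module pair its own channel.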
4.2. A generic VR tool for checklists

What we have mainly developed in this work is a generic tool for project reviews that has been tested by running the associated checklists and virtual experiments through all the steps of the PLM. Fig. 3 presents the advantages of this tool and its ability to bring improvements to all steps of the PLM.

VR simulations of project reviews use the VR modules and devices integrated in our VR platform. For each step of the PLM, our tool allows the users:

- to run a number of VR modules and devices to perform virtual tests and experiments associated with this step;
- to generate multimedia documentation associated with the corresponding review.

To be more precise, this tool allows the user to open the checklist, explore the whole set of questions, perform the virtual tests, and update the checklist by attaching multimedia information such as: text information to answer a question (by using a standard computer keyboard), a screenshot of a view of the system or part to be controlled (to prove the presence of some desired feature, for example), or a video recording of a virtual test (by using the standard possibilities offered by Virtools). The attachment of multimedia information to the checklist file is made available to the user through a specific menu (as shown in video 2, Fig. 11, (2:13)).

The implementation of the project review itself and the way devices or VR modules are run are the same for each PLM step. The only components related to each particular PLM step are the answers given to the questions and the virtual experiments performed. Scripts allowing to run the given VR modules are modular and fully reusable in any virtual experiment performed. This feature allows us to obtain short development times for project review simulations, and thus gives an efficient solution to one of the main limitations of the current CAE state of the art presented in Section 2.4.

The expected added value lies in the economic (saving of cost and time) and environmental (virtual versus real prototype) advantages of the use of virtual prototyping over the whole PLM, as well as quality management benefits (problems can be better anticipated and errors are detected earlier in the PLM), leading to more preventive and fewer corrective actions.
4.3. 3D immersive protocol based on hand posture and position recognition

The development of the Checklist module has given rise to a new 3D interactive protocol to explore the checklist architecture by using hand gestures. The principle relies on the joint use of 2 types of information:

- detection of hand postures, measured by the data glove, among a predefined set of significant hand postures. Fig. 4 shows the alphabet of postures we used in this work for the checklist.
- detection of hand movements, using the IR motion capture system and a target equipped with retroreflective markers.

The combination of those 2 pieces of information allows recognizing actual hand postures and gestures in order to explore the checklist and to perform virtual experiments.

The checklist data is represented as a tree structure illustrated in Fig. 5. In this example, level 1 defines the gate review type (requirements, design, etc.) and level 2 defines the main technical domains (electrical, mechanical, etc.). The following levels identify subdomains, if any (the number of levels may vary depending on the checklist considered), and the tree leaves correspond to the questions and tests to be performed. Fig. 5 also presents the hand postures or movements to be used to explore the tree: hand posture 1 moves from level n to level n+1, posture 2 moves from level n to level n-1, hand posture 3 associated with motion capture allows selecting one item, and hand posture 4 associated with motion capture allows vertical scrolling of a set of items in order to reach a non-displayed item. These postures enable the user to navigate and select items inside the tree structure of the checklist. Finally, postures 1 and 5 allow the user to respectively enter and quit the checklist exploration.

Fig. 5. Checklist tree structure.
Fig. 3. PLM conception cycle with VR tools.
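The tree exploration driven by this posture alphabet can be modelled as a small state machine. In the sketch below, the class and node labels are hypothetical, but the posture-to-action mapping (1: enter level n+1, 2: back to level n-1, 3: select, 4: scroll, 5: quit) follows the description above.

```python
class Node:
    """One checklist tree node (gate review, domain, subdomain or question)."""
    def __init__(self, label, children=()):
        self.label = label
        self.children = list(children)


class ChecklistNavigator:
    """Maps the five recognized hand postures to checklist navigation actions."""
    def __init__(self, root):
        self.path = [root]   # branch from the root down to the current level
        self.cursor = 0      # highlighted item at the current level

    @property
    def highlighted(self):
        return self.path[-1].children[self.cursor]

    def on_posture(self, posture, scroll=0):
        if posture == 1 and self.highlighted.children:   # level n -> n+1
            self.path.append(self.highlighted)
            self.cursor = 0
        elif posture == 2 and len(self.path) > 1:        # level n -> n-1
            self.path.pop()
            self.cursor = 0
        elif posture == 3:                               # select item (with mocap)
            return self.highlighted
        elif posture == 4:                               # scroll items (with mocap)
            n = len(self.path[-1].children)
            self.cursor = (self.cursor + scroll) % n
        elif posture == 5:                               # quit exploration
            self.path, self.cursor = self.path[:1], 0
        return None
```

In the real protocol, posture 4 is combined with the captured hand motion to produce the scroll offset; here it is passed in as a plain integer.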
Fig. 6 presents the complete algorithm of the Checklist module.

Posture recognition is based on a metric related to the angles measured on the 22 sensors provided by the data glove. At the beginning of each checklist simulation, a vector of sensor values for a neutral posture $P_n$ is recorded, and the user is asked to perform a 30 s initialization procedure. This procedure records vectors of sensor values associated with postures 1-5. A signed and normalized distance $P_{ij}$ to the neutral value is computed for each sensor $j$ and each reference posture $P_i$:

$$P_{ij} = \frac{v_{ij} - v_{nj}}{UB_j - LB_j}$$

where $j$ is the sensor index, $v_{ij}$ is the measured value for posture $i$ and sensor $j$, $v_{nj}$ is the measured value on sensor $j$ for the neutral posture, and $UB_j$ and $LB_j$ are respectively the upper and lower bounds on sensor $j$ values. Then, the posture recognition module compares these distances to the distance between the current and neutral posture $P_n$:

$$d_i = \sqrt{\sum_{j=1}^{22} \alpha_j^2 \, (P_{ij})^2}$$

where $\alpha_j$ is the tuned weight that permits efficient identification of postures. A current posture $P$ is then recognized as one of the 5 specific postures $P_i$ of Fig. 4 when the distance $d_i$ is lower than a predefined threshold $\varepsilon$. This simple tuning method is efficient in the considered framework and can be compared with results in the literature (e.g. [26]).
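The calibration-and-matching scheme above can be sketched in a few lines. The reading below, in which the current posture's normalized sensor offsets are compared with each calibrated posture's offsets under the weights alpha_j, and the nearest posture is accepted when its distance falls below epsilon, is one plausible interpretation of the scheme; the sensor bounds, uniform weights and threshold value are illustrative, not the tuned values used on the platform.

```python
import math

N_SENSORS = 22  # CyberGlove II joint-angle sensors


def normalize(v, v_neutral, lb, ub):
    """Signed per-sensor offset from the neutral posture,
    scaled by the sensor range [LB_j, UB_j]."""
    return [(vj - nj) / (u - l) for vj, nj, l, u in zip(v, v_neutral, lb, ub)]


def recognize(v_current, references, v_neutral, lb, ub, weights, eps=0.2):
    """Return the index of the closest calibrated posture, or None
    if no posture lies within the threshold eps."""
    c = normalize(v_current, v_neutral, lb, ub)
    best_i, best_d = None, float("inf")
    for i, v_ref in enumerate(references):
        p = normalize(v_ref, v_neutral, lb, ub)
        # Weighted Euclidean distance between current and reference offsets.
        d = math.sqrt(sum((w * (cj - pj)) ** 2
                          for w, cj, pj in zip(weights, c, p)))
        if d < best_d:
            best_i, best_d = i, d
    return best_i if best_d < eps else None
```

With per-sensor weights all set to 1, this reduces to a plain nearest-neighbour test in the normalized sensor space; tuning the weights emphasizes the joints that best discriminate the five postures.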
In fact, on the one hand, the IR-captured situation of the frame attached to the hand enables the operator to navigate geometrically in the virtual scene and to navigate in the checklist menu; on the other hand, the data glove enables the operator to select and to navigate hierarchically in the menu, but also, when the virtual experiment is running, to perform specific operations: catching or pushing objects in the virtual experiment. The avatar of the hand appears in the virtual scene in order to show relative positioning in space but also to show actions. As a conclusion, the joint use of motion capture and the data glove for hand posture and position recognition allows the interacting user to be autonomous and effective in capitalizing data during the simulation in the VR environment.

Obviously, the navigation and selection protocol could be adapted depending on the sensorimotor interfaces without significant changes to the global architecture of the VR environment.
5. Implementation

The above approach has been implemented by using specific devices and software in order to explore feasibility and to quantify the benefits of this new project review environment. Of course, other technological choices remain consistent with the approach.

5.1. The VR platform hardware

In this section, we present the VR platform of the laboratory (see Fig. 7). It consists of the following elements:

- A passive stereovision 3D immersive visualization system composed of the following items: two ProjectionDesign F2 video projectors fitted with two optical filters allowing for a circular polarization of the light, a 3 m high and 2.25 m wide stereo screen, and several pairs of 3D glasses adapted to passive visualization thanks to equally circularly polarized glasses.
- A "Virtuose 6D35-45" haptic arm (see Fig. 8). The haptic system is composed of two elements: a "Virtuose 6D35-45" haptic arm and a control PC. The haptic arm is a sensorimotor bidirectional interface which allows manipulation by the user and force feedback.
- A "CyberGlove II" wireless motion capture data glove (see Fig. 8), which uses proprietary resistive bend-sensing technology to provide in real time 22 joint-angle measurements (thus providing real-time measurements of the hand posture configurations).
- An ARTrack1 Infrared (IR) motion capture system (see Figs. 7 and 8). The system, designed by the ART company, is composed of four IR cameras, six sets of targets, each bearing a set of passive retroreflective markers set up in a distinctive geometric configuration, and a control PC. It provides in real time the 3D position and orientation of any target. The operator can also use a controller named flystick, allowing him to communicate orders corresponding to actions on the flystick buttons, including a two-analog-axis control, and equipped with markers for localization by the motion capture system.

Fig. 7. VR platform.
Fig. 6. Checklist algorithm.
5.2. The VR platform software

The main software components are:

5.3. Running the Checklist module

The Checklist module opens the Excel file, imports the data and builds the corresponding tree architecture in the Virtools environment. Then, the different levels of the tree are displayed as interactive menus and are accessible, together with their content, through the gesture recognition system. When a question is selected, the associated 3D virtual environment is launched and the operator executes the virtual experiment with a wide range of scenarios. During the virtual experiment, the operator can start video or snapshot recording. When a question has been checked, an answer is stored in the Excel spreadsheet and linked dynamically to the multimedia data related to the virtual experiment (videos, snapshots).
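The data flow just described, importing the spreadsheet rows, building the Fig. 5 tree, and linking each answered question to its recorded media, can be sketched as below. The flat row format and field names are illustrative assumptions; the actual module works on an Excel file inside the Virtools environment.

```python
def build_tree(rows):
    """Build the checklist tree from flat rows.

    Each row is a sequence like (gate_review, domain, ..., question);
    every level before the last becomes a nested dict, and each question
    becomes a leaf holding the answer and its attached media files."""
    tree = {}
    for *levels, question in rows:
        node = tree
        for level in levels:
            node = node.setdefault(level, {})
        node[question] = {"answer": None, "media": []}
    return tree


def answer_question(tree, path, answer, media=()):
    """Store an answer at the leaf designated by `path` and link the
    multimedia files (videos, snapshots) recorded during the experiment."""
    node = tree
    for key in path[:-1]:
        node = node[key]
    leaf = node[path[-1]]
    leaf["answer"] = answer
    leaf["media"].extend(media)
    return leaf
```

The dynamic link back to the spreadsheet would then amount to writing the answer cell and the media file paths into the corresponding worksheet row.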
6. Experimentalresults
Thechecklist tool hasbeenimplemented and tested on the laboratory VR platform. The videos in this section show the experiments.
The first video (Fig. 9) illustrates the use of the gesture recognitionsystemforchecklistnavigation.First,thehandposture calibrationprocedurethatbuildsthealphabetisshown.Next,each navigation functionality in the checklist tree structure is used (posture1tovalidateanitemoraquestion,posture2tostepback inthechecklisttreestructureandpostures3and4withmotion capture to respectively select and scroll items). In this video, posture1isalsousedtoopenthechecklist,andposture5toclose it.
In the second video (Fig. 10), thechecklist is performed to answertwotechnicalquestions.Inthisvideo,partmanipulationis done thanksto cyberglove and motion capture and thehand’s avataris displayed in thescene. Thefirst technical questionis selectedandansweredbyusingasnapshotofaninspectedpart.A newtechnicalquestionisthenselectedandansweredbyrecording a video showing part assembly thanks to the cyberglove and motioncapturemanipulation.
Thelastvideo(Fig.11)illustratestheuseofchecklistandhaptic manipulation modules together. First, a technical question is selectedinthechecklist,then,avideoofthetestisrecorded.To completethemultimediadocumentationofthetechnicalquestion, acommentistyped.Thetechnicalquestionissetas‘‘seen’’andwe canfurtherseethechecklistspreadsheetupdated.
Preliminaryfeasibilityexperimentshavebeenconductedwith abouttenengineers.Theyprovetheusefulnessoftheapproachand thequickadaptationoftheusertothenavigationandselection scheme. In particular, the selected procedure for calibration is
Fig.9.Checklistnavigation[video1.mp4]. Fig.8.Sensorimotorinterfaces:hapticarmontheleft,headandhandARTmarkers,
cyberglove.
CAD: The virtual environmentsmanipulated and explored arebasedon3DCADmodelsgeneratedwithstandard CADtools.
Virtools: Allmodulespresentedinthiswork(projectreviewand VRmodules)havebeenimplementedasscriptsusing theVirtools1environmentforinteractivesimulations
development.Thethreelevelsofdevelopment provid-edbyVirtoolshavebeenused:
-Level1: Topgraphicallevel(dataflowtypegraphical scripts, involving behavioural processing blockscalledBuildingBlocks(BB)). -Level2: VSL(VirtoolsScriptingLanguage)thatallows
theprogrammertobuildspeciallytailored BBs.
-Level3:Clanguagelevel(SDK).Thislevelhasbeen used notably for computations needing moreadvancedmemorymanagement. VirtoolsisusedtoendowCADobjectswithproperties usedininteractivesimulations.Inparticular,Virtools associatesbehaviourproperties–orattributes–with the3Dobjectsmanipulatedsuchas:beinganobstacle ornot,beingmobileornot.ItalsoendowsCADobjects withphysicalproperties(defininggravity,deformation ofobjects)andbuildshierarchiesbetweenobjects. Microsoftofficesuite: Thechecklisthasbeenwrittenasanexcel
worksheet.Thisformatservesas input/ outputstorageformat.
successful and is not really sensitive to user change. It takes between15and30sdependingontheskilloftheuser.Then,after this calibration phase, the engineers succeeded in naturally moving around inside the checklist and selecting items in the menu.Thehandposturerecognitionisnotsensitivetouserchange thanks to the initialization procedure at the beginning of simulations. Concerning robustness issues when facing a large numberofusers,itcouldbeinterestingtostudythereductionof thesizeoftheposturealphabet.Asmallerposturealphabetcanbe obtainedbyusingmoremotioncapture(forexample,wecoulduse horizontalmovementstochangemenulevelsinthechecklist).
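To make the Checklist module's load/answer cycle described in Section 5.3 concrete, here is a minimal sketch: flat spreadsheet rows (one outline level and one text per row) are built into a tree, and a checked question is updated with its answer plus dynamic links to the recorded media. The row layout, class and function names are assumptions for illustration; the actual module is a Virtools script operating on the Excel worksheet.

```python
# Illustrative sketch (assumed data layout, not the Virtools implementation):
# build a checklist tree from outline-numbered spreadsheet rows, then store
# an answer together with links to the multimedia records of the experiment.

class Node:
    def __init__(self, text, level):
        self.text, self.level = text, level
        self.children = []
        self.answer, self.media = None, []

def build_tree(rows):
    """rows: [(level, text)] in spreadsheet order; returns the root node."""
    root = Node("checklist", 0)
    stack = [root]  # path from the root to the last inserted node
    for level, text in rows:
        while stack[-1].level >= level:   # climb back to the parent level
            stack.pop()
        node = Node(text, level)
        stack[-1].children.append(node)
        stack.append(node)
    return root

def answer_question(node, answer, media_files):
    """Mark a question as checked and link its videos/snapshots."""
    node.answer = answer
    node.media.extend(media_files)

rows = [(1, "Driver cab"),
        (2, "Is the handle reachable?"),
        (2, "Does the door close?"),
        (1, "Engine bay")]
root = build_tree(rows)
answer_question(root.children[0].children[0], "yes", ["video2.mp4"])
```

The stack-based construction mirrors how an outline-style spreadsheet encodes hierarchy: each row attaches to the most recent row of a strictly lower level, which is exactly what the interactive menus then traverse.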
Concerning performance metrics, according to Section 2.2, there are currently very few numerical indicators in the literature to measure the impact of this kind of approach, and the measure of this impact can only be done through mid-term evaluation by the (industrial) end users.
7. Conclusion and perspectives
A new paradigm for 3D immersive project review has been developed on a modular basis by using a VR environment. It has been tested successfully for feasibility on a real industrial product development case and is naturally adapted to a variety of situations involving checklists (e.g. assessment of a new maintenance procedure, upgrade of a product, reorganization of a factory, etc.).
This work does not claim to provide a commercial tool. It provides a proof of concept of a new way of using numerical data and interactive simulation in a widespread industrial procedure. This work explores how companies could significantly improve the traceability of project reviews, decisions and documentation in all the steps of the PLM.
The modularity of the architecture is such that other means of interaction are conceivable without drastic changes to the whole scheme. New interaction means are easily pluggable into the architecture as their efficiency increases: in particular, keyboard typing of answers will beneficially be replaced by speech recognition tools. This paves the way towards a hands-free collaborative checklist based on virtual experiments, with the ability to store numerical data relative to the questions and to the experimental procedure.
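The pluggable modularity evoked above can be pictured as each interaction means exposing the same narrow interface, so that swapping keyboard typing for speech recognition leaves the rest of the scheme untouched. The interface and class names below are purely illustrative assumptions, not the platform's actual architecture:

```python
# Illustrative sketch (hypothetical interface): every interaction means
# implements answer_input(); the checklist only talks to that interface,
# so replacing keyboard typing by a speech module is a one-line change.

class KeyboardInput:
    def answer_input(self, prompt):
        return input(prompt)  # the operator types the answer

class CannedInput:
    """Stand-in for, e.g., a future speech recognition module."""
    def __init__(self, answers):
        self.answers = list(answers)

    def answer_input(self, prompt):
        return self.answers.pop(0)  # returns the next prepared answer

class Checklist:
    def __init__(self, input_module):
        self.input_module = input_module  # anything with answer_input()

    def check(self, question):
        return question, self.input_module.answer_input(question + " ")

# Swapping interaction means: Checklist(KeyboardInput()) vs.
# Checklist(CannedInput([...])) — no other code changes.
checklist = Checklist(CannedInput(["yes"]))
```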
Preliminary feasibility experiments have been conducted with engineers. They prove the usefulness of the approach and the quick adaptation of the users to the navigation and selection scheme. The engineers who have tested the proposed tool predict improvements over the conventional methods. The assessment of those benefits would now require mid-term experiments by the industrial end users. Of course, the key to a wide and professional dissemination of this approach is the ergonomics of the entire 3D environment, from both software and hardware viewpoints.
Appendix A. Supplementary data

Supplementary data associated with this article can be found, in the online version, at http://dx.doi.org/10.1016/j.compind.2013.03.018.
References
[1] L. Bennes, F. Bazzaro, J.-C. Sagot, Virtual reality as a support tool for ergonomic-style convergence: multidisciplinary interaction design methodology and case study, in: Proceedings of the 2012 Virtual Reality International Conference, ACM, 2012, p. 24.
[2] M. Bordegoni, U. Cugini, Haptic modeling in the conceptual phases of product design, Virtual Reality 9 (2) (2006) 192–202.
[3] M. Bordegoni, U. Cugini, P. Belluco, M. Aliverti, Evaluation of a haptic-based interaction system for virtual manual assembly, in: Virtual and Mixed Reality, LNCS, vol. 5622, Springer, 2009, pp. 303–312.
[4] P. Bourdot, T. Convard, F. Picon, M. Ammi, D. Touraine, J. Vézien, VR–CAD integration: multimodal immersive interaction and advanced haptic paradigms for implicit edition of CAD models, Computer-Aided Design 42 (5) (2010) 445–461.
[5] P. Chedmail, D. Chablat, C. Le Roy, A distributed approach for access and visibility task with a manikin and a robot in a virtual reality environment, IEEE Transactions on Industrial Electronics 50 (4) (2003) 692–698.
[6] P. Chevalet, N. De Bonnefoy, Les apports de la réalité augmentée nomade pour la maintenance aéronautique, in: 4ème Conférence Francophone de MOdélisation et SIMulation (MOSIM'03), 2003.
[7] W. Dangelmaier, M. Fischer, J. Gausemeier, M. Grafe, C. Matysczok, B. Mueck, Virtual and augmented reality support for discrete manufacturing system simulation, Computers in Industry 56 (4) (2005) 371–383.
[8] P. Dépincé, D. Chablat, P.-O. Woelk, Virtual manufacturing: tools for improving design and production, 2007. arXiv preprint arXiv:0708.0495.
[9] G. Di Gironimo, G. Matrone, A. Tarallo, M. Trotta, A. Lanzotti, A virtual reality approach for usability assessment: case study on a wheelchair-mounted robot manipulator, Engineering with Computers (2012) 1–15.
[10] J. Dijkstra, H. Timmermans, Conjoint analysis and virtual reality, a review, in: 4th Design and Decision Support Systems in Architecture and Urban Planning Conference: DDSS, vol. 98, 1998.
[11] J. Eddy, K. Lewis, Visualization of multidimensional design and optimization data using cloud visualization, in: ASME Conference Proceedings 2002 (36223), 2002, pp. 899–908.
[12] P. Fuchs, Le traité de la réalité virtuelle, vol. 1: l'homme et l'environnement virtuel, Presses de l'Ecole des Mines, Paris, 2006.
[13] F. Gosselin, C. Andriot, J. Savall, J. Martín, Large workspace haptic devices for human-scale interaction: a survey, Haptics: Perception, Devices and Scenarios (2008) 523–528.
Fig. 10. Checklist and motion capture navigation and manipulation [video2.mp4].
[14] D. Gude, E. Branahl, P. Kawalek, A. Prions, W. Laurig, Evaluation of a virtual reality-based ergonomics tutorial, in: D. de Waard, K.A. Brookhuis, S.M. Sommer, W.B. Verwey (Eds.), Human Factors in the Age of Virtual Reality, Shaker Publishing, Maastricht, The Netherlands, 2003, pp. 117–128.
[15] V. Hue, J.-Y. Fourquet, P. Chiron, On realistic human motion simulation for virtual manipulation tasks, in: Proceedings of the 10th IEEE International Conference on Control, Automation, Robotics and Vision (ICARCV'08), Hanoi, Vietnam, 2008, pp. 167–172.
[16] S. Jayaram, U. Jayaram, Y. Kim, C. DeChenne, K. Lyons, C. Palmer, T. Mitsui, Industry case studies in the use of immersive virtual assembly, Virtual Reality 11 (4) (2007) 217–228.
[17] T. Kantonen, C. Woodward, N. Katz, Mixed reality in virtual world teleconferencing, in: Proceedings of the IEEE Conference on Virtual Reality, 2010, pp. 179–182.
[18] J. Kieferl, U. Wossner, Mixed realities: improving the planning process by using augmented and virtual reality, in: Proceedings of the 4th Conference on New Technologies in Landscape Architecture, 2003.
[19] N. Ladeveze, J. Fourquet, B. Puel, Interactive path planning for haptic assistance in assembly tasks, Computers & Graphics 34 (1) (2010) 17–25.
[20] J. Lee, S. Han, J. Yang, Construction of a computer-simulated mixed reality environment for virtual factory layout planning, Computers in Industry 62 (1) (2011) 86–98.
[21] J. Li, L. Khoo, S. Tor, Desktop virtual reality for maintenance training: an object oriented prototype system (V-REALISM), Computers in Industry 52 (2) (2003) 109–125.
[22] A. Loock, E. Schomer, A virtual environment for interactive assembly simulation: from rigid bodies to deformable cables, in: 5th World Multiconference on Systemics, Cybernetics and Informatics (SCI'01), vol. 3, 2001, pp. 325–332.
[23] S. Makris, L. Rentzos, G. Pintzos, D. Mavrikios, G. Chryssolouris, Semantic-based taxonomy for immersive product design using VR techniques, CIRP Annals: Manufacturing Technology 61 (1) (2012) 147–150.
[24] J. Marc, N. Belkacem, J. Marsot, Virtual reality: a design tool for enhanced consideration of usability, Safety Science 45 (5) (2007) 589–601.
[25] V. Meyrueis, A. Paljic, P. Fuchs, D3: an immersive aided design deformation method, in: Proceedings of the 16th ACM Symposium on Virtual Reality Software and Technology, ACM, 2009, pp. 179–182.
[26] M. Moni, A. Ali, HMM based hand gesture recognition: a review on techniques and approaches, in: 2nd IEEE International Conference on Computer Science and Information Technology (ICCSIT 2009), 2009, pp. 433–437.
[27] G. Moreau, P. Fuchs, P. Stergiopoulos, Applications of virtual reality in the manufacturing industry: from design review to ergonomic studies, Mécanique et Industries 5 (2) (2004) 171–179.
[28] M. Mousavi, F. Abdul Aziz, N. Ismail, Opportunities and constraints of virtual reality application in international and domestic car companies of Malaysia, in: 2012 UKSim 14th International Conference on Computer Modelling and Simulation (UKSim), IEEE, 2012, pp. 273–277.
[29] T. Mujber, T. Szecsi, M. Hashmi, Virtual reality applications in manufacturing process simulation, Journal of Materials Processing Technology 155 (2004) 1834–1838.
[30] C. Noon, R. Zhang, E. Winer, J. Oliver, B. Gilmore, J. Duncan, A system for rapid creation and assessment of conceptual large vehicle designs using immersive virtual reality, Computers in Industry 63 (5) (2012) 500–512.
[31] F. Picon, Interaction haptique pour la conception de formes en CAO immersive, Ph.D. Thesis, Université Paris XI, 2010.
[32] M. Pouliquen, A. Bernard, J. Marsot, L. Chodorge, Virtual hands and virtual reality multimodal platform to design safer industrial systems, Computers in Industry 58 (1) (2007) 46–56.
[33] A. Raposo, I. Santos, L. Soares, G. Wagner, E. Corseuil, M. Gattass, Environ: integrating VR and CAD in engineering projects, IEEE Computer Graphics and Applications 29 (6) (2009) 91–95.
[34] J. Sreng, A. Lécuyer, C. Mégard, C. Andriot, Using visual cues of contact to improve interactive manipulation of virtual objects in industrial assembly/maintenance simulations, IEEE Transactions on Visualization and Computer Graphics 12 (5) (2006) 1013–1020.
[35] A. Tarault, P. Bourdot, J. Vézien, SACARI: an immersive remote driving interface for autonomous vehicles, in: Computational Science – ICCS 2005, 2005, pp. 3–15.
[36] Z. Wang, Interactive project review of deformable parts through haptic interfaces in virtual reality, Ph.D. Thesis, Université de Rennes 1, 2011.
P. Fillatreau graduated from the ENSEEIHT Engineering School in 1988. He obtained his PhD in perception for mobile robotics in 1994 at the LAAS-CNRS Laboratory, Toulouse, France. His PhD works, funded by the Framatome Company, dealt with perception, 3D environment modelling and localization for an autonomous all-terrain mobile robot, in the framework of joint projects dealing with public safety or planetary exploration. In 1991–92 he spent one year at the Matra Marconi Space (now Astrium) Company, where he was involved in the Eureka-AMR project. In 1994–95, he was a post-doctoral visiting researcher at the Department of Artificial Intelligence of the University of Edinburgh, Scotland; funded by a Human Capital and Mobility fellowship of the European Community, his works dealt with non-polyhedral object recognition and model update for public safety mobile robots. In 1996, Philippe Fillatreau joined the Delta Technologies Sud-Ouest Company (Toulouse), where he was technical head of the computer vision activities. There, he led research projects in the fields of Computer Vision and Algorithm-Architecture Adequacy, aiming at integrating advanced image processing algorithms in smart cameras based on FPGAs for very high throughput applications (e.g. transport safety, online satellite image processing or quality control of industrial processes). In 2009, he joined the Ecole Nationale d'Ingénieurs de Tarbes as a researcher and lecturer. His current research interests are related to Virtual Reality, Computer Vision, Mobile Robotics and Algorithm-Architecture Adequacy.
J.-Y. Fourquet received the PhD degree in robotics from the University of Toulouse, Toulouse, France, in 1990. He is currently a Professor with the National School of Engineers in Tarbes (ENIT), University of Toulouse, where, since 2011, he has been the Head of the Laboratoire Génie de Production (LGP). From 1991 to 1992, he was an Associate Professor with the Computing and Systems Engineering Program, Federal University of Rio de Janeiro, Rio de Janeiro, Brazil. From 1992 to 2000, he was an Associate Professor with the University of Toulouse and a Researcher with the Laboratory for Analysis and Architecture of Systems. In 2000, he joined the LGP-ENIT, where he led the Intelligent Manufacturing Group from 2007 to 2009 and the Dynamic Decision and Interactions for Systems team from 2009 to 2011. He has participated in or led various projects at the industrial, regional, national or European levels. His research interests mainly concern mobile manipulation, human motion simulation, and interactive assembly scenarios.
R. Le Bolloc'h graduated from the ENIT Engineering School (University of Toulouse, France) in 2011. His works, led during the five-month internship ending his studies at the ENIT and a six-month follow-up research contract at the LGP-ENIT laboratory, dealt with interactive and immersive project reviews based on checklists, and were supervised by Philippe Fillatreau and Jean-Yves Fourquet. Romain continued his career developing various mobile GPS and mapping applications on Android OS, and has recently joined Accenture Technology Solutions in Singapore as an associate software engineer.
S. Cailhol graduated from the ENIT Engineering School (Université de Toulouse, France) in 2011. To finish his studies at the ENIT, he made his five-month internship in the LAPLACE (LAboratoire PLAsma et Conversion d'Énergie) Laboratory of the ENSEEIHT Engineering School in Toulouse, where he worked on a mathematical approach for pulse width modulation. In September 2011, he joined the DIDS (Décision et Interaction Dynamique pour les Systèmes) team in the LGP (Laboratoire de Génie de Production) Laboratory of the ENIT Engineering School as a PhD student, to work on interactive path planning on the VR platform of the laboratory under the supervision of Jean-Yves Fourquet and Philippe Fillatreau.
A. Datas graduated from the ENIT Engineering School (Université de Toulouse, France) in 2008. To finish his studies at the ENIT, he made his five-month internship at the IRCCyN (Institut de Recherche en Communications et Cybernétique de Nantes) in Nantes, where he worked on biped locomotion. In September 2008, he joined the DIDS (Décision et Interaction Dynamique pour les Systèmes) team in the LGP (Laboratoire de Génie de Production) Laboratory of the ENIT Engineering School as a PhD student, to work on human motion under the supervision of Pascale Chiron and Jean-Yves Fourquet.
B. Puel received the French diploma of Engineering from the National School of Engineers in Tarbes (ENIT), France, in 1986. He is currently a Senior Expert in Traction and Auxiliary Electronic Converter Hardware in the Alstom Company, working on railway equipment and associating the Tarbes, Charleroi and Milan Traction network sites. He created and animated different Mechanical Integration design offices in Tarbes, as Electronic Power module hardware for all Alstom Transport in 1997, and Traction and Auxiliary Converters in 2007. He has participated in or led various projects on internal Alstom developments, patents and the supervising of thesis works relating to Mechanical fittings & Structural Glue on Traction & Aux cubicles from 2005 to 2008, Virtual Reality about Design,