
Article reference:

LAZARD, D. S., et al. Evolution of non-speech sound memory in postlingual deafness: implications for cochlear implant rehabilitation. Neuropsychologia, 2011, vol. 49, no. 9, p. 2475-2482.
DOI: 10.1016/j.neuropsychologia.2011.04.025
PMID: 21557954
Available at: http://archive-ouverte.unige.ch/unige:25805


Evolution of non-speech sound memory in postlingual deafness: Implications for cochlear implant rehabilitation

D.S. Lazard a,b,∗, A.L. Giraud a, E. Truy c,d, H.J. Lee a,e

a Ecole Normale Supérieure, INSERM U960, Paris F-75005, France
b AP-HP, Hôpital Beaujon, Service d’ORL et Chirurgie Cervico-Faciale, Clichy F-92110, France
c CNRS, UMR 5020 Neurosciences Sensorielles, Comportement, Cognition and Université Claude Bernard Lyon 1, Lyon F-69366, France
d Hospices Civils de Lyon, Hôpital Edouard Herriot, Département d’ORL, de Chirurgie Cervico-Maxillo-Faciale et d’Audiophonologie, Lyon F-69003, France
e Hallym University College of Medicine, Department of Otorhinolaryngology–Head and Neck Surgery, 896, Pyeongchon-dong, Dongan-gu, Anyang-si, Gyeonggi-do 431-070, Republic of Korea

Article info

Article history: Received 19 January 2011; Received in revised form 23 March 2011; Accepted 19 April 2011; Available online 29 April 2011.

Keywords: Plasticity; Amygdala; Thalamus; Dorso-ventral; Outcome; Predictor

Abstract

Neurofunctional patterns assessed before or after cochlear implantation (CI) are informative markers of implantation outcome. Because phonological memory reorganization in post-lingual deafness is predictive of the outcome, we investigated, using a cross-sectional approach, whether memory of non-speech sounds (NSS) produced by animals or objects (i.e. non-human sounds) is also reorganized, and how this relates to speech perception after CI. We used an fMRI auditory imagery task in which sounds were evoked by pictures of noisy items for post-lingual deaf candidates for CI and for normal-hearing subjects. When deaf subjects imagined sounds, the left inferior frontal gyrus, the right posterior temporal gyrus and the right amygdala were less activated compared to controls. Activity levels in these regions decreased with duration of auditory deprivation, indicating declining NSS representations. Whole brain correlations with duration of auditory deprivation and with speech scores after CI showed an activity decline in dorsal, fronto-parietal, cortical regions, and an activity increase in ventral cortical regions, the right anterior temporal pole and the hippocampal gyrus. Both dorsal and ventral reorganizations predicted poor speech perception outcome after CI. These results suggest that post-CI speech perception relies, at least partially, on the integrity of a neural system used for processing NSS that is based on audio–visual and articulatory mapping processes. When this neural system is reorganized, post-lingual deaf subjects resort to inefficient semantic- and memory-based strategies. These results complement those of other studies on speech processing, suggesting that both speech and NSS representations need to be maintained during deafness to ensure the success of CI.

© 2011 Elsevier Ltd. All rights reserved.

1. Introduction

Neuro-functional investigations of auditory rehabilitation by cochlear implantation have provided new insights into the role of brain plasticity and cognitive control in cochlear implant (CI) outcome (Moore & Shannon, 2009). Neuro-functional cognitive patterns established prior to or after implantation have proven to be not only informative, but also reliable prognosis markers (Champoux, Lepore, Gagne, & Theoret, 2009; Lazard et al., 2010; Lee et al., 2001; Lee, Giraud, et al., 2007; Lee, Truy, Mamou, Sappey-Marinier, & Giraud, 2007; Mortensen, Mirz, & Gjedde, 2006; Rouger et al., 2007; Sharma, Nash, & Dorman, 2009). Although these neural patterns are not individually useful, they may help in targeting efficient preventive behavioral rehabilitation to counter the deleterious effects of deafness-induced brain reorganization.

∗ Corresponding author at: Ecole Normale Supérieure, INSERM U960, 29 rue d’Ulm, 75005 Paris, France. Tel.: +33 1 44 32 26 40; fax: +33 1 47 04 08 36. E-mail addresses: diane.lazard@bjn.aphp.fr, dianelazard@yahoo.fr (D.S. Lazard).

To assess functional reorganization in post-lingual deafness and how it aids speech processing following CI, we explored auditory memory in post-lingual deaf subjects (Lazard et al., 2010; Lee, Truy, et al., 2007). We thus showed that phonological memory declines in the course of auditory deprivation, and that this decline predicts poor CI outcome (Lazard et al., 2010). Post-lingual deaf subjects react to this loss of function by using alternative neural strategies, such as global semantic analysis, and by enhancing neural activity in the right posterior temporal cortex (Lazard et al., 2010) – a reorganization that is consistently described across studies (Giraud & Lee, 2007; Lazard et al., 2010; Lee, Giraud, et al., 2007; Lee, Truy, et al., 2007). As this region is involved more in non-speech sound (NSS) than speech sound processing (Halpern & Zatorre, 1999; Thierry, Giraud, & Price, 2003; Zatorre & Halpern, 1993), this adaptation is detrimental to speech perception restoration with CI (Lazard et al., 2010).


Here, we ask whether reorganization of the right posterior superior temporal region in post-lingual deafness is related to NSS representation decline, whether it is associated with broader alterations of NSS processing, and how this reorganization impacts on speech perception with CI. We investigated, in a cross-sectional approach, auditory imagery of non-human sounds in post-lingual deaf candidates for cochlear implantation. Neural activity measured with fMRI was analyzed and compared across deaf subjects and normal-hearing controls and, in the deaf group, correlated with auditory deprivation duration and CI scores after surgery.

2. Methods

2.1. Subjects

Twenty adults participated in this fMRI study (approved by the local ethics committee CPP, Sud-Est IV, Centre Léon Bérard, Lyon, France). Ten post-lingual, severe to profound deaf candidates for CI (8 women and 2 men, mean age ± s.d. = 53 ± 14.7 years) and ten age-matched normal-hearing controls (6 women and 4 men, mean age ± s.d. = 41.1 ± 13.77 years) participated in the study; normal hearing was defined by pure tone thresholds ≤ 20 dB HL for frequencies from 500 to 4000 Hz. All subjects had normal or corrected-to-normal vision, no history of neurological pathology, and were right-handed according to the Edinburgh handedness inventory (Oldfield, 1971; all participants scored over 60). None of the subjects had specific training in music, and none were professional musicians. None of the ten CI candidates used sign language; they all relied on lipreading and written language for communication.

Clinical characteristics of deaf subjects are summarized in Table 1. Importantly, the duration of hearing loss, i.e. the time elapsed since the first auditory acuity decrease leading to the use of hearing aids, and the duration of deafness, i.e. the time elapsed since subjects could no longer communicate by hearing even with the best-fitted hearing aids, were distinguished, as these parameters might differently influence cerebral reorganization. Subject number 7 was implanted due to sudden deafness following a bilateral temporal bone fracture that affected inner ear structures. In one CI candidate with progressive hearing loss, the transition from mild to severe hearing loss occurred during childhood, long after language acquisition, and then remained stable for many years until this study was performed (subject number 10). Given this subject's outlier status with respect to deafness evolution, her data were excluded from correlations with deafness duration, but not from those with duration of hearing loss.

2.2. Experimental paradigm

fMRI data were acquired in both subject samples during a task that consisted in imagining the colors or the sounds produced by visually presented items (objects or animals, but not humans). We performed multiple regression analyses between the fMRI data and the auditory word recognition scores at 6 months after cochlear implantation in the CI candidate group (Table 1). Subjects were either implanted with a Cochlear device (Melbourne, Australia) or with an MXM device (Vallauris, France).

2.3. Imaging parameters for fMRI experiment

Gradient echo-planar fMRI data with blood-oxygenation-level-dependent contrast were acquired with a 1.5 T magnetic resonance scanner (Siemens Sonata, Medical Systems, Erlangen, Germany) with a standard head coil to obtain volume series with 33 contiguous slices (voxel size 3.4 mm × 3.4 mm × 4 mm, no gap, repetition time 2.95 s, echo time 60 ms) covering the whole brain. Earplugs (mean sound attenuation 30 dB) and earmuffs (mean sound attenuation 20 dB) were provided both to controls and deaf subjects to equate as much as possible the experimental environment between groups. We acquired 498 functional images per subject in two runs.

2.4. fMRI design and imaging tasks

The visual stimuli were 80 black and white pictures of everyday life items, half living (but not human) and half objects. Subjects were asked to imagine either the sound produced by items (NSS imagery), or to imagine their color (only silent objects, color imagery), or to silently name them (control task, silent and noisy objects). Conditions were randomized across subjects.

Subjects were requested to self-rate their performance by pressing different keys depending on whether (left button) or not (right button) they thought sound or color imagery was properly achieved. Accuracy and related reaction times were recorded. No objective control was possible in this experiment, but as the items used for the experiment could not be unknown to the participants, we assume that they could at least attempt to imagine the sound (Bunzeck, Wuestenberg, Lutz, Heinze, & Jancke, 2005). The fMRI experiment was comprised of two runs of an event-related design. Images were presented for 1.5 s and were followed by a fixation cross randomly varying from 1 to 7 s. A screen showing written instructions (3.5 s) preceded each image. All subjects performed a training session using Presentation software version 9.90 (Neurobehavioral Systems, Inc., Albany, CA, USA) just before the scanning. None of the subjects reported visual complexity or ambiguity of visual objects, assessed by a questionnaire after the training session.
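The trial structure described above (instruction screen, picture, jittered fixation) can be summarized in a short scheduling sketch. This is illustrative only, not the authors' Presentation script; the function name, the per-run trial count and the task-assignment scheme are assumptions.

```python
# Illustrative sketch only (not the authors' Presentation script): builds an
# event-related schedule with a 3.5 s instruction screen, a 1.5 s picture and a
# fixation cross jittered between 1 and 7 s, as described above. The trial count
# and the task-assignment scheme are placeholders.
import random

TASKS = ("sound_imagery", "color_imagery", "silent_naming")

def build_run_schedule(n_trials, seed=0):
    """Return a list of (onset_s, task, event) tuples for one run."""
    rng = random.Random(seed)
    schedule, t = [], 0.0
    for _ in range(n_trials):
        task = rng.choice(TASKS)                    # placeholder task assignment
        schedule.append((t, task, "instruction")); t += 3.5
        schedule.append((t, task, "picture"));     t += 1.5
        schedule.append((t, task, "fixation"));    t += rng.uniform(1.0, 7.0)
    return schedule

if __name__ == "__main__":
    run = build_run_schedule(n_trials=40, seed=1)   # 40 trials is an arbitrary example
    print(f"{len(run) // 3} trials, last onset at {run[-1][0]:.1f} s")
```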

2.5. Statistical analyses

Accuracy and reaction times were compared across groups using the Mann–Whitney–Wilcoxon test (results are indicated as mean ± standard deviation).

The fMRI data were analyzed using SPM5 (Statistical Parametric Mapping, Centre for Neuroimaging, London, UK, http://www.fil.ion.ucl.ac.uk/spm) implemented in a Matlab 7.1 (Mathworks, Natick, MA, USA) environment, and displayed using MRIcron software (www.sph.sc.edu/comd/rorden/mricron). We performed standard preprocessing (realignment, unwarping, normalization and spatial smoothing with an 8-mm full-width-at-half-maximum Gaussian kernel) and calculated contrast images versus baseline (pre- and post-stimulation images) in each single subject for each task (imagine sound, imagine color, and naming). We performed group analyses of each contrast cited above (one-sample t-tests), plus contrasts comparing each condition with the others. Group differences between CI candidates and controls were explored using two-sample t-tests and inclusive masks at p = 0.05. In the sound imagery task, task-by-group interactions with inclusive masks and an ANOVA modeling groups and conditions were additionally performed.

For the CI candidate group, we entered contrast images for auditory imagery (minus baseline) into a regression analysis to test whether neural activation varied as a function of word recognition scores after six months of implantation and as a function of auditory deprivation. Results of imaging analyses were thresholded at p < 0.0001 uncorrected, except for the interaction analyses (p < 0.001).
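The second-level analysis itself was run in SPM5; purely as a hedged illustration of the same kind of regression (per-subject auditory imagery contrast images regressed against 6-month word recognition scores), a sketch using nilearn could look as follows. The file names and the thresholding call are assumptions, not the authors' pipeline.

```python
# Illustrative sketch only: the original analysis used SPM5. This reproduces the
# general idea of a second-level regression of auditory-imagery contrast images
# against post-CI word recognition scores using nilearn. File names are hypothetical;
# the scores are those listed in Table 1.
import pandas as pd
from nilearn.glm.second_level import SecondLevelModel
from nilearn.glm import threshold_stats_img

contrast_imgs = [f"sub-{i:02d}_imagine-sound_gt_baseline.nii.gz" for i in range(1, 11)]
wrs_6m = [48, 41, 69, 84, 88, 24, 86, 48, 86, 28]   # word recognition scores at 6 months (Table 1)

design = pd.DataFrame({"intercept": [1] * len(wrs_6m), "wrs": wrs_6m})
model = SecondLevelModel(smoothing_fwhm=8.0).fit(contrast_imgs, design_matrix=design)
z_map = model.compute_contrast("wrs", output_type="z_score")

# Uncorrected voxel-wise threshold, analogous in spirit to the p < 0.0001 used above.
thresholded_map, threshold = threshold_stats_img(z_map, alpha=0.0001, height_control="fpr")
```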

From peak voxels of significant clusters, individual beta values were extracted from the main contrast imagine sound > baseline or imagine color > baseline and tested for possible correlations (Spearman's correlation test) with clinical variables (i.e. duration of hearing loss/deafness, post-CI word recognition). Individual values were also compared across groups using the Mann–Whitney–Wilcoxon test and within groups using paired t-tests.
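The post-hoc tests on extracted peak-voxel betas amount to a group comparison and a rank correlation; a minimal Python sketch is shown below. The beta arrays are made-up stand-ins, while the hearing-loss durations are those of Table 1.

```python
# Illustrative sketch only (not the authors' script): between-group comparison of
# extracted beta values (Mann-Whitney-Wilcoxon) and correlation with duration of
# auditory deprivation (Spearman). Beta values below are hypothetical stand-ins.
import numpy as np
from scipy import stats

betas_deaf = np.array([0.9, 0.4, 0.7, 0.2, 0.1, 0.5, 1.1, 0.3, 0.6, 0.2])      # hypothetical
betas_controls = np.array([1.2, 1.0, 1.4, 0.9, 1.1, 1.3, 0.8, 1.5, 1.0, 1.2])  # hypothetical
hl_duration_years = np.array([46, 26, 10, 41, 11, 23, 0.5, 22, 16, 30])        # Table 1

u_stat, p_group = stats.mannwhitneyu(betas_deaf, betas_controls, alternative="two-sided")
rho, p_corr = stats.spearmanr(betas_deaf, hl_duration_years)

print(f"Mann-Whitney U = {u_stat:.1f} (p = {p_group:.3f}); Spearman rho = {rho:.2f} (p = {p_corr:.3f})")
```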

3. Results

3.1. Behavioral assessment and control tasks

The mean scores reflecting task feasibility (self-scoring accuracy) and reaction times during the imagery tasks were not statistically different between CI candidates and controls: for the color imagery task, accuracy was 74% ± 23 and 73% ± 28 and reaction times were 1792 ms ± 977.4 and 1738 ms ± 630.4 for candidates and controls, respectively; for the sound imagery task, accuracy was 79% ± 23 and 84% ± 16 and reaction times were 1771 ms ± 1096 and 1673 ms ± 546 for candidates and controls, respectively. Deaf subjects did not find it difficult to mentally evoke sounds from the pictures, even though these sounds had not been heard for a long time.

Main contrasts for the color imagery task are illustrated in Fig. 1. Networks of both groups largely overlapped, with activation of bilateral frontal regions, angular gyri, inferior temporal cortex and cerebellum (Table 2). Group comparison showed that only one region was significantly under-activated by deaf subjects: the right prefrontal cortex (28 22 56, cluster size 32, Z score 3.92). Group comparison of the naming task (control task) did not show a significant difference, confirming that CI candidates did not display any high-level cognitive disabilities, particularly semantic ones. Because these contrasts did not show relevant differences, and because we are interested in auditory memory, we do not go into further detail.

3.2. The non-speech sound imagery network in normal-hearing and deaf subjects

Sound imagery in normal-hearing subjects activated multimodal cognitive areas, the bilateral frontal and left parieto-temporal areas (Kraut et al., 2006; Leff et al., 2008; Scott, Blank, Rosen, & Wise, 2000; Shannon & Buckner, 2004), and areas dedicated to NSS processing such as the right posterior temporal cortex and the left insula (Beauchamp, Lee, Argall, & Martin, 2004; Doehrmann & Naumer, 2008; Engel, Frum, Puce, Walker, & Lewis, 2009; Halpern & Zatorre, 1999; Lewis et al., 2004; Thierry et al., 2003; Zatorre & Halpern, 1993) (Fig. 2, grey blobs and Table 2). Even though deaf subjects did not report difficulties in imagining sounds, they did not recruit the same regions as the controls, but instead activated more anterior frontal regions (Fig. 2, blue blobs and Table 2). Compared to controls, CI candidates under-activated the left inferior frontal gyrus, the left insula, and the right posterior temporal cortex (Fig. 2, yellow blobs, Table 3), and did not show enhanced neural activity in any brain region.

Table 1
Clinical data of the ten profound deaf candidates for cochlear implantation (CI).

Patient No. | Sex | Age at experiment (years) | Bilateral HL duration (years) | Deafness duration (months) | WRS pre-CI (a) (%, 60 dB) | WRS post-CI (%, 60 dB)
1 | F | 67.7 | 46 | 36 | 0 | 48
2 | M | 45 | 26 | 4 | 0 | 41
3 | F | 32 | 10 | 12 | 0 | 69
4 | M | 57.9 | 41 | 16 | 0 | 84
5 | F | 59 | 11 | 36 | 0 | 88
6 | F | 56.2 | 23 | 36 | 0 | 24
7 | F | 25.5 | 0.5 | 5 | 0 | 86
8 | F | 56 | 22 | 48 | 0 | 48
9 | F | 73 | 16 | 8 | 0 | 86
10 | F | 54.7 | 30 | – | 50 | 28

HL = hearing loss. WRS = word recognition scores.
(a) Subjects were tested with three-phoneme monosyllabic French words (recorded material), wearing optimally fitted hearing aids.

Table 2
Areas of significant activation for sound and color imagery in the two groups (p < 0.0001, uncorrected).

L/R | Region | MNI coordinates (x y z) | Cluster size | Z score

Sound > baseline, deaf patients:
L | Inferior frontal gyrus | −46 24 −16 | 130 | 4.41
L | Superior frontal gyrus | −34 56 18 | 158 | 3.93
L | Inferior parietal lobule | −44 −66 38 | 312 | 4.81
L | Cerebellum | −22 −70 −30 | 413 | 4.26
R | Cerebellum | 44 −72 −28 | 1365 | 5.34

Sound > baseline, controls:
L | Inferior frontal gyrus | −50 10 21 | 993 | 5.36
L | Anterior STG | −54 4 −6 | – | 4.51
R | Inferior frontal gyrus | 60 14 0 | 461 | 4.95
L | Superior frontal gyrus | −22 32 48 | 22 | 4.06
L | Temporo-parietal junction | −62 −40 22 | 60 | 4.73
R | Posterior STS/MTG | 70 −40 10 | 128 | 4.30
L | Cerebellum | −28 −56 −32 | 532 | 4.85
R | Cerebellum | 16 −82 −24 | 791 | 4.71

Color > baseline, deaf patients:
L | Inferior frontal gyrus | −48 20 −12 | 146 | 4.08
R | Inferior frontal gyrus | 62 8 0 | 9 | 5.84
L | Middle frontal gyrus | −40 56 16 | 30 | 4.56
L | Superior frontal gyrus | −20 62 28 | 65 | 4.06
R | Anterior frontal gyrus | 38 62 −2 | 11 | 3.47
L | Angular gyrus | −54 −40 54 | 142 | 3.93
R | Angular gyrus | 54 −56 40 | 71 | 3.63
L | Middle/inferior temporal gyrus | −64 −44 −10 | 18 | 3.40
R | Middle/inferior temporal gyrus | 62 −32 −14 | 85 | 3.77
L | Cerebellum | −24 −70 −30 | 419 | 4.43
R | Cerebellum | 42 −72 −28 | 879 | 5.23

Color > baseline, controls:
L | Inferior frontal gyrus | −52 20 −6 | 372 | 4.40
L | Inferior frontal gyrus | −58 22 4 | 17 | 3.99
R | Middle/inferior frontal gyrus | 46 44 14 | 325 | 4.37
R | Middle/inferior frontal gyrus | 60 14 2 | 208 | 4.34
L | Superior frontal gyrus | −14 60 28 | 65 | 4.06
R | Pre-frontal cortex | 24 22 66 | 17 | 3.94
L | Angular gyrus | −40 −64 38 | 390 | 5.56
R | Angular gyrus | 52 −62 48 | 221 | 4.52
L | Inferior temporal gyrus | −54 −66 −16 | 759 | 5.19
L | Cerebellum | −34 −58 −38 | 109 | 4.52
R | Cerebellum | 32 −62 −36 | 645 | 4.85

Sound > color (controls):
L | Insula | −42 4 6 | 289 | 3.89
L | Inferior parietal lobule | −46 −70 28 | 107 | 3.90
R | Posterior STS/MTG | 70 −34 10 | 27 | 3.55

STG: superior temporal gyrus, STS: superior temporal sulcus, MTG: middle temporal gyrus.


In post-hoc correlations, activation levels in those regions that were under-activated in CI candidates were negatively correlated with the duration of auditory deprivation (with deafness duration in the left inferior frontal gyrus, p = 0.01, rho = −0.77; with hearing loss duration in the right posterior temporal cortex, p = 0.02, rho = −0.73, and the left insula, p = 0.1, rho = −0.54 (trend); Fig. 2, scatter plots). Activation levels of the right posterior temporal cortex and the left insula were not correlated with post-CI scores, but the activation level of the left inferior frontal gyrus was correlated positively (p < 0.0001, rho = 0.94).

Table 3
Group comparison and task-by-group interaction.

Contrast | L/R | Region | MNI coordinates (x y z) | Cluster size | Z score
Controls > CI candidates in the sound imagery | L | Insula | −44 8 2 | 4 | 3.38
Controls > CI candidates in the sound imagery | R | Posterior STS/MTG | 68 −36 10 | 8 | 3.35
Controls > CI candidates in the sound imagery | L | Inferior frontal gyrus | −50 22 30 | 2 | 3.28
Interaction: controls > CI candidates and sound > color | R | Amygdala | 30 0 −20 | 57 | 4.97
CI candidates > controls in sound imagery | – | – | – | – | –
Interaction: CI candidates > controls and sound > color | R | Thalamus | 14 −24 8 | 2 | 3.20

STS: superior temporal sulcus, MTG: middle temporal gyrus.


Fig. 1. Color imagery in normal-hearing controls and deaf cochlear implant (CI) candidates. Surface rendering images display significant clusters from main contrasts, such as color imagery in controls (dark green), color imagery in CI candidates (pink), and controls more than CI candidates during the color imagery task (light green). For illustration purposes, rendered images were thresholded at uncorrected p < 0.005, with extent threshold k = 50.


3.3. Reorganization of the non-speech sound imagery network in post-lingual deaf subjects

A significant task-by-group interaction reflecting under-activated areas in CI candidates relative to controls during the sound imagery task versus the color imagery task was found for the right amygdala (Fig. 3, yellow blob and Table 3). Group comparison from extracted values showed that the right amygdala was not activated in CI candidates for sound imagery relative to controls (p = 0.01, Fig. 3, left histogram), but was paradoxically activated for color imagery (p < 0.001). Activation of the right amygdala during the sound imagery task was negatively correlated with duration of hearing loss (Fig. 3, left scatter plot).

The task-by-group interaction displaying over-activation in CI candidates relative to controls showed a significant effect in the right thalamus (Fig. 3, green blob and Table 3). Activation levels indicated that controls did not recruit this region across tasks (activity at or below baseline). CI candidates showed a relative activation during the sound imagery task compared to the color imagery task (intra-group comparison, p = 0.04), and also compared to the sound imagery in controls (inter-group comparison, p = 0.06, Fig. 3, right histogram). The part of the right thalamus showing this effect was mainly the pulvinar, extending to the median nucleus. Neural activity of the right thalamus during sound imagery was positively correlated with hearing loss duration (Fig. 3, right scatter plot).

3.4. Whole brain fMRI correlation with hearing loss duration and post-CI speech perception

Since neural activity in regions dedicated to NSS processing decreased following auditory deprivation in deaf subjects, we performed a whole brain correlation with both duration of hearing loss and post-CI scores (Fig. 4). When imagining sounds, future good performers with CI showed the strongest neural activity in dorsal fronto-parietal and occipital regions (blue blobs), whereas future poor performers had the strongest activation levels in a ventral network spreading along bilateral medial temporal lobes, including the hippocampal gyrus (red blobs, Table 4). A whole brain correlation with duration of hearing loss confirmed this dorso-ventral dissociation (Fig. 4, golden and green blobs): the dorsal network was negatively correlated with duration of hearing loss, whereas the ventral network was positively correlated with duration of hearing loss (overlap with the left medial temporal gyrus and the right anterior temporal lobe). Furthermore, although it did not appear at the whole brain level (thresholded at p < 0.001), activity sampled from the right hippocampal gyrus was also positively correlated with hearing loss duration (p = 0.04; rho = 0.66, post-hoc correlation), confirming the increasing involvement of the ventral network with time.

Fig. 2. Sound imagery in normal-hearing controls and deaf cochlear implant (CI) candidates. Surface rendering images display significant clusters from main contrasts, such as sound imagery in controls (grey), sound imagery in CI candidates (sky blue), and controls more than CI candidates during the sound imagery task (yellow). For illustration purposes, rendered images were thresholded at uncorrected p < 0.001, with extent threshold k = 50. In three regions under-activated in deaf subjects, individual extracted beta values were plotted as a function of the duration of auditory deprivation (duration of deafness or duration of hearing loss).


Fig. 3. Rendering of group-by-task interactions during the sound and color imagery. Controls more than CI candidates (left panel, yellow blob, frontal plane showing the right amygdala), and CI candidates more than controls (right panel, green blob, axial plane showing the right thalamus). Histograms depict inter-group and inter-task comparisons of extracted values. Scatter plots represent post-hoc correlations with duration of hearing loss.

Fig. 4. Surface rendering for whole brain correlation with post-cochlear implant scores at 6 months after surgery (positive correlation: blue, negative correlation: red) and with duration of hearing loss (positive correlation: green, negative correlation: golden). Effect displayed at p = 0.01 (T = 2.90).

Table 4
Whole brain correlation between neural response to sound imagery and CI outcomes in CI candidates.

Correlation | L/R | Region | MNI coordinates (x y z) | Cluster size | Z score
Positive correlation with CI scores | L | Inferior frontal gyrus | −48 22 30 | 38 | 4.77
Positive correlation with CI scores | L | Postcentral gyrus | −46 −28 50 | 67 | 3.86
Positive correlation with CI scores | L | Lingual gyrus | −4 −74 4 | 89 | 3.46
Positive correlation with CI scores | R | Middle frontal gyrus | 48 22 36 | 13 | 3.44
Negative correlation with CI scores | L | Hippocampal gyrus | −18 −10 −24 | 37 | 4.66
Negative correlation with CI scores | L | Hippocampal gyrus | −34 −22 −16 | 8 | 3.85
Negative correlation with CI scores | R | Hippocampal gyrus | 38 −18 −24 | 30 | 3.85
Negative correlation with CI scores | L | Precentral gyrus | −48 −2 22 | 7 | 3.87
Negative correlation with CI scores | R | Middle STG | 46 −12 12 | 36 | 3.74

STG: superior temporal gyrus.



4. Discussion

Neuro-functional strategies developed during auditory deprivation account for CI outcome, over and above what may be explained by peripheral factors (Moore & Shannon, 2009). Deafness precludes not only auditory processing, but also more integrated cognitive processes such as phonological processing (Lazard et al., 2010; Mortensen et al., 2006), and therefore prompts important functional brain reorganization (Strelnikov et al., 2010). This reorganization is a determinant of speech outcome as it shapes the future speech comprehension system. From a purely perceptual viewpoint, deafness should affect not only speech (Lazard et al., 2010) but also NSS processing. However, because of communicative needs, we assume that the evolution of speech and NSS processing could follow different trajectories, which might differently impact CI outcome. Because phonological processing decreases with the duration of auditory deprivation in post-lingual deaf subjects, as reported in our previous study (Lazard et al., 2010), we propose that NSS processing could also decline as a consequence of auditory deprivation and interact with CI outcome.

While neural activation during the color imagery task was not really different between groups, deaf subjects showed different neural activity in regions of auditory information processing during the NSS imagery task. It is worth noting that the primary auditory cortices of normal-hearing subjects were not activated by potential noise contamination from the scanner that might have been present despite sound attenuation, during both color imagery and sound imagery tasks. Any noise-related activity was averaged out by contrasting task-related activity with baseline-related activity, during which the sound was also present. Therefore we can dismiss the idea that deaf subjects and normal-hearing controls were not in the same testing conditions, and we confirm that sound imagery is a mental process that does not recruit primary auditory cortex (Bunzeck et al., 2005). Regions implicated in NSS processing in normal-hearing controls (the left inferior frontal gyrus (Halpern & Zatorre, 1999; Zatorre, Halpern, & Bouffard, 2010), the left insula and the right posterior temporal cortex (Zatorre & Halpern, 1993)) were less activated in deaf subjects than in controls, and activity levels of these regions were negatively correlated with the duration of auditory deprivation. We offer the hypothesis that this change occurred early in the time course of auditory deprivation for the right posterior temporal cortex, because its activation correlated with the duration of hearing loss rather than with the duration of deafness. Because the right posterior temporal cortex is specialized in multimodal integration of NSS and music (Beauchamp et al., 2004; Doehrmann & Naumer, 2008; Engel et al., 2009; Lewis et al., 2004; Zatorre & Halpern, 1993), when auditory inputs weaken and oral communication becomes cognitively more demanding, the involvement of the right posterior temporal cortex in NSS processing may decrease. This loss of function could potentially make cognitive resources available for phonological processing, as suggested by the observation that this region is abnormally recruited in post-lingual deaf subjects during phonological tasks (Lazard et al., 2010) or during lipreading (Lee, Truy, et al., 2007).

A decline of neural activity was also observed in a region of the left inferior frontal gyrus that occurred rather late in the evolution of the hearing handicap (correlation with the duration of deafness, rather than the duration of hearing loss). At the same time, inferior prefrontal activation predicted good CI scores, meaning that its disengagement is merely a matter of time. The left inferior frontal gyrus is a multimodal area involved in hierarchical language processing (Sahin, Pinker, Cash, Schomer, & Halgren, 2009) and in language production planning (Hickok & Poeppel, 2007; Turkeltaub & Coslett, 2010). Its rather delayed decline, compared to the right temporal cortex, may reflect the partial reliance of sound imagery on sound-to-articulation mapping (onomatopoeia self-production), with this mapping becoming more difficult with time in the case of prolonged deafness.

The disengagement of the right amygdala in sound imagery in deaf subjects, with increasing engagement during visual (color) imagery, was an unexpected finding. The amygdala is implicated in the multisensory processing of emotions (Koelsch, Fritz, Muller, & Friederici, 2006; Mesulam, 1998; Spreng, 2000) and it interacts with the anterior superior temporal gyrus via direct connections (Pandya, 1995). Since the anterior superior temporal gyrus was not activated in deaf subjects during sound imagery (but only in controls), it is possible that sound evocation no longer entails an emotional reaction. The modification of activation of the right amygdala may reflect neural reorganization in deaf patients to extract emotional content from their now predominant visual inputs (Dye & Bavelier, 2010).

We also observed an increasing activation of the right thalamus (presumably the pulvinar) with hearing loss duration, paralleling the observed cortical modifications. The pulvinar is a multisensory relay where information from different sensory modalities converges (Cappe, Morel, Barone, & Rouiller, 2009; Hackett et al., 2007; Pandya, 1995). It interacts with visual area V1 (Shipp, 2003), but also with superior temporal areas (Pandya, 1995; Wong, Gharbawie, Luethke, & Kaas, 2008) and the frontal lobe (Cappe et al., 2009; Fuster, 1997). The pulvinar (mostly in its median part) belongs to a cortico–thalamo–cortical loop transferring, among other modalities, auditory information to cortical areas (Cappe et al., 2009), thus allowing for fast stimulus-induced responses (Schroeder & Foxe, 2005). In an auditory imagery task with normal-hearing subjects (melody imagery), activation of a right inferior frontal/thalamic network has been reported (Halpern & Zatorre, 1999). In reaction to the disengagement of frontal areas in auditory imagery, deaf subjects may strengthen subcortical involvement (Giraud, Price, Graham, Truy, & Frackowiak, 2001).

Whole brain correlations with post-CI scores showed a dorso-ventral dissociation, depending on the duration of hearing loss. The dorsal network, where activity decreased with deafness duration, largely overlapped with a network described as the phonological route of reading (Aparicio, Gounot, Demont, & Metz-Lutz, 2007; Ziegler et al., 2008), used by deaf subjects to perform phonological tasks, e.g. rhyming tasks, and that we previously found to positively correlate with post-CI scores (Lazard et al., 2010). The associated involvements of the left inferior frontal and left postcentral gyri show that future good CI performers utilized regions involved in language production planning and regions controlling language production effectors to imagine NSS (Hickok & Poeppel, 2007; Turkeltaub & Coslett, 2010). Their mental evocation involved phonemic information analysis/retrieval and action/speech reconstruction. The ventral network included semantic recognition and global identification regions (anterior temporal pole) (Martin & Chao, 2001) and memory retrieval areas in the medial temporal lobe, running along bilateral parahippocampal gyrus (Pandya, 1995). Neural activity in these ventral regions was positively correlated with duration of hearing loss, and predicted poor outcome. This dorso-ventral dissociation is a recurrent finding, consistently predicting speech abilities with CIs in deafened subjects (Giraud & Lee, 2007; Lazard et al., 2010; Lee, Giraud, et al., 2007). Good CI performers rely more on fronto-parietal regions dedicated to attentional and high-level strategies (e.g. phonologic assembly and semantic associations) (Lazard et al., 2010; Lee, Truy, et al., 2007; Mortensen et al., 2006), reproducing sounds through audio–visual
