The CLEF 2003 Cross Language Image Retrieval Task


[Abstract and introduction: the body text of these sections did not survive extraction. What remains legible indicates that ImageCLEF 2003 was a pilot cross-language image retrieval task run as part of CLEF, in which search topics expressed in languages other than English were used to retrieve photographs from a historic collection that is indexed only by its English captions.]

The collection used for the task is drawn from the photographic archive of St Andrews University Library, which holds over 300,000 historic photographs [11], of which around 30,000 have been digitised. The test collection contains 28,133 of these images, each paired with a caption written in English. The large majority of the photographs (82%) are black and white, and they date from 1832 to 1992, with most taken before 1920. Figure 1 shows an example image together with its caption.
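Because retrieval in this task operates purely on the caption text, a system only needs to index short structured records. The sketch below illustrates one way such caption records could be represented and indexed in Python; the field names (short title, description, photographer, location, date, categories) and the tokeniser are assumptions made for illustration, not the exact schema or software used in the task.

from dataclasses import dataclass, field
from collections import defaultdict
import re

@dataclass
class CaptionRecord:
    """One image caption; the field names are illustrative, not the official schema."""
    doc_id: str
    short_title: str
    description: str
    photographer: str = ""
    location: str = ""
    date: str = ""
    categories: list = field(default_factory=list)

    def text(self) -> str:
        # Concatenate all caption fields into one searchable string.
        return " ".join([self.short_title, self.description, self.photographer,
                         self.location, self.date, " ".join(self.categories)])

def tokenize(text: str) -> list:
    # Very simple tokeniser: lowercase alphabetic terms only.
    return re.findall(r"[a-z]+", text.lower())

def build_inverted_index(records):
    # Map each term to the set of captions that contain it.
    index = defaultdict(set)
    for rec in records:
        for term in tokenize(rec.text()):
            index[term].add(rec.doc_id)
    return index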

[Example caption, topic creation and submission details: the detailed text of these sections did not survive extraction. What remains legible indicates that a set of 50 search topics was created in English and translated into other languages, including Italian, German, Dutch, French, Spanish and Chinese, and that the relevance of retrieved images was judged against these topics.]
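A common way for a system to bridge the language gap in a task like this is to translate the non-English topic into English and then run an ordinary monolingual search over the captions. The fragment below sketches dictionary-based query translation followed by a simple term-overlap ranking; the toy bilingual dictionary, the scoring and all identifiers are invented for illustration and do not describe any particular submitted system.

import re

# Toy bilingual dictionary; the entries are invented for illustration.
GERMAN_TO_ENGLISH = {
    "leuchtturm": ["lighthouse"],
    "fischerboot": ["fishing", "boat"],
    "schottland": ["scotland"],
}

def translate_query(terms):
    """Replace each source-language term with its dictionary translations."""
    translated = []
    for term in terms:
        translated.extend(GERMAN_TO_ENGLISH.get(term, [term]))  # keep untranslated terms as-is
    return translated

def rank_by_overlap(query_terms, captions):
    """Rank captions (dict: image id -> English caption text) by query-term overlap."""
    scored = []
    for image_id, caption in captions.items():
        caption_terms = set(re.findall(r"[a-z]+", caption.lower()))
        score = sum(1 for t in query_terms if t in caption_terms)
        if score:
            scored.append((score, image_id))
    return [image_id for _, image_id in sorted(scored, reverse=True)]

# Example: a German topic title is translated, then matched against an English caption.
captions = {"SA-0001": "Lighthouse at the harbour, Scotland, 1900"}
print(rank_by_overlap(translate_query(["leuchtturm", "schottland"]), captions))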

[Table 1: contents not recoverable from the extraction.]

[Assessment procedure, participating groups and their approaches: the text of this section did not survive extraction. The run identifiers that survive in Table 2 and Figure 2 indicate that participants included the University of Sheffield (sheff), Daedalus (daeenenQor, daespenQTdoc) and National Taiwan University (NTUiaCoP).]
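Relevance judgements for ad hoc CLEF-style tasks are normally produced by pooling: the top-ranked images from every submitted run are merged into a single pool per topic, and only the pooled images are shown to the assessors. The sketch below illustrates that standard procedure; the pool depth of 100 is an assumption chosen for illustration, not a value recovered from this text.

def build_pool(runs, topic_id, depth=100):
    """Merge the top `depth` images from each run for one topic (illustrative pooling).

    runs: dict mapping run_id -> {topic_id: ranked list of image ids}
    Returns the set of image ids to be judged by the assessors.
    """
    pool = set()
    for ranking in runs.values():
        pool.update(ranking.get(topic_id, [])[:depth])
    return pool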

[Table 2: summary of the retrieval effectiveness of the submitted runs, with scores computed over the top 100 retrieved images; the individual entries did not survive extraction.]
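The scores summarised in Table 2 are based on average precision over the top 100 retrieved images. As a reference for how such a figure is obtained, the sketch below computes uninterpolated average precision for one topic with a rank cut-off, and mean average precision over a set of topics; it is a generic implementation of the standard measure, not the evaluation software actually used for the task.

def average_precision(ranked_ids, relevant_ids, cutoff=100):
    """Uninterpolated average precision for one topic, evaluated to `cutoff`."""
    hits = 0
    precision_sum = 0.0
    for rank, doc_id in enumerate(ranked_ids[:cutoff], start=1):
        if doc_id in relevant_ids:
            hits += 1
            precision_sum += hits / rank  # precision at each relevant image found
    return precision_sum / len(relevant_ids) if relevant_ids else 0.0

def mean_average_precision(run, qrels, cutoff=100):
    """Mean of per-topic average precision; `run` and `qrels` are keyed by topic id."""
    topics = list(qrels)
    return sum(average_precision(run.get(t, []), qrels[t], cutoff) for t in topics) / len(topics)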

[Results discussion and conclusions: the text of this section did not survive extraction. The closing acknowledgements mention support under a grant numbered GR/R56778/01.]
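Figure 2 plots precision against recall for one run per topic language. The sketch below shows how such an interpolated precision-recall curve is typically computed at the standard eleven recall levels; it is a generic illustration of the measure rather than the plotting code used to produce the figure.

def interpolated_pr_curve(ranked_ids, relevant_ids):
    """Interpolated precision at the 11 standard recall levels (0.0, 0.1, ..., 1.0)."""
    if not relevant_ids:
        return [0.0] * 11
    precisions, recalls = [], []
    hits = 0
    for rank, doc_id in enumerate(ranked_ids, start=1):
        if doc_id in relevant_ids:
            hits += 1
        precisions.append(hits / rank)
        recalls.append(hits / len(relevant_ids))
    curve = []
    for level in [i / 10 for i in range(11)]:
        # Interpolated precision: best precision achieved at any recall >= this level.
        candidates = [p for p, r in zip(precisions, recalls) if r >= level]
        curve.append(max(candidates) if candidates else 0.0)
    return curve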

!"

315 B ' 0. %

1444%

¿

(10)

0.0 0.2 0.4 0.6 0.8 1.0

0.0 0.2 0.4 0.6 0.8 1.0

Precision

Recall

Mono (daeenenQor) Italian (sheff) German (sheff) Dutch (sheff) French (sheff) Spanish (daespenQTdoc) Chinese (NTUiaCoP)

!"' ,- * +

3"5 , ' & %

C"*1"+'1#A9Q1#9""##1%

3$5 :"##1% !!" #"##1%

3?5 % $ %$&" '

( )* !!+, -")# -

.!/"- 0 $4?Q?#?"##"%

3C5 & % $ %$&" '

( )* !!+, -")#

- .!/" -0 $CCQ$9#"##"%

3A5 , ' 0 )

% B%C*1+'?1QC4"##"%

395 %2%0 % % 0 %

B%"$*?+'">9Q"441449%

3>5 %/% % 1 2 ?9QC4% ' M,

I % / *+ - - M 1449

1449%

345 0%0 ;% ' 0 %

B%$*"+'A$QAA"###%

31#5 %%& <% M%@ % % 0

,, ,%

B%?*$J?+'"9CQ"4?"##1%

3115 @%2%-% & %

These figures are larger than those observed for the known story boundary test condition in the CLEF 2003 CL-SDR task [2].. One possible explanation is the small size of the