

HAL Id: jpa-00209387

https://hal.archives-ouvertes.fr/jpa-00209387

Submitted on 1 Jan 1982

HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.


Can superconductivity be predicted with the aid of pattern recognition techniques ?

F.W. Pijpers, G. Vertogen

To cite this version:

F.W. Pijpers, G. Vertogen. Can superconductivity be predicted with the aid of pattern recognition techniques?. Journal de Physique, 1982, 43 (1), pp. 97-106. 10.1051/jphys:0198200430109700. jpa-00209387.


Can superconductivity be predicted with the aid of pattern recognition techniques ?

F. W. Pijpers
Department of Analytical Chemistry, Catholic University, Toernooiveld, Nijmegen, The Netherlands

and G. Vertogen
Institute for Theoretical Physics, Catholic University, Toernooiveld, Nijmegen, The Netherlands

(Received 13 January 1981, revised 11 May 1981, accepted 11 September 1981)

Résumé. - We employ a pattern recognition technique in an attempt to find features common to the superconducting elements, whose already known properties serve as the starting basis. The essential features are: the electronic work function and the number of valence electrons according to Miedema, the specific heat, the heats of melting and sublimation, the melting point and the atomic radius. Our predictions are valid at the 90 % level. However, as far as the alkali and alkaline-earth metals of a given group are concerned, our predictions are less convincing.

Abstract. - Pattern recognition techniques were employed in order to investigate the possibility of finding features of the elements of the periodic system that may be relevant for the description of their behaviour with respect to superconductivity. Learning machines were constructed using those elements of the periodic system whose superconducting properties have been well studied. Relevant features appear to be the electronic work function and the number of valence electrons as given by Miedema, the specific heat, the heat of melting, the heat of sublimation, the melting point and the atomic radius. The learning machines have a predicting capability of the order of 90 %. The predictive power of these machines concerning the superconducting behaviour of the alkali and alkaline-earth metals belonging to a given test set, however, appears to be less convincing.

J. Physique 43 (1982) 97-106, Janvier 1982

Classification
Physics Abstracts
74.10 - 74.20D

1. Introduction. - The phenomenon of superconductivity is only partly understood. Although the basic mechanism is explained by the celebrated BCS theory [1], which states that the electron-phonon interaction gives rise to an attractive electron-electron interaction resulting in Cooper pairs, the theory is not entirely satisfactory. The theory lacks the power to predict from physical features whether a given element, say Sc, will be superconducting or not.

It is clear that the ability to predict the superconducting behaviour of a given material, starting from a number of its physical and chemical data, would be of great importance, both from the technological and the theoretical point of view. The selection of the relevant physical and chemical properties could be the first step towards a truly predictive theory. Such a theory could facilitate the search for, and the design of, superconducting materials with higher transition temperatures than the known materials. The selection of relevant physical and chemical data in order to predict unknown behaviour is well known in the field of alloying. As shown by Miedema [2], alloy formation can be almost completely predicted starting from the compressibility and the electronic work function of the constituent elements. Because of the evident success of this approach it is natural to investigate whether such a procedure could be successful when superconductivity is considered. In principle, all well defined physical and chemical data are appropriate. This means an intricate data processing problem is at hand. A suitable technique to deal with such a problem is offered by pattern recognition [3]. This technique is well known in the fields of chemistry, biology and the social sciences [4-9].

Pattern recognition, as applied to superconductivity, means that the pattern of a superconducting element is positioned in a multidimensional feature space that is spanned by all physical and chemical data, called features, of that particular element. When a number of elements, positioned in that feature space, group together or cluster, it is obvious that their physical and chemical behaviour is similar. In pattern recognition it is assumed that such behaviour not only holds for the known physical and chemical data, but also reflects similar behaviour of properties that are not yet measured or cannot be measured without considerable effort.

Article published online by EDP Sciences and available at http://dx.doi.org/10.1051/jphys:0198200430109700

In this paper we restrict the discussion to a selection of the elements of the periodic system. Selecting a set of elements that is composed of both superconducting and non-superconducting elements, we investigate whether two different clusters can be found in the feature space, corresponding to the presence or absence of the property superconductivity. Because of the relative unfamiliarity of pattern recognition techniques in solid state physics we also deal briefly with the relevant mathematics of this strategy in sections 2 and 3. The main conclusions are given in section 4.

2. Composition of a training set. - In table I the elements are given that are used for the construction of the training set or learning machine. Only those elements are selected whose superconducting properties are well described in the literature [10]. Alkali and alkaline-earth metals, as well as elements whose superconducting behaviour seems to be limited to thin films of material or only occurs at high pressure, are excluded from this list. In table II a list of physical and chemical properties, in pattern recognition terms called features, is given that describes, we hope, the behaviour of the elements, which are called patterns.

The matrix spanned by all features of all patterns forms the initial data set. In order to get an unbiased feature treatment all feature data are autoscaled according to

x'_{i,j} = (x_{i,j} - x̄_j) / σ_j,

where x_{i,j} denotes feature j of pattern i, σ_j the standard deviation of feature j over all patterns, x̄_j the mean value of feature j over the N patterns and x'_{i,j} the autoscaled feature value [3].

Table II. - Features investigated for their eventual predictive power.
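As a concrete illustration, the autoscaling step can be sketched in a few lines of Python; the function name and the toy data matrix are ours, not part of the original analysis:

```python
import math

def autoscale(data):
    """Column-wise autoscaling: subtract the column mean and divide by the
    sample standard deviation, so each feature has zero mean and unit
    variance over the N patterns."""
    n, m = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(m)]
    sds = [math.sqrt(sum((row[j] - means[j]) ** 2 for row in data) / (n - 1))
           for j in range(m)]
    return [[(row[j] - means[j]) / sds[j] for j in range(m)] for row in data]

# three patterns, two features on very different scales
scaled = autoscale([[1.0, 10.0], [2.0, 30.0], [3.0, 50.0]])
```

After scaling, both columns contribute equally to any distance computed between patterns, which is the point of the unbiased feature treatment described above.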

In an initial calculation the interfeature correlation matrix is constructed from the autoscaled features according to

r_{j,k} = [1/(N - 1)] Σ_i x'_{i,j} x'_{i,k}.

The interfeature correlation gives an indication of the mutual dependence of the various features. It can be used to reduce the dimensionality of the feature space by removal of those features that are highly correlated to others. Such a reduction is advantageous because the eventual relationship between the property, i.e. superconducting or non-superconducting, and the remaining selected features is simplified.
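A sketch of this step in Python (illustrative names and data; on autoscaled columns the expression below reduces to the averaged cross product of the two columns):

```python
import math

def column_stats(data, j):
    """Mean and sample standard deviation of feature column j."""
    n = len(data)
    mean = sum(row[j] for row in data) / n
    sd = math.sqrt(sum((row[j] - mean) ** 2 for row in data) / (n - 1))
    return mean, sd

def correlation(data, j, k):
    """Pearson correlation between feature columns j and k."""
    n = len(data)
    mj, sj = column_stats(data, j)
    mk, sk = column_stats(data, k)
    return sum((row[j] - mj) * (row[k] - mk)
               for row in data) / ((n - 1) * sj * sk)

# feature 1 is an exact multiple of feature 0, feature 2 is unrelated:
# the first pair would be flagged as redundant and one of the two dropped
data = [[1.0, 2.0, 5.0], [2.0, 4.0, 1.0], [3.0, 6.0, 4.0]]
r01 = correlation(data, 0, 1)
```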

Table I. - The training set has been composed of the following elements, tabulated according to the periodic system of elements. The shaded elements are non-superconducting.


The correlation of a feature j with the parametrized property, which is taken to be one if the element is non-superconducting and two if the element is superconducting, is calculated according to

r_{j,p} = [1/(N - 1)] Σ_k x'_{k,j} (p_k - p̄)/σ_p,

where p_k denotes the value of the property of pattern k, corresponding with the absence or presence of superconductivity, and p̄ the mean value of that property [3].
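In code, the same quantity is just the Pearson correlation between a feature column and the 1/2 property vector; the function and the values below are an illustrative sketch:

```python
import math

def feature_property_correlation(feature, prop):
    """Correlation of one feature column with the parametrized property
    (1 = non-superconducting, 2 = superconducting)."""
    n = len(feature)
    fm = sum(feature) / n
    pm = sum(prop) / n
    num = sum((f - fm) * (p - pm) for f, p in zip(feature, prop))
    den = math.sqrt(sum((f - fm) ** 2 for f in feature) *
                    sum((p - pm) ** 2 for p in prop))
    return num / den

# a feature that is low for the two non-superconductors and high for the
# two superconductors correlates strongly with the property
r = feature_property_correlation([0.1, 0.2, 0.9, 1.0], [1, 1, 2, 2])
```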

These correlations are used to select features out of a set of highly correlated ones. The set of selected features listed in table III was constructed in this way. By trial and error it was established that the inclusion of the remaining features did not enhance the information content of the data matrix significantly in view of the successful prediction of the presence or absence of the property superconductivity by the learning machine.

Table III. - The features selected by means of the feature-to-property correlation.

(*) The magnetic susceptibility, being infinite for Fe, Co and Ni, is assumed to be 10^4 × 10^-6 c.g.s. units for computational reasons.

On the other hand, removal of one of the selected features diminished the predictive power of the training set. In the following section this conclusion is illustrated by comparing the results of three different learning machines that are denoted by A, B and C and given in table IV.

Table IV. - Various sets of selected features. ΔH_sg denotes the heat of sublimation at 25 °C; T_m is the melting temperature and n is the number of relevant valence electrons rounded off to integer values.

3. Results. - The selected elements that compose the learning machine can be divided into two categories. The first category contains the non-superconducting elements, while the second category holds the superconducting ones. Consequently, pattern recognition techniques with supervision for two categories can be applied. Two methods may be applied in order to determine which features are mainly responsible for the discrimination of both categories, namely (1) the weighting procedure according to Fisher and (2) the weighting procedure according to the variances of the features belonging to each category.
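For a single feature, the Fisher-style weighting can be sketched as follows; the function and the toy category values are illustrative, and the exact normalization used by the ARTHUR package may differ:

```python
def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    """Sample variance."""
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def fisher_weight(cat1, cat2):
    """Squared separation of the category means relative to the
    within-category variances: large values mark discriminating features."""
    return (mean(cat1) - mean(cat2)) ** 2 / (var(cat1) + var(cat2))

# values of one feature for the two categories (illustrative numbers)
non_sc = [1.0, 1.2, 0.9]
sc = [3.0, 3.1, 2.8]
w = fisher_weight(non_sc, sc)
```

A well-separated feature such as this one gets a large weight; a feature whose category distributions overlap gets a weight near zero and contributes little to the discrimination.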

According to both procedures Φ*, R and c_p are the three most important features for classification. Substitution of c_p by c_p ln(T_m + 273) gives a slight improvement of this feature for classification, whereas the replacement of ΔH_sg by ΔH_sg/(T_m + 273) is immaterial with respect to classification purposes.

3.1 THE TRAINING SET. - After the features were selected and their relative importance for classification was determined, a classification based upon the hierarchical clustering mode (HIER) and the minimal spanning tree (TREE) [7, 9] was performed by using the set of selected features, variance weighted for two categories. The resulting clusters are presented in table V for HIER [7, 9] and in figure 1 for TREE.

Table Va. - Hierarchical dendrogram for the learning machines A, B and C. The symbol x* means that element x is classified in the wrong category.


Table Vb.

Table Vc.


Fig. 1a. - Minimal spanning tree for the elements of the training set for learning machine A. The shaded elements are non-superconducting.

Fig. 1b. - Minimal spanning tree for elements of the training set for learning machine B.

The similarity values S_ij, given in table V, are based upon the mutual distance D_ij between patterns i and j and are defined as S_ij = 1 - D_ij/D_max. The distance D_ij is the Euclidean distance in the M-dimensional space spanned by the M selected features as given in table III (M equals 7, 6 and 3 respectively). The elements are grouped on levels of similarity S_ij in HIER and are linked in branched chains based upon the distances D_ij in TREE.
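Both quantities are easy to sketch in code; the points below are illustrative, and Prim's algorithm stands in for whatever MST routine the ARTHUR package actually used:

```python
import math

def distance(p, q):
    """Euclidean distance D_ij in the selected-feature space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def similarity_matrix(patterns):
    """S_ij = 1 - D_ij / D_max."""
    n = len(patterns)
    d = [[distance(patterns[i], patterns[j]) for j in range(n)]
         for i in range(n)]
    dmax = max(max(row) for row in d)
    return [[1.0 - d[i][j] / dmax for j in range(n)] for i in range(n)]

def minimal_spanning_tree(patterns):
    """Edges (i, j) of the MST, grown from pattern 0 by Prim's algorithm."""
    n = len(patterns)
    in_tree, edges = {0}, []
    while len(in_tree) < n:
        i, j = min(((a, b) for a in in_tree
                    for b in range(n) if b not in in_tree),
                   key=lambda e: distance(patterns[e[0]], patterns[e[1]]))
        edges.append((i, j))
        in_tree.add(j)
    return edges

# three points on a line plus one outlier, standing in for a Pt-like element
pts = [[0.0, 0.0], [0.0, 1.0], [0.0, 2.0], [5.0, 0.0]]
S = similarity_matrix(pts)
tree = minimal_spanning_tree(pts)
```

The outlier is attached to the tree by a single long edge, which is exactly how an isolated element such as Pt shows up in the TREE diagrams.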

It follows from table V that more than two clusters of elements are always formed. The first set of selected features, learning machine A, gives rise to three clusters. Two of the clusters contain elements with the « wrong » property: Ag and Sb, being non-superconducting, are found in a cluster of 14 superconducting elements on a similarity level of 0.42, and V, being superconducting, is found in a cluster of non-superconducting elements on a similarity level of 0.60. On a similarity level of 0.57 a cluster of non-superconducting elements is combined with a cluster of superconducting elements and the classification loses its meaning.

Learning machine B shows that superconducting and non-superconducting elements already cluster together on a similarity level of 0.79. This indicates a diminished classifying ability compared with learning machine A.

The third set of features, learning machine C, gives a clustering of superconducting elements with non-superconducting ones on a similarity level of 0.89. The third cluster, however, consists of a combination of three superconducting elements and one non-superconducting element on a similarity level of 0.91, which means a poor classifying ability.

In TREE all elements of the training set are linked together as shown in figure 1. The connecting lines have not been drawn to scale. Learning machine A gives rise to one cluster of non-superconducting elements surrounded by three clusters of superconducting ones, with the exception of Sn, which is classified as non-superconducting instead of superconducting. The cluster algorithm denotes Pt as a separate element because of its distance from all other elements. Learning machine B also connects the non-superconducting elements with each other, with the exception of Sb and U; the resulting cluster is here also surrounded by three clusters of superconducting elements, while two elements, V and Zn, are classified in the wrong category. The element Pt should again be considered as separated from all other elements because of its large distance from them. Since the computer algorithm for the calculation of the minimal spanning tree is not suited for a learning machine with three or fewer features, the minimal spanning tree for learning machine C has not been given.

Another method to test the predictive value of the training set is provided by the nearest neighbour method KNN [7, 9]. Here a particular element is classified according to the classification of its nearest neighbours as far as the mutual distance is concerned.
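A minimal sketch of such a nearest-neighbour classifier (the feature vectors and labels below are invented for illustration; the paper's patterns live in the space of the table III features):

```python
import math
from collections import Counter

def knn_classify(query, training, k):
    """Majority vote among the k nearest neighbours.
    training is a list of (feature_vector, category) pairs."""
    neighbours = sorted(training, key=lambda t: math.dist(query, t[0]))[:k]
    votes = Counter(cat for _, cat in neighbours)
    return votes.most_common(1)[0][0]

training = [([0.0, 0.0], "non-sc"), ([0.2, 0.1], "non-sc"),
            ([1.0, 1.0], "sc"), ([1.1, 0.9], "sc"), ([0.9, 1.2], "sc")]
label = knn_classify([0.95, 1.0], training, 3)
```

As the text notes, the verdict can change with k, which is why the number of neighbours consulted is reported alongside each KNN result.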

Learning machine A gives the best results with a group of six nearest neighbours. Even then, four elements are faultily classified: Rh and Sb, being non-superconducting, are predicted to be superconducting, whereas the reverse holds for V and Sn. Learning machine B gives equally good results with a group of three or four nearest neighbours. Here six elements are wrongly classified, viz. Rh, Zn, V, Sn, Sb and U. The predictive value of learning machine C is considerably less; the elements Rh, V, Sb, Pd, Mo, Ag, W, Hg, Cu, Au and U have the wrong property.

Fig. 2a. - The Non-Linear Mapping (NLM) of the elements of the training set for learning machine A. The shaded elements are non-superconducting.

The classification based upon the mutual distance can be visualized by a non-linear mapping NLM [7, 9] onto a two-dimensional plane. Figure 2a represents the NLM of the results of learning machine A. All non-superconducting elements are found in the upper right part of the map with the exception of V. The NLM of the results of learning machine B looks slightly better, see figure 2b, although here too V is positioned nearer to the non-superconducting element Cu than to the superconducting ones Sn and Zn, whereas Sb and U are in a wrong position. The mapping obtained with learning machine C is considerably less useful; here Re, Os, U, Ru and Ag are found in the wrong position.

A final test of the quality of the training set is given by the procedure LEAST [7, 9], which aims at the prediction of the property by means of a linear combination of the selected features. Learning machine A gives rise to a linear combination in which a property value equal to or less than 1.5 means non-superconducting. The property of the elements of the training set is predicted with an accuracy of 89 %; only Cu, Rh, Sb and Ru are predicted incorrectly.
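The idea behind LEAST can be sketched with an ordinary least-squares fit; numpy's lstsq and the toy feature matrix below are illustrative stand-ins, not the solver or the coefficients of the original analysis:

```python
import numpy as np

# four patterns, two features; property 1 = non-sc, 2 = sc
features = np.array([[0.0, 2.0], [0.0, 1.8], [1.0, 0.5], [1.0, 0.4]])
prop = np.array([1.0, 1.0, 2.0, 2.0])

# augment with a constant column so the fit has an intercept
X = np.hstack([features, np.ones((len(features), 1))])
coef, *_ = np.linalg.lstsq(X, prop, rcond=None)

# a fitted property value at or below 1.5 is read as non-superconducting
predicted = X @ coef
labels = np.where(predicted <= 1.5, "non-sc", "sc")
```

The fitted coefficients play the role of the linear combination quoted in the paper; the 1.5 threshold is simply the midpoint of the two category values.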

Reduction of the features to a statistically significant set, a procedure called STEP [3], keeps Φ* and ΔH_sg, whereas the other five features are skipped. The predictive value of this reduced set of features diminished to 83 %; here Sb, U, Ru, Os, Ir and Hg are misclassified.

The same procedure applied to learning machine B gives a linear combination that predicts the property of 92 % of the elements of the training set correctly; only Cu, Rh and Sb are misclassified. Reduction of the number of features to a statistically significant set again keeps Φ* and ΔH_sg; the number of correctly classified elements diminishes, and now Cu, Ag, Sb, U, Ru and Ir are misclassified.

With learning machine C, LEAST misclassifies 20 % of the elements, viz. Cu, U, Mn, Ag, Au, Re, Os and Ru. A further reduction of features appears to be impossible.

Fig. 2b. - NLM of the elements of the training set for learning machine B.

3.2 THE TEST SET. - The predictive power of the learning machine obtained from the training set, mainly consisting of transition elements, can be applied to a number of elements grouped in a test set. These elements are the alkali metals Li, Na, K, Rb and Cs, the alkaline-earth metals Be, Mg, Ca, Sr and Ba, and some other elements whose superconducting behaviour is unknown or restricted to special situations like thin films of material. The elements of the test set are given in table VI. The results of the various predicting methods based upon the learning machines A, B and C are listed in table VII.

Learning machine A predicts almost all elements of the test set to be superconducting using the nearest neighbour method KNN, irrespective of the number of neighbours. Only the elements Be and Ge are predicted to be non-superconducting for any number of neighbours from one up to eleven. The linear feature combination LEAST predicts B, Be, As, Ge and Mg to be non-superconducting, whereas the reduced feature set of STEP predicts Mg to be superconducting but extends the number of non-superconducting elements with Cs, Sc, Y, Bi, Ca and Ba.

Learning machine B gives the same results for the element B as machine A, using the KNN method. However, the alkali metals Na, K, Rb and Cs are now predicted to be non-superconducting if only one neighbour is consulted. The element Ge is predicted to be superconducting if learning machine B uses up to six neighbours; with more neighbours the reverse behaviour is predicted. A comparison between the results of learning machines A and B concerning the procedure LEAST shows that they only differ in the predicted behaviour of Li; according to machine B, Li is non-superconducting. The procedure STEP gives the same results for the learning machines A and B except for the element B. Finally, learning machine C gives nearly the same results; only As is now predicted to be non-superconducting for all neighbours consulted. The procedure LEAST predicts here Li, B and Be to be superconducting, while the procedure STEP is not applicable.

The classification of the test set elements can be visualized by means of the Karhunen-Loeve (KARLOV) transformation [3]. The purpose of this transformation is to group the features in linear combinations that have the highest variance, i.e. the highest information content. The seven features of learning machine A give rise to seven linear and mutually orthogonal combinations (« eigenvectors ») ranked according to their information contents (« eigenvalues »). A projection of all patterns onto the plane spanned by the first two eigenvectors, which have the highest information content and represent 79 % of all available information in this case, is given in figure 3a.

Table VI. - The test set has been composed of the following elements, tabulated according to the periodic system of elements. The shaded elements are non-superconducting; the superconducting behaviour of the remaining elements is unknown or depends on special situations.

Table VII. - Comparison of the predictions for test set elements made by various learning machines with application of different techniques.
1 : Predicted as non-superconducting.
2 : Predicted as superconducting.
? : Depends on the number of neighbours applied.
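The KARLOV step is essentially a principal component analysis; a sketch with synthetic data (the random matrix, the variance scales and the seed are ours, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# 20 patterns, 4 features, with most of the variance in the first two
data = rng.normal(size=(20, 4)) * np.array([3.0, 2.0, 0.3, 0.1])

centered = data - data.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)       # ascending eigenvalues
order = np.argsort(eigvals)[::-1]            # rank by "information content"
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# project every pattern onto the plane of the two leading eigenvectors
projection = centered @ eigvecs[:, :2]
info_content = eigvals[:2].sum() / eigvals.sum()
```

Plotting the two columns of `projection` and marking the training and test elements reproduces the kind of map shown in figure 3; `info_content` plays the role of the 79 % and 73 % figures quoted in the text.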

Fig. 3a. - Karhunen-Loeve (KARLOV) transformation of training and test set elements for learning machine A. The training set elements are denoted by circles; the shaded elements are non-superconducting. The test elements are denoted by squares.

Apart from Ge, all test set elements, represented by squares, are found in the region where superconducting elements are positioned, indicating that these elements are expected to be superconducting. Application of the KARLOV transformation to learning machine B yields a set of vectors whose first two vectors already have an information content of 73 %. Now Ge is not an exception, and the projection of all patterns onto the plane defined by these two most important eigenvectors produces a diagram where the test set elements are found in the region of superconducting elements, see figure 3b. The KARLOV transformation applied to learning machine C does not give rise to a plot of patterns that can be interpreted as clusters of elements with different properties and therefore is not reproduced here.

4. Discussion and conclusions. - The seven-feature learning machine A and the six-feature learning machine B give rise to pattern-to-feature ratios of 5.1 and 6.0 respectively, which are quite reasonable according to pattern recognition standards [7, 9]. A two-cluster training set, however, is never obtained, as can be seen from table V and figures 1a and 1b. In figure 1a the group of non-superconducting elements, with the exception of Sn, is surrounded by three groups of superconducting elements, while in figure 1b the group of non-superconducting elements, with the exception of V and Zn, is flanked by two groups of superconducting elements. Taking into account the distances between the elements in the hyperdimensional space, however, the situation is less simple; Pt is so far apart from all other elements that, although it is non-superconducting, it seems to represent a class of its own.

Fig. 3b. - KARLOV transformation of training and test set elements for learning machine B.

A comparison between learning machines A and B, the latter operating with the entropy features, shows that the two do not differ distinctly. This is not surprising from the point of view of pattern recognition, because T_m, the melting temperature, is highly correlated with ΔH_sg and therefore cannot be expected to be of high importance. Learning machine C is obtained from A and B by omitting the features n, T_m, ΔH_m and ΔH_sg, which do not appear to be strongly correlated with the property. Consequently it should be expected that learning machine C is not really inferior to the other ones. The actual results obtained with this machine are therefore rather disappointing.

All three learning machines predict Y, Sc and the alkali metals to be superconducting, although the nearest neighbour classification according to learning machine B makes an exception for the alkali metals if only one neighbour is used (being Sb in this case). In a recent article [11] it was shown that a superconducting modification of Y and Sc exists at high pressure, whereas a superconducting modification of Cs at high pressure is known as well [10].

It should be pointed out here that the relevant features include the electronic work function Φ* and the number of valence electrons n because of their relevance in the theory of Miedema concerning the formation of alloys. Both features appear to be correlated, with a correlation coefficient of 0.6. This raises the question whether these features are the correct parameters to describe the elements. This does not affect the theory of alloy formation, however, because there only the differences between the features of the considered elements are important.

The relative importance of the various features for the description of superconductivity is presented in table IV. The important ones seem to be Φ*, R and c_p, or even better c_p ln(T_m + 273). An eventual theory should at least account for these physical properties. This does not exclude, of course, features that have not been taken into account, or features that were excluded because of their correlation with others and/or their lack of correlation with the property superconductivity, as follows from a comparison of learning machine C with A and B.

It seems relevant to make a remark about V, which is found near the cluster of non-superconducting elements in learning machine A (see Fig. 1a) and learning machine B (see Fig. 1b). This behaviour seems in accordance with a speculation of Rietschel and Winter [12] that V is close to being magnetic and has a strongly depressed T_c due to spin fluctuations.

The question may be raised whether the classification into two distinct categories is correct. It is apparent from figure 1 that non-superconducting elements seem to have more in common than superconducting ones, which have widely varying feature values. An attempt to describe exclusively the superconducting elements, based upon their transition temperatures as given in the literature [11], was unsuccessful.

As already mentioned, the important features seem to be Φ*, R and c_p ln(T_m + 273) according to pattern recognition techniques. Theoretically, at least the following features are expected to be important: (1) the linear term of the specific heat, describing the density of states at the Fermi level; (2) the magnetic susceptibility; (3) the high temperature electrical resistivity, hopefully measuring the electron-phonon coupling; (4) the Debye temperature. However, the correlations appear to be disappointingly low; the feature-to-property correlations are respectively -0.2 ± 0.3, -0.4 ± 0.3, -0.2 ± 0.3 and -0.3 ± 0.2. Composite features may also be considered. According to Miedema [13] the linear term in the specific heat divided by the magnetic susceptibility is expected to be highly correlated with the property superconductivity. It appears that the correlation is low, 0.3 ± 0.2. This poor correlation cannot be attributed to the fact that we took finite susceptibility values for Fe, Co and Ni. Other composite features, like the wave velocity, given by the square root of the Young modulus divided by the density, or the characteristic impedance, being the square root of the Young modulus times the density, were also considered; the correlations are respectively -0.3 ± 0.2 and 0.1 ± 0.3.

It should be remarked here that inclusion of the remaining features does not enhance the predicting capability of the learning machine. Clearly the list of features is by no means exhaustive. We limited ourselves to the features given in table II, because not all required data for other features, like e.g. the sound attenuation in the normal state and the heat conductivity, could be obtained from the literature. We took only those features that were known for nearly all considered elements.

Finally it should be noted and stressed that pattern recognition does not provide a theory. In that respect it cannot be compared with the recent results of sophisticated ab initio calculations [14]. Pattern recognition may only hint at the relative importance of some physical and chemical data for the problem under investigation. Clearly a predictive power should be ascribed to the learning machines. However, the predictive power of a learning machine that is mainly built upon the properties of transition elements seems less convincing when the machine is applied to a test set consisting mainly of alkali and alkaline-earth metals.

Acknowledgments. - The authors wish to thank A. J. Dekker of the University of Groningen, P. G. de Gennes of the Ecole Supérieure de Physique et de Chimie Industrielles de Paris, W. Goossens and A. R. Miedema of the Philips Research Laboratories Eindhoven, and G. Kateman, F. M. Mueller, B. G. M. Vandeginste, A. Weyland and P. R. Wyder of the University of Nijmegen for their critical reading of the original manuscript and their useful suggestions. They also thank B. R. Kowalski, University of Seattle, for making the computer program « Arthur » available to us.

References

[1] BARDEEN, J., COOPER, L. N. and SCHRIEFFER, J. R., Phys. Rev. 108 (1957) 1175.

[2] MIEDEMA, A. R. et al., J. Phys. F 3 (1973) 1558; J. Less-Common Metals 32 (1973) 117 and 41 (1975) 283; Philips Techn. Rev. 36 (1976) 217; CALPHAD 1 (1977) 341.

[3] DUEWER, D. L., KOSKINEN, J. R. and KOWALSKI, B. R., « Arthur », Laboratory for Chemometrics, Department of Chemistry BG-10, Univ. of Washington, Seattle, Washington 98195.

[4] TOU, J. T. and GONZALEZ, R. C., Pattern Recognition Principles (Addison-Wesley, Reading, Mass.) 1979.

[5] JURS, P. C. and ISENHOUR, T. L., Chemical Applications of Pattern Recognition (John Wiley and Sons, N.Y.) 1975.

[6] ISENHOUR, T. L., KOWALSKI, B. R. and JURS, P. C., « Application of pattern recognition to chemistry », C.R.C. Critical Reviews in Anal. Chem. (July 1974) 1-44.

[7] KATEMAN, G. and PIJPERS, F. W., Quality Control in Analytical Chemistry, Vol. 60 in Chemical Analysis: a series of monographs on analytical chemistry and its applications, P. J. Elving and J. D. Winefordner eds., I. M. Kolthoff ed. em. (Wiley, New York) 1981, chapter 4.

[8] KOWALSKI, B. R., « Chemometrics », Anal. Chem. 52 (5) (1980) 112A-122A, a review article.

[9] MASSART, D. L., DIJKSTRA, A. and KAUFMAN, L., Evaluation and Optimization of Laboratory Methods and Analytical Procedures (Elsevier, Amsterdam) 1978, chapters 18 and 20.

[10] C.R.C. Handbook of Chemistry and Physics, 60th edition, Robert C. Weast ed. (The Chemical Rubber Co., Ohio, U.S.A.) 1979-1980.

[11] WITTIG, J., PROBST, C., SCHMIDT, F. A. and GSCHNEIDNER Jr., K. A., Phys. Rev. Lett. 42 (1979) 469.

[12] RIETSCHEL, H. and WINTER, H., Phys. Rev. Lett. 43 (1979) 1256.

[13] MIEDEMA, A. R., private communication.

[14] GLÖTZEL, D., RAINER, D. and SCHOBER, H. R., Z. Phys. B 35 (1979) 317.
