The CoRoT Legacy Book
© The authors, 2016
DOI: 10.1051/978-2-7598-1876-1.c020

Part II

The CoRoT legacy data

This part is dedicated to the data. It describes how the observations were planned and performed, how the data were processed and, finally, where the data are archived.

The first chapter shows how it was possible to plan a workable mission, taking into account the scientific objectives and the mission constraints. It presents how the scientific specifications were translated into the observation programme and its successive runs. It describes the observations from all aspects: selection criteria (scientific and operational), tools, implementation, global results and specific results. It also shows how the scientists and engineers in charge of the operations at CNES and in the laboratories adapted the major principles to the results of the first observations and to the instrument in flight.

The second and third chapters deal with the final processing of the data. The second chapter describes the data building, the correction philosophy and the latest algorithms used to generate the legacy CoRoT data delivered to the community. The impacts of several instrumental and environmental effects on the data are described, and the choices made for the correction algorithm developments are detailed. The first steps of the corrections are applied to the effects (either instrumental or environmental) which are understood and can be modelled, on both bright-star and faint-star light curves. An optional further step is possible in the faint-stars field thanks to its large number of light curves: a simple exposure-based algorithm able to remove residual instrumental effects that could not be modelled is described in detail in Chapter 3.

The fourth chapter describes the format and content of the scientific data. It aims to give as much information as possible for handling the data, so that they can still be used many years from now.

All data are provided with several levels of corrections; the levels of corrections are available as different extensions in a single FITS file. The data with the highest level of corrections (called N2 data) are deliberately easy to handle and should be the first choice. Less corrected data (called N1) are available on request, but they require a deep understanding of the instrument and the observation conditions to be scientifically useful. They should be consulted only for specific purposes, such as instrumental studies, the need for 1 s sampling data, or checks of unexpected behaviours in the N2 data.

The fifth chapter indicates where and how the data are archived.

