4. Concluding remarks
This paper gives a formal description of schedule sets and schedule generation schemes for the job-shop problem with sequence-dependent setup times. We have shown that some fundamental dominance properties are lost when the non-delay and Giffler-Thompson SGSs based on operation appending are simply extended to sequence-dependent setup times, as done in most previous studies on priority rule-based heuristics for the SDST-JSP in the literature, e.g. Allahverdi, Gupta and Aldowaisan (1999), Kim and Bobrowski (1994), and Ovacik and Uzsoy (1994). On the other hand, we have demonstrated that the serial algorithm based on operation insertion (Kolisch, 1996) is able to generate a dominant set of schedules.
In this paper, we consider the Cyclic Job Shop Problem (CJSP) where processing times are affected by uncertainty. Several studies have been conducted on the deterministic CJSP. The CJSP with identical parts is studied in Roundy (1992). The author shows that the problem is NP-hard and designs a branch-and-bound algorithm to solve it. Hanen (1994) investigates the general CJSP and presents a branch-and-bound procedure to tackle the problem. A general framework for modeling and solving cyclic scheduling problems is presented in Brucker and Kampmeyer (2008), where the authors present different models for cyclic versions of the job shop problem. However, only a few works consider cyclic scheduling problems under uncertainty. Che et al. (2015) investigate the cyclic hoist scheduling problem with processing time window constraints where the hoist transportation times are uncertain. The authors define a robustness measure for cyclic hoist schedules and propose a bi-objective mixed integer linear program to optimize the cycle time and the robustness.
The Boolean variable s(a[i]) indicates whether a[i] is positive or negative, taking the values True and False, respectively. In addition, the notation used in this paper is given in Table 1.
The considered problem is usually decomposed into two sub-problems: (1) the sequencing problem, in which a processing sequence of an optimal schedule is found for a given no-wait job shop problem, and (2) the timetabling problem, in which the schedule with minimal makespan is found for a given processing sequence. In this paper, a polynomial algorithm based on the divide-and-conquer approach is proposed to solve the timetabling problem of the two-machine no-wait job shop problem.
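For the special case in which every job is processed first on M1 (for a_i time units) and then on M2 (for b_i), the timetabling step for a fixed job sequence reduces to accumulating minimal start-to-start delays between consecutive jobs. The sketch below illustrates that special case only; it is not the paper's divide-and-conquer algorithm, and all names are ours:

```python
def timetable(seq, a, b):
    """Earliest no-wait start times for a fixed job sequence, assuming every
    job runs a[i] on M1 immediately followed by b[i] on M2 (sketch assumption).
    Job j occupies M1 on [s_j, s_j + a_j] and M2 on [s_j + a_j, s_j + a_j + b_j]."""
    start = {seq[0]: 0}
    for prev, cur in zip(seq, seq[1:]):
        # cur may start once M1 is free (a[prev]) and once its M2 slot does
        # not collide with prev's M2 slot (a[prev] + b[prev] - a[cur])
        delay = max(a[prev], a[prev] + b[prev] - a[cur])
        start[cur] = start[prev] + delay
    return start

a = {1: 2, 2: 3}
b = {1: 4, 2: 1}
print(timetable([1, 2], a, b))  # job 2 starts at max(2, 2 + 4 - 3) = 3
```

Because the no-wait constraint glues each job's two operations together, only the constraints between consecutive jobs in the sequence can be binding, which is what makes a single linear pass sufficient for the fixed-sequence case.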
3.1 Job Shop Scheduling Problem with Sequence-Dependent Setup Times
A job shop problem with sequence-dependent setup times involves, as in a regular JSP, m machines and nm tasks, partitioned into n jobs of m tasks. As in a JSP, the tasks have to run in a predefined order within every job, and two tasks sharing a machine cannot run concurrently, that is, the starting times of these tasks must be separated by at least the duration of the first. Moreover, for each machine and each pair of tasks running on that machine, the machine needs to be set up to accommodate the new task, and during this setup the machine must stand idle. The duration of this setup depends on the sequence of tasks, that is, for every pair of tasks (t_i, t_j) running on the same machine we are given the setup time s(i, j) for t_j following t_i and the setup time s(j, i) for t_i following t_j. The setup times respect the triangular inequality, that is, ∀i, j, k: s(i, j) + s(j, k) ≥ s(i, k). The objective is to minimise the makespan. More formally:
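The triangular-inequality condition on the setup times can be verified directly from the definition. A minimal sketch, assuming setup times are held in a dictionary keyed by ordered task pairs (the data layout and names are ours):

```python
from itertools import permutations

def respects_triangle(setup):
    """setup[(i, j)]: setup time when task j follows task i on one machine.
    Returns True iff s(i, j) + s(j, k) >= s(i, k) for all distinct i, j, k,
    i.e. running a task between two others never shortens the required setup."""
    tasks = {i for (i, _) in setup} | {j for (_, j) in setup}
    return all(setup[(i, j)] + setup[(j, k)] >= setup[(i, k)]
               for i, j, k in permutations(tasks, 3))

uniform = {(i, j): 1 for i in range(3) for j in range(3) if i != j}
violating = {**uniform, (0, 2): 5}  # s(0,1) + s(1,2) = 2 < 5
print(respects_triangle(uniform), respects_triangle(violating))  # True False
```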
A polynomial lower bound for the best-case makespan performance has been proposed. Three steps are used to compute this new lower bound: first, we adapt the findings of the literature to GoPO; then, this lower bound is tightened using the group precedence property; the last step improves it further using a precedence constraint property between the operations. The second contribution of the work presented in this article concerns the usefulness of the best-case parameter in a decision-aid system. To this end, the best-case parameter has been implemented in a reactive decision-aid algorithm and tested in both deterministic and non-deterministic environments. This experimental implementation of the best-case parameter exhibits good performance. The rest of the article is structured as follows: in sections 2 and 3, GoPO is described and the problem addressed in this work is discussed. In section 4, the lower bound for the best case of GoPO is presented. Section 5 is devoted to the reactive phase of GoPO, where a new reactive algorithm using the best-case parameter is evaluated on well-known instances of the job shop problem. In the same section, the reactive decision-aid algorithm that uses the best-case parameter is studied in both deterministic and non-deterministic environments. Finally, the main conclusions are summarized in the last section.
The makespan, i.e. the total execution time of the schedule, is a classical regular objective.
In practice, manufacturing problems are not deterministic. This is why group sequencing was introduced by Erschler and Roubellat (1989). This method aims at solving the job shop problem by proposing not just one schedule but a set of different schedules, in order to delay decisions and thereby take uncertainties into account. This set of schedules is represented by groups of permutable operations. Group sequencing evaluates a group sequence according to the worst-case quality over the set of feasible schedules.
that minimizes the makespan, i.e., the maximum job completion time.
The F2|l_j|C_max problem is NP-hard in the strong sense even with unit-time operations. It has therefore been the subject of a variety of investigations. As far as we know,  and  proposed lower bound methods. Moreover,  investigated heuristic approaches, introducing four constructive heuristics and a Tabu Search algorithm. It should be noted that  implemented an exact method based on the branch-and-bound method of , which was originally designed for the job-shop problem. Another branch-and-bound method was proposed by  for the unit-time operations case.
3.3 Shifting Bottleneck Heuristic (SBH)
The Shifting Bottleneck Heuristic (SBH) was proposed by  to solve the minimum makespan problem for the job shop problem. The principle of this heuristic is to iteratively identify a machine considered to be the bottleneck and to optimally schedule this machine only. For each machine, a single-machine scheduling problem 1|r_j|L_max is solved using the branch-and-bound technique proposed by . The bottleneck machine is the one with the highest L_max. Then, using a disjunctive graph, the machines already scheduled are re-sequenced to include the optimal sequence of the current bottleneck machine. For each operation O_j, a release date r_j and a due date d_j are computed iteratively: r_j is the earliest start date of operation O_j, computed from its already scheduled predecessors; d_j is the latest completion time of operation O_j. For a more complete presentation of the SBH, the reader may refer to .
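The subproblem evaluated for each machine, the maximum lateness of a sequence under release and due dates, is straightforward to compute. The sketch below pairs that evaluation with an exhaustive search over permutations, which stands in for the branch and bound on a toy instance (instance data and names are ours):

```python
from itertools import permutations

def lmax(seq, r, p, d):
    """Maximum lateness L_max = max_j (C_j - d_j) of a fixed sequence on one
    machine with release dates r, processing times p and due dates d."""
    t, worst = 0, float("-inf")
    for j in seq:
        t = max(t, r[j]) + p[j]       # completion time C_j of job j
        worst = max(worst, t - d[j])  # lateness of job j
    return worst

# toy 1|r_j|L_max instance; exhaustive search replaces the branch and bound
r = {1: 0, 2: 1, 3: 2}
p = {1: 3, 2: 2, 3: 1}
d = {1: 4, 2: 5, 3: 6}
best = min(permutations(r), key=lambda s: lmax(s, r, p, d))
print(best, lmax(best, r, p, d))  # (1, 2, 3) achieves L_max = 0
```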
It can be observed that in previous studies the selection of the process structure and the respective station functionality for executing operations have been considered in isolation. In many real-life problems such an integration can have a significant impact on process efficiency (Bukchin and Rubinovitz, 2003). The problem of simultaneous structural-functional synthesis of a customized assembly system is still at the beginning of its investigation (Levin et al. 2016). Insights previously gained in isolation into job shop scheduling, and into scheduling and sequencing with alternative parallel machines, can now be integrated in a unified framework. The three most important prerequisites for such an integration, i.e., data interchange between the product and the stations, flexible stations dedicated to various technological operations, and real-time capacity utilization control, are enabled by Industry 4.0 technology.
handled on machine M2, and its processing time is denoted by b_i. M1 and M2 are two serial-batch (or sum-batch) machines with a limited capacity, denoted by c, in terms of the number of jobs (each job i has a unit size s_i = 1). The following assumptions are made: (i) preemption is not allowed, (ii) each batch on either machine does not contain more than c operations, (iii) operations within a batch can be processed in any order, (iv) the total processing time of a batch is equal to the sum of the processing times of the operations in the batch, and (v) an operation is considered completed only when all the operations in its batch are completed (batch availability constraint). Two criteria are considered here. The first criterion is to minimize the number of batches, #batch. This criterion reflects situations where processing any batch induces a fixed cost, which leads to a total cost proportional to #batch. The second criterion is the makespan C_max, i.e. the completion time of the last job on machine M2. This bicriteria
The job shop scheduling problem (JSSP) is critical and practical in production and process control. It involves schedule optimization with n jobs and m machines and is referred to as an NP-complete problem. Industries need effective and fast methods to solve practical scheduling problems. Classical job shop scheduling theory ensures optimality, but only deals with a limited number of jobs and machines, as the computing time increases exponentially with the problem size. Approximate approaches cannot always ensure optimality, but provide near-optimal solutions in a reasonable time (Ghedjati, 1999). The genetic algorithm is one such approximate approach that can provide a fast solution to scheduling problems because of its distributed-computation characteristics.
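As an illustration of the approach, the sketch below applies a deliberately minimal genetic algorithm to a toy JSSP, using an operation-based encoding with binary-tournament selection and swap mutation. It is our own simplification, not the method of any study cited here, and all names and parameter values are assumptions:

```python
import random

def decode(chrom, jobs):
    """Operation-based encoding: chrom lists job ids, each id appearing once
    per operation; the k-th occurrence of job j schedules j's k-th operation
    as early as possible. jobs[j] = [(machine, duration), ...]. Returns the
    makespan of the decoded semi-active schedule."""
    next_op = {j: 0 for j in jobs}
    job_ready = {j: 0 for j in jobs}
    mach_ready = {}
    for j in chrom:
        m, dur = jobs[j][next_op[j]]
        start = max(job_ready[j], mach_ready.get(m, 0))
        job_ready[j] = mach_ready[m] = start + dur
        next_op[j] += 1
    return max(job_ready.values())

def ga(jobs, pop_size=30, gens=100, seed=0):
    """Toy GA: binary-tournament selection plus one swap mutation per child
    (no crossover), kept intentionally short for illustration."""
    rng = random.Random(seed)
    base = [j for j in jobs for _ in jobs[j]]
    pop = [rng.sample(base, len(base)) for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            p1, p2 = rng.sample(pop, 2)
            child = list(min(p1, p2, key=lambda c: decode(c, jobs)))
            i, k = rng.randrange(len(child)), rng.randrange(len(child))
            child[i], child[k] = child[k], child[i]
            nxt.append(child)
        pop = nxt
    return min(decode(c, jobs) for c in pop)

# two jobs, two machines; each machine carries 6 time units of work,
# so 6 is a lower bound on the makespan
jobs = {0: [(0, 3), (1, 2)], 1: [(1, 2), (0, 3)]}
print(decode([0, 1, 1, 0], jobs))  # 6: a schedule meeting the bound
```

The decoder guarantees that every chromosome maps to a feasible schedule, which is the main appeal of the operation-based encoding: mutation can freely reorder genes without ever producing an infeasible individual.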
statistics for this analysis sample appear in Table 2; 47% of the sample are in the treated group. With respect to our two dependent variables, 57% of our observations come from individuals who report being married, and average job security on the one-to-six scale is a little over four.
Although we focus here on a specific part of the French population, i.e. young private-sector workers with a permanent contract, the share of married workers in our estimation sample is similar to the national value (see Figure 1). The share of unmarried partnered workers in our sample is slightly higher than the national figure, but this is very likely because we exclude individuals above age 49. The marriage share in the French adult population steadily decreased from 1990 to 2009 in Figure 1, falling from 56% to 52%. Over the same
In , the authors consider a fresh food production and distribution problem. They identify three stages: a stage of batch processing of raw materials into food products, a stage for packaging these products, and a stage for their immediate distribution. The production environment is complex, and sequence-dependent setup costs are considered. For the distribution problem, tight time windows at customer locations are considered. The authors propose a hierarchical approach, batching the customer orders with similar temperature and processing requirements and compatible delivery and vehicle departure times, and applying a heuristic approach to solve the distribution planning problem.
The relationship between Unemployment Insurance (UI) benefit duration, unemployment duration and subsequent job duration is investigated using a multi-state duration model with state-specific unobserved heterogeneity. I examine two potential explanations for the negative correlation between unemployment and job spell durations: UI benefits increase job matching quality (the "Matching" effect) versus unobserved heterogeneity ("Adverse Selection"). The Matching effect is found to be weak. Although new jobs accepted within 5 weeks of benefit termination seem to have a higher dissolution rate, the negative correlation between unemployment and job duration is mostly explained by unobserved heterogeneity. Various simulations indicate that increasing the maximum benefit duration by one week will raise expected unemployment duration by 1.0 to 1.5 days but will raise expected job duration by only 0.5 to 0.8 days.
Towards a French-style "job rotation"?
Bernard Gazier 1 and Frédéric Bruggeman 2
The principle of "job rotation" is to replace employees sent on medium- or long-term training with unemployed people who have been trained beforehand. The aim is to kill two birds with one stone. On the one hand, the firm can pursue an intensive skills-upgrading policy without disorganizing its production, an advantage that can be very significant in the case of SMEs. On the other hand, the unemployed, selected by the employment agency on the basis of the firm's needs, benefit from a skills-acquisition process that is consolidated by putting those skills into practice during the replacement period, immediately after their training. This policy has been implemented on two occasions in Denmark. A first time from 1995 to 2000, in a context of high unemployment: around 10% in 1995, falling to 5% by 2000. A second time between 2012 and 2016, in a context of gradual recovery of activity after the crisis that began in 2007. There too, Denmark sought to tackle high levels of unemployment, and by 2016 the curve had come back down. The first application was massive and across the board, the second more targeted and more focused on SMEs. It is thus one of the most spectacular illustrations of the Nordic "learnfare" (support for the unemployed and the low-skilled through an intensive training policy), carried out in a country with few large firms, consequently organized around its SMEs, and unable to easily mobilize the resources of apprenticeship.
consumption for manufacturing scheduling is addressed. We focus our attention on a job-shop environment where machines can work at different speeds and therefore consume different amounts of energy, i.e. emit different amounts of CO2. It represents an extension of the classical job-shop scheduling problem, where each operation has to be executed by one machine and this machine can work at different speeds; this problem was introduced by . Energy-efficient scheduling of such manufacturing systems demands an optimization approach whose dual objectives are to minimize both the CO2 emissions and the makespan. To solve