PDF Generation Test Course

Collection Editor:

Chuck Bearden

Authors:

Richard Baraniuk, Ian Barland, Chuck Bearden, Connexions, Jose A. Cruz-Cruz, Matthias Felleisen, William Frey, Adan Galvan, John Greiner, Jason Holden, National Instruments, Don Johnson, Christopher Kelty, Nick Kingsbury, Phokion Kolaitis, Darryl Morrell, Melissa Selik, Malan Shiralkar, Moshe Vardi

Online:

<http://legacy.cnx.org/content/col10278/1.5/>

OpenStax-CNX


Collection structure revised: December 16, 2009 PDF generated: September 5, 2014

For copyright and attribution information for the modules contained in this collection, see p. 126.


1 CALS Tables

1.1 CALS Table Samples ........ 1
1.2 Internal cnxn targets ........ 3
1.3 Math Table ........ 6
1.4 Bad Header Table ........ 7
1.5 Good Header Table ........ 12
1.6 More Bad Header Tables ........ 17
1.7 Author Blank Entry Table ........ 18
1.8 Rowspan Table - Feasibility Constraints ........ 18
1.9 Entry Table 1 ........ 43
1.10 Entry Table 2 ........ 49
1.11 Long Table ........ 62
1.12 Media Table ........ 74
Solutions ........ 76

2 cnxn media targets

2.1 Reduce Samples VI ........ 77
2.2 Icons and Connector Panes ........ 82
2.3 Low-Level File I/O VI and Functions ........ 86

3 Equation Breaking

3.1 Classic Fourier Series ........ 89
3.2 Complex Fourier Series ........ 91
3.3 Fast Fourier Transform (FFT) ........ 96
3.4 Connexions Employee Training Module ........ 98
3.5 Propositional Logic: normal forms ........ 99
3.6 Propositional Logic: subproofs ........ 101
Solutions ........ 110

4 cnxns, anyone?

4.1 cnxns ........ 113
Solutions ........ 121

Bibliography ........ 122

Index ........ 124

Attributions ........ 126


Chapter 1

CALS Tables

1.1 CALS Table Samples [1]

Simple table with small cell contents, and colspans for all columns (all widths are '*')

a b c d e

f g h i j

k l m n o

Table 1.1

Simple table with wide cell contents

A Sultan sat on his oriental mat, in his harem in downtown Persia,

He took a sip of his coffee, just a drip, and he said to his servant Kersia,

"Ah, curse ya, curse ya, curse ya, That's the worst cup of coffee in Persia!"

"Cause... All I want is a proper cup of coffee, made from a proper copper coffee pot,

I may be off my nut, but I want a proper cup of coffee from a proper copper pot.

Iron coffee pots and tin coffee pots, they're no good to me!

If I can't have a proper cup of coffee from a proper copper coffee pot, I'll Throw you in the sea

In old Baghdad in old Baghdad, in old Baghdad

Very often I have had cups of coffee by the dozen

And you all should make my coffee just as good and without my blasted cussing

"Ah, curse ya, curse ya, curse ya, That's the worst cup of coffee in Persia!"

continued on next page

[1] This content is available online at <http://legacy.cnx.org/content/m11797/1.6/>.

Available for free at Connexions <http://legacy.cnx.org/content/col10278/1.5>


Oh All I want is a proper cup of coffee, made from a proper copper coffee pot,

I may be off my nut, but I want a proper cup of coffee from a proper copper pot.

Brass coffee pots, glass coffee pots They're no good to me

If I can't have a proper cup of coffee From a proper copper coffee pot, I'll have a cup of tea!"

Table 1.2

Simple table with explicit colwidths

col1,1 col1,2 col1,3

col2,1 col2,2 col2,3

Table 1.3

Simple table with mixed colwidths and a colspan

col1,1 col1,2 col1,3

col2,1 col2,2 col2,3

col3,1 col3,2-3; I'm using a spanname attribute to refer to a spanspec in my containing tgroup. The spanspec in turn refers to the starting and ending colspecs, which I need to compute my width correctly.

col4,1-2; I'm using a spanname attribute to refer to a spanspec in my containing tgroup. The spanspec in turn refers to the starting and ending colspecs, which I need to compute my width correctly.

col4,3

col5,1 col5,2-3; I'm using namest and nameend attributes to refer directly to the starting and ending colspecs for this span, which I need to compute my width correctly.

col6,1-2; I'm using namest and nameend attributes to refer directly to the starting and ending colspecs for this span, which I need to compute my width correctly.

col6,3

col7,1-3; I'm using namest and nameend attributes to refer directly to the starting and ending colspecs for this span, which I need to compute my width correctly.

col8,1-3; I'm using a spanname attribute to refer to a spanspec in my containing tgroup. The spanspec in turn refers to the starting and ending colspecs, which I need to compute my width correctly.

Table 1.4

Simple table with small cell contents; no colspans, so this table should be set in LR mode

a b c d e

f g h i j

k l m n o

Table 1.5

Simple table with explicit colwidths, and thead


colH,1 colH,2 colH,3

colH,1-2 colH,3

col1,1 col1,2 col1,3

col2,1 col2,2 col2,3

Table 1.6

Simple table with explicit colwidths, and thead

col1,1 col1,2 col1,3

col2,1 col2,2 col2,3

colF,1 colF,2 colF,3

colF,1-3

Table 1.7

1.1.1 Max's samples

long long long long long long a b

c d

e f g

Table 1.8

1.2 Internal cnxn targets [2]

1.2.1 Exercises

Exercise 1.2.1: Exercise with names (Solution on p. 76.)

Named problem

How many reference librarians does it take to screw in a light bulb?

Exercise 1.2.2: Exercise without names (Solution on p. 76.)

How many reference librarians does it take to screw in a light bulb?

1.2.2 Rules etc.

Rule 1.1: Occam's Razor

Don't multiply entities unnecessarily.

Proof:

It's in the eating.

[2] This content is available online at <http://legacy.cnx.org/content/m11799/1.15/>.


Theorem 1.1: Gudger's Theorem

Overalls worn while farming will acquire a local memory.

Proof:

Doc Watson said so.

Rule 1.2:

This is a rule without a name.

Proof:

No name.

Theorem 1.2:

This is a theorem without a name.

Proof:

No name.

1.2.3 Computer code

<xsl:template match="*">
  <xsl:apply-templates select="@*"/>
  <xsl:apply-templates/>
</xsl:template>

file = open('myfile.txt')
for line in file:
    print "Woof: %d" % len(line)
file.close()


1.2.4 Table entries

I am a rather elderly man.

The nature of my avocations for the last thirty years has brought me into more than ordinary contact with what would seem an interesting and somewhat singular set of men, of whom as yet nothing that I know of has ever been written: I mean the law-copyists or scriveners.

I have known very many of them, professionally and privately, and if I pleased, could relate divers histories, at which good-natured gentlemen might smile, and sentimental souls might weep.

But I waive the biographies of all other scriveners for a few passages in the life of Bartleby, who was a scrivener of the strangest I ever saw or heard of.

While of other law-copyists I might write the complete life, of Bartleby nothing of that sort can be done.

I believe that no materials exist for a full and satisfactory biography of this man.

It is an irreparable loss to literature. Bartleby was one of those beings of whom nothing is ascertainable, except from the original sources, and in his case those are very small.

What my own astonished eyes saw of Bartleby, _that_ is all I know of him, except, indeed, one vague report which will appear in the sequel.

Ere introducing the scrivener, as he first appeared to me, it is fit I make some mention of myself, my _employees_, my business, my chambers, and general surroundings; because some such description is indispensable to an adequate understanding of the chief character about to be presented.

Imprimis: I am a man who, from his youth upwards, has been filled with a profound conviction that the easiest way of life is the best.

Hence, though I belong to a profession proverbially energetic and nervous, even to turbulence, at times, yet nothing of that sort have I ever suffered to invade my peace.

continued on next page


Table 1.9

1.2.5 Bartleby the Scrivener

I am a rather elderly man. The nature of my avocations for the last thirty years has brought me into more than ordinary contact with what would seem an interesting and somewhat singular set of men, of whom as yet nothing that I know of has ever been written: I mean the law-copyists or scriveners. I have known very many of them, professionally and privately, and if I pleased, could relate divers histories, at which good-natured gentlemen might smile, and sentimental souls might weep. But I waive the biographies of all other scriveners for a few passages in the life of Bartleby, who was a scrivener of the strangest I ever saw or heard of. While of other law-copyists I might write the complete life, of Bartleby nothing of that sort can be done.

I believe that no materials exist for a full and satisfactory biography of this man. It is an irreparable loss to literature. Bartleby was one of those beings of whom nothing is ascertainable, except from the original sources, and in his case those are very small. What my own astonished eyes saw of Bartleby, _that_ is all I know of him, except, indeed, one vague report which will appear in the sequel.

Ere introducing the scrivener, as he first appeared to me, it is fit I make some mention of myself, my _employees_, my business, my chambers, and general surroundings; because some such description is indispensable to an adequate understanding of the chief character about to be presented.

Imprimis: I am a man who, from his youth upwards, has been filled with a profound conviction that the easiest way of life is the best. Hence, though I belong to a profession proverbially energetic and nervous, even to turbulence, at times, yet nothing of that sort have I ever suffered to invade my peace. I am one of those unambitious lawyers who never addresses a jury, or in any way draws down public applause; but in the cool tranquility of a snug retreat, do a snug business among rich men's bonds and mortgages and title-deeds.

All who know me, consider me an eminently _safe_ man. The late John Jacob Astor, a personage little given to poetic enthusiasm, had no hesitation in pronouncing my first grand point to be prudence; my next, method. I do not speak it in vanity, but simply record the fact, that I was not unemployed in my profession by the late John Jacob Astor; a name which, I admit, I love to repeat, for it hath a rounded and orbicular sound to it, and rings like unto bullion. I will freely add, that I was not insensible to the late John Jacob Astor's good opinion.

1.3 Math Table [3]

1.3.1 Common CTFT Properties

Time Domain Signal | Frequency Domain Signal | Condition

e^(-at) u(t) | 1/(a + iω) | a > 0
e^(at) u(-t) | 1/(a - iω) | a > 0
e^(-a|t|) | 2a/(a² + ω²) | a > 0
t e^(-at) u(t) | 1/(a + iω)² | a > 0
t^n e^(-at) u(t) | n!/(a + iω)^(n+1) | a > 0
δ(t) | 1 |
1 | 2πδ(ω) |
e^(iω0t) | 2πδ(ω - ω0) |
cos(ω0t) | π(δ(ω - ω0) + δ(ω + ω0)) |
sin(ω0t) | iπ(δ(ω + ω0) - δ(ω - ω0)) |
u(t) | πδ(ω) + 1/(iω) |
sgn(t) | 2/(iω) |
cos(ω0t) u(t) | (π/2)(δ(ω - ω0) + δ(ω + ω0)) + iω/(ω0² - ω²) |
sin(ω0t) u(t) | (π/(2i))(δ(ω - ω0) - δ(ω + ω0)) + ω0/(ω0² - ω²) |
e^(-at) sin(ω0t) u(t) | ω0/((a + iω)² + ω0²) | a > 0
e^(-at) cos(ω0t) u(t) | (a + iω)/((a + iω)² + ω0²) | a > 0
u(t + τ) - u(t - τ) | 2τ sin(ωτ)/(ωτ) = 2τ sinc(ωτ) |
(ω0/π) sin(ω0t)/(ω0t) = (ω0/π) sinc(ω0t) | u(ω + ω0) - u(ω - ω0) |
(t/τ + 1)[u(t/τ + 1) - u(t/τ)] + (-t/τ + 1)[u(t/τ) - u(t/τ - 1)] = triag(t/τ) | τ sinc²(ωτ/2) |
(ω0/(2π)) sinc²(ω0t/2) | (ω/ω0 + 1)[u(ω/ω0 + 1) - u(ω/ω0)] + (-ω/ω0 + 1)[u(ω/ω0) - u(ω/ω0 - 1)] = triag(ω/ω0) |
Σ_{n=-∞}^{∞} δ(t - nT) | ω0 Σ_{n=-∞}^{∞} δ(ω - nω0) | ω0 = 2π/T
e^(-t²/(2σ²)) | σ√(2π) e^(-σ²ω²/2) |

Table 1.10

[3] This content is available online at <http://legacy.cnx.org/content/m10099/2.12/>.

triag[n] is the triangle function for arbitrary real-valued n:

triag[n] = 1 + n if -1 ≤ n ≤ 0;  1 - n if 0 < n ≤ 1;  0 otherwise
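Pairs in a table like this are easy to sanity-check numerically. The sketch below (plain Python with names of my own choosing, not part of the original module) approximates the CTFT integral X(ω) = ∫ x(t) e^(-iωt) dt by a Riemann sum over a wide window and checks the pair e^(-a|t|) ↔ 2a/(a² + ω²):

```python
import math

# Numerically approximate X(w) = integral of x(t) e^{-iwt} dt by a
# Riemann sum; t_max and dt chosen so the truncated tails and the
# discretisation error are both far below the test tolerance.
def ctft(signal, w, t_max=40.0, dt=0.002):
    n = int(2 * t_max / dt)
    total = 0.0 + 0.0j
    for k in range(n):
        t = -t_max + k * dt
        total += signal(t) * complex(math.cos(w * t), -math.sin(w * t)) * dt
    return total

a = 1.5
for w in (0.0, 0.7, 2.0):
    numeric = ctft(lambda t: math.exp(-a * abs(t)), w)
    analytic = 2 * a / (a ** 2 + w ** 2)   # table row: e^{-a|t|}
    assert abs(numeric.real - analytic) < 1e-3
    assert abs(numeric.imag) < 1e-3        # even signal: imag part ~ 0
```

The same harness can be pointed at any absolutely integrable row of the table; rows containing δ(ω) terms need distributional handling and cannot be checked this way.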

1.4 Bad Header Table [4]

In the module Use of Laplacian PDFs in Image Compression (Section 1.5) we have assumed that ideal entropy coding has been used in order to calculate the bit rates for the coded data. In practice we must use real codes, and we shall now see how this affects the compression performance.

There are three main techniques for achieving entropy coding:

• Huffman Coding - one of the simplest variable-length coding schemes.

• Run-length Coding (RLC) - very useful for binary data containing long runs of ones or zeros.

• Arithmetic Coding - a relatively new variable-length coding scheme that can combine the best features of Huffman and run-length coding, and also adapt to data with non-stationary statistics.

[4] This content is available online at <http://legacy.cnx.org/content/m11091/2.3/>.

We shall concentrate on the Huffman and RLC methods for simplicity. Interested readers may find out more about Arithmetic Coding in chapters 12 and 13 of the JPEG Book.

First we consider the change in compression performance if simple Huffman Coding is used to code the subimages of the 4-level Haar transform.

The calculation of entropy in this equation [5] from our discussion of entropy assumed that each message with probability pi could be represented by a word of length ℓi = -log2(pi) bits. Huffman codes require the ℓi to be integers and assume that the pi are adjusted to become:

p̂i = 2^(-ℓi) (1.1)

where the ℓi are integers, chosen subject to the constraint that Σi p̂i ≤ 1 (to guarantee that sufficient uniquely decodable code words are available) and such that the mean Huffman word length (Huffman entropy), Ĥ = Σi pi ℓi, is minimised.

We can use the probability histograms which generated the entropy plots in figures of level 1 energies [6], level 2 energies [7], level 3 energies [8] and level 4 energies [9] to calculate the Huffman entropies Ĥ for each subimage and compare these with the true entropies to see the loss in performance caused by using real Huffman codes.

An algorithm for finding the optimum code sizes ℓi is recommended in the JPEG specification [the JPEG Book, Appendix A, Annex K.2, fig K.1]; and a Matlab M-file to implement it is given in M-file code (p. 11).

[5] "Entropy", (1) <http://legacy.cnx.org/content/m11088/latest/#eq5>
[6] "Entropy", Figure 3 <http://legacy.cnx.org/content/m11088/latest/#gure5>
[7] "The Multi-level Haar Transform", Figure 2 <http://legacy.cnx.org/content/m11089/latest/#gure8>
[8] "The Multi-level Haar Transform", Figure 3 <http://legacy.cnx.org/content/m11089/latest/#gure9>
[9] "The Multi-level Haar Transform", Figure 4 <http://legacy.cnx.org/content/m11089/latest/#gure10>


Figure 1.1: Comparison of entropies (columns 1, 3, 5) and Huffman coded bit rates (columns 2, 4, 6) for the original (columns 1 and 2) and transformed (columns 3 to 6) Lenna images. In columns 5 and 6, the zero amplitude state is run-length encoded to produce many states with probabilities < 0.5.


Numerical results used in the figure - Entropies and Bit rates of Subimages for Qstep=15

Column:      1       2       3       4       5       6

                           0.0264  0.0265  0.0264  0.0266
Level 4                    0.0220  0.0222  0.0221  0.0221
                           0.0186  0.0187  0.0185  0.0186
                           0.0171  0.0172  0.0171  0.0173

                           0.0706  0.0713  0.0701  0.0705
Level 3                    0.0556  0.0561  0.0557  0.0560
                           0.0476  0.0482  0.0466  0.0471

                           0.1872  0.1897  0.1785  0.1796
Level 2                    0.1389  0.1413  0.1340  0.1353
                           0.1096  0.1170  0.1038  0.1048

                           0.4269  0.4566  0.3739  0.3762
Level 1                    0.2886  0.3634  0.2691  0.2702
                           0.2012  0.3143  0.1819  0.1828

Totals:   3.7106  3.7676  1.6103  1.8425  1.4977  1.5071

Table 1.11

Figure 1.1 shows the results of applying this algorithm to the probability histograms, and Table 1.11: Numerical results used in the figure - Entropies and Bit rates of Subimages for Qstep=15 lists the same results numerically for ease of analysis. Columns 1 and 2 compare the ideal entropy with the mean word length or bit rate from using a Huffman code (the Huffman entropy) for the case of the untransformed image where the original pels are quantized with Qstep = 15. We see that the increase in bit rate from using the real code is:

3.7676/3.7106 - 1 = 1.5%

But when we do the same for the 4-level transformed subimages, we get columns 3 and 4. Here we see that real Huffman codes require an increase in bit rate of:

1.8425/1.6103 - 1 = 14.4%

Comparing the results for each subimage in columns 3 and 4, we see that most of the increase in bit rate arises in the three level-1 subimages at the bottom of the columns. This is because each of the probability histograms for these subimages (see figure [10]) contains one probability that is greater than 0.5. Huffman codes cannot allocate a word length of less than 1 bit to a given event, and so they start to lose efficiency rapidly when -log2(pi) becomes less than 1, i.e. when pi > 0.5.

Run-length codes (RLCs) are a simple and effective way of improving the efficiency of Huffman coding when one event is much more probable than all of the others combined. They operate as follows:

• The pels of the subimage are scanned sequentially (usually in columns or rows) to form a long 1-dimensional vector.

• Each run of consecutive zero samples (the most probable events) in the vector is coded as a single event.

• Each non-zero sample is coded as a single event in the normal way.

• The two types of event (runs-of-zeros and non-zero samples) are allocated separate sets of codewords in the same Huffman code, which may be designed from a histogram showing the frequencies of all events.

• To limit the number of run events, the maximum run length may be limited to a certain value (we have used 128) and runs longer than this may be represented by two or more run codes in sequence, with negligible loss of efficiency.

[10] "Entropy", Figure 3 <http://legacy.cnx.org/content/m11088/latest/#gure5>

Hence RLC may be added before Huffman coding as an extra processing step, which converts the most probable event into many separate events, each of which has pi < 0.5 and may therefore be coded efficiently.
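The scanning-and-run-coding steps above can be sketched in a few lines of Python (a hypothetical helper, not from the original module), including the maximum-run-length cap of 128:

```python
def rlc_events(samples, max_run=128):
    """Convert a 1-D sample vector into RLC events: each run of zeros
    becomes a ('run', length) event (length capped at max_run), and
    every non-zero sample becomes a ('val', sample) event."""
    events = []
    run = 0
    for s in samples:
        if s == 0:
            run += 1
            if run == max_run:              # cap long runs: emit and restart
                events.append(('run', max_run))
                run = 0
        else:
            if run:                         # flush any pending run of zeros
                events.append(('run', run))
                run = 0
            events.append(('val', s))
    if run:                                 # flush a trailing run
        events.append(('run', run))
    return events

# A clustered zero pattern codes into far fewer events than pels:
ev = rlc_events([0] * 10 + [5, -3] + [0] * 6 + [7])
# ev == [('run', 10), ('val', 5), ('val', -3), ('run', 6), ('val', 7)]
```

The two event types would then be given separate codeword sets within one Huffman code, as the bullet list describes.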

Figure 1.2 shows the new probability histograms and entropies for level 1 of the Haar transform when RLC is applied to the zero event of the three bandpass subimages. Comparing this with a previous figure [11], note the absence of the high probability zero events and the new states to the right of the original histograms corresponding to the run lengths.

Figure 1.2: Probability histograms (dashed) and entropies (solid) of the four subimages of the Level 1 Haar transform of Lenna (see figure [12]) after RLC.

[11] "Entropy", Figure 3 <http://legacy.cnx.org/content/m11088/latest/#gure5>
[12] "The Haar Transform", Figure 1(b) <http://legacy.cnx.org/content/m11087/latest/#gure1b>

The total entropy per event for an RLC subimage is calculated as before from the entropy histogram. However, to get the entropy per pel we scale the entropy by the ratio of the number of events (runs and non-zero samples) in the subimage to the number of pels in the subimage (note that with RLC this ratio will no longer equal one - it will hopefully be much less).

Figure 1.2 gives the entropies per pel after RLC for each subimage, which are now less than the entropies in this figure [13]. This is because RLC takes advantage of spatial clustering of the zero samples in a subimage, rather than just depending on the histogram of amplitudes.

Clearly if all the zeros were clustered into a single run, this could be coded much more efficiently than if they are distributed into many runs. The entropy of the zero event tells us the mean number of bits to code each zero pel if the zero pels are distributed randomly, i.e. if the probability of a given pel being zero does not depend on the amplitudes of any nearby pels.

In typical bandpass subimages, non-zero samples tend to be clustered around key features such as object boundaries and areas of high texture. Hence RLC usually reduces the entropy of the data to be coded.

There are many other ways to take advantage of clustering (correlation) of the data - RLC is just one of the simplest.

In Figure 1.1, comparing column 5 with column 3, we see the modest (7%) reduction in entropy per pel achieved by RLC, due to clustering in the Lenna image. The main advantage of RLC is apparent in column 6, which shows the mean bit rate per pel when we use a real Huffman code on the RLC histograms of Figure 1.2.

The increase in bit rate over the RLC entropy is only

1.5071/1.4977 - 1 = 0.63%

compared with 14.4% when RLC is not used (columns 3 and 4).

Finally, comparing column 6 with column 3, we see that, relative to the simple entropy measure, combined RLC and Huffman coding can reduce the bit rate by

1 - 1.5071/1.6103 = 6.4%

The closeness of this ratio to unity justifies our use of simple entropy as a tool for assessing the information compression properties of the Haar transform - and of other energy compression techniques as we meet them.

The following is the listing of the M-file to calculate the Huffman entropy from a given histogram.

% Find Huffman code sizes: JPEG fig K.1, procedure Code_size.
% huffhist contains the histogram of event counts (frequencies).
freq = huffhist(:);
codesize = zeros(size(freq));
others = -ones(size(freq)); % Pointers to next symbols in code tree.

% Find non-zero entries in freq, and loop until only 1 entry left.
nz = find(freq > 0);
while length(nz) > 1,
  % Find v1 for least value of freq(v1) > 0.
  [y,i] = min(freq(nz));
  v1 = nz(i);
  % Find v2 for next least value of freq(v2) > 0.
  nz = nz([1:(i-1) (i+1):length(nz)]); % Remove v1 from nz.
  [y,i] = min(freq(nz));
  v2 = nz(i);
  % Combine frequency values.
  freq(v1) = freq(v1) + freq(v2);
  freq(v2) = 0;
  codesize(v1) = codesize(v1) + 1;
  % Increment code sizes for all codewords in this tree branch.
  while others(v1) > -1,
    v1 = others(v1);
    codesize(v1) = codesize(v1) + 1;
  end
  others(v1) = v2;
  codesize(v2) = codesize(v2) + 1;
  while others(v2) > -1,
    v2 = others(v2);
    codesize(v2) = codesize(v2) + 1;
  end
  nz = find(freq > 0);
end

% Generate Huffman entropies by multiplying probabilities by code sizes.
huffent = (huffhist(:)/sum(huffhist(:))) .* codesize;

[13] "Entropy", Figure 3 <http://legacy.cnx.org/content/m11088/latest/#gure5>
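For readers without Matlab, here is a hypothetical Python port of the Code_size procedure above (a sketch using the same merge-and-walk logic with list indices in place of Matlab vectors; the function name is mine, not from the JPEG specification):

```python
def huffman_code_sizes(freq):
    """Repeatedly merge the two least-frequent non-empty symbols and
    increment the code size of every codeword in each merged branch,
    following the structure of the M-file listing above."""
    freq = list(freq)
    codesize = [0] * len(freq)
    others = [-1] * len(freq)       # pointer to next symbol in code tree
    while True:
        nz = [i for i, f in enumerate(freq) if f > 0]
        if len(nz) <= 1:
            break
        nz.sort(key=lambda i: freq[i])
        v1, v2 = nz[0], nz[1]       # two least-frequent symbols
        freq[v1] += freq[v2]        # combine frequency values
        freq[v2] = 0
        codesize[v1] += 1
        while others[v1] > -1:      # walk and lengthen v1's branch
            v1 = others[v1]
            codesize[v1] += 1
        others[v1] = v2
        codesize[v2] += 1
        while others[v2] > -1:      # walk and lengthen v2's branch
            v2 = others[v2]
            codesize[v2] += 1
    return codesize

# Four symbols with probabilities 1/2, 1/4, 1/8, 1/8 get the ideal
# -log2(p) code lengths 1, 2, 3, 3:
# huffman_code_sizes([8, 4, 2, 2]) -> [1, 2, 3, 3]
```

The Huffman entropy Ĥ then follows as in the last M-file line: weight each code size by its symbol's probability and sum.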

1.5 Good Header Table [14]

It is found to be appropriate and convenient to model the distribution of many types of transformed image coefficients by Laplacian distributions. It is appropriate because much real data is approximately modeled by the Laplacian probability density function (PDF), and it is convenient because the mathematical form of the Laplacian PDF is simple enough to allow some useful analytical results to be derived.

A Laplacian PDF is a back-to-back pair of exponential decays and is given by:

p(x) = (1/(2 x0)) e^(-|x|/x0) (1.2)

where x0 is the equivalent of a time constant which defines the width of the PDF from the centre to the 1/e points. The initial scaling factor ensures that the area under p(x) is unity, so that it is a valid PDF.

Figure 1.3 shows the shape of p(x).

[14] This content is available online at <http://legacy.cnx.org/content/m11090/2.4/>.


Figure 1.3: Laplacian PDF, p(x), and typical quantiser decision thresholds, shown for the case when the quantiser step size Q = 2 x0

The mean of this PDF is zero and the variance is given by:

v(x0) = ∫_{-∞}^{∞} x² p(x) dx = 2 ∫_0^{∞} (x²/(2 x0)) e^(-x/x0) dx = 2 x0² (1.3)

(using integration by parts twice).

Hence the standard deviation is:

σ(x0) = sqrt(v(x0)) = √2 x0 (1.4)

Given the variance (power) of a subimage of transformed pels, we may calculate x0 and hence determine the PDF of the subimage, assuming a Laplacian shape. We now show that, if we quantise the subimage using a uniform quantiser with step size Q, we can calculate the entropy of the quantised samples and thus estimate the bit rate needed to encode the subimage in bits/pel. This is a powerful analytical tool as it shows how the compressed bit rate relates directly to the energy of a subimage. The vertical dashed lines in Figure 1.3 show the decision thresholds for a typical quantiser for the case when Q = 2 x0.

First we analyse the probability of a pel being quantised to each step of the quantiser. This is given by the area under p(x) between each adjacent pair of quantiser thresholds.

• Probability of being at step 0: p0 = Pr[-Q/2 < x < Q/2] = 2 Pr[0 < x < Q/2]

• Probability of being at step k: pk = Pr[(k - 1/2)Q < x < (k + 1/2)Q]

First, for x2 ≥ x1 ≥ 0, we calculate:

Pr[x1 < x < x2] = ∫_{x1}^{x2} p(x) dx = [-(1/2) e^(-x/x0)]_{x1}^{x2} = (1/2) (e^(-x1/x0) - e^(-x2/x0))


Therefore,

p0 = 1 - e^(-Q/(2 x0)) (1.5)

and, for k ≥ 1,

pk = (1/2) (e^(-(k - 1/2)Q/x0) - e^(-(k + 1/2)Q/x0)) = sinh(Q/(2 x0)) e^(-kQ/x0) (1.6)

By symmetry, if k is nonzero, p-k = pk = sinh(Q/(2 x0)) e^(-|k|Q/x0). Now we can calculate the entropy of the subimage:

H = -Σ_{k=-∞}^{∞} pk log2(pk) = -p0 log2(p0) - 2 Σ_{k=1}^{∞} pk log2(pk) (1.7)

To make the evaluation of the summation easier when we substitute for pk, we let

pk = α r^k, where α = sinh(Q/(2 x0)) and r = e^(-Q/x0).

Therefore,

Σ_{k=1}^{∞} pk log2(pk) = Σ_{k=1}^{∞} α r^k log2(α r^k)
                        = Σ_{k=1}^{∞} α r^k (log2(α) + k log2(r))
                        = α log2(α) Σ_{k=1}^{∞} r^k + α log2(r) Σ_{k=1}^{∞} k r^k (1.8)

Now Σ_{k=1}^{∞} r^k = r/(1 - r) and, differentiating by r, Σ_{k=1}^{∞} k r^(k-1) = 1/(1 - r)². Therefore,

Σ_{k=1}^{∞} pk log2(pk) = α log2(α) r/(1 - r) + α log2(r) r/(1 - r)²
                        = (α r/(1 - r)) (log2(α) + log2(r)/(1 - r)) (1.9)

and, since e^(-Q/(2 x0)) = √r,

p0 log2(p0) = (1 - √r) log2(1 - √r) (1.10)

Hence the entropy is given by:

H = -(1 - √r) log2(1 - √r) - (2 α r/(1 - r)) (log2(α) + log2(r)/(1 - r)) (1.11)

Because both α and r are functions of Q/x0, H is a function of just Q/x0 too. We expect that, for constant Q, as the energy of the subimage increases, the entropy will also increase approximately logarithmically, so we plot H against x0/Q in dB in Figure 1.4. This shows that our expectations are borne out.


Figure 1.4: Entropy H and approximate entropy Ha of a quantised subimage with Laplacian PDF, as a function of x0/Q in dB.

We can show this in theory by considering the case when x0/Q ≫ 1, when we find that:

α ≈ Q/(2 x0),  r ≈ 1 - Q/x0 ≈ 1 - 2α,  √r ≈ 1 - α

Using the approximation log2(1 - ε) ≈ -ε/ln(2) for small ε, it is then fairly straightforward to show that

H ≈ -log2(α) + 1/ln(2) ≈ log2(2 e x0/Q)

We denote this approximation as Ha in Figure 1.4, which shows how close to H the approximation is, for x0 > Q (i.e. for x0/Q > 0 dB).
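Both (1.11) and the approximation Ha are easy to evaluate numerically; the sketch below (my own code, names not from the source) confirms that they agree to within a few hundredths of a bit once x0 > Q:

```python
import math

def entropy(q_over_x0):
    """Exact entropy H of eq. (1.11) as a function of Q/x0."""
    s = q_over_x0
    alpha = math.sinh(s / 2)
    r = math.exp(-s)
    sr = math.sqrt(r)
    return (-(1 - sr) * math.log2(1 - sr)
            - (2 * alpha * r / (1 - r))
              * (math.log2(alpha) + math.log2(r) / (1 - r)))

for q in (0.5, 0.25, 0.1):             # x0/Q = 2, 4, 10, all with x0 > Q
    Ha = math.log2(2 * math.e / q)     # approximation log2(2 e x0/Q)
    assert abs(entropy(q) - Ha) < 0.05
```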

We can compare the entropies calculated using (1.11) with those that were calculated from the bandpass subimage histograms, as given in these figures describing Haar transform energies and entropies: level 1 energies [15], level 2 energies [16], level 3 energies [17], and level 4 energies [18]. (The Lo-Lo subimages have PDFs which are more uniform and do not fit the Laplacian model well.) The values of x0 are calculated from:

x0 = (std. dev.)/√2 = sqrt( subimage energy / (2 × (no of pels in subimage)) )

The following table shows this comparison:

Transform level | Subimage type | Energy (×10^6) | No of pels | x0 | Laplacian entropy | Measured entropy

1 | Hi-Lo | 4.56  | 16384 | 11.80 | 2.16 | 1.71
1 | Lo-Hi | 1.89  | 16384 | 7.59  | 1.58 | 1.15
1 | Hi-Hi | 0.82  | 16384 | 5.09  | 1.08 | 0.80
2 | Hi-Lo | 7.64  | 4096  | 30.54 | 3.48 | 3.00
2 | Lo-Hi | 2.95  | 4096  | 18.98 | 2.81 | 2.22
2 | Hi-Hi | 1.42  | 4096  | 13.17 | 2.31 | 1.75
3 | Hi-Lo | 13.17 | 1024  | 80.19 | 4.86 | 4.52
3 | Lo-Hi | 3.90  | 1024  | 43.64 | 3.99 | 3.55
3 | Hi-Hi | 2.49  | 1024  | 34.87 | 3.67 | 3.05
4 | Hi-Lo | 15.49 | 256   | 173.9 | 5.98 | 5.65
4 | Lo-Hi | 6.46  | 256   | 112.3 | 5.35 | 4.75
4 | Hi-Hi | 3.29  | 256   | 80.2  | 4.86 | 4.38

Table 1.12

We see that the entropies calculated from the energy via the Laplacian PDF method (second column from the right) are approximately 0.5 bit/pel greater than the entropies measured from the Lenna subimage histograms. This is due to the heavier tails of the actual PDFs compared with the Laplacian exponentially decreasing tails. More accurate entropies can be obtained if x0 is obtained from the mean absolute values of the pels in each subimage. For a Laplacian PDF we can show that:

Mean absolute value = ∫_{-∞}^{∞} |x| p(x) dx = 2 ∫_0^{∞} (x/(2 x0)) e^(-x/x0) dx = x0 (1.12)

This gives values of x0 that are about 20% lower than those calculated from the energies, and the calculated entropies are then within approximately 0.2 bit/pel of the measured entropies.

[15] "Entropy", Figure 3 <http://legacy.cnx.org/content/m11088/latest/#gure5>
[16] "The Multi-level Haar Transform", Figure 2 <http://legacy.cnx.org/content/m11089/latest/#gure8>
[17] "The Multi-level Haar Transform", Figure 3 <http://legacy.cnx.org/content/m11089/latest/#gure9>
[18] "The Multi-level Haar Transform", Figure 4 <http://legacy.cnx.org/content/m11089/latest/#gure10>


1.6 More Bad Header Tables [19]

1.6.1 Complex numbers

m-file environments have excellent support for complex numbers. The imaginary unit is denoted by i or (as preferred in Electrical Engineering) j. To create complex variables z1 = 7 + i and z2 = 2 e^(iπ), simply enter

z1 = 7 + j and z2 = 2*exp(j*pi)

The table (Table 1.13: Manipulating complex numbers) gives an overview of the basic functions for manipulating complex numbers, where z is a complex number.

Manipulating complex numbers

Operation        | m-file
Re(z)            | real(z)
Im(z)            | imag(z)
|z|              | abs(z)
Angle(z)         | angle(z)
z̄ (conjugate)    | conj(z)

Table 1.13
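The same manipulations can be tried in Python (a sketch of my own, not part of the original module), where the imaginary unit is written 1j and the angle function lives in the cmath module:

```python
import cmath

z1 = 7 + 1j
z2 = 2 * cmath.exp(1j * cmath.pi)    # 2*e^{j*pi}, numerically -2

assert z1.real == 7 and z1.imag == 1               # Re(z), Im(z)
assert abs(abs(z1) - 50 ** 0.5) < 1e-12            # |z| = sqrt(7^2 + 1^2)
assert z1.conjugate() == 7 - 1j                    # conj(z)
assert abs(cmath.phase(1j) - cmath.pi / 2) < 1e-12 # angle(z)
assert abs(z2 - (-2)) < 1e-12                      # e^{j*pi} = -1
```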

1.6.2 Operations on Matrices

In addition to scalars, m-file environments can operate on matrices. Some common matrix operations are shown in the table (Table 1.14: Common matrix operations) below; in this table, M and N are matrices.

Common matrix operations

Operation              | m-file
M N (matrix product)   | M*N
M^(-1)                 | inv(M)
M^T                    | M'
det(M)                 | det(M)

Table 1.14

Some useful facts:

• The functions length and size are used to find the dimensions of vectors and matrices, respectively.

• Operations can also be performed on each element of a vector or matrix by preceding the operator with ".", e.g. .*, .^ and ./.

Example 1.1

Let A = [1 1; 1 1]. Then A^2 will return A·A = [2 2; 2 2], while A.^2 will return [1² 1²; 1² 1²] = [1 1; 1 1].

[19] This content is available online at <http://legacy.cnx.org/content/m13751/1.2/>.


Example 1.2

Given a vector x, compute a vector y having elements y(n) = 1/sin(x(n)). This can easily be done with the command y = 1./sin(x). Note that using / in place of ./ would result in the (common) error "Matrix dimensions must agree".
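The same matrix-versus-elementwise distinction can be illustrated in Python with NumPy (assuming NumPy is available; note the defaults are reversed relative to an m-file: * and ** are elementwise like .* and .^, while @ is the matrix product):

```python
import numpy as np

A = np.array([[1, 1], [1, 1]])

matrix_square = A @ A         # like A^2  in an m-file -> [[2, 2], [2, 2]]
elementwise_square = A ** 2   # like A.^2 in an m-file -> [[1, 1], [1, 1]]

assert (matrix_square == np.array([[2, 2], [2, 2]])).all()
assert (elementwise_square == A).all()

# Example 1.2 in NumPy: 1/sin is applied element by element.
x = np.array([0.5, 1.0, 2.0])
y = 1 / np.sin(x)
assert np.allclose(y * np.sin(x), 1.0)
```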

1.7 Author Blank Entry Table [20]

The table below provides a number of unilateral and bilateral z-transforms [21]. The table also specifies the region of convergence [22].

note: The notation for z found in the table below may differ from that found in other tables. For example, the basic z-transform of u[n] can be written as either of the following two expressions, which are equivalent:

z/(z - 1) = 1/(1 - z^(-1)) (1.13)

Signal | Z-Transform | ROC

δ[n - k]               | z^(-k)                                   | all z
u[n]                   | z/(z - 1)                                | |z| > 1
-u[-n - 1]             | z/(z - 1)                                | |z| < 1
n u[n]                 | z/(z - 1)²                               | |z| > 1
n² u[n]                | z(z + 1)/(z - 1)³                        | |z| > 1
n³ u[n]                | z(z² + 4z + 1)/(z - 1)⁴                  | |z| > 1
-α^n u[-n - 1]         | z/(z - α)                                | |z| < |α|
α^n u[n]               | z/(z - α)                                | |z| > |α|
n α^n u[n]             | αz/(z - α)²                              | |z| > |α|
n² α^n u[n]            | αz(z + α)/(z - α)³                       | |z| > |α|
(Π_{k=1}^{m} (n - k + 1) / (α^m m!)) α^n u[n] | z/(z - α)^(m+1)   |
γ^n cos(αn) u[n]       | z(z - γ cos(α))/(z² - 2γ cos(α) z + γ²)  | |z| > |γ|
γ^n sin(αn) u[n]       | zγ sin(α)/(z² - 2γ cos(α) z + γ²)        | |z| > |γ|

Table 1.15
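For the unilateral rows, partial sums of the defining series X(z) = Σ_{n≥0} x[n] z^(-n) converge quickly inside the ROC, so individual table rows can be checked numerically. A small sketch (my own helper names, not from the source):

```python
# Partial-sum check of two table rows: u[n] <-> z/(z-1) for |z| > 1,
# and alpha^n u[n] <-> z/(z-alpha) for |z| > |alpha|.
def z_transform_partial(x, z, terms=2000):
    return sum(x(n) * z ** (-n) for n in range(terms))

z = 1.5                                   # inside both ROCs
assert abs(z_transform_partial(lambda n: 1.0, z) - z / (z - 1)) < 1e-9

alpha = 0.8
assert abs(z_transform_partial(lambda n: alpha ** n, z)
           - z / (z - alpha)) < 1e-9
```

Outside the ROC (e.g. z = 0.9 for u[n]) the partial sums diverge, which is exactly what the ROC column records.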

1.8 Rowspan Table - Feasibility Constraints [23]

HOW TO EDIT: Write your module for a student audience. To complete or edit the sections below, erase the provided textual commentaries, then add your own content using one or more of the following strategies:

- Type or paste the content directly into the appropriate section
- Link to a published CNX module or an external online resource using the ``Links'' tabs (see example on the right)
- Link to a document or multimedia file within the content after uploading the file using the ``Files'' tab (see example below)
- Cite content not available online

[20] This content is available online at <http://legacy.cnx.org/content/m10119/2.14/>.
[21] "The Z Transform: Definition" <http://legacy.cnx.org/content/m10549/latest/>
[22] "Region of Convergence for the Z-transform" <http://legacy.cnx.org/content/m10622/latest/>
[23] This content is available online at <http://legacy.cnx.org/content/m14789/1.9/>.

Word Version of this Template

This media object is a downloadable file. Please view or download it at
<EAC TK STD TEMPLATE.doc>

Figure 1.5: This is an example of an embedded link. (Go to "Files" tab to delete this file and replace it with your own files.)

1.8.1 Introduction

In this module you will study a real world ethical problem, the Toysmart case, and employ frameworks based on the software development cycle to (1) specify ethical and technical problems, (2) generate solutions that integrate ethical value, (3) test these solutions, and (4) implement them over situation-based constraints.

This module will provide you with an opportunity to practice integrating ethical considerations into real world decision-making and problem-solving in business and computing. This whole approach is based on an analogy between ethics and design (Whitbeck).

Large real world cases like Toysmart pivot around crucial decision points. You will take on the role of one of the participants in the Toysmart case and problem-solve in teams from one of three decision points.

Problem-solving in the real world requires perseverance, moral creativity, moral imagination, and reasonableness; one appropriates these skills through practice in different contexts. Designing and implementing solutions requires identifying conflicting values and interests, balancing them in creative and dynamic solutions, overcoming technical limits, and responding creatively to real world constraints.

Each decision point requires that you take up the position of a participant in the case and work through decision-making frameworks from his or her perspective. You may be tempted to back out and adopt an evaluative posture from which to judge the participants. Resist this temptation. This module is specifically designed to give you practice in making real world decisions. These skills emerge when you role play from one of the standpoints within the case. You will learn that decision-making requires taking stock of one's situation from within a clearly defined standpoint and then accepting responsibility for what arises from within that standpoint.

Cases such as Toysmart are challenging because of the large amount of information gathering and sorting they require. Moral imagination responds to this challenge by providing different framings that help to filter out irrelevant data and structure what remains. Framing plays a central role in problem specification. For example, Toysmart could be framed as the need to develop more effective software to help negotiate the exchange of information online. In this case, a software programming expert would be brought in to improve P3P programs. Or it could be framed as a legal problem that requires amending the Bankruptcy Code.

What is important at this stage is that you and your group experiment with multiple framings of the case


around your decision point. This makes it possible to open up avenues of solution that would not be possible under one framing.

Tackling large cases in small teams also helps develop the communication and collaboration skills that are required for group work. Take time to develop strategies for dividing the work load among your team members. The trick is to distribute equally but, at the same time, to assign tasks according to the different abilities of your team members. Some individuals are better at research while others excel in interviewing or writing. Also, make sure to set aside time when you finish for integrating your work with that of your teammates. Start by quickly reviewing the information available on the case. This is called scoping the case. Then formulate specific questions to focus further research on information relevant to your problem solving efforts. This includes information pertinent to constructing a socio-technical analysis, identifying key embedded ethical issues, and uncovering existing best and worst practices.

A case narrative, STS (socio-technical system) description, and two ethical reflections have been published at http://computingcases.org. This module also links to websites on bankruptcy and privacy law, the Model Business Corporation Act, consumer privacy information, and the TRUSTe website.

1.8.1.1 Toysmart Narrative

Toysmart was a Disney-supported company that sold educational toys online from December 1998 to May 2000. After disappointing Christmas sales in 1999, Disney withdrew its financial support. The greatly weakened dot-com company lasted less than a year after this. On May 22, 2000, Toysmart announced that it was closing down and brought in a consulting firm, The Recovery Group, to evaluate its assets, including a customer data base of 260,000 profiles, each worth up to $500.

Fierce opposition emerged when Toysmart placed ads in the Wall Street Journal and the Boston Globe to sell this data base. Customer interest groups pointed out that Toysmart had promised not to share customer information with third parties. Toysmart also prominently displayed the TRUSTe seal, which testified further to the company's obligations to respect customer privacy and security. Selling this data to third parties would break Toysmart promises, violate TRUSTe policies, and undermine consumer confidence in the security and privacy of online transactions. Toysmart's obligations to its customers came into direct conflict with its financial obligations to its investors and creditors.

TRUSTe reported Toysmart's intention to sell its data base to the FTC (Federal Trade Commission), who on July 10, 2000 filed a complaint "seeking injunctive and declaratory relief to prevent the sale of confidential, personal customer information" (FTC article). Toysmart's promise never to share customer PII with third parties provided the legal foundation for this complaint. According to the FTC, Toysmart "violated Section 5 of the FTC Act by misrepresenting to customers that personal information would never be shared with third parties, then disclosing, selling, or offering that information for sale." Finally, because it collected data from children under 13 who entered various contests offered on its website, Toysmart was also cited for violating the Children's Online Privacy Protection Act or COPPA.

The FTC reached a settlement with Toysmart. The bankrupt dot-com must "file an order in the bankruptcy court prohibiting the sale of its customer data as a 'stand-alone asset'." In other words, the rights bundled in the liquidation and sale of Toysmart did not include the liberty of buyers to dispose of the asset in whatever way they saw fit. According to the negotiated settlement, buyers were bound by the commitments and promises of the original owners. Toysmart creditors "can sell electronic assets only if the purchasing company abided by the same privacy policy." In essence, the FTC asked Toysmart creditors to honor the spirit, if not the letter, of Toysmart's original promise to its customers not to sell their PII to third parties. Creditors now had to guarantee that (1) the buyer had the same basic values as Toysmart (for example, a commitment to selling quality, educational toys), (2) the buyer use the data in the same way that Toysmart had promised to use it when collecting it, and (3) the buyer would not transfer the information to third parties without customer consent. In this way, the settlement proposed to protect Toysmart customer privacy interests while allowing creditors to recover their losses through the sale of the bankrupt company's "crown jewel", its customer data base.

On August 17, 2000, the Federal Bankruptcy Court declined to accept the Toysmart-FTC settlement.

Instead, they argued that Toysmart and the FTC should wait to see if any parties willing to buy the data base would come forward. The Bankruptcy Court felt that potential buyers would be scared off by the FTC suit and the pre-existing obligations created by Toysmart promises and TRUSTe standards. Should a buyer come forth, then they would evaluate the buyer's offer in terms of the FTC-Toysmart settlement designed to honor the privacy and security commitments made to Toysmart customers.

A final settlement was reached on January 10, 2001. When a buyer did not come forward, Buena Vista Toy Company, a Disney Internet subsidiary who was also a major Toysmart creditor, agreed to buy the data base for $50,000 with the understanding that it would be immediately destroyed. The data base was then deleted and affidavits were provided to this effect.

1.8.1.2 Toysmart Chronology

Time Line

1997: David Lord, former college football player, comes to work for Holt Education Outlet in Waltham, Mass.

December 1998: Lord and Stan Fung (Zero Stage Capital) buy Holt Education Outlet and rename it "Toysmart." (Lorek) Toysmart focuses on providing customers with access to 75,000 toys through an online catalogue. (Nashelsky)

August 1999: Toysmart turns down a 25 million offer from an investment firm. Accepts Disney offer of 20 million in cash and 25 million in advertising.

September 1999: Toysmart posts privacy policy which promises not to release information collected on customers to third parties. At about this time, Toysmart receives permission from TRUSTe to display its seal certifying that Toysmart has adopted TRUSTe procedures for protecting privacy and maintaining information security.

Christmas 1999: After disappointing Christmas toy sales, Disney withdraws its support from Toysmart.

April 2000: COPPA (Children's Online Privacy Protection Act) goes into effect. Prohibits soliciting information from children under 13 without parental consent.

June 2000 (approximately): Toysmart erases 1500 to 2000 customer profiles from its data base to comply with COPPA (information collected after the law went into effect).

May 22, 2000: Toysmart announces that it is closing its operations and selling its assets. Its initial intention is to reorganize and start over.

June 9, 2000: Toysmart creditors file an involuntary bankruptcy petition rejecting Toysmart's proposal to reorganize. They petition the U.S. Trustee to form a Creditors Committee to oversee the liquidation of Toysmart assets.

June 23, 2000: Toysmart consents to the involuntary bankruptcy petition. Files Chapter 11 bankruptcy. It rejects reorganization and works with lawyers and the Recovery Group to liquidate its assets.

June 2000: The Recovery Group analyzes Toysmart assets and identifies its customer information data base as one of its most valuable assets (a "crown jewel").

June 9, 2000: Disney subsidiary, acting as Toysmart creditor, places ads in the Wall Street Journal and Boston Globe offering the Toysmart customer data base for sale.

After June 9, 2000: TRUSTe discovers the Toysmart ad. Informs the FTC (Federal Trade Commission) that selling the customer data base to third parties violates TRUSTe guidelines and violates Toysmart's promises to customers. (13,2)

July 10, 2000: FTC files complaint against Toysmart "seeking injunctive and declaratory relief to prevent the sale of confidential, personal customer information." District attorneys of 41 states also participate in the complaint against Toysmart.

July 27, 2000: Hearing by U.S. Bankruptcy Court on the Toysmart case. Includes Toysmart proposal to sell the customer data base.

Late July 2000: FTC and Toysmart reach settlement. Toysmart can only sell customer information to a third party who shares Toysmart values and agrees to carry out the same privacy policy as Toysmart.

Late July 2000: Federal bankruptcy court rejects the FTC and Toysmart settlement. Suggests waiting to see if a buyer comes forth.

January 10, 2001: Walt Disney Internet subsidiary (Buena Vista Toy Company?) pays Toysmart $50,000 for its data base. Toysmart then destroys the data base and provides a confirming affidavit. (18,2)

Table 1.16: Chronology of Toysmart Case

1.8.1.3 Supporting Documents and Tables

Toysmart Creditors

Creditor | Description | Debt | Impact
Zero Stage Capital | Venture Capital Firm | 4 million |
Citibank | | 4 million |
Arnold Communications | | 2.5 million |
Children's Television Workshop | | 1.3 million |
Data Connections | Set up high speed cable and fiber optics for Toysmart | 85,000 | Data Connections took out loan to keep solvent
Integrated Handling Concepts | Set up packaging and handling system for Toysmart | 40,000 | Requires dot-coms to pay up front after Toysmart experience
Blackstone | Software business | 45,000 | "It puts us in jeopardy as well"
PAN Communications | "Public relations agency specializing in e-business" | 171,390 | Turns down deals with dot-com companies and requires up-front payments

Table 1.17: Source: Lorek

1.8.1.4 Intermediate Moral Concept: Informed Consent Concept and Definition

• Informed Consent: The risk bearer consents to taking on the risk on the basis of a complete understanding of its nature and breadth.

• Belmont Report: "subjects, to the degree that they are capable, be given the opportunity to choose what shall or shall not happen to them."

• "This opportunity is provided when adequate standards for informed consent are satisfied."

• Quotes taken from the Belmont Report

Arguments for Free and Informed Consent as a Moral Right

• Free and informed consent is essential for the exercise of moral autonomy. Absence implies force, fraud, or manipulation, all of which block the exercise of moral autonomy.

• The standard threat occurs when crucial risk information is not communicated to the risk taker. This could be because the risk taker cannot appreciate the risk, because the mode of communication is inadequate, or because the information has been covered up. Given this standard threat, free and informed consent is vulnerable; it must be protected.

• Informed consent must be shaped around its feasibility, that is, the ability of the duty holder to recognize and respect this right in others. If private individuals exercise their right as a veto, then they can block socially beneficial projects. There are also serious problems concerning children, mentally challenged adults, and future generations. Finally, it may not be possible or feasible to know all risks in advance.

Conditions for Recognizing and Respecting Right

• From Belmont Report

• Information: research procedure, their purposes, risks and anticipated benefits, alternative procedures (where therapy is involved), and a statement offering the subject the opportunity to ask questions and to withdraw at any time from the research.

• Comprehension: manner and context in which information is conveyed is as important as the information itself.

• Voluntariness: an agreement to participate in research constitutes a valid consent only if voluntarily given. This element of informed consent requires conditions free of coercion and undue influence.

Other Legal and Moral Frameworks

• Institutional Research Boards or IRBs now require documentation of informed consent on research projects carried out under the university's auspices. This is in response to requirements by granting agencies such as the National Institutes of Health and the National Science Foundation.

• Consenting to the transfer of PII (personal identifying information) online: opt-in and opt-out.

• Opt-in: Information is transferred only upon obtaining express consent. Default is not transferring information.

• Opt-out: Information transfer is halted only when the person to whom the information applies does something positive, i.e., refuses to consent to the transfer. Default is on transferring the information.

• Liability Rules and Property Rules: These also have to do with consent. Sagoff makes this distinction with reference to activities that have an impact on the environment. An injunction referring to liability rules stops the activity to protect the individual who proves impact. Property rules require only that the producer of the environmental impact compensate the one who suffers the impact.
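The opt-in and opt-out regimes described above differ only in what happens when the customer stays silent. The sketch below makes that default explicit; the names (Customer, may_transfer) are hypothetical illustrations, not drawn from the module or any real system.

```python
from dataclasses import dataclass

@dataclass
class Customer:
    name: str
    responded: bool = False   # did the customer express any choice?
    consented: bool = False   # what that choice was, if expressed

def may_transfer(customer: Customer, policy: str) -> bool:
    """Under opt-in, silence blocks transfer; under opt-out, silence permits it."""
    if customer.responded:
        return customer.consented     # an express choice always wins
    return policy == "opt-out"        # the regimes differ only for the silent customer

silent = Customer("A")
assert may_transfer(silent, "opt-in") is False   # default: no transfer
assert may_transfer(silent, "opt-out") is True   # default: transfer proceeds
```

The design point is that opt-in places the burden of obtaining consent on the data holder, while opt-out places the burden of objecting on the customer.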

Cases Employing Informed Consent

• Therac-25: Patients receiving radiation therapy should be made aware of the risks involved with treatment by the machine. Free and informed consent is involved when shutting down the machines to investigate accident reports or continuing to operate the machines while investigating accident reports. In both cases, it is necessary, under this right, to let patients know what is going on and their risks.

• Toysmart Case: Toysmart creditors are about to violate Toysmart's promise not to transfer customer information profiles to third parties. This transfer can occur, morally, but only with the express consent of the customers who have provided the information. The devil is in the details. Do opt-in or opt-out procedures best recognize and respect free and informed consent in this case?

• Hughes Case: Hughes customers want their chips right away and are pressuring Saia and crowd to deliver them. Would they consent to renegotiating the conditions under which environmental tests can be skipped?

1.8.2 Privacy and Property Summaries

Triangle of Privacy

Figure 1.6: Seeing privacy in its STS Context.


Intellectual Property

Figure 1.7: Summary of issues on Intellectual Property

Bibliographical Note

The triangle of privacy is widely disseminated in the literature of business ethics. The author first became aware of it from George G. Brenkert (1981), "Privacy, Polygraphs and Work," Business and Professional Ethics 1, Fall 1981: 19-34. Information on intellectual property comes from Lawrence Lessig (2006), Code: Version 2.0, Basic Books: Chapter 10.

1.8.3 What you need to know . . .

1.8.3.1 What you need to know about socio-technical systems

1. STS have seven broad components: hardware, software, physical surroundings, people/groups/roles, procedures, laws, and data/data structures.

2. Socio-technical systems embody values

• These include moral values like safety, privacy, property, free speech, equity and access, and security. Non-moral values can also be realized in and through Socio Technical Systems, such as efficiency, cost-effectiveness, control, sustainability, reliability, and stability.

• Moral values present in Socio Technical Systems can conflict with other embedded moral values; for example, privacy often conflicts with free speech. Non-moral values can conflict with moral values; developing a safe system requires time and money. And non-moral values can conflict; reliability undermines efficiency and cost effectiveness. This leads to three problems that come from different value conflicts within Socio Technical Systems and between these systems and the technologies that are being integrated into them.

• Mismatches often arise between the values embedded in technologies and the Socio Technical Systems into which they are being integrated. As UNIX was integrated into the University of California Academic Computing STS (see the Machado case at Computing Cases), the values of openness and transparency designed into UNIX clashed with the needs of students in the Academic Computing STS at UCI for privacy.

• Technologies being integrated into Socio Technical Systems can magnify, exaggerate, or exacerbate existing value mismatches in the STS. The use of P2P software combined with the ease of digital copying has magnified existing conflicts concerning music and picture copyrights.

• Integrating technologies into STSs produces both immediate and remote consequences and impacts.

3. Socio-technical systems change

• These changes are brought about, in part, by the value mismatches described above. At other times, they result from competing needs and interests brought forth by different stakeholders. For example, bicycle designs, the configuration of typewriter keys, and the design and uses of cellular phones have changed as different users have adapted these technologies to their special requirements.

• These changes also exhibit what sociologists call a trajectory, that is, a path of development. Trajectories themselves are subject to normative analysis. For example, some STSs and the technologies integrated into them display a line of development where the STS and the integrated technology are changed and redesigned to support certain social interests. The informating capacities of computing systems, for example, provide information which can be used to improve manufacturing processes or to monitor workers for enhancing management power. (See Shoshanna Zuboff, The Age of the Smart Machine.)

• Trajectories, thus, outline the development of STSs and technologies as these are influenced by internal and external social forces.

In this section, you will learn about this module's exercises. The required links above provide information on the frameworks used in each section. For example, the Socio-Technical System module provides background information on socio-technical analysis. The "Three Frameworks" module provides a further description of the ethics tests, their pitfalls, and the feasibility test. These exercises will provide step by step instructions on how to work through the decision points presented above.

For more information see Hu and Jawer below.

Decision Point One:

You are David Lord, a former employee of Holt Educational Outlet, a manufacturer of educational toys located in Waltham, Mass. Recently, you have joined with Stan Fung of Zero Stage Capital, a venture capital firm, to buy out Holt Educational Outlet. After changing its name to Toysmart, you and Fung plan to transform this brick and mortar manufacturer of educational toys into an online firm that will link customers to a vast catalogue of educational, high quality toys. Designing a website to draw in toy customers, linking to information on available toys, setting up a toy distribution and shipping system, and implementing features that allow for safe and secure online toy purchases will require considerable financing. But, riding the crest of the dot-com boom, you have two promising options. First, a venture capital firm has offered you $20,000,000 for website development, publicity, and other services. Second, Disney has offered the same amount for financing, but has added to it an additional $25,000,000 in advertising support. Disney has a formidable reputation in this market, a reputation which you can use to trampoline Toysmart into prominence in the growing market in educational toys. However, Disney also has a reputation of micro-managing its partners. Develop a plan for financing your new dot-com.

Things to consider in your decision-making:
