Calculation Sheet
ID Job Description Immediate Predecessors Planned Duration (Weeks) Staff (Number) Rate/Person/Week Task Cost (BAC)
A 1.0 ASSEMBLE ENGINE MOUNT Start 2 4 $1,440 $11,520
B 2.0 FIN PREPARATION Start 1 3 $1,440 $4,320
C 3.0 MARK FIN AND LAUNCH LUG LINES Start 1 3 $1,440 $4,320
D 4.0 INSERTING ENGINE MOUNT A 2 3 $1,440 $8,640
E 5.0 ATTACH FINS D 1 3 $1,440 $4,320
F 6.0 ATTACH SHOCK CORD Start 2 3 $1,440 $8,640
G 7.0 ASSEMBLE NOSE CONE Start 1 2 $1,440 $2,880
H 8.0 ATTACH PARACHUTE/SHOCK CORD G 1 1 $1,440 $1,440
I 9.0 ATTACH LAUNCH LUG E 1 1 $1,440 $1,440
J 10.0 PAINTING THE ROCKET I 1 4 $1,440 $5,760
K 11.0 APPLICATION OF DECALS J 1 1 $1,440 $1,440
L 12.0 APPLYING CLEAR COAT K 1 1 $1,440 $1,440
M 13.0 DISPLAY NOZZLE ASSEMBLY K 1 3 $1,440 $4,320
N 14.0 ROCKET PREFLIGHT L 1 2 $1,440 $2,880
O 15.0 PREPARE FOR TEST LAUNCH N 1 1 $1,440 $1,440
24 weeks Level of Effort 11 weeks duration Blended Rate= $1,440
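Each Task Cost (BAC) entry above is the planned duration times the staffing level times the blended rate of $1,440 per person-week. A minimal sketch of that arithmetic in Python (the helper name is illustrative, not part of the workbook):

# Task cost (BAC) = planned duration (weeks) x staff (number) x rate per person-week.
BLENDED_RATE = 1440  # $/person/week, from the calculation sheet above

def task_cost(duration_weeks, staff, rate=BLENDED_RATE):
    return duration_weeks * staff * rate

# Figures taken from the table above:
print(task_cost(2, 4))  # Task A, 1.0 ASSEMBLE ENGINE MOUNT -> 11520
print(task_cost(1, 3))  # Task B, 2.0 FIN PREPARATION       -> 4320
print(task_cost(1, 4))  # Task J, 10.0 PAINTING THE ROCKET  -> 5760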
Performance Sheet
BCWS (PV) TASK Week 1 Week 2 Week 3 Week 4 Week 5 Week 6 Week 7 Week 8 Week 9 Week 10 Week 11 Week 12 Week 13 Week 14 Week 15 Week 16 BAC Totals
A 1.0 ASSEMBLE ENGINE MOUNT $0
B 2.0 FIN PREPARATION $0
C 3.0 MARK FIN AND LAUNCH LUG LINES $0
D 4.0 INSERTING ENGINE MOUNT $0
E 5.0 ATTACH FINS $0
F 6.0 ATTACH SHOCK CORD $0
G 7.0 ASSEMBLE NOSE CONE $0
H 8.0 ATTACH PARACHUTE/SHOCK CORD $0
I 9.0 ATTACH LAUNCH LUG $0
J 10.0 PAINTING THE ROCKET $0
K 11.0 APPLICATION OF DECALS $0
L 12.0 APPLYING CLEAR COAT $0
M 13.0 DISPLAY NOZZLE ASSEMBLY $0
N 14.0 ROCKET PREFLIGHT $0
O 15.0 PREPARE FOR TEST LAUNCH $0
Equipment $0
Material $0
$0
Weekly Total $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0
Cumulative Cost (PV) $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0
ID ACWP (AC) TASK Week 1 Week 2 Week 3 Week 4 Week 5 Week 6 Week 7 Week 8 Week 9 Week 10 Week 11 Week 12 Week 13 Week 14 Week 15 Week 16 AC Totals
A 1.0 ASSEMBLE ENGINE MOUNT $0
B 2.0 FIN PREPARATION $0
C 3.0 MARK FIN AND LAUNCH LUG LINES $0
D 4.0 INSERTING ENGINE MOUNT $0
E 5.0 ATTACH FINS $0
F 6.0 ATTACH SHOCK CORD $0
G 7.0 ASSEMBLE NOSE CONE $0
H 8.0 ATTACH PARACHUTE/SHOCK CORD $0
I 9.0 ATTACH LAUNCH LUG $0
J 10.0 PAINTING THE ROCKET $0
K 11.0 APPLICATION OF DECALS $0
L 12.0 APPLYING CLEAR COAT $0
M 13.0 DISPLAY NOZZLE ASSEMBLY $0
N 14.0 ROCKET PREFLIGHT $0
O 15.0 PREPARE FOR TEST LAUNCH $0
Equipment $0
Material $0
$0
Weekly Total $0 $0 $0
Cumulative Cost (AC) $0 $0 $0
ID BCWP (EV) TASK Week 1 Week 2 Week 3 Week 4 Week 5 Week 6 Week 7 Week 8 Week 9 Week 10 Week 11 Week 12 Week 13 Week 14 Week 15 Week 16 EV Totals
A 1.0 ASSEMBLE ENGINE MOUNT $0
B 2.0 FIN PREPARATION $0
C 3.0 MARK FIN AND LAUNCH LUG LINES $0
D 4.0 INSERTING ENGINE MOUNT $0
E 5.0 ATTACH FINS $0
F 6.0 ATTACH SHOCK CORD $0
G 7.0 ASSEMBLE NOSE CONE $0
H 8.0 ATTACH PARACHUTE/SHOCK CORD $0
I 9.0 ATTACH LAUNCH LUG $0
J 10.0 PAINTING THE ROCKET $0
K 11.0 APPLICATION OF DECALS $0
L 12.0 APPLYING CLEAR COAT $0
M 13.0 DISPLAY NOZZLE ASSEMBLY $0
N 14.0 ROCKET PREFLIGHT $0
O 15.0 PREPARE FOR TEST LAUNCH $0
Equipment $0
Material $0
$0
Weekly Total $0 $0 $0
Cumulative Cost (EV) $0 $0 $0
Week 1 Week 2 Week 3 Week 4 Week 5 Week 6 Week 7 Week 8 Week 9 Week 10 Week 11 Week 12 Week 13 Week 14 Week 15 Week 16
Cumulative Cost (PV) $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0
Cumulative Cost (AC) $0 $0 $0
Cumulative Cost (EV) $0 $0 $0
BAC= 0
BCWS (PV)= 0
ACWP (AC)= 0
BCWP (EV)= 0
SV= BCWP-BCWS    CV= BCWP-ACWP
SV= $0    CV= $0
Project is $xxK behind schedule.    Project is $xxK over budget.
SPI= BCWP/BCWS    CPI= BCWP/ACWP
SPI= 0    CPI= 0
For every dollar spent on scheduled effort we realize $0.xx worth of progress.    For every dollar spent, we realize $0.xx of planned result.
EAC= BAC/CPI    PM Eval= +10% to -5% EAC
EAC= $0.00    PM Eval= 0% (EAC/BAC)
The PM’s EAC is xx% over/under BAC and is/is not in trouble.
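As a sketch of how these performance measures are derived from the three cost series, here is a small Python example; the numbers passed in are placeholders rather than values from this workbook, and the function name is illustrative:

# Earned-value measures as defined on this sheet:
#   SV  = BCWP - BCWS   (schedule variance)
#   CV  = BCWP - ACWP   (cost variance)
#   SPI = BCWP / BCWS   (schedule performance index)
#   CPI = BCWP / ACWP   (cost performance index)
#   EAC = BAC / CPI     (estimate at completion)
def evm_measures(bac, bcws, acwp, bcwp):
    sv = bcwp - bcws
    cv = bcwp - acwp
    spi = bcwp / bcws if bcws else 0.0
    cpi = bcwp / acwp if acwp else 0.0
    eac = bac / cpi if cpi else 0.0
    return {"SV": sv, "CV": cv, "SPI": spi, "CPI": cpi, "EAC": eac}

# Placeholder figures only -- substitute the totals from the sheet above.
print(evm_measures(bac=100_000, bcws=20_000, acwp=22_000, bcwp=18_000))
# SPI < 1 reads as behind schedule, CPI < 1 as over budget,
# and EAC = BAC / CPI projects that spending efficiency to completion.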
May 22-28
May 29-Jun 4
Jun 5-Jun 11
Jun 12-Jun 18
Jun 19-Jun 25
Jun 26-Jul 2
Jul 3-Jul 9
Jul 10-Jul 16
Jul 17-Jul 23
Jul 24-Jul 30
Jul 31-Aug 6
Aug 7-Aug 13
Performance Sheet
Cumulative Cost (PV)
Cumulative Cost (AC)
Cumulative Cost (EV)
BAC=$xxx,xxx
EAC=$xxx,xxx
Planned Value (PV) Section
Earned Value (EV) Section
Actual Cost (AC) Section
Cumulative or S-Curve Graph
Performance Measures Section
Recovered_Sheet

Summary
Application Requested Previous Delta Issues/Risks/Decisions
AP $21,026 $0.00 -$21,026 100% AP decoms dependent on purchase of final eight SAVEs @ ~$1M (UFR).
BDRR $108,558 $0.00 -$108,558 Review of this application with customers resulted in a conversion requirement @ $108.6K + Capital expense (UFR).
DSE $126,297 $100,639.25 -$25,658 System under delay for stress testing and code defect remediation. Delay may cause slip past moratorium.
HOBIC N/A $0.00 $0 System transferred. No longer a NIS Y2K responsibility.
MM $17,675 $0.00 -$17,675 System undergoing migration from M/F host. Y2K Compliance integral part of migration coding.
NAP $17,675 $0.00 -$17,675 Decommission dependent on NEMAS acceptance of NAP as front-end processor or utilization of TCP/IP.
NIC $122,206 $105,571.50 -$16,634 On Schedule, Under Requested Budget. Certain actuals not posted to date.
SAVE $604,179 $112,773.75 -$491,405 In order to meet Y2K Compliance by EOY, SAVE dependent on purchase of eight additional SAVEs @ $1M (UFR).
TOTAL= $1,017,615 $318,984.50
98 Allocation $382,000 $382,000.00
Delta -$635,615 $63,015.50
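The Delta column above appears to be the Previous figure minus the Requested figure (for example, DSE: $100,639.25 - $126,297 = -$25,658), and the bottom Delta row is likewise the 98 Allocation of $382,000 less the Requested and Previous totals, so negative deltas flag the unfunded portion of each request.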
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec
BCWS $4,655 $11,780 $23,332 $26,173 $154,454 $198,992 $281,197 $331,507 $387,998 $462,708 $996,077 $1,017,615
ACWP $4,655 $11,780 $23,920 $27,330 $158,826 $209,292 $240,259
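The S-curves plotted on the chart below are running totals of the monthly figures. A minimal sketch of that cumulation in Python, using the monthly planned-cost totals from the "Total 1998" row of the Roll-Up sheet further down (variable names are illustrative):

from itertools import accumulate

# Monthly planned cost (the "Total 1998" row on the BCWS Roll-Up below).
monthly_bcws = [4655, 7125, 11552, 2841, 128282, 44538,
                82205, 50310, 56491, 74710, 533368, 21538]

# Running total gives the cumulative BCWS series charted as the S-curve.
cumulative_bcws = list(accumulate(monthly_bcws))
print(cumulative_bcws[-1])  # 1017615 -- matches the December BCWS figure above
# (Intermediate sums can differ from the sheet by a dollar because the
#  monthly inputs here are the rounded values shown on the Roll-Up.)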
[Chart: Cumulative Cost for NIS Y2K (7/14/98) - cumulative BCWS vs. ACWP by month, Jan-Dec, from the Summary sheet.]
Roll-Up
BCWS 6/6/98
1998 Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec Total
Internal (MCI Staff) Hours 122.5 187.5 304 74.75 1148 881 1206 622 790 732 332 205 6,605
Rate 38 38 38 38 38 38 38 38 38 38 38 38
$ 4,655 7,125.00 11,552.00 2,840.50 43,624.00 33,478.00 45,828.00 23,636.00 30,020.00 27,816.00 12,616.00 7,790.00 250,981
External (Contractors)
Grade Hours 0 0 0 0 120 176 212 253 294 248 183 174 1,660
Rate 56 56 56 56 56 56 56 56 56 56 56 56
$ – 0 – 0 – 0 – 0 6,722.86 9,860.19 11,877.05 14,174.02 16,471.00 13,893.90 10,252.36 9,748.14 93,000
Grade Hours 0 0 0 0 0 0 70 125 100 290 65 0 650
Rate 100 100 100 100 100 100 100 100 100 100 100 100
$ – 0 – 0 – 0 – 0 – 0 – 0 7,000.00 12,500.00 10,000.00 29,000.00 6,500.00 – 0 65,000
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Total Contractor Hours – 0 – 0 – 0 – 0 120 176 282 378 394 538 248 174 2,310
$ – 0 – 0 – 0 – 0 6,723 9,860 18,877 26,674 26,471 42,894 16,752 9,748 158,000
Capital (Detail Items)
Hardware $ 0 0 0 0 0 1,200 15,000 0 0 1,000 501,000 1,000 519,200
Software $ 0 0 0 0 0 0 0 0 0 0 0 0 – 0
Other Costs (Detail Items)
Team/Factory Costs $ 0 0 0 0 77,934.84 0 0 0 0 0 0 0 77,935
Travel $ 0 0 0 0 0 0 2,500 0 0 3,000 3,000 3,000 11,500
$ 0 0 0 0 0 0 0 0 0 0 0 0 – 0
Total 1998 $ 4,655 7,125 11,552 2,841 128,282 44,538 82,205 50,310 56,491 74,710 533,368 21,538 1,017,615
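Each monthly column on this Roll-Up is labor hours times the loaded rate for each grade, plus capital and other costs. A sketch of that composition for a single month, recomposed from the July figures in the rows above (the $56-grade dollars are taken from the $ row rather than recomputed, since the listed rate is rounded):

# July 1998 planned cost, recomposed from the BCWS Roll-Up rows above.
internal_staff  = 1206 * 38     # MCI staff hours x $38/hr -> 45,828
contractor_g100 = 70 * 100      # $100-grade contractor hours -> 7,000
contractor_g56  = 11_877        # $56-grade dollars taken from the $ row (listed rate is rounded)
hardware        = 15_000
travel          = 2_500
july_total = internal_staff + contractor_g100 + contractor_g56 + hardware + travel
print(july_total)  # 82,205 -- matches the July column of the "Total 1998" row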
ACWP 6/2/98
1998 Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec Total
Internal (MCI Staff) Hours 122.5 187.5 174 89.75 784.5 963.5 638 0 0 0 0 0 2,960
Rate 38 38 38 38 38 38 38 38 38 38 38 38
$ 4,655 7,125.00 6,612.00 3,410.50 29,811.00 36,613.00 24,244.00 – 0 – 0 – 0 – 0 – 0 112,471
External (Contractors)
Grade Hours 0 0 0 0 160 160 120 0 0 0 0 0 440
Rate 56 56 56 56 56 56 56 56 56 56 56 56
$ – 0 – 0 – 0 – 0 8,963.81 8,963.81 6,722.86 – 0 – 0 – 0 – 0 – 0 24,650
Grade Hours 0 0 130 0 82.5 115 0 0 0 0 0 0 328
Rate 43 43 43 43 43 43 43 43 43 43 43 43
$ – 0 – 0 5,527.60 – 0 3,507.90 4,889.80 – 0 – 0 – 0 – 0 – 0 – 0 13,925
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Total Contractor Hours – 0 – 0 130 – 0 243 275 120 – 0 – 0 – 0 – 0 – 0 768
$ – 0 – 0 5,528 – 0 12,472 13,854 6,723 – 0 – 0 – 0 – 0 – 0 38,576
Capital (Detail Items)
Hardware $ 0 0 0 0 0 0 0 0 0 0 0 0 – 0
Software $ 0 0 0 0 0 0 0 0 0 0 0 0 – 0
Other Costs (Detail Items)
Team/Factory Costs $ 0 0 0 0 89,212.83 0 0 0 0 0 0 0 89,213
Travel $ 0 0 0 0 0 0 0 0 0 0 0 0 – 0
$ – 0
Total 1998 $ 4,655 7,125 12,140 3,411 131,496 50,467 30,967 – 0 – 0 – 0 – 0 – 0 240,259
1998 Y2K Budget Revision

AP
Budget Description: Update App Mgr Name Larry Lafreniere
Application Name: Adjunct Processor (AP) Project Coord Name Jeff Tyler
Phase: Decom Assessment VP Name Patrice Carroll
Work Request # Director Name Bob Laird
Work Request Name Authorized Dep’ts 2895
Project # Date 6/7/98
Company # SHL Project Code #
1998
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec Total
Internal (MCI Staff) Hours 32 32 32 96
Rate 38 38 38 38 38 38 38 38 38 38 38 38
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 1,216.00 1,216.00 1,216.00 3,648
External (Contractors)
Grade Hours 16 16 16 16 16 16 96
Rate 56 56 56 56 56 56 56 56 56 56 56 56
$ – 0 – 0 – 0 – 0 – 0 – 0 896.38 896.38 896.38 896.38 896.38 896.38 5,378
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Total Contractor Hours – 0 – 0 – 0 – 0 – 0 – 0 16 16 16 16 16 16 96
$ – 0 – 0 – 0 – 0 – 0 – 0 896 896 896 896 896 896 5,378
Capital (Detail Items)
Hardware $ 1000 1000 1000 3,000
Software $ – 0
Other Costs (Detail Items)
Team/Factory Costs $ – 0
Travel $ 3000 3000 3000 9,000
$ – 0
Total 1998 $ – 0 – 0 – 0 – 0 – 0 – 0 896 896 896 6,112 6,112 6,112 21,026
ACWP 1998
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec Total
Internal (MCI Staff) Hours – 0
Rate 38 38 38 38 38 38 38 38 38 38 38 38
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
External (Contractors)
Grade Hours – 0
Rate 56 56 56 56 56 56 56 56 56 56 56 56
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Total Contractor Hours – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Capital (Detail Items)
Hardware $ – 0
Software $ – 0
Other Costs (Detail Items)
Team/Factory Costs $ – 0
Travel $ – 0
$ – 0
Total 1998 $ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec
BCWS – 0 – 0 – 0 – 0 – 0 – 0 896 1,793 2,689 8,802 14,914 21,026
BCWP 0 0 0 0 0 0
ACWP 0 – 0 – 0 – 0 – 0 – 0
[Chart: Cumulative Costs for AP Y2K Decommission - BCWS, BCWP, and ACWP by month.]
BDRR
Budget Description: Update App Mgr Name Larry LaFreniere
Application Name: Billing Detail Record Reporting (BDRR)
Project Coord Name Jeff Tyler
| Phase: Conversion Assessment |
VP Name Patrice Carroll
Work Request # Director Name Bob Laird
Work Request Name Authorized Dep’ts 2895
Project # Date 6/6/98
Company #
Project Code #
1998
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec Total
Internal (MCI Staff) Hours 215 255 340 315 70 10 1,205
Rate 38 38 38 38 38 38 38 38 38 38 38 38
$ – 0 – 0 – 0 – 0 – 0 – 0 8,170.00 9,690.00 12,920.00 11,970.00 2,660.00 380.00 45,790
External (Contractors)
Grade Hours 50 85 80 85 20 15 335
Rate 56 56 56 56 56 56 56 56 56 56 56 56
$ – 0 – 0 – 0 – 0 – 0 – 0 2,801.19 4,762.02 4,481.90 4,762.02 1,120.48 840.36 18,768
Grade Hours 70 125 100 130 15 0 440
Rate 100 100 100 100 100 100 100 100 100 100 100 100
$ – 0 – 0 – 0 – 0 – 0 – 0 7,000.00 12,500.00 10,000.00 13,000.00 1,500.00 – 0 44,000
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Total Contractor Hours – 0 – 0 – 0 – 0 – 0 – 0 120 210 180 215 35 15 775
$ – 0 – 0 – 0 – 0 – 0 – 0 9,801 17,262 14,482 17,762 2,620 840 62,768
Capital (Detail Items)
Hardware $ – 0
Software $ – 0
Other Costs (Detail Items)
Team/Factory Costs $ – 0
Travel $ – 0
$ – 0
Total 1998 $ – 0 – 0 – 0 – 0 – 0 – 0 17,971 26,952 27,402 29,732 5,280 1,220 108,558
ACWP 1998
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec Total
Internal (MCI Staff) Hours – 0
Rate 38 38 38 38 38 38 38 38 38 38 38 38
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
External (Contractors)
Grade Hours – 0
Rate 56 56 56 56 56 56 56 56 56 56 56 56
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Grade Hours – 0
Rate 100 100 100 100 100 100 100 100 100 100 100
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Total Contractor Hours – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Capital (Detail Items)
Hardware $ – 0
Software $ – 0
Other Costs (Detail Items)
Team/Factory Costs $ – 0
Travel $ – 0
$ – 0
Total 1998 $ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec
BCWS – 0 – 0 – 0 – 0 – 0 – 0 17,971 44,923 72,325 102,057 107,338 108,558
BCWP 0 0 0 0 0 0
ACWP 0 – 0 – 0 – 0 – 0 – 0
Jeff Tyler:
Est. cost of shipping and handling of Stratus boxes for turn-in credit.
Jeff Tyler:
Est. cost of trips to two sites for unexpected problems.
[Chart: Cumulative Cost Of BDR Y2K Conversion - BCWS, BCWP, and ACWP by month.]
DSE
Budget Description: 1998 Y2K Update App Mgr Name Larry Lafreniere
Application Name: ISP\NIS\DSE
Project Coord Name Jeff Tyler
Phase: Conversion Analysis VP Name Patrice Carroll
Work Request # 120743 Director Name Open
Work Request Name Authorized Dep’ts 2895
Project # Date 5/7/98
Company # SHL Project Code #
1998
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec Total
Internal (MCI Staff) Hours 29 37.75 220.5 23 413 66 76 187 270 205 50 23 1,600
Rate 38 38 38 38 38 38 38 38 38 38 38 38
$ 1,102 1,434.50 8,379.00 874.00 15,694.00 2,508.00 2,888.00 7,106.00 10,260.00 7,790.00 1,900.00 874.00 60,810
External (Contractors)
Grade Hours 40 48 38 40 70 35 35 11 317
Rate 56 56 56 56 56 56 56 56 56 56 56 56
$ – 0 – 0 – 0 – 0 2,240.95 2,689.14 2,128.90 2,240.95 3,921.67 1,960.83 1,960.83 616.26 17,760
Grade Hours 160 50 210
Rate 100 100 100 100 100 100 100 100 100 100 100 100
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 16,000.00 5,000.00 – 0 21,000
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Total Contractor Hours – 0 – 0 – 0 – 0 40 48 38 40 70 195 85 11 527
$ – 0 – 0 – 0 – 0 2,241 2,689 2,129 2,241 3,922 17,961 6,961 616 38,760
Capital (Detail Items)
Hardware $ – 0
Software $ – 0
Other Costs (Detail Items)
Team/Factory Costs $ 26,727.88 26,728
Travel $ – 0
$ – 0
Total 1998 $ 1,102 1,435 8,379 874 44,663 5,197 5,017 9,347 14,182 25,751 8,861 1,490 126,297
Ass/Req Analysis Stress Test Delay (DP&D) Coding Test FVO Roll-Out
93.00% 79.00% 74.00% 58.00% 9.00% 8.00%
1,024.86 2,158.12 8,358.58 8,865.50 12,885.15 13,300.92
1998
ACWP Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec Total
Internal (MCI Staff) Hours 29 37.75 90.5 23 15 10 60.5 266
Rate 38 38 38 38 38 38 38 38 38 38 38 38
$ 1,102 1,434.50 3,439.00 874.00 570.00 380.00 2,299.00 – 0 – 0 – 0 – 0 – 0 10,099
External (Contractors)
Grade Hours 40 43 30 113
Rate 56 56 56 56 56 56 56 56 56 56 56 56
$ – 0 – 0 – 0 – 0 2,240.95 2,409.02 1,680.71 – 0 – 0 – 0 – 0 – 0 6,331
Grade Hours 130 130
Rate 43 43 43 43 43 43 43 43 43 43 43 43
$ – 0 – 0 5,527.60 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 5,528
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Total Contractor Hours – 0 – 0 130 – 0 40 43 30 – 0 – 0 – 0 – 0 – 0 243
$ – 0 – 0 5,528 – 0 2,241 2,409 1,681 – 0 – 0 – 0 – 0 – 0 11,858
Capital (Detail Items)
Hardware $ – 0
Software $ – 0
Other Costs (Detail Items)
Team/Factory Costs $ 27,763.06 27,763
Travel $ – 0
$ – 0
Total 1998 $ 1,102 1,435 8,967 874 30,574 2,789 3,980 – 0 – 0 – 0 – 0 – 0 49,720
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec
BCWS 1,102 2,537 10,916 11,790 56,452 61,649 66,666 76,013 90,195 115,946 124,807 126,297
BCWP 1,025 3,183 11,542 20,407 33,292 35,052
ACWP 1,102 2,537 11,503 12,377 42,951 45,740
[Chart: Cumulative Costs for DSE Y2K (7/14/98) - BCWS, BCWP, and ACWP by month.]
MM
| Budget Description: 1998 Y2K Update |
App Mgr Name Larry LaFreniere
| Application Name: Match Merge (MM) |
Project Coord Name Jeff Tyler
Phase: Decom Assessment VP Name Patrice Carroll
Work Request # Director Name Bob Laird
Work Request Name Authorized Dep’ts 2895
Project # Date 6/6/98
Company #
SHL Project Code #
1998
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec Total
Internal (MCI Staff) Hours 50 50 50 50 50 50 300
Rate 38 38 38 38 38 38 38 38 38 38 38 38
$ – 0 – 0 – 0 – 0 – 0 – 0 1,900.00 1,900.00 1,900.00 1,900.00 1,900.00 1,900.00 11,400
External (Contractors)
Grade Hours 16 16 16 16 16 16 16 112
Rate 56 56 56 56 56 56 56 56 56 56 56 56
$ – 0 – 0 – 0 – 0 – 0 896.38 896.38 896.38 896.38 896.38 896.38 896.38 6,275
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Total Contractor Hours – 0 – 0 – 0 – 0 – 0 16 16 16 16 16 16 16 112
$ – 0 – 0 – 0 – 0 – 0 896 896 896 896 896 896 896 6,275
Capital (Detail Items)
Hardware $ – 0
Software $ – 0
Other Costs (Detail Items)
Team/Factory Costs $ – 0
Travel $ – 0
$ – 0
Total 1998 $ – 0 – 0 – 0 – 0 – 0 896 2,796 2,796 2,796 2,796 2,796 2,796 17,675
ACWP 1998
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec Total
Internal (MCI Staff) Hours – 0
Rate 38 38 38 38 38 38 38 38 38 38 38 38
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
External (Contractors)
Grade Hours – 0
Rate 56 56 56 56 56 56 56 56 56 56 56 56
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Total Contractor Hours – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Capital (Detail Items)
Hardware $ – 0
Software $ – 0
Other Costs (Detail Items)
Team/Factory Costs $ – 0
Travel $ – 0
$ – 0
Total 1998 $ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec
BCWS – 0 – 0 – 0 – 0 – 0 896 3,693 6,489 9,286 12,082 14,878 17,675
BCWP 0 0 0 0 0 0
ACWP 0 – 0 – 0 – 0 – 0 – 0
Jeff Tyler:
Lack of tester will require product testing in ICCA with a contracted tester.
[Chart: Cumulative Cost Of Match Merge Y2K Conversion - BCWS, BCWP, and ACWP by month.]
NAP
Budget Description: 1998 Y2K Update App Mgr Name Larry LaFreniere
| Application Name: NIC Adjunct Processor (NAP) |
Project Coord Name Jeff Tyler
| Phase: Decom Assessment |
VP Name Patrice Carroll
Work Request # Director Name Bob Laird
Work Request Name Authorized Dep’ts 2895
Project # Date 6/6/98
Company #
Project Code #
1998
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec Total
Internal (MCI Staff) Hours 50 50 50 50 50 50 300
Rate 38 38 38 38 38 38 38 38 38 38 38 38
$ – 0 – 0 – 0 – 0 – 0 – 0 1,900.00 1,900.00 1,900.00 1,900.00 1,900.00 1,900.00 11,400
External (Contractors)
Grade Hours 16 16 16 16 16 16 16 112
Rate 56 56 56 56 56 56 56 56 56 56 56 56
$ – 0 – 0 – 0 – 0 – 0 896.38 896.38 896.38 896.38 896.38 896.38 896.38 6,275
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Total Contractor Hours – 0 – 0 – 0 – 0 – 0 16 16 16 16 16 16 16 112
$ – 0 – 0 – 0 – 0 – 0 896 896 896 896 896 896 896 6,275
Capital (Detail Items)
Hardware $ – 0
Software $ – 0
Other Costs (Detail Items)
Team/Factory Costs $ – 0
Travel $ – 0
$ – 0
Total 1998 $ – 0 – 0 – 0 – 0 – 0 896 2,796 2,796 2,796 2,796 2,796 2,796 17,675
ACWP 1998
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec Total
Internal (MCI Staff) Hours – 0
Rate 38 38 38 38 38 38 38 38 38 38 38 38
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
External (Contractors)
Grade Hours – 0
Rate 56 56 56 56 56 56 56 56 56 56 56 56
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Total Contractor Hours – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Capital (Detail Items)
Hardware $ – 0
Software $ – 0
Other Costs (Detail Items)
Team/Factory Costs $ – 0
Travel $ – 0
$ – 0
Total 1998 $ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec
BCWS – 0 – 0 – 0 – 0 – 0 896 3,693 6,489 9,286 12,082 14,878 17,675
BCWP 0 0 0 0 0 0
ACWP 0 – 0 – 0 – 0 – 0 – 0
[Chart: Cumulative Cost Of NAP Y2K Claimed Compliance - BCWS, BCWP, and ACWP by month.]
NIC
Budget Description: 1998 Y2K Update App Mgr Name Larry LaFreniere
| Application Name:Network Information Concentrator (NIC) |
Project Coord Name Jeff Tyler
| Phase: Conversion Analysis |
VP Name Patrice Carroll
Work Request # 120742 Director Name Bob Laird
Work Request Name Authorized Dep’ts 2895
Project # Date 6/6/98
Company #
SHL Project Code #
1998
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec Total
Internal (MCI Staff) Hours 38 41.5 44 29.5 413 413 413 40 40 40 40 0 1,552
Rate 38 38 38 38 38 38 38 38 38 38 38 38
$ 1,444 1,577.00 1,672.00 1,121.00 15,694.00 15,694.00 15,694.00 1,520.00 1,520.00 1,520.00 1,520.00 – 0 58,976
External (Contractors)
Grade Hours 40 48 38 40 48 40 40 50 344
Rate 56 56 56 56 56 56 56 56 56 56 56 56
$ – 0 – 0 – 0 – 0 2,240.95 2,689.14 2,128.90 2,240.95 2,689.14 2,240.95 2,240.95 2,801.19 19,272
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Total Contractor Hours – 0 – 0 – 0 – 0 40 48 38 40 48 40 40 50 344
$ – 0 – 0 – 0 – 0 2,241 2,689 2,129 2,241 2,689 2,241 2,241 2,801 19,272
Capital (Detail Items)
Hardware $ 15,000 15,000
Software $ – 0
Other Costs (Detail Items)
Team/Factory Costs $ 27,457.32 27,457
Travel $ 1500 1,500
$ – 0
Total 1998 $ 1,444 1,577 1,672 1,121 45,392 18,383 34,323 3,761 4,209 3,761 3,761 2,801 122,206
Ass/Req Analysis Coding Test FVO Roll-Out
90.00% 81.00% 88.00% 77.00% 56.00% 17.00%
1,299.60 2,576.97 2,770.96 3,634.13 29,053.80 32,178.94
ACWP 1998
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec Total
Internal (MCI Staff) Hours 38 41.5 44 44.5 89.25 147 64 468
Rate 38 38 38 38 38 38 38 38 38 38 38 38
$ 1,444 1,577.00 1,672.00 1,691.00 3,391.50 5,586.00 2,432.00 – 0 – 0 – 0 – 0 – 0 17,794
External (Contractors)
Grade Hours 60 57 45 162
Rate 56 56 56 56 56 56 56 56 56 56 56 56
$ – 0 – 0 – 0 – 0 3,361.43 3,193.36 2,521.07 – 0 – 0 – 0 – 0 – 0 9,076
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Total Contractor Hours – 0 – 0 – 0 – 0 60 57 45 – 0 – 0 – 0 – 0 – 0 162
$ – 0 – 0 – 0 – 0 3,361 3,193 2,521 – 0 – 0 – 0 – 0 – 0 9,076
Capital (Detail Items)
Hardware $ – 0
Software $ – 0
Other Costs (Detail Items)
Team/Factory Costs $ 28,538.09 28,538
Travel $ – 0
$ – 0
Total 1998 $ 1,444 1,577 1,672 1,691 35,291 8,779 4,953 – 0 – 0 – 0 – 0 – 0 55,407
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec
BCWS 1,444 3,021 4,693 5,814 51,206 69,589 103,912 107,673 111,882 115,643 119,404 122,206
BCWP 1,300 3,877 5,348 6,405 32,688 61,233
ACWP 1,444 3,021 4,693 6,384 41,675 50,454
[Chart: Cumulative Costs for NIC Y2K (7/14/98) - BCWS, BCWP, and ACWP by month.]
SAVE
Budget Description: 1998 Y2K Update App Mgr Name Larry LaFreniere
| Application Name:Storage And Verification Element (SAVE) |
Project Coord Name Jeff Tyler
Phase: Conversion Design & Planning VP Name Patrice Carroll
Work Request # 120741 Director Name Bob Laird
Work Request Name Authorized Dep’ts 2895
Project # Date 6/6/1998
Company # Project Code #
1998
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec Total
Internal (MCI Staff) Hours 55.5 108.25 39.5 22.25 322 402 402 40 40 40 40 40 1,552
Rate 38 38 38 38 38 38 38 38 38 38 38 38
$ 2,109 4,113.50 1,501.00 845.50 12,236.00 15,276.00 15,276.00 1,520.00 1,520.00 1,520.00 1,520.00 1,520.00 58,957
External (Contractors)
Grade Hours 40 48 38 40 48 40 40 50 344
Rate 56 56 56 56 56 56 56 56 56 56 56 56
$ – 0 – 0 – 0 – 0 2,240.95 2,689.14 2,128.90 2,240.95 2,689.14 2,240.95 2,240.95 2,801.19 19,272
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Total Contractor Hours – 0 – 0 – 0 – 0 40 48 38 40 48 40 40 50 344
$ – 0 – 0 – 0 – 0 2,241 2,689 2,129 2,241 2,689 2,241 2,241 2,801 19,272
Capital (Detail Items)
Hardware $ 1,200 500,000 501,200
Software $ – 0
Other Costs (Detail Items)
Team/Factory Costs $ 23,749.64 23,750
Travel $ 1000 1,000
$ – 0
Total 1998 $ 2,109 4,114 1,501 846 38,227 19,165 18,405 3,761 4,209 3,761 503,761 4,321 604,179
Ass/Req Analysis Design Coding Test FVO Roll-Out
100.0% 87.0% 78.0% 79.0% 71.0% 11.0%
2,109.00 5,687.75 1,170.78 667.95 27,140.88 2,108.17
ACWP 1998
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec Total
Internal (MCI Staff) Hours 55.5 108.25 39.5 22.25 680.25 806.5 513.5 2,226
Rate 38 38 38 38 38 38 38 38 38 38 38 38
$ 2,109 4,113.50 1,501.00 845.50 25,849.50 30,647.00 19,513.00 – 0 – 0 – 0 – 0 – 0 84,579
External (Contractors)
Grade Hours 60 60 45 165
Rate 56 56 56 56 56 56 56 56 56 56 56 56
$ – 0 – 0 – 0 – 0 3,361.43 3,361.43 2,521.07 – 0 – 0 – 0 – 0 – 0 9,244
Grade Hours 82.5 115 198
Rate 43 43 43 43 43 43 43 43 43 43 43 43
$ – 0 – 0 – 0 – 0 3,507.90 4,889.80 – 0 – 0 – 0 – 0 – 0 – 0 8,398
Grade Hours – 0
Rate – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
$ – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0 – 0
Total Contractor Hours – 0 – 0 – 0 – 0 143 175 45 – 0 – 0 – 0 – 0 – 0 363
$ – 0 – 0 – 0 – 0 6,869 8,251 2,521 – 0 – 0 – 0 – 0 – 0 17,642
Capital (Detail Items)
Hardware $ – 0
Software $ – 0
Other Costs (Detail Items)
Team/Factory Costs $ 32,911.68 32,912
Travel $ – 0
$ – 0
Total 1998 $ 2,109 4,114 1,501 846 65,631 38,898 22,034 – 0 – 0 – 0 – 0 – 0 135,132
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec
BCWS 2,109 6,223 7,724 8,569 46,796 65,961 84,366 88,127 92,336 96,097 599,858 604,179
BCWP 0 5,688 6,859 7,526 34,667 36,776
ACWP 2,109 6,223 7,724 8,569 74,200 113,098
Jeff Tyler: The NAP box in Perryman needs to have an OS upgrade to VOS 12.4. This is necessary for Y2K compliance, and to stay current with the kit revision level (build 08). We have been requested to fund Sheldon's trip to Perryman for this activity. The costs should not exceed $1200, and this should be billable to the Y2K funds.
The maintenance is necessary before May 29, and is requested for a Sunday afternoon due to the behavior patterns of the NAP.
Jeff Tyler:
Estimated cost of memory replacement for NIC testing box cannibalized to support Lab in Y2K testing.
[Chart: Cumulative Cost of SAVE Y2K (7/14/98) - BCWS, BCWP, and ACWP by month.]
Risk Assessment
ID Risk Event Probability Impact Response Priority Responsibility
| NIS Adjunct Processor (AP) Y2K Project |
1 Decom by 12/31/98 Med Hi 1 Edie Smith
2 Use as Tape Drive for NIC Hi Low 2 Jeff Tyler
3 Get Exemption HI Low 3 Jeff Tyler
| NIS Billing Detail Record Reporting (BDRR) Y2K Project |
1 Not decom’d HI HI (a) Get exemption (b) Xfer to DSS (c) Claim Compliance (d) Convert 1
2 Get exemption HI Low 5 Jeff Tyler
3 Transfer to DSS Low HI 3 Bob Laird
4 Claim Compliance Low-Med HI 4 Jeff Tyler
5 Convert HI Med Ramp up staffing ASAP 2 Larry Lafreniere
| NIS Data Server for EVS (DSE) Y2K Project |
1 Coding Slip Med Hi Request Exemption 2 Jeff Tyler
2 No test capability Hi Hi ICCA product test 1 John Anderson
3 No PM Med Hi Job Req. 3 John Anderson
| NIS Match Merge (MM) Y2K Project |
1 Not decom’d HI HI Exemption 2 Jeff Tyler
2 Host migration delays HI HI Requires Assessment 1 John Libermann
3 Get exemption HI Low 3 Jeff Tyler
| NIS NIC Adjunct Processor (NAP) Y2K Project |
1 Not decom’d by 12/31/98 Med Low (a) Submit exception. (b) Xfer to NEMAS as front end processor. (c) See if TEFAC can replace it. (d) Institute TCP/IP with NEMAS. (e) Resort to Claimed Compliance. 1 Bob Laird
2 Receive exemption Med-HI Low 2 Jeff Tyler
3 Transfer to NEMAS Low HI If not then go to TEFAC 3 Bob Laird
4 TEFAC to replace need for NAP Med-Low HI If not then go to TCP/IP 4 Kim Greer
5 Institute TCP/IP with NEMAS Med-HI Med If not then go to Claimed Compliance 5 Dave Weis
6 Go Claimed Compliance Low Med If not then retire 6 Jeff Tyler
7 Retire Low Low Escalate Business Case 7 Bob Laird
| NIS Storage And Verification Element (SAVE) Y2K Project |
1 Not Compliant by 12/31/98 HI HI Purchase remaining eight SAVE boxes 1 Bob Laird
2 Rollout by Moratorium Med Med Apply for exemption 2 Jeff Tyler
Jeff Tyler:
Estimated cost of O/S upgrade
Jeff Tyler:
Includes $6K in change controls not planned for
Calculation Sheet
ID Job Description Immediate Predecessors Planned Duration (Months) Staff (Number) Rate/Person/Month Task Cost (BAC) 8 Mos Effort to Date (Mos) % Complete Outstanding Duration (Mos) Staff Level ACWP (AC)
A Electrical Design Start 4 6 $10,000 $240,000 30 125% 0 0 $300,000
B Assemble Boards A 4 3 $10,000 $120,000 9 75% 1 3 $90,000
C Test Boards B 2 2 $10,000 $40,000 0 0% 2 2 $0
D Software Design Start 4 1 $10,000 $40,000 4 100% 0 0 $40,000
E Programming D 2 2 $10,000 $40,000 8 200% 1 2 $80,000
F Software Testing E 2 2 $10,000 $40,000 0 0% 4 2 $0
G Robot Body Design A 4 2 $10,000 $80,000 6 75% 1 2 $60,000
H Robot Construction G 2 2 $10,000 $40,000 0 0% 2 2 $0
I Final Assembly C,F,H 2 2 $10,000 $40,000 0 0% 2 2 $0
Totals: 26 22 $680,000 $570,000 Blended Rate= $10,000
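Because this sheet reports percent complete and actuals per task, the earned value (BCWP) of each task can be sketched as percent complete times that task's BAC and compared with its ACWP. Capping credit at 100% of BAC is an assumption here, not something the sheet states (tasks A and E report more than 100%), and the names below are illustrative:

# Earned value per task: BCWP = % complete x BAC (credit capped at BAC as an assumption).
tasks = {
    # id: (BAC, fraction complete, ACWP to date) -- figures from the table above
    "A": (240_000, 1.25, 300_000),
    "B": (120_000, 0.75,  90_000),
    "D": ( 40_000, 1.00,  40_000),
    "E": ( 40_000, 2.00,  80_000),
    "G": ( 80_000, 0.75,  60_000),
}
for tid, (bac, pct, acwp) in tasks.items():
    bcwp = min(pct, 1.0) * bac   # earned value for the task
    cv = bcwp - acwp             # task-level cost variance (BCWP - ACWP)
    print(f"{tid}: BCWP={bcwp:,.0f}  CV={cv:,.0f}")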
BCWS Gantt
ID BCWS (PV) TASK Month 1 Month 2 Month 3 Month 4 Month 5 Month 6 Month 7 Month 8 Month 9 Month 10 Month 11 Month 12 BAC Totals
A Electrical Design $0
B Assemble Boards $0
C Test Boards $0
D Software Design $0
E Programming $0
F Software Testing $0
G Robot Body Design $0
H Robot Construction $0
I Final Assembly $0
$0
Monthly Total $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0
Cumulative Cost (PV) $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0
ID ACWP (AC) TASK Month 1 Month 2 Month 3 Month 4 Month 5 Month 6 Month 7 Month 8 Month 9 Month 10 Month 11 Month 12 Month 13 Month 14 Month 15 AC Totals
A Electrical Design $0
B Assemble Boards $0
C Test Boards $0
D Software Design $0
E Programming $0
F Software Testing $0
G Robot Body Design $0
H Robot Construction $0
I Final Assembly $0
Monthly Total $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0
Cumulative Cost (AC) $0 $0 $0 $0 $0 $0 $0 $0
ID BCWP (EV) TASK Month 1 Month 2 Month 3 Month 4 Month 5 Month 6 Month 7 Month 8 Month 9 Month 10 Month 11 Month 12 Month 13 Month 14 Month 15 EV Totals
A Electrical Design $0
B Assemble Boards $0
C Test Boards $0
D Software Design $0
E Programming $0
F Software Testing $0
G Robot Body Design $0
H Robot Construction $0
I Final Assembly $0
Monthly Total $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0
Cumulative Cost (EV) $0 $0 $0 $0 $0 $0 $0 $0
Month 1 Month 2 Month 3 Month 4 Month 5 Month 6 Month 7 Month 8 Month 9 Month 10 Month 11 Month 12
Cumulative Cost (PV) $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0 $0
Cumulative Cost (AC) $0 $0 $0 $0 $0 $0 $0 $0
Cumulative Cost (EV) $0 $0 $0 $0 $0 $0 $0 $0
BAC= 0
BCWS (PV)= 0
ACWP (AC)= 0
BCWP (EV)= 0
SV= BCWP-BCWS    CV= BCWP-ACWP
SV= $0    CV= $0
Project is $xxK behind schedule.    Project is $xxxK over budget.
SPI= BCWP/BCWS    CPI= BCWP/ACWP
SPI= 0    CPI= 0
For every dollar spent on scheduled effort we realize $0.xx worth of progress.    For every dollar spent, we realize $0.xx of planned result.
EAC= BAC/CPI    PM Eval= +10% to -5% EAC
EAC= $0.00    PM Eval= 0% (EAC/BAC)
The PM’s EAC is XX% over/under BAC and is in trouble/not in trouble.
BCWS Gantt
Cumulative Cost (PV)
Cumulative Cost (AC)
Cumulative Cost (EV)
Sheet3
BAC=$XXX,XXX
EAC=$XXX,XXX
Robert D. Coleman, PhD © 2006 rcoleman@mba1971.hbs.edu
What Is Circular Reasoning?
Logical fallacies are a type of error in reasoning, errors which may be recognized and
corrected by observant thinkers. There are a large number of informal fallacies that are
cataloged, and some have multiple names. The frequency of occurrence is one way to
rank the fallacies. The ten most-frequent fallacies probably cover the overwhelming
majority of illogical reasoning. With a Pareto effect, 20% of the major fallacies might
account for 80% of fallacious reasoning.
One of the more common fallacies is circular reasoning, a form of which was called
“begging the question” by Aristotle in his book that named the fallacies of classical logic.
The fallacy of circular reasoning occurs when the conclusion of an argument is
essentially the same as one of the premises in the argument. Circular reasoning is an
inference drawn from a premise that includes the conclusion, and used to prove the
conclusion. Definitions of words are circular reasoning, but they are not inference.
Inference is the deriving of a conclusion in logic by either induction or deduction.
Circular reasoning can be quite subtle, can be obfuscated when intentional, and thus can
be difficult to detect.
Circular reasoning as a fallacy refers to reasoning in vicious circles or vicious circular
reasoning, in contrast to reasoning in virtuous circles or virtuous circular reasoning.
Virtuous circular reasoning is sometimes used for pedagogical purposes, such as in math
to show that two different statements are equivalent expressions of the same thing. In a
logical argument, viciously circular reasoning occurs when one attempts to infer a
conclusion that is based upon a premise that ultimately contains the conclusion itself.
Why is vicious circular reasoning unacceptable and fatal? Genuine method proceeds from
the known to the unknown. Vicious circular reasoning proceeds from the known to the
equally known. Vicious circular reasoning, therefore, violates genuine method. Vicious
circular reasoning does not add anything new, it does not advance learning, and it does
not add to knowledge. Vicious circular reasoning goes nowhere and leads nowhere —
hence, its descriptive name “circular”. It literally moves in a circuit or a circle.
Most people do not study logical fallacies as part of their formal education. Those who
study them typically do so as part of a course in logic, maybe called critical thinking, in
the philosophy department. The rest of us have to learn about them on our own in order to
make and detect sound arguments. Note that the word argument applies to all reasoning
regardless of form, and thus it includes hypotheses, models, arguments and studies.
Here are the citations for a classical text and for a modern text about logic.
Prior Analytics and Topics, Aristotle
The Logic of Real Arguments, Alec Fisher, Second Edition, 2004, Cambridge
University Press.
The following is a list of Internet sites with information about the fallacies of informal
logic including the fallacy of circular reasoning, begging the question, or petitio principii.
http://www.kcmetro.cc.mo.us/longview/CTAC/fallacy.htm
Critical Thinking Across the Curriculum Project: Informal Fallacies
Table of Contents (17 Fallacies)
Fallacies of Deception:
Fallacies of Distraction: Fallacies involving Counterfeit:
False Dilemma Affirming the Consequent
Slippery Slope Denying the Antecedent
Straw Man Equivocation
Begging the Question or Circularity
Fallacies which use Emotion or Motive in place of Support:
Appeal to Pity Appeal to Authority
Appeals to Tradition Prejudicial Language
Appeal to Force Appeal to Mass Opinion
Fallacies which employ both (Double Trouble):
Ad Hominem – Abusive Ad Hominem – Ridicule
Ad Hominem – Circumstantial Tu Quoque – Two wrongs
http://www.ramdac.org/fallacies.php
Fallacy Tutorial Pro 3.0, 1995, Dr. Michael C. Labossiere (42 fallacies)
Introduction. Description of Fallacies.
In order to understand what a fallacy is, one must understand what an argument is. Very
briefly, an argument consists of one or more premises and one conclusion. A premise is a
statement (a sentence that is either true or false) that is offered in support of the claim
being made, which is the conclusion (which is also a sentence that is either true or false).
There are two main types of arguments: deductive and inductive. A deductive argument
is an argument such that the premises provide (or appear to provide) complete support for
the conclusion. An inductive argument is an argument such that the premises provide (or
appear to provide) some degree of support (but less than complete support) for the
conclusion. If the premises actually provide the required degree of support for the
conclusion, then the argument is a good one. A good deductive argument is known as a
valid argument and is such that if all its premises are true, then its conclusion must be
true. If the argument is valid and actually has all true premises, then it is known as a
sound argument. If it is invalid or has one or more false premises, it will be unsound. A
good inductive argument is known as a strong (or “cogent”) inductive argument. It is
such that if the premises are true, the conclusion is likely to be true.
A fallacy is, very generally, an error in reasoning. This differs from a factual error, which
is simply being wrong about the facts. To be more specific, a fallacy is an “argument” in
which the premises given for the conclusion do not provide the needed degree of support.
A deductive fallacy is a deductive argument that is invalid (it is such that it could have all
true premises and still have a false conclusion). An inductive fallacy is less formal than a
deductive fallacy. They are simply “arguments” which appear to be inductive arguments,
but the premises do not provide enough support for the conclusion. In such cases, even
if the premises were true, the conclusion would not be more likely to be true.
http://www.iep.utm.edu/f/fallacies.htm
The Internet Encyclopedia of Philosophy (164 fallacies)
A fallacy is a kind of error in reasoning. The alphabetical list below contains 164 names
of the most common fallacies, and it provides explanations and examples of each of
them. Fallacies should not be persuasive, but they often are. Fallacies may be created
unintentionally, or they may be created intentionally in order to deceive other people. The
vast majority of the commonly identified fallacies involve arguments, although some
involve explanations, or definitions, or other products of reasoning. Sometimes the term
“fallacy” is used even more broadly to indicate any false belief or cause of a false belief.
The list below includes some fallacies of this sort, but most are fallacies that involve
kinds of errors made while arguing informally in natural language.
The discussion that precedes the list begins with an account of the ways in which the term
“fallacy” is vague. Attention then turns to the number of competing and overlapping
ways to classify fallacies of argumentation. For pedagogical purposes, researchers in the
field of fallacies disagree about the following topics: which name of a fallacy is more
helpful to students’ understanding; whether some fallacies should be de-emphasized in
favor of others; and which is the best taxonomy of the fallacies. Researchers in the field
are also deeply divided about how to define the term “fallacy,” how to define certain
fallacies, and whether any general theory of fallacies at all should be pursued if that
theory’s goal is to provide necessary and sufficient conditions for distinguishing between
fallacious and non-fallacious reasoning generally. Analogously, there is doubt in the field
of ethics regarding whether researchers should pursue the goal of providing necessary
and sufficient conditions for distinguishing moral actions from immoral ones.
Introduction
The first known systematic study of fallacies was due to Aristotle in his De Sophisticis
Elenchis (Sophistical Refutations), an appendix to the Topics. He listed thirteen types.
After the Dark Ages, fallacies were again studied systematically in Medieval Europe.
This is why so many fallacies have Latin names. The third major period of study of the
fallacies began in the later twentieth century due to renewed interest from the disciplines
of philosophy, logic, communication studies, rhetoric, psychology, and artificial
intelligence.
The term “fallacy” is not a precise term. One reason is that it is ambiguous. It can refer
either to (a) a kind of error in an argument, (b) a kind of error in reasoning (including
arguments, definitions, explanations, etc.), (c) a false belief, or (d) the cause of any of the
previous errors including what are normally referred to as “rhetorical techniques”.
Philosophers who are researchers in fallacy theory prefer to emphasize meaning (a), but
their lead is often not followed in textbooks and public discussion.
http://www.hebrew4christians.com/Clear_Thinking/Informal_Fallacies/Informal_Fallacies.html
Informal Fallacies (71 fallacies in 11 categories)
You simply cannot properly begin to read the various texts without first being grounded in the basics of clear thinking. By familiarizing yourself with these forms of reasoning, you may guard yourself against making the same sorts of errors (as well as catch errors in the thinking of others who purport to be speaking the truth). A brief introduction to the subject is included.
An informal fallacy is an attempt to persuade that obviously fails to demonstrate the truth
of its conclusion, deriving its only plausibility from a misuse of ordinary language. Most
scholars categorize informal fallacies as: (1) fallacies of relevance: appeal to ignorance,
appeal to authority, ad hominem arguments, appeals to emotion, force, etc., irrelevant
conclusions, and appeals to pity; (2) fallacies of presumption: accident, converse
accident, false cause, begging the question, and complex question; (3) fallacies of
ambiguity: equivocation, amphiboly, accent, composition, and division.
http://www.datanation.com/fallacies/
Stephen’s Guide to the Logical Fallacies (53 fallacies)
http://www.adamsmith.org/logicalfallacies/
Adam Smith Institute Logical Fallacies (76 fallacies, including Petitio Principii, Circulus in Probando, and Blinding with Science)
http://www.drury.edu/ess/Logic/Informal/Overview.html
A Database of Informal Fallacies, 1987, Dr. Charles Ess (28 fallacies)
http://www.csun.edu/~dgw61315/fallacies.html
Logical Fallacies and the Art of Debate (21 fallacies, including Petitio Principii and
Circulus in Demonstrando)
http://www.fallacyfiles.org/ and http://www.fallacyfiles.org/begquest.html
The Fallacy Files (155 fallacies, including Circular Argument, Circulus in Probando,
Petitio Principii, Question-Begging, and Vicious Circle)
http://en.wikipedia.org/wiki/Logical_fallacy
Wikipedia: Logical fallacy (111 fallacies, including Begging the Question)
http://en.wikipedia.org/wiki/Circular_reasoning
Wikipedia: Circular reasoning
In logic, begging the question is the term for a type of fallacy occurring in deductive
reasoning in which the proposition to be proved is assumed implicitly or explicitly in one
of the premises. For an example of this, consider the following argument: “Only an
untrustworthy person would run for office. The fact that politicians are untrustworthy is
proof of this.” Such an argument is fallacious, because it relies upon its own
proposition—in this case, “politicians are untrustworthy”—in order to support its central
premise. Essentially, the argument assumes that its central point is already proven, and
uses this in support of itself.
Begging the question is also known by its Latin name petitio principii and is related to
the fallacy known as circular argument, circulus in probando, vicious circle or circular
reasoning. As a concept in logic the first known definition in the West is by the Greek
philosopher Aristotle around 350 B.C., in his book Prior Analytics, where he classified it
as a material fallacy.
The term is usually not used to describe the broader fallacy that occurs when the evidence
given for a proposition is as much in need of proof as the proposition itself. The more
accepted classification for such arguments is as a fallacy of many questions.
See modern usage controversy, below, over a common usage of “begs the question” with
the meaning “raises the question”.
http://en.wikipedia.org/wiki/Category:Causal_fallacies
Wikipedia: Causal fallacies
This category is for questionable cause fallacies, arguments where a cause is incorrectly
identified.
http://en.wikipedia.org/wiki/Category:Informal_fallacies
Wikipedia: Informal fallacies
This category is for arguments that are fallacious for reasons other than structural
(“formal”) flaws, such as due to ambiguity or a common error in their premises.
As people learn about something, they presume they actually know and understand the subject, and they apply their knowledge to new situations. In reading Pells's article, I was struck by his description of the 'top thinkers in project management today' (Pells, 2010). The arrogance in his comments about the characteristics of these 'top thinkers' was apparent. Determining the top thinkers in a group seems akin to determining the most beautiful person in a group: it is in the eye of the beholder and, as such, is an opinion not based on any analysis of facts. It certainly seemed that Mr. Pells was unhappy with his (and others') lack of inclusion. Still, Pells makes his point regarding arrogance and raises very real issues with arrogance in project management and in organizations.
To paraphrase Pells, arrogance in a project manager (or an organization) can bring about such negative outcomes as undervaluing expertise and experience, encouraging inappropriate shortcuts, and thus increasing risk. When planning a project, the organization and the project manager need to plan and monitor the project carefully and analytically in order to avoid arrogance (Pells, 2010).
The problem of project manager arrogance is a very human one. As people learn about something, they presume they actually know and understand the subject and apply their knowledge to new situations, sometimes well and sometimes poorly. To minimize the poor application of that knowledge, as pointed out by Krock (2010), managers cannot be right every time on every subject; they need to recognize this and make decisions based on sound, rational analysis rather than relying on their 'instinct' or snap judgments, because those judgments can be based on their own emotional response to the subject at hand and not on sound, rational analysis.
[Annotation on the writer's approach:] Here the writer introduces the reader to the situation to set the stage for his premise or thesis statement. Knowledge of the reading is presented, followed by the writer's comprehension of what the reading is about. Application of the understanding of the material is shown to prepare for the analysis, and then the premise is broken down into parts and reviewed. Finally, the analysis is synthesized into a conclusion drawn from the analysis. But this is just an opinion unless it (the conclusion) is tested. This is called evaluation: using a different perspective from that used in the analysis to validate that the conclusion drawn by the writer from the analysis is supported by a different source, reference, or example.
References
Pells, D. (2010). The dangers of arrogance in project management. PM World Today, August 2010 (Vol. XII, Issue VIII). Retrieved August 6, 2010, from http://www.pmworldtoday.net
Krock, E. (2010, August 16). Product management tips and best practices: Humility. Agile Product and Project Management Blog. Retrieved February 10, 2011, from http://www.voximate.com/blog/article/89/product-management-tips-best-practices-humility/
Earned Value Management 'Gold Card'
[Figure: budget structure and performance graph. The budget structure panel relates Contract Price, Profit / Fees, NCC, AUW, CBB, OTB, TAB, Management Reserve, PMB, Undistributed Budget, Summary Level Planning Packages, Control Accounts, Work Packages, and Planning Packages. The performance graph plots BCWS, BCWP, and ACWP ($) against time (from 'time now' to the completion date) to show cost variance, schedule variance, OVERRUN, BAC, EAC, PMB, and TAB.]
TERMINOLOGY
NCC   Negotiated Contract Cost – Contract price less profit / fee(s)
AUW   Authorized Unpriced Work – Work contractually approved, but not yet negotiated / definitized
CBB   Contract Budget Base – Sum of NCC and AUW
OTB   Over Target Baseline – Sum of CBB and recognized overrun
TAB   Total Allocated Budget – Sum of all budgets for work on contract = NCC, CBB, or OTB
BAC   Budget At Completion – Total budget for total contract thru any given level
PMB   Performance Measurement Baseline – Contract time-phased budget plan
MR    Management Reserve – Budget withheld by Ktr PM for unknowns / risk management
UB    Undistributed Budget – Broadly defined activities not yet distributed to CAs
CA    Control Account – Lowest CWBS element assigned to a single focal point to plan & control scope / schedule / budget
WP    Work Package – Near-term, detail-planned activities within a CA
PP    Planning Package – Far-term CA activities not yet defined into WPs
SLPP  Summary Level Planning Package – Far-term activities not yet defined into CAs
BCWS  Budgeted Cost for Work Scheduled – Value of work planned to be accomplished = PLANNED VALUE
BCWP  Budgeted Cost for Work Performed – Value of work accomplished = EARNED VALUE
ACWP  Actual Cost of Work Performed – Cost of work accomplished = ACTUAL COST
EAC   Estimate At Completion – Estimate of total cost for total contract thru any given level; may be generated by Ktr, PMO, DCMA, etc. = EAC Ktr / PMO / DCMA
LRE   Latest Revised Estimate – Ktr's EAC or EAC Ktr
TCPI  To Complete Performance Index – Efficiency needed from 'time now' to achieve an EAC
VARIANCES (Favorable is Positive, Unfavorable is Negative)
Cost Variance        CV = BCWP – ACWP      CV% = (CV / BCWP) * 100
Schedule Variance    SV = BCWP – BCWS      SV% = (SV / BCWS) * 100
Variance at Completion   VAC = BAC – EAC
OVERALL STATUS
% Schedule = (BCWS_CUM / BAC) * 100
% Complete = (BCWP_CUM / BAC) * 100
% Spent    = (ACWP_CUM / BAC) * 100
DoD TRIPWIRE METRICS (Favorable is > 1.0, Unfavorable is < 1.0)
Cost Efficiency      CPI = BCWP / ACWP
Schedule Efficiency  SPI = BCWP / BCWS
BASELINE EXECUTION INDEX (BEI) (Schedule Metric)
BEI = # of Baseline Tasks Actually Completed / # of Baseline Tasks Scheduled for Completion
CRITICAL PATH LENGTH INDEX (CPLI) (Schedule Metric)
CPLI = (Critical Path Duration + Float Duration (to baseline finish)) / Critical Path Duration
TO COMPLETE PERFORMANCE INDEX (TCPI) # §
TCPI_EAC = Work Remaining / Cost Remaining = (BAC – BCWP_CUM) / (EAC – ACWP_CUM)
ESTIMATE AT COMPLETION #
EAC = Actuals to Date + [(Remaining Work) / (Efficiency Factor)]
EAC_CPI = ACWP_CUM + [(BAC – BCWP_CUM) / CPI_CUM] = BAC / CPI_CUM
EAC_Composite = ACWP_CUM + [(BAC – BCWP_CUM) / (CPI_CUM * SPI_CUM)]
# To determine a contract-level TCPI or EAC, you may replace BAC with TAB.
§ To determine the TCPI_BAC or TCPI_LRE, replace EAC with BAC or LRE.
EVM POLICY: DoDI 5000.02, Encl 4, Table 5. EVMS in accordance with ANSI/EIA-748 is required for cost or incentive contracts, subcontracts, intra-government work agreements, and other agreements valued > $20M (Then-Year $). EVMS contracts > $50M (TY $) require that the EVM system be formally validated by the cognizant contracting officer. Additional guidance is in the Defense Acquisition Guidebook and the Earned Value Management Implementation Guide (EVMIG). EVMS is discouraged on Firm-Fixed Price, Level of Effort, and Time & Material efforts regardless of cost.
EVM CONTRACTING REQUIREMENTS:
Non-DoD FAR Clauses – Solicitation – 52.234-2 (Pre-Award IBR) or 52.234-3 (Post-Award IBR); Solicitation & Contract – 52.234-4
DoD (≥ $20M) DFAR Clauses – 252.234-7001 for solicitations and 252.234-7002 for solicitations & contracts
Contract Performance Report – DI-MGMT-81466A * 5 Formats (WBS, Organization, Baseline, Staffing & Explanation)
Integrated Master Schedule – DI-MGMT-81650 * (Mandatory for DoD EVMS contracts)
Integrated Baseline Review (IBR) – Mandatory for all EVMS contracts
* See the EVMIG for CPR & IMS tailoring guidance.
EVM Home Page = https://acc.dau.mil/evm   eMail Address: EVM.dau@dau.mil
DAU POC: (703) 805-5259 (DSN 655)   Revised January 2009
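The variance and index formulas above translate directly into a few lines of code. The following is a minimal sketch in Python (the function name, variable names, and sample figures are illustrative assumptions, not part of the Gold Card) that computes CV, SV, CPI, SPI, and the overall-status percentages from cumulative values.

def evm_status(bcws_cum, bcwp_cum, acwp_cum, bac):
    # Cost Variance and Schedule Variance: positive is favorable, negative is unfavorable.
    cv = bcwp_cum - acwp_cum
    sv = bcwp_cum - bcws_cum
    # Efficiency indices: > 1.0 is favorable, < 1.0 is unfavorable.
    cpi = bcwp_cum / acwp_cum
    spi = bcwp_cum / bcws_cum
    return {
        "CV": cv, "CV%": 100.0 * cv / bcwp_cum,
        "SV": sv, "SV%": 100.0 * sv / bcws_cum,
        "CPI": cpi, "SPI": spi,
        "% Schedule": 100.0 * bcws_cum / bac,
        "% Complete": 100.0 * bcwp_cum / bac,
        "% Spent": 100.0 * acwp_cum / bac,
    }

# Hypothetical status: $12,000 planned, $10,000 earned, $11,000 spent, BAC of $63,000.
print(evm_status(bcws_cum=12000, bcwp_cum=10000, acwp_cum=11000, bac=63000))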
All you ever wanted to know about earned value analysis
CV = BCWP – ACWP        CPI = BCWP / ACWP
SV = BCWP – BCWS        SPI = BCWP / BCWS
If ACWP > BCWP, then CV < 0 and CPI < 1: the project is over budget.
If ACWP = BCWP, then CV = 0 and CPI = 1: the project is on budget.
If ACWP < BCWP, then CV > 0 and CPI > 1: the project is under budget.
If BCWS > BCWP, then SV < 0 and SPI < 1: the project is behind schedule.
If BCWS = BCWP, then SV = 0 and SPI = 1: the project is on schedule.
If BCWS < BCWP, then SV > 0 and SPI > 1: the project is ahead of schedule.
EV – Previously called BCWP or Budgeted Cost of Work Performed; Earned Value, the value of actual work performed.
PV – Previously called BCWS or Budgeted Cost of Work Scheduled; Planned Value, the project budget.
AC – Previously called ACWP or Actual Cost of Work Performed; Actual Cost.
CV – Cost Variance = BCWP – ACWP
SV – Schedule Variance = BCWP – BCWS
CPI – Cost Performance Index = BCWP / ACWP
SPI – Schedule Performance Index = BCWP / BCWS
EAC – Estimate At Completion: a forecast of the most likely total project cost based upon project performance and risk.
Schedule forecast = Original Schedule / SPI
Cost forecast = Min: Original Budget / CPI, or Max: Original Budget / (CPI * SPI)
BAC – Budget at Completion = Σ of all the budgets (PV or BCWS)
VAC – Variance at Completion = BAC – EAC
ETC – Estimate to Complete = EAC – AC
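To illustrate the forecasting formulas just listed, here is a small sketch in Python (the function name and the sample figures are illustrative assumptions, not taken from the Gauchito plan) that derives CPI and SPI and then produces the CPI-based and composite estimates at completion, the estimate to complete, and the variance at completion.

def evm_forecast(bac, bcws_cum, bcwp_cum, acwp_cum):
    cpi = bcwp_cum / acwp_cum
    spi = bcwp_cum / bcws_cum
    eac_cpi = acwp_cum + (bac - bcwp_cum) / cpi            # algebraically equal to BAC / CPI
    eac_composite = acwp_cum + (bac - bcwp_cum) / (cpi * spi)
    etc = eac_cpi - acwp_cum                               # Estimate to Complete = EAC - AC
    vac = bac - eac_cpi                                    # Variance at Completion = BAC - EAC
    return {"CPI": cpi, "SPI": spi, "EAC (CPI)": eac_cpi,
            "EAC (composite)": eac_composite, "ETC": etc, "VAC": vac}

# Hypothetical status figures, consistent with the example above.
print(evm_forecast(bac=63000, bcws_cum=12000, bcwp_cum=10000, acwp_cum=11000))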
Future Value – the value in the future of funds available today.
FV = PV * (1 + i)^n
If you have $1,000 invested for three years at 10%, how much will you have at the end of year three?
EOY 1 = $1,000 * (1 + 10%) = $1,100
EOY 2 = $1,100 * (1 + 10%) = $1,210
EOY 3 = $1,210 * (1 + 10%) = $1,331
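As a cross-check of the arithmetic above, a minimal sketch in Python (the function name is an illustrative assumption) that compounds a present amount forward year by year:

def future_value(pv, rate, years):
    """Compound a present amount forward: FV = PV * (1 + i)^n, printing each year-end value."""
    fv = pv
    for year in range(1, years + 1):
        fv *= (1 + rate)
        print(f"EOY {year}: ${fv:,.2f}")
    return fv

future_value(1000, 0.10, 3)   # prints $1,100.00, $1,210.00, $1,331.00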
Present Value – the value today of funds available in the future.
PV = FV / (1 + i)^n
If you want $1,000 in three years, how much do you have to invest today at 8% to receive your $1,000?
EOY 1 = $1,000 / (1 + 8%) = $925.93
EOY 2 = $925.93 / (1 + 8%) = $857.34
EOY 3 = $857.34 / (1 + 8%) = $793.83
Net Present Value (NPV) – the present value of future cash inflows minus the present cost.
Internal Rate of Return (IRR) – the average rate of return earned over the life of the project; it is the discount rate at which the discounted cash flows minus the up-front cost equal zero.
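A short sketch in Python of discounting and of the two measures just defined; the cash-flow figures and function names are illustrative assumptions, and the IRR is found by simple bisection rather than a closed form.

def npv(rate, cash_flows):
    """Net present value of cash_flows[0..n], where cash_flows[0] is the (negative) up-front cost."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=0.0, hi=1.0, tol=1e-6):
    """Discount rate at which NPV = 0, found by bisection (assumes NPV changes sign on [lo, hi])."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid        # NPV still positive: the root lies at a higher rate
        else:
            hi = mid
    return (lo + hi) / 2

flows = [-1000, 400, 400, 400]    # hypothetical project: $1,000 cost, $400 per year for 3 years
print(npv(0.08, flows))           # NPV at an 8% discount rate (about $31)
print(irr(flows))                 # about 0.097, i.e. roughly a 9.7% internal rate of return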
PERT Weighted Average = (Optimistic + 4 × Most Likely + Pessimistic) / 6
PERT Standard Deviation = (Pessimistic – Optimistic) / 6
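A one-function sketch in Python of the PERT calculation (the function and variable names, and the sample durations, are illustrative assumptions); it returns the weighted-average duration and the standard deviation for a single three-point activity estimate.

def pert_estimate(optimistic, most_likely, pessimistic):
    """Return (expected duration, standard deviation) for a three-point PERT estimate."""
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6
    return expected, std_dev

# Hypothetical activity estimated at 2 / 4 / 8 weeks (optimistic / most likely / pessimistic):
print(pert_estimate(2, 4, 8))   # (4.33..., 1.0)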
Ansari X Prize Gauchito Rocket
Table of Contents
Executive Summary ……………………………………………………… page 3
Key Stakeholders ………………………………………………………… page 3
Business Needs …………………………………………………………… page 3
Proposal 7/8 Scale Ansari X Gauchito Rocket …………………………… page 4
1.0 Scope Management Plan ……………………………………………… page 5–11
    Project Scope Change Order …………………………………………… page 8
2.0 Schedule Management Plan …………………………………………… page 12–39
    Appendix MM: Milestones ……………………………………………… page 16
    Appendix CR: Change Request Form …………………………………… page 17–18
    Appendix WT: WBS at Tracked Level …………………………………… page 19–21
    Appendix WG: WBS Milestone Gantt …………………………………… page 22–25
    Appendix WR: WBS with Resources Leveled …………………………… page 26–38
    Appendix ND: Gauchito Network Diagram ……………………………… page 39
3.0 Cost Management Plan ………………………………………………… page 40–50
    Example Cost Management Report ……………………………………… page 47
4.0 Quality Management Plan ……………………………………………… page 51–75
    Project Quality Control ………………………………………………… page 55–70
    Project Audits & Quality Reviews ……………………………………… page 74–75
5.0 Staffing Management Plan ……………………………………………… page 76–82
6.0 Communications Management Plan …………………………………… page 83–87
    Enclosure I ……………………………………………………………… page 85
    Project Communications Planner ……………………………………… page 86–87
7.0 Risk Management Plan ………………………………………………… page 88–93
    Risk Register …………………………………………………………… page 92
8.0 Procurement Management Plan ………………………………………… page 94–98
    Probability and Impact Matrix ………………………………………… page 97
    Benchmark ……………………………………………………………… page 98
Appendix A Project Charter ……………………………………………… page 99–104
Appendix B Product Description ………………………………………… page 105–106
Appendix C Preliminary Scope Statement ………………………………… page 107–112
Appendix D Work Breakdown Structure …………………………………… page 113–118
Appendix E Cost Rollup Estimates ………………………………………… page 119–120
Appendix F Scheduled Start Dates ………………………………………… page 121–122
Appendix G Responsibility Assignment Matrix …………………………… page 123–124
Appendix H Performance Measurement Baselines ………………………… page 125
Appendix I Major Milestones ……………………………………………… page 126
Appendix J Key or Required Staff ………………………………………… page 127
Appendix K Key Risks ……………………………………………………… page 128
Appendix L Constraints …………………………………………………… page 129
Appendix M Assumptions …………………………………………………… page 130
Appendix N Construction Plans …………………………………………… page 131–135
Works Cited ………………………………………………………………… page 136
Executive Summary
Space Systems Technology (SST) is a worldwide leader in providing business solutions. Space Systems Technology will engineer and build a 7/8 scale model of the Ansari X Gauchito Rocket. The project's start date is May 22, 2006, and the end date is July 26, 2006. The order of magnitude estimate for the project budget is $63,000.
The project scope, project proposal, budget utilization, cost, constraints, requirements, project plan, responsibilities, and risk factors have all been discussed and worked out in this document.
Space Systems Technology, the engineering firm, has been contracted for the following reasons:
· The personnel are highly skilled and experienced in the field.
· Over 30 years of experience in successful engineering.
In conclusion, Space Systems Technology is a firm specializing in the production of built-to-scale, fully functioning models.
Key Stakeholders
Pablo de Leon & Associates along with the members of Space Systems Technology are the key stakeholders for the Ansari X Gauchito Rocket project.
Business Needs
Pablo de Leon & Associates requires Space Systems Technology to build a functioning scaled rocket based on its design for entry into the Ansari X Prize Cup. This scaled rocket prototype will allow Pablo de Leon & Associates to test its design at reduced cost and prove the feasibility of building a full-sized rocket.
Proposal 7/8 Scale Ansari X Gauchito Rocket
1.0 Scope Management Plan
1.1 Scope Definition
The project scope for the Gauchito rocket will be defined by the project charter and preliminary scope statement as well as the scope management plan and all approved change requests. The work breakdown structure presents the project deliverables in a hierarchical manner, and the definition will include the completed work breakdown structure down to the work package level. The work breakdown structure will be based upon the construction plan, which will be strictly adhered to. In order to ensure proper definition of scope, the project manager will meet with all of the key personnel regarding every facet of this project. All phases of the project will be broken down according to amount of effort required into smaller work packages.
1.2 Scope Documentation
The project scope will be documented through an engineering specifications document that is provided by the customer and has been incorporated into the project charter. If there are any discrepancies between the product description and the product requirements (for example, the product description states the rocket must be 37 feet long, while the project charter requires a 7/8 scale rocket, which is 43 feet long), the project manager and the team will discuss this with the customer and define the actual length. There will be an internal central database set up specifically for this project to allow all project team members access to the same information regarding the exact scope for this project. All documentation specific to this project shall be archived and tracked within this database.
1.3 Scope Verification
The project scope shall be verified through the project scope statement, the project scope management plan, and constant communication with the project sponsor to ensure the deliverables are being met and understood. An inspection of each deliverable shall be performed and shall be compared to the construction plan and the work breakdown structure. The scope deliverables are understood to be as follows:
1.0 ASSEMBLE ENGINE MOUNT
2.0 FIN PREPARATION
3.0 MARK FIN AND LAUNCH LUG LINES
4.0 INSERTING ENGINE MOUNT
5.0 ATTACH FINS
6.0 ATTACH SHOCK CORD
7.0 ASSEMBLE NOSE CONE
8.0 ATTACH PARACHUTE/SHOCK CORD
9.0 ATTACH LAUNCH LUG
10.0 PAINTING THE ROCKET
11.0 APPLICATION OF DECALS
12.0 APPLYING CLEAR COAT
13.0 DISPLAY NOZZLE ASSEMBLY
14.0 ROCKET PREFLIGHT
15.0 PREPARE FOR TEST LAUNCH
1.4 Scope Management
The project scope will be managed through the utilization of change requests to the project and shall be reviewed by the subject matter expert it affects, the project manager, and the sponsor. Identification of any risk to the project regarding the change as well as the effect on the schedule of the project and cost will be considered. No changes will be approved until the project manager and team meet with the sponsor to validate the scope. The project manager and customer need to meet to get the customer’s feedback on any requested changes after they have been validated through the scope to ensure that all of the deliverables are still being met if the change is approved. The project manager and the sponsor must both sign off and approve any changes requested. Any and all corrective actions that are suggested will be considered in the same manner as the requested changes. Corrective actions that need to be taken will be signed off on by the project manager and the sponsor and will be implemented by the project manager. The following change order form will be utilized during the duration of this project.
Project Scope Change Order
Project Name: Ansari X Prize Cup – Gauchito Rocket
Project Manager: Julie Davis
Project Tracking Number: PMGT 605-001    Date:
Summary of Change:
Rationale for Change:
Brief overview of the impact of this change on . . .
· Project schedule:
· Quality of deliverables:
· Costs:
· Stakeholders and/or core team members:
· Other deliverables, including amount and quality:
Change approved by (signatures):
Sponsor, Jeff Tyler: ___________________________________________ Date: ___________
Project Manager, Julie Davis: ____________________________________ Date: _________
The project scope will also be managed through the project scope statement, the work breakdown structure, the project scope management plan, and performance reports, as well as any approved change requests and all work performance information. The implemented change control procedures will be followed with no exceptions. Variance analysis shall be utilized when needed.
1.5 Scope Control
The project scope will be controlled by utilizing the project scope statement, work breakdown structure, work breakdown structure dictionary, project scope management plan, all project performance reports, all approved change requests, and work performance information, which are the scope inputs of the project. The tools used will be the change management plan, variance analysis, and any replanning needed to stay within the scope of the project and on task to complete the deliverables, as well as the utilization of the configuration management system. The outputs for the scope control will include any project scope updates, all work breakdown structure updates, all scope baseline updates, any requested changes, any recommended corrective actions, any organizational process asset updates, and any updates to the project management plan. All changes, including a preliminary high-level evaluation of schedule, cost, labor, etc., will be presented to the Configuration Control Board (CCB) immediately as time is of the essence in the short-term project to determine if any changes should be approved.
1.5.1 Causation of Scope Changes
Please note the most frequent causes of scope change requests include:
· Errors and Omissions—Changes in requirements that would result in schedule delays and cost increases, which are not allowed in this project at all due to the short time frame and the race to win the Ansari X Prize.
· Value-Adding Opportunities—The unforeseen advent of new technology that could add value to the project. Maintain the scope of the project if the new opportunity does not have sufficient return on investment value to offset any risk that could be involved utilizing the new product. We are contracted to utilize only those tool sets that were previously agreed upon. No changes will be acceptable unless signed off on by the CCB, the PM, and the sponsor.
· Competitive Pressures—Other competitors getting closer to being the first to develop the winning rocket may cause our team to have to speed up the schedule and work holidays and weekends.
· Schedule Slippage—Can sometimes force the team to reduce the scope of the project. This is not an option with this project as it is so detailed and short term. Do not forget this team wants to be first to have a fully functioning rocket to enable Pablo de Leon & Associates to win the Ansari X Prize Cup.
1.6 Scope Development & Breakdown
The process for developing the WBS from the detailed scope statement will be completed through the use of subject matter experts (SMEs). A project team review will be conducted after the project manager breaks down each deliverable to the work package level, where it can be assigned to one person and a cost control account created for that work package, which must be less than 80 hours. For example, cutting and sanding the fins is decomposed into cutting fins as one task and sanding fins as one task. The tasks may then each be further decomposed into cutting fin #1 and sanding fin #1, with a work package for each fin for cutting and one for sanding.
1.7 Formal Scope Verification
Formal verification of the deliverable will be done through the utilization of the WBS, the project charter, and the scope management plan, as well as any signed, approved change requests and updates. This will be done through reviewing the deliverables and their requirements at each milestone by the PM and sponsor.
1.8 Formal Acceptance
Formal acceptance of deliverable will be obtained through the above-stated verification
process so that when the rocket is complete, the customer can review all of the formal verification requirements and test procedures that the sponsor previously signed off on with the PM. Formal acceptance is considered complete when the customer comes to pick up the rocket no later than 3 days after completion of contract deliverables.
2.0 Schedule Management Plan
Objectives
The purpose of this document is to define and document how changes to the schedule will be managed for this project.
Overview
The project shall be managed by means of an established work breakdown structure (WBS). This WBS shall be the basis for establishing project schedule and for definition of project tasks and deliverables. The project manager (PM) shall collect actual completion and resource utilization data from the team members in order to calculate actual performance against predicted cost and schedule. This data shall be presented to the program manager in the form of a weekly project status report.
Task status and resource utilization statistics shall be reported to the PM utilizing the elements of the WBS. The project shall be controlled using Microsoft Project to establish project tasks, dependencies, and schedule. WBS activity decomposition shall be such that tasks can be completed in a maximum one-week time frame. Credit for task progress shall only be awarded upon task completion. Team members shall provide weekly statistics on resource expenditures by task as well as information concerning task completion. This information shall then be compiled to provide schedule status to the sponsor.
2.1 Assumptions
· The materials and equipment will arrive one week prior to project start date.
· All necessary personnel will be available at the time of the project start.
· The contracts will be followed as written without any delays or difficulties.
2.2 Schedule Variance Response Process
Once the project schedule is baselined, the PM is responsible for ensuring that actual effort and actual start and end dates are entered for each activity. This data is essential to establish whether schedule variances exist. When a variance response is required, the PM is the primary decision maker; for major problems, the sponsor will be the ultimate decision maker.
2.3 Major Problems
· On a regular basis, but not less than once per week, the PM will create an earned value report. Using this report, the PM will identify and isolate both positive and negative schedule variances.
· When a variance exceeds +/– 5%, the PM must determine what is causing the variance and decide whether it requires corrective action (a simple variance check is sketched after this list).
· If corrective action is required, it is up to the discretion of the sponsor as to whether a schedule change request is necessary or if the variance can be absorbed within the existing project schedule. If a schedule change request is necessary, it is submitted through the standard schedule change control procedures as outlined in the following.
· Notification will be sent to the sponsor and all functional managers via email; when required, a copy of the signed change request form will be attached.
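The +/– 5% threshold above is easy to mechanize. The following is a minimal sketch in Python (the function and variable names are illustrative assumptions; only the 5% threshold comes from this plan) of how the weekly earned value report could flag a schedule variance for corrective action.

def schedule_variance_check(bcwp, bcws, threshold_pct=5.0):
    """Flag a schedule variance that exceeds the +/- 5% corrective-action threshold."""
    sv = bcwp - bcws
    sv_pct = 100.0 * sv / bcws
    needs_corrective_action = abs(sv_pct) > threshold_pct
    return sv, sv_pct, needs_corrective_action

# Hypothetical weekly status: $9,000 of work earned against $10,000 scheduled.
print(schedule_variance_check(bcwp=9000, bcws=10000))   # (-1000, -10.0, True)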
2.4 Minor Problems
· Minor variances will be absorbed into the project through schedule changes or shifting of project resources.
2.5 Schedule Change Control Processes
All schedule modifications must go through the following change control process:
1) Identify and assess the schedule change.
2) Fill out and submit a "Change Request Form," along with required supporting documentation, to the PM.
3) The PM will review the change request and may request additional documentation prior to review by the CCB.
4) The CCB will evaluate the change. Using the "Change Request Form," the CCB will mark the change as:
   a) Approved, in which case the PM will incorporate the change and adjust other project planning factors as necessary.
   b) Approved pending additional supporting documentation, in which case the PM will specify and coordinate gathering of the required documentation, incorporate the change, and adjust other project planning factors as necessary.
   c) Denied, in which case the PM will notify the requestor of the status and the reason for denial.
5) The PM will document the change request outcome and, as necessary, update the WBS and schedule documentation if impacted.
2.6 Schedule References
A WBS for the level the project will be tracked is in Appendix WT.
A resource-leveled project plan is available in Appendix WR.
A time-based network diagram is available in Appendix ND.
A milestone chart is available in Appendix MM.
A change request form is in Appendix CR.
Appendix MM: Milestones
Name | Finish Date
1.0 ASSEMBLE ENGINE MOUNT | 6/5/2006
2.0 FIN PREPARATION | 5/25/2006
3.0 MARK FIN AND LAUNCH LUG LINES | 5/30/2006
4.0 INSERTING ENGINE MOUNT | 6/9/2006
5.0 ATTACH FINS | 6/16/2006
6.0 ATTACH SHOCK CORD | 5/26/2006
7.0 ASSEMBLE NOSE CONE | 5/23/2006
8.0 ATTACH PARACHUTE/SHOCK CORD | 6/2/2006
9.0 ATTACH LAUNCH LUG | 6/22/2006
10.0 PAINTING THE ROCKET | 6/29/2006
11.0 APPLICATION OF DECALS | 7/6/2006
12.0 APPLYING CLEAR COAT | 7/7/2006
13.0 DISPLAY NOZZLE ASSEMBLY | 7/13/2006
14.0 ROCKET PREFLIGHT | 7/20/2006
15.0 PREPARE FOR TEST LAUNCH | 7/26/2006
Appendix CR: Change Request Form
Project Name:
Date Request Submitted:
Title of Change Request:
Change Order Number:
Submitted by: (name and contact information)
Change Category:
( ) Scope   ( ) Schedule   ( ) Cost   ( ) Technology   ( ) Other
Description of change requested:
Events that made this change necessary or desirable:
Justification for the change/why it is needed/desired to continue/complete the project:
Impact of the proposed change on:
Scope:
Schedule:
Cost:
Staffing:
Risk:
Other:
Suggested implementation if the change request is approved:
Required approvals:
Name/Title | Date | Approve/Reject
__________ | ____ | ______________
Reference:
Schwalbe, Kathy. Information Technology Project Management, Fourth Edition. Thomson Course Technology, 2006. Found at www.course.com/mis/schwable4e.
Appendix WT: WBS at Tracked Level
Name | Duration (hours) | Start Date | Finish Date
1.0 ASSEMBLE ENGINE MOUNT | 95 | 5/22/2006 | 6/5/2006
  1.1 Measure, Mark, and Cut Engine Tube | 35 | 5/22/2006 | 5/26/2006
  1.2 Cut Engine Tube | 2 | 5/26/2006 | 5/26/2006
  1.3 Glue, Tube, Assemble Hook | 7 | 5/26/2006 | 5/30/2006
  1.4 Assemble Mylar Ring to Tube | 9 | 5/30/2006 | 5/30/2006
  1.5 Assemble Yellow Engine Block to Engine Mount Tube | 10 | 5/31/2006 | 5/31/2006
  1.6 Assemble Centering Rings | 22 | 6/1/2006 | 6/2/2006
  1.7 Application of Glue Fillets | 10 | 6/5/2006 | 6/5/2006
2.0 FIN PREPARATION | 30 | 5/22/2006 | 5/25/2006
  2.1 Sand/Cut Fins | 8 | 5/22/2006 | 5/22/2006
  2.2 Cutting Out Fins | 12 | 5/23/2006 | 5/24/2006
  2.3 Stack and Sand Fins | 10 | 5/24/2006 | 5/25/2006
3.0 MARK FIN AND LAUNCH LUG LINES | 33 | 5/22/2006 | 5/30/2006
  3.1 Cut – Tape | 13 | 5/22/2006 | 5/25/2006
  3.2 Remove Guide, Connect Fins and Lug Lines, Extend LL Line | 16 | 5/25/2006 | 5/30/2006
  3.3 Extend Launch Lug Line | 4 | 5/30/2006 | 5/30/2006
4.0 INSERTING ENGINE MOUNT | 43 | 6/6/2006 | 6/9/2006
  4.1 Mark Inside of Tube @ 5/8″ Where LL Is | 7 | 6/6/2006 | 6/6/2006
  4.2 Glue Tube | 5 | 6/6/2006 | 6/7/2006
  4.3 Assemble Engine Hook | 18 | 6/7/2006 | 6/8/2006
  4.4 Gluing Center Body Ring | 13 | 6/9/2006 | 6/9/2006
5.0 ATTACH FINS | 73 | 6/12/2006 | 6/16/2006
  5.1 Attach Fin #1 | 10 | 6/12/2006 | 6/13/2006
  5.2 Attach Fin #2 | 10 | 6/12/2006 | 6/13/2006
  5.3 Attach Fin #3 | 10 | 6/12/2006 | 6/13/2006
  5.4 Attach Fin #4 | 10 | 6/12/2006 | 6/13/2006
  5.5 Check Fin Alignment | 16 | 6/13/2006 | 6/15/2006
  5.6 Allow Glue to Dry | 17 | 6/15/2006 | 6/16/2006
6.0 ATTACH SHOCK CORD | 44 | 5/22/2006 | 5/26/2006
  6.1 Cut Out Shock Cord Mount | 5 | 5/22/2006 | 5/22/2006
  6.2 First Glue Application | 12 | 5/22/2006 | 5/24/2006
  6.3 Second Glue Application | 8 | 5/24/2006 | 5/25/2006
  6.4 Squeeze and Hold | 6 | 5/25/2006 | 5/25/2006
  6.5 Attaching Shock Cord Mount | 13 | 5/25/2006 | 5/26/2006
7.0 ASSEMBLE NOSE CONE | 16 | 5/22/2006 | 5/23/2006
  7.1 Glue Nose Cone | 16 | 5/22/2006 | 5/23/2006
8.0 ATTACH PARACHUTE/SHOCK CORD | 18 | 5/30/2006 | 6/2/2006
  8.1 Attach Lines | 7 | 5/30/2006 | 5/31/2006
  8.2 Attach Parachute | 5 | 5/31/2006 | 6/1/2006
  8.3 Tie Lines | 6 | 6/1/2006 | 6/2/2006
9.0 ATTACH LAUNCH LUG | 32 | 6/19/2006 | 6/22/2006
  9.1 Glue Launch Lines | 4 | 6/19/2006 | 6/19/2006
  9.2 Application of Glue Fillets | 28 | 6/19/2006 | 6/22/2006
10.0 PAINTING THE ROCKET | 64 | 6/22/2006 | 6/29/2006
  10.1 Apply First Coat | 16 | 6/22/2006 | 6/23/2006
  10.2 Sand | 8 | 6/23/2006 | 6/23/2006
  10.3 Apply Final Coat | 40 | 6/26/2006 | 6/29/2006
11.0 APPLICATION OF DECALS | 35 | 6/29/2006 | 7/6/2006
  11.1 Apply First Decal | 5 | 6/29/2006 | 6/29/2006
  11.2 Apply Second Decal | 5 | 6/29/2006 | 6/30/2006
  11.3 Apply Third Decal | 5 | 6/30/2006 | 6/30/2006
  11.4 Apply Fourth Decal | 5 | 6/30/2006 | 7/3/2006
  11.5 Apply Fifth Decal | 5 | 7/3/2006 | 7/5/2006
  11.6 Apply Sixth Decal | 5 | 7/5/2006 | 7/5/2006
  11.7 Apply Seventh Decal | 5 | 7/5/2006 | 7/6/2006
12.0 APPLYING CLEAR COAT | 16 | 7/6/2006 | 7/7/2006
  12.1 Apply Clear Coat to Entire Rocket | 16 | 7/6/2006 | 7/7/2006
13.0 DISPLAY NOZZLE ASSEMBLY | 32 | 7/10/2006 | 7/13/2006
  13.1 Spray Nozzle Base White | 18 | 7/10/2006 | 7/11/2006
  13.2 Apply Glue | 14 | 7/12/2006 | 7/13/2006
14.0 ROCKET PREFLIGHT | 42 | 7/13/2006 | 7/20/2006
  14.1 Prepare | 13 | 7/13/2006 | 7/17/2006
  14.2 Spike | 4 | 7/17/2006 | 7/17/2006
  14.3 Fold | 4 | 7/17/2006 | 7/18/2006
  14.4 Roll | 4 | 7/18/2006 | 7/18/2006
  14.5 Re-insert | 17 | 7/18/2006 | 7/20/2006
15.0 PREPARE FOR TEST LAUNCH | 32 | 7/21/2006 | 7/26/2006
  15.1 Insert Engine | 32 | 7/21/2006 | 7/26/2006
Appendix WG: WBS Milestone Gantt
[Gantt chart: task bars plotted against an hourly time scale from 8 to 432 hours; only the task names and durations are recoverable here.]
Name | Duration (hours)
1.0 ASSEMBLE ENGINE MOUNT | 65
2.0 FIN PREPARATION | 19
3.0 MARK FIN AND LAUNCH LUG LINES | 17
4.0 INSERTING ENGINE MOUNT | 28
5.0 ATTACH FINS | 42
6.0 ATTACH SHOCK CORD | 18
7.0 ASSEMBLE NOSE CONE | 10
8.0 ATTACH PARACHUTE/SHOCK CORD | 3
9.0 ATTACH LAUNCH LUG | 12
10.0 PAINTING THE ROCKET | 33
11.0 APPLICATION OF DECALS | 28
12.0 APPLYING CLEAR COAT | 10
13.0 DISPLAY NOZZLE ASSEMBLY | 20
14.0 ROCKET PREFLIGHT | 11
15.0 PREPARE FOR TEST LAUNCH | 5
Appendix WR: WBS with Resources Leveled
Name | Duration (hours) | Start Date | Finish Date | Resource Names
1.0 ASSEMBLE ENGINE MOUNT | 95 | 5/22/2006 | 6/5/2006 |
  1.1 Measure, Mark, and Cut Engine Tube | 35 | 5/22/2006 | 5/26/2006 |
    1.1.1 Lay ruler along engine tube | 5 | 5/22/2006 | 5/22/2006 | Fitter #1
    1.1.2 Measure engine from left of engine tube @ 1/8″ | 5 | 5/22/2006 | 5/23/2006 | Draftsman #1
    1.1.3 Mark left end of engine tube @ 1/8″ | 5 | 5/23/2006 | 5/23/2006 | Draftsman #1
    1.1.4 Measure engine from left of engine tube @ 3/4″ | 5 | 5/23/2006 | 5/24/2006 | Draftsman #1
    1.1.5 Mark from left of engine tube @ 3/4″ | 5 | 5/24/2006 | 5/25/2006 | Draftsman #1
    1.1.6 Measure engine tube from left of engine tube @ 1 1/2″ | 5 | 5/25/2006 | 5/25/2006 | Draftsman #1
    1.1.7 Mark from left of engine tube @ 1 1/2″ | 5 | 5/25/2006 | 5/26/2006 | Draftsman #1
  1.2 Cut Engine Tube | 2 | 5/26/2006 | 5/26/2006 |
    1.2.1 Cut slit of 1/8″ @ 1 1/2 inch mark on engine tube | 2 | 5/26/2006 | 5/26/2006 | Cutter #1
  1.3 Glue, Tube, Assemble Hook | 7 | 5/26/2006 | 5/30/2006 |
    1.3.1 Apply thin line of glue completely around engine at 3/4″ mark | 2 | 5/26/2006 | 5/26/2006 | Gluer #1
    1.3.2 Position hook per diagram | 2 | 5/26/2006 | 5/30/2006 | Fitter #1
    1.3.3 Insert engine hook into 1/8″ slit on engine mount tube | 3 | 5/30/2006 | 5/30/2006 | Fitter #1
  1.4 Assemble Mylar Ring to Tube | 9 | 5/30/2006 | 5/30/2006 |
    1.4.1 Slide mylar ring onto engine mount tube at 3/4″ mark | 1 | 5/30/2006 | 5/30/2006 | Fitter #1
    1.4.2 Let dry | 8 | 5/30/2006 | 5/30/2006 |
  1.5 Assemble Yellow Engine Block to Engine Mount Tube | 10 | 5/31/2006 | 5/31/2006 |
    1.5.1 Apply glue inside front of engine mount tube | 1 | 5/31/2006 | 5/31/2006 | Gluer #1
    1.5.2 Insert yellow engine block flush with the right end per diagram | 1 | 5/31/2006 | 5/31/2006 | Fitter #1
    1.5.3 Let dry | 8 | 5/31/2006 | 5/31/2006 |
  1.6 Assemble Centering Rings | 22 | 6/1/2006 | 6/2/2006 |
    1.6.1 Remove centering rings from card with modeling knife | 2 | 6/1/2006 | 6/1/2006 | Cutter #1
    1.6.2 Apply thin line of glue around engine mount tube @ 1/8″ mark | 1 | 6/1/2006 | 6/1/2006 | Gluer #1
    1.6.3 Slide notched centering ring onto glued line @ 1/8″ mark | 1 | 6/1/2006 | 6/1/2006 | Fitter #1
    1.6.4 Let glue set | 8 | 6/1/2006 | 6/2/2006 |
    1.6.5 Apply thin line of glue to opposite side of notched center ring flush with end of engine mount tube | 1 | 6/2/2006 | 6/2/2006 | Gluer #1
    1.6.6 Slide un-notched centering ring in place over glue flush with end of engine tube mount | 1 | 6/2/2006 | 6/2/2006 | Fitter #1
    1.6.7 Let dry | 8 | 6/2/2006 | 6/2/2006 |
  1.7 Application of Glue Fillets | 10 | 6/5/2006 | 6/5/2006 |
    1.7.1 Apply glue fillets to both sides of centering rings for reinforcement | 2 | 6/5/2006 | 6/5/2006 | Gluer #1
    1.7.2 Let dry | 8 | 6/5/2006 | 6/5/2006 |
2.0 FIN PREPARATION | 30 | 5/22/2006 | 5/25/2006 |
  2.1 Sand/Cut Fins | 8 | 5/22/2006 | 5/22/2006 |
    2.1.1 Sand laser cut balsa sheet w/ fine sandpaper | 8 | 5/22/2006 | 5/22/2006 | Sander-I #1
  2.2 Cutting Out Fins | 12 | 5/23/2006 | 5/24/2006 |
    2.2.1 Cut out fin #1 w/ modeling knife | 3 | 5/23/2006 | 5/23/2006 | Cutter #2
    2.2.2 Cut out fin #2 w/ modeling knife | 3 | 5/23/2006 | 5/23/2006 | Cutter #2
    2.2.3 Cut out fin #3 w/ modeling knife | 3 | 5/23/2006 | 5/24/2006 | Cutter #2
    2.2.4 Cut out fin #4 w/ modeling knife | 3 | 5/24/2006 | 5/24/2006 | Cutter #2
  2.3 Stack and Sand Fins | 10 | 5/24/2006 | 5/25/2006 |
    2.3.1 Stack fins | 2 | 5/24/2006 | 5/24/2006 | Fitter #2
    2.3.2 Sand edges of fins | 8 | 5/24/2006 | 5/25/2006 | Sander-I #1
3.0 MARK FIN AND LAUNCH LUG LINES | 33 | 5/22/2006 | 5/30/2006 |
  3.1 Cut – Tape | 13 | 5/22/2006 | 5/25/2006 |
    3.1.1 Cut out tube marking guide | 2 | 5/22/2006 | 5/22/2006 | Cutter #3
    3.1.2 Tape tube marking guide around body tube | 3 | 5/23/2006 | 5/23/2006 | Fitter #3
    3.1.3 Mark body tube at arrows | 4 | 5/23/2006 | 5/23/2006 | Draftsman #2
    3.1.4 Mark launch lug line as LL on body tube | 4 | 5/24/2006 | 5/25/2006 | Draftsman #2
  3.2 Remove Guide, Connect Fins and Lug Lines, Extend LL Line | 16 | 5/25/2006 | 5/30/2006 |
    3.2.1 Remove tube marking guide from body tube | 4 | 5/25/2006 | 5/25/2006 | Fitter #3
    3.2.2 Connect fins using door frame | 4 | 5/25/2006 | 5/26/2006 | Fitter #3
    3.2.3 Connect launch lug lines using door frame | 8 | 5/26/2006 | 5/30/2006 | Fitter #3
  3.3 Extend Launch Lug Line | 4 | 5/30/2006 | 5/30/2006 |
    3.3.1 Extend launch lug line 3 3/4″ from end of tube | 4 | 5/30/2006 | 5/30/2006 | Draftsman #2
4.0 INSERTING ENGINE MOUNT | 43 | 6/6/2006 | 6/9/2006 |
  4.1 Mark Inside of Tube @ 5/8″ Where LL Is | 7 | 6/6/2006 | 6/6/2006 |
    4.1.1 Measure inside tube to 5/8″ position on tube | 4 | 6/6/2006 | 6/6/2006 | Draftsman #1
    4.1.2 Mark inside tube at 5/8″ | 3 | 6/6/2006 | 6/6/2006 | Draftsman #1
  4.2 Glue Tube | 5 | 6/6/2006 | 6/7/2006 |
    4.2.1 Measure inside rear of body tube to 1 3/4″ position on tube | 3 | 6/6/2006 | 6/7/2006 | Draftsman #1
    4.2.2 Use finger to smear glue 1 3/4″ inside rear of body tube along LL | 2 | 6/7/2006 | 6/7/2006 | Gluer #1
  4.3 Assemble Engine Hook | 18 | 6/7/2006 | 6/8/2006 |
    4.3.1 Align engine hook with LL line | 5 | 6/7/2006 | 6/8/2006 | Fitter #1
    4.3.2 Insert engine mount into body tube until centering ring is even w/ the 5/8″ glue mark | 5 | 6/8/2006 | 6/8/2006 | Fitter #1
    4.3.3 Let dry | 8 | 6/8/2006 | 6/8/2006 |
  4.4 Gluing Center Body Ring | 13 | 6/9/2006 | 6/9/2006 |
    4.4.1 Locate scrap piece of balsa to apply glue | 1 | 6/9/2006 | 6/9/2006 | Fitter #1
    4.4.2 Apply glue to centering/body tube joint | 4 | 6/9/2006 | 6/9/2006 | Gluer #1
    4.4.3 Let dry | 8 | 6/9/2006 | 6/9/2006 |
5.0 ATTACH FINS | 73 | 6/12/2006 | 6/16/2006 |
  5.1 Attach Fin #1 | 10 | 6/12/2006 | 6/13/2006 |
    5.1.1 Apply thin layer of glue to edge of fin | 3 | 6/12/2006 | 6/12/2006 | Gluer #3
    5.1.2 Allow to dry (1 minute for model) | 1 | 6/12/2006 | 6/12/2006 |
    5.1.3 Apply second layer of glue to edge of fin | 2 | 6/12/2006 | 6/12/2006 | Gluer #3
    5.1.4 Attach fin to body tube along one of fin lines flush w/ end | 4 | 6/12/2006 | 6/13/2006 | Fitter #3
  5.2 Attach Fin #2 | 10 | 6/12/2006 | 6/13/2006 |
    5.2.1 Apply thin layer of glue to edge of fin #2 | 3 | 6/12/2006 | 6/12/2006 | Gluer #2
    5.2.2 Allow to dry (1 minute for model) | 1 | 6/12/2006 | 6/12/2006 |
    5.2.3 Apply second layer of glue to edge of fin #2 | 2 | 6/12/2006 | 6/12/2006 | Gluer #2
    5.2.4 Attach fin #2 to body tube along one of fin lines flush w/ end | 4 | 6/12/2006 | 6/13/2006 | Fitter #2
  5.3 Attach Fin #3 | 10 | 6/12/2006 | 6/13/2006 |
    5.3.1 Apply thin layer of glue to edge of fin #3 | 3 | 6/12/2006 | 6/12/2006 | Gluer #1
    5.3.2 Allow to dry (1 minute for model) | 1 | 6/12/2006 | 6/12/2006 |
    5.3.3 Apply second layer of glue to edge of fin #3 | 2 | 6/12/2006 | 6/12/2006 | Gluer #1
    5.3.4 Attach fin #3 to body tube along one of fin lines flush w/ end | 4 | 6/12/2006 | 6/13/2006 | Fitter #1
  5.4 Attach Fin #4 | 10 | 6/12/2006 | 6/13/2006 |
    5.4.1 Apply thin layer of glue to edge of fin #4 | 3 | 6/12/2006 | 6/12/2006 | Gluer #4
    5.4.2 Allow to dry (1 minute for model) | 1 | 6/12/2006 | 6/12/2006 |
    5.4.3 Apply second layer of glue to edge of fin #4 | 2 | 6/12/2006 | 6/12/2006 | Gluer #4
    5.4.4 Attach fin #4 to body tube along one of fin lines flush w/ end | 4 | 6/12/2006 | 6/13/2006 | Fitter #4
  5.5 Check Fin Alignment | 16 | 6/13/2006 | 6/15/2006 |
    5.5.1 Check fin #1 alignment as shown in diagram | 4 | 6/13/2006 | 6/13/2006 | Draftsman #1
    5.5.2 Check fin #2 alignment as shown in diagram | 4 | 6/13/2006 | 6/14/2006 | Draftsman #2
    5.5.3 Check fin #3 alignment as shown in diagram | 4 | 6/14/2006 | 6/14/2006 | Draftsman #1
    5.5.4 Check fin #4 alignment as shown in diagram | 4 | 6/14/2006 | 6/15/2006 | Draftsman #2
  5.6 Allow Glue to Dry | 17 | 6/15/2006 | 6/16/2006 |
    5.6.1 Let glue set | 5 | 6/15/2006 | 6/15/2006 |
    5.6.2 Stand rocket on end | 4 | 6/15/2006 | 6/16/2006 | Fitter #1
    5.6.3 Let glue dry completely | 8 | 6/16/2006 | 6/16/2006 |
6.0 ATTACH SHOCK CORD | 44 | 5/22/2006 | 5/26/2006 |
  6.1 Cut Out Shock Cord Mount | 5 | 5/22/2006 | 5/22/2006 |
    6.1.1 Cut out shock cord from front page | 5 | 5/22/2006 | 5/22/2006 | Cutter #1
  6.2 First Glue Application | 12 | 5/22/2006 | 5/24/2006 |
    6.2.1 Attach shock cord to shock cord mount | 4 | 5/22/2006 | 5/23/2006 | Fitter #5
    6.2.2 Apply glue to shock cord mount | 4 | 5/23/2006 | 5/23/2006 | Gluer #4
    6.2.3 Fold edge of shock cord mount forward over glued shock cord | 4 | 5/23/2006 | 5/24/2006 | Fitter #5
  6.3 Second Glue Application | 8 | 5/24/2006 | 5/25/2006 |
    6.3.1 Apply glue to shock cord mount | 4 | 5/24/2006 | 5/24/2006 | Gluer #4
    6.3.2 Fold forward again (see diagram for clarification) | 4 | 5/24/2006 | 5/25/2006 | Fitter #5
  6.4 Squeeze and Hold | 6 | 5/25/2006 | 5/25/2006 |
    6.4.1 Squeeze shock cord/shock cord mount tightly | 2 | 5/25/2006 | 5/25/2006 | Gluer #4
    6.4.2 Hold for 1 minute | 4 | 5/25/2006 | 5/25/2006 | Gluer #4
  6.5 Attaching Shock Cord Mount | 13 | 5/25/2006 | 5/26/2006 |
    6.5.1 Glue mount 1″ inside body tube | 4 | 5/25/2006 | 5/26/2006 | Gluer #4, Fitter #5
    6.5.2 Hold until glue sets | 1 | 5/26/2006 | 5/26/2006 | Gluer #4
    6.5.3 Let dry completely | 8 | 5/26/2006 | 5/26/2006 |
7.0 ASSEMBLE NOSE CONE | 16 | 5/22/2006 | 5/23/2006 |
  7.1 Glue Nose Cone | 16 | 5/22/2006 | 5/23/2006 |
    7.1.1 Apply plastic cement to inside rim of nose cone | 4 | 5/22/2006 | 5/22/2006 | Gluer #5
    7.1.2 Press nose cone insert into place over plastic cement inside of nose cone rim | 4 | 5/22/2006 | 5/22/2006 | Fitter #2
    7.1.3 Let dry completely | 8 | 5/22/2006 | 5/23/2006 |
8.0 ATTACH PARACHUTE/SHOCK CORD | 18 | 5/30/2006 | 6/2/2006 |
  8.1 Attach Lines | 7 | 5/30/2006 | 5/31/2006 |
    8.1.1 Pass shroud line on parachute through eyelet | 7 | 5/30/2006 | 5/31/2006 | Fitter #1
  8.2 Attach Parachute | 5 | 5/31/2006 | 6/1/2006 |
    8.2.1 Pass parachute through loop in shroud (see diagram for clarification) | 5 | 5/31/2006 | 6/1/2006 | Fitter #1
  8.3 Tie Lines | 6 | 6/1/2006 | 6/2/2006 |
    8.3.1 Tie shock cord to nose cone using a double knot | 6 | 6/1/2006 | 6/2/2006 | Fitter #1
9.0 ATTACH LAUNCH LUG | 32 | 6/19/2006 | 6/22/2006 |
  9.1 Glue Launch Lines | 4 | 6/19/2006 | 6/19/2006 |
    9.1.1 Glue LL centered onto LL line on rocket body | 4 | 6/19/2006 | 6/19/2006 | Gluer #1
  9.2 Application of Glue Fillets | 28 | 6/19/2006 | 6/22/2006 |
    9.2.1 Apply glue fillets along launch lug | 4 | 6/19/2006 | 6/19/2006 | Gluer #1
    9.2.2 Apply glue fillets along fin/body tube joints | 12 | 6/20/2006 | 6/21/2006 | Gluer #1
    9.2.3 Smooth each fillet with finger | 4 | 6/21/2006 | 6/21/2006 | Gluer #1
    9.2.4 Let glue dry completely | 8 | 6/21/2006 | 6/22/2006 |
10.0 PAINTING THE ROCKET | 64 | 6/22/2006 | 6/29/2006 |
  10.1 Apply First Coat | 16 | 6/22/2006 | 6/23/2006 |
    10.1.1 Spray rocket with white primer | 8 | 6/22/2006 | 6/22/2006 | Painter-I #1
    10.1.2 Let dry | 8 | 6/22/2006 | 6/23/2006 |
  10.2 Sand | 8 | 6/23/2006 | 6/23/2006 |
    10.2.1 Sand entire rocket | 8 | 6/23/2006 | 6/23/2006 | Sander-I #1, Sander-I #2, Sander-II #1, Sander-II #2
  10.3 Apply Final Coat | 40 | 6/26/2006 | 6/29/2006 |
    10.3.1 Spray completed rocket with white second coat of primer | 8 | 6/26/2006 | 6/26/2006 | Painter-II #1, Painter-II #2
    10.3.2 Let dry | 8 | 6/26/2006 | 6/27/2006 |
    10.3.3 Spray nose cone with copper paint | 16 | 6/27/2006 | 6/28/2006 | Painter-II #1, Painter-II #2
    10.3.4 Let dry | 8 | 6/28/2006 | 6/29/2006 |
11.0 APPLICATION OF DECALS | 35 | 6/29/2006 | 7/6/2006 |
  11.1 Apply First Decal | 5 | 6/29/2006 | 6/29/2006 |
    11.1.1 Remove first decal from backing sheet | 1 | 6/29/2006 | 6/29/2006 | Draftsman #1
    11.1.2 Place on rocket where indicated | 3 | 6/29/2006 | 6/29/2006 | Draftsman #2
    11.1.3 Rub decal to remove bubbles | 1 | 6/29/2006 | 6/29/2006 | Draftsman #1
  11.2 Apply Second Decal | 5 | 6/29/2006 | 6/30/2006 |
    11.2.1 Remove second decal from backing sheet | 1 | 6/29/2006 | 6/29/2006 | Draftsman #1
    11.2.2 Place on rocket where indicated | 3 | 6/29/2006 | 6/30/2006 | Draftsman #2
    11.2.3 Rub decal to remove bubbles | 1 | 6/30/2006 | 6/30/2006 | Draftsman #1
  11.3 Apply Third Decal | 5 | 6/30/2006 | 6/30/2006 |
    11.3.1 Remove third decal from backing sheet | 1 | 6/30/2006 | 6/30/2006 | Draftsman #1
    11.3.2 Place on rocket where indicated | 3 | 6/30/2006 | 6/30/2006 | Draftsman #2
    11.3.3 Rub decal to remove bubbles | 1 | 6/30/2006 | 6/30/2006 | Draftsman #1
  11.4 Apply Fourth Decal | 5 | 6/30/2006 | 7/3/2006 |
    11.4.1 Remove fourth decal from backing sheet | 1 | 6/30/2006 | 6/30/2006 | Draftsman #1
    11.4.2 Place on rocket where indicated | 3 | 7/3/2006 | 7/3/2006 | Draftsman #2
    11.4.3 Rub decal to remove bubbles | 1 | 7/3/2006 | 7/3/2006 | Draftsman #1
  11.5 Apply Fifth Decal | 5 | 7/3/2006 | 7/5/2006 |
    11.5.1 Remove fifth decal from backing sheet | 1 | 7/3/2006 | 7/3/2006 | Draftsman #1
    11.5.2 Place on rocket where indicated | 3 | 7/3/2006 | 7/3/2006 | Draftsman #2
    11.5.3 Rub decal to remove bubbles | 1 | 7/5/2006 | 7/5/2006 | Draftsman #1
  11.6 Apply Sixth Decal | 5 | 7/5/2006 | 7/5/2006 |
    11.6.1 Remove sixth decal from backing sheet | 1 | 7/5/2006 | 7/5/2006 | Draftsman #1
    11.6.2 Place on rocket where indicated | 3 | 7/5/2006 | 7/5/2006 | Draftsman #2
    11.6.3 Rub decal to remove bubbles | 1 | 7/5/2006 | 7/5/2006 | Draftsman #1
  11.7 Apply Seventh Decal | 5 | 7/5/2006 | 7/6/2006 |
    11.7.1 Remove seventh decal from backing sheet | 1 | 7/5/2006 | 7/5/2006 | Draftsman #1
    11.7.2 Place on rocket where indicated | 3 | 7/5/2006 | 7/6/2006 | Draftsman #2
    11.7.3 Rub decal to remove bubbles | 1 | 7/6/2006 | 7/6/2006 | Draftsman #1
12.0 APPLYING CLEAR COAT | 16 | 7/6/2006 | 7/7/2006 |
  12.1 Apply Clear Coat to Entire Rocket | 16 | 7/6/2006 | 7/7/2006 |
    12.1.1 Apply clear coat to entire rocket | 8 | 7/6/2006 | 7/7/2006 | Painter-II #1
    12.1.2 Dry completely | 8 | 7/7/2006 | 7/7/2006 |
13.0 DISPLAY NOZZLE ASSEMBLY | 32 | 7/10/2006 | 7/13/2006 |
  13.1 Spray Nozzle Base White | 18 | 7/10/2006 | 7/11/2006 |
    13.1.1 Paint nozzle #1 w/ silver paint pen | 2.5 | 7/10/2006 | 7/10/2006 | Painter-I #1
    13.1.2 Paint nozzle #2 w/ silver paint pen | 2.5 | 7/10/2006 | 7/10/2006 | Painter-I #1
    13.1.3 Paint nozzle #3 w/ silver paint pen | 2.5 | 7/10/2006 | 7/10/2006 | Painter-I #1
    13.1.4 Paint nozzle #4 w/ silver paint pen | 2.5 | 7/10/2006 | 7/11/2006 | Painter-I #1
    13.1.5 Allow to dry | 8 | 7/11/2006 | 7/11/2006 |
  13.2 Apply Glue | 14 | 7/12/2006 | 7/13/2006 |
    13.2.1 Apply glue to tab on nozzle #1 | 1.5 | 7/12/2006 | 7/12/2006 | Gluer #1
    13.2.2 Place nozzle #1 into hole on base | 2 | 7/12/2006 | 7/12/2006 | Fitter #1
    13.2.3 Apply glue to tab on nozzle #2 | 1.5 | 7/12/2006 | 7/12/2006 | Gluer #1
    13.2.4 Place nozzle #2 into hole on base | 2 | 7/12/2006 | 7/12/2006 | Fitter #1
    13.2.5 Apply glue to tab on nozzle #3 | 1.5 | 7/12/2006 | 7/13/2006 | Gluer #1
    13.2.6 Place nozzle #3 into hole on base | 2 | 7/13/2006 | 7/13/2006 | Fitter #1
    13.2.7 Apply glue to tab on nozzle #4 | 1.5 | 7/13/2006 | 7/13/2006 | Gluer #1
    13.2.8 Place nozzle #4 into hole on base | 2 | 7/13/2006 | 7/13/2006 | Fitter #1
14.0 ROCKET PREFLIGHT | 42 | 7/13/2006 | 7/20/2006 |
  14.1 Prepare | 13 | 7/13/2006 | 7/17/2006 |
    14.1.1 Remove nose cone from rocket | 6 | 7/13/2006 | 7/14/2006 | Fitter #1
    14.1.2 Locate recovery wadding | 1 | 7/14/2006 | 7/14/2006 | Fitter #1
    14.1.3 Insert 4–5 loosely crumpled squares of recovery wadding | 6 | 7/14/2006 | 7/17/2006 | Fitter #1
  14.2 Spike | 4 | 7/17/2006 | 7/17/2006 |
    14.2.1 Pull parachute into a spike (see diagram for clarification) | 4 | 7/17/2006 | 7/17/2006 | Fitter #1
  14.3 Fold | 4 | 7/17/2006 | 7/18/2006 |
    14.3.1 Fold parachute according to diagram | 4 | 7/17/2006 | 7/18/2006 | Fitter #1
  14.4 Roll | 4 | 7/18/2006 | 7/18/2006 |
    14.4.1 Roll parachute according to diagram | 4 | 7/18/2006 | 7/18/2006 | Fitter #1
  14.5 Re-Insert | 17 | 7/18/2006 | 7/20/2006 |
    14.5.1 Wrap lines loosely around rolled parachute (see diagram for clarification) | 5 | 7/18/2006 | 7/19/2006 | Fitter #1
    14.5.2 Insert parachute into body tube of rocket | 6 | 7/19/2006 | 7/20/2006 | Fitter #1
    14.5.3 Insert shock cord into body tube of rocket | 2 | 7/20/2006 | 7/20/2006 | Fitter #1
    14.5.4 Insert nose cone into body tube of rocket | 4 | 7/20/2006 | 7/20/2006 | Fitter #1
15.0 PREPARE FOR TEST LAUNCH | 32 | 7/21/2006 | 7/26/2006 |
  15.1 Insert Engine | 32 | 7/21/2006 | 7/26/2006 |
    15.1.1 Remove engine | 10 | 7/21/2006 | 7/24/2006 | Engineer #1
    15.1.2 Insert tip to touch propellant | 10 | 7/24/2006 | 7/25/2006 | Engineer #1
    15.1.3 Insert engine into rocket | 12 | 7/25/2006 | 7/26/2006 | Engineer #1
Gauchito Network Diagram (all values in hours; ES = Early Start, EF = Early Finish, LS = Late Start, LF = Late Finish)

Task | Duration | ES | EF | LS | LF | Slack
1: Assemble Engine Mount | 95h | 0 | 95 | 0 | 95 | 0
2: Fin Preparation | 30h | 0 | 30 | 108 | 138 | 108
3: Mark Fin & LL Lines | 33h | 0 | 33 | 62 | 95 | 62
4: Insert Engine Mount | 43h | 95 | 138 | 95 | 138 | 0
5: Attach Fins | 73h | 138 | 211 | 138 | 211 | 0
6: Attach Shock Cord | 44h | 0 | 44 | 196 | 240 | 196
7: Assemble Nose Cone | 16h | 0 | 16 | 224 | 240 | 224
8: Attach Chute/Shock Cord | 3h | 44 | 47 | 240 | 243 | 196
9: Attach Launch Lug | 32h | 211 | 243 | 211 | 243 | 0
10: Painting the Rocket | 64h | 243 | 307 | 243 | 307 | 0
11: Application of Decals | 35h | 307 | 342 | 307 | 342 | 0
12: Applying Clear Coat | 16h | 342 | 358 | 342 | 358 | 0
13: Display Nozzle Assembly | 32h | 0 | 32 | 400 | 432 | 400
14: Rocket Pre-Flight | 42h | 358 | 400 | 358 | 400 | 0
15: Prepare for Test Launch | 32h | 400 | 432 | 400 | 432 | 0
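The ES/EF/LS/LF and slack figures in the diagram come from a standard forward and backward pass over the activity network, with the zero-slack tasks forming the critical path. The sketch below illustrates that calculation in Python; the task network it uses is a small hypothetical example, not the full Gauchito network.

```python
# Minimal critical path sketch: forward pass for ES/EF, backward pass for LS/LF.
# The network below is hypothetical and only illustrates the technique.
tasks = {
    # task: (duration in hours, list of immediate predecessors)
    "A": (95, []),
    "B": (43, ["A"]),
    "C": (30, []),
    "D": (32, ["B", "C"]),
}

# Forward pass (assumes the dict is listed in a valid topological order).
es, ef = {}, {}
for name, (dur, preds) in tasks.items():
    es[name] = max((ef[p] for p in preds), default=0)
    ef[name] = es[name] + dur

project_finish = max(ef.values())

# Backward pass: latest finish/start and slack.
ls, lf = {}, {}
for name in reversed(list(tasks)):
    dur, _ = tasks[name]
    successors = [s for s, (_, preds) in tasks.items() if name in preds]
    lf[name] = min((ls[s] for s in successors), default=project_finish)
    ls[name] = lf[name] - dur

for name in tasks:
    slack = ls[name] - es[name]
    status = "critical" if slack == 0 else f"slack {slack}h"
    print(f"{name}: ES={es[name]} EF={ef[name]} LS={ls[name]} LF={lf[name]} ({status})")
```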
3.0 Cost Management Plan
Introduction
The Gauchito Rocket Project Cost Management Plan covers defining cost estimates, creating the cost baseline, and managing the cost of the project. Because the project is of short duration and requires intense workloads to produce the deliverables, cost will be critical to manage and control in order to meet the definitive budget.
The overall labor estimate is $19,950.
The estimate at completion (EAC) is $50,150 for the 10-week schedule. The EAC includes all direct labor costs, as well as material and equipment costs. The details of cost management for the Gauchito project are covered below.
1. Equipment and custom parts
2. Equipment and material order dates
3. Precision formats
4. How the cost estimate is developed
5. Personnel usage by time period (labor expenditure on the project) (Reference Appendix E)
6. How the cost baseline and budget was developed (Reference Appendix H)
7. Identify when personnel are utilized and differences with the cost budget
8. Reporting formats for cost management
9. Cost control and managing process
10. Cost constraints
11. Cost assumptions
The constraints and assumptions are specific to cost management of the project. The overall project constraints and assumptions also apply to cost management.
Equipment and Custom Parts
The Gauchito project requires a complete rocket assembly and specialty hybrid engines. These parts must be purchased and on-site no later than the Friday before the project start date to ensure work can start on schedule. The risk management plan addresses the response if any material or equipment is not received by that date. For example, if the paint is not received, there is no cost impact because paint is not used at project start, and it can be reordered and delivered before it is needed. The previous order would be cancelled so that double charges are not incurred.
The cost for materials, equipment, and custom parts is included in the first week, as shown in the cost baseline in Appendix H, page 125. A breakout is shown below.
Item | Cost | Purchase Date
Rocket Assembly – all standard rocket components, including rocket engine wadding, parachute, nose cone, and rocket body | $15,000 | Due at project start
Hybrid Engines – A3-4T (4 @ $2,500 each) | $10,000* | Due at project start
Delivery costs | $125 | –
Materials as specified in the assembly construction plan, including pens, pencils, glue, scissors, engine wadding, paint, tape measure, string, sandpaper, and modeling knife | $5,000 | Due at project start
* The original cost estimates included the hybrid engines, but the cost increased after procurement evaluation determined the actual cost. The difference between the scope estimated budget and the cost baseline provided in this plan is $5,000.
Equipment and Material Order Dates
All equipment, including the specialty hybrid engines, is to be ordered at least 12 days prior to project start. The hybrid engines are not required until the last week of the project. However, in order to validate compatibility with the rocket, they are due at the same time as other materials and equipment, the week before project start.
The project plan and cost approval will include authorization to order the equipment before project start and allocate funds to the materials and equipment. Though the material and equipment will be ordered prior to project start, they are not allocated to the baseline until after receipt, so they are applied to the cost expenditures for the first week of the project. They are included in the overall EAC.
3.1 Precision Formats
3.1.1 Cost estimation for WBS by skill set:
· The cost estimates are in whole hours, rounded to the nearest hour.
· The labor rates are rounded to the nearest $5.00 increment and are shown at the bottom of the cost estimate spreadsheet.
3.1.2 Baseline budget and EAC:
· The material and equipment cost is rounded up to the nearest $100.00.
· The labor rate for employees is a blended rate.
· The costs are reflected by week plus cumulative for all weeks to calculate the EAC.
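As a small illustration of the rounding conventions in 3.1.1 and 3.1.2, the sketch below rounds hours to the nearest whole hour, labor rates to the nearest $5.00 increment, and material and equipment costs up to the nearest $100.00; the sample figures are placeholders, not values from the estimate.

```python
import math

def round_hours(hours):
    """Nearest whole hour."""
    return round(hours)

def round_rate(rate):
    """Nearest $5.00 increment."""
    return 5 * round(rate / 5)

def round_material(cost):
    """Rounded up to the nearest $100.00."""
    return 100 * math.ceil(cost / 100)

print(round_hours(42.6), round_rate(37.40), round_material(30_125))  # -> 43 35 30200
```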
3.2 How the Cost Estimate is Developed
The Gauchito project manager and team members met to review the product description, project charter, and preliminary scope statement. These documents provided an overall definition of the goal, the deliverables, initial rough order cost and budget estimates, and the project time frame.
The team reviewed the work breakdown structure, initial staffing resource needs, scheduled durations for each deliverable, milestones, and the target budget. These detailed data provided the basis upon which to refine resources needed for each WBS work package and to define costs for each type of resource skill set (e.g., fitters for the project).
In order to arrive at a complete cost estimate, the project manager and team met with subject matter experts (SMEs) for each of the disciplines (e.g., fitters, draftsmen, and other skill types). This proved invaluable in defining the activities needed to accomplish each work package task, if it was not predefined. The SMEs and project team reviewed the Gauchito Rocket Construction Plan (in Appendix C) as a reference to ensure all activities were included for cost estimation.
In addition, the project manager reviewed existing cost estimates for the previously concluded Generic Estes project. While that project was not of the same size and scope of the Gauchito project, it provided useful cost information and lessons learned for the cost estimates.
The project manager gathered the estimation of time, in man-hours, needed by each skill set to complete each work package. The total time for each deliverable, and for all deliverables, was calculated, as well as total time for each skill set (e.g., fitter). The hourly rate for each skill set was applied to the resource time estimates to arrive at the total project labor cost.
No overhead resources were included in the cost estimate. Though the duration estimates include "dummy" hours for glue-drying time, those hours are not included in any cost estimates.
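As a rough illustration of the roll-up described in this section, the sketch below sums man-hours per skill set across work packages and applies an hourly rate per skill set, while "dummy" drying time carries duration but no cost. The work packages, hours, and rates shown are hypothetical placeholders, not the actual Gauchito estimates.

```python
# Hypothetical duration estimates: work package -> {skill set: man-hours}
estimates = {
    "Assemble Engine Mount": {"Gluer": 20, "Fitter": 12},
    "Fin Preparation":       {"Cutter": 8, "Sander I": 10},
    "Painting the Rocket":   {"Painter I": 16, "Drying (dummy)": 24},
}
# Hypothetical hourly rates per skill set
rates = {"Gluer": 35, "Fitter": 50, "Cutter": 35, "Sander I": 35, "Painter I": 35}

hours_by_skill = {}
total_labor_cost = 0
for package, skill_hours in estimates.items():
    for skill, hours in skill_hours.items():
        if "dummy" in skill.lower():   # drying time is scheduled but not costed
            continue
        hours_by_skill[skill] = hours_by_skill.get(skill, 0) + hours
        total_labor_cost += hours * rates[skill]

print(hours_by_skill)
print(f"Total project labor cost: ${total_labor_cost:,}")
```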
Personnel Usage by Time Period Labor Expenditure On Project
Appendix C shows the personnel skill set needed for the project and when they will be used on the project. The resource skill sets and time estimates for each are applied to each WBS work package. This duration estimate is part of the cost estimate spreadsheet.
Duration smoothing was not applied because the tasks are so well defined, and SST has previous experience from the Generic Estes project. No optimistic schedule was determined. A consideration for a pessimistic schedule was that the new hybrid engines are an unknown technology in conjunction with the standard rocket kit assembly. However, the team determined that the task would take no more time than installing standard rockets.
3.3 How the Cost Baseline and Budget is Developed
The total labor costs from the duration and cost estimate chart, along with the material and equipment costs, were input to create the cost baseline and budget. Labor costs: the total costs from the cost estimation results (the personnel by WBS work package) were input as part of the development of the budget and cost baseline. The resource requirements for each week were determined using resource staffing information per the staffing management plan. The resources needed to complete the work packages scheduled for each week were input to the spend plan, and total costs were applied to the week based on blended labor rates. The labor cost was based upon a blended rate of $35 per hour for employees; the contracted labor cost was based upon a rate of $50 per hour. The total labor cost is reflected in the appendices.
All material and equipment costs are included in the baseline. The materials and equipment costs have been provided per the procurement management plan and will be purchased for delivery the week before project start. Therefore, those costs are reflected during the first week of the project. The difference from the preliminary budget is $5,000; however, there is no difference between the definitive budget and the cost baseline because those costs are included.
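To illustrate how the weekly baseline is assembled from these inputs, the sketch below applies the blended rates quoted above ($35 per hour for employees, $50 per hour for contracted labor), places the material and equipment cost in Week 1, and accumulates the weekly totals into the cumulative baseline (BCWS) that the S-curve plots. The weekly staffing hours are hypothetical; the $30,125 materials figure is the sum of the items in the cost table above.

```python
EMPLOYEE_RATE = 35        # blended $/hour for employees (from this plan)
CONTRACTOR_RATE = 50      # $/hour for contracted labor (from this plan)
MATERIALS_WEEK1 = 30_125  # materials, equipment, and delivery applied to Week 1

# Hypothetical (employee-hours, contractor-hours) scheduled per week
weekly_hours = [(120, 40), (100, 40), (100, 0), (80, 0)]

cumulative_bcws = 0
for week, (emp_h, con_h) in enumerate(weekly_hours, start=1):
    weekly_cost = emp_h * EMPLOYEE_RATE + con_h * CONTRACTOR_RATE
    if week == 1:
        weekly_cost += MATERIALS_WEEK1
    cumulative_bcws += weekly_cost
    print(f"Week {week}: cost ${weekly_cost:,}, cumulative BCWS ${cumulative_bcws:,}")
```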
The baseline budget EAC is targeted to within –5% and +10% of the budget at completion (BAC).
Identify When Personnel Are Utilized and Differences with the Cost Budget
The baseline is in Appendix H. The baseline is shown in increments by time period reporting, which is weekly, showing the labor costs per each week as well as the material and equipment cost per week.
The costs are reflected in an S-curve chart showing the cumulative budgeted costs weekly and showing the total budgeted cost, EAC.
The spend plan for personnel is shown in Appendix H as part of the input to creating the cost baseline and budget. The input to the spend plan is the duration estimates and costs applied to each resource type, along with the schedule. The cost is shown weekly for each resource needed to complete the work scheduled for that week.
The differences between the spend plan costs and the original cost budget estimates reflect a slight increase due to expert evaluation of the amount of time needed to complete each task.
Reporting Formats for Cost Reports
The project manager will report weekly on cost and schedule status for the previous week. The report will be distributed to SST executive management, the sponsor, finance, De Leon and Associates customer contacts, and functional managers with staff assigned to the project. The report will be briefed at the weekly project meeting. These costs are based on the budgeted cost of work scheduled (BCWS) in the baseline cost in Appendix H. An example report format follows, and it will use earned value management to provide actual costs and work performed against the BCWS (the planned value [PV]). The BCWS/baseline costs, the actual costs, and the cost of the work performed will be plotted on the S-curve chart, similar to the graphic shown. The amount of work performed may reflect either a deliverable or the WBS tasks that should have been completed in support of a deliverable. The cost variance (earned value – actual cost) and the schedule variance (earned value – planned value) are included, along with explanations for the variance.
EXAMPLE Cost Management Report: Figure 3-1
Gauchito Project Cost Management Report Date:
The Gauchito Project EAC is ____________.
The Gauchito Project BAC is ____________.
Week #: _______
Earned value for cost of work actually performed (BCWP or EV):
Actual costs for the week (ACWP or AC):
Cost variance (EV – AC):
Explanation and corrective action, if needed:
Earned value for cost of work actually performed (BCWP or EV):
Budgeted cost of work scheduled (BCWS or PV):
Schedule variance (EV – PV):
Explanation and corrective action, if needed:
[Figure 4-2 Cost Baseline Cumulative S-Curve (Kaplan): cost plotted against duration up to the total project planned cost, showing the BCWS, ACWP, and BCWP curves, with SV = BCWP – BCWS and CV = BCWP – ACWP.]
3.5 Cost Control
The project manager will use the weekly time reporting and rollup costs reported for each account/WBS to total the amount spent for the week. This amount will be calculated into the actual costs for the week. The project manager will use the weekly status reports to determine which deliverables and work packages were completed.
If the labor actual cost exceeds the value of the work completed, compared to the cost baseline, the project manager will determine reasons for the difference. For example, if Deliverable 1 is completed during Week 1 of the project, with an earned value of $1,000, and the actual labor costs for the week total $1,200, the variance is negative $200. One reason may be that tasks in the work packages scheduled for Week 1 took longer than expected due to unforeseen problems.
One area to examine is to ensure that additional work was not performed. If problems arose, then the project team will identify those problem areas and a course of action to see if the time can be made up and if costs will not increase. This is especially critical if the problem could occur in future work packages for the project.
If the earned value for Deliverable 1 completed in Week 1 is $1,000, and the planned value was $1,200, then there is a negative $200 schedule variance. This means some tasks may be behind schedule. The project manager will use the same approach as for cost variances in determining root cause and course of action.
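The variance arithmetic in these two examples is straightforward to automate; a brief sketch using the same figures:

```python
def variances(ev, ac, pv):
    """Return (cost variance, schedule variance) in dollars: CV = EV - AC, SV = EV - PV."""
    return ev - ac, ev - pv

# Deliverable 1, Week 1: earned value $1,000, actual cost $1,200, planned value $1,200
cv, sv = variances(ev=1_000, ac=1_200, pv=1_200)
print(f"CV = ${cv}, SV = ${sv}")  # CV = $-200 (over cost), SV = $-200 (behind schedule)
```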
All changes to scope, schedule, contracts, or purchases will follow the overall change control process outlined in the scope management plan. Proposed changes will be presented to the configuration control board, along with the associated cost changes to the baseline cost.
If costs are increased due to finding that the tasks take longer than expected, the project manager and contracts/procurement management will determine a course of action based on the contract type. The project manager will consult with functional managers to define alternate approaches.
As approved changes occur, updates will be applied to the cost estimates, personnel usage, and material and equipment changes. The project manager will revise the baseline with a new EAC, as well as revise this cost management plan. The changes will be applied to other impacted plans as necessary.
To enable the project manager to track costs and changes, corporate databases for time reporting, issues, and risks will be used. In addition, standard corporate guidelines for cost accounting, procurement, contracts, and earned value management will be followed.
Cost Constraints
1. The fitter positions will be outsourced. This increases the labor rates to contractor rates, used in cost estimation.
2. The Memorial Day and Independence Day holidays fall within the project start/finish dates.
Cost Assumptions
1. Labor is calculated based on man-hour rates.
2. Labor rates have been provided as static for this project. No increases are planned in the 10-week project time frame. The blended rate used is standard across the corporation. Contractor labor rates are negotiated per the outsourcing contract and are static for the project’s duration.
3. No overtime hours and rates for resources are included in the cost estimates.
4. No cost estimates are applied for Memorial Day and Independence Day. Fitters are not paid for those days, and employee time off is part of overhead costs.
Project Name: Ansari X Prize Gauchito Rocket
Product-Process: Quality
Prepared By: Thomas Jones
Project Quality Plan Version Control
Version | Date | Author | Change Description
1.0 | 4/30/06 | Thomas Jones | Initial
4.0 Quality Management Plan
4.1 Project Quality Plan Purpose
The purpose of this plan is to ensure that the rocket is built to the specifications set by the customer. Customer satisfaction with the workmanship, cost, and final delivery of the final product is key to securing follow-on work and future contracts.
Quality Management Method
The construction material used for this project has been handpicked by the customer. It will only be inspected for damage that may have occurred during shipping. The construction processes have also been written by the customer. Quality management for this project will consist only of ensuring that the construction processes have been followed and identifying any room for improvement.
4.2 Quality Plan Processes
Quality Assurance
It is the policy of Space Systems Technology (SST) and its elements to develop, integrate, and implement QA and QC practices to assure delivery of quality products and services that meet or exceed customer needs and expectations in accordance with applicable laws, policies, technical criteria, schedules, and budgets. Adherence to quality principles and established QA and QC practices is integral to the roles and responsibilities of all SST’s elements and functions.
Quality standards have been set by the customer in the kit assembly directions given to the Gauchito rocket project team. The QA team will inspect each deliverable as milestones are reached. Checklists will be developed using the kit directions for each station assembling a section of the rocket. The checklist will simply ensure each step in the assembly process is done in accordance with the customer’s instructions.
Quality Control
The checklists at each station will be picked up by the QA team during the inspection of the deliverable and given to the customer with training documentation.
Project Deliverables and Processes Acceptance Criteria
All deliverables will be constructed according to kit instructions.
1.0 ASSEMBLE ENGINE MOUNT
2.0 FIN PREPARATION
3.0 MARK FIN AND LAUNCH LUG LINES
4.0 INSERTING ENGINE MOUNT
5.0 ATTACH FINS
6.0 ATTACH SHOCK CORD
7.0 ASSEMBLE NOSE CONE
8.0 ATTACH PARACHUTE/SHOCK CORD
9.0 ATTACH LAUNCH LUG
10.0 PAINTING THE ROCKET
11.0 APPLICATION OF DECALS
12.0 APPLYING CLEAR COAT
13.0 DISPLAY NOZZLE ASSEMBLY
14.0 ROCKET PREFLIGHT
15.0 PREPARE FOR TEST LAUNCH
Project Overview
This project will address design considerations for the Gauchito rocket. The rocket’s design, launch, and test results can be observed using a 7/8 scale model of the full-sized rocket. This approach provides a valid measurement of the rocket’s success without the time and expense necessary to build a full-sized rocket. This will also help in the identification, mitigation, and avoidance of risk to the space development program. This project is being undertaken to show that our company has the ability to produce a reliable test product that accurately duplicates the final full-sized rocket. To do this, we will deliver the completed 7/8 scale rocket to the Peterson AFB test launch site 3 days after assembly is completed.
See appendix for WBS, schedule, risks, and cost.
Quality Standards
Rocket parts included in the kit have been inspected. These parts are of acceptable quality and grade for this project. The 4 solid fuel rockets being installed on the 7/8th scale rocket have been engineered and tested to produce thrust similar to that of the 4 hybrid engines on the full-scale rocket.
Quality Tools
Audits of all deliverables will be made by a QA team member. Checklists will be verified and signed by the fitter, draftsman, gluer, or sander. The checklist will be verified and signed by the QA team member and turned over to the customer with the training documentation.
Quality Manager’s Responsibilities
Thomas Jones has been assigned quality manager and has the responsibility for:
· Specifying how the quality assurance processes should be applied
· Specifying how the quality control procedures should be applied
· Specifying the continuous process improvement for the project
· Defining criteria for the effective execution of key project activities, processes, and deliverables
· Defining quality management responsibilities for the project
· Identifying or including any checklists or templates that should be used by project team members
· Defining how the project will be audited to ensure compliance with the quality management plan
Project Quality Assurance
Quality assurance helps to establish if a deliverable is acceptable based on the processes used to create it. Quality assurance processes are used to evaluate overall project performance frequently and to determine that quality reviews were held, deliverables tested, and customer acceptance acquired.
Quality Assurance Procedures
Quality has been designed into the product. All of the parts used in this rocket project have been handpicked by the customer for the 7/8th scaled rocket. Although the assembly procedures were written by the customer, they will be monitored by QA team members looking for ways to improve the process of gluing, sanding, and painting through the use of better tools or material. Staff and subcontractors with knowledge of the processes being used will be asked for recommendations for improvement on the checklists.
Subcontract Fitters—Three of the top fitter contracting companies in the local area have been judged on their previous job performances. The local fitters union has been found to have a reputation for good, reliable work with exceptional employees.
Contractor | Reliability % | Work Quality | Mistakes
Local Fitters Union | 100 | 100 | 0
ACME Fitters | 40 | 60 | 25
Fitters-R-Us | 25 | 65 | 25
Project Monitoring Processes
A checklist will be marked off for each of the measurable steps in the construction process. These checklists will be signed off by one of the staff or subcontracted fitters and a QA team member. This will ensure that all steps in the construction process were followed. Staff and subcontractors with knowledge of the processes being used will be asked for recommendations for improvement on the checklists. These recommendations will be passed to management and the customer for consideration in improving the construction of the full-scale rocket.
Project Quality Control
ASSEMBLE ENGINE MOUNT (Step Completed: YES / NO)
· Mark left end of engine tube @ 1/8”
· Mark from left of engine tube @ ¾”
· Mark from left of engine tube @ 1 ½”
· Cut slit of 1/8” @ 1 ½ inch mark on engine tube
· Glue tube, assemble hook
· Apply thin line of glue completely around engine at ¾” mark
· Insert engine hook into 1/8” slit on engine mount tube
· Slide mylar ring onto engine mount tube at 3/4″ mark
· Glue dry at ¾” mark
· Apply glue inside front of engine mount tube
· Yellow engine block flush with the right end per diagram
· Let dry
· Remove centering rings from card with modeling knife
· Apply thin line of glue around engine mount tube @ 1/8″ mark
· Slide notched centering ring onto glued line @ 1/8″ mark
· Glue set
· Apply thin line of glue to opposite side of notched center ring flush with end of engine mount tube
· Slide un-notched centering ring in place over glue flush with end of engine tube mount
· Centering rings dry
· Apply glue fillets to both sides of centering rings for reinforcement
· Centering rings dry
Signatures Date
Verifier: ____________________________________ ______________
QA Team: _____________________________________ ______________
Suggestions for improvement (materials or processes): _____________________
_________________________________________________________________
_________________________________________________________________
_________________________________________________________________
_________________________________________________________________
FIN PREPARATION (Step Completed: YES / NO)
· Sand laser cut balsa sheet w/ fine sandpaper
· Cut out fin #1
· Cut out fin #2
· Cut out fin #3
· Cut out fin #4
· Sand edges of fins
Signatures Date
Verifier: ____________________________________ ______________
QA Team: _____________________________________ ______________
Suggestions for improvement (materials or processes): _____________________ _________________________________________________________________
_________________________________________________________________
_________________________________________________________________
_________________________________________________________________
MARK FIN AND LAUNCH LUG LINES (Step Completed: YES / NO)
· Mark body tube at arrows
· Mark launch lug line as LL on body tube
· Connect fins
· Connect launch lug lines
· Extend launch lug line 3 ¾” from end of tube
Signatures Date
Verifier: ____________________________________ ______________
QA Team: _____________________________________ ______________
Suggestions for improvement (materials or processes): _____________________ _________________________________________________________________
_________________________________________________________________
INSERTING ENGINE MOUNT (Step Completed: YES / NO)
· Mark inside tube at 5/8″
· Glue tube
· Smeared glue 1 3/4″ inside rear of body tube along LL
· Aligned engine hook with LL line
· Inserted engine mount into body tube until centering ring is even w/ the 5/8″ glue mark
· Located scrap piece of balsa
· Applied glue to centering/body tube joint
Signatures Date
Verifier: ____________________________________ ______________
QA Team: _____________________________________ ______________
Suggestions for improvement (materials or processes): _____________________ _________________________________________________________________
_________________________________________________________________
_________________________________________________________________
_________________________________________________________________
ATTACHING FINS (Step Completed: YES / NO)
· Applied thin layer of glue to edge of fin
· Allowed to dry (1 minute for model)
· Applied second layer of glue to edge of fin
· Attached fin #1 to body tube along one of fin lines flush w/ end
· Applied thin layer of glue to edge of fin #2
· Allowed to dry (1 minute for model)
· Applied second layer of glue to edge of fin #2
· Attached fin #2 to body tube along one of fin lines flush w/ end
· Applied thin layer of glue to edge of fin #3
· Allowed to dry (1 minute for model)
· Applied second layer of glue to edge of fin #3
· Attached fin #3 to body tube along one of fin lines flush w/ end
· Applied thin layer of glue to edge of fin #4
· Allowed to dry (1 minute for model)
· Applied second layer of glue to edge of fin #4
· Attached fin #4 to body tube along one of fin lines flush w/ end
· Check fin #1 alignment
· Check fin #2 alignment
· Check fin #3 alignment
· Check fin #4 alignment
· Let glue set
· Stood rocket on end
· Let glue dry completely
Signatures Date
Verifier: ____________________________________ ______________
QA Team: _____________________________________ ______________
Suggestions for improvement (materials or processes): _____________________ _________________________________________________________________
ATTACH SHOCK CORD (Step Completed: YES / NO)
· Cut out shock cord from front page
· Attached shock cord to shock cord mount
· Applied glue to shock cord mount
· Folded edge of shock cord mount over glued shock cord
· Applied glue to shock cord mount
· Folded over shock cord second time
· Held shock cord for 1 minute
· Glued shock cord mount 1″ inside body tube
· Held until glue sets
· Let dry completely
Signatures Date
Verifier: ____________________________________ ______________
QA Team: _____________________________________ ______________
Suggestions for improvement (materials or processes): _____________________ _________________________________________________________________
_________________________________________________________________
_________________________________________________________________
_________________________________________________________________
ASSEMBLE NOSE CONE (Step Completed: YES / NO)
· Applied plastic cement to inside rim of nose cone
· Pressed nose cone insert into place over plastic cement inside of nose cone rim
· Let dry completely
Signatures Date
Verifier: ____________________________________ ______________
QA Team: _____________________________________ ______________
Suggestions for improvement (materials or processes): _____________________ _________________________________________________________________
_________________________________________________________________
_________________________________________________________________
_________________________________________________________________
ATTACH PARACHUTE (Step Completed: YES / NO)
· Passed shroud line on parachute through eyelet
· Attached parachute
· Passed parachute through loop in shroud
· Tied lines
· Tied shock cord to nose cone using a double knot
Signatures Date
Verifier: ____________________________________ ______________
QA Team: _____________________________________ ______________
Suggestions for improvement (materials or processes): _____________________ _________________________________________________________________
_________________________________________________________________
_________________________________________________________________
_________________________________________________________________
APPLICATION OF DECALS (Step Completed: YES / NO)
· First decal placed on rocket according to instructions
· Bubbles removed from decal
· Second decal placed on rocket according to instructions
· Bubbles removed from decal
· Third decal placed on rocket according to instructions
· Bubbles removed from third decal
· Fourth decal placed on rocket according to instructions
· Bubbles removed from decal
· Fifth decal placed on rocket according to instructions
· Bubbles removed from decal
· Sixth decal placed on rocket according to instructions
· Bubbles removed from decal
· Seventh decal placed on rocket according to instructions
· Bubbles removed from decal
Signatures Date
Verifier: ____________________________________ ______________
QA Team: _____________________________________ ______________
Suggestions for improvement (materials or processes): _____________________ _________________________________________________________________
_________________________________________________________________
PAINTING THE ROCKET (Step Completed: YES / NO)
· Sprayed rocket with white primer
· Let dry
· Sanded entire rocket
· Sprayed completed rocket with second coat of white primer
· Let dry
· Sprayed nose cone with copper paint
· Let dry
Signatures Date
Inspector: ____________________________________ ______________
QA Team: _____________________________________ ______________
Suggestions for improvement (materials or processes): _____________________ _________________________________________________________________
_________________________________________________________________
_________________________________________________________________
_________________________________________________________________
APPLYING CLEAR COAT (Step Completed: YES / NO)
· Applied clear coat to entire rocket
· Dry completely
Signatures Date
Verifier: ____________________________________ ______________
QA Team: _____________________________________ ______________
Suggestions for improvement (materials or processes): _____________________ _________________________________________________________________
_________________________________________________________________
_________________________________________________________________
_________________________________________________________________
DISPLAY NOZZLE ASSEMBLY (Step Completed: YES / NO)
· Nozzle #1 painted w/ silver paint
· Nozzle #2 painted w/ silver paint
· Nozzle #3 painted w/ silver paint
· Nozzle #4 painted w/ silver paint
· Nozzles allowed to dry
· Apply glue to tab on nozzle #1
· Nozzle #1 was placed into hole on base
· Apply glue to tab on nozzle #2
· Nozzle #2 was placed into hole on base
· Apply glue to tab on nozzle #3
· Nozzle #3 was placed into hole on base
· Apply glue to tab on nozzle #4
· Nozzle #4 was placed into hole on base
Signatures Date
Verifier: ____________________________________ ______________
QA Team: _____________________________________ ______________
Suggestions for improvement (materials or processes): _____________________ _________________________________________________________________
_________________________________________________________________
_________________________________________________________________
_________________________________________________________________
ROCKET PREFLIGHT (Step Completed: YES / NO)
· Removed nose cone from rocket
· Inserted 4–5 loosely crumpled squares of recovery wadding
· Parachute folded accordingly
· Parachute lines wrapped loosely around rolled parachute
· Parachute inserted into body tube of rocket
· Shock cord inserted into body tube of rocket
· Nose cone inserted onto body tube of rocket
Signatures Date
Verifier: ____________________________________ ______________
QA Team: _____________________________________ ______________
Suggestions for improvement (materials or processes): _____________________ _________________________________________________________________
_________________________________________________________________
_________________________________________________________________
_________________________________________________________________
PREPARE FOR TEST LAUNCH (Step Completed: YES / NO)
· Engine installed
· Remove engine
· Insert tip to touch propellant
· Insert engine into rocket
Signatures Date
Verifier: ____________________________________ ______________
QA Team: _____________________________________ ______________
Suggestions for improvement (materials or processes): _____________________ _________________________________________________________________
_________________________________________________________________
_________________________________________________________________
_________________________________________________________________
Project Deliverables
The checklist will be signed by a verifier and a QA team member. The checklist will also include suggestions by the staff member or subcontracted fitter who did the construction of the rocket deliverable.
Project Deliverables Test & Acceptance Process
The construction material used for this project has been handpicked by the customer.
Project Deliverables Acceptance Criteria
All deliverables will be constructed according to kit instructions.
01-Assembling of the Engine Mount:
The engine mount will pass inspection and be accepted if:
· Three marks are on the engine tube at 1/8, ¾, and 1 ½ inches.
· A 1/8th inch slit has been cut 1 ½ inches on the engine tube.
· The assembly hook has been inserted into 1/8th inch slit and glued at the ¾ inch mark.
· The yellow engine block is glued into the front of the mount tube and flush with the end of the mount tube.
· The notched centering ring is glued into place at the 1/8th inch mark.
· The un-notched centering ring is glued flush with the tube end.
· The glue has dried completely.
02-Fin Preparation:
The fin preparation will pass inspection and be accepted if:
· The fins have been carefully cut from the balsa sheet.
· The edges of the fins have been sanded smooth.
03-Marking of Fin and Lug Lines:
The fin and lug line markings will pass inspection and be accepted if:
· The fin and lug lines have been marked on the body tube using the guide.
04-Insertion of Engine Mount:
The engine mount will pass inspection and be accepted if:
· The engine hook and launch lug are aligned.
· The engine mount has been inserted into the body tube and the centering ring is flush with the 5/8th inch mark.
· Extra glue has been added to the 5/8th inch mark inside the body tube.
· The glue has dried completely.
05-Fin Attachment:
The fin attachment will pass inspection and be accepted if:
· The four fins should project straight out from the body tube.
· Each fin should be on the fin lines flush with the end.
· The glue has dried completely.
06-Shock Cord Attachment:
The shock cord attachment will pass inspection and be accepted if:
· The shock cord has been attached one inch to the inside of the body tube.
· The glue has dried completely.
07-Nose Cone Assembly:
The nose cone attachment will pass inspection and be accepted if:
· The tube type cement has been used to connect the nose cone and the nose cone insert.
· The glue has dried completely.
08-Parachute and Shock Cord Attachment:
The parachute and shock cord attachment will pass inspection and be accepted if:
· The shroud line has been connected through the eyelet on the nose cone insert.
· The parachute has been passed through the parachute line loop.
· The shock cord has been tied to the nose cone with a double knot.
09-Launch Lug Attachment:
The launch lug attachment will pass inspection and be accepted if:
· The launch lug has been centered on the LL line and double glued.
· The glue has dried completely.
10-Paint the Rocket:
The paint job will pass inspection and be accepted if:
· The rocket is painted and the surface is smooth.
11- Decal Application:
The decal application will pass inspection and be accepted if:
· All decals are applied.
· All decals are smooth.
12-Clear Coat Application:
The clear coat application will pass inspection and be accepted if:
· The rocket is smooth.
· The clear coat is completely dry.
13-Display Nozzle Assembly:
The display nozzle assembly will pass inspection and be accepted if:
· The base is painted white.
· The nozzles are painted silver.
· The base plate and the silver nozzles are completely dry.
· The silver nozzles are glued into place on the base plate.
14-Rocket Pre-Flight:
The rocket preflight assembly will pass inspection and be accepted if:
· The rocket will have four or five sheets of recovery wadding loosely packed in the top section.
· The parachute will be folded with the parachute lines loosely wrapped around it.
· The parachute will be placed into the top section of the rocket.
· The cone will be placed onto the top of the rocket.
15-Prepare for Test Launch:
· The engine will be inserted into the rocket.
Project Audits & Quality Reviews
Project Quality Audit Review | Planned Date | Quality Review Auditor | Comments
Assembling of the Engine Mount Audit | Completion of Deliverable | QA Team Member |
Fin Preparation Audit | Completion of Deliverable | QA Team Member |
Marking of Fin and Lug Lines Audit | Completion of Deliverable | QA Team Member |
Insertion of Engine Mount Audit | Completion of Deliverable | QA Team Member |
Fin Attachment Audit | Completion of Deliverable | QA Team Member |
Shock Cord Attachment Audit | Completion of Deliverable | QA Team Member |
Nose Cone Assembly Audit | Completion of Deliverable | QA Team Member |
Parachute and Shock Cord Attachment Audit | Completion of Deliverable | QA Team Member |
Launch Lug Attachment Audit | Completion of Deliverable | QA Team Member |
Paint the Rocket Audit | Completion of Deliverable | QA Team Member |
Decal Application Audit | Completion of Deliverable | QA Team Member |
Clear Coat Application Audit | Completion of Deliverable | QA Team Member |
Display Nozzle Assembly Audit | Completion of Deliverable | QA Team Member |
Rocket Pre-Flight Audit | Completion of Deliverable | QA Team Member |
Quality Plan Approvals
Prepared by: Project Manager
Approved by: Project Sponsor, Executive Sponsor, Client Sponsor
5.0 Staffing Management Plan
Project Name: Ansari X Prize Gauchito Rocket
Project Manager: Julie Davis, Space Systems Technology
Project Tracking Number: PMGT 605-0001 Date: May 8, 2006
Project Justification: In developing the 7/8 scaled-down model of the Ansari X Gauchito rocket, the opportunity to address needed modifications identified through quality analysis and testing on the design is invaluable to the Pablo de Leon & Associates space development program. This will also help in the identification, mitigation, and avoidance of risk to the space development program.
Overview of Staff requirements:
01 – Fitters (Not generic to the organization)
02 – Draftsmen
03 – Painter(s)
04 – Gluer/Assembler
05 – Cutter
06 – Sander Level I
07 – Sander Level II (5.1.4/13.1.1)
08 – Engineer
5.1 Key Constraints
Fitters are not generic to the organization; there is a requirement for fitters on this project.
Fitters will be outsourced through a staffing organization and employed based on requirements identified in the WBS and further matched to the histogram.
5.2 Key Assumptions
Resources are properly allocated to ensure load balancing throughout the project.
5.3 Staff Requirements
1. The Gauchito Scaled Demo Project is a project that will utilize much of the expertise within De Leon Enterprises (DLE). This is a precision project that calls upon many skills and skill sets housed under the De Leon logo. As usual, we will rise to the task, asking the best of the best to step forward in anticipation of a well-produced and structurally sound product.
Below are specific job requirements in support of the Gauchito project; individuals volunteering to support this undertaking will be screened by the project manager, Julie Davis, as it is her responsibility to ensure team success. The primary criteria used in selection will be:
· Attendance and performance
· Special knowledge and skills
· Work history
· Current project workload
· Authorization and cost to work overtime
· Interest in the project
5.4 CTU, Staff Acquisition and Team Development
Selected individuals will be detailed to support this project as an addendum to ongoing requirements. Specific position requirements are listed in the following. A team-building activity will be scheduled after selection and prior to the beginning of the project. This project has an expected completion date of not later than May 10, 2006.
Job Description for: STRUCTURAL DRAFTSMAN
This position requires a bachelor’s degree in civil engineering or a closely related field; or five years of experience in rocketry and structural design work including six months of CADD operation and an associate’s degree in civil engineering, drafting, or design technology. This candidate must have experience with ESTES plans. The person assigned to this position will perform other duties as assigned.
Job Description for: CUTTER
This position requires a bachelor’s degree in knife or scissors use. Selected individual will be able to cut, read, and interpret simple instructions. Individual will be able to cut out, with an appropriate tool, any patterns within one millimeter of specifications. The person assigned to this position will perform other duties as assigned.
Job Description for: PAINTER I
1. Must possess a high school diploma or GED equivalent with three years journey-level experience.
2. Will scrape, sandpaper, prime, or seal surfaces prior to painting.
3. Must be able to mix, match, and apply paint, varnish, shellac, enamel, and other finishes.
4. Will clean and care for brushes, spray guns, and other equipment.
Job Description for: PAINTER II (SUPERVISOR)
1. Must possess a high school diploma or GED equivalent with five years master-level experience.
2. Responsible for the preparation of all surfaces prior to painting.
3. Ensures proper application and adherence to surfaces of all kinds.
4. Responsible for checking the condition of woodwork, reporting any carpentry needs, clean-up, and ensuring completeness and customer satisfaction with assigned projects.
Job Description for: TEST ENGINEER (ROCKET SCIENTIST)
Review test specifications. Develop product test plans and identify and procure test equipment as needed. Interface with local project managers and worldwide customers. Follow NAR guidelines and procedures. Participate in design reviews. Support lab accreditation efforts.
General requirements: Bachelor’s degree in rocket engineering required. Master’s degree in engineering preferred. Minimum of 3 years of experience in product verification testing. Knowledge of rocket specifications and quality systems preferred. Strong computer and communication skills required. Experience with NAR rocket testing methods and equipment; experience in electronic hardware and software design methods.
Job Description for: SANDER I
1. Must possess a high school diploma or GED equivalent with three years journey-level experience.
2. Uses a machine or hand sands any surface until surface is smooth.
3. Applies filler compound to surfaces to seal wood.
Job Description for: SANDER II
1. Must possess a high school diploma or GED equivalent with five years master-level experience.
2. Uses a machine or hand sands to “fine” specifications.
3. Responsible for selection of proper “grit” sandpaper and/or sanding appliances.
4. Verifies adherence to requirements using applicable tools and/or experience.
Job Description for: GLUER/ASSEMBLER
Applies glue between seams in order to bind parts of wooden rockets or other items to make them airtight by either of following methods: (1) guides gluing tool that automatically forces gluing material into seam or (2) fills glue runner (funnel) with glue and guides runner along seam to fill seam with glue. Removes excess glue, using scraper or towel.
5.5 Constraints
This project has a requirement for fitters that are not organic to DLE. Authorization has been secured from management for external recruitment.
Job Description for: STRUCTURAL FITTER
This position requires an associate’s degree in fitter sciences or a closely related field; or five years of experience in performing duties including part or equipment location, assembly, and/or construction. Additional requirements for this position include the ability to read and follow simple instructions. The person that fills this position will perform other duties as assigned.
Recruitment
Project Team A will immediately begin recruitment actions through the DLE recruitment section to recruit and hire a qualified candidate(s) to meet these staffing needs. After initial selection by recruitment, candidates will be interviewed by Team A and will need to secure a two-thirds majority vote in order to secure the position(s). This is a process that is now, and will remain, germane to DLE enterprises.
5.6 Continued Employment
Under the general terms of this project and based on the past performance of DLE, this firm anticipates continued contracts in support of this parent project; as such, DLE will support the uninterrupted employment of a minimum of two fitters. Hired individuals will remain under the functional control/management of the assembly section. Reevaluation of this policy will occur not later than one calendar year from the final date of this document.
Sufficient resources are available to support this undertaking. Resource allocation has been formally studied, and supporting documentation (enclosed) is available for review by functional managers. These requirements will be continually evaluated throughout the life of the project and adjustments made as required.
6.0 Communications Management Plan
6.1 Effective communication, internally and externally, is the most overlooked resource in the project management arena. As we delve into yet another project, this will not become a stumbling block for SST. We are not reinventing the wheel; we will continue to communicate in a manner that has been proven to work in support of other projects. The following listed procedures have been designed to build communication at all levels and to ease the supporting processes while continuing to capitalize on its inherent benefit.
6.2 The project manager is responsible for disseminating any and all information to concerned parties. This responsibility includes information of the type indicated in the following chart as well as other information deemed pertinent by any interested party. Items not covered in this chart (Enclosure II) will, at a minimum, be posted in the Gauchito project website daily (by close of business).
Note: The project secretary will post weekly status meeting minutes to the company Gauchito project website.
6.3 Information becomes such when it is inclusive of the components who, what, when, where, why, and how; effective communication includes these components and addresses the specific subject of interest. Information communicated about this or any project will be as detailed as possible initially and will be followed up in writing or in an email to all concerned parties. Any input, as a contributor to the successful outcome of this project, will be welcomed.
6.4 When seeking answers or addressing concerns, a response in writing will be provided to the initiating party within 24 hours (work hours). If this time frame is violated or is not sufficient, escalation will follow the below-listed path:
a. Functional department manager
b. Project manager
c. Project sponsor
NOTE: Information will not be communicated directly to the customer except through the personnel listed above.
Escalation to any of the above-listed parties will be in writing or through email. Additionally, a copy of this correspondence must be maintained by the initiating party.
6.5 The following chart, shown as Enclosure I, is a guide to be used in support of this communications plan; concerned parties have been broken into teams to ensure that information flows are consistent with the needs of both stakeholders and management. This list may not be inclusive, and provisions for the update of it, Enclosure II, or this communications plan are detailed in the paragraph below.
6.6 Organizational feedback is always welcome. As DLE strives to satisfy the needs of both its customer and employees, concerned individuals should feel free to offer suggestions for improvement, update, or refinement of this or any other policies. Requests for modification of this plan or communication chart(s) will be voiced in writing through the hierarchical structure as specified above. Failure to comply with this minimal communication guidance will result in disciplinary action or termination.
TEAM A
Title | Name | Organization
Sponsor | Jeff Tyler | SST
Project Manager | Julie Davis | SST
Cost Financing | Gwen Edward | SST

TEAM B
Title | Name | Organization
Quality Assessment | Brian Kirouac | SST
Quality Assessment | Tom Jones | SST

TEAM C
Title | Name | Organization
Functional Mgr. (Drafting) | B. Jose Alonzo | SST
Functional Mgr. (Painting) | Robert Muse | SST
Functional Mgr. (Finishing/Sanding) | Buford T. Linking | SST
Functional Mgr. (Assembly/Gluer) | J. Christian Bose | SST
Functional Mgr. (Fitters) | Charles Gooding | SST

TEAM D
Title | Name | Organization
Procurement | Jeanea Brown | SST
Enclosure I
Project Communications Planner
|
Who?
|
What
Information?
|
When?
|
How?
(Form/Medium)
|
|
Team A
|
Team B
|
Team C
|
Team D
|
Project
Manager
All
Recruitment
Supplies confirmation
Fitters assignment
Change modification request
Change mod. confirmation
Schedule modification
Personnel change (impact) Personnel change (no impact)
Facilities (impact)
Weather (impact)
Weekly status
Stage completion
Project completion
Change mod. confirmation
Schedule modification
Stage completion
Supplies confirmation
Supplies requests/mod.
Supplies lost or damaged
Budgetary changes
Change mod. confirmation Schedule modification
Personnel change (impact)
Personnel change (no impact)
Facilities (impact) Weather (impact)
Stage completion
Project completion
Schedule/costs Reports
Weekly status meeting minutes
Timecards (fitters)
|
Upon receipt
Contract signing
Upon receipt
Prior to implementation
48 hours prior
Prior to change
Weekly status meeting
Immediate
Immediate
Weekly
Upon completion
One-day prior
After completion
After implementation
Upon completion
Upon receipt
Immediately
Immediately
Immediately
Immediately
Immediately
Immediately
Immediately
Immediately
Immediately
Upon completion
One-day prior
Weekly
Within 24 hours
Fridays by 5:00 p.m.
|
Email/weekly meeting
Email/weekly meeting
Telephonic
Telephonic/email
Weekly meeting
Telephonic/email
Weekly meeting
Telephonic/email
Telephonic/email
Weekly meeting
Email/weekly meeting
Email
Email
Email
Email
Email
Email/weekly meeting
Email/weekly meeting
Email/telephonic
Email
Email
Email
Email
Email
Email
Email
Email
Email/database update
Email/project website
Fax/email
|
7.0 Risk Management Plan
After holding a risk planning meeting, the Gauchito rocket project team focused on all of the possible project risks and their ramifications and determined that it would be imperative to develop a risk management plan in order to prepare for risk events and lay out mitigation plans for the risks that seemed most likely to affect the project’s scope, quality, or schedule.
7.1 Methodology
The Gauchito project team determined that the best way to approach risk determination and
definition would be to bring the entire team together, including all of the individuals of the various skill sets involved in project construction, and brainstorm. It seemed using everyone’s ideas and narrowing the risk events down from there was the most accurate way to determine all of the risks involved from every perspective of this project. Data from previous projects completed by the SST team members were used as input to risk definition.
7.2 Roles and Responsibilities
It was decided among the project team that the roles and responsibilities would be laid out as shown in
the appendix titled Risk, Roles, and Responsibilities. The bulk of the responsibility will be placed on the project manager, as this person has the authority within the team to approve or deny any mitigation plans that are developed as well as the authority to alter the scope statement, the budget, or the quality plan as may be required if any of the determined risk events occurs.
7.3 Budgeting
The Gauchito rocket project team discussed the budget and what effects there would be, if any,
according to each risk event. It was determined that while certain risks could have an effect on the schedule and quality of the project, corrective costs would be negligible unless the risk event went unnoticed and thus uncorrected for an extended time period. Therefore, the project manager, along with the cost and risk managers, determined it would be feasible to allocate roughly 10% of the budget to risk management and mitigation. This percentage will be reevaluated weekly for the life of the project, but it is not necessary to “re-baseline” for this project.
7.4 Timing
After conferring with the rest of the project team, the risk manager, along with the other team managers,
attempted to determine precisely how critically the project schedule might be affected should any of the risk events occur. It was determined that, with the mitigation plans that were approved by the project manager, the schedule would take a minimal hit if any one event occurred. Buffers were already written into the schedule should any unforeseen overtures occur, in order to prevent the project going past its scheduled end date, and the team feels these buffers will be appropriate for the risk management plan; rewriting the project schedule will be unnecessary.
7.5 Risk Categories
The Gauchito rocket project team found it most appropriate to categorize the risks by the three major constraints of any project: cost, quality, and schedule. The reasons for selecting these three categories are as follows: cost, because any risk event that occurs will inevitably affect the project's budget in some form; quality, because mitigating a risk event, or altering a project plan after an unexpected risk affects cost or schedule, often requires reallocating time and money, which in turn can alter the quality of the product; and schedule, because many risk events alter the schedule in some way, be the change large or small.
7.6 Definition of Risk Probability and Impact
After careful consideration and collaboration among all of the plan managers on the project, it was determined that a project of this scale needed probability and impact measurements signifying low, moderate, and high. Very low and very high were deemed unnecessary. Numeric values were assigned to each of these levels as follows:
Probability – Low / 0.05; Medium / 0.1; High / 0.5
Impact – Low / 0.1; Medium / 0.3; High / 0.5
The identified risk events were scored on probability and impact in a risk assessment table prior to being placed into the Probability and Impact Matrix. The total ratings in the table, obtained by multiplying probability by impact, indicate the organization's sensitivity to each risk event listed.
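As a worked illustration of that arithmetic (a sketch only, using the probability and impact scales defined in Section 7.6):

# Illustrative sketch: total rating = probability x impact, using the Section 7.6 scales.
PROBABILITY = {"low": 0.05, "medium": 0.1, "high": 0.5}
IMPACT = {"low": 0.1, "medium": 0.3, "high": 0.5}

def total_rating(probability_level, impact_level):
    """Score one risk event; higher ratings mean greater organizational sensitivity."""
    return round(PROBABILITY[probability_level] * IMPACT[impact_level], 3)

print(total_rating("medium", "high"))    # 0.05, the highest rating appearing in the table below
print(total_rating("medium", "medium"))  # 0.03
print(total_rating("medium", "low"))     # 0.01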
Risk Assessment Table (probability scale: Low = 0.05, Med. = 0.1, High = 0.5; magnitude-of-damage scale: Low = 0.1, Med. = 0.3, High = 0.5)

Activity | Rank | Total Rating (Probability × Magnitude) | Planned Action
Key personnel unavailable | 3 | 0.05 | Ensure contractors are available if necessary
Delayed delivery of materials and equipment | 1 | 0.05 | Make delivery a requirement in supplier contracts
Weather | 5 | 0.01 | Change schedule if necessary
Materials shortages | 4 | 0.03 | Contract with current supplier and identify alternate suppliers if needed
Damage to original parts provided in kit | 2 | 0.03 | Insurance and liability waiver
Revised Stakeholder Tolerances
After determining the risk categories, the specific risk events, their probability and impact,
and their total ratings, all of this information was put together and brought before the project sponsor for approval before it was presented to the primary stakeholders. The risk events were laid out before the project stakeholders, as well as the mitigation plans that had been approved for each event. After careful consideration and discussions between themselves, the project sponsor, and the project manager, the primary stakeholders informed the PM that they would tolerate the risk events as they were laid out and were more than willing to allow SST to continue with the project. The stakeholder tolerances did not have to be altered in this instance, but as the project is monitored for risks, the stakeholders will be kept abreast of any changes; should their risk tolerance change, it will be documented for this project and for future reference.
7.7 Reporting Formats for Risk Register
The risk register, as shown in the example below, lays out the definition of each risk, the category it fits into, the root cause of the risk event (if the root cause has been determined and is relevant to the project), the triggers for the risk, and how SST will respond to the risk event. During the weekly quality assessments, the risk manager will accompany the quality manager and assess the progress of the project and any occurrences that could be considered risks. These assessments, along with any reports from the construction staff and the functional managers, will be used to identify risk occurrences, all of which, big or small, will go into the register.
Risk Register
Project:            Revision:            Date:

Risk # | Risk Description | Risk Root Cause | Risk Category | Trigger | Risk Response
1 | | | | |
2 | | | | |
3 | | | | |
4 | | | | |
7.8 Tracking
It is imperative that any risk events be tracked throughout the life of this project, be they big or small,
as a small risk is likely to become a big risk and a detriment to the project if it goes unchecked for a length of time. As this information can be important to any member of the project, it needs to be easily accessible; therefore, each plan manager, as well as the project manager, has copies of the risk register and the weekly assessments available for viewing upon request.
8.0 Procurement Management Plan
8.1 Contract Types
There are several contract types to choose from, so it was left to the procurement manager to determine which would be most beneficial for the Gauchito rocket project team, ensuring that all necessary materials and equipment were delivered in a timely fashion without excessive expense to the organization. After reviewing all of the materials and equipment to be purchased, the procurement manager determined that the best form of contract for materials and equipment (M&E) purchases was firm fixed price.
These contracts were entered into with a clear understanding that all of the M&E purchased was to be delivered no later than a week prior to the project start. This was agreed upon by the organization and the suppliers, and the legal documents were filed with legal, with copies being distributed to the project manager, the project sponsor, and a copy to go on file in the procurement office.
There was also a need to outsource the fitters for this project, as that skill set is no longer available on staff within the organization. After a discussion with HR and a review of the quality manager's research, it was determined that the fitters would be contracted from the local fitters' union. Contracts will therefore be drawn up specifying their hourly pay, their breaks and lunches, and all other provisions necessary to meet union requirements. Because these fitters are unionized and have a specific hourly wage for their skill set and experience, a fixed-price contract was deemed best in this instance as well. The fitters will receive their hourly wages only, as there will not be any overtime or extraneous costs where the outsourced staff is involved.
8.2 Uses of Organizational Procurement, Contracting, or Purchasing Departments
The procurement manager, along with the rest of the procurement department, was responsible for
determining which suppliers to contact, drawing up all of the requests for proposals (RFPs), creating the benchmark to determine the best supplier to meet the project team’s needs, determining what types of contracts to enter into with suppliers, and managing those contracts throughout the life of this project.
8.3 Standardized Procurement Documents
There are several forms of standardized documents used in procurement, many of which were
used for this project. These documents include the firm fixed-price contracts entered into with the suppliers, the requests for proposal sent out to several suppliers, and the benchmark.
8.4 Constraints and Assumptions
There are a number of constraints and assumptions that have to be considered with any project,
and this one is no exception. The assumptions and constraints for this project are as follows:
Constraints:
· There are no available fitters within the organization.
· Trained fitters have to be outsourced for this project.
· Delayed delivery of materials and/or equipment will result in a schedule delay.
Assumptions:
· The materials and equipment will arrive one week prior to project start date.
· All necessary personnel will be available at the time of the project start.
· The contracts will be followed as written without any delays or difficulties.
8.5 Purchase and Acquisition Lead Times
The suppliers for all of the necessary materials and equipment are in the United States and provide rush delivery, but that is no reason to delay the M&E purchases. It was determined that the wisest time to place the purchase orders was 12 days prior to project start, so that the suppliers would have the necessary materials and equipment in stock and ready to ship and everything would arrive promptly on the designated delivery date.
8.6 Types of Warranties
It was clearly stated upon the purchase of the materials and equipment necessary for this project
that everything came with a manufacturer warranty, and the stakeholders and project sponsor all deemed these warranties sufficient for this project because this rocket is a test product and will not be used repeatedly, but rather for data gathering.
8.7 Probability and Impact Matrix
Probability and Impact Matrix

Probability 0.5: (no risks plotted)
Probability 0.1: Risk 5 | Risk 1, Risk 3
Probability 0.01: Risk 2, Risk 4
Impact scale: 0.1 | 0.3 | 0.5

Key: High-Level Risk - Resolve immediately; Moderate Risk - Track through the life of the project; Minimal Risk - Be aware of it, but no tracking necessary.
Benchmark

Suppliers | Ballard Power | Quantum Tech | Cordant Tech | Pratt & Whitney
Cost of Hybrid Engines (4) | $12,000 | $10,000 | $10,750 | $11,000
Cost of M&E | $4,500 | $5,000 | $5,000 | $5,500
Delivery Time | 1.5 weeks | 1 week | 2 weeks | 2.5 weeks
Delivery Cost | $200 | $125 | $175 | $195
RFP Score | 8 | 10 | 9 | 10

Quantum Tech has the most criteria matching what the Gauchito project team is looking for. Therefore, it will be the obvious choice for the supplier for the necessary specialty hybrid engines.
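As an illustrative cross-check of that conclusion (a sketch only, not part of SST's tooling; "best" is taken here as lowest cost and delivery time and highest RFP score, and the figures are the ones in the benchmark above):

# Tally how many benchmark criteria each supplier leads (lower is better except RFP score).
suppliers = ["Ballard Power", "Quantum Tech", "Cordant Tech", "Pratt & Whitney"]
criteria = {
    "Cost of Hybrid Engines (4)": ([12000, 10000, 10750, 11000], min),
    "Cost of M&E":                ([4500, 5000, 5000, 5500], min),
    "Delivery Time (weeks)":      ([1.5, 1.0, 2.0, 2.5], min),
    "Delivery Cost":              ([200, 125, 175, 195], min),
    "RFP Score":                  ([8, 10, 9, 10], max),
}

wins = {supplier: 0 for supplier in suppliers}
for values, best in criteria.values():
    for supplier, value in zip(suppliers, values):
        if value == best(values):
            wins[supplier] += 1

print(wins)  # Quantum Tech leads four of the five criteria, supporting the selection above.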
APPENDIX A
Space Systems Technology
Ansari X Prize Cup-Gauchito Rocket
Project Charter
Project Name: Ansari X Prize Cup Gauchito Rocket
Project Manager: Julie Davis
Project Tracking Number: PMGT 605-0001 Date: May 7, 2006
Project Justification:
This project will address design consideration of the Gauchito rocket. The rocket’s design, launch, and test results can be observed using a 7/8 scale of the full-sized rocket. This approach provides a valid measurement of the rocket’s success without the time and expense necessary to build a full-sized rocket. This will also help in the identification, mitigation, and avoidance of risk to the Pablo de Leon X Prize entry. This project is being undertaken to show that our company has the ability to produce a reliable test product that accurately duplicates the final full-sized rocket.
Overview of Deliverables:
1.0 ASSEMBLE ENGINE MOUNT
2.0 FIN PREPARATION
3.0 MARK FIN AND LAUNCH LUG LINES
4.0 INSERTING ENGINE MOUNT
5.0 ATTACH FINS
6.0 ATTACH SHOCK CORD
7.0 ASSEMBLE NOSE CONE
8.0 ATTACH PARACHUTE/SHOCK CORD
9.0 ATTACH LAUNCH LUG
10.0 PAINTING THE ROCKET
11.0 APPLICATION OF DECALS
12.0 APPLYING CLEAR COAT
13.0 DISPLAY NOZZLE ASSEMBLY
14.0 ROCKET PREFLIGHT
15.0 PREPARE FOR TEST LAUNCH
Specific Project Objectives and Success Criteria:
A. The project goal is to build a functional model rocket at 7/8 the actual size within three months from the start date of the project.
B. The SST project manager (PM), Julie Davis, will be responsible for providing the sponsor, Jeff Tyler, with scheduling status on a daily-to-weekly basis, as this is a short-duration project.
1. Cost
A. The Gauchito 7/8-scale model rocket will be built on estimated funding of $63,000.
B. The cost will be further defined as the project resource and cost estimates progress, using SST's existing templates. Budget status shall be provided to the project sponsor on a weekly basis.
2. Quality
A. The Gauchito rocket is to be built as a 7/8 scale model of the planned full-size rocket, the Delta II.
B. The Gauchito rocket will be built according to all specifications in the kit.
C. Length 39.37 ft, diameter 7.28 ft, GTOW 17,637 lb, DRY WT 5291 lb.
D. All deliverables stated will be inspected by QA staff before completion of rocket.
E. Risk management shall be addressed by the SST team in tandem with QA.
F. Testing will be done by the test staff to gather metrics during the test launch.
G. Goal is an altitude of 67 miles with a max speed of 2,684 mph.
Primary Stakeholders and Roles:
The primary stakeholders are as follows.
Name | Role | Responsibilities/Authority
SST, Mr. Jeff Tyler | Pablo de Leon & Associates/Signoff Charter | Fulfill customer contract (e.g., funding; monitor contract fulfillment, coordinating test site and rocket test launch demonstration). Approval of project completion and closure.
SST, Mr. Jeff Tyler, Pablo de Leon & Associates | Project Sponsor/Signoff Charter | Communicate with stakeholders and commitment of personnel resources.
SST, Ms. Julie Davis | Pablo De Leon & Associates-Ansari X Prize Gauchito Rocket Project Manager/Signoff Charter/Scope | Responsibilities: coordinate the project planning, executing, monitoring and controlling, and closing, following DOD's processes based on the PMI; ensure project deliverables are completed on time and in budget; report project progress to stakeholders, covering the critical path schedule, deliverables, and any identified risks, updated during weekly status meetings; coordinate training for outsourced fitters; coordinate and lead project meetings. Authority: communicate with the DOD contact on any issues; communicate directly with project sponsor Major T.J. Stone and DOD executives on status and issues; communicate with the resource officer and affected functional managers regarding resource allocation and scheduling; authorize changes and any corrections to be taken, to include DOD activities. Limitations: the PM will not manage HR activities regarding DOD personnel.
SST, Julie Davis | Scope | Project Charter (Julie); Project Preliminary Scope (Tom); Product Description (Gwen); WBS (Julie); Constraints (Julie); Assumptions (Julie)
SST, Brian Kirouac | Schedule | Scheduled start dates for WBS tasks; major milestones and target dates
SST, Tom Jones | Quality | Provide quality assurance staff for the project to validate each deliverable.
SST, Gwen Edwards | Cost/Financing | Preparation of cost estimates; performance measurement baseline
SST, Kevin LaSalle | Staffing | Responsibility assignment matrix; key or required staff
SST, Jeanea Brown | Risk Manager | Identify key risks and provide risk management resources to the project to facilitate identifying risks and planning for contingencies.
SST, Kevin LaSalle | Communications | Description of how the communication for the project is going to proceed.
SST, Kevin LaSalle | Procurement | Coordination of resources with the resource officer and any functional managers affected.
Key Constraints:
1. SST does not currently have the employee resources available during our project start to finish dates for the fitter positions.
2. The materials must be delivered the Friday before project start.
3. Project must be completed in 3 months' time.
4. Estimated budget is not to exceed $63,000.
5. Customer may add to the scope of the project.
Key Assumptions:
1. All materials for the rocket will have been purchased by SST and received no later than the Friday before project start. (Recommended engines are as follows: 1/2A3-2T, A3-4T, A10-3T.)
2. All activities associated with building the model rocket will follow the National Association of Rocketry (NAR) Safety Code.
3. All project management activities will use SST’s project tools, templates, and processes, based on PMI standards.
4. The project can be completed in 3 months' time and within an estimated budget of $63,000.
Signatures—The following people agree that the above information is accurate:
· SST project team members:
Ms. Julie Davis
____________________________________________
Ms. Gwen Edwards
____________________________________________
Ms. Jeanea Brown
____________________________________________
Mr. Tom Jones
____________________________________________
Mr. Kevin LaSalle
____________________________________________
Mr. Brian Kirouac
____________________________________________
· Project sponsor and/or authorizing manager(s):
Mr. Jeff Tyler ____________________________________________
APPENDIX B
Gauchito Rocket Scaled Demonstration
Product Description
Project Name: Ansari X Prize Gauchito Rocket Date: May 8, 2006
Project Overview:
Pablo de Leon and Associates is continuing the concept of scaled rocket validation by requesting the construction of a 7/8 scale of the Delta II rocket. The first company that completes construction of a rocket that meets the requirements will win the competition.
Rocket Components and Steps:
The components of the rocket will be built following the steps in the provided construction plan. The steps are below.
01-Assembling of the Engine Mount
02-Fin Preparation
03-Marking of Fin and Lug Lines
04-Insertion of Engine Mount
05-Fin Attachment
06-Shock Cord Attachment
07-Nose Cone Assembly
08-Parachute and Shock Cord Attachment
09-Launch Lug Attachment
10-Paint the Rocket
11-Decal Application
12-Clear Coat Application
13-Display Nozzle Assembly
14-Rocket Pre-Flight
15-Prepare for Rocket Test Launch
Specific Product Specifications:
The Gauchito rocket will be built according to the following specifications:
1. 7/8 scale model of the planned full-size rocket, the Delta II.
2. The rocket will be assembled with materials and equipment in the assembly kit. All rocket components in the construction kit will be assembled as part of the rocket or launch setup.
3. Recommended hybrid engines are as follows: 1/2A3-2T, A3-4T, A10-3T to replace the standard engines in the construction kit. The construction company will select the best engine to meet the performance requirements.
4. Length = 39.37 ft, Diameter = 7.28 ft, GTOW 17,637 lb, DRY WT 5291 lb.
Performance: The Gauchito rocket performance will be based upon the following.
1. Reaching a launch altitude of 67 miles in less than 17 seconds total flying time.
2. Total thrust of 52,910 pounds.
3. The maximum speed capable of being reached is 2,684 mph, and it will be part of metrics gathered during test. It is not a determination in measuring if the rocket performance is acceptable. It is, however, a factor in obtaining the launch altitude specified. A reduced speed may make the rocket incapable of reaching the required altitude.
4. Payload capacity of three crewmembers or 300 kg.
5. Crew environment: Nitrogen-oxygen using pressurized suits in a pure oxygen atmosphere.
Quality:
1. The Gauchito rocket will be built according to all instructions in the construction plan.
2. All activities associated with building the model rocket will follow the National Association of Rocketry (NAR) Safety Code.
Product Assumptions:
1. Pablo de Leon & Associates will take possession of the rocket upon completion of assembly following the above steps and meeting the requirements.
APPENDIX C
Preliminary Scope Statement
Gauchito Rocket
Project Name: Ansari X Prize Gauchito Rocket
Project Manager: Julie Davis
Project Tracking Number: PMGT 605-0001 Date: April 19, 2006
Project Justification: This project will address design consideration of the Gauchito rocket. The rocket’s design, launch, and test results can be observed using a 7/8 scale of the full-sized rocket. This approach provides a valid measurement of the rocket’s success without the time and expense necessary to build a full-sized rocket. This will also help in the identification, mitigation, and avoidance of risk to the space development program. This project is being undertaken to show that our company has the ability to produce a reliable test product that accurately duplicates the final full-sized rocket. To do this, we will deliver the completed 7/8-scale rocket to the Peterson AFB test launch site 3 days after assembly is completed.
I. Overview of Deliverables:
Reference the work breakdown structure (WBS), prepared as a separate document and an example included in Appendix D, for details on each deliverable.
All deliverables will be constructed according to kit instructions.
1.0 ASSEMBLE ENGINE MOUNT
2.0 FIN PREPARATION
3.0 MARK FIN AND LAUNCH LUG LINES
4.0 INSERTING ENGINE MOUNT
5.0 ATTACH FINS
6.0 ATTACH SHOCK CORD
7.0 ASSEMBLE NOSE CONE
8.0 ATTACH PARACHUTE/SHOCK CORD
9.0 ATTACH LAUNCH LUG
10.0 PAINTING THE ROCKET
11.0 APPLICATION OF DECALS
12.0 APPLYING CLEAR COAT
13.0 DISPLAY NOZZLE ASSEMBLY
14.0 ROCKET PREFLIGHT
15.0 PREPARE FOR TEST LAUNCH
II. Specific Project Objectives and Success Criteria:
1. The project objective is to develop, build, and test a working model rocket at 7/8 the actual size within two months from the start date of the project.
2. The initial milestones are as follows and can be changed utilizing already implemented change management templates as the project progresses toward completion.
3. Schedule
a. The project goal is to build a working model within three months from project start.
b. The demonstration rocket testing will be scheduled within one week of completion of the model rocket, at the test launch facility. The test launch will be scheduled with an alternate date in case the wind speed is over 10 miles per hour on the target date because high wind speed may negatively impact the rocket success.
c. The preliminary schedule in Appendix I shows milestones for each deliverable, based on the WBS. The schedule will be further refined during project planning, and milestones may be moved, depending on factors such as concurrent work activities.
d. Each resource assigned to the project will provide weekly status reports to include issues.
e. The project manager will provide schedule status to the project sponsor and the customer on a weekly basis, given the project’s short duration.
4. Cost
a. The build of the rocket will be based on an agreed-upon funding of $50,000.
b. The project planning will further refine the preliminary resource and cost estimates provided in Appendix C. This process will use existing templates and procedures. The project manager will provide budget status to the project sponsor and the customer on a weekly basis, given the project’s short duration.
c. Engineering resources assigned to the project will support refining requirements and cost estimates.
5. Quality
a. The scaled composite Gauchito rocket will be a 7/8 scale model of the planned full-sized rocket model.
b. The demonstration rocket will be built to the specifications in the kit, including optional steps, such as fin sanding to optimize performance.
c. All fins available in the kit will be installed, per the fin preparation step, to ensure rocket stability.
d. Quality assurance personnel assigned to the project will prepare a quality assurance plan input to the PM to gather necessary approvals.
e. Quality assurance personnel assigned to the project will examine each individual component produced by a deliverable before attaching each component as part of the rocket. QA will also inspect the rocket prior to launch.
f. Configuration management personnel assigned to the project will support the PM in preparation of a configuration management plan and support the definition of activities for defect correction.
g. Configuration management personnel assigned to the project will maintain a list of issues and resolutions.
h. Risk management personnel assigned to the project will support the PM in preparation of a risk management plan to identify risks, level of risk, and contingencies if the risks should occur.
i. Test personnel assigned to the project will support the PM to provide input to the test plan.
j. The rocket will be launched using the purchased four hybrid rocket engines.
k. The rocket body will remain intact through the test launch, without breaking up before reaching the target altitude, and return safely.
III. Scope Management Issues:
The project scope will be maintained by entering all requirements and work activities into existing program tools for tracking requirements (WBS, scope management plan, project charter, etc.). The activities will meet the project objectives and the contract requirements.
1. Defects:
If defects are identified by QA during construction, QA will document them and provide the data to the PM. The PM and project team will identify options for correcting the defects and present them, along with a recommendation of the most effective option. Any schedule, cost, risk, and performance impacts will be identified. The change control board (CCB) will approve the corrective action and associated impacts.
The finance personnel assigned to the project will work contract changes as needed due to changes to cost and schedule.
Configuration management will maintain a list of defects and resolutions and CCB minutes.
2. Change requests:
If additional materials are requested or the construction varies from what is provided in the installation kit, the requester will complete a change request form and provide it to the PM.
(Note: Changes may be requested due to identifying a better installation technique.)
The request, along with cost, schedule, and risk impacts, will be presented to the CCB for evaluation and approval or denial.
If the change request is approved, the PM will update cost and schedule to incorporate the change. Finance personnel assigned to the project will work contract changes as needed due to changes to cost and schedule. Engineering, CM, QA, risk, and technical documentation will be updated accordingly. This includes updates to the kit instructions.
If the change request is denied, no changes will be made to the project scope.
Any additional materials needed due to a change request will be procured by Space Systems Technology (SST) and provided to the project test site.
Primary Stakeholders and Roles:
Name | Role | Responsibilities
SST | Customer | Fulfill customer contract (e.g., funding; monitor contract fulfillment, including participation in Configuration Control Board [CCB], coordinating test site and rocket test launch demonstration).
Mr. Jeff Tyler | Project Sponsor/Signoff Scope Statement | Communicate with stakeholders and commitment of personnel resources.
Ms. Julie Davis | Project Manager/Signoff Scope Statement | Coordinate the project planning, executing, monitoring and controlling, and closing. Responsible for the project budget, cost, and schedule. Provide schedule, define risks, report status, task personnel resources, coordinate any necessary training, and coordinate and lead CCB and project meetings. The PM will provide input to line managers on staff performance but will not manage HR activities. (Reference full responsibilities and authority in the project charter.)
Mr. Tom Jones | Quality Assurance (QA) and Signoff Scope Statement | Provide QA resources to the project to validate each deliverable. Participate in CM processes for possible defect correction in deliverables.
Mr. Kevin LaSalle | Staffing Manager/Signoff Scope Statement | Provide staffing resources to the project to coordinate support as specified in the statement of work (SOW).
Ms. Gwen Edwards | Cost Management/Signoff Scope Statement | Provide financial resources for contract activities and closure.
Mr. Brian Kirouac | Schedule Management Plan/Signoff Scope Statement | Defining severity levels of potential schedule impacts; identifying who needs to be involved at each level; determining how changes will be incorporated into the project schedule; determining how schedule changes will be communicated to key project stakeholders.
Ms. Jeanea Brown | Risk Manager/Signoff Scope Statement | Provide risk management resources to the project to facilitate identifying risks and planning for contingencies.
IV. Key Constraints:
1. SST does not currently have the employee resources available during our project start-to-finish dates for the fitter positions. If the materials are not delivered in the time frame on which the project schedule was based, the project delivery date will slip, and there is no slack available to absorb that change.
2. The materials must be delivered the Friday before project start.
3. Project must be completed in 3 months' time.
4. Estimated budget is not to exceed $63,000.
V. Key Assumptions:
1. All materials for the rocket will have been purchased by SST and received no later than the Friday before project start. (Recommended engines are as follows: 1/2A3-2T, A3-4T, A10-3T.)
2. All activities associated with building the model rocket will follow the National Association of Rocketry (NAR) Safety Code.
3. All project management activities will use SST’s project tools, templates, and processes, based on PMI standards.
4. The project can be completed in 3 months' time and within an estimated budget of $63,000.
Signatures—The following people agree that the above information is accurate:
· Project team members:
Ms. Julie Davis, Project Manager ________________________________
Ms. Gwen Edwards, Finance
_________________________________
Mr. Brian Kirouac, Engineering Manager ________________________________
Mr. Kevin LaSalle, Staffing Manager _________________________________
Ms. Jeannea Brown, Risk Manager _________________________________
Mr. Tom Jones, Quality Manager _________________________________
· Project sponsor and/or authorizing manager(s):
Mr. Jeff Tyler, Sponsor ________________________________
APPENDIX D
WORK BREAKDOWN STRUCTURE
1.0 ASSEMBLE ENGINE MOUNT
1.1 Measure, Mark, and Cut Engine Tube
1.1.1 Lay ruler along engine tube
1.1.2 Measure engine from left of engine tube @ 1/8″
1.1.3 Mark left end of engine tube @ 1/8″
1.1.4 Measure engine from left of engine tube @ 3/4″
1.1.5 Mark from left of engine tube @ 3/4″
1.1.6 Measure engine tube from left of engine tube @ 1 1/2″
1.1.7 Mark from left of engine tube @ 1 1/2″
1.2 Cut Engine Tube
1.2.1 Cut Slit of 1/8″ @ 1 1/2 inch mark on engine tube
1.3 Glue, Tube, Assemble Hook
1.3.1 Apply thin line of glue completely around engine at 3/4″ mark
1.3.2 Position hook per diagram
1.3.3 Insert engine hook into 1/8″ slit on engine mount tube
1.4 Assemble Mylar Ring to Tube
1.4.1 Slide mylar ring onto engine mount tube at 3/4″ mark
1.4.2 Let dry
1.5 Assemble Yellow Engine Block to Engine Mount Tube
1.5.1 Apply glue inside front of engine mount tube
1.5.2 Insert yellow engine block flush with the right end per diagram
1.5.3 Let dry
1.6 Assemble Centering Rings
1.6.1 Remove centering rings from card with modeling knife
1.6.2 Apply thin line of glue around engine mount tube @ 1/8″ mark
1.6.3 Slide notched centering ring onto glued line @ 1/8″ mark
1.6.4 Let glue set
1.6.5 Apply thin line of glue to opposite side of notched center ring flush with end of engine mount tube
1.6.6 Slide un-notched centering ring in place over glue flush with end of engine tube mount
1.6.7 Let dry
1.7 Application of Glue Fillets
1.7.1 Apply glue fillets to both sides of centering rings for reinforcement
1.7.2 Let dry
2.0 FIN PREPARATION
2.1 Sand/Cut Fins
2.1.1 Sand laser cut balsa sheet w/ fine sandpaper
2.2 Cutting Out Fins
2.2.1 Cut out fin #1 w/ modeling knife
|
2.2.2 Cut out fin #2 w/ modeling knife
|
2.2.3 Cut out fin #3 w/ modeling knife
|
2.2.4 Cut out fin #4 w/ modeling knife
|
2.3 Stack and Sand Fins
2.3.1 Stack fins
2.3.2 Sand edges of fins
3.0 MARK FIN AND LAUNCH LUG LINES
3.1 Cut – Tape
3.1.1 Cut out tube marking guide
3.1.2 Tape tube marking guide around body tube
3.1.3 Mark body tube at arrows
3.1.4 Mark launch lug line as LL on body tube
3.2 Remove guide, connect fins and lug lines, extend LL line
3.2.1 Remove tube marking guide from body tube
3.2.2 Connect fins using door frame
3.2.3 Connect launch lug lines using door frame
3.3 Extend Launch Lug Line
3.3.1 Extend launch lug line 3 3/4″ from end of tube
4.0 INSERTING ENGINE MOUNT
4.1 Mark inside of tube @ 5/8″ where LL is
4.1.1 Measure inside tube to 5/8″ position on tube
4.1.2 Mark inside tube at 5/8″
4.2 Glue Tube
4.2.1 Measure inside rear of body tube to 1 3/4″ position on tube
4.2.2 Use finger to smear glue 1 3/4″ inside rear of body tube along LL
4.3 Assemble Engine Hook
4.3.1 Align engine hook with LL line
4.3.2 Insert engine mount into body tube until centering ring is even w/ the 5/8″ glue mark
4.3.3 Let dry
4.4 Gluing Center Body Ring
4.4.1 Locate scrap piece of balsa to apply glue
4.4.2 Apply glue to centering/body tube joint
4.4.3 Let dry
5.0 ATTACH FINS
5.1 Attach Fin #1
5.1.1 Apply thin layer of glue to edge of fin
5.1.2 Allow to dry (1 minute for model)
5.1.3 Apply second layer of glue to edge of fin
5.1.4 Attach fin to body tube along one of fin lines flush w/ end
5.2 Attach Fin #2
5.2.1 Apply thin layer of glue to edge of fin #2
5.2.2 Allow to dry (1 minute for model)
5.2.3 Apply second layer of glue to edge of fin #2
5.2.4 Attach fin #2 to body tube along one of fin lines flush w/ end
5.3 Attach Fin #3
5.3.1 Apply thin layer of glue to edge of fin #3
5.3.2 Allow to dry (1 minute for model)
5.3.3 Apply second layer of glue to edge of fin #3
5.3.4 Attach fin #3 to body tube along one of fin lines flush w/ end
5.4 Attach Fin #4
5.4.1 Apply thin layer of glue to edge of fin #4
5.4.2 Allow to dry (1 minute for model)
5.4.3 Apply second layer of glue to edge of fin #4
5.4.4 Attach fin #4 to body tube along one of fin lines flush w/ end
5.5 Check Fin Alignment
5.5.1 Check fin #1 alignment as shown in diagram
5.5.2 Check fin #2 alignment as shown in diagram
5.5.3 Check fin #3 alignment as shown in diagram
5.5.4 Check fin #4 alignment as shown in diagram
5.6 Allow Glue to Dry
5.6.1 Let glue set
5.6.2 Stand rocket on end
5.6.3 Let glue dry completely
6.0 ATTACH SHOCK CORD
6.1 Cut Out Shock Cord Mount
6.1.1 Cut out shock cord from front page
6.2 First Glue Application
6.2.1 Attach shock cord to shock cord mount
6.2.2 Apply glue to shock cord mount
6.2.3 Fold edge of shock cord mount forward over glued shock cord
6.3 Second Glue Application
6.3.1 Apply glue to shock cord mount
6.3.2 Fold forward again-see diagram for clarification
6.4 Squeeze and Hold
6.4.1 Squeeze shock cord/shock cord mount tightly
6.4.2 Hold for 1 minute
6.5 Attaching Shock Cord Mount
6.5.1 Glue mount 1″ inside body tube
6.5.2 Hold until glue sets
6.5.3 Let dry completely
7.0 ASSEMBLE NOSE CONE
7.1 Glue Nose Cone
7.1.1 Apply plastic cement to inside rim of nose cone
7.1.2 Press nose cone insert into place over plastic cement inside of nose cone rim
7.1.3 Let dry completely
8.0 ATTACH PARACHUTE/SHOCK CORD
8.1 Attach Lines
8.1.1 Pass shroud line on parachute through eyelet
8.2 Attach Parachute
8.2.1 Pass parachute through loop in shroud-look to diagram for clarification
8.3 Tie Lines
8.3.1 Tie shock cord to nose cone using a double knot
9.0 ATTACH LAUNCH LUG
9.1 Glue Launch Lines
9.1.1 Glue LL centered onto LL Line on rocket body
9.2 Application of Glue Fillets
9.2.1 Apply glue fillets along launch lug
9.2.2 Apply glue fillets along fin/body tube joints
9.2.3 Smooth each fillet with finger
9.2.4 Let glue dry completely
10.0 PAINTING THE ROCKET
10.1 Apply First Coat
10.1.1 Spray rocket with white primer
10.1.2 Let dry
10.2 Sand
10.2.1 Sand entire rocket
10.3 Apply Final Coat
10.3.1 Spray completed rocket with white second coat of primer
10.3.2 Let dry
10.3.3 Spray nose cone with copper paint
10.3.4 Let dry
11.0 APPLICATION OF DECALS
11.1 Apply First Decal
11.1.1 Remove first decal from back sheet
11.1.2 Place on rocket where indicated
11.1.3 Rub decal to remove bubbles
11.2 Apply Second Decal
11.2.1 Remove second decal from backing sheet
11.2.2 Place on rocket where indicated
11.2.3 Rub decal to remove bubbles
11.3 Apply Third Decal
11.3.1 Remove third decal from backing sheet
11.3.2 Place on rocket where indicated
11.3.3 Rub decal to remove bubbles
11.4 Apply Fourth Decal
11.4.1 Remove fourth decal from backing sheet
11.4.2 Place on rocket where indicated
11.4.3 Rub decal to remove bubbles
11.5 Apply Fifth Decal
11.5.1 Remove fifth decal from backing sheet
11.5.2 Place on rocket where indicated
11.5.3 Rub decal to remove bubbles
11.6 Apply Sixth Decal
11.6.1 Remove sixth decal from backing sheet
11.6.2 Place on rocket where indicated
11.6.3 Rub decal to remove bubbles
11.7 Apply Seventh Decal
11.7.1 Remove seventh decal from backing sheet
11.7.2 Place on rocket where indicated
11.7.3 Rub decal to remove bubbles
12.0 APPLYING CLEAR COAT
12.1 Apply Clear Coat to Entire Rocket
12.1.1 Apply clear coat to entire rocket
12.1.2 Dry completely
13.0 DISPLAY NOZZLE ASSEMBLY
13.1 Spray Nozzle Base White
13.1.1 Paint nozzle #1 w/ silver paint pen
13.1.2 Paint nozzle #2 w/ silver paint pen
13.1.3 Paint nozzle #3 w/ silver paint pen
13.1.4 Paint nozzle #4 w/ silver paint pen
13.1.5 Allow to dry
13.2 Apply Glue
13.2.1 Apply glue to tab on nozzle #1
13.2.2 Place nozzle #1 into hole on base
13.2.3 Apply glue to tab on nozzle #2
13.2.4 Place nozzle #2 into hole on base
13.2.5 Apply glue to tab on nozzle #3
13.2.6 Place nozzle #3 into hole on base
13.2.7 Apply glue to tab on nozzle #4
13.2.8 Place nozzle #4 into hole on base
14.0 ROCKET PREFLIGHT
14.1 Prepare
14.1.1 Remove nose cone from rocket
14.1.2 Locate recovery wadding
14.1.3 Insert 4–5 loosely crumpled squares of recovery wadding
14.2 Spike
14.2.1 Pull parachute into a spike-see diagram for clarification
14.3 Fold
14.3.1 Fold parachute according to diagram
14.4 Roll
14.4.1 Roll parachute according to diagram
14.5 Re-insert
|
14.5.1 Wrap lines loosely around rolled parachute-see diagram for clarification
14.5.2 Insert parachute into body tube of rocket
14.5.3 Insert shock cord into body tube of rocket
14.5.4 Insert nose cone into body tube of rocket
15.0 PREPARE FOR TEST LAUNCH
15.1 Insert Engine
15.1.1 Remove engine
15.1.2 Insert tip to touch propellant
15.1.3 Insert engine into rocket
APPENDIX E
COST ROLLUP ESTIMATES
(see next page)
Resource types – estimates in man-hours for Duration Estimate
TASKS | Fitter | Draftsman | Gluer | Cutter | Sander I | Sander II | Painter I | Painter II | Engineer | Dummy | Duration Estimate
1.0 ASSEMBLE ENGINE MOUNT | 14 | 30 | 7 | 4 | 0 | 0 | 0 | 0 | 0 | 40 | 95
1.1 Measure, Mark and Cut Engine Tube 53000000000
-1.1.1 Lay ruler along engine tube
5
-1.1.2 Measure engine from left of engine tube
tube @ 1/8″5
-1.1.3 Mark left end of Engine Tube @ 1/8′ 5
-1.1.4 Measure engine from left of engine tube @
3/4″5
-1.1.5 Mark from left of EngineTube @ 3/4″ 5
-1.1.6 Measure engine tube from left of engine
tube @ 11/2″5
-1.1.7 Mark from left of Engine Tube @ 1 1/2″5
-1.2 Cut Engine Tube0002000000
-1.2.1 Cut Slit of 1/8″ @ 1 1/2 inch Mark on
Engine Tube 2
-1.3 Glue, Tube, Assemble Hook 5020000000
-1.3.1 Apply thin line of glue completely around
engine at 3/4″ mark2
-1.3.2 Position Hook per diagram2
-1.3.3 Insert Engine Hook into 1/8″ Slit on
Engine Mount Tube3
-1.4 Assemble Mylar Ring to Tube1000000008
-1.4.1 Slide Mylar ring onto Engline Mount tube
at 3/4″ mark 1
-1.4.2 Let Dry8
-1.5 Assemble Yellow Engine Block to
Engine Mount Tube1010000008
-1.5.1 Apply glue inside front of Engine Mount
tube 1
-1.5.2 Insert Yellow Engine Block flush with the
right end per diagram1
-1.5.3 Let Dry8
-1.6 Assemble Centering Rings20220000016
-1.6.1 Remove Centering rings from card with
modeling knife2
-1.6.2 Apply thin line of Glue around engine
mount tube @ 1/8″ mark1
-1.6.3 Slide notched Centering Ring onto glued
line @ 1/8″ mark1
-1.6.4 Let Glue Set8
-1.6.5 Apply thin line of Glue to opposite side of
notched center ring flush with end of engine
mount tube1
-1.6.6 Slide unnotched Centering Ring in place
over glue flush with end of engine tube mount1
-1.6.7 Let Dry8
-1.7 Application of Glue Fillets0020000008
-1.7.1 Apply Glue Fillets to both sides of
Centering Rings for reinforcement2
-1.7.2 Let Dry8
2.0 FIN PREPARATION | 2 | 0 | 0 | 12 | 16 | 0 | 0 | 0 | 0 | 0 | 30
-2.1 Sand/Cut fins
0000800000
-2.1.1 Sand Laser Cut Balsa Sheet w/Fine
Sandpaper 8
-2.2 Cutting Out Fins00012000000
2.2.1 Cut out fin #1 w/modeling knife3
2.2.2 Cut out fin #2 w/modeling knife3
2.2.3 Cut out fin #3 w/ modeling knife3
2.2.4 Cut out fin #4 w/modeling knife3
-2.3 Stack and Sand Fins2000800000
-2.3.1 Stack Fins2
-2.3.2 Sand Edges of fins8
3.0 MARK FIN AND LAUNCH LUG LINES | 19 | 12 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 33
-3.1 Cut – Tape 3802000000
3.1.1 Cut out tube marking guide2
-3.1.2 Tape tube marking guide around body
tube3
-3.1.3 Mark body tube at arrows
4
-3.1.4 Mark Launch Lug Line as LL on Body
tube4
-3.2 Remove guide, connect fins and lug
lines, extend LL line16000000000
-3.2.1 Remove Tube Marking guide from body
tube 4
-3.2.2 Connect Fins using door frame4
-3.2.3 Connect launch lug lines using door frame
8
-3.3 Extend Launch Lug Line0400000000
-3.3.1 Extend launch lug line 3 3/4″ from end of
tube4
4.0 INSERTING ENGINE MOUNT | 11 | 10 | 6 | 0 | 0 | 0 | 0 | 0 | 0 | 16 | 43
-4.1 Mark inside of tube @ 5/8″ where LL is0700000000
-4.1.1 Measure inside tube to 5/8″ position on
tube4
-4.1.2 Mark inside tube at 5/8″3
-4.2 Glue Tube0320000000
-4.2.1 Measure inside rear of body tube to 1 3/4′
position on tube3
-4.2.2 Use finger to smear glue 1 3/4″ inside rear
of body tube along LL.2
-4.3 Assemble Engine Hook 10000000008
-4.3.1 Align engine hook with LL line
5
-4.3.2 Insert engine mount into body tube until
centering ring is even w/the 5/8″ glue mark5
-4.3.3 Let Dry8
-4.4 Gluing Center Body Ring1040000008
-4.4.1 Locate scrap piece of balsa to apply glue
1
-4.4.2 Apply glue to centering/body tube joint 4
-4.4.3 Let Dry8
5.0 ATTACH FINS | 20 | 16 | 20 | 0 | 0 | 0 | 0 | 0 | 0 | 17 | 73
-5.1 Attach Fin #1 4050000001
-5.1.1 Apply thin layer of glue to edge of fin
3
-5.1.2 Allow to dry (1 minute for model)
1
-5.1.3 Apply second layer of glue to edge of fin
2
-5.1.4 Attach Fin to body tube along one of fin
lines flush w/end4
-5.2 Attach Fin #2 4050000001
-5.2.1 Apply thin layer of glue to edge of fin#23
-5.2.2 Allow to dry (1 minute for model)
1
-5.2.3 Apply second layer of glue to edge of fin
#22
-5.2.4 Attach Fin #2 to body tube along one of
fin lines flush w/end4
-5.3 Attach Fin #34050000001
-5.3.1 Apply thin layer of glue to edge of fin #3
3
-5.3.2 Allow to dry (1 minute for model)
1
-5.3.3 Apply second layer of glue to edge of fin
#32
-5.3.4 Attach Fin #3 to body tube along one of
fin lines flush w/end4
-5.4 Attach Fin #44050000001
-5.4.1 Apply thin layer of glue to edge of fin #4
3
-5.4.2 Allow to dry (1 minute for model)
1
-5.4.3 Apply second layer of glue to edge of fin
#42
-5.4.4 Attach Fin #4 to body tube along one of
fin lines flush w/end4
-5.5 Check Fin Alignment 01600000000
-5.5.1 Check Fin #1 Alignment as shown in
diagram4
-5.5.2 Check Fin #2 Alignment as shown in
diagram4
-5.5.3 Check Fin #3 Alignment as shown in
diagram4
-5.5.4 Check Fin #4 Alignment as shown in
diagram4
-5.6 Allow glue to dry 40000000013
-5.6.1 Let Glue Set5
-5.6.2 Stand Rocket on end4
-5.6.3 let glue dries completely8
6.0 ATTACH SHOCK CORD | 16 | 0 | 19 | 5 | 0 | 0 | 0 | 0 | 0 | 8 | 48
-6.1 Cut out shock cord mount 0005000000
-6.1.1 Cut out shock cord from front page
5
-6.2 First Glue Application8040000000
-6.2.1 Attach shock cord to shock cord mount
4
-6.2.2 Apply glue to shock cord mount4
-6.2.3 Fold edge of shock cord mount forward
over glued shock cord4
-6.3 Second Glue Application4040000000
-6.3.1 Apply glue to shock cord mount
4
-6.3.2 Fold forward again-see diagram for
clarification4
-6.4 Squueze and Hold0060000000
-6.4.1 Squeeze shock cord/shock cord mount
tightly2
-6.4.2 Hold for 1 minute4
-6.5 Attaching Shock Cord Mount4050000008
-6.5.1 Glue mount 1″ inside body tube44
-6.5.2 Hold until glue sets1
-6.5.3 Let Dry Completely8
7.0 ASSEMBLE NOSE CONE | 4 | 0 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 8 | 16
-7.1 Glue nose cone 4040000008
-7.1.1 Apply plastic cememt to inside rim of
nose cone 4
-7.1.2 Press Nose Cone Insert into place over
plastic cement inside of nose cone rim4
-7.1.3 Let Dry Completely8
8.0 ATTACH PARACHUTE/SHOCK CORD | 18 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 18
-8.1 Attach Lines7000000000
-8.1.1 Pass shroud line on parachute through
eyelit 7
-8.2 Attach Parachute5000000000
-8.2.1 Pass parachute through loop in shroud-
look to diagram for clarification5
-8.3 Tie Lines6000000000
-8.3.1 Tie shock cord to nose cone using a
double knot6
9.0 ATTACH LAUNCH LUG | 0 | 0 | 24 | 0 | 0 | 0 | 0 | 0 | 0 | 8 | 32
-9.1 Glue launch lines 0040000000
-9.1.1 Glue LL centerd onto LL Line on rocket
body 4
-9.2 Application of Glue Fillets00200000008
-9.2.1 Apply glue fillets along launch lug
4
-9.2.2 Apply glue fillets along fin/body tube joints
12
-9.2.3 Smooth each fillet with finger
4
-9.2.4 Let glue dry completely8
10.0 PAINTING THE ROCKET | 0 | 0 | 0 | 0 | 1 | 16 | 8 | 48 | 0 | 24 | 97
-10.1 Apply first coat0000008008
-10.1.1 Spray rocket with white primer
8
-10.1.2 Let Dry8
-10.2 Sand 00001160000
-10.1.2 Sand entire rocket116
-10.3 Apply final coat000000048016
-10.3.1 Spray completed rocket with white
second coat of primer 16
-10.3.2 Let Dry8
-10.3.3 Spray Nose Cone with Copper paint 32
-10.3.4 Let Dry8
11.0 APPLICATION OF DECALS | 0 | 35 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 35
-11.1 Apply first decal 0500000000
-11.1.1 Remove First decal from back sheet 1
-11.1.2 Place on Rocket where indicated3
-11.1.3 Rub decal to remove bubbles1
-11.2 Apply second decal 0500000000
-11.2.1 Remove second decal from backing
sheet1
-11.2.2 Place on Rocket where indicated3
-11.2.3 Rub decal to remove bubbles
1
-11.3 Apply third decal0500000000
-11.3.1 Remove third decal from backing sheet1
-11.3.2 Place on Rocket where indicated3
-11.3.3 Rub decal to remove bubbles
1
-11.4 Apply fourth decal 0500000000
-11.4.1 Remove fourth decal from backing sheet
1
-11.4.2 Place on Rocket where indicated3
-11.4.3 Rub decal to remove bubbles
1
-11.5 Apply fifth decal 0500000000
-11.5.1 Remove fifth decal from backing sheet
1
-11.5.2 Place on Rocket where indicated 3
-11.5.3 Rub decal to remove bubbles1
-11.6 Apply sixth Decal 0500000000
-11.6.1 Remove sixth decal from backing sheet1
-11.6.2 Place on Rocket where indicated3
-11.6.3 Rub decal to remove bubbles
1
-11.7 Apply seventh Decal0500000000
-11.7.1 Remove seventh decal from backing
sheet1
-11.7.2 Place on Rocket where indicated3
-11.7.3 Rub decal to remove bubbles1
12.0 APPLYING CLEAR COAT | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 | 0 | 8 | 16
-12.1 Apply clear coat to entire rocket0000000808
12.1.1 Apply clear coat to entire rocket 8
12.1.2 Dry Completely8
13.0 DISPLAY NOZZLE ASSEMBLY | 8 | 0 | 8 | 0 | 0 | 0 | 9 | 0 | 0 | 8 | 33
-13.1 Spray Nozzle Base White0000009008
-13.1.1 Paint Nozzle #1 w/Silver Paint Pen2
-13.1.2 Paint Nozzle #2 w/ Silver Paint Pen2
-13.1.3 Paint Nozzle #3 w/ Silver Paint Pen2
-13.1.4 Paint Nozzle #4 w/ Silver Paint Pen3
-13.1.5 Allow to dry8
-13.2 Apply Glue8080000000
-13.2.1 Apply glue to tab on nozzle #12
-13.2.2 Place Nozzle #1 into hole on base2
-13.2.3 Apply glue to tab on nozzle #22
-13.2.4 Place Nozzle #2 into hole on base2
-13.2.5 Apply glue to tab on nozzle #32
-13.2.6 Place Nozzle #3 into hole on base2
-13.2.7 Apply glue to tab on nozzle #42
-13.2.8 Place Nozzle #4 into hole on base2
14.0 ROCKET PREFLIGHT | 42 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 42
14.1 prepare13000000000
-14.1.1 Remove Nose Cone from Rocket6
-14.1.2 Locate recovery wadding
1
-14.1.3 Insert 4-5 loosely crumpled squares of
recovery wadding6
14.2 Spike4000000000
-14.2.1 Pull parachute into a spike-see diagram
for clarification4
14.3 Fold4000000000
-14.3.1 Fold parachute according to diagram
4
14.4 Roll4000000000
-14.4.1 Roll parachute according to diagram
4
14.5 Re-insert17000000000
-14.5.1 Wrap lines loosly around rolled
parachute-see diagram for clarification5
-14.5.2 Insert parachute into body tube of rocket
6
-14.5.3 Insert shock cord into body tube of
rocket2
-14.5.4 Insert nose cone into body tube of rocket
4
15.0 PREPARE FOR TEST LAUNCH | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 32 | 0 | 32
-15.1 Insert Engine00000000320
-15.1.1 Remove engine
10
-15.1.2 Insert tip to touch propellant
10
-15.1.3 Insert engine into rocket
12
RESOURCE TOTALS | 154 | 103 | 88 | 23 | 17 | 16 | 17 | 56 | 32 | 137 | 643
Add resource totals as cross-check: 643
RESOURCE HOURLY RATES | $50.00 | $40.00 | $25.00 | $40.00 | $25.00 | $30.00 | $25.00 | $30.00 | $55.00 | -
RESOURCE COSTS | $7,700.00 | $4,120.00 | $2,200.00 | $920.00 | $425.00 | $480.00 | $425.00 | $1,680.00 | $1,760.00 | -
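As an illustrative cross-check of the rollup arithmetic (a sketch only; the hour totals and hourly rates are the ones listed above, and the Dummy column, which appears to capture unattended drying and waiting time, carries no rate):

# Cross-check: resource cost = total man-hours x hourly rate for each rated resource.
hours = {"Fitter": 154, "Draftsman": 103, "Gluer": 88, "Cutter": 23,
         "Sander I": 17, "Sander II": 16, "Painter I": 17,
         "Painter II": 56, "Engineer": 32}
rates = {"Fitter": 50.00, "Draftsman": 40.00, "Gluer": 25.00, "Cutter": 40.00,
         "Sander I": 25.00, "Sander II": 30.00, "Painter I": 25.00,
         "Painter II": 30.00, "Engineer": 55.00}

costs = {name: hours[name] * rates[name] for name in hours}
print(costs["Fitter"])      # 7700.0, matching the RESOURCE COSTS row
print(sum(costs.values()))  # 19710.0, the labor cost across the rated resources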
APPENDIX F
SCHEDULED START DATES
Name | Duration | Start Date | Finish Date
1.0 ASSEMBLE ENGINE MOUNT | 95 | 5/22/2006 | 6/5/2006
-1.1 Measure, Mark, and Cut Engine Tube | 35 | 5/22/2006 | 5/26/2006
-1.2 Cut Engine Tube | 2 | 5/26/2006 | 5/26/2006
-1.3 Glue, Tube, Assemble Hook | 7 | 5/26/2006 | 5/30/2006
-1.4 Assemble Mylar Ring to Tube | 9 | 5/30/2006 | 5/30/2006
-1.5 Assemble Yellow Engine Block to Engine Mount Tube | 10 | 5/31/2006 | 5/31/2006
-1.6 Assemble Centering Rings | 22 | 6/1/2006 | 6/2/2006
-1.7 Application of Glue Fillets | 10 | 6/5/2006 | 6/5/2006
2.0 FIN PREPARATION | 30 | 5/22/2006 | 5/25/2006
-2.1 Sand/Cut fins | 8 | 5/22/2006 | 5/22/2006
-2.2 Cutting Out Fins | 12 | 5/23/2006 | 5/24/2006
-2.3 Stack and Sand Fins | 10 | 5/24/2006 | 5/25/2006
3.0 MARK FIN AND LAUNCH LUG LINES | 33 | 5/22/2006 | 5/30/2006
-3.1 Cut – Tape | 13 | 5/22/2006 | 5/25/2006
-3.2 Remove Guide, Connect Fins and Lug Lines, Extend LL Line | 16 | 5/25/2006 | 5/30/2006
-3.3 Extend Launch Lug Line | 4 | 5/30/2006 | 5/30/2006
4.0 INSERTING ENGINE MOUNT | 43 | 6/6/2006 | 6/9/2006
-4.1 Mark Inside of Tube @ 5/8″ Where LL Is | 7 | 6/6/2006 | 6/6/2006
-4.2 Glue Tube | 5 | 6/6/2006 | 6/7/2006
-4.3 Assemble Engine Hook | 18 | 6/7/2006 | 6/8/2006
-4.4 Gluing Center Body Ring | 13 | 6/9/2006 | 6/9/2006
5.0 ATTACH FINS | 73 | 6/12/2006 | 6/16/2006
-5.1 Attach Fin #1 | 10 | 6/12/2006 | 6/13/2006
-5.2 Attach Fin #2 | 10 | 6/12/2006 | 6/13/2006
-5.3 Attach Fin #3 | 10 | 6/12/2006 | 6/13/2006
-5.4 Attach Fin #4 | 10 | 6/12/2006 | 6/13/2006
-5.5 Check Fin Alignment | 16 | 6/13/2006 | 6/15/2006
-5.6 Allow Glue to Dry | 17 | 6/15/2006 | 6/16/2006
6.0 ATTACH SHOCK CORD | 44 | 5/22/2006 | 5/26/2006
-6.1 Cut Out Shock Cord Mount | 5 | 5/22/2006 | 5/22/2006
-6.2 First Glue Application | 12 | 5/22/2006 | 5/24/2006
-6.3 Second Glue Application | 8 | 5/24/2006 | 5/25/2006
-6.4 Squeeze and Hold | 6 | 5/25/2006 | 5/25/2006
-6.5 Attaching Shock Cord Mount | 13 | 5/25/2006 | 5/26/2006
7.0 ASSEMBLE NOSE CONE | 16 | 5/22/2006 | 5/23/2006
-7.1 Glue Nose Cone | 16 | 5/22/2006 | 5/23/2006
8.0 ATTACH PARACHUTE/SHOCK CORD | 18 | 5/30/2006 | 6/2/2006
-8.1 Attach Lines | 7 | 5/30/2006 | 5/31/2006
-8.2 Attach Parachute | 5 | 5/31/2006 | 6/1/2006
-8.3 Tie Lines | 6 | 6/1/2006 | 6/2/2006
9.0 ATTACH LAUNCH LUG | 32 | 6/19/2006 | 6/22/2006
-9.1 Glue Launch Lines | 4 | 6/19/2006 | 6/19/2006
-9.2 Application of Glue Fillets | 28 | 6/19/2006 | 6/22/2006
10.0 PAINTING THE ROCKET | 64 | 6/22/2006 | 6/29/2006
-10.1 Apply First Coat | 16 | 6/22/2006 | 6/23/2006
-10.2 Sand | 8 | 6/23/2006 | 6/23/2006
-10.3 Apply Final Coat | 40 | 6/26/2006 | 6/29/2006
11.0 APPLICATION OF DECALS | 35 | 6/29/2006 | 7/6/2006
-11.1 Apply First Decal | 5 | 6/29/2006 | 6/29/2006
-11.2 Apply Second Decal | 5 | 6/29/2006 | 6/30/2006
-11.3 Apply Third Decal | 5 | 6/30/2006 | 6/30/2006
-11.4 Apply Fourth Decal | 5 | 6/30/2006 | 7/3/2006
-11.5 Apply Fifth Decal | 5 | 7/3/2006 | 7/5/2006
-11.6 Apply Sixth Decal | 5 | 7/5/2006 | 7/5/2006
-11.7 Apply Seventh Decal | 5 | 7/5/2006 | 7/6/2006
12.0 APPLYING CLEAR COAT | 16 | 7/6/2006 | 7/7/2006
-12.1 Apply Clear Coat to Entire Rocket | 16 | 7/6/2006 | 7/7/2006
13.0 DISPLAY NOZZLE ASSEMBLY | 32 | 7/10/2006 | 7/13/2006
-13.1 Spray Nozzle Base White | 18 | 7/10/2006 | 7/11/2006
-13.2 Apply Glue | 14 | 7/12/2006 | 7/13/2006
14.0 ROCKET PREFLIGHT | 42 | 7/13/2006 | 7/20/2006
-14.1 Prepare | 13 | 7/13/2006 | 7/17/2006
-14.2 Spike | 4 | 7/17/2006 | 7/17/2006
-14.3 Fold | 4 | 7/17/2006 | 7/18/2006
-14.4 Roll | 4 | 7/18/2006 | 7/18/2006
-14.5 Re-Insert | 17 | 7/18/2006 | 7/20/2006
15.0 PREPARE FOR TEST LAUNCH | 32 | 7/21/2006 | 7/26/2006
-15.1 Insert Engine | 32 | 7/21/2006 | 7/26/2006
APPENDIX G
RESPONSIBILITY ASSIGNMENT MATRIX
Member Teams (Deliverables Owners): Team A, Team B, Team C, Team D, Core Team, Alternate

Deliverable(s) | Owner / Core Team (Alternate) | Quality Check (Alternate)
1.0 Assemble Engine Mount | CORE (N/A) | Brian Kirouac (DoD Rep.)
2.0 Fin Preparation | CORE (N/A) | Tom Jones (DoD Rep.)
3.0 Mark Fin & Launch Lug Lines | CORE (N/A) | Brian Kirouac (DoD Rep.)
4.0 Inserting Engine Mount | CORE (N/A) | Tom Jones (DoD Rep.)
5.0 Attach Fins | CORE (N/A) | Brian Kirouac (DoD Rep.)
6.0 Attach Shock Cord | CORE (N/A) | Tom Jones (DoD Rep.)
7.0 Assemble Nose Cone | J.C. Bose (B.T. Linking) | Brian Kirouac (DoD Rep.)
8.0 Attach Parachute & Shock Cord Assembly | J.C. Bose (B.T. Linking) | Tom Jones (DoD Rep.)
9.0 Attach Launch Lug | J.C. Bose (B.T. Linking) | Brian Kirouac (DoD Rep.)
10.0 Painting the Rocket | Robert Muse (J.C. Bose) |
10.2 Sand | B.T. Linking (J.C. Bose) |
10.3 Apply White Primer | Robert Muse (J.C. Bose) |
11.0 Application of Decals | J.C. Bose (B.T. Linking) | Brian Kirouac (DoD Rep.)
12.0 Applying Clear Coat | Robert Muse (J.C. Bose) | Tom Jones (DoD Rep.)
13.0 Display Nozzle Assembly | CORE (N/A) | Brian Kirouac (DoD Rep.)
14.0 Rocket Pre-Flight | Tom Jones, CORE (N/A) | Team A (N/A)
15.0 Prepare for Test Launch | Team A, Brian Kirouac, CORE (N/A) | Team A, Tom Jones, CORE (N/A)
APPENDIX H
PERFORMANCE MEASUREMENT BASELINES
CATEGORY | WEEK 1 | WEEK 2 | WEEK 3 | WEEK 4 | WEEK 5 | WEEK 6 | WEEK 7 | WEEK 8 | WEEK 9 | WEEK 10 | TOTAL
Labor | $5,685.00 | $1,815.00 | $1,110.00 | $2,260.00 | $1,715.00 | $2,310.00 | $840.00 | $1,345.00 | $2,100.00 | $770.00 |
Material | $5,000.00 | | | | | | | | | |
Equipment/parts | $25,200.00 | | | | | | | | | |
TOTAL | $35,885.00 | $1,815.00 | $1,110.00 | $2,260.00 | $1,715.00 | $2,310.00 | $840.00 | $1,345.00 | $2,100.00 | $770.00 | $50,150.00
CUMULATIVE | $35,885.00 | $37,700.00 | $38,810.00 | $41,070.00 | $42,785.00 | $45,095.00 | $45,935.00 | $47,280.00 | $49,380.00 | $50,150.00 |

Cumulative Total Chart (Gauchito Project BCWS and EAC): EAC = $50,150.00
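The cumulative row and the EAC figure follow directly from the weekly totals; a minimal running-sum sketch (illustrative only, with the weekly figures as tabulated above):

# Running sum of the weekly totals reproduces the CUMULATIVE row and the EAC.
weekly_totals = [35885.00, 1815.00, 1110.00, 2260.00, 1715.00,
                 2310.00, 840.00, 1345.00, 2100.00, 770.00]

cumulative = []
running = 0.0
for week_total in weekly_totals:
    running += week_total
    cumulative.append(running)

print(cumulative)      # ends at 50150.0
print(cumulative[-1])  # estimate at completion (EAC) = $50,150.00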
APPENDIX I
MAJOR MILESTONES
Name | Finish Date
1.0 ASSEMBLE ENGINE MOUNT | 6/5/2006
2.0 FIN PREPARATION | 5/25/2006
3.0 MARK FIN AND LAUNCH LUG LINES | 5/30/2006
4.0 INSERTING ENGINE MOUNT | 6/9/2006
5.0 ATTACH FINS | 6/16/2006
6.0 ATTACH SHOCK CORD | 5/26/2006
7.0 ASSEMBLE NOSE CONE | 5/23/2006
8.0 ATTACH PARACHUTE/SHOCK CORD | 6/2/2006
9.0 ATTACH LAUNCH LUG | 6/22/2006
10.0 PAINTING THE ROCKET | 6/29/2006
11.0 APPLICATION OF DECALS | 7/6/2006
12.0 APPLYING CLEAR COAT | 7/7/2006
13.0 DISPLAY NOZZLE ASSEMBLY | 7/13/2006
14.0 ROCKET PREFLIGHT | 7/20/2006
15.0 PREPARE FOR TEST LAUNCH | 7/26/2006
APPENDIX J
KEY OR REQUIRED STAFF
TEAM A
Title | Name | Organization
Customer | | Pablo De Leon & Associates
Sponsor | Jeff Tyler | Pablo De Leon & Associates
Cost Financing | Jeanea Brown | Space Systems Technology
Engineer | N/A | Space Systems Technology
Project Manager | Julie Davis | Space Systems Technology

TEAM B
Title | Name | Organization
Quality Assessment | Brian Kirouac | Space Systems Technology
Quality Assessment | Tom Jones | Space Systems Technology

TEAM C
Title | Name | Organization
Functional Mgr. (Drafting) | B. Jose Alonzo | Space Systems Technology
Functional Mgr. (Painting) | Robert Muse | Space Systems Technology
Functional Mgr. (Finishing/Sanding) | Buford T. Linking | Space Systems Technology
Functional Mgr. (Cutting) | Ben E. Blades | Space Systems Technology
Functional Mgr. (Assembly/Gluer) | J. Christian Bose | Space Systems Technology
Functional Mgr. (Fitters/Cutters) | Charles Gooding | Space Systems Technology

TEAM D
Title | Name | Organization
Procurement | I.B. Buying | Space Systems Technology
APPENDIX K
KEY RISKS
Activity | Rank | Total Rating (Probability × Magnitude) | Planned Action
Key personnel unavailable | 1 | 0.05 | Ensure contractors are available if necessary.
Delayed delivery of materials and equipment | 2 | 0.05 | Make delivery a requirement in supplier contracts.
Damage to original parts provided in kit | 3 | 0.03 | Insurance and liability waiver.
Materials shortages | 4 | 0.02 | Contract with current supplier and identify alternate suppliers if needed.
Weather | 5 | 0.01 | Change schedule if necessary.

Probability scale: Low = 0.05, Med. = 0.1, High = 0.5. Magnitude-of-damage scale: Low = 0.1, Med. = 0.3, High = 0.5.
APPENDIX L
CONSTRAINTS
1. SST does not currently have the employee resources available during our project start-to-finish dates for the fitter positions. If the materials are not delivered in the time frame on which the project schedule was based, the project delivery date will slip, and there is no slack available to absorb that change. This affects the project human resource management knowledge area.
2. The materials must be delivered the Friday before project start. This affects the project procurement management knowledge area.
3. Project must be completed in 3 months' time. This affects the project time management knowledge area.
4. Estimated budget is not to exceed $63,000. This affects the project cost management knowledge area.
APPENDIX M
ASSUMPTIONS
1. All materials for the rocket will have been purchased by SST and received no later than the Friday before project start. (Recommended engines are as follows: 1/2A3-2T, A3-4T, A10-3T.) This is an external event that must occur for the project to be successful.
2. All activities associated with building the model rocket will follow the National Association of Rocketry (NAR) Safety Code. This covers implicit and explicit instructions.
3. All project management activities will use SST’s project tools, templates, and processes, based on PMI standards. This covers implicit and explicit instructions.
4. The project can be completed in 3 months' time and within an estimated budget of $63,000. This is covered by explicit and implicit instructions as well as a cost baseline analysis and the WBS and schedule.
APPENDIX N
CONSTRUCTION PLANS
WORKS CITED
Parts of the cost management plan were derived from:
1. Space Systems Technology – Cost baseline Cumulative S Curve (Space Systems Technology Distance Learning, Module 6 Cost Budgeting and Control, course PMGT605).
Parts of the schedule plan were derived from:
2. Cox, D. “SCHEDULE MANAGEMENT PLAN for Department of Energy BMIS-FM Project” (2000), retrieved April 27, 2006,
http://www.mbe.doe.gov/me2-5/i-manage/ENG503-2ScheduleManagementPlan.
3. McNeece, P. “SOFTWARE PROJECT MANAGEMENT PLAN FOR THE MJY TEAM,” retrieved April 27, 2006,
http://www.baz.com/kjordan/swse625/htm/spmp_0_1.htm.
Parts of the communications plan were derived from:
4. HRD Press. Retrieved from http://www.hrdpress.com.
Embedded worksheet: Gauchito Project BCWS and EAC (8 May 2006)
CATEGORY WEEK 1 WEEK 2 WEEK 3 WEEK 4 WEEK 5 WEEK 6 WEEK 7 WEEK 8 WEEK 9 WEEK 10
Labor $5,685.00 $1,815.00 $1,110.00 $2,260.00 $1,715.00 $2,310.00 $840.00 $1,345.00 $2,100.00 $770.00
Material $5,000.00
Equipment/parts $25,200.00
TOTAL $35,885.00 $1,815.00 $1,110.00 $2,260.00 $1,715.00 $2,310.00 $840.00 $1,345.00 $2,100.00 $770.00 $50,150.00
CUMULATIVE $35,885.00 $37,700.00 $38,810.00 $41,070.00 $42,785.00 $45,095.00 $45,935.00 $47,280.00 $49,380.00 $50,150.00
Cumulative Total Chart
EAC = $50,150.00
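As a cross-check, the cumulative row and the EAC figure follow directly from the weekly totals. The minimal Python sketch below reproduces them; the values are copied from the worksheet and the variable names are illustrative.

```python
# Cross-check of the weekly, cumulative, and EAC figures in the worksheet above.
from itertools import accumulate

labor = [5685, 1815, 1110, 2260, 1715, 2310, 840, 1345, 2100, 770]
material = [5000] + [0] * 9           # all material budgeted in week 1
equipment_parts = [25200] + [0] * 9   # equipment/parts budgeted in week 1

weekly = [l + m + e for l, m, e in zip(labor, material, equipment_parts)]
cumulative = list(accumulate(weekly))

print(weekly[0])       # 35885 -> $35,885.00 total in week 1
print(cumulative[-1])  # 50150 -> EAC = $50,150.00
```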
Resource types – estimates in man-hours for Duration Estimate
TASKS Fitter Draftsman Gluer Cutter Sander I Sander II Painter I Painter II Engineer Dummy Total
1.0 ASSEMBLE ENGINE MOUNT 14 30 7 4 0 0 0 0 0 40 95
1.1 Measure, Mark and Cut Engine Tube 5 30 0 0 0 0 0 0 0 0
-1.1.1 Lay ruler along engine tube 5
-1.1.2 Measure engine from left of engine tube @ 1/8″ 5
-1.1.3 Mark left end of Engine Tube @ 1/8″ 5
-1.1.4 Measure engine from left of engine tube @ 3/4″ 5
-1.1.5 Mark from left of Engine Tube @ 3/4″ 5
-1.1.6 Measure engine tube from left of engine tube @ 1 1/2″ 5
-1.1.7 Mark from left of Engine Tube @ 1 1/2″ 5
-1.2 Cut Engine Tube 0 0 0 2 0 0 0 0 0 0
-1.2.1 Cut Slit of 1/8″ @ 1 1/2 inch Mark on Engine Tube 2
-1.3 Glue, Tube, Assemble Hook 5 0 2 0 0 0 0 0 0 0
-1.3.1 Apply thin line of glue completely around engine at 3/4″ mark 2
-1.3.2 Position Hook per diagram 2
-1.3.3 Insert Engine Hook into 1/8″ Slit on Engine Mount Tube 3
-1.4 Assemble Mylar Ring to Tube 1 0 0 0 0 0 0 0 0 8
-1.4.1 Slide Mylar ring onto Engine Mount tube at 3/4″ mark 1
-1.4.2 Let Dry 8
-1.5 Assemble Yellow Engine Block to Engine Mount Tube 1 0 1 0 0 0 0 0 0 8
-1.5.1 Apply glue inside front of Engine Mount tube 1
-1.5.2 Insert Yellow Engine Block flush with the right end per diagram 1
-1.5.3 Let Dry 8
-1.6 Assemble Centering Rings 2 0 2 2 0 0 0 0 0 16
-1.6.1 Remove Centering rings from card with modeling knife 2
-1.6.2 Apply thin line of Glue around engine mount tube @ 1/8″ mark 1
-1.6.3 Slide notched Centering Ring onto glued line @ 1/8″ mark 1
-1.6.4 Let Glue Set 8
-1.6.5 Apply thin line of Glue to opposite side of notched center ring flush with end of engine mount tube 1
-1.6.6 Slide unnotched Centering Ring in place over glue flush with end of engine tube mount 1
-1.6.7 Let Dry 8
-1.7 Application of Glue Fillets 0 0 2 0 0 0 0 0 0 8
-1.7.1 Apply Glue Fillets to both sides of Centering Rings for reinforcement 2
-1.7.2 Let Dry 8
2.0 FIN PREPARATION 2 0 0 12 16 0 0 0 0 0 30
-2.1 Sand/Cut fins 0 0 0 0 8 0 0 0 0 0
-2.1.1 Sand Laser Cut Balsa Sheet w/Fine Sandpaper 8
-2.2 Cutting Out Fins 0 0 0 12 0 0 0 0 0 0
2.2.1 Cut out fin #1 w/modeling knife 3
2.2.2 Cut out fin #2 w/modeling knife 3
2.2.3 Cut out fin #3 w/ modeling knife 3
2.2.4 Cut out fin #4 w/modeling knife 3
-2.3 Stack and Sand Fins 2 0 0 0 8 0 0 0 0 0
-2.3.1 Stack Fins 2
-2.3.2 Sand Edges of fins 8
3.0 MARK FIN AND LAUNCH LUG LINES 19 12 0 2 0 0 0 0 0 0 33
-3.1 Cut – Tape 3 8 0 2 0 0 0 0 0 0
3.1.1 Cut out tube marking guide 2
-3.1.2 Tape tube marking guide around body tube 3
-3.1.3 Mark body tube at arrows 4
-3.1.4 Mark Launch Lug Line as LL on Body tube 4
-3.2 Remove guide, connect fins and lug lines, extend LL line 16 0 0 0 0 0 0 0 0 0
-3.2.1 Remove Tube Marking guide from body tube 4
-3.2.2 Connect Fins using door frame 4
-3.2.3 Connect launch lug lines using door frame 8
-3.3 Extend Launch Lug Line 0 4 0 0 0 0 0 0 0 0
-3.3.1 Extend launch lug line 3 3/4″ from end of tube 4
4.0 INSERTING ENGINE MOUNT 11 10 6 0 0 0 0 0 0 16 43
-4.1 Mark inside of tube @ 5/8″ where LL is 0 7 0 0 0 0 0 0 0 0
-4.1.1 Measure inside tube to 5/8″ position on tube 4
-4.1.2 Mark inside tube at 5/8″ 3
-4.2 Glue Tube 0 3 2 0 0 0 0 0 0 0
-4.2.1 Measure inside rear of body tube to 1 3/4″ position on tube 3
-4.2.2 Use finger to smear glue 1 3/4″ inside rear of body tube along LL. 2
-4.3 Assemble Engine Hook 10 0 0 0 0 0 0 0 0 8
-4.3.1 Align engine hook with LL line 5
-4.3.2 Insert engine mount into body tube until centering ring is even w/the 5/8″ glue mark 5
-4.3.3 Let Dry 8
-4.4 Gluing Center Body Ring 1 0 4 0 0 0 0 0 0 8
-4.4.1 Locate scrap piece of balsa to apply glue 1
-4.4.2 Apply glue to centering/body tube joint 4
-4.4.3 Let Dry 8
5.0 ATTACH FINS 20 16 20 0 0 0 0 0 0 17 73
-5.1 Attach Fin #1 4 0 5 0 0 0 0 0 0 1
-5.1.1 Apply thin layer of glue to edge of fin 3
-5.1.2 Allow to dry (1 minute for model) 1
-5.1.3 Apply second layer of glue to edge of fin 2
-5.1.4 Attach Fin to body tube along one of fin lines flush w/end 4
-5.2 Attach Fin #2 4 0 5 0 0 0 0 0 0 1
-5.2.1 Apply thin layer of glue to edge of fin#2 3
-5.2.2 Allow to dry (1 minute for model) 1
-5.2.3 Apply second layer of glue to edge of fin #2 2
-5.2.4 Attach Fin #2 to body tube along one of fin lines flush w/end 4
-5.3 Attach Fin #3 4 0 5 0 0 0 0 0 0 1
-5.3.1 Apply thin layer of glue to edge of fin #3 3
-5.3.2 Allow to dry (1 minute for model) 1
-5.3.3 Apply second layer of glue to edge of fin #3 2
-5.3.4 Attach Fin #3 to body tube along one of fin lines flush w/end 4
-5.4 Attach Fin #4 4 0 5 0 0 0 0 0 0 1
-5.4.1 Apply thin layer of glue to edge of fin #4 3
-5.4.2 Allow to dry (1 minute for model) 1
-5.4.3 Apply second layer of glue to edge of fin #4 2
-5.4.4 Attach Fin #4 to body tube along one of fin lines flush w/end 4
-5.5 Check Fin Alignment 0 16 0 0 0 0 0 0 0 0
-5.5.1 Check Fin #1 Alignment as shown in diagram 4
-5.5.2 Check Fin #2 Alignment as shown in diagram 4
-5.5.3 Check Fin #3 Alignment as shown in diagram 4
-5.5.4 Check Fin #4 Alignment as shown in diagram 4
-5.6 Allow glue to dry 4 0 0 0 0 0 0 0 0 13
-5.6.1 Let Glue Set 5
-5.6.2 Stand Rocket on end 4
-5.6.3 Let glue dry completely 8
6.0 ATTACH SHOCK CORD 16 0 19 5 0 0 0 0 0 8 48
-6.1 Cut out shock cord mount 0 0 0 5 0 0 0 0 0 0
-6.1.1 Cut out shock cord from front page 5
-6.2 First Glue Application 8 0 4 0 0 0 0 0 0 0
-6.2.1 Attach shock cord to shock cord mount 4
-6.2.2 Apply glue to shock cord mount 4
-6.2.3 Fold edge of shock cord mount forward over glued shock cord 4
-6.3 Second Glue Application 4 0 4 0 0 0 0 0 0 0
-6.3.1 Apply glue to shock cord mount 4
-6.3.2 Fold forward again-see diagram for clarification 4
-6.4 Squeeze and Hold 0 0 6 0 0 0 0 0 0 0
-6.4.1 Squeeze shock cord/shock cord mount tightly 2
-6.4.2 Hold for 1 minute 4
-6.5 Attaching Shock Cord Mount 4 0 5 0 0 0 0 0 0 8
-6.5.1 Glue mount 1″ inside body tube 4 4
-6.5.2 Hold until glue sets 1
-6.5.3 Let Dry Completely 8
7.0 ASSEMBLE NOSE CONE 4 0 4 0 0 0 0 0 0 8 16
-7.1 Glue nose cone 4 0 4 0 0 0 0 0 0 8
-7.1.1 Apply plastic cement to inside rim of nose cone 4
-7.1.2 Press Nose Cone Insert into place over plastic cement inside of nose cone rim 4
-7.1.3 Let Dry Completely 8
8.0 ATTACH PARACHUTE/SHOCK CORD 18 0 0 0 0 0 0 0 0 0 18
-8.1 Attach Lines 7 0 0 0 0 0 0 0 0 0
-8.1.1 Pass shroud line on parachute through eyelet 7
-8.2 Attach Parachute 5 0 0 0 0 0 0 0 0 0
-8.2.1 Pass parachute through loop in shroud-look to diagram for clarification 5
-8.3 Tie Lines 6 0 0 0 0 0 0 0 0 0
-8.3.1 Tie shock cord to nose cone using a double knot 6
9.0 ATTACH LAUNCH LUG 0 0 24 0 0 0 0 0 0 8 32
-9.1 Glue launch lines 0 0 4 0 0 0 0 0 0 0
-9.1.1 Glue LL centered onto LL Line on rocket body 4
-9.2 Application of Glue Fillets 0 0 20 0 0 0 0 0 0 8
-9.2.1 Apply glue fillets along launch lug 4
-9.2.2 Apply glue fillets along fin/body tube joints 12
-9.2.3 Smooth each fillet with finger 4
-9.2.4 Let glue dry completely 8
10.0 PAINTING THE ROCKET 0 0 0 0 1 16 8 48 0 24 97
-10.1 Apply first coat 0 0 0 0 0 0 8 0 0 8
-10.1.1 Spray rocket with white primer 8
-10.1.2 Let Dry 8
-10.2 Sand 0 0 0 0 1 16 0 0 0 0
-10.2.1 Sand entire rocket 1 16
-10.3 Apply final coat 0 0 0 0 0 0 0 48 0 16
-10.3.1 Spray completed rocket with white second coat of primer 16
-10.3.2 Let Dry 8
-10.3.3 Spray Nose Cone with Copper paint 32
-10.3.4 Let Dry 8
11.0 APPLICATION OF DECALS 0 35 0 0 0 0 0 0 0 0 35
-11.1 Apply first decal 0 5 0 0 0 0 0 0 0 0
-11.1.1 Remove First decal from back sheet 1
-11.1.2 Place on Rocket where indicated 3
-11.1.3 Rub decal to remove bubbles 1
-11.2 Apply second decal 0 5 0 0 0 0 0 0 0 0
-11.2.1 Remove second decal from backing sheet 1
-11.2.2 Place on Rocket where indicated 3
-11.2.3 Rub decal to remove bubbles 1
-11.3 Apply third decal 0 5 0 0 0 0 0 0 0 0
-11.3.1 Remove third decal from backing sheet 1
-11.3.2 Place on Rocket where indicated 3
-11.3.3 Rub decal to remove bubbles 1
-11.4 Apply fourth decal 0 5 0 0 0 0 0 0 0 0
-11.4.1 Remove fourth decal from backing sheet 1
-11.4.2 Place on Rocket where indicated 3
-11.4.3 Rub decal to remove bubbles 1
-11.5 Apply fifth decal 0 5 0 0 0 0 0 0 0 0
-11.5.1 Remove fifth decal from backing sheet 1
-11.5.2 Place on Rocket where indicated 3
-11.5.3 Rub decal to remove bubbles 1
-11.6 Apply sixth Decal 0 5 0 0 0 0 0 0 0 0
-11.6.1 Remove sixth decal from backing sheet 1
-11.6.2 Place on Rocket where indicated 3
-11.6.3 Rub decal to remove bubbles 1
-11.7 Apply seventh Decal 0 5 0 0 0 0 0 0 0 0
-11.7.1 Remove seventh decal from backing sheet 1
-11.7.2 Place on Rocket where indicated 3
-11.7.3 Rub decal to remove bubbles 1
12.0 APPLYING CLEAR COAT 0 0 0 0 0 0 0 8 0 8 16
-12.1 Apply clear coat to entire rocket 0 0 0 0 0 0 0 8 0 8
12.1.1 Apply clear coat to entire rocket 8
12.1.2 Dry Completely 8
13.0 DISPLAY NOZZLE ASSEMBLY 8 0 8 0 0 0 9 0 0 8 33
-13.1 Spray Nozzle Base White 0 0 0 0 0 0 9 0 0 8
-13.1.1 Paint Nozzle #1 w/Silver Paint Pen 2
-13.1.2 Paint Nozzle #2 w/ Silver Paint Pen 2
-13.1.3 Paint Nozzle #3 w/ Silver Paint Pen 2
-13.1.4 Paint Nozzle #4 w/ Silver Paint Pen 3
-13.1.5 Allow to dry 8
-13.2 Apply Glue 8 0 8 0 0 0 0 0 0 0
-13.2.1 Apply glue to tab on nozzle #1 2
-13.2.2 Place Nozzle #1 into hole on base 2
-13.2.3 Apply glue to tab on nozzle #2 2
-13.2.4 Place Nozzle #2 into hole on base 2
-13.2.5 Apply glue to tab on nozzle #3 2
-13.2.6 Place Nozzle #3 into hole on base 2
-13.2.7 Apply glue to tab on nozzle #4 2
-13.2.8 Place Nozzle #4 into hole on base 2
14.0 ROCKET PREFLIGHT 42 0 0 0 0 0 0 0 0 0 42
14.1 Prepare 13 0 0 0 0 0 0 0 0 0
-14.1.1 Remove Nose Cone from Rocket 6
-14.1.2 Locate recovery wadding 1
-14.1.3 Insert 4-5 loosely crumpled squares of recovery wadding 6
14.2 Spike 4 0 0 0 0 0 0 0 0 0
-14.2.1 Pull parachute into a spike-see diagram for clarification 4
14.3 Fold 4 0 0 0 0 0 0 0 0 0
-14.3.1 Fold parachute according to diagram 4
14.4 Roll 4 0 0 0 0 0 0 0 0 0
-14.4.1 Roll parachute according to diagram 4
14.5 Re-insert 17 0 0 0 0 0 0 0 0 0
-14.5.1 Wrap lines loosely around rolled parachute-see diagram for clarification 5
-14.5.2 Insert parachute into body tube of rocket 6
-14.5.3 Insert shock cord into body tube of rocket 2
-14.5.4 Insert nose cone into body tube of rocket 4
15.0 PREPARE FOR TEST LAUNCH 0 0 0 0 0 0 0 0 32 0 32
-15.1 Insert Engine 0 0 0 0 0 0 0 0 32 0
-15.1.1 Remove engine 10
-15.1.2 Insert tip to touch propellant 10
-15.1.3 Insert engine into rocket 12
RESOURCE TOTALS 154 103 88 23 17 16 17 56 32 137 643
Add resource totals as cross check 643
RESOURCE HOURLY RATES $50.00 $40.00 $25.00 $40.00 $25.00 $30.00 $25.00 $30.00 $55.00
RESOURCE COSTS $7,700.00 $4,120.00 $2,200.00 $920.00 $425.00 $480.00 $425.00 $1,680.00 $1,760.00 $19,710.00
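As a cross-check of the cost row above, each resource cost is simply total man-hours multiplied by the hourly rate. The short Python sketch below reproduces the $19,710 total; the names are illustrative, and the unrated "Dummy" column (apparently drying/waiting time) is excluded because it carries no rate.

```python
# Cross-check: resource cost = total man-hours x hourly rate.
# The "Dummy" column has no hourly rate and is therefore excluded.
hours = {"Fitter": 154, "Draftsman": 103, "Gluer": 88, "Cutter": 23,
         "Sander I": 17, "Sander II": 16, "Painter I": 17, "Painter II": 56,
         "Engineer": 32}
rates = {"Fitter": 50, "Draftsman": 40, "Gluer": 25, "Cutter": 40,
         "Sander I": 25, "Sander II": 30, "Painter I": 25, "Painter II": 30,
         "Engineer": 55}

costs = {r: hours[r] * rates[r] for r in hours}
print(costs["Fitter"])      # 7700  -> $7,700.00
print(sum(costs.values()))  # 19710 -> matches the $19,710.00 total
```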
Embedded drawing: Gauchito Network Diagram (node data recovered below; durations and times in hours)
Task | Duration | Early Start | Early Finish | Late Start | Late Finish | Slack
1: Assemble Engine Mount | 95h | 0 | 95 | 0 | 95 | 0
2: Fin Preparation | 30h | 0 | 30 | 108 | 138 | 108
3: Mark Fin & LL Lines | 33h | 0 | 33 | 62 | 95 | 62
4: Insert Engine Mount | 43h | 95 | 138 | 95 | 138 | 0
5: Attach Fins | 73h | 138 | 211 | 138 | 211 | 0
6: Attach Shock Cord | 44h | 0 | 44 | 196 | 240 | 196
7: Assemble Nose Cone | 16h | 0 | 16 | 224 | 240 | 224
8: Attach Chute/Shock Cord | 3h | 44 | 47 | 240 | 243 | 196
9: Attach Launch Lug | 32h | 211 | 243 | 211 | 243 | 0
10: Painting the Rocket | 64h | 243 | 307 | 243 | 307 | 0
11: Application of Decals | 35h | 307 | 342 | 307 | 342 | 0
12: Applying Clear Coat | 16h | 342 | 358 | 342 | 358 | 0
13: Display Nozzle Assembly | 32h | 0 | 32 | 400 | 432 | 400
14: Rocket Pre-Flight | 42h | 358 | 400 | 358 | 400 | 0
15: Prepare for Test Launch | 32h | 400 | 432 | 400 | 432 | 0
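The early/late start-finish values in the table above are the standard critical-path forward and backward passes over the task network. The Python sketch below reproduces them; note that the predecessor lists are inferred from the tabulated times rather than stated explicitly in the plan, so they should be read as assumptions.

```python
# Minimal critical-path (CPM) sketch that reproduces the ES/EF/LS/LF/slack
# figures tabulated above. Predecessor lists are inferred, not from the plan.

# task id -> (duration in hours, predecessor task ids)
tasks = {
    1: (95, []), 2: (30, []), 3: (33, []), 6: (44, []), 7: (16, []), 13: (32, []),
    4: (43, [1, 3]), 5: (73, [4, 2]), 8: (3, [6, 7]), 9: (32, [5]),
    10: (64, [9, 8]), 11: (35, [10]), 12: (16, [11]), 14: (42, [12]), 15: (32, [14]),
}

# Forward pass: earliest start/finish (task ids happen to be in dependency order).
es, ef = {}, {}
for t in sorted(tasks):
    dur, preds = tasks[t]
    es[t] = max((ef[p] for p in preds), default=0)
    ef[t] = es[t] + dur

project_end = max(ef.values())   # 432 hours

# Backward pass: latest start/finish and slack (zero slack = critical path).
succs = {t: [s for s in tasks if t in tasks[s][1]] for t in tasks}
lf, ls, slack = {}, {}, {}
for t in sorted(tasks, reverse=True):
    dur, _ = tasks[t]
    lf[t] = min((ls[s] for s in succs[t]), default=project_end)
    ls[t] = lf[t] - dur
    slack[t] = ls[t] - es[t]

for t in sorted(tasks):
    mark = "  <- critical" if slack[t] == 0 else ""
    print(f"Task {t:2d}: ES={es[t]:3d} EF={ef[t]:3d} LS={ls[t]:3d} LF={lf[t]:3d} slack={slack[t]:3d}{mark}")
```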
Judgment and Decision Making, Vol. 11, No. 2, March 2016, pp. 147–167
Backward planning: Effects of planning direction on predictions of
task completion time
Jessica Wiese∗ Roger Buehler† Dale Griffin‡
Abstract
People frequently underestimate the time needed to complete tasks and we examined a strategy – known as backward
planning – that may counteract this optimistic bias. Backward planning involves starting a plan at the end goal and then
working through required steps in reverse-chronological order, and is commonly advocated by practitioners as a tool for
developing realistic plans and projections. We conducted four experiments to test effects on completion time predictions and
related cognitive processes. Participants planned for a task in one of three directions (backward, forward, or unspecified) and
predicted when it would be finished. As hypothesized, predicted completion times were longer (Studies 1–4) and thus less
biased (Study 4) in the backward condition than in the forward and unspecified conditions. Process measures suggested that
backward planning may increase attention to situational factors that delay progress (e.g., obstacles, interruptions, competing
demands), elicit novel planning insights, and alter the conceptualization of
time.
Keywords: prediction, planning fallacy, task completion time, debiasing, optimistic bias
1 Introduction
The ability to accurately predict when an upcoming task will
be finished is important in many areas of life. People make
decisions, choices, and binding commitments on the basis of
completion time predictions, so errors can be costly. How-
ever, a substantial collection of research suggests that peo-
ple commonly underestimate the time needed to complete
tasks. In the present research, we examine a strategy that
has been suggested as a prophylactic against this optimistic
bias. In particular, we provide the first empirical test of an
approach to planning – known as backward planning – that
is often advocated by practitioners in applied settings. Back-
ward planning involves starting a plan at the time of comple-
tion and working back through the required steps in reverse-
chronological order. Our main objective is to test whether
backward planning helps people to arrive at more realistic
predictions of task completion time.
This research is part of a doctoral dissertation submitted by the first au-
thor to Wilfrid Laurier University, and was supported by a grant to the sec-
ond and third authors from the Social Sciences and Humanities Research
Council of Canada (grant number 435–2013–1322).
Copyright: © 2016. The authors license this article under the terms of
the Creative Commons Attribution 3.0 License.
∗Wilfrid Laurier University.
†Corresponding author: Psychology Department, Wilfrid Laurier Uni-
versity, 75 University Avenue West, Waterloo, ON, Canada, N2L 3C5.
Email: rbuehler@wlu.ca.
‡University of British Columbia.
1.1 Bias in completion time predictions
Previous research indicates that people commonly underes-
timate how long it will take to finish tasks. Much of this
work has documented the phenomenon known as the plan-
ning fallacy (Kahneman & Tversky, 1979), a form of op-
timistic bias wherein people underestimate the time it will
take to complete an upcoming task even though they real-
ize that similar tasks have taken longer in the past (for re-
views see Buehler, Griffin & Peetz, 2010; Buehler & Griffin,
2015). The basic tendency to underestimate task completion
times (i.e., an underestimation bias or optimistic bias) has
been documented for a wide range of personal, academic,
and work-related tasks (e.g., Buehler, Griffin & Ross, 1994;
Byram, 1997; Griffin & Buehler, 1999; Kruger & Evans,
2004; Min & Arkes, 2012; Roy, Christenfeld & McKenzie,
2005; Taylor, Pham, Rivkin & Armor, 1998).
However, this robust optimistic bias in task completion
prediction does not imply that people tend to underestimate
how much time they will spend working on a task. Indeed,
researchers and theorists have distinguished between predic-
tions of performance time (i.e., the amount of time spent
working on the target task itself) and completion time (i.e.,
when the task is finished) (Buehler, Griffin & Peetz, 2010;
Halkjelsvik & Jørgensen, 2012). These are very different
predictions and their accuracy depends on different factors.
Task completion times depend not only on the performance
time for the target task but also on the time taken by fac-
tors external to the task, such as competing activities, in-
terruptions, delays, and procrastination. Consequently, pre-
dictions of task completion time appear to be more prone
to optimistic bias than are predictions of task performance
time (Buehler et al., 1994; Buehler et al., 2010; Halkjelsvik
& Jørgensen, 2012).
Although there are multiple reasons why people underes-
timate task completion times, one of the key contributors to
bias, somewhat ironically, is people’s tendency to base pre-
dictions on a plan for carrying out the task. To arrive at a
prediction, people often generate a plan-based scenario or
simulation that depicts the sequence of steps that will lead
from the beginning to successful conclusion of a project
(Buehler et al., 1994; Buehler & Griffin, 2003). This ap-
proach leaves them prone to bias. Mental scenarios typically
do not provide a comprehensive and thorough representation
of future events; instead, scenarios are idealized, schematic,
and oversimplified, in that they focus on a few central fea-
tures and omit peripheral or non-schematic elements (Dun-
ning, 2007; Liberman, Trope & Stephan, 2007). Further-
more, given that people plan for success rather than failure,
plan-based scenarios tend to focus on positive rather than
negative possibilities (Newby-Clark, Ross, Buehler, Koehler
& Griffin, 2000). In short, the tendency to underestimate
completion times stems partly from limitations in how peo-
ple imagine or plan for an upcoming task.
We examine a cognitive strategy that might counter these
problems, and thus our work contributes to an emerging
literature on the “debiasing” of optimistic task completion
predictions (Buehler & Griffin, 2015; Buehler, Griffin &
Peetz, 2010; Halkjelsvik & Jørgensen, 2012). Several exist-
ing strategies sidestep the problems associated with overly
optimistic plans by prompting predictors to focus on “out-
side” information, that is, information other than their plans
for the specific target task (e.g., previous completion times,
estimates from neutral observers). For instance, the strategy
of “reference class forecasting” requires forecasters to base
predictions on a distribution of outcomes from comparable
previous projects (Flyvbjerg, 2008; Lovallo & Kahneman,
2003), and empirical tests support the effectiveness of this
strategy in reducing time and cost overruns in large scale
construction projects (Flyvbjerg, 2008; Flyvbjerg, Garbuio &
Lovallo, 2009). Similarly, research on smaller, individual
projects found that prompting people to base predictions on
past experience (by highlighting the relevance of previous
completion times) resulted in unbiased predictions (Buehler
et al., 1994). Note, however, that such strategies are most
applicable in those relatively rare prediction contexts where
a class of comparable projects can be readily identified.
Other interventions encourage predictors to unpack or
decompose the target task into smaller segments (Byram,
1997; Connolly & Dean, 1997; Forsyth & Burt, 2008;
Kruger & Evans, 2004). Given that plans generated holis-
tically tend to be incomplete and oversimplified, breaking
down a larger task into smaller sub-tasks may highlight steps
that need to be completed, but would otherwise have been
overlooked (Kruger & Evans, 2004). Tests of this strategy
have yielded somewhat mixed results. Kruger and Evans
found that unpacking reduced prediction bias (for a simi-
lar “segmentation effect” see Forsyth & Burt, 2008), how-
ever other studies found that similar strategies were not ef-
fective (Byram, 1997; Connolly & Dean, 1997). Unpack-
ing appears to be less effective if there are few task com-
ponents (Kruger & Evans, 2004), if the unpacked compo-
nents will be easy to carry out (Hadjichristidis, Summers &
Thomas, 2014), or if the tasks are in the distant future (Mo-
her, 2012). Moreover, sometimes asking predictors to de-
velop a detailed, concrete plan can actually exacerbate the
optimistic bias in prediction (Buehler & Griffin, 2003), sug-
gesting there is a risk that such strategies could backfire.
Interventions that focus attention directly on potential ob-
stacles or problems have also produced mixed outcomes.
Some studies have found that people predict longer comple-
tion times if they are prompted to focus on potential obsta-
cles (Peetz, Buehler & Wilson, 2010). In other studies, how-
ever, people’s predictions were not influenced by instruc-
tions to consider potential problems or surprises (Byram,
1997; Hinds, 1999) or to generate scenarios that differed
from their initial plans (Newby-Clark et al., 2000). When
people are confronted directly with potential obstacles, they
may be reluctant to incorporate this information into their
predictions. Their desire to complete the task promptly may
elicit a form of motivated reasoning (Kunda, 1990) or desir-
ability bias (Krizan & Windschitl, 2007) wherein they dis-
miss the relevance of undesirable possibilities. Interestingly,
people instructed to imagine a task from the perspective of
an outside observer may be less prone to these motivated
reasoning processes (Buehler, Griffin, Lam & Deslauriers,
2012), and more willing to contemplate potential obstacles.
In sum, although previous research has identified several
promising debiasing strategies, there appear to be limits to
their applicability and effectiveness. The strategy examined
in the present research – backward planning – capitalizes on
predictors’ natural inclination to base predictions on plan-
based scenarios, but induces them to generate plans in a
manner that might avoid the usual pitfalls.
1.2 The backward planning strategy
Our research was inspired by ideas gaining currency in ap-
plied fields of project management, where practitioners ad-
vocate the use of backward planning (also referred to as
back-planning or back-casting; Lewis, 2002; Verzuh, 2005).
Backward planning involves starting with the target goal
or completion time in mind, and working back toward the
present by identifying the steps needed to attain the goal
in reverse-chronological order. The earliest references to
backward planning emerged in the development of forecast-
ing models for long-term (e.g., 30–50 year) issues related to
socioeconomic and resource policy (e.g., future energy de-
mands, Lovins, 1976; sustainable transport systems, Robin-
son, 1982; Baltic Sea exploration, Dreborg, Hunhammar,
Kemp-Benedict, Raskin, 1999). More recently, backward
planning has been advocated in the practitioner literature for
smaller projects in organizational contexts such as educa-
tion, government, and business (Lewis, 2002; Verzuh, 2005,
Wiggins & McTighe, 1998). Even closer to home, back-
ward planning is commonly recommended for tasks that in-
dividuals carry out in everyday life, such as school assign-
ments, work-related tasks, and personal projects (e.g., Flem-
ing, 2010; Rutherford, 2008; Saintamour, 2008; The Ball
Foundation, 2007).
In each of these contexts, it has been argued that back-
ward planning can yield unique insights that would not be
derived from a traditional chronological planning process.
A common theme is that backward planning provides plan-
ners with a novel perspective that prompts them to attend
to information that would otherwise be neglected. For ex-
ample, it has been suggested that backward planning helps
people to: identify more clearly the steps they will need to
take, appreciate how steps are dependent on one another,
and anticipate potential obstacles. However, to our knowl-
edge, no empirical research has been conducted to support
such claims. Our studies provide the first empirical exami-
nation of backward planning and, to ensure the findings have
widespread practical relevance, target the kinds of tasks and
projects that people carry out in the course of everyday life.
1.3 Effects of backward planning
Our main hypothesis, in line with the anecdotal evidence
reviewed above, is that backward planning may lead peo-
ple to generate later, and thus more realistic, predictions of
task completion time. We also sought to explore cognitive
processes underlying this effect. Thus, we considered sev-
eral cognitive processes that have been shown to affect task
completion predictions, and could be influenced by back-
ward planning.
First, backward planning may counter people’s natural in-
clination to focus on an idealized and hence highly-fluent
scenario of task completion. Backward planning prompts
people to adopt a novel temporal outlook that may disrupt
the chronological, narrative structure of plan-based scenar-
ios. Consequently, backward planners should be less likely
to rely upon a schematic or idealized task representation.
Consistent with this reasoning, research on temporal direc-
tion in memory has shown that instructions to recall a se-
ries of past events in reverse-chronological order results in
fewer schema-based intrusions in memory (Geiselman &
Callot, 1990; Geiselman, Fisher, MacKinnon & Holland,
1986). Along similar lines, backward planning may lead
predictors to focus less exclusively on central, schematic in-
formation (e.g., a plan for successful task completion) and
focus more on the kinds of information that are typically ne-
glected (e.g., additional steps, potential obstacles, and com-
peting demands on their time). In other words, backward
planning may disrupt the fluent planning process that typi-
cally leads to a focus on successful completion, and instead
raise the salience of possible barriers to completion.
Another intriguing possibility is that backward planning
may shift the planner’s perception of the flow of time. Peo-
ple can view the passing of time either as the individual
moving through time (ego motion perspective) or as time
moving toward the individual (time motion perspective)
(Boroditsky, 2000; Clark, 1973). Because backward plan-
ning requires moving cognitively from the future back to-
ward the present, it may emphasize the flow of time and
induce a time motion perspective. In a relevant study (Boltz
& Yum, 2010), participants were induced to adopt either a
time motion or ego motion perspective using visual scenes
(e.g., clouds moving toward the person vs. the person mov-
ing toward clouds) or linguistic cues, and then predicted
how long tasks would take to perform. Adopting a time
motion perspective reduced the underestimation bias in task
predictions, and the authors suggest that this was because
the time motion perspective makes deadlines seem closer.
Thus, backward planning might result in less optimistic pre-
dictions of task completion time, in part, because it leads
people to adopt a time motion perspective and hence feel
closer to the deadline.
1.4 Present studies and hypotheses
We conducted four experiments to test effects of planning
direction on task completion predictions and related cogni-
tions. In each study, we asked participants to develop a plan
for completing a target task, and manipulated the tempo-
ral direction of their planning. Participants were randomly
assigned to either a backward planning condition, forward
planning condition, or a control condition where direction
was unspecified. After planning for the task, participants
made a series of time predictions. The primary dependent
variable was their prediction of when they would finish the
target task (task completion time). Participants also pre-
dicted when they would start working on the task (start time)
and how much actual working time it would take (perfor-
mance time). In Study 4, participants also reported actual
completion times in a follow-up session, allowing us to ex-
amine the degree of bias in their predictions. In each study,
after reporting their time predictions, participants completed
a set of process measures that assessed the degree to which
the planning exercise elicited novel insights (e.g., led them
to clarify the steps they would need to take, to think of steps
they wouldn’t have thought of otherwise, to think of poten-
tial problems or obstacles they could encounter), led them
to anticipate potential obstacles or to think of more plan-
ning steps, as well as their perception of the flow of time
(time motion vs. ego motion perspective).1
1Instructions and measures for each study are in the
materials supplement. These include several extra measures (e.g.,
Our main hypotheses concern the predictions of task com-
pletion time, as previous research suggests that this type of
prediction is particularly susceptible to optimistic bias. We
expected that participants would predict longer task com-
pletion times (Hypothesis 1), and thus be less prone to un-
derestimate actual completion times (Hypothesis 2) in the
backward planning condition than in the forward and un-
specified conditions. Although the planning process might
also influence when people actually finish tasks (e.g., Goll-
witzer, 1999; Taylor et al., 1998), previous research sug-
gests that planning processes have a greater impact on pre-
dicted than on actual completion times (Buehler & Griffin,
2003, Buehler, Griffin & MacDonald, 1997). Thus, to the
extent that backward planning leads people to predict later
task completion times, it should also make them less prone
to underestimate their
actual completion times.
We examined predicted start times and performance times
to shed additional light on where backward planning ex-
erted its effects. If backward planning disrupts the fluency
of planning processes, as we have proposed, this could shift
the whole set of planning milestones later in time, leading to
a shift in predicted start times as well as completion times.
That is, backward planners may be more aware of potential
delays at each planning milestone – including task initiation.
Thus, our working hypothesis was that predicted start times
would also be delayed in the backward condition. However,
there are other reasonable possibilities. Backward planning
might lead people to predict finishing tasks later, but not
starting them later, if it draws attention to delays that would
occur only after starting the task. Moreover, an increased
focus on potential delays could even prompt participants to
plan earlier start times in order to accommodate the delays.
We were also uncertain whether backward planning would
affect predictions of performance time, given that previous
research has shown people are less prone to underestimate
the time they spend on the task itself (Buehler, Griffin &
Peetz, 2010; Halkjelsvik & Jørgensen, 2012). Our work-
ing hypothesis was that backward planning would influence
predicted completion times, but not performance times, by
drawing attention to obstacles external to the task itself.
We also examined the process measures to test whether,
consistent with our theorizing, participants in the backward
planning condition would report experiencing more novel
planning insights, anticipate more potential obstacles, in-
clude more steps in their plans and be more likely to adopt a
time motion perspective than participants in the forward and
unspecified conditions.
perceived control, perceived time pressure, and perceived difficulty of
planning) that are not discussed because they were not obtained in each
study and did not yield consistent effects. Results for these measures are
summarized in the results supplement (see Table S5).
2 Study 1: Date night
Study 1 provided an initial test of the effects of planning
direction on prediction. To enhance experimental control,
the study used a standard target task: Participants imagined
a scenario used in previous research (Kruger & Evans, 2004)
in which they needed to prepare for an upcoming romantic
date. They were instructed to develop a detailed plan for this
task in one of three temporal directions (backward, forward,
or unspecified) and then predict how soon they would be
finished. It was hypothesized that participants would predict
later completion times in the backward condition than in the
forward and unspecified conditions.
2.1 Method
2.1.1 Participants
Initially 239 undergraduate psychology students were re-
cruited for the study, however, seven participants were ex-
cluded because they did not complete the planning exercise
(n = 4) or the dependent measures (n = 3). The final sample
consisted of 232 participants (50 male, 179 female, 3 other
identity) between the ages of 17 and 37 (M = 19.24, SD =
1.98) compensated with
course credit.
2.1.2 Procedure
Participants completed a self-administered online survey ex-
amining how people plan for future events. Participants first
provided demographic information including age, gender,
and year in university. Participants were then presented with
a scenario (Kruger & Evans, 2004) in which they needed to
prepare for a dinner date, and were asked to imagine it as
though it was actually happening. In this scenario, the par-
ticipant had recently met someone and arranged for a date
at a fancy restaurant on Saturday at 8:00 p.m. It was now
Saturday at 2:00 p.m. and the participant had no plans for
the afternoon except getting ready for the date.
Participants were asked to develop a detailed plan of the
actions they would take to prepare for the date. To guide
their planning, participants were provided with a “timeline”
spanning the period between the present (2:00 p.m.) and the
time of the date (8:00 p.m.) broken into 30 minute inter-
vals. Each interval was accompanied by an expandable text
box, and participants were instructed to list all the steps they
would take to get ready for the date, beginning each sepa-
rate step on a new line, and to state “no plans” for any time
interval when they would not be preparing for the date.
To manipulate planning direction, participants were ran-
domly assigned to one of three conditions. In the backward
planning condition, participants were instructed: “We want
you to develop your plan in a particular way called backward
planning. Backward planning involves starting with the very
last step that needs to be taken and then moving back from
there in a reverse-chronological order. That is, you should
try to picture in your mind the steps you will work through
in order to reach your goal (getting ready for your date) in a
backward direction.” Corresponding with these instructions,
the timeline was presented in reverse-chronological order
(i.e., the top text box was labeled 8:00 p.m. and the bottom
one labeled 2:00 p.m.) and participants were reminded to
work through it in that order. In the forward planning con-
dition, participants received parallel instructions to plan in
a forward direction, and, corresponding with these instruc-
tions, the timeline was presented in a chronological order. In
the unspecified planning condition, the instructions did not
specify a temporal direction. Although the text boxes were
again presented in chronological order, participants could
choose to work through them in any order.
Time predictions: The primary dependent variable was
the prediction of task completion time. Participants were
asked to indicate the time (hour and minute) they would be
ready for the date. Participants also predicted the time they
would start getting ready (i.e., task start time) and how long
it would take to get ready (i.e., task performance time).
Process measures: We counted the number of separate
steps that participants listed in their plans. Also, after gener-
ating their time predictions, participants completed several
measures concerning their perceptions of the planning exer-
cise and the target task.
Perceived insights. Four items assessed participants’ per-
ceptions of whether the planning exercise resulted in novel
insights. Participants rated the extent to which they agreed
(1 = Strongly disagree, 7 = Strongly agree) that the plan-
ning exercise: “Helped me clarify the steps I would need to
take to prepare for a date”, “Made me think of steps that
I wouldn’t have thought of otherwise”, “Made me break
down my plans into important steps”, and “Made me think
of potential problems or obstacles I could encounter”. These
items were averaged to form an index of perceived planning
insights (α = .82, M = 4.02, SD = 1.25).
Potential obstacles. Four items assessed the anticipation
of obstacles or problems that could arise. Using a scale from
1 (Not at all) to 7 (Extremely), participants rated how dif-
ficult it would be to stick to their plan, and how likely it
was they would: “Need to carry out extra steps they didn’t
think to include in their plans”, “Encounter problems when
preparing for the date”, and “Be delayed by interruptions or
distractions from outside events”. These items were aver-
aged to form an index of potential obstacles (α = .67, M =
4.00, SD = 1.18).
Motion perspective. To measure motion perspective, par-
ticipants were asked to imagine that the date originally
scheduled for 8:00 p.m. had to be rescheduled and moved
forward 1 hour, and to indicate the new time of the date
(adapted from McGlone & Harding, 1998). Participants
who responded “9:00 p.m.” were coded as having an ego
motion perspective; interpreting the forward time change
as later suggests they adopted an orientation in which they
were moving toward the time of the date. Those who re-
sponded “7:00 p.m.” were coded as having a time motion
perspective; interpreting the forward time change as earlier
suggests they had the perception that the time of the date
was moving toward them.
2.2 Results
To examine effects of planning direction, each dependent
measure was submitted to a regression analysis that in-
cluded two orthogonal contrasts. The first contrast provides
a powerful, focused test of our hypothesis by pitting back-
ward planning against the forward and unspecified condi-
tions (backward = 2, forward = –1, unspecified = –1). The
second contrast compares the forward and unspecified con-
ditions (backward = 0, forward = 1, unspecified = –1) which
were not expected to differ. Because gender differences
were plausible for the date preparation task, the regressions
also included gender (male = 1, female = –1) and its inter-
action with each contrast. We report one-tail tests of sig-
nificance for contrast 1, reflecting our directional hypothe-
sis, and two-tail tests otherwise. See Table 1 for descriptive
statistics and contrast coefficients.
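For readers who want to mirror this analysis, a hedged sketch of the model described above is shown below in Python with statsmodels: OLS regression of predicted completion time on the two orthogonal planning-direction contrasts, gender, and their interactions. The synthetic data frame and column names are illustrative only, not the authors' materials.

```python
# Hedged sketch of the contrast-coded regression described above.
# The data are synthetic placeholders; only the model structure is of interest.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 240
df = pd.DataFrame({
    "condition": rng.choice(["backward", "forward", "unspecified"], n),
    "gender": rng.choice(["male", "female"], n),
    "completion_minutes": rng.normal(40, 30, n).clip(0, 360),  # minutes before 8 p.m.
})

contrast1 = {"backward": 2, "forward": -1, "unspecified": -1}   # backward vs. rest
contrast2 = {"backward": 0, "forward": 1, "unspecified": -1}    # forward vs. unspecified
df["c1"] = df["condition"].map(contrast1)
df["c2"] = df["condition"].map(contrast2)
df["g"] = df["gender"].map({"male": 1, "female": -1})

model = smf.ols("completion_minutes ~ c1 + c2 + g + c1:g + c2:g", data=df).fit()
print(model.summary())
```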
2.2.1 Time predictions
Participants’ predictions of when they would finish getting
ready for the date were converted into a number of minutes
before the 8:00 p.m. deadline. These completion time pre-
dictions were submitted to the regression analysis described
above. Consistent with the hypothesis, the first contrast
was significant, indicating that participants expected to fin-
ish with less time to spare in the backward planning condi-
tion than in the forward and unspecified conditions.2 Partic-
ipants also predicted they would start later in the backward
planning condition than in the forward and unspecified con-
ditions.3 However, a parallel analysis of the performance
time predictions (time on task) indicated that participants
did not expect to spend more time working on the task itself
2In each study the distribution of predicted completion times was pos-
itively skewed, thus, we also performed the statistical tests after a square
root transformation. These additional tests revealed the same effects. There
was a significant effect of contrast 1 on predicted completion times in each
study (Study 1 p = .002; Study 2 p = .002; Study 3 p = .006; Study 4 p =
.001).
3To compare effects of backward planning on predicted completion
time and start time, in each study we conducted a repeated measures
ANOVA with type of prediction (completion time vs. start time) as a within
subject factor and contrast 1 as a between subjects factor. There was not a
significant interaction in any study (Study 1 p = .90; Study 2 p = .39; Study
3 p = .48; Study 4 p = .48), suggesting that the effect of backward planning
on the two types of predictions did not differ.
Table 1: Dependent variables by planning direction (Study 1).
Backward Forward Unspecified Contrast 1 (2 -1 -1) Contrast 2 (0 1 -1)
N 80 72 80
Completion time M 31.01 42.90 44.89 −4.255∗∗ −.855
SD (22.87) (41.48) (40.94) (1.679) (2.961)
Start time M 157.44 179.79 184.31 −8.552∗∗ .147
SD (74.33) (85.54) (82.96) (3.630) (6.402)
Performance time M 125.53 136.22 129.96 −3.071 3.934
SD (78.36) (75.86) (66.88) (3.237) (5.709)
Plan steps M 12.89 11.78 11.38 0.413∗ .233
SD (5.25) (5.20) (4.67) (0.233) (.411)
Insights M 4.42 3.81 3.79 0.208∗∗∗ .012
SD (1.23) (1.19) (1.22) (0.057) (0.100)
Obstacles M 4.32 3.85 3.80 .169∗∗∗ .022
SD (1.15) (1.20) (1.10) (.054) (.095)
†p < .10, ∗p < .05, ∗∗p < .01, ∗∗∗p < .001. The values for contrasts are unstandardized coefficients (SEs in parenthesis).
Table 2: Zero order correlations with predicted completion
time.
Study 1 Study 2 Study 3 Study 4
Predicted start .28∗∗ .39∗∗ .44∗∗ .10
Predicted performance −.07 .06 .23∗∗ −.03
Plan steps −.19∗∗ −.24∗∗ −.07 −.13
Insights −.14∗ −.04 .07 .03
Obstacles .02 −.14 −.01 −.05
Time motion −.08 −.01 .08 −.15∗
*p < .05, ∗∗p < .01.
in the backward planning condition than in the forward and
unspecified conditions.
The regressions also revealed significant effects of gen-
der indicating that males expected to start their date prepa-
rations later (B = –22.67, SE = 6.27, t = –3.62, p < .001)
and spend less time preparing for the date (B = –31.048, SE
= 5.586, t = –5.558, p < .001) than did females, but there
was no effect of gender on predictions of completion time
(B = 2.105, SE = 2.897, t = .696, p = .487). There were no
significant interactions, suggesting that the effects of plan-
ning direction on prediction generalized across gender. The
results supplement (Table S1) provides descriptive statistics
and contrast coefficients by gender.
2.2.2 Process measures
We also performed the standard regression analysis on each
of the process measures. Significant effects of contrast 1
indicated that participants in the backward planning condi-
tion, compared to those in the forward and unspecified con-
ditions, included more steps in their plans, experienced more
novel planning insights, and anticipated greater obstacles.
To examine the effect of backward planning (vs. for-
ward and unspecified) on the dichotomous motion perspec-
tive measure, we performed two χ2 tests of association that
parallel the two contrasts. Participants were more likely to
adopt a time motion perspective (vs. an ego motion perspec-
tive) in the backward condition (74.7%) than in the forward
(54.9%) and unspecified conditions (57.5%), χ2(1, N = 230)
= 7.49, p = .006. The prevalence of the time motion per-
spective did not differ across the forward and unspecified
conditions, χ2(1, N = 151) = .10, p = .75.
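A test of this form can be run as a 2 × 2 chi-square test of association between planning condition (backward vs. forward/unspecified) and motion perspective. The SciPy sketch below illustrates the procedure with placeholder cell counts rather than the study's raw data.

```python
# Illustrative 2x2 chi-square test of association (placeholder counts only).
from scipy.stats import chi2_contingency

#               time motion, ego motion
table = [[59, 20],    # backward condition (placeholder counts)
         [85, 66]]    # forward + unspecified combined (placeholder counts)

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")
```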
Correlations between the completion time predictions and
process measures are presented in Table 2. Participants who
predicted later task completion times (i.e., less time before
the deadline) reported more novel planning insights, r(227)
= –.14, p = .03, and included more steps in their plans,
r(230) = –.19, p = .01.
We also conducted mediational analyses that tested
whether the effect of backward planning (contrast 1) on
predicted completion times was mediated by each of the
process measures. Specifically, we used the bootstrapping
method (Preacher & Hayes, 2008) to test the indirect effect
of backward planning on predicted completion time through
the process measures: plan steps, insights, obstacles, and
motion perspective. There was a significant indirect effect
for plan steps (M(axb) = –.475, SE = .347, 95% CI [-1.661,
-.0544]) and insights (M(axb) = –.666, SE = .449, 95% CI [–
1.871, –.022]). These results suggest that the effect of back-
ward planning on predicted completion times was partially
mediated by an increase in the number of steps included
in the plan and the novel insights experienced by backward
planners.
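A percentile-bootstrap test of an indirect effect in the spirit of Preacher and Hayes (2008) can be sketched as follows. This is not the authors' code; the variable names and the synthetic data are purely illustrative.

```python
# Hedged sketch of a percentile-bootstrap test of an indirect effect
# (planning-direction contrast -> mediator -> predicted completion time).
import numpy as np

def indirect_effect(x, m, y):
    """a*b: a from regressing m on x; b from regressing y on m, controlling for x."""
    a = np.polyfit(x, m, 1)[0]
    design = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(design, y, rcond=None)[0][2]
    return a * b

def bootstrap_ci(x, m, y, n_boot=5000, seed=0):
    rng = np.random.default_rng(seed)
    n = len(x)
    estimates = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)              # resample participants with replacement
        estimates.append(indirect_effect(x[idx], m[idx], y[idx]))
    return np.percentile(estimates, [2.5, 97.5])  # 95% CI excluding 0 => significant

# Synthetic illustration: x = contrast-1 code, m = plan steps, y = predicted completion time.
rng = np.random.default_rng(1)
x = rng.choice([2.0, -1.0], size=200)
m = 12 + 0.4 * x + rng.normal(0, 5, 200)
y = 40 - 1.5 * m + rng.normal(0, 20, 200)
low, high = bootstrap_ci(x, m, y, n_boot=2000)
print(f"indirect effect 95% CI: [{low:.3f}, {high:.3f}]")
```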
2.3 Discussion
The results supported the primary hypothesis that back-
ward planning, in comparison to forward and unspecified
planning, results in longer predictions of task completion
time. Backward planners predicted they would finish get-
ting ready for a date with less time to spare. Notably, back-
ward planners also predicted they would start later than par-
ticipants in the other conditions. This finding suggests that
backward planning may have drawn attention not only to po-
tential delays while carrying out the task, but also to factors
that could delay task initiation. That is, backward planning
appeared to shift the whole set of planning steps — includ-
ing task initiation — later in time.
The lack of an effect on performance time predictions
suggests that the effects of backward planning on both pre-
dicted completion times and start times were caused because
backward planners made greater allowance for factors exter-
nal to the task itself (e.g., unexpected interruptions, procras-
tination, competing demands) that could delay completion
of the target task.
Consistent with this interpretation, participants believed
that obstacles were more likely in the backward planning
condition than in the other two conditions. Backward plan-
ners also included more steps in their plans and reported
having experienced more new insights from the planning
exercise, and these cognitions played a role in mediating the
effect of backward planning on predicted completion time.
The effect for planning steps is perhaps surprising, given
the absence of an effect on performance time predictions. It
may be that some planning steps involved a form of contin-
gency planning (e.g., planning how to accommodate poten-
tial obstacles if they arise) rather than steps to be taken while
working on the task. Finally, backward planners were more
inclined to adopt a time motion perspective, which has been
shown in previous research to increase completion time pre-
dictions (Boltz & Yum, 2010).
Additional studies are needed to ensure the findings are
not due to idiosyncratic features of the date preparation task.
One particular concern with this task is that it might not be
representative of tasks that are prone to optimistic bias. Peo-
ple are more likely to underestimate completion times when
tasks are longer in duration (Buehler, Griffin & Peetz, 2010;
Halkjelsvik & Jørgensen, 2012) and they are motivated to
finish early (Buehler et al., 1997; Byram, 1997). It remains
to be seen whether the effects of planning direction general-
ize to such tasks.
3 Study 2: School assignment
This study tested whether planning direction would influ-
ence completion time predictions for a different target task.
We again created a standard scenario for all participants, but
this time involving a task – a major school project with in-
centives for early completion – that is highly susceptible to
optimistic bias (Buehler, Griffin & Peetz, 2010). Partici-
pants developed a plan for completing the task using back-
ward, forward, or unspecified planning, and then predicted
how far before the deadline it would be finished.
3.1 Method
3.1.1 Participants
Initially 156 undergraduate psychology students completed
the study, however 20 participants were excluded because
they did not complete the planning exercise and dependent
measures (n = 2) or failed an attention check embedded in
the questionnaire (n = 18). The attention check was com-
prised of two items directing participants to select a speci-
fied response (e.g., “This is a data quality question. Please
select four on the scale below”). Such items can increase
the likelihood that respondents pay attention when complet-
ing self-administered questionnaires (Berinsky, Margolis &
Sances, 2014; Oppenheimer, Meyvis & Davidenko, 2009).
Participants were excluded if they gave incorrect responses
to both items.4 The final sample consisted of 136 under-
graduate students (45 male, 89 female, 1 other identity, 1
missing) between the ages of 17 and 41 (M = 19.08 years,
SD = 2.31 years) compensated with course credit.
3.2 Procedure
The procedure was similar to Study 1 but with a different tar-
get task. In an online questionnaire, participants were asked
to imagine a scenario in which they needed to complete a
major school assignment in the next two weeks. In this sce-
nario, the participant was required to write a major paper
that must be at least 12 pages long and include a minimum
of eight references, four from journal articles available only
in the library. Additionally, it was noted that the assignment
fell at a time of year that was usually busy for students, and,
as an incentive to have it done promptly, the instructor would
4When these participants are included, results are very similar (see Ta-
ble S2 in the results supplement). There is an effect of backward planning
on predicted completion time (p = .02), predicted start time (p = .04), in-
sights (p < .001), and obstacles (p = .09).
Table 3: Dependent variables by planning direction (Study 2).
Backward Forward Unspecified Contrast 1 (2 –1 –1) Contrast 2 (0 1 –1)
N 44 50 42
Completion time M 2.25 3.44 3.79 −.454∗∗ −.173
SD (2.04) (2.91) (3.11) (.167) (.286)
Start time M 11.84 12.26 13.02 −.267 −.382
SD (3.88) (3.80) (3.38) (.226) (.387)
Performance time M 21.22 17.83 24.08 .086 −3.127
SD (20.15) (20.41) (26.61) (1.372) (2.347)
Plan steps M 14.61 14.16 14.60 .079 −.218
SD (5.21) (4.56) (5.77) (.316) (.541)
Insights M 5.22 4.44 4.87 .188∗∗∗ −.217∗
SD (.83) (1.00) (1.05) (.059) (.101)
Obstacles M 3.98 3.60 3.31 .174∗ .145
SD (1.45) (1.31) (1.39) (.085) (.145)
†p < .10, ∗p < .05, ∗∗p < .01, ∗∗∗p < .001. The values for contrasts are unstandardized coefficients (SEs in parenthesis).
award an extra 2% for every day before the due date that the
assignment was submitted.5
Participants were asked to develop a plan of the steps they
would take to complete the assignment. They were provided
with a timeline comprised of 14 text boxes spanning the pe-
riod between the present date (Day 1) and the due date (Day
14), and were instructed to use the text boxes to list the steps
they would take to complete the assignment. They were to
state “no plans” in the text box for any day they did not
plan to work on the assignment. To manipulate planning di-
rection, participants were randomly assigned to three condi-
tions (backward, forward, or unspecified) using instructions
adapted from Study 1.
3.2.1 Time predictions
The primary dependent variable was the prediction of task
completion time. Participants were asked, “How many days
before the due date will you finish the assignment?” and re-
sponse options ranged from 0 days before the due date (i.e.,
the due date itself) through 14 days before the due date (i.e.,
today). Participants also predicted how many days before
the due date they would start the assignment, and how many
hours of actual working time
it would take.
5An additional instruction was included in an attempt to vary perceived
task importance. Participants were told either that the assignment was ex-
tremely important (worth 50% of the final grade) or that it was not all that
important (worth 10% of the final grade). This manipulation produced no
effects and is not discussed further.
3.2.2 Process measures
Participants then completed process measures similar to
those in the previous study. Planning insights were assessed
using the same four items from Study 1 (α = .79, M = 4.82,
SD = 1.01). Potential obstacles were assessed with a single
item in this study, as the remaining items were inadvertently
omitted: Participants simply rated how difficult it would be
to stick to their plan (1 = Extremely easy, 7 = Extremely
difficult). To measure motion perspective, participants were
asked to imagine that the due date (14 days from today) for
the assignment had been moved forward two days, and to
indicate how many “days from today” the assignment was
now due. Participants who responded “16 days from to-
day” were coded as having an ego motion perspective, while
those who responded “12 days from today” were coded as
having a time motion perspective.
3.3 Results
Dependent measures were again regressed on the two or-
thogonal contrasts as in Study 1. See Table 3 for descriptive
statistics and regression coefficients.6
6In this study and subsequent studies, gender was not included as an
additional predictor. Preliminary analyses indicated that gender did not
have significant effects on time predictions, and did not alter the pattern or
significance of the reported effects.
3.3.1 Time predictions
The regression analyses revealed that, as hypothesized, par-
ticipants predicted they would finish closer to the deadline
in the backward planning condition than in the forward and
unspecified conditions. Predictions in the forward and un-
specified conditions did not differ significantly. The anal-
yses did not reveal significant effects of planning direction
(contrast 1 or 2) on participants’ predictions of when they
would start the assignment, or how long they would work
on it.
3.3.2 Process measures
The regression analyses performed on the process measures
revealed that, unlike Study 1, participants did not list more
plans in the backward planning condition than in the for-
ward or unspecified conditions. However, as in Study 1,
planning direction had a significant effect on the perceived
insights index: Participants experienced greater planning in-
sights in the backward condition than in the forward and
unspecified condition (contrast 1). Perceived insights were
also greater in the unspecified than in the forward condition
(contrast 2). There was also evidence, as in Study 1, that
backward planning increased the anticipation of obstacles.
Participants believed it would be harder to stick to their plan
in the backward condition than in the forward and unspeci-
fied conditions.
Finally, there was again a significant effect of planning di-
rection on motion perspective. Participants were more likely
to adopt a time motion perspective (vs. an ego motion per-
spective) in the backward condition (76.9%) than in the for-
ward (31.7%) and unspecified conditions (38.5%), χ2(1, N
= 119) = 18.44, p = .006, and there was not a significant dif-
ference across the forward and unspecified conditions, χ2(1,
N = 80) = .401, p = .53.
Again there were few correlations between the comple-
tion time predictions and the process measures (see Table
2). Participants expected to finish closer to deadline when
they anticipated more potential obstacles, r(134) = –.14, p =
.10, and listed more steps in their plans, r(130) = –.24, p =
.01. We also used the bootstrapping test, as in Study 1, to ex-
amine the indirect effect of backward planning on predicted
completion time through the process measures: plan steps,
insights, obstacles, and motion perspective. There were no
significant
indirect effects.
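The bootstrapping test of indirect effects can be illustrated with a standard percentile-bootstrap sketch for a single mediator (an illustration only, not the authors' code; c1, m, and y are assumed numpy arrays holding the contrast codes, one process measure, and predicted completion times):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)

    def indirect_effect(c1, m, y):
        a = sm.OLS(m, sm.add_constant(c1)).fit().params[1]                        # a path: contrast -> mediator
        b = sm.OLS(y, sm.add_constant(np.column_stack([m, c1]))).fit().params[1]  # b path: mediator -> prediction, controlling the contrast
        return a * b

    def bootstrap_ci(c1, m, y, reps=5000):
        n = len(y)
        estimates = []
        for _ in range(reps):
            idx = rng.integers(0, n, n)   # resample participants with replacement
            estimates.append(indirect_effect(c1[idx], m[idx], y[idx]))
        return np.percentile(estimates, [2.5, 97.5])  # a CI excluding 0 indicates a significant indirect effect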
3.4 Discussion
The study provided further evidence that backward planning
results in later predictions of task completion time, even for
the kind of task that is highly susceptible to optimistic bias
(i.e., an extensive project with incentives for early comple-
tion). Backward planning also appeared to have a parallel
effect on predicted start times — with backward planners
predicting they would start the task later — although this ef-
fect on its own was not significant. There was no evidence
that backward planning influenced predictions of the num-
ber of hours that would be spent working on the task itself.
The process measures provided further evidence that back-
ward planning leads people to experience more novel in-
sights during the planning process and to anticipate greater
obstacles while carrying out the task, although it could not
be shown that these processes mediated the effects of back-
ward planning on prediction.
A limitation of the first two studies is that they examined
hypothetical tasks that participants did not actually perform.
Although this procedure affords a high degree of experimen-
tal control, it limits our ability to generalize results to con-
sequential, real world tasks. Accordingly, the next two stud-
ies tested effects of backward planning on a variety of tasks
that participants were planning to carry out (Studies 3 and
4), and assessed the effects of planning direction on actual
completion times as well as predictions (Study 4).
4 Study 3: Real tasks
Study 3 tested the effect of planning direction on predictions
concerning real projects. Given our interest in debiasing,
we again sought target tasks shown to be highly suscepti-
ble to bias in previous research – namely extensive projects
that participants wanted to complete promptly. Thus, par-
ticipants were asked to nominate a project they needed to
complete in the next month that would require multiple steps
across several days, and that they hoped to finish as soon as
possible. The nominated projects were further classified as
academic (e.g., finishing an essay) or personal (e.g., making
a slideshow of pictures for a wedding) to determine whether
effects generalized across these broad project types. Partici-
pants were then instructed to plan for the project using either
forward, backward, or unspecified planning. Notably, because
participants nominated projects with varying deadlines, the
planning exercise was not structured around a standard
timeline as in previous studies. Instead, partici-
pants were provided with a single open-ended text box to
list all the steps of their plan. After developing a plan for the
project, participants predicted when it would be finished.
4.1 Method
4.1.1 Participants
Initially 240 undergraduate psychology students were re-
cruited. A substantial number of the participants were ex-
cluded because they nominated tasks that did not meet the
criteria in the instructions: exams or tests that could only be
done at a fixed time (n = 54), tasks with a deadline more
than a month away (n = 13), or tasks with a deadline the
day of the study (n = 6). Participants were also excluded if
they predicted finishing after the stated deadline (n = 17),
or did not complete the main dependent measures (n = 3).
The final sample consisted of 147 undergraduate students
(62 male, 85 female) between the ages of 17 and 47 (M =
19.50 years, SD = 3.24 years) who were compensated with
course credit.
4.2 Procedure
Participants first reported demographic information and then
were asked to identify a project they would be doing in the
coming month. This could be either a school project (e.g.,
writing a paper) or a personal project (e.g., organizing your
photo albums) as long as it was a major project that would
involve carrying out steps across several days. Addition-
ally, participants were instructed that the project must be one
that: they were required to complete sometime in the next
month (i.e., there was a firm deadline), they were free to
complete any time before the deadline, and they were hop-
ing to finish as soon as possible. Participants briefly identi-
fied the project and reported the date of the deadline.
Participants then completed a planning exercise that
asked them to develop a detailed plan for the project, and
were randomly assigned to one of the three planning con-
ditions (backward, forward, unspecified) using instructions
similar to those in previous studies. They were provided
with an open-ended text box and asked to list the steps of
their plan in point form.
4.2.1 Time predictions
Participants were asked to predict task completion time in
relation to the deadline: How many days before the deadline
do you think you will finish the project? Participants also
predicted how many days before the deadline they would
start working on the project, and how many hours of actual
working time it would take to complete their project.
4.2.2 Process measures
Participants completed the four items that assessed their per-
ception that the planning exercise had resulted in novel plan-
ning insights, using a response scale from 1 (Not at all) to
11 (Extremely) (α = .66, M = 7.47, SD = 1.71). They also
completed the four items used in Study 1 that assessed their
anticipation of obstacles, using a response scale from 1 (Not
at all) to 11 (Extremely) (α = .72, M = 7.11, SD = 1.84).
To measure motion perspective, participants were asked to
imagine that a hypothetical meeting originally scheduled for
next week on Wednesday had been moved forward two days
and to indicate the new meeting date. Participants who re-
sponded “Friday” were coded as having an ego motion per-
spective, while those who responded “Monday” were coded
as having a time motion perspective.7
4.3 Results
An examination of the project descriptions indicated that
about half the participants (n = 79, 53.7%) nominated aca-
demic projects (e.g., writing an essay, completing a statis-
tics assignment) and the remaining participants (n = 68,
46.3%) nominated personal projects (e.g., creating a photo
slideshow for a wedding, booking a vacation). Accordingly,
to control for variability in the projects, project type was in-
cluded as a factor in the regression analyses (-1 = academic,
1 = personal). Each dependent measure was regressed on
project type, the two contrasts, and the project type × con-
trast interactions. Descriptive statistics and regression coef-
ficients are presented in Table 4.
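In formula notation, this analysis simply adds the project type factor and its interactions with the contrasts to the regression sketched earlier (column names again hypothetical):

    import statsmodels.formula.api as smf

    # ptype: -1 = academic, 1 = personal; c1 and c2 coded as in the earlier sketch
    model = smf.ols("pred_completion ~ ptype + c1 + c2 + ptype:c1 + ptype:c2", data=df).fit()
    print(model.summary())  # B for ptype tests the academic vs. personal difference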
4.3.1 Time predictions
The regression analysis for completion time predictions re-
vealed an effect of project type, as participants who selected
academic projects (M = 3.09, SD = 2.78) predicted they
would finish closer to the deadline than those who selected
non-academic projects (M = 3.90, SD = 3.03), B = .500, SE
= .250, p = .047. There was also a significant effect of con-
trast 1. Once again, as hypothesized, participants predicted
they would finish closer to the deadline in the backward
condition than in the forward and unspecified conditions.
Predictions did not differ across the forward and unspeci-
fied conditions (contrast 2). There was not an interaction of
project type and contrast 1 (p = .51) or contrast 2 (p = .26)
suggesting that the effect of backward planning generalized
across academic and personal projects. The results supple-
ment provides descriptive statistics and contrast coefficients
by project type (see Table S3).
The analysis of predicted start times also revealed an ef-
fect of project type, indicating that participants expected to
start closer to the deadline for academic projects (M = 8.74,
SD = 10.61) than for personal projects (M = 15.75, SD =
9.55), B = 3.613, SE = .845, p < .001. There were not sig-
nificant effects of the planning direction contrasts. How-
ever, predicted start times were again descriptively later in
the backward planning condition than in the other two con-
ditions, and the test of contrast 1 approached significance,
B = –.831, SE = .582, p = .08. The analysis of predicted
performance times did not reveal significant effects of the
planning direction contrasts or project type.
To further explore the significant effect of backward planning
on predicted completion time, we performed an additional
control analysis.
7The study was conducted across two academic terms, and the unspeci-
fied condition and two questionnaire items (motion perspective, fourth rat-
ing of perceived insights) were not added until the second term. Hence
there are fewer participants in the unspecified condition and in analyses of
motion perspective.
Table 4: Dependent variables by planning direction (Study 3).
Backward Forward Unspecified Contrast 1 (2 -1 -1) Contrast 2 (0 1 -1)
N 59 56 32
Completion time M 2.77 3.87 4.03 −.396∗∗ −.141
SD (2.19) (3.54) (2.72) (.163) (.319)
Start time M 10.75 12.32 13.78 −.831† −1.246
SD (8.94) (10.39) (13.72) (.582) (1.126)
Performance time M 13.57 14.02 10.25 .490 1.611
SD (13.75) (14.33) (13.72) (.799) (1.548)
Plan steps M 6.80 5.73 6.38 .249† −.295
SD (2.87) (3.07) (2.70) (.166) (.325)
Insights M 7.42 7.58 7.38 −.022 .061
SD (1.46) (1.80) (1.99) (.097) (.190)
Obstacles M 7.08 7.35 6.75 .013 .330
SD (1.61) (1.80) (2.24) (.104) (.204)
†p < .10, ∗p < .05, ∗∗p < .01. The values for contrasts are unstandardized coefficients (SEs in parentheses).
To control for the variation in task
deadline lengths introduced by using self-selected tasks,
we conducted a supplementary analysis including deadline
length as a covariate. We also included interaction terms be-
tween the deadline length variable and the two standard con-
trasts, reasoning that the beneficial effect of backward plan-
ning on predicted completion times would have a greater
scope to reveal itself in projects with longer deadlines. Thus,
we regressed the predicted completion times on the two con-
trasts, deadline length, and the interactions between the con-
trasts and deadline length. For simplicity of interpretation,
deadline length was centered at the grand mean. We had di-
rectional hypotheses for both the beneficial effects of back-
wards planning (contrast 1) and for the positive interaction
between backwards planning and deadline length.
As expected, projects with longer deadlines were associ-
ated with completion time predictions that were further be-
fore the deadline (B = .12, t(141) = 6.48, p < .01). Even
when deadline length was controlled, the first contrast (pit-
ting backwards planning against the forward and control
condition) was still significant (B = –.27, t(141) = 1.87,
p = .03); this effect was marginally stronger when deadlines
were longer (interaction B = –.020, t(141) = 1.56, p = .06).
No other effects approached significance
in this regression.
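In code, this control analysis amounts to adding a grand-mean-centered deadline-length covariate and its interactions with the contrasts (a hypothetical sketch with assumed column names):

    import statsmodels.formula.api as smf

    df["deadline_c"] = df["deadline_days"] - df["deadline_days"].mean()  # center at the grand mean
    model = smf.ols("pred_completion ~ c1 + c2 + deadline_c + c1:deadline_c + c2:deadline_c",
                    data=df).fit()
    # c1 tests backward planning controlling for deadline length;
    # c1:deadline_c tests whether that effect grows with longer deadlines.
    print(model.params[["c1", "c1:deadline_c"]])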
4.3.2 Process measures
The process measures were also submitted to the standard
regression analysis that included project type, the two con-
trasts, and the project type × contrast interactions. There
was not a significant effect of contrast 1 on planning in-
sights, potential obstacles, or the number of steps in the plan.
There was again an effect of planning direction on motion
perspective. Participants were more likely to adopt a time
motion perspective in the backward condition (78.0%) than
in the forward (34.4%) and unspecified conditions (40.6%),
χ2(1, N = 105) = 16.51, p < .001, and the latter two conditions did not differ significantly, χ2(1, N = 64) = .27, p = .61.
As seen in Table 2, completion time predictions were not
correlated significantly with any of the process measures (ps
> .12). We also used the same method as in previous studies
to test for indirect effects of backward planning on predicted
completion time through the process measures: plan steps,
insights, obstacles, and motion perspective. Naturally, given
the lack of correlation, the analyses revealed no significant
indirect effects.
4.4 Discussion
Study 3 again found that backward planning, in comparison
to forward and unspecified planning, resulted in less opti-
mistic predictions of task completion time, and extended this
finding to include real tasks in people’s lives. The effect of
backward planning on predicted start times approached sig-
nificance, and was again, descriptively, parallel to its effect
on predicted completion times. There was again no evidence
that backward planning influenced predictions of the time
that would be spent on the task itself.
Effects of planning direction seen previously on perceived
planning insights and potential obstacles were not obtained
in this study. The absence of these effects could reflect any
number of changes made to the procedure (e.g., the move
to a real task, the increased variability created by examin-
ing a unique project for each participant, the unstructured
response format of the planning exercise) and we cannot de-
termine which of these changes may have been responsible.
Given that the next study also examines self-nominated tar-
get tasks, we postpone further discussion of these findings
to the general discussion.
A noteworthy limitation of the studies so far is that they
have not assessed actual task completion times, and thus
cannot directly address questions of prediction bias. Al-
though backward planning led to later predicted completion
times (i.e., closer to the deadline), which could generally
help to curb optimistic bias, there is as yet no direct evi-
dence that predictions were less biased as a result of back-
ward planning. This issue is addressed in the final study.
5 Study 4: Predicted vs. actual times
The main purpose was to replicate the effect of planning di-
rection on predicted completion times for real projects, and
to test whether backward planning reduces the tendency to
underestimate completion times. Thus, the procedure was
similar to the previous study, but included follow-up mea-
sures to track completion times for the target project. This
allowed us to test whether participants tended to underesti-
mate task completion time, and whether the backward plan-
ning strategy reduced this prediction bias.
Although there are various forms of prediction accuracy
(e.g., prediction bias, correlational accuracy; Buehler et al.,
1994; Epley & Dunning, 2006; Kruger & Evans, 2004),
we focused primarily on prediction bias (i.e., the mean dif-
ference between predicted and actual times) because it is
arguably most consequential for real world time forecasts.
Even if people’s predicted completion times are sensitive to
variations in actual times (i.e., correlational accuracy or dis-
crimination), a systematic tendency to underestimate actual
completion times (i.e., prediction bias) is likely to have se-
rious ramifications. Thus, our main objective was to test
effects on prediction bias. Nevertheless, to shed light on the
workings of the backward planning intervention, we also ex-
amined regressions that tested the sensitivity of predictions
to variation in actual times.
5.1 Method
5.1.1 Participants
Initially 196 participants were recruited from Amazon
MTurk; however, participants were again excluded if they
did not nominate tasks consistent with the criteria stated
in the instructions (n = 6) or did not complete the plan-
ning exercise according to instructions (n = 3). All remain-
ing participants correctly answered the same attention check
items used in Study 2. The sample for the initial prediction
questionnaire consisted of 187 participants (103 females, 82
males, 2 other identity) between the ages of 18 and 74 (M
= 31.85 years, SD = 11.39 years). A follow-up question-
naire sent out two weeks later was completed by 161 (86%)
of these participants, and 125 (59 male, 66 female; M =
32.40 years, SD = 11.50 years) of the participants reported
that they had completed the target project. Participants were
compensated $.50 for the initial questionnaire and $1 for the
follow-up questionnaire.
5.1.2 Procedure
The initial online questionnaire was similar to that of Study
3. Participants first provided demographic information (i.e.,
age, gender) and an e-mail address so that they could be
sent a follow-up questionnaire. Participants were then in-
structed to think of a major project they would be doing in
the next two weeks that would involve carrying out multiple
steps across several days. Additionally, participants were in-
structed that their project must be one that they had to com-
plete sometime within the next two weeks (i.e., there was a
firm deadline), they were free to complete at any time before
the deadline, and they were hoping to finish as soon as pos-
sible. Participants described the project briefly and reported
its deadline. Participants then completed the planning exer-
cise used in Study 3 and were randomly assigned to either
the forward, backward, or unspecified planning condition.
Time predictions: As in the previous study, participants
were asked: “How many days before the deadline do you
think you will finish the project?” Participants also predicted
how many days before the deadline they would start working
on the project and how many hours of actual working time
it would take.
Process measures: Participants again completed the four
items that assessed their perception that the planning exer-
cise resulted in novel planning insights (1 = Not at all, 7 =
Extremely) (α = .76, M = 4.76, SD = 1.14), and the four
items that assessed their beliefs about potential obstacles
(α = .56, M = 3.96, SD = 1.02). To measure motion per-
spective, participants were asked to imagine that a meeting
originally scheduled for next Wednesday had been moved
forward two days and to indicate the new meeting date. Par-
ticipants who responded “Friday” were coded as having an
ego motion perspective while those who responded “Mon-
day” were coded as having a time motion perspective.
Table 5: Dependent variables by planning direction (Study 4).
Backward Forward Unspecified Contrast 1 (2 -1 -1) Contrast 2 (0 1 -1)
N 61 62 64
Completion time M 2.57 4.11 3.33 −.401∗∗ .383
SD (2.57) (3.51) (3.50) (.169) (.288)
Start time M 8.52 8.82 9.42 −.494∗ −.648
SD (3.90) (3.82) (4.29) (.238) (.371)
Performance time M 14.85 14.32 24.27 −1.219 −4.841∗
SD (18.21) (18.87) (27.56) (1.145) (1.951)
Plan steps M 6.90 6.26 7.30 .031 −.524
SD (4.28) (3.01) (3.29) (.187) (.318)
Insights M 5.00 4.59 4.70 .106∗ −.063
SD (1.29) (1.02) (1.07) (.059) (.100)
Obstacles M 4.10 3.83 3.96 .055 −.069
SD (0.94) (0.98) (1.33) (.057) (.098)
†p < .10, ∗p < .05, ∗∗p < .01. The values for contrasts are unstandardized coefficients (SEs in parentheses).
Follow-up measures: Two weeks later, participants were
sent an e-mail with a link to the follow-up questionnaire.
The e-mail reminded participants of the nominated project
and its deadline. Participants were asked whether they had
finished the project, and if so, to report how many days be-
fore the deadline they had finished it, how many days before
the deadline they had started working on it, and how many
hours of actual working time they had spent on it.
5.2 Results
In this community sample, participants were most likely to
nominate personal projects (n = 133; e.g., bathroom reno-
vation, paint a canvas) followed by work related projects (n
= 28; e.g., write performance reports, prepare month end
balance sheet) and academic projects (n = 26; e.g., write
an essay, register for classes). Thus we created a project
type variable that distinguished the personal projects from
the academic and work-related projects (–1 = academic and
work, 1 = personal).
Analyses of the initial questionnaire were performed on
the full sample (n = 187; see Table 5); the dependent mea-
sures were again regressed on project type, the two con-
trasts, and the project type × contrast interactions. Analy-
ses of actual times and prediction bias (i.e., predicted-actual
time) were performed on the subset of participants who fin-
ished the target project (n = 125; see Table 6). For these par-
ticipants, we could test whether there was a systematic ten-
dency to underestimate task completion times, and whether
backward planning reduced this bias.
Time predictions: The analysis of predicted completion
time again revealed the hypothesized effect of backward
planning. Participants expected to finish the project signif-
icantly closer to deadline in the backward condition than in
the forward and unspecified conditions. No other effects
were significant in this analysis.
The analysis of predicted start times also revealed an
effect of backward planning. Participants predicted they
would start significantly later in the backward planning con-
dition than in the other conditions. This effect was qualified
by a significant interaction with project type, B = –.589, SE
= .238, p = .014. For academic and work projects, partici-
pants predicted later start times (fewer days before deadline)
in the backward planning condition than in the other condi-
tions (Ms = 6.38 vs. 9.76), whereas for personal projects,
predicted start times did not differ (Ms = 9.10 vs. 8.82). The
results supplement provides descriptive statistics and con-
trast coefficients by project type (see Table S4).
The analysis of performance time predictions yielded an
unexpected effect of the second contrast. Participants pre-
dicted they would spend more hours working on the task
in the unspecified condition than in the forward condition.
Again this effect was qualified by a significant interaction
with project type, B = –6.999, SE = 2.024, p = .001, indicat-
ing that the unexpected difference between the unspecified
Table 6: Predicted vs. actual times for completed projects by planning direction (Study 4).
Backward Forward Unspecified Contrast 1 (2 -1 -1) Contrast 2 (0 1 -1)
N 44 43 38
Predict completion M 2.82 3.91 3.00 −.266† .407
SD (2.63) (3.27) (3.00) (.187) (.330)
Actual completion M 2.64 1.98 1.79 .214 .062
SD (2.44) (1.93) (1.79) (.132) (.232)
Bias M .18 1.93 1.21 −.480∗∗ .345
SD (2.60) (3.29) (2.62) (.182) (.321)
Predict start M 7.61 8.61 8.71 −.612∗ −.246
SD (3.82) (3.80) (4.34) (.281) (.444)
Actual start M 6.91 7.88 7.97 −.275 .010
SD (4.18) (3.99) (4.88) (.274) (.483)
Bias M .70 .72 .74 −.041 −.036
SD (2.92) (3.81) (3.32) (.214) (.376)
Predict performance M 14.73 14.21 24.08 −1.095 −4.615†
SD (20.42) (17.37) (26.84) (1.366) (2.405)
Actual performance M 13.82 18.05 20.53 −1.552 −1.010
SD (16.70) (14.40) (19.12) (1.058) (1.862)
Bias M .91 −3.84 3.55 .456 −3.605†
SD (19.93) (13.79) (21.04) (1.173) (2.065)
†p < .10, ∗p < .05, ∗∗p < .01. The values for contrasts are unstandardized coefficients (SEs in parentheses).
and forward conditions was found for academic and work
projects (Ms = 39.05 vs. 10.53) but not for personal projects
(Ms = 16.52 vs. 16.00).
As in Study 3, we conducted an additional control anal-
ysis that regressed predicted completion times on deadline
length, the two contrasts, and interactions between the con-
trasts and deadline length, to control for variation in the
length of deadlines for the self-selected tasks. All predic-
tor variables were grand-mean centered. Again as expected,
projects with longer deadlines were associated with pre-
dicted completion times further before the deadline (B =
.34, t(181) = 4.94, p < .01). Even when deadline length
was controlled, the first contrast (pitting backwards planning
against the forward planning and unspecified condition) was
still significant (B = –.33, t(181) = 2.04, p < .03). Although
the interaction between deadline length and backwards plan-
ning was of a similar magnitude as in Study 3 (interaction B
= –.023 instead of –.020), in this study it was far from sig-
nificant (p > .6). No other effects approached significance
in this regression.
Time predictions vs. actual time: To test whether pre-
dictions were systematically biased, paired t-tests compared
the predicted and actual times. Participants predicted they
would finish their projects further before the deadline (M =
3.25, SD = 2.99) than they actually did finish (M = 2.15,
SD = 2.10), t(124) = 4.18, p < .001. This finding is con-
sistent with previous evidence that people tend to underesti-
mate task completion times. Participants also predicted they
would start work on the project further before deadline (M
= 8.29, SD = 3.98) than they actually did (M = 7.57, SD =
4.33), t(124) = 2.41, p = .02. Participants’ predictions of the
hours they would spend working on the project (M = 17.39,
SD = 21.94) did not differ from actual performance times
(M = 17.31, SD = 16.84), t(124) = .05, p = .96.
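These comparisons are paired t-tests of predicted against actual times within the subset of participants who finished the project; for instance (column names hypothetical):

    from scipy import stats

    finished = df.dropna(subset=["actual_completion"]).copy()  # participants who completed the project
    t, p = stats.ttest_rel(finished["pred_completion"], finished["actual_completion"])
    print(t, p)  # positive t: predicted finish further before the deadline than the actual finish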
We hypothesized that the degree of optimistic bias in
completion time predictions would be reduced by backward
planning. To test this hypothesis, we computed predicted-
actual difference scores, with greater positive values indi-
cating a greater underestimation bias (see Table 6). These
bias scores were submitted to the standard regression anal-
ysis, which included project type, the two contrasts, and the
interactions. The analysis revealed a significant effect of
the first contrast, indicating that, as hypothesized, predic-
tions were less biased in the backward condition than in the
forward and unspecified conditions. Bias did not differ sig-
nificantly between the forward and unspecified conditions.
There were no interactions with project type. The analyses
of bias in predicted start times and performance times did
not yield any significant effects.
We also conducted the control regression on bias in com-
pletion time predictions that included deadline length, the
two contrasts, and interactions between the contrasts and
deadline length. Again as expected, projects with longer
deadlines were associated with more bias (B = .18, t(119) =
2.34, p < .03). Even when deadline length was controlled,
the first contrast (pitting backwards planning against the for-
ward planning and control condition) was significant (B =
–.40, t(119) = 2.16, p < .02), indicating that the underesti-
mation bias was smaller in the backward planning condition.
No other effects approached significance in this regression.
The above analyses indicate that bias in predicted com-
pletion times was influenced by planning direction. To test
whether bias was significant in each condition, paired t-
tests compared predicted and actual completion times within
each condition. The predicted and actual completion times
differed significantly in the forward, t(42) = 3.85, p < .001,
and unspecified conditions, t(37) = 2.85, p = .01, but not in
the backward planning condition, t(43) = .46, p = .65.
Finally, we conducted regressions of actual times on pre-
dicted times to examine the sensitivity of predictions as well
as bias. In these regressions, perfectly sensitive and unbi-
ased predictions would result in unstandardized regression
coefficients of 1 (perfect sensitivity) and an intercept of 0
(lack of systematic bias). We also included the two con-
trasts, which in this case indicate if the intercept (bias) is dif-
ferent between the various conditions, and interaction terms
between the predicted time and the two contrasts, which in-
dicate if the slopes (sensitivity) are different between con-
ditions. In this special case, we centered both the actual
completion times and the predicted completion times by the
grand mean for the predicted completion times; this simpli-
fies the interpretation of the intercept term, which now tests
the degree of bias when predicted times are at their mean.
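A minimal sketch of this regression, with both predicted and actual times centered at the grand mean of the predictions so that the intercept reflects bias at the mean prediction (column names hypothetical; "finished" is the completed-project subset from the earlier sketch):

    import statsmodels.formula.api as smf

    grand_mean = finished["pred_completion"].mean()
    finished["pred_c"] = finished["pred_completion"] - grand_mean
    finished["actual_c"] = finished["actual_completion"] - grand_mean  # same centering constant

    model = smf.ols("actual_c ~ pred_c + c1 + c2 + pred_c:c1 + pred_c:c2", data=finished).fit()
    # Intercept near 0 = little bias at the mean prediction; slope on pred_c near 1 = high sensitivity;
    # the contrast terms test condition differences in bias, the interactions test differences in sensitivity.
    print(model.params[["Intercept", "pred_c", "c1", "pred_c:c1"]])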
For the relation between predicted and actual completion
times, the intercept was –1.13, t(119) = 6.52, p < .01, in-
dicating that when predicted days before deadline were at
their mean, the actual completion times averaged 1.13 days
later. The effect of the first contrast was significant (B = .33,
t(119) = 2.71, p < .01), indicating once again that the back-
ward planning condition was less biased than the other two
conditions. The slope between predicted and actual comple-
tion times was significant (B = .30, t(119) = 5.13, p < .01)
but indicated a fair degree of insensitivity in predicting com-
pletion, as the slope was far below 1.0. Finally, there was
an almost-significant trend toward greater sensitivity in the
backward planning condition than in the other conditions,
B(interaction) = .07, t(119) = 1.61, p < .06 (one-tailed).
For the relation between predicted and actual start times,
the intercept was –0.87, t(119) = 2.96, p < .01, indicating
that when predicted start days were at their mean, the actual
start times averaged .87 days later. There were no effects of
either contrast on the actual start times. The slope between
predicted and actual start times was significant and
substantial (B = .73, t(119) = 5.13, p < .01) indicating a
strong sensitivity in prediction to actual start times. This is
much higher sensitivity than for predicted completion times,
possibly because the start times are much closer to the time
of prediction than are the predicted completion times.
For the relation between predicted and actual perfor-
mance time, the intercept was only –0.33 hours (t(119) =
0.26, p > .70), indicating a high degree of accuracy (an aver-
age deviation of about 20 minutes) and providing little ev-
idence of bias in performance time predictions. There was
no effect of either contrast on the actual performance time;
the closest to significance was the backwards planning con-
trast (B = –1.31, t(119) = 1.48, p > .07). The slope between
predicted and actual performance time was significant and
substantial (B=.44, t(119) = 7.29, p < .01) indicating a mod-
erate sensitivity in predictions for the time spent working on
the task.
Process measures: The process measures were again sub-
mitted to the standard regression analysis that included the
two contrasts, project type, and their interactions. An exam-
ination of the first contrast revealed that participants again
reported greater insights in the backward condition than in
the forward and unspecified conditions (contrast 1), which
did not differ significantly from each other (contrast 2).
Thus, there was again evidence, as in the first two studies,
that participants experienced greater planning insights when
they engaged in backward planning. Planning direction did
not influence the number of steps included in the plan, or the
anticipation of obstacles. Also, unlike the first three stud-
ies, participants were no more likely to adopt a time motion
perspective in the backward condition (56.7%) than in the
forward (60.7%) and unspecified conditions (61.9%), χ2(1,
N = 184) = .36, p = .55, which also did not differ from each
other, χ2(1, N = 124) = .02, p = .89.
An examination of the correlations in Table 2 indicates
that participants who adopted the time motion perspective
(vs. ego motion) made less optimistic predictions, r(182) =
–.15, p = .05. However, unlike previous studies, motion per-
spective was not affected by planning direction, and thus
was not a viable mediator. Indeed our tests of mediation for
the process measures (using the bootstrap method to test for
indirect effects of backward planning on predicted comple-
tion times) revealed no significant indirect effects.
Table 7: Meta analysis of the effect of backward planning (contrast 1).
Study 1 Study 2 Study 3 Study 4 Weighted average z-statistic
Completion time −.349 −.496 −.407 −.349 −.389 −4.856
Start time −.324 −.215 −.239 −.324 −.285 −3.568
Performance time −.131 .011 .103 −.131 −.052 −0.654
Plan steps .244 .046 .251 .244 .208 2.613
Insights .502 .581 −.038 .502 .394 4.901
Obstacles .432 .373 .021 .431 .327 4.091
Note: Table values are Hedges’ g for the effect of backward planning (contrast 1) in each study, the
weighted average effect size, and the z-statistic for the weighted average effect size.
5.3 Discussion
The study again demonstrated that backward planning, in
comparison to other forms of planning, results in later pre-
dictions of task completion time. It also found that back-
ward planning helped to curb the prevalent optimistic bias in
task completion predictions. Participants generally underes-
timated how long they would take to finish their projects;
however, this bias was eliminated in the backward plan-
ning condition. Backward planning led participants to pre-
dict later completion times, and did not have a correspond-
ing impact on actual times, so it counteracted the system-
atic bias in prediction. Moreover, in addition to reducing
bias, there was evidence that backward planning may have
slightly improved the sensitivity of prediction to variation in
actual completion times.
As in previous studies, planning direction appeared to
have a similar effect on predicted start times, with backward
planners expecting to start the task later than participants
in the other conditions, although this effect was limited to
academic and work related projects. There were also unex-
pected effects on performance time predictions that were not
observed previously. Participants predicted they would spend more
hours working on their tasks in the unspecified condition
than in the backward or forward conditions, and this dif-
ference appeared only for the academic and work related
projects. Given that this unexpected effect emerged in only
one of the four studies, we believe it should be interpreted
cautiously.
The process measures again revealed an effect of plan-
ning direction that was consistent with our theorizing. As
in Studies 1 and 2, an effect of planning direction on per-
ceived insights re-emerged, with backward planners report-
ing more novel planning insights (e.g., breaking plans into
important steps, thinking of new steps, considering poten-
tial obstacles) than forward and unspecified planners. How-
ever tests of mediation did not indicate that these planning
insights mediated the effect of backward planning on pre-
dicted completion times.
Unexpectedly, the effect of planning direction on motion
perspective found in the first three studies was not obtained
in Study 4. A possible explanation is that the phrasing of
the question was altered slightly between Studies 3 and 4.
Participants were asked to state the day of a meeting orig-
inally scheduled for “next week on Wednesday” (Study 3)
or “next Wednesday” (Study 4) that has been moved for-
ward two days. Omitting the phrase “next week” could have
created confusion for participants completing the survey on
a Monday or Tuesday. In particular, if they believed the
question referred to the coming Wednesday, a response of
Monday would imply the meeting was in the past. Given
that data was collected from 83% of the sample on a Tues-
day, this change in the wording might account for the lack
of effect in Study 4.
6 Meta analysis of effect sizes
To better understand and characterize the effects of back-
ward planning, we performed a meta-analysis that aggre-
gated effects across the four studies. Table 7 presents a
standardized effect size, Hedges g, for contrast 1 (backward
vs. forward and unspecified) in each study, as well as the
weighted average effect size across the four studies. The
meta-analysis supports the conclusion that backward plan-
ning led participants to predict later completion times and
start times, and did not have a systematic influence on pre-
dicted performance times. The meta-analysis also indicates
that, across the studies, backward planning led participants
to include more steps in their plans, to experience more
novel insights while developing their plans, and to consider
more potential obstacles.
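One common way to compute such a weighted average is a fixed-effect, inverse-variance meta-analysis, sketched below (not necessarily the authors' exact procedure; the per-study sample sizes for Studies 1 and 2 are placeholders):

    import numpy as np

    def g_variance(g, n1, n2):
        # large-sample variance of Hedges' g for a two-group comparison
        return (n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2))

    g  = np.array([-0.349, -0.496, -0.407, -0.349])  # per-study g for completion time (Table 7)
    n1 = np.array([40, 40, 59, 61])                  # backward-condition n's (Studies 1-2 are placeholders)
    n2 = np.array([80, 79, 88, 126])                 # forward + unspecified n's (Studies 1-2 are placeholders)

    w = 1.0 / g_variance(g, n1, n2)        # inverse-variance weights
    g_bar = np.sum(w * g) / np.sum(w)      # weighted average effect size
    z = g_bar / np.sqrt(1.0 / np.sum(w))   # z-statistic for the weighted average
    print(g_bar, z)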
7 General discussion
7.1 Backward planning and time prediction
People frequently underestimate the time it will take to com-
plete tasks, and the present studies tested whether back-
ward planning could help them arrive at more realistic fore-
casts. Consistent with our two primary hypotheses, instruct-
ing participants to engage in backward planning led them
to predict later task completion times (Studies 1-4) and re-
duced or eliminated optimistic bias (Study 4). The effects
on prediction were robust: They generalized across hypo-
thetical task scenarios (preparing for a date, completing a
major school assignment) and a variety of real tasks, across
student and community samples, and across variations in the
format of the planning exercise. Backward planning elimi-
nated bias because it prompted later predicted completion
times without a corresponding impact on actual comple-
tion times. This pattern is consistent with previous evidence
that factors that influence plans and predictions often do not
carry through and equally affect behavior over the longer
term (Buehler et al., 1997; Buehler, Peetz & Griffin, 2010;
Koehler & Poon, 2006; Poon, Koehler & Buehler, 2014).
Whereas backward planning influenced predicted com-
pletion time in each study, it did not have a measurable
impact on predictions of performance time. This pattern
of differential effects highlights the value of the theoreti-
cal distinction between predictions of completion time and
performance time (Buehler, Griffin & Peetz, 2010; Halk-
jelsvik & Jørgensen, 2012). Task completion times depend
not only on the duration of the task itself, but also on a host
of external factors such as time spent on competing activi-
ties, interruptions, and procrastination. Thus the impact of
backward planning on predicted completion times appears
to reflect the additional considerations that apply uniquely to
these predictions. Moreover, in Study 4 participants under-
estimated task completion times but not performance times.
This result is consistent with literature reviews suggesting
that underestimation bias is more common and more pro-
nounced for task completion time than for performance time
(Buehler, Griffin & Peetz, 2010; Buehler & Griffin, 2015;
Halkjelsvik & Jørgensen, 2012).
It is also noteworthy that backward planning appeared
to exert roughly equal effects on predicted start times and
predicted completion times. The effect of backward plan-
ning on predicted start times was significant in two studies
(Studies 1 and 4), as well as in the 4-study meta-analysis,
and the magnitude of the backward planning effect on start
times never differed significantly from the parallel effect on
completion times. This pattern suggests that backward plan-
ning prompted participants to shift the whole set of planning
milestones later in time, resulting in a shift in predicted start
times as well as predicted completion times.
Across all studies, predictions in the forward and unspec-
ified conditions were generally very similar. In all four stud-
ies, contrast 2 (comparing the forward and unspecified con-
ditions) revealed no effects on predicted completion times
or start times; and in only one study (Study 4) was there
an effect on predicted performance time. The similar results
in these conditions may indicate, consistent with our general
expectations, that participants typically planned in a forward
direction unless instructed otherwise. The detailed planning
requirements used in both of these two conditions resembled
unpacking procedures used in previous research (Kruger &
Evans, 2004), and yet participants underestimated task com-
pletion times, suggesting that unpacking plans into specific
steps was not sufficient to eliminate bias. Previous research
suggests that unpacking is less effective if there are few
components to unpack (Kruger & Evans, 2004), if the un-
packed components will be easy to carry out (Hadjichristidis
et al., 2014), and if the tasks are in the distant future (Mo-
her, 2012). Temporal direction appears to be an additional
moderator of unpacking effects.
7.2 Related cognitive processes
Our studies also assessed the effects of planning direction
on related cognitive processes. The process measures al-
lowed us to explore potential mechanisms underlying ef-
fects of backward planning, and to gain insights into the
phenomenological experience of planning in a backward di-
rection. Effects were less robust on these measures than on
prediction. This could be because the measures were later in
the questionnaire, further removed from the planning exer-
cise, or because the cognitive processes we tried to capture
are not highly accessible for self-report. Nevertheless, when
aggregated across the studies, several effects that emerged
were congruent with our theorizing.
We expected that backward planning would disrupt the
fluent planning process that typically leads to a focus on
successful completion, and instead raise the salience of in-
formation that is often neglected – such as required extra
steps and potential obstacles. We also expected that the dis-
ruption of well-rehearsed, schematic planning scripts would
lead predictors to feel they are experiencing novel insights in
the planning process. Consistent with this theorizing, the ag-
gregated results indicated that backward planning led partic-
ipants to experience novel planning insights (e.g., clarify the
steps they would need to take, think of steps they wouldn’t
have thought of otherwise, think of potential problems or
obstacles they could encounter) and to report increased an-
ticipation of obstacles or problems. Furthermore, in Study 1,
perceived planning insights were correlated with predicted
completion time and mediated the impact of backward
planning on prediction.
In addition, an examination of the plans listed by partic-
ipants suggested that backward planning led participants to
consider additional steps. Although the effect size was rel-
atively small, participants tended to include more planning
steps in the backward planning condition than in other con-
ditions, and in Study 1 the increase in planning steps played
a role in mediating the effect of backward planning on pre-
diction.
We also tested the possibility that planning direction
would shift the planners’ perceptions of the flow of time.
Consistent with our theorizing, in three studies (Studies 1,
2, and 3) backward planning increased the likelihood of
adopting a time motion perspective, wherein time is expe-
rienced as moving toward the individual. This perspective
has been found to result in longer predictions of task com-
pletion time in past research (Boltz & Yum, 2010). In the
present research, a time motion perspective was associated
with longer predictions in one study (Study 4), though it did
not mediate effects of planning direction on prediction. Our
dichotomous single-item measure may not have been suf-
ficiently reliable to capture the indirect effects of backward
planning through the time motion perspective; future studies
may benefit from other approaches to measuring or manip-
ulating this construct.
Undoubtedly, backward planning also works through pro-
cesses that were not captured by our measures. One possi-
bility is that the planning exercise creates anchoring effects.
In many domains, people arrive at judgments by first con-
templating a salient value that serves as the starting point
or anchor, and then adjusting (often insufficiently) from that
value (Strack & Mussweiler, 1997; Tversky & Kahneman,
1974). Indeed, several studies have revealed anchoring ef-
fects in time predictions (Buehler, Peetz & Griffin, 2010;
König, 2005; LeBoeuf & Shafir, 2009; Thomas & Hand-
ley, 2008). Buehler et al. found that task completion pre-
dictions were influenced by ostensibly arbitrary “starting
points” suggested by the experimenter. For example, partici-
pants made earlier predictions when they were asked to adjust
from an initial starting point that was the current date (early
anchor) rather than the deadline date (late anchor). Conceivably,
backward planning heightens the salience of the deadline, which
then functions as an anchor for subsequent judgments.
Although we cannot assess the role of anchoring effects
in our studies, we believe the effects are not entirely at-
tributable to anchoring. In the current study, unlike previ-
ous anchoring manipulations (e.g., Buehler, Peetz & Grif-
fin, 2010), participants created detailed plans that inter-
vened between the putative anchor and the predictions. Al-
though backward planners began at the deadline, they went
on to identify every step they would take, and the full plan
was available to inform their predictions. Notably, then,
backward planners were not focused on the deadline when
making predictions; they just began the planning process
there. Furthermore, the anchoring-and-insufficient adjust-
ment model predicts a relatively mindless shift that does not
affect the actual content of plans, such as the salience of ob-
stacles or the number of steps needed for completion. The
evidence that backward planning influenced these higher or-
der cognitive processes suggests that the planning exercise
was doing more than merely providing differential anchors.
7.3 Implications and future directions
The present studies add to the planning fallacy literature by
testing the consequences of a planning strategy that has been
widely advocated but not subjected to empirical scrutiny.
The results offer support for several anecdotal claims sug-
gested by advocates of the approach (e.g., Fleming, 2010;
Rutherford, 2008; Saintamour, 2008; The Ball Foundation,
2007). In particular, the studies provide evidence that back-
ward planning can elicit novel insights that help people to
develop more realistic plans and expectations. The studies
also extend the research literatures on behavioral prediction
in general (Dunning, 2007) and task completion prediction
in particular (Buehler, Griffin & Peetz, 2010; Halkjelsvik &
Jørgensen, 2012) by exploring the role of temporal direc-
tion. Although previous research has identified many other
sources of accuracy and bias in prediction, our work is the
first to examine this factor.
The present research could also have direct practical ap-
plications. In many contexts people strive to predict accu-
rately when a task will be finished. They may be called
upon by others to provide a realistic estimate, or may pri-
vately seek an accurate prediction to guide their own deci-
sions. Moreover, people make important decisions and bind-
ing commitments on the basis of these predictions, and thus
errors can be costly. For example, individuals may rely on
task completion predictions to decide which projects, and
how many projects, to tackle in the coming month. A ten-
dency to underestimate completion times can result in over-
commitment, stress, and aggravation. Planning interven-
tions similar to those used in our experiments could be im-
plemented in a range of settings where practitioners (e.g.,
teachers, project managers, co-workers) depend on realistic
time estimates. The planning exercise is relatively brief and
easily administered with written instructions, and a partic-
ular advantage is that it capitalizes on people’s natural in-
clination to base predictions on a specific plan for the task
at hand. Our findings can also inform recommendations of-
fered to the public (e.g., in textbooks and popular media) to
improve planning, prediction, and time management.
An avenue for future research is to test the generality
of effects using different variants of backward planning.
Our intervention resembled task unpacking, and varied only
the temporal direction in which task components were un-
packed. However, backward planning can take different
forms, and may introduce elements beyond temporal di-
rection. In organizational contexts, for example, backward
planners are sometimes asked to identify critical start and
finish times for each step in a complex project, defined as
the absolute latest starting and finishing time for each step
that would still allow the deadline to be met (i.e., the critical
path; Lewis, 2002; Verzuh, 2005). It may be that the struc-
ture of backward planning can be tailored to suit specific
tasks, contexts, or objectives.
There are almost certainly other moderators of and
boundary conditions on the effects of planning direction that
could be examined. According to Construal Level Theory
(Trope & Liberman, 2003), people should make more op-
timistic predictions when tasks are further in the future, be-
cause temporal distance heightens the prevailing tendency to
rely on oversimplified representations of a task. This implies
that backward planning may be most beneficial for projects
in the distant future. It also remains to be seen whether
backward planning will be effective in group settings that
depend on collaborative planning. Personal characteristics
relevant to planning and prediction, such as the propensity
to engage in planning (Lynch, Netemeyer, Spiller & Za-
mmit, 2010) and dispositional procrastination (Lay, 1986),
may also moderate effects of planning direction.
Finally, whereas our work focused on predictions of task
completion, backward planning has potential to influence
other types of predictions where optimistic bias is prevalent,
such as predictions of future expenses (Peetz & Buehler,
2009), affective states (Wilson & Gilbert, 2003), or socially
desirable behaviors (Epley & Dunning, 2000; Koehler &
Poon, 2006). More generally, varying temporal direction
in the mental simulation of future events could influence a
variety of outcomes that depend on people’s cognitive rep-
resentation of the future, such as goal pursuit, motivation,
and self-control. By continuing to explore the role of tem-
poral direction in people’s cognitive representation of future
events, research can provide valuable new insights into the
links between past experience, present realities, and expec-
tations for the future.
References
Berinsky, A. J., Margolis, M. F., & Sances, M. W. (2014).
Separating the shirkers from the workers? Making sure
respondents pay attention on self-administered surveys.
American Journal of Political Science, 58(3), 739–753.
http://dx.doi.org/10.1111/ajps.12081.
Boltz, M. G., & Yum, Y. N. (2010). Temporal concepts
and predicted duration judgments. Journal of Experimen-
tal Social Psychology, 46, 895–904. http://dx.doi.org/10.
1016/j.jesp.2010.07.002.
Boroditsky, L. (2000). Metaphoric structuring: Understand-
ing time through spatial metaphors. Cognition, 75, 1–28.
http://dx.doi.org/10.1016/S0010-0277(99)00073-6.
Buehler, R., & Griffin, D. (2003). Planning, personality, and
prediction: The role of future focus in optimistic time pre-
dictions. Organizational Behavior and Human Decision
Processes, 92, 80–90. http://dx.doi.org/10.1016/S0749-
5978(03)00089-X.
Buehler, R., & Griffin, D. (2015). The planning fallacy:
When plans lead to optimistic forecasts. In M. D. Mum-
ford & M. Frese (Eds.), The psychology of planning in
organizations: Research and applications (pp. 31–57).
Taylor & Francis Press.
Buehler, R., Griffin, D., Lam, K. C. H., & Deslauriers, J.
(2012). Perspectives on prediction: Does third-person
imagery improve task completion estimates? Organi-
zational Behavior and Human Decision Processes, 117,
138–149. http://dx.doi.org/10.1016/j.obhdp.2011.09.001.
Buehler, R., Griffin, D., & MacDonald, H. (1997). The
role of motivated reasoning in optimistic time predictions.
Personality and Social Psychology Bulletin, 23, 238–247.
http://dx.doi.org/10.1177/0146167297233003.
Buehler, R., Griffin, D., & Peetz, J. (2010). The plan-
ning fallacy: Cognitive, motivational, and social ori-
gins. In M. P. Zanna & J. M. Olson (Eds.), Advances in
experimental social psychology (Volume 43, pp. 1–62).
San Diego: Academic Press. http://dx.doi.org/10.1016/
S0065-2601(10)43001-4.
Buehler, R., Griffin, D., & Ross, M. (1994). Exploring
the “planning fallacy”: Why people underestimate their
task completion times. Journal of Personality and So-
cial Psychology, 67, 366–381. http://dx.doi.org/10.1037/
0022-3514.67.3.366.
Buehler, R., Peetz, J., & Griffin, D. (2010). Finish-
ing on time: When do predictions influence comple-
tion times? Organizational Behavior and Human Deci-
sion Processes, 111, 23–32. http://dx.doi.org/10.1016/j.
obhdp.2009.08.001.
Byram, S. J. (1997). Cognitive and motivational factors in-
fluencing time predictions. Journal of Experimental Psy-
chology: Applied, 3, 216–239. http://dx.doi.org/10.1037/
1076-898X.3.3.216.
Clark, H. H. (1973). Space, time, semantics, and the child.
In T. E. Moore (Ed.), Cognitive development and the ac-
quisition of language (pp. 27-63). New York: Academic
Press.
Connolly, T., & Dean, D. (1997). Decomposed versus holis-
tic estimates of effort required for software writing tasks.
Management Science, 43, 1029–1045. http://dx.doi.org/
10.1287/mnsc.43.7.1029.
Dreborg, K. H., Hunhammar, S., Kemp-Benedict, E., &
Raskin, P. (1999). Scenarios for the Baltic Sea region: A
vision of sustainability. International Journal of Sustain-
able Development and World Ecology, 6, 34–44. http://
dx.doi.org/10.1080/13504509.1999.9728470.
Dunning, D. (2007). Prediction: The inside view. In A.
W. Kruglanski & E. T. Higgins (Eds.), Social psychology:
Handbook of basic principles (2nd ed. pp. 69-90). New
York: Guilford Press.
Epley, N., & Dunning, D. (2000). Feeling “holier than
thou”: Are self-serving assessments produced by errors in
self- or social prediction? Journal of Personality and So-
cial Psychology, 79, 861–875. http://dx.doi.org/10.1037/
0022-3514.79.6.861.
Epley, N., & Dunning, D. (2006). The mixed blessings of
self-knowledge in behavioral prediction: Enhanced dis-
crimination but exacerbated bias. Personality and Social
Psychology Bulletin, 32(5), 641–655. http://dx.doi.org/
10.1177/0146167205284007.
Fleming, G. (2010). Backward planning: Plan your
project from end to beginning! Retrieved from
http://homeworktips.about.com/od/timemanagement/
a/planning.htm.
Flyvbjerg, B. (2008). Curbing optimism bias and strategic
misrepresentation in planning: Reference class forecast-
ing in practice. European Planning Studies, 16, 3–21.
http://dx.doi.org/10.1080/09654310701747936.
Flyvbjerg, B., Garbuio, M., & Lovallo, D. (2009). Delu-
sion and deception in large infrastructure projects: Two
models for explaining and preventing executive disaster.
California Management Review, 51, 170–193. http://dx.
doi.org/10.1225/CMR423.
Forsyth, D. K., & Burt, C. D. B. (2008). Allocating time to
future tasks: The effect of task segmentation on planning
fallacy bias. Memory and Cognition, 36, 791–798. http://
dx.doi.org/10.3758/MC.36.4.791.
Geiselman, R. E., & Callot, R. (1990). Reverse versus
forward recall of script based texts. Applied Cognitive
Psychology, 4, 141-144. http://dx.doi.org/10.1002/acp.
2350040206.
Geiselman, R. E., Fisher, R. P., MacKinnon, D. P., & Hol-
land, H. L. (1986). Enhancement of eyewitness mem-
ory with the Cognitive Interview. American Journal
of Psychology, 99, 385–401. http://dx.doi.org/10.2307/
1422492.
Gollwitzer, P. M. (1999). Implementation intentions: Strong
effects of simple plans. American Psychologist, 54, 493–
503. http://dx.doi.org/10.1037/0003-066X.54.7.493.
Griffin, D., & Buehler, R. (1999). Frequency, probabil-
ity, and prediction: Easy solutions to cognitive illusions?
Cognitive Psychology, 38, 48–78. http://dx.doi.org/10.
1006/cogp.1998.0707.
Hadjichristidis, C., Summers, B., & Thomas, K. (2014).
Unpacking estimates of task duration: The role of typ-
icality. Journal of Experimental Social Psychology, 51,
45–50. http://dx.doi.org/10.1016/j.jesp.2013.10.009.
Halkjelsvik, T., & Jørgensen, M. (2012). From origami to
software development: A review of studies on judgment-
based predictions of performance time. Psychologi-
cal Bulletin, 138, 238–271. http://dx.doi.org/10.1037/
a0025996.
Hinds, P. J. (1999). The curse of expertise: The effects of
expertise and debiasing methods on predictions of novice
performance. Journal of Experimental Psychology: Ap-
plied, 5, 205–221. http://dx.doi.org/10.1037/1076-898X.
5.2.205.
Kahneman, D., & Tversky, A. (1979). Prospect theory: An
analysis of decisions under risk. Econometrica, 47, 313–
327.
Koehler, D., & Poon, C. S. K. (2006). Self-predictions over-
weight strength of current intentions. Journal of Exper-
imental Social Psychology, 42, 517–524. http://dx.doi.
org/10.1016/j.jesp.2005.08.003.
König, C. J. (2005). Anchors distort estimates of expected
duration. Psychological Reports, 96, 253–256. http://dx.
doi.org/10.2466/pr0.96.2.253-256.
Krizan, Z., & Windschitl, P. D. (2007). The influence of out-
come desirability on optimism. Psychological Bulletin,
133, 95–121. http://dx.doi.org/10.1037/0033-2909.133.
1.95.
Kruger, J., & Evans, M. (2004). If you don’t want to be
late, enumerate: Unpacking reduces the planning fallacy.
Journal of Experimental Social Psychology, 40, 586–598.
http://dx.doi.org/10.1016/j.jesp.2003.11.001.
Kunda, Z. (1990). The case for motivated reasoning. Psy-
chological Bulletin, 108, 480–498. http://dx.doi.org/10.
1037/0033-2909.108.3.480.
Lay, C. (1986). At last, my research article on procrasti-
nation. Journal of Research in Personality, 20, 474–495.
http://dx.doi.org/10.1016/0092-6566(86)90127-3.
LeBoeuf, R. A., & Shafir, E. (2009). Anchoring on
the “here” and “now” in time and distance judgments.
Journal of Experimental Psychology: Learning, Mem-
ory, & Cognition, 35, 81–93. http://dx.doi.org/10.1037/
a0013665.
Lewis, J. (2002). Fundamentals of project management.
New York, NY: Amacom.
Liberman, N., Trope, Y., & Stephan, E. (2007). Psychologi-
cal distance. In A. W. Kruglanski & E. T. Higgins (Eds.),
Social psychology: Handbook of basic principles (Vol. 2,
pp. 353-383). New York: Guilford Press.
Lovallo, D., & Kahneman, D. (2003). Delusions of success:
How optimism undermines executives’ decisions. Har-
vard Business Review, 81, 56–63.
Lovins, A. B. (1976). Energy strategy: The road not taken?
Foreign Affairs, 2, 187–217.
Lynch, J. G., Netemeyer, R. G., Spiller, S. A., & Zammit, A.
(2010). A generalizable scale of propensity to plan: The
long and the short of planning for time and for money.
Journal of Consumer Research, 37, 108–128. http://dx.
doi.org/10.1086/649907.
McGlone, M., & Harding, J. (1998). Back (or forward?) to
the future: The role of perspective in temporal language
comprehension. Journal of Experimental Psychology,
24, 1211–1223. http://dx.doi.org/10.1037/0278-7393.24.
5.1211.
Moher, E. (2012). Tempering optimistic bias in temporal
prediction: The role of psychological distance in the un-
packing effect. Ph.D. Thesis, University of Waterloo, On-
tario, Canada.
Min, K. S., & Arkes, H. R. (2012). When is difficult planning
good planning? The effects of scenario-based planning
on optimistic prediction bias. Journal of Applied Social
Psychology, 42, 2701–2729. http://dx.doi.org/10.1111/j.
1559-1816.2012.00958.x.
Newby-Clark, I. R., Ross, M., Buehler, R., Koehler, D. J.,
& Griffin, D. (2000). People focus on optimistic sce-
narios and disregard pessimistic scenarios while predict-
ing task completion times. Journal of Experimental Psy-
chology: Applied, 6, 171–182. http://dx.doi.org/10.1037/
1076-898X.6.3.171.
Oppenheimer, D. M., Meyvis, T., & Davidenko, N. (2009).
Instructional manipulation checks: Detecting satisficing
to increase statistical power. Journal of Experimental So-
cial Psychology, 45(4), 867–872. http://dx.doi.org/10.
1016/j.jesp.2009.03.009.
Peetz, J., & Buehler, R. (2009). Is there a budget fal-
lacy? The role of savings goals in the prediction of
personal spending. Personality and Social Psychol-
ogy Bulletin, 35, 1579–1591. http://dx.doi.org/10.1177/
0146167209345160.
Peetz, J., Buehler, R., & Wilson, A. E. (2010). Planning for
the near and distant future: How does temporal distance
affect task completion predictions? Journal of Experi-
mental Social Psychology, 46, 709-720. http://dx.doi.org/
10.1016/j.jesp.2010.03.008.
Poon, C. S. K., Koehler, D. J., & Buehler, R. (2014). On
the psychology of self-prediction: Considerations of situ-
ational barriers to intended actions. Judgment and Deci-
sion Making, 9, 207–225.
Preacher, K. J., & Hayes, A. F. (2008). Asymptotic and re-
sampling procedures for assessing and comparing indirect
effects in multiple mediator models. Behavior Research
Methods, 40, 879-891. http://dx.doi.org/10.3758/BRM.
40.3.879.
Robinson, J. B. (1982). Energy backcasting: A proposed
method of policy analysis. Energy Policy, 10, 337-344.
http://dx.doi.org/10.1016/0301-4215(82)90048-9.
Roy, M. M., Christenfeld, N., & McKenzie, C. R. M.
(2005). Underestimating the duration of future events:
Faulty prediction or memory bias? Psychological Bul-
letin, 131, 738–756. http://dx.doi.org/10.1037/0033-
2909.131.5.738.
Rutherford, D. (2008). Backwards planning. Retrieved
from http://cll.berkeley.edu.
Saintamour, F. (2008). Corporate infantry: Everything I
know about corporate sales I learned in combat. Lans-
ing, MI: Frederic Saintamour.
Strack, F., & Mussweiler, T. (1997). Explaining the enig-
matic anchoring effect: Mechanisms of selective acces-
sibility. Journal of Personality and Social Psychology,
73, 437–446. http://dx.doi.org/10.1037//0022-3514.73.3.
437.
Taylor, S. E., Pham, L. B., Rivkin, I. D., & Armor, D.
A. (1998). Harnessing the imagination: Mental simula-
tion, self-regulation, and coping. American Psychologist,
53, 429–439. http://dx.doi.org/10.1037/0003-066X.53.4.
429.
The Ball Foundation of Glen Ellyn. (2007). Backwards
planning is a great strategy for those who find it hard to
get started. Retrieved from http://www.careervision.org/
About/Backwards_Planning_Strategy.htm.
Thomas, K. E., & Handley, S. J. (2008). Anchoring in time
estimation. Acta Psychologica, 127, 24–29. http://dx.doi.
org/10.1016/j.actpsy.2006.12.004.
Trope, Y., & Liberman, N. (2003). Temporal construal. Psy-
chological Review, 110, 403–421. http://dx.doi.org/10.
1037/0033-295X.110.3.403.
Tversky, A., & Kahneman, D. (1974). Judgment under un-
certainty: Heuristics and biases. Science, 185, 1124–
1131. http://dx.doi.org/10.1126/science.185.4157.1124.
Verzuh, E. (2005). The fast forward MBA in project man-
agement: Quick tips, speedy solutions, and cutting-edge
ideas. Hoboken, NJ: John Wiley & Sons.
Wiggins, G., & McTighe, J. (1998). Understanding by de-
sign. Alexandria, VA: ASCD.
Wilson, T. D., & Gilbert, D. T. (2003). Affective fore-
casting. In M. P. Zanna (Ed.), Advances in experimen-
tal social psychology (Vol. 35, pp. 345–411). San Diego,
CA: Academic Press. http://dx.doi.org/10.1016/S0065-
2601(03)01006-2.
Unit 2 [GM592: Project Planning and the Project Plan]
Assignment 1: Individual Assignment
In this Assignment, you will be assessed on the following outcome:
GM592-2: Plan schedule management with associated resources.
This Assignment is designed to evaluate your ability to research, organize, and demonstrate project
data and financial information pertaining to the development of the feasibility study within the project
initiation phase. These exercises mimic actual situations one could expect to occur between the
project manager and their sponsor or key stakeholders. The assessment is directed toward measuring mastery of synthesis of information, proper classifications, critical thinking, attention to detail, explanations, and professional acumen.
Given the information provided to you for your assigned rocket assembly project (see Course Resources):
Construct a Resource Allocation Matrix for the labor, equipment, and material allocated to your assigned rocket project.
Construct a Gantt schedule based on the WBS activities at the task level, with associated deliverable milestones, for your assigned rocket.
Construct a project Network Diagram for your assigned rocket project (a CPM sketch follows this list).
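As an illustration of the forward pass, backward pass, float, and critical-path calculations the Network Diagram requires, here is a minimal Python sketch. The five activities, durations, and predecessor links are hypothetical placeholders, not the assigned rocket project; substitute the task-level WBS activities, durations, and immediate predecessors from your own project.

```python
# Minimal CPM sketch for an activity-on-node network: forward pass, backward
# pass, total float, and critical path. All data below is illustrative.

# task: (duration in weeks, immediate predecessors); listed in an order where
# every predecessor appears before its successors (a topological order).
tasks = {
    "A": (2, []),
    "B": (1, []),
    "C": (3, ["A"]),
    "D": (1, ["A", "B"]),
    "E": (2, ["C", "D"]),
}

# Forward pass: earliest start (ES) and earliest finish (EF).
es, ef = {}, {}
for t, (dur, preds) in tasks.items():
    es[t] = max((ef[p] for p in preds), default=0)
    ef[t] = es[t] + dur
project_duration = max(ef.values())

# Backward pass: latest finish (LF) and latest start (LS).
lf, ls = {}, {}
for t in reversed(list(tasks)):
    dur, _ = tasks[t]
    successors = [s for s, (_, preds) in tasks.items() if t in preds]
    lf[t] = min((ls[s] for s in successors), default=project_duration)
    ls[t] = lf[t] - dur

# Total float = LS - ES; zero-float activities form the critical path.
total_float = {t: ls[t] - es[t] for t in tasks}
critical_path = [t for t in tasks if total_float[t] == 0]

print("Minimum project duration:", project_duration, "weeks")
print("Critical path:", " -> ".join(critical_path))
print("Float per activity:", total_float)
```

With the placeholder data this prints a 7-week minimum duration and the zero-float path A -> C -> E; when run with the real task data, the minimum duration and critical path should match your Gantt schedule.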
You will need an evening for data collection. This will require about two hours to write up your
findings. Download the appropriate templates in Course Resources to record the information found.
Alternatively, you can use the recommended internet links in the appendix of your text for a similar template. The document you use must meet all criteria specified in the grading rubric. Fill in all sections completely using the assignment scenario assigned to you in Course Resources.
1. Go to Course Resources or locate an appropriate template from the internet that meets the rubric
criteria. Using this worksheet and your assigned project, answer the sections on the worksheet. If
there are sections missing from the template for the assigned documents but required in the
rubric, be sure to address them. Upload all documents as separate files to the designated team members’ Dropbox.
2. Go to the internet and find a product description template that meets the rubric criteria. Using this
worksheet and your assigned project, answer the sections on the worksheet. Upload this document to the designated team members’ Dropbox.
3. Ensure that your project documents address the criteria of the rubric and follow the stated
requirements.
Directions for Submitting your Individual Assignment:
To submit your Unit 2 individual assignment, upload all assignment documents to the Unit 2
Assignment 1 Dropbox. Make sure that you have saved a copy of each of the tools to submit for
this assignment.
You have three deliverables. Be sure to upload all deliverables into the Unit 2 Assignment 1
Dropbox.
GM592 Unit 2 Individual Assignment (Points Possible / Points Earned)

Content (0-24 points)

1. Tool Development (Resource Allocation Matrix): 8 points
a) Contains all project WBS tasks?
b) Contains all requisite/identified labor resources?
c) Identifies method of cross-impacting labor resources to project WBS tasks?

2. Tool Development (Gantt schedule): 8 points
a) All WBS tasks depicted?
b) Each task level activity depicted with incremental bar(s)?
c) Schedule relationships specified using Critical Path Method (CPM)?
d) Critical path correctly identified according to CPM?

3. Tool Development (Network Diagram): 8 points
a) Utilizes activity-on-node format?
b) Project divided into all task level activities with task number, task name, and predecessors identified?
c) Task level activities correctly sequenced according to precedence (reflects Gantt schedule)?
d) Critical path correctly identified according to CPM?
e) Durations for all activities correctly entered (same as Gantt durations)?
f) Forward pass correctly performed?
g) Backward pass correctly performed?
h) Float correctly calculated?
i) Minimum project duration same as reflected on Gantt schedule?

Analysis (0-9 points)
Response exhibits strong higher-order critical thinking and analysis (e.g., evaluation). Paper shows original thought. (3 points)
Analysis includes proper classifications, explanations, comparisons and inferences. (3 points)
Critical thinking includes appropriate judgments, conclusions and assessment based on evaluation and synthesis of information. (3 points)

Writing (0-7 points)
Grammatical skills are strong with typically less than one error per page. Correct use of APA when assigned. (3 points)
Appropriate to the assignment, fresh (interesting to read), accurate (no far-fetched, unsupported comments), precise (say what you mean), and concise (not wordy). (2 points)
Project is in 12-point font. Narrative sections are double-spaced with a double space between. Project is free of serious errors; grammar, punctuation, and spelling help to clarify the meaning by following accepted conventions. (2 points)

Total: 40 points
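The following minimal sketch illustrates the cross-impacting of labor resources to WBS tasks that rubric item 1 asks for: each row is a WBS task, each column a labor resource, and row/column totals make over- or under-allocation visible. The task names, resource names, and person-week figures are hypothetical placeholders, not data from the assigned project.

```python
# Minimal sketch of a Resource Allocation Matrix (RAM): rows are WBS tasks,
# columns are labor resources, each cell is the planned allocation in
# person-weeks. All names and numbers are illustrative placeholders.

wbs_tasks = ["Task A", "Task B", "Task C"]
resources = ["Assembler", "Fitter", "Painter"]

# allocation[task][resource] = person-weeks assigned (illustrative values)
allocation = {
    "Task A": {"Assembler": 4, "Fitter": 0, "Painter": 0},
    "Task B": {"Assembler": 3, "Fitter": 1, "Painter": 0},
    "Task C": {"Assembler": 0, "Fitter": 0, "Painter": 2},
}

# Print the matrix with row and column totals so over- or under-allocation
# of any skill set is easy to spot.
print(f"{'WBS task':<10}" + "".join(f"{r:>10}" for r in resources) + f"{'Total':>8}")
for task in wbs_tasks:
    row = allocation[task]
    print(f"{task:<10}" + "".join(f"{row[r]:>10}" for r in resources)
          + f"{sum(row.values()):>8}")

col_totals = [sum(allocation[t][r] for t in wbs_tasks) for r in resources]
print(f"{'Total':<10}" + "".join(f"{c:>10}" for c in col_totals)
      + f"{sum(col_totals):>8}")
```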
Unit 2 [GM592: Project Planning and the Project Plan]
Assignment 2: Team Assignment
This Assignment is designed to evaluate your ability to research, organize, and demonstrate
project data and financial information pertaining to the development of the schedule baseline
within the project planning phase. These exercises mimic actual situations one could expect to
occur between the project manager and the sponsor or key stakeholders. The assessment is directed toward measuring mastery of synthesis of information, proper classifications, critical thinking, attention to detail, explanations, and professional acumen.
In this Assignment, you and your team will develop a Definitive Duration Estimate and a
Schedule Management Plan for your assigned rocket project.
You and your team have received word that your company, Space Systems Technologies (SST), does not employ the “Fitter” labor resource you will need in assembling the rocket. As a
corporate policy, this skill set is outsourced as part of an agreement with the local Fitters Union
to keep an open shop. All Fitter labor needs are required to be outsourced to the Fitters Union
Local #1234 for all new rocket assembly projects. The SST human resources department will
coordinate with the procurement office and provide the assigned rocket assembly project with
the necessary Union Fitters once they receive the estimated number of labor hours required for
this project.
Identify your group members and, using the collaboration area below Unit 6, collaborate to develop the following for your assigned Ansari “X-Prize” project entry:
Develop a Definitive Duration Estimate for your assigned rocket project down to the work package level (a roll-up sketch follows this list).
Develop a Schedule Management Plan for your assigned rocket project.
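As a sketch of the roll-up arithmetic behind the Definitive Duration Estimate, the snippet below enters labor hours only at the work-package level and totals them per skill set up to the task level and the whole project. The WBS numbers, skill sets, and hours are illustrative assumptions, not figures from the assigned project.

```python
# Sketch of the Definitive Duration Estimate roll-up: hours are entered at
# the work-package level, then totaled per skill set to the parent task and
# to the project. All identifiers and hours below are illustrative.

from collections import defaultdict

# (work package id, parent task id, {skill set: estimated hours})
work_packages = [
    ("1.1", "1.0", {"Assembler": 16, "Fitter": 8}),
    ("1.2", "1.0", {"Assembler": 24, "Fitter": 0}),
    ("2.1", "2.0", {"Assembler": 12, "Fitter": 4}),
]

task_totals = defaultdict(lambda: defaultdict(int))   # task -> skill -> hours
project_totals = defaultdict(int)                     # skill -> hours

for wp_id, task_id, hours in work_packages:
    for skill, h in hours.items():
        task_totals[task_id][skill] += h   # roll up to the parent task
        project_totals[skill] += h         # roll up to the project

for task_id, skills in sorted(task_totals.items()):
    print(task_id, dict(skills), "task total:", sum(skills.values()))
print("Project totals by skill set:", dict(project_totals))
print("Total project labor hours:", sum(project_totals.values()))
```

In this placeholder example, the Fitter column total (12 hours) is the kind of figure SST's human resources department would pass to the Fitters Union when requesting outsourced Fitter labor.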
Your team will need an evening for data collection. This will require about two hours per group
member to write up your findings. Download the appropriate templates in Course Resources to
record the information found. Alternatively, you can use the recommended internet links in the appendix of your text for a similar template. The document you use must meet all criteria specified in the grading rubric. Fill in all sections completely using the assignment scenario assigned to you in Course Resources.
1. Go to Course Resources or locate an appropriate template from the internet that meets the
rubric criteria. Using this worksheet and your assigned project, answer the sections on the
worksheet. If there are sections missing from the template for the assigned documents but
required in the rubric, be sure to address them. Upload all documents as separate files to
the designated team members’ Dropbox.
2. Go to the internet and find a product description template that meets the rubric criteria.
Using this worksheet and your assigned project, answer the sections on the worksheet.
Upload this document to the designated team members’ Dropbox.
3. Ensure that your project documents address the criteria of the rubric below and follow the stated requirements.
Directions for Submitting your Team Assignment:
To submit your Unit 2 Team assignment, have one person designated by the team upload
all assignment documents to the Unit 2 Assignment 2 Dropbox. Make sure that you have
saved a copy of each of the tools to submit for this assignment.
Each team member must submit a peer evaluation individually to your Unit 2 Assignment 2
Dropbox.
GM592 Unit 2 Team Assignment (Points Possible / Points Earned)

Content (0-30 points)

1. Tool Development (Definitive Duration Estimate): 15 points
a) Project scope (WBS) correctly entered down to work package level?
b) All labor skill sets accounted for with own column?
c) Levels of effort correctly entered at the work package levels?
d) Level of effort (in hours) correctly rolled up (totaled) to each requisite sub-task level?
e) Sub-task totals correctly rolled up (totaled) to each requisite task level?
f) Task level totals for each labor skill set totaled to reflect total task labor hours?
g) Each labor skill set's task totals tallied at the bottom of each skill set column to reflect that skill set's total labor for the project?
h) All labor hours tallied to show total labor hours for the project?
i) All labor levels of effort calculated to show total labor for each increment by task?

2. Tool Development (Schedule Management Plan): 15 points
a) Scheduling methodology/tool to be used for the project schedule specified?
b) Level of accuracy specified to determine activity duration estimates (round up to the next 1 hour/day/month)?
c) Units of measure (hours/days/weeks/months)?
d) Organizational procedures links (WBS numbering mapped to accounting numbering mapped to general ledger/budget line, etc.)?
e) Process for updating the status and recording progress in the schedule tool identified and defined?
f) Control thresholds specified for schedule variances between planned value and actual costs (EVM variance thresholds)?
g) Rules for performance measures specified (SV, SPI, etc.) with definitions?
h) Reporting formats and frequency for schedule reports defined?
i) Process descriptions documented?

Analysis (0-11 points)
Response exhibits strong higher-order critical thinking and analysis (e.g., evaluation). Paper shows original thought. (3 points)
Analysis includes proper classifications, explanations, comparisons and inferences. (4 points)
Critical thinking includes appropriate judgments, conclusions and assessment based on evaluation and synthesis of information. (4 points)

Writing (0-9 points)
Grammatical skills are strong with typically less than one error per page. Correct use of APA when assigned. (3 points)
Appropriate to the assignment, fresh (interesting to read), accurate (no far-fetched, unsupported comments), precise (say what you mean), and concise (not wordy). (3 points)
Project is in 12-point font. Narrative sections are double-spaced with a double space between. Project is free of serious errors; grammar, punctuation, and spelling help to clarify the meaning by following accepted conventions. (3 points)

Peer Evaluation
Minus points lost on Peer Evaluation (15 = 0, 14 = -1, 13 = -2, etc.)

Total: 50 points
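The Schedule Management Plan rubric asks for control thresholds on schedule variances and rules for performance measures such as SV and SPI. The sketch below applies the standard earned-value definitions (SV = EV - PV, SPI = EV / PV) to illustrative numbers; the PV/EV figures and the 10% threshold are assumptions, not values taken from the assigned rocket project.

```python
# Standard earned-value schedule measures named in the rubric:
#   SV  = EV - PV   (schedule variance, in currency units)
#   SPI = EV / PV   (schedule performance index)
# The PV/EV figures and the 10% control threshold are illustrative only.

planned_value = 20_000.0  # PV: budgeted cost of work scheduled to date
earned_value = 17_500.0   # EV: budgeted cost of work actually performed

sv = earned_value - planned_value   # negative value => behind schedule
spi = earned_value / planned_value  # value below 1.0 => behind schedule

# Example control threshold: flag the schedule if |SV| exceeds 10% of PV.
threshold = 0.10
needs_action = abs(sv) > threshold * planned_value

print(f"SV = {sv:,.0f}  SPI = {spi:.2f}  corrective action needed: {needs_action}")
```

With these placeholder figures, SV is -2,500 and SPI is 0.88, which exceeds the example 10% threshold and would trigger the corrective-action process defined in the plan.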