OK, so it's a little hard to explain if you don't understand how software works.
So, when it shows the effects of a proposed dispatching plan, it shows *overly optimistic effects*. If it tells the human dispatcher to follow said plan, it is an *overly optimistic plan* with *overly optimistic effects*. If it sets the switches itself automatically, it is *still* an overly optimistic plan with overly optimistic effects...
So the automated dispatching system claims that a given plan will successfully move various trains to various locations by various times; and it won't. Something will happen, a grade crossing will malfunction or something, and then the entire thing will snarl up and everything will be late.
A human dispatcher could have deliberately left extra empty sidings, held the main emptier than it needed to be, etc., and then there would be some slack which could be used in case of accidents.
An automated dispatching program *could* theoretically do this too, but *that's not what the programmers were instructed to do*. They were almost certainly instructed to write an "efficient" dispatching program which maximized the use of the capacity. This, as I said earlier, is the error.
The human dispatchers have in the back of their heads "I need to leave some slack in case something goes wrong". They'll start steering trains away from a yard when it's only 85% full, not 100% full. The automated dispatching system might hold trains off the main just-in-time to let Amtrak by; the human dispatcher would clear the way earlier, just in case something went wrong.
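To make the contrast concrete, here is a minimal sketch in Python of the two policies. The capacity number, the 15% slack margin, and the function names are all invented for illustration; this is not the API or logic of any real dispatching system.

```python
# Hypothetical illustration only: the threshold and names are invented,
# not taken from any real dispatching system.

YARD_CAPACITY = 100  # cars the yard can physically hold (made-up figure)

def route_to_yard_optimistic(cars_in_yard: int, inbound: int) -> bool:
    """The 'efficient' policy: accept trains until the yard is literally full.

    This maximizes nominal utilization, but the first surprise (a bad-order
    car, a late crew) leaves no room to sort around the problem.
    """
    return cars_in_yard + inbound <= YARD_CAPACITY

def route_to_yard_with_slack(cars_in_yard: int, inbound: int,
                             slack_fraction: float = 0.15) -> bool:
    """The human dispatcher's rule of thumb: start steering trains away
    once the yard is ~85% full, keeping headroom for when things go wrong.
    """
    usable = YARD_CAPACITY * (1.0 - slack_fraction)
    return cars_in_yard + inbound <= usable

if __name__ == "__main__":
    # At 88 cars, the optimistic policy still accepts a 10-car cut;
    # the slack policy already routes it elsewhere.
    print(route_to_yard_optimistic(88, 10))   # True  -- yard ends up 98% full
    print(route_to_yard_with_slack(88, 10))   # False -- preserves recovery room
```

The two functions differ by one line, which is the whole point: robustness here is not a hard technical problem, it is a requirements decision made when the program is commissioned.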
You CAN write programs that incorporate this know-how, but I am dead certain that when the program was commissioned, that's not what the programmers were told to do, and so they built a fragile system.
I'm certain of this because
(a) it is a really, really common error to make when commissioning computer programs for controlling things;
(b) the execs of the railroads have been obsessed with efficient use of capacity rather than with having redundancy for trouble, which makes them more likely to make this error.
Computers are very good at doing things very fast. So, if the execs ordered the human dispatchers to leave no slack, it might take months for them to adjust their behavior and clog up the railroad completely. The computer, instructed to do the same thing, will clog up the railroad within days!
When you say "automated" dispatching, do you mean the computer sets the lights and switches (e.g., for a siding), tells the human dispatcher what to do, or shows the effects of what a dispatcher wants to do?

Could be any of these, really. The point is that the dispatching system is trying to replace the *decision-making know-how* of the human dispatcher, but it's doing so with *less information*, and specifically with less information which causes it to *bias optimistically*.