The RTA's 2003-2006 trial proved that flashing lights significantly improve road safety around schools. It also showed that Type 1 lights (2 flashing lights only) were more effective at reducing traffic speeds than Type 3 lights (2 flashing lights plus a flashing ring around the "40" numerals).
The government claimed, however, that the original lights were unreliable, as they averaged 2 faults in 18 months (99.3% reliability). It embarked on a new trial that ran for 6 months from February 2007.
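The report does not explain how "99.3% reliability" was derived. A rough back-calculation (a sketch only, assuming the figure means the fraction of time the lights were operational, which the source does not confirm) shows what it would imply:

```python
# Back-calculation of what "99.3% reliability" over 18 months would imply,
# ASSUMING the figure means uptime as a fraction of the trial period.
# (The RTA's actual definition of "reliability" is not given.)

TRIAL_DAYS = 18 * 30.44          # ~18 months expressed in days
reliability = 0.993              # the government's quoted figure
faults = 2                       # average number of faults per sign

downtime_days = TRIAL_DAYS * (1 - reliability)
downtime_per_fault = downtime_days / faults

print(f"Implied total downtime: {downtime_days:.1f} days")
print(f"Implied downtime per fault: {downtime_per_fault:.1f} days")
```

On this reading, 2 faults in 18 months corresponds to under 4 days of total downtime, i.e. roughly 2 days per fault.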
Seven companies were selected to install lights, at a total cost of $7.2M for 100 school zones.
The trial was supposed to compare results from 100 school zones. Due to poor planning and the use of non-scientific processes and procedures, however, data for 42 school zones had to be excluded. Speed data was therefore analysed for only 58 school zones.
Unlike in the first trial, there were no follow-up speed measurements, no control sites without lights were used to validate the results, and speed measurements were carried out by the RTA rather than by the independent consultants.
The report runs to just 16 pages, compared with 69 pages for the first report (excluding appendices). It contains none of the detailed statistical analysis found in the first report (e.g. the comparison of ANOVA vs ANCOVA analysis).
The report was never published and access to it was denied; it had to be obtained via Freedom of Information (GIPA) legislation. The reason the RTA suppressed it is obvious.
In spite of the rigorous first trial having found that Type 1 lights were the most effective at reducing speeds, the second trial barely analysed them. The RTA had clearly already decided to adopt the far more expensive Type 3 lights and hence had no interest in properly assessing Type 1 lights:
The 2007 report conveniently reversed the 2003-2006 figures and found that on average Type 1 lights reduced 85th percentile speeds by 4.1km/h versus 6.4km/h for Type 3 lights (Standard PAD type).
Unlike the first report, however, the second report makes no claims about the results being statistically significant. On the contrary, it states that without further analysis the results CANNOT be claimed to be statistically significant (p.7, last para.). In other words, the results as presented were meaningless.
The report goes on to further qualify the results by stating:
"ARRB cautions the reader in accepting this ranking as an absolute result. Analysis shows the ranking can and does vary depending on the criteria being used and the ranking will vary across speed zone and road environments." (Summary p.1, last para.)
"An additional limit of the data that needs to be considered is that the traffic surveys have been undertaken at different times of the school year. This is likely to introduce an effect on the consistency of traffic flows through the trial sites due to season affects at the various sites. The extent of this cannot be gauged based on the information provided and hence the affect on the analysis cannot be quantified." (p.8, Sect 2.4.2, para. 3)
"As a final point, ARRB believes there are significant constraints embedded in the analysis undertaken for this report. These constraints have been briefly discussed in this report and ARRB believes caution must be applied when discussing the results or interpreting them as absolutes." (p.16, Sect. 4.2, para. 4)
The first report contains no similar qualifications.
A significant finding of the report was that for 70km/h school zones, Type 1 lights reduced traffic speeds by far more than Type 3 (standard) lights in all measurement categories.
Most importantly, they reduced the percentage of vehicles travelling at high speed (20km/h or more over the speed limit) by over three times the percentage reduction achieved by Type 3 (standard) lights (a 7.6% vs a 2.4% reduction).
No Type 1 lights were tested and analysed in 80-100km/h speed limit areas.
There is thus no evidence that Type 3 (standard) lights, which the RTA has adopted, are any more effective than Type 1 lights at reducing traffic speeds on main roads.
There were various other issues with the 2007 trial:
There were numerous faults with the lights during the trial. They were found to be 98.2% reliable overall, i.e. less reliable than the lights used in the original trial. See this media release from the Premier (p.2, last bullet point).
The Schoolzone Santa monitored half a dozen of the 100 sites. He recorded and photographed the following failures during the trial. Full details are on his web site.
The above fault trend with the RTA's much-hyped "new technology" lights continued following the trial. Notable examples include:
Only one type of light was 100% reliable during the trial (and for the 3 years following) - the cheap ones installed at Peakhurst and Lugarno.
The lights used in the original trial cost $12,000 per sign.
The lights used in the second trial cost up to $75,000 per sign, yet were less reliable. For example:
| Supplier | Signs | Total cost | Cost per sign |
|---|---|---|---|
| Astucia | 10 | $750,000 | $75,000 |
| Streetscape Projects (Moses Obeid) | 20 | $1,475,000 | $73,750 |
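The per-sign figures follow directly from the quoted totals, and the comparison with the original trial's $12,000 signs can be sketched as follows (a simple arithmetic check, using only the figures above):

```python
# Per-sign cost check against the original trial's $12,000 signs.
# Figures are the quoted contract totals and sign counts from the trial.
suppliers = {
    "Astucia": (750_000, 10),
    "Streetscape Projects": (1_475_000, 20),
}
ORIGINAL_COST_PER_SIGN = 12_000  # original trial

for name, (total, signs) in suppliers.items():
    per_sign = total / signs
    multiple = per_sign / ORIGINAL_COST_PER_SIGN
    print(f"{name}: ${per_sign:,.0f} per sign, "
          f"over {multiple:.0f}x the original trial's cost")
```

Both suppliers' signs come out at roughly six times the per-sign cost of the original trial's lights.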
Part of the high cost of the new lights was the back-to-base monitoring, an RTA requirement. In spite of that capability, it took up to a week for faults to be fixed during the trial.
If faults with lights on some of Sydney's busiest roads during the government's much-hyped trial were not fixed for a week, what is the point of back-to-base monitoring?
Hundreds of students and parents pass the signs each day. The schools need only offer a small reward to the first child who reports a fault with the lights. Parents who drop off their children every day could also be rostered on to check the lights daily.
Lights that are monitored by schools are infinitely better than no lights at all, which is the inevitable result for over 90% of school zones at present due to the high cost of back-to-base monitoring.
Many of the lights were installed behind trees, poles or bends in the road. What is the point of spending thousands of dollars on lights that cannot be seen?