Due Diligence

Design Safety Decisions Don't Disappear

A recent civil case in New South Wales has highlighted the importance of diligent ‘safety in design’ decisions being made by architects and engineers.

The case in question involved a golf club patron falling into a sunken garden bed adjacent to a car park. The car park provided 90-degree car parking. A kerb separated the garden bed and the car park. The patron’s car was parked with its boot facing the kerb.

At the time of the fall, the patron had placed a large object into his car boot and stepped backwards from the kerb. However, what appeared to be a garden bed at the level of the car park was in fact a sunken garden bed some 800 millimetres below car park level, with foliage that had grown to the level of the car park.

As a result of the fall, the patron sustained injuries. He subsequently sued the golf club as owner and operator of the facility. He also sued the designer of the garden bed, a prominent architecture firm. Following appeals, the courts ruled in favour of the plaintiff, finding that both the golf club and the architecture firm had been negligent, with liability divided 75 per cent and 25 per cent respectively.

A key point in the finding against the architecture firm was that they, as designer of the landscape that included the garden bed, must have had knowledge of the types of plants in the location, and could have reasonably foreseen that these would grow to obscure the depth of the garden bed next to the car park. The court also found that the provision of a balustrade would have prevented the injuries sustained by the plaintiff, and that this could reasonably have been included in the garden bed design.

The safety influence of designers

Designers, including architects and engineers, have enormous influence over the safety of our designs. Our decisions determine how our designs may be constructed, operated, used, maintained, upgraded, decommissioned and disposed of. And with great power comes great responsibility.

Designers must meet many responsibilities with their designs, including function, cost, contract terms, time frames, constructability, operability, maintainability, environmental impact and safety. As we make design decisions we attempt to foresee the future, when our ideas become material reality and our design decisions are put into practice. Through this foresight we attempt to balance our many responsibilities. Arguably the foremost amongst these is safety.

And so as designers we stand at our point in time, trying to foresee all credible safety incidents that may occur on, in, around, and because of our designs, and to address them through our design decisions.

Unfortunately, despite our best efforts, this is an imperfect exercise. The best we can do is convince ourselves that we've not overlooked any critical safety issues, and that we have provided a design that includes all reasonable measures to address these critical issues. That is, no matter what happens in the future, we want to know (and demonstrate) that right here and now we are making diligent decisions, addressing safety and balancing all our other responsibilities.

In the end, although we use foresight, our decisions will be judged in hindsight. We need to consider how our decisions will be examined if (when) something goes wrong and someone gets hurt. In general this means that we need to show that during the design phase we had considered the potential for the incident (or had good reason to have missed it), and that all reasonably practicable design measures were included.

Developing good answers to these questions during the design phase is becoming increasingly important, not just for fear of litigation, but because the national Model Work Health and Safety legislation places increased focus on the safety influence and responsibilities of organisations' 'officers', a group that could very likely include engineers and architects.

The diligent design safety process

So how can designers do this? Firstly, we need to demonstrate why we are confident that all credible, critical issues are identified at the design stage. A good approach to this is a vulnerability assessment. It provides a formal, high-level argument that all critical safety risks to all exposed groups in all project phases have been identified.
Patrons falling into the lowered garden bed during car park use would seem to be quite foreseeable.

From there, any obviously reasonable measures must be implemented to address the identified risks. Measures commonly implemented by designers in similar situations are a very good guide to this. It may involve applying a design standard or guideline, or standard industry measures. This recognised good practice is the minimum that must be in place. It may be in the form of a specific design, performance requirements, or a general approach to similar issues, but it must be implemented.

In this instance, provision of a physical barrier to prevent people falling into a lower area adjacent to a foot-trafficked area may be considered recognised good practice. Or perhaps there is a general design principle that steps or drop-offs should not be located at the end of 90-degree car parking. The importance of knowledge sharing among designers is obvious.

The crucial final step then involves considering any further potential measures that may be implemented in addition to recognised good practice. At this point the range of other designer responsibilities can be added to the balance. For any further potential measure, the benefit, in terms of risk reduction, can be balanced against the functional, financial, environmental and other implications to determine if the measure in question is justified.

For instance, signage warning of the lowered garden bed would have been a further potential measure. Selection of low-height mature plants for the garden bed to emphasise the lower ground level would also have been an option. The increased expense of signage or (for example) plant purchase and watering costs could then be balanced against the benefits provided by greater patron awareness of the drop-off. A minimal sketch of this balancing step follows below.

This approach allows designers to take their diverse responsibilities into account while still ensuring good design safety decisions are made. It provides, through recognised good practice, a minimum level of protection against foreseeable risks for all exposed persons. It then provides financial efficiency by allowing designers to balance their other responsibilities against the benefits of any further options. And through this process, our design decisions can be carried forward into reality with the knowledge that we have exercised safety due diligence.
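To make that balancing step concrete, here is a minimal sketch in Python of how a designer might tabulate further measures against their expected risk-reduction benefit. The measure names and dollar figures are purely illustrative assumptions, not values from the case or from any published guidance.

```python
# Illustrative sketch only: balancing further design measures (beyond
# recognised good practice) against their risk-reduction benefit.
# All names and figures below are hypothetical assumptions.

candidate_measures = [
    # (measure, assumed lifecycle cost $, assumed risk-reduction benefit $)
    ("Warning signage at kerb", 2_000, 5_000),
    ("Low-height mature planting", 8_000, 6_000),
]

def is_justified(cost, benefit):
    # Here a measure is treated as justified when its expected benefit
    # outweighs its lifecycle cost; a real assessment would also weigh
    # functional and environmental implications.
    return benefit >= cost

for measure, cost, benefit in candidate_measures:
    verdict = "justified" if is_justified(cost, benefit) else "not justified"
    print(f"{measure}: cost ${cost:,}, benefit ${benefit:,} -> {verdict}")
```

In this toy tabulation the signage passes the test and the planting option does not; the point is simply that the reasoning, and the numbers behind it, are recorded and can be revisited in hindsight.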

This article first appeared on Sourceable.


Everyone is Entitled to Protection – But not Always the Same Level of Risk

When it comes to dealing with a known safety hazard, everyone is entitled to the same minimum level of protection.

This is the equity argument. It arises from Australia’s work health and safety legislation. It seems elementary. It is elementary. It has also, with the best intentions, been pushed aside by engineers for many years.

The 1974 UK Health and Safety at Work Act introduced the concept of “so far as is reasonably practicable” (SFAIRP) as a qualifier for duties set out in the Act. These duties required employers (and others) to ensure the health, safety and welfare of persons at work.

The SFAIRP principle, as it is now known, drew on the common law test of ‘reasonableness’ used in determining claims of negligence with regard to safety. This test was (and continues to be) developed over a long period of time through case law. In essence, it asks what a reasonable person would have done to address the situation in question.

One key finding elucidating the test is the UK’s Donoghue v. Stevenson (1932), also known as ‘the snail in the bottle’ case, which looked at what ‘proximity’ meant when considering who could be adversely affected by one’s actions.

Another is the UK’s Edwards v. National Coal Board (1949), in which the factors in determining what is ‘reasonably practicable’ were found to include the significance of the risk, and the time, difficulty and expense of potential precautions to address it.

These and other findings form a living, evolving understanding of what should be considered when determining the actions a reasonable person would take with regard to safety. They underpin the implementation of the SFAIRP principle in legislation.

And although in 1986 Australia and the UK formally severed the remaining ties between their respective legislatures and judiciaries, both the High Court of Australia and Australia's state and federal parliaments have retained and evolved the concepts of 'reasonably practicable' and SFAIRP in our unique context.

In determining what is ‘reasonable’ the Courts have the benefit of hindsight. The facts are present (though their meaning may be argued). Legislation, on the other hand, looks forward. It sets out what must be done, which if it is not done, will be considered an offence.

Legislating (i.e. laying down rules for the future) with regard to safety is difficult in this respect. The ways in which people can be damaged are essentially infinite. That people should try not to damage each other is universally accepted, but how could a universal moral principle against an infinite set of potential events be addressed in legislation?

Obviously not through prescription of specific safety measures (although this has been attempted in severely constrained contexts, for instance, specific tasks in particular industries). And given the complex and coincident factors involved in many safety incidents, how could responsibility for preventing this damage be assigned?

The most appropriate way to address this in legislation has been found, in different places and at different times, to be to invoke the test of reasonableness. That is, to qualify legislated duties for people to not damage each other with “so far as is reasonably practicable.”

This use of the SFAIRP principle in health and safety legislation, as far as it goes, has been successful. It has provided a clear and objective test, based on a long and evolving history of case law, for the judiciary to determine, after an event, if someone did what they reasonably ought to have done before the event to avoid the subsequent damage suffered by someone else. With the benefit of hindsight the Courts enjoy, this is generally fairly straightforward.

However, determining what is reasonable without this benefit - prior to an event - is more difficult. How should a person determine what is reasonable to address the (essentially infinite) ways in which their actions may damage others? And how could this be demonstrated to a court after an event?

Engineers, as a group, constantly make decisions affecting people’s safety. We do this in design, construction, operation, maintenance, and emergency situations. This significant responsibility is well understood, and safety considerations are paramount in any engineering activity. We want to make sure our engineering activities are safe. We want to make sure nothing goes wrong. And, if it does, we want to be able to explain ourselves. In short, we want to do it right. And if it goes wrong, we want to have an argument as to why we did all that was reasonable.

Some key elements of a defensible argument for reasonableness quickly present themselves. Such an argument should be systematic, not haphazard. It should, as far as possible, be objective. And through these considerations it should demonstrate equity, in that people are not unreasonably exposed to potential damage, or risk.

Engineers, being engineers, looked at these elements and thought: maths.


In 1988 the UK Health and Safety Executive (HSE) were at the forefront of this thinking. In the report of an extensive public inquiry into the proposed construction of the Sizewell B nuclear power plant, the inquiry's author, Sir Frank Layfield, recommended that the HSE, as the UK's statutory health and safety body, "should formulate and publish guidance on the tolerable levels of individual and social risk to workers and the public from nuclear power stations."

This was a new approach to demonstrating equity with regard to exposure to risk. The HSE, in their 1988 study The Tolerability of Risk from Nuclear Power Stations, explored the concept. This review looked at what equity of risk exposure meant, how it might be demonstrated, and, critically, how mathematical approaches could be used to do so. It introduced the premise that everyone in (UK) society was constantly exposed to a 'background' level of risk which they were, if not comfortable with, at least willing to tolerate. This background risk was the accumulation of many varied sources, such as driving, work activities, house fires, lightning, and so on.

The HSE put forward the view that, firstly, there is a level of risk exposure individuals and society consider intolerable. Secondly, the HSE posited that there is a level of risk exposure that individuals and society consider broadly acceptable. Between these two limits, the HSE suggested that individuals and society would tolerate risk exposure, but would prefer for it to be lowered.

After identifying probabilities of fatality for a range of potential incidents, the HSE suggested boundaries between these ‘intolerable’, ‘tolerable’ and ‘broadly acceptable’ zones, the upper being risk of fatality of one in 10,000, and the lower being risk of fatality of one in 1,000,000.

The process of considering risk exposure and attempting to bring it within the tolerable or broadly acceptable zones was defined as reducing risk “as low as reasonably practicable,” or ALARP. This could be demonstrated through assessments of risk that showed that the numerical probability and/or consequence (i.e. resultant fatalities) of adverse events were lower than one or both of these limits. If these limits were not met, measures should be put in place until they were. And thus reducing risk ALARP would be demonstrated.
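As a rough illustration of these boundaries, the sketch below classifies an assumed annual individual risk of fatality against the one in 10,000 and one in 1,000,000 limits. It is a deliberate simplification of the HSE framework, and the example risk value is an assumption for illustration only.

```python
# Minimal sketch: classifying an individual risk of fatality against the
# HSE's suggested 'intolerable' and 'broadly acceptable' boundaries.
# The example risk value is an assumption, not a measured figure.

UPPER_LIMIT = 1 / 10_000      # above this, the risk is treated as intolerable
LOWER_LIMIT = 1 / 1_000_000   # below this, the risk is broadly acceptable

def classify(risk_of_fatality):
    if risk_of_fatality > UPPER_LIMIT:
        return "intolerable"
    if risk_of_fatality < LOWER_LIMIT:
        return "broadly acceptable"
    return "tolerable, but preferably to be lowered"

print(classify(3e-5))  # 3 in 100,000 -> "tolerable, but preferably to be lowered"
```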

The ALARP approach spread quickly, with many new maths- and physics-based techniques being developed to better understand the probabilistic chains of potential events that could lead to different safety impacts. Over the subsequent 25 years, it expanded outside the safety domain.

Standards were developed using the ALARP approach as a basis, notably Australian Standard 4360, the principles of which were eventually brought into the international risk management standard ISO 31000 in 2009. This advocated the use of risk tolerability criteria for qualitative (i.e. non-mathematical, non-quantitative) risk assessments.

And from there, the ALARP approach spread through corporate governance, and became essentially synonymous with risk assessment as a whole, at least in Australia and the UK. It was held up as the best way to demonstrate that, if a safety risk or other undesired event manifested, decisions made prior to the event were reasonable.

But all was not well.

Consider again the characteristics of a defensible argument. It should be systematic, objective and demonstrate equity, in that people are not unreasonably exposed to risk.

Engineers have, by adopting the ALARP approach, attempted to build these arguments using maths, on the premise that, firstly, there are objective acceptable and intolerable levels of risk, as demonstrated by individual and societal behaviour, and, secondly, that risk exposure within specific contexts (e.g. a workplace) can be quantified against these criteria. There are problems with mathematical rigour, which introduce subjectivity when quantifying risk in this manner, but on the whole these are seen as a deficit in technique rather than philosophy, and are generally considered solvable given enough time and computing power.

However, there is another way of constructing a defensible argument following the characteristics above.

Rather than focusing on the level of risk, the precautionary approach emphasises the level of protection against risk. For safety risks it does this by looking firstly at what precautions are in place in similar scenarios. These ‘recognised good practice’ precautions are held to be reasonable due to their implementation in existing comparable situations. Good practice may also be identified through industry standards, guidelines, codes of practice and so on.

The precautionary approach then looks at other precautionary options and considers on one hand the significance of the risk against, on the other, the difficulty, expense and utility of conduct required to implement and maintain each option. This is a type of cost-benefit assessment.

In practice, this means that if two parties with different resources face the same risk, they may be justified in implementing different precautions, but only if they have first implemented recognised good practice.

Critically, however, good practice is the ideas represented by these industry practices, standards, guidelines and so on, rather than the specific practices or the standards themselves. For example, implementing an inspection regime at a hazardous facility is unequivocally considered to be good practice. The frequency and level of detail required for inspection will vary depending on the facility and its particular context, but having no inspection regime at all is unacceptable.
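The order of that argument can be sketched in a few lines of Python: recognised good practice is checked first as a non-negotiable minimum, and only then are further options weighed on a cost-benefit basis. The precaution names, dollar figures and the disproportion factor are hypothetical assumptions for illustration, not an R2A method or tool.

```python
# Minimal sketch of the precautionary (SFAIRP-style) argument described above.
# Precaution names, costs, benefits and the factor below are all assumptions.

good_practice = {"inspection regime": True, "physical barrier": True}

further_options = [
    # (option, cost to implement and maintain $, risk-reduction benefit $)
    ("remote monitoring", 50_000, 200_000),
    ("full automation", 2_000_000, 150_000),
]

def precautionary_argument(good_practice, further_options, disproportion_factor=3):
    # Step 1: recognised good practice is the minimum that must be in place.
    missing = [name for name, in_place in good_practice.items() if not in_place]
    if missing:
        return f"Not defensible: good practice missing ({', '.join(missing)})"

    # Step 2: adopt further options unless their cost is grossly
    # disproportionate to the benefit they provide.
    adopted = [name for name, cost, benefit in further_options
               if cost <= disproportion_factor * benefit]
    return f"Good practice in place; further options adopted: {adopted}"

print(precautionary_argument(good_practice, further_options))
```

Two parties facing the same risk might legitimately adopt different further options under the second step, but neither can skip the first.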

The precautionary approach provides a formal, systematic, and objective safety decision-making alternative to the ALARP approach.

Equity with regard to safety can be judged in a number of ways. The ALARP approach considers equity of risk exposure. A second approach, generally used in legislation, addresses equity through eliminating exposure to specific hazards for particular groups of people, without regard to probability of occurrence. For example, dangerous goods transport is prohibited for most major Australian road tunnels regardless of how unlikely they may be to actually cause harm. In this manner, road tunnel users are provided equity in that none of them should be exposed to dangerous goods hazards in these tunnels.

The precautionary approach provides a third course. It examines equity inherent in the protection provided against particular hazards. It provides the three key characteristics in building a defensible argument for reasonableness.

It can be approached systematically, by first demonstrating identification and consideration of recognised good practice, and the decisions made for further options.

It is clearly objective, especially after an event; either the precautions were there or they were not.

And it considers equity in that for a known safety hazard, recognised good practice precautions are the absolute minimum that must be provided to protect all people exposed to the risk. Moving forward without good practice precautions in place is considered unacceptable, and would not provide equity to those exposed to the risk. While further precautions may be justified in particular situations, this will depend on the specific context, magnitude of the risk and the resources available.

Oddly enough, this is how the Courts view the world.

The Courts have trouble understanding the ALARP approach, especially in a safety context. From their point of view, once an issue is in front of them something has already gone wrong. Their role is then to objectively judge if a defendant’s (e.g. an engineer’s) decisions leading up to the event were reasonable.

Risk, in terms of likelihood and consequence, is no longer relevant; after an event the likelihood is certain, and the consequences have occurred. The Courts’ approach, in a very real sense, involves just two questions:

1. Was it reasonable to think this event could happen (and if not, why not)?
2. Was there anything else reasonable that ought to have been in place that would have prevented these consequences?

The ALARP approach is predicated on the objective assessment of risk prior to an event. However, after an event, the calculated probability of risk is very obviously called into question. This is especially so as the Courts tend to see low-likelihood, high-consequence events.

If, using the ALARP approach, a safety risk was determined to have less than a one in 1,000,000 (i.e. ‘broadly acceptable’) likelihood of occurring, and then occurred shortly afterwards, serious doubt would be cast on the accuracy of the likelihood assessment.

But, more importantly, the Courts don’t take the level of risk into account in this way. It is simply not relevant to them. If a risk is assessed as ‘tolerable’ or ‘broadly acceptable’ the answer to the Courts’ first question above is obviously ‘yes’. The Courts’ second question then looks not at the level of risk in isolation, but at whether further reasonable precautions were available before the event.

‘Reasonable’ in an Australian legal safety context follows the 1949 UK Edwards v. National Coal Board definition and was refined by the High Court of Australia in Wyong Shire Council v. Shirt (1980). It requires that, when deciding on what to do about a safety risk, one must consider the options available and their reasonableness, not the level of risk in isolation. This is the requirement of the SFAIRP principle.

This firstly requires an understanding of whether options are reasonable by virtue of being recognised good practice. The reasonableness of further options can then be judged by considering the benefit (i.e. risk reduction) they could provide, as well as the costs required to implement them. Options judged as unreasonable on this basis may be rejected. It is only in this calculus that the level of risk (considered first in the ALARP approach) is considered by the Courts.

The ALARP approach does not meet this requirement. If a risk is determined to be ‘broadly acceptable’ then, by definition, risk equity is achieved, and no further precautions are required. But this may not satisfy the Courts’ requirement for equity of minimum protection from risk through recognised good practice precautions. It may also result in further reasonable options being dismissed.

The precautionary approach, on the other hand, specifically addresses the way in which the Courts determine if reasonable steps were taken, in a systematic, objective and equity-based manner. From a societal point of view, the Courts are our conscience. Making safety decisions consistent with how our Courts examine them would seem to be a responsible approach to engineering.

The ALARP approach was a good idea that didn’t work. With the best intentions, it was developed to its logical conclusions and was subsequently found to not meet society’s requirements as set forward by the Courts.

The precautionary approach's recent prominence has been driven by the adoption of the SFAIRP principle in the National Model Work Health and Safety Act, now adopted in most Australian jurisdictions, followed by similar changes through the Rail Safety National Law, the upcoming Heavy Vehicle National Law and others. And as the common law principle of reasonableness finds its way into more legislation, the need for an appropriate safety decision-making approach becomes paramount. It is an old idea made new, and it works. It provides equity.

Is there any good reason to not implement it?

This article first appeared on Sourceable.

Due Diligence

Unknown Knowns: The Perils of Blind Spots

When demonstrating due diligence, it’s not just what you know and who you know, it’s what you don’t know that you know.

Donald Rumsfeld’s infamous 2002 quote provoked much discussion: “…as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know…there are also unknown unknowns – the ones we don't know we don't know. It is the latter category that tend to be the difficult ones.”

Rumsfeld’s comment emphasised the importance of unforeseen (and possibly unforeseeable) risks. However, he did not speak about a potential fourth category, the ‘unknown knowns.’

T. E. Lawrence wrote of the ideal military organisation having “perfect 'intelligence,' so that we could plan in certainty.” In practice, this is essentially impossible. An executive’s difficulty in knowing what is happening throughout their organisation increases exponentially with the organisation’s size. This gives rise to many well-known and resented management frameworks, including risk and quality systems, communication protocols, timesheets, and so on.

A heuristic technique known as the Johari Window considers the intersection of a person's state of knowledge with that of their surrounding community. Adapting this to an organisation's executive's point of view gives the following Rumsfeldian categories:

Rumsfeldian Categories / Unknown Knowns / R2A Due Diligence Engineers

These ‘unknown knowns,’ or blind spots, may take a range of forms, including different solutions implemented in different departments for similar problems. At best this is inefficient, and at worst it may demonstrate that, in the case of something going badly wrong, the organisation had a different and clearly reasonable way to address the issue but failed to do so. In this way, recognised good practice may be known and understood within an organisation but not communicated to those who would fund its implementation. A situation may occur in which something goes wrong and good practice measures could have prevented it. This leaves organisations (and relevant managers) open to charges of negligence.

Blind spots may also manifest in the form of operations teams using workarounds to bypass inefficient or perceived low value systems imposed by management. These may arise from benevolent or benign intentions, but can also involve the deliberate flouting of rules or laws, as seen in the recurring ‘rogue financial trader’ scandals.

These scenarios occur again and again in large organisations, and regularly appear in high-profile crisis management media stories. A prominent recent case is Volkswagen’s 2015 diesel emissions controversy. Volkswagen’s CEO admitted that from 2009 to 2014 up to eleven million of its diesel cars (including 91,000 in Australia) had deliberate “defeat” software installed.

This software reduces engine emissions (and hence performance) when it detects the vehicle is undergoing regulatory emissions testing such as that conducted by the United States Environmental Protection Agency (EPA). During normal driving, the software increases vehicle performance (and emissions). This approach was used to have vehicles approved by US EPA regulators while still marketing the cars as high-performance vehicles.

Following the admission, Volkswagen suspended sales of some models and stated that it had set aside 6.5 billion euros to deal with the issue and its fallout. The CEO resigned, and a new chair was elected to the supervisory board. Dozens of lawsuits have since been filed against the company, including a US$61 billion suit from the US Department of Justice.

One investigation into this matter noted sociologist Diane Vaughan’s investigation into the 1986 Challenger space shuttle disaster, citing her concept of “normalisation of deviance.” The investigation stated that, rather than explicit or implicit executive direction to game the emissions testing regime, “…it’s more likely that the scandal is the product of an engineering organisation that evolved its technologies in a way that subtly and stealthily, even organically, subverted the rules.”

This can occur through ongoing ‘tweaking’ by system engineers, with no single change considered enough to break ‘the rules’ but with the accumulation over time enough to go past approved limits. Workforce turnover obviously plays a role in this, with the gradually evolving status quo more likely to be accepted than challenged by each new employee. The Volkswagen board chairman’s statement that “we are talking here not about a one-off mistake but a chain of errors” supports this view, with the German investigation’s chief prosecutor subsequently stating that “no former or current board members” were under investigation.

In almost all of these scenarios, it is eventually found that someone, somewhere in the organisation, was aware of the issue and had misgivings about the organisation’s course of action. And when this knowledge becomes public, it often does serious damage to the organisation's reputation.

One approach to tease out these often complex and hidden views, decisions and knowledge is the 'generative interview' technique. This is based on British psychologist James Reason's classifications of organisational culture, which run on a spectrum from pathological, through bureaucratic, to generative. These classifications signify a range of organisational cultural characteristics. Three key indicators of executive blind spots are an organisation's response to failure, its response to new ideas, and its attitude to issues raised within the organisation.

Pathological organisations punish failure (motivating employees to conceal it), actively discourage new ideas, and don’t want to know about issues. Bureaucratic organisations provide local fixes for failures, think that new ideas often present problems, and may find out about organisational issues if staff persist in speaking out. Generative organisations implement far-reaching reforms to address failures, welcome new ideas, and actively seek to find issues.

Generative interviews adopt a communication approach with characteristics of a generative organisational culture. They aim to gain the insight of ‘good players’ at a range of levels within an organisation. They are conducted in the spirit of enquiry rather than audit. That is, they are used to look for views, ideas and solutions rather than just for problems or non-conformances, but they listen carefully to issues raised. If an interesting idea or view is common to multiple levels of an organisation, this indicates that it should be further investigated.

When trying to demonstrate diligence in executive decision-making, this harnessing of knowledge at all levels of the organisation is critical. Without it, senior decision-makers may overlook well-known critical issues, and reasonable precautions may be missed. In a post-event investigation, it is difficult to demonstrate diligence if someone within the organisation knew about what could have gone wrong or how to prevent it but could not communicate this to those with the power to address it.

This approach is not a panacea for identifying issues faced by an organisation. However, it helps executives focus on, identify and address their organisational blind spots. In this manner it helps answer a key aspect of due diligence in decision-making: what are our unknown knowns?

This article first appeared on Sourceable.

Due Diligence

Should Vic Parliament Cool the Planet to Protect Melbourne?

One of the more interesting philosophical issues arising from the introduction of the model WHS legislation is the question of whether the precautionary principle incorporated in environmental legislation is congruent with the precautionary approach of the model WHS legislation.

The environmental precautionary principle is typically articulated as follows:

"If there are threats of serious or irreversible environmental damage, lack of full scientific certainty should not be used as a reason for postponing measures to prevent environmental degradation."

Due diligence is normally recognised as a defence for breach of that legislation. The words in Australian legislation are derived from the 1992 Rio Declaration. This formulation is usually recognised as being ultimately derived from 1980s German environmental policy. The origin of the principle is generally ascribed to the German notion of Vorsorgeprinzip, literally, the principle of foresight and planning.

The WHS legislation also adopts a precautionary approach. It basically requires that all possible practicable precautions for a particular safety issue be identified, and then those that are considered reasonable in the circumstances are to be adopted. In a very real sense, it develops the principle of reciprocity as articulated by Lord Atkin in Donoghue v. Stevenson, which follows the Christian articulation of the golden rule:

"The rule that you are to love your neighbour becomes in law you must not injure your neighbour; and the lawyer's question 'Who is my neighbour?' receives a restricted reply. You must take reasonable care to avoid acts or omissions which you can reasonably foresee would be likely to injure your neighbour."

The dark side of the golden rule, as Immanuel Kant noted, is its lack of universality. In his view, it could be manipulated by whom you consider to be your neighbour. Queen Victoria, for example, apparently considered neighbours to mean other royalty. The notion of HRH (his or her royal highness) makes it clear that everyone else is HCL (his or her common lowness). It becomes us and them rather than we.

Presently, it's not altogether clear whom our politicians regard as neighbours. At least for Australian citizens, we are all equal before Australian law, irrespective of race, religion and other such factors. So a fellow Australian citizen is at least a neighbour. Security-based electoral populism may erode this, although so far our courts have remained resolute in this regard. Victorians probably regard current and future Victorians as neighbours. But what about current and future New South Welshmen?

Interestingly, in describing what constitutes a due diligence defence under the WHS Act, Barry Sherriff and Michael Tooma favourably quote a case from the Land and Environment Court in NSW, suggesting that due diligence as a defence under WHS law parallels due diligence as a defence under environmental legislation.

Does this mean that the two precautionary approaches, despite having quite divergent developmental paths, have converged? Tentatively, the answer seems to be 'yes'. The common element appears to be the concern with uncertainty stemming from the potential limitations of scientific knowledge to describe comprehensively and predict accurately threats to human safety and the environment.

So what does this mean? In committing all these apparently convergent principles to legislation, Australian parliaments have been passing legislation to enshrine the precautionary principle as their raison d'être.

Consider global warming, which might be natural or man-made or a combination of both. As described in an earlier article, a runaway scenario that melts the Greenland ice cap would raise sea levels by seven metres. This would be tough on Melbourne and see many suburbs underwater. We Victorians seem to have the capability to cool the planet to prevent such an outcome. At $10 billion to $20 billion, we can probably afford it, judging by a $5 billion desalination plant from which we are yet to take water.

If the Victorian Parliament is serious about implementing the legislation it has enacted, then should the Parliament move to cool the planet to protect Melbourne?

This article first appeared on Sourceable.


Mixed Messages from Governments on Poles and Wires

According to the Australian Energy Regulator (AER), the share of any substantial overspend caused by unexpected events that owners of poles and wires must bear is capped at 30 per cent.

The rest can be transferred through to the consumer. That is, it does not have to be budgeted for.

Quoting the AER:

"Where an unexpected event leads to an overspend of the capex amount approved in this determination as part of total revenue, a service provider will be only required to bear 30% of this cost if the expenditure is found to be prudent and efficient. For these reasons, in the event that the approved total revenue underestimates the total capex required, we do not consider that this should lead to undue safety or reliability issues."

This has the immediate effect of making poles and wires a valuable saleable asset as the full cost of risk associated with large, rare events like the 2009 Black Saturday bushfires in Victoria does not need to be included in the valuation. For example, the recent, cumulative $1 billion payout in Victoria has relatively little effect on the profit outcomes for the owner. It also means that the commercial incentive to test for further reasonably practicable precautions to address such events is greatly reduced.
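To see why the sharing rule dulls the commercial case for extra precautions, consider the hedged sketch below. All dollar figures are hypothetical assumptions; it simply contrasts a community-wide view of a precaution with the view of an owner who bears only 30 per cent of a prudent, unexpected overspend.

```python
# Illustrative sketch only: how a 30% cost-sharing rule can change the
# commercial case for a precaution against a large, rare event.
# All dollar figures are hypothetical assumptions.

OWNER_SHARE = 0.30  # share of a prudent, unexpected overspend borne by the owner

precaution_cost = 500_000_000            # assumed cost of a network-wide precaution
expected_avoided_cost = 1_000_000_000    # assumed expected cost of the rare event avoided

community_benefit = expected_avoided_cost              # community avoids the full cost
owner_benefit = OWNER_SHARE * expected_avoided_cost    # owner avoids only its capped share

def verdict(benefit, cost):
    return "worthwhile" if benefit > cost else "not worthwhile"

print(f"Community view: benefit ${community_benefit:,.0f} vs cost ${precaution_cost:,.0f} "
      f"-> {verdict(community_benefit, precaution_cost)}")
print(f"Owner view:     benefit ${owner_benefit:,.0f} vs cost ${precaution_cost:,.0f} "
      f"-> {verdict(owner_benefit, precaution_cost)}")
```

In this toy example the precaution is worthwhile from the community's perspective but not from the owner's, which is the reduced incentive described above.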

This is inconsistent with accepted probity and governance principles. Ordinarily, all persons (natural or otherwise) are required to be responsible and accountable for their own negligence. At least this is the policy position adopted by responsible organisations like Engineers Australia. Their position requires members to practice within their area of competence and have appropriate professional indemnity insurances to protect their clients. The point is that owners and operators should be accountable for negligence, which the commercial imperative desires to abrogate.

In the case of the Black Saturday bushfires, for example, this governance failure has been practically addressed by our customary backstop, the legal system, in the form of the common law claims made by affected parties, the outcomes of the Bushfire Royal Commission, and the flow-on work by the Powerline Bushfire Safety Taskforce and the continuing Powerline Bushfire Safety Program.

Distribution of conductor-soil arcs at instant of initial contact (16 amps, 19/3.25AAC conductor)

In particular, the use of Rapid Earth Fault Current Limiting (REFCL) devices (also known as Petersen coils or Ground Fault Neutralisers) on 22-kilovolt lines has been demonstrated to have a very significant ability to prevent bushfire starts from single-phase grounding faults, faults which the Royal Commission found to be responsible for a significant number of the devastating Black Saturday fires. A program to install these in rural Victoria at a preliminary cost of around $500 million appears inevitable, but under the current regulatory regime this cost will be (mostly) passed to the consumer. It is a sad reflection that it takes the death of 173 people to get the worth of such precautions tested and established as being reasonable.

Our Parliaments have seemingly addressed this in a convoluted manner by implementing the model Work Health and Safety laws in all jurisdictions (presently excepting Victoria and Western Australia). These laws make officers (directors and others) personally liable for systemic organisational safety recklessness (cases where officers knew about, caused or allowed hazardous occurrences), providing for up to five years' jail and $600,000 in personal fines. In Queensland, it is also a criminal matter. There have not been any test cases to date, so the effectiveness of this legislation has not been evaluated.

From an engineering perspective, the exclusion of the cost implications of big, rare events from the valuation of assets means that irrational decisions with regard to safe operation will inevitably occur and that the community will periodically suffer as a result.

This article first appeared on Sourceable.

Due Diligence

Engineers Australia Safety Case Guidelines Due to Be Released

The Engineers Australia Safety Case Guideline (3rd Edition) is presently being reviewed by Engineers Australia legal counsel.

It is expected to be released by the Risk Engineering Society through Engineers Australia Media in early 2014.

This third edition of the Safety Case Guideline considers how a safety case argument can be used as a tool to positively demonstrate safety due diligence consistent with the model Work Health and Safety (WHS) legislation (Safe Work Australia 2011) and the Rail Safety National Law, among others, and provides general information concerning the concepts and application of risk theory to safety case arguments.

The Guideline adopts a precautionary approach to demonstrating safety due diligence, meaning that safety risk should be eliminated or reduced so far as is reasonably practicable (SFAIRP), rather than reduced to as low as reasonably practicable (ALARP) as encouraged by numerous Australian and international standards and regularly used by many Australian engineers. The Guideline emphasises that attempting to equate SFAIRP and ALARP is naively courageous and will not survive post-event judicial scrutiny.

The expected adoption of the Guideline represents the intellectual tipping point in the technical management of safety risk, at least in Australia, since a guideline or code of practice published by practitioners in their area of competence takes legal precedence over an industry-based standard unless that standard is called up by statute or regulation. The call-up of a standard in legislation is frowned upon by parliamentary counsel, since it delegates the power of parliament to unelected standards committees, rather defeating the purpose of a parliamentary democracy. Advice is that under the new safety legislation the hierarchy is now:

New Engineering Safety Legislation / R2A / Due Diligence Engineers

In a discussion of the legal status of standards, Minter Ellison partner Paul Wentworth concluded that "Engineers should remember that in the eyes of the courts, in the absence of any legislative or contractual requirement, an Australian Standard amounts only to an expert opinion about usual or recommended practice. Also, that in the performance of any design, reliance on an Australian Standard does not relieve an engineer from the duty to exercise his or her own skill and expertise."

Previously, due diligence meant compliance with the laws of man. The Guideline emphasises that to be safe in reality (meaning an absence of harm), one must first manage the laws of nature rather than the laws of man. Safety due diligence therefore requires a positive demonstration of the alignment of the laws of nature with the laws of man, in that order.

This is quite different to demonstrating due diligence in the finance world. Money isn't real. That is, it does not exist in a state of nature (it does not, for example, grow on trees), and therefore the laws of nature don't directly apply to it. This means that in the financial world, due diligence will probably continue to be considered the same as compliance. Equating due diligence with compliance is an approach many audit committees pursue with regard to safety risk, but one which is now effectively prohibited by statute law in most Australian jurisdictions.

This article first appeared on Sourceable (no longer available).

Due Diligence

Out of the Vault

A recent ABC News article mentions citywide tributes to one of Melbourne's most famous and controversial sculptures, Vault, more popularly known as the 'Yellow Peril'. The sculpture was removed from City Square following community and political opposition – Queen Elizabeth is said to have asked if it could be painted "a more agreeable colour" – and it was stored and then installed at a succession of locations until reaching its current home at the Australian Centre for Contemporary Art.

Vault at ACCA.

However, architectural tributes to the sculpture have apparently arisen in a number of public artworks, including works at RMIT University and the Melbourne International Gateway. Vault's sculptor, Ron Robertson-Swann, also worked with Melbourne City Council to embed a reference to the sculpture into the Swanston Street tram stops.

The tram stop tribute to Vault.

Of this project, Robertson-Swann said "That was one of the hardest design jobs I've ever done … Everything you did, oh my God, you couldn't do that [because of] health and safety”, with the result being "gentle and timid suggestions of fragments of Vault".

This shows a common problem arising from hazard-based risk management approaches, such as that espoused by AS31000. The issue arises from the cognitive and cultural differences between judgment and compliance, between art and function.

Hazard-based safety risk approaches look at a scenario and ask: “What could go wrong? What are we doing about this? Is it safe enough? Is the risk low enough?” This approach does identify safety issues and measures to address them. However, it also tends to push away new ideas and approaches, the result of a focus on ways in which these questions have previously been signed off – e.g. company procedures and industry technical standards.

A precaution-based safety risk approach, in contrast, looks first at the goals of the endeavour. Provided it is not considered prohibitively dangerous, it asks: “What safety measures should be in place so that we can move forward?” Precaution-based assessment is options-based, rather than hazard-based, and can thus effectively examine innovative suggestions as well as current practice, while maintaining a clear view of the overall goals of the process.

This is an especially powerful technique in design safety assessments. It allows artists, architects, engineers, constructors, owners, operators and maintainers to all put forward their various concerns and suggestions in a common framework. This enhances communication between stakeholders, and lets project leadership teams make decisions that clearly address and explain their consideration of the diverse requirements of any multi-discipline project.

In the case of the Swanston Street tram stops, a precaution-based design safety process that treated the safety and artistic goals as both key to project success, rather than in opposition, may have led to a stronger artistic outcome without sacrificing safety.

For more detail on this approach read R2A’s Safety Due Diligence White Paper.
