

The philosophy behind modern day accident investigation


Many organisations invest heavily in loss control, be it health, welfare, control of asset damage, security, or protection of reputation and brand.  Investigations surrounding loss events have, however, often concentrated upon the ‘front end’ of the sequence of events rather than upon causation.  Indeed, if one examines how accident investigation reports are written, they are seldom chronological and often start with an explanation of how an event occurred, rather than treating the event as an incident of uncontrolled outcome that is symptomatic of a set of causal failures further up the hierarchy of the organisation.  In the past, the majority of formal work on developing investigation processes was in the area of personal injury.  This triggered a number of defensive, reactive approaches fundamentally aimed at protecting the organisation from civil claims.  The result of this positionality was that organisations were encouraged not to examine events too deeply, as doing so would be likely to expose the upper echelons to legal scrutiny.

Many systems have been developed to enable a better understanding of background processes; some even have elaborate computer-based systems that declare they will identify causation.  Sadly, the truth is that causation is rarely identified or investigated, and this is particularly the case when using these systems.  In some organisations there is even a naivety as to what the concept of ‘underlying causation’ actually means.

All organisations know that events cost them money, and many actually recognise a certain number of ‘acceptable events’ that they are willing to risk experiencing in order to reduce the costs of production and to maintain what they consider to be an acceptable competitive status.  The problem with this approach is that an organisation can very quickly accept a level of business risk that becomes the norm, fail to recognise evolutionary changes in best practice, and expose itself to potential reputational harm.  The result is that clients simply move away; little is said, competitors seem able to outperform the suffering organisation, and eventually it either re-invents itself or disappears from the marketplace.

Within most evolved organisations there has been a recognition that huge amounts of money and resources have been spent, and are still being allocated, toward understanding and reducing accidents.  This initially led to a very pronounced drop in the number of loss events, with some very favourable results.  However, even with the identification of best practice, the provision of correct materials, high-quality equipment and the selection of top-quality staff, it is now recognised that a flat line appears within the statistics.  Essentially the flat line represents a more or less stable number of events per annum: an unmeasured but accepted level of business risk.  From a financial perspective, the difference between where this line sits and zero reflects what has become known as ‘accepted hidden loss’ (AHL).  In one company AHL sat at about 13% of turnover.  It is seldom measured and often completely overlooked, but in the tighter competitive world we now find ourselves in, it can be the difference between existence and extinction.  Of more importance, in the world of tendering, is the cumulative AHL linked to a large project, where each sub-organisation involved submits costs that knowingly or unknowingly include this hidden amount.
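To make the cumulative point concrete, here is a minimal sketch in Python of how the AHL carried by each sub-organisation can stack up within a single tender.  The contractor names, costs and percentages are invented for illustration only and are not drawn from any real project.

```python
# Illustrative sketch only: how the hidden loss (AHL) carried by each
# sub-organisation can accumulate within a single project tender.
# All names, costs and percentages below are invented for demonstration.

subcontractor_bids = {
    "groundworks": {"cost": 2_000_000, "ahl_rate": 0.08},
    "steelwork":   {"cost": 3_500_000, "ahl_rate": 0.13},
    "services":    {"cost": 1_500_000, "ahl_rate": 0.05},
}

total_bid = sum(bid["cost"] for bid in subcontractor_bids.values())
hidden_loss = sum(bid["cost"] * bid["ahl_rate"] for bid in subcontractor_bids.values())

print(f"Total tendered cost: {total_bid:,.0f}")
print(f"Cumulative AHL buried in the tender: {hidden_loss:,.0f} "
      f"({hidden_loss / total_bid:.1%} of the price)")
```

Even with these modest assumed rates, close to a tenth of the tendered price is accounted for by loss that no party has measured.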


A business that manufactures tools buys in substandard metal for € / $ / £ 100,000.  This material cannot be used and, effectively, the organisation has made a mistake in its choice.  There is a real loss to the business of € / $ / £ 100,000.  However, this loss was not expected and so it has to be covered by sales of tools in the marketplace.  If the company is working at a profit margin of 5%, it would have to produce and sell € / $ / £ 2m in order to cover the loss.  If the organisation is still able to report a stable profit margin of, say, 3%, then it is logical to suggest that the AHL is something that occurs each year and is simply absorbed into the turnover of the company.  Maybe in year one it is a serious injury accident, in year two a massive over-order; but the reality is that if the loss is not seen to affect profit then it has to be AHL, and it represents financial waste.
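Written out explicitly, the arithmetic behind these figures is simply the loss divided by the profit margin, where L is the unexpected loss and m the margin:

\[
\text{turnover required to recover the loss} = \frac{L}{m} = \frac{100\,000}{0.05} = 2\,000\,000
\]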

Similarly, a publicly funded organisation loses the same € / $ / £ 100,000.  The loss represents an impact on its budget; however, the budget is funded entirely by the taxes of the businesses that work in the region.  If the average profit margin for business in that area is 5%, then the loss to the tax pot is € / $ / £ 100,000; but for local industry to generate this amount it has to produce and find a market for € / $ / £ 2m.  This is not always an easy process in a limited, competitive market.  If we retrospectively examine loss within the public sector, how much did business have to produce to pay for that real loss?  This is political dynamite.

The main question is what can be done to manage this discordance between the optimum of zero events, which most believe is a goal rather than an achievable reality, and whatever the flat line represents in terms of AHL.  The first step is to understand the concept of causation and to recognise that it is firmly linked to the strategic management of the organisation.  Secondly, it has to be appreciated that ‘old fashioned science’ has no place in modern day qualitative data collection and analysis.  Finally, there must be a firm trust in the basic methodology of the analysis process that is used and, in particular, a cognisance of ‘grounded theoretical’ systems and what they can do for the organisation.  All of this has to be supported by a data collection process that is acceptable both to academia and, in many arenas, to the criminal and civil courts; the maxim ‘poor data equals poor analysis equals poor development’ is still very true.

I would like to examine these issues in a little more depth, from the bottom up.


Data collection

Following serious loss events, organisational investigators are usually expected to gather data in order to assess both what occurred and why it took place; in many locations they are also expected to pass this data on to legal advisors so that any exposure to civil or criminal legal proceedings can be identified.  This data collection process commences at the outset of the investigation, and poor data collection obviously equates to poor assessment and then poor control of change.  Data tends to fall into one of two categories: that which existed before the event, usually documents and similar organisational records, and that which is obtained after the event, such as records of the scene and accounts of witnesses.  Accounts of witnesses are without doubt the most important source of information, and it is this data that is usually relied upon by the courts to establish what took place and to identify causation and, indeed, culpability.  Seldom does a pre-existing document help to establish causation.

There is a common misconception that the witness interviewing process is simple and straightforward.  It is not uncommon to find some very undeveloped organisations still using panel interviews, and even having witnesses write out their own statements.  This state of affairs provides good evidence of the lack of development within the organisations concerned.

A witness interview is not the same as a suspect interrogation, and it requires well trained personnel who understand open, themed interviewing techniques linked to qualitative data collection.  In particular, they need to appreciate the issues surrounding perceptual confusion and the importance of not seeking to confirm any pre-existing hypothesis during the interview.  Organisationally, there should never be an occasion where a witness writes down what they have perceived, as it is now well understood that this perceived account will be a mixture of gestalt, fact and opinion.  There is much more to this process than a simple briefing sheet such as this can contain.

Finally, in some locations there is a legal requirement to follow certain statement-taking rules.  This may even extend to the nature of the form that is used, as in England and Wales, and to who may see the statements concerned.  Failure to adhere to local legal requirements will obviously have a detrimental effect on the reputation and integrity of the organisations involved.

Grounded Theory (GT): the analysis of data

All too often, accident investigation processes have tried to emulate the systems of the prosecuting authorities, with the result that witness interviews become almost judicial in nature.  In some organisations there is even a clear link between investigative interviewing and disciplinary processes.  Industrial investigators do not have to keep to the legal standard of a prosecution; however, the depth of their examinations will often go far beyond that of any prosecutor.  Because of the historical processes used for gathering evidence to prosecute, there is a desire, indeed almost a maxim, that one should ‘stick to the facts’.  This old fashioned and rather positivistic philosophical viewpoint has no place in modern organisations and should be consigned to the history books.  The same mentality caused crusading missionaries to try to convince whole continents that they should convert to their social norms; without any direction to the contrary, the affected societies either happily followed their lead or were forced to do so.

Many ‘young’, evolving organisations followed this same style of development, with strategic decision makers convinced that they were infallible and clearly demonstrating a lack of cognisance of context.  Today the strategic decision makers in the same positions have far better information and education; however, they are still at the mercy of the organisation to ensure that they receive the best information.  This realisation does not take place overnight and goes through an evolutionary process.  Initially there is a desire to collect all information, which quickly leads to a state of information overload.  At this point the organisation tends to become selective in the data it presents to its senior decision makers.  Often loss statistics, and especially accident statistics, are reduced to a set of graphs.  This leads to a desire to achieve a reduction in the ‘negative’ statistics rather than an examination of causation.

The fact that a negative statistic can be attributed to an event, and that the event is likely to be the result of individual failure at the front end of the organisation, draws the executive mindset into a false belief that their systems are without weakness and that the causation is something to do with the errant individual.  Efforts are concentrated at the front end of the organisation, where it is perceived that loss is occurring, rather than at the strategic level, where loss is instigated.  The ethos appears to reflect a mindset that if we can control the statistics then we can control causation; seldom is thought given to what causation actually is.  In many instances an evolving organisation sees favourable and positive results from focussing on those at the front of the organisation; however, after a while the gains associated with each initiative reduce and the organisation appears to be at a loss as to why it is still experiencing failure.  Strategic managers reflect back to what they know initially worked, concentrate on workforce control, and often fail to examine their own influence on safety culture.  Often the organisation will buy itself a new ‘executive toy’ in the form of some accident investigation (sic) software, frequently without appreciating that there is a difference between accident reporting software and accident investigation software; the latter arguably does not exist, for if it did it would have been employed by the authorities.  The purchase is believed to represent a huge advance in causal analysis, but little actually changes and the flat line (AHL), if recognised, stays stable.  The simple reason for this failure is that the organisation, at strategic level, has not yet understood accident causation, does not appreciate how data should be collected and, when it does collect data, does not know how to analyse it.


So to grounded theory (GT).  In the late 90’s a system of accident investigation was established that required weaknesses to be identified at an organisational level.  The process identified 11 key areas of organisational failure, which were given the name ‘basic risk factors’ or BRFs.  The author of this work was Jop Groeneweg.  Apparently unrecognised at the time, his research team had in fact identified the first, or primary, set of ‘codes’ required to start a GT based investigation.  Grounded theory requires the researcher to identify the theoretical explanation for a phenomenon from the data that they gather, rather than from their own understanding of the information.  The statement ‘let the data speak for itself’ is often quoted at students of this qualitative investigative process.  Even just using this very basic primary coding helps an organisation identify what conditions existed that set up the workplace environment where loss took place, and to correct those weaknesses for each event examined.  When an organisation then decides to take the next step and starts to collate trends in the BRFs, it begins to move toward secondary coding (discussed below).
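As a rough illustration of what primary coding can look like in practice, the short Python sketch below tags each investigated event with the organisational failure areas supported by the gathered data and then collates the tags, which is the first step toward trending.  The category names are illustrative placeholders, not a definitive rendering of Groeneweg’s BRFs.

```python
from collections import Counter

# Illustrative primary coding: each investigated event is tagged with the
# organisational failure areas (primary codes) that the gathered data supports.
# The code names below are placeholders, not a definitive list of BRFs.
events = {
    "event_001": ["training", "procedures", "communication"],
    "event_002": ["training", "maintenance_management"],
    "event_003": ["procedures", "incompatible_goals", "training"],
}

# Collating the codes across events is the first step toward trending,
# i.e. the move from primary toward secondary coding.
code_counts = Counter(code for codes in events.values() for code in codes)

for code, count in code_counts.most_common():
    print(f"{code}: identified in {count} of {len(events)} events")
```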

The mental processes behind GT are not new.  It is quite likely that strategic managers who identify a recurring pattern of failure will look at it and ask what caused it, which is itself a form of letting the data speak for itself.  Linking it to accident investigation and coding is new.  It must be remembered that this process uses a data collection system and provides information that is acceptable to the highest legal and academic authorities.  Each organisation involved in a loss event will have its own set of organisational failure areas, and thus the process allows examination across the whole supply chain from client to sub-contractor.  The process can even take the investigation into assessing social norms and government positionality.  The only limits here are those imposed by some less enlightened individuals who possibly have a different agenda and who try to restrict access to essential information in an effort to direct attention away from causation.

There is a big problem with this system, and it becomes evident when an organisation decides not to explore causation beyond identifying the primary level organisational failure areas.  Often this phenomenon is experienced during the evolution of awareness of the process.  The process required to identify causation is seen as being more ethereal and almost non-scientific, and as a result it is treated with some suspicion until it is understood.


So on to Causation

Causal analysis is simply the examination of those attributes that have allowed executive decision makers to follow a path that resulted in the development of organisational failures within their company.

Secondary coding is slightly different from primary coding in that it does not deal with primary data.  It requires the investigator to identify key groupings within the different data sets gathered from different events.  Consider, for example, data identifying that 16 out of 20 events, at 12 different sites, had an element where education was an issue.  Simply selecting education as an area of weakness is not helpful: the category is enormous, and checking it in the fashion of an audit will only discover how discordant it was from a set of organisational standards that obviously represent a degree of failure in the first place.  Currently only a human has the capacity to analyse the different information and identify areas of commonality within such a data set.  For instance, the analysis might identify an issue with the selection of training providers; although each selection identified in the data set complies with auditing standards, the fact that there are 14 failures out of 16 suggests that the standard itself is the issue.  The secondary coding process requires the investigator to look at the nature of the failures.  It is possible that education failed because the decision makers lacked the necessary knowledge and awareness to make the correct decision.  If this trait keeps appearing, and in particular if it appears across different sites that have different local management teams, then the obvious conclusion is that the weakness was triggered further up the organisation.
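A hedged sketch of the kind of cross-event collation described above, using invented records: it groups the education-related findings by sub-theme and by site, which is where a recurring pattern such as training provider selection would surface.  The field names and sub-themes are assumptions for illustration only.

```python
from collections import defaultdict

# Invented records for illustration: each finding is the education-related
# element of one event, already primary-coded, with a human-assigned sub-theme.
findings = [
    {"event": "e01", "site": "site_A", "sub_theme": "training_provider_selection"},
    {"event": "e02", "site": "site_B", "sub_theme": "training_provider_selection"},
    {"event": "e03", "site": "site_B", "sub_theme": "refresher_interval"},
    {"event": "e04", "site": "site_C", "sub_theme": "training_provider_selection"},
]

# Secondary coding support: group findings by sub-theme and note the spread of
# sites, since a theme recurring across independently managed sites points
# further up the organisation.
by_theme = defaultdict(set)
for finding in findings:
    by_theme[finding["sub_theme"]].add(finding["site"])

for theme, sites in sorted(by_theme.items(), key=lambda kv: -len(kv[1])):
    print(f"{theme}: seen at {len(sites)} site(s) -> {sorted(sites)}")
```

The tooling only surfaces candidate groupings; the judgement about what the commonality means remains, as the text notes, a human analytical task.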

This leads the investigator to tertiary coding, which again is different and requires executive decision makers to be willing to get involved in the causal analysis process.  It is clearly understood that they are far too busy to analyse every event that takes place; indeed, they should not really be concerned with examining individual failures, even if the outcome was extreme.  There may also be sound legal reasoning for not analysing each failure individually (a topic for future discussion).  It is, however, a key business function to control loss at a strategic level, and in order to do this executive decision makers should analyse their own decision making.  This analysis should take place in order to identify where ‘they’ may have been responsible for the evolution of a workplace environment that has subsequently been found to have areas of organisational weakness.  In the case of the selection of the training providers, the question is not what are we going to do to develop higher standards of selection, but what were the attributes of our previous decision making that allowed us to install such latent failures within the organisation.

This issue is not new; the process, however, is.  The data that is categorised into secondary codes needs to be trended so that, at regular intervals, executive decision makers can be brought together to undergo a process of facilitated discussion aimed at identifying what it was that they did or did not do that resulted in loss.  This is not a process of apportioning blame.  The workplace environment has to be set by strategic decision makers somewhere, and they will do the best that they feel they can given the resources that they have.  Invariably they will be working in a way that is compliant with any governing legislation, and proof of this will be obtained through any number of audits or inspections.  However, their ideal workplace will not be perfect, as they will still be having loss events and a degree of AHL.  The decision making attributes that led to the organisational failures listed in the primary codes can be inserted into a framework of Decision Making Factors (DMFs); these 12 factors help executive decision makers to understand any weaknesses that precipitated the development of the organisational failure areas.

These discussions are ‘facilitated’ in nature, as there is a tendency for executive decision makers to waste time examining organisational failure and trying to micro-manage those issues rather than looking at their own role in the evolutionary process.  This introversion of the analysis process is difficult for some powerful personalities; at first it is usually viewed with a level of suspicion, and in a way that tends to suggest that blame is being levelled at the top, rather than being seen as a new key business tool designed to take the organisation forward.

The question often arises about recording the facilitated discussion.  The reality is that it would be sensible to prove that it took place, but due to the nature of the learning processes involved there is little to be gained from recording the whole session.  All that is required are notes on the main developments and the fact that the discussion actually took place.  It is therefore essential that the session is facilitated by someone who understands both the investigation process and the process of facilitated discussion.  In reality, any learning and realisation takes place later, after the discussion session has finished, when the individual participants have time for self reflection.  The effect is that the next decision is taken in a more grounded fashion, with an amended set of core values and greater awareness of self.  This change will be reflected downward throughout the organisation.  The benefits to an organisation’s reputation are obvious where it can be demonstrated that there is an ongoing development of the safety culture, led from the very highest level and based upon sound data and analytical processes.

Stick to the facts - the end of an era

Old fashioned investigation techniques require that the investigator ‘sticks to the known facts’ of an incident.  This process is fine, except that it fails to take account of so much additional information that it will often point to weaknesses in the organisation that simply are not there.  Let’s consider the phrase ‘tell me what happened then’.  Invariably, witnesses believe that they saw one set of events but, when questioned, will agree to a different account.  Normal human functioning dictates that we have to make sense of our surroundings, even to the point where we add things that were not present in order to complete a data set with missing elements.

Consider the following exercise as it will raise quite proper questions about untested witness testimony and what an individual actually experiences.  Please read the following three times:

A mini was being driven by a red cow along a freeway.  The car overtook a bus.  The car came to an intersection and left the freeway.  The car then followed the road and came to a crossroads, where it turned left.

Now please ask yourself these questions without referring to the text.

Did you picture the car driven by the cow?
Did you view it from above, from the side, or were you actually in the car?
Did you picture a specific coloured mini, or was it just a mini?
How was the cow driving it - was it sitting?
Describe the bus: colour, single or double decker?
Did it take place in a town or countryside environment?
What else was going on at the same time - were there other vehicles apart from those mentioned?

The problem is that we cannot simply stick to the facts; within seconds we begin to fill out our recollections with additional information that seems to ‘best fit’.  So, how does a witness know what happened?  The simple answer is that they do not.  What is worse, they do not recognise the difference between fact and opinion, or indeed gestalt, those added bits of information that we insert to make sense of our surroundings.  The only solution is to ensure that industrial investigators are capable of interviewing in such a way that they can punch through the extra information and identify what was actually experienced rather than what the witness thought they experienced.  Again the maxim: ‘poor data equals poor analysis equals poor organisational development’.

Modern day accident investigations take notice of fact, opinion and other human emotional traits.  It is only by doing this that we are able to take an organisation forward.  In the past there has been a desire to emulate the prosecuting authorities without asking what they are actually doing; their processes have been placed, almost without good cause, on a pedestal by the profession.  The authorities examine a data set that allows them to identify whether an offence has been committed.  When this is done they will often have a set of linked legal elements which enable an investigator to analyse whether there is a case to answer.  Thus, in English law, the offence of theft has seven elements:

Dishonestly
Appropriates
Property
with the Intention
of Permanently
Depriving
Another of it.

These seven elements have to be proven beyond reasonable doubt for a successful prosecution.  The evidence has to be either factually based or from a source whose opinion is considered to be ‘expert’.  In the world of accident investigation this restriction does not help to establish underlying causation; it was never designed to.  Its sole purpose was to establish whether a set of legal standards had been complied with.  Identifying causation is far more complex and requires an understanding of human functioning, which can only be examined against a full set of information.  Organisations regularly make decisions based on opinion and human emotion.  How often is something more stringently risk assessed because someone considers it dangerous?  Ignore the words hazard and risk and concentrate on the concept of danger, and we find ourselves in a world of qualification rather than quantification.  In many respects the very nature of our penultimate legal judgement is based upon opinion: what does the jury think?  So even here there is a step toward a qualitative analytical process.


In Summary

- Modern day accident investigators must be able to interview witnesses and distinguish gestalt from opinion and fact.

- Modern day accident investigators should be able to interview in such a way that the statements they produce are suitable for use in the legal process.

- A modern day accident investigation system should utilise grounded theory and coding processes and not be tied to the outdated tree-style analysis systems of days gone by.

- Modern day accident investigation systems should not simply gather statistics for the sake of statistical representation; the reasons behind the statistics should be analysed.

- Modern day accident investigators need to be cautious when using audit standards as a measure - remember you are measuring against a system that has failed in some way.

- A modern day accident investigation system should, amongst other processes (the identification of human failure and belief), be able to provide trended evidence pointing to weaknesses in ‘organisational failure areas’.

- A modern day AI system should be able to deliver data to senior executive management that is both academically sound and acceptable to legal scrutiny.

- A modern day system should be able to provide a basis for facilitated discussions at board level and thus the identification of underlying causation.

- Modern day accident investigation trainers should be conversant with the legal issues associated with gathering evidence in the regions where they work.

- Modern day accident investigation trainers should be fully conversant with GT methodology.

- Modern day accident investigation trainers should be aware of the evolution of the subject.