The population explosion in the second half of the twentieth century has placed considerable financial and operational strains on emergency management agencies. Economic losses due to natural catastrophic events have grown exponentially since 1950. The human rescue and relocation problems in the aftermath of a catastrophe have also become a logistical and economic nightmare. The impact is more pronounced for poor developing countries, where the national government carries the largest burden and agriculture is hit hardest. For the richer countries, risk management is often shared by individuals, business communities, and insurers, as well as local and national governments.
Although the number of geological events has remained relatively constant over many decades, the global temporal data show sharp increases in climate-related events. In fact, the exponential increase in insurance losses caused by weather-related catastrophes is one reason climate change has become a hot-button issue for the insurance industry. Multi-hazard loss estimation studies serve several purposes, prominent amongst which are the design of insurance policies and reinsurance decisions.
In traditional insurance models, losses are predicted using recent past experience and limited data. Such models are ineffective in dealing with low-frequency, high-severity catastrophic losses and produce sharp periodic jumps in premiums, which harms both homeowners and insurance firms. Catastrophe models take a long-term view based on scientific models and can potentially result in relatively stable premiums.
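The contrast between the two pricing approaches can be illustrated with a small numerical sketch. All figures below are invented for illustration: a hypothetical twelve-year loss history with one catastrophic year, a trailing-average "experience" premium, and a constant premium set to the long-run expected loss from an assumed event model.

```python
# Hypothetical loss history (arbitrary monetary units): steady attritional
# losses around 10 per year, with a single catastrophic year (210).
losses = [10, 11, 9, 10, 12, 10, 210, 11, 10, 9, 10, 11]

# Traditional premium: trailing 3-year average of experienced losses.
# The catastrophe year drives a sharp multi-year spike in the premium.
trad = [sum(losses[i - 3:i]) / 3 for i in range(3, len(losses))]

# Catastrophe-model premium: long-run expectation from an event model,
# e.g. 10 units of attritional loss plus a 1%-per-year chance of a
# 200-unit catastrophe (illustrative numbers only).
cat = [10 + 0.01 * 200] * len(trad)   # constant 12.0 every year

print(max(trad) - min(trad))   # large swing driven by the catastrophe year
print(max(cat) - min(cat))     # 0.0: premium stays flat
```

The experience-based premium jumps by more than a factor of seven after the catastrophe year and then drops back, while the model-based premium stays level, which is the stability effect described above.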
The catastrophe model implemented in our software utilizes four basic modules: Exposure (or inventory), Hazard, Vulnerability, and Loss Estimation. The Exposure module relies primarily on user input data to define the form, while the other three modules represent the engine of the catastrophe model. For any given structure, the response criterion is selected from more than 600 cases for earthquakes and 100 cases for hurricanes. The input data are augmented by default settings so that a complete analysis is performed at all levels of investigation regardless of the user's expertise. However, more basic inquiries result in higher degrees of uncertainty.
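The four-module flow can be sketched as a minimal pipeline. The module names follow the text; every field, default value, and fragility number below is a made-up placeholder, not the software's actual data.

```python
# Defaults used to augment incomplete user input (illustrative values).
DEFAULTS = {"construction": "wood_frame", "stories": 1, "year_built": 1980}

def exposure(user_input):
    """Exposure/inventory module: merge user data over default settings
    so a complete analysis is possible regardless of user expertise."""
    return {**DEFAULTS, **user_input}

def hazard(site):
    """Hazard module: event intensity at the site (placeholder value;
    in practice this would come from a regional seismic model)."""
    return {"peak_ground_accel": 0.3}

def vulnerability(inventory, intensity):
    """Vulnerability module: map hazard intensity to a mean damage ratio
    using a construction-dependent fragility factor (illustrative)."""
    fragility = 0.5 if inventory["construction"] == "wood_frame" else 0.8
    return fragility * intensity["peak_ground_accel"]

def loss_estimation(inventory, damage_ratio, value):
    """Loss estimation module: damage ratio times exposed value."""
    return damage_ratio * value

# The user supplies only one field; defaults fill in the rest.
inv = exposure({"stories": 2})
dmg = vulnerability(inv, hazard(inv))
loss = loss_estimation(inv, dmg, value=250_000)
```

Note that the sparser the user input, the more the result leans on defaults, which is exactly why more basic inquiries carry higher uncertainty.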
The standard guide ASTM E-2026 defines and establishes good commercial practice and the standard of care in the United States for conducting a probabilistic study of expected losses to buildings from seismic events (ASTM 1999). It identifies four levels of investigation, of which Levels 0, I, and II are considered in this study. Each level can easily be extended to define the requirements for all hazards, including wind and flood.