Pirates of the 18th century and a 19th-century German "Iron Chancellor" preceded the United States in creating a social system for the protection of injured workers. The modern U.S. workers' compensation system owes part of its existence to this parentage.
Arrrrg, I’m Hurt!
Pirates, contrary to popular myth, were highly organized and entrepreneurial. Before they were consigned to the ranks of outlaws, they were considered highly prized allies of the government, plundering and sharing the spoils with governors of the pre-Revolutionary colonies who gave them safe port.
Privateering (the gentleman's term for piracy) was a dangerous occupation; taking booty away from those who did not want to give it up led to sea battles, hand-to-hand combat and injury. Because of the ever-present chance of impairment, a system was developed to compensate injured "employees." There was one catch: he or she (there were female pirates as well) had to survive the wounds to collect, as there was no recorded compensation for death.
Piratesinfo.com provides some information regarding the amounts paid to the injured:
• Loss of an eye — 100 pieces of eight (Spanish dollar);
• Loss of a finger — 100 pieces of eight;
• Loss of left arm — 500 pieces of eight;
• Loss of right arm — 600 pieces of eight;
• Loss of left leg — 400 pieces of eight; and
• Loss of right leg — 500 pieces of eight.
The average wage for colonial Americans of this period was approximately two pieces of eight per week. Loss of an eye or a finger would therefore merit payment approximating 50 weeks of wages; the right arm was worth 300 weeks (a little less than six years). These figures compare rather closely to modern compensation schedules.
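The conversion behind these comparisons is simple division: the scheduled award in pieces of eight divided by the two-pieces-of-eight weekly wage. Here is a minimal Python sketch using only the schedule and wage figures quoted above (the rounding is illustrative, not historically precise):

```python
# Pirate injury schedule in pieces of eight (per Piratesinfo.com, above).
schedule = {
    "Loss of an eye": 100,
    "Loss of a finger": 100,
    "Loss of left arm": 500,
    "Loss of right arm": 600,
    "Loss of left leg": 400,
    "Loss of right leg": 500,
}

weekly_wage = 2  # approximate colonial-era wage, in pieces of eight per week

for injury, award in schedule.items():
    weeks = award / weekly_wage  # award expressed as weeks of wages
    print(f"{injury}: {award} pieces of eight = {weeks:.0f} weeks "
          f"({weeks / 52:.1f} years) of wages")
```

Running the sketch confirms the figures in the text: an eye or a finger equates to 50 weeks of wages, and the right arm to 300 weeks, roughly 5.8 years.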
In addition to being compensated, injured crew members were allowed to remain on board and were offered less strenuous duty. It was, in effect, the first return-to-work program.
Marxism, Socialism, Comp
Otto von Bismarck, the "Iron Chancellor," introduced "Workers' Accident Insurance" in 1881. Phased in between 1881 and 1884, the program became the model for workers' compensation programs in Europe and, ultimately, America.
Bismarck was not known as a socially conscious ruler; the working conditions of the common man were not necessarily foremost in his mind. History teaches that the unification and growth of Germany (Prussia) and the protection of his own position were his main concerns. But Bismarck's main political rivals were Marxists with socialist agendas that professed concern for the plight of the common man. At the top of their agenda was the creation of a social program for the protection of workers injured on the job: a workers' compensation program.
The "Iron Chancellor" eventually outlawed the Marxist and other socialist-leaning parties, securing his rule. He did, however, borrow some of their ideas to keep peace among the people. Workers' Accident Insurance became the first compulsory workers' compensation program enacted in modern, industrialized Europe.
England followed Germany's lead, replacing the outdated Employers' Liability Act of 1880 with its own Workmen's Compensation Act in 1897. The Employers' Liability Act provided relatively expensive protection that depended on the court system, the same type of program common in America during the late 19th and early 20th centuries.
America and Workers’ Compensation
America did not join the workers' compensation social revolution until the early 1900s. Maryland (1902), Massachusetts (1908), Montana (1909) and New York (1910) each introduced workers' compensation statutes. All four laws were struck down under constitutional challenge as violating "due process."
New York's 1910 act faced fierce opposition from labor unions. Union officials feared that state control of worker benefits would reduce the need for, and popularity of, the union. With socialized care and compensation, the union's necessity was compromised and members' long-term loyalty was in question.
On March 24, 1911, the New York Court of Appeals declared the state's compulsory workers' compensation law unconstitutional. The next day, 146 workers were killed in a fire at the Triangle Waist Co. in New York City. Not all died in the fire itself; many perished attempting to escape the flames, jumping nine and 10 stories to the street below.
With no workers’ compensation system, family members and dependents had to turn to the courts in an attempt to force Triangle to compensate the injured and the families of the dead. The owners were tried for manslaughter and acquitted. A civil suit against the owners netted each of 23 families $75 in damages (The Columbia Electronic Encyclopedia). New York finally adopted a workers’ compensation law in 1913 that would withstand constitutional challenges.
Employer Negligence
Prior to the enactment of workers' compensation laws, an injured employee's only source of compensation was the courts. Employees had to prove the employer was negligent to gain any compensation for lost wages or medical bills. Employers utilized several defenses against charges of negligence:
• Assumption of Risk: Proving negligence requires evidence that a duty of care is owed. When an employee assumes the risk of an inherently or recognizably dangerous activity, the duty of care is lifted from the employer; with no duty of care, there can be no negligence. Employees in hazardous occupations were presumed to understand the hazards and to have assumed the risk of injury;
• Contributory Negligence: A defense doctrine holding that if the injured person was even partially culpable in causing or aggravating his own injury, he is barred from any recovery from the other party. This is an absolute defense; and
• Fellow Servant Rule: A defense asserting that an employee's injury was caused by a fellow employee, not by the acts of the employer. If proven, negligence could not be attributed to the employer and recovery could be severely limited or barred.
Very few workers had the means to bring suit, and those who could afford a lawsuit had to overcome the defenses available to the employer. The result: very few employers were held responsible for injuries and required to pay. Awards in successful suits were unpredictable, ranging from too little to be worth the trouble to more than the employer had planned for.
Congress enacted two laws to limit the harshness of these defenses. The Employers’ Liability Acts of 1906 and 1908 were federal attempts to soften the contributory negligence doctrine. These legislative attempts did little to protect injured workers from the ravages of defense attorneys and juries.
The Great Tradeoff!
Human capital (the value of the employee) became a driving force behind the push for a system of protection. Stories, although no evidence of them currently exists, told of injured mine workers being laid at the doors of their houses with no compensation or admission of negligence from the mine owners, leaving their families to struggle for a means of support. These stories made their way through industrialized cities and states, leading to demands for a better system. Recognition of the value of employees, along with other events between 1900 and 1911, helped spur the movement toward a social system of workers' compensation in the U.S.:
• 1908 — President Theodore Roosevelt signed the first viable workers' compensation statute into law with the creation of the Federal Employers' Liability Act, designed to protect railroad workers involved in interstate commerce (the program is still in existence today);
• 1908-1909 — Various states set up commissions to study the merits and drawbacks of a social system of injured employee compensation. Overwhelmingly these commissions reported that business, industry and employees supported such a system (the basis of study was the German law);
• 1910 — Crystal Eastman compiled and penned "Work Accidents and the Law." This document presented the problems inherent in the then-current system of negligence-based compensation in light of the cost to human capital. It also highlighted the preventive nature of a workers' compensation program (employers would be more willing to invest in safety if the cost of injury ultimately fell on them). This work is credited with changing business and labor attitudes toward workers' compensation and employee safety;
• 1911 — Triangle Waist Co. fire; and
• 1911 — “The Great Tradeoff” debate. Before any plan could move forward, an agreement between labor and industry had to be reached; each had to be willing to give up something for a workers’ compensation system to function properly.
The employer agreed to pay medical bills and lost wages, regardless of fault, and the employee agreed to give up the right to sue.
Wisconsin passed its workers' compensation law in May 1911, becoming the first state to establish an ongoing workers' compensation program that survived legal challenges. Nine more states adopted workers' compensation laws before the close of 1911, and by the end of 1920, 42 states plus the territories of Alaska and Hawaii (statehood didn't come for either until 1959) had enacted workers' compensation statutes. Mississippi was the last state to implement a workers' compensation statute, waiting until 1948.
Voluntary vs. Compulsory
Early programs (1911-1916) were voluntary participation laws; employers were not compelled by the various statutes to purchase workers' compensation. Compulsory participation had doomed the earlier programs, which were struck down as unconstitutional: the Fourteenth Amendment required due process before a person or entity could be compelled to part with property.
In 1917, the U.S. Supreme Court upheld the constitutionality of compulsory workers' compensation requirements (New York Central Railroad Co. v. White), opening the door for every state to require the purchase of workers' compensation coverage. Then, as now, each state instituted different threshold requirements.
Workers’ compensation laws have evolved and expanded since the beginning, but these are the roots of the modern American workers’ compensation system.