Workers’ Compensation

With few exceptions, Florida law requires businesses to carry workers' compensation insurance, which provides benefits to employees who are injured on the job. Employees hurt at work are compensated under a workers' compensation policy regardless of who was at fault. In exchange, this coverage shields employers from certain employee injury lawsuits.