Workers' Compensation
Most states require employers to carry workers' compensation insurance for their employees. This insurance covers employees' medical expenses and compensates them for a portion of the income they lose while unable to work. It also protects employers from most work-related injury lawsuits brought by their workers.