
What should I do if I was injured at work and my employer is telling me not to file a workers' compensation claim?

If you were injured at work and your employer is telling you not to file a workers' compensation claim, your employer is acting unlawfully. You have the right to file a workers' compensation claim after a workplace injury, and your employer cannot deny that right or discourage you from exercising it.

Workers' compensation laws vary by state, so it is important to seek advice from an attorney familiar with the workers' compensation laws in your state. In every state, however, employers are generally required to provide workers' compensation benefits to employees who are injured on the job. These benefits typically include medical expenses, lost wages, and disability benefits.

If your employer is discouraging you from filing a workers' compensation claim, start by documenting the conversation and any other actions your employer takes that suggest they do not want you to file. This evidence will be crucial if legal action becomes necessary.

You should then consult an attorney who specializes in workers' compensation claims. They can advise you on your legal rights and options, as well as how to proceed in your specific situation. Your attorney may file a claim on your behalf or negotiate with your employer's insurance company to obtain a fair settlement.

In summary, if you were injured at work and your employer is telling you not to file a workers' compensation claim, seek legal advice immediately. Your employer's conduct is unlawful, and you are entitled to workers' compensation benefits for any injury suffered on the job. Document your interactions with your employer, and consult an attorney who specializes in workers' compensation claims to understand your rights and options.