LinkedIn is bringing some interesting new safety features to users. This week, the Microsoft-owned business-centric social media platform announced changes around user safety, specifically in how reported content is handled on the service.

First up is more information on reported content. While LinkedIn already allows users to report content from other users, details on what happens afterward have not been available. With the new update, when a user reports content or has their own content removed, they will receive a notification with more information.

Currently, the feature is debuting in the United States, Canada, and France. The company says it will reach other countries in the coming weeks.

Elsewhere, the network is also tweaking its reminders experience. Building on a recent update to the company’s Professional Community Policies, LinkedIn will now send reminders about its goals. For example, users will see messages about what the network wants to achieve in terms of respect between users.

These messages will be on display when a user logs in.

Getting to Grips with Inappropriate Content

While LinkedIn does not face the same level of abusive content as Facebook or Twitter, it is still an issue. To deter users from posting inappropriate content, the company will introduce a new warning message.

This notification will appear on messages that may include harassing content. Users can decide whether to report the content:

“These warnings let you decide if you’d like to report the content, signalling to us that it’s an unwanted message, and allow us to take appropriate action against the sender. Or view and mark it as safe and help tell our algorithms what’s okay by you.”

LinkedIn has created a landing page for the new policies, which can be viewed here.