Reporting & Moderation Policy
Purpose
To provide every member with a safe, respectful platform and a documented, accessible way to report abuse, harm, or violations of community guidelines; to ensure urgent cases are addressed promptly; and to log all moderation actions for review and legal compliance.
Last updated: 17 March 2026
This section explains how reporting and moderation work inside Where I’m At.
Where I’m At is a privacy-first, low-pressure, lived-experience education and peer-support community.
It is not therapy, crisis support, diagnosis, treatment, legal advice, or one-to-one coaching.
Moderation exists to help the community feel:
- calm
- clear
- safe enough
- low-pressure
- bounded
- useful without overwhelm
How members can report content
Members can report content in any of these ways:
- use the platform Report option on a post or comment, where available
- message a Moderator or Admin directly if that feels safer, or if the issue involves privacy
- tag a moderator in the thread only when appropriate and the situation feels active or escalating
If possible, include:
- a link or screenshot reference to the content inside the platform
- what feels unsafe, out of scope, or concerning
- whether you would prefer public intervention or private handling
What moderators can do
To protect safety, privacy, and community standards, moderators may take the following actions.
Content actions
- hide content temporarily while reviewing
- remove content that breaks guidelines
- ask the author to edit identifying information, graphic detail, or other unsafe material
- move content to the correct space if it has been posted in the wrong one
If requested edits are not made in a reasonable time, moderators may remove the content to protect the community.
Member actions
- gentle reminder, public or private
- clearer boundary message or formal warning
- temporary mute or suspension when behaviour is harming safety or trust
- removal or ban for serious or repeated harm
Examples of serious harm include:
- harassment
- hate speech
- doxxing
- threats
- repeated confidentiality breaches
- deliberate misinformation presented as fact
- repeated pressure, selling, or boundary violations after reminders
Moderators aim to use the least intense action that restores safety, but will act more quickly where needed.
Typical response times
We do our best to respond as quickly as we can, but we are a small team and cannot guarantee fixed timelines.
In general:
- urgent safety concerns are prioritised as soon as they are seen
- harassment, privacy breaches, and serious misinformation are reviewed as soon as possible, often within 1–2 days
- lower-level guideline issues may take longer depending on volume and moderator availability
If something feels urgent, please report it and message a moderator if possible.
Safety concerns and confidentiality limits
Where I’m At is a peer-support and educational space.
We are not able to provide crisis counselling, therapy, or emergency response.
If someone appears at risk of harm
Moderators may:
- respond with a safety message encouraging emergency or crisis support in the member’s own country
- hide or limit content that could be harmful to others
- privately message the member with crisis signposting where platform features allow
- document the incident internally so moderation stays consistent
Confidentiality in the community
Members are expected to keep what is shared here private.
That means:
- no screenshots
- no sharing another member’s story outside the space
- no reposting identifying details elsewhere
However, confidentiality is not absolute.
If there appears to be a serious and immediate safety risk, moderators may take protective action within the platform.
We do not share personal member information with employers, external partners, or sponsors.
Any reporting shared outside the moderation team should be aggregated and anonymised.
Bottom line
We aim to protect privacy and emotional safety as much as possible, while prioritising safety when risk is high.
What happens after you report
1. You submit a report or message a moderator.
2. A moderator reviews the content and surrounding context.
3. If needed, the content may be hidden while it is reviewed.
4. A moderation decision is made based on the guidelines and safety needs.
5. We may message you, or the member involved, if clarification or support is needed.
6. Serious or repeated issues are logged internally.
7. We act to protect the community, even if we cannot share every detail of another member’s moderation outcome.