Report Content
Members can report messages inside groups if they believe the content:
- Violates school policies
- Violates Hischool platform rules
- Contains harassment or abuse
- Contains harmful or inappropriate material
- Disrupts learning or collaboration
Reporting helps maintain a safe, respectful, and productive environment.
Who Can Report Content?
Any member of the school can report messages inside groups.
Reporting is a responsibility, not a punishment tool. It should only be used when content clearly violates rules or community standards.
What Can Be Reported?
Members may report:
- Messages in public groups
- Messages in private groups (if they are members)
- Inappropriate attachments or shared files
- Harmful links or embedded content
Reports should be related to rule violations.
See:
→ School Policies
When to Report a Message
You should report a message if it includes:
- Harassment or bullying
- Hate speech
- Threats or intimidation
- Explicit or harmful material
- Spam or malicious links
- Repeated disruption of discussions
- Academic misconduct (if defined in school policy)
Do not report content simply because you disagree with someone’s opinion.
How to Report Content
To report a message:
- Locate the message in the group.
- Open the message options menu (…).
- Select Report.
- Choose a reason for the report.
- Submit the report.
Reports are sent to authorized moderators or school administrators.
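The steps above can be sketched as a small client-side helper. This is a hypothetical illustration only: Hischool's real reporting API is not documented here, so the reason codes, field names, and the `build_report` helper are all assumptions.

```python
# Hypothetical sketch of building a report before submission.
# The reason codes and field names are assumptions, not Hischool's real API.
from dataclasses import dataclass

# Reason codes mirroring the "When to Report a Message" list (assumed names).
ALLOWED_REASONS = {
    "harassment", "hate_speech", "threats",
    "explicit_content", "spam", "disruption", "academic_misconduct",
}

@dataclass
class Report:
    message_id: str
    reason: str        # must be one of ALLOWED_REASONS
    details: str = ""  # optional free-text context for moderators

def build_report(message_id: str, reason: str, details: str = "") -> Report:
    """Validate the chosen reason before the report is submitted."""
    if reason not in ALLOWED_REASONS:
        raise ValueError(f"unknown report reason: {reason}")
    return Report(message_id=message_id, reason=reason, details=details)

report = build_report("msg_123", "spam", "Link leads to a phishing site.")
print(report.reason)  # → spam
```

Validating the reason up front mirrors step 4 ("Choose a reason for the report"): a report without a recognized reason never reaches moderators.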
What Happens After a Report?
After a report is submitted:
- Moderators receive a notification.
- The reported message is reviewed.
- The surrounding discussion context may also be examined.
- Moderators determine whether a rule violation occurred.
Possible outcomes include:
- No action
- Warning issued to the member
- Message removed
- Temporary restriction
- Member removed or banned
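The review outcomes above can be modeled as an escalating scale. The following sketch is illustrative only: the severity scale, thresholds, and `decide_outcome` helper are assumptions, not Hischool's actual moderation logic.

```python
# Hypothetical model of the possible review outcomes; the severity
# scale and decision thresholds are illustrative assumptions.
from enum import Enum

class Outcome(Enum):
    NO_ACTION = 0
    WARNING = 1
    MESSAGE_REMOVED = 2
    TEMPORARY_RESTRICTION = 3
    BAN = 4

def decide_outcome(violation_confirmed: bool, severity: int,
                   repeat_offender: bool) -> Outcome:
    """Map a moderator's findings onto a proportional outcome."""
    if not violation_confirmed:
        return Outcome.NO_ACTION          # no rule violation found
    if severity >= 3 or repeat_offender:
        if repeat_offender and severity >= 3:
            return Outcome.BAN            # severe and repeated behavior
        return Outcome.TEMPORARY_RESTRICTION
    if severity == 2:
        return Outcome.MESSAGE_REMOVED
    return Outcome.WARNING                # minor first-time violation

print(decide_outcome(True, 1, False).name)  # → WARNING
```

The point of the sketch is proportionality: moderators choose the least severe action that addresses the violation, escalating only for severe or repeated behavior.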
See:
→ Banned Users
Confidentiality
Reports are handled privately.
The reported member cannot see:
- Who submitted the report
- Internal moderation discussions
- Review notes from moderators
Moderators should treat reports with confidentiality and professionalism.
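One way to picture this confidentiality rule is as redaction: any view of a report shown to the reported member strips the reporter's identity and all internal notes. The field names below are assumptions for illustration, not Hischool's real data model.

```python
# Hypothetical sketch of redacting a report record before anything is
# shown to the reported member; field names are illustrative assumptions.
def redact_for_reported_member(report: dict) -> dict:
    """Strip the reporter's identity and internal moderation material."""
    confidential = {"reporter_id", "moderator_notes", "internal_discussion"}
    return {k: v for k, v in report.items() if k not in confidential}

record = {
    "message_id": "msg_123",
    "reason": "spam",
    "reporter_id": "user_42",        # never shown to the reported member
    "moderator_notes": "confirmed",  # internal only
}
print(sorted(redact_for_reported_member(record)))  # → ['message_id', 'reason']
```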
False or Malicious Reporting
Intentionally submitting reports without a valid reason:
- Disrupts moderation processes
- May violate school policies
- Can lead to disciplinary action
Reports should always be submitted honestly and responsibly.
School-Level vs Platform-Level Violations
There are two levels of rule enforcement.
School Policy Violations
Handled internally by school moderators.
Defined in:
→ School Policies
Platform Policy Violations
Severe cases may escalate to platform-level moderation.
Examples include:
- Severe harassment
- Illegal content
- Coordinated abuse
- Repeated harmful behavior across schools
Moderator Responsibilities
Moderators reviewing reports should:
- Review the full conversation context
- Apply policies consistently
- Avoid bias or favoritism
- Document serious moderation decisions
- Escalate severe violations if required
Moderation actions should be fair and proportional.
Best Practices for Schools
- Encourage respectful communication.
- Clearly define reporting guidelines.
- Train moderators in consistent enforcement.
- Avoid overreacting to minor conflicts.
- Maintain transparency about rules and policies.
A healthy reporting culture builds long-term trust in the community.
Common Questions
Can members see if their report was accepted?
Report outcomes may not always be shared publicly.
Moderation decisions are usually handled internally.
Can a report automatically ban a user?
No.
Reports trigger a review process.
Moderators decide the appropriate action.
Can reported content be deleted immediately?
Yes.
Moderators may remove content if it clearly violates rules or poses immediate harm.