
Illegal Content Risk Assessment (Including Mitigation Measures) (v2025.01)

This risk assessment was carried out in January 2025 by the project lead, Will Webberley (will@treadl.com), who is also responsible for the assessment and for online safety on Treadl. This assessment pertains to the Treadl service running at treadl.com ("Treadl"). The assessment has also been reviewed and approved by Will Webberley.

This risk assessment has been carried out and documented as part of Treadl's commitment to maintaining compliance with the UK Online Safety Act, and in accordance with our Online Safety Policy, which also describes how Treadl takes appropriate steps to keep the risk assessment up to date.

Introduction

Brief service description

Treadl is a web and mobile application targeted at textile weavers. It allows people to create projects and add their weaving patterns and other files (including documents and images) to these projects. Projects can be marked as "public", which makes them discoverable by the public and lists them on the public "Explore" page.

Users can choose to configure their profile, including with an avatar image and general biographical information.

At the time of writing, Treadl receives around 1,500 unique monthly visitors. It has around 6,800 registered users worldwide, of whom around 500 have been active within the last month.

We deem Treadl to not be "likely to be accessed by children", as it is aimed at adult artists and weavers, it is not designed to attract children, there are no known cases of children using Treadl, and children are not allowed to use the service, as governed by our other published policies. We acknowledge that Treadl does not enforce age verification on visitors and users.

We do not collect precise user location data (aside from unstructured plain-text fields users can optionally add to their profile). However, our general site analytics indicate that 6% of our visitors are from the UK. To err on the side of caution, we estimate that 10% of our monthly visitors and registered users are from the UK (~150 monthly visitors and ~680 registered users).
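
The derivation of these figures is a simple calculation, restated below for transparency. This is a minimal sketch only: it uses no data beyond the numbers given above and the deliberately cautious 10% assumption.

```python
# Reproduction of the UK audience estimate described above. The 10% share is a
# deliberate over-estimate of the ~6% UK share observed in site analytics.
monthly_visitors = 1_500
registered_users = 6_800
assumed_uk_share = 0.10

uk_monthly_visitors = round(monthly_visitors * assumed_uk_share)  # 150
uk_registered_users = round(registered_users * assumed_uk_share)  # 680
print(uk_monthly_visitors, uk_registered_users)  # 150 680
```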

Treadl has been in operation since November 2018.

Conducting the assessment

We have followed Ofcom's published "four-step process" to conduct and produce this risk assessment, and have also consulted Ofcom's accompanying guidance, including its illegal content Risk Profiles, in preparing it.

Step 1: Understanding the kinds of illegal content that need to be assessed

In this section, we use Ofcom's illegal content Risk Profiles to identify the risk factors relevant to Treadl for each of the 17 types of priority illegal content.

Identifying priority illegal content & other potential illegal harm

We identify the kinds of priority illegal content as follows:

  1. Terrorism
  2. Child Sexual Exploitation and Abuse (CSEA) -- including also:
    1. Grooming
    2. Image-based Child Sexual Abuse Material (CSAM)
    3. CSAM URLs
  3. Hate
  4. Harassment, stalking, threats and abuse
  5. Controlling or coercive behaviour
  6. Intimate image abuse
  7. Extreme pornography
  8. Sexual exploitation of adults
  9. Human trafficking
  10. Unlawful immigration
  11. Fraud and financial offences
  12. Proceeds of crime
  13. Drugs and psychoactive substances
  14. Firearms, knives and other weapons
  15. Encouraging or assisting suicide
  16. Foreign interference
  17. Animal cruelty

Given the niche nature of the Treadl service, and its limited user base, we have no reason to believe that any other types of illegal content or any other illegal harm not listed in the priority illegal types will occur.

Treadl service characteristics

To help in identifying the risk factors, we first determine Treadl's service characteristics based on our answers to the following questions (where a (Y) indicates a "yes"):

1) Is Treadl any of the following service types?
  a. Social media service (N)
  b. Messaging service (N)
  c. Gaming service (N)
  d. Adult service (N)
  e. Discussion forum or chat room service (N)
  f. Marketplace or listing service (N)
  g. File-storage and file-sharing service (Y) -- Treadl is not primarily used for sharing files, though some users upload files to their projects, which can be public, so we take this into account.

2) Do child users access some or all of Treadl? No, children are not known to access Treadl. It is also against the service policies for children to use the service.

3) Does Treadl have any of the following functionalities related to how users identify themselves to one another?
  a. Users can display identifying information through a user profile that can be viewed by others (e.g. images, usernames, age) (Y)
  b. Users can share content anonymously (e.g. anonymous profiles or access without an account) (N)

4) Does Treadl have any of the following functionalities related to how users network with one another?
  a. Users can connect with other users (N)
  b. Users can form closed groups or send group messages (N)

5) Does Treadl have any of the following functionalities that allow users to communicate with one another?
  a. Livestreaming (either open or closed channels) (N)
  b. Direct messaging (including ephemeral direct messaging) (N)
  c. Encrypted messaging (N)
  d. Commenting on content (Y)
  e. Posting or sending images or videos (either open or closed channels) (Y)
  f. Posting or sending location information (Y)
  g. Re-posting and forwarding content (N)

6) Does Treadl allow users to post goods and services for sale? (N)

7) Does Treadl have any of the following functionalities that allow users to find or encounter content?
  a. Searching for user-generated content (Y)
  b. Hyperlinking (N)

8) Does Treadl use content or network recommender systems? (Y)

Based on our assessment of Treadl's characteristics, we will specifically select risk factors according to the following sections of the U2U Risk Profile:

  • 1g
  • 3a
  • 5d
  • 5e
  • 5f
  • 7a
  • 8

Identifying risk factors

According to the characteristics identified above, we intend to consider the following risk factors in the next step of our risk assessment, having consulted Ofcom's Risk Profiles.

  • 1g:
    • File-storage or file-sharing services (illegal harms: all)
      • On Treadl, users can upload arbitrary files to their projects. Projects can be made publicly available. As such, we identify that part of Treadl's capabilities are captured by "file storage" and "file sharing".
  • 3a:
    • User profiles (illegal harms: fraud and financial services, proceeds of crime, foreign interference, CSEA (grooming), harassment/stalking/threats/abuse, drugs and psychoactive substances, hate, unlawful immigration, human trafficking, and sexual exploitation of adults offences)
      • Users can maintain minimal user profiles on Treadl. By default, these profiles are not intended to construct a representation of a real person, although people can optionally add extra information that does so.
    • Fake user profiles (illegal harms: CSEA (grooming), harassment/stalking/threats/abuse, controlling or coercive behaviour, proceeds of crime, fraud and financial services and foreign interference offences)
      • We acknowledge that fake profiles are a possibility with Treadl. However, we assess that available user actions are focused and restricted (i.e. to creating and managing weaving projects) such that using Treadl for this purpose would not be attractive to those looking to exploit this.
  • 5d:
    • Commenting on content (illegal harms: terrorism, animal cruelty, CSEA (grooming), encouraging or assisting suicide, fraud and financial services, hate, and harassment/stalking/threats/abuse offences)
      • Treadl allows users to write comments on the public projects of other users.
  • 5e:
    • Posting images or videos (illegal harms: terrorism, hate, foreign interference, harassment/stalking/threats/abuse, CSEA (image-based CSAM), animal cruelty, encouraging or assisting suicide, controlling or coercive behaviour, drugs and psychoactive substances, extreme pornography, unlawful immigration, human trafficking and intimate image abuse offences)
      • Treadl allows users to upload images and videos to their projects. Furthermore, users can upload an image avatar for their profile.
  • 5f:
    • Posting or sending location information (illegal harms: CSEA (grooming), harassment/stalking/threats/abuse, human trafficking and controlling or coercive behaviour offences)
      • Treadl user profiles include a "location" field. However, this is free text and users can enter anything they like here (whether that is a country, town, etc.). Locations aren't "posted" or "sent" aside from this. Currently around 2% of all user profiles have their location field set.
  • 7a:
    • User-generated content searching (illegal harms: terrorism, drugs and psychoactive substances, firearms, knives and other weapons, extreme pornography, proceeds of crime and fraud and financial services offences)
      • Users can search for user-generated content by typing the name of another user's public project. Only weaving projects can be searched for.
  • 8:
    • Content recommender systems (illegal harms: terrorism, foreign interference, encouraging or assisting suicide and hate offences)
      • Treadl has an "Explore" page, which lists recent weaving patterns posted by other users (in chronological order) and a tab with a random selection of public projects created by other users. There is no personalised or engagement-driven recommendation algorithm involved (a minimal illustrative sketch follows this list).
    • Network recommender systems
      • Treadl does not have any network recommender systems. Users can "discover" other users by following links to a public project owner's user profile.
  • General risk factors:
    • User base demographics
      • For privacy reasons, Treadl does not capture demographic information from its users. As such, unless people intentionally reveal more about themselves than is required on their profiles or other content, identifying individual characteristics of other users on Treadl is non-trivial.
    • Business model (revenue model and growth strategy)
      • Treadl is free and open source software. Whilst we accept donations to help keep servers running, there is no revenue model.
      • There is no growth strategy.
    • Commercial profile
      • Treadl is free software and does not make money. It is volunteer-led. We acknowledge that this limits staff capacity, and we consider this further in the risk assessment below.
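
To make the nature of Treadl's content "recommendation" concrete, the sketch below illustrates the approach described above: a chronological tab and an unweighted random tab, with no user profiling, engagement signals, or ranking model. This is an illustrative sketch only (the function and field names, such as explore_recent and has_pattern, are hypothetical), not the actual Treadl implementation.

```python
import random

def explore_recent(public_projects, limit=20):
    """Chronological tab: the most recently posted public weaving patterns."""
    with_patterns = [p for p in public_projects if p.get("has_pattern")]
    return sorted(with_patterns, key=lambda p: p["created_at"], reverse=True)[:limit]

def explore_random(public_projects, limit=20):
    """Random tab: an unweighted random selection of public projects."""
    return random.sample(public_projects, k=min(limit, len(public_projects)))
```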

Step 2: Assess the risk of illegal harm

In this step we assign a risk level for each of the 17 types of illegal content and harm. In particular we:

  • Assess the likelihood of encountering the content and of Treadl being used to commit or facilitate an offence;
  • Assess the impact of illegal content occurring on Treadl and the chance of Treadl being used to commit or facilitate an offence.

We now consider each type of illegal content.

1. Terrorism

Evidence collection

  • Risk profile and risk factors:
    • File sharing services
    • Commenting on content
    • Posting images or videos
    • User-generated content searching
    • Content-recommender systems
  • User complaints:
    • None to date
  • User data (age, background, etc.)
    • We do not collect enough user data to realistically determine Treadl's population type for personal characteristics.
    • Treadl does not support discovering users by their characteristics. For example, users cannot be searched for, and user profiles are purposefully kept relatively vague (real names, gender, etc. are not required).
    • Data indicates that around 5% of user accounts are associated with university email addresses. However, only 0.1% are from UK universities.
  • Retrospective analysis of incidents of harm
    • There are no known incidents of this harm on Treadl.
  • Evidence drawn from existing controls
    • Manual pre-display content approval is in place for all uploaded files, content comments, and posted pictures and videos.
    • Content moderation is in place for all user profile updates.
    • Searching for user-generated content is limited to weaving project "names". Users cannot search by tag or other mechanism.
    • Content-"recommendation" is based on pure randomness, and only recommends weaving projects that have weaving patterns containing threading content. There are no algorithms or machine-learning capabilities that make this process user-oriented nor activity-oriented.
    • Content reporting/complaints is available and signposted from all user-generated content, including the above, user profiles, and projects.
  • Other relevant information
    • We assess that we do not need enhanced inputs to assess this risk and that there are no characteristics which increase this harm.
    • Projects on Treadl (which may contain images, files, and comments) are either public OR private. As such, a project's content is visible either only to its author (if private) or to the public (if public, after approval and subject to moderation).

Assessing the likelihood of encountering this illegal content and chance of facilitating offences

  • At a generic service level (i.e. not specific to Treadl's specific nature), the risk factors identified with this harm (see "Risk profile and factors" above) would indicate that this harm is somewhat likely on the service.
  • There are no additional characteristics of Treadl that make this harm more likely.
  • Evidence suggests that this harm is not likely to occur on Treadl, as there have never been any known incidents nor user complaints.
  • There are a number of measures that further reduce the likelihood of this harm, particularly pre-publication approval of all project files and comments, and post-publication moderation of user profile changes. As such, any incidents of this harm in these spaces would be removed/reported before they become available to other users, or -- in some cases -- very shortly afterwards.

Assessing the impact of this illegal content and chance of facilitating an offence

  • Nature and severity of the harm
    • There are no known incidents of users having harmful experience relating to this harm on Treadl.
  • Impact on affected individuals
    • We assess that the likelihood of individual characteristics becoming a factor of increased impact of this harm is low due to the limited way for people to express their individuality using the service, and for others to identify individuals (since users cannot be searched for -- either by their characteristics or otherwise).
  • The design and operation of Treadl
    • Treadl has no revenue model or growth strategy -- it is a free and open source service offering. There is no financial incentive for Treadl to run algorithms that keep users "entertained" or that amplify content subjectively, and all features are provided on a utilitarian basis.
    • Treadl's commercial profile means there is a potential capacity issue in the required human content reviews and moderation. However, current evidence demonstrates that the volume of moderation needed is low, with only a small handful of items requiring review per day (due to Treadl's small scale, and because most people use the tool for weaving pattern editing). If this were to increase 2x or 3x it would still be manageable by one person -- particularly with the technical tools in place to help with this process. Additional technical tools could be developed, and additional human processes introduced, to further scale this capability when required.
    • We assess that there are no other service characteristics that would cause the impact of this harm to be increased.
    • Recommender systems used by Treadl supply data either by date or by pure randomness -- and are weaving project focused (for example, the chronological "Explore" timeline only includes weaving patterns, and images and files and comments are not included). Given all other content is approved or moderated the likelihood of this factor increasing the impact of the harm is low.
    • We assess that, based on the evidence, user characteristics data cannot be used to increase the impact of this harm to specific user sub-groups. As such, user demographics is not considered to be a significant contributor to any additional impact of this harm.
  • None of the collected evidence suggests there is any impact resulting from this harm on Treadl. There have been zero user complaints/reports of this content to-date and there are no known cases of it. If content were to appear, despite approval and moderation processes, Treadl does not receive enough traffic to substantially increase the impact of this harm.

Existing controls that reduce the risk of this harm

The following technical and process controls reduce the likelihood and impact of this harm (a minimal illustrative sketch of the approval flow follows this list):

  • Content approval, which means that all project images and files and comments must be approved before they are available to others.
  • Content moderation, which means that other content (e.g. user profiles, project names, and project descriptions) is moderated in a timely manner (according to our Online Safety Policy) after potentially becoming available to others. Note that even private projects are moderated.
  • Reporting capabilities, which allow any illegal content encountered before moderation takes place to be reported, so that its removal can be prioritised and further dissemination reduced.
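
The approval and moderation gate described above can be summarised with the following sketch. It is purely illustrative (the names ContentItem, visible_to and moderate are hypothetical, not Treadl's actual code): user-submitted items start in a pending state, are only shown to other users once a moderator approves them, and remain reportable after publication.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ContentItem:
    """A user-submitted item such as a project file, image, video, or comment."""
    author_id: str
    kind: str                           # e.g. "file", "image", "comment"
    status: str = "pending"             # "pending" -> "approved" or "rejected"
    reviewed_at: Optional[datetime] = None
    reported: bool = False              # set if another user reports the item

    def visible_to(self, viewer_id: str) -> bool:
        # Authors always see their own content; everyone else only sees it
        # after manual approval by a moderator.
        return viewer_id == self.author_id or self.status == "approved"

def moderate(item: ContentItem, approved: bool) -> None:
    """Record a moderation decision; rejected items never become visible."""
    item.status = "approved" if approved else "rejected"
    item.reviewed_at = datetime.now(timezone.utc)
```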

Controlling associated Risk Profile risk factors

Pertinent factors from the Risk Profile related to this harm are managed or reduced as follows:

  • File sharing services (users can upload files to projects, and projects can be public)
    • Control: all files uploaded to projects must be manually approved before they are available to other users.
  • Commenting on content (users can comment on the public projects by other users)
    • Reduced: all comments on public projects are also public.
    • Control: all comments must be manually approved before they are viewable by other users.
  • Posting images or videos
    • Control: all project images and videos must be manually approved before they are accessible to other users.
    • Reduced: user profile pictures are all optional and are all public.
    • Control: user profile pictures undergo manual moderation after they are updated.
  • User-generated content searching
    • Reduced: users can only search for other public projects (by project name) or for their own content.
  • Content-recommender systems
    • Reduced: users are only recommended public projects and weaving patterns
    • Control: images, files, videos, and comments in public projects are moderated/approved as described above.

Additional characteristics we consider

We also consider the following characteristics in the assessment of the likelihood and impact of this type of harm:

  • Commercial profile: potentially increases the risk of harm if moderation requirements were to exceed team capacity. However, there is no evidence of this capacity being reached, and this risk is mitigated by easy-to-use technical controls that are also available remotely.
  • Functionality: Treadl does not have the capability to support private or group-based messaging. Content is either limited to its author or it is public. This reduces the effectiveness of using Treadl to commit or facilitate offences related to this harm.
  • Target audience: Treadl is purely focused on the art and weaving community, with service features heavily specific to this use-case. We consider that this significantly reduces the likelihood and potential impact of this type of harm occurring, and in being used to facilitate or commit an offence.
  • User demographics: service features and capabilities (as described above) mean that we consider the risk of personal characteristics increasing the likelihood or impact of harm to be low.

Risk level

Based on the above evidence and reasoning, we assess that the likelihood and impact of terrorism content on Treadl is LOW.

2. Child Sexual Exploitation and Abuse (CSEA)

Evidence collection

  • Risk profile and risk factors:
    • File sharing services
    • User profiles
    • Fake user profiles
    • Commenting on content
    • Posting images or videos
    • Posting or sending location information
  • User complaints:
    • None to date
  • User data (age, background, etc.)
    • We do not collect enough user data to realistically determine Treadl's population type for personal characteristics.
    • Treadl does not support discovering users by their characteristics. For example, users cannot be searched for, and user profiles are purposefully kept relatively vague (real names, gender, etc. are not required).
    • Data indicates that around 5% of user accounts are associated with university email addresses. However, only 0.1% are from UK universities.
  • Retrospective analysis of incidents of harm
    • There are no known incidents of this harm on Treadl.
  • Evidence drawn from existing controls
    • Manual pre-display content approval is in place for all uploaded files, content comments, and posted pictures and videos.
    • Content moderation is in place for all user profile updates.
    • Searching for user-generated content is limited to weaving project "names". Users cannot search by tag or other mechanism.
    • Content-"recommendation" is based on pure randomness, and only recommends weaving projects that have weaving patterns containing threading content. There are no algorithms or machine-learning capabilities that make this process user-oriented nor activity-oriented.
    • Content reporting/complaints is available and signposted from all user-generated content, including the above, user profiles, and projects.
  • Other relevant information
    • We assess that we do not need enhanced inputs to assess this risk and that there are no characteristics which increase this harm.
    • Projects on Treadl (which may contain images, files, and comments) are either public OR private. As such, a project's content is visible either only to its author (if private) or to the public (if public, after approval and subject to moderation).

Assessing the likelihood of encountering this illegal content and chance of facilitating offences

  • At a generic service level (i.e. not specific to Treadl's specific nature), the risk factors identified with this harm (see "Risk profile and factors" above) would indicate that this harm is somewhat likely on the service.
  • There are no additional characteristics of Treadl that make this harm more likely.
  • Evidence suggests that this harm is not likely to occur on Treadl, as there have never been any known incidents nor user complaints.
  • There are a number of measures that further reduce the likelihood of this harm, particularly pre-publication approval of all project files and comments, and post-publication moderation of user profile changes. As such, any incidents of this harm in these spaces would be removed/reported before they become available to other users, or -- in some cases -- very shortly afterwards.

Assessing the impact of this illegal content and chance of facilitating an offence

  • Nature and severity of the harm
    • There are no known incidents of users having harmful experience relating to this harm on Treadl.
  • Impact on affected individuals
    • We assess that the likelihood of individual characteristics becoming a factor of increased impact of this harm is low due to the limited way for people to express their individuality using the service, and for others to identify individuals (since users cannot be searched for -- either by their characteristics or otherwise).
  • The design and operation of Treadl
    • Treadl has no revenue model or growth strategy -- it is a free and open source service offering. There is no financial incentive for Treadl to run algorithms that keep users "entertained" or that amplify content subjectively, and all features are provided on a utilitarian basis.
    • Treadl's commercial profile means there is a potential capacity issue in the required human content reviews and moderation. However, current evidence demonstrates that the volume of moderation needed is low, with only a small handful of items requiring review per day (due to Treadl's small scale, and because most people use the tool for weaving pattern editing). If this were to increase 2x or 3x it would still be manageable by one person -- particularly with the technical tools in place to help with this process. Additional technical tools could be developed, and additional human processes introduced, to further scale this capability when required.
    • We assess that there are no other service characteristics that would cause the impact of this harm to be increased.
    • Recommender systems used by Treadl supply data either by date or by pure randomness -- and are weaving project focused (for example, the chronological "Explore" timeline only includes weaving patterns, and images and files and comments are not included). Given all other content is approved or moderated the likelihood of this factor increasing the impact of the harm is low.
    • We assess that, based on the evidence, user characteristics data cannot be used to increase the impact of this harm to specific user sub-groups. As such, user demographics is not considered to be a significant contributor to any additional impact of this harm.
  • None of the collected evidence suggests there is any impact resulting from this harm on Treadl. There have been zero user complaints/reports of this content to-date and there are no known cases of it. If content were to appear, despite approval and moderation processes, Treadl does not receive enough traffic to substantially increase the impact of this harm.

Existing controls that reduce the risk of this harm

The following technical and process controls reduce the likelihood and impact of this harm:

  • Content approval, which means that all project images and files and comments must be approved before they are available to others.
  • Content moderation, which means that other content (e.g. user profiles, project names, and project descriptions) is moderated in a timely manner (according to our Online Safety Policy) after potentially becoming available to others. Note that even private projects are moderated.
  • Reporting capabilities, which allow any illegal content encountered before moderation takes place to be reported, so that its removal can be prioritised and further dissemination reduced.

Controlling associated Risk Profile risk factors

Pertinent factors from the Risk Profile related to this harm are managed or reduced as follows:

  • File sharing services (users can upload files to projects, and projects can be public)
    • Control: all files uploaded to projects must be manually approved before they are available to other users.
  • User profiles (users have a presence on Treadl, with avatar image, bio, location, and info)
    • Control: all user profile changes are moderated as part of the technical control (for mitigating harms spread through this means).
    • Reduced: Treadl user profiles are intentionally minimal and do not typically reveal personal characteristics unless usernames are re-used or someone intentionally attempts to reveal more about themselves.
  • Fake user profiles
    • Reduced: Treadl user profiles are intentionally minimal and do not typically reveal personal characteristics.
    • Control: all user profile changes are moderated. However, we acknowledge that moderators may not know if a fake user profile is impersonating someone else.
    • Reduced: given that all people can do on Treadl is post their weaving projects, we assess that the impact of this illegal content and resultant harm is minimised as a result of fake profiles.
  • Commenting on content (users can comment on the public projects by other users)
    • Reduced: all comments on public projects are also public.
    • Control: all comments must be manually approved before they are viewable by other users.
  • Posting images or videos
    • Control: all project images and videos must be manually approved before they are accessible to other users.
    • Reduced: user profile pictures are all optional and are all public.
    • Control: user profile pictures undergo manual moderation after they are updated.
  • Posting or sending location information
    • Control: all user profile updates (the only place where user location is explicitly attributed) undergo moderation
    • Reduced: only 2% of Treadl users use the location field on user profiles. It is free text, allowing people to be as vague as they like (e.g. country level). Location data is not "posted" or "sent" out aside from this -- it can only be viewed (if set) if someone views someone else's profile.

Additional characteristics we consider

We also consider the following characteristics in the assessment of the likelihood and impact of this type of harm:

  • Commercial profile: potentially increases the risk of harm if moderation requirements were to exceed team capacity. However, there is no evidence of this capacity being reached, and this risk is mitigated by easy-to-use technical controls that are also available remotely.
  • Functionality: Treadl does not have the capability to support private or group-based messaging. Content is either limited to its author or it is public. This reduces the effectiveness of using Treadl to commit or facilitate offences related to this harm.
  • Target audience: Treadl is purely focused on the art and weaving community, with service features heavily specific to this use-case. We consider that this significantly reduces the likelihood and potential impact of this type of harm occurring, and in being used to facilitate or commit an offence.
  • User demographics: service features and capabilities (as described above) mean that we consider the risk of personal characteristics increasing the likelihood or impact of harm to be low.

Risk level

Based on the above evidence and reasoning, we assess that the likelihood and impact of general CSEA content on Treadl is LOW.

2 (a). CSEA - Grooming

Evidence collection

  • Risk profile and risk factors:
    • File sharing services
    • User profiles
    • Fake user profiles
    • Commenting on content
    • Posting or sending location information
  • User complaints:
    • None to date
  • User data (age, background, etc.)
    • We do not collect enough user data to realistically determine Treadl's population type for personal characteristics.
    • Treadl does not support discovering users by their characteristics. For example, users cannot be searched for, and user profiles are purposefully kept relatively vague (real names, gender, etc. are not required).
    • Data indicates that around 5% of user accounts are associated with university email addresses. However, only 0.1% are from UK universities.
  • Retrospective analysis of incidents of harm
    • There are no known incidents of this harm on Treadl.
  • Evidence drawn from existing controls
    • Manual pre-display content approval is in place for all uploaded files, content comments, and posted pictures and videos.
    • Content moderation is in place for all user profile updates.
    • Searching for user-generated content is limited to weaving project "names". Users cannot search by tag or other mechanism.
    • Content-"recommendation" is based on pure randomness, and only recommends weaving projects that have weaving patterns containing threading content. There are no algorithms or machine-learning capabilities that make this process user-oriented nor activity-oriented.
    • Content reporting/complaints is available and signposted from all user-generated content, including the above, user profiles, and projects.
  • Other relevant information
    • We assess that we do not need enhanced inputs to assess this risk and that there are no characteristics which increase this harm.
    • Projects on Treadl (which may contain images, files, and comments) are either public OR private. As such, a project's content is visible either only to its author (if private) or to the public (if public, after approval and subject to moderation).

Assessing the likelihood of encountering this illegal content and chance of facilitating offences

  • At a generic service level (i.e. not specific to Treadl's specific nature), the risk factors identified with this harm (see "Risk profile and factors" above) would indicate that this harm is somewhat likely on the service.
  • There are no additional characteristics of Treadl that make this harm more likely.
  • Evidence suggests that this harm is not likely to occur on Treadl, as there have never been any known incidents nor user complaints.
  • There are a number of measures that further reduce the likelihood of this harm, particularly pre-publication approval of all project files and comments, and post-publication moderation of user profile changes. As such, any incidents of this harm in these spaces would be removed/reported before they become available to other users, or -- in some cases -- very shortly afterwards.

Assessing the impact of this illegal content and chance of facilitating an offence

  • Nature and severity of the harm
    • There are no known incidents of users having harmful experience relating to this harm on Treadl.
  • Impact on affected individuals
    • We assess that the likelihood of individual characteristics becoming a factor of increased impact of this harm is low due to the limited way for people to express their individuality using the service, and for others to identify individuals (since users cannot be searched for -- either by their characteristics or otherwise).
  • The design and operation of Treadl
    • Treadl has no revenue model or growth strategy -- it is a free and open source service offering. There is no financial incentive for Treadl to run algorithms that keep users "entertained" or that amplify content subjectively, and all features are provided on a utilitarian basis.
    • Treadl's commercial profile means there is a potential capacity issue in the required human content reviews and moderation. However, current evidence demonstrates that the volume of moderation needed is low, with only a small handful of items requiring review per day (due to Treadl's small scale, and because most people use the tool for weaving pattern editing). If this were to increase 2x or 3x it would still be manageable by one person -- particularly with the technical tools in place to help with this process. Additional technical tools could be developed, and additional human processes introduced, to further scale this capability when required.
    • We assess that there are no other service characteristics that would cause the impact of this harm to be increased.
    • Recommender systems used by Treadl supply data either by date or by pure randomness -- and are weaving project focused (for example, the chronological "Explore" timeline only includes weaving patterns, and images and files and comments are not included). Given all other content is approved or moderated the likelihood of this factor increasing the impact of the harm is low.
    • We assess that, based on the evidence, user characteristics data cannot be used to increase the impact of this harm to specific user sub-groups. As such, user demographics is not considered to be a significant contributor to any additional impact of this harm.
  • None of the collected evidence suggests there is any impact resulting from this harm on Treadl. There have been zero user complaints/reports of this content to-date and there are no known cases of it. If content were to appear, despite approval and moderation processes, Treadl does not receive enough traffic to substantially increase the impact of this harm.

Existing controls that reduce the risk of this harm

The following technical and process controls reduce the likelihood and impact of this harm:

  • Content approval, which means that all project images and files and comments must be approved before they are available to others.
  • Content moderation, which means that other content (e.g. user profiles, project names, and project descriptions) is moderated in a timely manner (according to our Online Safety Policy) after potentially becoming available to others. Note that even private projects are moderated.
  • Reporting capabilities, which allow any illegal content encountered before moderation takes place to be reported, so that its removal can be prioritised and further dissemination reduced.

Controlling associated Risk Profile risk factors

Pertinent factors from the Risk Profile related to this harm are managed or reduced as follows:

  • File sharing services (users can upload files to projects, and projects can be public)
    • Control: all files uploaded to projects must be manually approved before they are available to other users.
  • User profiles (users have a presence on Treadl, with avatar image, bio, location, and info)
    • Control: all user profile changes are moderated as part of the technical control (for mitigating harms spread through this means).
    • Reduced: Treadl user profiles are intentionally minimal and do not typically reveal personal characteristics unless usernames are re-used or someone intentionally attempts to reveal more about themselves.
  • Fake user profiles
    • Reduced: Treadl user profiles are intentionally minimal and do not typically reveal personal characteristics.
    • Control: all user profile changes are moderated. However, we acknowledge that moderators may not know if a fake user profile is impersonating someone else.
    • Reduced: given that all people can do on Treadl is post their weaving projects, we assess that the impact of this illegal content and resultant harm is minimised as a result of fake profiles.
    • Control: if a fake user profile is used to write comments on public projects for the purposes of grooming, such comments are subject to approval mechanisms.
  • Commenting on content (users can comment on the public projects by other users)
    • Reduced: all comments on public projects are also public.
    • Control: all comments must be manually approved before they are viewable by other users.
  • Posting or sending location information
    • Control: all user profile updates (the only place where user location is explicitly attributed) undergo moderation
    • Reduced: only 2% of Treadl users use the location field on user profiles. It is free text, allowing people to be as vague as they like (e.g. country level). Location data is not "posted" or "sent" out aside from this -- it can only be viewed (if set) if someone views someone else's profile.

Additional characteristics we consider

We also consider the following characteristics in the assessment of the likelihood and impact of this type of harm:

  • Commercial profile: potentially increases the risk of harm if moderation requirements were to exceed team capacity. However, there is no evidence of this capacity being reached, and this risk is mitigated by easy-to-use technical controls that are also available remotely.
  • Functionality: Treadl does not have the capability to support private or group-based messaging. Content is either limited to its author or it is public. This reduces the effectiveness of using Treadl to commit or facilitate offences related to this harm.
  • Target audience: Treadl is purely focused on the art and weaving community, with service features heavily specific to this use-case. We consider that this significantly reduces the likelihood and potential impact of this type of harm occurring, and in being used to facilitate or commit an offence.
  • User demographics: service features and capabilities (as described above) mean that we consider the risk of personal characteristics increasing the likelihood or impact of harm to be low.

Risk level

Based on the above evidence and reasoning, we assess that the likelihood and impact of CSEA grooming content and/or offences on Treadl is LOW.

2 (b). CSEA - Image-based Child Sexual Abuse Material (CSAM)

Evidence collection

  • Risk profile and risk factors:
    • File sharing services
    • Posting images or videos
  • User complaints:
    • None to date
  • User data (age, background, etc.)
    • We do not collect enough user data to realistically determine Treadl's population type for personal characteristics.
    • Treadl does not support discovering users by their characteristics. For example, users cannot be searched for, and user profiles are purposefully kept relatively vague (real names, gender, etc. are not required).
    • Data indicates that around 5% of user accounts are associated with university email addresses. However, only 0.1% are from UK universities.
  • Retrospective analysis of incidents of harm
    • There are no known incidents of this harm on Treadl.
  • Evidence drawn from existing controls
    • Manual pre-display content approval is in place for all uploaded files, content comments, and posted pictures and videos.
    • Content moderation is in place for all user profile updates.
    • Searching for user-generated content is limited to weaving project "names". Users cannot search by tag or other mechanism.
    • Content-"recommendation" is based on pure randomness, and only recommends weaving projects that have weaving patterns containing threading content. There are no algorithms or machine-learning capabilities that make this process user-oriented nor activity-oriented.
    • Content reporting/complaints is available and signposted from all user-generated content, including the above, user profiles, and projects.
  • Other relevant information
    • We assess that we do not need enhanced inputs to assess this risk and that there are no characteristics which increase this harm.
    • Projects on Treadl (which may contain images, files, and comments) are either public OR private. As such, a project's content is visible either only to its author (if private) or to the public (if public, after approval and subject to moderation).

Assessing the likelihood of encountering this illegal content and chance of facilitating offences

  • At a generic service level (i.e. not specific to Treadl's specific nature), the risk factors identified with this harm (see "Risk profile and factors" above) would indicate that this harm is less likely on the service.
  • There are no additional characteristics of Treadl that make this harm more likely.
  • Evidence suggests that this harm is not likely to occur on Treadl, as there have never been any known incidents nor user complaints.
  • There are a number of measures that further reduce the likelihood of this harm, particularly pre-publication approval of all project files and comments, and post-publication moderation of user profile changes. As such, any incidents of this harm in these spaces would be removed/reported before they become available to other users, or -- in some cases -- very shortly afterwards.

Assessing the impact of this illegal content and chance of facilitating an offence

  • Nature and severity of the harm
    • There are no known incidents of users having harmful experience relating to this harm on Treadl.
  • Impact on affected individuals
    • We assess that the likelihood of individual characteristics becoming a factor of increased impact of this harm is low due to the limited way for people to express their individuality using the service, and for others to identify individuals (since users cannot be searched for -- either by their characteristics or otherwise).
  • The design and operation of Treadl
    • Treadl has no revenue model or growth strategy -- it is a free and open source service offering. There is no financial incentive for Treadl to run algorithms that keep users "entertained" or that amplify content subjectively, and all features are provided on a utilitarian basis.
    • Treadl's commercial profile means there is a potential capacity issue in the required human content reviews and moderation. However, current evidence demonstrates that the volume of moderation needed is low, with only a small handful of items requiring review per day (due to Treadl's small scale, and because most people use the tool for weaving pattern editing). If this were to increase 2x or 3x it would still be manageable by one person -- particularly with the technical tools in place to help with this process. Additional technical tools could be developed, and additional human processes introduced, to further scale this capability when required.
    • We assess that there are no other service characteristics that would cause the impact of this harm to be increased.
    • Recommender systems used by Treadl supply data either by date or by pure randomness -- and are weaving project focused (for example, the chronological "Explore" timeline only includes weaving patterns, and images and files and comments are not included). Given all other content is approved or moderated the likelihood of this factor increasing the impact of the harm is low.
    • We assess that, based on the evidence, user characteristics data cannot be used to increase the impact of this harm to specific user sub-groups. As such, user demographics is not considered to be a significant contributor to any additional impact of this harm.
  • None of the collected evidence suggests there is any impact resulting from this harm on Treadl. There have been zero user complaints/reports of this content to-date and there are no known cases of it. If content were to appear, despite approval and moderation processes, Treadl does not receive enough traffic to substantially increase the impact of this harm.

Existing controls that reduce the risk of this harm

The following technical and process controls reduce the likelihood and impact of this harm:

  • Content approval, which means that all project images and files and comments must be approved before they are available to others.
  • Content moderation, which means that other content (e.g. user profiles, project names, and project descriptions) is moderated in a timely manner (according to our Online Safety Policy) after potentially becoming available to others. Note that even private projects are moderated.
  • Reporting capabilities, which allow any illegal content encountered before moderation takes place to be reported, so that its removal can be prioritised and further dissemination reduced.

Controlling associated Risk Profile risk factors

Pertinent factors from the Risk Profile related to this harm are managed or reduced as follows:

  • File sharing services (users can upload files to projects, and projects can be public)
    • Control: all files uploaded to projects must be manually approved before they are available to other users.
  • Posting images or videos
    • Control: all project images and videos must be manually approved before they are accessible to other users.
    • Reduced: user profile pictures are all optional and are all public.
    • Control: user profile pictures undergo manual moderation after they are updated.

Additional characteristics we consider

We also consider the following characteristics in the assessment of the likelihood and impact of this type of harm:

  • Commercial profile: potentially increases the risk of harm if moderation requirements were to exceed team capacity. However, there is no evidence of this capacity being reached, and this risk is mitigated by easy-to-use technical controls that are also available remotely.
  • Functionality: Treadl does not have the capability to support private or group-based messaging. Content is either limited to its author or it is public. This reduces the effectiveness of using Treadl to commit or facilitate offences related to this harm.
  • Target audience: Treadl is purely focused on the art and weaving community, with service features heavily specific to this use-case. We consider that this significantly reduces the likelihood and potential impact of this type of harm occurring, and in being used to facilitate or commit an offence.
  • User demographics: service features and capabilities (as described above) mean that we consider the risk of personal characteristics increasing the likelihood or impact of harm to be low.

Risk level

Based on the above evidence and reasoning, we assess that the likelihood and impact of CSEA (CSAM) content and/or offences on Treadl is LOW.

2 (c). CSEA - CSAM URLs

Evidence collection

  • Risk profile and risk factors:
    • File sharing services
    • User profiles
      • Please note: this is not a risk factor from the Risk Profile explicitly linked to this harm, but since user profiles can contain links (e.g. to a website or social feed), we acknowledge the risk of this feature being exploited to share CSAM URLs.
  • User complaints:
    • None to date
  • User data (age, background, etc.)
    • We do not collect enough user data to realistically determine Treadl's population type for personal characteristics.
    • Treadl does not support discovering users by their characteristics. For example, users cannot be searched for, and user profiles are purposefully kept relatively vague (real names, gender, etc. are not required).
    • Data indicates that around 5% of user accounts are associated with university email addresses. However, only 0.1% are from UK universities.
  • Retrospective analysis of incidents of harm
    • There are no known incidents of this harm on Treadl.
  • Evidence drawn from existing controls
    • Manual pre-display content approval is in place for all uploaded files, content comments, and posted pictures and videos.
    • Content moderation is in place for all user profile updates.
    • Searching for user-generated content is limited to weaving project "names". Users cannot search by tag or other mechanism.
    • Content-"recommendation" is based on pure randomness, and only recommends weaving projects that have weaving patterns containing threading content. There are no algorithms or machine-learning capabilities that make this process user-oriented nor activity-oriented.
    • Content reporting/complaints is available and signposted from all user-generated content, including the above, user profiles, and projects.
  • Other relevant information
    • We assess that we do not need enhanced inputs to assess this risk and that there are no characteristics which increase this harm.
    • Projects on Treadl (which may contain images, files, and comments) are either public OR private. As such, a project's content is visible either only to its author (if private) or to the public (if public, after approval and subject to moderation).

Assessing the likelihood of encountering this illegal content and chance of facilitating offences

  • At a generic service level (i.e. not specific to Treadl's specific nature), the risk factors identified with this harm (see "Risk profile and factors" above) would indicate that this harm is less likely on the service.
  • There are no additional characteristics of Treadl that make this harm more likely.
  • Evidence suggests that this harm is not likely to occur on Treadl, as there have never been any known incidents nor user complaints.
  • There are a number of measures that further reduce the likelihood of this harm, particularly pre-publication approval of all project files and comments, and post-publication moderation of user profile changes. As such, any incidents of this harm in these spaces would be removed/reported before they become available to other users, or -- in some cases -- very shortly afterwards.

Assessing the impact of this illegal content and chance of facilitating an offence

  • Nature and severity of the harm
    • There are no known incidents of users having harmful experience relating to this harm on Treadl.
  • Impact on affected individuals
    • We assess that the likelihood of individual characteristics becoming a factor of increased impact of this harm is low due to the limited way for people to express their individuality using the service, and for others to identify individuals (since users cannot be searched for -- either by their characteristics or otherwise).
  • The design and operation of Treadl
    • Treadl has no revenue model or growth strategy -- it is a free and open source service offering. There is no financial incentive for Treadl to run algorithms that keep users "entertained" or that amplify content subjectively, and all features are provided on a utilitarian basis.
    • Treadl's commercial profile means there is a potential capacity issue in the required human content reviews and moderation. However, current evidence demonstrates that the volume of moderation needed is low, with only a small handful of items requiring review per day (due to Treadl's small scale, and because most people use the tool for weaving pattern editing). If this were to increase 2x or 3x it would still be manageable by one person -- particularly with the technical tools in place to help with this process. Additional technical tools could be developed, and additional human processes introduced, to further scale this capability when required.
    • We assess that there are no other service characteristics that would cause the impact of this harm to be increased.
    • Recommender systems used by Treadl supply data either by date or by pure randomness -- and are weaving project focused (for example, the chronological "Explore" timeline only includes weaving patterns, and images and files and comments are not included). Given all other content is approved or moderated the likelihood of this factor increasing the impact of the harm is low.
    • We assess that, based on the evidence, user characteristics data cannot be used to increase the impact of this harm to specific user sub-groups. As such, user demographics is not considered to be a significant contributor to any additional impact of this harm.
  • None of the collected evidence suggests there is any impact resulting from this harm on Treadl. There have been zero user complaints/reports of this content to-date and there are no known cases of it. If content were to appear, despite approval and moderation processes, Treadl does not receive enough traffic to substantially increase the impact of this harm.

Existing controls that reduce the risk of this harm

The following technical and process controls reduce the likelihood and impact of this harm:

  • Content approval, which means that all project images and files and comments must be approved before they are available to others.
  • Content moderation, which means that other content (e.g. user profiles, project names, and project descriptions) is moderated in a timely manner (according to our Online Safety Policy) after potentially becoming available to others. Note that even private projects are moderated.
  • Reporting capabilities, which allow any illegal content encountered before moderation takes place to be reported, so that its removal can be prioritised and further dissemination reduced.

Controlling associated Risk Profile risk factors

Pertinent factors from the Risk Profile related to this harm are managed or reduced as follows:

  • File sharing services (users can upload files to projects, and projects can be public)
    • Control: all files uploaded to projects must be manually approved before they are available to other users.
  • User profiles
    • Control: all user profile changes are moderated by humans.
    • Reduced: all website/URL fields in a profile are "guided"; for example, they explicitly refer to particular services (such as social feeds) or to the user's own website.
    • Control: user reporting and complaints.

Additional characteristics we consider

We also consider the following characteristics in the assessment of the likelihood and impact of this type of harm:

  • Commercial profile: potentially increases the risk of harm if moderation requirements were to exceed team capacity. However, there is no evidence of this capacity being reached, and this risk is mitigated by easy-to-use technical controls that are also available remotely.
  • Functionality: Treadl does not have the capability to support private or group-based messaging. Content is either limited to its author or it is public. This reduces the effectiveness of using Treadl to commit or facilitate offences related to this harm.
  • Target audience: Treadl is purely focused on the art and weaving community, with service features heavily specific to this use-case. We consider that this significantly reduces the likelihood and potential impact of this type of harm occurring, and in being used to facilitate or commit an offence.
  • User demographics: service features and capabilities (as described above) mean that we consider the risk of personal characteristics increasing the likelihood or impact of harm to be low.

Risk level

Based on the above evidence and reasoning, we assess that the likelihood and impact of CSEA (CSAM URLs) content and/or offences on Treadl is LOW.

3. Hate

Evidence collection

  • Risk profile and risk factors:
    • File sharing services
    • User profiles
    • Commenting on content
    • Posting images or videos
    • Content recommender systems
  • User complaints:
    • None to date
  • User data (age, background, etc.)
    • We do not collect enough user data to realistically determine Treadl's population type for personal characteristics.
    • Treadl does not support user discovery by personal characteristics. For example, users cannot be searched for, and user profiles are purposefully kept relatively vague (without requiring real names, gender, etc.).
    • Data indicates that around 5% of user accounts are associated with university email addresses. However, only 0.1% are from UK universities.
  • Retrospective analysis of incidents of harm
    • There are no known incidents of this harm on Treadl.
  • Evidence drawn from existing controls
    • Manual pre-display content-approval is in-play for all uploaded files, content comments, and posted pictures and videos.
    • Content moderation is in-play for all user profile updates.
    • Searching for user-generated content is limited to weaving project "names". Users cannot search by tag or other mechanism.
    • Content-"recommendation" is based on pure randomness, and only recommends weaving projects that have weaving patterns containing threading content. There are no algorithms or machine-learning capabilities that make this process user-oriented nor activity-oriented.
    • Content reporting/complaints is available and signposted from all user-generated content, including the above, user profiles, and projects.
  • Other relevant information
    • We assess that we do not need enhanced inputs to assess this risk and that there are no characteristics which increase this harm.
    • Projects on Treadl (which may contain images, files, and comments) are either public OR private. As such, their content can only be seen by the authors themselves or by the public (after approval and subject to moderation).
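
To illustrate the non-personalised recommendation behaviour described above, the sketch below performs a uniform random selection over public projects that contain a weaving pattern with threading data. It is a hypothetical illustration, not Treadl's implementation; all names (Pattern, Project, recommend, etc.) are assumptions, and the key point is that no user data or engagement signals feed the selection.

```python
# Minimal sketch (hypothetical; not Treadl's actual code) of purely random,
# non-personalised recommendation of public weaving projects.
import random
from dataclasses import dataclass, field
from typing import List

@dataclass
class Pattern:
    threading: List[int] = field(default_factory=list)  # shaft numbers per warp end

@dataclass
class Project:
    name: str
    is_public: bool
    patterns: List[Pattern] = field(default_factory=list)

def recommend(projects: List[Project], count: int = 5) -> List[Project]:
    """Return up to `count` projects chosen uniformly at random.
    No user data, activity history, or engagement signals are used."""
    candidates = [
        p for p in projects
        if p.is_public and any(pat.threading for pat in p.patterns)
    ]
    return random.sample(candidates, k=min(count, len(candidates)))

# Usage: private projects and projects without threading content are never recommended.
pool = [
    Project("Overshot sampler", True, [Pattern([1, 2, 3, 4])]),
    Project("Private draft", False, [Pattern([1, 3])]),
    Project("Notes only", True, []),
]
print([p.name for p in recommend(pool)])  # -> ['Overshot sampler']
```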

Assessing the likelihood of encountering this illegal content and chance of facilitating offences

  • At a generic service level (i.e. not specific to Treadl's particular nature), the risk factors identified with this harm (see "Risk profile and factors" above) would indicate that this harm is less likely on the service.
  • There are no additional characteristics of Treadl that make this harm more likely.
  • Evidence suggests that this harm is not likely to occur on Treadl, as there have never been any known incidents nor user complaints.
  • A number of measures further reduce the likelihood of this harm, particularly pre-publish approval of all project files and comments, and post-publish moderation of user profile changes. As such, any incidents of this harm in these spaces would be removed/reported before they become available to other users, or -- in some cases -- very shortly afterwards.

Assessing the impact of this illegal content and chance of facilitating an offence

  • Nature and severity of the harm
    • There are no known incidents of users having harmful experience relating to this harm on Treadl.
  • Impact on affected individuals
    • We assess that the likelihood of individual characteristics increasing the impact of this harm is low, due to the limited ways in which people can express their individuality on the service, and the limited means by which others can identify individuals (users cannot be searched for, either by their characteristics or otherwise).
  • The design and operation of Treadl
    • Treadl has no revenue model or growth strategy -- it is a free and open-source service offering. There is no financial incentive for Treadl to use algorithms that keep users "entertained" or amplify content subjectively, and all features are provided on a utilitarian basis.
    • Treadl's commercial profile means there is a potential capacity constraint on the human content reviews and moderation required. However, current evidence shows that the volume of moderation needed is low, with only a small handful of reviews per day (due to Treadl's small scale, and because most people use the tool for weaving-pattern editing). Even if this volume were to double or triple, it would still be manageable by one person -- particularly with the technical tools in place to help with this process. Further technical tools could be developed, and additional human processes introduced, to scale this capability when required.
    • We assess that there are no other service characteristics that would cause the impact of this harm to be increased.
    • Recommender systems used by Treadl supply data either by date or by pure randomness, and are weaving-project focused (for example, the chronological "Explore" timeline only includes weaving patterns; images, files, and comments are not included). Given that all other content is approved or moderated, the likelihood of this factor increasing the impact of the harm is low.
    • We assess that, based on the evidence, user characteristics data cannot be used to increase the impact of this harm to specific user sub-groups. As such, user demographics are not considered to be a significant contributor to any additional impact of this harm.
  • None of the collected evidence suggests there is any impact resulting from this harm on Treadl. There have been zero user complaints/reports of this content to date and there are no known cases of it. Even if such content were to appear, despite the approval and moderation processes, Treadl does not receive enough traffic to substantially increase the impact of this harm.

Existing controls that reduce the risk of this harm

The following technical and process controls reduce the likelihood and impact of this harm:

  • Content approval, which means that all project images, files, and comments must be approved before they are available to others.
  • Content moderation, which means that other content (e.g. user profiles and project names and descriptions) is moderated in a timely manner (according to our Online Safety Policy) after potentially becoming available to others. Note that even private projects are moderated.
  • Reporting capabilities, which allow any illegal content encountered before moderation takes place to be reported, so that its review can be prioritised and further dissemination of the illegal content reduced.

Controlling associated Risk Profile risk factors

Pertinent factors from the Risk Profile related to this harm are managed or reduced as follows:

  • File sharing services (users can upload files to projects, and projects can be public)
    • Control: all files uploaded to projects must be manually approved before they are available to other users.
  • User profiles (users have a presence on Treadl, with avatar image, bio, location, and info)
    • Control: all user profile changes are moderated by humans, as a technical control for mitigating harms spread through this means.
    • Reduced: Treadl user profiles are intentionally minimal and do not typically reveal personal characteristics unless usernames are re-used or someone intentionally attempts to reveal more about themselves.
  • Commenting on content (users can comment on the public projects by other users)
    • Reduced: all comments on public projects are also public.
    • Control: all comments must be manually approved before they are viewable by other users.
  • Posting images or videos
    • Control: all project images and videos must be manually approved before they are accessible to other users.
    • Reduced: user profile pictures are all optional and are all public.
    • Control: user profile pictures undergo manual moderation after they are updated.
  • Content-recommender systems
    • Reduced: users are only recommended public projects and weaving patterns
    • Control: images, files, videos, and comments in public projects are moderated/approved as described above.

Additional characteristics we consider

We also consider the following characteristics in the assessment of the likelihood and impact of this type of harm:

  • Commercial profile: potentially increases risk of harm if moderation requirements exceed team capacity. However, there is no evidence of this capacity being reached, and this risk is mitigated by easy-to-use technical controls that are also available remotely.
  • Functionality: Treadl does not have the capability to support private or group-based messaging. Content is either limited to its author or it is public. This reduces the effectiveness of using Treadl to commit or facilitate offences related to this harm.
  • Target audience: Treadl is purely focused on the art and weaving community, with service features heavily specific to this use-case. We consider that this significantly reduces the likelihood and potential impact of this type of harm occurring, and in being used to facilitate or commit an offence.
  • User demographics: service features and capabilities (as described above) mean that we consider the risk of personal characteristics increasing the likelihood or impact of harm to be low.

Risk level

Based on the above evidence and reasoning, we assess that the likelihood and impact of hate content and/or offences on Treadl is LOW.

4. Harassment, stalking, threats and abuse

Evidence collection

  • Risk profile and risk factors:
    • File sharing services
    • User profiles
    • Fake user profiles
    • Commenting on content
    • Posting images or videos
    • Posting or sending location information
  • User complaints:
    • None to date
  • User data (age, background, etc.)
    • We do not collect enough user data to realistically determine Treadl's population type for personal characteristics.
    • Treadl does not support user discovery by personal characteristics. For example, users cannot be searched for, and user profiles are purposefully kept relatively vague (without requiring real names, gender, etc.).
    • Data indicates that around 5% of user accounts are associated with university email addresses. However, only 0.1% are from UK universities.
  • Retrospective analysis of incidents of harm
    • There are no known incidents of this harm on Treadl.
  • Evidence drawn from existing controls
    • Manual pre-display content-approval is in-play for all uploaded files, content comments, and posted pictures and videos.
    • Content moderation is in-play for all user profile updates.
    • Searching for user-generated content is limited to weaving project "names". Users cannot search by tag or other mechanism.
    • Content-"recommendation" is based on pure randomness, and only recommends weaving projects that have weaving patterns containing threading content. There are no algorithms or machine-learning capabilities that make this process user-oriented nor activity-oriented.
    • Content reporting/complaints is available and signposted from all user-generated content, including the above, user profiles, and projects.
  • Other relevant information
    • We assess that we do not need enhanced inputs to assess this risk and that there are no characteristics which increase this harm.
    • Projects on Treadl (which may contain images, files, and comments) are either public OR private. As such, their content can only be seen by the authors themselves or by the public (after approval and subject to moderation).

Assessing the likelihood of encountering this illegal content and chance of facilitating offences

  • At a generic service level (i.e. not specific to Treadl's particular nature), the risk factors identified with this harm (see "Risk profile and factors" above) would indicate that this harm is somewhat likely on the service.
  • There are no additional characteristics of Treadl that make this harm more likely.
  • Evidence suggests that this harm is not likely to occur on Treadl, as there have never been any known incidents nor user complaints.
  • A number of measures further reduce the likelihood of this harm, particularly pre-publish approval of all project files and comments, and post-publish moderation of user profile changes. As such, any incidents of this harm in these spaces would be removed/reported before they become available to other users, or -- in some cases -- very shortly afterwards.

Assessing the impact of this illegal content and chance of facilitating an offence

  • Nature and severity of the harm
    • There are no known incidents of users having harmful experience relating to this harm on Treadl.
  • Impact on affected individuals
    • We assess that the likelihood of individual characteristics increasing the impact of this harm is low, due to the limited ways in which people can express their individuality on the service, and the limited means by which others can identify individuals (users cannot be searched for, either by their characteristics or otherwise).
  • The design and operation of Treadl
    • Treadl has no revenue model or growth strategy -- it is a free and open-source service offering. There is no financial incentive for Treadl to use algorithms that keep users "entertained" or amplify content subjectively, and all features are provided on a utilitarian basis.
    • Treadl's commercial profile means there is a potential capacity constraint on the human content reviews and moderation required. However, current evidence shows that the volume of moderation needed is low, with only a small handful of reviews per day (due to Treadl's small scale, and because most people use the tool for weaving-pattern editing). Even if this volume were to double or triple, it would still be manageable by one person -- particularly with the technical tools in place to help with this process. Further technical tools could be developed, and additional human processes introduced, to scale this capability when required.
    • We assess that there are no other service characteristics that would cause the impact of this harm to be increased.
    • Recommender systems used by Treadl supply data either by date or by pure randomness, and are weaving-project focused (for example, the chronological "Explore" timeline only includes weaving patterns; images, files, and comments are not included). Given that all other content is approved or moderated, the likelihood of this factor increasing the impact of the harm is low.
    • We assess that, based on the evidence, user characteristics data cannot be used to increase the impact of this harm to specific user sub-groups. As such, user demographics are not considered to be a significant contributor to any additional impact of this harm.
  • None of the collected evidence suggests there is any impact resulting from this harm on Treadl. There have been zero user complaints/reports of this content to date and there are no known cases of it. Even if such content were to appear, despite the approval and moderation processes, Treadl does not receive enough traffic to substantially increase the impact of this harm.

Existing controls that reduce the risk of this harm

The following technical and process controls reduce the likelihood and impact of this harm:

  • Content approval, which means that all project images, files, and comments must be approved before they are available to others.
  • Content moderation, which means that other content (e.g. user profiles and project names and descriptions) is moderated in a timely manner (according to our Online Safety Policy) after potentially becoming available to others. Note that even private projects are moderated.
  • Reporting capabilities, which allow any illegal content encountered before moderation takes place to be reported, so that its review can be prioritised and further dissemination of the illegal content reduced (a minimal sketch of this report prioritisation is given after this list).
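
The sketch below illustrates, in hypothetical form, how user reports can be prioritised ahead of routine moderation work so that potentially illegal content is reviewed first. It is not Treadl's actual code; ModerationQueue and the priority values are assumptions used only to show the prioritisation principle.

```python
# Minimal sketch (hypothetical; not Treadl's actual code) of report-driven
# prioritisation in a human moderation queue.
import heapq
import itertools
from dataclasses import dataclass

@dataclass
class ModerationTask:
    content_id: str
    reason: str

class ModerationQueue:
    """Priority queue: reported content (priority 0) is reviewed before
    routine moderation work (priority 1)."""
    ROUTINE, REPORTED = 1, 0

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # stable FIFO order within a priority

    def add_routine(self, content_id: str):
        heapq.heappush(self._heap, (self.ROUTINE, next(self._counter),
                                    ModerationTask(content_id, "routine review")))

    def report(self, content_id: str, reason: str):
        heapq.heappush(self._heap, (self.REPORTED, next(self._counter),
                                    ModerationTask(content_id, f"user report: {reason}")))

    def next_task(self) -> ModerationTask:
        return heapq.heappop(self._heap)[2]

# Usage: a user report overtakes earlier routine work.
queue = ModerationQueue()
queue.add_routine("profile:42")
queue.report("comment:7", "suspected illegal content")
print(queue.next_task())  # -> the reported comment is reviewed first
```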

Controlling associated Risk Profile risk factors

Pertinent factors from the Risk Profile related to this harm are managed or reduced as follows:

  • File sharing services (users can upload files to projects, and projects can be public)
    • Control: all files uploaded to projects must be manually approved before they are available to other users.
  • User profiles (users have a presence on Treadl, with avatar image, bio, location, and info)
    • Control: all user profile changes are moderated by humans, as a technical control for mitigating harms spread through this means.
    • Reduced: Treadl user profiles are intentionally minimal and do not typically reveal personal characteristics unless usernames are re-used or someone intentionally attempts to reveal more about themselves.
  • Fake user profiles
    • Reduced: Treadl user profiles are intentionally minimal and do not typically reveal personal characteristics.
    • Control: all user profile changes are moderated. However, we acknowledge that moderators may not know if a fake user profile is impersonating someone else.
    • Reduced: given that all people can do on Treadl is post their weaving projects, we assess that the impact of this illegal content, and of any resultant harm arising through fake profiles, is minimised.
    • Control: if a fake user profile is used to write comments on public projects for the purposes of harassment, grooming, etc., such comments are subject to approval mechanisms.
  • Commenting on content (users can comment on the public projects by other users)
    • Reduced: all comments on public projects are also public.
    • Control: all comments must be manually approved before they are viewable by other users.
  • Posting images or videos
    • Control: all project images and videos must be manually approved before they are accessible to other users.
    • Reduced: user profile pictures are all optional and are all public.
    • Control: user profile pictures undergo manual moderation after they are updated.
  • Posting or sending location information
    • Control: all user profile updates (the only place where user location is explicitly attributed) undergo moderation
    • Reduced: only 2% of Treadl users use the location field on user profiles. It is free text, allowing people to be as vague as they like (e.g. country level). Location data is not "posted" or "sent" out aside from this -- it can only be viewed (if set) if someone views someone else's profile.

Additional characteristics we consider

We also consider the following characteristics in the assessment of the likelihood and impact of this type of harm:

  • Commercial profile: potentially increases risk of harm if moderation requirements exceed team capacity. However, there is no evidence of this capacity being reached, and this risk is mitigated by easy-to-use technical controls that are also available remotely.
  • Functionality: Treadl does not have the capability to support private or group-based messaging. Content is either limited to its author or it is public. This reduces the effectiveness of using Treadl to commit or facilitate offences related to this harm.
  • Target audience: Treadl is purely focused on the art and weaving community, with service features heavily specific to this use-case. We consider that this significantly reduces the likelihood and potential impact of this type of harm occurring, and in being used to facilitate or commit an offence.
  • User demographics: service features and capabilities (as described above) mean that we consider the risk of personal characteristics increasing the likelihood or impact of harm to be low.

Risk level

Based on the above evidence and reasoning, we assess that the likelihood and impact of harassment/stalking/threats/abuse content and/or offences on Treadl is LOW.

5. Controlling or coercive behaviour

Evidence collection

  • Risk profile and risk factors:
    • File sharing services
    • Fake user profiles
    • Posting images or videos
    • Posting or sending location information
  • User complaints:
    • None to date
  • User data (age, background, etc.)
    • We do not collect enough user data to realistically determine Treadl's population type for personal characteristics.
    • Treadl does not support user discovery by personal characteristics. For example, users cannot be searched for, and user profiles are purposefully kept relatively vague (without requiring real names, gender, etc.).
    • Data indicates that around 5% of user accounts are associated with university email addresses. However, only 0.1% are from UK universities.
  • Retrospective analysis of incidents of harm
    • There are no known incidents of this harm on Treadl.
  • Evidence drawn from existing controls
    • Manual pre-display content-approval is in-play for all uploaded files, content comments, and posted pictures and videos.
    • Content moderation is in-play for all user profile updates.
    • Searching for user-generated content is limited to weaving project "names". Users cannot search by tag or other mechanism.
    • Content-"recommendation" is based on pure randomness, and only recommends weaving projects that have weaving patterns containing threading content. There are no algorithms or machine-learning capabilities that make this process user-oriented nor activity-oriented.
    • Content reporting/complaints is available and signposted from all user-generated content, including the above, user profiles, and projects.
  • Other relevant information
    • We assess that we do not need enhanced inputs to assess this risk and that there are no characteristics which increase this harm.
    • Projects on Treadl (which may contain images, files, and comments) are either public OR private. As such, their content can only be seen by the authors themselves or by the public (after approval and subject to moderation).

Assessing the likelihood of encountering this illegal content and chance of facilitating offences

  • At a generic service level (i.e. not specific to Treadl's particular nature), the risk factors identified with this harm (see "Risk profile and factors" above) would indicate that this harm is somewhat likely on the service.
  • There are no additional characteristics of Treadl that make this harm more likely.
  • Evidence suggests that this harm is not likely to occur on Treadl, as there have never been any known incidents nor user complaints.
  • A number of measures further reduce the likelihood of this harm, particularly pre-publish approval of all project files and comments, and post-publish moderation of user profile changes. As such, any incidents of this harm in these spaces would be removed/reported before they become available to other users, or -- in some cases -- very shortly afterwards.

Assessing the impact of this illegal content and chance of facilitating an offence

  • Nature and severity of the harm
    • There are no known incidents of users having harmful experience relating to this harm on Treadl.
  • Impact on affected individuals
    • We assess that the likelihood of individual characteristics increasing the impact of this harm is low, due to the limited ways in which people can express their individuality on the service, and the limited means by which others can identify individuals (users cannot be searched for, either by their characteristics or otherwise).
  • The design and operation of Treadl
    • Treadl has no revenue model or growth strategy -- it is a free and open-source service offering. There is no financial incentive for Treadl to use algorithms that keep users "entertained" or amplify content subjectively, and all features are provided on a utilitarian basis.
    • Treadl's commercial profile means there is a potential capacity constraint on the human content reviews and moderation required. However, current evidence shows that the volume of moderation needed is low, with only a small handful of reviews per day (due to Treadl's small scale, and because most people use the tool for weaving-pattern editing). Even if this volume were to double or triple, it would still be manageable by one person -- particularly with the technical tools in place to help with this process. Further technical tools could be developed, and additional human processes introduced, to scale this capability when required.
    • We assess that there are no other service characteristics that would cause the impact of this harm to be increased.
    • Recommender systems used by Treadl supply data either by date or by pure randomness, and are weaving-project focused (for example, the chronological "Explore" timeline only includes weaving patterns; images, files, and comments are not included). Given that all other content is approved or moderated, the likelihood of this factor increasing the impact of the harm is low.
    • We assess that, based on the evidence, user characteristics data cannot be used to increase the impact of this harm to specific user sub-groups. As such, user demographics are not considered to be a significant contributor to any additional impact of this harm.
  • None of the collected evidence suggests there is any impact resulting from this harm on Treadl. There have been zero user complaints/reports of this content to date and there are no known cases of it. Even if such content were to appear, despite the approval and moderation processes, Treadl does not receive enough traffic to substantially increase the impact of this harm.

Existing controls that reduce the risk of this harm

The following technical and process controls reduce the likelihood and impact of this harm:

  • Content approval, which means that all project images, files, and comments must be approved before they are available to others.
  • Content moderation, which means that other content (e.g. user profiles and project names and descriptions) is moderated in a timely manner (according to our Online Safety Policy) after potentially becoming available to others. Note that even private projects are moderated (a minimal sketch of this post-publication review flow is given after this list).
  • Reporting capabilities, which allow any illegal content encountered before moderation takes place to be reported, so that its review can be prioritised and further dissemination of the illegal content reduced.
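
As a hypothetical illustration of the post-publication review flow for profile changes, the sketch below applies an update immediately and records it in a review log for a human moderator to check shortly afterwards. The names (update_profile, ProfileReview, REVIEW_LOG) are assumptions, not Treadl's real schema.

```python
# Minimal sketch (hypothetical; Treadl's real stack may differ) of post-publish
# moderation: profile changes take effect immediately but are logged for human review.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Dict, List

@dataclass
class ProfileReview:
    username: str
    changed_fields: Dict[str, str]
    submitted_at: datetime
    reviewed: bool = False

REVIEW_LOG: List[ProfileReview] = []  # polled by the moderator, oldest first

def update_profile(profiles: Dict[str, Dict[str, str]], username: str,
                   changes: Dict[str, str]) -> None:
    """Apply the change right away (profiles are moderated after publication),
    then queue it for human review per the Online Safety Policy."""
    profiles.setdefault(username, {}).update(changes)
    REVIEW_LOG.append(ProfileReview(username, changes,
                                    datetime.now(timezone.utc)))

# Usage
profiles: Dict[str, Dict[str, str]] = {}
update_profile(profiles, "bob", {"bio": "Handweaver", "location": "UK"})
print(profiles["bob"])         # change is live immediately
print(REVIEW_LOG[0].reviewed)  # False -> awaiting human moderation
```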

Controlling associated Risk Profile risk factors

Pertinent factors from the Risk Profile related to this harm are managed or reduced as follows:

  • File sharing services (users can upload files to projects, and projects can be public)
    • Control: all files uploaded to projects must be manually approved before they are available to other users.
  • Fake user profiles
    • Reduced: Treadl user profiles are intentionally minimal and do not typically reveal personal characteristics.
    • Control: all user profile changes are moderated. However, we acknowledge that moderators may not know if a fake user profile is impersonating someone else.
    • Reduced: given that all people can do on Treadl is post their weaving projects, we assess that the impact of this illegal content, and of any resultant harm arising through fake profiles, is minimised.
    • Control: if a fake user profile is used to write comments on public projects for the purposes of harassment, grooming, etc., such comments are subject to approval mechanisms.
  • Posting images or videos
    • Control: all project images and videos must be manually approved before they are accessible to other users.
    • Reduced: user profile pictures are all optional and are all public.
    • Control: user profile pictures undergo manual moderation after they are updated.
  • Posting or sending location information
    • Control: all user profile updates (the only place where user location is explicitly attributed) undergo moderation
    • Reduced: only 2% of Treadl users use the location field on user profiles. It is free text, allowing people to be as vague as they like (e.g. country level). Location data is not "posted" or "sent" out aside from this -- it can only be viewed (if set) if someone views someone else's profile.

Additional characteristics we consider

We also consider the following characteristics in the assessment of the likelihood and impact of this type of harm:

  • Commercial profile: potentially increases risk of harm if moderation requirements exceed team capacity. However, there is no evidence of this capacity being reached, and this risk is mitigated by easy-to-use technical controls that are also available remotely.
  • Functionality: Treadl does not have the capability to support private or group-based messaging. Content is either limited to its author or it is public. This reduces the effectiveness of using Treadl to commit or facilitate offences related to this harm.
  • Target audience: Treadl is purely focused on the art and weaving community, with service features heavily specific to this use-case. We consider that this significantly reduces the likelihood and potential impact of this type of harm occurring, and in being used to facilitate or commit an offence.
  • User demographics: service features and capabilities (as described above) mean that we consider the risk of personal characteristics increasing the likelihood or impact of harm to be low.

Risk level

Based on the above evidence and reasoning, we assess that the likelihood and impact of controlling or coercive behaviour content and/or offences on Treadl is LOW.

6. Intimate image abuse

Evidence collection

  • Risk profile and risk factors:
    • File sharing services
    • Posting images or videos
  • User complaints:
    • None to date
  • User data (age, background, etc.)
    • We do not collect enough user data to realistically determine Treadl's population type for personal characteristics.
    • Treadl does not support user discovery by personal characteristics. For example, users cannot be searched for, and user profiles are purposefully kept relatively vague (without requiring real names, gender, etc.).
    • Data indicates that around 5% of user accounts are associated with university email addresses. However, only 0.1% are from UK universities.
  • Retrospective analysis of incidents of harm
    • There are no known incidents of this harm on Treadl.
  • Evidence drawn from existing controls
    • Manual pre-display content-approval is in-play for all uploaded files, content comments, and posted pictures and videos.
    • Content moderation is in-play for all user profile updates.
    • Searching for user-generated content is limited to weaving project "names". Users cannot search by tag or other mechanism.
    • Content-"recommendation" is based on pure randomness, and only recommends weaving projects that have weaving patterns containing threading content. There are no algorithms or machine-learning capabilities that make this process user-oriented nor activity-oriented.
    • Content reporting/complaints is available and signposted from all user-generated content, including the above, user profiles, and projects.
  • Other relevant information
    • We assess that we do not need enhanced inputs to assess this risk and that there are no characteristics which increase this harm.
    • Projects on Treadl (which may contain images, files, and comments) are either public OR private. As such, their content can only be seen by the authors themselves or by the public (after approval and subject to moderation).

Assessing the likelihood of encountering this illegal content and chance of facilitating offences

  • At a generic service level (i.e. not specific to Treadl's particular nature), the risk factors identified with this harm (see "Risk profile and factors" above) would indicate that this harm is less likely on the service.
  • There are no additional characteristics of Treadl that make this harm more likely.
  • Evidence suggests that this harm is not likely to occur on Treadl, as there have never been any known incidents nor user complaints.
  • A number of measures further reduce the likelihood of this harm, particularly pre-publish approval of all project files and comments, and post-publish moderation of user profile changes. As such, any incidents of this harm in these spaces would be removed/reported before they become available to other users, or -- in some cases -- very shortly afterwards.

Assessing the impact of this illegal content and chance of facilitating an offence

  • Nature and severity of the harm
    • There are no known incidents of users having harmful experience relating to this harm on Treadl.
  • Impact on affected individuals
    • We assess that the likelihood of individual characteristics increasing the impact of this harm is low, due to the limited ways in which people can express their individuality on the service, and the limited means by which others can identify individuals (users cannot be searched for, either by their characteristics or otherwise).
  • The design and operation of Treadl
    • Treadl has no revenue model or growth strategy -- it is a free and open-source service offering. There is no financial incentive for Treadl to use algorithms that keep users "entertained" or amplify content subjectively, and all features are provided on a utilitarian basis.
    • Treadl's commercial profile means there is a potential capacity constraint on the human content reviews and moderation required. However, current evidence shows that the volume of moderation needed is low, with only a small handful of reviews per day (due to Treadl's small scale, and because most people use the tool for weaving-pattern editing). Even if this volume were to double or triple, it would still be manageable by one person -- particularly with the technical tools in place to help with this process. Further technical tools could be developed, and additional human processes introduced, to scale this capability when required.
    • We assess that there are no other service characteristics that would cause the impact of this harm to be increased.
    • Recommender systems used by Treadl supply data either by date or by pure randomness, and are weaving-project focused (for example, the chronological "Explore" timeline only includes weaving patterns; images, files, and comments are not included). Given that all other content is approved or moderated, the likelihood of this factor increasing the impact of the harm is low.
    • We assess that, based on the evidence, user characteristics data cannot be used to increase the impact of this harm to specific user sub-groups. As such, user demographics are not considered to be a significant contributor to any additional impact of this harm.
  • None of the collected evidence suggests there is any impact resulting from this harm on Treadl. There have been zero user complaints/reports of this content to date and there are no known cases of it. Even if such content were to appear, despite the approval and moderation processes, Treadl does not receive enough traffic to substantially increase the impact of this harm.

Existing controls that reduce the risk of this harm

The following technical and process controls reduce the likelihood and impact of this harm:

  • Content approval, which means that all project images, files, and comments must be approved before they are available to others.
  • Content moderation, which means that other content (e.g. user profiles and project names and descriptions) is moderated in a timely manner (according to our Online Safety Policy) after potentially becoming available to others. Note that even private projects are moderated.
  • Reporting capabilities, which allow any illegal content encountered before moderation takes place to be reported, so that its review can be prioritised and further dissemination of the illegal content reduced.

Controlling associated Risk Profile risk factors

Pertinent factors from the Risk Profile related to this harm are managed or reduced as follows:

  • File sharing services (users can upload files to projects, and projects can be public)
    • Control: all files uploaded to projects must be manually approved before they are available to other users.
  • Posting images or videos
    • Control: all project images and videos must be manually approved before they are accessible to other users.
    • Reduced: user profile pictures are all optional and are all public.
    • Control: user profile pictures undergo manual moderation after they are updated.

Additional characteristics we consider

We also consider the following characteristics in the assessment of the likelihood and impact of this type of harm:

  • Commercial profile: potentially increases risk of harm if moderation requirements exceed team capacity. However, there is no evidence of this capacity being reached, and this risk is mitigated by easy-to-use technical controls that are also available remotely.
  • Functionality: Treadl does not have the capability to support private or group-based messaging. Content is either limited to its author or it is public. This reduces the effectiveness of using Treadl to commit or facilitate offences related to this harm.
  • Target audience: Treadl is purely focused on the art and weaving community, with service features heavily specific to this use-case. We consider that this significantly reduces the likelihood and potential impact of this type of harm occurring, and in being used to facilitate or commit an offence.
  • User demographics: service features and capabilities (as described above) mean that we consider the risk of personal characteristics increasing the likelihood or impact of harm to be low.

Risk level

Based on the above evidence and reasoning, we assess that the likelihood and impact of intimate image abuse content and/or offences on Treadl is LOW.

7. Extreme pornography

Evidence collection

  • Risk profile and risk factors:
    • File sharing services
    • Posting images or videos
    • User-generated content searching
  • User complaints:
    • None to date
  • User data (age, background, etc.)
    • We do not collect enough user data to realistically determine Treadl's population type for personal characteristics.
    • Treadl does not support user discovery by personal characteristics. For example, users cannot be searched for, and user profiles are purposefully kept relatively vague (without requiring real names, gender, etc.).
    • Data indicates that around 5% of user accounts are associated with university email addresses. However, only 0.1% are from UK universities.
  • Retrospective analysis of incidents of harm
    • There are no known incidents of this harm on Treadl.
  • Evidence drawn from existing controls
    • Manual pre-display content-approval is in-play for all uploaded files, content comments, and posted pictures and videos.
    • Content moderation is in-play for all user profile updates.
    • Searching for user-generated content is limited to weaving project "names". Users cannot search by tag or other mechanism.
    • Content-"recommendation" is based on pure randomness, and only recommends weaving projects that have weaving patterns containing threading content. There are no algorithms or machine-learning capabilities that make this process user-oriented nor activity-oriented.
    • Content reporting/complaints is available and signposted from all user-generated content, including the above, user profiles, and projects.
  • Other relevant information
    • We assess that we do not need enhanced inputs to assess this risk and that there are no characteristics which increase this harm.
    • Projects on Treadl (which may contain images, files, and comments) are either public OR private. As such, their content can only be seen by the authors themselves or by the public (after approval and subject to moderation).

Assessing the likelihood of encountering this illegal content and chance of facilitating offences

  • At a generic service level (i.e. not specific to Treadl's particular nature), the risk factors identified with this harm (see "Risk profile and factors" above) would indicate that this harm is less likely on the service.
  • There are no additional characteristics of Treadl that make this harm more likely.
  • Evidence suggests that this harm is not likely to occur on Treadl, as there have never been any known incidents nor user complaints.
  • A number of measures further reduce the likelihood of this harm, particularly pre-publish approval of all project files and comments, and post-publish moderation of user profile changes. As such, any incidents of this harm in these spaces would be removed/reported before they become available to other users, or -- in some cases -- very shortly afterwards.

Assessing the impact of this illegal content and chance of facilitating an offence

  • Nature and severity of the harm
    • There are no known incidents of users having harmful experience relating to this harm on Treadl.
  • Impact on affected individuals
    • We assess that the likelihood of individual characteristics increasing the impact of this harm is low, due to the limited ways in which people can express their individuality on the service, and the limited means by which others can identify individuals (users cannot be searched for, either by their characteristics or otherwise).
  • The design and operation of Treadl
    • Treadl has no revenue model or growth strategy -- it is a free and open-source service offering. There is no financial incentive for Treadl to use algorithms that keep users "entertained" or amplify content subjectively, and all features are provided on a utilitarian basis.
    • Treadl's commercial profile means there is a potential capacity constraint on the human content reviews and moderation required. However, current evidence shows that the volume of moderation needed is low, with only a small handful of reviews per day (due to Treadl's small scale, and because most people use the tool for weaving-pattern editing). Even if this volume were to double or triple, it would still be manageable by one person -- particularly with the technical tools in place to help with this process. Further technical tools could be developed, and additional human processes introduced, to scale this capability when required.
    • We assess that there are no other service characteristics that would cause the impact of this harm to be increased.
    • Recommender systems used by Treadl supply data either by date or by pure randomness, and are weaving-project focused (for example, the chronological "Explore" timeline only includes weaving patterns; images, files, and comments are not included). Given that all other content is approved or moderated, the likelihood of this factor increasing the impact of the harm is low.
    • We assess that, based on the evidence, user characteristics data cannot be used to increase the impact of this harm to specific user sub-groups. As such, user demographics are not considered to be a significant contributor to any additional impact of this harm.
  • None of the collected evidence suggests there is any impact resulting from this harm on Treadl. There have been zero user complaints/reports of this content to date and there are no known cases of it. Even if such content were to appear, despite the approval and moderation processes, Treadl does not receive enough traffic to substantially increase the impact of this harm.

Existing controls that reduce the risk of this harm

The following technical and process controls reduce the likelihood and impact of this harm:

  • Content approval, which means that all project images, files, and comments must be approved before they are available to others.
  • Content moderation, which means that other content (e.g. user profiles and project names and descriptions) is moderated in a timely manner (according to our Online Safety Policy) after potentially becoming available to others. Note that even private projects are moderated.
  • Reporting capabilities, which allow any illegal content encountered before moderation takes place to be reported, so that its review can be prioritised and further dissemination of the illegal content reduced.

Controlling associated Risk Profile risk factors

Pertinent factors from the Risk Profile related to this harm are managed or reduced as follows:

  • File sharing services (users can upload files to projects, and projects can be public)
    • Control: all files uploaded to projects must be manually approved before they are available to other users.
  • Posting images or videos
    • Control: all project images and videos must be manually approved before they are accessible to other users.
    • Reduced: user profile pictures are all optional and are all public.
    • Control: user profile pictures undergo manual moderation after they are updated.
  • User-generated content searching
    • Reduced: users can only search for other public projects (by project name) or for their own content (a minimal sketch of this restricted search is given after this list).
    • Control: all images, videos and files discoverable by this means would have had to go through a manual approval process first.
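
The sketch below is a minimal, hypothetical illustration of the restricted search described above: queries match weaving-project names only, and results are limited to public projects or the searcher's own projects. Function and field names are assumptions rather than Treadl's actual API.

```python
# Minimal sketch (hypothetical names) of search restricted to project names,
# returning only public projects or the requester's own projects.
from dataclasses import dataclass
from typing import List

@dataclass
class Project:
    name: str
    owner: str
    is_public: bool

def search_projects(projects: List[Project], query: str, requester: str) -> List[Project]:
    """There is no search over tags, users, file contents, or comments."""
    q = query.lower()
    return [
        p for p in projects
        if q in p.name.lower() and (p.is_public or p.owner == requester)
    ]

# Usage: private projects belonging to others never appear, whatever the query.
data = [
    Project("Summer and Winter towels", "alice", True),
    Project("Secret sampler", "alice", False),
]
print([p.name for p in search_projects(data, "sampler", requester="bob")])  # -> []
```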

Additional characteristics we consider

We also consider the following characteristics in the assessment of the likelihood and impact of this type of harm:

  • Commercial profile: potentially increases risk of harm if moderation requirements exceed team capacity. However, there is no evidence of this capacity being reached, and this risk is mitigated by easy-to-use technical controls that are also available remotely.
  • Functionality: Treadl does not have the capability to support private or group-based messaging. Content is either limited to its author or it is public. This reduces the effectiveness of using Treadl to commit or facilitate offences related to this harm.
  • Target audience: Treadl is purely focused on the art and weaving community, with service features heavily specific to this use-case. We consider that this significantly reduces the likelihood and potential impact of this type of harm occurring, and in being used to facilitate or commit an offence.
  • User demographics: service features and capabilities (as described above) mean that we consider the risk of personal characteristics increasing the likelihood or impact of harm to be low.

Risk level

Based on the above evidence and reasoning, we assess that the likelihood and impact of extreme pornography content and/or offences on Treadl is LOW.

8. Sexual exploitation of adults

Evidence collection

  • Risk profile and risk factors:
    • File sharing services
    • User profiles
  • User complaints:
    • None to date
  • User data (age, background, etc.)
    • We do not collect enough user data to realistically determine Treadl's population type for personal characteristics.
    • Treadl does not support user discovery by personal characteristics. For example, users cannot be searched for, and user profiles are purposefully kept relatively vague (without requiring real names, gender, etc.).
    • Data indicates that around 5% of user accounts are associated with university email addresses. However, only 0.1% are from UK universities.
  • Retrospective analysis of incidents of harm
    • There are no known incidents of this harm on Treadl.
  • Evidence drawn from existing controls
    • Manual pre-display content-approval is in-play for all uploaded files, content comments, and posted pictures and videos.
    • Content moderation is in-play for all user profile updates.
    • Searching for user-generated content is limited to weaving project "names". Users cannot search by tag or other mechanism.
    • Content-"recommendation" is based on pure randomness, and only recommends weaving projects that have weaving patterns containing threading content. There are no algorithms or machine-learning capabilities that make this process user-oriented nor activity-oriented.
    • Content reporting/complaints is available and signposted from all user-generated content, including the above, user profiles, and projects.
  • Other relevant information
    • We assess that we do not need enhanced inputs to assess this risk and that there are no characteristics which increase this harm.
    • Projects on Treadl (which may contain images, files, and comments) are either public OR private. As such, their content can only be seen by the authors themselves or by the public (after approval and subject to moderation).

Assessing the likelihood of encountering this illegal content and chance of facilitating offences

  • At a generic service level (i.e. not specific to Treadl's particular nature), the risk factors identified with this harm (see "Risk profile and factors" above) would indicate that this harm is less likely on the service.
  • There are no additional characteristics of Treadl that make this harm more likely.
  • Evidence suggests that this harm is not likely to occur on Treadl, as there have never been any known incidents nor user complaints.
  • A number of measures further reduce the likelihood of this harm, particularly pre-publish approval of all project files and comments, and post-publish moderation of user profile changes. As such, any incidents of this harm in these spaces would be removed/reported before they become available to other users, or -- in some cases -- very shortly afterwards.

Assessing the impact of this illegal content and chance of facilitating an offence

  • Nature and severity of the harm
    • There are no known incidents of users having harmful experience relating to this harm on Treadl.
  • Impact on affected individuals
    • We assess that the likelihood of individual characteristics increasing the impact of this harm is low, due to the limited ways in which people can express their individuality on the service, and the limited means by which others can identify individuals (users cannot be searched for, either by their characteristics or otherwise).
  • The design and operation of Treadl
    • Treadl has no revenue model or growth strategy -- it is a free and open-source service offering. There is no financial incentive for Treadl to use algorithms that keep users "entertained" or amplify content subjectively, and all features are provided on a utilitarian basis.
    • Treadl's commercial profile means there is a potential capacity constraint on the human content reviews and moderation required. However, current evidence shows that the volume of moderation needed is low, with only a small handful of reviews per day (due to Treadl's small scale, and because most people use the tool for weaving-pattern editing). Even if this volume were to double or triple, it would still be manageable by one person -- particularly with the technical tools in place to help with this process. Further technical tools could be developed, and additional human processes introduced, to scale this capability when required.
    • We assess that there are no other service characteristics that would cause the impact of this harm to be increased.
    • Recommender systems used by Treadl supply data either by date or by pure randomness, and are weaving-project focused (for example, the chronological "Explore" timeline only includes weaving patterns; images, files, and comments are not included). Given that all other content is approved or moderated, the likelihood of this factor increasing the impact of the harm is low.
    • We assess that, based on the evidence, user characteristics data cannot be used to increase the impact of this harm to specific user sub-groups. As such, user demographics are not considered to be a significant contributor to any additional impact of this harm.
  • None of the collected evidence suggests there is any impact resulting from this harm on Treadl. There have been zero user complaints/reports of this content to date and there are no known cases of it. Even if such content were to appear, despite the approval and moderation processes, Treadl does not receive enough traffic to substantially increase the impact of this harm.

Existing controls that reduce the risk of this harm

The following technical and process controls reduce the likelihood and impact of this harm:

  • Content approval, which means that all project images, files, and comments must be approved before they are available to others.
  • Content moderation, which means that other content (e.g. user profiles and project names and descriptions) is moderated in a timely manner (according to our Online Safety Policy) after potentially becoming available to others. Note that even private projects are moderated.
  • Reporting capabilities, which allow any illegal content encountered before moderation takes place to be reported, so that its review can be prioritised and further dissemination of the illegal content reduced.

Controlling associated Risk Profile risk factors

Pertinent factors from the Risk Profile related to this harm are managed or reduced as follows:

  • File sharing services (users can upload files to projects, and projects can be public)
    • Control: all files uploaded to projects must be manually approved before they are available to other users.
  • User profiles (users have a presence on Treadl, with avatar image, bio, location, and info)
    • Control: all user profile changes are moderated by humans, as a technical control (for mitigating harms spread through this means).
    • Reduced: Treadl user profiles are intentionally minimal and do not typically reveal personal characteristics unless usernames are re-used or someone intentionally attempts to reveal more about themselves.
    • Control: all user profiles are subject to a clear reporting mechanism.

Additional characteristics we consider

We also consider the following characteristics in the assessment of the likelihood and impact of this type of harm:

  • Commercial profile: potentially increases risk of harm if moderation requirements exceed team capacity. However, there is no evidence of this capacity being reached, and this risk is mitigated by easy-to-use technical controls that are also available remotely.
  • Functionality: Treadl does not have the capability to support private or group-based messaging. Content is either limited to its author or it is public. This reduces the effectiveness of using Treadl to commit or facilitate offences related to this harm.
  • Target audience: Treadl is purely focused on the art and weaving community, with service features heavily specific to this use-case. We consider that this significantly reduces the likelihood and potential impact of this type of harm occurring, and in being used to facilitate or commit an offence.
  • User demographics: service features and capabilities (as described above) mean that we consider the risk of personal characteristics increasing the likelihood or impact of harm to be low.

Risk level

Based on the above evidence and reasoning, we assess that the likelihood and impact of sexual exploitation of adults content and/or offences on Treadl is LOW.

9. Human trafficking

Evidence collection

  • Risk profile and risk factors:
    • File sharing services
    • User profiles
    • Posting images or videos
    • Posting or sending location information
  • User complaints:
    • None to date
  • User data (age, background, etc.)
    • We do not collect enough user data to realistically determine Treadl's population type for personal characteristics.
    • Treadl does not support user discovery by personal characteristics. For example, users cannot be searched for, and user profiles are purposefully kept relatively vague (without requiring real names, gender, etc.).
    • Data indicates that around 5% of user accounts are associated with university email addresses. However, only 0.1% are from UK universities.
  • Retrospective analysis of incidents of harm
    • There are no known incidents of this harm on Treadl.
  • Evidence drawn from existing controls
    • Manual pre-display content-approval is in-play for all uploaded files, content comments, and posted pictures and videos.
    • Content moderation is in-play for all user profile updates.
    • Searching for user-generated content is limited to weaving project "names". Users cannot search by tag or other mechanism.
    • Content-"recommendation" is based on pure randomness, and only recommends weaving projects that have weaving patterns containing threading content. There are no algorithms or machine-learning capabilities that make this process user-oriented nor activity-oriented.
    • Content reporting/complaints is available and signposted from all user-generated content, including the above, user profiles, and projects.
  • Other relevant information
    • We assess that we do not need enhanced inputs to assess this risk and that there are no characteristics which increase this harm.
    • Projects on Treadl (which may contain images, files, and comments) are either public OR private. As such, their content can only be seen by the authors themselves or, for public projects, by the public (after approval and subject to moderation).
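
As referenced in the list above, the single most significant technical control is the pre-display approval gate applied to project files, comments, and images. The following minimal sketch illustrates the principle only; it is not Treadl's actual implementation, and the data structures, field names, and helper functions shown (for example submit_upload and visible_uploads) are hypothetical.

    # Illustrative sketch of a pre-display approval gate (hypothetical names; not Treadl's code).
    from datetime import datetime, timezone

    uploads = []           # every uploaded file/comment/image, pending or approved
    moderation_queue = []  # items awaiting manual review

    def submit_upload(author_id, project_id, payload):
        """Store a new upload as unapproved and queue it for manual review."""
        item = {
            "author": author_id,
            "project": project_id,
            "payload": payload,
            "approved": False,
            "submitted_at": datetime.now(timezone.utc),
        }
        uploads.append(item)
        moderation_queue.append(item)
        return item

    def approve(item):
        """Moderator action: mark an item as reviewed and safe to display."""
        item["approved"] = True
        if item in moderation_queue:
            moderation_queue.remove(item)

    def visible_uploads(viewer_id, project):
        """Authors always see their own items; others only see approved items on public projects."""
        return [
            u for u in uploads
            if u["project"] == project["id"]
            and (u["author"] == viewer_id or (project["public"] and u["approved"]))
        ]

The key point reflected here is that nothing uploaded to a project is visible to anyone other than its author until it has been manually approved, which is why pre-publish approval features so heavily in the likelihood assessments in this document.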

Assessing the likelihood of encountering this illegal content and chance of facilitating offences

  • At a generic service level (i.e. without accounting for Treadl's specific nature), the risk factors identified with this harm (see "Risk profile and factors" above) would indicate that this harm is somewhat likely on the service.
  • There are no additional characteristics of Treadl that make this harm more likely.
  • Evidence suggests that this harm is not likely to occur on Treadl, as there have never been any known incidents nor user complaints.
  • There are a number of measures that reduce the likelihood of this harm further, particularly pre-publish approval of all project files and comments, and post-publish moderation of user profile changes. As such, any incidents of this harm in these spaces would be removed/reported before they become available to other users, or -- in some cases -- very shortly afterwards.

Assessing the impact of this illegal content and chance of facilitating an offence

  • Nature and severity of the harm
    • There are no known incidents of users having harmful experiences relating to this harm on Treadl.
  • Impact on affected individuals
    • We assess that the likelihood of individual characteristics becoming a factor of increased impact of this harm is low due to the limited way for people to express their individuality using the service, and for others to identify individuals (since users cannot be searched for -- either by their characteristics or otherwise).
  • The design and operation of Treadl
    • Treadl has no revenue model or growth strategy -- it's a free and open source service offering. There is no financial incentive for Treadl to have algorithms to keep users "entertained" or to amplify content subjectively, and all features are provided on a utilitarian basis.
    • Treadl's commercial profile means there is a potential capacity issue in the required human content reviews and moderation. However, current evidence demonstrates that the volume of needed moderation is low, with only a small handful needed per day (due to Treadl's small nature, and most people using the tool for weaving pattern editing). If this were to increase 2x or 3x then this would still be manageable by one person -- particularly with the technical tools in place to help with this process. Technical tools could be developed, and additional human process can be introduced, to further scale this capability when required.
    • We assess that there are no other service characteristics that would cause the impact of this harm to be increased.
    • Recommender systems used by Treadl supply data either by date or by pure randomness -- and are weaving-project focused (for example, the chronological "Explore" timeline only includes weaving patterns; images, files, and comments are not included). Given that all other content is approved or moderated, the likelihood of this factor increasing the impact of the harm is low.
    • We assess that, based on the evidence, user characteristics data cannot be used to increase the impact of this harm to specific user sub-groups. As such, user demographics is not considered to be a significant contributor to any additional impact of this harm.
  • None of the collected evidence suggests there is any impact resulting from this harm on Treadl. There have been zero user complaints/reports of this content to-date and there are no known cases of it. If content were to appear, despite approval and moderation processes, Treadl does not receive enough traffic to substantially increase the impact of this harm.

Existing controls that reduce the risk of this harm

The following technical and process controls reduce the likelihood and impact of this harm:

  • Content approval, which means that all project images, files, and comments must be approved before they are available to others.
  • Content moderation, which means that other content (e.g. user profiles, project names, and descriptions) is moderated in a timely manner (according to our Online Safety Policy) after potentially becoming available to others. Note that even private projects are moderated.
  • Reporting capabilities, which allow any illegal content encountered before moderation takes place to be reported, so that its handling can be prioritised and further dissemination of the illegal content reduced.

Controlling associated Risk Profile risk factors

Pertinent factors from the Risk Profile related to this harm are managed or reduced as follows:

  • File sharing services (users can upload files to projects, and projects can be public)
    • Control: all files uploaded to projects must be manually approved before they are available to other users.
  • User profiles (users have a presence on Treadl, with avatar image, bio, location, and info)
    • Control: all user profile changes are moderated as part of the technical control (for mitigating harms spread through this means).
    • Reduced: Treadl user profiles are intentionally minimal and do not typically reveal personal characteristics unless usernames are re-used or someone intentionally attempts to reveal more about themselves.
    • Control: all user profiles are subject to a clear reporting mechanism.
  • Posting images or videos
    • Control: all project images and videos must be manually approved before they are accessible to other users.
    • Reduced: user profile pictures are all optional and are all public.
    • Control: user profile pictures undergo manual moderation after they are updated.
  • Posting or sending location information
    • Control: all user profile updates (the only place where user location is explicitly attributed) undergo moderation.
    • Reduced: only 2% of Treadl users use the location field on user profiles. It is free text, allowing people to be as vague as they like (e.g. country level). Location data is not "posted" or "sent" out aside from this -- it can only be viewed (if set) if someone views someone else's profile.

Additional characteristics we consider

We also consider the following characteristics in the assessment of the likelihood and impact of this type of harm:

  • Commercial profile: potentially increases the risk of harm if moderation requirements exceed team capacity. However, there is no evidence of this capacity being reached, and this risk is mitigated using easy-to-use technical controls that are also available remotely.
  • Functionality: Treadl does not have the capability to support private or group-based messaging. Content is either limited to its author or it is public. This reduces the effectiveness of using Treadl to commit or facilitate offences related to this harm.
  • Target audience: Treadl is purely focused on the art and weaving community, with service features heavily specific to this use-case. We consider that this significantly reduces the likelihood and potential impact of this type of harm occurring, and in being used to facilitate or commit an offence.
  • User demographics: service features and capabilities (as described above) mean that we consider the risk of personal characteristics increasing the likelihood or impact of harm to be low.

Risk level

Based on the above evidence and reasoning, we assess that the likelihood and impact of human trafficking content and/or offences on Treadl is LOW.

10. Unlawful immigration

Evidence collection

  • Risk profile and risk factors:
    • File sharing services
    • User profiles
    • Posting images or videos
  • User complaints:
    • None to date
  • User data (age, background, etc.)
    • We do not collect enough user data to realistically determine Treadl's population type for personal characteristics.
    • Treadl does not support user discovery by user characteristics. For example, it is not possible to search for users, and user profiles are purposefully kept relatively vague (real names, gender, etc. are not required).
    • Data indicates that around 5% of user accounts are associated with university email addresses. However, only 0.1% are from UK universities.
  • Retrospective analysis of incidents of harm
    • There are no known incidents of this harm on Treadl.
  • Evidence drawn from existing controls
    • Manual pre-display content approval is in place for all uploaded files, content comments, and posted pictures and videos.
    • Content moderation is in place for all user profile updates.
    • Searching for user-generated content is limited to weaving project "names". Users cannot search by tag or any other mechanism (see the sketch after this list).
    • Content "recommendation" is based on pure randomness, and only recommends weaving projects that have weaving patterns containing threading content. There are no algorithms or machine-learning capabilities that make this process user-oriented or activity-oriented.
    • Content reporting/complaints is available and signposted from all user-generated content, including the above, user profiles, and projects.
  • Other relevant information
    • We assess that we do not need enhanced inputs to assess this risk and that there are no characteristics which increase this harm.
    • Projects on Treadl (which may contain images, files, and comments) are either public OR private. As such, their content can only be seen by the authors themselves or, for public projects, by the public (after approval and subject to moderation).
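
The restriction of content search to project names (noted in the list above) can be illustrated with a short sketch. This is not Treadl's implementation; the project data model and the search_projects function are hypothetical, and simply show that only the name field of public projects is ever matched.

    # Illustrative sketch of name-only project search (hypothetical data model; not Treadl's code).

    def search_projects(projects, query):
        """Match the query against public project names only.
        Descriptions, tags, comments, files, and user profiles are never searched."""
        q = query.strip().lower()
        return [
            p for p in projects
            if p.get("public") and q in p.get("name", "").lower()
        ]

    # Example usage with hypothetical data:
    projects = [
        {"name": "Overshot sampler", "public": True, "tags": ["overshot"]},
        {"name": "Private draft", "public": False},
    ]
    print(search_projects(projects, "overshot"))  # matches on the name field only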

Assessing the likelihood of encountering this illegal content and chance of facilitating offences

  • At a generic service level (i.e. without accounting for Treadl's specific nature), the risk factors identified with this harm (see "Risk profile and factors" above) would indicate that this harm is less likely on the service.
  • There are no additional characteristics of Treadl that make this harm more likely.
  • Evidence suggests that this harm is not likely to occur on Treadl, as there have never been any known incidents nor user complaints.
  • There are a number of measures that reduce the likelihood of this harm further, particularly pre-publish approval of all project files and comments, and post-publish moderation of user profile changes. As such, any incidents of this harm in these spaces would be removed/reported before they become available to other users, or -- in some cases -- very shortly afterwards.

Assessing the impact of this illegal content and chance of facilitating an offence

  • Nature and severity of the harm
    • There are no known incidents of users having harmful experiences relating to this harm on Treadl.
  • Impact on affected individuals
    • We assess that the likelihood of individual characteristics becoming a factor of increased impact of this harm is low due to the limited way for people to express their individuality using the service, and for others to identify individuals (since users cannot be searched for -- either by their characteristics or otherwise).
  • The design and operation of Treadl
    • Treadl has no revenue model or growth strategy -- it's a free and open source service offering. There is no financial incentive for Treadl to have algorithms to keep users "entertained" or to amplify content subjectively, and all features are provided on a utilitarian basis.
    • Treadl's commercial profile means there is a potential capacity issue in the required human content reviews and moderation. However, current evidence demonstrates that the volume of needed moderation is low, with only a small handful needed per day (due to Treadl's small nature, and most people using the tool for weaving pattern editing). If this were to increase 2x or 3x then this would still be manageable by one person -- particularly with the technical tools in place to help with this process. Technical tools could be developed, and additional human process can be introduced, to further scale this capability when required.
    • We assess that there are no other service characteristics that would cause the impact of this harm to be increased.
    • Recommender systems used by Treadl supply data either by date or by pure randomness -- and are weaving-project focused (for example, the chronological "Explore" timeline only includes weaving patterns; images, files, and comments are not included). Given that all other content is approved or moderated, the likelihood of this factor increasing the impact of the harm is low.
    • We assess that, based on the evidence, user characteristics data cannot be used to increase the impact of this harm to specific user sub-groups. As such, user demographics is not considered to be a significant contributor to any additional impact of this harm.
  • None of the collected evidence suggests there is any impact resulting from this harm on Treadl. There have been zero user complaints/reports of this content to-date and there are no known cases of it. If content were to appear, despite approval and moderation processes, Treadl does not receive enough traffic to substantially increase the impact of this harm.

Existing controls that reduce the risk of this harm

The following technical and process controls reduce the likelihood and impact of this harm:

  • Content approval, which means that all project images, files, and comments must be approved before they are available to others.
  • Content moderation, which means that other content (e.g. user profiles, project names, and descriptions) is moderated in a timely manner (according to our Online Safety Policy) after potentially becoming available to others. Note that even private projects are moderated.
  • Reporting capabilities, which allow any illegal content encountered before moderation takes place to be reported, so that its handling can be prioritised and further dissemination of the illegal content reduced.

Controlling associated Risk Profile risk factors

Pertinent factors from the Risk Profile related to this harm are managed or reduced as follows:

  • File sharing services (users can upload files to projects, and projects can be public)
    • Control: all files uploaded to projects must be manually approved before they are available to other users.
  • User profiles (users have a presence on Treadl, with avatar image, bio, location, and info)
    • Control: all user profile changes are moderated as part of the technical control (for mitigating harms spread through this means).
    • Reduced: Treadl user profiles are intentionally minimal and do not typically reveal personal characteristics unless usernames are re-used or someone intentionally attempts to reveal more about themselves.
    • Control: all user profiles are subject to a clear reporting mechanism.
  • Posting images or videos
    • Control: all project images and videos must be manually approved before they are accessible to other users.
    • Reduced: user profile pictures are all optional and are all public.
    • Control: user profile pictures undergo manual moderation after they are updated.

Additional characteristics we consider

We also consider the following characteristics in the assessment of the likelihood and impact of this type of harm:

  • Commercial profile: potentially increases the risk of harm if moderation requirements exceed team capacity. However, there is no evidence of this capacity being reached, and this risk is mitigated using easy-to-use technical controls that are also available remotely.
  • Functionality: Treadl does not have the capability to support private or group-based messaging. Content is either limited to its author or it is public. This reduces the effectiveness of using Treadl to commit or facilitate offences related to this harm.
  • Target audience: Treadl is purely focused on the art and weaving community, with service features heavily specific to this use-case. We consider that this significantly reduces the likelihood and potential impact of this type of harm occurring, and in being used to facilitate or commit an offence.
  • User demographics: service features and capabilities (as described above) mean that we consider the risk of personal characteristics increasing the likelihood or impact of harm to be low.

Risk level

Based on the above evidence and reasoning, we assess that the likelihood and impact of unlawful immigration content and/or offences on Treadl is LOW.

11. Fraud and financial offences

Evidence collection

  • Risk profile and risk factors:
    • File sharing services
    • User profiles
    • Fake user profiles
    • Commenting on content
    • User-generated content searching
  • User complaints:
    • None to date
  • User data (age, background, etc.)
    • We do not collect enough user data to realistically determine Treadl's population type for personal characteristics.
    • Treadl does not support user discovery by user characteristics. For example, it is not possible to search for users, and user profiles are purposefully kept relatively vague (real names, gender, etc. are not required).
    • Data indicates that around 5% of user accounts are associated with university email addresses. However, only 0.1% are from UK universities.
  • Retrospective analysis of incidents of harm
    • There are no known incidents of this harm on Treadl.
  • Evidence drawn from existing controls
    • Manual pre-display content approval is in place for all uploaded files, content comments, and posted pictures and videos.
    • Content moderation is in place for all user profile updates.
    • Searching for user-generated content is limited to weaving project "names". Users cannot search by tag or any other mechanism.
    • Content "recommendation" is based on pure randomness, and only recommends weaving projects that have weaving patterns containing threading content. There are no algorithms or machine-learning capabilities that make this process user-oriented or activity-oriented (a sketch of this randomised selection is included after this list).
    • Content reporting/complaints is available and signposted from all user-generated content, including the above, user profiles, and projects.
  • Other relevant information
    • We assess that we do not need enhanced inputs to assess this risk and that there are no characteristics which increase this harm.
    • Projects on Treadl (which may contain images, files, and comments) are either public OR private. As such, their content can only be seen by the authors themselves or, for public projects, by the public (after approval and subject to moderation).
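
The randomised, non-personalised nature of content "recommendation" (noted in the list above) can be illustrated with a minimal sketch. This is not Treadl's implementation; the field names and the recommend_projects function are hypothetical, and only demonstrate that no user or activity data feeds the selection.

    # Illustrative sketch of a non-personalised, random recommender (hypothetical names; not Treadl's code).
    import random

    def recommend_projects(projects, count=5):
        """Return a random selection of public projects that contain a weaving
        pattern with threading content. No user or activity data is consulted."""
        eligible = [
            p for p in projects
            if p.get("public")
            and any(pattern.get("threading") for pattern in p.get("patterns", []))
        ]
        return random.sample(eligible, k=min(count, len(eligible)))

Because the selection is random over already-public projects (whose files and comments are subject to approval), there is no mechanism by which engagement signals or user characteristics could amplify particular content.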

Assessing the likelihood of encountering this illegal content and chance of facilitating offences

  • At a generic service level (i.e. without accounting for Treadl's specific nature), the risk factors identified with this harm (see "Risk profile and factors" above) would indicate that this harm is less likely on the service.
  • There are no additional characteristics of Treadl that make this harm more likely.
  • Evidence suggests that this harm is not likely to occur on Treadl, as there have never been any known incidents nor user complaints.
  • There are a number of measures that reduce the likelihood of this harm further, particularly pre-publish approval of all project files and comments, and post-publish moderation of user profile changes. As such, any incidents of this harm in these spaces would be removed/reported before they become available to other users, or -- in some cases -- very shortly afterwards.

Assessing the impact of this illegal content and chance of facilitating an offence

  • Nature and severity of the harm
    • There are no known incidents of users having harmful experiences relating to this harm on Treadl.
  • Impact on affected individuals
    • We assess that the likelihood of individual characteristics becoming a factor of increased impact of this harm is low due to the limited way for people to express their individuality using the service, and for others to identify individuals (since users cannot be searched for -- either by their characteristics or otherwise).
  • The design and operation of Treadl
    • Treadl has no revenue model or growth strategy -- it's a free and open source service offering. There is no financial incentive for Treadl to have algorithms to keep users "entertained" or to amplify content subjectively, and all features are provided on a utilitarian basis.
    • Treadl's commercial profile means there is a potential capacity issue in the required human content reviews and moderation. However, current evidence demonstrates that the volume of needed moderation is low, with only a small handful needed per day (due to Treadl's small nature, and most people using the tool for weaving pattern editing). If this were to increase 2x or 3x then this would still be manageable by one person -- particularly with the technical tools in place to help with this process. Technical tools could be developed, and additional human process can be introduced, to further scale this capability when required.
    • We assess that there are no other service characteristics that would cause the impact of this harm to be increased.
    • Recommender systems used by Treadl supply data either by date or by pure randomness -- and are weaving-project focused (for example, the chronological "Explore" timeline only includes weaving patterns; images, files, and comments are not included). Given that all other content is approved or moderated, the likelihood of this factor increasing the impact of the harm is low.
    • We assess that, based on the evidence, user characteristics data cannot be used to increase the impact of this harm to specific user sub-groups. As such, user demographics is not considered to be a significant contributor to any additional impact of this harm.
  • None of the collected evidence suggests there is any impact resulting from this harm on Treadl. There have been zero user complaints/reports of this content to-date and there are no known cases of it. If content were to appear, despite approval and moderation processes, Treadl does not receive enough traffic to substantially increase the impact of this harm.

Existing controls that reduce the risk of this harm

The following technical and process controls reduce the likelihood and impact of this harm:

  • Content approval, which means that all project images, files, and comments must be approved before they are available to others.
  • Content moderation, which means that other content (e.g. user profiles, project names, and descriptions) is moderated in a timely manner (according to our Online Safety Policy) after potentially becoming available to others. Note that even private projects are moderated.
  • Reporting capabilities, which allow any illegal content encountered before moderation takes place to be reported, so that its handling can be prioritised and further dissemination of the illegal content reduced.

Controlling associated Risk Profile risk factors

Pertinent factors from the Risk Profile related to this harm are managed or reduced as follows:

  • File sharing services (users can upload files to projects, and projects can be public)
    • Control: all files uploaded to projects must be manually approved before they are available to other users.
  • User profiles (users have a presence on Treadl, with avatar image, bio, location, and info)
    • Control: all user profile changes are moderated as part of the technical control (for mitigating harms spread through this means).
    • Reduced: Treadl user profiles are intentionally minimal and do not typically reveal personal characteristics unless usernames are re-used or someone intentionally attempts to reveal more about themselves.
    • Control: all user profiles are subject to a clear reporting mechanism.
  • Fake user profiles
    • Reduced: Treadl user profiles are intentionally minimal and do not typically reveal personal characteristics.
    • Control: all user profile changes are moderated. However, we acknowledge that moderators may not know if a fake user profile is impersonating someone else.
    • Reduced: given that all people can do on Treadl is post their weaving projects, we assess that the impact of this illegal content and any resultant harm arising from fake profiles is minimised.
  • Commenting on content
    • Reduced: all comments on public projects are also public.
    • Control: all comments must be manually approved before they are viewable by other users.
  • User-generated content searching
    • Reduced: users can only search for other public projects (by project name) or for their own content.
    • Control: all images, videos and files discoverable by this means would have had to go through a manual approval process first.

Additional characteristics we consider

We also consider the following characteristics in the assessment of the likelihood and impact of this type of harm:

  • Commercial profile: potentially increases the risk of harm if moderation requirements exceed team capacity. However, there is no evidence of this capacity being reached, and this risk is mitigated using easy-to-use technical controls that are also available remotely.
  • Functionality: Treadl does not have the capability to support private or group-based messaging. Content is either limited to its author or it is public. This reduces the effectiveness of using Treadl to commit or facilitate offences related to this harm.
  • Target audience: Treadl is purely focused on the art and weaving community, with service features heavily specific to this use-case. We consider that this significantly reduces the likelihood and potential impact of this type of harm occurring, and in being used to facilitate or commit an offence.
  • User demographics: service features and capabilities (as described above) mean that we consider the risk of personal characteristics increasing the likelihood or impact of harm to be low.

Risk level

Based on the above evidence and reasoning, we assess that the likelihood and impact of fraud and financial offences content on Treadl is LOW.

12. Proceeds of crime

Evidence collection

  • Risk profile and risk factors:
    • File sharing services
    • User profiles
    • Fake user profiles
    • User-generated content searching
  • User complaints:
    • None to date
  • User data (age, background, etc.)
    • We do not collect enough user data to realistically determine Treadl's population type for personal characteristics.
    • Treadl does not support user discovery by user characteristics. For example, it is not possible to search for users, and user profiles are purposefully kept relatively vague (real names, gender, etc. are not required).
    • Data indicates that around 5% of user accounts are associated with university email addresses. However, only 0.1% are from UK universities.
  • Retrospective analysis of incidents of harm
    • There are no known incidents of this harm on Treadl.
  • Evidence drawn from existing controls
    • Manual pre-display content approval is in place for all uploaded files, content comments, and posted pictures and videos.
    • Content moderation is in place for all user profile updates.
    • Searching for user-generated content is limited to weaving project "names". Users cannot search by tag or any other mechanism.
    • Content "recommendation" is based on pure randomness, and only recommends weaving projects that have weaving patterns containing threading content. There are no algorithms or machine-learning capabilities that make this process user-oriented or activity-oriented.
    • Content reporting/complaints is available and signposted from all user-generated content, including the above, user profiles, and projects (a sketch of this reporting flow is included after this list).
  • Other relevant information
    • We assess that we do not need enhanced inputs to assess this risk and that there are no characteristics which increase this harm.
    • Projects on Treadl (which may contain images, files, and comments) are either public OR private. As such, their content can only be seen by the authors themselves or, for public projects, by the public (after approval and subject to moderation).
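
The reporting/complaints control (noted in the list above) can be illustrated with a minimal sketch. This is not Treadl's implementation; the report_content and moderation_priority_queue functions, and the assumption that content is referenced by a string identifier, are hypothetical. The sketch only shows the principle that reported items are reviewed ahead of routine moderation work.

    # Illustrative sketch of content reporting with prioritised review (hypothetical names; not Treadl's code).
    from datetime import datetime, timezone

    reports = []

    def report_content(reporter_id, content_ref, reason):
        """Record a report against a piece of user-generated content (referenced by a string ID)."""
        report = {
            "reporter": reporter_id,
            "content": content_ref,
            "reason": reason,
            "received_at": datetime.now(timezone.utc),
            "status": "open",
        }
        reports.append(report)
        return report

    def moderation_priority_queue(pending_items):
        """Order pending moderation items so that reported content is reviewed first."""
        reported_refs = {r["content"] for r in reports if r["status"] == "open"}
        return sorted(pending_items, key=lambda item: item["ref"] not in reported_refs)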

Assessing the likelihood of encountering this illegal content and chance of facilitating offences

  • At a generic service level (i.e. without accounting for Treadl's specific nature), the risk factors identified with this harm (see "Risk profile and factors" above) would indicate that this harm is less likely on the service.
  • There are no additional characteristics of Treadl that make this harm more likely.
  • Evidence suggests that this harm is not likely to occur on Treadl, as there have never been any known incidents nor user complaints.
  • There are a number of measures that reduce the likelihood of this harm further, particularly pre-publish approval of all project files and comments, and post-publish moderation of user profile changes. As such, any incidents of this harm in these spaces would be removed/reported before they become available to other users, or -- in some cases -- very shortly afterwards.

Assessing the impact of this illegal content and chance of facilitating an offence

  • Nature and severity of the harm
    • There are no known incidents of users having harmful experiences relating to this harm on Treadl.
  • Impact on affected individuals
    • We assess that the likelihood of individual characteristics becoming a factor of increased impact of this harm is low due to the limited way for people to express their individuality using the service, and for others to identify individuals (since users cannot be searched for -- either by their characteristics or otherwise).
  • The design and operation of Treadl
    • Treadl has no revenue model or growth strategy -- it's a free and open source service offering. There is no financial incentive for Treadl to have algorithms to keep users "entertained" or to amplify content subjectively, and all features are provided on a utilitarian basis.
    • Treadl's commercial profile means there is a potential capacity issue in the required human content reviews and moderation. However, current evidence demonstrates that the volume of needed moderation is low, with only a small handful needed per day (due to Treadl's small nature, and most people using the tool for weaving pattern editing). If this were to increase 2x or 3x then this would still be manageable by one person -- particularly with the technical tools in place to help with this process. Technical tools could be developed, and additional human process can be introduced, to further scale this capability when required.
    • We assess that there are no other service characteristics that would cause the impact of this harm to be increased.
    • Recommender systems used by Treadl supply data either by date or by pure randomness -- and are weaving-project focused (for example, the chronological "Explore" timeline only includes weaving patterns; images, files, and comments are not included). Given that all other content is approved or moderated, the likelihood of this factor increasing the impact of the harm is low.
    • We assess that, based on the evidence, user characteristics data cannot be used to increase the impact of this harm to specific user sub-groups. As such, user demographics is not considered to be a significant contributor to any additional impact of this harm.
  • None of the collected evidence suggests there is any impact resulting from this harm on Treadl. There have been zero user complaints/reports of this content to-date and there are no known cases of it. If content were to appear, despite approval and moderation processes, Treadl does not receive enough traffic to substantially increase the impact of this harm.

Existing controls that reduce the risk of this harm

The following technical and process controls reduce the likelihood and impact of this harm:

  • Content approval, which means that all project images, files, and comments must be approved before they are available to others.
  • Content moderation, which means that other content (e.g. user profiles, project names, and descriptions) is moderated in a timely manner (according to our Online Safety Policy) after potentially becoming available to others. Note that even private projects are moderated.
  • Reporting capabilities, which allow any illegal content encountered before moderation takes place to be reported, so that its handling can be prioritised and further dissemination of the illegal content reduced.

Controlling associated Risk Profile risk factors

Pertinent factors from the Risk Profile related to this harm are managed or reduced as follows:

  • File sharing services (users can upload files to projects, and projects can be public)
    • Control: all files uploaded to projects must be manually approved before they are available to other users.
  • User profiles (users have a presence on Treadl, with avatar image, bio, location, and info)
    • Control: all user profile changes are moderated as part of the technical control (for mitigating harms spread through this means).
    • Reduced: Treadl user profiles are intentionally minimal and do not typically reveal personal characteristics unless usernames are re-used or someone intentionally attempts to reveal more about themselves.
    • Control: all user profiles are subject to a clear reporting mechanism.
  • Fake user profiles
    • Reduced: Treadl user profiles are intentionally minimal and do not typically reveal personal characteristics.
    • Control: all user profile changes are moderated. However, we acknowledge that moderators may not know if a fake user profile is impersonating someone else.
    • Reduced: given that all people can do on Treadl is post their weaving projects, we assess that the impact of this illegal content and any resultant harm arising from fake profiles is minimised.
  • User-generated content searching
    • Reduced: users can only search for other public projects (by project name) or for their own content.
    • Control: all images, videos and files discoverable by this means would have had to go through a manual approval process first.

Additional characteristics we consider

We also consider the following characteristics in the assessment of the likelihood and impact of this type of harm:

  • Commercial profile: potentially increases the risk of harm if moderation requirements exceed team capacity. However, there is no evidence of this capacity being reached, and this risk is mitigated using easy-to-use technical controls that are also available remotely.
  • Functionality: Treadl does not have the capability to support private or group-based messaging. Content is either limited to its author or it is public. This reduces the effectiveness of using Treadl to commit or facilitate offences related to this harm.
  • Target audience: Treadl is purely focused on the art and weaving community, with service features heavily specific to this use-case. We consider that this significantly reduces the likelihood and potential impact of this type of harm occurring, and in being used to facilitate or commit an offence.
  • User demographics: service features and capabilities (as described above) mean that we consider the risk of personal characteristics increasing the likelihood or impact of harm to be low.

Risk level

Based on the above evidence and reasoning, we assess that the likelihood and impact of proceeds of crime content and offences on Treadl is LOW.

13. Drugs and psychoactive substances

Evidence collection

  • Risk profile and risk factors:
    • File sharing services
    • User profiles
    • Posting images or videos
    • User-generated content searching
  • User complaints:
    • None to date
  • User data (age, background, etc.)
    • We do not collect enough user data to realistically determine Treadl's population type for personal characteristics.
    • Treadl does not support user discovery by user characteristics. For example, it is not possible to search for users, and user profiles are purposefully kept relatively vague (real names, gender, etc. are not required).
    • Data indicates that around 5% of user accounts are associated with university email addresses. However, only 0.1% are from UK universities.
  • Retrospective analysis of incidents of harm
    • There are no known incidents of this harm on Treadl.
  • Evidence drawn from existing controls
    • Manual pre-display content approval is in place for all uploaded files, content comments, and posted pictures and videos.
    • Content moderation is in place for all user profile updates (a sketch of this post-publish review is included after this list).
    • Searching for user-generated content is limited to weaving project "names". Users cannot search by tag or any other mechanism.
    • Content "recommendation" is based on pure randomness, and only recommends weaving projects that have weaving patterns containing threading content. There are no algorithms or machine-learning capabilities that make this process user-oriented or activity-oriented.
    • Content reporting/complaints is available and signposted from all user-generated content, including the above, user profiles, and projects.
  • Other relevant information
    • We assess that we do not need enhanced inputs to assess this risk and that there are no characteristics which increase this harm.
    • Projects on Treadl (which may contain images, files, and comments) are either public OR private. As such, their content can only be seen by the authors themselves or, for public projects, by the public (after approval and subject to moderation).
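
The post-publish moderation of user profile updates (noted in the list above) can be illustrated with a minimal sketch. This is not Treadl's implementation; the update_profile and review_next functions and the queue structure are hypothetical, and only show that profile changes take effect immediately but are then placed in a queue for timely human review.

    # Illustrative sketch of post-publish moderation for profile updates (hypothetical names; not Treadl's code).
    from datetime import datetime, timezone

    profile_review_queue = []

    def update_profile(user, changes):
        """Apply the change immediately, then queue it for human review."""
        user.update(changes)
        profile_review_queue.append({
            "user": user["username"],
            "changes": changes,
            "changed_at": datetime.now(timezone.utc),
            "reviewed": False,
        })

    def review_next():
        """Moderator action: take the oldest unreviewed change; revert or escalate it if needed."""
        for entry in profile_review_queue:
            if not entry["reviewed"]:
                entry["reviewed"] = True
                return entry
        return None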

Assessing the likelihood of encountering this illegal content and chance of facilitating offences

  • At a generic service level (i.e. without accounting for Treadl's specific nature), the risk factors identified with this harm (see "Risk profile and factors" above) would indicate that this harm is less likely on the service.
  • There are no additional characteristics of Treadl that make this harm more likely.
  • Evidence suggests that this harm is not likely to occur on Treadl, as there have never been any known incidents nor user complaints.
  • There are a number of measures that reduce the likelihood of this harm further, particularly pre-publish approval of all project files and comments, and post-publish moderation of user profile changes. As such, any incidents of this harm in these spaces would be removed/reported before they become available to other users, or -- in some cases -- very shortly afterwards.

Assessing the impact of this illegal content and chance of facilitating an offence

  • Nature and severity of the harm
    • There are no known incidents of users having harmful experiences relating to this harm on Treadl.
  • Impact on affected individuals
    • We assess that the likelihood of individual characteristics becoming a factor of increased impact of this harm is low due to the limited way for people to express their individuality using the service, and for others to identify individuals (since users cannot be searched for -- either by their characteristics or otherwise).
  • The design and operation of Treadl
    • Treadl has no revenue model or growth strategy -- it's a free and open source service offering. There is no financial incentive for Treadl to have algorithms to keep users "entertained" or to amplify content subjectively, and all features are provided on a utilitarian basis.
    • Treadl's commercial profile means there is a potential capacity issue in the required human content reviews and moderation. However, current evidence demonstrates that the volume of needed moderation is low, with only a small handful needed per day (due to Treadl's small nature, and most people using the tool for weaving pattern editing). If this were to increase 2x or 3x then this would still be manageable by one person -- particularly with the technical tools in place to help with this process. Technical tools could be developed, and additional human process can be introduced, to further scale this capability when required.
    • We assess that there are no other service characteristics that would cause the impact of this harm to be increased.
    • Recommender systems used by Treadl supply data either by date or by pure randomness -- and are weaving-project focused (for example, the chronological "Explore" timeline only includes weaving patterns; images, files, and comments are not included). Given that all other content is approved or moderated, the likelihood of this factor increasing the impact of the harm is low.
    • We assess that, based on the evidence, user characteristics data cannot be used to increase the impact of this harm to specific user sub-groups. As such, user demographics is not considered to be a significant contributor to any additional impact of this harm.
  • None of the collected evidence suggests there is any impact resulting from this harm on Treadl. There have been zero user complaints/reports of this content to-date and there are no known cases of it. If content were to appear, despite approval and moderation processes, Treadl does not receive enough traffic to substantially increase the impact of this harm.

Existing controls that reduce the risk of this harm

The following technical and process controls reduce the likelihood and impact of this harm:

  • Content approval, which means that all project images, files, and comments must be approved before they are available to others.
  • Content moderation, which means that other content (e.g. user profiles, project names, and descriptions) is moderated in a timely manner (according to our Online Safety Policy) after potentially becoming available to others. Note that even private projects are moderated.
  • Reporting capabilities, which allow any illegal content encountered before moderation takes place to be reported, so that its handling can be prioritised and further dissemination of the illegal content reduced.

Controlling associated Risk Profile risk factors

Pertinent factors from the Risk Profile related to this harm are managed or reduced as follows:

  • File sharing services (users can upload files to projects, and projects can be public)
    • Control: all files uploaded to projects must be manually approved before they are available to other users.
  • User profiles (users have a presence on Treadl, with avatar image, bio, location, and info)
    • Control: all user profile changes are moderated as part of the technical control (for mitigating harms spread through this means).
    • Reduced: Treadl user profiles are intentionally minimal and do not typically reveal personal characteristics unless usernames are re-used or someone intentionally attempts to reveal more about themselves.
    • Control: all user profiles are subject to a clear reporting mechanism.
  • Posting images or videos
    • Control: all project images and videos must be manually approved before they are accessible to other users.
    • Reduced: user profile pictures are all optional and are all public.
    • Control: user profile pictures undergo manual moderation after they are updated.
  • User-generated content searching
    • Reduced: users can only search for other public projects (by project name) or for their own content.
    • Control: all images, videos and files discoverable by this means would have had to go through a manual approval process first.

Additional characteristics we consider

We also consider the following characteristics in the assessment of the likelihood and impact of this type of harm:

  • Commercial profile: potentially increases the risk of harm if moderation requirements exceed team capacity. However, there is no evidence of this capacity being reached, and this risk is mitigated using easy-to-use technical controls that are also available remotely.
  • Functionality: Treadl does not have the capability to support private or group-based messaging. Content is either limited to its author or it is public. This reduces the effectiveness of using Treadl to commit or facilitate offences related to this harm.
  • Target audience: Treadl is purely focused on the art and weaving community, with service features heavily specific to this use-case. We consider that this significantly reduces the likelihood and potential impact of this type of harm occurring, and in being used to facilitate or commit an offence.
  • User demographics: service features and capabilities (as described above) mean that we consider the risk of personal characteristics increasing the likelihood or impact of harm to be low.

Risk level

Based on the above evidence and reasoning, we assess that the likelihood and impact of drugs and psychoactive substances content and offences on Treadl is LOW.

14. Firearms, knives and other weapons

Evidence collection

  • Risk profile and risk factors:
    • File sharing services
    • User-generated content searching
  • User complaints:
    • None to date
  • User data (age, background, etc.)
    • We do not collect enough user data to realistically determine Treadl's population type for personal characteristics.
    • Treadl does not support user discovery by user characteristics. For example, it is not possible to search for users, and user profiles are purposefully kept relatively vague (real names, gender, etc. are not required).
    • Data indicates that around 5% of user accounts are associated with university email addresses. However, only 0.1% are from UK universities.
  • Retrospective analysis of incidents of harm
    • There are no known incidents of this harm on Treadl.
  • Evidence drawn from existing controls
    • Manual pre-display content approval is in place for all uploaded files, content comments, and posted pictures and videos.
    • Content moderation is in place for all user profile updates.
    • Searching for user-generated content is limited to weaving project "names". Users cannot search by tag or any other mechanism.
    • Content "recommendation" is based on pure randomness, and only recommends weaving projects that have weaving patterns containing threading content. There are no algorithms or machine-learning capabilities that make this process user-oriented or activity-oriented.
    • Content reporting/complaints is available and signposted from all user-generated content, including the above, user profiles, and projects.
  • Other relevant information
    • We assess that we do not need enhanced inputs to assess this risk and that there are no characteristics which increase this harm.
    • Projects on Treadl (which may contain images, files, and comments) are either public OR private. As such, their content can only be seen by the authors themselves or, for public projects, by the public (after approval and subject to moderation).

Assessing the likelihood of encountering this illegal content and chance of facilitating offences

  • At a generic service level (i.e. without accounting for Treadl's specific nature), the risk factors identified with this harm (see "Risk profile and factors" above) would indicate that this harm is less likely on the service.
  • There are no additional characteristics of Treadl that make this harm more likely.
  • Evidence suggests that this harm is not likely to occur on Treadl, as there have never been any known incidents nor user complaints.
  • There are a number of measures that reduce the likelihood of this harm further, particularly pre-publish approval of all project files and comments, and post-publish moderation of user profile changes. As such, any incidents of this harm in these spaces would be removed/reported before they become available to other users, or -- in some cases -- very shortly afterwards.

Assessing the impact of this illegal content and chance of facilitating an offence

  • Nature and severity of the harm
    • There are no known incidents of users having harmful experiences relating to this harm on Treadl.
  • Impact on affected individuals
    • We assess that the likelihood of individual characteristics becoming a factor of increased impact of this harm is low due to the limited way for people to express their individuality using the service, and for others to identify individuals (since users cannot be searched for -- either by their characteristics or otherwise).
  • The design and operation of Treadl
    • Treadl has no revenue model or growth strategy -- it's a free and open source service offering. There is no financial incentive for Treadl to have algorithms to keep users "entertained" or to amplify content subjectively, and all features are provided on a utilitarian basis.
    • Treadl's commercial profile means there is a potential capacity issue in the required human content reviews and moderation. However, current evidence demonstrates that the volume of needed moderation is low, with only a small handful needed per day (due to Treadl's small nature, and most people using the tool for weaving pattern editing). If this were to increase 2x or 3x then this would still be manageable by one person -- particularly with the technical tools in place to help with this process. Technical tools could be developed, and additional human process can be introduced, to further scale this capability when required.
    • We assess that there are no other service characteristics that would cause the impact of this harm to be increased.
    • Recommender systems used by Treadl supply data either by date or by pure randomness -- and are weaving-project focused (for example, the chronological "Explore" timeline only includes weaving patterns; images, files, and comments are not included). Given that all other content is approved or moderated, the likelihood of this factor increasing the impact of the harm is low.
    • We assess that, based on the evidence, user characteristics data cannot be used to increase the impact of this harm to specific user sub-groups. As such, user demographics is not considered to be a significant contributor to any additional impact of this harm.
  • None of the collected evidence suggests there is any impact resulting from this harm on Treadl. There have been zero user complaints/reports of this content to-date and there are no known cases of it. If content were to appear, despite approval and moderation processes, Treadl does not receive enough traffic to substantially increase the impact of this harm.

Existing controls that reduce the risk of this harm

The following technical and process controls reduce the likelihood and impact of this harm:

  • Content approval, which means that all project images, files, and comments must be approved before they are available to others.
  • Content moderation, which means that other content (e.g. user profiles, project names, and descriptions) is moderated in a timely manner (according to our Online Safety Policy) after potentially becoming available to others. Note that even private projects are moderated.
  • Reporting capabilities, which allow any illegal content encountered before moderation takes place to be reported, so that its handling can be prioritised and further dissemination of the illegal content reduced.

Controlling associated Risk Profile risk factors

Pertinent factors from the Risk Profile related to this harm are managed or reduced as follows:

  • File sharing services (users can upload files to projects, and projects can be public)
    • Control: all files uploaded to projects must be manually approved before they are available to other users.
  • User-generated content searching
    • Reduced: users can only search for other public projects (by project name) or for their own content.
    • Control: all images, videos and files discoverable by this means would have had to go through a manual approval process first.

Additional characteristics we consider

We also consider the following characteristics in the assessment of the likelihood and impact of this type of harm:

  • Commercial profile: potentially increases the risk of harm if moderation requirements exceed team capacity. However, there is no evidence of this capacity being reached, and this risk is mitigated using easy-to-use technical controls that are also available remotely.
  • Functionality: Treadl does not have the capability to support private or group-based messaging. Content is either limited to its author or it is public. This reduces the effectiveness of using Treadl to commit or facilitate offences related to this harm.
  • Target audience: Treadl is purely focused on the art and weaving community, with service features heavily specific to this use-case. We consider that this significantly reduces the likelihood and potential impact of this type of harm occurring, and in being used to facilitate or commit an offence.
  • User demographics: service features and capabilities (as described above) mean that we consider the risk of personal characteristics increasing the likelihood or impact of harm to be low.

Risk level

Based on the above evidence and reasoning, we assess that the likelihood and impact of firearms, knives and other weapon content and offences on Treadl is LOW.

15. Encouraging or assisting suicide

Evidence collection

  • Risk profile and risk factors:
    • File sharing services
    • Commenting on content
    • Posting images or videos
    • Content recommender systems
  • User complaints:
    • None to date
  • User data (age, background, etc.)
    • We do not collect enough user data to realistically determine Treadl's population type for personal characteristics.
    • Treadl does not support user discovery by user characteristics. For example, it is not possible to search for users, and user profiles are purposefully kept relatively vague (real names, gender, etc. are not required).
    • Data indicates that around 5% of user accounts are associated with university email addresses. However, only 0.1% are from UK universities.
  • Retrospective analysis of incidents of harm
    • There are no known incidents of this harm on Treadl.
  • Evidence drawn from existing controls
    • Manual pre-display content approval is in place for all uploaded files, content comments, and posted pictures and videos.
    • Content moderation is in place for all user profile updates.
    • Searching for user-generated content is limited to weaving project "names". Users cannot search by tag or any other mechanism.
    • Content "recommendation" is based on pure randomness, and only recommends weaving projects that have weaving patterns containing threading content. There are no algorithms or machine-learning capabilities that make this process user-oriented or activity-oriented.
    • Content reporting/complaints is available and signposted from all user-generated content, including the above, user profiles, and projects.
  • Other relevant information
    • We assess that we do not need enhanced inputs to assess this risk and that there are no characteristics which increase this harm.
    • Projects on Treadl (which may contain images, files, and comments) are either public OR private. As such, their content can only be seen by the authors themselves or, for public projects, by the public (after approval and subject to moderation).

Assessing the likelihood of encountering this illegal content and chance of facilitating offences

  • At a generic service level (i.e. without accounting for Treadl's specific nature), the risk factors identified with this harm (see "Risk profile and factors" above) would indicate that this harm is somewhat likely on the service.
  • There are no additional characteristics of Treadl that make this harm more likely.
  • Evidence suggests that this harm is not likely to occur on Treadl, as there have never been any known incidents nor user complaints.
  • There are a number of measures that reduce the likelihood of this harm further, particularly pre-publish approval of all project files and comments, and post-publish moderation of user profile changes. As such, any incidents of this harm in these spaces would be removed/reported before they become available to other users, or -- in some cases -- very shortly afterwards.

Assessing the impact of this illegal content and chance of facilitating an offence

  • Nature and severity of the harm
    • There are no known incidents of users having harmful experiences relating to this harm on Treadl.
  • Impact on affected individuals
    • We assess that the likelihood of individual characteristics becoming a factor of increased impact of this harm is low due to the limited way for people to express their individuality using the service, and for others to identify individuals (since users cannot be searched for -- either by their characteristics or otherwise).
  • The design and operation of Treadl
    • Treadl has no revenue model or growth strategy -- it is a free and open source service offering. There is no financial incentive for Treadl to have algorithms that keep users "entertained" or amplify content subjectively, and all features are provided on a utilitarian basis.
    • Treadl's commercial profile means there is a potential capacity issue in the required human content reviews and moderation. However, current evidence demonstrates that the volume of moderation needed is low, with only a small handful of reviews per day (due to Treadl's small size, and most people using the tool for weaving pattern editing). If this were to increase 2x or 3x it would still be manageable by one person -- particularly with the technical tools in place to help with this process. Further technical tools could be developed, and additional human processes introduced, to scale this capability when required.
    • We assess that there are no other service characteristics that would cause the impact of this harm to be increased.
    • Recommender systems used by Treadl order content either by date or by pure randomness, and are focused on weaving projects (for example, the chronological "Explore" timeline only includes weaving patterns; images, files, and comments are not included). Given that all other content is approved or moderated, the likelihood of this factor increasing the impact of the harm is low.
    • We assess that, based on the evidence, user characteristics data cannot be used to increase the impact of this harm to specific user sub-groups. As such, user demographics is not considered to be a significant contributor to any additional impact of this harm.
  • None of the collected evidence suggests there is any impact resulting from this harm on Treadl. There have been zero user complaints/reports of this content to date and there are no known cases of it. Even if such content were to appear despite the approval and moderation processes, Treadl does not receive enough traffic to substantially increase the impact of this harm.

Existing controls that reduce the risk of this harm

The following technical and process controls reduce the likelihood and impact of this harm:

  • Content approval, which means that all project images, files, and comments must be approved before they are available to others (a minimal sketch of this gating model follows this list).
  • Content moderation, which means that other content (e.g. user profiles and project names and descriptions) is moderated in a timely manner (according to our Online Safety Policy) after potentially becoming available to others. Note that even private projects are moderated.
  • Reporting capabilities, which allow any illegal content encountered before moderation takes place to be reported, so that its handling can be prioritised and further dissemination of the illegal content reduced.
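
As a further illustration of the content approval control, the sketch below shows how a status-gated visibility check of this kind can work: an item is only shown to users other than its author once a human moderator has approved it. The names used (ContentItem, status, visible_to) are assumptions for this example and do not represent Treadl's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class ContentItem:
    """Hypothetical model of a project file, image, video, or comment."""
    owner_id: int
    body: str
    status: str = "pending"  # "pending" until a moderator sets "approved" or "rejected"

def visible_to(item: ContentItem, viewer_id: int) -> bool:
    """Pre-display approval gate: un-approved items are only ever visible to
    their own author; everyone else sees an item only once it is approved."""
    if viewer_id == item.owner_id:
        return True
    return item.status == "approved"

comment = ContentItem(owner_id=42, body="Lovely draft!")
assert visible_to(comment, viewer_id=42)      # the author can see their own pending item
assert not visible_to(comment, viewer_id=7)   # other users cannot, until approval
```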

Controlling associated Risk Profile risk factors

Pertinent factors from the Risk Profile related to this harm are managed or reduced as follows:

  • File sharing services (users can upload files to projects, and projects can be public)
    • Control: all files uploaded to projects must be manually approved before they are available to other users.
  • Commenting on content
    • Reduced: all comments on public projects are also public.
    • Control: all comments must be manually approved before they are viewable by other users.
  • Posting images or videos
    • Control: all project images and videos must be manually approved before they are accessible to other users.
    • Reduced: user profile pictures are all optional and are all public.
    • Control: user profile pictures undergo manual moderation after they are updated.
  • Content recommender systems
    • Reduced: users are only recommended public projects and weaving patterns
    • Control: images, files, videos, and comments in public projects are moderated/approved as described above.

Additional characteristics we consider

We also consider the following characteristics in the assessment of the likelihood and impact of this type of harm:

  • Commercial profile: potentially increases the risk of harm if moderation requirements exceed team capacity. However, there is no evidence of this capacity being reached, and this risk is mitigated using easy-to-use technical controls that are also available remotely.
  • Functionality: Treadl does not have the capability to support private or group-based messaging. Content is either limited to its author or it is public. This reduces the effectiveness of using Treadl to commit or facilitate offences related to this harm.
  • Target audience: Treadl is purely focused on the art and weaving community, with service features heavily specific to this use-case. We consider that this significantly reduces the likelihood and potential impact of this type of harm occurring, and of Treadl being used to facilitate or commit an offence.
  • User demographics: service features and capabilities (as described above) mean that we consider the risk of personal characteristics increasing the likelihood or impact of harm to be low.

Risk level

Based on the above evidence and reasoning, we assess that the likelihood and impact of encouraging or assisting suicide content and offences on Treadl is LOW.

16. Foreign interference

Evidence collection

  • Risk profile and risk factors:
    • File sharing services
    • User profiles
    • Fake user profiles
    • Posting images or videos
    • Content recommender systems
  • User complaints:
    • None to date
  • User data (age, background, etc.)
    • We do not collect enough user data to realistically determine Treadl's population type for personal characteristics.
    • Treadl does not support discovering users by their personal characteristics. For example, users cannot be searched for, and user profiles are purposefully kept relatively vague (real names, gender, etc. are not required).
    • Data indicates that around 5% of user accounts are associated with university email addresses. However, only 0.1% are from UK universities.
  • Retrospective analysis of incidents of harm
    • There are no known incidents of this harm on Treadl.
  • Evidence drawn from existing controls
    • Manual pre-display content approval is in place for all uploaded files, content comments, and posted pictures and videos.
    • Content moderation is in place for all user profile updates.
    • Searching for user-generated content is limited to weaving project "names". Users cannot search by tag or any other mechanism.
    • Content "recommendation" is based on pure randomness, and only recommends weaving projects that have weaving patterns containing threading content. There are no algorithms or machine-learning capabilities that make this process user-oriented or activity-oriented.
    • Content reporting/complaints is available and signposted from all user-generated content, including the above, user profiles, and projects.
  • Other relevant information
    • We assess that we do not need enhanced inputs to assess this risk and that there are no characteristics which increase this harm.
    • Projects on Treadl (which may contain images, files, and comments) are either public OR private. As such, their content can only be seen by the authors themselves or by the public (after approval and subject to moderation).

Assessing the likelihood of encountering this illegal content and chance of facilitating offences

  • At a generic service level (i.e. not specific to Treadl's particular nature), the risk factors identified with this harm (see "Risk profile and risk factors" above) would indicate that this harm is somewhat likely on the service.
  • There are no additional characteristics of Treadl that make this harm more likely.
  • Evidence suggests that this harm is not likely to occur on Treadl, as there have never been any known incidents nor user complaints.
  • There are a number of measures that reduce the likelihood of this harm further, particularly pre-publish approval of all project files and comments, and post-publish moderation of user profile changes. As such, any incidents of this harm in these spaces would be removed or reported before they become available to other users, or -- in some cases -- very shortly afterwards.

Assessing the impact of this illegal content and chance of facilitating an offence

  • Nature and severity of the harm
    • There are no known incidents of users having a harmful experience relating to this harm on Treadl.
  • Impact on affected individuals
    • We assess that the likelihood of individual characteristics increasing the impact of this harm is low, due to the limited ways for people to express their individuality using the service, and for others to identify individuals (since users cannot be searched for -- either by their characteristics or otherwise).
  • The design and operation of Treadl
    • Treadl has no revenue model or growth strategy -- it is a free and open source service offering. There is no financial incentive for Treadl to have algorithms that keep users "entertained" or amplify content subjectively, and all features are provided on a utilitarian basis.
    • Treadl's commercial profile means there is a potential capacity issue in the required human content reviews and moderation. However, current evidence demonstrates that the volume of moderation needed is low, with only a small handful of reviews per day (due to Treadl's small size, and most people using the tool for weaving pattern editing). If this were to increase 2x or 3x it would still be manageable by one person -- particularly with the technical tools in place to help with this process. Further technical tools could be developed, and additional human processes introduced, to scale this capability when required.
    • We assess that there are no other service characteristics that would cause the impact of this harm to be increased.
    • Recommender systems used by Treadl order content either by date or by pure randomness, and are focused on weaving projects (for example, the chronological "Explore" timeline only includes weaving patterns; images, files, and comments are not included). Given that all other content is approved or moderated, the likelihood of this factor increasing the impact of the harm is low.
    • We assess that, based on the evidence, user characteristics data cannot be used to increase the impact of this harm to specific user sub-groups. As such, user demographics is not considered to be a significant contributor to any additional impact of this harm.
  • None of the collected evidence suggests there is any impact resulting from this harm on Treadl. There have been zero user complaints/reports of this content to date and there are no known cases of it. Even if such content were to appear despite the approval and moderation processes, Treadl does not receive enough traffic to substantially increase the impact of this harm.

Existing controls that reduce the risk of this harm

The following technical and process controls reduce the likelihood and impact of this harm:

  • Content approval, which means that all project images, files, and comments must be approved before they are available to others.
  • Content moderation, which means that other content (e.g. user profiles and project names and descriptions) is moderated in a timely manner (according to our Online Safety Policy) after potentially becoming available to others. Note that even private projects are moderated.
  • Reporting capabilities, which allow any illegal content encountered before moderation takes place to be reported, so that its handling can be prioritised and further dissemination of the illegal content reduced.

Controlling associated Risk Profile risk factors

Pertinent factors from the Risk Profile related to this harm are managed or reduced as follows:

  • File sharing services (users can upload files to projects, and projects can be public)
    • Control: all files uploaded to projects must be manually approved before they are available to other users.
  • User profiles (users have a presence on Treadl, with avatar image, bio, location, and info)
    • Control: all user profile changes are moderated as part of the technical control (for mitigating harms spread through this means).
    • Reduced: Treadl user profiles are intentionally minimal and do not typically reveal personal characteristics unless usernames are re-used or someone intentionally attempts to reveal more about themselves.
    • Control: all user profiles are subject to a clear reporting mechanism.
  • Fake user profiles
    • Reduced: Treadl user profiles are intentionally minimal and do not typically reveal personal characteristics.
    • Control: all user profile changes are moderated. However, we acknowledge that moderators may not know if a fake user profile is impersonating someone else.
    • Reduced: given that all users can do on Treadl is post their weaving projects, we assess that the impact of this illegal content, and the resultant harm arising from fake profiles, is minimised.
  • Posting images or videos
    • Control: all project images and videos must be manually approved before they are accessible to other users.
    • Reduced: user profile pictures are all optional and are all public.
    • Control: user profile pictures undergo manual moderation after they are updated.
  • Content recommender systems
    • Reduced: users are only recommended public projects and weaving patterns
    • Control: images, files, videos, and comments in public projects are moderated/approved as described above.

Additional characteristics we consider

We also consider the following characteristics in the assessment of the likelihood and impact of this type of harm:

  • Commercial profile: potentially increases the risk of harm if moderation requirements exceed team capacity. However, there is no evidence of this capacity being reached, and this risk is mitigated using easy-to-use technical controls that are also available remotely.
  • Functionality: Treadl does not have the capability to support private or group-based messaging. Content is either limited to its author or it is public. This reduces the effectiveness of using Treadl to commit or facilitate offences related to this harm.
  • Target audience: Treadl is purely focused on the art and weaving community, with service features heavily specific to this use-case. We consider that this significantly reduces the likelihood and potential impact of this type of harm occurring, and of Treadl being used to facilitate or commit an offence.
  • User demographics: service features and capabilities (as described above) mean that we consider the risk of personal characteristics increasing the likelihood or impact of harm to be low.

Risk level

Based on the above evidence and reasoning, we assess that the likelihood and impact of foreign interference content and offences on Treadl is LOW.

17. Animal cruelty

Evidence collection

  • Risk profile and risk factors:
    • File sharing services
    • Commenting on content
    • Posting images or videos
  • User complaints:
    • None to date
  • User data (age, background, etc.)
    • We do not collect enough user data to realistically determine Treadl's population type for personal characteristics.
    • Treadl does not support discovering users by their personal characteristics. For example, users cannot be searched for, and user profiles are purposefully kept relatively vague (real names, gender, etc. are not required).
    • Data indicates that around 5% of user accounts are associated with university email addresses. However, only 0.1% are from UK universities.
  • Retrospective analysis of incidents of harm
    • There are no known incidents of this harm on Treadl.
  • Evidence drawn from existing controls
    • Manual pre-display content approval is in place for all uploaded files, content comments, and posted pictures and videos.
    • Content moderation is in place for all user profile updates.
    • Searching for user-generated content is limited to weaving project "names". Users cannot search by tag or any other mechanism.
    • Content "recommendation" is based on pure randomness, and only recommends weaving projects that have weaving patterns containing threading content. There are no algorithms or machine-learning capabilities that make this process user-oriented or activity-oriented.
    • Content reporting/complaints is available and signposted from all user-generated content, including the above, user profiles, and projects.
  • Other relevant information
    • We assess that we do not need enhanced inputs to assess this risk and that there are no characteristics which increase this harm.
    • Projects on Treadl (which may contain images, files, and comments) are either public OR private. As such, their content can only be seen by the authors themselves or by the public (after approval and subject to moderation).

Assessing the likelihood of encountering this illegal content and chance of facilitating offences

  • At a generic service level (i.e. not specific to Treadl's particular nature), the risk factors identified with this harm (see "Risk profile and risk factors" above) would indicate that this harm is somewhat likely on the service.
  • There are no additional characteristics of Treadl that make this harm more likely.
  • Evidence suggests that this harm is not likely to occur on Treadl, as there have never been any known incidents nor user complaints.
  • There are a number of measures that reduce the likelihood of this harm further, particularly pre-publish approval of all project files and comments, and post-publish moderation of user profile changes. As such, any incidents of this harm in these spaces would be removed or reported before they become available to other users, or -- in some cases -- very shortly afterwards.

Assessing the impact of this illegal content and chance of facilitating an offence

  • Nature and severity of the harm
    • There are no known incidents of users having a harmful experience relating to this harm on Treadl.
  • Impact on affected individuals
    • We assess that the likelihood of individual characteristics increasing the impact of this harm is low, due to the limited ways for people to express their individuality using the service, and for others to identify individuals (since users cannot be searched for -- either by their characteristics or otherwise).
  • The design and operation of Treadl
    • Treadl has no revenue model or growth strategy -- it is a free and open source service offering. There is no financial incentive for Treadl to have algorithms that keep users "entertained" or amplify content subjectively, and all features are provided on a utilitarian basis.
    • Treadl's commercial profile means there is a potential capacity issue in the required human content reviews and moderation. However, current evidence demonstrates that the volume of moderation needed is low, with only a small handful of reviews per day (due to Treadl's small size, and most people using the tool for weaving pattern editing). If this were to increase 2x or 3x it would still be manageable by one person -- particularly with the technical tools in place to help with this process. Further technical tools could be developed, and additional human processes introduced, to scale this capability when required.
    • We assess that there are no other service characteristics that would cause the impact of this harm to be increased.
    • Recommender systems used by Treadl order content either by date or by pure randomness, and are focused on weaving projects (for example, the chronological "Explore" timeline only includes weaving patterns; images, files, and comments are not included). Given that all other content is approved or moderated, the likelihood of this factor increasing the impact of the harm is low.
    • We assess that, based on the evidence, user characteristics data cannot be used to increase the impact of this harm to specific user sub-groups. As such, user demographics is not considered to be a significant contributor to any additional impact of this harm.
  • None of the collected evidence suggests there is any impact resulting from this harm on Treadl. There have been zero user complaints/reports of this content to date and there are no known cases of it. Even if such content were to appear despite the approval and moderation processes, Treadl does not receive enough traffic to substantially increase the impact of this harm.

Existing controls that reduce the risk of this harm

The following technical and process controls reduce the likelihood and impact of this harm:

  • Content approval, which means that all project images, files, and comments must be approved before they are available to others.
  • Content moderation, which means that other content (e.g. user profiles and project names and descriptions) is moderated in a timely manner (according to our Online Safety Policy) after potentially becoming available to others. Note that even private projects are moderated.
  • Reporting capabilities, which allow any illegal content encountered before moderation takes place to be reported, so that its handling can be prioritised and further dissemination of the illegal content reduced.

Controlling associated Risk Profile risk factors

Pertinent factors from the Risk Profile related to this harm are managed or reduced as follows:

  • File sharing services (users can upload files to projects, and projects can be public)
    • Control: all files uploaded to projects must be manually approved before they are available to other users.
  • Commenting on content
    • Reduced: all comments on public projects are also public.
    • Control: all comments must be manually approved before they are viewable by other users.
  • Posting images or videos
    • Control: all project images and videos must be manually approved before they are accessible to other users.
    • Reduced: user profile pictures are all optional and are all public.
    • Control: user profile pictures undergo manual moderation after they are updated.

Additional characteristics we consider

We also consider the following characteristics in the assessment of the likelihood and impact of this type of harm:

  • Commercial profile: potentially increases the risk of harm if moderation requirements exceed team capacity. However, there is no evidence of this capacity being reached, and this risk is mitigated using easy-to-use technical controls that are also available remotely.
  • Functionality: Treadl does not have the capability to support private or group-based messaging. Content is either limited to its author or it is public. This reduces the effectiveness of using Treadl to commit or facilitate offences related to this harm.
  • Target audience: Treadl is purely focused on the art and weaving community, with service features heavily specific to this use-case. We consider that this significantly reduces the likelihood and potential impact of this type of harm occurring, and of Treadl being used to facilitate or commit an offence.
  • User demographics: service features and capabilities (as described above) mean that we consider the risk of personal characteristics increasing the likelihood or impact of harm to be low.

Risk level

Based on the above evidence and reasoning, we assess that the likelihood and impact of animal cruelty content and offences on Treadl is LOW.

Step 3: Decide measures, implement, and record

Based on our description of Treadl and our risk assessment in Step 2, we assess that:

  • Treadl is a "smaller" service
    • Because it has a low number of UK visitors and users.
  • Treadl is a "low-risk" service
    • Because our assessment of each of the identified illegal types of content was "low".

We now consult the Codes of Practice to determine the technical, process, and other measures recommended for Treadl. We outline these below and, for each, determine a course of action for implementation.

  • ICU A2: Individual accountable for illegal content safety duties and reporting and complaints duties
    • We have assigned an individual accountable for such duties as defined in our Online Safety Policy (Will Webberley, project lead).
  • ICU C1: Having a content moderation function to review and assess suspected illegal content
    • Treadl has in place both a content moderation function to review and assess suspected illegal content and an approval system for the most potentially sensitive types of content (comments, images, videos, and other files in projects).
  • ICU C2: Having a content moderation function that allows for the swift take down of illegal content
    • The content moderation system described above allows for identifying illegal content. Our Online Safety Policy describes the process and function that enable the swift take down of any identified illegal content.
  • ICU D1: Enabling complaints
    • Treadl has a robust, tested complaints and reporting procedure that is easy to use and access, via a form on a dedicated page on our web application.
  • ICU D2: Having an easy to find, easy to access, and easy to use complaints system and processes
    • Treadl's complaints and reporting system comprises an easy-to-access and easy-to-use form that is clearly displayed on a dedicated page on our web application. Submissions to this form immediately notify admins via email, who can then take appropriate action according to our Online Safety Policy (an illustrative sketch of this notification flow follows this list).
    • The form is clearly accessible via buttons and links alongside each piece of user generated content. Supporting content can also be included in the form.
  • ICU D7: Appropriate action for relevant complaints about suspected illegal content
    • If complaints are received about suspected illegal content, Treadl admins will review the content and determine its legality and appropriateness for the service. If the content is found to be illegal or otherwise in breach of the service's terms of use, it will be immediately removed.
  • ICU D9: Appropriate action for relevant complaints which are appeals determination (services that are neither large nor multi risk)
    • If a complaint is received that is an appeal about a previous takedown made as a result of a complaint or moderation process, Treadl administrators will promptly identify this based on the content of the complaint.
  • ICU D10: Appropriate action for relevant complaints which are appeals action following determination
    • According to our Online Safety Policy, processes allow for the original user or content data to be restored to the service if a decision is made to reverse the original decision as a result of a complaint or moderation process.
  • ICU D11: Appropriate action for relevant complaints about proactive technology, which are not appeals
    • This practice is related to complaints made about content takedowns. The complaints form can be used to make such complaints, upon receipt of which Treadl admins will review the original action taken against the terms of use and illegal content guidelines. If the original action is found to be in breach of Treadl's terms of use, the content/user will be reinstated, the complainant will be notified, and any relevant actions the complainant can take will be described.
    • It is worth noting that Treadl's approval mechanism may not constitute "proactive technology", in which case this practice would not apply to Treadl.
  • ICU D12: Appropriate action for all other relevant complaints
    • If a complaint is received that complains that Treadl is not complying with its relevant duties on safety or illegal content, or complaints related to freedom of expression or privacy, Treadl's nominated Online Safety responsible person will handle the complaint. This will be done on a case-by-case basis whilst protecting UK users and within a timeframe as described in our Online Safety Policy.
  • ICU D13: Exception: manifestly unfounded complaints
    • If a complaint is received that is not an appeal but is determined to be manifestly unfounded, the complaint will be disregarded as described in our Online Safety Policy. The policy also describes our review process to ensure maintained accuracy in such determinations.
  • ICU G1: Terms of service: substance (all services)
    • Treadl's terms of use describe how users and visitors are to be protected from illegal content, the reporting process, and the moderation process.
  • ICU G3: Terms of service: clarity and accessibility
    • Treadl's terms of use are clearly signposted and are available from the footer of every web page and from the mobile app. They are designed to be easy to read, easily understandable, and accessible.
  • ICU H1: Removing accounts of proscribed organisations
    • A proscribed organisation is an organisation that has been banned under the UK Terrorism Act. If Treadl admins encounter content or users that are determined to be from a proscribed organisation (e.g. as a result of moderation, approval, or a complaint), the user or content will be removed. For example, this might be a user whose username relates to a proscribed organisation (or an alias thereof), a bio or avatar image relating to content or assets of a proscribed organisation, or content created by a user that relates to a proscribed organisation.
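
To illustrate the reporting flow referenced under ICU D2 above, the sketch below shows how a complaints-form submission might immediately notify admins by email. The function name, addresses, and SMTP configuration are assumptions made for illustration; this is not Treadl's actual implementation.

```python
import smtplib
from email.message import EmailMessage

ADMIN_ADDRESS = "admin@example.com"  # placeholder address, not Treadl's real one

def notify_admins_of_report(reporter_email: str, content_url: str, reason: str) -> None:
    """On submission of the complaints/reporting form, immediately email the
    admins so the report can be reviewed and actioned in line with the
    Online Safety Policy."""
    msg = EmailMessage()
    msg["Subject"] = "New content report received"
    msg["From"] = "no-reply@example.com"
    msg["To"] = ADMIN_ADDRESS
    msg.set_content(
        f"Reporter: {reporter_email}\n"
        f"Reported content: {content_url}\n"
        f"Reason given: {reason}\n"
    )
    with smtplib.SMTP("localhost") as smtp:  # assumes a local mail relay is available
        smtp.send_message(msg)
```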

We confirm that all relevant recommended measures from the Codes of Practice have, as of the date of this risk assessment, been implemented, as described above.

Alternative measures (beyond the list above and the existing controls described in Step 2) will not be implemented at this time.

Step 4: Report, review and update

Treadl has a published and publicly-available Online Safety Policy, which describes how and when this risk assessment is reviewed and updated, and defines the allocated person (Will Webberley) responsible for online safety at Treadl.

We confirm that this assessment and its results have been circulated through Treadl governance channels.