Velas Products: Multi-Model Governance


Motivation

Social networks have become some of the most popular online platforms, reaching into nearly every aspect of daily life. They are the public squares where people spend their time chatting with friends, sharing content, and publishing information.

When people can upload and download content freely and leverage the powerful reach these platforms offer, unwanted or unintended consequences follow, such as spam, propaganda, and violence, all of which require moderation or removal. Any system that stores and publishes video and other content therefore needs a moderator, and decentralized systems are no exception.

Since a decentralized system is, by its architecture, an automated software complex that does not require a person to maintain it, human participation should be automated and systematized as far as possible and reduced to decentralized governance.

Decision

Content moderation in BitOrbit will be provided and maintained entirely by the community and by validators selected by stakeholders.

When an individual user or a group of users marks content on BitOrbit as “undesirable”, it immediately goes before a committee for review. The committee consists of 19 validators, who moderate the content and receive rewards in VLX for their work.

The committee verifies the content and, if it violates the BitOrbit Terms of Service, removes it from public access.
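The article does not describe the mechanics of flagging and voting, so the following Python sketch shows only one plausible shape of the flow. The 19-member committee and the VLX reward come from the text; the majority-vote rule, the reward amount, and the payout hook are assumptions.

```python
from dataclasses import dataclass, field

COMMITTEE_SIZE = 19        # from the text: 19 validators review flagged content
REVIEW_REWARD_VLX = 1.0    # hypothetical per-review reward; no amount is given

@dataclass
class FlaggedContent:
    content_id: str
    flag_reasons: list = field(default_factory=list)

def submit_flag(queue, content_id, reason):
    """A user flag sends the content straight to the committee's review queue."""
    queue.append(FlaggedContent(content_id, [reason]))

def committee_decision(votes):
    """Every validator votes remove (True) or keep (False); a simple majority
    is assumed here, since the text does not say how the committee decides."""
    assert len(votes) == COMMITTEE_SIZE
    reward_validators(REVIEW_REWARD_VLX)   # hypothetical payout hook
    return "removed" if sum(votes) > COMMITTEE_SIZE // 2 else "kept"

def reward_validators(amount_vlx):
    """Placeholder for VLX settlement on Velas; out of scope for this sketch."""
    pass
```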

Examples of rules in action under BitOrbit (a sketch of the corresponding check follows the list):

  • Sexual content = rejected
  • Manifestation of violence = rejected
  • References to terrorism = rejected
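
As a toy illustration of how such a check might look (the label names below are invented for the example; BitOrbit's actual label taxonomy is not specified):

```python
# Illustrative blocklist; the real BitOrbit taxonomy is not public.
REJECTED_LABELS = {"sexual_content", "violence", "terrorism"}

def apply_rules(labels):
    """Reject content carrying any blocklisted label; accept it otherwise."""
    return "rejected" if set(labels) & REJECTED_LABELS else "accepted"

print(apply_rules(["gaming", "violence"]))  # -> rejected
print(apply_rules(["music"]))               # -> accepted
```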

To assist our validators and automate the process, we are providing an AI module capable of processing every picture and video that passes through BitOrbit and assigning an appropriate label to the content. When the AI module is notified of rejected content, it retrains itself to improve its labeling rules going forward. By the time of the public release, the algorithm will be trained to identify 95% of problems automatically, minimizing the involvement of validators.
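A minimal sketch of that feedback loop, assuming a hypothetical model object with `predict` and `update` methods and reading the stated 95% figure as a confidence cutoff:

```python
CONFIDENCE_THRESHOLD = 0.95   # hypothetical cutoff, echoing the stated 95% target

def moderate(model, content, review_queue):
    """Label content before publication; low-confidence calls escalate to validators."""
    label, confidence = model.predict(content)   # `predict` is an assumed model API
    if confidence < CONFIDENCE_THRESHOLD:
        review_queue.append((content, label))    # hand off to the validator committee
    return label

def on_committee_rejection(model, content, corrected_label):
    """Feed each committee rejection back as a training example, i.e. the
    retraining step described above; `update` is an assumed learning hook."""
    model.update(content, corrected_label)
```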

Each validator will be able to integrate the AI module independently and, by moderating effectively, improve its rating over time.

