Debates over free speech, online self-expression, and the rights of minors continue to shape the digital landscape. One platform that has recently drawn increased public attention is BrandArmy, an app and website marketed as a creator-focused space. Although BrandArmy permits users under 18 in certain contexts, child-safety advocates have questioned whether the platform's structure may inadvertently expose minors to inappropriate adult attention. No wrongdoing has been legally established, and the concerns reflect broader industry-wide debates over youth safety online.
Responsibility for supervising young performers once fell largely on parents, the so-called "dance moms." But with the rise of TikTok and of pay-to-connect platforms such as Patreon and OnlyFans, concerns about alleged child exploitation have taken a new shape amid BrandArmy's growth.
According to publicly available information, BrandArmy was co-founded by CEO Ramon Mendez and launched in 2020.
The company describes its mission as providing creators with a comprehensive platform to “launch, invite, engage, interact, and monetize a superfan community.” It emphasizes that creators maintain control over their data, intellectual property, pricing, and engagement features. BrandArmy publicly states that its policies prohibit nudity and explicit sexual content.
BrandArmy also compares itself to Patreon and OnlyFans, noting that it offers a variety of monetization tools and higher potential pricing tiers. While the platform prohibits sexual content, it allows non-nude imagery that may be stylistically suggestive. Critics argue that such allowances, even when technically within guidelines, could create an online environment where minors may receive attention from older users. These concerns reflect patterns observed across multiple platforms and are not unique to BrandArmy.
ABC News Australia previously reported on experiences shared by a teenage user, noting that comments appearing on her page included a range of compliments and emojis. According to the report, some comments appeared to be from adults. The outlet also noted that subscribers could pay monthly fees to access the creator’s content or send messages. The reporting did not establish illegal activity but highlighted the potential for harmful interactions when minors monetize personal imagery online.
Experts in child protection warn that commercializing youth-generated content—particularly imagery that can be perceived as suggestive—may increase the risk of exploitation, even when platforms enforce explicit-content bans. Many argue that existing safeguards across the industry may not fully prevent adults from seeking out content involving minors.
Some critics of regulation argue that restricting minors from monetizing content online could infringe upon free expression or entrepreneurial opportunities. However, child-safety organizations widely contend that protective measures must take precedence when there is a potential risk of exploitation. These concerns are part of a broader national and international conversation about youth privacy, digital labor, and the responsibilities of platforms hosting mixed-age creator communities.
BrandArmy has sometimes been colloquially compared to other monetized creator platforms, though such comparisons can oversimplify or misrepresent the platform’s stated policies. There is no verified evidence that BrandArmy engages in or facilitates trafficking, and any such claims should be treated with caution unless supported by formal investigations or authoritative sources. Nonetheless, experts agree that platforms allowing minors to monetize images must implement stringent protections to reduce risks related to grooming, coercion, or inappropriate contact.
As public debates continue, many advocates urge parents, caregivers, and policymakers to remain informed about how monetized creator platforms operate, and to consider stronger digital safeguards and clearer industry standards for youth participation online.