The lawsuit was filed this week by a 14-year-old boy and his mother in a district court in California. The plaintiffs argue that offensive content appears on Snapchat's Discover page, which delivers content from publications users have not subscribed to directly into their feeds.
The lawsuit argues that by routinely including sexually explicit content without adequate warnings, the app's Discover feature violates the Communications Decency Act:
Millions of parents in the United States today are unaware that Snapchat is curating and publishing this profoundly sexual and offensive content to their children. By engaging in such conduct directed at minors, and making it simple and easy for users to 'snap' each other's content from Snapchat Discover, Snapchat is reinforcing the use of its service to facilitate problematic communications, such as 'sexting,' between minors. Snapchat has placed profit from monetizing Snapchat Discover over the safety of children.
The lawsuit, which is seeking class-action status, asks for civil penalties and a requirement that Snapchat include an in-app warning about sexual content.
Publishers regularly create specialized content for the platform, and Snapchat receives advertising revenue from these partners in return. Users can subscribe to specific publisher channels, but the Discover page also exposes them to publishers they have not subscribed to.
Snapchat claims its partners have editorial independence, but according to The Verge (itself a content provider for Snapchat), the company reportedly exercises a heavy hand in guiding the look and feel of published stories.
Snapchat is rated in the App Store as appropriate for children ages 12 and over, with a note that it may contain infrequent or mild sexual content, nudity, suggestive themes, profanity, and references to drugs and alcohol. That contrasts with Snapchat's terms of service, which restrict use to children 13 and older.
You can read the lawsuit here.