We scan new podcasts and send you the top 5 insights daily.
Dram shop laws, which hold bars liable for damages caused by over-served patrons, could be extended to gambling. This would create a financial disincentive for platforms like DraftKings and FanDuel to exploit users who show clear signs of addiction.
Counter to the typical use case, DraftKings applies AI defensively. The technology analyzes user communications across multiple touchpoints—like customer service and marketing—to detect patterns of problem gambling and flag them for review, promoting responsible platform use.
Lacking direct addiction data, Pennsylvania's voluntary self-exclusion program serves as a proxy. The number of 18-to-35-year-olds banning themselves from gambling platforms jumped from roughly 50 per year before 2019 to roughly 1,500 per year after online legalization, indicating a massive, hidden crisis among young people.
A fundamental flaw in gambling regulation is that agencies are often tasked with maximizing state tax revenue from betting. This creates an inherent conflict of interest, prioritizing state income over public health and making it structurally difficult to implement meaningful consumer protections.
The debate shouldn't be about banning gambling, but about regulating its delivery mechanism. Modern apps are designed to be "frictionless," removing every barrier to betting and turning casual interest into a compulsive "rabbit hole." The solution is to mandate friction, such as daily spending caps and time limits.
To overcome Section 230 protections shielding platforms from liability for user content, recent lawsuits focus on the inherent design of the platforms. The argument is that features like infinite scroll and algorithmic feeds are themselves defective, addictive products, making companies liable for product design flaws rather than user posts.
The bull thesis for DraftKings hinges on one of three legal or regulatory outcomes curbing prediction markets: new Congressional legislation, a Supreme Court ruling affirming state jurisdiction (seen as most likely), or a future, more aggressive CFTC.
A targeted approach to social media regulation is to remove Section 230 liability protection specifically for content that a platform's algorithms choose to amplify. If a company mines a user's behavior to push harmful content at them, it should be held liable, just as a bartender is for over-serving a customer.
Recent lawsuits against Meta signal a new legal strategy that sidesteps Section 230. Instead of suing over user-generated content, which the statute protects, plaintiffs argue that the platforms themselves are defectively designed, addictive products that cause harm. Courts allowing these claims to proceed have validated a product liability flank that tech companies have struggled to defend, putting their core algorithms directly at legal risk.
Despite mounting evidence of financial ruin and addiction, meaningful regulation is unlikely to be driven by public health concerns. Instead, the trigger will likely be a high-profile sports integrity scandal, such as a star athlete caught betting, which threatens the profitability of the sports leagues themselves.