Instead of using generalist AI, LookAtMedia built a "media vertical AI model" trained on the writing of more than a million journalists. This focused approach yields higher-quality, more authentic content with a near-zero hallucination rate (less than 0.01%), which is crucial for maintaining credibility with the media.
To maintain quality, 6AM City's AI newsletters don't generate content from scratch. Instead, they use "extractive generative" AI to summarize information from existing, verified sources. This minimizes the risk of AI "hallucinations" and factual errors, which are common when AI is asked to expand upon a topic or create net-new content.
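The extractive approach described above can be sketched in a few lines: rather than generating new prose, the system selects the highest-signal sentences verbatim from a verified source article. 6AM City's actual pipeline is not public, so the word-frequency scoring here is purely illustrative.

```python
import re
from collections import Counter

def extractive_summary(text: str, k: int = 2) -> list[str]:
    """Return the k most information-dense sentences, verbatim and in order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence: str) -> int:
        # A sentence scores higher when its words recur across the article.
        return sum(freq[w] for w in re.findall(r"[a-z']+", sentence.lower()))

    top = sorted(sentences, key=score, reverse=True)[:k]
    # Preserve original ordering; every output sentence exists in the source,
    # which is what rules out hallucinated "net-new" claims.
    return [s for s in sentences if s in top]
```

Because the output is a strict subset of the input sentences, factual errors can only come from the source itself, never from generation.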
For specialized, high-stakes tasks like insurance underwriting, enterprises will favor smaller, on-prem models fine-tuned on proprietary data. These models can be faster, more accurate, and more secure than general-purpose frontier models, creating a lasting market for custom AI solutions.
Instead of building a single, monolithic AI agent that uses a vast, unstructured dataset, a more effective approach is to create multiple small, precise agents. Each agent is trained on a smaller, more controllable dataset specific to its task, which significantly reduces the risk of unpredictable interpretations and hallucinations.
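The multi-agent pattern above might look like the following hypothetical sketch: a lightweight router dispatches each request to a small, task-specific agent whose knowledge base is a narrow, curated dataset. The agent names, routing keywords, and corpus contents are illustrative assumptions, not a real framework.

```python
from dataclasses import dataclass

@dataclass
class Agent:
    name: str
    corpus: dict[str, str]  # small, controlled dataset for this one task

    def answer(self, query: str) -> str:
        # Answer only from this agent's own corpus and refuse otherwise;
        # that refusal is what keeps unpredictable interpretations low.
        for key, fact in self.corpus.items():
            if key in query.lower():
                return fact
        return "No verified answer in this agent's dataset."

def route(query: str, agents: dict[str, Agent],
          keywords: dict[str, str]) -> Agent:
    """Pick the specialist agent whose trigger keyword appears in the query."""
    for kw, agent_name in keywords.items():
        if kw in query.lower():
            return agents[agent_name]
    raise ValueError("No agent handles this query")
```

A usage example: `route("How do I write a headline?", agents, keywords)` would return the copywriting agent, which answers strictly from its own dataset.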
Before generative AI, AlphaSense built its sentiment analysis model by employing a large team for years to manually tag financial statements. This highly specialized, narrow AI still surpasses the performance of today's more generalized Large Language Models for that specific task, proving the enduring value of focused training data.
Instead of relying solely on massive, expensive, general-purpose LLMs, the trend is toward creating smaller, focused models trained on specific business data. These "niche" models are more cost-effective to run, less likely to hallucinate, and far more effective at performing specific, defined tasks for the enterprise.
M&A Science's "intelligence hub" differentiates from generalist AI like ChatGPT by grounding answers in a closed ecosystem of 400+ expert interviews. It provides sourced, experiential intelligence rather than generic internet-scraped guesses, making it a reliable tool for high-stakes professional work.
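Grounding answers in a closed corpus with source attribution, as the "intelligence hub" description suggests, can be sketched as retrieve-then-cite. The interview snippets below are invented, and the token-overlap scoring stands in for the embedding search a real system would use; this is an assumption-laden illustration, not M&A Science's implementation.

```python
def retrieve(query: str, corpus: list[tuple[str, str]]):
    """Return the (source_id, text) pair with the most word overlap, or None."""
    q = set(query.lower().split())
    def overlap(item: tuple[str, str]) -> int:
        return len(q & set(item[1].lower().split()))
    best = max(corpus, key=overlap)
    return best if overlap(best) > 0 else None

def grounded_answer(query: str, corpus: list[tuple[str, str]]) -> str:
    hit = retrieve(query, corpus)
    if hit is None:
        # Refuse rather than guess: no internet-scraped filler.
        return "No sourced answer available."
    source_id, text = hit
    return f"{text} [source: {source_id}]"
```

Every answer either carries a citation back to a specific expert interview or is an explicit refusal, which is what makes the output auditable for high-stakes work.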
Building a single, all-purpose AI is like hiring one person for every company role. To maximize accuracy and creativity, build multiple custom GPTs, each trained for a specific function like copywriting or operations, and have them collaborate.
LookAtMedia's platform has evolved beyond a simple PR tool for companies. Major media groups and journalism schools are now adopting it to generate high-quality, error-free news content internally. This creates a two-sided ecosystem where the tool both creates and satisfies the demand for news stories.
Claude's proficiency in writing is not accidental. Its developer, Anthropic, is heavily backed by Amazon (whose founder, Jeff Bezos, separately owns The Washington Post), and its training drew on high-quality journalistic and literary sources. This strategic use of superior training data gives it a distinct advantage in crafting persuasive prose.
True personalization in media outreach goes beyond using a journalist's name. LookAtMedia's AI analyzes a journalist's recent work and rewrites a core press release to match their individual writing style and audience interests. This hyper-personalization dramatically increases the likelihood of media coverage.