
CrashPlan’s SOV Soars with Thought Leadership – PR Case Study

CrashPlan provides enterprise and SMB backup and recovery solutions. When CrashPlan appointed Firebrand, the brand was better known for its consumer and prosumer backup solutions but wanted to increase its profile among enterprise buyers. The challenge was that, in the tech media, backup was perceived as a commodity, with most coverage of the space focusing exclusively on corporate moves among the largest vendors. We needed to create a fresh angle on enterprise backup.

Firebrand’s solution was to demonstrate that changes in workplace culture had redefined backup. The heart of the campaign was a new research study: the Work Trend Security Report. The survey explored the connection between emerging trends in work culture (from quiet quitting to polyworking) and data stewardship practices. It even revealed that a new class of employee was emerging, the “Idea Worker”, who had very different data backup needs.

In addition to publicizing the research through a series of mediagenic press announcements, CrashPlan developed an immersive landing page that drove engagement and leads, and brought the Idea Worker concept to life at in-person events. The initiative raised CrashPlan's profile with key enterprise reporters, boosting results for core product and corporate announcements as well as executive visibility opportunities.


Original source: https://www.firebrand.marketing/case-studies/case-study-crashplan-pr
