Tag Archives: Media Workflow

Automate Your Data Workflows With Backblaze B2 Event Notifications

Post Syndicated from Bala Krishna Gangisetty original https://backblazeprod.wpenginepowered.com/blog/announcing-event-notifications/

A decorative image showing the Backblaze logo on a cloud with an alert notification.

Backblaze believes companies should be able to store, use, and protect their data in whatever way is best for their business—and that doing so should be easy. That’s why we’re such fierce advocates for the open cloud and why today’s announcement is so exciting.

Event Notifications—available today in private preview—gives businesses the freedom to build automated workflows across the different best-of-breed cloud platforms they use or want to use, saving time and money and improving end user experiences.

Here’s how: With Backblaze Event Notifications, any data changes within Backblaze B2 Cloud Storage—like uploads, updates, or deletions—can automatically trigger actions in a workflow, including transcoding video files, spooling up data analytics, delivering finished assets to end users, and many others. Importantly, unlike many other solutions currently available, Backblaze’s service doesn’t lock you into one platform or require you to use legacy tools from AWS.

So, to businesses that want to create an automated workflow that combines different compute, content delivery network (CDN), data analytics, and other cloud services: Now you can, with the bonus of cloud storage at a fifth of the rates of other solutions and free egress.

If you’re already a Backblaze customer, you can join the waiting list for the Event Notifications preview by signing up here. Once you’re admitted to the preview, the Event Notifications option will become visible in your Backblaze B2 account.

A screenshot showing where to find Event Notifications in your Backblaze account.

Not a Backblaze customer yet? Sign up for a free Backblaze B2 account and join the waitlist. Read on for more details on how Event Notifications can benefit you.

With Event Notifications, we can eliminate the final AWS component, Simple Queue Service (SQS), from our infrastructure. This completes our transition to a more streamlined and cost-effective tech stack. It’s not just about simplifying operations—it’s about achieving full independence from legacy systems and future-proofing our infrastructure.


— Oleh Aleynik, Senior Software Engineer and Co-Founder at CloudSpot.

A Deeper Dive on Backblaze’s Event Notifications Service

Event Notifications is a service designed to streamline and automate data workflows for Backblaze B2 customers. Whether it’s compressing objects, transcoding videos, or transforming data files, Event Notifications empowers you to orchestrate complex, multistep processes seamlessly.

The top-line benefit of Event Notifications is its ability to trigger processing workflows automatically whenever data changes on Backblaze B2. This means that as soon as new data is uploaded, changed, or deleted, the relevant processing steps can be initiated without manual intervention. This automation not only saves time and resources, but it also ensures that workflows are consistently executed with precision, free from human errors.

What sets Event Notifications apart is its flexibility. Unlike some other solutions that are tied to specific target services, Event Notifications allows customers the freedom to choose the target services that best suit their needs. Whether it’s integrating with third-party applications, cloud services, or internal systems, Event Notifications seamlessly integrates into existing workflows, offering unparalleled versatility.

Finally, Event Notifications doesn’t only bring greater ease and efficiency to workflows; it is also designed for easy enablement. Whether via the browser UI, SDKs, APIs, or CLI, it is simple to set up a notification rule and integrate it with your preferred target service. Simply choose your event type, set the criteria, and input your endpoint URL, and a new workflow can be configured in minutes.
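The steps above—event type, criteria, endpoint URL—can be sketched as a simple rule definition. The field names below are assumptions for illustration only; consult the Backblaze B2 Event Notifications documentation for the actual schema:

```python
import json

# Hypothetical notification rule — the field names here are illustrative,
# not the official Backblaze B2 schema.
rule = {
    "name": "transcode-on-upload",
    "eventTypes": ["b2:ObjectCreated:Upload"],  # fire when objects are uploaded
    "objectNamePrefix": "videos/",              # restrict to one prefix
    "targetConfiguration": {
        "targetType": "webhook",
        "url": "https://example.com/hooks/transcode",  # your endpoint URL
    },
}

print(json.dumps(rule, indent=2))
```

Once a rule like this is in place, every matching upload triggers an HTTP POST to the configured endpoint, which can then kick off the next workflow step.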

What Is Backblaze B2 Event Notifications Good For?

By leveraging Event Notifications, Backblaze B2 customers can simplify their data processing pipelines, reduce manual effort, and increase operational efficiency. With the ability to automate repetitive tasks and handle millions of objects per day, businesses can focus on extracting insights from their data rather than managing the logistics of data processing.

A diagram showing the steps of event notifications.

Automating tasks: Event Notifications allows users to trigger automated actions in response to changes in stored objects like upload, delete, and hide actions, streamlining complex data processing tasks.

Orchestrating workflows: Users can orchestrate multi-step workflows, such as compressing files, transcoding videos, or transforming data formats, based on specific object events.

Integrating with services: The feature offers flexible integration capabilities, enabling seamless interaction with various services and tools to enhance data processing and management.

Monitoring changes: Users can efficiently monitor and track changes to stored objects, ensuring timely responses to evolving data requirements and faster security response to safeguard critical assets.

What Are Some of the Key Capabilities of Backblaze B2 Event Notifications?

  • Flexible Implementation: Event Notifications are sent as HTTP POST requests to the desired service or endpoint within your infrastructure or any other cloud service. This flexibility ensures seamless integration with your existing workflows. For instance, your endpoint could be Fastly Compute, AWS Lambda, Azure Functions, or Google Cloud Functions.
  • Event Categories: Specify the types of events you want to be notified about, such as when files are uploaded or deleted. This allows you to receive notifications tailored to your specific needs. For instance, you have the flexibility to specify different methods of object creation, such as copying, uploading, or multipart replication, to trigger event notifications. You can also manage Event Notification rules through UI or API.
  • Filter by Prefix: Define prefixes to filter events, enabling you to narrow down notifications to specific sets of objects or directories within your storage on Backblaze B2. For instance, if your bucket contains audio, video, and text files organized into separate prefixes, you can specify the prefix for audio files to receive event notifications exclusively for audio files.
  • Custom Headers: Include personalized HTTP headers in your event notifications to provide additional authentication or contextual information when communicating with your target endpoint. For example, you can use these headers to add necessary authentication tokens or API keys for your target endpoint, or include any extra metadata related to the payload to offer contextual information to your webhook endpoint, and more.
  • Signed Notification Messages: You can configure outgoing messages to be signed by the Event Notifications service, allowing you to validate signatures and verify that each message was generated by Backblaze B2 and not tampered with in transit.
  • Test Rule Functionality: Validate the functionality of your target endpoint by testing event notifications before deploying them into action. This allows you to ensure that your integration with your target endpoint is set up correctly and functioning as expected.
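As a sketch of how the signed-message capability might be validated on the receiving side: the snippet below assumes an HMAC-SHA256 hex digest computed over the raw request body, which is a common webhook-signing pattern—verify the actual algorithm and header format in the Backblaze B2 documentation before relying on it.

```python
import hashlib
import hmac

def verify_signature(body: bytes, received_sig: str, signing_secret: str) -> bool:
    """Return True if received_sig matches an HMAC-SHA256 digest of body.

    Assumes the signature is a lowercase hex digest; the real Backblaze
    scheme may differ (e.g., a versioned prefix in the header value).
    """
    expected = hmac.new(signing_secret.encode(), body, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking timing information to an attacker
    return hmac.compare_digest(expected, received_sig)
```

Rejecting any message that fails this check ensures your endpoint only acts on notifications that were genuinely generated by the storage service and not tampered with in transit.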

Want to Learn More About Event Notifications?

Event Notifications represents a significant advancement in data management and automation for Backblaze B2 users. By providing a flexible and powerful capability for orchestrating data processing workflows, Backblaze continues to empower businesses to unlock the full potential of their data with ease and efficiency.

Join the Waitlist ➔ 

The post Automate Your Data Workflows With Backblaze B2 Event Notifications appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Join Backblaze Tech Talks at NAB 24

Post Syndicated from James Flores original https://backblazeprod.wpenginepowered.com/blog/join-backblaze-tech-talks-at-nab-24/

A decorative image showing a film strip flowing into a cloud with the Backblaze and NAB Show logos displayed.

For those of you attending NAB 2024 (coming up in Las Vegas from April 14–17), we’re excited to invite you to our Backblaze Tech Talk series in booth SL7077. This series will deliver insights from expert guest speakers from a range of media workflow service providers in conversation with Backblaze solution engineers. Whether you’re an experienced workflow architect or new to the industry, anyone attending will leave with actionable insights to improve their own media workflows. 

All presentations are free, open to attendees, and will be held in the Backblaze booth (SL7077). Bonus: Get your badge scanned while you’re there for exclusive Backblaze swag.

Sunday, April 14:

  • 3:00 p.m.: Leslie Hathaway, Sales Engineer and Brian Scheffler, Pre-Sales Sys. Engineer at Quantum discuss AI tools, CatDV Classic & .io utilizing Backblaze for primary storage.  

Monday, April 15:

  • 10:00 a.m.: Helge Høibraaten, Co-Founder of CuttingRoom presents “Cloud-Powered Remote Production: Collaborative Video Editing on the Back of Backblaze.”
  • 11:00 a.m.: Mattia Varriale, Sales Director EMEA at Backlight presents “Optimizing Media Workflow: Leveraging iconik and Backblaze for Cost-Effective, Searchable Storage.”
  • 1:00 p.m.: Danny Peters, VP of Business Development, Americas at ELEMENTS presents “Bridging On-Premises and Cloud Workflows: The ELEMENTS Media Ecosystem.”
  • 2:00 p.m.: Sam Bogoch, CEO at Axle AI with a new product announcement that is Powered by Backblaze.
  • 3:00 p.m.: Greg Hollick, Chief Product Officer and Co-Founder at CloudSoda presents “Effortless Integration: Automating Media Assets into Backblaze with CloudSoda.”

Tuesday, April 16:

  • 10:00 a.m.: Raul Vecchione, from Product Marketing at bunny.net presents “Edge Computing—Just Smarter.”
  • 11:00 a.m.: Paul Matthijs Lombert, CEO at Hedge presents “Every Cloud Workflow Starts at the (H)edge.”    
  • 1:00 p.m.: Craig Hering, Co-Founder & CEO of Suite Studios presents “Suite Studios and Backblaze Integration Providing Direct Access to Your Data for Real-Time Editing and Archive.”
  • 2:00 p.m.: Murad Mordukhay, CEO of Qencode presents “Building an Efficient Content Repository With Backblaze.”

Don’t miss out on these great tech talks. Elevate your expertise and connect with fellow media industry leaders. We look forward to seeing you at NAB! And, if you’re ready to sit down and take a deep dive into your storage needs, book a meeting here.

The post Join Backblaze Tech Talks at NAB 24 appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Managing the Media Tidal Wave: Backlight iconik’s 2024 Media Report

Post Syndicated from Jennifer Newman original https://backblazeprod.wpenginepowered.com/blog/managing-the-media-tidal-wave-backlight-iconiks-2024-media-report/

A decorative image showing several types of files falling into a cloud with the Backblaze logo on it.

Everyone knows we’re living through an exceptional time when it comes to media production: Every day we experience a tidal wave of content—social video, virtual reality (VR) and augmented reality (AR) gaming, 10K sports footage, every streaming option imaginable—crashing down on us.

Two eye-popping stats underscore that this perception is real: In December, Netflix shared that its users streamed close to 100 billion hours of content on its platform during the first half of 2023. At the beginning of 2024, YouTube revealed that its users watch one billion hours of video daily.

It’s hard to make sense of that volume of content, and it’s even harder to understand how it’s produced. Imagine the armies of people and types of programs required to capture, ingest, transcode, store, tag, edit, distribute, and archive all of it. Managing that content means touching every stage, start to finish, of the production process through the data lifecycle.

A photo showing a woman surrounded by glistening raindrops, some of which transform into media icons.

To further complicate things, the modern production person’s workflow is nothing close to linear. They have to deal with:

  • A diversity of inputs: Content is pouring in from various sources. Keeping track of all this content, ensuring its quality, and organizing it effectively can feel like trying to catch a waterfall in a teacup. 
  • A variety of formats and file types: Each platform or device may require a different format or resolution, making it difficult to maintain consistency across the board and adding another layer of complexity.
  • Managing metadata, or indexing and tagging: With so much content flying around, ensuring that each file is properly tagged with relevant information is crucial for easy retrieval and management. However, manually inputting metadata for every piece of media can be time-consuming and prone to errors.
  • Remote and in-person collaboration: With team members spread across different locations, coordinating efforts and maintaining version control can be a headache.
  • Storage and scalability: As the volume of media grows, so does the need for storage space. Finding a solution that can accommodate this growth without breaking the bank can be tough.

While many companies are jumping in to provide tools to manage this tidal wave of content, one company, Backlight iconik, differentiates itself by providing industry-leading tools and offering public media reports on the state of media data today.

Backlight iconik’s 2024 Media Stats Report

Since 2018, iconik has provided a cloud-based media asset management (MAM) tool to help production professionals tame the insanity of modern content development. For the past four years, they’ve also provided an annual Media Stats report to help the industry understand the type of media being developed and distributed, as well as where and how it’s stored. (In 2022, Backlight, a global media technology company, acquired iconik, hence the name change.) If you want the full story, please check out Backlight iconik’s 2024 Media Stats Report.

As cloud storage specialists here at Backblaze (and lovers of stats ourselves), we would like to dig into their stats on storage and offer our own take here for you today.

2024 Media Storage Trend: The Content Tidal Wave Is Not a Mirage

According to Backlight iconik’s Media Stats Report, iconik’s data exploded to 152PB, shooting up by a whopping 57%—that’s 53PB more than in 2023. To put it in perspective, that’s roughly 6TB of fresh data pouring in every hour. This surge in data can be attributed to both new customers integrating their archives with iconik and existing customers ramping up their usage.

2024 Media Storage Trend: Audio on the Rise

An interesting finding in their study was the difference between audio and video asset growth. Iconik is now managing 328 years of video (up 41% YoY) and 208 years of audio (up 50% YoY).

A decorative display of media stats from iconik's media report. The title reads Asset Duration. On the left, a blue circle displays the words 328 years of video, which represents 41% growth year on year. On the right, an orange circle contains the words 208 years of audio, which represents 50% growth year on year.
Note that “duration” in this context is measuring the total hours of runtime for each file. Source.

Over the last year, the growth of audio assets managed by Backlight iconik has surged, surpassing that of video, with a staggering 1,700 hours being added daily. They believe this surge is closely tied to the remarkable expansion of both the podcasting and audiobook markets in recent years. The global podcast market ballooned to $17.9 billion in 2023 and is forecasted to soar to $144 billion by 2032. Similarly, the audiobook market is projected to reach $35.05 billion in revenue by 2030. While audio files are far smaller than video files, it’s reasonable to anticipate a continued upward trajectory for audio assets across the media and entertainment landscape.

2024 Trend: The Shift to Cloud Storage for Cost-Effective Storage and Collaboration

According to Backlight iconik’s 2024 Media Stats Report, the trend toward cloud storage is definitely on the rise, as increased competition in the market and a move away from hyperscalers drive more reasonable pricing. Companies are opting to transition to the cloud at their own speed, and hybrid cloud setups give them the freedom to shift assets as needed to improve things like performance, ease of access, security, and meeting regulatory requirements.

Get Ahead of the Wave: Pair iconik With Modern Cloud Storage Today

The reasons so many media professionals are moving to cloud are relatively simple: Cloud workflows enable enhanced collaboration and flexibility, greater cost predictability, and heightened security and management capabilities. And often, all of the above is possible at a lower total cost than legacy solutions.

Pairing Backlight iconik and Backblaze provides a simple solution for users to manage, collaborate, and store media projects easily. By integrating with iconik, Backblaze boosts workflow effectiveness, delivering a strong cloud-based MAM system that allows thorough management of Backblaze B2 Cloud Storage data right from a web browser.

Customer Story: How One Streaming Company Tackled Remote Content Workflows With Backblaze and Backlight iconik

When Goalcast, the empowering media company, decided to dive into making their own content, they realized their current setup just wasn’t cutting it. With their team spread out all over the place, they needed an easy way to get footage, access videos from anywhere, and keep a stash of finished files ready to jazz up for YouTube, Facebook, Instagram, Snapchat, TikTok, and the Goalcast OTT app. 

Goalcast combined LucidLink’s cloud-based workflows, iconik’s media asset manager and uploader features, and Backblaze B2 Cloud Storage. The integration between iconik, LucidLink, and Backblaze creates a slick media workflow. The content crew uploads raw footage straight into iconik, tossing in key details. Original files zip into Goalcast’s Backblaze B2 Bucket automatically, while edited versions are up for grabs via LucidLink. After the editing magic, final assets kick back into Backblaze B2.

The integration and partnership means endless possibilities for Goalcast. They’re saving around 150 hours a month in grunt work and stress. Now, they don’t have to fret about where footage hides or how to snag it—it’s all securely stored in Backblaze, ready for anyone on the team to grab, no matter where they’re working from.

You can get lost in the weeds of tech companies and storage solutions. It can hurt your brain. The sweet spot is these three—iconik, LucidLink, and Backblaze—and how they work together.

—Dan Schiffmacher, Production Operations Manager, Goalcast

2024 Media Mega Trend

Looking at Backlight iconik’s numbers and forecasts from a 25,000-foot vantage point makes one thing painfully clear: Effective media management and storage are going to be absolutely crucial for media teams to succeed in the future landscape of production. Dive deeper into how Backblaze and Backlight iconik can support you now and down the road, ensuring seamless media management and affordable storage that enable easy, stress-free expansion as your data continues to grow.

Already have iconik and want to get started with Backblaze? Click here. 

The post Managing the Media Tidal Wave: Backlight iconik’s 2024 Media Report appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

A Disaster Recovery Game Plan for Media Teams

Post Syndicated from James Flores original https://www.backblaze.com/blog/a-disaster-recovery-game-plan-for-media-teams/

A decorative image showing icons representing file types surrounding a cloud. The background has sports imagery incorporated.

When it comes to content creation, every second you can spend honing your craft counts. Which means things like disaster recovery planning are often overlooked—they’re tasks that easily get bumped to the bottom of every to-do list. Yet, the consequences of data loss or downtime can be huge, affecting everything from marketing strategy to viewer engagement. 

For years, LTO tape has been a staple in disaster recovery (DR) plans for media teams that focus on everything from sports teams to broadcast news to TV and film production. Using an on-premises network attached storage (NAS) backed up to LTO tapes stored on-site, occasionally with a second copy off-site, is the de facto DR strategy for many. And while your off-site backup may be in a different physical location, more often than not, it’s the same city and still vulnerable to some of the same threats.  

As in all areas of business, the key to a successful DR plan is preparation. Having a solid DR plan in place can be the difference between bouncing back swiftly or facing downtime. Today, I’ll lay out some of the challenges media teams face with disaster recovery and share some of the most cost-effective and time-efficient solutions.

Disaster Recovery Challenges

Let’s dive into some potential issues media teams face when it comes to disaster recovery.

Insufficient Resources

It’s easy to deprioritize disaster recovery when you’re facing budgetary constraints. You’re often faced with a trade-off: protect your data assets or invest in creating more. You might have limited NAS or LTO capacity, so you’re constantly evaluating what is worthy of protecting. Beyond cost, you might also be facing space limitations where investing in more infrastructure means not just shouldering the price of new tapes or drives, but also building out space to house them.

Simplicity vs. Comprehensive Coverage: Keeping Up With Scale

We’ve all heard the saying “keep it simple, stupid.” But sometimes you sacrifice adequate coverage for the sake of simplicity. Maybe you established a disaster recovery plan early on, but haven’t revisited it as your team scaled. Broadcasting and media management can quickly become complex, involving multiple departments, facilities, and stakeholders. If you haven’t revisited your plan, you may have gaps in your readiness to respond to threats. 

As media teams grow and evolve, their disaster recovery needs may also change, meaning disaster recovery backups should be easy, automated, and geographically distributed.

The LTO Fallacy

No matter how well documented your processes may be, it’s inevitable that any process that requires a physical component is subject to human error. And managing LTO tapes is nothing if not a physical process. You’re manually inserting LTO tapes into an LTO deck to perform a backup. You’re then physically placing that tape and its replicas in the correct location in your library. These processes have a considerable margin of error; any deviation from an established procedure compromises the recovery process.

Additionally, LTO components—the decks and the tapes themselves—age like any other piece of equipment. And ensuring that all appropriate staff members are adequately trained and aware of any nuances of the LTO system becomes crucial in understanding the recovery process. Achieving consistent training across all levels of the organization and maintaining hardware can be challenging, leading to gaps in preparedness.

Embracing Cloud Readiness

As a media team faced with the challenges outlined above, you need solutions. Enter cloud readiness. Cloud-based storage offers unparalleled scalability, flexibility, and reliability, making it ideal for safeguarding media for teams large and small. By leveraging the power of the cloud, media teams can ensure seamless access to vital information from any location, at any time. Whether it’s raw footage, game footage, or final assets, cloud storage enables rapid recovery and minimal disruption in the event of a disaster.

Cloud Storage Considerations for Media Teams

Migrating to a cloud-based disaster recovery model requires careful planning and consideration. Here are some key factors for media teams to keep in mind:

  1. Data Security: Content security is becoming more and more of a top priority with many in the media space concerned about footage leakage and the growing monetization of archival content. Ensure your cloud provider employs robust security measures like encryption, and verify compliance with industry standards to maintain data privacy, especially if your media content involves sensitive or confidential information. 
  2. Cost Efficiency: Given the cost of NAS servers, LTO tapes, and external hard drives, scaling on-premises solutions indefinitely is not always the best solution. Extending your storage to the cloud makes scaling easy, but it’s not without its own set of considerations. Evaluate the cost structure of different cloud providers, considering factors like storage capacity, data transfer costs, and retention minimums.  
  3. Geospatial Redundancy: Driving LTO tapes to different locations or even shipping them to secure sites can become a logistical nightmare. When data is stored in the cloud, it not only can be accessed from anywhere but the replication of that data across geographic locations can be automated. Consider the geographical locations of the cloud servers to ensure optimal accessibility for your team, minimizing latency and providing a smooth user experience.
  4. Interoperability: With data securely stored in the cloud it becomes instantly accessible to not only users but across different systems, platforms, and applications. This facilitates interoperability with applications like cloud media asset managers (MAMs) or cloud editing solutions and even simplifies media distribution. When choosing a cloud provider, consider APIs and third-party integrations that might enhance the functionality of your media production environment. 
  5. Testing and Training: Testing and training are paramount in disaster recovery to ensure a swift and effective response when crises strike. Rigorous testing identifies vulnerabilities, fine-tunes procedures, and validates recovery strategies. Simulated scenarios enable teams to practice and refine their roles, enhancing coordination and readiness. Regular training instills confidence and competence, reducing downtime during actual disasters. By prioritizing testing and training, your media team can bolster resilience, safeguard critical data, and increase the likelihood of a seamless recovery in the face of unforeseen disasters.

Cloud Backup in Action

For Trailblazer Studios, a leading media production company, satisfying internal and external backup requirements led to a complex and costly manual system of LTO tape and spinning disk drive redundancies. They utilized Backblaze’s cloud storage to streamline their data recovery processes and enhance their overall workflow efficiency.

Backblaze is our off-site production backup. The hope is that we never need to use it, but it gives us peace of mind.

—Kevin Shattuck, Systems Administrator, Trailblazer Studios

The Road Ahead

As media continues to embrace digital transformation, the need for robust disaster recovery solutions has never been greater. By transitioning away from on-premises solutions like LTO tape and embracing cloud readiness, organizations can future-proof their operations and ensure uninterrupted production. And, while cloud readiness creates a more secure foundation for disaster recovery, having data in the cloud also creates a pathway into the future: teams can take advantage of a wave of cloud tools designed to foster productivity and efficiency.

With the right strategy in place, media teams can turn potential disasters into mere setbacks while taking advantage of their new cloud-centric posture to maintain their competitive edge.

The post A Disaster Recovery Game Plan for Media Teams appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

How to Run AI/ML Workloads on CoreWeave + Backblaze

Post Syndicated from Pat Patterson original https://www.backblaze.com/blog/how-to-run-ai-ml-workloads-on-coreweave-backblaze/

A decorative image showing the Backblaze and CoreWeave logos superimposed on clouds.

Backblaze compute partner CoreWeave is a specialized GPU cloud provider designed to power use cases such as AI/ML, graphics, and rendering up to 35x faster and for 80% less than generalized public clouds. Brandon Jacobs, an infrastructure architect at CoreWeave, joined us earlier this year for Backblaze Tech Day ‘23. Brandon and I co-presented a session explaining both how to backup CoreWeave Cloud storage volumes to Backblaze B2 Cloud Storage and how to load a model from Backblaze B2 into the CoreWeave Cloud inference stack.

Since we recently published an article covering the backup process, in this blog post I’ll focus on loading a large language model (LLM) directly from Backblaze B2 into CoreWeave Cloud.

Below is the session recording from Tech Day; feel free to watch it instead of, or in addition to, reading this article.

More About CoreWeave

In the Tech Day session, Brandon covered the two sides of CoreWeave Cloud: 

  1. Model training and fine tuning. 
  2. The inference service. 

To maximize performance, CoreWeave provides a fully-managed Kubernetes environment running on bare metal, with no hypervisors between your containers and the hardware.

CoreWeave provides a range of storage options: storage volumes that can be directly mounted into Kubernetes pods as block storage or a shared file system, running on solid state drives (SSDs) or hard disk drives (HDDs), as well as their own native S3 compatible object storage. Knowing that, you’re probably wondering, “Why bother with Backblaze B2, when CoreWeave has their own object storage?”

The answer echoes the first few words of this blog post—CoreWeave’s object storage is a specialized implementation, co-located with their GPU compute infrastructure, with high-bandwidth networking and caching. Backblaze B2, in contrast, is general purpose cloud object storage, and includes features, such as Object Lock and lifecycle rules, that are not as relevant to CoreWeave’s specialized use case. There is also a price differential. Currently, at $6/TB/month, Backblaze B2 is one-fifth of the cost of CoreWeave’s object storage.

So, as Brandon and I explained in the session, CoreWeave’s native storage is a great choice for both the training and inference use cases, where you need the fastest possible access to data, while Backblaze B2 shines as longer-term storage for training, model, and inference data as well as the destination for data output from the inference process. In addition, since Backblaze and CoreWeave are bandwidth partners, you can transfer data between our two clouds with no egress fees, freeing you from unpredictable data transfer costs.

Loading an LLM From Backblaze B2

To demonstrate how to load an archived model from Backblaze B2, I used CoreWeave’s GPT-2 sample. GPT-2 is an earlier version of the GPT-3.5 and GPT-4 LLMs used in ChatGPT. As such, it’s an accessible way to get started with LLMs, but, as you’ll see, it certainly doesn’t pass the Turing test!

This sample comprises two applications: a transformer and a predictor. The transformer implements a REST API, handling incoming prompt requests from client apps, encoding each prompt into a tensor, which the transformer passes to the predictor. The predictor applies the GPT-2 model to the input tensor, returning an output tensor to the transformer for decoding into text that is returned to the client app. The two applications have different hardware requirements—the predictor needs a GPU, while the transformer is satisfied with just a CPU, so they are configured as separate Kubernetes pods, and can be scaled up and down independently.
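To make the request flow concrete, here is a sketch of what a client call to the transformer might look like. The `serving.kubeflow.org` API version used by the sample suggests the KFServing v1 protocol, whose predict requests take the form `{"instances": [...]}` at a `:predict` path—but the exact path and payload shape for this particular sample are assumptions; check the sample’s README.

```python
import json

def build_predict_request(model_name: str, prompt: str) -> tuple[str, bytes]:
    """Build a KFServing-v1-style predict request as (path, JSON body).

    The ":predict" path and {"instances": [...]} envelope follow the
    KFServing v1 convention; the GPT-2 sample's transformer is assumed
    to accept one raw prompt string per instance.
    """
    path = f"/v1/models/{model_name}:predict"
    body = json.dumps({"instances": [prompt]}).encode()
    return path, body

# A client would then send it with something like:
#   requests.post(base_url + path, data=body,
#                 headers={"Content-Type": "application/json"})
path, body = build_predict_request("gpt-s3", "That was easy")
```

The transformer decodes the response tensor from the predictor back into text, so the client only ever deals with this simple JSON envelope.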

Since the GPT-2 sample includes instructions for loading data from Amazon S3, and Backblaze B2 features an S3 compatible API, it was a snap to modify the sample to load data from a Backblaze B2 Bucket. In fact, there was just a single line to change, in the s3-secret.yaml configuration file. The file is only 10 lines long, so here it is in its entirety:

apiVersion: v1
kind: Secret
metadata:
  name: s3-secret
  annotations:
    serving.kubeflow.org/s3-endpoint: s3.us-west-004.backblazeb2.com
type: Opaque
data:
  AWS_ACCESS_KEY_ID: <my-backblaze-b2-application-key-id>
  AWS_SECRET_ACCESS_KEY: <my-backblaze-b2-application-key>

As you can see, all I had to do was set the serving.kubeflow.org/s3-endpoint metadata annotation to my Backblaze B2 Bucket’s endpoint and paste in an application key and its ID.
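One wrinkle the YAML doesn’t show: values under `data` in a Kubernetes Secret must be base64-encoded, not pasted in as plain text. A quick way to produce them (the key values below are made-up placeholders, not real credentials):

```shell
# Kubernetes Secret `data` values are base64-encoded.
# These two values are hypothetical placeholders, not real B2 credentials.
B2_KEY_ID="0045f0f4f01d2b40000000001"
B2_APP_KEY="K004abcdefghijklmnopqrstuvwxyz01234"

# Encode each value, then paste the output into s3-secret.yaml
echo -n "$B2_KEY_ID" | base64
echo -n "$B2_APP_KEY" | base64
```

Alternatively, `kubectl create secret generic` with `--from-literal` handles the encoding for you, though you’d still need to add the endpoint annotation to the resulting Secret.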

While that was the only Backblaze B2-specific edit, I did have to configure the bucket and path where my model was stored. Here’s an excerpt from gpt-s3-inferenceservice.yaml, which configures the inference service itself:

apiVersion: serving.kubeflow.org/v1alpha2
kind: InferenceService
metadata:
  name: gpt-s3
  annotations:
    # Target concurrency of 4 active requests to each container
    autoscaling.knative.dev/target: "4"
    serving.kubeflow.org/gke-accelerator: Tesla_V100
spec:
  default:
    predictor:
      minReplicas: 0 # Allow scale to zero
      maxReplicas: 2 
      serviceAccountName: s3-sa # The B2 credentials are retrieved from the service account
      tensorflow:
        # B2 bucket and path where the model is stored
        storageUri: s3://<my-bucket>/model-storage/124M/
        runtimeVersion: "1.14.0-gpu"
        ...

Aside from the storageUri configuration, you can see how the predictor application’s pod is configured to scale between zero and two instances (“replicas” in Kubernetes terminology). The remainder of the file contains the transformer pod configuration, allowing it to scale from zero to a single instance.

Running an LLM on CoreWeave Cloud

Spinning up the inference service involved a kubectl apply command for each configuration file and a short wait for the CoreWeave GPU cloud to bring up the compute and networking infrastructure. Once the predictor and transformer services were ready, I used curl to submit my first prompt to the transformer endpoint:

% curl -d '{"instances": ["That was easy"]}' http://gpt-s3-transformer-default.tenant-dead0a.knative.chi.coreweave.com/v1/models/gpt-s3:predict
{"predictions": ["That was easy for some people, it's just impossible for me,\" Davis said. \"I'm still trying to" ]}

In the video, I repeated the exercise, feeding GPT-2’s response back into it as a prompt a few times to generate a few paragraphs of text. Here’s what it came up with:

“That was easy: If I had a friend who could take care of my dad for the rest of his life, I would’ve known. If I had a friend who could take care of my kid. He would’ve been better for him than if I had to rely on him for everything.

The problem is, no one is perfect. There are always more people to be around than we think. No one cares what anyone in those parts of Britain believes,

The other problem is that every decision the people we’re trying to help aren’t really theirs. If you have to choose what to do”

If you’ve used ChatGPT, you’ll recognize how far LLMs have come since GPT-2’s release in 2019!

Run Your Own Large Language Model

While CoreWeave’s GPT-2 sample is an excellent introduction to the world of LLMs, it’s a bit limited. If you’re looking to get deeper into generative AI, another sample, Fine-tune Large Language Models with CoreWeave Cloud, shows how to fine-tune a model from the more recent EleutherAI Pythia suite.

CoreWeave is a specialized GPU cloud designed to deliver best-in-class performance that is up to 35x faster and 80% less expensive than generalized public clouds, making it a great choice for workloads such as AI, ML, rendering, and more. And, as you’ve seen in this blog post, it’s easy to integrate with Backblaze B2 Cloud Storage, with no data transfer costs. For more information, contact the CoreWeave team.

The post How to Run AI/ML Workloads on CoreWeave + Backblaze appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Free Your Premiere Pro Workflows With Backblaze Cloud Storage

Post Syndicated from James Flores original https://www.backblaze.com/blog/free-your-premiere-pro-workflows-with-backblaze-cloud-storage/

A decorative image showing a mockup of Premiere Pro's user interface and the Backblaze storage cloud.

Projects and technologies come and go, and with each new tool comes new workflow changes. But changing the way you move media around can be tough. Maybe you’ve always done things a certain way, and using a new tool feels like too much of a learning curve, especially when you’re pressed for time. But the way you’ve always done things isn’t always the best, easiest, or fastest way. Sometimes you need to change the status quo to level up your media operations.

As a freelance editor, I worked on a recent project that presented some media storage challenges that demanded new approaches, challenges you might also be facing. I solved them with the cloud—but not an all-in-one cloud. My solution was a mix of cloud tools, including Adobe Premiere Pro, which gives me customization and flexibility—the best of all worlds in media workflows.

Right Opportunity at the Right Time

Last year I had the opportunity to serve as a digital imaging technician (DIT) on the set of an indie film titled “Vengeance” produced by Falcon Pictures. The role of a DIT can vary. In many instances you’re simply a data wrangler making backups of the data being shot. In others, you work in the color space of the project creating color corrected dailies on set. For “Vengeance”, I was mostly data wrangling. 

“Vengeance” was an 11-day shoot in the mountains of Northern California near Bass Lake. While the rest of the crew spent their days hiking around with equipment, I was stationed back at home base with my DIT cart. With a lot of free time, I found myself logging data as it came in. Logging clip names soon turned into organizing bins and prepping the project for editing. And, while I was not the editor on the project, I was happy to help edit while I was on set. 

The Challenge

A few months after my work as DIT ended, it became clear that “Vengeance” needed a boost in post-production. The editing was a bit stuck—they had no assistant editor to complete logging and to sound sync all the footage. So, I was asked to help out. The only problem: I needed to be able to share my work with another editor who lived 45 miles away.

A screenshot of an indie film, Vengeance, being edited in Adobe Premiere Pro.
Editing “Vengeance” in Adobe Premiere Pro.

Evaluating the World of Workflows and Cloud Tools

So we began to evaluate a few different solutions. It was clear that Adobe Premiere Pro would be used, but data storage was still a big question. We debated a few methods for sharing media:

  1. The traditional route: Sharing a studio. With the other editor 45 miles away, commuting and scheduling time with each other was going to be cumbersome. 
  2. Email: We could email project files back and forth as we worked, but how would we keep track of versioning? Project bloat was a big concern. 
  3. Sharing a shuttle drive: What I’m calling “Sneakernet 2.0.” This is a popular method, but far from efficient. 
  4. Google Drive or Dropbox: Another popular option, but also one that comes with costs and service limitations like rate limiting. 

None of these options were great, so we went back to the drawing board. 

The Solution: A Hybrid Workflow Designed for Our Needs

To come to a final decision for this workflow, we made a list of our needs: 

  • The ability to share a Premiere Pro project file for updates. 
  • The ability to share media for the project. 
  • No exchanging external hard drives. 
  • No driving (a car).  
  • Changes need to be real time.

Based on those needs, here’s where we landed.

Sharing Project Files

Adobe recently released a new update to its Team Projects features within Premiere Pro. Team Projects allows you to host a Premiere Pro project in the Adobe cloud and share it with other Adobe Creative Cloud users. This gave us the flexibility to share a single project and share updates in real time. This means no emailing of project files, versioning issues, or bloated files. That left the issue of the media: How do we share it? 

Sharing Media Files

You may think that it would be obvious to share files in the Adobe Creative Cloud, where you get 100GB free. And while 100GB may be enough storage for .psd and .ai files, 100GB is nothing for video, especially when we are talking about RED (.r3d) files, which start off as approximately 4GB chunks and can quickly add up to terabytes of footage. 

So we put everything in a Backblaze B2 Bucket. All the .r3d source files went directly from my Synology network attached storage (NAS) into a Backblaze B2 Bucket using the Synology Cloud Sync tool. In addition to the source files, I used Adobe Media Encoder to generate proxy files of all the .r3d files. This folder of proxy files also synced with Backblaze automatically. 

Making Changes in Real Time

What was great about this solution is that all of the uploading is done automatically via a seamless Backblaze + Synology integration, and the Premiere Pro Team Project had a slew of publish functions perfect for real-time updates. And because the project files and proxies are stored in the cloud, I could get to them from several computers. I spent time at my desktop PC logging and syncing footage, but was also able to move to my couch and do the same from my MacBook Pro. I never had to move hard drives around, copy projects files, or worry about version control.

The other editor was able to connect to my Backblaze B2 Bucket using Cyberduck, a cloud storage browser for Mac. Using Cyberduck, he was able to pull down all the proxy files I created and share any files that he created. So, we were synced for the entire duration of the project. 

Once the technology was configured, I was able to finish logging for “Vengeance”, sync all the sound, build out stringouts and assemblies, and even put together a rough cut of every scene for the entire movie, giving the post-production process the boost it needed.

A diagram showing how editors use Backblaze B2 Cloud Storage with Adobe Premiere Pro.

The Power of Centralized Storage for Media Workflows

Technology is constantly evolving, and, in most circumstances, technology makes how we work a lot easier. For years filmmakers have worked on projects by physically moving our source material, whether it was on film reels, tapes, or hard drives. The cloud changed all that.

The key to getting “Vengeance” through post-production was our centralized approach to file management. The files already existed in Backblaze; we simply brought Premiere Pro to the data rather than moving the huge amount of files to Premiere Pro via the Creative Cloud. 

The mix of technologies let us create a customized workflow that worked for us. Creative Cloud had the benefit of providing a project sharing mechanism, and Backblaze provided a method of sharing media (via Synology and Cyberduck) regardless of the tooling each editor had. 

Once we hit picture lock, the centralized files will serve as a distribution point for VFX, color, and sound, making turnover a breeze. It can even be used as a distribution hub—check out how American Public Television uses Backblaze to distribute their finished assets. 

Centralizing in the cloud not only made it easy for me to work from home, it allowed us to collaborate on a project with ease, eliminating the overhead of driving, shuttle drive delivery (Sneakernet 2.0), and version control. The best part? A workflow like this is affordable for any size production and can be set up in minutes. 

Have you recently moved to a cloud workflow? Let us know what you’re using and how it went in the comments. 

The post Free Your Premiere Pro Workflows With Backblaze Cloud Storage appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

The Power of Specialized Cloud Providers: A Game Changer for SaaS Companies

Post Syndicated from Amrit Singh original https://www.backblaze.com/blog/the-power-of-specialized-cloud-providers-a-game-changer-for-saas-companies/

A decorative image showing a cloud with the Backblaze logo, then logos hanging off it for Vultr, Fastly, Equinix Metal, Terraform, and rclone.

“Nobody ever got fired for buying AWS.” It’s true: AWS’s one-size-fits-all solution worked great for most businesses, and those businesses made the shift away from the traditional model of on-prem and self-hosted servers—what we think of as Cloud 1.0—to an era where AWS was the cloud, the one and only, which is what we call Cloud 2.0. However, as the cloud landscape evolves, it’s time to question the old ways. Maybe nobody ever got fired for buying AWS, but these days, you can certainly get a lot of value (and kudos) for exploring other options. 

Developers and IT teams might hesitate when it comes to moving away from AWS, but AWS comes with risks, too. If you don’t have the resources to manage and maintain your infrastructure, costs can get out of control, for one. As we enter Cloud 3.0 where the landscape is defined by the open, multi-cloud internet, there is an emerging trend that is worth considering: the rise of specialized cloud providers.

Today, I’m sharing how software as a service (SaaS) startups and modern businesses can take advantage of these highly focused, tailored services, each specializing and excelling in specific areas like cloud storage, content delivery, cloud compute, and more. Building on a specialized stack offers more control, return on investment, and flexibility, while being able to achieve the same performance you expect from hyperscaler infrastructure.

From a cost of goods sold perspective, AWS pricing wasn’t a great fit. From an engineering perspective, we didn’t want a net-new platform. So the fact that we got both with Backblaze—a drop-in API replacement with a much better cost structure—it was just a no-brainer.

—Rory Petty, Co-Founder & CTO, Tribute

The Rise of Specialized Cloud Providers

Specialized providers—including content delivery networks (CDNs) like Fastly, bunny.net, and Cloudflare, as well as cloud compute providers like Vultr—offer services that focus on a particular area of the infrastructure stack. Rather than trying to be everything to everyone, like the hyperscalers of Cloud 2.0, they do one thing and do it really well. Customers get best-of-breed services that allow them to build a tech stack tailored to their needs. 

Use Cases for Specialized Cloud Providers

There are a number of businesses that might benefit from switching from hyperscalers to specialized cloud providers.

In order for businesses to take advantage of the benefits (since most applications rely on more than just one service), these services must work together seamlessly. 

Let’s Take a Closer Look at How Specialized Stacks Can Work For You

If you’re wondering how exactly specialized clouds can “play well with each other,” we ran a whole series of application storage webinars that talk through specific examples and use cases. I’ll share what’s in it for you below.

1. Low Latency Multi-Region Content Delivery with Fastly and Backblaze

Did you know a 100-millisecond delay in website load time can hurt conversion rates by 7%? In this session, Pat Patterson from Backblaze and Jim Bartos from Fastly discuss the importance of speed and latency in user experience. They highlight how Backblaze’s B2 Cloud Storage and Fastly’s content delivery network work together to deliver content quickly and efficiently across multiple regions. Businesses can ensure that their content is delivered with low latency, reducing delays and optimizing user experience regardless of the user’s location.

2. Scaling Media Delivery Workflows with bunny.net and Backblaze

Delivering content to your end users at scale can be challenging and costly. Users expect exceptional web and mobile experiences with snappy load times and zero buffering. Anything less than an instantaneous response may cause them to bounce. 

In this webinar, Pat Patterson demonstrates how to efficiently scale your content delivery workflows from content ingestion, transcoding, and storage to last-mile acceleration via bunny.net CDN. Pat demonstrates how to build a video hosting platform called “Cat Tube” and shows how to upload a video and play it using the HTML5 video element with controls. Watch below and download the demo code to try it yourself.

3. Balancing Cloud Cost and Performance with Fastly and Backblaze

With a global economic slowdown, IT and development teams are looking for ways to slash cloud budgets without compromising performance. E-commerce, SaaS platforms, and streaming applications all rely on high-performance infrastructure, but balancing bandwidth and storage costs can be challenging. In this 45-minute session, we explored how to recession-proof your growing business with key cloud optimization strategies, including ways to leverage Fastly’s CDN to balance bandwidth costs while avoiding performance tradeoffs.

4. Reducing Cloud OpEx Without Sacrificing Performance and Speed

Greg Hamer from Backblaze and DJ Johnson from Vultr explore the benefits of building on best-of-breed, specialized cloud stacks tailored to your business model, rather than being locked into traditional hyperscaler infrastructure. They cover real-world use cases, including:

  • How Can Stock Photo broke free from AWS and reduced their cloud bill by 55% while achieving 4x faster generation.
  • How Monument Labs launched a new cloud-based photo management service to 25,000+ users.
  • How Black.ai processes 1000s of files simultaneously, with a significant reduction of infrastructure costs.

5. Leveling Up a Global Gaming Platform while Slashing Cloud Spend by 85%

James Ross of Nodecraft, an online gaming platform that aims to make gaming online easy, shares how he moved his global game server platform from Amazon S3 to Backblaze B2 for greater flexibility and 85% savings on storage and egress. He discusses the challenges of managing large files over the public internet, which can result in expensive bandwidth costs. By storing game titles on Backblaze B2 and delivering them through Cloudflare’s CDN, they achieve reduced latency since games are cached at the edge, and pay zero egress fees thanks to the Bandwidth Alliance. Nodecraft also benefited from Universal Data Migration, which allows customers to move large amounts of data from any cloud services or on-premises storage to Backblaze’s B2 Cloud Storage, managed by Backblaze and free of charge.

Migrating From a Hyperscaler

Though it may seem daunting to transition from a hyperscaler to a specialized cloud provider, it doesn’t have to be. Many specialized providers offer tools and services to make the transition as smooth as possible. 

  • S3-compatible APIs, SDKs, CLI: Interface with storage as you would with Amazon S3—switching can be as easy as dropping in a new storage target.
  • Universal Data Migration: Free and fully managed migrations to make switching as seamless as possible.
  • Free egress: Move data freely with the Bandwidth Alliance and other partnerships between specialized cloud storage providers.

As the decision maker at your growing SaaS company, it’s worth considering whether a specialized cloud stack could be a better fit for your business. By doing so, you could potentially unlock cost savings, improve performance, and gain flexibility to adapt your services to your unique needs. The one-size-fits-all approach is no longer the only option out there. 

Want to Test It Out Yourself?

Take a proactive approach to cloud cost management: Get 10GB free to test and validate your proof of concept (POC) with Backblaze B2. All it takes is an email to get started.


The post The Power of Specialized Cloud Providers: A Game Changer for SaaS Companies appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Unlocking Media Collaboration: How to Use Hybrid Cloud to Boost Productivity

Post Syndicated from Vinodh Subramanian original https://www.backblaze.com/blog/unlocking-media-collaboration-how-to-use-hybrid-cloud-to-boost-productivity/

A decorative image showing a Synology NAS with various icons representing file types going up into a cloud with a Backblaze logo.

In today’s fast-paced media landscape, efficient collaboration is essential for success. With teams managing large files between geographically dispersed team members on tight deadlines, the need for a robust, flexible storage solution has never been greater. Hybrid cloud storage addresses this need by combining the power of on-premises solutions, like network attached storage (NAS) devices, with cloud storage, creating an ideal setup for enhanced productivity and seamless collaboration. 

In this post, I’ll walk you through some approaches for optimizing media workflows using hybrid cloud storage. You’ll learn how to unlock fast local storage, easy file sharing and collaboration, and enhanced data protection, which are all essential components for success in the media and entertainment industry. 

Plus, we’ll share specific media workflows for different types of collaboration scenarios and practical steps you can take to get started with your hybrid cloud approach today using Synology NAS and Backblaze B2 Cloud Storage as an example.

Common Challenges for Media Teams

Before we explore a hybrid cloud approach that combines NAS devices with cloud storage, let’s first take a look at some of the common challenges media teams face, including:

  • Data storage and accessibility.
  • File sharing and collaboration.
  • Security and data protection.

Data Storage and Accessibility Challenges

It’s no secret that recent data growth has been exponential. This is no different for media files. Cameras are creating larger and higher-quality files. There are more projects to shoot and edit. And editors and team members require immediate access to those files due to the high demand for fresh content.

File Sharing and Collaboration Challenges

Back in 2020, everyone was forced to go remote and the workforce changed. Now you can hire freelancers and vendors from around the world. This means you have to share assets with external contributors, and, in the past, this used to exclusively mean shipping hard drives to said vendors (and sometimes, it can still be necessary). Different contractors, freelancers, and consultants may use different tools and different processes.

Security and Data Protection Challenges

Data security poses unique challenges for media teams due to the industry’s specific requirements including managing large files, storing data on physical devices, and working with remote teams and external stakeholders. The need to protect sensitive information and intellectual property from data breaches, accidental deletions, and device failures adds complexity to data protection initiatives. 

How Does Hybrid Cloud Help Media Teams Solve These Challenges?

As a quick reminder, the hybrid cloud refers to a computing environment that combines the use of both private cloud and public cloud resources to achieve the benefits of each platform.

A private cloud is a dedicated and secure cloud infrastructure designed exclusively for a single tenant or organization. It offers a wide range of benefits to users. With NAS devices, organizations can enjoy centralized storage, ensuring all files are accessible in one location. Additionally, it offers fast local access to files that helps streamline workflows and productivity. 

The public cloud, on the other hand, is a shared cloud infrastructure provided by cloud storage companies like Backblaze. With public cloud, organizations can scale their infrastructure up or down as needed without the up-front capital costs associated with traditional on-premises infrastructure. 

By combining cloud storage with NAS, media teams can create a hybrid cloud solution that offers the best of both worlds. Private local storage on NAS offers fast access to large files while the public cloud securely stores those files in remote servers and keeps them accessible at a reasonable price.

How to Get Started With a Hybrid Cloud Approach

If you’d like to get started with a hybrid cloud approach, using NAS on-premises is an easy entry point. Here are a few tips to help you choose the right NAS device for your data storage and collaboration needs. 

  • Storage Requirements: Begin by assessing your data volume and growth rate to determine how much storage capacity you’ll need. This will help you decide the number of drives required to support your data growth. 
  • Compute Power: Evaluate the NAS device’s processor, controller, and memory to ensure it can handle the workloads and deliver the performance you need for running applications and accessing and sharing files.
  • Network Infrastructure: Consider the network bandwidth, speed, and port support offered by the NAS device. A device with faster network connectivity will improve data transfer rates, while multiple ports can facilitate the connection of additional devices.
  • Data Collaboration: Determine your requirements for remote access, sync direction, and security needs. Look for a NAS device that provides secure remote access options, and supports the desired sync direction (one-way or two-way) while offering data protection features such as encryption, user authentication, and access controls. 

By carefully reviewing these factors, you can choose a NAS device that meets your storage, performance, network, and security needs. If you’d like additional help choosing the right NAS device, download our complete NAS Buyer’s Guide. 

Download the Guide ➔

Real-World Examples: Using Synology NAS + Backblaze B2

Let’s explore a hybrid cloud use case. To discuss specific media workflows for different types of collaboration scenarios, we’re going to use Synology NAS as the private cloud and Backblaze B2 Cloud Storage as the public cloud as examples in the rest of this article. 

Scenario 1: Working With Distributed Teams Across Locations

In the first scenario, let’s assume your organization has two different locations with your teams working from both locations. Your video editors work in one office, while a separate editorial team responsible for final reviews operates from the second location. 

To facilitate seamless collaboration, you can install a Synology NAS device at both locations and connect them to Backblaze B2 using Cloud Sync. 

Here’s a video guide that demonstrates how to synchronize Synology NAS to Backblaze B2 using Cloud Sync.

This hybrid cloud setup allows for fast local access, easy file sharing, and real-time synchronization between the two locations, ensuring that any changes made at one site are automatically updated in the cloud and mirrored at the other site.

Scenario 2: Working With Distributed Teams Across Regions

In this second scenario, you have teams working on your projects from different regions, let’s say the U.S. and Europe. Downloading files from different parts of the world can be time-consuming, causing delays and impacting productivity. To solve this, you can use Backblaze B2 Cloud Replication. This allows you to replicate your data automatically from your source bucket (U.S. West) to a destination bucket (EU Central). 

Source files can be uploaded into a B2 Bucket in the U.S. West region. These files are then replicated to the EU Central region so you can move data closer to your team in Europe for faster access. Vendors and teams in Europe can configure their Synology NAS devices with Cloud Sync to automatically sync with the replicated files in the EU Central data center.

Scenario 3: Working With Freelancers

In both scenarios discussed so far, file exchanges can occur between different companies or within the same company across various regions of the world. However, not everyone has access to these resources. Freelancers make up a huge part of the media and entertainment workforce, and not every one of them has a Synology NAS device. 

But that’s not a problem! 

In this case, you can still use a Synology NAS to upload your project files and sync them with your Backblaze B2 Bucket. Instead of syncing to another NAS or replicating to a different region, freelancers can access the files in your Backblaze B2 Bucket using third-party tools like Cyberduck.

This approach allows anyone with an internet connection and the appropriate access keys to access the required files instantly without needing to have a NAS device.

Scenario 4: Working With Vendors

In this final scenario, which is similar to the first one, you collaborate with another company or vendor located elsewhere instead of working with your internal team. Both parties can install their own Synology NAS device at their respective locations, ensuring centralized access, fast local access, and easy file sharing and collaboration. 

The two NAS devices are then connected to a Backblaze B2 Bucket using Cloud Sync, allowing for seamless synchronization of files and data between the two companies.

Whenever changes are made to files by one company, the updated files are automatically synced to Backblaze B2 and subsequently to the other company’s Synology NAS device. This real-time synchronization ensures that both companies have access to the latest versions of the files, allowing for increased efficiency and collaboration. 

Making Hybrid Cloud Work for Your Production Team

As you can see, there are several different ways you can move your media files around and get them in the hands of the right people—be it another one of your offices, vendors, or freelancers. The four scenarios discussed here are just a few common media workflows. You may or may not have the same scenario. Regardless, a hybrid cloud approach provides you with all the tools you need to customize your workflow to best suit your media collaboration needs.

Ready to Get Started?

With Backblaze B2’s pre-built integration with Synology NAS’s Cloud Sync, getting started with your hybrid cloud approach using Synology NAS and Backblaze B2 is simple and straightforward. Check out our guide, or watch the video below as Pat Patterson, Backblaze Chief Technical Evangelist, walks through how to get your Synology NAS data into B2 Cloud Storage in under 10 minutes using Cloud Sync.

Your first step is creating an account.

In addition to Synology NAS, Backblaze B2 Cloud Storage integrates seamlessly with other NAS devices such as Asustor, Ctera, Dell Isilon, iOsafe, Morro Data, OWC JellyFish, Panzura, QNAP, TrueNAS, and more. Regardless of which NAS device you use, getting started with a hybrid cloud approach is simple and straightforward with Backblaze B2.

Hybrid Cloud Unlocks Collaboration and Productivity for Media Teams

Easing collaboration and boosting productivity in today’s fast-paced digital landscape is vital for media teams. By leveraging a hybrid cloud storage solution that combines the power of NAS devices with the flexibility of cloud storage, organizations can create an efficient, scalable, and secure solution for managing their media assets. 

This approach not only addresses storage capacity and accessibility challenges, but also simplifies file sharing and collaboration, while ensuring data protection and security. Whether you’re working with your own team across different locations, collaborating with external partners, or bringing on freelancers, a hybrid cloud solution offers a seamless, cost-effective, and high-performance way to optimize your media workflows and enhance productivity in the ever-evolving world of media and entertainment. 

We’d love to hear about other media workflow scenarios. Share how you collaborate with your media teams and vendors in the comments below.

The post Unlocking Media Collaboration: How to Use Hybrid Cloud to Boost Productivity appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Object Storage for Film, Video, and Content Creation

Post Syndicated from James Flores original https://www.backblaze.com/blog/object-storage-for-film-video-and-content-creation/

A decorative image showing icons representing drives and storage options superimposed on a cloud. A title reads: Object Storage for Media Workflows

Twenty years ago, who would have thought going to work would mean spending most of your time on a computer and running most of your applications through a web browser or a mobile app? Today, we can do everything remotely via the power of the internet—from email to gaming, from viewing our home security cameras to watching the latest and greatest movie trailers—and we all have opinions about the best browsers, too…

Along with that easy, remote access, a slew of new cloud technologies are fueling the tech we use day in and day out. To get to where we are today, the tech industry had to rethink some common understandings, especially around data storage and delivery. Gone are the days that you save a file on your laptop, then transport a copy of that file via USB drive or CD-ROM (or, dare we say, a floppy disk) so that you can keep working on it at the library or your office. And, those same common understandings are now being reckoned with in the world of film, video, and content creation.

In this post, I’ll dive into storage, specifically cloud object storage, and what it means for the future of content creation, not only for independent filmmakers and content creators, but also in post-production workflows.

The Evolution of File Management

If you are reading this blog you are probably familiar with a storage file system—think Windows Explorer, the Finder on Mac, or directory structures in Linux. You know how to create a folder, create files, move files, and delete folders. This same file structure has made its way into cloud services such as Google Drive, Box, and Dropbox. And many of these technologies have been adopted to store some of the largest content, namely media files like .mp4, .wav, or .r3d files.

But, as camera file outputs grow larger and larger and the amount of content generated by creative teams soars, folder structures get more and more complex. Why is this important?

Well, ask yourself: How much time have you spent searching for clips you know exist, but just can’t seem to find? Sure, you can use search tools to comb through your folder structure, but as you amass more and more content, that means searching for the proverbial needle in a haystack—naming conventions can only do so much, especially when you have dozens or hundreds of people adding raw footage, creating new versions, and so on.

Finding files in a complex file structure can take so much time that many of the aforementioned companies impose system limits preventing long searches. In addition, they may limit uploads and downloads, making it difficult to manage the terabytes of data a modern production creates. So, this all begs the question: Is a traditional file system really the best way to scale up, especially in data-heavy industries like filmmaking and video content creation? Enter: cloud object storage.

Refresher: What is Object Storage?

You can think of object storage as simply a big pool of storage space filled with object data. In the past we’ve defined object data as “some assemblage of data with one unique identifier and an infinite amount of metadata.” The three components that make up objects in object storage are key here. They include:

  1. Unique Identifier: Referred to as a universally unique identifier (UUID) or globally unique identifier (GUID), this is simply a long, machine-generated identifier.
  2. Infinite Metadata: Data about the data with endless possibilities.
  3. Data: The actual data we are storing.

So what does that actually mean?

It means each object (this can be any type of file—a .jpg, .mp4, .wav, .r3d, etc.) has an automatically generated unique identifier, which is just a number (e.g. 4_z6b84cf3535395), versus a folder structure path you must manually create and maintain (e.g. D:\Projects\JOB4548\Assets\RAW\A001\A001_3424OP.RDM\A001_34240KU.RDC\A001_A001_1005ku_001.R3D).

An image of a card catalog.
Interestingly enough, this is where metadata comes from.

It also means each object can have an infinite amount of metadata attached to it. Metadata, put simply, is a “tag” that identifies how the file is used or stored. There are several examples of metadata, but here are just a few:

  • Descriptive metadata, like the title or author.
  • Structural metadata, like how to order pages in a chapter.
  • Administrative metadata, like when the file was created, who has permissions to it, and so on.
  • Legal metadata, like who holds the copyright or if the file is in the public domain.

So, when you’re saying an image file is 400×400 pixels and in .jpg format, you’ve just identified two pieces of metadata about the file. In filmmaking, metadata can include things like reel numbers or descriptions. And, as artificial intelligence (AI) and machine learning tools continue to evolve, the amount of metadata about a given piece of footage or image only continues to grow. AI tools can add data around scene details, facial recognition, and other identifiers, and since those are coded as metadata, you will be able to store and search files using terms like “scenes with Bugs Bunny” or “scenes that are in a field of wildflowers”—and that means that you’ll spend less time trying to find the footage you need when you’re editing.

When you put it all together, you have one gigantic content pool that can grow infinitely. It uses no manually created complex folder structure and naming conventions. And it can hold an infinite amount of data about your data (metadata), making your files more discoverable.
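To make that ID-plus-metadata model concrete, here’s a minimal sketch in Python. This is an illustrative toy, not the Backblaze B2 API—the class, method, and metadata names are invented for the example:

```python
import uuid

class ToyObjectStore:
    """A minimal in-memory model of an object store:
    each object is (unique ID, metadata, data)."""

    def __init__(self):
        self._objects = {}  # object ID -> (metadata, data)

    def put(self, data, **metadata):
        """Store data with arbitrary metadata; return the generated ID."""
        object_id = uuid.uuid4().hex
        self._objects[object_id] = (dict(metadata), data)
        return object_id

    def get(self, object_id):
        """Return (metadata, data) for a stored object."""
        return self._objects[object_id]

    def search(self, **criteria):
        """Find the IDs of objects whose metadata matches all criteria."""
        return [
            oid
            for oid, (metadata, _) in self._objects.items()
            if all(metadata.get(k) == v for k, v in criteria.items())
        ]

store = ToyObjectStore()
clip = store.put(b"...raw footage bytes...",
                 reel="A001", scene="field of wildflowers", codec="R3D")
store.put(b"...more footage...", reel="A002", scene="city street", codec="R3D")

# Search by metadata instead of walking a folder tree:
print(store.search(scene="field of wildflowers"))  # the first clip's ID
```

The point to notice is that the search filters on metadata alone—no folder hierarchy or naming convention is involved, which is exactly what makes queries like “scenes in a field of wildflowers” workable at scale.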

Let’s Talk About Object Storage for Content Creation

You might be wondering: What does this have to do with the content I’m creating?

Consider this: When you’re editing a project, how much of your time is spent searching for files? A recent study by GISTICS found that the average creative person searches for media 83 times a week. Maybe you’re searching your local hard drive first, then your NAS, then those USB drives in your closet. Or, maybe you are restoring content off an LTO tape to search for that one clip you need. Or, maybe you moved some of your content to the cloud—is it in your Google Drive or in your Dropbox account? If so, which folder is it in? Or was it the corporate Box account? Do you have permissions to that folder? All of that complexity means that the average creative person fails to find the media they are looking for 35% of the time. But you probably don’t need a study to tell you that we all spend huge amounts of time searching for content.

An image showing a command line interface window with a failed search.
Good old “request timed out.”

Here is where object storage can help. With object storage, you simply have buckets (object storage containers) where all your data can live, and you can access it from wherever you’re working. That means all of the data stored on those shuttle drives sitting around your office, your closet of LTO tapes, and even a replica of your online NAS are in a central, easily accessible location. You’re also working from the most recent file.

Once it’s in the cloud, it’s safe from the types of disasters that affect on-premises storage systems, and it’s easy to secure your files, create backups, and so on. It’s also readily available when you need it, and much easier to share with other team members. It’s no wonder many of the apps you use today take advantage of object storage as their primary storage mechanism.

The Benefits of Object Storage for Media Workflows

Object storage offers a number of benefits for creative teams when it comes to streamlining workflows, including:

  • Instant access
  • Integrations
  • Workflow interoperability
  • Easy distribution
  • Off-site back up and archive

Instant Access

With cloud object storage, content is ready when you need it. You know inspiration can strike at any time. You could be knee deep in editing a project, in the middle of binge watching the latest limited series, or out for a walk. Whenever the inspiration decides to strike, having instant access to your library of content is a game changer. And that’s the great thing about object storage in the cloud: you gain access to massive amounts of data with a few clicks.

Integrations

Object storage is a key component of many of the content production tools in use today. For example, iconik is a cloud-native media asset management (MAM) tool that can gather and organize media from any storage location. You can point iconik to your Backblaze B2 Bucket and use its advanced search functions as well as its metadata tagging.

Workflow Interoperability

What if you don’t want to use iconik, specifically? What’s great about using cloud storage as a centralized repository is that no matter what application you use, your data is in a single place. Think of it like your external hard drive or NAS—you just connect that drive with a new tool, and you don’t have to worry about downloading everything to move to the latest and greatest. In essence, you are bringing your own storage (BYOS!).

Here’s an example: CuttingRoom is a cloud-native video editing and collaboration tool. It runs entirely in your web browser and lets you create unique stories that can instantly be published to your destination of choice. What’s great about CuttingRoom is its ability to read an object storage bucket as a source. By simply pointing CuttingRoom at a Backblaze B2 Bucket, you give it immediate access to the media source files and can get straight to editing. On the other hand, if you prefer using a MAM, that same bucket can be indexed by a tool like iconik.

Easy Distribution

Now that your edit is done, it’s time to distribute your content to the world. Or, perhaps you are working with other teams to perfect your color and sound, and it’s time to share your picture lock version. Cloud storage is ready for you to distribute your files to the next team or an end user.

Here’s a recent, real-world example: If you have been following the behind-the-scenes articles about the creation of Avatar: The Way of Water, you know that its creation not only sparked new technology like the Sony Venice camera with removable sensors, but its distribution also featured a cloud-centric workflow. Footage (the film) was placed in an object store (read: cloud storage), processed into different formats, different languages and 3D captions were added, and then the finished versions were distributed directly from that central location.

And, while not all of us have Jon Landau as our producer, a huge budget, and a decade to create our product, this same flexibility exists today with object storage—with the added bonus that it’s usually budget-friendly as well.

Off-Site Back Up and Archive

And last but certainly not least, let’s talk back up and archive. Once a project is done, you need space for the next project, but no one wants to risk losing the old project. Who out there is completely comfortable hitting the delete key as well as saying yes to the scary prompt, “Are you sure you want to delete?”

Well, that’s what you would have to do in the past. These days, object storage is a great place to store your terabytes and terabytes of archived footage without cluttering your home, office, or set with additional hardware. Compared with on-premises storage, cloud storage lets you add more capacity as you need it—just make sure you understand cloud storage pricing models so that you’re getting the best bang for your buck.

If you’re using a NAS device in your media workflow, you’ll find you need to free up your on-prem storage. Many NAS devices, like Synology and QNAP, have cloud storage integrations that allow you to automatically sync and archive data from your device to the cloud. In fact, you could start taking advantage of this today.

No delete key here—just a friendly archive button.
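Under the hood, these sync integrations boil down to a one-way comparison: hash each local file, check it against a manifest of what’s already in the cloud, and upload only what’s new or changed. The sketch below models that logic with Python’s standard library; `plan_sync` and the manifest format are hypothetical stand-ins, not Synology’s or QNAP’s actual implementation:

```python
import hashlib
import tempfile
from pathlib import Path

def file_digest(path: Path) -> str:
    """SHA-1 of a file's contents, read in 1 MiB chunks."""
    h = hashlib.sha1()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def plan_sync(local_dir: Path, remote_manifest: dict) -> list:
    """Relative paths that need uploading: files that are new locally,
    or whose contents differ from the archived copy.
    remote_manifest maps relative path -> content hash."""
    to_upload = []
    for path in sorted(local_dir.rglob("*")):
        if path.is_file():
            rel = path.relative_to(local_dir).as_posix()
            if remote_manifest.get(rel) != file_digest(path):
                to_upload.append(rel)
    return to_upload

# Demo: one clip already archived, one new clip still to upload.
with tempfile.TemporaryDirectory() as d:
    root = Path(d)
    (root / "A001.R3D").write_bytes(b"raw footage")
    (root / "A002.R3D").write_bytes(b"new footage")
    manifest = {"A001.R3D": file_digest(root / "A001.R3D")}
    print(plan_sync(root, manifest))  # ['A002.R3D']
```

Once an upload succeeds and its hash is verified against the manifest, the local copy can be safely removed—that’s the friendly archive button in practice.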

Getting Started With Object Storage for Media Workflows

Migrating to the cloud may seem daunting, but it doesn’t have to be. Especially with the acceleration of hybrid workflows in the film industry recently, cloud-based workflows are becoming more common and better integrated with the tools we use every day. You can test this out with Backblaze using your free 10GB that you get just for signing up for Backblaze B2. Sure, that may not seem like much when a single .r3d file is 4GB. But with that 10GB, you can test upload speeds and download speeds, try out integrations with your preferred workflow tools, and experiment with AI metadata. If your team is remote, you could try an integration with LucidLink. Or if you’re looking to power a video on-demand site, you could integrate with one of our content delivery network (CDN) partners to test out content distribution, like Backblaze customer Kanopy, a streaming service that delivers 25,000 videos to libraries worldwide.

Change is hard, but cloud storage can be easy. Check out all of our media workflow solutions and get started with your 10GB free today.

The post Object Storage for Film, Video, and Content Creation appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Virtual vs. Remote vs. Hybrid Production

Post Syndicated from James Flores original https://www.backblaze.com/blog/virtual-vs-remote-vs-hybrid-production/

A decorative image showing icons of a NAS device, a video with a superimposed play button, and a cloud with the Backblaze logo.

For many of us, 2020 transformed our work habits. Changes to the way we work that always seemed years away got rolled out within a few months. Fast forward to today, and the world seems to be returning back to some sense of normalcy. But one thing that’s not going back is how we work, especially for media production teams. Virtual production, remote video production, and hybrid cloud have all accelerated, reducing operating costs and moving us closer to a cloud-based reality.

So what’s the difference between virtual production, remote production, and hybrid cloud workflows, and how can you use any or all of those strategies to improve how you work? At first glance, they all seem to be different variations of the same thing. But there are important differences, and that’s what we’re digging into today. Read on to get an understanding of these new ways of working and what they mean for your creative team.

Going to NAB in April?

Want to talk about your production setup at NAB? Backblaze will be there with exciting new updates and hands-on demos for better media workflows. Oh, and we’re bringing some really hot swag. Reserve time to meet with our team (and snap up some sweet goodies) below.

➔ Meet Backblaze at NAB

What Is Virtual Production?

Let’s start with virtual production. It sounds like doing production virtually, which could just mean “in the cloud.” I can assure you, it’s way cooler than that. When the pandemic hit, social distancing became the norm. Gathering a film crew together in a studio or in any location of the world went out the door. Never fear: virtual production came to the rescue.

Virtual production is a method of production where, instead of building a set or going to a specific location, you build a set virtually, usually with a gaming engine such as Unreal Engine. Once the environment is designed and lit within Unreal Engine, it can then be fed to an LED volume. An LED volume is exactly what it sounds like: a huge volume of LED screens connected to a single input (the Unreal Engine environment).

With virtual production, your set becomes the LED volume, and Unreal Engine can change the background to anything you can imagine at the click of a button. Now this isn’t just a LED screen as a background—what makes virtual production so powerful is its motion tracking integration with real cameras.

Using a motion sensor system attached to a camera, Unreal Engine is able to understand where your camera is pointed. (It’s way more tech-y than that, but you get the picture.) You can even match the virtual lens in Unreal Engine with the lens of your physical camera. With the two systems combined, a camera following an actor on a virtual set can react by moving the background along with the camera in real time.

Virtual Production in Action

If you were one of the millions who have been watching The Mandalorian on Disney+, check out this behind-the-scenes look at how they utilized virtual production.


This also means location scouting can be done entirely inside the virtual set, and the assets created for pre-visualization can actually carry on into post, saving a ton of time (as the post work actually starts during pre-production).

So, virtual production is easily confused with remote production, but it’s not the same. We’ll get into remote production next.

What Is Remote Production?

We’re all familiar with the stages of production: development, pre-production, production, post-production, and distribution. Remote production has more to do with post-production. Remote production is simply the ability to handle post-production tasks from anywhere.

Here’s how the pandemic accelerated remote production: In post, assets are edited in non-linear editing (NLE) software connected to huge storage systems located deep within studios and post-production houses. When everyone was forced to work from home, it made editing quite difficult. There were, of course, solutions that allowed you to remotely control your edit bay, but remotely controlling a system from miles away and trying to scrub video over your at-home internet connection quickly became a nuisance.

To solve this problem, everyone just took their edit bay home, along with a hard drive containing what they needed for their particular project. But shuttling drives all over the place and trying to correlate files across all the remote drives meant that storage became the next headache. To resolve this confusion, production houses turned to hybrid solutions—our next topic.

What Are Hybrid Cloud Workflows?

Hybrid cloud workflows didn’t originate during the pandemic, but they did make remote production much easier. A hybrid cloud workflow is a combination of a public cloud, a private cloud, and an on-premises solution like a network attached storage (NAS) device or a storage area network (SAN). When we think about storage, we think first about the relationship of our NLE to our local hard drive, then about the relationship between the local computer and the NAS or SAN. The next iteration is the relationship of all of these (NLE, local computer, and NAS/SAN) to the cloud.

For each of these on-prem solutions, the primary problems are capacity and availability: How much can our drive hold, and how do I access the NAS—over the local area network (LAN) or a virtual private network (VPN)? Storage in the cloud inherently solves both problems: it’s always available and accessible from any location with an internet connection. So, to solve the problems that remote teams of editors, visual effects (VFX), color, and sound folks faced, the cloud was integrated into many workflows.

Using the cloud, companies are able to store content in a single location where it can then be distributed to different teams (VFX, color, sound, etc.). This central repository makes it possible to move large amounts of data across different regions, making it easier for your team to access it while also keeping it secure. Many NAS devices have native cloud integrations, so the automated file synchronization between the cloud and a local environment is baked in—teams can just get to work.

The hybrid solution worked so well that many studios and post houses have adopted it as a permanent part of their workflow and have incorporated remote production into their day-to-day. A good example is the video team at Hagerty, a production crew that creates 300+ videos a year. This means that workflows that were once locked down to specific locations are now moving to the cloud. Now more than ever, API-accessible resources, like cloud storage with S3-compatible APIs that integrate with your preferred tools, are needed to make these workflows actually work.

Just one example of Hagerty’s content.

Hybrid Workflows and Cloud Storage

While the world seems to be returning to a new normal, our way of work is not. For the media and entertainment world, the pandemic gave the space a jolt of electricity, igniting the next wave of innovation. Virtual production, remote production, and hybrid workflows are here to stay. What digital video started 20 years ago, the pandemic has accelerated, and that acceleration is pointing directly to the cloud.

So, what are your next steps as you future-proof your workflow? First, inspect your current set of tools. Many modern tools are already cloud-ready. For example, a Synology NAS already has Cloud Sync capabilities. EditShare also has a tool capable of crafting custom workflows, wherever your data lives. (These are just a few examples.)

Second, start building and testing. Most cloud providers offer free tiers or free trials—at Backblaze, your first 10GB are free, for example. Testing a proof of concept is the best way to understand how new workflows fit into your system without overhauling the whole thing or potentially disrupting business as usual.

And finally, one thing you definitely need to make hybrid workflows work is cloud storage. If you’re looking to make the change a lot easier, you came to the right place. Backblaze B2 Cloud Storage pairs with hundreds of integrations so you can implement it directly into your established workflows. Check out our partners and our media solutions for more.

The post Virtual vs. Remote vs. Hybrid Production appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Work Smarter With Backblaze and Quantum H4000 Essential

Post Syndicated from Jennifer Newman original https://www.backblaze.com/blog/work-smarter-with-backblaze-and-quantum-h4000-essential/

A decorative image displaying the Backblaze and Quantum logos.

How much do you think it costs your creative team to manage data? How much time and money is spent organizing files, searching for files, and maybe never finding those files? Have you ever quantified it? One market research firm has. According to GISTICS, a creative team of eight people wastes more than $85,000 per year searching for and managing files—and larger teams waste even more than that.

Creative teams need better tools to work smarter. Backblaze has partnered with Quantum to simplify media workflows, provide easy remote collaboration, and free up on-premises storage space with seamless content archiving to the cloud. The partnership provides teams the tools needed to compete. Read on to learn more.

What Is Quantum?

Quantum is a data storage company that provides technology, software, and services to help companies make video and other unstructured data smarter—so data works for them and not the other way around. Quantum’s H4000 Essential (H4000E) asset management and shared storage solution offers customers an all-in-one appliance that integrates automatic content indexing, search, discovery, and collaboration. It couples the CatDV asset management platform with Quantum’s StorNext 7 shared storage.

How Does This Partnership Benefit Joint Customers?

By pairing Quantum H4000 Essential with Backblaze B2 Cloud Storage, you get award-winning asset management and shared storage with the ability to archive to the cloud. The partnership provides a number of benefits:

  • Better organization: Creative teams work visually, and the Quantum platform supports visual workflows. All content is available in one place, with automatic content indexing, metadata tagging, and proxy generation.
  • Searchable assets: All content and projects are searchable in an easy to use visual catalog.
  • Seamless collaboration: Teams can use production tools like Adobe Premiere Pro, Final Cut Pro X, and others to work on shared projects as well as tagging, markup, versioning, chat, and approval tools to streamline collaboration.
  • Robust archive management: Archived content can be restored easily from Backblaze B2 to CatDV to keep work in progress lean and focused.
  • On-premises efficiency: Once projects are complete, they can be quickly archived to the cloud to free up storage space on the H4000E for high-resolution production files and ongoing projects.
  • Simplified billing: Data is stored on always-hot storage, eliminating the management frustration that comes with multiple tiers and variable costs for egress and API calls.

Purchase Cloud Capacity the Same Way You Purchase On-Premises

With Backblaze B2 Reserve, you can purchase capacity-based storage starting at 20TB to pair with your Quantum H4000E if you prefer a predictable cloud spend versus consumption-based billing. Key features of B2 Reserve include:

  • Free egress up to the amount of storage purchased per month.
  • Free transaction calls.
  • Enhanced migration services.
  • No delete penalties.
  • Upgraded Tera support.

Who Would Benefit From Backblaze B2 + Quantum H4000E?

The partnership benefits any team that handles large amounts of data, specifically media files. The solution can help teams with:

  • Simplifying media workflows.
  • Easing remote project management and collaboration.
  • Cloud tiering.
  • Extending on-premises storage.
  • Implementing a cloud-first strategy.
  • Backup and disaster recovery planning.
  • Ransomware protection.
  • Managing consistent data growth.

Getting Started With Backblaze B2 and Quantum H4000E

The Quantum H4000E is a highly-integrated solution for collaborative shared storage and asset management. Configured with Backblaze B2 for content archiving and retrieval, it provides new menu options to perform cloud archiving and move, copy, and restore content, freeing up H4000 local storage for high-resolution files. You can easily add on cloud storage to improve remote media workflows, content collaboration, media asset protection, and archive.

With the H4000E, everything you need to get started is in the box, ready to connect to your 10GbE or higher network. And, a simple Backblaze B2 archive plugin connects the solution directly to your Backblaze B2 account.

Simply create a Backblaze account and configure the Backblaze CatDV panel with your credentials.

Join Backblaze at NAB Las Vegas

Join us at NAB to learn more about the Quantum + Backblaze solution. Our booths are neighbors! Schedule a meeting with us for a demo.

The post Work Smarter With Backblaze and Quantum H4000 Essential appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.