All posts by Roderick Bauer

Hard Disk Drive (HDD) vs Solid State Drive (SSD): What’s the Diff?

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/hdd-versus-ssd-whats-the-diff/

whats the diff? SSD vs. HDD

HDDs and SSDs have changed in the two years since Peter Cohen wrote the original version of this post on March 8 of 2016. We thought it was time for an update. We hope you enjoy it.

— Editor

In This Corner: The Hard Disk Drive (HDD)

The traditional spinning hard drive has been a standard for many generations of personal computers. Constantly improving technology has enabled hard drive makers to pack more storage capacity than ever, at a cost per gigabyte that still makes hard drives the best bang for the buck.

IBM RAMAC
As sophisticated as they’ve become, hard drives have been around since 1956. The ones back then were two feet across and could store only a few megabytes of information, but technology has improved to the point where you can cram 10 terabytes into something about the same size as a kitchen sponge.

Inside a hard drive is something that looks more than a bit like an old record player: There’s a platter, or stacked platters, which spin around a central axis — a spindle — typically at about 5,400 to 7,200 revolutions per minute. Some hard drives built for performance work faster.

Hard drive exploded view
Information is written to and read from the drive by changing the magnetic fields on those spinning platters using an armature called a read-write head. Visually, it looks a bit like the arm of a record player, but instead of being equipped with a needle that runs in a physical groove on the record, the read-write head hovers slightly above the physical surface of the disk.

The two most common form factors for hard drives are 2.5-inch, common for laptops, and 3.5-inch, common for desktop machines. The size is standardized, which makes for easier repair and replacement when things go wrong.

The vast majority of drives in use today connect through a standard interface called Serial ATA (or SATA). Specialized storage systems sometimes use Serial Attached SCSI (SAS), Fibre Channel, or other exotic interfaces designed for special purposes.

Hard Disk Drives’ Cost Advantage

Proven technology that’s been in use for decades makes hard disk drives cheap — much cheaper per gigabyte than solid state drives. HDD storage can run as low as three cents per gigabyte. You don’t spend a lot, but you get lots of space. HDD makers continue to improve storage capacity while keeping costs low, so HDDs remain the choice of anyone looking for a lot of storage without spending a lot of money.

The downside is that HDDs can be power-hungry, generate noise, produce heat, and don’t work nearly as fast as SSDs. Perhaps the biggest difference is that HDDs, with all their similarities to record players, are ultimately mechanical devices. Over time, mechanical devices will wear out. It’s not a question of if, it’s a question of when.

HDD technology isn’t standing still, and price per unit stored has decreased dramatically. As we said in our post, HDD vs SSD: What Does the Future for Storage Hold? — Part 2, the cost per gigabyte for HDDs has decreased by two billion times in about 60 years.

HDD manufacturers have made dramatic advances in technology to keep storing more and more information on HDD platters — a measure referred to as areal density. As HDD manufacturers try to outdo each other, consumers have benefited from larger and larger drive sizes. One technique is to replace the air in drives with helium, which reduces friction and supports greater areal density. Another technology that should be available soon is heat-assisted magnetic recording (HAMR). HAMR records magnetically using laser-thermal assistance and ultimately could lead to a 20 terabyte drive by 2019. See our post on HAMR by Seagate’s CTO Mark Re, What is HAMR and How Does It Enable the High-Capacity Needs of the Future?

The continued competition and race to put more and more storage in the same familiar 3.5” HDD form factor means that it will be a relatively small, very high capacity choice for storage for many years to come.

In the Opposite Corner: The Solid State Drive (SSD)

Solid State Drives (SSDs) have become much more common in recent years. They’re standard issue across Apple’s laptop line: the MacBook, MacBook Pro, and MacBook Air all come standard with SSDs, as does the Mac Pro.

Inside an SSD
Solid state is industry shorthand for an integrated circuit, and that’s the key difference between an SSD and an HDD: there are no moving parts inside an SSD. Rather than disks, motors, and read/write heads, SSDs use flash memory — that is, computer chips that retain their information even when the power is turned off.

SSDs work in principle the same way the storage on your smartphone or tablet works. But the SSDs you find in today’s Macs and PCs work faster than the storage in your mobile device.

The mechanical nature of HDDs limits their overall performance. Hard drive makers work tirelessly to improve data transfer speeds and reduce latency and idle time, but there’s a finite amount they can do. SSDs provide a huge performance advantage over hard drives — they’re faster to start up, faster to shut down, and faster to transfer data.

A Range of SSD Form Factors

SSDs can be made smaller and use less power than hard drives. They also don’t make noise, and can be more reliable because they’re not mechanical. As a result, computers designed to use SSDs can be smaller, thinner, lighter and last much longer on a single battery charge than computers that use hard drives.

SSD Conversion Kit
Many SSD makers produce SSD mechanisms that are designed to be plug-and-play drop-in replacements for 2.5-inch and 3.5-inch hard disk drives because there are millions of existing computers (and many new computers still made with hard drives) that can benefit from the change. They’re equipped with the same SATA interface and power connector you might find on a hard drive.


Intel SSD DC P4500
A wide range of SSD form factors are now available. Memory sticks, once limited to a 128 MB maximum, now come in versions as large as 2 TB. They are used primarily in mobile devices where size and density are the primary factors, such as cameras, phones, drones, and so forth. Other high density form factors are designed for data center applications, such as Intel’s 32 TB P4500. Resembling a standard 12-inch ruler, the Intel SSD DC P4500 has a 32 terabyte capacity. Stacking 64 extremely thin layers of 3D NAND, the P4500 is currently the world’s densest solid state drive. The price is not yet available, but given that the DC P4500 SSD requires only one-tenth the power and just one-twentieth the space of traditional hard disk storage, once the price comes out of the stratosphere you can be sure that there will be a market for it.

Nimbus ExaDrive 100TB SSD
Earlier this year, Nimbus Data announced the ExaDrive D100 100TB SSD. This SSD by itself holds over twice as much data as Backblaze’s first Storage Pods. Nimbus Data has said that the drive will have pricing comparable to other business-grade SSDs “on a per terabyte basis.” That likely means a price in the tens of thousands of dollars.

SSD manufacturers also are chasing ways to store more data in ever smaller form factors and at greater speeds. The familiar SSD that looks like a 2.5” HDD is starting to become less common. Given the very high speeds at which data can be read from and copied to the memory chips inside SSDs, it’s natural that computer and storage designers want to take full advantage of that capability. Increasingly, storage is plugging directly into the computer’s system board, and in the process taking on new shapes.

Anand Lal Shimpi, anandtech.com -- http://www.anandtech.com/show/6293/ngff-ssds-putting-an-end-to-proprietary-ultrabook-ssd-form-factors

A size comparison of an mSATA SSD (left) and an M.2 2242 SSD (right)

Laptop makers adopted first the mSATA and then the M.2 standard, which can be as small as a few squares of chocolate but offer the same capacity as any 2.5” SATA SSD.

Another interface technology called NVM Express (NVMe) may start to move from servers in the data center to consumer laptops in the next few years. NVMe will push storage speeds in laptops and workstations even higher.

SSDs Fail Too

Just like hard drives, SSDs can wear out, though for different reasons. With hard drives, it’s often just the mechanical reality of a spinning motor that wears down over time. Although there are no moving parts inside an SSD, each memory bank has a finite life expectancy — a limit on the number of times it can be written to and read from before it stops working. Logic built into the drives tries to dynamically manage these operations to minimize problems and extend the drive’s life.
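If you’re curious how much life is left in a drive, most SSDs and hard drives report wear and health counters through SMART. Here is a minimal sketch using the open source smartmontools package; the device name /dev/sda is a placeholder, and the exact attribute names vary by manufacturer:

# Query a drive's full SMART report (smartmontools must be installed)
sudo smartctl -a /dev/sda

# On many SSDs, attributes with names like "Wear_Leveling_Count" or
# "Percent_Lifetime_Remain" hint at how much write endurance remains
sudo smartctl -a /dev/sda | grep -i -E "wear|percent|life"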

For practical purposes, most of us don’t need to worry about SSD longevity. An SSD you put in your computer today will likely outlast the computer. But it’s sobering to remember that even though SSDs are inherently more rugged than hard drives, they’re still prone to the same laws of entropy as everything else in the universe.

Planning for the Future of Storage

If you’re still using a computer with a SATA hard drive, you can see a huge performance increase by switching to an SSD. What’s more, the cost of SSDs has dropped dramatically over the course of the past couple of years, so it’s less expensive than ever to do this sort of upgrade.

Whether you’re using a HDD or an SSD, a good backup plan is essential because eventually any drive will fail. You should have a local backup combined with secure cloud-based backup like Backblaze, which satisfies the 3-2-1 backup strategy. To help get started, make sure to check out our Backup Guide.

Hopefully, we’ve given you some insight about HDDs and SSDs. And as always, we encourage your questions and comments, so fire away!


Editor’s note:  You might enjoy reading more about the future of HDDs and SSDs in our two-part series, HDD vs SSD: What Does the Future for Storage Hold?

The post Hard Disk Drive (HDD) vs Solid State Drive (SSD): What’s the Diff? appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Securely Managing Your Digital Media (SD, CF, SSD, and Beyond)

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/securely-managing-your-digital-media-sd-cf-ssd-and-beyond/

3 rows of 3 memory cards

This is the second in our post exchange series with our friends Zach Sutton and Ryan Hill at Lensrentals.com, who have an online site for renting photography, videography, and lighting equipment. You can read our post from last month on their blog, 3-2-1 Backup Best Practices using Cloud Archiving, and all posts on our blog in this series at Lensrentals post series.

— Editor

Managing digital media securely is crucial for all photographers and videographers. At Lensrentals.com, we take media security very seriously, with dozens of rented memory cards, hard drives, and other data devices returned to our facility every day. All of our media is inspected after each and every rental. Most of the cards returned to us in rental shipments are not properly reformatted and erased, so it’s part of our usual service to clear all the data from returned media to keep each client’s identity and digital property secure.

We’ve gotten pretty good at the routine of managing data and formatting storage devices for our clients while making sure our media has a long life and remains free from corruption. Before we get too involved in our process of securing digital media, we should first talk fundamentals.

The Difference Between Erasing and Reformatting Digital Media

When you insert a card in the camera, you’re likely given two options: erase the card or format the card. There is an important distinction between the two. Erasing images from a card does just that — erases them. That’s it. It designates the area the prior data occupied on the card as available to write over and confirms to you that the data has been removed.

The term erase is a bit misleading here. The underlying data, the 1s and 0s that are recorded on the media, are still there. What really happens is that the drive’s address table is changed to show that the space the previous file occupied is available for new data.

This is the reason that simply erasing a file does not securely remove it. Data recovery software can be used to recover that old data as long as it hasn’t been overwritten with new data.

Formatting goes further. When you format a drive or memory card, all of the files are erased (even files you’ve designated as “protected”) and a new file system is usually written. This is a more effective method for removing all the data on the drive since all the space previously divided up for specific files gets a brand new structure unencumbered by whatever size files were previously stored. Be aware, however, that it’s possible to retrieve older data even after a format. Whether that can happen depends on the formatting method and whether new data has overwritten what was previously stored.

To make sure that the older data cannot be recovered, a secure erase goes further. Rather than simply designating the data that can be overwritten with new data, a secure erase writes a random selection of 1s and 0s to the disk to make sure the old data is no longer available. This takes longer and is more taxing on the card because data is being overwritten rather than simply removed.
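If you want to do a secure erase yourself from the command line, standard tools can perform the overwrite. This is only a rough sketch, assuming a Linux machine and a card that appears as /dev/sdX; the device name is a placeholder, and the commands are destructive, so triple-check it before running anything:

# Overwrite the entire card once with random data (destructive!)
sudo shred -v -n 1 /dev/sdX

# Alternative using dd
sudo dd if=/dev/urandom of=/dev/sdX bs=4M status=progress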

Always Format a Card for the Camera You’re Going to Be Using

If you’ve ever tried to use the same memory card on cameras of different makes without formatting it, you may have seen problems with how the data files are displayed. Each camera system handles its file structure a little differently.

For this reason it’s advisable to format the card for the specific camera you’re using. If this is not done, there is a risk of corrupting data on the card.

Our Process For Securing Data

Our inspection process for recording media varies a little depending on what kind of card we’re inspecting. For standardized media like SD cards or compact flash cards, we simply use a card reader to format the card to exFAT. This is done in Disk Utility on the Apple MacBooks that we issue to each of our Video Technicians. We use exFAT specifically because it’s recognizable by just about every device. Since these cards are used in a wide variety of different cameras, recorders, and accessories, and we have no way of knowing at the point of inspection what device they’ll be used with, we have to choose a format that will allow any camera to recognize the card. While our customer may still have to format a card in a camera for file structure purposes, the card will at least always come formatted in a way that the camera can recognize.
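For readers who prefer the command line to the Disk Utility GUI, the same exFAT format can be done with diskutil on macOS. This is a sketch only; the disk identifier below is hypothetical, so run diskutil list first and make absolutely sure you’re pointing at the card and not another drive:

diskutil list                               # find the card, e.g. /dev/disk4
diskutil eraseDisk ExFAT SDCARD /dev/disk4  # reformat the whole card as exFAT, named SDCARD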

Sony SxS media
For proprietary media — things like REDMAGs, SxS, and other cards that we know will only be used in a particular camera — we use cameras to do the formatting. While the exFAT system would technically work, a camera-specific erase and format process saves the customer a step and allows us to more regularly double-check the media ports on our cameras. In fact, we actually format these cards twice at inspection. First, the Technician erases the card to clear out any customer footage that may have been left on it. Next, they record a new clip to the card, around 30 seconds, just to make sure everything is working as it’s supposed to. Finally, they format the card again, erasing the test footage before sending it to the shelf where it awaits use by another customer.

REDMAG Red Mini-Mag
You’ll notice that at no point in this process do we do a full secure erase. This is both to save time and to prevent unnecessary wear and tear on the cards. About 75% of the media we get back from orders still has footage on it, so we don’t get the impression that many of our customers are overly concerned with keeping their footage private once they’re done shooting. However, if you are one of the 25% who may have a personal or professional interest in keeping your footage secure after shooting, we’d recommend that you securely erase the media before returning rented memory cards and drives. Or, if you’d rather we handle it, just send an email or note with your return order requesting that we perform a secure erase rather than simply formatting the cards, and we’ll be happy to oblige.

Managing your digital media securely can be easy if done right. Data management and backing up files, on the other hand, can be more involved and require more planning. If you have any questions on that topic, be sure to check out our recent blog post on proper data backup.

— Zach Sutton and Ryan Hill, lensrentals.com

The post Securely Managing Your Digital Media (SD, CF, SSD, and Beyond) appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Mac and iOS Users: Remember to Back Up Before You Upgrade!

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/mac-and-ios-users-remember-to-back-up-before-you-upgrade/

macOS Mojave

New versions of Apple’s operating systems are coming to your iPhone and Mac in the next week! iOS 12 was released today, and macOS 10.14 “Mojave” is available a week from today on September 24. If you’re planning to upgrade your Mac or iOS devices with Apple’s newest software, you should make it a point to back up before you install anything new.

The new releases were announced in June at Apple’s annual Worldwide Developers Conference (WWDC), which gathers thousands of Apple developers from around the world each year. It’s a familiar annual procession: Apple introduces new versions of both the Mac and iOS operating systems, which are then tested by developers and the public throughout the summer.

Back up Early and Often

Changing your Mac or iPhone’s operating system isn’t like installing a new version of an app, even though Apple has tried to make it a relatively simple process. Operating system software is essential software for these devices, and how it works has a cascading effect on all the other apps and services you depend on.

If you’re not currently backing up, it’s easy to get started using our 3-2-1 Backup Strategy. The idea behind the 3-2-1 Backup Strategy is that there should be three copies of your data: The main one you use, a local backup copy, and a remote copy, stored at a secure offsite data center like Backblaze. It’s served us and thousands of our customers very well over the years, so we recommend it unabashedly. Also check out our Mac Backup Guide.

Our advice is to make sure to back up all of your systems before installing operating system software, even final released software. It’s better to be safe rather than sorry, especially where the safety and security of your data are concerned.

The post Mac and iOS Users: Remember to Back Up Before You Upgrade! appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

The Maltese MacBook

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/the-maltese-macbook/

Still from the 1941 John Huston film, The Maltese Falcon, with Humphrey Bogart, Peter Lorre, Mary Astor, and Sydney Greenstreet

Last year we decided to use Apple’s big fall announcement day to talk about backing up Windows computers. This year we’re continuing the tradition of writing something that runs a bit counter to all the hoopla with a tongue-in-cheek post in the style of hardboiled detective fiction entitled, “The Maltese MacBook,” with apologies to Dashiell Hammett.

— Editor

It was a Wednesday and it would have been just like any other Wednesday except Apple was making its big fall product announcements. Just my luck, I had to work in the San Francisco store, which meant that I was the genius who got to answer all the questions.

I had just finished helping a customer who claimed that Siri was sounding increasingly impatient answering his questions when I looked up and saw her walk in the door.

Her blonde hair was streaked with amethyst highlights and she was wearing a black leather tutu and polished knee-high Victorian boots. Brightly colored tattoos of Asian characters ran up both of her forearms and her neck. Despite all that, she wouldn’t particularly stand out in San Francisco, but her cobalt-blue eyes held me and wouldn’t let me go. She rapidly reduced the distance between the door and where I stood behind the counter at the back of the store.

She plopped a Surface Pro computer on the counter in front of me.

“I lost my data,” she said.

I knew I’d seen her before, but I couldn’t place where.

“That’s a Windows computer,” I said.

She leaned over the counter towards me. Her eyes were even brighter and bluer close up.

“Tell me something I don’t know, genius,” she replied.

Then I remembered where I’d seen her. She was on Press: Here a while back talking about her new startup. She was head of software engineering for a Google spinoff. Angels all over the valley were fighting to throw money at her project. I had been sitting in my boxers eating cold pizza and watching her talk on TV about AI for Blockchain ML.

She was way out of my league.

“I was in Valletta on a business trip using my MacBook Pro,” she said. “I was reading Verlaine on the beach when a wave came in and soaked Reggie. ‘Reggie’ is my MacBook Pro. Before I knew it, it was all over.”

Her eyes misted up.

“You know that there isn’t an Apple store in Malta, don’t you?” she said.

“We have a reseller there,” I replied.

“But they aren’t geniuses, are they?” she countered.

“No, they’re not.” She had me there.

“I had no choice but to buy this Surface Pro at a Windows shop on Strait Street to get me through the conference. It’s OK, but it’s not Reggie. I came in today to get everything made right. You can do that for me, can’t you?”

I looked down at the Surface Pro. We weren’t supposed to work on other makes of computers. It was strictly forbidden in the Genius Training Student Workbook. Alarms were going off in my head telling me to be careful:  this dame meant nothing but trouble.

“Well?” she said.

I made the mistake of looking at her and lingering just a little too long. Her eyes were both shy and probing at the same time. I felt myself falling head over heels into their inky-blue depths.

I shook it off and gradually crawled back to consciousness. I told myself that if a customer’s computer needs help, it doesn’t make any difference what you think of the computer, or which brand it is. She’s your customer, and you’re supposed to do something about it. That’s the way it works. Damn the Genius Training Student Workbook.

“OK,” I said. “Let’s take care of this.”

I asked her whether she had files on the Surface Pro she needed to save. She told me that she used Backblaze Cloud Backup on both the new Surface Pro and her old MacBook Pro. My instincts had been right. This lady was smart.

“That will make it much easier,” I told her. “We’ll just download the backed up files for both your old MacBook Pro and your Surface Pro from Backblaze and put them on a new MacBook Pro. We’ll be done in just a few minutes. You know about Backblaze’s Inherit Backup State, right? It lets you move your account to a new computer, restore all your files from your backups to the computer, and start backing up again without having to upload all your files again to the cloud.”

“What do you think?” she asked.

I assumed she meant that she already knew all about Inherit Backup State, so I went ahead and configured her new computer.

I was right. It took me just a little while to get her new MacBook Pro set up and the backed up files restored from the Backblaze cloud. Before I knew it, I was done.

“Thanks” she said. “You’ve saved my life.”

Saved her life? My head was spinning.

She turned to leave. I wanted to stop her before she left. I wanted to tell her about my ideas for an AI-based intelligent customer support agent. Maybe she’d be impressed. But she was already on her way towards the door.

I thought she was gone forever but she stopped just before the door. She flipped her hair back over her shoulder as she turned to look at me.

“You really are a genius.”

She smiled and walked out of the store and out of my life. My eyes lingered on the swinging door as she crossed the street and disappeared into the anonymous mass of humanity.

I thought to myself: she’ll be back. She’ll be back to get a charger, or a Thunderbolt to USB-C adaptor, or Magsafe to USB-C, or Thunderbolt 3 to Thunderbolt 2, or USB-C to Lightning, or USB-A to USB-C, or DisplayPort to Mini DisplayPort, or HDMI to DisplayPort, or vice versa.

Yes, she’ll be back.

I panicked. Maybe she’ll take the big fall for Windows and I’ll never see her again. What if that happened?

Then I realized I was just being a sap. Snap out of it! I’ll wait for her no matter what happens.

She deserves that.

The Maltese Falcon

The post The Maltese MacBook appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

cPanel Backup to B2 Cloud Storage

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/cpanel-backup-to-b2-cloud-storage/

laptop on a desk with a cup of coffee, cell phone, and iPad

Anyone who’s managed a business or personal website is likely familiar with cPanel, the control panel that provides a graphical interface and tools that simplify the process of managing a website. IT professionals who’ve managed hosting servers might know cPanel’s big brother, WHM (Web Host Manager), which is used by server administrators to manage large web hosting servers and cPanels for their customers.

cPanel Dashboard   WHM Dashboard

Just as with any other online service, backup is critically important to safeguard user and business data from hardware failure, accidental loss, or unforeseen events. Both cPanel and WHM support a number of applications for backing up websites and servers.

JetApps’s JetBackup cPanel App

One of those cPanel applications is JetApps’s JetBackup, which supports backing up data to a number of destinations, including local, remote SSH, remote FTP, and public cloud services. Backblaze B2 Cloud Storage was added as a backup destination in version 3.2. Web hosts that support JetBackup for their cPanel and WHM users include Clook, FastComet, TMDHosting, Kualo, Media Street, ServerCake, WebHost.UK.net, MegaHost, MonkeyTree Hosting, and CloudBunny.

cPanel with JetBackup app

JetBackup configuration for B2

Directions for configuring JetBackup with B2 are available on their website.

Note:  JetBackup version 3.2+ supports B2 cloud storage, but that support does not currently include incremental backups. JetApps has told us that incremental backup support will be available in an upcoming release.

Interested in more B2 Support for cPanel and WHM?

JetBackup support for B2 was added to JetBackup because their users asked for it. Users have been vocal in asking vendors to add cPanel/WHM support for backing up to B2 in forums and online discussions, as evidenced on cPanel.net and elsewhere — here, here, and here. The old axiom that the squeaky wheel gets the grease is true when lobbying vendors to add B2 support — the best way to have B2 directly supported by an app is to express your interest directly to the backup app provider.

Other Ways to Back Up Website Data to B2

When a dedicated backup app for B2 is not available, some cPanel users are creating their own solutions using the B2 Command Line Interface (CLI), while others are using Rclone to back up to B2.

B2 CLI example:

#!/bin/bash
# Authorize against the B2 account, then sync the local /backup/ directory to a B2 bucket
b2 authorize_account ACCOUNTID APIKEY
b2 sync --noProgress /backup/ b2://STORAGECONTAINER/

Rclone example:

rclone copy /backup backblaze:my-server-backups --transfers 16
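The rclone example above assumes a remote named “backblaze” has already been configured. As a rough sketch (the remote name, bucket, and credentials are placeholders), the remote can be created non-interactively and the copy scheduled with cron:

# one-time setup of the B2 remote
rclone config create backblaze b2 account B2_ACCOUNT_ID key B2_APPLICATION_KEY

# then add a line like this via crontab -e to run the copy nightly at 2:30 a.m.
30 2 * * * rclone copy /backup backblaze:my-server-backups --transfers 16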

Those with WordPress websites have other options for backing up their sites, which we highlighted in a post, Backing Up WordPress.

Having a Solid Backup Plan is What’s Important

If you’re using B2 for cPanel backup, or are using your own backup solution, please let us know what you’re doing in the comments.

The post cPanel Backup to B2 Cloud Storage appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

The B2 Developers’ Community

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/object-storage-developer-community/

Developers at Work Using Object Storage

When we launched B2 Cloud Storage in September of 2015, we were hoping that the low cost, reliability, and openness of B2 would result in developers integrating B2 object storage into their own applications and platforms.

We’ve continually strengthened and encouraged the development of more tools and resources for the B2 developer community. These resources include APIs, a Command-Line tool, a Java SDK, and code examples for Swift and C++. Backblaze recently added application keys for B2, which enable developers to restrict access to B2 data and control how an application interacts with that data.
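To give a sense of how small the surface area is, here is a minimal sketch of the first call every B2 integration makes, authorizing with curl; the key ID and application key are placeholders, and the JSON response returns the API URL and authorization token used for subsequent calls:

curl -u "KEY_ID:APPLICATION_KEY" https://api.backblazeb2.com/b2api/v2/b2_authorize_account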

An Active B2 Developer Community

It’s three years later and we are happy to see that an active developer community has sprung up around B2. Just a quick look at GitHub shows over 250 repositories for B2 code with projects in ten different languages that range from C# to Go to Ruby to Elixir. A recent discussion on Hacker News about a B2 Python Library resulted in 225 comments.

B2 coding languages - Java, Ruby, C#, Shell, PHP, R, JavaScript, C++, Elixir, Go, Python, Swift

What’s Happening in the B2 Developer Community?

We believe that the two major reasons for the developer activity supporting B2 are: 1) user demand for inexpensive and reliable storage, and 2) the ease of implementation of the B2 API. We discussed the B2 API design decisions in a recent blog post.

Sharing and transparency have been cornerstone values for Backblaze since our founding, and we believe openness and transparency breed trust and further innovation in the community. Since we ask customers to trust us with their data, we want our actions to show why we are worthy of that trust.

Here are Just Some of the Many B2 Projects Currently Underway

We’re excited about all the developer activity and all of the fresh and creative ways you are using Backblaze B2 storage. We want everyone to know about these developer projects so we’re spotlighting some of the exciting work that is being done to integrate and extend B2.

Rclone (Go) — In addition to being an open source command line program to sync files and directories to and from cloud storage systems, Rclone is being used in conjunction with other applications such as restic. See Rclone on GitHub, as well.

CORS (General web development) — Backblaze supports CORS for efficient cross-site media serving. CORS allows developers to store large or infrequently accessed files on B2 storage, and then refer to and serve them securely from another website without having to re-download the asset. (A sample CORS rule sketch appears after this list.)

b2blaze (Python) — The b2blaze Python library for B2.

Laravel Backblaze Adapter (PHP) — Connect your Laravel project to Backblaze with this storage adapter, which includes token caching.

Wal-E (Postgres) — Continuous archiving to Backblaze for your Postgres databases.

Phoenix (Elixir) — File upload utility for the Phoenix web dev framework.

ZFS Backup (Go) — Backup tool to move your ZFS snapshots to B2.

Django Storage (Python) — B2 storage for the Python Django web development framework.

Arq Backup (Mac and Windows application) — Arq Backup is an example of a single developer, Stefan Reitshamer, creating and supporting a successful and well-regarded application for cloud backup. Stefan also is known for being responsive to his users.

Go Client & Libraries (Go) — Go is a popular language that is being used for a number of projects that support B2, including restic, Minio, and Rclone.

How to Get Involved as a B2 Developer

If you’re considering developing for B2, we encourage you to give it a try. It’s easy to implement and your application and users will benefit from dependable and economical cloud storage.

Developers at work
Start by checking out the B2 documentation and resources on our website. GitHub and other code repositories are also great places to look. If you follow discussions on Reddit, you could learn of projects in the works and maybe find users looking for solutions.

We’ve written a number of blog posts highlighting the integrations for B2. You can find those by searching for a specific integration on our blog or under the tag B2. Posts for developers are tagged developer.

Developers at work

If you have a B2 integration that you believe will appeal to a significant audience, you should consider submitting it to us. Those that pass our review are listed on the B2 Integrations page on our website. We’re adding more each week. When you’re ready, just review the B2 Integration Checklist and submit your application. We’re looking forward to showcasing your work!

Now’s a good time to join the B2 developers’ community. Jump on in — the water’s great!

P.S. We want to highlight and promote more developers working with B2. If you have a B2 integration or project that we haven’t mentioned in this post, please tell us what you’re working on in the comments.

The post The B2 Developers’ Community appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Backing Up FreeNAS and TrueNAS to Backblaze B2

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/how-to-setup-freenas-cloud-storage/

FreeNAS and TrueNAS

Thanks to recent updates of FreeNAS and TrueNAS, backing up data to Backblaze B2 Cloud Storage is now available for both platforms. FreeNAS/TrueNAS v11.1 adds a feature called Cloud Sync, which lets you sync, move, or copy data to and from Backblaze B2.

What Are FreeNAS and TrueNAS?

FreeNAS and TrueNAS are two faces of a comprehensive NAS storage environment built on the FreeBSD OS and OpenZFS file system. FreeNAS is the open source and development platform, while TrueNAS is the supported and commercial product line offered by iXsystems.

FreeNAS logo

FreeNAS is for the DIY crowd. If you don’t mind working with bleeding-edge software and figuring out how to make your software and hardware work harmoniously, then FreeNAS could be a good choice for you.

TrueNAS logo

If you’re in a business or other environment with critical data, then a fully supported product like TrueNAS is likely the way you’ll want to go. iXsystems builds their TrueNAS commercial server appliances on the battle-tested, open source framework that FreeNAS and OpenZFS provide.

The software developed by the FreeNAS open source community forms the basis for both platforms, so we’ll talk specifically about FreeNAS in this post.

Working with FreeNAS

You can download FreeNAS directly from the open source project website, freenas.org. Once installed, FreeNAS is managed through a comprehensive web interface that is supplemented by a minimal shell console that handles essential administrative functions. The web interface supports storage pool configuration, user management, sharing configuration, and system maintenance.

FreeNAS web UI

FreeNAS supports Windows, macOS and Unix clients.

Syncing to B2 with FreeNAS

Files or directories can be synchronized to remote cloud storage providers, including B2, with the Cloud Sync feature.

Selecting Tasks ‣ Cloud Sync shows the screen below. This screen shows a single cloud sync called “backup-acctg” that “pushes” a file to cloud storage. The last run finished with a status of SUCCESS.

Existing cloud syncs can be run manually, edited, or deleted with the buttons that appear when a single cloud sync line is selected by clicking with the mouse.

FreeNAS Cloud Sync status

Cloud credentials must be defined before a cloud sync is created. One set of credentials can be used for more than one cloud sync. For example, a single set of credentials for Backblaze B2 can be used for separate cloud syncs that push different sets of files or directories.

A cloud storage area must also exist. With B2, these are called buckets and must be created before a sync task can be created.
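Buckets can be created in the B2 web console or, if you prefer, with the B2 command line tool. A quick sketch (credentials are placeholders; the bucket name matches the example used later in this post):

b2 authorize_account ACCOUNT_ID APPLICATION_KEY
b2 create_bucket cloudsync-bucket allPrivate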

After the credentials and receiving bucket have been created, a cloud sync task is created with Tasks ‣ Cloud Sync ‣ Add Cloud Sync. The Add Cloud Sync dialog is shown below.

FreeNAS Cloud Sync credentials

Cloud Sync Options

The options for Cloud Sync are listed below, with each setting’s value type in parentheses.

Description (string): a descriptive name for this Cloud Sync
Direction (string): Push to send data to cloud storage, or Pull to pull data from the cloud storage
Provider (drop-down menu): select the cloud storage provider; the list of providers is defined by Cloud Credentials
Path (browse button): select the directories or files to be sent for Push syncs, or the destinations for Pull syncs
Transfer Mode (drop-down menu): Sync (default) makes files on the destination system identical to those on the source; files removed from the source are removed from the destination (like rsync --delete). Copy copies files from the source to the destination, skipping files that are identical (like rsync). Move copies files from the source to the destination, deleting files from the source after the copy (like mv)
Minute (slider or minute selections): select Every N minutes and use the slider to choose a value, or select Each selected minute and choose specific minutes
Hour (slider or hour selections): select Every N hours and use the slider to choose a value, or select Each selected hour and choose specific hours
Day of month (slider or day of month selections): select Every N days of month and use the slider to choose a value, or select Each selected day of month and choose specific days
Month (checkboxes): the months when the Cloud Sync runs
Day of week (checkboxes): the days of the week when the Cloud Sync runs
Enabled (checkbox): uncheck to temporarily disable this Cloud Sync

Take care when choosing a Direction. Most of the time, Push will be used to send data to the cloud storage. Pull retrieves data from cloud storage, but be careful: files retrieved from cloud storage will overwrite local files with the same names in the destination directory.

Provider is the name of the cloud storage provider. These providers are defined by entering credentials in Cloud Credentials.

After the Provider is chosen, a list of available cloud storage areas from that provider is shown. With B2, this is a drop-down with names of existing buckets.

Path is the path to the directories or files on the FreeNAS system. On Push jobs, this is the source location for files sent to cloud storage. On Pull jobs, the Path is where the retrieved files are written. Again, be cautious about the destination of Pull jobs to avoid overwriting existing files.

The Minute, Hour, Days of month, Months, and Days of week fields permit creating a flexible schedule of when the cloud synchronization takes place.

Finally, the Enabled field makes it possible to temporarily disable a cloud sync job without deleting it.

FreeNAS Cloud Sync Example

This example shows a Push cloud sync which writes an accounting department backup file from the FreeNAS system to Backblaze B2 storage.

Before the new cloud sync was added, a bucket called “cloudsync-bucket” was created with the B2 web console for storing data from the FreeNAS system.

System ‣ Cloud Credentials ‣ Add Cloud Credential is used to enter the credentials for storage on a Backblaze B2 account. The credential is given the name B2, as shown in the image below:

FreeNAS Cloud Sync B2 credentials

Note on encryption: FreeNAS 11.1 Cloud Sync does not support client-side encryption of data and file names before syncing to the cloud, whether the destination is B2 or another public cloud provider. That capability will be available in FreeNAS v11.2, which is currently in beta.

Example: Adding Cloud Credentials

The local data to be sent to the cloud is a single file called accounting-backup.bin on the smb-storage dataset. A cloud sync job is created with Tasks ‣ Cloud Sync ‣ Add Cloud Sync.

The Description is set to “backup-acctg” to describe the job. This data is being sent to cloud storage, so this is a Push. The provider comes from the cloud credentials defined in the previous step, and the destination bucket “cloudsync-bucket” has been chosen.

The Path to the data file is selected.

The remaining fields are for setting a schedule. The default is to send the data to cloud storage once an hour, every day. The options provide great versatility in configuring when a cloud sync runs, anywhere from once a minute to once a year.

The Enabled field is checked by default, so this cloud sync will run at the next scheduled time.

The completed dialog is shown below:

FreeNAS Cloud Sync example

Dependable and Economical Disaster Recovery

In the event of an unexpected data-loss incident, the VMs, files, or other data stored in B2 from FreeNAS or TrueNAS are available for recovery. Having that data ready and available in B2 provides a dependable, easy, and cost effective offsite disaster recovery solution.

Are you using FreeNAS or TrueNAS? What tips do you have? Let us know in the comments.

The post Backing Up FreeNAS and TrueNAS to Backblaze B2 appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Minio as an S3 Gateway for Backblaze B2 Cloud Storage

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/how-to-use-minio-with-b2-cloud-storage/

Minio + B2

While there are many choices when it comes to object storage, the largest and most recognized provider is Amazon’s S3. Amazon’s set of APIs for interacting with their cloud storage, often just called “S3,” is frequently the first integration point for an application or service that needs to send data to the cloud.

One of the more frequent questions we get is “how do I jump from S3 to B2 Cloud Storage?” We’ve previously highlighted many of the direct integrations that developers have built on B2: here’s a full list.

Another way to work with B2 is to use what is called a “cloud storage gateway.” A gateway is a service that acts as a translation layer between two services. In the case of Minio, it enables customers to take something that was integrated with the S3 API and immediately use it with B2.

Before going further, you might ask “why didn’t Backblaze just create an S3 compatible service?” We covered that topic in a recent blog post, Design Thinking: B2 APIs (& The Hidden Costs of S3 Compatibility). The short answer is that our architecture enables some useful differentiators for B2. Perhaps most importantly, it enables us to sustainably offer cloud storage at ¼ the price of S3, which you will really appreciate as your application or service grows.

However, there are situations when a customer is already using the S3 APIs in their infrastructure and want to understand all the options for switching to B2. For those customers, gateways like Minio can provide an elegant solution.

What is Minio?

Minio is an open source, multi-cloud object storage server and gateway with an Amazon S3 compatible API. Having an S3-compatible API means once configured, Minio acts as a gateway to B2 and will automatically and transparently put or get data into a Backblaze B2 account.

Backup, archive or other software that supports the S3 protocol can be configured to point at Minio. Minio internally translates all the incoming S3 API calls into equivalent B2 storage API calls, which means that all Minio buckets and objects are stored as native B2 buckets and objects. The S3 object layer is transparent to the applications that use the S3 API. This enables the simultaneous use of both Amazon S3 and B2 APIs without compromising any features.

Minio has become a popular solution, with over 113.7 million Docker pulls. Minio implements the Amazon S3 v2/v4 API, which can be used from the Minio client, the AWS SDKs, and the AWS CLI.

Minio and B2

To try it out, we configured a MacBook Pro with a Docker container for the latest version of Minio. It was a straightforward matter to install the community version of Docker on our Mac and then install the container for Minio.

You can follow the instructions on GitHub for configuring Minio on your system.
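For reference, the invocation looks roughly like the following, based on Minio’s gateway documentation at the time of writing; treat it as a sketch and check the current instructions, since image tags and flags change. The credentials are your B2 account ID and application key:

docker run -p 9000:9000 \
  -e "MINIO_ACCESS_KEY=b2_account_id" \
  -e "MINIO_SECRET_KEY=b2_application_key" \
  minio/minio gateway b2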

In addition to using Minio with S3-compatible applications and creating new integrations using their SDK, one can use Minio’s Command-line Interface (CLI) and the Minio Browser to access storage resources.

Command-line Access to B2

We installed the Minio client (mc), which provides a modern CLI alternative to UNIX coreutils such as ls, cat, cp, mirror, diff, etc. It supports filesystems and Amazon S3 compatible cloud storage services. The Minio client is supported on Linux, Mac, and Windows platforms.

We used the command below to add the alias “myb2” to our host to make it easy to access our data.

mc config host add myb2 \
 http://localhost:9000 b2_account_id b2_application_key

Minio client commands

Once configured, you can use mc subcommands like ls, cp, mirror to manage your data.

Here’s the Minio client command to list our B2 buckets:

mc ls myb2

And the result:

Minio client
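Beyond ls, a couple of other mc subcommands cover most day-to-day needs. The bucket and path names below are hypothetical:

mc cp ./backup.tar.gz myb2/my-bucket/           # copy a single file into a bucket
mc mirror ~/website-backups myb2/site-backups   # mirror a local directory to a bucket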

Browsing Your B2 Buckets

Minio Gateway comes with an embedded web-based object browser that makes it easy to access your buckets and files on B2.

Minio browser

Minio is a Great Way to Try Out B2

Minio is designed to be straightforward to deploy and use. If you’re using an S3-compatible integration, or just want to try out Backblaze B2 using your existing knowledge of S3 APIs and commands, then Minio can be a quick solution to getting up and running with Backblaze B2 and taking advantage of the lower cost of B2 cloud storage.

The post Minio as an S3 Gateway for Backblaze B2 Cloud Storage appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Protecting Your Data From Camera to Archive

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/protecting-your-data-from-camera-to-archive/

Camera data getting backed up to Backblaze B2 cloud

Lensrentals.com is a highly respected company that rents photography and videography equipment. We’re a fan of their blog and asked Zach Sutton and Ryan Hill of Lensrentals to contribute something for our audience. We also contributed a post to their blog that was posted today: 3-2-1 Backup Best Practices using Cloud Archiving.

Enjoy!

— Editor

At Lensrentals.com we get a number of support calls, and unfortunately one of the most common involves data catastrophes.

The first of the frequent calls is from someone who thought they transferred over their footage or photos before returning their rental and discovered later that they were missing some images or footage. If we haven’t already gone through an inspection of those cards, it’s usually not a problem to send the cards back to them so they can collect their data. But if our techs have inspected the memory cards, then there isn’t much we can do. Our team at Lensrentals.com performs a full and secure reformatting of the cards to keep each customer’s data safe from the next renter. Once that footage is gone, it is unrecoverable and gone forever. This is never a fun conversation to have.

The second scenario is when a customer calls to tell us that they did manage to transfer all the footage over, but one or more of the clips or images were corrupted in the transferring process. Typically, people don’t discover this until after they’ve sent back the memory cards, and after we’ve already formatted the original media. This is another tough phone call to have. On occasion, data corruption happens in camera, but more often than not, the file gets corrupted during the transfer from the media to the computer or hard drive.

These kinds of problems aren’t entirely avoidable and are inherent risks users take when working with digital media. However, as with all risks, you can take proper steps to assure that your data is safe. If a problem arises, there are techniques you can use to work around it.

We’ve summarized our best suggestions for protecting your data from camera to archive in the following sections. We hope you find them useful.

How to Protect Your Digital Assets

Before Your Shoot

The first and most obvious step to take to assure your data is safe is to make sure you use reliable media. For us, we recommend using cards from brands you trust, such as SanDisk, Lexar, or ProGrade Digital (a company that took the reins from Lexar). For hard drives, SanDisk, Samsung, Western Digital, and Intel are all considered incredibly reliable. These brands may be more expensive than bargain brands but have been proven time and time again to be more reliable. The few extra dollars spent on reliable media will potentially save you thousands in the long run and will assure that your data is safe and free of corruption.

One of the most important things you should do before any shoot is format your memory card in the camera. Formatting in camera is a great way to minimize file corruption as it keeps the card’s file structure conforming to that camera manufacturer’s specifications, and it should be done every time before every shoot. Equally important, if the camera gives you an option to do a complete or secure format, take that option over the other low-level formatting options available. In the same vein, it’s essential to also take the time to research and see if your camera needs to unmount or “eject” the media before removing it physically. While this option applies more for video camera recording systems, like those found on the RED camera platform and the Odyssey 7Q, it’s always worth checking into to avoid any corruption of the data. More often than not, preventable data corruption happens when the users turn off the camera system before the media has been unmounted.

Finally, if you’re shooting for the entire day, you’ll want to make sure you have enough media on hand so that you do not need to back up and reformat cards throughout the shoot. While it’s possible to take footage off of the card, reformat it, and use it again the same day, that is not something you’d want to be doing in the hectic environment of a shoot day — it’s best to have extra media on hand. We’ve all deleted a file we didn’t mean to, so it’s best to avoid that mistake by not having to delete or manage files while shooting. Play it safe, and only reformat when you have the time and a clear head to do so.

During Your Shoot

On many modern camera systems, you have the option of dual-recording using two different card slots. If your camera offers this option, we cannot recommend it enough. Doubling the media you’re recording onto can overcome a failure in one of the memory cards. While the added cost may be a hard sell, it’s negligible when compared to all the money spent on lights, cameras, actors and lousy pizza for the day. Additionally, develop a system that works for you and keeps everything as organized as possible. Spent media shouldn’t be in the same location as unused media, and your file structure should be consistent throughout the entire shoot. A proper file structure not only saves time but assures that none of the footage goes missing after the shoot, lost in some random folder.

Camera memory cards

One of the most critical jobs on set is the work of a DIT (Digital Imaging Technician) for video, or a DT (Digital Technician) for photography. Essentially, the responsibilities of these positions are to keep the data archived and organized on set, along with metadata logging and other technical tasks involved in keeping a shoot organized. While it may not be cost effective to have a DIT/DT on every shoot, if the budget allows for it, I highly recommend you hire one to take on the responsibilities. Having someone on set who is solely responsible for safely backing up and organizing footage helps keep the rest of the crew focused on their obligations and assures nothing goes wrong. When they’re not transferring and archiving data, DIT/DTs also log metadata, color-correct footage, and help with other preliminary editing processes. Even if the budget doesn’t allow for this position to be filled, work to find someone who can solely handle these processes while on set. You don’t want your camera operator to be in charge of also backing up and organizing footage if you can help it.

Ingest Software

If there is one piece of information we’d like for videographers and photographers to take away from this article, it is this: file-moving or ‘offloading’ software is worth the investment and should be used every time you shoot anything. For those who are unfamiliar with offload software, it’s any application that is designed to make it easier for you to back up footage from one location to another, and one shoot to another. In short, to avoid accidents or data corruption, it’s always best to have your media on a MINIMUM of two different devices. The easiest way to do this is to simply dump media onto two separate hard drives, and keep those drives separately stored. Ideally (if the budget allows), you’ll also keep all of your data on the original media for the day as well, making sure you have multiple copies stored in various locations. Many other options are available and recommended if possible, such as RAID arrays or even copying the data over to a cloud service such as Backblaze B2. Offloading software automates exactly this process, verifying all the data as it’s transferred.

There are a few different recommendations I give for offloading software, all at different price points and with unique features. At the highest end of video production, you’ll often see DITs using a piece of software called Silverstack, which offers color grading functionalities, LTO tape support, and basic editing tools for creating daily edits. At a $600 annual price, it is the most expensive in this field and is probably overkill for most users. As for my recommendation, I recommend a tool called ShotPut Pro. At $129, ShotPut Pro offers all the tools you’d need to build a great archiving process while sacrificing some of the color editing tools. ShotPut Pro can simultaneously copy and transfer files to multiple locations, build PDF reports, and verify all transfers. If you’re looking for something even cheaper, there are additional options such as Offload and Hedge. They’re both available for $99 each and give all the tools you’d need within their simple interfaces.

When it comes to photo, the two most obvious choices are Adobe Lightroom and Capture One Pro. While both are known more for their editing tools, they also have a lot of archiving functions built into their ingest systems, allowing you to unload cards to multiple locations and make copies on the fly.

workstation with video camera and RAID NAS

When it comes to video, the most crucial feature all of the apps should have is an option called “checksum verification.” This subject can get complicated, but all you really need to know is that larger files are more likely to be corrupted when transferring and copying, and checksum verification confirms that each copied file is identical to the original down to the individual byte. It is by far the most reliable and effective way to ensure that entire volumes of data are copied without corruption or loss. Whichever application you choose, make sure checksum verification is an available feature, and part of your workflow every time you’re copying video files. Checksum verification is also available in select photo ingest software; corruption happens less often with smaller files and is generally less of an issue there, but if possible, use it anyway.
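If you ever find yourself copying a card by hand without ingest software, you can approximate the same safety net with command line checksums. A minimal sketch, assuming macOS (which ships with shasum) and hypothetical paths:

# copy the card, then hash every file on both sides and compare
rsync -av /Volumes/SDCARD/ ~/Shoots/card01/
(cd /Volumes/SDCARD && find . -type f -exec shasum -a 256 {} \; | sort) > /tmp/source.sha
(cd ~/Shoots/card01 && find . -type f -exec shasum -a 256 {} \; | sort) > /tmp/copy.sha
diff /tmp/source.sha /tmp/copy.sha && echo "copy verified" || echo "mismatch: recopy the card"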

Post-Production

Once you’ve completed your shoot and all of your data is safely transferred over to external drives, it’s time to look at how you can store your information long term. Different people approach archiving in different ways because none of us have an identical workflow. There is no single correct way to archive your photos and videos, but there are a few rules that you’ll want to implement.

The first rule is the most obvious. You’ll want to make sure your media is stored on multiple drives. That way, if one of your drives dies on you, you still have a backup version of the work ready to go. The second rule of thumb is that you’ll want to store these backups in different locations. This can be extremely important if there is a fire in your office, or you’re a victim of a robbery. The most obvious way to do this is to back up or archive into a cloud service such as Backblaze B2. In my production experience I’ve seen multiple production houses implement a system where they store their backup hard drives in a safety deposit box at their bank. The final rule of thumb is especially important when you’re working with significant amounts of data, and that is to keep a working drive separate from an archive drive. The reason for this is an obvious one: all hard drives have a life expectancy, and you can prolong that by minimizing drive use. Having a working drive separate from your archive drives means that your archive drives will have fewer hours on them, thereby extending their practical life.
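As one concrete way to satisfy the offsite rule, a finished job folder can be pushed to a B2 bucket from the command line. This is just a sketch; “b2archive” is a hypothetical rclone remote, and the bucket and folder names are illustrative:

rclone copy ~/Archive/JOB_1042 b2archive:photo-archive/JOB_1042 --progress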

Ryan Hill’s Workflow

To help visualize what we discussed above, I’ll lay out my personal workflow for you. Please keep in mind that I’m mainly a one-man band, so my workflow is based on me handling everything. I also work with a wide variety of media, so nothing here is video- or camera-specific; my video projects, photo projects, and graphics projects are all organized the same way. I won’t bore you with the details of my file structure, except to say that everything in my root folder is organized by job number, followed by sub-folders with the data sorted into categories. To keep track of which jobs are which, I have a Google Sheet that matches job numbers with descriptions and client information. All of this information is secured within my Google account, which also lets me access it from anywhere if needed.

With archiving, my system is pretty simple. I’ve got a 4-drive RAID array in my office that gets updated every time I work on a new project. The array is set to RAID 1+0, which means I could lose up to two of the four hard drives (as long as they aren’t both in the same mirrored pair) and still be able to recover the data. Usually, I’ll put 1TB drives in each bay, fill them as I work on projects, and replace them when they’re full. Once they’re full, I label them with the corresponding job numbers and store them in a plastic case on my bookshelf. By no means am I suggesting that mine is a perfect system, but for me, it’s incredibly adaptable to the various projects I work on. If I were robbed, or if my house caught fire, I would still have all of my work archived to a cloud service, giving me a second level of security.

Finally, to finish up my backup solution, I also keep a two-bay Thunderbolt hard drive dock on my desk as my working drive system. Solid state drives (SSDs) and the Thunderbolt connection give me the speed and reliability I need from a drive that I’ll be working from and rendering outputs to. For now, there is a single 960GB SSD in the first bay, with the option to add a second drive if I need additional storage. I start work by transferring the job folder from my archive to the working drive, do whatever I need to do to the files, then replace the old job folder on my archive with the updated one at the end of the day. This way, if I were to have a drive failure, the worst I would lose is a day’s worth of work. For video projects or anything that involves a lot of data, I usually keep copies of all my source files on both my working and archive drives, and just replace the Adobe Premiere project file as I go. Again, this is just the system that works for me, and I recommend you develop one that fits your workflow while keeping your data safe.

The Takeaway

The critical point to take away is that these strategies are things you should be thinking about at every step of your production. How does your camera or codec choice affect your media needs? How are you going to ensure safe data backup in the field? How are you going to work with all of this footage in post-production in a way that’s both secure and efficient? Answering these questions ahead of time will keep your media safe and your clients happy.

— Zach Sutton and Ryan Hill, lensrentals.com

The post Protecting Your Data From Camera to Archive appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

What’s the Diff: Backup vs Archive

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/data-backup-vs-archive/

Whats the Diff: Backup vs Archive

Backups and archives serve different functions, yet it’s common to hear the terms used interchangeably in cloud storage. It’s important to understand the difference between the two to ensure that your data is:

  1. retained for the period of time you require
  2. protected from loss or unauthorized access
  3. able to be restored or retrieved when needed
  4. structured or tagged to enable locating specific data
  5. kept current according to your requirements

Our two choices can be broadly categorized:

  • backup is for recovery from hardware failure or recent data corruption or loss
  • archive is for space management and long term retention

What Is a Backup?

A backup is a copy of your data that is made to protect against loss of that data. Typically, backups are made on a regular basis according to a time schedule or when the original data changes. The original data is not deleted, but older backups are often deleted in favor of newer backups.

Data backup graphic

Desktop computers, servers, VMs, and mobile devices are all commonly backed up. Backups can include data, OS and application files, or a combination of these according to the backup methodology and purpose.

The goal of a backup is to make a copy of anything in current use that you can’t afford to lose. A backup of a desktop or mobile device might include just the user data, so that a previous version of a file can be recovered if necessary. On these types of devices, an assumption is often made that the OS and applications can easily be restored from their original sources if necessary (and/or that restoring an OS to a new device could lead to significant corruption issues). In a virtual server environment, a backup could include .VMDK files that contain data and the OS, as well as both structured data (databases) and unstructured data (files), so that the system can be put back into service as quickly as possible if something happens to the original VM in a VMware, Hyper-V, or other virtual machine environment.

In the case of a ransomware attack, a solid backup strategy can mean the difference between restoring a compromised system and paying a ransom in the vague hope of getting a decryption key for the files the attacker encrypted.

Backups can have additional uses. A user might go to a backup to retrieve an earlier version of a file because it contains something no longer in the current file, or, as is possible with some backup services such as Backblaze Backup, to share a file with a colleague or other person.

What Is an Archive?

An archive is a copy of data made for long-term storage and reference. The original data may or may not be deleted from the source system after the archive copy is made and stored, though it is common for the archive to be the only copy of the data.

Data archive graphic

In contrast to a backup whose purpose is to be able to return a computer or file system to a state it existed in previously, an archive can have multiple purposes. An archive can provide an individual or organization with a permanent record of important papers, legal documents, correspondence, and other matters. Often, an archive is used to meet information retention requirements for corporations and businesses. If a dispute or inquiry arises about a business practice, contract, financial transaction, or employee, the records pertaining to that subject can be obtained from the archive.

An archive is frequently used to ease the burden on faster and more frequently accessed data storage systems. Older data that is unlikely to be needed often is put on systems that don’t need to have the speed and accessibility of systems that contain data still in use. Archival storage systems are usually less expensive, as well, so a strong motivation is to save money on data storage.

Archives are often created based on the age of the data or whether the project the data belongs to is still active. An archiving program might send data to an archive if it hasn’t been accessed in a specified amount of time, when it has reached a certain age, if a person is no longer with the organization, or the files have been marked for storage because the project has been completed or closed.

Archives also can be created using metadata describing the project. An archiving program can automatically add relevant metadata, or the user can tag data manually to aid in future retrieval. Common metadata added can be business information describing the data, or in the case of photos and videos, the equipment, camera settings, and geographical location where the media was created. Artificial intelligence (AI) can be used to identify and catalog subject matter in some data such as photos and videos to make it easier to find the data at a later date. AI tools will become increasingly important as we archive more data and need to be able to find it based on parameters that might not be known at the time the data was archived.

What’s the Diff?

Backup:
  • Enables rapid recovery of live, changing data
  • One of multiple copies of the data
  • Restore speed: crucial
  • Short-term retention: retained for as long as the data is in active use
  • Duplicate copies are periodically overwritten

Archive:
  • Stores unchanging data that is no longer in use but must be retained
  • Usually the only remaining copy of the data
  • Retrieval speed: not crucial
  • Long-term retention: retained for the required period or indefinitely
  • Data cannot be altered or deleted

What’s the Difference Between Restore and Retrieve?

In general, backup systems restore and archive systems retrieve. The tools needed to perform these functions are different.

If you are interested in restoring something from a backup, it usually is a single file, a server, or structured data such as a database that needs to be restored to a specific point in time. You need to know a lot about the data, such as where it was located when it was backed up, the database or folder it was in, the name of the file, when it was backed up, and so forth.

When you retrieve data from an archive, the data is connected in some manner, such as date, email recipient, period of time, or other set of parameters that can be specified in a search. A typical retrieval query might be to obtain all files related to a project name, or all emails sent by a person during a specific period of time.

Trying to use a backup as an archive can present problems. You would need to keep rigorous records of where and when the files were backed up, what medium they were backed up to, and myriad other pieces of information that would need to be recorded at the time of backup. By definition, backup systems keep copies of data currently in use, so maintaining backups for lengthy periods of time goes beyond the capabilities of backup systems and would require manual management.

The bottom line is: don’t use a backup as an archive. Select the approach that suits your needs: a backup to keep additional copies of data currently in use in case something happens to your primary copy, or an archive to keep a permanent (and perhaps the only) record of important data you wish to retain for personal, business, or legal reasons.

Why You Need Both Backup and Archive

It’s clear that a backup and an archive have different uses. Do you need both?

If you’re a business, the wise choice is yes. You need to make sure that your active business data is protected from accidental or malicious loss, and that your important records are maintained as long as necessary for business and legal reasons. If you are an individual or a small business with documents, photos, videos, and other media, you also need both backup and archive to ensure that your data is protected both short and long term and available and retrievable when you need it.

Data backup graphic & Data archive graphic

Selecting the right tools and services for backup and archiving is essential. Each has a feature set that makes it suited to its task. Trying to use backup for archiving, or archiving for backup, is like trying to fit a square peg into a round hole. It’s best to use the right tool and service for the data storage function you require.

The post What’s the Diff: Backup vs Archive appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

B2 Quickstart Guide

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/b2-quickstart-guide/

B2 Quickstart Guide
If you’re ready to get started with B2 Cloud Storage, these tips and resources will quickly get you up and running with B2.

What can you do with B2, our low-cost, fast, and easy object cloud storage service?

  • Creative professionals can archive their valuable photos and videos
  • Backup, archive, or sync servers and NAS devices
  • Replace tape backup systems (LTO)
  • Host and serve text, photos, videos, and other web content
  • Build apps that demand affordable cloud storage

B2 cloud storage logo

If you haven’t created an account yet, you can do that here.

Just For Newbies

Are you a beginner to B2? Here’s just what you need to get started.

Saving photos to the cloud

Developer or Old Hat at the Cloud?

If you’re a developer or more experienced with cloud storage, here are some resources just for you.

diagram of how to save files to the cloud
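
If you’re a command-line person, the B2 CLI is one of the quickest ways to kick the tires. Here’s a minimal sketch, assuming you’ve installed the CLI and created an application key in your account; the bucket and file names are made up, and command names can vary a bit between CLI versions:

b2 authorize-account YOUR_KEY_ID YOUR_APPLICATION_KEY       # sign the CLI into your account
b2 create-bucket my-first-bucket allPrivate                 # create a private bucket
b2 upload-file my-first-bucket photo.jpg photos/photo.jpg   # upload a single file
b2 sync ~/Pictures b2://my-first-bucket/pictures            # or sync a whole folder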

Have a NAS You’d Like to Link to the Cloud?

Would you like to get more out of your Synology, QNAP, or Morro Data NAS? Take a look at these posts that enable you to easily extend your local data storage to the cloud.

diagram of NAS to cloud backup

Looking for an Integration to Super Charge the Cloud?

We’ve blogged about the powerful integrations that work with B2 to provide solutions for a wide range of backup, archiving, media management, and computing needs. The links below are just a start. You can visit our Integrations page or search our blog for the integration that interests you.

diagram of cloud computing integrations

We hope that gets you started. There’s plenty more about B2 on our blog in the “Cloud Storage” category and in our Knowledge Base.

Didn’t find what you need to get started with B2? Ask away in the comments.

The post B2 Quickstart Guide appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Five Cool Multi-Cloud Backup Solutions You Should Consider

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/multi-cloud-backup-solutions/

Multi-Cloud backup

We came across Kevin Soltow’s blog, VMWare Blog, and invited him to contribute a post on a topic that’s getting a lot of attention these days: using multiple public clouds for storing data to increase data redundancy and also to save money. We hope you enjoy the post.
Kevin Soltow
Kevin lives in Zumikon, Switzerland, where he works as a Storage Consultant specializing in VMware technologies and storage virtualization. He graduated from the Swiss Federal Institute of Technology in Zurich, and now works on implementing disaster recovery and virtual SAN solutions.

Nowadays, it’s hard to find a company without a backup or DR strategy in place. An organization’s data has become one of its most important assets, so making sure it remains safe and available is a key priority. But does it really matter where your backups are stored? If you follow the “3-2-1” backup rule, you know the answer. You should have at least three copies of your data, two of which are local but on different media, and at least one of which is offsite. That all sounds reasonable.

What about the devices to store your backup data?

Tapes — Tapes came first. They offer large capacity and can keep data for a long time, but unfortunately they are slow. Historically, they have been less expensive than disks.

Disks — Disks have great capacity, are more durable and faster than tapes, and are improving rapidly in capacity, speed, and cost-per-unit-stored.

In a previous post, Looking for the most affordable cloud storage? AWS vs Azure vs Backblaze B2, I looked at the cost of public cloud storage. With reasonably-priced cloud services available, cloud can be that perfect offsite option for keeping your data safe.

No cloud storage provider can guarantee 100% accessibility and security. Sure, they get close, with claims of 99-point-some-number-of-nines durability, but an unexpected power outage or disaster could knock out their services for minutes, hours, or longer. This happened to Amazon S3 last year, when the service suffered a major disruption. This year, S3 was down again due to a power outage. Fortunately, Amazon did not lose its customers’ data. The key words in the 3-2-1 backup rule are at least one copy offsite. More is always better.

Keeping data in multiple clouds provides a clear advantage for reliability, and it can provide cost savings as well. Using multiple cloud providers gives you geographically dispersed backups while taking advantage of the lower storage costs available from competitively priced providers.

In this post, we take a closer look at solutions that support multiple public clouds and allow you to keep several backup copies in different and dispersed clouds.

The Backup Process

The backup process is illustrated in the figure below:

diagram of the backup process from local to cloud

Some solutions create backups and move them to the repository. Data is kept there for a while and then shifted to the cloud where it stays as long as needed.

In this post, I discuss the dedicated software serving as a “data router,” in other words, the software involved in the process of moving data from some local repository to one or more public clouds.

software to send backups to cloud diagram

Let’s have a look at the options we have to achieve this.

1 — Rclone

When I considered solutions that let you back up your data to several clouds, Rclone and CloudBerry were the first that popped into my head. Rclone acts as a data mover, synchronizing your local repository with cloud-based object storage. You basically create a backup using something else (e.g., Veeam Backup & Replication), store it on-premises, and Rclone sends it to one or more clouds. First developed for Linux, Rclone provides a command-line interface to sync files and directories between clouds.

OS Compatibility

The solution can be run on all major OSs via its command-line interface.

Cloud Support

The solution works with most popular public cloud storage platforms, such as Backblaze B2, Microsoft Azure, Amazon S3 and Glacier, Google Cloud Platform, etc.

Feature set

Rclone commands work wonderfully with just about any remote storage system, be it public cloud storage or a backup server located somewhere else. It can also send data to multiple places simultaneously, but bi-directional sync does not work yet; in other words, changes you make to your files in the cloud do not affect their local copies. The synchronization process is incremental on a file-by-file basis. It should also be noted that Rclone preserves timestamps on files, which helps when searching for the right backup.

Rclone provides two options for moving data to the cloud: sync and copy. The first, sync, makes the cloud destination match the specified local directory, so backups that appear there are uploaded on the next run, and files removed locally are also removed from the cloud. The second, copy, as its name suggests, only copies data from on-premises to the cloud: deleting your files locally won’t affect the ones stored in the cloud. There’s also an option to verify hash equality.
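
Here’s a minimal sketch of what those two modes look like from the command line, assuming you’ve already run rclone config to define a B2 remote; the remote name, bucket, and paths are hypothetical:

rclone copy /backups/veeam b2remote:my-backup-bucket/veeam    # copy: upload new and changed files, never delete in the cloud
rclone sync /backups/veeam b2remote:my-backup-bucket/veeam    # sync: make the cloud path exactly match the local directory
rclone check /backups/veeam b2remote:my-backup-bucket/veeam   # compare hashes of local files against the cloud copies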

Learn more about Rclone: https://rclone.org/

2 — CloudBerry Backup

CloudBerry Backup is built on the backup technology the company developed for service providers and enterprise IT departments. It is a cross-platform solution. Note that it’s full-fledged backup software, allowing you not only to move backups to the cloud but also to create them.

OS compatibility

CloudBerry is a cross-platform solution.

Cloud Support

So far, the solution can talk to Backblaze B2, Microsoft Azure, Amazon S3 and Glacier, Google Cloud Platform, and more.

Feature set

Designed for large IT departments and managed service providers, CloudBerry Backup provides some features that make the solution really handy for the big guys. It offers the opportunity for client customization up to and including the complete rebranding of the solution.

Let’s look at the backup side of things. The solution allows backing up files and directories manually, and if you prefer, you can sync a selected directory to the root of a bucket. CloudBerry Backup can also schedule backups. Now, you won’t miss them! Another handy feature is backup job management and monitoring, which keeps you aware of the backup processes running on client machines.

The solution offers AES 256-bit end-to-end encryption to ensure your data safety.

Learn more about CloudBerry Backup: https://www.cloudberrylab.com/managed-backup.aspx

Read Backblaze’s blog post, How To Back Up Your NAS with B2 and CloudBerry.

3 — StarWind VTL

Some organizations still use a Virtual Tape Library (VTL) but want to sync their tape objects to the cloud as well.

OS compatibility

This product is available only for Windows.

Cloud Support

So far, StarWind VTL can talk to popular cloud storage platforms like Backblaze B2, AWS S3 and Glacier, and Microsoft Azure.

Feature set

The product has many features for anyone who wants to back up to the cloud. First, it can send data to the appropriate cloud storage tier and automatically de-stage it afterward; this automation is what makes StarWind VTL really cool. Second, the product supports both on-premises and public cloud object storage. Third, StarWind VTL supports deduplication and compression, making storage utilization more efficient. The solution also allows client-side encryption.

StarWind supports standard “LTO” (Linear Tape-Open) protocols. This appeals to organizations that have LTO in place since it allows adoption of more scalable, cost efficient cloud storage without having to update the internal backup infrastructure.

All operations in the StarWind VTL environment are done via the Management Console and the Web-Console, a web interface that makes VTL accessible from any browser.

Learn more about StarWind Virtual Tape Library: https://www.starwindsoftware.com/starwind-virtual-tape-library

Also, see Backblaze’s post on StarWind VTL: Connect Veeam to the B2 Cloud: Episode 2 — Using StarWind VTL

4 — Duplicati

Duplicati was designed from the ground up for online backup. It can send your data directly to multiple clouds or use local storage as a backend.

OS compatibility

It is free and compatible with Windows, macOS, and Linux.

Cloud Support

So far, the solution talks to Backblaze B2, Amazon S3, Mega, Google Cloud Storage, and Microsoft Azure.

Feature set

Duplicati has some awesome features. First, the solution is free. Notably, its team does not restrict using this software for free even for commercial purposes. Second, Duplicati employs decent encryption, compression, and deduplication, making your storage more efficient and safe. Third, the solution adds timestamps to your files, so you can easily find the specific backup. Fourth, the backup scheduler helps make users’ lives simpler. Now, you won’t miss the backup time!

What makes this piece of software special and really handy is backup content verification. Indeed, you never know whether a backup is any good until you actually restore from it. Thanks to this feature, you can pinpoint problems before it is too late.

Duplicati is managed via a web interface, making it possible to access from anywhere and any platform.

Learn more about Duplicati: https://www.duplicati.com/.

Read Backblaze’s post on Duplicati: Duplicati, a Free, Cross-Platform Client for B2 Cloud Storage.

5 — Duplicacy

Duplicacy supports popular public cloud storage services. Apart from the cloud, it can use SFTP servers and NAS boxes as backends.

OS compatibility

The solution is compatible with Windows, Mac OS X, and Linux.

Cloud Support

Duplicacy can offload data to Backblaze B2, Amazon S3, Google Cloud Storage, Microsoft Azure, and more.

Feature set

Duplicacy not only routes your backups to the cloud but also creates them. Each backup created by this solution is incremental, yet each is treated as a full snapshot, which simplifies restoring, deleting, and moving backups between storage locations. Duplicacy can send your files to multiple cloud storage services and uses strong client-side encryption. Another cool thing about this solution is its ability to give multiple clients simultaneous access to the same storage.
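
For the command-line version, a first backup looks something like this minimal sketch; the folder, repository ID, and bucket name are hypothetical, and the storage URL syntax may differ between versions, so check the Duplicacy documentation:

cd ~/work-folder                                  # the folder you want to back up
duplicacy init my-work b2://my-duplicacy-bucket   # tie this folder to a B2 bucket as its backup storage
duplicacy backup -stats                           # take the first snapshot; later runs upload only what changed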

Duplicacy has a comprehensive GUI that features one-page configuration for quick backup scheduling and managing retention policies. If you are a command-line interface fan, you can manage Duplicacy via the command line.

Learn more about Duplicacy: https://duplicacy.com/

Read Backblaze’s Knowledge Base article, How to upload files to B2 with Duplicacy.

So, Should You Store Your Data in More Than One Cloud?

Undoubtedly, keeping a copy of your data in the public cloud is a great idea and enables you to comply with the 3-2-1 backup rule. By going beyond that and adopting a multi-cloud strategy, it is possible to save money and also gain additional data redundancy and security by having data in more than one public cloud service.

As I’ve covered in this post, there are a number of wonderful backup solutions that can talk to multiple public cloud storage services. I hope this article proves useful to you and you’ll consider employing one of the reviewed solutions in your backup infrastructure.

Kevin Soltow

The post Five Cool Multi-Cloud Backup Solutions You Should Consider appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

What’s the Diff: VMs vs Containers

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/vm-vs-containers/

What's the Diff: Containers vs VMs

Both VMs and containers can help get the most out of available computer hardware and software resources. Containers are the new kids on the block, but VMs have been, and continue to be, tremendously popular in data centers of all sizes.

If you’re looking for the best solution for running your own services in the cloud, you need to understand these virtualization technologies, how they compare to each other, and what the best uses are for each. Here’s our quick introduction.

Basic Definitions — VMs and Containers

What are VMs?

A virtual machine (VM) is an emulation of a computer system. Put simply, it makes it possible to run what appear to be many separate computers on hardware that is actually one computer.

The operating systems (“OS”) and their applications share hardware resources from a single host server, or from a pool of host servers. Each VM requires its own underlying OS, and the hardware is virtualized. A hypervisor, or a virtual machine monitor, is software, firmware, or hardware that creates and runs VMs. It sits between the hardware and the virtual machine and is necessary to virtualize the server.

Since the advent of affordable virtualization technology and cloud computing services, IT departments large and small have embraced virtual machines (VMs) as a way to lower costs and increase efficiencies.

Virtual Machine System Architecture Diagram

VMs, however, can take up a lot of system resources. Each VM runs not just a full copy of an operating system, but a virtual copy of all the hardware that the operating system needs to run. This quickly adds up to a lot of RAM and CPU cycles. That’s still economical compared to running separate actual computers, but for some applications it can be overkill.

That led to the development of containers.

Benefits of VMs

  • All OS resources available to apps
  • Established management tools
  • Established security tools
  • Better known security controls
Popular VM Providers

What are Containers?

With containers, instead of virtualizing the underlying computer like a virtual machine (VM), just the OS is virtualized.

Containers sit on top of a physical server and its host OS — typically Linux or Windows. Each container shares the host OS kernel and, usually, the binaries and libraries, too. Shared components are read-only. Sharing OS resources such as libraries significantly reduces the need to reproduce the operating system code, and means that a server can run multiple workloads with a single operating system installation. Containers are thus exceptionally “light” — they are only megabytes in size and take just seconds to start. In contrast, VMs take minutes to start and are an order of magnitude larger than an equivalent container.

In contrast to VMs, all that a container requires is enough of an operating system, supporting programs and libraries, and system resources to run a specific program. What this means in practice is that you can put two to three times as many applications on a single server with containers as you can with VMs. In addition, with containers you can create a portable, consistent operating environment for development, testing, and deployment.

Containers System Architecture Diagram

Types of Containers

Linux Containers (LXC) — The original Linux container technology is Linux Containers, commonly known as LXC. LXC is a Linux operating system level virtualization method for running multiple isolated Linux systems on a single host.

Docker — Docker started as a project to build single-application LXC containers, introducing several changes to LXC that make containers more portable and flexible to use. It later morphed into its own container runtime environment. At a high level, Docker is a Linux utility that can efficiently create, ship, and run containers.
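
To make that lightness concrete, here’s a minimal sketch using the Docker CLI; the image, container name, and port mapping are arbitrary examples:

docker run -d --name web -p 8080:80 nginx:alpine   # launch an isolated NGINX web server in seconds
docker ps                                          # list running containers
docker stop web && docker rm web                   # stop and remove the container when you’re done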

Benefits of Containers

  • Reduced IT management resources
  • Reduced size of snapshots
  • Quicker spinning up apps
  • Reduced & simplified security updates
  • Less code to transfer, migrate, upload workloads
Popular Container Providers

Uses for VMs vs Uses for Containers

Both containers and VMs have benefits and drawbacks, and the ultimate decision will depend on your specific needs, but there are some general rules of thumb.

  • VMs are a better choice for running apps that require all of the operating system’s resources and functionality, when you need to run multiple applications on servers, or have a wide variety of operating systems to manage.
  • Containers are a better choice when your biggest priority is maximizing the number of applications running on a minimal number of servers.
What’s the Diff: VMs vs. Containers
VMs:
  • Heavyweight
  • Limited performance
  • Each VM runs in its own OS
  • Host OS can be different from the guest OS
  • Startup time measured in minutes
  • Hardware-level virtualization
  • Allocates the required memory
  • Fully isolated, and hence more secure

Containers:
  • Lightweight
  • Native performance
  • All containers share the host OS
  • Host OS and container OS are the same
  • Startup time measured in milliseconds
  • OS-level virtualization
  • Requires less memory space
  • Process-level isolation, and hence less secure

For most, the ideal setup is likely to include both. With the current state of virtualization technology, the flexibility of VMs and the minimal resource requirements of containers work together to provide environments with maximum functionality.

If your organization is running a large number of instances of the same operating system, then you should look into whether containers are a good fit. They just might save you significant time and money over VMs.

Are you Using VMs, Containers, or Both?

We will explore this topic in greater depth in subsequent posts. If you are using VMs or containers, we’d love to hear from you about what you’re using and how you’re using them.

The post What’s the Diff: VMs vs Containers appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

How Security Mindfulness Can Help Prevent Data Disasters

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/what-is-cyber-security/

A locked computer screen

A few years ago, I was surprised by a request to consult with the Pentagon on cybersecurity. It surprised me because I have no military background, and it was the Pentagon, whom I suspected already knew a thing or two about security.

I learned that the consulting project was to raise the awareness of cybersecurity among the people who work at the Pentagon and on military bases. The problem they were having was that some did not sufficiently consider the issue of cybersecurity when they dealt with email, file attachments, and passwords, and in their daily interactions with fellow workers and outside vendors and consultants. If these sound like the same vulnerabilities that the rest of us have, you’re right. It turned out that the military was no different than we are in tackling the problem of cybersecurity in their day-to-day tasks.

That’s a problem. These are the people whose primary job requirement is to be vigilant against threats, and yet some were less than vigilant with their computer and communications systems.

But, more than highlighting a problem with just the military, it made me realize that this problem likely extended beyond the military. If the people responsible for defending the United States can’t take cybersecurity seriously, then how can the rest of us be expected to do so?

And, perhaps even more challenging: how do those of us in the business of protecting data and computer assets fix this problem?

I believe that the campaign I created to address this problem for the Pentagon also has value for other organizations and businesses. We all need to understand how to maintain and encourage security mindfulness as we interact with computer systems and other people.

Technology is Not Enough

We continually focus on what we can do with software and hardware to fight against cyber attacks. “Fighting fire with fire” is a natural and easy way of thinking.

The problem is that the technology used to attack us will continually evolve, which means that our technological responses must similarly evolve. The attackers have the natural advantage. They can innovate and we, the defenders, can only respond. It will continue like that, with attacks and defenses leapfrogging each other over and over while we, the defenders, try to keep up. It’s a game where we can never get ahead because the attackers have a multitude of weaknesses to exploit while the defenders have to guess which vulnerability will be exploited next. It’s enough to want to put the challenge out of your mind completely.

So, what’s the answer?

Let’s go back to the Pentagon’s request. It struck me that what the Pentagon was asking me to do was a classic marketing branding campaign. They wanted to make people more aware of something and to think in a certain manner about it. In this case, instead of making people think that using a certain product would make them happier and more successful, the task was to take a vague threat that wasn’t high on people’s list of things to worry about and turn into something that engaged them sufficiently that they changed their behavior.

I didn’t want to try to make cyber attacks more scary — an idea that I rejected outright — but I did want to try to make people understand the real threat of cyber attacks to themselves, their families, and their livelihoods.

Managers and sysadmins face this challenge daily. They make systems as secure as possible, they install security updates, they create policies for passwords, email, and file handling, yet breaches still happen. It’s not that workers are oblivious to the problem, or don’t care about it. It’s just that they have plenty of other things to worry about, and it’s easy to forget about what they should be doing to thwart cyber attacks. They aren’t being mindful of the possibility of intrusions.

Raising Cybersecurity Awareness

People respond most effectively to challenges that are immediate and present. Abstract threats and unlikely occurrences don’t rise sufficiently above the noise level to register in our consciousness. When a flood is at your door, the threat is immediate and we respond. Our long-term health is important enough that we take action to protect it through insurance, check-ups, and taking care of ourselves because we have been educated or seen what happens if we neglect those preparations.

Both of the examples above — one immediate and one long-term — have gained enough mindfulness that we do something about them.

The problem is that there are so many possible threats to us that to maintain our sanity we ignore all but the most immediate and known threats. A threat becomes real once we’ve experienced it as a real danger. If someone has experienced a cyber attack, the experience likely resulted in a change in behavior. A shift in mindfulness made it less likely that the event would occur again due to a new level of awareness of the threat.

Making Mindfulness Work

One way to make an abstract threat seem more real and more possible is to put it into a context that the person is already familiar with. It then becomes more real and more of a possibility.

That’s what I did for the Pentagon. I put together a campaign to raise the level of mindfulness of the threat of cyber attack by associating it with something they were already familiar with and considered serious.

I chose the physical battlefield. I branded the threat of cyber attack as the “Silent Battlefield.” This took something that was not a visible, physical threat and turned it into something that was already perceived as a place where actual threats exist: the battlefield. Cyber warfare is silent compared to physical combat, of course, so the branding associated it with the field of combat. At the same time it perhaps also made the threat more insidious; cyber warfare is silent. You don’t hear a shell whistling through the air to warn you of the coming damage. When the enemy is silent, your only choice is be mindful of the threat and therefore, prepared.

Can this approach work in other contexts, say, a business office, an IT department, a school, or a hospital? I believe it can if the right cultural context is found to increase mindfulness of the problem and how to combat it.

First, find a correlative for the threat that makes it real in that particular environment. For the military, it was the battlefield. For a hospital, the correlative might be a disease attempting to invade a body.

Second, use a combination of messages using words, pictures, audio, and video to get the concept across. This is a branding campaign, so just like a branding campaign for a product or service, multiple exposure and multiple delivery mechanisms will increase the effectiveness of the campaign.

Third, frame security measures as positive rather than negative. Focus on the achievement of a positive outcome rather than the avoidance of a negative result. Examples of positive framing of security measures include:

  • backing up regularly enabled the restoration of an important document that was lost or an earlier draft of a plan containing important information
  • recognizing suspicious emails and attachments avoided malware and downtime
  • showing awareness of various types of phishing campaigns enabled the productive continuation of business
  • creating and using unique and strong passwords and multi-factor verification for accounts avoided having to recreate accounts, credentials, and data
  • showing insight into attempts at social engineering and manipulation was evidence of intelligence and value to the organization

Fourth, demonstrate successful outcomes by highlighting thwarted cyber incursions. Give credit to those who are modeling a proactive attitude. Everyone in the organization should reinforce the messages and give positive reinforcement to effective measures when they are employed.

Other things to do to increase mindfulness are:

  • Reduce stress: A stressful workplace reduces anyone’s ability to be mindful. Remove other threats so there are fewer things to worry about.
  • Encourage a “do one thing now” attitude: Be very clear about what’s important. Make sure that security mindfulness is considered important enough to devote time to.
  • Show positive results and emphasize victories: Highlight behaviors and actions that defeated attempts to breach security and resulted in good outcomes. Make it personal by giving credit to individuals who have done something specific that worked.

You don’t have to study at a zendō to develop the prerequisite mindfulness to improve computer security. If you’re the person whose job it is to instill mindfulness, you need to understand how to make the threats of malware, ransomware, and other security vectors real to the people who must be vigilant against them every day, and find the cultural and psychological context that works in their environment.

If you can find a way to encourage that security mindfulness, you’ll create an environment where a concern for security is part of the culture and thereby greatly increase the resistance of your organization against cyber attacks.

The post How Security Mindfulness Can Help Prevent Data Disasters appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

How to Connect Your QNAP NAS to B2 Cloud Storage

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/qnap-nas-backup-to-cloud/

QNAP NAS and B2 Cloud Storage

Network-Attached-Storage (NAS) devices are great for local backups and archives of data. They have become even more capable, now often taking over functions that used to be reserved for servers.

QNAP produces a popular line of networking products, including NAS units that can work with Macintosh, Windows, Linux, and other OS’s. QNAP’s NAS products are used in office, home, and professional environments for storage and a variety of applications, including business, development, home automation, security, and entertainment.

Data stored on a QNAP NAS can be backed up to Backblaze B2 Cloud Storage using QNAP’s Hybrid Backup Sync application, which consolidates backup, restoration and synchronization functions into a single QTS application. With the latest releases of QTS and Hybrid Backup Sync (HBS), you can now sync the data on your QNAP NAS to and from Backblaze B2 Cloud Storage.

How to Set up QNAP’s Hybrid Backup Sync to Work With B2 Cloud Storage

To set up your QNAP with B2 sync support, you’ll need access to your B2 account. You’ll also need your B2 Account ID, application key and bucket name — all of which are available after you log into your Backblaze account. Finally, you’ll need the Hybrid Backup Sync application installed in QTS on your QNAP NAS. You’ll need QTS 4.3.3 or later and Hybrid Backup Sync v2.1.170615 or later.

  1. Open the QTS desktop in your web browser.

QNAP QTS Desktop

  2. If it’s not already installed, install the latest Hybrid Backup Sync from the App Center.

QNAP QTS AppCenter

  3. Click on Hybrid Backup Sync from the desktop.
  4. Click the Sync button to create a new connection to B2.

QNAP Hybrid Backup Sync

  5. Select “One-way Sync” and “Sync with the cloud.”

QNAP Hybrid Backup Sync -- Create Sync Job

  6. Select “Local to cloud sync.”

QNAP Hybrid Backup Sync -- Sync with the cloud

  7. Select an existing Account (job) or just select “Backblaze B2” to create a new one.

QNAP Hybrid Backup Sync -- Select Account

  8. Enter a display name for this job, and your Account ID and Application key for your Backblaze B2 account.

QNAP Hybrid Backup Sync -- Create Account

  9. Select the source folder on the NAS you’d like to sync, and the bucket name and folder name on B2 for the destination. If you’d like to sync immediately, select the “Sync Now” checkbox. Click “Advanced Settings” if you’d like to configure a backup schedule, select client-side encryption, compression, filters, file replacement policies, and other options. Click “Apply.” If you selected “Sync Now” your job will start.

QNAS Hybrid Backup Sync -- Create Sync Job

QNAP Hybrid Backup Sync -- Advanced Settings

  10. After you’ve finished configuring your job, you will see the “All Jobs” dialog with the status of all your jobs.

QNAP Hybrid Backup Sync -- All Jobs

What Can You Do With B2 and QNAP Hybrid Backup Sync?

The Hybrid Backup Sync app provides you with total control over what gets backed up to B2. You can synchronize in the cloud as little or as much as you want. Here are some practical examples of what you can do with Hybrid Backup Sync and B2 working together.

1 — Sync the Entire Contents of your QNAP to the Cloud

The QNAP NAS has excellent fault-tolerance — it can continue operating even when individual drive units fail — but nothing in life is foolproof. It pays to be prepared in the event of a catastrophe. If you follow our 3-2-1 Backup Strategy, you know how important it is to make sure that you have a copy of your files in the cloud.

2 — Sync Your Most Important Media Files

Using your QNAP to store movies, music and photos? You’ve invested untold amounts of time, money, and effort into collecting those media files, so make sure they’re safely and securely synced to the cloud with Hybrid Backup Sync and B2.

3 — Back Up Time Machine and Other Local Backups

Apple’s Time Machine software provides Mac users with reliable local backup, and many of our customers rely on it to provide that crucial first step in making sure their data is secure.

QNAP enables the NAS to act as a network-based Time Machine backup. Those Time Machine files can be synced to the cloud, so you can make sure to have Time Machine files to restore from in the event of a critical failure.

If you use Windows or Linux, you can configure the QNAP NAS as the destination for your Windows or Linux local data backup. That, in turn, can be synced to the cloud from the NAS.

Why B2?

B2 is the best value in cloud storage. The cost to store data in the B2 cloud is up to 75 percent less than the competition. You can see for yourself with our B2 Cost Calculator.

If you haven’t given B2 a try yet, now is the time. You can get started with B2 and your QNAP NAS right now, and make sure your NAS is synced securely and automatically to the cloud.

The post How to Connect Your QNAP NAS to B2 Cloud Storage appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Advanced Cloud Backup Solutions

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/advanced-cloud-backup-solutions/

Advanced Cloud Backup Solutions Diagram

Simple and easy backup is great for most people. For those who find configuring and managing backup challenging, it can mean the difference between having a computer regularly backed up and doing nothing at all.

For others, simple and easy doesn’t cut it. They have needs that go beyond install-and-forget, or have platforms and devices that aren’t handled by a program like Backblaze Computer Backup. Backing up Windows Servers, Linux boxes, NAS devices, or even VMs all requires specialized software. Beyond a device or platform need, there are advanced users who want to fine-tune or automate how backup fits into their other IT tasks.

For these people, there are a number of storage solutions with applications and integrations that will fulfill their needs. Backblaze B2 Cloud Storage, for example, is general purpose object cloud storage that can be used for a wide range of uses, including data backup and archiving. You can think of B2 as an infinitely large and secure storage drive in the cloud that’s ready for anything you want to do with it. Backblaze provides the storage, along with a robust API, web interface, and a CLI for the do-it-yourself crowd. In addition, there’s a long list of partner integrations to address specific market segments, platforms, use cases, and feature requirements. You just bring the data, and we get you backed up securely and affordably.

Advanced Backup Needs & Solutions

There’s a wide range of features and use cases that could be called advanced backup. Some of these cases go beyond what we define as backup and include archiving. We distinguish backup and archiving in the following way. Backup is making one or more copies of currently active data in case you want to retrieve a previous version of the data or something happens to your primary copy. Archiving is a record of your data at a moment in time. Backblaze Computer Backup is a data backup solution. Backblaze B2, being general purpose, can be used for either backup or archiving, depending on the user’s needs. Recovery is another possible use of object cloud storage that we’ll cover in future posts.

A Dozen Advanced Cloud Backup Use Cases

Below you’ll find a dozen capabilities, use cases, and features that could fall in the category of advanced cloud backup and archiving. All twelve can be addressed with a combination of Backblaze B2 Cloud Storage and a Backblaze solution, or in concert with one of our many integration partners.

1 — File and Directory/Folder Selection
The vast majority of users want all their data backed up. The Backblaze Computer Backup client backs everything up by default and lets users exclude what they want. Some advanced users prefer to manually select what specific drives, folders, and directories are included for backup and/or be able to set global rules for inclusion and exclusion.
2 — Deduplication, Snapshots
Some IT professionals are looking to deduplicate data across multiple machines before backups are made. Others want granular control of recovery to a specific point in time through the use of “snapshots.”
3 — Archiving and Custom Retention Policies, Lifecycle, Versioning
This feature set includes the ability to specify how long a given snapshot of data should be kept (e.g. how long do I want the version of my computer from Jan 7, 2009 to be saved?) Permutations of this feature set include how many versions of a backup file should be retained, and when they should be discarded, if desired.
4 — Platform and Interface
Most computer users are running on Windows or Macintosh, but others are on Linux, Solaris, FreeBSD, or other OSs. Clients and applications can be available in either command-line (CLI) or graphical user interface (GUI) versions, or sometimes both.
Macintosh, Windows, Linux
5 — Servers and NAS
A common need of advanced users and IT managers is the ability to back up servers and Network-Attached Storage (NAS) devices. In some cases, the servers are virtual machines (VMs) that have special backup needs.
Server
6 — Media
Video and photos have their own backup requirements and workflows. These include the ability to attach metadata about where and when the media was created, what equipment was used, what the subject and content are, and other information. Special search technologies are used to retrieve the stored media. Media files also tend to be large, which brings with it extra storage requirements. People who work with media have specific workflows they use, and ways of working with collaborators and production resources that put demands on the data storage. Transcoding and other processes may be available that can change or repurpose the stored data.
Up, Up to the Cloud
7 — Local and Cloud Backups, Multiple Destinations
Some advanced backup needs include backing up to multiple destinations, which can include a local drive or network device, a server, or various remote destinations using various connection and file transfer protocols (FTP, SMB, SSH, WebDav, etc.).
8 — Advanced Scheduling & Custom Actions, Automation
Advanced backup often includes the ability to schedule actions to be repeated at specific times and dates, or to be triggered by actions or conditions, such as if a file is changed or a program has been run and had a specific outcome.
9 — Advanced Security & Encryption
Security is a concern of every person storing data, but some users have specific requirements for how and where data is encrypted, where the keys are stored, who has access, and what the recovery options are. In addition, some organizations, agencies, and governments have specific requirements for the type of encryption used, who has access to the data, and more.
10 — Mass Data Ingress
Some users have large amounts of data that would take a long time to transfer even over the fastest network connection. These users are interested in other ways of seeding cloud storage, including shipping a physical drive directly to the cloud purveyor.
Backblaze B2 Fireball
11 — Virtual/Hybrid Storage
A local storage device seamlessly and transparently extends storage needs to the cloud.
12 — WordPress Backup
WordPress is the most popular CMS (Content Management System) for websites, with almost 30% of all websites in the world using WordPress — over 350 million. Backup systems for WordPress typically integrate directly with WordPress through a free or paid plugin installed with WordPress.

A Wide Range of Backup Options and Capabilities

By combining general purpose object cloud storage with custom or off-the-shelf integrations, a wide range of solutions can be created for advanced backup and archiving. We’ve already addressed a number of the use cases above on this blog, and we’ll be addressing more in future posts. We hope you’ll come back and check those out.

Let Us Know What You’re Doing for Advanced Backup and Archiving

Please tell us about any advanced uses you have, or uses you would like to see addressed in future posts. Just drop us a note in the comments.

The post Advanced Cloud Backup Solutions appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Getting Rid of Your Mac? Here’s How to Securely Erase a Hard Drive or SSD

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/how-to-wipe-a-mac-hard-drive/

erasing a hard drive and a solid state drive

What do I do with a Mac that still has personal data on it? Do I take out the disk drive and smash it? Do I sweep it with a really strong magnet? Is there a difference in how I handle a hard drive (HDD) versus a solid-state drive (SSD)? Well, taking a sledgehammer or projectile weapon to your old machine is certainly one way to make the data irretrievable, and it can be enormously cathartic as long as you follow appropriate safety and disposal protocols. But there are far less destructive ways to make sure your data is gone for good. Let me introduce you to secure erasing.

Which Type of Drive Do You Have?

Before we start, you need to know whether you have an HDD or an SSD. To find out, or at least to make sure, click on the Apple menu and select “About This Mac.” Once there, select the “Storage” tab to see which type of drive is in your system.

The first example, below, shows a SATA Disk (HDD) in the system.

SATA HDD

In the next case, we see we have a Solid State SATA Drive (SSD), plus a Mac SuperDrive.

Mac storage dialog showing SSD

The third screen shot shows an SSD, as well. In this case it’s called “Flash Storage.”

Flash Storage

Make Sure You Have a Backup

Before you get started, you’ll want to make sure that any important data on your hard drive has moved somewhere else. OS X’s built-in Time Machine backup software is a good start, especially when paired with Backblaze. You can learn more about using Time Machine in our Mac Backup Guide.

With a local backup copy in hand and secure cloud storage, you know your data is always safe no matter what happens.

Once you’ve verified your data is backed up, roll up your sleeves and get to work. The key is OS X Recovery — a special part of the Mac operating system since OS X 10.7 “Lion.”

How to Wipe a Mac Hard Disk Drive (HDD)

NOTE: If you’re interested in wiping an SSD, see below.

    1. Make sure your Mac is turned off.
    2. Press the power button.
    3. Immediately hold down the command and R keys.
    4. Wait until the Apple logo appears.
    5. Select “Disk Utility” from the OS X Utilities list. Click Continue.
    6. Select the disk you’d like to erase by clicking on it in the sidebar.
    7. Click the Erase button.
    8. Click the Security Options button.
    9. The Security Options window includes a slider that enables you to determine how thoroughly you want to erase your hard drive.

There are four notches to that Security Options slider. “Fastest” is quick but insecure — data could potentially be rebuilt using a file recovery app. Moving that slider to the right introduces progressively more secure erasing. Disk Utility’s most secure level erases the information used to access the files on your disk, then writes zeroes across the disk surface seven times to help remove any trace of what was there. This setting conforms to the DoD 5220.22-M specification.

  10. Once you’ve selected the level of secure erasing you’re comfortable with, click the OK button.
  11. Click the Erase button to begin. Bear in mind that the more secure the method you select, the longer it will take. The most secure methods can add hours to the process.

Once it’s done, the Mac’s hard drive will be clean as a whistle and ready for its next adventure: a fresh installation of OS X, being donated to a relative or a local charity, or just sent to an e-waste facility. Of course you can still drill a hole in your disk or smash it with a sledgehammer if it makes you happy, but now you know how to wipe the data from your old computer with much less ruckus.

The above instructions apply to older Macintoshes with HDDs. What do you do if you have an SSD?

Securely Erasing SSDs, and Why Not To

Most new Macs ship with solid state drives (SSDs). Only the iMac and Mac mini ship with regular hard drives anymore, and even those are available in pure SSD variants if you want.

If your Mac comes equipped with an SSD, Apple’s Disk Utility software won’t actually let you zero the hard drive.

Wait, what?

In a tech note posted to Apple’s own online knowledgebase, Apple explains that you don’t need to securely erase your Mac’s SSD:

With an SSD drive, Secure Erase and Erasing Free Space are not available in Disk Utility. These options are not needed for an SSD drive because a standard erase makes it difficult to recover data from an SSD.

In fact, some folks will tell you not to zero out the data on an SSD, since doing so causes wear on the memory cells that, over time, can affect their reliability. I don’t think that’s nearly as big an issue as it used to be — SSD reliability and longevity have improved.

If “Standard Erase” doesn’t quite make you feel comfortable that your data can’t be recovered, there are a couple of options.

FileVault Keeps Your Data Safe

One way to make sure that your SSD’s data remains secure is to use FileVault. FileVault is whole-disk encryption for the Mac. With FileVault engaged, you need a password to access the information on your hard drive. Without the password, that data stays encrypted and unreadable.

There’s one potential downside of FileVault — if you lose your password or the encryption key, you’re screwed: You’re not getting your data back any time soon. Based on my experience working at a Mac repair shop, losing a FileVault key happens more frequently than it should.

When you first set up a new Mac, you’re given the option of turning FileVault on. If you don’t do it then, you can turn on FileVault at any time by opening your Mac’s System Preferences, clicking Security & Privacy, and selecting the FileVault tab. Be warned, however, that the initial encryption process can take hours, as will decryption if you ever need to turn FileVault off.
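If you prefer the command line, macOS also includes the fdesetup tool for managing FileVault. A quick sketch — enabling FileVault this way requires administrator rights and will display a recovery key that you should store somewhere safe:

# Check whether FileVault is currently on or off
fdesetup status

# Turn FileVault on (prompts for credentials and prints a recovery key)
sudo fdesetup enable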

With FileVault turned on, you can restart your Mac into its Recovery System (by restarting the Mac while holding down the command and R keys) and erase the hard drive using Disk Utility, once you’ve unlocked it (by selecting the disk, clicking the File menu, and clicking Unlock). That deletes the FileVault key, which means any data on the drive is useless.

FileVault doesn’t impact the performance of most modern Macs, though I’d suggest only using it if your Mac has an SSD, not a conventional hard disk drive.

Securely Erasing Free Space on Your SSD

If you don’t want to take Apple’s word for it, if you’re not using FileVault, or if you just want to, there is a way to securely erase free space on your SSD. It’s a little more involved but it works.

Before we get into the nitty-gritty, let me state for the record that this really isn’t necessary, which is why Apple has made it so hard to do. But if you’re set on it, you’ll need to use Apple’s Terminal app. Terminal provides you with command line interface access to the OS X operating system. Terminal lives in the Utilities folder, but you can access Terminal from the Mac’s Recovery System, as well. Once your Mac has booted into the Recovery partition, click the Utilities menu and select Terminal to launch it.

From a Terminal command line, type:

diskutil secureErase freespace VALUE /Volumes/DRIVE

That tells your Mac to securely erase the free space on your SSD. You’ll need to change VALUE to a number between 0 and 4. 0 is a single-pass run of zeroes; 1 is a single-pass run of random numbers; 2 is a 7-pass erase; 3 is a 35-pass erase; and 4 is a 3-pass erase. DRIVE should be changed to the name of your drive. To run a 7-pass erase of the free space on an SSD named “JohnB-Macbook”, you would enter the following:

diskutil secureErase freespace 2 /Volumes/JohnB-Macbook

And remember, if you used a space in the name of your Mac’s hard drive, you need to insert a backslash before the space. For example, to run a 35-pass erase on a hard drive called “Macintosh HD” you enter the following:

diskutil secureErase freespace 3 /Volumes/Macintosh\ HD

Something to remember is that the more extensive the erase procedure, the longer it will take.
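One more practical tip: if you’re not sure exactly how your drive is named, listing the /Volumes directory shows the mounted volume names — spaces and all — exactly as they need to appear in the command above.

# Show the names of all mounted volumes
ls /Volumes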

When Erasing is Not Enough — How to Destroy a Drive

If you absolutely, positively need to be sure that all the data on a drive is irretrievable, see this Scientific American article (with contributions by Gleb Budman, Backblaze CEO), How to Destroy a Hard Drive — Permanently.

The post Getting Rid of Your Mac? Here’s How to Securely Erase a Hard Drive or SSD appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Replacing macOS Server with Synology NAS

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/replacing-macos-server-with-synology-nas/

Synology NAS boxes backed up to the cloud

Businesses and organizations that rely on macOS server for essential office and data services are facing some decisions about the future of their IT services.

Apple recently announced that it is deprecating a significant portion of essential network services in macOS Server, as they described in a support statement posted on April 24, 2018, “Prepare for changes to macOS Server.” Apple’s note includes:

macOS Server is changing to focus more on management of computers, devices, and storage on your network. As a result, some changes are coming in how Server works. A number of services will be deprecated, and will be hidden on new installations of an update to macOS Server coming in spring 2018.

The note lists the services that will be removed in a future release of macOS Server, including calendar and contact support, Dynamic Host Configuration Protocol (DHCP), Domain Name Services (DNS), mail, instant messages, virtual private networking (VPN), NetInstall, Web server, and the Wiki.

Apple assures users who have already configured any of the listed services that they will be able to use them in the spring 2018 macOS Server update, but the statement ends with links to a number of alternative services, including hosted services, that macOS Server users should consider as viable replacements for the features it is removing. Many of the suggested alternatives are open-source software.

As difficult as this could be for organizations that use macOS Server, this is not unexpected. Apple left the server hardware space back in 2010, when Steve Jobs announced the company was ending its line of Xserve rackmount servers, which were introduced in May 2002. Since then, macOS Server has hardly been a prominent part of Apple’s product lineup. It’s not just the product itself that has lost some luster, but the entire category of SMB office and business servers, which has been undergoing a gradual change in recent years.

Some might wonder how important the news about macOS Server is, given that macOS Server represents a pretty small share of the server market. macOS Server has been important to design shops, agencies, education users, and small businesses that likely have been on Macs for ages, but it’s not a significant part of the IT infrastructure of larger organizations and businesses.

What Comes After macOS Server?

Lovers of macOS Server don’t have to fear having their Mac minis pried from their cold, dead hands quite yet. Installed services will continue to be available. In the fall of 2018, new installations and upgrades of macOS Server will require users to migrate most services to other software. Since many of the services of macOS Server were already open source, a change in software might not be required. It does mean, however, that more configuration and management will be required of those who continue with macOS Server.

Users can continue with macOS Server if they wish, but many will see the writing on the wall and look for a suitable substitute.

The Times They Are A-Changin’

For many people working in organizations, what is significant about this announcement is how it reflects the move away from the once ubiquitous server-based IT infrastructure. Services that used to be centrally managed and office-based, such as storage, file sharing, communications, and computing, have moved to the cloud.

In selecting the next office IT platforms, there’s an opportunity to move to solutions that reflect and support how people are working and the applications they are using both in the office and remotely. For many, this means including cloud-based services in office automation, backup, and business continuity/disaster recovery planning. This includes Software as a Service, Platform as a Service, and Infrastructure as a Service (SaaS, PaaS, IaaS) options.

IT solutions that integrate well with the cloud are worth strong consideration for what comes after a macOS Server-based environment.

Synology NAS as a macOS Server Alternative

One solution that is becoming popular is to replace macOS Server with a device that has the ability to provide important office services, but also bridges the office and cloud environments. Using Network-Attached Storage (NAS) to take up the server slack makes a lot of sense. Many customers are already using NAS for file sharing, local data backup, automatic cloud backup, and other uses. In the case of Synology, their operating system, Synology DiskStation Manager (DSM), is Linux-based, and integrates the basic functions of file sharing, centralized backup, RAID storage, multimedia streaming, virtual storage, and other common functions.

Synology NAS box

Synology NAS

Since DSM is based on Linux, there are numerous server applications available, including many of the same ones that are available for macOS Server, since macOS shares conceptual roots with Linux through its BSD Unix heritage.

Synology DiskStation Manager Package Center screenshot

Synology DiskStation Manager Package Center

According to Ed Lukacs, COO at 2FIFTEEN Systems Management in Salt Lake City, their customers have found the move from macOS Server to Synology NAS not only painless, but positive. DSM works seamlessly with macOS and has been faster for their customers, as well. Many of their customers are running Adobe Creative Suite and Google G Suite applications, so a workflow that combines local storage, remote access, and the cloud, is already well known to them. Remote users are supported by Synology’s QuickConnect or VPN.

Business continuity and backup are simplified by the flexible storage capacity of the NAS. Synology has built-in backup to Backblaze B2 Cloud Storage with Synology’s Cloud Sync, as well as a number of other B2-compatible applications, such as Cloudberry, Comet, and Arq.
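Cloud Sync itself is configured through DSM’s graphical interface, but for a sense of what a B2 sync job boils down to, here is a hedged sketch using the third-party rclone tool instead — rclone isn’t mentioned above, and the remote name, bucket, and share path are hypothetical:

# One-time setup: define a B2 remote (prompts for your B2 key ID and application key)
rclone config

# Sync a Synology shared folder to a B2 bucket
rclone sync /volume1/shared b2remote:example-bucket/shared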

Customers have been able to get up and running quickly, with only initial data transfers requiring some time to complete. After that, management of the NAS can be handled in-house or with the support of a Managed Service Provider (MSP).

Are You Sticking with macOS Server or Moving to Another Platform?

If you’re affected by this change in macOS Server, please let us know in the comments how you’re planning to cope. Are you using Synology NAS for server services? Please tell us how that’s working for you.

The post Replacing macOS Server with Synology NAS appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Connect Veeam to the B2 Cloud: Episode 3 — Using OpenDedup

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/opendedup-for-cloud-storage/

Veeam backup to Backblaze B2 logo

In this, the third post in our series on connecting Veeam with Backblaze B2 Cloud Storage, we discuss how to back up your VMs to B2 using Veeam and OpenDedup. In our previous posts, we covered how to connect Veeam to the B2 cloud using Synology, and how to connect Veeam with B2 using StarWind VTL.

Deduplication and OpenDedup

Deduplication is simply the process of eliminating redundant data on disk. Deduplication reduces storage space requirements, improves backup speed, and lowers backup storage costs. The dedup field used to be dominated by a few big-name vendors who sold dedup systems that were too expensive for most of the SMB market. Then an open-source challenger came along in OpenDedup, a project that produced the Space Deduplication File System (SDFS). SDFS provides many of the features of commercial dedup products without their cost.

OpenDedup provides inline deduplication that can be used with applications such as Veeam, Veritas Backup Exec, and Veritas NetBackup.

Features Supported by OpenDedup:

  • Variable Block Deduplication to cloud storage
  • Local Data Caching
  • Encryption
  • Bandwidth Throttling
  • Fast Cloud Recovery
  • Windows and Linux Support

Why use Veeam with OpenDedup to Backblaze B2?

With your VMs backed up to B2, you have a number of options to recover from a disaster. If the unexpected occurs, you can quickly restore your VMs from B2 to the location of your choosing. You also have the option to bring up cloud compute through B2’s compute partners, thereby minimizing any loss of service and ensuring business continuity.

Veeam logo  +  OpenDedup logo  +  Backblaze B2 logo

Backblaze’s B2 is an ideal solution for backing up Veeam’s backup repository due to B2’s combination of low cost and high availability. Users of B2 save up to 75% compared to other cloud solutions such as Microsoft Azure, Amazon AWS, or Google Cloud Storage. When combined with OpenDedup’s no-cost deduplication, you’ve got an efficient and economical solution for backing up VMs to the cloud.

How to Use OpenDedup with B2

For step-by-step instructions for how to set up OpenDedup for use with B2 on Windows or Linux, see Backblaze B2 Enabled on the OpenDedup website.
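The linked guide covers the B2-specific configuration. As a rough illustration of what an SDFS volume looks like once OpenDedup is installed on Linux, here is a hedged sketch of creating and mounting a local deduplicated volume — the volume name, capacity, and mount point are placeholders, and the exact flags may vary between OpenDedup releases:

# Create a deduplicated SDFS volume named "pool0" with 1 TB of logical capacity
mkfs.sdfs --volume-name=pool0 --volume-capacity=1TB

# Mount the volume so it can be used as a Veeam backup repository path
mkdir -p /media/pool0
mount.sdfs pool0 /media/pool0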

Are you backing up Veeam to B2 using one of the solutions we’ve written about in this series? If you have, we’d love to hear from you in the comments.

View all posts in the Veeam series.

The post Connect Veeam to the B2 Cloud: Episode 3 — Using OpenDedup appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.