Tag Archives: dropbox

Storing Encrypted Credentials In Git

Post Syndicated from Bozho original https://techblog.bozho.net/storing-encrypted-credentials-in-git/

We all know we should not commit passwords or keys to a repo along with our code (whether public or private). Yet thousands of production passwords can be found on GitHub (and probably thousands more in internal company repositories). Some have tried to fix that by removing the passwords once they learned it’s not a good idea to store them publicly, but the passwords remain in the git history.

Knowing what not to do is the first and very important step. But how do we store production credentials: database credentials, system secrets (e.g. for HMACs), access keys for 3rd-party services like payment providers or social networks? There doesn’t seem to be an agreed-upon solution.

I’ve previously argued against the 12-factor app recommendation to use environment variables – if you have a few, that might be okay, but when the number of variables grows (as in any real application), it becomes impractical. You can set environment variables via a bash script, but you’d have to store that script somewhere. In fact, even separate environment variables have to be stored somewhere.

This somewhere could be a local directory (risky), shared storage such as an FTP server or an S3 bucket with limited access, or a separate git repository. I think I prefer the git repository, as it allows versioning (note: S3 does too, but it’s provider-specific). So you can store all your environment-specific properties files, with all their credentials and environment-specific configurations, in a git repo with limited access (Ops people only). And that’s not bad, as long as it’s not the same repo as the source code.

Such a repo would look like this:

project
├── production
│   ├── application.properties
│   └── keystore.jks
├── staging
│   ├── application.properties
│   └── keystore.jks
├── on-premise-client1
│   ├── application.properties
│   └── keystore.jks
└── on-premise-client2
    ├── application.properties
    └── keystore.jks

Since many companies use GitHub or Bitbucket for their repositories, storing production credentials with a public provider may still be risky. That’s why it’s a good idea to encrypt the files in the repository. A good way to do that is git-crypt. The encryption is “transparent”: files are encrypted and decrypted on the fly, and diff still works, so once you set it up you continue working with the repo as if it weren’t encrypted. There’s even a fork that works on Windows.

You simply run git-crypt init (after you’ve put the git-crypt binary on your PATH), which generates a key. Then you specify your .gitattributes, e.g. like this:

secretfile filter=git-crypt diff=git-crypt
*.key filter=git-crypt diff=git-crypt
*.properties filter=git-crypt diff=git-crypt
*.jks filter=git-crypt diff=git-crypt
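Putting those steps together, a fresh-repo setup might look like the sketch below. The repo layout, file names, and the sample property are illustrative, and the git-crypt call is guarded with `command -v` only so the sketch degrades gracefully where git-crypt isn’t installed:

```shell
# Sketch: initialize a fresh, encrypted config repo
mkdir -p config-repo && cd config-repo
git init -q

# git-crypt init generates the symmetric key under .git/git-crypt/
# (guarded so the sketch still runs where git-crypt isn't installed)
command -v git-crypt >/dev/null && git-crypt init || true

# Declare which files get encrypted BEFORE committing any of them
cat > .gitattributes <<'EOF'
*.properties filter=git-crypt diff=git-crypt
*.jks filter=git-crypt diff=git-crypt
EOF

mkdir -p production
echo 'db.password=changeme' > production/application.properties

# Verify that the filter attribute actually matches the file
git check-attr filter -- production/application.properties
# → production/application.properties: filter: git-crypt
```

The order matters: if a file is committed before the matching .gitattributes rule exists, it lands in history unencrypted.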

And you’re done. Well, almost. If this is a fresh repo, everything is good. If it’s an existing repo, you’d have to clean up your history, which contains the unencrypted files. Following these steps will get you there, with one addition – before calling git commit, call git-crypt status -f so that the existing files actually get encrypted.

You’re almost done. The keys still have to be shared and backed up. For sharing, it’s not a big issue for a team of 2–3 Ops people to share the same key, but you could also use git-crypt’s GPG option (as documented in the README). What’s left is to back up your secret key (generated in the .git/git-crypt directory). You can store it (password-protected) in some other storage – a company shared folder, Dropbox/Google Drive, or even your email. Just make sure your computer is not the only place where it’s present and that it’s protected. I don’t think key rotation is necessary, but you can devise a rotation procedure.
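As a concrete sketch of that backup step: `git-crypt export-key` is git-crypt’s own command, but the paths, the placeholder passphrase, and the use of openssl for the password protection are my own illustration, and the export is guarded so the sketch runs even outside a git-crypt repo:

```shell
# Export the repo's symmetric key, if we're inside a git-crypt repo
command -v git-crypt >/dev/null && git-crypt export-key /tmp/git-crypt.key || true

# Otherwise simulate a key file so the rest of the sketch is self-contained
[ -f /tmp/git-crypt.key ] || head -c 64 /dev/urandom > /tmp/git-crypt.key

# Password-protect the copy before it leaves your machine
# (the passphrase is a placeholder -- use a real one)
openssl enc -aes-256-cbc -pbkdf2 -salt \
  -in /tmp/git-crypt.key -out /tmp/git-crypt.key.enc \
  -pass pass:correct-horse-battery-staple

# Check the roundtrip before deleting the plaintext copy
openssl enc -d -aes-256-cbc -pbkdf2 \
  -in /tmp/git-crypt.key.enc -out /tmp/git-crypt.key.check \
  -pass pass:correct-horse-battery-staple
cmp /tmp/git-crypt.key /tmp/git-crypt.key.check && echo "roundtrip OK"
rm /tmp/git-crypt.key /tmp/git-crypt.key.check
```

If you go the GPG route instead, git-crypt add-gpg-user encrypts the key to a teammate’s GPG key, which avoids passing the symmetric key around at all.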

The git-crypt author claims it shines when encrypting just a few files in an otherwise public repo, and recommends looking at git-remote-gcrypt for encrypting a whole repository. But since environment-specific configurations often have non-sensitive parts, you may not want to encrypt everything, and I think it’s perfectly fine to use git-crypt even in the separate-repo scenario. And even though encryption is an okay approach to protecting credentials in your source-code repo, it’s still not necessarily a good idea to keep the environment configurations in the same repo – especially since different people/teams manage these credentials, and even in small companies not all members have production access.

The outstanding question is: how do you sync the properties with code changes? Sometimes the code adds new properties that should be reflected in the environment configurations. There are two kinds of properties – those that can vary across environments but have sensible defaults (e.g. scheduled-job periods), and those that require explicit configuration (e.g. database credentials). The former can have their default values bundled in the code repo, and therefore in the release artifact, with external files allowed to override them. The latter should be announced to the people who do the deployment so that they can set the proper values.
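The override mechanism can be sketched in a few lines of shell. The file names and the get_prop helper are purely illustrative – in a real application your framework’s configuration layering (e.g. Spring’s property sources) does this for you:

```shell
# Defaults bundled with the release artifact
cat > defaults.properties <<'EOF'
job.period=3600
db.password=
EOF

# Environment-specific overrides from the config repo
cat > production.properties <<'EOF'
db.password=s3cret
EOF

# Effective value: the override file wins, otherwise the bundled default
get_prop() {
  grep -h "^$1=" production.properties defaults.properties | head -n 1 | cut -d= -f2-
}

get_prop job.period    # → 3600   (default; may vary per environment)
get_prop db.password   # → s3cret (must be set explicitly per environment)
```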

The whole process of having versioned, environment-specific configurations is actually quite simple and logical, even with encryption added to the picture. And I think it’s a good security practice we should try to follow.

The post Storing Encrypted Credentials In Git appeared first on Bozho's tech blog.

Cloud Empire: Meet the Rebel Alliance

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/cloud-empire-meet-the-rebel-alliance/

Cloud Empire: Meet the Rebel Alliance

Last week Backblaze made the exciting announcement that through partnerships with Packet and ServerCentral, cloud computing is available to Backblaze B2 Cloud Storage customers.

Those of you familiar with cloud computing will understand the significance of this news. We are now offering the least expensive cloud storage + cloud computing available anywhere. You no longer have to submit to the lock-in tactics and exorbitant prices charged by the other big players in the cloud services biz.

As Robin Harris wrote in ZDNet about last week’s computing partners announcement: “Cloud Empire: Meet the Rebel Alliance.”

We understand that some of our cloud backup and storage customers might be unfamiliar with cloud computing. Backblaze made its name in cloud backup and object storage, and that’s what our customers know us for. In response to customers’ requests, we’ve directly connected our B2 cloud object storage with cloud compute providers. This adds the ability to use and run programs on data once it’s in the B2 cloud, opening up a world of new uses for B2. Just some of the possibilities include media transcoding and rendering, web hosting, application development and testing, business analytics, disaster recovery, on-demand computing capacity (cloud bursting), AI, and mobile and IoT applications.

The world has been moving to a multi-cloud / hybrid cloud world, and customers are looking for more choices than those offered by the existing cloud players. Our B2 compute partnerships build on our mission to offer cloud storage that’s astonishingly easy and low-cost. They enable our customers to move into a more flexible and affordable cloud services ecosystem that provides a greater variety of choices and costs far less. We believe we are helping to fulfill the promise of the internet by allowing customers to choose the best-of-breed services from the best vendors.

If You’re Not Familiar with Cloud Computing, Here’s a Quick Overview

Cloud computing is another component of cloud services, like object storage, that replicates in the cloud a basic function of a computer system. Think of services that operate in a cloud as an infinitely scalable version of what happens on your desktop computer. In your desktop computer you have computing/processing (CPU), fast storage (like an SSD), data storage (like your disk drive), and memory (RAM). Their counterparts in the cloud are computing (CPU), block storage (fast storage), object storage (data storage), and processing memory (RAM).

Computer building blocks

CPU, RAM, fast internal storage, and a hard drive are the basic building blocks of a computer
They also are the basic building blocks of cloud computing

Some customers require only some of these services, such as cloud storage. B2 as a standalone service has proven to be an outstanding solution for those customers interested in backing up or archiving data. There are many customers that would like additional capabilities, such as performing operations on that data once it’s in the cloud. They need object storage combined with computing.

With the just announced compute partnerships, Backblaze is able to offer computing services to anyone using B2. A direct connection between Backblaze’s and our partners’ data centers means that our customers can process data stored in B2 with high speed, low latency, and zero data transfer costs.

Backblaze, Packet and ServerCentral cloud compute workflow diagram

Cloud service providers package up CPU, storage, and memory into services that you can rent on an hourly basis
You can scale up and down and add or remove services as you need them

How Does Computing + B2 Work?

Those wanting to use B2 with computing will need to sign up for accounts with Backblaze and either Packet or ServerCentral. Packet customers need only select “SJC1” as their region and then get started. The process is also simple for ServerCentral customers — they just need to register with a ServerCentral account rep.

The direct connection between B2 and our compute partners means customers will experience very low latency (less than 10ms) between services. Even better, all data transfers between B2 and the compute partner are free. When combined with Backblaze B2, customers can obtain cloud computing services for as little as 50% of the cost of Amazon’s Elastic Compute Cloud (EC2).

Opening Up the Cloud “Walled Garden”

Traditionally, cloud vendors charge fees for customers to move data outside the “walled garden” of that particular vendor. These fees reach upwards of $0.12 per gigabyte (GB) for data egress. This large fee for customers accessing their own data discourages a multi-cloud approach and keeps users from taking advantage of less expensive or better performing options. With free transfers between B2 and Packet or ServerCentral, customers now have a predictable, scalable solution for computing and data storage while avoiding vendor lock-in. Dropbox made waves when they saved $75 million by migrating off of AWS. Adding computing to B2 helps anyone interested in moving some or all of their computing off of AWS and thereby cutting their AWS bill by 50% or more.
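To make the egress fee concrete, here is a back-of-envelope comparison using the $0.12/GB figure above and B2’s $0.01/GB download price (the 10 TB volume is an arbitrary example, not a quoted workload):

```shell
# Egress cost for 10 TB (10,240 GB) at the two per-GB rates
awk 'BEGIN {
  gb = 10240
  printf "At $0.12/GB: $%.2f\n", gb * 0.12   # typical big-cloud egress fee
  printf "At $0.01/GB: $%.2f\n", gb * 0.01   # B2 download rate
}'
# → At $0.12/GB: $1228.80
# → At $0.01/GB: $102.40
```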

What are the Advantages of Cloud Storage + Computing?

Using computing and storage in the cloud provides a number of advantages over using in-house resources.

  1. You don’t have to purchase the hardware and software licenses, or provide space and IT resources for the systems.
  2. Cloud computing is available with just a few minutes’ notice, and you pay only for the period of time you need. You avoid having additional hardware on your balance sheet.
  3. Resources are in the cloud and can provide online services to customers, mobile users, and partners located anywhere in the world.
  4. You can isolate the work on these systems from your normal production environment, making them ideal for testing and trying out new applications and development projects.
  5. Computing resources scale when you need them to, providing temporary or ongoing extra resources for expected or unexpected demand.
  6. They can provide redundant and failover services when and if your primary systems are unavailable for whatever reason.

Where Can I Learn More?

We encourage B2 customers to explore the options available at our partner sites, Packet and ServerCentral. They are happy to help customers understand what services are available and how to get started.

We are excited to see what you build! And please tell us in the comments what you are doing or have planned with B2 + computing.

P.S. May the force be with all of us!

The post Cloud Empire: Meet the Rebel Alliance appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Backblaze Announces B2 Compute Partnerships

Post Syndicated from Gleb Budman original https://www.backblaze.com/blog/introducing-cloud-compute-services/

Backblaze Announces B2 Compute Partnerships

In 2015, we announced Backblaze B2 Cloud Storage — the most affordable, high performance storage cloud on the planet. The decision to release B2 as a service was in direct response to customers asking us if they could use the same cloud storage infrastructure we use for our Computer Backup service. With B2, we entered a market in direct competition with Amazon S3, Google Cloud Services, and Microsoft Azure Storage. Today, we have over 500 petabytes of data from customers in over 150 countries. At $0.005 / GB / month for storage (1/4th of S3) and $0.01 / GB for downloads (1/5th of S3), it turns out there’s a healthy market for cloud storage that’s easy and affordable.

As B2 has grown, customers wanted to use our cloud storage for a variety of use cases that required not only storage but compute. We’re happy to say that through partnerships with Packet & ServerCentral, today we’re announcing that compute is now available for B2 customers.

Cloud Compute and Storage

Backblaze has directly connected B2 with the compute servers of Packet and ServerCentral, thereby allowing near-instant (< 10 ms) data transfers between services. Also, transferring data between B2 and both our compute partners is free.

  • Storing data in B2 and want to run an AI analysis on it? — There are no fees to move the data to our compute partners.
  • Generating data in an application? — Run the application with one of our partners and store it in B2.
  • Transfers are free and you’ll save more than 50% off of the equivalent set of services from AWS.

These partnerships enable B2 customers to use compute, give our compute partners’ customers access to cloud storage, and introduce new customers to industry-leading storage and compute — all with high-performance, low-latency, and low-cost.

Is This a Big Deal? We Think So

Compute is one of the most requested services from our customers. Why? Because it unlocks a number of use cases for them. Let’s look at three popular examples:

Transcoding Media Files

B2 has earned wide adoption in the Media & Entertainment (“M&E”) industry. Our affordable storage and download pricing make B2 great for a wide variety of M&E use cases. But many M&E workflows require compute. Content syndicators, like American Public Television, need the ability to transcode files to meet localization and distribution management requirements.

There are a multitude of reasons transcoding is needed — thumbnail and proxy generation, for example, enable M&E professionals to work efficiently. Without compute, transcoding files remains cumbersome: either the files must be brought down from the cloud, transcoded, and pushed back up, or they must be kept locally until the project is complete. Both scenarios are inefficient.

Starting today, any content producer can spin up compute with one of our partners, pay by the hour for their transcode processing, and return the new media files to B2 for storage and distribution. The company saves money, moves faster, and ensures their files are safe and secure.

Disaster Recovery

Backblaze’s heritage is based on providing outstanding backup services. When you have incredibly affordable cloud storage, it ends up being a great destination for your backup data.

Most enterprises have virtual machines (“VMs”) running in their infrastructure and those VMs need to be backed up. In a disaster scenario, a business wants to know they can get back up and running quickly.

With all data stored in B2, a business can get up and running quickly. Simply restore your backed up VM to one of our compute providers, and your business will be able to get back online.

Since B2 does not place restrictions, delays, or penalties on getting data out, customers can get back up and running quickly and affordably.

Saving $74 Million (aka “The Dropbox Effect”)

Ten years ago, Backblaze decided that S3 was too costly a platform to build its cloud storage business. Instead, we created the Backblaze Storage Pod and our own cloud storage infrastructure. That decision enabled us to offer our customers storage at a previously unavailable price point and maintain those prices for over a decade. It also laid the foundation for Netflix Open Connect and Facebook Open Compute.

Dropbox recently migrated the majority of their cloud services off of AWS and onto Dropbox’s own infrastructure. By leaving AWS, Dropbox was able to build out their own data centers and still save over $74 Million. They achieved those savings by avoiding the fees AWS charges for storing and downloading data, which, incidentally, are five times higher than Backblaze B2.

For Dropbox, being able to realize savings was possible because they have access to enough capital and expertise that they can build out their own infrastructure. For companies that have such resources and scale, that’s a great answer.

“Before this offering, the economics of the cloud would have made our business simply unviable.” — Gabriel Menegatti, SlicingDice

The question Backblaze and our compute partners pondered was: “How can we democratize the Dropbox effect for our storage and compute customers? How can we help customers do more and pay less?” The answer we came up with was to connect Backblaze’s B2 storage with strategic compute partners and remove any transfer fees between them. You may not save $74 million as Dropbox did, but you can choose the optimal providers for your use case and realize significant savings in the process.

This Sounds Good — Tell Me More About Your Partners

We’re very fortunate to be launching our compute program with two fantastic partners in Packet and ServerCentral. These partners allow us to offer a range of computing services.

Packet

We recommend Packet for customers that need on-demand, high performance, bare metal servers available by the hour. They also have robust offerings for private / customized deployments. Their offerings end up costing 50-75% of the equivalent offerings from EC2.

To get started with Packet and B2, visit our partner page on Packet.net.

ServerCentral

ServerCentral is the right partner for customers that have business and IT challenges that require more than “just” hardware. They specialize in fully managed, custom cloud solutions that solve complex business and IT challenges. ServerCentral also has expertise in managed network solutions to address global connectivity and content delivery.

To get started with ServerCentral and B2, visit our partner page on ServerCentral.com.

What’s Next?

We’re excited to find out. The combination of B2 and compute unlocks use cases that were previously impossible or at least unaffordable.

“The combination of performance and price offered by this partnership enables me to create an entirely new business line. Before this offering, the economics of the cloud would have made our business simply unviable,” noted Gabriel Menegatti, co-founder at SlicingDice, a serverless data warehousing service. “Knowing that transfers between compute and B2 are free means I don’t have to worry about my business being successful. And, with download pricing from B2 at just $0.01/GB, I know I’m avoiding a 400% tax from AWS on data I retrieve.”

What can you do with B2 & compute? Please share your ideas with us in the comments. And, for those attending NAB 2018 in Las Vegas next week, please come by and say hello!

The post Backblaze Announces B2 Compute Partnerships appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

[$] The true costs of hosting in the cloud

Post Syndicated from jake original https://lwn.net/Articles/748106/rss

Should we host in the cloud or on our own servers? This question was at the center of Dmytro Dyachuk’s talk, given during KubeCon + CloudNativeCon last November. While many services simply launch in the cloud without the organizations behind them considering other options, large content-hosting services have actually moved back to their own data centers: Dropbox migrated in 2016 and Instagram in 2014. Because such transitions can be expensive and risky, understanding the economics of hosting is a critical part of launching a new service. Actual hosting costs are often misunderstood, or secret, so it is sometimes difficult to get the numbers right. In this article, we’ll use Dyachuk’s talk to try to answer the “million dollar question”: “buy or rent?”

Integration With Zapier

Post Syndicated from Bozho original https://techblog.bozho.net/integration-with-zapier/

Integration is boring. And also inevitable. But I won’t be writing about enterprise integration patterns. Instead, I’ll explain how to create an app for integration with Zapier.

What is Zapier? It is a service that allows you to connect two (or more) otherwise unconnected services via their APIs (or protocols). You can do stuff like “Create a Trello task from an Evernote note”, “publish new RSS items to Facebook”, “append new emails to a spreadsheet”, “post approaching calendar meeting to Slack”, “Save big email attachments to Dropbox”, “tweet all instagrams above a certain likes threshold”, and so on. In fact, it covers mostly the same use cases as another famous service that I really like – IFTTT (if this then that), my favourite use case being “Get a notification when the international space station passes over your house”. And all of those interactions can be configured via a UI.

Now that’s good for end users but what does it have to do with software development and integration? Zapier (unlike IFTTT, unfortunately), allows custom 3rd party services to be included. So if you have a service of your own, you can create an “app” and allow users to integrate your service with all the other 3rd party services. IFTTT offers a way to invoke web endpoints (including RESTful services), but it doesn’t allow setting headers, so that makes it quite limited for actual APIs.

In this post I’ll briefly explain how to write a custom Zapier app and then will discuss where services like Zapier stand from an architecture perspective.

The thing I needed it for was integrating LogSentinel with any of the third parties available through Zapier, i.e. storing audit logs for events that happen in all those 3rd-party systems. So how do I do that? There’s a tutorial that makes it look simple. And it is, with a few catches.

First, there are two tutorials – one in GitHub and one on Zapier’s website. And they differ slightly, which becomes tricky in some cases.

I initially followed the GitHub tutorial and had my build fail. It claimed the zapier-platform dependency was missing. After comparing with the example apps, I found out there was a caret in front of the zapier-platform dependency version. Removing it just yielded another error – that my node version should be exactly 6.10.2. Why?

The Zapier CLI requires you have exactly version 6.10.2 installed. You’ll see errors and will be unable to proceed otherwise.

It appears that they are using AWS Lambda, which is stuck on Node 6.10.2 (actually, it’s 6.10.3 when you check). The current major release is 8, so minus points for choosing … javascript for a command-line tool and for building sandboxed apps. Maybe other decisions had their downsides as well – I won’t speculate. Maybe it’s just my dislike for dynamic languages.

So, after you make sure you have the correct old version of node installed, you call zapier init, make sure there are no carets, run npm install and then zapier test. So far so good – you have a dummy app. Now how do you make a RESTful call to your service?

Zapier splits the programmable entities in two – “triggers” and “creates”. A trigger is the event that triggers the whole app, and a “create” is what happens as a result. In my case my app doesn’t publish any triggers, it only accepts input, so I won’t be covering triggers (though they seem easy). You configure all of the elements in index.js (e.g. this one):

const log = require('./creates/log');
....
creates: {
    [log.key]: log,
}

The log.js file itself is the interesting bit – there you specify all the parameters that should be passed to your API call, as well as making the API call itself:

const log = (z, bundle) => {
  const responsePromise = z.request({
    method: 'POST',
    url: `https://api.logsentinel.com/api/log/${bundle.inputData.actorId}/${bundle.inputData.action}`,
    body: bundle.inputData.details,
    headers: {
      'Accept': 'application/json'
    }
  });
  return responsePromise
    .then(response => JSON.parse(response.content));
};

module.exports = {
  key: 'log-entry',
  noun: 'Log entry',

  display: {
    label: 'Log',
    description: 'Log an audit trail entry'
  },

  operation: {
    inputFields: [
      {key: 'actorId', label:'ActorID', required: true},
      {key: 'action', label:'Action', required: true},
      {key: 'details', label:'Details', required: false}
    ],
    perform: log
  }
};

You can pass the input parameters to your API call, and it’s as simple as that. The user can then specify which parameters from the source (“trigger”) should be mapped to each of your parameters. In an example zap, I used an email trigger and passed the sender as actorId, the subject as action, and the body of the email as details.

There’s one more thing – authentication. Authentication can be done in many ways. Some services offer OAuth, others – HTTP Basic or other custom forms of authentication. There is a section in the documentation about all the options. In my case it was (almost) an HTTP Basic auth. My initial thought was to just supply the credentials as parameters (which you just hardcode rather than map to trigger parameters). That may work, but it’s not the canonical way. You should configure “authentication”, as it triggers a friendly UI for the user.

You include authentication.js (which has the fields your authentication requires) and then pre-process requests by adding a header (in index.js):

const authentication = require('./authentication');

const includeAuthHeaders = (request, z, bundle) => {
  if (bundle.authData.organizationId) {
    request.headers = request.headers || {};
    request.headers['Application-Id'] = bundle.authData.applicationId;
    // Buffer.from() rather than the deprecated Buffer() constructor
    const basicHash = Buffer.from(`${bundle.authData.organizationId}:${bundle.authData.apiSecret}`).toString('base64');
    request.headers['Authorization'] = `Basic ${basicHash}`;
  }
  return request;
};

const App = {
  // This is just shorthand to reference the installed dependencies you have.
  // Zapier will need to know these before we can upload.
  version: require('./package.json').version,
  platformVersion: require('zapier-platform-core').version,
  authentication: authentication,

  // beforeRequest & afterResponse are optional hooks into the provided HTTP client
  beforeRequest: [
    includeAuthHeaders
  ]
  ...
}
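The Authorization header assembled in includeAuthHeaders is plain HTTP Basic auth – base64 of organizationId:apiSecret – so you can reproduce it from a shell when debugging (the credentials here are made up):

```shell
# Build the same Basic auth header the beforeRequest hook produces
ORG_ID="myorg"
API_SECRET="mysecret"
HASH=$(printf '%s' "${ORG_ID}:${API_SECRET}" | base64)
echo "Authorization: Basic ${HASH}"
# → Authorization: Basic bXlvcmc6bXlzZWNyZXQ=
```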

And then you zapier push your app and you can test it. It doesn’t automatically go live – you have to invite people to try it and use it first – but in many cases that’s sufficient (e.g. when using Zapier for an integration with a particular client).

Can Zapier be used for any integration problem? Unlikely – it’s pretty limited and simple, but that’s also a strength. You can, in half a day, make your service integrate with thousands of others for the most typical use cases. And note that although it’s meant for integrating public services rather than for enterprise integration (where you make multiple internal systems talk to each other), as an increasing number of systems rely on 3rd-party services, it could find a home in an enterprise system, replacing some functions of an ESB.

Effectively, such services (Zapier, IFTTT) are “Simple ESB-as-a-service”. You go to a UI, fill a bunch of fields, and you get systems talking to each other without touching the systems themselves. I’m not a big fan of ESBs, mostly because they become harder to support with time. But minimalist, external ones might be applicable in certain situations. And while such services are primarily aimed at end users, they could be a useful bit in an enterprise architecture that relies on 3rd party services.

Whether it can handle the required load, and whether an organization is willing to let its data flow through a 3rd-party provider (which may store the intermediate parameters), are questions that should be answered on a case-by-case basis. I wouldn’t recommend it as a general solution, but it’s certainly an option to consider.

The post Integration With Zapier appeared first on Bozho's tech blog.

Server vs Endpoint Backup — Which is Best?

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/endpoint-backup-for-distributed-computing/

server and computer backup to the cloud

How common are these statements in your organization?

  • “I know I saved that file. The application must have put it somewhere outside of my documents folder.” — Mike in Marketing
  • “I was on the road and couldn’t get a reliable VPN connection. I guess that’s why my laptop wasn’t backed up.” — Sally in Sales
  • “I try to follow file policies, but I had a deadline this week and didn’t have time to copy my files to the server.” — Felicia in Finance
  • “I just did a commit of my code changes and that was when the coffee mug was knocked over onto the laptop.” — Erin in Engineering
  • “If you need a file restored from backup, contact the help desk at [email protected] The IT department will get back to you.” — XYZ corporate intranet
  • “Why don’t employees save files on the network drive like they’re supposed to?” — Isaac in IT

If these statements are familiar, most likely you rely on file server backups to safeguard your valuable endpoint data.

The problem is, the workplace has changed. Where server backups might have fit how offices worked at one time in the past, relying solely on server backups today means you could be missing valuable endpoint data from your backups. On top of that, you likely are unnecessarily expending valuable user and IT time in attempting to secure and restore endpoint data.

Times Have Changed, and so have Effective Enterprise Backup Strategies

The ways we use computers and handle files today are vastly different from just five or ten years ago. Employees are mobile, and we no longer are limited to monolithic PC and Mac-based office suites. Cloud applications are everywhere. Company-mandated network drive policies are difficult to enforce as office practices change, devices proliferate, and organizational culture evolves. Besides, your IT staff has other things to do than babysit your employees to make sure they follow your organization’s policies for managing files.

Server Backup has its Place, but Does it Support How People Work Today?

Many organizations still rely on server backup. If your organization works primarily in centralized offices with all endpoints — likely desktops — connected directly to your network, and you maintain tight control of how employees manage their files, it still might work for you.

Your IT department probably has set network drive policies that require employees to save files in standard places that are regularly backed up to your file server. Turns out, though, that even standard applications don’t always save files where IT would like them to be. They could be in a directory or folder that’s not regularly backed up.

As employees have become more mobile, they have adopted practices that enable them to access files from different places, but these practices might not fit in with your organization’s server policies. An employee saving a file to Dropbox might be planning to copy it to an “official” location later, but whether that ever happens could be doubtful. Often people don’t realize until it’s too late that accidentally deleting a file in one sync service directory means that all copies in all locations — even the cloud — are also deleted.

Employees are under increasing demands to produce, which means that network drive policies aren’t always followed; time constraints and deadlines can cause best practices to go out the window. Users will attempt to comply with policies as best they can — and you might get 70% or even 75% effective compliance — but getting even to that level requires training, monitoring, and repeatedly reminding employees of policies they need to follow — none of which leads to a good work environment.

Even if you get to 75% compliance with network file policies, what happens if the critical file needed to close out an end-of-year financial summary isn’t one of the files backed up? Getting from 70% to 80% or 90% of an endpoint’s files effectively backed up could require multiple hours from your IT department, and you still might not have backed up the one critical file you need later.

Your Organization Operates on its Data — And Today That Data Exists in Multiple Locations

Users are no longer tied to one endpoint, and may use different computers in the office, at home, or traveling. The greater the number of endpoints used, the greater the chance of an accidental or malicious device loss or data corruption. The loss of the Sales VP’s laptop at the airport on her way back from meeting with major customers can affect an entire organization and require weeks to resolve.

Even with the best intentions and efforts, following policies when out of the office can be difficult or impossible. Connecting to your private network when remote most likely requires a VPN, and VPN connectivity can be challenging from the lobby Wi-Fi at the Radisson. Server restores require time from the IT staff, which can mean taking resources away from other IT priorities and a growing backlog of requests from users who need their files as soon as possible. When users are dependent on IT to get back files critical to their work, employee productivity and often deadlines are affected.

Managing Finite Server Storage Is an Ongoing Challenge

Network drive backup usually requires on-premises data storage for endpoint backups. Since it is a finite resource, allocating that storage is another burden on your IT staff. To make sure that storage isn’t exceeded, IT departments often ration storage by department and/or user — another oversight duty for IT, and even more choices required by your IT department and department heads who have to decide which files to prioritize for backing up.

Adding Backblaze Endpoint Backup Improves Business Continuity and Productivity

Having an endpoint backup strategy in place can mitigate these problems and improve user productivity, as well. A good endpoint backup service, such as Backblaze Cloud Backup, will ensure that all devices are backed up securely, automatically, without requiring any action by the user or by your IT department.

For 99% of users, no configuration is required for Backblaze Backup. Everything on the endpoint is encrypted and securely backed up to the cloud, including program configuration files and files outside of standard document folders. Even temp files are backed up, which can prove invaluable when recovering a file after a crash or other program interruption. Cloud storage is unlimited with Backblaze Backup, so there are no worries about running out of storage or rationing file backups.

The Backblaze client can be silently and remotely installed to both Macintosh and Windows clients with no user interaction. And, with Backblaze Groups, your IT staff has complete visibility into when files were last backed up. IT staff can recover any backed up file, folder, or entire computer from the admin panel, and even give file restore capability to the user, if desired, which reduces dependency on IT and time spent waiting for restores.

With over 500 petabytes of customer data stored and one million files restored every hour of every day by Backblaze customers, you know that Backblaze Backup works for its users.

You Need Data Security That Matches the Way People Work Today

Both file server and endpoint backup have their places in an organization’s data security plan, but their use and value differ. If you already are using file server backup, adding endpoint backup will make a valuable contribution to your organization by reducing workload, improving productivity, and increasing confidence that all critical files are backed up.

By guaranteeing fast and automatic backup of all endpoint data, and matching the current way organizations and people work with data, Backblaze Backup will enable you to effectively and affordably meet the data security demands of your organization.

The post Server vs Endpoint Backup — Which is Best? appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Libertarians are against net neutrality

Post Syndicated from Robert Graham original http://blog.erratasec.com/2017/12/libertarians-are-against-net-neutrality.html

This post claims to be by a libertarian in support of net neutrality. As a libertarian, I need to debunk this. “Net neutrality” is a case of one-hand clapping, you rarely hear the competing side, and thus, that side may sound attractive. This post is about the other side, from a libertarian point of view.

That post just repeats the common, and wrong, left-wing talking points. I mean, there might be a libertarian case for some broadband regulation, but this isn’t it.

This thing they call “net neutrality” is just left-wing politics masquerading as some sort of principle. It’s no different than how people claim to be “pro-choice”, yet demand forced vaccinations. Or, it’s no different than how people claim to believe in “traditional marriage” even while they are on their third “traditional marriage”.

Properly defined, “net neutrality” means no discrimination of network traffic. But nobody wants that. A classic example is how most internet connections have faster download speeds than uploads. This discriminates against upload traffic, harming innovation in upload-centric applications like Dropbox’s cloud backup or BitTorrent’s peer-to-peer file transfer. Yet activists never mention this, or other types of network traffic discrimination, because they no more care about “net neutrality” than Trump or Gingrich care about “traditional marriage”.

Instead, when people say “net neutrality”, they mean “government regulation”. It’s the same old debate between who is the best steward of consumer interest: the free-market or government.

Specifically, in the current debate, they are referring to the Obama-era FCC “Open Internet” order and reclassification of broadband under “Title II” so they can regulate it. Trump’s FCC is putting broadband back to “Title I”, which means the FCC can’t regulate most of its “Open Internet” order.

Don’t be tricked into thinking the “Open Internet” order is anything but intensely political. The premise behind the order is the Democrats’ firm belief that it was the government that created the Internet, and that all innovation, advances, and investment ultimately come from the government. It sees ISPs as inherently deceitful entities that will only serve their own interests, at the expense of consumers, unless the FCC protects consumers.

It says so right in the order itself. It starts with the premise that broadband ISPs are evil, using illegitimate “tactics” to hurt consumers, and continues with similar language throughout the order.

A good contrast to this can be seen in Tim Wu’s non-political original paper in 2003 that coined the term “net neutrality”. Whereas the FCC sees broadband ISPs as enemies of consumers, Wu saw them as allies. His concern was not that ISPs would do evil things, but that they would do stupid things, such as favoring short-term interests over long-term innovation (such as having faster downloads than uploads).

The political depravity of the FCC’s order can be seen in this comment from one of the commissioners who voted for those rules:

FCC Commissioner Jessica Rosenworcel wants to increase the minimum broadband standards far past the new 25Mbps download threshold, up to 100Mbps. “We invented the internet. We can do audacious things if we set big goals, and I think our new threshold, frankly, should be 100Mbps. I think anything short of that shortchanges our children, our future, and our new digital economy,” Commissioner Rosenworcel said.

This is indistinguishable from communist rhetoric that credits the Party for everything, as this booklet from North Korea will explain to you.

But what about monopolies? After all, while the free-market may work when there’s competition, it breaks down where there are fewer competitors, oligopolies, and monopolies.

There is some truth to this: in individual cities, there’s often only a single credible high-speed broadband provider. But this isn’t the issue at stake here. The FCC isn’t proposing light-handed regulation to keep monopolies in check, but heavy-handed regulation that regulates every last decision.

Advocates of FCC regulation keep pointing out how broadband monopolies can exploit their rent-seeking positions in order to screw the customer. They keep coming up with ever more bizarre and unlikely scenarios of what monopoly power grants the ISPs.

But they never mention the simplest: that broadband monopolies can just charge customers more money. They imagine instead that these companies will pursue a string of outrageous, evil, and less profitable behaviors to exploit their monopoly position.

The FCC’s reclassification of broadband under Title II gives it full power to regulate ISPs as utilities, including setting prices. The FCC has stepped back from this, promising it won’t go so far as to set prices, that it’s only regulating these evil conspiracy theories. This is kind of bizarre: either broadband ISPs are evilly exploiting their monopoly power or they aren’t. Why stop at regulating only half the evil?

The answer is that the claim of “monopoly” power is a deception. It starts with overstating how many monopolies there are to begin with. When it issued its 2015 “Open Internet” order, the FCC simultaneously redefined what it meant by “broadband”, upping the speed from 5 Mbps to 25 Mbps. That’s because while most consumers have multiple choices at 5 Mbps, fewer consumers have multiple choices at 25 Mbps. It’s a dirty political trick to convince you there is more of a problem than there is.

In any case, their rules still apply to the slower broadband providers, and equally apply to the mobile (cell phone) providers. The US has four mobile phone providers (AT&T, Verizon, T-Mobile, and Sprint) and plenty of competition between them. That it’s monopolistic power that the FCC cares about here is a lie. As their Open Internet order clearly shows, the fundamental principle that animates the document is that all corporations, monopolies or not, are treacherous and must be regulated.

“But corporations are indeed evil”, people argue, “see here’s a list of evil things they have done in the past!”

No, those things weren’t evil. They were done because they benefited the customers, not as some sort of secret rent seeking behavior.

For example, one of the more common “net neutrality abuses” that people mention is AT&T’s blocking of FaceTime. I’ve debunked this elsewhere on this blog, but the summary is this: there was no network blocking involved (not a “net neutrality” issue), and the FCC analyzed it and decided it was in the best interests of the consumer. It’s disingenuous to claim it’s an evil that justifies FCC actions when the FCC itself declared it not evil and took no action. It’s disingenuous to cite the “net neutrality” principle that all network traffic must be treated equally when, in fact, the network did treat all the traffic equally.

Another frequently cited abuse is Comcast’s throttling of BitTorrent. Comcast did this because Netflix users were complaining. Like all streaming video, Netflix backs off to a slower speed (and poorer quality) when it experiences congestion. BitTorrent, uniquely among applications, never backs off. As most applications become slower and slower, BitTorrent just speeds up, consuming all available bandwidth. This is especially problematic when there’s limited upload bandwidth available. Thus, Comcast throttled BitTorrent during prime-time TV viewing hours, when the network was already overloaded by Netflix and other streams. BitTorrent users wouldn’t mind this throttling, because it often took days to download a big file anyway.

When the FCC took action, Comcast stopped the throttling and imposed bandwidth caps instead. This was a worse solution for everyone. It penalized heavy Netflix viewers, and prevented BitTorrent users from large downloads. Even though BitTorrent users were seen as the victims of this throttling, they’d vastly prefer the throttling over the bandwidth caps.

In both the FaceTime and BitTorrent cases, the issue was “network management”. AT&T had no competing video calling service, Comcast had no competing download service. They were only reacting to the fact their networks were overloaded, and did appropriate things to solve the problem.

Mobile carriers still struggle with the “network management” issue. While their networks are fast, they are still of low capacity, and quickly degrade under heavy use. They are looking for tricks in order to reduce usage while giving consumers maximum utility.

The biggest concern is video. It’s problematic because it’s designed to consume as much bandwidth as it can, throttling itself only when it experiences congestion. This is what you probably want when watching Netflix at the highest possible quality, but it’s bad when confronted with mobile bandwidth caps.

With small mobile devices, you don’t want as much quality anyway. You want the video degraded to lower quality, and lower bandwidth, all the time.

That’s the reasoning behind T-Mobile’s offerings. They offer an unlimited video plan in conjunction with the biggest video providers (Netflix, YouTube, etc.). The catch is that when congestion occurs, they’ll throttle it to lower quality. In other words, they give their bandwidth to all the other phones in your area first, then give you as much of the leftover bandwidth as you want for video.

While it sounds like T-Mobile is doing something evil, “zero-rating” certain video providers and degrading video quality, the FCC allows this, because they recognize it’s in the customer interest.

Mobile providers especially have great interest in more innovation in this area, in order to conserve precious bandwidth, but they are finding it costly. They can’t just innovate, but must ask the FCC for permission first. And with the new heavy-handed FCC rules, they’ve become hostile to this innovation. This attitude is highlighted by this statement from the “Open Internet” order:

And consumers must be protected, for example from mobile commercial practices masquerading as “reasonable network management.”

This is a clear declaration that the free market doesn’t work and won’t correct abuses, and that mobile companies are treacherous and will do evil things without FCC oversight.

Conclusion

Ignoring the rhetoric for the moment, the debate comes down to simple left-wing authoritarianism and libertarian principles. The Obama administration created a regulatory regime under clear Democrat principles, and the Trump administration is rolling it back to more free-market principles. There is no principle at stake here, certainly nothing to do with a technical definition of “net neutrality”.

The 2015 “Open Internet” order is not about “treating network traffic neutrally”, because it doesn’t do that. Instead, it’s purely a left-wing document that claims corporations cannot be trusted, must be regulated, and that innovation and prosperity comes from the regulators and not the free market.

It’s not about monopolistic power. The primary targets of regulation are the mobile broadband providers, where there is plenty of competition, and who have the most “network management” issues. Even if it were just about wired broadband (like Comcast), it’s still ignoring the primary ways monopolies profit (raising prices) and instead focuses on bizarre and unlikely ways of rent seeking.

If you are a libertarian who nonetheless believes in this “net neutrality” slogan, you’ve got to do better than mindlessly repeating the arguments of the left-wing. The term itself, “net neutrality”, is just a slogan, varying from person to person, from moment to moment. You have to be more specific. If you truly believe in the “net neutrality” technical principle that all traffic should be treated equally, then you’ll want a rewrite of the “Open Internet” order.

In the end, while libertarians may still support some form of broadband regulation, it’s impossible to reconcile libertarianism with the 2015 “Open Internet”, or the vague things people mean by the slogan “net neutrality”.

The Official Projects Book volume 3 — out now

Post Syndicated from Rob Zwetsloot original https://www.raspberrypi.org/blog/projects-book-3/

Hey folks, Rob from The MagPi here with some very exciting news! The third volume of the Official Raspberry Pi Projects Book is out right this second, and we’ve packed its 200 pages with the very best Raspberry Pi projects and guides!

Cover of The Official Projects Book volume 3

A peek inside the projects book

We start you off with a neat beginner’s guide to programming in Python, walking you from the very basics all the way through to building the classic video game Pong from scratch!

Table of contents of The Official Projects Book volume 3

Check out what’s inside!

Then we showcase some of the most inspiring projects from around the community, such as a camera for taking photos of the moon, a smart art installation, amazing arcade machines, and much more.

An article about the Apollo Pi project in The Official Projects Book volume 3

Emulate the Apollo mission computers on the Raspberry Pi

Next, we ease you into a series of tutorials that will help you get the most out of your Raspberry Pi. Among other things, you’ll be learning how to sync your Pi to Dropbox, use it to create a waterproof camera, and even emulate an Amiga.

We’ve also assembled a load of reviews to let you know what you should be buying if you want to extend your Pi experience.

A review of the Pimoroni Enviro pHAT in The Official Projects Book volume 3

Learn more about Pimoroni’s Enviro pHAT

I am extremely proud of what the entire MagPi team has put together here, and I know you’ll enjoy reading it as much as we enjoyed creating it.

How to get yours

In the UK, you can get your copy of the new Official Raspberry Pi Projects Book at WH Smith and all good newsagents today. In the US, print copies will be available in stores such as Barnes & Noble very soon.

Or order a copy from the Raspberry Pi Press store — before the end of Sunday 26 November, you can use the code BLACKFRIDAY to get 10% off your purchase!

There’s also the digital version, which you can get via The MagPi Android and iOS apps. And, as always, there’s a free PDF, which is available here.

We think this new projects book is the perfect stocking filler, although we may be just a tad biased. Anyway, I hope you’ll love it!

Gif of Picard smiling at three children

The post The Official Projects Book volume 3 — out now appeared first on Raspberry Pi.

Your Holiday Cybersecurity Guide

Post Syndicated from Robert Graham original http://blog.erratasec.com/2017/11/your-holiday-cybersecurity-guide.html

Many of us are visiting parents and relatives this Thanksgiving/Christmas, and will have an opportunity to help them with cybersecurity issues. I thought I’d write up a quick guide of the most important things.

1. Stop them from reusing passwords

By far the biggest threat to average people is that they re-use the same password across many websites, so that when one website gets hacked, all their accounts get hacked.

To demonstrate the problem, go to haveibeenpwned.com and enter the email addresses of your relatives. This will show them a number of sites where their password has already been stolen, like LinkedIn, Adobe, etc. That should convince them of the severity of the problem.
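Have I Been Pwned also exposes a password-checking API you can script against. Here is a minimal sketch (not part of the original guide) using the public `range` endpoint, which works on k-anonymity: only the first five characters of the password's SHA-1 hash ever leave your machine.

```python
import hashlib
import urllib.request


def sha1_split(password):
    """Return the first 5 hex chars of the SHA-1 hash and the remainder.

    The Pwned Passwords "range" API only ever sees the 5-char prefix,
    so the password itself is never transmitted.
    """
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]


def pwned_count(password):
    """Query api.pwnedpasswords.com and return how many times the
    password appears in known breaches (0 means not found).
    Requires network access."""
    prefix, suffix = sha1_split(password)
    url = "https://api.pwnedpasswords.com/range/" + prefix
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode("utf-8")
    # Each response line is "<hash suffix>:<breach count>"
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0


# Example (needs network access):
#   pwned_count("password")  # a very large number; don't use this password
```

If the count comes back non-zero, the password has appeared in a breach and should be retired, at least for the important accounts.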

They don’t need a separate password for every site. For the majority of websites, you don’t care whether you get hacked, so use a common password for all the meaningless sites. You only need unique passwords for important accounts, like email, Facebook, and Twitter.

Write down passwords and store them in a safe place. Sure, it’s a common joke that people in offices write passwords on Post-It notes stuck on their monitors or under their keyboards. This is a common security mistake, but that’s only because the office environment is widely accessible. Your home isn’t, and there’s plenty of places to store written passwords securely, such as in a home safe. Even if it’s just a desk drawer, such passwords are safe from hackers, because they aren’t on a computer.

Write them down, with pen and paper. Don’t put them in a MyPasswords.doc, because when a hacker breaks in, they’ll easily find that document and easily hack your accounts.

You might help them out with getting a password manager, or two-factor authentication (2FA). Good 2FA like YubiKey will stop a lot of phishing threats. But this is difficult technology to learn, and of course, you’ll be on the hook for support issues, such as when they lose the device. Thus, while 2FA is best, I’m only recommending pen-and-paper to store passwords. (AccessNow has a guide, though I think YubiKey/U2F keys for Facebook and GMail are the best).
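For the accounts that do warrant unique passwords, generating them randomly beats inventing them. A small sketch (my addition, not from the guide) using Python's standard-library `secrets` module, which draws from the OS's cryptographic random source:

```python
import secrets
import string

# Symbols are an assumption; trim this set if a site rejects them.
ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*"


def make_password(length=16):
    """Generate a random password suitable for writing down on paper.

    Uses secrets.choice rather than the random module, because random
    is not designed for security-sensitive use.
    """
    return "".join(secrets.choice(ALPHABET) for _ in range(length))


# Example:
#   make_password()    # 16 random characters
#   make_password(24)  # longer, for the email account that resets the rest
```

Generate one per important account, write them on the paper kept in the safe place, and you get most of the benefit of a password manager with none of the support burden.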

2. Lock their phone (passcode, fingerprint, faceprint)
You’ll lose your phone at some point. It has the keys to all your accounts, like email and so on. With your email, phone thieves can then reset the passwords on all your other accounts. Thus, it’s incredibly important to lock the phone.

Apple has made this especially easy with fingerprints (and now faceprints), so there’s little excuse not to lock the phone.

Note that Apple iPhones are the most secure. I give my mother my old iPhones so that she will have something secure.

My mom demonstrates a problem you’ll have with the older generation: she doesn’t reliably have her phone with her, or charged. She’s the opposite of my dad, who is religiously glued to his phone. Even a small change like making her lock her phone means it’ll be even more likely she won’t have it with her when you need to call her.

3. WiFi (WPA)
Make sure their home WiFi is WPA encrypted. It probably already is, but it’s worthwhile checking.

The password should be written down on the same piece of paper as all the other passwords. This is important. My parents just moved, Comcast installed a WiFi access point for them, and they promptly lost the piece of paper. When I wanted to debug something on their network today, they didn’t know the password and couldn’t find the paper. Get that password written down in a place it won’t get lost!

Discourage them from extra security features like “SSID hiding” and/or “MAC address filtering”. They provide no security benefit, and actually make security worse. It means a phone has to advertise the SSID when away from home, and it makes MAC address randomization harder, both of which allow your privacy to be tracked.

If they have a really old home router, you should probably replace it, or at least update the firmware. A lot of old routers have holes that allow hackers (like me, masscanning the Internet) to easily break in.

4. Ad blockers or Brave

Most of the online tricks that will confuse your older parents come via advertising, such as popups claiming “You are infected with a virus, click here to clean it”. Installing an ad blocker in the browser, such as uBlock Origin, stops almost all of this nonsense.

For example, here’s a screenshot of going to the “Speedtest” website to test the speed of my connection (I took this on the plane on the way home for Thanksgiving). Ignore the error (the plane’s firewall blocks Speedtest); instead, look at the advertising banner across the top of the page insisting you need to download a browser extension. This is tricking you into installing malware: the ad appears as if it’s a message from Speedtest, but it’s not. Speedtest is just selling advertising and has no clue what the banner says. This sort of thing needs to be blocked; it fools even the technologically competent.

uBlock Origin for Chrome is the one I use. Another option is to replace their browser with Brave, a browser that blocks ads but at the same time allows micropayments to support websites you want to support. I use Brave on my iPhone.

A side benefit of ad blockers or Brave is that web surfing becomes much faster, since you aren’t downloading all this advertising. The smallest NYTimes story is 15 megabytes in size due to all the advertisements, for example.

5. Cloud Backups
Do backups, in the cloud. It’s a good idea in general, especially with the threat of ransomware these days.

In particular, consider your photos. Over time, they will be lost, because people make no effort to keep track of them. All hard drives will eventually crash, deleting your photos. Sure, a few key ones are backed up on Facebook for life, but the rest aren’t.

There are so many excellent online backup services out there, like Dropbox and Backblaze. Or, you can use the iCloud feature that Apple provides. My favorite is Microsoft’s: I already pay $99 a year for an Office 365 subscription, and it comes with 1 terabyte of online storage.

6. Separate email accounts
You should have three email accounts: work, personal, and financial.

First, you really need to separate your work account from personal. The IT department is already getting misdirected emails between you and your spouse/lover that they don’t want to see. Any conflict with your work, such as getting fired, hands your private correspondence to their lawyers.

Second, you need a wholly separate account for financial stuff, like Amazon.com, your bank, PayPal, and so on. That prevents confusion with phishing attacks.

Consider this warning today:

If you had split accounts, you could safely ignore this. The USPS would only know your financial email account, which gets no phishing attacks because it’s not widely known. When you receive the phishing attack on your personal email, you ignore it, because you know the USPS doesn’t know your personal email account.

Phishing emails are so sophisticated that even experts can’t tell the difference. Splitting financial from personal emails makes it so you don’t have to tell the difference — anything financial sent to personal email can safely be ignored.

7. Deauth those apps!

Twitter user @tompcoleman comments that we also need to deauth apps.

Social media sites like Facebook, Twitter, and Google encourage you to enable “apps” that work with their platforms, often demanding privileges to generate messages on your behalf. The typical scenario is that you use them only once or twice and forget about them.

A lot of them are hostile. For example, my niece’s Twitter account would occasionally send out advertisements, and she didn’t know why. It’s because a long time ago she enabled an app with the permission to send tweets for her. I had to sit down and get rid of most of her apps.

Now would be a good time to go through your relatives’ Facebook, Twitter, and Google/GMail accounts and disable those apps. Don’t be afraid to be ruthless; they probably weren’t using them anyway. Some will still be necessary. For example, Twitter for iPhone shows up in the list of Twitter apps. The URL for editing these apps for Twitter is https://twitter.com/settings/applications. The Google link is here (thanks @spextr). I don’t know of simple URLs for Facebook, but you should find it somewhere under privacy/security settings.

Update: Here’s a more complete guide for even more social media services:
https://www.permissions.review/

8. Up-to-date software? maybe

I put this last because it can be so much work.

You should install the latest OS (Windows 10, macOS High Sierra), and also turn on automatic patching.

But remember it may not be worth the huge effort involved. I want my parents to be secure, but not so secure that I have to deal with issues.

For example, when my parents updated their HP Print software, the icon on the desktop that my mom usually uses to scan things in from the printer disappeared, and I needed to spend 15 minutes with her helping find the new way to access the software.

However, I did get my mom a new netbook to travel with instead of the old WinXP one. I want to get her a Chromebook, but she doesn’t want one.

For iOS, you can probably make sure their phones have the latest version without having these usability problems.

Conclusion

You can’t solve every problem for your relatives, but these are the more critical ones.

Backing Up WordPress

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/backing-up-wordpress/

WordPress cloud backup
WordPress logo

WordPress is the most popular CMS (Content Management System) for websites, with almost 30% of all websites in the world using WordPress. That’s a lot of sites — over 350 million!

In this post we’ll talk about the different approaches to keeping the data on your WordPress website safe.


Stop the Presses! (Or the Internet!)

As we were getting ready to publish this post, we received news from UpdraftPlus, one of the biggest WordPress plugin developers, that they are supporting Backblaze B2 as a storage solution for their backup plugin. They shipped the update (1.13.9) this week. This is great news for Backblaze customers! UpdraftPlus is also offering a 20% discount to Backblaze customers wishing to purchase or upgrade to UpdraftPlus Premium. The complete information is below.

UpdraftPlus joins backup plugin developer XCloner — Backup and Restore in supporting Backblaze B2. A third developer, BlogVault, also announced their intent to support Backblaze B2. Contact your favorite WordPress backup plugin developer and urge them to support Backblaze B2, as well.

Now, back to our post…


Your WordPress website data is on a web server that’s most likely located in a large data center. You might wonder why it is necessary to have a backup of your website if it’s in a data center. Website data can be lost in a number of ways, including mistakes by the website owner (been there), hacking, or even domain ownership dispute (I’ve seen it happen more than once). A website backup also can provide a history of changes you’ve made to the website, which can be useful. As an overall strategy, it’s best to have a backup of any data that you can’t afford to lose for personal or business reasons.

Your web hosting company might provide backup services as part of your hosting plan. If you are using their service, you should know where and how often your data is being backed up. You don’t want to find out too late that your backup plan was not adequate.

Sites on WordPress.com are automatically backed up by VaultPress (Automattic), which also is available for self-hosted WordPress installations. If you don’t want the work or decisions involved in managing the hosting for your WordPress site, WordPress.com will handle it for you. You do, however, give up some customization abilities, such as the option to add plugins of your own choice.

Very large and active websites might consider WordPress VIP by Automattic, or another premium WordPress hosting service such as Pagely.com.

This post is about backing up self-hosted WordPress sites, so we’ll focus on those options.

WordPress Backup

Backup strategies for WordPress can be divided into broad categories depending on 1) what you back up, 2) when you back up, and 3) where the data is backed up.

With server data, such as with a WordPress installation, you should plan to have three copies of the data (the 3-2-1 backup strategy). The first is the active data on the WordPress web server, the second is a backup stored on the web server or downloaded to your local computer, and the third should be in another location, such as the cloud.

We’ll talk about the different approaches to backing up WordPress, but we recommend using a WordPress plugin to handle your backups. A backup plugin can automate the task, optimize your backup storage space, and alert you to problems with your backups or WordPress itself. We’ll cover plugins in more detail, below.

What to Back Up?

The main components of your WordPress installation are:

  • The WordPress core installation
  • Plugins
  • Themes
  • Media files (images, audio, video, and other uploads)
  • The MySQL database that stores your posts and pages

You should decide which of these elements you wish to back up. The database is the top priority, as it contains all your website posts and pages (exclusive of media). Your current theme is important, as it likely contains customizations you’ve made. Following those in priority are any other files you’ve customized or made changes to.

You can choose to back up the WordPress core installation and plugins, if you wish, but these files can be downloaded again if necessary from the source, so you might not wish to include them. You likely have all the media files you use on your website on your local computer (which should be backed up), so it is your choice whether to back these up from the server as well.

If you wish to be able to recreate your entire website easily in case of data loss or disaster, you might choose to back up everything, though on a large website this could be a lot of data.

Generally, you should 1) prioritize any file that you’ve customized that you can’t afford to lose, and 2) decide whether you need a copy of everything in order to get your site back up quickly. These choices will determine your backup method and the amount of storage you need.

A good backup plugin for WordPress enables you to specify which files you wish to back up, and even to create separate backups and schedules for different backup contents. That’s another good reason to use a plugin for backing up WordPress.

When to Back Up?

You can back up manually at any time by using the Export tool in WordPress. This is handy if you wish to do a quick backup of your site or parts of it. Because it is manual, however, it cannot serve as the foundation of a dependable backup plan, which should run regularly. If you wish to use this tool, go to Tools, Export, and select what you wish to back up. The output will be an XML file that uses the WordPress Extended RSS format, also known as WXR. You can create a WXR file that contains all of the information on your site or just portions of it, such as posts or pages, by selecting All content, Posts, Pages, or Media.
Note: You can use WordPress’s Export tool for sites hosted on WordPress.com, as well.
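Because the export is plain XML, you can inspect it with ordinary tools. The sketch below is an illustration, not an official WordPress utility; it assumes only the standard RSS channel/item layout that WXR builds on, and the sample document is made up:

```python
import xml.etree.ElementTree as ET

def list_wxr_titles(wxr_text):
    """Return the titles of all items in a WXR (WordPress eXtended RSS) export."""
    root = ET.fromstring(wxr_text)
    # WXR is based on RSS 2.0: posts and pages live in <channel><item> elements.
    return [item.findtext("title", default="(untitled)")
            for item in root.find("channel").iter("item")]

# A made-up, minimal WXR-style document for illustration:
sample = """<rss version="2.0">
  <channel>
    <title>My Site</title>
    <item><title>Hello World</title></item>
    <item><title>About</title></item>
  </channel>
</rss>"""

print(list_wxr_titles(sample))  # → ['Hello World', 'About']
```

A real WXR file carries much more (namespaced `wp:` elements for comments, metadata, and attachments), but the same parsing approach applies.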

Export instruction for WordPress

Many of the backup plugins we’ll be discussing later also let you do a manual backup on demand in addition to regularly scheduled or continuous backups.

Note:  Another use of the WordPress Export tool and the WXR file is to transfer or clone your website to another server. Once you have exported the WXR file from the website you wish to transfer from, you can import the WXR file from the Tools, Import menu on the new WordPress destination site. Be aware that there are file size limits depending on the settings on your web server. See the WordPress Codex entry for more information. To make this job easier, you may wish to use one of a number of WordPress plugins designed specifically for this task.

You also can manually back up the WordPress MySQL database using a number of tools or a plugin. The WordPress Codex has good information on this. The backup plugins we discuss below will handle this for you automatically, and they typically include tools for optimizing the database tables, which is just good housekeeping.
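Under the hood, a manual database backup usually boils down to running `mysqldump`. The sketch below is an illustration, not code from any particular plugin; the database and user names are placeholders, and it only builds the command rather than executing it:

```python
from datetime import datetime

def mysqldump_command(db_name, db_user, db_host="localhost", out_dir="."):
    """Build a mysqldump invocation that writes a timestamped .sql file.

    The password is intentionally omitted; supply it via ~/.my.cnf or the
    MYSQL_PWD environment variable rather than on the command line, where
    it would be visible in the process list.
    """
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    outfile = f"{out_dir}/{db_name}-{stamp}.sql"
    cmd = ["mysqldump", "--host", db_host, "--user", db_user,
           "--single-transaction",        # consistent snapshot for InnoDB tables
           db_name, "--result-file", outfile]
    return cmd, outfile

# Example usage (placeholders): cmd, path = mysqldump_command("wordpress", "wp_user")
# subprocess.run(cmd, check=True) would then create the dump file.
```

The resulting `.sql` file is exactly what you would feed back to `mysql` to restore the database.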

A dependable backup strategy doesn’t rely on manual backups, which means you should consider using one of the many backup plugins available either free or for purchase. We’ll talk more about them below.

Which Format To Back Up In?

In addition to the WordPress WXR format, plugins and server tools will use various file formats and compression algorithms to store and compress your backup. You may get to choose between zip, tar, tar.gz, tar.gz2, and others. See The Most Common Archive File Formats for more information on these formats.

Select a format that you know you can access and unarchive should you need access to your backup. All of these formats are standard and supported across operating systems, though you might need to download a utility to access the file.

Where To Back Up?

Once you have your data in a suitable format for backup, where do you back it up to?

We want to have multiple copies of our active website data, so we’ll choose more than one destination for our backup data. The backup plugins we’ll discuss below enable you to specify one or more destinations for your backup. The possibilities include:

A backup folder on your web server
A backup folder on your web server is an OK solution if you also have a copy elsewhere. Depending on your hosting plan, the size of your site, and what you include in the backup, you may or may not have sufficient disk space on the web server. Some backup plugins allow you to configure the plugin to keep only a certain number of recent backups and delete older ones, saving you disk space on the server.
Email to you
Because email servers have size limitations, the email option is not the best one to use unless you use it to specifically back up just the database or your main theme files.
FTP, SFTP, SCP, WebDAV
FTP, SFTP, SCP, and WebDAV are all widely-supported protocols for transferring files over the internet and can be used if you have access credentials to another server or supported storage device that is suitable for storing a backup.
Sync service (Dropbox, SugarSync, Google Drive, OneDrive)
A sync service is another possible storage location, though it can be a pricier choice depending on the plan you have and how much you wish to store.
Cloud storage (Backblaze B2, Amazon S3, Google Cloud, Microsoft Azure, Rackspace)
A cloud storage service can be an inexpensive and flexible option with pay-as-you go pricing for storing backups and other data.

A good website backup strategy would be to have multiple backups of your website data: one in a backup folder on your web hosting server, one downloaded to your local computer, and one in the cloud, such as with Backblaze B2.

If I had to choose just one of these, I would choose backing up to the cloud because it is geographically separated from both your local computer and your web host, it uses fault-tolerant and redundant data storage technologies to protect your data, and it is available from anywhere if you need to restore your site.

Backup Plugins for WordPress

Probably the easiest and most common way to implement a solid backup strategy for WordPress is to use one of the many backup plugins available for WordPress. Fortunately, a number of good ones are available free or under “freemium” plans, in which you use the free version and pay only if you need more features and capabilities. The premium options can give you more flexibility in configuring backups or additional options for where you can store them.

How to Choose a WordPress Backup Plugin

screenshot of WordPress plugins search

When considering which plugin to use, take the following factors into account in making your choice.

Is the plugin actively maintained and up-to-date? You can determine this from the listing in the WordPress Plugin Repository. You also can look at reviews and support comments to get an idea of user satisfaction and how well issues are resolved.

Does the plugin work with your web hosting provider? Generally, well-supported plugins do, but you might want to check to make sure there are no issues with your hosting provider.

Does it support the cloud service or protocol you wish to use? This can be determined from looking at the listing in the WordPress Plugin Repository or on the developer’s website. Developers often will add support for cloud services or other backup destinations based on user demand, so let the developer know if there is a feature or backup destination you’d like them to add to their plugin.

Other features and options to consider in choosing a backup plugin are:

  • Whether encryption of your backup data is available
  • What are the options for automatically deleting backups from the storage destination?
  • Can you globally exclude files, folders, and specific types of files from the backup?
  • Do the options for scheduling automatic backups meet your needs for frequency?
  • Can you exclude/include specific database tables (a good way to save space in your backup)?
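The global exclude capability in this checklist generally boils down to glob-style pattern matching. Here is a minimal sketch; the patterns and file paths are hypothetical examples:

```python
import fnmatch

def should_back_up(path, excludes):
    """Return False if path matches any glob-style exclude pattern."""
    return not any(fnmatch.fnmatch(path, pat) for pat in excludes)

# Hypothetical exclude rules: skip caches, logs, and dependency folders.
excludes = ["*/cache/*", "*.log", "*/node_modules/*"]
files = ["wp-content/cache/page1.html",
         "wp-content/debug.log",
         "wp-content/themes/mytheme/style.css"]
kept = [f for f in files if should_back_up(f, excludes)]
print(kept)  # → ['wp-content/themes/mytheme/style.css']
```

Real plugins layer scheduling and per-backup profiles on top, but the filtering core looks much like this.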

WordPress Backup Plugins Review

Let’s review a few of the top choices for WordPress backup plugins.

UpdraftPlus

UpdraftPlus is one of the most popular backup plugins for WordPress with over one million active installations. It is available in both free and Premium versions.

UpdraftPlus just released support for Backblaze B2 Cloud Storage in their 1.13.9 update on September 25. According to the developer, support for Backblaze B2 was the most frequent request for a new storage option for their plugin. B2 support is available in their Premium plugin and as a stand-alone update to their standard product.

Note: The developers of UpdraftPlus are offering a special 20% discount to Backblaze customers on the purchase of UpdraftPlus Premium by using the coupon code backblaze20. The discount is valid until the end of Friday, October 6th, 2017.

screenshot of Backblaze B2 cloud backup for WordPress in UpdraftPlus

XCloner — Backup and Restore

XCloner — Backup and Restore is a useful open-source plugin with many options for backing up WordPress.

XCloner supports B2 Cloud Storage in their free plugin.

screenshot of XCloner WordPress Backblaze B2 backup settings

BlogVault

BlogVault describes themselves as a “complete WordPress backup solution.” They offer a free trial of their paid WordPress backup subscription service that features real-time backups of changes to your WordPress site, as well as many other features.

BlogVault has announced their intent to support Backblaze B2 Cloud Storage in a future update.

screenshot of BlogValut WordPress Backup settings

BackWPup

BackWPup is a popular and free option for backing up WordPress. It supports a number of options for storing your backup, including the cloud, FTP, email, or on your local computer.

screenshot of BackWPup WordPress backup settings

WPBackItUp

WPBackItUp has been around since 2012 and is highly rated. It has both free and paid versions.

screenshot of WPBackItUp WordPress backup settings

VaultPress

VaultPress is part of Automattic’s well-known WordPress product, Jetpack. You will need a Jetpack subscription plan to use VaultPress. There are different pricing plans with different sets of features.

screenshot of VaultPress backup settings

Backup by Supsystic

Backup by Supsystic supports a number of options for backup destinations, encryption, and scheduling.

screenshot of Backup by Supsystic backup settings

BackUpWordPress

BackUpWordPress is an open-source project on Github that has a popular and active following and many positive reviews.

screenshot of BackupWordPress WordPress backup settings

BackupBuddy

BackupBuddy, from iThemes, is the old-timer of backup plugins, having been around since 2010. iThemes knows a lot about WordPress, as they develop plugins, themes, utilities, and provide training in WordPress.

BackupBuddy’s backup includes all WordPress files, all files in the WordPress Media library, WordPress themes, and plugins. BackupBuddy generates a downloadable zip file of the entire WordPress website. Remote storage destinations also are supported.

screenshot of BackupBuddy settings

WordPress and the Cloud

Do you use WordPress and back up to the cloud? We’d like to hear about it. We’d also like to hear whether you are interested in using B2 Cloud Storage for storing media files served by WordPress. If you are, we’ll write about it in a future post.

In the meantime, keep your eye out for new plugins supporting Backblaze B2, or better yet, urge them to support B2 if they’re not already.

The Best Backup Strategy is the One You Use

There are other approaches and tools for backing up WordPress that you might use. If you have an approach that works for you, we’d love to hear about it in the comments.

The post Backing Up WordPress appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Strategies for Backing Up Windows Computers

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/strategies-for-backing-up-windows-computers/

Windows 7, Windows 8, Windows 10 logos

There’s a little company called Apple making big announcements this week, but about 45% of you are on Windows machines, so we thought it would be a good idea to devote a blog post today to Windows users and the options they have for backing up Windows computers.

We’ll be talking about the various options for backing up the Windows desktop operating systems (7, 8, and 10) and Windows servers. We’ve written previously about this topic in How to Back Up Windows, and Computer Backup Options, but we’ll be covering some new topics and ways to combine strategies in this post. So, if you’re a Windows user looking for shelter from all the Apple hoopla, welcome to our Apple Announcement Day Windows Backup Day post.

Windows laptop

First, Let’s Talk About What We Mean by Backup

This might seem to our readers like an unneeded appetizer on the way to the main course of our post, but we at Backblaze know that people often mean very different things when they use backup and related terms. Let’s start by defining what we mean when we say backup, cloud storage, sync, and archive.

Backup
A backup is an active copy of the system or files that you are using. It is distinguished from an archive, which is the storing of data that is no longer in active use. Backups fall into two main categories: file and image. File backup software will back up whichever files you designate, either by letting you include the files you wish backed up, by excluding the files you don’t, or both. An image backup, sometimes called a disaster recovery backup or a system clone, is useful if you need to recreate your system on a new drive or computer.
The first backup generally will be a full backup of all files. After that, the backup will be incremental, meaning that only files that have been changed since the full backup will be added. Often, the software will keep changed versions of the files for some period of time, so you can maintain a number of previous revisions of your files in case you wish to return to something in an earlier version of your file.
The destination for your backup could be another drive on your computer, an attached drive, network-attached storage (NAS), or the cloud.
Cloud Storage
Cloud storage vendors supply data storage just as a utility company supplies power, gas, or water. Cloud storage can be used for data backups, but it can also be used for data archives, application data, records, or libraries of photos, videos, and other media.
You contract with the service for storing any type of data, and the storage location is available to you via the internet. Cloud storage providers generally charge by some combination of data ingress, egress, and the amount of data stored.
Sync
File sync is useful for files that you wish to have access to from different places or computers, or for files that you wish to share with others. While sync has its uses, it has limitations for keeping files safe. As opposed to backup, which keeps revisions of files, sync is designed to keep two or more locations exactly the same. Sync costs are based on how much data you sync and can get expensive for large amounts of data.
Archive
A data archive is for data that is no longer in active use but needs to be saved, and may or may not ever be retrieved again. In old-style storage parlance, it is called cold storage. An archive could be stored with a cloud storage provider, or put on a hard drive or flash drive that you disconnect and put in the closet, or mail to your brother in Idaho.

What’s the Best Strategy for Backing Up?

Now that we’ve got our terminology clear, let’s talk backup strategies for Windows.

At Backblaze, we advocate the 3-2-1 strategy for safeguarding your data, which means that you should maintain three copies of any valuable data — two copies stored locally and one stored remotely. I follow this strategy at home by working on the active data on my Windows 10 desktop computer (copy one), which is backed up to a Drobo RAID device attached via USB (copy two), and backing up the desktop to Backblaze’s Personal Backup in the cloud (copy three). I also keep an image of my primary disk on a separate drive and frequently update it using Windows 10’s image tool.

I use Dropbox for sharing specific files I am working on that I might wish to have access to when I am traveling or on another computer. Once my subscription with Dropbox expires, I’ll use the latest release of Backblaze that has individual file preview with sharing built-in.

Before you decide which backup strategy will work best for your situation, you’ll need to ask yourself a number of questions. These questions include where you wish to store your backups, whether you wish to supply your own storage media, whether the backups will be manual or automatic, and whether limited or unlimited data storage will work best for you.

Strategy 1 — Back Up to a Local or Attached Drive

The first copy of the data you are working on is often on your desktop or laptop. You can create a second copy of your data on another drive or directory on your computer, or copy the data to a drive directly attached to your computer, such as via USB.

external hard drive and RAID NAS devices

Windows has built-in tools for both file and image level backup. Depending on which version of Windows you use, these tools are called Backup and Restore, File History, or Image. These tools enable you to set a schedule for automatic backups, which ensures that backups happen regularly. You also have the choice to use Windows Explorer (aka File Explorer) to manually copy files to another location. Some external disk drives and USB flash drives come with their own backup software, and other backup utilities are available for free or for purchase.

Windows Explorer File History screenshot

This is a supply-your-own media solution, meaning that you need to have a hard disk or other medium available of sufficient size to hold all your backup data. When a disk becomes full, you’ll need to add a disk or swap out the full disk to continue your backups.

We’ve written previously on this strategy at Should I use an external drive for backup?

Strategy 2 — Back Up to a Local Area Network (LAN)

Computers, servers, and network-attached storage (NAS) on your local network all can be used for backing up data. Microsoft’s built-in backup tools can be used for this job, as can any utility that supports network protocols such as NFS or SMB/CIFS, which are common protocols that allow shared access to files on a network for Windows and other operating systems. There are many third-party applications available as well that provide extensive options for managing and scheduling backups and restoring data when needed.

NAS cloud

Multiple computers can be backed up to a single network-shared computer, server, or NAS, which in turn can be backed up to the cloud, rounding out a nice backup strategy that covers both local and remote copies of your data. System images of multiple computers on the LAN can be included in these backups if desired.

Again, you are managing the backup media on the local network, so you’ll need to be sure you have sufficient room on the destination drives to store all your backup data.

Strategy 3 — Back Up to Detached Drive at Another Location

You may have read our recent blog post, Getting Data Archives Out of Your Closet, in which we discuss the practice of filling hard drives and storing them in a closet. Of course, to satisfy the off-site backup guideline, these drives would need to be stored in a closet that’s in a different geographical location than your main computer. If you’re willing to do all the work of copying the data to drives and transporting them to another location, this is a viable option.

stack of hard drives

The only limitation to the amount of backup data is the number of hard drives you are willing to purchase — and maybe the size of your closet.

Strategy 4 — Back Up to the Cloud

Backing up to the cloud has become a popular option for a number of reasons. Internet speeds have made moving large amounts of data possible, and not having to worry about supplying the storage media simplifies choices for users. Additionally, cloud vendors implement features such as data protection, deduplication, and encryption as part of their services that make cloud storage reliable, secure, and efficient. Unlimited cloud storage for data from a single computer is a popular option.

A backup vendor likely will provide a software client that runs on your computer and backs up your data to the cloud in the background while you’re doing other things, such as Backblaze Personal Backup, which has clients for Windows computers, Macintosh computers, and mobile apps for both iOS and Android. For restores, Backblaze users can download one or all of their files for free from anywhere in the world. Optionally, a 128 GB flash drive or 4 TB drive can be overnighted to the customer, with a refund available if the drive is returned.

Storage Pod in the cloud

Backblaze B2 Cloud Storage is an option for those who need capabilities beyond Backblaze’s Personal Backup. B2 provides cloud storage that is priced based on the amount of data the customer uses, and is suitable for long-term data storage. B2 supports integrations with NAS devices, as well as Windows, Macintosh, and Linux computers and servers.

Services such as Backblaze B2 are often called cloud object storage or IaaS (Infrastructure as a Service) because they provide a complete solution for storing all types of data. B2 has its own API (Application Programming Interface) and CLI (Command-Line Interface), but it becomes even more powerful when paired with any of a number of data storage and management solutions from third parties who offer both hardware and software integrations.

Backing Up Windows Servers

Windows Servers are popular workstations for some users, and provide needed network services for others. They also can be used to store backups from other computers on the network. They, in turn, can be backed up to attached drives or the cloud. While our Personal Backup client doesn’t support Windows servers, our B2 Cloud Storage has a number of integrations with vendors who supply software or hardware for storing data both locally and on B2. We’ve written a number of blog posts and articles that address these solutions, including How to Back Up your Windows Server with B2 and CloudBerry.

Sometimes the Best Strategy is to Mix and Match

The great thing about computers, software, and networks is that there is an endless number of ways to combine them. Our users and hardware and software partners are ingenious in configuring solutions that save data locally, copy it to an attached or network drive, and then store it to the cloud.

image of cloud backup

Among our B2 partners, Synology, CloudBerry, Archiware, QNAP, Morro Data, and GoodSync have integrations that allow their applications and NAS devices to store and retrieve data to and from B2 Cloud Storage. For a drag-and-drop experience on the desktop, take a look at CyberDuck, MountainDuck, and Dropshare, which provide users with an easy and interactive way to store and use data in B2.

If you’d like to explore more options for combining software, hardware, and cloud solutions, we invite you to browse the integrations for our many B2 partners.

Have Questions?

Windows versions, tools, and backup terminology all can be confusing, and we know how hard it can be to make sense of all of it. If there’s something we haven’t addressed here, or if you have a question or contribution, please let us know in the comments.

And happy Windows Backup Day! (Just don’t tell Apple.)

The post Strategies for Backing Up Windows Computers appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Top Ten Ways to Protect Yourself Against Phishing Attacks

Post Syndicated from Roderick Bauer original https://www.backblaze.com/blog/top-ten-ways-protect-phishing-attacks/

It’s hard to miss the increasing frequency of phishing attacks in the news. Earlier this year, a major phishing attack targeted Google Docs users, and attempted to compromise at least one million Google Docs accounts. Experts say the “phish” was convincing and sophisticated, and even people who thought they would never be fooled by a phishing attack were caught in its net.

What is phishing?

Phishing attacks use seemingly trustworthy but malicious emails and websites to obtain your personal account or banking information. The attacks are cunning and highly effective because they often appear to come from an organization or business you actually use. The scam comes into play by tricking you into visiting a website you believe belongs to the trustworthy organization, but in fact is under the control of the phisher attempting to extract your private information.

Phishing attacks are once again in the news due to a handful of high profile ransomware incidents. Ransomware invades a user’s computer, encrypts their data files, and demands payment to decrypt the files. Ransomware most often makes its way onto a user’s computer through a phishing exploit, which gives the ransomware access to the user’s computer.

The best strategy against phishing is to scrutinize every email and message you receive and never get caught. That is easier said than done; even smart people sometimes fall victim to a phishing attack. To minimize the damage in the event of a successful attack, backing up your data is the ultimate defense and should be part of your anti-phishing and overall anti-malware strategy.

How do you recognize a phishing attack?

A phishing attacker may send an email seemingly from a reputable credit card company or financial institution that requests account information, often suggesting that there is a problem with your account. When users respond with the requested information, attackers can use it to gain access to the accounts.

The image below is a mockup of how a phishing attempt might appear. In this example, courtesy of Wikipedia, the bank is fictional, but in a real attempt the sender would use an actual bank, perhaps even the bank where the targeted victim does business. The sender is attempting to trick the recipient into revealing confidential information by getting the victim to visit the phisher’s website. Note the misspelling of the words “received” and “discrepancy” as recieved and discrepency. Misspellings sometimes are indications of a phishing attack. Also note that although the URL of the bank’s webpage appears to be legitimate, the hyperlink would actually take you to the phisher’s webpage, which would be altogether different from the URL displayed in the message.

By Andrew Levine – en:Image:PhishingTrustedBank.png, Public Domain, https://commons.wikimedia.org/w/index.php?curid=549747

Top ten ways to protect yourself against phishing attacks

  1. Always think twice when presented with a link in any kind of email or message before you click on it. Ask yourself whether the sender would really ask you to do what the message is requesting. Most banks and reputable service providers won’t ask you to reveal your account information or password via email. If in doubt, don’t use the link in the message and instead open a new webpage and go directly to the known website of the organization. Sign in to the site in the normal manner to verify that the request is legitimate.
  2. A good precaution is to always hover over a link before clicking on it and observe the status line in your browser to verify that the link in the text and the destination link are in fact the same.
  3. Phishers are clever and getting better all the time, and you might be fooled by a simple ruse into thinking a link is one you recognize. Links can have hard-to-detect misspellings that would result in visiting a site very different from what you expected.
  4. Be wary even of emails and messages from people you know. It’s very easy to spoof an email so it appears to come from someone you know, or to create a URL that appears to be legitimate, but isn’t.

For example, let’s say that you work for roughmedia.com and you get an email from Chuck in accounting ([email protected]) that has an attachment for you, perhaps a company form you need to fill out. You likely wouldn’t notice in the sender address that the phisher has replaced the “m” in media with an “r” and an “n” that look very much like an “m.” You think it’s good old Chuck in accounting, and it’s actually someone “phishing” for you to open the attachment and infect your computer. This type of attack is known as “spear phishing” because it’s targeted at a specific individual and uses social engineering—specifically familiarity with the sender—as part of the scheme to fool you into trusting the attachment. This technique is by far the most successful on the internet today. (This example is based on Gimlet Media’s Reply All Podcast Episode, “What Kind of Idiot Gets Phished?“)
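Tricks like the “rn”-for-“m” substitution in this example can even be flagged mechanically. The following is a toy sketch, not a production phishing detector; the substitution table is a small, illustrative sample of common lookalikes:

```python
# A few common character-substitution tricks (illustrative, not exhaustive).
LOOKALIKES = {"rn": "m", "vv": "w", "0": "o", "1": "l"}

def normalize(domain):
    """Collapse common lookalike sequences to a canonical form."""
    d = domain.lower()
    for fake, real in LOOKALIKES.items():
        d = d.replace(fake, real)
    return d

def is_suspicious(sender_domain, trusted_domains):
    """Flag a domain that isn't trusted but normalizes to a trusted one."""
    if sender_domain in trusted_domains:
        return False
    return normalize(sender_domain) in {normalize(t) for t in trusted_domains}

# The spoof from the example above:
print(is_suspicious("roughrnedia.com", {"roughmedia.com"}))  # → True
```

Mail providers use far more sophisticated versions of this idea (Unicode homoglyph tables, edit distance against known brands), but the principle is the same: a domain that is almost, but not quite, one you trust deserves suspicion.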

  5. Use anti-malware software, but don’t rely on it to catch all attacks. Phishers change their approach often to keep ahead of the software attack detectors.
  6. If you are asked to enter any valuable information, only do so if you’re on a secure connection. Look for the “https” prefix before the site URL, indicating the site is employing SSL (Secure Sockets Layer). If there is no “s” after “http,” it’s best not to enter any confidential information.
By Fabio Lanari – Internet1.jpg by Rock1997 modified., GFDL, https://commons.wikimedia.org/w/index.php?curid=20995390
  7. Avoid logging in to online banks and similar services via public Wi-Fi networks. Criminals can compromise open networks with man-in-the-middle attacks that capture your information or spoof website addresses over the connection and redirect you to a fake page they control.
  8. Email, instant messaging, and gaming social channels are all possible vehicles to deliver phishing attacks, so be vigilant!
  9. Lay the foundation for a good defense by choosing reputable tech vendors and service providers that respect your privacy and take steps to protect your data. At Backblaze, we have full-time security teams constantly looking for ways to improve our security.
  10. When it is available, always take advantage of multi-factor verification to protect your accounts. The standard categories used for authentication are 1) something you know (e.g. your username and password), 2) something you are (e.g. your fingerprint or retina pattern), and 3) something you have (e.g. an authenticator app on your smartphone). An account that allows only a single factor for authentication is more susceptible to hacking than one that supports multiple factors. Backblaze supports multi-factor authentication to protect customer accounts.
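Two of the habits above, checking that a link’s visible text matches its real destination and insisting on https, are easy to express in code. Here is a minimal sketch using only the standard library; the URLs are invented for illustration:

```python
from urllib.parse import urlparse

def link_warnings(visible_text, href):
    """Return a list of red flags for a hyperlink."""
    warnings = []
    target = urlparse(href)
    if target.scheme != "https":
        warnings.append("destination is not https")
    # If the visible text looks like a URL, its host should match the real one.
    shown = urlparse(visible_text if "://" in visible_text
                     else "http://" + visible_text)
    if shown.hostname and shown.hostname != target.hostname:
        warnings.append("visible link text does not match destination host")
    return warnings

# An invented example of a mismatched, insecure link:
print(link_warnings("www.trustedbank.com", "http://phisher.example/login"))
```

This is essentially what you do by hand when you hover over a link and read the status line; it catches the two most common tells in one pass.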

Be a good internet citizen, and help reduce phishing and other malware attacks by notifying the organization being impersonated in the phishing attempt, or by forwarding suspicious messages to the Federal Trade Commission at [email protected]. Some email clients and services, such as Microsoft Outlook and Google Gmail, give you the ability to easily report suspicious emails. Phishing emails misrepresenting Apple can be reported to [email protected].

Backing up your data is an important part of a strong defense against phishing and other malware

The best way to avoid becoming a victim is to be vigilant against suspicious messages and emails, but also to assume that no matter what you do, it is very possible that your system will be compromised. Even the most sophisticated and tech-savvy of us can be ensnared if we are tired, in a rush, or just unfamiliar with the latest methods hackers are using. Remember that hackers are working full-time on ways to fool us, so it’s very difficult to keep ahead of them.

The best defense is to make sure that any data that could be compromised by hackers—basically all of the data that is reachable via your computer—is not your only copy. You do that by maintaining an active and reliable backup strategy.

Files that are backed up to cloud storage, such as with Backblaze, are not vulnerable to attacks on your local computer in the way that local files, attached drives, network drives, or sync services like Dropbox that have local directories on your computer are.

In the event that your computer is compromised and your files are lost or encrypted, you can recover your files if you have a cloud backup that is beyond the reach of attacks on your computer.

The post Top Ten Ways to Protect Yourself Against Phishing Attacks appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Why Consumer Design is Good For Business

Post Syndicated from Yev original https://www.backblaze.com/blog/why-consumer-design-is-good-for-business/

Using the Backblaze Cloud Backup App

We know that business users sometimes ask, “Why can’t business software be as easy to use as consumer software?”

At Backblaze, we believe it can be.

We started our business to make backup easier for everyone, knowing that the primary reason people don’t back up is that it is too complicated and too intimidating.

Backblaze has spent the last decade building an unlimited, inexpensive, and best of all easy-to-use backup service. We designed it from the ground up, with the goal of making it a simple service – one that “just works.” We wanted it to be the easiest backup solution for grandmothers and IT administrators alike.

Having a product that’s intuitive and easy makes it ideal for people who don’t want to fret about backing up or worry about whether they selected the right files when their backup system was set up. Backblaze backs up all user data by default, so there’s no worrying about missing something. That means when you use Backblaze for Business, you’re getting a solution that works out of the box not just for the end-user, but also for the account administrator.

Design for Enterprise Scalability but With Consumer Simplicity

Oftentimes, when a product is designed “for enterprise,” the result is an unintuitive piece of software that only the systems administrators can navigate. While that may be acceptable for antivirus or anti-spam software, there are many products and services that should not require hours to learn. Some of the most common services businesses use today are known for their ease of use. Dropbox Sync, Trello, and Slack come to mind.

Backblaze Online Backup is much the same. Regardless of whether you have one computer or are deploying to an organization of 1,000, Backblaze scales so that you and all your users get the same, simple service that backs up and makes data accessible.

Overcomplexity Reduces Efficiency

The last thing an IT professional wants is users asking them how a program on their computer works, or complaining about a process that’s supposed to be running in the background. The more bloated and over-designed products and services get, the more stumbling blocks appear before the end-user. When you’re developing a product there’s a fine line between adding features and creating an overwhelmingly complicated user interface. The cost of getting that balance wrong is that it will raise more questions than it provides answers, leaving customers and end-users confused with too many choices. Many of the players in the online backup space have made confusing design choices that leave customers perplexed. We believe easy is better for everyone.

Backblaze for Business is built on top of our award winning Computer Backup product that has been in market for over 10 years. We have over 350 PB under storage and have helped users save over 23 BILLION files. We know a lot about backup.

But businesses have unique needs, such as centralized user management and billing, reporting, monitoring usage, and the ability to act on behalf of any user. When an end-user (or the IT admin) installs Backblaze, the backup starts automatically, backing up all the user-data on the machine. There’s no need to select files or folders. The backup process just starts, because all of the data is important. We’ve heard time and time again that a user’s files were saved because we backed up an obscure directory where one or two important files would have been forgotten about had the user been forced to choose what to back up.

Backblaze just works—for everyone

The best products are the ones that don’t impede your workflow and work seamlessly with the processes you have in place. That’s another reason designing with the end-user in mind is helpful: you build software that is aware of its environment (not everyone has top-of-the-line computing systems) and stays out of the way.

Making sure that people are diligent about their backup strategy is hard enough. At Backblaze we believe that simplicity is key, and that’s why we designed a backup service that scales from 1 to 10,000 — without having to change a setting.

The post Why Consumer Design is Good For Business appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Disclosing User Data for Security Purposes: The Question of Transparency

Post Syndicated from nellyo original https://nellyo.wordpress.com/2017/07/13/orders_data/

Internet companies receive requests from various law-enforcement bodies and agencies to hand over data on their users. The legal regimes differ, but these are often requests accompanied by authorization from a judicial body. At the same time, such requests may also come with restrictions (content-based prior restraints) forbidding the companies from informing users that such investigations are taking place.

The Guardian reports that Facebook, Google, Apple, Microsoft, Twitter, Dropbox, Yelp, and Avvo, along with digital-rights organizations such as the ACLU, the Electronic Frontier Foundation, Access Now, and others, take the following position: the government should not impose non-disclosure obligations that would effectively prevent users from exercising their constitutional rights. The requests at issue are believed to seek the disclosure of data on participants in protests against Trump’s policies.

Twitter has been litigating a case since 2014 – and not over notifying individual users that their accounts are being monitored, but over much more general information: the number of data-disclosure requests made on national-security grounds. A California court confirmed that in this particular case the US government had not demonstrated the clear and present danger that could justify restricting Twitter’s constitutional right to speak about such surveillance requests. According to the court, the government may be required to present evidence that disclosure would impede the investigation. The issue here is not the surveillance itself but transparency – the company’s right to report on it.

Microsoft insists on a guarantee that such requests be valid only for a fixed term, since requests of indefinite duration are, in Microsoft’s view, unconstitutional.

The article again underscores that the companies are under double pressure – from governments, to assist investigations, and at the same time from users, to protect their data from government intrusion.

Challenging such requests in court is important because, according to the digital-rights organizations,

there are many cases in which law-enforcement bodies have used these requests improperly.

As far as this and similar publications make clear, where there is an overriding public interest, internet companies cooperate with governments and observe certain restrictions, but only if:

  • there is an overriding public interest (for the US – a clear and present danger), and
  • the affected persons have every constitutional right to defend themselves.

In Bulgaria, internet companies are under the same double pressure. When a request comes in, they face the very difficult task of deciding between two possible risks:

  • there is an overriding public interest – and they fail to assist the law-enforcement bodies, or
  • they disclose a user’s data – and it turns out there was no overriding public interest.

The provision:

Art. 251b(2) (Suppl. – SG No. 97 of 2016, in force from 6 December 2016) The data under para. 1 shall be retained for the needs of national security and for the prevention, detection, and investigation of serious crimes. The data under para. 1, item 6 shall also be retained for carrying out search-and-rescue operations for persons in the cases under Art. 38, para. 3 of the Disaster Protection Act.
(3) Other data, including data revealing the content of communications, may not be retained under this procedure.

There is no reason to assume that – unlike in the US – requests for data disclosure, and the corresponding authorizations, are always lawful. Not to mention effectiveness: often the only real consequence is the intrusion into the user’s private sphere, not the solving of any crime.

If they decide to hand over the data, the companies then face the same question of transparency.

Filed under: Digital, Media Law, US Law

AWS Hot Startups – May 2017

Post Syndicated from Tina Barr original https://aws.amazon.com/blogs/aws/aws-hot-startups-may-2017/

April showers bring May startups! This month we have three hot startups for you to check out. Keep reading to find out what they’re up to, and how they’re using AWS to do it.

Today’s post features the following startups:

  • Lobster – an AI-powered platform connecting creative social media users to professionals.
  • Visii – helping consumers find the perfect product using visual search.
  • Tiqets – a curated marketplace for culture and entertainment.

Lobster (London, England)

Every day, social media users generate billions of authentic images and videos to rival typical stock photography. Powered by Artificial Intelligence, Lobster enables brands, agencies, and the press to license visual content directly from social media users so they can find that piece of content that perfectly fits their brand or story. Lobster does the work of sorting through major social networks (Instagram, Flickr, Facebook, Vk, YouTube, and Vimeo) and cloud storage providers (Dropbox, Google Photos, and Verizon) to find media, saving brands and agencies time and energy. Using filters like gender, color, age, and geolocation can help customers find the unique content they’re looking for, while Lobster’s AI and visual recognition finds images instantly. Lobster also runs photo challenges to help customers discover the perfect image to fit their needs.

Lobster is an excellent platform for creative people to get their work discovered while also protecting their content. Users are treated as copyright holders and earn 75% of the final price of every sale. The platform is easy to use: new users simply sign in with an existing social media or cloud account and can start showcasing their artistic talent right away. Lobster allows users to connect to any number of photo storage sources so they’re able to choose which items to share and which to keep private. Once users have selected their favorite photos and videos to share, they can sit back and watch as their work is picked to become the signature for a new campaign or featured on a cool website – and start earning money for their work.

Lobster is using a variety of AWS services to keep everything running smoothly. The company uses Amazon S3 to store photography that was previously ordered by customers. When a customer purchases content, the respective piece of content must be available at any given moment, independent from the original source. Lobster is also using Amazon EC2 for its application servers and Elastic Load Balancing to monitor the state of each server.

To learn more about Lobster, check them out here!

Visii (London, England)

In today’s vast web, a growing number of products are being sold online and searching for something specific can be difficult. Visii was created to cater to businesses and help them extract value from an asset they already have – their images. Their SaaS platform allows clients to leverage an intelligent visual search on their websites and apps to help consumers find the perfect product for them. With Visii, consumers can choose an image and immediately discover more based on their tastes and preferences. Whether it’s clothing, artwork, or home decor, Visii will make recommendations to get consumers to search visually and subsequently help businesses increase their conversion rates.

There are multiple ways for businesses to integrate Visii on their website or app. Many of Visii’s clients choose to build against their API, but Visii also works closely with many clients to figure out the most effective way to do this for each unique case. This has led Visii to help build innovative user interfaces and figure out the best integration points to get consumers to search visually. Businesses can also integrate Visii on their website with a widget – they just need to provide a list of links to their products and Visii does the rest.

Visii runs their entire infrastructure on AWS. Their APIs and pipeline all sit in auto-scaling groups, with ELBs in front of them, sending things across into Amazon Simple Queue Service and Amazon Aurora. Recently, Visii moved from Amazon RDS to Aurora and noted that the process was incredibly quick and easy. Because they make heavy use of machine learning, it is crucial that their pipeline only runs when required and that they maximize the efficiency of their uptime.

To see how companies are using Visii, check out Style Picker and Saatchi Art.

Tiqets (Amsterdam, Netherlands)

Tiqets is making the ticket-buying experience faster and easier for travelers around the world.  Founded in 2013, Tiqets is one of the leading curated marketplaces for admission tickets to museums, zoos, and attractions. Their mission is to help travelers get the most out of their trips by helping them find and experience a city’s culture and entertainment. Tiqets partners directly with vendors to adapt to a customer’s specific needs, and is now active in over 30 cities in the US, Europe, and the Middle East.

With Tiqets, travelers can book tickets either ahead of time or at their destination for a wide range of attractions. The Tiqets app provides real-time availability and delivers tickets straight to customers’ phones via email, direct download, or in the app. Customers save time by skipping long lines (a perk of the app!), save trees (no need to physically print tickets), and, most importantly, they can make the most of their leisure time. For each attraction featured on Tiqets, there is a lot of helpful information including best modes of transportation, hours, commonly asked questions, and reviews from other customers.

The Tiqets platform consists of the consumer-facing website, the internal and external-facing APIs, and the partner self-service portals. For the app hosting and infrastructure, Tiqets uses AWS services such as Elastic Load Balancing, Amazon EC2, Amazon RDS, Amazon CloudFront, Amazon Route 53, and Amazon ElastiCache. Through the infrastructure orchestration of their AWS configuration, they can easily set up separate development or test environments while staying close to the production environment as well.

Tiqets is hiring! Be sure to check out their jobs page if you are interested in joining the Tiqets team.

Thanks for reading and don’t forget to check out April’s Hot Startups if you missed it.

-Tina Barr


Protecting Your Account

Post Syndicated from Tim Nufire original https://www.backblaze.com/blog/protecting-your-account/


Editor’s Note: This is a copy of an email sent to our customers on 4/28/17. The Backblaze login database has in no way been compromised. That said, we have seen a number of automated login attempts to our site and wanted to alert our users of the risk. See below for more info.
=====
Dear Customer –

Over the last 72 hours, our security team has noticed an increase in automated attempts to log into our users’ accounts using credentials stolen from other websites. To protect your account, we recommend that you:

● Change your password
● Add Two-Factor Authentication for additional security

NOTE: The Backblaze login database has not been compromised – the credentials were stolen from other sources.

Regrettably, we live in an era where companies have been breached and their customers’ credentials have been leaked – Dropbox, Adobe, and LinkedIn are just a few high-profile examples. What happens in these attacks is that the attacker acquires “the Dropbox list” and simply tries those usernames and passwords on another site. If your credentials were leaked in one of those hacks and you used the same username/password combination to sign up for other services (such as ours), you are vulnerable.
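One practical way to act on this is to check whether a password already appears in known breach corpora. The Have I Been Pwned “Pwned Passwords” API supports a k-anonymity range search, so only the first five characters of the password’s SHA-1 hash ever leave your machine. A minimal sketch (the function name is ours for illustration; the actual HTTP fetch is left out):

```python
import hashlib


def hibp_range_query(password: str):
    """Prepare a k-anonymity lookup for the Pwned Passwords range API.

    Only the 5-character hash prefix is sent; the full password and even
    its full hash stay local. The caller fetches `url` and then looks for
    `suffix` among the returned lines.
    """
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    return url, suffix


url, suffix = hibp_range_query("password")
print(url)  # → https://api.pwnedpasswords.com/range/5BAA6
```

If the suffix shows up in the response, that password has been seen in a breach and should never be reused anywhere.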

While we have a number of methods in place to thwart nefarious attacks, there is a limit to what we can do to prevent someone from signing in to an account with a valid username and password. We are sending this message to you today because we know that some of our users’ credentials are in these stolen lists.

Changing your password now ensures you’re not using a password that was previously leaked. Adding Two-Factor Authentication provides an extra layer of security and protection if you end up on one of these lists in the future.

Thank you,

Tim
Chief Cloud Officer
Backblaze

The post Protecting Your Account appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Don’t Get Trapped in iCloud

Post Syndicated from Peter Cohen original https://www.backblaze.com/blog/dont-get-trapped-icloud/

Don't Get Trapped in iCloud

Let me preface this with a bit of history: I’ve been using Macs for more than 30 years. I’ve seen an enormous amount of changes at Apple, and I’ve been using their online services since the AppleLink days (it was a pre-Internet dial-up service for Apple dealers and service people).

Over the past few years Apple’s made a lot of changes to iCloud. They’ve added some great additions to make it a world-class cloud service. But there are drawbacks. In the course of selling, supporting and writing about these devices, I consistently see people make the same mistakes. So with that background let’s get to my central point: I think it’s a big mistake to trust Apple alone with your data. Let me tell you why.

Apple aggressively promotes iCloud to its customers as a way to securely store information, photos and other vital data, leading to a false sense of security that all of your data is safe from harm. It isn’t. Let’s talk about some of the biggest mistakes you can make with iCloud.

iCloud Sync Does Not = Backing Up

Even if the picture of your puppy’s first bath time is on your iPhone and your iPad, it isn’t backed up. One of the biggest mistakes you can make is to assume that since your photos, contacts, and calendar sync between devices, they’re backed up. There’s a big difference between syncing and backing up.

Repeat after me:
Syncing Is Not Backing Up
Syncing Is Not Backing Up
Syncing Is Not Backing Up

iCloud helps you sync content between devices. Add an event to the calendar app on your phone and iCloud pushes that change to the calendar on your Mac too. Take a photo with the iPhone and find it in your Mac’s Photos library without having to connect the phone to the computer. That’s convenient. I use that functionality all the time.

Syncing can be confusing, though. iCloud Photo Library is what Apple calls iCloud’s ability to sync photos between Apple devices seamlessly. But it’s a two-way street. If you delete a photo from your Mac, it gets removed from your iPhone too, because it’s all in iCloud, there is no backup copy anywhere else.
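The distinction can be sketched in a few lines of Python. This is a toy model, not how iCloud is actually implemented: a sync mirrors the source, so deletions propagate everywhere, while a backup only accumulates.

```python
def sync(source, mirror):
    """Sync mirrors the source exactly: deletions propagate to every copy."""
    return dict(source)


def backup(source, archive):
    """Backup accumulates: a file deleted from the source survives in the archive."""
    updated = dict(archive)
    updated.update(source)  # add new/changed files, never remove old ones
    return updated


photos = {"puppy_bath.jpg": b"...", "vacation.jpg": b"..."}
phone = sync(photos, {})
cloud_archive = backup(photos, {})

del photos["puppy_bath.jpg"]                    # deleted on the Mac...
phone = sync(photos, phone)                     # ...the next sync removes it from the phone too
cloud_archive = backup(photos, cloud_archive)   # ...but the backup still has it

print("puppy_bath.jpg" in phone)          # False
print("puppy_bath.jpg" in cloud_archive)  # True
```

That last pair of lines is the whole point: after an accidental deletion, only the backup still has your photo.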

Recently my wife decided that she didn’t want to have the same photos on her Mac and iPhone. Extricating herself from that means shutting off iCloud Photo Library and manually syncing the iPhone and Mac. That adds extra steps to back everything up! Now the phone has to be connected to the Mac, and my wife has to remember to do it. Bottom line: Syncs between the computer and phone happen less frequently when they are manual, which means there’s more opportunity for pictures to get lost. But with Apple’s syncing enabled, my wife runs the risk of deleting photos that are important not just on one device but everywhere.

Relying on any of these features without having a solid backup strategy means you’re leaving it to Apple and iCloud to keep your pictures and other info safe. If the complex and intricate ecosystem that keeps that stuff working goes awry – and as Murphy’s Law demands, stuff always goes wrong – you can find yourself without pictures, music, and important files.

Better to be safe than sorry. Backing up your data is the way to make sure your memories are safe. Most of the people I’ve helped over the years haven’t realized that iCloud is not backing them up. Some of them have found out the hard way.

iCloud Doesn’t Back Up Your Computer

Apple does have something called “iCloud Backup.” iCloud Backup backs up critical info on the iPhone and iPad to iCloud. But it’s only for mobile devices. The “stuff” on your computer is not backed up by iCloud Backup.

Making matters worse, it’s a “space permitting” solution. Apple gives you a scant 5 GB of free space with an iCloud account. To put that in context, the smallest iPhone 7 ships with 32 GB of space. So right off the bat, you have to pay extra to back up a new device. Many of us who use the free account don’t want to pay for more, so we get messages telling us that our devices can’t be backed up.

More importantly, iCloud doesn’t back up your Mac. So while data may be synced between devices in iCloud, most of the content on your Mac isn’t getting backed up directly.

Be Wary of “Store In iCloud” and “Optimize Storage”

macOS 10.12 “Sierra” introduced new remote storage functions for iCloud, including “Store in iCloud” and “Optimize Storage.” Both of these features move information from your Mac to the cloud. The Mac leaves frequently accessed files locally, but files you don’t use regularly get moved to iCloud and purged from the hard drive.

Your data is yours.

Macs, with their fast but comparatively small flash storage, can run chronically short of local storage space. These new storage optimization features can offset that problem by moving what you’re not using to iCloud – as long as you stay connected to iCloud. If iCloud isn’t available, neither are your files.

Your data is yours. It should always be in your possession. Ideally, you’d have a local backup of your data (Time Machine, an extra hard drive, etc.) AND an offsite copy… not OR. We call that the 3-2-1 backup strategy. That way you’re not dependent on Apple and a stable Internet connection to get your files when you want them.

iCloud Drive Isn’t a Backup Either

iCloud Drive is another iCloud feature that can lull you into a false sense of security. It’s a Dropbox-style sync repository – files put in iCloud Drive appear on the Mac, iPhone, and iPad. However, any files you don’t choose to add to iCloud Drive are only available locally and are not backed up.

iCloud Drive has limits, too. You can’t upload a file larger than 15 GB. And you can only store as much as you’ve paid for – hit your limit, and you’ll have to pay more. But only up to 2 TB, which will cost you $19.99/month.

Trust But Verify (and Back Up Yourself)

I’ve used iCloud from the start and I continue to do so. iCloud is an excellent sync service. It makes the Apple ecosystem of hardware and software easier to use. But it isn’t infallible. I’ve had problems with calendar syncing, contacts disappearing, and my music getting messed up by iTunes In the Cloud.

That was a really painful lesson for me. I synced thousands of tracks of music I’d had for many years, ripped from the original CDs I owned and had long since put in storage. iTunes In the Cloud synced my music library so I could share it with all my Apple devices. To save space and bandwidth, the service doesn’t upload your library when it can replace tracks with what it thinks are matches in iTunes’ own library. I didn’t want Apple’s versions – I wanted mine, because I’d customized them with album art and spent a lot of time crafting them. Apple’s versions sometimes looked and sounded different from mine.

If I hadn’t kept a backup copy locally, I’d be stuck with Apple’s versions. That wasn’t what I wanted. My data is mine.

The prospect of downloading thousands of files, and all the time that would take is daunting. That’s why we created the Restore Return Refund program – you can get your backed up files delivered by FedEx on a USB thumbdrive or hard disk drive. You can’t do that with iCloud.

It’s experiences like that which explain why I think it’s so important to understand iCloud’s inherent shortcomings as a backup service. Having your data sync across your devices is a great feature and one I use all the time. However, as a sole backup solution, it’s a recipe for disaster.

Like all sync services, if you accidentally delete a file on one device, it’s gone on all of your devices as soon as the next sync happens. Unfortunately, “user error” is an all too common problem, and when it comes to your data, it’s not one you want to take for granted.

Which brings us to the last point I want to make. It’s easy to get complacent with one company’s ecosystem, but circumstances change. What happens when you get rid of that Mac or that iPhone and get something that doesn’t integrate as easily with the Apple world? Extricating yourself from any company’s ecosystem can, quite frankly, be an intimidating experience, with lots of opportunities to overlook or lose important files. You can avoid such data insecurity by having your info backed up.

With a family that uses lots of Apple products, I pay for Apple’s iCloud and other Apple services. With a Mac and iPhone, iCloud’s ability to sync content means that my workflow is seamless from mobile to desktop and back. I spend less time fiddling with my devices and more time getting work done. The data on iCloud makes up my digital life. Like anything valuable, it’s common sense to keep my info close and well protected. That’s why I keep a local backup, with offsite backup through Backblaze, of course.

The safety, security, and integrity of your data are paramount. Do whatever you can to make sure it’s safe. Back up your files locally and offsite away from iCloud. Backblaze is here to help. If you need more advice for backing up your Mac, check out our complete Mac Backup Guide for details.

The post Don’t Get Trapped in iCloud appeared first on Backblaze Blog | Cloud Storage & Cloud Backup.

Dear Obama, From Infosec

Post Syndicated from Robert Graham original http://blog.erratasec.com/2017/01/dear-obama-from-infosec.html

Dear President Obama:

We are more than willing to believe Russia was responsible for the hacked emails/records that influenced our election. We believe Russian hackers were involved. Even if these hackers weren’t under the direct command of Putin, we know he could put a stop to such hacking if he chose. It’s like harassment of journalists and diplomats. Putin encourages a culture of thuggery that attacks opposition, without his personal direction, but with his tacit approval.

Your lame attempts to convince us of what we already agree with have irretrievably damaged your message.

Instead of communicating with the American people, you worked through your typical system of propaganda, such as stories in the New York Times quoting unnamed “senior government officials”. We don’t want “unnamed” officials — we want named officials (namely you) whom we can pin down and question. When you work through this system of official leaks, we believe you have something to hide, that the evidence won’t stand on its own.

We still don’t believe the CIA’s conclusions because we don’t know, precisely, what those conclusions are. Are they derived purely from companies like FireEye and CrowdStrike based on digital forensics? Or do you have spies in Russian hacker communities that give better information? This is such an important issue that it’s worth degrading sources of information in order to tell us, the American public, the truth.

You had the DHS and US-CERT issue the “GRIZZLY-STEPPE”[*] report “attributing those compromises to Russian malicious cyber activity”. It does nothing of the sort. It’s full of garbage. It contains signatures of viruses that are publicly available, used by hackers around the world, not just Russia. It contains a long list of IP addresses from perfectly normal services, like Tor, Google, Dropbox, Yahoo, and so forth.

Yes, hackers use Yahoo for phishing and malvertising. It doesn’t mean every access of Yahoo is an “Indicator of Compromise”.

For example, I checked my web browser [chrome://net-internals/#dns] and found that last year on November 20th, it accessed two IP addresses that are on the Grizzly Steppe list:

No, this doesn’t mean I’ve been hacked. It means I just had a normal interaction with Yahoo. It means the Grizzly Steppe IoCs are garbage.

If your intent was to show technical information to experts to confirm Russia’s involvement, you’ve done the precise opposite. Grizzly Steppe reveals such enormous incompetence that we doubt all the technical details you might have. I mean, it’s possible that you classified the important details and de-classified the junk, but even then, that junk isn’t worth publishing. There’s no excuse for those Yahoo addresses to be in there, or for the numerous other problems.

Among the consequences is that Washington Post story claiming Russians hacked into the Vermont power grid. What really happened is that somebody just checked their Yahoo email, thereby accessing one of the same IP addresses I did. How they get from the facts (one person accessed Yahoo email) to the story (Russians hacked power grid) is your responsibility. This misinformation is your fault.

You announced sanctions for the Russian hacking [*]. At the same time, you announced sanctions for Russian harassment of diplomatic staff. These two events are confused in the press, with most stories reporting you expelled 35 diplomats for hacking, when that appears not to be the case.

Your list of individuals/organizations is confusing. It makes sense to name the GRU, FSB, and their officers. But why name “ZorSecurity” but not sole proprietor “Alisa Esage Shevchenko”? It seems a minor target, and you give no information why it was selected. Conversely, you ignore the APT28/APT29 Dukes/CozyBear groups that feature so prominently in your official leaks. You also throw in a couple extra hackers, for finance hacks rather than election hacks. Again, this causes confusion in the press about exactly who you are sanctioning and why. It seems as slipshod as the DHS/US-CERT report.

Mr President, you’ve got two weeks left in office. Russia’s involvement is a huge issue, especially given President-Elect Trump’s pro-Russia stance. If you’ve got better information than this, I beg you to release it. As it stands now, all you’ve done is support Trump’s narrative, making this look like propaganda — and bad propaganda at that. Give us, the infosec/cybersec community, technical details we can look at, analyze, and confirm.

Regards,
Infosec